WorldWideScience

Sample records for legal-institutional analysis model

  1. Growth, financial development, societal norms and legal institutions

    NARCIS (Netherlands)

    Garretsen, Harry; Lensink, Robert; Sterken, Elmer

    2002-01-01

    This paper analyses whether societal norms help to explain cross-country differences in financial development. We analyze whether societal norms in addition to legal institutions have an impact on financial development. We address the implications of the inclusion of societal norms for the analysis

  2. A Legal Institutional Perspective on the European External Action Service

    DEFF Research Database (Denmark)

    Van Vooren, Bart

    2011-01-01

    This article provides a legal perspective on the new European External Action Service (EEAS), and positions this new body in the reshuffled institutional balance of EU external relations. Towards that end, the paper examines the EEAS' legal nature as compared to that of the Council, Commission... Can the EEAS be drawn into proceedings before the Court of Justice? In answering those questions, this article then examines to what extent the legal-institutional choices on the structure of the EU External Action Service reflect the age-old tension entrenched in EU external relations law: the EU's nature...

  3. Trust in Legal Institutions: an Empirical Approach from a Social Capital Perspective

    Directory of Open Access Journals (Sweden)

    Mariana Zuleta Ferrari

    2016-12-01

    Over the last decades, a public perception has grown that some of the democratic institutions and frameworks which were once taken for granted are now showing their flaws and inefficiencies, increasingly struggling to keep up with society's demands and expectations. This has led to a generalized feeling of uncertainty and disappointment, resulting in a lack of trust in institutions. The implications of these circumstances for legal theory cannot be overlooked; this article aims to address the problem from an innovative perspective. It presents a unique tool that proposes a methodological agenda for approaching trust in legal institutions from the perspective of social capital theory. To this end, different variables and social capital dynamics are identified and discussed in relation to trust in legal institutions. The aim is, on the one hand, to provide an innovative methodological contribution to better understand the trust crisis, and in particular the public perception of legal institutions, and on the other, to expand the analysis of social capital dimensions.

  4. Legal-institutional arrangements facilitating offshore wind energy conversion systems (WECS) utilization. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, L.H.

    1977-09-01

    Concern for the continuing sufficiency of energy supplies in the U.S. has tended to direct increasing attention to unconventional sources of supply, including wind energy. Some of the more striking proposals for the utilization of wind energy relate to offshore configurations. The legal-institutional arrangements for facilitating the utilization of offshore wind energy conversion systems (WECS) are examined by positing three program alternatives and analyzing the institutional support required for the implementation of each.

  5. Legal, Institutional, and Economic Indicators of Forest Conservation and Sustainable Management: Review of Information Available for the United States

    Science.gov (United States)

    Paul V. Ellefson; Calder M. Hibbard; Michael A. Kilgore; James E. Granskog

    2005-01-01

    This review looks at the Nation’s legal, institutional, and economic capacity to promote forest conservation and sustainable resource management. It focuses on 20 indicators of Criterion Seven of the so-called Montreal Process and involves an extensive search and synthesis of information from a variety of sources. It identifies ways to fill information gaps and improve...

  6. An integrated strategy for aircraft/airport noise abatement: A legal-institutional analysis of section 7 of the Noise Control Act of 1972 and proposals based thereon

    Science.gov (United States)

    Mayo, L. H.

    1975-01-01

    The development of the aircraft noise control structure since the Griggs case of 1962 is examined, and the Noise Control Act of 1972 is described. The Act undertook to establish the legal-institutional framework within which an adequate aircraft/airport noise abatement program might be initiated, with full recognition of all the beneficial and detrimental consequences of air transportation and an appropriate distribution of benefits and costs.

  7. Smart design rules for smart grids : analysing local smart grid development through an empirico-legal institutional lens

    NARCIS (Netherlands)

    Lammers, Imke; Heldeweg, Michiel A.

    2016-01-01

    Background: This article entails an innovative approach to smart grid technology implementation, as it connects governance research with legal analysis. We apply the empirico-legal ‘ILTIAD framework’, which combines Elinor Ostrom’s Institutional Analysis and Development (IAD) framework with

  8. Bank foundation – a symbiotic legal institution at the crossroad of banking system and non-profit sector

    Directory of Open Access Journals (Sweden)

    Magdalena CATARGIU

    2012-12-01

    In the context of the development and omnipresence of the non-profit sector in Europe, and given the diversification of the legal entities involved in the configuration of the third sector, a legal analysis of the foundation of banking origins is very appealing. Throughout this study we aim to point out key moments in the evolution of this particular figure, mainly in Italian legislation. We also intend to identify the legal nature of the foundation of banking origins in order to draw a line between banking and philanthropic activities.

  9. Legal, institutional, and political issues in transportation of nuclear materials at the back end of the LWR nuclear fuel cycle

    Energy Technology Data Exchange (ETDEWEB)

    Lippek, H.E.; Schuller, C.R.

    1979-03-01

    A study was conducted to identify major legal and institutional problems and issues in the transportation of spent fuel and associated processing wastes at the back end of the LWR nuclear fuel cycle. (Most of the discussion centers on the transportation of spent fuel, since this activity will involve virtually all of the legal and institutional problems likely to be encountered in moving waste materials as well.) Actions or approaches that might be pursued to resolve the problems identified in the analysis are suggested. Two scenarios for the industrial-scale transportation of spent fuel and radioactive wastes, taken together, highlight most of the major problems and issues of a legal and institutional nature that are likely to arise: (1) utilizing the Allied General Nuclear Services (AGNS) facility at Barnwell, SC, as a temporary storage facility for spent fuel; and (2) utilizing AGNS for full-scale commercial reprocessing of spent LWR fuel.

  10. Legal, institutional, and political issues in transportation of nuclear materials at the back end of the LWR nuclear fuel cycle

    International Nuclear Information System (INIS)

    Lippek, H.E.; Schuller, C.R.

    1979-03-01

    A study was conducted to identify major legal and institutional problems and issues in the transportation of spent fuel and associated processing wastes at the back end of the LWR nuclear fuel cycle. (Most of the discussion centers on the transportation of spent fuel, since this activity will involve virtually all of the legal and institutional problems likely to be encountered in moving waste materials as well.) Actions or approaches that might be pursued to resolve the problems identified in the analysis are suggested. Two scenarios for the industrial-scale transportation of spent fuel and radioactive wastes, taken together, highlight most of the major problems and issues of a legal and institutional nature that are likely to arise: (1) utilizing the Allied General Nuclear Services (AGNS) facility at Barnwell, SC, as a temporary storage facility for spent fuel; and (2) utilizing AGNS for full-scale commercial reprocessing of spent LWR fuel

  11. Corporate and public governances in transition: the limits of property rights and the significance of legal institutions

    Directory of Open Access Journals (Sweden)

    Jean-François Nivet

    2004-12-01

    Post-socialist transition raises crucial issues about the institutional setting of a market economy. Priority has been given to property rights, and privatization has been advocated as a means to depoliticize economic activities. The dismissal of external interventions, allied with the attraction of the American model and Hayekian ideas, often led to the introduction of minimal laws and to waiting for their evolutionary development. The failure of corporate and public governance, notably in Russia, helps to show why, on the contrary, democratically established legal rules are essential. Legislation should protect not only corporate shareholders and stakeholders, but more fundamentally all citizens against the predatory collusive behavior of political, economic and criminal elites.

  12. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model: The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example, the Event Summary Data (ESD) stores calorimeter cells and tracking system hits, thereby permitting many calibration and alignment tasks, but will only be accessible at particular computing sites with potentially large latency. In contrast, the Analysis...

  13. SDI CFD MODELING ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.

    2011-05-05

    The Savannah River Remediation (SRR) Organization requested that Savannah River National Laboratory (SRNL) develop a Computational Fluid Dynamics (CFD) method to mix and blend the miscible contents of the blend tanks to ensure the contents are properly blended before they are transferred from the blend tank, such as Tank 50H, to the Salt Waste Processing Facility (SWPF) feed tank. The work described here consists of two modeling areas: the mixing modeling analysis during the miscible liquid blending operation, and the flow pattern analysis during the transfer operation of the blended liquid. The transient CFD governing equations, consisting of three momentum equations, one mass balance, two turbulence transport equations for kinetic energy and dissipation rate, and one species transport equation, were solved by an iterative technique until the species concentrations of the tank fluid were in equilibrium. The steady-state flow solutions for the entire tank fluid were used for flow pattern analysis, for velocity scaling analysis, and as the initial conditions for transient blending calculations. A series of modeling calculations was performed to estimate the blending times for various jet flow conditions, and to investigate the impact of the cooling coils on the blending time of the tank contents. The modeling results were benchmarked against the pilot scale test results. All of the flow and mixing models were performed with the nozzles installed at the mid-elevation and parallel to the tank wall. From the CFD modeling calculations, the main results are summarized as follows: (1) The benchmark analyses for the CFD flow velocity and blending models demonstrate their consistency with Engineering Development Laboratory (EDL) and literature test results in terms of local velocity measurements and experimental observations. Thus, an application of the established criterion to an SRS full scale tank will provide a better, physically-based estimate of the required mixing time, and

  14. Operations and Modeling Analysis

    Science.gov (United States)

    Ebeling, Charles

    2005-01-01

    The Reliability and Maintainability Analysis Tool (RMAT) provides NASA the capability to estimate reliability and maintainability (R&M) parameters and operational support requirements for proposed space vehicles based upon relationships established from both aircraft and Shuttle R&M data. RMAT has matured both in its underlying database and in its level of sophistication in extrapolating this historical data to satisfy proposed mission requirements, maintenance concepts and policies, and type of vehicle (i.e., ranging from aircraft-like to Shuttle-like). However, a companion analysis tool, the Logistics Cost Model (LCM), has not reached the same level of maturity as RMAT due, in large part, to nonexistent or outdated cost estimating relationships and underlying cost databases, and its almost exclusive dependence on Shuttle operations and logistics cost input parameters. As a result, the full capability of the RMAT/LCM suite of analysis tools to take a conceptual vehicle and derive its operations and support requirements along with the resulting operating and support costs has not been realized.

  15. Decree No. 2.363 of 21 October 1987 abolishing the National Institute of Colonization and Agrarian Reform--INCRA, creating the Legal Institute of Rural Land--INTER, and other measures.

    Science.gov (United States)

    1989-01-01

    This Decree abolishes the Brazilian National Institute of Colonization and Agrarian Reform (INCRA) and creates a Legal Institute of Rural Land (INTER) linked to the Ministry of Agrarian Reform (MIRAD) to perform the activities of INCRA. MIRAD will henceforth be responsible for the rights, powers, and obligations of INCRA and will supervise INCRA's property and resources. In this capacity MIRAD will supervise, coordinate, and execute activities related to agrarian reform and agricultural policy. Among these activities are the promotion of social justice and productivity through 1) the just and adequate distribution of ownership of rural land, 2) limitation of the acquisition of rural property by foreigners, and 3) encouragement of the harmonious development of rural life. In developing such activities MIRAD is to make use of legal measures contained in land law, including those relating to the selection of public rural lands, the privatization of rural land through regularization of ownership, colonization, zoning, and taxation. It is also authorized to expropriate and distribute unexploited or improperly exploited land to worker families, with priority going to labor cooperatives. Further provisions establish rules on expropriation. Among these is the requirement that forests must be protected.

  16. Analysis of Business Models

    Directory of Open Access Journals (Sweden)

    Slavik Stefan

    2014-12-01

    The term business model has been used in practice for a few years, but companies have created, defined and innovated their models subconsciously from the start of business. Our paper aims to clarify the theory of the business model, that is, its definition and all the components that form each business. In the second part, we create an analytical tool and analyze real business models in Slovakia, defining the characteristics of each part of the business model, i.e., customers, distribution, value, resources, activities, cost and revenue. In the last part of our paper, we discuss the most common characteristics, extremes, discrepancies and the most important facts detected in our research.

  17. Survival analysis models and applications

    CERN Document Server

    Liu, Xian

    2012-01-01

    Survival analysis concerns sequential occurrences of events governed by probabilistic laws.  Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading onto some of the most advanced topics in survival analysis.Assumes only a minimal knowledge of SAS whilst enablin
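
    The book's empirical illustrations are in SAS; as a rough, language-neutral companion, the sketch below hand-computes a Kaplan-Meier survival curve, one of the most basic survival-analysis estimators, on invented follow-up data.

    ```python
    # Minimal Kaplan-Meier sketch; times and event flags are invented.
    import numpy as np

    times  = np.array([5, 8, 8, 12, 15, 20])   # follow-up times
    events = np.array([1, 1, 0, 1, 0, 1])      # 1 = event, 0 = censored

    S = 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)                   # subjects still at risk
        d = np.sum((times == t) & (events == 1))       # events at time t
        S *= 1 - d / at_risk                           # product-limit update
        print(t, S)   # survival estimate just after each event time
    ```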

  18. CMS Analysis School Model

    Energy Technology Data Exchange (ETDEWEB)

    Malik, S. [Nebraska U.; Shipsey, I. [Purdue U.; Cavanaugh, R. [Illinois U., Chicago; Bloom, K. [Nebraska U.; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.; D' Hondt, J. [Vrije U., Brussels; Klima, B. [Fermilab; Narain, M. [Brown U.; Palla, F. [INFN, Pisa; Rolandi, G. [CERN; Schörner-Sadenius, T. [DESY

    2014-01-01

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre), Fermilab, and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to the data taking mode, the nature of earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of the myriad talents, in the development of physics, service, upgrade, education of those new to CMS and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  19. Benchmark risk analysis models

    NARCIS (Netherlands)

    Ale BJM; Golbach GAM; Goos D; Ham K; Janssen LAM; Shield SR; LSO

    2002-01-01

    A so-called benchmark exercise was initiated in which the results of five sets of tools available in the Netherlands would be compared. In the benchmark exercise a quantified risk analysis was performed on a hypothetical (non-existing) hazardous establishment located on a randomly chosen location in

  20. PET kinetic analysis. Compartmental model

    International Nuclear Information System (INIS)

    Watabe, Hiroshi; Ikoma, Yoko; Shidahara, Miho; Kimura, Yuichi; Naganawa, Mika

    2006-01-01

    Positron emission tomography (PET) enables not only visualization of the distribution of a radiotracer, but also quantification of several biomedical functions. The compartmental model is a basic framework for analyzing dynamic PET data. This review describes the principle of the compartmental model and categorizes the techniques and approaches for compartmental modelling according to various aspects: model design, experimental design, invasiveness, and mathematical solution. We also discuss advanced applications of compartmental analysis with PET. (author)
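
    As a concrete illustration of the compartmental idea summarized above, the following sketch integrates a one-tissue compartment model, dCt/dt = K1*Cp(t) - k2*Ct(t); the rate constants and plasma input function are invented for illustration, not taken from the review.

    ```python
    # Minimal one-tissue compartment sketch (illustrative values only).
    import numpy as np
    from scipy.integrate import solve_ivp

    K1, k2 = 0.1, 0.05               # rate constants (1/min), assumed
    Cp = lambda t: np.exp(-0.1 * t)  # toy plasma input function

    def one_tissue(t, Ct):
        # dCt/dt = K1 * Cp(t) - k2 * Ct(t)
        return [K1 * Cp(t) - k2 * Ct[0]]

    t = np.linspace(0, 60, 121)      # a 60-minute scan
    sol = solve_ivp(one_tissue, (0, 60), [0.0], t_eval=t)
    print(sol.y[0][-1])              # tissue concentration at end of scan
    ```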

  1. Multiscale Signal Analysis and Modeling

    CERN Document Server

    Zayed, Ahmed

    2013-01-01

    Multiscale Signal Analysis and Modeling presents recent advances in multiscale analysis and modeling using wavelets and other systems. This book also presents applications in digital signal processing using sampling theory and techniques from various function spaces, filter design, feature extraction and classification, signal and image representation/transmission, coding, nonparametric statistical signal processing, and statistical learning theory. This book also: Discusses recently developed signal modeling techniques, such as the multiscale method for complex time series modeling, multiscale positive density estimations, Bayesian Shrinkage Strategies, and algorithms for data adaptive statistics Introduces new sampling algorithms for multidimensional signal processing Provides comprehensive coverage of wavelets with presentations on waveform design and modeling, wavelet analysis of ECG signals and wavelet filters Reviews features extraction and classification algorithms for multiscale signal and image proce...

  2. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of ¹³⁷Cs and ¹³¹I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition
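
    The record above describes a statistical analysis of model output uncertainty. A minimal Monte Carlo sketch of the same idea, with an invented multiplicative deposition-transfer model and assumed lognormal parameter distributions, looks like this:

    ```python
    # Monte Carlo propagation of parameter uncertainty through a toy
    # food-chain transfer model; all distributions are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    deposition = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # toy deposition
    transfer   = rng.lognormal(mean=-2.0, sigma=0.8, size=n)  # transfer factor

    concentration = deposition * transfer  # simple multiplicative model
    print(np.percentile(concentration, [5, 50, 95]))  # output uncertainty band
    ```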

  3. Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2009-01-01

    This contribution presents an overview of sensitivity analysis of simulation models, including the estimation of gradients. It covers classic designs and their corresponding (meta)models; namely, resolution-III designs including fractional-factorial two-level designs for first-order polynomial
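
    As a toy illustration of the approach surveyed above, the sketch below fits a first-order polynomial metamodel on a two-level factorial design and reads the main effects off the regression coefficients; the simulate function is a stand-in for a real simulation model.

    ```python
    # Estimate first-order sensitivities by fitting a linear (meta)model
    # on a two-level full factorial design; the simulator is invented.
    import numpy as np
    from itertools import product

    def simulate(x1, x2, x3):            # stand-in for an expensive simulation
        return 3*x1 - 2*x2 + 0.5*x3 + 0.1*x1*x2

    design = np.array(list(product([-1, 1], repeat=3)))  # full 2^3 design
    y = np.array([simulate(*row) for row in design])

    X = np.column_stack([np.ones(len(design)), design])  # intercept + factors
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(beta[1:])   # estimated main effects ~ gradient of the metamodel
    ```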

  4. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  5. Reliability analysis and operator modelling

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    1996-01-01

    The paper considers the state of operator modelling in reliability analysis. Operator models are needed in reliability analysis because operators are needed in process control systems. HRA methods must therefore be able to account both for human performance variability and for the dynamics of the interaction. A selected set of first generation HRA approaches is briefly described in terms of the operator model they use, their classification principle, and the actual method they propose. In addition, two examples of second generation methods are also considered. It is concluded that first generation HRA methods generally have very simplistic operator models, either referring to the time-reliability relationship or to elementary information processing concepts. It is argued that second generation HRA methods must recognise that cognition is embedded in a context, and be able to account for that in the way human reliability is analysed and assessed

  6. Judicialização ou juridicização? As instituições jurídicas e suas estratégias na saúde Judicialization or juridicization? Legal institutions and their strategies in health

    Directory of Open Access Journals (Sweden)

    Felipe Dutra Asensi

    2010-01-01

    citizenship. In the context of enforcing rights, there is a discussion of legal institutions, with particular emphasis on the Prosecutor's Office, the Public Defender and the Judiciary. It is observed that the realization of the right to health includes preserving the continuity of public policies through dialogue. Thus, political conflicts undergo juridicization (conflicts are discussed from the legal viewpoint) more than judicialization (as far as possible, taking them to the judiciary is avoided), since the intention is to avoid the judicial process and to adopt multiple strategies and extra-judicial pacts. The ability to act independently highlights the political role of the Parquet as a mediator in health. The very idea of the right to health is receiving a new meaning, encompassing an interdisciplinary character, which increases the possibilities of action of the Parquet for its guarantee and helps in the defense of a lasting health policy aimed at bringing the world of law into the world of facts.

  7. Bayesian analysis of CCDM models

    Energy Technology Data Exchange (ETDEWEB)

    Jesus, J.F. [Universidade Estadual Paulista (Unesp), Câmpus Experimental de Itapeva, Rua Geraldo Alckmin 519, Vila N. Sra. de Fátima, Itapeva, SP, 18409-010 Brazil (Brazil); Valentim, R. [Departamento de Física, Instituto de Ciências Ambientais, Químicas e Farmacêuticas—ICAQF, Universidade Federal de São Paulo (UNIFESP), Unidade José Alencar, Rua São Nicolau No. 210, Diadema, SP, 09913-030 Brazil (Brazil); Andrade-Oliveira, F., E-mail: jfjesus@itapeva.unesp.br, E-mail: valentim.rodolfo@unifesp.br, E-mail: felipe.oliveira@port.ac.uk [Institute of Cosmology and Gravitation—University of Portsmouth, Burnaby Road, Portsmouth, PO1 3FX United Kingdom (United Kingdom)

    2017-09-01

    Creation of Cold Dark Matter (CCDM), in the context of the Einstein Field Equations, produces a negative pressure term which can be used to explain the accelerated expansion of the Universe. In this work we tested six different spatially flat models for matter creation using statistical criteria, in light of SNe Ia data: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the Bayesian Evidence (BE). These criteria allow models to be compared on goodness of fit and number of free parameters, penalizing excess complexity. We find that the JO model is slightly favoured over the LJO/ΛCDM model; however, neither of these, nor the Γ = 3αH₀ model, can be discarded by the current analysis. Three other scenarios are discarded either because of poor fitting or because of an excess of free parameters. A method of increasing Bayesian evidence through reparameterization in order to reduce parameter degeneracy is also developed.
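
    For readers unfamiliar with the criteria named above, this sketch shows how AIC and BIC are typically computed from a minimum chi-square fit; the chi-square values, parameter counts, and sample size are placeholders, not the paper's results.

    ```python
    # Information criteria for model comparison; inputs are placeholders.
    import numpy as np

    def aic(chi2_min, k):
        # AIC = chi^2_min + 2k, k = number of free parameters
        return chi2_min + 2 * k

    def bic(chi2_min, k, n):
        # BIC = chi^2_min + k ln(n), n = number of data points
        return chi2_min + k * np.log(n)

    # e.g. two hypothetical models fit to the same SNe Ia sample of n = 580
    print(aic(562.2, 1), bic(562.2, 1, 580))
    print(aic(561.9, 2), bic(561.9, 2, 580))  # the extra parameter is penalized
    ```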

  8. Program Analysis as Model Checking

    DEFF Research Database (Denmark)

    Olesen, Mads Chr.

    Software programs are proliferating throughout modern life, to a point where even the simplest appliances such as lightbulbs contain software, in addition to the software embedded in cars and airplanes. The correct functioning of these programs is therefore of the utmost importance, for the quality... and abstract interpretation. Model checking views the program as a finite automaton and tries to prove logical properties over the automaton model, or present a counter-example if not possible, with a focus on precision. Abstract interpretation translates the program semantics into abstract semantics... are considered, among others numerical analysis of C programs and worst-case execution time analysis of ARM programs. It is shown how lattice automata allow automatic and manual tuning of the precision and efficiency of the verification procedure. In the case of worst-case execution time analysis a sound...

  9. Accelerated life models modeling and statistical analysis

    CERN Document Server

    Bagdonavicius, Vilijandas

    2001-01-01

    Contents: Failure Time Distributions (Introduction; Parametric Classes of Failure Time Distributions); Accelerated Life Models (Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary); Accelerated Degradation Models (Introduction; Degradation Models; Modeling the Influence of Explanatory Varia...

  10. Geometric simplification of analysis models

    Energy Technology Data Exchange (ETDEWEB)

    Watterberg, P.A.

    1999-12-01

    Analysis programs have had to deal with more and more complex objects as the capability to model fine detail increases. This can make them unacceptably slow. This project attempts to find heuristics for removing features from models in an automatic fashion in order to reduce polygon count. The approach is not one of theoretical completeness but rather one of trying to achieve useful results with scattered practical ideas. By removing a few simple things such as screw holes, slots, chamfers, and fillets, large gains can be realized. Results varied, but a reduction in the number of polygons by a factor of 10 is not unusual.

  11. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num...

  12. Ventilation Model and Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    V. Chipman

    2003-07-18

    This model and analysis report develops, validates, and implements a conceptual model for heat transfer in and around a ventilated emplacement drift. This conceptual model includes thermal radiation between the waste package and the drift wall, convection from the waste package and drift wall surfaces into the flowing air, and conduction in the surrounding host rock. These heat transfer processes are coupled and vary both temporally and spatially, so numerical and analytical methods are used to implement the mathematical equations which describe the conceptual model. These numerical and analytical methods predict the transient response of the system, at the drift scale, in terms of spatially varying temperatures and ventilation efficiencies. The ventilation efficiency describes the effectiveness of the ventilation process in removing radionuclide decay heat from the drift environment. An alternative conceptual model is also developed which evaluates the influence of water and water vapor mass transport on the ventilation efficiency. These effects are described using analytical methods which bound the contribution of latent heat to the system, quantify the effects of varying degrees of host rock saturation (and hence host rock thermal conductivity) on the ventilation efficiency, and evaluate the effects of vapor and enhanced vapor diffusion on the host rock thermal conductivity.

  13. Ventilation Model and Analysis Report

    International Nuclear Information System (INIS)

    Chipman, V.

    2003-01-01

    This model and analysis report develops, validates, and implements a conceptual model for heat transfer in and around a ventilated emplacement drift. This conceptual model includes thermal radiation between the waste package and the drift wall, convection from the waste package and drift wall surfaces into the flowing air, and conduction in the surrounding host rock. These heat transfer processes are coupled and vary both temporally and spatially, so numerical and analytical methods are used to implement the mathematical equations which describe the conceptual model. These numerical and analytical methods predict the transient response of the system, at the drift scale, in terms of spatially varying temperatures and ventilation efficiencies. The ventilation efficiency describes the effectiveness of the ventilation process in removing radionuclide decay heat from the drift environment. An alternative conceptual model is also developed which evaluates the influence of water and water vapor mass transport on the ventilation efficiency. These effects are described using analytical methods which bound the contribution of latent heat to the system, quantify the effects of varying degrees of host rock saturation (and hence host rock thermal conductivity) on the ventilation efficiency, and evaluate the effects of vapor and enhanced vapor diffusion on the host rock thermal conductivity

  14. ANALYSIS MODEL FOR INVENTORY MANAGEMENT

    Directory of Open Access Journals (Sweden)

    CAMELIA BURJA

    2010-01-01

    Inventory represents an essential component of the assets of the enterprise, and economic analysis gives it special importance because its accurate management determines the achievement of the activity object and the financial results. The efficient management of inventory requires ensuring an optimum level, which will guarantee the normal functioning of the activity with minimum inventory expenses and immobilised funds. The paper presents an analysis model for inventory management based on rotation speed and its correlation with sales volume, illustrated in a case study. Highlighting the factors that influence efficient inventory management provides the information needed to justify managerial decisions, which will lead to a balanced financial position and to increased company performance.
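
    A minimal sketch of the rotation-speed indicators underlying such an analysis, with invented figures:

    ```python
    # Inventory rotation indicators; all figures are invented.
    sales = 1_200_000.0            # annual sales (monetary units)
    average_inventory = 150_000.0  # average inventory over the period

    turnover = sales / average_inventory   # rotations per year
    days_in_inventory = 365.0 / turnover   # average days per rotation
    print(turnover, days_in_inventory)     # 8.0 rotations, ~45.6 days
    ```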

  15. Simplified model for DNB analysis

    International Nuclear Information System (INIS)

    Silva Filho, E.

    1979-08-01

    In a pressurized water reactor (PWR), the operating power is restricted by the possibility of the occurrence of departure from nucleate boiling (DNB) in the hottest channel of the core. The present work proposes a simplified model that analyses the thermal-hydraulic conditions of the coolant in the hottest channel of PWRs with the objective of evaluating DNB in this channel. For this, coupling between the hot channel and typical nominal channels is assumed, imposing the existence of a cross flow between these channels such that a uniform axial pressure distribution results along the channels. The model is applied to the Angra-I reactor and the results are compared with those of the Final Safety Analysis Report (FSAR) obtained by Westinghouse through the THINC program, and are considered satisfactory (Author) [pt

  16. Geologic Framework Model Analysis Model Report

    International Nuclear Information System (INIS)

    Clayton, R.

    2000-01-01

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M and O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and

  17. Geologic Framework Model Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  18. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM)

  19. Pigs on the plains: Institutional analysis of a Colorado water quality initiative

    Science.gov (United States)

    King, D.; Burkardt, N.; Lee, Lamb B.

    2006-01-01

    We used the Legal-Institutional Analysis Model (LIAM) and Advocacy Coalition Framework (ACF) to analyze the campaign over passage of the Colorado Hogs Rule, an initiative passed by the voters in 1998 to require regulation of swine production facilities in Colorado. Used in tandem, LIAM and ACF provided an opportunity to develop a robust understanding of the obstacles and opportunities that face water quality managers in a state-centered multi-organizational decision process. We found that combining the LIAM with the ACF enhanced the understanding that could be achieved by using either model in isolation. The predictive capacity of the LIAM would have been reduced without information from the ACF, and the ACF by itself would have missed the importance of a single-case study.

  20. ModelMate - A graphical user interface for model analysis

    Science.gov (United States)

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  1. Diagnosis, Synthesis and Analysis of Probabilistic Models

    NARCIS (Netherlands)

    Han, Tingting

    2009-01-01

    This dissertation considers three important aspects of model checking Markov models: diagnosis — generating counterexamples, synthesis — providing valid parameter values and analysis — verifying linear real-time properties. The three aspects are relatively independent while all contribute to

  2. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  3. Statistical Modelling of Wind Profiles - Data Analysis and Modelling

    DEFF Research Database (Denmark)

    Jónsson, Tryggvi; Pinson, Pierre

    The aim of the analysis presented in this document is to investigate whether statistical models can be used to make very short-term predictions of wind profiles.

  4. Analysis of Crosscutting in Model Transformations

    NARCIS (Netherlands)

    van den Berg, Klaas; Tekinerdogan, B.; Nguyen, H.; Aagedal, J.; Neple, T.; Oldevik, J.

    2006-01-01

    This paper describes an approach for the analysis of crosscutting in model transformations in the Model Driven Architecture (MDA). Software architectures should be amenable to changes in user requirements and technological platforms. Impact analysis of changes can be based on traceability of

  5. Two sustainable energy system analysis models

    DEFF Research Database (Denmark)

    Lund, Henrik; Goran Krajacic, Neven Duic; da Graca Carvalho, Maria

    2005-01-01

    This paper presents a comparative study of two energy system analysis models, both designed with the purpose of analysing electricity systems with a substantial share of fluctuating renewable energy.

  6. Mathematical modeling and analysis of WEDM machining ...

    Indian Academy of Sciences (India)

    The present work is mainly focused on the analysis and optimization of the WEDM process parameters of Inconel 625. The four machining ... Response surface methodology was used to develop the experimental models. The parametric ...

  7. Model Checking as Static Analysis

    DEFF Research Database (Denmark)

    Zhang, Fuyuan

    ... of states satisfying a CTL formula can be characterized as the least model of ALFP clauses specifying this CTL formula. The existence of the least model of ALFP clauses is ensured by the Moore Family property of ALFP. Then, we take fairness assumptions in CTL into consideration and have shown that CTL fairness problems can be encoded into ALFP as well. To deal with multi-valued model checking problems, we have proposed multi-valued ALFP. A Moore Family result for multi-valued ALFP is also established, which ensures the existence and uniqueness of the least model. When the truth values in multi-valued ALFP constitute a finite distributive complete lattice, multi-valued ALFP can be reduced to two-valued ALFP. This result enables implementing a solver for multi-valued ALFP by reusing existing solvers for two-valued ALFP. Our ALFP-based technique developed for two-valued CTL naturally generalizes...

  8. Rapid Analysis Model: Reducing Analysis Time without Sacrificing Quality.

    Science.gov (United States)

    Lee, William W.; Owens, Diana

    2001-01-01

    Discusses the performance technology design process and the fact that the analysis phase is often being eliminated to speed up the process. Proposes a rapid analysis model that reduces time needed for analysis and still ensures more successful value-added solutions that focus on customer satisfaction. (LRW)

  9. Representing Uncertainty on Model Analysis Plots

    Science.gov (United States)

    Smith, Trevor I.

    2016-01-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model.…

  10. Behavior Analysis of Elderly using Topic Models

    NARCIS (Netherlands)

    Rieping, K.; Englebienne, G.; Kröse, B.

    2014-01-01

    This paper describes two new topic models for the analysis of human behavior in homes that are equipped with sensor networks. The models are based on Latent Dirichlet Allocation (LDA) topic models and can detect patterns in sensor data in an unsupervised manner. LDA-Gaussian, the first variation of

  11. Model correction factor method for system analysis

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Johannesen, Johannes M.

    2000-01-01

    The Model Correction Factor Method is an intelligent response surface method based on simplified modeling. MCFM is aimed at reliability analysis in case of a limit state defined by an elaborate model. Herein it is demonstrated that the method is applicable for elaborate limit state surfaces on which several locally most central points exist without there being a simple geometric definition of the corresponding failure modes, such as is the case for collapse mechanisms in rigid plastic hinge models for frame structures. Taking as simplified idealized model a model of similarity with the elaborate model... surface than existing in the idealized model.

  12. CMS Data Analysis School Model

    CERN Document Server

    Malik, Sudhir; Cavanaugh, R; Bloom, K; Chan, Kai-Feng; D'Hondt, J; Klima, B; Narain, M; Palla, F; Rolandi, G; Schörner-Sadenius, T

    2014-01-01

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born three years ago at the LPC (LHC Physics Center), Fermilab, and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to the data taking mode, the nature of earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize the physics output. As a bigger goal, CMS is striving to nurture and increase engagement of the myriad talents of CMS, in the development of physics, service, upgrade, education of those new to CMS and the caree...

  13. Analysis of radiology business models.

    Science.gov (United States)

    Enzmann, Dieter R; Schomer, Donald F

    2013-03-01

    As health care moves to value orientation, radiology's traditional business model faces challenges to adapt. The authors describe a strategic value framework that radiology practices can use to best position themselves in their environments. This simplified construct encourages practices to define their dominant value propositions. There are 3 main value propositions that form a conceptual triangle, whose vertices represent the low-cost provider, the product leader, and the customer intimacy models. Each vertex has been a valid market position, but each demands specific capabilities and trade-offs. The underlying concepts help practices select value propositions they can successfully deliver in their competitive environments. Copyright © 2013 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  14. Combustion instability modeling and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Santoro, R.J.; Yang, V.; Santavicca, D.A. [Pennsylvania State Univ., University Park, PA (United States); Sheppard, E.J. [Tuskegee Univ., Tuskegee, AL (United States). Dept. of Aerospace Engineering

    1995-12-31

    It is well known that the two key elements for achieving low emissions and high performance in a gas turbine combustor are to simultaneously establish (1) a lean combustion zone for maintaining low NOₓ emissions and (2) rapid mixing for good ignition and flame stability. However, these requirements, when coupled with the short combustor lengths used to limit the residence time for NO formation typical of advanced gas turbine combustors, can lead to problems regarding unburned hydrocarbons (UHC) and carbon monoxide (CO) emissions, as well as the occurrence of combustion instabilities. The concurrent development of suitable analytical and numerical models that are validated with experimental studies is important for achieving this objective. A major benefit of the present research will be to provide for the first time an experimentally verified model of emissions and performance of gas turbine combustors. The present study represents a coordinated effort between industry, government and academia to investigate gas turbine combustion dynamics. Specific study areas include development of advanced diagnostics, definition of controlling phenomena, advancement of analytical and numerical modeling capabilities, and assessment of the current status of our ability to apply these tools to practical gas turbine combustors. The present work involves four tasks which address, respectively, (1) the development of a fiber-optic probe for fuel-air ratio measurements, (2) the study of combustion instability using laser-based diagnostics in a high pressure, high temperature flow reactor, (3) the development of analytical and numerical modeling capabilities for describing combustion instability which will be validated against experimental data, and (4) the preparation of a literature survey and establishment of a data base on practical experience with combustion instability.

  15. Generation and analysis of large reliability models

    Science.gov (United States)

    Palumbo, Daniel L.; Nicol, David M.

    1990-01-01

    An effort has been underway for several years at NASA's Langley Research Center to extend the capability of Markov modeling techniques for reliability analysis to the designers of highly reliable avionic systems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG), a software tool which uses as input a graphical, object-oriented block diagram of the system, is discussed. RMG uses an automated failure modes-effects analysis algorithm to produce the reliability model from the graphical description. Also considered is the ASSURE software tool, a parallel processing program which uses the ASSIST modeling language and SURE semi-Markov solution technique. An executable failure modes-effects analysis is used by ASSURE. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that large system architectures can now be analyzed.
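
    As a small-scale illustration of the Markov modeling approach that tools like RMG and ASSURE automate, the sketch below solves a three-state continuous-time Markov reliability model directly; the failure rate and mission time are assumed.

    ```python
    # Continuous-time Markov reliability sketch: two redundant components
    # with failure rate lam each; state 0 = both up, 1 = one up,
    # 2 = system failed (absorbing). Rates are illustrative only.
    import numpy as np
    from scipy.linalg import expm

    lam = 1e-4                          # per-hour failure rate (assumed)
    Q = np.array([[-2*lam, 2*lam,  0.0],
                  [0.0,   -lam,    lam],
                  [0.0,    0.0,    0.0]])   # generator matrix (rows sum to 0)

    p0 = np.array([1.0, 0.0, 0.0])      # start with both components up
    p_t = p0 @ expm(Q * 1000.0)         # state probabilities after 1000 hours
    print(1.0 - p_t[2])                 # system reliability at t = 1000 h
    ```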

  16. Hypersonic - Model Analysis as a Service

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald

    2014-01-01

    Hypersonic is a Cloud-based tool that proposes a new approach to the deployment of model analysis facilities. It is implemented as a RESTful Web service API offering analysis features such as model clone detection. This approach allows the migration of resource-intensive analysis algorithms from monolithic desktop modeling tools to a wide range of mobile and Web-based clients. As a technology demonstrator, a Web application acting as a client for the Hypersonic API has been implemented and made publicly available.

  17. Reusable Launch Vehicle (RLV) Market Analysis Model

    Science.gov (United States)

    Prince, Frank A.

    1999-01-01

    The RLV Market Analysis model is at best a rough-order approximation of actual market behavior. However, it does give a quick indication of whether the flights exist to enable an economically viable RLV, and of the assumptions necessary for the vehicle to capture those flights. Additional analysis, market research, and updating with the latest information on payloads and launches would improve the model. Plans are to update the model as new information becomes available and new requirements are levied. This tool will continue to be a vital part of NASA's RLV business analysis capability for the foreseeable future.

  18. Hierarchical modeling and analysis for spatial data

    CERN Document Server

    Banerjee, Sudipto; Gelfand, Alan E

    2003-01-01

    Among the many uses of hierarchical modeling, its application to the statistical analysis of spatial and spatio-temporal data from areas such as epidemiology and environmental science has proven particularly fruitful. Yet to date, the few books that address the subject have been either too narrowly focused on specific aspects of spatial analysis, or written at a level often inaccessible to those lacking a strong background in mathematical statistics. Hierarchical Modeling and Analysis for Spatial Data is the first accessible, self-contained treatment of hierarchical methods, modeling, and dat

  19. Representing uncertainty on model analysis plots

    Directory of Open Access Journals (Sweden)

    Trevor I. Smith

    2016-09-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.
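
    A minimal sketch of a model plot with error bars, assuming matplotlib and invented class-level probabilities and uncertainties (this is illustrative, not the author's template):

    ```python
    # Each class is plotted as a point whose coordinates are the
    # probabilities of the correct model and of the dominant incorrect
    # model; data and uncertainties below are invented.
    import matplotlib.pyplot as plt

    p_correct = [0.62, 0.45, 0.80]   # per-class correct-model probability
    p_model2  = [0.25, 0.40, 0.12]   # per-class incorrect-model probability
    err_x = [0.05, 0.06, 0.04]       # assumed uncertainties
    err_y = [0.04, 0.05, 0.03]

    plt.errorbar(p_correct, p_model2, xerr=err_x, yerr=err_y, fmt='o')
    plt.xlabel('P(correct model)')
    plt.ylabel('P(incorrect model)')
    plt.xlim(0, 1); plt.ylim(0, 1)
    plt.show()
    ```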

  20. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science: compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  1. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independently of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  2. Sensitivity Analysis of a Physiochemical Interaction Model ...

    African Journals Online (AJOL)

    The mathematical modelling of physiochemical interactions in the framework of industrial and environmental physics usually relies on an initial value problem which is described by a single first-order ordinary differential equation. In this analysis, we study the sensitivity of the solution to a variation of the initial condition ...
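
    A sketch of the basic idea, assuming a generic first-order initial value problem (the right-hand side below is an arbitrary stand-in, not the physiochemical model of the paper): the sensitivity to the initial condition can be approximated by differencing two trajectories.

        import numpy as np
        from scipy.integrate import solve_ivp

        def f(t, y):
            # Stand-in first-order dynamics; replace with the actual model.
            return -0.5 * y + np.sin(t)

        t_eval = np.linspace(0.0, 10.0, 101)
        y0, dy0 = 1.0, 1e-6

        base = solve_ivp(f, (0.0, 10.0), [y0], t_eval=t_eval).y[0]
        pert = solve_ivp(f, (0.0, 10.0), [y0 + dy0], t_eval=t_eval).y[0]

        sensitivity = (pert - base) / dy0   # approximates dy(t)/dy0
        print(sensitivity[-1])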

  3. Credit Risk Evaluation : Modeling - Analysis - Management

    OpenAIRE

    Wehrspohn, Uwe

    2002-01-01

    An analysis and further development of the building blocks of modern credit risk management: definitions of default; estimation of default probabilities; exposures; recovery rates; pricing; concepts of portfolio dependence; time horizons for risk calculations; quantification of portfolio risk; estimation of risk measures; portfolio analysis and portfolio improvement; evaluation and comparison of credit risk models; analytic portfolio loss distributions. The thesis contributes to the evaluatio...

  4. A Bayesian Nonparametric Meta-Analysis Model

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G.

    2015-01-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall…
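
    For contrast with the nonparametric approach, the conventional normal random-effects model mentioned above is often fitted with the DerSimonian-Laird moment estimator; a self-contained sketch on made-up effect sizes:

        import numpy as np

        # Made-up study effect sizes and within-study variances.
        y = np.array([0.30, 0.10, 0.45, 0.25])
        v = np.array([0.02, 0.03, 0.05, 0.01])

        # DerSimonian-Laird estimate of the between-study variance tau^2.
        w = 1.0 / v
        y_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - y_fixed) ** 2)
        df = len(y) - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - df) / c)

        # Random-effects pooled mean and its standard error.
        w_re = 1.0 / (v + tau2)
        mu = np.sum(w_re * y) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        print(f"tau^2 = {tau2:.4f}, pooled effect = {mu:.3f} +/- {1.96 * se:.3f}")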

  5. Premium adjustment: actuarial analysis on epidemiological models ...

    African Journals Online (AJOL)

    In this paper, we analyse insurance premium adjustment in the context of an epidemiological model where the insurer's future financial liability is greater than the premium from patients. In this situation, it becomes ...

  6. Model Based Analysis of Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Han, Tingting; Kammueller, Florian

    2016-01-01

    In order to detect malicious insider attacks it is important to model and analyse infrastructures and policies of organisations and the insiders acting within them. We extend formal approaches that allow modelling such scenarios by quantitative aspects to enable a precise analysis of security...

  7. Independent Component Analysis in Multimedia Modeling

    DEFF Research Database (Denmark)

    Larsen, Jan

    2003-01-01

    Modeling of multimedia and multimodal data becomes increasingly important with the digitalization of the world. The objective of this paper is to demonstrate the potential of independent component analysis and blind sources separation methods for modeling and understanding of multimedia data, which...
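
    A minimal blind-source-separation sketch in the spirit of the paper, using scikit-learn's FastICA on synthetic mixtures (the source signals and mixing matrix are invented for the demo):

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(0, 8, 2000)

        # Two invented source signals and a random mixing matrix.
        s1 = np.sin(2 * t)
        s2 = np.sign(np.sin(3 * t))
        S = np.c_[s1, s2]
        A = rng.normal(size=(2, 2))
        X = S @ A.T                      # observed mixtures

        ica = FastICA(n_components=2, random_state=0)
        S_est = ica.fit_transform(X)     # recovered sources (up to scale/order)
        print(S_est.shape)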

  8. Perturbation analysis of nonlinear matrix population models

    Directory of Open Access Journals (Sweden)

    Hal Caswell

    2008-03-01

    Full Text Available Perturbation analysis examines the response of a model to changes in its parameters. It is commonly applied to population growth rates calculated from linear models, but there has been no general approach to the analysis of nonlinear models. Nonlinearities in demographic models may arise due to density-dependence, frequency-dependence (in 2-sex models), feedback through the environment or the economy, and recruitment subsidy due to immigration, or from the scaling inherent in calculations of proportional population structure. This paper uses matrix calculus to derive the sensitivity and elasticity of equilibria, cycles, ratios (e.g. dependency ratios), age averages and variances, temporal averages and variances, life expectancies, and population growth rates, for both age-classified and stage-classified models. Examples are presented, applying the results to both human and non-human populations.
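
    In the linear special case, the sensitivity of the population growth rate λ (the dominant eigenvalue of the projection matrix A) is the classical s_ij = v_i w_j / <v, w>, with w and v the right and left dominant eigenvectors; a sketch with an invented two-stage matrix:

        import numpy as np

        A = np.array([[0.0, 1.5],        # invented stage-classified projection matrix
                      [0.6, 0.8]])

        vals, W = np.linalg.eig(A)
        k = np.argmax(vals.real)
        lam, w = vals.real[k], W[:, k].real     # growth rate, stable structure

        vals_l, V = np.linalg.eig(A.T)          # A and A.T share eigenvalues
        kl = np.argmax(vals_l.real)
        v = V[:, kl].real                       # reproductive values

        S = np.outer(v, w) / (v @ w)            # sensitivities d(lambda)/d(a_ij)
        E = (A / lam) * S                       # elasticities
        print(lam, "\n", S, "\n", E)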

  9. Ignalina NPP Safety Analysis: Models and Results

    International Nuclear Information System (INIS)

    Uspuras, E.

    1999-01-01

    The research directions of the scientific safety analysis group, linked to the safety assessment of the Ignalina NPP, are presented: thermal-hydraulic analysis of accidents and operational transients; thermal-hydraulic assessment of the Ignalina NPP Accident Localization System and other compartments; structural analysis of plant components, piping and other parts of the Main Circulation Circuit; assessment of the RBMK-1500 reactor core; and others. The models and the main work carried out during the last year are described. (author)

  10. MSSV Modeling for Wolsong-1 Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Bok Ja; Choi, Chul Jin; Kim, Seoung Rae [KEPCO EandC, Daejeon (Korea, Republic of)

    2010-10-15

    The main steam safety valves (MSSVs) are installed on the main steam line to prevent overpressurization of the system. MSSVs are held in the closed position by spring force, and the valves pop open by internal force when the main steam pressure increases to the opening set pressure. If the overpressure condition is relieved, the valves begin to close. For the safety analysis of anticipated accident conditions, the safety systems are modeled conservatively so that the simulated accident conditions are more severe. MSSVs are likewise modeled conservatively for the analysis of over-pressurization accidents. In this paper, the pressure transient is analyzed under over-pressurization conditions to evaluate the conservatism of the MSSV models.

  11. Model selection criterion in survival analysis

    Science.gov (United States)

    Karabey, Uǧur; Tutkun, Nihal Ata

    2017-07-01

    Survival analysis deals with the time until occurrence of an event of interest such as death, recurrence of an illness, the failure of equipment, or divorce. There are various survival models, with semi-parametric or parametric approaches, used in the medical, natural and social sciences. The decision on the most appropriate model for the data is an important point of the analysis. In the literature, the Akaike information criterion or the Bayesian information criterion is used to select among nested models. In this study, the behavior of these information criteria is discussed for a real data set.
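
    As a toy version of such a comparison, the sketch below fits an exponential survival model by maximum likelihood on simulated right-censored data and computes its AIC and BIC; a real analysis would score several candidate distributions the same way and pick the smallest criterion value.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200
        T_true = rng.exponential(5.0, n)        # simulated event times
        C = rng.exponential(8.0, n)             # simulated censoring times
        T = np.minimum(T_true, C)               # observed time
        E = (T_true <= C).astype(float)         # 1 = event observed, 0 = censored

        # Exponential model: the MLE of the rate is (events) / (total time at risk).
        rate = E.sum() / T.sum()
        loglik = E.sum() * np.log(rate) - rate * T.sum()

        k = 1                                   # one free parameter
        aic = 2 * k - 2 * loglik
        bic = k * np.log(n) - 2 * loglik
        print(f"rate={rate:.3f}, AIC={aic:.1f}, BIC={bic:.1f}")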

  12. Model Checking as Static Analysis: Revisited

    DEFF Research Database (Denmark)

    Zhang, Fuyuan; Nielson, Flemming; Nielson, Hanne Riis

    2012-01-01

    We show that the model checking problem of the μ-calculus can be viewed as an instance of static analysis. We propose Succinct Fixed Point Logic (SFP) within our logical approach to static analysis as an extension of Alternation-free Least Fixed Point Logic (ALFP). We generalize the notion of stratification to weak stratification and establish a Moore Family result for the new logic as well. The semantics of the μ-calculus is encoded as the intended model of weakly stratified clause sequences in SFP.

  13. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...

  14. A Legal Institutional Perspective on the European External Action Service

    DEFF Research Database (Denmark)

    Van Vooren, Bart

    2011-01-01

    ..., their support services and EU agencies, and seeks to define the EEAS’ sui generis status in the EU institutional set-up: What are the implications of its absence of legal personality, what does its ‘functional autonomy’ from the Council and Commission imply, what are its formal powers – if any, and could...

  15. Development of stock markets, societal norms and legal institutions

    NARCIS (Netherlands)

    Garretsen, Harry; Lensink, Robert; Sterken, Elmer

    2000-01-01

    We explain the development of stock markets by both legal and societal determinants and analyze the relevance of both determinants in the Levine-Zervos (1998) cross-sectional growth regressions. We argue that the legal indicators as developed by La Porta, Lopez-de-Silanes, Shleifer and Vishny (1998)

  16. Sensitivity Analysis in Sequential Decision Models.

    Science.gov (United States)

    Chen, Qiushi; Ayer, Turgay; Chhatwal, Jagpreet

    2017-02-01

    Sequential decision problems, which are commonly solved using Markov decision processes (MDPs), are frequently encountered in medical decision making. Modeling guidelines recommend conducting sensitivity analyses in decision-analytic models to assess the robustness of the model results against uncertainty in model parameters. However, standard methods of conducting sensitivity analyses cannot be directly applied to sequential decision problems because this would require evaluating all possible decision sequences, typically on the order of trillions, which is not practically feasible. As a result, most MDP-based modeling studies do not examine confidence in their recommended policies. In this study, we provide an approach to estimate uncertainty and confidence in the results of sequential decision models. First, we provide a probabilistic univariate method to identify the most sensitive parameters in MDPs. Second, we present a probabilistic multivariate approach to estimate the overall confidence in the recommended optimal policy considering joint uncertainty in the model parameters. We provide a graphical representation, which we call a policy acceptability curve, to summarize the confidence in the optimal policy by incorporating stakeholders' willingness to accept the base case policy. For a cost-effectiveness analysis, we provide an approach to construct a cost-effectiveness acceptability frontier, which shows the most cost-effective policy as well as the confidence in it for a given willingness-to-pay threshold. We demonstrate our approach using a simple MDP case study. We developed a method to conduct sensitivity analysis in sequential decision models, which could increase the credibility of these models among stakeholders.
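
    A compressed, hypothetical illustration of the multivariate idea on a two-state, two-action MDP: sample the uncertain parameters repeatedly, re-solve the MDP each time, and report how often the base-case optimal policy remains optimal (the transition probabilities, rewards and uncertainty model below are all invented):

        import numpy as np

        rng = np.random.default_rng(0)
        gamma = 0.95

        def optimal_policy(P, R, iters=500):
            # Value iteration on Q[s, a] = R[s, a] + gamma * sum_s' P[s, a, s'] V[s'].
            V = np.zeros(P.shape[0])
            for _ in range(iters):
                Q = R + gamma * (P @ V)
                V = Q.max(axis=1)
            return tuple(Q.argmax(axis=1))

        # Invented base-case MDP: P[s, a, s'] and R[s, a].
        P = np.array([[[0.9, 0.1], [0.6, 0.4]],
                      [[0.2, 0.8], [0.5, 0.5]]])
        R = np.array([[1.0, 1.2], [0.3, 0.5]])
        base = optimal_policy(P, R)

        # Probabilistic multivariate SA: jitter the rewards, count policy agreement.
        agree, n_samples = 0, 1000
        for _ in range(n_samples):
            R_s = R + rng.normal(scale=0.2, size=R.shape)  # joint uncertainty
            agree += optimal_policy(P, R_s) == base
        print(f"confidence in base-case policy: {agree / n_samples:.2f}")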

  17. Critical analysis of algebraic collective models

    International Nuclear Information System (INIS)

    Moshinsky, M.

    1986-01-01

    The author understands by algebraic collective models all those based on specific Lie algebras, whether the latter are suggested through simple shell model considerations, as in the case of the Interacting Boson Approximation (IBA), or have a detailed microscopic foundation, as in the symplectic model. To analyze these models critically, it is convenient to take a simple conceptual example in which all steps can be implemented analytically or through elementary numerical analysis. In this note he takes as an example the symplectic model in a two-dimensional space, i.e. based on an sp(4,R) Lie algebra, and shows how, through its complete discussion, we can get a clearer understanding of the structure of algebraic collective models of nuclei. In particular he discusses the association of Hamiltonians, related to maximal subalgebras of our basic Lie algebra, with specific types of spectra, and the connections between spectra and shapes

  18. LBLOCA sensitivity analysis using meta models

    International Nuclear Information System (INIS)

    Villamizar, M.; Sanchez-Saez, F.; Villanueva, J.F.; Carlos, S.; Sanchez, A.I.; Martorell, S.

    2014-01-01

    This paper presents an approach to performing sensitivity analysis of the results of thermal-hydraulic code simulations within a BEPU approach. The sensitivity analysis is based on the computation of Sobol' indices and makes use of a meta-model. The paper also presents an application to a Large-Break Loss of Coolant Accident (LBLOCA) in the cold leg of a pressurized water reactor (PWR), addressing the results of the BEMUSE program and using the thermal-hydraulic code TRACE. (authors)
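
    The scheme can be caricatured in a few lines: fit a cheap surrogate (here a quadratic polynomial) to a handful of runs of the expensive code (here a toy analytic function), then estimate first-order Sobol' indices on the surrogate by Monte Carlo; everything below is invented for the illustration and is unrelated to the TRACE calculations.

        import numpy as np

        rng = np.random.default_rng(0)

        def code(x1, x2):                 # stand-in for an expensive code run
            return np.sin(x1) + 0.3 * x2 ** 2

        # "Train" a cheap polynomial meta-model on a small design.
        X = rng.uniform(-np.pi, np.pi, size=(50, 2))
        y = code(X[:, 0], X[:, 1])
        design = np.column_stack([np.ones(50), X[:, 0], X[:, 1],
                                  X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
        coef, *_ = np.linalg.lstsq(design, y, rcond=None)

        def meta(x1, x2):
            return (coef[0] + coef[1] * x1 + coef[2] * x2
                    + coef[3] * x1 ** 2 + coef[4] * x2 ** 2 + coef[5] * x1 * x2)

        # First-order Sobol' index S_i = Var(E[Y|X_i]) / Var(Y), by Monte Carlo.
        N = 20000
        A = rng.uniform(-np.pi, np.pi, size=(N, 2))
        fA = meta(A[:, 0], A[:, 1])
        var_y, f0 = fA.var(), fA.mean()
        for i in range(2):
            B = rng.uniform(-np.pi, np.pi, size=(N, 2))
            B[:, i] = A[:, i]             # freeze X_i, resample the rest
            fB = meta(B[:, 0], B[:, 1])
            S_i = (np.mean(fA * fB) - f0 ** 2) / var_y
            print(f"S_{i + 1} ~= {S_i:.2f}")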

  19. An Extensible Model and Analysis Framework

    Science.gov (United States)

    2010-11-01

    of a pre-existing, open-source modeling and analysis framework known as Ptolemy II (http://ptolemy.org). The University of California, Berkeley...worked with the Air Force Research Laboratory, Rome Research Site on adapting Ptolemy II for modeling and simulation of large scale dynamics of Political...capabilities were prototyped in Ptolemy II and delivered via version control and software releases. Each of these capabilities specifically supports one or

  20. Review and analysis of biomass gasification models

    DEFF Research Database (Denmark)

    Puig Arnavat, Maria; Bruno, Joan Carles; Coronas, Alberto

    2010-01-01

    The use of biomass as a source of energy has been further enhanced in recent years and special attention has been paid to biomass gasification. Due to the increasing interest in biomass gasification, several models have been proposed in order to explain and understand this complex process......, and the design, simulation, optimisation and process analysis of gasifiers have been carried out. This paper presents and analyses several gasification models based on thermodynamic equilibrium, kinetics and artificial neural networks. The thermodynamic models are found to be a useful tool for preliminary...

  1. Modeling and analysis of stochastic systems

    CERN Document Server

    Kulkarni, Vidyadhar G

    2011-01-01

    Based on the author's more than 25 years of teaching experience, Modeling and Analysis of Stochastic Systems, Second Edition covers the most important classes of stochastic processes used in the modeling of diverse systems, from supply chains and inventory systems to genetics and biological systems. For each class of stochastic process, the text includes its definition, characterization, applications, transient and limiting behavior, first passage times, and cost/reward models. Along with reorganizing the material, this edition revises and adds new exercises and examples. New to the second edition...

  2. Numerical analysis of the rebellious voter model

    Czech Academy of Sciences Publication Activity Database

    Swart, Jan M.; Vrbenský, Karel

    2010-01-01

    Roč. 140, č. 5 (2010), s. 873-899 ISSN 0022-4715 R&D Projects: GA ČR GA201/09/1931; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : rebellious voter model * parity conservation * exactly solvable model * coexistence * interface tightness * cancellative systems * Markov chain Monte Carlo Subject RIV: BA - General Mathematics Impact factor: 1.447, year: 2010 http://library.utia.cas.cz/separaty/2010/SI/swart-numerical analysis of the rebellious voter model.pdf

  3. Theoretical Analysis of a Modified Continuum Model

    Science.gov (United States)

    Ge, Hong-Xia; Wu, Shu-Zhen; Cheng, Rong-Jun; Lo, Siu-ming

    2011-09-01

    Based on the optimal velocity (OV) model, a new car-following model for traffic flow with the consideration of the driver's forecast effect (DFE) was proposed by Tang et al., which can be used to describe some complex traffic phenomena better. Using an asymptotic approximation between the headway and density, we obtain a new macro continuum version of the car-following model with the DFE. The linear stability theory is applied to derive the neutral stability condition. The Korteweg-de Vries equation near the neutral stability line is given by nonlinear analysis and the corresponding solution for the traffic density wave is derived.

  4. Model Selection in Data Analysis Competitions

    DEFF Research Database (Denmark)

    Wind, David Kofoed; Winther, Ole

    2014-01-01

    The use of data analysis competitions for selecting the most appropriate model for a problem is a recent innovation in the field of predictive machine learning. Two of the most well-known examples of this trend were the Netflix Competition and recently the competitions hosted on the online platfor...

  5. Modeling and analysis of metrics databases

    OpenAIRE

    Paul, Raymond A.

    1999-01-01

    The main objective of this research is to propose a comprehensive framework for quality and risk management in the software development process based on analysis and modeling of software metrics data. Existing software metrics work has focused mainly on the type of metrics to be collected ...

  6. Stochastic Modelling and Analysis of Warehouse Operations

    NARCIS (Netherlands)

    Y. Gong (Yeming)

    2009-01-01

    This thesis has studied stochastic models and analysis of warehouse operations. After an overview of stochastic research in warehouse operations, we explore the following topics. Firstly, we search for optimal batch sizes in a parallel-aisle warehouse with online order arrivals. We employ a

  7. Power system stability modelling, analysis and control

    CERN Document Server

    Sallam, Abdelhay A

    2015-01-01

    This book provides a comprehensive treatment of the subject from both a physical and mathematical perspective and covers a range of topics including modelling, computation of load flow in the transmission grid, stability analysis under both steady-state and disturbed conditions, and appropriate controls to enhance stability.

  8. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    Full Text Available In this paper we present a comparison of the distributions used in hazard analysis. A simulation technique has been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We demonstrate the flexibility of the hazard modeling distribution, which can approach different distributions.

  9. Hydrographic Basins Analysis Using Digital Terrain Modelling

    Science.gov (United States)

    Mihaela, Pişleagă; -Minda Codruţa, Bădăluţă; Gabriel, Eleş; Daniela, Popescu

    2017-10-01

    The paper emphasizes the link between digital terrain modelling and studies of hydrographic basins, concerning the analysis of hydrological processes. Given the evolution of computing techniques and of software, digital terrain modelling has made its presence felt increasingly and has established itself as a basic concept in many areas, due to its many advantages. At present, most digital terrain models are derived from three alternative sources: ground surveys, photogrammetric data capture, or digitized cartographic sources. A wide range of features may be extracted from digital terrain models, such as surfaces, specific points and landmarks, and linear features, but also areal features like drainage basins, hills or hydrological basins. The paper highlights how to use appropriate software for the preparation of a digital terrain model, a model which is subsequently used to study hydrographic basins according to various geomorphological parameters. As a final goal, it shows the link between digital terrain modelling and the study of hydrographic basins, which can be used to optimize the correlation between the digital terrain model and hydrological processes in order to obtain results as close as possible to the real field processes.

  10. Guidelines for system modeling: fault tree analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Hwan; Yang, Joon Eon; Kang, Dae Il; Hwang, Mee Jeong

    2004-07-01

    This document, the guidelines for system modeling related to Fault Tree Analysis (FTA), is intended to provide the analyst with guidelines for constructing fault trees at the level of capability category II of the ASME PRA standard. In particular, it provides the essential and basic guidelines, and the related contents, to be used in support of revising the Ulchin 3 and 4 PSA model for the risk monitor within capability category II of the ASME PRA standard. Normally the main objective of system analysis is to assess the reliability of systems modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA is the method used for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to the system unavailability. This should include contributions due to the mechanical failures of the components, Common Cause Failures (CCFs), human errors and outages for testing and maintenance. This document identifies and describes the definitions and the general procedures of FTA and the essential and basic guidelines for revising the fault trees. Accordingly, the guidelines will be able to guide the FTA to the level of capability category II of the ASME PRA standard.
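
    As a minimal illustration of the AND/OR logic such guidelines formalize, the sketch below evaluates the top-event probability of a toy fault tree under the usual independence assumption (the gate structure and basic-event probabilities are invented; real PSA models also handle CCFs, human errors and maintenance unavailabilities):

        # Toy fault tree: TOP = OR(AND(pump_a, pump_b), valve)
        # Basic events are assumed independent.

        def p_and(*ps):       # P(all occur) for independent events
            out = 1.0
            for p in ps:
                out *= p
            return out

        def p_or(*ps):        # P(at least one occurs) for independent events
            out = 1.0
            for p in ps:
                out *= (1.0 - p)
            return 1.0 - out

        pump_a, pump_b, valve = 1e-3, 1e-3, 5e-5
        top = p_or(p_and(pump_a, pump_b), valve)
        print(f"Top event probability: {top:.2e}")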

  11. Social phenomena from data analysis to models

    CERN Document Server

    Perra, Nicola

    2015-01-01

    This book focuses on the new possibilities and approaches to social modeling currently being made possible by an unprecedented variety of datasets generated by our interactions with modern technologies. This area has witnessed a veritable explosion of activity over the last few years, yielding many interesting and useful results. Our aim is to provide an overview of the state of the art in this area of research, merging an extremely heterogeneous array of datasets and models. Social Phenomena: From Data Analysis to Models is divided into two parts. Part I deals with modeling social behavior under normal conditions: How we live, travel, collaborate and interact with each other in our daily lives. Part II deals with societal behavior under exceptional conditions: Protests, armed insurgencies, terrorist attacks, and reactions to infectious diseases. This book offers an overview of one of the most fertile emerging fields bringing together practitioners from scientific communities as diverse as social sciences, p...

  12. Modeling and analysis of calcium bromide hydrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Lottes, Steven A.; Lyczkowski, Robert W.; Panchal, Chandrakant B.; Doctor, Richard D. [Energy Systems Division, Argonne National Laboratory, 9700 S. Cass Avenue, Argonne, IL 60439 (United States)

    2009-05-15

    The main focus of this paper is the modeling, simulation, and analysis of the calcium bromide hydrolysis reactor stage in the calcium-bromine thermochemical water-splitting cycle for nuclear hydrogen production. One reactor concept is to use a spray of calcium bromide into steam, in which the heat of fusion supplies the heat of reaction. Droplet models were built up in a series of steps incorporating various physical phenomena, including droplet flow, heat transfer, phase change, and reaction, separately. Given the large heat reservoir contained in a pool of molten calcium bromide that allows bubbles to rise easily, using a bubble column reactor for the hydrolysis appears to be a feasible and promising alternative to the spray reactor concept. The two limiting cases of bubble geometry, spherical and spherical-cap, are considered in the modeling. Results for both droplet and bubble modeling with COMSOL MULTIPHYSICS trademark are presented, with recommendations for the path forward. (author)

  13. Modelling dominance in a flexible intercross analysis

    Directory of Open Access Journals (Sweden)

    Besnier Francois

    2009-06-01

    Full Text Available Background: The aim of this paper is to develop a flexible model for analysis of quantitative trait loci (QTL) in outbred line crosses, which includes both additive and dominance effects. Our flexible intercross analysis (FIA) model accounts for QTL that are not fixed within founder lines and is based on the variance component framework. Genome scans with FIA are performed using a score statistic, which does not require variance component estimation. Results: Simulations of a pedigree with 800 F2 individuals showed that the power of FIA including both additive and dominance effects was almost 50% for a QTL with equal allele frequencies in both lines, complete dominance and a moderate effect, whereas the power of a traditional regression model was equal to the chosen significance value of 5%. The power of FIA without dominance effects included in the model was close to that obtained for FIA with dominance for all simulated cases except for QTL with overdominant effects. A genome-wide linkage analysis of experimental data from an F2 intercross between Red Jungle Fowl and White Leghorn was performed with both additive and dominance effects included in FIA. The score values for chicken body weight at 200 days of age were similar to those obtained in the FIA analysis without dominance. Conclusion: We have extended FIA to include QTL dominance effects. The power of FIA was superior, or similar, to standard regression methods for QTL effects with dominance. The difference in power for FIA with or without dominance is expected to be small as long as the QTL effects are not overdominant. We suggest that FIA with only additive effects should be the standard model to be used, especially since it is more computationally efficient.

  14. Modeling of creep for structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Naumenko, K.; Altenbach, H. [Halle-Wittenberg Univ., Halle (Germany). Lehrstuhl fuer Technische Mechanik

    2007-07-01

    "Creep Modeling for Structural Analysis" develops methods to simulate and analyze the time-dependent changes of stress and strain states in engineering structures up to the critical stage of creep rupture. The principal subjects of creep mechanics are the formulation of constitutive equations for creep in structural materials under multi-axial stress states; the application of structural mechanics models of beams, plates, shells and three-dimensional solids and the utilization of procedures for the solution of non-linear initial-boundary value problems. The objective of this book is to review some of the classical and recently proposed approaches to the modeling of creep for structural analysis applications as well as to extend the collection of available solutions of creep problems by new, more sophisticated examples. In Chapter 1, the book discusses basic features of the creep behavior in materials and structures and presents an overview of various approaches to the modeling of creep. Chapter 2 collects constitutive models that describe creep and damage processes under multi-axial stress states. Chapter 3 deals with the application of constitutive models to the description of creep for several structural materials. Constitutive and evolution equations, response functions and material constants are presented according to recently published experimental data. In Chapter 4 the authors discuss structural mechanics problems. Governing equations of creep in three-dimensional solids, direct variational methods and time step algorithms are reviewed. Examples are presented to illustrate the application of advanced numerical methods to the structural analysis. An emphasis is placed on the development and verification of creep-damage material subroutines inside the general purpose finite element codes. (orig.)

  15. LCD motion blur: modeling, analysis, and algorithm.

    Science.gov (United States)

    Chan, Stanley H; Nguyen, Truong Q

    2011-08-01

    Liquid crystal display (LCD) devices are well known for their slow responses due to the physical limitations of liquid crystals. Therefore, fast moving objects in a scene are often perceived as blurred. This effect is known as LCD motion blur. In order to reduce LCD motion blur, an accurate LCD model and an efficient deblurring algorithm are needed. However, existing LCD motion blur models are insufficient to reflect the limitations of the human eye-tracking system. Also, the spatiotemporal equivalence in LCD motion blur models has not been proven directly in the discrete 2-D spatial domain, although it is widely used. There are three main contributions of this paper: modeling, analysis, and algorithm. First, a comprehensive LCD motion blur model is presented, in which human eye-tracking limits are taken into consideration. Second, a complete analysis of spatiotemporal equivalence is provided and verified using real video sequences. Third, an LCD motion blur reduction algorithm is proposed. The proposed algorithm solves an l1-norm regularized least-squares minimization problem using a subgradient projection method. Numerical results show that the proposed algorithm gives higher peak SNR, lower temporal error, and lower spatial error than motion-compensated inverse filtering and the Lucy-Richardson deconvolution algorithm, which are two state-of-the-art LCD deblurring algorithms.
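
    The optimization step can be sketched generically: a plain subgradient iteration for min_x ||Ax - b||^2 + lambda*||x||_1 on a random toy problem (this is not the authors' projection variant, and A here is a random stand-in rather than an LCD blur operator):

        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.normal(size=(100, 50))          # stand-in observation operator
        x_true = np.zeros(50)
        x_true[:5] = 1.0                        # sparse ground truth
        b = A @ x_true + 0.01 * rng.normal(size=100)

        lam = 0.1
        L = 2 * np.linalg.norm(A.T @ A, 2)      # Lipschitz constant of smooth part
        x = np.zeros(50)
        for _ in range(2000):
            g = 2 * A.T @ (A @ x - b) + lam * np.sign(x)   # a subgradient
            x -= g / L
        print(np.round(x[:8], 2))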

  16. International Space Station Model Correlation Analysis

    Science.gov (United States)

    Laible, Michael R.; Fitzpatrick, Kristin; Hodge, Jennifer; Grygier, Michael

    2018-01-01

    This paper summarizes the on-orbit structural dynamic data and the related modal analysis, model validation and correlation performed for the International Space Station (ISS) configuration ISS Stage ULF7, 2015 Dedicated Thruster Firing (DTF). The objective of this analysis is to validate and correlate the analytical models used to calculate the ISS internal dynamic loads and compare the 2015 DTF with previous tests. During the ISS configurations under consideration, on-orbit dynamic measurements were collected using the three main ISS instrumentation systems: the Internal Wireless Instrumentation System (IWIS), the External Wireless Instrumentation System (EWIS) and the Structural Dynamic Measurement System (SDMS). The measurements were recorded during several nominal on-orbit DTF tests on August 18, 2015. Experimental modal analyses were performed on the measured data to extract modal parameters including frequency, damping, and mode shape information. Correlation and comparisons between test and analytical frequencies and mode shapes were performed to assess the accuracy of the analytical models for the configurations under consideration. These mode shapes were also compared to earlier tests. Based on the frequency comparisons, the accuracy of the mathematical models is assessed and model refinement recommendations are given. In particular, results of the first fundamental mode will be discussed, nonlinear results will be shown, and accelerometer placement will be assessed.

  17. 3D face modeling, analysis and recognition

    CERN Document Server

    Daoudi, Mohamed; Veltkamp, Remco

    2013-01-01

    3D Face Modeling, Analysis and Recognition presents methodologies for analyzing shapes of facial surfaces, develops computational tools for analyzing 3D face data, and illustrates them using state-of-the-art applications. The methodologies chosen are based on efficient representations, metrics, comparisons, and classifications of features that are especially relevant in the context of 3D measurements of human faces. These frameworks have a long-term utility in face analysis, taking into account the anticipated improvements in data collection, data storage, processing speeds, and applications

  18. Advances in statistical models for data analysis

    CERN Document Server

    Minerva, Tommaso; Vichi, Maurizio

    2015-01-01

    This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.

  19. Formal Modeling and Analysis of Timed Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Niebert, Peter

    This book constitutes the thoroughly refereed post-proceedings of the First International Workshop on Formal Modeling and Analysis of Timed Systems, FORMATS 2003, held in Marseille, France in September 2003. The 19 revised full papers presented together with an invited paper and the abstracts of two invited talks were carefully selected from 36 submissions during two rounds of reviewing and improvement. All current aspects of formal methods for modeling and analyzing timed systems are addressed; among the timed systems dealt with are timed automata, timed Petri nets, max-plus algebras, real-time systems, discrete time systems, timed languages, and real-time operating systems.

  20. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated the use of impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to have a brighter risk reward it was important to control the likelihood of occurrence of risks rather than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student- and infrastructure-based) and the business impact. Lastly, the study revealed that although business cycles vary considerably depending on the industry and/or the institution, most impacts in an HEI (university) fall within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  1. Micromechatronics modeling, analysis, and design with Matlab

    CERN Document Server

    Giurgiutiu, Victor

    2009-01-01

    Focusing on recent developments in engineering science, enabling hardware, advanced technologies, and software, Micromechatronics: Modeling, Analysis, and Design with MATLAB®, Second Edition provides clear, comprehensive coverage of mechatronic and electromechanical systems. It applies cornerstone fundamentals to the design of electromechanical systems, covers emerging software and hardware, introduces the rigorous theory, examines the design of high-performance systems, and helps develop problem-solving skills. Along with more streamlined material, this edition adds many new sections to existing...

  2. Model Analysis of Hotel Management Contract

    OpenAIRE

    平野, 典男; Hirano, Norio

    2014-01-01

    This paper develops model analysis of hotel management contract by agency theory. A hotel management contract is an arrangement under which the hotel owner transfers the power of attorney to the hotel operator and lets the hotel operator make decisions concerned with hotel operation on behalf of the hotel owner. Under a hotel management contract there is the inherent conflict of interest between the hotel owner and the hotel operator, because the hotel owner has the asset proprietary rights b...

  3. Energy Systems Modelling Research and Analysis

    DEFF Research Database (Denmark)

    Møller Andersen, Frits; Alberg Østergaard, Poul

    2015-01-01

    This editorial introduces the seventh volume of the International Journal of Sustainable Energy Planning and Management. The volume presents part of the outcome of the project Energy Systems Modelling Research and Analysis (ENSYMORA) funded by the Danish Innovation Fund. The project, carried out by 11 university and industry partners, has improved the basis for decision-making within energy planning and energy scenario making by providing new and improved tools and methods for energy systems analyses.

  4. Mathematical analysis of epidemiological models with heterogeneity

    Energy Technology Data Exchange (ETDEWEB)

    Van Ark, J.W.

    1992-01-01

    For many diseases in human populations the disease shows dissimilar characteristics in separate subgroups of the population; for example, the probability of disease transmission for gonorrhea or AIDS is much higher from male to female than from female to male. There is reason to construct and analyze epidemiological models which allow for this heterogeneity of the population, and to use these models to run computer simulations of the disease to predict its incidence and prevalence. In the models considered here the heterogeneous population is separated into subpopulations whose internal and external interactions are homogeneous, in the sense that each person in the population can be assumed to take the average actions of the people of his or her subpopulation. The first model considered is an SIRS model; i.e., the Susceptible can become Infected, and if so eventually Recovers with temporary immunity, after a period of time becoming Susceptible again. Special cases allow for permanent immunity or other variations. This model is analyzed and threshold conditions are given which determine whether the disease dies out or persists. A deterministic model is presented; this model is constructed using difference equations, and it has been used in computer simulations for the AIDS epidemic in the homosexual population in San Francisco. The homogeneous and heterogeneous versions of both the differential-equations and difference-equations forms of the deterministic model are analyzed mathematically. In the analysis, equilibria are identified and threshold conditions are set forth: below the threshold the disease dies out and the disease-free equilibrium is globally asymptotically stable; above the threshold the disease persists, the disease-free equilibrium is unstable, and there is a unique endemic equilibrium.
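
    A minimal homogeneous SIRS sketch of the kind the thesis generalizes to heterogeneous subpopulations, with the threshold governed by R0 = beta/gamma (all rates below are invented):

        import numpy as np
        from scipy.integrate import solve_ivp

        beta, gamma, xi = 0.3, 0.1, 0.05   # infection, recovery, immunity-loss rates

        def sirs(t, y):
            S, I, R = y
            return [-beta * S * I + xi * R,
                    beta * S * I - gamma * I,
                    gamma * I - xi * R]

        sol = solve_ivp(sirs, (0, 400), [0.99, 0.01, 0.0])
        print(f"R0 = {beta / gamma:.1f} (>1: disease persists; <1: dies out)")
        print(f"infected fraction at end of run ~ {sol.y[1, -1]:.3f}")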

  5. 3D space analysis of dental models

    Science.gov (United States)

    Chuah, Joon H.; Ong, Sim Heng; Kondo, Toshiaki; Foong, Kelvin W. C.; Yong, Than F.

    2001-05-01

    Space analysis is an important procedure performed by orthodontists to determine the amount of space available and required for teeth alignment during treatment planning. Traditional manual methods of space analysis are tedious and often inaccurate. Computer-based space analysis methods that work on 2D images have been reported. However, as the space problems in the dental arch exist in all three planes of space, a full 3D analysis of the problems is necessary. This paper describes a visualization and measurement system that analyses 3D images of dental plaster models. Algorithms were developed to determine dental arches. The system is able to record the depths of the Curve of Spee, and quantify space liabilities arising from a non-planar Curve of Spee, malalignment and overjet. Furthermore, the difference between the total arch space available and the space required to arrange the teeth in ideal occlusion can be accurately computed. The system for 3D space analysis of the dental arch is an accurate, comprehensive, rapid and repeatable method of space analysis to facilitate proper orthodontic diagnosis and treatment planning.

  6. Modeling and analysis of advanced binary cycles

    Energy Technology Data Exchange (ETDEWEB)

    Gawlik, K.

    1997-12-31

    A computer model (Cycle Analysis Simulation Tool, CAST) and a methodology have been developed to perform value analysis for small, low- to moderate-temperature binary geothermal power plants. The value analysis method allows for incremental changes in the levelized electricity cost (LEC) to be determined between a baseline plant and a modified plant. Thermodynamic cycle analyses and component sizing are carried out in the model followed by economic analysis which provides LEC results. The emphasis of the present work is on evaluating the effect of mixed working fluids instead of pure fluids on the LEC of a geothermal binary plant that uses a simple Organic Rankine Cycle. Four resources were studied spanning the range of 265°F to 375°F. A variety of isobutane and propane based mixtures, in addition to pure fluids, were used as working fluids. This study shows that the use of propane mixtures at a 265°F resource can reduce the LEC by 24% when compared to a base case value that utilizes commercial isobutane as its working fluid. The cost savings drop to 6% for a 375°F resource, where an isobutane mixture is favored. Supercritical cycles were found to have the lowest cost at all resources.

  7. Session 6: Dynamic Modeling and Systems Analysis

    Science.gov (United States)

    Csank, Jeffrey; Chapman, Jeffryes; May, Ryan

    2013-01-01

    These presentations cover some of the ongoing work in dynamic modeling and dynamic systems analysis. The first presentation discusses dynamic systems analysis and how to integrate dynamic performance information into the systems analysis. The ability to evaluate the dynamic performance of an engine design may allow tradeoffs between the dynamic performance and operability of a design, resulting in a more efficient engine design. The second presentation discusses the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a simulation system with a library containing the basic building blocks that can be used to create dynamic thermodynamic systems. Some of the key features include turbomachinery components, such as turbines, compressors, etc., and basic control system blocks. T-MATS is written in the Matlab-Simulink environment and is open source software. The third presentation focuses on getting additional performance from the engine by allowing the limit regulators to be active only when a limit is in danger of being violated. Typical aircraft engine control architecture is based on a MIN-MAX scheme, which is designed to keep the engine operating within prescribed mechanical/operational safety limits. Using a conditionally active min-max limit regulator scheme, additional performance can be gained by disabling non-relevant limit regulators.

  8. Corneal modeling for analysis of photorefractive keratectomy

    Science.gov (United States)

    Della Vecchia, Michael A.; Lamkin-Kennard, Kathleen

    1997-05-01

    Procedurally, excimer photorefractive keratectomy is based on the refractive correction of composite spherical and cylindrical ophthalmic errors of the entire eye. These refractive errors are inputted for correction at the corneal plane and for the properly controlled duration and location of laser energy. Topography is usually taken to correspondingly monitor spherical and cylindrical corneorefractive errors. While a corneal topographer provides surface morphologic information, the keratorefractive photoablation is based on the patient's spherical and cylindrical spectacle correction. Topography is at present not directly part of the procedural deterministic parameters. Examination of how corneal curvature at each of the keratometric reference loci affects the shape of the resultant corneal photoablated surface may enhance the accuracy of the desired correction. The objective of this study was to develop a methodology to utilize corneal topography for the construction of models depicting pre- and post-operative keratomorphology for analysis of photorefractive keratectomy. Multiple types of models were developed and then recreated in optical design software for examination of focal lengths and other optical characteristics. The corneal models were developed using data extracted from the TMS I corneal modeling system (Computed Anatomy, New York, NY). The TMS I does not allow for manipulation of data or differentiation of pre- and post-operative surfaces within its platform, thus models needed to be created for analysis. The data were imported into Matlab where 3D models, surface meshes, and contour plots were created. The data used to generate the models were pre- and post-operative curvatures, heights from the corneal apex, and x-y positions at 6400 locations on the corneal surface. Outlying non-contributory points were eliminated through statistical operations. Pre- and post-operative models were analyzed to obtain the resultant changes in the corneal surfaces during PRK

  9. Model reduction using a posteriori analysis

    KAUST Repository

    Whiteley, Jonathan P.

    2010-05-01

    Mathematical models in biology and physiology are often represented by large systems of non-linear ordinary differential equations. In many cases, an observed behaviour may be written as a linear functional of the solution of this system of equations. A technique is presented in this study for automatically identifying key terms in the system of equations that are responsible for a given linear functional of the solution. This technique is underpinned by ideas drawn from a posteriori error analysis. This concept has been used in finite element analysis to identify regions of the computational domain and components of the solution where a fine computational mesh should be used to ensure accuracy of the numerical solution. We use this concept to identify regions of the computational domain and components of the solution where accurate representation of the mathematical model is required for accuracy of the functional of interest. The technique presented is demonstrated by application to a model problem, and then to automatically deduce known results from a cell-level cardiac electrophysiology model. © 2010 Elsevier Inc.

  10. Analysis of software for modeling atmospheric dispersion

    International Nuclear Information System (INIS)

    Grandamas, O.; Hubert, Ph.; Pages, P.

    1989-09-01

    During the last few years, a number of software packages for microcomputers have appeared with the aim of simulating the diffusion of atmospheric pollutants. These codes, which simplify the models used for the safety analyses of industrial plants, are becoming more widely used, even for post-accident conditions. The report presents, for the first time and in a critical manner, the principal models available to date. The problem lies in adapting the models to the interventions demanded after an accident. In parallel, an analysis of performance requirements was carried out: identifying the need to forecast the most appropriate actions to perform, bearing in mind the short time available and the lack of information. Because of these difficulties, it is possible to simplify the software, which would not include all the options but could deal with a specific situation. This would enable minimisation of the data to be collected on the site [fr]

  11. Data analysis and approximate models model choice, location-scale, analysis of variance, nonparametric regression and image analysis

    CERN Document Server

    Davies, Patrick Laurie

    2014-01-01

    Contents: Introduction: Introduction; Approximate Models; Notation; Two Modes of Statistical Analysis; Towards One Mode of Analysis; Approximation, Randomness, Chaos, Determinism. Approximation: A Concept of Approximation; Approximation; Approximating a Data Set by a Model; Approximation Regions; Functionals and Equivariance; Regularization and Optimality; Metrics and Discrepancies; Strong and Weak Topologies; On Being (almost) Honest; Simulations and Tables; Degree of Approximation and p-values; Scales; Stability of Analysis; The Choice of En(α, P); Independence; Procedures, Approximation and Vagueness. Discrete Models: The Empirical Density; Metrics and Discrepancies; The Total Variation Metric; The Kullback-Leibler and Chi-Squared Discrepancies; The Po(λ) Model; The b(k, p) and nb(k, p) Models; The Flying Bomb Data; The Student Study Times Data. Outliers: Outliers, Data Analysis and Models; Breakdown Points and Equivariance; Identifying Outliers and Breakdown; Outliers in Multivariate Data; Outliers in Linear Regression; Outliers in Structured Data; The Location...

  12. Modeling Analysis For Grout Hopper Waste Tank

    International Nuclear Information System (INIS)

    Lee, S.

    2012-01-01

    The Saltstone facility at Savannah River Site (SRS) has a grout hopper tank to provide agitator stirring of the Saltstone feed materials. The tank has about 300 gallon capacity to provide a larger working volume for the grout nuclear waste slurry to be held in case of a process upset, and it is equipped with a mechanical agitator, which is intended to keep the grout in motion and agitated so that it won't start to set up. The primary objective of the work was to evaluate the flow performance for mechanical agitators to prevent vortex pull-through for an adequate stirring of the feed materials and to estimate an agitator speed which provides acceptable flow performance with a 45° pitched four-blade agitator. In addition, the power consumption required for the agitator operation was estimated. The modeling calculations were performed by taking two steps of the Computational Fluid Dynamics (CFD) modeling approach. As a first step, a simple single-stage agitator model with 45° pitched propeller blades was developed for the initial scoping analysis of the flow pattern behaviors for a range of different operating conditions. Based on the initial phase-1 results, the phase-2 model with a two-stage agitator was developed for the final performance evaluations. A series of sensitivity calculations for different designs of agitators and operating conditions have been performed to investigate the impact of key parameters on the grout hydraulic performance in a 300-gallon hopper tank. For the analysis, viscous shear was modeled by using the Bingham plastic approximation. Steady state analyses with a two-equation turbulence model were performed. All analyses were based on three-dimensional results. Recommended operational guidance was developed by using the basic concept that local shear rate profiles and flow patterns can be used as a measure of hydraulic performance and spatial stirring. Flow patterns were estimated by a Lagrangian integration technique along the flow paths

  13. Longitudinal analysis strategies for modelling epigenetic trajectories.

    Science.gov (United States)

    Staley, James R; Suderman, Matthew; Simpkin, Andrew J; Gaunt, Tom R; Heron, Jon; Relton, Caroline L; Tilling, Kate

    2018-02-16

    DNA methylation levels are known to vary over time, and modelling these trajectories is crucial for our understanding of the biological relevance of these changes over time. However, due to the computational cost of fitting multilevel models across the epigenome, most trajectory modelling efforts to date have focused on a subset of CpG sites identified through epigenome-wide association studies (EWAS) at individual time-points. We propose using linear regression across the repeated measures, estimating cluster-robust standard errors using a sandwich estimator, as a less computationally intensive strategy than multilevel modelling. We compared these two longitudinal approaches, as well as three approaches based on EWAS (associated at baseline, at any time-point and at all time-points), for identifying epigenetic change over time related to an exposure using simulations and by applying them to blood DNA methylation profiles from the Accessible Resource for Integrated Epigenomics Studies (ARIES). Restricting association testing to EWAS at baseline identified a less complete set of associations than performing EWAS at each time-point or applying the longitudinal modelling approaches to the full dataset. Linear regression models with cluster-robust standard errors identified similar sets of associations with almost identical estimates of effect as the multilevel models, while also being 74 times more efficient. Both longitudinal modelling approaches identified comparable sets of CpG sites in ARIES with an association with prenatal exposure to smoking (>70% agreement). Linear regression with cluster-robust standard errors is an appropriate and efficient approach for longitudinal analysis of DNA methylation data.
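
    A minimal sketch of the recommended strategy using statsmodels, assuming a long-format table with one row per subject and time-point; the file name and column names are hypothetical:

        import pandas as pd
        import statsmodels.formula.api as smf

        # df: one row per subject per time-point, with invented column names:
        #   methylation - DNA methylation level at one CpG
        #   exposure    - e.g. prenatal smoking indicator
        #   age         - age at measurement
        #   subject     - subject identifier (the cluster)
        df = pd.read_csv("cpg_long.csv")   # hypothetical file

        # OLS across the repeated measures, with cluster-robust (sandwich)
        # standard errors clustered on subject.
        fit = smf.ols("methylation ~ exposure * age", data=df).fit(
            cov_type="cluster", cov_kwds={"groups": df["subject"]})
        print(fit.summary())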

  14. A Comparative Analysis of Task Modeling Notations

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

    In this paper a comparative analysis of selected models involving multiple users in an interaction is provided in order to identify concepts which are underexplored in today's multi-user interaction task modeling. This comparative analysis is based on three families of criteria: information criteria, conceptual coverage, and expressiveness. Merging the meta-models of the selected models enables us to come up with a broader meta-model that could be instantiated in most situations involving multi-user interaction, like workflow information systems and CSCW.

  15. Dynamic modelling and analysis of space webs

    Science.gov (United States)

    Yu, Yang; Baoyin, HeXi; Li, JunFeng

    2011-04-01

    Future space missions demand operations on large flexible structures, for example space webs, the lightweight cable nets deployable in space, which can serve as platforms for very large structures or be used to capture orbital objects. Interest in research on space webs is likely to increase in the future with the development of promising applications such as the Furoshiki satellite of JAXA, the Robotic Geostationary Orbit Restorer (ROGER) of ESA and the Grapple, Retrieve And Secure Payload (GRASP) of NASA. Unlike high-tensioned nets in civil engineering, space webs may be low-tensioned or tensionless, and extremely flexible, owing to the microgravity in orbit and the lack of support components, which may cause computational difficulties. Mathematical models are necessary in the analysis of space webs, especially in the conceptual design and evaluation of prototypes. A full three-dimensional finite element (FE) model was developed in this work. Trivial truss elements were adopted to reduce the computational complexity. Considering that cable is a compression-free material and that its tensile stiffness is also variable, we introduced the cable material constitutive relationship to work out an accurate and feasible model for prototype analysis and design. In the static analysis, the stress distribution and global deformation of the webs were discussed to gain knowledge of the strength of webs with different types of meshes. In the dynamic analysis, special attention was paid to the impact problem. The maximum stress and global deformation were investigated. The simulation results indicate interesting phenomena which may be worth further research.

  16. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

The growth of the internet and networked systems has exposed software to an increased number of security threats. One response from software developers is the introduction of security activities in the software development lifecycle. This paper describes an approach that reduces the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts lowers the cost of secure development by allowing non-security-aware developers to apply it at little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
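
    The paper's identification trees are not reproduced here; the sketch below is one plausible reading of the data structure, matching predicate paths against elements of a data flow diagram. All names and the example rule are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class IdNode:
    """One node of an identification tree: a predicate over a DFD element,
    plus the threat reported when the whole path of predicates matches."""
    predicate: callable
    threat: str | None = None           # reported only at matching leaves
    children: list["IdNode"] = field(default_factory=list)

def identify(node, element, found):
    """Depth-first walk: descend while predicates hold, collect leaf threats."""
    if not node.predicate(element):
        return
    if node.threat and not node.children:
        found.append(node.threat)
    for child in node.children:
        identify(child, element, found)

# Usage sketch on a hypothetical DFD element:
tree = IdNode(lambda e: e["kind"] == "data_flow",
              children=[IdNode(lambda e: not e.get("encrypted"),
                               threat="information disclosure in transit")])
threats: list[str] = []
identify(tree, {"kind": "data_flow", "encrypted": False}, threats)
print(threats)
```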

  18. Global sensitivity analysis of thermomechanical models in modelling of welding

    International Nuclear Information System (INIS)

    Petelet, M.

    2008-01-01

The current approach of most welding modellers is to content themselves with the available material data and to choose a mechanical model that seems appropriate. Among the inputs, those controlling the material properties are one of the key problems of welding simulation: material data are never characterized over a sufficiently wide temperature range. This way of proceeding neglects the influence of the uncertainty of the input data on the result given by the computer code. In that case, how can the credibility of the prediction be assessed? This thesis is a step towards an innovative approach in welding simulation intended to answer this question, with illustrations on some concrete welding cases. Global sensitivity analysis is chosen to determine which material properties are the most sensitive in a numerical welding simulation and in which temperature range. Using this methodology requires some developments to sample and explore the input space covering the welding of different steel materials. Finally, the input data have been divided into two groups according to their influence on the output of the model (residual stress or distortion). In this work, the complete methodology of global sensitivity analysis has been successfully applied to welding simulation, reducing the input space to only the important variables. Sensitivity analysis has provided answers to what can be considered one of the most frequently asked questions regarding welding simulation: for a given material, which properties must be measured with good accuracy and which ones can simply be extrapolated or taken from a similar material? (author)
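
    As an illustration of the methodology (not the thesis code), a variance-based global sensitivity analysis can be run with SALib; the three material properties, their ranges and the stand-in response below are assumptions.

```python
# Global sensitivity analysis in the spirit described above, using SALib.
# The "model" is a stand-in for the welding code; the material properties
# and their ranges are illustrative assumptions.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["yield_stress", "youngs_modulus", "thermal_expansion"],
    "bounds": [[200e6, 500e6], [150e9, 220e9], [1e-5, 2e-5]],
}

X = saltelli.sample(problem, 1024)           # sample the input space
# Stand-in response, e.g. a residual-stress indicator:
Y = 0.5 * X[:, 0] + 1e-3 * X[:, 1] + np.random.normal(0, 1e6, X.shape[0])

Si = sobol.analyze(problem, Y)               # first-order and total indices
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name}: S1={s1:.2f}, ST={st:.2f}")
```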

  19. Gentrification and models for real estate analysis

    Directory of Open Access Journals (Sweden)

    Gianfranco Brusa

    2013-08-01

Full Text Available This research proposes a deep analysis of the Milanese real estate market, based on data supplied by three real estate organizations; gentrification appears in some neighbourhoods, such as Tortona, Porta Genova, Bovisa and Isola Garibaldi: the last is the subject of the final analysis, through a survey of the physical and social state of the area. The survey took place in two periods (2003 and 2009) to compare the evolution of gentrification, and its results were employed in a simulation with a multi-agent system model to forecast the long-term evolution of the phenomenon. These neighbourhood micro-indicators make it possible to highlight actual trends conditioning a local real estate market, which can translate into phenomena such as gentrification. In the present analysis, the use of cellular automata models applied to a neighbourhood in Milan (Isola Garibaldi) produced a dynamic simulation of the gentrification trend over a very long time: the cyclical phenomenon (one loop takes twenty to thirty years) appears several times over a theoretical horizon of 100-150 years. Simulation of long-period scenarios by multi-agent systems and cellular automata provides the estimator with a powerful tool, without limits to its implementation, able to support appraisal judgments. Such a tool can also sustain urban planning and related evaluation processes.
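
    A minimal cellular-automaton sketch of such a neighbourhood cycle is given below; the update rule and parameters are illustrative assumptions, not those calibrated in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def step(grid, threshold=3, decay=0.01):
    """One time step: 1 = gentrified cell, 0 = not."""
    n, m = grid.shape
    padded = np.pad(grid, 1)
    # count gentrified neighbours in the 3x3 Moore neighbourhood
    neighbours = sum(padded[1 + di:1 + di + n, 1 + dj:1 + dj + m]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)) - grid
    upgraded = neighbours >= threshold
    downgraded = rng.random(grid.shape) < decay  # slow decline closes the loop
    return ((grid.astype(bool) | upgraded) & ~downgraded).astype(int)

grid = (rng.random((50, 50)) < 0.05).astype(int)
for year in range(120):   # the study's "theoretical time" horizon, in years
    grid = step(grid)
print("gentrified share:", grid.mean())
```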

  20. Dynamical system analysis of interacting models

    Science.gov (United States)

    Carneiro, S.; Borges, H. A.

    2018-01-01

We perform a dynamical system analysis of a cosmological model with a linear dependence between the vacuum density and the Hubble parameter, with constant-rate creation of dark matter. We show that the de Sitter spacetime is an asymptotically stable critical point, the future limit of any expanding solution. Our analysis also shows that the Minkowski spacetime is an unstable critical point, which eventually collapses to a singularity. In this way, such a prescription for the vacuum decay not only predicts the correct future de Sitter limit, but also forbids the existence of a stable Minkowski universe. We also study the effect of matter creation on the growth of structures and their peculiar velocities, showing that it is within the current errors of redshift-space distortion observations.
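
    The generic recipe behind such an analysis is to locate the critical points of the autonomous system and classify them by the eigenvalues of the Jacobian. The sketch below applies it to a toy system, not to the paper's cosmological equations.

```python
# Locate critical points of x' = f(x) and classify them by the real parts
# of the Jacobian eigenvalues. The system below is a toy stand-in.
import sympy as sp

x, y = sp.symbols("x y")
f = [x * (1 - x) - x * y, y * (x - sp.Rational(1, 2))]  # illustrative system

J = sp.Matrix(f).jacobian([x, y])
for point in sp.solve(f, [x, y], dict=True):
    eigs = J.subs(point).eigenvals()
    stable = all(sp.re(e) < 0 for e in eigs)
    print(point, "asymptotically stable" if stable else "not stable")
```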

  1. Data Logistics and the CMS Analysis Model

    CERN Document Server

    Managan, Julie E

    2009-01-01

The Compact Muon Solenoid Experiment (CMS) at the Large Hadron Collider (LHC) at CERN has brilliant prospects for uncovering new information about the physical structure of our universe. Soon physicists around the world will participate together in analyzing CMS data in search of new physics phenomena and the Higgs boson. However, they face a significant problem: with 5 petabytes of data needing distribution each year, how will physicists get the data they need, and how and where will they be able to analyze it? Computing resources and scientists are scattered around the world, while CMS data exists in localized chunks. The CMS computing model only allows analysis of locally stored data, “tethering” analysis to storage. The Vanderbilt CMS team is actively working to solve this problem with the Research and Education Data Depot Network (REDDnet), a program run by Vanderbilt's Advanced Computing Center for Research and Education (ACCRE).

  2. Plasma brake model for preliminary mission analysis

    Science.gov (United States)

    Orsini, Leonardo; Niccolai, Lorenzo; Mengali, Giovanni; Quarta, Alessandro A.

    2018-03-01

    Plasma brake is an innovative propellantless propulsion system concept that exploits the Coulomb collisions between a charged tether and the ions in the surrounding environment (typically, the ionosphere) to generate an electrostatic force orthogonal to the tether direction. Previous studies on the plasma brake effect have emphasized the existence of a number of different parameters necessary to obtain an accurate description of the propulsive acceleration from a physical viewpoint. The aim of this work is to discuss an analytical model capable of estimating, with the accuracy required by a preliminary mission analysis, the performance of a spacecraft equipped with a plasma brake in a (near-circular) low Earth orbit. The simplified mathematical model is first validated through numerical simulations, and is then used to evaluate the plasma brake performance in some typical mission scenarios, in order to quantify the influence of the system parameters on the mission performance index.

  3. Stability Analysis of the Embankment Model

    Directory of Open Access Journals (Sweden)

    G.S. Gopalakrishna

    2009-01-01

Full Text Available In the analysis of an embankment model affected by dynamic forces, employment of a shaking table is a scientific way to assess earthquake behaviour. This work focused on a saturated loose sandy foundation and embankment. The results generated through the pore pressure sensors indicated that pore water pressure plays the main role in the onset of liquefaction and the stability of the system, and also revealed that deformation, settlement, liquefaction intensity and the time stability of the system correlate directly with the strength and characteristics of the soil. One economical method of stabilizing a soil foundation is to improve some part of it.

  4. Non standard analysis, polymer models, quantum fields

    International Nuclear Information System (INIS)

    Albeverio, S.

    1984-01-01

We give an elementary introduction to non-standard analysis and its applications to the theory of stochastic processes. This is based on a joint book with J.E. Fenstad, R. Hoeegh-Krohn and T. Lindstroem. In particular we give a discussion of a hyperfinite theory of Dirichlet forms with applications to the study of the Hamiltonian for a quantum mechanical particle in the potential created by a polymer. We also discuss new results on the existence of attractive polymer measures in dimension d = 1, 2 and on the ((φ²)²)_d-model of interacting quantum fields. (orig.)

  5. Fluctuation microscopy analysis of amorphous silicon models

    International Nuclear Information System (INIS)

    Gibson, J.M.; Treacy, M.M.J.

    2017-01-01

Highlights:
    • Studied competing computer models for amorphous silicon and simulated fluctuation microscopy data.
    • Show that only a paracrystalline/random network composite can fit the published data.
    • Show specifically that pure random network models, with or without voids, do not fit the available data.
    • Identify a new means of measuring the volume fraction of ordered material.
    • Identify unreported limitations of the Debye model for simulating fluctuation microscopy data.
    Abstract: Using computer-generated models we discuss the use of fluctuation electron microscopy (FEM) to identify the structure of amorphous silicon. We show that a combination of variable-resolution FEM to measure the correlation length, with correlograph analysis to obtain the structural motif, can pin down structural correlations. We introduce the method of correlograph variance as a promising means of independently measuring the volume fraction of a paracrystalline composite. From comparisons with published data, we affirm that only a composite material of paracrystalline and continuous random network that is substantially paracrystalline could explain the existing experimental data, and point the way to more precise measurements on amorphous semiconductors. The results are of general interest for other classes of disordered materials.

  6. Modelling and analysis of global coal markets

    Energy Technology Data Exchange (ETDEWEB)

    Trueby, Johannes

    2013-01-17

    The thesis comprises four interrelated essays featuring modelling and analysis of coal markets. Each of the four essays has a dedicated chapter in this thesis. Chapters 2 to 4 have, from a topical perspective, a backward-looking focus and deal with explaining recent market outcomes in the international coal trade. The findings of those essays may serve as guidance for assessing current coal market outcomes as well as expected market outcomes in the near to medium-term future. Chapter 5 has a forward-looking focus and builds a bridge between explaining recent market outcomes and projecting long-term market equilibria. Chapter 2, Strategic Behaviour in International Metallurgical Coal Markets, deals with market conduct of large exporters in the market of coals used in steel-making in the period 2008 to 2010. In this essay I analyse whether prices and trade-flows in the international market for metallurgical coals were subject to non-competitive conduct in the period 2008 to 2010. To do so, I develop mathematical programming models - a Stackelberg model, two varieties of a Cournot model, and a perfect competition model - for computing spatial equilibria in international resource markets. Results are analysed with various statistical measures to assess the prediction accuracy of the models. The results show that real market equilibria cannot be reproduced with a competitive model. However, real market outcomes can be accurately simulated with the non-competitive models, suggesting that market equilibria in the international metallurgical coal trade were subject to the strategic behaviour of coal exporters. Chapter 3 and chapter 4 deal with market power issues in the steam coal trade in the period 2006 to 2008. Steam coals are typically used to produce steam either for electricity generation or for heating purposes. In Chapter 3 we analyse market behaviour of key exporting countries in the steam coal trade. This chapter features the essay Market Structure Scenarios in

  8. Dynamical Model about Rumor Spreading with Medium

    Directory of Open Access Journals (Sweden)

    Xiaxia Zhao

    2013-01-01

Full Text Available A rumor is a kind of social remark that is untrue and unconfirmed, yet spreads on a large scale in a short time; it can induce a climate of pressure, anxiety and panic. Traditionally, rumors propagate by word of mouth; nowadays, with the emergence of the internet, they can also spread via instant messengers, e-mails or online publishing. To capture this new pattern of spreading, an ISRW dynamical model that treats the medium as a separate subclass is established. Besides the dynamical analysis of the model, we mainly explore the mechanisms of individual-to-individual and medium-to-individual spreading. By numerical simulation, we find that controlling rumor spreading requires controlling not only the rate of change of the spreader subclass but also the change in the information about the rumor held in the medium, which has the larger influence. Moreover, controlling the effusion of rumors is more important than deleting existing information about them. On the one hand, government should enhance the management of the internet; on the other, legal institutions should be established for punishing rumor creators and spreaders on the internet, who can be tracked. In this way, the involved authorities can propose efficient measures to control rumor spreading and so preserve social stability and economic development.
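
    A minimal SIR-style system with a medium compartment, in the spirit of the ISRW model, is sketched below; the equations and parameter values are illustrative assumptions, not the paper's.

```python
# SIR-style rumor system with an added medium compartment W; the exact
# equations and parameters are illustrative assumptions.
from scipy.integrate import solve_ivp

def isrw(t, state, beta=0.3, eta=0.2, gamma=0.1, lam=0.4, delta=0.2):
    I, S, R, W = state          # ignorant, spreader, stifler, medium "load"
    new_spreaders = beta * I * S + eta * I * W   # person-to-person + medium
    return [-new_spreaders,
            new_spreaders - gamma * S,
            gamma * S,
            lam * S - delta * W]   # spreaders post to media; posts decay

sol = solve_ivp(isrw, (0, 100), [0.99, 0.01, 0.0, 0.0], dense_output=True)
print("peak spreader fraction:", sol.y[1].max())
```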

  9. ANALYSIS MODEL FOR RETURN ON CAPITAL EMPLOYED

    Directory of Open Access Journals (Sweden)

    BURJA CAMELIA

    2013-02-01

Full Text Available At the microeconomic level, the appreciation of capital profitability is a very complex action which is of interest for stakeholders. The main purpose of this study is to extend the traditional analysis model for capital profitability, based on the ratio "Return on capital employed". In line with this, the objectives of the work are to identify the factors that influence the profitability of the capital employed by a company and to measure their contribution to the manifestation of the phenomenon. The proposed analysis model is validated on the use case of a representative company from the agricultural sector. The results obtained reveal that several factors can act positively on capital profitability: capital turnover, sales efficiency, an increased share of sales in total revenues, and improvement of expense efficiency. The findings are useful both for decision makers in substantiating economic strategies and for capital owners interested in the efficiency of their investments.

  10. Data analysis and source modelling for LISA

    International Nuclear Information System (INIS)

    Shang, Yu

    2014-01-01

Gravitational waves (GWs) are one of the most important predictions of general relativity. Besides indirect evidence for the existence of GWs, there are already several ground-based detectors (such as LIGO and GEO) and planned future space missions (such as LISA) which aim to detect GWs directly. A GW carries a large amount of information about its source; extracting this information can help us uncover the physical properties of the source, and even open a new window on understanding the Universe. Hence, GW data analysis is a challenging task in the search for GWs. In this thesis, I present two works on data analysis for LISA. In the first, we introduce an extended multimodal genetic algorithm which utilizes the properties of the signal and the detector response function to analyze data from the third round of the Mock LISA Data Challenge. We found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio and sky location with reasonable accuracy. As for the orbital angular momentum and the two spins of the black holes, we found a large number of widely separated modes in the parameter space with similar maximum likelihood values. The performance of this method is comparable, if not superior, to existing algorithms. In the second work, we introduce a new phenomenological waveform model for extreme mass ratio inspiral (EMRI) systems. The waveform consists of a set of harmonics with constant amplitude and slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is well suited to the data analysis of EMRI signals.

  11. Sensitivity analysis of Smith's AMRV model

    International Nuclear Information System (INIS)

    Ho, Chih-Hsiang

    1995-01-01

Multiple-expert hazard/risk assessments have considerable precedent, particularly in the Yucca Mountain site characterization studies. In this paper, we present a Bayesian approach to statistical modeling in volcanic hazard assessment for the Yucca Mountain site. Specifically, we show that the expert opinion on the site disruption parameter p is elicited via the prior distribution, π(p), based on the geological information that is available. Moreover, π(p) can combine all available geological information motivated by conflicting but realistic arguments (e.g., simulation, cluster analysis, structural control, etc.). The incorporated uncertainties about the probability of repository disruption p will eventually be averaged out by taking the expectation over π(p). We use the following priors in the analysis: priors chosen for mathematical convenience, Beta(r, s) for (r, s) = (2, 2), (3, 3), (5, 5), (2, 1), (2, 8), (8, 2), and (1, 1); and three priors motivated by expert knowledge. Sensitivity analysis is performed for each prior distribution. Estimated values of hazard based on the priors chosen for mathematical simplicity are uniformly higher than those obtained from the priors motivated by expert knowledge, and the model using the prior Beta(8, 2) yields the highest hazard (2.97 × 10^-2). The minimum hazard is produced by the "three-expert prior" (i.e., values of p equally likely at 10^-3, 10^-2, and 10^-1); this estimate is 1.39 × 10^-3, only about one order of magnitude smaller than the maximum value. The term "hazard" is defined as the probability of at least one disruption of a repository at the Yucca Mountain site by basaltic volcanism over the next 10,000 years.
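
    The expectation-over-the-prior step can be illustrated directly; in the sketch below the hazard function h(p) is a placeholder, since the paper combines p with a volcanic event-rate model not reproduced here.

```python
# Average a hazard function over a Beta(r, s) prior by Monte Carlo.
# The form of h(p) is a placeholder, not the paper's hazard model.
import numpy as np
from scipy import stats

def expected_hazard(r, s, h, n=100_000, seed=0):
    p = stats.beta(r, s).rvs(size=n, random_state=seed)
    return h(p).mean()

h = lambda p: 1.0 - np.exp(-1e-2 * p)   # placeholder hazard function
for r, s in [(2, 2), (3, 3), (5, 5), (2, 1), (2, 8), (8, 2), (1, 1)]:
    print(f"Beta({r},{s}): {expected_hazard(r, s, h):.3e}")
```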

  12. Growth models and analysis of development

    Energy Technology Data Exchange (ETDEWEB)

    Mathur, G.

    1979-10-01

This paper deals with remnants of neoclassical elements in Keynesian and post-Keynesian thought, and attempts to demonstrate that the elimination of these elements from our modes of thinking would not impoverish economic analysis as a means of solving real problems. In the Keynesian analysis the causation from investment to savings is exhibited in terms of income determination. When put in terms of a capital-theory model, the vector of savings is represented in two ways: real savings and counterpart real savings. The former coincides with the investment vector and the latter with the vector of consumption goods foregone for diverting resources towards equipment making. Thus the Keynesian causation in capital-theory terms makes the concept of national savings as an independent variable redundant. The Robinsonian causation in a golden age with full employment, and its reversal of direction in a steady state with non-employment, are then considered. But in each of these, variables like the rate of savings and the output/capital ratio are found to be dormant. They are termed null variables which, being of no account in both full-employment and unemployment situations, could without loss be deleted from the repertory of analytical tools. The Harrod formula for the warranted rate of growth, when put in causal form, thus becomes a redundant portion of the economics of growth. The real determinants of the growth rate and the real wage rate, on which the analysis of growth or of development should be based, are also depicted.

  13. Modified Models of Bentler and Woodward Model of Confirmatory Factor Analysis to Analysis of Covariance

    OpenAIRE

    Tan, Şeref

    2016-01-01

Analysis of covariance is a technique used to adjust for the effects of covariates on the dependent variable when testing treatment effects in experimental studies. In analysis of covariance it is assumed that covariates are measured perfectly reliably, an assumption that is almost impossible to meet: in reality, a covariate can hardly be measured without error. In this study, a structural model suggested by Bentler and Woodward (1979) was modified both observed and latent...

  14. Topological Data Analysis of Biological Aggregation Models

    Science.gov (United States)

    Topaz, Chad M.; Ziegelmeier, Lori; Halverson, Tom

    2015-01-01

    We apply tools from topological data analysis to two mathematical models inspired by biological aggregations such as bird flocks, fish schools, and insect swarms. Our data consists of numerical simulation output from the models of Vicsek and D'Orsogna. These models are dynamical systems describing the movement of agents who interact via alignment, attraction, and/or repulsion. Each simulation time frame is a point cloud in position-velocity space. We analyze the topological structure of these point clouds, interpreting the persistent homology by calculating the first few Betti numbers. These Betti numbers count connected components, topological circles, and trapped volumes present in the data. To interpret our results, we introduce a visualization that displays Betti numbers over simulation time and topological persistence scale. We compare our topological results to order parameters typically used to quantify the global behavior of aggregations, such as polarization and angular momentum. The topological calculations reveal events and structure not captured by the order parameters. PMID:25970184
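
    Betti-number counting of this kind can be reproduced with off-the-shelf persistent homology tools; the sketch below uses ripser on a synthetic ring of points rather than Vicsek or D'Orsogna output.

```python
# Count Betti numbers of a point cloud at a given scale via persistent
# homology. The cloud is a synthetic noisy ring, not model output.
import numpy as np
from ripser import ripser

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200)
cloud = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(0, 0.05, (200, 2))

dgms = ripser(cloud, maxdim=1)["dgms"]      # persistence diagrams H0, H1

def betti(dgm, eps):
    """Number of features alive at scale eps (born before, dying after)."""
    return int(np.sum((dgm[:, 0] <= eps) & (dgm[:, 1] > eps)))

print("b0 =", betti(dgms[0], 0.3), " b1 =", betti(dgms[1], 0.3))
```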

  15. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...

  16. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

Enterprise models are created for documenting and communicating the structure and state of the business and information technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models the process can be complicated, with omissions or miscalculations very likely. This situation has fostered research on automated analysis methods to support analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents a compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  17. PCA: Principal Component Analysis for spectra modeling

    Science.gov (United States)

    Hurley, Peter D.; Oliver, Seb; Farrah, Duncan; Wang, Lingyu; Efstathiou, Andreas

    2012-07-01

The mid-infrared spectra of ultraluminous infrared galaxies (ULIRGs) contain a variety of spectral features that can be used as diagnostics to characterize the spectra. However, such diagnostics are biased by our prior prejudices about the origin of the features. Moreover, by using only part of the spectrum they do not utilize the full information content of the spectra. Blind statistical techniques such as principal component analysis (PCA) consider the whole spectrum, find correlated features and separate them out into distinct components. This code, written in IDL, classifies principal components of IRS spectra to define a new classification scheme using 5D Gaussian mixture modelling. The five PCs and average spectra for the four classifications used to classify objects are made available with the code.
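
    The released code is IDL; a Python analogue of the same pipeline (project onto principal components, then fit a Gaussian mixture in the reduced space) would look like the sketch below, with a synthetic matrix standing in for the IRS spectra.

```python
# Python analogue of the PCA + Gaussian-mixture pipeline described above.
# The spectra matrix is synthetic, not IRS data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

spectra = np.random.default_rng(0).normal(size=(300, 180))  # stand-in set

pcs = PCA(n_components=5).fit_transform(spectra)    # the five PCs
gmm = GaussianMixture(n_components=4, random_state=0).fit(pcs)
classes = gmm.predict(pcs)                          # four-class scheme
print(np.bincount(classes))
```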

  18. Structured analysis and modeling of complex systems

    Science.gov (United States)

    Strome, David R.; Dalrymple, Mathieu A.

    1992-01-01

    The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.

  19. Early Start DENVER Model: A Meta - analysis

    Directory of Open Access Journals (Sweden)

    Jane P. Canoy

    2015-11-01

Full Text Available Each child with Autism Spectrum Disorder (ASD) differs from other children in symptoms, skills and types of impairment; this is why the word "spectrum" is included in the name of the disorder. Eapen, Crncec, and Walter (2013) claimed that there is emerging evidence that early intervention offers the greatest capacity for a child's development during the first years of life, when "brain plasticity" is high. To date, the only intervention program model for children as young as 18 months that has been validated in a randomized clinical trial is the "Early Start Denver Model" (ESDM). This study aimed to determine the effectiveness of the outcomes of ESDM for young children with Autism Spectrum Disorders, using the meta-analysis method. The researcher drew on studies related to ESDM published in refereed journals and available online; five studies were included, totalling 149 children exposed to ESDM. To examine the "pooled effects" of ESDM on a variety of outcomes, a meta-analytic procedure was performed after extraction of the outcome data, using Comprehensive Meta Analysis Version 3.3.070. The effectiveness of ESDM for young children with ASD depends strongly on the intensity of the intervention and on a younger child age. This study provides a basis for effectively implementing an early intervention, such as ESDM, that shows strong outcome effects for children with ASD.
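
    The pooling step behind such a meta-analysis is standard inverse-variance weighting; the sketch below illustrates it on made-up effect sizes and variances, not the five studies' actual data.

```python
# Fixed-effect inverse-variance pooling, the kind of computation performed
# by meta-analysis software such as CMA. All numbers are made up.
import numpy as np

effects = np.array([0.45, 0.30, 0.60, 0.25, 0.50])    # per-study effects
variances = np.array([0.04, 0.06, 0.09, 0.05, 0.07])

w = 1.0 / variances                     # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled effect = {pooled:.2f} ± {1.96 * se:.2f} (95% CI)")
```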

  20. Development of hydrogen combustion analysis model

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Tae Jin; Lee, K. D.; Kim, S. N. [Soongsil University, Seoul (Korea, Republic of); Hong, J. S.; Kwon, H. Y. [Seoul National Polytechnic University, Seoul (Korea, Republic of); Kim, Y. B.; Kim, J. S. [Seoul National University, Seoul (Korea, Republic of)

    1997-07-01

The objective of this project is to construct a credible DB for component reliability by developing methodologies and computer codes for assessing component independent-failure and common cause failure (CCF) probabilities, incorporating the applicability and dependency of the data. Beyond this, the ultimate goal is to systematize all the analysis procedures so as to provide plans for preventing component failures, employing tools flexible enough to accommodate a change of plant or data source. For the first subject, we construct a DB for the similarity index and dependence matrix and propose a systematic procedure for data analysis by investigating the similarity and redundancy of the generic data sources. Next, we develop a computer code for this procedure and construct a reliability database for major components. The second subject focuses on developing a CCF procedure for assessing plant-specific defence ability, rather than developing another CCF model. We propose a procedure and computer code for estimating CCF event probability by incorporating plant-specific defensive measures. 116 refs., 25 tabs., 24 figs. (author)

  1. Modeling and Performance Analysis of Manufacturing Systems in ...

    African Journals Online (AJOL)

This study deals with the modeling and performance analysis of footwear manufacturing using Arena simulation software. The study shows that modeling and simulation is a powerful tool for the analysis of manufacturing assembly lines such as footwear manufacturing, because it allows the researcher to ...

  2. Modeling for Deformable Body and Motion Analysis: A Review

    Directory of Open Access Journals (Sweden)

    Hailang Pan

    2013-01-01

Full Text Available This paper surveys the modeling methods for deformable human bodies and motion analysis from the past 30 years. First, elementary knowledge of human expression and modeling is introduced. Then, typical human modeling technologies, including 2D models, 3D surface models, geometry-based, physics-based, and anatomy-based approaches, and model-based motion analysis, are summarized, and the characteristics of these technologies are analyzed. The technology accumulated in the field is outlined as an overview.

  3. [Tuscan Chronic Care Model: a preliminary analysis].

    Science.gov (United States)

    Barbato, Angelo; Meggiolaro, Angela; Rossi, Luigi; Fioravanti, C; Palermita, F; La Torre, Giuseppe

    2015-01-01

The aim of this study is to present a preliminary analysis of the efficacy and effectiveness of a model of care for the chronically ill (Chronic Care Model, CCM). The analysis took into account 106 territorial modules, 1016 general practitioners and 1,228,595 patients. The diagnostic and therapeutic pathways activated (PDTA) involved four chronic conditions, selected according to their prevalence and incidence in the Tuscany Region: Diabetes Mellitus (DM), Heart Failure (SC), Chronic Obstructive Pulmonary Disease (COPD) and stroke. Six epidemiological indicators of process and output were selected in order to measure the model of care before and after its application: adherence to the specific follow-up for each pathology (use of clinical and laboratory indicators), average annual per-capita expenditure for laboratory and instrumental diagnostic tests, average annual per-capita expenditure for specialist visits, hospitalization rate for diseases related to the main pathology, hospitalization rate for long-term complications, and rate of access to the emergency department (ED). Data were collected through the database; the differences before and after the intervention, and between exposed and unexposed, were analyzed by the "before-after (controlled and uncontrolled) studies" method. The impact of the intervention was calculated as DD (difference of the differences). DM management showed increased adherence to follow-up (DD: +8.1%) and use of laboratory diagnostics (DD: +4.9 €/year per capita), fewer hospitalizations for long-term complications and for endocrine-related diseases (DD: 5.8/1000 and +1.2/1000, respectively), and a smaller increase in ED access (DD: -1.6/1000), despite a slight increase in specialist visits (DD: +0.38 €/year per capita). The management of SC initially showed rising adherence to follow-up (DD: +2.3%), a decrease in specialist visits (DD: 1.03 €/year per capita), and fewer hospitalizations and ED accesses for exacerbations (DD: -4.4/1000 and DD: -6

  4. Modeling and analysis of solar distributed generation

    Science.gov (United States)

    Ortiz Rivera, Eduardo Ivan

Recent changes in the global economy are having a big impact on our daily life. The price of oil is increasing and reserves shrink every day. Dramatic demographic changes are also affecting the viability of the electric infrastructure and, ultimately, the economic future of the industry. These are some of the reasons many countries are looking to alternative energy sources to produce electric energy, and the most common form of green energy in our daily life is solar energy. Converting solar energy into electrical energy requires solar panels, dc-dc converters, power control, sensors, and inverters. In this work, a photovoltaic module (PVM) model using the electrical characteristics provided by the manufacturer's data sheet is presented for power system applications, and experimental results from testing are shown, verifying the proposed PVM model. Three maximum power point tracking (MPPT) algorithms are also presented to obtain the maximum power from a PVM. The first MPPT algorithm is a method based on Rolle's and Lagrange's theorems and can provide at least an approximate answer to a family of transcendental functions that cannot be solved using differential calculus. The second MPPT algorithm is based on approximating the proposed PVM model using fractional polynomials, such that the shape, boundary conditions and performance of the proposed PVM model are satisfied. The third MPPT algorithm is based on determining the optimal duty cycle for a dc-dc converter given prior knowledge of the load or load-matching conditions. Four algorithms to calculate the effective irradiance level and temperature over a photovoltaic module are also presented. The main reasons to develop these algorithms are monitoring climate conditions, the elimination of temperature and solar irradiance sensors, cost reductions for a photovoltaic inverter system, and development of new algorithms to be integrated with maximum

  5. Structuring Problem Analysis for Embedded Systems Modelling

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.; Lucas, Yan

    Our interest is embedded systems validation as part of the model-driven approach. To design a model, the modeller needs to obtain knowledge about the system and decide what is relevant to model and how. A part of the modelling activities is inherently informal - it cannot be formalised in such a way

  6. Aircraft vulnerability analysis by modeling and simulation

    Science.gov (United States)

    Willers, Cornelius J.; Willers, Maria S.; de Waal, Alta

    2014-10-01

Infrared missiles pose a significant threat to civilian and military aviation. MANPADS missiles are especially dangerous in the hands of rogue and undisciplined forces. Yet not all launched missiles hit their targets, the misses being attributable either to misuse of the weapon or to missile performance restrictions. This paper analyses some of the factors affecting aircraft vulnerability and demonstrates a structured analysis of the risk and aircraft vulnerability problem. The aircraft-missile engagement is a complex series of events, many of which are only partially understood. Aircraft and missile designers focus on the optimal design and performance of their respective systems, often testing only in a limited set of scenarios. Most missiles react to the contrast intensity, but the variability of the background is rarely considered. Finally, the vulnerability of the aircraft depends jointly on the missile's performance and the doctrine governing the missile's launch. These factors are considered in a holistic investigation. The view direction, altitude, time of day, sun position, latitude/longitude and terrain determine the background against which the aircraft is observed. Especially high gradients in sky radiance occur around the sun and on the horizon. This paper considers uncluttered background scenes (uniform terrain and clear sky) and presents examples of background radiance at all view angles across a sphere around the sensor. A detailed geometrical and spatially distributed radiometric model is used to model the aircraft. This model provides the signature at all possible view angles across the sphere around the aircraft. The signature is determined in absolute terms (no background) and in contrast terms (with background). It is shown that the background significantly affects the contrast signature as observed by the missile sensor. A simplified missile model is constructed by defining the thrust and mass profiles, maximum seeker tracking rate, maximum

  7. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  8. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

Full Text Available In this paper a new model validation procedure for logistic regression models is presented. First, we give a brief review of different model validation techniques. Next, we define a number of properties required for a model to be considered "good", together with a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model, using an example taken from a management study.
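
    Typical quantitative performance measures of this kind, discrimination and calibration on held-out data, can be computed as in the sketch below; the dataset is synthetic, not the management study's.

```python
# Validate a logistic regression model with two standard performance
# measures on held-out data; the dataset is synthetic.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
p = model.predict_proba(X_te)[:, 1]
print("AUC (discrimination):", round(roc_auc_score(y_te, p), 3))
print("Brier score (calibration):", round(brier_score_loss(y_te, p), 3))
```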

  9. Likelihood analysis of the minimal AMSB model

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Borsato, M.; Chobanova, V.; Lucio, M.; Santos, D.M. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Sakurai, K. [Institute for Particle Physics Phenomenology, University of Durham, Science Laboratories, Department of Physics, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Buchmueller, O.; Citron, M.; Costa, J.C.; Richards, A. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); De Roeck, A. [Experimental Physics Department, CERN, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [School of Physics, University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, Melbourne (Australia); Ellis, J.R. [King' s College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); CERN, Theoretical Physics Department, Geneva (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Cantabria (Spain); Isidori, G. [Physik-Institut, Universitaet Zuerich, Zurich (Switzerland); Luo, F. [Kavli IPMU (WPI), UTIAS, The University of Tokyo, Kashiwa, Chiba (Japan); Olive, K.A. [School of Physics and Astronomy, University of Minnesota, William I. Fine Theoretical Physics Institute, Minneapolis, MN (United States)

    2017-04-15

We perform a likelihood analysis of the minimal anomaly-mediated supersymmetry-breaking (mAMSB) model using constraints from cosmology and accelerator experiments. We find that either a wino-like or a Higgsino-like neutralino LSP, χ^0_1, may provide the cold dark matter (DM), both with similar likelihoods. The upper limit on the DM density from Planck and other experiments enforces m_χ^0_1 ≲ 3 TeV after the inclusion of Sommerfeld enhancements in its annihilations. If most of the cold DM density is provided by the χ^0_1, the measured value of the Higgs mass favours a limited range of tan β ≈ 5 (and also tan β ≈ 45 if μ > 0), but the scalar mass m_0 is poorly constrained. In the wino-LSP case, m_3/2 is constrained to about 900 TeV and m_χ^0_1 to 2.9 ± 0.1 TeV, whereas in the Higgsino-LSP case m_3/2 has just a lower limit ≳ 650 TeV (≳ 480 TeV) and m_χ^0_1 is constrained to 1.12 (1.13) ± 0.02 TeV in the μ > 0 (μ < 0) scenario. In neither case can the anomalous magnetic moment of the muon, (g-2)_μ, be improved significantly relative to its Standard Model (SM) value, nor do flavour measurements constrain the model significantly, and there are poor prospects for discovering supersymmetric particles at the LHC, though there are some prospects for direct DM detection. On the other hand, if the χ^0_1 contributes only a fraction of the cold DM density, future LHC E_T-based searches for gluinos, squarks and heavier chargino and neutralino states, as well as disappearing track searches in the wino-like LSP region, will be relevant, and interference effects enable BR(B_{s,d} → μ+μ-) to agree with the data better than in the SM in the case of wino-like DM with μ > 0. (orig.)

  10. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

With the rising application of systems biology, sensitivity analysis methods have been widely applied to study biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights about how robust biological responses are with respect to changes in biological parameters, and about which model inputs are the key factors affecting the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types of sensitivity analysis commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations of the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models, and the caveats in the interpretation of sensitivity analysis results.

  11. Analysis of Disaster Risk Management in Colombia : A Contribution to the Creation of Public Policies

    OpenAIRE

    Campos Garcia, Ana; Holm-Nielsen, Niels; Diaz G., Carolina; Rubiano Vargas, Diana Marcela; Costa P., Carlos R.; Ramirez Cortes, Fernando; Dickson, Eric

    2011-01-01

    The objective of this analysis is to assess the state of progress of risk management in Colombia and propose recommendations to help the Government set public policy in the short-and long-term. For this reason, the study sought to: (i) establish the risk and impact of disasters in recent decades, (ii) identify legal, institutional and conceptual themes in the country, (iii) review the stat...

  12. Applied data analysis and modeling for energy engineers and scientists

    CERN Document Server

    Reddy, T Agami

    2011-01-01

    ""Applied Data Analysis and Modeling for Energy Engineers and Scientists"" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

  13. Experimental Design for Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2001-01-01

This introductory tutorial gives a survey of the use of statistical designs for what-if or sensitivity analysis in simulation. This analysis uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as

  14. Evaluation of RCAS Inflow Models for Wind Turbine Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tangler, J.; Bir, G.

    2004-02-01

The finite element structural modeling in the Rotorcraft Comprehensive Analysis System (RCAS) provides a state-of-the-art approach to aeroelastic analysis. This, coupled with its ability to model all turbine components, results in a methodology that can simulate the complex system interactions characteristic of large wind turbines. In addition, RCAS is uniquely capable of modeling advanced control algorithms and the resulting dynamic responses.

  15. [Model-based biofuels system analysis: a review].

    Science.gov (United States)

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  16. Bayesian Model Averaging for Propensity Score Analysis.

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2014-01-01

    This article considers Bayesian model averaging as a means of addressing uncertainty in the selection of variables in the propensity score equation. We investigate an approximate Bayesian model averaging approach based on the model-averaged propensity score estimates produced by the R package BMA but that ignores uncertainty in the propensity score. We also provide a fully Bayesian model averaging approach via Markov chain Monte Carlo sampling (MCMC) to account for uncertainty in both parameters and models. A detailed study of our approach examines the differences in the causal estimate when incorporating noninformative versus informative priors in the model averaging stage. We examine these approaches under common methods of propensity score implementation. In addition, we evaluate the impact of changing the size of Occam's window used to narrow down the range of possible models. We also assess the predictive performance of both Bayesian model averaging propensity score approaches and compare it with the case without Bayesian model averaging. Overall, results show that both Bayesian model averaging propensity score approaches recover the treatment effect estimates well and generally provide larger uncertainty estimates, as expected. Both Bayesian model averaging approaches offer slightly better prediction of the propensity score compared with the Bayesian approach with a single propensity score equation. Covariate balance checks for the case study show that both Bayesian model averaging approaches offer good balance. The fully Bayesian model averaging approach also provides posterior probability intervals of the balance indices.
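
    The flavour of approximate model averaging over propensity score equations can be illustrated with BIC-based weights, as below; this is a simplified stand-in for the article's BMA procedure, on synthetic data.

```python
# Approximate model averaging over candidate propensity score models:
# weight each candidate by its BIC and average the predicted scores.
# A simplified stand-in for the article's procedure, on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
t = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1]))))

candidates = [[0], [0, 1], [0, 1, 2]]          # covariate sets per model
fits = [sm.Logit(t, sm.add_constant(X[:, c])).fit(disp=0) for c in candidates]

bics = np.array([f.bic for f in fits])
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()                                   # approximate model weights
ps = sum(wi * f.predict(sm.add_constant(X[:, c]))
         for wi, f, c in zip(w, fits, candidates))  # model-averaged scores
print("model weights:", np.round(w, 3))
```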

  17. Statistical models for competing risk analysis

    International Nuclear Information System (INIS)

    Sather, H.N.

    1976-08-01

This report presents research results on three new models with potential applications to competing risks problems. One section covers the basic statistical relationships underlying the subsequent development of competing risks models. Another discusses the problem of comparing cause-specific risk structure by competing risks theory in two homogeneous populations, P1 and P2. Weibull models, which allow more generality than the Berkson and Elveback models, are studied for the effect of time on the hazard function. The use of concomitant information for modeling single-risk survival is extended to the multiple-failure-mode domain of competing risks. The model used to illustrate this methodology is a life table model with constant hazards within pre-designated intervals of the time scale. Two parametric models for bivariate dependent competing risks, which provide interesting alternatives, are proposed and examined

  18. Likelihood analysis of the I(2) model

    DEFF Research Database (Denmark)

    Johansen, Søren

    1997-01-01

    The I(2) model is defined as a submodel of the general vector autoregressive model, by two reduced rank conditions. The model describes stochastic processes with stationary second difference. A parametrization is suggested which makes likelihood inference feasible. Consistency of the maximum...

  19. Eclipsing binary stars modeling and analysis

    CERN Document Server

    Kallrath, Josef

    1999-01-01

This book focuses on the formulation of mathematical models for the light curves of eclipsing binary stars, and on the algorithms for generating such models. Since information gained from binary systems provides much of what we know of the masses, luminosities, and radii of stars, such models are acquiring increasing importance in studies of stellar structure and evolution. As in other areas of science, the computer revolution has given many astronomers tools that previously only specialists could use; anyone with access to a set of data can now expect to be able to model it. This book will provide astronomers, both amateur and professional, with a guide for - specifying an astrophysical model for a set of observations - selecting an algorithm to determine the parameters of the model - estimating the errors of the parameters. It is written for readers with knowledge of basic calculus and linear algebra; appendices cover mathematical details on such matters as optimization, coordinate systems, and specific models ...

  20. Advances in power system modelling, control and stability analysis

    CERN Document Server

    Milano, Federico

    2016-01-01

    Advances in Power System Modelling, Control and Stability Analysis captures the variety of new methodologies and technologies that are changing the way modern electric power systems are modelled, simulated and operated.

  1. Modeling and Analysis of CSP Systems (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2010-08-01

    Fact sheet describing NREL CSP Program capabilities in the area of modeling and analysis of CSP systems: assessing the solar resource, predicting performance and cost, studying environmental impact, and developing modeling software packages.

  2. A Hierarchical Visualization Analysis Model of Power Big Data

    Science.gov (United States)

    Li, Yongjie; Wang, Zheng; Hao, Yang

    2018-01-01

    Based on the concept of integrating VR scenes and power big data analysis, a hierarchical visualization analysis model of power big data is proposed, in which levels are designed targeting different abstraction modules such as transaction, engine, computation, control and storage. The normally separate modules of power data storage, data mining and analysis, and data visualization are integrated into one platform by this model. It provides a visual analysis solution for power big data.

  3. Stochastic Wake Modelling Based on POD Analysis

    Directory of Open Access Journals (Sweden)

    David Bastine

    2018-03-01

    Full Text Available In this work, large eddy simulation data is analysed to investigate a new stochastic modeling approach for the wake of a wind turbine. The data is generated by the large eddy simulation (LES model PALM combined with an actuator disk with rotation representing the turbine. After applying a proper orthogonal decomposition (POD, three different stochastic models for the weighting coefficients of the POD modes are deduced resulting in three different wake models. Their performance is investigated mainly on the basis of aeroelastic simulations of a wind turbine in the wake. Three different load cases and their statistical characteristics are compared for the original LES, truncated PODs and the stochastic wake models including different numbers of POD modes. It is shown that approximately six POD modes are enough to capture the load dynamics on large temporal scales. Modeling the weighting coefficients as independent stochastic processes leads to similar load characteristics as in the case of the truncated POD. To complete this simplified wake description, we show evidence that the small-scale dynamics can be captured by adding to our model a homogeneous turbulent field. In this way, we present a procedure to derive stochastic wake models from costly computational fluid dynamics (CFD calculations or elaborated experimental investigations. These numerically efficient models provide the added value of possible long-term studies. Depending on the aspects of interest, different minimalized models may be obtained.
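
    A hedged sketch of the general recipe described above: POD of snapshot data via SVD, then independent stochastic processes for the mode weighting coefficients. Here the coefficients are modeled as simple AR(1) processes, an illustrative stand-in for the paper's stochastic models; the snapshot data are synthetic.

    ```python
    import numpy as np

    def pod(snapshots, n_modes=6):
        """snapshots: (n_points, n_times) mean-subtracted field."""
        U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
        modes = U[:, :n_modes]
        coeffs = np.diag(s[:n_modes]) @ Vt[:n_modes]   # (n_modes, n_times)
        return modes, coeffs

    def fit_ar1(a):
        """Per-mode AR(1): a_t = phi * a_{t-1} + sigma * eps_t."""
        phi = np.sum(a[1:] * a[:-1]) / np.sum(a[:-1] ** 2)
        sigma = np.std(a[1:] - phi * a[:-1])
        return phi, sigma

    def simulate_wake(modes, params, n_steps, rng):
        a = np.zeros((modes.shape[1], n_steps))
        for i, (phi, sigma) in enumerate(params):
            for t in range(1, n_steps):
                a[i, t] = phi * a[i, t - 1] + sigma * rng.normal()
        return modes @ a   # stochastic surrogate of the wake field

    rng = np.random.default_rng(0)
    snap = rng.normal(size=(200, 400))        # stand-in for LES snapshots
    modes, coeffs = pod(snap, n_modes=6)
    params = [fit_ar1(a) for a in coeffs]
    field = simulate_wake(modes, params, n_steps=400, rng=rng)
    ```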

  4. Modelling of SWOT Analysis Using Fuzzy Integrals

    OpenAIRE

    Haile, Meaza; Křupka, Jiří

    2016-01-01

    To develop a strategy for an organization it is important to understand the organization and its surrounding environment. SWOT(Strength, Weakness, Opportunity and Threat) analysis is a famous tool to perform this task precisely by showing the strength, weakness of the organization and the external factors, opportunities and threats that affect its success. SWOT analysis is commonly used by business; however non-profit organizations also use SWOT analysis for decision-making and strategy evalu...

  5. Multifractal modelling and 3D lacunarity analysis

    International Nuclear Information System (INIS)

    Hanen, Akkari; Imen, Bhouri; Asma, Ben Abdallah; Patrick, Dubois; Hedi, Bedoui Mohamed

    2009-01-01

    This study presents a comparative evaluation of lacunarity of 3D grey level models with different types of inhomogeneity. A new method based on the 'Relative Differential Box Counting' was developed to estimate the lacunarity features of grey level volumes. To validate our method, we generated a set of 3D grey level multifractal models with random, anisotropic and hierarchical properties. Our method gives a lacunarity measurement correlated with the theoretical one and allows a better model classification compared with a classical approach.

  6. The Modeling Analysis of Huangshan Tourism Data

    Science.gov (United States)

    Hu, Shanfeng; Yan, Xinhu; Zhu, Hongbing

    2016-06-01

    Tourism is the major industry in Huangshan city. This paper analyzes time series of tourism data for Huangshan from 2000 to 2013. The yearly data set comprises total tourist arrivals, total income, urban resident disposable income per capita and net income per peasant. A mathematical model based on binomial approximation and an inverse quadratic radial basis function (RBF) is set up to model tourist arrivals. The total income, urban resident disposable income per capita and net income per peasant are also modeled. It is shown that the established mathematical model can be used to forecast tourism information and support good management of Huangshan tourism.
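
    To make the inverse quadratic RBF idea concrete, here is a hedged least-squares fitting sketch; the centers, shape parameter and synthetic yearly data are assumptions, not the paper's series.

    ```python
    import numpy as np

    def iq_rbf_design(t, centers, eps=1.0):
        """Inverse quadratic RBF design matrix, phi(r) = 1 / (1 + (eps*r)^2)."""
        r = np.abs(t[:, None] - centers[None, :])
        return 1.0 / (1.0 + (eps * r) ** 2)

    years = np.arange(2000, 2014, dtype=float)
    # synthetic stand-in for yearly tourist arrivals
    arrivals = 1.5 * (years - 1999) ** 1.2 + np.random.default_rng(1).normal(size=years.size)

    centers = years[::3]                        # assumed RBF centers
    Phi = iq_rbf_design(years, centers, eps=0.3)
    w, *_ = np.linalg.lstsq(Phi, arrivals, rcond=None)
    fitted = Phi @ w                            # model of the arrivals series
    ```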

  7. COMPARATIVE ANALYSIS OF THE CAUSES OF ABSOLUTE NULLITY OF THE CONTRACT IN THE ROMANIAN AND THE SPANISH CIVIL LAW

    Directory of Open Access Journals (Sweden)

    Carla Alexandra ANGHELESCU

    2015-07-01

    Full Text Available The present paper is aimed to present a comparative analysis of the causes of absolute nullity of the contract in the Romanian and the Spanish civil law. Thus, the study focuses on the presentation of both similarities and differences between the provisions of the Romanian Civil Code and the Spanish Civil Code that regulate the legal institution of the nullity of contracts, outlining the practical consequences of the conclusion.

  8. Hierarchical regression analysis in structural Equation Modeling

    NARCIS (Netherlands)

    de Jong, P.F.

    1999-01-01

    In a hierarchical or fixed-order regression analysis, the independent variables are entered into the regression equation in a prespecified order. Such an analysis is often performed when the extra amount of variance accounted for in a dependent variable by a specific independent variable is the main
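
    A generic illustration (not from the paper) of fixed-order entry: predictor blocks are added in a prespecified order and the increment in R-squared is reported at each step; data and block order are invented for the example.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 200
    x1, x2, x3 = rng.normal(size=(3, n))
    y = 0.5 * x1 + 0.3 * x2 + 0.2 * x3 + rng.normal(size=n)

    blocks = [[x1], [x1, x2], [x1, x2, x3]]   # prespecified entry order
    r2_prev = 0.0
    for step, preds in enumerate(blocks, 1):
        X = sm.add_constant(np.column_stack(preds))
        r2 = sm.OLS(y, X).fit().rsquared
        print(f"step {step}: R^2 = {r2:.3f}, delta R^2 = {r2 - r2_prev:.3f}")
        r2_prev = r2
    ```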

  9. Approximate deconvolution models of turbulence analysis, phenomenology and numerical analysis

    CERN Document Server

    Layton, William J

    2012-01-01

    This volume presents a mathematical development of a recent approach to the modeling and simulation of turbulent flows based on methods for the approximate solution of inverse problems. The resulting Approximate Deconvolution Models or ADMs have some advantages over more commonly used turbulence models – as well as some disadvantages. Our goal in this book is to provide a clear and complete mathematical development of ADMs, while pointing out the difficulties that remain. In order to do so, we present the analytical theory of ADMs, along with its connections, motivations and complements in the phenomenology of and algorithms for ADMs.

  10. Wellness Model of Supervision: A Comparative Analysis

    Science.gov (United States)

    Lenz, A. Stephen; Sangganjanavanich, Varunee Faii; Balkin, Richard S.; Oliver, Marvarene; Smith, Robert L.

    2012-01-01

    This quasi-experimental study compared the effectiveness of the Wellness Model of Supervision (WELMS; Lenz & Smith, 2010) with alternative supervision models for developing wellness constructs, total personal wellness, and helping skills among counselors-in-training. Participants were 32 master's-level counseling students completing their…

  11. Extendable linearised adjustment model for deformation analysis

    NARCIS (Netherlands)

    Hiddo Velsink

    2015-01-01

    Author supplied: "This paper gives a linearised adjustment model for the affine, similarity and congruence transformations in 3D that is easily extendable with other parameters to describe deformations. The model considers all coordinates stochastic. Full positive semi-definite covariance matrices

  12. Extendable linearised adjustment model for deformation analysis

    NARCIS (Netherlands)

    Velsink, H.

    2015-01-01

    This paper gives a linearised adjustment model for the affine, similarity and congruence transformations in 3D that is easily extendable with other parameters to describe deformations. The model considers all coordinates stochastic. Full positive semi-definite covariance matrices and correlation

  13. Reusable launch vehicle model uncertainties impact analysis

    Science.gov (United States)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    Reusable launch vehicles (RLVs) have the typical characteristics of complex aerodynamic shape and propulsion system coupling, and the flight environment is highly complicated and intensely changeable. The model therefore has large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of these uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of RLVs, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. Different factors that cause uncertainties during model building are then analyzed and summarized. After that, the model uncertainties are expressed according to the additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to quantify how strongly each uncertainty factor influences the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller for this kind of aircraft (such as an RLV).
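
    A toy sketch of the norm-based reasoning above: bound each additive uncertainty channel by its largest singular value and check eigenvalue stability of the perturbed system. The matrices are placeholders, not the RLV model.

    ```python
    import numpy as np

    A_nominal = np.array([[0.0, 1.0], [-2.0, -0.5]])
    dA_inertial = np.array([[0.0, 0.0], [-0.6, -0.1]])   # e.g. inertia errors
    dA_aero     = np.array([[0.0, 0.0], [-0.2, -0.05]])  # e.g. aero errors

    for name, dA in [("inertial", dA_inertial), ("aero", dA_aero)]:
        sigma_max = np.linalg.svd(dA, compute_uv=False)[0]  # norm of the channel
        eigs = np.linalg.eigvals(A_nominal + dA)
        stable = np.all(eigs.real < 0)
        print(f"{name}: max singular value = {sigma_max:.3f}, stable = {stable}")
    ```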

  14. Sensitive analysis of a finite element model of orthogonal cutting

    Science.gov (United States)

    Brocail, J.; Watremez, M.; Dubar, L.

    2011-01-01

    This paper presents a two-dimensional finite element model of orthogonal cutting. The proposed model has been developed with Abaqus/Explicit software. An Arbitrary Lagrangian-Eulerian (ALE) formulation is used to predict chip formation, temperature, chip-tool contact length, chip thickness, and cutting forces. This numerical model of orthogonal cutting is validated by comparing these process variables to experimental and numerical results obtained by Filice et al. [1]. The model can be considered reliable enough for qualitative analysis of the entry parameters related to the cutting process and the frictional models. A sensitivity analysis is conducted on the main entry parameters (coefficients of the Johnson-Cook law, and contact parameters) with the finite element model, using two levels for each factor. This sensitivity analysis of the entry parameters has allowed identification of the significant parameters and of their margins.
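
    A hedged sketch of a two-level (2^k factorial) sensitivity screen over entry parameters such as Johnson-Cook coefficients; the stand-in response function below replaces the actual finite element cutting simulation, and the parameter names and levels are assumptions.

    ```python
    import numpy as np
    from itertools import product

    def run_model(A, B, n, m):        # placeholder for one FE cutting run
        return 500 + 40 * A / 1000 + 15 * B / 1000 - 10 * n + 3 * m

    levels = {"A": (790, 1000), "B": (510, 700), "n": (0.2, 0.5), "m": (0.01, 0.03)}
    names = list(levels)
    runs = []
    for combo in product([0, 1], repeat=len(names)):
        vals = {k: levels[k][c] for k, c in zip(names, combo)}
        runs.append((combo, run_model(**vals)))

    # main effect of each factor: mean(high level) - mean(low level)
    for i, k in enumerate(names):
        hi = np.mean([y for c, y in runs if c[i] == 1])
        lo = np.mean([y for c, y in runs if c[i] == 0])
        print(f"main effect of {k}: {hi - lo:.2f}")
    ```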

  15. Analytic uncertainty and sensitivity analysis of models with input correlations

    Science.gov (United States)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
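
    For the independent-versus-correlated comparison above, a standard first-order (delta-method) propagation sketch, not the authors' exact formulation: Var(y) is approximated by g'Σg, with g the gradient at the input means and Σ built from standard deviations and a correlation matrix; all numbers are illustrative.

    ```python
    import numpy as np

    def var_propagation(grad, sigma, corr):
        """grad: model gradient at the input means;
        sigma: input standard deviations; corr: correlation matrix."""
        Sigma = np.outer(sigma, sigma) * corr   # input covariance matrix
        return grad @ Sigma @ grad

    grad = np.array([2.0, -1.0, 0.5])
    sigma = np.array([0.1, 0.2, 0.05])
    corr_indep = np.eye(3)
    corr_dep = np.array([[1.0, 0.8, 0.0],
                         [0.8, 1.0, 0.0],
                         [0.0, 0.0, 1.0]])

    print(var_propagation(grad, sigma, corr_indep))  # independent inputs
    print(var_propagation(grad, sigma, corr_dep))    # correlated inputs
    ```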

  16. Analysis of nonlinear systems using ARMA [autoregressive moving average] models

    International Nuclear Information System (INIS)

    Hunter, N.F. Jr.

    1990-01-01

    While many vibration systems exhibit primarily linear behavior, a significant percentage of the systems encountered in vibration and model testing are mildly to severely nonlinear. Analysis methods for such nonlinear systems are not yet well developed and the response of such systems is not accurately predicted by linear models. Nonlinear ARMA (autoregressive moving average) models are one method for the analysis and response prediction of nonlinear vibratory systems. In this paper we review the background of linear and nonlinear ARMA models, and illustrate the application of these models to nonlinear vibration systems. We conclude by summarizing the advantages and disadvantages of ARMA models and emphasizing prospects for future development. 14 refs., 11 figs

  17. Application of autoregressive moving average model in reactor noise analysis

    International Nuclear Information System (INIS)

    Tran Dinh Tri

    1993-01-01

    The application of an autoregressive (AR) model to estimating noise measurements has achieved many successes in reactor noise analysis in the last ten years. The physical processes that take place in the nuclear reactor, however, are described by an autoregressive moving average (ARMA) model rather than by an AR model. Consequently more correct results could be obtained by applying the ARMA model instead of the AR model to reactor noise analysis. In this paper the system of the generalised Yule-Walker equations is derived from the equation of an ARMA model, then a method for its solution is given. Numerical results show the applications of the method proposed. (author)
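
    A minimal sketch of the generalised Yule-Walker step for the AR part of an ARMA(p, q) model: for lags k > q the autocovariances obey the AR recursion, giving a linear system in the AR coefficients. This is a generic textbook formulation, not the author's exact system or solution method.

    ```python
    import numpy as np

    def autocov(x, max_lag):
        x = x - x.mean()
        n = len(x)
        return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

    def generalized_yule_walker(x, p, q):
        """Solve r(q+i) = sum_j a_j r(q+i-j), i = 1..p, for the AR coefficients."""
        r = autocov(x, q + 2 * p)
        M = np.array([[r[abs(q + i - j)] for j in range(1, p + 1)]
                      for i in range(1, p + 1)])
        rhs = np.array([r[q + i] for i in range(1, p + 1)])
        return np.linalg.solve(M, rhs)

    # toy ARMA(2,1) series: x_t = 0.6 x_{t-1} - 0.2 x_{t-2} + e_t + 0.4 e_{t-1}
    rng = np.random.default_rng(3)
    e = rng.normal(size=2000)
    x = np.zeros(2000)
    for t in range(2, 2000):
        x[t] = 0.6 * x[t - 1] - 0.2 * x[t - 2] + e[t] + 0.4 * e[t - 1]
    print(generalized_yule_walker(x, p=2, q=1))   # should be close to [0.6, -0.2]
    ```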

  18. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates

  19. Root analysis and implications to analysis model in ATLAS

    CERN Document Server

    Shibata, A

    2008-01-01

    An impressive amount of effort has been put in to realize a set of frameworks to support analysis in this new paradigm of GRID computing. However, much more than half of a physicist's time is typically spent after the GRID processing of the data. Due to the private nature of this level of analysis, there has been little common framework or methodology. While most physicists agree to use ROOT as the basis of their analysis, a number of approaches are possible for the implementation of the analysis using ROOT: conventional methods using CINT/ACLiC, development using g++, alternative interface through python, and parallel processing methods such as PROOF are some of the choices currently available on the market. Furthermore, in the ATLAS collaboration an additional layer of technology adds to the complexity because the data format is based on the POOL technology, which tends to be less portable. In this study, various modes of ROOT analysis are profiled for comparison with the main focus on the processing speed....

  20. Geometrical analysis of the interacting boson model

    International Nuclear Information System (INIS)

    Dieperink, A.E.L.

    1983-01-01

    The Interacting Boson Model is considered, in relation with geometrical models and the application of mean field techniques to algebraic models, in three lectures. In the first, several methods are reviewed to establish a connection between the algebraic formulation of collective nuclear properties in terms of the group SU(6) and the geometric approach. In the second lecture the geometric interpretation of new degrees of freedom that arise in the neutron-proton IBA is discussed, and in the third one some further applications of algebraic techniques to the calculation of static and dynamic collective properties are presented. (U.K.)

  1. Philosophical analysis of models of engineering education in Russia

    Directory of Open Access Journals (Sweden)

    Fadeeva V. N.

    2016-01-01

    Full Text Available This article defines the principles of a philosophical approach to the problems of engineering education. Ontological, epistemological and axiological components of the proposed approach are distinguished, and assessment criteria for engineering education models are specified. Based on the presented principles and criteria, an analysis of Russian engineering education models is performed. The authors distinguish the following models: classical (tsarist), Soviet transitional, Soviet industrial, physicotechnical, Soviet mass (reproductive) and Russian transitional. In addition, among developing models it is possible to recognize the methodological (creative) and outrunning (advanced) models. On the basis of the performed analysis, positive and negative aspects of the distinguished models are determined, and it is possible to conclude that the emergence of each established model was driven by particular issues facing the state at a particular period of time. The necessity of designing a proactive model of engineering education is formulated.

  2. CLPX NCAR Data Analysis and Numerical Modeling

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this project was to generate a research-quality, scientifically-sound, best-as-reasonably possible, three-dimensional meteorological analysis for the...

  3. A model of the gas analysis system operation process

    Science.gov (United States)

    Yakimenko, I. V.; Kanishchev, O. A.; Lyamets, L. L.; Volkova, I. V.

    2017-12-01

    The characteristic features of modeling the gas-analysis measurement system operation process on the basis of the semi-Markov process theory are discussed. The model of the measuring gas analysis system operation process is proposed, which makes it possible to take into account the influence of the replacement interval, the level of reliability and maintainability and to evaluate the product reliability.

  4. sensitivity analysis on flexible road pavement life cycle cost model

    African Journals Online (AJOL)

    user

    Sensitivity analysis is a tool used in the assessment of a model's performance. This study examined the application of sensitivity analysis to a developed flexible pavement life cycle cost model using varying discount rates. The study area is Effurun, Uvwie Local Government Area of Delta State, Nigeria. In order to ...
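
    A generic life cycle cost sketch illustrating discount-rate sensitivity; all cost figures, the maintenance schedule and the horizon are invented for the example, not taken from the study.

    ```python
    import numpy as np

    def lcc(initial, annual_maint, rehab_cost, rehab_years, horizon, rate):
        """Present value of construction, maintenance and rehabilitation costs."""
        years = np.arange(1, horizon + 1)
        pv_maint = np.sum(annual_maint / (1 + rate) ** years)
        pv_rehab = sum(rehab_cost / (1 + rate) ** y for y in rehab_years)
        return initial + pv_maint + pv_rehab

    for rate in (0.04, 0.06, 0.08, 0.10):
        cost = lcc(initial=2.5e6, annual_maint=2.0e4, rehab_cost=4.0e5,
                   rehab_years=(10, 20), horizon=30, rate=rate)
        print(f"discount rate {rate:.0%}: LCC = {cost / 1e6:.2f} M")
    ```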

  5. A visual analysis of the process of process modeling

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.

    2015-01-01

    The construction of business process models has become an important requisite in the analysis and optimization of processes. The success of the analysis and optimization efforts heavily depends on the quality of the models. Therefore, a research domain emerged that studies the process of process

  6. an improved structural model for seismic analysis of tall frames

    African Journals Online (AJOL)

    Dr Obe

    This paper proposes and examines an improved structural model that overcomes the deficiencies of the shear frame model by considering the effects of flexible horizontal members and column axial loads in the seismic analysis of multi-storey frames. The matrix displacement method of analysis is used on the basis of ...

  7. A Petri Nets Model for Blockchain Analysis

    OpenAIRE

    Pinna, Andrea; Tonelli, Roberto; Orrú, Matteo; Marchesi, Michele

    2017-01-01

    A Blockchain is a global shared infrastructure where cryptocurrency transactions among addresses are recorded, validated and made publicly available in a peer-to-peer network. To date, the best-known and most important cryptocurrency is Bitcoin. In this paper we focus on this cryptocurrency and in particular on the modeling of the Bitcoin Blockchain by using the Petri Nets formalism. The proposed model allows us to quickly collect information about identities owning Bitcoin addresses and to re...

  8. Toward General Analysis of Recursive Probability Models

    OpenAIRE

    Pless, Daniel; Luger, George

    2013-01-01

    There is increasing interest within the research community in the design and use of recursive probability models. Although there still remains concern about computational complexity costs and the fact that computing exact solutions can be intractable for many nonrecursive models and impossible in the general case for recursive problems, several research groups are actively developing computational techniques for recursive stochastic languages. We have developed an extension to the traditional...

  9. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo

  10. Structural dynamic analysis with generalized damping models analysis

    CERN Document Server

    Adhikari, Sondipon

    2013-01-01

    Since Lord Rayleigh introduced the idea of viscous damping in his classic work "The Theory of Sound" in 1877, it has become standard practice to use this approach in dynamics, covering a wide range of applications from aerospace to civil engineering. However, in the majority of practical cases this approach is adopted more for mathematical convenience than for modeling the physics of vibration damping. Over the past decade, extensive research has been undertaken on more general "non-viscous" damping models and vibration of non-viscously damped systems. This book, along with a related book
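
    As a baseline for the classical viscous case the book starts from, a sketch of Rayleigh (proportional) damping, C = alpha*M + beta*K, and the resulting modal damping ratios zeta_i = (alpha/omega_i + beta*omega_i)/2; the matrices and coefficients are toy values.

    ```python
    import numpy as np

    M = np.diag([2.0, 1.0])                          # mass matrix
    K = np.array([[600.0, -200.0], [-200.0, 200.0]]) # stiffness matrix
    alpha, beta = 0.8, 1e-3                          # Rayleigh coefficients

    C = alpha * M + beta * K                         # viscous damping matrix

    # undamped natural frequencies from the eigenvalues of M^{-1} K
    w2 = np.linalg.eigvals(np.linalg.solve(M, K)).real
    omega = np.sqrt(np.sort(w2))
    zeta = 0.5 * (alpha / omega + beta * omega)      # modal damping ratios
    print(omega, zeta)
    ```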

  11. SBKF Modeling and Analysis Plan: Buckling Analysis of Compression-Loaded Orthogrid and Isogrid Cylinders

    Science.gov (United States)

    Lovejoy, Andrew E.; Hilburger, Mark W.

    2013-01-01

    This document outlines a Modeling and Analysis Plan (MAP) to be followed by the SBKF analysts. It includes instructions on modeling and analysis formulation and execution, model verification and validation, identifying sources of error and uncertainty, and documentation. The goal of this MAP is to provide a standardized procedure that ensures uniformity and quality of the results produced by the project and corresponding documentation.

  12. Discretization model for nonlinear dynamic analysis of three dimensional structures

    International Nuclear Information System (INIS)

    Hayashi, Y.

    1982-12-01

    A discretization model for nonlinear dynamic analysis of three-dimensional structures is presented. The discretization is achieved through a three-dimensional spring-mass system, and the dynamic response is obtained by direct integration of the equations of motion using central differences. First the viability of the model is verified through the analysis of homogeneous linear structures, and then its performance in the analysis of structures subjected to impulsive or impact loads, taking into account both geometrical and physical nonlinearities, is evaluated. (Author) [pt
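
    A minimal sketch of the scheme described above: a spring-mass system integrated by explicit central differences, x_{n+1} = 2 x_n - x_{n-1} + dt^2 M^{-1} f_n. The two-degree-of-freedom matrices are illustrative, and the internal force is linear here (a nonlinear constitutive law would replace it).

    ```python
    import numpy as np

    M = np.diag([1.0, 1.0])
    K = np.array([[400.0, -150.0], [-150.0, 150.0]])
    Minv = np.linalg.inv(M)

    dt, n_steps = 1e-3, 5000
    x_prev = np.zeros(2)
    x = np.array([0.0, 0.01])            # small initial displacement
    history = [x_prev, x]
    for _ in range(n_steps):
        f_int = K @ x                    # linear here; swap in a nonlinear law
        x_next = 2 * x - x_prev + dt**2 * (Minv @ (-f_int))
        x_prev, x = x, x_next
        history.append(x)
    ```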

  13. Sensitivity Analysis of the Gap Heat Transfer Model in BISON.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Schmidt, Rodney C.; Williamson, Richard (INL); Perez, Danielle (INL)

    2014-10-01

    This report summarizes the result of a NEAMS project focused on sensitivity analysis of the heat transfer model in the gap between the fuel rod and the cladding used in the BISON fuel performance code of Idaho National Laboratory. Using the gap heat transfer models in BISON, the sensitivity of the modeling parameters and the associated responses is investigated. The study results in a quantitative assessment of the role of various parameters in the analysis of gap heat transfer in nuclear fuel.

  14. European Climate - Energy Security Nexus. A model based scenario analysis

    International Nuclear Information System (INIS)

    Criqui, Patrick; Mima, Silvana

    2011-01-01

    In this research, we provide an overview of the climate-security nexus in the European energy sector through a model-based scenario analysis with the POLES model. The analysis underlines that, under stringent climate policies, Europe takes advantage of a double dividend: the capacity to develop a new, cleaner energy model, and lower vulnerability to potential shocks on the international energy markets. (authors)

  15. European Climate - Energy Security Nexus. A model based scenario analysis

    Energy Technology Data Exchange (ETDEWEB)

    Criqui, Patrick; Mima, Silvana

    2011-01-15

    In this research, we provide an overview of the climate-security nexus in the European energy sector through a model-based scenario analysis with the POLES model. The analysis underlines that, under stringent climate policies, Europe takes advantage of a double dividend: the capacity to develop a new, cleaner energy model, and lower vulnerability to potential shocks on the international energy markets. (authors)

  16. Environmental modeling and health risk analysis (ACTS/RISK)

    National Research Council Canada - National Science Library

    Aral, M. M

    2010-01-01

    ... presents a review of the topics of exposure and health risk analysis. The Analytical Contaminant Transport Analysis System (ACTS) and Health RISK Analysis (RISK) software tools are an integral part of the book and provide computational platforms for all the models discussed herein. The most recent versions of these two softwa...

  17. Modeling and Exergy Analysis of District Cooling

    DEFF Research Database (Denmark)

    Nguyen, Chan

    in the gas cooler, pinch temperature in the evaporator and effectiveness of the IHX. These results are complemented by the exergy analysis, where the exergy destruction ratio of the CO2 system’s component is found. Heat recovery from vapour compression heat pumps has been investigated. The heat is to be used...... based system is the investment cost for the pipes. To overcome this, a combined district heating and cooling system based on CO2 as refrigerant and transport fluid is proposed. Exergoeconomic analysis has been used to evaluate and optimize a CO2 based system for combined heating and cooling...

  18. Comparative analysis of Goodwin's business cycle models

    Science.gov (United States)

    Antonova, A. O.; Reznik, S.; Todorov, M. D.

    2016-10-01

    We compare the behavior of solutions of Goodwin's business cycle equation in the form of a neutral delay differential equation with fixed delay (the NDDE model) and in the form of differential equations of 3rd, 4th and 5th order (ODE models). Such ODE models (Taylor series expansions of the NDDE in powers of θ) were proposed by N. Dharmaraj and K. Vela Velupillai [6] for investigation of the short periodic sawtooth oscillations in the NDDE. We show that the ODEs of 3rd, 4th and 5th order may approximate the asymptotic behavior of only the main Goodwin mode, but not the sawtooth modes. If the order of the Taylor series expansion exceeds 5, then the approximate ODE becomes unstable independently of the time lag θ.

  19. Computational Models for Analysis of Illicit Activities

    DEFF Research Database (Denmark)

    Nizamani, Sarwat

    to describe the phenomenon of contagious public outrage, which eventually leads to the spread of violence following a disclosure of some unpopular political decisions and/or activity. The results shed a new light on terror activity and provide some hint on how to curb the spreading of violence within...... result in a fully evolved network. This method of network evolution can help intelligence security analysts to understand the structure of the network. For suspicious emails detection and email author identification, a cluster-based text classification model has been proposed. The model outperformed...... traditional models for both of the tasks. Apart from these globally organized crimes and cybercrimes, there happen specific world issues which affect geographic locations and take the form of bursts of public violence. These kinds of issues have received little attention by the academicians. These issues have...

  20. Analysis Models for Polymer Composites Across Different Length Scales

    Science.gov (United States)

    Camanho, Pedro P.; Arteiro, Albertino

    This chapter presents the analysis models, developed at different length scales, for the prediction of inelastic deformation and fracture of polymer composite materials reinforced by unidirectional fibers. Three different length scales are covered. Micro-mechanical models are used to understand in detail the effects of the constituents on the response of the composite material, and to support the development of analysis models based on homogenized representations of composite materials. Meso-mechanical models are used to predict the strength of composite structural components under general loading conditions. Finally, macro-mechanical models based on Finite Fracture Mechanics, which enable fast strength predictions of simple structural details, are discussed.

  1. Synthesized dynamic modeling and stability analysis of novel HVDC system

    Energy Technology Data Exchange (ETDEWEB)

    Xu Sun; Li Kong [Inst. of Electrical Engineering, CAS, BJ (China)

    2008-07-01

    At present, many projects connecting large offshore wind farms to the grid adopt the novel HVDC technology. A voltage source converter structure and PWM modulation technology are used in the system, and active and reactive power can be controlled independently, which ensures excellent performance. It is therefore necessary to build a detailed dynamic model and analyze its stability as the basis for further research. In this paper, firstly, the switching-function model is established as the basis of further analysis. Secondly, the steady-state model, small-signal model and high-frequency dynamic model of the novel HVDC system are established based on the state-space averaging method. Thirdly, the stability of the whole system is analyzed on the basis of the above models. Finally, the whole system is validated by simulation analysis to prove the validity of the model and the stability analysis. (orig.)
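
    A hedged sketch of the small-signal workflow implied above: take the state-space averaged model linearised about an operating point and check that all eigenvalues lie in the open left half-plane. The state matrix below is a toy placeholder, not the paper's VSC-HVDC model.

    ```python
    import numpy as np

    # toy linearised state matrix of a converter around its operating point
    A = np.array([[-50.0,  314.0,   0.0],
                  [-314.0, -50.0, 120.0],
                  [  0.0,  -80.0, -10.0]])

    eigs = np.linalg.eigvals(A)
    print(eigs)
    print("small-signal stable:", np.all(eigs.real < 0))
    ```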

  2. CRITICAL ANALYSIS OF EVALUATION MODEL LOMCE

    Directory of Open Access Journals (Sweden)

    José Luis Bernal Agudo

    2015-06-01

    Full Text Available The evaluation model proposed by the LOMCE is rooted in neoliberal beliefs, reflecting a specific way of understanding the world. What matters is not the process but the results, with evaluation at the center of the teaching-learning processes. The model reflects poor planning, since the theory that justifies it is not developed into coherent proposals; there is an excessive concern for excellence, and diversity is left out. A comprehensive way of understanding education should be recovered.

  3. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes with scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times, which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows estimation of the sensitivity indices of each scalar model input, while the 'dispersion model' allows derivation of the total sensitivity index of the functional model inputs. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates the nuclear fuel irradiation.
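
    A sketch of a variance-based (Sobol) analysis for scalar inputs, here using the SALib Python package as a swapped-in tool (not the authors' code); the test function is the standard Ishigami benchmark rather than the industrial fuel code.

    ```python
    import numpy as np
    from SALib.sample import saltelli   # Saltelli sampling for Sobol indices
    from SALib.analyze import sobol

    problem = {"num_vars": 3,
               "names": ["x1", "x2", "x3"],
               "bounds": [[-np.pi, np.pi]] * 3}

    X = saltelli.sample(problem, 1024)
    # Ishigami benchmark function in place of the expensive computer code
    Y = np.sin(X[:, 0]) + 7 * np.sin(X[:, 1])**2 + 0.1 * X[:, 2]**4 * np.sin(X[:, 0])

    Si = sobol.analyze(problem, Y)
    print(Si["S1"], Si["ST"])   # first-order and total Sobol indices
    ```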

  4. Development of a Granular Media Model for Finite Element Analysis

    National Research Council Canada - National Science Library

    SMith, Donald

    2001-01-01

    .... The model was evolved from a mechanistic interpretation of endochronic theory, which creates a modular system that can accommodate plasticity, visco-elasticity/plasticity, and damage within a single modeling format. The performance of the model is illustrated with the analysis of permanent deformations in a flexible pavement caused by repeated tire loads.

  5. System Reliability Analysis Capability and Surrogate Model Application in RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Huang, Dongli [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gleicher, Frederick [Idaho National Lab. (INL), Idaho Falls, ID (United States); Wang, Bei [Idaho National Lab. (INL), Idaho Falls, ID (United States); Adbel-Khalik, Hany S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pascucci, Valerio [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-11-01

    This report collects the efforts performed to improve the reliability analysis capabilities of the RAVEN code and to explore new opportunities in the usage of surrogate models, by extending the current RAVEN capabilities to multi-physics surrogate models and the construction of surrogate models for high-dimensionality fields.

  6. Analysis of longitudinal data using the hierarchical linear model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1996-01-01

    The hierarchical linear model is a linear model with nested random coefficients, fruitfully used for multilevel research. A tutorial is presented on the use of this model for the analysis of longitudinal data, i.e., repeated data on the same subjects. An important advantage of this approach is that
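
    An illustrative mixed-model fit for longitudinal data using statsmodels (a tool choice of this sketch, not the tutorial's); the data are simulated and the column names are assumptions.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    n_subj, n_obs = 30, 5
    subj = np.repeat(np.arange(n_subj), n_obs)
    time = np.tile(np.arange(n_obs), n_subj)
    intercepts = rng.normal(0, 1, n_subj)[subj]      # subject-level variation
    y = 2.0 + 0.5 * time + intercepts + rng.normal(0, 0.5, n_subj * n_obs)
    df = pd.DataFrame({"y": y, "time": time, "subject": subj})

    # random intercept per subject, fixed effect of time
    model = smf.mixedlm("y ~ time", df, groups=df["subject"])
    print(model.fit().summary())
    ```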

  7. Mathematical Models and the Experimental Analysis of Behavior

    Science.gov (United States)

    Mazur, James E.

    2006-01-01

    The use of mathematical models in the experimental analysis of behavior has increased over the years, and they offer several advantages. Mathematical models require theorists to be precise and unambiguous, often allowing comparisons of competing theories that sound similar when stated in words. Sometimes different mathematical models may make…

  8. ANALYSIS/MODEL COVER SHEET, MULTISCALE THERMOHYDROLOGIC MODEL

    International Nuclear Information System (INIS)

    Buscheck, T.A.

    2001-01-01

    The purpose of the Multiscale Thermohydrologic Model (MSTHM) is to describe the thermohydrologic evolution of the near-field environment (NFE) and engineered barrier system (EBS) throughout the potential high-level nuclear waste repository at Yucca Mountain for a particular engineering design (CRWMS M&O 2000c). The process-level model will provide thermohydrologic (TH) information and data (such as in-drift temperature, relative humidity, liquid saturation, etc.) for use in other technical products. These data are provided throughout the entire repository area as a function of time. The MSTHM couples the Smeared-heat-source Drift-scale Thermal-conduction (SDT), Line-average-heat-source Drift-scale Thermohydrologic (LDTH), Discrete-heat-source Drift-scale Thermal-conduction (DDT), and Smeared-heat-source Mountain-scale Thermal-conduction (SMT) submodels such that the flow of water and water vapor through partially-saturated fractured rock is considered. The MSTHM accounts for 3-D drift-scale and mountain-scale heat flow, repository-scale variability of stratigraphy and infiltration flux, and waste package (WP)-to-WP variability in heat output from WPs. All submodels use the nonisothermal unsaturated-saturated flow and transport (NUFT) simulation code. The MSTHM is implemented in several data-processing steps. The four major steps are: (1) submodel input-file preparation, (2) execution of the four submodel families with the use of the NUFT code, (3) execution of the multiscale thermohydrologic abstraction code (MSTHAC), and (4) binning and post-processing (i.e., graphics preparation) of the output from MSTHAC. Section 6 describes the MSTHM in detail. The objectives of this Analyses and Model Report (AMR) are to investigate near-field (NF) and EBS thermohydrologic environments throughout the repository area at various evolution periods, and to provide TH data that may be used in other process model reports.

  9. [Analysis of dalbavancin in animal models].

    Science.gov (United States)

    Murillo, Óscar; El-Haj, Cristina

    2017-01-01

    Multiresistant Gram-positive infections continue to pose a major clinical challenge, and the development of new antibiotics is always desirable. Dalbavancin is a lipoglycopeptide with a prolonged half-life that allows long dosing intervals. Its activity has been evaluated in distinct experimental models and microorganisms, which limits the conclusions that can be drawn; however, the largest number of studies have been conducted in Staphylococcus aureus infection. Overall, dalbavancin has shown concentration-dependent efficacy, and the pharmacodynamic parameters best explaining its activity are the maximal concentration/minimal inhibitory concentration and area under the curve/minimal inhibitory concentration ratios. In these experimental models, dalbavancin has shown good distribution, a prolonged half-life in all animal species, and efficacy mostly similar to that of previous glycopeptides but at lower doses and with longer dosing intervals. Of note, the efficacy of dalbavancin is not altered by methicillin resistance or the glycopeptide sensitivity of S. aureus. In the case of difficult-to-treat staphylococcal infections (e.g., endocarditis, foreign body infections), an adequate dosing interval and high dosage seem to play an important role in the efficacy of the drug. All in all, experimental models can still provide greater knowledge of this new antibiotic to guide clinical research and determine its role in the treatment of distinct infections produced by Gram-positive microorganisms. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  10. Experimental Measurement, Analysis and Modelling of Dependency ...

    African Journals Online (AJOL)

    We propose a direct method for measuring the total emissivity of opaque samples over a range of temperatures around ambient. The method rests on modulating the temperature of the sample and processing the infrared signal emanating from its surface; we model the total emissivity obtained ...

  11. Work zone lane closure analysis model.

    Science.gov (United States)

    2009-10-01

    At the Alabama Department of Transportation (ALDOT), the tool used by traffic engineers to predict whether a queue will form at a freeway work zone is the Excel-based "Lane Rental Model" developed at the Oklahoma Department of Transportation (OkDOT) ...

  12. Spatial Uncertainty Analysis of Ecological Models

    Energy Technology Data Exchange (ETDEWEB)

    Jager, H.I.; Ashwood, T.L.; Jackson, B.L.; King, A.W.

    2000-09-02

    The authors evaluated the sensitivity of a habitat model and a source-sink population model to spatial uncertainty in landscapes with different statistical properties and for hypothetical species with different habitat requirements. Sequential indicator simulation generated alternative landscapes from a source map. Their results showed that spatial uncertainty was highest for landscapes in which suitable habitat was rare and spatially uncorrelated. Although they were able to exert some control over the degree of spatial uncertainty by varying the sampling density drawn from the source map, intrinsic spatial properties (i.e., average frequency and degree of spatial autocorrelation) played a dominant role in determining variation among realized maps. To evaluate the ecological significance of landscape variation, they compared the variation in predictions from a simple habitat model to variation among landscapes for three species types. Spatial uncertainty in predictions of the amount of source habitat depended on both the spatial life history characteristics of the species and the statistical attributes of the synthetic landscapes. Species differences were greatest when the landscape contained a high proportion of suitable habitat. The predicted amount of source habitat was greater for edge-dependent (interior) species in landscapes with spatially uncorrelated (correlated) suitable habitat. A source-sink model demonstrated that, although variation among landscapes resulted in relatively little variation in overall population growth rate, this spatial uncertainty was sufficient in some situations to produce qualitatively different predictions about population viability (i.e., population decline vs. increase).

  13. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...

  14. An Analysis of Student Model Portability

    Science.gov (United States)

    Valdés Aguirre, Benjamín; Ramírez Uresti, Jorge A.; du Boulay, Benedict

    2016-01-01

    Sharing user information between systems is an area of interest for every field involving personalization. Recommender Systems are more advanced in this aspect than Intelligent Tutoring Systems (ITSs) and Intelligent Learning Environments (ILEs). A reason for this is that the user models of Intelligent Tutoring Systems and Intelligent Learning…

  15. Transient analysis models for nuclear power plants

    International Nuclear Information System (INIS)

    Agapito, J.R.

    1981-01-01

    The modelling used for the simulation of the Angra-1 reactor start-up tests, using the RETRAN computer code, is presented. Three tests are simulated: (a) nuclear power plant trip from 100% power; (b) large power excursion tests; and (c) 'load swing' tests. (E.G.) [pt

  16. Financial Markets Analysis by Probabilistic Fuzzy Modelling

    NARCIS (Netherlands)

    J.H. van den Berg (Jan); W.-M. van den Bergh (Willem-Max); U. Kaymak (Uzay)

    2003-01-01

    For successful trading in financial markets, it is important to develop financial models where one can identify different states of the market for modifying one's actions. In this paper, we propose to use probabilistic fuzzy systems for this purpose. We concentrate on Takagi-Sugeno

  17. Feature Analysis for Modeling Game Content Quality

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Togelius, Julian

    2011-01-01

    ... players' preferences, and by defining the smallest game session size for which the model can still predict reported emotion with acceptable accuracy. Neuroevolutionary preference learning is used to approximate the function from game content to reported emotional preferences. The experiments are based on a modified...

  18. Analysis and modeling of "focus" in context

    DEFF Research Database (Denmark)

    Hovy, Dirk; Anumanchipalli, Gopala; Parlikar, Alok

    2013-01-01

    ... or speech stimuli. We then build models to show how well we predict that focus word from lexical (and higher) level features. Also, using spectral and prosodic information, we show the differences in these focus words when spoken with and without context. Finally, we show how we can improve speech synthesis of these utterances given focus information.

  19. Future of human models for crash analysis

    NARCIS (Netherlands)

    Wismans, J.S.H.M.; Happee, R.; Hoof, J.F.A.M. van; Lange, R. de

    2001-01-01

    In the crash safety field, mathematical models can be applied in practically all areas of research and development, including: reconstruction of actual accidents, design (CAD) of the crash response of vehicles, safety devices and roadside facilities, and support of human impact biomechanical

  20. Mixture model analysis of complex samples

    NARCIS (Netherlands)

    Wedel, M; ter Hofstede, F; Steenkamp, JBEM

    1998-01-01

    We investigate the effects of a complex sampling design on the estimation of mixture models. An approximate or pseudo likelihood approach is proposed to obtain consistent estimates of class-specific parameters when the sample arises from such a complex design. The effects of ignoring the sample

  1. Model-Based Analysis of Hand Radiographs

    Science.gov (United States)

    Levitt, Tod S.; Hedgcock, Marcus W.

    1989-05-01

    As a step toward computer-assisted imagery interpretation, we are developing algorithms for computed radiography that allow a computer to recognize specific bones and joints, and to identify variations from normal in size, shape and density. In this paper we report on our approach to model-based computer recognition of hands in radiographs. First, image processing generates hypotheses of the imaged bones. Multiple hypotheses of the size and orientation of the imaged anatomy are matched against stored 3D models of the relevant bones, obtained from statistically valid population studies. Probabilities of the hypotheses are accrued using Bayesian inference techniques whose evaluation is guided by the structure of the hand model and the observed image-derived evidence such as anti-parallel edges, local contrast, etc. High-probability matches between the hand model and the image data can cue additional image-processing-based search for bones, joints and soft tissue to confirm hypotheses of the location of the imaged hand. At this point multiple disease detection techniques, automated bone age identification, etc. can be employed.

  2. Social Ecological Model Analysis for ICT Integration

    Science.gov (United States)

    Zagami, Jason

    2013-01-01

    ICT integration of teacher preparation programmes was undertaken by the Australian Teaching Teachers for the Future (TTF) project in all 39 Australian teacher education institutions and highlighted the need for guidelines to inform systemic ICT integration approaches. A Social Ecological Model (SEM) was used to positively inform integration…

  3. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available The REFII model is an authorial mathematical model for time series data mining. Its main purpose is to automate time series analysis through a unique transformation model of time series. An advantage of this approach is the linkage of different methods for time series analysis, connecting traditional data mining tools for time series and constructing new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system with a finite set of methods. It is, first of all, a model for transformation of time series values, which prepares the data used by different sets of methods based on the same transformation model in the domain of the problem space. The REFII model thus gives a new approach to time series analysis based on a unique transformation model, which is a base for all kinds of time series analysis. The advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  4. Three dimensional mathematical model of tooth for finite element analysis

    Directory of Open Access Journals (Sweden)

    Puškar Tatjana

    2010-01-01

    Full Text Available Introduction. The mathematical model of the abutment tooth is the starting point of the finite element analysis of stress and deformation of dental structures. The simplest and easiest way is to form a model according to literature data on the dimensions and morphological characteristics of teeth. Our method is based on forming 3D models using standard geometrical forms (objects) in programmes for solid modeling. Objective. Forming the mathematical model of the abutment of the second upper premolar for finite element analysis of stress and deformation of dental structures. Methods. The abutment tooth has the form of a complex geometric object. It is suitable for modeling in the solid modeling programme SolidWorks. After analyzing the literature data about the morphological characteristics of teeth, we started the modeling by dividing the tooth (a complex geometric body) into simple geometric bodies (cylinder, cone, pyramid, ...). Connecting simple geometric bodies together or subtracting bodies from the basic body, we formed the complex geometric body, the tooth. The model was then transferred into Abaqus, a computational programme for finite element analysis. Transferring the data was done using ACIS SAT, a standard file format for transferring 3D models. Results. Using the programme for solid modeling SolidWorks, we developed three models of the abutment of the second maxillary premolar: the model of the intact abutment, the model of the endodontically treated tooth with two remaining cavity walls, and the model of the endodontically treated tooth with two remaining walls and an inserted post. Conclusion. Mathematical models of the abutment made according to the literature data are very similar to the real abutment, and the simplifications are minimal. These models enable calculations of stress and deformation of the dental structures. The finite element analysis provides useful information in understanding biomechanical problems and gives guidance for clinical research.

  5. [Three dimensional mathematical model of tooth for finite element analysis].

    Science.gov (United States)

    Puskar, Tatjana; Vasiljević, Darko; Marković, Dubravka; Jevremović, Danimir; Pantelić, Dejan; Savić-Sević, Svetlana; Murić, Branka

    2010-01-01

    The mathematical model of the abutment tooth is the starting point of the finite element analysis of stress and deformation of dental structures. The simplest and easiest way is to form a model according to literature data on the dimensions and morphological characteristics of teeth. Our method is based on forming 3D models using standard geometrical forms (objects) in programmes for solid modeling. Forming the mathematical model of the abutment of the second upper premolar for finite element analysis of stress and deformation of dental structures. The abutment tooth has the form of a complex geometric object. It is suitable for modeling in the solid modeling programme SolidWorks. After analysing the literature data about the morphological characteristics of teeth, we started the modeling by dividing the tooth (a complex geometric body) into simple geometric bodies (cylinder, cone, pyramid, ...). Connecting simple geometric bodies together or subtracting bodies from the basic body, we formed the complex geometric body, the tooth. The model was then transferred into Abaqus, a computational programme for finite element analysis. Transferring the data was done using ACIS SAT, a standard file format for transferring 3D models. Using the programme for solid modeling SolidWorks, we developed three models of the abutment of the second maxillary premolar: the model of the intact abutment, the model of the endodontically treated tooth with two remaining cavity walls, and the model of the endodontically treated tooth with two remaining walls and an inserted post. Mathematical models of the abutment made according to the literature data are very similar to the real abutment, and the simplifications are minimal. These models enable calculations of stress and deformation of the dental structures. The finite element analysis provides useful information in understanding biomechanical problems and gives guidance for clinical research.

  6. Traffic analysis toolbox volume XI : weather and traffic analysis, modeling and simulation.

    Science.gov (United States)

    2010-12-01

    This document presents a weather module for the traffic analysis tools program. It provides traffic engineers, transportation modelers and decisions makers with a guide that can incorporate weather impacts into transportation system analysis and mode...

  7. Dynamic Chest Image Analysis: Evaluation of Model-Based Pulmonary Perfusion Analysis With Pyramid Images

    National Research Council Canada - National Science Library

    Liang, Jianming

    2001-01-01

    Dynamic Chest Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence...

  8. Extrusion analysis of buffer using diffusion model

    International Nuclear Information System (INIS)

    Sugino, H.; Kanno, T.

    1999-11-01

    The buffer material that will be buried as a component of the engineered barrier system swells upon saturation by groundwater. As a result of this swelling, buffer material may penetrate into the peripheral rock zone surrounding the buffer through open fractures if the process is sustained over extremely long periods of time. Buffer material extrusion could lead to a reduction of buffer density, which may in turn degrade the properties assumed in performance assessment (e.g., permeability, diffusion coefficient). JNC has conducted studies of bentonite extrusion into fractures of the rock mass as part of its high-level waste research. In 1997, JNC reported test results concerning buffer material extrusion and erosion. These tests were performed using test facilities in the Geological Isolation Basic Research Facility. After 1997, JNC also conducted an analytical study of buffer material extrusion. This report describes the results of that analysis, which are reflected in the H12 report. In this analysis, the diffusion coefficient was derived as a function of the swelling pressure and the viscosity resistance of the buffer material. Thus, the reduction in density of buffer material after emplacement in saturated rock was assessed. The assessment was made assuming parallel-plate radial fractures initially filled with water only. Because fractures in natural rock masses inevitably contain mineral inclusions and intersect other fractures, this analysis gives significantly conservative conditions with respect to long-term extrusion of buffer and the possible decrease in buffer density. (author)

  9. Formal Analysis of Graphical Security Models

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi

    The increasing use of computer-based systems in almost every aspect of our daily life makes the threat posed by potential attackers ever more dangerous, and a successful attack ever more rewarding. Moreover, the complexity of these systems is also increasing, with physical devices, software components and human actors interacting with each other to form so-called socio-technical systems. The importance of socio-technical systems to modern societies requires verifying their security properties formally, while their inherent complexity makes manual analyses impracticable. Graphical models for security offer an unrivalled opportunity to describe socio-technical systems, for they allow to represent different aspects like human behaviour, computation and physical phenomena in an abstract yet uniform manner. Moreover, these models can be assigned a formal semantics, thereby allowing...

  10. Modeling, analysis, and visualization of anisotropy

    CERN Document Server

    Özarslan, Evren; Hotz, Ingrid

    2017-01-01

    This book focuses on the modeling, processing and visualization of anisotropy, irrespective of the context in which it emerges, using state-of-the-art mathematical tools. As such, it differs substantially from conventional reference works, which are centered on a particular application. It covers the following topics: (i) the geometric structure of tensors, (ii) statistical methods for tensor field processing, (iii) challenges in mapping neural connectivity and structural mechanics, (iv) processing of uncertainty, and (v) visualizing higher-order representations. In addition to original research contributions, it provides insightful reviews. This multidisciplinary book is the sixth in a series that aims to foster scientific exchange between communities employing tensors and other higher-order representations of directionally dependent data. A significant number of the chapters were co-authored by the participants of the workshop titled Multidisciplinary Approaches to Multivalued Data: Modeling, Visualization,...

  11. Feature Analysis for Modeling Game Content Quality

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Togelius, Julian

    2011-01-01

    A promising approach to optimising entertainment for individual game players is to tailor player experience in real-time via automatic game content generation. Modeling the relationship between game content and player preferences or affective states is an important step towards this type of game personalization. In this paper we analyse the relationship between level design parameters of platform games and player experience. We introduce a method to extract the most useful information about game content from short game sessions by investigating the size of game session that yields the highest accuracy in predicting players' preferences, and by defining the smallest game session size for which the model can still predict reported emotion with acceptable accuracy. Neuroevolutionary preference learning is used to approximate the function from game content to reported emotional preferences. The experiments are based on a modified...

  12. Analysis of mathematical modelling on potentiometric biosensors.

    Science.gov (United States)

    Mehala, N; Rajendran, L

    2014-01-01

    A mathematical model of potentiometric enzyme electrodes under non-steady-state conditions has been developed. The model is based on a system of two coupled nonlinear time-dependent reaction-diffusion equations, in the Michaelis-Menten formalism, that describe the concentrations of substrate and product within the enzymatic layer. Analytical expressions for the concentrations of substrate and product and the corresponding flux response have been derived for all parameter values using the new homotopy perturbation method. Furthermore, the complex inversion formula is employed in this work to solve the boundary value problem. The analytical solutions obtained allow a full description of the response curves in terms of only two kinetic parameters (the unsaturation/saturation parameter and the reaction/diffusion parameter). Theoretical descriptions are given for the two limiting cases (zero- and first-order kinetics), and relatively simple approaches for the general case are presented. All the analytical results are compared with simulation results obtained with a Scilab/Matlab program. The numerical results agree with the appropriate theories.
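
    A minimal numerical cross-check of the kind performed with the Scilab/Matlab program, using illustrative parameter values rather than the paper's: explicit finite differences for the substrate equation ds/dt = Ds*d2s/dx2 - Vmax*s/(Km + s) in the enzyme layer, with zero flux at the electrode and a fixed bulk concentration.

        import numpy as np

        n, L = 101, 1.0e-4                  # grid points, enzyme layer thickness [m]
        dx = L / (n - 1)
        Ds, Vmax, Km = 1e-10, 1e-3, 1e-2    # assumed values: [m^2/s], [mol/(m^3 s)], [mol/m^3]
        dt = 0.4 * dx * dx / Ds             # within the explicit stability limit
        s = np.zeros(n)
        s[-1] = 1.0                         # bulk substrate concentration [mol/m^3]

        for _ in range(20_000):
            lap = (s[2:] - 2.0 * s[1:-1] + s[:-2]) / dx**2
            s[1:-1] += dt * (Ds * lap - Vmax * s[1:-1] / (Km + s[1:-1]))
            s[0] = s[1]                     # zero-flux condition at the electrode surface
            s[-1] = 1.0                     # Dirichlet condition at the bulk interface

        print("substrate concentration at the electrode:", s[0])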

  13. Modelling Analysis of Sewage Sludge Amended Soil

    DEFF Research Database (Denmark)

    Sørensen, P. B.; Carlsen, L.; Vikelsøe, J.

    The topic is risk assessment of sludge application to agricultural soil in relation to xenobiotics. A large variety of xenobiotics arrive at the wastewater treatment plant in the wastewater. Many of these compounds are hydrophobic and thus accumulate in the sludge solids and are removed from the plant effluent. The focus of this work is the top soil, as this layer is important for the fate of a xenobiotic substance due to its high biological activity. A simple model for the top soil is used in which the substance is assumed to be homogeneously distributed, as suggested in the European Union System for the Evaluation of Substances (EUSES). It is shown that the fraction of substance mass leached from the top soil is a simple function of the ratio between the degradation half-life and the adsorption coefficient. This model can be used in probabilistic risk assessment of agricultural soils...
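
    The abstract states the result but not the exact EUSES expression. One hedged way to see how a leached fraction can depend only on such a ratio is to let first-order leaching (whose rate falls as the adsorption coefficient rises) compete with first-order degradation for the same dissolved pool; the numbers below are purely illustrative.

        import numpy as np

        def leached_fraction(half_life_days, k_leach_per_day):
            """Fraction leached before degradation, assuming two competing first-order sinks."""
            k_deg = np.log(2.0) / half_life_days
            return k_leach_per_day / (k_leach_per_day + k_deg)

        # a persistent substance loses a much larger fraction to leaching:
        print(leached_fraction(300.0, 0.01))   # ~0.81
        print(leached_fraction(30.0, 0.01))    # ~0.30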

  14. QUADCOPTER BODY FRAME MODEL AND ANALYSIS

    OpenAIRE

    KUANTAMA Endrowednes; CRACIUN Dan; TARCA Radu

    2016-01-01

    Quadcopter frame modeling is useful for analyzing the reliability of the body frame parts and for determining the type of rotor and propeller needed to assure the necessary flight acceleration. Quadcopter flight stability is influenced by the resulting thrust, by the distance between the rotor propellers and by the frame rigidity; the frame has been designed to be as light as possible while maintaining the strength to carry the load. Solidworks software has been used to design and analy...

  15. Interest rate risk analysis with multifactor model

    OpenAIRE

    Campos, Natalia; Jareño Cebrián, Francisco; Tolentino García-Abadillo, Marta

    2016-01-01

    This study analyzes the influence of changes in 10-year nominal interest rates on US sector returns, distinguishing two periods, before and after the subprime crisis. We run the three-factor model of Fama and French, which incorporates as explanatory factors the nominal interest rate and the size and growth-opportunities factors. The sensitivity varies across US sectors and periods, but we find responses similar to those reported in the previous literature. Finally, the “size” ...

  16. Binaural Masking: An Analysis of Models

    Science.gov (United States)

    1991-11-21

    ... The introduction of masker energy in temporal intervals that did not overlap with the signal could be shown to either enhance or degrade detection. ... The model incorporates a double-sided exponential temporal integration window. Ph.D. dissertation title (Psychology, 1981): "Molecular psychophysics and models of auditory signal detectability." PACS numbers: 43.66.Pn, 43.66.Nm, 43.66.Dc, 43.66.Mk

  17. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.
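
    A toy version of the core step (not the paper's transfinite-element code): a discrete L2 (least-squares) projection of scattered terrain samples onto a single bilinear patch. In the procedure described above, regions where the residual remains large would then receive additional degrees of freedom.

        import numpy as np

        rng = np.random.default_rng(0)
        x, y = rng.uniform(0, 1, 500), rng.uniform(0, 1, 500)
        z = np.sin(3 * x) * np.cos(2 * y) + 0.01 * rng.normal(size=500)  # synthetic terrain

        A = np.column_stack([np.ones_like(x), x, y, x * y])   # basis 1, x, y, xy
        coef, *_ = np.linalg.lstsq(A, z, rcond=None)          # discrete L2 projection
        residual = z - A @ coef
        print("max abs residual: %.3f" % np.abs(residual).max())  # refine where this is large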

  18. Asymptotic analysis of an ion extraction model

    International Nuclear Information System (INIS)

    Ben Abdallah, N.; Mas-Gallic, S.; Raviart, P.A.

    1993-01-01

    A simple model for ion extraction from a plasma is analyzed. The order of magnitude of the plasma parameters leads to a singular perturbation problem for a semilinear elliptic equation. We first prove existence of solutions for the perturbed problem and uniqueness under certain conditions. Then we prove the convergence of these solutions, when the parameters go to zero, towards the solution of a Child-Langmuir problem

  19. PROJECT ACTIVITY ANALYSIS WITHOUT THE NETWORK MODEL

    Directory of Open Access Journals (Sweden)

    S. Munapo

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: This paper presents a new procedure for analysing and managing activity sequences in projects. The new procedure determines critical activities, the critical path, start times, free floats, crash limits, and other useful information without the use of the network model. Even though network models have been used successfully in project management so far, there are weaknesses associated with their use. A network is not easy to generate, and the dummies usually associated with it make the network diagram complex – and dummy activities have no meaning in the original project management problem. The network model for projects can be avoided while still obtaining all the useful information required for project management. All that is required are the activities, their accurate durations, and their predecessors, as the sketch following the abstracts illustrates.

    AFRIKAANS SUMMARY: The research describes a new method for the analysis and management of the sequential activities of projects. The proposed method determines critical activities, the critical path, start times, float, crashing, and other quantities without the use of a network model. The method performs satisfactorily in practice and avoids the administrative burden of the traditional network models.
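
    A minimal sketch of the paper's premise, with hypothetical activities: given only the activities, their durations and their predecessors, a forward and a backward pass yield earliest starts, total floats and the critical activities without ever drawing a network diagram.

        import functools

        durations = {"A": 3, "B": 2, "C": 4, "D": 2}                 # hypothetical project
        preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

        @functools.lru_cache(maxsize=None)
        def earliest_start(a):                                       # forward pass
            return max((earliest_start(p) + durations[p] for p in preds[a]), default=0)

        project_end = max(earliest_start(a) + durations[a] for a in durations)

        @functools.lru_cache(maxsize=None)
        def latest_finish(a):                                        # backward pass
            succ = [s for s in durations if a in preds[s]]
            return min((latest_finish(s) - durations[s] for s in succ), default=project_end)

        for a in durations:
            total_float = latest_finish(a) - durations[a] - earliest_start(a)
            print(a, "ES =", earliest_start(a), "float =", total_float,
                  "critical" if total_float == 0 else "")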

  20. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    The majority of modern software and hardware systems are reactive systems, where input provided by the user (possibly another system) and the output of the system are exchanged continuously throughout the (possibly) indefinite execution of the system. Natural examples include control systems and mobile devices, which are constrained by the environment in which they are embedded. This thesis studies the semantics and properties of a model-based framework for reactive systems, in which models and specifications are assumed to contain quantifiable information, such as references to time or energy. Our goal is to develop a theory of approximation, in terms of a new mathematical basis for systems modeling which can encompass behavioural properties as well as environmental constraints. Continuous performance and robustness measures are paramount when dealing with physical resource levels such as clock frequency, energy consumption, latency, mean-time to failure, and cost. For systems integrated in mass-market products, the ability to quantify trade-offs between performance and robustness, under given technical and economic constraints, is of strategic importance.

  1. Comparative analysis of design models for concrete corbels

    OpenAIRE

    Araújo, D. L.; Silva Neto, A. P.; Lobo, F. A.; Debs, M. K. El

    2016-01-01

    ABSTRACT The main objective of this paper is to perform a comparative analysis of some design models for precast concrete corbels. Design models from the Brazilian code (NBR 9062) and the European code (EUROCODE 2), as well as a US design handbook (PCI), were analyzed. Moreover, three analytical models presented in the literature are analyzed. The objective of this comparison is to identify the design models that best represent the failure load of concrete corbels governed by tie yielding or by concrete crus...

  2. Data Analysis A Model Comparison Approach, Second Edition

    CERN Document Server

    Judd, Charles M; Ryan, Carey S

    2008-01-01

    This completely rewritten classic text features many new examples, insights and topics including mediational, categorical, and multilevel models. Substantially reorganized, this edition provides a briefer, more streamlined examination of data analysis. Noted for its model-comparison approach and unified framework based on the general linear model, the book provides readers with a greater understanding of a variety of statistical procedures. This consistent framework, including consistent vocabulary and notation, is used throughout to develop fewer but more powerful model building techniques...
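
    A small illustration of the book's model-comparison approach on invented data: an augmented linear model is tested against a compact one by an F-test on the reduction in the sum of squared errors.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n = 100
        x1, x2 = rng.normal(size=n), rng.normal(size=n)
        y = 1.0 + 2.0 * x1 + rng.normal(size=n)            # x2 is truly irrelevant here

        def sse(X):
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return ((y - X @ beta) ** 2).sum()

        compact = np.column_stack([np.ones(n), x1])        # Model C: intercept + x1
        augmented = np.column_stack([np.ones(n), x1, x2])  # Model A: adds x2
        f = (sse(compact) - sse(augmented)) / (sse(augmented) / (n - 3))
        p = stats.f.sf(f, 1, n - 3)
        print("F(1, %d) = %.2f, p = %.3f" % (n - 3, f, p)) # large p: keep the compact model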

  3. Human Modeling For Ground Processing Human Factors Engineering Analysis

    Science.gov (United States)

    Tran, Donald; Stambolian, Damon; Henderson, Gena; Barth, Tim

    2011-01-01

    There have been many advancements and accomplishments over the last few years in the use of human modeling for human factors engineering analysis in the design of spacecraft and launch vehicles. The key methods used for this are motion capture and computer-generated human models. The focus of this paper is to explain the different types of human modeling used currently and in the past at Kennedy Space Center (KSC), and to explain the future plans for human modeling for future spacecraft designs.

  4. Analysis of Jingdong Mall Logistics Distribution Model

    Science.gov (United States)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in our country has accelerated. The role of logistics has been highlighted, and more and more electronic commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performs a SWOT analysis of the current situation of its self-built logistics system, identifies the problems existing in the current Jingdong Mall logistics distribution, and gives appropriate recommendations.

  5. Models and analysis for distributed systems

    CERN Document Server

    Haddad, Serge; Pautet, Laurent; Petrucci, Laure

    2013-01-01

    Nowadays, distributed systems are increasingly present, in public software applications as well as critical systems. This title and Distributed Systems: Design and Algorithms - from the same editors - introduce the underlying concepts, the associated design techniques and the related security issues. The objective of this book is to describe the state of the art of the formal methods for the analysis of distributed systems. Numerous issues remain open and are the topics of major research projects. One current research trend consists of pro...

  6. Modeling and analysis of offshore jacket platform

    Digital Repository Service at National Institute of Oceanography (India)

    Mohan, P.; Sidhaarth, K.R.A.; SanilKumar, V.

    ... water depth in this basin is analysed, to hold the operational, self-weight, deformation and environmental loads acting simultaneously on the structure. The wave force on slender tubular members is calculated by the addition of the effective drag... loading in the software analysis. Fig 4: Moment (kNm) about the X, Y, Z axes in the jacket. Table 1: Comparison of dead/live load results: jacket weight 22186.8 tonnes (manual design) vs. 18000 tonnes (project design); buoyancy effect 3759.88 tonnes (manual design) vs. 1573... (project design).
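
    The drag-plus-inertia sum referred to above is Morison's equation; the following hedged sketch evaluates the in-line force per unit length on a single slender member, with illustrative values rather than the project's.

        import numpy as np

        rho, D = 1025.0, 1.2        # sea water density [kg/m^3], member diameter [m]
        Cd, Cm = 1.05, 2.0          # drag and inertia coefficients (typical assumptions)
        H, T = 6.0, 10.0            # wave height [m] and period [s], deep-water Airy wave
        omega = 2.0 * np.pi / T
        t = np.linspace(0.0, T, 200)
        u = 0.5 * H * omega * np.cos(omega * t)          # particle velocity near the surface
        du = -0.5 * H * omega**2 * np.sin(omega * t)     # particle acceleration

        # Morison: f = 0.5*rho*Cd*D*u*|u| + rho*Cm*(pi*D^2/4)*du/dt
        f = 0.5 * rho * Cd * D * u * np.abs(u) + rho * Cm * (np.pi * D**2 / 4.0) * du
        print("peak in-line force per unit length: %.0f N/m" % np.abs(f).max())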

  7. Modeling and Analysis of Clandestine Networks

    Science.gov (United States)

    2005-12-15

    ... represents the combined flow of "information" from node i to node j through all possible paths joining i and j. If the ... definition can be modified to model influence by defining Qij as the influence of node i, in clique m, over a node j in clique n. The node-clique ... defining Qb as the influence of node i, not in clique c, over a node j in c. The node-clique formulation then becomes ...

  8. Extrudate Expansion Modelling through Dimensional Analysis Method

    DEFF Research Database (Denmark)

    ... to describe the extrudate expansion. From the three dimensionless groups, an equation with three experimentally determined parameters is derived to express the extrudate expansion. The model is evaluated with whole wheat flour and aquatic feed extrusion experimental data. The average deviations of the correlation are respectively 5.9% and 9% for the whole wheat flour and the aquatic feed extrusion. An alternative 4-coefficient equation is also suggested from the 3 dimensionless groups. The average deviations of the alternative equation are respectively 5.8% and 2.5% in correlation with the same set...

  9. Bayesian analysis of a correlated binomial model

    OpenAIRE

    Diniz, Carlos A. R.; Tutia, Marcelo H.; Leite, Jose G.

    2010-01-01

    In this paper a Bayesian approach is applied to the correlated binomial model, CB(n, p, ρ), proposed by Luceño (Comput. Statist. Data Anal. 20 (1995) 511–520). The data augmentation scheme is used in order to overcome the complexity of the mixture likelihood. MCMC methods, including Gibbs sampling and Metropolis within Gibbs, are applied to estimate the posterior marginal for the probability of success p and for the correlation coefficient ρ. The sensitivity of the posterior is studied taking...

  10. Compartmentalization analysis using discrete fracture network models

    Energy Technology Data Exchange (ETDEWEB)

    La Pointe, P.R.; Eiben, T.; Dershowitz, W. [Golder Associates, Redmond, VA (United States); Wadleigh, E. [Marathon Oil Co., Midland, TX (United States)

    1997-08-01

    This paper illustrates how Discrete Fracture Network (DFN) technology can serve as a basis for the calculation of reservoir engineering parameters for the development of fractured reservoirs. It describes the development of quantitative techniques for defining the geometry and volume of structurally controlled compartments. These techniques are based on a combination of stochastic geometry, computational geometry, and graph theory. The parameters addressed are compartment size, matrix block size and tributary drainage volume. The concept of DFN models is explained and methodologies to compute these parameters are demonstrated.
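
    A toy version of the compartment-size calculation: if matrix blocks are graph nodes and sealing structures remove edges, compartments are the connected components (plain BFS on an invented grid with an invented sealing fault).

        from collections import deque

        n = 6                                              # 6x6 grid of matrix blocks
        sealed = {((i, 2), (i, 3)) for i in range(n)}      # sealing fault between columns 2 and 3

        def neighbors(c):
            i, j = c
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nb = (i + di, j + dj)
                if 0 <= nb[0] < n and 0 <= nb[1] < n:
                    if (c, nb) not in sealed and (nb, c) not in sealed:
                        yield nb

        seen, sizes = set(), []
        for start in ((i, j) for i in range(n) for j in range(n)):
            if start in seen:
                continue
            queue, size = deque([start]), 0
            seen.add(start)
            while queue:                                   # BFS over one compartment
                c = queue.popleft()
                size += 1
                for nb in neighbors(c):
                    if nb not in seen:
                        seen.add(nb)
                        queue.append(nb)
            sizes.append(size)
        print("compartment sizes:", sizes)                 # two compartments of 18 blocks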

  11. Aspects of uncertainty analysis in accident consequence modeling

    International Nuclear Information System (INIS)

    Travis, C.C.; Hoffman, F.O.

    1981-01-01

    Mathematical models are frequently used to determine the probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact on humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability through mathematical models; and comparison of model predictions with observational data.
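
    The paper's dose models are not reproduced in the abstract, so the following only illustrates the propagation step on a generic multiplicative dose expression with assumed lognormal parameter distributions.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        intake = rng.lognormal(np.log(2.0), 0.5, n)      # Bq/d, assumed distribution
        transfer = rng.lognormal(np.log(0.01), 0.7, n)   # d/L, assumed distribution
        dose_coef = rng.lognormal(np.log(1e-8), 0.3, n)  # Sv/Bq, assumed distribution

        dose = intake * transfer * dose_coef * 365.0     # Sv per year of intake
        lo, hi = np.percentile(dose, [2.5, 97.5])
        print("median %.2e Sv, 95%% interval (%.2e, %.2e)" % (np.median(dose), lo, hi))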

  12. A Conceptual Model for Multidimensional Analysis of Documents

    Science.gov (United States)

    Ravat, Franck; Teste, Olivier; Tournier, Ronan; Zurlfluh, Gilles

    Data warehousing and OLAP are mainly used for the analysis of transactional data. Nowadays, with the evolution of the Internet and the development of semi-structured data exchange formats (such as XML), it is possible to consider entire fragments of data, such as documents, as analysis sources. As a consequence, an adapted multidimensional analysis framework needs to be provided. In this paper, we introduce an OLAP multidimensional conceptual model without facts. This model is based on the unique concept of dimensions and is adapted for multidimensional document analysis. We also provide a set of manipulation operations.

  13. New rheological model for concrete structural analysis

    International Nuclear Information System (INIS)

    Chern, J.C.

    1984-01-01

    Long-time deformation is of interest in estimating stresses of the prestressed concrete reactor vessel, in predicting cracking due to shrinkage or thermal dilatation, and in the design of leak-tight structures. Many interacting influences exist among creep, shrinkage and cracking of concrete. An interaction researchers have long observed is that, under simultaneous drying and loading, the deformation of a concrete structure under the combined effect is larger than the sum of the shrinkage deformation of the structure at no load and the deformation of the sealed structure. The excess deformation, i.e. the difference between observed test data and conventional analysis, is known as the Pickett effect. A constitutive relation explaining the Pickett effect and other similar superposition problems, covering creep, shrinkage (or thermal dilatation), cracking and aging, was developed together with an efficient time-step numerical algorithm. The total deformation in the analysis is the sum of the strains due to elastic deformation and creep, cracking, and shrinkage with thermal dilatation. Instead of a sudden stress reduction to zero after the attainment of the strength limit, the gradual strain-softening of concrete (a gradual decline of stress at increasing strain) is considered.

  14. Online Statistical Modeling (Regression Analysis) for Independent Responses

    Science.gov (United States)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to model various types of complex relationships in data. A rich variety of advanced and recent statistical modelling tools is available in open source software (one of them being R). However, these advanced statistical modelling tools are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny), so that the most recent and advanced statistical modelling tools are readily available, accessible and applicable on the web. We have previously made interfaces in the form of e-tutorials for several modern and advanced statistical modelling tools in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov Chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes the statistical modelling easier to apply and easier to compare in order to find the most appropriate model for the data.

  15. Model based analysis of piezoelectric transformers.

    Science.gov (United States)

    Hemsel, T; Priya, S

    2006-12-22

    Piezoelectric transformers are increasingly popular in electrical devices owing to several advantages such as small size, high efficiency, absence of electromagnetic noise and non-flammability. In addition to conventional applications such as the ballast for the backlight inverter in notebook computers, camera flash, and fuel ignition, several new applications have emerged, such as AC/DC converters, battery chargers and automobile lighting. These new applications demand high power density and a wide range of voltage gain. Currently, transformer power density is limited to 40 W/cm(3), obtained at low voltage gain. The purpose of this study was to investigate a transformer design that has the potential of providing higher power density and a wider range of voltage gain. The new transformer design utilizes the radial mode at both the input and output ports and has unidirectional polarization in the ceramics. This design was found to provide 30 W of power with an efficiency of 98% and a 30 degrees C temperature rise above room temperature. An electro-mechanical equivalent circuit model was developed to describe the characteristics of the piezoelectric transformer. The model was found to successfully predict the characteristics of the transformer, with excellent matching between the computed and experimental results. The results of this study allow the deterministic design of unipoled piezoelectric transformers with specified performance. It is expected that in the near future the unipoled transformer will gain significant importance in various electrical components.

  16. The Spectrophotometric Analysis and Modeling of Sunscreens

    Science.gov (United States)

    Walters, Christina; Keeney, Allen; Wigal, Carl T.; Johnston, Cynthia R.; Cornelius, Richard D.

    1997-01-01

    Sunscreens and their SPF (Sun Protection Factor) values are the focus of this experiment, which includes spectrophotometric measurements and molecular modeling. Students suspend weighed amounts of sunscreen lotions graded SPF 4, 6, 8, 15, 30, and 45 in water and dissolve aliquots of the aqueous suspensions in propanol. The expected relationship of absorbance proportional to log10(SPF) applies at 312 nm, where a maximum in absorbance occurs for the sunscreen solutions. Measurements at 330 nm give similar results and are more accessible using spectrometers routinely available in the introductory laboratory. Sunscreens constitute a suitable class of compounds for modeling electronic spectra, and computer modeling of the active ingredients ethylhexyl para-methoxycinnamate, oxybenzone, 2-ethylhexyl salicylate, and octocrylene, found in commercially available formulations, typically predicts the absorption maxima within 10 nm. This experiment lets students explore which compounds have the potential to function as sunscreen agents and thereby see the importance of a knowledge of chemistry to the formulation of household items.
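
    A quick numerical check of the stated relationship, with invented absorbance values (the article's data are not reproduced here):

        import numpy as np

        spf = np.array([4, 6, 8, 15, 30, 45])
        absorbance = np.array([0.30, 0.39, 0.45, 0.59, 0.74, 0.83])  # hypothetical A at 312 nm

        slope, intercept = np.polyfit(np.log10(spf), absorbance, 1)  # A ~ log10(SPF)
        pred = slope * np.log10(spf) + intercept
        r2 = 1.0 - ((absorbance - pred) ** 2).sum() / ((absorbance - absorbance.mean()) ** 2).sum()
        print("A = %.2f log10(SPF) + %.2f, R^2 = %.3f" % (slope, intercept, r2))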

  17. Digital telephony analysis model and issues

    Science.gov (United States)

    Keuthan, Lynn M.

    1995-09-01

    Experts in the fields of digital telephony and communications security have stated the need for an analytical tool for evaluating complex issues. Some important policy issues discussed by experts recently include implementing digital wire-taps, implementation of the 'Clipper Chip', required registration of encryption/decryption keys, and export control of cryptographic equipment. Associated with the implementation of these policies are direct costs resulting from implementation, indirect cost benefits from implementation, and indirect costs resulting from the risks of implementation or factors reducing cost benefits. Presented herein is a model for analyzing digital telephony policies and systems and their associated direct costs and indirect benefit and risk factors. In order to present the structure of the model, issues of national importance and business-related issues are discussed. The various factors impacting the implementation of the associated communications systems and communications security are summarized, and various implementation tradeoffs are compared based on economic benefits/impact. The importance of the issues addressed herein, as well as other digital telephony issues, has greatly increased with the enormous increases in communication system connectivity due to the advance of the National Information Infrastructure.

  18. Analysis of a classical chiral bag model

    International Nuclear Information System (INIS)

    Nadeau, H.

    1985-01-01

    The author studies a classical chiral bag model with a Mexican-hat-type potential for the self-coupling of the pion fields. He assumes a static spherical bag of radius R, the hedgehog ansatz for the chiral fields, and that the quarks are all in the lowest-lying s state. The author considers three classes of models: the cloudy or pantopionic bags, the little or exopionic bags, and the endopionic bags, where the pions are allowed through all of space, only outside the bag, and only inside the bag, respectively. In all cases, the quarks are confined to the interior. He calculates the bag radius R, the bag constant B and the total ground state energy E for wide ranges of the two free parameters of the theory, namely the coupling constant λ and the quark frequency omega. The author focuses the study on the endopionic bags, the least known class, and compares the results with the familiar ones of the other classes.

  19. QuantUM: Quantitative Safety Analysis of UML Models

    Directory of Open Access Journals (Sweden)

    Florian Leitner-Fischer

    2011-07-01

    Full Text Available When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods it is still difficult for software and system architects to integrate these techniques into their every day work. This is mainly due to the lack of methods that can be directly applied to architecture level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial strength case study.

  20. Regression Model Optimization for the Analysis of Experimental Data

    Science.gov (United States)

    Ulbrich, N.

    2009-01-01

    A candidate math model search algorithm was developed at Ames Research Center that determines a recommended math model for the multivariate regression analysis of experimental data. The search algorithm is applicable to classical regression analysis problems as well as wind tunnel strain gage balance calibration analysis applications. The algorithm compares the predictive capability of different regression models using the standard deviation of the PRESS residuals of the responses as a search metric. This search metric is minimized during the search. Singular value decomposition is used during the search to reject math models that lead to a singular solution of the regression analysis problem. Two threshold dependent constraints are also applied. The first constraint rejects math models with insignificant terms. The second constraint rejects math models with near-linear dependencies between terms. The math term hierarchy rule may also be applied as an optional constraint during or after the candidate math model search. The final term selection of the recommended math model depends on the regressor and response values of the data set, the user's function class combination choice, the user's constraint selections, and the result of the search metric minimization. A frequently used regression analysis example from the literature is used to illustrate the application of the search algorithm to experimental data.
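
    A compact sketch of the search metric: PRESS residuals are leave-one-out residuals, and for a linear model they can be computed without n refits by inflating each ordinary residual by its leverage. Candidate models would then be ranked by the standard deviation of these residuals (the data below are synthetic).

        import numpy as np

        def press_std(X, y):
            """Standard deviation of PRESS residuals for the linear model y ~ X."""
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            h = np.einsum("ij,ji->i", X, np.linalg.pinv(X))   # leverages (hat matrix diagonal)
            e = (y - X @ beta) / (1.0 - h)                    # leave-one-out residuals
            return e.std(ddof=1)

        rng = np.random.default_rng(0)
        x = rng.uniform(-1.0, 1.0, 50)
        y = 1.0 + 2.0 * x + 0.5 * x**2 + 0.05 * rng.normal(size=50)
        for degree in (1, 2, 3):                              # candidate math models
            print("degree %d: PRESS std %.4f" % (degree, press_std(np.vander(x, degree + 1), y)))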

  1. Sensitivity of SBLOCA analysis to model nodalization

    International Nuclear Information System (INIS)

    Lee, C.; Ito, T.; Abramson, P.B.

    1983-01-01

    The recent Semiscale test S-UT-8 indicates the possibility for primary liquid to hang up in the steam generators during a SBLOCA, permitting core uncovery prior to loop-seal clearance. In the analysis of Small Break Loss of Coolant Accidents with RELAP5, it is found that the resultant transient behavior is quite sensitive to the selection of the nodalization for the steam generators. Although global parameters such as integrated mass loss, primary inventory and primary pressure are relatively insensitive to the nodalization, it is found that the predicted distribution of inventory around the primary is significantly affected by nodalization. More detailed nodalization predicts that more of the inventory tends to remain in the steam generators, resulting in less inventory in the reactor vessel and therefore causing earlier and more severe core uncovery.

  2. A global sensitivity analysis approach for morphogenesis models

    KAUST Repository

    Boas, Sonja E. M.

    2015-11-21

    Background Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such ‘black-box’ models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. Results To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. Conclusions We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all ‘black-box’ models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.

  3. A global sensitivity analysis approach for morphogenesis models.

    Science.gov (United States)

    Boas, Sonja E M; Navarro Jimenez, Maria I; Merks, Roeland M H; Blom, Joke G

    2015-11-21

    Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such 'black-box' models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all 'black-box' models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
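
    A brute-force illustration of the variance-based indices such a workflow estimates, using a cheap toy function rather than a cellular Potts model (whose runs are far too expensive for this naive estimator): the first-order index of input i is Var(E[Y|X_i])/Var(Y), estimated here by quantile binning.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 200_000
        X = rng.uniform(0.0, 1.0, size=(n, 3))
        Y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * X[:, 0] * X[:, 2]   # toy model output

        var_y = Y.var()
        for i in range(3):
            bins = np.quantile(X[:, i], np.linspace(0.0, 1.0, 51))   # 50 equal-mass bins
            idx = np.clip(np.digitize(X[:, i], bins) - 1, 0, 49)
            cond_mean = np.array([Y[idx == b].mean() for b in range(50)])
            print("S_%d ~ %.2f" % (i, cond_mean.var() / var_y))      # ~0.78, ~0.19, ~0.00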

  4. Modeling and analysis with induction generators

    CERN Document Server

    Simões, M Godoy

    2014-01-01

    Foreword. Preface. Acknowledgments. Authors. Principles of Alternative Sources of Energy and Electric Generation: Scope of This Chapter; Legal Definitions; Principles of Electrical Conversion; Basic Definitions of Electrical Power; Characteristics of Primary Sources; Characteristics of Remote Industrial, Commercial, and Residential Sites and Rural Energy; Selection of the Electric Generator; Interfacing Primary Source, Generator, and Load; Example of a Simple Integrated Generating and Energy-Storing System; Solved Problems; Suggested Problems; References. Steady-State Model of Induction Generators: Scope of This Chapter; Interconnection and Disconnection of the Electric Distribution Network; Robustness of Induction Generators; Classical Steady-State Representation of the Asynchronous Machine; Generated Power; Induced Torque; Representation of Induction Generator Losses; Measurement of Induction Generator Parameters; Blocked Rotor Test (s = 1); No-Load Test (s = 0); Features of Induction Machines Working as Generators Interconnected to the Distribution Network; High-...

  5. Parametric analysis of fire model CFAST

    International Nuclear Information System (INIS)

    Lee, Y. H.; Yang, J. Y.; Kim, J. H.

    2004-01-01

    This paper describes a pump room fire in a nuclear power plant simulated using the CFAST fire modeling code developed by NIST. The input parameters of CFAST considered are whether the fire is constrained or unconstrained, the Lower Oxygen Limit (LOL), the Radiative Fraction (RF), and the times at which doors are opened. According to the results, the pump room fire is ventilation-controlled, so the default LOL value of 10% is adequate. The RF appears not to change the temperature of the upper gas layer, but the degree of opening of the penetration area and the times at which it is opened do affect the upper-layer temperature, so those results should be analyzed carefully.

  6. Materials Analysis and Modeling of Underfill Materials.

    Energy Technology Data Exchange (ETDEWEB)

    Wyatt, Nicholas B [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chambers, Robert S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The thermal-mechanical properties of three potential underfill candidate materials for PBGA applications are characterized and reported. Two of the materials are formulations developed at Sandia for underfill applications, while the third is a commercial product that utilizes a snap-cure chemistry to drastically reduce cure time. Viscoelastic models were calibrated and fit using the property data collected for one of the Sandia-formulated materials. Along with the thermal-mechanical analyses performed, a series of simple bi-material strip tests were conducted to comparatively analyze the relative effects of cure and thermal shrinkage among the materials under consideration. Finally, current knowledge gaps as well as questions arising from the present study are identified and a path forward is presented.

  7. Expatriates Selection: An Essay of Model Analysis

    Directory of Open Access Journals (Sweden)

    Rui Bártolo-Ribeiro

    2015-03-01

    Full Text Available Business expansion to geographical areas with cultures different from those in which organizations were created and developed leads to the expatriation of employees to these destinations. Recruitment and selection procedures for expatriates do not always have the intended success, leading to an early return of these professionals with consequent organizational disruption. In this study, several articles published in the last five years were analyzed in order to identify the dimensions most frequently mentioned in the selection of expatriates in terms of success and failure. The characteristics in the selection process that may improve prediction of the adaptation of expatriates to the new cultural contexts of the same organization were studied according to the KSAO model. Few references to the Knowledge, Skills and Abilities dimensions were found in the analyzed papers. There was a strong predominance of the evaluation of Other Characteristics, and more importance was given to dispositional factors than to situational factors in promoting the integration of expatriates.

  8. Analysis, Modeling, and Simulation (AMS) testbed initial screening report.

    Science.gov (United States)

    2013-11-01

    Analysis Modeling and Simulation (AMS) Testbeds can make significant contributions in identifying the benefits of more effective, more active systems management, resulting from integrating transformative applications enabled by new data from wireless...

  9. Space-Time Aquatic Resources Modeling and Analysis Program (STARMAP)

    Data.gov (United States)

    Federal Laboratory Consortium — Colorado State University has received funding from the U.S. Environmental Protection Agency (EPA) for its Space-Time Aquatic Resources Modeling and Analysis Program...

  10. Modeling, Analysis, Simulation, and Synthesis of Biomolecular Networks

    National Research Council Canada - National Science Library

    Ruben, Harvey; Kumar, Vijay; Sokolsky, Oleg

    2006-01-01

    ...) a first example of reachability analysis applied to a biomolecular system (lactose induction), 4) a model of tetracycline resistance that discriminates between two possible mechanisms for tetracycline diffusion through the cell membrane, and 5...

  11. Microscopic Analysis and Modeling of Airport Surface Sequencing, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Although a number of airportal surface models exist and have been successfully used for analysis of airportal operations, only recently has it become possible to...

  12. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  13. Error and Uncertainty Analysis for Ecological Modeling and Simulation

    National Research Council Canada - National Science Library

    Gertner, George

    1998-01-01

    The main objectives of this project are a) to develop a general methodology for conducting sensitivity and uncertainty analysis and building error budgets in simulation modeling over space and time; and b...

  14. Integration of Design and Control through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay

    2002-01-01

    A systematic computer aided analysis of the process model is proposed as a pre-solution step for the integration of design and control problems. The process model equations are classified in terms of balance equations, constitutive equations and conditional equations. Analysis of the phenomena models representing the constitutive equations identifies the relationships between the important process and design variables, which helps to understand, define and address some of the issues related to the integration of design and control. Furthermore, the analysis is able to identify a set of process (control) variables and design (manipulative) variables that may be employed with different objectives in design and control for the integrated problem. The computer aided model analysis is highlighted through illustrative examples involving processes with mass and/or energy recycle, where the important design and control...

  15. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    Full Text Available As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  16. PSAMM: A Portable System for the Analysis of Metabolic Models.

    Directory of Open Access Journals (Sweden)

    Jon Lund Steffensen

    2016-02-01

    Full Text Available The genome-scale models of metabolic networks have been broadly applied in phenotype prediction, evolutionary reconstruction, community functional analysis, and metabolic engineering. Despite the development of tools that support individual steps along the modeling procedure, it is still difficult to associate mathematical simulation results with the annotation and biological interpretation of metabolic models. In order to solve this problem, here we developed a Portable System for the Analysis of Metabolic Models (PSAMM, a new open-source software package that supports the integration of heterogeneous metadata in model annotations and provides a user-friendly interface for the analysis of metabolic models. PSAMM is independent of paid software environments like MATLAB, and all its dependencies are freely available for academic users. Compared to existing tools, PSAMM significantly reduced the running time of constraint-based analysis and enabled flexible settings of simulation parameters using simple one-line commands. The integration of heterogeneous, model-specific annotation information in PSAMM is achieved with a novel format of YAML-based model representation, which has several advantages, such as providing a modular organization of model components and simulation settings, enabling model version tracking, and permitting the integration of multiple simulation problems. PSAMM also includes a number of quality checking procedures to examine stoichiometric balance and to identify blocked reactions. Applying PSAMM to 57 models collected from current literature, we demonstrated how the software can be used for managing and simulating metabolic models. We identified a number of common inconsistencies in existing models and constructed an updated model repository to document the resolution of these inconsistencies.

  17. Reliability of four models for clinical gait analysis.

    Science.gov (United States)

    Kainz, Hans; Graham, David; Edwards, Julie; Walsh, Henry P J; Maine, Sheanna; Boyd, Roslyn N; Lloyd, David G; Modenese, Luca; Carty, Christopher P

    2017-05-01

    Three-dimensional gait analysis (3DGA) has become a common clinical tool for treatment planning in children with cerebral palsy (CP). Many clinical gait laboratories use the conventional gait analysis model (e.g. the Plug-in-Gait model), which uses Direct Kinematics (DK) for joint kinematic calculations, whereas musculoskeletal models, mainly used for research, use Inverse Kinematics (IK). Musculoskeletal IK models have the advantage of enabling additional analyses which might improve clinical decision-making in children with CP. Before any new model can be used in a clinical setting, its reliability has to be evaluated and compared to a commonly used clinical gait model (e.g. the Plug-in-Gait model), which was the purpose of this study. Two testers performed 3DGA in eleven CP and seven typically developing participants on two occasions. Intra- and inter-tester standard deviations (SD) and standard errors of measurement (SEM) were used to compare the reliability of two DK models (Plug-in-Gait and a six degrees-of-freedom model solved using Vicon software) and two IK models (two modifications of 'gait2392' solved using OpenSim). All models showed good reliability (mean SEM of 3.0° over all analysed models and joint angles). Variations in joint kinetics were smaller in typically developing than in CP participants. The modified 'gait2392' model, which included all the joint rotations commonly reported in clinical 3DGA, showed reasonably reliable joint kinematic and kinetic estimates, allows additional musculoskeletal analysis of surgically adjustable parameters, e.g. muscle-tendon lengths, and is therefore a suitable model for clinical gait analysis. Copyright © 2017. Published by Elsevier B.V.

  18. Empirical bayes model comparisons for differential methylation analysis.

    Science.gov (United States)

    Teng, Mingxiang; Wang, Yadong; Kim, Seongho; Li, Lang; Shen, Changyu; Wang, Guohua; Liu, Yunlong; Huang, Tim H M; Nephew, Kenneth P; Balch, Curt

    2012-01-01

    A number of empirical Bayes models (each with different statistical distribution assumptions) have now been developed to analyze differential DNA methylation using high-density oligonucleotide tiling arrays. However, it remains unclear which model performs best. For example, for analysis of differentially methylated regions for conservative and functional sequence characteristics (e.g., enrichment of transcription factor-binding sites (TFBSs)), the sensitivity of such analyses, using various empirical Bayes models, remains unclear. In this paper, five empirical Bayes models were constructed, based on either a gamma distribution or a log-normal distribution, for the identification of differential methylated loci and their cell division-(1, 3, and 5) and drug-treatment-(cisplatin) dependent methylation patterns. While differential methylation patterns generated by log-normal models were enriched with numerous TFBSs, we observed almost no TFBS-enriched sequences using gamma assumption models. Statistical and biological results suggest log-normal, rather than gamma, empirical Bayes model distribution to be a highly accurate and precise method for differential methylation microarray analysis. In addition, we presented one of the log-normal models for differential methylation analysis and tested its reproducibility by simulation study. We believe this research to be the first extensive comparison of statistical modeling for the analysis of differential DNA methylation, an important biological phenomenon that precisely regulates gene transcription.

  19. Empirical Bayes Model Comparisons for Differential Methylation Analysis

    Directory of Open Access Journals (Sweden)

    Mingxiang Teng

    2012-01-01

    Full Text Available A number of empirical Bayes models (each with different statistical distribution assumptions) have now been developed to analyze differential DNA methylation using high-density oligonucleotide tiling arrays. However, it remains unclear which model performs best. For example, for analysis of differentially methylated regions for conservative and functional sequence characteristics (e.g., enrichment of transcription factor-binding sites (TFBSs)), the sensitivity of such analyses, using various empirical Bayes models, remains unclear. In this paper, five empirical Bayes models were constructed, based on either a gamma distribution or a log-normal distribution, for the identification of differential methylated loci and their cell division- (1, 3, and 5) and drug-treatment- (cisplatin) dependent methylation patterns. While differential methylation patterns generated by log-normal models were enriched with numerous TFBSs, we observed almost no TFBS-enriched sequences using gamma assumption models. Statistical and biological results suggest the log-normal, rather than gamma, empirical Bayes model distribution to be a highly accurate and precise method for differential methylation microarray analysis. In addition, we presented one of the log-normal models for differential methylation analysis and tested its reproducibility by simulation study. We believe this research to be the first extensive comparison of statistical modeling for the analysis of differential DNA methylation, an important biological phenomenon that precisely regulates gene transcription.

  20. How Many Separable Sources? Model Selection In Independent Components Analysis

    DEFF Research Database (Denmark)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though...

  1. Sensitivity and uncertainty analysis of the PATHWAY radionuclide transport model

    International Nuclear Information System (INIS)

    Otis, M.D.

    1983-01-01

    Procedures were developed for the uncertainty and sensitivity analysis of a dynamic model of radionuclide transport through human food chains. Uncertainty in model predictions was estimated by propagation of parameter uncertainties using a Monte Carlo simulation technique. Sensitivity of model predictions to individual parameters was investigated using the partial correlation coefficient of each parameter with model output. Random values produced for the uncertainty analysis were used in the correlation analysis for sensitivity. These procedures were applied to the PATHWAY model which predicts concentrations of radionuclides in foods grown in Nevada and Utah and exposed to fallout during the period of atmospheric nuclear weapons testing in Nevada. Concentrations and time-integrated concentrations of iodine-131, cesium-136, and cesium-137 in milk and other foods were investigated. 9 figs., 13 tabs
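
    A sketch of the sensitivity procedure described above, applied to a stand-in function rather than PATHWAY itself: the partial correlation of a parameter with the model output is the correlation of the two residual series obtained after regressing the other parameters out of both.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 5_000
        names = ["uptake", "feed_rate", "transfer_coef"]      # hypothetical parameters
        P = rng.lognormal(sigma=0.4, size=(n, 3))             # Monte Carlo parameter sample
        out = P[:, 0] * P[:, 1] ** 0.5 + 0.1 * P[:, 2]        # stand-in for the model output

        def residual(v, others):
            A = np.column_stack([np.ones(len(v)), others])
            beta, *_ = np.linalg.lstsq(A, v, rcond=None)
            return v - A @ beta

        for i, name in enumerate(names):
            others = np.delete(P, i, axis=1)
            r = np.corrcoef(residual(P[:, i], others), residual(out, others))[0, 1]
            print("partial correlation with output, %s: %+.2f" % (name, r))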

  2. A Succinct Approach to Static Analysis and Model Checking

    DEFF Research Database (Denmark)

    Filipiuk, Piotr

    In a number of areas software correctness is crucial, and it is therefore often desirable to formally verify the presence of various properties or the absence of errors. This thesis presents a framework for concisely expressing static analysis and model checking problems. The framework facilitates... in the classical formulation of ALFP logic. Finally, we show that the logics and the associated solvers can be used for rapid prototyping. We illustrate that by a variety of case studies from static analysis and model checking.

  3. Integrated dynamic modeling and management system mission analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, A.K.

    1994-12-28

    This document summarizes the mission analysis performed on the Integrated Dynamic Modeling and Management System (IDMMS). The IDMMS will be developed to provide the modeling and analysis capability required to understand the TWRS system behavior in terms of the identified TWRS performance measures. The IDMMS will be used to demonstrate in a verified and validated manner the satisfactory performance of the TWRS system configuration and assurance that the requirements have been satisfied.

  4. Coping with Complexity Model Reduction and Data Analysis

    CERN Document Server

    Gorban, Alexander N

    2011-01-01

    This volume contains the extended version of selected talks given at the international research workshop 'Coping with Complexity: Model Reduction and Data Analysis', Ambleside, UK, August 31 - September 4, 2009. This book is deliberately broad in scope and aims at promoting new ideas and methodological perspectives. The topics of the chapters range from theoretical analysis of complex and multiscale mathematical models to applications in e.g., fluid dynamics and chemical kinetics.

  5. Integrated dynamic modeling and management system mission analysis

    International Nuclear Information System (INIS)

    Lee, A.K.

    1994-01-01

    This document summarizes the mission analysis performed on the Integrated Dynamic Modeling and Management System (IDMMS). The IDMMS will be developed to provide the modeling and analysis capability required to understand the TWRS system behavior in terms of the identified TWRS performance measures. The IDMMS will be used to demonstrate in a verified and validated manner the satisfactory performance of the TWRS system configuration and assurance that the requirements have been satisfied

  6. A Practical Ontology Framework for Static Model Analysis

    Science.gov (United States)

    2011-04-26

    throughout the model. We implement our analysis framework on top of Ptolemy II [3], an extensible open source model-based design tool written in Java... While Ptolemy II makes a good testbed for implementing and experimenting with new analyses, we also feel that the techniques we present here are broadly useful. For this reason, we aim to make our analysis framework orthogonal to the execution semantics of Ptolemy II, allowing it to be ...

  7. Building Information Modeling (BIM) for Indoor Environmental Performance Analysis

    DEFF Research Database (Denmark)

    The report is part of a research assignment carried out by students in the 5 ECTS course “Project Byggeri [Building Information Modeling (BIM) – Modeling & Analysis]” during the 3rd semester of the master degree in Civil and Architectural Engineering, Department of Engineering, Aarhus University. It includes seven papers describing BIM for sustainability, concentrating specifically on individual topics regarding Indoor Environment Performance Analysis.

  8. IMAGE ANALYSIS FOR MODELLING SHEAR BEHAVIOUR

    Directory of Open Access Journals (Sweden)

    Philippe Lopez

    2011-05-01

    Through laboratory research performed over the past ten years, many of the critical links between fracture characteristics and hydromechanical and mechanical behaviour have been made for individual fractures. One of the remaining challenges at the laboratory scale is to directly link fracture morphology to shear behaviour under changes in stress and shear direction. A series of laboratory experiments were performed on cement mortar replicas of a granite sample with a natural fracture perpendicular to the axis of the core. Results show that there is a strong relationship between the fracture's geometry and its mechanical behaviour under shear stress and the resulting damage. Image analysis, geostatistical, stereological and directional data techniques are applied in combination to the experimental data. The results highlight the role of the geometric characteristics of the fracture surfaces (surface roughness; size, shape, locations and orientations of the asperities to be damaged) in shear behaviour. A notable improvement in the understanding of shear is that shear behaviour is controlled by the apparent dip, in the shear direction, of the elementary facets forming the fracture.
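    The closing observation rests on a standard geometric relation; assuming the usual apparent-dip formula tan(d_app) = tan(d_true) * cos(alpha), where alpha is the angle between the shear direction and a facet's dip direction, a small worked example follows (values illustrative):

      # Worked example of the apparent-dip relation underlying the conclusion
      # above. Values are illustrative.
      import math

      def apparent_dip_deg(true_dip_deg, alpha_deg):
          return math.degrees(math.atan(
              math.tan(math.radians(true_dip_deg))
              * math.cos(math.radians(alpha_deg))))

      for alpha in (0, 30, 60, 90):
          print(f"alpha={alpha:2d} deg -> apparent dip "
                f"{apparent_dip_deg(20.0, alpha):5.2f} deg")
      # A facet's steepness as 'felt' by the shear movement drops to zero
      # when shearing parallel to the facet's strike (alpha = 90 deg).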

  9. Model analysis of fomite mediated influenza transmission.

    Directory of Open Access Journals (Sweden)

    Jijun Zhao

    Fomites involved in influenza transmission are either hand- or droplet-contaminated. We evaluated the interactions of fomite characteristics and human behaviors affecting these routes using an Environmental Infection Transmission System (EITS) model by comparing the basic reproduction numbers (R(0)) for different fomite mediated transmission pathways. Fomites classified as large versus small surface sizes (reflecting high versus low droplet contamination levels) and high versus low touching frequency have important differences. For example, 1) the highly touched large surface fomite (public tables) has the highest transmission potential and generally strongest control measure effects; 2) transmission from droplet-contaminated routes exceed those from hand-contaminated routes except for highly touched small surface fomites such as door knob handles; and 3) covering a cough using the upper arm or using tissues effectively removes virus from the system and thus decreases total fomite transmission. Because covering a cough by hands diverts pathogens from the droplet-fomite route to the hand-fomite route, this has the potential to increase total fomite transmission for highly touched small surface fomites. An improved understanding and more refined data related to fomite mediated transmission routes will help inform intervention strategies for influenza and other pathogens that are mediated through the environment.

  10. Model analysis of fomite mediated influenza transmission.

    Science.gov (United States)

    Zhao, Jijun; Eisenberg, Joseph E; Spicknall, Ian H; Li, Sheng; Koopman, James S

    2012-01-01

    Fomites involved in influenza transmission are either hand- or droplet-contaminated. We evaluated the interactions of fomite characteristics and human behaviors affecting these routes using an Environmental Infection Transmission System (EITS) model by comparing the basic reproduction numbers (R(0)) for different fomite mediated transmission pathways. Fomites classified as large versus small surface sizes (reflecting high versus low droplet contamination levels) and high versus low touching frequency have important differences. For example, 1) the highly touched large surface fomite (public tables) has the highest transmission potential and generally strongest control measure effects; 2) transmission from droplet-contaminated routes exceed those from hand-contaminated routes except for highly touched small surface fomites such as door knob handles; and 3) covering a cough using the upper arm or using tissues effectively removes virus from the system and thus decreases total fomite transmission. Because covering a cough by hands diverts pathogens from the droplet-fomite route to the hand-fomite route, this has the potential to increase total fomite transmission for highly touched small surface fomites. An improved understanding and more refined data related to fomite mediated transmission routes will help inform intervention strategies for influenza and other pathogens that are mediated through the environment.
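    A toy sketch of the route comparison, with every rate invented for illustration (the EITS model's actual R(0) expressions are more detailed):

      # Toy comparison of hand- vs. droplet-contaminated fomite routes in the
      # spirit of the EITS model. Every rate below is hypothetical.
      def r0_route(shed_rate, survive, transfer_to_hand, touch_rate,
                   self_inoculate, infectious_days, dose_response):
          pathogens_picked_up = shed_rate * survive * transfer_to_hand * touch_rate
          return pathogens_picked_up * self_inoculate * infectious_days * dose_response

      common = dict(touch_rate=5.0, self_inoculate=0.1,
                    infectious_days=4.0, dose_response=1e-3)
      droplet = r0_route(shed_rate=100.0, survive=0.5, transfer_to_hand=0.3, **common)
      hand = r0_route(shed_rate=10.0, survive=0.2, transfer_to_hand=0.3, **common)
      print(f"droplet-contaminated route R0 ~ {droplet:.2f}")
      print(f"hand-contaminated route R0 ~ {hand:.2f}")
      # With these made-up rates the droplet route dominates, matching the
      # qualitative finding for large, highly touched surfaces.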

  11. Sensitivity analysis and optimization of system dynamics models : Regression analysis and statistical design of experiments

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved, using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for

  12. JSim, an open-source modeling system for data analysis.

    Science.gov (United States)

    Butterworth, Erik; Jardine, Bartholomew E; Raymond, Gary M; Neal, Maxwell L; Bassingthwaighte, James B

    2013-01-01

    JSim is a simulation system for developing models, designing experiments, and evaluating hypotheses on physiological and pharmacological systems through the testing of model solutions against data. It is designed for interactive, iterative manipulation of the model code, handling of multiple data sets and parameter sets, and for making comparisons among different models running simultaneously or separately. Interactive use is supported by a large collection of graphical user interfaces for model writing and compilation diagnostics, defining input functions, model runs, selection of algorithms for solving ordinary and partial differential equations, run-time multidimensional graphics, parameter optimization (8 methods), sensitivity analysis, and Monte Carlo simulation for defining confidence ranges. JSim uses the Mathematical Modeling Language (MML), a declarative syntax specifying algebraic and differential equations. Imperative constructs written in other languages (MATLAB, FORTRAN, C++, etc.) are accessed through procedure calls. MML syntax is simple, basically defining the parameters and variables, then writing the equations in a straightforward, easily read and understood mathematical form. This makes JSim good for teaching modeling as well as for model analysis for research. For high-throughput applications, JSim can be run as a batch job. JSim can automatically translate models from the repositories for Systems Biology Markup Language (SBML) and CellML models. Stochastic modeling is supported. MML supports assigning physical units to constants and variables and automates checking of dimensional balance as the first step in verification testing. Automatic unit scaling follows, e.g. seconds to minutes, if needed. The JSim Project File sets a standard for reproducible modeling analysis: it includes in one file everything for analyzing a set of experiments: the data, the models, the data fitting, and evaluation of parameter confidence ranges. JSim is open source; it ...

  13. Topic Modeling in Sentiment Analysis: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Toqir Ahmad Rana

    2016-06-01

    With the expansion and acceptance of the World Wide Web, sentiment analysis has become a progressively popular research area in information retrieval and web data analysis. Due to the huge amount of user-generated content on blogs, forums, social media, etc., sentiment analysis has attracted researchers in both academia and industry, since it deals with the extraction of opinions and sentiments. In this paper, we present a review of topic modeling, especially LDA-based techniques, in sentiment analysis. We present a detailed analysis of diverse approaches and techniques, and compare the accuracy of the different systems among them. The results of the different approaches have been summarized, analyzed and presented in a sophisticated fashion. This is a real effort to explore different topic modeling techniques in the context of sentiment analysis and to impart a comprehensive comparison among them.
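    As a minimal, hypothetical example of the LDA-style topic modeling the review surveys (scikit-learn stands in for the various systems compared):

      # Minimal LDA sketch on a few toy opinionated sentences.
      from sklearn.decomposition import LatentDirichletAllocation
      from sklearn.feature_extraction.text import CountVectorizer

      docs = [
          "the battery life of this phone is great",
          "terrible battery and the screen broke",
          "the hotel room was clean and the staff friendly",
          "rude staff and a dirty room ruined the stay",
      ]
      vec = CountVectorizer(stop_words="english").fit(docs)
      counts = vec.transform(docs)
      lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

      terms = vec.get_feature_names_out()
      for t, topic in enumerate(lda.components_):
          top = [terms[i] for i in topic.argsort()[-4:][::-1]]
          print(f"topic {t}: {top}")
      # Sentiment-aware variants (e.g., joint sentiment/topic models) extend
      # this by attaching opinion labels to the per-topic word distributions.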

  14. MMA, A Computer Code for Multi-Model Analysis

    Science.gov (United States)

    Poeter, Eileen P.; Hill, Mary C.

    2007-01-01

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. Many applications of MMA will
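    The ranking arithmetic that MMA automates can be sketched as follows; the fit statistics below are invented, and AICc stands in for any of the four default criteria:

      # Sketch: information criteria and posterior model probabilities from
      # nonlinear-regression fits. Numbers are invented for illustration.
      import numpy as np

      # (sum of squared weighted residuals, number of parameters) per model
      fits = {"model_A": (52.1, 4), "model_B": (47.3, 6), "model_C": (46.8, 9)}
      n_obs = 40

      def aicc(ssr, k, n):
          aic = n * np.log(ssr / n) + 2 * k
          return aic + 2 * k * (k + 1) / (n - k - 1)  # small-sample correction

      scores = {m: aicc(ssr, k, n_obs) for m, (ssr, k) in fits.items()}
      best = min(scores.values())
      weights = {m: np.exp(-0.5 * (s - best)) for m, s in scores.items()}
      total = sum(weights.values())
      for m in fits:
          print(f"{m}: AICc={scores[m]:6.2f}  posterior prob={weights[m]/total:.2f}")
      # Model-averaged predictions weight each model's prediction by these
      # probabilities; BIC or KIC can be substituted for AICc in the same way.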

  15. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Noonan, Nicholas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  16. Sensitivity Analysis of a Simplified Fire Dynamic Model

    DEFF Research Database (Denmark)

    Sørensen, Lars Schiøtt; Nielsen, Anker

    2015-01-01

    This paper discusses a method for performing a sensitivity analysis of parameters used in a simplified fire model for temperature estimates in the upper smoke layer during a fire. The results from the sensitivity analysis can be used when individual parameters affecting fire safety are assessed...

  17. Domain Endurants: An Analysis and Description Process Model

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2014-01-01

    We present a summary, Sect. 2, of a structure of domain analysis and description concepts: techniques and tools. And we link, in Sect. 3, these concepts, embodied in domain analysis prompts and domain description prompts, in a model of how a diligent domain analyser cum describer would use them. ...

  18. Current status of uncertainty analysis methods for computer models

    International Nuclear Information System (INIS)

    Ishigami, Tsutomu

    1989-11-01

    This report surveys several existing uncertainty analysis methods for estimating computer output uncertainty caused by input uncertainties, illustrating application examples of those methods to three computer models, MARCH/CORRAL II, TERFOC and SPARC. Merits and limitations of the methods are assessed in the application, and recommendation for selecting uncertainty analysis methods is provided. (author)

  19. Mixed waste treatment model: Basis and analysis

    International Nuclear Information System (INIS)

    Palmer, B.A.

    1995-09-01

    The Department of Energy's Programmatic Environmental Impact Statement (PEIS) required treatment system capacities for risk and cost calculation. Los Alamos was tasked with providing these capacities to the PEIS team. This involved understanding the Department of Energy (DOE) Complex waste, making the necessary changes to correct for problems, categorizing the waste for treatment, and determining the treatment system requirements. The treatment system requirements depended on the incoming waste, which varied for each PEIS case. The treatment system requirements also depended on the type of treatment that was desired. Because different groups contributing to the PEIS needed specific types of results, we provided the treatment system requirements in a variety of forms. In total, some 40 data files were created for the TRU cases, and for the MLLW case, there were 105 separate data files. Each data file represents one treatment case consisting of the selected waste from various sites, a selected treatment system, and the reporting requirements for such a case. The treatment system requirements in their most basic form are the treatment process rates for unit operations in the desired treatment system, based on a 10-year working life and 20-year accumulation of the waste. These results were reported in cubic meters and for the MLLW case, in kilograms as well. The treatment system model consisted of unit operations that are linked together. Each unit operation's function depended on the input waste streams, waste matrix, and contaminants. Each unit operation outputs one or more waste streams whose matrix, contaminants, and volume/mass may have changed as a result of the treatment. These output streams are then routed to the appropriate unit operation for additional treatment until the output waste stream meets the treatment requirements for disposal. The total waste for each unit operation was calculated as well as the waste for each matrix treated by the unit

  20. Bayesian uncertainty analysis with applications to turbulence modeling

    International Nuclear Information System (INIS)

    Cheung, Sai Hung; Oliver, Todd A.; Prudencio, Ernesto E.; Prudhomme, Serge; Moser, Robert D.

    2011-01-01

    In this paper, we apply Bayesian uncertainty quantification techniques to the processes of calibrating complex mathematical models and predicting quantities of interest (QoI's) with such models. These techniques also enable the systematic comparison of competing model classes. The processes of calibration and comparison constitute the building blocks of a larger validation process, the goal of which is to accept or reject a given mathematical model for the prediction of a particular QoI for a particular scenario. In this work, we take the first step in this process by applying the methodology to the analysis of the Spalart-Allmaras turbulence model in the context of incompressible, boundary layer flows. Three competing model classes based on the Spalart-Allmaras model are formulated, calibrated against experimental data, and used to issue a prediction with quantified uncertainty. The model classes are compared in terms of their posterior probabilities and their prediction of QoI's. The model posterior probability represents the relative plausibility of a model class given the data. Thus, it incorporates the model's ability to fit experimental observations. Alternatively, comparing models using the predicted QoI connects the process to the needs of decision makers that use the results of the model. We show that by using both the model plausibility and predicted QoI, one has the opportunity to reject some model classes after calibration, before subjecting the remaining classes to additional validation challenges.

  1. Practical Soil-Shallow Foundation Model for Nonlinear Structural Analysis

    Directory of Open Access Journals (Sweden)

    Moussa Leblouba

    2016-01-01

    Soil-shallow foundation interaction models that are incorporated into most structural analysis programs generally lack accuracy and efficiency or neglect some aspects of foundation behavior. For instance, soil-shallow foundation systems have been observed to show both small and large loops under increasing amplitude load reversals. This paper presents a practical macroelement model for soil-shallow foundation system and its stability under simultaneous horizontal and vertical loads. The model comprises three spring elements: nonlinear horizontal, nonlinear rotational, and linear vertical springs. The proposed macroelement model was verified using experimental test results from large-scale model foundations subjected to small and large cyclic loading cases.

  2. Hidden-Markov-Model Analysis Of Telemanipulator Data

    Science.gov (United States)

    Hannaford, Blake; Lee, Paul

    1991-01-01

    Mathematical model and procedure based on hidden-Markov-model concept undergoing development for use in analysis and prediction of outputs of force and torque sensors of telerobotic manipulators. In model, overall task broken down into subgoals, and transition probabilities encode ease with which operator completes each subgoal. Process portion of model encodes task-sequence/subgoal structure, and probability-density functions for forces and torques associated with each state of manipulation encode sensor signals that one expects to observe at subgoal. Parameters of model constructed from engineering knowledge of task.
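    A minimal sketch of the likelihood computation underlying such a model: the HMM forward algorithm over a hypothetical subgoal-transition structure and discretized sensor symbols (all matrices invented for illustration).

      # Sketch: HMM forward algorithm scoring a discretized force signal
      # against a subgoal-sequence model. All matrices are hypothetical.
      import numpy as np

      A = np.array([[0.9, 0.1, 0.0],   # subgoal transition probabilities
                    [0.0, 0.8, 0.2],
                    [0.0, 0.0, 1.0]])
      B = np.array([[0.7, 0.2, 0.1],   # P(observation symbol | subgoal)
                    [0.1, 0.7, 0.2],
                    [0.2, 0.1, 0.7]])
      pi = np.array([1.0, 0.0, 0.0])
      obs = [0, 0, 1, 1, 2, 2, 2]      # discretized force/torque levels

      alpha = pi * B[:, obs[0]]
      for o in obs[1:]:
          alpha = (alpha @ A) * B[:, o]
      print("P(observations | model) =", alpha.sum())
      # In practice the densities B would be continuous (e.g., Gaussian) over
      # raw force/torque readings, and Viterbi decoding would recover the
      # most likely subgoal sequence.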

  3. Development of interpretation models for PFN uranium log analysis

    International Nuclear Information System (INIS)

    Barnard, R.W.

    1980-11-01

    This report presents the models for interpretation of borehole logs for the PFN (Prompt Fission Neutron) uranium logging system. Two models have been developed, the counts-ratio model and the counts/dieaway model. Both are empirically developed, but can be related to the theoretical bases for PFN analysis. The models try to correct for the effects of external factors (such as probe or formation parameters) in the calculation of uranium grade. The theoretical bases and calculational techniques for estimating uranium concentration from raw PFN data and other parameters are discussed. Examples and discussions of borehole logs are included

  4. comparative analysis of two mathematical models for prediction

    African Journals Online (AJOL)

    A mathematical modeling for prediction of compressive strength of sandcrete blocks was performed using statistical analysis for the sandcrete block data obtained from experimental work done in this study. The models used are Scheffe's and Osadebe's optimization theories to predict the compressive strength of ...

  5. Stochastic processes analysis in nuclear reactor using ARMA models

    International Nuclear Information System (INIS)

    Zavaljevski, N.

    1990-01-01

    The analysis of an ARMA model derived from the general stochastic state equations of a nuclear reactor is given. The dependence of the ARMA model parameters on the main physical characteristics of the RB nuclear reactor in Vinca is presented. Preliminary identification results are presented, observed discrepancies between theory and experiment are explained, and possibilities for improving the identification are anticipated. (author)
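    For illustration only (not the RB reactor data), an ARMA model can be identified from a simulated noise record with statsmodels; ARIMA with d = 0 is an ARMA(p, q) fit. Orders and coefficients below are invented.

      # Sketch: ARMA identification on a simulated neutron-noise-like signal.
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA
      from statsmodels.tsa.arima_process import ArmaProcess

      ar = np.array([1.0, -0.75, 0.2])   # AR polynomial (lag-operator form)
      ma = np.array([1.0, 0.4])          # MA polynomial
      y = ArmaProcess(ar, ma).generate_sample(nsample=2000)

      fit = ARIMA(y, order=(2, 0, 1)).fit()
      print(fit.params)   # estimated AR/MA parameters
      print(fit.aic)      # order selection compares AIC across (p, q)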

  6. De novo structural modeling and computational sequence analysis ...

    African Journals Online (AJOL)

    Jane

    2011-07-25

    Our study was aimed towards computational proteomic analysis and 3D structural modeling of this novel bacteriocin protein encoded by the earlier aforementioned gene. Different bioinformatics tools and machine learning techniques were used for protein structural classification. De novo protein modeling ...

  7. Stability Analysis of a Mathematical Model for Onchocerciasis ...

    African Journals Online (AJOL)

    ADOWIE PERE

    Stability Analysis of a Mathematical Model for Onchocerciasis. BAKO, DU; AKINWANDE, NI; ENAGI, AI; KUTA, FA; ABDULRAHMAN, S. [Table 1 (values of the model parameters) is flattened in the source; the recoverable entries, as S/N. parameter = value (source), are: 1. ω = 0.019 (estimated); 2. Λ_h = 3,449,679 (estimated); 3. µ_h = 0.019 (CIA 2016); 4. α_1 = 2.12 (Shuaib 2015); 5. ...]

  8. A Mathematical Model for Analysis on Ships Collision Avoidance ...

    African Journals Online (AJOL)

    This study develops a mathematical model for analysis on collision avoidance of ships. The obtained model provides information on the quantitative effect of the ship's engine's response and the applied reversing force on separation distance and stopping abilities of the ships. Appropriate evasive maneuvers require the ...

  9. Stability Analysis of Delayed Cournot Model in ...

    African Journals Online (AJOL)

    HP

    Elsadany, A.A. (2010). Dynamics of a delayed duopoly game with bounded rationality. Mathematical and Computer Modelling, 52(9-10), 1479-1489. Kopel, M. (1996). Simple and complex adjustment dynamics in Cournot duopoly models. Chaos, Solitons and Fractals, 7(12), 2013-2048. Peters, E. (1994). Fractal Market Analysis: ...

  10. Development of numerical modelling of analysis program for energy ...

    Indian Academy of Sciences (India)

    ... behaviour of the VDHD was proposed in this research. ... The analysis results obtained by using the mathematical model and the proposed SAP2000 numerical model conform to the seismic resistant test results, ...

  11. Using Latent Class Analysis to Model Temperament Types

    Science.gov (United States)

    Loken, Eric

    2004-01-01

    Mixture models are appropriate for data that arise from a set of qualitatively different subpopulations. In this study, latent class analysis was applied to observational data from a laboratory assessment of infant temperament at four months of age. The EM algorithm was used to fit the models, and the Bayesian method of posterior predictive checks…
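    A sketch of the mixture-model idea on simulated data; GaussianMixture is used here as a stand-in (latent class analysis proper uses categorical indicators), and the two simulated clusters are hypothetical temperament types.

      # Sketch: EM-fitted latent classes for multivariate observations.
      # Simulated data, not the infant temperament ratings from the study.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(3)
      calm = rng.normal([0, 0], 0.5, size=(150, 2))     # low motor, low cry
      reactive = rng.normal([2, 2], 0.5, size=(50, 2))  # high motor, high cry
      X = np.vstack([calm, reactive])

      gm = GaussianMixture(n_components=2, random_state=0).fit(X)
      print("class sizes:", np.bincount(gm.predict(X)))
      print("class means:", gm.means_.round(2))
      # BIC across n_components, or posterior predictive checks as in the
      # study above, guide the choice of the number of temperament types.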

  12. Modeling, analysis and control of a variable geometry actuator

    NARCIS (Netherlands)

    Evers, W.J.; Knaap, A. van der; Besselink, I.J.M.; Nijmeijer, H.

    2008-01-01

    A new design of variable geometry force actuator is presented in this paper. Based upon this design, a model is derived which is used for steady-state analysis, as well as controller design in the presence of friction. The controlled actuator model is finally used to evaluate the power consumption

  13. Impact Analysis of a Biomechanical Model of the Human Thorax

    National Research Council Canada - National Science Library

    Jolly, Johannes

    2000-01-01

    .... The objective of the study was to create a viable finite element model of the human thorax. This objective was accomplished through the construction of a three-dimensional finite element model in DYNA3D, a finite element analysis program...

  14. Mathematical Modelling Research in Turkey: A Content Analysis Study

    Science.gov (United States)

    Çelik, H. Coskun

    2017-01-01

    The aim of the present study was to examine the mathematical modelling studies done between 2004 and 2015 in Turkey and to reveal their tendencies. Forty-nine studies were selected using purposeful sampling based on the term, "mathematical modelling" with Higher Education Academic Search Engine. They were analyzed with content analysis.…

  15. Evaluation of Statistical Models for Analysis of Insect, Disease and ...

    African Journals Online (AJOL)

    It is concluded that LMMs and GLMs simultaneously consider the effect of treatments and heterogeneity of variance and hence are more appropriate for analysis of abundance and incidence data than ordinary ANOVA. Keywords: Mixed Models; Generalized Linear Models; Statistical Power

  16. A Systemic Cause Analysis Model for Human Performance Technicians

    Science.gov (United States)

    Sostrin, Jesse

    2011-01-01

    This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…

  17. Standard model for safety analysis report of fuel reprocessing plants

    International Nuclear Information System (INIS)

    1979-12-01

    A standard model for a safety analysis report of fuel reprocessing plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) for evaluating requests for construction permits and operation licenses made according to the legislation in force. (E.G.) [pt

  18. Standard model for safety analysis report of fuel fabrication plants

    International Nuclear Information System (INIS)

    1980-09-01

    A standard model for a safety analysis report of fuel fabrication plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) for evaluating requests for construction permits and operation licenses made according to the legislation in force. (E.G.) [pt

  19. Biological sequence analysis: probabilistic models of proteins and nucleic acids

    National Research Council Canada - National Science Library

    Durbin, Richard

    1998-01-01

    ... analysis methods are now based on principles of probabilistic modelling. Examples of such methods include the use of probabilistically derived score matrices to determine the significance of sequence alignments, the use of hidden Markov models as the basis for profile searches to identify distant members of sequence families, and the inference...

  20. Stability analysis for a general age-dependent vaccination model

    International Nuclear Information System (INIS)

    El Doma, M.

    1995-05-01

    An SIR epidemic model with a general age-dependent vaccination strategy is investigated when the fertility, mortality and removal rates depend on age. We give threshold criteria for the existence of equilibria and perform a stability analysis. Furthermore, a critical vaccination coverage that is sufficient to eradicate the disease is determined. (author). 12 refs
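    For the age-independent special case, the critical coverage has a closed form, p_c = 1 - 1/R0, which the paper generalizes to age-dependent rates; a small worked example (numbers illustrative):

      # Worked illustration: for a simple SIR model, immunizing a fraction p
      # scales the reproduction number to R0 * (1 - p), so eradication needs
      # p > 1 - 1/R0. Numbers are illustrative.
      def critical_coverage(r0):
          return max(0.0, 1.0 - 1.0 / r0)

      for r0 in (1.5, 4.0, 12.0):
          print(f"R0 = {r0:4.1f} -> eradication needs coverage > "
                f"{critical_coverage(r0):.0%}")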

  1. Comparative Analysis of Two Mathematical Models for Prediction of ...

    African Journals Online (AJOL)

    A mathematical modeling for prediction of compressive strength of sandcrete blocks was performed using statistical analysis for the sandcrete block data obtained from experimental work done in this study. The models used are Scheffe's and Osadebe's optimization theories to predict the compressive strength of sandcrete ...

  2. Comparative analysis of design models for concrete corbels

    Directory of Open Access Journals (Sweden)

    D. L. Araújo

    The main objective of this paper is to perform a comparative analysis of design models for precast concrete corbels. Design models from the Brazilian (NBR 9062) and European (EUROCODE 2) codes and a US design handbook (PCI) are analyzed, along with three analytical models from the literature. The objective of this comparison is to identify the design models that best represent the failure load of concrete corbels, whether governed by yielding of the tie or by concrete crushing, and to evaluate the contribution of horizontal stirrups to the resistance of concrete corbels. For this, a database was assembled from test results of concrete corbels carried out by several researchers and reported in the literature. The design models were applied to this database and, using statistical tools, adjustment coefficients are recommended for these design models to account for the scatter of results found in the analysis.
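    The statistical step can be sketched generically: collect test-to-predicted capacity ratios for a given design model and derive an adjustment coefficient from their scatter. The ratios and the 5%-exceedance rule below are hypothetical, not the paper's calibration.

      # Sketch: adjustment coefficient from test/predicted capacity ratios.
      # The ratios below are invented for illustration.
      import numpy as np

      ratio = np.array([1.10, 0.95, 1.25, 1.05, 0.88, 1.18, 1.02])  # V_test/V_model
      mean, std = ratio.mean(), ratio.std(ddof=1)
      # A simple lower-bound adjustment: mean minus k standard deviations
      # (k = 1.645 for ~5% exceedance under a normal assumption).
      k = 1.645
      adjustment = mean - k * std
      print(f"mean={mean:.3f}, CV={std/mean:.3f}, "
            f"adjustment coefficient ~ {adjustment:.3f}")
      # Multiplying the model's predicted capacity by this coefficient keeps
      # roughly 95% of test results on the safe side of the prediction.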

  3. Analysis and synthesis of solutions for the agglomeration process modeling

    Science.gov (United States)

    Babuk, V. A.; Dolotkazin, I. N.; Nizyaev, A. A.

    2013-03-01

    The present work is devoted to the development of a model of the agglomeration process for propellants based on ammonium perchlorate (AP), ammonium dinitramide (ADN), HMX, inactive binder, and nanoaluminum. Generalization of experimental data, development of a physical picture of agglomeration for the listed propellants, and development and analysis of mathematical models are carried out. Synthesis of models of the various phenomena taking place during agglomeration allows prediction of the size, quantity, chemical composition, and structure of the forming agglomerates, as well as their fraction in the set of condensed combustion products. This became possible in many respects due to the development of a new model of agglomerating particle evolution on the surface of the burning propellant. The obtained results correspond to the available experimental data. It is expected that an analogous method, based on the analysis of mathematical models of particular phenomena and their synthesis, will allow modeling of the agglomeration process for other types of metalized solid propellants.

  4. MMA, A Computer Code for Multi-Model Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations.

  5. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to developing high quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model, so that we can confirm the validity of input/output data for each page and of page transitions on the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve traceability from the validated requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  6. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: ► The paper discusses the validation of creep rupture models derived from statistical analysis. ► It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. ► The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. ► The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  7. Analysis and Enhancements of a Prolific Macroscopic Model of Epilepsy

    OpenAIRE

    Fietkiewicz, Christopher; Loparo, Kenneth A.

    2016-01-01

    Macroscopic models of epilepsy can deliver surprisingly realistic EEG simulations. In the present study, a prolific series of models is evaluated with regard to theoretical and computational concerns, and enhancements are developed. Specifically, we analyze three aspects of the models: (1) Using dynamical systems analysis, we demonstrate and explain the presence of direct current potentials in the simulated EEG that were previously undocumented. (2) We explain how the system was not ideally f...

  8. SOA Modeling Patterns for Service Oriented Discovery and Analysis

    CERN Document Server

    Bell, Michael

    2010-01-01

    Learn the essential tools for developing a sound service-oriented architecture. SOA Modeling Patterns for Service-Oriented Discovery and Analysis introduces a universal, easy-to-use, and nimble SOA modeling language to facilitate the service identification and examination life cycle stage. This business and technological vocabulary will benefit your service development endeavors and foster organizational software asset reuse and consolidation, and reduction of expenditure. Whether you are a developer, business architect, technical architect, modeler, business analyst, team leader, or manager,

  9. Practical Soil-Shallow Foundation Model for Nonlinear Structural Analysis

    OpenAIRE

    Moussa Leblouba; Salah Al Toubat; Muhammad Ekhlasur Rahman; Omer Mugheida

    2016-01-01

    Soil-shallow foundation interaction models that are incorporated into most structural analysis programs generally lack accuracy and efficiency or neglect some aspects of foundation behavior. For instance, soil-shallow foundation systems have been observed to show both small and large loops under increasing amplitude load reversals. This paper presents a practical macroelement model for soil-shallow foundation system and its stability under simultaneous horizontal and vertical loads. The model...

  10. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...

  11. Testing Process Factor Analysis Models Using the Parametric Bootstrap.

    Science.gov (United States)

    Zhang, Guangjian

    2018-01-01

    Process factor analysis (PFA) is a latent variable model for intensive longitudinal data. It combines P-technique factor analysis and time series analysis. A goodness-of-fit test for PFA is currently unavailable. In this paper, we propose a parametric bootstrap method for assessing model fit in PFA. We illustrate the test with an empirical data set in which 22 participants rated their affect every day over a period of 90 days. We also explore the Type I error and power of the parametric bootstrap test with simulated data.
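    A generic parametric-bootstrap goodness-of-fit sketch in the spirit of the proposed test, shown for a deliberately misspecified normal model rather than a PFA model: simulate from the fitted model, refit, and compare the observed discrepancy with its bootstrap distribution.

      # Sketch: parametric bootstrap goodness-of-fit for a simple normal model.
      import numpy as np

      rng = np.random.default_rng(4)
      data = rng.standard_t(df=3, size=200)        # data the model misfits

      mu, sigma = data.mean(), data.std(ddof=1)    # fitted normal model

      def discrepancy(x, mu, sigma):
          # Kolmogorov-style distance between empirical and model CDF
          from scipy.stats import norm
          xs = np.sort(x)
          emp = np.arange(1, len(xs) + 1) / len(xs)
          return np.max(np.abs(emp - norm.cdf(xs, mu, sigma)))

      obs = discrepancy(data, mu, sigma)
      boot = []
      for _ in range(500):
          sim = rng.normal(mu, sigma, size=len(data))
          boot.append(discrepancy(sim, sim.mean(), sim.std(ddof=1)))
      pval = np.mean(np.array(boot) >= obs)
      print(f"observed discrepancy {obs:.3f}, bootstrap p-value {pval:.3f}")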

  12. RFA: R-Squared Fitting Analysis Model for Power Attack

    Directory of Open Access Journals (Sweden)

    An Wang

    2017-01-01

    Correlation Power Analysis (CPA), introduced by Brier et al. in 2004, is an important side-channel attack method; over the last decade it has enabled attackers to derive secret or private keys efficiently and at low cost. In this paper, we propose R-squared fitting model analysis (RFA), which is more appropriate for nonlinear correlation analysis. This model can also be applied to other side-channel methods such as second-order CPA and the collision-correlation power attack. Our experiments show that RFA-based attacks bring significant advantages in both time complexity and success rate.
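    A toy contrast between classical CPA and an R-squared fit on simulated Hamming-weight leakage; the key, noise level, and leakage model are invented for illustration.

      # Sketch: CPA vs. R-squared fitting on simulated power traces.
      import numpy as np

      rng = np.random.default_rng(5)
      true_key, n = 0x3C, 3000
      plaintexts = rng.integers(0, 256, n)
      hw = np.array([bin(v).count("1") for v in range(256)])
      traces = hw[plaintexts ^ true_key] + rng.normal(0, 2.0, n)  # leaky point

      def cpa_stat(guess):
          model = hw[plaintexts ^ guess]
          return abs(np.corrcoef(model, traces)[0, 1])

      def r2_stat(guess):
          # R-squared of regressing traces on the hypothesized leakage model
          model = hw[plaintexts ^ guess]
          A = np.column_stack([model, np.ones(n)])
          resid = traces - A @ np.linalg.lstsq(A, traces, rcond=None)[0]
          return 1 - resid.var() / traces.var()

      print("CPA best guess:", hex(max(range(256), key=cpa_stat)))
      print("R^2 best guess:", hex(max(range(256), key=r2_stat)))
      # With a linear leakage model the two rankings coincide (R^2 = rho^2);
      # RFA's advantage appears when the leakage is nonlinear in the model.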

  13. A Graphical Model for Risk Analysis and Management

    Science.gov (United States)

    Wang, Xun; Williams, Mary-Anne

    Risk analysis and management are important capabilities in intelligent information and knowledge systems. We present a new approach using directed graph based models for risk analysis and management. Our modelling approach is inspired by and builds on the two-level approach of the Transferable Belief Model: a credal level for risk analysis and model construction, which uses beliefs in causal inference relations among the variables within a domain, and a pignistic (betting) level for decision making. The risk model at the credal level can be transformed into a probabilistic model through a pignistic transformation function. This paper focuses on model construction at the credal level. Our modelling approach captures expert knowledge in a formal and iterative fashion based on the Open World Assumption (OWA), in contrast to Bayesian Network based approaches for managing uncertainty associated with risks, which assume all the domain knowledge and data have been captured beforehand. As a result, our approach does not require complete knowledge and is well suited to modelling risk in dynamically changing environments where information and knowledge are gathered over time as decisions need to be taken. Its performance is related to the quality of the knowledge at hand at any given time.
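    The credal-to-pignistic step mentioned above has a standard closed form in the Transferable Belief Model, BetP(x) = sum over sets A containing x of m(A) / (|A| * (1 - m(empty))); a small sketch with hypothetical masses:

      # Sketch: pignistic transformation of a belief mass function.
      # Masses below are hypothetical.
      from itertools import chain

      mass = {                                   # masses over subsets of risks
          frozenset({"fire"}): 0.3,
          frozenset({"flood"}): 0.2,
          frozenset({"fire", "flood"}): 0.4,     # undecided between the two
          frozenset(): 0.1,                      # conflict / open-world mass
      }

      def pignistic(mass):
          empty = mass.get(frozenset(), 0.0)
          outcomes = set(chain.from_iterable(mass))
          return {x: sum(m / len(A) for A, m in mass.items() if x in A)
                     / (1.0 - empty)
                  for x in outcomes}

      print(pignistic(mass))   # {'fire': ~0.556, 'flood': ~0.444}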

  14. Global Sensitivity Analysis of Environmental Models: Convergence, Robustness and Accuracy Analysis

    Science.gov (United States)

    Sarrazin, F.; Pianosi, F.; Hartmann, A. J.; Wagener, T.

    2014-12-01

    Sensitivity analysis aims to characterize the impact that changes in model input factors (e.g. the parameters) have on the model output (e.g. simulated streamflow). It is a valuable diagnostic tool for model understanding and for model improvement, it enhances calibration efficiency, and it supports uncertainty and scenario analysis. It is of particular interest for environmental models because they are often complex, non-linear, non-monotonic and exhibit strong interactions between their parameters. However, sensitivity analysis has to be carefully implemented to produce reliable results at moderate computational cost. For example, sample size can have a strong impact on the results and has to be carefully chosen. Yet, there is little guidance available for this step in environmental modelling. The objective of the present study is to provide guidelines for a robust sensitivity analysis, in order to support modellers in making appropriate choices for its implementation and in interpreting its outcome. We considered hydrological models with increasing levels of complexity. We tested four sensitivity analysis methods: Regional Sensitivity Analysis, the Method of Morris, a density-based method (PAWN) and a variance-based method (Sobol). The convergence and variability of sensitivity indices were investigated. We used bootstrapping to assess and improve the robustness of sensitivity indices even for limited sample sizes. Finally, we propose a quantitative validation approach for sensitivity analysis based on the Kolmogorov-Smirnov statistic.
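    A compact sketch of one of the tested ingredients, Regional Sensitivity Analysis scored with the Kolmogorov-Smirnov statistic plus a bootstrap check of robustness, on a toy model (the model and thresholds are hypothetical):

      # Sketch: Regional Sensitivity Analysis with a KS statistic and a
      # bootstrap robustness check. Toy model; all choices illustrative.
      import numpy as np
      from scipy.stats import ks_2samp

      rng = np.random.default_rng(6)
      n = 2000
      x = rng.uniform(0, 1, size=(n, 3))                      # input factors
      y = 5 * x[:, 0] + np.sin(6 * x[:, 1]) + 0.1 * x[:, 2]   # toy output

      behavioural = y > np.median(y)                          # output split

      def ks_index(j, idx):
          return ks_2samp(x[idx][behavioural[idx], j],
                          x[idx][~behavioural[idx], j]).statistic

      idx_all = np.arange(n)
      for j in range(3):
          boot = [ks_index(j, rng.choice(idx_all, n, replace=True))
                  for _ in range(200)]
          lo, hi = np.percentile(boot, [2.5, 97.5])
          print(f"x{j}: KS={ks_index(j, idx_all):.2f}  "
                f"95% CI [{lo:.2f}, {hi:.2f}]")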

  15. Demographics of reintroduced populations: estimation, modeling, and decision analysis

    Science.gov (United States)

    Converse, Sarah J.; Moore, Clinton T.; Armstrong, Doug P.

    2013-01-01

    Reintroduction can be necessary for recovering populations of threatened species. However, the success of reintroduction efforts has been poorer than many biologists and managers would hope. To increase the benefits gained from reintroduction, management decision making should be couched within formal decision-analytic frameworks. Decision analysis is a structured process for informing decision making that recognizes that all decisions have a set of components—objectives, alternative management actions, predictive models, and optimization methods—that can be decomposed, analyzed, and recomposed to facilitate optimal, transparent decisions. Because the outcome of interest in reintroduction efforts is typically population viability or related metrics, models used in decision analysis efforts for reintroductions will need to include population models. In this special section of the Journal of Wildlife Management, we highlight examples of the construction and use of models for informing management decisions in reintroduced populations. In this introductory contribution, we review concepts in decision analysis, population modeling for analysis of decisions in reintroduction settings, and future directions. Increased use of formal decision analysis, including adaptive management, has great potential to inform reintroduction efforts. Adopting these practices will require close collaboration among managers, decision analysts, population modelers, and field biologists.

  16. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize an RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method.

  17. Dynamic data analysis modeling data with differential equations

    CERN Document Server

    Ramsay, James

    2017-01-01

    This text focuses on the use of smoothing methods for developing and estimating differential equations following recent developments in functional data analysis and building on techniques described in Ramsay and Silverman (2005) Functional Data Analysis. The central concept of a dynamical system as a buffer that translates sudden changes in input into smooth controlled output responses has led to applications of previously analyzed data, opening up entirely new opportunities for dynamical systems. The technical level has been kept low so that those with little or no exposure to differential equations as modeling objects can be brought into this data analysis landscape. There are already many texts on the mathematical properties of ordinary differential equations, or dynamic models, and there is a large literature distributed over many fields on models for real world processes consisting of differential equations. However, a researcher interested in fitting such a model to data, or a statistician interested in...

  18. Bayesian nonparametric meta-analysis using Polya tree mixture models.

    Science.gov (United States)

    Branscum, Adam J; Hanson, Timothy E

    2008-09-01

    Summary. A common goal in meta-analysis is estimation of a single effect measure using data from several studies that are each designed to address the same scientific inquiry. Because studies are typically conducted in geographically disperse locations, recent developments in the statistical analysis of meta-analytic data involve the use of random effects models that account for study-to-study variability attributable to differences in environments, demographics, genetics, and other sources that lead to heterogeneity in populations. Stemming from asymptotic theory, study-specific summary statistics are modeled according to normal distributions with means representing latent true effect measures. A parametric approach subsequently models these latent measures using a normal distribution, which is strictly a convenient modeling assumption absent of theoretical justification. To eliminate the influence of overly restrictive parametric models on inferences, we consider a broader class of random effects distributions. We develop a novel hierarchical Bayesian nonparametric Polya tree mixture (PTM) model. We present methodology for testing the PTM versus a normal random effects model. These methods provide researchers a straightforward approach for conducting a sensitivity analysis of the normality assumption for random effects. An application involving meta-analysis of epidemiologic studies designed to characterize the association between alcohol consumption and breast cancer is presented, which together with results from simulated data highlight the performance of PTMs in the presence of nonnormality of effect measures in the source population.
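    For contrast with the PTM, the conventional parametric baseline can be sketched in a few lines: DerSimonian-Laird normal random-effects pooling with invented study effects and variances.

      # Sketch: DerSimonian-Laird normal random-effects meta-analysis, the
      # parametric baseline the Polya tree mixture relaxes. Data invented.
      import numpy as np

      effects = np.array([0.12, 0.30, 0.05, 0.22, 0.18])    # per-study log RR
      variances = np.array([0.01, 0.04, 0.02, 0.03, 0.02])

      w = 1 / variances
      fixed = np.sum(w * effects) / np.sum(w)
      q = np.sum(w * (effects - fixed) ** 2)
      k = len(effects)
      tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

      w_re = 1 / (variances + tau2)
      pooled = np.sum(w_re * effects) / np.sum(w_re)
      se = np.sqrt(1 / np.sum(w_re))
      print(f"tau^2={tau2:.4f}, pooled effect={pooled:.3f} (SE {se:.3f})")
      # The Polya tree mixture replaces the normal random-effects
      # distribution, so inferences are not tied to this normality assumption.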

  19. Development of Wolsong Unit 2 Containment Analysis Model

    Energy Technology Data Exchange (ETDEWEB)

    Hoon, Choi [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of); Jin, Ko Bong; Chan, Park Young [Hanbat National Univ., Daejeon (Korea, Republic of)

    2014-05-15

    In preparation for the full scope safety analysis of Wolsong unit 2 with modified fuel, input decks for various objectives, which can be read by GOTHIC 7.2b(QA), were developed and tested in steady state simulation. A detailed nodalization of 39 control volumes and 92 flow paths was constructed to determine the differential pressure across internal walls and the hydrogen concentration and distribution inside containment. A lumped model with 15 control volumes and 74 flow paths has also been developed to reduce the computer run time for assessments in which the analysis results are not sensitive to the detailed thermal hydraulic distribution inside containment, such as peak pressure, pressure dependent signals and radionuclide release. The input data files provide simplified representations of the geometric layout of the containment building (volumes, dimensions, flow paths, doors, panels, etc.) and the performance characteristics of the various containment subsystems. The parameter values are based on best estimate or design values. The analysis values are chosen with a degree of conservatism that depends on the analysis objective and may differ between analysis objectives. Basic input decks of Wolsong unit 2 were developed for the various analysis purposes with GOTHIC 7.2b(QA). Depending on the analysis objective, two types of models are prepared: the detailed model represents each confined room in the containment as a separate node, with all geometric data based on the drawings of Wolsong unit 2, while the lumped model trades spatial detail for run time. The developed containment models simulate the steady state well at the designated initial conditions. These base models will be used for Wolsong unit 2 whenever a full scope safety analysis is needed.

  20. Model for Analysis of Energy Demand (MAED-2). User's manual

    International Nuclear Information System (INIS)

    2006-01-01

    The IAEA has been supporting its Member States in the area of energy planning for sustainable development. Development and dissemination of appropriate methodologies and their computer codes are important parts of this support. This manual has been produced to facilitate the use of the MAED model: Model for Analysis of Energy Demand. The methodology of the MAED model was originally developed by B. Chateau and B. Lapillonne of the Institut Economique et Juridique de l'Energie (IEJE) of the University of Grenoble, France, and was presented as the MEDEE model. Since then the MEDEE model has been developed and adapted for modelling various energy demand systems. The IAEA adopted the MEDEE-2 model and incorporated important modifications to make it more suitable for application in developing countries, naming the result the MAED model. The first version of the MAED model was designed for DOS and was later converted for Windows. This manual presents the latest version of the MAED model. The most prominent feature of this version is its flexibility for representing the structure of energy consumption. The model now allows country-specific representations of energy consumption patterns using the MAED methodology. The user can now disaggregate energy consumption according to the needs and/or data availability in her/his country. As such, MAED has now become a powerful tool for modelling widely diverse energy consumption patterns. This manual presents the model in detail and provides guidelines for its application.

  1. Model for Analysis of Energy Demand (MAED-2)

    International Nuclear Information System (INIS)

    2007-01-01

    The IAEA has been supporting its Member States in the area of energy planning for sustainable development. Development and dissemination of appropriate methodologies and their computer codes are important parts of this support. This manual has been produced to facilitate the use of the MAED model: Model for Analysis of Energy Demand. The methodology of the MAED model was originally developed by B. Chateau and B. Lapillonne of the Institut Economique et Juridique de l'Energie (IEJE) of the University of Grenoble, France, and was presented as the MEDEE model. Since then the MEDEE model has been developed and adapted for modelling various energy demand systems. The IAEA adopted the MEDEE-2 model and incorporated important modifications to make it more suitable for application in developing countries, naming the result the MAED model. The first version of the MAED model was designed for DOS and was later converted for Windows. This manual presents the latest version of the MAED model. The most prominent feature of this version is its flexibility for representing the structure of energy consumption. The model now allows country-specific representations of energy consumption patterns using the MAED methodology. The user can now disaggregate energy consumption according to the needs and/or data availability in her/his country. As such, MAED has now become a powerful tool for modelling widely diverse energy consumption patterns. This manual presents the model in detail and provides guidelines for its application.

  2. Model for Analysis of Energy Demand (MAED-2). User's manual

    International Nuclear Information System (INIS)

    2007-01-01

    The IAEA has been supporting its Member States in the area of energy planning for sustainable development. Development and dissemination of appropriate methodologies and their computer codes are important parts of this support. This manual has been produced to facilitate the use of the MAED model: Model for Analysis of Energy Demand. The methodology of the MAED model was originally developed by B. Chateau and B. Lapillonne of the Institut Economique et Juridique de l'Energie (IEJE) of the University of Grenoble, France, and was presented as the MEDEE model. Since then the MEDEE model has been further developed and adapted for modelling various energy demand systems. The IAEA adopted the MEDEE-2 model and incorporated important modifications to make it more suitable for application in developing countries; the result was named the MAED model. The first version of the MAED model was designed for DOS and was later converted to Windows. This manual presents the latest version of the MAED model. The most prominent feature of this version is its flexibility in representing the structure of energy consumption. The model now allows country-specific representations of energy consumption patterns using the MAED methodology. The user can now disaggregate energy consumption according to the needs and/or data availability in her/his country. As such, MAED has become a powerful tool for modelling widely diverse energy consumption patterns. This manual presents the model in detail and provides guidelines for its application

  3. Properties of autoregressive model in reactor noise analysis, 1

    International Nuclear Information System (INIS)

    Yamada, Sumasu; Kishida, Kuniharu; Bekki, Keisuke.

    1987-01-01

    Under appropriate conditions, stochastic processes are described by the ARMA model; however, the AR model is popularly used in reactor noise analysis. Hence, the properties of the AR model as an approximate representation of the ARMA model should be made clear. Here, the convergence of the AR parameters and of the PSD of the AR model was studied through numerical analysis of specific examples such as the neutron noise in subcritical reactors, and it was found that: (1) The convergence of the AR parameters and of the AR model PSD is governed by the zero of the ARMA model nearest to the unit circle in the complex plane (μ⁻¹, with |μ| < 1). (2) The approximation error decreases roughly as |μ|^M with the model order M. (3) The AR model of the neutron noise of subcritical reactors needs a large model order because of an ARMA zero very close to unity, corresponding to the decay constant of the 6th group of delayed neutron precursors. (4) In applying the AR model for system identification, much attention has to be paid to the a priori unknown error of the AR model as an approximate representation of the ARMA model, in addition to the statistical errors. (author)
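
    A minimal numerical sketch of the effect described in items (1)-(3): AR models of increasing order are fitted, via the Yule-Walker equations, to data generated by an ARMA process whose zero lies near the unit circle, and the resulting PSDs are compared. All signal parameters below are illustrative assumptions, not values from the paper.

    import numpy as np

    def fit_ar_yule_walker(x, order):
        """Fit AR coefficients and innovation variance via the Yule-Walker equations."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        n = len(x)
        # biased autocovariance estimates r[0..order]
        r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
        R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
        a = np.linalg.solve(R, r[1:])        # AR coefficients
        sigma2 = r[0] - np.dot(a, r[1:])     # innovation variance
        return a, sigma2

    def ar_psd(a, sigma2, freqs):
        """PSD of the fitted AR model (up to normalization), f in cycles/sample."""
        z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, len(a) + 1)))
        return sigma2 / np.abs(1.0 - z @ a) ** 2

    # ARMA(1,1) test signal with a zero (theta) close to the unit circle, so the
    # equivalent AR representation converges slowly, roughly like theta**M.
    rng = np.random.default_rng(0)
    e = rng.standard_normal(20000)
    phi, theta = 0.7, 0.95
    x = np.zeros_like(e)
    for t in range(1, len(e)):
        x[t] = phi * x[t - 1] + e[t] - theta * e[t - 1]

    freqs = np.linspace(0.01, 0.5, 200)
    for order in (2, 8, 32):
        a, s2 = fit_ar_yule_walker(x, order)
        print(order, ar_psd(a, s2, freqs)[:3])  # low orders misfit the PSD most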

  4. Structural Simulations and Conservation Analysis - Historic Building Information Model (HBIM)

    Directory of Open Access Journals (Sweden)

    C. Dore

    2015-02-01

    Full Text Available In this paper the current findings to date of the Historic Building Information Model (HBIM) of the Four Courts in Dublin are presented. The HBIM forms the basis for both structural and conservation analysis to measure the impact of war damage that still affects the building. A laser scan survey of the internal and external structure was carried out in the summer of 2014. After registration and processing of the laser scan survey, the HBIM of the damaged section of the building was created; it is presented as two separate workflows in this paper. The first is a model created from historic data; the second is a procedural, segmented model developed from the laser scan survey of the war-damaged drum and dome. From both models, structural damage and decay simulations will be developed for documentation and conservation analysis.

  5. Modeling and Analysis of Reentrant Manufacturing Systems: Micro- and Macroperspectives

    Directory of Open Access Journals (Sweden)

    Fenglan He

    2011-01-01

    Full Text Available In order to obtain a better analysis of multiple reentrant manufacturing systems (MRMSs), their modeling and analysis from both micro- and macroperspectives are considered. First, this paper presents discrete event simulation models for MRMSs, and the corresponding algorithms are developed. To describe MRMSs more accurately, a modified continuum model is then proposed. This continuum model takes into account the re-entrant degree of products, and its effectiveness is verified through numerical experiments. Finally, based on the discrete event simulation and the modified continuum models, a numerical example is used to analyze the MRMS. The changes in the WIP levels and outflux are also analyzed in detail for multiple re-entrant supply chain networks, and some interesting observations are discussed.
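
    A micro-level sketch of a re-entrant line in the spirit of the discrete event models described above, written with the simpy library (a choice made here for brevity; the paper's own simulation framework is not specified). Station names, service times, and the route are illustrative assumptions.

    import simpy

    SERVICE = {"litho": 2.0, "etch": 1.5}
    ROUTE = ["litho", "etch", "litho"]      # re-entrant: jobs revisit the litho station

    def job(env, jid, stations, done):
        for station in ROUTE:
            with stations[station].request() as req:
                yield req                    # queue at the station
                yield env.timeout(SERVICE[station])
        done.append((jid, env.now))          # record completion time

    def source(env, stations, done, interarrival=1.0, njobs=50):
        for jid in range(njobs):
            env.process(job(env, jid, stations, done))
            yield env.timeout(interarrival)

    env = simpy.Environment()
    stations = {name: simpy.Resource(env, capacity=1) for name in SERVICE}
    done = []
    env.process(source(env, stations, done))
    env.run()
    # WIP and outflux trajectories can be derived from the recorded event times.
    print("completed:", len(done), "last exit time:", done[-1][1])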

  6. Analysis on the crime model using dynamical approach

    Science.gov (United States)

    Mohammad, Fazliza; Roslan, Ummu'Atiqah Mohd

    2017-08-01

    Research was carried out to analyze a dynamical model of the spread of crime. A simplified 2-dimensional model is used in this research. The objectives are to investigate the stability of the crime-spread model, to summarize the stability using a bifurcation analysis, and to study the relationship of the basic reproduction number R0 with a parameter of the model. Our results for the stability of equilibrium points show two types of behavior: asymptotically stable and saddle node. The bifurcation analysis shows that the numbers of criminally active and incarcerated individuals increase as the value of a parameter in the model is increased. The result for the relationship of R0 with the parameter shows that as the parameter increases, R0 increases, and the rate of crime increases as well.
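
    The stability analysis described above can be reproduced on a toy 2-dimensional system: find the equilibria, evaluate the Jacobian there, and classify by eigenvalue signs. The equations and parameter values below are assumptions chosen only to illustrate the steps; the paper's exact model is not reproduced here.

    import sympy as sp

    # Hypothetical 2-D crime-spread model (illustrative only):
    #   C' = b + beta*C*(1 - C) - g*C   # criminally active fraction
    #   I' = g*C - r*I                  # incarcerated fraction
    C, I = sp.symbols('C I')
    params = {'b': 0.01, 'beta': 0.8, 'g': 0.5, 'r': 0.3}
    f = sp.Matrix([
        params['b'] + params['beta'] * C * (1 - C) - params['g'] * C,
        params['g'] * C - params['r'] * I,
    ])

    J = f.jacobian([C, I])
    for eq in sp.solve(list(f), [C, I], dict=True):
        eigs = [complex(v) for v in J.subs(eq).eigenvals()]
        stable = all(ev.real < 0 for ev in eigs)
        print({k: float(v) for k, v in eq.items()},
              "asymptotically stable" if stable else "unstable/saddle")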

  7. Modeling the situation awareness by the analysis of cognitive process.

    Science.gov (United States)

    Liu, Shuang; Wanyan, Xiaoru; Zhuang, Damin

    2014-01-01

    To predict changes in situation awareness (SA) for pilots operating with different display interfaces and tasks, a joint qualitative-analysis and quantitative-calculation SA model was proposed. Based on the previously built situation awareness model grounded in attention allocation, the pilot's cognitive process for the situation elements was analyzed according to the ACT-R (Adaptive Control of Thought, Rational) theory, which explains how SA is produced. To verify the validity of this model, 28 subjects performed an instrument supervision task under different experimental conditions. The Situation Awareness Global Assessment Technique (SAGAT), the 10-dimensional Situational Awareness Rating Technique (10-D SART), performance measures and eye movement measures were adopted for evaluating SA under the different conditions. Statistical analysis demonstrated that the changing trend of SA calculated by this model was highly correlated with the experimental results. The situation awareness model can therefore provide a reference for designing new cockpit display interfaces and help reduce human errors.

  8. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory over more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing and in computer and communication systems. • A chapter on ...

  9. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals-based formulated products is also given.

  10. Analysis of Atmospheric Mesoscale Models for Entry, Descent and Landing

    Science.gov (United States)

    Kass, D. M.; Schofield, J. T.; Michaels, T. I.; Rafkin, S. C. R.; Richardson, M. I.; Toigo, A. D.

    2003-01-01

    Each Mars Exploration Rover (MER) is sensitive to the martian winds encountered near the surface during the Entry, Descent and Landing (EDL) process. These winds are strongly influenced by local (mesoscale) conditions. In the absence of suitable wind observations, wind fields predicted by martian mesoscale atmospheric models have been analyzed to guide landing site selection. Two different models were used, the MRAMS model and the Mars MM5 model. In order to encompass both models and render their results useful to the EDL engineering team, a series of statistical techniques were applied to the model results. These analyses cover the high priority landing sites during the expected landing times (1200 to 1500 local time). The number of sites studied is limited by the computational and analysis cost of the mesoscale models.

  11. Bayesian analysis for uncertainty estimation of a canopy transpiration model

    Science.gov (United States)

    Samanta, S.; Mackay, D. S.; Clayton, M. K.; Kruger, E. L.; Ewers, B. E.

    2007-04-01

    A Bayesian approach was used to fit a conceptual transpiration model to half-hourly transpiration rates for a sugar maple (Acer saccharum) stand collected over a 5-month period and probabilistically estimate its parameter and prediction uncertainties. The model used the Penman-Monteith equation with the Jarvis model for canopy conductance. This deterministic model was extended by adding a normally distributed error term. This extension enabled using Markov chain Monte Carlo simulations to sample the posterior parameter distributions. The residuals revealed approximate conformance to the assumption of normally distributed errors. However, minor systematic structures in the residuals at fine timescales suggested model changes that would potentially improve the modeling of transpiration. Results also indicated considerable uncertainties in the parameter and transpiration estimates. This simple methodology of uncertainty analysis would facilitate the deductive step during the development cycle of deterministic conceptual models by accounting for these uncertainties while drawing inferences from data.
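
    A hedged sketch of the approach: a deterministic canopy model extended with a normally distributed error term and sampled with a random-walk Metropolis algorithm (a simple Markov chain Monte Carlo variant). The toy conductance model and all numbers are assumptions, not the paper's Penman-Monteith/Jarvis formulation; the error standard deviation is fixed for brevity, whereas a full analysis would estimate it too.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy Jarvis-type transpiration response to vapor pressure deficit D (kPa):
    # E = gmax * D * exp(-k * D) + normal noise  (illustrative stand-in model)
    def model(theta, D):
        gmax, k = theta
        return gmax * D * np.exp(-k * D)

    D = rng.uniform(0.2, 3.0, 200)
    y = model((4.0, 0.6), D) + rng.normal(0.0, 0.3, D.size)   # synthetic data

    def log_post(theta, sigma=0.3):
        if theta[0] <= 0 or theta[1] <= 0:
            return -np.inf                  # flat priors restricted to positive values
        r = y - model(theta, D)
        return -0.5 * np.sum((r / sigma) ** 2)

    # Random-walk Metropolis sampling of the posterior parameter distribution
    theta = np.array([1.0, 1.0])
    lp = log_post(theta)
    chain = []
    for _ in range(20000):
        prop = theta + rng.normal(0.0, 0.05, 2)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    chain = np.array(chain[5000:])          # discard burn-in
    print("posterior mean:", chain.mean(axis=0), "posterior sd:", chain.std(axis=0))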

  12. Static analysis of a Model of the LDL degradation pathway

    DEFF Research Database (Denmark)

    Pilegaard, Henrik; Nielson, Flemming; Nielson, Hanne Riis

    2005-01-01

    BioAmbients is a derivative of mobile ambients that has shown promise for describing interesting features of the behaviour of biological systems. As for other ambient calculi, static program analysis can be used to compute safe approximations of the behavior of modelled systems. We use these tools to model and analyse the production of cholesterol in living cells and show that we are able to pinpoint the difference in behaviour between models of healthy systems and models of mutated systems giving rise to known diseases.

  13. Prior Sensitivity Analysis in Default Bayesian Structural Equation Modeling.

    Science.gov (United States)

    van Erp, Sara; Mulder, Joris; Oberski, Daniel L

    2017-11-27

    Bayesian structural equation modeling (BSEM) has recently gained popularity because it enables researchers to fit complex models and solve some of the issues often encountered in classical maximum likelihood estimation, such as nonconvergence and inadmissible solutions. An important component of any Bayesian analysis is the prior distribution of the unknown model parameters. Often, researchers rely on default priors, which are constructed in an automatic fashion without requiring substantive prior information. However, the prior can have a serious influence on the estimation of the model parameters, which affects the mean squared error, bias, coverage rates, and quantiles of the estimates. In this article, we investigate the performance of three different default priors: noninformative improper priors, vague proper priors, and empirical Bayes priors-with the latter being novel in the BSEM literature. Based on a simulation study, we find that these three default BSEM methods may perform very differently, especially with small samples. A careful prior sensitivity analysis is therefore needed when performing a default BSEM analysis. For this purpose, we provide a practical step-by-step guide for practitioners to conducting a prior sensitivity analysis in default BSEM. Our recommendations are illustrated using a well-known case study from the structural equation modeling literature, and all code for conducting the prior sensitivity analysis is available in the online supplemental materials. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Statistical Models and Methods for Network Meta-Analysis.

    Science.gov (United States)

    Madden, L V; Piepho, H-P; Paul, P A

    2016-08-01

    Meta-analysis, the methodology for analyzing the results from multiple independent studies, has grown tremendously in popularity over the last four decades. Although most meta-analyses involve a single effect size (summary result, such as a treatment difference) from each study, there are often multiple treatments of interest across the network of studies in the analysis. Multi-treatment (or network) meta-analysis can be used for simultaneously analyzing the results from all the treatments. However, the methodology is considerably more complicated than for the analysis of a single effect size, and there have not been adequate explanations of the approach for agricultural investigations. We review the methods and models for conducting a network meta-analysis based on frequentist statistical principles, and demonstrate the procedures using a published multi-treatment plant pathology data set. A major advantage of network meta-analysis is that correlations of estimated treatment effects are automatically taken into account when an appropriate model is used. Moreover, treatment comparisons may be possible in a network meta-analysis that are not possible in a single study because all treatments of interest may not be included in any given study. We review several models that consider the study effect as either fixed or random, and show how to interpret model-fitting output. We further show how to model the effect of moderator variables (study-level characteristics) on treatment effects, and present one approach to test for the consistency of treatment effects across the network. Online supplemental files give explanations on fitting the network meta-analytical models using SAS.

  15. Analysis of Feature Models Using Alloy: A Survey

    Directory of Open Access Journals (Sweden)

    Anjali Sree-Kumar

    2016-03-01

    Full Text Available Feature Models (FMs are a mechanism to model variability among a family of closely related software products, i.e. a software product line (SPL. Analysis of FMs using formal methods can reveal defects in the specification such as inconsistencies that cause the product line to have no valid products. A popular framework used in research for FM analysis is Alloy, a light-weight formal modeling notation equipped with an efficient model finder. Several works in the literature have proposed different strategies to encode and analyze FMs using Alloy. However, there is little discussion on the relative merits of each proposal, making it difficult to select the most suitable encoding for a specific analysis need. In this paper, we describe and compare those strategies according to various criteria such as the expressivity of the FM notation or the efficiency of the analysis. This survey is the first comparative study of research targeted towards using Alloy for FM analysis. This review aims to identify all the best practices on the use of Alloy, as a part of a framework for the automated extraction and analysis of rich FMs from natural language requirement specifications.

  16. Computer Models for IRIS Control System Transient Analysis

    International Nuclear Information System (INIS)

    Gary D Storrick; Bojan Petrovic; Luca Oriani

    2007-01-01

    This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled 'Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor' focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design--such as the lack of a detailed secondary system or I and C system designs--makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I and C development process. Section

  17. Inverse Analysis and Modeling for Tunneling Thrust on Shield Machine

    Directory of Open Access Journals (Sweden)

    Qian Zhang

    2013-01-01

    Full Text Available With the rapid development of sensor and detection technologies, measured-data analysis plays an increasingly important role in the design and control of heavy engineering equipment. This paper proposes a method for inverse analysis and modeling based on massive on-site measured data, in which dimensional analysis and data mining techniques are combined. The method was applied to the modeling of the tunneling thrust on shield machines, and an explicit expression for thrust prediction was established. Combined with on-site data from a tunneling project in China, the inverse identification of the model coefficients was carried out using the multiple regression method. The model residual was analyzed by statistical methods. By comparing the on-site data and the model-predicted results in two other projects with different tunneling conditions, the feasibility of the model was discussed. The work may provide a scientific basis for the rational design and control of shield tunneling machines and also a new way for massive on-site data analysis of complex engineering systems with nonlinear, multivariable, time-varying characteristics.
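
    A sketch of the inverse identification step under the assumption, suggested by the use of dimensional analysis, that thrust follows a power law in a few machine parameters; taking logarithms turns the fit into multiple linear regression. The variables and data below are synthetic stand-ins for the on-site records, which are not public here.

    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic stand-in data: thrust F assumed to follow a power law in shield
    # diameter D (m), advance rate v (mm/min) and cutterhead speed w (rev/min).
    n = 300
    D = rng.uniform(6.0, 12.0, n)
    v = rng.uniform(20.0, 80.0, n)
    w = rng.uniform(1.0, 3.0, n)
    F = 150.0 * D**2.1 * v**0.4 * w**-0.3 * rng.lognormal(0.0, 0.05, n)  # kN

    # Inverse identification: logs make the power law linear in the unknowns,
    # so the coefficients follow from ordinary least squares.
    X = np.column_stack([np.ones(n), np.log(D), np.log(v), np.log(w)])
    coef, *_ = np.linalg.lstsq(X, np.log(F), rcond=None)
    print("log C =", coef[0], "exponents:", coef[1:])

    resid = np.log(F) - X @ coef
    print("residual sd:", resid.std())   # input to the statistical residual analysis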

  18. Perturbation analysis for Monte Carlo continuous cross section models

    International Nuclear Information System (INIS)

    Kennedy, Chris B.; Abdel-Khalik, Hany S.

    2011-01-01

    Sensitivity analysis, including both its forward and adjoint applications, collectively referred to hereinafter as Perturbation Analysis (PA), is an essential tool to complete Uncertainty Quantification (UQ) and Data Assimilation (DA). PA-assisted UQ and DA have traditionally been carried out for reactor analysis problems using deterministic, as opposed to stochastic, models for radiation transport. This is because PA requires many model executions to quantify how variations in input data, primarily cross sections, affect variations in a model's responses, e.g. detector readings, flux distribution, multiplication factor, etc. Although stochastic models are often sought for their higher accuracy, their repeated execution is at best computationally expensive and in reality intractable for typical reactor analysis problems involving many input data and output responses. Deterministic methods, however, achieve the computational efficiency needed to carry out the PA by reducing problem dimensionality via various spatial and energy homogenization assumptions. This, however, introduces modeling error components into the PA results which propagate to the subsequent UQ and DA analyses. The introduced errors are problem specific and therefore are expected to limit the applicability of UQ and DA analyses to reactor systems that satisfy the introduced assumptions. This manuscript introduces a new method to complete PA employing a continuous cross section stochastic model, performed in a computationally efficient manner. If successful, the modeling error components introduced by deterministic methods could be eliminated, thereby allowing for wider applicability of DA and UQ results. Two MCNP models demonstrate the application of the new method: a critical Pu sphere (Jezebel) and a Pu fast metal array (the Russian BR-1). The PA is completed for reaction rate densities, reaction rate ratios, and the multiplication factor. (author)

  19. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    Science.gov (United States)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.
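
    A minimal sketch of the response-surface step: a full quadratic surface in abort time and destruct delay is fitted by least squares to point estimates of strike probability. The data-generating expression below is an invented placeholder for the actual debris risk model output.

    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical strike-probability estimates on a grid of abort time t (s)
    # and destruct delay d (s); stand-ins for the risk model's point estimates.
    t, d = np.meshgrid(np.linspace(0.0, 120.0, 13), np.linspace(0.0, 5.0, 11))
    t, d = t.ravel(), d.ravel()
    p = 0.02 + 1e-4 * t - 0.002 * d + 1e-6 * t**2 + 5e-6 * t * d
    p = p + rng.normal(0.0, 5e-4, p.size)        # pseudo Monte Carlo noise

    # Full quadratic response surface in (t, d), fitted by least squares.
    X = np.column_stack([np.ones_like(t), t, d, t * d, t**2, d**2])
    beta, *_ = np.linalg.lstsq(X, p, rcond=None)

    def surface(tq, dq):
        """Evaluate the fitted response surface at a query point."""
        return np.dot([1.0, tq, dq, tq * dq, tq**2, dq**2], beta)

    print("p_strike at t=60 s, d=2 s:", surface(60.0, 2.0))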

  20. Analysis of Sting Balance Calibration Data Using Optimized Regression Models

    Science.gov (United States)

    Ulbrich, N.; Bader, Jon B.

    2010-01-01

    Calibration data of a wind tunnel sting balance was processed using a candidate math model search algorithm that recommends an optimized regression model for the data analysis. During the calibration the normal force and the moment at the balance moment center were selected as independent calibration variables. The sting balance itself had two moment gages. Therefore, after analyzing the connection between calibration loads and gage outputs, it was decided to choose the difference and the sum of the gage outputs as the two responses that best describe the behavior of the balance. The math model search algorithm was applied to these two responses. An optimized regression model was obtained for each response. Classical strain gage balance load transformations and the equations of the deflection of a cantilever beam under load are used to show that the search algorithm's two optimized regression models are supported by a theoretical analysis of the relationship between the applied calibration loads and the measured gage outputs. The analysis of the sting balance calibration data set is a rare example of a situation when terms of a regression model of a balance can directly be derived from first principles of physics. In addition, it is interesting to note that the search algorithm recommended the correct regression model term combinations using only a set of statistical quality metrics that were applied to the experimental data during the algorithm's term selection process.

  1. The concept of validation of numerical models for consequence analysis

    International Nuclear Information System (INIS)

    Borg, Audun; Paulsen Husted, Bjarne; Njå, Ove

    2014-01-01

    Numerical models such as computational fluid dynamics (CFD) models are increasingly used in life safety studies and other types of analyses to calculate the effects of fire and explosions. The validity of these models is usually established by benchmark testing. This is done to quantitatively measure the agreement between the predictions provided by the model and the real world represented by observations in experiments. This approach assumes that all variables in the real world relevant for the specific study are adequately measured in the experiments and in the predictions made by the model. In this paper the various definitions of validation for CFD models used for hazard prediction are investigated to assess their implication for consequence analysis in a design phase. In other words, how is uncertainty in the prediction of future events reflected in the validation process? The sources of uncertainty are viewed from the perspective of the safety engineer. An example of the use of a CFD model is included to illustrate the assumptions the analyst must make and how these affect the prediction made by the model. The assessments presented in this paper are based on a review of standards and best practice guides for CFD modeling and the documentation from two existing CFD programs. Our main thrust has been to assess how validation work is performed and communicated in practice. We conclude that the concept of validation adopted for numerical models is adequate in terms of model performance. However, it does not address the main sources of uncertainty from the perspective of the safety engineer. Uncertainty in the input quantities describing future events, which are determined by the model user, outweighs the inaccuracies in the model as reported in validation studies. - Highlights: • Examine the basic concept of validation applied to models for consequence analysis. • Review standards and guides for validation of numerical models. • Comparison of the validation

  2. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pohl, Andrew Phillip [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jordan, Dirk [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprised of a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice among these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of up to 5% of daily energy, which translates directly into a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to the uncertainty arising from each model. We found the residuals arising from the POA irradiance and effective irradiance models to be the dominant contributors to the residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
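
    A compact sketch of the propagation scheme: empirical residual samples for each modeling step are resampled and pushed through the model chain to build an output distribution. The chain and all coefficients below are simplified assumptions, not the report's actual models.

    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical residual samples per modeling step (in practice these come
    # from comparing each model against measurements, as described above).
    resid = {
        "poa": rng.normal(0.0, 0.02, 1000),    # plane-of-array irradiance, relative
        "eff": rng.normal(0.0, 0.015, 1000),   # effective irradiance, relative
        "temp": rng.normal(0.0, 1.0, 1000),    # cell temperature, deg C
    }

    def pv_chain(ghi, t_amb, eps):
        poa = 1.1 * ghi * (1.0 + eps["poa"])            # step 1: transposition
        eeff = 0.95 * poa * (1.0 + eps["eff"])          # step 2: effective irradiance
        tcell = t_amb + 0.03 * eeff + eps["temp"]       # step 3: cell temperature
        return 0.2 * eeff * (1.0 - 0.004 * (tcell - 25.0))  # step 4: DC power

    # Propagate uncertainty by resampling the empirical residuals through the chain.
    samples = []
    for _ in range(5000):
        eps = {k: rng.choice(v) for k, v in resid.items()}
        samples.append(pv_chain(ghi=800.0, t_amb=20.0, eps=eps))
    samples = np.array(samples)
    print("mean DC power:", samples.mean(),
          "relative sd: %.2f%%" % (100.0 * samples.std() / samples.mean()))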

  3. Development of trip coverage analysis methodology - CATHENA trip coverage analysis model

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H.; Huh, J. Y.; Na, Y. H.; Lee, S. Y.; Kim, B. G.; Kim, H. H.; Kim, S. W.; Bae, C. J.; Kim, T. M.; Kim, S. R.; Han, B. S.; Moon, B. J.; Oh, M. T. [Korea Power Engineering Co., Yongin (Korea)

    2001-05-01

    This report describes the CATHENA model for trip coverage analysis. The model is prepared based on the Wolsong 2 design data and consists of the primary heat transport system, shutdown system, steam and feedwater system, reactor regulating system, heat transport pressure and inventory control system, and steam generator level and pressure control system. The new features and the parts modified from the Wolsong 2 CATHENA LOCA model required for trip coverage analysis are described. This model is tested by simulation of the steady state at 100% FP and at several low powers. The cases of power rundown and power runup are also tested. 17 refs., 124 figs., 19 tabs. (Author)

  4. Applying model analysis to a resource-based analysis of the Force and Motion Conceptual Evaluation

    Directory of Open Access Journals (Sweden)

    Trevor I. Smith

    2014-07-01

    Full Text Available Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information regarding the results of investigations using these question clusters than normalized gain graphs. We provide examples from two different institutions to show how the use of model analysis with our redefined clusters can provide previously hidden insight into the effectiveness of instruction.

  5. An effective convolutional neural network model for Chinese sentiment analysis

    Science.gov (United States)

    Zhang, Yu; Chen, Mengdong; Liu, Lianzhong; Wang, Yadong

    2017-06-01

    Nowadays microblogging is getting more and more popular. People are increasingly accustomed to expressing their opinions on Twitter, Facebook and Sina Weibo. Sentiment analysis of microblogs has received significant attention, both in academia and in industry. So far, Chinese microblog analysis still needs much further work. In recent years CNNs have also been used for NLP tasks and have achieved good results. However, these methods ignore the effective use of the large number of existing sentiment resources. For this purpose, we propose a Lexicon-based Sentiment Convolutional Neural Network (LSCNN) model focused on Weibo sentiment analysis, which combines two CNNs, trained individually on sentiment features and on word embeddings, at the fully connected hidden layer. The experimental results show that our model outperforms a CNN model using only word-embedding features on the microblog sentiment analysis task.
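
    A sketch of a two-branch architecture of the kind described, written in PyTorch: one convolutional branch over word embeddings, one over per-word lexicon sentiment scores, concatenated at the fully connected layer. Layer sizes, filter counts, and the lexicon feature dimension are assumptions; the authors' exact architecture is not specified here.

    import torch
    import torch.nn as nn

    class LSCNN(nn.Module):
        """Two parallel 1-D CNNs, one over word embeddings and one over per-word
        lexicon sentiment scores, merged at the fully connected layer."""
        def __init__(self, vocab=5000, emb=128, feat=8, nfilt=64, nclass=2):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb)
            self.conv_w = nn.Conv1d(emb, nfilt, kernel_size=3, padding=1)
            self.conv_s = nn.Conv1d(feat, nfilt, kernel_size=3, padding=1)
            self.pool = nn.AdaptiveMaxPool1d(1)
            self.fc = nn.Linear(2 * nfilt, nclass)

        def forward(self, tokens, senti):
            # tokens: (batch, seq) word ids; senti: (batch, seq, feat) lexicon scores
            w = self.embed(tokens).transpose(1, 2)      # (batch, emb, seq)
            s = senti.transpose(1, 2)                   # (batch, feat, seq)
            w = self.pool(torch.relu(self.conv_w(w))).squeeze(-1)
            s = self.pool(torch.relu(self.conv_s(s))).squeeze(-1)
            return self.fc(torch.cat([w, s], dim=1))    # fuse at the hidden layer

    model = LSCNN()
    tokens = torch.randint(0, 5000, (4, 40))
    senti = torch.randn(4, 40, 8)
    print(model(tokens, senti).shape)   # torch.Size([4, 2])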

  6. Beyond citation analysis: a model for assessment of research impact.

    Science.gov (United States)

    Sarli, Cathy C; Dubinsky, Ellen K; Holmes, Kristi L

    2010-01-01

    Is there a means of assessing research impact beyond citation analysis? The case study took place at the Washington University School of Medicine Becker Medical Library. This case study analyzed the research study process to identify indicators beyond citation count that demonstrate research impact. The authors discovered a number of indicators that can be documented for assessment of research impact, as well as resources to locate evidence of impact. As a result of the project, the authors developed a model for assessment of research impact, the Becker Medical Library Model for Assessment of Research. Assessment of research impact using traditional citation analysis alone is not a sufficient tool for assessing the impact of research findings, and it is not predictive of subsequent clinical applications resulting in meaningful health outcomes. The Becker Model can be used by both researchers and librarians to document research impact to supplement citation analysis.

  7. Modeling and Analysis of Mixed Synchronous/Asynchronous Systems

    Science.gov (United States)

    Driscoll, Kevin R.; Madl, Gabor; Hall, Brendan

    2012-01-01

    Practical safety-critical distributed systems must integrate safety-critical and non-critical data in a common platform. Safety-critical systems almost always consist of isochronous components that have synchronous or asynchronous interfaces with other components. Many of these systems also support a mix of synchronous and asynchronous interfaces. This report presents a study on the modeling and analysis of asynchronous, synchronous, and mixed synchronous/asynchronous systems. We build on the SAE Architecture Analysis and Design Language (AADL) to capture architectures for analysis. We present preliminary work targeted at capturing mixed low- and high-criticality data, as well as real-time properties, in a common Model of Computation (MoC). An abstract but representative test specimen system was created as the system to be modeled.

  8. Analysis of deterministic cyclic gene regulatory network models with delays

    CERN Document Server

    Ahsen, Mehmet Eren; Niculescu, Silviu-Iulian

    2015-01-01

    This brief examines a deterministic, ODE-based model for gene regulatory networks (GRN) that incorporates nonlinearities and time-delayed feedback. An introductory chapter provides some insights into molecular biology and GRNs. The mathematical tools necessary for studying the GRN model are then reviewed, in particular Hill functions and Schwarzian derivatives. One chapter is devoted to the analysis of GRNs under negative feedback with time delays and a special case of a homogenous GRN is considered. Asymptotic stability analysis of GRNs under positive feedback is then considered in a separate chapter, in which conditions leading to bi-stability are derived. Graduate and advanced undergraduate students and researchers in control engineering, applied mathematics, systems biology and synthetic biology will find this brief to be a clear and concise introduction to the modeling and analysis of GRNs.
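
    A toy instance of the class of models discussed: a gene product under delayed negative feedback through a repressive Hill function, simulated with a fixed-step Euler scheme and a history buffer. Parameter values are illustrative assumptions; for sufficiently large delay and Hill coefficient the fixed point loses stability and sustained oscillations appear.

    import numpy as np

    # Delayed negative feedback: x'(t) = h(x(t - tau)) - gamma * x(t)
    def hill_repress(u, v0=2.0, K=1.0, n=4):
        return v0 / (1.0 + (u / K) ** n)

    def simulate(tau=3.0, gamma=0.5, dt=0.01, T=100.0, x0=0.1):
        nsteps = int(T / dt)
        delay = int(tau / dt)
        x = np.empty(nsteps)
        x[:delay + 1] = x0                   # constant history on [-tau, 0]
        for i in range(delay, nsteps - 1):
            x[i + 1] = x[i] + dt * (hill_repress(x[i - delay]) - gamma * x[i])
        return x

    x = simulate()
    # A nonzero late-time min/max spread indicates sustained oscillation.
    print("late-time min/max:", x[-2000:].min(), x[-2000:].max())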

  9. Modeling and Verification of Insider Threats Using Logical Analysis

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2017-01-01

    In this paper, we combine formal modeling and analysis of infrastructures of organizations with sociological explanation to provide a framework for insider threat analysis. We use the higher order logic (HOL) proof assistant Isabelle/HOL to support this framework. In the formal model, we exhibit and use a common trick from the formal verification of security protocols, showing that it is applicable to insider threats. We introduce briefly a three-step process of social explanation, illustrating that it can be applied fruitfully to the characterization of insider threats, and we introduce the insider theory constructed in Isabelle that implements this process of social explanation. To validate that the social explanation is generally useful for the analysis of insider threats and to demonstrate our framework, we model and verify the insider threat patterns of Entitled Independent and Ambitious Leader...

  10. Analysis and Enhancements of a Prolific Macroscopic Model of Epilepsy

    Directory of Open Access Journals (Sweden)

    Christopher Fietkiewicz

    2016-01-01

    Full Text Available Macroscopic models of epilepsy can deliver surprisingly realistic EEG simulations. In the present study, a prolific series of models is evaluated with regard to theoretical and computational concerns, and enhancements are developed. Specifically, we analyze three aspects of the models: (1) Using dynamical systems analysis, we demonstrate and explain the presence of direct current potentials in the simulated EEG that were previously undocumented. (2) We explain how the system was not ideally formulated for numerical integration of stochastic differential equations. A reformulated system is developed to support proper methodology. (3) We explain an unreported contradiction in the published model specification regarding the use of a mathematical reduction method. We then use the method to reduce the number of equations and further improve the computational efficiency. The intent of our critique is to enhance the evolution of macroscopic modeling of epilepsy and assist others who wish to explore this exciting class of models further.

  11. Analysis and Enhancements of a Prolific Macroscopic Model of Epilepsy

    Science.gov (United States)

    Fietkiewicz, Christopher; Loparo, Kenneth A.

    2016-01-01

    Macroscopic models of epilepsy can deliver surprisingly realistic EEG simulations. In the present study, a prolific series of models is evaluated with regard to theoretical and computational concerns, and enhancements are developed. Specifically, we analyze three aspects of the models: (1) Using dynamical systems analysis, we demonstrate and explain the presence of direct current potentials in the simulated EEG that were previously undocumented. (2) We explain how the system was not ideally formulated for numerical integration of stochastic differential equations. A reformulated system is developed to support proper methodology. (3) We explain an unreported contradiction in the published model specification regarding the use of a mathematical reduction method. We then use the method to reduce the number of equations and further improve the computational efficiency. The intent of our critique is to enhance the evolution of macroscopic modeling of epilepsy and assist others who wish to explore this exciting class of models further. PMID:27144054

  12. Analysis of snow feedbacks in 14 general circulation models

    Science.gov (United States)

    Randall, D. A.; Cess, R. D.; Blanchet, J. P.; Chalita, S.; Colman, R.; Dazlich, D. A.; Del Genio, A. D.; Keup, E.; Lacis, A.; Le Treut, H.; Liang, X.-Z.; McAvaney, B. J.; Mahfouf, J. F.; Meleshko, V. P.; Morcrette, J.-J.; Norris, P. M.; Potter, G. L.; Rikus, L.; Roeckner, E.; Royer, J. F.; Schlese, U.; Sheinin, D. A.; Sokolov, A. P.; Taylor, K. E.; Wetherald, R. T.; Yagai, I.; Zhang, M.-H.

    1994-10-01

    Snow feedbacks produced by 14 atmospheric general circulation models have been analyzed through idealized numerical experiments. Included in the analysis is an investigation of the surface energy budgets of the models. Negative or weak positive snow feedbacks occurred in some of the models, while others produced strong positive snow feedbacks. These feedbacks are due not only to melting snow, but also to increases in boundary temperature, changes in air temperature, changes in water vapor, and changes in cloudiness. As a result, the net response of each model is quite complex. We analyze in detail the responses of one model with a strong positive snow feedback and another with a weak negative snow feedback. Some of the models include a temperature dependence of the snow albedo, and this has significantly affected the results.

  13. Evaluation of Cost Models and Needs & Gaps Analysis

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2014-01-01

    This report ’D3.1—Evaluation of Cost Models and Needs & Gaps Analysis’ provides an analysis of existing research related to the economics of digital curation and cost & benefit modelling. It reports upon the investigation of how well current models and tools meet stakeholders’ needs for calculating and comparing financial information. Based on this evaluation, it aims to point out gaps that need to be bridged in order to increase the uptake of cost & benefit modelling and good practices that will enable costing and comparison of the costs of alternative scenarios—which in turn provides a starting point for a more efficient use of resources for digital curation. To facilitate and clarify the model evaluation, the report first outlines a basic terminology and a general description of the characteristics of cost and benefit models. The report then describes how the ten current and emerging cost and benefit

  14. Analysis of CP^{N-1} sigma models via projective structures

    International Nuclear Information System (INIS)

    Post, S; Grundland, A M

    2012-01-01

    This paper represents a study of projector solutions to the Euclidean CP^{N-1} sigma model in two dimensions and their associated surfaces immersed in the su(N) Lie algebra. Any solution of the CP^{N-1} sigma model defined on the extended complex plane with finite action can be written as a raising operator acting on a holomorphic one. Here the proof is formulated in terms of rank-1 projectors, so it is explicitly gauge invariant. We apply these results to the analysis of surfaces associated with the CP^{N-1} models defined using the generalized Weierstrass formula for immersion. We show that the surfaces are conformally parametrized by the Lagrangian density, with finite area equal to the action of the model, and express several other geometrical characteristics of the surface in terms of the physical quantities of the model. Finally, we provide necessary and sufficient conditions for a surface to be related to a CP^{N-1} sigma model
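
    For orientation, the rank-1 projector formulation referred to above can be stated as follows; conventions, signs, and normalizations vary between papers, so this is one common form rather than the paper's exact expressions:

    \[
    P \;=\; \frac{z\, z^{\dagger}}{z^{\dagger} z}, \qquad P^{\dagger} = P, \quad P^{2} = P, \quad \operatorname{tr} P = 1,
    \]
    \[
    S \;=\; \int_{\mathbb{C}} \operatorname{tr}\bigl(\partial_{+} P \,\partial_{-} P\bigr)\, d\xi_{+}\, d\xi_{-}, \qquad \bigl[\partial_{+}\partial_{-} P,\; P\bigr] \;=\; 0,
    \]
    where \( z \in \mathbb{C}^{N} \setminus \{0\} \) is the homogeneous coordinate, \( \xi_{\pm} \) are the complex coordinates, and the second equation is the gauge-invariant Euler-Lagrange equation. A Weierstrass-type immersion then assigns to each solution a surface in su(N) via contour integrals of the commutators \( [\partial_{\pm} P, P] \), up to sign and normalization conventions.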

  15. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    This paper investigates the characteristics of typical optimisation models within Distribution Network Design. Fourteen models known from the literature are thoroughly analysed. Through this analysis, a schematic approach to the categorisation of distribution network design models is developed, which may also serve educational purposes. The dimensions covered in the categorisation include fixed vs. general networks, specialised vs. general nodes, linear vs. nonlinear costs, single vs. multi commodity, uncapacitated vs. capacitated activities, single vs. multi modal, and static vs. dynamic. The models examined address both strategic and tactical planning. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as being an art manual or recipe for constructing such a model.

  16. Formability models for warm sheet metal forming analysis

    Science.gov (United States)

    Jiang, Sen

    Several closed-form models for the prediction of strain-space sheet metal formability as a function of temperature and strain rate are proposed. The proposed models require only failure strain information from the uniaxial tension test at an elevated temperature and failure strain information from the traditionally defined strain-space forming limit diagram at room temperature, thereby offering a full forming limit description without expensive experimental studies for multiple modes of deformation at the elevated temperature. The power-law, Voce, and Johnson-Cook hardening models are considered along with the Hill 48 and Logan-Hosford yield criteria. Acceptable correlations between theory and experiment are reported for all the models under a plane strain condition. Among all the proposed models, the model featuring the Johnson-Cook hardening model and Logan-Hosford yield behavior (LHJC model) was shown to correlate best with experiment. The sensitivity of the model with respect to various forming parameters is discussed. This work is significant to those aiming to incorporate closed-form formability models directly into numerical simulation programs for the design and analysis of products manufactured through the warm sheet metal forming process. An improvement based upon Swift's diffuse necking theory is suggested in order to enhance the reliability of the model for biaxial stretch conditions. Theory relating to this improvement is provided in Appendix B.

  17. Visual modeling in an analysis of multidimensional data

    Science.gov (United States)

    Zakharova, A. A.; Vekhter, E. V.; Shklyar, A. V.; Pak, A. J.

    2018-01-01

    The article proposes an approach to solving visualization problems and the subsequent analysis of multidimensional data. Requirements for the properties of visual models created to solve analysis problems are described. As a promising direction for the development of visual analysis tools for multidimensional and voluminous data, active use of factors of subjective perception and of dynamic visualization is suggested. Practical results of solving a multidimensional data analysis problem are shown using the example of a visual model of empirical data on the current state of research into processes for obtaining silicon carbide by an electric arc method. Solving this problem yields several results: first, an idea of the possibilities for determining a development strategy for the domain; second, an assessment of the reliability of the published data on this subject; and third, a view of how the areas of attention of researchers have changed over time.

  18. Credible baseline analysis for multi-model public policy studies

    Energy Technology Data Exchange (ETDEWEB)

    Parikh, S.C.; Gass, S.I.

    1981-01-01

    The nature of public decision-making and resource allocation is such that many complex interactions can best be examined and understood by quantitative analysis. Most organizations do not possess the totality of models and needed analytical skills to perform detailed and systematic quantitative analysis. Hence, the need for coordinated, multi-organization studies that support public decision-making has grown in recent years. This trend is expected not only to continue, but to increase. This paper describes the authors' views on the process of multi-model analysis based on their participation in an analytical exercise, the ORNL/MITRE Study. One of the authors was the exercise coordinator. During the study, the authors were concerned with the issue of measuring and conveying credibility of the analysis. This work led them to identify several key determinants, described in this paper, that could be used to develop a rating of credibility.

  19. Sensitivity analysis techniques for models of human behavior.

    Energy Technology Data Exchange (ETDEWEB)

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn about which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods create similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.

  20. Modeling data irregularities and structural complexities in data envelopment analysis

    CERN Document Server

    Zhu, Joe

    2007-01-01

    In a relatively short period of time, Data Envelopment Analysis (DEA) has grown into a powerful quantitative, analytical tool for measuring and evaluating performance. It has been successfully applied to a whole variety of problems in many different contexts worldwide. This book deals with the micro aspects of handling and modeling data issues in DEA problems. DEA's use has grown with its capability of dealing with the complex "service industry" and "public service domain" types of problems that require modeling of both qualitative and quantitative data. This handbook treatment deals with specific data problems including: imprecise or inaccurate data; missing data; qualitative data; outliers; undesirable outputs; quality data; statistical analysis; software; and other data aspects of modeling complex DEA problems. In addition, the book demonstrates how to visualize DEA results when the data are more than 3-dimensional, and how to identify efficient units quickly and accurately.

  1. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for, and a basis for developing requirements of, the next generation of HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed; due to its complexity and input requirements, it can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and computer implementation

  2. Causal Analysis for Performance Modeling of Computer Programs

    Directory of Open Access Journals (Sweden)

    Jan Lemeire

    2007-01-01

    Full Text Available Causal modeling and the accompanying learning algorithms provide useful extensions for in-depth statistical investigation and automation of performance modeling. We enlarged the scope of existing causal structure learning algorithms by using the form-free information-theoretic concept of mutual information and by introducing the complexity criterion for selecting direct relations among equivalent relations. The underlying probability distribution of experimental data is estimated by kernel density estimation. We then reported on the benefits of a dependency analysis and the decompositional capacities of causal models. Useful qualitative models, providing insight into the role of every performance factor, were inferred from experimental data. This paper reports on the results for a LU decomposition algorithm and on the study of the parameter sensitivity of the Kakadu implementation of the JPEG-2000 standard. Next, the analysis was used to search for generic performance characteristics of the applications.

  3. Multivariable modeling and multivariate analysis for the behavioral sciences

    CERN Document Server

    Everitt, Brian S

    2009-01-01

    Multivariable Modeling and Multivariate Analysis for the Behavioral Sciences shows students how to apply statistical methods to behavioral science data in a sensible manner. Assuming some familiarity with introductory statistics, the book analyzes a host of real-world data to provide useful answers to real-life issues.The author begins by exploring the types and design of behavioral studies. He also explains how models are used in the analysis of data. After describing graphical methods, such as scatterplot matrices, the text covers simple linear regression, locally weighted regression, multip

  4. Data Envelopment Analysis (DEA) Model in Operation Management

    Science.gov (United States)

    Malik, Meilisa; Efendi, Syahril; Zarlis, Muhammad

    2018-01-01

    Quality management is an effective system in operation management that develops, maintains, and improves quality across groups of companies, enabling marketing, production, and service at the most economical level while ensuring customer satisfaction. Many companies practice quality management to improve their business performance. One form of performance measurement is the measurement of efficiency, and one of the tools that can be used to assess the efficiency of companies' performance is Data Envelopment Analysis (DEA). The aim of this paper is to use Data Envelopment Analysis (DEA) models to assess the efficiency of quality management. The CCR, BCC, and SBM models for assessing the efficiency of quality management are explained.
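
    A sketch of the input-oriented CCR model in its envelopment form, solved as a linear program with scipy; the data are toy numbers for five decision-making units (DMUs) with two inputs and one output, chosen only for illustration.

    import numpy as np
    from scipy.optimize import linprog

    # Toy data: rows = DMUs; X holds inputs, Y holds outputs.
    X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])
    Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])
    n, m = X.shape
    s = Y.shape[1]

    def ccr_efficiency(j0):
        """Input-oriented CCR envelopment LP:
        min theta  s.t.  X^T lam <= theta * x_j0,  Y^T lam >= y_j0,  lam >= 0."""
        c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lam_1..lam_n]
        A_in = np.c_[-X[j0][:, None], X.T]          # inputs:  X^T lam - theta x0 <= 0
        A_out = np.c_[np.zeros((s, 1)), -Y.T]       # outputs: -Y^T lam <= -y0
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[j0]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.fun                              # optimal theta in (0, 1]

    for j in range(n):
        print("DMU", j, "CCR efficiency: %.3f" % ccr_efficiency(j))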

  5. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered, and the computers on the network are assumed to be equipped with antivirus software. A computer virus model is established. Through the analysis of the model, the disease-free and endemic equilibrium points are calculated, and the stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
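
    An illustrative SIR-type system with node turnover, showing the analysis steps the abstract describes: the disease-free equilibrium, a basic reproduction number, and a simulation toward the endemic state. The equations and parameters are assumptions, not the paper's exact model.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Susceptible/infected computers with arrival b, removal d, infection beta,
    # and cure rate r supplied by the antivirus software (illustrative values):
    #   S' = b - beta*S*I - d*S + r*I
    #   I' = beta*S*I - (d + r)*I
    b, d, beta, r = 0.2, 0.1, 0.5, 0.3

    def rhs(t, y):
        S, I = y
        return [b - beta * S * I - d * S + r * I, beta * S * I - (d + r) * I]

    # Basic reproduction number at the disease-free equilibrium S* = b/d:
    R0 = beta * (b / d) / (d + r)
    print("R0 =", R0)   # R0 > 1 here, so the endemic equilibrium attracts

    sol = solve_ivp(rhs, (0.0, 200.0), [b / d, 0.01])
    print("final state (S, I):", sol.y[:, -1])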

  6. Model-based analysis and simulation of regenerative heat wheel

    DEFF Research Database (Denmark)

    Wu, Zhuang; Melnik, Roderick V. N.; Borup, F.

    2006-01-01

    The rotary regenerator (also called the heat wheel) is an important component of energy-intensive sectors and is used in many heat recovery systems. In this paper, a model-based analysis of a rotary regenerator is carried out, with a major emphasis on the development and implementation of mathematical models for the thermal analysis of the fluid and wheel matrix. The effect of heat conduction in the direction of the fluid flow is taken into account, and the influence of variations in the rotating speed of the wheel as well as other characteristics (ambient temperature, airflow and geometric size...

  7. Business Models For SMEs In Bandung: SWOT Analysis

    Directory of Open Access Journals (Sweden)

    Senen Machmud

    2014-04-01

    Full Text Available The main objective of this study is to find a business model for small and medium-sized enterprises (SMEs) using a management-strategy and business-strategy approach. This research helps researchers, owners of SMEs, and the government to develop a framework for management strategy and business strategy and to determine how best to arrive at business models. This study is valuable considering the limited amount of empirical work previously done on the topic in question. The management strategies result from an internal and external factor analysis, followed by an analysis of strengths, weaknesses, opportunities, and threats (SWOT).

  8. Experimental and numerical analysis of a knee endoprosthesis numerical model

    Directory of Open Access Journals (Sweden)

    L. Zach

    2016-07-01

    Full Text Available The aim of this study is to create and verify a numerical model of a Medin Modular orthopedic knee-joint implant by investigating contact pressure, its distribution and the contact surfaces. An experiment using Fuji Prescale pressure-sensitive films and a finite element analysis (FEA) using Abaqus software were carried out. The experimental data were evaluated using a specially designed program and compared with the results of the analysis. The evaluation program was constructed on the basis of results obtained from a supplementary calibration experiment. The applicability of the numerical model for predicting real endoprosthesis behavior was proven on the basis of the good correlation between the two.

  9. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications, providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical…

  10. Variance-based sensitivity analysis for wastewater treatment plant modelling.

    Science.gov (United States)

    Cosenza, Alida; Mannina, Giorgio; Vanrolleghem, Peter A; Neumann, Marc B

    2014-02-01

    Global sensitivity analysis (GSA) is a valuable tool to support the use of mathematical models that characterise technical or natural systems. In the field of wastewater modelling, most of the recent applications of GSA use either regression-based methods, which require close to linear relationships between the model outputs and model factors, or screening methods, which only yield qualitative results. However, due to the characteristics of membrane bioreactors (MBR) (non-linear kinetics, complexity, etc.) there is an interest to adequately quantify the effects of non-linearity and interactions. This can be achieved with variance-based sensitivity analysis methods. In this paper, the Extended Fourier Amplitude Sensitivity Testing (Extended-FAST) method is applied to an integrated activated sludge model (ASM2d) for an MBR system including microbial product formation and physical separation processes. Twenty-one model outputs located throughout the different sections of the bioreactor and 79 model factors are considered. Significant interactions among the model factors are found. Contrary to previous GSA studies for ASM models, we find the relationship between variables and factors to be non-linear and non-additive. By analysing the pattern of the variance decomposition along the plant, the model factors having the highest variance contributions were identified. This study demonstrates the usefulness of variance-based methods in membrane bioreactor modelling where, due to the presence of membranes and different operating conditions than those typically found in conventional activated sludge systems, several highly non-linear effects are present. Further, the obtained results highlight the relevant role played by the modelling approach for MBR taking into account simultaneously biological and physical processes.
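
    The variance-based idea can be sketched on a standard test function rather than the ASM2d model: first-order indices S_i = Var(E[Y|X_i]) / Var(Y) are estimated below by binning Monte Carlo samples. The test function and sample sizes are illustrative assumptions.

      # A minimal sketch of variance-based first-order sensitivity indices,
      # estimated by binning Monte Carlo samples (toy model, not ASM2d).
      import numpy as np

      rng = np.random.default_rng(0)
      n, bins = 100_000, 50
      X = rng.uniform(-np.pi, np.pi, size=(n, 3))
      # Ishigami-style test function: non-linear and non-additive
      Y = np.sin(X[:, 0]) + 7 * np.sin(X[:, 1])**2 + 0.1 * X[:, 2]**4 * np.sin(X[:, 0])

      var_Y = Y.var()
      for i in range(3):
          # Variance of bin-conditional means approximates Var(E[Y | X_i])
          edges = np.quantile(X[:, i], np.linspace(0, 1, bins + 1))
          idx = np.clip(np.digitize(X[:, i], edges) - 1, 0, bins - 1)
          cond_means = np.array([Y[idx == b].mean() for b in range(bins)])
          print(f"S_{i+1} ~ {cond_means.var() / var_Y:.3f}")

    For this test function the indices do not sum to one, which is exactly the signature of interaction effects that screening and regression-based methods miss.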

  11. Sensitivity analysis of the terrestrial food chain model FOOD III

    International Nuclear Information System (INIS)

    Zach, Reto.

    1980-10-01

    As a first step in constructing a terrestrial food chain model suitable for long-term waste management situations, a numerical sensitivity analysis of FOOD III was carried out to identify important model parameters. The analysis involved 42 radionuclides, four pathways, 14 food types, 93 parameters and three percentages of parameter variation. We also investigated the importance of radionuclides, pathways and food types. The analysis involved a simple contamination model to render results from individual pathways comparable. The analysis showed that radionuclides vary greatly in their dose contribution to each of the four pathways, but relative contributions to each pathway are very similar. Man's and animals' drinking water pathways are much more important than the leaf and root pathways. However, this result depends on the contamination model used. All the pathways contain unimportant food types. Considering the number of parameters involved, FOOD III has too many different food types. Many of the parameters of the leaf and root pathway are important. However, this is true for only a few of the parameters of the animals' drinking water pathway, and for neither of the two parameters of man's drinking water pathway. The radiological decay constant increases the variability of these results. The dose factor is consistently the most important variable, and it explains most of the variability of radionuclide doses within pathways. Consideration of the variability of dose factors is important in contemporary as well as long-term waste management assessment models, if realistic estimates are to be made. (auth)

  12. Mathematical models of ABE fermentation: review and analysis.

    Science.gov (United States)

    Mayank, Rahul; Ranjan, Amrita; Moholkar, Vijayanand S

    2013-12-01

    Among different liquid biofuels that have emerged in the recent past, biobutanol produced via fermentation processes is of special interest due to its properties being very similar to those of gasoline. For an effective design, scale-up, and optimization of the acetone-butanol-ethanol (ABE) fermentation process, it is necessary to have insight into the micro- and macro-mechanisms of the process. The mathematical models for ABE fermentation are efficient tools for this purpose, which have evolved from simple stoichiometric fermentation equations in the 1980s to the recent sophisticated and elaborate kinetic models based on metabolic pathways. In this article, we have reviewed the literature published in the area of mathematical modeling of ABE fermentation. We have tried to present an analysis of these models in terms of their potency in describing the overall physiology of the process, design features, and mode of operation, along with comparison and validation with experimental results. In addition, we have also highlighted important facets of these models such as metabolic pathways, basic kinetics of different metabolites, biomass growth, inhibition modeling and other additional features such as cell retention and immobilized cultures. Our review also covers the mathematical modeling of the downstream processing of ABE fermentation, i.e. recovery and purification of solvents through flash distillation, liquid-liquid extraction, and pervaporation. We believe that this review will be a useful source of information and analysis on mathematical models for ABE fermentation for both the appropriate scientific and engineering communities.
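
    A minimal sketch of the kind of kinetic model reviewed: Monod growth with product inhibition and Luedeking-Piret product formation. The parameter values, yields, and initial conditions are illustrative assumptions, not fitted ABE kinetics.

      # A toy fermentation kinetic model: biomass X, substrate S, solvents P.
      import numpy as np
      from scipy.integrate import solve_ivp

      mu_max, Ks = 0.3, 2.0        # 1/h, g/L (assumed)
      Pmax = 20.0                  # inhibitory solvent level, g/L (assumed)
      Yxs, Yps = 0.4, 0.4          # biomass and product yields on substrate
      alpha, beta = 2.0, 0.02      # growth- and non-growth-associated production

      def ferm(t, y):
          X, S, P = y
          S = max(S, 0.0)
          # Monod growth with linear product inhibition
          mu = mu_max * S / (Ks + S) * max(0.0, 1.0 - P / Pmax)
          # Luedeking-Piret specific production rate
          qp = alpha * mu + (beta if S > 0 else 0.0)
          dX = mu * X
          dP = qp * X
          dS = -(mu / Yxs + qp / Yps) * X if S > 0 else 0.0
          return [dX, dS, dP]

      sol = solve_ivp(ferm, (0, 72), [0.1, 60.0, 0.0])
      X, S, P = sol.y[:, -1]
      print(f"after 72 h: biomass {X:.1f}, substrate {S:.1f}, solvents {P:.1f} g/L")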

  13. Hybrid modeling and empirical analysis of automobile supply chain network

    Science.gov (United States)

    Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying

    2017-05-01

    Based on the connection mechanism of nodes, which automatically select upstream and downstream agents, a simulation model for the dynamic evolutionary process of a consumer-driven automobile supply chain is established by integrating ABM and discrete modeling in a GIS-based map. First, the model's rationality is demonstrated by analyzing the consistency of sales, and of changes in various agent parameters, between the simulation model and a real automobile supply chain. Second, through complex network theory, hierarchical structures of the model and relationships of networks at different levels are analyzed to calculate various characteristic parameters such as mean distance, mean clustering coefficients, and degree distributions, verifying that the model is a typical scale-free, small-world network. Finally, the motion law of this model is analyzed from the perspective of complex self-adaptive systems. The chaotic state of the simulation system is verified, which suggests that this system has typical nonlinear characteristics. This model not only macroscopically illustrates the dynamic evolution of complex networks of the automobile supply chain but also microcosmically reflects the business process of each agent. Moreover, constructing and simulating the system by combining CAS theory and complex networks supplies a novel method for supply chain analysis, as well as a theoretical basis and experience for the supply chain analysis of auto companies.
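
    The network measures mentioned (mean distance, clustering, degree distribution) can be reproduced on a scale-free surrogate graph; the sketch below uses a Barabási-Albert graph standing in for the supply chain network, with an assumed size.

      # A minimal sketch of the complex-network measurements on a surrogate graph.
      import networkx as nx
      import numpy as np

      G = nx.barabasi_albert_graph(n=500, m=2, seed=7)   # scale-free surrogate

      print("mean distance:", round(nx.average_shortest_path_length(G), 3))
      print("mean clustering coefficient:", round(nx.average_clustering(G), 3))

      # A roughly straight line on log-log axes indicates a power-law
      # (scale-free) degree distribution
      degrees = np.array([d for _, d in G.degree()])
      values, counts = np.unique(degrees, return_counts=True)
      for v, c in list(zip(values, counts))[:5]:
          print(f"degree {v}: {c} nodes")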

  14. Linear modeling of rotorcraft for stability analysis and preliminary design

    OpenAIRE

    Wirth, Walter M.

    1993-01-01

    Approved for public release; distribution is unlimited. This thesis investigates linear state space modeling of single main rotor helicopters culminating in a computer program that can be used for 1) stability and control analysis for any single main rotor helicopter or 2) preliminary design of a helicopter. The trim solution for a flight condition is found, the aircraft is perturbed about the nominal point, and the stability and control derivatives are determined. State space models and ...
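
    The stability part of such an analysis reduces to the eigenvalues of the linearized system matrix A about the trim point: a mode is stable when its eigenvalue has a negative real part. The matrix below is a made-up longitudinal-dynamics example, not a real helicopter model.

      # A minimal sketch of linear state-space stability analysis.
      import numpy as np

      # States: forward speed u, vertical speed w, pitch rate q, pitch angle theta
      A = np.array([
          [-0.02,  0.01,  0.0, -9.81],
          [-0.10, -0.80, 20.0,  0.00],
          [ 0.00, -0.02, -1.5,  0.00],
          [ 0.00,  0.00,  1.0,  0.00],
      ])

      for lam in np.linalg.eigvals(A):
          verdict = "stable" if lam.real < 0 else "unstable"
          print(f"eigenvalue {lam:.3f} -> {verdict} mode")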

  15. Mathematical Modeling and Analysis of Classified Marketing of Agricultural Products

    OpenAIRE

    WANG, Fengying

    2014-01-01

    Classified marketing of agricultural products was analyzed using the Logistic Regression Model. This method can take full advantage of the information in an agricultural product database to find the factors influencing how well agricultural products sell, and to make a corresponding quantitative analysis. Using this model, it is also possible to predict sales of agricultural products and to provide a reference for mapping out individualized sales strategies for popularizing agricultural products.
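
    A sketch of the approach with scikit-learn; the product attributes and the synthetic "best seller" rule are assumptions used only to show the workflow.

      # A minimal logistic-regression sketch for classified product marketing.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(5)
      n = 500
      price = rng.uniform(1, 10, n)        # unit price
      freshness = rng.uniform(0, 1, n)     # freshness score
      promotion = rng.integers(0, 2, n)    # promoted or not
      X = np.column_stack([price, freshness, promotion])
      # Synthetic rule: cheap, fresh, promoted products sell best
      logit = -1.0 - 0.5 * price + 3.0 * freshness + 1.5 * promotion
      y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

      model = LogisticRegression().fit(X, y)
      print("coefficients (price, freshness, promotion):", model.coef_.round(2))
      print("P(best seller) for a cheap, fresh, promoted product:",
            model.predict_proba([[2.0, 0.9, 1]])[0, 1].round(3))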

  16. Analysis of a Model for Computer Virus Transmission

    OpenAIRE

    Qin, Peng

    2015-01-01

    Computer viruses remain a significant threat to computer networks. In this paper, the addition of new computers to the network and the removal of old computers from it are considered, and the computers on the network are assumed to be equipped with antivirus software. A computer virus model is established. Through the analysis of the model, the disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our t...

  17. Bifurcation analysis of parametrically excited bipolar disorder model

    Science.gov (United States)

    Nana, Laurent

    2009-02-01

    Bipolar II disorder is characterized by alternating hypomanic and major depressive episodes. We model the periodic mood variations of a bipolar II patient with a negatively damped harmonic oscillator. The medications administered to the patient are modeled via a forcing function that is capable of stabilizing the mood variations and of varying their amplitude. We analyze analytically, using a perturbation method, the amplitude and stability of limit cycles and check this analysis with numerical simulations.
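
    As a hedged illustration of an oscillator that is negatively damped at small amplitude yet settles onto a stable limit cycle, the sketch below integrates a van der Pol-type equation; this stands in for the stabilized mood model, and the parameter value is an assumption, not the paper's.

      # A toy limit-cycle oscillator: negative damping for |x| < 1, saturation
      # at larger amplitude, so trajectories converge to a stable limit cycle.
      import numpy as np
      from scipy.integrate import solve_ivp

      mu = 0.5   # strength of the amplitude-dependent damping (assumed)

      def mood(t, y):
          x, v = y
          return [v, mu * (1 - x**2) * v - x]

      sol = solve_ivp(mood, (0, 100), [0.1, 0.0], max_step=0.05)
      x = sol.y[0]
      print("late-time oscillation amplitude:", round(np.abs(x[len(x)//2:]).max(), 3))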

  18. Phase space analysis of some interacting Chaplygin gas models

    Energy Technology Data Exchange (ETDEWEB)

    Khurshudyan, M. [Academy of Sciences of Armenia, Institute for Physical Research, Ashtarak (Armenia); Tomsk State University of Control Systems and Radioelectronics, Laboratory for Theoretical Cosmology, Tomsk (Russian Federation); Tomsk State Pedagogical University, Department of Theoretical Physics, Tomsk (Russian Federation); Myrzakulov, R. [Eurasian National University, Eurasian International Center for Theoretical Physics, Astana (Kazakhstan)

    2017-02-15

    In this paper we discuss a phase space analysis of various interacting Chaplygin gas models in general relativity. Linear and nonlinear sign-changeable interactions are considered. For each case, appropriate late-time attractors of the field equations are found. The Chaplygin gas is one of the dark fluids actively considered in modern cosmology, since it provides a joint model of dark energy and dark matter. (orig.)

  19. A Computable OLG Model for Gender and Growth Policy Analysis

    OpenAIRE

    Pierre-Richard Agénor

    2012-01-01

    This paper develops a computable Overlapping Generations (OLG) model for gender and growth policy analysis. The model accounts for human and physical capital accumulation (both public and private), intra- and inter-generational health persistence, fertility choices, and women's time allocation between market work, child rearing, and home production. Bargaining between spouses and gender bias, in the form of discrimination in the work place and mothers' time allocation between daughters and so...

  20. User-Defined Material Model for Progressive Failure Analysis

    Science.gov (United States)

    Knight, Norman F. Jr.; Reeder, James R. (Technical Monitor)

    2006-01-01

    An overview of different types of composite material system architectures and a brief review of progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model (or UMAT) for use with the ABAQUS/Standard nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details and use of the UMAT subroutine are described in the present paper. Parametric studies for composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented.

  1. Uncertainty Analysis of Multi-Model Flood Forecasts

    Directory of Open Access Journals (Sweden)

    Erich J. Plate

    2015-12-01

    Full Text Available This paper demonstrates, by means of a systematic uncertainty analysis, that the use of outputs from more than one model can significantly improve conditional forecasts of discharges or water stages, provided the models are structurally different. Discharge forecasts from two models and the actual forecasted discharge are assumed to form a three-dimensional joint probability density distribution (jpdf), calibrated on long time series of data. The jpdf is decomposed into conditional probability density distributions (cpdf) by means of Bayes' formula, as suggested and explored by Krzysztofowicz in a series of papers. In this paper his approach is simplified to optimize conditional forecasts for any set of two forecast models. Its application is demonstrated by means of models developed in a study of flood forecasting for station Stung Treng on the middle reach of the Mekong River in South-East Asia. Four different forecast models were used and pairwise combined: forecast with no model, with persistence model, with a regression model, and with a rainfall-runoff model. Working with cpdfs requires determination of dependency among variables, for which linear regressions are required, as was done by Krzysztofowicz. His Bayesian approach based on transforming observed probability distributions of discharges and forecasts into normal distributions is also explored. Results obtained with his method for normal prior and likelihood distributions are identical to results from direct multiple regressions. Furthermore, it is shown that in the present case forecast accuracy is only marginally improved if Weibull-distributed basic data are converted into normally distributed variables.
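
    For normal distributions, the Bayesian conditional forecast coincides with a multiple regression of the observation on the two model forecasts, as the abstract notes. The sketch below shows the combination gain on synthetic data; the error structures of the two "models" are assumptions.

      # A minimal sketch of combining two structurally different forecasts
      # by multiple regression. All data here are synthetic.
      import numpy as np

      rng = np.random.default_rng(11)
      n = 2000
      truth = rng.normal(1000, 200, n)                 # "observed" discharge
      f1 = truth + rng.normal(0, 120, n)               # e.g. persistence model
      f2 = 0.8 * truth + 200 + rng.normal(0, 100, n)   # e.g. rainfall-runoff model

      def rmse(pred):
          return np.sqrt(np.mean((truth - pred) ** 2))

      # Least-squares fit of truth on [1, f1, f2] gives the combined forecast
      Xd = np.column_stack([np.ones(n), f1, f2])
      coef, *_ = np.linalg.lstsq(Xd, truth, rcond=None)
      combined = Xd @ coef

      print("RMSE model 1:", rmse(f1).round(1))
      print("RMSE model 2:", rmse(f2).round(1))
      print("RMSE combined:", rmse(combined).round(1))

    Because the two error sources are independent, the combined forecast beats either model alone, which is the structural-difference condition the paper emphasizes.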

  2. Design of Graph Analysis Model to support Decision Making

    International Nuclear Information System (INIS)

    An, Sang Ha; Lee, Sung Jin; Chang, Soon Heung; Kim, Sung Ho; Kim, Tae Woon

    2005-01-01

    Korea is meeting its growing electric power needs by using nuclear, fossil, hydro and other energy sources. But we cannot use fossil energy forever, and people's attitudes toward nature have changed. So we have to prepare appropriate energy sources for these conditions before people need more energy, and we should prepare a dynamic response, because people's needs will change as time goes on. We therefore designed a graph analysis model (GAM) for the dynamic analysis of decisions on energy sources. It can support Analytic Hierarchy Process (AHP) analysis based on a graphical user interface.
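
    The AHP step that such a tool supports can be sketched directly: priorities are the principal eigenvector of a pairwise comparison matrix, with Saaty's consistency check. The comparison values below are illustrative assumptions.

      # A minimal AHP priority computation for three hypothetical alternatives.
      import numpy as np

      # Pairwise preferences among, say, nuclear, hydro, and gas alternatives
      A = np.array([
          [1.0, 3.0, 5.0],
          [1/3, 1.0, 3.0],
          [1/5, 1/3, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                     # principal eigenvector = priority weights
      print("priority weights:", w.round(3))

      # Saaty's consistency index checks whether the judgements are coherent
      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)
      print("consistency ratio:", round(ci / 0.58, 3))   # RI = 0.58 for n = 3

    A consistency ratio below 0.1 is conventionally taken to mean the pairwise judgements are acceptably coherent.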

  3. Sparse Principal Component Analysis in Medical Shape Modeling

    DEFF Research Database (Denmark)

    Sjöstrand, Karl; Stegmann, Mikkel Bille; Larsen, Rasmus

    2006-01-01

    Principal component analysis (PCA) is a widely used tool in medical image analysis for data reduction, model building, and data understanding and exploration. While PCA is a holistic approach where each new variable is a linear combination of all original variables, sparse PCA (SPCA) aims [...] analysis in medicine. Results for three different data sets are given in relation to standard PCA and sparse PCA by simple thresholding of sufficiently small loadings. Focus is on a recent algorithm for computing sparse principal components, but a review of other approaches is supplied as well. The SPCA...
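
    A sketch contrasting ordinary PCA with sparse PCA on synthetic data whose modes of variation each involve only a few variables; scikit-learn's SparsePCA is used here in place of the specific algorithm reviewed in the paper, and all parameters are assumptions.

      # Dense versus sparse PCA loadings on synthetic "shape-like" data.
      import numpy as np
      from sklearn.decomposition import PCA, SparsePCA

      rng = np.random.default_rng(0)
      latent = rng.normal(size=(100, 2))       # two underlying modes of variation
      W = np.zeros((2, 10))
      W[0, :4], W[1, 6:] = 1.0, 1.0            # each mode drives only a few variables
      X = latent @ W + 0.05 * rng.normal(size=(100, 10))

      pca = PCA(n_components=2).fit(X)
      spca = SparsePCA(n_components=2, alpha=0.5, random_state=0).fit(X)

      # Sparse loadings are easier to interpret: most entries are exactly zero
      print("dense PCA near-zero loadings:", np.sum(np.abs(pca.components_) < 1e-3))
      print("sparse PCA zero loadings:", np.sum(np.abs(spca.components_) < 1e-3))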

  4. Hydraulic modeling support for conflict analysis: The Manayunk canal revisited

    International Nuclear Information System (INIS)

    Chadderton, R.A.; Traver, R.G.; Rao, J.N.

    1992-01-01

    This paper presents a study which used a standard hydraulic computer model to generate detailed design information to support conflict analysis of a water resource use issue. As an extension of previous studies, the conflict analysis in this case included several scenarios for stability analysis - all of which reached the conclusion that a compromise of shared access to the available water resources would result in the most benefits to society. This expected equilibrium outcome was found to maximize benefit-cost estimates. 17 refs., 1 fig., 2 tabs

  5. Predictive models for monitoring and analysis of the total zooplankton

    Directory of Open Access Journals (Sweden)

    Obradović Milica

    2014-01-01

    Full Text Available In recent years, modeling and prediction of total zooplankton abundance have been performed with various tools and techniques, among which data mining tools have been used less frequently. The purpose of this paper is to automatically determine the degree of dependency and the influence of physical, chemical and biological parameters on total zooplankton abundance, through the design of specific data mining models. For this purpose, key influencer analysis was used. The analysis is based on data obtained from the SeLaR information system - specifically, data from two reservoirs (Gruža and Grošnica) with different morphometric characteristics and trophic states. The data were transformed into an optimal structure for data analysis, upon which a data mining model based on the Naïve Bayes algorithm was constructed. The results of the analysis imply that, in both reservoirs, the parameters of groups and species of zooplankton have the greatest influence on total zooplankton abundance. If these inputs (groups and zooplankton species) are left out, differences in the impact of physical, chemical and other biological parameters between the reservoirs can be noted. In the Grošnica reservoir, the analysis showed that the temporal dimension (months), nitrates, water temperature, chemical oxygen demand, chlorophyll and chlorides had the key influence, with a strong relative impact. In the Gruža reservoir, the key influencing parameters for total zooplankton are the spatial dimension (location), water temperature and physiological groups of bacteria. The results show that the presented data mining model is usable on any kind of aquatic ecosystem and can also serve for the detection of inputs which could be the basis for future analysis and modeling.
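
    A sketch of a Naïve Bayes model of this kind with scikit-learn; the feature set and the synthetic abundance rule are assumptions standing in for the SeLaR reservoir data.

      # A minimal Naive Bayes sketch for a zooplankton-style classification task.
      import numpy as np
      from sklearn.naive_bayes import GaussianNB
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      n = 300
      water_temp = rng.normal(18, 4, n)
      nitrates = rng.normal(2.0, 0.6, n)
      chlorophyll = rng.normal(10, 3, n)
      X = np.column_stack([water_temp, nitrates, chlorophyll])
      # Synthetic rule: abundance is "high" when temperature and chlorophyll are high
      y = (0.6 * water_temp + 0.8 * chlorophyll + rng.normal(0, 2, n)) > 19

      model = GaussianNB()
      print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean().round(3))
      model.fit(X, y)
      # Class-conditional means hint at which inputs separate the classes most
      print("per-class feature means:\n", model.theta_)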

  6. Year clustering analysis for modelling olive flowering phenology

    Science.gov (United States)

    Oteros, J.; García-Mozo, H.; Hervás-Martínez, C.; Galán, C.

    2013-07-01

    It is now widely accepted that weather conditions occurring several months prior to the onset of flowering have a major influence on various aspects of olive reproductive phenology, including flowering intensity. Given the variable characteristics of the Mediterranean climate, we analyse its influence on the registered variations in olive flowering intensity in southern Spain, and relate them to previous climatic parameters using a year-clustering approach, as a first step towards an olive flowering phenology model adapted to different year categories. Phenological data from Cordoba province (Southern Spain) for a 30-year period (1982-2011) were analysed. Meteorological and phenological data were first subjected to both hierarchical and "K-means" clustering analysis, which yielded four year-categories. For this classification purpose, three different models were tested: (1) discriminant analysis; (2) decision-tree analysis; and (3) neural network analysis. Comparison of the results showed that the neural-networks model was the most effective, classifying four different year categories with clearly distinct weather features. Flowering-intensity models were constructed for each year category using the partial least squares regression method. These category-specific models proved to be more effective than general models. They are better suited to the variability of the Mediterranean climate, due to the different response of plants to the same environmental stimuli depending on the previous weather conditions in any given year. The present detailed analysis of the influence of weather patterns of different years on olive phenology will help us to understand the short-term effects of climate change on the olive crop in the Mediterranean area, which is highly affected by it.

  7. Analysis of Mental Processes Represented in Models of Artificial Consciousness

    Directory of Open Access Journals (Sweden)

    Luana Folchini da Costa

    2013-12-01

    Full Text Available The concept of Artificial Consciousness has been used in the engineering area as an evolution of Artificial Intelligence. However, consciousness is a complex subject and is often treated without formalism. As its main contribution, this work proposes an analysis of four recent models of artificial consciousness published in the engineering area. The mental processes represented by these models are highlighted and correlations with the theoretical perspective of cognitive psychology are made. Finally, considerations about consciousness in such models are discussed.

  8. A smart growth evaluation model based on data envelopment analysis

    Science.gov (United States)

    Zhang, Xiaokun; Guan, Yongyi

    2018-04-01

    With the rapid spread of urbanization, smart growth (SG) has attracted plenty of attention from all over the world. In this paper, after establishing an index system for smart growth, a data envelopment analysis (DEA) model is suggested to evaluate the SG level of cities' current growth situations. In order to further improve the information on both radial direction and non-radial detection, we introduce the non-Archimedean infinitesimal to form the C2GS2 control model. Finally, we evaluate the SG level in Canberra and identify a series of problems, which verifies the applicability of the model and provides more information for improvement.

  9. Theme section: Multi-dimensional modelling, analysis and visualization

    DEFF Research Database (Denmark)

    Guilbert, Éric; Coltekin, Arzu; Antón Castro, Francesc/François

    2016-01-01

    [...] describing complex multidimensional phenomena. An example of the relevance of multidimensional modelling is seen with the development of urban modelling, where several dimensions have been added to the traditional 2D map representation (Sester et al., 2011). These obviously include the third spatial dimension [...] in order to provide a meaningful representation and assist in data visualisation and mining, modelling and analysis, such as data structures allowing representation at different scales or in different contexts of thematic information. Such issues are of importance with regard to the mission of the ISPRS...

  10. Human Performance Modeling for Dynamic Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory; Mandelli, Diego [Idaho National Laboratory

    2015-08-01

    Part of the U.S. Department of Energy's (DOE's) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.

  11. Bifurcation analysis of dengue transmission model in Baguio City, Philippines

    Science.gov (United States)

    Libatique, Criselda P.; Pajimola, Aprimelle Kris J.; Addawe, Joel M.

    2017-11-01

    In this study, we formulate a deterministic model for the transmission dynamics of dengue fever in Baguio City, Philippines. We analyze the equilibria of the dengue model and obtain conditions for the existence of the equilibrium states. Stability analysis of the system is carried out for the disease-free equilibrium, and we show that the system becomes stable under certain conditions on the parameters. For a particular parameter, using centre manifold theory, the proposed model demonstrates a bifurcation phenomenon. We performed numerical simulations to verify the analytical results.

  12. Forecasting Analysis of Shanghai Stock Index Based on ARIMA Model

    Directory of Open Access Journals (Sweden)

    Li Chenggang

    2017-01-01

    Full Text Available Prediction and analysis of the Shanghai Composite Index helps investors invest in the stock market and provides them with a reference. This paper selects the Shanghai Composite Index monthly closing prices from January 2005 to October 2016 to construct an ARIMA model. The paper forecasts the last three monthly closing prices of the Shanghai Stock Index, which have already occurred, and compares them with the actual values, testing the accuracy and feasibility of the model for short-term Shanghai Stock Index forecasting. Finally, the ARIMA model is used to forecast the Shanghai Composite Index closing prices of the last two months of 2016.
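
    The fit-and-forecast workflow can be sketched with statsmodels; a simulated monthly series replaces the actual SSE closing prices, and the (1, 1, 1) order is an assumption rather than the order identified in the paper.

      # A minimal ARIMA fit-and-forecast sketch with a held-out tail.
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(42)
      # Simulated monthly closing "index": random walk with drift
      y = pd.Series(3000 + np.cumsum(rng.normal(5, 60, 142)),
                    index=pd.date_range("2005-01", periods=142, freq="MS"))

      train, test = y[:-3], y[-3:]        # hold out the last three months
      model = ARIMA(train, order=(1, 1, 1)).fit()
      forecast = model.forecast(steps=3)
      print(pd.DataFrame({"forecast": forecast.round(1), "actual": test.round(1)}))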

  13. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    [...] software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  14. Creation and Reliability Analysis of Vehicle Dynamic Weighing Model

    Directory of Open Access Journals (Sweden)

    Zhi-Ling XU

    2014-08-01

    Full Text Available In this paper, the portable axle-load meter of a dynamic weighing system is modeled in ADAMS, and the weighing process is simulated while controlling a single variable, yielding simulation weighing data at different speeds and weights. Simultaneously, a portable weighing system with the same parameters is used to perform actual measurements, and the simulation results are compared with the measurements under the same conditions. At 30 km/h or less, the simulated and measured values do not differ by more than 5 %, which not only verifies the reliability of the dynamic weighing model, but also makes it possible to improve the efficiency of algorithm studies by using dynamic weighing model simulation.

  15. Numerical model of solar dynamic radiator for parametric analysis

    Science.gov (United States)

    Rhatigan, Jennifer L.

    1989-01-01

    Growth power requirements for Space Station Freedom will be met through addition of 25 kW solar dynamic (SD) power modules. Extensive thermal and power cycle modeling capabilities have been developed which are powerful tools in Station design and analysis, but which prove cumbersome and costly for simple component preliminary design studies. In order to aid in refining the SD radiator to the mature design stage, a simple and flexible numerical model was developed. The model simulates heat transfer and fluid flow performance of the radiator and calculates area, mass and impact survivability for many combinations of flow tube and panel configurations, fluid and material properties, and environmental and cycle variations.

  16. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Science.gov (United States)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  17. Critical Analysis of Underground Coal Gasification Models. Part II: Kinetic and Computational Fluid Dynamics Models

    Directory of Open Access Journals (Sweden)

    Alina Żogała

    2014-01-01

    Originality/value: This paper presents the state of the art in the field of coal gasification modeling using kinetic and computational fluid dynamics approaches. The paper also presents the authors' own comparative analysis (concerned with mathematical formulation, input data and parameters, basic assumptions, obtained results, etc.) of the most important models of underground coal gasification.

  18. Stochastic modeling of friction force and vibration analysis of a mechanical system using the model

    International Nuclear Information System (INIS)

    Kang, Won Seok; Choi, Chan Kyu; Yoo, Hong Hee

    2015-01-01

    The squeal noise generated from a disk brake or the chatter occurring in a machine tool primarily results from friction-induced vibration. Since friction-induced vibration is usually accompanied by abrasion and lifespan reduction of mechanical parts, it is necessary to develop a reliable analysis model by which friction-induced vibration phenomena can be accurately analyzed. The original Coulomb friction model or the modified Coulomb friction model employed in most commercial programs uses deterministic friction coefficients. However, observing friction phenomena between two contact surfaces, one may find that friction coefficients keep changing due to the unevenness of the contact surfaces, temperature, lubrication and humidity. Therefore, in this study, friction coefficients are modeled as random parameters that keep changing during the motion of a mechanical system undergoing friction. The integrity of the proposed stochastic friction model was validated by comparing the analysis results obtained by the proposed model with experimental results.

  19. Molecular structure based property modeling: Development/ improvement of property models through a systematic property-data-model analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol Shivajirao; Sarup, Bent; Sin, Gürkan

    2013-01-01

    The objective of this work is to develop a method for performing property-data-model analysis so that efficient use of knowledge of properties can be made in the development/improvement of property prediction models. The method includes: (i) analysis of property data and its consistency check; (ii) selection of the most appropriate form of the property model; (iii) selection of the data-set for performing parameter regression and uncertainty analysis; and (iv) analysis of model prediction errors to take necessary corrective steps to improve the accuracy and the reliability of property models. Further, a method for selecting a minimum data-set for the parameter regression is also discussed for the cases where it is preferred to retain some data-points from the total data-set to test the reliability of predictions for validation purposes.

  20. A grounded theory model for analysis of marine accidents.

    Science.gov (United States)

    Mullai, Arben; Paulsson, Ulf

    2011-07-01

    The purpose of this paper was to design a conceptual model for analysis of marine accidents. The model is grounded on large amounts of empirical data, i.e. the Swedish Maritime Administration database, which was thoroughly studied. This database contains marine accidents organized by ship and variable. The majority of variables are non-metric and some have never been analyzed because of the large number of values. Summary statistics were employed in the data analysis. In order to develop a conceptual model, the database variables were clustered into eleven main categories or constructs, which were organized according to their properties and connected with the path diagram of relationships. For demonstration purposes, one non-metric and five metric variables were selected, namely fatality, ship's properties (i.e. age, gross register tonnage, and length), number of people on board, and marine accidents. These were analyzed using the structural equation modeling (SEM) approach. The combined prediction power of the 'ship's properties' and 'number of people on board' independent variables accounted for 65% of the variance of the fatality. The model development was largely based on the data contained in the Swedish database. However, as this database shares a number of variables in common with other databases in the region and the world, the model presented in this paper could be applied to other datasets. The model has both theoretical and practical values. Recommendations for improvements in the database are also suggested.

  1. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    Science.gov (United States)

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least squared error approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach.
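
    A sketch of Metropolis MCMC calibration on a one-compartment toxicokinetic model; the data, priors, proposal scales, and parameters are illustrative assumptions, not those of the bromate models discussed.

      # Metropolis MCMC calibration of a toy one-compartment model C(t) = D/V * exp(-k*t).
      import numpy as np

      rng = np.random.default_rng(0)
      t = np.array([0.5, 1, 2, 4, 8, 12.0])            # sampling times (h)
      true_k, true_V, dose = 0.3, 5.0, 100.0
      conc = dose / true_V * np.exp(-true_k * t)
      obs = conc * np.exp(rng.normal(0, 0.1, t.size))  # lognormal measurement error

      def log_post(k, V):
          if k <= 0 or V <= 0:
              return -np.inf
          pred = dose / V * np.exp(-k * t)
          loglik = -0.5 * np.sum(((np.log(obs) - np.log(pred)) / 0.1) ** 2)
          logprior = -0.5 * (np.log(k / 0.5) ** 2 + np.log(V / 4) ** 2)  # lognormal priors
          return loglik + logprior

      samples, (k, V) = [], (0.5, 4.0)
      lp = log_post(k, V)
      for _ in range(20000):
          k_new = k * np.exp(rng.normal(0, 0.1))       # multiplicative proposal
          V_new = V * np.exp(rng.normal(0, 0.1))
          lp_new = log_post(k_new, V_new)
          # The log-scale proposal needs a Jacobian correction for correctness
          if np.log(rng.uniform()) < lp_new - lp + np.log(k_new * V_new / (k * V)):
              k, V, lp = k_new, V_new, lp_new
          samples.append((k, V))
      post = np.array(samples[5000:])                  # discard burn-in
      print("posterior mean k, V:", post.mean(axis=0).round(3))

    Sampling in log-space (or, as here, correcting with the proposal Jacobian) keeps the chain valid for strictly positive parameters.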

  2. Design Through Manufacturing: The Solid Model - Finite Element Analysis Interface

    Science.gov (United States)

    Rubin, Carol

    2003-01-01

    State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts which reflect every detail of the finished product. Ideally, these models should fulfill two very important functions: (1) they must provide numerical control information for automated manufacturing of precision parts, and (2) they must enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in space missions. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. The research performed during the fellowship period investigated the transition process from the solid CAD model to the FEA stress analysis model with the final goal of creating an automatic interface between the two. During the period of the fellowship a detailed multi-year program for the development of such an interface was created. The ultimate goal of this program will be the development of a fully parameterized automatic ProE/FEA translator for parts and assemblies, with the incorporation of data base management into the solution, and ultimately including computational fluid dynamics and thermal modeling in the interface.

  3. Modeling and Analysis of Component Faults and Reliability

    DEFF Research Database (Denmark)

    Le Guilly, Thibaut; Olsen, Petur; Ravn, Anders Peter

    2016-01-01

    This chapter presents a process to design and validate models of reactive systems in the form of communicating timed automata. The models are extended with faults associated with probabilities of occurrence. This enables a fault tree analysis of the system using minimal cut sets that are automatically generated. The stochastic information on the faults is used to estimate the reliability of the fault-affected system. The reliability is given with respect to properties of the system state space. We illustrate the process on a concrete example using the Uppaal model checker for validating the ideal system model and the fault modeling. Then the statistical version of the tool, UppaalSMC, is used to find reliability estimates.
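
    Once minimal cut sets and component fault probabilities are available, reliability estimates follow from the probability of the union of the cut-set events. The sketch below uses hypothetical components and cut sets, not those of the chapter's example.

      # Estimating system unreliability from minimal cut sets (independent faults).
      from itertools import combinations

      p = {"sensor": 1e-3, "controller": 5e-4, "actuator": 2e-3, "power": 1e-4}
      # Hypothetical minimal cut sets: the system fails if all faults in a set occur
      cut_sets = [{"power"}, {"sensor", "controller"}, {"sensor", "actuator"}]

      def cut_prob(cs):
          prob = 1.0
          for c in cs:
              prob *= p[c]
          return prob

      # Rare-event approximation: P(top) ~ sum of cut-set probabilities
      first_order = sum(cut_prob(cs) for cs in cut_sets)
      # Subtracting pairwise intersections gives a second-order bound
      pairs = sum(cut_prob(a | b) for a, b in combinations(cut_sets, 2))
      print(f"first-order: {first_order:.3e}, second-order: {first_order - pairs:.3e}")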

  4. A simple model for the dynamic analysis of deteriorating structures

    International Nuclear Information System (INIS)

    Andreaus, U.; Ceradini, G.; D'Asdia, P.

    1983-01-01

    A simple model exhibiting a multi-linear constitutive law is presented which describes the behaviour of structural members and subassemblages under severe cyclic loading. The proposed model allows for: 1) pinched form of force-displacement diagrams due to, e.g., cracks in reinforced concrete members and masonry panels; 2) slippage effects due to lack of bond of steel bars in reinforced concrete and clearances in steel bolted connections; 3) post-buckling behaviour of subassemblages with unstable members; 4) cumulative damage affecting strength and/or stiffness at low cycle fatigue. The parameters governing the model behaviour have to be estimated on the basis of experimental results. The model is well suitable for analysis under statically applied cyclic displacements and forces, and under earthquake excitation. An X-type bracing system is then worked out where the member behaviour is schematized according to the proposed model. (orig.)

  5. The modelling and analysis of the mechanics of ropes

    CERN Document Server

    Leech, C M

    2014-01-01

    This book considers the modelling and analysis of the many types of ropes, linear fibre assemblies. The construction of these structures is very diverse and in the work these are considered from the modelling point of view. As well as the conventional twisted structures, braid and plaited structures and parallel assemblies are modelled and analysed, first for their assembly and secondly for their mechanical behaviour. Also since the components are assemblies of components, fibres into yarns, into strands, and into ropes the hierarchical nature of the construction is considered. The focus of the modelling is essentially toward load extension behaviour but there is reference to bending of ropes, encompassed by the two extremes, no slip between the components and zero friction resistance to component slip. Friction in ropes is considered both between the rope components, sliding, sawing and scissoring, and within the components, dilation and distortion, these latter modes being used to model component set, the p...

  6. A two stage data envelopment analysis model with undesirable output

    Science.gov (United States)

    Shariff Adli Aminuddin, Adam; Izzati Jaini, Nur; Mat Kasim, Maznah; Nawawi, Mohd Kamal Mohd

    2017-09-01

    The dependent relationship among decision making units (DMUs) is usually assumed to be non-existent in the development of Data Envelopment Analysis (DEA) models. The dependency can be represented by a multi-stage DEA model, where the outputs from the preceding stage become the inputs of the subsequent stage. The multi-stage DEA model evaluates both the efficiency score for each stage and the overall efficiency of the whole process. Existing multi-stage DEA models do not address the integration of undesirable outputs, for which higher input generates lower output, unlike normal desirable outputs. This research attempts to address the inclusion of such undesirable outputs and investigates the theoretical implications and potential applications in the development of a multi-stage DEA model.

  7. Statistical Analysis by Statistical Physics Model for the STOCK Markets

    Science.gov (United States)

    Wang, Tiansong; Wang, Jun; Fan, Bingli

    A new stochastic stock price model of stock markets based on the contact process of statistical physics systems is presented in this paper. The contact model is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection. Through this model, the statistical properties of the Shanghai Stock Exchange (SSE) and the Shenzhen Stock Exchange (SZSE) are studied. In the present paper, the data of the SSE Composite Index and the data of the SZSE Component Index are analyzed, and the corresponding simulation is carried out by computer computation. Further, we investigate the statistical properties, fat-tail phenomena, power-law distributions, and long memory of returns for these indices. The techniques of the skewness-kurtosis test, the Kolmogorov-Smirnov test, and R/S analysis are applied to study the fluctuation characteristics of the stock price returns.
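
    R/S analysis itself is easy to sketch: the Hurst exponent is the slope of log(R/S) against the window length on a log-log scale. Synthetic Gaussian returns are used below, so the estimate should land near 0.5 (no long memory).

      # A minimal rescaled-range (R/S) estimator of the Hurst exponent.
      import numpy as np

      def rs_hurst(x, window_sizes):
          log_n, log_rs = [], []
          for n in window_sizes:
              rs_vals = []
              for start in range(0, len(x) - n + 1, n):
                  w = x[start:start + n]
                  z = np.cumsum(w - w.mean())      # cumulative deviation series
                  r = z.max() - z.min()            # range
                  s = w.std()
                  if s > 0:
                      rs_vals.append(r / s)
              log_n.append(np.log(n))
              log_rs.append(np.log(np.mean(rs_vals)))
          # Hurst exponent = slope of log(R/S) against log(n)
          return np.polyfit(log_n, log_rs, 1)[0]

      rng = np.random.default_rng(3)
      returns = rng.normal(0, 0.01, 10_000)
      print("estimated Hurst exponent:",
            round(rs_hurst(returns, [16, 32, 64, 128, 256, 512]), 3))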

  8. Modeling and analysis of transport in the mammary glands

    Science.gov (United States)

    Quezada, Ana; Vafai, Kambiz

    2014-08-01

    The transport of three toxins moving from the blood stream into the ducts of the mammary glands is analyzed in this work. The model predictions are compared with experimental data from the literature. The utility of the model lies in its potential to improve our understanding of toxin transport as a pre-disposing factor to breast cancer. This work is based on a multi-layer transport model to analyze the toxins present in the breast milk. The breast milk in comparison with other sampling strategies allows us to understand the mass transport of toxins once inside the bloodstream of breastfeeding women. The multi-layer model presented describes the transport of caffeine, DDT and cimetidine. The analysis performed takes into account the unique transport mechanisms for each of the toxins. Our model predicts the movement of toxins and/or drugs within the mammary glands as well as their bioaccumulation in the tissues.

  9. Formulation, construction and analysis of kinetic models of metabolism: A review of modelling frameworks

    DEFF Research Database (Denmark)

    Saa, Pedro A.; Nielsen, Lars K.

    2017-01-01

    Kinetic models are critical to predict the dynamic behaviour of metabolic networks. Mechanistic kinetic models for large networks remain uncommon due to the difficulty of fitting their parameters. Recent modelling frameworks promise new ways to overcome this obstacle while retaining predictive capabilities. In this review, we present an overview of the relevant mathematical frameworks for kinetic formulation, construction and analysis. Starting with kinetic formalisms, we next review statistical methods for parameter inference, as well as recent computational frameworks applied to the construction...

  10. The Use of Models in Urban Space Pattern Analysis | Berhanu ...

    African Journals Online (AJOL)

    This paper focuses on the use of urban space pattern analysis methods. Physical developments once located in space influence a set of social and economic activities. These days urban developments are of large scale and very fast, often involving complex issues. Models are usually used to reduce complexities in ...

  11. Application of multilinear regression analysis in modeling of soil ...

    African Journals Online (AJOL)

    The application of the Multi-Linear Regression Analysis (MLRA) model for predicting soil properties in Calabar South offers a technical guide and solution to foundation design problems in the area. Forty-five soil samples were collected from fifteen different boreholes at different depths, and 270 tests were carried out for CBR, ...

  12. Mathematical modelling and linear stability analysis of laser fusion cutting

    International Nuclear Information System (INIS)

    Hermanns, Torsten; Schulz, Wolfgang; Vossen, Georg; Thombansen, Ulrich

    2016-01-01

    A model for laser fusion cutting is presented and investigated by linear stability analysis in order to study the tendency for dynamic behavior and subsequent ripple formation. The result is a so-called stability function that describes the correlation between the setting values of the process and the amount of the process's dynamic behavior.

  13. Robust bayesian analysis of an autoregressive model with ...

    African Journals Online (AJOL)

    In this work, robust Bayesian analysis of the Bayesian estimation of an autoregressive model with exponential innovations is performed. Using a Bayesian robustness methodology, we show that, using a suitable generalized quadratic loss, we obtain optimal Bayesian estimators of the parameters corresponding to the ...

  14. Safety analysis report on Model UC-609 shipping package

    International Nuclear Information System (INIS)

    Sandberg, R.R.

    1977-08-01

    This Safety Analysis Report for Packaging demonstrates that the model UC-609 shipping package can safely transport tritium in any of its forms. The package and its contents are described, and the package is evaluated when subjected to the transport conditions specified in the Code of Federal Regulations, Title 10, Part 71. Finally, compliance with these regulations is discussed.

  15. Mathematical annuity models application in cash flow analysis ...

    African Journals Online (AJOL)

    Mathematical annuity models application in cash flow analysis. ... We also compare the cost efficiency of amortisation and sinking-fund loan repayment, as prevalent in financial institutions. Keywords: Annuity, Amortisation, Sinking Fund, Present and Future Value Annuity, Maturity date and Redemption value.
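
    The underlying annuity arithmetic, and the amortisation versus sinking-fund comparison, can be sketched directly; the loan amount and rates below are illustrative assumptions.

      # Present/future value of an ordinary annuity, and two repayment schemes.

      def pv_annuity(pmt, i, n):    # present value of n payments of pmt at rate i
          return pmt * (1 - (1 + i) ** -n) / i

      def fv_annuity(pmt, i, n):    # future value of n payments of pmt at rate i
          return pmt * ((1 + i) ** n - 1) / i

      loan, i_loan, years = 1_000_000.0, 0.10, 10

      # Amortisation: a level payment repays principal and interest together
      amort_pmt = loan / pv_annuity(1.0, i_loan, years)

      # Sinking fund: pay interest only, and accumulate the principal separately
      # in a fund earning a (possibly lower) deposit rate
      i_fund = 0.08
      sink_pmt = loan * i_loan + loan / fv_annuity(1.0, i_fund, years)

      print(f"amortisation payment per year: {amort_pmt:,.2f}")
      print(f"sinking fund payment per year: {sink_pmt:,.2f}")

    With the fund earning less than the loan rate, the sinking-fund scheme costs more per year, which is the kind of cost-efficiency comparison the paper describes.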

  16. Bayesian Sensitivity Analysis of Statistical Models with Missing Data.

    Science.gov (United States)

    Zhu, Hongtu; Ibrahim, Joseph G; Tang, Niansheng

    2014-04-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures.

  17. Sensitivity analysis of physiochemical interaction model: which pair ...

    African Journals Online (AJOL)

    The mathematical modelling of physiochemical interactions in the framework of industrial and environmental physics usually relies on an initial value problem which is described by a deterministic system of first order ordinary differential equations. In this paper, we considered a sensitivity analysis of studying the qualitative ...

  18. Sensitivity Analysis of Mixed Models for Incomplete Longitudinal Data

    Science.gov (United States)

    Xu, Shu; Blozis, Shelley A.

    2011-01-01

    Mixed models are used for the analysis of data measured over time to study population-level change and individual differences in change characteristics. Linear and nonlinear functions may be used to describe a longitudinal response, individuals need not be observed at the same time points, and missing data, assumed to be missing at random (MAR),…

  19. Soil Retaining Structures : Development of models for structural analysis

    NARCIS (Netherlands)

    Bakker, K.J.

    2000-01-01

    The topic of this thesis is the development of models for the structural analysis of soil retaining structures. The soil retaining structures being looked at are: block revetments, flexible retaining walls and bored tunnels in soft soil. Within this context, typical structural behavior of these…

  20. Sensitivity analysis practices: Strategies for model-based inference

    Energy Technology Data Exchange (ETDEWEB)

    Saltelli, Andrea [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy)]. E-mail: andrea.saltelli@jrc.it; Ratto, Marco [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy); Tarantola, Stefano [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy); Campolongo, Francesca [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy)

    2006-10-15

    Fourteen years after Science's review of sensitivity analysis (SA) methods in 1989 (System analysis at molecular scale, by H. Rabitz) we search Science Online to identify and then review all recent articles having 'sensitivity analysis' as a keyword. In spite of the considerable developments which have taken place in this discipline, of the good practices which have emerged, and of existing guidelines for SA issued on both sides of the Atlantic, we could not find in our review other than very primitive SA tools, based on 'one-factor-at-a-time' (OAT) approaches. In the context of model corroboration or falsification, we demonstrate that this use of OAT methods is illicit and unjustified, unless the model under analysis is proved to be linear. We show that available good practices, such as variance based measures and others, are able to overcome OAT shortcomings and easy to implement. These methods also allow the concept of factors importance to be defined rigorously, thus making the factors importance ranking univocal. We analyse the requirements of SA in the context of modelling, and present best available practices on the basis of an elementary model. We also point the reader to available recipes for a rigorous SA.

  1. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant...

  2. comparative analysis of path loss prediction models for urban

    African Journals Online (AJOL)

    A. Obot, O. Simeon, J. Afolayan. Department of Electrical/Electronics & Computer Engineering, University of Uyo, Akwa Ibom State .... assignments, proper determination of electric field strength, interference analysis, handover .... model based on extensive drive test measurements made in Japan at several frequencies.

  3. application of multilinear regression analysis in modeling of soil

    African Journals Online (AJOL)

    Windows User

    APPLICATION OF MULTILINEAR REGRESSION ANALYSIS IN MODELING OF SOIL PROPERTIES FOR GEOTECHNICAL CIVIL ENGINEERING WORKS IN CALABAR SOUTH. J. G. Egbe1, D. E. Ewa2, S. E. Ubi3, G. B. Ikwa4 and O. O. Tumenayo5. 1, 2, 3, 4, DEPT. OF CIVIL ENGINEERING, CROSS RIVER UNIV.

  4. Sensitivity analysis practices: Strategies for model-based inference

    International Nuclear Information System (INIS)

    Saltelli, Andrea; Ratto, Marco; Tarantola, Stefano; Campolongo, Francesca

    2006-01-01

    Fourteen years after Science's review of sensitivity analysis (SA) methods in 1989 (System analysis at molecular scale, by H. Rabitz) we search Science Online to identify and then review all recent articles having 'sensitivity analysis' as a keyword. In spite of the considerable developments which have taken place in this discipline, of the good practices which have emerged, and of existing guidelines for SA issued on both sides of the Atlantic, we could not find in our review other than very primitive SA tools, based on 'one-factor-at-a-time' (OAT) approaches. In the context of model corroboration or falsification, we demonstrate that this use of OAT methods is illicit and unjustified, unless the model under analysis is proved to be linear. We show that available good practices, such as variance based measures and others, are able to overcome OAT shortcomings and easy to implement. These methods also allow the concept of factors importance to be defined rigorously, thus making the factors importance ranking univocal. We analyse the requirements of SA in the context of modelling, and present best available practices on the basis of an elementary model. We also point the reader to available recipes for a rigorous SA

  5. Two- and three-dimensional models for analysis of optical absorption

    Indian Academy of Sciences (India)

    Unknown

    Goldberg et al 1975; Kam and Parkinson 1982; Baglio et al 1982, 1983; Oritz 1995; Li et al 1996) has been carried out on WS2, there is no detailed analysis of the absorption spectra obtained from the single crystals of WS2 on the basis of two- and three-dimensional models. We have therefore carried out this study and the.

  6. Spatial Econometric data analysis: moving beyond traditional models

    NARCIS (Netherlands)

    Florax, R.J.G.M.; Vlist, van der A.J.

    2003-01-01

    This article appraises recent advances in the spatial econometric literature. It serves as the introduction to a collection of new papers on spatial econometric data analysis brought together in this special issue, dealing specifically with new extensions to spatial econometric modeling

  7. An unsupervised aspect detection model for sentiment analysis of reviews

    NARCIS (Netherlands)

    Bagheri, Ayoub; Saraee, M.; de Jong, Franciska M.G.

    With the rapid growth of user-generated content on the internet, sentiment analysis of online reviews has recently become a hot research topic, but due to the variety and wide range of products and services, supervised and domain-specific models are often not practical. As the number of reviews

  8. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future
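
    To make the idea of a primary model concrete, here is a minimal sketch of fitting a Zwietering-type logistic curve for log10 cell counts to synthetic data; the parameter values and noise level are invented for illustration, not taken from the paper.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Primary model: a modified-logistic growth curve for log10 counts, a
    # classical empirical form in predictive microbiology. Illustrative
    # parameters: y0 (initial log count), ymax (maximum log count),
    # mu (maximum specific growth rate), lam (lag time).
    def logistic(t, y0, ymax, mu, lam):
        return y0 + (ymax - y0) / (1 + np.exp(4 * mu * (lam - t) / (ymax - y0) + 2))

    t = np.linspace(0, 24, 13)                       # sampling times (h)
    y_true = logistic(t, 3.0, 9.0, 0.8, 4.0)
    y_obs = y_true + np.random.default_rng(0).normal(0, 0.1, t.size)

    p, _ = curve_fit(logistic, t, y_obs, p0=[3, 9, 0.5, 2])
    print("fitted y0, ymax, mu, lam:", np.round(p, 2))
    ```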

  9. Practical identifiability analysis of a minimal cardiovascular system model.

    Science.gov (United States)

    Pironet, Antoine; Docherty, Paul D; Dauby, Pierre C; Chase, J Geoffrey; Desaive, Thomas

    2017-01-17

    Parameters of mathematical models of the cardiovascular system can be used to monitor cardiovascular state, such as total stressed blood volume status, vessel elastance and resistance. To do so, the model parameters have to be estimated from data collected at the patient's bedside. This work considers a seven-parameter model of the cardiovascular system and investigates whether these parameters can be uniquely determined using indices derived from measurements of arterial and venous pressures, and stroke volume. An error vector defined the residuals between the simulated and reference values of the seven clinically available haemodynamic indices. The sensitivity of this error vector to each model parameter was analysed, as well as the collinearity between parameters. To assess practical identifiability of the model parameters, profile-likelihood curves were constructed for each parameter. Four of the seven model parameters were found to be practically identifiable from the selected data. The remaining three parameters were practically non-identifiable. Among these non-identifiable parameters, one could be decreased as much as possible. The other two non-identifiable parameters were inversely correlated, which prevented their precise estimation. This work presented the practical identifiability analysis of a seven-parameter cardiovascular system model, from limited clinical data. The analysis showed that three of the seven parameters were practically non-identifiable, thus limiting the use of the model as a monitoring tool. Slight changes in the time-varying function modeling cardiac contraction and use of larger values for the reference range of venous pressure made the model fully practically identifiable. Copyright © 2017. Published by Elsevier B.V.
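
    The profile-likelihood idea used in the paper can be illustrated on a deliberately non-identifiable toy model (not the authors' cardiovascular model): in y = a*b*x only the product a*b is constrained by the data, so profiling either parameter yields a flat curve, the signature of practical non-identifiability.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(0)
    x = np.linspace(1, 10, 20)
    y = 2.0 * 3.0 * x + rng.normal(0, 1.0, x.size)   # data from a*b = 6

    def profile_sse(a):
        # re-optimise the nuisance parameter b for each fixed value of a
        res = minimize_scalar(lambda b: np.sum((y - a * b * x) ** 2))
        return res.fun

    for a in [1.0, 2.0, 3.0, 6.0]:
        print(f"a={a:4.1f}  profile SSE={profile_sse(a):8.2f}")  # ~constant
    ```

    A flat profile like this corresponds to the inversely correlated parameter pair reported above, which prevents precise estimation of either parameter alone.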

  10. A fuzzy set preference model for market share analysis

    Science.gov (United States)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    Consumer preference models are widely used in new product design, marketing management, pricing, and market segmentation. The success of new products depends on accurate market share prediction and design decisions based on consumer preferences. The vague linguistic nature of consumer preferences and product attributes, combined with the substantial differences between individuals, creates a formidable challenge to marketing models. The most widely used methodology is conjoint analysis. Conjoint models, as currently implemented, represent linguistic preferences as ratio or interval-scaled numbers, use only numeric product attributes, and require aggregation of individuals for estimation purposes. It is not surprising that these models are costly to implement, are inflexible, and have a predictive validity that is not substantially better than chance. This affects the accuracy of market share estimates. A fuzzy set preference model can easily represent linguistic variables either in consumer preferences or product attributes with minimal measurement requirements (ordinal scales), while still estimating overall preferences suitable for market share prediction. This approach results in flexible individual-level conjoint models which can provide more accurate market share estimates from a smaller number of more meaningful consumer ratings. Fuzzy sets can be incorporated within existing preference model structures, such as a linear combination, using the techniques developed for conjoint analysis and market share estimation. The purpose of this article is to develop and fully test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation), and how much to make (market share
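
    A minimal sketch of the underlying idea, representing linguistic ratings as fuzzy sets on a preference universe and aggregating them, follows; the term definitions, weights and centroid defuzzification step are illustrative assumptions, not the authors' model.

    ```python
    import numpy as np

    u = np.linspace(0, 1, 101)                    # preference universe [0, 1]

    def tri(u, a, b, c):                          # triangular membership
        return np.maximum(np.minimum((u - a) / (b - a), (c - u) / (c - b)), 0)

    # Hypothetical linguistic terms for a rating scale
    terms = {"poor": tri(u, 0.0, 0.1, 0.4),
             "fair": tri(u, 0.3, 0.5, 0.7),
             "good": tri(u, 0.6, 0.9, 1.0)}

    # One consumer's ratings of two attributes, with importance weights
    ratings, weights = ["good", "fair"], [0.7, 0.3]
    agg = sum(w * terms[r] for r, w in zip(ratings, weights))
    overall = np.sum(u * agg) / np.sum(agg)       # centroid defuzzification
    print(f"overall preference = {overall:.2f}")
    ```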

  11. Understanding earth system models: how Global Sensitivity Analysis can help

    Science.gov (United States)

    Pianosi, Francesca; Wagener, Thorsten

    2017-04-01

    Computer models are an essential element of earth system sciences, underpinning our understanding of systems functioning and influencing the planning and management of socio-economic-environmental systems. Even when these models represent a relatively low number of physical processes and variables, earth system models can exhibit complicated behaviour because of the high level of interactions between their simulated variables. As the level of these interactions increases, we quickly lose the ability to anticipate and interpret the model's behaviour and hence the opportunity to check whether the model gives the right response for the right reasons. Moreover, even if internally consistent, an earth system model will always produce uncertain predictions because it is often forced by uncertain inputs (due to measurement errors, pre-processing uncertainties, scarcity of measurements, etc.). Lack of transparency about the scope of validity, limitations and the main sources of uncertainty of earth system models can be a strong limitation to their effective use for both scientific and decision-making purposes. Global Sensitivity Analysis (GSA) is a set of statistical analysis techniques to investigate the complex behaviour of earth system models in a structured, transparent and comprehensive way. In this presentation, we will use a range of examples across earth system sciences (with a focus on hydrology) to demonstrate how GSA is a fundamental element in advancing the construction and use of earth system models, including: verifying the consistency of the model's behaviour with our conceptual understanding of the system functioning; identifying the main sources of output uncertainty so as to focus efforts for uncertainty reduction; and finding tipping points in forcing inputs that, if crossed, would bring the system to specific conditions we want to avoid.

  12. Power quality analysis of STATCOM using dynamic phasor modeling

    Energy Technology Data Exchange (ETDEWEB)

    Hannan, M.A.; Mohamed, A.; Hussain, A.; Al-Dabbagh, Majid [Department of Electrical, Electronic and Systems Engineering, Faculty of Engineering, National University of Malaysia, 43600 Bangi, Selangor (Malaysia)

    2009-06-15

    Modeling of a static synchronous compensator (STATCOM) in a power system, based on the dynamic phasor model, to investigate the performance of the STATCOM for power quality analysis is described, and it is compared with an electromagnetic transient program (EMTP)-like simulation. The dynamic phasor model and the electromagnetic transient (EMT) model of the STATCOM, including the power system, are implemented in the Matlab/Simulink toolbox and PSCAD/EMTDC, respectively. The STATCOM dynamic phasor model, including switching functions and their control system, is presented. A satisfactory solution for power quality problems on a typical distribution network is analyzed using the dynamic phasor model and EMTP-like PSCAD/EMTDC simulation techniques. The simulation results revealed that the dynamic phasor model of the STATCOM is in excellent agreement with the detailed time-domain EMT model of the PSCAD/EMTDC simulation. The dynamic behavior of the STATCOM obtained with the phasor model can therefore be applied to analyzing power quality issues; the phasor model is faster, achieves high accuracy, and correlates well with PSCAD/EMTDC simulation results. (author)
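
    The essence of a dynamic phasor, a slowly varying Fourier coefficient extracted over a sliding window of the fundamental period, can be sketched in a few lines. The signal, rates and amplitude step below are arbitrary illustrations; this is not the STATCOM model itself.

    ```python
    import numpy as np

    f0, fs = 50.0, 10_000.0                  # fundamental and sampling rates
    t = np.arange(0, 0.2, 1 / fs)
    amp = 1.0 + 0.2 * (t > 0.1)              # amplitude step at t = 0.1 s
    x = amp * np.cos(2 * np.pi * f0 * t)

    N = int(fs / f0)                         # samples per fundamental cycle
    prod = x * np.exp(-2j * np.pi * f0 * t)
    # one-cycle moving average of x(t)e^{-j w0 t} -> time-varying phasor X1(t)
    X1 = 2 * np.convolve(prod, np.ones(N) / N, mode="same")
    print(np.abs(X1[int(0.05 * fs)]), np.abs(X1[int(0.15 * fs)]))  # ~1.0, ~1.2
    ```

    Tracking X1(t) instead of the raw waveform is what lets dynamic phasor models run much faster than detailed EMT simulation while preserving envelope dynamics.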

  13. Computational Modelling and Movement Analysis of Hip Joint with Muscles

    Science.gov (United States)

    Siswanto, W. A.; Yoon, C. C.; Salleh, S. Md.; Ngali, M. Z.; Yusup, Eliza M.

    2017-01-01

    In this study, the hip joint and the main muscles are modelled by finite elements. The parts included in the model are the hip joint, hemipelvis, gluteus maximus, quadratus femoris and gemellus inferior. The materials used in the model are isotropic elastic, Mooney-Rivlin and neo-Hookean. The hip resultant forces of normal gait and stair climbing are applied to the hip joint model, and the responses of displacement, stress and strain of the muscles are then recorded. The FEBio non-linear solver for biomechanics is employed to conduct the simulation of the hip joint model with muscles. The contact interfaces used in the model are sliding contact and tied contact. From the analysis results, the gluteus maximus has the maximum displacement, stress and strain in stair climbing. The quadratus femoris and gemellus inferior have the maximum displacement and strain in normal gait, but the maximum stress in stair climbing. In addition, the computational model of the hip joint with muscles serves as a research and investigation platform, and can be used as a visualization platform for the hip joint.

  14. Diffractive open charm production from the dipole model analysis

    International Nuclear Information System (INIS)

    Luszczak, A.; Golec-Biernat, K.

    2009-01-01

    The most promising QCD-based approach to DIS diffraction is formulated in terms of dipole models. In this analysis, we consider two important parameterizations of the dipole scattering amplitude, called GBW and CGC, in which parton saturation results are built in. We present a precise comparison of the results of the dipole models using these two parameterizations with the newest data from HERA. The comparison prompted us to discuss some subtle points of the dipole models, mostly related to the qqg component, and to connect them to the approach based on the diffractive parton distributions evolved with the Dokshitzer-Gribov-Lipatov-Altarelli-Parisi (DGLAP) equations. Within the latter approach, the diffractive open charm production, which is the main goal of this analysis, is particularly interesting since it is sensitive to the diffractive gluon distribution. Thus, we extracted the diffractive gluon distribution from the dipole model formulae to use it for the computation of the charm contribution to F_2^D. We found good agreement with the HERA data on diffractive open charm production, both for the gluon distributions from the considered dipole models and for the DGLAP fits to HERA data from our earlier analysis of diffractive parton distributions with higher twist. (authors)

  15. Hydraulic Hybrid Excavator—Mathematical Model Validation and Energy Analysis

    Directory of Open Access Journals (Sweden)

    Paolo Casoli

    2016-11-01

    Full Text Available Recent demands to reduce pollutant emissions and improve energy efficiency have driven the implementation of hybrid solutions in mobile machinery. This paper presents the results of a numerical and experimental analysis conducted on a hydraulic hybrid excavator (HHE). The machinery under study is a mid-size excavator whose standard version was modified with the introduction of an energy recovery system (ERS). The proposed ERS layout was designed to recover the potential energy of the boom, using a hydraulic accumulator as a storage device. The recovered energy is utilized through the pilot pump of the machinery, which operates as a motor, thus reducing the torque required from the internal combustion engine (ICE). The analysis reported in this paper validates the HHE model by comparing numerical and experimental data in terms of hydraulic and mechanical variables and fuel consumption. The mathematical model shows its capability to reproduce the realistic operating conditions of the realized prototype, tested in the field. A detailed energy analysis comparing the standard and the hybrid excavator models was carried out to evaluate the energy flows along the system, showing advantages, weaknesses and possibilities to further improve the machinery efficiency. Finally, the fuel consumption estimated by the model and that measured during the experiments are presented to highlight the fuel saving percentages. The HHE model is an important starting point for the development of other energy saving solutions.

  16. Predicate Argument Structure Analysis for Use Case Description Modeling

    Science.gov (United States)

    Takeuchi, Hironori; Nakamura, Taiga; Yamaguchi, Takahira

    In a large software system development project, many documents are prepared and updated frequently. In such a situation, support is needed for looking through these documents easily to identify inconsistencies and to maintain traceability. In this research, we focus on the requirements documents such as use cases and consider how to create models from the use case descriptions in unformatted text. In the model construction, we propose a few semantic constraints based on the features of the use cases and use them for a predicate argument structure analysis to assign semantic labels to actors and actions. With this approach, we show that we can assign semantic labels without enhancing any existing general lexical resources such as case frame dictionaries and design a less language-dependent model construction architecture. By using the constructed model, we consider a system for quality analysis of the use cases and automated test case generation to keep the traceability between document sets. We evaluated the reuse of the existing use cases and generated test case steps automatically with the proposed prototype system from real-world use cases in the development of a system using a packaged application. Based on the evaluation, we show how to construct models with high precision from English and Japanese use case data. Also, we could generate good test cases for about 90% of the real use cases through the manual improvement of the descriptions based on the feedback from the quality analysis system.

  17. Operational modal analysis modeling, Bayesian inference, uncertainty laws

    CERN Document Server

    Au, Siu-Kui

    2017-01-01

    This book presents operational modal analysis (OMA), employing a coherent and comprehensive Bayesian framework for modal identification and covering stochastic modeling, theoretical formulations, computational algorithms, and practical applications. Mathematical similarities and philosophical differences between Bayesian and classical statistical approaches to system identification are discussed, allowing their mathematical tools to be shared and their results correctly interpreted. Many chapters can be used as lecture notes for the general topic they cover beyond the OMA context. After an introductory chapter (1), Chapters 2–7 present the general theory of stochastic modeling and analysis of ambient vibrations. Readers are first introduced to the spectral analysis of deterministic time series (2) and structural dynamics (3), which do not require the use of probability concepts. The concepts and techniques in these chapters are subsequently extended to a probabilistic context in Chapter 4 (on stochastic pro...

  18. Application of data envelopment analysis models in supply chain management

    DEFF Research Database (Denmark)

    Soheilirad, Somayeh; Govindan, Kannan; Mardani, Abbas

    2017-01-01

    have been attained to reach a comprehensive review of data envelopment analysis models in evaluating supply chain management. Consequently, the selected published articles have been categorized by author name, year of publication, technique, application area, country, scope, data envelopment analysis purpose, study purpose, research gap and contribution, results and outcome, and the journals and conferences in which they appeared. The results of this study indicate that the areas of supplier selection, supply chain efficiency and sustainable supply chain have had a higher frequency than other areas... of interest as a mathematical tool to evaluate supply chain management. While various data envelopment analysis models have been suggested to measure and evaluate supply chain management, there is a lack of research regarding systematic literature review and classification of studies in this field...

  19. Signal analysis of accelerometry data using gravity-based modeling

    Science.gov (United States)

    Davey, Neil P.; James, Daniel A.; Anderson, Megan E.

    2004-03-01

    Triaxial accelerometers have been used to measure human movement parameters in swimming. Interpretation of the data is difficult due to interference sources, including the interaction of external bodies. In this investigation the authors developed a model to simulate the physical movement of the lower back. Theoretical accelerometry outputs were derived, thus giving an ideal, or noiseless, dataset. An experimental data collection apparatus was developed by adapting a system to the aquatic environment for the investigation of swimming. Model data were compared against recorded data and showed strong correlation. Comparison of recorded and modelled data can be used to identify changes in body movement; this is especially useful when cyclic patterns are present in the activity. Strong correlations between data sets allowed the development of signal processing algorithms for swimming stroke analysis, developed first on the pure noiseless data set and then applied to performance data. Video analysis was also used to validate study results and has shown potential to provide acceptable results.
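
    The model-versus-recording comparison described here amounts to correlating an ideal trace with a noisy, possibly lagged measurement. A minimal sketch with synthetic stand-in signals (the stroke frequency, lag and noise level are invented):

    ```python
    import numpy as np

    fs = 100.0                                   # sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    ideal = np.sin(2 * np.pi * 0.5 * t)          # modelled cyclic stroke pattern
    rng = np.random.default_rng(2)
    recorded = np.roll(ideal, 30) + rng.normal(0, 0.3, t.size)  # lagged + noisy

    # scan a range of lags and pick the one maximising the correlation
    lags = np.arange(-100, 101)
    cc = [np.corrcoef(ideal, np.roll(recorded, -k))[0, 1] for k in lags]
    best = lags[int(np.argmax(cc))]
    print(f"best lag = {best / fs:.2f} s, r = {max(cc):.2f}")
    ```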

  20. Therapeutic Implications from Sensitivity Analysis of Tumor Angiogenesis Models

    Science.gov (United States)

    Poleszczuk, Jan; Hahnfeldt, Philip; Enderling, Heiko

    2015-01-01

    Anti-angiogenic cancer treatments induce tumor starvation and regression by targeting the tumor vasculature that delivers oxygen and nutrients. Mathematical models prove to be valuable tools for studying the proof-of-concept, efficacy and underlying mechanisms of such treatment approaches. The effects of parameter value uncertainties for two models of tumor development under angiogenic signaling and anti-angiogenic treatment are studied. Data fitting is performed to compare the predictions of both models and to obtain nominal parameter values for sensitivity analysis. Sensitivity analysis reveals that the success of different cancer treatments depends on tumor size and tumor-intrinsic parameters. In particular, we show that tumors with ample vascular support can be successfully targeted with conventional cytotoxic treatments. On the other hand, tumors with curtailed vascular support are not limited by their growth rate and therefore interruption of neovascularization emerges as the most promising treatment target. PMID:25785600

  1. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
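
    The article's worked examples are in MATLAB and R; the sketch below shows the same embarrassingly parallel pattern in Python, with each worker running an independent batch of a toy Monte Carlo risk model (the demand/capacity model itself is invented for illustration).

    ```python
    import numpy as np
    from multiprocessing import Pool

    def run_batch(seed, n=100_000):
        # each batch is fully independent: its own seed, its own samples
        rng = np.random.default_rng(seed)
        demand = rng.lognormal(0.0, 0.5, n)
        capacity = rng.normal(2.0, 0.3, n)
        return np.mean(demand > capacity)        # failure probability estimate

    if __name__ == "__main__":
        with Pool(processes=4) as pool:
            estimates = pool.map(run_batch, range(16))  # 16 independent batches
        print(f"P(failure) = {np.mean(estimates):.4f} "
              f"(spread {np.std(estimates):.4f})")
    ```

    Because the batches share no state, the speed-up scales roughly with the number of cores, which is the defining property of embarrassingly parallel code.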

  2. A flammability and combustion model for integrated accident analysis

    International Nuclear Information System (INIS)

    Plys, M.G.; Astleford, R.D.; Epstein, M.

    1988-01-01

    A model for flammability characteristics and combustion of hydrogen and carbon monoxide mixtures is presented for application to severe accident analysis of Advanced Light Water Reactors (ALWR's). Flammability of general mixtures for thermodynamic conditions anticipated during a severe accident is quantified with a new correlation technique applied to data for several fuel and inertant mixtures and using accepted methods for combining these data. Combustion behavior is quantified by a mechanistic model consisting of a continuity and momentum balance for the burned gases, and considering an uncertainty parameter to match the idealized process to experiment. Benchmarks against experiment demonstrate the validity of this approach for a single recommended value of the flame flux multiplier parameter. The models presented here are equally applicable to analysis of current LWR's. 21 refs., 16 figs., 6 tabs

  3. Sensitivity analysis in a Lassa fever deterministic mathematical model

    Science.gov (United States)

    Abdullahi, Mohammed Baba; Doko, Umar Chado; Mamuda, Mamman

    2015-05-01

    Lassa virus, which causes Lassa fever, is on the list of potential bio-weapons agents. It was recently imported into Germany, the Netherlands, the United Kingdom and the United States as a consequence of the rapid growth of international traffic. A model with five mutually exclusive compartments related to Lassa fever is presented and the basic reproduction number analyzed. A sensitivity analysis of the deterministic model is performed in order to determine the relative importance of the model parameters to disease transmission. The result of the sensitivity analysis shows that the most sensitive parameter is human immigration, followed by the human recovery rate and then person-to-person contact. This suggests that control strategies should target human immigration, effective drugs for treatment, and education to reduce person-to-person contact.
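
    Sensitivity of a basic reproduction number is commonly quantified with the normalised forward sensitivity index, (p/R0)(dR0/dp). The sketch below applies it to a hypothetical R0 expression; the symbols are illustrative, not the Lassa model's actual parameters.

    ```python
    import sympy as sp

    beta, mu, gamma, Lam = sp.symbols("beta mu gamma Lambda", positive=True)
    R0 = beta * Lam / (mu * (mu + gamma))        # hypothetical R0 expression

    # normalised forward sensitivity index of R0 with respect to each parameter
    for p in (beta, Lam, mu, gamma):
        index = sp.simplify(sp.diff(R0, p) * p / R0)
        print(p, "->", index)
    # beta and Lambda have index +1; mu and gamma have negative indices, so
    # increases in recruitment/contact raise R0 while faster removal lowers it.
    ```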

  4. Analysis of DIRAC's behavior using model checking with process algebra

    CERN Document Server

    Remenska, Daniela; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Diaz, Ricardo Graciani; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof

    2012-01-01

    DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple, the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for analysis of an abstract model of a system. Unlike con...

  5. Analysis of the representation of alcohol consumption and its prevention, from the perspective of framing theory, in the Spanish press: El País, El Mundo, Abc and La Razón

    Directory of Open Access Journals (Sweden)

    María-José Rabadán-Zaragoza

    2012-01-01

    Full Text Available This article presents the results of the analysis of 103 alcohol-related texts published from January to June 2009 by four Spanish newspapers: El País (22), El Mundo (35), Abc (24), and La Razón (22). Two methods were used to examine the representation of alcohol: structural analysis based on Kayser's model (1982) and content analysis based on framing theory. The latter method has been used to examine the journalistic treatment of biotechnology (Durant, Bauer and Gaskell, 1998; Nisbet, Brossard and Kroepsch, 2003; Rodríguez-Luque, 2009) and has been adapted to examine the subject of drugs (Paricio, 2010). A reliability of 90% was achieved in the Cohen's kappa coefficient test for the 67 variables examined in this study. The genre most dedicated to the subject of alcohol was news articles (66.9%), whose main theme was related to conflict (19 texts), particularly the consequences of drunk driving (9 texts). The frames most used by journalists were crime (identified in 62 texts), new research results (in 11 texts), and epidemiology (in 15 texts). However, only a few texts were dedicated to the prevention of alcohol consumption (a theme in only 4.85% of the sample) and to the legal institutions dedicated to this activity (2.91%).

  6. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components, either in service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field strain distributions generated from optical techniques, such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
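
    A minimal sketch of the proposed workflow, shrinking two full-field maps to a handful of low-order descriptors before comparing them, is shown below using 2-D Fourier coefficients on synthetic fields; Zernike moments would play the same role on circular domains. The fields and the injected "model error" are invented.

    ```python
    import numpy as np

    ny = nx = 256
    y, x = np.mgrid[0:ny, 0:nx] / 256.0
    experiment = np.sin(3 * np.pi * x) * np.cos(2 * np.pi * y)   # measured strain map
    model = experiment + 0.05 * np.sin(9 * np.pi * x * y)        # slightly wrong model

    def descriptors(field, k=5):
        # keep only the k*k lowest-frequency Fourier magnitudes as descriptors
        F = np.fft.fft2(field) / field.size
        return np.abs(F[:k, :k]).ravel()

    d_exp, d_mod = descriptors(experiment), descriptors(model)
    residual = np.linalg.norm(d_exp - d_mod) / np.linalg.norm(d_exp)
    print(f"{d_exp.size} descriptors instead of {experiment.size} pixels, "
          f"relative discrepancy = {residual:.3f}")
    ```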

  7. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Directory of Open Access Journals (Sweden)

    Kiuru Aaro

    2003-01-01

    Full Text Available The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for detecting pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT) and nuclear medicine (NM) studies. This fluoroscopic examination takes only about 2 seconds for a perfusion study, with only a low radiation dose to the patient, and involves no preparation, no radioactive isotopes, and no contrast media.

  8. Capital Cost Optimization for Prefabrication: A Factor Analysis Evaluation Model

    Directory of Open Access Journals (Sweden)

    Hong Xue

    2018-01-01

    Full Text Available High capital cost is a significant hindrance to the promotion of prefabrication. In order to optimize cost management and reduce capital cost, this study aims to explore the latent factors and a factor analysis evaluation model. Semi-structured interviews were conducted to explore potential variables, and then a questionnaire survey was employed to collect professionals' views on their effects. After data collection, exploratory factor analysis was adopted to explore the latent factors. Seven latent factors were identified, including “Management Index”, “Construction Dissipation Index”, “Productivity Index”, “Design Efficiency Index”, “Transport Dissipation Index”, “Material increment Index” and “Depreciation amortization Index”. With these latent factors, a factor analysis evaluation model (FAEM), divided into a factor analysis model (FAM) and a comprehensive evaluation model (CEM), was established. The FAM was used to explore the effect of observed variables on the high capital cost of prefabrication, while the CEM was used to evaluate the comprehensive cost management level of prefabrication projects. Case studies were conducted to verify the models. The results revealed that collaborative management had a positive effect on the capital cost of prefabrication. Material increment costs and labor costs had significant impacts on production cost. This study demonstrated the potential of on-site management and standardization design to reduce capital cost. Hence, collaborative management is necessary for the cost management of prefabrication. Innovation and detailed design are needed to improve cost performance. New forms of precast component factories can be explored to reduce transportation cost. Meanwhile, targeted strategies can be adopted for different prefabrication projects. The findings optimize capital cost and improve cost performance by providing an evaluation and optimization model, which helps managers to
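
    For readers unfamiliar with the extraction step, the sketch below runs an exploratory factor analysis on synthetic survey items with a known two-factor structure; the data and loadings are invented, whereas the paper extracted seven factors from real questionnaire responses.

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    n = 300
    latent = rng.normal(size=(n, 2))                 # two true latent factors
    loadings = np.array([[0.9, 0.0], [0.8, 0.1],     # items 1-2 load on factor 1
                         [0.1, 0.7], [0.0, 0.9]])    # items 3-4 load on factor 2
    items = latent @ loadings.T + rng.normal(0, 0.3, (n, 4))  # observed items

    fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
    print(np.round(fa.components_.T, 2))             # item-by-factor loadings
    ```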

  9. Urban Sprawl Analysis and Modeling in Asmara, Eritrea

    Directory of Open Access Journals (Sweden)

    Mussie G. Tewolde

    2011-09-01

    Full Text Available The extension of the urban perimeter markedly cuts into available productive land. Hence, studies in urban sprawl analysis and modeling play an important role in ensuring sustainable urban development. The urbanization pattern of the Greater Asmara Area (GAA), the capital of Eritrea, was studied. Satellite images and geospatial tools were employed to analyze the spatiotemporal urban landuse changes. Object-Based Image Analysis (OBIA), Landuse Cover Change (LUCC) analysis and urban sprawl analysis using Shannon entropy were carried out. The Land Change Modeler (LCM) was used to develop a model of urban growth. A Multi-layer Perceptron Neural Network was employed to model the transition potential maps with an accuracy of 85.9%, and these were used as an input for the 'actual' urban modeling with Markov chains. Model validation was assessed and a scenario of urban landuse change in the GAA up to the year 2020 was presented. The results of the study indicated that the built-up area tripled in size (an increase of 4,441 ha) between 1989 and 2009. Especially after the year 2000, urban sprawl in the GAA caused large-scale encroachment on high-potential agricultural lands and plantation cover. The scenario for the year 2020 shows an increase of the built-up areas by 1,484 ha (25%), which may cause further loss. The study indicated that the land allocation system in the GAA overrode the landuse plan, which caused the loss of agricultural land and plantation cover. The recommended policy options might support decision makers in avoiding further loss of agricultural land and plantation cover and in achieving sustainable urban development planning in the GAA.
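
    The relative Shannon entropy used for the sprawl analysis is straightforward to compute from zonal built-up areas; the zone figures below are invented for illustration.

    ```python
    import numpy as np

    # Relative Shannon entropy: H = -sum(p_i ln p_i) / ln(n) over n spatial
    # zones, where p_i is each zone's share of built-up area. Values near 1
    # indicate dispersed (sprawling) growth; near 0, compact growth.
    builtup_ha = np.array([950, 620, 410, 380, 260, 190, 120, 70])  # per zone
    p = builtup_ha / builtup_ha.sum()
    H = -np.sum(p * np.log(p)) / np.log(p.size)
    print(f"relative entropy = {H:.2f}")
    ```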

  10. Source modelling in seismic risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Yucemen, M.S.

    1978-12-01

    The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of the uncertainties involved in seismic risk analysis for nuclear power plants. The potential earthquake activity zones are idealized as point, line or area sources. For these seismic source types, expressions to evaluate their contribution to seismic risk are derived, considering all possible site-source configurations. The seismic risk at a site is found to depend not only on the inherent randomness of earthquake occurrences with respect to magnitude, time and space, but also on the uncertainties associated with the predicted values of the seismic and geometric parameters, as well as the uncertainty in the attenuation model. The uncertainty due to the attenuation equation is incorporated into the analysis through the use of random correction factors. The influence of the uncertainty resulting from insufficient information on the seismic parameters and source geometry is introduced into the analysis by computing a mean risk curve averaged over the various alternative assumptions on the parameters and source geometry. A seismic risk analysis is carried out for the city of Denizli, which is located in the seismically most active zone of Turkey. The second analysis is for Akkuyu

  11. Comparative analysis of calculation models of railway subgrade

    Directory of Open Access Journals (Sweden)

    I.O. Sviatko

    2013-08-01

    Full Text Available Purpose. In the design of transport engineering structures, the primary task is to determine the parameters of the foundation soil and the nuances of its behaviour under load. When calculating the interaction between the soil subgrade and the upper track structure, it is very important to determine the parameters of shear resistance and the parameters that govern the development of deep deformations in foundation soils. The aim is to find generalized numerical modeling methods for the behaviour of embankment foundation soil that include not only the analysis of the foundation's stress state but also of its deformed state. Methodology. An analysis of existing modern and classical methods of numerical simulation of soil samples under static load was made. Findings. According to traditional methods of analysis of the behaviour of soil masses, limitation and qualitative estimation of subgrade deformations are possible only indirectly, through the estimation of stresses and comparison of the obtained values with boundary values. Originality. A new computational model is proposed that applies not only the classical analysis of the stress state of the soil subgrade but also takes the deformed state into account. Practical value. The analysis showed that, for an accurate analysis of the behaviour of soil masses, it is necessary to develop a generalized methodology for analyzing the rolling stock-railway subgrade interaction, which will use not only the classical approach of analyzing the stress state of the soil subgrade, but also take into account its deformed state.

  12. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  13. Porflow modeling supporting the FY14 saltstone special analysis

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Taylor, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2014-04-01

    PORFLOW-related analyses supporting the Saltstone FY14 Special Analysis (SA) described herein are based on prior modeling supporting the Saltstone FY13 SA. Notable changes to the previous round of simulations include: a) consideration of Saltstone Disposal Unit (SDU) design type 6 under “Nominal” and “Margin” conditions, b) omission of the clean cap fill from the nominal SDU 2 and 6 modeling cases as a reasonable approximation of greater waste grout fill heights, c) minor updates to the cementitious materials degradation analysis, d) use of updated I-129 sorption coefficient (Kd) values in soils, e) assignment of the pH/Eh environment of saltstone to the underlying floor concrete, considering downward flow through an SDU, and f) implementation of an improved sub-model for Tc release in an oxidizing environment. These new model developments are discussed and followed by a cursory presentation of simulation results. The new Tc release sub-model produced significantly improved (smoother) flux results compared to the FY13 SA. Further discussion of the PORFLOW model setup and simulation results will be presented in the FY14 SA, including dose results.

  14. An Effective Distributed Model for Power System Transient Stability Analysis

    Directory of Open Access Journals (Sweden)

    MUTHU, B. M.

    2011-08-01

    Full Text Available Modern power systems consist of many interconnected synchronous generators having different inertia constants, connected by a large transmission network, with an ever-increasing demand for power exchange. The size of the power system grows exponentially due to the increase in power demand. The data required for various power system applications have been stored in different formats in a heterogeneous environment. The power system applications themselves have been developed and deployed on different platforms and in different language paradigms. Interoperability between power system applications becomes a major issue because of this heterogeneous nature. The main aim of this paper is to develop a generalized distributed model for carrying out power system stability analysis. A flexible and loosely coupled JAX-RPC model has been developed for representing transient stability analysis in large interconnected power systems. The proposed model includes Pre-Fault, During-Fault, Post-Fault and Swing Curve services, which are accessible to remote power system clients when the system is subjected to large disturbances. A generalized XML-based model for data representation has also been proposed for exchanging data in order to enhance the interoperability between legacy power system applications. The performance measure, Round Trip Time (RTT), is estimated for different power systems using the proposed JAX-RPC model and compared with the results obtained using traditional client-server and Java RMI models.

  15. Integration Of Facility Modeling Capabilities For Nuclear Nonproliferation Analysis

    International Nuclear Information System (INIS)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  16. Concept analysis of moral courage in nursing: A hybrid model.

    Science.gov (United States)

    Sadooghiasl, Afsaneh; Parvizy, Soroor; Ebadi, Abbas

    2018-02-01

    Moral courage is one of the most fundamental virtues in the nursing profession; however, little attention has been paid to it, and as a result no exact and clear definition of moral courage has been available. This study was carried out to define and clarify the concept in the nursing profession. The study used a hybrid model of concept analysis comprising three phases, namely, a theoretical phase, a field work phase, and a final analysis phase. To find relevant literature, an electronic search of valid databases was conducted using keywords related to the concept of courage. Field work data were collected over an 11-month period from 2013 to 2014. In the field work phase, in-depth interviews were performed with 10 nurses. Conventional content analysis, following Graneheim and Lundman's stages, was used in the theoretical and field work phases, and the results were combined in the final analysis phase. Ethical considerations: Permission for this study was obtained from the ethics committee of Tehran University of Medical Sciences. Oral and written informed consent was received from the participants. Of the 750 titles retrieved in the theoretical phase, 26 texts were analyzed. The analysis resulted in 494 codes from the text analysis and 226 codes from the interview analysis. The literature review in the theoretical phase revealed two features of inherent-transcendental characteristics, two of which possessed a difficult nature. The field work phase added moral self-actualization, rationalism, spiritual beliefs, and scientific-professional qualifications to the features of the concept. Moral courage is a pure and prominent characteristic of human beings. The antecedents of moral courage include model orientation, model acceptance, rationalism, individual excellence, acquiring academic and professional qualifications, spiritual beliefs, organizational support, organizational repression, and internal and external personal barriers

  17. Delamination Modeling of Composites for Improved Crash Analysis

    Science.gov (United States)

    Fleming, David C.

    1999-01-01

    Finite element crash modeling of composite structures is limited by the inability of current commercial crash codes to accurately model delamination growth. Efforts are made to implement and assess delamination modeling techniques using a current finite element crash code, MSC/DYTRAN. Three methods are evaluated: a straightforward method based on monitoring forces in elements or constraints representing an interface; a cohesive fracture model proposed in the literature; and the virtual crack closure technique commonly used in fracture mechanics. Results are compared with dynamic double cantilever beam test data from the literature. Examples show that it is possible to accurately model delamination propagation in this case. However, the computational demands required for an accurate solution are great, and reliable property data may not be available to support general crash modeling efforts. Additional examples are modeled, including an impact-loaded beam, damage initiation in laminated crushing specimens, and scaled aircraft subfloor structures in which composite sandwich structures are used as energy-absorbing elements. These examples illustrate some of the difficulties in modeling delamination as part of a finite element crash analysis.

  18. Bayesian analysis of a reduced-form air quality model.

    Science.gov (United States)

    Foley, Kristen M; Reich, Brian J; Napelenok, Sergey L

    2012-07-17

    Numerical air quality models are being used for assessing emission control strategies for improving ambient pollution levels across the globe. This paper applies probabilistic modeling to evaluate the effectiveness of emission reduction scenarios aimed at lowering ground-level ozone concentrations. A Bayesian hierarchical model is used to combine air quality model output and monitoring data in order to characterize the impact of emissions reductions while accounting for different degrees of uncertainty in the modeled emissions inputs. The probabilistic model predictions are weighted based on population density in order to better quantify the societal benefits/disbenefits of four hypothetical emission reduction scenarios in which domain-wide NOx emissions from various sectors are reduced individually and then simultaneously. Cross validation analysis shows the statistical model performs well compared to observed ozone levels. Accounting for the variability and uncertainty in the emissions and atmospheric systems being modeled is shown to impact how emission reduction scenarios would be ranked, compared to standard methodology.
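
    The core idea, combining a model prediction with monitoring data in proportion to their precisions, reduces in the simplest conjugate case to a normal-normal update. A minimal sketch with illustrative numbers follows; the paper itself uses a much richer spatial hierarchical model.

    ```python
    import numpy as np

    # Prior: the air quality model's prediction of ozone at a site (ppb),
    # with an assumed uncertainty. Likelihood: monitor observations.
    prior_mean, prior_sd = 68.0, 6.0
    obs = np.array([61.0, 63.5, 60.2])
    obs_sd = 4.0                              # assumed measurement error

    # conjugate normal-normal posterior: precision-weighted combination
    prec = 1 / prior_sd**2 + obs.size / obs_sd**2
    post_mean = (prior_mean / prior_sd**2 + obs.sum() / obs_sd**2) / prec
    post_sd = prec ** -0.5
    print(f"posterior ozone: {post_mean:.1f} +/- {post_sd:.1f} ppb")
    ```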

  19. Analysis specifications for the CC3 geosphere model GEONET

    International Nuclear Information System (INIS)

    Melnyk, T.W.

    1995-04-01

    AECL is assessing a concept for disposing of Canada's nuclear fuel waste in a sealed vault deep in plutonic rock of the Canadian Shield. A computer program has been developed as an analytical tool for the postclosure assessment. For the case study, a system model, CC3 (Canadian Concept, generation 3), has been developed to describe a hypothetical disposal system. This system model includes separate models for the engineered barriers within the disposal vault, the geosphere in which the vault is emplaced, and the biosphere in the vicinity of any discharge zones. The system model is embedded within the computer code SYVAC3 (SYstems Variability Analysis Code, generation 3), which takes parameter uncertainty into account by repeated simulation of the system. GEONET (GEOsphere NETwork) is the geosphere model component of this system model. It simulates contaminant transport from the vault to the biosphere along a transport network composed of one-dimensional transport segments connected together in three-dimensional space. This document is a set of specifications for GEONET that were developed over a number of years; improvements to the code will be based on revisions to these specifications. The specifications consist of a model synopsis describing all the relevant equations and assumptions used in the model, a set of formal data flow diagrams and mini-specifications, and a data dictionary. (author). 26 refs., 20 figs

  20. Stoichiometry, microbial community composition and decomposition: a modelling analysis

    Science.gov (United States)

    Berninger, Frank; Zhou, Xuan; Aaltonen, Heidi; Köster, Kajar; Heinonsalo, Jussi; Pumpanen, Jukka

    2017-04-01

    Enzyme-activity-based litter decomposition models describe the decomposition of soil organic matter as a function of microbial biomass and its activity. In these models, decomposition depends largely on microbial and litter stoichiometry. We used the model of Schimel and Weintraub (Soil Biology & Biochemistry 35 (2003) 549-563), largely relying on the modification of Waring et al. (Ecology Letters (2013) 16: 887-894), and modified it to include bacteria, fungi and mycorrhizal fungi as decomposer groups with different assumed stoichiometries. The model was tested against previously published data from a fire chronosequence in northern Finland. The model reproduced well the development of soil organic matter, microbial biomasses and enzyme activities with time after fire. In a theoretical model analysis, we tried to understand how the exchange of carbon and nitrogen between mycorrhiza and the plant interacts with different litter stoichiometries. The results indicate that if a high percentage of fungal N uptake is transferred to the plant, mycorrhizal biomass decreases drastically and, because of the low mycorrhizal biomass, so does the N uptake of plants. If a lower proportion of the fungal N uptake is transferred to the plant, the N uptake of the plants is reasonably stable while the proportion of mycorrhiza in the total fungal biomass varies. The model is also able to simulate priming of soil organic matter decomposition.

  1. Modelling pesticides volatilisation in greenhouses: Sensitivity analysis of a modified PEARL model.

    Science.gov (United States)

    Houbraken, Michael; Doan Ngoc, Kim; van den Berg, Frederik; Spanoghe, Pieter

    2017-12-01

    The application of the existing PEARL model was extended to include estimates of the concentration of crop protection products in greenhouse (indoor) air due to volatilisation from the plant surface. The model was modified to include the processes of ventilation of the greenhouse air to the outside atmosphere and transformation in the air. A sensitivity analysis of the model was performed by varying selected input parameters on a one-by-one basis and comparing the model outputs with the outputs of the reference scenarios. The sensitivity analysis indicates that, in addition to vapour pressure, the model outputs had the highest ratio of variation for the ventilation rate and the thickness of the boundary layer on the day of application. On the days after application, the competing processes, degradation and uptake in the plant, become more important. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Assessing climate model software quality: a defect density analysis of three models

    Directory of Open Access Journals (Sweden)

    J. Pipitone

    2012-08-01

    Full Text Available A climate model is an executable theory of the climate; the model encapsulates climatological theories in software so that they can be simulated and their implications investigated. Thus, in order to trust a climate model, one must trust that the software it is built from is built correctly. Our study explores the nature of software quality in the context of climate modelling. We performed an analysis of defect reports and defect fixes in several versions of leading global climate models by collecting defect data from bug tracking systems and version control repository comments. We found that the climate models all have very low defect densities compared to well-known, similarly sized open-source projects. We discuss the implications of our findings for the assessment of climate model software trustworthiness.
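
    Defect density itself is a one-line computation, confirmed defects per thousand source lines of code; the counts below are placeholders, not the study's measured values.

    ```python
    # Defect density = defects / KSLOC; hypothetical figures for two models
    models = {"model_A": (84, 410_000), "model_B": (151, 780_000)}
    for name, (defects, sloc) in models.items():
        print(f"{name}: {defects / (sloc / 1000):.2f} defects/KSLOC")
    ```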

  3. Analysis of the operational model of Colombian electronic billing

    Directory of Open Access Journals (Sweden)

    Sérgio Roberto da Silva

    2016-06-01

    Full Text Available Colombia has been one of the first countries to introduce an electronic billing process on a voluntary basis, moving from a traditional paper version to a digital one. In this context, the article analyzes the electronic billing process implemented in Colombia and its advantages. The research methodology is applied, qualitative, descriptive, and documentary: the regulatory framework and the conceptualization of the model are identified, the process of adopting electronic billing is analyzed, and finally the advantages and disadvantages of its implementation are assessed. The findings indicate that the model applied in Colombia for issuing, sending, and receiving electronic invoices is not complex, but it requires an adequate infrastructure and trained personnel to reach all sectors, especially micro and small businesses, which form the largest business network in the country.

  4. Benchmarking analysis of three multimedia models: RESRAD, MMSOILS, and MEPAS

    International Nuclear Information System (INIS)

    Cheng, J.J.; Faillace, E.R.; Gnanapragasam, E.K.

    1995-11-01

    Multimedia modelers from the United States Environmental Protection Agency (EPA) and the United States Department of Energy (DOE) collaborated to conduct a comprehensive and quantitative benchmarking analysis of three multimedia models. The three models, RESRAD (DOE), MMSOILS (EPA), and MEPAS (DOE), represent analytically based tools that are used by the respective agencies for performing human exposure and health risk assessments. The study was performed by individuals who participate directly in the ongoing design, development, and application of the models. A list of physical/chemical/biological processes related to multimedia-based exposure and risk assessment is first presented as a basis for comparing the overall capabilities of RESRAD, MMSOILS, and MEPAS. Model design, formulation, and function are then examined by applying the models to a series of hypothetical problems. Major components of the models (e.g., atmospheric, surface water, groundwater) are evaluated separately and then studied as part of an integrated system for the assessment of a multimedia release scenario to determine effects due to linking components of the models. Seven modeling scenarios are used in this benchmarking study: (1) direct biosphere exposure, (2) direct release to the air, (3) direct release to the vadose zone, (4) direct release to the saturated zone, (5) direct release to surface water, (6) surface water hydrology, and (7) multimedia release. Study results show that the models differ with respect to (1) the environmental processes included (i.e., model features) and (2) the mathematical formulation and assumptions related to the implementation of solutions (i.e., parameterization).

  5. Selecting an Appropriate Upscaled Reservoir Model Based on Connectivity Analysis

    Directory of Open Access Journals (Sweden)

    Preux Christophe

    2016-09-01

    Full Text Available Reservoir engineers aim to build reservoir models to investigate fluid flows within hydrocarbon reservoirs. These models consist of three-dimensional grids populated by petrophysical properties. In this paper, we focus on permeability, which is known to significantly influence fluid flow. Reservoir models usually encompass a very large number of fine grid blocks to better represent heterogeneities. However, performing fluid flow simulations for such fine models is highly CPU-time consuming. A common practice consists in converting the fine models into coarse models with fewer grid blocks: this is the upscaling process. Many upscaling methods have been proposed in the literature, all of which lead to distinct coarse models. The problem is how to choose the appropriate upscaling method. Various criteria have been established to evaluate the information loss due to upscaling, but none of them investigate connectivity. In this paper, we propose to first perform a connectivity analysis for the fine and candidate coarse models. This makes it possible to identify shortest paths connecting wells. Then, we introduce two indicators to quantify the length and trajectory mismatch between the paths for the fine and the coarse models. The upscaling technique to be recommended is the one that provides the coarse model for which the shortest paths are the closest to the shortest paths determined for the fine model, both in terms of length and trajectory. Last, the potential of this methodology is demonstrated on two test cases. We show that the two indicators help select suitable upscaling techniques as long as gravity is not a prominent factor driving fluid flows.
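
    The connectivity analysis can be illustrated in a few lines, assuming the networkx library: shortest well-to-well paths are computed on fine and naively upscaled permeability grids with edge costs inversely related to permeability, and a simple length-mismatch indicator is evaluated. The cost function and the indicator below are plausible readings of the approach, not the paper's exact definitions.

```python
# Connectivity sketch: well-to-well shortest paths on a permeability grid.
import networkx as nx
import numpy as np

def shortest_path(perm, injector, producer):
    g = nx.grid_2d_graph(*perm.shape)
    for u, v in g.edges:
        g.edges[u, v]["cost"] = 2.0 / (perm[u] + perm[v])  # low perm -> high cost
    return nx.shortest_path(g, injector, producer, weight="cost")

rng = np.random.default_rng(0)
fine = rng.lognormal(sigma=1.0, size=(40, 40))          # fine permeability field
coarse = fine.reshape(20, 2, 20, 2).mean(axis=(1, 3))   # naive 2x2 upscaling

p_fine = shortest_path(fine, (0, 0), (39, 39))
p_coarse = shortest_path(coarse, (0, 0), (19, 19))
# Length-mismatch indicator between fine and (rescaled) coarse paths:
print(abs(len(p_fine) - 2 * len(p_coarse)) / len(p_fine))
```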

  6. Analysis and optimization of a camber morphing wing model

    Directory of Open Access Journals (Sweden)

    Bing Li

    2016-09-01

    Full Text Available This article proposes a camber morphing wing model that can continuously change its camber. A mathematical model is proposed and a kinematic simulation is performed to verify the wing's ability to change camber. An aerodynamic model is used to test its aerodynamic characteristics, and several key aerodynamic analyses are performed. A comparative analysis is conducted to explore the relationships between the aerodynamic parameters, the rotation angle of the trailing edge, and the angle of attack. An improved artificial fish swarm optimization algorithm is proposed, referred to as the weighted adaptive artificial fish swarm with embedded Hooke–Jeeves search method. Comparison tests are used to evaluate the performance of the improved optimization algorithm. Finally, the proposed optimization algorithm is used to optimize the proposed camber morphing wing model.
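
    The Hooke-Jeeves component embedded in the improved algorithm is a classical derivative-free pattern search: exploratory moves along each coordinate, followed by pattern moves in the direction of improvement. A minimal sketch on an illustrative quadratic objective (the aerodynamic objective is not reproduced here):

```python
# Minimal Hooke-Jeeves pattern search on a toy objective.
import numpy as np

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        xe, fe = x.copy(), fx
        for i in range(len(x)):            # exploratory move: probe +/- step
            for d in (step, -step):
                trial = xe.copy()
                trial[i] += d
                ft = f(trial)
                if ft < fe:
                    xe, fe = trial, ft
                    break
        if fe < fx:                        # pattern move: jump along x -> xe
            pattern = xe + (xe - x)
            x, fx = (pattern, f(pattern)) if f(pattern) < fe else (xe, fe)
        else:
            step *= shrink                 # no improvement: shrink the mesh
            if step < tol:
                break
    return x, fx

x_best, f_best = hooke_jeeves(lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2,
                              [5.0, 5.0])
print(x_best, f_best)                      # should approach (1, -2)
```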

  7. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    Full Text Available This paper presents the hands-on modeling toolbox HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration, and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation, and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
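
    The ensemble idea behind the toolbox can be illustrated outside MATLAB as well. The sketch below runs a toy one-parameter bucket model (a stand-in for HBV, not the toolbox itself) under many sampled parameter values and extracts an uncertainty band; all names and values are illustrative.

```python
# Ensemble-simulation sketch: many parameter sets, one uncertainty band.
import numpy as np

rng = np.random.default_rng(42)
precip = rng.gamma(shape=0.3, scale=10.0, size=100)   # synthetic daily rain

def bucket_model(precip, k):
    storage, flow = 0.0, []
    for p in precip:
        storage += p
        q = k * storage            # linear-reservoir outflow
        storage -= q
        flow.append(q)
    return np.array(flow)

ensemble = np.array([bucket_model(precip, k)
                     for k in rng.uniform(0.05, 0.5, size=200)])
lo, hi = np.percentile(ensemble, [5, 95], axis=0)     # 90% uncertainty band
print(f"day 50 streamflow 90% band: [{lo[50]:.1f}, {hi[50]:.1f}]")
```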

  8. QSAR modeling and chemical space analysis of antimalarial compounds

    Science.gov (United States)

    Sidorov, Pavel; Viira, Birgit; Davioud-Charvet, Elisabeth; Maran, Uko; Marcou, Gilles; Horvath, Dragos; Varnek, Alexandre

    2017-05-01

    Generative topographic mapping (GTM) has been used to visualize and analyze the chemical space of antimalarial compounds as well as to build predictive models linking structure of molecules with their antimalarial activity. For this, a database, including 3000 molecules tested in one or several of 17 anti-Plasmodium activity assessment protocols, has been compiled by assembling experimental data from in-house and ChEMBL databases. GTM classification models built on subsets corresponding to individual bioassays perform similarly to the earlier reported SVM models. Zones preferentially populated by active and inactive molecules, respectively, clearly emerge in the class landscapes supported by the GTM model. Their analysis resulted in identification of privileged structural motifs of potential antimalarial compounds. Projection of marketed antimalarial drugs on this map allowed us to delineate several areas in the chemical space corresponding to different mechanisms of antimalarial activity. This helped us to make a suggestion about the mode of action of the molecules populating these zones.

  9. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1991-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbation theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly, manpower-intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems. (author). 9 refs, 1 tab
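
    GRESS works by source transformation, but the underlying idea, propagating derivatives through a code alongside values, is the forward mode of automatic differentiation. A toy dual-number sketch of that idea follows (not GRESS itself, which targets FORTRAN):

```python
# Forward-mode automatic differentiation with dual numbers.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,                  # product rule
                    self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def model(x):
    # Stand-in for a code whose output sensitivity we want: f(x) = 3x^2 + 2x
    return 3 * x * x + 2 * x

x = Dual(2.0, 1.0)        # seed derivative dx/dx = 1
y = model(x)
print(y.val, y.der)       # 16.0 and df/dx = 6x + 2 = 14.0
```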

  10. Rotor-Flying Manipulator: Modeling, Analysis, and Control

    Directory of Open Access Journals (Sweden)

    Bin Yang

    2014-01-01

    Full Text Available Equipping multijoint manipulators on a mobile robot is a typical redesign scheme to enable the robot to actively influence its surroundings, and it has been used extensively for many ground robots, underwater robots, and space robotic systems. However, such a redesign is difficult for the rotor-flying robot (RFR). This is mainly because the motion of the manipulator introduces heavy coupling between itself and the RFR system, which makes the system model highly complicated and the controller design difficult. Thus, in this paper, the modeling, analysis, and control of the combined system, called the rotor-flying multijoint manipulator (RF-MJM), are conducted. Firstly, the detailed dynamics model is constructed and analyzed. Subsequently, a full-state feedback linear quadratic regulator (LQR) controller is designed by linearizing the model near the steady state. Finally, simulations are conducted and the results are analyzed to show the basic control performance.
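
    The LQR design step can be sketched generically, assuming SciPy. The two-state system below is a placeholder; the actual RF-MJM linearization has far more states.

```python
# LQR sketch for a linearized model: solve the Riccati equation, form the gain.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],           # e.g. [attitude, attitude rate]
              [0.5, 0.0]])          # unstable open loop
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])            # state weights
R = np.array([[0.1]])               # control-effort weight

P = solve_continuous_are(A, B, Q, R)        # Riccati solution
K = np.linalg.inv(R) @ B.T @ P              # optimal gain: u = -K x
print("gain K:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```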

  11. Modeling Illicit Drug Use Dynamics and Its Optimal Control Analysis

    Directory of Open Access Journals (Sweden)

    Steady Mushayabasa

    2015-01-01

    Full Text Available The global burden of death and disability attributable to illicit drug use remains a significant threat to public health in both developed and developing nations. This paper presents a new mathematical modeling framework to investigate the effects of illicit drug use in the community. In our model the transmission process is captured as a social “contact” process between susceptible individuals and illicit drug users. We conduct both epidemic and endemic analyses, with a focus on the threshold dynamics characterized by the basic reproduction number. Using our model, we present illustrative numerical results with a case study in the Cape Town, Gauteng, Mpumalanga, and Durban communities of South Africa. In addition, the basic model is extended to incorporate time-dependent intervention strategies.
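
    A generic version of such a model is easy to write down: susceptibles become users through a social contact term, and the basic reproduction number falls out of the rates. The compartment structure and R0 formula below are standard textbook illustrations, not the paper's exact system.

```python
# Sketch of a drug-epidemic compartmental model with contact transmission.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma, mu = 0.3, 0.1, 0.02    # contact, quitting, demographic rates

def rhs(t, y):
    s, u = y                         # susceptible and drug-using fractions
    new_users = beta * s * u         # social "contact" transmission
    return [mu - new_users - mu * s,
            new_users - (gamma + mu) * u]

r0 = beta / (gamma + mu)             # basic reproduction number
sol = solve_ivp(rhs, (0, 400), [0.99, 0.01])
print(f"R0 = {r0:.2f}, final user fraction = {sol.y[1, -1]:.3f}")
```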

  12. Coarse Analysis of Microscopic Models using Equation-Free Methods

    DEFF Research Database (Denmark)

    Marschler, Christian

    -dimensional models. The goal of this thesis is to investigate such high-dimensional multiscale models and extract relevant low-dimensional information from them. Recently developed mathematical tools allow one to reach this goal: a combination of so-called equation-free methods with numerical bifurcation analysis ... using short simulation bursts of computationally expensive complex models. This information is subsequently used to construct bifurcation diagrams that show the parameter dependence of solutions of the system. The methods developed for this thesis have been applied to a wide range of relevant problems ... Applications include the learning behavior in the barn owl’s auditory system, traffic jam formation in an optimal velocity model for circular car traffic, and oscillating behavior of pedestrian groups in a counter-flow through a corridor with a narrow door. The methods not only quantify interesting properties ...

  13. Introduction to modeling and analysis of stochastic systems

    CERN Document Server

    Kulkarni, V G

    2011-01-01

    This is an introductory-level text on stochastic modeling. It is suited for undergraduate students in engineering, operations research, statistics, mathematics, actuarial science, business management, computer science, and public policy. It employs a large number of examples to teach the students to use stochastic models of real-life systems to predict their performance, and to use this analysis to design better systems. The book is devoted to the study of important classes of stochastic processes: discrete and continuous time Markov processes, Poisson processes, renewal and regenerative processes, semi-Markov processes, queueing models, and diffusion processes. The book systematically studies the short-term and the long-term behavior, cost/reward models, and first passage times. All the material is illustrated with many examples and case studies. The book provides a concise review of probability in the appendix. The book emphasizes numerical answers to the problems. A collection of MATLAB programs to accompany...
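
    As a small taste of the book's subject matter, the sketch below simulates a three-state discrete-time Markov chain and checks the long-run state frequencies against the stationary distribution; the transition matrix is illustrative.

```python
# Discrete-time Markov chain: stationary distribution vs. simulation.
import numpy as np

P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])      # row-stochastic transition matrix

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()

rng = np.random.default_rng(1)
state, visits = 0, np.zeros(3)
for _ in range(100_000):
    state = rng.choice(3, p=P[state])
    visits[state] += 1

print("stationary:", np.round(pi, 3))
print("simulated :", np.round(visits / visits.sum(), 3))
```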

  15. Modeling and evaluating user behavior in exploratory visual analysis

    Energy Technology Data Exchange (ETDEWEB)

    Reda, Khairi; Johnson, Andrew E.; Papka, Michael E.; Leigh, Jason

    2016-07-25

    Empirical evaluation methods for visualizations have traditionally focused on assessing the outcome of the visual analytic process as opposed to characterizing how that process unfolds. There are only a handful of methods that can be used to systematically study how people use visualizations, making it difficult for researchers to capture and characterize the subtlety of cognitive and interaction behaviors users exhibit during visual analysis. To validate and improve visualization design, however, it is important for researchers to be able to assess and understand how users interact with visualization systems under realistic scenarios. This paper presents a methodology for modeling and evaluating the behavior of users in exploratory visual analysis. We model visual exploration using a Markov chain process comprising transitions between mental, interaction, and computational states. These states and the transitions between them can be deduced from a variety of sources, including verbal transcripts, videos and audio recordings, and log files. This model enables the evaluator to characterize the cognitive and computational processes that are essential to insight acquisition in exploratory visual analysis, and reconstruct the dynamics of interaction between the user and the visualization system. We illustrate this model with two exemplar user studies, and demonstrate the qualitative and quantitative analytical tools it affords.
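
    Fitting such a Markov chain model comes down to estimating transition probabilities between coded states from logged sequences. A minimal sketch follows; the state names and toy session logs are hypothetical, not the paper's coding scheme.

```python
# Estimate Markov transition probabilities from coded analyst sessions.
from collections import Counter, defaultdict

states = ["think", "interact", "compute"]
logs = [["think", "interact", "compute", "think", "interact"],
        ["interact", "compute", "compute", "think"]]   # toy coded sessions

counts = defaultdict(Counter)
for session in logs:
    for a, b in zip(session, session[1:]):   # count observed transitions
        counts[a][b] += 1

for a in states:
    total = sum(counts[a].values()) or 1     # avoid division by zero
    row = {b: counts[a][b] / total for b in states}
    print(a, "->", row)
```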

  16. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-01-01

    A method of accounting for fluid-to-fluid shear between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented in a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile, which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)
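
    The central quantity here is the equivalent hydraulic diameter, D_h = 4A / P_w, where A is the flow area and P_w the wetted perimeter of the cell. A minimal sketch with illustrative subchannel dimensions:

```python
# Equivalent hydraulic diameter, D_h = 4A / P_w, for a rod-centered
# subchannel. Dimensions are illustrative, not from the paper.
import math

def hydraulic_diameter(flow_area, wetted_perimeter):
    return 4.0 * flow_area / wetted_perimeter

pitch, rod_d = 0.0126, 0.0095                  # m, PWR-like subchannel
area = pitch**2 - math.pi * rod_d**2 / 4.0     # square cell minus rod
perimeter = math.pi * rod_d                    # rod surface wets the channel
print(f"D_h = {hydraulic_diameter(area, perimeter):.4f} m")
```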

  17. Sentiment Analysis of Reviews Based on the ARCNN Model

    Science.gov (United States)

    Xu, Xiaoyu; Xu, Ming; Xu, Jian; Zheng, Ning; Yang, Tao

    2017-10-01

    Sentiment analysis of product reviews is designed to help customers understand the status of a product. Traditional sentiment analysis methods rely on a fixed-length input feature vector, which is the performance bottleneck of the basic encoder-decoder architecture. In this paper, we propose an attention-based BRNN-CNN model, referred to as the ARCNN model. To capture the semantic relations between words and to avoid the curse of dimensionality, we use the GloVe algorithm to train the vector representations of words. The ARCNN model is then used to train deep features: the BRNN component handles variable-length inputs and preserves time-series information, while the CNN component learns deeper semantic connections. Moreover, the attention mechanism automatically learns from the data and optimizes the allocation of weights. Finally, a softmax classifier completes the sentiment classification of the reviews. Experiments show that the proposed method improves the accuracy of sentiment classification compared with benchmark methods.
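
    A compact sketch of an ARCNN-style network is given below, assuming PyTorch. The layer sizes, the additive attention form, and the randomly initialized embeddings (standing in for pretrained GloVe vectors) are illustrative guesses, not the authors' exact architecture.

```python
# ARCNN-style sketch: embedding -> BiLSTM (BRNN) -> CNN -> attention -> classifier.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ARCNNSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden=64, n_classes=2):
        super().__init__()
        # In the paper, embeddings would be initialized from GloVe vectors.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.brnn = nn.LSTM(embed_dim, hidden, batch_first=True,
                            bidirectional=True)   # BRNN branch
        self.conv = nn.Conv1d(2 * hidden, hidden, kernel_size=3,
                              padding=1)          # CNN over BRNN features
        self.attn = nn.Linear(hidden, 1)          # additive attention scores
        self.out = nn.Linear(hidden, n_classes)

    def forward(self, tokens):                    # tokens: (batch, seq_len)
        h, _ = self.brnn(self.embed(tokens))      # (batch, seq, 2*hidden)
        c = F.relu(self.conv(h.transpose(1, 2)))  # (batch, hidden, seq)
        c = c.transpose(1, 2)                     # (batch, seq, hidden)
        w = torch.softmax(self.attn(c).squeeze(-1), dim=1)  # attention weights
        ctx = (w.unsqueeze(-1) * c).sum(dim=1)    # weighted sum over time
        return self.out(ctx)                      # logits; softmax at loss time

model = ARCNNSketch(vocab_size=5000)
logits = model(torch.randint(0, 5000, (8, 40)))   # 8 reviews, 40 tokens each
print(logits.shape)                               # torch.Size([8, 2])
```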

  18. Dynamic Response of Linear Mechanical Systems Modeling, Analysis and Simulation

    CERN Document Server

    Angeles, Jorge

    2012-01-01

    Dynamic Response of Linear Mechanical Systems: Modeling, Analysis and Simulation can be utilized for a variety of courses, including junior and senior-level vibration and linear mechanical analysis courses. The author connects, by means of a rigorous, yet intuitive approach, the theory of vibration with the more general theory of systems. The book features: A seven-step modeling technique that helps structure the rather unstructured process of mechanical-system modeling A system-theoretic approach to deriving the time response of the linear mathematical models of mechanical systems The modal analysis and the time response of two-degree-of-freedom systems—the first step on the long way to the more elaborate study of multi-degree-of-freedom systems—using the Mohr circle Simple, yet powerful simulation algorithms that exploit the linearity of the system for both single- and multi-degree-of-freedom systems Examples and exercises that rely on modern computational toolboxes for both numerical and symbolic compu...
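
    The modal analysis of a two-degree-of-freedom system mentioned above reduces to a generalized eigenproblem, K v = ω² M v. A minimal sketch, assuming SciPy, with illustrative masses and stiffnesses:

```python
# Modal analysis of a two-degree-of-freedom mass-spring chain:
# solve the generalized symmetric eigenproblem K v = w^2 M v.
import numpy as np
from scipy.linalg import eigh

M = np.diag([1.0, 2.0])                      # kg
K = np.array([[300.0, -100.0],
              [-100.0, 100.0]])              # N/m, two springs in a chain

w2, modes = eigh(K, M)                       # eigenvalues are w^2
print("natural frequencies (rad/s):", np.sqrt(w2))
print("mode shapes (columns):", modes)
```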

  19. Multifunctional Collaborative Modeling and Analysis Methods in Engineering Science

    Science.gov (United States)

    Ransom, Jonathan B.; Broduer, Steve (Technical Monitor)

    2001-01-01

    Engineers are challenged to produce better designs in less time and for less cost. Hence, to investigate novel and revolutionary design concepts, accurate, high-fidelity results must be assimilated rapidly into the design, analysis, and simulation process. This assimilation should consider diverse mathematical modeling and multi-discipline interactions necessitated by concepts exploiting advanced materials and structures. Integrated high-fidelity methods with diverse engineering applications provide the enabling technologies to assimilate these high-fidelity, multi-disciplinary results rapidly at an early stage in the design. These integrated methods must be multifunctional, collaborative, and applicable to the general field of engineering science and mechanics. Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations, including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple-method approach is advantageous when interfacing diverse disciplines in which each method's strengths are utilized. The multifunctional methodology presented provides an effective mechanism by which domains with diverse idealizations are
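
    The method of weighted residuals underlying these interface conditions can be illustrated on a single 1D subdomain: a Galerkin finite element solution of -u'' = 1 with homogeneous Dirichlet ends, checked against the exact solution x(1 - x)/2. This is a toy illustration of the residual-weighting machinery, not the paper's multi-subdomain coupling.

```python
# Galerkin (weighted-residuals) sketch: -u'' = 1 on [0, 1], u(0) = u(1) = 0,
# with linear finite elements; nodal values are exact for this problem.
import numpy as np

n = 8                                   # number of elements
h = 1.0 / n
nodes = np.linspace(0.0, 1.0, n + 1)

K = np.zeros((n + 1, n + 1))
f = np.zeros(n + 1)
for e in range(n):                      # assemble element contributions
    i, j = e, e + 1
    ke = np.array([[1, -1], [-1, 1]]) / h   # local stiffness
    fe = np.array([0.5, 0.5]) * h           # local load for f(x) = 1
    K[np.ix_([i, j], [i, j])] += ke
    f[[i, j]] += fe

Ki, fi = K[1:-1, 1:-1], f[1:-1]         # apply homogeneous Dirichlet BCs
u = np.zeros(n + 1)
u[1:-1] = np.linalg.solve(Ki, fi)

exact = nodes * (1 - nodes) / 2
print("max nodal error:", np.abs(u - exact).max())
```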

  20. Sensitivity Analysis of the Integrated Medical Model for ISS Programs

    Science.gov (United States)

    Goodenow, D. A.; Myers, J. G.; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Young, M.

    2016-01-01

    Sensitivity analysis estimates the relative contribution of the uncertainty in input values to the uncertainty of model outputs. Partial Rank Correlation Coefficient (PRCC) and Standardized Rank Regression Coefficient (SRRC) are methods of conducting sensitivity analysis on nonlinear simulation models like the Integrated Medical Model (IMM). The PRCC method estimates the sensitivity using partial correlation of the ranks of the generated input values to each generated output value. The "partial" part of the name refers to the adjustments made for the linear effects of all the other input values in the calculation of the correlation between a particular input and each output. In SRRC, standardized regression-based coefficients measure the effect of each input, adjusted for all the other inputs, on each output. Because the relative ranking of each of the inputs and outputs is used, as opposed to the values themselves, both methods accommodate the nonlinear relationship of the underlying model. As part of the IMM v4.0 validation study, simulations are available that predict 33 person-missions on ISS and 111 person-missions on STS. These simulated data predictions feed the sensitivity analysis procedures. The inputs to the sensitivity procedures include the number of occurrences of each of the one hundred IMM medical conditions generated over the simulations and the associated IMM outputs: total quality time lost (QTL), number of evacuations (EVAC), and number of loss of crew lives (LOCL). The IMM team will report the results of using PRCC and SRRC on IMM v4.0 predictions of the ISS and STS missions created as part of the external validation study. Tornado plots will assist in the visualization of the condition-related input sensitivities to each of the main outcomes. The outcomes of this sensitivity analysis will drive review focus by identifying conditions where changes in uncertainty could drive changes in overall model output uncertainty. These efforts are an integral