WorldWideScience

Sample records for analysis model gsam

  1. Development of a gas systems analysis model (GSAM)

    Energy Technology Data Exchange (ETDEWEB)

    Godec, M.L. [ICF Resources Inc., Fairfax, VA (United States)]

    1995-04-01

    The objective of developing the Gas Systems Analysis Model (GSAM) is to create a comprehensive, non-proprietary, PC-based model of domestic gas industry activity. The system is capable of assessing the impacts of various changes in the natural gas system within North America. The individual and collective impacts of changes in technology and economic conditions are explicitly modeled in GSAM. All major gas resources are modeled, including conventional, tight, Devonian shale, coalbed methane, and low-quality gas sources. The modeling system assesses all key components of the gas industry, including available resources, exploration, drilling, completion, production, and processing practices, both now and in the future. The model similarly assesses the distribution, storage, and utilization of natural gas in a dynamic market-based analytical structure. GSAM is designed to provide METC managers with a tool to project the impacts of future research, development, and demonstration (RD&D) benefits in order to determine priorities in a rapidly changing, market-driven gas industry.

  2. Development of a natural gas systems analysis model (GSAM). Annual report, July 1996--July 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    The objective of GSAM development is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the system, including the resource base, exploration and development practices, extraction technology performance and costs, project economics, transportation costs and restrictions, storage, and end-use. The primary focus is the detailed characterization of the resource base at the reservoir and subreservoir level. This disaggregation allows direct evaluation of alternative extraction technologies based on discretely estimated, individual well productivity, required investments, and associated operating costs. GSAM's design allows users to evaluate complex interactions of current and alternative future technology and policy initiatives as they directly impact the gas market. GSAM development has been ongoing for the past five years. Key activities completed during the past year are described.
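
    The per-well, resource-disaggregated economics described above can be made concrete with a small sketch. The decline curve, prices, and costs below are hypothetical illustrations of the general technique, not GSAM data or GSAM code.

        # Hypothetical per-well screening economics of the kind the abstract
        # describes; all parameter values are illustrative, not GSAM data.
        import numpy as np

        def well_npv(q0, decline, price, opex, capex, years=20, rate=0.10):
            """Net present value of a single gas well.
            q0      -- initial production rate (MMcf/year)
            decline -- exponential decline rate (1/year)
            price   -- gas price ($/Mcf)
            opex    -- operating cost ($/Mcf produced)
            capex   -- drilling and completion investment ($)
            """
            t = np.arange(years)
            production = q0 * np.exp(-decline * t) * 1000.0  # Mcf/year
            cash_flow = production * (price - opex)
            discount = (1.0 + rate) ** -(t + 1)
            return float(np.sum(cash_flow * discount)) - capex

        # Compare two extraction technologies on the same reservoir:
        base = well_npv(q0=300, decline=0.15, price=2.5, opex=0.6, capex=800_000)
        improved = well_npv(q0=360, decline=0.12, price=2.5, opex=0.7, capex=950_000)
        print(f"base NPV: ${base:,.0f}, improved NPV: ${improved:,.0f}")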

  3. DEVELOPMENT OF A NATURAL GAS SYSTEMS ANALYSIS MODEL (GSAM) VOLUME I - SUMMARY REPORT VOLUME II - USER'S GUIDE VOLUME IIIA - RP PROGRAMMER'S GUIDE VOLUME IIIB - SRPM PROGRAMMER'S GUIDE VOLUME IIIC - E&P PROGRAMMER'S GUIDE VOLUME IIID - D&I PROGRAMMER'S GUIDE

    Energy Technology Data Exchange (ETDEWEB)

    Unknown

    2001-02-01

    This report summarizes work completed on DOE Contract DE-AC21-92MC28138, Development of a Natural Gas Systems Analysis Model (GSAM). The products developed under this project directly support the National Energy Technology Laboratory (NETL) in carrying out its natural gas R&D mission. The objective of this research effort has been to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas market. GSAM has been developed to explicitly evaluate components of the natural gas system, including the entire in-place gas resource base, exploration and development technologies, extraction technology and performance parameters, transportation and storage factors, and end-use demand issues. The system has been fully tested and calibrated and has been used for multiple natural gas metrics analyses at NETL, in which metrics associated with NETL natural gas upstream R&D technologies and strategies have been evaluated. NETL's Natural Gas Strategic Plan requires that R&D activities be evaluated for their ability to provide adequate supplies of reasonably priced natural gas. GSAM provides the capability to assess potential and on-going R&D projects using a full fuel cycle, cost-benefit approach. This method yields realistic, market-based assessments of the benefits and costs of alternative or related technology advances. GSAM is capable of estimating both technical and commercial successes, quantifying the potential benefits to the market as well as to other related research. GSAM, therefore, represents an integration of research activities and a method for planning and prioritizing efforts to maximize benefits and minimize costs. Without an analytical tool like GSAM, NETL natural gas upstream R&D activities cannot be appropriately ranked or focused on the most important aspects of natural gas extraction efforts or utilization considerations.

  4. The use of the GSAM approach for the structural investigation of low-lying isomers of molecular clusters from density-functional-theory-based potential energy surfaces: the structures of microhydrated nucleic acid bases.

    Science.gov (United States)

    Thicoipe, Sandrine; Carbonniere, Philippe; Pouchan, Claude

    2013-08-15

    This study presents structural properties of microhydrated nucleic acid bases (NABs) - uracil (U), thymine (T), guanine (G), adenine (A), and cytosine (C) - investigated by theoretical computations at the B3LYP level of theory. To obtain the different representations of these microhydrated species, we applied the GSAM procedure: the most stable conformers, labeled X,nH2O (X = U, T, G, A and n = 1-5), for which the Boltzmann population is higher than 2% at 298 K, are calculated at the B3LYP and B3LYP-D levels of theory. At the B3LYP level, our calculated geometries are compared with those obtained in the literature. New physically relevant isomers are found with the GSAM algorithm, especially for the tetra- and pentahydrated species. The use of the DFT-D functional does not strongly modify the relative energies of the isomers for the monohydrated species. When the number of water molecules increases, the results become extremely sensitive to the inclusion of dispersion contributions.

  5. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example, the Event Summary Data (ESD) stores calorimeter cells and tracking system hits, thereby permitting many calibration and alignment tasks, but will only be accessible at particular computing sites with potentially large latency. In contrast, the Analysis...

  6. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    Full Text Available The developed risk analysis model supports deciding whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh its implementation cost.

  7. Communication Analysis modelling techniques

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2012-01-01

    This report describes and illustrates several modelling techniques proposed by Communication Analysis; namely Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated to communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general, since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...

  8. SDI CFD MODELING ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.

    2011-05-05

    The Savannah River Remediation (SRR) Organization requested that Savannah River National Laboratory (SRNL) develop a Computational Fluid Dynamics (CFD) method to mix and blend the miscible contents of the blend tanks to ensure the contents are properly blended before they are transferred from the blend tank, such as Tank 50H, to the Salt Waste Processing Facility (SWPF) feed tank. The work described here consists of two modeling areas: the mixing analysis during miscible liquid blending operations, and the flow pattern analysis during transfer operations of the blended liquid. The transient CFD governing equations, consisting of three momentum equations, one mass balance, two turbulence transport equations for kinetic energy and dissipation rate, and one species transport equation, were solved by an iterative technique until the species concentrations of the tank fluid were in equilibrium. The steady-state flow solutions for the entire tank fluid were used for flow pattern analysis, for velocity scaling analysis, and as the initial conditions for transient blending calculations. A series of modeling calculations was performed to estimate the blending times for various jet flow conditions, and to investigate the impact of the cooling coils on the blending time of the tank contents. The modeling results were benchmarked against the pilot-scale test results. All of the flow and mixing models were run with the nozzles installed at the mid-elevation and parallel to the tank wall. From the CFD modeling calculations, the main results are summarized as follows: (1) The benchmark analyses for the CFD flow velocity and blending models demonstrate their consistency with Engineering Development Laboratory (EDL) and literature test results in terms of local velocity measurements and experimental observations. Thus, an application of the established criterion to an SRS full-scale tank will provide a better, physically-based estimate of the required mixing time, and ...

  9. Model Analysis ToolKit

    Energy Technology Data Exchange (ETDEWEB)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes: defining parameters, defining observations, defining the model (a Python function), and defining samplesets (sets of parameter combinations). Currently supported functionality includes: forward model runs; Latin-Hypercube sampling of parameters; multi-dimensional parameter studies; parallel execution of parameter samples; model calibration using an internal Levenberg-Marquardt algorithm, the lmfit package, or the levmar package; and Markov Chain Monte Carlo using the pymc package. MATK facilitates model analysis using scipy for calibration (scipy.optimize) and rpy2 as a Python interface to R.
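
    As an illustration of the Latin-Hypercube sampling and forward model runs listed above, here is a sketch using scipy's qmc module. This is generic Python in the spirit of the MATK workflow, not MATK's own API; the toy model and parameter bounds are invented.

        # Illustration of Latin-Hypercube sampling and forward model runs in
        # the spirit of MATK's workflow; generic scipy, not MATK's API.
        import numpy as np
        from scipy.stats import qmc

        def model(params):
            """Toy forward model: two parameters -> one observation."""
            k, n = params
            return k * np.exp(-n)

        sampler = qmc.LatinHypercube(d=2, seed=42)
        unit_samples = sampler.random(n=100)               # samples in [0, 1)^2
        samples = qmc.scale(unit_samples, [0.1, 0.5], [2.0, 3.0])  # bounds

        results = np.array([model(p) for p in samples])    # forward runs
        print(f"mean response: {results.mean():.4f}, std: {results.std():.4f}")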

  10. Model Checking as Static Analysis

    DEFF Research Database (Denmark)

    Zhang, Fuyuan

    Both model checking and static analysis are prominent approaches to detecting software errors. Model checking is a successful formal method for verifying properties specified in temporal logics with respect to transition systems. Static analysis is also a powerful method for validating program properties which can predict safe approximations to program behaviors. In this thesis, we have developed several static analysis based techniques to solve model checking problems, aiming at showing the link between static analysis and model checking. We focus on logical approaches to static analysis... the μ-calculus can be encoded as the intended model of SFP. Our research results have strengthened the link between model checking and static analysis. This provides a theoretical foundation for developing a unified tool for both model checking and static analysis techniques.

  11. Use of a simplified generalized standard additions method for the analysis of cement, gypsum and basic slag by slurry nebulization ICP-OES.

    Science.gov (United States)

    Marjanovic, Ljiljana; McCrindle, Robert I; Botha, Barend M; Potgieter, Herman J

    2004-05-01

    The simplified generalized standard additions method (GSAM) was investigated as an alternative method for the ICP-OES analysis of solid materials, introduced into the plasma in the form of slurries. The method is an expansion of the conventional standard additions method. It is based on the principle of varying both the sample mass and the amount of standard solution added. The relationship between the sample mass, standard solution added and signal intensity is assumed to be linear. Concentration of the analyte can be found either geometrically from the slope of the two-dimensional response plane in a three-dimensional space or mathematically from the ratio of the parameters estimated by multiple linear regression. The analysis of a series of certified reference materials (CRMs) (cement CRM-BCS No 353, gypsum CRM-Gyp A and basic slag CRM No 382/I) introduced into the plasma in the form of slurry is described. The slurries contained glycerol and hydrochloric acid and were placed in an ultrasonic bath to ensure good dispersion. "Table curve 3D" software was used to fit the data. Results obtained showed that the method could be successfully applied to the analysis of cement, gypsum and slag samples, without the need to dissolve them. In this way, we could avoid the use of hazardous chemicals (concentrated acids), incomplete dissolution and loss of some volatiles. The application of the simplified GSAM for the analysis did not require a CRM with similar chemical and mineralogical properties for the calibration of the instrument.
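
    The parameter-ratio calculation described above can be sketched numerically. The signal model below, I = b1*m + b2*q with the analyte content recovered as b1/b2 (m the sample mass, q the mass of analyte added), is one reading of the abstract; the data are synthetic, not the paper's measurements.

        # Sketch of the simplified GSAM calculation: vary both sample mass m
        # and added standard q, fit the response plane I = b1*m + b2*q by
        # multiple linear regression, and recover the analyte content as
        # b1/b2. Synthetic data, not the paper's measurements.
        import numpy as np

        rng = np.random.default_rng(0)
        true_content = 0.035        # analyte mass fraction in sample (assumed)
        sensitivity = 120.0         # instrument response per unit analyte mass

        m = rng.uniform(0.1, 1.0, 30)    # slurry sample masses (g)
        q = rng.uniform(0.0, 0.02, 30)   # analyte mass added as standard (g)
        signal = sensitivity * (true_content * m + q) + rng.normal(0, 0.05, 30)

        X = np.column_stack([m, q])
        coef, *rest = np.linalg.lstsq(X, signal, rcond=None)
        b1, b2 = coef
        print(f"estimated content: {b1 / b2:.4f} (true: {true_content})")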

  12. Survival analysis models and applications

    CERN Document Server

    Liu, Xian

    2012-01-01

    Survival analysis concerns sequential occurrences of events governed by probabilistic laws. Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading onto some of the most advanced topics in survival analysis. Assumes only a minimal knowledge of SAS whilst enabling...

  13. CMS Analysis School Model

    Energy Technology Data Exchange (ETDEWEB)

    Malik, S. [Nebraska U.]; Shipsey, I. [Purdue U.]; Cavanaugh, R. [Illinois U., Chicago]; Bloom, K. [Nebraska U.]; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.]; D'Hondt, J. [Vrije U., Brussels]; Klima, B. [Fermilab]; Narain, M. [Brown U.]; Palla, F. [INFN, Pisa]; Rolandi, G. [CERN]; Schörner-Sadenius, T. [DESY]

    2014-01-01

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre), Fermilab, and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of the myriad talents, in the development of physics, service, upgrade, education of those new to CMS and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  14. Multiscale Signal Analysis and Modeling

    CERN Document Server

    Zayed, Ahmed

    2013-01-01

    Multiscale Signal Analysis and Modeling presents recent advances in multiscale analysis and modeling using wavelets and other systems. This book also presents applications in digital signal processing using sampling theory and techniques from various function spaces, filter design, feature extraction and classification, signal and image representation/transmission, coding, nonparametric statistical signal processing, and statistical learning theory. This book also: discusses recently developed signal modeling techniques, such as the multiscale method for complex time series modeling, multiscale positive density estimations, Bayesian shrinkage strategies, and algorithms for data-adaptive statistics; introduces new sampling algorithms for multidimensional signal processing; provides comprehensive coverage of wavelets, with presentations on waveform design and modeling, wavelet analysis of ECG signals, and wavelet filters; and reviews feature extraction and classification algorithms for multiscale signal and image proce...

  15. ROCK PROPERTIES MODEL ANALYSIS MODEL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Clinton Lum

    2002-02-04

    The purpose of this Analysis and Model Report (AMR) is to document Rock Properties Model (RPM) 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties models are intended principally for use as input to numerical physical-process modeling, such as of ground-water flow and/or radionuclide transport. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. This work was conducted in accordance with the following planning documents: WA-0344, ''3-D Rock Properties Modeling for FY 1998'' (SNL 1997); WA-0358, ''3-D Rock Properties Modeling for FY 1999'' (SNL 1999); and the technical development plan, Rock Properties Model Version 3.1 (CRWMS M&O 1999c). The Interim Change Notices (ICNs), ICN 02 and ICN 03, of this AMR were prepared as part of activities being conducted under the Technical Work Plan, TWP-NBS-GS-000003, ''Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01'' (CRWMS M&O 2000b). The purpose of ICN 03 is to record changes in data input status due to data qualification and verification activities. These work plans describe the scope, objectives, tasks, methodology, and implementing procedures for model construction. The work scope for this activity consists of the following: (1) Conversion of the input data (laboratory measured porosity data, x-ray diffraction mineralogy, petrophysical calculations of bound water, and petrophysical calculations of porosity) for each borehole into stratigraphic coordinates; (2) Re-sampling and merging of data sets; (3) ...

  16. Frailty Models in Survival Analysis

    CERN Document Server

    Wienke, Andreas

    2010-01-01

    The concept of frailty offers a convenient way to introduce unobserved heterogeneity and associations into models for survival data. In its simplest form, frailty is an unobserved random proportionality factor that modifies the hazard function of an individual or a group of related individuals. "Frailty Models in Survival Analysis" presents a comprehensive overview of the fundamental approaches in the area of frailty models. The book extensively explores how univariate frailty models can represent unobserved heterogeneity. It also emphasizes correlated frailty models as extensions of...

  17. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se...

  18. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes when making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  19. Model building techniques for analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for analysis results needs to be decreased for analysis to have an impact on overall product development. This effort can be reduced immensely through simple Pro/ENGINEER modeling techniques that come down to the method by which features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.

  20. Model selection for amplitude analysis

    CERN Document Server

    Guegan, Baptiste; Stevens, Justin; Williams, Mike

    2015-01-01

    Model complexity in amplitude analyses is often a priori under-constrained since the underlying theory permits a large number of amplitudes to contribute to most physical processes. The use of an overly complex model results in reduced predictive power and worse resolution on unknown parameters of interest. Therefore, it is common to reduce the complexity by removing from consideration some subset of the allowed amplitudes. This paper studies a data-driven method for limiting model complexity through regularization during regression in the context of a multivariate (Dalitz-plot) analysis. The regularization technique applied greatly improves the performance. A method is also proposed for obtaining the significance of a resonance in a multivariate amplitude analysis.
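
    The regularization idea can be illustrated with a hedged, linear toy: an L1 penalty drives the coefficients of non-contributing candidate terms to exactly zero. Real amplitude fits are nonlinear in complex amplitudes, so scikit-learn's Lasso here is a stand-in for the general technique, not the paper's estimator.

        # Toy illustration of limiting model complexity via L1 regularization:
        # many candidate basis terms ("amplitudes"), few truly contribute.
        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(1)
        n_events, n_amps = 500, 30
        basis = rng.normal(size=(n_events, n_amps))  # candidate amplitude terms
        true_coef = np.zeros(n_amps)
        true_coef[[2, 7, 19]] = [1.5, -2.0, 0.8]     # only three contribute
        y = basis @ true_coef + rng.normal(scale=0.1, size=n_events)

        fit = Lasso(alpha=0.05).fit(basis, y)
        kept = np.flatnonzero(fit.coef_)
        print("amplitudes kept by the regularized fit:", kept)  # ideally [2, 7, 19]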

  21. Accelerated life models modeling and statistical analysis

    CERN Document Server

    Bagdonavicius, Vilijandas

    2001-01-01

    Contents: Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Varia...

  22. Timing analysis by model checking

    Science.gov (United States)

    Naydich, Dimitri; Guaspari, David

    2000-01-01

    The safety of modern avionics relies on high integrity software that can be verified to meet hard real-time requirements. The limits of verification technology therefore determine acceptable engineering practice. To simplify verification problems, safety-critical systems are commonly implemented under the severe constraints of a cyclic executive, which make design an expensive trial-and-error process highly intolerant of change. Important advances in analysis techniques, such as rate monotonic analysis (RMA), have provided a theoretical and practical basis for easing these onerous restrictions. But RMA and its kindred have two limitations: they apply only to verifying the requirement of schedulability (that tasks meet their deadlines) and they cannot be applied to many common programming paradigms. We address both these limitations by applying model checking, a technique with successful industrial applications in hardware design. Model checking algorithms analyze finite state machines, either by explicit state enumeration or by symbolic manipulation. Since quantitative timing properties involve a potentially unbounded state variable (a clock), our first problem is to construct a finite approximation that is conservative for the properties being analyzed: if the approximation satisfies the properties of interest, so does the infinite model. To reduce the potential for state space explosion we must further optimize this finite model. Experiments with some simple optimizations have yielded a hundred-fold efficiency improvement over published techniques.

  23. Ventilation Model and Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    V. Chipman

    2003-07-18

    This model and analysis report develops, validates, and implements a conceptual model for heat transfer in and around a ventilated emplacement drift. This conceptual model includes thermal radiation between the waste package and the drift wall, convection from the waste package and drift wall surfaces into the flowing air, and conduction in the surrounding host rock. These heat transfer processes are coupled and vary both temporally and spatially, so numerical and analytical methods are used to implement the mathematical equations which describe the conceptual model. These numerical and analytical methods predict the transient response of the system, at the drift scale, in terms of spatially varying temperatures and ventilation efficiencies. The ventilation efficiency describes the effectiveness of the ventilation process in removing radionuclide decay heat from the drift environment. An alternative conceptual model is also developed which evaluates the influence of water and water vapor mass transport on the ventilation efficiency. These effects are described using analytical methods which bound the contribution of latent heat to the system, quantify the effects of varying degrees of host rock saturation (and hence host rock thermal conductivity) on the ventilation efficiency, and evaluate the effects of vapor and enhanced vapor diffusion on the host rock thermal conductivity.

  24. TANK48 CFD MODELING ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.

    2011-05-17

    The process of recovering the waste in storage tanks at the Savannah River Site (SRS) typically requires mixing the contents of the tank to ensure uniformity of the discharge stream. Mixing is accomplished with one to four dual-nozzle slurry pumps located within the tank liquid. For this work, a Tank 48 simulation model with a maximum of four slurry pumps in operation has been developed to estimate flow patterns for efficient solid mixing. The modeling calculations were performed using two approaches. One is a single-phase Computational Fluid Dynamics (CFD) model used to evaluate the flow patterns and qualitative mixing behaviors for a range of different modeling conditions, since this model was previously benchmarked against test results. The other is a two-phase CFD model used to estimate solid concentrations quantitatively by solving the Eulerian governing equations for the continuous fluid and discrete solid phases over the entire fluid domain of Tank 48. The two-phase results should be considered preliminary scoping calculations, since that model has not yet been validated against test results. A series of sensitivity calculations for different numbers of pumps and operating conditions has been performed to provide operational guidance for solids suspension and mixing in the tank. In the analysis, the pump was assumed to be stationary. Major solid obstructions, including the pump housing, the pump columns, and the 82-inch central support column, were included. Steady-state, three-dimensional analyses with a two-equation turbulence model were performed with FLUENT(TM) for the single-phase approach and CFX for the two-phase approach. Recommended operational guidance was developed assuming that local fluid velocity can be used as a measure of sludge suspension and spatial mixing under the single-phase tank model. For quantitative analysis, a two-phase fluid-solid model was developed for the same modeling conditions as the single...

  25. ANALYSIS MODEL FOR INVENTORY MANAGEMENT

    Directory of Open Access Journals (Sweden)

    CAMELIA BURJA

    2010-01-01

    Full Text Available Inventory represents an essential component of an enterprise's assets, and economic analysis gives it special importance because its accurate management determines the achievement of the activity object and the financial results. Efficient inventory management requires ensuring an optimum inventory level, which will guarantee the normal functioning of the activity with minimum inventory expenses and immobilised funds. The paper presents an analysis model for inventory management based on rotation speed and its correlation with sales volume, illustrated in a case study. Highlighting the factors that influence efficient inventory management provides the information needed to justify managerial decisions, which will lead to a balanced financial position and to increased company performance.
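
    The rotation-speed indicators the abstract refers to reduce to simple ratios; a sketch with made-up figures follows.

        # Inventory rotation-speed indicators; the figures are made up.
        sales = 1_200_000.0            # annual sales volume
        avg_inventory = 150_000.0      # average inventory over the period

        turnover = sales / avg_inventory   # rotations per year
        days_on_hand = 365.0 / turnover    # average days inventory is held
        print(f"turnover: {turnover:.1f} rotations/year, "
              f"{days_on_hand:.0f} days on hand")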

  26. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2002-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers...

  27. Probabilistic Model-Based Safety Analysis

    CERN Document Server

    Güdemann, Matthias; 10.4204/EPTCS.28.8

    2010-01-01

    Model-based safety analysis approaches aim at finding critical failure combinations by analysis of models of the whole system (i.e. software, hardware, failure modes and environment). The advantage of these methods compared to traditional approaches is that the analysis of the whole system gives more precise results. Only few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to analysis of specific failure propagation models, limited types of failure modes or without system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model-checking now allow for overcoming this problem. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...

  28. Geologic Framework Model Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the...

  29. Extensions in model-based system analysis

    OpenAIRE

    Graham, Matthew R.

    2007-01-01

    Model-based system analysis techniques provide a means for determining desired system performance prior to actual implementation. In addition to specifying desired performance, model-based analysis techniques require mathematical descriptions that characterize relevant behavior of the system. The developments of this dissertation give extended formulations for control-relevant model estimation as well as model-based analysis conditions for performance requirements specified as frequency do...

  30. Correlated Data Analysis Modeling, Analytics, and Applications

    CERN Document Server

    Song, Peter X-K

    2007-01-01

    Presents developments in correlated data analysis. This book provides a systematic treatment for the topic of estimating functions. In addition to marginal models and mixed-effects models, it covers topics on joint regression analysis based on Gaussian copulas and generalized state space models for longitudinal data from long time series.

  31. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  32. The Cosparse Analysis Model and Algorithms

    CERN Document Server

    Nam, Sangnam; Elad, Michael; Gribonval, Rémi

    2011-01-01

    After a decade of extensive study of the sparse representation synthesis model, we can safely say that this is a mature and stable field, with clear theoretical foundations and appealing applications. Alongside this approach, there is an analysis counterpart model, which, despite its similarity to the synthesis alternative, is markedly different. Surprisingly, the analysis model has not received similar attention, and its understanding today is shallow and partial. In this paper we take a closer look at the analysis approach, better define it as a generative model for signals, and contrast it with the synthesis one. This work proposes effective pursuit methods that aim to solve inverse problems regularized with the analysis-model prior, accompanied by a preliminary theoretical study of their performance. We demonstrate the effectiveness of the analysis model in several experiments.
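
    The analysis-prior regularization the abstract mentions can be illustrated with a tiny inverse problem. The sketch below uses a first-difference operator as the analysis operator and the cvxpy modeling package; it is a generic convex-optimization illustration, not the pursuit algorithms proposed in the paper.

        # Minimal analysis-prior reconstruction: recover x from a few
        # measurements y = Mx by penalizing ||Omega x||_1, where Omega is a
        # first-difference analysis operator, so the prior favors
        # piecewise-constant signals.
        import numpy as np
        import cvxpy as cp

        rng = np.random.default_rng(2)
        n, m = 100, 40
        x_true = np.zeros(n)
        x_true[30:60] = 1.0                  # piecewise-constant ground truth
        M = rng.normal(size=(m, n)) / np.sqrt(m)
        y = M @ x_true

        Omega = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]   # (Omega x)_i = x_{i+1} - x_i
        x = cp.Variable(n)
        objective = cp.Minimize(cp.sum_squares(M @ x - y)
                                + 0.1 * cp.norm1(Omega @ x))
        cp.Problem(objective).solve()
        print(f"reconstruction error: {np.linalg.norm(x.value - x_true):.3f}")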

  33. Statistical Modelling of Wind Profiles - Data Analysis and Modelling

    DEFF Research Database (Denmark)

    Jónsson, Tryggvi; Pinson, Pierre

    The aim of the analysis presented in this document is to investigate whether statistical models can be used to make very short-term predictions of wind profiles.

  34. Brief analysis of Blog Websites' business models

    Institute of Scientific and Technical Information of China (English)

    魏娟

    2009-01-01

    The analysis continues using this framework on several major blogs and blog websites. From this analysis, three main weblog business models currently in operation are introduced and described. As part of this framework, the paper also analyzes the future viability of these models.

  35. Two sustainable energy system analysis models

    DEFF Research Database (Denmark)

    Lund, Henrik; Goran Krajacic, Neven Duic; da Graca Carvalho, Maria

    2005-01-01

    This paper presents a comparative study of two energy system analysis models, both designed with the purpose of analysing electricity systems with a substantial share of fluctuating renewable energy.

  36. Analysis of Crosscutting in Model Transformations

    NARCIS (Netherlands)

    Berg, van den K.G.; Tekinerdogan, B.; Nguyen, H.; Aagedal, J.; Neple, T.; Oldevik, J.

    2006-01-01

    This paper describes an approach for the analysis of crosscutting in model transformations in the Model Driven Architecture (MDA). Software architectures should be amenable to changes in user requirements and technological platforms. Impact analysis of changes can be based on traceability of archite...

  37. Crystal structure of glutamate-1-semialdehyde-2,1-aminomutase from Arabidopsis thaliana

    Energy Technology Data Exchange (ETDEWEB)

    Song, Yingxian; Pu, Hua; Jiang, Tian; Zhang, Lixin; Ouyang, Min, E-mail: ouyangmin@ibcas.ac.cn [Chinese Academy of Sciences, Beijing 100093, People’s Republic of (China)

    2016-05-23

    A structural study of A. thaliana glutamate-1-semialdehyde-2,1-aminomutase (GSAM) has revealed asymmetry in cofactor binding as well as in the gating-loop orientation, which supports the previously proposed negative cooperativity between monomers of GSAM. Glutamate-1-semialdehyde-2,1-aminomutase (GSAM) catalyzes the isomerization of glutamate-1-semialdehyde (GSA) to 5-aminolevulinate (ALA) and is distributed in archaea, most bacteria and plants. Although structures of GSAM from archaea and bacteria have been resolved, a GSAM structure from a higher plant is not available, preventing further structure–function analysis. Here, the structure of GSAM from Arabidopsis thaliana (AtGSA1) obtained by X-ray crystallography is reported at 1.25 Å resolution. AtGSA1 forms an asymmetric dimer and displays asymmetry in cofactor binding as well as in the gating-loop orientation, which is consistent with previously reported Synechococcus GSAM structures. While one monomer binds PMP with the gating loop fixed in the open state, the other monomer binds either PMP or PLP and the gating loop is ready to close. The data also reveal the mobility of residues Gly163, Ser164 and Gly165, which are important for reorientation of the gating loop. Furthermore, the asymmetry of the AtGSA1 structure supports the previously proposed negative cooperativity between monomers of GSAM.

  38. Modeling and analysis using hybrid Petri nets

    CERN Document Server

    Ghomri, Latéfa

    2007-01-01

    This paper is devoted to the use of hybrid Petri nets (PNs) for modeling and control of hybrid dynamic systems (HDS). Modeling, analysis and control of HDS attract ever more attention from researchers, and several works have been devoted to these topics. We consider in this paper the extensions of the PN formalism (initially conceived for modeling and analysis of discrete event systems) in the direction of hybrid modeling. We present, first, the continuous PN models. These models are obtained from discrete PNs by the fluidification of the markings. They constitute the first steps in the extension of PNs toward hybrid modeling. Then, we present two hybrid PN models, which differ in the class of HDS they can deal with. The first one is used for deterministic HDS modeling, whereas the second one can deal with HDS with nondeterministic behavior. Keywords: Hybrid dynamic systems; D-elementary hybrid Petri nets; Hybrid automata; Controller synthesis

  39. A Bayesian nonparametric meta-analysis model.

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G

    2015-03-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall effect size, such models may be adequate, but for prediction, they surely are not if the effect-size distribution exhibits non-normal behavior. To address this issue, we propose a Bayesian nonparametric meta-analysis model, which can describe a wider range of effect-size distributions, including unimodal symmetric distributions, as well as skewed and more multimodal distributions. We demonstrate our model through the analysis of real meta-analytic data arising from behavioral-genetic research. We compare the predictive performance of the Bayesian nonparametric model against various conventional and more modern normal fixed-effects and random-effects models.
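
    For contrast with the nonparametric approach, the conventional normal random-effects model mentioned above can be computed in a few lines. This is a hedged sketch of the standard DerSimonian-Laird moment estimator on made-up effect sizes, not the authors' Bayesian model.

        # Conventional normal random-effects meta-analysis: DerSimonian-Laird
        # moment estimate of between-study variance tau^2 and the weighted
        # mean effect. Effect sizes and variances below are made up.
        import numpy as np

        effects = np.array([0.30, 0.12, 0.45, 0.26, 0.09])    # study effects
        variances = np.array([0.02, 0.04, 0.03, 0.01, 0.05])  # within-study

        w = 1.0 / variances                       # fixed-effect weights
        mean_fe = np.sum(w * effects) / np.sum(w)
        Q = np.sum(w * (effects - mean_fe) ** 2)  # heterogeneity statistic
        k = len(effects)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (k - 1)) / c)        # between-study variance

        w_re = 1.0 / (variances + tau2)           # random-effects weights
        mean_re = np.sum(w_re * effects) / np.sum(w_re)
        print(f"tau^2 = {tau2:.4f}, overall effect = {mean_re:.3f}")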

  40. OVERLOAD ANALYSIS OF MARKOVIAN MODELS

    Institute of Scientific and Technical Information of China (English)

    Yiqiang Q. ZHAO

    1999-01-01

    A new procedure for computing stationary probabilities for an overloaded Markovian model is proposed in terms of the rotated Markov chain. There are two advantages to using this procedure: i) it allows us to approximate an overloaded finite model by using a stable infinite Markov chain, which makes the study easier when the infinite model has a simpler solution; ii) numerically, it often significantly reduces the number of computations and the requirement of computer memory. Using different examples, we specifically demonstrate the process of implementing this rotating procedure.

  41. [Dimensional modeling analysis for outpatient payments].

    Science.gov (United States)

    Guo, Yi-zhong; Guo, Yi-min

    2008-09-01

    This paper introduces a data warehouse model for outpatient payments, designed according to the requirements of hospital financial management, with the dimensional modeling technique combined with an analysis of those requirements. This data warehouse model can not only improve the accuracy of financial management requirements, but also greatly increase the efficiency and quality of hospital management.

  42. Video Analysis and Modeling in Physics Education

    Science.gov (United States)

    Brown, Doug

    2008-03-01

    The Tracker video analysis program allows users to overlay simple dynamical models on a video clip. Video modeling offers advantages over both traditional video analysis and animation-only modeling. In traditional video analysis, for example, students measure ``g'' by tracking a dropped or tossed ball, constructing a position or velocity vs. time graph, and interpreting the graphs to obtain initial conditions and acceleration. In video modeling, by contrast, the students interactively construct theoretical force expressions and define initial conditions for a dynamical particle model that synchs with and draws itself on the video. The behavior of the model is thus compared directly with that of the real-world motion. Tracker uses the Open Source Physics code library so sophisticated models are possible. I will demonstrate and compare video modeling with video analysis and I will discuss the advantages of video modeling over animation-only modeling. The Tracker video analysis program is available at: http://www.cabrillo.edu/~dbrown/tracker/.

  43. Qualitative Analysis of Somitogenesis Models

    Directory of Open Access Journals (Sweden)

    Maschke-Dutz E.

    2007-12-01

    Full Text Available Although recently the properties of a single somite cell oscillator have been intensively investigated, the system-level nature of the segmentation clock remains largely unknown. To elaborate on this question qualitatively, we examine the possibility of transforming a well-known time-delay somite cell oscillator into a dynamical system of differential equations that allows qualitative analysis.

  44. CMS Data Analysis School Model

    CERN Document Server

    Malik, Sudhir; Cavanaugh, R; Bloom, K; Chan, Kai-Feng; D'Hondt, J; Klima, B; Narain, M; Palla, F; Rolandi, G; Schörner-Sadenius, T

    2014-01-01

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born three years ago at the LPC (LHC Physics Center), Fermilab, and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS around the globe, CMS is trying to engage the collaboration's discovery potential and maximize the physics output. As a bigger goal, CMS is striving to nurture and increase engagement of the myriad talents of CMS in the development of physics, service, upgrade, education of those new to CMS and the caree...

  45. Program Analysis as Model Checking

    DEFF Research Database (Denmark)

    Olesen, Mads Chr.

    ...and abstract interpretation. Model checking views the program as a finite automaton and tries to prove logical properties over the automaton model, or present a counter-example if not possible, with a focus on precision. Abstract interpretation translates the program semantics into abstract semantics... problems as the other by a reformulation. This thesis argues that there is even a convergence on the practical level, and that a generalisation of the formalism of timed automata into lattice automata captures key aspects of both methods; indeed, model checking timed automata can be formulated in terms of an abstract interpretation. For the generalisation to lattice automata to have benefit, it is important that efficient tools exist. This thesis presents multi-core tools for efficient and scalable reachability and Büchi emptiness checking of timed/lattice automata. Finally, a number of case studies...

  46. Hierarchical modeling and analysis for spatial data

    CERN Document Server

    Banerjee, Sudipto; Gelfand, Alan E

    2003-01-01

    Among the many uses of hierarchical modeling, their application to the statistical analysis of spatial and spatio-temporal data from areas such as epidemiology and environmental science has proven particularly fruitful. Yet to date, the few books that address the subject have been either too narrowly focused on specific aspects of spatial analysis, or written at a level often inaccessible to those lacking a strong background in mathematical statistics. Hierarchical Modeling and Analysis for Spatial Data is the first accessible, self-contained treatment of hierarchical methods, modeling, and data...

  47. Hypersonic - Model Analysis as a Service

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald

    2014-01-01

    Hypersonic is a Cloud-based tool that proposes a new approach to the deployment of model analysis facilities. It is implemented as a RESTful Web service API offering analysis features such as model clone detection. This approach allows the migration of resource intensive analysis algorithms from monolithic desktop modeling tools to a wide range of mobile and Web-based clients. As a technology demonstrator, a Web application acting as a client for the Hypersonic API has been implemented and made publicly available.

  48. Representing uncertainty on model analysis plots

    Science.gov (United States)

    Smith, Trevor I.

    2016-12-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.

  49. Modeling Decisional Situations Using Morphological Analysis

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available This paper focuses on models of financial decisions in small and medium enterprises. The presented models are part of a decision support system presented in a PhD dissertation. One of the modeling techniques used for model creation and development is morphological analysis. This technique is used to reduce model scale not by reducing the number of variables involved but by reducing the number of possible combinations between variables. In this paper we show how this approach can be used in modeling financial decision problems.
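
    The combination-reduction idea can be made concrete with a small sketch: the number of variables stays fixed, but a cross-consistency assessment prunes value pairs that cannot co-occur. The decision variables and consistency rules below are hypothetical, not taken from the dissertation.

        # Morphological analysis sketch: enumerate all configurations, then
        # prune those containing inconsistent value pairs. All names are
        # hypothetical.
        from itertools import product

        dimensions = {
            "financing": ["own funds", "bank loan", "leasing"],
            "horizon": ["short term", "long term"],
            "risk": ["low", "high"],
        }

        # Cross-consistency assessment: value pairs that cannot co-occur.
        inconsistent = {("bank loan", "short term"), ("leasing", "low")}

        def consistent(config):
            values = set(config)
            return not any({a, b} <= values for a, b in inconsistent)

        configs = [c for c in product(*dimensions.values()) if consistent(c)]
        print(f"{len(configs)} of {3 * 2 * 2} configurations remain")
        for c in configs:
            print(c)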

  50. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  51. Analysis model of structure-HDS

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Presents the model established for structure-HDS (hydraulic damper system) analysis on the basis of the theoretical analysis model of non-compressed fluid in a round pipe, with a uniform velocity used as the basic variable, and with pressure losses resulting from cross-section changes of the fluid route taken into consideration. This provides the necessary basis for research on the earthquake response of a structure with a spacious first story equipped with an HDS at the first floor.

  52. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  53. Flux Analysis in Process Models via Causality

    CERN Document Server

    Kahramanoğullari, Ozan

    2010-01-01

    We present an approach for flux analysis in process algebra models of biological systems. We perceive flux as the flow of resources in stochastic simulations. We resort to an established correspondence between event structures, a broadly recognised model of concurrency, and state transitions of process models, seen as Petri nets. We show that in this way we can extract the causal resource dependencies between individual state transitions in simulations as partial orders of events. We propose transformations on the partial orders that provide means for further analysis, and introduce a software tool which implements these ideas. By means of an example of a published model of the Rho GTP-binding proteins, we argue that this approach can provide a substitute for flux analysis techniques on ordinary differential equation models within the stochastic setting of process algebras.

  54. Analysis of variance for model output

    NARCIS (Netherlands)

    Jansen, M.J.W.

    1999-01-01

    A scalar model output Y is assumed to depend deterministically on a set of stochastically independent input vectors of different dimensions. The composition of the variance of Y is considered; variance components of particular relevance for uncertainty analysis are identified. Several analysis of variance...
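
    A minimal numerical illustration of such a variance decomposition: a Monte-Carlo pick-freeze estimate of first-order variance components (Sobol indices) for a toy additive model. This is a generic sketch, not the estimators developed in the paper.

        # Pick-freeze estimate of first-order variance components: resample
        # all inputs except X_i and correlate the two outputs. Toy model.
        import numpy as np

        def model(x):
            # Additive toy model with unequal input importance.
            return 4.0 * x[:, 0] + 1.0 * x[:, 1] + 0.5 * x[:, 2]

        rng = np.random.default_rng(3)
        n, d = 100_000, 3
        a = rng.uniform(size=(n, d))
        b = rng.uniform(size=(n, d))

        y = model(a)
        var_y = y.var()
        for i in range(d):
            mixed = b.copy()
            mixed[:, i] = a[:, i]      # freeze input i, resample the rest
            s_i = np.cov(y, model(mixed))[0, 1] / var_y
            print(f"first-order index of X_{i}: {s_i:.3f}")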

  55. Model Checking as Static Analysis: Revisited

    DEFF Research Database (Denmark)

    Zhang, Fuyuan; Nielson, Flemming; Nielson, Hanne Riis

    2012-01-01

    We show that the model checking problem of the μ-calculus can be viewed as an instance of static analysis. We propose Succinct Fixed Point Logic (SFP) within our logical approach to static analysis as an extension of Alternation-free Least Fixed Point Logic (ALFP). We generalize the notion...

  56. Combustion instability modeling and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Santoro, R.J.; Yang, V.; Santavicca, D.A. [Pennsylvania State Univ., University Park, PA (United States)] [and others]

    1995-10-01

    It is well known that the two key elements for achieving low emissions and high performance in a gas turbine combustor are to simultaneously establish (1) a lean combustion zone for maintaining low NO{sub x} emissions and (2) rapid mixing for good ignition and flame stability. However, these requirements, when coupled with the short combustor lengths used to limit the residence time for NO formation typical of advanced gas turbine combustors, can lead to problems regarding unburned hydrocarbon (UHC) and carbon monoxide (CO) emissions, as well as the occurrence of combustion instabilities. Clearly, the key to successful gas turbine development lies in understanding the effects of geometry and operating conditions on combustion instability, emissions (including UHC, CO and NO{sub x}) and performance. The concurrent development of suitable analytical and numerical models that are validated with experimental studies is important for achieving this objective. A major benefit of the present research will be to provide for the first time an experimentally verified model of emissions and performance of gas turbine combustors.

  17. A Bayesian Nonparametric Meta-Analysis Model

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G.

    2015-01-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall…

  18. Model correction factor method for system analysis

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Johannesen, Johannes M.

    2000-01-01

    The Model Correction Factor Method (MCFM) is an intelligent response surface method based on simplified modeling. MCFM is aimed at reliability analysis in the case of a limit state defined by an elaborate model. Herein it is demonstrated that the method is applicable for elaborate limit state surfaces on which...

  19. Analysis and evaluation of collaborative modeling processes

    NARCIS (Netherlands)

    Ssebuggwawo, D.

    2012-01-01

    Analysis and evaluation of collaborative modeling processes is confronted with many challenges. On the one hand, many systems design and re-engineering projects require collaborative modeling approaches that can enhance their productivity. But such collaborative efforts, which often consist of the...

  20. Perturbation analysis of nonlinear matrix population models

    Directory of Open Access Journals (Sweden)

    Hal Caswell

    2008-03-01

    Perturbation analysis examines the response of a model to changes in its parameters. It is commonly applied to population growth rates calculated from linear models, but there has been no general approach to the analysis of nonlinear models. Nonlinearities in demographic models may arise due to density-dependence, frequency-dependence (in 2-sex models), feedback through the environment or the economy, recruitment subsidy due to immigration, or the scaling inherent in calculations of proportional population structure. This paper uses matrix calculus to derive the sensitivity and elasticity of equilibria, cycles, ratios (e.g. dependency ratios), age averages and variances, temporal averages and variances, life expectancies, and population growth rates, for both age-classified and stage-classified models. Examples are presented, applying the results to both human and non-human populations.
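
    For the linear special case of such models, the sensitivity of the growth rate lambda to a projection-matrix entry a_ij has the closed form s_ij = v_i * w_j / <v, w>, where w and v are the right and left eigenvectors of the dominant eigenvalue, and the elasticity is e_ij = (a_ij / lambda) * s_ij. The NumPy sketch below computes both for an illustrative 3-stage matrix; the matrix values are invented, and this is the standard linear baseline, not the paper's nonlinear matrix-calculus machinery.

```python
import numpy as np

# Illustrative 3-stage projection matrix (hypothetical values).
A = np.array([[0.0, 1.5, 2.0],
              [0.5, 0.0, 0.0],
              [0.0, 0.4, 0.8]])

vals, right = np.linalg.eig(A)
k = np.argmax(vals.real)
lam = vals[k].real                          # dominant eigenvalue (growth rate)
w = np.abs(right[:, k].real)                # stable stage distribution

vals_t, left = np.linalg.eig(A.T)
k_t = np.argmin(np.abs(vals_t - vals[k]))   # match the same eigenvalue of A^T
v = np.abs(left[:, k_t].real)               # reproductive values

S = np.outer(v, w) / (v @ w)                # sensitivities d(lambda)/d(a_ij)
E = (A / lam) * S                           # elasticities: proportional sensitivities
print(f"lambda = {lam:.4f}")
print("elasticities:\n", np.round(E, 3))
```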

  1. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to find out the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies due to their use of hydrological parameters, which are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and their estimation in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models and the possible improvements over these models through 3D modeling are also discussed. It is found that the HEC-RAS and FLO 2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO 2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness in grids, were found, which can be improved through a 3D model. The 3D model was therefore found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open channel flows have been developed recently, but not for floodplains. Hence, it is suggested that a 3D model for floodplains should be developed by considering all the hydrological and high-resolution topographic parameter models discussed in this review, to enhance the findings on the causes and effects of flooding.

  2. Adsorption modeling for macroscopic contaminant dispersal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Axley, J.W.

    1990-05-01

    Two families of macroscopic adsorption models are formulated, based on fundamental principles of adsorption science and technology, that may be used for macroscopic (such as whole-building) contaminant dispersal analysis. The first family of adsorption models - the Equilibrium Adsorption (EA) Models - is based upon the simple requirement of equilibrium between adsorbent and room air. The second family - the Boundary Layer Diffusion Controlled Adsorption (BLDC) Models - adds to the equilibrium requirement a boundary layer model for diffusion of the adsorbate from the room air to the adsorbent surface. Two members of each of these families are explicitly discussed, one based on the linear adsorption isotherm model and the other on the Langmuir model. The linear variants of each family are applied to model the adsorption dynamics of formaldehyde in gypsum wall board and compared to measured data.
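
    As a rough sketch of the two isotherms named in the abstract (not the paper's whole-building dispersal formulation), the snippet below contrasts a linear isotherm, Cs = K * C, with a Langmuir isotherm, Cs = Cmax * b * C / (1 + b * C); all parameter values are illustrative assumptions.

```python
import numpy as np

def linear_isotherm(c, k_part):
    """Adsorbed-phase concentration for a linear isotherm: Cs = K * C."""
    return k_part * c

def langmuir_isotherm(c, c_max, b):
    """Adsorbed-phase concentration for a Langmuir isotherm:
    Cs = C_max * b * C / (1 + b * C); saturates at C_max for large C."""
    return c_max * b * c / (1.0 + b * c)

c_air = np.linspace(0.0, 2.0, 5)          # air-phase concentration (arbitrary units)
print(linear_isotherm(c_air, k_part=3.0))
print(langmuir_isotherm(c_air, c_max=4.0, b=2.0))
```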

  3. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete-event Monte Carlo simulation, the most commonly used methodology for modeling...

  4. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports conducting requirements analysis for ERP projects.

  5. Decision variables analysis for structured modeling

    Institute of Scientific and Technical Information of China (English)

    潘启树; 赫东波; 张洁; 胡运权

    2002-01-01

    Structured modeling is the most commonly used modeling method, but it is not very adaptive to significant changes in environmental conditions. Therefore, Decision Variables Analysis (DVA), a new modeling method, is proposed to deal with linear programming modeling and changing environments. In variant linear programming, the most complicated relationships are those among decision variables. DVA classifies the decision variables into different levels using different index sets, and divides a model into different elements so that any change can only have its effect on part of the whole model. DVA takes into consideration the complicated relationships among decision variables at different levels, and can therefore successfully solve any modeling problem in dramatically changing environments.

  6. Stochastic model updating using distance discrimination analysis

    Institute of Scientific and Technical Information of China (English)

    Deng Zhongmin; Bi Sifeng; Sez Atamturktur

    2014-01-01

    This manuscript presents a stochastic model updating method, taking both uncertainties in models and variability in testing into account. The updated finite element (FE) models obtained through the proposed technique can aid in the analysis and design of structural systems. The authors developed a stochastic model updating method integrating distance discrimination analysis (DDA) and an advanced Monte Carlo (MC) technique to (1) enable more efficient MC by using a response surface model, (2) calibrate parameters with an iterative test-analysis correlation based upon DDA, and (3) utilize and compare different distance functions as correlation metrics. Using DDA, the influence of distance functions on model updating results is analyzed. The proposed stochastic method makes it possible to obtain a precise model updating outcome with acceptable calculation cost. The stochastic method is demonstrated on a helicopter case study updated using both Euclidean and Mahalanobis distance metrics. It is observed that the selected distance function influences the iterative calibration process and thus the calibration outcome, indicating that an integration of different metrics might yield improved results.

  7. Mathematical Model For Engineering Analysis And Optimization

    Science.gov (United States)

    Sobieski, Jaroslaw

    1992-01-01

    Computational support for the engineering design process reveals the behavior of the designed system in response to external stimuli, and finds out how that behavior is modified by changing the physical attributes of the system. System-sensitivity analysis combined with extrapolation forms a model of the design complementary to the model of behavior, capable of directly simulating the effects of changes in design variables. The algorithms developed for this method are applicable to the design of large engineering systems, especially those consisting of several subsystems involving many disciplines.

  8. Behavioral modeling and analysis of galvanic devices

    Science.gov (United States)

    Xia, Lei

    2000-10-01

    A new hybrid modeling approach was developed for galvanic devices including batteries and fuel cells. The new approach reduces the complexity of the First Principles method and adds a physical basis to the empirical methods. The resulting general model includes all the processes that affect the terminal behavior of the galvanic devices. The first step of the new model development was to build a physics-based structure or framework that reflects the important physicochemical processes and mechanisms of a galvanic device. Thermodynamics, electrode kinetics, mass transport and electrode interfacial structure of an electrochemical cell were considered and included in the model. Each process of the cell is represented by a clearly-defined and familiar electrical component, resulting in an equivalent circuit model for the galvanic device. The second step was to develop a parameter identification procedure that correlates the device response data to the parameters of the components in the model. This procedure eliminates the need for hard-to-find data on the electrochemical properties of the cell and specific device design parameters. Thus, the model is chemistry and structure independent. Implementation issues of the new modeling approach were presented. The validity of the new model over a wide range of operating conditions was verified with experimental data from actual devices. The new model was used in studying the characteristics of galvanic devices. Both the steady-state and dynamic behavior of batteries and fuel cells was studied using impedance analysis techniques. The results were used to explain some experimental results of galvanic devices such as charging and pulsed discharge. The knowledge gained from the device analysis was also used in devising new solutions to application problems such as determining the state of charge of a battery or the maximum power output of a fuel cell. With the new model, a system can be designed that utilizes a galvanic device...
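
    The equivalent-circuit idea can be illustrated with the common first-order Thevenin battery model: one series resistance plus one RC pair. This is a generic textbook circuit, not the specific model identified in the thesis, and the parameter values below are invented for the demonstration.

```python
import numpy as np

# First-order Thevenin model: v = OCV - i*R0 - v_rc,
# with RC-pair dynamics dv_rc/dt = (i - v_rc / R1) / C1.
OCV, R0, R1, C1 = 3.7, 0.05, 0.02, 2000.0   # illustrative cell parameters

def simulate(current, dt=1.0):
    """Terminal voltage under a discharge-current profile (A), Euler stepping."""
    v_rc, out = 0.0, []
    for i in current:
        v_rc += dt * (i - v_rc / R1) / C1
        out.append(OCV - i * R0 - v_rc)
    return np.array(out)

pulse = np.concatenate([np.full(60, 2.0), np.zeros(60)])  # 1-min 2 A pulse, then rest
v = simulate(pulse)
print(f"voltage at end of pulse: {v[59]:.3f} V, after recovery: {v[-1]:.3f} V")
```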

  9. Modeling and analysis of stochastic systems

    CERN Document Server

    Kulkarni, Vidyadhar G

    2011-01-01

    Based on the author's more than 25 years of teaching experience, Modeling and Analysis of Stochastic Systems, Second Edition covers the most important classes of stochastic processes used in the modeling of diverse systems, from supply chains and inventory systems to genetics and biological systems. For each class of stochastic process, the text includes its definition, characterization, applications, transient and limiting behavior, first passage times, and cost/reward models. Along with reorganizing the material, this edition revises and adds new exercises and examples. New to the second edition...

  10. Independent Component Analysis in Multimedia Modeling

    DEFF Research Database (Denmark)

    Larsen, Jan

    Modeling of multimedia and multimodal data becomes increasingly important with the digitalization of the world. The objective of this paper is to demonstrate the potential of independent component analysis and blind source separation methods for modeling and understanding of multimedia data, which largely refers to text, images/video, audio and combinations of such data. We review a number of applications within single and combined media with the hope that this might provide inspiration for further research in this area. Finally, we provide a detailed presentation of our own recent work on modeling...

  11. Independent Component Analysis in Multimedia Modeling

    DEFF Research Database (Denmark)

    Larsen, Jan; Hansen, Lars Kai; Kolenda, Thomas;

    2003-01-01

    Modeling of multimedia and multimodal data becomes increasingly important with the digitalization of the world. The objective of this paper is to demonstrate the potential of independent component analysis and blind source separation methods for modeling and understanding of multimedia data, which largely refers to text, images/video, audio and combinations of such data. We review a number of applications within single and combined media with the hope that this might provide inspiration for further research in this area. Finally, we provide a detailed presentation of our own recent work on modeling...

  12. Energy Systems Modelling Research and Analysis

    DEFF Research Database (Denmark)

    Møller Andersen, Frits; Alberg Østergaard, Poul

    2015-01-01

    This editorial introduces the seventh volume of the International Journal of Sustainable Energy Planning and Management. The volume presents part of the outcome of the project Energy Systems Modelling Research and Analysis (ENSYMORA) funded by the Danish Innovation Fund. The project carried out...

  13. Power system stability modelling, analysis and control

    CERN Document Server

    Sallam, Abdelhay A

    2015-01-01

    This book provides a comprehensive treatment of the subject from both a physical and mathematical perspective and covers a range of topics including modelling, computation of load flow in the transmission grid, stability analysis under both steady-state and disturbed conditions, and appropriate controls to enhance stability.

  14. Model Selection in Data Analysis Competitions

    DEFF Research Database (Denmark)

    Wind, David Kofoed; Winther, Ole

    2014-01-01

    The use of data analysis competitions for selecting the most appropriate model for a problem is a recent innovation in the field of predictive machine learning. Two of the most well-known examples of this trend were the Netflix Competition and, more recently, the competitions hosted on the online platform...

  15. Stochastic Modelling and Analysis of Warehouse Operations

    NARCIS (Netherlands)

    Y. Gong (Yeming)

    2009-01-01

    This thesis studies stochastic models and analysis of warehouse operations. After an overview of stochastic research in warehouse operations, we explore the following topics. Firstly, we search for optimal batch sizes in a parallel-aisle warehouse with online order arrivals. We employ a...

  16. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison among the distributions used in hazard analysis. Simulation techniques have been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We demonstrate the flexibility of the hazard modeling distribution, which can approach different distributions.
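
    The quantity underlying such comparisons is the hazard rate h(t) = f(t)/S(t). As a small illustration (with made-up parameters, not the paper's simulation setup), the snippet below contrasts the constant hazard of the exponential distribution with the Weibull hazard h(t) = (k/lam) * (t/lam)^(k-1).

```python
import numpy as np

def exponential_hazard(t, rate):
    """The exponential distribution has a constant hazard h(t) = rate."""
    return np.full_like(np.asarray(t, dtype=float), rate)

def weibull_hazard(t, shape, scale):
    """Weibull hazard h(t) = (k/lam) * (t/lam)**(k-1); increasing for k > 1."""
    t = np.asarray(t, dtype=float)
    return (shape / scale) * (t / scale) ** (shape - 1.0)

t = np.linspace(0.1, 5.0, 5)
print(exponential_hazard(t, rate=0.5))            # flat failure rate
print(weibull_hazard(t, shape=2.0, scale=2.0))    # wear-out: rises with t
```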

  17. Modeling uncertainty in geographic information and analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Uncertainty modeling and data quality for spatial data and spatial analyses are important topics in geographic information science, together with space and time in geography and spatial analysis. In the past two decades, much effort has been made to research uncertainty modeling for spatial data and analyses. This paper presents our work in this research. In particular, four advances in the research are given: (a) from determinedness-based to uncertainty-based representation of geographic objects in GIS; (b) from uncertainty modeling for static data to dynamic spatial analyses; (c) from modeling uncertainty for spatial data to models; and (d) from error descriptions to quality control for spatial data.

  18. Model Based Analysis of Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Han, Tingting; Kammueller, Florian

    2016-01-01

    In order to detect malicious insider attacks it is important to model and analyse infrastructures and policies of organisations and the insiders acting within them. We extend formal approaches that allow modelling such scenarios by quantitative aspects to enable a precise analysis of security designs. Our framework enables evaluating the risks of an insider attack to happen quantitatively. The framework first identifies an insider's intention to perform an inside attack, using Bayesian networks, and in a second phase computes the probability of success for an inside attack by this actor, using probabilistic model checking. We provide prototype tool support using Matlab for Bayesian networks and PRISM for the analysis of Markov decision processes, and validate the framework with case studies.

  19. Modeling and Analysis of Pulse Skip Modulation

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The state space average model and the large signal models of Pulse Skip Modulation (PSM) mode are given in this paper. Furthermore, based on these models and simulations of PSM converter circuits, the characteristics of the PSM converter are analyzed, including efficiency, frequency spectrum, output voltage ripple, response speed and interference rejection capability. Compared with the PWM control mode, the PSM converter has high efficiency, especially with light loads, quick response, good interference rejection and good EMC characteristics. With slight improvements, PSM could be a good independent regulating mode over the whole operating process of a DC-DC converter. Finally, some experimental results are also presented in this paper.

  20. A Dynamic Model for Energy Structure Analysis

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Energy structure is a complicated system concerning economic development, natural resources, technological innovation, ecological balance, social progress and many other elements. It is not easy to explain clearly the developmental mechanism of an energy system and the mutual relations between the energy system and its related environments by traditional methods. It is necessary to develop a suitable dynamic model that can reflect the dynamic characteristics and the mutual relations of the energy system and its related environments. In this paper, the historical development of China's energy structure was analyzed. A new quantitative analysis model was developed based on system dynamics principles, through analysis of energy resources and of the production and consumption of energy in China in comparison with the world. Finally, this model was used to predict China's future energy structures under different conditions.

  1. Analysis hierarchical model for discrete event systems

    Science.gov (United States)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete-event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network from the highest conceptual level down to the lowest level of local control. Extended Petri nets are used for modelling and control of complex robotic systems. Such a system is structured, controlled and analysed in this paper by using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical model of discrete-event systems as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of a subsystem of the global robotic system. Since Petri models are simple to apply on general computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets. Discrete-event systems are a pragmatic tool for modelling industrial systems, and Petri nets are used here because the system under study is a discrete-event system. To highlight the auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. The proposed simulation of the robotic system using timed Petri nets offers the opportunity to view the timing of the robotic system. From transport and transmission times obtained by spot measurement, graphics are obtained showing the average time for the transport activity, using the parameter sets of finished products individually.

  2. Visualization and Data Analysis for CISM Models

    Science.gov (United States)

    Wiltberger, M.; Guild, T.; Lyon, J. G.

    2003-12-01

    The Center for Integrated Space Weather Modeling (CISM) is working on developing a model from the surface of the Sun to the Earth's ionosphere. Among the many challenges facing this program is the development of a visualization and data analysis package which can be used to examine the results from all of the component models. We have begun to use OpenDX as the core of the CISM visualization and data analysis package. OpenDX is an open source data visualization package based upon IBM's Data Explorer visualization software. This package allows us to provide access to simulation results through either a web-based front end or via a series of module extensions to the OpenDX package which can be installed on the remote user's own machine. Since the software is open source, it is freely available on a wide range of platforms, ranging from SGIs to Intel machines running either Linux or WinNT. The OpenDX software package includes a set of tools for turning an OpenDX visual program into a Java-based web page which allows the user simple control over the parameters plotted and the viewing angle. We begin this presentation with an overview of OpenDX's capabilities and then present sample visualizations from the ENLIL solar wind model, the LFM and RCM magnetosphere models, and the TING ionospheric model. In addition, we illustrate how this package can be used as part of a more advanced data analysis system. In particular, we examine the energy partitioning in the magnetosphere during a series of substorms by using OpenDX to define regions, e.g. plasma sheet and lobes, and then integrate the energy density within them as a function of time. The combination of visualization tools with data analysis routines allows us to develop a deeper understanding of the coupled Sun-Earth system.

  3. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    Science.gov (United States)

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  4. Analysis and Realization on MIMO Channel Model

    Directory of Open Access Journals (Sweden)

    Liu Hui

    2014-04-01

    In order to build a MIMO (Multiple Input Multiple Output) channel model based on IEEE 802.16, the way to build a good MIMO channel model and its analysis are described in this study. By exploiting the spatial freedom of wireless channels, MIMO systems have the potential to achieve high bandwidth efficiency, promoting MIMO to be a key technique in next generation communication systems. As a basic research field of MIMO technologies, MIMO channel modeling significantly serves the performance evaluation of space-time encoding algorithms as well as system-level calibration and simulation. Having the advantages of low inter-antenna correlation and small array size, multi-polarization tends to be a promising technique in future MIMO systems. However, polarization characteristics have not yet been modeled well in current MIMO channel models, so establishing meaningful multi-polarized MIMO channel models has become a hot spot in recent channel modeling investigations. In this study, further research has mainly been made on the related theories of channel models, channel estimation and implementation algorithms, building on others' research work.

  5. Guidelines for system modeling: fault tree analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Hwan; Yang, Joon Eon; Kang, Dae Il; Hwang, Mee Jeong

    2004-07-01

    This document, the guidelines for system modeling related to Fault Tree Analysis (FTA), is intended to provide the analyzer with guidelines for constructing fault trees at the level of capability category II of the ASME PRA standard. In particular, they provide the essential and basic guidelines and related contents to be used in support of revising the Ulchin 3 and 4 PSA model for the risk monitor within capability category II of the ASME PRA standard. Normally the main objective of system analysis is to assess the reliability of the systems modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA is the method used for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to the system unavailability. This should include contributions due to mechanical failures of the components, Common Cause Failures (CCFs), human errors and outages for testing and maintenance. This document identifies and describes the definitions and general procedures of FTA and the essential and basic guidelines for revising the fault trees. Accordingly, the guidelines will be capable of guiding the FTA to the level of capability category II of the ASME PRA standard.
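
    Assuming independent basic events (an assumption the guidelines themselves refine with CCF treatment), an AND gate multiplies probabilities and an OR gate combines them as 1 - prod(1 - p_i). The toy evaluator below is only a sketch of that gate logic, with hypothetical event probabilities.

```python
from math import prod

def gate_and(probs):
    """P(AND) = product of basic-event probabilities (independence assumed)."""
    return prod(probs)

def gate_or(probs):
    """P(OR) = 1 - product of complements (independence assumed)."""
    return 1.0 - prod(1.0 - p for p in probs)

# Toy tree: TOP = OR(pump_fails, AND(valve_fails, operator_error))
pump_fails, valve_fails, operator_error = 1e-3, 5e-3, 1e-2  # hypothetical values
top = gate_or([pump_fails, gate_and([valve_fails, operator_error])])
print(f"top event probability = {top:.2e}")
```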

  6. Operational modal analysis by updating autoregressive model

    Science.gov (United States)

    Vu, V. H.; Thomas, M.; Lakis, A. A.; Marcouiller, L.

    2011-04-01

    This paper presents improvements of a multivariable autoregressive (AR) model for applications in operational modal analysis, considering simultaneously the temporal response data of multi-channel measurements. The parameters are estimated by using the least squares method via the implementation of the QR factorization. A new noise-rate-based factor called the Noise rate Order Factor (NOF) is introduced for use in the effective selection of model order and noise rate estimation. For the selection of structural modes, an orderwise criterion called the Order Modal Assurance Criterion (OMAC) is used, based on the correlation of mode shapes computed from two successive orders. Specifically, the algorithm is updated with respect to model order from a small value to produce a cost-effective computation. Furthermore, the confidence intervals of each natural frequency, damping ratio and mode shape are also computed and evaluated with respect to model order and noise rate. This method is thus very effective for identifying the modal parameters in the case of ambient vibrations encountered in modern output-only modal analysis. Simulations and discussions on a steel plate structure are presented, and the experimental results show good agreement with the finite element analysis.
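
    The least-squares step can be sketched in a few lines: stack lagged responses into a regressor matrix and solve via QR factorization. The demo below does this for a scalar AR(p) model on synthetic data, a deliberately simplified stand-in for the paper's multivariable formulation.

```python
import numpy as np

def fit_ar(y, p):
    """Fit y[t] = a_1*y[t-1] + ... + a_p*y[t-p] by least squares via QR."""
    X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    b = y[p:]
    Q, R = np.linalg.qr(X)
    return np.linalg.solve(R, Q.T @ b)

rng = np.random.default_rng(0)
true_a = np.array([1.6, -0.9])                 # lightly damped oscillatory mode
y = np.zeros(500)
for t in range(2, 500):
    y[t] = true_a @ y[t - 2:t][::-1] + 0.1 * rng.standard_normal()
print(np.round(fit_ar(y, p=2), 3))             # should be close to [1.6, -0.9]
```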

  7. Modelling dominance in a flexible intercross analysis

    Directory of Open Access Journals (Sweden)

    Besnier Francois

    2009-06-01

    Background: The aim of this paper is to develop a flexible model for analysis of quantitative trait loci (QTL) in outbred line crosses, which includes both additive and dominance effects. Our flexible intercross analysis (FIA) model accounts for QTL that are not fixed within founder lines and is based on the variance component framework. Genome scans with FIA are performed using a score statistic, which does not require variance component estimation. Results: Simulations of a pedigree with 800 F2 individuals showed that the power of FIA including both additive and dominance effects was almost 50% for a QTL with equal allele frequencies in both lines, complete dominance, and a moderate effect, whereas the power of a traditional regression model was equal to the chosen significance level of 5%. The power of FIA without dominance effects in the model was close to that obtained for FIA with dominance for all simulated cases except QTL with overdominant effects. A genome-wide linkage analysis of experimental data from an F2 intercross between Red Jungle Fowl and White Leghorn was performed with both additive and dominance effects included in FIA. The score values for chicken body weight at 200 days of age were similar to those obtained in FIA analysis without dominance. Conclusion: We have extended FIA to include QTL dominance effects. The power of FIA was superior, or similar, to standard regression methods for QTL effects with dominance. The difference in power for FIA with or without dominance is expected to be small as long as the QTL effects are not overdominant. We suggest that FIA with only additive effects should be the standard model to be used, especially since it is more computationally efficient.

  8. Social phenomena from data analysis to models

    CERN Document Server

    Perra, Nicola

    2015-01-01

    This book focuses on the new possibilities and approaches to social modeling currently being made possible by an unprecedented variety of datasets generated by our interactions with modern technologies. This area has witnessed a veritable explosion of activity over the last few years, yielding many interesting and useful results. Our aim is to provide an overview of the state of the art in this area of research, merging an extremely heterogeneous array of datasets and models. Social Phenomena: From Data Analysis to Models is divided into two parts. Part I deals with modeling social behavior under normal conditions: How we live, travel, collaborate and interact with each other in our daily lives. Part II deals with societal behavior under exceptional conditions: Protests, armed insurgencies, terrorist attacks, and reactions to infectious diseases. This book offers an overview of one of the most fertile emerging fields bringing together practitioners from scientific communities as diverse as social sciences, p...

  9. Modeling and Thermal Analysis of Disc

    OpenAIRE

    Brake Praveena S; Lava Kumar M

    2014-01-01

    The disc brake is a device used for slowing or stopping the rotation of a vehicle. Repeated use of the brake leads to heat generation during the braking event, such that the disc brake can fail due to high temperature. The disc brake model is built in CATIA and the analysis is done using ANSYS Workbench. The main purpose of this project is to study the thermal analysis of the materials Aluminum, Grey Cast Iron, HSS M42, and HSS M2. A comparison between ...

  10. 3D face modeling, analysis and recognition

    CERN Document Server

    Daoudi, Mohamed; Veltkamp, Remco

    2013-01-01

    3D Face Modeling, Analysis and Recognition presents methodologies for analyzing shapes of facial surfaces, develops computational tools for analyzing 3D face data, and illustrates them using state-of-the-art applications. The methodologies chosen are based on efficient representations, metrics, comparisons, and classifications of features that are especially relevant in the context of 3D measurements of human faces. These frameworks have a long-term utility in face analysis, taking into account the anticipated improvements in data collection, data storage, processing speeds, and applications...

  11. Advances in statistical models for data analysis

    CERN Document Server

    Minerva, Tommaso; Vichi, Maurizio

    2015-01-01

    This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.

  12. LCD motion blur: modeling, analysis, and algorithm.

    Science.gov (United States)

    Chan, Stanley H; Nguyen, Truong Q

    2011-08-01

    Liquid crystal display (LCD) devices are well known for their slow responses due to the physical limitations of liquid crystals. Therefore, fast moving objects in a scene are often perceived as blurred. This effect is known as the LCD motion blur. In order to reduce LCD motion blur, an accurate LCD model and an efficient deblurring algorithm are needed. However, existing LCD motion blur models are insufficient to reflect the limitation of human-eye-tracking system. Also, the spatiotemporal equivalence in LCD motion blur models has not been proven directly in the discrete 2-D spatial domain, although it is widely used. There are three main contributions of this paper: modeling, analysis, and algorithm. First, a comprehensive LCD motion blur model is presented, in which human-eye-tracking limits are taken into consideration. Second, a complete analysis of spatiotemporal equivalence is provided and verified using real video sequences. Third, an LCD motion blur reduction algorithm is proposed. The proposed algorithm solves an l(1)-norm regularized least-squares minimization problem using a subgradient projection method. Numerical results show that the proposed algorithm gives higher peak SNR, lower temporal error, and lower spatial error than motion-compensated inverse filtering and Lucy-Richardson deconvolution algorithm, which are two state-of-the-art LCD deblurring algorithms.
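
    The reconstruction step described here is an l1-regularized least-squares problem, min_x (1/2)||Ax - b||^2 + lam*||x||_1. The sketch below solves a 1-D toy version by plain subgradient descent with projection onto the valid intensity range; it is a simplified stand-in for the paper's subgradient projection algorithm, with an invented blur kernel, step size, and regularization weight.

```python
import numpy as np

n = 64
A = sum(np.eye(n, k=k) for k in range(-2, 3)) / 5.0   # toy 5-tap motion-blur matrix
x_true = np.zeros(n); x_true[20:28] = 1.0             # sharp test signal
b = A @ x_true + 0.01 * np.random.default_rng(1).standard_normal(n)

lam, step = 1e-3, 0.5
x = b.copy()
for _ in range(2000):
    g = A.T @ (A @ x - b) + lam * np.sign(x)   # subgradient of the objective
    x = np.clip(x - step * g, 0.0, 1.0)        # project onto valid intensity range
print(f"error blurred: {np.linalg.norm(b - x_true):.3f}, "
      f"deblurred: {np.linalg.norm(x - x_true):.3f}")
```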

  13. Extrudate Expansion Modelling through Dimensional Analysis Method

    DEFF Research Database (Denmark)

    A new model framework is proposed to correlate extrudate expansion and extrusion operation parameters for a food extrusion cooking process through the dimensional analysis principle, i.e. the Buckingham pi theorem. Three dimensionless groups, i.e. energy, water content and temperature, are suggested to describe the extrudate expansion. From the three dimensionless groups, an equation with three experimentally determined parameters is derived to express the extrudate expansion. The model is evaluated with whole wheat flour and aquatic feed extrusion experimental data. The average deviations...

  14. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there is a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to have a brighter risk reward it is important to control the likelihood of occurrence of risks rather than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding, the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student and infrastructure based) and the business impact. Lastly, the study revealed that although business cycles vary considerably depending on the industry and/or the institution, most impacts in the HEI (university) fell within the period of one academic year. The recommendation is that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  15. Mathematical analysis of a muscle architecture model.

    Science.gov (United States)

    Navallas, Javier; Malanda, Armando; Gila, Luis; Rodríguez, Javier; Rodríguez, Ignacio

    2009-01-01

    Modeling of muscle architecture, which aims to recreate mathematically the physiological structure of the muscle fibers and motor units, is a powerful tool for understanding and modeling the mechanical and electrical behavior of the muscle. Most of the published models are presented in the form of algorithms, without mathematical analysis of mechanisms or outcomes of the model. Through the study of the muscle architecture model proposed by Stashuk, we present the analytical tools needed to better understand these models. We provide a statistical description for the spatial relations between motor units and muscle fibers. We are particularly concerned with two physiological quantities: the motor unit fiber number, which we expect to be proportional to the motor unit territory area; and the motor unit fiber density, which we expect to be constant for all motor units. Our results indicate that the Stashuk model is in good agreement with the physiological evidence in terms of the expectations outlined above. However, the resulting variance is very high. In addition, a considerable 'edge effect' is present in the outer zone of the muscle cross-section, making the properties of the motor units dependent on their location. This effect is relevant when motor unit territories and muscle cross-section are of similar size.

  16. Energy Systems Modelling Research and Analysis

    DEFF Research Database (Denmark)

    Møller Andersen, Frits; Alberg Østergaard, Poul

    2015-01-01

    This editorial introduces the seventh volume of the International Journal of Sustainable Energy Planning and Management. The volume presents part of the outcome of the project Energy Systems Modelling Research and Analysis (ENSYMORA) funded by the Danish Innovation Fund. The project carried out...... by 11 university and industry partners has improved the basis for decision-making within energy planning and energy scenario making by providing new and improved tools and methods for energy systems analyses....

  17. Scripted Building Energy Modeling and Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hale, E.; Macumber, D.; Benne, K.; Goldwasser, D.

    2012-08-01

    Building energy modeling and analysis is currently a time-intensive, error-prone, and nonreproducible process. This paper describes the scripting platform of the OpenStudio tool suite (http://openstudio.nrel.gov) and demonstrates its use in several contexts. Two classes of scripts are described and demonstrated: measures and free-form scripts. Measures are small, single-purpose scripts that conform to a predefined interface. Because measures are fairly simple, they can be written or modified by inexperienced programmers.

  18. THE FOURIER SERIES MODEL IN MAP ANALYSIS.

    Science.gov (United States)

    During the past several years the double Fourier series has been applied to the analysis of contour-type maps as an alternative to the more commonly used polynomial model. The double Fourier series has high potential in the study of areal variations, inasmuch as a succession of trend maps based on... and it is shown that the double Fourier series can be used to summarize the directional properties of areally-distributed data. An Appendix lists...
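
    A double Fourier series trend surface truncated at order M takes the form z(x, y) ≈ sum over m, n of products of {1, cos, sin}(2*pi*m*x/Lx) and {1, cos, sin}(2*pi*n*y/Ly), fitted by linear least squares. The sketch below fits a first-order surface to synthetic scattered data; the data and truncation order are chosen only for illustration.

```python
import numpy as np

def design_matrix(x, y, order, lx=1.0, ly=1.0):
    """Columns: products of {1, cos, sin}(2*pi*m*x/lx) and {1, cos, sin}(2*pi*n*y/ly)."""
    cols = []
    for m in range(order + 1):
        fx = [np.ones_like(x)] if m == 0 else [np.cos(2*np.pi*m*x/lx), np.sin(2*np.pi*m*x/lx)]
        for n in range(order + 1):
            fy = [np.ones_like(y)] if n == 0 else [np.cos(2*np.pi*n*y/ly), np.sin(2*np.pi*n*y/ly)]
            cols += [u * v for u in fx for v in fy]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
x, y = rng.random(200), rng.random(200)
z = np.sin(2*np.pi*x) * np.cos(2*np.pi*y) + 0.05 * rng.standard_normal(200)

X = design_matrix(x, y, order=1)
coef, *_ = np.linalg.lstsq(X, z, rcond=None)
print(f"fit RMS residual: {np.sqrt(np.mean((X @ coef - z)**2)):.3f}")
```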

  19. Micromechatronics modeling, analysis, and design with Matlab

    CERN Document Server

    Giurgiutiu, Victor

    2009-01-01

    Focusing on recent developments in engineering science, enabling hardware, advanced technologies, and software, Micromechatronics: Modeling, Analysis, and Design with MATLAB®, Second Edition provides clear, comprehensive coverage of mechatronic and electromechanical systems. It applies cornerstone fundamentals to the design of electromechanical systems, covers emerging software and hardware, introduces the rigorous theory, examines the design of high-performance systems, and helps develop problem-solving skills. Along with more streamlined material, this edition adds many new sections to existing...

  20. Modeling and Thermal Analysis of Disc

    Directory of Open Access Journals (Sweden)

    Brake Praveena S

    2014-10-01

    The disc brake is a device used for slowing or stopping the rotation of a vehicle. Repeated use of the brake leads to heat generation during the braking event, such that the disc brake can fail due to high temperature. The disc brake model is built in CATIA and the analysis is done using ANSYS Workbench. The main purpose of this project is to study the thermal analysis of the materials Aluminum, Grey Cast Iron, HSS M42, and HSS M2. Comparing the four materials on the thermal values and material properties obtained from the thermal analysis, the low-thermal-gradient material is preferred. Hence, as the best suitable design, the low-thermal-gradient material Grey Cast Iron is preferred for the disc brake for better performance.

  1. Modeling and analysis of advanced binary cycles

    Energy Technology Data Exchange (ETDEWEB)

    Gawlik, K.

    1997-12-31

    A computer model (Cycle Analysis Simulation Tool, CAST) and a methodology have been developed to perform value analysis for small, low- to moderate-temperature binary geothermal power plants. The value analysis method allows for incremental changes in the levelized electricity cost (LEC) to be determined between a baseline plant and a modified plant. Thermodynamic cycle analyses and component sizing are carried out in the model, followed by economic analysis which provides LEC results. The emphasis of the present work is on evaluating the effect of mixed working fluids instead of pure fluids on the LEC of a geothermal binary plant that uses a simple Organic Rankine Cycle. Four resources were studied, spanning the range of 265°F to 375°F. A variety of isobutane- and propane-based mixtures, in addition to pure fluids, were used as working fluids. This study shows that the use of propane mixtures at a 265°F resource can reduce the LEC by 24% when compared to a base case value that utilizes commercial isobutane as its working fluid. The cost savings drop to 6% for a 375°F resource, where an isobutane mixture is favored. Supercritical cycles were found to have the lowest cost at all resources.
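
    Levelized-cost comparisons like these reduce to a simple ratio once the cycle analysis and component sizing are done: LEC = (capital * CRF + annual O&M) / annual energy, with capital recovery factor CRF = r(1+r)^n / ((1+r)^n - 1). The snippet below computes the incremental LEC change between a baseline and a modified plant using entirely invented numbers; it does not reproduce the report's economic assumptions.

```python
def crf(rate, years):
    """Capital recovery factor: converts a capital cost to an annual payment."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lec(capital, om_per_year, mwh_per_year, rate=0.08, years=30):
    """Levelized electricity cost in $/MWh."""
    return (capital * crf(rate, years) + om_per_year) / mwh_per_year

base = lec(capital=30e6, om_per_year=1.2e6, mwh_per_year=80e3)   # baseline plant
mod = lec(capital=31e6, om_per_year=1.2e6, mwh_per_year=95e3)    # modified plant
print(f"baseline {base:.1f} $/MWh, modified {mod:.1f} $/MWh, "
      f"change {100 * (mod - base) / base:+.1f}%")
```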

  2. Mathematical analysis of epidemiological models with heterogeneity

    Energy Technology Data Exchange (ETDEWEB)

    Van Ark, J.W.

    1992-01-01

    For many diseases in human populations the disease shows dissimilar characteristics in separate subgroups of the population; for example, the probability of disease transmission for gonorrhea or AIDS is much higher from male to female than from female to male. There is reason to construct and analyze epidemiological models which allow for this heterogeneity of the population, and to use these models to run computer simulations of the disease to predict its incidence and prevalence. In the models considered here the heterogeneous population is separated into subpopulations whose internal and external interactions are homogeneous, in the sense that each person in the population can be assumed to have the average behavior of the people of that subpopulation. The first model considered is an SIRS model; i.e., the Susceptible can become Infected, and if so he eventually Recovers with temporary immunity, and after a period of time becomes Susceptible again. Special cases allow for permanent immunity or other variations. This model is analyzed and threshold conditions are given which determine whether the disease dies out or persists. A deterministic model is presented; this model is constructed using difference equations, and it has been used in computer simulations for the AIDS epidemic in the homosexual population in San Francisco. The homogeneous and heterogeneous versions of the differential-equations and difference-equations forms of the deterministic model are analyzed mathematically. In the analysis, equilibria are identified and threshold conditions are set forth: below the threshold the disease dies out, so that the disease-free equilibrium is globally asymptotically stable; above the threshold the disease persists, so that the disease-free equilibrium is unstable and there is a unique endemic equilibrium.
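
    A minimal homogeneous SIRS system is dS/dt = -beta*S*I + omega*R, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I - omega*R, with threshold R0 = beta/gamma. The Euler sketch below (parameters invented) illustrates the die-out-versus-persistence behaviour that the thresholds formalize; it is not the thesis's heterogeneous, subpopulation-structured model.

```python
def sirs(beta, gamma, omega, days=400, dt=0.1):
    """Euler integration of the homogeneous SIRS model (population fractions)."""
    s, i, r = 0.99, 0.01, 0.0
    for _ in range(int(days / dt)):
        ds = -beta * s * i + omega * r
        di = beta * s * i - gamma * i
        dr = gamma * i - omega * r
        s, i, r = s + dt * ds, i + dt * di, r + dt * dr
    return i

for beta in (0.15, 0.40):                    # gamma = 0.2 -> R0 = 0.75 and 2.0
    print(f"R0 = {beta/0.2:.2f}: I(400) = {sirs(beta, 0.2, 0.02):.4f}")
```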

  3. Model reduction using a posteriori analysis

    KAUST Repository

    Whiteley, Jonathan P.

    2010-05-01

    Mathematical models in biology and physiology are often represented by large systems of non-linear ordinary differential equations. In many cases, an observed behaviour may be written as a linear functional of the solution of this system of equations. A technique is presented in this study for automatically identifying key terms in the system of equations that are responsible for a given linear functional of the solution. This technique is underpinned by ideas drawn from a posteriori error analysis. This concept has been used in finite element analysis to identify regions of the computational domain and components of the solution where a fine computational mesh should be used to ensure accuracy of the numerical solution. We use this concept to identify regions of the computational domain and components of the solution where accurate representation of the mathematical model is required for accuracy of the functional of interest. The technique presented is demonstrated by application to a model problem, and then to automatically deduce known results from a cell-level cardiac electrophysiology model. © 2010 Elsevier Inc.

  4. Ontological Modeling for Integrated Spacecraft Analysis

    Science.gov (United States)

    Wicks, Erica

    2011-01-01

    Current spacecraft work as a cooperative group of a number of subsystems. Each of these requires modeling software for development, testing, and prediction. It is the goal of my team to create an overarching software architecture called the Integrated Spacecraft Analysis (ISCA) to aid in deploying the discrete subsystems' models. Such a plan has been attempted in the past, and has failed due to the excessive scope of the project. Our goal in this version of ISCA is to use new resources to reduce the scope of the project, including using ontological models to help link the internal interfaces of subsystems' models with the ISCA architecture. I have created an ontology of functions specific to the modeling system of the navigation system of a spacecraft. The resulting ontology not only links, at an architectural level, language-specific instantiations of the modeling system's code, but also is web-viewable and can act as a documentation standard. This ontology is proof of the concept that ontological modeling can aid in the integration necessary for ISCA to work, and can act as the prototype for future ISCA ontologies.

  5. Topological data analysis of biological aggregation models.

    Science.gov (United States)

    Topaz, Chad M; Ziegelmeier, Lori; Halverson, Tom

    2015-01-01

    We apply tools from topological data analysis to two mathematical models inspired by biological aggregations such as bird flocks, fish schools, and insect swarms. Our data consists of numerical simulation output from the models of Vicsek and D'Orsogna. These models are dynamical systems describing the movement of agents who interact via alignment, attraction, and/or repulsion. Each simulation time frame is a point cloud in position-velocity space. We analyze the topological structure of these point clouds, interpreting the persistent homology by calculating the first few Betti numbers. These Betti numbers count connected components, topological circles, and trapped volumes present in the data. To interpret our results, we introduce a visualization that displays Betti numbers over simulation time and topological persistence scale. We compare our topological results to order parameters typically used to quantify the global behavior of aggregations, such as polarization and angular momentum. The topological calculations reveal events and structure not captured by the order parameters.
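
    At a fixed proximity scale, the zeroth Betti number is simply the number of connected components of the graph joining nearby points. The union-find sketch below counts it for a toy point cloud; this is a single-scale snapshot, not the full persistent homology pipeline used in the paper.

```python
import numpy as np

def betti0(points, eps):
    """Number of connected components when points within eps are joined."""
    n = len(points)
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path compression
            a = parent[a]
        return a
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) <= eps:
                parent[find(i)] = find(j)
    return len({find(i) for i in range(n)})

rng = np.random.default_rng(2)
cloud = np.vstack([rng.normal(0, 0.1, (30, 2)), rng.normal(3, 0.1, (30, 2))])
print(betti0(cloud, eps=0.5))   # two well-separated clusters -> Betti-0 = 2
```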

  6. Mode analysis of numerical geodynamo models

    CERN Document Server

    Schrinner, Martin; Hoyng, Peter

    2011-01-01

    It has been suggested in Hoyng (2009) that dynamo action can be analysed by expansion of the magnetic field into dynamo modes and statistical evaluation of the mode coefficients. We here validate this method by analysing a numerical geodynamo model and comparing the numerically derived mean mode coefficients with the theoretical predictions. The model belongs to the class of kinematically stable dynamos with a dominating axisymmetric, antisymmetric with respect to the equator and non-periodic fundamental dynamo mode. The analysis requires a number of steps: the computation of the so-called dynamo coefficients, the derivation of the temporally and azimuthally averaged dynamo eigenmodes and the decomposition of the magnetic field of the numerical geodynamo model into the eigenmodes. For the determination of the theoretical mode excitation levels the turbulent velocity field needs to be projected on the dynamo eigenmodes. We compare the theoretically and numerically derived mean mode coefficients and find reason...

  7. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  8. Gentrification and models for real estate analysis

    Directory of Open Access Journals (Sweden)

    Gianfranco Brusa

    2013-08-01

    This research proposes a deep analysis of the Milanese real estate market, based on data supplied by three real estate organizations; gentrification appears in some neighborhoods, such as Tortona, Porta Genova, Bovisa and Isola Garibaldi: the last is the subject of the final analysis, a survey of the physical and social state of the area. The survey took place in two periods (2003 and 2009) to compare the evolution of gentrification. The results of the surveys were employed in a simulation by a multi-agent system model, to foresee the long-term evolution of the phenomenon. These neighborhood micro-indicators make it possible to put in evidence actual trends conditioning a local real estate market, which can translate into phenomena such as gentrification. In the present analysis, the employment of cellular automata models applied to a neighborhood in Milan (Isola Garibaldi) produced a dynamic simulation of the gentrification trend over a very long time: the cyclical phenomenon (one loop spans a period of twenty to thirty years) appears several times during a theoretical time of 100-120-150 years. Simulation of long-period scenarios by multi-agent systems and cellular automata provides the estimator with a powerful tool, without limits in implementing it, able to support him in the appraisal judgment. It also stands to reason that such a tool can sustain urban planning and related evaluation processes.

  9. Microblog Sentiment Analysis with Emoticon Space Model

    Institute of Scientific and Technical Information of China (English)

    姜飞; 刘奕群; 孙甲申; 朱璇; 张敏; 马少平

    2015-01-01

    Emoticons have been widely employed to express different types of moods, emotions, and feelings in microblog environments. They are therefore regarded as one of the most important signals for microblog sentiment analysis. Most existing studies use several emoticons that convey clear emotional meanings as noisy sentiment labels or similar sentiment indicators. However, in practical microblog environments, tens or even hundreds of emoticons are frequently adopted and all emoticons have their own unique emotional meanings. Besides, a considerable number of emoticons do not have clear emotional meanings. An improved sentiment analysis model should not overlook these phenomena. Instead of manually assigning sentiment labels to several emoticons that convey relatively clear meanings, we propose the emoticon space model (ESM) that leverages more emoticons to construct word representations from a massive amount of unlabeled data. By projecting words and microblog posts into an emoticon space, the proposed model helps identify subjectivity, polarity, and emotion in microblog environments. The experimental results for a public microblog benchmark corpus (NLP&CC 2013) indicate that ESM effectively leverages emoticon signals and outperforms previous state-of-the-art strategies and benchmark best runs.

  10. Data Logistics and the CMS Analysis Model

    CERN Document Server

    Managan, Julie E

    2009-01-01

    The Compact Muon Solenoid Experiment (CMS) at the Large Hadron Collider (LHC) at CERN has brilliant prospects for uncovering new information about the physical structure of our universe. Soon physicists around the world will participate together in analyzing CMS data in search of new physics phenomena and the Higgs Boson. However, they face a significant problem: with 5 Petabytes of data needing distribution each year, how will physicists get the data they need? How and where will they be able to analyze it? Computing resources and scientists are scattered around the world, while CMS data exists in localized chunks. The CMS computing model only allows analysis of locally stored data, “tethering” analysis to storage. The Vanderbilt CMS team is actively working to solve this problem with the Research and Education Data Depot Network (REDDnet), a program run by Vanderbilt’s Advanced Computing Center for Research and Education (ACCRE).

  11. Spatiochromatic Context Modeling for Color Saliency Analysis.

    Science.gov (United States)

    Zhang, Jun; Wang, Meng; Zhang, Shengping; Li, Xuelong; Wu, Xindong

    2016-06-01

    Visual saliency is one of the most noteworthy perceptual abilities of human vision. Recent progress in cognitive psychology suggests that: 1) visual saliency analysis is mainly completed by the bottom-up mechanism consisting of feedforward low-level processing in primary visual cortex (area V1) and 2) color interacts with spatial cues and is influenced by the neighborhood context, and thus plays an important role in visual saliency analysis. From a computational perspective, most existing saliency modeling approaches exploit multiple independent visual cues, irrespective of their interactions (or do not compute them explicitly), and ignore contextual influences induced by neighboring colors. In addition, the use of color is often underestimated in visual saliency analysis. In this paper, we propose a simple yet effective color saliency model that considers color as the only visual cue and mimics the color processing in V1. Our approach uses region-/boundary-defined color features with spatiochromatic filtering by considering local color-orientation interactions, and therefore captures homogeneous color elements, subtle textures within the object and the overall salient object from the color image. To account for color contextual influences, we present a divisive normalization method for chromatic stimuli through the pooling of contrary/complementary color units. We further define a color perceptual metric over the entire scene to produce saliency maps for color regions and color boundaries individually. These maps are finally integrated into a single saliency map, which is Gaussian-blurred for robustness. We evaluate the proposed method on both synthetic stimuli and several benchmark saliency data sets, from visual saliency analysis to salient object detection. The experimental results demonstrate that the use of color as a unique visual cue achieves competitive results on par with or better than 12 state-of-the-art approaches.
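
    The divisive normalization step admits a compact statement. The form below is the generic textbook one (the response of a unit divided by a pooled neighborhood signal); the paper's specific pooling over contrary/complementary color units may differ in detail:

    ```latex
    R_i \;=\; \frac{E_i^{2}}{\sigma^{2} + \sum_{j \in \mathcal{N}(i)} w_{j}\, E_j^{2}}
    ```

    where $E_i$ is the chromatic response of unit $i$, $\mathcal{N}(i)$ its neighbourhood pool, $w_j$ the pooling weights, and $\sigma$ a semi-saturation constant.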

  12. Modelling and analysis of global coal markets

    Energy Technology Data Exchange (ETDEWEB)

    Trueby, Johannes

    2013-01-17

    The thesis comprises four interrelated essays featuring modelling and analysis of coal markets. Each of the four essays has a dedicated chapter in this thesis. Chapters 2 to 4 have, from a topical perspective, a backward-looking focus and deal with explaining recent market outcomes in the international coal trade. The findings of those essays may serve as guidance for assessing current coal market outcomes as well as expected market outcomes in the near to medium-term future. Chapter 5 has a forward-looking focus and builds a bridge between explaining recent market outcomes and projecting long-term market equilibria. Chapter 2, Strategic Behaviour in International Metallurgical Coal Markets, deals with the market conduct of large exporters in the market for coals used in steel-making in the period 2008 to 2010. In this essay I analyse whether prices and trade flows in the international market for metallurgical coals were subject to non-competitive conduct in the period 2008 to 2010. To do so, I develop mathematical programming models - a Stackelberg model, two varieties of a Cournot model, and a perfect competition model - for computing spatial equilibria in international resource markets. Results are analysed with various statistical measures to assess the prediction accuracy of the models. The results show that real market equilibria cannot be reproduced with a competitive model. However, real market outcomes can be accurately simulated with the non-competitive models, suggesting that market equilibria in the international metallurgical coal trade were subject to the strategic behaviour of coal exporters. Chapter 3 and chapter 4 deal with market power issues in the steam coal trade in the period 2006 to 2008. Steam coals are typically used to produce steam either for electricity generation or for heating purposes. In Chapter 3 we analyse the market behaviour of key exporting countries in the steam coal trade. This chapter features the essay Market Structure Scenarios in ...
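
    As a reminder of the benchmark behind these comparisons, the symmetric single-market Cournot equilibrium can be stated compactly; this is the textbook form, not the thesis's spatial mathematical-programming models:

    ```latex
    \max_{q_i}\; \bigl(a - b\,(q_i + Q_{-i})\bigr)\,q_i - c\,q_i
    \quad\Longrightarrow\quad
    q_i^{*} = \frac{a-c}{b\,(n+1)}, \qquad P^{*} = \frac{a + n c}{n+1},
    ```

    for $n$ identical exporters with inverse demand $P(Q) = a - bQ$ and constant marginal cost $c$; as $n \to \infty$ the price approaches the competitive level $c$, which is why the competitive and oligopolistic models yield distinguishable predictions.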

  13. MODELING ANALYSIS FOR GROUT HOPPER WASTE TANK

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.

    2012-01-04

    The Saltstone facility at the Savannah River Site (SRS) has a grout hopper tank to provide agitator stirring of the Saltstone feed materials. The tank has about 300 gallons of capacity to provide a larger working volume for the grout nuclear waste slurry to be held in case of a process upset, and it is equipped with a mechanical agitator intended to keep the grout in motion and agitated so that it will not begin to set. The primary objective of the work was to evaluate the flow performance of mechanical agitators, to prevent vortex pull-through and provide adequate stirring of the feed materials, and to estimate an agitator speed that gives acceptable flow performance with a 45° pitched four-blade agitator. In addition, the power consumption required for agitator operation was estimated. The modeling calculations were performed in two steps of a Computational Fluid Dynamics (CFD) modeling approach. As a first step, a simple single-stage agitator model with 45° pitched propeller blades was developed for the initial scoping analysis of the flow pattern behaviors over a range of different operating conditions. Based on the initial phase-1 results, a phase-2 model with a two-stage agitator was developed for the final performance evaluations. A series of sensitivity calculations for different agitator designs and operating conditions was performed to investigate the impact of key parameters on the grout hydraulic performance in a 300-gallon hopper tank. For the analysis, viscous shear was modeled by using the Bingham plastic approximation, recalled below. Steady state analyses with a two-equation turbulence model were performed. All analyses were based on three-dimensional results. Recommended operational guidance was developed by using the basic concept that local shear rate profiles and flow patterns can be used as a measure of hydraulic performance and spatial stirring. Flow patterns were estimated by a Lagrangian integration technique along ...
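
    The Bingham plastic approximation has the standard constitutive form (the specific yield stress and plastic viscosity values used for the grout are in the report, not reproduced here):

    ```latex
    \tau = \tau_0 + \mu_p\,\dot{\gamma} \;\; (\tau > \tau_0),
    \qquad
    \dot{\gamma} = 0 \;\; (\tau \le \tau_0),
    ```

    where $\tau_0$ is the yield stress, $\mu_p$ the plastic viscosity, and $\dot{\gamma}$ the shear rate: the grout behaves as a rigid body below the yield stress and flows with Newtonian-like behaviour above it.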

  14. Environmental modeling framework invasiveness: analysis and implications

    Science.gov (United States)

    Environmental modeling frameworks support scientific model development by providing an Application Programming Interface (API) which model developers use to implement models. This paper presents results of an investigation of the framework invasiveness of environmental modeling frameworks. Invasiveness ...

  15. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    Science.gov (United States)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. Modeling and simulation (M&S) environments and infrastructure.

  16. Modeling for Deformable Body and Motion Analysis: A Review

    Directory of Open Access Journals (Sweden)

    Hailang Pan

    2013-01-01

    Full Text Available This paper surveys the modeling methods for deformable human bodies and motion analysis developed over the past 30 years. First, elementary knowledge of human body representation and modeling is introduced. Then, typical human modeling technologies, including 2D models, 3D surface models, geometry-based, physics-based, and anatomy-based approaches, and model-based motion analysis, are summarized. The characteristics of these technologies are analyzed, and the technology accumulated in the field is outlined to give an overview.

  17. Structured analysis and modeling of complex systems

    Science.gov (United States)

    Strome, David R.; Dalrymple, Mathieu A.

    1992-01-01

    The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.

  18. Visual behaviour analysis and driver cognitive model

    Energy Technology Data Exchange (ETDEWEB)

    Baujon, J.; Basset, M.; Gissinger, G.L. [Mulhouse Univ., (France). MIPS/MIAM Lab.

    2001-07-01

    Recent studies on driver behaviour have shown that perception - mainly visual but also proprioceptive perception - plays a key role in the "driver-vehicle-road" system and so considerably affects the driver's decision making. Within the framework of the behaviour analysis and studies low-cost system (BASIL), this paper presents a correlative, qualitative and quantitative study, comparing the information given by visual perception and by the trajectory followed. This information will help to obtain a cognitive model of the Rasmussen type according to different driver classes. Many experiments in real driving situations have been carried out for different driver classes and for a given trajectory profile, using a test vehicle and innovative, specially designed, real-time tools, such as the vision system or the positioning module. (orig.)

  19. Development of hydrogen combustion analysis model

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Tae Jin; Lee, K. D.; Kim, S. N. [Soongsil University, Seoul (Korea, Republic of); Hong, J. S.; Kwon, H. Y. [Seoul National Polytechnic University, Seoul (Korea, Republic of); Kim, Y. B.; Kim, J. S. [Seoul National University, Seoul (Korea, Republic of)

    1997-07-01

    The objective of this project is to construct a credible DB for component reliability by developing methodologies and computer codes for assessing component independent failure and common cause failure probability, incorporating the applicability and dependency of the data. In addition, the ultimate goal is to systematize all the analysis procedures so as to provide plans for preventing component failures by employing flexible tools for the change of specific plant or data sources. For the first subject, we construct a DB for similarity index and dependence matrix and propose a systematic procedure for data analysis by investigating the similarity and redundancy of the generic data sources. Next, we develop a computer code for this procedure and construct a reliability data base for major components. The second subject is focused on developing a CCF procedure for assessing plant specific defense ability, rather than developing another CCF model. We propose a procedure and computer code for estimating CCF event probability by incorporating plant specific defensive measures. 116 refs., 25 tabs., 24 figs. (author)

  20. Tradeoff Analysis for Optimal Multiobjective Inventory Model

    Directory of Open Access Journals (Sweden)

    Longsheng Cheng

    2013-01-01

    Full Text Available The deterministic inventory model, the economic order quantity (EOQ), reveals a tradeoff between carrying inventory and ordering frequency (the classical result is recalled below). For probabilistic demand, the tradeoff surface among annual orders, expected inventory and shortage is useful because it quantifies what the firm must pay in terms of ordering workload and inventory investment to meet the customer service level desired. Based on a triobjective inventory model, this paper employs successive approximation to obtain efficient control policies outlining tradeoffs among conflicting objectives. The nondominated solutions obtained by successive approximation are further used to plot a 3D scatterplot for exploring the relationships between objectives. Visualization of the tradeoffs displayed by the scatterplots justifies the computational effort of the experiment, although the several iterations needed to reach a nondominated solution make the solution procedure lengthy and tedious. Information elicited from the inverse relationships may help managers make deliberate inventory decisions. For future work, developing an efficient and effective solution procedure for tradeoff analysis in multiobjective inventory management seems imperative.
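
    The deterministic tradeoff the abstract starts from is the classical EOQ result, for annual demand $D$, fixed cost $K$ per order, and unit holding cost $h$:

    ```latex
    \min_{Q>0}\; \frac{D}{Q}\,K + \frac{Q}{2}\,h
    \quad\Longrightarrow\quad
    Q^{*} = \sqrt{\frac{2DK}{h}},
    ```

    so a smaller order quantity $Q$ buys lower average inventory $Q/2$ at the cost of more frequent orders $D/Q$, which is exactly the tradeoff the triobjective model generalizes to probabilistic demand.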

  1. Production TTR modeling and dynamic buckling analysis

    Institute of Scientific and Technical Information of China (English)

    Hugh Liu; John Wei; Edward Huang

    2013-01-01

    In a typical tension leg platform (TLP) design, the top tension factor (TTF), measuring the top tension of a top tensioned riser (TTR) relative to its submerged weight in water, is one of the most important design parameters and has to be specified properly; its definition is recalled below. While a very small TTF may lead to excessive vortex induced vibration (VIV), clashing issues and possible compression close to the seafloor, an unnecessarily high TTF may translate into excessive riser cost and vessel payload, and even has impacts on the TLP sizing and design in general. In the process of a production TTR design, it was found that its outer casing can be subjected to compression in a worst-case scenario with some extreme metocean and hardware conditions. The present paper shows how finite element analysis (FEA) models using beam elements and two different software packages (Flexcom and ABAQUS) are constructed to simulate the TTR properly, and especially the pipe-in-pipe effects. An ABAQUS model with hybrid elements (beam elements globally + shell elements locally) can be used to investigate how the outer casing behaves under compression. It is shown that for the specified TTR design, even with its outer casing under some local compression in the worst-case scenario, dynamic buckling would not occur; therefore the TTR design is adequate.
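
    Using generic symbols (the paper's notation is not reproduced), the top tension factor is simply the top tension normalized by the riser's submerged weight:

    ```latex
    \mathrm{TTF} \;=\; \frac{T_{\mathrm{top}}}{W_{\mathrm{submerged}}},
    ```

    with TTF > 1 required for the riser to remain in net tension along its length; raising the TTF suppresses VIV, clashing and near-seafloor compression, but increases riser cost and vessel payload.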

  2. Comparison of Statistical Models for Regional Crop Trial Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qun-yuan; KONG Fan-ling

    2002-01-01

    Based on a review and comparison of the main statistical analysis models for estimating variety-environment cell means in regional crop trials, a new statistical model, the LR-PCA composite model, is proposed, and the predictive precision of these models is compared by cross-validation on example data. Results showed that the order of model precision was LR-PCA model > AMMI model > PCA model > Treatment Means (TM) model > Linear Regression (LR) model > Additive Main Effects ANOVA model. The precision gain factor of the LR-PCA model was 1.55, an increase of 8.4% compared with AMMI.

  3. Linking advanced fracture models to structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chiesa, Matteo

    2001-07-01

    Shell structures with defects occur in many situations. The defects are usually introduced during the welding process necessary for joining different parts of the structure. Higher utilization of structural materials leads to a need for accurate numerical tools for reliable prediction of structural response. The direct discretization of the cracked shell structure with solid finite elements in order to perform an integrity assessment of the structure in question leads to large problems, and makes such analysis infeasible in structural applications. In this study a link between local material models and structural analysis is outlined. An "ad hoc" element formulation is used in order to connect complex material models to the finite element framework used for structural analysis. An improved elasto-plastic line spring finite element formulation, used in order to take cracks into account, is linked to shell elements which are further linked to beam elements. In this way one obtains a global model of the shell structure that also accounts for local flexibilities and fractures due to defects. An important advantage of such an approach is a direct fracture mechanics assessment, e.g. via the computed J-integral or CTOD. A recent development in this approach is the notion of two-parameter fracture assessment. This means that the crack tip stress tri-axiality (constraint) is employed in determining the corresponding fracture toughness, giving a much more realistic capacity of cracked structures. The present thesis is organized in six research articles and an introductory chapter that reviews important background literature related to this work. Papers I and II address the performance of shell and line spring finite elements as a cost effective tool for performing the numerical calculation needed in a fracture assessment. In Paper II a failure assessment, based on the testing of a constraint-corrected fracture mechanics specimen under tension, is ...

  4. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  5. Dynamic modelling and analysis of biochemical networks: mechanism-based models and model-based experiments.

    Science.gov (United States)

    van Riel, Natal A W

    2006-12-01

    Systems biology applies quantitative, mechanistic modelling to study genetic networks, signal transduction pathways and metabolic networks. Mathematical models of biochemical networks can look very different. An important reason is that the purpose and application of a model are essential for the selection of the best mathematical framework. Fundamental aspects of selecting an appropriate modelling framework and a strategy for model building are discussed. Concepts and methods from system and control theory provide a sound basis for the further development of improved and dedicated computational tools for systems biology. Identification of the network components and rate constants that are most critical to the output behaviour of the system is one of the major problems raised in systems biology. Current approaches and methods of parameter sensitivity analysis and parameter estimation are reviewed. It is shown how these methods can be applied in the design of model-based experiments which iteratively yield models that are decreasingly wrong and increasingly gain predictive power.
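
    A minimal example of local parameter sensitivity analysis, one of the method families reviewed, can be given with finite differences on a toy two-step conversion A -> B -> C; the model, parameter values, and the one-percent perturbation are all illustrative choices, not anything from the review:

    ```python
    import numpy as np

    def simulate(k1, k2, t=np.linspace(0, 10, 101)):
        # toy two-step conversion A -> B -> C; output = final amount of C
        a = np.exp(-k1 * t)
        b = k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))
        return 1.0 - a[-1] - b[-1]

    params = {"k1": 0.8, "k2": 0.3}
    base = simulate(**params)
    for name, value in params.items():
        bumped = dict(params, **{name: value * 1.01})         # +1% perturbation
        rel_sens = (simulate(**bumped) - base) / base / 0.01  # ~ d ln y / d ln p
        print(f"relative sensitivity of output to {name}: {rel_sens:+.3f}")
    ```

    Rate constants with large relative sensitivities are the ones most critical to the output behaviour, and hence the ones model-based experiments should target first.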

  6. Applied data analysis and modeling for energy engineers and scientists

    CERN Document Server

    Reddy, T Agami

    2011-01-01

    ""Applied Data Analysis and Modeling for Energy Engineers and Scientists"" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

  7. Molten carbonate fuel cells. Modeling, analysis, simulation, and control

    Energy Technology Data Exchange (ETDEWEB)

    Sundmacher, K.; Kienle, A. [Max-Planck-Institut fuer Dynamik Komplexer Technischer Systeme, Magdeburg (Germany); Pesch, H.J. [Bayreuth Univ. (Germany). Lehrstuhl fuer Ingenieurmathematik; Berndt, J.F. [IPF Beteiligungsgesellschaft Berndt KG, Reilingen (Germany); Huppmann, G. (eds.) [MTU CFC Solutions GmbH, Muenchen (Germany)

    2007-07-01

    This book presents model-based concepts for process analysis and control on a generalized basis. It is structured as follows: Part I - DESIGN AND OPERATION: MTU's Carbonate Fuel Cell HotModule; Operational Experiences. Part II - MODEL-BASED PROCESS ANALYSIS: MCFC Reference Model; Index Analysis of Models; Parameter Identification; Steady State Process Analysis; Hot Spot Formation and Steady State Multiplicities; Conceptual Design and Reforming Concepts. Part III - OPTIMIZATION AND ADVANCED CONTROL: Model Reduction and State Estimation; Optimal Control Strategies; Optimization of Reforming Catalyst Distribution.

  8. Integration of Design and Control through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay;

    2002-01-01

    A systematic computer aided analysis of the process model is proposed as a pre-solution step for the integration of design and control problems. The process model equations are classified in terms of balance equations, constitutive equations and conditional equations. Analysis of the phenomena models ... (structure selection) issues for the integrated problems are considered. (C) 2002 Elsevier Science Ltd. All rights reserved.

  9. Evaluation of RCAS Inflow Models for Wind Turbine Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tangler, J.; Bir, G.

    2004-02-01

    The finite element structural modeling in the Rotorcraft Comprehensive Analysis System (RCAS) provides a state-of-the-art approach to aeroelastic analysis. This, coupled with its ability to model all turbine components, results in a methodology that can simulate the complex system interactions characteristic of large wind turbines. In addition, RCAS is uniquely capable of modeling advanced control algorithms and the resulting dynamic responses.

  10. [Model-based biofuels system analysis: a review].

    Science.gov (United States)

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we review various models developed for or applied to modeling biofuels, and present a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focus on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis is a prerequisite for future biofuels system modeling, and represents a valuable resource for researchers and policy makers.

  11. Analysis on the Logarithmic Model of Relationships

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The logarithmic model is often used to describe the relationships between factors. It often gives good statistical characteristics. Yet, in the process of modeling for soil and water conservation, we have found that this "good" model cannot guarantee good results. In this paper we inquire into the intrinsic reasons, as sketched below. It is shown that the logarithmic model has the property of enlarging or reducing model errors, and the disadvantages of the logarithmic model are analyzed.
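
    A one-line derivation shows the error-magnification property for a fit of the generic form y = a + b ln x (a sketch of the mechanism, not the paper's full analysis):

    ```latex
    y = a + b\ln x
    \quad\Longrightarrow\quad
    \delta y \;\approx\; \frac{b}{x}\,\delta x,
    ```

    so the same absolute disturbance $\delta x$ is damped where $x$ is large and magnified where $x$ is small; a good fit in logarithmic scale therefore does not translate uniformly into good accuracy in the original scale.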

  12. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    Full Text Available In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model, using an example taken from a management study.
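
    A generic hold-out check of discrimination and calibration for a logistic regression, in the spirit of the quantitative performance measures discussed; the synthetic data below stands in for the management-study example, which is not reproduced:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score, brier_score_loss

    # synthetic data generated from a known logistic relationship
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 3))
    y = (rng.random(500) < 1 / (1 + np.exp(-(X @ [1.0, -0.5, 0.3])))).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
    model = LogisticRegression().fit(X_tr, y_tr)
    p = model.predict_proba(X_te)[:, 1]

    print(f"AUC (discrimination): {roc_auc_score(y_te, p):.3f}")
    print(f"Brier score (calibration + accuracy): {brier_score_loss(y_te, p):.3f}")
    ```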

  13. Model Theory in Algebra, Analysis and Arithmetic

    CERN Document Server

    Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J

    2014-01-01

    Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.

  14. An Extended Analysis of Requirements Traceability Model

    Institute of Scientific and Technical Information of China (English)

    Jiang Dandong(蒋丹东); Zhang Shensheng; Chen Lu

    2004-01-01

    A new extended metamodel of traceability is presented. Then, a formalized fine-grained model of traceability is described. Some major issues about this model, including trace units and the requirements and relations within the model, are further analyzed. Finally, a case study that comes from a key project of the 863 Program is given.

  15. Managing Analysis Models in the Design Process

    Science.gov (United States)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  16. Input modelling for subchannel analysis of CANFLEX fuel bundle

    Energy Technology Data Exchange (ETDEWEB)

    Park, Joo Hwan; Jun, Ji Su; Suk, Ho Chun [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    This report describes the input modelling for subchannel analysis of the CANFLEX fuel bundle using the CASS (Candu thermalhydraulic Analysis by Subchannel approacheS) code, which has been developed for subchannel analysis of CANDU fuel channels. The CASS code can give different calculation results according to the user's input modelling. Hence, the objective of this report is to provide the background information for input modelling and the accuracy of the input data, giving confidence in the calculation results. (author). 11 refs., 3 figs., 4 tabs.

  17. Loss Given Default Modelling: Comparative Analysis

    OpenAIRE

    Yashkir, Olga; Yashkir, Yuriy

    2013-01-01

    In this study we investigated several of the most popular Loss Given Default (LGD) models (LSM, Tobit, Three-Tiered Tobit, Beta Regression, Inflated Beta Regression, Censored Gamma Regression) in order to compare their performance. We show that for a given input data set, the quality of the model calibration depends mainly on the proper choice (and availability) of explanatory variables (model factors), but not on the fitting model. Model factors were chosen based on the amplitude of their correlation ...

  18. Model Analysis Assessing the dynamics of student learning

    CERN Document Server

    Bao, L; Bao, Lei; Redish, Edward F.

    2002-01-01

    In this paper we present a method of modeling and analysis that permits the extraction and quantitative display of detailed information about the effects of instruction on a class's knowledge. The method relies on a cognitive model that represents student thinking in terms of mental models. Students frequently fail to recognize relevant conditions that lead to appropriate uses of their models. As a result they can use multiple models inconsistently. Once the most common mental models have been determined by qualitative research, they can be mapped onto a multiple choice test. Model analysis permits the interpretation of such a situation. We illustrate the use of our method by analyzing results from the FCI.

  19. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    Science.gov (United States)

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  20. Introduction to mixed modelling beyond regression and analysis of variance

    CERN Document Server

    Galwey, N W

    2007-01-01

    Mixed modelling is one of the most promising and exciting areas of statistical analysis, enabling more powerful interpretation of data through the recognition of random effects. However, many perceive mixed modelling as an intimidating and specialized technique.

  1. Development of statistical models for data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Downham, D.Y.

    2000-07-01

    Incidents that cause, or could cause, injury to personnel, and that satisfy specific criteria, are reported to the Offshore Safety Division (OSD) of the Health and Safety Executive (HSE). The underlying purpose of this report is to improve ways of quantifying risk, a recommendation in Lord Cullen's report into the Piper Alpha disaster. Records of injuries and hydrocarbon releases from 1 January 1991 to 31 March 1996 are analysed, because the reporting of incidents was standardised after 1990. Models are identified for risk assessment and some are applied. The appropriate analyses of one or two factors (or variables) are tests of uniformity or of independence. Radar graphs are used to represent some temporal variables. Cusums are applied for the analysis of incident frequencies over time, and could be applied for regular monitoring (a sketch is given below). Log-linear models for Poisson-distributed data are identified as being suitable for identifying 'non-random' combinations of more than two factors. Some questions cannot be addressed with the available data: for example, more data are needed to assess the risk of injury per employee in a given time interval. If the questions are considered sufficiently important, resources could be assigned to obtain the data. Some of the main results from the analyses are as follows: the cusum analyses identified a change-point at the end of July 1993, when the reported number of injuries fell by 40%. Injuries were more likely to occur between 8 am and 12 noon or between 2 pm and 5 pm than at other times: between 2 pm and 3 pm the number of injuries was almost twice the average and more than three times the smallest. No seasonal effects in the numbers of injuries were identified. Three-day injuries occurred more frequently on the 5th, 6th and 7th days into a tour of duty than on other days, and less frequently on the 13th and 14th days of a tour of duty. An injury classified as 'lifting or craning' was ...
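
    A one-sided CUSUM of the kind recommended for regular monitoring can be sketched in a few lines; the monthly counts, target rate, allowance k and threshold h below are illustrative choices, not the OSD data:

    ```python
    import numpy as np

    counts = np.array([12, 9, 11, 10, 13, 8, 7, 6, 5, 7, 6, 5])  # monthly injuries
    target = 10.0          # in-control mean rate
    k = 1.0                # allowance (half the shift worth detecting)
    h = 5.0                # decision threshold

    s = 0.0
    for month, c in enumerate(counts, start=1):
        s = min(0.0, s + (c - target) + k)   # accumulates sustained downward shifts
        if s < -h:
            print(f"downward change signalled at month {month} (CUSUM = {s:.1f})")
            break
    ```

    Small month-to-month fluctuations leave the statistic pinned near zero; only a sustained shift, like the 40% drop found in the report, drives it past the threshold.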

  2. Finite element analysis to model complex mitral valve repair.

    Science.gov (United States)

    Labrosse, Michel; Mesana, Thierry; Baxter, Ian; Chan, Vincent

    2016-01-01

    Although finite element analysis has been used to model simple mitral repair, it has not been used to model complex repair. A virtual mitral valve model was successful in simulating normal and abnormal valve function. Models were then developed to simulate an edge-to-edge repair and repair employing quadrangular resection. Stress contour plots demonstrated increased stresses along the mitral annulus, corresponding to the annuloplasty. The role of finite element analysis in guiding clinical practice remains undetermined.

  3. Analysis and Modeling of Traffic in Modern Data Communication Networks

    OpenAIRE

    Babic, G.; Vandalore, B.; Jain, R.

    1998-01-01

    In performance analysis and design of communication networks, modeling data traffic is important. With the introduction of new applications, the characteristics of data traffic change. We present a brief review of the different models of data traffic and how they have evolved. We present results of analysis of data traffic and simulated traffic, which demonstrate that the packet train model fits the traffic at the source-destination level and the long-memory (self-similar) model fits the traffic at the aggregate level.

  4. Analysis of Cortical Flow Models In Vivo

    Science.gov (United States)

    Benink, Hélène A.; Mandato, Craig A.; Bement, William M.

    2000-01-01

    Cortical flow, the directed movement of cortical F-actin and cortical organelles, is a basic cellular motility process. Microtubules are thought to somehow direct cortical flow, but whether they do so by stimulating or inhibiting contraction of the cortical actin cytoskeleton is the subject of debate. Treatment of Xenopus oocytes with phorbol 12-myristate 13-acetate (PMA) triggers cortical flow toward the animal pole of the oocyte; this flow is suppressed by microtubules. To determine how this suppression occurs and whether it can control the direction of cortical flow, oocytes were subjected to localized manipulation of either the contractile stimulus (PMA) or microtubules. Localized PMA application resulted in redirection of cortical flow toward the site of application, as judged by movement of cortical pigment granules, cortical F-actin, and cortical myosin-2A. Such redirected flow was accelerated by microtubule depolymerization, showing that the suppression of cortical flow by microtubules is independent of the direction of flow. Direct observation of cortical F-actin by time-lapse confocal analysis in combination with photobleaching showed that cortical flow is driven by contraction of the cortical F-actin network and that microtubules suppress this contraction. The oocyte germinal vesicle serves as a microtubule organizing center in Xenopus oocytes; experimental displacement of the germinal vesicle toward the animal pole resulted in localized flow away from the animal pole. The results show that 1) cortical flow is directed toward areas of localized contraction of the cortical F-actin cytoskeleton; 2) microtubules suppress cortical flow by inhibiting contraction of the cortical F-actin cytoskeleton; and 3) localized, microtubule-dependent suppression of actomyosin-based contraction can control the direction of cortical flow. We discuss these findings in light of current models of cortical flow. PMID:10930453

  5. Analysis and modeling of parking behavior

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper analyzes the spatial structure of parking behavior and establishes a basic parking behavior model to represent the downtown parking problem; it establishes a parking pricing model to analyze the parking equilibrium with a positive parking fee, uses a paired combinatorial logit model to analyze the effect of integrated trip cost on parking behavior, and concludes from empirical results that the parking behavior model performs well.

  6. An Extensible Model and Analysis Framework

    Science.gov (United States)

    2010-11-01

    ... for a total of 543 seconds. For comparison purposes, in interpreted mode, opening the model took 224 seconds and running the model took 217 seconds. ... The model contains 19683 entities. A comparison of the key model complexity metrics may be found in Table 3 of the report. ... Triquetrum/RCP supports assembling models in arbitrary ways. A prototype OSGi component architecture was developed for use with Netbeans and ...

  7. Analysis on Some of Software Reliability Models

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The software reliability & maintainability evaluation tool (SRMET 3.0), developed by the Software Evaluation and Test Center of China Aerospace Mechanical Corporation, is introduced in detail in this paper. SRMET 3.0 is supported by seven software reliability models and four software maintainability models. Numerical characteristics for all those models are studied in depth in this paper, and corresponding numerical algorithms for each model are also given.

  8. EQUIVALENT MODELS IN COVARIANCE STRUCTURE-ANALYSIS

    NARCIS (Netherlands)

    LUIJBEN, TCW

    1991-01-01

    Defining equivalent models as those that reproduce the same set of covariance matrices, necessary and sufficient conditions are stated for the local equivalence of two expanded identified models M1 and M2 when fitting the more restricted model M0. Assuming several regularity conditions, the rank deficiency ...

  9. Modelling Immune System: Principles, Models,Analysis and Perspectives

    Institute of Scientific and Technical Information of China (English)

    Xiang-hua Li; Zheng-xuan Wang; Tian-yang Lu; Xiang-jiu Che

    2009-01-01

    The biological immune system is a complex adaptive system. There are many benefits to building models of the immune system. Biological researchers can test hypotheses about the infection process or simulate the responses to some drugs. Computer researchers can build distributed, robust and fault-tolerant networks inspired by the functions of the immune system. This paper provides a comprehensive survey of the literature on modelling the immune system. From a methodological perspective, the paper compares and analyzes the existing approaches and models, and also identifies where research effort on immune models is likely to focus in the next few years.

  10. How Many Separable Sources? Model Selection In Independent Components Analysis

    DEFF Research Database (Denmark)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from ... might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian.

  11. Eclipsing binary stars modeling and analysis

    CERN Document Server

    Kallrath, Josef

    1999-01-01

    This book focuses on the formulation of mathematical models for the light curves of eclipsing binary stars, and on the algorithms for generating such models. Since information gained from binary systems provides much of what we know of the masses, luminosities, and radii of stars, such models are acquiring increasing importance in studies of stellar structure and evolution. As in other areas of science, the computer revolution has given many astronomers tools that previously only specialists could use; anyone with access to a set of data can now expect to be able to model it. This book will provide astronomers, both amateur and professional, with a guide for specifying an astrophysical model for a set of observations, selecting an algorithm to determine the parameters of the model, and estimating the errors of the parameters. It is written for readers with knowledge of basic calculus and linear algebra; appendices cover mathematical details on such matters as optimization, coordinate systems, and specific models ...

  12. BIFURCATION ANALYSIS OF A MITOTIC MODEL OF FROG EGGS

    Institute of Scientific and Technical Information of China (English)

    吕金虎; 张子范; 张锁春

    2003-01-01

    The mitotic model of frog eggs established by Borisuk and Tyson is qualitatively analyzed. The existence and stability of its steady states are discussed, and the bifurcation of the above model is further investigated by theoretical analysis and numerical simulations. At the same time, the numerical results of Tyson are verified by theoretical analysis.

  13. A Multilevel Nonlinear Profile Analysis Model for Dichotomous Data

    Science.gov (United States)

    Culpepper, Steven Andrew

    2009-01-01

    This study linked nonlinear profile analysis (NPA) of dichotomous responses with an existing family of item response theory models and generalized latent variable models (GLVM). The NPA method offers several benefits over previous internal profile analysis methods: (a) NPA is estimated with maximum likelihood in a GLVM framework rather than…

  14. Stochastic Analysis Method of Sea Environment Simulated by Numerical Models

    Institute of Scientific and Technical Information of China (English)

    刘德辅; 焦桂英; 张明霞; 温书勤

    2003-01-01

    This paper proposes a stochastic analysis method for sea environment factors simulated by numerical models, such as wave height, current field, design sea levels and longshore sediment transport. Uncertainty and sensitivity analysis of the input and output factors of the numerical models, their long-term distributions and confidence intervals are described in this paper.

  15. An Object Extraction Model Using Association Rules and Dependence Analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Extracting objects from legacy systems is a basic step in a system's object-orientation, to improve the maintainability and understandability of the systems. A new object extraction model using association rules and dependence analysis is proposed. In this model, data are classified by association rules and the corresponding operations are partitioned by dependence analysis.

  16. Book review: Statistical Analysis and Modelling of Spatial Point Patterns

    DEFF Research Database (Denmark)

    Møller, Jesper

    2009-01-01

    Statistical Analysis and Modelling of Spatial Point Patterns by J. Illian, A. Penttinen, H. Stoyan and D. Stoyan. Wiley (2008), ISBN 9780470014912.

  17. Phenomenological analysis of the interacting boson model

    Science.gov (United States)

    Hatch, R. L.; Levit, S.

    1982-01-01

    The classical Hamiltonian of the interacting boson model is defined and expressed in terms of the conventional quadrupole variables. This is used in the analyses of the dynamics in the various limits of the model. The purpose is to determine the range and the features of the collective phenomena which the interacting boson model is capable of describing. In the commonly used version of the interacting boson model with one type of the s and d bosons and quartic interactions, this capability has certain limitations and the model should be used with care. A more sophisticated version of the interacting boson model with neutron and proton bosons is not discussed. NUCLEAR STRUCTURE Interacting bosons, classical IBM Hamiltonian in quadrupole variables, phenomenological content of the IBM and its limitations.

  18. Likelihood analysis of the I(2) model

    DEFF Research Database (Denmark)

    Johansen, Søren

    1997-01-01

    The I(2) model is defined as a submodel of the general vector autoregressive model, by two reduced rank conditions. The model describes stochastic processes with stationary second difference. A parametrization is suggested which makes likelihood inference feasible. Consistency of the maximum likelihood estimator is proved, and the asymptotic distribution of the maximum likelihood estimator is given. It is shown that the asymptotic distribution is either Gaussian, mixed Gaussian or, in some cases, even more complicated.

  19. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework for analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the importance ...

  20. A Bayesian Analysis of Spectral ARMA Model

    Directory of Open Access Journals (Sweden)

    Manoel I. Silvestre Bezerra

    2012-01-01

    Full Text Available Bezerra et al. (2008) proposed a new method, based on Yule-Walker equations, to estimate the ARMA spectral model. In this paper, a Bayesian approach is developed for this model using the noninformative prior proposed by Jeffreys (1967). The Bayesian computations are carried out by simulation via Markov chain Monte Carlo (MCMC), and characteristics of the marginal posterior distributions, such as the Bayes estimator and confidence intervals for the parameters of the ARMA model, are derived. Both methods are also compared with the traditional least squares and maximum likelihood approaches, and a numerical illustration with two examples of the ARMA model is presented to evaluate the performance of the procedures.

  1. The Modeling Analysis of Huangshan Tourism Data

    Science.gov (United States)

    Hu, Shanfeng; Yan, Xinhu; Zhu, Hongbing

    2016-06-01

    Tourism is the major industry in Huangshan city. This paper analyzes the time series of tourism data for Huangshan from 2000 to 2013. The yearly data set comprises the total arrivals of tourists, total income, urban resident disposable income per capita and net income per peasant. A mathematical model based on the binomial approximation and the inverse quadratic radial basis function (RBF) is set up to model the tourist arrivals; a sketch of the RBF step is given below. The total income, urban resident disposable income per capita and net income per peasant are also modeled. It is shown that the established mathematical model can be used to forecast tourism information and support good management of Huangshan tourism.
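
    A sketch of the inverse quadratic RBF interpolation step, with dummy arrival figures in place of the actual Huangshan data; the paper's binomial-approximation step is omitted, and the shape parameter eps is an arbitrary choice:

    ```python
    import numpy as np

    years = np.arange(2000, 2014, dtype=float)
    arrivals = np.array([ 6.1,  6.8,  7.4,  6.9,  8.2,  9.1, 10.3, 11.2,
                         12.0, 13.4, 15.1, 16.8, 18.2, 19.9])  # dummy, millions

    eps = 0.3
    r = np.abs(years[:, None] - years[None, :])
    Phi = 1.0 / (1.0 + (eps * r) ** 2)          # inverse quadratic kernel matrix
    w = np.linalg.solve(Phi, arrivals)          # RBF weights

    def predict(year):
        phi = 1.0 / (1.0 + (eps * np.abs(year - years)) ** 2)
        return phi @ w

    print(f"interpolated arrivals for 2010: {predict(2010):.1f} (data: 15.1)")
    ```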

  2. Modeling and analysis of biomass production systems

    Energy Technology Data Exchange (ETDEWEB)

    Mishoe, J.W.; Lorber, M.N.; Peart, R.M.; Fluck, R.C.; Jones, J.W.

    1984-01-01

    BIOMET is an interactive simulation model that is used to analyze specific biomass and methane production systems. The system model is composed of crop growth, harvesting, transportation, conversion and economic submodels. By use of menus the user can configure the structure and set selected parameters of the system to analyze the effects of variables within the component models. For example, simulations of a water hyacinth system resulted in yields of 63, 48 and 37 Mg/ha/year for different harvest schedules. For napier grass, unit methane costs were $3.04, $2.86 and $2.98 for various yields of biomass. 10 references.

  3. Identifying nonlinear biomechanical models by multicriteria analysis

    Science.gov (United States)

    Srdjevic, Zorica; Cveticanin, Livija

    2012-02-01

    In this study, the methodology developed by Srdjevic and Cveticanin (International Journal of Industrial Ergonomics 34 (2004) 307-318) for the nonbiased (objective) parameter identification of the linear biomechanical model exposed to vertical vibrations is extended to the identification of n-degree of freedom (DOF) nonlinear biomechanical models. The dynamic performance of the n-DOF nonlinear model is described in terms of response functions in the frequency domain, such as the driving-point mechanical impedance and the seat-to-head transmissibility function. For randomly generated parameters of the model, nonlinear equations of motion are solved using the Runge-Kutta method. The appropriate data transformation from the time to the frequency domain is performed by a discrete Fourier transformation. Squared deviations of the response functions from the target values are used as the model performance evaluation criteria, thus shifting the problem into the multicriteria framework. The objective weights of the criteria are obtained by applying the Shannon entropy concept, as sketched below. The suggested methodology is programmed in Pascal and tested on a 4-DOF nonlinear lumped parameter biomechanical model. The identification process over the 2000 generated sets of parameters lasts less than 20 s. The model response obtained with the embedded identified parameters correlates well with the target values, thereby justifying the use of the underlying concept and the mathematical instruments and numerical tools applied. It should be noted that the identified nonlinear model has improved accuracy of the biomechanical response compared to that of a linear model.
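
    The Shannon-entropy weighting used to obtain objective criterion weights is easy to sketch; the paper's tool is written in Pascal, so the Python below with a synthetic score matrix is only illustrative of the standard entropy-weight computation:

    ```python
    import numpy as np

    # rows = candidate parameter sets, columns = evaluation criteria (synthetic)
    scores = np.array([[0.9, 0.2, 0.5],
                       [0.4, 0.8, 0.6],
                       [0.7, 0.5, 0.1],
                       [0.2, 0.9, 0.8]])

    p = scores / scores.sum(axis=0)                       # column-wise proportions
    m = scores.shape[0]
    entropy = -(p * np.log(p)).sum(axis=0) / np.log(m)    # normalised to [0, 1]
    weights = (1.0 - entropy) / (1.0 - entropy).sum()     # high spread => high weight

    print("criterion weights:", np.round(weights, 3))
    ```

    Criteria whose values vary strongly across candidates carry more discriminating information and therefore receive larger objective weights, with no analyst judgment involved.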

  4. An Intelligent Analysis Model for Multisource Volatile Memory

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhang

    2013-09-01

    Full Text Available With the rapid development of networks and distributed computing environments, it has become harder for researchers to base an examination on only one or a few pieces of data source, in persistent data-oriented approaches as well as in volatile memory analysis. Therefore, automatic analysis of mass data and action modeling need to be considered for reporting an entire network attack process. Modeling the situation of multiple volatile data sources can help understand and describe both the thinking process of the investigator and the possible action steps of the attacker. This paper presents a game model for multisource volatile data and applies it to the analysis of main memory images, with the definition of space-time features for volatile element information. Abstract modeling allows the lessons gleaned in performing intelligent analysis, evidence filing and presentation to be automated. Finally, a test demo based on the model is presented to illustrate the whole procedure.

  5. Mathematical analysis and uncertain models of a nitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Harmand, J.; Steyer, J.P.; Queinnec, I.; Bernet, N.

    1995-12-31

    The nonlinear model of a Continuous Stirred Tank Reactor (CSTR) for nitrogen removal, derived from mass balance considerations, can be linearized around a nominal steady state. The analysis of the linear model in terms of stability, observability and controllability makes it possible to highlight the structural properties of the model. Disturbances and uncertainties can then be explicitly expressed in the linear model, so that it completes the modelling in view of a future control scheme for the process. (authors) 13 refs.

  6. Model-free linkage analysis of a binary trait.

    Science.gov (United States)

    Xu, Wei; Bull, Shelley B; Mirea, Lucia; Greenwood, Celia M T

    2012-01-01

    Genetic linkage analysis aims to detect chromosomal regions containing genes that influence risk of specific inherited diseases. The presence of linkage is indicated when a disease or trait cosegregates through the families with genetic markers at a particular region of the genome. Two main types of genetic linkage analysis are in common use, namely model-based linkage analysis and model-free linkage analysis. In this chapter, we focus solely on the latter type and specifically on binary traits or phenotypes, such as the presence or absence of a specific disease. Model-free linkage analysis is based on allele-sharing, where patterns of genetic similarity among affected relatives are compared to chance expectations. Because the model-free methods do not require the specification of the inheritance parameters of a genetic model, they are preferred by many researchers at early stages in the study of a complex disease. We introduce the history of model-free linkage analysis in Subheading 1. Table 1 describes a standard model-free linkage analysis workflow. We describe three popular model-free linkage analysis methods, the nonparametric linkage (NPL) statistic, the affected sib-pair (ASP) likelihood ratio test, and a likelihood approach for pedigrees. The theory behind each linkage test is described in this section, together with a simple example of the relevant calculations. Table 4 provides a summary of popular genetic analysis software packages that implement model-free linkage models. In Subheading 2, we work through the methods on a rich example providing sample software code and output. Subheading 3 contains notes with additional details on various topics that may need further consideration during analysis.

  7. Health care policy development: a critical analysis model.

    Science.gov (United States)

    Logan, Jean E; Pauling, Carolyn D; Franzen, Debra B

    2011-01-01

    This article describes a phased approach for teaching baccalaureate nursing students critical analysis of health care policy, including refinement of existing policy or the foundation to create new policy. Central to this approach is the application of an innovative framework, the Grand View Critical Analysis Model, which was designed to provide a conceptual base for the authentic learning experience. Students come to know the interconnectedness and the importance of the model, which includes issue selection and four phases: policy focus, colleagueship analysis, evidence-based practice analysis, and policy analysis and development.

  8. Application of dimensional analysis in systems modeling and control design

    CERN Document Server

    Balaguer, Pedro

    2013-01-01

    Dimensional analysis is an engineering tool that is widely applied to numerous engineering problems, but has only recently been applied to control theory and problems such as identification and model reduction, robust control, adaptive control, and PID control. Application of Dimensional Analysis in Systems Modeling and Control Design provides an introduction to the fundamentals of dimensional analysis for control engineers, and shows how they can exploit the benefits of the technique to theoretical and practical control problems.

  9. Information analysis for modeling and representation of meaning

    OpenAIRE

    Uda, Norihiko

    1994-01-01

    In this dissertation, information analysis and an information model called the Semantic Structure Model, based on information analysis, are explained for semantic processing. Methods for the self-organization of information are also described. In addition, Information-Base Systems to support thinking in research and development on nonlinear optical materials are explained. As a result of information analysis, general properties of information and structural properties of concepts become clear. Ge...

  10. Nonsynchronous Trading Model and Return Analysis

    Institute of Scientific and Technical Information of China (English)

    LIU Xiao-mao; LI Chu-lin; ZHANG Jun

    2002-01-01

    Nonsynchronous trading is one of the hot issues in financial high-frequency data processing. This paper extends the nonsynchronous trading model studied in [1] and [2] for a financial security, and considers the moment functions of the observable return series for the extended model. Finally, estimators of the parameters are obtained.

  11. Modelling and analysis of Markov reward automata

    NARCIS (Netherlands)

    Guck, Dennis; Timmer, Mark; Hatefi, Hassan; Ruijters, Enno; Stoelinga, Mariëlle

    2014-01-01

    Costs and rewards are important ingredients for many types of systems, modelling critical aspects like energy consumption, task completion, repair costs, and memory usage. This paper introduces Markov reward automata, an extension of Markov automata that allows the modelling of systems incorporating

  12. Multivariate model for test response analysis

    NARCIS (Netherlands)

    Krishnan, S.; Kerkhoff, H.G.

    2010-01-01

    A systematic approach to construct an effective multivariate test response model for capturing manufacturing defects in electronic products is described. The effectiveness of the model is demonstrated by its capability in reducing the number of test-points, while achieving the maximal coverage attainable.

  13. Analysis of Modeling Parameters on Threaded Screws.

    Energy Technology Data Exchange (ETDEWEB)

    Vigil, Miquela S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brake, Matthew Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vangoethem, Douglas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    Assembled mechanical systems often contain a large number of bolted connections. These bolted connections (joints) are integral aspects of the load path for structural dynamics and, consequently, are paramount for calculating a structure's stiffness and energy dissipation properties. However, analysts have not found the optimal method to appropriately model these bolted joints. The complexity of the screw geometry causes issues when generating a mesh of the model. This paper explores different approaches to model a screw-substrate connection. Model parameters such as mesh continuity, node alignment, wedge angles, and thread-to-body element size ratios are examined. The results of this study will give analysts a better understanding of the influences of these parameters and will aid in finding the optimal method to model bolted connections.

  14. Perturbative analysis of gauged matrix models

    Science.gov (United States)

    Dijkgraaf, Robbert; Gukov, Sergei; Kazakov, Vladimir A.; Vafa, Cumrun

    2003-08-01

    We analyze perturbative aspects of gauged matrix models, including those where classically the gauge symmetry is partially broken. Ghost fields play a crucial role in the Feynman rules for these vacua. We use this formalism to elucidate the fact that nonperturbative aspects of N=1 gauge theories can be computed systematically using perturbative techniques of matrix models, even if we do not possess an exact solution for the matrix model. As examples we show how the Seiberg-Witten solution for N=2 gauge theory, the Montonen-Olive modular invariance for N=1*, and the superpotential for the Leigh-Strassler deformation of N=4 can be systematically computed in perturbation theory of the matrix model or gauge theory (even though in some of these cases an exact answer can also be obtained by summing up planar diagrams of matrix models).

  15. Perturbative Analysis of Gauged Matrix Models

    CERN Document Server

    Dijkgraaf, R; Kazakov, V A; Vafa, C; Dijkgraaf, Robbert; Gukov, Sergei; Kazakov, Vladimir A.; Vafa, Cumrun

    2003-01-01

    We analyze perturbative aspects of gauged matrix models, including those where classically the gauge symmetry is partially broken. Ghost fields play a crucial role in the Feynman rules for these vacua. We use this formalism to elucidate the fact that non-perturbative aspects of N=1 gauge theories can be computed systematically using perturbative techniques of matrix models, even if we do not possess an exact solution for the matrix model. As examples we show how the Seiberg-Witten solution for N=2 gauge theory, the Montonen-Olive modular invariance for N=1*, and the superpotential for the Leigh-Strassler deformation of N=4 can be systematically computed in perturbation theory of the matrix model/gauge theory (even though in some of these cases the exact answer can also be obtained by summing up planar diagrams of matrix models).

  16. Analysis and modeling of solar irradiance variations

    CERN Document Server

    Yeo, K L

    2014-01-01

    A prominent manifestation of the solar dynamo is the 11-year activity cycle, evident in indicators of solar activity, including solar irradiance. Although a relationship between solar activity and the brightness of the Sun had long been suspected, it was only directly observed after regular satellite measurements became available with the launch of Nimbus-7 in 1978. The measurement of solar irradiance from space is accompanied by the development of models aimed at describing the apparent variability by the intensity excess/deficit effected by magnetic structures in the photosphere. The more sophisticated models, termed semi-empirical, rely on the intensity spectra of photospheric magnetic structures generated with radiative transfer codes from semi-empirical model atmospheres. An established example of such models is SATIRE-S (Spectral And Total Irradiance REconstruction for the Satellite era). One key limitation of current semi-empirical models is the fact that the radiant properties of network and faculae a...

  17. MODAL ANALYSIS OF QUARTER CAR MODEL SUSPENSION SYSTEM

    OpenAIRE

    Viswanath K. Allamraju

    2016-01-01

    The suspension system is very important for comfortable driving and travelling of the passengers. Therefore, this study provides a numerical tool for modeling and analyzing a two-degree-of-freedom quarter car model suspension system. Modal analysis plays a vital role in designing the suspension system. This paper presents the modal analysis of a quarter car model suspension system, considering both the undamped and damped cases. The modal and vertical equations of motion describing the su...
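
    A minimal sketch of such an undamped modal analysis: assemble mass and stiffness matrices for a 2-DOF quarter-car (sprung/unsprung masses, suspension and tyre stiffnesses, with invented parameter values) and solve the generalized eigenproblem for natural frequencies and mode shapes:

```python
# Undamped modal analysis of a 2-DOF quarter-car model: sprung mass ms on a
# suspension spring ks, over an unsprung mass mu on the tyre stiffness kt.
# Parameter values are illustrative, not taken from the paper.
import numpy as np
from scipy.linalg import eigh

ms, mu = 250.0, 40.0          # kg (assumed)
ks, kt = 16000.0, 160000.0    # N/m (assumed)

M = np.diag([ms, mu])
K = np.array([[ ks,      -ks],
              [-ks,  ks + kt]])

# generalized eigenproblem K v = w^2 M v
w2, modes = eigh(K, M)
freqs_hz = np.sqrt(w2) / (2 * np.pi)
print("natural frequencies [Hz]:", freqs_hz)  # ~1.2 Hz body, ~10.6 Hz wheel hop
print("mode shapes (columns):\n", modes)
```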

  18. Sensitivity Analysis of the Gap Heat Transfer Model in BISON.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Schmidt, Rodney C.; Williamson, Richard (INL); Perez, Danielle (INL)

    2014-10-01

    This report summarizes the result of a NEAMS project focused on sensitivity analysis of the heat transfer model in the gap between the fuel rod and the cladding used in the BISON fuel performance code of Idaho National Laboratory. Using the gap heat transfer models in BISON, the sensitivity of the modeling parameters and the associated responses is investigated. The study results in a quantitative assessment of the role of various parameters in the analysis of gap heat transfer in nuclear fuel.

  19. ROOT analysis and implications for the analysis model in ATLAS

    CERN Document Server

    Shibata, A

    2008-01-01

    An impressive amount of effort has been put in to realize a set of frameworks to support analysis in this new paradigm of GRID computing. However, much more than half of a physicist's time is typically spent after the GRID processing of the data. Due to the private nature of this level of analysis, there has been little common framework or methodology. While most physicists agree to use ROOT as the basis of their analysis, a number of approaches are possible for the implementation of the analysis using ROOT: conventional methods using CINT/ACLiC, development using g++, alternative interface through python, and parallel processing methods such as PROOF are some of the choices currently available on the market. Furthermore, in the ATLAS collaboration an additional layer of technology adds to the complexity because the data format is based on the POOL technology, which tends to be less portable. In this study, various modes of ROOT analysis are profiled for comparison with the main focus on the processing speed....

  20. Probabilistic forward model for electroencephalography source analysis

    Energy Technology Data Exchange (ETDEWEB)

    Plis, Sergey M [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); George, John S [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Jun, Sung C [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Ranken, Doug M [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Volegov, Petr L [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Schmidt, David M [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States)

    2007-09-07

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates.

  1. Numerical bifurcation analysis of immunological models with time delays

    Science.gov (United States)

    Luzyanina, Tatyana; Roose, Dirk; Bocharov, Gennady

    2005-12-01

    In recent years, a large number of mathematical models that are described by delay differential equations (DDEs) have appeared in the life sciences. To analyze the models' dynamics, numerical methods are necessary, since analytical studies can only give limited results. In turn, the availability of efficient numerical methods and software packages encourages the use of time delays in mathematical modelling, which may lead to more realistic models. We outline recently developed numerical methods for bifurcation analysis of DDEs and illustrate the use of these methods in the analysis of a mathematical model of human hepatitis B virus infection.
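
    The sketch below shows the simplest numerical treatment such models require: fixed-step Euler integration of a scalar DDE with a history buffer, using the classic delayed logistic (Hutchinson) equation rather than the hepatitis B model of the paper:

```python
# Sketch: fixed-step Euler integration of a scalar DDE with constant delay,
# x'(t) = r x(t) (1 - x(t - tau)), the delayed logistic (Hutchinson) equation.
# A generic illustration only, not the hepatitis B virus model.
import numpy as np

r, tau = 1.8, 1.0            # for r*tau > pi/2 the equilibrium x=1 is unstable
dt, t_end = 0.001, 50.0
lag = int(round(tau / dt))   # delay expressed in time steps

n = int(t_end / dt)
x = np.empty(n + 1)
x[0] = 0.5                   # constant history: x(t) = 0.5 for t <= 0
for k in range(n):
    x_delayed = x[k - lag] if k >= lag else x[0]
    x[k + 1] = x[k] + dt * r * x[k] * (1.0 - x_delayed)

print("late-time min/max:", x[-lag:].min(), x[-lag:].max())  # limit cycle here
```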

  2. Analysis of Brown camera distortion model

    Science.gov (United States)

    Nowakowski, Artur; Skarbek, Władysław

    2013-10-01

    Contemporary image acquisition devices introduce optical distortion into the image. This results in pixel displacement and therefore needs to be compensated for in many computer vision applications. The distortion is usually described by the Brown distortion model, whose parameters can be included in the camera calibration task. In this paper we describe the original model and its dependencies, and analyze orthogonality with regard to radius for its decentering distortion component. We also report experiments with the camera calibration algorithm included in the OpenCV library; in particular, the stability of the distortion parameter estimation is evaluated.
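
    For reference, a sketch of the Brown model's forward mapping for a normalized image point, in the radial-plus-tangential parameterization also used by OpenCV calibration (the coefficient values are illustrative only):

```python
# Brown distortion model, forward direction, for one normalized image point:
# radial terms (k1, k2, k3) plus decentering/tangential terms (p1, p2).
import numpy as np

def brown_distort(xy, k1, k2, k3, p1, p2):
    x, y = xy
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return np.array([x_d, y_d])

# illustrative coefficients only
print(brown_distort((0.3, -0.2), k1=-0.25, k2=0.07, k3=0.0, p1=1e-3, p2=-5e-4))
```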

  3. Multifractal modelling and 3D lacunarity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hanen, Akkari, E-mail: bettaieb.hanen@topnet.t [Laboratoire de biophysique, TIM, Faculte de Medecine (Tunisia); Imen, Bhouri, E-mail: bhouri_imen@yahoo.f [Unite de recherche ondelettes et multifractals, Faculte des sciences (Tunisia); Asma, Ben Abdallah, E-mail: asma.babdallah@cristal.rnu.t [Laboratoire de biophysique, TIM, Faculte de Medecine (Tunisia); Patrick, Dubois, E-mail: pdubois@chru-lille.f [INSERM, U 703, Lille (France); Hedi, Bedoui Mohamed, E-mail: medhedi.bedoui@fmm.rnu.t [Laboratoire de biophysique, TIM, Faculte de Medecine (Tunisia)

    2009-09-28

    This study presents a comparative evaluation of lacunarity of 3D grey level models with different types of inhomogeneity. A new method based on the 'Relative Differential Box Counting' was developed to estimate the lacunarity features of grey level volumes. To validate our method, we generated a set of 3D grey level multifractal models with random, anisotropic and hierarchical properties. Our method gives a lacunarity measurement correlated with the theoretical one and allows a better model classification compared with a classical approach.
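
    As a baseline for comparison, the sketch below computes classical gliding-box lacunarity, Lambda(r) = E[M^2]/E[M]^2 over box masses M, for a toy random 3D volume; the paper's Relative Differential Box Counting variant refines this classical estimator:

```python
# Classical gliding-box lacunarity for a 3D volume: Lambda(r) = E[M^2]/E[M]^2,
# where M is the "mass" inside an r x r x r window. This is the classical
# estimator the paper compares against, not their RDBC variant.
import numpy as np
from scipy.ndimage import uniform_filter

def lacunarity(volume, r):
    # uniform_filter gives the box mean, proportional to the box mass
    m = uniform_filter(volume.astype(float), size=r, mode="constant")
    h = r // 2   # keep only interior boxes, away from padded boundaries
    m = m[h:volume.shape[0]-h, h:volume.shape[1]-h, h:volume.shape[2]-h]
    return (m ** 2).mean() / m.mean() ** 2

rng = np.random.default_rng(0)
vol = (rng.random((64, 64, 64)) < 0.2).astype(float)   # toy random volume
for r in (2, 4, 8, 16):
    print(f"r = {r:2d}  lacunarity = {lacunarity(vol, r):.3f}")
```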

  4. Digital Avionics Information System (DAIS): Training Requirements Analysis Model (TRAMOD).

    Science.gov (United States)

    Czuchry, Andrew J.; And Others

    The training requirements analysis model (TRAMOD) described in this report represents an important portion of the larger effort called the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study. TRAMOD is the second of three models that comprise an LCC impact modeling system for use in the early stages of system development. As…

  5. Multisurface interface model for analysis of masonry structures

    NARCIS (Netherlands)

    Lourenço, P.B.; Rots, J.G.

    1997-01-01

    The performance of an interface elastoplastic constitutive model for the analysis of unreinforced masonry structures is evaluated. Both masonry components are discretized aiming at a rational unit-joint model able to describe cracking, slip, and crushing of the material. The model is formulated in t

  6. Rotordynamics analysis of a Jeffcott model with deadband

    Science.gov (United States)

    Zalik, R. A.

    A method is developed for determining the stability margins of a simple Jeffcott model with deadband via analysis of the discrete Fourier transform of the system response. The model in question is of a uniform, unbalanced, flexible shaft that is supported by a bearing as it rotates about its x axis. This model is represented by a system of coupled nonlinear differential equations.

  7. Complex networks analysis in socioeconomic models

    CERN Document Server

    Varela, Luis M; Ausloos, Marcel; Carrete, Jesus

    2014-01-01

    This chapter aims at reviewing complex networks models and methods that were either developed for or applied to socioeconomic issues, and pertinent to the theme of New Economic Geography. After an introduction to the foundations of the field of complex networks, the present summary adds insights on the statistical mechanical approach, and on the most relevant computational aspects for the treatment of these systems. As the most frequently used model for interacting agent-based systems, a brief description of the statistical mechanics of the classical Ising model on regular lattices, together with recent extensions of the same model on small-world Watts-Strogatz and scale-free Albert-Barabasi complex networks is included. Other sections of the chapter are devoted to applications of complex networks to economics, finance, spreading of innovations, and regional trade and developments. The chapter also reviews results involving applications of complex networks to other relevant socioeconomic issues, including res...

  8. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available The REFII model is an original mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of time series. An advantage of this approach to time series analysis is that it links different methods for time series analysis, connecting traditional data mining tools for time series and constructing new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, meaning it is not limited to a fixed, finite set of methods. First of all, it is a model for the transformation of time series values, which prepares data used by different sets of methods based on the same transformation model in a given problem domain. The REFII model offers a new approach to time series analysis based on a unique transformation model, which serves as a basis for all kinds of time series analysis. Its advantage is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  9. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana;

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...... probability distributions (often used for sensitivity analyses) and prediction intervals. To demonstrate the new method, it is applied to a conceptual rainfall-runoff model using a dataset collected from Melbourne, Australia....

  10. COMPUTER DATA ANALYSIS AND MODELING: COMPLEX STOCHASTIC DATA AND SYSTEMS

    OpenAIRE

    2010-01-01

    This collection of papers includes proceedings of the Ninth International Conference “Computer Data Analysis and Modeling: Complex Stochastic Data and Systems” organized by the Belarusian State University and held in September 2010 in Minsk. The papers are devoted to the topical problems: robust and nonparametric data analysis; statistical analysis of time series and forecasting; multivariate data analysis; design of experiments; statistical signal and image processing...

  11. Structural dynamic analysis with generalized damping models: analysis

    CERN Document Server

    Adhikari, Sondipon

    2013-01-01

    Since Lord Rayleigh introduced the idea of viscous damping in his classic work "The Theory of Sound" in 1877, it has become standard practice to use this approach in dynamics, covering a wide range of applications from aerospace to civil engineering. However, in the majority of practical cases this approach is adopted more for mathematical convenience than for modeling the physics of vibration damping. Over the past decade, extensive research has been undertaken on more general "non-viscous" damping models and vibration of non-viscously damped systems. This book, along with a related book

  12. Constrained Overcomplete Analysis Operator Learning for Cosparse Signal Modelling

    CERN Document Server

    Yaghoobi, Mehrdad; Gribonval, Remi; Davies, Mike E

    2012-01-01

    We consider the problem of learning a low-dimensional signal model from a collection of training samples. The mainstream approach would be to learn an overcomplete dictionary to provide good approximations of the training samples using sparse synthesis coefficients. This famous sparse model has a less well known counterpart, in analysis form, called the cosparse analysis model. In this new model, signals are characterised by their parsimony in a transformed domain using an overcomplete (linear) analysis operator. We propose to learn an analysis operator from a training corpus using a constrained optimisation framework based on L1 optimisation. The reason for introducing a constraint in the optimisation framework is to exclude trivial solutions. Although there is no final answer here for which constraint is the most relevant constraint, we investigate some conventional constraints in the model adaptation field and use the uniformly normalised tight frame (UNTF) for this purpose. We then derive a practical lear...

  13. Three dimensional mathematical model of tooth for finite element analysis

    Directory of Open Access Journals (Sweden)

    Puškar Tatjana

    2010-01-01

    Full Text Available Introduction. The mathematical model of the abutment tooth is the starting point of the finite element analysis of stress and deformation of dental structures. The simplest and easiest way is to form a model according to literature data on the dimensions and morphological characteristics of teeth. Our method is based on forming 3D models using standard geometrical forms (objects) in programmes for solid modeling. Objective. Forming the mathematical model of the abutment of the second upper premolar for finite element analysis of stress and deformation of dental structures. Methods. The abutment tooth has the form of a complex geometric object. It is suitable for modeling in the programme for solid modeling SolidWorks. After analyzing the literature data on the morphological characteristics of teeth, we started the modeling by dividing the tooth (a complex geometric body) into simple geometric bodies (cylinder, cone, pyramid, ...). By connecting simple geometric bodies together, or subtracting bodies from the basic body, we formed the complex geometric body, the tooth. The model was then transferred into Abaqus, a computational programme for finite element analysis. The data were transferred using ACIS SAT, a standard file format for 3D models. Results. Using the programme for solid modeling SolidWorks, we developed three models of the abutment of the second maxillary premolar: the model of the intact abutment, the model of the endodontically treated tooth with two remaining cavity walls, and the model of the endodontically treated tooth with two remaining walls and an inserted post. Conclusion. Mathematical models of the abutment made according to the literature data are very similar to the real abutment, and the simplifications are minimal. These models enable calculations of stress and deformation of dental structures. The finite element analysis provides useful information for understanding biomechanical problems and gives guidance for clinical research.

  14. [Decision analysis in radiology using Markov models].

    Science.gov (United States)

    Golder, W

    2000-01-01

    Markov models (Multistate transition models) are mathematical tools to simulate a cohort of individuals followed over time to assess the prognosis resulting from different strategies. They are applied on the assumption that persons are in one of a finite number of states of health (Markov states). Each condition is given a transition probability as well as an incremental value. Probabilities may be chosen constant or varying over time due to predefined rules. Time horizon is divided into equal increments (Markov cycles). The model calculates quality-adjusted life expectancy employing real-life units and values and summing up the length of time spent in each health state adjusted for objective outcomes and subjective appraisal. This sort of modeling prognosis for a given patient is analogous to utility in common decision trees. Markov models can be evaluated by matrix algebra, probabilistic cohort simulation and Monte Carlo simulation. They have been applied to assess the relative benefits and risks of a limited number of diagnostic and therapeutic procedures in radiology. More interventions should be submitted to Markov analyses in order to elucidate their cost-effectiveness.
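
    The cohort calculation described above is compact enough to sketch directly: a three-state Well/Sick/Dead chain with constant transition probabilities per cycle, a utility (incremental value) for each state, and quality-adjusted life expectancy accumulated over cycles (all numbers invented for illustration):

```python
# Minimal three-state Markov cohort model of the kind described above.
import numpy as np

P = np.array([[0.90, 0.07, 0.03],   # transitions from Well
              [0.00, 0.80, 0.20],   # transitions from Sick
              [0.00, 0.00, 1.00]])  # Dead is absorbing
utility = np.array([1.0, 0.6, 0.0]) # value of one cycle spent in each state

cohort = np.array([1.0, 0.0, 0.0])  # the whole cohort starts Well
qale = 0.0
for cycle in range(100):            # 100 one-year Markov cycles
    qale += cohort @ utility        # quality-adjusted time this cycle
    cohort = cohort @ P             # advance the cohort one cycle

print(f"quality-adjusted life expectancy ~ {qale:.2f} cycle-equivalents")
```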

  15. Computational Models for Analysis of Illicit Activities

    DEFF Research Database (Denmark)

    Nizamani, Sarwat

    result in a fully evolved network. This method of network evolution can help intelligence security analysts to understand the structure of the network. For suspicious emails detection and email author identification, a cluster-based text classification model has been proposed. The model outperformed...... traditional models for both of the tasks. Apart from these globally organized crimes and cybercrimes, there happen specific world issues which affect geographic locations and take the form of bursts of public violence. These kinds of issues have received little attention by the academicians. These issues have...... to describe the phenomenon of contagious public outrage, which eventually leads to the spread of violence following a disclosure of some unpopular political decisions and/or activity. The results shed a new light on terror activity and provide some hint on how to curb the spreading of violence within...

  16. Comparative analysis of Goodwin's business cycle models

    Science.gov (United States)

    Antonova, A. O.; Reznik, S.; Todorov, M. D.

    2016-10-01

    We compare the behavior of solutions of Goodwin's business cycle equation in the form of a neutral delay differential equation with fixed delay (NDDE model) and in the form of differential equations of 3rd, 4th and 5th order (ODE models). Such ODE models (Taylor series expansions of the NDDE in powers of θ) were proposed by N. Dharmaraj and K. Vela Velupillai [6] for investigating the short-period sawtooth oscillations in the NDDE. We show that the ODEs of 3rd, 4th and 5th order may approximate the asymptotic behavior of only the main Goodwin mode, but not the sawtooth modes. If the order of the Taylor series expansion exceeds 5, then the approximate ODE becomes unstable independently of the time lag θ.

  17. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided......, allowing verification procedures to quantify judgements, on how suitable a model is for a given specification — hence mitigating the usual harsh distinction between satisfactory and non-satisfactory system designs. This information, among other things, allows us to evaluate the robustness of our framework......, by studying how small changes to our models affect the verification results. A key source of motivation for this work can be found in The Embedded Systems Design Challenge [HS06] posed by Thomas A. Henzinger and Joseph Sifakis. It contains a call for advances in the state-of-the-art of systems verification...

  18. A patient-centered care ethics analysis model for rehabilitation.

    Science.gov (United States)

    Hunt, Matthew R; Ells, Carolyn

    2013-09-01

    There exists a paucity of ethics resources tailored to rehabilitation. To help fill this ethics resource gap, the authors developed an ethics analysis model specifically for use in rehabilitation care. The Patient-Centered Care Ethics Analysis Model for Rehabilitation is a process model to guide careful moral reasoning for particularly complex or challenging matters in rehabilitation. The Patient-Centered Care Ethics Analysis Model for Rehabilitation was developed over several iterations, with feedback at different stages from rehabilitation professionals and bioethics experts. Development of the model was explicitly informed by the theoretical grounding of patient-centered care and the context of rehabilitation, including the International Classification of Functioning, Disability and Health. Being patient centered, the model encourages (1) shared control of consultations, decisions about interventions, and management of the health problems with the patient and (2) understanding the patient as a whole person who has individual preferences situated within social contexts. Although the major process headings of the Patient-Centered Care Ethics Analysis Model for Rehabilitation resemble typical ethical decision-making and problem-solving models, the probes under those headings direct attention to considerations relevant to rehabilitation care. The Patient-Centered Care Ethics Analysis Model for Rehabilitation is a suitable tool for rehabilitation professionals to use (in real time, for retrospective review, and for training purposes) to help arrive at ethical outcomes.

  19. Coverage Modeling and Reliability Analysis Using Multi-state Function

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Fault tree analysis is an effective method for predicting the reliability of a system. It gives a pictorial representation and logical framework for analyzing the reliability. It has also long been used as an effective method for the quantitative and qualitative analysis of the failure modes of critical systems. In this paper, we propose a new general coverage model (GCM) based on hardware-independent faults. Using this model, an effective software tool can be constructed to detect, locate and recover faults in the faulty system. This model can be applied to identify the key components that can cause the failure of the system using failure mode effect analysis (FMEA).

  20. Applications of model theory to functional analysis

    CERN Document Server

    Iovino, Jose

    2014-01-01

    During the last two decades, methods that originated within mathematical logic, particularly set theory and model theory, have exhibited powerful applications to Banach space theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the

  1. MODELING HUMAN RELIABILITY ANALYSIS USING MIDAS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; Donald D. Dudenhoeffer; Bruce P. Hallbert; Brian F. Gore

    2006-05-01

    This paper summarizes an emerging collaboration between Idaho National Laboratory and NASA Ames Research Center regarding the utilization of high-fidelity MIDAS simulations for modeling control room crew performance at nuclear power plants. The key envisioned uses for MIDAS-based control room simulations are: (i) the estimation of human error with novel control room equipment and configurations, (ii) the investigative determination of risk significance in recreating past event scenarios involving control room operating crews, and (iii) the certification of novel staffing levels in control rooms. It is proposed that MIDAS serves as a key component for the effective modeling of risk in next generation control rooms.

  2. MODELING AND ANALYSIS OF REGIONAL BOUNDARY SYSTEM

    Institute of Scientific and Technical Information of China (English)

    YAN Guangle; WANG Huanchen

    2001-01-01

    In this paper, the problems of modeling and analyzing systems with changeable boundaries are studied. First, a kind of expanding system is set up, in which the changeable boundary is treated as a regional boundary. Then some related models are developed to describe the regional boundary system. Next, the transition or drift of bifurcation points in the system is discussed. A fascinating case is studied in which two or more classes of chaotic attractive points coexist or exist alternately in the same system. Lastly, an effective new method of chaos avoidance for the system is put forward.

  3. SWOT Analysis of Software Development Process Models

    Directory of Open Access Journals (Sweden)

    Ashish B. Sasankar

    2011-09-01

    Full Text Available Software worth billions and trillions of dollars has gone to waste in the past due to the lack of proper techniques for developing software, resulting in a software crisis. Historically, the process of software development has played an important role in software engineering. A number of life cycle models have been developed in the last three decades. This paper is an attempt to analyze software process models using the SWOT method. The objective is to identify the Strengths, Weaknesses, Opportunities and Threats of the Waterfall, Spiral, Prototype and other models.

  4. CRITICAL ANALYSIS OF EVALUATION MODEL LOMCE

    Directory of Open Access Journals (Sweden)

    José Luis Bernal Agudo

    2015-06-01

    Full Text Available The evaluation model that the LOMCE projects is rooted in neoliberal beliefs, reflecting a specific way of understanding the world. What matters is not the process but the results, with evaluation at the center of the teaching-learning processes. The model is poorly planned, since the theory that justifies it is not developed into coherent proposals; there is an excessive concern with excellence, and diversity is left out. A comprehensive way of understanding education should be recovered.

  5. Theoretical analysis and modeling for nanoelectronics

    Science.gov (United States)

    Baccarani, Giorgio; Gnani, Elena; Gnudi, Antonio; Reggiani, Susanna

    2016-11-01

    In this paper we review the evolution of Microelectronics and its transformation into Nanoelectronics, following the predictions of Moore's law, and some of the issues related to this evolution. Next, we discuss the requirements of device modeling and the solutions proposed throughout the years to address the physical effects associated with extreme device miniaturization, such as hot-electron effects, band splitting into multiple sub-bands, quasi-ballistic transport and electron tunneling. The most important physical models are briefly highlighted, and a few simulation results for heterojunction TFETs are reported and discussed.

  6. QUALITATIVE ANALYSIS OF BOBWHITE QUAIL POPULATION MODEL

    Institute of Scientific and Technical Information of China (English)

    李先义; 朱德明

    2003-01-01

    In this paper, the qualitative behavior of solutions of the bobwhite quail population model, where 0 < a < 1 < a + b, p, c ∈ (0, ∞) and k is a nonnegative integer, is investigated. Some necessary and sufficient as well as sufficient conditions for all solutions of the model to oscillate, and some sufficient conditions for all positive solutions of the model to be nonoscillatory and for the convergence of nonoscillatory solutions, are derived. Furthermore, the permanence of every positive solution of the model is also shown. Many known results are improved and extended, and some new results are obtained for G. Ladas' open problems.

  7. Models for Planning. Analysis of Literature and Selected Bibliography. Analysis and Bibliography Series, No. 5.

    Science.gov (United States)

    ERIC Clearinghouse on Educational Management, Eugene, OR.

    This review analyzes current research trends in the application of planning models to broad educational systems. Planning models reviewed include systems approach models, simulation models, operational gaming, linear programing, Markov chain analysis, dynamic programing, and queuing techniques. A 77-item bibliography of recent literature is…

  8. Employing Power Graph Analysis to Facilitate Modeling Molecular Interaction Networks

    Directory of Open Access Journals (Sweden)

    Momchil Nenov

    2015-04-01

    Full Text Available Mathematical modeling is used to explore and understand complex systems ranging from weather patterns to social networks to gene-expression regulatory mechanisms. There is an upper limit to the amount of detail that can be reflected in a model, imposed by finite computational resources. Thus, there are methods to reduce the complexity of the modeled system to its most significant parameters. We discuss the suitability of clustering techniques, in particular Power Graph Analysis, as an intermediate step of modeling.

  9. Data Analysis A Model Comparison Approach, Second Edition

    CERN Document Server

    Judd, Charles M; Ryan, Carey S

    2008-01-01

    This completely rewritten classic text features many new examples, insights and topics including mediational, categorical, and multilevel models. Substantially reorganized, this edition provides a briefer, more streamlined examination of data analysis. Noted for its model-comparison approach and unified framework based on the general linear model, the book provides readers with a greater understanding of a variety of statistical procedures. This consistent framework, including consistent vocabulary and notation, is used throughout to develop fewer but more powerful model building techniques. T

  10. Electrical Power Distribution and Control Modeling and Analysis

    Science.gov (United States)

    Fu, Johnny S.; Liffring, Mark; Mehdi, Ishaque S.

    2001-01-01

    This slide presentation reviews the use of Electrical Power Distribution and Control (EPD&C) Modeling and how modeling can support analysis. The presentation discusses using the EASY5 model to simulate and analyze the Space Shuttle Electric Auxiliary Power Unit. Diagrams of the model schematics are included, as well as graphs of the battery cell impedance, hydraulic load dynamics, and EPD&C response to hydraulic load variations.

  11. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    Science.gov (United States)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
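
    The probabilistic step can be sketched with a deliberately crude stand-in model: sample uncertain inputs, push them through a simple attenuation-factor relation (an assumption for illustration, not the validated 3D model of the study), and read off percentiles of indoor air concentration:

```python
# Hedged sketch of Monte Carlo uncertainty propagation for vapor intrusion.
# The lognormal parameters and the one-line attenuation relation are invented
# placeholders, not the paper's validated three-dimensional model.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
c_source = rng.lognormal(mean=np.log(100.0), sigma=0.5, size=n)  # ug/m3 soil gas
alpha = rng.lognormal(mean=np.log(1e-3), sigma=0.8, size=n)      # attenuation factor
c_indoor = alpha * c_source                                      # ug/m3 indoors

p50, p95 = np.percentile(c_indoor, [50, 95])
print(f"indoor concentration: median = {p50:.3f}, 95th pct = {p95:.3f} ug/m3")
```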

  12. Vibration analysis with MADYMO human models

    NARCIS (Netherlands)

    Verver, M.M.; Hoof, J.F.A.M. van

    2002-01-01

    The importance of comfort for the automotive industry is increasing. Car manufacturers use comfort to distinguish their products from their competitors. However, the development and design of a new car seat or interior is very time consuming and expensive. The introduction of computer models of huma

  13. Future of human models for crash analysis

    NARCIS (Netherlands)

    Wismans, J.S.H.M.; Happee, R.; Hoof, J.F.A.M. van; Lange, R. de

    2001-01-01

    In the crash safety field mathematical models can be applied in practically all areas of research and development including: reconstruction of actual accidents, design (CAD) of the crash response of vehicles, safety devices and roadside facilities, and in support of human impact biomechanical studies

  14. Social Ecological Model Analysis for ICT Integration

    Science.gov (United States)

    Zagami, Jason

    2013-01-01

    ICT integration of teacher preparation programmes was undertaken by the Australian Teaching Teachers for the Future (TTF) project in all 39 Australian teacher education institutions and highlighted the need for guidelines to inform systemic ICT integration approaches. A Social Ecological Model (SEM) was used to positively inform integration…

  15. Financial Markets Analysis by Probabilistic Fuzzy Modelling

    NARCIS (Netherlands)

    J.H. van den Berg (Jan); W.-M. van den Bergh (Willem-Max); U. Kaymak (Uzay)

    2003-01-01

    For successful trading in financial markets, it is important to develop financial models where one can identify different states of the market for modifying one's actions. In this paper, we propose to use probabilistic fuzzy systems for this purpose. We concentrate on Takagi-Sugeno (

  16. Flood Progression Modelling and Impact Analysis

    DEFF Research Database (Denmark)

    Mioc, Darka; Anton, François; Nickerson, B.

    People living in the lower valley of the St. John River, New Brunswick, Canada, frequently experience flooding when the river overflows its banks during spring ice melt and rain. To better prepare the population of New Brunswick for extreme flooding, we developed a new flood prediction model...

  17. Characteristic Analysis of Fire Modeling Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Hwan; Yang, Joon Eon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Jong Hoon [Kyeongmin College, Ujeongbu (Korea, Republic of)

    2004-04-15

    This report documents and compares key features of four zone models: CFAST, COMPBRN IIIE, MAGIC and the Fire Induced Vulnerability Evaluation (FIVE) methodology. CFAST and MAGIC handle multi-compartment, multi-fire problems, using many equations; COMPBRN and FIVE handle single compartment, single fire source problems, using simpler equations. The increased rigor of the formulation of CFAST and MAGIC does not mean that these codes are more accurate in every domain; for instance, the FIVE methodology uses a single zone approximation with a plume/ceiling jet sublayer, while the other models use a two-zone treatment without a plume/ceiling jet sublayer. Comparisons with enclosure fire data indicate that inclusion of plume/ceiling jet sublayer temperatures is more conservative, and generally more accurate, than neglecting them. Adding a plume/ceiling jet sublayer to the two-zone models should be relatively straightforward, but it has not been done yet for any of the two-zone models. Such an improvement is in progress for MAGIC.

  18. Mixture model analysis of complex samples

    NARCIS (Netherlands)

    Wedel, M; ter Hofstede, F; Steenkamp, JBEM

    1998-01-01

    We investigate the effects of a complex sampling design on the estimation of mixture models. An approximate or pseudo likelihood approach is proposed to obtain consistent estimates of class-specific parameters when the sample arises from such a complex design. The effects of ignoring the sample design

  19. An Analysis of Student Model Portability

    Science.gov (United States)

    Valdés Aguirre, Benjamín; Ramírez Uresti, Jorge A.; du Boulay, Benedict

    2016-01-01

    Sharing user information between systems is an area of interest for every field involving personalization. Recommender Systems are more advanced in this aspect than Intelligent Tutoring Systems (ITSs) and Intelligent Learning Environments (ILEs). A reason for this is that the user models of Intelligent Tutoring Systems and Intelligent Learning…

  20. Feature Analysis for Modeling Game Content Quality

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Togelius, Julian

    2011-01-01

    ’ preferences, and by defining the smallest game session size for which the model can still predict reported emotion with acceptable accuracy. Neuroevolutionary preference learning is used to approximate the function from game content to reported emotional preferences. The experiments are based on a modified...

  1. Analysis and modeling of "focus" in context

    DEFF Research Database (Denmark)

    Hovy, Dirk; Anumanchipalli, Gopala; Parlikar, Alok

    2013-01-01

    or speech stimuli. We then build models to show how well we predict that focus word from lexical (and higher) level features. Also, using spectral and prosodic information, we show the differences in these focus words when spoken with and without context. Finally, we show how we can improve speech synthesis...... of these utterances given focus information....

  2. Review and analysis of biomass gasification models

    DEFF Research Database (Denmark)

    Puig Arnavat, Maria; Bruno, Joan Carles; Coronas, Alberto

    2010-01-01

    The use of biomass as a source of energy has been further enhanced in recent years and special attention has been paid to biomass gasification. Due to the increasing interest in biomass gasification, several models have been proposed in order to explain and understand this complex process, and th...

  3. QuantUM: Quantitative Safety Analysis of UML Models

    Directory of Open Access Journals (Sweden)

    Florian Leitner-Fischer

    2011-07-01

    Full Text Available When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods it is still difficult for software and system architects to integrate these techniques into their every day work. This is mainly due to the lack of methods that can be directly applied to architecture level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial strength case study.

  4. Army Contracting Command Workforce Model Analysis

    Science.gov (United States)

    2010-10-04

    College), and he has taught visiting seminars at American University in Cairo, and Instituto de Empresas in Madrid. Dr. Reed retired after 21 years... Advisory Panel, 2007, p. 7) not only points toward a strained workforce that lacks the requisite market expertise, but also to other factors that... Spyropoulos, D. (2005, March). Analysis of career progression and job performance in internal labor markets: The case of

  5. Diagnostics and future evolution analysis of the two parametric models

    CERN Document Server

    Yang, Guang; Meng, Xinhe

    2016-01-01

    In this paper, we apply three diagnostics, including $Om$, the Statefinder hierarchy and the growth rate of perturbations, to discriminating the two parametric models for the effective pressure from the $\Lambda$CDM model. By using the $Om$ diagnostic, we find that both model 1 and model 2 can hardly be distinguished from each other, or from the $\Lambda$CDM model, at the 68% confidence level. As a supplement, by using the Statefinder hierarchy diagnostics and the growth rate of perturbations, we discover that not only can our two parametric models be well distinguished from the $\Lambda$CDM model, but also, compared with the $Om$ diagnostic, model 1 and model 2 can be better distinguished from each other. In addition, we also explore the fate of the universe's evolution in our two models by means of the rip analysis.

  6. Analysis of Irradiance Models for Bifacial PV Modules

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W.; Stein, Joshua S.; Deline, Chris; MacAlpine, Sara; Marion, Bill; Asgharzadeh, Amir; Toor, Fatima

    2016-11-21

    We describe and compare two methods for modeling irradiance on the back surface of rack-mounted bifacial PV modules: view factor models and ray-tracing simulations. For each method we formulate one or more models and compare each model with irradiance measurements and short-circuit current for a bifacial module mounted on a fixed-tilt rack with three other similarly sized modules. Our analysis illustrates the computational requirements of the different methods and provides insight into their practical applications. We find a level of consistency among the models which indicates that consistent models may be obtained by parameter calibration.
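
    At its crudest, the view-factor approach reduces to isotropic geometry; the sketch below estimates rear-side ground-reflected irradiance for a module tilted at beta, ignoring self-shading of the ground by the array, which the paper's models do account for (all numbers assumed):

```python
# Zeroth-order view-factor sketch: the back surface of a module tilted at beta
# has tilt (pi - beta), so its isotropic view factor to the ground is
# (1 - cos(pi - beta)) / 2 = (1 + cos(beta)) / 2. Purely illustrative numbers;
# the paper's view-factor and ray-tracing models are far more detailed.
import numpy as np

ghi = 800.0        # W/m2 global horizontal irradiance (assumed)
albedo = 0.25      # grass-like ground reflectance (assumed)
beta = np.radians(30.0)

vf_rear_ground = (1 + np.cos(beta)) / 2
rear_ground_irr = ghi * albedo * vf_rear_ground
print(f"rear-side ground-reflected irradiance ~ {rear_ground_irr:.0f} W/m2")
```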

  7. Supplementing biomechanical modeling with EMG analysis

    Science.gov (United States)

    Lewandowski, Beth; Jagodnik, Kathleen; Crentsil, Lawton; Humphreys, Bradley; Funk, Justin; Gallo, Christopher; Thompson, William; DeWitt, John; Perusek, Gail

    2016-01-01

    It is well established that astronauts experience musculoskeletal deconditioning when exposed to microgravity environments for long periods of time. Spaceflight exercise is used to counteract these effects, and the Advanced Resistive Exercise Device (ARED) on the International Space Station (ISS) has been effective in minimizing musculoskeletal losses. However, the exercise devices of the new exploration vehicles will have requirements of limited mass, power and volume. Because of these limitations, there is a concern that the exercise devices will not be as effective as ARED in maintaining astronaut performance. Therefore, biomechanical modeling is being performed to provide insight on whether the small Multi-Purpose Crew Vehicle (MPCV) device, which utilizes a single-strap design, will provide sufficient physiological loading to maintain musculoskeletal performance. Electromyography (EMG) data are used to supplement the biomechanical model results and to explore differences in muscle activation patterns during exercises using different loading configurations.

  8. Analysis of mathematical modelling on potentiometric biosensors.

    Science.gov (United States)

    Mehala, N; Rajendran, L

    2014-01-01

    A mathematical model of potentiometric enzyme electrodes for a nonsteady condition has been developed. The model is based on the system of two coupled nonlinear time-dependent reaction diffusion equations for Michaelis-Menten formalism that describes the concentrations of substrate and product within the enzymatic layer. Analytical expressions for the concentration of substrate and product and the corresponding flux response have been derived for all values of parameters using the new homotopy perturbation method. Furthermore, the complex inversion formula is employed in this work to solve the boundary value problem. The analytical solutions obtained allow a full description of the response curves for only two kinetic parameters (unsaturation/saturation parameter and reaction/diffusion parameter). Theoretical descriptions are given for the two limiting cases (zero and first order kinetics) and relatively simple approaches for general cases are presented. All the analytical results are compared with simulation results using Scilab/Matlab program. The numerical results agree with the appropriate theories.
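
    A numerical counterpart to such a model is easy to sketch by the method of lines: discretize the enzymatic layer, apply Michaelis-Menten kinetics, hold a Dirichlet condition at the bulk interface and a no-flux condition at the electrode, and integrate with a stiff solver (parameter values are illustrative stand-ins for the paper's):

```python
# Method-of-lines sketch of coupled substrate/product reaction-diffusion with
# Michaelis-Menten kinetics:
#   ds/dt = Ds s_xx - Vmax s / (Km + s),   dp/dt = Dp p_xx + Vmax s / (Km + s)
import numpy as np
from scipy.integrate import solve_ivp

nx, L = 60, 1.0
x = np.linspace(0.0, L, nx); dx = x[1] - x[0]
Ds = Dp = 1.0; Vmax, Km = 5.0, 0.5; s_bulk = 1.0   # assumed parameters

def rhs(t, y):
    s, p = y[:nx], y[nx:]
    rate = Vmax * s / (Km + s)              # Michaelis-Menten consumption
    ds, dp = np.zeros(nx), np.zeros(nx)
    ds[1:-1] = Ds * (s[2:] - 2*s[1:-1] + s[:-2]) / dx**2 - rate[1:-1]
    dp[1:-1] = Dp * (p[2:] - 2*p[1:-1] + p[:-2]) / dx**2 + rate[1:-1]
    ds[0] = Ds * 2 * (s[1] - s[0]) / dx**2 - rate[0]   # no-flux at electrode
    dp[0] = Dp * 2 * (p[1] - p[0]) / dx**2 + rate[0]
    return np.concatenate((ds, dp))         # ds[-1] = dp[-1] = 0 pins x = L

y0 = np.zeros(2 * nx)
y0[nx - 1] = s_bulk                         # Dirichlet: s = s_bulk, p = 0 at x = L
sol = solve_ivp(rhs, (0.0, 2.0), y0, method="BDF")
print("substrate at the electrode surface:", sol.y[0, -1])
```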

  9. A global sensitivity analysis approach for morphogenesis models

    KAUST Repository

    Boas, Sonja E. M.

    2015-11-21

    Background Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such ‘black-box’ models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. Results To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. Conclusions We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all ‘black-box’ models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
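
    The variance-based machinery underlying such a workflow can be sketched on a cheap stand-in model: Saltelli-style Monte Carlo estimates of first-order and total Sobol indices for the Ishigami test function, rather than the expensive cellular Potts morphogenesis model itself:

```python
# Monte Carlo Sobol indices with the standard A/B sample-matrix scheme.
import numpy as np

def model(X):          # Ishigami test function, 3 inputs in [-pi, pi]
    x1, x2, x3 = X.T
    return np.sin(x1) + 7.0 * np.sin(x2)**2 + 0.1 * x3**4 * np.sin(x1)

rng = np.random.default_rng(1)
n, d = 50_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
yA, yB = model(A), model(B)
var = np.var(np.concatenate([yA, yB]))

for i in range(d):
    ABi = A.copy(); ABi[:, i] = B[:, i]     # A with column i taken from B
    yABi = model(ABi)
    S_first = np.mean(yB * (yABi - yA)) / var          # Saltelli estimator
    S_total = 0.5 * np.mean((yA - yABi) ** 2) / var    # Jansen estimator
    print(f"x{i+1}: first-order ~ {S_first:.2f}, total ~ {S_total:.2f}")
```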

  10. Modeling and analysis of caves using voxelization

    Science.gov (United States)

    Szeifert, Gábor; Szabó, Tivadar; Székely, Balázs

    2014-05-01

    Although there are many ways to create three dimensional representations of caves using modern information technology methods, modeling of caves has been challenging for researchers for a long time. One of these promising new alternative modeling methods is using voxels. We are using geodetic measurements as an input for our voxelization project. These geodetic underground surveys recorded the azimuth, altitude and distance of corner points of cave systems relative to each other. The diameter of each cave section is estimated from separate databases originating from different surveys. We have developed a simple but efficient method (it covers more than 99.9 % of the volume of the input model on the average) to convert these vector-type datasets to voxels. We have also developed software components to make visualization of the voxel and vector models easier. Since each cornerpoint position is measured relative to another cornerpoints positions, propagation of uncertainties is an important issue in case of long caves with many separate sections. We are using Monte Carlo simulations to analyze the effect of the error of each geodetic instrument possibly involved in a survey. Cross-sections of the simulated three dimensional distributions show, that even tiny uncertainties of individual measurements can result in high variation of positions that could be reduced by distributing the closing errors if such data are available. Using the results of our simulations, we can estimate cave volume and the error of the calculated cave volume depending on the complexity of the cave. Acknowledgements: the authors are grateful to Ariadne Karst and Cave Exploring Association and State Department of Environmental and Nature Protection of the Hungarian Ministry of Rural Development, Department of National Parks and Landscape Protection, Section Landscape and Cave Protection and Ecotourism for providing the cave measurement data. BS contributed as an Alexander von Humboldt Research
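
    The segment-to-voxel conversion can be sketched as crude capsule rasterization: stamp spheres of the surveyed radius at points sampled along each cave leg. The grid resolution and the example leg below are invented:

```python
# Sketch of converting one surveyed cave leg (a segment with a radius) into a
# boolean voxel grid, then estimating passage volume from the voxel count.
import numpy as np

res = 0.5                                    # voxel edge length [m]
grid = np.zeros((80, 80, 40), dtype=bool)    # world extent = shape * res

def voxelize_leg(grid, p0, p1, radius):
    n_samples = max(2, int(np.linalg.norm(p1 - p0) / (0.5 * res)))
    ii, jj, kk = np.mgrid[:grid.shape[0], :grid.shape[1], :grid.shape[2]]
    centers = np.stack([ii, jj, kk], axis=-1) * res   # voxel centers [m]
    for t in np.linspace(0.0, 1.0, n_samples):
        c = p0 + t * (p1 - p0)                        # point along the leg
        grid |= np.linalg.norm(centers - c, axis=-1) <= radius
    return grid

p0, p1 = np.array([10.0, 10.0, 5.0]), np.array([25.0, 18.0, 8.0])
grid = voxelize_leg(grid, p0, p1, radius=1.5)
print("estimated passage volume [m^3]:", grid.sum() * res**3)
```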

  11. Analysis of modeling errors in system identification

    Science.gov (United States)

    Hadaegh, F. Y.; Bekey, G. A.

    1986-01-01

    This paper is concerned with the identification of a system in the presence of several error sources. Following some basic definitions, the notion of 'near-equivalence in probability' is introduced using the concept of near-equivalence between a model and a process. Necessary and sufficient conditions for the identifiability of system parameters are given. The effect of structural error on the parameter estimates is considered for both the deterministic and stochastic cases.

  12. Stochastic modeling and analysis of telecoms networks

    CERN Document Server

    Decreusefond, Laurent

    2012-01-01

    This book addresses the stochastic modeling of telecommunication networks, introducing the main mathematical tools for that purpose, such as Markov processes, real and spatial point processes and stochastic recursions, and presenting a wide list of results on stability, performance and comparison of systems. The authors propose a comprehensive mathematical construction of the foundations of stochastic network theory: Markov chains and continuous-time Markov chains are extensively studied using an original martingale-based approach. A complete presentation of stochastic recursions from an

  13. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathaniel O.

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.

  14. Data perturbation analysis of a linear model

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The features of the linear model were carefully studied for the cases of data perturbation and mean-shift perturbation. Some important features were also proved mathematically. The results show that mean-shift perturbation is equivalent to data perturbation; that is, adding a parameter to an observation equation is equivalent to deleting that observation from the data set. The estimate of this parameter is, in fact, its predicted residual.
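
    The stated equivalence is easy to verify numerically; a sketch in Python with ordinary least squares on random data:

        import numpy as np

        rng = np.random.default_rng(1)
        n, p, k = 20, 3, 7            # k: index of the observation to perturb
        X = rng.normal(size=(n, p))
        y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)

        # Model A: delete observation k from the data set.
        mask = np.arange(n) != k
        beta_del, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)

        # Model B: keep all data but add a mean-shift parameter for observation k.
        e_k = np.zeros((n, 1)); e_k[k] = 1.0
        beta_aug, *_ = np.linalg.lstsq(np.hstack([X, e_k]), y, rcond=None)

        print(np.allclose(beta_del, beta_aug[:p]))               # identical coefficients
        print(np.isclose(beta_aug[p], y[k] - X[k] @ beta_del))   # shift = predicted residual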

  15. Space-Time Aquatic Resources Modeling and Analysis Program (STARMAP)

    Data.gov (United States)

    Federal Laboratory Consortium — Colorado State University has received funding from the U.S. Environmental Protection Agency (EPA) for its Space-Time Aquatic Resources Modeling and Analysis Program...

  16. Microscopic Analysis and Modeling of Airport Surface Sequencing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Although a number of airportal surface models exist and have been successfully used for analysis of airportal operations, only recently has it become possible to...

  17. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  18. Reachable set modeling and engagement analysis of exoatmospheric interceptor

    Institute of Scientific and Technical Information of China (English)

    Chai Hua; Liang Yangang; Chen Lei; Tang Guojin

    2014-01-01

    A novel reachable set (RS) model is developed within a framework of exoatmospheric interceptor engagement analysis. The boost phase steering scheme and trajectory distortion mechanism of the interceptor are first explored. A mathematical model of the distorted RS is then formulated through a dimension-reduction analysis. By treating the outer boundary of the RS on a sphere surface as a spherical convex hull, two relevant theorems are proposed and the RS envelope is depicted using computational geometry theory. Based on the RS model, algorithms for intercept window analysis and launch parameter determination are proposed, and numerical simulations are carried out for interceptors with different energy or launch points. Results show that the proposed method can avoid intensive on-line computation and provide an accurate and effective approach for interceptor engagement analysis. The suggested RS model also serves as a ready reference for other related problems such as interceptor effectiveness evaluation and platform disposition.

  19. Reachable set modeling and engagement analysis of exoatmospheric interceptor

    Directory of Open Access Journals (Sweden)

    Chai Hua

    2014-12-01

    Full Text Available A novel reachable set (RS) model is developed within a framework of exoatmospheric interceptor engagement analysis. The boost phase steering scheme and trajectory distortion mechanism of the interceptor are first explored. A mathematical model of the distorted RS is then formulated through a dimension-reduction analysis. By treating the outer boundary of the RS on a sphere surface as a spherical convex hull, two relevant theorems are proposed and the RS envelope is depicted using computational geometry theory. Based on the RS model, algorithms for intercept window analysis and launch parameter determination are proposed, and numerical simulations are carried out for interceptors with different energy or launch points. Results show that the proposed method can avoid intensive on-line computation and provide an accurate and effective approach for interceptor engagement analysis. The suggested RS model also serves as a ready reference for other related problems such as interceptor effectiveness evaluation and platform disposition.

  20. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  1. Spectral Analysis and Atmospheric Models of Microflares

    Institute of Scientific and Technical Information of China (English)

    Cheng Fang; Yu-Hua Tang; Zhi Xu

    2006-01-01

    By use of the high-resolution spectral data obtained with THEMIS on 2002 September 5, the spectra and characteristics of five well-observed microflares have been analyzed. Our results indicate that some of them are located near the longitudinal magnetic polarity inversion lines. All the microflares are accompanied by mass motions. The most obvious characteristic of the Hα microflare spectra is the emission at the center of both Hα and CaII 8542(A) lines. For the first time both thermal and non-thermal semi-empirical atmospheric models for the conspicuous and faint microflares are computed. In computing the non-thermal models, we assume that the electron beam resulting from magnetic reconnection is produced in the chromosphere, because it requires lower energies for the injected particles.It is found there is obvious heating in the low chromosphere. The temperature enhancement is about 1000-2200 K in the thermal models. If the non-thermal effects are included, then the required temperature increase can be reduced by 100-150 K. These imply that the Hα microflares can probably be produced by magnetic reconnection in the solar Iower atmosphere.The radiative and kinetic energies of the Hα microflares are estimated and the total energy is found to be 1027 - 4× 1028 erg.

  2. From phenology models to risk indicator analysis

    Directory of Open Access Journals (Sweden)

    Márta Ladányi

    2010-11-01

    Full Text Available In this paper we outline a phenology model for estimating the budbreak and full bloom starting dates of sour cherry from effective heat sums with reasonable accuracy. With the help of the RegCM3.1 model, the possible trends of phenology timing in the middle of the 21st century can be clearly indicated: a shift to 12-13 days earlier budbreak and 6-7 days earlier full bloom due to warmer weather conditions. For the climatic characterization of the sour cherry bloom period between 1984 and 2010, and for the description of the expected changes in this very sensitive period of sour cherry with respect to the time slice 2021-2050, we introduce seven climatic indicators as artificial weather parameters: the numbers of days when the temperature was under 0°C and above 10°C, the numbers of days with no precipitation and with more than 5 mm of precipitation, as well as the absolute minimum, the mean of minimum and the mean of maximum daily temperatures. We survey the changes of the indicators in the examined period (1984-2010) and, regarding the full bloom start model results, we formulate expectations for the future and make comparisons.
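
    A sketch in Python of the effective-heat-sum (growing degree day) mechanism underlying such phenology models; the base temperature and heat-sum threshold below are placeholders, not the fitted sour cherry values:

        import numpy as np

        def predict_event_day(daily_mean_temp, t_base=5.0, threshold=120.0):
            """Return the first day-of-year on which the accumulated effective
            heat sum (degrees above t_base) reaches the threshold, else None."""
            heat = np.cumsum(np.maximum(np.asarray(daily_mean_temp) - t_base, 0.0))
            hits = np.nonzero(heat >= threshold)[0]
            return int(hits[0]) + 1 if hits.size else None

        # Toy temperature series: a uniformly warmer year pushes the event earlier.
        days = np.arange(365)
        cool = 10.0 * np.sin(2 * np.pi * (days - 100) / 365) + 8.0
        print(predict_event_day(cool), predict_event_day(cool + 2.0))  # warmer => earlier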

  3. Beauty and the beast: Some perspectives on efficient model analysis, surrogate models, and the future of modeling

    Science.gov (United States)

    Hill, M. C.; Jakeman, J.; Razavi, S.; Tolson, B.

    2015-12-01

    For many environmental systems, model runtimes have remained very long as more capable computers have been used to add more processes and more time and space discretization. Scientists have also added more parameters and kinds of observations, and many model runs are needed to explore the models. Computational demand equals run time multiplied by the number of model runs, divided by parallelization opportunities. Model exploration is conducted using sensitivity analysis, optimization, and uncertainty quantification. Sensitivity analysis is used to reveal consequences of what may be very complex simulated relations, optimization is used to identify parameter values that fit the data best, or at least better, and uncertainty quantification is used to evaluate the precision of simulated results. The long execution times make such analyses a challenge. Methods for addressing this challenge include computationally frugal analysis of the demanding original model and a number of ingenious surrogate modeling methods. Both commonly use about 50-100 runs of the demanding original model. In this talk we consider the tradeoffs between (1) original model development decisions, (2) computationally frugal analysis of the original model, and (3) using many model runs of the fast surrogate model. Some questions of interest are as follows. If the added processes and discretization invested in (1) are compared with the restrictions and approximations in model analysis produced by long model execution times, is there a net benefit related to the goals of the model? Are there changes to the numerical methods that could reduce the computational demands while giving up less fidelity than is compromised by using computationally frugal methods or surrogate models for model analysis? Both the computationally frugal methods and surrogate models require that the solution of interest be a smooth function of the parameters of interest. How does the information obtained from the local methods typical

  4. Efficient Global Programming Model for Discriminant Analysis

    Directory of Open Access Journals (Sweden)

    M.ANGULAKSHMI

    2011-03-01

    Full Text Available Conventional statistical analysis includes the capacity to systematically assign individuals to groups. We suggest alternative assignment procedures, utilizing a set of interrelated goal programming formulations. This paper represents an effort to suggest ways by which the discriminant problem might reasonably be addressed via straightforward linear goal programming formulations. Simple and direct, such formulations may ultimately compete with conventional approaches - free of the classical assumptions and possessing a stronger intuitive appeal. We further demonstrate via simple illustration the potential of these procedures to play a significant part in addressing the discriminant problem, and indicate fundamental ideas that lay the foundation for other more sophisticated approaches.
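
    One concrete instance of such a goal-programming formulation is the classic minimize-sum-of-deviations (MSD) model; a sketch in Python using scipy's LP solver (the normalization constraint is one common choice to exclude the trivial all-zero solution, not necessarily the one used by the authors):

        import numpy as np
        from scipy.optimize import linprog

        def msd_discriminant(XA, XB):
            """Fit w, c so that group A scores w@x <= c and group B scores w@x >= c,
            minimizing the total deviation of misclassified points (MSD model)."""
            (nA, p), nB = XA.shape, XB.shape[0]
            n = nA + nB
            # Decision vector: [w (p), c (1), d (n >= 0)].
            cost = np.concatenate([np.zeros(p + 1), np.ones(n)])
            A_ub = np.zeros((n, p + 1 + n))
            A_ub[:nA, :p], A_ub[:nA, p] = XA, -1.0    #  w@x_i - c - d_i <= 0
            A_ub[nA:, :p], A_ub[nA:, p] = -XB, 1.0    # -w@x_j + c - d_j <= 0
            A_ub[:, p + 1:] = -np.eye(n)
            # Normalization w@(mean_B - mean_A) = 1 rules out w = 0, c = 0.
            A_eq = np.zeros((1, p + 1 + n))
            A_eq[0, :p] = XB.mean(axis=0) - XA.mean(axis=0)
            bounds = [(None, None)] * (p + 1) + [(0, None)] * n
            res = linprog(cost, A_ub, np.zeros(n), A_eq, [1.0], bounds=bounds)
            return res.x[:p], res.x[p]

        rng = np.random.default_rng(0)
        w, c = msd_discriminant(rng.normal(0, 1, (30, 2)), rng.normal(2, 1, (30, 2)))
        print(w, c)

    As the abstract notes, such LP formulations need no distributional assumptions, unlike classical discriminant analysis.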

  5. Models and analysis for distributed systems

    CERN Document Server

    Haddad, Serge; Pautet, Laurent; Petrucci, Laure

    2013-01-01

    Nowadays, distributed systems are increasingly present, for public software applications as well as critical systems. This title and Distributed Systems: Design and Algorithms - from the same editors - introduce the underlying concepts, the associated design techniques and the related security issues. The objective of this book is to describe the state of the art of the formal methods for the analysis of distributed systems. Numerous issues remain open and are the topics of major research projects. One current research trend consists of pro

  6. Analysis of Jingdong Mall Logistics Distribution Model

    Science.gov (United States)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in our country has accelerated. The role of logistics has been highlighted, and more and more electronic commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performing a SWOT analysis of the current state of its self-built logistics system, identifying the problems existing in Jingdong Mall's logistics distribution, and giving appropriate recommendations.

  7. Empirical Bayes Model Comparisons for Differential Methylation Analysis

    Directory of Open Access Journals (Sweden)

    Mingxiang Teng

    2012-01-01

    Full Text Available A number of empirical Bayes models (each with different statistical distribution assumptions) have now been developed to analyze differential DNA methylation using high-density oligonucleotide tiling arrays. However, it remains unclear which model performs best. For example, for analysis of differentially methylated regions for conservative and functional sequence characteristics (e.g., enrichment of transcription factor-binding sites (TFBSs)), the sensitivity of such analyses, using various empirical Bayes models, remains unclear. In this paper, five empirical Bayes models were constructed, based on either a gamma distribution or a log-normal distribution, for the identification of differentially methylated loci and their cell-division- (1, 3, and 5) and drug-treatment- (cisplatin) dependent methylation patterns. While differential methylation patterns generated by log-normal models were enriched with numerous TFBSs, we observed almost no TFBS-enriched sequences using gamma-assumption models. Statistical and biological results suggest the log-normal, rather than the gamma, empirical Bayes model distribution to be a highly accurate and precise method for differential methylation microarray analysis. In addition, we presented one of the log-normal models for differential methylation analysis and tested its reproducibility by simulation study. We believe this research to be the first extensive comparison of statistical modeling for the analysis of differential DNA methylation, an important biological phenomenon that precisely regulates gene transcription.

  8. PSAMM: A Portable System for the Analysis of Metabolic Models.

    Directory of Open Access Journals (Sweden)

    Jon Lund Steffensen

    2016-02-01

    Full Text Available The genome-scale models of metabolic networks have been broadly applied in phenotype prediction, evolutionary reconstruction, community functional analysis, and metabolic engineering. Despite the development of tools that support individual steps along the modeling procedure, it is still difficult to associate mathematical simulation results with the annotation and biological interpretation of metabolic models. To solve this problem, we developed the Portable System for the Analysis of Metabolic Models (PSAMM), a new open-source software package that supports the integration of heterogeneous metadata in model annotations and provides a user-friendly interface for the analysis of metabolic models. PSAMM is independent of paid software environments like MATLAB, and all its dependencies are freely available for academic users. Compared to existing tools, PSAMM significantly reduces the running time of constraint-based analysis and enables flexible settings of simulation parameters using simple one-line commands. The integration of heterogeneous, model-specific annotation information in PSAMM is achieved with a novel YAML-based model representation format, which has several advantages, such as providing a modular organization of model components and simulation settings, enabling model version tracking, and permitting the integration of multiple simulation problems. PSAMM also includes a number of quality-checking procedures to examine stoichiometric balance and to identify blocked reactions. Applying PSAMM to 57 models collected from the current literature, we demonstrated how the software can be used for managing and simulating metabolic models. We identified a number of common inconsistencies in existing models and constructed an updated model repository to document the resolution of these inconsistencies.

  9. Neutrosophic Logic for Mental Model Elicitation and Analysis

    Directory of Open Access Journals (Sweden)

    Karina Pérez-Teruel

    2014-03-01

    Full Text Available Mental models are personal, internal representations of external reality that people use to interact with the world around them. They are useful in multiple situations, such as multicriteria decision making, knowledge management, complex system learning and analysis. In this paper a framework for mental model elicitation and analysis based on neutrosophic logic is presented. An illustrative example is provided to show the applicability of the proposal. The paper ends with conclusions and future research directions.

  10. Simulation Modeling and Analysis of Operator-Machine Ratio

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Based on a simulation model of a semiconductor manufacturer, an operator-machine ratio (OMR) analysis is performed using work study and time study. Sensitivity analysis shows that labor utilization decreases as lot size increases. Meanwhile, the analysis identifies that the OMR for this company should be improved from 1:3 to 1:5. An application result shows that the proposed model can effectively improve the OMR by 33%.

  11. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression.

  12. Coping with Complexity Model Reduction and Data Analysis

    CERN Document Server

    Gorban, Alexander N

    2011-01-01

    This volume contains the extended version of selected talks given at the international research workshop 'Coping with Complexity: Model Reduction and Data Analysis', Ambleside, UK, August 31 - September 4, 2009. This book is deliberately broad in scope and aims at promoting new ideas and methodological perspectives. The topics of the chapters range from theoretical analysis of complex and multiscale mathematical models to applications in e.g., fluid dynamics and chemical kinetics.

  13. Sensitivity analysis and optimization of system dynamics models : Regression analysis and statistical design of experiments

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved, using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for
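
    A small sketch in Python of the DOE-plus-regression idea the tutorial covers: run the simulation at the corners of a 2^3 factorial design and estimate main effects by least squares (the response function is a made-up stand-in for a System Dynamics run):

        import numpy as np
        from itertools import product

        design = np.array(list(product([-1.0, 1.0], repeat=3)))  # 2^3 full factorial

        def simulate(x, rng):
            # Hypothetical stand-in for one System Dynamics model run.
            return 10 + 3 * x[0] - 2 * x[1] + 0.5 * x[0] * x[1] + rng.normal(0, 0.1)

        rng = np.random.default_rng(3)
        y = np.array([simulate(x, rng) for x in design])
        A = np.column_stack([np.ones(len(design)), design])   # intercept + main effects
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        print(coef[1:])   # estimated main effects, ~[3, -2, 0] up to noise

    Because the full factorial is orthogonal, the interaction term does not bias the main-effect estimates, which is one reason such designs are favored for what-if analysis.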

  14. Model-based methods for linkage analysis.

    Science.gov (United States)

    Rice, John P; Saccone, Nancy L; Corbett, Jonathan

    2008-01-01

    The logarithm of an odds ratio (LOD) score method originated in a seminal article by Newton Morton in 1955. The method is broadly concerned with issues of power and the posterior probability of linkage, ensuring that a reported linkage has a high probability of being a true linkage. In addition, the method is sequential so that pedigrees or LOD curves may be combined from published reports to pool data for analysis. This approach has been remarkably successful for 50 years in identifying disease genes for Mendelian disorders. After discussing these issues, we consider the situation for complex disorders where the maximum LOD score statistic shares some of the advantages of the traditional LOD score approach, but is limited by unknown power and the lack of sharing of the primary data needed to optimally combine analytic results. We may still learn from the LOD score method as we explore new methods in molecular biology and genetic analysis to utilize the complete human DNA sequence and the cataloging of all human genes.
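
    For the fully informative, phase-known case the LOD computation reduces to a few lines; a sketch in Python (the recombinant counts are invented, not from a real pedigree):

        import numpy as np

        def lod(theta, n_rec, n_nonrec):
            """log10 likelihood ratio of linkage at recombination fraction theta
            versus no linkage (theta = 0.5), for phase-known informative meioses."""
            l_theta = n_rec * np.log10(theta) + n_nonrec * np.log10(1 - theta)
            l_null = (n_rec + n_nonrec) * np.log10(0.5)
            return l_theta - l_null

        thetas = np.linspace(0.01, 0.49, 49)
        scores = lod(thetas, n_rec=2, n_nonrec=18)    # hypothetical pedigree counts
        print(thetas[scores.argmax()], scores.max())  # max LOD; > 3 is the classic bar

    The sequential property the abstract mentions follows from the log scale: LOD scores from independent pedigrees simply add.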

  15. Topic Modeling in Sentiment Analysis: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Toqir Ahmad Rana

    2016-06-01

    Full Text Available With the expansion and acceptance of the World Wide Web, sentiment analysis has become a progressively popular research area in information retrieval and web data analysis. Due to the huge amount of user-generated content on blogs, forums, social media, etc., sentiment analysis has attracted researchers in both academia and industry, since it deals with the extraction of opinions and sentiments. In this paper, we present a review of topic modeling, especially LDA-based techniques, in sentiment analysis. We present a detailed analysis of diverse approaches and techniques, and compare the accuracy of different systems among them. The results of different approaches are summarized, analyzed and presented in a sophisticated fashion. This is among the first efforts to explore different topic modeling techniques in the context of sentiment analysis and to provide a comprehensive comparison among them.
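
    A minimal LDA example in Python with scikit-learn, illustrating the kind of topic model the review surveys (a toy corpus, not the reviewed systems):

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        docs = ["the battery life of this phone is great",
                "terrible battery, the phone died quickly",
                "the hotel room was clean and quiet",
                "noisy room, rude staff, awful hotel"]
        vec = CountVectorizer(stop_words="english")
        tf = vec.fit_transform(docs)                       # term-frequency matrix
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(tf)
        terms = vec.get_feature_names_out()
        for k, topic in enumerate(lda.components_):
            top = [terms[i] for i in topic.argsort()[-4:][::-1]]
            print(f"topic {k}:", top)                      # top words per topic

    Aspect-based sentiment systems of the kind reviewed typically pair such topics with a polarity model for the words within each topic.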

  16. Explicit model predictive control accuracy analysis

    OpenAIRE

    Knyazev, Andrew; Zhu, Peizhen; Di Cairano, Stefano

    2015-01-01

    Model Predictive Control (MPC) can efficiently control constrained systems in real-time applications. The MPC feedback law for a linear system with linear inequality constraints can be explicitly computed off-line, which results in an off-line partition of the state space into non-overlapping convex regions, with affine control laws associated with each region of the partition. An actual implementation of this explicit MPC in low-cost micro-controllers requires the data to be "quantized", i.e. repre...

  17. Modelling Analysis of Sewage Sludge Amended Soil

    DEFF Research Database (Denmark)

    Sørensen, P. B.; Carlsen, L.; Vikelsøe, J.;

    The topic is risk assessment of sludge supply to agricultural soil in relation to xenobiotics. A large variety of xenobiotics arrive at the wastewater treatment plant in the wastewater. Many of these components are hydrophobic and thus will accumulate in the sludge solids and are removed from ... for the Evaluation of Substances (EUSES). It is shown how the fraction of substance mass which is leached from the top soil is a simple function of the ratio between the degradation half-life and the adsorption coefficient. This model can be used in probabilistic risk assessment of agricultural soils...

  18. Modeling and Analysis of Clandestine Networks

    Science.gov (United States)

    2005-12-15

    represents the combined flow of "information" from node i to node j through all possible paths joining i and j. If the ... definition can be modified to model influence by defining Qij as the influence of node i, in clique m, over a node j in clique n. The node-clique ... defining Qij as the influence of node i, not in clique c, over a node j in c. The node-clique formulation then becomes ...

  19. Compartmentalization analysis using discrete fracture network models

    Energy Technology Data Exchange (ETDEWEB)

    La Pointe, P.R.; Eiben, T.; Dershowitz, W. [Golder Associates, Redmond, WA (United States); Wadleigh, E. [Marathon Oil Co., Midland, TX (United States)

    1997-08-01

    This paper illustrates how Discrete Fracture Network (DFN) technology can serve as a basis for the calculation of reservoir engineering parameters for the development of fractured reservoirs. It describes the development of quantitative techniques for defining the geometry and volume of structurally controlled compartments. These techniques are based on a combination of stochastic geometry, computational geometry, and graph theory. The parameters addressed are compartment size, matrix block size and tributary drainage volume. The concept of DFN models is explained and methodologies to compute these parameters are demonstrated.

  20. Model Construction and Analysis of Respiration in Halobacterium salinarum.

    Directory of Open Access Journals (Sweden)

    Cherryl O Talaue

    Full Text Available The archaeon Halobacterium salinarum can produce energy using three different processes, namely photosynthesis, oxidative phosphorylation and fermentation of arginine, and is thus a model organism in bioenergetics. Compared to its bacteriorhodopsin-driven photosynthesis, less attention has been devoted to modeling its respiratory pathway. We created a system of ordinary differential equations that models its oxidative phosphorylation. The model consists of the electron transport chain, the ATP synthase, the potassium uniport and the sodium-proton antiport. By fitting the model parameters to experimental data, we show that the model can explain data on proton motive force generation, ATP production, and the charge balancing of ions between the sodium-proton antiporter and the potassium uniport. We performed sensitivity analysis of the model parameters to determine how the model will respond to perturbations in parameter values. The model and the parameters we derived provide a resource that can be used for analytical studies of the bioenergetics of H. salinarum.
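
    An illustrative two-variable ODE sketch in Python in the style of the model described (a proton motive force driving ATP synthesis); the equations and rate constants are invented for illustration and are far simpler than the published model:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Hypothetical rate constants (not the fitted H. salinarum parameters).
        r_etc, k_syn, k_leak, k_use = 5.0, 0.8, 0.1, 0.4

        def rhs(t, y):
            pmf, atp = y
            d_pmf = r_etc - k_syn * pmf - k_leak * pmf   # electron transport vs. consumption
            d_atp = k_syn * pmf - k_use * atp            # ATP synthase vs. cellular use
            return [d_pmf, d_atp]

        sol = solve_ivp(rhs, (0.0, 50.0), y0=[0.0, 0.0], dense_output=True)
        print(sol.y[:, -1])   # approaches the steady state r_etc/(k_syn + k_leak), etc.

    Parameter fitting and sensitivity analysis of the kind reported would then vary r_etc, k_syn, etc., and compare the trajectories against experimental data.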

  1. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Noonan, Nicholas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based Systems Engineering (MBSE).

  2. Equivalent circuit models for ac impedance data analysis

    Science.gov (United States)

    Danford, M. D.

    1990-01-01

    A least-squares fitting routine has been developed for the analysis of ac impedance data. It has been determined that the checking of the derived equations for a particular circuit with a commercially available electronics circuit program is essential. As a result of the investigation described, three equivalent circuit models were selected for use in the analysis of ac impedance data.

  3. Sparse Principal Component Analysis in Medical Shape Modeling

    DEFF Research Database (Denmark)

    Sjöstrand, Karl; Stegmann, Mikkel Bille; Larsen, Rasmus

    2006-01-01

    Principal component analysis (PCA) is a widely used tool in medical image analysis for data reduction, model building, and data understanding and exploration. While PCA is a holistic approach where each new variable is a linear combination of all original variables, sparse PCA (SPCA) aims...

  4. Sensitivity analysis of fine sediment models using heterogeneous data

    Science.gov (United States)

    Kamel, A. M. Yousif; Bhattacharya, B.; El Serafy, G. Y.; van Kessel, T.; Solomatine, D. P.

    2012-04-01

    Sediments play an important role in many aquatic systems. Their transportation and deposition have significant implications for morphology, navigability and water quality. Understanding the dynamics of sediment transportation in time and space is therefore important for drawing up interventions and making management decisions. This research is related to the fine sediment dynamics in the Dutch coastal zone, which is subject to human interference through constructions, fishing, navigation, sand mining, etc. These activities affect the natural flow of sediments and sometimes lead to environmental concerns or affect the siltation rates in harbours and fairways. Numerical models are widely used in studying fine sediment processes. The accuracy of numerical models depends upon the estimation of model parameters through calibration. Studying the model uncertainty related to these parameters is important in improving the spatio-temporal prediction of suspended particulate matter (SPM) concentrations, and in determining the limits of their accuracy. This research deals with the analysis of a 3D numerical model of the North Sea covering the Dutch coast using the Delft3D modelling tool (developed at Deltares, The Netherlands). The methodology in this research was divided into three main phases. The first phase focused on analysing the performance of the numerical model in simulating SPM concentrations near the Dutch coast by comparing the model predictions with SPM concentrations estimated from NASA's MODIS sensors at different time scales. The second phase focused on carrying out a sensitivity analysis of model parameters. Four model parameters were identified for the uncertainty and sensitivity analysis: the sedimentation velocity, the critical shear stress above which re-suspension occurs, the Shields shear stress for re-suspension pick-up, and the re-suspension pick-up factor. By adopting different values of these parameters the numerical model was run and a comparison between the

  5. Rovibrational CO analysis in PDR models

    Science.gov (United States)

    Stancil, Phillip C.; Cumbee, Renata; Zhang, Ziwei; Walker, Kyle M.; Yang, Benhui; Ferland, Gary J.

    2016-01-01

    CO is one of the most important molecules in the interstellar medium and in photodissociation regions (PDRs). Most of the extragalactic non-stellar IR to submm CO emission originates in PDRs (Hollenbach & Tielens 1999). Pure rotational CO lines have previously been used in PDR models to provide density, temperature, and other diagnostics. However, for environments exposed to intense UV radiation, CO vibrational levels become significantly populated. Given new calculations of rovibrational collisional rate coefficients for CO-H (Walker et al. 2015, Song et al. 2015) and CO-H2 (Yang et al. 2015), we explore their effects in standard Cloudy PDR (Ferland et al. 2013) and Radex (van der Tak et al. 2007) models. In particular, CO vibrational transitions due to H2 collisions are studied for the first time using reliable full-dimensional CO-H2 collisional data. Ferland, G. J., et al. 2013, Rev. Mex. Astron. y Astrof., 49, 137; Hollenbach, D. J. & Tielens, A. G. G. M. 1999, RMP, 71, 173; Song, L., et al. 2015, ApJ, in press; van der Tak, F. F. S., et al. 2007, A&A, 468, 627; Walker, K. M., et al. 2015, ApJ, 811, 27; Yang, B., et al. 2015, Nature Comm., 6, 6629. This work was supported in part by NASA grants NNX12AF42G and NNX15AI61G.

  6. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to developing high-quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model, so that we can confirm the validity of input/output data for each page and of page transitions on the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve traceability to the final product from the validated requirements analysis model. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  7. Structural model analysis of multiple quantitative traits.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    2006-07-01

    Full Text Available We introduce a method for the analysis of multilocus, multitrait genetic data that provides an intuitive and precise characterization of genetic architecture. We show that it is possible to infer the magnitude and direction of causal relationships among multiple correlated phenotypes and illustrate the technique using body composition and bone density data from mouse intercross populations. Using these techniques we are able to distinguish genetic loci that affect adiposity from those that affect overall body size and thus reveal a shortcoming of standardized measures such as body mass index that are widely used in obesity research. The identification of causal networks sheds light on the nature of genetic heterogeneity and pleiotropy in complex genetic systems.

  8. JSim, an open-source modeling system for data analysis.

    Science.gov (United States)

    Butterworth, Erik; Jardine, Bartholomew E; Raymond, Gary M; Neal, Maxwell L; Bassingthwaighte, James B

    2013-01-01

    JSim is a simulation system for developing models, designing experiments, and evaluating hypotheses on physiological and pharmacological systems through the testing of model solutions against data. It is designed for interactive, iterative manipulation of the model code, handling of multiple data sets and parameter sets, and for making comparisons among different models running simultaneously or separately. Interactive use is supported by a large collection of graphical user interfaces for model writing and compilation diagnostics, defining input functions, model runs, selection of algorithms solving ordinary and partial differential equations, run-time multidimensional graphics, parameter optimization (8 methods), sensitivity analysis, and Monte Carlo simulation for defining confidence ranges. JSim uses the Mathematical Modeling Language (MML), a declarative syntax specifying algebraic and differential equations. Imperative constructs written in other languages (MATLAB, FORTRAN, C++, etc.) are accessed through procedure calls. MML syntax is simple, basically defining the parameters and variables, then writing the equations in a straightforward, easily read and understood mathematical form. This makes JSim good for teaching modeling as well as for model analysis in research. For high-throughput applications, JSim can be run as a batch job. JSim can automatically translate models from the repositories for Systems Biology Markup Language (SBML) and CellML models. Stochastic modeling is supported. MML supports assigning physical units to constants and variables and automates checking dimensional balance as the first step in verification testing. Automatic unit scaling follows, e.g. seconds to minutes, if needed. The JSim Project File sets a standard for reproducible modeling analysis: it includes in one file everything needed for analyzing a set of experiments: the data, the models, the data fitting, and evaluation of parameter confidence ranges. JSim is open source; it

  9. Mathematical Modelling Research in Turkey: A Content Analysis Study

    Science.gov (United States)

    Çelik, H. Coskun

    2017-01-01

    The aim of the present study was to examine the mathematical modelling studies done between 2004 and 2015 in Turkey and to reveal their tendencies. Forty-nine studies were selected using purposeful sampling based on the term, "mathematical modelling" with Higher Education Academic Search Engine. They were analyzed with content analysis.…

  10. A Noncentral "t" Regression Model for Meta-Analysis

    Science.gov (United States)

    Camilli, Gregory; de la Torre, Jimmy; Chiu, Chia-Yi

    2010-01-01

    In this article, three multilevel models for meta-analysis are examined. Hedges and Olkin suggested that effect sizes follow a noncentral "t" distribution and proposed several approximate methods. Raudenbush and Bryk further refined this model; however, this procedure is based on a normal approximation. In the current research literature, this…

  11. Illustration of a Multilevel Model for Meta-Analysis

    Science.gov (United States)

    de la Torre, Jimmy; Camilli, Gregory; Vargas, Sadako; Vernon, R. Fox

    2007-01-01

    In this article, the authors present a multilevel (or hierarchical linear) model that illustrates issues in the application of the model to data from meta-analytic studies. In doing so, several issues are discussed that typically arise in the course of a meta-analysis. These include the presence of non-zero between-study variability, how multiple…

  12. Detecting tipping points in ecological models with sensitivity analysis

    NARCIS (Netherlands)

    Broeke, G.A. ten; Voorn, van G.A.K.; Kooi, B.W.; Molenaar, J.

    2016-01-01

    Simulation models are commonly used to understand and predict the development of ecological systems, for instance to study the occurrence of tipping points and their possible ecological effects. Sensitivity analysis is a key tool in the study of model responses to changes in conditions. The applicabi

  13. Detecting Tipping points in Ecological Models with Sensitivity Analysis

    NARCIS (Netherlands)

    Broeke, ten G.A.; Voorn, van G.A.K.; Kooi, B.W.; Molenaar, Jaap

    2016-01-01

    Simulation models are commonly used to understand and predict the development of ecological systems, for instance to study the occurrence of tipping points and their possible ecological effects. Sensitivity analysis is a key tool in the study of model responses to changes in conditions. The appli

  14. Modeling, analysis and control of a variable geometry actuator

    NARCIS (Netherlands)

    Evers, W.J.; Knaap, A. van der; Besselink, I.J.M.; Nijmeijer, H.

    2008-01-01

    A new design of variable geometry force actuator is presented in this paper. Based upon this design, a model is derived which is used for steady-state analysis, as well as controller design in the presence of friction. The controlled actuator model is finally used to evaluate the power consumption u

  15. MMA, A Computer Code for Multi-Model Analysis

    Science.gov (United States)

    Poeter, Eileen P.; Hill, Mary C.

    2007-01-01

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. Many applications of MMA will
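
    The default criteria have closed forms; a sketch in Python of the computations and of the derived model weights (the log-likelihoods and parameter counts below are invented, not from a real MMA run):

        import numpy as np

        def aic(lnL, k):      return 2 * k - 2 * lnL
        def aicc(lnL, k, n):  return aic(lnL, k) + 2 * k * (k + 1) / (n - k - 1)
        def bic(lnL, k, n):   return k * np.log(n) - 2 * lnL

        # Hypothetical calibrated models: (max log-likelihood, number of parameters).
        models, n_obs = {"M1": (-120.3, 4), "M2": (-118.9, 6), "M3": (-118.7, 9)}, 50
        ic = np.array([aicc(l, k, n_obs) for l, k in models.values()])
        delta = ic - ic.min()
        w = np.exp(-0.5 * delta); w /= w.sum()   # model weights / posterior probabilities
        print(dict(zip(models, np.round(w, 3))))

    The resulting weights are what MMA-style model averaging uses to combine parameter estimates and predictions across the alternative models.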

  16. Logical Modeling and Dynamical Analysis of Cellular Networks.

    Science.gov (United States)

    Abou-Jaoudé, Wassim; Traynard, Pauline; Monteiro, Pedro T; Saez-Rodriguez, Julio; Helikar, Tomáš; Thieffry, Denis; Chaouiya, Claudine

    2016-01-01

    The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of mammalian cell cycle.
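
    A toy sketch in Python of the attractor computation the review discusses, via exhaustive synchronous simulation of a three-node Boolean network (the update rules are invented, not taken from the reviewed models):

        from itertools import product

        def step(state):
            """Synchronous update of a toy 3-node logical model (A, B, C)."""
            a, b, c = state
            return (b and not c,   # A is activated by B and inhibited by C
                    a,             # B copies A
                    a or c)        # C self-sustains once A has fired

        attractors = set()
        for s0 in product([False, True], repeat=3):
            seen, s = [], s0
            while s not in seen:
                seen.append(s)
                s = step(s)
            cycle = tuple(seen[seen.index(s):])   # the reachable attractor (point or cycle)
            attractors.add(frozenset(cycle))
        print(attractors)

    The dedicated tools surveyed in the review replace this brute-force enumeration with symbolic or SAT-based methods so that large networks remain tractable.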

  17. Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm

    Directory of Open Access Journals (Sweden)

    Raj Kumar

    2012-12-01

    Full Text Available In this paper, we illustrate the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood-based inferential procedures, both classical and Bayesian. The quasi-Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using MCMC methods). R functions are developed to study the statistical properties of the model, to provide model validation and comparison tools, and to analyze the output of MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
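
    As a quick classical counterpart to the Bayesian analysis described, the Gumbel parameters can be estimated by maximum likelihood in a few lines of Python (simulated data, not the authors' software reliability data set):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        data = stats.gumbel_r.rvs(loc=30.0, scale=8.0, size=200, random_state=rng)

        loc_hat, scale_hat = stats.gumbel_r.fit(data)   # maximum likelihood estimates
        print(loc_hat, scale_hat)
        # Model check: compare the fitted CDF with the empirical distribution.
        ks = stats.kstest(data, "gumbel_r", args=(loc_hat, scale_hat))
        print(ks.statistic, ks.pvalue)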

  18. Comparative Analysis of Uncertainties in Urban Surface Runoff Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Schaarup-Jensen, Kjeld

    2007-01-01

    In the present paper a comparison between three different surface runoff models, in the numerical urban drainage tool MOUSE, is conducted. Analysing parameter uncertainty, it is shown that the models are very sensitive with regard to the choice of hydrological parameters when combined overflow volumes are compared - especially when the models are uncalibrated. The occurrences of flooding and surcharge are highly dependent on both hydrological and hydrodynamic parameters. Thus, the conclusion of the paper is that if the use of model simulations is to be a reliable tool for drainage system analysis, further research in improved parameter assessment for surface runoff models is needed.

  19. Practical Soil-Shallow Foundation Model for Nonlinear Structural Analysis

    Directory of Open Access Journals (Sweden)

    Moussa Leblouba

    2016-01-01

    Full Text Available Soil-shallow foundation interaction models that are incorporated into most structural analysis programs generally lack accuracy and efficiency or neglect some aspects of foundation behavior. For instance, soil-shallow foundation systems have been observed to show both small and large loops under increasing amplitude load reversals. This paper presents a practical macroelement model for soil-shallow foundation system and its stability under simultaneous horizontal and vertical loads. The model comprises three spring elements: nonlinear horizontal, nonlinear rotational, and linear vertical springs. The proposed macroelement model was verified using experimental test results from large-scale model foundations subjected to small and large cyclic loading cases.

  20. Toward a validation process for model based safety analysis

    OpenAIRE

    Adeline, Romain; Cardoso, Janette; Darfeuil, Pierre; Humbert, Sophie; Seguin, Christel

    2010-01-01

    Today, Model Based processes are becoming more and more widespread for the analysis of a system. However, there is no formal testing approach to ensure that the formal model is compliant with the real system. In this paper, we choose to study AltaRica models. We present a general process to properly construct and validate an AltaRica formal model. The focus is on this validation phase, i.e. verifying the compliance between the model and the real system. To this end, the proposed process recommends...

  1. Materials Analysis and Modeling of Underfill Materials.

    Energy Technology Data Exchange (ETDEWEB)

    Wyatt, Nicholas B [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chambers, Robert S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The thermal-mechanical properties of three potential underfill candidate materials for PBGA applications are characterized and reported. Two of the materials are formulations developed at Sandia for underfill applications, while the third is a commercial product that utilizes a snap-cure chemistry to drastically reduce cure time. Viscoelastic models were calibrated and fit using the property data collected for one of the Sandia-formulated materials. Along with the thermal-mechanical analyses performed, a series of simple bi-material strip tests were conducted to comparatively analyze the relative effects of cure and thermal shrinkage among the materials under consideration. Finally, current knowledge gaps as well as questions arising from the present study are identified and a path forward presented.

  2. Analysis of Empirical Software Effort Estimation Models

    CERN Document Server

    Basha, Saleem

    2010-01-01

    Reliable effort estimation remains an ongoing challenge for software engineers. Accurate effort estimation is the state of the art of software engineering; effort estimation is the preliminary phase between the client and the business enterprise. The relationship between the client and the business enterprise begins with the estimation of the software. The credibility of the client to the business enterprise increases with accurate estimation. Effort estimation often requires generalizing from a small number of historical projects. Generalization from such limited experience is an inherently under-constrained problem. Accurate estimation is a complex process because it can be visualized as software effort prediction, and as the term indicates, a prediction never becomes an actual. This work follows the basics of empirical software effort estimation models. The goal of this paper is to study empirical software effort estimation. The primary conclusion is that no single technique is best for all sit...

  3. Analysis and modeling of rail maintenance costs

    Directory of Open Access Journals (Sweden)

    Amir Ali Bakhshi

    2012-01-01

    Full Text Available Railroad maintenance engineering plays an important role in the availability of roads and in reducing the cost of railroad incidents. Rail is one of the most important parts of the railroad industry; it needs regular maintenance and accounts for a significant part of total maintenance cost. Any attempt at optimizing the total cost of maintenance could substantially reduce the cost of the railroad system and of the industry as a whole. The paper presents a new method to estimate the cost of rail failure using different cost components, such as the cost of inspection and the cost of risk associated with possible accidents. The proposed model is applied to a real-world case study of railroad transportation in the Tehran region and the results are analyzed.

  4. A Quotient Space Approximation Model of Multiresolution Signal Analysis

    Institute of Scientific and Technical Information of China (English)

    Ling Zhang; Bo Zhang

    2005-01-01

    In this paper, we present a quotient space approximation model of multiresolution signal analysis and discuss the properties and characteristics of the model. Then the comparison between wavelet transform and the quotient space approximation is made. First, when wavelet transform is viewed from the new quotient space approximation perspective, it may help us to gain an insight into the essence of multiresolution signal analysis. Second, from the similarity between wavelet and quotient space approximations, it is possible to transfer the rich wavelet techniques into the latter so that a new way for multiresolution analysis may be found.

  5. MMA, A Computer Code for Multi-Model Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations.

  6. Distributional Analysis for Model Predictive Deferrable Load Control

    OpenAIRE

    Chen, Niangjun; Gan, Lingwen; Low, Steven H.; Wierman, Adam

    2014-01-01

    Deferrable load control is essential for handling the uncertainties associated with the increasing penetration of renewable generation. Model predictive control has emerged as an effective approach for deferrable load control, and has received considerable attention. In particular, previous work has analyzed the average-case performance of model predictive deferrable load control. However, to this point, distributional analysis of model predictive deferrable load control has been elusive. In ...

  7. Stochastic analysis of Bazykin-Berezovskaya population model

    Science.gov (United States)

    Bashkirtseva, Irina; Filippova, Darja; Pisarchik, Alexander N.

    2016-12-01

    The predator-prey model with strong Allee effect is considered. This model demonstrates both local and global bifurcations, and coexistence of both species in the form of stable equilibria or limit cycles. The influence of random parametric noise on the dynamics of this model is studied. It is shown that increasing noise can transform the system dynamics from coexistence to the extinction of both species. A parametric analysis of this phenomenon is carried out.

  8. Evolution analysis of the states of the EZ model

    Institute of Scientific and Technical Information of China (English)

    Chen Qing-Hua; Ding Yi-Ming; Dong Hong-Guang

    2009-01-01

    Based on a suitable choice of states, this paper studies the stability of the equilibrium state of the EZ model by regarding the evolution of the EZ model as a Markov chain and by showing that the Markov chain is ergodic. The Markov analysis is applied to the EZ model with a small number of agents; the exact equilibrium state for N=5 and numerical results for N=18 are obtained.

  9. Practical Soil-Shallow Foundation Model for Nonlinear Structural Analysis

    OpenAIRE

    Moussa Leblouba; Salah Al Toubat; Muhammad Ekhlasur Rahman; Omer Mugheida

    2016-01-01

    Soil-shallow foundation interaction models that are incorporated into most structural analysis programs generally lack accuracy and efficiency or neglect some aspects of foundation behavior. For instance, soil-shallow foundation systems have been observed to show both small and large loops under increasing amplitude load reversals. This paper presents a practical macroelement model for soil-shallow foundation system and its stability under simultaneous horizontal and vertical loads. The model...

  10. ANALYSIS OF THE MECHANISM MODELS OF TECHNOLOGICAL INNOVATION DIFFUSION

    Institute of Scientific and Technical Information of China (English)

    XU Jiuping; HU Minan

    2004-01-01

    This paper analyzes the mechanism and principles of technological innovation diffusion on the basis of quantitative analysis. It then sets up a diffusion model of innovation incorporating price, advertising and distribution, a diffusion model of innovation including various kinds of consumers, and a substitution model between the new technology and the old one, applying system dynamics, optimization methods, probabilistic methods and computer simulation. Finally the paper concludes with some practical observations from a case study.

  11. A Grammar Analysis Model for the Unified Multimedia Query Language

    Institute of Scientific and Technical Information of China (English)

    Zhong-Sheng Cao; Zong-Da Wu; Yuan-Zhen Wang

    2008-01-01

    The unified multimedia query language (UMQL) is a powerful general-purpose multimedia query language, and it is very suitable for multimedia information retrieval. The paper proposes a grammar analysis model to implement an effective grammatical processing for the language. It separates the grammar analysis of a UMQL query specification into two phases, syntactic analysis and semantic analysis, and then respectively uses extended Backus-Naur form (EBNF) and logical algebra to specify the restrictive grammar rules. As a result, the model can present error guiding information for a query specification with incorrect grammar. The model not only suits well the processing of UMQL queries, but also has guiding significance for other projects concerning query processing of descriptive query languages.

  12. How many separable sources? Model selection in independent components analysis.

    Science.gov (United States)

    Woods, Roger P; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian.
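
    The cross-validation alternative mentioned above can be illustrated with scikit-learn, selecting the number of retained components by cross-validated likelihood under probabilistic PCA; this mirrors, but is not, the mixed ICA/PCA algorithm itself, and the data are synthetic.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # Synthetic data: 3 informative directions plus isotropic noise
    latent = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 10))
    X = latent + 0.5 * rng.normal(size=(500, 10))

    # PCA.score gives the average log-likelihood under probabilistic PCA,
    # so cross_val_score performs likelihood-based model selection.
    scores = {k: cross_val_score(PCA(n_components=k), X, cv=5).mean()
              for k in range(1, 8)}
    best = max(scores, key=scores.get)
    print(best, scores[best])  # typically recovers the 3 informative components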

  13. IMAGE ANALYSIS FOR MODELLING SHEAR BEHAVIOUR

    Directory of Open Access Journals (Sweden)

    Philippe Lopez

    2011-05-01

    Through laboratory research performed over the past ten years, many of the critical links between fracture characteristics and hydromechanical and mechanical behaviour have been made for individual fractures. One of the remaining challenges at the laboratory scale is to directly link fracture morphology to shear behaviour under changes in stress and shear direction. A series of laboratory experiments were performed on cement mortar replicas of a granite sample with a natural fracture perpendicular to the axis of the core. Results show that there is a strong relationship between the fracture's geometry and its mechanical behaviour under shear stress and the resulting damage. Image analysis, geostatistical, stereological and directional data techniques are applied in combination to the experimental data. The results highlight the role of the geometric characteristics of the fracture surfaces (surface roughness, and the size, shape, locations and orientations of the asperities to be damaged) in shear behaviour. A notable improvement in shear understanding is that shear behaviour is controlled by the apparent dip in the shear direction of the elementary facets forming the fracture.

  14. The multilevel p2 model: A random effects model for the analysis of multiple social networks

    NARCIS (Netherlands)

    Zijlstra, B.J.H.; van Duijn, M.A.J.; Snijders, T.A.B.

    2006-01-01

    The p2 model is a random effects model with covariates for the analysis of binary directed social network data coming from a single observation of a social network. Here, a multilevel variant of the p2 model is proposed for the case of multiple observations of social networks, for example, in a sample ...

  15. Modelling and Analysis of Smart Grid: A Stochastic Model Checking Case Study

    DEFF Research Database (Denmark)

    Yuksel, Ender; Zhu, Huibiao; Nielson, Hanne Riis

    2012-01-01

    Cyber-physical systems integrate information and communication technology functions into the physical elements of a system for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues ... consumption. We employ a stochastic model checking approach and present our modelling and analysis study using the PRISM model checker.

  16. Design, analysis, and modeling of giant magnetostrictive transducers

    Science.gov (United States)

    Calkins, Frederick Theodore

    The increased use of giant magnetostrictive, Terfenol-D transducers in a wide variety of applications has led to a need for greater understanding of the material's performance. This dissertation attempts to add to the Terfenol-D transducer body of knowledge by providing an in-depth analysis and modeling of an experimental transducer. A description of the magnetostriction process related to Terfenol-D includes a discussion of material properties, production methods, and the effect of mechanical stress, magnetization, and temperature on the material performance. The understanding of the Terfenol-D material performance provides the basis for an analysis of the performance of a Terfenol-D transducer. Issues related to the design and utilization of the Terfenol-D material in the transducers are considered, including the magnetic circuit, application of mechanical prestress, and tuning of the mechanical resonance. Experimental results from two broadband, Tonpilz design transducers show the effects of operating conditions (prestress, magnetic bias, AC magnetization amplitude, and frequency) on performance. In an effort to understand and utilize the rich performance space described by the experimental results, a variety of models are considered. An overview of models applicable to Terfenol-D and Terfenol-D transducers is provided, including a discussion of modeling criteria. The Jiles-Atherton model of ferromagnetic hysteresis is employed to describe the quasi-static transducer performance. This model requires the estimation of only six physically-based parameters to accurately simulate performance. The model is shown to be robust with respect to model parameters over a range of mechanical prestresses, magnetic biases, and AC magnetic field amplitudes, allowing predictive capability within these ranges. An additional model, based on electroacoustics theory, explains trends in the frequency domain and facilitates an analysis of efficiency based on impedance and admittance ...

  17. Numerical daemons in hydrological modeling: Effects on uncertainty assessment, sensitivity analysis and model predictions

    Science.gov (United States)

    Kavetski, D.; Clark, M. P.; Fenicia, F.

    2011-12-01

    Hydrologists often face sources of uncertainty that dwarf those normally encountered in many engineering and scientific disciplines. Especially when representing large scale integrated systems, internal heterogeneities such as stream networks, preferential flowpaths, vegetation, etc., are necessarily represented with a considerable degree of lumping. The inputs to these models are themselves often the products of sparse observational networks. Given the simplifications inherent in environmental models, especially lumped conceptual models, does it really matter how they are implemented? At the same time, given the complexities usually found in the response surfaces of hydrological models, increasingly sophisticated analysis methodologies are being proposed for sensitivity analysis, parameter calibration and uncertainty assessment. Quite remarkably, rather than being caused by the model structure/equations themselves, in many cases model analysis complexities are consequences of seemingly trivial aspects of the model implementation - often, literally, whether the start-of-step or end-of-step fluxes are used! The extent of problems can be staggering, including (i) degraded performance of parameter optimization and uncertainty analysis algorithms, (ii) erroneous and/or misleading conclusions of sensitivity analysis, parameter inference and model interpretations and, finally, (iii) poor reliability of a calibrated model in predictive applications. While the often nontrivial behavior of numerical approximations has long been recognized in applied mathematics and in physically-oriented fields of environmental sciences, it remains a problematic issue in many environmental modeling applications. Perhaps detailed attention to numerics is only warranted for complicated engineering models? Would not numerical errors be an insignificant component of total uncertainty when typical data and model approximations are present? Is this really a serious issue beyond some rare isolated ...

  18. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...

  19. Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits

    DEFF Research Database (Denmark)

    Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo

    2013-01-01

    A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in a context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics, although the discussion presented concentrates on longevity studies. The framework presented allows one to combine models based on continuous time with models based on discrete time in a joint analysis. The continuous time models are approximations of the frailty model in which the hazard function will be assumed to be piece-wise constant. The discrete time models used are multivariate variants of the discrete relative risk models. These models allow for regular parametric likelihood-based inference by exploring a coincidence of their likelihood functions and the likelihood functions of suitably defined multivariate generalized linear mixed models.

  20. Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits

    DEFF Research Database (Denmark)

    Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo

    2014-01-01

    A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in a context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics, although the discussion presented concentrates on longevity studies. The framework presented allows one to combine models based on continuous time with models based on discrete time in a joint analysis. The continuous time models are approximations of the frailty model in which the hazard function will be assumed to be piece-wise constant. The discrete time models used are multivariate variants of the discrete relative risk models. These models allow for regular parametric likelihood-based inference by exploring a coincidence of their likelihood functions and the likelihood functions of suitably defined multivariate generalized linear mixed models.

  1. A Grammar Analysis Model for the Unified Multimedia Query Language

    Institute of Scientific and Technical Information of China (English)

    Zhong-Sheng Cao; Zong-Da Wu; Yuan-Zhen Wang

    2008-01-01

    The unified multimedia query language (UMQL) is a powerful general-purpose multimedia query language, and it is very suitable for multimedia information retrieval. The paper proposes a grammar analysis model to implement an effective grammatical processing for the language. It separates the grammar analysis of a UMQL query specification into two phases, syntactic analysis and semantic analysis, and then respectively uses extended Backus-Naur form (EBNF) and logical algebra to specify the restrictive grammar rules. As a result, the model can present error guiding information for a query specification with incorrect grammar. The model not only suits well the processing of UMQL queries, but also has guiding significance for other projects concerning query processing of descriptive query languages.

  2. Comparative Analysis of Two Models of the Strouma River Ecosystem

    Directory of Open Access Journals (Sweden)

    Mitko Petrov

    2008-04-01

    A modified method of regression analysis for modelling the water quality of river ecosystems is offered. The method differs from conventional regression analysis in that the factors included in the regression dependence are functions of time. Two types of functions are tested, polynomial and periodic, and the investigations show that the periodic functions give better results. In addition, a model for the analysis of river quality has been developed as a modified method of time series analysis. The model has been applied to an assessment of the water pollution of the Strouma river. An assessment of the adequacy of the obtained models, based on the statistical criteria of correlation coefficient, Fisher function, and relative error, shows that the models are adequate and can be used for modelling the water pollution of the Strouma river on these indexes. The analysis of the river pollution shows that there was no material increase in the anthropogenic impact on the Strouma river in its Bulgarian part for the period from 2001 to 2004.

  3. Joint regression analysis and AMMI model applied to oat improvement

    Science.gov (United States)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board for a sample of 22 different genotypes grown in six locations during the years 2002, 2003 and 2004. In Ferreira et al. (2006) the authors state the relevance of the regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model to study and to estimate phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in the R software for the AMMI model analysis.

  4. Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods

    Science.gov (United States)

    Berry, J. K.; Tomlin, C. D.

    1982-01-01

    Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.

  5. Development of Wolsong Unit 2 Containment Analysis Model

    Energy Technology Data Exchange (ETDEWEB)

    Hoon, Choi [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of); Jin, Ko Bong; Chan, Park Young [Hanbat National Univ., Daejeon (Korea, Republic of)

    2014-05-15

    To be prepared for the full-scope safety analysis of Wolsong unit 2 with modified fuel, input decks for various objectives, which can be read by GOTHIC 7.2b(QA), were developed and tested in steady-state simulations. A detailed nodalization of 39 control volumes and 92 flow paths was constructed to determine the differential pressure across internal walls and the hydrogen concentration and distribution inside containment. A lumped model with 15 control volumes and 74 flow paths has also been developed to reduce the computer run time for assessments in which the analysis results are not sensitive to the detailed thermal-hydraulic distribution inside containment, such as peak pressure, pressure-dependent signal, and radionuclide release. The input data files provide simplified representations of the geometric layout of the containment building (volumes, dimensions, flow paths, doors, panels, etc.) and the performance characteristics of the various containment subsystems. The parameter values are based on best-estimate or design values for each parameter. The analysis values are determined by conservatism depending on the analysis objective and may differ between analysis objectives. Basic input decks of Wolsong unit 2 were developed for the various analysis purposes with GOTHIC 7.2b(QA). Depending on the analysis objective, two types of models are prepared: the detailed model represents each confined room in the containment as a separate node, with all geometric data based on the drawings of Wolsong unit 2, while the lumped model reduces run time. The developed containment models simulate the steady state well for the designated initial conditions. These base models will be used for Wolsong unit 2 in case a full-scope safety analysis is needed.

  6. Modelling and analysis of turbulent datasets using ARMA processes

    CERN Document Server

    Faranda, Davide; Dubrulle, Bérèngere; Daviaud, François; Saint-Michel, Brice; Herbert, Éric; Cortet, Pierre-Philippe

    2014-01-01

    We introduce a novel way to extract information from turbulent datasets by applying an ARMA statistical analysis. Such analysis goes well beyond the analysis of the mean flow and of the fluctuations and links the behavior of the recorded time series to a discrete version of a stochastic differential equation which is able to describe the correlation structure in the dataset. We introduce a new intermittency parameter $\Upsilon$ that measures the difference between the resulting analysis and the Obukhov model of turbulence, the simplest stochastic model reproducing both Richardson law and the Kolmogorov spectrum. We test the method on datasets measured in a von Kármán swirling flow experiment. We found that the ARMA analysis is well correlated with spatial structures of the flow, and can discriminate between two different flows with comparable mean velocities, obtained by changing the forcing. Moreover, we show that the intermittency parameter is highest in regions where shear layer vortices are present ...
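
    A minimal sketch of this kind of ARMA fit, using statsmodels on a synthetic ARMA(1,1) series rather than the von Kármán measurements; the fitted coefficients define the discrete stochastic model whose correlation structure approximates that of the recorded series.

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(1)
    # Simulate x_t = 0.7 x_{t-1} + e_t + 0.3 e_{t-1}
    e = rng.normal(size=2000)
    x = np.zeros(2000)
    for t in range(1, 2000):
        x[t] = 0.7 * x[t - 1] + e[t] + 0.3 * e[t - 1]

    res = ARIMA(x, order=(1, 0, 1)).fit()  # ARMA(p=1, q=1) with constant
    print(res.params)                      # ar.L1 ~ 0.7, ma.L1 ~ 0.3
    print(res.aic)                         # for comparing candidate (p, q) orders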

  7. FUZZY PRINCIPAL COMPONENT ANALYSIS AND ITS KERNEL BASED MODEL

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Principal Component Analysis (PCA) is one of the most important feature extraction methods, and Kernel Principal Component Analysis (KPCA) is a nonlinear extension of PCA based on kernel methods. In the real world, each input datum may not be fully assigned to one class and may partially belong to other classes. Based on the theory of fuzzy sets, this paper presents Fuzzy Principal Component Analysis (FPCA) and its nonlinear extension model, i.e., Kernel-based Fuzzy Principal Component Analysis (KFPCA). The experimental results indicate that the proposed algorithms have good performance.
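
    The linear-versus-kernel contrast is easy to demonstrate with scikit-learn's standard PCA and KernelPCA (the fuzzy membership weighting proposed in the paper is not shown here):

    from sklearn.datasets import make_circles
    from sklearn.decomposition import PCA, KernelPCA

    # Two concentric circles: not separable by any linear projection
    X, _ = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

    Z_lin = PCA(n_components=2).fit_transform(X)    # linear features
    Z_rbf = KernelPCA(n_components=2, kernel="rbf",
                      gamma=10.0).fit_transform(X)  # nonlinear features
    print(Z_lin.shape, Z_rbf.shape)  # the RBF map separates the two circles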

  8. Structural Simulations and Conservation Analysis - Historic Building Information Model (HBIM)

    Directory of Open Access Journals (Sweden)

    C. Dore

    2015-02-01

    In this paper the current findings to date of the Historic Building Information Model (HBIM) of the Four Courts in Dublin are presented. The Historic Building Information Model (HBIM) forms the basis for both structural and conservation analysis to measure the impact of war damage which still impacts on the building. The laser scan survey of the internal and external structure was carried out in the summer of 2014. After registration and processing of the laser scan survey, the HBIM was created of the damaged section of the building and is presented as two separate workflows in this paper. The first is the model created from historic data; the second is a procedural and segmented model developed from the laser scan survey of the war-damaged drum and dome. From both models, structural damage and decay simulations will be developed for documentation and conservation analysis.

  9. Modeling and performance analysis of QoS data

    Science.gov (United States)

    Strzeciwilk, Dariusz; Zuberek, Włodzimierz M.

    2016-09-01

    The article presents the results of modeling and analysis of data transmission performance in systems that support quality of service. Models are designed and tested taking into account a multiservice network architecture, i.e., one supporting the transmission of data belonging to different traffic classes. The traffic-shaping mechanisms studied are based on Priority Queuing, with an integrated data source as well as various generated data sources. The basic problems of QoS-supporting architectures and of queuing systems are discussed. Models based on Petri nets, supported by temporal logics, were designed and built, and simulation tools were used to verify the traffic-shaping mechanisms with the applied queuing algorithms. It is shown that temporal models of Petri nets can be effectively used in the modeling and analysis of the performance of computer networks.

  10. Modeling the situation awareness by the analysis of cognitive process.

    Science.gov (United States)

    Liu, Shuang; Wanyan, Xiaoru; Zhuang, Damin

    2014-01-01

    To predict changes of situation awareness (SA) for pilots operating with different display interfaces and tasks, a joint qualitative analysis and quantitative calculation SA model was proposed. Based on the previously built situation awareness model according to attention allocation, the pilot's cognitive process for the situation elements was analyzed according to the ACT-R (Adaptive Control of Thought-Rational) theory, which explains how SA is produced. To verify the validity of this model, 28 subjects performed an instrument supervision task under different experimental conditions. The Situation Awareness Global Assessment Technique (SAGAT), the 10-dimensional Situational Awareness Rating Technique (10-D SART), performance measures and eye movement measures were adopted for evaluating SA under the different conditions. Statistical analysis demonstrated that the changing trend of SA calculated by this model was highly correlated with the experimental results. Therefore the situation awareness model can provide a reference for designing new cockpit display interfaces and help reduce human errors.

  11. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory in more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, and computer and communication systems. • A chapter on ...
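
    As a small self-contained example in the spirit of the book's introductory models, the closed-form steady-state measures of the M/M/1 queue can be coded directly:

    def mm1_metrics(lam, mu):
        """Steady-state M/M/1 measures for arrival rate lam and service
        rate mu; stability requires lam < mu."""
        if lam >= mu:
            raise ValueError("unstable queue: need lam < mu")
        rho = lam / mu                       # server utilization
        return {"rho": rho,
                "L": rho / (1 - rho),        # mean number in system
                "W": 1 / (mu - lam),         # mean time in system
                "Lq": rho**2 / (1 - rho),    # mean queue length
                "Wq": rho / (mu - lam)}      # mean waiting time

    # rho = 0.8, L = 4, W = 1; Little's law holds: L = lam * W
    print(mm1_metrics(lam=4.0, mu=5.0))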

  12. Robust Linear Models for Cis-eQTL Analysis.

    Directory of Open Access Journals (Sweden)

    Mattias Rantalainen

    Expression Quantitative Trait Loci (eQTL) analysis enables characterisation of functional genetic variation influencing expression levels of individual genes. In outbred populations, including humans, eQTLs are commonly analysed using the conventional linear model, adjusting for relevant covariates, assuming an allelic dosage model and a Gaussian error term. However, gene expression data generally have noise that induces heavy-tailed errors relative to the Gaussian distribution and often include atypical observations, or outliers. Such departures from modelling assumptions can lead to an increased rate of type II errors (false negatives), and to some extent also type I errors (false positives). Careful model checking can reduce the risk of type I errors but often not type II errors, since it is generally too time-consuming to carefully check all models with a non-significant effect in large-scale and genome-wide studies. Here we propose the application of a robust linear model for eQTL analysis to reduce adverse effects of deviations from the assumption of Gaussian residuals. We present results from a simulation study as well as results from the analysis of real eQTL data sets. Our findings suggest that in many situations robust models have the potential to provide more reliable eQTL results compared to conventional linear models, particularly in respect to reducing type II errors due to non-Gaussian noise. Post-genomic data, such as that generated in genome-wide eQTL studies, are often noisy and frequently contain atypical observations. Robust statistical models have the potential to provide more reliable results and increased statistical power under non-Gaussian conditions. The results presented here suggest that robust models should be considered routinely alongside other commonly used methodologies for eQTL analysis.
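
    A sketch of the contrast drawn above: ordinary least squares versus a Huber M-estimator (statsmodels RLM) on synthetic dosage/expression data with heavy-tailed noise; this is not the authors' eQTL pipeline.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    dosage = rng.integers(0, 3, size=200).astype(float)  # allelic dosage 0/1/2
    noise = rng.standard_t(df=2, size=200)               # heavy-tailed errors
    expr = 1.0 + 0.5 * dosage + noise                    # true effect = 0.5

    X = sm.add_constant(dosage)
    ols = sm.OLS(expr, X).fit()
    rlm = sm.RLM(expr, X, M=sm.robust.norms.HuberT()).fit()
    print(ols.params[1], ols.bse[1])  # OLS slope and (inflated) standard error
    print(rlm.params[1], rlm.bse[1])  # robust slope, usually tighter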

  13. Statistical Models and Methods for Network Meta-Analysis.

    Science.gov (United States)

    Madden, L V; Piepho, H-P; Paul, P A

    2016-08-01

    Meta-analysis, the methodology for analyzing the results from multiple independent studies, has grown tremendously in popularity over the last four decades. Although most meta-analyses involve a single effect size (summary result, such as a treatment difference) from each study, there are often multiple treatments of interest across the network of studies in the analysis. Multi-treatment (or network) meta-analysis can be used for simultaneously analyzing the results from all the treatments. However, the methodology is considerably more complicated than for the analysis of a single effect size, and there have not been adequate explanations of the approach for agricultural investigations. We review the methods and models for conducting a network meta-analysis based on frequentist statistical principles, and demonstrate the procedures using a published multi-treatment plant pathology data set. A major advantage of network meta-analysis is that correlations of estimated treatment effects are automatically taken into account when an appropriate model is used. Moreover, treatment comparisons may be possible in a network meta-analysis that are not possible in a single study because all treatments of interest may not be included in any given study. We review several models that consider the study effect as either fixed or random, and show how to interpret model-fitting output. We further show how to model the effect of moderator variables (study-level characteristics) on treatment effects, and present one approach to test for the consistency of treatment effects across the network. Online supplemental files give explanations on fitting the network meta-analytical models using SAS.

  14. Precise methods for conducted EMI modeling, analysis, and prediction

    Institute of Scientific and Technical Information of China (English)

    MA WeiMing; ZHAO ZhiHua; MENG Jin; PAN QiJun; ZHANG Lei

    2008-01-01

    Focusing on the state-of-the-art in conducted EMI prediction, this paper presents a noise source lumped circuit modeling and identification method, an EMI modeling method based on multiple slope approximation of switching transitions, and a double Fourier integral method modeling PWM conversion units, to achieve an accurate modeling of the EMI noise source. Meanwhile, a new sensitivity analysis method, a general coupling model for steel ground loops, and a partial element equivalent circuit method are proposed to identify and characterize conducted EMI coupling paths. The EMI noise and propagation modeling provide an accurate prediction of conducted EMI in the entire frequency range (0-10 MHz) with good practicability and generality. Finally a new measurement approach is presented to identify the surface current of a large dimensional metal shell. The proposed analytical modeling methodology is verified by experimental results.

  15. Integrated simulation and data envelopment analysis models in emergency department

    Science.gov (United States)

    Aminuddin, Wan Malissa Wan Mohd; Ismail, Wan Rosmanira

    2016-11-01

    This study aims to determine the best resource allocation and to increase the efficiency of service of an emergency department in a public hospital in Kuala Lumpur. We integrate Discrete Event Simulation (DES) and three models of Data Envelopment Analysis (DEA): the input-oriented CCR model, the input-oriented BCC model and the super-efficiency model. Based on the comparison of results taken from the DEA models, the combination of DES, the input-oriented BCC model and the super-efficiency BCC model is seen to be the best resource allocation technique to be used for enhancing the hospital efficiency. The combination reduced patients' waiting time while improving the average utilization rate of hospital resources compared to the current situation.

  16. Precise methods for conducted EMI modeling, analysis, and prediction

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Focusing on the state-of-the-art in conducted EMI prediction, this paper presents a noise source lumped circuit modeling and identification method, an EMI modeling method based on multiple slope approximation of switching transitions, and a double Fourier integral method modeling PWM conversion units, to achieve an accurate modeling of the EMI noise source. Meanwhile, a new sensitivity analysis method, a general coupling model for steel ground loops, and a partial element equivalent circuit method are proposed to identify and characterize conducted EMI coupling paths. The EMI noise and propagation modeling provide an accurate prediction of conducted EMI in the entire frequency range (0-10 MHz) with good practicability and generality. Finally a new measurement approach is presented to identify the surface current of a large dimensional metal shell. The proposed analytical modeling methodology is verified by experimental results.

  17. Objective Bayesian Comparison of Constrained Analysis of Variance Models.

    Science.gov (United States)

    Consonni, Guido; Paroli, Roberta

    2016-10-04

    In the social sciences we are often interested in comparing models specified by parametric equality or inequality constraints. For instance, when examining three group means $\mu_1, \mu_2, \mu_3$ through an analysis of variance (ANOVA), a model may specify that $\mu_1 = \mu_2 = \mu_3$, while another one may state that $\mu_1 \le \mu_2 \le \mu_3$, and finally a third model may instead suggest that all means are unrestricted. This is a challenging problem, because it involves a combination of nonnested models, as well as nested models having the same dimension. We adopt an objective Bayesian approach, requiring no prior specification from the user, and derive the posterior probability of each model under consideration. Our method is based on the intrinsic prior methodology, suitably modified to accommodate equality and inequality constraints. Focussing on normal ANOVA models, a comparative assessment is carried out through simulation studies. We also present an application to real data collected in a psychological experiment.

  18. Parametric sensitivity analysis of a test cell thermal model using spectral analysis

    CERN Document Server

    Mara, Thierry Alex; Garde, François

    2012-01-01

    The paper deals with an empirical validation of a building thermal model. We put the emphasis on sensitivity analysis and on the search for inputs/residual correlation to improve our model. In this article, we apply a sensitivity analysis technique in the frequency domain to point out the most important parameters of the model. Then, we compare measured and predicted data of indoor dry-air temperature. When the model is not accurate enough, recourse to time-frequency analysis is of great help to identify the inputs responsible for the major part of the error. In our approach, two samples of experimental data are required: the first one is used to calibrate our model, the second one to validate the optimized model.

  19. Parameter estimation and error analysis in environmental modeling and computation

    Science.gov (United States)

    Kalmaz, E. E.

    1986-01-01

    A method for the estimation of parameters and error analysis in the development of nonlinear modeling for environmental impact assessment studies is presented. The modular computer program can interactively fit different nonlinear models to the same set of data, dynamically changing the error structure associated with observed values. Parameter estimation techniques and sequential estimation algorithms employed in parameter identification and model selection are first discussed. Then, least-square parameter estimation procedures are formulated, utilizing differential or integrated equations, and are used to define a model for association of error with experimentally observed data.

  20. Inverse Kinematic Analysis of Human Hand Thumb Model

    Science.gov (United States)

    Toth-Tascau, Mirela; Pater, Flavius; Stoia, Dan Ioan; Menyhardt, Karoly; Rosu, Serban; Rusu, Lucian; Vigaru, Cosmina

    2011-09-01

    This paper deals with a kinematic model of the thumb of the human hand. The proposed model has 3 degrees of freedom and is able to model the movements of the thumb tip with respect to the wrist joint centre. The kinematic equations are derived based on the Denavit-Hartenberg convention and solved in both the direct and the inverse way. Inverse kinematic analysis of the human hand thumb model reveals multiple and connected solutions, which are characteristic of nonlinear systems when the number of equations is greater than the number of unknowns, and which correspond to natural movements of the finger.
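
    A generic numerical sketch of such an analysis: forward kinematics assembled from Denavit-Hartenberg matrices for a 3-DOF chain, with an inverse solution obtained by least squares; the link parameters are placeholders, not the paper's anthropometric values.

    import numpy as np
    from scipy.optimize import least_squares

    def dh(theta, d, a, alpha):
        """Homogeneous transform for one Denavit-Hartenberg link."""
        ct, st = np.cos(theta), np.sin(theta)
        ca, sa = np.cos(alpha), np.sin(alpha)
        return np.array([[ct, -st * ca,  st * sa, a * ct],
                         [st,  ct * ca, -ct * sa, a * st],
                         [0.0,      sa,       ca,      d],
                         [0.0,     0.0,      0.0,    1.0]])

    # Placeholder (d, a, alpha) parameters for three revolute joints
    LINKS = [(0.0, 0.05, np.pi / 2), (0.0, 0.04, 0.0), (0.0, 0.03, 0.0)]

    def tip_position(q):
        T = np.eye(4)
        for theta, (d, a, alpha) in zip(q, LINKS):
            T = T @ dh(theta, d, a, alpha)
        return T[:3, 3]

    target = np.array([0.06, 0.03, 0.02])
    sol = least_squares(lambda q: tip_position(q) - target, x0=np.zeros(3))
    print(sol.x, tip_position(sol.x))  # one of possibly several IK solutions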

  1. ANALYSIS OF THE KQML MODEL IN MULTI-AGENT INTERACTION

    Institute of Scientific and Technical Information of China (English)

    刘海龙; 吴铁军

    2001-01-01

    Our analysis of the KQML (Knowledge Query and Manipulation Language) model yielded some conclusions on the knowledge level of communication in agent-oriented programming. First, the agent state and transition model were given for analyzing the necessary conditions for interaction with the synchronous and asynchronous KQML models respectively. Second, we analyzed the deadlock and starvation problems in KQML communication and gave the solution. Finally, the advantages and disadvantages of the synchronous and asynchronous KQML models were listed respectively, and the choosing principle was given.

  2. Computer Models for IRIS Control System Transient Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gary D. Storrick; Bojan Petrovic; Luca Oriani

    2007-01-31

    This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled “Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor” focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design – such as the lack of a detailed secondary system or I&C system designs – makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I&C development process

  3. Molecular structure based property modeling: Development/improvement of property models through a systematic property-data-model analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol Shivajirao; Sarup, Bent; Sin, Gürkan;

    2013-01-01

    The objective of this work is to develop a method for performing property-data-model analysis so that efficient use of knowledge of properties could be made in the development/improvement of property prediction models. The method includes: (i) analysis of property data and its consistency check; (ii) selection of the most appropriate form of the property model; (iii) selection of the data-set for performing parameter regression and uncertainty analysis; and (iv) analysis of model prediction errors to take necessary corrective steps to improve the accuracy and the reliability of property models. The method can be applied to a wide range of properties of pure compounds. In this work, however, the application of the method is illustrated for the property modeling of normal melting point, enthalpy of fusion, enthalpy of formation, and critical temperature. For all the properties listed above, it has been possible to achieve ...

  4. Development of trip coverage analysis methodology - CATHENA trip coverage analysis model

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H.; Huh, J. Y.; Na, Y. H.; Lee, S. Y.; Kim, B. G.; Kim, H. H.; Kim, S. W.; Bae, C. J.; Kim, T. M.; Kim, S. R.; Han, B. S.; Moon, B. J.; Oh, M. T. [Korea Power Engineering Co., Yongin (Korea)

    2001-05-01

    This report describes the CATHENA model for trip coverage analysis. The model is prepared based on the Wolsong 2 design data and consists of the primary heat transport system, shutdown system, steam and feedwater system, reactor regulating system, heat transport pressure and inventory control system, and steam generator level and pressure control system. The new features and the parts modified from the Wolsong 2 CATHENA LOCA model that are required for trip coverage analysis are described. The model is tested by simulation of the steady state at 100% FP and at several low powers. The cases of power rundown and power runup are also tested. 17 refs., 124 figs., 19 tabs. (Author)

  5. Stochastic modelling of landfill leachate and biogas production incorporating waste heterogeneity. Model formulation and uncertainty analysis.

    Science.gov (United States)

    Zacharof, A I; Butler, A P

    2004-01-01

    A mathematical model simulating the hydrological and biochemical processes occurring in landfilled waste is presented and demonstrated. The model combines biochemical and hydrological models into an integrated representation of the landfill environment. Waste decomposition is modelled using traditional biochemical waste decomposition pathways combined with a simplified methodology for representing the rate of decomposition. Water flow through the waste is represented using a statistical velocity model capable of representing the effects of waste heterogeneity on leachate flow through the waste. Given the limitations in data capture from landfill sites, significant emphasis is placed on improving parameter identification and reducing parameter requirements. A sensitivity analysis is performed, highlighting the model's response to changes in input variables. A model test run is also presented, demonstrating the model capabilities. A parameter perturbation model sensitivity analysis was also performed. This has been able to show that although the model is sensitive to certain key parameters, its overall intuitive response provides a good basis for making reasonable predictions of the future state of the landfill system. Finally, due to the high uncertainty associated with landfill data, a tool for handling input data uncertainty is incorporated in the model's structure. It is concluded that the model can be used as a reasonable tool for modelling landfill processes and that further work should be undertaken to assess the model's performance.

  6. Hybrid reliability model for fatigue reliability analysis of steel bridges

    Institute of Scientific and Technical Information of China (English)

    曹珊珊; 雷俊卿

    2016-01-01

    A kind of hybrid reliability model is presented to solve the fatigue reliability problems of steel bridges. The cumulative damage model is one kind of the models used in fatigue reliability analysis, and the parameter characteristics of the model can be described as probabilistic and interval. The two-stage hybrid reliability model is given with a theoretical foundation and a solving algorithm to solve the hybrid reliability problems. The theoretical foundation is established by the consistency relationships of the interval reliability model and the probability reliability model with normally distributed variables in theory. The solving process combines the definition of the interval reliability index with the probabilistic algorithm. With consideration of the parameter characteristics of the S-N curve, the cumulative damage model with hybrid variables is given based on the standards from different countries. Lastly, a case of the steel structure of the Neville Island Bridge is analyzed to verify the applicability of the hybrid reliability model in fatigue reliability analysis based on the AASHTO standard.
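
    The deterministic core of such a cumulative damage model can be sketched with a Basquin-type S-N curve combined with Miner's rule; the constants below are invented, not values from AASHTO or the bridge case study.

    import numpy as np

    C, m = 2.0e12, 3.0                       # hypothetical S-N constants

    def cycles_to_failure(stress_range):
        return C / stress_range**m           # N(S) from the S-N curve

    def miner_damage(stress_ranges, counts):
        """Cumulative damage D = sum(n_i / N_i); failure predicted at D >= 1."""
        S = np.asarray(stress_ranges, dtype=float)
        n = np.asarray(counts, dtype=float)
        return float(np.sum(n / cycles_to_failure(S)))

    # Annual stress-range spectrum (MPa) and cycle counts
    D_per_year = miner_damage([40, 60, 80], [2.0e5, 5.0e4, 1.0e4])
    print(D_per_year, 1.0 / D_per_year)      # damage/year, implied life (years)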

  7. Projection-Based Reduced Order Modeling for Spacecraft Thermal Analysis

    Science.gov (United States)

    Qian, Jing; Wang, Yi; Song, Hongjun; Pant, Kapil; Peabody, Hume; Ku, Jentung; Butler, Charles D.

    2015-01-01

    This paper presents a mathematically rigorous, subspace projection-based reduced order modeling (ROM) methodology and an integrated framework to automatically generate reduced order models for spacecraft thermal analysis. Two key steps in the reduced order modeling procedure are described: (1) the acquisition of a full-scale spacecraft model in the ordinary differential equation (ODE) and differential algebraic equation (DAE) form to resolve its dynamic thermal behavior; and (2) the ROM to markedly reduce the dimension of the full-scale model. Specifically, proper orthogonal decomposition (POD) in conjunction with discrete empirical interpolation method (DEIM) and trajectory piece-wise linear (TPWL) methods are developed to address the strong nonlinear thermal effects due to coupled conductive and radiative heat transfer in the spacecraft environment. Case studies using NASA-relevant satellite models are undertaken to verify the capability and to assess the computational performance of the ROM technique in terms of speed-up and error relative to the full-scale model. ROM exhibits excellent agreement in spatiotemporal thermal profiles (<0.5% relative error in pertinent time scales) along with salient computational acceleration (up to two orders of magnitude speed-up) over the full-scale analysis. These findings establish the feasibility of ROM to perform rational and computationally affordable thermal analysis, develop reliable thermal control strategies for spacecraft, and greatly reduce the development cycle times and costs.

  8. Thermal and mechanical analysis for the detailed model using submodel

    Energy Technology Data Exchange (ETDEWEB)

    Kuh, Jung Eui; Kang, Chul Hyung; Park, Jeong Hwa [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-11-01

    A very large model is required for the TM analysis of an HLRW repository, while a very small mesh size is needed to precisely simulate the main parts of the analysis, e.g., the canister, buffer, etc. This is practically impossible in a single model due to the required memory size and computing time. In this report, the submodel concept in ABAQUS is used to handle this difficulty. In the submodel approach, only the part of interest is modelled in detail, with the solution of the full-scale model applied as its boundary condition. Following this computation procedure, the temperature distribution in the buffer and canister could be computed precisely. This approach can be applied to the TM analysis of the buffer and canister, or of a finite-size repository. 12 refs., 28 figs., 9 tabs. (Author)

  9. Analysis of deterministic cyclic gene regulatory network models with delays

    CERN Document Server

    Ahsen, Mehmet Eren; Niculescu, Silviu-Iulian

    2015-01-01

    This brief examines a deterministic, ODE-based model for gene regulatory networks (GRN) that incorporates nonlinearities and time-delayed feedback. An introductory chapter provides some insights into molecular biology and GRNs. The mathematical tools necessary for studying the GRN model are then reviewed, in particular Hill functions and Schwarzian derivatives. One chapter is devoted to the analysis of GRNs under negative feedback with time delays and a special case of a homogenous GRN is considered. Asymptotic stability analysis of GRNs under positive feedback is then considered in a separate chapter, in which conditions leading to bi-stability are derived. Graduate and advanced undergraduate students and researchers in control engineering, applied mathematics, systems biology and synthetic biology will find this brief to be a clear and concise introduction to the modeling and analysis of GRNs.

  10. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprised of a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice of one of these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
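
    A schematic numpy sketch of this propagation scheme, in which each sub-model contributes an empirical residual distribution that is resampled and chained through to an output distribution; the model functions and residual spreads are invented placeholders.

    import numpy as np

    rng = np.random.default_rng(3)

    # Empirical residuals of each sub-model (would come from validation data)
    res_poa = rng.normal(0.0, 0.02, 1000)    # plane-of-array irradiance
    res_eff = rng.normal(0.0, 0.01, 1000)    # effective irradiance
    res_tmp = rng.normal(0.0, 1.5, 1000)     # cell temperature (deg C)

    def dc_power(g_eff, t_cell):
        """Toy performance model: output falls 0.4%/degC above 25 degC."""
        return 0.2 * g_eff * (1 - 0.004 * (t_cell - 25.0))

    ghi, n = 800.0, 10000
    poa = 1.1 * ghi * (1 + rng.choice(res_poa, n))   # step 1 plus residual
    eff = 0.95 * poa * (1 + rng.choice(res_eff, n))  # step 2 plus residual
    tc = 45.0 + rng.choice(res_tmp, n)               # step 3 plus residual
    p = dc_power(eff, tc)

    print(p.mean(), p.std() / p.mean())  # output mean and relative uncertainty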

  11. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    Science.gov (United States)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.

  12. Prediction Model of Data Envelopment Analysis with Undesirable Outputs

    Institute of Scientific and Technical Information of China (English)

    边馥萍; 范宇

    2004-01-01

    Data envelopment analysis (DEA) has become a standard non-parametric approach to productivity analysis, especially to the relative efficiency analysis of decision making units (DMUs). Extended to the prediction field, it can solve prediction problems with multiple inputs and outputs which cannot easily be solved by the regression analysis method. But the traditional DEA models cannot solve problems with undesirable outputs, so in this paper the inherent relationship between goal programming and the DEA method is explored, based on the relationship between multiple goal programming and goal programming, and a mixed DEA model is built which can make all factors of inputs and undesirable outputs decrease in different proportions, while at the same time all the factors of desirable outputs increase in different proportions.
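
    For orientation, the classical input-oriented CCR envelopment model (the baseline that the mixed model above modifies) reduces to one small linear program per DMU, sketched here with invented data:

    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[2.0, 3.0, 4.0, 5.0],   # inputs: rows = inputs, cols = DMUs
                  [3.0, 1.0, 2.0, 4.0]])
    Y = np.array([[1.0, 1.0, 2.0, 2.0]])  # outputs: rows = outputs

    def ccr_efficiency(j0):
        """min theta s.t. X lam <= theta * x_j0, Y lam >= y_j0, lam >= 0."""
        m, n = X.shape
        s = Y.shape[0]
        c = np.r_[1.0, np.zeros(n)]        # decision variables: [theta, lam]
        A_ub = np.vstack([np.c_[-X[:, j0], X],      # X lam - theta x_j0 <= 0
                          np.c_[np.zeros(s), -Y]])  # -Y lam <= -y_j0
        b_ub = np.r_[np.zeros(m), -Y[:, j0]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.fun                     # efficiency score theta

    print([round(ccr_efficiency(j), 3) for j in range(X.shape[1])])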

  13. Domain Endurants: An Analysis and Description Process Model

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2014-01-01

    We present a summary, Sect. 2, of a structure of domain analysis and description concepts: techniques and tools. And we link, in Sect. 3, these concepts, embodied in domain analysis prompts and domain description prompts, in a model of how a diligent domain analyser cum describer would use them. We claim that both sections, Sects. 2-3, contribute to a methodology of software engineering.

  14. Uncertainty analysis in dissolved oxygen modeling in streams.

    Science.gov (United States)

    Hamed, Maged M; El-Beshry, Manar Z

    2004-08-01

    Uncertainty analysis in surface water quality modeling is an important issue. This paper presents a method based on the first-order reliability method (FORM) to assess the exceedance probability of a target dissolved oxygen concentration in a stream, using a Streeter-Phelps prototype model. Basic uncertainty in the input parameters is considered by representing them as random variables with prescribed probability distributions. Results obtained from FORM analysis compared well with those of the Monte Carlo simulation method. The analysis also presents the stochastic sensitivity of the probabilistic outcome in the form of uncertainty importance factors, and shows how they change with changing simulation time. Furthermore, a parametric sensitivity analysis was conducted to show the effect of selection of different probability distribution functions for the three most important parameters on the design point, exceedance probability, and importance factors.
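
    The prototype model named above has a compact closed form; a minimal numerical sketch of the Streeter-Phelps dissolved-oxygen sag, with illustrative parameter values (not the paper's):

    import numpy as np

    def do_deficit(t, L0, D0, kd, kr):
        """D(t) = kd*L0/(kr-kd) * (exp(-kd*t) - exp(-kr*t)) + D0*exp(-kr*t),
        with deoxygenation rate kd and reaeration rate kr."""
        return (kd * L0 / (kr - kd)) * (np.exp(-kd * t) - np.exp(-kr * t)) \
               + D0 * np.exp(-kr * t)

    DO_sat, L0, D0, kd, kr = 9.0, 20.0, 1.0, 0.3, 0.6  # mg/L and 1/day
    t = np.linspace(0.0, 10.0, 201)                    # travel time (days)
    DO = DO_sat - do_deficit(t, L0, D0, kd, kr)

    print(t[np.argmin(DO)], DO.min())  # location and depth of the oxygen sag
    # A FORM-style question is then: P[min DO < target concentration]?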

  15. Parameter identification and global sensitivity analysis of Xinanjiang model using meta-modeling approach

    Directory of Open Access Journals (Sweden)

    Xiao-meng SONG

    2013-01-01

    Parameter identification, model calibration, and uncertainty quantification are important steps in the model-building process, and are necessary for obtaining credible results and valuable information. Sensitivity analysis of a hydrological model is a key step in model uncertainty quantification, which can identify the dominant parameters, reduce the model calibration uncertainty, and enhance the model optimization efficiency. There are, however, some shortcomings in classical approaches, including the long duration of time and high computation cost required to quantitatively assess the sensitivity of a multiple-parameter hydrological model. For this reason, a two-step statistical evaluation framework using global techniques is presented. It is based on (1) a screening method (Morris) for qualitative ranking of parameters, and (2) a variance-based method integrated with a meta-model for quantitative sensitivity analysis, i.e., the Sobol method integrated with the response surface model (RSMSobol). First, the Morris screening method was used to qualitatively identify the parameters' sensitivity, and then ten parameters were selected to quantify the sensitivity indices. Subsequently, the RSMSobol method was used to quantify the sensitivity, i.e., the first-order and total sensitivity indices based on the response surface model (RSM) were calculated. The RSMSobol method can not only quantify the sensitivity, but also reduce the computational cost, with good accuracy compared to the classical approaches. This approach will be effective and reliable in the global sensitivity analysis of a complex large-scale distributed hydrological model.
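
    As a brief self-contained illustration of the variance-based step, first-order Sobol indices can be estimated with the Monte Carlo "pick-freeze" scheme; the sketch below uses the standard Ishigami benchmark function rather than the Xinanjiang model.

    import numpy as np

    def ishigami(x, a=7.0, b=0.1):
        return (np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2
                + b * x[:, 2]**4 * np.sin(x[:, 0]))

    rng = np.random.default_rng(4)
    N, d = 100_000, 3
    A = rng.uniform(-np.pi, np.pi, (N, d))
    B = rng.uniform(-np.pi, np.pi, (N, d))
    fA, fB = ishigami(A), ishigami(B)
    varY = np.var(np.r_[fA, fB])

    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                # vary only factor i
        S1 = np.mean(fB * (ishigami(ABi) - fA)) / varY
        print(f"S{i + 1} ~ {S1:.3f}")      # expected ~0.31, ~0.44, ~0.00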

  16. Spatial and temporal thermal analysis of acousto-optic deflectors using finite element analysis model.

    Science.gov (United States)

    Jiang, Runhua; Zhou, Zhenqiao; Lv, Xiaohua; Zeng, Shaoqun; Huang, Zhifeng; Zhou, Huaichun

    2012-07-01

    Thermal effects greatly influence the optical properties of acousto-optic deflectors (AODs). Thermal analysis therefore plays an important role in modern AOD design, but the lack of an effective analysis method limits prediction of thermal performance. In this paper, we propose a finite element analysis model to analyze the thermal effects of a TeO(2)-based AOD. Both transducer heating and acoustic absorption are considered as thermal sources. The anisotropy of sound propagation is taken into account when determining the acoustic absorption. Based on this model, a transient thermal analysis is carried out using ANSYS software. The spatial temperature distributions in the crystal and the temperature changes over time are acquired. The simulation results are validated against experimental results. The effects of the heat sources and of heat convection on the temperature distribution are discussed. This numerical model and analytical method of thermal analysis should be helpful in the thermal design and practical applications of AODs.

  17. Integration of Design and Control Through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay;

    2000-01-01

    A systematic analysis of the process model is proposed as a pre-solution step for integration of design and control problems. It is shown that the same set of process (control) variables and design (manipulative) variables is employed with different objectives in design and control. Analysis of t...... processes with mass and/or energy recycle. (C) 2000 Elsevier Science Ltd. All rights reserved....

  18. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
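
    A minimal sketch of the kind of compartment model described above: susceptible and infected computers with recruitment of new machines, retirement of old ones, and cure by antivirus software. The rate constants are illustrative assumptions, not the paper's values.

```python
# Susceptible/infected computer-virus model with recruitment and removal.
import numpy as np
from scipy.integrate import solve_ivp

b, d = 5.0, 0.01      # recruitment of new computers, retirement rate (assumed)
beta = 1e-3           # infection contact rate (assumed)
r = 0.2               # cure rate from antivirus software (assumed)

def rhs(t, y):
    S, I = y
    return [b - beta * S * I - d * S + r * I,
            beta * S * I - (d + r) * I]

sol = solve_ivp(rhs, (0.0, 400.0), [490.0, 10.0])
S_end, I_end = sol.y[:, -1]
print(f"endemic state approx: S={S_end:.1f}, I={I_end:.1f}")
# Basic reproduction number for this sketch: R0 = beta*(b/d)/(d+r)
print("R0 =", beta * (b / d) / (d + r))   # > 1 here, so the virus persists
```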

  19. Sensitivity analysis of the fission gas behavior model in BISON.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Pastore, Giovanni; Perez, Danielle; Williamson, Richard

    2013-05-01

    This report summarizes the result of a NEAMS project focused on sensitivity analysis of a new model for the fission gas behavior (release and swelling) in the BISON fuel performance code of Idaho National Laboratory. Using the new model in BISON, the sensitivity of the calculated fission gas release and swelling to the involved parameters and the associated uncertainties is investigated. The study results in a quantitative assessment of the role of intrinsic uncertainties in the analysis of fission gas behavior in nuclear fuel.

  20. A comparative analysis of multi-output frontier models

    Institute of Scientific and Technical Information of China (English)

    Tao ZHANG; Eoghan GARVEY

    2008-01-01

    Recently, there have been more debates on the methods of measuring efficiency. The main objective of this paper is to conduct a sensitivity analysis of different frontier models and compare the results obtained from the different methods of estimating a multi-output frontier for a specific application. The methods include the stochastic distance function frontier, the stochastic ray frontier, and data envelopment analysis. The stochastic frontier regressions with and without the inefficiency effects model are also compared and tested. The results indicate that there are significant correlations between the results obtained from the alternative estimation methods.
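
    Of the three methods compared above, data envelopment analysis is the easiest to sketch: the input-oriented CCR efficiency of each decision-making unit (DMU) is a small linear program. The sketch below uses scipy's linprog with illustrative single-input, single-output data.

```python
# Input-oriented CCR DEA efficiency scores via linear programming.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 6.0, 9.0]])        # inputs,  shape (m, n DMUs), assumed data
Y = np.array([[1.0, 4.0, 6.0, 7.0]])        # outputs, shape (s, n DMUs), assumed data
m, n = X.shape
s = Y.shape[0]

for o in range(n):
    # variables: [theta, lambda_1 .. lambda_n]; minimise theta
    c = np.zeros(1 + n)
    c[0] = 1.0
    # inputs:  sum_j lambda_j x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    # outputs: -sum_j lambda_j y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    print(f"DMU {o}: efficiency = {res.x[0]:.3f}")
```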

  1. System modeling based measurement error analysis of digital sun sensors

    Institute of Scientific and Technical Information of China (English)

    WEI Minsong; XING Fei; WANG Geng; YOU Zheng

    2015-01-01

    Stringent attitude determination accuracy is required for the development of advanced space technologies, and thus improvement in the accuracy of digital sun sensors is necessary. In this paper, we present a proposal for measurement error analysis of a digital sun sensor. A system model including three different error sources was built and employed for system error analysis. Numerical simulations were also conducted to study the measurement error introduced by the different error sources. Based on our model and study, the system errors from the different error sources are coupled, and the system calibration should be elaborately designed to realize a digital sun sensor with extra-high accuracy.

  2. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  3. First experience with the new ATLAS analysis model

    CERN Document Server

    Cranshaw, Jack; The ATLAS collaboration

    2016-01-01

    During the Long Shutdown of the LHC, the ATLAS collaboration overhauled its analysis model based on experience gained during Run 1. The main components are a new analysis format and Event Data Model which can be read directly by ROOT, as well as a "Derivation Framework" that takes the Petabyte-scale output from ATLAS reconstruction and produces smaller samples targeted at specific analyses, using the central production system. We will discuss the technical and operational aspects of this new system and review its performance during the first year of 13 TeV data taking.

  4. Causal Analysis for Performance Modeling of Computer Programs

    Directory of Open Access Journals (Sweden)

    Jan Lemeire

    2007-01-01

    Full Text Available Causal modeling and the accompanying learning algorithms provide useful extensions for in-depth statistical investigation and automation of performance modeling. We enlarged the scope of existing causal structure learning algorithms by using the form-free information-theoretic concept of mutual information and by introducing the complexity criterion for selecting direct relations among equivalent relations. The underlying probability distribution of experimental data is estimated by kernel density estimation. We then reported on the benefits of a dependency analysis and the decompositional capacities of causal models. Useful qualitative models, providing insight into the role of every performance factor, were inferred from experimental data. This paper reports on the results for a LU decomposition algorithm and on the study of the parameter sensitivity of the Kakadu implementation of the JPEG-2000 standard. Next, the analysis was used to search for generic performance characteristics of the applications.
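
    The form-free dependency measure mentioned above can be approximated with scikit-learn's mutual information estimator; the synthetic performance factors and runtime below are illustrative assumptions, not the paper's LU or JPEG-2000 measurements.

```python
# Estimating mutual information between performance factors and runtime.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(2)
n = 2000
factors = rng.uniform(0.0, 1.0, size=(n, 3))              # e.g. size, block, threads (assumed)
runtime = factors[:, 0] ** 2 + 0.1 * rng.normal(size=n)   # depends on factor 0 only

mi = mutual_info_regression(factors, runtime, random_state=0)
for name, val in zip(["size", "block", "threads"], mi):
    print(f"MI({name}; runtime) = {val:.3f}")
```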

  5. Evaluation of Cost Models and Needs & Gaps Analysis

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2014-01-01

    they break down costs. This is followed by an in-depth analysis of stakeholders’ needs for financial information derived from the 4C project stakeholder consultation. The stakeholders’ needs analysis indicated that models should: • support accounting, but more importantly they should enable budgeting • be able...... curation field. These recommendations for investigation and action include: • provision of a high-level quick entry guide to all existing models that describes the scope and structure of the models indicating their relevance for different stakeholders and use cases • provision of a vocabulary and a generic...... description of cost & benefit models • provision of clearly designed and user-friendly tools with default reference settings that can be fine-tuned to accommodate for various stakeholder needs, and usable user-interfaces • provision of benefit models in addition to the cost models • provision of a shared...

  6. Bayesian Model Selection with Network Based Diffusion Analysis.

    Science.gov (United States)

    Whalen, Andrew; Hoppitt, William J E

    2016-01-01

    A number of recent studies have used Network Based Diffusion Analysis (NBDA) to detect the role of social transmission in the spread of a novel behavior through a population. In this paper we present a unified framework for performing NBDA in a Bayesian setting, and demonstrate how the Watanabe-Akaike Information Criterion (WAIC) can be used for model selection. We present a specific example of applying this method to Time to Acquisition Diffusion Analysis (TADA). To examine the robustness of this technique, we performed a large-scale simulation study and found that NBDA using WAIC could recover the correct model of social transmission under a wide range of cases, including under the presence of random effects, individual-level variables, and alternative models of social transmission. This work suggests that NBDA is an effective and widely applicable tool for uncovering whether social transmission underpins the spread of a novel behavior, and may still provide accurate results even when key model assumptions are relaxed.
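
    WAIC itself is straightforward to compute once a matrix of pointwise log-likelihoods over posterior draws is available. The sketch below applies the standard definition, WAIC = -2(lppd - p_waic), to synthetic draws; nothing here is specific to NBDA.

```python
# WAIC from an (S draws x N observations) matrix of pointwise log-likelihoods.
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(3)
loglik = rng.normal(-1.0, 0.1, size=(4000, 50))   # stand-in posterior draws (assumed)

S = loglik.shape[0]
# log pointwise predictive density: log of the posterior-mean likelihood
lppd = np.sum(logsumexp(loglik, axis=0) - np.log(S))
# effective number of parameters: posterior variance of the log-likelihood
p_waic = np.sum(np.var(loglik, axis=0, ddof=1))
waic = -2.0 * (lppd - p_waic)
print(f"lppd={lppd:.1f}  p_waic={p_waic:.2f}  WAIC={waic:.1f}")
```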

  7. Accuracy Analysis for SST Gravity Field Model in China

    Institute of Scientific and Technical Information of China (English)

    LUO Jia; LUO Zhicai; ZOU Xiancai; WANG Haihong

    2006-01-01

    Taking China as the test region, the potential of the new satellite gravity technique, satellite-to-satellite tracking, for improving the accuracy of regional gravity field models is studied. With WDM94 as reference, the gravity anomaly residuals of three models, the two latest GRACE global gravity field models (EIGEN_GRACE02S, GGM02S) and EGM96, are computed and compared. The causes for the differences among the residuals of the three models are discussed. The comparison between the residuals shows that in the selected region, EIGEN_GRACE02S or GGM02S is better than EGM96 in the lower-degree part (below degree 110). Additionally, through the analysis of the model gravity anomaly residuals, it is found that some systematic errors with periodical properties exist in the higher-degree part of the EIGEN and GGM models; the results can also be taken as references in the validation of the SST gravity data.

  8. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    This paper investigates the characteristics of typical optimisation models within Distribution Network Design. During the paper fourteen models known from the literature will be thoroughly analysed. Through this analysis a schematic approach to categorisation of distribution network design models...... The dimensions covered in the categorisation include fixed vs. general networks, specialised vs. general nodes, linear vs. nonlinear costs, single vs. multi commodity, uncapacitated vs. capacitated activities, single vs. multi modal and static vs. dynamic. The models examined address both strategic and tactical planning...... for educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as being an art manual or recipe when constructing such a model.

  9. Probability bounds analysis for nonlinear population ecology models.

    Science.gov (United States)

    Enszer, Joshua A; Andrei Măceș, D; Stadtherr, Mark A

    2015-09-01

    Mathematical models in population ecology often involve parameters that are empirically determined and inherently uncertain, with probability distributions for the uncertainties not known precisely. Propagating such imprecise uncertainties rigorously through a model to determine their effect on model outputs can be a challenging problem. We illustrate here a method for the direct propagation of uncertainties represented by probability bounds through nonlinear, continuous-time, dynamic models in population ecology. This makes it possible to determine rigorous bounds on the probability that some specified outcome for a population is achieved, which can be a core problem in ecosystem modeling for risk assessment and management. Results can be obtained at a computational cost that is considerably less than that required by statistical sampling methods such as Monte Carlo analysis. The method is demonstrated using three example systems, with focus on a model of an experimental aquatic food web subject to the effects of contamination by ionic liquids, a new class of potentially important industrial chemicals.

  10. Network and adaptive system of systems modeling and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Campbell, James E. Dr. (.; .); Anderson, Dennis James; Eddy, John P.

    2007-05-01

    This report documents the results of an LDRD program entitled "Network and Adaptive System of Systems Modeling and Analysis" that was conducted during FY 2005 and FY 2006. The purpose of this study was to determine and implement ways to incorporate network communications modeling into existing System of Systems (SoS) modeling capabilities. Current SoS modeling, particularly for the Future Combat Systems (FCS) program, is conducted under the assumption that communication between the various systems is always possible and occurs instantaneously. A more realistic representation of these communications allows for better, more accurate simulation results. The current approach to meeting this objective has been to use existing capabilities to model network hardware reliability and adding capabilities to use that information to model the impact on the sustainment supply chain and operational availability.

  11. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  12. An Overview of Path Analysis: Mediation Analysis Concept in Structural Equation Modeling

    OpenAIRE

    Jenatabadi, Hashem Salarzadeh

    2015-01-01

    This paper provides a tutorial discussion on the structure of path analysis within the concept of structural equation modelling (SEM). The paper delivers an introduction to the path analysis technique and explains how to analyze data with this kind of statistical methodology, especially when the research model includes a mediator. The intended audience is statisticians, mathematicians, or methodologists who either know about SEM or simple basic statistics especially in regression and linear/nonline...
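
    A minimal sketch of the mediation analysis discussed above, estimated with two ordinary least-squares regressions (X on M, then Y on M and X together); the synthetic data encode a true indirect effect of 0.5 * 0.8 = 0.4.

```python
# Estimating the indirect (mediated) effect a*b and direct effect c'.
import numpy as np

rng = np.random.default_rng(4)
n = 1000
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(scale=0.5, size=n)                # mediator
Y = 0.8 * M + 0.2 * X + rng.normal(scale=0.5, size=n)      # outcome

ones = np.ones(n)
# path a: regress M on X
a = np.linalg.lstsq(np.column_stack([ones, X]), M, rcond=None)[0][1]
# paths b and c': regress Y on M and X together
coef = np.linalg.lstsq(np.column_stack([ones, M, X]), Y, rcond=None)[0]
b, c_prime = coef[1], coef[2]

print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```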

  13. Formability models for warm sheet metal forming analysis

    Science.gov (United States)

    Jiang, Sen

    Several closed-form models for the prediction of strain-space sheet metal formability as a function of temperature and strain rate are proposed. The proposed models require only failure strain information from the uniaxial tension test at an elevated temperature setting and failure strain information from the traditionally defined strain-space forming limit diagram at room temperature, thereby featuring the advantage of offering a full forming limit description without having to carry out expensive experimental studies for multiple modes of deformation at the elevated temperature. The Power-law, Voce, and Johnson-Cook hardening models are considered along with the Hill '48 and Logan-Hosford yield criteria. Acceptable correlations between theory and experiment are reported for all the models under a plane strain condition. Among all the proposed models, the model combining the Johnson-Cook hardening model and Logan-Hosford yield behavior (LHJC model) was shown to correlate best with experiment. The sensitivity of the model with respect to various forming parameters is discussed. This work is significant to those aiming to incorporate closed-form formability models directly into numerical simulation programs for the purpose of design and analysis of products manufactured through the warm sheet metal forming process. An improvement based upon Swift's diffuse necking theory is suggested in order to enhance the reliability of the model for biaxial stretch conditions. Theory relating to this improvement is provided in Appendix B.
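
    For reference, the Johnson-Cook hardening law named above can be written as a short function; the material constants below are illustrative assumptions, not values fitted in this work.

```python
# Johnson-Cook flow stress: (A + B*eps^n)(1 + C*ln(rate/eps0))(1 - T*^m).
import numpy as np

def johnson_cook_stress(strain, strain_rate, T,
                        A=200.0, B=400.0, n=0.5, C=0.02, m=1.0,
                        eps0=1.0, T_room=293.0, T_melt=900.0):
    """Flow stress in MPa; all material constants are assumed values."""
    T_star = (T - T_room) / (T_melt - T_room)   # homologous temperature
    return ((A + B * strain ** n)
            * (1.0 + C * np.log(strain_rate / eps0))
            * (1.0 - T_star ** m))

# flow stress at 10% strain, strain rate 10/s, 500 K
print(f"{johnson_cook_stress(0.10, 10.0, 500.0):.1f} MPa")
```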

  14. Video analysis of the flight of a model aircraft

    Energy Technology Data Exchange (ETDEWEB)

    Tarantino, Giovanni; Fazio, Claudio, E-mail: giovanni.tarantino19@unipa.it, E-mail: claudio.fazio@unipa.it [UOP-PERG (University of Palermo Physics Education Research Group), Dipartimento di Fisica, Universita di Palermo, Palermo (Italy)

    2011-11-15

    A video-analysis software tool has been employed in order to measure the steady-state values of the kinematics variables describing the longitudinal behaviour of a radio-controlled model aircraft during take-off, climbing and gliding. These experimental results have been compared with the theoretical steady-state configurations predicted by the phugoid model for longitudinal flight. A comparison with the parameters and performance of the full-size aircraft has also been outlined.

  15. Mathematical Modeling and Analysis of Classified Marketing of Agricultural Products

    Institute of Scientific and Technical Information of China (English)

    Fengying WANG

    2014-01-01

    Classified marketing of agricultural products was analyzed using the Logistic Regression Model. This method can take full advantage of the information in an agricultural product database to find factors influencing the best-selling degree of agricultural products, and to make a quantitative analysis accordingly. Using this model, it is also possible to predict sales of agricultural products and to provide a reference for mapping out individualized sales strategies for popularizing agricultural products.
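
    A minimal sketch of the approach described above, assuming two hypothetical predictors (price and a freshness score) and a synthetic best-selling label; scikit-learn's LogisticRegression stands in for the paper's Logistic Regression Model.

```python
# Logistic regression for classifying best-selling agricultural products.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 500
price = rng.uniform(1.0, 10.0, n)          # hypothetical predictor
freshness = rng.uniform(0.0, 1.0, n)       # hypothetical predictor
# synthetic "best selling" label: cheap, fresh products tend to sell
logit = -1.5 * price + 6.0 * freshness + 4.0
best_selling = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([price, freshness])
clf = LogisticRegression().fit(X, best_selling)
print("coefficients:", clf.coef_[0], "intercept:", clf.intercept_[0])
print("P(best selling | price=3, freshness=0.9) =",
      clf.predict_proba([[3.0, 0.9]])[0, 1].round(3))
```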

  16. A Numerical Model for Torsion Analysis of Composite Ship Hulls

    Directory of Open Access Journals (Sweden)

    Ionel Chirica

    2012-01-01

    Full Text Available A new methodology based on a macroelement model for the torsional behaviour of a ship hull made of composite material is proposed in this paper. A computer program has been developed for the elastic analysis of linear torsion. The results are compared with results from the licensed FEM software COSMOS/M and with measurements on a simplified scale model of a container ship made of composite materials.

  17. Development of a post accident analysis model for KALIMER

    Energy Technology Data Exchange (ETDEWEB)

    Chang, W. P.; Ha, G. S.; Jeong, H. Y.; Kwon, Y. M.; Heo, S.; Lee, Y. B. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    The ultimate safety of KALIMER depends on its inherent safety, which requires the core to maintain negative reactivity during any accident. In order to secure the integrity of a fuel rod, the void reactivity feedback under sodium boiling must be analyzed. Even though the KALIMER design might not allow boiling under any circumstance, sodium boiling would be possible under HCDA (Hypothetical Core Disruptive Accident) initiating events, which are represented by UTOP (Unprotected Transient Over Power), ULOF (Unprotected Loss Of Flow), ULOHS (Unprotected Loss Of Heat Sink), or sudden flow channel blockage, due to a power excursion caused by the reactivity feedback. The slug and annular flow regimes tend to prevail for the boiling of a liquid-metal coolant such as sodium near atmospheric pressure. In contrast, bubbly flow is typical under the high pressure in light water reactors. This difference in phenomena motivated the development of the present model, especially at the onset of boiling. A few models had previously been developed for sodium boiling analysis. Models such as those in HOMSEP-2 and the SAS series are classified as relatively detailed models; both are usually called multiple-bubble slug ejection models. Some simpler models have also been introduced to avoid either the parameter sensitivities or the mathematical complexity associated with those rigorous models. The present model is based on the multiple-bubble slug ejection model. It allows a finite number (N) of bubbles, separated by liquid slugs, in a channel. Boiling occurs at a user-specified superheat, and the generated vapor is modeled to fill the whole cross section of the coolant channel except for a static liquid film left on the cladding and/or structure surfaces. The model also assumes a vapor with one uniform pressure. The present analysis is focused on the behavior of early sodium boiling after ULOHS.

  18. Advanced accident sequence precursor analysis level 1 models

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.; Schroeder, J.A.; Siu, N.O. [Idaho National Engineering Lab., Idaho National Lab., Idaho Falls, ID (United States)

    1996-03-01

    INEL has been involved in the development of plant-specific Accident Sequence Precursor (ASP) models for the past two years. These models were developed for use with the SAPHIRE suite of PRA computer codes. They contained event tree/linked fault tree Level 1 risk models for the following initiating events: general transient, loss-of-offsite-power, steam generator tube rupture, small loss-of-coolant-accident, and anticipated transient without scram. Early in 1995 the ASP models were revised based on review comments from the NRC and an independent peer review. These models were released as Revision 1. The Office of Nuclear Regulatory Research has sponsored several projects at the INEL this fiscal year to further enhance the capabilities of the ASP models. Revision 2 models incorporates more detailed plant information into the models concerning plant response to station blackout conditions, information on battery life, and other unique features gleaned from an Office of Nuclear Reactor Regulation quick review of the Individual Plant Examination submittals. These models are currently being delivered to the NRC as they are completed. A related project is a feasibility study and model development of low power/shutdown (LP/SD) and external event extensions to the ASP models. This project will establish criteria for selection of LP/SD and external initiator operational events for analysis within the ASP program. Prototype models for each pertinent initiating event (loss of shutdown cooling, loss of inventory control, fire, flood, seismic, etc.) will be developed. A third project concerns development of enhancements to SAPHIRE. In relation to the ASP program, a new SAPHIRE module, GEM, was developed as a specific user interface for performing ASP evaluations. This module greatly simplifies the analysis process for determining the conditional core damage probability for a given combination of initiating events and equipment failures or degradations.

  19. Wind energy conversion system analysis model (WECSAM) computer program documentation

    Energy Technology Data Exchange (ETDEWEB)

    Downey, W T; Hendrick, P L

    1982-07-01

    Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation. Thus, any user-supplied data for WECS performance, application load, utility rates, or wind resource may be entered into the scratch file to override the default data-base value. After the model and the inputs required from the user and derived from the data base are described, the model output and the various output options that can be exercised by the user are detailed. The general operation is set forth and suggestions are made for efficient modes of operation. Sample listings of various input, output, and data-base files are appended. (LEW)

  20. Modeling and analysis of electrorheological suspensions in shear flow.

    Science.gov (United States)

    Seo, Youngwook P; Seo, Yongsok

    2012-02-14

    A model capable of describing the flow behavior of electrorheological (ER) suspensions under different electric field strengths and over the full range of shear rates is proposed. Structural reformation in the low shear rate region is investigated, where parts of the material are in an undeformed state while aligned structures reform under the shear force. The model's predictions were compared with the experimental data of some ER fluids as well as with the CCJ (Cho-Choi-Jhon) model. This simple model's predictions of suspension flow behavior with subsequent aligned structure reformation agreed well with the experimental data, both quantitatively and qualitatively. The proposed model plausibly predicted the static yield stress, whereas the CCJ model and the Bingham model predicted only the dynamic yield stress. The master curve describing the apparent viscosity was obtained by appropriately scaling both axes, which showed that a combination of dimensional analysis and flow curve analysis using the proposed model yields a quantitatively and qualitatively precise description of ER fluid rheological behavior based on relatively few experimental measurements.
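
    Fitting a flow curve to steady-shear data is the core operation behind such model comparisons. For brevity the sketch below fits the two-parameter Bingham model mentioned above rather than the proposed model or the CCJ model; the data points are synthetic.

```python
# Fitting a yield-stress (Bingham) flow curve to shear data.
import numpy as np
from scipy.optimize import curve_fit

def bingham(shear_rate, tau_y, eta):
    """Shear stress = dynamic yield stress + plastic viscosity * shear rate."""
    return tau_y + eta * shear_rate

shear_rate = np.logspace(-1, 3, 20)   # 1/s, assumed measurement grid
stress = 50.0 + 0.8 * shear_rate + np.random.default_rng(6).normal(0, 2, 20)

(tau_y, eta), _ = curve_fit(bingham, shear_rate, stress, p0=[10.0, 1.0])
print(f"yield stress ~ {tau_y:.1f} Pa, plastic viscosity ~ {eta:.3f} Pa*s")
```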

  1. Biological Jumping Mechanism Analysis and Modeling for Frog Robot

    Institute of Scientific and Technical Information of China (English)

    Meng Wang; Xi-zhe Zang; Ji-zhuang Fan; Jie Zhao

    2008-01-01

    This paper presents a mechanical model of a jumping robot based on a biological mechanism analysis of the frog. Through biological observation and kinematic analysis, the frog jump is divided into a take-off phase, an aerial phase and a landing phase. We find similar trajectories of the hindlimb joints during the jump, the important effect of the foot during take-off, and the role of the forelimb in supporting the body. Based on these observations, the frog jump is simplified and a mechanical model is put forward. The robot leg is represented by a 4-bar spring/linkage mechanism model, which has three Degrees of Freedom (DOF) at the hip joint and one passive DOF at the tarsometatarsal joint on the foot. The shoulder and elbow joints each have one DOF for the balancing function of the arm. The ground reaction force of the model is analyzed and compared with that of the frog during take-off. The results show that the model has the same advantages of low likelihood of premature lift-off and high efficiency as the frog. The analysis results and the model can be employed to develop and control a robot capable of mimicking the jumping behavior of the frog.

  2. The Financial Analysis System: An Integrated Software System for Financial Analysis and Modeling.

    Science.gov (United States)

    Groomer, S. Michael

    This paper discusses the Financial Analysis System (FAS), a software system for financial analysis, display, and modeling of the data found in the COMPUSTAT Annual Industrial, Over-the-Counter and Canadian Company files. The educational utility of FAS is also discussed briefly. (Author)

  3. Automated quantitative gait analysis in animal models of movement disorders

    Directory of Open Access Journals (Sweden)

    Vandeputte Caroline

    2010-08-01

    Full Text Available Abstract Background Accurate and reproducible behavioral tests in animal models are of major importance in the development and evaluation of new therapies for central nervous system disease. In this study we investigated for the first time gait parameters of rat models for Parkinson's disease (PD), Huntington's disease (HD) and stroke using the Catwalk method, a novel automated gait analysis test. Static and dynamic gait parameters were measured in all animal models, and these data were compared to readouts of established behavioral tests, such as the cylinder test in the PD and stroke rats and the rotarod tests for the HD group. Results Hemiparkinsonian rats were generated by unilateral injection of the neurotoxin 6-hydroxydopamine in the striatum or in the medial forebrain bundle. For Huntington's disease, a transgenic rat model expressing a truncated huntingtin fragment with multiple CAG repeats was used. Thirdly, a stroke model was generated by a photothrombotic induced infarct in the right sensorimotor cortex. We found that multiple gait parameters were significantly altered in all three disease models compared to their respective controls. Behavioural deficits could be efficiently measured using the cylinder test in the PD and stroke animals, and in the case of the PD model, the deficits in gait essentially confirmed results obtained by the cylinder test. However, in the HD model and the stroke model the Catwalk analysis proved more sensitive than the rotarod test and also added new and more detailed information on specific gait parameters. Conclusion The automated quantitative gait analysis test may be a useful tool to study both motor impairment and recovery associated with various neurological motor disorders.

  4. Predictive error analysis for a water resource management model

    Science.gov (United States)

    Gallagher, Mark; Doherty, John

    2007-02-01

    In calibrating a model, a set of parameters is assigned to the model which will be employed for the making of all future predictions. If these parameters are estimated through solution of an inverse problem, formulated to be properly posed through either pre-calibration or mathematical regularisation, then solution of this inverse problem will, of necessity, lead to a simplified parameter set that omits the details of reality, while still fitting historical data acceptably well. Furthermore, estimates of parameters so obtained will be contaminated by measurement noise. Both of these phenomena will lead to errors in predictions made by the model, with the potential for error increasing with the hydraulic property detail on which the prediction depends. Integrity of model usage demands that model predictions be accompanied by some estimate of the possible errors associated with them. The present paper applies theory developed in a previous work to the analysis of predictive error associated with a real world, water resource management model. The analysis offers many challenges, including the fact that the model is a complex one that was partly calibrated by hand. Nevertheless, it is typical of models which are commonly employed as the basis for the making of important decisions, and for which such an analysis must be made. The potential errors associated with point-based and averaged water level and creek inflow predictions are examined, together with the dependence of these errors on the amount of averaging involved. Error variances associated with predictions made by the existing model are compared with "optimized error variances" that could have been obtained had calibration been undertaken in such a way as to minimize predictive error variance. The contributions by different parameter types to the overall error variance of selected predictions are also examined.

  5. Strengthening the weak link: Built Environment modelling for loss analysis

    Science.gov (United States)

    Millinship, I.

    2012-04-01

    Methods to analyse insured losses from a range of natural perils, including pricing by primary insurers and catastrophe modelling by reinsurers, typically lack sufficient exposure information. Understanding the hazard intensity in terms of spatial severity and frequency is only the first step towards quantifying the risk of a catastrophic event. For any given event we need to know: Are any structures affected? What type of buildings are they? How much damage occurred? How much will the repairs cost? To achieve this, detailed exposure information is required to assess the likely damage and to effectively calculate the resultant loss. Modelling exposures in the Built Environment therefore plays as important a role in understanding re/insurance risk as characterising the physical hazard. Across both primary insurance books and aggregated reinsurance portfolios, the location of a property (a risk) and its monetary value is typically known. Exactly what that risk is in terms of detailed property descriptors including structure type and rebuild cost - and therefore its vulnerability to loss - is often omitted. This data deficiency is a primary source of variations between modelled losses and the actual claims value. Built Environment models are therefore required at a high resolution to describe building attributes that relate vulnerability to property damage. However, national-scale household-level datasets are often not computationally practical in catastrophe models and data must be aggregated. In order to provide more accurate risk analysis, we have developed and applied a methodology for Built Environment modelling for incorporation into a range of re/insurance applications, including operational models for different international regions and different perils and covering residential, commercial and industry exposures. Illustrated examples are presented, including exposure modelling suitable for aggregated reinsurance analysis for the UK and bespoke high resolution

  6. Model parameter uncertainty analysis for annual field-scale P loss model

    Science.gov (United States)

    Phosphorous (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  7. Model parameter uncertainty analysis for an annual field-scale phosphorus loss model

    Science.gov (United States)

    Phosphorous (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  8. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Science.gov (United States)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  9. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    Science.gov (United States)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of a wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We described the process for enabling the fire location, smoke forecast, smoke observation, and

  10. Stochastic modeling of friction force and vibration analysis of a mechanical system using the model

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Won Seok; Choi, Chan Kyu; Yoo, Hong Hee [Hanyang University, Seoul (Korea, Republic of)

    2015-09-15

    The squeal noise generated by a disk brake or the chatter that occurs in a machine tool primarily results from friction-induced vibration. Since friction-induced vibration is usually accompanied by abrasion and lifespan reduction of mechanical parts, it is necessary to develop a reliable analysis model by which friction-induced vibration phenomena can be accurately analyzed. The original Coulomb friction model, and the modified Coulomb friction model employed in most commercial programs, use deterministic friction coefficients. However, observation of friction phenomena between two contact surfaces shows that friction coefficients keep changing due to the unevenness of the contact surfaces, temperature, lubrication and humidity. Therefore, in this study, friction coefficients are modeled as random parameters that keep changing during the motion of a mechanical system subject to friction forces. The integrity of the proposed stochastic friction model was validated by comparing the analysis results obtained with the proposed model against experimental results.
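
    The central idea above, replacing a constant friction coefficient with one redrawn randomly during the motion, can be sketched with a single-degree-of-freedom oscillator on a moving belt; all parameter values and the friction-coefficient distribution are illustrative assumptions.

```python
# 1-DOF oscillator on a moving belt with a randomly varying Coulomb friction
# coefficient, integrated with explicit Euler steps.
import numpy as np

m, k, c = 1.0, 100.0, 0.5          # mass, stiffness, damping (assumed)
v_belt, N = 0.5, 9.81              # belt speed, normal force (assumed)
dt, steps = 1e-4, 200_000
rng = np.random.default_rng(7)

x, v = 0.0, 0.0
amplitude = 0.0
for _ in range(steps):
    mu = max(rng.normal(0.3, 0.05), 0.0)     # random friction coefficient
    v_rel = v - v_belt
    friction = -np.sign(v_rel) * mu * N      # Coulomb friction force
    a = (friction - k * x - c * v) / m
    v += a * dt                              # explicit Euler integration
    x += v * dt
    amplitude = max(amplitude, abs(x))

print(f"max displacement ~ {amplitude:.4f} m")
```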

  11. Uncertainty Analysis of Multi-Model Flood Forecasts

    Directory of Open Access Journals (Sweden)

    Erich J. Plate

    2015-12-01

    Full Text Available This paper demonstrates, by means of a systematic uncertainty analysis, that the use of outputs from more than one model can significantly improve conditional forecasts of discharges or water stages, provided the models are structurally different. Discharge forecasts from two models and the actual forecasted discharge are assumed to form a three-dimensional joint probability density distribution (jpdf, calibrated on long time series of data. The jpdf is decomposed into conditional probability density distributions (cpdf by means of Bayes formula, as suggested and explored by Krzysztofowicz in a series of papers. In this paper his approach is simplified to optimize conditional forecasts for any set of two forecast models. Its application is demonstrated by means of models developed in a study of flood forecasting for station Stung Treng on the middle reach of the Mekong River in South-East Asia. Four different forecast models were used and pairwise combined: forecast with no model, with persistence model, with a regression model, and with a rainfall-runoff model. Working with cpdfs requires determination of dependency among variables, for which linear regressions are required, as was done by Krzysztofowicz. His Bayesian approach based on transforming observed probability distributions of discharges and forecasts into normal distributions is also explored. Results obtained with his method for normal prior and likelihood distributions are identical to results from direct multiple regressions. Furthermore, it is shown that in the present case forecast accuracy is only marginally improved, if Weibull distributed basic data were converted into normally distributed variables.
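
    Under the normality assumptions discussed above, the conditional forecast from two structurally different models reduces to a multiple linear regression of the observed discharge on the two model outputs. The sketch below demonstrates this combination step with synthetic series, not Mekong data.

```python
# Combining two forecast models by multiple linear regression.
import numpy as np

rng = np.random.default_rng(8)
n = 500
truth = 1000.0 + 300.0 * np.sin(np.linspace(0, 20, n))   # "observed" discharge (synthetic)
f1 = truth + rng.normal(0, 80, n)    # e.g. regression-model forecast (synthetic)
f2 = truth + rng.normal(0, 120, n)   # e.g. rainfall-runoff forecast (synthetic)

A = np.column_stack([np.ones(n), f1, f2])
coef, *_ = np.linalg.lstsq(A, truth, rcond=None)
combined = A @ coef   # conditional forecast given both model outputs

for name, fc in [("model 1", f1), ("model 2", f2), ("combined", combined)]:
    print(f"{name}: RMSE = {np.sqrt(np.mean((fc - truth) ** 2)):.1f}")
```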

  12. A Lumped Computational Model for Sodium Sulfur Battery Analysis

    Science.gov (United States)

    Wu, Fan

    Due to the cost of materials and time consuming testing procedures, development of new batteries is a slow and expensive practice. The purpose of this study is to develop a computational model and assess the capabilities of such a model designed to aid in the design process and control of sodium sulfur batteries. To this end, a transient lumped computational model derived from an integral analysis of the transport of species, energy and charge throughout the battery has been developed. The computation processes are coupled with the use of Faraday's law, and solutions for the species concentrations, electrical potential and current are produced in a time marching fashion. Properties required for solving the governing equations are calculated and updated as a function of time based on the composition of each control volume. The proposed model is validated against multi-dimensional simulations and experimental results from literatures, and simulation results using the proposed model is presented and analyzed. The computational model and electrochemical model used to solve the equations for the lumped model are compared with similar ones found in the literature. The results obtained from the current model compare favorably with those from experiments and other models.

  13. Human Performance Modeling for Dynamic Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory; Mandelli, Diego [Idaho National Laboratory

    2015-08-01

    Part of the U.S. Department of Energy's (DOE's) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.

  14. Residuals analysis of the generalized linear models for longitudinal data.

    Science.gov (United States)

    Chang, Y C

    2000-05-30

    The generalized estimating equation (GEE) method, one of the generalized linear models for longitudinal data, has been used widely in medical research. However, the related sensitivity analysis problem has not been explored intensively. One possible reason for this is the correlated structure within the same subject. We show that the conventional residual plots for model diagnosis in longitudinal data can mislead a researcher into trusting the fitted model. A non-parametric method, the Wald-Wolfowitz run test, is proposed to check the residual plots both quantitatively and graphically. The rationale proposed in this paper is well illustrated with two real clinical studies in Taiwan.
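
    The Wald-Wolfowitz run test proposed above is easy to implement from its normal approximation: compare the observed number of sign runs in the residual series with its expectation under randomness. A minimal sketch, assuming a reasonably long series:

```python
# Wald-Wolfowitz run test on a residual series (normal approximation).
import numpy as np
from scipy.stats import norm

def runs_test(residuals):
    signs = residuals > np.median(residuals)
    n1, n2 = signs.sum(), (~signs).sum()
    runs = 1 + np.sum(signs[1:] != signs[:-1])   # count sign changes
    mean = 2.0 * n1 * n2 / (n1 + n2) + 1.0
    var = (2.0 * n1 * n2 * (2.0 * n1 * n2 - n1 - n2)
           / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
    z = (runs - mean) / np.sqrt(var)
    return z, 2.0 * norm.sf(abs(z))              # two-sided p-value

rng = np.random.default_rng(9)
z, p = runs_test(rng.normal(size=200))           # independent residuals
print(f"z = {z:.2f}, p = {p:.3f}")               # large p: no systematic pattern
```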

  15. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  16. Modelling and Analysis of Dual-Stator Induction Motors

    Science.gov (United States)

    Razik, Hubert; Rezzoug, Abderrezak; Hadiouche, Djafar

    In this paper, the analysis and the modelling of a Dual-Stator Induction Motor (DSIM) are presented. In particular, the effects of the shift angle between its three-phase windings are studied. A complex steady state model is first established in order to analyse its harmonic behavior when it is supplied by a non-sinusoidal voltage source. Then, a new transformation matrix is proposed to develop a suitable dynamic model. In both cases, the study is made using an arbitrary shift angle. Simulation results of its PWM control are also presented and compared in order to confirm our theoretical observations.

  17. Quantitative modelling and analysis of a Chinese smart grid: a stochastic model checking case study

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2014-01-01

    Cyber-physical systems integrate information and communication technology with the physical elements of a system, mainly for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues...... consumption. We employ a stochastic model checking approach and present our modelling and analysis study using the PRISM model checker.

  18. Numerical stability analysis in respiratory control system models

    Directory of Open Access Journals (Sweden)

    Laszlo E. Kollar

    2005-04-01

    Full Text Available Stability of the unique equilibrium in two mathematical models (based on chemical balance dynamics) of human respiration is examined using numerical methods. Due to the transport delays in the respiratory control system these models are governed by delay differential equations. First, a simplified two-state model with one delay is considered, then a five-state model with four delays (where the application of numerical methods is essential) is investigated. In particular, software is developed to perform linearized stability analysis and simulations of the model equations. Furthermore, the Matlab package DDE-BIFTOOL v. 2.00 is employed to carry out numerical bifurcation analysis. Our main goal is to study the effects of transport delays on the stability of the model equations. Critical values of the transport delays (i.e., where Hopf bifurcations occur) are determined, and stable periodic solutions are found as the delays pass their critical values. The numerical findings are in good agreement with analytic results obtained earlier for the two-state model.

  19. Improved environmental multimedia modeling and its sensitivity analysis.

    Science.gov (United States)

    Yuan, Jing; Elektorowicz, Maria; Chen, Zhi

    2011-01-01

    Modeling of multimedia environmental issues is extremely complex due to the intricacy of the systems and the many factors to be considered. In this study, an improved environmental multimedia model is developed, and a number of related test problems are examined and compared against standard numerical and analytical methodologies. The results indicate that the flux output of the new model is smaller in the unsaturated zone and the groundwater zone compared with the traditional environmental multimedia model. Furthermore, about 90% of the total benzene flux was distributed to the air zone from the landfill sources and only 10% of the total flux was emitted into the unsaturated and groundwater zones under non-uniform conditions. This paper also includes model sensitivity analysis to optimize model parameters such as the Peclet number (Pe). The analysis results show that Pe can be considered a deterministic input variable for transport output. The oscillatory behavior is eliminated as Pe decreases. In addition, the numerical methods are more accurate than the analytical methods as Pe increases. In conclusion, the improved environmental multimedia model system and its sensitivity analysis can be used to address the complex fate and transport of pollutants in multimedia environments and thereby help to manage environmental impacts.

  20. Design Through Manufacturing: The Solid Model - Finite Element Analysis Interface

    Science.gov (United States)

    Rubin, Carol

    2003-01-01

    State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts which reflect every detail of the finished product. Ideally, these models should fulfill two very important functions: (1) they must provide numerical control information for automated manufacturing of precision parts, and (2) they must enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in space missions. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. The research performed during the fellowship period investigated the transition process from the solid CAD model to the FEA stress analysis model with the final goal of creating an automatic interface between the two. During the period of the fellowship a detailed multi-year program for the development of such an interface was created. The ultimate goal of this program will be the development of a fully parameterized automatic ProE/FEA translator for parts and assemblies, with the incorporation of data base management into the solution, and ultimately including computational fluid dynamics and thermal modeling in the interface.

  1. Semigroup Method for a Mathematical Model in Reliability Analysis

    Institute of Scientific and Technical Information of China (English)

    Geni Gupur; LI Xue-zhi

    2001-01-01

    The system, which consists of a reliable machine, an unreliable machine and a storage buffer with infinitely many workpieces, has been studied. The existence of a unique positive time-dependent solution of the model corresponding to the system has been obtained by using C0-semigroup theory of linear operators in functional analysis.

  2. Landslide susceptibility analysis using an artificial neural network model

    Science.gov (United States)

    Mansor, Shattri; Pradhan, Biswajeet; Daud, Mohamed; Jamaludin, Normalina; Khuzaimah, Zailani

    2007-10-01

    This paper deals with landslide susceptibility analysis using an artificial neural network model for the Cameron Highlands, Malaysia. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis. These parameters are topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. Landslide hazard was analyzed using landslide occurrence factors employing the logistic regression model. The results of the analysis were verified using the landslide location data and compared with the logistic regression model. The observed accuracy of the hazard map was 85.73%. The qualitative landslide susceptibility analysis was carried out using an artificial neural network model by performing map overlay analysis in a GIS environment. This information could be used to estimate the risk to population, property and existing infrastructure such as the transportation network.

  3. Soil Retaining Structures: Development of models for structural analysis

    NARCIS (Netherlands)

    Bakker, K.J.

    2000-01-01

    The topic of this thesis is the development of models for the structural analysis of soil retaining structures. The soil retaining structures being looked at are; block revetments, flexible retaining walls and bored tunnels in soft soil. Within this context typical structural behavior of these struc

  4. Spatial Econometric data analysis: moving beyond traditional models

    NARCIS (Netherlands)

    Florax, R.J.G.M.; Vlist, van der A.J.

    2003-01-01

    This article appraises recent advances in the spatial econometric literature. It serves as the introduction to a collection of new papers on spatial econometric data analysis brought together in this special issue, dealing specifically with new extensions to the spatial econometric modeling perspecti

  5. Video Analysis of the Flight of a Model Aircraft

    Science.gov (United States)

    Tarantino, Giovanni; Fazio, Claudio

    2011-01-01

    A video-analysis software tool has been employed in order to measure the steady-state values of the kinematics variables describing the longitudinal behaviour of a radio-controlled model aircraft during take-off, climbing and gliding. These experimental results have been compared with the theoretical steady-state configurations predicted by the…

  6. Model-based analysis and simulation of regenerative heat wheel

    DEFF Research Database (Denmark)

    Wu, Zhuang; Melnik, Roderick V. N.; Borup, F.

    2006-01-01

    of mathematical models for the thermal analysis of the fluid and wheel matrix. The effect of heat conduction in the direction of the fluid flow is taken into account and the influence of variations in rotating speed of the wheel as well as other characteristics (ambient temperature, airflow and geometric size...

  7. Alphabet Knowledge in Preschool: A Rasch Model Analysis

    Science.gov (United States)

    Drouin, Michelle; Horner, Sherri L.; Sondergeld, Toni A.

    2012-01-01

    In this study, we used Rasch model analyses to examine (1) the unidimensionality of the alphabet knowledge construct and (2) the relative difficulty of different alphabet knowledge tasks (uppercase letter recognition, names, and sounds, and lowercase letter names) within a sample of preschoolers (n=335). Rasch analysis showed that the four…

  8. Rasch Model Based Analysis of the Force Concept Inventory

    Science.gov (United States)

    Planinic, Maja; Ivanjek, Lana; Susac, Ana

    2010-01-01

    The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One of such tools is the stochastic Rasch model, which enables construction of linear…
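
    Both of the preceding records rest on the dichotomous Rasch model, which places person ability theta and item difficulty b on one logit scale. A toy sketch, assuming simple joint maximum-likelihood updates (real Rasch software uses more careful estimation; all data below are made up):

        import numpy as np

        def rasch_prob(theta, b):
            # P(correct) = exp(theta - b) / (1 + exp(theta - b))
            return 1.0 / (1.0 + np.exp(-(theta - b)))

        # Toy responses: 5 persons x 4 items (1 = correct)
        R = np.array([[1, 1, 0, 0],
                      [1, 1, 1, 0],
                      [1, 0, 0, 0],
                      [1, 1, 1, 1],
                      [0, 1, 0, 0]], dtype=float)
        theta = np.zeros(R.shape[0])
        b = np.zeros(R.shape[1])
        for _ in range(200):                    # crude gradient-ascent updates
            P = rasch_prob(theta[:, None], b[None, :])
            theta += 0.1 * (R - P).sum(axis=1)  # log-likelihood gradient in theta
            b -= 0.1 * (R - P).sum(axis=0)      # and in b (note the sign)
            b -= b.mean()                       # anchor the scale at mean difficulty 0
        # Persons with perfect scores drift upward; production software treats them specially.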

  9. Mathematical modelling and linear stability analysis of laser fusion cutting

    Science.gov (United States)

    Hermanns, Torsten; Schulz, Wolfgang; Vossen, Georg; Thombansen, Ulrich

    2016-06-01

    A model for laser fusion cutting is presented and investigated by linear stability analysis in order to study the tendency for dynamic behavior and subsequent ripple formation. The result is a so-called stability function that describes how the settings of the process parameters correlate with the amount of dynamic behavior the process exhibits.

  10. The Cycle of Warfare - Analysis of an Analytical Model

    DEFF Research Database (Denmark)

    Jensen, Mikkel Storm

    2016-01-01

    The abstract has the title: “The Cycle of Warfare - Analysis of an Analytical Model” The Cycle of Warfare is an analytical model designed to illustrate the coherence between the organization, doctrine and technology of a military entity and the influence of the surrounding society as expressed...

  11. Static analysis of a Model of the LDL degradation pathway

    DEFF Research Database (Denmark)

    Pilegaard, Henrik; Nielson, Flemming; Nielson, Hanne Riis

    2005-01-01

    BioAmbients is a derivative of mobile ambients that has shown promise of describing interesting features of the behaviour of biological systems. As for other ambient calculi, static program analysis can be used to compute safe approximations of the behaviour of modelled systems. We use these tools ...

  12. Analysis of Rater Agreement by Rasch and IRT Models

    DEFF Research Database (Denmark)

    Petersen, Jørgen Holm

    2013-01-01

    This chapter first discusses the use of item response theory (IRT) model for the analysis of agreement between raters in a situation where all raters have supplied dichotomous ratings of the same cases in a sample. Next, the chapter describes two approaches to the quantification of the rater vari...... be attributed to the considerable rater variation.

  13. Sensitivity Analysis of Mixed Models for Incomplete Longitudinal Data

    Science.gov (United States)

    Xu, Shu; Blozis, Shelley A.

    2011-01-01

    Mixed models are used for the analysis of data measured over time to study population-level change and individual differences in change characteristics. Linear and nonlinear functions may be used to describe a longitudinal response, individuals need not be observed at the same time points, and missing data, assumed to be missing at random (MAR),…

  14. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    a number of important issues which have been identified when addressing the Distribution Network Design problem from a modelling angle. More specifically, we present an analysis of the research which has been performed in utilizing operational research in developing and optimising distribution systems....

  15. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

    for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system with 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so...

  16. Qualitative Analysis for Rheodynamic Model of Cardiac Pressure Pulsations

    Institute of Scientific and Technical Information of China (English)

    Zhi-cong Liu; Bei-ye Feng

    2004-01-01

    In this paper, we give a rigorous mathematical and complete parameter analysis for the rheodynamic model of cardiac pressure pulsations, and obtain the conditions and parameter region for global existence and uniqueness of the limit cycle, as well as the global bifurcation diagram of limit cycles. We also discuss the resonance phenomena of the perturbed system.

  17. Modeling and analysis of ground target radiation cross section

    Institute of Scientific and Technical Information of China (English)

    SHI Xiang; LOU GuoWei; LI XingGuo

    2008-01-01

    Based on an analysis of passive millimeter wave (MMW) radiometer detection, the ground target radiation cross section is modeled as a new descriptor of the target's MMW radiant characteristics. Its application and actual testing are discussed and analyzed. The essence of passive MMW stealth is reduction of the target radiation cross section.

  18. Elements of Constitutive Modelling and Numerical Analysis of Frictional Soils

    DEFF Research Database (Denmark)

    Jakobsen, Kim Parsberg

    This thesis deals with elements of elasto-plastic constitutive modelling and numerical analysis of frictional soils. The thesis is based on a number of scientific papers and reports in which central characteristics of soil behaviour and applied numerical techniques are considered. The development...

  19. Modeling and Analysis of A Rotary Direct Drive Servovalve

    Institute of Scientific and Technical Information of China (English)

    YU Jue; ZHUANG Jian; YU Dehong

    2014-01-01

    Direct drive servovalves are mostly restricted to low flow rate and low bandwidth applications due to the considerable flow forces. Current studies mainly focus on enhancing the driving force, which in turn is limited by the development of the magnetic material. Aiming at reducing the flow forces, a novel rotary direct drive servovalve (RDDV) is introduced in this paper. This RDDV servovalve is designed in a rotating structure and its axially symmetric spool rotates within a certain angle range in the valve chamber. The servovalve orifices are formed by the matching between the square-wave-shaped land on the spool and the rectangular ports on the sleeve. In order to study the RDDV servovalve performance, a flow rate model and a mechanical model are established, from which flow rates and flow-induced torques at different spool rotation angles or spool radii are obtained. The model analysis shows that the driving torque can be alleviated due to the proposed valve structure. Computational fluid dynamics (CFD) analysis using ANSYS/FLUENT is applied to evaluate and validate the theoretical analysis. In addition, experiments on the flow rate and the mechanical characteristic of the RDDV servovalve are carried out. Both simulation and experimental results conform to the results of the theoretical model analysis, which proves that this novel structure for direct drive servovalves can reduce the flow force on the spool and improve valve frequency response characteristics.

  20. Software applications toward quantitative metabolic flux analysis and modeling.

    Science.gov (United States)

    Dandekar, Thomas; Fieselmann, Astrid; Majeed, Saman; Ahmed, Zeeshan

    2014-01-01

    Metabolites and their pathways are central for adaptation and survival. Metabolic modeling elucidates in silico all the possible flux pathways (flux balance analysis, FBA) and predicts the actual fluxes under a given situation; further refinement of these models is possible by including experimental isotopologue data. In this review, we initially introduce the key theoretical concepts and the different analysis steps in the modeling process before comparing flux calculation and metabolite analysis programs such as C13, BioOpt, COBRA toolbox, Metatool, efmtool, FiatFlux, ReMatch, VANTED, iMAT and YANA. Their respective strengths and limitations are discussed and compared to alternative software. While data analysis of metabolites, calculation of metabolic fluxes and pathways, and their condition-specific changes are all possible, we highlight the considerations that need to be taken into account before deciding on a specific software. Current challenges in the field include the computation of large-scale networks (in elementary mode analysis), regulatory interactions and detailed kinetics, and these are discussed in the light of powerful new approaches.
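
    The flux balance analysis (FBA) step that these programs automate is a linear program: maximize an objective flux c·v subject to the steady-state constraint S v = 0 and flux bounds. A self-contained sketch on a made-up three-reaction network (not taken from any of the tools above):

        import numpy as np
        from scipy.optimize import linprog

        # Toy network, 2 metabolites x 3 reactions: uptake -> A, A -> B, B -> biomass
        S = np.array([[1, -1,  0],
                      [0,  1, -1]])
        c = np.array([0, 0, 1])                   # objective: the biomass reaction
        bounds = [(0, 10), (0, 1000), (0, 1000)]  # uptake capped at 10 units

        res = linprog(-c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)  # linprog minimizes
        print(res.x)                              # optimal fluxes, here [10, 10, 10]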

  1. Modeling and Analysis of Component Faults and Reliability

    DEFF Research Database (Denmark)

    Le Guilly, Thibaut; Olsen, Petur; Ravn, Anders Peter;

    2016-01-01

    This chapter presents a process to design and validate models of reactive systems in the form of communicating timed automata. The models are extended with faults associated with probabilities of occurrence. This enables a fault tree analysis of the system using minimal cut sets that are automatically generated. The stochastic information on the faults is used to estimate the reliability of the fault-affected system. The reliability is given with respect to properties of the system state space. We illustrate the process on a concrete example using the Uppaal model checker for validating the ideal system model and the fault modeling. Then the statistical version of the tool, UppaalSMC, is used to find reliability estimates.

  2. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. A new direct method for calculating first-order sensitivity coefficients in chemical kinetics using sparse matrix technology is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their Jacobian analytical expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines in packages, such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.

  3. Coarse Analysis of Microscopic Models using Equation-Free Methods

    DEFF Research Database (Denmark)

    Marschler, Christian

    -dimensional models. The goal of this thesis is to investigate such high-dimensional multiscale models and extract relevant low-dimensional information from them. Recently developed mathematical tools allow to reach this goal: a combination of so-called equation-free methods with numerical bifurcation analysis....... Applications include the learning behavior in the barn owl’s auditory system, traffic jam formation in an optimal velocity model for circular car traffic and oscillating behavior of pedestrian groups in a counter-flow through a corridor with narrow door. The methods do not only quantify interesting properties...... factor for the complexity of models, e.g., in real-time applications. With the increasing amount of data generated by computer simulations a challenge is to extract valuable information from the models in order to help scientists and managers in a decision-making process. Although the dynamics...

  4. Performance analysis of FXLMS algorithm with secondary path modeling error

    Institute of Scientific and Technical Information of China (English)

    SUN Xu; CHEN Duanshi

    2003-01-01

    Performance analysis of the filtered-X LMS (FXLMS) algorithm with secondary path modeling error is carried out in both the time and frequency domains. It is shown first that the effects of secondary path modeling error on the performance of the FXLMS algorithm are determined by the distribution of the relative error of the secondary path model along frequency. In the case where the distribution of the relative error is uniform, the modeling error of the secondary path has no effect on the performance of the algorithm. In addition, a limitation property of the FXLMS algorithm is proved, which implies that the negative effects of secondary path modeling error can be compensated by increasing the adaptive filter length. At last, some insights into the "spillover" phenomenon of the FXLMS algorithm are given.
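
    A compact FXLMS loop, assuming FIR primary and secondary paths; s_hat is the secondary path model whose relative error the paper analyzes (here a uniform 10% gain error, which by the result above should not degrade convergence). All path coefficients are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(1)
        n, L = 5000, 16                       # samples, adaptive filter length
        x = rng.normal(size=n)                # reference signal
        s = np.array([1.0, 0.5, 0.25])        # true secondary path (FIR)
        s_hat = 1.1 * s                       # model with uniform 10% relative error
        d = np.convolve(x, [0.8, -0.4, 0.2])[:n]   # disturbance at the error sensor

        w = np.zeros(L)                       # adaptive control filter
        xbuf = np.zeros(L)                    # reference history
        ybuf = np.zeros(len(s))               # anti-noise history
        fxbuf = np.zeros(L)                   # filtered-reference history
        mu = 5e-4
        for k in range(n):
            xbuf = np.concatenate(([x[k]], xbuf[:-1]))
            y = w @ xbuf                      # controller output
            ybuf = np.concatenate(([y], ybuf[:-1]))
            e = d[k] + s @ ybuf               # residual at the error sensor
            fx = s_hat @ xbuf[:len(s_hat)]    # reference filtered through the model
            fxbuf = np.concatenate(([fx], fxbuf[:-1]))
            w -= mu * e * fxbuf               # FXLMS weight update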

  5. State space modelling and data analysis exercises in LISA Pathfinder

    CERN Document Server

    Nofrarias, M; Armano, M; Audley, H; Auger, G; Benedetti, M; Binetruy, P; Bogenstahl, J; Bortoluzzi, D; Bosetti, P; Brandt, N; Caleno, M; Cañizares, P; Cavalleri, A; Cesa, M; Chmeissani, M; Conchillo, A; Congedo, G; Cristofolin, I; Cruise, M; Danzmann, K; De Marchi, F; Diaz-Aguilo, M; Diepholz, I; Dixon, G; Dolesi, R; Dunbar, N; Fauste, J; Ferraioli, L; Ferroni, V; Fichter, W; Fitzsimons, E; Freschi, M; Marin, A García; Marirrodriga, C García; Gerndt, R; Gesa, L; Gibert, F; Giardini, D; Grimani, C; Grynagier, A; Guillaume, B; Guzmán, F; Harrison, I; Heinzel, G; Hernández, V; Hewitson, M; Hollington, D; Hough, J; Hoyland, D; Hueller, M; Huesler, J; Jennrich, O; Jetzer, P; Johlander, B; Killow, C; Llamas, X; Lloro, I; Lobo, A; Maarschalkerweerd, R; Madden, S; Mance, D; Mateos, I; McNamara, P W; Mendes, J; Mitchell, E; Monsky, A; Nicolini, D; Nicolodi, D; Pedersen, F; Perreur-Lloyd, M; Plagnol, E; Prat, P; Racca, G D; Ramos-Castro, J; Reiche, J; Perez, J A Romera; Robertson, D; Rozemeijer, H; Sanjuan, J; Schleicher, A; Schulte, M; Shaul, D; Stagnaro, L; Strandmoe, S; Steier, F; Sumner, T J; Taylor, A; Texier, D; Trenkel, C; Tu, H-B; Vitale, S; Wanner, G; Ward, H; Waschke, S; Wass, P; Weber, W J; Ziegler, T; Zweifel, P

    2013-01-01

    LISA Pathfinder is a mission planned by the European Space Agency to test the key technologies that will allow the detection of gravitational waves in space. The instrument on-board, the LISA Technology package, will undergo an exhaustive campaign of calibrations and noise characterisation campaigns in order to fully describe the noise model. Data analysis plays an important role in the mission and for that reason the data analysis team has been developing a toolbox which contains all the functionalities required during operations. In this contribution we give an overview of recent activities, focusing on the improvements in the modelling of the instrument and in the data analysis campaigns performed both with real and simulated data.

  6. State Space Modelling and Data Analysis Exercises in LISA Pathfinder

    Science.gov (United States)

    Nofrarias, M.; Antonucci, F.; Armano, M.; Audley, H.; Auger, G.; Benedetti, M.; Binetruy, P.; Bogenstahl, J.; Bortoluzzi, D.; Brandt, N.; Caleno, M.; Cavalleri, A.; Congedo, G.; Cruise, M.; Danzmann, K.; De Marchi, F.; Diaz-Aguilo, M.; Diepholz, I.; Dixon, G.; Dolesi, R.; Dunbar, N.; Fauste, J.; Ferraioli, L.; Ferroni, V.; Fichter, W.; Fitzsimons, E.; Freschi, M.; García Marirrodriga, C.; Gerndt, R.; Gesa, L.; Gibert, F.; Giardini, D.; Grimani, C.; Grynagier, A.; Guzmán, F.; Harrison, I.; Heinzel, G.; Hewitson, M.; Hollington, D.; Hoyland, D.; Hueller, M.; Huesler, J.; Jennrich, O.; Jetzer, P.; Johlander, B.; Karnesis, N.; Korsakova, N.; Killow, C.; Llamas, X.; Lloro, I.; Lobo, A.; Maarschalkerweerd, R.; Madden, S.; Mance, D.; Martin, V.; Mateos, I.; McNamara, P.; Mendes, J.; Mitchell, E.; Nicolodi, D.; Perreur-Lloyd, M.; Plagnol, E.; Prat, P.; Ramos-Castro, J.; Reiche, J.; Romera Perez, J. A.; Robertson, D.; Rozemeijer, H.; Russano, G.; Schleicher, A.; Shaul, D.; Sopuerta, C. F.; Sumner, T. J.; Taylor, A.; Texier, D.; Trenkel, C.; Tu, H. B.; Vitale, S.; Wanner, G.; Ward, H.; Waschke, S.; Wass, P.; Wealthy, D.; Wen, S.; Weber, W.; Ziegler, T.; Zweifel, P.

    2013-01-01

    LISA Pathfinder is a mission planned by the European Space Agency (ESA) to test the key technologies that will allow the detection of gravitational waves in space. The instrument on-board, the LISA Technology package, will undergo an exhaustive campaign of calibrations and noise characterisation campaigns in order to fully describe the noise model. Data analysis plays an important role in the mission and for that reason the data analysis team has been developing a toolbox which contains all the functionality required during operations. In this contribution we give an overview of recent activities, focusing on the improvements in the modelling of the instrument and in the data analysis campaigns performed both with real and simulated data.

  7. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Directory of Open Access Journals (Sweden)

    Kiuru Aaro

    2003-01-01

    The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT) and nuclear medicine (NM) studies. This fluoroscopical examination takes only about 2 seconds for a perfusion study, with only a low radiation dose to the patient, involving no preparation, no radioactive isotopes, and no contrast media.

  8. Patent portfolio analysis model based on legal status information

    Institute of Scientific and Technical Information of China (English)

    Xuezhao WANG; Yajuan ZHAO; Jing ZHANG; Ping ZHAO

    2014-01-01

    Purpose: This research proposes a patent portfolio analysis model based on legal status information to chart out a competitive landscape in a particular field, enabling organizations to position themselves within the overall technology landscape. Design/methodology/approach: Three indicators were selected for the proposed model: patent grant rate, valid patents rate and patent maintenance period. The model uses legal status information to perform a qualitative evaluation of the relative values of individual patents, countries' or regions' technological capabilities and the competitiveness of patent applicants. The results are visualized by a four-quadrant bubble chart. To test the effectiveness of the model, it is used to present a competitive landscape in the lithium ion battery field. Findings: The model can be used to evaluate the values of individual patents, highlight countries' or regions' positions in the field, and rank the competitiveness of patent applicants in the field. Research limitations: The model currently takes into consideration only three legal status indicators. It is actually feasible to introduce more indicators, such as the reason for invalid patents and the distribution of patent maintenance time, and associate them with those in the proposed model. Practical implications: Analysis of legal status information in combination with patent application information can help an organization to spot gaps in its patent claim coverage, as well as evaluate the patent quality and maintenance situation of its granted patents. The study results can be used to support technology assessment, technology innovation and intellectual property management. Originality/value: Prior studies attempted to assess patent quality or competitiveness by using either a single patent legal status indicator or comparative analysis of the impacts of each indicator. However, they are insufficient in presenting the combined effects of the evaluation indicators. Using our model, it appears possible to get a

  9. Uncertainty analysis of fluvial outcrop data for stochastic reservoir modelling

    Energy Technology Data Exchange (ETDEWEB)

    Martinius, A.W. [Statoil Research Centre, Trondheim (Norway); Naess, A. [Statoil Exploration and Production, Stjoerdal (Norway)

    2005-07-01

    Uncertainty analysis and reduction is a crucial part of stochastic reservoir modelling and fluid flow simulation studies. Outcrop analogue studies are often employed to define reservoir model parameters but the analysis of uncertainties associated with sedimentological information is often neglected. In order to define uncertainty inherent in outcrop data more accurately, this paper presents geometrical and dimensional data from individual point bars and braid bars, from part of the low net:gross outcropping Tortola fluvial system (Spain) that has been subjected to a quantitative and qualitative assessment. Four types of primary outcrop uncertainties are discussed: (1) the definition of the conceptual depositional model; (2) the number of observations on sandstone body dimensions; (3) the accuracy and representativeness of observed three-dimensional (3D) sandstone body size data; and (4) sandstone body orientation. Uncertainties related to the depositional model are the most difficult to quantify but can be appreciated qualitatively if processes of deposition related to scales of time and the general lack of information are considered. Application of the N0 measure is suggested to assess quantitatively whether a statistically sufficient number of dimensional observations is obtained to reduce uncertainty to an acceptable level. The third type of uncertainty is evaluated in a qualitative sense and determined by accurate facies analysis. The orientation of sandstone bodies is shown to influence spatial connectivity. As a result, an insufficient number or quality of observations may have important consequences for estimated connected volumes. This study will give improved estimations for reservoir modelling. (author)

  10. Predicate Argument Structure Analysis for Use Case Description Modeling

    Science.gov (United States)

    Takeuchi, Hironori; Nakamura, Taiga; Yamaguchi, Takahira

    In a large software system development project, many documents are prepared and updated frequently. In such a situation, support is needed for looking through these documents easily to identify inconsistencies and to maintain traceability. In this research, we focus on the requirements documents such as use cases and consider how to create models from the use case descriptions in unformatted text. In the model construction, we propose a few semantic constraints based on the features of the use cases and use them for a predicate argument structure analysis to assign semantic labels to actors and actions. With this approach, we show that we can assign semantic labels without enhancing any existing general lexical resources such as case frame dictionaries and design a less language-dependent model construction architecture. By using the constructed model, we consider a system for quality analysis of the use cases and automated test case generation to keep the traceability between document sets. We evaluated the reuse of the existing use cases and generated test case steps automatically with the proposed prototype system from real-world use cases in the development of a system using a packaged application. Based on the evaluation, we show how to construct models with high precision from English and Japanese use case data. Also, we could generate good test cases for about 90% of the real use cases through the manual improvement of the descriptions based on the feedback from the quality analysis system.

  11. A fuzzy set preference model for market share analysis

    Science.gov (United States)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    Consumer preference models are widely used in new product design, marketing management, pricing, and market segmentation. The success of new products depends on accurate market share prediction and design decisions based on consumer preferences. The vague linguistic nature of consumer preferences and product attributes, combined with the substantial differences between individuals, creates a formidable challenge to marketing models. The most widely used methodology is conjoint analysis. Conjoint models, as currently implemented, represent linguistic preferences as ratio or interval-scaled numbers, use only numeric product attributes, and require aggregation of individuals for estimation purposes. It is not surprising that these models are costly to implement, are inflexible, and have a predictive validity that is not substantially better than chance. This affects the accuracy of market share estimates. A fuzzy set preference model can easily represent linguistic variables either in consumer preferences or product attributes with minimal measurement requirements (ordinal scales), while still estimating overall preferences suitable for market share prediction. This approach results in flexible individual-level conjoint models which can provide more accurate market share estimates from a smaller number of more meaningful consumer ratings. Fuzzy sets can be incorporated within existing preference model structures, such as a linear combination, using the techniques developed for conjoint analysis and market share estimation. The purpose of this article is to develop and fully test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation), and how much to make (market share

  12. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strength, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future

  13. A multiserver multiqueue network: modeling and performance analysis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A new category of system model, the multiserver multiqueue network (MSMQN), is proposed for distributed systems such as geographically distributed Web-server clusters. A MSMQN comprises multiple multiserver multiqueue (MSMQ) nodes distributed over the network, and every node consists of a number of servers that each contains multiple priority queues for waiting customers. An incoming request can be distributed to a waiting queue of any server in any node, according to the routing policy integrated by the node-selection policy at network level, the request-dispatching policy at node level, and the request-scheduling policy at server level. The model is investigated using stochastic high-level Petri net (SHLPN) modeling and performance analysis techniques. The performance metrics concerned include the delay time of requests in the MSMQ node and the response time perceived by the users. The numerical example shows the efficiency of the performance analysis technique.

  14. Analysis of DIRAC's behavior using model checking with process algebra

    CERN Document Server

    Remenska, Daniela; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Diaz, Ricardo Graciani; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof

    2012-01-01

    DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple, the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for analysis of an abstract model of a system. Unlike con...

  15. A multiserver multiqueue network:modeling and performance analysis

    Institute of Scientific and Technical Information of China (English)

    Zhiguang Shan; Yang Yang; et al.

    2002-01-01

    A new category of system model, the multiserver multiqueue network (MSMQN), is proposed for distributed systems such as geographically distributed Web-server clusters. A MSMQN comprises multiple multiserver multiqueue (MSMQ) nodes distributed over the network, and every node consists of a number of servers that each contains multiple priority queues for waiting customers. An incoming request can be distributed to a waiting queue of any server in any node, according to the routing policy integrated by the node-selection policy at network level, the request-dispatching policy at node level, and the request-scheduling policy at server level. The model is investigated using stochastic high-level Petri net (SHLPN) modeling and performance analysis techniques. The performance metrics concerned include the delay time of requests in the MSMQ node and the response time perceived by the users. The numerical example shows the efficiency of the performance analysis technique.

  16. Sensitivity analysis in a Lassa fever deterministic mathematical model

    Science.gov (United States)

    Abdullahi, Mohammed Baba; Doko, Umar Chado; Mamuda, Mamman

    2015-05-01

    Lassa virus, which causes Lassa fever, is on the list of potential bio-weapons agents. It was recently imported into Germany, the Netherlands, the United Kingdom and the United States as a consequence of the rapid growth of international traffic. A model with five mutually exclusive compartments related to Lassa fever is presented and the basic reproduction number analyzed. A sensitivity analysis of the deterministic model is performed in order to determine the relative importance of the model parameters to disease transmission. The result of the sensitivity analysis shows that the most sensitive parameter is human immigration, followed by the human recovery rate, and then person-to-person contact. This suggests that control strategies should target human immigration, effective drugs for treatment, and education to reduce person-to-person contact.
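
    Sensitivity analyses of this kind typically report the normalized forward sensitivity index Lambda_p = (dR0/dp)(p/R0). A numerical sketch with a purely illustrative R0 expression and parameter values (the paper's actual Lassa model is not reproduced here):

        # Lambda_p = (dR0/dp) * (p / R0), estimated by forward differences.
        def R0(beta, gamma, Lam, mu):
            return beta * Lam / (mu * (gamma + mu))   # illustrative only

        params = dict(beta=0.3, gamma=0.1, Lam=50.0, mu=0.02)
        base = R0(**params)
        h = 1e-6
        for name, val in params.items():
            bumped = dict(params, **{name: val * (1 + h)})
            dR0 = (R0(**bumped) - base) / (val * h)
            print(name, round(dR0 * val / base, 3))   # beta and Lam give +1.0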

  17. Analysis of Intensity Fluctuations of SASE using the AR Model

    CERN Document Server

    Kato, Ryukou; Isoyama, Goro; Kashiwagi, Shigeru; Okamoto, C; Suemine, Shoji; Yamamoto, Tamotsu

    2004-01-01

    We are conducting an experimental study on Self-Amplified Spontaneous Emission (SASE) in the far-infrared region using the L-band linac at the Institute of Scientific and Industrial Research (ISIR), Osaka University. The intensity of SASE fluctuates intrinsically because the number of coherent optical pulses generated in an electron bunch is limited. In the actual system, however, another factor producing intensity fluctuations also shows up: instability of the linac. Generally speaking, it is difficult to distinguish the contributions of these two factors in measured intensity fluctuations. We have applied the autoregressive (AR) model, a standard technique of statistical analysis, to exclude the contributions of linac instability from the measured data. In the AR model, the present data point is expressed as a linear combination of past data points plus white noise. In the analysis, contributions of the linac instability are identified with the AR model and can be subtracted from the measured data of SASE,...
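
    An AR(p) model writes the present shot as a linear combination of the p previous shots plus white noise; fitting the coefficients by least squares separates the predictable (instability-driven) part from the intrinsic fluctuation. A minimal sketch on synthetic data, not the experiment's:

        import numpy as np

        def fit_ar(x, p):
            # Least-squares AR(p): x[t] = a1*x[t-1] + ... + ap*x[t-p] + noise
            X = np.column_stack([x[p - i - 1:len(x) - i - 1] for i in range(p)])
            y = x[p:]
            a, *_ = np.linalg.lstsq(X, y, rcond=None)
            return a, (y - X @ a).var()       # coefficients, residual variance

        rng = np.random.default_rng(2)
        drift = np.cumsum(0.01 * rng.normal(size=500))   # slow linac-like drift
        shots = drift + rng.normal(size=500)             # measured intensities
        a, sigma2 = fit_ar(shots, p=3)    # sigma2 estimates the intrinsic part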

  18. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
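
    The decomposition idea can be shown with Fourier descriptors (the Zernike route needs more machinery, but the principle is the same: a handful of coefficients stand in for 10^5-10^6 pixels). A sketch on a stand-in strain map; the random field and cutoff are illustrative:

        import numpy as np

        field = np.random.default_rng(3).normal(size=(256, 256))  # stand-in strain map
        F = np.fft.fftshift(np.fft.fft2(field))     # centre the low frequencies
        c, k = field.shape[0] // 2, 4
        descriptors = F[c - k:c + k, c - k:c + k].flatten()  # 64 coefficients

        # The experiment-vs-model comparison then happens between two short
        # descriptor vectors rather than between full pixel maps.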

  19. Comparative analysis of calculation models of railway subgrade

    Directory of Open Access Journals (Sweden)

    I.O. Sviatko

    2013-08-01

    Purpose. In the design of transport engineering structures, the primary task is to determine the parameters of the foundation soil and the nuances of its behaviour under loads. When calculating the interaction between the soil subgrade and the upper track structure, it is very important to determine the parameters of shear resistance and the parameters governing the development of deep deformations in foundation soils. The aim is a search for generalized numerical modeling methods for embankment foundation soil behaviour that include not only the analysis of the foundation stress state but also of its deformed state. Methodology. An analysis of existing modern and classical methods of numerical simulation of soil samples under static load was made. Findings. According to traditional methods of analysis of the behaviour of ground masses, limitation and qualitative estimation of subgrade deformations is possible only indirectly, through the estimation of stress and comparison of the received values with the boundary ones. Originality. A new computational model is proposed in which not only the classical analysis of the soil subgrade stress state is applied, but the deformed state is also taken into account. Practical value. The analysis showed that for an accurate analysis of the behaviour of ground masses it is necessary to develop a generalized methodology for analyzing the rolling stock-railway subgrade interaction, which will use not only the classical approach of analyzing the soil subgrade stress state, but also take into account its deformed state.

  20. Urban Sprawl Analysis and Modeling in Asmara, Eritrea

    Directory of Open Access Journals (Sweden)

    Mussie G. Tewolde

    2011-09-01

    The extension of the urban perimeter markedly cuts into available productive land. Hence, studies in urban sprawl analysis and modeling play an important role in ensuring sustainable urban development. The urbanization pattern of the Greater Asmara Area (GAA), the capital of Eritrea, was studied. Satellite images and geospatial tools were employed to analyze the spatiotemporal urban landuse changes. Object-Based Image Analysis (OBIA), Landuse Cover Change (LUCC) analysis and urban sprawl analysis using Shannon Entropy were carried out. The Land Change Modeler (LCM) was used to develop a model of urban growth. The Multi-layer Perceptron Neural Network was employed to model the transition potential maps with an accuracy of 85.9%, and these were used as an input for the 'actual' urban modeling with Markov chains. Model validation was assessed and a scenario of urban landuse change of the GAA up to the year 2020 was presented. The results of the study indicated that the built-up area tripled in size (increased by 4,441 ha) between 1989 and 2009. Especially after the year 2000, urban sprawl in the GAA caused large-scale encroachment on high-potential agricultural land and plantation cover. The scenario for the year 2020 shows an increase of the built-up areas by 1,484 ha (25%), which may cause further loss. The study indicated that the land allocation system in the GAA overrode the landuse plan, which caused the loss of agricultural land and plantation cover. The recommended policy options might support decision makers in preventing further loss of agricultural land and plantation cover and in achieving sustainable urban development planning in the GAA.
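
    The sprawl measure used above is the relative Shannon entropy of built-up area across n analysis zones, H = -(sum p_i ln p_i) / ln n; values near 1 indicate dispersed sprawl, values near 0 a compact city. A sketch with made-up zone figures:

        import numpy as np

        builtup_ha = np.array([820, 410, 260, 190, 150, 120, 90, 60])  # per zone
        p = builtup_ha / builtup_ha.sum()
        H = -(p * np.log(p)).sum() / np.log(len(p))
        print(round(H, 3))   # values approaching 1 would signal strong sprawl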

  1. Constructing Maximum Entropy Language Models for Movie Review Subjectivity Analysis

    Institute of Scientific and Technical Information of China (English)

    Bo Chen; Hui He; Jun Guo

    2008-01-01

    Document subjectivity analysis has become an important aspect of web text content mining. This problem is similar to traditional text categorization, thus many related classification techniques can be adapted here. However, there is one significant difference: more language or semantic information is required for better estimating the subjectivity of a document. Therefore, in this paper, our focus is mainly on two aspects. One is how to extract useful and meaningful language features, and the other is how to construct appropriate language models efficiently for this special task. For the first issue, we conduct a Global-Filtering and Local-Weighting strategy to select and evaluate language features in a series of n-grams with different orders and within various distance windows. For the second issue, we adopt Maximum Entropy (MaxEnt) modeling methods to construct our language model framework. Besides the classical MaxEnt models, we have also constructed two kinds of improved models with Gaussian and exponential priors, respectively. Detailed experiments given in this paper show that with well-selected and weighted language features, MaxEnt models with exponential priors are significantly more suitable for the text subjectivity analysis task.
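
    In this setting, a maximum-entropy classifier with a Gaussian prior on the weights is equivalent to L2-regularized logistic regression (an exponential prior acts as a one-sided L1-style penalty). A sketch with placeholder documents, using n-gram features in the spirit of the paper rather than its exact feature-selection strategy:

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        docs = ["a masterpiece , moving and beautiful",
                "dull plot , i think it wastes two hours"]
        labels = [1, 0]                            # 1 = subjective/positive review

        clf = make_pipeline(
            CountVectorizer(ngram_range=(1, 3)),   # n-gram language features
            LogisticRegression(penalty="l2", C=1.0, max_iter=1000),  # Gaussian prior
        )
        clf.fit(docs, labels)
        print(clf.predict(["a moving story"]))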

  2. An Effective Distributed Model for Power System Transient Stability Analysis

    Directory of Open Access Journals (Sweden)

    MUTHU, B. M.

    2011-08-01

    Modern power systems consist of many interconnected synchronous generators having different inertia constants, connected to a large transmission network, with an ever increasing demand for power exchange. The size of the power system grows exponentially due to the increase in power demand. The data required for various power system applications have been stored in different formats in a heterogeneous environment. The power system applications themselves have been developed and deployed on different platforms and in different language paradigms. Interoperability between power system applications becomes a major issue because of this heterogeneous nature. The main aim of the paper is to develop a generalized distributed model for carrying out power system stability analysis. A flexible and loosely coupled JAX-RPC model has been developed for representing transient stability analysis in large interconnected power systems. The proposed model includes Pre-Fault, During-Fault, Post-Fault and Swing Curve services which are accessible to remote power system clients when the system is subjected to large disturbances. A generalized XML-based model for data representation has also been proposed for exchanging data in order to enhance the interoperability between legacy power system applications. The performance measure, Round Trip Time (RTT), is estimated for different power systems using the proposed JAX-RPC model and compared with the results obtained using traditional client-server and Java RMI models.

  3. Analysis of multinomial models with unknown index using data augmentation

    Science.gov (United States)

    Royle, J. Andrew; Dorazio, R.M.; Link, W.A.

    2007-01-01

    Multinomial models with unknown index ('sample size') arise in many practical settings. In practice, Bayesian analysis of such models has proved difficult because the dimension of the parameter space is not fixed, being in some cases a function of the unknown index. We describe a data augmentation approach to the analysis of this class of models that provides for a generic and efficient Bayesian implementation. Under this approach, the data are augmented with all-zero detection histories. The resulting augmented dataset is modeled as a zero-inflated version of the complete-data model where an estimable zero-inflation parameter takes the place of the unknown multinomial index. Interestingly, data augmentation can be justified as being equivalent to imposing a discrete uniform prior on the multinomial index. We provide three examples involving estimating the size of an animal population, estimating the number of diabetes cases in a population using the Rasch model, and the motivating example of estimating the number of species in an animal community with latent probabilities of species occurrence and detection.
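
    The mechanics can be sketched for the population-size example: pad the n observed detection histories with all-zero histories up to a large M, and let a zero-inflation parameter psi absorb the unknown index. A toy Gibbs sampler under simple conjugate priors; all numbers and the specific sampler are illustrative, not the paper's implementation:

        import numpy as np

        rng = np.random.default_rng(5)
        J, p_true, N_true = 5, 0.3, 120               # occasions, detection prob, truth
        y_all = rng.binomial(J, p_true, size=N_true)
        y_obs = y_all[y_all > 0]                      # only detected individuals are seen
        M = 400                                       # augmented "super-population" size
        y = np.concatenate([y_obs, np.zeros(M - len(y_obs))])

        z = (y > 0).astype(float)                     # latent "is real" indicators
        psi, p, draws = 0.5, 0.5, []
        for it in range(2000):
            # update z for the all-zero histories (zero-inflation step)
            pz = psi * (1 - p) ** J / (psi * (1 - p) ** J + 1 - psi)
            z[y == 0] = rng.random((y == 0).sum()) < pz
            # conjugate Beta updates for psi and p
            psi = rng.beta(1 + z.sum(), 1 + M - z.sum())
            p = rng.beta(1 + y[z == 1].sum(), 1 + z.sum() * J - y[z == 1].sum())
            draws.append(z.sum())                     # posterior draw of N
        print(np.mean(draws[500:]))                   # posterior mean of N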

  4. Porflow modeling supporting the FY14 salstone special analysis

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Taylor, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2014-04-01

    PORFLOW related analyses supporting the Saltstone FY14 Special Analysis (SA) described herein are based on prior modeling supporting the Saltstone FY13 SA. Notable changes to the previous round of simulations include: a) consideration of Saltstone Disposal Unit (SDU) design type 6 under “Nominal” and “Margin” conditions, b) omission of the clean cap fill from the nominal SDU 2 and 6 modeling cases as a reasonable approximation of greater waste grout fill heights, c) minor updates to the cementitious materials degradation analysis, d) use of updated I-129 sorption coefficient (Kd) values in soils, e) assignment of the pH/Eh environment of saltstone to the underlying floor concrete, considering down flow through an SDU, and f) implementation of an improved sub-model for Tc release in an oxidizing environment. These new model developments are discussed and followed by a cursory presentation of simulation results. The new Tc release sub-model produced significantly improved (smoother) flux results compared to the FY13 SA. Further discussion of PORFLOW model setup and simulation results will be presented in the FY14 SA, including dose results.

  5. Analysis of effect factors-based stochastic network planning model

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Looking at all the indeterminate factors as a whole and regarding activity durations as independent random variables, the traditional stochastic network planning models ignore the inevitable relationship and dependence among activity durations when more than one activity may be affected by the same indeterminate factors. On the basis of an analysis of the indeterminate factors affecting durations, the effect factors-based stochastic network planning (EFBSNP) model is proposed, which emphasizes the effects on the project period of not only logistic and organizational relationships, but also the dependence relationships among activity durations caused by shared indeterminate factors. By virtue of indeterminate factor analysis the model extracts and describes the indeterminate effect factors quantitatively, and then takes the effect of indeterminate factors on the schedule into account by using the Monte Carlo simulation technique. The method is flexible enough to deal with effect factors and is consistent with practice. Software has been developed in Visual Studio .NET to simplify the model-based calculation. Finally, a case study is included to demonstrate the applicability of the proposed model, and a comparison is made showing some advantages over existing models.
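
    The core idea — durations that share an indeterminate factor must be sampled as dependent, not independent, variables — can be sketched in a few lines of Monte Carlo. The two-path network and factor loadings below are invented:

        import numpy as np

        rng = np.random.default_rng(4)
        n = 10_000
        weather = rng.normal(size=n)                    # shared effect factor
        dA = 10 + 1.0 * weather + rng.normal(size=n)    # activities A and B both
        dB = 12 + 1.5 * weather + rng.normal(size=n)    # load on the weather factor
        dC = 8 + rng.normal(size=n)                     # C is unaffected

        T = np.maximum(dA + dB, dC)       # duration of paths A->B and C in parallel
        print(T.mean(), np.percentile(T, 90))   # the shared factor widens the
        # spread relative to the independent-durations assumption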

  6. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  7. A Spatial Lattice Model Applied for Meteorological Visualization and Analysis

    Directory of Open Access Journals (Sweden)

    Mingyue Lu

    2017-03-01

    Meteorological information has obvious spatial-temporal characteristics. Although it is meaningful to employ a geographic information system (GIS) to visualize and analyze meteorological information for better identification and forecasting of meteorological weather so as to reduce meteorological disaster losses, modeling meteorological information based on a GIS is still difficult because meteorological elements generally have no stable shape or clear boundary. To date, there are still few GIS models that can satisfy the requirements of both meteorological visualization and analysis. In this article, a spatial lattice model based on sampling particles is proposed to support both the representation and analysis of meteorological information. In this model, a spatial sampling particle is regarded as the basic element that contains the meteorological information and the location where the particle is placed, together with a time mark. The location information is generally represented using a point. As these points can be extended to a surface in two dimensions and a voxel in three dimensions, if these surfaces and voxels can occupy a certain space, then this space can be represented using these spatial sampling particles with their point locations and meteorological information. The full meteorological space can then be represented by arranging numerous particles with their point locations in a certain structure and resolution, i.e., the spatial lattice model, and extended at a higher resolution when necessary. For practical use, the meteorological space is logically classified into three types of spaces, namely the projection surface space, the curved surface space, and the stereoscopic space, and application-oriented spatial lattice models with different organization forms of spatial sampling particles are designed to support the representation, inquiry, and analysis of meteorological information within the three types of surfaces. Cases

  8. Epistasis analysis for quantitative traits by functional regression model.

    Science.gov (United States)

    Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao

    2014-06-01

    The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for testing the interaction between common variants and are difficult to apply to rare variants because of their prohibitive computational time and poor ability. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm of interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions and collectively test interactions between all possible pairs of SNPs within two genomic regions. In other words, we take a genome region as a basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model to collectively test interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genome regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of the quantitative trait have the correct type I error rates and a much better ability to detect interactions than the current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10^-10) in the ESP, and 11 were replicated in the CHARGE-S study.

  9. Assessing climate model software quality: a defect density analysis of three models

    Directory of Open Access Journals (Sweden)

    J. Pipitone

    2012-08-01

    A climate model is an executable theory of the climate; the model encapsulates climatological theories in software so that they can be simulated and their implications investigated. Thus, in order to trust a climate model, one must trust that the software it is built from is built correctly. Our study explores the nature of software quality in the context of climate modelling. We performed an analysis of defect reports and defect fixes in several versions of leading global climate models by collecting defect data from bug tracking systems and version control repository comments. We found that the climate models all have very low defect densities compared to well-known, similarly sized open-source projects. We discuss the implications of our findings for the assessment of climate model software trustworthiness.
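
    Defect density itself is a simple ratio, typically reported defects per thousand source lines of code (KLOC). A sketch with placeholder counts, not the paper's data:

        def defect_density(defects: int, sloc: int) -> float:
            # reported defects per thousand source lines of code (KLOC)
            return defects / (sloc / 1000.0)

        print(defect_density(defects=84, sloc=400_000))   # 0.21 defects/KLOC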

  10. Renewable Energy and Efficiency Modeling Analysis Partnership (REMAP): An Analysis of How Different Energy Models Addressed a Common High Renewable Energy Penetration Scenario in 2025

    Energy Technology Data Exchange (ETDEWEB)

    Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Jenkin, Thomas [National Renewable Energy Lab. (NREL), Golden, CO (United States); Milford, James [National Renewable Energy Lab. (NREL), Golden, CO (United States); Short, Walter [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Evans, David [US Environmental Protection Agency (EPA), Cincinnati, OH (United States); Lieberman, Elliot [US Environmental Protection Agency (EPA), Cincinnati, OH (United States); Goldstein, Gary [International Resources Group, Washington, DC (United States); Wright, Evelyn [International Resources Group, Washington, DC (United States); Jayaraman, Kamala R. [ICF International, Fairfax, VA (United States); Venkatesh, Boddu [ICF International, Fairfax, VA (United States); Kleiman, Gary [Northeast States for Coordinated Air Use Management, Boston, MA (United States); Namovicz, Christopher [Energy Information Administration, Washington, DC (United States); Smith, Bob [Energy Information Administration, Washington, DC (United States); Palmer, Karen [Resources of the Future, Washington, DC (United States); Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wood, Frances [OnLocation Inc., Vienna, VA (United States)

    2009-09-01

    Energy system modeling can be intentionally or unintentionally misused by decision-makers. This report describes how both can be minimized through careful use of models and thorough understanding of their underlying approaches and assumptions. The analysis summarized here assesses the impact that model and data choices have on forecasting energy systems by comparing seven different electric-sector models. This analysis was coordinated by the Renewable Energy and Efficiency Modeling Analysis Partnership (REMAP), a collaboration among governmental, academic, and nongovernmental participants.

  11. Biological exposure models for oil spill impact analysis

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The oil spill impact analysis (OSIA) software system has been developed to supply a tool for comprehensive, quantitative environmental impact assessments resulting from oil spills. In the system, a biological component evaluates potential effects on exposed organisms based on results from a physico-chemical fates component, including the extent and characteristics of the surface slick, and dissolved and total concentrations of hydrocarbons in the water column. The component includes a particle-based exposure model for migratory adult fish populations, a particle-based exposure model for spawning planktonic organisms (eggs and larvae), and an exposure model for wildlife species (sea birds or marine mammals). The exposure model for migratory adult fish populations simulates the migration behaviors of fish populations migrating to or staying in their feeding areas, over-wintering areas or spawning areas, and determines the acute effects (mortality) and chronic accumulation (body burdens) from the dissolved contaminant. The exposure model for spawning planktonic organisms simulates the release of eggs and larvae, also as particles, from specific spawning areas during the spawning period, and determines their potential exposure to contaminants in the water or sediment. The exposure model for wildlife species calculates the exposure to surface oil of wildlife (bird and marine mammal) categories inhabiting the contaminated area. Compared with earlier models, in which all kinds of organisms are assumed to be evenly and randomly distributed, the updated biological exposure models can more realistically estimate the potential effects of oil spill pollution events on the marine ecosystem.

  12. SMV model-based safety analysis of software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Seong, Poong Hyun [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)], E-mail: phseong@kaist.ac.kr

    2009-02-15

    Fault tree analysis (FTA) is one of the most frequently applied safety analysis techniques when developing safety-critical industrial systems such as software-based emergency shutdown systems of nuclear power plants and has been used for safety analysis of software requirements in the nuclear industry. However, the conventional method for safety analysis of software requirements has several problems in terms of correctness and efficiency; the fault tree generated from natural language specifications may contain flaws or errors while the manual work of safety verification is very labor-intensive and time-consuming. In this paper, we propose a new approach to resolve problems of the conventional method; we generate a fault tree from a symbolic model verifier (SMV) model, not from natural language specifications, and verify safety properties automatically, not manually, by a model checker SMV. To demonstrate the feasibility of this approach, we applied it to shutdown system 2 (SDS2) of Wolsong nuclear power plant (NPP). In spite of subtle ambiguities present in the approach, the results of this case study demonstrate its overall feasibility and effectiveness.

  13. A Suite of Tools for ROC Analysis of Spatial Models

    Directory of Open Access Journals (Sweden)

    Hermann Rodrigues

    2013-09-01

    Full Text Available The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of the Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), partial AUC, lower and upper AUCs, the confidence interval of the AUC, the density of events in probability bins, and tests to evaluate the difference between the AUCs of two models. We first present the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
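
    For readers unfamiliar with the underlying computation, a minimal numpy sketch of the two core statistics the tools report (the ROC curve and its AUC) is given below; the scores and labels are synthetic stand-ins, not output from Dinamica EGO, and tie handling is simplified.

```python
import numpy as np

def roc_curve(scores, labels):
    """ROC points (FPR, TPR) obtained by sweeping a threshold over the scores."""
    order = np.argsort(-scores)                       # descending score order
    labels = labels[order]
    tpr = np.cumsum(labels) / labels.sum()            # true positive rate
    fpr = np.cumsum(1 - labels) / (1 - labels).sum()  # false positive rate
    return np.concatenate([[0.0], fpr]), np.concatenate([[0.0], tpr])

def auc(fpr, tpr):
    """Area under the ROC curve by trapezoidal integration."""
    return np.trapz(tpr, fpr)

# toy example: 1 = observed event (e.g., a cell that changed land use)
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, 200).astype(float)
scores = 0.5 * labels + rng.random(200)   # scores loosely correlated with labels
f, t = roc_curve(scores, labels)
print(f"AUC = {auc(f, t):.3f}")
```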

  14. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications

    Science.gov (United States)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray

    2012-01-01

    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system-level environmental testing. It required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, and a test program and facility at JPL for testing system parts and subsystems. The magnetic modeling, simulation and analysis capability was set up and operated by Aerospace to provide qualitative and quantitative magnetic assessments of magnetic parts, components, and subsystems prior to, or in lieu of, magnetic tests. Because of the sensitive nature of the fields and particles measurements being conducted by the JUNO mission to Jupiter, stringent magnetic control specifications were imposed, requiring a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not contaminated by flight system magnetic interference. With Aerospace's magnetic modeling, simulation and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project achieved a cost-effective path to a magnetically clean spacecraft. This paper presents lessons learned from the JUNO magnetic testing approach and Aerospace's modeling, simulation and analysis activities, which were used to solve problems such as remanent magnetization and the performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.

  15. Failure Propagation Modeling and Analysis via System Interfaces

    Directory of Open Access Journals (Sweden)

    Lin Zhao

    2016-01-01

    Full Text Available Safety-critical systems must be shown to be acceptably safe to deploy and use in their operational environment. One of the key concerns in developing safety-critical systems is to understand how the system behaves in the presence of failures, regardless of whether a failure is triggered by the external environment or caused by internal errors. Safety assessment at the early stages of system development involves analysis of potential failures and their consequences. Increasingly, for complex systems, model-based safety assessment is becoming more widely used. In this paper we propose an approach to safety analysis based on system interface models. By extending interaction models at the system interface level with failure modes, as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the failure analysis. We focus on fault modeling and on how to compute minimal cut sets. In particular, we explore a state space reconstruction strategy and a bounded search technique to reduce the number of states that need to be analyzed, which markedly improves the efficiency of the cut set search algorithm.

  16. Improving Credit Scorecard Modeling Through Applying Text Analysis

    Directory of Open Access Journals (Sweden)

    Omar Ghailan

    2016-04-01

    Full Text Available In credit card scoring and loan management, the prediction of an applicant's future behavior is an important decision support tool and a key factor in reducing the risk of loan default. Many data mining and classification approaches have been developed for credit scoring. To the best of our knowledge, building a credit scorecard by analyzing the textual data in the application form has not been explored so far. This paper proposes a comprehensive credit scorecard modeling technique that improves credit scoring through textual data analysis. The study uses a sample of loan application forms from a financial institution providing loan services in Yemen, representing a real-world credit scoring and loan management situation. The sample contains a set of Arabic textual data attributes describing the applicants. A credit scoring model based on text mining pre-processing and logistic regression is proposed and evaluated through a comparison with a group of credit scorecard modeling techniques that use only the numeric attributes in the application form. The results show that adding the textual attribute analysis achieves higher classification effectiveness and outperforms the traditional techniques based on numeric data alone.
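
    As an illustration of the general approach (not the authors' implementation, which works on Arabic application forms), the scikit-learn sketch below combines TF-IDF text features with numeric attributes in a logistic regression; the form fields, texts and labels are hypothetical.

```python
import numpy as np
from scipy.sparse import csr_matrix, hstack
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# hypothetical training data: a free-text field from the application form,
# numeric attributes (e.g., income, loan amount), and default labels
texts = [
    "owns a small retail shop, steady monthly income",
    "no fixed employment, first-time applicant",
    "government employee, long credit history",
    "seasonal farm work, prior late payments",
]
numeric = np.array([[1200.0, 5000.0], [300.0, 8000.0],
                    [1500.0, 4000.0], [400.0, 7000.0]])
defaulted = np.array([0, 1, 0, 1])

vec = TfidfVectorizer()                     # bag-of-words -> TF-IDF weights
X_text = vec.fit_transform(texts)
X = hstack([X_text, csr_matrix(numeric)])   # textual + numeric attributes

model = LogisticRegression().fit(X, defaulted)
print(model.predict_proba(X)[:, 1])         # estimated probability of default
```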

  17. Dynamic Response of Linear Mechanical Systems Modeling, Analysis and Simulation

    CERN Document Server

    Angeles, Jorge

    2012-01-01

    Dynamic Response of Linear Mechanical Systems: Modeling, Analysis and Simulation can be utilized for a variety of courses, including junior and senior-level vibration and linear mechanical analysis courses. The author connects, by means of a rigorous, yet intuitive approach, the theory of vibration with the more general theory of systems. The book features: A seven-step modeling technique that helps structure the rather unstructured process of mechanical-system modeling A system-theoretic approach to deriving the time response of the linear mathematical models of mechanical systems The modal analysis and the time response of two-degree-of-freedom systems—the first step on the long way to the more elaborate study of multi-degree-of-freedom systems—using the Mohr circle Simple, yet powerful simulation algorithms that exploit the linearity of the system for both single- and multi-degree-of-freedom systems Examples and exercises that rely on modern computational toolboxes for both numerical and symbolic computations

  18. Mathematical modeling of isotope labeling experiments for metabolic flux analysis.

    Science.gov (United States)

    Nargund, Shilpa; Sriram, Ganesh

    2014-01-01

    Isotope labeling experiments (ILEs) offer a powerful methodology for metabolic flux analysis. However, the task of interpreting data from these experiments to evaluate flux values requires significant mathematical modeling skills. Toward this end, this chapter provides background information and examples to enable the reader to (1) model metabolic networks, (2) simulate ILEs, and (3) understand the optimization and statistical methods commonly used for flux evaluation. A compartmentalized model of plant glycolysis and the pentose phosphate pathway illustrates the reconstruction of a typical metabolic network, whereas a simpler example network illustrates the underlying metabolite and isotopomer balancing techniques. We also discuss the salient features of the commonly used flux estimation software 13CFLUX2, Metran, NMR2Flux+, FiatFlux, and OpenFLUX. Furthermore, we briefly discuss methods to improve flux estimates. A graphical checklist at the end of the chapter provides the reader with a quick reference to the mathematical modeling concepts and resources.

  19. Mathematical modeling of spinning elastic bodies for modal analysis.

    Science.gov (United States)

    Likins, P. W.; Barbera, F. J.; Baddeley, V.

    1973-01-01

    The problem of modal analysis of an elastic appendage on a rotating base is examined to establish the relative advantages of various mathematical models of elastic structures and to extract general inferences concerning the magnitude and character of the influence of spin on the natural frequencies and mode shapes of rotating structures. In realization of the first objective, it is concluded that except for a small class of very special cases the elastic continuum model is devoid of useful results, while for constant nominal spin rate the distributed-mass finite-element model is quite generally tractable, since in the latter case the governing equations are always linear, constant-coefficient, ordinary differential equations. Although with both of these alternatives the details of the formulation generally obscure the essence of the problem and permit very little engineering insight to be gained without extensive computation, this difficulty is not encountered when dealing with simple concentrated mass models.

  20. Planetary and Interplanetary Environmental Models for Radiation Analysis

    Science.gov (United States)

    DeAngelis, G.; Cucinotta, F. A.

    2005-01-01

    The essentials of environmental modeling for radiation analysis purposes are presented. The variables of fundamental importance for radiation environment assessment are discussed. The characterization is performed by dividing the modeling into three areas, namely the interplanetary medium, the circumplanetary environment, and the planetary or satellite surface. In the first area, galactic cosmic rays (GCR) and their modulation by the heliospheric magnetic field, as well as solar particle events (SPE), are considered; in the second area the magnetospheres are taken into account; and in the third area the effect of the planetary environment is considered. Planetary surfaces and atmospheres are modeled based on results from the most recent targeted spacecraft. The results are coupled with suitable visualization techniques and radiation transport models in support of trade studies of health risks for future exploration missions.

  1. Rotor-Flying Manipulator: Modeling, Analysis, and Control

    Directory of Open Access Journals (Sweden)

    Bin Yang

    2014-01-01

    Full Text Available Equipping a mobile robot with a multijoint manipulator is a typical redesign scheme to let the robot actively influence its surroundings, and it has been extensively used for many ground robots, underwater robots, and space robotic systems. However, such a redesign is difficult for a rotor-flying robot (RFR), mainly because the motion of the manipulator introduces strong coupling between itself and the RFR, which makes the system model highly complicated and the controller design difficult. Thus, in this paper, the modeling, analysis, and control of the combined system, called a rotor-flying multijoint manipulator (RF-MJM), are conducted. First, the detailed dynamics model is constructed and analyzed. Subsequently, a full-state feedback linear quadratic regulator (LQR) controller is designed based on a model linearized near the steady state. Finally, simulations are conducted and the results analyzed to show the basic control performance.
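
    The LQR design step can be sketched as follows; the matrices below are illustrative placeholders, not the linearized RF-MJM model from the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical linearized model x' = A x + B u near the steady state
# (a 2-state stand-in, not the paper's RF-MJM matrices).
A = np.array([[0.0, 1.0],
              [2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # state weighting
R = np.array([[0.1]])      # input weighting

# Solve the continuous-time algebraic Riccati equation and form K = R^-1 B^T P
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)
print("LQR gain K =", K)

# Closed-loop check: eigenvalues of (A - B K) should lie in the left half-plane
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```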

  2. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2012-06-01

    Full Text Available This paper presents a hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this model, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The model includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used not only for hydrological processes, but also for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity.

  3. Statistical models of video structure for content analysis and characterization.

    Science.gov (United States)

    Vasconcelos, N; Lippman, A

    2000-01-01

    Content structure plays an important role in the understanding of video. In this paper, we argue that knowledge about structure can be used both as a means to improve the performance of content analysis and to extract features that convey semantic information about the content. We introduce statistical models for two important components of this structure, shot duration and activity, and demonstrate the usefulness of these models with two practical applications. First, we develop a Bayesian formulation for the shot segmentation problem that is shown to extend the standard thresholding model in an adaptive and intuitive way, leading to improved segmentation accuracy. Second, by applying the transformation into the shot duration/activity feature space to a database of movie clips, we also illustrate how the Bayesian model captures semantic properties of the content. We suggest ways in which these properties can be used as a basis for intuitive content-based access to movie libraries.

  4. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    Full Text Available This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.

  5. GIS application on spatial landslide analysis using statistical based models

    Science.gov (United States)

    Pradhan, Biswajeet; Lee, Saro; Buchroithner, Manfred F.

    2009-09-01

    This paper presents the assessment results of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis at Penang Island in Malaysia. Landslide locations within the study area were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial data sets. Ten landslide-related factors were extracted from the spatial database, and the frequency ratio, fuzzy logic, and bivariate logistic regression coefficients of each factor were computed. Finally, landslide susceptibility maps were drawn for the study area using the frequency ratio, fuzzy logic and bivariate logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the bivariate logistic regression model provides slightly higher prediction accuracy than the frequency ratio and fuzzy logic models.

  6. Introduction to modeling and analysis of stochastic systems

    CERN Document Server

    Kulkarni, V G

    2011-01-01

    This is an introductory-level text on stochastic modeling. It is suited for undergraduate students in engineering, operations research, statistics, mathematics, actuarial science, business management, computer science, and public policy. It employs a large number of examples to teach the students to use stochastic models of real-life systems to predict their performance, and use this analysis to design better systems. The book is devoted to the study of important classes of stochastic processes: discrete and continuous time Markov processes, Poisson processes, renewal and regenerative processes, semi-Markov processes, queueing models, and diffusion processes. The book systematically studies the short-term and the long-term behavior, cost/reward models, and first passage times. All the material is illustrated with many examples, and case studies. The book provides a concise review of probability in the appendix. The book emphasizes numerical answers to the problems. A collection of MATLAB programs to accompany...

  7. Eikonal model analysis of elastic hadron collisions at high energies

    CERN Document Server

    Prochazka, Jiri

    2016-01-01

    Elastic collisions of protons at different energies represent the main background in studying the structure of fundamental particles at present. On the basis of the standardly used model proposed by West and Yennie, protons have been interpreted as transparent objects, with elastic events interpreted as more central than inelastic ones. It will be shown that, using the eikonal model, protons may be interpreted in agreement with the usual ontological conception, with elastic processes being more peripheral than inelastic ones. The corresponding results (differing fundamentally from those of the WY model) will be presented by analyzing the most ample elastic data set, measured at the ISR energy of 53 GeV. A detailed analysis of the measured differential cross section will be performed and different alternatives of peripheral behavior on the basis of the eikonal model will be presented. The impact of recently established electromagnetic form factors on the determination of quantities specifying hadron interactions from the fit...

  8. Empirical Analysis of Xinjiang's Bilateral Trade: Gravity Model Approach

    Institute of Scientific and Technical Information of China (English)

    CHEN Xuegang; YANG Zhaoping; LIU Xuling

    2008-01-01

    Based on the basic trade gravity model and Xinjiang's practical situation, new explanatory variables (GDP, GDPpc and SCO) are introduced to build an extended trade gravity model fitting Xinjiang's bilateral trade. From the empirical analysis of this model, it is proposed that these three variables affect Xinjiang's bilateral trade positively, whereas geographic distance is found to be a significant factor influencing Xinjiang's bilateral trade negatively. Then, using the extended trade gravity model, this article quantitatively analyzes the trade situation between Xinjiang and its main trade partners in 2004. The results indicate that Xinjiang cooperates successfully with most of its trade partners in terms of present economic scale and development level. Xinjiang has successfully established trade partnerships with Central Asia, Central and Eastern Europe, Western Europe, East Asia and South Asia. However, foreign trade development with West Asia is much slower. Finally, some suggestions on developing Xinjiang's foreign trade are put forward.
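
    A minimal sketch of estimating such an extended log-linear gravity equation by ordinary least squares is given below; all figures are invented for illustration, not the paper's Xinjiang data.

```python
import numpy as np

# Illustrative data: bilateral trade T and the gravity regressors, following
# ln T = b0 + b1*ln(GDP_i*GDP_j) + b2*ln(GDPpc_i*GDPpc_j) + b3*ln(dist) + b4*SCO
trade = np.array([120.0, 45.0, 300.0, 15.0, 80.0, 60.0])
gdp   = np.array([2.1e4, 8.0e3, 5.5e4, 1.2e3, 9.0e3, 4.0e3])   # GDP_i * GDP_j
gdppc = np.array([3.0e6, 9.0e5, 8.0e6, 2.0e5, 1.1e6, 6.0e5])   # GDPpc_i * GDPpc_j
dist  = np.array([1200., 3400., 800., 5200., 2500., 1800.])    # geographic distance
sco   = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])               # SCO membership dummy

X = np.column_stack([np.ones_like(trade), np.log(gdp),
                     np.log(gdppc), np.log(dist), sco])
beta, *_ = np.linalg.lstsq(X, np.log(trade), rcond=None)
print("estimated coefficients:", beta)   # b3 < 0 would mirror the distance effect
```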

  9. Benchmarking analysis of three multimedia models: RESRAD, MMSOILS, and MEPAS

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, J.J.; Faillace, E.R.; Gnanapragasam, E.K. [and others]

    1995-11-01

    Multimedia modelers from the United States Environmental Protection Agency (EPA) and the United States Department of Energy (DOE) collaborated to conduct a comprehensive and quantitative benchmarking analysis of three multimedia models. The three models-RESRAD (DOE), MMSOILS (EPA), and MEPAS (DOE)-represent analytically based tools that are used by the respective agencies for performing human exposure and health risk assessments. The study is performed by individuals who participate directly in the ongoing design, development, and application of the models. A list of physical/chemical/biological processes related to multimedia-based exposure and risk assessment is first presented as a basis for comparing the overall capabilities of RESRAD, MMSOILS, and MEPAS. Model design, formulation, and function are then examined by applying the models to a series of hypothetical problems. Major components of the models (e.g., atmospheric, surface water, groundwater) are evaluated separately and then studied as part of an integrated system for the assessment of a multimedia release scenario to determine effects due to linking components of the models. Seven modeling scenarios are used in the conduct of this benchmarking study: (1) direct biosphere exposure, (2) direct release to the air, (3) direct release to the vadose zone, (4) direct release to the saturated zone, (5) direct release to surface water, (6) surface water hydrology, and (7) multimedia release. Study results show that the models differ with respect to (1) environmental processes included (i.e., model features) and (2) the mathematical formulation and assumptions related to the implementation of solutions (i.e., parameterization).

  10. Low-rank and sparse modeling for visual analysis

    CERN Document Server

    Fu, Yun

    2014-01-01

    This book provides a view of low-rank and sparse computing, especially approximation, recovery, representation, scaling, coding, embedding and learning among unconstrained visual data. The book includes chapters covering multiple emerging topics in this new field. It links multiple popular research fields in Human-Centered Computing, Social Media, Image Classification, Pattern Recognition, Computer Vision, Big Data, and Human-Computer Interaction. It contains an overview of the low-rank and sparse modeling techniques for visual analysis by examining both theoretical analysis and real-world applications.

  11. Radiolysis Model Sensitivity Analysis for a Used Fuel Storage Canister

    Energy Technology Data Exchange (ETDEWEB)

    Wittman, Richard S.

    2013-09-20

    This report fulfills the M3 milestone (M3FT-13PN0810027) to report on a radiolysis computer model analysis that estimates the generation of radiolytic products for a storage canister. The analysis considers radiolysis outside storage canister walls and within the canister fill gas over a possible 300-year lifetime. Previous work relied on estimates based directly on a water radiolysis G-value. This work also includes that effect with the addition of coupled kinetics for 111 reactions for 40 gas species to account for radiolytic-induced chemistry, which includes water recombination and reactions with air.

  12. Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis

    Science.gov (United States)

    Fu, Pei-hua; Yin, Hong-bo

    In this paper, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First of all, we present the evaluation index system, which covers basic information, management level, technical strength, transport capacity, informatization level, market competition and customer service. We determined the index weights according to the grades and evaluated the integrated capability of the logistics enterprises using the fuzzy cluster analysis method. We describe the system evaluation module and the cluster analysis module in detail, including how these two modules were implemented. Finally, we give the results produced by the system.
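
    A compact numpy sketch of the fuzzy c-means step underlying such an evaluation is shown below; the index scores are invented, and the paper's index weighting scheme is omitted.

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)              # memberships sum to 1 per sample
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))              # standard FCM membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# illustrative index scores per enterprise: management, technology, service
X = np.array([[0.90, 0.80, 0.70],
              [0.40, 0.50, 0.30],
              [0.85, 0.90, 0.80],
              [0.30, 0.20, 0.40]])
centers, U = fuzzy_cmeans(X, c=2)
print(np.round(U, 2))   # membership of each enterprise in each grade
```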

  13. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    Full Text Available The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.

  14. Dynamical modeling and analysis of large cellular regulatory networks

    Science.gov (United States)

    Bérenguier, D.; Chaouiya, C.; Monteiro, P. T.; Naldi, A.; Remy, E.; Thieffry, D.; Tichit, L.

    2013-06-01

    The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties, (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these different methods is demonstrated by applying them to a recent multilevel logical model for the network controlling CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented into the software GINsim, which enables the definition, the analysis, and the simulation of logical regulatory graphs.
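
    The core construction, an asynchronous state transition graph whose terminal strongly connected components are the attractors, can be sketched for a toy three-component Boolean model; the update rules below are invented, not those of the Th-cell model analyzed with GINsim.

```python
import itertools
import networkx as nx

# Toy 3-component logical model (Boolean); the rules are illustrative only.
rules = {
    0: lambda s: s[1] and not s[2],   # component A
    1: lambda s: s[0],                # component B
    2: lambda s: not s[0],            # component C
}

# Asynchronous updating: from each state, one component is updated at a time.
G = nx.DiGraph()
for state in itertools.product([0, 1], repeat=3):
    for i, f in rules.items():
        nxt = list(state)
        nxt[i] = int(f(state))
        if tuple(nxt) != state:
            G.add_edge(state, tuple(nxt))
        else:
            G.add_node(state)

# Attractors = terminal strongly connected components of the transition graph.
cond = nx.condensation(G)
terminal = [cond.nodes[n]["members"] for n in cond.nodes if cond.out_degree(n) == 0]
print("attractors:", terminal)
```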

  15. Contact analysis and mathematical modeling of traveling wave ultrasonic motors.

    Science.gov (United States)

    Zhu, Meiling

    2004-06-01

    An analysis of the contact layer and a mathematical modeling of traveling wave ultrasonic motors (TWUM) are presented for the guidance of the design of contact layer and the analyses of the influence of the compressive force and contact layer on motor performance. The proposed model starts from a model previously studied but differs from that model in that it adds the analysis of the contact layer and derives the steady-state solutions of the nonlinear equations in the frequency domain, rather than in the time domain, for the analyses of vibrational responses of the stator and operational characteristics of the motor. The maximum permissible compressive force of the motor, the influences of the contact layer material, the thickness of the contact layer, and the compressive force on motor performance have been discussed. The results show that by using the model, one can understand the influence of the compressive force and contact layer material on motor performance, guide the choice of proper contact layer material, and calculate the maximum permissible compressive force and starting voltage.

  16. Sensitivity analysis of geometric errors in additive manufacturing medical models.

    Science.gov (United States)

    Pinto, Jose Miguel; Arrieta, Cristobal; Andia, Marcelo E; Uribe, Sergio; Ramos-Grez, Jorge; Vargas, Alex; Irarrazaval, Pablo; Tejos, Cristian

    2015-03-01

    Additive manufacturing (AM) models are used in medical applications for surgical planning, prosthesis design and teaching. For these applications, the accuracy of the AM models is essential. Unfortunately, this accuracy is compromised due to errors introduced by each of the building steps: image acquisition, segmentation, triangulation, printing and infiltration. However, the contribution of each step to the final error remains unclear. We performed a sensitivity analysis comparing errors obtained from a reference with those obtained modifying parameters of each building step. Our analysis considered global indexes to evaluate the overall error, and local indexes to show how this error is distributed along the surface of the AM models. Our results show that the standard building process tends to overestimate the AM models, i.e. models are larger than the original structures. They also show that the triangulation resolution and the segmentation threshold are critical factors, and that the errors are concentrated at regions with high curvatures. Errors could be reduced choosing better triangulation and printing resolutions, but there is an important need for modifying some of the standard building processes, particularly the segmentation algorithms.

  17. Nonlinear Pressure Wave Analysis by Concentrated Mass Model

    Science.gov (United States)

    Ishikawa, Satoshi; Kondou, Takahiro; Matsuzaki, Kenichiro

    A pressure wave propagating in a tube often changes to a shock wave because of the nonlinear effect of the fluid. Analyzing this phenomenon by the finite difference method requires high computational cost. To reduce this cost, a concentrated mass model is proposed. This model consists of masses, connecting nonlinear springs, connecting dampers, and base support dampers. The characteristic of a connecting nonlinear spring is derived from the adiabatic change of the fluid, and the equivalent mass and equivalent damping coefficient of the base support damper are derived from the equation of motion of the fluid in a cylindrical tube. Pressure waves generated in a hydraulic oil tube, a sound tube and a plane-wave tube are analyzed numerically by the proposed model to confirm its validity. All numerical results agree very well with the experimental results of Okamura, Saenger and Kamakura. In particular, the numerical analysis reproduces the phenomenon in which a large-amplitude pressure wave propagating in a sound tube or a plane-wave tube steepens into a shock wave. Therefore, it is concluded that the proposed model is valid for the numerical analysis of nonlinear pressure wave problems.
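
    The structure of such a model can be sketched as a chain of lumped masses coupled by nonlinear gas springs and damped supports, integrated in time; the parameters and drive below are illustrative, not the values identified in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Chain of N lumped masses coupled by nonlinear "adiabatic gas" springs,
# driven at the left end by a short pressure pulse (illustrative parameters).
N, m, c, k, gamma = 60, 1.0e-3, 0.02, 1.0e3, 1.4

def drive(t):
    # displacement pulse imposed at the tube inlet
    return 0.2 * np.sin(2 * np.pi * 100 * t) if t < 0.005 else 0.0

def spring_force(ext):
    # tension of an adiabatic gas spring; stiffens strongly under compression
    ext = np.clip(ext, -0.9, None)
    return k * (1.0 - (1.0 + ext) ** (-gamma)) / gamma

def rhs(t, y):
    u, v = y[:N], y[N:]
    ext = np.diff(np.concatenate([[drive(t)], u]))  # extension of the N springs
    f = spring_force(ext)
    f_right = np.concatenate([f[1:], [0.0]])        # free right end
    return np.concatenate([v, (f_right - f - c * v) / m])

sol = solve_ivp(rhs, (0.0, 0.05), np.zeros(2 * N), max_step=2e-5)
print("peak displacement along the chain:", np.abs(sol.y[:N]).max())
```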

  18. Model validity and frequency band selection in operational modal analysis

    Science.gov (United States)

    Au, Siu-Kui

    2016-12-01

    Experimental modal analysis aims at identifying the modal properties (e.g., natural frequencies, damping ratios, mode shapes) of a structure using vibration measurements. Two basic questions are encountered when operating in the frequency domain: Is there a mode near a particular frequency? If so, how much spectral data near the frequency can be included for modal identification without incurring significant modeling error? For data with high signal-to-noise (s/n) ratios these questions can be addressed using empirical tools such as singular value spectrum. Otherwise they are generally open and can be challenging, e.g., for modes with low s/n ratios or close modes. In this work these questions are addressed using a Bayesian approach. The focus is on operational modal analysis, i.e., with 'output-only' ambient data, where identification uncertainty and modeling error can be significant and their control is most demanding. The approach leads to 'evidence ratios' quantifying the relative plausibility of competing sets of modeling assumptions. The latter involves modeling the 'what-if-not' situation, which is non-trivial but is resolved by systematic consideration of alternative models and using maximum entropy principle. Synthetic and field data are considered to investigate the behavior of evidence ratios and how they should be interpreted in practical applications.

  19. Model-based risk analysis of coupled process steps.

    Science.gov (United States)

    Westerberg, Karin; Broberg-Hansen, Ernst; Sejergaard, Lars; Nilsson, Bernt

    2013-09-01

    A section of a biopharmaceutical manufacturing process involving the enzymatic coupling of a polymer to a therapeutic protein was characterized with regard to process parameter sensitivity and design space. To minimize the formation of unwanted by-products in the enzymatic reaction, the substrate was added in small amounts, and unreacted protein was separated using size-exclusion chromatography (SEC) and recycled to the reactor. The quality of the final recovered product was thus a result of the conditions in both the reactor and the SEC, and a design space had to be established for both processes together. This was achieved by developing mechanistic models of the reaction and SEC steps, establishing the causal links between process conditions and product quality. Model analysis was used to complement the qualitative risk assessment, and the design space and critical process parameters were identified. The simulation results gave an experimental plan focusing on the "worst-case regions" in terms of product quality and yield. In this way, the experiments could be used to verify both the suggested process and the model results. This work demonstrates the necessary steps of model-assisted process analysis, from model development through experimental verification.

  20. Experimental stress analysis for materials and structures stress analysis models for developing design methodologies

    CERN Document Server

    Freddi, Alessandro; Cristofolini, Luca

    2015-01-01

    This book summarizes the main methods of experimental stress analysis and examines their application to various states of stress of major technical interest, highlighting aspects not always covered in the classic literature. It is explained how experimental stress analysis assists in the verification and completion of analytical and numerical models, the development of phenomenological theories, the measurement and control of system parameters under operating conditions, and identification of causes of failure or malfunction. Cases addressed include measurement of the state of stress in models, measurement of actual loads on structures, verification of stress states in circumstances of complex numerical modeling, assessment of stress-related material damage, and reliability analysis of artifacts (e.g. prostheses) that interact with biological systems. The book will serve graduate students and professionals as a valuable tool for finding solutions when analytical solutions do not exist.

  1. Modeling and analysis of DFIG in wind energy conversion system

    Directory of Open Access Journals (Sweden)

    Omer Elfaki Elbashir, Wang Zezhong, Liu Qihui

    2014-01-01

    Full Text Available This paper deals with the modeling, analysis, and simulation of a doubly-fed induction generator (DFIG) driven by a wind turbine. The grid-connected wind energy conversion system (WECS) is composed of a DFIG and two back-to-back PWM voltage source converters (VSCs) in the rotor circuit. A machine model is derived in an appropriate dq reference frame. Grid voltage oriented vector control is used for the grid side converter (GSC) in order to maintain a constant DC bus voltage, while stator voltage oriented vector control is adopted in the rotor side converter (RSC) to control the active and reactive powers.

  2. The FIRO model of family therapy: implications of factor analysis.

    Science.gov (United States)

    Hafner, R J; Ross, M W

    1989-11-01

    Schutz's FIRO model contains three main elements: inclusion, control, and affection. It is used widely in mental health research and practice, but has received little empirical validation. The present study is based on a factor analysis of responses to the FIRO questionnaire from 120 normal couples and 191 couples who were attending a clinic for marital/psychiatric problems. The results confirmed the validity of the FIRO model for women only. The differences between the sexes reflected a considerable degree of sex-role stereotyping, the clinical implications of which are discussed.

  3. Modeling and stability analysis of the nonlinear reactive sputtering process

    Directory of Open Access Journals (Sweden)

    György Katalin

    2011-12-01

    Full Text Available The model of the reactive sputtering process has been determined from the dynamic equilibrium of the reactive gas inside the chamber and the dynamic equilibrium of the sputtered metal atoms, which form the compound with the reactive gas atoms on the surface of the substrate. The analytically obtained dynamical model is a system of nonlinear differential equations which can result in a hysteresis-type input/output nonlinearity. The reactive sputtering process has been simulated by integrating these differential equations. Linearization has been applied for classical analysis of the sputtering process and for control system design.

  4. Stability Analysis of Some Nonlinear Anaerobic Digestion Models

    Directory of Open Access Journals (Sweden)

    Ivan Simeonov

    2010-04-01

    Full Text Available The paper deals with local asymptotic stability analysis of some mass balance dynamic models (based on one- and two-stage reaction schemes) of anaerobic digestion (AD) in a CSTR. The equilibrium states for models based on the one-stage (with Monod, Contois and Haldane shapes for the specific growth rate) and the two-stage (only with Monod shapes for the specific growth rates of both the acidogenic and methanogenic bacterial populations) reaction schemes have been determined by solving sets of nonlinear algebraic equations using Maple. Their stability has been analyzed systematically, which provides insight and guidance for AD bioreactor design, operation and control.
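
    As an illustration of this workflow, the sketch below solves for the equilibria of a one-stage Monod model and checks the sign of the Jacobian eigenvalues; the parameter values are illustrative, and the Contois/Haldane and two-stage variants analyzed in the paper are omitted.

```python
import sympy as sp

# One-stage Monod model of AD in a CSTR (illustrative parameters):
#   dS/dt = D*(S_in - S) - mu(S)*X/Y,   dX/dt = (mu(S) - D)*X,
#   with mu(S) = mu_max*S/(Ks + S)
S, X = sp.symbols("S X", nonnegative=True)
D, S_in, mu_max, Ks, Y = 0.1, 10.0, 0.4, 1.0, 0.2
mu = mu_max * S / (Ks + S)
f = [D * (S_in - S) - mu * X / Y, (mu - D) * X]

for eq in sp.solve(f, [S, X], dict=True):
    J = sp.Matrix(f).jacobian([S, X]).subs(eq)   # Jacobian at the equilibrium
    eigs = [complex(e) for e in J.eigenvals()]
    stable = all(e.real < 0 for e in eigs)       # local asymptotic stability test
    print(f"equilibrium {eq}: eigenvalues {eigs}, locally stable = {stable}")
```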

  5. A discourse on sensitivity analysis for discretely-modeled structures

    Science.gov (United States)

    Adelman, Howard M.; Haftka, Raphael T.

    1991-01-01

    A descriptive review is presented of the most recent methods for performing sensitivity analysis of the structural behavior of discretely modeled systems. The methods are generally, but not exclusively, aimed at finite element modeled structures. Topics included are: selection of finite difference step sizes; special considerations for finite difference sensitivity of iteratively solved response problems; first and second derivatives of static structural response; sensitivity of stresses; nonlinear static response sensitivity; eigenvalue and eigenvector sensitivities for both distinct and repeated eigenvalues; and sensitivity of transient response for both linear and nonlinear structural response.

  6. Modelling application for cognitive reliability and error analysis method

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2013-10-01

    Full Text Available The automation of production systems has delegated to machines the execution of highly repetitive and standardized tasks. In the last decade, however, the failure of the fully automatic factory model has led to partially automated configurations of production systems. In this scenario, the centrality and responsibility of the role entrusted to the human operator are therefore heightened, because the role requires problem-solving and decision-making ability. Thus, the human operator is the core of a cognitive process that leads to decisions, influencing the safety of the whole system as a function of his or her reliability. The aim of this paper is to propose a modelling application for the cognitive reliability and error analysis method.

  7. Stochastic Volatility Model and Technical Analysis of Stock Price

    Institute of Scientific and Technical Information of China (English)

    Wei LIU; Wei An ZHENG

    2011-01-01

    In the stock market, some popular technical analysis indicators (e.g., Bollinger Bands, RSI, ROC, ...) are widely used by traders. They use the daily (hourly, weekly, ...) stock prices as samples of certain statistics and use the observed relative frequencies to show the validity of those well-known indicators. However, those samples are not independent, so classical sample survey theory does not apply. In earlier research, we discussed the law of large numbers related to those observations under the Black-Scholes stock price model. In this paper, we extend the above results to the more popular stochastic volatility model.
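
    For concreteness, a short pandas sketch of one such indicator, Bollinger Bands, and the kind of observed relative frequency the paper studies is given below; the price series is synthetic, not market data.

```python
import numpy as np
import pandas as pd

# Bollinger Bands on a synthetic daily price series: a 20-day moving average
# with bands at +/- 2 rolling standard deviations (the usual textbook settings).
rng = np.random.default_rng(1)
close = pd.Series(100 * np.exp(np.cumsum(0.01 * rng.standard_normal(250))))

mid = close.rolling(20).mean()
sd = close.rolling(20).std()
upper, lower = mid + 2 * sd, mid - 2 * sd

# relative frequency of closes outside the bands: the kind of statistic whose
# limiting behavior is analyzed under the stochastic volatility model
outside = ((close > upper) | (close < lower)).mean()
print(f"fraction of days outside the bands: {outside:.3f}")
```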

  8. Analysis of the operational model of Colombian electronic billing

    Directory of Open Access Journals (Sweden)

    Sérgio Roberto da Silva

    2016-06-01

    Full Text Available Colombia has been one of the first countries to introduce an electronic billing process on a voluntary basis, moving from a traditional to a digital version. In this context, the article analyzes the electronic billing process implemented in Colombia and its advantages. The research methodology is applied, qualitative, descriptive and documentary: the regulatory framework and the conceptualization of the model are identified; the process of adopting electronic billing is analyzed; and finally the advantages and disadvantages of its implementation are assessed. The findings indicate that the model applied in Colombia for issuing electronic invoices, in both the sending and the receiving process, is not complex, but it requires a small, adequate infrastructure and trained personnel to reach all sectors, especially the micro and small businesses that form the largest business network in the country.

  9. Development of seismic analysis model of LMFBR and seismic time history response analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koo, K. H.; Lee, J. H.; Yoo, B. [KAERI, Taejon (Korea, Republic of)

    2001-05-01

    The main objectives of this paper are to develop the seismic analysis model of the KALIMER reactor structures, including the primary sodium coolant, and to evaluate the seismic responses of the maximum peak acceleration and the relative displacements by time history seismic response analysis. The seismic time history response analyses were carried out for both the seismically isolated design and the non-isolated one to verify the seismic isolation performance. From the results of the seismic response analysis using the developed model, it is clearly verified that the seismic isolation design gives significantly reduced seismic responses compared with the non-isolated design. All design criteria for the relative displacement response were satisfied for the KALIMER reactor structures.

  10. A Mathematical Modeling Framework for Analysis of Functional Clothing

    Directory of Open Access Journals (Sweden)

    Xiaolin Man

    2007-11-01

    Full Text Available In the analysis and design of functional clothing systems, it is helpful to quantify the effects of a system on a wearer's physical performance capabilities. Toward this end, a clothing modeling framework for quantifying the mechanical interactions between a given clothing system design and a specific wearer performing defined physical tasks is proposed. The modeling framework consists of three interacting modules: (1) a macroscale fabric mechanics/dynamics model; (2) a collision detection and contact correction module; and (3) a human motion module. In the proposed framework, the macroscopic fabric model is based on a rigorous large deformation continuum-degenerated shell theory representation. Material models that capture the stress-strain behavior of different clothing fabrics are used in the continuum shell framework. The collision and contact module enforces the impenetrability constraint between the fabric and human body and computes the associated contact forces between the two. The human body is represented in the current framework as an assemblage of overlapping ellipsoids that undergo rigid body motions consistent with human motions while performing actions such as walking, running, or jumping. The transient rigid body motions of each ellipsoidal body segment in time are determined using motion capture technology. The integrated modeling framework is then exercised to quantify the resistance that the clothing exerts on the wearer during the specific activities under consideration. Current results from the framework are presented and its intended applications are discussed along with some of the key challenges remaining in clothing system modeling.

  11. Sensitivity Analysis in a Complex Marine Ecological Model

    Directory of Open Access Journals (Sweden)

    Marcos D. Mateus

    2015-05-01

    Full Text Available Sensitivity analysis (SA) has long been recognized as part of best practices to assess whether a particular model can suitably inform decisions, despite its uncertainties. SA is a commonly used approach for identifying the important parameters that dominate model behavior. As such, SA addresses two elementary questions in the modeling exercise, namely, how sensitive the model is to changes in individual parameter values, and which parameters or associated processes have more influence on the results. In this paper we report on a local SA performed on a complex marine biogeochemical model that simulates oxygen, organic matter and nutrient (N, P and Si) cycles in the water column, as well as the dynamics of biological groups such as producers, consumers and decomposers. SA was performed using a "one at a time" parameter perturbation method, and a color-code matrix was developed for result visualization. The outcome of this study was the identification of the key parameters influencing model performance, a particularly helpful insight for the subsequent calibration exercise. Also, the color-code matrix methodology proved effective for clearly identifying the parameters with most impact on selected variables of the model.
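
    A generic sketch of the "one at a time" perturbation loop is shown below; the stand-in model and parameter names are hypothetical, not those of the marine biogeochemical model in the paper.

```python
import numpy as np

def model(p):
    # stand-in for the biogeochemical model: returns a selected output variable
    # (e.g., mean oxygen) as a function of the parameter vector
    return p[0] ** 2 + 3.0 * p[1] + 0.1 * p[2]

params = np.array([1.0, 0.5, 2.0])           # calibrated parameter values
names = ["growth_rate", "mortality", "half_sat"]
base = model(params)

# "one at a time": perturb each parameter by +10% and record the normalized
# response change; large |S| flags parameters that dominate model behavior
for i, name in enumerate(names):
    p = params.copy()
    p[i] *= 1.10
    S = ((model(p) - base) / base) / 0.10    # relative sensitivity index
    print(f"{name:12s} S = {S:+.2f}")
```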

  12. Towards the maturity model for feature oriented domain analysis

    Directory of Open Access Journals (Sweden)

    Muhammad Javed

    2014-09-01

    Full Text Available Assessing the quality of a model has always been a challenge for researchers in academia and industry. The quality of a feature model is a prime factor because the model is used in the development of products; a degraded feature model leads to the development of low-quality products. Few efforts have been made at improving the quality of feature models. This paper presents our ongoing work, i.e., the development of a FODA (Feature Oriented Domain Analysis) maturity model that will help evaluate the quality of a given feature model. In this paper, we provide the quality levels along with their descriptions. The proposed model consists of four levels, from level 0 to level 3. The design of each level is based on the severity of errors, with severity decreasing from level 0 to level 3. We elaborate each level with the help of examples, all of which are borrowed from material published by the Software Product Lines (SPL) research community for the application of our framework.

  13. Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator

    Science.gov (United States)

    Rehman, Naveed Ur; Siddiqui, Mubashir Ali

    2017-03-01

    In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.
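
    The sampling-and-fitting step can be sketched as follows; the stand-in simulator, input bounds and coefficients are illustrative, not the thermodynamic SCTEG model developed in the paper.

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube samples of two illustrative inputs (solar flux, ambient
# temperature), fed to a stand-in simulator, then fitted by least squares.
sampler = qmc.LatinHypercube(d=2, seed=0)
unit = sampler.random(n=200)
X = qmc.scale(unit, l_bounds=[600.0, 280.0], u_bounds=[1000.0, 320.0])

def simulator(flux, T_amb):
    # placeholder for the thermodynamic model; returns a notional efficiency
    return 1e-4 * flux - 2e-3 * (T_amb - 300.0) + 1e-7 * flux * (T_amb - 300.0)

y = simulator(X[:, 0], X[:, 1])

# quadratic-in-inputs response surface fitted by least squares
A = np.column_stack([np.ones(len(y)), X, X ** 2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 of the fitted response surface: {r2:.4f}")
```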

  14. Comparison of analytical eddy current models using principal components analysis

    Science.gov (United States)

    Contant, S.; Luloff, M.; Morelli, J.; Krause, T. W.

    2017-02-01

    Monitoring the gap between the pressure tube (PT) and the calandria tube (CT) in CANDU® fuel channels is essential, as contact between the two tubes can lead to delayed hydride cracking of the pressure tube. Multifrequency transmit-receive eddy current non-destructive evaluation is used to determine this gap, as this method has different depths of penetration and variable sensitivity to noise, unlike single-frequency eddy current non-destructive evaluation. An analytical model based on the Dodd and Deeds solutions, and a second model that accounts for normal and lossy self-inductances and a non-coaxial pickup coil, are examined for representing the response of an eddy current transmit-receive probe when considering factors that affect the gap response, such as pressure tube wall thickness and pressure tube resistivity. The multifrequency model data were analyzed using principal components analysis (PCA), a statistical method used to reduce a data set to one of fewer variables. The results of the PCA of the analytical models were then compared to PCA performed on a previously obtained experimental data set. The models gave similar results under variable PT wall thickness conditions, but the non-coaxial coil model, which accounts for self-inductive losses, performed significantly better than the Dodd and Deeds model under variable resistivity conditions.
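
    A minimal SVD-based PCA sketch of the kind of dimensionality reduction applied to multifrequency responses is given below; the data are synthetic stand-ins, not the probe model or experimental outputs.

```python
import numpy as np

# PCA of simulated multifrequency probe responses: rows are gap conditions,
# columns are response components at several excitation frequencies
# (synthetic stand-in data for illustration).
rng = np.random.default_rng(3)
gap = np.linspace(0.0, 1.0, 40)
data = np.column_stack([gap + 0.01 * rng.standard_normal(40) for _ in range(8)])

centered = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)       # variance fraction per component
scores = centered @ Vt.T                  # projection onto the components

print("variance explained by each component:", np.round(explained, 3))
print("PC1 scores for the first three conditions:", np.round(scores[:3, 0], 3))
# here nearly all variance falls on PC1, which tracks the gap variable
```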

  15. Uncertainty Analysis in Population-Based Disease Microsimulation Models

    Directory of Open Access Journals (Sweden)

    Behnam Sharif

    2012-01-01

    Full Text Available Objective. Uncertainty analysis (UA) is an important part of simulation model validation. However, the literature is imprecise as to how UA should be performed in the context of population-based microsimulation (PMS) models. In this expository paper, we discuss a practical approach to UA for such models. Methods. By adapting common concepts from published UA guidelines, we developed a comprehensive, step-by-step approach to UA in PMS models, including sample size calculation to reduce the computational time. As an illustration, we performed UA for POHEM-OA, a microsimulation model of osteoarthritis (OA) in Canada. Results. The resulting sample size of the simulated population was 500,000 and the number of Monte Carlo (MC) runs was 785 for a 12-hour computational time. The estimated 95% uncertainty intervals for the prevalence of OA in Canada in 2021 were 0.09 to 0.18 for men and 0.15 to 0.23 for women. The uncertainty surrounding the sex-specific prevalence of OA increased over time. Conclusion. The proposed approach to UA considers the challenges specific to PMS models, such as the selection of parameters and the calculation of MC runs and population size to reduce the computational burden. Our example of UA shows that the proposed approach is feasible. Estimation of uncertainty intervals should become a standard practice in the reporting of results from PMS models.
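
    The core of such a UA is repeating the simulation with parameters drawn from their uncertainty distributions and reporting percentile intervals; a toy sketch with invented parameters (not POHEM-OA's) is given below.

```python
import numpy as np

# Percentile-based uncertainty interval from Monte Carlo runs of a simulation
# model: each run draws uncertain parameters and records an outcome.
rng = np.random.default_rng(42)
n_runs = 785                       # number of MC runs, as in the POHEM-OA example

prevalences = np.empty(n_runs)
for run in range(n_runs):
    incidence = rng.normal(0.010, 0.002)   # hypothetical uncertain parameters
    duration = rng.normal(15.0, 3.0)
    prevalences[run] = max(incidence, 0.0) * max(duration, 0.0)  # toy outcome

lo, hi = np.percentile(prevalences, [2.5, 97.5])
print(f"95% uncertainty interval for prevalence: {lo:.3f} to {hi:.3f}")
```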

  17. Model independent analysis on the slowing down of cosmic acceleration

    CERN Document Server

    Zhang, Ming-Jian

    2016-01-01

    Possible slowing down of cosmic acceleration has attracted more and more attention. However, most analyses in previous work were carried out within specific parametrization models. In the present paper, we investigate this subject using Gaussian processes (GP), providing a model-independent analysis. We carry out the reconstruction from abundant data, including the luminosity distance from the Union2 and Union2.1 compilations and gamma-ray bursts, and the Hubble parameter from cosmic chronometers and baryon acoustic oscillation peaks. The GP reconstructions suggest that no slowing down of cosmic acceleration is supported at the 95% C.L. by current observational data. We also test the influence of spatial curvature and the Hubble constant, finding that spatial curvature has no significant impact on the reconstructions. However, the Hubble constant strongly influences the reconstructions, especially at low redshift. In order to reveal the reason for the inconsistency between our reconstruction and previous parametrization constra...
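
    A model-independent GP reconstruction of this kind can be sketched as follows, here with mock H(z) data points and scikit-learn's GP regressor standing in for the codes typically used in the cosmology literature.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      # Mock cosmic-chronometer-style data: H(z) in km/s/Mpc with 1-sigma errors.
      z = np.array([0.07, 0.2, 0.4, 0.6, 0.9, 1.3, 1.75])
      H = np.array([69.0, 72.9, 77.0, 87.9, 104.0, 128.0, 155.0])
      sigma = np.array([19.6, 29.6, 10.2, 6.1, 13.0, 17.0, 20.1])

      kernel = ConstantKernel(100.0) * RBF(length_scale=1.0)
      gp = GaussianProcessRegressor(kernel=kernel, alpha=sigma**2, normalize_y=True)
      gp.fit(z.reshape(-1, 1), H)

      z_grid = np.linspace(0.0, 2.0, 50).reshape(-1, 1)
      H_mean, H_std = gp.predict(z_grid, return_std=True)  # reconstruction + 1-sigma band
      # The deceleration history q(z) would then follow from derivatives of H(z).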

  18. Fractional-Order Nonlinear Systems Modeling, Analysis and Simulation

    CERN Document Server

    Petráš, Ivo

    2011-01-01

    "Fractional-Order Nonlinear Systems: Modeling, Analysis and Simulation" presents a study of fractional-order chaotic systems, accompanied by Matlab programs for simulating their state-space trajectories, which are shown in the illustrations in the book. The description of the chaotic systems is clearly presented, and their analysis and numerical solution are done in an easy-to-follow manner. Simulink models for the selected fractional-order systems are also presented. Readers will understand the fundamentals of the fractional calculus, how real dynamical systems can be described using fractional derivatives and fractional differential equations, how such equations can be solved, and how to simulate and explore chaotic systems of fractional order. The book is addressed to mathematicians, physicists, engineers, and other scientists interested in chaos phenomena or in fractional-order systems. It can be used in courses on dynamical systems, control theory, and applied mathematics at the graduate or postgraduate level. ...

  19. A Succinct Approach to Static Analysis and Model Checking

    DEFF Research Database (Denmark)

    Filipiuk, Piotr

    In a number of areas software correctness is crucial; therefore it is often desirable to formally verify the presence of various properties or the absence of errors. This thesis presents a framework for concisely expressing static analysis and model checking problems. The framework facilitates rapid prototyping of new analyses and consists of variants of ALFP logic and associated solvers. First, we present a Lattice based Least Fixed Point Logic (LLFP) that allows interpretations over complete lattices satisfying the Ascending Chain Condition. We establish a Moore Family result for LLFP ... in the classical formulation of ALFP logic. Finally, we show that the logics and the associated solvers can be used for rapid prototyping. We illustrate that by a variety of case studies from static analysis and model checking.
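
    The solver core behind such logics can be illustrated by a naive Kleene iteration to a least fixed point over a finite powerset lattice; this is my own illustration, not code from the thesis.

      def lfp(f, bottom=frozenset()):
          """Iterate f from bottom until stabilization (Kleene iteration)."""
          x = bottom
          while True:
              nxt = f(x)
              if nxt == x:
                  return x
              x = nxt

      # Example: graph reachability phrased as a least fixed point.
      edges = {("a", "b"), ("b", "c"), ("c", "d")}
      start = "a"

      def step(reached):
          return frozenset({start}) | frozenset(v for (u, v) in edges if u in reached)

      print(sorted(lfp(step)))   # ['a', 'b', 'c', 'd']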

  20. Model-Based Dependability Analysis of Physical Systems with Modelica

    Directory of Open Access Journals (Sweden)

    Andrea Tundis

    2017-01-01

    Modelica is an innovative, equation-based, and acausal language that allows modeling complex physical systems made of mechanical, electrical, and electrotechnical components and evaluating their design through simulation techniques. Unfortunately, the increasing complexity and accuracy of such physical systems require new, more powerful, and flexible tools and techniques for evaluating important system properties, in particular dependability properties such as reliability, safety, and maintainability. In this context, the paper describes some extensions of the Modelica language to support the modeling of system requirements and their relationships. Such extensions enable requirement verification analysis through native constructs in the Modelica language. Furthermore, they allow exporting a Modelica-based system design as a Bayesian network in order to analyze its dependability by employing a probabilistic approach. The proposal is exemplified through a case study concerning the dependability analysis of a Tank System.

  1. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  2. Variable cluster analysis method for building neural network model

    Institute of Scientific and Technical Information of China (English)

    王海东; 刘元东

    2004-01-01

    To address the competing requirements that input variables be reduced as much as possible while still fully explaining the output variables when building a neural network model of a complicated system, a variable selection method based on cluster analysis was investigated. A similarity coefficient describing the mutual relation of variables was defined. The methods of the highest contribution rate, part replacing whole, and variable replacement were put forward and derived from information theory. Software for the neural network based on cluster analysis, which provides several methods for defining the variable similarity coefficient, clustering the system variables, and evaluating the variable clusters, was developed and applied to build a neural network forecast model of cement clinker quality. The results show that the network scale, training time, and prediction accuracy are all satisfactory. The practical application demonstrates that this method of selecting variables for neural networks is feasible and effective.
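
    A minimal sketch of a correlation-based variant of this idea (assumed details; the paper defines its own similarity coefficient): cluster candidate input variables hierarchically by similarity and keep one representative per cluster.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      rng = np.random.default_rng(3)
      X = rng.normal(size=(100, 6))
      X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=100)   # variables 0 and 1 are near-duplicates

      corr = np.corrcoef(X, rowvar=False)
      dist = 1.0 - np.abs(corr)                         # similarity -> distance
      np.fill_diagonal(dist, 0.0)

      Z = linkage(squareform(dist, checks=False), method="average")
      labels = fcluster(Z, t=0.3, criterion="distance")
      reps = {lab: np.where(labels == lab)[0][0] for lab in set(labels)}
      print("selected input variables:", sorted(reps.values()))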

  3. NONSMOOTH MODEL FOR PLASTIC LIMIT ANALYSIS AND ITS SMOOTHING ALGORITHM

    Institute of Scientific and Technical Information of China (English)

    LI Jian-yu; PAN Shao-hua; LI Xing-si

    2006-01-01

    By means of the Lagrange duality theory of the convex program, a dual problem of Hill's maximum plastic work principle under Mises' yield condition is derived, whereby a non-differentiable convex optimization model for the limit analysis is developed. With this model it is not necessary to linearize the yield condition, and its discrete form becomes a minimization problem of the sum of Euclidean norms subject to linear constraints. Aimed at resolving the non-differentiability of the Euclidean norms, a smoothing algorithm for the limit analysis of perfect-plastic continuum media is proposed. Its efficiency is demonstrated by computing the limit load factor and the collapse state for some plane stress and plane strain problems.
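
    A one-line illustration, in my notation rather than the paper's, of the discretized problem and the standard smoothing of a Euclidean norm:

      \min_{x}\ \sum_{i} \|B_i x\| \quad \text{subject to } A x = b,
      \qquad
      \|v\|_{\varepsilon} = \sqrt{v^{\top} v + \varepsilon^{2}} \;\longrightarrow\; \|v\| \quad (\varepsilon \to 0),

    where the smoothed norm is differentiable everywhere for epsilon > 0, so gradient-based methods become applicable.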

  4. Using Runtime Analysis to Guide Model Checking of Java Programs

    Science.gov (United States)

    Havelund, Klaus; Norvig, Peter (Technical Monitor)

    2001-01-01

    This paper describes how two runtime analysis algorithms, an existing data race detection algorithm and a new deadlock detection algorithm, have been implemented to analyze Java programs. Runtime analysis is based on the idea of executing the program once and observing the generated run to extract various kinds of information. This information can then be used to predict whether other, different runs may violate some properties of interest, in addition, of course, to demonstrating whether the generated run itself violates such properties. These runtime analyses can be performed stand-alone to generate a set of warnings. It is furthermore demonstrated how these warnings can be used to guide a model checker, thereby reducing the search space. The described techniques have been implemented in the home-grown Java model checker called PathFinder.
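
    The deadlock-detection idea can be sketched as a lock-order graph with cycle detection; this is a simplified illustration of the general technique, not the paper's algorithm.

      from collections import defaultdict

      # Observed acquisitions from one run: (thread, lock acquired, locks already held).
      trace = [
          ("T1", "A", set()), ("T1", "B", {"A"}),
          ("T2", "B", set()), ("T2", "A", {"B"}),
      ]

      order = defaultdict(set)           # edge: held lock -> newly acquired lock
      for _, lock, held in trace:
          for h in held:
              order[h].add(lock)

      def has_cycle(graph):
          WHITE, GREY, BLACK = 0, 1, 2
          color = defaultdict(int)
          def dfs(u):
              color[u] = GREY
              for v in graph[u]:
                  if color[v] == GREY or (color[v] == WHITE and dfs(v)):
                      return True
              color[u] = BLACK
              return False
          return any(color[u] == WHITE and dfs(u) for u in list(graph))

      # A cycle in the lock-order graph signals a potential deadlock in other schedules.
      print("potential deadlock:", has_cycle(order))   # True for this trace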

  5. Model parameter uncertainty analysis for an annual field-scale P loss model

    Science.gov (United States)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

    Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs, and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. Such insight can then be used to guide future data collection and model

  6. A Novel Quantitative Analysis Model for Information System Survivability Based on Conflict Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2007-01-01

    This paper describes a novel quantitative analysis model for system survivability based on conflict analysis, which provides a direct view of the survivable situation. Based on the three-dimensional state space of the conflict, each player's efficiency matrix on its credible motion set can be obtained. The player with the strongest desire initiates the move, and the overall state transition matrix of the information system may be achieved. In addition, the process of modeling and stability analysis of the conflict can be converted into a Markov analysis process, so the obtained results, with the occurrence probability of each feasible situation, help the players quantitatively judge the probability of the situations they pursue in the conflict. Compared with existing methods, which are limited to post hoc explanation of the system's survivable situation, the proposed model is suitable for quantitatively analyzing and forecasting the future development of system survivability. The experimental results show that the model may be effectively applied to quantitative analysis of survivability. Moreover, there is a good application prospect in practice.
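
    The Markov step can be sketched as follows, with a hypothetical three-situation transition matrix: the stationary distribution gives the long-run probability of each feasible situation.

      import numpy as np

      # Hypothetical transition matrix between three conflict situations (rows sum to 1).
      P = np.array([
          [0.6, 0.3, 0.1],
          [0.2, 0.5, 0.3],
          [0.1, 0.4, 0.5],
      ])

      # Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
      w, v = np.linalg.eig(P.T)
      pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
      pi = pi / pi.sum()
      print("long-run situation probabilities:", np.round(pi, 3))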

  7. Modeling and experimental vibration analysis of nanomechanical cantilever active probes

    Science.gov (United States)

    Salehi-Khojin, Amin; Bashash, Saeid; Jalili, Nader

    2008-08-01

    Nanomechanical cantilever (NMC) active probes have recently received increased attention in a variety of nanoscale sensing and measurement applications. Current modeling practices call for a uniform cantilever beam without considering the intentional jump discontinuities associated with the piezoelectric layer attachment and the NMC cross-sectional step. This paper presents a comprehensive modeling framework for modal characterization and dynamic response analysis of NMC active probes with geometrical discontinuities. The entire length of the NMC is divided into three segments of uniform beams, followed by application of appropriate continuity conditions. The characteristic matrix equation is then used to solve for system natural frequencies and mode shapes. Using an equivalent electromechanical moment of a piezoelectric layer, forced motion analysis of the system is carried out. An experimental setup consisting of a commercial NMC active probe from Veeco and a state-of-the-art microsystem analyzer, the MSA-400 from Polytec, is developed to verify the theoretical developments proposed here. Using a parameter estimation technique based on minimizing the modeling error, optimal values of system parameters are identified. Mode shapes and the modal frequency response of the system for the first three modes, determined from the proposed model, are compared with those obtained from the experiment and from the commonly used theory for uniform beams. Results indicate that the uniform beam model fails to accurately predict the actual system response, especially in multiple-mode operation, while the proposed discontinuous beam model demonstrates good agreement with the experimental data. Such a detailed and accurate modeling framework can lead to significant enhancement in the sensitivity of piezoelectric-based NMC sensors for use in a variety of sensing and imaging applications.

  8. Counterfactual Graphical Models for Longitudinal Mediation Analysis with Unobserved Confounding

    OpenAIRE

    Shpitser, Ilya

    2012-01-01

    Questions concerning mediated causal effects are of great interest in psychology, cognitive science, medicine, social science, public health, and many other disciplines. For instance, about 60% of recent papers published in leading journals in social psychology contain at least one mediation test (Rucker, Preacher, Tormala, & Petty, 2011). Standard parametric approaches to mediation analysis employ regression models, and either the "difference method" (Judd & Kenny, 1981), more common in epid...
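
    The "difference method" cited above can be shown in a few lines on synthetic data: the mediated effect is the total effect minus the direct effect that remains after adjusting for the mediator.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 1000
      X = rng.normal(size=n)                       # treatment
      M = 0.5 * X + rng.normal(scale=0.5, size=n)  # mediator
      Y = 0.3 * X + 0.8 * M + rng.normal(size=n)   # outcome

      def ols(design, y):
          return np.linalg.lstsq(design, y, rcond=None)[0]

      ones = np.ones(n)
      c = ols(np.column_stack([ones, X]), Y)[1]          # total effect of X on Y
      c_prime = ols(np.column_stack([ones, X, M]), Y)[1] # direct effect, adjusting for M
      print(f"total {c:.2f}, direct {c_prime:.2f}, mediated {c - c_prime:.2f}")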

  9. Mathematical analysis techniques for modeling the space network activities

    Science.gov (United States)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, in particular the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to better understand the space network. Finally, a small-scale version of the system was modeled, variables were identified, data were gathered, and comparisons were made between actual and theoretical data.
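
    A toy linear program in the spirit of the analysis (illustrative coefficients, not TDRSS data), solved with SciPy's linear programming routine:

      from scipy.optimize import linprog

      # Maximize total returned data 5*x1 + 4*x2  ->  minimize the negative.
      c = [-5.0, -4.0]
      A_ub = [[1.0, 1.0],    # shared antenna: at most 10 hours of contact time
              [2.0, 1.0]]    # ground-processing load limit
      b_ub = [10.0, 16.0]

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
      print(res.x, -res.fun)   # optimal schedule and total data returned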

  10. Stock Market Trend Analysis Using Hidden Markov Models

    OpenAIRE

    Kavitha, G.; Udhayakumar, A.; D. Nagarajan

    2013-01-01

    Price movements of the stock market are not totally random. In fact, what drives the financial market and what patterns financial time series follow have long been of interest to economists, mathematicians and, most recently, computer scientists [17]. This paper gives an idea of the trend analysis of stock market behaviour using Hidden Markov Models (HMM). A trend followed over a particular period is assumed to repeat in the future. The one day difference in close value of stocks for a...
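
    A sketch of the HMM workflow, assuming the third-party hmmlearn package and synthetic close-value differences rather than real market data:

      import numpy as np
      from hmmlearn.hmm import GaussianHMM

      rng = np.random.default_rng(5)
      # Synthetic one-day close-to-close differences: calm regime, then volatile regime.
      diffs = np.concatenate([rng.normal(0.1, 0.5, 150), rng.normal(-0.2, 2.0, 150)])
      X = diffs.reshape(-1, 1)

      model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=100, random_state=0)
      model.fit(X)
      states = model.predict(X)            # most likely hidden "trend" state per day
      print(model.means_.ravel(), states[:10])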

  11. Lao Graphic Enterprise: Analysis with SCOR Model Metrics

    OpenAIRE

    Papanicolau Denegri, Jorge Nicolás; Universidad San Juan Bautista; Evangelista Yzaguirre, Luis; Universidad Nacional Mayor de San Marcos

    2016-01-01

    In this article, we present the relevance of the SCOR model for analyzing the current state of a company, as demonstrated in this research, with the aim of reversing its deficiencies.

  12. Nonstandard Analysis Applied to Advanced Undergraduate Mathematics - Infinitesimal Modeling

    OpenAIRE

    Herrmann, Robert A.

    2003-01-01

    This is a Research and Instructional Development Project from the U.S. Naval Academy. In this monograph, the basic methods of nonstandard analysis for n-dimensional Euclidean spaces are presented. Specific rules are developed, and these methods and rules are applied to rigorous integral and differential modeling. The topics include Robinson infinitesimals, limited and infinite numbers; convergence theory, continuity, *-transfer, internal definition, hyperfinite summation, Riemann-Stieltjes int...

  13. Joint analysis of the seismic data and velocity gravity model

    Science.gov (United States)

    Belyakov, A. S.; Lavrov, V. S.; Muchamedov, V. A.; Nikolaev, A. V.

    2016-03-01

    We performed a joint analysis of the seismic noise recorded at the Japanese Ogasawara station, located on Titijima Island in the Philippine Sea, using the STS-2 seismograph at the OSW station in the winter period of January 1-15, 2015, against the background of a velocity gravity model. The graphs demonstrate a cause-and-effect relation between the seismic noise and gravity, allowing us to treat the noise as a desired signal.

  14. Protein secondary structure analysis with a coarse-grained model

    OpenAIRE

    Kneller, Gerald R.; Hinsen, Konrad

    2014-01-01

    The paper presents a geometrical model for protein secondary structure analysis which uses only the positions of the $C_\alpha$ atoms. We construct a space curve connecting these positions by piecewise polynomial interpolation and describe the folding of the protein backbone by a succession of screw motions linking the Frenet frames at consecutive $C_\alpha$ positions. Using the ASTRAL subset of the SCOPe database of protein structures, we derive thresholds for the screw parameters of se...
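
    The discrete-frame construction can be sketched as follows (my simplified construction; the paper interpolates the curve before extracting screw parameters):

      import numpy as np

      def frames(ca):
          """ca: (n, 3) array of C-alpha coordinates; returns one frame per inner point."""
          t = np.diff(ca, axis=0)
          t /= np.linalg.norm(t, axis=1, keepdims=True)       # unit tangents
          out = []
          for i in range(1, len(t)):
              b = np.cross(t[i - 1], t[i])
              b /= np.linalg.norm(b)                          # binormal
              n = np.cross(b, t[i])                           # normal completes the triad
              out.append(np.stack([t[i], n, b]))
          return np.array(out)

      # Toy helical backbone; screw parameters would link successive frames.
      helix = np.array([[np.cos(0.6 * i), np.sin(0.6 * i), 0.3 * i] for i in range(8)])
      print(frames(helix).shape)   # (6, 3, 3): one rotation matrix per inner residue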

  15. Credibility analysis of risk classes by generalized linear model

    Science.gov (United States)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, the generalized linear model (GLM) and credibility theory, which are frequently used in non-life insurance pricing, are combined for credibility analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the results for credible risk classes are interpreted.
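
    The limited-fluctuation machinery referenced above reduces to a few lines (standard textbook formulas for claim frequency, not the paper's code):

      from math import sqrt
      from scipy.stats import norm

      p, k = 0.90, 0.05                           # probability and tolerance of the standard
      n_full = (norm.ppf((1 + p) / 2) / k) ** 2   # full-credibility claim count (Poisson)

      n_class, class_mean, portfolio_mean = 420, 0.13, 0.11   # illustrative values
      Z = min(1.0, sqrt(n_class / n_full))        # partial-credibility factor
      premium = Z * class_mean + (1 - Z) * portfolio_mean
      print(f"n_full = {n_full:.0f}, Z = {Z:.2f}, credibility estimate = {premium:.3f}")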

  16. Clinical laboratory as an economic model for business performance analysis

    Science.gov (United States)

    Buljanović, Vikica; Patajac, Hrvoje; Petrovečki, Mladen

    2011-01-01

    Aim To perform a SWOT (strengths, weaknesses, opportunities, and threats) analysis of a clinical laboratory as an economic model that may be used to improve the business performance of laboratories by removing weaknesses, minimizing threats, and using external opportunities and internal strengths. Methods The impact of possible threats and weaknesses on the business performance of the Clinical Laboratory at Našice General County Hospital, and the use of strengths and opportunities to improve operating profit, were simulated using models created on the basis of the SWOT analysis results. The operating profit, as a measure of the laboratory's profitability, was defined as total revenue minus total expenses and presented using a profit and loss account. Changes in the input parameters of the profit and loss account for 2008 were determined from the opportunities and potential threats, and an economic sensitivity analysis was made by varying the key parameters. The profit and loss account and the economic sensitivity analysis were tools for quantifying the impact of changes in revenues and expenses on the business operations of the clinical laboratory. Results The simulation models showed that the operating profit of €470 723 in 2008 could be reduced to only €21 542 if all possible threats became a reality and current weaknesses remained the same. Operating profit could also be increased to €535 804 if laboratory strengths and opportunities were utilized. If both the opportunities and threats became a reality, the operating profit would decrease by €384 465. Conclusion The operating profit of the clinical laboratory could be significantly reduced if all threats became a reality and the current weaknesses remained the same. The operating profit could be increased by utilizing strengths and opportunities as much as possible. This type of modeling may be used to monitor the business operations of any clinical laboratory and improve its financial situation by

  17. Dynamic sensitivity analysis of long running landslide models through basis set expansion and meta-modelling

    Science.gov (United States)

    Rohmer, Jeremy

    2016-04-01

    Predicting the temporal evolution of landslides is typically supported by numerical modelling. Dynamic sensitivity analysis aims at assessing the influence of the landslide properties on the time-dependent predictions (e.g., time series of landslide displacements). Yet two major difficulties arise: 1. global sensitivity analysis requires running the landslide model a large number of times (> 1000), which may become impracticable when the landslide model has a high computational cost (> several hours); 2. landslide model outputs are not scalar but functions of time, i.e. they are n-dimensional vectors with n usually ranging from 100 to 1000. In this article, I explore the use of a basis set expansion, such as principal component analysis, to reduce the output dimensionality to a few components, each of them interpreted as a dominant mode of variation in the overall structure of the temporal evolution. The computationally intensive calculation of the Sobol' indices for each of these components is then achieved through meta-modelling, i.e. by replacing the landslide model with a "costless-to-evaluate" approximation (e.g., a projection pursuit regression model). The methodology combining "basis set expansion - meta-model - Sobol' indices" is then applied to the La Frasse landslide to investigate the dynamic sensitivity of the surface horizontal displacements to the slip surface properties during pore pressure changes. I show how to extract information on the sensitivity of each main mode of temporal behaviour using a limited number (a few tens) of long-running simulations. In particular, I identify the parameters which trigger the occurrence of a turning point marking a shift between a regime of low landslide displacements and one of high values.
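
    The basis-set-expansion step can be sketched as follows (illustrative data): an ensemble of displacement time series is compressed to a few principal components, and each component score then serves as a scalar output on which Sobol' indices can be computed via a meta-model.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(6)
      t = np.linspace(0, 1, 200)                        # normalized time axis
      n_runs = 60                                       # affordable number of long runs

      # Hypothetical simulator output: displacement curves whose shape depends
      # on two sampled "slip surface" parameters.
      theta = rng.uniform(0.5, 2.0, size=(n_runs, 2))
      Y = np.array([a * t + 0.3 * np.tanh(8 * (t - 1 / (1 + b))) for a, b in theta])

      pca = PCA(n_components=3).fit(Y)
      scores = pca.transform(Y)                         # n_runs x 3, one row per run
      print(pca.explained_variance_ratio_)              # dominant modes of temporal variation
      # Each column of `scores` is now a scalar output for sensitivity analysis.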

  18. Strength Reliability Analysis of Turbine Blade Using Surrogate Models

    Directory of Open Access Journals (Sweden)

    Wei Duan

    2014-05-01

    There are many stochastic parameters that affect the reliability of steam turbine blade performance in practical operation. In order to improve the reliability of blade design, it is necessary to take these stochastic parameters into account. In this study, a variable cross-section twisted blade is investigated, and geometrical parameters, material parameters and load parameters are considered as random variables. A reliability analysis method combining a Finite Element Method (FEM), a surrogate model and Monte Carlo Simulation (MCS) is applied to the blade reliability analysis. Based on the parametric blade finite element model and the experimental design, two kinds of surrogate models, Polynomial Response Surface (PRS) and Artificial Neural Network (ANN), are applied to construct approximate analytical expressions between the blade responses (including maximum stress and deflection) and the random input variables; these act as a surrogate for the finite element solver and drastically reduce the number of simulations required. The surrogate is then used for most of the samples needed in the Monte Carlo method, and the statistical parameters and cumulative distribution functions of the maximum stress and deflection are obtained by Monte Carlo simulation. Finally, a probabilistic sensitivity analysis, which combines the magnitude of the gradient and the width of the scatter range of the random input variables, is applied to evaluate how much the maximum stress and deflection of the blade are influenced by the random nature of the input parameters.
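
    A compact sketch of the surrogate-plus-MCS pattern, with a toy limit-state function standing in for the blade FEM solver (the paper uses PRS and ANN surrogates; only a quadratic response surface is shown here):

      import numpy as np

      rng = np.random.default_rng(7)

      def fem_stress(x):
          # Hypothetical stand-in for the expensive finite element solver.
          load, thickness = x.T
          return 120.0 * load / thickness + 5.0 * load**2

      # Design of experiments: 30 "FEM" runs.
      X = rng.uniform([0.8, 0.9], [1.2, 1.1], size=(30, 2))
      y = fem_stress(X)

      def features(X):
          l, th = X.T
          return np.column_stack([np.ones(len(X)), l, th, l * th, l**2, th**2])

      coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)  # quadratic response surface

      # Monte Carlo on the cheap surrogate: P(stress exceeds the allowable 160 MPa).
      Xmc = np.column_stack([rng.normal(1.0, 0.08, 100_000), rng.normal(1.0, 0.03, 100_000)])
      pf = np.mean(features(Xmc) @ coef > 160.0)
      print(f"estimated failure probability: {pf:.4f}")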

  19. A risk analysis model in concurrent engineering product development.

    Science.gov (United States)

    Wu, Desheng Dash; Kefan, Xie; Gang, Chen; Ping, Gui

    2010-09-01

    Concurrent engineering has been widely accepted as a viable strategy for companies to reduce time to market and achieve overall cost savings. This article analyzes various risks and challenges in product development under the concurrent engineering environment. A three-dimensional early warning approach for product development risk management is proposed by integrating graphical evaluation and review technique (GERT) and failure modes and effects analysis (FMEA). Simulation models are created to solve our proposed concurrent engineering product development risk management model. Solutions lead to identification of key risk controlling points. This article demonstrates the value of our approach to risk analysis as a means to monitor various risks typical in the manufacturing sector. This article has three main contributions. First, we establish a conceptual framework to classify various risks in concurrent engineering (CE) product development (PD). Second, we propose use of existing quantitative approaches for PD risk analysis purposes: GERT, FMEA, and product database management (PDM). Based on quantitative tools, we create our approach for risk management of CE PD and discuss solutions of the models. Third, we demonstrate the value of applying our approach using data from a typical Chinese motor company.

  20. Teaching-Learning Activity Modeling Based on Data Analysis

    Directory of Open Access Journals (Sweden)

    Kyungrog Kim

    2015-03-01

    Numerous studies are currently being carried out on personalized services based on data analysis, to find and provide valuable information amid information overload. Furthermore, the number of studies on data analysis of teaching-learning activities for personalized services in the field of teaching-learning is increasing, too. This paper proposes a learning style recency-frequency-durability (LS-RFD) model for quantified analysis of the level of activities of learners, to provide the elements of teaching-learning activities according to the learning style of the learner among various parameters for personalized service. The aim is to measure preferences for teaching-learning activities according to the recency, frequency and durability of such activities. Based on the results, user characteristics can be classified into teaching-learning activity groups by categorizing the learner's level of preference and activity.
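
    A minimal sketch of RFD-style scoring (hypothetical event schema and weights; the paper's definitions may differ): each learner-activity pair is scored from the recency, frequency, and durability of its logged events.

      from datetime import date

      # (learner, activity type, day of event, minutes spent)
      log = [
          ("lee", "quiz",  date(2015, 3, 1), 20), ("lee", "quiz",  date(2015, 3, 9), 25),
          ("lee", "forum", date(2015, 1, 5), 5),  ("kim", "forum", date(2015, 3, 8), 40),
      ]
      today = date(2015, 3, 10)

      def rfd(events, w=(0.4, 0.3, 0.3)):
          recency = 1.0 / (1 + min((today - d).days for _, _, d, _ in events))
          frequency = len(events)
          durability = sum(mins for _, _, _, mins in events)
          return w[0] * recency + w[1] * frequency + w[2] * durability

      groups = {}
      for rec in log:
          groups.setdefault((rec[0], rec[1]), []).append(rec)
      for key, events in groups.items():
          print(key, round(rfd(events), 2))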