Development of a gas systems analysis model (GSAM)
Energy Technology Data Exchange (ETDEWEB)
Godec, M.L. [ICF Resources Inc., Fairfax, VA (United States)]
1995-04-01
The objective of developing a Gas Systems Analysis Model (GSAM) is to create a comprehensive, non-proprietary, PC-based model of domestic gas industry activity. The system is capable of assessing the impacts of various changes in the natural gas system within North America. The individual and collective impacts of changes in technology and economic conditions are explicitly modeled in GSAM. All major gas resources are modeled, including conventional, tight, Devonian Shale, coalbed methane, and low-quality gas sources. The modeling system assesses all key components of the gas industry, including available resources, exploration, drilling, completion, production, and processing practices, both now and in the future. The model similarly assesses the distribution, storage, and utilization of natural gas in a dynamic, market-based analytical structure. GSAM is designed to provide METC managers with a tool to project the impacts of future research, development, and demonstration (RD&D) benefits in order to set priorities in a rapidly changing, market-driven gas industry.
Energy Technology Data Exchange (ETDEWEB)
NONE
1994-07-01
The objective of GSAM development is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the system, including the resource base, exploration and development practices, extraction technology performance and costs, project economics, transportation costs and restrictions, storage, and end-use. The primary focus is the detailed characterization of the resource base at the reservoir and sub-reservoir level. This disaggregation allows direct evaluation of alternative extraction technologies based on discretely estimated, individual well productivity, required investments, and associated operating costs. GSAM's design allows users to evaluate complex interactions of current and alternative future technology and policy initiatives as they directly impact the gas market. Key activities completed during the past year include: conducted a comparative analysis of commercial reservoir databases; licensed and screened the NRG Associates Significant Oil and Gas Fields of the US reservoir database; developed and tested reduced form reservoir model production type curves; fully developed database structures for use in GSAM and linkage to other systems; developed a methodology for the exploration module; collected and updated upstream capital and operating cost parameters; completed initial integration of downstream/demand models; presented research results at the METC Contractor Review Meeting; conducted other briefings for METC managers, including initiation of the GSAM Environmental Module; and delivered draft topical reports on technology review, model review, and GSAM methodology.
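The per-well evaluation GSAM performs — discretely estimated well productivity, required investment, and operating cost — is essentially project economics. The sketch below illustrates that kind of screening calculation with a hypothetical well (the parameters and the exponential decline curve are illustrative assumptions, not GSAM's actual algorithms):

```python
# Illustrative per-well screening economics of the kind GSAM performs.
# All parameters are hypothetical; this is not GSAM's actual method.

def well_npv(q0, decline, price, opex_per_mcf, capex, years, rate):
    """NPV of a single gas well with exponential production decline.

    q0           -- initial annual production (Mcf/yr)
    decline      -- annual exponential decline fraction
    price        -- gas price ($/Mcf)
    opex_per_mcf -- operating cost ($/Mcf)
    capex        -- up-front drilling/completion investment ($)
    rate         -- annual discount rate
    """
    npv = -capex
    for t in range(1, years + 1):
        q = q0 * (1.0 - decline) ** (t - 1)   # declining production
        cash = q * (price - opex_per_mcf)     # annual net cash flow
        npv += cash / (1.0 + rate) ** t       # discount to present value
    return npv

npv = well_npv(q0=100_000, decline=0.15, price=2.0,
               opex_per_mcf=0.5, capex=600_000, years=10, rate=0.10)
print(round(npv, 2))
```

With these assumed numbers the well is uneconomic; raising recovery (higher q0, lower decline) or cutting costs flips the sign, which is exactly the kind of technology-impact question the model is built to answer at scale.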
Development of a natural Gas Systems Analysis Model (GSAM)
International Nuclear Information System (INIS)
Lacking a detailed characterization of the resource base and a comprehensive borehole-to-burnertip evaluation model of the North American natural gas system, past R&D, tax, and regulatory policies have been formulated without a full understanding of their likely direct and indirect impacts on future gas supply and demand. The recent disappearance of the deliverability surplus, pipeline deregulation, and current policy debates about regulatory initiatives in taxation, environmental compliance and leasing make the need for a comprehensive gas evaluation system critical. Traditional econometric or highly aggregated energy models are increasingly regarded as unable to incorporate the available geologic detail and the explicit technology performance and costing algorithms necessary to evaluate resource-technology-economic interactions in a market context. The objective of this research is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the natural gas system, including the resource base, exploration and development, extraction technology performance and costs, transportation, storage, and end use. The primary focus is the detailed characterization of the resource base at the reservoir and sub-reservoir level and the impact of alternative extraction technologies on well productivity and economics. GSAM evaluates the complex interactions of current and alternative future technology and policy initiatives in the context of the evolving gas markets. The full system is scheduled for completion in 1995, with a prototype planned for early 1994. ICF Resources reviewed relevant natural gas upstream, downstream and market models to identify appropriate analytic capabilities to incorporate into GSAM. We have reviewed extraction technologies to better characterize performance and costs in terms of GSAM parameters.
Development of a natural gas systems analysis model (GSAM). Annual report, July 1996--July 1997
Energy Technology Data Exchange (ETDEWEB)
NONE
1997-12-31
The objective of GSAM development is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the system, including the resource base, exploration and development practices, extraction technology performance and costs, project economics, transportation costs and restrictions, storage, and end-use. The primary focus is the detailed characterization of the resource base at the reservoir and subreservoir level. This disaggregation allows direct evaluation of alternative extraction technologies based on discretely estimated, individual well productivity, required investments, and associated operating costs. GSAM's design allows users to evaluate complex interactions of current and alternative future technology and policy initiatives as they directly impact the gas market. GSAM development has been ongoing for the past five years. Key activities completed during the past year are described.
Development of a natural gas systems analysis model (GSAM). Annual report, July 1996--July 1997
International Nuclear Information System (INIS)
The objective of GSAM development is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the system, including the resource base, exploration and development practices, extraction technology performance and costs, project economics, transportation costs and restrictions, storage, and end-use. The primary focus is the detailed characterization of the resource base at the reservoir and subreservoir level. This disaggregation allows direct evaluation of alternative extraction technologies based on discretely estimated, individual well productivity, required investments, and associated operating costs. GSAM's design allows users to evaluate complex interactions of current and alternative future technology and policy initiatives as they directly impact the gas market. GSAM development has been ongoing for the past five years. Key activities completed during the past year are described
Development of a natural gas systems analysis model (GSAM). Annual report, July 1994--June 1995
Energy Technology Data Exchange (ETDEWEB)
NONE
1995-07-01
North American natural gas markets have changed dramatically over the past decade. A competitive, cost-conscious production, transportation, and distribution system has emerged from the highly regulated transportation and wellhead pricing structure of the 1980s. Technology advances have played an important role in the evolution of the gas industry, a role likely to expand substantially as alternative fuel price competition and a maturing natural gas resource base force operators to maximize efficiency. Finally, significant changes continue in regional gas demand patterns, industry practices, and infrastructure needs. As the complexity of the gas system grows, so does the need to evaluate and plan for alternative future resource, technology, and market scenarios. Traditional gas modeling systems focused solely on the econometric aspects of gas marketing. These systems, developed to assess a regulated industry at a high level of aggregation, rely on simple representations of complex and evolving systems, thereby precluding insight into how the industry will change over time. Credible evaluations of specific policy initiatives and research activities require a different approach. Also, the mounting pressure on energy producers from environmental compliance activities requires development of analysis that incorporates relevant geologic, engineering, and project economic details. The objective of policy, research and development (R&D), and market analysis is to integrate fundamental understanding of natural gas resources, technology, and markets to fully describe the potential of the gas resource under alternative future scenarios. This report summarizes work over the past twelve months on DOE Contract DE-AC21-92MC28138, Development of a Natural Gas Systems Analysis Model (GSAM). The products developed under this project directly support the Morgantown Energy Technology Center (METC) in carrying out its natural gas R&D mission.
International Nuclear Information System (INIS)
This report summarizes work completed on DOE Contract DE-AC21-92MC28138, Development of a Natural Gas Systems Analysis Model (GSAM). The products developed under this project directly support the National Energy Technology Laboratory (NETL) in carrying out its natural gas R and D mission. The objective of this research effort has been to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas market. GSAM has been developed to explicitly evaluate components of the natural gas system, including the entire in-place gas resource base, exploration and development technologies, extraction technology and performance parameters, transportation and storage factors, and end-use demand issues. The system has been fully tested and calibrated and has been used for multiple natural gas metrics analyses at NETL, in which metrics associated with NETL natural gas upstream R and D technologies and strategies have been evaluated. NETL's Natural Gas Strategic Plan requires that R and D activities be evaluated for their ability to provide adequate supplies of reasonably priced natural gas. GSAM provides the capability to assess potential and on-going R and D projects using a full fuel cycle, cost-benefit approach. This method yields realistic, market-based assessments of benefits and costs of alternative or related technology advances. GSAM is capable of estimating both technical and commercial successes, quantifying the potential benefits to the market, as well as to other related research. GSAM, therefore, represents an integration of research activities and a method for planning and prioritizing efforts to maximize benefits and minimize costs. Without an analytical tool like GSAM, NETL natural gas upstream R and D activities cannot be appropriately ranked or focused on the most important aspects of natural gas extraction efforts or utilization considerations.
Energy Technology Data Exchange (ETDEWEB)
Unknown
2001-02-01
This report summarizes work completed on DOE Contract DE-AC21-92MC28138, Development of a Natural Gas Systems Analysis Model (GSAM). The products developed under this project directly support the National Energy Technology Laboratory (NETL) in carrying out its natural gas R&D mission. The objective of this research effort has been to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas market. GSAM has been developed to explicitly evaluate components of the natural gas system, including the entire in-place gas resource base, exploration and development technologies, extraction technology and performance parameters, transportation and storage factors, and end-use demand issues. The system has been fully tested and calibrated and has been used for multiple natural gas metrics analyses at NETL, in which metrics associated with NETL natural gas upstream R&D technologies and strategies have been evaluated. NETL's Natural Gas Strategic Plan requires that R&D activities be evaluated for their ability to provide adequate supplies of reasonably priced natural gas. GSAM provides the capability to assess potential and on-going R&D projects using a full fuel cycle, cost-benefit approach. This method yields realistic, market-based assessments of benefits and costs of alternative or related technology advances. GSAM is capable of estimating both technical and commercial successes, quantifying the potential benefits to the market, as well as to other related research. GSAM, therefore, represents an integration of research activities and a method for planning and prioritizing efforts to maximize benefits and minimize costs. Without an analytical tool like GSAM, NETL natural gas upstream R&D activities cannot be appropriately ranked or focused on the most important aspects of natural gas extraction efforts or utilization considerations.
A generalized approach to the modeling of the species-area relationship.
Directory of Open Access Journals (Sweden)
Katiane Silva Conceição
This paper proposes a statistical generalized species-area model (GSAM) to represent various patterns of the species-area relationship (SAR), one of the fundamental patterns in ecology. The approach generalizes many earlier models, such as the power-curve model, which is commonly used to describe the SAR mathematically. The GSAM is applied to a simulated data set of species diversity in areas of different sizes, and real-world data on insects of the order Hymenoptera are modeled. We show that the GSAM enables identification of the best statistical model and estimates the number of species as a function of area.
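The power-curve model that this paper generalizes is S = c * A**z, which becomes linear on the log-log scale and can be fitted by ordinary least squares. A minimal sketch, using synthetic data generated from an assumed power law (illustrative values, not the paper's data):

```python
# Fit the classic power-curve SAR, S = c * A**z, by ordinary least squares
# on the log-log scale: log S = log c + z * log A.
import math

def fit_power_sar(areas, species):
    """Return (c, z) for S = c * A**z via log-log linear regression."""
    xs = [math.log(a) for a in areas]
    ys = [math.log(s) for s in species]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope of the log-log regression line is the SAR exponent z
    z = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    c = math.exp(my - z * mx)
    return c, z

# Synthetic, noise-free data from S = 5 * A**0.25, so the fit recovers it exactly
areas = [1, 10, 100, 1000]
species = [5 * a ** 0.25 for a in areas]
c, z = fit_power_sar(areas, species)
print(round(c, 3), round(z, 3))  # → 5.0 0.25
```

A generalized SAM replaces this single functional form with a family of curves and selects among them statistically, which is what the paper's model-identification step does.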
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model: The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example, the Event Summary Data (ESD) stores calorimeter cells and tracking system hits, thereby permitting many calibration and alignment tasks, but will only be accessible at particular computing sites with potentially large latency. In contrast, the Analysis...
Directory of Open Access Journals (Sweden)
Georgiana Cristina NUKINA
2012-07-01
The model developed through risk analysis supports deciding whether control measures are suitable for implementation; the analysis determines whether the benefits of a given control option outweigh its implementation cost.
Communication Analysis modelling techniques
España, Sergio; Pastor, Óscar; Ruiz, Marcela
2012-01-01
This report describes and illustrates several modelling techniques proposed by Communication Analysis; namely, the Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated with communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general, since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...
Model Checking as Static Analysis
DEFF Research Database (Denmark)
Zhang, Fuyuan
Both model checking and static analysis are prominent approaches to detecting software errors. Model checking is a successful formal method for verifying properties specified in temporal logics with respect to transition systems. Static analysis is also a powerful method for validating program properties which can predict safe approximations to program behaviors. In this thesis, we have developed several static-analysis-based techniques to solve model checking problems, aiming to show the link between static analysis and model checking. We focus on logical approaches to static analysis ... -calculus can be encoded as the intended model of SFP. Our research results have strengthened the link between model checking and static analysis. This provides a theoretical foundation for developing a unified tool for both model checking and static analysis techniques.
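The link the thesis explores can be illustrated in miniature: a model checking question such as EF(goal) ("some goal state is reachable") is the least fixpoint of X = goal ∪ pre(X) over the transition relation, computed exactly like a static-analysis dataflow iteration. The transition system below is invented for illustration and is not from the thesis:

```python
# EF(goal) as a least-fixpoint computation over an explicit transition relation.

def ef(transitions, goal):
    """States satisfying EF(goal): least fixpoint of X = goal ∪ pre(X)."""
    sat = set(goal)
    changed = True
    while changed:                      # iterate until the fixpoint is reached
        changed = False
        for s, t in transitions:
            if t in sat and s not in sat:   # s has a successor already in X
                sat.add(s)
                changed = True
    return sat

# states 0..3; state 3 is the goal, state 2 is a dead end
transitions = [(0, 1), (1, 3), (0, 2)]
print(sorted(ef(transitions, {3})))  # → [0, 1, 3]
```

State 2 is correctly excluded: no path from it reaches the goal. The same chaotic-iteration scheme underlies both worlds, which is the correspondence the thesis formalizes.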
Slavik Stefan; Bednar Richard
2014-01-01
The term business model has been used in practice for a few years, but companies have created, defined, and innovated their models, consciously or not, since the start of business. Our paper aims to clarify the theory of the business model: its definition and all the components that form each business. In the second part, we create an analytical tool, analyze real business models in Slovakia, and define the characteristics of each part of the business model, i.e., customers, distribution, value, resour...
Survival analysis models and applications
Liu, Xian
2012-01-01
Survival analysis concerns sequential occurrences of events governed by probabilistic laws. Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading on to some of the most advanced topics in survival analysis. Assumes only a minimal knowledge of SAS whilst enablin
International Nuclear Information System (INIS)
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the training also evolved to include more analysis tools, software tutorials, and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
Energy Technology Data Exchange (ETDEWEB)
Malik, S. [Nebraska U.; Shipsey, I. [Purdue U.; Cavanaugh, R. [Illinois U., Chicago; Bloom, K. [Nebraska U.; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.; D' Hondt, J. [Vrije U., Brussels; Klima, B. [Fermilab; Narain, M. [Brown U.; Palla, F. [INFN, Pisa; Rolandi, G. [CERN; Schörner-Sadenius, T. [DESY
2014-01-01
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the training also evolved to include more analysis tools, software tutorials, and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
ORGANISATIONAL CULTURE ANALYSIS MODEL
Mihaela Simona Maracine
2012-01-01
The studies and researches undertaken have demonstrated the importance of studying organisational culture because of the practical valences it presents and because it contributes to increasing the organisation’s performance. The analysis of the organisational culture’s dimensions allows observing human behaviour within the organisation and highlighting reality, identifying the strengths and also the weaknesses which have an impact on its functionality and development. In this paper, we try to...
Multiscale Signal Analysis and Modeling
Zayed, Ahmed
2013-01-01
Multiscale Signal Analysis and Modeling presents recent advances in multiscale analysis and modeling using wavelets and other systems. This book also presents applications in digital signal processing using sampling theory and techniques from various function spaces, filter design, feature extraction and classification, signal and image representation/transmission, coding, nonparametric statistical signal processing, and statistical learning theory. This book also: Discusses recently developed signal modeling techniques, such as the multiscale method for complex time series modeling, multiscale positive density estimations, Bayesian Shrinkage Strategies, and algorithms for data adaptive statistics Introduces new sampling algorithms for multidimensional signal processing Provides comprehensive coverage of wavelets with presentations on waveform design and modeling, wavelet analysis of ECG signals and wavelet filters Reviews feature extraction and classification algorithms for multiscale signal and image proce...
ROCK PROPERTIES MODEL ANALYSIS MODEL REPORT
Energy Technology Data Exchange (ETDEWEB)
Clinton Lum
2002-02-04
The purpose of this Analysis and Model Report (AMR) is to document Rock Properties Model (RPM) 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties models are intended principally for use as input to numerical physical-process modeling, such as of ground-water flow and/or radionuclide transport. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. This work was conducted in accordance with the following planning documents: WA-0344, "3-D Rock Properties Modeling for FY 1998" (SNL 1997, WA-0358), "3-D Rock Properties Modeling for FY 1999" (SNL 1999), and the technical development plan, Rock Properties Model Version 3.1 (CRWMS M&O 1999c). The Interim Change Notices (ICNs), ICN 02 and ICN 03, of this AMR were prepared as part of activities being conducted under the Technical Work Plan, TWP-NBS-GS-000003, "Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01" (CRWMS M&O 2000b). The purpose of ICN 03 is to record changes in data input status due to data qualification and verification activities. These work plans describe the scope, objectives, tasks, methodology, and implementing procedures for model construction. The work scope for this activity consists of the following: (1) Conversion of the input data (laboratory measured porosity data, x-ray diffraction mineralogy, petrophysical calculations of bound water, and petrophysical calculations of porosity) for each borehole into stratigraphic coordinates; (2) Re-sampling and merging of data sets; (3
Sensitivity Analysis of Simulation Models
Kleijnen, J.P.C.
2009-01-01
This contribution presents an overview of sensitivity analysis of simulation models, including the estimation of gradients. It covers classic designs and their corresponding (meta)models; namely, resolution-III designs, including fractional-factorial two-level designs, for first-order polynomial metamodels.
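The core recipe behind these designs can be sketched briefly: run the simulation model at the points of a two-level factorial design and estimate the coefficients of a first-order polynomial metamodel y ≈ b0 + b1*x1 + b2*x2 in coded (-1/+1) factors. The "simulation model" below is a hypothetical stand-in, not an example from the overview:

```python
# Fit a first-order polynomial metamodel from a full 2^2 factorial design.

def simulate(x1, x2):
    """Stand-in simulation model (hypothetical, noise-free response)."""
    return 10 + 3 * x1 - 2 * x2

design = [(-1, -1), (1, -1), (-1, 1), (1, 1)]   # full 2^2 design, coded levels
ys = [simulate(x1, x2) for x1, x2 in design]

# Because the design columns are orthogonal, each coefficient is just a
# signed average of the responses.
n = len(design)
b0 = sum(ys) / n
b1 = sum(x1 * y for (x1, _), y in zip(design, ys)) / n   # main effect of x1
b2 = sum(x2 * y for (_, x2), y in zip(design, ys)) / n   # main effect of x2
print(b0, b1, b2)  # → 10.0 3.0 -2.0
```

A resolution-III fractional factorial applies the same estimator to a carefully chosen subset of the design points, trading runs for the assumption that interactions are negligible.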
Frailty Models in Survival Analysis
Wienke, Andreas
2010-01-01
The concept of frailty offers a convenient way to introduce unobserved heterogeneity and associations into models for survival data. In its simplest form, frailty is an unobserved random proportionality factor that modifies the hazard function of an individual or a group of related individuals. "Frailty Models in Survival Analysis" presents a comprehensive overview of the fundamental approaches in the area of frailty models. The book extensively explores how univariate frailty models can represent unobserved heterogeneity. It also emphasizes correlated frailty models as extensions of
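The simplest case described above, a gamma-distributed frailty Z with mean 1 and variance theta multiplying the baseline hazard, yields a closed-form population survival function S(t) = (1 + theta * H0(t))^(-1/theta) once Z is integrated out. A minimal sketch with an assumed constant baseline hazard (illustrative values, not from the book):

```python
# Marginal (population) survival under a univariate gamma frailty model:
# individual hazard is Z * h0(t), Z ~ Gamma with mean 1 and variance theta.
import math

def marginal_survival(t, h0, theta):
    """S(t) = (1 + theta * H0(t)) ** (-1/theta) for constant baseline h0."""
    H0 = h0 * t                       # cumulative baseline hazard
    return (1.0 + theta * H0) ** (-1.0 / theta)

t, h0, theta = 10.0, 0.2, 0.5
# Compare with the homogeneous (no-frailty) model exp(-h0*t): the frailty
# population survives longer at late t, because frail individuals die early.
print(round(marginal_survival(t, h0, theta), 4), round(math.exp(-h0 * t), 4))
```

This selection effect, where observed population hazards flatten even though individual hazards do not, is the central phenomenon frailty models capture.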
Local models for spatial analysis
Lloyd, Christopher D
2006-01-01
In both the physical and social sciences, there are now available large spatial data sets with detailed local information. Global models for analyzing these data are not suitable for investigating local variations; consequently, local models are the subject of much recent research. Collecting a variety of models into a single reference, Local Models for Spatial Analysis explains in detail a variety of approaches for analyzing univariate and multivariate spatial data. Different models make use of data in unique ways, and this book offers perspectives on various definitions of what constitutes
Stochastic modeling analysis and simulation
Nelson, Barry L
1995-01-01
A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se
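One of the building blocks listed above, the discrete-time Markov chain, can be analyzed numerically in a few lines: iterate pi_{k+1} = pi_k P until the distribution stops changing, giving the long-run (stationary) distribution. The two-state transition matrix below is an invented example, not taken from the book:

```python
# Stationary distribution of a discrete-time Markov chain by power iteration.

def stationary(P, tol=1e-12, max_iter=10_000):
    n = len(P)
    pi = [1.0 / n] * n                # start from the uniform distribution
    for _ in range(max_iter):
        # one step of pi <- pi P
        nxt = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(pi, nxt)) < tol:
            return nxt
        pi = nxt
    return pi

P = [[0.9, 0.1],                      # transition matrix: rows sum to 1
     [0.5, 0.5]]
pi = stationary(P)
print([round(p, 4) for p in pi])  # → [0.8333, 0.1667]
```

The same answer follows analytically from the balance equation 0.1 * pi0 = 0.5 * pi1 together with pi0 + pi1 = 1, giving pi = (5/6, 1/6); the iteration is the approach that scales to larger chains.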
Command Process Modeling & Risk Analysis
Meshkat, Leila
2011-01-01
Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.
Hydraulic Modeling: Pipe Network Analysis
Datwyler, Trevor T.
2012-01-01
Water modeling is becoming an increasingly important part of hydraulic engineering. One application of hydraulic modeling is pipe network analysis. Using programmed algorithms to repeatedly solve continuity and energy equations, computer software can greatly reduce the amount of time required to analyze a closed conduit system. Such hydraulic models can become a valuable tool for cities to maintain their water systems and plan for future growth. The Utah Division of Drinking Water regulations...
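The "repeatedly solve continuity and energy equations" step can be sketched with the classic Hardy Cross loop method: guess a flow split, compute signed head losses around the loop with h = K * Q**2, and apply the correction dQ = -Σh / (2 Σ|h/Q|) until the losses balance. The two-pipe loop and its constants below are hypothetical:

```python
# Hardy Cross iteration for one loop: inflow q_total splits between two
# parallel pipes with head loss h = K * Q**2.

def hardy_cross_two_pipes(K1, K2, q_total, iters=50):
    q1 = q_total / 2.0                      # initial guess for the split
    for _ in range(iters):
        q2 = q_total - q1                   # continuity at the junction
        # signed head losses around the loop (pipe 2 traversed against flow)
        h1, h2 = K1 * q1 ** 2, -K2 * q2 ** 2
        # loop correction drives the net head loss around the loop to zero
        dq = -(h1 + h2) / (2 * (abs(h1 / q1) + abs(h2 / q2)))
        q1 += dq
    return q1, q_total - q1

q1, q2 = hardy_cross_two_pipes(K1=1.0, K2=4.0, q_total=3.0)
print(round(q1, 3), round(q2, 3))  # → 2.0 1.0
```

The converged split satisfies K1*q1**2 = K2*q2**2 (equal head loss across parallel pipes); production network solvers apply the same correction simultaneously to many interlocking loops.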
Model building techniques for analysis.
Energy Technology Data Exchange (ETDEWEB)
Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.
2009-09-01
The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; this turnaround time needs to be decreased for analysis to have an impact on overall product development. It can be decreased immensely through simple Pro/ENGINEER modeling techniques that come down to the method by which features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.
Analysis of aircraft maintenance models
Directory of Open Access Journals (Sweden)
Vlada S. Sokolović
2011-10-01
This paper addresses several organizational models of aircraft maintenance. All of the models presented have been in use in air forces, so the advantages and disadvantages of each are known. The paper first presents the current model of aircraft maintenance and its basic characteristics, then discusses two alternative organizational models, whose advantages and disadvantages are analyzed against the criterion of the operational capabilities of military units. In addition to operational capabilities, the paper presents other criteria that should be taken into account in the evaluation and selection of an optimal model of aircraft maintenance. A qualitative analysis of the models alone may not be sufficient to select the optimum model against the chosen set of operational-capability criteria; a detailed economic and technical analysis of each tactical maintenance model is also necessary. A high-quality aircraft maintenance organization requires the involvement of the highest state and army authorities, and clear objectives must be set for all elements of modern air force technical support programs based on the given evaluation criteria.
Accelerated life models modeling and statistical analysis
Bagdonavicius, Vilijandas
2001-01-01
Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Variables
Analysis by fracture network modelling
International Nuclear Information System (INIS)
This report describes the Fracture Network Modelling and Performance Assessment Support performed by Golder Associates Inc. during the Heisei-11 (1999-2000) fiscal year. The primary objective of the Golder Associates work scope during HY-11 was to provide theoretical and review support to the JNC HY-12 Performance assessment effort. In addition, Golder Associates provided technical support to JNC for the Aespoe Project. Major efforts for performance assessment support included analysis of PAWorks pathways and software documentation, verification, and performance assessment visualization. Support for the Aespoe project included 'Task 4' predictive modelling of sorbing tracer transport in the TRUE-1 rock block, and 'Task 5' integrated hydrogeological and geochemical modelling of Aespoe island. Technical information about Golder Associates HY-11 support to JNC is provided in the appendices to this report. (author)
Ventilation Model and Analysis Report
International Nuclear Information System (INIS)
This model and analysis report develops, validates, and implements a conceptual model for heat transfer in and around a ventilated emplacement drift. This conceptual model includes thermal radiation between the waste package and the drift wall, convection from the waste package and drift wall surfaces into the flowing air, and conduction in the surrounding host rock. These heat transfer processes are coupled and vary both temporally and spatially, so numerical and analytical methods are used to implement the mathematical equations which describe the conceptual model. These numerical and analytical methods predict the transient response of the system, at the drift scale, in terms of spatially varying temperatures and ventilation efficiencies. The ventilation efficiency describes the effectiveness of the ventilation process in removing radionuclide decay heat from the drift environment. An alternative conceptual model is also developed which evaluates the influence of water and water vapor mass transport on the ventilation efficiency. These effects are described using analytical methods which bound the contribution of latent heat to the system, quantify the effects of varying degrees of host rock saturation (and hence host rock thermal conductivity) on the ventilation efficiency, and evaluate the effects of vapor and enhanced vapor diffusion on the host rock thermal conductivity
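The ventilation efficiency defined above (the fraction of decay heat removed by the flowing air) can be illustrated with a simple sensible-heat balance. This is a hedged sketch with invented numbers, not the report's coupled numerical model:

```python
def ventilation_efficiency(m_dot, cp_air, t_out, t_in, q_decay):
    """Fraction of the decay heat q_decay [W] carried away by air flowing
    at m_dot [kg/s] with specific heat cp_air [J/(kg*K)], heated from
    t_in to t_out [K] along the drift."""
    q_removed = m_dot * cp_air * (t_out - t_in)  # sensible heat picked up by the air
    return q_removed / q_decay

# Illustrative values: 18 kg/s of air warmed by 10 K against 870 kW of decay heat
eff = ventilation_efficiency(m_dot=18.0, cp_air=1005.0, t_out=303.0, t_in=293.0, q_decay=870e3)
```

This captures only the steady sensible-heat part; the report's conceptual model additionally couples thermal radiation, convection, and conduction, and varies in time and space.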
Energy Technology Data Exchange (ETDEWEB)
Lee, S.
2011-05-17
The process of recovering the waste in storage tanks at the Savannah River Site (SRS) typically requires mixing the contents of the tank to ensure uniformity of the discharge stream. Mixing is accomplished with one to four dual-nozzle slurry pumps located within the tank liquid. For this work, a Tank 48 simulation model with a maximum of four slurry pumps in operation has been developed to estimate flow patterns for efficient solid mixing. The modeling calculations were performed using two approaches. One is a single-phase Computational Fluid Dynamics (CFD) model used to evaluate the flow patterns and qualitative mixing behaviors for a range of modeling conditions, since this model was previously benchmarked against test results. The other is a two-phase CFD model that estimates solid concentrations quantitatively by solving the Eulerian governing equations for the continuous fluid and discrete solid phases over the entire fluid domain of Tank 48. The two-phase results should be considered preliminary scoping calculations, since that model has not yet been validated against test results. A series of sensitivity calculations for different numbers of pumps and operating conditions has been performed to provide operational guidance for solids suspension and mixing in the tank. In the analysis, the pump was assumed to be stationary. Major solid obstructions, including the pump housing, the pump columns, and the 82 inch central support column, were included. Steady-state, three-dimensional analyses with a two-equation turbulence model were performed with FLUENT{trademark} for the single-phase approach and CFX for the two-phase approach. Recommended operational guidance was developed assuming that local fluid velocity can be used as a measure of sludge suspension and spatial mixing under the single-phase tank model. For quantitative analysis, a two-phase fluid-solid model was developed for the same modeling conditions as the single-phase model.
Analysis model for inventory management
Directory of Open Access Journals (Sweden)
Camelia Burja
2010-01-01
The inventory represents an essential component of the assets of the enterprise, and economic analysis gives it special importance because accurate inventory management determines the achievement of the activity objective and the financial results. Efficient inventory management requires ensuring an optimum inventory level, one that guarantees the normal functioning of the activity with minimum inventory expenses and immobilised funds. The paper presents an analysis model for inventory management based on inventory rotation speed and its correlation with the sales volume, illustrated in an adequate study. Highlighting the factors that influence efficient inventory management provides the useful information needed to justify managerial decisions, leading to a balanced financial position and increased company performance.
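The rotation-speed indicators the paper builds on can be sketched in a few lines; the figures below are invented for illustration, not taken from the study:

```python
def inventory_turnover(cost_of_goods_sold, average_inventory):
    """How many times the inventory 'rotates' during the period."""
    return cost_of_goods_sold / average_inventory

def days_of_inventory(turnover, days_in_period=365):
    """Average number of days an item spends in inventory."""
    return days_in_period / turnover

t = inventory_turnover(cost_of_goods_sold=730_000, average_inventory=73_000)
d = days_of_inventory(t)  # higher rotation speed -> fewer immobilised funds
```

A rising turnover (shorter days of inventory) at constant sales volume signals less capital tied up in stock, which is the correlation the paper examines.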
Distribution system modeling and analysis
Kersting, William H
2002-01-01
For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems; often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using them can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives re...
The gmdl Modeling and Analysis System
Gillblad, Daniel; Holst, Anders; Kreuger, Per; Levin, Björn
2004-01-01
This report describes the gmdl modeling and analysis environment. gmdl was designed to provide powerful data analysis, modeling, and visualization with simple, clear semantics and easy-to-use, well-defined syntactic conventions. It provides an extensive set of functions necessary for general data preparation, analysis, and modeling tasks.
Geologic Framework Model Analysis Model Report
International Nuclear Information System (INIS)
The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M and O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and
Geologic Framework Model Analysis Model Report
Energy Technology Data Exchange (ETDEWEB)
R. Clayton
2000-12-19
The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the
Hypersonic - Model Analysis as a Service
DEFF Research Database (Denmark)
Acretoaie, Vlad; Störrle, Harald
2014-01-01
Hypersonic is a Cloud-based tool that proposes a new approach to the deployment of model analysis facilities. It is implemented as a RESTful Web service API offering analysis features such as model clone detection. This approach allows the migration of resource intensive analysis algorithms from m...
Colour model analysis for microscopic image processing
García-Rojo Marcial; González Jesús; Déniz Oscar; González Roberto; Bueno Gloria
2008-01-01
This article presents a comparative study between different colour models (RGB, HSI and CIEL*a*b*) applied to the analysis of very large microscopic images. Such an analysis of different colour models is needed in order to carry out successful detection, and therefore classification, of different regions of interest (ROIs) within the image. This, in turn, allows both distinguishing possible ROIs and retrieving their proper colour for further ROI analysis. This analysis is not commonly done ...
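As one concrete instance of the colour-model conversions such a study relies on, here is a hedged sketch of the standard RGB-to-HSI transformation (Gonzalez-Woods form) for channel values in [0, 1]; it is not the authors' code:

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert r, g, b in [0, 1] to (hue in degrees, saturation, intensity)."""
    i = (r + g + b) / 3.0                          # intensity: mean channel value
    s = 0.0 if i == 0 else 1.0 - min(r, g, b) / i  # saturation: distance from grey
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    # Clamp guards against tiny float excursions outside acos's domain
    h = 0.0 if den == 0 else math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
    if b > g:                                      # hue is measured on a 360-degree circle
        h = 360.0 - h
    return h, s, i

h, s, i = rgb_to_hsi(1.0, 0.0, 0.0)  # pure red: hue 0 degrees, full saturation
```

Separating hue from intensity this way is what makes HSI attractive for stain-based ROI detection, since illumination changes mostly affect the intensity channel.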
FAME, the Flux Analysis and Modeling Environment
Boele Joost; Olivier Brett G; Teusink Bas
2012-01-01
Background The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routine and, in our experience, renders these applications suboptimal for routine use by (systems) biologists. Results The Flux Analysis and Modeling Environment (FAME) is the first web-based modeling tool that combines th...
Bayesian Model Averaging for Propensity Score Analysis
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
The Cosparse Analysis Model and Algorithms
Nam, Sangnam; Elad, Michael; Gribonval, Rémi
2011-01-01
After a decade of extensive study of the sparse representation synthesis model, we can safely say that this is a mature and stable field, with clear theoretical foundations, and appealing applications. Alongside this approach, there is an analysis counterpart model, which, despite its similarity to the synthesis alternative, is markedly different. Surprisingly, the analysis model did not get similar attention, and its understanding today is shallow and partial. In this paper we take a closer look at the analysis approach, better define it as a generative model for signals, and contrast it with the synthesis one. This work proposes effective pursuit methods that aim to solve inverse problems regularized with the analysis-model prior, accompanied by a preliminary theoretical study of their performance. We demonstrate the effectiveness of the analysis model in several experiments.
Statistical Modelling of Wind Profiles - Data Analysis and Modelling
DEFF Research Database (Denmark)
Jónsson, Tryggvi; Pinson, Pierre
The aim of the analysis presented in this document is to investigate whether statistical models can be used to make very short-term predictions of wind profiles.
Two sustainable energy system analysis models
DEFF Research Database (Denmark)
Lund, Henrik; Goran Krajacic, Neven Duic; da Graca Carvalho, Maria
2005-01-01
This paper presents a comparative study of two energy system analysis models both designed with the purpose of analysing electricity systems with a substantial share of fluctuating renewable energy.
Brief analysis of Blog Websites' business models
Institute of Scientific and Technical Information of China (English)
魏娟
2009-01-01
Using this framework, several major blogs and blog websites are analysed. From this analysis, three main weblog business models currently in operation are introduced and described. As part of this framework, the paper also analyzes the future viability of these models.
Introduction to Models in Spatial Analysis
Sanders, Lena
2007-01-01
The book provides a broad overview of the different types of models used in advanced spatial analysis. The models concern spatial organization, location factors and spatial interaction patterns from both static and dynamic perspectives. This introductory chapter proposes a discussion on the different meanings which are given to models in the field of spatial analysis depending on the formalization framework (statistics, GIS, computational approach). Core concepts such as spatial interaction and le...
A Bayesian nonparametric meta-analysis model.
Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G
2015-03-01
In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall effect size, such models may be adequate, but for prediction, they surely are not if the effect-size distribution exhibits non-normal behavior. To address this issue, we propose a Bayesian nonparametric meta-analysis model, which can describe a wider range of effect-size distributions, including unimodal symmetric distributions, as well as skewed and more multimodal distributions. We demonstrate our model through the analysis of real meta-analytic data arising from behavioral-genetic research. We compare the predictive performance of the Bayesian nonparametric model against various conventional and more modern normal fixed-effects and random-effects models.
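For contrast, the 'conventional' fixed-effect model the authors generalize pools effect sizes by inverse-variance weighting; a minimal sketch with made-up study effects and sampling variances:

```python
def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted pooled effect and its variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled, 1.0 / sum(weights)

# Hypothetical effect sizes and sampling variances for four studies:
est, var = fixed_effect_meta([0.3, 0.5, 0.4, 0.6], [0.04, 0.09, 0.05, 0.16])
```

Such pooling assumes a single (normal) effect-size population; the nonparametric model replaces that assumption with a flexible distribution that can be skewed or multimodal.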
System of systems modeling and analysis.
Energy Technology Data Exchange (ETDEWEB)
Campbell, James E.; Anderson, Dennis James; Longsine, Dennis E. (Intera, Inc., Austin, TX); Shirah, Donald N.
2005-01-01
This report documents the results of an LDRD program entitled 'System of Systems Modeling and Analysis' that was conducted during FY 2003 and FY 2004. Systems that themselves consist of multiple systems (referred to here as System of Systems or SoS) introduce a level of complexity to systems performance analysis and optimization that is not readily addressable by existing capabilities. The objective of the 'System of Systems Modeling and Analysis' project was to develop an integrated modeling and simulation environment that addresses the complex SoS modeling and analysis needs. The approach to meeting this objective involved two key efforts. First, a static analysis approach, called state modeling, has been developed that is useful for analyzing the average performance of systems over defined use conditions. The state modeling capability supports analysis and optimization of multiple systems and multiple performance measures or measures of effectiveness. The second effort involves time simulation which represents every system in the simulation using an encapsulated state model (State Model Object or SMO). The time simulation can analyze any number of systems including cross-platform dependencies and a detailed treatment of the logistics required to support the systems in a defined mission.
CMS Data Analysis School Model
Malik, Sudhir; Cavanaugh, R; Bloom, K; Chan, Kai-Feng; D'Hondt, J; Klima, B; Narain, M; Palla, F; Rolandi, G; Schörner-Sadenius, T
2014-01-01
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born three years ago at the LPC (LHC Physics Center), Fermilab, and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS around the globe, CMS is trying to engage the collaboration's discovery potential and maximize the physics output. As a bigger goal, CMS is striving to nurture and increase engagement of the myriad talents of CMS in the development of physics, service, upgrade, and education of those new to CMS and the caree...
Analysis of radiology business models.
Enzmann, Dieter R; Schomer, Donald F
2013-03-01
As health care moves to value orientation, radiology's traditional business model faces challenges to adapt. The authors describe a strategic value framework that radiology practices can use to best position themselves in their environments. This simplified construct encourages practices to define their dominant value propositions. There are 3 main value propositions that form a conceptual triangle, whose vertices represent the low-cost provider, the product leader, and the customer intimacy models. Each vertex has been a valid market position, but each demands specific capabilities and trade-offs. The underlying concepts help practices select value propositions they can successfully deliver in their competitive environments. PMID:23245438
Hierarchical modeling and analysis for spatial data
Banerjee, Sudipto; Gelfand, Alan E
2003-01-01
Among the many uses of hierarchical modeling, their application to the statistical analysis of spatial and spatio-temporal data from areas such as epidemiology and environmental science has proven particularly fruitful. Yet to date, the few books that address the subject have been either too narrowly focused on specific aspects of spatial analysis, or written at a level often inaccessible to those lacking a strong background in mathematical statistics. Hierarchical Modeling and Analysis for Spatial Data is the first accessible, self-contained treatment of hierarchical methods, modeling, and dat
Representing Uncertainty on Model Analysis Plots
Smith, Trevor I
2016-01-01
Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.
Three-dimensional model analysis and processing
Yu, Faxin; Luo, Hao; Wang, Pinghui
2011-01-01
This book focuses on five hot research directions in 3D model analysis and processing in computer science: compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.
Analysis model of structure-HDS
Institute of Scientific and Technical Information of China (English)
NONE
2000-01-01
Presents the model established for Structure-HDS (hydraulic damper system) analysis on the basis of the theoretical analysis model of non-compressed fluid in a round pipe, with a uniform velocity used as the basic variable and with pressure losses resulting from cross-section changes of the fluid route taken into consideration. This provides the necessary basis for research on the earthquake responses of a structure with a spacious first story, equipped with HDS at the first floor.
Applied research in uncertainty modeling and analysis
Ayyub, Bilal
2005-01-01
Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistical, or probabilistic. Since the early sixties, views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...
Combustion instability modeling and analysis
Energy Technology Data Exchange (ETDEWEB)
Santoro, R.J.; Yang, V.; Santavicca, D.A. [Pennsylvania State Univ., University Park, PA (United States)] [and others
1995-10-01
It is well known that the two key elements for achieving low emissions and high performance in a gas turbine combustor are to simultaneously establish (1) a lean combustion zone for maintaining low NO{sub x} emissions and (2) rapid mixing for good ignition and flame stability. However, these requirements, when coupled with the short combustor lengths used to limit the residence time for NO formation typical of advanced gas turbine combustors, can lead to problems regarding unburned hydrocarbons (UHC) and carbon monoxide (CO) emissions, as well as the occurrence of combustion instabilities. Clearly, the key to successful gas turbine development is based on understanding the effects of geometry and operating conditions on combustion instability, emissions (including UHC, CO and NO{sub x}) and performance. The concurrent development of suitable analytical and numerical models that are validated with experimental studies is important for achieving this objective. A major benefit of the present research will be to provide for the first time an experimentally verified model of emissions and performance of gas turbine combustors.
Model Checking as Static Analysis: Revisited
DEFF Research Database (Denmark)
Zhang, Fuyuan; Nielson, Flemming; Nielson, Hanne Riis
2012-01-01
We show that the model checking problem of the μ-calculus can be viewed as an instance of static analysis. We propose Succinct Fixed Point Logic (SFP) within our logical approach to static analysis as an extension of Alternation-free Least Fixed Logic (ALFP). We generalize the notion...
Analysis of variance for model output
Jansen, M.J.W.
1999-01-01
A scalar model output Y is assumed to depend deterministically on a set of stochastically independent input vectors of different dimensions. The composition of the variance of Y is considered; variance components of particular relevance for uncertainty analysis are identified. Several analysis of va
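The idea of decomposing Var(Y) into input-specific components can be illustrated on a toy additive model Y = X1 + 2*X2 with independent uniform(0, 1) inputs, where the first-order variance shares are analytically 1/5 and 4/5; this is an illustrative sketch, not the paper's method:

```python
import random

random.seed(0)
n = 200_000
x1 = [random.random() for _ in range(n)]   # input 1 ~ U(0, 1)
x2 = [random.random() for _ in range(n)]   # input 2 ~ U(0, 1)
y = [a + 2 * b for a, b in zip(x1, x2)]    # deterministic model output

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# For an additive model, the variance components are Var(X1) and 4*Var(X2),
# so the estimated first-order shares should land near 0.2 and 0.8.
share_x1 = var(x1) / var(y)
share_x2 = 4 * var(x2) / var(y)
```

For non-additive models the shares no longer sum to one, and the interaction terms the paper identifies become relevant for uncertainty analysis.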
Credit Risk Evaluation : Modeling - Analysis - Management
Wehrspohn, Uwe
2002-01-01
An analysis and further development of the building blocks of modern credit risk management: definitions of default; estimation of default probabilities; exposures; recovery rates; pricing; concepts of portfolio dependence; time horizons for risk calculations; quantification of portfolio risk; estimation of risk measures; portfolio analysis and portfolio improvement; evaluation and comparison of credit risk models; analytic portfolio loss distributions. The thesis contributes to the evaluatio...
Analysis and evaluation of collaborative modeling processes
Ssebuggwawo, D.
2012-01-01
Analysis and evaluation of collaborative modeling processes is confronted with many challenges. On the one hand, many systems design and re-engineering projects require collaborative modeling approaches that can enhance their productivity. But, such collaborative efforts, which often consist of the
FAME, the Flux Analysis and Modeling Environment
Boele, J.; Olivier, B.G.; Teusink, B.
2012-01-01
The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routi
Perturbation analysis of nonlinear matrix population models
Directory of Open Access Journals (Sweden)
Hal Caswell
2008-03-01
Perturbation analysis examines the response of a model to changes in its parameters. It is commonly applied to population growth rates calculated from linear models, but there has been no general approach to the analysis of nonlinear models. Nonlinearities in demographic models may arise due to density-dependence, frequency-dependence (in 2-sex models), feedback through the environment or the economy, and recruitment subsidy due to immigration, or from the scaling inherent in calculations of proportional population structure. This paper uses matrix calculus to derive the sensitivity and elasticity of equilibria, cycles, ratios (e.g. dependency ratios), age averages and variances, temporal averages and variances, life expectancies, and population growth rates, for both age-classified and stage-classified models. Examples are presented, applying the results to both human and non-human populations.
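For the linear baseline case mentioned above, the classical sensitivity and elasticity of the growth rate (the dominant eigenvalue) follow from the left and right eigenvectors. A sketch with an invented stage-classified matrix; the paper's contribution is extending such calculations to nonlinear models:

```python
import numpy as np

A = np.array([[0.0, 1.5, 2.0],
              [0.5, 0.0, 0.0],
              [0.0, 0.8, 0.0]])  # hypothetical stage-classified projection matrix

def dominant(M):
    """Dominant eigenvalue and (positive) dominant eigenvector of M."""
    vals, vecs = np.linalg.eig(M)
    i = np.argmax(vals.real)
    return vals[i].real, np.abs(vecs[:, i].real)

lam, w = dominant(A)      # growth rate and stable stage distribution
_, v = dominant(A.T)      # reproductive values (left eigenvector)

S = np.outer(v, w) / (v @ w)  # sensitivities d(lambda)/d(a_ij)
E = (A / lam) * S             # elasticities; they sum to 1 over the matrix
```

The elasticities are proportional sensitivities, so they can be compared across vital rates with different units, which is why they are the standard output of demographic perturbation analysis.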
Formalising responsibility modelling for automatic analysis
Simpson, Robbie; Storer, Tim
2015-01-01
Modelling the structure of social-technical systems as a basis for informing software system design is a difficult compromise. Formal methods struggle to capture the scale and complexity of the heterogeneous organisations that use technical systems. Conversely, informal approaches lack the rigour needed to inform the software design and construction process or enable automated analysis. We revisit the concept of responsibility modelling, which models social technical systems as a collec...
Adsorption modeling for macroscopic contaminant dispersal analysis
Energy Technology Data Exchange (ETDEWEB)
Axley, J.W.
1990-05-01
Two families of macroscopic adsorption models are formulated, based on fundamental principles of adsorption science and technology, that may be used for macroscopic (such as whole-building) contaminant dispersal analysis. The first family of adsorption models - the Equilibrium Adsorption (EA) Models - are based upon the simple requirement of equilibrium between adsorbent and room air. The second family - the Boundary Layer Diffusion Controlled Adsorption (BLDC) Models - add to the equilibrium requirement a boundary layer model for diffusion of the adsorbate from the room air to the adsorbent surface. Two members of each of these families are explicitly discussed, one based on the linear adsorption isotherm model and the other on the Langmuir model. The linear variants of each family are applied to model the adsorption dynamics of formaldehyde in gypsum wall board and compared to measured data.
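The two isotherm variants named above can be sketched directly; the parameter values below are invented for illustration. The linear isotherm is W = K*C, and the Langmuir isotherm is W = W_max*b*C/(1 + b*C), which reduces to a linear isotherm with slope W_max*b at low concentration:

```python
def linear_isotherm(c, k=0.5):
    """Linear isotherm: adsorbed loading proportional to concentration c."""
    return k * c

def langmuir_isotherm(c, w_max=2.0, b=0.8):
    """Langmuir isotherm: loading saturates at w_max as concentration grows."""
    return w_max * b * c / (1.0 + b * c)

low = langmuir_isotherm(0.01)  # near-linear regime, slope ~ w_max*b
sat = langmuir_isotherm(1e6)   # approaches the monolayer capacity w_max
```

For whole-building dispersal analysis, the isotherm supplies the equilibrium relation between room-air concentration and adsorbed mass; the BLDC family adds a boundary-layer diffusion rate on top of it.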
Simulation modeling and analysis with Arena
Altiok, Tayfur
2007-01-01
Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. The book introduces the concept of discrete-event Monte Carlo simulation, the most commonly used methodology for modeli...
HVDC dynamic modelling for small signal analysis
Energy Technology Data Exchange (ETDEWEB)
Yang, X.; Chen, C. [Shanghai Jiaotong Univ. (China). Dept. of Electrical Engineering]
2004-11-01
The conventional quasi-steady model of HVDC is not able to describe the dynamic switching behaviour of HVDC converters. By means of the sampled-data modelling approach, a linear time-invariant (LTI) small-signal dynamic model is developed for the HVDC main circuit in the synchronous rotating d-q reference frame. The linearised model is validated by time-domain simulation, which shows that it captures the dynamic response of the static switching circuits to perturbations in the operating point. The model is valid for analysing oscillations including high-frequency modes such as subsynchronous oscillation (SSO) and high-frequency instability. The model is applied in two cases: (i) SSO analysis, where the results are compared with the quasi-steady approach, which has been shown to be valid for normal SSO analysis; (ii) high-frequency eigenvalue analysis of the HVDC benchmark system, in which root-locus analysis and simulation show that an increased gain of the rectifier DC PI controller may result in high-frequency oscillatory instability. (author)
A Requirements Analysis Model Based on QFD
Institute of Scientific and Technical Information of China (English)
TANG Zhi-wei; Nelson K.H.Tang
2004-01-01
The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting it and regarding it as an important innovation. However, there is already evidence of high failure risk in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model based on quality function deployment (QFD) is presented to support requirements analysis for ERP projects.
Decision variables analysis for structured modeling
Institute of Scientific and Technical Information of China (English)
潘启树; 赫东波; 张洁; 胡运权
2002-01-01
Structured modeling is the most commonly used modeling method, but it does not adapt well to significant changes in environmental conditions. Therefore, Decision Variables Analysis (DVA), a new modeling method, is proposed to deal with linear programming modeling in changing environments. In variant linear programming, the most complicated relationships are those among decision variables. DVA classifies the decision variables into different levels using different index sets, and divides a model into different elements so that any change affects only part of the whole model. DVA takes into consideration the complicated relationships among decision variables at different levels, and can therefore successfully solve modeling problems in dramatically changing environments.
Sensitivity analysis of periodic matrix population models.
Caswell, Hal; Shyu, Esther
2012-12-01
Periodic matrix models are frequently used to describe cyclic temporal variation (seasonal or interannual) and to account for the operation of multiple processes (e.g., demography and dispersal) within a single projection interval. In either case, the models take the form of periodic matrix products. The perturbation analysis of periodic models must trace the effects of parameter changes, at each phase of the cycle, on output variables that are calculated over the entire cycle. Here, we apply matrix calculus to obtain the sensitivity and elasticity of scalar-, vector-, or matrix-valued output variables. We apply the method to linear models for periodic environments (including seasonal harvest models), to vec-permutation models in which individuals are classified by multiple criteria, and to nonlinear models including both immediate and delayed density dependence. The results can be used to evaluate management strategies and to study selection gradients in periodic environments. PMID:23316494
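The perturbation problem described above can be sketched numerically: for a periodic product A = B2 @ B1, the sensitivity of the annual growth rate (dominant eigenvalue) to the entries of one seasonal matrix can be approximated by finite differences. The paper's matrix-calculus results give these derivatives exactly; the 2x2 seasonal matrices below are invented for illustration.

```python
import numpy as np

# Finite-difference sensitivity of the annual growth rate (dominant eigenvalue of
# the periodic product A = B2 @ B1) to entries of each seasonal matrix.

def growth_rate(B1, B2):
    return max(abs(np.linalg.eigvals(B2 @ B1)))

def phase_sensitivity(B1, B2, phase=0, eps=1e-6):
    """d(lambda)/d(B_phase[i, j]) by central differences, one phase at a time."""
    mats = [B1.copy(), B2.copy()]
    S = np.zeros_like(mats[phase])
    for i in range(S.shape[0]):
        for j in range(S.shape[1]):
            up = [m.copy() for m in mats]; up[phase][i, j] += eps
            dn = [m.copy() for m in mats]; dn[phase][i, j] -= eps
            S[i, j] = (growth_rate(*up) - growth_rate(*dn)) / (2 * eps)
    return S

B1 = np.array([[0.0, 2.0], [0.5, 0.1]])   # e.g. breeding-season demography
B2 = np.array([[0.8, 0.0], [0.0, 0.6]])   # e.g. overwinter survival
lam = growth_rate(B1, B2)
S1 = phase_sensitivity(B1, B2, phase=0)
```

For nonnegative, irreducible seasonal matrices every sensitivity is nonnegative, in line with Perron-Frobenius theory; the matrix-calculus formulas in the paper recover the same quantities without numerical differentiation.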
Analysis of N Category Privacy Models
Directory of Open Access Journals (Sweden)
Marn-Ling Shing
2012-10-01
File sharing has become popular in social networking, and the disclosure of private information without the user's consent is easily found. Password management is increasingly necessary for maintaining a privacy policy, and monitoring of privacy-policy violations is needed to support the confidentiality of information security. This paper extends the analysis of a two-category confidentiality model to N categories, and illustrates how to use it to monitor security state transitions in information security privacy modeling.
Flood Progression Modelling and Impact Analysis
DEFF Research Database (Denmark)
Mioc, Darka; Anton, François; Nickerson, B.;
People living in the lower valley of the St. John River, New Brunswick, Canada, frequently experience flooding when the river overflows its banks during spring ice melt and rain. To better prepare the population of New Brunswick for extreme flooding, we developed a new flood prediction model that...... not be familiar with GIS analytical tools like Query Languages, can still understand technical discussions on flood analysis through the use of 3D models, which are close to reality....
Crystal structure of glutamate-1-semialdehyde-2,1-aminomutase from Arabidopsis thaliana
Energy Technology Data Exchange (ETDEWEB)
Song, Yingxian; Pu, Hua; Jiang, Tian; Zhang, Lixin; Ouyang, Min, E-mail: ouyangmin@ibcas.ac.cn [Chinese Academy of Sciences, Beijing 100093, People's Republic of China]
2016-05-23
A structural study of A. thaliana glutamate-1-semialdehyde-2,1-aminomutase (GSAM) has revealed asymmetry in cofactor binding as well as in the gating-loop orientation, which supports the previously proposed negative cooperativity between monomers of GSAM. Glutamate-1-semialdehyde-2,1-aminomutase (GSAM) catalyzes the isomerization of glutamate-1-semialdehyde (GSA) to 5-aminolevulinate (ALA) and is distributed in archaea, most bacteria and plants. Although structures of GSAM from archaea and bacteria have been resolved, a GSAM structure from a higher plant is not available, preventing further structure–function analysis. Here, the structure of GSAM from Arabidopsis thaliana (AtGSA1) obtained by X-ray crystallography is reported at 1.25 Å resolution. AtGSA1 forms an asymmetric dimer and displays asymmetry in cofactor binding as well as in the gating-loop orientation, which is consistent with previously reported Synechococcus GSAM structures. While one monomer binds PMP with the gating loop fixed in the open state, the other monomer binds either PMP or PLP and the gating loop is ready to close. The data also reveal the mobility of residues Gly163, Ser164 and Gly165, which are important for reorientation of the gating loop. Furthermore, the asymmetry of the AtGSA1 structure supports the previously proposed negative cooperativity between monomers of GSAM.
FAME, the Flux Analysis and Modeling Environment
Directory of Open Access Journals (Sweden)
Boele Joost
2012-01-01
Background: The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routine and, in our experience, renders these applications suboptimal for routine use by (systems) biologists. Results: The Flux Analysis and Modeling Environment (FAME) is the first web-based modeling tool that combines the tasks of creating, editing, running, and analyzing/visualizing stoichiometric models in a single program. Analysis results can be automatically superimposed on familiar KEGG-like maps. FAME is written in PHP and uses the Python-based PySCeS-CBM for its linear-solving capabilities. It comes with a comprehensive manual and a quick-start tutorial, and can be accessed online at http://f-a-m-e.org/. Conclusions: With FAME, we present the community with an open-source, user-friendly, web-based "one stop shop" for stoichiometric modeling. We expect the application will be of substantial use to investigators and educators alike.
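The core computation behind stoichiometric tools like FAME is optimizing a flux vector subject to the steady-state constraint S v = 0 and flux bounds (FAME itself delegates this to PySCeS-CBM). A minimal pure-NumPy sketch on an invented 3-reaction chain:

```python
import numpy as np

# Toy flux-balance sketch: at steady state S v = 0, and for this linear chain the
# nullspace of S is one-dimensional, so maximizing the biomass flux reduces to
# scaling the nullspace direction within the flux bounds. Network is invented.
# Reactions: R1 uptake (-> A), R2 conversion (A -> B), R3 biomass (B ->).
S = np.array([
    [1.0, -1.0,  0.0],   # metabolite A balance
    [0.0,  1.0, -1.0],   # metabolite B balance
])
upper = np.array([10.0, 8.0, np.inf])     # upper flux bounds; lower bounds are 0

# Nullspace direction via SVD: the right-singular vector for the zero singular value.
_, _, Vt = np.linalg.svd(S)
d = Vt[-1]
d = d / d[2]                              # normalize so the biomass flux is 1
# Largest feasible scaling t with 0 <= t*d <= upper:
t_max = min(u / di for u, di in zip(upper, d) if di > 0)
v_opt = t_max * d
optimal_biomass = v_opt[2]
```

Here the conversion bound of 8 limits the chain, so the optimal biomass flux is 8; real genome-scale models have high-dimensional nullspaces and require a linear-programming solver, as FAME uses.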
Independent Component Analysis in Multimedia Modeling
DEFF Research Database (Denmark)
Larsen, Jan; Hansen, Lars Kai; Kolenda, Thomas;
2003-01-01
Modeling of multimedia and multimodal data becomes increasingly important with the digitalization of the world. The objective of this paper is to demonstrate the potential of independent component analysis and blind sources separation methods for modeling and understanding of multimedia data, which...... largely refers to text, images/video, audio and combinations of such data. We review a number of applications within single and combined media with the hope that this might provide inspiration for further research in this area. Finally, we provide a detailed presentation of our own recent work on modeling...... combined text/image data for the purpose of cross-media retrieval....
Modeling and analysis of stochastic systems
Kulkarni, Vidyadhar G
2011-01-01
Based on the author's more than 25 years of teaching experience, Modeling and Analysis of Stochastic Systems, Second Edition covers the most important classes of stochastic processes used in the modeling of diverse systems, from supply chains and inventory systems to genetics and biological systems. For each class of stochastic process, the text includes its definition, characterization, applications, transient and limiting behavior, first passage times, and cost/reward models. Along with reorganizing the material, this edition revises and adds new exercises and examples. New to the second edi
Power system stability modelling, analysis and control
Sallam, Abdelhay A
2015-01-01
This book provides a comprehensive treatment of the subject from both a physical and mathematical perspective and covers a range of topics including modelling, computation of load flow in the transmission grid, stability analysis under both steady-state and disturbed conditions, and appropriate controls to enhance stability.
Comparative Distributions of Hazard Modeling Analysis
Directory of Open Access Journals (Sweden)
Rana Abdul Wajid
2006-07-01
In this paper we present a comparison of the distributions used in hazard analysis. Simulation techniques are used to study the behavior of hazard distribution models. The fundamentals of hazard analysis are discussed using failure criteria. We demonstrate the flexibility of the hazard modeling distribution as it approaches different standard distributions.
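A minimal sketch of the kind of hazard-distribution simulation study described: the Weibull hazard h(t) = (k/λ)(t/λ)^(k-1) is compared with an empirical hazard estimated from simulated lifetimes. The shape and scale values are invented for illustration.

```python
import math
import random

# For shape k > 1 the Weibull hazard increases with time (wear-out), for k < 1 it
# decreases (infant mortality), and k = 1 recovers the constant exponential hazard.

def weibull_hazard(t, k, lam):
    return (k / lam) * (t / lam) ** (k - 1)

def empirical_hazard(samples, t, dt=0.05):
    """Estimate h(t) = P(t <= T < t+dt | T >= t) / dt from simulated lifetimes."""
    at_risk = [x for x in samples if x >= t]
    failed = [x for x in at_risk if x < t + dt]
    return len(failed) / (len(at_risk) * dt) if at_risk else float("nan")

random.seed(1)
lam, k = 1.0, 2.0
samples = [random.weibullvariate(lam, k) for _ in range(200_000)]
h_true = weibull_hazard(1.0, k, lam)       # analytic hazard at t = 1
h_est = empirical_hazard(samples, 1.0)     # simulation-based estimate
```

The simulated estimate converges to the analytic hazard (up to the finite-width bias of the interval dt), which is exactly the kind of agreement such a comparative study checks across candidate distributions.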
Energy Systems Modelling Research and Analysis
DEFF Research Database (Denmark)
Møller Andersen, Frits; Alberg Østergaard, Poul
2015-01-01
This editorial introduces the seventh volume of the International Journal of Sustainable Energy Planning and Management. The volume presents part of the outcome of the project Energy Systems Modelling Research and Analysis (ENSYMORA) funded by the Danish Innovation Fund. The project carried out...
Model Selection in Data Analysis Competitions
DEFF Research Database (Denmark)
Wind, David Kofoed; Winther, Ole
2014-01-01
The use of data analysis competitions for selecting the most appropriate model for a problem is a recent innovation in the field of predictive machine learning. Two of the most well-known examples of this trend was the Netflix Competition and recently the competitions hosted on the online platform...
Stochastic Modelling and Analysis of Warehouse Operations
Y. Gong (Yeming)
2009-01-01
textabstractThis thesis has studied stochastic models and analysis of warehouse operations. After an overview of stochastic research in warehouse operations, we explore the following topics. Firstly, we search optimal batch sizes in a parallel-aisle warehouse with online order arrivals. We employ a
Modeling uncertainty in geographic information and analysis
Institute of Scientific and Technical Information of China (English)
2008-01-01
Uncertainty modeling and data quality for spatial data and spatial analyses are important topics in geographic information science, together with space and time in geography and spatial analysis. In the past two decades, considerable effort has been devoted to research on uncertainty modeling for spatial data and analyses. This paper presents our work in this research area. In particular, four advances are presented: (a) from determinedness-based to uncertainty-based representation of geographic objects in GIS; (b) from uncertainty modeling for static data to dynamic spatial analyses; (c) from modeling uncertainty for spatial data to models; and (d) from error descriptions to quality control for spatial data.
End user analysis model at CMS
Spiga, D
2009-01-01
The CMS experiment at the LHC has had a distributed computing model since early in the project plan. The geographically distributed computing system is based on a hierarchy of tiered regional computing centers: data reconstructed at the Tier-0 are distributed and archived at Tier-1 centers, where re-reconstruction of data events is performed and computing resources for skimming and selection are provided. The Tier-2 centers are the primary location for analysis activities. Analysis is thus performed in a distributed way using the Grid infrastructure. The CMS computing model is also designed to give a worldwide collaboration of thousands of physicists (about 2600 from 180 scientific institutes) access to the data. To demand of the end user only very limited knowledge of the underlying technical details, CMS has developed a set of specific tools using Grid services. This model is being tested in many Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Gr...
A Dynamic Model for Energy Structure Analysis
Institute of Scientific and Technical Information of China (English)
NONE
2006-01-01
Energy structure is a complicated system concerning economic development, natural resources, technological innovation, ecological balance, social progress and many other elements. It is not easy to explain clearly the developmental mechanism of an energy system and the mutual relations between the energy system and its related environments by the traditional methods. It is necessary to develop a suitable dynamic model, which can reflect the dynamic characteristics and the mutual relations of the energy system and its related environments. In this paper, the historical development of China's energy structure was analyzed. A new quantitative analysis model was developed based on system dynamics principles through analysis of energy resources, and the production and consumption of energy in China and comparison with the world. Finally, this model was used to predict China's future energy structures under different conditions.
Modeling and Analysis of Pulse Skip Modulation
Institute of Scientific and Technical Information of China (English)
NONE
2006-01-01
The state-space average model and the large-signal models of the Pulse Skip Modulation (PSM) mode are given in this paper. Furthermore, based on these models and simulations of PSM converter circuits, the characteristics of the PSM converter are analysed, including efficiency, frequency spectrum, output voltage ripple, response speed and interference rejection capability. Compared with the PWM control mode, the PSM converter has high efficiency, especially at light loads, quick response, good interference rejection and good EMC characteristics. With slight improvements, PSM could be a good independent regulating mode over the whole operating range of a DC-DC converter. Finally, some experimental results are also presented.
Analysis hierarchical model for discrete event systems
Ciortea, E. M.
2015-11-01
This paper presents a hierarchical model based on discrete-event networks for robotic systems. Following the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for modelling and control of complex robotic systems. Such a system is structured, controlled and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical discrete-event model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple to implement on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets; discrete-event systems are a pragmatic tool for modelling industrial systems. To highlight the auxiliary times, the timed Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. The proposed simulation of the robotic system using timed Petri nets offers the opportunity to view the robot timing; graphics obtained from transmission times measured on the spot show the average time of the transport activity for sets of finished products.
To model bolted parts for tolerance analysis using variational model
Directory of Open Access Journals (Sweden)
Wilma Polini
2015-01-01
Mechanical products are usually made by assembling many parts. Among the different types of joints, bolts are widely used to connect the components of an assembly. In a bolted joint, a clearance exists between the bolt and the holes of the parts to be joined. This clearance has to be modeled in order to define the possible movements allowed to the joined parts. The clearance model forms part of the global model that builds the stack-up functions by accumulating the tolerances applied to the assembly components. The stack-up functions are then solved to evaluate the influence of the tolerances assigned to the assembly components on the functional requirements of the assembly product. The aim of this work is to model the joint between two parts made by a planar contact surface and two bolts inside the model that builds and solves the stack-up functions of the tolerance analysis, adopting the variational solid model. The proposed model uses the simplifying hypothesis that each surface maintains its nominal shape, i.e. the effects of form errors are neglected. The proposed model has been applied to a case study in which the holes have dimensional and positional tolerances in order to demonstrate its effectiveness.
Ferrofluids: Modeling, numerical analysis, and scientific computation
Tomas, Ignacio
This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) is a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a
Guidelines for system modeling: fault tree analysis
Energy Technology Data Exchange (ETDEWEB)
Lee, Yoon Hwan; Yang, Joon Eon; Kang, Dae Il; Hwang, Mee Jeong
2004-07-01
This document, the guidelines for system modeling related to Fault Tree Analysis (FTA), is intended to provide analysts with guidelines for constructing fault trees at the level of capability category II of the ASME PRA standard. In particular, it provides the essential and basic guidelines and related content to be used in support of revising the Ulchin 3 and 4 PSA model for the risk monitor within capability category II of the ASME PRA standard. Normally, the main objective of system analysis is to assess the reliability of systems modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA is the method used for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to system unavailability. This should include contributions due to mechanical failures of the components, common cause failures (CCFs), human errors, and outages for testing and maintenance. This document identifies and describes the definitions and general procedures of FTA and the essential and basic guidelines for revising fault trees. Accordingly, the guidelines will make it possible to carry out FTA at the level of capability category II of the ASME PRA standard.
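For independent basic events, the AND/OR gate logic described above reduces to simple probability algebra. A minimal sketch with invented event probabilities (not values from the Ulchin PSA model):

```python
from functools import reduce

# Minimal fault-tree evaluator for independent basic events, illustrating AND/OR
# gate logic; the event probabilities below are invented.

def p_and(probs):                      # the gate fails only if all inputs fail
    return reduce(lambda a, b: a * b, probs, 1.0)

def p_or(probs):                       # the gate fails if any input fails
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

# Top = OR(pump fails, AND(valve A fails, valve B fails))
basic = {"pump": 1e-3, "valve_a": 5e-2, "valve_b": 5e-2}
top = p_or([basic["pump"], p_and([basic["valve_a"], basic["valve_b"]])])
```

Real PSA fault trees additionally require minimal cut set generation and common-cause treatment, which is why dedicated codes (and guidelines like this document) are needed.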
Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial
This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...
Social phenomena from data analysis to models
Perra, Nicola
2015-01-01
This book focuses on the new possibilities and approaches to social modeling currently being made possible by an unprecedented variety of datasets generated by our interactions with modern technologies. This area has witnessed a veritable explosion of activity over the last few years, yielding many interesting and useful results. Our aim is to provide an overview of the state of the art in this area of research, merging an extremely heterogeneous array of datasets and models. Social Phenomena: From Data Analysis to Models is divided into two parts. Part I deals with modeling social behavior under normal conditions: How we live, travel, collaborate and interact with each other in our daily lives. Part II deals with societal behavior under exceptional conditions: Protests, armed insurgencies, terrorist attacks, and reactions to infectious diseases. This book offers an overview of one of the most fertile emerging fields bringing together practitioners from scientific communities as diverse as social sciences, p...
Advances in statistical models for data analysis
Minerva, Tommaso; Vichi, Maurizio
2015-01-01
This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.
3D face modeling, analysis and recognition
Daoudi, Mohamed; Veltkamp, Remco
2013-01-01
3D Face Modeling, Analysis and Recognition presents methodologies for analyzing shapes of facial surfaces, develops computational tools for analyzing 3D face data, and illustrates them using state-of-the-art applications. The methodologies chosen are based on efficient representations, metrics, comparisons, and classifications of features that are especially relevant in the context of 3D measurements of human faces. These frameworks have a long-term utility in face analysis, taking into account the anticipated improvements in data collection, data storage, processing speeds, and application s
LCD motion blur: modeling, analysis, and algorithm.
Chan, Stanley H; Nguyen, Truong Q
2011-08-01
Liquid crystal display (LCD) devices are well known for their slow responses due to the physical limitations of liquid crystals. Therefore, fast moving objects in a scene are often perceived as blurred. This effect is known as LCD motion blur. In order to reduce LCD motion blur, an accurate LCD model and an efficient deblurring algorithm are needed. However, existing LCD motion blur models are insufficient to reflect the limitations of the human eye-tracking system. Also, the spatiotemporal equivalence in LCD motion blur models has not been proven directly in the discrete 2-D spatial domain, although it is widely used. There are three main contributions of this paper: modeling, analysis, and algorithm. First, a comprehensive LCD motion blur model is presented, in which human eye-tracking limits are taken into consideration. Second, a complete analysis of spatiotemporal equivalence is provided and verified using real video sequences. Third, an LCD motion blur reduction algorithm is proposed. The proposed algorithm solves an l1-norm regularized least-squares minimization problem using a subgradient projection method. Numerical results show that the proposed algorithm gives a higher peak SNR, lower temporal error, and lower spatial error than motion-compensated inverse filtering and the Lucy-Richardson deconvolution algorithm, which are two state-of-the-art LCD deblurring algorithms.
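The deblurring step described above amounts to an l1-regularized least-squares problem. The sketch below uses the closely related ISTA (proximal-gradient) iteration rather than the authors' subgradient projection method, on invented random problem data rather than LCD video:

```python
import numpy as np

# ISTA for min_x (1/2)||Ax - b||^2 + lam*||x||_1: a gradient step on the smooth
# term followed by soft-thresholding (the proximal operator of the l1 norm).

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, iters=500):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20); x_true[:3] = [2.0, -1.5, 1.0]   # sparse ground truth
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
err = np.linalg.norm(x_hat - x_true)
```

On this noiseless sparse problem the iteration recovers the three active coefficients up to the small shrinkage bias of the l1 penalty; the paper's subgradient projection method targets the same objective at video scale.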
Modelling and analysis of global coal markets
Energy Technology Data Exchange (ETDEWEB)
Trueby, Johannes
2013-01-17
The thesis comprises four interrelated essays featuring modelling and analysis of coal markets. Each of the four essays has a dedicated chapter in this thesis. Chapters 2 to 4 have, from a topical perspective, a backward-looking focus and deal with explaining recent market outcomes in the international coal trade. The findings of those essays may serve as guidance for assessing current coal market outcomes as well as expected market outcomes in the near to medium-term future. Chapter 5 has a forward-looking focus and builds a bridge between explaining recent market outcomes and projecting long-term market equilibria. Chapter 2, Strategic Behaviour in International Metallurgical Coal Markets, deals with market conduct of large exporters in the market of coals used in steel-making in the period 2008 to 2010. In this essay I analyse whether prices and trade-flows in the international market for metallurgical coals were subject to non-competitive conduct in the period 2008 to 2010. To do so, I develop mathematical programming models - a Stackelberg model, two varieties of a Cournot model, and a perfect competition model - for computing spatial equilibria in international resource markets. Results are analysed with various statistical measures to assess the prediction accuracy of the models. The results show that real market equilibria cannot be reproduced with a competitive model. However, real market outcomes can be accurately simulated with the non-competitive models, suggesting that market equilibria in the international metallurgical coal trade were subject to the strategic behaviour of coal exporters. Chapter 3 and chapter 4 deal with market power issues in the steam coal trade in the period 2006 to 2008. Steam coals are typically used to produce steam either for electricity generation or for heating purposes. In Chapter 3 we analyse market behaviour of key exporting countries in the steam coal trade. This chapter features the essay Market Structure Scenarios in
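The non-competitive models mentioned above build on standard oligopoly theory. As a toy illustration (not the thesis's spatial-equilibrium formulation), a two-exporter Cournot equilibrium under linear inverse demand can be computed by best-response iteration; the demand and cost numbers are invented:

```python
# Two-player Cournot equilibrium under linear inverse demand P(Q) = a - b*Q,
# with constant marginal costs c1, c2; all parameter values are illustrative.

def cournot_two_player(a, b, c1, c2, iters=200):
    """Best-response iteration; q_i* = (a - c_i - b*q_j) / (2b)."""
    q1 = q2 = 0.0
    for _ in range(iters):
        q1 = max(0.0, (a - c1 - b * q2) / (2 * b))
        q2 = max(0.0, (a - c2 - b * q1) / (2 * b))
    price = a - b * (q1 + q2)
    return q1, q2, price

q1, q2, p = cournot_two_player(a=100.0, b=1.0, c1=10.0, c2=20.0)
```

The iteration converges to the closed-form equilibrium q_i* = (a - 2c_i + c_j)/(3b), with the lower-cost exporter capturing the larger share and the price settling above the competitive level, which is the qualitative signature the thesis tests for in the metallurgical coal data.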
Extrudate Expansion Modelling through Dimensional Analysis Method
DEFF Research Database (Denmark)
A new model framework is proposed to correlate extrudate expansion and extrusion operation parameters for a food extrusion cooking process through the dimensional analysis principle, i.e. the Buckingham pi theorem. Three dimensionless groups, i.e. energy, water content and temperature, are suggested...... to describe the extrudate expansion. From the three dimensionless groups, an equation with three experimentally determined parameters is derived to express the extrudate expansion. The model is evaluated with whole wheat flour and aquatic feed extrusion experimental data. The average deviations
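The fitting step of such a dimensional-analysis model can be sketched as follows: with three dimensionless groups, a power-law correlation E = a·Π1^b1·Π2^b2·Π3^b3 becomes linear in logarithms, so its parameters are recoverable by least squares. The data below are synthetic, not the paper's wheat-flour measurements.

```python
import numpy as np

# Fit a power-law correlation between an expansion index E and three
# dimensionless groups via log-linear least squares; data are synthetic.
rng = np.random.default_rng(42)
Pi = rng.uniform(0.5, 2.0, size=(50, 3))        # energy, water, temperature groups
true_exp = np.array([0.8, -0.4, 0.3])           # exponents used to generate data
E = 1.5 * np.prod(Pi ** true_exp, axis=1) * np.exp(rng.normal(0, 0.01, 50))

# log E = log a + b1*log Pi1 + b2*log Pi2 + b3*log Pi3  ->  linear least squares
X = np.column_stack([np.ones(50), np.log(Pi)])
coef, *_ = np.linalg.lstsq(X, np.log(E), rcond=None)
a_hat, b_hat = np.exp(coef[0]), coef[1:]
```

With low noise the prefactor and the three exponents are recovered accurately; with real extrusion data the residuals of this regression give the average deviations the paper reports.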
Formal Modeling and Analysis of Timed Systems
DEFF Research Database (Denmark)
Larsen, Kim Guldstrand; Niebert, Peter
This book constitutes the thoroughly refereed post-proceedings of the First International Workshop on Formal Modeling and Analysis of Timed Systems, FORMATS 2003, held in Marseille, France in September 2003. The 19 revised full papers presented together with an invited paper and the abstracts of...... two invited talks were carefully selected from 36 submissions during two rounds of reviewing and improvement. All current aspects of formal methods for modeling and analyzing timed systems are addressed; among the timed systems dealt with are timed automata, timed Petri nets, max-plus algebras, real...
MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS
Directory of Open Access Journals (Sweden)
Anass BAYAGA
2010-07-01
The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated the use of impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there is a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to achieve a better risk reward it is more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student- and infrastructure-based) and the business impact. Lastly, although business cycles vary considerably depending on the industry and the institution, the study revealed that most impacts in the HEI (university) fell within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.
Mathematical analysis of a muscle architecture model.
Navallas, Javier; Malanda, Armando; Gila, Luis; Rodríguez, Javier; Rodríguez, Ignacio
2009-01-01
Modeling of muscle architecture, which aims to recreate mathematically the physiological structure of the muscle fibers and motor units, is a powerful tool for understanding and modeling the mechanical and electrical behavior of the muscle. Most of the published models are presented in the form of algorithms, without mathematical analysis of mechanisms or outcomes of the model. Through the study of the muscle architecture model proposed by Stashuk, we present the analytical tools needed to better understand these models. We provide a statistical description for the spatial relations between motor units and muscle fibers. We are particularly concerned with two physiological quantities: the motor unit fiber number, which we expect to be proportional to the motor unit territory area; and the motor unit fiber density, which we expect to be constant for all motor units. Our results indicate that the Stashuk model is in good agreement with the physiological evidence in terms of the expectations outlined above. However, the resulting variance is very high. In addition, a considerable 'edge effect' is present in the outer zone of the muscle cross-section, making the properties of the motor units dependent on their location. This effect is relevant when motor unit territories and muscle cross-section are of similar size.
Multivariate Probabilistic Analysis of an Hydrological Model
Franceschini, Samuela; Marani, Marco
2010-05-01
Model predictions derived from rainfall measurements and hydrological model results are often limited by the systematic error of measuring instruments, by the intrinsic variability of the natural processes, and by the uncertainty of the mathematical representation. We propose a means to identify such sources of uncertainty and to quantify their effects based on point-estimate approaches, as a valid alternative to cumbersome Monte Carlo methods. We present uncertainty analyses of the hydrologic response to selected meteorological events in the mountain streamflow-generating portion of the Brenta basin at Bassano del Grappa, Italy. The Brenta river catchment has a relatively uniform morphology and a quite heterogeneous rainfall pattern. In the present work, we evaluate two sources of uncertainty: data uncertainty (the uncertainty due to data handling and analysis) and model uncertainty (the uncertainty related to the formulation of the model). We thus evaluate the effects of the measurement error of tipping-bucket rain gauges, the uncertainty in estimating spatially-distributed rainfall through block kriging, and the uncertainty associated with estimated model parameters. To this end, we coupled a deterministic model based on the geomorphological theory of the hydrologic response to probabilistic methods. In particular we compare the results of Monte Carlo Simulations (MCS) to the results obtained, in the same conditions, using Li's Point Estimate Method (LiM). The LiM is a probabilistic technique that approximates the continuous probability distribution function of the considered stochastic variables by means of discrete points and associated weights. This allows results to be reproduced satisfactorily with only a few evaluations of the model function. The comparison between the LiM and MCS results highlights the pros and cons of using an approximating method. LiM is less computationally demanding than MCS, but has limited applicability especially when the model
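The trade-off described above can be illustrated with a generic two-point estimate in the spirit of Rosenblueth's method, a relative of Li's PEM (the model function and its distribution parameters below are invented for illustration): the distribution of an uncertain input is collapsed to two points at mu ± sigma with equal weights, and the model is evaluated only at those points.

```python
import math
import random

def g(x):
    # Hypothetical nonlinear model response (stands in for the hydrological model).
    return 2.0 * x ** 2 + 3.0 * x

mu, sigma = 1.0, 0.2  # assumed input distribution: X ~ N(mu, sigma)

# Two-point estimate: evaluate g at mu - sigma and mu + sigma, equal weights.
pts = [mu - sigma, mu + sigma]
mean_pe = sum(g(p) for p in pts) / 2.0
var_pe = sum((g(p) - mean_pe) ** 2 for p in pts) / 2.0

# Monte Carlo reference requires many model evaluations.
random.seed(0)
samples = [g(random.gauss(mu, sigma)) for _ in range(200000)]
mean_mc = sum(samples) / len(samples)
```

For this quadratic g the two-point mean matches the exact value E[g(X)] = 2(mu² + sigma²) + 3mu = 5.08 using just two model runs, versus 200,000 for the Monte Carlo reference; for strongly nonlinear models the approximation degrades, which is the limitation noted above.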
Scripted Building Energy Modeling and Analysis: Preprint
Energy Technology Data Exchange (ETDEWEB)
Hale, E.; Macumber, D.; Benne, K.; Goldwasser, D.
2012-08-01
Building energy modeling and analysis is currently a time-intensive, error-prone, and nonreproducible process. This paper describes the scripting platform of the OpenStudio tool suite (http://openstudio.nrel.gov) and demonstrates its use in several contexts. Two classes of scripts are described and demonstrated: measures and free-form scripts. Measures are small, single-purpose scripts that conform to a predefined interface. Because measures are fairly simple, they can be written or modified by inexperienced programmers.
Bayesian Analysis of Multivariate Probit Models
Siddhartha Chib; Edward Greenberg
1996-01-01
This paper provides a unified simulation-based Bayesian and non-Bayesian analysis of correlated binary data using the multivariate probit model. The posterior distribution is simulated by Markov chain Monte Carlo methods, and maximum likelihood estimates are obtained by a Markov chain Monte Carlo version of the E-M algorithm. Computation of Bayes factors from the simulation output is also considered. The methods are applied to a bivariate data set, to a 534-subject, four-year longitudinal dat...
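The data-augmentation idea behind this sampler can be sketched in the simplest intercept-only, univariate probit case (an Albert–Chib style latent-variable Gibbs sampler; all data below are simulated for illustration, and the multivariate algorithm in the paper generalizes this scheme):

```python
import math
import random

random.seed(1)

# Simulate binary data from a probit model with a single intercept beta_true,
# i.e. P(y = 1) = Phi(beta_true).
beta_true = 0.5
n = 500
Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
y = [1 if random.random() < Phi(beta_true) else 0 for _ in range(n)]

def draw_truncated(mean, positive):
    """Draw z ~ N(mean, 1) restricted to z > 0 (or z < 0) by simple rejection."""
    while True:
        z = random.gauss(mean, 1.0)
        if (z > 0) == positive:
            return z

# Gibbs sampling via data augmentation: alternate latent draws z_i and beta draws.
beta, draws = 0.0, []
for it in range(600):
    z = [draw_truncated(beta, yi == 1) for yi in y]
    beta = random.gauss(sum(z) / n, math.sqrt(1.0 / n))  # flat prior on beta
    if it >= 100:  # discard burn-in
        draws.append(beta)

post_mean = sum(draws) / len(draws)
```

The posterior mean of beta concentrates near the true value 0.5; in the multivariate case the same alternation runs over correlated latent normals and a covariance matrix.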
Micromechatronics modeling, analysis, and design with Matlab
Giurgiutiu, Victor
2009-01-01
Focusing on recent developments in engineering science, enabling hardware, advanced technologies, and software, Micromechatronics: Modeling, Analysis, and Design with MATLAB®, Second Edition provides clear, comprehensive coverage of mechatronic and electromechanical systems. It applies cornerstone fundamentals to the design of electromechanical systems, covers emerging software and hardware, introduces the rigorous theory, examines the design of high-performance systems, and helps develop problem-solving skills. Along with more streamlined material, this edition adds many new sections to exist
Economic Modeling and Analysis of Educational Vouchers
Dennis Epple; Richard Romano
2012-01-01
The analysis of educational vouchers has evolved from market-based analogies to models that incorporate distinctive features of the educational environment. These distinctive features include peer effects, scope for private school pricing and admissions based on student characteristics, the linkage of household residential and school choices in multidistrict settings, the potential for rent seeking in public and private schools, the role of school reputations, incentives for student effort, a...
ANALYSIS MODEL FOR RETURN ON CAPITAL EMPLOYED
BURJA CAMELIA
2013-01-01
At the microeconomic level, the appreciation of the capitals’ profitability is a very complex action which is of interest for stakeholders. This study has as main purpose to extend the traditional analysis model for the capitals’ profitability, based on the ratio “Return on capital employed”. In line with it the objectives of this work aim the identification of factors that exert an influence on the capital’s profitability utilized by a company and the measurement of their contribution in the...
An analysis of penalized interaction models
Zhao, Junlong; Leng, Chenlei
2016-01-01
An important consideration for variable selection in interaction models is to design an appropriate penalty that respects hierarchy of the importance of the variables. A common theme is to include an interaction term only after the corresponding main effects are present. In this paper, we study several recently proposed approaches and present a unified analysis on the convergence rate for a class of estimators, when the design satisfies the restricted eigenvalue condition. In particular, we s...
Modeling and Thermal Analysis of Disc Brake
Directory of Open Access Journals (Sweden)
Praveena S
2014-10-01
Full Text Available The disc brake is a device for slowing or stopping the rotation of a vehicle. Repeated braking generates heat during each braking event, and the disc can fail due to the resulting high temperatures. The disc brake is modeled in CATIA and analysed in ANSYS Workbench. The main purpose of this project is the thermal analysis of four materials: aluminium, grey cast iron, HSS M42, and HSS M2. Comparing the thermal results and material properties of the four materials, the material with the lowest thermal gradient is preferred. Hence grey cast iron, the low-thermal-gradient material, is the most suitable choice for disc brakes for better performance.
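As a crude illustration of why material thermal properties matter during a braking event (a lumped heat-balance sketch, not the paper's ANSYS finite-element analysis; the property, geometry, and braking values are rough, assumed numbers):

```python
# Lumped-capacitance sketch of disc heating during one braking event:
#   dT/dt = (q_in - h*A*(T - T_inf)) / (rho * V * c)
materials = {           # rho [kg/m^3], c [J/(kg K)] -- rough handbook-style values
    "aluminum":       (2700.0, 900.0),
    "grey cast iron": (7200.0, 460.0),
}
V, A = 1e-3, 0.1        # disc volume [m^3] and convective area [m^2] (assumed)
h, T_inf = 100.0, 25.0  # convection coefficient [W/(m^2 K)], ambient temp [C]
q_in, t_end, dt = 5000.0, 10.0, 0.01  # braking heat input [W], duration [s], step [s]

peak = {}
for name, (rho, c) in materials.items():
    T, t = T_inf, 0.0
    while t < t_end:    # explicit Euler integration of the heat balance
        T += dt * (q_in - h * A * (T - T_inf)) / (rho * V * c)
        t += dt
    peak[name] = T
```

Grey cast iron's higher volumetric heat capacity (rho·c) gives a lower temperature rise for the same heat input, consistent with the preference stated above; a real comparison would of course resolve spatial gradients in the disc.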
Modeling late entry bias in survival analysis.
Matsuura, Masaaki; Eguchi, Shinto
2005-06-01
In a failure time analysis, we sometimes observe additional study subjects who enter during the study period. These late entries are treated as left-truncated data in the statistical literature. However, with real data, there is a substantial possibility that the delayed entries may have extremely different hazards compared to the other standard subjects. We focus on a situation in which such entry bias might arise in the analysis of survival data. The purpose of the present article is to develop an appropriate methodology for making inference about data including late entries. We construct a model that includes parameters for the effect of delayed entry bias having no specification for the distribution of entry time. We also discuss likelihood inference based on this model and derive the asymptotic behavior of estimates. A simulation study is conducted for a finite sample size in order to compare the analysis results using our method with those using the standard method, where independence between entry time and failure time is assumed. We apply this method to mortality analysis among atomic bomb survivors defined in a geographical study region. PMID:16011705
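The effect of treating late entries as left-truncated data can be sketched for the simplest parametric case, an exponential hazard (an illustrative toy, not the authors' method; all numbers are simulated):

```python
import random

random.seed(2)

lam_true = 0.5
data = []
# Simulate failure times with delayed (late) entry: a subject is observed
# only if it is still at risk when it enters the study (t > e).
for _ in range(20000):
    t = random.expovariate(lam_true)   # failure time
    e = random.uniform(0.0, 2.0)       # entry (left-truncation) time
    if t > e:
        data.append((e, t))

n = len(data)
# The left-truncated exponential log-likelihood, sum_i [log(lam) - lam*(t_i - e_i)],
# is maximized by:
lam_hat = n / sum(t - e for e, t in data)
# A naive estimate that ignores truncation (pretends observation from time 0):
lam_naive = n / sum(t for _, t in data)
```

Here lam_hat recovers the true hazard of 0.5, while the naive estimate is biased downward because late entrants are over-representatively long-lived, illustrating why delayed entry needs explicit handling.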
Modeling and analysis of advanced binary cycles
Energy Technology Data Exchange (ETDEWEB)
Gawlik, K.
1997-12-31
A computer model (Cycle Analysis Simulation Tool, CAST) and a methodology have been developed to perform value analysis for small, low- to moderate-temperature binary geothermal power plants. The value analysis method allows for incremental changes in the levelized electricity cost (LEC) to be determined between a baseline plant and a modified plant. Thermodynamic cycle analyses and component sizing are carried out in the model followed by economic analysis which provides LEC results. The emphasis of the present work is on evaluating the effect of mixed working fluids instead of pure fluids on the LEC of a geothermal binary plant that uses a simple Organic Rankine Cycle. Four resources were studied spanning the range of 265°F to 375°F. A variety of isobutane and propane based mixtures, in addition to pure fluids, were used as working fluids. This study shows that the use of propane mixtures at a 265°F resource can reduce the LEC by 24% when compared to a base case value that utilizes commercial isobutane as its working fluid. The cost savings drop to 6% for a 375°F resource, where an isobutane mixture is favored. Supercritical cycles were found to have the lowest cost at all resources.
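The levelized-cost comparison at the heart of the value-analysis method reduces to simple arithmetic; the sketch below uses invented plant figures (capital cost, fixed charge rate, O&M, and generation are assumptions, not CAST results):

```python
def lec(capital_cost, fixed_charge_rate, annual_om, annual_kwh):
    """Levelized electricity cost in $/kWh: annualized capital plus O&M,
    divided by annual generation."""
    return (capital_cost * fixed_charge_rate + annual_om) / annual_kwh

# Hypothetical baseline plant vs. a modified (e.g. mixed-working-fluid) plant.
base = lec(30e6, 0.12, 1.2e6, 40e6)
modified = lec(28e6, 0.12, 1.1e6, 42e6)
savings = 1.0 - modified / base   # incremental LEC reduction
```

The value analysis then attributes the incremental LEC change (here about 12% with these made-up numbers) to the design modification being evaluated.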
Mathematical analysis of epidemiological models with heterogeneity
Energy Technology Data Exchange (ETDEWEB)
Van Ark, J.W.
1992-01-01
For many diseases in human populations the disease shows dissimilar characteristics in separate subgroups of the population; for example, the probability of disease transmission for gonorrhea or AIDS is much higher from male to female than from female to male. There is reason to construct and analyze epidemiological models which allow for this heterogeneity of the population, and to use these models to run computer simulations of the disease to predict its incidence and prevalence. In the models considered here the heterogeneous population is separated into subpopulations whose internal and external interactions are homogeneous, in the sense that each person in a subpopulation can be assumed to behave according to the average for that subpopulation. The first model considered is an SIRS model; i.e., a Susceptible can become Infected, and if so eventually Recovers with temporary immunity, and after a period of time becomes Susceptible again. Special cases allow for permanent immunity or other variations. This model is analyzed and threshold conditions are given which determine whether the disease dies out or persists. A deterministic model is presented; this model is constructed using difference equations, and it has been used in computer simulations of the AIDS epidemic in the homosexual population in San Francisco. The homogeneous and heterogeneous versions of both the differential-equation and difference-equation forms of the deterministic model are analyzed mathematically. In the analysis, equilibria are identified and threshold conditions are set forth: below the threshold the disease dies out, so that the disease-free equilibrium is globally asymptotically stable; above the threshold the disease persists, so that the disease-free equilibrium is unstable and there is a unique endemic equilibrium.
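The SIRS threshold behaviour can be sketched numerically for the homogeneous case (the rates beta, gamma, delta below are invented for illustration, not taken from the thesis). In this formulation the threshold is R0 = beta/gamma: above it the simulation settles at the endemic equilibrium i* = (1 - gamma/beta) / (1 + gamma/delta); below it the infection dies out.

```python
def simulate_sirs(beta, gamma, delta, t_end=600.0, dt=0.05):
    """Euler-integrate the SIRS fractions s, i, r; return the final infected fraction.
    s' = -beta*s*i + delta*r,  i' = beta*s*i - gamma*i,  r' = gamma*i - delta*r."""
    s, i, r = 0.9, 0.1, 0.0
    for _ in range(int(t_end / dt)):
        ds = -beta * s * i + delta * r
        di = beta * s * i - gamma * i
        dr = gamma * i - delta * r
        s, i, r = s + dt * ds, i + dt * di, r + dt * dr
    return i

# Above threshold (R0 = beta/gamma = 2): the disease persists at an endemic level.
i_endemic = simulate_sirs(beta=0.5, gamma=0.25, delta=0.1)
# Below threshold (R0 = 0.8): the disease dies out.
i_extinct = simulate_sirs(beta=0.2, gamma=0.25, delta=0.1)
```

For beta=0.5, gamma=0.25, delta=0.1 the endemic equilibrium is i* = 0.5/3.5 ≈ 0.1429, which the simulation reproduces; the heterogeneous models in the thesis couple several such subpopulation systems.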
Model reduction using a posteriori analysis
Whiteley, Jonathan P.
2010-05-01
Mathematical models in biology and physiology are often represented by large systems of non-linear ordinary differential equations. In many cases, an observed behaviour may be written as a linear functional of the solution of this system of equations. A technique is presented in this study for automatically identifying key terms in the system of equations that are responsible for a given linear functional of the solution. This technique is underpinned by ideas drawn from a posteriori error analysis. This concept has been used in finite element analysis to identify regions of the computational domain and components of the solution where a fine computational mesh should be used to ensure accuracy of the numerical solution. We use this concept to identify regions of the computational domain and components of the solution where accurate representation of the mathematical model is required for accuracy of the functional of interest. The technique presented is demonstrated by application to a model problem, and then to automatically deduce known results from a cell-level cardiac electrophysiology model. © 2010 Elsevier Inc.
Ontological Modeling for Integrated Spacecraft Analysis
Wicks, Erica
2011-01-01
Current spacecraft work as a cooperative group of a number of subsystems. Each of these requires modeling software for development, testing, and prediction. It is the goal of my team to create an overarching software architecture called the Integrated Spacecraft Analysis (ISCA) to aid in deploying the discrete subsystems' models. Such a plan has been attempted in the past, and has failed due to the excessive scope of the project. Our goal in this version of ISCA is to use new resources to reduce the scope of the project, including using ontological models to help link the internal interfaces of subsystems' models with the ISCA architecture. I have created an ontology of functions specific to the modeling system of the navigation system of a spacecraft. The resulting ontology not only links, at an architectural level, language-specific instantiations of the modeling system's code, but also is web-viewable and can act as a documentation standard. This ontology is proof of the concept that ontological modeling can aid in the integration necessary for ISCA to work, and can act as the prototype for future ISCA ontologies.
Computational Models for Analysis of Illicit Activities
DEFF Research Database (Denmark)
Nizamani, Sarwat
Numerous illicit activities happen in our society which, from time to time, affect the population by harming individuals directly or indirectly. Researchers from different disciplines have contributed to developing strategies to analyze such activities, in order to help law enforcement agents devise policies to minimize them. These activities include cybercrimes, terrorist attacks, or violent actions in response to certain world issues. Besides such activities, there are several other related activities worth analyzing, for which computational models have been presented in this thesis, including models of a population globally sensitive to specific world issues; these models discuss the dynamics of the population in response to such issues. All the models presented in the thesis can be combined for a systematic analysis of illicit activities.
Analysis of software for modeling atmospheric dispersion
International Nuclear Information System (INIS)
During the last few years, a number of software packages for microcomputers have appeared that simulate the diffusion of atmospheric pollutants. These codes, which simplify the models used for safety analyses of industrial plants, are becoming more useful and are even used for post-accident conditions. The report presents, for the first time in a critical manner, the principal models available to date. The problem lies in adapting the models to the post-accident interventions demanded of them. In parallel to this action, an analysis of performance was carried out; that is, identifying the need to forecast the most appropriate actions to be performed, bearing in mind the short available time and the lack of information. Because of these difficulties, it is possible to simplify the software so that it does not include all the options but can deal with a specific situation. This would minimise the data to be collected on site.
Modeling and Hazard Analysis Using STPA
Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka
2010-09-01
A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but the problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis does. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process, where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis
Global sensitivity analysis of thermomechanical models in modelling of welding
International Nuclear Information System (INIS)
The current approach of most welding modellers is to content themselves with the available material data and to choose a mechanical model that seems appropriate. Among the inputs, those controlling the material properties are one of the key problems of welding simulation: material data are never characterized over a sufficiently wide temperature range. Proceeding in this way neglects the influence of the uncertainty of the input data on the result given by the computer code. In this case, how can the credibility of the prediction be assessed? This thesis is a step towards implementing an innovative approach in welding simulation in order to answer this question, with illustrations on some concrete welding cases. Global sensitivity analysis is chosen to determine which material properties are the most sensitive in a numerical welding simulation and in which temperature range. Using this methodology requires some developments to sample and explore the input space covering the welding of different steel materials. Finally, the input data have been divided into two groups according to their influence on the output of the model (residual stress or distortion). In this work, the complete methodology of global sensitivity analysis has been successfully applied to welding simulation, reducing the input space to only the important variables. The sensitivity analysis has provided answers to what can be considered one of the most frequently asked questions regarding welding simulation: for a given material, which properties must be measured with good accuracy and which ones can simply be extrapolated or taken from a similar material? (author)
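Global sensitivity analysis of this kind is commonly based on Sobol indices; a minimal Saltelli-style estimator on a toy two-input function (standing in for the welding code; the coefficients and input ranges are invented) looks like:

```python
import random

random.seed(3)

def f(x1, x2):
    # Toy "simulation code": response linear in two uncertain material properties.
    return 1.0 * x1 + 2.0 * x2

N = 100000
A = [(random.random(), random.random()) for _ in range(N)]  # first sample matrix
B = [(random.random(), random.random()) for _ in range(N)]  # second sample matrix

fA = [f(*a) for a in A]
mean_fA = sum(fA) / N
var_fA = sum((v - mean_fA) ** 2 for v in fA) / N

def first_order(i):
    """Saltelli-style estimate of the first-order Sobol index of input i:
    S_i = (1/N) * sum_j f(B_j) * (f(AB_i_j) - f(A_j)) / Var(f)."""
    # AB_i: rows of A with column i replaced by the value from B.
    fABi = [f(b[0], a[1]) if i == 0 else f(a[0], b[1]) for a, b in zip(A, B)]
    fB = [f(*b) for b in B]
    cov = sum(fb * (fab - fa) for fb, fab, fa in zip(fB, fABi, fA)) / N
    return cov / var_fA

S1, S2 = first_order(0), first_order(1)
```

For this linear test function the exact indices are S1 = 0.2 and S2 = 0.8, so the second input would be the one worth characterizing accurately; applied to a welding code, the same ranking separates properties needing measurement from those that can be extrapolated.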
Automating Risk Analysis of Software Design Models
Directory of Open Access Journals (Sweden)
Maxime Frydman
2014-01-01
Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
Gentrification and models for real estate analysis
Directory of Open Access Journals (Sweden)
Gianfranco Brusa
2013-08-01
Full Text Available This research proposes an in-depth analysis of the Milanese real estate market, based on data supplied by three real estate organizations; gentrification appears in some neighborhoods, such as Tortona, Porta Genova, Bovisa, and Isola Garibaldi. The last is the subject of the final analysis, a survey of the physical and social state of the area. The survey took place in two periods (2003 and 2009) to compare the evolution of gentrification. The results of the surveys have been employed in a simulation with a multi-agent system model, to foresee the long-term evolution of the phenomenon. These neighborhood micro-indicators highlight actual trends conditioning a local real estate market, which can translate into phenomena such as gentrification. In the present analysis, the use of cellular automata models applied to a neighborhood in Milan (Isola Garibaldi) produced a dynamic simulation of the gentrification trend over a very long time: the cyclical phenomenon (one loop spans a period of twenty to thirty years) appears several times during a theoretical time of 100-120-150 years. Simulation of long-period scenarios by multi-agent systems and cellular automata provides the appraiser with a powerful, freely extensible tool able to support the appraisal judgment. It also stands to reason that such a tool can support urban planning and related evaluation processes.
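As a sketch of the cellular-automata machinery invoked here (a generic 1-D cyclic cellular automaton, not the study's calibrated neighborhood model), cell states can be made to cycle through phases in a way loosely analogous to the twenty-to-thirty-year loops described:

```python
def step(state, k=3):
    """One update of a 1-D cyclic cellular automaton on a ring: a cell in
    state s advances to (s + 1) % k if some neighbor is already in that state."""
    n = len(state)
    nxt = []
    for idx, s in enumerate(state):
        succ = (s + 1) % k
        if state[(idx - 1) % n] == succ or state[(idx + 1) % n] == succ:
            nxt.append(succ)
        else:
            nxt.append(s)
    return nxt

# A rotating wave: this configuration returns to itself every 3 steps,
# a toy analogue of cyclical neighborhood dynamics (phases 0, 1, 2 could be
# read as pre-gentrification, gentrified, and decline).
state = [0, 1, 2, 0, 1, 2]
history = [state]
for _ in range(6):
    state = step(state)
    history.append(state)
```

The calibrated model in the study would attach the neighborhood micro-indicators to the transition rule; the point of the sketch is only that simple local rules generate the long-period cycles the simulation exhibits.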
Microblog Sentiment Analysis with Emoticon Space Model
Institute of Scientific and Technical Information of China (English)
姜飞; 刘奕群; 孙甲申; 朱璇; 张敏; 马少平
2015-01-01
Emoticons have been widely employed to express different types of moods, emotions, and feelings in microblog environments. They are therefore regarded as one of the most important signals for microblog sentiment analysis. Most existing studies use several emoticons that convey clear emotional meanings as noisy sentiment labels or similar sentiment indicators. However, in practical microblog environments, tens or even hundreds of emoticons are frequently adopted and all emoticons have their own unique emotional meanings. Besides, a considerable number of emoticons do not have clear emotional meanings. An improved sentiment analysis model should not overlook these phenomena. Instead of manually assigning sentiment labels to several emoticons that convey relatively clear meanings, we propose the emoticon space model (ESM) that leverages more emoticons to construct word representations from a massive amount of unlabeled data. By projecting words and microblog posts into an emoticon space, the proposed model helps identify subjectivity, polarity, and emotion in microblog environments. The experimental results for a public microblog benchmark corpus (NLP&CC 2013) indicate that ESM effectively leverages emoticon signals and outperforms previous state-of-the-art strategies and benchmark best runs.
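A drastically simplified sketch of the core idea, projecting words into an "emoticon space" built from co-occurrence counts (ESM itself learns representations from massive unlabeled data; the tiny corpus below is invented for illustration):

```python
import math
from collections import defaultdict

# Tiny unlabeled corpus: each post is (tokens, emoticons).
posts = [
    (["great", "day"], [":)"]),
    (["love", "this", "great", "movie"], [":)", ":D"]),
    (["awful", "day"], [":("]),
    (["hate", "this", "awful", "traffic"], [":(", ":'("]),
    (["great", "news"], [":D"]),
]
emoticons = [":)", ":D", ":(", ":'("]

# Project each word into emoticon space: one dimension per emoticon,
# counting how often the word co-occurs with it.
vec = defaultdict(lambda: [0.0] * len(emoticons))
for tokens, emos in posts:
    for w in tokens:
        for e in emos:
            vec[w][emoticons.index(e)] += 1.0

def cosine(u, v):
    """Similarity of two words in emoticon space."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0
```

Words of similar polarity ("great", "love") end up close in this space while opposite-polarity words are far apart, without any manual sentiment labels, which is the signal the full model exploits for subjectivity, polarity, and emotion classification.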
Inducer analysis/pump model development
Cheng, Gary C.
1994-01-01
Current design of high performance turbopumps for rocket engines requires effective and robust analytical tools to provide design information in a productive manner. The main goal of this study was to develop a robust and effective computational fluid dynamics (CFD) pump model for general turbopump design and analysis applications. A finite difference Navier-Stokes flow solver, FDNS, which includes an extended k-epsilon turbulence model and appropriate moving zonal interface boundary conditions, was developed to analyze turbulent flows in turbomachinery devices. In the present study, three key components of the turbopump, the inducer, impeller, and diffuser, were investigated by the proposed pump model, and the numerical results were benchmarked by the experimental data provided by Rocketdyne. For the numerical calculation of inducer flows with tip clearance, the turbulence model and grid spacing are very important. Meanwhile, the development of the cross-stream secondary flow, generated by curved blade passage and the flow through tip leakage, has a strong effect on the inducer flow. Hence, the prediction of the inducer performance critically depends on whether the numerical scheme of the pump model can simulate the secondary flow pattern accurately or not. The impeller and diffuser, however, are dominated by pressure-driven flows such that the effects of turbulence model and grid spacing (except near leading and trailing edges of blades) are less sensitive. The present CFD pump model has been proved to be an efficient and robust analytical tool for pump design due to its very compact numerical structure (requiring small memory), fast turnaround computing time, and versatility for different geometries.
Modelling structural systems for transient response analysis
International Nuclear Information System (INIS)
This paper introduces and reports success of a direct means of determining the time periods in which a structural system behaves as a linear system. Numerical results are based on post fracture transient analyses of simplified nuclear piping systems. Knowledge of the linear response ranges will lead to improved analysis-test correlation and more efficient analyses. It permits direct use of data from physical tests in analysis and simplification of the analytical model and interpretation of its behaviour. The paper presents a procedure for deducing linearity based on transient responses. Given the forcing functions and responses of discrete points of the system at various times, the process produces evidence of linearity and quantifies an adequate set of equations of motion. Results of use of the process with linear and nonlinear analyses of piping systems with damping illustrate its success. Results cover the application to data from mathematical system responses. (Auth.)
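Deducing linearity from transient responses amounts to checking superposition: the response to a sum of forcing functions should equal the sum of the individual responses. A sketch on a single-degree-of-freedom oscillator with optional cubic stiffness (all parameters invented; the paper works with piping-system data):

```python
import math

def response(forcing, k3=0.0, t_end=10.0, dt=0.001):
    """Euler-integrate x'' + c x' + k x + k3 x^3 = f(t); return x over time."""
    x, v = 0.0, 0.0
    c, k = 0.5, 4.0
    xs = []
    for s in range(int(t_end / dt)):
        t = s * dt
        a = forcing(t) - c * v - k * x - k3 * x ** 3
        x, v = x + dt * v, v + dt * a
        xs.append(x)
    return xs

f1 = lambda t: 1.0
f2 = lambda t: math.sin(2.0 * t)
f12 = lambda t: f1(t) + f2(t)

def superposition_gap(k3):
    """Max deviation from superposition: ~zero only while the system is linear."""
    y1, y2, y12 = (response(f, k3=k3) for f in (f1, f2, f12))
    return max(abs(c12 - (c1 + c2)) for c1, c2, c12 in zip(y1, y2, y12))

gap_linear = superposition_gap(0.0)      # linear stiffness: superposition holds
gap_nonlinear = superposition_gap(5.0)   # cubic stiffness: superposition fails
```

Applied over a sliding window of test or analysis data, a near-zero gap identifies the time periods of linear behaviour that the paper exploits.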
Data Logistics and the CMS Analysis Model
Managan, Julie E
2009-01-01
The Compact Muon Solenoid Experiment (CMS) at the Large Hadron Collider (LHC) at CERN has brilliant prospects for uncovering new information about the physical structure of our universe. Soon physicists around the world will participate together in analyzing CMS data in search of new physics phenomena and the Higgs Boson. However, they face a significant problem: with 5 Petabytes of data needing distribution each year, how will physicists get the data they need? How and where will they be able to analyze it? Computing resources and scientists are scattered around the world, while CMS data exists in localized chunks. The CMS computing model only allows analysis of locally stored data, “tethering” analysis to storage. The Vanderbilt CMS team is actively working to solve this problem with the Research and Education Data Depot Network (REDDnet), a program run by Vanderbilt’s Advanced Computing Center for Research and Education (ACCRE).
Spatiochromatic Context Modeling for Color Saliency Analysis.
Zhang, Jun; Wang, Meng; Zhang, Shengping; Li, Xuelong; Wu, Xindong
2016-06-01
Visual saliency is one of the most noteworthy perceptual abilities of human vision. Recent progress in cognitive psychology suggests that: 1) visual saliency analysis is mainly completed by the bottom-up mechanism consisting of feedforward low-level processing in primary visual cortex (area V1) and 2) color interacts with spatial cues and is influenced by the neighborhood context, and thus it plays an important role in visual saliency analysis. From a computational perspective, most existing saliency modeling approaches exploit multiple independent visual cues, irrespective of their interactions (or their interactions are not computed explicitly), and ignore contextual influences induced by neighboring colors. In addition, the use of color is often underestimated in visual saliency analysis. In this paper, we propose a simple yet effective color saliency model that considers color as the only visual cue and mimics the color processing in V1. Our approach uses region-/boundary-defined color features with spatiochromatic filtering by considering local color-orientation interactions, and therefore captures homogeneous color elements, subtle textures within the object, and the overall salient object from the color image. To account for color contextual influences, we present a divisive normalization method for chromatic stimuli through the pooling of contrary/complementary color units. We further define a color perceptual metric over the entire scene to produce saliency maps for color regions and color boundaries individually. These maps are finally globally integrated into one single saliency map. The final saliency map is produced by Gaussian blurring for robustness. We evaluate the proposed method on both synthetic stimuli and several benchmark saliency data sets from the visual saliency analysis to salient object detection. The experimental results demonstrate that the use of color as a unique visual cue achieves competitive results on par with or better than 12 state
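The divisive normalization step can be sketched in its simplest canonical form (the constant and uniform pooling below are illustrative assumptions; the paper pools over contrary/complementary color units specifically):

```python
def divisive_normalization(responses, sigma=0.1):
    """Normalize each unit's response by the pooled activity of all units:
    r_i -> r_i / (sigma + sum_j r_j), a common canonical form."""
    pool = sigma + sum(responses)
    return [r / pool for r in responses]

# An isolated strong response stands out after normalization, while the same
# response embedded in uniformly strong context is suppressed -- the
# contextual effect the model uses for color saliency.
isolated = divisive_normalization([1.0, 0.1, 0.1, 0.1])
crowded = divisive_normalization([1.0, 1.0, 1.0, 1.0])
```

This contrast between isolated and crowded responses is what lets a color pop out against a neighborhood context.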
Non standard analysis, polymer models, quantum fields
International Nuclear Information System (INIS)
We give an elementary introduction to non standard analysis and its applications to the theory of stochastic processes. This is based on a joint book with J.E. Fenstad, R. Hoeegh-Krohn and T. Lindstroem. In particular we give a discussion of a hyperfinite theory of Dirichlet forms with applications to the study of the Hamiltonian for a quantum mechanical particle in the potential created by a polymer. We also discuss new results on the existence of attractive polymer measures in dimension d = 1, 2 and on the (φ²)_d-model of interacting quantum fields. (orig.)
Modelling and analysis of global coal markets
International Nuclear Information System (INIS)
The thesis comprises four interrelated essays featuring modelling and analysis of coal markets. Each of the four essays has a dedicated chapter in this thesis. Chapters 2 to 4 have, from a topical perspective, a backward-looking focus and deal with explaining recent market outcomes in the international coal trade. The findings of those essays may serve as guidance for assessing current coal market outcomes as well as expected market outcomes in the near to medium-term future. Chapter 5 has a forward-looking focus and builds a bridge between explaining recent market outcomes and projecting long-term market equilibria. Chapter 2, Strategic Behaviour in International Metallurgical Coal Markets, deals with market conduct of large exporters in the market of coals used in steel-making in the period 2008 to 2010. In this essay I analyse whether prices and trade-flows in the international market for metallurgical coals were subject to non-competitive conduct in the period 2008 to 2010. To do so, I develop mathematical programming models - a Stackelberg model, two varieties of a Cournot model, and a perfect competition model - for computing spatial equilibria in international resource markets. Results are analysed with various statistical measures to assess the prediction accuracy of the models. The results show that real market equilibria cannot be reproduced with a competitive model. However, real market outcomes can be accurately simulated with the non-competitive models, suggesting that market equilibria in the international metallurgical coal trade were subject to the strategic behaviour of coal exporters. Chapter 3 and chapter 4 deal with market power issues in the steam coal trade in the period 2006 to 2008. Steam coals are typically used to produce steam either for electricity generation or for heating purposes. In Chapter 3 we analyse market behaviour of key exporting countries in the steam coal trade. This chapter features the essay Market Structure Scenarios in
MODELING ANALYSIS FOR GROUT HOPPER WASTE TANK
Energy Technology Data Exchange (ETDEWEB)
Lee, S.
2012-01-04
The Saltstone facility at the Savannah River Site (SRS) has a grout hopper tank to provide agitator stirring of the Saltstone feed materials. The tank has about 300 gallons of capacity to provide a larger working volume for the grout nuclear waste slurry to be held in case of a process upset, and it is equipped with a mechanical agitator intended to keep the grout in motion and agitated so that it won't start to set up. The primary objective of the work was to evaluate the flow performance of mechanical agitators to prevent vortex pull-through for adequate stirring of the feed materials and to estimate an agitator speed that provides acceptable flow performance with a 45° pitched four-blade agitator. In addition, the power consumption required for the agitator operation was estimated. The modeling calculations were performed in a two-step Computational Fluid Dynamics (CFD) modeling approach. As a first step, a simple single-stage agitator model with 45° pitched propeller blades was developed for the initial scoping analysis of the flow pattern behaviors over a range of operating conditions. Based on the initial phase-1 results, a phase-2 model with a two-stage agitator was developed for the final performance evaluations. A series of sensitivity calculations for different agitator designs and operating conditions was performed to investigate the impact of key parameters on the grout hydraulic performance in a 300-gallon hopper tank. For the analysis, viscous shear was modeled using the Bingham plastic approximation. Steady-state analyses with a two-equation turbulence model were performed. All analyses were based on three-dimensional results. Recommended operational guidance was developed using the basic concept that local shear rate profiles and flow patterns can be used as a measure of hydraulic performance and spatial stirring. Flow patterns were estimated by a Lagrangian integration technique along
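The Bingham plastic approximation relates shear stress to shear rate as τ = τ₀ + μ_p·γ̇ above a yield stress τ₀. A minimal sketch of the resulting apparent viscosity follows; the parameter values are illustrative only, not the actual SRS grout properties.

```python
import numpy as np

def bingham_apparent_viscosity(shear_rate, tau0=20.0, mu_p=0.1):
    """Apparent viscosity of a Bingham plastic: tau = tau0 + mu_p * gamma_dot,
    hence mu_app = tau0 / gamma_dot + mu_p.  tau0 (yield stress, Pa) and
    mu_p (plastic viscosity, Pa.s) are illustrative values."""
    gamma = np.asarray(shear_rate, dtype=float)
    return tau0 / gamma + mu_p

rates = np.array([1.0, 10.0, 100.0])    # shear rates, 1/s
mu_app = bingham_apparent_viscosity(rates)
```

The apparent viscosity falls sharply with shear rate, which is why local shear rate profiles serve as a useful measure of where the agitator keeps the grout fluid.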
ANALYSIS MODEL FOR RETURN ON CAPITAL EMPLOYED
Directory of Open Access Journals (Sweden)
BURJA CAMELIA
2013-02-01
Full Text Available At the microeconomic level, the appreciation of capital profitability is a very complex action which is of interest for stakeholders. The main purpose of this study is to extend the traditional analysis model for capital profitability, based on the ratio "Return on capital employed". In line with this, the objectives of the work are to identify the factors that influence the profitability of the capital employed by a company and to measure their contribution to the manifestation of the phenomenon. The proposed analysis model is validated on the use case of a representative company from the agricultural sector. The results obtained reveal that some factors can act positively on capital profitability: capital turnover, sales efficiency, an increased share of sales in total revenues, and improved expense efficiency. The findings are useful both for decision-making factors in substantiating economic strategies and for capital owners who are interested in the efficiency of their investments.
Data analysis and source modelling for LISA
International Nuclear Information System (INIS)
Gravitational waves (GWs) are among the most important predictions of general relativity. Besides the direct proof of the existence of GWs, there are already several ground-based detectors (such as LIGO, GEO, etc.) and a planned future space mission (LISA) which aim to detect GWs directly. GWs contain a large amount of information about their source; extracting this information can help us uncover the physical properties of the source, and even open a new window for understanding the Universe. Hence, GW data analysis will be a challenging task in seeking GWs. In this thesis, I present two works on data analysis for LISA. In the first work, we introduce an extended multimodal genetic algorithm which utilizes the properties of the signal and the detector response function to analyze the data from the third round of the Mock LISA Data Challenge. We have found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio and sky location with reasonable accuracy. As for the orbital angular momentum and the two spins of the black holes, we have found a large number of widely separated modes in the parameter space with similar maximum likelihood values. The performance of this method is comparable, if not better than, that of already existing algorithms. In the second work, we introduce a new phenomenological waveform model for the extreme mass ratio inspiral (EMRI) system. This waveform consists of a set of harmonics with constant amplitude and slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in the simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is very feasible for the data analysis of EMRI signals.
Visual behaviour analysis and driver cognitive model
Energy Technology Data Exchange (ETDEWEB)
Baujon, J.; Basset, M.; Gissinger, G.L. [Mulhouse Univ., (France). MIPS/MIAM Lab.
2001-07-01
Recent studies on driver behaviour have shown that perception - mainly visual but also proprioceptive perception - plays a key role in the "driver-vehicle-road" system and so considerably affects the driver's decision making. Within the framework of the behaviour analysis and studies low-cost system (BASIL), this paper presents a correlative, qualitative and quantitative study comparing the information given by visual perception with the trajectory followed. This information helps to obtain a cognitive model of the Rasmussen type for different driver classes. Many experiments in real driving situations have been carried out for different driver classes and for a given trajectory profile, using a test vehicle and innovative, specially designed, real-time tools, such as the vision system or the positioning module. (orig.)
Saturn Ring Data Analysis and Thermal Modeling
Dobson, Coleman
2011-01-01
CIRS, VIMS, UVIS, and ISS (Cassini's Composite Infrared Spectrometer, Visual and Infrared Mapping Spectrometer, Ultraviolet Imaging Spectrograph, and Imaging Science Subsystem, respectively) have each operated in a multidimensional observation space and have acquired scans of the lit and unlit rings at multiple phase angles. To better understand physical and dynamical ring particle parametric dependence, we co-registered profiles from these instruments, taken at a wide range of wavelengths from the ultraviolet through the thermal infrared, to associate changes in ring particle temperature with changes in observed brightness, specifically with albedos inferred by ISS, UVIS and VIMS. We work in a parameter space where the solar elevation range is constrained to 12 deg - 14 deg and the chosen radial region is the B3 region of the B ring; this region is the most optically thick region in Saturn's rings. From this compilation of multiple-wavelength data, we construct and fit phase curves and color ratios using independent dynamical thermal models for ring structure, and overplot Saturn, Saturn ring, and solar spectra. Analysis of the phase curves and color ratios reveals thermal emission to fall within the extrema of the ISS bandwidth and a geometrical dependence of reddening on phase angle, respectively. Analysis of spectra reveals that the Cassini CIRS Saturn spectrum dominates the Cassini CIRS B3 ring spectrum from 19 to 1000 microns, while the Earth-based B ring spectrum dominates the Earth-based Saturn spectrum from 0.4 to 4 microns. From our fits we test out dynamical thermal models; from the phase curves we derive ring albedos and non-Lambertian properties of the ring particle surfaces; and from the color ratios we examine multiple scattering within the regolith of ring particles.
A visual analysis of the process of process modeling
Claes, J Jan; Vanderfeesten, ITP Irene; Pinggera, J.; Reijers, HA Hajo; Weber, B.; Poels, G
2015-01-01
The construction of business process models has become an important requisite in the analysis and optimization of processes. The success of the analysis and optimization efforts heavily depends on the quality of the models. Therefore, a research domain emerged that studies the process of process modeling. This paper contributes to this research by presenting a way of visualizing the different steps a modeler undertakes to construct a process model, in a so-called process of process modeling C...
Statistical Analysis and Modeling of Elastic Functions
Srivastava, Anuj; Kurtek, Sebastian; Klassen, Eric; Marron, J S
2011-01-01
We introduce a novel geometric framework for separating, analyzing and modeling the $x$ (or horizontal) and the $y$ (or vertical) variability in time-warped functional data of the type frequently studied in growth curve analysis. This framework is based on the use of the Fisher-Rao Riemannian metric that provides a proper distance for: (1) aligning, comparing and modeling functions and (2) analyzing the warping functions. A convenient square-root velocity function (SRVF) representation transforms the Fisher-Rao metric to the standard $L^2$ metric, a tool that is applied twice in this framework. Firstly, it is applied to the given functions, where it leads to a parametric family of penalized-$L^2$ distances in SRVF space. The parameter controls the levels of elasticity of the individual functions. These distances are then used to define Karcher means, and the individual functions are optimally warped to align them to the Karcher means to extract the $y$ variability. Secondly, the resulting warping functions,...
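The SRVF representation has a simple closed form, q(t) = f'(t)/√|f'(t)|, under which the Fisher-Rao metric becomes the ordinary L² metric. A minimal numerical sketch (finite-difference derivative; the helper name and the small stabilizing epsilon are our own conventions):

```python
import numpy as np

def srvf(f, t):
    """Square-root velocity function q(t) = f'(t) / sqrt(|f'(t)|),
    computed with finite differences on a sampled function.
    A tiny epsilon guards against division by zero where f' vanishes."""
    df = np.gradient(f, t)
    return df / np.sqrt(np.abs(df) + 1e-12)

t = np.linspace(0.0, 1.0, 201)
f = t ** 2                      # f'(t) = 2t, so q(t) ~ sqrt(2t)
q = srvf(f, t)
```

Because the transform maps Fisher-Rao distances to L² distances, comparing two aligned functions reduces to an ordinary norm of the difference of their SRVFs.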
Tradeoff Analysis for Optimal Multiobjective Inventory Model
Directory of Open Access Journals (Sweden)
Longsheng Cheng
2013-01-01
Full Text Available The deterministic inventory model, the economic order quantity (EOQ), reveals that carrying inventory and ordering frequency follow a tradeoff relation. For probabilistic demand, the tradeoff surface among annual orders, expected inventory and shortage is useful because it quantifies what the firm must pay in terms of ordering workload and inventory investment to meet the customer service desired. Based on a triobjective inventory model, this paper employs successive approximation to obtain efficient control policies outlining tradeoffs among conflicting objectives. The nondominated solutions obtained by successive approximation are further used to plot a 3D scatterplot for exploring the relationships between objectives. Visualization of the tradeoffs displayed by the scatterplots justifies the computational effort done in the experiment, although the several iterations needed to reach a nondominated solution make the solution procedure lengthy and tedious. Information elicited from the inverse relationships may help managers make deliberate inventory decisions. For future work, developing an efficient and effective solution procedure for tradeoff analysis in multiobjective inventory management seems imperative.
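The deterministic EOQ tradeoff can be made concrete with the classic formula Q* = √(2DS/H), which balances annual ordering cost (D/Q)·S against carrying cost (Q/2)·H. The numbers below are illustrative only:

```python
import math

def eoq(demand, order_cost, holding_cost):
    """Classic economic order quantity: Q* = sqrt(2 * D * S / H)."""
    return math.sqrt(2.0 * demand * order_cost / holding_cost)

def annual_cost(q, demand, order_cost, holding_cost):
    """Annual ordering cost (D/Q)*S plus annual carrying cost (Q/2)*H."""
    return demand / q * order_cost + q / 2.0 * holding_cost

# Illustrative numbers: 1000 units/yr demand, $50 per order, $4 per unit-year.
q_star = eoq(1000, 50.0, 4.0)
```

Any deviation from Q* raises total cost: ordering more often inflates the ordering cost, ordering larger batches inflates the carrying cost, which is exactly the tradeoff the multiobjective model generalizes to probabilistic demand.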
Comparative analysis of enterprise risk management models
Nikolaev Igor V.
2012-01-01
The article is devoted to the analysis and comparison of modern enterprise risk management models used in domestic and world practice. Some propositions on which such models should be based are proposed.
Production TTR modeling and dynamic buckling analysis
Institute of Scientific and Technical Information of China (English)
Hugh Liu; John Wei; Edward Huang
2013-01-01
In a typical tension leg platform (TLP) design, the top tension factor (TTF), measuring the top tension of a top tensioned riser (TTR) relative to its submerged weight in water, is one of the most important design parameters that has to be specified properly. While a very small TTF may lead to excessive vortex induced vibration (VIV), clashing issues and possible compression close to the seafloor, an unnecessarily high TTF may translate into excessive riser cost and vessel payload, and may even have impacts on the TLP sizing and design in general. In the process of a production TTR design, it was found that its outer casing can be subjected to compression in a worst-case scenario with some extreme metocean and hardware conditions. The present paper shows how finite element analysis (FEA) models using beam elements and two different software packages (Flexcom and ABAQUS) are constructed to simulate the TTR properly, and especially the pipe-in-pipe effects. An ABAQUS model with hybrid elements (beam elements globally + shell elements locally) can be used to investigate how the outer casing behaves under compression. It is shown that for the specified TTR design, even with its outer casing under some local compression in the worst-case scenario, dynamic buckling would not occur; therefore the TTR design is adequate.
Linking advanced fracture models to structural analysis
Energy Technology Data Exchange (ETDEWEB)
Chiesa, Matteo
2001-07-01
Shell structures with defects occur in many situations. The defects are usually introduced during the welding process necessary for joining different parts of the structure. Higher utilization of structural materials leads to a need for accurate numerical tools for reliable prediction of structural response. The direct discretization of a cracked shell structure with solid finite elements, in order to perform an integrity assessment of the structure in question, leads to large problems and makes such analysis infeasible in structural applications. In this study a link between local material models and structural analysis is outlined. An "ad hoc" element formulation is used in order to connect complex material models to the finite element framework used for structural analysis. An improved elasto-plastic line spring finite element formulation, used in order to take cracks into account, is linked to shell elements, which are further linked to beam elements. In this way one obtains a global model of the shell structure that also accounts for local flexibilities and fractures due to defects. An important advantage of such an approach is a direct fracture mechanics assessment, e.g. via a computed J-integral or CTOD. A recent development in this approach is the notion of two-parameter fracture assessment. This means that the crack tip stress tri-axiality (constraint) is employed in determining the corresponding fracture toughness, giving a much more realistic capacity estimate for cracked structures. The present thesis is organized in six research articles and an introductory chapter that reviews important background literature related to this work. Papers I and II address the performance of shell and line spring finite elements as a cost-effective tool for performing the numerical calculations needed for a fracture assessment. In Paper II a failure assessment, based on the testing of a constraint-corrected fracture mechanics specimen under tension, is
Comparison of Statistical Models for Regional Crop Trial Analysis
Institute of Scientific and Technical Information of China (English)
ZHANG Qun-yuan; KONG Fan-ling
2002-01-01
Based on a review and comparison of the main statistical analysis models for estimating variety-environment cell means in regional crop trials, a new statistical model, the LR-PCA composite model, was proposed, and the predictive precision of these models was compared by cross validation on example data. Results showed that the order of model precision was LR-PCA model > AMMI model > PCA model > Treatment Means (TM) model > Linear Regression (LR) model > Additive Main Effects ANOVA model. The precision gain factor of the LR-PCA model was 1.55, an increase of 8.4% compared with AMMI.
Model Based Analysis and Test Generation for Flight Software
Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep
2009-01-01
We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.
Modeling and analysis of solar distributed generation
Ortiz Rivera, Eduardo Ivan
Recent changes in the global economy are creating a big impact on our daily life. The price of oil is increasing and reserves are fewer every day. Also, dramatic demographic changes are affecting the viability of the electric infrastructure and ultimately the economic future of the industry. These are some of the reasons that many countries are looking to alternative energy sources to produce electric energy. The most common form of green energy in our daily life is solar energy. Converting solar energy into electrical energy requires solar panels, dc-dc converters, power control, sensors, and inverters. In this work, a photovoltaic module (PVM) model using the electrical characteristics provided by the manufacturer data sheet is presented for power system applications. Experimental results from testing are shown, verifying the proposed PVM model. Also in this work, three maximum power point tracker (MPPT) algorithms are presented to obtain the maximum power from a PVM. The first MPPT algorithm is a method based on Rolle's and Lagrange's theorems and can provide at least an approximate answer to a family of transcendental functions that cannot be solved using differential calculus. The second MPPT algorithm is based on the approximation of the proposed PVM model using fractional polynomials, where the shape, boundary conditions and performance of the proposed PVM model are satisfied. The third MPPT algorithm is based on the determination of the optimal duty cycle for a dc-dc converter and prior knowledge of the load or load matching conditions. Also, four algorithms to calculate the effective irradiance level and temperature over a photovoltaic module are presented in this work. The main reasons to develop these algorithms are monitoring climate conditions, the elimination of temperature and solar irradiance sensors, cost reductions for a photovoltaic inverter system, and the development of new algorithms to be integrated with maximum
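The idea behind maximum power point tracking can be illustrated by scanning an I-V curve for the voltage that maximizes P = V·I. The exponential module model and every parameter value below are illustrative stand-ins, not the paper's data-sheet-based PVM model or its three MPPT algorithms:

```python
import numpy as np

def pv_current(v, i_sc=5.0, v_oc=22.0, a=0.05):
    """Simplified exponential I-V curve for a PV module: current stays
    near the short-circuit value i_sc and collapses near the
    open-circuit voltage v_oc.  All parameters are illustrative."""
    return i_sc * (1.0 - np.exp((v - v_oc) / (a * v_oc)))

# Brute-force scan of the power curve P(V) = V * I(V).
v = np.linspace(0.0, 22.0, 2201)
p = v * pv_current(v)
v_mpp = v[np.argmax(p)]          # voltage at the maximum power point
```

A real MPPT controller cannot afford this full scan; it steers the converter duty cycle toward v_mpp online, which is precisely what the three algorithms in the work aim to do more efficiently.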
Applied data analysis and modeling for energy engineers and scientists
Reddy, T Agami
2011-01-01
"Applied Data Analysis and Modeling for Energy Engineers and Scientists" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and
Towards a controlled sensitivity analysis of model development decisions
Clark, Martyn; Nijssen, Bart
2016-04-01
The current generation of hydrologic models has followed a myriad of different development paths, making it difficult for the community to test underlying hypotheses and identify a clear path to model improvement. Model comparison studies have been undertaken to explore model differences, but these studies have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than a systematic analysis of model shortcomings. This presentation will discuss a unified approach to process-based hydrologic modeling to enable controlled and systematic analysis of multiple model representations (hypotheses) of hydrologic processes and scaling behavior. Our approach, which we term the Structure for Unifying Multiple Modeling Alternatives (SUMMA), formulates a general set of conservation equations, providing the flexibility to experiment with different spatial representations, different flux parameterizations, different model parameter values, and different time stepping schemes. We will discuss the use of SUMMA to systematically analyze different model development decisions, focusing on both analysis of simulations for intensively instrumented research watersheds as well as simulations across a global dataset of FLUXNET sites. The intent of the presentation is to demonstrate how the systematic analysis of model shortcomings can help identify model weaknesses and inform future model development priorities.
Evaluation of RCAS Inflow Models for Wind Turbine Analysis
Energy Technology Data Exchange (ETDEWEB)
Tangler, J.; Bir, G.
2004-02-01
The finite element structural modeling in the Rotorcraft Comprehensive Analysis System (RCAS) provides a state-of-the-art approach to aeroelastic analysis. This, coupled with its ability to model all turbine components, results in a methodology that can simulate the complex system interactions characteristic of large wind turbines. In addition, RCAS is uniquely capable of modeling advanced control algorithms and the resulting dynamic responses.
Analysis on the Logarithmic Model of Relationships
Institute of Scientific and Technical Information of China (English)
(anonymous)
2005-01-01
The logarithmic model is often used to describe the relationships between factors. It often gives good statistical characteristics. Yet, in the process of modeling for soil and water conservation, we find that this "good" model cannot guarantee good results. In this paper we inquire into the intrinsic reasons. It is shown that the logarithmic model has the property of enlarging or reducing model errors, and the disadvantages of the logarithmic model are analyzed.
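The error-enlarging property can be seen directly: for y = a + b·ln(x), a fixed absolute error dx in x shifts y by b·ln((x + dx)/x) ≈ b·dx/x, which grows as x shrinks and shrinks as x grows. A short numerical illustration with made-up coefficients (the paper's analysis is more general than this single mechanism):

```python
import numpy as np

# y = a + b*ln(x): the same absolute error dx in x shifts y by
# b*ln((x+dx)/x), so input errors are amplified at small x and
# damped at large x.  Coefficients are illustrative.
a, b, dx = 1.0, 2.0, 0.1
x = np.array([0.2, 2.0, 20.0])
y_err = np.abs(b * np.log((x + dx) / x))   # exact shift in y for error dx in x
```

The same dx = 0.1 error in x moves y by very different amounts across the domain, which is one reason good fit statistics in log space need not translate into good predictions in the original variables.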
Model performance analysis and model validation in logistic regression
Directory of Open Access Journals (Sweden)
Rosa Arboretti Giancristofaro
2007-10-01
Full Text Available In this paper a new model validation procedure for a logistic regression model is presented. First, we present a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for the assessment of the performance of a given model by using an example taken from a management study.
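One widely used validation technique of the kind reviewed here is a holdout check of discrimination via the AUC on unseen data. The sketch below uses scikit-learn and synthetic data; it is a generic illustration, not the specific procedure or performance properties proposed in the paper:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data standing in for a real study.
X, y = make_classification(n_samples=1000, n_features=8, random_state=0)

# Fit on a training split, then measure discrimination (AUC) on data
# the model never saw -- the core idea of holdout validation.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
```

An AUC near 0.5 on the holdout set signals no discrimination; a full validation, as the paper argues, combines such measures with calibration and other required properties.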
An Extended Analysis of Requirements Traceability Model
Institute of Scientific and Technical Information of China (English)
Jiang Dandong(蒋丹东); Zhang Shensheng; Chen Lu
2004-01-01
A new extended meta model of traceability is presented. Then, a formalized fine-grained model of traceability is described. Some major issues about this model, including trace units, requirements and relations within the model, are further analyzed. Finally, a case study that comes from a key project of 863 Program is given.
Managing Analysis Models in the Design Process
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
Model Analysis Assessing the dynamics of student learning
Bao, L; Bao, Lei; Redish, Edward F.
2002-01-01
In this paper we present a method of modeling and analysis that permits the extraction and quantitative display of detailed information about the effects of instruction on a class's knowledge. The method relies on a cognitive model that represents student thinking in terms of mental models. Students frequently fail to recognize relevant conditions that lead to appropriate uses of their models. As a result, they can use multiple models inconsistently. Once the most common mental models have been determined by qualitative research, they can be mapped onto a multiple choice test. Model analysis permits the interpretation of such a situation. We illustrate the use of our method by analyzing results from the FCI.
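In Bao and Redish's approach, each student's answers are tallied into counts over the common mental models, turned into a unit "model vector" with components √(n_i/N), and averaged as outer products into a class model density matrix whose eigenstructure summarizes how consistently the class uses each model. The sketch below is only a schematic reading of that method, with made-up response counts:

```python
import numpy as np

# Each row: one student's answer counts over three candidate mental
# models across 10 questions (made-up data for illustration).
counts = np.array([[8, 2, 0],
                   [7, 3, 0],
                   [2, 7, 1],
                   [5, 4, 1]], dtype=float)

# Student model vectors u_k with components sqrt(n_i / N): unit vectors.
u = np.sqrt(counts / counts.sum(axis=1, keepdims=True))

# Class model density matrix: average of the outer products u u^T.
D = np.einsum('ki,kj->ij', u, u) / len(u)
evals = np.linalg.eigvalsh(D)    # ascending eigenvalues
```

A single dominant eigenvalue indicates the class uses one model consistently; several comparable eigenvalues indicate the inconsistent mixed-model state the abstract describes.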
EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION
The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...
Qualitative Analysis of Integration Adapter Modeling
Ritter, Daniel; Holzleitner, Manuel
2015-01-01
Integration Adapters are a fundamental part of an integration system, since they provide (business) applications access to its messaging channel. However, their modeling and configuration remain under-represented. In previous work, the integration control and data flow syntax and semantics have been expressed in the Business Process Model and Notation (BPMN) as a semantic model for message-based integration, while adapter and the related quality of service modeling were left for further studi...
Analysis and modeling of parking behavior
Institute of Scientific and Technical Information of China (English)
(anonymous)
2001-01-01
This paper analyzes the spatial structure of parking behavior and establishes a basic parking behavior model to represent the downtown parking problem; it establishes a parking pricing model to analyze the parking equilibrium with a positive parking fee, uses a paired combinatorial logit model to analyze the effect of integrative trip cost on parking behavior, and concludes from empirical results that the parking behavior model performs well.
Multiattribute shopping models and ridge regression analysis
Timmermans, HJP Harry
1981-01-01
Policy decisions regarding retailing facilities essentially involve multiple attributes of shopping centres. If mathematical shopping models are to contribute to these decision processes, their structure should reflect the multiattribute character of retailing planning. Examination of existing models shows that most operational shopping models include only two policy variables. A serious problem in the calibration of the existing multiattribute shopping models is that of multicollinearity ari...
Analysis on Some of Software Reliability Models
Institute of Scientific and Technical Information of China (English)
(anonymous)
2001-01-01
The Software Reliability & Maintainability Evaluation Tool (SRMET 3.0), developed by the Software Evaluation and Test Center of China Aerospace Mechanical Corporation, is introduced in detail in this paper. SRMET 3.0 is supported by seven software reliability models and four software maintainability models. The numerical characteristics of all those models are studied in depth in this paper, and the corresponding numerical algorithms for each model are also given.
The Model and Analysis of Mechatronics Systems
Bačkys, Gediminas
2004-01-01
This work concerns the modeling of mechatronics systems. The software is used with a PLC and simulates real equipment. The system includes several sample models with pneumatic elements and a model with an analog device. Educational materials for students are also provided. The software has been used for educational purposes at the university and college level.
Likelihood analysis of the I(2) model
DEFF Research Database (Denmark)
Johansen, Søren
1997-01-01
The I(2) model is defined as a submodel of the general vector autoregressive model, by two reduced rank conditions. The model describes stochastic processes with stationary second difference. A parametrization is suggested which makes likelihood inference feasible. Consistency of the maximum...
Modelling Immune System: Principles, Models,Analysis and Perspectives
Institute of Scientific and Technical Information of China (English)
Xiang-hua Li; Zheng-xuan Wang; Tian-yang Lu; Xiang-jiu Che
2009-01-01
The biological immune system is a complex adaptive system. There are many benefits to building a model of the immune system. Biological researchers can test hypotheses about the infection process or simulate the responses of drugs. Computer researchers can build distributed, robust, and fault-tolerant networks inspired by the functions of the immune system. This paper provides a comprehensive survey of the literature on modelling the immune system. From a methodology perspective, the paper compares and analyzes the existing approaches and models, and also identifies the directions that research on immune models is likely to focus on in the next few years.
Practical Use of Computationally Frugal Model Analysis Methods.
Hill, Mary C; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen
2016-03-01
Three challenges compromise the utility of mathematical models of groundwater and other environmental systems: (1) a dizzying array of model analysis methods and metrics make it difficult to compare evaluations of model adequacy, sensitivity, and uncertainty; (2) the high computational demands of many popular model analysis methods (requiring 1000s, 10,000s, or more model runs) make them difficult to apply to complex models; and (3) many models are plagued by unrealistic nonlinearities arising from the numerical model formulation and implementation. This study proposes a strategy to address these challenges through a careful combination of model analysis and implementation methods. In this strategy, computationally frugal model analysis methods (often requiring a few dozen parallelizable model runs) play a major role, and computationally demanding methods are used for problems where (relatively) inexpensive diagnostics suggest the frugal methods are unreliable. We also argue in favor of detecting and, where possible, eliminating unrealistic model nonlinearities; this increases the realism of the model itself and facilitates the application of frugal methods. Literature examples are used to demonstrate the use of frugal methods and associated diagnostics. We suggest that the strategy proposed in this paper would allow the environmental sciences community to achieve greater transparency and falsifiability of environmental models, and obtain greater scientific insight from ongoing and future modeling efforts. PMID:25810333
Eclipsing binary stars modeling and analysis
Kallrath, Josef
1999-01-01
This book focuses on the formulation of mathematical models for the light curves of eclipsing binary stars, and on the algorithms for generating such models. Since information gained from binary systems provides much of what we know of the masses, luminosities, and radii of stars, such models are acquiring increasing importance in studies of stellar structure and evolution. As in other areas of science, the computer revolution has given many astronomers tools that previously only specialists could use; anyone with access to a set of data can now expect to be able to model it. This book will provide astronomers, both amateur and professional, with a guide for: specifying an astrophysical model for a set of observations; selecting an algorithm to determine the parameters of the model; and estimating the errors of the parameters. It is written for readers with knowledge of basic calculus and linear algebra; appendices cover mathematical details on such matters as optimization, coordinate systems, and specific models ...
Analysis of nonlinear systems using ARMA [autoregressive moving average] models
International Nuclear Information System (INIS)
While many vibration systems exhibit primarily linear behavior, a significant percentage of the systems encountered in vibration and modal testing are mildly to severely nonlinear. Analysis methods for such nonlinear systems are not yet well developed and the response of such systems is not accurately predicted by linear models. Nonlinear ARMA (autoregressive moving average) models are one method for the analysis and response prediction of nonlinear vibratory systems. In this paper we review the background of linear and nonlinear ARMA models, and illustrate the application of these models to nonlinear vibration systems. We conclude by summarizing the advantages and disadvantages of ARMA models and emphasizing prospects for future development. 14 refs., 11 figs
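The linear AR building block behind ARMA analysis can be illustrated with a short sketch. This is not the nonlinear ARMA formulation the abstract describes; it is a minimal, standard Yule-Walker fit of an AR(2) model to a synthetic vibration-like signal, with NumPy as the only dependency.

```python
import numpy as np

def fit_ar_yule_walker(x, order):
    """Fit AR(p) coefficients to a series via the Yule-Walker equations."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    # Biased sample autocovariances r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    # Toeplitz system: R a = r[1:], with R[i, j] = r[|i - j|]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:])

# Synthetic AR(2) process: x_t = 0.6 x_{t-1} - 0.2 x_{t-2} + e_t
rng = np.random.default_rng(0)
e = rng.standard_normal(5000)
x = np.zeros(5000)
for t in range(2, 5000):
    x[t] = 0.6 * x[t - 1] - 0.2 * x[t - 2] + e[t]

coeffs = fit_ar_yule_walker(x, 2)
print(coeffs)  # close to [0.6, -0.2]
```

The recovered coefficients approach the true values as the series length grows; nonlinear extensions add polynomial or cross terms of past values to the same regression structure.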
Modeling Composite Laminate Crushing for Crash Analysis
Fleming, David C.; Jones, Lisa (Technical Monitor)
2002-01-01
Crash modeling of composite structures remains limited in application and has not been effectively demonstrated as a predictive tool. While the global response of composite structures may be well modeled, when composite structures act as energy-absorbing members through direct laminate crushing the modeling accuracy is greatly reduced. The most efficient composite energy absorbing structures, in terms of energy absorbed per unit mass, are those that absorb energy through a complex progressive crushing response in which fiber and matrix fractures on a small scale dominate the behavior. Such failure modes simultaneously include delamination of plies, failure of the matrix to produce fiber bundles, and subsequent failure of fiber bundles either in bending or in shear. In addition, the response may include the significant action of friction, both internally (between delaminated plies or fiber bundles) or externally (between the laminate and the crushing surface). A figure shows the crushing damage observed in a fiberglass composite tube specimen, illustrating the complexity of the response. To achieve a finite element model of such complex behavior is an extremely challenging problem. A practical crushing model based on detailed modeling of the physical mechanisms of crushing behavior is not expected in the foreseeable future. The present research describes attempts to model composite crushing behavior using a novel hybrid modeling procedure. Experimental testing is done in support of the modeling efforts, and a test specimen is developed to provide data for validating laminate crushing models.
Mineralogic Model (MM3.0) Analysis Model Report
Energy Technology Data Exchange (ETDEWEB)
C. Lum
2002-02-12
The purpose of this report is to document the Mineralogic Model (MM), Version 3.0 (MM3.0) with regard to data input, modeling methods, assumptions, uncertainties, limitations and validation of the model results, qualification status of the model, and the differences between Version 3.0 and previous versions. A three-dimensional (3-D) Mineralogic Model was developed for Yucca Mountain to support the analyses of hydrologic properties, radionuclide transport, mineral health hazards, repository performance, and repository design. Version 3.0 of the MM was developed from mineralogic data obtained from borehole samples. It consists of matrix mineral abundances as a function of x (easting), y (northing), and z (elevation), referenced to the stratigraphic framework defined in Version 3.1 of the Geologic Framework Model (GFM). The MM was developed specifically for incorporation into the 3-D Integrated Site Model (ISM). The MM enables project personnel to obtain calculated mineral abundances at any position, within any region, or within any stratigraphic unit in the model area. The significance of the MM for key aspects of site characterization and performance assessment is explained in the following subsections. This work was conducted in accordance with the Development Plan for the MM (CRWMS M&O 2000). The planning document for this Rev. 00, ICN 02 of this AMR is Technical Work Plan, TWP-NBS-GS-000003, Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01 (CRWMS M&O 2000). The purpose of this ICN is to record changes in the classification of input status by the resolution of the use of TBV software and data in this report. Constraints and limitations of the MM are discussed in the appropriate sections that follow. The MM is one component of the ISM, which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1
Phenomenological analysis of the interacting boson model
Hatch, R. L.; Levit, S.
1982-01-01
The classical Hamiltonian of the interacting boson model is defined and expressed in terms of the conventional quadrupole variables. This is used in the analyses of the dynamics in the various limits of the model. The purpose is to determine the range and the features of the collective phenomena which the interacting boson model is capable of describing. In the commonly used version of the interacting boson model with one type of the s and d bosons and quartic interactions, this capability has certain limitations and the model should be used with care. A more sophisticated version of the interacting boson model with neutron and proton bosons is not discussed. NUCLEAR STRUCTURE: interacting bosons; classical IBM Hamiltonian in quadrupole variables; phenomenological content of the IBM and its limitations.
BIFURCATION ANALYSIS OF A MITOTIC MODEL OF FROG EGGS
Institute of Scientific and Technical Information of China (English)
吕金虎; 张子范; 张锁春
2003-01-01
The mitotic model of frog eggs established by Borisuk and Tyson is qualitatively analyzed. The existence and stability of its steady states are discussed, and the bifurcation behaviour of the model is investigated using theoretical analysis and numerical simulations. At the same time, the numerical results of Tyson are verified by theoretical analysis.
An Object Extraction Model Using Association Rules and Dependence Analysis
Institute of Scientific and Technical Information of China (English)
无
2001-01-01
Extracting objects from legacy systems is a basic step in a system's object-orientation, undertaken to improve the maintainability and understandability of the systems. A new object extraction model using association rules and dependence analysis is proposed. In this model, data are classified by association rules and the corresponding operations are partitioned by dependence analysis.
Stochastic Analysis Method of Sea Environment Simulated by Numerical Models
Institute of Scientific and Technical Information of China (English)
刘德辅; 焦桂英; 张明霞; 温书勤
2003-01-01
This paper proposes the stochastic analysis method of sea environment simulated by numerical models, such as wave height, current field, design sea levels and longshore sediment transport. Uncertainty and sensitivity analysis of input and output factors of numerical models, their long-term distribution and confidence intervals are described in this paper.
Multidimensional data modeling for business process analysis
Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.
2007-01-01
The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging thes...
Evaluating Network Models: A Likelihood Analysis
Wang, Wen-Qiang; Zhou, Tao
2011-01-01
Many models are put forward to mimic the evolution of real networked systems. A well-accepted way to judge their validity is to compare the modeling results with real networks subject to several structural features. Even for a specific real network, we cannot fairly evaluate the goodness of different models since there are too many structural features, and there is no criterion for selecting and assigning weights to them. Motivated by the studies on link prediction algorithms, we propose a unified method to evaluate network models via the comparison of the likelihoods of the currently observed network driven by different models, with the assumption that the higher the likelihood is, the better the model is. We test our method on the real Internet at the Autonomous System (AS) level, and the results suggest that the Generalized Linear Preferential (GLP) model outperforms the Tel Aviv Network Generator (Tang), while both models are better than the Barabási-Albert (BA) and Erdős-Rényi (ER) models. Our metho...
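The likelihood-comparison principle the abstract describes can be sketched in miniature. The example below is purely illustrative and not the authors' method: it scores a toy observed network under an Erdős-Rényi baseline at two candidate edge densities and confirms that the maximum-likelihood density scores higher.

```python
import math
from itertools import combinations

def er_log_likelihood(n, edges, p):
    """Log-likelihood of an observed simple graph under Erdos-Renyi G(n, p)."""
    edge_set = set(frozenset(e) for e in edges)
    ll = 0.0
    for u, v in combinations(range(n), 2):
        # Each node pair is an independent Bernoulli(p) edge trial
        ll += math.log(p) if frozenset((u, v)) in edge_set else math.log(1 - p)
    return ll

# Toy observed network: a 5-node path graph
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
n = 5
# The MLE density is m / C(n, 2); compare against a mis-specified density
p_mle = len(edges) / math.comb(n, 2)
print(er_log_likelihood(n, edges, p_mle) > er_log_likelihood(n, edges, 0.9))  # True
```

Richer models (e.g. preferential attachment) assign pair probabilities that depend on degree, but the comparison proceeds the same way: the model giving the observed network the higher likelihood is preferred.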
The Modeling Analysis of Huangshan Tourism Data
Hu, Shanfeng; Yan, Xinhu; Zhu, Hongbing
2016-06-01
Tourism is the major industry in Huangshan city. This paper analyzes time series of tourism data for Huangshan from 2000 to 2013. The yearly data set comprises the total arrivals of tourists, total income, urban resident disposable income per capita, and net income per peasant. A mathematical model based on binomial approximation and an inverse quadratic radial basis function (RBF) is set up to model the tourist arrivals. The total income, urban resident disposable income per capita, and net income per peasant are also modeled. It is shown that the established mathematical model can be used to forecast tourism information and support good management of Huangshan tourism.
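An inverse quadratic RBF interpolant of the kind the abstract mentions can be sketched as follows. The data values and the shape parameter `eps` are hypothetical, not taken from the Huangshan data set; the sketch only shows the standard construction: solve a linear system for the weights, then evaluate the weighted kernel sum.

```python
import numpy as np

def inverse_quadratic_rbf(x_train, y_train, eps=1.0):
    """Interpolant f(x) = sum_i w_i * phi(x - x_i), phi(d) = 1 / (1 + (eps*d)^2)."""
    x_train = np.asarray(x_train, float)
    y_train = np.asarray(y_train, float)
    # Interpolation matrix Phi[i, j] = phi(x_i - x_j); solve Phi w = y
    diff = x_train[:, None] - x_train[None, :]
    phi = 1.0 / (1.0 + (eps * diff) ** 2)
    w = np.linalg.solve(phi, y_train)

    def f(x):
        d = np.asarray(x, float)[..., None] - x_train
        return (1.0 / (1.0 + (eps * d) ** 2)) @ w

    return f

# Hypothetical yearly arrivals (year index 0..6), illustrative values only
years = np.arange(7.0)
arrivals = np.array([10.0, 12.0, 15.0, 14.0, 18.0, 21.0, 25.0])
f = inverse_quadratic_rbf(years, arrivals, eps=0.5)
print(np.allclose(f(years), arrivals))  # True: exact at the training points
```

The inverse quadratic kernel yields a positive definite interpolation matrix for distinct nodes, so the system is always solvable; forecasting then amounts to evaluating `f` beyond the fitted range, which should be done with caution.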
A Bayesian Analysis of Spectral ARMA Model
Directory of Open Access Journals (Sweden)
Manoel I. Silvestre Bezerra
2012-01-01
Bezerra et al. (2008) proposed a new method, based on Yule-Walker equations, to estimate the ARMA spectral model. In this paper, a Bayesian approach is developed for this model using the noninformative prior proposed by Jeffreys (1967). The Bayesian computations, via Markov chain Monte Carlo (MCMC) simulation, are carried out, and characteristics of the marginal posterior distributions, such as the Bayes estimator and confidence intervals for the parameters of the ARMA model, are derived. Both methods are also compared with the traditional least squares and maximum likelihood approaches, and a numerical illustration with two examples of the ARMA model is presented to evaluate the performance of the procedures.
Meta-analysis a structural equation modeling approach
Cheung, Mike W-L
2015-01-01
Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo
Models as Tools of Analysis of a Network Organisation
Directory of Open Access Journals (Sweden)
Wojciech Pająk
2013-06-01
The paper presents models which may be applied as tools for the analysis of a network organisation. The starting point of the discussion is defining the terms supply chain and network organisation. Further parts of the paper present the basic assumptions of the analysis of a network organisation, and the study then characterises the best known models utilised in such analysis. The purpose of the article is to define the notion and the essence of network organisations and to present the models used for their analysis.
An Intelligent Analysis Model for Multisource Volatile Memory
Directory of Open Access Journals (Sweden)
Xiaolu Zhang
2013-09-01
With the rapid development of networks and distributed computing environments, it has become harder for researchers to perform examinations from only one or a few data sources in persistent-data-oriented approaches, and the same holds for volatile memory analysis. Therefore, automatic analysis of mass data and action modeling need to be considered in order to report an entire network attack process. Modeling a situation with multiple volatile data sources can help understand and describe both the thinking process of the investigator and the possible action steps of the attacker. This paper presents a game model for multisource volatile data and applies it to the analysis of main memory images, with a definition of space-time features for volatile element information. Abstract modeling allows the lessons gleaned to be applied in performing intelligent analysis, evidence filing, and automated presentation. Finally, a test demo based on the model is presented to illustrate the whole procedure.
Integration of Design and Control Through Model Analysis
DEFF Research Database (Denmark)
Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay
2000-01-01
A systematic analysis of the process model is proposed as a pre-solution step for integration of design and control problems. It is shown that the same set of process (control) variables and design (manipulative) variables is employed with different objectives in design and control. Analysis of the phenomena models representing the process model identifies the relationships between the important process and design variables, which help to understand, define and address some of the issues related to integration of design and control. The model analysis is highlighted through examples involving processes with mass and/or energy recycle. (C) 2000 Elsevier Science Ltd. All rights reserved.
Multivariate model for test response analysis
Krishnan, S.; Kerkhoff, H.G.
2010-01-01
A systematic approach to construct an effective multivariate test response model for capturing manufacturing defects in electronic products is described. The effectiveness of the model is demonstrated by its capability in reducing the number of test-points, while achieving the maximal coverage attai
Meta-analysis of clinical prediction models
Debray, T.P.A.
2013-01-01
Over the past decades there has been a clear shift from implicit to explicit diagnosis and prognosis. This includes appreciation of clinical - diagnostic and prognostic - prediction models, which is likely to increase with the introduction of fully computerized patient records. Prediction models aim to pro
Mixture model analysis of complex samples
Wedel, M.; Hofstede, F. ter; Steenkamp, J.-B.E.M.
1997-01-01
This paper investigates asymmetric effects of monetary policy over the business cycle. A two-state Markov Switching Model is employed to model both recessions and expansions. For the United States and Germany, strong evidence is found that monetary policy is more effective in a recession than during
Modelling and analysis of Markov reward automata
Guck, Dennis; Timmer, Mark; Hatefi, Hassan; Ruijters, Enno; Stoelinga, Mariëlle
2014-01-01
Costs and rewards are important ingredients for many types of systems, modelling critical aspects like energy consumption, task completion, repair costs, and memory usage. This paper introduces Markov reward automata, an extension of Markov automata that allows the modelling of systems incorporating
Analysis and modeling of solar irradiance variations
Yeo, K L
2014-01-01
A prominent manifestation of the solar dynamo is the 11-year activity cycle, evident in indicators of solar activity, including solar irradiance. Although a relationship between solar activity and the brightness of the Sun had long been suspected, it was only directly observed after regular satellite measurements became available with the launch of Nimbus-7 in 1978. The measurement of solar irradiance from space is accompanied by the development of models aimed at describing the apparent variability by the intensity excess/deficit effected by magnetic structures in the photosphere. The more sophisticated models, termed semi-empirical, rely on the intensity spectra of photospheric magnetic structures generated with radiative transfer codes from semi-empirical model atmospheres. An established example of such models is SATIRE-S (Spectral And Total Irradiance REconstruction for the Satellite era). One key limitation of current semi-empirical models is the fact that the radiant properties of network and faculae a...
Analysis of Modeling Parameters on Threaded Screws.
Energy Technology Data Exchange (ETDEWEB)
Vigil, Miquela S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brake, Matthew Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vangoethem, Douglas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-06-01
Assembled mechanical systems often contain a large number of bolted connections. These bolted connections (joints) are integral aspects of the load path for structural dynamics and, consequently, are paramount for calculating a structure's stiffness and energy dissipation properties. However, analysts have not found the optimal method to appropriately model these bolted joints. The complexity of the screw geometry causes issues when generating a mesh of the model. This paper explores different approaches to model a screw-substrate connection. Model parameters such as mesh continuity, node alignment, wedge angles, and thread-to-body element size ratios are examined. The results of this study will give analysts a better understanding of the influences of these parameters and will aid in finding the optimal method to model bolted connections.
Quark model analysis of the Sivers function
Courtoy, A; Scopetta, S; Vento, V
2008-01-01
A formalism is developed aimed at evaluating the Sivers function entering single spin asymmetries. The approach is well suited for calculations which use constituent quark models to describe the structure of the nucleon. A non-relativistic approximation of the scheme is performed to calculate the Sivers function using the Isgur-Karl model. The results we have obtained are consistent with a sizable Sivers effect, with an opposite sign for the u and d flavors. This pattern is in agreement with the one found analysing, in the same model, the impact parameter dependent generalized parton distributions. Although a consistent QCD evolution of the results from the momentum scale of the model to the experimental one is not yet possible, an estimate shows that a reasonable agreement with the available data is obtained once the evolution of the model results is performed.
Analysis and modelization of lightweight structures subjected to impact
Barbero Pozuelo, Enrique; López-Puente, Jorge
2008-01-01
The Mechanics of Advanced Materials research group (Department of Continuum Mechanics and Structural Analysis) of the University Carlos III of Madrid (Spain) offers its experience in the analysis and modelization of the high and low velocity impact behaviour of composite structures. Its research focuses on both numerical analysis and non-standard experimental methodologies.
Model-free linkage analysis of a binary trait.
Xu, Wei; Bull, Shelley B; Mirea, Lucia; Greenwood, Celia M T
2012-01-01
Genetic linkage analysis aims to detect chromosomal regions containing genes that influence risk of specific inherited diseases. The presence of linkage is indicated when a disease or trait cosegregates through the families with genetic markers at a particular region of the genome. Two main types of genetic linkage analysis are in common use, namely model-based linkage analysis and model-free linkage analysis. In this chapter, we focus solely on the latter type and specifically on binary traits or phenotypes, such as the presence or absence of a specific disease. Model-free linkage analysis is based on allele-sharing, where patterns of genetic similarity among affected relatives are compared to chance expectations. Because the model-free methods do not require the specification of the inheritance parameters of a genetic model, they are preferred by many researchers at early stages in the study of a complex disease. We introduce the history of model-free linkage analysis in Subheading 1. Table 1 describes a standard model-free linkage analysis workflow. We describe three popular model-free linkage analysis methods, the nonparametric linkage (NPL) statistic, the affected sib-pair (ASP) likelihood ratio test, and a likelihood approach for pedigrees. The theory behind each linkage test is described in this section, together with a simple example of the relevant calculations. Table 4 provides a summary of popular genetic analysis software packages that implement model-free linkage models. In Subheading 2, we work through the methods on a rich example providing sample software code and output. Subheading 3 contains notes with additional details on various topics that may need further consideration during analysis.
Application of dimensional analysis in systems modeling and control design
Balaguer, Pedro
2013-01-01
Dimensional analysis is an engineering tool that is widely applied to numerous engineering problems, but has only recently been applied to control theory and problems such as identification and model reduction, robust control, adaptive control, and PID control. Application of Dimensional Analysis in Systems Modeling and Control Design provides an introduction to the fundamentals of dimensional analysis for control engineers, and shows how they can exploit the benefits of the technique to theoretical and practical control problems.
Root analysis and implications to analysis model in ATLAS
Shibata, A
2008-01-01
An impressive amount of effort has been put into realizing a set of frameworks to support analysis in this new paradigm of GRID computing. However, much more than half of a physicist's time is typically spent after the GRID processing of the data. Due to the private nature of this level of analysis, there has been little common framework or methodology. While most physicists agree to use ROOT as the basis of their analysis, a number of approaches are possible for the implementation of the analysis using ROOT: conventional methods using CINT/ACLiC, development using g++, an alternative interface through python, and parallel processing methods such as PROOF are some of the choices currently available on the market. Furthermore, in the ATLAS collaboration an additional layer of technology adds to the complexity because the data format is based on the POOL technology, which tends to be less portable. In this study, various modes of ROOT analysis are profiled for comparison, with the main focus on the processing speed....
MODAL ANALYSIS OF QUARTER CAR MODEL SUSPENSION SYSTEM
Viswanath K. Allamraju
2016-01-01
A suspension system is very important for the comfortable driving and travelling of passengers. Therefore, this study provides a numerical tool for modeling and analyzing a two-degree-of-freedom quarter car model suspension system. Modal analysis plays a vital role in designing the suspension system. This paper presents the modal analysis of a quarter car model suspension system considering the undamped and damped cases. The modal and vertical equations of motion describing the su...
European Climate - Energy Security Nexus. A model based scenario analysis
International Nuclear Information System (INIS)
In this research, we provide an overview of the climate-security nexus in the European energy sector through a model-based scenario analysis with the POLES model. The analysis underlines that under stringent climate policies, Europe takes advantage of a double dividend: in its capacity to develop a new, cleaner energy model, and in a lower vulnerability to potential shocks on the international energy markets. (authors)
Sensitivity Analysis of the Gap Heat Transfer Model in BISON.
Energy Technology Data Exchange (ETDEWEB)
Swiler, Laura Painton; Schmidt, Rodney C.; Williamson, Richard (INL); Perez, Danielle (INL)
2014-10-01
This report summarizes the result of a NEAMS project focused on sensitivity analysis of the heat transfer model in the gap between the fuel rod and the cladding used in the BISON fuel performance code of Idaho National Laboratory. Using the gap heat transfer models in BISON, the sensitivity of the modeling parameters and the associated responses is investigated. The study results in a quantitative assessment of the role of various parameters in the analysis of gap heat transfer in nuclear fuel.
Probabilistic forward model for electroencephalography source analysis
Energy Technology Data Exchange (ETDEWEB)
Plis, Sergey M [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); George, John S [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Jun, Sung C [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Ranken, Doug M [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Volegov, Petr L [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Schmidt, David M [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States)
2007-09-07
Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates.
Analysis of Brown camera distortion model
Nowakowski, Artur; Skarbek, Władysław
2013-10-01
Contemporary image acquisition devices introduce optical distortion into images. It results in pixel displacement and therefore needs to be compensated for in many computer vision applications. The distortion is usually modeled by the Brown distortion model, whose parameters can be included in the camera calibration task. In this paper we describe the original model and its dependencies, and analyze orthogonality with regard to radius for its decentering distortion component. We also report experiments with the camera calibration algorithm included in the OpenCV library; in particular, the stability of distortion parameter estimation is evaluated.
Multifractal modelling and 3D lacunarity analysis
Energy Technology Data Exchange (ETDEWEB)
Hanen, Akkari, E-mail: bettaieb.hanen@topnet.t [Laboratoire de biophysique, TIM, Faculte de Medecine (Tunisia); Imen, Bhouri, E-mail: bhouri_imen@yahoo.f [Unite de recherche ondelettes et multifractals, Faculte des sciences (Tunisia); Asma, Ben Abdallah, E-mail: asma.babdallah@cristal.rnu.t [Laboratoire de biophysique, TIM, Faculte de Medecine (Tunisia); Patrick, Dubois, E-mail: pdubois@chru-lille.f [INSERM, U 703, Lille (France); Hedi, Bedoui Mohamed, E-mail: medhedi.bedoui@fmm.rnu.t [Laboratoire de biophysique, TIM, Faculte de Medecine (Tunisia)
2009-09-28
This study presents a comparative evaluation of lacunarity of 3D grey level models with different types of inhomogeneity. A new method based on the 'Relative Differential Box Counting' was developed to estimate the lacunarity features of grey level volumes. To validate our method, we generated a set of 3D grey level multifractal models with random, anisotropic and hierarchical properties. Our method gives a lacunarity measurement correlated with the theoretical one and allows a better model classification compared with a classical approach.
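The authors' 'Relative Differential Box Counting' variant is not spelled out in the abstract, but the classical gliding-box lacunarity it builds on can be sketched as follows (a pure-Python illustration on a nested-list volume; the function name is our own):

```python
def lacunarity_3d(vol, r):
    """Classical gliding-box lacunarity of a 3D grey-level volume.

    vol: nested list vol[z][y][x]; r: box edge length.
    Lambda(r) = E[M^2] / E[M]^2 over all box positions, where M is
    the mass (voxel sum) inside each r x r x r gliding box.
    """
    nz, ny, nx = len(vol), len(vol[0]), len(vol[0][0])
    masses = []
    for z in range(nz - r + 1):
        for y in range(ny - r + 1):
            for x in range(nx - r + 1):
                m = sum(vol[z + dz][y + dy][x + dx]
                        for dz in range(r) for dy in range(r) for dx in range(r))
                masses.append(m)
    n = len(masses)
    m1 = sum(masses) / n
    m2 = sum(m * m for m in masses) / n
    return m2 / (m1 * m1)
```

A perfectly homogeneous volume gives a lacunarity of exactly 1; inhomogeneity (gaps or bright clusters) raises it, which is the property the paper's classification of random, anisotropic, and hierarchical models exploits.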
Comparative Analysis of Uncertainties in Urban Surface Runoff Modelling
DEFF Research Database (Denmark)
Thorndahl, Søren; Schaarup-Jensen, Kjeld
2007-01-01
volumes are compared - especially when the models are uncalibrated. The occurrences of flooding and surcharge are highly dependent on both hydrological and hydrodynamic parameters. Thus, the conclusion of the paper is that if the use of model simulations is to be a reliable tool for drainage system...... analysis, further research in improved parameter assessment for surface runoff models is needed....
Mathematical Modelling and Experimental Analysis of Early Age Concrete
DEFF Research Database (Denmark)
Hauggaard-Nielsen, Anders Boe
1997-01-01
lead to cracks in the later cooling phase. The material model has intricate couplings between the mechanisms involved, and in the thesis special emphasis is placed on the creep behaviour. The mathematical models are based on experimental analysis and numerical implementation of the models in a finite...
Feature Analysis for Modeling Game Content Quality
DEFF Research Database (Denmark)
Shaker, Noor; Yannakakis, Georgios N.; Togelius, Julian
2011-01-01
’ preferences, and by defining the smallest game session size for which the model can still predict reported emotion with acceptable accuracy. Neuroevolutionary preference learning is used to approximate the function from game content to reported emotional preferences. The experiments are based on a modified......entertainment for individual game players is to tailor player experience in real-time via automatic game content generation. Modeling the relationship between game content and player preferences or affective states is an important step towards this type of game personalization. In this paper we......, and that the models built on selected features derived from the whole set of extracted features (combining the two types of features) outperforms other models constructed on partial information about game content....
Modeling and analysis of web portals performance
Abdul Rahim, Rahela; Ibrahim, Haslinda; Syed Yahaya, Sharipah Soaad; Khalid, Khairini
2011-10-01
The main objective of this study is to develop a queuing-theory model, at the system level, of the web portal performance of a university. A system-level performance model views the system being modeled as a 'black box', characterized by the arrival rate of packets at the portal server and the service rate of the portal server. These two parameters are the key inputs for Web portal performance metrics such as server utilization, average server throughput, average number of packets in the server, and mean response time. The study assumes an infinite population and a finite queue. The proposed analytical model is simple, in that it is easy to define and its results are fast to interpret, yet it still represents the real situation.
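Under the stated assumptions (infinite population, finite queue, single server), the portal server corresponds to the classical M/M/1/K queue, whose standard closed-form metrics can be sketched as below; the paper's exact formulation may differ, and the function name is our own:

```python
def mm1k_metrics(lam, mu, K):
    """Metrics of an M/M/1/K queue: Poisson arrivals (rate lam),
    exponential service (rate mu), at most K packets in the system;
    arrivals that find the buffer full are dropped."""
    rho = lam / mu
    if abs(rho - 1.0) < 1e-12:
        p = [1.0 / (K + 1)] * (K + 1)              # uniform state probabilities
    else:
        p0 = (1.0 - rho) / (1.0 - rho ** (K + 1))
        p = [p0 * rho ** n for n in range(K + 1)]
    L = sum(n * pn for n, pn in enumerate(p))      # mean number in system
    throughput = lam * (1.0 - p[K])                # accepted packet rate
    utilization = 1.0 - p[0]                       # fraction of time server busy
    W = L / throughput                             # mean response time (Little's law)
    return utilization, throughput, L, W
```

For a lightly loaded server with a large buffer the metrics approach the plain M/M/1 values, e.g. L = rho/(1 - rho).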
Complex networks analysis in socioeconomic models
Varela, Luis M; Ausloos, Marcel; Carrete, Jesus
2014-01-01
This chapter aims at reviewing complex networks models and methods that were either developed for or applied to socioeconomic issues, and pertinent to the theme of New Economic Geography. After an introduction to the foundations of the field of complex networks, the present summary adds insights on the statistical mechanical approach, and on the most relevant computational aspects for the treatment of these systems. As the most frequently used model for interacting agent-based systems, a brief description of the statistical mechanics of the classical Ising model on regular lattices, together with recent extensions of the same model on small-world Watts-Strogatz and scale-free Albert-Barabasi complex networks is included. Other sections of the chapter are devoted to applications of complex networks to economics, finance, spreading of innovations, and regional trade and developments. The chapter also reviews results involving applications of complex networks to other relevant socioeconomic issues, including res...
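As an illustration of the interacting-agent systems the chapter reviews, here is a minimal stdlib-only sketch of the Ising model with Metropolis dynamics on a Watts-Strogatz small-world graph (standard ring-lattice-plus-rewiring construction; all names are ours, and this is not the chapter's code):

```python
import math
import random

def watts_strogatz(n, k, beta, rng):
    """Ring of n nodes, each linked to its k nearest neighbours on one
    side; each edge's far endpoint is rewired with probability beta."""
    edges = set()
    for i in range(n):
        for j in range(1, k + 1):
            a, b = i, (i + j) % n
            if rng.random() < beta:
                b = rng.randrange(n)
                while b == a or (min(a, b), max(a, b)) in edges:
                    b = rng.randrange(n)
            edges.add((min(a, b), max(a, b)))
    adj = {i: [] for i in range(n)}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    return adj

def metropolis_sweep(spins, adj, T, rng):
    """One Metropolis sweep of the ferromagnetic Ising model (J = 1)."""
    for i in range(len(spins)):
        dE = 2 * spins[i] * sum(spins[j] for j in adj[i])
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spins[i] = -spins[i]
```

At low temperature an ordered configuration is essentially frozen (flipping an aligned spin costs energy 2 * degree), which is the qualitative behaviour extended to small-world and scale-free topologies in the works the chapter surveys.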
Tradeoff Analysis for Optimal Multiobjective Inventory Model
Longsheng Cheng; Ching-Shih Tsou; Ming-Chang Lee; Li-Hua Huang; Dingwei Song; Wei-Shan Teng
2013-01-01
The deterministic inventory model, the economic order quantity (EOQ), reveals that inventory carrying and ordering frequency follow a tradeoff relation. For probabilistic demand, the tradeoff surface among annual orders, expected inventory, and shortage is useful because it quantifies what the firm must pay in ordering workload and inventory investment to achieve the desired customer service. Based on a triobjective inventory model, this paper employs the successive approximation to obtai...
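The EOQ tradeoff mentioned above is easy to make concrete: the optimal order quantity balances annual ordering cost against annual holding cost (a textbook sketch, not the paper's triobjective model; parameter values below are illustrative):

```python
import math

def eoq(D, S, H):
    """Economic order quantity for annual demand D, fixed cost per
    order S, and holding cost H per unit per year."""
    return math.sqrt(2.0 * D * S / H)

def annual_cost(Q, D, S, H):
    """Ordering cost (D/Q)*S trades off against holding cost (Q/2)*H:
    small Q means frequent orders, large Q means high carried stock."""
    return (D / Q) * S + (Q / 2.0) * H
```

At the optimum the two cost components are exactly equal, which is the tradeoff relation the abstract refers to.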
Urban drainage models - making uncertainty analysis simple
DEFF Research Database (Denmark)
Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana;
2012-01-01
There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...... probability distributions (often used for sensitivity analyses) and prediction intervals. To demonstrate the new method, it is applied to a conceptual rainfall-runoff model using a dataset collected from Melbourne, Australia....
Nonlinear Analysis and Modeling of Tires
Noor, Ahmed K.
1996-01-01
The objective of the study was to develop efficient modeling techniques and computational strategies for: (1) predicting the nonlinear response of tires subjected to inflation pressure, mechanical and thermal loads; (2) determining the footprint region, and analyzing the tire pavement contact problem, including the effect of friction; and (3) determining the sensitivity of the tire response (displacements, stresses, strain energy, contact pressures and contact area) to variations in the different material and geometric parameters. Two computational strategies were developed. In the first strategy the tire was modeled by using either a two-dimensional shear flexible mixed shell finite elements or a quasi-three-dimensional solid model. The contact conditions were incorporated into the formulation by using a perturbed Lagrangian approach. A number of model reduction techniques were applied to substantially reduce the number of degrees of freedom used in describing the response outside the contact region. The second strategy exploited the axial symmetry of the undeformed tire, and uses cylindrical coordinates in the development of three-dimensional elements for modeling each of the different parts of the tire cross section. Model reduction techniques are also used with this strategy.
Analysis of multistate models for electromigration failure
Dwyer, V. M.
2010-02-01
The application of a multistate Markov chain is considered as a model of electromigration interconnect degradation and eventual failure. Such a model has already been used [Tan et al., J. Appl. Phys. 102, 103703 (2007)], maintaining that, in general, it leads to a failure distribution described by a gamma mixture, and that as a result, this type of distribution (rather than a lognormal) should be used as a prior in any Bayesian model fitting and subsequent reliability budgeting. Although it appears that the model is able to produce reasonably realistic resistance curves R(t), we are unable to find any evidence that the failure distribution is a simple gamma mixture except under contrived conditions. The distributions generated are largely sums of exponentials (phase-type distributions), convolutions of gamma distributions with different scales, or roughly normal. We note also some inconsistencies in the derivation of the gamma mixture in the work cited above and conclude that, as it stands, the Markov chain model is probably unsuitable for electromigration modeling and a change from lognormal to gamma mixture distribution generally cannot be justified in this way. A hidden Markov model, which describes the interconnect behavior at time t rather than its resistance, in terms of generally observed physical processes such as void nucleating, slitlike growth (where the growth is slow and steady), transverse growth, current shunting (where the resistance jumps in value), etc., seems a more likely prospect, but treating failure in such a manner would still require significant justification.
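The phase-type distributions mentioned here arise naturally from absorbing Markov chains. A minimal sketch (illustrative rates, not Tan et al.'s model) samples the failure time of a linear degradation chain; the result is a convolution of exponentials with different scales rather than a gamma mixture:

```python
import random

def time_to_failure(rates, rng):
    """Absorption time of a linear degradation chain: the interconnect
    passes through states 0 -> 1 -> ... -> failure, holding in each
    state for an exponentially distributed time with the given rate.
    The sum is a phase-type distribution (hypoexponential here)."""
    return sum(rng.expovariate(r) for r in rates)
```

The mean failure time is simply the sum of the mean holding times, 1/r for each stage, which makes simulated samples easy to sanity-check.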
TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION
Directory of Open Access Journals (Sweden)
Goran Klepac
2007-12-01
Full Text Available The REFII model is an authorial mathematical model for time series data mining. Its main purpose is to automate time series analysis through a unique transformation model of time series. An advantage of this approach is that it links different methods of time series analysis, connects traditional data mining tools with time series, and supports the construction of new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, meaning it is not limited to a fixed set of methods. At its core, it is a model for transforming the values of a time series, which prepares the data for use by different sets of methods based on the same transformation model within the problem domain. The REFII model thus offers a new approach to time series analysis based on a unique transformation model, which serves as a basis for all kinds of time series analysis. Its advantage is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.
Urban Sprawl Analysis and Modeling in Asmara, Eritrea
Tewolde, Mussie G.; Pedro Cabral
2011-01-01
The extension of urban perimeter markedly cuts available productive land. Hence, studies in urban sprawl analysis and modeling play an important role to ensure sustainable urban development. The urbanization pattern of the Greater Asmara Area (GAA), the capital of Eritrea, was studied. Satellite images and geospatial tools were employed to analyze the spatiotemporal urban landuse changes. Object-Based Image Analysis (OBIA), Landuse Cover Change (LUCC) analysis and urban sprawl analysis using ...
COMPUTER DATA ANALYSIS AND MODELING: COMPLEX STOCHASTIC DATA AND SYSTEMS
2010-01-01
This collection of papers includes proceedings of the Ninth International Conference “Computer Data Analysis and Modeling: Complex Stochastic Data and Systems” organized by the Belarusian State University and held in September 2010 in Minsk. The papers are devoted to the topical problems: robust and nonparametric data analysis; statistical analysis of time series and forecasting; multivariate data analysis; design of experiments; statistical signal and image processing...
Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging
Kiuru Aaro; Kormano Martti; Svedström Erkki; Liang Jianming; Järvi Timo
2003-01-01
The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion ana...
Simplified Analysis Model for Predicting Pyroshock Responses on Composite Panel
Iwasa, Takashi; Shi, Qinzhong
A simplified analysis model based on frequency response analysis and wave propagation analysis was established for predicting the Shock Response Spectrum (SRS) on a composite panel subjected to pyroshock loadings. The complex composite panel was modeled as an isotropic single-layer panel as defined in the NASA Lewis Method. Through an impact excitation test on a composite panel with no equipment mounted, it was shown that the simplified analysis model can accurately estimate the SRS as well as the acceleration peak values in both the near and far field. In addition, through simulation of actual pyroshock tests on an actual satellite system, the simplified analysis model was shown to be applicable to predicting actual pyroshock responses, while bringing forth several technical issues for estimating pyroshock test specifications in early design stages.
The case for repeatable analysis with energy economy optimization models
International Nuclear Information System (INIS)
Energy economy optimization (EEO) models employ formal search techniques to explore the future decision space over several decades in order to deliver policy-relevant insights. EEO models are a critical tool for decision-makers who must make near-term decisions with long-term effects in the face of large future uncertainties. While the number of model-based analyses proliferates, insufficient attention is paid to transparency in model development and application. Given the complex, data-intensive nature of EEO models and the general lack of access to source code and data, many of the assumptions underlying model-based analysis are hidden from external observers. This paper discusses the simplifications and subjective judgments involved in the model building process, which cannot be fully articulated in journal papers, reports, or model documentation. In addition, we argue that for all practical purposes, EEO model-based insights cannot be validated through comparison to real world outcomes. As a result, modelers are left without credible metrics to assess a model's ability to deliver reliable insight. We assert that EEO models should be discoverable through interrogation of publicly available source code and data. In addition, third parties should be able to run a specific model instance in order to independently verify published results. Yet a review of twelve EEO models suggests that in most cases, replication of model results is currently impossible. We provide several recommendations to help develop and sustain a software framework for repeatable model analysis.
Constrained Overcomplete Analysis Operator Learning for Cosparse Signal Modelling
Yaghoobi, Mehrdad; Gribonval, Remi; Davies, Mike E
2012-01-01
We consider the problem of learning a low-dimensional signal model from a collection of training samples. The mainstream approach would be to learn an overcomplete dictionary to provide good approximations of the training samples using sparse synthesis coefficients. This famous sparse model has a less well known counterpart, in analysis form, called the cosparse analysis model. In this new model, signals are characterised by their parsimony in a transformed domain using an overcomplete (linear) analysis operator. We propose to learn an analysis operator from a training corpus using a constrained optimisation framework based on L1 optimisation. The reason for introducing a constraint in the optimisation framework is to exclude trivial solutions. Although there is no final answer here for which constraint is the most relevant constraint, we investigate some conventional constraints in the model adaptation field and use the uniformly normalised tight frame (UNTF) for this purpose. We then derive a practical lear...
Statistical Performance Analysis and Modeling Techniques for Nanometer VLSI Designs
Shen, Ruijing; Yu, Hao
2012-01-01
Since process variation and chip performance uncertainties have become more pronounced as technologies scale down into the nanometer regime, accurate and efficient modeling or characterization of variations from the device to the architecture level have become imperative for the successful design of VLSI chips. This book provides readers with tools for variation-aware design methodologies and computer-aided design (CAD) of VLSI systems, in the presence of process variations at the nanometer scale. It presents the latest developments for modeling and analysis, with a focus on statistical interconnect modeling, statistical parasitic extractions, statistical full-chip leakage and dynamic power analysis considering spatial correlations, statistical analysis and modeling for large global interconnects and analog/mixed-signal circuits. Provides readers with timely, systematic and comprehensive treatments of statistical modeling and analysis of VLSI systems with a focus on interconnects, on-chip power grids and ...
Three dimensional mathematical model of tooth for finite element analysis
Directory of Open Access Journals (Sweden)
Puškar Tatjana
2010-01-01
Full Text Available Introduction. The mathematical model of the abutment tooth is the starting point of the finite element analysis of stress and deformation of dental structures. The simplest and easiest way is to form a model according to literature data on the dimensions and morphological characteristics of teeth. Our method is based on forming 3D models using standard geometrical forms (objects) in programs for solid modeling. Objective. Forming the mathematical model of the abutment of the second upper premolar for finite element analysis of stress and deformation of dental structures. Methods. The abutment tooth has the form of a complex geometric object, suitable for modeling in the solid modeling program SolidWorks. After analyzing the literature data on the morphological characteristics of teeth, we started the modeling by dividing the tooth (a complex geometric body) into simple geometric bodies (cylinder, cone, pyramid, etc.). By joining simple geometric bodies or subtracting them from the basic body, we formed the complex geometric body, the tooth. The model was then transferred into Abaqus, a computational program for finite element analysis, using ACIS SAT, a standard file format for transferring 3D models. Results. Using the solid modeling program SolidWorks, we developed three models of the abutment of the second maxillary premolar: the model of the intact abutment, the model of the endodontically treated tooth with two remaining cavity walls, and the model of the endodontically treated tooth with two remaining walls and an inserted post. Conclusion. Mathematical models of the abutment made according to the literature data are very similar to the real abutment, and the simplifications are minimal. These models enable calculations of stress and deformation of the dental structures. The finite element analysis provides useful information for understanding biomechanical problems and gives guidance for clinical research.
Information gap analysis of flood model uncertainties and regional frequency analysis
Hine, Daniel; Hall, Jim W.
2010-01-01
Flood risk analysis is subject to often severe uncertainties, which can potentially undermine flood management decisions. This paper explores the use of information gap theory to analyze the sensitivity of flood management decisions to uncertainties in flood inundation models and flood frequency analysis. Information gap is a quantified nonprobabilistic theory of robustness. To analyze uncertainties in flood modeling, an energy-bounded information gap model is established and applied first to a simplified uniform channel and then to a more realistic 2-D flood model. Information gap theory is then applied to the estimation of flood discharges using regional frequency analysis. The use of an information gap model is motivated by the notion that hydrologically similar sites are clustered in the space of their L moments. The information gap model is constructed around a parametric statistical flood frequency analysis, resulting in a hybrid model of uncertainty in which natural variability is handled statistically while epistemic uncertainties are represented in the information gap model. The analysis is demonstrated for sites in the Trent catchment, United Kingdom. The analysis is extended to address ungauged catchments, which, because of the attendant uncertainties in flood frequency analysis, are particularly appropriate for information gap analysis. Finally, the information gap model of flood frequency is combined with the treatment of hydraulic model uncertainties in an example of how both sources of uncertainty can be accounted for using information gap theory in a flood risk management decision.
Theory and application of experimental model analysis in earthquake engineering
Moncarz, P. D.
The feasibility and limitations of small-scale model studies in earthquake engineering research and practice is considered with emphasis on dynamic modeling theory, a study of the mechanical properties of model materials, the development of suitable model construction techniques and an evaluation of the accuracy of prototype response prediction through model case studies on components and simple steel and reinforced concrete structures. It is demonstrated that model analysis can be used in many cases to obtain quantitative information on the seismic behavior of complex structures which cannot be analyzed confidently by conventional techniques. Methodologies for model testing and response evaluation are developed in the project and applications of model analysis in seismic response studies on various types of civil engineering structures (buildings, bridges, dams, etc.) are evaluated.
Modeling of Supply Chain Contextual-Load Model for Instability Analysis
Kadirkamanathan, Nordin Saad Visakan; Bennett, Stuart
2008-01-01
Simulation using a DES is an effective tool for capturing dynamically changing supply chain variables, allowing the system to be modeled more realistically. The modelling, simulation, and analysis of a supply chain discussed in this chapter are a preliminary attempt to establish a methodology for modelling, simulating, and analyzing a supply chain using a DES. Further problems remain to be considered, both in the development of the model and in the experimental design. The main contributions o...
Applications of model theory to functional analysis
Iovino, Jose
2014-01-01
During the last two decades, methods that originated within mathematical logic have exhibited powerful applications to Banach space theory, particularly set theory and model theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the
CRITICAL ANALYSIS OF EVALUATION MODEL LOMCE
Directory of Open Access Journals (Sweden)
José Luis Bernal Agudo
2015-06-01
Full Text Available The evaluation model projected by the LOMCE is rooted in neoliberal beliefs, reflecting a specific way of understanding the world. What matters is not the process but the results, with evaluation placed at the center of the teaching-learning processes. The planning is poor, since the theory that justifies the model is not developed into coherent proposals; there is an excessive concern for excellence, and diversity is left out. A comprehensive way of understanding education should be recovered.
Theoretical analysis and modeling for nanoelectronics
Baccarani, Giorgio; Gnani, Elena; Gnudi, Antonio; Reggiani, Susanna
2016-11-01
In this paper we review the evolution of Microelectronics and its transformation into Nanoelectronics, following the predictions of Moore's law, and some of the issues related to this evolution. Next, we discuss the requirements of device modeling and the solutions proposed over the years to address the physical effects of extreme device miniaturization, such as hot-electron effects, band splitting into multiple sub-bands, quasi-ballistic transport and electron tunneling. The most important physical models are briefly highlighted, and a few simulation results for heterojunction TFETs are reported and discussed.
MODELING AND ANALYSIS OF REGIONAL BOUNDARY SYSTEM
Institute of Scientific and Technical Information of China (English)
YAN Guangle; WANG Huanchen
2001-01-01
In this paper, the problems of modeling and analyzing systems with changeable boundaries are researched. First, a kind of expanding system is set up, in which the changeable boundary is treated as a regional boundary. Then relative models are developed to describe the regional boundary system. Next, the transition, or driftage, of bifurcation points in the system is discussed. A fascinating case is studied in which two or more classes of chaotic attractive points coexist or exist alternately in the same system. Lastly, an effective new method of chaos avoidance for the system is put forward.
MODELING HUMAN RELIABILITY ANALYSIS USING MIDAS
Energy Technology Data Exchange (ETDEWEB)
Ronald L. Boring; Donald D. Dudenhoeffer; Bruce P. Hallbert; Brian F. Gore
2006-05-01
This paper summarizes an emerging collaboration between Idaho National Laboratory and NASA Ames Research Center regarding the utilization of high-fidelity MIDAS simulations for modeling control room crew performance at nuclear power plants. The key envisioned uses for MIDAS-based control room simulations are: (i) the estimation of human error with novel control room equipment and configurations, (ii) the investigative determination of risk significance in recreating past event scenarios involving control room operating crews, and (iii) the certification of novel staffing levels in control rooms. It is proposed that MIDAS serves as a key component for the effective modeling of risk in next generation control rooms.
ANALYSIS OF SOFTWARE COST ESTIMATION MODELS
Tahir Abdullah; Rabia Saleem; Shahbaz Nazeer; Muhammad Usman
2012-01-01
Software Cost estimation is a process of forecasting the Cost of project in terms of budget, time, and other resources needed to complete a software system and it is a core issue in the software project management to estimate the cost of a project before initiating the Software Project. Different models have been developed to estimate the cost of software projects for the last several years. Most of these models rely on the Analysts’ experience, size of the software project and some other sof...
A patient-centered care ethics analysis model for rehabilitation.
Hunt, Matthew R; Ells, Carolyn
2013-09-01
There exists a paucity of ethics resources tailored to rehabilitation. To help fill this ethics resource gap, the authors developed an ethics analysis model specifically for use in rehabilitation care. The Patient-Centered Care Ethics Analysis Model for Rehabilitation is a process model to guide careful moral reasoning for particularly complex or challenging matters in rehabilitation. The Patient-Centered Care Ethics Analysis Model for Rehabilitation was developed over several iterations, with feedback at different stages from rehabilitation professionals and bioethics experts. Development of the model was explicitly informed by the theoretical grounding of patient-centered care and the context of rehabilitation, including the International Classification of Functioning, Disability and Health. Being patient centered, the model encourages (1) shared control of consultations, decisions about interventions, and management of the health problems with the patient and (2) understanding the patient as a whole person who has individual preferences situated within social contexts. Although the major process headings of the Patient-Centered Care Ethics Analysis Model for Rehabilitation resemble typical ethical decision-making and problem-solving models, the probes under those headings direct attention to considerations relevant to rehabilitation care. The Patient-Centered Care Ethics Analysis Model for Rehabilitation is a suitable tool for rehabilitation professionals to use (in real time, for retrospective review, and for training purposes) to help arrive at ethical outcomes.
ANALYSIS/MODEL COVER SHEET, MULTISCALE THERMOHYDROLOGIC MODEL
International Nuclear Information System (INIS)
The purpose of the Multiscale Thermohydrologic Model (MSTHM) is to describe the thermohydrologic evolution of the near-field environment (NFE) and engineered barrier system (EBS) throughout the potential high-level nuclear waste repository at Yucca Mountain for a particular engineering design (CRWMS M&O 2000c). The process-level model will provide thermohydrologic (TH) information and data (such as in-drift temperature, relative humidity, liquid saturation, etc.) for use in other technical products. This data is provided throughout the entire repository area as a function of time. The MSTHM couples the Smeared-heat-source Drift-scale Thermal-conduction (SDT), Line-average-heat-source Drift-scale Thermohydrologic (LDTH), Discrete-heat-source Drift-scale Thermal-conduction (DDT), and Smeared-heat-source Mountain-scale Thermal-conduction (SMT) submodels such that the flow of water and water vapor through partially-saturated fractured rock is considered. The MSTHM accounts for 3-D drift-scale and mountain-scale heat flow, repository-scale variability of stratigraphy and infiltration flux, and waste package (WP)-to-WP variability in heat output from WPs. All submodels use the nonisothermal unsaturated-saturated flow and transport (NUFT) simulation code. The MSTHM is implemented in several data-processing steps. The four major steps are: (1) submodel input-file preparation, (2) execution of the four submodel families with the use of the NUFT code, (3) execution of the multiscale thermohydrologic abstraction code (MSTHAC), and (4) binning and post-processing (i.e., graphics preparation) of the output from MSTHAC. Section 6 describes the MSTHM in detail. The objectives of this Analyses and Model Report (AMR) are to investigate near field (NF) and EBS thermohydrologic environments throughout the repository area at various evolution periods, and to provide TH data that may be used in other process model reports
Coverage Modeling and Reliability Analysis Using Multi-state Function
Institute of Scientific and Technical Information of China (English)
NONE
2007-01-01
Fault tree analysis is an effective method for predicting the reliability of a system. It gives a pictorial representation and logical framework for analyzing the reliability. Also, it has been used for a long time as an effective method for the quantitative and qualitative analysis of the failure modes of critical systems. In this paper, we propose a new general coverage model (GCM) based on hardware independent faults. Using this model, an effective software tool can be constructed to detect, locate, and recover faults in the faulty system. This model can be applied to identify the key component that can cause the failure of the system using failure mode effect analysis (FMEA).
Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis
Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.
2013-12-01
Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
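The Monte Carlo step can be sketched generically: sample the uncertain inputs, run the model, and read off percentiles of the predicted indoor air concentration. The sketch below replaces the 3-D transport model with a single attenuation factor, and all parameter values and distributions are assumptions for illustration, not the study's calibrated inputs:

```python
import math
import random

def indoor_concentration(c_source, alpha):
    """Indoor air concentration as source vapour concentration times an
    attenuation factor alpha -- a drastic simplification of a 3-D model."""
    return c_source * alpha

def monte_carlo(n, rng):
    """Propagate input uncertainty: sample the uncertain parameters,
    run the model, and return the 5th/50th/95th percentiles."""
    out = []
    for _ in range(n):
        # Assumed (illustrative) priors, not site-calibrated values:
        c_source = rng.lognormvariate(math.log(100.0), 0.5)  # ug/m3 at source
        alpha = rng.lognormvariate(math.log(1e-3), 1.0)      # attenuation factor
        out.append(indoor_concentration(c_source, alpha))
    out.sort()
    return out[n // 20], out[n // 2], out[n - n // 20 - 1]
```

In the full study the inner model call would be the validated 3-D simulation; the surrounding sampling loop and percentile summary are the same.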
Continuum methods of physical modeling continuum mechanics, dimensional analysis, turbulence
Hutter, Kolumban
2004-01-01
The book unifies classical continuum mechanics and turbulence modeling: the same fundamental concepts are used to derive model equations for material behaviour and turbulence closure, and these are complemented with methods of dimensional analysis. The intention is to equip readers with the ability to understand complex nonlinear modeling in material behaviour and turbulence closure, as well as to derive or invent models of their own. Examples are mostly taken from environmental physics and geophysics.
Employing Power Graph Analysis to Facilitate Modeling Molecular Interaction Networks
Directory of Open Access Journals (Sweden)
Momchil Nenov
2015-04-01
Full Text Available Mathematical modeling is used to explore and understand complex systems ranging from weather patterns to social networks to gene-expression regulatory mechanisms. There is an upper limit, imposed by finite computational resources, to the amount of detail that can be reflected in a model. Thus, there are methods to reduce the complexity of the modeled system to its most significant parameters. We discuss the suitability of clustering techniques, in particular Power Graph Analysis, as an intermediate step in modeling.
Review and analysis of biomass gasification models
DEFF Research Database (Denmark)
Puig Arnavat, Maria; Bruno, Joan Carles; Coronas, Alberto
2010-01-01
The use of biomass as a source of energy has been further enhanced in recent years and special attention has been paid to biomass gasification. Due to the increasing interest in biomass gasification, several models have been proposed in order to explain and understand this complex process, and th...
Vibration analysis with MADYMO human models
Verver, M.M.; Hoof, J.F.A.M. van
2002-01-01
The importance of comfort for the automotive industry is increasing. Car manufacturers use comfort to distinguish their products from their competitors. However, the development and design of a new car seat or interior is very time consuming and expensive. The introduction of computer models of huma
Future of human models for crash analysis
Wismans, J.S.H.M.; Happee, R.; Hoof, J.F.A.M. van; Lange, R. de
2001-01-01
In the crash safety field mathematical models can be applied in practically all area's of research and development including: reconstruction of actual accidents, design (CAD) of the crash response of vehicles, safety devices and roadside facilities and in support of human impact biomechanical studie
Social Ecological Model Analysis for ICT Integration
Zagami, Jason
2013-01-01
ICT integration of teacher preparation programmes was undertaken by the Australian Teaching Teachers for the Future (TTF) project in all 39 Australian teacher education institutions and highlighted the need for guidelines to inform systemic ICT integration approaches. A Social Ecological Model (SEM) was used to positively inform integration…
Analysis and modeling of "focus" in context
DEFF Research Database (Denmark)
Hovy, Dirk; Anumanchipalli, Gopala; Parlikar, Alok;
2013-01-01
or speech stimuli. We then build models to show how well we predict that focus word from lexical (and higher) level features. Also, using spectral and prosodic information, we show the differences in these focus words when spoken with and without context. Finally, we show how we can improve speech synthesis...... of these utterances given focus information....
QuantUM: Quantitative Safety Analysis of UML Models
Directory of Open Access Journals (Sweden)
Florian Leitner-Fischer
2011-07-01
Full Text Available When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods it is still difficult for software and system architects to integrate these techniques into their every day work. This is mainly due to the lack of methods that can be directly applied to architecture level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial strength case study.
Diagnostics and future evolution analysis of the two parametric models
Yang, Guang; Meng, Xinhe
2016-01-01
In this paper, we apply three diagnostics, including $Om$, the Statefinder hierarchy and the growth rate of perturbations, to discriminate between two parametric models for the effective pressure and the $\Lambda$CDM model. Using the $Om$ diagnostic, we find that model 1 and model 2 can hardly be distinguished from each other, or from the $\Lambda$CDM model, at the 68\% confidence level. As a supplement, using the Statefinder hierarchy diagnostics and the growth rate of perturbations, we discover that not only can our two parametric models be well distinguished from the $\Lambda$CDM model, but also, compared with the $Om$ diagnostic, model 1 and model 2 can be better distinguished from each other. In addition, we explore the fate of the universe in our two models by means of a rip analysis.
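For reference, the $Om$ diagnostic is $Om(z) = (E^2(z)-1)/((1+z)^3-1)$ with $E = H/H_0$. A short sketch shows why it works as a discriminator: for flat $\Lambda$CDM it is constant and equal to $\Omega_m$ at every redshift, so any curvature of $Om(z)$ signals a departure from $\Lambda$CDM. The $\Omega_m = 0.3$ value below is just an example.

```python
# Om(z) diagnostic: constant (= Omega_m) for flat LambdaCDM.

def om_diagnostic(z, E2):
    """Om(z) = (E^2 - 1) / ((1+z)^3 - 1), with E = H/H0."""
    return (E2 - 1.0) / ((1.0 + z) ** 3 - 1.0)

def E2_lcdm(z, omega_m=0.3):
    """Dimensionless H^2/H0^2 for flat LambdaCDM."""
    return omega_m * (1.0 + z) ** 3 + (1.0 - omega_m)

# Evaluated at several redshifts, all values equal Omega_m = 0.3.
values = [om_diagnostic(z, E2_lcdm(z)) for z in (0.1, 0.5, 1.0, 2.0)]
```

For a parametric dark-energy model one would substitute that model's $E^2(z)$; a non-flat $Om(z)$ curve is then the signature the diagnostic is designed to expose.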
Model based process-product design and analysis
DEFF Research Database (Denmark)
Gani, Rafiqul
This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid...... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools, integrated with design work-flows and data-flows, for specific product-process design problems. In particular, the framework needs to manage models of different types...
Validity of covariance models for the analysis of geographical variation
DEFF Research Database (Denmark)
Guillot, Gilles; Schilling, Rene L.; Porcu, Emilio;
2014-01-01
1. Due to the availability of large molecular data-sets, covariance models are increasingly used to describe the structure of genetic variation as an alternative to more heavily parametrised biological models. 2. We focus here on a class of parametric covariance models that received sustained...... attention lately and show that the conditions under which they are valid mathematical models have been overlooked so far. 3. We provide rigorous results for the construction of valid covariance models in this family. 4. We also outline how to construct alternative covariance models for the analysis...
Quantitative Models and Analysis for Reactive Systems
DEFF Research Database (Denmark)
Thrane, Claus
The majority of modern software and hardware systems are reactive systems, where input provided by the user (possibly another system) and the output of the system is exchanged continuously throughout the (possibly) indefinite execution of the system. Natural examples include control systems, mobi......, energy consumption, latency, mean-time to failure, and cost. For systems integrated in mass-market products, the ability to quantify trade-offs between performance and robustness, under given technical and economic constraints, is of strategic importance....... by the environment in which they are embedded. This thesis studies the semantics and properties of a model-based framework for re- active systems, in which models and specifications are assumed to contain quantifiable information, such as references to time or energy. Our goal is to develop a theory of approximation...
Supplementing biomechanical modeling with EMG analysis
Lewandowski, Beth; Jagodnik, Kathleen; Crentsil, Lawton; Humphreys, Bradley; Funk, Justin; Gallo, Christopher; Thompson, William; DeWitt, John; Perusek, Gail
2016-01-01
It is well established that astronauts experience musculoskeletal deconditioning when exposed to microgravity environments for long periods of time. Spaceflight exercise is used to counteract these effects, and the Advanced Resistive Exercise Device (ARED) on the International Space Station (ISS) has been effective in minimizing musculoskeletal losses. However, the exercise devices of the new exploration vehicles will have requirements of limited mass, power and volume. Because of these limitations, there is a concern that the exercise devices will not be as effective as ARED in maintaining astronaut performance. Therefore, biomechanical modeling is being performed to provide insight on whether the small Multi-Purpose Crew Vehicle (MPCV) device, which utilizes a single-strap design, will provide sufficient physiological loading to maintain musculoskeletal performance. Electromyography (EMG) data are used to supplement the biomechanical model results and to explore differences in muscle activation patterns during exercises using different loading configurations.
Numerical model for atomtronic circuit analysis
Chow, Weng W; Anderson, Dana Z
2015-01-01
A model for studying atomtronic devices and circuits based on finite-temperature Bose-condensed gases is presented. The approach involves numerically solving equations of motion for atomic populations and coherences, derived using the Bose-Hubbard Hamiltonian and the Heisenberg picture. The resulting cluster expansion is truncated at a level that balances physical rigor against computational demand. This approach allows parametric studies involving time scales that cover both the rapid population dynamics relevant to non-equilibrium state evolution, as well as the much longer time durations typical for reaching steady-state device operation. The model is demonstrated by studying the evolution of a Bose-condensed gas in the presence of atom injection and extraction in a double-well potential. In this configuration phase-locking between condensates in each well of the potential is readily observed, and its influence on the evolution of the system is studied.
Automatic terrain modeling using transfinite element analysis
Collier, Nathaniel O.
2010-05-31
An automatic procedure for modeling terrain is developed, based on L2-projection interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by using image-processing techniques to detect regions of high error, exploiting the flexibility of transfinite interpolation to add degrees of freedom in these areas. Examples are shown for a section of the Palo Duro Canyon in northern Texas.
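Transfinite interpolation of the kind used here can be sketched with a bilinearly blended Coons patch, which reproduces four boundary curves exactly in the patch interior. The boundary elevation functions below are arbitrary examples, not terrain data or the paper's actual function space.

```python
# Bilinearly blended Coons patch: a classic transfinite interpolant that
# matches four boundary curves exactly.

def coons(u, v, bottom, top, left, right):
    """Blend four boundary curves; curves must agree at the corners."""
    lerp_v = (1 - v) * bottom(u) + v * top(u)        # blend bottom/top edges
    lerp_u = (1 - u) * left(v) + u * right(v)        # blend left/right edges
    corners = ((1 - u) * (1 - v) * left(0) + u * (1 - v) * right(0)
               + (1 - u) * v * left(1) + u * v * right(1))
    return lerp_v + lerp_u - corners                 # subtract double-counted corners

# Example boundary elevations z along the four patch edges:
bottom = lambda u: u * u       # z along v=0
top    = lambda u: 1 + u       # z along v=1
left   = lambda v: v           # z along u=0  (left(0)=bottom(0), left(1)=top(0))
right  = lambda v: 1 + v       # z along u=1  (right(0)=bottom(1), right(1)=top(1))

z_corner = coons(1.0, 1.0, bottom, top, left, right)   # equals top(1) = 2
```

Refinement in the paper's sense would subdivide the parameter domain and add degrees of freedom where the interpolant's error against the discrete terrain data is large.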
Stochastic modeling and analysis of telecoms networks
Decreusefond, Laurent
2012-01-01
This book addresses the stochastic modeling of telecommunication networks, introducing the main mathematical tools for that purpose, such as Markov processes, real and spatial point processes and stochastic recursions, and presenting a wide range of results on the stability, performance and comparison of systems. The authors propose a comprehensive mathematical construction of the foundations of stochastic network theory: Markov chains and continuous-time Markov chains are extensively studied using an original martingale-based approach. A complete presentation of stochastic recursions from an
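A minimal instance of the Markov-chain machinery such a treatment builds on is the stationary distribution of a finite chain, sketched here by power iteration; the two-state transition matrix is an arbitrary example.

```python
# Stationary distribution of a finite discrete-time Markov chain,
# computed by repeatedly applying the transition matrix (power iteration).

def stationary(P, iters=200):
    """Iterate pi <- pi P from the uniform distribution until (near) convergence."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Two-state chain: from state 0 move to 1 w.p. 0.5; from 1 return w.p. 0.25.
P = [[0.5, 0.5],
     [0.25, 0.75]]
pi = stationary(P)   # balance equations give exactly (1/3, 2/3)
```

The same fixed point can be obtained by solving pi P = pi directly; power iteration is shown because it mirrors how such chains are simulated and converges geometrically for an ergodic chain.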
Modeling and analysis of caves using voxelization
Szeifert, Gábor; Szabó, Tivadar; Székely, Balázs
2014-05-01
Although there are many ways to create three-dimensional representations of caves using modern information-technology methods, modeling caves has long been challenging for researchers. One promising new alternative modeling method is the use of voxels. We use geodetic measurements as the input for our voxelization project. These geodetic underground surveys recorded the azimuth, altitude and distance of corner points of cave systems relative to each other. The diameter of each cave section is estimated from separate databases originating from different surveys. We have developed a simple but efficient method (it covers more than 99.9% of the volume of the input model on average) to convert these vector-type datasets to voxels. We have also developed software components to make visualization of the voxel and vector models easier. Since each corner point's position is measured relative to other corner points' positions, propagation of uncertainties is an important issue in the case of long caves with many separate sections. We use Monte Carlo simulations to analyze the effect of the error of each geodetic instrument possibly involved in a survey. Cross-sections of the simulated three-dimensional distributions show that even tiny uncertainties in individual measurements can result in high variation of positions, which could be reduced by distributing the closing errors if such data are available. Using the results of our simulations, we can estimate the cave volume and the error of the calculated cave volume depending on the complexity of the cave. Acknowledgements: the authors are grateful to Ariadne Karst and Cave Exploring Association and State Department of Environmental and Nature Protection of the Hungarian Ministry of Rural Development, Department of National Parks and Landscape Protection, Section Landscape and Cave Protection and Ecotourism for providing the cave measurement data. BS contributed as an Alexander von Humboldt Research
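The survey-leg conversion and error propagation described above can be sketched generically: each leg (azimuth, inclination, distance) becomes a 3-D offset, and instrument noise is propagated by Monte Carlo resampling of every reading. The noise magnitudes below are illustrative assumptions, not the instruments' actual specifications.

```python
# Convert cave-survey legs to coordinates and propagate instrument error
# with Monte Carlo resampling.
import math
import random

def leg_to_offset(azimuth_deg, altitude_deg, distance):
    """Convert one survey leg (compass, clino, tape) to an (east, north, up) offset."""
    az, alt = math.radians(azimuth_deg), math.radians(altitude_deg)
    return (distance * math.cos(alt) * math.sin(az),   # east
            distance * math.cos(alt) * math.cos(az),   # north
            distance * math.sin(alt))                  # up

def traverse(legs):
    """Accumulate leg offsets from the entrance at the origin."""
    x = y = z = 0.0
    for az, alt, d in legs:
        dx, dy, dz = leg_to_offset(az, alt, d)
        x, y, z = x + dx, y + dy, z + dz
    return x, y, z

random.seed(1)
legs = [(90.0, 0.0, 10.0), (0.0, 30.0, 20.0)]   # two example survey legs
endpoints = []
for _ in range(2000):
    # Perturb every reading with assumed instrument noise levels.
    noisy = [(az + random.gauss(0, 0.5),          # compass: 0.5 deg sigma
              alt + random.gauss(0, 0.5),         # clino:   0.5 deg sigma
              d * (1 + random.gauss(0, 0.002)))   # tape:    0.2 % sigma
             for az, alt, d in legs]
    endpoints.append(traverse(noisy))
```

The scatter of `endpoints` is the simulated position distribution of the final corner point; for a long traverse the spread grows with the number of legs, which is the error-accumulation effect the study analyzes.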
Data perturbation analysis of a linear model
Institute of Scientific and Technical Information of China (English)
无
2000-01-01
The features of the linear model were carefully studied in the cases of data perturbation and mean-shift perturbation, and some important features were proved mathematically. The results show that the mean-shift perturbation is equivalent to the data perturbation; that is, adding a parameter to an observation equation means that this set of data is deleted from the data set. The estimate of this parameter is in fact its predicted residual.
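The stated equivalence is easy to check numerically: augmenting the design matrix with an indicator column for one observation yields the same coefficient estimates as deleting that observation outright, and the added parameter's estimate equals that observation's predicted residual. A sketch with simulated data:

```python
# Numerical check: mean-shift parameter for one observation == deleting it.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 2))
y = X @ np.array([1.5, -2.0]) + rng.normal(scale=0.1, size=10)

# Model 1: delete observation 0 and fit by least squares.
beta_del, *_ = np.linalg.lstsq(X[1:], y[1:], rcond=None)

# Model 2: keep all rows, add a mean-shift (indicator) column for observation 0.
e0 = np.zeros((10, 1))
e0[0] = 1.0
beta_aug, *_ = np.linalg.lstsq(np.hstack([X, e0]), y, rcond=None)

# The shared coefficients agree, and the mean-shift estimate equals the
# predicted residual of observation 0 under the deleted-point fit.
pred_resid = y[0] - X[0] @ beta_del
```

Intuitively, the indicator column absorbs observation 0 exactly, so that observation carries no information about the remaining coefficients.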
Spectral Analysis and Atmospheric Models of Microflares
Institute of Scientific and Technical Information of China (English)
Cheng Fang; Yu-Hua Tang; Zhi Xu
2006-01-01
By use of the high-resolution spectral data obtained with THEMIS on 2002 September 5, the spectra and characteristics of five well-observed microflares have been analyzed. Our results indicate that some of them are located near the longitudinal magnetic polarity inversion lines. All the microflares are accompanied by mass motions. The most obvious characteristic of the Hα microflare spectra is the emission at the centers of both the Hα and Ca II 8542 Å lines. For the first time, both thermal and non-thermal semi-empirical atmospheric models for the conspicuous and faint microflares are computed. In computing the non-thermal models, we assume that the electron beam resulting from magnetic reconnection is produced in the chromosphere, because this requires lower energies for the injected particles. It is found that there is obvious heating in the low chromosphere. The temperature enhancement is about 1000-2200 K in the thermal models. If the non-thermal effects are included, the required temperature increase can be reduced by 100-150 K. These results imply that the Hα microflares can probably be produced by magnetic reconnection in the solar lower atmosphere. The radiative and kinetic energies of the Hα microflares are estimated, and the total energy is found to be 10^27 to 4×10^28 erg.
From phenology models to risk indicator analysis
Directory of Open Access Journals (Sweden)
Márta Ladányi
2010-11-01
Full Text Available In this paper we outline a phenology model that estimates the budbreak and full-bloom starting dates of sour cherry from effective heat sums with reasonable accuracy. With the help of the RegCM3.1 model, the possible trends in phenological timing in the middle of the 21st century can be clearly indicated: a shift to 12-13 days earlier budbreak and 6-7 days earlier full bloom due to warmer weather conditions. For the climatic characterization of the sour cherry bloom period between 1984 and 2010, and to describe the expected changes in this very sensitive period of sour cherry with respect to the time slice 2021-2050, we introduce seven climatic indicators as artificial weather parameters: the numbers of days when the temperature was below 0°C and above 10°C, the numbers of days with no precipitation and with more than 5 mm of precipitation, as well as the absolute minimum, the mean of the minimum, and the mean of the maximum daily temperatures. We survey the changes of these indicators over the examined period (1984-2010) and, based on the full-bloom start model results, formulate expectations for the future and make comparisons.
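The effective-heat-sum idea behind such phenology models can be sketched simply: accumulate daily degrees above a base temperature and flag budbreak when the sum crosses a threshold. The base temperature, threshold, and temperature series below are illustrative assumptions, not the paper's calibrated values.

```python
# Effective heat sum (growing-degree-day) phenology sketch.

def budbreak_day(daily_mean_temps, t_base=5.0, threshold=60.0):
    """Return the 1-based day index on which the heat sum reaches the threshold."""
    heat_sum = 0.0
    for day, t in enumerate(daily_mean_temps, start=1):
        heat_sum += max(0.0, t - t_base)   # only degrees above the base count
        if heat_sum >= threshold:
            return day
    return None   # threshold never reached in this series

temps = [3, 6, 8, 10, 12, 14, 15, 16, 18, 20]   # example daily means (deg C)
day = budbreak_day(temps)                        # heat sum first reaches 60 on day 10
```

A warming scenario shifts the daily means upward, so the threshold is reached earlier; that is the mechanism behind the projected 12-13 day advance in budbreak.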
A global sensitivity analysis approach for morphogenesis models
Boas, Sonja E. M.
2015-11-21
Background Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such ‘black-box’ models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. Results To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, also provided new insights into the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. Conclusions We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operative mechanisms in morphogenesis. The workflow is applicable to all ‘black-box’ models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
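Variance-based global sensitivity analysis of this kind can be sketched on a toy model with known indices. The example below uses Jansen's pick-freeze estimator on the additive model y = 2*x1 + x2 (exact first-order indices 0.8 and 0.2); it illustrates the estimator, not the cellular Potts workflow itself.

```python
# First-order Sobol indices via the Jansen pick-freeze estimator,
# on a toy model whose exact indices are S1 = 0.8, S2 = 0.2.
import numpy as np

def model(x):
    return 2.0 * x[:, 0] + x[:, 1]   # stand-in for a 'black-box' model

rng = np.random.default_rng(42)
n = 200_000
A = rng.normal(size=(n, 2))          # two independent input sample matrices
B = rng.normal(size=(n, 2))
yA, yB = model(A), model(B)
var_y = yA.var()

S = []
for i in range(2):
    ABi = A.copy()
    ABi[:, i] = B[:, i]              # 'freeze' all inputs except input i
    yABi = model(ABi)
    # Jansen's estimator: V_i = V - 0.5 * E[(y_B - y_AB_i)^2]
    S.append((var_y - 0.5 * np.mean((yB - yABi) ** 2)) / var_y)
```

For the real workflow the `model` call would be a (surrogate of a) simulation run, and second-order terms would quantify the parameter interactions the abstract highlights.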
Model Checking Is Static Analysis of Modal Logic
DEFF Research Database (Denmark)
Nielson, Flemming; Nielson, Hanne Riis
2010-01-01
Flow Logic is an approach to the static analysis of programs that has been developed for functional, imperative and object-oriented programming languages and for concurrent, distributed, mobile and cryptographic process calculi. In this paper we extend it to deal with modal logics and prove...... that it can give an exact characterisation of the semantics of formulae in a modal logic. This shows that model checking can be performed by means of state-of-the-art approaches to static analysis and allows us to conclude that the problems of model checking and static analysis are reducible to each other....... In terms of computational complexity we show that model checking by means of static analysis gives the same complexity bounds as are known for traditional approaches to model checking....
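The connection can be made concrete with the classic fixed-point view of model checking: the set of states satisfying EF(goal) ("a goal state is reachable") is the least fixed point of backward reachability, the same kind of iteration-to-saturation a static analyser performs. A minimal sketch over an explicit transition system:

```python
# EF(goal) as a least fixed point: start from the goal states and add
# predecessors until saturation.

def ef(transitions, goal):
    """Return the set of states satisfying EF(goal)."""
    sat = set(goal)
    changed = True
    while changed:                     # iterate to the least fixed point
        changed = False
        for (s, t) in transitions:
            if t in sat and s not in sat:
                sat.add(s)
                changed = True
    return sat

# Example system: 0 -> 1 -> 2, and 3 only loops on itself.
transitions = {(0, 1), (1, 2), (3, 3)}
result = ef(transitions, goal={2})     # {0, 1, 2}: state 3 cannot reach 2
```

Other CTL operators follow the same pattern with different fixed-point equations, which is exactly the shape of computation that flow-logic-style static analysis solves.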
Computational Modeling, Formal Analysis, and Tools for Systems Biology.
Directory of Open Access Journals (Sweden)
Ezio Bartocci
2016-01-01
Full Text Available As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.
Automation of Safety Analysis with SysML Models Project
National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...
Reachable set modeling and engagement analysis of exoatmospheric interceptor
Institute of Scientific and Technical Information of China (English)
Chai Hua; Liang Yangang; Chen Lei; Tang Guojin
2014-01-01
A novel reachable set (RS) model is developed within a framework of exoatmospheric interceptor engagement analysis. The boost phase steering scheme and trajectory distortion mechanism of the interceptor are firstly explored. A mathematical model of the distorted RS is then formulated through a dimension-reduction analysis. By treating the outer boundary of the RS on the sphere surface as a spherical convex hull, two relevant theorems are proposed and the RS envelope is depicted by computational geometry theory. Based on the RS model, algorithms for intercept window analysis and launch parameter determination are proposed, and numerical simulations are carried out for interceptors with different energy or launch points. Results show that the proposed method can avoid intensive on-line computation and provides an accurate and effective approach for interceptor engagement analysis. The suggested RS model also serves as a ready reference for other related problems such as interceptor effectiveness evaluation and platform disposition.
Microscopic Analysis and Modeling of Airport Surface Sequencing Project
National Aeronautics and Space Administration — Although a number of airportal surface models exist and have been successfully used for analysis of airportal operations, only recently has it become possible to...
Integration of Design and Control through Model Analysis
DEFF Research Database (Denmark)
Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay;
2002-01-01
A systematic computer aided analysis of the process model is proposed as a pre-solution step for integration of design and control problems. The process model equations are classified in terms of balance equations, constitutive equations and conditional equations. Analysis of the phenomena models...... representing the constitutive equations identify the relationships between the important process and design variables, which help to understand, define and address some of the issues related to integration of design and control. Furthermore, the analysis is able to identify a set of process (control) variables...... and design (manipulative) variables that may be employed with different objectives in design and control for the integrated problem. The computer aided model analysis is highlighted through illustrative examples, involving processes with mass and/or energy recycle, where the important design and control...
Analysis of Jingdong Mall Logistics Distribution Model
Shao, Kang; Cheng, Feng
In recent years, the development of electronic commerce in China has accelerated. The role of logistics has been highlighted, and more and more e-commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performs a SWOT analysis of the current situation of its self-built logistics system, identifies the problems existing in Jingdong Mall's current logistics distribution, and gives appropriate recommendations.
Dispersion analysis with inverse dielectric function modelling.
Mayerhöfer, Thomas G; Ivanovski, Vladimir; Popp, Jürgen
2016-11-01
We investigate how dispersion analysis can profit from the use of a Lorentz-type description of the inverse dielectric function. In particular at higher angles of incidence, reflectance spectra using p-polarized light are dominated by bands from modes that have their transition moments perpendicular to the surface. Accordingly, the spectra increasingly resemble inverse dielectric functions. A corresponding description can therefore eliminate the complex dependencies of the dispersion parameters, allow their determination and facilitate a more accurate description of the optical properties of single crystals. PMID:27294550
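The point about the loss function can be sketched with a single Lorentz oscillator: Im(-1/ε), which dominates p-polarized reflectance at high angles of incidence, peaks near the LO frequency, above the TO peak of Im(ε). The oscillator parameters below are arbitrary examples, not fitted values from the paper.

```python
# Lorentz oscillator: the dielectric function peaks (in Im) near the TO
# frequency, while the loss function Im(-1/eps) peaks near the LO frequency.

def epsilon(w, w_to=1000.0, wp=600.0, gamma=5.0, eps_inf=2.5):
    """Single Lorentz oscillator; frequencies in cm^-1 (illustrative values)."""
    return eps_inf + wp**2 / (w_to**2 - w**2 - 1j * gamma * w)

ws = [800 + 0.5 * k for k in range(1201)]          # 800..1400 cm^-1 grid
im_eps = [epsilon(w).imag for w in ws]             # absorption-like band
loss = [(-1 / epsilon(w)).imag for w in ws]        # loss function

w_to_peak = ws[im_eps.index(max(im_eps))]          # near 1000 (TO)
w_lo_peak = ws[loss.index(max(loss))]              # near sqrt(w_to^2 + wp^2/eps_inf)
```

This TO-LO splitting is why spectra at higher angles of incidence, which increasingly resemble the inverse dielectric function, shift toward the LO positions; parameterizing 1/ε directly as a Lorentz-type function then removes that angle-dependent complication.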
Models and analysis for distributed systems
Haddad, Serge; Pautet, Laurent; Petrucci, Laure
2013-01-01
Nowadays, distributed systems are increasingly present, in public software applications as well as critical systems. This title and Distributed Systems: Design and Algorithms - from the same editors - introduce the underlying concepts, the associated design techniques and the related security issues. The objective of this book is to describe the state of the art of the formal methods for the analysis of distributed systems. Numerous issues remain open and are the topics of major research projects. One current research trend consists of pro
Hill, M. C.; Jakeman, J.; Razavi, S.; Tolson, B.
2015-12-01
For many environmental systems, model runtimes have remained very long as more capable computers have been used to add more processes and finer time and space discretization. Scientists have also added more parameters and kinds of observations, and many model runs are needed to explore the models. Computational demand equals run time multiplied by the number of model runs, divided by parallelization opportunities. Model exploration is conducted using sensitivity analysis, optimization, and uncertainty quantification. Sensitivity analysis is used to reveal the consequences of what may be very complex simulated relations, optimization is used to identify parameter values that fit the data best, or at least better, and uncertainty quantification is used to evaluate the precision of simulated results. The long execution times make such analyses a challenge. Methods for addressing these challenges include computationally frugal analysis of the demanding original model and a number of ingenious surrogate modeling methods. Both commonly use about 50-100 runs of the demanding original model. In this talk we consider the tradeoffs between (1) original model development decisions, (2) computationally frugal analysis of the original model, and (3) using many model runs of the fast surrogate model. Some questions of interest are as follows. If the added processes and discretization invested in (1) are compared with the restrictions and approximations in model analysis produced by long model execution times, is there a net benefit related to the goals of the model? Are there changes to the numerical methods that could reduce the computational demands while giving up less fidelity than is compromised by using computationally frugal methods or surrogate models for model analysis? Both the computationally frugal methods and surrogate models require that the solution of interest be a smooth function of the parameters of interest. How does the information obtained from the local methods typical
Empirical Bayes Model Comparisons for Differential Methylation Analysis
Directory of Open Access Journals (Sweden)
Mingxiang Teng
2012-01-01
Full Text Available A number of empirical Bayes models (each with different statistical distribution assumptions) have now been developed to analyze differential DNA methylation using high-density oligonucleotide tiling arrays. However, it remains unclear which model performs best. For example, for analysis of differentially methylated regions for conservative and functional sequence characteristics (e.g., enrichment of transcription factor-binding sites (TFBSs)), the sensitivity of such analyses, using various empirical Bayes models, remains unclear. In this paper, five empirical Bayes models were constructed, based on either a gamma distribution or a log-normal distribution, for the identification of differentially methylated loci and their cell-division- (1, 3, and 5) and drug-treatment- (cisplatin) dependent methylation patterns. While differential methylation patterns generated by log-normal models were enriched with numerous TFBSs, we observed almost no TFBS-enriched sequences using gamma assumption models. Statistical and biological results suggest the log-normal, rather than the gamma, empirical Bayes model distribution to be a highly accurate and precise method for differential methylation microarray analysis. In addition, we presented one of the log-normal models for differential methylation analysis and tested its reproducibility by simulation study. We believe this research to be the first extensive comparison of statistical modeling for the analysis of differential DNA methylation, an important biological phenomenon that precisely regulates gene transcription.
PSAMM: A Portable System for the Analysis of Metabolic Models.
Directory of Open Access Journals (Sweden)
Jon Lund Steffensen
2016-02-01
Full Text Available The genome-scale models of metabolic networks have been broadly applied in phenotype prediction, evolutionary reconstruction, community functional analysis, and metabolic engineering. Despite the development of tools that support individual steps along the modeling procedure, it is still difficult to associate mathematical simulation results with the annotation and biological interpretation of metabolic models. In order to solve this problem, here we developed a Portable System for the Analysis of Metabolic Models (PSAMM, a new open-source software package that supports the integration of heterogeneous metadata in model annotations and provides a user-friendly interface for the analysis of metabolic models. PSAMM is independent of paid software environments like MATLAB, and all its dependencies are freely available for academic users. Compared to existing tools, PSAMM significantly reduced the running time of constraint-based analysis and enabled flexible settings of simulation parameters using simple one-line commands. The integration of heterogeneous, model-specific annotation information in PSAMM is achieved with a novel format of YAML-based model representation, which has several advantages, such as providing a modular organization of model components and simulation settings, enabling model version tracking, and permitting the integration of multiple simulation problems. PSAMM also includes a number of quality checking procedures to examine stoichiometric balance and to identify blocked reactions. Applying PSAMM to 57 models collected from current literature, we demonstrated how the software can be used for managing and simulating metabolic models. We identified a number of common inconsistencies in existing models and constructed an updated model repository to document the resolution of these inconsistencies.
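One of the quality checks the abstract mentions, verifying stoichiometric balance, can be sketched in a few lines. This is an illustrative stand-in, not PSAMM's actual implementation; the metabolite names and the reaction are made up for the example.

```python
# Check elemental balance of a metabolic reaction from chemical formulas.

import re
from collections import Counter

def parse_formula(formula):
    """Parse a simple formula like 'C6H12O6' into element counts."""
    counts = Counter()
    for element, num in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        if element:
            counts[element] += int(num) if num else 1
    return counts

def is_balanced(reaction, formulas):
    """reaction: {metabolite: stoichiometry}, negative = substrate."""
    totals = Counter()
    for met, coeff in reaction.items():
        for element, n in formulas[met].items():
            totals[element] += coeff * n
    return all(v == 0 for v in totals.values())

formulas = {m: parse_formula(f) for m, f in {
    "glucose": "C6H12O6", "o2": "O2", "co2": "CO2", "h2o": "H2O"}.items()}
# Glucose oxidation: C6H12O6 + 6 O2 -> 6 CO2 + 6 H2O
rxn = {"glucose": -1, "o2": -6, "co2": 6, "h2o": 6}
```

Running such a check over every reaction in a model is how tools of this kind flag the "common inconsistencies" the authors report finding in published models.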
Neutrosophic Logic for Mental Model Elicitation and Analysis
Directory of Open Access Journals (Sweden)
Karina Pérez-Teruel
2014-03-01
Full Text Available Mental models are personal, internal representations of external reality that people use to interact with the world around them. They are useful in multiple situations such as multicriteria decision making, knowledge management, complex system learning and analysis. In this paper a framework for mental model elicitation and analysis based on neutrosophic logic is presented. An illustrative example is provided to show the applicability of the proposal. The paper ends with conclusions and future research directions.
Discrete Discriminant analysis based on tree-structured graphical models
DEFF Research Database (Denmark)
Perez de la Cruz, Gonzalo; Eslava, Guillermina
The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression.
Coping with Complexity Model Reduction and Data Analysis
Gorban, Alexander N
2011-01-01
This volume contains the extended versions of selected talks given at the international research workshop 'Coping with Complexity: Model Reduction and Data Analysis', Ambleside, UK, August 31 - September 4, 2009. This book is deliberately broad in scope and aims at promoting new ideas and methodological perspectives. The topics of the chapters range from theoretical analysis of complex and multiscale mathematical models to applications in, e.g., fluid dynamics and chemical kinetics.
Model-based methods for linkage analysis.
Rice, John P; Saccone, Nancy L; Corbett, Jonathan
2008-01-01
The logarithm of the odds (LOD) score method originated in a seminal article by Newton Morton in 1955. The method is broadly concerned with issues of power and the posterior probability of linkage, ensuring that a reported linkage has a high probability of being a true linkage. In addition, the method is sequential, so that pedigrees or LOD curves may be combined from published reports to pool data for analysis. This approach has been remarkably successful for 50 years in identifying disease genes for Mendelian disorders. After discussing these issues, we consider the situation for complex disorders, where the maximum LOD score statistic shares some of the advantages of the traditional LOD score approach but is limited by unknown power and the lack of sharing of the primary data needed to optimally combine analytic results. We may still learn from the LOD score method as we explore new methods in molecular biology and genetic analysis to utilize the complete human DNA sequence and the cataloging of all human genes.
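The LOD score the abstract describes is a base-10 log likelihood ratio of a recombination fraction theta against the null of free recombination (theta = 0.5). A minimal two-point sketch for a phase-known pedigree, with made-up counts of recombinants among informative meioses:

```python
# Two-point LOD score maximized over a grid of recombination fractions.

import math

def lod(theta, recombinants, meioses):
    n, r = meioses, recombinants
    ll_theta = r * math.log10(theta) + (n - r) * math.log10(1 - theta)
    ll_null = n * math.log10(0.5)
    return ll_theta - ll_null

# Maximize over a grid of theta values, as in a classical linkage scan.
thetas = [i / 100 for i in range(1, 50)]
best_theta = max(thetas, key=lambda t: lod(t, 2, 20))
max_lod = lod(best_theta, 2, 20)
```

Here 2 recombinants in 20 meioses give a maximum near theta = 0.1 with a LOD just above 3, the classical threshold for declaring linkage.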
Sensitivity and uncertainty analysis of the PATHWAY radionuclide transport model
International Nuclear Information System (INIS)
Procedures were developed for the uncertainty and sensitivity analysis of a dynamic model of radionuclide transport through human food chains. Uncertainty in model predictions was estimated by propagation of parameter uncertainties using a Monte Carlo simulation technique. Sensitivity of model predictions to individual parameters was investigated using the partial correlation coefficient of each parameter with model output. Random values produced for the uncertainty analysis were used in the correlation analysis for sensitivity. These procedures were applied to the PATHWAY model which predicts concentrations of radionuclides in foods grown in Nevada and Utah and exposed to fallout during the period of atmospheric nuclear weapons testing in Nevada. Concentrations and time-integrated concentrations of iodine-131, cesium-136, and cesium-137 in milk and other foods were investigated. 9 figs., 13 tabs
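The two procedures the abstract combines, Monte Carlo propagation of parameter uncertainty and parameter ranking by correlation with model output, can be sketched together. The "model" and parameter names below are illustrative stand-ins, not the PATHWAY equations.

```python
# Monte Carlo uncertainty propagation plus partial-correlation sensitivity.

import numpy as np

rng = np.random.default_rng(1)
n = 2000
# Sampled uncertain parameters (lognormal, as is common for transfer factors).
X = np.column_stack([
    rng.lognormal(0.0, 0.3, n),   # deposition-to-forage transfer (assumed)
    rng.lognormal(0.0, 0.3, n),   # feed-to-milk transfer (assumed)
    rng.lognormal(0.0, 0.3, n),   # weathering loss factor (assumed)
])
y = X[:, 0] * X[:, 1] / X[:, 2]   # stand-in for predicted milk concentration

def partial_corr(X, y, i):
    """Correlation of X[:, i] with y after regressing both on the others."""
    others = np.delete(X, i, axis=1)
    A = np.column_stack([others, np.ones(len(y))])
    rx = X[:, i] - A @ np.linalg.lstsq(A, X[:, i], rcond=None)[0]
    ry = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

pcc = [partial_corr(X, y, i) for i in range(3)]
```

Reusing the same random sample for both the uncertainty and the correlation analysis, as the abstract notes, costs no extra model runs.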
Kleijnen, J.P.C.
1995-01-01
This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved, using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for
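The DOE-plus-regression approach of the tutorial can be sketched with a full 2^3 factorial design and a first-order metamodel. The `simulate` function is a hypothetical stand-in for a System Dynamics model.

```python
# 2^3 factorial design with main effects estimated from an orthogonal design.

from itertools import product

def simulate(x1, x2, x3):
    # Hypothetical response: two real effects and one inert factor.
    return 10.0 + 3.0 * x1 - 2.0 * x2 + 0.0 * x3

# Full 2^3 factorial design in coded units (-1, +1).
design = list(product([-1, 1], repeat=3))
responses = [simulate(*point) for point in design]

# For an orthogonal two-level design, each main-effect regression
# coefficient is the average of response * level over the design points.
def main_effect(j):
    return sum(x[j] * y for x, y in zip(design, responses)) / len(design)

effects = [main_effect(j) for j in range(3)]
```

The fitted coefficients recover the true effects (3, -2, 0), which is the what-if information a regression metamodel provides at a fraction of the simulation cost.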
AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS
International Nuclear Information System (INIS)
The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data is a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamic models developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe will be compared with experimental results.
Modelling Analysis of Sewage Sludge Amended Soil
DEFF Research Database (Denmark)
Sørensen, P. B.; Carlsen, L.; Vikelsøe, J.;
The topic is risk assessment of sludge supply to agricultural soil in relation to xenobiotics. A large variety of xenobiotics arrive at the wastewater treatment plant in the wastewater. Many of these components are hydrophobic and thus will accumulate in the sludge solids and are removed from...... for the Evaluation of Substances (EUSES). It is shown how the fraction of substance mass which is leached from the top soil is a simple function of the ratio between the degradation half-life and the adsorption coefficient. This model can be used in probabilistic risk assessment of agricultural soils...
Explicit model predictive control accuracy analysis
Knyazev, Andrew; Zhu, Peizhen; Di Cairano, Stefano
2015-01-01
Model Predictive Control (MPC) can efficiently control constrained systems in real-time applications. The MPC feedback law for a linear system with linear inequality constraints can be explicitly computed off-line, which results in an off-line partition of the state space into non-overlapping convex regions, with affine control laws associated with each region of the partition. An actual implementation of this explicit MPC in low cost micro-controllers requires the data to be "quantized", i.e. repre...
Compartmentalization analysis using discrete fracture network models
Energy Technology Data Exchange (ETDEWEB)
La Pointe, P.R.; Eiben, T.; Dershowitz, W. [Golder Associates, Redmond, WA (United States); Wadleigh, E. [Marathon Oil Co., Midland, TX (United States)
1997-08-01
This paper illustrates how Discrete Fracture Network (DFN) technology can serve as a basis for the calculation of reservoir engineering parameters for the development of fractured reservoirs. It describes the development of quantitative techniques for defining the geometry and volume of structurally controlled compartments. These techniques are based on a combination of stochastic geometry, computational geometry, and graph theory. The parameters addressed are compartment size, matrix block size and tributary drainage volume. The concept of DFN models is explained and methodologies to compute these parameters are demonstrated.
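The graph-theory step in such compartmentalization analyses can be sketched by treating each fracture as a node, connecting intersecting fractures, and taking connected components as candidate compartments. The intersections below are illustrative input, not a stochastic DFN realization.

```python
# Connected components over a fracture-intersection graph via union-find.

def compartments(n_fractures, intersections):
    """Union-find over fractures; returns compartments, largest first."""
    parent = list(range(n_fractures))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    for a, b in intersections:
        parent[find(a)] = find(b)

    groups = {}
    for i in range(n_fractures):
        groups.setdefault(find(i), set()).add(i)
    return sorted(groups.values(), key=len, reverse=True)

# Six fractures; 0-1-2 intersect each other, 3-4 intersect, 5 is isolated.
comps = compartments(6, [(0, 1), (1, 2), (3, 4)])
```

The size distribution of the resulting components is one route to the compartment-size and tributary-drainage-volume parameters the paper addresses.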
Influence of pipeline modeling in stability analysis for severe slugging
Azevedo, G. R.; Baliño, J. L.; Burr, K. P.
2016-06-01
In this paper a numerical linear stability analysis is performed on a mathematical model for the two-phase flow in a pipeline-riser system. Most stability criteria and most models are based on a simplified pipeline, in which it is assumed that void fraction fluctuations can be neglected. It is evident that a pipeline with a constant void fraction would not be able to capture the flow pattern transition or void fraction propagation waves. Three different models for the pipeline are considered: a lumped parameter model with constant void fraction; a lumped parameter model with time-dependent void fraction; and a distributed parameter model, with void fraction dependent on time and position. The results showed that the simplified model loses some of the stable region of operational conditions, while the complete models do not. As a result, more general modeling is necessary to capture the full influence of the stratified flow on stability and on the pipeline dynamics.
Model Construction and Analysis of Respiration in Halobacterium salinarum.
Directory of Open Access Journals (Sweden)
Cherryl O Talaue
Full Text Available The archaeon Halobacterium salinarum can produce energy using three different processes, namely photosynthesis, oxidative phosphorylation and fermentation of arginine, and is thus a model organism in bioenergetics. Compared to its bacteriorhodopsin-driven photosynthesis, less attention has been devoted to modeling its respiratory pathway. We created a system of ordinary differential equations that models its oxidative phosphorylation. The model consists of the electron transport chain, the ATP synthase, the potassium uniport and the sodium-proton antiport. By fitting the model parameters to experimental data, we show that the model can explain data on proton motive force generation, ATP production, and the charge balancing of ions between the sodium-proton antiporter and the potassium uniport. We performed sensitivity analysis of the model parameters to determine how the model will respond to perturbations in parameter values. The model and the parameters we derived provide a resource that can be used for analytical studies of the bioenergetics of H. salinarum.
Rovibrational CO analysis in PDR models
Stancil, Phillip C.; Cumbee, Renata; Zhang, Ziwei; Walker, Kyle M.; Yang, Benhui; Ferland, Gary J.
2016-01-01
CO is one of the most important molecules in the interstellar medium and in photodissociation regions (PDRs). Most of the extragalactic non-stellar IR to submm CO emission originates in PDRs (Hollenbach & Tielens 1999). Pure rotational CO lines have previously been used in PDR models to provide density, temperature, and other diagnostics. However, for environments exposed to intense UV radiation, CO vibrational levels become significantly populated. Given new calculations of rovibrational collisional rate coefficients for CO-H (Walker et al. 2015, Song et al. 2015) and CO-H2 (Yang et al. 2015), we explore their effects in standard Cloudy (Ferland et al. 2013) and Radex (van der Tak et al. 2007) PDR models. In particular, CO vibrational transitions due to H2 collisions are studied for the first time using reliable full-dimensional CO-H2 collisional data. References: Ferland, G. J., et al. 2013, Rev. Mex. Astron. y Astrof., 49, 137; Hollenbach, D. J. & Tielens, A. G. G. M. 1999, RMP, 71, 173; Song, L., et al. 2015, ApJ, in press; van der Tak, F. F. S., et al. 2007, A&A, 468, 627; Walker, K. M., et al. 2015, ApJ, 811, 27; Yang, B., et al. 2015, Nature Comm., 6, 6629. This work was supported in part by NASA grants NNX12AF42G and NNX15AI61G.
Sparse Principal Component Analysis in Medical Shape Modeling
DEFF Research Database (Denmark)
Sjöstrand, Karl; Stegmann, Mikkel Bille; Larsen, Rasmus
2006-01-01
Principal component analysis (PCA) is a widely used tool in medical image analysis for data reduction, model building, and data understanding and exploration. While PCA is a holistic approach where each new variable is a linear combination of all original variables, sparse PCA (SPCA) aims...
Equivalent circuit models for ac impedance data analysis
Danford, M. D.
1990-01-01
A least-squares fitting routine has been developed for the analysis of ac impedance data. It has been determined that the checking of the derived equations for a particular circuit with a commercially available electronics circuit program is essential. As a result of the investigation described, three equivalent circuit models were selected for use in the analysis of ac impedance data.
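The fitting task the abstract describes can be sketched for a simple equivalent circuit: a solution resistance Rs in series with a parallel Rct-C pair, Z = Rs + Rct / (1 + j*w*Rct*C). The data are synthetic and a coarse grid search stands in for the actual least-squares routine; circuit values are illustrative.

```python
# Fit a simple Rs + (Rct || C) equivalent circuit to ac impedance data.

import math

def impedance(w, Rs, Rct, C):
    return Rs + Rct / (1 + 1j * w * Rct * C)

# Synthetic "measured" spectrum from known circuit values.
true = (10.0, 100.0, 1e-5)
freqs = [10 ** (k / 4) for k in range(-4, 21)]        # 0.1 Hz .. 100 kHz
data = [impedance(2 * math.pi * f, *true) for f in freqs]

def sse(params):
    # Sum of squared complex-magnitude residuals over the spectrum.
    return sum(abs(impedance(2 * math.pi * f, *params) - z) ** 2
               for f, z in zip(freqs, data))

# Coarse grid search over plausible ranges (a real routine would use
# damped least squares on the derived circuit equations).
grid = [(Rs, Rct, C)
        for Rs in (5.0, 10.0, 20.0)
        for Rct in (50.0, 100.0, 200.0)
        for C in (1e-6, 1e-5, 1e-4)]
best = min(grid, key=sse)
```

Checking the derived impedance expression against a circuit-simulation program, as the abstract recommends, guards against algebra errors in exactly this kind of hand-derived `impedance` function.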
Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.
Energy Technology Data Exchange (ETDEWEB)
Noonan, Nicholas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-07-01
This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).
Domain Endurants: An Analysis and Description Process Model
DEFF Research Database (Denmark)
Bjørner, Dines
2014-01-01
We present a summary, Sect. 2, of a structure of domain analysis and description concepts: techniques and tools. And we link, in Sect. 3, these concepts, embodied in domain analysis prompts and domain description prompts, in a model of how a diligent domain analyser cum describer would use them. ...
Stability analysis of an autocatalytic protein model
Lee, Julian
2016-05-01
A self-regulatory genetic circuit, where a protein acts as a positive regulator of its own production, is known to be the simplest biological network with a positive feedback loop. Although at least three components—DNA, RNA, and the protein—are required to form such a circuit, stability analysis of the fixed points of this self-regulatory circuit has been performed only after reducing the system to a two-component system, either by assuming a fast equilibration of the DNA component or by removing the RNA component. Here, stability of the fixed points of the three-component positive feedback loop is analyzed by obtaining eigenvalues of the full three-dimensional Hessian matrix. In addition to rigorously identifying the stable fixed points and saddle points, detailed information about the system can be obtained, such as the existence of complex eigenvalues near a fixed point.
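The full three-dimensional eigenvalue analysis the abstract describes can be sketched for an illustrative DNA-RNA-protein positive-feedback circuit. The rate laws and numbers below are assumptions chosen for the example, not the paper's model; the point is evaluating the 3x3 Jacobian at a fixed point rather than a reduced 2D system.

```python
# Stability of a fixed point of a three-component positive-feedback loop.

import numpy as np

# State x = (d, m, p): active DNA fraction, mRNA, protein.
# d' = k_on*p*(1-d) - k_off*d ; m' = a*d - dm*m ; p' = b*m - dp*p
k_on, k_off, a, dm, b, dp = 1.0, 1.0, 2.0, 1.0, 2.0, 1.0

def jacobian(d, m, p):
    return np.array([
        [-k_on * p - k_off, 0.0, k_on * (1 - d)],
        [a, -dm, 0.0],
        [0.0, b, -dp],
    ])

# A nonzero fixed point satisfies d = p/(1+p), m = a*d/dm, p = b*m/dp,
# so p = b*a/(dm*dp) - 1 = 3, d = 0.75, m = 1.5 for these rates.
p = b * a / (dm * dp) - 1.0
d = p / (1.0 + p)
m = a * d / dm
eigvals = np.linalg.eigvals(jacobian(d, m, p))
stable = bool(np.all(eigvals.real < 0))
```

All three eigenvalues have negative real parts here, so the upper fixed point is stable; complex eigenvalues near a fixed point would signal the oscillatory approach the abstract mentions.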
Structural model analysis of multiple quantitative traits.
Directory of Open Access Journals (Sweden)
Renhua Li
2006-07-01
Full Text Available We introduce a method for the analysis of multilocus, multitrait genetic data that provides an intuitive and precise characterization of genetic architecture. We show that it is possible to infer the magnitude and direction of causal relationships among multiple correlated phenotypes and illustrate the technique using body composition and bone density data from mouse intercross populations. Using these techniques we are able to distinguish genetic loci that affect adiposity from those that affect overall body size and thus reveal a shortcoming of standardized measures such as body mass index that are widely used in obesity research. The identification of causal networks sheds light on the nature of genetic heterogeneity and pleiotropy in complex genetic systems.
Modeling and Exergy Analysis of District Cooling
DEFF Research Database (Denmark)
Nguyen, Chan
and surrounding temperature has been carried out. It has been demonstrated that the two methods yield significantly different results. Energy costing prices the unit cost of heating and cooling equally, independent of the quality of the heat transfer, and it tends to overprice the cost of cooling in an irrational...... to be the more rational apportioning method for simultaneous district heating and cooling. The methodology of the exergy costing method can also be used to calculate the environmental impact of each consumer. Taxation can eventually be based on this. The method is called exergoenvironmental analysis....... As a principal example, the CO2 emission for each of the cooling and heating consumers is found. The conclusion is analogous to that of the exergy costing method, i.e. the exergoenvironmental method can be used as motivation for reducing CO2 emission. One of the main obstacles with district cooling in a traditional water...
Sensitivity analysis of fine sediment models using heterogeneous data
Kamel, A. M. Yousif; Bhattacharya, B.; El Serafy, G. Y.; van Kessel, T.; Solomatine, D. P.
2012-04-01
Sediments play an important role in many aquatic systems. Their transportation and deposition have significant implications for morphology, navigability and water quality. Understanding the dynamics of sediment transportation in time and space is therefore important for designing interventions and making management decisions. This research is related to the fine sediment dynamics in the Dutch coastal zone, which is subject to human interference through constructions, fishing, navigation, sand mining, etc. These activities affect the natural flow of sediments and sometimes lead to environmental concerns or affect the siltation rates in harbours and fairways. Numerical models are widely used in studying fine sediment processes. The accuracy of numerical models depends upon the estimation of model parameters through calibration. Studying the model uncertainty related to these parameters is important in improving the spatio-temporal prediction of suspended particulate matter (SPM) concentrations, and in determining the limits of their accuracy. This research deals with the analysis of a 3D numerical model of the North Sea covering the Dutch coast using the Delft3D modelling tool (developed at Deltares, The Netherlands). The methodology in this research was divided into three main phases. The first phase focused on analysing the performance of the numerical model in simulating SPM concentrations near the Dutch coast by comparing the model predictions with SPM concentrations estimated from NASA's MODIS sensors at different time scales. The second phase focused on carrying out a sensitivity analysis of model parameters. Four model parameters were identified for the uncertainty and sensitivity analysis: the sedimentation velocity, the critical shear stress above which re-suspension occurs, the Shields shear stress for re-suspension pick-up, and the re-suspension pick-up factor. By adopting different values of these parameters the numerical model was run and a comparison between the
JSim, an open-source modeling system for data analysis.
Butterworth, Erik; Jardine, Bartholomew E; Raymond, Gary M; Neal, Maxwell L; Bassingthwaighte, James B
2013-01-01
JSim is a simulation system for developing models, designing experiments, and evaluating hypotheses on physiological and pharmacological systems through the testing of model solutions against data. It is designed for interactive, iterative manipulation of the model code, handling of multiple data sets and parameter sets, and for making comparisons among different models running simultaneously or separately. Interactive use is supported by a large collection of graphical user interfaces for model writing and compilation diagnostics, defining input functions, model runs, selection of algorithms solving ordinary and partial differential equations, run-time multidimensional graphics, parameter optimization (8 methods), sensitivity analysis, and Monte Carlo simulation for defining confidence ranges. JSim uses the Mathematical Modeling Language (MML), a declarative syntax specifying algebraic and differential equations. Imperative constructs written in other languages (MATLAB, FORTRAN, C++, etc.) are accessed through procedure calls. MML syntax is simple, basically defining the parameters and variables, then writing the equations in a straightforward, easily read and understood mathematical form. This makes JSim good for teaching modeling as well as for model analysis for research. For high-throughput applications, JSim can be run as a batch job. JSim can automatically translate models from the repositories for Systems Biology Markup Language (SBML) and CellML models. Stochastic modeling is supported. MML supports assigning physical units to constants and variables and automates checking dimensional balance as the first step in verification testing. Automatic unit scaling follows, e.g. seconds to minutes, if needed. The JSim Project File sets a standard for reproducible modeling analysis: it includes in one file everything for analyzing a set of experiments: the data, the models, the data fitting, and evaluation of parameter confidence ranges. JSim is open source; it
Analysis and modeling of rail maintenance costs
Directory of Open Access Journals (Sweden)
Amir Ali Bakhshi
2012-01-01
Full Text Available Railroad maintenance engineering plays an important role in the availability of roads and in reducing the cost of railroad incidents. Rail is one of the most important parts of the railroad industry and needs regular maintenance, since it accounts for a significant part of total maintenance cost. Any attempt at optimizing the total cost of maintenance could substantially reduce the cost of the railroad system and of the industry as a whole. The paper presents a new method to estimate the cost of rail failure using different cost components, such as the cost of inspection and the cost of risk associated with possible accidents. The proposed model of this paper is applied to a real-world case study of railroad transportation in the Tehran region and the results have been analyzed.
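The cost components the paper combines, inspection cost plus the risk cost of accidents, can be sketched as an expected-value calculation. All figures and parameter names below are hypothetical, chosen only to show the tradeoff between inspection frequency and risk.

```python
# Expected annual rail cost: inspection cost + probability-weighted accident cost.

def annual_rail_cost(km, inspections_per_year, cost_per_km_inspection,
                     defects_per_km_year, p_miss, p_derail_given_miss,
                     accident_cost):
    inspection = km * inspections_per_year * cost_per_km_inspection
    # A defect contributes to risk only if every inspection misses it.
    expected_misses = km * defects_per_km_year * p_miss ** inspections_per_year
    risk = expected_misses * p_derail_given_miss * accident_cost
    return inspection + risk

# More frequent inspection raises inspection cost but cuts risk cost.
low = annual_rail_cost(500, 2, 40.0, 0.05, 0.3, 0.1, 2_000_000)
high = annual_rail_cost(500, 4, 40.0, 0.05, 0.3, 0.1, 2_000_000)
```

Minimizing such a total over the inspection frequency is one form of the cost optimization the paper pursues.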
Materials Analysis and Modeling of Underfill Materials.
Energy Technology Data Exchange (ETDEWEB)
Wyatt, Nicholas B [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chambers, Robert S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-08-01
The thermal-mechanical properties of three potential underfill candidate materials for PBGA applications are characterized and reported. Two of the materials are formulations developed at Sandia for underfill applications, while the third is a commercial product that utilizes a snap-cure chemistry to drastically reduce cure time. Viscoelastic models were calibrated and fit using the property data collected for one of the Sandia-formulated materials. Along with the thermal-mechanical analyses performed, a series of simple bi-material strip tests were conducted to comparatively analyze the relative effects of cure and thermal shrinkage among the materials under consideration. Finally, current knowledge gaps as well as questions arising from the present study are identified and a path forward is presented.
Expatriates Selection: An Essay of Model Analysis
Directory of Open Access Journals (Sweden)
Rui Bártolo-Ribeiro
2015-03-01
Full Text Available Business expansion to geographical areas with cultures different from those in which organizations were created and developed leads to the expatriation of employees to these destinations. Recruitment and selection procedures for expatriates do not always have the intended success, leading to an early return of these professionals with consequent organizational disorder. In this study, several articles published in the last five years were analyzed in order to identify the most frequently mentioned dimensions in the selection of expatriates in terms of success and failure. The characteristics in the selection process that may improve prediction of the adaptation of expatriates to the new cultural contexts of the same organization were studied according to the KSAOs model. Few references to the Knowledge, Skills and Abilities dimensions were found in the analyzed papers. There was a strong predominance of the evaluation of Other Characteristics, and more importance was given to dispositional factors than to situational factors in promoting the integration of expatriates.
Parametric analysis of fire model CFAST
International Nuclear Information System (INIS)
This paper describes a pump room fire in a nuclear power plant using the CFAST fire modeling code developed by NIST. The analysis is determined by whether the fire is constrained or unconstrained, the Lower Oxygen Limit (LOL), the Radiative Fraction (RF), and the times at which doors are opened, which are input parameters of CFAST. According to the results, the pump room fire is a ventilation-controlled fire, so the LOL value of 10%, which is also the default value, is adequate. It appears that the RF does not change the temperature of the upper gas layer. However, the degree of opening of the penetration area and the times at which it is opened do affect the temperature of the upper layer, so those results should be carefully analyzed.
Strong absorption model analysis of alpha scattering
International Nuclear Information System (INIS)
Angular distributions of alpha particles at several energies, Eα = 21 ∼ 85.6 MeV, from a number of nuclei between 20Ni and 119Sn, extending over a wide angular range, up to ∼160 deg in some cases, have been analyzed in terms of the three-parameter strong absorption model of Frahn and Venter. The interaction radius and surface diffuseness are obtained from the parameter values rendering the best fit to the elastic scattering data. The inelastic scattering of alpha particles from a number of nuclei, leading to quadrupole and octupole excitations, has also been studied, giving the deformation parameters βL. (author). 14 refs, 7 figs, 3 tabs
Analysis of Empirical Software Effort Estimation Models
Basha, Saleem
2010-01-01
Reliable effort estimation remains an ongoing challenge for software engineers. Accurate effort estimation is the state of the art of software engineering; effort estimation of software is the preliminary phase between the client and the business enterprise. The relationship between the client and the business enterprise begins with the estimation of the software, and credibility with the client increases with accurate estimation. Effort estimation often requires generalizing from a small number of historical projects, and generalization from such limited experience is an inherently underconstrained problem. Accurate estimation is a complex process because it can be visualized as software effort prediction, and as the term indicates, a prediction never becomes an actual. This work follows the basics of empirical software effort estimation models. The goal of this paper is to study empirical software effort estimation. The primary conclusion is that no single technique is best for all sit...
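One concrete instance of the empirical models such surveys cover is basic COCOMO, which estimates effort in person-months as a power law of size in thousands of lines of code (KLOC). The coefficients below are the classic published values for the three project classes; treating any single such model as universally applicable is exactly what the paper cautions against.

```python
# Basic COCOMO: effort (person-months) = a * KLOC^b per project class.

COEFFS = {
    "organic":      (2.4, 1.05),
    "semidetached": (3.0, 1.12),
    "embedded":     (3.6, 1.20),
}

def cocomo_effort(kloc, mode="organic"):
    a, b = COEFFS[mode]
    return a * kloc ** b   # person-months

effort = cocomo_effort(32, "organic")
```

For a 32 KLOC organic project this gives roughly 91 person-months; the same size classed as embedded more than doubles the estimate, illustrating how sensitive these models are to their calibration assumptions.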
Dynamic Modelling and Statistical Analysis of Event Times
Peña, Edsel A.
2006-01-01
This review article provides an overview of recent work in the modelling and analysis of recurrent events arising in engineering, reliability, public health, biomedical, and other areas. Recurrent event modelling possesses unique facets making it different and more difficult to handle than single event settings. For instance, the impact of an increasing number of event occurrences ...
Modelling, authoring and publishing the "document analysis" learning object
Flament, Alexandre; Villiot Leclercq, Emmanuelle
2004-01-01
This article describes the modelling and implementation of a document analysis learning object. The objective of this research is twofold: providing models and tools to teachers that allow them to produce learning objects, and building a publishing chain that can be applied to other kinds of learning objects. Implementation choices rely on interoperability and the use of standards.
Numerical equilibrium analysis for structured consumer resource models
de Roos, A.M.; Diekmann, O.; Getto, P.; Kirkilionis, M.A.
2010-01-01
In this paper, we present methods for a numerical equilibrium and stability analysis of models of a size-structured population competing for an unstructured resource. We concentrate on cases where two model parameters are free, and thus existence boundaries for equilibria and stability boundaries c
Standard model for safety analysis report of fuel reprocessing plants
International Nuclear Information System (INIS)
A standard model for a safety analysis report of fuel reprocessing plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) aiming to evaluate the requests of construction permits and operation licenses made according to the legislation in force. (E.G.)
Bifurcation Analysis in a Delayed Diffusive Leslie-Gower Model
Directory of Open Access Journals (Sweden)
Shuling Yan
2013-01-01
Full Text Available We investigate a modified delayed Leslie-Gower model under homogeneous Neumann boundary conditions. We give the stability analysis of the equilibria of the model and show the existence of Hopf bifurcation at the positive equilibrium under some conditions. Furthermore, we investigate the stability and direction of bifurcating periodic orbits by using normal form theorem and the center manifold theorem.
Analysis of Layer Cross-degree on Supernetwork Models
Institute of Scientific and Technical Information of China (English)
LIU; Qiang; FANG; Jin-qing; LI; Yong
2013-01-01
In order to improve the analysis of supernetworks, we introduced a new characteristic quantity, called the layer cross-degree, to analyze some properties of supernetwork models. In three-layered supernetwork models there are five different link types of edges. When we consider the shortest path length L of a multilevel supernetwork, two kinds of layer
A Noncentral "t" Regression Model for Meta-Analysis
Camilli, Gregory; de la Torre, Jimmy; Chiu, Chia-Yi
2010-01-01
In this article, three multilevel models for meta-analysis are examined. Hedges and Olkin suggested that effect sizes follow a noncentral "t" distribution and proposed several approximate methods. Raudenbush and Bryk further refined this model; however, this procedure is based on a normal approximation. In the current research literature, this…
Illustration of a Multilevel Model for Meta-Analysis
de la Torre, Jimmy; Camilli, Gregory; Vargas, Sadako; Vernon, R. Fox
2007-01-01
In this article, the authors present a multilevel (or hierarchical linear) model that illustrates issues in the application of the model to data from meta-analytic studies. In doing so, several issues are discussed that typically arise in the course of a meta-analysis. These include the presence of non-zero between-study variability, how multiple…
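The two meta-analysis records above both build on the standard random-effects model. As an illustrative sketch only, the following implements the classic DerSimonian-Laird estimator that such multilevel and noncentral-t refinements generalize; it is not the specific procedure of either article, and the function name is hypothetical:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects meta-analysis: DerSimonian-Laird tau^2 and pooled effect."""
    w = [1.0 / v for v in variances]          # fixed-effect (inverse-variance) weights
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    # Cochran's Q: weighted squared deviations from the fixed-effect mean
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    k = len(effects)
    # Method-of-moments estimate of between-study variance (truncated at zero)
    tau2 = max(0.0, (q - (k - 1)) / (sw - sum(wi ** 2 for wi in w) / sw))
    # Random-effects weights incorporate the between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2
```

With homogeneous studies the estimator collapses to the fixed-effect mean, since Q falls below its degrees of freedom and tau^2 truncates to zero.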
Detecting tipping points in ecological models with sensitivity analysis
Broeke, G.A. ten; Voorn, van G.A.K.; Kooi, B.W.; Molenaar, J.
2016-01-01
Simulation models are commonly used to understand and predict the development of ecological systems, for instance to study the occurrence of tipping points and their possible ecological effects. Sensitivity analysis is a key tool in the study of model responses to changes in conditions. The applicabi
Standard model for safety analysis report of fuel fabrication plants
International Nuclear Information System (INIS)
A standard model for a safety analysis report of fuel fabrication plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) aiming to evaluate the requests of construction permits and operation licenses made according to the legislation in force. (E.G.)
MMA, A Computer Code for Multi-Model Analysis
Poeter, Eileen P.; Hill, Mary C.
2007-01-01
This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. Many applications of MMA will
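The AIC-based ranking and model averaging described above can be sketched with the standard Akaike-weight calculation. This is a minimal illustration of the underlying arithmetic, not MMA's actual code, and the exact formulas MMA uses for posterior model probabilities may differ:

```python
import math

def aic_weights(aic_values):
    """Akaike weights: relative model probabilities derived from AIC scores."""
    best = min(aic_values)
    # Delta-AIC relative to the best model; exp(-delta/2) is the relative likelihood
    rel = [math.exp(-(a - best) / 2.0) for a in aic_values]
    total = sum(rel)
    return [r / total for r in rel]

def model_average(estimates, weights):
    """Model-averaged parameter estimate: weight each model's estimate."""
    return sum(w * e for w, e in zip(weights, estimates))
```

For example, `aic_weights([100.0, 102.0, 110.0])` assigns the largest weight to the first (lowest-AIC) model, and the weights sum to one, so they can serve directly as averaging weights.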
Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm
Directory of Open Access Journals (Sweden)
Raj Kumar
2012-12-01
In this paper, we have illustrated the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood-based inferential procedures, classical as well as Bayesian. The quasi Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using Markov Chain Monte Carlo methods). R functions are developed to study the statistical properties, model validation and comparison tools of the model, and the output analysis of MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated, and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
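As a sketch of the Gumbel parameter estimation discussed above, the following computes closed-form method-of-moments estimates (a common starting point for the Newton-Raphson refinement of the maximum likelihood) and the Gumbel log-likelihood. This is illustrative only, not the paper's OpenBUGS/MCMC procedure, and the function names are hypothetical:

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_moment_estimates(data):
    """Method-of-moments estimates (mu, beta) for the Gumbel (maximum) distribution."""
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / (n - 1)
    # For Gumbel(max): sd = beta * pi / sqrt(6), mean = mu + gamma * beta
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def gumbel_loglik(data, mu, beta):
    """Gumbel log-likelihood, usable as the objective for Newton-Raphson refinement."""
    z = [(x - mu) / beta for x in data]
    return sum(-math.log(beta) - zi - math.exp(-zi) for zi in z)
```

The moment estimates are typically close to the maximum likelihood estimates and make a stable initial guess for iterative optimisation.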
Logical Modeling and Dynamical Analysis of Cellular Networks.
Abou-Jaoudé, Wassim; Traynard, Pauline; Monteiro, Pedro T; Saez-Rodriguez, Julio; Helikar, Tomáš; Thieffry, Denis; Chaouiya, Claudine
2016-01-01
The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of mammalian cell cycle.
Analysis and synthesis of solutions for the agglomeration process modeling
Babuk, V. A.; Dolotkazin, I. N.; Nizyaev, A. A.
2013-03-01
The present work is devoted to the development of a model of the agglomeration process for propellants based on ammonium perchlorate (AP), ammonium dinitramide (ADN), HMX, inactive binder, and nanoaluminum. Generalization of experimental data, development of a physical picture of agglomeration for the listed propellants, and development and analysis of mathematical models are carried out. Synthesis of models of the various phenomena taking place during agglomeration allows prediction of the size, quantity, chemical composition, and structure of the forming agglomerates and their fraction in the set of condensed combustion products. This became possible in many respects due to the development of a new model of agglomerating-particle evolution on the surface of a burning propellant. The obtained results correspond to available experimental data. It is supposed that an analogous method, based on the analysis of mathematical models of particular phenomena and their synthesis, will allow modeling of the agglomeration process for other types of metalized solid propellants.
Comparative analysis of model assessment in community detection
Kawamoto, Tatsuro
2016-01-01
Bayesian cluster inference with a flexible generative model allows us to detect various types of structures. However, it has problems stemming from computational complexity and difficulties in model assessment. We consider the stochastic block model with restricted hyperparameter space, which is known to correspond to modularity maximization. We show that it not only reduces computational complexity, but is also beneficial for model assessment. Using various criteria, we conduct a comparative analysis of the model assessments, and analyze whether each criterion tends to overfit or underfit. We also show that the learning of hyperparameters leads to qualitative differences in Bethe free energy and cross-validation errors.
Practical Soil-Shallow Foundation Model for Nonlinear Structural Analysis
Directory of Open Access Journals (Sweden)
Moussa Leblouba
2016-01-01
Soil-shallow foundation interaction models that are incorporated into most structural analysis programs generally lack accuracy and efficiency or neglect some aspects of foundation behavior. For instance, soil-shallow foundation systems have been observed to show both small and large loops under increasing amplitude load reversals. This paper presents a practical macroelement model for soil-shallow foundation system and its stability under simultaneous horizontal and vertical loads. The model comprises three spring elements: nonlinear horizontal, nonlinear rotational, and linear vertical springs. The proposed macroelement model was verified using experimental test results from large-scale model foundations subjected to small and large cyclic loading cases.
Landsat analysis of tropical forest succession employing a terrain model
Barringer, T. H.; Robinson, V. B.; Coiner, J. C.; Bruce, R. C.
1980-01-01
Landsat multispectral scanner (MSS) data have yielded a dual classification of rain forest and shadow in an analysis of a semi-deciduous forest on Mindoro Island, Philippines. Both a spatial terrain model, using a fifth-order polynomial trend surface analysis for quantitatively estimating the general spatial variation in the data set, and a spectral terrain model, based on the MSS data, have been set up. A discriminant analysis, using both sets of data, has suggested that shadowing effects may be due primarily to local variations in the spectral regions and can therefore be compensated for through the decomposition of the spatial variation in both elevation and MSS data.
A Quotient Space Approximation Model of Multiresolution Signal Analysis
Institute of Scientific and Technical Information of China (English)
Ling Zhang; Bo Zhang
2005-01-01
In this paper, we present a quotient space approximation model of multiresolution signal analysis and discuss the properties and characteristics of the model. Then the comparison between wavelet transform and the quotient space approximation is made. First, when wavelet transform is viewed from the new quotient space approximation perspective, it may help us to gain an insight into the essence of multiresolution signal analysis. Second, from the similarity between wavelet and quotient space approximations, it is possible to transfer the rich wavelet techniques into the latter so that a new way for multiresolution analysis may be found.
MMA, A Computer Code for Multi-Model Analysis
Energy Technology Data Exchange (ETDEWEB)
Eileen P. Poeter and Mary C. Hill
2007-08-20
This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations.
IMAGE ANALYSIS FOR MODELLING SHEAR BEHAVIOUR
Directory of Open Access Journals (Sweden)
Philippe Lopez
2011-05-01
Through laboratory research performed over the past ten years, many of the critical links between fracture characteristics and hydromechanical and mechanical behaviour have been made for individual fractures. One of the remaining challenges at the laboratory scale is to directly link fracture morphology and shear behaviour with changes in stress and shear direction. A series of laboratory experiments were performed on cement mortar replicas of a granite sample with a natural fracture perpendicular to the axis of the core. Results show that there is a strong relationship between the fracture's geometry and its mechanical behaviour under shear stress and the resulting damage. Image analysis, geostatistical, stereological and directional data techniques are applied in combination to the experimental data. The results highlight the role of the geometric characteristics of the fracture surfaces (surface roughness; the size, shape, locations and orientations of the asperities to be damaged) in shear behaviour. A notable improvement in the understanding of shear is that shear behaviour is controlled by the apparent dip, in the shear direction, of the elementary facets forming the fracture.
Schizophrenia: the testing of genetic models by pedigree analysis.
Stewart, J.; Debray, Q; Caillard, V
1980-01-01
Simulated pedigrees of schizophrenia generally show a clear peak in their likelihood surface corresponding to analysis by the genetic model that served as the basis for the simulation. The likelihood surface obtained with real data permits determination of the allelic frequency and the selection of optimal one-locus, two-locus, and four-locus models. These three models have certain features in common, notably a relatively high frequency of the allele predisposing to schizophrenia (about...
Modelling and analysis of behaviour of biomedical scaffolds
Reali, Luca
2015-01-01
Since articular cartilage related diseases are an increasing issue and they are nowadays treated by invasive prosthesis implantations, there is a strong demand for new solutions such as those offered by scaffold engineering. This work deals with the characterization and modelling of polymeric fabrics for cartilage repair. Creep tests data at three different applied forces were successfully modelled both analytically, using viscoelastic models, and by finite element analysis which embraced the...
ANALYSIS OF THE MECHANISM MODELS OF TECHNOLOGICAL INNOVATION DIFFUSION
Institute of Scientific and Technical Information of China (English)
XU Jiuping; HU Minan
2004-01-01
This paper analyzes the mechanism and principle of technology diffusion on the basis of quantitative analysis. It then sets up a diffusion model of innovation incorporating price, advertising and distribution; a diffusion model of innovation including various kinds of consumers; and a substitution model between the new technology and the old one, applying systems dynamics, optimization methods, probabilistic methods and computer simulation. Finally, the paper concludes with some practical observations from a case study.
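The specific diffusion models of the paper above are not reproduced in the record, but the classic Bass model is the standard starting point for innovation-diffusion models of this class. The following sketch, written under that assumption, implements the well-known closed-form Bass adoption curve:

```python
import math

def bass_cumulative(t, p, q, m=1.0):
    """Cumulative adopters at time t under the Bass diffusion model.

    p: coefficient of innovation (external influence, e.g. advertising)
    q: coefficient of imitation (internal influence, word of mouth)
    m: market potential (total eventual adopters)
    """
    e = math.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

def bass_adoption_rate(t, p, q, m=1.0, h=1e-6):
    """Instantaneous adoption rate, approximated by a central finite difference."""
    return (bass_cumulative(t + h, p, q, m) - bass_cumulative(t - h, p, q, m)) / (2 * h)
```

When q > p the adoption rate is unimodal, peaking at t* = ln(q/p)/(p+q), which is the characteristic S-shaped uptake that substitution models between new and old technologies build upon.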
Comparative analysis of credit risk models for loan portfolios.
Han, C
2014-01-01
This study is distinct from previous studies in its inclusion of new models, consideration of sector correlation and performance of a comprehensive sensitivity analysis. CreditRisk+, CreditMetrics, the Basel II internal-ratings-based method and the Mercer Oliver Wyman model are considered. Risk factor distribution and the relationship between risk components and risk factors are the key distinguishing characteristics of each model. CreditRisk+, due to its extra degree of freedom, has the high...
SOA Modeling Patterns for Service Oriented Discovery and Analysis
Bell, Michael
2010-01-01
Learn the essential tools for developing a sound service-oriented architecture. SOA Modeling Patterns for Service-Oriented Discovery and Analysis introduces a universal, easy-to-use, and nimble SOA modeling language to facilitate the service identification and examination life cycle stage. This business and technological vocabulary will benefit your service development endeavors and foster organizational software asset reuse and consolidation, and reduction of expenditure. Whether you are a developer, business architect, technical architect, modeler, business analyst, team leader, or manager,
Computer-based modelling and analysis in engineering geology
Giles, David
2014-01-01
This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my...
A Novel Two-Dimension’ Customer Knowledge Analysis Model
Liu Xuelian; Nopasit Chakpitak; Pitipong Yodmongkol
2015-01-01
Customer knowledge is of increasing importance in customer-oriented enterprises. A customer knowledge management process with models can help managers to identify the real value chain in a business process. The purpose of the paper is to develop a tool for the classification and processing of customer knowledge from the perspective of knowledge management. By reviewing previous customer knowledge management models, this paper proposes a novel two-dimension customer knowledge analysis model, which make custom...
Practical Soil-Shallow Foundation Model for Nonlinear Structural Analysis
Moussa Leblouba; Salah Al Toubat; Muhammad Ekhlasur Rahman; Omer Mugheida
2016-01-01
Soil-shallow foundation interaction models that are incorporated into most structural analysis programs generally lack accuracy and efficiency or neglect some aspects of foundation behavior. For instance, soil-shallow foundation systems have been observed to show both small and large loops under increasing amplitude load reversals. This paper presents a practical macroelement model for soil-shallow foundation system and its stability under simultaneous horizontal and vertical loads. The model...
Semiparametric theory based MIMO model and performance analysis
Institute of Scientific and Technical Information of China (English)
XU Fang-min; XU Xiao-dong; ZHANG Ping
2007-01-01
In this article, a new approach for modeling multi-input multi-output (MIMO) systems with unknown nonlinear interference is introduced. The semiparametric-theory-based MIMO model is established, and kernel estimation is applied to combat the nonlinear interference. Furthermore, we derive the MIMO capacity for these systems and explore the asymptotic properties of the new channel matrix via theoretical analysis. The simulation results show that semiparametric-theory-based modeling and kernel estimation are valid for combating this kind of interference.
ANALYSIS OF ORGANIZATIONAL CULTURE WITH SOCIAL NETWORK MODELS
Titov, S.
2015-01-01
Organizational culture is nowadays the object of numerous scientific papers. However, only a marginal part of existing research attempts to use formal models of organizational cultures. The lack of organizational culture models significantly limits further research in this area and restricts the application of the theory to the practice of organizational culture change projects. The article consists of general views on the potential application of network models and social network analysis to th...
Flood modeling for risk evaluation: a MIKE FLOOD sensitivity analysis
Vanderkimpen, P.; Peeters, P
2008-01-01
The flood risk for a section of the Belgian coastal plain was evaluated by means of dynamically linked 1D (breach) and 2D (floodplain) hydraulic models. First, a one-at-a-time factor screening was performed to evaluate the relative importance of various model processes and parameters. Subsequently, a systematic sensitivity analysis was added to establish the contribution of the most influential factors (breach growth and surface roughness) to hydraulic modeling uncertainty. Finally, the uncer...
Validation of statistical models for creep rupture by parametric analysis
Energy Technology Data Exchange (ETDEWEB)
Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)
2012-01-15
Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. Highlights: ► The paper discusses the validation of creep rupture models derived from statistical analysis. ► It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. ► The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. ► The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).
A Grammar Analysis Model for the Unified Multimedia Query Language
Institute of Scientific and Technical Information of China (English)
Zhong-Sheng Cao; Zong-Da Wu; Yuan-Zhen Wang
2008-01-01
The unified multimedia query language (UMQL) is a powerful general-purpose multimedia query language, and it is very suitable for multimedia information retrieval. The paper proposes a grammar analysis model to implement an effective grammatical processing for the language. It separates the grammar analysis of a UMQL query specification into two phases: syntactic analysis and semantic analysis, and then respectively uses extended Backus-Naur form (EBNF) and logical algebra to specify both restrictive grammar rules. As a result, the model can present error-guiding information for a query specification which owns incorrect grammar. The model not only suits well the processing of UMQL queries, but also has a guiding significance for other projects concerning query processing of descriptive query languages.
QuantUM: Quantitative Safety Analysis of UML Models
Leitner-Fischer, Florian; 10.4204/EPTCS.57.2
2011-01-01
When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods it is still difficult for software and system architects to integrate these techniques into their every day work. This is mainly due to the lack of methods that can be directly applied to architecture level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysi...
Microeconomic co-evolution model for financial technical analysis signals
Rotundo, G
2006-01-01
Technical analysis (TA) has been used for a long time before the availability of more sophisticated instruments for financial forecasting in order to suggest decisions on the basis of the occurrence of data patterns. Many mathematical and statistical tools for quantitative analysis of financial markets have experienced a fast and wide growth and have the power for overcoming classical technical analysis methods. This paper aims to give a measure of the reliability of some information used in TA by exploring the probability of their occurrence within a particular microeconomic agent-based model of markets, i.e., the co-evolution Bak-Sneppen model originally invented for describing species population evolutions. After having proved the practical interest of such a model in describing financial index so-called avalanches, in the pre-bursting bubble time rise, the attention focuses on the occurrence of trend line detection crossing of meaningful barriers, those that give rise to some usual technical analysis str...
Model analysis of fomite mediated influenza transmission.
Zhao, Jijun; Eisenberg, Joseph E; Spicknall, Ian H; Li, Sheng; Koopman, James S
2012-01-01
Fomites involved in influenza transmission are either hand- or droplet-contaminated. We evaluated the interactions of fomite characteristics and human behaviors affecting these routes using an Environmental Infection Transmission System (EITS) model by comparing the basic reproduction numbers (R(0)) for different fomite mediated transmission pathways. Fomites classified as large versus small surface sizes (reflecting high versus low droplet contamination levels) and high versus low touching frequency have important differences. For example, 1) the highly touched large surface fomite (public tables) has the highest transmission potential and generally strongest control measure effects; 2) transmission from droplet-contaminated routes exceed those from hand-contaminated routes except for highly touched small surface fomites such as door knob handles; and 3) covering a cough using the upper arm or using tissues effectively removes virus from the system and thus decreases total fomite transmission. Because covering a cough by hands diverts pathogens from the droplet-fomite route to the hand-fomite route, this has the potential to increase total fomite transmission for highly touched small surface fomites. An improved understanding and more refined data related to fomite mediated transmission routes will help inform intervention strategies for influenza and other pathogens that are mediated through the environment.
Evaluation of Cost Models and Needs & Gaps Analysis
DEFF Research Database (Denmark)
Kejser, Ulla Bøgvad
2014-01-01
This report, 'D3.1—Evaluation of Cost Models and Needs & Gaps Analysis', provides an analysis of existing research related to the economics of digital curation and cost & benefit modelling. It reports upon the investigation of how well current models and tools meet stakeholders' needs for calculating...... for a more efficient use of resources for digital curation. To facilitate and clarify the model evaluation, the report first outlines a basic terminology and a general description of the characteristics of cost and benefit models. The report then describes how the ten current and emerging cost and benefit...... models included in the evaluation were identified and provides a summary of each of the models. To facilitate comparison of the models, it also provides tables that list each of the models' core features, such as which information assets they handle, which curation activities they address and how...
Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits
DEFF Research Database (Denmark)
Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo
2013-01-01
A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in the context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics, although the discussion presented...... concentrates on longevity studies. The framework presented allows combining models based on continuous time with models based on discrete time in a joint analysis. The continuous time models are approximations of the frailty model in which the hazard function is assumed to be piecewise constant....... The discrete time models used are multivariate variants of the discrete relative risk models. These models allow for regular parametric likelihood-based inference by exploiting a coincidence of their likelihood functions and the likelihood functions of suitably defined multivariate generalized linear mixed...
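As a hedged sketch of the approximation named in the abstract above, a hazard function assumed piecewise constant, the survival function can be computed by accumulating hazard interval by interval. The breakpoints and rates below are illustrative, not the authors' fitted model:

```python
import math

def survival_piecewise(hazards, breakpoints, t):
    """Survival S(t) = exp(-integral of the hazard up to t) for a
    piecewise-constant hazard: hazards[i] applies on
    [breakpoints[i], breakpoints[i+1])."""
    cum = 0.0
    for h, (a, b) in zip(hazards, zip(breakpoints, breakpoints[1:])):
        if t <= a:
            break
        cum += h * (min(t, b) - a)
    return math.exp(-cum)

# Hazard 0.5 on [0, 2) then 1.0 on [2, 10); evaluate survival at t = 3.
s = survival_piecewise([0.5, 1.0], [0.0, 2.0, 10.0], 3.0)
```

This is the standard construction; the paper's contribution is embedding it in a multivariate mixed-model framework with a genetic covariance structure.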
Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits
DEFF Research Database (Denmark)
Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo
2014-01-01
A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in the context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics, although the discussion presented...... concentrates on longevity studies. The framework presented allows combining models based on continuous time with models based on discrete time in a joint analysis. The continuous time models are approximations of the frailty model in which the hazard function is assumed to be piecewise constant....... The discrete time models used are multivariate variants of the discrete relative risk models. These models allow for regular parametric likelihood-based inference by exploiting a coincidence of their likelihood functions and the likelihood functions of suitably defined multivariate generalized linear mixed...
Model order reduction techniques with applications in finite element analysis
Qu, Zu-Qing
2004-01-01
Despite the continued rapid advance in computing speed and memory, the increase in the complexity of models used by engineers persists in outpacing them. Even with access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The computational cost of dealing with high-order, many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods, focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
Joint regression analysis and AMMI model applied to oat improvement
Oliveira, A.; Oliveira, T. A.; Mejza, S.
2012-09-01
In our work we present an application of biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board on 22 different genotypes during the years 2002, 2003 and 2004 in six locations. In Ferreira et al. (2006) the authors state the relevance of regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in R for the AMMI model analysis.
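Joint regression in this sense (the Finlay-Wilkinson approach) regresses each genotype's yield on an environmental index, usually the mean yield of all genotypes in that environment; a slope near 1 indicates average stability. A minimal sketch with made-up yields, not the Portuguese trial data:

```python
def joint_regression(yields):
    """yields[g][e]: yield of genotype g in environment e.
    Returns per-genotype regression slopes on the environmental index
    (the mean yield over all genotypes in each environment)."""
    n_env = len(yields[0])
    index = [sum(row[e] for row in yields) / len(yields) for e in range(n_env)]
    mean_idx = sum(index) / n_env
    sxx = sum((x - mean_idx) ** 2 for x in index)
    slopes = []
    for row in yields:
        mean_y = sum(row) / n_env
        sxy = sum((x - mean_idx) * (y - mean_y) for x, y in zip(index, row))
        slopes.append(sxy / sxx)
    return slopes

# Two genotypes with linear environmental responses of different steepness.
yields = [[1.0, 2.0, 3.0], [2.0, 4.0, 6.0]]
slopes = joint_regression(yields)
```

Because the index is the genotype mean, the slopes average to 1 by construction; the less responsive genotype gets a slope below 1 and the more responsive one a slope above 1.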
The bivariate combined model for spatial data analysis.
Neyens, Thomas; Lawson, Andrew B; Kirby, Russell S; Faes, Christel
2016-08-15
To describe the spatial distribution of diseases, a number of methods have been proposed to model relative risks within areas. Most models use Bayesian hierarchical methods, in which one models both spatially structured and unstructured extra-Poisson variance present in the data. For modelling a single disease, the conditional autoregressive (CAR) convolution model has been very popular. More recently, a combined model was proposed that 'combines' ideas from the CAR convolution model and the well-known Poisson-gamma model. The combined model was shown to be a good alternative to the CAR convolution model when there was a large amount of uncorrelated extra-variance in the data. Fewer solutions exist for modelling two diseases simultaneously or modelling a disease in two sub-populations simultaneously. Furthermore, existing models are typically based on the CAR convolution model. In this paper, a bivariate version of the combined model is proposed in which the unstructured heterogeneity term is split up into terms that are shared and terms that are specific to the disease or subpopulation, while spatial dependency is introduced via a univariate or multivariate Markov random field. The proposed method is illustrated by analysis of disease data in Georgia (USA) and Limburg (Belgium) and in a simulation study. We conclude that the bivariate combined model constitutes an interesting model when two diseases are possibly correlated. As the choice of the preferred model differs between data sets, we suggest using the new and existing modelling approaches together and choosing the best model via goodness-of-fit statistics. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26928309
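The Poisson-gamma building block mentioned above has a closed-form posterior, which is a useful reference point before any MCMC machinery: with observed count y, expected count e, and a Gamma(a, b) prior on the area's relative risk theta, the posterior is Gamma(a + y, b + e). The prior and counts below are illustrative:

```python
def posterior_relative_risk(y, e, a=1.0, b=1.0):
    """Posterior mean and variance of an area-level relative risk under the
    Poisson-gamma model: y ~ Poisson(theta * e), theta ~ Gamma(a, b).
    Conjugacy gives posterior Gamma(a + y, b + e)."""
    shape, rate = a + y, b + e
    return shape / rate, shape / rate**2

# An area with 12 observed cases against 8 expected, vague Gamma(1, 1) prior:
mean, var = posterior_relative_risk(12, 8.0)
```

The posterior mean (a + y)/(b + e) shrinks the raw ratio y/e toward the prior mean a/b, which is the smoothing behavior the combined model inherits for its uncorrelated heterogeneity component.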
An introduction to queueing theory modeling and analysis in applications
Bhat, U Narayan
2015-01-01
This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory over more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications, with appropriate references for advanced topics. • Applications in manufacturing and in computer and communication systems. • A chapter on ...
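The simplest of the basic models such a course covers is the M/M/1 queue, whose steady-state formulas make a compact worked example (standard textbook results, not specific to this book):

```python
def mm1_metrics(lam, mu):
    """Steady-state metrics of an M/M/1 queue with arrival rate `lam`
    and service rate `mu` (requires lam < mu for stability)."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu            # server utilization
    L = rho / (1 - rho)       # mean number in system
    W = 1 / (mu - lam)        # mean time in system (Little's law: L = lam * W)
    Lq = rho**2 / (1 - rho)   # mean queue length excluding the customer in service
    return rho, L, W, Lq

# Arrivals at 2 per unit time, service at 3 per unit time:
rho, L, W, Lq = mm1_metrics(lam=2.0, mu=3.0)
```

Note how sharply congestion grows with utilization: at rho = 2/3 there are on average 2 customers in the system, and L diverges as rho approaches 1.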
Structural Simulations and Conservation Analysis - Historic Building Information Model (HBIM)
Directory of Open Access Journals (Sweden)
C. Dore
2015-02-01
In this paper the current findings to date of the Historic Building Information Model (HBIM) of the Four Courts in Dublin are presented. The HBIM forms the basis for both structural and conservation analysis to measure the impact of war damage that still affects the building. A laser scan survey of the internal and external structure was carried out in the summer of 2014. After registration and processing of the laser scan survey, the HBIM was created of the damaged section of the building; it is presented as two separate workflows in this paper. The first is the model created from historic data; the second is a procedural and segmented model developed from the laser scan survey of the war-damaged drum and dome. From both models, structural damage and decay simulations will be developed for documentation and conservation analysis.
Optimization model for air quality analysis in energy facility siting
Energy Technology Data Exchange (ETDEWEB)
Emanuel, W. R.; Murphy, B. D.; Huff, D. D.; Begovich, C. L.; Hurt, J. F.
1977-09-01
The siting of energy facilities on a regional scale is discussed with particular attention to environmental planning criteria. A multiple objective optimization model is proposed as a framework for the analysis of siting problems. Each planning criterion (e.g., air quality, water quality, or power demand) is treated as an objective function to be minimized or maximized subject to constraints in this optimization procedure. The formulation of the objective functions is illustrated by the development of a siting model for the minimization of human exposure to air pollutants. This air quality siting model takes the form of a linear programming problem. A graphical analysis of this type of problem, which provides insight into the nature of the siting model, is given. The air quality siting model is applied to an illustrative siting example for the Tennessee Valley area.
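The air-quality siting model takes the form of a linear program. As a hedged, toy-scale illustration (hypothetical sites and numbers, not the report's Tennessee Valley formulation): allocate required capacity across candidate sites to minimize total population exposure, subject to per-site capacity limits. With a single demand constraint and box bounds, the optimal solution is a greedy fill of the lowest-exposure sites:

```python
def min_exposure_allocation(exposure, cap, demand):
    """Allocate `demand` MW across sites to minimize sum(exposure[i] * x[i])
    subject to 0 <= x[i] <= cap[i] and sum(x) == demand. Greedy fill in
    ascending order of exposure coefficient is optimal for this
    single-constraint linear program."""
    alloc = [0.0] * len(exposure)
    remaining = demand
    for i in sorted(range(len(exposure)), key=lambda i: exposure[i]):
        alloc[i] = min(cap[i], remaining)
        remaining -= alloc[i]
    if remaining > 1e-9:
        raise ValueError("total capacity below demand")
    return alloc

# Three candidate sites: exposure per MW, capacity (MW), and 150 MW required.
alloc = min_exposure_allocation([3.0, 1.0, 2.0], [100.0, 80.0, 120.0], 150.0)
```

A full siting model with multiple objectives (water quality, power demand) would need a general LP solver, but the structure of the objective, one exposure coefficient per candidate site, is the same.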
Model for Analysis of Energy Demand (MAED-2). User's manual
International Nuclear Information System (INIS)
The IAEA has been supporting its Member States in the area of energy planning for sustainable development. Development and dissemination of appropriate methodologies and their computer codes are important parts of this support. This manual has been produced to facilitate the use of the MAED model: Model for Analysis of Energy Demand. The methodology of the MAED model was originally developed by B. Chateau and B. Lapillonne of the Institute Economique et Juridique de l'Energie (IEJE) of the University of Grenoble, France, and was presented as the MEDEE model. Since then the MEDEE model has been further developed and adapted for modelling various energy demand systems. The IAEA adopted the MEDEE-2 model and incorporated important modifications to make it more suitable for application in developing countries, naming the result the MAED model. The first version of the MAED model was designed for DOS and was later converted to Windows. This manual presents the latest version of the MAED model. The most prominent feature of this version is its flexibility in representing the structure of energy consumption. The model now allows country-specific representations of energy consumption patterns using the MAED methodology. The user can now disaggregate energy consumption according to the needs and/or data availability in her/his country. As such, MAED has become a powerful tool for modelling widely diverse energy consumption patterns. This manual presents the model in detail and provides guidelines for its application.
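The disaggregation MAED supports is, at its core, bottom-up accounting: total demand is the sum over sectors of an activity level times an energy intensity. A minimal sketch with invented sector names and units, not MAED's actual input structure:

```python
def total_demand(sectors):
    """Bottom-up energy demand: sum over sectors of
    activity level x energy intensity."""
    return sum(activity * intensity for activity, intensity in sectors.values())

# Illustrative disaggregation (activity units and intensities are made up):
demand = total_demand({
    "households": (1_000_000, 0.004),  # dwellings x toe per dwelling
    "industry":   (50_000,    0.10),   # output units x toe per unit
    "transport":  (2_000_000, 0.001),  # vehicle-km x toe per vehicle-km
})
```

The flexibility the manual describes amounts to letting the user choose how many such sector/end-use rows to carry, depending on national data availability.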
Model for Analysis of Energy Demand (MAED-2)
International Nuclear Information System (INIS)
The IAEA has been supporting its Member States in the area of energy planning for sustainable development. Development and dissemination of appropriate methodologies and their computer codes are important parts of this support. This manual has been produced to facilitate the use of the MAED model: Model for Analysis of Energy Demand. The methodology of the MAED model was originally developed by B. Chateau and B. Lapillonne of the Institute Economique et Juridique de l'Energie (IEJE) of the University of Grenoble, France, and was presented as the MEDEE model. Since then the MEDEE model has been further developed and adapted for modelling various energy demand systems. The IAEA adopted the MEDEE-2 model and incorporated important modifications to make it more suitable for application in developing countries, naming the result the MAED model. The first version of the MAED model was designed for DOS and was later converted to Windows. This manual presents the latest version of the MAED model. The most prominent feature of this version is its flexibility in representing the structure of energy consumption. The model now allows country-specific representations of energy consumption patterns using the MAED methodology. The user can now disaggregate energy consumption according to the needs and/or data availability in her/his country. As such, MAED has become a powerful tool for modelling widely diverse energy consumption patterns. This manual presents the model in detail and provides guidelines for its application.
Interval process model and non-random vibration analysis
Jiang, C.; Ni, B. Y.; Liu, N. Y.; Han, X.; Liu, J.
2016-07-01
This paper develops an interval process model for time-varying or dynamic uncertainty analysis when information of the uncertain parameter is inadequate. By using the interval process model to describe a time-varying uncertain parameter, only its upper and lower bounds are required at each time point rather than its precise probability distribution, which is quite different from the traditional stochastic process model. A correlation function is defined for quantification of correlation between the uncertain-but-bounded variables at different times, and a matrix-decomposition-based method is presented to transform the original dependent interval process into an independent one for convenience of subsequent uncertainty analysis. More importantly, based on the interval process model, a non-random vibration analysis method is proposed for response computation of structures subjected to time-varying uncertain external excitations or loads. The structural dynamic responses thus can be derived in the form of upper and lower bounds, providing an important guidance for practical safety analysis and reliability design of structures. Finally, two numerical examples and one engineering application are investigated to demonstrate the feasibility of the interval process model and corresponding non-random vibration analysis method.
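The defining feature of interval analysis is that only bounds propagate, not distributions. For a monotone linear response y = a*x + b, the output bounds come from the endpoints, with the endpoints swapping when the gain is negative. A hedged toy example, not the paper's structural dynamics formulation:

```python
def linear_interval_response(a, b, x_lo, x_hi):
    """Bounds of y = a*x + b when x is known only to lie in [x_lo, x_hi].
    Evaluating both corners handles negative gain, where endpoints swap."""
    y1, y2 = a * x_lo + b, a * x_hi + b
    return min(y1, y2), max(y1, y2)

# Uncertain load bounded in [10, 14] entering a response with negative gain:
lo, hi = linear_interval_response(-2.0, 5.0, 10.0, 14.0)
```

The paper's non-random vibration analysis does this at every time point of an interval process, with a correlation function coupling the bounds across time; the endpoint logic above is the scalar kernel of that computation.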
Global Sensitivity Analysis of Environmental Models: Convergence, Robustness and Validation
Sarrazin, Fanny; Pianosi, Francesca; Khorashadi Zadeh, Farkhondeh; Van Griensven, Ann; Wagener, Thorsten
2015-04-01
Global Sensitivity Analysis aims to characterize the impact that variations in model input factors (e.g. the parameters) have on the model output (e.g. simulated streamflow). In sampling-based Global Sensitivity Analysis, the sample size has to be chosen carefully in order to obtain reliable sensitivity estimates while spending computational resources efficiently. Furthermore, insensitive parameters are typically identified through the definition of a screening threshold: the theoretical value of their sensitivity index is zero, but in a sampling-based framework they regularly take non-zero values. Little guidance is available for these two steps in environmental modelling, however. The objective of the present study is to support modellers in making appropriate choices, regarding both sample size and screening threshold, so that a robust sensitivity analysis can be implemented. We performed sensitivity analysis for the parameters of three hydrological models with increasing level of complexity (Hymod, HBV and SWAT), and tested three widely used sensitivity analysis methods (Elementary Effect Test or method of Morris, Regional Sensitivity Analysis, and Variance-Based Sensitivity Analysis). We defined criteria based on a bootstrap approach to assess three different types of convergence: the convergence of the value of the sensitivity indices, of the ranking (the ordering among the parameters) and of the screening (the identification of the insensitive parameters). We investigated the screening threshold through the definition of a validation procedure. The results showed that full convergence of the value of the sensitivity indices is not necessarily needed to rank or to screen the model input factors. Furthermore, typical values of the sample sizes that are reported in the literature can be well below the sample sizes that actually ensure convergence of ranking and screening.
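As a hedged illustration of what a sampling-based first-order sensitivity index estimates (a crude conditional-variance estimator, not the Morris or Sobol implementations the study tested): for y = 2*x1 + x2 with independent uniform inputs, the analytic first-order indices are S1 = 4/5 and S2 = 1/5, and a binned estimate from a finite sample recovers them only approximately. This finite-sample scatter is exactly why convergence monitoring and screening thresholds are needed:

```python
import random
import statistics

def first_order_index(xs, ys, bins=10):
    """Crude first-order sensitivity estimate: variance of the conditional
    means of y across equal-count bins of x, over the total variance of y."""
    pairs = sorted(zip(xs, ys))
    size = len(pairs) // bins
    cond_means = [statistics.mean(y for _, y in pairs[i*size:(i+1)*size])
                  for i in range(bins)]
    return statistics.pvariance(cond_means) / statistics.pvariance(ys)

random.seed(0)
n = 4000
x1 = [random.random() for _ in range(n)]
x2 = [random.random() for _ in range(n)]
y = [2*a + b for a, b in zip(x1, x2)]
s1 = first_order_index(x1, y)  # analytic value: 4/5
s2 = first_order_index(x2, y)  # analytic value: 1/5
```

In practice one would bootstrap these estimates at increasing sample sizes, as the study does, and declare convergence when the confidence intervals stabilize.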
Precise methods for conducted EMI modeling,analysis,and prediction
Institute of Scientific and Technical Information of China (English)
2008-01-01
Focusing on state-of-the-art conducted EMI prediction, this paper presents a noise-source lumped-circuit modeling and identification method, an EMI modeling method based on multiple-slope approximation of switching transitions, and a double Fourier integral method for modeling PWM conversion units, to achieve accurate modeling of the EMI noise source. Meanwhile, a new sensitivity analysis method, a general coupling model for steel ground loops, and a partial element equivalent circuit method are proposed to identify and characterize conducted EMI coupling paths. The EMI noise and propagation modeling provide an accurate prediction of conducted EMI in the entire frequency range (0–10 MHz) with good practicability and generality. Finally, a new measurement approach is presented to identify the surface current of a large-dimension metal shell. The proposed analytical modeling methodology is verified by experimental results.
Parametric sensitivity analysis of a test cell thermal model using spectral analysis
Mara, Thierry Alex; Garde, François
2012-01-01
The paper deals with an empirical validation of a building thermal model. We put the emphasis on sensitivity analysis and on the search for input/residual correlations to improve our model. In this article, we apply a sensitivity analysis technique in the frequency domain to point out the most important parameters of the model. Then, we compare measured and predicted data of indoor dry-air temperature. When the model is not accurate enough, recourse to time-frequency analysis is of great help to identify the inputs responsible for the major part of the error. In our approach, two samples of experimental data are required: the first one is used to calibrate our model, the second one to validate the optimized model.
ANALYSIS OF THE KQML MODEL IN MULTI-AGENT INTERACTION
Institute of Scientific and Technical Information of China (English)
刘海龙; 吴铁军
2001-01-01
Our analysis of the KQML (Knowledge Query and Manipulation Language) model yielded some conclusions on the knowledge level of communication in agent-oriented programs. First, the agent state and transition model were given for analyzing the necessary conditions for interaction with the synchronous and asynchronous KQML models respectively. Second, we analyzed the deadlock and starvation problems in KQML communication, and gave solutions. Finally, the advantages and disadvantages of the synchronous and asynchronous KQML models were listed respectively, and a principle for choosing between them was given.
Inverse Analysis and Modeling for Tunneling Thrust on Shield Machine
Directory of Open Access Journals (Sweden)
Qian Zhang
2013-01-01
With the rapid development of sensor and detection technologies, measured data analysis plays an increasingly important role in the design and control of heavy engineering equipment. The paper proposes a method for inverse analysis and modeling based on large volumes of on-site measured data, in which dimensional analysis and data mining techniques are combined. The method was applied to the modeling of the tunneling thrust on shield machines, and an explicit expression for thrust prediction was established. Combined with on-site data from a tunneling project in China, the inverse identification of model coefficients was carried out using the multiple regression method. The model residual was analyzed by statistical methods. By comparing the on-site data and the model's predicted results in two other projects with different tunneling conditions, the feasibility of the model was discussed. The work may provide a scientific basis for the rational design and control of shield tunneling machines, and also a new way to analyze mass on-site data from complex engineering systems with nonlinear, multivariable, time-varying characteristics.
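The inverse identification step, fitting coefficients of an explicit thrust expression to measured data by multiple regression, can be sketched as follows. The two-term model F = k1*z1 + k2*z2 and the data are hypothetical stand-ins for the paper's dimensionless groups; the closed-form normal equations are standard least squares:

```python
def fit_two_term(z1, z2, f):
    """Least-squares coefficients (k1, k2) for f ~ k1*z1 + k2*z2,
    solving the 2x2 normal equations in closed form."""
    s11 = sum(a * a for a in z1)
    s22 = sum(b * b for b in z2)
    s12 = sum(a * b for a, b in zip(z1, z2))
    sf1 = sum(a * y for a, y in zip(z1, f))
    sf2 = sum(b * y for b, y in zip(z2, f))
    det = s11 * s22 - s12 * s12
    return (sf1 * s22 - sf2 * s12) / det, (sf2 * s11 - sf1 * s12) / det

# Synthetic noise-free data generated with k1 = 1.5, k2 = 0.5:
z1 = [1.0, 2.0, 3.0, 4.0]
z2 = [1.0, 1.0, 2.0, 5.0]
f = [1.5 * a + 0.5 * b for a, b in zip(z1, z2)]
k1, k2 = fit_two_term(z1, z2, f)
```

With noise-free synthetic data the generating coefficients are recovered exactly; with real on-site data the residual analysis the paper describes becomes the essential step.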
Perturbation analysis for Monte Carlo continuous cross section models
International Nuclear Information System (INIS)
Sensitivity analysis, including both its forward and adjoint applications, collectively referred to hereinafter as Perturbation Analysis (PA), is an essential tool to complete Uncertainty Quantification (UQ) and Data Assimilation (DA). PA-assisted UQ and DA have traditionally been carried out for reactor analysis problems using deterministic as opposed to stochastic models for radiation transport. This is because PA requires many model executions to quantify how variations in input data, primarily cross sections, affect variations in a model's responses, e.g. detector readings, flux distribution, multiplication factor, etc. Although stochastic models are often sought for their higher accuracy, their repeated execution is at best computationally expensive and in reality intractable for typical reactor analysis problems involving many input data and output responses. Deterministic methods, however, achieve the computational efficiency needed to carry out the PA analysis by reducing problem dimensionality via various spatial and energy homogenization assumptions. This, however, introduces modeling error components into the PA results which propagate to the following UQ and DA analyses. The introduced errors are problem specific and therefore are expected to limit the applicability of UQ and DA analyses to reactor systems that satisfy the introduced assumptions. This manuscript introduces a new method to complete PA employing a continuous cross section stochastic model, performed in a computationally efficient manner. If successful, the modeling error components introduced by deterministic methods could be eliminated, thereby allowing for wider applicability of DA and UQ results. Two MCNP models demonstrate the application of the new method: a critical Pu sphere (Jezebel) and a Pu fast metal array (the Russian BR-1). The PA is completed for reaction rate densities, reaction rate ratios, and the multiplication factor. (author)
Computer Models for IRIS Control System Transient Analysis
Energy Technology Data Exchange (ETDEWEB)
Gary D. Storrick; Bojan Petrovic; Luca Oriani
2007-01-31
This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled “Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor” focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design – such as the lack of a detailed secondary system or I&C system designs – makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I&C development process
Computer Models for IRIS Control System Transient Analysis
International Nuclear Information System (INIS)
This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled 'Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor' focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design--such as the lack of a detailed secondary system or I and C system designs--makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I and C development process. Section
Hybrid reliability model for fatigue reliability analysis of steel bridges
Institute of Scientific and Technical Information of China (English)
曹珊珊; 雷俊卿
2016-01-01
A kind of hybrid reliability model is presented to solve fatigue reliability problems of steel bridges. The cumulative damage model is one of the models used in fatigue reliability analysis. The parameters of the model can be characterized as either probabilistic or interval variables. A two-stage hybrid reliability model is given, with a theoretical foundation and a solving algorithm, to solve hybrid reliability problems. The theoretical foundation is established by the consistency relationships of the interval reliability model and the probability reliability model with normally distributed variables. The solving process combines the definition of the interval reliability index with the probabilistic algorithm. With consideration of the parameter characteristics of the S–N curve, the cumulative damage model with hybrid variables is given based on the standards from different countries. Lastly, a case of the steel structure of the Neville Island Bridge is analyzed to verify the applicability of the hybrid reliability model in fatigue reliability analysis based on the AASHTO standard.
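The cumulative damage model underlying this analysis is the Palmgren-Miner rule combined with an S-N curve. A minimal sketch with invented curve constants (C and m here are not the AASHTO values):

```python
def miner_damage(stress_cycles, C=1e12, m=3.0):
    """Palmgren-Miner cumulative damage D = sum(n_i / N_i), with the S-N
    curve N(S) = C / S**m giving cycles to failure at stress range S.
    Failure is predicted when D reaches 1."""
    return sum(n / (C / S**m) for S, n in stress_cycles)

# Two stress ranges (MPa) with their applied cycle counts:
D = miner_damage([(100.0, 2e5), (50.0, 1e6)])
```

The hybrid treatment in the paper comes from letting C, m, or the cycle counts be probabilistic or interval quantities, so that D itself carries mixed uncertainty rather than being a single number as in this sketch.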
Projection-Based Reduced Order Modeling for Spacecraft Thermal Analysis
Qian, Jing; Wang, Yi; Song, Hongjun; Pant, Kapil; Peabody, Hume; Ku, Jentung; Butler, Charles D.
2015-01-01
This paper presents a mathematically rigorous, subspace projection-based reduced order modeling (ROM) methodology and an integrated framework to automatically generate reduced order models for spacecraft thermal analysis. Two key steps in the reduced order modeling procedure are described: (1) the acquisition of a full-scale spacecraft model in ordinary differential equation (ODE) and differential algebraic equation (DAE) form to resolve its dynamic thermal behavior; and (2) the ROM to markedly reduce the dimension of the full-scale model. Specifically, proper orthogonal decomposition (POD) in conjunction with the discrete empirical interpolation method (DEIM) and trajectory piecewise-linear (TPWL) methods are developed to address the strong nonlinear thermal effects due to coupled conductive and radiative heat transfer in the spacecraft environment. Case studies using NASA-relevant satellite models are undertaken to verify the capability and to assess the computational performance of the ROM technique in terms of speed-up and error relative to the full-scale model. The ROM exhibits excellent agreement with the full-scale model in spatiotemporal thermal profiles. These findings establish the feasibility of ROM to perform rational and computationally affordable thermal analysis, develop reliable thermal control strategies for spacecraft, and greatly reduce development cycle times and costs.
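The heart of POD is extracting dominant modes from a snapshot matrix and projecting the state onto them. A hedged, pure-Python sketch (power iteration for the leading mode only; real implementations use an SVD, and this tiny rank-one example is illustrative, not a spacecraft model):

```python
import math

def leading_pod_mode(snapshots, iters=100):
    """Leading POD mode of `snapshots` (equal-length state vectors),
    via power iteration on the snapshot covariance X @ X.T."""
    n = len(snapshots[0])
    v = [1.0] * n
    for _ in range(iters):
        # w = X @ (X.T @ v), applied without forming the covariance matrix
        coeffs = [sum(s[i] * v[i] for i in range(n)) for s in snapshots]
        w = [sum(c * s[i] for c, s in zip(coeffs, snapshots)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Rank-one "thermal" snapshots: every state is a multiple of one spatial shape,
# so a single POD mode reconstructs any snapshot exactly.
shape = [1.0, 2.0, 3.0]
snaps = [[a * x for x in shape] for a in (0.5, 1.0, 2.0)]
mode = leading_pod_mode(snaps)
c = sum(snaps[2][i] * mode[i] for i in range(3))   # projection coefficient
recon = [c * mode[i] for i in range(3)]            # one-mode reconstruction
err = max(abs(r - s) for r, s in zip(recon, snaps[2]))
```

For realistic nonlinear thermal models the projection alone is not enough, which is where DEIM and TPWL enter: they keep the cost of evaluating the nonlinear radiative terms proportional to the reduced dimension.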
Analysis of Sting Balance Calibration Data Using Optimized Regression Models
Ulbrich, N.; Bader, Jon B.
2010-01-01
Calibration data of a wind tunnel sting balance was processed using a candidate math model search algorithm that recommends an optimized regression model for the data analysis. During the calibration the normal force and the moment at the balance moment center were selected as independent calibration variables. The sting balance itself had two moment gages. Therefore, after analyzing the connection between calibration loads and gage outputs, it was decided to choose the difference and the sum of the gage outputs as the two responses that best describe the behavior of the balance. The math model search algorithm was applied to these two responses. An optimized regression model was obtained for each response. Classical strain gage balance load transformations and the equations of the deflection of a cantilever beam under load are used to show that the search algorithm's two optimized regression models are supported by a theoretical analysis of the relationship between the applied calibration loads and the measured gage outputs. The analysis of the sting balance calibration data set is a rare example of a situation in which terms of a regression model of a balance can be derived directly from first principles of physics. In addition, it is interesting to note that the search algorithm recommended the correct regression model term combinations using only a set of statistical quality metrics that were applied to the experimental data during the algorithm's term selection process.
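The idea of a candidate model search driven by statistical quality metrics can be illustrated in miniature. This hedged sketch ranks one-term candidate models by residual sum of squares (a stand-in for the richer metrics the actual algorithm uses) on data generated from a known term, so the search should recover that term:

```python
def best_single_term(x, y, candidates):
    """Pick the candidate basis function minimizing the residual sum of
    squares of the one-term least-squares fit y ~ k * f(x)."""
    best = None
    for name, f in candidates.items():
        fx = [f(v) for v in x]
        k = sum(a * b for a, b in zip(fx, y)) / sum(a * a for a in fx)
        rss = sum((yy - k * a) ** 2 for a, yy in zip(fx, y))
        if best is None or rss < best[2]:
            best = (name, k, rss)
    return best

x = [1.0, 2.0, 3.0, 4.0]
y = [3 * v**2 for v in x]  # data generated from the quadratic term
name, k, rss = best_single_term(x, y, {
    "x":   lambda v: v,
    "x^2": lambda v: v * v,
    "x^3": lambda v: v**3,
})
```

A production search also has to guard against overfitting when candidates are combined, which is why metrics beyond raw residuals (and term hierarchy rules) matter.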
Which biomechanical models are currently used in standing posture analysis?
Crétual, A
2015-11-01
In 1995, David Winter concluded that postural analysis of upright stance was often restricted to studying the trajectory of the center of pressure (CoP). However, postural control means regulation of the center of mass (CoM) with respect to CoP. As CoM is only accessible by using a biomechanical model of the human body, the present article proposes to determine which models are actually used in postural analysis, twenty years after Winter's observation. To do so, a selection of 252 representative articles dealing with upright posture and published during the four last years has been checked. It appears that the CoP model largely remains the most common one (accounting for nearly two thirds of the selection). Other models, CoP/CoM and segmental models (with one, two or more segments) are much less used. The choice of the model does not appear to be guided by the population studied. Conversely, while some confusion remains between postural control and the associated concepts of stability or strategy, this choice is better justified for real methodological concerns when dealing with such high-level parameters. Finally, the computation of the CoM continues to be a limitation in achieving a more complete postural analysis. This unfortunately implies that the model is chosen for technological reasons in many cases (choice being a euphemism here). Some effort still has to be made so that bioengineering developments allow us to go beyond this limit. PMID:26388359
Analysis of deterministic cyclic gene regulatory network models with delays
Ahsen, Mehmet Eren; Niculescu, Silviu-Iulian
2015-01-01
This brief examines a deterministic, ODE-based model for gene regulatory networks (GRN) that incorporates nonlinearities and time-delayed feedback. An introductory chapter provides some insights into molecular biology and GRNs. The mathematical tools necessary for studying the GRN model are then reviewed, in particular Hill functions and Schwarzian derivatives. One chapter is devoted to the analysis of GRNs under negative feedback with time delays, and a special case of a homogeneous GRN is considered. Asymptotic stability analysis of GRNs under positive feedback is then considered in a separate chapter, in which conditions leading to bi-stability are derived. Graduate and advanced undergraduate students and researchers in control engineering, applied mathematics, systems biology and synthetic biology will find this brief to be a clear and concise introduction to the modeling and analysis of GRNs.
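The delayed negative-feedback loop described above can be sketched numerically. The following is an illustrative simulation, with parameters entirely of our own choosing (not taken from the book), of a single gene repressing itself through a Hill function with a transcriptional delay; for a sufficiently steep Hill function and long delay, the equilibrium destabilizes and sustained oscillations appear:

```python
import numpy as np

# One-gene negative-feedback loop with transcriptional delay (illustrative
# parameters, not from the book): dx/dt = beta/(1 + (x(t-tau)/K)^n) - gamma*x
beta, K, n_hill, gamma, tau = 2.0, 1.0, 4, 1.0, 3.0
dt, t_end = 0.01, 60.0
steps, lag = int(t_end / dt), int(tau / dt)

x = np.zeros(steps)
x[0] = 0.1
for i in range(1, steps):
    delayed = x[i - 1 - lag] if i - 1 >= lag else x[0]  # constant history
    dxdt = beta / (1.0 + (delayed / K) ** n_hill) - gamma * x[i - 1]
    x[i] = x[i - 1] + dt * dxdt

# with a steep Hill function and a delay well past the Hopf threshold,
# the trajectory settles into sustained oscillation around the equilibrium
print(f"late-time range: {x[steps // 2:].min():.2f} .. {x[steps // 2:].max():.2f}")
```

For these parameters the equilibrium is x* = 1 (since x(1 + x^4) = 2 there), and the linearized delay equation crosses its stability boundary well below tau = 3, which is why the forward-Euler trajectory oscillates instead of settling.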
Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models
Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter
Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.
Uncertainty analysis in dissolved oxygen modeling in streams.
Hamed, Maged M; El-Beshry, Manar Z
2004-08-01
Uncertainty analysis in surface water quality modeling is an important issue. This paper presents a method based on the first-order reliability method (FORM) to assess the exceedance probability of a target dissolved oxygen concentration in a stream, using a Streeter-Phelps prototype model. Basic uncertainty in the input parameters is considered by representing them as random variables with prescribed probability distributions. Results obtained from FORM analysis compared well with those of the Monte Carlo simulation method. The analysis also presents the stochastic sensitivity of the probabilistic outcome in the form of uncertainty importance factors, and shows how they change with changing simulation time. Furthermore, a parametric sensitivity analysis was conducted to show the effect of selection of different probability distribution functions for the three most important parameters on the design point, exceedance probability, and importance factors.
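As a rough illustration of the kind of analysis the abstract describes, here is a plain Monte Carlo sketch of the exceedance probability for a Streeter-Phelps dissolved-oxygen sag. The distributions and parameter values below are invented for illustration; the paper itself applies FORM with its own inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

def do_deficit(t, kd, kr, L0, D0):
    # Streeter-Phelps oxygen deficit at time t (days)
    return kd * L0 / (kr - kd) * (np.exp(-kd * t) - np.exp(-kr * t)) \
        + D0 * np.exp(-kr * t)

# Illustrative (not from the paper) parameter uncertainties
n = 100_000
kd = rng.lognormal(np.log(0.35), 0.2, n)   # deoxygenation rate, 1/day
kr = rng.lognormal(np.log(0.70), 0.2, n)   # reaeration rate, 1/day
L0 = rng.normal(12.0, 1.5, n)              # initial BOD, mg/L
D0 = np.full(n, 1.0)                       # initial deficit, mg/L

do_sat, target, t = 9.0, 5.0, 2.0
do_conc = do_sat - do_deficit(t, kd, kr, L0, D0)
p_fail = np.mean(do_conc < target)         # exceedance probability estimate
print(f"P(DO < {target} mg/L at t = {t} d) ~ {p_fail:.3f}")
```

FORM approximates this same probability from a linearization at the design point, which is far cheaper than the sampling above but, as the paper notes, needs checking against Monte Carlo.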
Prediction Model of Data Envelopment Analysis with Undesirable Outputs
Institute of Scientific and Technical Information of China (English)
边馥萍; 范宇
2004-01-01
Data envelopment analysis (DEA) has become a standard non-parametric approach to productivity analysis, especially to relative efficiency analysis of decision making units (DMUs). Extended to the prediction field, it can solve prediction problems with multiple inputs and outputs, which cannot easily be solved by regression analysis. But the traditional DEA models cannot handle undesirable outputs, so in this paper the inherent relationship between goal programming and the DEA method, based on the relationship between multiple goal programming and goal programming, is explored, and a mixed DEA model is built in which all inputs and undesirable outputs decrease in different proportions while, at the same time, all desirable outputs increase in different proportions.
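For readers unfamiliar with the mechanics, the classical input-oriented CCR DEA model, which the paper's mixed goal-programming formulation extends to undesirable outputs, is a small linear program per DMU. A sketch with SciPy and toy data of our own:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0.
    X: (n, m) inputs, Y: (n, s) outputs, rows = DMUs."""
    n, m = X.shape
    s = Y.shape[1]
    # variables: [theta, lambda_1..lambda_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    # sum_j lam_j * x_j <= theta * x_j0   ->   X^T lam - theta * x0 <= 0
    A1 = np.c_[-X[j0].reshape(m, 1), X.T]
    # sum_j lam_j * y_j >= y_j0           ->   -Y^T lam <= -y0
    A2 = np.c_[np.zeros((s, 1)), -Y.T]
    res = linprog(c, A_ub=np.vstack([A1, A2]),
                  b_ub=np.r_[np.zeros(m), -Y[j0]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

X = np.array([[2.0], [4.0], [8.0]])  # one input per DMU (toy data)
Y = np.array([[2.0], [3.0], [4.0]])  # one output per DMU
print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])
```

With a single input and output, the CCR score reduces to each DMU's output/input ratio divided by the best ratio, which makes the toy case easy to check by hand.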
Uncertainty and sensitivity analysis for photovoltaic system modeling.
Energy Technology Data Exchange (ETDEWEB)
Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk
2013-12-01
We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice of one of these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly into a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
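The residual-sampling idea in this abstract can be mimicked in a few lines. Here the per-stage residual pools are synthetic Gaussian stand-ins (the real study uses empirical residual distributions, not a parametric assumption); the point is only the mechanics of bootstrapping each stage's residuals through a model chain:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical residuals (model - measurement) for two chained model steps,
# e.g. POA irradiance and effective irradiance, as fractions of the prediction
poa_resid = rng.normal(0.0, 0.02, 500)   # stand-in for empirical residuals
eff_resid = rng.normal(0.0, 0.01, 500)

def propagate(prediction, residual_pools, n=10_000):
    """Bootstrap residuals from each model stage and apply multiplicatively."""
    out = np.full(n, prediction, dtype=float)
    for pool in residual_pools:
        out *= 1.0 + rng.choice(pool, size=n, replace=True)
    return out

energy = propagate(5.0, [poa_resid, eff_resid])  # nominal 5.0 kWh/day (invented)
lo, hi = np.percentile(energy, [2.5, 97.5])
print(f"95% band: [{lo:.2f}, {hi:.2f}] kWh")
```

Because the stage residuals here are centered on zero, the band brackets the nominal prediction; a biased stage (like the 5% POA bias the paper reports) would shift the whole empirical distribution rather than widen it.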
Jiang, Runhua; Zhou, Zhenqiao; Lv, Xiaohua; Zeng, Shaoqun; Huang, Zhifeng; Zhou, Huaichun
2012-07-01
Thermal effects greatly influence the optical properties of acousto-optic deflectors (AODs). Thermal analysis plays an important role in modern AOD design. However, the lack of an effective method of analysis limits prediction of thermal performance. In this paper, we propose a finite element analysis model to analyze the thermal effects of a TeO(2)-based AOD. Both transducer heating and acoustic absorption are considered as thermal sources. The anisotropy of sound propagation is taken into account in determining the acoustic absorption. Based on this model, a transient thermal analysis is performed using ANSYS software. The spatial temperature distributions in the crystal and the temperature changes over time are acquired. The simulation results are validated by experimental results. The effects of the heat source and of heat convection on the temperature distribution are discussed. This numerical model and analytical method of thermal analysis should be helpful in the thermal design and practical applications of AODs.
Modelling and analysis of multiagent systems concerning cooperation problems
Reinhold, Thomas
2005-01-01
The subject of this diploma thesis is the modelling and analysis of mechanisms that enable multiagent systems to establish communication relations and to use them to control interaction. With regard to the emergence of such symbol systems, one foundation of this work is the realization that pure coordination problems are not suited to driving the evolution of "higher communication capabilities". With this in mind, the analysis uses a class of problems with explicit conflicts of inte...
Dendritic spine shape analysis using disjunctive normal shape models
Ghani, Muhammad Usman; Mesadi, Fitsum; Demir Kanık, Sümerya Ümmühan; Argunşah, Ali Özgür; Israely, Inbal; Ünay, Devrim; Taşdizen, Tolga; Çetin, Müjdat
2016-01-01
Analysis of dendritic spines is an essential task to understand the functional behavior of neurons. Their shape variations are known to be closely linked with neuronal activities. Spine shape analysis in particular, can assist neuroscientists to identify this relationship. A novel shape representation has been proposed recently, called Disjunctive Normal Shape Models (DNSM). DNSM is a parametric shape representation and has proven to be successful in several segmentation problems. In this pap...
Empirical validation and comparison of models for customer base analysis
Persentili Batislam, Emine; Denizel, Meltem; Filiztekin, Alpay
2007-01-01
The benefits of retaining customers lead companies to search for means to profile their customers individually and track their retention and defection behaviors. To this end, the main issues addressed in customer base analysis are identification of customer active/inactive status and prediction of future purchase levels. We compare the predictive performance of Pareto/NBD and BG/NBD models from the customer base analysis literature — in terms of repeat purchase levels and active status — usi...
Ducted propeller performance analysis using a boundary element model
Salvatore, Francesco; Calcagni, Danilo; Greco, Luca
2006-01-01
This report describes the computational analysis of the inviscid flow around a ducted propeller using a BEM model. The activity is performed in the framework of a research program co-funded by the European Union under the "SUPERPROP" Project TST4-CT-2005-516219. The theoretical and computational methodology is described, and results of a validation exercise on several test cases are presented and discussed. In particular, the proposed formulation is applied to the analysis of ducted propellers...
Experimental and numerical analysis of a knee endoprosthesis numerical model
Directory of Open Access Journals (Sweden)
L. Zach
2016-07-01
The aim of this study is to create and verify a numerical model for a Medin Modular orthopedic knee-joint implant by investigating contact pressure, its distribution and contact surfaces. An experiment using Fuji Prescale pressure-sensitive films and a finite element analysis (FEA) using Abaqus software were carried out. The experimental data were evaluated using a specially designed program and were compared with the results of the analysis. The evaluation program was constructed on the basis of results obtained from a supplementary calibration experiment. The applicability of the numerical model for predicting real endoprosthesis behavior was proven on the basis of the good correlation between experiment and analysis.
Multivariable modeling and multivariate analysis for the behavioral sciences
Everitt, Brian S
2009-01-01
Multivariable Modeling and Multivariate Analysis for the Behavioral Sciences shows students how to apply statistical methods to behavioral science data in a sensible manner. Assuming some familiarity with introductory statistics, the book analyzes a host of real-world data to provide useful answers to real-life issues.The author begins by exploring the types and design of behavioral studies. He also explains how models are used in the analysis of data. After describing graphical methods, such as scatterplot matrices, the text covers simple linear regression, locally weighted regression, multip
A comparative analysis of multi-output frontier models
Institute of Scientific and Technical Information of China (English)
Tao ZHANG; Eoghan GARVEY
2008-01-01
Recently, there have been more debates on the methods of measuring efficiency. The main objective of this paper is to make a sensitivity analysis for different frontier models and to compare the results obtained from the different methods of estimating a multi-output frontier for a specific application. The methods include the stochastic distance function frontier, the stochastic ray frontier, and data envelopment analysis. The stochastic frontier regressions with and without the inefficiency effects model are also compared and tested. The results indicate that there are significant correlations between the results obtained from the alternative estimation methods.
Analysis of a Model for Computer Virus Transmission
Directory of Open Access Journals (Sweden)
Peng Qin
2015-01-01
Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers to the network and the removing of old computers from the network are considered. Meanwhile, the computers are equipped with antivirus software on the computer network. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis to control the spread of computer virus.
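A minimal numerical sketch of such a compartment model, with recruitment of new computers and removal of old ones, shows the endemic equilibrium that a stability analysis predicts. All parameter values here are our own illustrative choices, not the paper's:

```python
import numpy as np

# Illustrative SIR-type virus model with recruitment of new computers (b)
# and removal of old ones (d); parameters are assumptions, not the paper's.
b, d, beta, r = 1.0, 0.1, 0.05, 0.1     # recruit, retire, contact, cure rates
S, I = 10.0, 0.1                        # start near the disease-free state
dt = 0.01
for _ in range(int(400 / dt)):          # forward-Euler integration
    dS = b - beta * S * I - d * S
    dI = beta * S * I - (d + r) * I
    S, I = S + dt * dS, I + dt * dI

R0 = beta * b / (d * (d + r))           # basic reproduction number
print(f"R0 = {R0:.2f}, endemic state S ~ {S:.2f}, I ~ {I:.2f}")
# theory: for R0 > 1 the endemic equilibrium is
#   S* = (d + r) / beta,  I* = (b - d * S*) / (beta * S*)
```

With these numbers R0 = 2.5 > 1, so the disease-free equilibrium is unstable and the simulation spirals into the endemic state (S* = 4, I* = 3), matching the closed-form equilibrium.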
First experience with the new ATLAS analysis model
Cranshaw, Jack; The ATLAS collaboration
2016-01-01
During the Long shutdown of the LHC, the ATLAS collaboration overhauled its analysis model based on experience gained during Run 1. The main components are a new analysis format and Event Data Model which can be read directly by ROOT, as well as a "Derivation Framework" that takes the Petabyte-scale output from ATLAS reconstruction and produces smaller samples targeted at specific analyses, using the central production system. We will discuss the technical and operational aspects of this new system and review its performance during the first year of 13 TeV data taking.
Ryan D Ward; Simpson, Eleanor H.; Kandel, Eric R.; Balsam, Peter D.
2011-01-01
In recent years it has become possible to develop animal models of psychiatric disease in genetically modified mice. While great strides have been made in the development of genetic and neurobiological tools with which to model psychiatric disease, elucidation of neural and molecular mechanisms thought to underlie behavioral phenotypes has been hindered by an inadequate analysis of behavior. This is unfortunate given the fact that the experimental analysis of behavior has created powerful met...
Performance Analysis of Hybrid Forecasting Model in Stock Market Forecasting
Directory of Open Access Journals (Sweden)
Mahesh S. Khadka
2012-09-01
This paper presents a performance analysis of a hybrid model, comprising concordance measures and Genetic Programming (GP), for forecasting financial markets, compared against some existing models. This scheme can be used for in-depth analysis of the stock market. Different measures of concordance, such as Kendall's Tau, Gini's Mean Difference, Spearman's Rho, and a weak interpretation of concordance, are used to search the past for patterns that look similar to the present. Genetic Programming is then used to match the past trend to the present trend as closely as possible. The Genetic Program then estimates what will happen next based on what happened next in the past. The concept is validated using financial time series data (S&P 500 and NASDAQ indices) as sample data sets. The forecasted result is then compared with the standard ARIMA model and other models to analyse its performance.
Accounting for Errors in Model Analysis Theory: A Numerical Approach
Sommer, Steven R.; Lindell, Rebecca S.
2004-09-01
By studying the patterns of a group of individuals' responses to a series of multiple-choice questions, researchers can utilize Model Analysis Theory to create a probability distribution of mental models for a student population. The eigenanalysis of this distribution yields information about what mental models the students possess, as well as how consistently they utilize said mental models. Although the theory considers the probabilistic distribution to be fundamental, there exist opportunities for random errors to occur. In this paper we will discuss a numerical approach for mathematically accounting for these random errors. As an example of this methodology, analysis of data obtained from the Lunar Phases Concept Inventory will be presented. Limitations and applicability of this numerical approach will be discussed.
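Model Analysis Theory's central object is a class "density matrix" built from per-student model-state vectors, whose eigenanalysis gives the dominant mental-model mixture. Below is a sketch following the construction as we understand it from Bao and Redish's formulation; the response counts are invented:

```python
import numpy as np

# Hypothetical counts: each row is one student, columns = how many of 10
# questions were answered consistently with mental model 1, 2, 3 (assumed data).
counts = np.array([[8, 2, 0], [7, 2, 1], [3, 6, 1], [2, 7, 1], [5, 5, 0]])
m = counts.sum(axis=1, keepdims=True)          # questions per student

U = np.sqrt(counts / m)                        # per-student model state vectors
rho = U.T @ U / len(U)                         # class density matrix (trace 1)

w, v = np.linalg.eigh(rho)                     # eigenvalues in ascending order
print("trace:", rho.trace().round(3))
print("dominant eigenvalue:", w[-1].round(3))
print("dominant model mix:", np.abs(v[:, -1]).round(3))
```

A dominant eigenvalue near 1 indicates that the class behaves like a single consistent model state, while eigenvalues spread toward 1/3 (for three models) indicate a mixed, inconsistent population.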
Archetypal Analysis for Modeling Multisubject fMRI Data
DEFF Research Database (Denmark)
Hinrich, Jesper Løve; Bardenfleth, Sophia Elizabeth; Røge, Rasmus;
2016-01-01
to significant intersubject variability in the signal structure and noise. Group-level modeling is typically performed using component decompositions such as independent component analysis (ICA), which represent data as a linear combination of latent brain patterns, or using clustering models, where data… to group-level modeling and an alternative to preexisting methods that account for inter-subject variability by extracting individual maps as a postprocessing step (e.g., dual-regression ICA), or assuming spatial dependency of maps across subjects (e.g., independent vector analysis). MS-AA shows robust… performance when modelling archetypes for a motor task experiment. The procedure extracts a 'seed map' across subjects, used to provide brain parcellations with subject-specific temporal profiles. Our approach thus decomposes multisubject fMRI data into distinct interpretable component archetypes that may…
A finite element model for nonlinear structural earthquake analysis
International Nuclear Information System (INIS)
Towards the analysis of damage in reinforced concrete structures subjected to earthquakes, we propose a numerical model capable of describing the non-linear behaviour of reinforced concrete beams and columns under alternating cyclic loading, which can also be used efficiently in dynamic analysis: with the assumption of a local uniaxial state of stress, we are able to obtain the rapidity needed for the time integration of the dynamic equations of equilibrium for real structures, within a time interval corresponding to a seismic action. The model is presented: path-dependent constitutive material law and finite element formulation. A short example of validation serves to evaluate some characteristics of the model. A methodology is then developed to extend the applicability of the model to limit cases regarding slenderness and semi-rigid limit conditions. (author)
Causal Analysis for Performance Modeling of Computer Programs
Directory of Open Access Journals (Sweden)
Jan Lemeire
2007-01-01
Causal modeling and the accompanying learning algorithms provide useful extensions for in-depth statistical investigation and automation of performance modeling. We enlarged the scope of existing causal structure learning algorithms by using the form-free information-theoretic concept of mutual information and by introducing the complexity criterion for selecting direct relations among equivalent relations. The underlying probability distribution of the experimental data is estimated by kernel density estimation. We then report on the benefits of a dependency analysis and the decompositional capacities of causal models. Useful qualitative models, providing insight into the role of every performance factor, were inferred from experimental data. This paper reports on the results for an LU decomposition algorithm and on the study of the parameter sensitivity of the Kakadu implementation of the JPEG-2000 standard. Next, the analysis was used to search for generic performance characteristics of the applications.
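The form-free mutual-information criterion mentioned above can be illustrated with a simple plug-in histogram estimator on synthetic "performance data". Both the estimator and the data are our own stand-ins (the paper itself uses kernel density estimation):

```python
import numpy as np

rng = np.random.default_rng(3)

def mutual_info(x, y, bins=16):
    """Plug-in histogram estimate of I(X;Y) in bits."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])).sum())

# synthetic 'performance data': runtime depends on input size, not on a dummy flag
size = rng.uniform(0, 1, 5000)
dummy = rng.uniform(0, 1, 5000)
runtime = size ** 2 + rng.normal(0, 0.05, 5000)

print("I(size; runtime)  =", round(mutual_info(size, runtime), 2), "bits")
print("I(dummy; runtime) =", round(mutual_info(dummy, runtime), 2), "bits")
```

Because mutual information makes no assumption about the functional form of the dependency, it picks up the nonlinear size-runtime relation while the irrelevant factor scores near zero (up to the small positive bias of the plug-in estimator).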
Accuracy Analysis for SST Gravity Field Model in China
Institute of Scientific and Technical Information of China (English)
LUO Jia; LUO Zhicai; ZOU Xiancai; WANG Haihong
2006-01-01
Taking China as the region for test, the potential of the new satellite gravity technique, satellite-to-satellite tracking, for improving the accuracy of regional gravity field models is studied. With WDM94 as reference, the gravity anomaly residuals of three models, the two latest GRACE global gravity field models (EIGEN_GRACE02S, GGM02S) and EGM96, are computed and compared. The causes for the differences among the residuals of the three models are discussed. The comparison between the residuals shows that in the selected region, EIGEN_GRACE02S or GGM02S is better than EGM96 in the lower-degree part (below degree 110). Additionally, through the analysis of the model gravity anomaly residuals, it is found that some systematic errors with periodic properties exist in the higher-degree parts of the EIGEN and GGM models; these results can also be taken as references in the validation of the SST gravity data.
Analysis and Comparison of Typical Models within Distribution Network Design
DEFF Research Database (Denmark)
Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.
This paper investigates the characteristics of typical optimisation models within Distribution Network Design. During the paper fourteen models known from the literature will be thoroughly analysed. Through this analysis a schematic approach to categorisation of distribution network design models… for educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as being a manual or recipe when constructing such a model.… are covered in the categorisation include fixed vs. general networks, specialised vs. general nodes, linear vs. nonlinear costs, single vs. multi commodity, uncapacitated vs. capacitated activities, single vs. multi modal and static vs. dynamic. The models examined address both strategic and tactical planning...
Probability bounds analysis for nonlinear population ecology models.
Enszer, Joshua A; Andrei Măceș, D; Stadtherr, Mark A
2015-09-01
Mathematical models in population ecology often involve parameters that are empirically determined and inherently uncertain, with probability distributions for the uncertainties not known precisely. Propagating such imprecise uncertainties rigorously through a model to determine their effect on model outputs can be a challenging problem. We illustrate here a method for the direct propagation of uncertainties represented by probability bounds through nonlinear, continuous-time, dynamic models in population ecology. This makes it possible to determine rigorous bounds on the probability that some specified outcome for a population is achieved, which can be a core problem in ecosystem modeling for risk assessment and management. Results can be obtained at a computational cost that is considerably less than that required by statistical sampling methods such as Monte Carlo analysis. The method is demonstrated using three example systems, with focus on a model of an experimental aquatic food web subject to the effects of contamination by ionic liquids, a new class of potentially important industrial chemicals.
Network and adaptive system of systems modeling and analysis.
Energy Technology Data Exchange (ETDEWEB)
Lawton, Craig R.; Campbell, James E. Dr. (.; .); Anderson, Dennis James; Eddy, John P.
2007-05-01
This report documents the results of an LDRD program entitled ''Network and Adaptive System of Systems Modeling and Analysis'' that was conducted during FY 2005 and FY 2006. The purpose of this study was to determine and implement ways to incorporate network communications modeling into existing System of Systems (SoS) modeling capabilities. Current SoS modeling, particularly for the Future Combat Systems (FCS) program, is conducted under the assumption that communication between the various systems is always possible and occurs instantaneously. A more realistic representation of these communications allows for better, more accurate simulation results. The current approach to meeting this objective has been to use existing capabilities to model network hardware reliability and adding capabilities to use that information to model the impact on the sustainment supply chain and operational availability.
[Stability Analysis of Susceptible-Infected-Recovered Epidemic Model].
Pan, Duotao; Shi, Hongyan; Huang, Mingzhong; Yuan, Decheng
2015-10-01
With the range of application of computational biology and systems biology gradually expanding, the complexity of bioprocess models has also increased. To address this difficult problem, suitable alternative analysis methods must be introduced. Taking the dynamic model of an epidemic control process as the research object, we established an evaluation model in our laboratory. First, the model was solved with a nonlinear programming method, with good results. Based on biochemical systems theory, the ODE dynamic model was then transformed into an S-system. The eigenvalues of the model showed that the system was stable and exhibited oscillation. Next, the sensitivities of the rate constants and the logarithmic gains of the three key parameters were analyzed, as well as the robustness of the system. The results indicate that biochemical systems theory can be applied more widely in different fields. PMID:26964304
An Overview of Path Analysis: Mediation Analysis Concept in Structural Equation Modeling
Jenatabadi, Hashem Salarzadeh
2015-01-01
This paper provides a tutorial discussion on path analysis structure with the concept of structural equation modelling (SEM). The paper delivers an introduction to the path analysis technique and explains how to analyze data with this kind of statistical methodology, especially with a mediator in the research model. The intended audience is statisticians, mathematicians, or methodologists who either know about SEM or simple basic statistics especially in regression and linear/nonline...
Pattern mixture models for the analysis of repeated attempt designs.
Daniels, Michael J; Jackson, Dan; Feng, Wei; White, Ian R
2015-12-01
It is not uncommon in follow-up studies to make multiple attempts to collect a measurement after baseline. Recording whether these attempts are successful or not provides useful information for the purposes of assessing the missing at random (MAR) assumption and facilitating missing not at random (MNAR) modeling. This is because measurements from subjects who provide this data after multiple failed attempts may differ from those who provide the measurement after fewer attempts. This type of "continuum of resistance" to providing a measurement has hitherto been modeled in a selection model framework, where the outcome data is modeled jointly with the success or failure of the attempts given these outcomes. Here, we present a pattern mixture approach to model this type of data. We re-analyze the repeated attempt data from a trial that was previously analyzed using a selection model approach. Our pattern mixture model is more flexible and is more transparent in terms of parameter identifiability than the models that have previously been used to model repeated attempt data and allows for sensitivity analysis. We conclude that our approach to modeling this type of data provides a fully viable alternative to the more established selection model. PMID:26149119
Performance Analysis of a 3D Ionosphere Tomographic Model
Institute of Scientific and Technical Information of China (English)
Liu Zhi-zhao; Gao Yang
2003-01-01
A 3D high-precision ionospheric model is developed based on the tomography technique. This tomographic model employs GPS data observed by an operational network of dual-frequency GPS receivers. The methodology of developing a 3D ionospheric tomography model is briefly summarized. However, emphasis is put on the analysis and evaluation of how the accuracy of 3D ionosphere modeling varies with the GPS data cutoff angle. Three typical cutoff angle values (15°, 20° and 25°) are tested. For each tested cutoff angle, the performance of the 3D ionospheric model constructed using the tomography technique is assessed by calibrating the model-predicted ionospheric TEC against the GPS-measured TEC and by employing the model-predicted TEC in a practical GPS positioning application: single point positioning (SPP). Test results indicate the 3D model-predicted VTEC improves in accuracy by about 0.4 TECU when the cutoff angle rises from 15° to 20°; however, no apparent improvement is found from 20° to 25°. The model's improvement is also validated by the better SPP accuracy of the 3D model compared with its counterpart, the dual-frequency model, in the 20° and 25° cases.
Mathematical Modeling and Analysis of Classified Marketing of Agricultural Products
Institute of Scientific and Technical Information of China (English)
WANG, Fengying
2014-01-01
Classified marketing of agricultural products was analyzed using the Logistic Regression Model. This method can take full advantage of the information in an agricultural product database to identify the factors that influence how well agricultural products sell, and to analyze them quantitatively. Using this model, it is also possible to predict sales of agricultural products and provide a reference for mapping out individualized sales strategies for popularizing agricultural products.
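A bare-bones version of such a logistic-regression classification can be sketched on synthetic product data. The features, coefficients and labels below are invented stand-ins for the database described above; the method, not the data, is the point:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical features per product: e.g. price index, freshness score,
# promotion spend; label 1 = "best selling" (synthetic stand-in data).
X = rng.normal(size=(200, 3))
true_w = np.array([-1.5, 2.0, 1.0])                # assumed effect sizes
y = (X @ true_w + rng.logistic(size=200) > 0).astype(float)

def fit_logistic(X, y, lr=0.1, epochs=2000):
    Xb = np.c_[np.ones(len(X)), X]                 # add intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))          # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)          # gradient of the log-loss
    return w

w = fit_logistic(X, y)
pred = (np.c_[np.ones(len(X)), X] @ w) > 0
print("training accuracy:", (pred == y.astype(bool)).mean())
```

The fitted coefficients play the role the abstract describes: their signs and magnitudes indicate which factors push a product toward the "best selling" class, and the fitted model can then score new products.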
Configurational analysis as an alternative way of modeling sales response
Aarnio, Susanna
2013-01-01
Objectives of the Study The objectives of the study are both managerial and methodological. On the one hand, the aim is to apply a novel research approach, fuzzy set qualitative comparative analysis or fsQCA (see e.g. Ragin, 2000; Rihoux & Ragin, 2009), to sales response modeling and thus create a response model for the case company to identify complex, configurational causalities affecting the company's sales volumes within the chosen product category. On the other hand, the goal is to ...
A Numerical Model for Torsion Analysis of Composite Ship Hulls
Directory of Open Access Journals (Sweden)
Ionel Chirica
2012-01-01
A new methodology based on a macroelement model is proposed in this paper for the torsional behaviour of a ship hull made of composite material. A computer program has been developed for the elastic analysis of linear torsion. The results are compared with results from the licensed FEM software COSMOS/M and with measurements on a simplified scale model of a container ship made of composite materials.
Stability Analysis of Some Nonlinear Anaerobic Digestion Models
Ivan Simeonov; Sette Diop
2010-01-01
The paper deals with local asymptotic stability analysis of some mass balance dynamic models (based on one- and on two-stage reaction schemes) of anaerobic digestion (AD) in a CSTR. The equilibrium states for models based on one-stage (with Monod, Contois and Haldane shapes for the specific growth rate) and on two-stage (only with Monod shapes for the specific growth rates of the acidogenic and methanogenic bacterial populations) reaction schemes have been determined by solving sets of nonl...
Bifurcation analysis of parametrically excited bipolar disorder model
Nana, Laurent
2009-02-01
Bipolar II disorder is characterized by alternating hypomanic and major depressive episodes. We model the periodic mood variations of a bipolar II patient with a negatively damped harmonic oscillator. The medications administered to the patient are modeled via a forcing function that is capable of stabilizing the mood variations and of varying their amplitude. We analyze analytically, using a perturbation method, the amplitude and stability of limit cycles and check this analysis with numerical simulations.
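The negatively damped oscillator picture can be checked with a quick integration. Here medication is caricatured as simply flipping the sign of the damping coefficient (the paper instead uses a periodic forcing term, and all parameter values below are invented):

```python
import numpy as np

def simulate(mu, x0=1.0, v0=0.0, w=2 * np.pi / 30, dt=0.001, t_end=120.0):
    """Forward-Euler integration of x'' = mu*x' - w^2*x.
    mu > 0 is the 'negative damping' of untreated mood swings (~30-day period);
    medication is crudely modeled here as shifting mu below zero."""
    x, v = x0, v0
    peak = abs(x0)
    for _ in range(int(t_end / dt)):
        x, v = x + dt * v, v + dt * (mu * v - w * w * x)
        peak = max(peak, abs(x))
    return abs(x), peak

final_untreated, peak_untreated = simulate(mu=0.02)   # swings grow over time
final_treated, _ = simulate(mu=-0.05)                 # swings decay
print(f"untreated peak {peak_untreated:.2f}, treated final |x| {final_treated:.3f}")
```

With negative damping the amplitude envelope grows like exp(mu*t/2), which is the instability the forcing/medication term in the paper is designed to tame.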
Advanced accident sequence precursor analysis level 1 models
Energy Technology Data Exchange (ETDEWEB)
Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.; Schroeder, J.A.; Siu, N.O. [Idaho National Engineering Lab., Idaho National Lab., Idaho Falls, ID (United States)
1996-03-01
INEL has been involved in the development of plant-specific Accident Sequence Precursor (ASP) models for the past two years. These models were developed for use with the SAPHIRE suite of PRA computer codes. They contained event tree/linked fault tree Level 1 risk models for the following initiating events: general transient, loss-of-offsite-power, steam generator tube rupture, small loss-of-coolant-accident, and anticipated transient without scram. Early in 1995 the ASP models were revised based on review comments from the NRC and an independent peer review. These models were released as Revision 1. The Office of Nuclear Regulatory Research has sponsored several projects at the INEL this fiscal year to further enhance the capabilities of the ASP models. The Revision 2 models incorporate more detailed plant information concerning plant response to station blackout conditions, information on battery life, and other unique features gleaned from an Office of Nuclear Reactor Regulation quick review of the Individual Plant Examination submittals. These models are currently being delivered to the NRC as they are completed. A related project is a feasibility study and model development of low power/shutdown (LP/SD) and external event extensions to the ASP models. This project will establish criteria for selection of LP/SD and external initiator operational events for analysis within the ASP program. Prototype models for each pertinent initiating event (loss of shutdown cooling, loss of inventory control, fire, flood, seismic, etc.) will be developed. A third project concerns development of enhancements to SAPHIRE. In relation to the ASP program, a new SAPHIRE module, GEM, was developed as a specific user interface for performing ASP evaluations. This module greatly simplifies the analysis process for determining the conditional core damage probability for a given combination of initiating events and equipment failures or degradations.
Modeling and analysis of electrorheological suspensions in shear flow.
Seo, Youngwook P; Seo, Yongsok
2012-02-14
A model capable of describing the flow behavior of electrorheological (ER) suspensions under different electric field strengths and over the full range of shear rates is proposed. Structural reformation in the low shear rate region is investigated, where parts of the material are in an undeformed state while aligned structures reform under the shear force. The model's predictions were compared with the experimental data of some ER fluids as well as with the CCJ (Cho-Choi-Jhon) model. This simple model's predictions of suspension flow behavior with subsequent aligned-structure reformation agreed well with the experimental data, both quantitatively and qualitatively. The proposed model plausibly predicted the static yield stress, whereas the CCJ model and the Bingham model predicted only the dynamic yield stress. The master curve describing the apparent viscosity was obtained by appropriately scaling both axes, which showed that a combination of dimensional analysis and flow curve analysis using the proposed model yields a quantitatively and qualitatively precise description of ER fluid rheological behavior based on relatively few experimental measurements.
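The Bingham comparison above can be made concrete. The sketch below, with illustrative parameter values that are not fitted to any ER fluid from the study, shows why a yield-stress model such as Bingham's gives an apparent viscosity that diverges at low shear rates, the regime where the proposed model and the CCJ model differ:

```python
import numpy as np

def bingham_stress(gamma_dot, tau_y, eta_p):
    """Bingham model: shear stress = dynamic yield stress + plastic viscosity * shear rate."""
    return tau_y + eta_p * gamma_dot

def apparent_viscosity(gamma_dot, tau_y, eta_p):
    """Apparent viscosity tau/gamma_dot diverges as the shear rate approaches zero."""
    return bingham_stress(gamma_dot, tau_y, eta_p) / gamma_dot

gamma_dot = np.logspace(-2, 3, 50)                      # shear rates, 1/s
tau = bingham_stress(gamma_dot, tau_y=50.0, eta_p=0.1)  # illustrative Pa and Pa*s values
eta_app = apparent_viscosity(gamma_dot, tau_y=50.0, eta_p=0.1)
```

Plotting `eta_app` against `gamma_dot` on log-log axes reproduces the characteristic shear-thinning flow curve of a yield-stress fluid.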
Hydraulic modeling support for conflict analysis: The Manayunk canal revisited
International Nuclear Information System (INIS)
This paper presents a study which used a standard hydraulic computer model to generate detailed design information to support conflict analysis of a water resource use issue. As an extension of previous studies, the conflict analysis in this case included several scenarios for stability analysis, all of which reached the conclusion that compromising, shared access to the available water resources would result in the most benefits to society. This expected equilibrium outcome was found to maximize benefit-cost estimates. 17 refs., 1 fig., 2 tabs
Manual versus digital Landsat analysis for modeling river flooding
Philipson, W. R.; Hafker, W. R.
1981-01-01
The comparative value of manual versus digital image analysis for determining flood boundaries is being examined in a study of the use of Landsat data for modeling flooding of the Black River in northern New York. The work is an extension of an earlier study in which Black River flooding was assessed through visually interpreted, multi-date Landsat band 7 images. Based on the results to date, it appears that neither color-additive viewing nor digital analysis of Landsat data provides an improvement in accuracy over visual analysis of band 7 images for delineating the boundaries of flood-affected areas.
Biological Jumping Mechanism Analysis and Modeling for Frog Robot
Institute of Scientific and Technical Information of China (English)
Meng Wang; Xi-zhe Zang; Ji-zhuang Fan; Jie Zhao
2008-01-01
This paper presents a mechanical model of a jumping robot based on a biological mechanism analysis of the frog. Through biological observation and kinematic analysis, the frog jump is divided into a take-off phase, an aerial phase and a landing phase. We find similar trajectories of the hindlimb joints during the jump, the important effect of the foot during take-off and the role of the forelimb in supporting the body. Based on these observations, the frog jump is simplified and a mechanical model is put forward. The robot leg is represented by a 4-bar spring/linkage mechanism model, which has three Degrees of Freedom (DOF) at the hip joint and one passive DOF at the tarsometatarsal joint on the foot. The shoulder and elbow joints each have one DOF for the balancing function of the arm. The ground reaction force of the model is analyzed and compared with that of the frog during take-off. The results show that the model has the same advantages of low likelihood of premature lift-off and high efficiency as the frog. The analysis results and the model can be employed to develop and control a robot capable of mimicking the jumping behavior of the frog.
Predictive error analysis for a water resource management model
Gallagher, Mark; Doherty, John
2007-02-01
In calibrating a model, a set of parameters is assigned to the model which will be employed for the making of all future predictions. If these parameters are estimated through solution of an inverse problem, formulated to be properly posed through either pre-calibration or mathematical regularisation, then solution of this inverse problem will, of necessity, lead to a simplified parameter set that omits the details of reality, while still fitting historical data acceptably well. Furthermore, estimates of parameters so obtained will be contaminated by measurement noise. Both of these phenomena will lead to errors in predictions made by the model, with the potential for error increasing with the hydraulic property detail on which the prediction depends. Integrity of model usage demands that model predictions be accompanied by some estimate of the possible errors associated with them. The present paper applies theory developed in a previous work to the analysis of predictive error associated with a real world, water resource management model. The analysis offers many challenges, including the fact that the model is a complex one that was partly calibrated by hand. Nevertheless, it is typical of models which are commonly employed as the basis for the making of important decisions, and for which such an analysis must be made. The potential errors associated with point-based and averaged water level and creek inflow predictions are examined, together with the dependence of these errors on the amount of averaging involved. Error variances associated with predictions made by the existing model are compared with "optimized error variances" that could have been obtained had calibration been undertaken in such a way as to minimize predictive error variance. The contributions by different parameter types to the overall error variance of selected predictions are also examined.
Automated quantitative gait analysis in animal models of movement disorders
Directory of Open Access Journals (Sweden)
Vandeputte Caroline
2010-08-01
Full Text Available Abstract Background Accurate and reproducible behavioral tests in animal models are of major importance in the development and evaluation of new therapies for central nervous system disease. In this study we investigated for the first time gait parameters of rat models for Parkinson's disease (PD), Huntington's disease (HD) and stroke using the Catwalk method, a novel automated gait analysis test. Static and dynamic gait parameters were measured in all animal models, and these data were compared to readouts of established behavioral tests, such as the cylinder test in the PD and stroke rats and the rotarod tests for the HD group. Results Hemiparkinsonian rats were generated by unilateral injection of the neurotoxin 6-hydroxydopamine in the striatum or in the medial forebrain bundle. For Huntington's disease, a transgenic rat model expressing a truncated huntingtin fragment with multiple CAG repeats was used. Thirdly, a stroke model was generated by a photothrombotic induced infarct in the right sensorimotor cortex. We found that multiple gait parameters were significantly altered in all three disease models compared to their respective controls. Behavioural deficits could be efficiently measured using the cylinder test in the PD and stroke animals, and in the case of the PD model, the deficits in gait essentially confirmed results obtained by the cylinder test. However, in the HD model and the stroke model the Catwalk analysis proved more sensitive than the rotarod test and also added new and more detailed information on specific gait parameters. Conclusion The automated quantitative gait analysis test may be a useful tool to study both motor impairment and recovery associated with various neurological motor disorders.
In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...
A Lumped Computational Model for Sodium Sulfur Battery Analysis
Wu, Fan
Due to the cost of materials and time-consuming testing procedures, development of new batteries is a slow and expensive practice. The purpose of this study is to develop a computational model and assess the capabilities of such a model designed to aid in the design process and control of sodium sulfur batteries. To this end, a transient lumped computational model derived from an integral analysis of the transport of species, energy and charge throughout the battery has been developed. The computation processes are coupled with the use of Faraday's law, and solutions for the species concentrations, electrical potential and current are produced in a time-marching fashion. Properties required for solving the governing equations are calculated and updated as a function of time based on the composition of each control volume. The proposed model is validated against multi-dimensional simulations and experimental results from the literature, and simulation results using the proposed model are presented and analyzed. The computational model and electrochemical model used to solve the equations for the lumped model are compared with similar ones found in the literature. The results obtained from the current model compare favorably with those from experiments and other models.
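As a rough illustration of the Faraday's-law coupling described above, the fragment below time-marches the sodium inventory of a single control volume under constant-current discharge. All values (current, electron number, initial moles) are illustrative assumptions, not parameters of the actual model:

```python
# Faraday's law links charge passed to moles of species converted: dn/dt = I / (z*F).
F = 96485.33   # Faraday constant, C/mol
z = 2          # electrons transferred per mole of reacted species (assumed)
I = 10.0       # constant discharge current, A (illustrative)
dt, t_end = 1.0, 3600.0

n_na = 5.0     # initial moles of available sodium in the control volume (illustrative)
for _ in range(int(t_end / dt)):
    n_na -= I / (z * F) * dt   # explicit time-marching update of the species balance
```

In the full lumped model this balance would be coupled, control volume by control volume, to the energy and charge equations, with properties updated from the evolving composition.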
Process Correlation Analysis Model for Process Improvement Identification
Directory of Open Access Journals (Sweden)
Su-jin Choi
2014-01-01
software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
Human Performance Modeling for Dynamic Human Reliability Analysis
Energy Technology Data Exchange (ETDEWEB)
Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory; Mandelli, Diego [Idaho National Laboratory
2015-08-01
Part of the U.S. Department of Energy's (DOE's) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.
Evaluating statistical analysis models for RNA sequencing experiments
Directory of Open Access Journals (Sweden)
Pablo eReeb
2013-09-01
Full Text Available Validating statistical analysis methods for RNA sequencing (RNA-seq) experiments is a complex task. Researchers often find themselves having to decide between competing models or assessing the reliability of results obtained with a designated analysis program. Computer simulation has been the most frequently used procedure to verify the adequacy of a model. However, datasets generated by simulations depend on the parameterization and the assumptions of the selected model. Moreover, such datasets may constitute a partial representation of reality, as the complexity of RNA-seq data is hard to mimic. We present the use of plasmode datasets to complement the evaluation of statistical models for RNA-seq data. A plasmode is a dataset obtained from experimental data but for which some truth is known. Using a set of simulated scenarios of technical and biological replicates, and publicly available datasets, we illustrate how to design algorithms to construct plasmodes under different experimental conditions. We contrast results from two types of methods for RNA-seq: (i) models based on the negative binomial distribution (edgeR and DESeq), and (ii) Gaussian models applied after transformation of the data (MAANOVA). The results emphasize the fact that deciding which method to use may be experiment-specific due to the unknown distributions of expression levels. Plasmodes may contribute to the choice of method by using a similar pre-existing dataset. The promising results obtained from this approach emphasize the need to promote and improve systematic data sharing across the research community to facilitate plasmode building. Although we illustrate the use of plasmodes for comparing differential expression analysis models, the flexibility of plasmode construction allows comparing upstream analyses, such as normalization procedures or alignment pipelines, as well.
The business models of the patent market: an empirical analysis of IP business model characteristics
Seissonen, Julia
2014-01-01
This thesis surveys previous academic literature and performs an empirical analysis to reach two main objectives. The first objective is to build a broader and more detailed picture of the different IP specialized business models present in the US business environment. Additionally, the thesis studies their effects on the economy and the patent system efficiency. The second objective is to build an econometric model of the IP business model strategy and empirically test the hypothesis that IP...
Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.
Hack, C Eric
2006-04-17
Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least squared error approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach. PMID:16466842
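A minimal sketch of the MCMC calibration idea described above, using a hypothetical one-parameter, one-compartment kinetic model rather than the paper's full PBTK model; the prior, noise level and rate constant are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-compartment kinetics: internal dose C(t) = C0 * exp(-k*t)
def model(k, t, c0=1.0):
    return c0 * np.exp(-k * t)

t = np.linspace(0.5, 8.0, 10)
k_true = 0.4
data = model(k_true, t) + rng.normal(0.0, 0.01, t.size)  # synthetic "measurements"

def log_post(k, sigma=0.01):
    """Log posterior: Gaussian likelihood plus a weak lognormal prior on k."""
    if k <= 0:
        return -np.inf
    log_lik = -0.5 * np.sum((data - model(k, t)) ** 2) / sigma ** 2
    log_prior = -0.5 * (np.log(k) - np.log(0.5)) ** 2
    return log_lik + log_prior

# Random-walk Metropolis: accept a proposal with probability min(1, posterior ratio)
samples, k, lp = [], 0.5, log_post(0.5)
for _ in range(5000):
    k_prop = k + rng.normal(0.0, 0.05)
    lp_prop = log_post(k_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        k, lp = k_prop, lp_prop
    samples.append(k)
posterior = np.array(samples[1000:])  # discard burn-in
```

The prior distribution encodes knowledge about the biological system before seeing data; the posterior samples then summarize both the best estimate of k and its remaining uncertainty, which is the key advantage over MLE point estimates for highly parameterized PBTK models.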
Modeling and analysis of ground target radiation cross section
Institute of Scientific and Technical Information of China (English)
SHI Xiang; LOU GuoWei; LI XingGuo
2008-01-01
Based on an analysis of passive millimeter wave (MMW) radiometer detection, the ground target radiation cross section is modeled as a new descriptor of the target's MMW radiant characteristics. Its application and actual testing are discussed and analyzed. The essence of passive MMW stealth is reduction of the target radiation cross section.
Video Analysis of the Flight of a Model Aircraft
Tarantino, Giovanni; Fazio, Claudio
2011-01-01
A video-analysis software tool has been employed in order to measure the steady-state values of the kinematics variables describing the longitudinal behaviour of a radio-controlled model aircraft during take-off, climbing and gliding. These experimental results have been compared with the theoretical steady-state configurations predicted by the…
Mathematical modelling and linear stability analysis of laser fusion cutting
Hermanns, Torsten; Schulz, Wolfgang; Vossen, Georg; Thombansen, Ulrich
2016-06-01
A model for laser fusion cutting is presented and investigated by linear stability analysis in order to study the tendency for dynamic behavior and subsequent ripple formation. The result is a so-called stability function that describes the correlation between the setting values of the process and the amount of dynamic behavior the process exhibits.
Static analysis of a Model of the LDL degradation pathway
DEFF Research Database (Denmark)
Pilegaard, Henrik; Nielson, Flemming; Nielson, Hanne Riis
2005-01-01
BioAmbients is a derivative of mobile ambients that has shown promise of describing interesting features of the behaviour of biological systems. As for other ambient calculi static program analysis can be used to compute safe approximations of the behavior of modelled systems. We use these tools ...
Landslide susceptibility analysis using an artificial neural network model
Mansor, Shattri; Pradhan, Biswajeet; Daud, Mohamed; Jamaludin, Normalina; Khuzaimah, Zailani
2007-10-01
This paper deals with landslide susceptibility analysis using an artificial neural network model for Cameron Highland, Malaysia. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazards. These parameters are topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. Landslide hazard was analyzed using landslide occurrence factors employing the logistic regression model. The results of the analysis were verified using the landslide location data and compared with the logistic regression model. The accuracy of the hazard map was observed to be 85.73%. The qualitative landslide susceptibility analysis was carried out using an artificial neural network model by performing map overlay analysis in a GIS environment. This information could be used to estimate the risk to population, property and existing infrastructure such as the transportation network.
Qualitative Analysis for Rheodynamic Model of Cardiac Pressure Pulsations
Institute of Scientific and Technical Information of China (English)
Zhi-cong Liu; Bei-ye Feng
2004-01-01
In this paper, we give a rigorous mathematical and complete parameter analysis for the rheodynamic model of cardiac pressure pulsations and obtain the conditions and parameter region for global existence and uniqueness of a limit cycle, as well as the global bifurcation diagram of limit cycles. We also discuss the resonance phenomena of the perturbed system.
Alphabet Knowledge in Preschool: A Rasch Model Analysis
Drouin, Michelle; Horner, Sherri L.; Sondergeld, Toni A.
2012-01-01
In this study, we used Rasch model analyses to examine (1) the unidimensionality of the alphabet knowledge construct and (2) the relative difficulty of different alphabet knowledge tasks (uppercase letter recognition, names, and sounds, and lowercase letter names) within a sample of preschoolers (n=335). Rasch analysis showed that the four…
Rasch Model Based Analysis of the Force Concept Inventory
Planinic, Maja; Ivanjek, Lana; Susac, Ana
2010-01-01
The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One of such tools is the stochastic Rasch model, which enables construction of linear…
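The Rasch model underlying such analyses can be sketched compactly. The example below simulates dichotomous responses to a single item and recovers its difficulty by Newton-Raphson, treating person abilities as known; all parameter values are invented for illustration, not FCI data:

```python
import numpy as np

rng = np.random.default_rng(1)

def p_correct(theta, b):
    """Rasch model: P(correct) for a person of ability theta on an item of difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Simulate 2000 persons answering one item (abilities and difficulty are invented)
theta = rng.normal(0.0, 1.0, 2000)
b_true = 0.7
x = (rng.uniform(size=theta.size) < p_correct(theta, b_true)).astype(float)

# Newton-Raphson MLE of the item difficulty, treating abilities as known
b = 0.0
for _ in range(25):
    p = p_correct(theta, b)
    b += np.sum(p - x) / np.sum(p * (1.0 - p))  # score divided by Fisher information
```

Because ability and difficulty enter only through their difference theta - b, fitted difficulties lie on the same linear (logit) scale as abilities, which is what allows item functioning to be compared across samples.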
Model-based analysis and simulation of regenerative heat wheel
DEFF Research Database (Denmark)
Wu, Zhuang; Melnik, Roderick V. N.; Borup, F.
2006-01-01
of mathematical models for the thermal analysis of the fluid and wheel matrix. The effect of heat conduction in the direction of the fluid flow is taken into account and the influence of variations in rotating speed of the wheel as well as other characteristics (ambient temperature, airflow and geometric size...
Analysis and Comparison of Typical Models within Distribution Network Design
DEFF Research Database (Denmark)
Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.
a number of important issues which have been identified when addressing the Distribution Network Design problem from a modelling angle. More specifically, we present an analysis of the research which has been performed in utilizing operational research in developing and optimising distribution systems....
An unsupervised aspect detection model for sentiment analysis of reviews
Bagheri, A.; Saraee, M.; Jong, de F.M.G.
2013-01-01
With the rapid growth of user-generated content on the internet, sentiment analysis of online reviews has become a hot research topic recently, but due to variety and wide range of products and services, the supervised and domain-specific models are often not practical. As the number of reviews expa
Modeling and Analysis of A Rotary Direct Drive Servovalve
Institute of Scientific and Technical Information of China (English)
YU Jue; ZHUANG Jian; YU Dehong
2014-01-01
Direct drive servovalves are mostly restricted to low flow rate and low bandwidth applications due to the considerable flow forces. Current studies mainly focus on enhancing the driving force, which in turn is limited by the development of the magnetic material. Aiming at reducing the flow forces, a novel rotary direct drive servovalve (RDDV) is introduced in this paper. This RDDV servovalve is designed as a rotating structure and its axially symmetric spool rotates within a certain angle range in the valve chamber. The servovalve orifices are formed by the matching between the square-wave-shaped land on the spool and the rectangular ports on the sleeve. In order to study the RDDV servovalve performance, a flow rate model and a mechanical model are established, from which flow rates and flow-induced torques at different spool rotation angles or spool radii are obtained. The model analysis shows that the driving torque can be alleviated due to the proposed valve structure. Computational fluid dynamics (CFD) analysis using ANSYS/FLUENT is applied to evaluate and validate the theoretical analysis. In addition, experiments on the flow rate and the mechanical characteristics of the RDDV servovalve are carried out. Both simulation and experimental results conform to the results of the theoretical model analysis, which proves that this novel and innovative structure for direct drive servovalves can reduce the flow force on the spool and improve valve frequency response characteristics.
Semigroup Method for a Mathematical Model in Reliability Analysis
Institute of Scientific and Technical Information of China (English)
Geni Gupur; LI Xue-zhi
2001-01-01
The system, which consists of a reliable machine, an unreliable machine and a storage buffer with infinitely many workpieces, has been studied. The existence of a unique positive time-dependent solution of the model corresponding to the system has been obtained by using C0-semigroup theory of linear operators in functional analysis.
Spatial Econometric data analysis: moving beyond traditional models
Florax, R.J.G.M.; Vlist, van der A.J.
2003-01-01
This article appraises recent advances in the spatial econometric literature. It serves as the introduction to a collection of new papers on spatial econometric data analysis brought together in this special issue, dealing specifically with new extensions to the spatial econometric modeling perspective.
A new analysis of a simple model of fair allocation
Juan D. Moreno-Ternero
2012-01-01
In a recent article, Fragnelli and Gagliardo [Cooperative models for allocating an object, Economics Letters 117 (2012) 227-229] propose several procedures to solve a basic problem of fair allocation. We scrutinize their proposal and contextualize it into recent developments of the literature on bankruptcy problems. Our analysis supports two of the procedures they propose; namely, the Shapley and Talmud rules.
Multi-Scale Distributed Sensitivity Analysis of Radiative Transfer Model
Neelam, M.; Mohanty, B.
2015-12-01
Amidst nature's great variability and complexity, the Soil Moisture Active Passive (SMAP) mission aims to provide high resolution soil moisture products for earth science applications. One of the biggest challenges still faced by the remote sensing community is the uncertainties, heterogeneities and scaling exhibited by soil, land cover, topography, precipitation etc. At each spatial scale, there are different levels of uncertainties and heterogeneities. Also, each land surface variable derived from the various satellite missions comes with its own error margins. As such, soil moisture retrieval accuracy is affected as radiative model sensitivity changes with space, time, and scale. In this paper, we explore the distributed sensitivity analysis of a radiative model under different hydro-climates and spatial scales: 1.5 km, 3 km, 9 km and 39 km. This analysis is conducted in three different regions: Iowa, USA (SMEX02); Arizona, USA (SMEX04); and Winnipeg, Canada (SMAPVEX12). Distributed variables such as soil moisture, soil texture, vegetation and temperature are assumed to be uncertain and are conditionally simulated to obtain uncertainty maps, whereas roughness data, which are spatially limited, are assigned a probability distribution. The relative contribution of the uncertain model inputs to the aggregated model output is also studied, using various aggregation techniques. We use global sensitivity analysis (GSA) to conduct this analysis across spatio-temporal scales. Keywords: soil moisture, radiative transfer, remote sensing, sensitivity, SMEX02, SMAPVEX12.
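One common GSA building block is the first-order (variance-based) sensitivity index S_i = Var(E[Y|X_i]) / Var(Y). The sketch below estimates it by quantile binning, using an invented linear-plus-interaction surrogate in place of a real radiative transfer model; all coefficients and input ranges are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# Invented surrogate for a radiative-transfer-like output (not the tau-omega model)
sm = rng.uniform(0.05, 0.45, n)     # soil moisture, m3/m3
rough = rng.uniform(0.0, 0.03, n)   # surface roughness, m
veg = rng.uniform(0.0, 5.0, n)      # vegetation water content, kg/m2
y = 250.0 - 80.0 * sm + 300.0 * rough + 4.0 * veg + 20.0 * sm * veg

def first_order_index(x, y, bins=50):
    """Estimate S_i = Var(E[Y|X_i]) / Var(Y) by binning X_i into quantile bins."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

s_sm = first_order_index(sm, y)
s_rough = first_order_index(rough, y)
s_veg = first_order_index(veg, y)
```

Repeating the estimate on inputs aggregated to coarser grids is one simple way to see how the ranking of sensitive inputs shifts with spatial scale, the central theme of the study.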
Modeling and Analysis of Component Faults and Reliability
DEFF Research Database (Denmark)
Le Guilly, Thibaut; Olsen, Petur; Ravn, Anders Peter;
2016-01-01
This chapter presents a process to design and validate models of reactive systems in the form of communicating timed automata. The models are extended with faults associated with probabilities of occurrence. This enables a fault tree analysis of the system using minimal cut sets that are automatically generated. The stochastic information on the faults is used to estimate the reliability of the fault-affected system. The reliability is given with respect to properties of the system state space. We illustrate the process on a concrete example using the Uppaal model checker for validating the ideal system model and the fault modeling. Then the statistical version of the tool, UppaalSMC, is used to find reliability estimates.
A tool model for predicting atmospheric kinetics with sensitivity analysis
Institute of Scientific and Technical Information of China (English)
NONE
2001-01-01
A package (a tool model) for a program predicting atmospheric chemical kinetics with sensitivity analysis is presented. A new direct method for calculating the first-order sensitivity coefficients using sparse matrix technology is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines in packages, such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
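The direct method for sensitivity coefficients can be illustrated on a one-reaction toy mechanism: the model equation dy/dt = -k*y is integrated together with its auxiliary sensitivity equation for s = dy/dk (here with a simple RK4 step instead of a Gear-type stiff solver; the rate constant is an invented value):

```python
import numpy as np

# Model equation dy/dt = -k*y with first-order sensitivity s = dy/dk, which obeys
# the coupled auxiliary equation ds/dt = d/dk(-k*y) = -y - k*s (the direct method).
def rhs(state, k):
    y, s = state
    return np.array([-k * y, -y - k * s])

def rk4(state, k, dt, steps):
    """Fixed-step fourth-order Runge-Kutta integration of the coupled system."""
    for _ in range(steps):
        k1 = rhs(state, k)
        k2 = rhs(state + 0.5 * dt * k1, k)
        k3 = rhs(state + 0.5 * dt * k2, k)
        k4 = rhs(state + dt * k3, k)
        state = state + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
    return state

k_rate, t_end = 0.8, 2.0
y_final, s_final = rk4(np.array([1.0, 0.0]), k_rate, dt=1e-3, steps=2000)
# Analytic solution for comparison: y = exp(-k*t), s = dy/dk = -t * exp(-k*t)
```

For a full mechanism the scalar s becomes a matrix of coefficients, and the shared Jacobian between the model and sensitivity equations is what makes the sparse, triangularized formulation of the package efficient.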
A stochastic model for the analysis of maximum daily temperature
Sirangelo, B.; Caloiero, T.; Coscarelli, R.; Ferrari, E.
2016-08-01
In this paper, a stochastic model for the analysis of the daily maximum temperature is proposed. First, a deseasonalization procedure based on a truncated Fourier expansion is adopted. Then, the Johnson transformation functions were applied for the data normalization. Finally, the fractionally autoregressive integrated moving average model was used to reproduce both short- and long-memory behavior of the temperature series. The model was applied to the data of the Cosenza gauge (Calabria region) and verified on four other gauges of southern Italy. Through a Monte Carlo simulation procedure based on the proposed model, 10^5 years of daily maximum temperature have been generated. Among the possible applications of the model, the occurrence probabilities of the annual maximum values have been evaluated. Moreover, the procedure was applied for the estimation of the return periods of long sequences of days with maximum temperature above prefixed thresholds.
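The deseasonalization step described above can be sketched as a least-squares fit of a truncated Fourier expansion. The series below is synthetic (invented amplitude, phase and noise), not the Cosenza record:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic daily maximum temperatures with an annual cycle (invented values)
days = np.arange(10 * 365)
temps = (25.0 + 8.0 * np.cos(2 * np.pi * days / 365.25 - 0.5)
         + rng.normal(0.0, 2.0, days.size))

def fourier_design(t, n_harmonics, period=365.25):
    """Design matrix of a truncated Fourier expansion: constant + harmonic pairs."""
    cols = [np.ones_like(t, dtype=float)]
    for h in range(1, n_harmonics + 1):
        cols.append(np.cos(2 * np.pi * h * t / period))
        cols.append(np.sin(2 * np.pi * h * t / period))
    return np.column_stack(cols)

X = fourier_design(days, n_harmonics=2)
coef, *_ = np.linalg.lstsq(X, temps, rcond=None)
deseasonalized = temps - X @ coef   # residual series handed to the ARFIMA-type model
```

The residual series then carries only the short- and long-memory stochastic structure, which is what the fractional ARIMA stage of the model is meant to reproduce.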
Domain ontology and multi-criteria analysis for enterprise modeling
Directory of Open Access Journals (Sweden)
Sabria Hadj Tayeb
2012-03-01
Full Text Available Knowing that an enterprise is a complex reality, it is necessary to develop a modeling framework allowing the description of the system structure and of the dynamics that alter that structure. The concept of enterprise modeling addresses this need, and many techniques have emerged. Our goal is to provide leaders of Algerian enterprises an overview of modeling techniques. These managers may then select, in collaboration with the University, the modeling technique best suited to their requirements. We believe that this could be a step towards an effective reorganization of enterprise management. This article proposes a domain ontology and multi-criteria analysis in the frame of enterprise modeling. Our approach is based on two stages, using the Protégé tool for technique representation and the PROMETHEE method for their evaluation. The result is a ranking of the different techniques, which allows selecting the most appropriate methodology according to the criteria for a given enterprise.
Performance analysis of FXLMS algorithm with secondary path modeling error
Institute of Scientific and Technical Information of China (English)
SUN Xu; CHEN Duanshi
2003-01-01
Performance analysis of the filtered-X LMS (FXLMS) algorithm with secondary path modeling error is carried out in both the time and frequency domains. It is shown first that the effects of secondary path modeling error on the performance of the FXLMS algorithm are determined by the distribution of the relative error of the secondary path model across frequency. If the distribution of the relative error is uniform, the modeling error of the secondary path has no effect on the performance of the algorithm. In addition, a limitation property of the FXLMS algorithm is proved, which implies that the negative effects of secondary path modeling error can be compensated by increasing the adaptive filter length. Finally, some insights into the "spillover" phenomenon of the FXLMS algorithm are given.
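A minimal single-channel FXLMS loop with a deliberately imperfect secondary-path model illustrates the algorithm under discussion; the FIR paths, filter length and step size are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

P = np.array([0.0, 0.9, 0.4])        # primary path FIR (invented)
S = np.array([0.0, 0.6, 0.25])       # true secondary path (invented)
S_hat = np.array([0.0, 0.55, 0.3])   # secondary-path model, deliberately imperfect
L, mu, n = 8, 0.02, 20000            # adaptive filter length, step size, samples

x = rng.normal(size=n)               # white reference signal
d = np.convolve(x, P)[:n]            # disturbance at the error sensor
xf = np.convolve(x, S_hat)[:n]       # filtered-x: reference through the path model

w = np.zeros(L)                      # adaptive control filter
xbuf = np.zeros(L)                   # recent reference samples
fbuf = np.zeros(L)                   # recent filtered-x samples
ybuf = np.zeros(S.size)              # recent control outputs (pass through true S)
e_hist = np.zeros(n)
for i in range(n):
    xbuf = np.roll(xbuf, 1); xbuf[0] = x[i]
    fbuf = np.roll(fbuf, 1); fbuf[0] = xf[i]
    y = w @ xbuf                      # control filter output
    ybuf = np.roll(ybuf, 1); ybuf[0] = y
    e = d[i] - S @ ybuf               # residual at the error microphone
    w += mu * e * fbuf                # filtered-x LMS update
    e_hist[i] = e
```

The loop still converges here because the phase error between `S` and `S_hat` is small at all frequencies, consistent with the paper's point that what matters is the distribution of the relative model error across frequency.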
Modeling and analysis of transport in the mammary glands
Quezada, Ana; Vafai, Kambiz
2014-08-01
The transport of three toxins moving from the blood stream into the ducts of the mammary glands is analyzed in this work. The model predictions are compared with experimental data from the literature. The utility of the model lies in its potential to improve our understanding of toxin transport as a pre-disposing factor to breast cancer. This work is based on a multi-layer transport model to analyze the toxins present in the breast milk. The breast milk in comparison with other sampling strategies allows us to understand the mass transport of toxins once inside the bloodstream of breastfeeding women. The multi-layer model presented describes the transport of caffeine, DDT and cimetidine. The analysis performed takes into account the unique transport mechanisms for each of the toxins. Our model predicts the movement of toxins and/or drugs within the mammary glands as well as their bioaccumulation in the tissues.
The modelling and analysis of the mechanics of ropes
Leech, C M
2014-01-01
This book considers the modelling and analysis of the many types of ropes, i.e. linear fibre assemblies. The construction of these structures is very diverse, and in this work they are considered from the modelling point of view. As well as the conventional twisted structures, braid and plaited structures and parallel assemblies are modelled and analysed, first for their assembly and secondly for their mechanical behaviour. Since the components are themselves assemblies (fibres into yarns, yarns into strands, and strands into ropes), the hierarchical nature of the construction is also considered. The focus of the modelling is essentially on load-extension behaviour, but there is reference to the bending of ropes, encompassed by the two extremes of no slip between the components and zero friction resistance to component slip. Friction in ropes is considered both between the rope components (sliding, sawing and scissoring) and within the components (dilation and distortion), these latter modes being used to model component set, the p...
A Succinct Approach to Static Analysis and Model Checking
DEFF Research Database (Denmark)
Filipiuk, Piotr
In a number of areas software correctness is crucial; therefore it is often desirable to formally verify the presence of various properties or the absence of errors. This thesis presents a framework for concisely expressing static analysis and model checking problems. The framework facilitates … that guarantees that there always is a single best solution for a problem under consideration. We also develop a solving algorithm, based on a differential worklist, that computes the least solution guaranteed by the Moore Family result. Furthermore, we present a logic for specifying analysis problems called Layered … in the classical formulation of ALFP logic. Finally, we show that the logics and the associated solvers can be used for rapid prototyping. We illustrate that by a variety of case studies from static analysis and model checking.
How Many Separable Sources? Model Selection In Independent Components Analysis
DEFF Research Database (Denmark)
Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen
2015-01-01
Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis … computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher’s iris data set and Howells’ craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources … might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian.
State space modelling and data analysis exercises in LISA Pathfinder
Nofrarias, M; Armano, M; Audley, H; Auger, G; Benedetti, M; Binetruy, P; Bogenstahl, J; Bortoluzzi, D; Bosetti, P; Brandt, N; Caleno, M; Cañizares, P; Cavalleri, A; Cesa, M; Chmeissani, M; Conchillo, A; Congedo, G; Cristofolin, I; Cruise, M; Danzmann, K; De Marchi, F; Diaz-Aguilo, M; Diepholz, I; Dixon, G; Dolesi, R; Dunbar, N; Fauste, J; Ferraioli, L; Ferroni, V; Fichter, W; Fitzsimons, E; Freschi, M; Marin, A García; Marirrodriga, C García; Gerndt, R; Gesa, L; Gibert, F; Giardini, D; Grimani, C; Grynagier, A; Guillaume, B; Guzmán, F; Harrison, I; Heinzel, G; Hernández, V; Hewitson, M; Hollington, D; Hough, J; Hoyland, D; Hueller, M; Huesler, J; Jennrich, O; Jetzer, P; Johlander, B; Killow, C; Llamas, X; Lloro, I; Lobo, A; Maarschalkerweerd, R; Madden, S; Mance, D; Mateos, I; McNamara, P W; Mendes, J; Mitchell, E; Monsky, A; Nicolini, D; Nicolodi, D; Pedersen, F; Perreur-Lloyd, M; Plagnol, E; Prat, P; Racca, G D; Ramos-Castro, J; Reiche, J; Perez, J A Romera; Robertson, D; Rozemeijer, H; Sanjuan, J; Schleicher, A; Schulte, M; Shaul, D; Stagnaro, L; Strandmoe, S; Steier, F; Sumner, T J; Taylor, A; Texier, D; Trenkel, C; Tu, H-B; Vitale, S; Wanner, G; Ward, H; Waschke, S; Wass, P; Weber, W J; Ziegler, T; Zweifel, P
2013-01-01
LISA Pathfinder is a mission planned by the European Space Agency to test the key technologies that will allow the detection of gravitational waves in space. The instrument on board, the LISA Technology Package, will undergo an exhaustive campaign of calibration and noise characterisation in order to fully describe the noise model. Data analysis plays an important role in the mission, and for that reason the data analysis team has been developing a toolbox which contains all the functionalities required during operations. In this contribution we give an overview of recent activities, focusing on the improvements in the modelling of the instrument and on the data analysis campaigns performed both with real and simulated data.
Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging
Liang, Jianming; Järvi, Timo; Kiuru, Aaro; Kormano, Martti; Svedström, Erkki
2003-12-01
The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion, based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. Based on perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract the cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT) and nuclear medicine (NM) studies. This fluoroscopical examination takes only about 2 seconds for a perfusion study, with only a low radiation dose to the patient, involving no preparation, no radioactive isotopes, and no contrast media.
Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging
Directory of Open Access Journals (Sweden)
Kiuru Aaro
2003-01-01
Full Text Available The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion, based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. Based on perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract the cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT) and nuclear medicine (NM) studies. This fluoroscopical examination takes only about 2 seconds for a perfusion study, with only a low radiation dose to the patient, involving no preparation, no radioactive isotopes, and no contrast media.
Patent portfolio analysis model based on legal status information
Institute of Scientific and Technical Information of China (English)
Xuezhao WANG; Yajuan ZHAO; Jing ZHANG; Ping ZHAO
2014-01-01
Purpose: This research proposes a patent portfolio analysis model based on legal status information to chart out a competitive landscape in a particular field, enabling organizations to position themselves within the overall technology landscape. Design/methodology/approach: Three indicators were selected for the proposed model: patent grant rate, valid patent rate and patent maintenance period. The model uses legal status information to perform a qualitative evaluation of the relative values of individual patents, countries' or regions' technological capabilities, and the competitiveness of patent applicants. The results are visualized by a four-quadrant bubble chart. To test the effectiveness of the model, it is used to present a competitive landscape in the lithium-ion battery field. Findings: The model can be used to evaluate the values of individual patents, highlight countries' or regions' positions in the field, and rank the competitiveness of patent applicants in the field. Research limitations: The model currently takes into consideration only three legal status indicators. It is feasible to introduce more indicators, such as the reason for invalid patents and the distribution of patent maintenance time, and associate them with those in the proposed model. Practical implications: Analysis of legal status information in combination with patent application information can help an organization to spot gaps in its patent claim coverage, as well as evaluate the patent quality and maintenance situation of its granted patents. The study results can be used to support technology assessment, technology innovation and intellectual property management. Originality/value: Prior studies attempted to assess patent quality or competitiveness by using either a single patent legal status indicator or a comparative analysis of the impacts of each indicator. However, they are insufficient in presenting the combined effects of the evaluation indicators. Using our model, it appears possible to get a
Uncertainty analysis of fluvial outcrop data for stochastic reservoir modelling
Energy Technology Data Exchange (ETDEWEB)
Martinius, A.W. [Statoil Research Centre, Trondheim (Norway); Naess, A. [Statoil Exploration and Production, Stjoerdal (Norway)
2005-07-01
Uncertainty analysis and reduction is a crucial part of stochastic reservoir modelling and fluid flow simulation studies. Outcrop analogue studies are often employed to define reservoir model parameters but the analysis of uncertainties associated with sedimentological information is often neglected. In order to define uncertainty inherent in outcrop data more accurately, this paper presents geometrical and dimensional data from individual point bars and braid bars, from part of the low net:gross outcropping Tortola fluvial system (Spain) that has been subjected to a quantitative and qualitative assessment. Four types of primary outcrop uncertainties are discussed: (1) the definition of the conceptual depositional model; (2) the number of observations on sandstone body dimensions; (3) the accuracy and representativeness of observed three-dimensional (3D) sandstone body size data; and (4) sandstone body orientation. Uncertainties related to the depositional model are the most difficult to quantify but can be appreciated qualitatively if processes of deposition related to scales of time and the general lack of information are considered. Application of the N
Predicate Argument Structure Analysis for Use Case Description Modeling
Takeuchi, Hironori; Nakamura, Taiga; Yamaguchi, Takahira
In a large software system development project, many documents are prepared and updated frequently. In such a situation, support is needed for looking through these documents easily to identify inconsistencies and to maintain traceability. In this research, we focus on the requirements documents such as use cases and consider how to create models from the use case descriptions in unformatted text. In the model construction, we propose a few semantic constraints based on the features of the use cases and use them for a predicate argument structure analysis to assign semantic labels to actors and actions. With this approach, we show that we can assign semantic labels without enhancing any existing general lexical resources such as case frame dictionaries and design a less language-dependent model construction architecture. By using the constructed model, we consider a system for quality analysis of the use cases and automated test case generation to keep the traceability between document sets. We evaluated the reuse of the existing use cases and generated test case steps automatically with the proposed prototype system from real-world use cases in the development of a system using a packaged application. Based on the evaluation, we show how to construct models with high precision from English and Japanese use case data. Also, we could generate good test cases for about 90% of the real use cases through the manual improvement of the descriptions based on the feedback from the quality analysis system.
A fuzzy set preference model for market share analysis
Turksen, I. B.; Willson, Ian A.
1992-01-01
Consumer preference models are widely used in new product design, marketing management, pricing, and market segmentation. The success of new products depends on accurate market share prediction and design decisions based on consumer preferences. The vague linguistic nature of consumer preferences and product attributes, combined with the substantial differences between individuals, creates a formidable challenge to marketing models. The most widely used methodology is conjoint analysis. Conjoint models, as currently implemented, represent linguistic preferences as ratio or interval-scaled numbers, use only numeric product attributes, and require aggregation of individuals for estimation purposes. It is not surprising that these models are costly to implement, are inflexible, and have a predictive validity that is not substantially better than chance. This affects the accuracy of market share estimates. A fuzzy set preference model can easily represent linguistic variables either in consumer preferences or product attributes with minimal measurement requirements (ordinal scales), while still estimating overall preferences suitable for market share prediction. This approach results in flexible individual-level conjoint models which can provide more accurate market share estimates from a smaller number of more meaningful consumer ratings. Fuzzy sets can be incorporated within existing preference model structures, such as a linear combination, using the techniques developed for conjoint analysis and market share estimation. The purpose of this article is to develop and fully test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation), and how much to make (market share
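The core idea above, representing linguistic ratings as fuzzy sets and aggregating them linearly, can be sketched briefly. This is a simplified illustration in the spirit of the article, not its actual model: the linguistic terms, triangular supports, attributes and weights are all hypothetical, and defuzzification at the peak is a shortcut choice.

```python
# Fuzzy linguistic preference sketch: triangular membership functions plus
# a linear aggregation, as in conjoint-style models. All labels, supports
# and weights below are hypothetical.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# linguistic terms on an ordinal 0..10 preference scale (assumed)
terms = {"low": (0, 2, 4), "medium": (3, 5, 7), "high": (6, 8, 10)}

def overall_preference(ratings, weights):
    """Shortcut: defuzzify each term at its peak, then aggregate linearly."""
    return sum(weights[k] * terms[ratings[k]][1] for k in ratings)

# a consumer rates two product attributes linguistically
score = overall_preference({"price": "high", "quality": "medium"},
                           {"price": 0.6, "quality": 0.4})
```

Because only ordinal-scale terms are needed as input, this mirrors the minimal measurement requirement the abstract emphasizes.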
Performance analysis of NOAA tropospheric signal delay model
International Nuclear Information System (INIS)
Tropospheric delay is one of the dominant global positioning system (GPS) errors, which degrades the positioning accuracy. Recent development in tropospheric modeling relies on implementation of more accurate numerical weather prediction (NWP) models. In North America one of the NWP-based tropospheric correction models is the NOAA Tropospheric Signal Delay Model (NOAATrop), which was developed by the US National Oceanic and Atmospheric Administration (NOAA). Because of its potential to improve the GPS positioning accuracy, the NOAATrop model became the focus of many researchers. In this paper, we analyzed the performance of the NOAATrop model and examined its effect on ionosphere-free-based precise point positioning (PPP) solution. We generated 3 year long tropospheric zenith total delay (ZTD) data series for the NOAATrop model, Hopfield model, and the International GNSS Services (IGS) final tropospheric correction product, respectively. These data sets were generated at ten IGS reference stations spanning Canada and the United States. We analyzed the NOAATrop ZTD data series and compared them with those of the Hopfield model. The IGS final tropospheric product was used as a reference. The analysis shows that the performance of the NOAATrop model is a function of both season (time of the year) and geographical location. However, its performance was superior to the Hopfield model in all cases. We further investigated the effect of implementing the NOAATrop model on the ionosphere-free-based PPP solution convergence and accuracy. It is shown that the use of the NOAATrop model improved the PPP solution convergence by 1%, 10% and 15% for the latitude, longitude and height components, respectively
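For context on the magnitudes being compared above, the sketch below implements the standard Saastamoinen zenith hydrostatic delay, which needs only surface pressure, latitude and height. Note this is neither the NOAATrop nor the Hopfield model from the study; it is a common textbook baseline, shown here as an assumption-labeled illustration.

```python
# Saastamoinen zenith hydrostatic delay (ZHD) -- a standard empirical
# baseline, NOT the NOAATrop or Hopfield models discussed in the abstract.
import math

def saastamoinen_zhd(pressure_hpa, lat_deg, height_m):
    """Zenith hydrostatic delay in metres (Saastamoinen, 1972)."""
    f = (1.0 - 0.00266 * math.cos(2.0 * math.radians(lat_deg))
             - 0.00000028 * height_m)
    return 0.0022768 * pressure_hpa / f

# roughly 2.3 m at sea level for standard pressure at mid-latitude
zhd = saastamoinen_zhd(1013.25, 45.0, 0.0)
```

The ZTD series compared in the paper are this hydrostatic part plus a wet component, which is precisely where NWP-based models such as NOAATrop gain their advantage.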
Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.
Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep
2009-08-31
Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strength, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future
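A primary growth model of the whole-system continuous kind contrasted above can be sketched in a few lines; the logistic form and all parameter values below are illustrative assumptions, not taken from the article.

```python
# Logistic primary growth model: a minimal whole-system continuous model
# of the type the SWOT analysis contrasts with Individual-based Models.
# Parameter values (n0, nmax, mu) are hypothetical.
import math

def logistic(t, n0=3.0, nmax=9.0, mu=0.05):
    """log10 cell density at time t (hours); mu is the growth-rate parameter."""
    return nmax / (1.0 + (nmax / n0 - 1.0) * math.exp(-mu * t))

# growth curve sampled every 10 h over 300 h
curve = [logistic(t) for t in range(0, 301, 10)]
```

Secondary models would then describe how `mu` depends on external factors such as temperature or pH, and tertiary models package both levels into software, matching the primary/secondary/tertiary classification in the abstract.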
Analysis of DIRAC's behavior using model checking with process algebra
Remenska, Daniela; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Diaz, Ricardo Graciani; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof
2012-01-01
DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple, the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for analysis of an abstract model of a system. Unlike con...
Sensitivity analysis in a Lassa fever deterministic mathematical model
Abdullahi, Mohammed Baba; Doko, Umar Chado; Mamuda, Mamman
2015-05-01
Lassa virus, which causes Lassa fever, is on the list of potential bio-weapon agents. It was recently imported into Germany, the Netherlands, the United Kingdom and the United States as a consequence of the rapid growth of international traffic. A model with five mutually exclusive compartments related to Lassa fever is presented and the basic reproduction number analyzed. A sensitivity analysis of the deterministic model is performed in order to determine the relative importance of the model parameters to disease transmission. The result of the sensitivity analysis shows that the most sensitive parameter is human immigration, followed by the human recovery rate, then person-to-person contact. This suggests that control strategies should target human immigration, effective drugs for treatment, and education to reduce person-to-person contact.
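The parameter ranking above typically comes from normalized forward sensitivity indices of the basic reproduction number. The sketch below computes such indices for a deliberately simplified toy reproduction number, not the five-compartment Lassa model itself; all parameter values are hypothetical.

```python
# Normalized forward sensitivity index Upsilon_p = (dR0/dp) * (p / R0),
# estimated by central finite differences, for a TOY R0 (not the paper's
# Lassa model). Parameter values are hypothetical.

def r0(beta, gamma, mu):
    """Toy reproduction number: transmission over removal."""
    return beta / (gamma + mu)

def sensitivity_index(f, params, name, h=1e-6):
    p = dict(params)
    base = f(**p)
    up, dn = dict(p), dict(p)
    up[name] += h
    dn[name] -= h
    deriv = (f(**up) - f(**dn)) / (2 * h)   # central difference
    return deriv * p[name] / base

params = {"beta": 0.3, "gamma": 0.1, "mu": 0.02}
s_beta = sensitivity_index(r0, params, "beta")    # analytically +1
s_gamma = sensitivity_index(r0, params, "gamma")  # analytically -gamma/(gamma+mu)
```

An index of +1 means a 10% rise in the parameter raises R0 by 10%; ranking the absolute indices is what identifies the dominant parameters, as in the abstract.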
A multiserver multiqueue network: modeling and performance analysis
Institute of Scientific and Technical Information of China (English)
NONE
2002-01-01
A new category of system model, the multiserver multiqueue network (MSMQN), is proposed for distributed systems such as geographically distributed Web-server clusters. A MSMQN comprises multiple multiserver multiqueue (MSMQ) nodes distributed over the network, and every node consists of a number of servers that each contain multiple priority queues for waiting customers. An incoming request can be distributed to a waiting queue of any server in any node, according to the routing policy integrated by the node-selection policy at the network level, the request-dispatching policy at the node level, and the request-scheduling policy at the server level. The model is investigated using stochastic high-level Petri net (SHLPN) modeling and performance analysis techniques. The performance metrics concerned include the delay time of requests in the MSMQ node and the response time perceived by the users. The numerical example shows the efficiency of the performance analysis technique.
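The delay metrics in the abstract come from an SHLPN analysis; as a simple analytic point of comparison, the sketch below computes the waiting time of a single M/M/c queue (roughly one MSMQ node without priorities) via the Erlang C formula. The arrival and service rates are illustrative, and this is a stand-in model, not the paper's Petri-net analysis.

```python
# Erlang C delay for an M/M/c queue -- a simple analytic stand-in for one
# multiserver node (no priority classes). Rates below are hypothetical.
import math

def erlang_c(c, lam, mu):
    """Return (P(wait), mean wait Wq) for M/M/c; requires lam < c*mu."""
    a = lam / mu                                        # offered load
    tail = (a ** c / math.factorial(c)) * (c / (c - a))
    head = sum(a ** k / math.factorial(k) for k in range(c))
    p_wait = tail / (head + tail)
    return p_wait, p_wait / (c * mu - lam)

# 3 servers, 2.4 requests/s arriving, 1 request/s served per server
p_wait, wq = erlang_c(3, 2.4, 1.0)
```

For c = 1 the formula collapses to the familiar M/M/1 results, which makes a convenient sanity check.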
A multiserver multiqueue network: modeling and performance analysis
Institute of Scientific and Technical Information of China (English)
Zhiguang Shan; Yang Yang; et al.
2002-01-01
A new category of system model, the multiserver multiqueue network (MSMQN), is proposed for distributed systems such as geographically distributed Web-server clusters. A MSMQN comprises multiple multiserver multiqueue (MSMQ) nodes distributed over the network, and every node consists of a number of servers that each contain multiple priority queues for waiting customers. An incoming request can be distributed to a waiting queue of any server in any node, according to the routing policy integrated by the node-selection policy at the network level, the request-dispatching policy at the node level, and the request-scheduling policy at the server level. The model is investigated using stochastic high-level Petri net (SHLPN) modeling and performance analysis techniques. The performance metrics concerned include the delay time of requests in the MSMQ node and the response time perceived by the users. The numerical example shows the efficiency of the performance analysis technique.
Global sensitivity analysis for models with spatially dependent outputs
Marrel, Amandine; Jullien, Michel; Laurent, Beatrice; Volkova, Elena
2010-01-01
The global sensitivity analysis of a complex numerical model often calls for the estimation of variance-based importance measures, named Sobol' indices. Metamodel-based techniques have been developed in order to replace the cpu time-expensive computer code with an inexpensive mathematical function, which predicts the computer code output. The common metamodel-based sensitivity analysis methods are well-suited for computer codes with scalar outputs. However, in the environmental domain, as in many areas of application, the numerical model outputs are often spatial maps, which may also vary with time. In this paper, we introduce an innovative method to obtain a spatial map of Sobol' indices with a minimal number of numerical model computations. It is based upon the functional decomposition of the spatial output onto a wavelet basis and the metamodeling of the wavelet coefficients by the Gaussian process. An analytical example is presented to clarify the various steps of our methodology. This technique is then a...
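The per-location building block behind the spatial Sobol' maps described above is a scalar-output first-order index, which can be estimated by a Monte Carlo pick-freeze scheme. The sketch below uses our own additive test function (Y = X1 + 2·X2, with analytic indices 0.2 and 0.8), not the paper's wavelet/Gaussian-process metamodel machinery.

```python
# First-order Sobol' indices via a pick-freeze Monte Carlo estimator.
# Test function and sample size are our illustrative choices.
import random

def sobol_first_order(f, dim, n=20000, seed=1):
    random.seed(seed)
    A = [[random.random() for _ in range(dim)] for _ in range(n)]
    B = [[random.random() for _ in range(dim)] for _ in range(n)]
    yA = [f(x) for x in A]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    S = []
    for i in range(dim):
        # freeze coordinate i from sample A, pick the rest from sample B
        yAB = [f(b[:i] + [a[i]] + b[i + 1:]) for a, b in zip(A, B)]
        cov = sum(ya * y for ya, y in zip(yA, yAB)) / n - mean * (sum(yAB) / n)
        S.append(cov / var)
    return S

# additive test model Y = X1 + 2*X2 on [0,1]^2: S1 = 0.2, S2 = 0.8
s1, s2 = sobol_first_order(lambda x: x[0] + 2 * x[1], 2)
```

Repeating this estimate pixel by pixel is exactly what becomes prohibitive for spatial outputs, which motivates the wavelet-plus-metamodel shortcut the abstract introduces.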
Lumped Capacitance Model in Thermal Analysis of Solid Materials
International Nuclear Information System (INIS)
The paper is devoted to the presentation of a method for measuring the thermal conductivity k, specific heat capacity cp and thermal diffusivity α, applying the lumped capacitance model (LCM) as a special case of Newton's model of cooling. Under the specific experimental conditions resulting from the theoretical analysis of the model, we present a relatively very precise method for the experimental determination of all three above-mentioned thermal parameters for materials with different thermal transport properties. The input experimental data are provided by a cooling curve of the tested material obtained in a special experimental arrangement. The evaluation of the experimental data is realized by software whose fundamental features are presented here. A statistical analysis of the experimental data was performed (99% confidence interval, P99).
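The LCM cooling curve is T(t) = T∞ + (T0 − T∞)·exp(−t/τ), and fitting the time constant τ from the measured curve is the step that yields the thermal parameters. The sketch below fits τ from synthetic data by linearizing the curve; the temperatures and τ are made up for illustration and are not the paper's measurements.

```python
# Fit the lumped-capacitance time constant tau from a cooling curve by
# linear least squares on ln(T - T_inf). Data below are synthetic.
import math

t_inf, t0, tau_true = 25.0, 95.0, 120.0          # degC, degC, seconds
times = list(range(0, 601, 30))
temps = [t_inf + (t0 - t_inf) * math.exp(-t / tau_true) for t in times]

# linearize: ln(T - T_inf) = ln(T0 - T_inf) - t / tau
xs = times
ys = [math.log(T - t_inf) for T in temps]
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
tau_fit = -1.0 / slope
```

With h and A known from the arrangement, cp then follows from τ = ρ·V·cp / (h·A), which is how a single cooling curve can deliver all three transport parameters.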
Comparative analysis of calculation models of railway subgrade
Directory of Open Access Journals (Sweden)
I.O. Sviatko
2013-08-01
Full Text Available Purpose. In the design of transport engineering structures, the primary task is to determine the parameters of the foundation soil and the nuances of its behaviour under load. When calculating the interaction between the soil subgrade and the upper track structure, it is very important to determine the shear resistance parameters and the parameters governing the development of deep deformations in foundation soils. The aim is to find generalized numerical modeling methods for embankment foundation soil behaviour that include not only the analysis of the foundation stress state but also of its deformed state. Methodology. An analysis of existing modern and classical methods of numerical simulation of soil samples under static load was made. Findings. According to traditional methods of analysis of the behaviour of ground masses, limitation and qualitative estimation of subgrade deformations is possible only indirectly, through the estimation of stresses and comparison of the obtained values with the boundary ones. Originality. A new computational model is proposed that applies not only the classical analysis of the soil subgrade stress state but also takes its deformed state into account. Practical value. The analysis showed that for an accurate analysis of the behaviour of ground masses it is necessary to develop a generalized methodology for analyzing the rolling stock - railway subgrade interaction, which will use not only the classical approach of analyzing the soil subgrade stress state, but also take into account its deformed state.
Analysis modeling for plate buckling load of vibration test
Institute of Scientific and Technical Information of China (English)
SUNG Wen-pei; LIN Cheng-I; SHIH Ming-hsiang; GO Cheer-germ
2005-01-01
In view of recent technological development, the pursuit of safe high-precision structural designs has been the goal of most structural designers. To bridge the gap between construction theories and actual construction techniques, safety factors are adopted for designing the strength loading of structural members. If safety factors are too conservative, the extra building materials necessary will result in high construction cost. Thus, there has been a tendency in the construction field to derive a precise buckling load analysis model of a member in order to establish accurate safety factors. A numerical analysis model is proposed in this paper that uses modal analysis to acquire the dynamic function, calculated from dynamic parameters, from which the buckling load of a member is obtained. Fixed and simple supports around a circular plate are analyzed by the proposed method. The Monte Carlo method and the normal distribution method are then used for random sampling and for measuring the errors of the numerical simulation, respectively. The analysis results indicated that the proposed method only needs modal parameters of 7×7 test points to obtain a theoretical value of the buckling load. Moreover, the analysis method with unequally spaced test points produces better analysis results than the other methods.
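A common minimal version of inferring buckling load from modal data (a sketch of the general idea, not the authors' model) is the vibration-correlation technique: the squared natural frequency of an axially loaded member falls roughly linearly with load, so extrapolating f² to zero estimates the buckling load. The frequencies and loads below are synthetic.

```python
# Vibration-correlation sketch: fit f^2 vs load with least squares and
# extrapolate to f^2 = 0 to estimate the buckling load. Synthetic data.

p_cr_true, f0 = 180.0, 40.0                     # kN, Hz (illustrative)
loads = [0.0, 30.0, 60.0, 90.0, 120.0]
fsq = [f0 ** 2 * (1.0 - p / p_cr_true) for p in loads]

# least-squares line fsq = a + b*p; buckling load where fsq = 0 -> p = -a/b
n = len(loads)
sp, sf = sum(loads), sum(fsq)
spp = sum(p * p for p in loads)
spf = sum(p * f for p, f in zip(loads, fsq))
b = (n * spf - sp * sf) / (n * spp - sp * sp)
a = (sf - b * sp) / n
p_cr_est = -a / b
```

The appeal, shared with the paper's approach, is that only non-destructive modal measurements at sub-critical loads are required.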
Image decomposition as a tool for validating stress analysis models
Directory of Open Access Journals (Sweden)
Mottershead J.
2010-06-01
Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions of strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
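The compression step described above, reducing a full-field map to a handful of descriptors before comparing model and experiment, can be sketched briefly. For brevity this sketch uses simple geometric moments as the descriptors instead of the Zernike moments or Fourier transforms named in the abstract, and the two strain fields are synthetic.

```python
# Reduce 2-D scalar fields to a few moment descriptors and compare them,
# a simplified stand-in for Zernike-moment image decomposition.

def moments(field):
    """Raw geometric moments up to total order 2 of a field (list of rows)."""
    desc = []
    for p in range(3):
        for q in range(3 - p):
            desc.append(sum(v * (i ** p) * (j ** q)
                            for i, row in enumerate(field)
                            for j, v in enumerate(row)))
    return desc

def distance(f1, f2):
    """Euclidean distance between the descriptor vectors of two fields."""
    return sum((a - b) ** 2 for a, b in zip(moments(f1), moments(f2))) ** 0.5

# synthetic 'model' strain field and a perturbed 'experimental' one
model = [[(i - 2) ** 2 + (j - 2) ** 2 for j in range(5)] for i in range(5)]
perturbed = [[v * 1.1 for v in row] for row in model]
```

The descriptor vector here has 6 entries instead of 25 pixels; for real maps the same idea shrinks 10^5 to 10^6 pixels to 10^1 to 10^2 descriptors, making the statistical model-vs-experiment comparison tractable.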
Urban Sprawl Analysis and Modeling in Asmara, Eritrea
Directory of Open Access Journals (Sweden)
Mussie G. Tewolde
2011-09-01
Full Text Available The extension of the urban perimeter markedly cuts into available productive land. Hence, studies in urban sprawl analysis and modeling play an important role in ensuring sustainable urban development. The urbanization pattern of the Greater Asmara Area (GAA), the capital region of Eritrea, was studied. Satellite images and geospatial tools were employed to analyze the spatiotemporal urban land-use changes. Object-Based Image Analysis (OBIA), land-use/cover change (LUCC) analysis, and urban sprawl analysis using Shannon entropy were carried out. The Land Change Modeler (LCM) was used to develop a model of urban growth. A multi-layer perceptron neural network was employed to model the transition potential maps with an accuracy of 85.9%, and these were used as input for the 'actual' urban modeling with Markov chains. Model validation was assessed and a scenario of urban land-use change in the GAA up to the year 2020 was presented. The results indicated that the built-up area tripled in size (an increase of 4,441 ha) between 1989 and 2009. Especially after 2000, urban sprawl in the GAA caused large-scale encroachment on high-potential agricultural land and plantation cover. The scenario for 2020 shows an increase of the built-up areas by 1,484 ha (25%), which may cause further loss. The study indicated that the land allocation system in the GAA overrode the land-use plan, causing the loss of agricultural land and plantation cover. The recommended policy options might help decision makers prevent further loss of agricultural land and plantation cover and achieve sustainable urban development planning in the GAA.
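The Shannon-entropy measure of sprawl used above can be sketched in a few lines: the entropy of built-up-area shares across spatial zones, normalised so that 0 means growth fully concentrated in one zone and 1 means growth spread evenly (maximal sprawl). The zone areas below are hypothetical, not the GAA data.

```python
import math

def shannon_entropy(builtup_areas):
    """Relative Shannon entropy of built-up shares across n zones:
    0 means growth concentrated in one zone, 1 means growth spread
    evenly across all zones (maximal sprawl)."""
    total = sum(builtup_areas)
    shares = [a / total for a in builtup_areas if a > 0]
    h = -sum(p * math.log(p) for p in shares)
    return h / math.log(len(builtup_areas))   # normalise to [0, 1]

# Hypothetical built-up hectares in five buffer zones around the core
compact   = [900, 60, 25, 10, 5]        # growth concentrated near the core
dispersed = [210, 195, 200, 190, 205]   # growth spread across all zones
print(shannon_entropy(compact), shannon_entropy(dispersed))
```

Tracking this statistic over successive image dates is what reveals whether growth is becoming more dispersed.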
Multiway modeling and analysis in stem cell systems biology
Directory of Open Access Journals (Sweden)
Vandenberg Scott L
2008-07-01
Full Text Available Abstract Background Systems biology refers to multidisciplinary approaches designed to uncover emergent properties of biological systems. Stem cells are an attractive target for this analysis, due to their broad therapeutic potential. A central theme of systems biology is the use of computational modeling to reconstruct complex systems from a wealth of reductionist, molecular data (e.g., gene/protein expression, signal transduction activity, metabolic activity, etc.). A number of deterministic, probabilistic, and statistical learning models are used to understand sophisticated cellular behaviors such as protein expression during cellular differentiation and the activity of signaling networks. However, many of these models are bimodal, i.e., they only consider row-column relationships. In contrast, multiway modeling techniques (also known as tensor models) can analyze multimodal data, which capture much more information about complex behaviors such as cell differentiation. In particular, tensors can be very powerful tools for modeling the dynamic activity of biological networks over time. Here, we review the application of systems biology to stem cells and illustrate the application of tensor analysis to model collagen-induced osteogenic differentiation of human mesenchymal stem cells. Results We applied Tucker1, Tucker3, and Parallel Factor Analysis (PARAFAC) models to identify protein/gene expression patterns during extracellular matrix-induced osteogenic differentiation of human mesenchymal stem cells. In one case, we organized our data into a tensor of type protein/gene locus link × gene ontology category × osteogenic stimulant, and found that our cells expressed two distinct, stimulus-dependent sets of functionally related genes as they underwent osteogenic differentiation. In a second case, we organized DNA microarray data in a three-way tensor of gene IDs × osteogenic stimulus × replicates, and found that application of tensile strain to a
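A minimal sketch of the simplest of the three decompositions named above, Tucker1, which compresses only one mode of the tensor and is equivalent to a truncated SVD of the matricized tensor. The tensor dimensions and the two-pattern structure are synthetic stand-ins for the gene × stimulus × time data, not the study's measurements.

```python
import numpy as np

def tucker1(tensor, rank):
    """Tucker1: factor only mode 0, leaving the other modes uncompressed.
    Equivalent to a truncated SVD of the mode-0 unfolding of the tensor."""
    n0 = tensor.shape[0]
    unfolded = tensor.reshape(n0, -1)            # mode-0 matricization
    u, s, vt = np.linalg.svd(unfolded, full_matrices=False)
    factors = u[:, :rank]                        # mode-0 loadings
    core = (np.diag(s[:rank]) @ vt[:rank]).reshape(rank, *tensor.shape[1:])
    return factors, core

# Synthetic rank-2 data: 50 genes x 4 stimuli x 6 time points (hypothetical)
rng = np.random.default_rng(1)
a = rng.standard_normal((50, 2))    # two latent expression patterns
b = rng.standard_normal((2, 4, 6))  # pattern activity per stimulus and time
tensor = np.einsum('gr,rst->gst', a, b)

factors, core = tucker1(tensor, rank=2)
recon = np.einsum('gr,rst->gst', factors, core)
err = np.linalg.norm(recon - tensor) / np.linalg.norm(tensor)
print(factors.shape, core.shape, err)   # near-zero reconstruction error
```

Tucker3 and PARAFAC constrain all modes simultaneously and require iterative fitting, but the idea of summarizing a multiway array by a few latent patterns per mode is the same.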
INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS
Energy Technology Data Exchange (ETDEWEB)
Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.
2011-07-18
Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
Analysis of multinomial models with unknown index using data augmentation
Royle, J. Andrew; Dorazio, R.M.; Link, W.A.
2007-01-01
Multinomial models with unknown index ('sample size') arise in many practical settings. In practice, Bayesian analysis of such models has proved difficult because the dimension of the parameter space is not fixed, being in some cases a function of the unknown index. We describe a data augmentation approach to the analysis of this class of models that provides for a generic and efficient Bayesian implementation. Under this approach, the data are augmented with all-zero detection histories. The resulting augmented dataset is modeled as a zero-inflated version of the complete-data model where an estimable zero-inflation parameter takes the place of the unknown multinomial index. Interestingly, data augmentation can be justified as being equivalent to imposing a discrete uniform prior on the multinomial index. We provide three examples involving estimating the size of an animal population, estimating the number of diabetes cases in a population using the Rasch model, and the motivating example of estimating the number of species in an animal community with latent probabilities of species occurrence and detection.
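The augmentation idea can be sketched for the simplest case: estimating animal population size under a binomial detection model (model M0). The data are padded with all-zero detection histories up to a fixed size M, and an estimable zero-inflation parameter psi replaces the unknown index, recovered as N ≈ M·psi. All numbers below are illustrative, and maximum likelihood stands in for the paper's Bayesian analysis.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

# Simulate model M0: N animals, T sampling occasions, detection prob p
rng = np.random.default_rng(2)
N_true, T, p_true = 400, 5, 0.3
detections = rng.binomial(T, p_true, size=N_true)
observed = detections[detections > 0]      # never-detected animals are unseen
n = len(observed)

# Data augmentation: pad the data with all-zero detection histories
M = 1500                                   # augmented size, M >> expected N
y = np.concatenate([observed, np.zeros(M - n)])

def neg_loglik(theta):
    """Zero-inflated binomial: each of the M augmented individuals is real
    with probability psi and, if real, detected Binomial(T, p) times."""
    psi, p = theta
    lik = psi * binom.pmf(y, T, p) + (1.0 - psi) * (y == 0)
    return -np.sum(np.log(lik))

res = minimize(neg_loglik, x0=[0.5, 0.5], bounds=[(1e-6, 1 - 1e-6)] * 2)
psi_hat, p_hat = res.x
N_hat = M * psi_hat                        # psi replaces the unknown index
print(round(N_hat), round(p_hat, 3))
```

The fixed dimension of the augmented dataset is exactly what makes the Bayesian implementation generic: the sampler never has to change the size of the parameter space.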
An Effective Distributed Model for Power System Transient Stability Analysis
Directory of Open Access Journals (Sweden)
MUTHU, B. M.
2011-08-01
Full Text Available Modern power systems consist of many interconnected synchronous generators with different inertia constants, connected to a large transmission network, with an ever-increasing demand for power exchange. The size of the power system grows exponentially due to the increase in power demand. The data required for various power system applications have been stored in different formats in a heterogeneous environment. The power system applications themselves have been developed and deployed on different platforms and language paradigms. Interoperability between power system applications becomes a major issue because of this heterogeneity. The main aim of the paper is to develop a generalized distributed model for carrying out power system stability analysis. A flexible and loosely coupled JAX-RPC model has been developed for representing transient stability analysis in large interconnected power systems. The proposed model includes Pre-Fault, During-Fault, Post-Fault and Swing Curve services, which are accessible to remote power system clients when the system is subjected to large disturbances. A generalized XML-based model for data representation has also been proposed for exchanging data in order to enhance the interoperability between legacy power system applications. The performance measure, Round Trip Time (RTT), is estimated for different power systems using the proposed JAX-RPC model and compared with the results obtained using traditional client-server and Java RMI models.
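The computation behind a Swing Curve service amounts to numerically integrating the classical swing equation through a fault and its clearance. Below is a single-machine, infinite-bus sketch under assumed per-unit parameters; the function and machine data are hypothetical, not taken from the paper.

```python
import math

def swing_curve(pm, pmax_fault, pmax_post, t_clear,
                h=5.0, f=50.0, t_end=1.0, dt=1e-3):
    """Integrate the swing equation M * d2(delta)/dt2 = Pm - Pmax*sin(delta)
    (powers in per unit) through a fault cleared at t_clear.
    Returns (time, rotor angle in rad) samples of the swing curve."""
    m = h / (math.pi * f)                  # inertia coefficient M = H/(pi*f)
    delta = math.asin(pm / pmax_post)      # pre-fault equilibrium angle
    omega = 0.0                            # rotor speed deviation, rad/s
    t, curve = 0.0, []
    while t <= t_end:
        curve.append((t, delta))
        pmax = pmax_fault if t < t_clear else pmax_post
        omega += (pm - pmax * math.sin(delta)) / m * dt   # semi-implicit Euler
        delta += omega * dt
        t += dt
    return curve

# Hypothetical machine: Pm = 0.8 pu; the fault drops transfer capability
# Pmax from 2.0 pu to 0.5 pu and is cleared after 0.1 s
curve = swing_curve(pm=0.8, pmax_fault=0.5, pmax_post=2.0, t_clear=0.1)
max_angle = max(d for _, d in curve)
print(round(math.degrees(max_angle), 1))
```

In the paper's architecture, this numerical kernel would sit behind a web-service endpoint and return the curve to remote clients as XML.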
Analysis of effect factors-based stochastic network planning model
Institute of Scientific and Technical Information of China (English)
None listed
2008-01-01
Treating all indeterminate factors as a whole and regarding activity durations as independent random variables, traditional stochastic network planning models ignore the inevitable relationships and dependence among activity durations when more than one activity may be affected by the same indeterminate factors. On the basis of an analysis of the indeterminate factors affecting durations, the effect-factors-based stochastic network planning (EFBSNP) model is proposed, which emphasizes the effects on the project period not only of logical and organizational relationships, but also of the dependence among activity durations caused by shared indeterminate factors. By virtue of indeterminate factor analysis, the model extracts and quantitatively describes the indeterminate effect factors, and then accounts for their effect on the schedule by using the Monte Carlo simulation technique. The method is flexible enough to deal with effect factors and is consistent with practice. Software has been developed in Visual Studio .NET to simplify the model-based calculation. Finally, a case study is included to demonstrate the applicability of the proposed model, and a comparison is made showing some advantages over existing models.
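The core point, that a shared effect factor induces dependence among activity durations and widens the distribution of the project period, can be sketched with a three-activity network. The factor, durations, and network below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sim = 20000

# Shared effect factor (e.g. weather) scaling the two outdoor activities
weather = rng.normal(1.0, 0.15, n_sim).clip(0.5)

# Activity durations in days; A and B share the weather factor, C does not
dur_a = rng.normal(10, 1, n_sim) * weather
dur_b = rng.normal(12, 1, n_sim) * weather
dur_c = rng.normal(15, 2, n_sim)

# Network: A -> B in series on one path, C alone on a parallel path
project = np.maximum(dur_a + dur_b, dur_c)

# Same network with the dependence ignored (traditional assumption)
independent = np.maximum(rng.normal(10, 1, n_sim) + rng.normal(12, 1, n_sim),
                         rng.normal(15, 2, n_sim))
print(project.std(), independent.std())   # shared factor widens the spread
```

Ignoring the common factor understates the variance of the completion time, which is exactly the bias the EFBSNP model is designed to remove.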
Constructing Maximum Entropy Language Models for Movie Review Subjectivity Analysis
Institute of Scientific and Technical Information of China (English)
Bo Chen; Hui He; Jun Guo
2008-01-01
Document subjectivity analysis has become an important aspect of web text content mining. The problem is similar to traditional text categorization, so many related classification techniques can be adapted to it. However, there is one significant difference: more language or semantic information is required to better estimate the subjectivity of a document. Therefore, this paper focuses mainly on two aspects. One is how to extract useful and meaningful language features, and the other is how to construct appropriate language models efficiently for this special task. For the first issue, we apply a Global-Filtering and Local-Weighting strategy to select and evaluate language features in a series of n-grams of different orders and within various distance windows. For the second issue, we adopt Maximum Entropy (MaxEnt) modeling methods to construct our language model framework. Besides the classical MaxEnt models, we have also constructed two kinds of improved models with Gaussian and exponential priors, respectively. Detailed experiments given in this paper show that with well-selected and well-weighted language features, MaxEnt models with exponential priors are significantly more suitable for the text subjectivity analysis task.
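For the binary case, a MaxEnt classifier with a Gaussian prior on the weights is logistic regression with an L2 penalty whose strength is set by the prior variance. A small numpy sketch with invented n-gram indicator features (the feature set, reviews, and hyperparameters are hypothetical):

```python
import numpy as np

def train_maxent(X, y, sigma2=10.0, lr=0.5, epochs=2000):
    """Binary maximum-entropy (logistic) model with a Gaussian prior on
    the weights, i.e. an L2 penalty with prior variance sigma2."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))               # model probabilities
        grad = X.T @ (p - y) / len(y) + w / (sigma2 * len(y))
        w -= lr * grad
    return w

def predict(X, w):
    return (X @ w > 0).astype(int)

# Toy n-gram indicator features for six "reviews" (hypothetical):
# columns: [bias, "I think", "in my opinion", "box office", "runtime"]
X = np.array([[1, 1, 1, 0, 0],   # subjective
              [1, 1, 0, 0, 0],   # subjective
              [1, 0, 1, 0, 1],   # subjective
              [1, 0, 0, 1, 1],   # objective
              [1, 0, 0, 1, 0],   # objective
              [1, 0, 0, 0, 1]],  # objective
             dtype=float)
y = np.array([1, 1, 1, 0, 0, 0])

w = train_maxent(X, y)
print(predict(X, w))
```

An exponential prior, as favored by the paper's experiments, would replace the quadratic penalty term with a one-sided linear one, encouraging sparse non-negative weights.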
Automating sensitivity analysis of computer models using computer calculus
International Nuclear Information System (INIS)
An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure, with direct and adjoint sensitivity theory, to the analysis of non-linear, iterative systems of equations is discussed. Computational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure to numerical analysis and to sensitivity studies of large-scale models.
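The computer-calculus idea behind GRESS can be shown in miniature with forward-mode automatic differentiation: augment every arithmetic operation with its chain-rule derivative, so that a single evaluation of the code yields both the result and its sensitivity. A toy Python sketch (GRESS itself operated on FORTRAN source, and adjoint mode works backwards instead):

```python
import math

class Dual:
    """Forward-mode automatic differentiation: carry a value and its
    derivative through arithmetic, the same chain-rule bookkeeping a
    derivative-augmenting precompiler like GRESS emits."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)  # product rule
    __rmul__ = __mul__

def sin(x):
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)  # chain rule

# Sensitivity of f(x) = x*sin(x) + 3x at x = 2, in a single evaluation
x = Dual(2.0, 1.0)            # seed dx/dx = 1
f = x * sin(x) + 3 * x
print(f.val, f.dot)           # f' = sin(x) + x*cos(x) + 3
```

Forward mode propagates one input sensitivity per pass; the adjoint approach emphasized in the abstract obtains sensitivities to all inputs in one reverse pass, which is why it scales to large models.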
DEFF Research Database (Denmark)
Hukkerikar, Amol Shivajirao; Sarup, Bent; Sin, Gürkan;
2013-01-01
The objective of this work is to develop a method for performing property-data-model analysis so that efficient use of knowledge of properties could be made in the development/improvement of property prediction models. The method includes: (i) analysis of property data and its consistency check...... to a wide range of properties of pure compounds. In this work, however, the application of the method is illustrated for the property modeling of normal melting point, enthalpy of fusion, enthalpy of formation, and critical temperature. For all the properties listed above, it has been possible to achieve...
Epistasis analysis for quantitative traits by functional regression model.
Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao
2014-06-01
The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for common variants and are difficult to apply to rare variants because of their prohibitive computational time and low power. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) the lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm of interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions, and collectively test interactions between all possible pairs of SNPs within two genomic regions. In other words, we take a genome region as the basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model that collectively tests interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genome regions. Through intensive simulations, we demonstrate that the functional regression models for interaction analysis of a quantitative trait have the correct type I error rates and much higher power to detect interactions than current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and the CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10^-10) in the ESP, and 11 were replicated in the CHARGE-S study.
SMV model-based safety analysis of software requirements
Energy Technology Data Exchange (ETDEWEB)
Koh, Kwang Yong [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Seong, Poong Hyun [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)], E-mail: phseong@kaist.ac.kr
2009-02-15
Fault tree analysis (FTA) is one of the most frequently applied safety analysis techniques when developing safety-critical industrial systems such as software-based emergency shutdown systems of nuclear power plants and has been used for safety analysis of software requirements in the nuclear industry. However, the conventional method for safety analysis of software requirements has several problems in terms of correctness and efficiency; the fault tree generated from natural language specifications may contain flaws or errors while the manual work of safety verification is very labor-intensive and time-consuming. In this paper, we propose a new approach to resolve problems of the conventional method; we generate a fault tree from a symbolic model verifier (SMV) model, not from natural language specifications, and verify safety properties automatically, not manually, by a model checker SMV. To demonstrate the feasibility of this approach, we applied it to shutdown system 2 (SDS2) of Wolsong nuclear power plant (NPP). In spite of subtle ambiguities present in the approach, the results of this case study demonstrate its overall feasibility and effectiveness.
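The kernel of the approach, checking a safety property against a state-machine model rather than against natural-language text, can be shown with an explicit-state reachability check. Symbolic model checkers like SMV do this with BDDs over far larger state spaces; the shutdown-system abstraction below is entirely hypothetical and only illustrates the invariant-checking idea.

```python
from collections import deque

def check_invariant(initial, step, is_bad):
    """Explicit-state check of the SMV-style invariant AG !bad: breadth-
    first search of the reachable states, returning a counterexample
    trace if a bad state is reachable, or None if the property holds."""
    queue = deque((s, (s,)) for s in initial)
    seen = set(initial)
    while queue:
        state, path = queue.popleft()
        if is_bad(state):
            return path                      # counterexample, as in SMV
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + (nxt,)))
    return None                              # invariant verified

# Hypothetical shutdown abstraction: (pressure_high, valve_open, n_unsafe),
# where the relief valve tracks the previous pressure reading and n_unsafe
# counts consecutive steps with high pressure and a closed valve.
def step(state):
    pressure, valve, n_unsafe = state
    for p in (False, True):                  # pressure is nondeterministic
        v = pressure                         # valve reacts one step late
        n = n_unsafe + 1 if (p and not v) else 0
        yield (p, v, min(n, 2))              # saturate the counter

init = {(False, False, 0)}
# Safety holds: pressure never stays high two steps with the valve closed
print(check_invariant(init, step, lambda s: s[2] >= 2))   # -> None
```

A returned counterexample trace plays the role of the fault-tree branch in the paper's approach: it shows a concrete path by which the hazardous state is reached.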
Biological exposure models for oil spill impact analysis
Institute of Scientific and Technical Information of China (English)
2000-01-01
The oil spill impact analysis (OSIA) software system has been developed to supply a tool for comprehensive, quantitative environmental impact assessments resulting from oil spills. In the system, a biological component evaluates potential effects on exposed organisms based on results from a physico-chemical fates component, including the extent and characteristics of the surface slick, and dissolved and total concentrations of hydrocarbons in the water column. The component includes a particle-based exposure model for migratory adult fish populations, a particle-based exposure model for spawning planktonic organisms (eggs and larvae), and an exposure model for wildlife species (sea birds or marine mammals). The exposure model for migratory adult fish populations simulates the migration behavior of fish populations migrating to or staying in their feeding, over-wintering or spawning areas, and determines the acute effects (mortality) and chronic accumulation (body burdens) from the dissolved contaminant. The exposure model for spawning planktonic organisms simulates the release of eggs and larvae, also as particles, from specific spawning areas during the spawning period, and determines their potential exposure to contaminants in the water or sediment. The exposure model for wildlife species calculates the exposure to surface oil of wildlife (bird and marine mammal) categories inhabiting the contaminated area. Compared with earlier models, in which all kinds of organisms are assumed to be evenly and randomly distributed, the updated biological exposure models can more realistically estimate the potential effects of oil spill pollution events on the marine ecosystem.
Delamination Modeling of Composites for Improved Crash Analysis
Fleming, David C.
1999-01-01
Finite element crash modeling of composite structures is limited by the inability of current commercial crash codes to accurately model delamination growth. Efforts are made to implement and assess delamination modeling techniques using a current finite element crash code, MSC/DYTRAN. Three methods are evaluated: a straightforward method based on monitoring forces in elements or constraints representing an interface; a cohesive fracture model proposed in the literature; and the virtual crack closure technique commonly used in fracture mechanics. Results are compared with dynamic double cantilever beam test data from the literature. Examples show that it is possible to accurately model delamination propagation in this case. However, the computational demands required for an accurate solution are great, and reliable property data may not be available to support general crash modeling efforts. Additional examples are modeled, including an impact-loaded beam, damage initiation in laminated crushing specimens, and a scaled aircraft subfloor structure in which composite sandwich structures are used as energy-absorbing elements. These examples illustrate some of the difficulties in modeling delamination as part of a finite element crash analysis.
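Of the three methods, the virtual crack closure technique is the easiest to state compactly: the energy release rate equals the work the crack-tip nodal force would do in closing the newly opened crack face. A mode-I sketch with invented DCB-like numbers (the values and the critical rate are hypothetical):

```python
def vcct_mode_i(f_tip, delta_u, da, width):
    """Virtual crack closure technique, mode I: the energy released as the
    crack advances by one element length da equals the work needed to close
    it, G_I = F * delta_u / (2 * width * da)."""
    return f_tip * delta_u / (2.0 * width * da)

# Hypothetical DCB-like numbers: tip force 120 N, relative opening 0.04 mm
# one element behind the tip, element length 1 mm, specimen width 25 mm
g_i = vcct_mode_i(f_tip=120.0, delta_u=0.04, da=1.0, width=25.0)  # N/mm
g_ic = 0.3   # assumed critical energy release rate, N/mm (= 0.3 kJ/m^2)
print(g_i, g_i >= g_ic)   # release this interface (delaminate) or not
```

In a crash code, this comparison against the critical rate decides whether the interface constraint ahead of the crack tip is released at each step, which is why accurate fracture-toughness data matter so much.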
Dynamic Response of Linear Mechanical Systems Modeling, Analysis and Simulation
Angeles, Jorge
2012-01-01
Dynamic Response of Linear Mechanical Systems: Modeling, Analysis and Simulation can be utilized for a variety of courses, including junior and senior-level vibration and linear mechanical analysis courses. The author connects, by means of a rigorous, yet intuitive approach, the theory of vibration with the more general theory of systems. The book features: A seven-step modeling technique that helps structure the rather unstructured process of mechanical-system modeling A system-theoretic approach to deriving the time response of the linear mathematical models of mechanical systems The modal analysis and the time response of two-degree-of-freedom systems—the first step on the long way to the more elaborate study of multi-degree-of-freedom systems—using the Mohr circle Simple, yet powerful simulation algorithms that exploit the linearity of the system for both single- and multi-degree-of-freedom systems Examples and exercises that rely on modern computational toolboxes for both numerical and symbolic compu...
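The two-degree-of-freedom modal analysis the book builds toward reduces to a generalized eigenproblem K·v = ω²·M·v. A numpy sketch for an assumed two-mass chain (the masses and stiffnesses are illustrative, and the book's Mohr-circle construction is not reproduced):

```python
import numpy as np

# Two-DOF chain: masses m1, m2; springs k1 (ground-m1) and k2 (m1-m2)
m1, m2, k1, k2 = 1.0, 1.0, 100.0, 100.0
M = np.diag([m1, m2])
K = np.array([[k1 + k2, -k2],
              [-k2,      k2]])

# Generalized eigenproblem K v = w^2 M v, solved via M^-1 K (M is diagonal)
evals, evecs = np.linalg.eig(np.linalg.inv(M) @ K)
order = np.argsort(evals.real)
omegas = np.sqrt(evals.real[order])   # natural frequencies, rad/s
modes = evecs.real[:, order]          # corresponding mode shapes
print(omegas)                         # ~ [6.18, 16.18] for these values
```

The time response then follows by projecting initial conditions onto the mode shapes and superposing the resulting single-frequency oscillations, which is the system-theoretic route the book takes.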
A Suite of Tools for ROC Analysis of Spatial Models
Directory of Open Access Journals (Sweden)
Hermann Rodrigues
2013-09-01
Full Text Available The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events, such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of the Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), partial AUC, lower and upper AUCs, the confidence interval of the AUC, the density of events in probability bins, and tests to evaluate the difference between the AUCs of two models. We first present the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
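The core ROC computation behind such tools is compact: sweep a threshold down through the sorted model scores, accumulate true- and false-positive rates, and integrate. A numpy sketch with an invented eight-cell example (a real spatial model would feed per-pixel change probabilities and observed change):

```python
import numpy as np

def roc_curve(scores, labels):
    """ROC points (false-positive rate, true-positive rate) obtained by
    sweeping the decision threshold down through the sorted scores."""
    order = np.argsort(-scores)
    labels = labels[order]
    tpr = np.cumsum(labels) / labels.sum()
    fpr = np.cumsum(1 - labels) / (len(labels) - labels.sum())
    return np.concatenate([[0.0], fpr]), np.concatenate([[0.0], tpr])

def auc(fpr, tpr):
    """Area under the ROC curve by trapezoidal integration."""
    return float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2.0))

# Hypothetical model scores for eight map cells; label 1 = observed change
scores = np.array([0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.1])
labels = np.array([1,   1,   0,   1,   0,    0,   1,   0])
fpr, tpr = roc_curve(scores, labels)
auc_val = auc(fpr, tpr)
print(round(auc_val, 3))   # -> 0.75
```

Partial AUCs, as offered by the suite, simply restrict the same integration to a sub-interval of false-positive rates of interest.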
Application of Chaboche Model in Rocket Thrust Chamber Analysis
Asraff, Ahmedul Kabir; Suresh Babu, Sheela; Babu, Aneena; Eapen, Reeba
2015-12-01
Liquid propellant rocket engines are commonly used in space technology. The thrust chamber is one of the most important subsystems of a rocket engine. It generates the propulsive thrust force for flight of the rocket by ejecting combustion products at supersonic speeds. Double-walled construction is often employed for these chambers. The thrust chamber investigated here has its hot inner wall fabricated from a highly thermally conductive material such as a copper alloy, with an outer wall made of stainless steel. The inner wall is subjected to high thermal and pressure loads during operation of the engine, due to which it will be in the plastic regime. The main reasons for the failure of such chambers are fatigue in the plastic range (called low-cycle fatigue, since the number of cycles to failure is low in the plastic range), creep, and thermal ratcheting. Elasto-plastic material models are required to simulate the above effects through a cyclic stress analysis. This paper gives the details of the cyclic stress analysis carried out for the thrust chamber using different plasticity model combinations available in the ANSYS (Version 15) FE code. The best model among these is applied in the cyclic stress analysis of two-dimensional (plane strain and axisymmetric) and three-dimensional finite element models of the thrust chamber. The cyclic life of the chamber is calculated from the stress-strain graphs obtained from the above analyses.
Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications
Boghosian, Mary; Narvaez, Pablo; Herman, Ray
2012-01-01
The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system-level environmental testing. The JUNO magnetic cleanliness program required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, and a test program and facility at JPL for testing system parts and subsystems. The magnetic modeling, simulation and analysis capability was set up and performed by Aerospace to provide qualitative and quantitative magnetic assessments of the magnetic parts, components, and subsystems prior to, or in lieu of, magnetic tests. Because of the sensitive nature of the fields-and-particles scientific measurements being conducted by the JUNO space mission to Jupiter, the imposition of stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not magnetically contaminated by flight system magnetic interference. With Aerospace's magnetic modeling, simulation and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project achieved a magnetically clean spacecraft cost-effectively. This paper presents lessons learned from the JUNO magnetic testing approach and from Aerospace's modeling, simulation and analysis activities used to solve problems such as remnant magnetization and the performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.
Failure Propagation Modeling and Analysis via System Interfaces
Directory of Open Access Journals (Sweden)
Lin Zhao
2016-01-01
Full Text Available Safety-critical systems must be shown to be acceptably safe to deploy and use in their operational environment. One of the key concerns in developing safety-critical systems is to understand how the system behaves in the presence of failures, regardless of whether a failure is triggered by the external environment or caused by internal errors. Safety assessment at the early stages of system development involves analysis of potential failures and their consequences. Increasingly, for complex systems, model-based safety assessment is becoming more widely used. In this paper we propose an approach to safety analysis based on system interface models. By extending interaction models at the system interface level with failure modes, as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the failure analysis. We focus on fault modeling and on how to compute minimal cut sets. In particular, we explore a state-space reconstruction strategy and a bounded searching technique to reduce the number of states that need to be analyzed, which markedly improves the efficiency of the cut-set search algorithm.
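For small interface models, minimal cut sets can be computed by brute-force enumeration over subsets of basic failure events, keeping only the failing subsets that are minimal under inclusion. The failure logic below is a hypothetical two-sensor/one-actuator example; the bounded-search technique the paper proposes is precisely what replaces this enumeration at scale.

```python
from itertools import combinations

def minimal_cut_sets(basic_events, system_fails):
    """Enumerate minimal cut sets: subsets of failed basic events that
    cause system failure and are minimal under set inclusion.  Brute
    force, adequate only for small interface models."""
    cuts = []
    for r in range(1, len(basic_events) + 1):
        for combo in combinations(basic_events, r):
            s = set(combo)
            if system_fails(s) and not any(c <= s for c in cuts):
                cuts.append(s)      # minimal: no known cut is a subset
    return cuts

# Hypothetical interface failure logic: the system fails if the sensor
# AND its backup both fail, or if the single actuator fails.
def fails(failed):
    return ({'sensor', 'backup'} <= failed) or ('actuator' in failed)

mcs = minimal_cut_sets(['sensor', 'backup', 'actuator'], fails)
print(mcs)   # the actuator alone, and the sensor-backup pair together
```

Each minimal cut set names a smallest combination of component failures sufficient to bring the system down, which is why single-element cut sets (here, the actuator) flag the highest-priority design weaknesses.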
Improving Credit Scorecard Modeling Through Applying Text Analysis
Directory of Open Access Journals (Sweden)
Omar Ghailan
2016-04-01
Full Text Available In credit card scoring and loan management, the prediction of an applicant's future behavior is an important decision-support tool and a key factor in reducing the risk of loan default. Many data mining and classification approaches have been developed for credit scoring. To the best of our knowledge, building a credit scorecard by analyzing the textual data in the application form has not been explored so far. This paper proposes a comprehensive credit scorecard modeling technique that improves credit scorecards through employing textual data analysis. The study uses a sample of loan application forms from a financial institution providing loan services in Yemen, which represents a real-world situation of credit scoring and loan management. The sample contains a set of Arabic textual data attributes describing the applicants. A credit scoring model based on text mining pre-processing and logistic regression techniques is proposed and evaluated through a comparison with a group of credit scorecard modeling techniques that use only the numeric attributes in the application form. The results show that adding the analysis of textual attributes achieves higher classification effectiveness and outperforms traditional numeric-only data analysis techniques.
Energy Technology Data Exchange (ETDEWEB)
Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Jenkin, Thomas [National Renewable Energy Lab. (NREL), Golden, CO (United States); Milford, James [National Renewable Energy Lab. (NREL), Golden, CO (United States); Short, Walter [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Evans, David [US Environmental Protection Agency (EPA), Cincinnati, OH (United States); Lieberman, Elliot [US Environmental Protection Agency (EPA), Cincinnati, OH (United States); Goldstein, Gary [International Resources Group, Washington, DC (United States); Wright, Evelyn [International Resources Group, Washington, DC (United States); Jayaraman, Kamala R. [ICF International, Fairfax, VA (United States); Venkatesh, Boddu [ICF International, Fairfax, VA (United States); Kleiman, Gary [Northeast States for Coordinated Air Use Management, Boston, MA (United States); Namovicz, Christopher [Energy Information Administration, Washington, DC (United States); Smith, Bob [Energy Information Administration, Washington, DC (United States); Palmer, Karen [Resources of the Future, Washington, DC (United States); Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wood, Frances [OnLocation Inc., Vienna, VA (United States)
2009-09-01
Energy system modeling can be intentionally or unintentionally misused by decision-makers. This report describes how both can be minimized through careful use of models and thorough understanding of their underlying approaches and assumptions. The analysis summarized here assesses the impact that model and data choices have on forecasting energy systems by comparing seven different electric-sector models. This analysis was coordinated by the Renewable Energy and Efficiency Modeling Analysis Partnership (REMAP), a collaboration among governmental, academic, and nongovernmental participants.
Multi-model Analysis through Delayed Mode Evaluation
Hankin, S.; Manke, A.; O'Brien, K.; Smith, K.; Schweitzer, R.; Weusijana, K.; Williams, D.
2012-12-01
As data volumes from both simulations and instruments continue to grow ever larger, there is a need to find ways of dealing with this staggering amount of information without overloading tools or network infrastructure. For example, during the CMIP3 experiments in 2005-2006, there were a total of 12 model experiments generating 36 TBytes of data. Fast forward to the CMIP5 experiment just five years later, and you'll find 110 experiments generating over 3 petabytes of data! This presentation will focus on the data visualization and analysis tools developed by the Thermal Modeling and Analysis Project at NOAA's Pacific Marine Environmental Lab, including Ferret, PyFerret, the Live Access Server and the Ferret-THREDDS Data Server. These tools have all been evolving to meet the ever-growing demands of users needing to analyze and visualize large data collections, such as those generated by the CMIP5 experiments. We'll talk about how these tools implement both the OPeNDAP protocol and delayed-mode analysis to minimize the data that needs to be transferred over the network. We'll demonstrate the Ferret and PyFerret enhancements of two additional dimensions, giving those tools the capability to work in six dimensions. These additional dimensions could represent any coordinate direction, but they would typically represent a collection of model runs, otherwise known as an ensemble, and forecast time for forecast model run collections. In addition, we'll touch on the wide variety of Python/SciPy analysis capabilities available in PyFerret. Lastly, we'll talk about how we are integrating all of these capabilities in the newly redesigned Live Access Server to give the user the ability to visualize, analyze and compare complex data such as ensembles and forecast model run collections in an easy-to-use web-based interface.
Sensitivity Analysis of the Integrated Medical Model for ISS Programs
Goodenow, D. A.; Myers, J. G.; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Young, M.
2016-01-01
Sensitivity analysis estimates the relative contribution of the uncertainty in input values to the uncertainty of model outputs. Partial Rank Correlation Coefficient (PRCC) and Standardized Rank Regression Coefficient (SRRC) are methods of conducting sensitivity analysis on nonlinear simulation models like the Integrated Medical Model (IMM). The PRCC method estimates the sensitivity using partial correlation of the ranks of the generated input values to each generated output value. The partial part is so named because adjustments are made for the linear effects of all the other input values in the calculation of correlation between a particular input and each output. In SRRC, standardized regression-based coefficients measure the sensitivity of each input, adjusted for all the other inputs, on each output. Because the relative ranking of each of the inputs and outputs is used, as opposed to the values themselves, both methods accommodate the nonlinear relationship of the underlying model. As part of the IMM v4.0 validation study, simulations are available that predict 33 person-missions on ISS and 111 person-missions on STS. These simulated data predictions feed the sensitivity analysis procedures. The inputs to the sensitivity procedures include the number of occurrences of each of the one hundred IMM medical conditions generated over the simulations and the associated IMM outputs: total quality time lost (QTL), number of evacuations (EVAC), and number of loss of crew lives (LOCL). The IMM team will report the results of using PRCC and SRRC on IMM v4.0 predictions of the ISS and STS missions created as part of the external validation study. Tornado plots will assist in the visualization of the condition-related input sensitivities to each of the main outcomes. The outcomes of this sensitivity analysis will drive review focus by identifying conditions where changes in uncertainty could drive changes in overall model output uncertainty. These efforts are an integral
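A generic sketch of the PRCC computation described above (illustrative only; not the IMM team's implementation, and `prcc` and the toy data are invented): rank-transform everything, regress the linear effect of the other input out of both the target input and the output, then correlate the residuals.

```python
import math
import random

def ranks(v):
    # rank transform: 1 for the smallest value, len(v) for the largest
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rk, i in enumerate(order, start=1):
        r[i] = float(rk)
    return r

def residuals(y, x):
    # residuals of the simple linear regression y ~ x
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return [b - (my + slope * (a - mx)) for a, b in zip(x, y)]

def pearson(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = math.sqrt(sum((a - mu) ** 2 for a in u) * sum((b - mv) ** 2 for b in v))
    return num / den

def prcc(target, other, output):
    # partial rank correlation: correlate rank residuals after removing
    # the linear-in-ranks effect of the other input
    rt, ro, ry = ranks(target), ranks(other), ranks(output)
    return pearson(residuals(rt, ro), residuals(ry, ro))

# toy nonlinear model: the output is driven mainly by x1
random.seed(1)
x1 = [random.random() for _ in range(500)]
x2 = [random.random() for _ in range(500)]
y = [a ** 3 + 0.1 * b + random.gauss(0.0, 0.02) for a, b in zip(x1, x2)]
print(round(prcc(x1, x2, y), 2), round(prcc(x2, x1, y), 2))  # x1 scores near 1
```

Because only ranks enter the calculation, the cubic relationship does not hurt the sensitivity estimate, which is the point made in the abstract.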
Multifunctional Collaborative Modeling and Analysis Methods in Engineering Science
Ransom, Jonathan B.; Broduer, Steve (Technical Monitor)
2001-01-01
Engineers are challenged to produce better designs in less time and for less cost. Hence, to investigate novel and revolutionary design concepts, accurate, high-fidelity results must be assimilated rapidly into the design, analysis, and simulation process. This assimilation should consider diverse mathematical modeling and multi-discipline interactions necessitated by concepts exploiting advanced materials and structures. Integrated high-fidelity methods with diverse engineering applications provide the enabling technologies to assimilate these high-fidelity, multi-disciplinary results rapidly at an early stage in the design. These integrated methods must be multifunctional, collaborative, and applicable to the general field of engineering science and mechanics. Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple-method approach is advantageous when interfacing diverse disciplines in which each of the method's strengths are utilized. The multifunctional methodology presented provides an effective mechanism by which domains with diverse idealizations are
Wire-wrap models for subchannel blockage analysis
Energy Technology Data Exchange (ETDEWEB)
Ha, K. S.; Jeong, H. Y.; Chang, W. P.; Kwon, Y. M.; Lee, Y. B. [KAERI, Taejon (Korea, Republic of)
2004-04-01
The distributed resistance model has recently been implemented into the MATRA-LMR code in order to improve its prediction capability over the wire-wrap model for flow blockage analysis in the LMR. The code capability has been investigated using experimental data observed in the FFM (Fuel Failure Mock-up)-2A and 5B for two typical flow conditions in a blocked channel. The results predicted by MATRA-LMR with the distributed resistance model agreed well with the experimental data for wire-wrapped subchannels. However, it is suggested that the parameter n in the distributed resistance model needs to be calibrated accurately for a reasonable prediction of the temperature field under a low flow condition. Finally, analyses of a blockage in the assembly of the KALIMER design were performed. Satisfactory results from the MATRA-LMR code were obtained and verified through a comparison with results of the SABRE code.
GIS application on spatial landslide analysis using statistical based models
Pradhan, Biswajeet; Lee, Saro; Buchroithner, Manfred F.
2009-09-01
This paper presents the assessment results of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis at Penang Island in Malaysia. Landslide locations within the study areas were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial data sets. Ten landslide-related factors were extracted from the spatial database, and the frequency ratio, fuzzy logic, and bivariate logistic regression coefficients of each factor were computed. Finally, landslide susceptibility maps were drawn for the study area using the frequency ratio, fuzzy logic and bivariate logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the bivariate logistic regression model provides slightly higher prediction accuracy than the frequency ratio and fuzzy logic models.
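The frequency ratio model mentioned above can be sketched as follows (a minimal illustration with invented toy data, not the paper's actual GIS workflow): the ratio compares the share of landslide cells in a factor class with that class's share of the whole study area.

```python
def frequency_ratio(class_ids, landslide):
    # FR = (share of landslide cells in a class) / (share of all cells in it);
    # FR > 1 marks classes more landslide-prone than the area average
    total_cells = len(class_ids)
    total_slides = sum(landslide)
    fr = {}
    for c in set(class_ids):
        cells = [s for cid, s in zip(class_ids, landslide) if cid == c]
        fr[c] = (sum(cells) / total_slides) / (len(cells) / total_cells)
    return fr

# toy slope-class map: "steep" cells host most of the observed landslides
cells = ["steep"] * 20 + ["flat"] * 80
slides = [1] * 10 + [0] * 10 + [1] * 2 + [0] * 78
fr = frequency_ratio(cells, slides)
print(fr["steep"] > 1.0 > fr["flat"])
```

Summing the FR values of all ten factors cell by cell would then yield the susceptibility index from which the map is drawn.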
Empirical Analysis of Xinjiang's Bilateral Trade: Gravity Model Approach
Institute of Scientific and Technical Information of China (English)
CHEN Xuegang; YANG Zhaoping; LIU Xuling
2008-01-01
Based on the basic trade gravity model and Xinjiang's practical situation, new explanatory variables (GDP, GDPpc and SCO) are introduced to build an extended trade gravity model fitting Xinjiang's bilateral trade. From the empirical analysis of this model, it is proposed that those three variables affect Xinjiang's bilateral trade positively, whereas geographic distance is found to be a significant factor influencing Xinjiang's bilateral trade negatively. Then, using the extended trade gravity model, this article quantitatively analyzes the trade situation between Xinjiang and its main trade partners in 2004. The results indicate that Xinjiang cooperates successfully with most of its trade partners in terms of present economic scale and development level. Xinjiang has successfully established trade partnerships with Central Asia, Central and Eastern Europe, Western Europe, East Asia and South Asia. However, the development of foreign trade with West Asia is much slower. Finally, some suggestions on developing Xinjiang's foreign trade are put forward.
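A minimal sketch of how a log-linearised gravity equation is fitted by ordinary least squares (illustrative synthetic data; the paper's actual variables GDP, GDPpc and SCO are not reproduced here):

```python
import math
import random

def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 system
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(3):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_gravity(trade, gdp_prod, dist):
    # OLS on the log-linearised gravity equation:
    # ln(trade) = a + b*ln(GDP_i * GDP_j) + c*ln(distance)
    X = [[1.0, math.log(g), math.log(d)] for g, d in zip(gdp_prod, dist)]
    y = [math.log(t) for t in trade]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    return solve3(XtX, Xty)

# synthetic bilateral flows with known elasticities 0.8 and -0.6
random.seed(0)
gdp_prod = [random.uniform(1.0, 100.0) for _ in range(200)]
dist = [random.uniform(100.0, 5000.0) for _ in range(200)]
trade = [2.0 * g ** 0.8 * d ** -0.6 for g, d in zip(gdp_prod, dist)]

a, b, c = fit_gravity(trade, gdp_prod, dist)
print(round(b, 2), round(c, 2))  # recovers 0.8 and -0.6
```

The positive GDP elasticity and negative distance elasticity mirror the signs the abstract reports for Xinjiang's trade.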
Statistical models of video structure for content analysis and characterization.
Vasconcelos, N; Lippman, A
2000-01-01
Content structure plays an important role in the understanding of video. In this paper, we argue that knowledge about structure can be used both as a means to improve the performance of content analysis and to extract features that convey semantic information about the content. We introduce statistical models for two important components of this structure, shot duration and activity, and demonstrate the usefulness of these models with two practical applications. First, we develop a Bayesian formulation for the shot segmentation problem that is shown to extend the standard thresholding model in an adaptive and intuitive way, leading to improved segmentation accuracy. Second, by applying the transformation into the shot duration/activity feature space to a database of movie clips, we also illustrate how the Bayesian model captures semantic properties of the content. We suggest ways in which these properties can be used as a basis for intuitive content-based access to movie libraries.
An educational model for ensemble streamflow simulation and uncertainty analysis
Directory of Open Access Journals (Sweden)
A. AghaKouchak
2013-02-01
Full Text Available This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
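An ensemble simulation scheme of this kind can be sketched with a deliberately simplified one-parameter bucket model (NOT the actual HBV equations; the function names and parameter range are invented): sampling an uncertain parameter yields an uncertainty band on the simulated streamflow.

```python
import random

def bucket_model(precip, k):
    # one-parameter linear-reservoir sketch (not the real HBV model):
    # outflow each step is a fraction k of the current storage
    storage, flow = 0.0, []
    for p in precip:
        storage += p
        q = k * storage
        storage -= q
        flow.append(q)
    return flow

def ensemble_band(precip, n=200):
    # sample the uncertain recession parameter k to build a 90% band
    # on the final simulated flow
    random.seed(0)
    finals = sorted(bucket_model(precip, random.uniform(0.1, 0.5))[-1] for _ in range(n))
    return finals[int(0.05 * n)], finals[int(0.95 * n)]

precip = [5.0] * 10 + [0.0] * 10   # a wet spell followed by a recession
lo, hi = ensemble_band(precip)
print(0.0 < lo < hi)
```

The width of the (lo, hi) band is exactly the kind of parameter uncertainty the toolbox is built to make visible to students.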
Introduction to modeling and analysis of stochastic systems
Kulkarni, V G
2011-01-01
This is an introductory-level text on stochastic modeling. It is suited for undergraduate students in engineering, operations research, statistics, mathematics, actuarial science, business management, computer science, and public policy. It employs a large number of examples to teach the students to use stochastic models of real-life systems to predict their performance, and use this analysis to design better systems. The book is devoted to the study of important classes of stochastic processes: discrete and continuous time Markov processes, Poisson processes, renewal and regenerative processes, semi-Markov processes, queueing models, and diffusion processes. The book systematically studies the short-term and the long-term behavior, cost/reward models, and first passage times. All the material is illustrated with many examples, and case studies. The book provides a concise review of probability in the appendix. The book emphasizes numerical answers to the problems. A collection of MATLAB programs to accompany...
An educational model for ensemble streamflow simulation and uncertainty analysis
Directory of Open Access Journals (Sweden)
A. AghaKouchak
2012-06-01
Full Text Available This paper presents a hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this model, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The model includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used not only for hydrological processes, but also for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity.
Modeling Illicit Drug Use Dynamics and Its Optimal Control Analysis
Directory of Open Access Journals (Sweden)
Steady Mushayabasa
2015-01-01
Full Text Available The global burden of death and disability attributable to illicit drug use remains a significant threat to public health for both developed and developing nations. This paper presents a new mathematical modeling framework to investigate the effects of illicit drug use in the community. In our model the transmission process is captured as a social “contact” process between the susceptible individuals and illicit drug users. We conduct both epidemic and endemic analysis, with a focus on the threshold dynamics characterized by the basic reproduction number. Using our model, we present illustrative numerical results with a case study in Cape Town, Gauteng, Mpumalanga and Durban communities of South Africa. In addition, the basic model is extended to incorporate time dependent intervention strategies.
Modeling Illicit Drug Use Dynamics and Its Optimal Control Analysis.
Mushayabasa, Steady; Tapedzesa, Gift
2015-01-01
The global burden of death and disability attributable to illicit drug use remains a significant threat to public health for both developed and developing nations. This paper presents a new mathematical modeling framework to investigate the effects of illicit drug use in the community. In our model the transmission process is captured as a social "contact" process between the susceptible individuals and illicit drug users. We conduct both epidemic and endemic analysis, with a focus on the threshold dynamics characterized by the basic reproduction number. Using our model, we present illustrative numerical results with a case study in Cape Town, Gauteng, Mpumalanga and Durban communities of South Africa. In addition, the basic model is extended to incorporate time dependent intervention strategies. PMID:26819625
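A generic compartmental sketch consistent with the description above (not the authors' exact model; all parameter values are invented): susceptibles become users through a social "contact" term, and the basic reproduction number decides whether drug use persists.

```python
def r0(beta, mu, delta):
    # basic reproduction number of the drug-use "epidemic":
    # contact rate over the mean duration of use
    return beta / (mu + delta)

def simulate(beta, mu=0.02, delta=0.1, u0=0.01, dt=0.1, steps=8000):
    # s: susceptible fraction, u: user fraction; quitters return to s,
    # and recruitment mu balances exits so s + u stays 1 (forward Euler)
    s, u = 1.0 - u0, u0
    for _ in range(steps):
        ds = mu - beta * s * u - mu * s + delta * u
        du = beta * s * u - (mu + delta) * u
        s, u = s + dt * ds, u + dt * du
    return u

print(round(r0(0.3, 0.02, 0.1), 2))   # 2.5: above 1, so use persists
print(round(simulate(0.3), 2))        # endemic user fraction, ≈ 0.6
print(round(simulate(0.05), 3))       # r0 < 1: drug use dies out
```

The threshold behavior at r0 = 1 is the "threshold dynamics" the abstract refers to; interventions can be thought of as reducing beta or raising delta.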
Eikonal model analysis of elastic hadron collisions at high energies
Prochazka, Jiri
2016-01-01
Elastic collisions of protons at different energies represent the main background in studying the structure of fundamental particles at present. On the basis of the standardly used model proposed by West and Yennie (WY), protons have been interpreted as transparent objects, and elastic events have been interpreted as more central than inelastic ones. It will be shown that using the eikonal model the protons may be interpreted in agreement with the usual ontological conception, elastic processes being more peripheral than inelastic ones. The corresponding results (differing fundamentally from those of the WY model) will be presented by analyzing the most ample elastic data set, measured at the ISR energy of 53 GeV. A detailed analysis of the measured differential cross section will be performed and different alternatives of peripheral behavior on the basis of the eikonal model will be presented. The impact of recently established electromagnetic form factors on the determination of quantities specifying hadron interaction determined from the fit...
Dimensionless Analysis and Numerical Modeling of Rebalancing Phenomena During Levitation
Gao, Lei; Shi, Zhe; Li, Donghui; McLean, Alexander; Chattopadhyay, Kinnor
2016-06-01
Electromagnetic levitation (EML) has proved to be a powerful tool for research activities in areas pertaining to materials physics and engineering. The customized EML setups used in various fields, ranging from solidification to nanomaterial manufacturing, require the design of stable levitation systems. Since the levitated droplet is opaque, the most effective way to study EML is mathematical modeling. In the present study, a 3D model was built to investigate the rebalancing phenomenon causing instabilities during droplet melting. A mathematical model based on a modified Hooke's law (spring analogy) was proposed to describe the levitation system. This was combined with dimensionless analysis to investigate the generation of levitation forces, as these significantly affect the behavior of the spring model.
Equity and carbon emissions trading: a model analysis
International Nuclear Information System (INIS)
Carbon emissions trading is a key instrument of climate policy. It helps to bring about emission reductions in that place where they are least costly. However, fair burden sharing is about more than just cost-efficiency. While focussing on the instrument of emissions trading, this paper touches upon equity issues that frame decisions on emission rights allocation. The analysis is based on the ICLIPS model. The model study gives new insights into how the equal per capita allocation principle influences the intertemporal emission paths and into the distribution of mitigation costs in the long run. Apart from the intuitive economic evaluation of model results, this paper also attempts to provide an evaluation from an equity point of view. For a variety of assumptions, model results show that several developing countries could benefit considerably from joining an international emissions trading system, thereby becoming potential collaborators in post-Kyoto climate agreements
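The cost-efficiency logic of emissions trading can be illustrated with a two-party toy calculation (assumed quadratic abatement costs; not the ICLIPS model): trading equalises marginal abatement costs at the permit price, so reductions happen where they are cheapest.

```python
def permit_price(baselines, caps, slopes):
    # with quadratic abatement cost 0.5*a_i*x_i**2, each party abates until
    # its marginal cost a_i*x_i equals the permit price p; total abatement
    # must cover the overall shortfall sum(baseline_i - cap_i)
    shortfall = sum(b - c for b, c in zip(baselines, caps))
    p = shortfall / sum(1.0 / a for a in slopes)
    abatement = [p / a for a in slopes]
    return p, abatement

# two-party toy: the low-cost abater (slope 0.5) does most of the abating
# and sells the surplus permits to the high-cost abater
p, x = permit_price(baselines=[100.0, 100.0], caps=[80.0, 90.0], slopes=[2.0, 0.5])
print(p, x)  # price 12.0; abatement [6.0, 24.0]
```

Note the second party abates 24 units against a shortfall of only 10, selling 14 permits; how the resulting gains are shared is precisely the equity question the paper raises.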
Rotor-Flying Manipulator: Modeling, Analysis, and Control
Directory of Open Access Journals (Sweden)
Bin Yang
2014-01-01
Full Text Available Equipping multijoint manipulators on a mobile robot is a typical redesign scheme that enables the robot to actively influence its surroundings, and it has been extensively used for many ground robots, underwater robots, and space robotic systems. However, the rotor-flying robot (RFR) is difficult to redesign in this way. This is mainly because the motion of the manipulator introduces heavy coupling between itself and the RFR system, which makes the system model highly complicated and the controller design difficult. Thus, in this paper, the modeling, analysis, and control of the combined system, called the rotor-flying multijoint manipulator (RF-MJM), are conducted. Firstly, the detailed dynamics model is constructed and analyzed. Subsequently, a full-state feedback linear quadratic regulator (LQR) controller is designed by obtaining a linearized model near the steady state. Finally, simulations are conducted and the results are analyzed to show the basic control performance.
Planetary and interplanetary environmental models for radiation analysis
de Angelis, G.; Cucinotta, F. A.
In this introductory talk the essence of environmental modeling is presented as suited for radiation analysis purposes. The variables of fundamental importance for radiation environmental assessment are discussed. The characterization is performed by dividing modeling into three areas, namely the interplanetary medium, the circumplanetary environment, and the planetary or satellite surface. In the first area, the galactic cosmic rays (GCR) and their modulation by the heliospheric magnetic field as well as solar particle events (SPE) are considered, in the second area the magnetospheres are taken into account, and in the third area the effect of the planetary environment is also considered. Planetary surfaces and atmospheres are modeled based on results from the most recent targeted spacecraft. The results are coupled with suited visualization techniques and radiation transport models in support of trade studies of spacecraft and crew health risks for future exploration missions.
Benchmarking analysis of three multimedia models: RESRAD, MMSOILS, and MEPAS
International Nuclear Information System (INIS)
Multimedia modelers from the United States Environmental Protection Agency (EPA) and the United States Department of Energy (DOE) collaborated to conduct a comprehensive and quantitative benchmarking analysis of three multimedia models. The three models-RESRAD (DOE), MMSOILS (EPA), and MEPAS (DOE)-represent analytically based tools that are used by the respective agencies for performing human exposure and health risk assessments. The study is performed by individuals who participate directly in the ongoing design, development, and application of the models. A list of physical/chemical/biological processes related to multimedia-based exposure and risk assessment is first presented as a basis for comparing the overall capabilities of RESRAD, MMSOILS, and MEPAS. Model design, formulation, and function are then examined by applying the models to a series of hypothetical problems. Major components of the models (e.g., atmospheric, surface water, groundwater) are evaluated separately and then studied as part of an integrated system for the assessment of a multimedia release scenario to determine effects due to linking components of the models. Seven modeling scenarios are used in the conduct of this benchmarking study: (1) direct biosphere exposure, (2) direct release to the air, (3) direct release to the vadose zone, (4) direct release to the saturated zone, (5) direct release to surface water, (6) surface water hydrology, and (7) multimedia release. Study results show that the models differ with respect to (1) environmental processes included (i.e., model features) and (2) the mathematical formulation and assumptions related to the implementation of solutions (i.e., parameterization)
Nuclear power plant modeling and steam generator stability analysis
International Nuclear Information System (INIS)
This thesis describes the development of a computer model simulating the transient behavior of a pressurized water reactor (PWR) nuclear steam supply system (NSSS) and a stability analysis of steam generators in an overall NSSS structure. In the analysis of steam generator stability characteristics, an emphasis was placed on the physical interpretation of density wave oscillation (DWO) phenomena in boiling channels. The PWR NSSS code TRANSG-P is based on the nonlinear steam generator code TRANSG, in which the basic flow channel and heat-exchanger models were previously formulated. In addition to the steam generator, the TRANSG-P code includes models for the pressurizer, the pump, and the turbine. The mathematical model for fluid channels is based upon one-dimensional, nonlinear, single-fluid conservation equations of mass, momentum, and energy. Space and time discretization of these equations is accomplished using an implicit finite-difference formulation. The pressurizer is modeled as a nonequilibrium system at uniform pressure, consisting of vapor and liquid regions. Flashing and condensation are accounted for, and control elements are also modeled. The pump behavior is determined by making use of homologous curves, whereas simple energy conservation and choked flow equations are used to model the turbine. Efforts were made to assess the accuracy of the entire plant model of the TRANSG-P code through simulation of a loss-of-feedwater accident that occurred at a PWR plant. The TRANSG-P results are in reasonable agreement with the plant data, which inherently are subject to considerable uncertainties. In addition, once-through and natural-circulation boiling channel calculations, performed for the investigation of flow stability characteristics, showed good agreement with the test data
Radiolysis Model Sensitivity Analysis for a Used Fuel Storage Canister
Energy Technology Data Exchange (ETDEWEB)
Wittman, Richard S.
2013-09-20
This report fulfills the M3 milestone (M3FT-13PN0810027) to report on a radiolysis computer model analysis that estimates the generation of radiolytic products for a storage canister. The analysis considers radiolysis outside storage canister walls and within the canister fill gas over a possible 300-year lifetime. Previous work relied on estimates based directly on a water radiolysis G-value. This work also includes that effect with the addition of coupled kinetics for 111 reactions for 40 gas species to account for radiolytic-induced chemistry, which includes water recombination and reactions with air.
Data warehouse model design technology analysis and research
Jiang, Wenhua; Li, Qingshui
2012-01-01
Existing data storage formats cannot meet the needs of information analysis, which has brought the data warehouse onto the historical stage; a data warehouse is a data collection specially designed to support business decision making. With a data warehouse, a company stores all of its collected information in the warehouse, organized in a way that makes the information easy to access and valuable. This paper focuses on the establishment, analysis, and design of data warehouses, presents two data warehouse models, and compares them.
Low-rank and sparse modeling for visual analysis
Fu, Yun
2014-01-01
This book provides a view of low-rank and sparse computing, especially approximation, recovery, representation, scaling, coding, embedding and learning among unconstrained visual data. The book includes chapters covering multiple emerging topics in this new field. It links multiple popular research fields in Human-Centered Computing, Social Media, Image Classification, Pattern Recognition, Computer Vision, Big Data, and Human-Computer Interaction. Contains an overview of the low-rank and sparse modeling techniques for visual analysis by examining both theoretical analysis and real-world applic
Traffic Accident, System Model and Cluster Analysis in GIS
Directory of Open Access Journals (Sweden)
Veronika Vlčková
2015-07-01
Full Text Available The problem of traffic accidents is one of the most frequented topics both in everyday journalism and among the professional public. This article illustrates a less familiar context of accident analysis, with the help of constructive systems theory and its methods, cluster analysis and geoinformation engineering. A traffic accident is framed in space-time, and it can therefore be studied with the tools of geographic information systems. The application of a system approach enables the formulation of a system model, captured by the tools of geoinformation engineering and multicriteria and cluster analysis.
Bicoherence analysis of model-scale jet noise.
Gee, Kent L; Atchley, Anthony A; Falco, Lauren E; Shepherd, Micah R; Ukeiley, Lawrence S; Jansen, Bernard J; Seiner, John M
2010-11-01
Bicoherence analysis has been used to characterize nonlinear effects in the propagation of noise from a model-scale, Mach-2.0, unheated jet. Nonlinear propagation effects are predominantly limited to regions near the peak directivity angle for this jet source and propagation range. The analysis also examines the practice of identifying nonlinear propagation by comparing spectra measured at two different distances and assuming far-field, linear propagation between them. This spectral comparison method can lead to erroneous conclusions regarding the role of nonlinearity when the observations are made in the geometric near field of an extended, directional radiator, such as a jet. PMID:21110528
Initial results from model independent analysis of the KEK ATF
International Nuclear Information System (INIS)
Model Independent Analysis (MIA) has shown the potential to be a useful tool for diagnostics and optics verification. The Accelerator Test Facility (ATF) prototype damping ring at KEK has a diagnostic system with the ability to collect data allowing the application of MIA for analysis of the injection stability, and the storage ring optics and diagnostics. Understanding of all these issues will be important for improving the operational performance of a damping ring. We report here the results of a first attempt to apply MIA to the ATF
IMS Security Analysis using Multi-attribute Model
Directory of Open Access Journals (Sweden)
Kai Shuang
2011-02-01
Full Text Available IP Multimedia Subsystem (IMS) plays the role of a core platform in the next-generation-network architecture, which advocates an open and inter-operable service infrastructure. However, over the years, not much attention has been paid to the security problems of IMS. The time has now come to address security issues. This paper proposes a multi-attribute stereo model based on the X.805 standard, which provides a comprehensive, top-down, end-to-end perspective for IMS security analysis, classification and assessment. In addition, a detailed threat analysis of IMS is demonstrated and the vulnerabilities utilized by each threat are also presented.
Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis
Fu, Pei-hua; Yin, Hong-bo
In this thesis, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First of all, we present the evaluation index system, which contains basic information, management level, technical strength, transport capacity, informatization level, market competition and customer service. We determined the index weights according to the grades, and evaluated the integrated ability of the logistics enterprises using the fuzzy cluster analysis method. In this thesis, we describe the system evaluation module and the cluster analysis module in detail, and how these two modules were implemented. At last, we give the results of the system.
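A minimal fuzzy c-means sketch of the kind of clustering described (one-dimensional toy scores and two clusters for brevity; not the thesis's actual index system, and all names and values are invented):

```python
def fcm(points, m=2.0, iters=50):
    # fuzzy c-means with two clusters on 1-D scores; the membership is
    # u[i][j] = 1 / sum_k (d_ij / d_ik)^(2/(m-1)), m the fuzzifier
    centers = [min(points), max(points)]
    for _ in range(iters):
        u = []
        for p in points:
            d = [abs(p - cj) + 1e-9 for cj in centers]
            u.append([1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0)) for k in range(2))
                      for j in range(2)])
        # centers are the membership-weighted means of the points
        centers = [sum(u[i][j] ** m * points[i] for i in range(len(points)))
                   / sum(u[i][j] ** m for i in range(len(points)))
                   for j in range(2)]
    return centers, u

# toy 1-D "integrated ability" scores: a weak group and a strong group
scores = [1.0, 1.2, 0.9, 1.1, 5.0, 5.2, 4.8, 5.1]
centers, u = fcm(scores)
print(sorted(round(cc, 1) for cc in centers))
```

Unlike hard clustering, every enterprise gets a degree of membership in each grade, which suits evaluation indices that rarely separate cleanly.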
Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models
Directory of Open Access Journals (Sweden)
Lynn van Coller
1997-06-01
Full Text Available The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.
Nonlinear Pressure Wave Analysis by Concentrated Mass Model
Ishikawa, Satoshi; Kondou, Takahiro; Matsuzaki, Kenichiro
A pressure wave propagating in a tube often changes to a shock wave because of the nonlinear effect of fluid. Analyzing this phenomenon by the finite difference method requires high computational cost. To lessen the computational cost, a concentrated mass model is proposed. This model consists of masses, connecting nonlinear springs, connecting dampers, and base support dampers. The characteristic of a connecting nonlinear spring is derived from the adiabatic change of fluid, and the equivalent mass and equivalent damping coefficient of the base support damper are derived from the equation of motion of fluid in a cylindrical tube. Pressure waves generated in a hydraulic oil tube, a sound tube and a plane-wave tube are analyzed numerically by the proposed model to confirm the validity of the model. All numerical computational results agree very well with the experimental results carried out by Okamura, Saenger and Kamakura. Especially, the numerical analysis reproduces the phenomena that a pressure wave with large amplitude propagating in a sound tube or in a plane tube changes to a shock wave. Therefore, it is concluded that the proposed model is valid for the numerical analysis of nonlinear pressure wave problem.
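The concentrated mass idea can be sketched as a chain of point masses joined by nonlinear springs whose force law mimics adiabatic gas compression (an illustrative toy, not the authors' calibrated model; all constants are invented, and the base-support dampers are omitted for brevity):

```python
def kicked_chain(n=60, gamma=1.4, k=100.0, m=1.0, dt=0.001, steps=4000):
    # point masses joined by nonlinear "gas springs": the force
    # k*((d0/d)**gamma - 1) comes from an adiabatic-compression analogy
    d0 = 1.0
    x = [i * d0 for i in range(n)]
    v = [0.0] * n
    v[0] = 2.0                                # piston-like kick at the left end
    for _ in range(steps):
        f = [0.0] * n
        for i in range(n - 1):
            d = x[i + 1] - x[i]
            s = k * ((d0 / d) ** gamma - 1.0)  # > 0 when the gap is compressed
            f[i] -= s                          # compressed spring pushes i left...
            f[i + 1] += s                      # ...and i+1 right
        for i in range(n):                     # symplectic Euler step
            v[i] += dt * f[i] / m
            x[i] += dt * v[i]
    return x, v

x, v = kicked_chain()
print(x[30] - 30.0)   # positive: the compression pulse has passed mass 30
```

Because the spring stiffens under compression, the crest of a large-amplitude pulse travels faster than its foot, which is the steepening mechanism behind shock formation in the abstract.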
Error analysis of short term wind power prediction models
International Nuclear Information System (INIS)
The integration of wind farms into power networks has become an important problem, because the electricity produced cannot be stored, owing to the high cost of storage, and electricity production must follow market demand. Short- to long-range wind forecasting over different time horizons is therefore becoming an important process for the management of wind farms. Time series modelling of wind speeds is based upon the valid assumption that all the causative factors are implicitly accounted for in the sequence of occurrence of the process itself; hence time series modelling is equivalent to physical modelling. Auto Regressive Moving Average (ARMA) models, which perform a linear mapping between inputs and outputs, and Artificial Neural Networks (ANNs) and Adaptive Neuro-Fuzzy Inference Systems (ANFIS), which perform a non-linear mapping, provide a robust approach to wind power prediction. In this work, these models are developed in order to forecast the power production of a wind farm with three wind turbines, using real load data and comparing different prediction horizons. This comparative analysis considers, for the first time, several forecasting methods and time horizons together, with a detailed performance analysis focused on the normalised mean error and its statistical distribution, in order to identify methods whose error distribution is narrower and which are therefore less likely to produce large prediction errors. (author)
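As a toy illustration of the time-series approach, the sketch below fits an AR(1) model (a minimal special case of ARMA) by least squares to a synthetic wind-power-like series and reports the normalised mean absolute error of one-step forecasts; the series and all parameters are made up, since the farm data are not reproduced here:

```python
import random

random.seed(0)

# Synthetic AR(1) series standing in for wind power data
n, phi = 500, 0.9
y = [5.0]
for _ in range(n - 1):
    y.append(phi * y[-1] + 0.5 + random.gauss(0, 0.3))

train, test = y[:400], y[400:]

# Fit AR(1): y_t = a*y_{t-1} + b, by ordinary least squares
xs, ys = train[:-1], train[1:]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
a = sum((x - mx) * (t - my) for x, t in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b = my - a * mx

# One-step-ahead forecasts over the hold-out set
preds = [a * test[i - 1] + b if i > 0 else a * train[-1] + b
         for i in range(len(test))]

# Normalised mean absolute error over the hold-out set
nmae = sum(abs(p - t) for p, t in zip(preds, test)) / (len(test) * max(test))
print(round(nmae, 4))
```

The same train/forecast/normalised-error loop applies unchanged when the linear map is replaced by an ANN or ANFIS predictor.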
Deformable models with sparsity constraints for cardiac motion analysis.
Yu, Yang; Zhang, Shaoting; Li, Kang; Metaxas, Dimitris; Axel, Leon
2014-08-01
Deformable models integrate bottom-up information derived from image appearance cues and top-down a priori knowledge of the shape. They have been widely used with success in medical image analysis. One limitation of traditional deformable models is that the information extracted from the image data may contain gross errors, which adversely affect the deformation accuracy. To alleviate this issue, we introduce a new family of deformable models inspired by compressed sensing, a technique for accurate signal reconstruction that harnesses sparsity priors. In this paper, we employ sparsity constraints to handle outliers and gross errors, and integrate them seamlessly with deformable models. The proposed formulation is applied to the analysis of cardiac motion using tagged magnetic resonance imaging (tMRI), where the automated tag-line tracking results are very noisy owing to poor image quality. Our new deformable models track the heart motion robustly, and the resulting strains are consistent with those calculated from manual labels. PMID:24721617
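The sparsity-constrained outlier handling can be illustrated with a toy robust line fit rather than the paper's tMRI pipeline: model observations as y = a*x + b + e with e a sparse gross-error vector, and alternate a least-squares fit with soft thresholding of the residuals (the L1 proximal step from compressed sensing); the data and threshold below are assumptions:

```python
# Toy sketch of the sparse-outlier idea (not the paper's method):
# y = a*x + b + e, where e collects sparse gross errors.
xs = [float(i) for i in range(20)]
y = [2.0 * x + 1.0 for x in xs]
y[3] += 25.0    # gross errors, e.g. mistracked tag lines
y[15] -= 30.0

lam = 5.0       # soft-threshold level (assumed)
e = [0.0] * len(y)

def line_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - t) * 0 for x, t in []) if False else \
        sum((x - mx) * (t - my) for x, t in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

for _ in range(20):
    # 1) fit the smooth model to the outlier-corrected data
    a, b = line_fit(xs, [yi - ei for yi, ei in zip(y, e)])
    # 2) soft-threshold the residuals: only gross errors survive, so e is sparse
    e = [max(abs(r) - lam, 0.0) * (1 if r > 0 else -1)
         for r in (yi - (a * x + b) for x, yi in zip(xs, y))]

print(round(a, 2), round(b, 2), sum(1 for ei in e if ei != 0.0))
```

The fit recovers a slope close to 2 while exactly two entries of e remain nonzero, i.e. the gross errors are isolated rather than smeared into the model.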
Error analysis of short term wind power prediction models
Energy Technology Data Exchange (ETDEWEB)
De Giorgi, Maria Grazia; Ficarella, Antonio; Tarantino, Marco [Dipartimento di Ingegneria dell' Innovazione, Universita del Salento, Via per Monteroni, 73100 Lecce (Italy)
2011-04-15
Directory of Open Access Journals (Sweden)
Sérgio Roberto da Silva
2016-06-01
Full Text Available Colombia has been one of the first countries to introduce an electronic billing process on a voluntary basis, moving from a traditional to a digital version. In this context, the article analyzes the electronic billing process implemented in Colombia and its advantages. The research methodology is applied, qualitative, descriptive and documentary: the regulatory framework and the conceptualization of the model are identified; the process of adoption of electronic billing is analyzed; and, finally, the advantages and disadvantages of its implementation are examined. The findings indicate that the model applied in Colombia for issuing, sending and receiving electronic bills is not complex, but it requires a modest, adequate infrastructure and trained personnel to reach all sectors, especially micro and small enterprises, which form the largest business network in the country.
Freddi, Alessandro; Cristofolini, Luca
2015-01-01
This book summarizes the main methods of experimental stress analysis and examines their application to various states of stress of major technical interest, highlighting aspects not always covered in the classic literature. It is explained how experimental stress analysis assists in the verification and completion of analytical and numerical models, the development of phenomenological theories, the measurement and control of system parameters under operating conditions, and identification of causes of failure or malfunction. Cases addressed include measurement of the state of stress in models, measurement of actual loads on structures, verification of stress states in circumstances of complex numerical modeling, assessment of stress-related material damage, and reliability analysis of artifacts (e.g. prostheses) that interact with biological systems. The book will serve graduate students and professionals as a valuable tool for finding solutions when analytical solutions do not exist.
Lee, L. A.; Carslaw, K. S.; Pringle, K. J.
2012-04-01
Global aerosol contributions to radiative forcing (and hence climate change) are persistently subject to large uncertainty in successive Intergovernmental Panel on Climate Change (IPCC) reports (Schimel et al., 1996; Penner et al., 2001; Forster et al., 2007). As such, more complex global aerosol models are being developed to simulate aerosol microphysics in the atmosphere. The uncertainty in global aerosol model estimates is currently estimated by measuring the diversity amongst different models (Textor et al., 2006, 2007; Meehl et al., 2007). The uncertainty at the process level due to the need to parameterise processes in such models is not yet understood, and it is difficult to know whether the added model complexity comes at a cost of high model uncertainty. In this work the model uncertainty and its sources due to the uncertain parameters are quantified using variance-based sensitivity analysis. Because of the complexity of a global aerosol model, we use Gaussian process emulation with a sufficient experimental design to make such a sensitivity analysis possible. The global aerosol model used here is GLOMAP (Mann et al., 2010), and we quantify the sensitivity of numerous model outputs to 27 expertly elicited uncertain model parameters describing emissions and processes such as growth and removal of aerosol. Using the R package DiceKriging (Roustant et al., 2010) along with the package sensitivity (Pujol, 2008), it has been possible to produce monthly global maps of model sensitivity to the uncertain parameters over the year 2008. Global model outputs estimated by the emulator are shown to be consistent with previously published estimates (Spracklen et al., 2010; Mann et al., 2010), but now we have an associated measure of parameter uncertainty and its sources. It can be seen that globally some parameters have no effect on the model predictions, so any further effort in their development may be unnecessary, although a structural error in the model might also be identified.
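The emulator-based variance decomposition the abstract describes can be illustrated in miniature. GLOMAP and its emulator are not available here, so the sketch below substitutes the standard Ishigami test function for the emulator and estimates first-order variance-based (Sobol) sensitivity indices by a brute-force double-loop Monte Carlo; the function, sample sizes and seed are all assumptions for illustration:

```python
import math
import random

random.seed(1)

# Stand-in "cheap emulator": the Ishigami function with uniform inputs
def surrogate(x1, x2, x3):
    return math.sin(x1) + 7.0 * math.sin(x2) ** 2 + 0.1 * x3 ** 4 * math.sin(x1)

U = lambda: random.uniform(-math.pi, math.pi)

# Total output variance from plain Monte Carlo
ys = [surrogate(U(), U(), U()) for _ in range(20000)]
mean_y = sum(ys) / len(ys)
var_y = sum((yv - mean_y) ** 2 for yv in ys) / len(ys)

def first_order_index(i, outer=300, inner=300):
    # S_i = Var_{X_i}( E[Y | X_i] ) / Var(Y), by double-loop Monte Carlo
    cond_means = []
    for _ in range(outer):
        xi = U()
        acc = 0.0
        for _ in range(inner):
            x = [U(), U(), U()]
            x[i] = xi          # fix the i-th input, average over the rest
            acc += surrogate(*x)
        cond_means.append(acc / inner)
    m = sum(cond_means) / outer
    return sum((c - m) ** 2 for c in cond_means) / outer / var_y

S = [first_order_index(i) for i in range(3)]
print([round(s, 2) for s in S])
```

As in the paper's parameter screening, an index near zero (here the third input) flags a parameter whose uncertainty does not drive output variance; DiceKriging plus the sensitivity package does the same job with a trained emulator instead of the analytic stand-in.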
Stochastic Volatility Model and Technical Analysis of Stock Price
Institute of Scientific and Technical Information of China (English)
Wei LIU; Wei An ZHENG
2011-01-01
In the stock market, some popular technical analysis indicators (e.g. Bollinger Bands, RSI, ROC, ...) are widely used by traders. They use the daily (hourly, weekly, ...) stock prices as samples of certain statistics and use the observed relative frequency to show the validity of those well-known indicators. However, those samples are not independent, so the classical sample survey theory does not apply. In earlier research, we discussed the law of large numbers related to those observations when one assumes the Black-Scholes stock price model. In this paper, we extend the above results to the more popular stochastic volatility model.
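Among the indicators mentioned, Bollinger Bands are easy to sketch: the middle band is an n-day simple moving average and the upper/lower bands sit k standard deviations above/below it (n = 20, k = 2 are the conventional defaults; the price series below is made up):

```python
import statistics

# Bollinger Bands over a close-price series (n-day SMA +/- k std devs)
def bollinger(closes, n=20, k=2.0):
    bands = []
    for i in range(n - 1, len(closes)):
        window = closes[i - n + 1 : i + 1]
        mid = sum(window) / n
        sd = statistics.pstdev(window)          # population std dev of window
        bands.append((mid - k * sd, mid, mid + k * sd))
    return bands

# Made-up sample closes: gentle drift plus an oscillation
closes = [100 + 0.1 * i + 3 * ((-1) ** i) * (i % 5) / 5 for i in range(60)]
lower, mid, upper = bollinger(closes)[-1]
print(round(lower, 2), round(mid, 2), round(upper, 2))
```

The paper's point is precisely that consecutive windows here overlap, so the band-touch frequencies traders quote are averages of dependent samples, and their limiting behaviour must be derived under a price model rather than classical survey theory.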
Stability Analysis of Some Nonlinear Anaerobic Digestion Models
Directory of Open Access Journals (Sweden)
Ivan Simeonov
2010-04-01
Full Text Available Abstract: The paper deals with local asymptotic stability analysis of some mass-balance dynamic models (based on one- and two-stage reaction schemes) of anaerobic digestion (AD) in a CSTR. The equilibrium states for models based on a one-stage scheme (with Monod, Contois and Haldane shapes for the specific growth rate) and on a two-stage scheme (with Monod shapes for the specific growth rates of both the acidogenic and methanogenic bacterial populations) have been determined by solving sets of nonlinear algebraic equations using Maple. Their stability has been analyzed systematically, which provides insight and guidance for the design, operation and control of AD bioreactors.
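For the one-stage Monod case, the non-trivial equilibrium and its local stability can be computed by hand rather than with Maple; the sketch below uses illustrative parameter values, not the paper's:

```python
import cmath

# One-stage Monod chemostat model of AD in a CSTR (illustrative parameters):
#   dS/dt = D*(S_in - S) - mu(S)*X/Y,   dX/dt = (mu(S) - D)*X,
#   mu(S) = mu_max*S/(Ks + S)
mu_max, Ks, Y, D, S_in = 0.4, 0.5, 0.1, 0.2, 10.0

# Non-trivial equilibrium: mu(S*) = D
S_eq = Ks * D / (mu_max - D)
X_eq = Y * (S_in - S_eq)
dmu = mu_max * Ks / (Ks + S_eq) ** 2       # mu'(S*)

# Jacobian at the equilibrium (the dX/dX entry vanishes since mu(S*) = D)
j11 = -D - dmu * X_eq / Y
j12 = -D / Y
j21 = dmu * X_eq
j22 = 0.0

tr, det = j11 + j22, j11 * j22 - j12 * j21
disc = cmath.sqrt(tr * tr - 4 * det)
eigs = [(tr + disc) / 2, (tr - disc) / 2]
print(S_eq, X_eq, all(e.real < 0 for e in eigs))
```

For Monod kinetics the determinant is positive and the trace negative whenever the equilibrium exists, so it is locally asymptotically stable; Haldane (substrate-inhibition) kinetics admit a second, unstable equilibrium, which is where the systematic analysis becomes operationally important.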
The FIRO model of family therapy: implications of factor analysis.
Hafner, R J; Ross, M W
1989-11-01
Schutz's FIRO model contains three main elements: inclusion, control, and affection. It is used widely in mental health research and practice, but has received little empirical validation. The present study is based on factor analysis of responses to the FIRO questionnaire from 120 normal couples and 191 couples who were attending a clinic for marital/psychiatric problems. The results confirmed the validity of the FIRO model for women only. The differences between the sexes reflected a considerable degree of sex-role stereotyping, the clinical implications of which are discussed.
Modelling application for cognitive reliability and error analysis method
Directory of Open Access Journals (Sweden)
Fabio De Felice
2013-10-01
Full Text Available The automation of production systems has delegated to machines the execution of highly repetitive and standardized tasks. In the last decade, however, the failure of the automatic factory model has led to partially automated configurations of production systems. In this scenario, the centrality and responsibility of the role entrusted to human operators are heightened, because the role requires problem-solving and decision-making ability. The human operator is thus the core of a cognitive process that leads to decisions, influencing the safety of the whole system as a function of his or her reliability. The aim of this paper is to propose a modelling application for the cognitive reliability and error analysis method.
Modeling and analysis of DFIG in wind energy conversion system
Directory of Open Access Journals (Sweden)
Omer Elfaki Elbashir, Wang Zezhong, Liu Qihui
2014-01-01
Full Text Available This paper deals with the modeling, analysis, and simulation of a doubly-fed induction generator (DFIG) driven by a wind turbine. The grid-connected wind energy conversion system (WECS) is composed of a DFIG and two back-to-back PWM voltage source converters (VSCs) in the rotor circuit. A machine model is derived in an appropriate dq reference frame. Grid-voltage-oriented vector control is used for the grid side converter (GSC) in order to maintain a constant DC bus voltage, while stator-voltage-oriented vector control is adopted in the rotor side converter (RSC) to control the active and reactive powers.
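The dq machine model rests on the Park (abc → dq0) transformation; a sketch in its amplitude-invariant form follows, with a made-up balanced 50 Hz test signal to show that a set aligned with the reference frame maps to a constant d component and zero q component:

```python
import math

# Amplitude-invariant Park (abc -> dq0) transformation
def abc_to_dq0(a, b, c, theta):
    two3 = 2.0 / 3.0
    d = two3 * (a * math.cos(theta)
                + b * math.cos(theta - 2 * math.pi / 3)
                + c * math.cos(theta + 2 * math.pi / 3))
    q = -two3 * (a * math.sin(theta)
                 + b * math.sin(theta - 2 * math.pi / 3)
                 + c * math.sin(theta + 2 * math.pi / 3))
    z = (a + b + c) / 3.0
    return d, q, z

# Balanced three-phase voltages whose phase equals the frame angle
w, t, V = 2 * math.pi * 50, 0.0123, 1.0
theta = w * t
va = V * math.cos(theta)
vb = V * math.cos(theta - 2 * math.pi / 3)
vc = V * math.cos(theta + 2 * math.pi / 3)
print(abc_to_dq0(va, vb, vc, theta))
```

This constancy is what makes grid- or stator-voltage-oriented vector control workable: AC quantities become DC setpoints in the rotating frame, so ordinary PI loops can regulate DC-bus voltage and active/reactive power.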
Modeling and stability analysis of the nonlinear reactive sputtering process
Directory of Open Access Journals (Sweden)
György Katalin
2011-12-01
Full Text Available The model of the reactive sputtering process has been determined from the dynamic equilibrium of the reactive gas inside the chamber and the dynamic equilibrium of the sputtered metal atoms, which form a compound with the reactive gas atoms on the surface of the substrate. The analytically obtained dynamical model is a system of nonlinear differential equations which can result in a hysteresis-type input/output nonlinearity. The reactive sputtering process has been simulated by integrating these differential equations. Linearization has been applied for classical analysis of the sputtering process and for control system design.
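The abstract does not reproduce the sputtering equations, so the sketch below uses a generic bistable ODE, dx/dt = -x^3 + x + u, purely to illustrate how integrating a nonlinear state equation yields a hysteresis-type input/output map: sweeping the input slowly up and then back down traces different branches:

```python
# Toy bistable ODE integrated to steady state for each input value
def settle(x, u, dt=0.01, steps=2000):
    for _ in range(steps):
        x += dt * (-x ** 3 + x + u)     # forward Euler to steady state
    return x

x, up_branch, down_branch = -1.0, [], []
us = [i / 50.0 for i in range(-50, 51)]         # u swept from -1 to 1
for u in us:                                    # upward sweep
    x = settle(x, u)
    up_branch.append(x)
for u in reversed(us):                          # downward sweep
    x = settle(x, u)
    down_branch.append(x)
down_branch.reverse()

# Inside the bistable window the two sweeps disagree -> hysteresis loop
gap = max(abs(a - b) for a, b in zip(up_branch, down_branch))
print(round(gap, 2))
```

The same mechanism, with reactive-gas flow as the input and chamber pressure or compound coverage as the state, is what produces the well-known hysteresis loop of reactive sputtering and motivates linearization-based controller design around a chosen operating branch.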
Hierarchical Modelling of Flood Risk for Engineering Decision Analysis
DEFF Research Database (Denmark)
Custer, Rocco
to changing flood risk. In the presence of flood protection structures, flood development depends on the state of all protection structures in the system. As such, hazard is a function not only of rainfall and river discharge, but also of protection structures’ fragility. A methodology for flood risk analysis … (e.g. dikes), engineering vulnerability models that allow considering the impact of flood proofing measures on residential building vulnerability seem to be lacking. Thus, a flood vulnerability modelling approach for residential buildings is proposed, which allows for detailed building and hazard …
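The interplay of hazard and protection-structure fragility can be sketched with a small Monte Carlo model; every distribution and parameter below is an assumption for illustration, not taken from the thesis:

```python
import math
import random

random.seed(42)

# Lognormal fragility curve: P(dike failure | water load)
def dike_failure_prob(load):
    if load <= 0:
        return 0.0
    median, beta = 4.0, 0.4        # assumed fragility parameters
    z = (math.log(load) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

# Annual flood probability: overtopping (discharge beyond capacity) OR
# structural failure of the dike under the sampled load
n, floods = 100000, 0
for _ in range(n):
    discharge = random.lognormvariate(0.8, 0.6)   # annual peak discharge
    if discharge > 6.0 or random.random() < dike_failure_prob(discharge):
        floods += 1

print(floods / n)
```

Without the fragility term the risk estimate would reduce to the probability of extreme discharge alone; including the dike's failure probability shows how hazard depends on the state of the protection system, which is the point the thesis develops at full scale.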