Development of a gas systems analysis model (GSAM)
Energy Technology Data Exchange (ETDEWEB)
Godec, M.L. [ICF Resources Inc., Fairfax, VA (United States)]
1995-04-01
The objective of developing the Gas Systems Analysis Model (GSAM) is to create a comprehensive, non-proprietary, PC-based model of domestic gas industry activity. The system is capable of assessing the impacts of various changes in the natural gas system within North America. The individual and collective impacts of changes in technology and economic conditions are explicitly modeled in GSAM. All major gas resources are modeled, including conventional, tight, Devonian Shale, coalbed methane, and low-quality gas sources. The modeling system assesses all key components of the gas industry, including available resources, exploration, drilling, completion, production, and processing practices, both now and in the future. The model similarly assesses the distribution, storage, and utilization of natural gas in a dynamic, market-based analytical structure. GSAM is designed to provide METC managers with a tool to project the impacts of future research, development, and demonstration (RD&D) benefits in order to determine priorities in a rapidly changing, market-driven gas industry.
Energy Technology Data Exchange (ETDEWEB)
NONE
1994-07-01
The objective of GSAM development is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the system, including the resource base, exploration and development practices, extraction technology performance and costs, project economics, transportation costs and restrictions, storage, and end-use. The primary focus is the detailed characterization of the resource base at the reservoir and sub-reservoir level. This disaggregation allows direct evaluation of alternative extraction technologies based on discretely estimated, individual well productivity, required investments, and associated operating costs. GSAM's design allows users to evaluate complex interactions of current and alternative future technology and policy initiatives as they directly impact the gas market. Key activities completed during the past year include: conducted a comparative analysis of commercial reservoir databases; licensed and screened NRG Associates Significant Oil and Gas Fields of the US reservoir database; developed and tested reduced form reservoir model production type curves; fully developed database structures for use in GSAM and linkage to other systems; developed a methodology for the exploration module; collected and updated upstream capital and operating cost parameters; completed initial integration of downstream/demand models; presented research results at METC Contractor Review Meeting; conducted other briefings for METC managers, including initiation of the GSAM Environmental Module; and delivered draft topical reports on technology review, model review, and GSAM methodology.
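The reservoir-level disaggregation described above amounts to evaluating each well as a small investment project from its estimated productivity, required investment, and operating costs. The sketch below is a hedged illustration of that calculation, not GSAM's actual code; the function name, the exponential decline assumption, and every number are invented for the example.

```python
def well_npv(q0, decline, price, opex_per_mcf, capex, years, rate):
    """Net present value of a single gas well (illustrative only).

    q0           -- first-year production (Mcf)
    decline      -- exponential decline fraction per year
    price        -- gas price ($/Mcf), held flat for simplicity
    opex_per_mcf -- operating cost ($/Mcf)
    capex        -- up-front drilling and completion investment ($)
    years        -- evaluation horizon (years)
    rate         -- annual discount rate
    """
    npv = -capex
    q = q0
    for t in range(1, years + 1):
        cash = q * (price - opex_per_mcf)   # annual net operating cash flow
        npv += cash / (1.0 + rate) ** t     # discount back to time zero
        q *= (1.0 - decline)                # production declines each year
    return npv

# A lower-cost completion technology raises NPV for the same reservoir:
base = well_npv(q0=100_000, decline=0.15, price=2.50,
                opex_per_mcf=0.80, capex=600_000, years=15, rate=0.10)
improved = well_npv(q0=100_000, decline=0.15, price=2.50,
                    opex_per_mcf=0.80, capex=500_000, years=15, rate=0.10)
```

Run over thousands of reservoirs, comparisons like `improved - base` are the kind of technology-benefit signal a systems model aggregates.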
Development of a natural Gas Systems Analysis Model (GSAM)
International Nuclear Information System (INIS)
Lacking a detailed characterization of the resource base and a comprehensive borehole-to-burnertip evaluation model of the North American natural gas system, past R&D, tax, and regulatory policies have been formulated without a full understanding of their likely direct and indirect impacts on future gas supply and demand. The recent disappearance of the deliverability surplus, pipeline deregulation, and current policy debates about regulatory initiatives in taxation, environmental compliance, and leasing make the need for a comprehensive gas evaluation system critical. Traditional econometric or highly aggregated energy models are increasingly regarded as unable to incorporate the available geologic detail and the explicit technology performance and costing algorithms necessary to evaluate resource-technology-economic interactions in a market context. The objective of this research is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the natural gas system, including resource base, exploration and development, extraction technology performance and costs, transportation and storage, and end use. The primary focus is the detailed characterization of the resource base at the reservoir and sub-reservoir level and the impact of alternative extraction technologies on well productivity and economics. GSAM evaluates the complex interactions of current and alternative future technology and policy initiatives in the context of the evolving gas markets. The full model is scheduled for completion in 1995; a prototype is planned for early 1994. ICF Resources reviewed relevant natural gas upstream, downstream, and market models to identify appropriate analytic capabilities to incorporate into GSAM. We have reviewed extraction technologies to better characterize performance and costs in terms of GSAM parameters.
Development of a natural gas systems analysis model (GSAM). Annual report, July 1996--July 1997
Energy Technology Data Exchange (ETDEWEB)
NONE
1997-12-31
The objective of GSAM development is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the system, including the resource base, exploration and development practices, extraction technology performance and costs, project economics, transportation costs and restrictions, storage, and end-use. The primary focus is the detailed characterization of the resource base at the reservoir and subreservoir level. This disaggregation allows direct evaluation of alternative extraction technologies based on discretely estimated, individual well productivity, required investments, and associated operating costs. GSAM's design allows users to evaluate complex interactions of current and alternative future technology and policy initiatives as they directly impact the gas market. GSAM development has been ongoing for the past five years. Key activities completed during the past year are described.
Development of a natural gas systems analysis model (GSAM). Annual report, July 1996--July 1997
International Nuclear Information System (INIS)
The objective of GSAM development is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the system, including the resource base, exploration and development practices, extraction technology performance and costs, project economics, transportation costs and restrictions, storage, and end-use. The primary focus is the detailed characterization of the resource base at the reservoir and subreservoir level. This disaggregation allows direct evaluation of alternative extraction technologies based on discretely estimated, individual well productivity, required investments, and associated operating costs. GSAM's design allows users to evaluate complex interactions of current and alternative future technology and policy initiatives as they directly impact the gas market. GSAM development has been ongoing for the past five years. Key activities completed during the past year are described
Development of a natural gas systems analysis model (GSAM). Annual report, July 1994--June 1995
Energy Technology Data Exchange (ETDEWEB)
NONE
1995-07-01
North American natural gas markets have changed dramatically over the past decade. A competitive, cost-conscious production, transportation, and distribution system has emerged from the highly regulated transportation wellhead pricing structure of the 1980s. Technology advances have played an important role in the evolution of the gas industry, a role likely to expand substantially as alternative fuel price competition and a maturing natural gas resource base force operators to maximize efficiency. Finally, significant changes continue in regional gas demand patterns, industry practices, and infrastructure needs. As the complexity of the gas system grows, so does the need to evaluate and plan for alternative future resource, technology, and market scenarios. Traditional gas modeling systems focused solely on the econometric aspects of gas marketing. These systems, developed to assess a regulated industry at a high level of aggregation, rely on simple representations of complex and evolving systems, thereby precluding insight into how the industry will change over time. Credible evaluations of specific policy initiatives and research activities require a different approach. Also, the mounting pressure on energy producers from environmental compliance activities requires development of analysis that incorporates relevant geologic, engineering, and project economic details. The objective of policy, research and development (R&D), and market analysis is to integrate fundamental understanding of natural gas resources, technology, and markets to fully describe the potential of the gas resource under alternative future scenarios. This report summarizes work over the past twelve months on DOE Contract DE-AC21-92MC28138, Development of a Natural Gas Systems Analysis Model (GSAM). The products developed under this project directly support the Morgantown Energy Technology Center (METC) in carrying out its natural gas R&D mission.
Energy Technology Data Exchange (ETDEWEB)
Unknown
2001-02-01
This report summarizes work completed on DOE Contract DE-AC21-92MC28138, Development of a Natural Gas Systems Analysis Model (GSAM). The products developed under this project directly support the National Energy Technology Laboratory (NETL) in carrying out its natural gas R&D mission. The objective of this research effort has been to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas market. GSAM has been developed to explicitly evaluate components of the natural gas system, including the entire in-place gas resource base, exploration and development technologies, extraction technology and performance parameters, transportation and storage factors, and end-use demand issues. The system has been fully tested and calibrated and has been used for multiple natural gas metrics analyses at NETL, in which metrics associated with NETL natural gas upstream R&D technologies and strategies have been evaluated under the direction of NETL. NETL's Natural Gas Strategic Plan requires that R&D activities be evaluated for their ability to provide adequate supplies of reasonably priced natural gas. GSAM provides the capability to assess potential and on-going R&D projects using a full fuel cycle, cost-benefit approach. This method yields realistic, market-based assessments of benefits and costs of alternative or related technology advances. GSAM is capable of estimating both technical and commercial successes, quantifying the potential benefits to the market, as well as to other related research. GSAM, therefore, represents an integration of research activities and a method for planning and prioritizing efforts to maximize benefits and minimize costs. Without an analytical tool like GSAM, NETL natural gas upstream R&D activities cannot be appropriately ranked or focused on the most important aspects of natural gas extraction efforts or utilization considerations.
International Nuclear Information System (INIS)
This report summarizes work completed on DOE Contract DE-AC21-92MC28138, Development of a Natural Gas Systems Analysis Model (GSAM). The products developed under this project directly support the National Energy Technology Laboratory (NETL) in carrying out its natural gas R and D mission. The objective of this research effort has been to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas market. GSAM has been developed to explicitly evaluate components of the natural gas system, including the entire in-place gas resource base, exploration and development technologies, extraction technology and performance parameters, transportation and storage factors, and end-use demand issues. The system has been fully tested and calibrated and has been used for multiple natural gas metrics analyses at NETL, in which metrics associated with NETL natural gas upstream R and D technologies and strategies have been evaluated under the direction of NETL. NETL's Natural Gas Strategic Plan requires that R and D activities be evaluated for their ability to provide adequate supplies of reasonably priced natural gas. GSAM provides the capability to assess potential and on-going R and D projects using a full fuel cycle, cost-benefit approach. This method yields realistic, market-based assessments of benefits and costs of alternative or related technology advances. GSAM is capable of estimating both technical and commercial successes, quantifying the potential benefits to the market, as well as to other related research. GSAM, therefore, represents an integration of research activities and a method for planning and prioritizing efforts to maximize benefits and minimize costs. Without an analytical tool like GSAM, NETL natural gas upstream R and D activities cannot be appropriately ranked or focused on the most important aspects of natural gas extraction efforts or utilization considerations
A generalized approach to the modeling of the species-area relationship.
Directory of Open Access Journals (Sweden)
Katiane Silva Conceição
Full Text Available This paper proposes a statistical generalized species-area model (GSAM) to represent various patterns of the species-area relationship (SAR), one of the fundamental patterns in ecology. The approach generalizes many earlier models, such as the power-curve model, which is commonly used to describe the SAR mathematically. The GSAM is applied to a simulated data set of species diversity in areas of different sizes, and a real-world data set of insects of the order Hymenoptera has been modeled. We show that the GSAM identifies the best statistical model and estimates the number of species as a function of area.
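The power-curve model mentioned above, S = cA^z, is classically fitted by ordinary least squares on log-transformed data. The sketch below is a generic illustration of that fit, not the paper's GSAM estimation procedure; the data are noise-free and synthetic, so the fit should recover the generating parameters almost exactly.

```python
import math

def fit_power_sar(areas, species):
    """Fit S = c * A**z by least squares on log(S) vs log(A)."""
    xs = [math.log(a) for a in areas]
    ys = [math.log(s) for s in species]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope of the log-log regression line is the exponent z
    z = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    c = math.exp(my - z * mx)   # intercept back-transformed to c
    return c, z

# Synthetic data generated from S = 5 * A**0.25 (no noise).
areas = [1, 10, 100, 1000, 10000]
species = [5 * a ** 0.25 for a in areas]
c, z = fit_power_sar(areas, species)
```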
International Nuclear Information System (INIS)
In this paper, we present the results of a study of the impact of Canadian carbon stabilization programs on exports of natural gas to the United States. This work was based on a study conducted for the US Environmental Protection Agency. The Gas Systems Analysis Model (GSAM), developed by ICF Consulting for the US Department of Energy, was used to gauge the overall impact of the stabilization programs on the North American natural gas market. GSAM is an intertemporal, spatial price equilibrium (SPE) type model of the North American natural gas system. Salient features of this model include characterization of over 17 000 gas production reservoirs, with explicit reservoir-level geologic and economic information used to build up the supply side of the market. On the demand side, four sectors, residential, commercial, industrial and electric power generation, are characterized in the model. Lastly, both above and below ground storage facilities as well as a comprehensive pipeline network are used with the supply and demand side characterizations to arrive at estimates of market equilibrium prices, quantities, and flows. 35 refs
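The equilibrium step of an SPE-type model can be shown in miniature: for one market in one period, the equilibrium price is where supply meets demand. The sketch below is a toy, not GSAM; the linear curves and every number are invented for illustration.

```python
def equilibrium_price(supply, demand, lo, hi, tol=1e-9):
    """Find the price where supply(p) == demand(p) by bisection.

    Assumes supply is nondecreasing and demand nonincreasing in p,
    with excess demand at lo and excess supply at hi.
    """
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if supply(mid) < demand(mid):
            lo = mid    # excess demand: the price must rise
        else:
            hi = mid    # excess supply: the price must fall
    return 0.5 * (lo + hi)

# Stylized annual curves (quantity vs $/Mcf) -- purely illustrative.
supply = lambda p: 4.0 * p            # more drilling at higher prices
demand = lambda p: 30.0 - 6.0 * p     # consumption falls with price
p_star = equilibrium_price(supply, demand, 0.0, 10.0)
q_star = supply(p_star)
```

A full SPE model solves many such balances simultaneously across regions and periods, linked by transport costs and storage; the single-market crossing point is the basic primitive.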
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model: The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example, the Event Summary Data (ESD) stores calorimeter cells and tracking-system hits, thereby permitting many calibration and alignment tasks, but will be accessible only at particular computing sites, with potentially large latency. In contrast, the Analysis...
Directory of Open Access Journals (Sweden)
Georgiana Cristina NUKINA
2012-07-01
Full Text Available Through the developed risk analysis model, it is decided whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh its implementation cost.
Andrist, Rafael B.; Haworth, Guy McCrossan
2005-01-01
A reference model of Fallible Endgame Play has been implemented and exercised with the chess engine WILHELM. Past experiments have demonstrated the value of the model and the robustness of decisions based on it: the experimental results agree well with the predictions of a Markov-model theory. Here, the reference model is exercised on the well-known endgame KBBKN.
Slavik Stefan; Bednar Richard
2014-01-01
The term business model has been used in practice for only a few years, but companies have created, defined, and innovated their models, often without realizing it, since the start of business. Our paper aims to clarify the theory of the business model, i.e., its definition and all the components that form a business. In the second part, we create an analytical tool, analyze real business models in Slovakia, and define the characteristics of each part of the business model, i.e., customers, distribution, value, resour...
EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS)
The Exposure Analysis Modeling System (EXAMS), first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals--pesti...
Model Checking as Static Analysis
DEFF Research Database (Denmark)
Zhang, Fuyuan
Both model checking and static analysis are prominent approaches to detecting software errors. Model checking is a successful formal method for verifying properties specified in temporal logics with respect to transition systems. Static analysis is also a powerful method for validating program properties which can predict safe approximations to program behaviors. In this thesis, we have developed several static-analysis-based techniques to solve model checking problems, aiming at showing the link between static analysis and model checking. We focus on logical approaches to static analysis in a multi-valued setting, and we therefore obtain a multi-valued analysis for temporal properties specified by CTL formulas. In particular, we have shown that the three-valued CTL model checking problem over Kripke modal transition systems can be exactly encoded in three-valued ALFP. Last, we come back to two...
Haworth, Guy McCrossan; Andrist, Rafael B.
2004-01-01
A reference model of Fallible Endgame Play has been implemented and exercised with the chess engine WILHELM. Various experiments have demonstrated the value of the model and the robustness of decisions based on it. Experimental results have also been compared with the theoretical predictions of a Markov model of the endgame and found to be in close agreement.
Survival analysis models and applications
Liu, Xian
2012-01-01
Survival analysis concerns sequential occurrences of events governed by probabilistic laws. Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading on to some of the most advanced topics in survival analysis. Assumes only a minimal knowledge of SAS whilst enablin...
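A standard entry point to the models the book covers is the Kaplan-Meier estimator of the survival function, which handles right-censored observations. The sketch below is a minimal Python illustration rather than the book's SAS code, and the data at the end are invented.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times  -- observed times
    events -- 1 if the event occurred, 0 if the observation was censored
    Returns a list of (time, S(t)) pairs at each distinct event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = 0   # events observed at time t
        c = 0   # all subjects leaving the risk set at time t
        while i < len(data) and data[i][0] == t:
            d += data[i][1]
            c += 1
            i += 1
        if d:
            s *= 1.0 - d / n_at_risk   # multiply in the survival factor
            curve.append((t, s))
        n_at_risk -= c
    return curve

# Five subjects: events at t=1, 2, 3; censoring at t=2 and t=4.
curve = kaplan_meier(times=[1, 2, 2, 3, 4], events=[1, 1, 0, 1, 0])
```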
International Nuclear Information System (INIS)
To impart hands-on training in physics analysis, CMS experiment initiated the concept of CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre), Fermilab and is based on earlier workshops held at the LPC and CLEO Experiment. As CMS transitioned from construction to the data taking mode, the nature of earlier training also evolved to include more of analysis tools, software tutorials and physics analysis. This effort epitomized as CMSDAS has proven to be a key for the new and young physicists to jump start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of the myriad talents, in the development of physics, service, upgrade, education of those new to CMS and the career development of younger members. An extension of the concept to the dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
Energy Technology Data Exchange (ETDEWEB)
Malik, S. [Nebraska U.; Shipsey, I. [Purdue U.; Cavanaugh, R. [Illinois U., Chicago; Bloom, K. [Nebraska U.; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.; D' Hondt, J. [Vrije U., Brussels; Klima, B. [Fermilab; Narain, M. [Brown U.; Palla, F. [INFN, Pisa; Rolandi, G. [CERN; Schörner-Sadenius, T. [DESY
2014-01-01
To impart hands-on training in physics analysis, CMS experiment initiated the concept of CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre), Fermilab and is based on earlier workshops held at the LPC and CLEO Experiment. As CMS transitioned from construction to the data taking mode, the nature of earlier training also evolved to include more of analysis tools, software tutorials and physics analysis. This effort epitomized as CMSDAS has proven to be a key for the new and young physicists to jump start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of the myriad talents, in the development of physics, service, upgrade, education of those new to CMS and the career development of younger members. An extension of the concept to the dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
ORGANISATIONAL CULTURE ANALYSIS MODEL
Mihaela Simona Maracine
2012-01-01
The studies and researches undertaken have demonstrated the importance of studying organisational culture because of the practical valences it presents and because it contributes to increasing the organisation’s performance. The analysis of the organisational culture’s dimensions allows observing human behaviour within the organisation and highlighting reality, identifying the strengths and also the weaknesses which have an impact on its functionality and development. In this paper, we try to...
Multiscale Signal Analysis and Modeling
Zayed, Ahmed
2013-01-01
Multiscale Signal Analysis and Modeling presents recent advances in multiscale analysis and modeling using wavelets and other systems. This book also presents applications in digital signal processing using sampling theory and techniques from various function spaces, filter design, feature extraction and classification, signal and image representation/transmission, coding, nonparametric statistical signal processing, and statistical learning theory. This book also: Discusses recently developed signal modeling techniques, such as the multiscale method for complex time series modeling, multiscale positive density estimations, Bayesian Shrinkage Strategies, and algorithms for data adaptive statistics Introduces new sampling algorithms for multidimensional signal processing Provides comprehensive coverage of wavelets with presentations on waveform design and modeling, wavelet analysis of ECG signals and wavelet filters Reviews features extraction and classification algorithms for multiscale signal and image proce...
ROCK PROPERTIES MODEL ANALYSIS MODEL REPORT
International Nuclear Information System (INIS)
The purpose of this Analysis and Model Report (AMR) is to document Rock Properties Model (RPM) 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties models are intended principally for use as input to numerical physical-process modeling, such as of ground-water flow and/or radionuclide transport. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. This work was conducted in accordance with the following planning documents: WA-0344, ''3-D Rock Properties Modeling for FY 1998'' (SNL 1997, WA-0358), ''3-D Rock Properties Modeling for FY 1999'' (SNL 1999), and the technical development plan, Rock Properties Model Version 3.1, (CRWMS MandO 1999c). The Interim Change Notice (ICNs), ICN 02 and ICN 03, of this AMR were prepared as part of activities being conducted under the Technical Work Plan, TWP-NBS-GS-000003, ''Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01'' (CRWMS MandO 2000b). The purpose of ICN 03 is to record changes in data input status due to data qualification and verification activities. These work plans describe the scope, objectives, tasks, methodology, and implementing procedures for model construction. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The work scope for this activity consists of the following: (1) Conversion of the input data (laboratory measured porosity data, x-ray diffraction mineralogy, petrophysical calculations of bound water, and petrophysical calculations of porosity) for each borehole into stratigraphic coordinates; (2) Re-sampling and merging of data sets; (3) Development of geostatistical simulations of porosity; (4
ROCK PROPERTIES MODEL ANALYSIS MODEL REPORT
Energy Technology Data Exchange (ETDEWEB)
Clinton Lum
2002-02-04
The purpose of this Analysis and Model Report (AMR) is to document Rock Properties Model (RPM) 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties models are intended principally for use as input to numerical physical-process modeling, such as of ground-water flow and/or radionuclide transport. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. This work was conducted in accordance with the following planning documents: WA-0344, ''3-D Rock Properties Modeling for FY 1998'' (SNL 1997, WA-0358), ''3-D Rock Properties Modeling for FY 1999'' (SNL 1999), and the technical development plan, Rock Properties Model Version 3.1, (CRWMS M&O 1999c). The Interim Change Notice (ICNs), ICN 02 and ICN 03, of this AMR were prepared as part of activities being conducted under the Technical Work Plan, TWP-NBS-GS-000003, ''Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01'' (CRWMS M&O 2000b). The purpose of ICN 03 is to record changes in data input status due to data qualification and verification activities. These work plans describe the scope, objectives, tasks, methodology, and implementing procedures for model construction. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The work scope for this activity consists of the following: (1) Conversion of the input data (laboratory measured porosity data, x-ray diffraction mineralogy, petrophysical calculations of bound water, and petrophysical calculations of porosity) for each borehole into stratigraphic coordinates; (2) Re-sampling and merging of data sets; (3
Local models for spatial analysis
Lloyd, Christopher D
2006-01-01
In both the physical and social sciences, there are now available large spatial data sets with detailed local information. Global models for analyzing these data are not suitable for investigating local variations; consequently, local models are the subject of much recent research. Collecting a variety of models into a single reference, Local Models for Spatial Analysis explains in detail a variety of approaches for analyzing univariate and multivariate spatial data. Different models make use of data in unique ways, and this book offers perspectives on various definitions of what constitutes
Stochastic modeling analysis and simulation
Nelson, Barry L
1995-01-01
A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se
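Among the building blocks the book lists, a discrete-time Markov chain is summarized by its stationary distribution. The sketch below computes it by simple power iteration on the transition matrix; the two-state "weather" chain and its probabilities are an invented example, not taken from the book.

```python
def stationary(P, iters=1000):
    """Stationary distribution of a finite Markov chain by power
    iteration on the row-stochastic transition matrix P."""
    n = len(P)
    pi = [1.0 / n] * n   # start from the uniform distribution
    for _ in range(iters):
        # one step of the chain: pi <- pi @ P
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Two-state chain: state 0 = dry, state 1 = wet (illustrative numbers).
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary(P)
```

For this chain the balance equation 0.1·pi0 = 0.5·pi1 with pi0 + pi1 = 1 gives pi = (5/6, 1/6), which the iteration converges to quickly.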
Hydraulic Modeling: Pipe Network Analysis
Datwyler, Trevor T.
2012-01-01
Water modeling is becoming an increasingly important part of hydraulic engineering. One application of hydraulic modeling is pipe network analysis. Using programmed algorithms to repeatedly solve continuity and energy equations, computer software can greatly reduce the amount of time required to analyze a closed conduit system. Such hydraulic models can become a valuable tool for cities to maintain their water systems and plan for future growth. The Utah Division of Drinking Water regulations...
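A classic algorithm behind pipe-network programs of this kind is the Hardy Cross loop-correction method, which repeatedly adjusts flows to balance the energy equation while continuity holds by construction. The sketch below is an illustration of that idea, not the software the abstract describes: it splits a total flow between two parallel pipes with head loss h = r·Q², with invented resistance values.

```python
def hardy_cross_split(r1, r2, q_total, iters=50):
    """Flow split between two parallel pipes by Hardy Cross
    loop corrections, with head loss h = r * Q**2 in each pipe."""
    q1 = q2 = q_total / 2.0     # initial guess already satisfies continuity
    for _ in range(iters):
        # head-loss imbalance around the loop (positive through pipe 1)
        h = r1 * q1 * abs(q1) - r2 * q2 * abs(q2)
        dh_dq = 2.0 * (r1 * abs(q1) + r2 * abs(q2))
        dq = -h / dh_dq         # Newton-style loop correction
        q1 += dq                # adding dq to one pipe and subtracting it
        q2 -= dq                # from the other preserves continuity
    return q1, q2

# Total flow of 3 units through pipes with resistances 1 and 4:
q1, q2 = hardy_cross_split(1.0, 4.0, 3.0)
```

At convergence the head losses match (r1·q1² = r2·q2²), so with r2 = 4·r1 the flow splits 2:1.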
Command Process Modeling & Risk Analysis
Meshkat, Leila
2011-01-01
Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.
Analysis of aircraft maintenance models
Directory of Open Access Journals (Sweden)
Vlada S. Sokolović
2011-10-01
Full Text Available This paper addresses several organizational models of aircraft maintenance. All the models presented have been in use in air forces, so the advantages and disadvantages of each are known. The paper first describes the current model of aircraft maintenance and its basic characteristics, and then discusses two alternative organizational models together with their advantages and disadvantages, analyzed against the criterion of the operational capabilities of military units. In addition to operational capabilities, the paper presents other criteria which should be taken into account in evaluating and selecting an optimal model of aircraft maintenance. A qualitative analysis of the models alone may not be sufficient to choose the optimum one against the selected set of operational-capability criteria; a detailed economic and technical analysis of each candidate maintenance model is also necessary. A high-quality aircraft maintenance organization requires the involvement of the highest state and army authorities, and clear objectives must be set for all elements of modern air force technical support programs based on the given evaluation criteria.
Model selection for amplitude analysis
International Nuclear Information System (INIS)
Model complexity in amplitude analyses is often a priori under-constrained, since the underlying theory permits a large number of possible amplitudes to contribute to most physical processes. The use of an overly complex model results in reduced predictive power and worse resolution on unknown parameters of interest. Therefore, it is common to reduce the complexity by removing from consideration some subset of the allowed amplitudes. This paper studies a method for limiting model complexity from the data sample itself through regularization during regression in the context of a multivariate (Dalitz-plot) analysis. The regularization technique applied greatly improves the performance. An outline of how to obtain the significance of a resonance in a multivariate amplitude analysis is also provided.
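A minimal sketch of the regularization idea, assuming a generic L1 (lasso) penalty on the amplitude coefficients rather than the paper's specific scheme; the linear "amplitude" model and all numbers below are illustrative assumptions:

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding operator used in L1 proximal steps."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_ista(A, y, lam, n_iter=5000):
    """Minimize 0.5*||A w - y||^2 + lam*||w||_1 via ISTA (proximal gradient)."""
    w = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = A.T @ (A @ w - y)
        w = soft_threshold(w - step * grad, step * lam)
    return w

rng = np.random.default_rng(0)
A = rng.normal(size=(200, 10))      # 10 candidate "amplitudes" (features)
w_true = np.zeros(10)
w_true[[1, 4]] = [2.0, -1.5]        # only two truly contribute
y = A @ w_true + 0.05 * rng.normal(size=200)

# The penalized fit drives the eight spurious coefficients to (near) zero,
# keeping only the two components the data actually support.
w_hat = lasso_ista(A, y, lam=5.0)
```

The same mechanism, applied to amplitude coefficients, removes amplitudes the data cannot distinguish from zero without the analyst choosing the subset by hand.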
Accelerated life models modeling and statistical analysis
Bagdonavicius, Vilijandas
2001-01-01
Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Varia
Geometric simplification of analysis models
Energy Technology Data Exchange (ETDEWEB)
Watterberg, P.A.
1999-12-01
Analysis programs must deal with more and more complex objects as the capability to model fine detail increases, which can make them unacceptably slow. This project attempts to find heuristics for removing features from models automatically in order to reduce the polygon count. The approach is not one of theoretical completeness but rather one of achieving useful results with scattered practical ideas. By removing a few simple features such as screw holes, slots, chamfers, and fillets, large gains can be realized. Results varied, but a reduction in the number of polygons by a factor of 10 is not unusual.
Marine systems analysis and modeling
Fedra, K.
1995-03-01
Oceanography and marine ecology have a considerable history in the use of computers for modeling both physical and ecological processes. With increasing stress on the marine environment due to human activities such as fisheries and numerous forms of pollution, the analysis of marine problems must increasingly and jointly consider physical, ecological and socio-economic aspects in a broader systems framework that transcends more traditional disciplinary boundaries. This often introduces difficult-to-quantify, "soft" elements, such as values and perceptions, into formal analysis. Thus, the problem domain combines a solid foundation in the physical sciences with strong elements of ecological, socio-economic and political considerations. At the same time, the domain is also characterized by a very large volume of some data and an extremely data-poor situation for other variables, as well as a very high degree of uncertainty, partly due to the temporal and spatial heterogeneity of the marine environment. Consequently, marine systems analysis and management require tools that can integrate these diverse aspects into efficient information systems that can support research as well as planning and also policy- and decision-making processes. Supporting scientific research, as well as decision-making processes and the diverse groups and actors involved, requires better access and direct understanding of the information basis as well as easy-to-use but powerful tools for analysis. Advanced information technology provides the tools to design and implement smart software where, in a broad sense, the emphasis is on the man-machine interface. Symbolic and analogous graphical interaction, visual representation of problems, integrated data sources, and built-in domain knowledge can effectively support users of complex and complicated software systems. Integration, interaction, visualization and intelligence are key concepts that are discussed in detail, using an
Analysis by fracture network modelling
International Nuclear Information System (INIS)
This report describes the Fracture Network Modelling and Performance Assessment Support performed by Golder Associates Inc. during the Heisei-11 (1999-2000) fiscal year. The primary objective of the Golder Associates work scope during HY-11 was to provide theoretical and review support to the JNC HY-12 performance assessment effort. In addition, Golder Associates provided technical support to JNC for the Aespoe Project. Major efforts for performance assessment support included analysis of PAWorks pathways and software documentation, verification, and performance assessment visualization. Support for the Aespoe project included 'Task 4' predictive modelling of sorbing tracer transport in the TRUE-1 rock block, and integrated hydrogeological and geochemical modelling of Aespoe island for 'Task 5'. Technical information about Golder Associates HY-11 support to JNC is provided in the appendices to this report. (author)
Ventilation Model and Analysis Report
International Nuclear Information System (INIS)
This model and analysis report develops, validates, and implements a conceptual model for heat transfer in and around a ventilated emplacement drift. This conceptual model includes thermal radiation between the waste package and the drift wall, convection from the waste package and drift wall surfaces into the flowing air, and conduction in the surrounding host rock. These heat transfer processes are coupled and vary both temporally and spatially, so numerical and analytical methods are used to implement the mathematical equations which describe the conceptual model. These numerical and analytical methods predict the transient response of the system, at the drift scale, in terms of spatially varying temperatures and ventilation efficiencies. The ventilation efficiency describes the effectiveness of the ventilation process in removing radionuclide decay heat from the drift environment. An alternative conceptual model is also developed which evaluates the influence of water and water vapor mass transport on the ventilation efficiency. These effects are described using analytical methods which bound the contribution of latent heat to the system, quantify the effects of varying degrees of host rock saturation (and hence host rock thermal conductivity) on the ventilation efficiency, and evaluate the effects of vapor and enhanced vapor diffusion on the host rock thermal conductivity.
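The ventilation efficiency described above can be sketched as a steady-state energy balance on the air stream; the function and all numbers below are illustrative assumptions, not values from the report:

```python
def ventilation_efficiency(m_dot, cp_air, t_in, t_out, q_decay):
    """Fraction of decay heat removed by the ventilation air stream.

    Steady-state energy balance: the heat picked up by the air is
    m_dot * cp * (T_out - T_in) [W]; the efficiency is that divided by
    the total decay heat generated in the drift segment.
    """
    q_air = m_dot * cp_air * (t_out - t_in)
    return q_air / q_decay

# Illustrative numbers (not from the report): 15 m^3/s of air at
# ~1.2 kg/m^3, cp ~1005 J/(kg K), a 10 K temperature rise, and
# 300 kW of decay heat in the segment.
eta = ventilation_efficiency(m_dot=15 * 1.2, cp_air=1005.0,
                             t_in=25.0, t_out=35.0, q_decay=300e3)
```

In the report's full model the efficiency varies in time and along the drift; this sketch only shows the defining ratio at a single operating point.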
ANALYSIS MODEL FOR INVENTORY MANAGEMENT
Directory of Open Access Journals (Sweden)
CAMELIA BURJA
2010-01-01
The inventory represents an essential component of the assets of the enterprise, and economic analysis gives it special importance because accurate inventory management determines the achievement of the activity object and the financial results. Efficient inventory management requires ensuring an optimum inventory level, which guarantees the normal functioning of the activity with minimum inventory expenses and immobilized funds. The paper presents an analysis model for inventory management based on rotation speed and its correlation with the sales volume, illustrated in an adequate study. Highlighting the factors that influence efficient inventory management provides the information needed to justify managerial decisions, leading to a balanced financial position and increased company performance.
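The rotation-speed (turnover) measure at the heart of the model can be sketched as follows; the figures are hypothetical:

```python
def inventory_turnover(cost_of_sales, avg_inventory):
    """Rotation speed: how many times inventory turns over in the period."""
    return cost_of_sales / avg_inventory

def days_in_inventory(turnover, period_days=365):
    """Average number of days an item stays in stock."""
    return period_days / turnover

# Hypothetical annual figures for illustration.
turns = inventory_turnover(cost_of_sales=730_000, avg_inventory=100_000)
days = days_in_inventory(turns)
```

A rising turnover (fewer days in inventory) at constant sales indicates less capital immobilized in stock, which is the correlation the paper's model exploits.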
Distribution system modeling and analysis
Kersting, William H
2002-01-01
For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems; often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using them can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives re
Simplified model for DNB analysis
International Nuclear Information System (INIS)
In a pressurized water reactor (PWR), the operating power is restricted by the possibility of departure from nucleate boiling (DNB) in the hottest channel of the core. The present work proposes a simplified model that analyses the thermal-hydraulic conditions of the coolant in the hottest channel of PWRs with the objective of evaluating DNB in this channel. For this, coupling between the hot channel and typical nominal channels is assumed by imposing a cross flow between these channels such that a uniform axial pressure distribution results along the channels. The model is applied to the Angra-I reactor and the results are compared with those of the Final Safety Analysis Report (FSAR) obtained by Westinghouse through the THINC program, and are considered satisfactory (Author)
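DNB margin in the hot channel is commonly expressed as a DNB ratio (critical heat flux over local heat flux) evaluated at each axial node. The sketch below uses a hypothetical cosine-shaped flux profile and a linearly decreasing critical-heat-flux profile, not the paper's correlations:

```python
import math

def dnbr_profile(q_local, q_critical):
    """DNB ratio at each axial node: critical heat flux over local heat flux."""
    return [qc / ql for ql, qc in zip(q_local, q_critical)]

# Hypothetical cosine-shaped axial heat flux in the hot channel (W/m^2),
# and a critical-heat-flux profile that falls as coolant enthalpy rises.
n = 10
q_local = [1.2e6 * math.sin(math.pi * (i + 0.5) / n) for i in range(n)]
q_crit = [3.5e6 - 1.5e5 * i for i in range(n)]

dnbr = dnbr_profile(q_local, q_crit)
min_dnbr = min(dnbr)  # design limits are typically ~1.3 for W-3-type correlations
```

The safety check is that the minimum DNBR along the channel stays above the correlation's design limit at the licensed power level.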
Probabilistic Model-Based Safety Analysis
Güdemann, Matthias; 10.4204/EPTCS.28.8
2010-01-01
Model-based safety analysis approaches aim at finding critical failure combinations by analysis of models of the whole system (i.e. software, hardware, failure modes and environment). The advantage of these methods compared to traditional approaches is that the analysis of the whole system gives more precise results. Only a few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to analysis of specific failure propagation models, limited types of failure modes, or without system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model-checking now allow for overcoming this problem. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...
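At its core, the quantitative analysis via probabilistic model checking computes reachability probabilities on a stochastic model. A minimal sketch for a hypothetical three-state discrete-time Markov chain (the states and transition probabilities are invented for illustration):

```python
import numpy as np

# Hypothetical 3-state DTMC: 0 = nominal, 1 = degraded, 2 = failed (absorbing).
P = np.array([
    [0.98, 0.015, 0.005],
    [0.00, 0.90,  0.10 ],
    [0.00, 0.00,  1.00 ],
])

def p_fail_within(P, start, fail, n_steps):
    """Probability of having reached the (absorbing) failure state within
    n_steps, computed by iterating the state distribution -- the core
    computation behind bounded-reachability queries in probabilistic
    model checking."""
    dist = np.zeros(P.shape[0])
    dist[start] = 1.0
    for _ in range(n_steps):
        dist = dist @ P
    return dist[fail]

p100 = p_fail_within(P, start=0, fail=2, n_steps=100)
```

Real model checkers answer the same kind of query symbolically or numerically on models derived from the functional design, rather than on a hand-written matrix.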
The gmdl Modeling and Analysis System
Gillblad, Daniel; Holst, Anders; Kreuger, Per; Levin, Björn
2004-01-01
This report describes the gmdl modeling and analysis environment. gmdl was designed to provide powerful data analysis, modeling, and visualization with simple, clear semantics and easy-to-use, well-defined syntactic conventions. It provides an extensive set of functions necessary for general data preparation, analysis, and modeling tasks.
Geologic Framework Model Analysis Model Report
Energy Technology Data Exchange (ETDEWEB)
R. Clayton
2000-12-19
The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure l), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the
Colour model analysis for microscopic image processing
García-Rojo Marcial; González Jesús; Déniz Oscar; González Roberto; Bueno Gloria
2008-01-01
This article presents a comparative study between different colour models (RGB, HSI and CIEL*a*b*) applied to very large microscopic image analysis. Such analysis of different colour models is needed in order to carry out a successful detection, and therefore a classification, of different regions of interest (ROIs) within the image. This, in turn, allows both distinguishing possible ROIs and retrieving their proper colour for further ROI analysis. This analysis is not commonly done ...
FAME, the Flux Analysis and Modeling Environment
Boele Joost; Olivier Brett G; Teusink Bas
2012-01-01
Background The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routine and, in our experience, renders these applications suboptimal for routine use by (systems) biologists. Results The Flux Analysis and Modeling Environment (FAME) is the first web-based modeling tool that combines th...
Bayesian Model Averaging for Propensity Score Analysis
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
Statistical Modelling of Wind Profiles - Data Analysis and Modelling
DEFF Research Database (Denmark)
Jónsson, Tryggvi; Pinson, Pierre
The aim of the analysis presented in this document is to investigate whether statistical models can be used to make very short-term predictions of wind profiles.
Uncertainty analysis and probabilistic modelling
International Nuclear Information System (INIS)
Many factors affect the accuracy and precision of probabilistic assessment. This report discusses sources of uncertainty and ways of addressing them. Techniques for propagating uncertainties in model input parameters through to model prediction are discussed as are various techniques for examining how sensitive and uncertain model predictions are to one or more input parameters. Various statements of confidence which can be made concerning the prediction of a probabilistic assessment are discussed as are several matters of potential regulatory interest. 55 refs
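The propagation technique mentioned above can be sketched as plain Monte Carlo sampling: draw the uncertain inputs from their distributions, run the model for each draw, and summarize the resulting distribution of predictions. The dose model and input distributions below are illustrative assumptions, not taken from the report:

```python
import random
import statistics

def propagate(model, samplers, n=20000, seed=1):
    """Monte Carlo propagation: sample uncertain inputs, run the model,
    and return the resulting distribution of predictions."""
    rng = random.Random(seed)
    return [model(*(s(rng) for s in samplers)) for _ in range(n)]

# Hypothetical assessment model: dose = release * transfer / dilution.
model = lambda release, transfer, dilution: release * transfer / dilution
samplers = [
    lambda r: r.lognormvariate(0.0, 0.5),   # release (arbitrary units)
    lambda r: r.uniform(0.1, 0.3),          # transfer coefficient
    lambda r: r.lognormvariate(2.0, 0.3),   # dilution factor
]

doses = propagate(model, samplers)
median = statistics.median(doses)
p95 = sorted(doses)[int(0.95 * len(doses))]  # 95th-percentile prediction
```

Confidence statements of the kind discussed in the report are then read off the output distribution, e.g. "the predicted dose is below p95 with 95% probability under the assumed input distributions".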
Evolution of Gross Domestic Product - Analysis Models
Constantin ANGHELACHE; Catalin DEATCU; Daniel DUMITRESCU; Adina Mihaela DINU
2013-01-01
This paper describes a use case for macroeconomic models, the objective being the structural analysis of the Gross Domestic Product. The authors first introduce the theoretical foundation of the model, then offer a snapshot of GDP evolution. The econometric models proposed for the analysis are designed with the help of the EViews software, and their performance and reliability are described through the optics of statistical tests.
Introduction to Models in Spatial Analysis
Sanders, Lena
2007-01-01
The book provides a broad overview of the different types of models used in advanced spatial analysis. The models concern spatial organization, location factors and spatial interaction patterns from both static and dynamic perspectives. This introductory chapter proposes a discussion on the different meanings which are given to models in the field of spatial analysis depending on the formalization framework (statistics, GIS, computational approach). Core concepts such as spatial interaction and le...
Sensitivity Analysis in the Model Web
Jones, R.; Cornford, D.; Boukouvalas, A.
2012-04-01
The Model Web, and in particular the Uncertainty-enabled Model Web being developed in the UncertWeb project, aims to allow model developers and model users to deploy and discover models exposed as services on the Web. In particular, model users will be able to compose model and data resources to construct and evaluate complex workflows. When discovering such workflows and models on the Web, users might not have prior experience of the model behaviour in detail. It would be particularly beneficial if users could undertake a sensitivity analysis of the models and workflows they have discovered and constructed, to allow them to assess the sensitivity to their assumptions and parameters. This work presents a Web-based sensitivity analysis tool which provides computationally efficient sensitivity analysis methods for models exposed on the Web. In particular, the tool is tailored to the UncertWeb profiles for both information models (NetCDF and Observations and Measurements) and service specifications (WPS and SOAP/WSDL). The tool employs emulation technology where this is found to be possible, constructing statistical surrogate models for the models or workflows, to allow very fast variance-based sensitivity analysis. Where models are too complex for emulation to be possible, or evaluate too fast for this to be necessary, the original models are used with a carefully designed sampling strategy. A particular benefit of constructing emulators of the models or workflow components is that within the framework these can be communicated and evaluated at any physical location. The Web-based tool and backend API provide several functions to facilitate the process of creating an emulator and performing sensitivity analysis. A user can select a model exposed on the Web and specify the input ranges. Once this process is complete, they are able to perform screening to discover important inputs, train an emulator, and validate the accuracy of the trained emulator. In
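A simplified sketch of the variance-based (first-order Sobol) sensitivity analysis the tool performs, estimated here by binning each input rather than via an emulator; the toy model is an assumption for illustration:

```python
import numpy as np

def first_order_indices(f, n=200_000, d=2, bins=50, seed=0):
    """Estimate first-order variance-based sensitivity indices
    S_i = Var(E[Y | X_i]) / Var(Y) by binning each input and computing
    the variance of the conditional means (a simple stand-in for the
    emulator-accelerated estimators described above)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1.0, 1.0, size=(n, d))
    y = f(X)
    var_y = y.var()
    indices = []
    for i in range(d):
        edges = np.linspace(-1.0, 1.0, bins + 1)
        idx = np.clip(np.digitize(X[:, i], edges) - 1, 0, bins - 1)
        cond_means = np.array([y[idx == b].mean() for b in range(bins)])
        indices.append(cond_means.var() / var_y)
    return indices

# Toy model: y = 4*x1 + x2**2. The variance contributed by x1
# (16 * Var(x1) = 16/3) dwarfs that of x2 (Var(x2^2) = 4/45),
# so S1 should dominate.
f = lambda X: 4.0 * X[:, 0] + X[:, 1] ** 2
s1, s2 = first_order_indices(f)
```

An emulator replaces the expensive `f` with a fast surrogate, so that the many model evaluations this estimator needs become cheap; the index definition is unchanged.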
[Dimensional modeling analysis for outpatient payments].
Guo, Yi-zhong; Guo, Yi-min
2008-09-01
This paper introduces a data warehouse model for outpatient payments, designed according to the requirements of hospital financial management by combining dimensional modeling techniques with an analysis of those requirements. This data warehouse model can not only improve the accuracy of financial management requirements, but also greatly increase the efficiency and quality of hospital management. PMID:19119657
Qualitative Analysis of Somitogenesis Models
Directory of Open Access Journals (Sweden)
Maschke-Dutz E.
2007-12-01
Although the properties of a single somite cell oscillator have recently been intensively investigated, the system-level nature of the segmentation clock remains largely unknown. To explore this question qualitatively, we examine the possibility of transforming a well-known time-delay somite cell oscillator into a dynamical system of differential equations that allows qualitative analysis.
System of systems modeling and analysis.
Energy Technology Data Exchange (ETDEWEB)
Campbell, James E.; Anderson, Dennis James; Longsine, Dennis E. (Intera, Inc., Austin, TX); Shirah, Donald N.
2005-01-01
This report documents the results of an LDRD program entitled 'System of Systems Modeling and Analysis' that was conducted during FY 2003 and FY 2004. Systems that themselves consist of multiple systems (referred to here as System of Systems or SoS) introduce a level of complexity to systems performance analysis and optimization that is not readily addressable by existing capabilities. The objective of the 'System of Systems Modeling and Analysis' project was to develop an integrated modeling and simulation environment that addresses the complex SoS modeling and analysis needs. The approach to meeting this objective involved two key efforts. First, a static analysis approach, called state modeling, has been developed that is useful for analyzing the average performance of systems over defined use conditions. The state modeling capability supports analysis and optimization of multiple systems and multiple performance measures or measures of effectiveness. The second effort involves time simulation which represents every system in the simulation using an encapsulated state model (State Model Object or SMO). The time simulation can analyze any number of systems including cross-platform dependencies and a detailed treatment of the logistics required to support the systems in a defined mission.
Li, Lin; Fang, Wen-Cheng; Wang, Chao-Peng; Gu, Qiang
2014-01-01
SLED (SLAC Energy Doubler) is a crucial component of the C-band microwave acceleration unit of the SXFEL. To study the behavior of SLED, a mathematical model is commonly built and analyzed. In this paper, a new method is proposed to build the model of SLED at SINAP. With this method, the parameters of the two cavities can be analyzed separately. It is also suitable for studying parameter optimization of SLED and analyzing the effect of parameter variations. Simulation results of our method are also pre...
Analysis of radiology business models.
Enzmann, Dieter R; Schomer, Donald F
2013-03-01
As health care moves to value orientation, radiology's traditional business model faces challenges to adapt. The authors describe a strategic value framework that radiology practices can use to best position themselves in their environments. This simplified construct encourages practices to define their dominant value propositions. There are 3 main value propositions that form a conceptual triangle, whose vertices represent the low-cost provider, the product leader, and the customer intimacy models. Each vertex has been a valid market position, but each demands specific capabilities and trade-offs. The underlying concepts help practices select value propositions they can successfully deliver in their competitive environments. PMID:23245438
Hypersonic - Model Analysis as a Service
DEFF Research Database (Denmark)
Acretoaie, Vlad; Störrle, Harald
2014-01-01
Hypersonic is a Cloud-based tool that proposes a new approach to the deployment of model analysis facilities. It is implemented as a RESTful Web service API offering analysis features such as model clone detection. This approach allows the migration of resource-intensive analysis algorithms from monolithic desktop modeling tools to a wide range of mobile and Web-based clients. As a technology demonstrator, a Web application acting as a client for the Hypersonic API has been implemented and made publicly available.
Hierarchical modeling and analysis for spatial data
Banerjee, Sudipto; Gelfand, Alan E
2003-01-01
Among the many uses of hierarchical modeling, its application to the statistical analysis of spatial and spatio-temporal data from areas such as epidemiology and environmental science has proven particularly fruitful. Yet to date, the few books that address the subject have been either too narrowly focused on specific aspects of spatial analysis, or written at a level often inaccessible to those lacking a strong background in mathematical statistics. Hierarchical Modeling and Analysis for Spatial Data is the first accessible, self-contained treatment of hierarchical methods, modeling, and dat
Material modeling and structural analysis with the microplane constitutive model
Brocca, Michele
The microplane model is a versatile and powerful approach to constitutive modeling in which the stress-strain relations are defined in terms of vectors rather than tensors on planes of all possible orientations. Such planes are called the microplanes and are representative of the microstructure of the material. The microplane model with kinematic constraint has been successfully employed in the past in the modeling of concrete, soils, ice, rocks, fiber composites and other quasibrittle materials. The microplane model provides a powerful and efficient numerical and theoretical framework for the development and implementation of constitutive models for any kind of material. The dissertation presents a review of the background from which the microplane model stems, highlighting differences and similarities with other approaches. The basic structure of the microplane model is then presented, together with its extension to finite strain deformation. To show the effectiveness of the microplane model approach, some examples are given demonstrating applications of microplane models in structural analysis with the finite element method. Some new constitutive models are also introduced for materials characterized by very different properties and microstructures, showing that the approach is indeed very versatile and provides a robust basis for the study of a broad range of problems. New models are introduced for metal plasticity, shape memory alloys and cellular materials. The new models are compared quantitatively with the existing models and experimental data. In particular, the newly introduced microplane models for metal plasticity are compared with the classical J2-flow theory for incremental plasticity. An existing microplane model for concrete is employed in finite element analysis of the 'tube-squash' test, in which concrete undergoes very large deviatoric deformation, and of the size effect in compressive failure of concrete columns. The microplane model for shape
Influence analysis for the factor analysis model with ranking data.
Xu, Liang; Poon, Wai-Yin; Lee, Sik-Yum
2008-05-01
Influence analysis is an important component of data analysis, and the local influence approach has been widely applied to many statistical models to identify influential observations and assess minor model perturbations since the pioneering work of Cook (1986). It is natural to adopt this approach to develop influence analysis procedures for factor analysis models with ranking data. However, as this well-known approach is based on the observed-data likelihood, which involves multidimensional integrals, directly applying it to develop influence analysis procedures for factor analysis models with ranking data is difficult. To address this difficulty, a Monte Carlo expectation and maximization (MCEM) algorithm is used to obtain the maximum-likelihood estimate of the model parameters, and measures for influence analysis are then obtained on the basis of the conditional expectation of the complete-data log likelihood at the E-step of the MCEM algorithm. Very little additional computation is needed to compute the influence measures, because it is possible to make use of the by-products of the estimation procedure. Influence measures based on several typical perturbation schemes are discussed in detail, and the proposed method is illustrated with two real examples and an artificial example. PMID:18482479
Representing Uncertainty on Model Analysis Plots
Smith, Trevor I
2016-01-01
Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.
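A simplified sketch of placing a class on a model plot with standard-error bars. The coordinate definitions follow the general idea described above, not necessarily Bao's or Smith's exact estimators, and the class data are hypothetical:

```python
import math

def model_point_with_errors(responses):
    """Place a class on a (simplified) model plot.

    `responses` is a list of per-student pairs (p_correct, p_incorrect):
    the fraction of items a student answered with the correct model and
    with the documented incorrect model. The class point is the mean of
    each coordinate; the error bars are standard errors of those means.
    """
    n = len(responses)
    pc = [r[0] for r in responses]
    pi = [r[1] for r in responses]
    mean = lambda v: sum(v) / n
    sem = lambda v: math.sqrt(sum((x - mean(v)) ** 2 for x in v) / (n - 1) / n)
    return (mean(pc), mean(pi)), (sem(pc), sem(pi))

# Hypothetical class of five students.
cls = [(0.8, 0.1), (0.6, 0.3), (0.7, 0.2), (0.9, 0.1), (0.5, 0.4)]
point, errs = model_point_with_errors(cls)
```

Plotting `point` with `errs` as x/y error bars gives the uncertainty representation the paper argues the original model plot lacked.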
Phrasal Document Analysis for Modeling
Sojitra, Ritesh D.
1998-01-01
Specifications of digital hardware systems are typically written in a natural language. The objective of this research is automatic information extraction from specifications to aid model generation for system level design automation. This is done by automatic extraction of the noun phrases and the verbs from the natural language specification statements. First, the natural language sentences are parsed using a chart parser. Then, a noun phrase and verb extractor scans these charts to obt...
Three-dimensional model analysis and processing
Yu, Faxin; Luo, Hao; Wang, Pinghui
2011-01-01
This book focuses on five hot research directions in 3D model analysis and processing in computer science: compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.
Two sustainable energy system analysis models
DEFF Research Database (Denmark)
Lund, Henrik; Krajacic, Goran; Duic, Neven; da Graca Carvalho, Maria
2005-01-01
This paper presents a comparative study of two energy system analysis models both designed with the purpose of analysing electricity systems with a substantial share of fluctuating renewable energy....
Requirement Analysis: Specify the Environmental Model
2005-01-01
This interactive case study illustrates the specification of the environment model in software requirements analysis using the spec language to refine and formalize the requirements in the initial problem statement.
Analysis model of structure-HDS
Institute of Scientific and Technical Information of China (English)
NONE
2000-01-01
Presents the model established for Structure-HDS (hydraulic damper system) analysis on the basis of the theoretical analysis model of non-compressed fluid in a round pipe, with a uniform velocity used as the basic variable and pressure losses resulting from cross-section changes of the fluid route taken into consideration. This provides the necessary basis for research on the earthquake responses of a structure with a spacious first story equipped with HDS at the first floor.
Applied research in uncertainty modeling and analysis
Ayyub, Bilal
2005-01-01
Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...
Combustion instability modeling and analysis
Energy Technology Data Exchange (ETDEWEB)
Santoro, R.J.; Yang, V.; Santavicca, D.A. [Pennsylvania State Univ., University Park, PA (United States)] [and others]
1995-10-01
It is well known that the two key elements for achieving low emissions and high performance in a gas turbine combustor are to simultaneously establish (1) a lean combustion zone for maintaining low NO{sub x} emissions and (2) rapid mixing for good ignition and flame stability. However, these requirements, when coupled with the short combustor lengths used to limit the residence time for NO formation typical of advanced gas turbine combustors, can lead to problems regarding unburned hydrocarbons (UHC) and carbon monoxide (CO) emissions, as well as the occurrence of combustion instabilities. Clearly, the key to successful gas turbine development is based on understanding the effects of geometry and operating conditions on combustion instability, emissions (including UHC, CO and NO{sub x}) and performance. The concurrent development of suitable analytical and numerical models that are validated with experimental studies is important for achieving this objective. A major benefit of the present research will be to provide for the first time an experimentally verified model of emissions and performance of gas turbine combustors.
Credit Risk Evaluation : Modeling - Analysis - Management
Wehrspohn, Uwe
2002-01-01
An analysis and further development of the building blocks of modern credit risk management: -Definitions of default -Estimation of default probabilities -Exposures -Recovery Rates -Pricing -Concepts of portfolio dependence -Time horizons for risk calculations -Quantification of portfolio risk -Estimation of risk measures -Portfolio analysis and portfolio improvement -Evaluation and comparison of credit risk models -Analytic portfolio loss distributions The thesis contributes to the evaluatio...
Analysis of variance for model output
Jansen, M.J.W.
1999-01-01
A scalar model output Y is assumed to depend deterministically on a set of stochastically independent input vectors of different dimensions. The composition of the variance of Y is considered; variance components of particular relevance for uncertainty analysis are identified. Several analysis of va
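The variance decomposition described above can be sketched numerically. Under the assumption of independent inputs, the first-order variance component Var(E[Y|Xi]) can be estimated from a plain Monte Carlo sample by binning; the toy model and sample sizes below are arbitrary:

```python
import numpy as np

# Y = 4*X1 + X2 with independent uniform inputs: analytic first-order
# variance shares are 16/17 and 1/17. We recover them by conditioning
# on bins of each input.
rng = np.random.default_rng(1)
n = 200_000
x1 = rng.uniform(size=n)
x2 = rng.uniform(size=n)
y = 4 * x1 + x2

def first_order(x, y, bins=50):
    """Estimate Var(E[Y|X]) by averaging Y within bins of X."""
    idx = np.digitize(x, np.linspace(0, 1, bins + 1)[1:-1])
    means = np.array([y[idx == b].mean() for b in range(bins)])
    weights = np.array([(idx == b).mean() for b in range(bins)])
    return np.sum(weights * (means - y.mean())**2)

s1 = first_order(x1, y) / y.var()
s2 = first_order(x2, y) / y.var()
print(round(s1, 2), round(s2, 2))  # close to 16/17 and 1/17
```

The remaining variance, absent here because the toy model is additive, would be attributed to interaction components in a full decomposition.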
Ignalina NPP Safety Analysis: Models and Results
International Nuclear Information System (INIS)
The research directions of the scientific safety analysis group, linked to safety assessment of the Ignalina NPP, are presented: Thermal-hydraulic analysis of accidents and operational transients; Thermal-hydraulic assessment of the Ignalina NPP Accident Localization System and other compartments; Structural analysis of plant components, piping and other parts of the Main Circulation Circuit; Assessment of the RBMK-1500 reactor core; and others. Models and the main work carried out last year are described. (author)
Analysis and evaluation of collaborative modeling processes
Ssebuggwawo, D.
2012-01-01
Analysis and evaluation of collaborative modeling processes is confronted with many challenges. On the one hand, many systems design and re-engineering projects require collaborative modeling approaches that can enhance their productivity. But, such collaborative efforts, which often consist of the
Independent Component Analysis in Multimedia Modeling
DEFF Research Database (Denmark)
Larsen, Jan; Hansen, Lars Kai; Kolenda, Thomas
2003-01-01
Modeling of multimedia and multimodal data becomes increasingly important with the digitalization of the world. The objective of this paper is to demonstrate the potential of independent component analysis and blind sources separation methods for modeling and understanding of multimedia data, which...
FAME, the Flux Analysis and Modeling Environment
Boele, J.; Olivier, B.G.; Teusink, B.
2012-01-01
The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routine.
Perturbation analysis of nonlinear matrix population models
Directory of Open Access Journals (Sweden)
Hal Caswell
2008-03-01
Full Text Available Perturbation analysis examines the response of a model to changes in its parameters. It is commonly applied to population growth rates calculated from linear models, but there has been no general approach to the analysis of nonlinear models. Nonlinearities in demographic models may arise due to density-dependence, frequency-dependence (in 2-sex models, feedback through the environment or the economy, and recruitment subsidy due to immigration, or from the scaling inherent in calculations of proportional population structure. This paper uses matrix calculus to derive the sensitivity and elasticity of equilibria, cycles, ratios (e.g. dependency ratios, age averages and variances, temporal averages and variances, life expectancies, and population growth rates, for both age-classified and stage-classified models. Examples are presented, applying the results to both human and non-human populations.
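For the linear case mentioned above, the classic result is that the sensitivity of the population growth rate lambda to a matrix entry a_ij is v_i·w_j/⟨v,w⟩, with w and v the right and left eigenvectors of the dominant eigenvalue (Caswell). A sketch with an illustrative 2x2 projection matrix:

```python
import numpy as np

# Sensitivity and elasticity of lambda for a linear matrix population
# model. The projection matrix values are invented for illustration.
A = np.array([[0.0, 2.0],   # fertilities
              [0.5, 0.8]])  # survival / transition rates

ev_r, W = np.linalg.eig(A)
k = np.argmax(ev_r.real)
lam = ev_r[k].real
w = W[:, k].real                       # right eigenvector (stable structure)
ev_l, V = np.linalg.eig(A.T)
v = V[:, np.argmax(ev_l.real)].real    # left eigenvector (reproductive values)

S = np.outer(v, w) / (v @ w)           # sensitivities d(lambda)/d(a_ij)
E = S * A / lam                        # elasticities
print(round(E.sum(), 6))  # → 1.0 (elasticities of lambda sum to one)
```

The nonlinear extensions in the paper replace this eigenvector formula with matrix-calculus derivatives of equilibria, cycles, and other outputs.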
A Simulation Analysis of Bivariate Availability Models
Caruso, Elise M.
2000-01-01
Equipment behavior is often discussed in terms of age and use. For example, an automobile is frequently referred to as 3 years old with 30,000 miles. Bivariate failure modeling provides a framework for studying system behavior as a function of two variables. This is meaningful when studying the reliability/availability of systems and equipment. This thesis extends work done in the area of bivariate failure modeling. Four bivariate failure models are selected for analysis. The study in...
Formalising responsibility modelling for automatic analysis
Simpson, Robbie; Storer, Tim
2015-01-01
Modelling the structure of socio-technical systems as a basis for informing software system design is a difficult compromise. Formal methods struggle to capture the scale and complexity of the heterogeneous organisations that use technical systems. Conversely, informal approaches lack the rigour needed to inform the software design and construction process or enable automated analysis. We revisit the concept of responsibility modelling, which models socio-technical systems as a collec...
Simulation modeling and analysis with Arena
Altiok, Tayfur
2007-01-01
Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of a popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings.· Introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...
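Arena itself is a commercial graphical environment, but the event-list mechanics the book teaches can be sketched in a few lines. The following is a bare-bones discrete-event simulation of a single-server queue (an event list ordered by time, arrival and departure events, a running statistic); the rates and horizon are arbitrary:

```python
import heapq
import random

# Minimal discrete-event Monte Carlo simulation of an M/M/1 queue.
random.seed(42)
ARRIVAL_RATE, SERVICE_RATE, HORIZON = 0.8, 1.0, 10_000.0

t, queue, busy, served = 0.0, 0, False, 0
events = [(random.expovariate(ARRIVAL_RATE), "arrival")]
while events:
    t, kind = heapq.heappop(events)
    if t > HORIZON:
        break
    if kind == "arrival":
        # schedule the next arrival, then seize the server or join the queue
        heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), "arrival"))
        if busy:
            queue += 1
        else:
            busy = True
            heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
    else:  # departure: release the server or start the next service
        served += 1
        if queue:
            queue -= 1
            heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
        else:
            busy = False

print(served)  # throughput over the horizon, near ARRIVAL_RATE * HORIZON
```

Production lines, inventory systems, and networks as treated in the book are elaborations of exactly this event-scheduling loop.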
HVDC dynamic modelling for small signal analysis
Energy Technology Data Exchange (ETDEWEB)
Yang, X.; Chen, C. [Shanghai Jiaotong Univ. (China). Dept. of Electrical Engineering
2004-11-01
The conventional quasi-steady model of HVDC is not able to describe the dynamic switching behaviour of HVDC converters. By means of the sampled-data modelling approach, a linear time-invariant (LTI) small-signal dynamic model is developed for the HVDC main circuit in the synchronous rotating d-q reference frame. The linearised model is validated by time-domain simulation, and it can be seen that the model represents the dynamic response of the static switching circuits to perturbations in operating points. The model is valid for analysing oscillations including high-frequency modes such as subsynchronous oscillation (SSO) and high-frequency instability. The model is applied in two cases: (i) SSO analysis, where the results are compared with the quasi-steady approach, which has been validated for normal SSO analysis; (ii) high-frequency eigenvalue analysis for the HVDC benchmark system, in which the results of root locus analysis and simulation show that increased gain of the rectifier DC PI controller may result in high-frequency oscillatory instability. (author)
A Requirements Analysis Model Based on QFD
Institute of Scientific and Technical Information of China (English)
TANG Zhi-wei; Nelson K.H.Tang
2004-01-01
The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports requirements analysis for ERP projects.
Sensitivity analysis of periodic matrix population models.
Caswell, Hal; Shyu, Esther
2012-12-01
Periodic matrix models are frequently used to describe cyclic temporal variation (seasonal or interannual) and to account for the operation of multiple processes (e.g., demography and dispersal) within a single projection interval. In either case, the models take the form of periodic matrix products. The perturbation analysis of periodic models must trace the effects of parameter changes, at each phase of the cycle, on output variables that are calculated over the entire cycle. Here, we apply matrix calculus to obtain the sensitivity and elasticity of scalar-, vector-, or matrix-valued output variables. We apply the method to linear models for periodic environments (including seasonal harvest models), to vec-permutation models in which individuals are classified by multiple criteria, and to nonlinear models including both immediate and delayed density dependence. The results can be used to evaluate management strategies and to study selection gradients in periodic environments. PMID:23316494
Decision variables analysis for structured modeling
Institute of Scientific and Technical Information of China (English)
潘启树; 赫东波; 张洁; 胡运权
2002-01-01
Structured modeling is the most commonly used modeling method, but it is not very adaptive to significant changes in environmental conditions. Therefore, Decision Variables Analysis (DVA), a new modeling method, is proposed to deal with linear programming modeling and changing environments. In variant linear programming, the most complicated relationships are those among decision variables. DVA classifies the decision variables into different levels using different index sets, and divides a model into different elements so that any change can only have its effect on part of the whole model. DVA takes into consideration the complicated relationships among decision variables at different levels, and can therefore successfully solve any modeling problem in dramatically changing environments.
[Integrated model system for environmental policy analysis].
Jiang, Lin
2006-05-01
An integrated model system for environmental policy analysis is built up with a Computable General Equilibrium (CGE) model as a core model, which is linked with an environmental model, air dispersion model, and health effect model (exposure-response functions) in an explicit way, therefore the model system is capable of evaluating the effects of policies on environment, health and economy and their interactions comprehensively. This method is used to analyze the effects of Beijing presumptive (energy) taxes on air quality, health, welfare and economic growth, and the conclusion is that sole presumptive taxes may slow down the economic growth, but the presumptive taxes with green tax reform can promote Beijing sustainable development. PMID:16850855
Model Selection in Data Analysis Competitions
DEFF Research Database (Denmark)
Wind, David Kofoed; Winther, Ole
2014-01-01
The use of data analysis competitions for selecting the most appropriate model for a problem is a recent innovation in the field of predictive machine learning. Two of the most well-known examples of this trend were the Netflix Competition and, more recently, the competitions hosted on the online platform Kaggle. In this paper, we state and try to verify a set of qualitative hypotheses about predictive modelling, both in general and in the scope of data analysis competitions. To verify our hypotheses we look at previous competitions and their outcomes, and use qualitative interviews with top...
Modeling Controller Tasks for Safety Analysis
Brown, Molly; Leveson, Nancy G.
1998-01-01
As control systems become more complex, the use of automated control has increased. At the same time, the role of the human operator has changed from primary system controller to supervisor or monitor. Safe design of the human computer interaction becomes more difficult. In this paper, we present a visual task modeling language that can be used by system designers to model human-computer interactions. The visual models can be translated into SpecTRM-RL, a blackbox specification language for modeling the automated portion of the control system. The SpecTRM-RL suite of analysis tools allow the designer to perform formal and informal safety analyses on the task model in isolation or integrated with the rest of the modeled system.
Analysis of Kelvin Probe Operational Models
Popescu, Eugeniu M.
2011-01-01
We present a study of several models on which Kelvin Probe instruments with flat and spherical tips rely for operation and for the determination of the contact potential difference. Using covariance analysis, we have investigated the precision limits of each model as imposed by the Cramer-Rao bound. Where the situation demanded, we have evaluated the bias introduced by the method in the estimation of the contact potential difference.
Analysis of N Category Privacy Models
Directory of Open Access Journals (Sweden)
Marn-Ling Shing
2012-10-01
Full Text Available File sharing has become popular in social networking, and disclosure of private information without the user's consent occurs easily. Password management is increasingly necessary for maintaining a privacy policy. Monitoring of violations of a privacy policy is needed to support the confidentiality of information security. This paper extends the analysis of the two-category confidentiality model to N categories, and illustrates how to use it to monitor the security state transitions in information security privacy modeling.
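Monitoring state transitions against an N-category confidentiality policy can be sketched with a Bell-LaPadula-style lattice check. The category names and the simplified read/write rules below are illustrative assumptions, not the paper's exact model:

```python
# Four ordered privacy categories, low to high. A subject may read an
# object only if its clearance dominates the object's category ("no read
# up"); it may write only to the same or a higher category (*-property,
# "no write down").
CATEGORIES = ["public", "internal", "confidential", "secret"]
LEVEL = {name: i for i, name in enumerate(CATEGORIES)}

def read_allowed(subject_cat, object_cat):
    return LEVEL[subject_cat] >= LEVEL[object_cat]   # no read up

def write_allowed(subject_cat, object_cat):
    return LEVEL[subject_cat] <= LEVEL[object_cat]   # no write down

def monitor(transitions):
    """Return the transitions that violate the privacy policy."""
    violations = []
    for subj, op, obj in transitions:
        ok = read_allowed(subj, obj) if op == "read" else write_allowed(subj, obj)
        if not ok:
            violations.append((subj, op, obj))
    return violations

log = [("internal", "read", "secret"), ("secret", "read", "public"),
       ("secret", "write", "internal")]
print(monitor(log))  # → [('internal', 'read', 'secret'), ('secret', 'write', 'internal')]
```

Extending from two categories to N changes only the length of the CATEGORIES lattice; the monitoring loop is unchanged.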
Structuring multi-criteria portfolio analysis models
Montibeller, Gilberto; Franco, Alberto; Lord, Ewan; Iglesias, Aline
2008-01-01
Multi-Criteria Portfolio Analysis (MCPA) models have been extensively employed as an effective means to allocate scarce resources for investment in projects or services, considering different organisational areas and balancing costs, benefits and risks. However, structuring this type of model in practice is not a trivial task. How should areas be defined? Where should new projects be included? How should one define the criteria to evaluate performance? As far as the authors are aware, there is...
FAME, the Flux Analysis and Modeling Environment
Directory of Open Access Journals (Sweden)
Boele Joost
2012-01-01
Full Text Available Abstract Background The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routine and, in our experience, renders these applications suboptimal for routine use by (systems) biologists. Results The Flux Analysis and Modeling Environment (FAME) is the first web-based modeling tool that combines the tasks of creating, editing, running, and analyzing/visualizing stoichiometric models into a single program. Analysis results can be automatically superimposed on familiar KEGG-like maps. FAME is written in PHP and uses the Python-based PySCeS-CBM for its linear solving capabilities. It comes with a comprehensive manual and a quick-start tutorial, and can be accessed online at http://f-a-m-e.org/. Conclusions With FAME, we present the community with an open source, user-friendly, web-based "one stop shop" for stoichiometric modeling. We expect the application will be of substantial use to investigators and educators alike.
Review and analysis of biomass gasification models
DEFF Research Database (Denmark)
Puig Arnavat, Maria; Bruno, Joan Carles; Coronas, Alberto
2010-01-01
The use of biomass as a source of energy has been further enhanced in recent years and special attention has been paid to biomass gasification. Due to the increasing interest in biomass gasification, several models have been proposed in order to explain and understand this complex process, and the design, simulation, optimisation and process analysis of gasifiers have been carried out. This paper presents and analyses several gasification models based on thermodynamic equilibrium, kinetics and artificial neural networks. The thermodynamic models are found to be a useful tool for preliminary...
Numerical analysis of the rebellious voter model
Czech Academy of Sciences Publication Activity Database
Swart, Jan M.; Vrbenský, Karel
2010-01-01
Vol. 140, No. 5 (2010), pp. 873-899. ISSN 0022-4715 R&D Projects: GA ČR GA201/09/1931; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords: rebellious voter model * parity conservation * exactly solvable model * coexistence * interface tightness * cancellative systems * Markov chain Monte Carlo Subject RIV: BA - General Mathematics Impact factor: 1.447, year: 2010 http://library.utia.cas.cz/separaty/2010/SI/swart-numerical analysis of the rebellious voter model.pdf
Modeling and analysis of stochastic systems
Kulkarni, Vidyadhar G
2011-01-01
Based on the author's more than 25 years of teaching experience, Modeling and Analysis of Stochastic Systems, Second Edition covers the most important classes of stochastic processes used in the modeling of diverse systems, from supply chains and inventory systems to genetics and biological systems. For each class of stochastic process, the text includes its definition, characterization, applications, transient and limiting behavior, first passage times, and cost/reward models. Along with reorganizing the material, this edition revises and adds new exercises and examples. New to the second edi
Comparative Distributions of Hazard Modeling Analysis
Directory of Open Access Journals (Sweden)
Rana Abdul Wajid
2006-07-01
Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. Simulation techniques have been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria, and we show the flexibility of hazard modeling with distributions that approximate different distributions.
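The comparison the abstract describes rests on the hazard function h(t) = f(t)/(1 - F(t)). A small sketch contrasting two distributions commonly compared in hazard modeling, with arbitrary parameters:

```python
import numpy as np

# Closed-form hazard functions: the exponential has constant hazard,
# while the Weibull hazard increases with t when the shape k > 1.
t = np.linspace(0.1, 5.0, 50)

lam = 1.0                      # exponential rate
h_exp = np.full_like(t, lam)   # h(t) = lambda for all t

k, scale = 2.0, 1.0            # Weibull shape and scale (illustrative)
h_weibull = (k / scale) * (t / scale) ** (k - 1)

print(h_exp[0], h_weibull[0] < h_weibull[-1])  # Weibull hazard increases
```

Plotting such hazard curves side by side, or simulating failure times from each distribution, is the kind of behavioral comparison the paper performs.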
Power system stability modelling, analysis and control
Sallam, Abdelhay A
2015-01-01
This book provides a comprehensive treatment of the subject from both a physical and mathematical perspective and covers a range of topics including modelling, computation of load flow in the transmission grid, stability analysis under both steady-state and disturbed conditions, and appropriate controls to enhance stability.
Texture Analysis by Means of Model Functions
Eschner, Th.
1993-01-01
The conception of texture components is widely used in texture analysis. Mostly it is used to describe the orientation distribution function (ODF) qualitatively, and there are only a few special functions used to provide texture component calculations. This paper attempts to introduce another model function describing common texture components and giving a compromise between universality and computational efficiency.
Modeling uncertainty in geographic information and analysis
Institute of Scientific and Technical Information of China (English)
2008-01-01
Uncertainty modeling and data quality for spatial data and spatial analyses are important topics in geographic information science, together with space and time in geography, as well as spatial analysis. In the past two decades, a lot of effort has been made to research uncertainty modeling for spatial data and analyses. This paper presents our work in this research. In particular, four advances in the research are presented: (a) from determinedness-based to uncertainty-based representation of geographic objects in GIS; (b) from uncertainty modeling for static data to dynamic spatial analyses; (c) from modeling uncertainty for spatial data to models; and (d) from error descriptions to quality control for spatial data.
Computational Models for Analysis of Illicit Activities
DEFF Research Database (Denmark)
Nizamani, Sarwat
devise policies to minimize them. These activities include cybercrimes, terrorist attacks or violent actions in response to certain world issues. Besides such activities, there are several other related activities worth analyzing, for which computational models have been presented in this thesis... These models include a model for analyzing the evolution of terrorist networks; a text classification model for detecting suspicious text and identifying suspected authors of anonymous emails; and a semantic analysis model for news reports, which may help analyze illicit activities in a certain area... with location and temporal information. For the network evolution, the hierarchical agglomerative clustering approach has been applied to terrorist networks as case studies. The networks' evolutions show how individual actors who are initially isolated from each other are converted into small groups, which...
Model authoring system for fail safe analysis
Sikora, Scott E.
1990-01-01
The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which then are searched to generate either a graphical fault tree analysis or failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.
A Dynamic Model for Energy Structure Analysis
Institute of Scientific and Technical Information of China (English)
NONE
2006-01-01
Energy structure is a complicated system concerning economic development, natural resources, technological innovation, ecological balance, social progress and many other elements. It is not easy to explain clearly the developmental mechanism of an energy system and the mutual relations between the energy system and its related environments by the traditional methods. It is necessary to develop a suitable dynamic model, which can reflect the dynamic characteristics and the mutual relations of the energy system and its related environments. In this paper, the historical development of China's energy structure was analyzed. A new quantitative analysis model was developed based on system dynamics principles through analysis of energy resources, and the production and consumption of energy in China and comparison with the world. Finally, this model was used to predict China's future energy structures under different conditions.
Analysis hierarchical model for discrete event systems
Ciortea, E. M.
2015-11-01
This paper presents a hierarchical model based on discrete-event networks for robotic systems. Based on the hierarchical approach, the Petri net is analysed as a network spanning the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for modelling and control of complex robotic systems. Such a system is structured, controlled and analysed in this paper by using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete-event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple to implement on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets. Discrete-event modelling is a pragmatic tool for industrial systems, and Petri nets are used here because the system is discrete-event in nature. To capture timing, the timed Petri model of the transport stream is divided into hierarchical levels, and the sections are analysed successively. The proposed simulation of the robotic system using timed Petri nets offers the opportunity to observe the timing of robotic operations. From the transport and transmission times obtained by on-the-spot measurement, graphics showing the average time for the transport activity are obtained, using the parameter sets of the finished products individually.
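The Petri-net mechanics described above can be sketched in a few lines: a net is a set of places holding tokens plus transitions with input/output arcs, and a transition fires when all of its input places hold enough tokens. The two-transition net below is an invented, trivial robot-cell example, not the paper's model:

```python
# Minimal Petri-net executor over dict-based markings.
def enabled(marking, transition):
    return all(marking[p] >= n for p, n in transition["in"].items())

def fire(marking, transition):
    m = dict(marking)
    for p, n in transition["in"].items():
        m[p] -= n                       # consume input tokens
    for p, n in transition["out"].items():
        m[p] = m.get(p, 0) + n          # produce output tokens
    return m

marking = {"waiting": 2, "processing": 0, "done": 0}
start = {"in": {"waiting": 1}, "out": {"processing": 1}}
finish = {"in": {"processing": 1}, "out": {"done": 1}}

for t in (start, finish, start, finish):
    assert enabled(marking, t)
    marking = fire(marking, t)
print(marking)  # → {'waiting': 0, 'processing': 0, 'done': 2}
```

Timed and hierarchical extensions attach delays to transitions and nest subnets, which is what tools like Visual Object Net++ provide graphically.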
Ferrofluids: Modeling, numerical analysis, and scientific computation
Tomas, Ignacio
This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a
To model bolted parts for tolerance analysis using variational model
Directory of Open Access Journals (Sweden)
Wilma Polini
2015-01-01
Full Text Available Mechanical products are usually made by assembling many parts. Among the different types of links, bolts are widely used to join the components of an assembly. In a bolted joint, a clearance exists between the bolt and the holes of the parts to join. This clearance has to be modeled in order to define the possible movements allowed to the joined parts. The clearance model forms part of the global model that builds the stack-up functions by accumulating the tolerances applied to the assembly components. Then, the stack-up functions are solved to evaluate the influence of the tolerances assigned to the assembly components on the functional requirements of the assembly product. The aim of this work is to model the joining of two parts by a planar contact surface and two bolts inside the model that builds and solves the stack-up functions of the tolerance analysis. It adopts the variational solid model. The proposed model uses the simplifying hypothesis that each surface maintains its nominal shape, i.e. the effects of form errors are neglected. The proposed model has been applied to a case study where the holes have dimensional and positional tolerances in order to demonstrate its effectiveness.
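A stack-up function with a bolt-hole clearance term can be sketched by Monte Carlo: sample each toleranced dimension, let the bolted part float within half the clearance, and accumulate. All dimensions and tolerances below are invented; this is not the paper's variational solid model:

```python
import random

# Monte Carlo tolerance stack-up for two plates joined by one bolt with
# clearance. Tolerances are taken as +/-3 sigma of a normal distribution
# (a common convention, assumed here).
random.seed(7)

def sample_dim(nominal, tol):
    return random.gauss(nominal, tol / 3)

gaps = []
for _ in range(100_000):
    plate_a = sample_dim(50.0, 0.1)
    plate_b = sample_dim(30.0, 0.1)
    hole = sample_dim(8.1, 0.05)
    bolt = sample_dim(8.0, 0.02)
    clearance = max(hole - bolt, 0.0)
    shift = random.uniform(-clearance / 2, clearance / 2)  # bolt floats in hole
    gaps.append(plate_a + plate_b + shift)

mean_gap = sum(gaps) / len(gaps)
print(round(mean_gap, 2))  # → 80.0
```

The spread of `gaps` (not just its mean) is what gets compared against the functional requirement of the assembly.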
Identifiability analysis in conceptual sewer modelling.
Kleidorfer, M; Leonhardt, G; Rauch, W
2012-01-01
For a sufficient calibration of an environmental model not only parameter sensitivity but also parameter identifiability is an important issue. In identifiability analysis it is possible to analyse whether changes in one parameter can be compensated by appropriate changes of the other ones within a given uncertainty range. Parameter identifiability is conditional to the information content of the calibration data and consequently conditional to a certain measurement layout (i.e. types of measurements, number and location of measurement sites, temporal resolution of measurements etc.). Hence the influence of number and location of measurement sites on the number of identifiable parameters can be investigated. In the present study identifiability analysis is applied to a conceptual model of a combined sewer system aiming to predict the combined sewer overflow emissions. Different measurement layouts are tested and it can be shown that only 13 of the most sensitive catchment areas (represented by the model parameter 'effective impervious area') can be identified when overflow measurements of the 20 highest overflows and the runoff to the waste water treatment plant are used for calibration. The main advantage of this method is very low computational costs as the number of required model runs equals the total number of model parameters. Hence, this method is a valuable tool when analysing large models with a long runtime and many parameters. PMID:22864432
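The property exploited above, that one model run per parameter suffices, corresponds to building a finite-difference sensitivity matrix and then checking whether parameter effects on the outputs are near-collinear. The sketch below is a generic illustration of that idea, not the authors' sewer model; the toy model, perturbation size, and thresholds are assumptions.

```python
import numpy as np

def sensitivity_matrix(model, p0, dp=1e-4):
    """One extra model run per parameter: finite-difference output sensitivities."""
    y0 = model(p0)
    S = np.empty((y0.size, p0.size))
    for j in range(p0.size):
        p = p0.copy()
        p[j] += dp
        S[:, j] = (model(p) - y0) / dp
    return S / np.linalg.norm(S, axis=0)  # normalise each column

def collinearity_index(S):
    """Large values mean a change in one parameter can be compensated by the others."""
    sv = np.linalg.svd(S, compute_uv=False)
    return 1.0 / max(sv.min(), 1e-12)

# Toy model: outputs depend on a and on (b + c), so b and c cannot be identified separately.
x = np.linspace(0.0, 1.0, 50)
model = lambda p: np.concatenate([p[0] * x, (p[1] + p[2]) * x])
S = sensitivity_matrix(model, np.array([1.0, 0.5, 0.5]))
idx_bc = collinearity_index(S[:, 1:])  # huge: b and c are unidentifiable together
idx_ab = collinearity_index(S[:, :2])  # small: a and b are jointly identifiable
```

Testing subsets of parameters this way mirrors testing different measurement layouts: dropping output rows from `S` shrinks the information content and lets more parameter pairs become collinear.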
Guidelines for system modeling: fault tree analysis
Energy Technology Data Exchange (ETDEWEB)
Lee, Yoon Hwan; Yang, Joon Eon; Kang, Dae Il; Hwang, Mee Jeong
2004-07-01
This document, the guidelines for system modeling related to Fault Tree Analysis (FTA), is intended to provide the analyst with guidelines for constructing fault trees at the level of capability category II of the ASME PRA standard. In particular, it provides the essential and basic guidelines, and the related content, to be used in support of revising the Ulchin 3 and 4 PSA model for the risk monitor within capability category II of the ASME PRA standard. Normally, the main objective of system analysis is to assess the reliability of the systems modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA is a method for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to system unavailability. This should include contributions due to mechanical failures of the components, Common Cause Failures (CCFs), human errors, and outages for testing and maintenance. This document identifies and describes the definitions, the general procedures of FTA, and the essential and basic guidelines for revising the fault trees. Accordingly, the guidelines will be capable of guiding the FTA to the level of capability category II of the ASME PRA standard.
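The deductive AND/OR logic described above quantifies directly once basic-event probabilities are assigned. A minimal sketch, assuming independent basic events (real PRA models must also handle CCFs and repeated events via minimal cut sets); the tree and the probabilities are hypothetical:

```python
def gate_and(probs):
    """AND gate: the output fails only if all inputs fail (independent events)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def gate_or(probs):
    """OR gate: the output fails if any input fails (independent events)."""
    none_fail = 1.0
    for p in probs:
        none_fail *= (1.0 - p)
    return 1.0 - none_fail

# Hypothetical top event: the pump fails, OR both redundant valves fail.
p_top = gate_or([1.0e-3, gate_and([2.0e-2, 2.0e-2])])
```

Nesting `gate_and`/`gate_or` calls reproduces the failure logic of the tree; the top-level call returns the system unavailability.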
Analysis of mathematical models of radioisotope gauges
International Nuclear Information System (INIS)
Radioisotope gauges as industrial sensors are briefly reviewed. Regression models of instruments based on various principles developed in the Institute of Nuclear Research and the Institute of Nuclear Chemistry and Technology were analysed and their mathematical models assessed. It was found that for one-dimensional models the lowest value of the standard error of estimate was achieved when the calibration procedure was modelled by a logarithmic function. Mathematical expressions for the variance and mean value of the intrinsic error were derived for linear and non-linear one-dimensional as well as multi-dimensional models of radioisotope gauges. A conclusion was drawn that the optimal model of the calibration procedure determined by the regression analysis method does not always correspond to the minimum value of the intrinsic error variance. The influence of truncating the probability distribution function of the measured quantity and its error at the lower and upper limits of the measurement range on the variance and mean value of the intrinsic error was evaluated. A feasibility study on applying some aspects of Shannon's information theory to the evaluation of mathematical models of radioisotope gauges was accomplished. Its usefulness for the complex evaluation of multidimensional models was confirmed. 105 refs. (author)
Sensitivity analysis of a modified energy model
International Nuclear Information System (INIS)
Sensitivity analysis is carried out to validate model formulation. A modified model has been developed to predict the future energy requirements of coal, oil and electricity, considering price, income, technological and environmental factors. The impact and sensitivity of the independent variables on the dependent variable are analysed. The error distribution pattern in the modified model, as compared to a conventional time series model, indicated the absence of clusters. The residual plot of the modified model showed no distinct pattern of variation. The percentage variation of error in the conventional time series model for coal and oil ranges from -20% to +20%, while for electricity it ranges from -80% to +20%. In the modified model, however, the percentage variation in error is greatly reduced: for coal it ranges from -0.25% to +0.15%, for oil from -0.6% to +0.6%, and for electricity from -10% to +10%. The upper and lower limit consumption levels at 95% confidence are determined. The consumption at varying percentage changes in price and population is analysed. The gap between the modified model predictions at varying percentage changes in price and population over the years 1990 to 2001 is found to be increasing. This is because of the increasing rate of energy consumption over the years, and also because the confidence level decreases as the projection is made farther into the future. (author)
Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial
This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...
Social phenomena from data analysis to models
Perra, Nicola
2015-01-01
This book focuses on the new possibilities and approaches to social modeling currently being made possible by an unprecedented variety of datasets generated by our interactions with modern technologies. This area has witnessed a veritable explosion of activity over the last few years, yielding many interesting and useful results. Our aim is to provide an overview of the state of the art in this area of research, merging an extremely heterogeneous array of datasets and models. Social Phenomena: From Data Analysis to Models is divided into two parts. Part I deals with modeling social behavior under normal conditions: How we live, travel, collaborate and interact with each other in our daily lives. Part II deals with societal behavior under exceptional conditions: Protests, armed insurgencies, terrorist attacks, and reactions to infectious diseases. This book offers an overview of one of the most fertile emerging fields bringing together practitioners from scientific communities as diverse as social sciences, p...
Modeling and analysis of calcium bromide hydrolysis
Energy Technology Data Exchange (ETDEWEB)
Lottes, Steven A.; Lyczkowski, Robert W.; Panchal, Chandrakant B.; Doctor, Richard D. [Energy Systems Division, Argonne National Laboratory, 9700 S. Cass Avenue, Argonne, IL 60439 (United States)
2009-05-15
The main focus of this paper is the modeling, simulation, and analysis of the calcium bromide hydrolysis reactor stage in the calcium-bromine thermochemical water-splitting cycle for nuclear hydrogen production. One reactor concept is to use a spray of calcium bromide into steam, in which the heat of fusion supplies the heat of reaction. Droplet models were built up in a series of steps incorporating various physical phenomena, including droplet flow, heat transfer, phase change, and reaction, separately. Given the large heat reservoir contained in a pool of molten calcium bromide that allows bubbles to rise easily, using a bubble column reactor for the hydrolysis appears to be a feasible and promising alternative to the spray reactor concept. The two limiting cases of bubble geometry, spherical and spherical-cap, are considered in the modeling. Results for both droplet and bubble modeling with COMSOL MULTIPHYSICS trademark are presented, with recommendations for the path forward. (author)
SRMAFTE facility checkout model flow field analysis
Dill, Richard A.; Whitesides, Harold R.
1992-07-01
The Solid Rocket Motor Air Flow Equipment (SRMAFTE) facility was constructed for the purpose of evaluating the internal propellant, insulation, and nozzle configurations of solid propellant rocket motor designs. This makes the characterization of the facility's internal flow field very important in assuring that no facility-induced flow field features exist which would corrupt the model-related measurements. In order to verify the design and operation of the facility, a three-dimensional computational flow field analysis was performed on the facility checkout model setup. The checkout model measurement data and the one-dimensional and three-dimensional estimates were compared, and the design and proper operation of the facility were verified, including the metering nozzles, adapter chamber transition, model nozzle, and diffuser. The one-dimensional and three-dimensional flow field estimates are compared along with the available measurement data.
MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS
Directory of Open Access Journals (Sweden)
Anass BAYAGA
2010-07-01
Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to obtain a better risk reward it was more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student and infrastructure based) and the business impact. Lastly, although business cycles vary considerably depending on the industry and the institution, the study revealed that most impacts in the HEI (university) fell within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.
3D face modeling, analysis and recognition
Daoudi, Mohamed; Veltkamp, Remco
2013-01-01
3D Face Modeling, Analysis and Recognition presents methodologies for analyzing shapes of facial surfaces, develops computational tools for analyzing 3D face data, and illustrates them using state-of-the-art applications. The methodologies chosen are based on efficient representations, metrics, comparisons, and classifications of features that are especially relevant in the context of 3D measurements of human faces. These frameworks have a long-term utility in face analysis, taking into account the anticipated improvements in data collection, data storage, processing speeds, and application s
Advances in statistical models for data analysis
Minerva, Tommaso; Vichi, Maurizio
2015-01-01
This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.
Extrudate Expansion Modelling through Dimensional Analysis Method
DEFF Research Database (Denmark)
A new model framework is proposed to correlate extrudate expansion and extrusion operation parameters for a food extrusion cooking process through the dimensional analysis principle, i.e. the Buckingham pi theorem. Three dimensionless groups, i.e. energy, water content and temperature, are suggested to...... describe the extrudate expansion. From the three dimensionless groups, an equation with three experimentally determined parameters is derived to express the extrudate expansion. The model is evaluated with whole wheat flour and aquatic feed extrusion experimental data. The average deviations of the...
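The Buckingham pi mechanics behind such a framework can be checked with a dimension matrix: a variable-exponent vector forms a valid dimensionless group when it lies in the matrix's null space. The variable set below (density, velocity, diameter, viscosity, giving the Reynolds number) is a generic illustration, not the paper's extrusion variables:

```python
import numpy as np

# Rows are exponents of the base dimensions M, L, T;
# columns are the variables rho [M L^-3], v [L T^-1], D [L], mu [M L^-1 T^-1].
dim = np.array([[ 1,  0, 0,  1],
                [-3,  1, 1, -1],
                [ 0, -1, 0, -1]])

# Candidate group rho^1 * v^1 * D^1 * mu^-1: dimensionless iff dim @ exps == 0.
exps = np.array([1, 1, 1, -1])
is_dimensionless = bool(np.all(dim @ exps == 0))

# Buckingham pi: number of independent dimensionless groups = n_vars - rank(dim).
n_groups = dim.shape[1] - np.linalg.matrix_rank(dim)
```

For the extrusion case, the same check applies with the process variables (specific mechanical energy, moisture, temperature, etc.) in the columns.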
Computer modelling for LOCA analysis in PHWRs
International Nuclear Information System (INIS)
A computer code, THYNAC, developed for the analysis of thermal-hydraulic transient phenomena during a LOCA in a PHWR-type reactor and its primary coolant system is described. The code predicts the coolant voiding rate in the core, the coolant discharge rate from the break, the primary system depressurization history, and the temperature history of both fuel and fuel clad. The reactor system is modelled as a set of connected fluid segments which represent piping, feeders, coolant channels, etc. The finite-difference method is used in the code. The modelling of various specific phenomena in the code, e.g. two-phase pressure drop, slip flow, and pumps, is described. (M.G.B.)
Handwriting Analysis with Online Fuzzy Models
Bouillon, Manuel; Anquetil, Eric
2015-01-01
This paper presents early work, done in the context of the IntuiScript project, on handwriting quality analysis. The IntuiScript project aims at developing a digital workbook to help with teaching children how to handwrite. To do so, we must be able to analyse their handwriting, to evaluate whether the letters are correctly written, and to detail which aspects of the child's symbols (letters, numbers, and geometric forms) do not correspond to the teacher's models. We use an online fuzzy model to e...
Model correction factor method for system analysis
DEFF Research Database (Denmark)
Ditlevsen, Ove Dalager; Johannesen, Johannes M.
2000-01-01
The Model Correction Factor Method (MCFM) is an intelligent response surface method based on simplified modeling. MCFM is aimed at reliability analysis in the case of a limit state defined by an elaborate model. Herein it is demonstrated that the method is applicable for elaborate limit state surfaces on which...... clearly defined failure modes, the MCFM can be started from each idealized single-mode limit state in turn to identify a locally most central point on the elaborate limit state surface. Typically this procedure leads to a fewer number of locally most central failure points on the elaborate limit state...
Formal Modeling and Analysis of Timed Systems
DEFF Research Database (Denmark)
Larsen, Kim Guldstrand; Niebert, Peter
This book constitutes the thoroughly refereed post-proceedings of the First International Workshop on Formal Modeling and Analysis of Timed Systems, FORMATS 2003, held in Marseille, France in September 2003. The 19 revised full papers presented together with an invited paper and the abstracts of...... two invited talks were carefully selected from 36 submissions during two rounds of reviewing and improvement. All current aspects of formal methods for modeling and analyzing timed systems are addressed; among the timed systems dealt with are timed automata, timed Petri nets, max-plus algebras, real...
A factor analysis model for functional genomics
Shioda Romy; Kustra Rafal; Zhu Mu
2006-01-01
Abstract Background Expression array data are used to predict biological functions of uncharacterized genes by comparing their expression profiles to those of characterized genes. While biologically plausible, this is both statistically and computationally challenging. Typical approaches are computationally expensive and ignore correlations among expression profiles and functional categories. Results We propose a factor analysis model (FAM) for functional genomics and give a two-step algorith...
Bayesian Analysis of Multivariate Probit Models
Siddhartha Chib; Edward Greenberg
1996-01-01
This paper provides a unified simulation-based Bayesian and non-Bayesian analysis of correlated binary data using the multivariate probit model. The posterior distribution is simulated by Markov chain Monte Carlo methods, and maximum likelihood estimates are obtained by a Markov chain Monte Carlo version of the E-M algorithm. Computation of Bayes factors from the simulation output is also considered. The methods are applied to a bivariate data set, to a 534-subject, four-year longitudinal dat...
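The data-augmentation idea behind this MCMC scheme is easiest to see in the univariate probit case (Albert-Chib style): sample latent normals truncated by the observed binary outcomes, then draw the coefficients from a normal conditional. The sketch below assumes a flat prior and uses simple rejection sampling for the truncation; the multivariate model adds a correlation-matrix update on top of this.

```python
import numpy as np

def sample_truncated(mu, y, rng):
    """z ~ N(mu, 1) truncated to z > 0 if y == 1, else z <= 0 (simple rejection)."""
    while True:
        z = rng.normal(mu, 1.0)
        if (z > 0) == (y == 1):
            return z

def probit_gibbs(X, y, n_iter=600, rng=None):
    """Gibbs sampler for the univariate probit via data augmentation, flat prior."""
    if rng is None:
        rng = np.random.default_rng(0)
    n, k = X.shape
    V = np.linalg.inv(X.T @ X)          # posterior covariance of beta given z
    L = np.linalg.cholesky(V)
    beta, draws = np.zeros(k), []
    for it in range(n_iter):
        z = np.array([sample_truncated(X[i] @ beta, y[i], rng) for i in range(n)])
        beta = V @ X.T @ z + L @ rng.normal(size=k)
        if it >= n_iter // 2:           # discard burn-in
            draws.append(beta)
    return np.mean(draws, axis=0)

# Synthetic check: recover a known coefficient vector.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = (X @ np.array([0.3, 1.0]) + rng.normal(size=200) > 0).astype(int)
est = probit_gibbs(X, y, rng=rng)
```

Every conditional here is a standard distribution, which is what makes the E-M and Bayes-factor computations mentioned in the abstract tractable from the same simulation output.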
ANALYSIS MODEL FOR RETURN ON CAPITAL EMPLOYED
BURJA CAMELIA
2013-01-01
At the microeconomic level, appraising the profitability of capital is a very complex task which is of interest to stakeholders. The main purpose of this study is to extend the traditional analysis model for the profitability of capital, based on the ratio "Return on capital employed". In line with this, the objectives of the work are to identify the factors that influence the profitability of the capital employed by a company and to measure their contribution in the...
Economic Modeling and Analysis of Educational Vouchers
Dennis Epple; Richard Romano
2012-01-01
The analysis of educational vouchers has evolved from market-based analogies to models that incorporate distinctive features of the educational environment. These distinctive features include peer effects, scope for private school pricing and admissions based on student characteristics, the linkage of household residential and school choices in multidistrict settings, the potential for rent seeking in public and private schools, the role of school reputations, incentives for student effort, a...
Micromechatronics modeling, analysis, and design with Matlab
Giurgiutiu, Victor
2009-01-01
Focusing on recent developments in engineering science, enabling hardware, advanced technologies, and software, Micromechatronics: Modeling, Analysis, and Design with MATLAB®, Second Edition provides clear, comprehensive coverage of mechatronic and electromechanical systems. It applies cornerstone fundamentals to the design of electromechanical systems, covers emerging software and hardware, introduces the rigorous theory, examines the design of high-performance systems, and helps develop problem-solving skills. Along with more streamlined material, this edition adds many new sections to exist
Scripted Building Energy Modeling and Analysis: Preprint
Energy Technology Data Exchange (ETDEWEB)
Hale, E.; Macumber, D.; Benne, K.; Goldwasser, D.
2012-08-01
Building energy modeling and analysis is currently a time-intensive, error-prone, and nonreproducible process. This paper describes the scripting platform of the OpenStudio tool suite (http://openstudio.nrel.gov) and demonstrates its use in several contexts. Two classes of scripts are described and demonstrated: measures and free-form scripts. Measures are small, single-purpose scripts that conform to a predefined interface. Because measures are fairly simple, they can be written or modified by inexperienced programmers.
An analysis of penalized interaction models
Zhao, Junlong; Leng, Chenlei
2016-01-01
An important consideration for variable selection in interaction models is to design an appropriate penalty that respects hierarchy of the importance of the variables. A common theme is to include an interaction term only after the corresponding main effects are present. In this paper, we study several recently proposed approaches and present a unified analysis on the convergence rate for a class of estimators, when the design satisfies the restricted eigenvalue condition. In particular, we s...
Automating Risk Analysis of Software Design Models
Maxime Frydman; Guifré Ruiz; Elisa Heymann; Eduardo César; Barton P. Miller
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security e...
Energy Systems Modelling Research and Analysis
DEFF Research Database (Denmark)
Møller Andersen, Frits; Alberg Østergaard, Poul
2015-01-01
This editorial introduces the seventh volume of the International Journal of Sustainable Energy Planning and Management. The volume presents part of the outcome of the project Energy Systems Modelling Research and Analysis (ENSYMORA) funded by the Danish Innovation Fund. The project carried out...... by 11 university and industry partners has improved the basis for decision-making within energy planning and energy scenario making by providing new and improved tools and methods for energy systems analyses....
Multivariate Probabilistic Analysis of an Hydrological Model
Franceschini, Samuela; Marani, Marco
2010-05-01
Model predictions based on rainfall measurements and hydrological model results are often limited by the systematic error of measuring instruments, by the intrinsic variability of the natural processes, and by the uncertainty of the mathematical representation. We propose a means to identify such sources of uncertainty and to quantify their effects based on point-estimate approaches, as a valid alternative to cumbersome Monte Carlo methods. We present uncertainty analyses of the hydrologic response to selected meteorological events in the mountain streamflow-generating portion of the Brenta basin at Bassano del Grappa, Italy. The Brenta river catchment has a relatively uniform morphology and quite a heterogeneous rainfall pattern. In the present work, we evaluate two sources of uncertainty: data uncertainty (the uncertainty due to data handling and analysis) and model uncertainty (the uncertainty related to the formulation of the model). We thus evaluate the effects of the measurement error of tipping-bucket rain gauges, the uncertainty in estimating spatially distributed rainfall through block kriging, and the uncertainty associated with estimated model parameters. To this end, we coupled a deterministic model based on the geomorphological theory of the hydrologic response with probabilistic methods. In particular, we compare the results of Monte Carlo Simulations (MCS) to the results obtained, in the same conditions, using Li's Point Estimate Method (LiM). The LiM is a probabilistic technique that approximates the continuous probability distribution function of the considered stochastic variables by means of discrete points and associated weights. This allows results to be reproduced satisfactorily with only a few evaluations of the model function. The comparison between the LiM and MCS results highlights the pros and cons of using an approximating method. LiM is less computationally demanding than MCS, but has limited applicability especially when the model
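The few-evaluations property of point-estimate methods can be illustrated with a simple symmetric 2n+1-point scheme (a Rosenblueth-style construction; Li's method uses different, distribution-matched points and weights). The test function and moments below are assumptions for the demonstration, with uncorrelated Gaussian inputs:

```python
import numpy as np

def point_estimate(f, means, stds):
    """2n+1-point estimate of mean and variance (symmetric, uncorrelated inputs)."""
    means = np.asarray(means, float)
    f0 = f(means)
    mean, var = f0, 0.0
    for i in range(means.size):
        xp, xm = means.copy(), means.copy()
        xp[i] += stds[i]
        xm[i] -= stds[i]
        fp, fm = f(xp), f(xm)
        mean += 0.5 * (fp + fm) - f0     # per-variable curvature correction
        var += (0.5 * (fp - fm)) ** 2    # per-variable spread contribution
    return mean, var

f = lambda x: x[0] ** 2 + 2.0 * x[1]     # mildly nonlinear test function
m_pe, v_pe = point_estimate(f, [1.0, 2.0], [0.1, 0.2])   # 5 model runs

# Monte Carlo reference: many orders of magnitude more model runs.
rng = np.random.default_rng(0)
xs = rng.normal([1.0, 2.0], [0.1, 0.2], size=(200_000, 2))
ys = xs[:, 0] ** 2 + 2.0 * xs[:, 1]
m_mc, v_mc = ys.mean(), ys.var()
```

Five evaluations reproduce the Monte Carlo mean and variance to within a fraction of a percent here, which is the trade-off the abstract describes: cheap and accurate for smooth responses, but with limited applicability for strongly nonlinear or discontinuous models.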
Modeling late entry bias in survival analysis.
Matsuura, Masaaki; Eguchi, Shinto
2005-06-01
In a failure time analysis, we sometimes observe additional study subjects who enter during the study period. These late entries are treated as left-truncated data in the statistical literature. However, with real data, there is a substantial possibility that the delayed entries may have extremely different hazards compared to the other standard subjects. We focus on a situation in which such entry bias might arise in the analysis of survival data. The purpose of the present article is to develop an appropriate methodology for making inference about data including late entries. We construct a model that includes parameters for the effect of delayed entry bias having no specification for the distribution of entry time. We also discuss likelihood inference based on this model and derive the asymptotic behavior of estimates. A simulation study is conducted for a finite sample size in order to compare the analysis results using our method with those using the standard method, where independence between entry time and failure time is assumed. We apply this method to mortality analysis among atomic bomb survivors defined in a geographical study region. PMID:16011705
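In the standard (bias-free) treatment that this paper generalizes, left truncation enters an estimator only through the risk set: a subject counts as at risk at time t only if entry < t <= observed time. A minimal Kaplan-Meier sketch of that baseline, with hypothetical data (the paper's likelihood-based model additionally parameterizes the entry-bias effect):

```python
import numpy as np

def km_left_truncated(entry, time, event):
    """Kaplan-Meier with delayed entry: subjects join the risk set only after `entry`."""
    entry, time, event = map(np.asarray, (entry, time, event))
    s, surv = 1.0, {}
    for t in np.unique(time[event == 1]):
        at_risk = np.sum((entry < t) & (time >= t))
        deaths = np.sum((time == t) & (event == 1))
        s *= 1.0 - deaths / at_risk
        surv[t] = s
    return surv

# Four subjects; the last one enters late, at t = 2 (hypothetical data).
surv = km_left_truncated(entry=[0, 0, 0, 2], time=[3, 5, 7, 6], event=[1, 1, 0, 1])
```

If late entrants have systematically different hazards, this risk-set correction alone is not enough, which is exactly the entry bias the article models.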
Modeling and analysis of advanced binary cycles
Energy Technology Data Exchange (ETDEWEB)
Gawlik, K.
1997-12-31
A computer model (Cycle Analysis Simulation Tool, CAST) and a methodology have been developed to perform value analysis for small, low- to moderate-temperature binary geothermal power plants. The value analysis method allows for incremental changes in the levelized electricity cost (LEC) to be determined between a baseline plant and a modified plant. Thermodynamic cycle analyses and component sizing are carried out in the model, followed by economic analysis which provides LEC results. The emphasis of the present work is on evaluating the effect of mixed working fluids instead of pure fluids on the LEC of a geothermal binary plant that uses a simple Organic Rankine Cycle. Four resources were studied spanning the range of 265°F to 375°F. A variety of isobutane- and propane-based mixtures, in addition to pure fluids, were used as working fluids. This study shows that the use of propane mixtures at a 265°F resource can reduce the LEC by 24% when compared to a base case value that utilizes commercial isobutane as its working fluid. The cost savings drop to 6% for a 375°F resource, where an isobutane mixture is favored. Supercritical cycles were found to have the lowest cost at all resources.
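LEC comparisons of this kind reduce, in their simplest form, to a capital-recovery calculation: annualized capital plus operating cost divided by annual energy. The sketch below is a generic levelized-cost formula, not CAST's economic model; the plant numbers, discount rate, and lifetime are placeholders.

```python
def levelized_cost(capital, om_per_year, kwh_per_year, rate=0.08, years=30):
    """Simple LEC: annualized capital (via the capital recovery factor) plus O&M, per kWh."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return (crf * capital + om_per_year) / kwh_per_year

# Hypothetical small binary plant: $1M capital, $50k/yr O&M, 1 GWh/yr output.
base = levelized_cost(1.0e6, 5.0e4, 1.0e6)
# A working-fluid change that shrinks equipment cost lowers the LEC proportionately
# through the capital term (illustrative 24% capital reduction).
improved = levelized_cost(0.76e6, 5.0e4, 1.0e6)
```

Because the capital term usually dominates for binary plants, working-fluid choices that shrink heat-exchanger and turbine sizing feed almost directly into the LEC deltas reported above.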
Computer modeling for neutron activation analysis methods
International Nuclear Information System (INIS)
Full text: The INP AS RU develops databases for neutron activation analysis: ND INAA [1] and ELEMENT [2]. Based on these databases, an automated complex is under construction aimed at modeling methods for the analysis of natural and technogenic materials. It is well known that analysis objects vary widely in the spectra, composition and concentration of their elements, which makes it impossible to develop universal methods applicable to every analytical task. The modelling is based on an algorithm that computes the period of time for which the sample is irradiated in a nuclear reactor, providing the sample's total absorption and activity analytical peak areas with given errors. The analytical complex was tested for low-element analysis (determination of Fe and Zn in vegetation samples, and of Cu, Ag and Au in technological objects). At present, the complex is applied to the multielemental analysis of sediment samples. In this work, modern achievements in analytical chemistry (measurement facilities, high-resolution detectors, IAEA and IUPAC databases) and information technology (Java software, database management systems (DBMS), internet technologies) are applied. References: 1. Tillaev T., Umaraliev A., Gurvich L.G., Yuldasheva K., Kadirova J. Specialized database for instrumental neutron activation analysis - ND INAA 1.0, The 3rd Eurasian Conference on Nuclear Science and its Applications, 2004, pp. 270-271. 2. Gurvich L.G., Tillaev T., Umaraliev A. The information-analytical database on the element contents of natural objects. The 4th International Conference on Modern Problems of Nuclear Physics, Samarkand, 2003, p. 337. (authors)
Mathematical analysis of epidemiological models with heterogeneity
Energy Technology Data Exchange (ETDEWEB)
Van Ark, J.W.
1992-01-01
For many diseases in human populations the disease shows dissimilar characteristics in separate subgroups of the population; for example, the probability of disease transmission for gonorrhea or AIDS is much higher from male to female than from female to male. There is reason to construct and analyze epidemiological models which allow for this heterogeneity of the population, and to use these models to run computer simulations of the disease to predict its incidence and prevalence. In the models considered here, the heterogeneous population is separated into subpopulations whose internal and external interactions are homogeneous, in the sense that each person in a subpopulation can be assumed to exhibit the average behavior of that subpopulation. The first model considered is an SIRS model; i.e., a Susceptible can become Infected, and if so eventually Recovers with temporary immunity, and after a period of time becomes Susceptible again. Special cases allow for permanent immunity or other variations. This model is analyzed and threshold conditions are given which determine whether the disease dies out or persists. A deterministic model is presented; this model is constructed using difference equations, and it has been used in computer simulations of the AIDS epidemic in the homosexual population in San Francisco. The homogeneous and heterogeneous versions of both the differential-equations and difference-equations forms of the deterministic model are analyzed mathematically. In the analysis, equilibria are identified and threshold conditions are set forth: below the threshold the disease dies out and the disease-free equilibrium is globally asymptotically stable; above the threshold the disease persists, the disease-free equilibrium is unstable, and there is a unique endemic equilibrium.
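The threshold behaviour described above can be reproduced numerically with a homogeneous SIRS simulation; the forward-Euler scheme, rates, and the threshold quantity R0 = beta/gamma below are standard illustrative choices, not the dissertation's heterogeneous model.

```python
def simulate_sirs(beta=0.3, gamma=0.1, delta=0.05, dt=0.1, steps=50_000):
    """Forward-Euler SIRS in population fractions: S->I at beta*s*i, I->R at gamma*i, R->S at delta*r."""
    s, i, r = 0.99, 0.01, 0.0
    for _ in range(steps):
        ds = delta * r - beta * s * i
        di = beta * s * i - gamma * i
        dr = gamma * i - delta * r
        s, i, r = s + dt * ds, i + dt * di, r + dt * dr
    return s, i, r

above = simulate_sirs(beta=0.3)    # R0 = beta/gamma = 3 > 1: settles at the endemic equilibrium
below = simulate_sirs(beta=0.05)   # R0 = 0.5 < 1: the disease dies out
```

At the endemic equilibrium of this model, s* = gamma/beta and i* = (1 - s*)/(1 + gamma/delta); the heterogeneous analysis in the dissertation generalizes this threshold to coupled subpopulations.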
Model reduction using a posteriori analysis
Whiteley, Jonathan P.
2010-05-01
Mathematical models in biology and physiology are often represented by large systems of non-linear ordinary differential equations. In many cases, an observed behaviour may be written as a linear functional of the solution of this system of equations. A technique is presented in this study for automatically identifying key terms in the system of equations that are responsible for a given linear functional of the solution. This technique is underpinned by ideas drawn from a posteriori error analysis. This concept has been used in finite element analysis to identify regions of the computational domain and components of the solution where a fine computational mesh should be used to ensure accuracy of the numerical solution. We use this concept to identify regions of the computational domain and components of the solution where accurate representation of the mathematical model is required for accuracy of the functional of interest. The technique presented is demonstrated by application to a model problem, and then to automatically deduce known results from a cell-level cardiac electrophysiology model. © 2010 Elsevier Inc.
Ontological Modeling for Integrated Spacecraft Analysis
Wicks, Erica
2011-01-01
Current spacecraft work as a cooperative group of a number of subsystems. Each of these requires modeling software for development, testing, and prediction. It is the goal of my team to create an overarching software architecture called the Integrated Spacecraft Analysis (ISCA) to aid in deploying the discrete subsystems' models. Such a plan has been attempted in the past, and has failed due to the excessive scope of the project. Our goal in this version of ISCA is to use new resources to reduce the scope of the project, including using ontological models to help link the internal interfaces of subsystems' models with the ISCA architecture. I have created an ontology of functions specific to the modeling system of the navigation system of a spacecraft. The resulting ontology not only links, at an architectural level, language-specific instantiations of the modeling system's code, but also is web-viewable and can act as a documentation standard. This ontology is proof of the concept that ontological modeling can aid in the integration necessary for ISCA to work, and can act as the prototype for future ISCA ontologies.
Analysis of software for modeling atmospheric dispersion
International Nuclear Information System (INIS)
In recent years, a number of software packages for microcomputers have appeared that aim to simulate the diffusion of atmospheric pollutants. These codes, which simplify the models used for safety analyses of industrial plants, are becoming more useful and are even applied to post-accident conditions. This report presents, for the first time in a critical manner, the principal models available to date. The problem lies in adapting the models to the demands of post-accident intervention. In parallel, an analysis of performance was carried out: identifying the need to forecast the most appropriate actions to be taken, bearing in mind the short time available and the lack of information. Because of these difficulties, it is possible to simplify the software so that it does not include all options but can deal with a specific situation. This would minimise the data to be collected on site
Modeling and Hazard Analysis Using STPA
Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka
2010-09-01
A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describe most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analyst than traditional fault tree analysis: functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process, in a safety-driven design process where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis
Automating Risk Analysis of Software Design Models
Directory of Open Access Journals (Sweden)
Maxime Frydman
2014-01-01
Full Text Available The growth of the internet and networked systems has exposed software to an increasing number of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
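The identification-tree idea, threats attached to the leaves of a tree of tests over design elements, can be sketched as follows (the tree shape, element attributes, and threat names are hypothetical illustrations, not taken from AutSEC):

```python
class Leaf:
    def __init__(self, threats):
        self.threats = threats          # threats identified at this leaf

class Test:
    def __init__(self, test, yes, no):
        self.test, self.yes, self.no = test, yes, no

def identify(node, element):
    """Walk the identification tree for one design element and
    return the threats found at the leaf it reaches."""
    while isinstance(node, Test):
        node = node.yes if node.test(element) else node.no
    return node.threats

# Hypothetical tree: an unencrypted data flow crossing a trust
# boundary is flagged for eavesdropping and tampering.
tree = Test(lambda e: e["crosses_trust_boundary"],
            yes=Test(lambda e: not e["encrypted"],
                     yes=Leaf(["eavesdropping", "tampering"]),
                     no=Leaf([])),
            no=Leaf([]))

flow = {"crosses_trust_boundary": True, "encrypted": False}
threats = identify(tree, flow)
```

A mitigation tree would pair each identified threat with candidate countermeasures filtered by the stated cost constraints.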
Automating risk analysis of software design models.
Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P
2014-01-01
The growth of the internet and networked systems has exposed software to an increasing number of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
Gentrification and models for real estate analysis
Directory of Open Access Journals (Sweden)
Gianfranco Brusa
2013-08-01
Full Text Available This research proposes a deep analysis of the Milanese real estate market, based on data supplied by three real estate organizations. Gentrification appears in some neighbourhoods, such as Tortona, Porta Genova, Bovisa, and Isola Garibaldi; the last of these is the subject of the final analysis, a survey of the physical and social state of the area. The survey took place in two periods (2003 and 2009) to compare the evolution of gentrification. The results of the surveys were employed in a simulation based on a multi-agent system model to forecast the long-term evolution of the phenomenon. These neighbourhood micro-indicators make it possible to highlight actual trends conditioning a local real estate market, which can translate into phenomena such as gentrification. In the present analysis, cellular automata models applied to a neighbourhood in Milan (Isola Garibaldi) produced a dynamic simulation of the gentrification trend over a very long time: the cyclical phenomenon (one loop spans a period of twenty to thirty years) appears several times during a theoretical time of 100-120-150 years. Simulating long-period scenarios with multi-agent systems and cellular automata provides the appraiser with a powerful, readily implementable tool to support the appraisal judgment. It also stands to reason that such a tool can sustain urban planning and related evaluation processes.
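A toy cellular automaton conveys the mechanism the abstract describes: neighbourhood status decays slowly, depressed values eventually attract reinvestment, and the cycle repeats over multi-decade loops. All rules and constants below are illustrative inventions, not the paper's calibrated model:

```python
import random

def step(grid):
    """One update of a toy 1-D neighbourhood-status automaton (0..1).
    A cell drifts toward the mean status of its two neighbours
    (spillover), then decays slightly (building ageing); once status
    falls far enough, reinvestment (gentrification) lifts it again."""
    n = len(grid)
    new = []
    for i in range(n):
        nbr = (grid[(i - 1) % n] + grid[(i + 1) % n]) / 2.0
        s = 0.8 * grid[i] + 0.2 * nbr   # spillover from neighbours
        s *= 0.99                        # slow physical decay
        if s < 0.2:                      # depressed values attract
            s += 0.5                     # reinvestment (one new loop)
        new.append(min(1.0, s))
    return new

grid = [random.random() for _ in range(50)]
for _ in range(200):
    grid = step(grid)
```

With these constants one decay-reinvest loop takes on the order of 125 steps, which is the kind of long-period cyclical behaviour the paper's simulation exhibits.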
Erosion Modeling Analysis for SME Tank Cavity
International Nuclear Information System (INIS)
Previous computational work to evaluate erosion in the DWPF Slurry Mix Evaporator vessel has been extended to address the potential for the erosion to accelerate because of changes to the tank bottom profile. The same erosion mechanism identified in the previous work, abrasive erosion driven by high wall shear stress, was applied to the current evaluation. The current work extends the previous analysis by incorporating the observed changes to the tank bottom and coil support structure in the vicinity of the coil guides. The results show that wall shear on the tank bottom is about the same magnitude as found in previous results. Shear stresses in the eroded cavities are reduced compared to those that caused the initial erosion to the extent that anticipated continued erosion of those locations is minimal. If SR operations were continued at an agitator speed of 130 rpm, the edge of the existing eroded cavities would probably smooth out, while the rate of erosion at the bottom of the cavity would decrease significantly with time. Further, reducing the agitator speed to 103 rpm will reduce shear stresses throughout the bottom region of the tank enough to essentially preclude any significant continued erosion. Because this report is an extension to previously documented work, most background information has been omitted. A complete discussion of the motivation for both the analysis and the modeling is provided in Lee et al., ''Erosion Modeling Analysis for Modified DWPF SR Tank''
Global sensitivity analysis of thermomechanical models in modelling of welding
International Nuclear Information System (INIS)
The current approach of most welding modellers is to content themselves with the available material data and to choose a mechanical model that seems appropriate. Among the inputs, those controlling the material properties are one of the key problems of welding simulation: material data are never characterized over a sufficiently wide temperature range. Proceeding in this way neglects the influence of the uncertainty of the input data on the result given by the computer code. In that case, how can the credibility of the prediction be assessed? This thesis is a step towards implementing an innovative approach in welding simulation in order to answer this question, with an illustration on some concrete welding cases. Global sensitivity analysis is chosen to determine which material properties are the most sensitive in a numerical welding simulation and in which temperature range. Using this methodology required some developments to sample and explore the input space covering the welding of different steel materials. Finally, the input data were divided into two groups according to their influence on the output of the model (residual stress or distortion). In this work, the complete methodology of global sensitivity analysis was successfully applied to welding simulation and led to reducing the input space to only the important variables. The sensitivity analysis provided answers to what can be considered one of the most frequently asked questions regarding welding simulation: for a given material, which properties must be measured with good accuracy and which ones can simply be extrapolated or taken from a similar material? (author)
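The first-order (Sobol) sensitivity index at the heart of global sensitivity analysis can be estimated by a simple binning scheme, shown here on a stand-in linear model rather than a welding code. The model and its two "material properties" are invented; for Y = 2*X1 + X2 with independent uniform inputs, the analytic first-order index of X1 is 0.8:

```python
import random

random.seed(0)

def model(x1, x2):
    # Hypothetical stand-in for a welding response (e.g. a distortion
    # measure) with unequal dependence on two material properties.
    return 2.0 * x1 + x2

N = 200000
xs = [(random.random(), random.random()) for _ in range(N)]
ys = [model(a, b) for a, b in xs]
mean = sum(ys) / N
var = sum((y - mean) ** 2 for y in ys) / N

# First-order Sobol index of x1: S1 = Var(E[Y|x1]) / Var(Y),
# with the conditional expectation estimated by binning x1.
bins = 50
sums = [0.0] * bins
counts = [0] * bins
for (a, _), y in zip(xs, ys):
    k = min(int(a * bins), bins - 1)
    sums[k] += y
    counts[k] += 1
cond_means = [s / c for s, c in zip(sums, counts)]
s1 = sum(c * (m - mean) ** 2
         for m, c in zip(cond_means, counts)) / (N * var)
```

Variables with indices near zero are the ones that, as the thesis concludes, can safely be extrapolated or taken from a similar material.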
Inducer analysis/pump model development
Cheng, Gary C.
1994-01-01
Current design of high performance turbopumps for rocket engines requires effective and robust analytical tools to provide design information in a productive manner. The main goal of this study was to develop a robust and effective computational fluid dynamics (CFD) pump model for general turbopump design and analysis applications. A finite difference Navier-Stokes flow solver, FDNS, which includes an extended k-epsilon turbulence model and appropriate moving zonal interface boundary conditions, was developed to analyze turbulent flows in turbomachinery devices. In the present study, three key components of the turbopump, the inducer, impeller, and diffuser, were investigated by the proposed pump model, and the numerical results were benchmarked against the experimental data provided by Rocketdyne. For the numerical calculation of inducer flows with tip clearance, the turbulence model and grid spacing are very important. Meanwhile, the development of the cross-stream secondary flow, generated by the curved blade passage and the flow through the tip leakage, has a strong effect on the inducer flow. Hence, the prediction of inducer performance depends critically on whether the numerical scheme of the pump model can accurately simulate the secondary flow pattern. The impeller and diffuser, however, are dominated by pressure-driven flows, such that the effects of the turbulence model and grid spacing (except near the leading and trailing edges of the blades) are less sensitive. The present CFD pump model has proven to be an efficient and robust analytical tool for pump design due to its very compact numerical structure (requiring little memory), fast turnaround computing time, and versatility for different geometries.
Modeling and Exergy Analysis of District Cooling
DEFF Research Database (Denmark)
Nguyen, Chan
/or economically which is the objective of the PhD project. A thermodynamic (energy and exergy) model of a transcritical CO2 cooling and heating system has been developed. The coefficient of performance (COP) of the system is the characteristic of interest. A sensitivity analysis of the parameters: compressor...... the gas cooler, pinch temperature in the evaporator and effectiveness of the IHX. These results are complemented by the exergy analysis, where the exergy destruction ratio of the CO2 system’s component is found. Heat recovery from vapour compression heat pumps has been investigated. The heat is to be...... used in a district heating system based on combined heat and power plants (CHP). A theoretical comparison of trigeneration (cooling, heating and electricity) systems, a traditional system and a recovery system is carried out. The comparison is based on the systems overall exergy efficiency. The...
Modelling structural systems for transient response analysis
International Nuclear Information System (INIS)
This paper introduces and reports the success of a direct means of determining the time periods in which a structural system behaves as a linear system. Numerical results are based on post-fracture transient analyses of simplified nuclear piping systems. Knowledge of the linear response ranges will lead to improved analysis-test correlation and more efficient analyses. It permits direct use of data from physical tests in analysis, and simplification of the analytical model and the interpretation of its behaviour. The paper presents a procedure for deducing linearity based on transient responses. Given the forcing functions and responses of discrete points of the system at various times, the process produces evidence of linearity and quantifies an adequate set of equations of motion. Results of using the process with linear and nonlinear analyses of piping systems with damping illustrate its success. Results cover the application to data from mathematical system responses. (Auth.)
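The superposition test behind deducing linearity from transient responses can be sketched directly: on a time window where the system is linear, the response to a sum of forcing functions equals the sum of the individual responses. The damped oscillator and forcings below are illustrative, not the paper's piping models:

```python
import math

def response(force, T=10.0, n=5000):
    """Transient response of a damped spring-mass system
    x'' + c x' + k x = f(t), integrated by semi-implicit Euler."""
    dt = T / n
    c, k = 0.5, 4.0
    x, v = 0.0, 0.0
    out = []
    for i in range(n):
        t = i * dt
        v += dt * (force(t) - c * v - k * x)
        x += dt * v
        out.append(x)
    return out

f1 = lambda t: math.sin(t)                 # harmonic forcing
f2 = lambda t: 1.0 if t < 2.0 else 0.0     # step forcing

r1, r2 = response(f1), response(f2)
r12 = response(lambda t: f1(t) + f2(t))

# Superposition check: for a linear system the residual is ~zero;
# a nonlinear system (e.g. with a k*x**3 term) would fail this test.
residual = max(abs(a + b - c) for a, b, c in zip(r1, r2, r12))
```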
Spatiochromatic Context Modeling for Color Saliency Analysis.
Zhang, Jun; Wang, Meng; Zhang, Shengping; Li, Xuelong; Wu, Xindong
2016-06-01
Visual saliency is one of the most noteworthy perceptual abilities of human vision. Recent progress in cognitive psychology suggests that: 1) visual saliency analysis is mainly completed by the bottom-up mechanism consisting of feedforward low-level processing in primary visual cortex (area V1) and 2) color interacts with spatial cues and is influenced by the neighborhood context, and thus it plays an important role in visual saliency analysis. From a computational perspective, most existing saliency modeling approaches exploit multiple independent visual cues, irrespective of their interactions (or the interactions are not computed explicitly), and ignore contextual influences induced by neighboring colors. In addition, the use of color is often underestimated in visual saliency analysis. In this paper, we propose a simple yet effective color saliency model that considers color as the only visual cue and mimics the color processing in V1. Our approach uses region-/boundary-defined color features with spatiochromatic filtering by considering local color-orientation interactions, and therefore captures homogeneous color elements, subtle textures within the object, and the overall salient object from the color image. To account for color contextual influences, we present a divisive normalization method for chromatic stimuli through the pooling of contrary/complementary color units. We further define a color perceptual metric over the entire scene to produce saliency maps for color regions and color boundaries individually. These maps are finally integrated into a single saliency map, which is smoothed by Gaussian blurring for robustness. We evaluate the proposed method on both synthetic stimuli and several benchmark saliency data sets, from visual saliency analysis to salient object detection. The experimental results demonstrate that the use of color as a unique visual cue achieves competitive results on par with or better than 12 state-of-the-art methods.
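The divisive normalization step, each chromatic unit divided by a weighted pool of complementary units, can be written compactly. The weights, saturation constant, and channel values below are illustrative, not the paper's fitted parameters:

```python
def divisive_normalize(responses, weights, sigma=0.1):
    """Divisive normalization of a set of chromatic channel responses:
    r_i = x_i**2 / (sigma**2 + sum_j w_j * x_j**2),
    so each unit's energy is suppressed by the pooled activity of
    the surrounding (contrary/complementary) units."""
    pool = sum(w * x * x for w, x in zip(weights, responses))
    return [x * x / (sigma ** 2 + pool) for x in responses]

# A strong response standing alone survives normalization almost
# intact; the same response embedded in a uniformly active pool is
# suppressed, which is the contextual effect the model exploits.
alone = divisive_normalize([1.0, 0.0, 0.0], [1.0, 1.0, 1.0])
crowded = divisive_normalize([1.0, 1.0, 1.0], [1.0, 1.0, 1.0])
```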
Non standard analysis, polymer models, quantum fields
International Nuclear Information System (INIS)
We give an elementary introduction to non-standard analysis and its applications to the theory of stochastic processes. This is based on a joint book with J.E. Fenstad, R. Høegh-Krohn and T. Lindstrøm. In particular we give a discussion of a hyperfinite theory of Dirichlet forms with applications to the study of the Hamiltonian for a quantum mechanical particle in the potential created by a polymer. We also discuss new results on the existence of attractive polymer measures in dimension d = 1, 2, and on the ((φ²)²)_d model of interacting quantum fields. (orig.)
Modelling and analysis of global coal markets
International Nuclear Information System (INIS)
The thesis comprises four interrelated essays featuring modelling and analysis of coal markets. Each of the four essays has a dedicated chapter in this thesis. Chapters 2 to 4 have, from a topical perspective, a backward-looking focus and deal with explaining recent market outcomes in the international coal trade. The findings of those essays may serve as guidance for assessing current coal market outcomes as well as expected market outcomes in the near to medium-term future. Chapter 5 has a forward-looking focus and builds a bridge between explaining recent market outcomes and projecting long-term market equilibria. Chapter 2, Strategic Behaviour in International Metallurgical Coal Markets, deals with market conduct of large exporters in the market of coals used in steel-making in the period 2008 to 2010. In this essay I analyse whether prices and trade-flows in the international market for metallurgical coals were subject to non-competitive conduct in the period 2008 to 2010. To do so, I develop mathematical programming models - a Stackelberg model, two varieties of a Cournot model, and a perfect competition model - for computing spatial equilibria in international resource markets. Results are analysed with various statistical measures to assess the prediction accuracy of the models. The results show that real market equilibria cannot be reproduced with a competitive model. However, real market outcomes can be accurately simulated with the non-competitive models, suggesting that market equilibria in the international metallurgical coal trade were subject to the strategic behaviour of coal exporters. Chapter 3 and chapter 4 deal with market power issues in the steam coal trade in the period 2006 to 2008. Steam coals are typically used to produce steam either for electricity generation or for heating purposes. In Chapter 3 we analyse market behaviour of key exporting countries in the steam coal trade. This chapter features the essay Market Structure Scenarios in
Modelling and analysis of global coal markets
Energy Technology Data Exchange (ETDEWEB)
Trueby, Johannes
2013-01-17
The thesis comprises four interrelated essays featuring modelling and analysis of coal markets. Each of the four essays has a dedicated chapter in this thesis. Chapters 2 to 4 have, from a topical perspective, a backward-looking focus and deal with explaining recent market outcomes in the international coal trade. The findings of those essays may serve as guidance for assessing current coal market outcomes as well as expected market outcomes in the near to medium-term future. Chapter 5 has a forward-looking focus and builds a bridge between explaining recent market outcomes and projecting long-term market equilibria. Chapter 2, Strategic Behaviour in International Metallurgical Coal Markets, deals with market conduct of large exporters in the market of coals used in steel-making in the period 2008 to 2010. In this essay I analyse whether prices and trade-flows in the international market for metallurgical coals were subject to non-competitive conduct in the period 2008 to 2010. To do so, I develop mathematical programming models - a Stackelberg model, two varieties of a Cournot model, and a perfect competition model - for computing spatial equilibria in international resource markets. Results are analysed with various statistical measures to assess the prediction accuracy of the models. The results show that real market equilibria cannot be reproduced with a competitive model. However, real market outcomes can be accurately simulated with the non-competitive models, suggesting that market equilibria in the international metallurgical coal trade were subject to the strategic behaviour of coal exporters. Chapter 3 and chapter 4 deal with market power issues in the steam coal trade in the period 2006 to 2008. Steam coals are typically used to produce steam either for electricity generation or for heating purposes. In Chapter 3 we analyse market behaviour of key exporting countries in the steam coal trade. This chapter features the essay Market Structure Scenarios in
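The Cournot variety of the models can be illustrated with a two-exporter toy market solved by best-response iteration. The linear inverse demand and cost figures are invented; the thesis solves far richer spatial equilibrium programs:

```python
def best_response(q_other, a=100.0, b=1.0, c=10.0):
    """Profit-maximizing output for inverse demand P = a - b*(q_i + q_j)
    and constant marginal cost c: q_i = (a - c - b*q_j) / (2b),
    floored at zero."""
    return max(0.0, (a - c - b * q_other) / (2.0 * b))

# Iterate the two exporters' best responses to the Cournot fixed point.
q1 = q2 = 0.0
for _ in range(100):
    q1, q2 = best_response(q2), best_response(q1)

# Analytic symmetric Cournot equilibrium: q* = (a - c) / (3b) = 30,
# below the competitive output, which is the kind of gap the thesis
# uses to test exporters' market conduct against observed trade flows.
```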
MODELING ANALYSIS FOR GROUT HOPPER WASTE TANK
Energy Technology Data Exchange (ETDEWEB)
Lee, S.
2012-01-04
The Saltstone facility at the Savannah River Site (SRS) has a grout hopper tank to provide agitator stirring of the Saltstone feed materials. The tank has about 300 gallons of capacity to provide a larger working volume in which the grout nuclear waste slurry can be held in case of a process upset, and it is equipped with a mechanical agitator intended to keep the grout in motion and agitated so that it does not begin to set. The primary objective of the work was to evaluate the flow performance of mechanical agitators to prevent vortex pull-through, ensuring adequate stirring of the feed materials, and to estimate an agitator speed that provides acceptable flow performance with a 45° pitched four-blade agitator. In addition, the power consumption required for agitator operation was estimated. The modeling calculations were performed in two steps of the Computational Fluid Dynamics (CFD) modeling approach. As a first step, a simple single-stage agitator model with 45° pitched propeller blades was developed for the initial scoping analysis of flow pattern behaviors over a range of operating conditions. Based on the initial phase-1 results, the phase-2 model with a two-stage agitator was developed for the final performance evaluations. A series of sensitivity calculations for different agitator designs and operating conditions was performed to investigate the impact of key parameters on the grout hydraulic performance in a 300-gallon hopper tank. For the analysis, viscous shear was modeled by using the Bingham plastic approximation. Steady-state analyses with a two-equation turbulence model were performed. All analyses were based on three-dimensional results. Recommended operational guidance was developed using the basic concept that local shear rate profiles and flow patterns can be used as a measure of hydraulic performance and spatial stirring. Flow patterns were estimated by a Lagrangian integration technique along
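For orientation, a first-cut estimate of agitator power consumption can use the standard turbulent-regime correlation P = Np·ρ·N³·D⁵. The power number, grout density, and impeller diameter below are illustrative guesses, not values from the report, whose estimate came from the CFD model:

```python
def agitator_power(power_number, density, rev_per_s, impeller_dia):
    """Turbulent-regime mixing power correlation P = Np * rho * N^3 * D^5.
    Np ~ 1.3 for a 45-degree pitched four-blade turbine is an
    illustrative handbook-style value, not one from the report."""
    return power_number * density * rev_per_s ** 3 * impeller_dia ** 5

# Illustrative numbers only: grout density ~1800 kg/m3, 130 rpm,
# 0.5 m impeller diameter.
p_watts = agitator_power(1.3, 1800.0, 130.0 / 60.0, 0.5)
```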
A catalog of automated analysis methods for enterprise models.
Florez, Hector; Sánchez, Mario; Villalobos, Jorge
2016-01-01
Enterprise models are created for documenting and communicating the structure and state of the Business and Information Technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, this process can be complicated, so omissions or miscalculations are very likely. This situation has fostered research into automated analysis methods to support analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels; thus, some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool. PMID:27047732
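One way a modeling tool can structure such a catalog is a registry that records which metamodels each automated analysis method applies to, so a method is only offered for models it can actually analyze. The decorator, method name, and metamodel names below are hypothetical, not the paper's catalog entries:

```python
# Registry: method name -> (supported metamodels, implementation).
CATALOG = {}

def register(name, metamodels):
    """Decorator that files an analysis method in the catalog
    together with the metamodels it applies to."""
    def wrap(fn):
        CATALOG[name] = (set(metamodels), fn)
        return fn
    return wrap

@register("element_count", metamodels={"archimate", "bpmn"})
def element_count(model):
    # Trivial example method: count the elements in the model.
    return len(model["elements"])

def applicable(metamodel):
    """List the catalogued methods applicable to a given metamodel."""
    return [n for n, (mms, _) in CATALOG.items() if metamodel in mms]

model = {"metamodel": "archimate", "elements": ["app", "server", "db"]}
methods = applicable(model["metamodel"])
result = CATALOG[methods[0]][1](model)
```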
Modeling human reliability analysis using MIDAS
Energy Technology Data Exchange (ETDEWEB)
Boring, R. L. [Human Factors, Instrumentation and Control Systems Dept., Idaho National Laboratory, Idaho Falls, ID 83415 (United States)
2006-07-01
This paper documents current efforts to infuse human reliability analysis (HRA) into human performance simulation. The Idaho National Laboratory is teamed with NASA Ames Research Center to bridge the SPAR-H HRA method with NASA's Man-machine Integration Design and Analysis System (MIDAS) for use in simulating and modeling the human contribution to risk in nuclear power plant control room operations. It is anticipated that the union of MIDAS and SPAR-H will pave the path for cost-effective, timely, and valid simulated control room operators for studying current and next generation control room configurations. This paper highlights considerations for creating the dynamic HRA framework necessary for simulation, including event dependency and granularity. This paper also highlights how the SPAR-H performance shaping factors can be modeled in MIDAS across static, dynamic, and initiator conditions common to control room scenarios. This paper concludes with a discussion of the relationship of the workload factors currently in MIDAS and the performance shaping factors in SPAR-H. (authors)
Modeling human reliability analysis using MIDAS
International Nuclear Information System (INIS)
This paper documents current efforts to infuse human reliability analysis (HRA) into human performance simulation. The Idaho National Laboratory is teamed with NASA Ames Research Center to bridge the SPAR-H HRA method with NASA's Man-machine Integration Design and Analysis System (MIDAS) for use in simulating and modeling the human contribution to risk in nuclear power plant control room operations. It is anticipated that the union of MIDAS and SPAR-H will pave the path for cost-effective, timely, and valid simulated control room operators for studying current and next generation control room configurations. This paper highlights considerations for creating the dynamic HRA framework necessary for simulation, including event dependency and granularity. This paper also highlights how the SPAR-H performance shaping factors can be modeled in MIDAS across static, dynamic, and initiator conditions common to control room scenarios. This paper concludes with a discussion of the relationship of the workload factors currently in MIDAS and the performance shaping factors in SPAR-H. (authors)
ANALYSIS MODEL FOR RETURN ON CAPITAL EMPLOYED
Directory of Open Access Journals (Sweden)
BURJA CAMELIA
2013-02-01
Full Text Available At the microeconomic level, the appreciation of capital profitability is a very complex task which is of interest to stakeholders. The main purpose of this study is to extend the traditional analysis model for capital profitability, based on the ratio "Return on capital employed". In line with this, the objectives of this work are to identify the factors that influence the profitability of the capital employed by a company and to measure their contribution to the phenomenon. The proposed analysis model is validated on the use case of a representative company from the agricultural sector. The results obtained reveal several factors that can act positively on capital profitability: capital turnover, sales efficiency, increasing the share of sales in total revenues, and improving the efficiency of expenses. The findings are useful both for decision-making factors in substantiating economic strategies and for capital owners interested in the efficiency of their investments.
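The extended model decomposes return on capital employed into multiplicative factors; the classic two-factor version (operating margin times capital turnover) can be computed directly. The figures are illustrative, not from the paper's case company:

```python
def roce_factors(ebit, sales, capital_employed):
    """Two-factor decomposition of return on capital employed:
    ROCE = (EBIT / sales) * (sales / capital employed),
    i.e. operating margin x capital turnover."""
    margin = ebit / sales
    turnover = sales / capital_employed
    return margin, turnover, margin * turnover

# Illustrative figures: 120 EBIT on 1000 sales with 800 capital
# employed gives a 12% margin, 1.25x turnover, 15% ROCE.
margin, turnover, roce = roce_factors(ebit=120.0, sales=1000.0,
                                      capital_employed=800.0)
```

The decomposition shows which lever (margin or turnover) drives a change in ROCE, which is the kind of factor attribution the paper's extended model performs.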
Computational Modeling, Formal Analysis, and Tools for Systems Biology
Bartocci, Ezio; Lió, Pietro
2016-01-01
As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification...
Data analysis and source modelling for LISA
International Nuclear Information System (INIS)
Gravitational waves (GWs) are one of the most important predictions of general relativity. In addition to the indirect evidence for the existence of GWs, there are already several ground-based detectors (such as LIGO and GEO) and planned future space missions (such as LISA) that aim to detect GWs directly. A GW carries a large amount of information about its source; extracting this information can help us uncover the physical properties of the source, and even open a new window for understanding the Universe. Hence, GW data analysis will be a challenging task in the search for GWs. In this thesis, I present two works on data analysis for LISA. In the first work, we introduce an extended multimodal genetic algorithm which utilizes the properties of the signal and the detector response function to analyze the data from the third round of the Mock LISA Data Challenge. We found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio, and sky location with reasonable accuracy. As for the orbital angular momentum and the two spins of the black holes, we found a large number of widely separated modes in the parameter space with similar maximum-likelihood values. The performance of this method is comparable, if not superior, to already existing algorithms. In the second work, we introduce a new phenomenological waveform model for the extreme-mass-ratio inspiral (EMRI) system. This waveform consists of a set of harmonics with constant amplitude and slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in the simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is very effective in the data analysis of EMRI signals.
Sensitivity analysis of Smith's AMRV model
International Nuclear Information System (INIS)
Multiple-expert hazard/risk assessments have considerable precedent, particularly in the Yucca Mountain site characterization studies. In this paper, we present a Bayesian approach to statistical modeling in volcanic hazard assessment for the Yucca Mountain site. Specifically, we show that the expert opinion on the site disruption parameter p is elicited on the prior distribution, π(p), based on the geological information that is available. Moreover, π(p) can combine all available geological information motivated by conflicting but realistic arguments (e.g., simulation, cluster analysis, structural control, etc.). The incorporated uncertainties about the probability of repository disruption p will eventually be averaged out by taking the expectation over π(p). We use the following priors in the analysis: priors chosen for mathematical convenience, Beta(r, s) for (r, s) = (2, 2), (3, 3), (5, 5), (2, 1), (2, 8), (8, 2), and (1, 1); and three priors motivated by expert knowledge. Sensitivity analysis is performed for each prior distribution. Estimated values of hazard based on the priors chosen for mathematical simplicity are uniformly higher than those obtained from the priors motivated by expert knowledge, and the model using the prior Beta(8, 2) yields the highest hazard (= 2.97 x 10^-2). The minimum hazard is produced by the ''three-expert prior'' (i.e., values of p are equally likely at 10^-3, 10^-2, and 10^-1). The estimate of the hazard is 1.39 x which is only about one order of magnitude smaller than the maximum value. The term ''hazard'' is defined as the probability of at least one disruption of a repository at the Yucca Mountain site by basaltic volcanism over the next 10,000 years.
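The role of the prior can be seen by comparing prior means of the disruption parameter p across the Beta priors listed above. The prior mean is only a crude proxy for the hazard (which also depends on the volcanism model and time horizon), but it shows why Beta(8, 2) drives the highest estimate:

```python
def beta_mean(r, s):
    # Mean of a Beta(r, s) prior on the disruption parameter p;
    # taking the expectation over the prior is how the paper's
    # uncertainty about p is averaged out.
    return r / (r + s)

priors = [(2, 2), (3, 3), (5, 5), (2, 1), (2, 8), (8, 2), (1, 1)]
means = {rs: beta_mean(*rs) for rs in priors}

# Beta(8, 2) concentrates mass at high p (mean 0.8), consistent with
# it yielding the highest hazard in the sensitivity analysis, while
# Beta(2, 8) (mean 0.2) pulls the estimate down.
highest = max(means, key=means.get)
```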
Data analysis and source modelling for LISA
Energy Technology Data Exchange (ETDEWEB)
Shang, Yu
2014-07-01
Gravitational waves (GWs) are one of the most important predictions of general relativity. Beyond the direct proof of their existence, several ground-based detectors (such as LIGO and GEO) are already operating, and planned space missions (such as LISA) aim to detect GWs directly. A GW carries a large amount of information about its source; extracting this information can reveal the physical properties of the source and even open a new window for understanding the Universe. Hence, GW data analysis is a challenging task in the search for GWs. In this thesis, I present two works on data analysis for LISA. In the first, we introduce an extended multimodal genetic algorithm that exploits the properties of the signal and the detector response function to analyze the data from the third round of the Mock LISA Data Challenge. We found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio, and sky location with reasonable accuracy. For the orbital angular momentum and the two spins of the black holes, we found a large number of widely separated modes in the parameter space with similar maximum-likelihood values. The performance of this method is comparable, if not superior, to existing algorithms. In the second work, we introduce a new phenomenological waveform model for the extreme-mass-ratio inspiral (EMRI) system. The waveform consists of a set of harmonics with constant amplitude and slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in the simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is well suited to data analysis of EMRI signals.
Application of Statistical Analysis Software in Food Scientific Modeling
Miaochao Chen; Kong Xiangsheng; Kan Chen
2014-01-01
In food science research, sophisticated statistical analysis problems are often encountered. In this study, using the SPSS statistical analysis software, we establish the curve regression model and the multiple regression model, both common in food science. The experimental results show that the method can be used effectively in statistical analysis models for food science.
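As a minimal sketch of the kind of curve regression the abstract attributes to SPSS, the fit of a logarithmic curve y = a + b·ln(x) reduces to ordinary least squares on the transformed predictor. The data below are invented for illustration.

```python
import math

# Curve regression sketch: fit y = a + b*ln(x) by OLS on (ln x, y).
def log_curve_fit(xs, ys):
    ts = [math.log(x) for x in xs]
    n = len(ts)
    t_bar = sum(ts) / n
    y_bar = sum(ys) / n
    sxy = sum((t - t_bar) * (y - y_bar) for t, y in zip(ts, ys))
    sxx = sum((t - t_bar) ** 2 for t in ts)
    b = sxy / sxx           # slope on the log-transformed predictor
    a = y_bar - b * t_bar   # intercept
    return a, b

# Synthetic data generated from y = 2 + 3*ln(x); the fit should recover it
xs = [1, 2, 4, 8, 16]
ys = [2 + 3 * math.log(x) for x in xs]
a, b = log_curve_fit(xs, ys)
print(f"a = {a:.3f}, b = {b:.3f}")  # a = 2.000, b = 3.000
```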
A structural analysis model for clay caps
International Nuclear Information System (INIS)
This paper presents a structural analysis model for clay caps used in the landfill of low-level nuclear waste to minimize the migration of fluid through the soil. The clay cap resting on the soil foundation is treated as an axially symmetric elastic plate supported by an elastic foundation. A circular hole (concentric with the plate) in the elastic foundation represents an underlying cavity formed in the landfill due to waste decomposition and volume reduction. Unlike the models that commonly represent the soil foundation with equivalent springs, this model treats the foundation as a semi-infinite space and accounts for the work done by both compression and shear stresses in the foundation. The governing equation of the plate is based upon the classical theory of plate bending, whereas the governing equation derived by using Vlasov's general variational method describes the soil foundation. The solutions are expressed in terms of Basset functions. A FORTRAN program was written to carry out the numerical calculations
Dynamical Systems Analysis of Various Dark Energy Models
Roy, Nandan
2015-01-01
In this thesis, we used dynamical systems analysis to find the qualitative behaviour of some dark energy models. Specifically, dynamical systems analysis of quintessence scalar field models, chameleon scalar field models and holographic models of dark energy are discussed in this thesis.
Saturn Ring Data Analysis and Thermal Modeling
Dobson, Coleman
2011-01-01
CIRS, VIMS, UVIS, and ISS (Cassini's Composite Infrared Spectrometer, Visual and Infrared Mapping Spectrometer, Ultraviolet Imaging Spectrometer, and Imaging Science Subsystem, respectively) have each operated in a multidimensional observation space and have acquired scans of the lit and unlit rings at multiple phase angles. To better understand the parametric dependence of physical and dynamical ring particle properties, we co-registered profiles from these instruments, taken at a wide range of wavelengths from the ultraviolet through the thermal infrared, to associate changes in ring particle temperature with changes in observed brightness, specifically with albedos inferred by ISS, UVIS, and VIMS. We work in a parameter space where the solar elevation range is constrained to 12 deg - 14 deg and the chosen radial region is the B3 region of the B ring; this region is the most optically thick region in Saturn's rings. From this compilation of multiple-wavelength data, we construct and fit phase curves and color ratios using independent dynamical thermal models for ring structure, and overplot Saturn, Saturn ring, and solar spectra. Analysis of the phase curves and color ratios reveals, respectively, thermal emission falling within the extrema of the ISS bandwidth and a geometrical dependence of reddening on phase angle. Analysis of the spectra reveals that the Cassini CIRS Saturn spectrum dominates the Cassini CIRS B3 ring spectrum from 19 to 1000 microns, while the Earth-based B ring spectrum dominates the Earth-based Saturn spectrum from 0.4 to 4 microns. From our fits we test our dynamical thermal models; from the phase curves we derive ring albedos and non-Lambertian properties of the ring particle surfaces; and from the color ratios we examine multiple scattering within the regolith of ring particles.
A visual analysis of the process of process modeling
Claes, Jan; Vanderfeesten, Irene; Pinggera, J.; Reijers, Hajo; Weber, B.; Poels, G.
2015-01-01
The construction of business process models has become an important requisite in the analysis and optimization of processes. The success of the analysis and optimization efforts heavily depends on the quality of the models. Therefore, a research domain emerged that studies the process of process modeling. This paper contributes to this research by presenting a way of visualizing the different steps a modeler undertakes to construct a process model, in a so-called process of process modeling C...
Comparative Analysis of Parametric Engine Model and Engine Map Model
Zeeshan Ali Memon; Sadiq Ali Shah; Muhammad Saleh Jumani
2015-01-01
Two different engine models, parametric engine model and engine map model are employed to analyze the dynamics of an engine during the gear shifting. The models are analyzed under critical transitional manoeuvres to investigate their appropriateness for vehicle longitudinal dynamics. The simulation results for both models have been compared. The results show the engine map model matches well with the parametric model and can be used for the vehicle longitudinal dynamics model. The proposed ap...
Comparative analysis of enterprise risk management models
Nikolaev Igor V.
2012-01-01
The article is devoted to the analysis and comparison of modern enterprise risk management models used in domestic and world practice. Some propositions on which such models should be based are proposed.
Production TTR modeling and dynamic buckling analysis
Institute of Scientific and Technical Information of China (English)
Hugh Liu; John Wei; Edward Huang
2013-01-01
In a typical tension leg platform (TLP) design, the top tension factor (TTF), measuring the top tension of a top tensioned riser (TTR) relative to its submerged weight in water, is one of the most important design parameters and must be specified properly. While a very small TTF may lead to excessive vortex-induced vibration (VIV), clashing issues, and possible compression close to the seafloor, an unnecessarily high TTF may translate into excessive riser cost and vessel payload, and can even affect the TLP sizing and design in general. In the process of a production TTR design, it was found that the outer casing can be subjected to compression in a worst-case scenario with some extreme metocean and hardware conditions. The present paper shows how finite element analysis (FEA) models using beam elements and two different software packages (Flexcom and ABAQUS) are constructed to simulate the TTR properly, especially the pipe-in-pipe effects. An ABAQUS model with hybrid elements (beam elements globally + shell elements locally) can be used to investigate how the outer casing behaves under compression. It is shown that for the specified TTR design, even with the outer casing under some local compression in the worst-case scenario, dynamic buckling would not occur; therefore the TTR design is adequate.
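The top tension factor defined above reduces to a one-line relation: TTF = top tension / submerged riser weight. The sketch below uses purely illustrative values, not the paper's TLP design case.

```python
# TTF sketch: ratio of riser top tension to its submerged weight.
def top_tension_factor(top_tension_kN, submerged_weight_kN):
    return top_tension_kN / submerged_weight_kN

# Hypothetical riser: 1600 kN top tension on a 1000 kN submerged weight
ttf = top_tension_factor(1600.0, 1000.0)
print(f"TTF = {ttf:.2f}")  # TTF = 1.60
```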
Statistical Analysis and Modeling of Elastic Functions
Srivastava, Anuj; Kurtek, Sebastian; Klassen, Eric; Marron, J S
2011-01-01
We introduce a novel geometric framework for separating, analyzing and modeling the $x$ (or horizontal) and the $y$ (or vertical) variability in time-warped functional data of the type frequently studied in growth curve analysis. This framework is based on the use of the Fisher-Rao Riemannian metric that provides a proper distance for: (1) aligning, comparing and modeling functions and (2) analyzing the warping functions. A convenient square-root velocity function (SRVF) representation transforms the Fisher-Rao metric to the standard $L^2$ metric, a tool that is applied twice in this framework. Firstly, it is applied to the given functions where it leads to a parametric family of penalized-$L^2$ distances in SRVF space. The parameter controls the levels of elasticity of the individual functions. These distances are then used to define Karcher means and the individual functions are optimally warped to align them to the Karcher means to extract the $y$ variability. Secondly, the resulting warping functions,...
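The SRVF representation mentioned above is commonly written as q(t) = f'(t) / sqrt(|f'(t)|). A minimal discrete sketch, with our own finite-difference approximation and a made-up sample function:

```python
import math

# Discrete SRVF sketch: q(t) = f'(t) / sqrt(|f'(t)|), with f'
# approximated by forward differences on a uniform grid.
def srvf(f_samples, dt):
    q = []
    for i in range(len(f_samples) - 1):
        df = (f_samples[i + 1] - f_samples[i]) / dt
        q.append(0.0 if df == 0 else df / math.sqrt(abs(df)))
    return q

# f(t) = t^2 on [0, 1]; f'(t) = 2t, so q(t) = sqrt(2t)
n, dt = 100, 0.01
f = [(i * dt) ** 2 for i in range(n + 1)]
q = srvf(f, dt)
print(f"q near t=0.5: {q[50]:.3f}")  # ~ sqrt(2*0.505) ~ 1.005
```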
Linking advanced fracture models to structural analysis
Energy Technology Data Exchange (ETDEWEB)
Chiesa, Matteo
2001-07-01
Shell structures with defects occur in many situations. The defects are usually introduced during the welding process necessary for joining different parts of the structure. Higher utilization of structural materials leads to a need for accurate numerical tools for reliable prediction of structural response. The direct discretization of the cracked shell structure with solid finite elements in order to perform an integrity assessment of the structure in question leads to large problems and makes such analysis infeasible in structural applications. In this study a link between local material models and structural analysis is outlined. An "ad hoc" element formulation is used in order to connect complex material models to the finite element framework used for structural analysis. An improved elasto-plastic line spring finite element formulation, used in order to take cracks into account, is linked to shell elements which are further linked to beam elements. In this way one obtains a global model of the shell structure that also accounts for local flexibilities and fractures due to defects. An important advantage of such an approach is a direct fracture mechanics assessment, e.g. via the computed J-integral or CTOD. A recent development in this approach is the notion of two-parameter fracture assessment. This means that the crack tip stress tri-axiality (constraint) is employed in determining the corresponding fracture toughness, giving a much more realistic capacity of cracked structures. The present thesis is organized in six research articles and an introductory chapter that reviews important background literature related to this work. Paper I and II address the performance of shell and line spring finite elements as a cost effective tool for performing the numerical calculation needed to perform a fracture assessment. In Paper II a failure assessment, based on the testing of a constraint-corrected fracture mechanics specimen under tension, is
Trend analysis model to forecast energy supply and demand
Energy Technology Data Exchange (ETDEWEB)
1984-01-01
A particular approach to energy forecasting which was studied in considerable detail was trend extrapolation. This technique, termed the trend analysis model, was suggested by Dr. S. Scott Sutton, the EIA contract technical officer. While a variety of equations were explored during this part of the study, they are variations of a basic formulation. This report describes the trend analysis model, demonstrates the trend analysis model and documents the computer program used to produce the model results.
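The trend-extrapolation idea behind the trend analysis model can be sketched in a few lines: fit a linear trend to historical data by least squares and project it forward. The energy-demand series below is invented for illustration; it is not EIA data.

```python
# Trend extrapolation sketch: least-squares linear trend, then forecast.
def fit_trend(years, values):
    n = len(years)
    x_bar = sum(years) / n
    y_bar = sum(values) / n
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(years, values))
    sxx = sum((x - x_bar) ** 2 for x in years)
    slope = sxy / sxx
    intercept = y_bar - slope * x_bar
    return intercept, slope

years = [1978, 1979, 1980, 1981, 1982]
demand = [70.0, 72.5, 75.0, 77.5, 80.0]   # hypothetical quads/year
a, b = fit_trend(years, demand)
forecast_1990 = a + b * 1990
print(f"forecast for 1990: {forecast_1990:.1f}")  # 100.0
```

The report's variations of the basic formulation would correspond to alternative trend equations (e.g. exponential or logarithmic), fitted the same way on transformed variables.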
Model Checking Is Static Analysis of Modal Logic
DEFF Research Database (Denmark)
Nielson, Flemming; Nielson, Hanne Riis
2010-01-01
it can give an exact characterisation of the semantics of formulae in a modal logic. This shows that model checking can be performed by means of state-of-the-art approaches to static analysis and allow us to conclude that the problems of model checking and static analysis are reducible to each other....... In terms of computational complexity we show that model checking by means of static analysis gives the same complexity bounds as are known for traditional approaches to model checking....
Comparison of Statistical Models for Regional Crop Trial Analysis
Institute of Scientific and Technical Information of China (English)
ZHANG Qun-yuan; KONG Fan-ling
2002-01-01
Based on a review and comparison of the main statistical analysis models for estimating variety-environment cell means in regional crop trials, a new statistical model, the LR-PCA composite model, was proposed, and the predictive precision of these models was compared by cross-validation on example data. Results showed that the order of model precision was LR-PCA model > AMMI model > PCA model > Treatment Means (TM) model > Linear Regression (LR) model > Additive Main Effects ANOVA model. The precision gain factor of the LR-PCA model was 1.55, an increase of 8.4% compared with AMMI.
Applied data analysis and modeling for energy engineers and scientists
Reddy, T Agami
2011-01-01
"Applied Data Analysis and Modeling for Energy Engineers and Scientists" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and
Towards a controlled sensitivity analysis of model development decisions
Clark, Martyn; Nijssen, Bart
2016-04-01
The current generation of hydrologic models has followed a myriad of different development paths, making it difficult for the community to test underlying hypotheses and identify a clear path to model improvement. Model comparison studies have been undertaken to explore model differences, but they have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than a systematic analysis of model shortcomings. This presentation will discuss a unified approach to process-based hydrologic modeling that enables controlled and systematic analysis of multiple model representations (hypotheses) of hydrologic processes and scaling behavior. Our approach, which we term the Structure for Unifying Multiple Modeling Alternatives (SUMMA), formulates a general set of conservation equations, providing the flexibility to experiment with different spatial representations, different flux parameterizations, different model parameter values, and different time stepping schemes. We will discuss the use of SUMMA to systematically analyze different model development decisions, focusing both on analysis of simulations for intensively instrumented research watersheds and on simulations across a global dataset of FLUXNET sites. The intent of the presentation is to demonstrate how the systematic analysis of model shortcomings can help identify model weaknesses and inform future model development priorities.
Comparative analysis of parametric engine model and engine map model
International Nuclear Information System (INIS)
Two different engine models, parametric engine model and engine map model are employed to analyze the dynamics of an engine during the gear shifting. The models are analyzed under critical transitional manoeuvres to investigate their appropriateness for vehicle longitudinal dynamics. The simulation results for both models have been compared. The results show the engine map model matches well with the parametric model and can be used for the vehicle longitudinal dynamics model. The proposed approach can be useful for the selection of the appropriate vehicle for the given application. (author)
Translation model, translation analysis, translation strategy: an integrated methodology
VOLKOVA TATIANA A.
2014-01-01
The paper revisits the concepts of translation model, translation analysis, and translation strategy from an integrated perspective: a translation strategy naturally follows translation analysis performed on a given set of textual, discursive and communicative parameters that form a valid translation model. Translation modeling is reconsidered in terms of a paradigm shift and a distinction between a process-oriented (descriptive) model and an action-oriented (prescriptive) model. Following th...
Analysis on the Logarithmic Model of Relationships
Institute of Scientific and Technical Information of China (English)
Anonymous
2005-01-01
The logarithmic model is often used to describe the relationships between factors, and it often gives good statistical characteristics. Yet, in modeling for soil and water conservation, we find that this "good" model cannot guarantee good results. In this paper we inquire into the intrinsic reasons. It is shown that the logarithmic model has the property of enlarging or reducing model errors, and the disadvantages of the logarithmic model are analyzed.
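The error-scaling property discussed above can be illustrated numerically: a fixed residual e in ln(y) corresponds to a multiplicative error exp(e) in y itself, so the same log-space error means very different absolute errors at different magnitudes. The numbers below are a simple demonstration, not taken from the paper.

```python
import math

# If observed ln(y) = ln(y_true) + e, then y = y_true * exp(e),
# so the absolute error in y scales with y_true.
def y_error_from_log_error(y_true, log_error):
    return y_true * math.exp(log_error) - y_true

for y_true in [1.0, 10.0, 100.0]:
    err = y_error_from_log_error(y_true, 0.1)
    print(f"y = {y_true:6.1f}: absolute error = {err:.3f}")
```

A log-space residual of 0.1 yields an absolute error about 100 times larger at y = 100 than at y = 1; this is one concrete sense in which the logarithmic model "enlarges or reduces" errors.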
Economic analysis model for total energy and economic systems
International Nuclear Information System (INIS)
This report describes the framing of an economic analysis model developed as a tool for total energy systems. To explore and analyze future energy systems, it is important to analyze the relation between the energy system and the economic structure, and we prepared an economic analysis model suited to this purpose. Distinctive features of our model are that it analyzes energy-related matters in more detail than other economic models and that it forecasts long-term economic progress rather than short-term economic fluctuation. From the viewpoint of economics, our model is a long-term multi-sectoral economic analysis model of open Leontief type. The model gave us appropriate results in fitting tests and forecasting estimation. (author)
Model performance analysis and model validation in logistic regression
Directory of Open Access Journals (Sweden)
Rosa Arboretti Giancristofaro
2007-10-01
In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model using an example taken from a management study.
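The holdout idea at the heart of model validation can be sketched minimally: fit a logistic regression on a training split, then score a quantitative performance measure (here, simple accuracy) on held-out data. The one-feature toy data and gradient-descent fit below are our own illustration, not the management study in the paper.

```python
import math, random

# Tiny logistic regression (one feature) fitted by gradient descent.
def fit_logistic(data, lr=0.1, epochs=200):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            w += lr * (y - p) * x     # gradient ascent on log-likelihood
            b += lr * (y - p)
    return w, b

def accuracy(data, w, b):
    hits = sum(1 for x, y in data
               if (1.0 / (1.0 + math.exp(-(w * x + b))) >= 0.5) == (y == 1))
    return hits / len(data)

random.seed(0)
# Separable toy data: x < 0 -> class 0, x > 0 -> class 1
points = [(random.uniform(-2, -0.5), 0) for _ in range(40)] + \
         [(random.uniform(0.5, 2), 1) for _ in range(40)]
random.shuffle(points)
train, test = points[:60], points[60:]   # holdout split
w, b = fit_logistic(train)
print(f"holdout accuracy: {accuracy(test, w, b):.2f}")
```

A full validation procedure would add further measures (calibration, discrimination such as AUC) on the holdout set; accuracy is just the simplest instance.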
An Extended Analysis of Requirements Traceability Model
Institute of Scientific and Technical Information of China (English)
Jiang Dandong(蒋丹东); Zhang Shensheng; Chen Lu
2004-01-01
A new extended meta model of traceability is presented. Then, a formalized fine-grained model of traceability is described. Some major issues about this model, including trace units, requirements and relations within the model, are further analyzed. Finally, a case study that comes from a key project of 863 Program is given.
Managing Analysis Models in the Design Process
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
Loss Given Default Modelling: Comparative Analysis
Yashkir, Olga; Yashkir, Yuriy
2013-01-01
In this study we investigated several most popular Loss Given Default (LGD) models (LSM, Tobit, Three-Tiered Tobit, Beta Regression, Inflated Beta Regression, Censored Gamma Regression) in order to compare their performance. We show that for a given input data set, the quality of the model calibration depends mainly on the proper choice (and availability) of explanatory variables (model factors), but not on the fitting model. Model factors were chosen based on the amplitude of their correlati...
Comparison of Integrated Analysis Methods for Two Model Scenarios
Amundsen, Ruth M.
1999-01-01
Integrated analysis methods have the potential to substantially decrease the time required for analysis modeling. Integration with computer aided design (CAD) software can also allow a model to be more accurate by facilitating import of exact design geometry. However, the integrated method utilized must sometimes be tailored to the specific modeling situation, in order to make the process most efficient. Two cases are presented here that illustrate different processes used for thermal analysis on two different models. These examples are used to illustrate how the requirements, available input, expected output, and tools available all affect the process selected by the analyst for the most efficient and effective analysis.
Model Analysis Assessing the dynamics of student learning
Bao, Lei; Redish, Edward F.
2002-01-01
In this paper we present a method of modeling and analysis that permits the extraction and quantitative display of detailed information about the effects of instruction on a class's knowledge. The method relies on a cognitive model that represents student thinking in terms of mental models. Students frequently fail to recognize relevant conditions that lead to appropriate uses of their models; as a result they can use multiple models inconsistently. Once the most common mental models have been determined by qualitative research, they can be mapped onto a multiple-choice test. Model analysis permits the interpretation of such a situation. We illustrate the use of our method by analyzing results from the FCI.
EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION
The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...
[Analysis of the stability and adaptability of near infrared spectra qualitative analysis model].
Cao, Wu; Li, Wei-jun; Wang, Ping; Zhang, Li-ping
2014-06-01
The stability and adaptability of models for qualitative analysis of near-infrared spectra were studied. Separate modeling can significantly improve the stability and adaptability of a model, but its ability to improve adaptability is limited. Joint modeling can improve not only the adaptability of the model but also its stability; at the same time, compared with separate modeling, it can shorten the modeling time, reduce the modeling workload, extend the term of validity of the model, and improve modeling efficiency. The model adaptability experiment shows that the correct recognition rate of the separate modeling method is relatively low and cannot meet application requirements, while the joint modeling method reaches a correct recognition rate of 90% and significantly enhances the recognition effect. The model stability experiment shows that the identification results of joint modeling are better than those of separate modeling, and the method has good application value. PMID:25358155
Priors from DSGE models for dynamic factor analysis
Bäurle, Gregor
2008-01-01
We propose a method to incorporate information from Dynamic Stochastic General Equilibrium (DSGE) models into Dynamic Factor Analysis. The method combines a procedure previously applied for Bayesian Vector Autoregressions and a Gibbs Sampling approach for Dynamic Factor Models. The factors in the model are rotated such that they can be interpreted as variables from a DSGE model. In contrast to standard Dynamic Factor Analysis, a direct economic interpretation of the factors is given. We evalu...
Modeling Paradigms Applied to the Analysis of European Air Quality
Makowski, M.
2000-01-01
The paper presents an overview of various modeling paradigms applicable to the analysis of complex decision-making that can be represented by large non-linear models. Such paradigms are illustrated by their application to the analysis of a model that helps to identify and analyze various cost-effective policy options aimed at improving European air quality. Also presented is the application of this model to support intergovernmental negotiations.
Computational models for the nonlinear analysis of reinforced concrete plates
Hinton, E.; Rahman, H. H. A.; Huq, M. M.
1980-01-01
A finite element computational model for the nonlinear analysis of reinforced concrete solid, stiffened and cellular plates is briefly outlined. Typically, Mindlin elements are used to model the plates whereas eccentric Timoshenko elements are adopted to represent the beams. The layering technique, common in the analysis of reinforced concrete flexural systems, is incorporated in the model. The proposed model provides an inexpensive and reasonably accurate approach which can be extended for use with voided plates.
Modeling for ultrasonic testing accuracy in probabilistic fracture mechanics analysis
International Nuclear Information System (INIS)
This study proposes models for ultrasonic testing (UT) accuracy at in-service inspection (ISI) in probabilistic fracture mechanics (PFM) analysis. Regression analysis of data from the Ultrasonic Test and Evaluation for Maintenance Standards (UTS) project, together with modeling of successful candidates of the performance demonstration certification system, provided the models for accuracy of flaw detection and sizing. A new PFM analysis code, which evaluates failure probabilities at weld lines in piping aged by stress corrosion cracking, has been developed by JAEA. The models were introduced into the code, and failure probabilities at a weld line were evaluated under the UT models. (author)
Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits
DEFF Research Database (Denmark)
Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo
2013-01-01
concentrates on longevity studies. The framework presented allows to combine models based on continuous time with models based on discrete time in a joint analysis. The continuous time models are approximations of the frailty model in which the hazard function will be assumed to be piece-wise constant. The...
Multidimensional Data Modeling for Business Process Analysis
Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.
The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.
Analysis of a computer model of emotions
Moffat, D.; Frijda, N.H.; Phaf, R.H.
1993-01-01
In the fields of psychology, AI, and philosophy there has recently been theoretical activity in the cognitively-based modelling of emotions. Using AI methodology it is possible to implement and test these complex models, and in this paper we examine an emotion model called ACRES. We propose a set of requirements any such model should satisfy, and compare ACRES against them. Then, analysing its behaviour in detail, we formulate more requirements and criteria that can be applied to future compu...
Multiattribute shopping models and ridge regression analysis
Timmermans, HJP Harry
1981-01-01
Policy decisions regarding retailing facilities essentially involve multiple attributes of shopping centres. If mathematical shopping models are to contribute to these decision processes, their structure should reflect the multiattribute character of retailing planning. Examination of existing models shows that most operational shopping models include only two policy variables. A serious problem in the calibration of the existing multiattribute shopping models is that of multicollinearity ari...
Analysis on Some of Software Reliability Models
Institute of Scientific and Technical Information of China (English)
Anonymous
2001-01-01
The Software Reliability & Maintainability Evaluation Tool (SRMET 3.0), developed by the Software Evaluation and Test Center of the China Aerospace Mechanical Corporation, is introduced in detail in this paper. SRMET 3.0 is supported by seven software reliability models and four software maintainability models. Numerical characteristics of all these models are studied in depth, and the corresponding numerical algorithms for each model are also given.
Analysis and modeling of parking behavior
Institute of Scientific and Technical Information of China (English)
Anonymous
2001-01-01
This paper analyzes the spatial structure of parking behavior and establishes a basic parking behavior model to represent the parking problem in a downtown area. It then establishes a parking pricing model to analyze the parking equilibrium under a positive parking fee, and uses a paired combinatorial logit model to analyze the effect of integrated trip cost on parking behavior. Empirical results show that the parking behavior model performs well.
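The choice model underlying the analysis can be sketched in its simplest form: the paired combinatorial logit used in the paper generalizes the multinomial logit below by allowing correlation between pairs of alternatives. The parking options and utility values here are purely illustrative assumptions.

```python
import math

# Multinomial logit: choice probability proportional to exp(utility).
def logit_probabilities(utilities):
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical systematic utilities for three parking options,
# e.g. combining parking fee and walking time into a single cost
utilities = [-2.0, -1.0, -1.5]
probs = logit_probabilities(utilities)
for name, p in zip(["on-street", "garage", "park-and-ride"], probs):
    print(f"{name}: {p:.3f}")
```

Raising the parking fee for one option lowers its utility and shifts probability mass to the others, which is the mechanism the pricing model exploits.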
Likelihood analysis of the I(2) model
DEFF Research Database (Denmark)
Johansen, Søren
1997-01-01
The I(2) model is defined as a submodel of the general vector autoregressive model, by two reduced rank conditions. The model describes stochastic processes with stationary second difference. A parametrization is suggested which makes likelihood inference feasible. Consistency of the maximum...
The Model and Analysis of Mechatronics Systems
Bačkys, Gediminas
2004-01-01
Describes the modeling of mechatronic systems using software that works with a PLC and simulates real equipment. The system includes several sample models with pneumatic elements and a model with an analog device, together with educational materials for students. The software has been used for teaching at university and college level.
Projected principal component analysis in factor models
Fan, Jianqing; Liao, Yuan; Wang, Weichen
2014-01-01
This paper introduces a Projected Principal Component Analysis (Projected-PCA), which applies principal component analysis to the data matrix projected (smoothed) onto a given linear space spanned by covariates. When applied to high-dimensional factor analysis, the projection removes noise components. We show that the unobserved latent factors can be estimated more accurately than with conventional PCA if the projection is genuine, or more precisely, when the factor loading matrices are r...
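A minimal sketch of the projection-then-PCA idea described above, using hypothetical dimensions and synthetic data (this is not the paper's full estimator or its asymptotic machinery):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 200, 50, 3                    # observations, variables, covariates (hypothetical)
X = rng.standard_normal((n, p))         # data matrix
W = rng.standard_normal((n, k))         # observed covariates spanning the projection space

# Projection onto the column space of W: P = W (W'W)^{-1} W'
P = W @ np.linalg.solve(W.T @ W, W.T)
X_proj = P @ X                          # smoothed data; idiosyncratic noise is damped

# Ordinary PCA (via SVD) on the projected, centered matrix
Xc = X_proj - X_proj.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
factors = U[:, :2] * s[:2]              # leading estimated factors
```

The key step is that PCA operates on P @ X rather than X, so components orthogonal to the covariate space are filtered out before factor extraction.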
Evaluation of Thermal Margin Analysis Models for SMART
Energy Technology Data Exchange (ETDEWEB)
Seo, Kyong Won; Kwon, Hyuk; Hwang, Dae Hyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2011-05-15
The thermal margin of SMART is analyzed by three different methods. The first is subchannel analysis with the MATRA-S code, which serves as reference data for the other two methods. The second is an on-line few-channel analysis by the FAST code, to be integrated into SCOPS/SCOMS. The last is a single-channel module analysis used in safety analysis. Several thermal margin analysis models for the SMART reactor core based on subchannel analysis were set up and tested. We adopted a single-stage analysis strategy for the thermal analysis of the SMART reactor core. The model should represent the characteristics of the SMART core, including the hot channel, and should be as simple as possible so that it can be evaluated within reasonable time and cost.
Practical Use of Computationally Frugal Model Analysis Methods.
Hill, Mary C; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen
2016-03-01
Three challenges compromise the utility of mathematical models of groundwater and other environmental systems: (1) a dizzying array of model analysis methods and metrics makes it difficult to compare evaluations of model adequacy, sensitivity, and uncertainty; (2) the high computational demands of many popular model analysis methods (requiring 1000s, 10,000s, or more model runs) make them difficult to apply to complex models; and (3) many models are plagued by unrealistic nonlinearities arising from the numerical model formulation and implementation. This study proposes a strategy to address these challenges through a careful combination of model analysis and implementation methods. In this strategy, computationally frugal model analysis methods (often requiring a few dozen parallelizable model runs) play a major role, and computationally demanding methods are used for problems where (relatively) inexpensive diagnostics suggest the frugal methods are unreliable. We also argue in favor of detecting and, where possible, eliminating unrealistic model nonlinearities; this increases the realism of the model itself and facilitates the application of frugal methods. Literature examples are used to demonstrate the use of frugal methods and associated diagnostics. We suggest that the strategy proposed in this paper would allow the environmental sciences community to achieve greater transparency and falsifiability of environmental models, and obtain greater scientific insight from ongoing and future modeling efforts. PMID:25810333
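As an illustrative example of a frugal method in the spirit described above (the toy model and parameter values are hypothetical, not from the paper), one-at-a-time finite differences yield dimensionless scaled sensitivities from only n + 1 model runs:

```python
import numpy as np

def model(params):
    """Toy stand-in for an expensive environmental model (hypothetical)."""
    k, s = params
    return k * np.exp(-s) + s ** 2

def scaled_sensitivities(f, params, rel_step=0.01):
    """One-at-a-time finite differences: n + 1 model runs for n parameters.
    Returns the base run and dimensionless sensitivities (df/dp) * p / f."""
    base = f(params)
    sens = []
    for i, p in enumerate(params):
        perturbed = list(params)
        perturbed[i] = p * (1 + rel_step)
        deriv = (f(perturbed) - base) / (p * rel_step)
        sens.append(deriv * p / base)
    return base, sens

base, sens = scaled_sensitivities(model, [2.0, 0.5])
```

Such local, scaled measures are exactly the kind of inexpensive diagnostic the strategy uses before committing to 1000s-of-runs global methods.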
Solar Advisor Model; Session: Modeling and Analysis (Presentation)
Energy Technology Data Exchange (ETDEWEB)
Blair, N.
2008-04-01
This project supports the Solar America Initiative by: (1) providing a consistent framework for analyzing and comparing power system costs and performance across the range of solar technologies (PV, solar heating systems, CSP) and markets (residential, commercial, and utility); (2) developing and validating performance models to enable accurate calculation of levelized cost of energy (LCOE); (3) providing a consistent modeling platform for all TPP's; and (4) supporting implementation and usage of cost models.
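A simplified sketch of the kind of LCOE calculation such performance models enable, using the standard discounted-costs-over-discounted-energy definition (all inputs below are hypothetical, not SAM's actual defaults):

```python
def lcoe(capital_cost, annual_om, annual_energy_kwh, discount_rate, lifetime_years):
    """Levelized cost of energy: discounted lifetime costs over discounted
    lifetime energy (simplified; ignores degradation, incentives, and taxes)."""
    disc = [(1.0 + discount_rate) ** -t for t in range(1, lifetime_years + 1)]
    costs = capital_cost + sum(annual_om * d for d in disc)
    energy = sum(annual_energy_kwh * d for d in disc)
    return costs / energy

# Hypothetical residential PV system: $4000 installed, $50/yr O&M,
# 1500 kWh/yr, 6% discount rate, 25-year life -> about $0.24/kWh
value = lcoe(4000.0, 50.0, 1500.0, 0.06, 25)
```

Discounting both the cost and energy streams at the same rate is what makes LCOE values comparable across technologies with very different capital/operating cost splits.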
Modelling Immune System: Principles, Models, Analysis and Perspectives
Institute of Scientific and Technical Information of China (English)
Xiang-hua Li; Zheng-xuan Wang; Tian-yang Lu; Xiang-jiu Che
2009-01-01
The biological immune system is a complex adaptive system. There are many benefits to building models of the immune system. Biological researchers can test hypotheses about the infection process or simulate the responses to certain drugs; computer researchers can build distributed, robust, and fault-tolerant networks inspired by the functions of the immune system. This paper provides a comprehensive survey of the literature on modelling the immune system. From a methodological perspective, the paper compares and analyzes the existing approaches and models, and highlights where research effort on immune models is likely to focus in the next few years.
Modeling Composite Laminate Crushing for Crash Analysis
Fleming, David C.; Jones, Lisa (Technical Monitor)
2002-01-01
Crash modeling of composite structures remains limited in application and has not been effectively demonstrated as a predictive tool. While the global response of composite structures may be well modeled, when composite structures act as energy-absorbing members through direct laminate crushing the modeling accuracy is greatly reduced. The most efficient composite energy-absorbing structures, in terms of energy absorbed per unit mass, are those that absorb energy through a complex progressive crushing response in which fiber and matrix fractures on a small scale dominate the behavior. Such failure modes simultaneously include delamination of plies, failure of the matrix to produce fiber bundles, and subsequent failure of fiber bundles either in bending or in shear. In addition, the response may include the significant action of friction, both internal (between delaminated plies or fiber bundles) and external (between the laminate and the crushing surface). A figure shows the crushing damage observed in a fiberglass composite tube specimen, illustrating the complexity of the response. To achieve a finite element model of such complex behavior is an extremely challenging problem. A practical crushing model based on detailed modeling of the physical mechanisms of crushing behavior is not expected in the foreseeable future. The present research describes attempts to model composite crushing behavior using a novel hybrid modeling procedure. Experimental testing is done in support of the modeling efforts, and a test specimen is developed to provide data for validating laminate crushing models.
Analysis of nonlinear systems using ARMA [autoregressive moving average] models
International Nuclear Information System (INIS)
While many vibration systems exhibit primarily linear behavior, a significant percentage of the systems encountered in vibration and model testing are mildly to severely nonlinear. Analysis methods for such nonlinear systems are not yet well developed and the response of such systems is not accurately predicted by linear models. Nonlinear ARMA (autoregressive moving average) models are one method for the analysis and response prediction of nonlinear vibratory systems. In this paper we review the background of linear and nonlinear ARMA models, and illustrate the application of these models to nonlinear vibration systems. We conclude by summarizing the advantages and disadvantages of ARMA models and emphasizing prospects for future development. 14 refs., 11 figs
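For the linear case that nonlinear ARMA models generalize, a least-squares AR(p) fit can be sketched as follows (illustrative only; the paper's nonlinear ARMA formulation is not reproduced here, and the simulated data are synthetic):

```python
import numpy as np

def fit_ar(x, order):
    """Least-squares fit of a linear AR(p) model:
    x[t] = a1*x[t-1] + ... + ap*x[t-p] + e[t]."""
    # Column i holds lag i+1 of the series
    X = np.column_stack(
        [x[order - i - 1 : len(x) - i - 1] for i in range(order)]
    )
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Simulate an AR(1) process with coefficient 0.7 and recover it
rng = np.random.default_rng(2)
e = rng.standard_normal(2000)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.7 * x[t - 1] + e[t]
a = fit_ar(x, 1)   # estimate should be near 0.7
```

The nonlinear ARMA models discussed in the paper extend this regression with nonlinear functions of the lagged responses and innovations.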
An Object Extraction Model Using Association Rules and Dependence Analysis
Institute of Scientific and Technical Information of China (English)
无
2001-01-01
Extracting objects from legacy systems is a basic step in a system's object-orientation, improving the maintainability and understandability of the system. A new object extraction model using association rules and dependence analysis is proposed. In this model, data are classified by association rules and the corresponding operations are partitioned by dependence analysis.
Probabilistic Models of Analysis of Loan Activity of Internet Banking
Kondrateva Irina G.; Ostapenko Irina N.
2012-01-01
This article considers the main advantages of electronic banking over traditional banking and methods for the analysis of credit activity. Special attention is given to the probabilistic method of analyzing credit activity in Internet banking, and the bank's activity is modeled on the basis of probabilistic models of credit operations.
Book review: Statistical Analysis and Modelling of Spatial Point Patterns
DEFF Research Database (Denmark)
Møller, Jesper
2009-01-01
Statistical Analysis and Modelling of Spatial Point Patterns by J. Illian, A. Penttinen, H. Stoyan and D. Stoyan. Wiley (2008), ISBN 9780470014912.
Multidimensional data modeling for business process analysis
Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.
2007-01-01
The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging thes...
Multifractal modelling and 3D lacunarity analysis
International Nuclear Information System (INIS)
This study presents a comparative evaluation of lacunarity of 3D grey level models with different types of inhomogeneity. A new method based on the 'Relative Differential Box Counting' was developed to estimate the lacunarity features of grey level volumes. To validate our method, we generated a set of 3D grey level multifractal models with random, anisotropic and hierarchical properties. Our method gives a lacunarity measurement correlated with the theoretical one and allows a better model classification compared with a classical approach.
Finite-Element Modeling For Structural Analysis
Min, J. B.; Androlake, S. G.
1995-01-01
Report presents study of finite-element mathematical modeling as used in analyzing stresses and strains at joints between thin, shell-like components (e.g., ducts) and thicker components (e.g., flanges or engine blocks). First approach uses global/local model to evaluate system. Provides correct total response and correct representation of stresses away from any discontinuities. Second approach involves development of special transition finite elements to model transitions between shells and thicker structural components.
The Modeling Analysis of Huangshan Tourism Data
Hu, Shanfeng; Yan, Xinhu; Zhu, Hongbing
2016-06-01
Tourism is the major industry in Huangshan city. This paper analyzes time series of tourism data for Huangshan from 2000 to 2013. The yearly data set comprises total tourist arrivals, total income, urban resident disposable income per capita, and net income per peasant. A mathematical model based on binomial approximation and the inverse quadratic radial basis function (RBF) is set up to model tourist arrivals; total income, urban resident disposable income per capita, and net income per peasant are modeled as well. It is shown that the established mathematical model can be used to forecast tourism information and support good management of Huangshan tourism.
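A sketch of interpolation with the inverse quadratic RBF mentioned above, using synthetic stand-in data (the paper's actual tourism figures and its binomial-approximation step are not reproduced here):

```python
import numpy as np

def iq_rbf(r, eps=1.0):
    """Inverse quadratic radial basis function phi(r) = 1 / (1 + (eps*r)^2)."""
    return 1.0 / (1.0 + (eps * r) ** 2)

def rbf_fit(x, y, eps=1.0):
    """Solve for weights w such that sum_j w_j * phi(|x_i - x_j|) = y_i."""
    A = iq_rbf(np.abs(x[:, None] - x[None, :]), eps)
    return np.linalg.solve(A, y)

def rbf_eval(x_new, x, w, eps=1.0):
    """Evaluate the fitted RBF expansion at new points."""
    return iq_rbf(np.abs(x_new[:, None] - x[None, :]), eps) @ w

years = np.arange(2000, 2014, dtype=float)
# Synthetic stand-in for yearly arrivals (hypothetical, not the paper's data)
arrivals = 100.0 + 8.0 * (years - 2000) + 3.0 * np.sin(years - 2000)
w = rbf_fit(years, arrivals, eps=0.5)
fitted = rbf_eval(years, years, w, eps=0.5)   # interpolates the data at the nodes
```

Because the inverse quadratic kernel is positive definite, the interpolation system is solvable and the fit passes through every data point; forecasting then amounts to evaluating rbf_eval at future years.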
Evaluating Network Models: A Likelihood Analysis
Wang, Wen-Qiang; Zhou, Tao
2011-01-01
Many models are put forward to mimic the evolution of real networked systems. A well-accepted way to judge their validity is to compare the modeling results with real networks with respect to several structural features. Even for a specific real network, we cannot fairly evaluate the goodness of different models, since there are too many structural features and no criterion for selecting and weighting them. Motivated by studies on link prediction algorithms, we propose a unified method to evaluate network models via comparison of the likelihoods of the currently observed network under different models, with the assumption that the higher the likelihood, the better the model. We test our method on the real Internet at the Autonomous System (AS) level, and the results suggest that the Generalized Linear Preferential (GLP) model outperforms the Tel Aviv Network Generator (Tang), while both models are better than the Barabási-Albert (BA) and Erdős-Rényi (ER) models. Our metho...
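The likelihood comparison idea can be illustrated with the simplest case, the ER model, where the per-edge probability has a closed-form maximum-likelihood estimate (toy graph below; this is not the paper's AS-level data or its treatment of the growth models):

```python
import numpy as np

def er_log_likelihood(adj):
    """Log-likelihood of an undirected simple graph under G(n, p), with p at its
    maximum-likelihood value, the observed edge density (assumes 0 < density < 1)."""
    n = adj.shape[0]
    pairs = n * (n - 1) / 2
    m = np.triu(adj, k=1).sum()           # observed edge count
    p = m / pairs
    return m * np.log(p) + (pairs - m) * np.log(1 - p)

# Toy example: a 4-node path 0-1-2-3 (3 edges out of 6 possible pairs)
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
ll = er_log_likelihood(A)
```

Ranking candidate models then reduces to computing the analogous log-likelihood of the same observed network under each model and picking the largest.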
Mineralogic Model (MM3.0) Analysis Model Report
International Nuclear Information System (INIS)
The purpose of this report is to document the Mineralogic Model (MM), Version 3.0 (MM3.0) with regard to data input, modeling methods, assumptions, uncertainties, limitations and validation of the model results, qualification status of the model, and the differences between Version 3.0 and previous versions. A three-dimensional (3-D) Mineralogic Model was developed for Yucca Mountain to support the analyses of hydrologic properties, radionuclide transport, mineral health hazards, repository performance, and repository design. Version 3.0 of the MM was developed from mineralogic data obtained from borehole samples. It consists of matrix mineral abundances as a function of x (easting), y (northing), and z (elevation), referenced to the stratigraphic framework defined in Version 3.1 of the Geologic Framework Model (GFM). The MM was developed specifically for incorporation into the 3-D Integrated Site Model (ISM). The MM enables project personnel to obtain calculated mineral abundances at any position, within any region, or within any stratigraphic unit in the model area. The significance of the MM for key aspects of site characterization and performance assessment is explained in the following subsections. This work was conducted in accordance with the Development Plan for the MM (CRWMS M and O 2000). The planning document for this Rev. 00, ICN 02 of this AMR is Technical Work Plan, TWP-NBS-GS-000003, Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01 (CRWMS M and O 2000). The purpose of this ICN is to record changes in the classification of input status by the resolution of the use of TBV software and data in this report. Constraints and limitations of the MM are discussed in the appropriate sections that follow. The MM is one component of the ISM, which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components
Mineralogic Model (MM3.0) Analysis Model Report
Energy Technology Data Exchange (ETDEWEB)
C. Lum
2002-02-12
The purpose of this report is to document the Mineralogic Model (MM), Version 3.0 (MM3.0) with regard to data input, modeling methods, assumptions, uncertainties, limitations and validation of the model results, qualification status of the model, and the differences between Version 3.0 and previous versions. A three-dimensional (3-D) Mineralogic Model was developed for Yucca Mountain to support the analyses of hydrologic properties, radionuclide transport, mineral health hazards, repository performance, and repository design. Version 3.0 of the MM was developed from mineralogic data obtained from borehole samples. It consists of matrix mineral abundances as a function of x (easting), y (northing), and z (elevation), referenced to the stratigraphic framework defined in Version 3.1 of the Geologic Framework Model (GFM). The MM was developed specifically for incorporation into the 3-D Integrated Site Model (ISM). The MM enables project personnel to obtain calculated mineral abundances at any position, within any region, or within any stratigraphic unit in the model area. The significance of the MM for key aspects of site characterization and performance assessment is explained in the following subsections. This work was conducted in accordance with the Development Plan for the MM (CRWMS M&O 2000). The planning document for this Rev. 00, ICN 02 of this AMR is Technical Work Plan, TWP-NBS-GS-000003, Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01 (CRWMS M&O 2000). The purpose of this ICN is to record changes in the classification of input status by the resolution of the use of TBV software and data in this report. Constraints and limitations of the MM are discussed in the appropriate sections that follow. The MM is one component of the ISM, which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1
Meta-analysis a structural equation modeling approach
Cheung, Mike W-L
2015-01-01
Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo
An Intelligent Analysis Model for Multisource Volatile Memory
Directory of Open Access Journals (Sweden)
Xiaolu Zhang
2013-09-01
With the rapid development of networks and distributed computing environments, it has become harder for researchers to carry out analysis from only one or a few data sources in persistent-data-oriented approaches, and the same holds for volatile memory analysis. Automated analysis of mass data and action modeling therefore need to be considered in order to report an entire network attack process. Modeling situations with multiple volatile data sources helps to understand and describe both the thinking process of the investigator and the possible action steps of the attacker. This paper presents a game model for multisource volatile data and applies it to the analysis of main memory images, with a definition of space-time features for volatile element information. Abstract modeling allows the lessons gleaned to be applied in performing intelligent analysis, evidence filing, and automated presentation. Finally, a test demo based on the model is presented to illustrate the whole procedure.
Lovejoy, Andrew E.; Hilburger, Mark W.
2013-01-01
This document outlines a Modeling and Analysis Plan (MAP) to be followed by the SBKF analysts. It includes instructions on modeling and analysis formulation and execution, model verification and validation, identifying sources of error and uncertainty, and documentation. The goal of this MAP is to provide a standardized procedure that ensures uniformity and quality of the results produced by the project and corresponding documentation.
Meta-analysis of clinical prediction models
Debray, T.P.A.
2013-01-01
Over the past decades there has been a clear shift from implicit to explicit diagnosis and prognosis. This includes appreciation of clinical (diagnostic and prognostic) prediction models, which is likely to increase with the introduction of fully computerized patient records. Prediction models aim to pro
Wellness Model of Supervision: A Comparative Analysis
Lenz, A. Stephen; Sangganjanavanich, Varunee Faii; Balkin, Richard S.; Oliver, Marvarene; Smith, Robert L.
2012-01-01
This quasi-experimental study compared the effectiveness of the Wellness Model of Supervision (WELMS; Lenz & Smith, 2010) with alternative supervision models for developing wellness constructs, total personal wellness, and helping skills among counselors-in-training. Participants were 32 master's-level counseling students completing their…
Multivariate model for test response analysis
Krishnan, S.; Kerkhoff, H.G.
2010-01-01
A systematic approach to construct an effective multivariate test response model for capturing manufacturing defects in electronic products is described. The effectiveness of the model is demonstrated by its capability in reducing the number of test-points, while achieving the maximal coverage attai
Modeling and motion analysis of autonomous paragliders
Chiara Toglia; Marilena Vendittelli
2010-01-01
This report describes a preliminary study on modeling and control of parafoil and payload systems with the twofold objective of developing tools for automatic testing and classification of parafoils and of devising autonomous paragliders able to accomplish long-range delivery or monitoring tasks. Three different models of decreasing complexity are derived and their accuracy compared by simulation.
Temporal analysis of text data using latent variable models
DEFF Research Database (Denmark)
Mølgaard, Lasse Lohilahti; Larsen, Jan; Goutte, Cyril
2009-01-01
Probabilistic Latent Semantic Analysis (PLSA) approach and a global multiway PLSA method. The analysis indicates that the global analysis method is able to identify relevant trends which are difficult to get using a step-by-step approach. Furthermore we show that inspection of PLSA models with different number...
Analysis and modelization of lightweight structures subjected to impact
Barbero Pozuelo, Enrique; López-Puente, Jorge
2008-01-01
The Mechanics of Advanced Materials research group (Department of Continuum Mechanics and Structural Analysis) of the University Carlos III of Madrid (Spain) offers its experience in the analysis and modelling of the high- and low-velocity impact behaviour of composite structures. Its research focuses on both numerical analysis and non-standard experimental methodologies.
GLOBAL ANALYSIS OF AGRICULTURAL TRADE LIBERALIZATION: ASSESSING MODEL VALIDITY
Hertel, Thomas W.; Keeney, Roman; Valenzuela, Ernesto
2004-01-01
This paper presents a validation experiment of a global CGE trade model widely used for analysis of trade liberalization. We focus on the ability of the model to reproduce price volatility in wheat markets. The literature on model validation is reviewed with an eye towards designing an appropriate methodology for validating large scale CGE models. The validation experiment results indicate that in its current form, the GTAP-AGR model is incapable of reproducing wheat market price volatility a...
Analysis and modeling of solar irradiance variations
Yeo, K L
2014-01-01
A prominent manifestation of the solar dynamo is the 11-year activity cycle, evident in indicators of solar activity, including solar irradiance. Although a relationship between solar activity and the brightness of the Sun had long been suspected, it was only directly observed after regular satellite measurements became available with the launch of Nimbus-7 in 1978. The measurement of solar irradiance from space is accompanied by the development of models aimed at describing the apparent variability by the intensity excess/deficit effected by magnetic structures in the photosphere. The more sophisticated models, termed semi-empirical, rely on the intensity spectra of photospheric magnetic structures generated with radiative transfer codes from semi-empirical model atmospheres. An established example of such models is SATIRE-S (Spectral And Total Irradiance REconstruction for the Satellite era). One key limitation of current semi-empirical models is the fact that the radiant properties of network and faculae a...
Quark model analysis of the Sivers function
Courtoy, A; Scopetta, S; Vento, V
2008-01-01
A formalism is developed aimed at evaluating the Sivers function entering single spin asymmetries. The approach is well suited for calculations which use constituent quark models to describe the structure of the nucleon. A non-relativistic approximation of the scheme is performed to calculate the Sivers function using the Isgur-Karl model. The results we have obtained are consistent with a sizable Sivers effect, with an opposite sign for the u and d flavors. This pattern is in agreement with the one found analysing, in the same model, the impact parameter dependent generalized parton distributions. Although a consistent QCD evolution of the results from the momentum scale of the model to the experimental one is not yet possible, an estimate shows that a reasonable agreement with the available data is obtained once the evolution of the model results is performed.
Analysis of Modeling Parameters on Threaded Screws.
Energy Technology Data Exchange (ETDEWEB)
Vigil, Miquela S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brake, Matthew Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vangoethem, Douglas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-06-01
Assembled mechanical systems often contain a large number of bolted connections. These bolted connections (joints) are integral to the load path for structural dynamics and, consequently, are paramount for calculating a structure's stiffness and energy dissipation properties. However, analysts have not found the optimal method to model these bolted joints appropriately. The complexity of the screw geometry causes issues when generating a mesh of the model. This paper explores different approaches to model a screw-substrate connection. Model parameters such as mesh continuity, node alignment, wedge angles, and thread-to-body element size ratios are examined. The results of this study will give analysts a better understanding of the influence of these parameters and will aid in finding the optimal method to model bolted connections.
Application of dimensional analysis in systems modeling and control design
Balaguer, Pedro
2013-01-01
Dimensional analysis is an engineering tool that is widely applied to numerous engineering problems, but has only recently been applied to control theory and problems such as identification and model reduction, robust control, adaptive control, and PID control. Application of Dimensional Analysis in Systems Modeling and Control Design provides an introduction to the fundamentals of dimensional analysis for control engineers, and shows how they can exploit the benefits of the technique to theoretical and practical control problems.
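A small worked example of the bookkeeping behind dimensional analysis (the pendulum variables below are a standard textbook illustration, not drawn from the book): a monomial of physical quantities is dimensionless exactly when its combined (M, L, T) exponents all vanish.

```python
# (M, L, T) dimension exponents for the variables of a simple pendulum
dims = {
    "period": (0, 0, 1),    # seconds
    "length": (0, 1, 0),    # metres
    "gravity": (0, 1, -2),  # m / s^2
    "mass": (1, 0, 0),      # kilograms
}

def exponent_sum(powers):
    """Combined (M, L, T) exponents of the monomial prod(var ** power)."""
    return tuple(
        sum(powers.get(var, 0) * d[i] for var, d in dims.items())
        for i in range(3)
    )

# The classic pi group g * T^2 / L is dimensionless; mass cannot enter any group
pi_group = {"gravity": 1, "period": 2, "length": -1}
result = exponent_sum(pi_group)   # (0, 0, 0)
```

Collecting all such exponent vectors in the null space of the dimension matrix is the Buckingham pi procedure that underlies the control applications discussed in the book.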
Gentrification and models for real estate analysis
Gianfranco Brusa; Alessandra Armiraglio
2013-01-01
This research proposes a deep analysis of the Milanese real estate market, based on data supplied by three real estate organizations; gentrification appears in some neighborhoods, such as Tortona, Porta Genova, Bovisa, and Isola Garibaldi. The last of these is the subject of the final analysis, a survey of the physical and social state of the area. The survey was carried out in two periods (2003 and 2009) to compare the evolution of gentrification. The results of the surveys have been employed in a simulation by mult...
Sensitivity Analysis of the Gap Heat Transfer Model in BISON.
Energy Technology Data Exchange (ETDEWEB)
Swiler, Laura Painton; Schmidt, Rodney C.; Williamson, Richard (INL); Perez, Danielle (INL)
2014-10-01
This report summarizes the result of a NEAMS project focused on sensitivity analysis of the heat transfer model in the gap between the fuel rod and the cladding used in the BISON fuel performance code of Idaho National Laboratory. Using the gap heat transfer models in BISON, the sensitivity of the modeling parameters and the associated responses is investigated. The study results in a quantitative assessment of the role of various parameters in the analysis of gap heat transfer in nuclear fuel.
European Climate - Energy Security Nexus. A model based scenario analysis
International Nuclear Information System (INIS)
In this research, we provide an overview of the climate-security nexus in the European sector through a model-based scenario analysis with the POLES model. The analysis underlines that under stringent climate policies, Europe enjoys a double dividend: the capacity to develop a new, cleaner energy model, and lower vulnerability to potential shocks on the international energy markets. (authors)
Policy Sensitivity Analysis: simple versus complex fishery models
Moxnes, Erling
2003-01-01
Sensitivity analysis is often used to judge the sensitivity of model behaviour to uncertain assumptions about model formulations and parameter values. Since the ultimate goal of modelling is typically policy recommendation, one may suspect that it is even more useful to test the sensitivity of policy recommendations. A major reason for this is that behaviour sensitivity is not necessarily a reliable predictor of policy sensitivity. Policy sensitivity analysis is greatly simplified if one can ...
Multifractal modelling and 3D lacunarity analysis
Energy Technology Data Exchange (ETDEWEB)
Hanen, Akkari, E-mail: bettaieb.hanen@topnet.t [Laboratoire de biophysique, TIM, Faculte de Medecine (Tunisia); Imen, Bhouri, E-mail: bhouri_imen@yahoo.f [Unite de recherche ondelettes et multifractals, Faculte des sciences (Tunisia); Asma, Ben Abdallah, E-mail: asma.babdallah@cristal.rnu.t [Laboratoire de biophysique, TIM, Faculte de Medecine (Tunisia); Patrick, Dubois, E-mail: pdubois@chru-lille.f [INSERM, U 703, Lille (France); Hedi, Bedoui Mohamed, E-mail: medhedi.bedoui@fmm.rnu.t [Laboratoire de biophysique, TIM, Faculte de Medecine (Tunisia)
2009-09-28
This study presents a comparative evaluation of lacunarity of 3D grey level models with different types of inhomogeneity. A new method based on the 'Relative Differential Box Counting' was developed to estimate the lacunarity features of grey level volumes. To validate our method, we generated a set of 3D grey level multifractal models with random, anisotropic and hierarchical properties. Our method gives a lacunarity measurement correlated with the theoretical one and allows a better model classification compared with a classical approach.
Analysis of Brown camera distortion model
Nowakowski, Artur; Skarbek, Władysław
2013-10-01
Contemporary image acquisition devices introduce optical distortion into images. It results in pixel displacement and therefore needs to be compensated for in many computer vision applications. The distortion is usually described by the Brown distortion model, whose parameters can be included in the camera calibration task. In this paper we describe the original model and its dependencies, and analyze the orthogonality of its decentering distortion component with regard to radius. We also report experiments with the camera calibration algorithm included in the OpenCV library; in particular, the stability of distortion parameter estimation is evaluated.
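A sketch of the Brown model's distortion map in the convention also used by OpenCV, with two radial coefficients and the decentering (tangential) pair (coefficient values in any concrete use would be calibration outputs; the ones implied below are illustrative):

```python
def brown_distort(x, y, k1, k2, p1, p2):
    """Apply the Brown model to normalized image coordinates (x, y):
    radial terms k1, k2 plus decentering (tangential) terms p1, p2."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d
```

With all coefficients zero the map is the identity; calibration estimates (k1, k2, p1, p2) so that undistortion can invert this map numerically.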
q-gram analysis and urn models
Nicodème, Pierre
2003-01-01
Words of fixed size q are commonly referred to as q-grams. We consider the problem of q-gram filtration, a method commonly used to speed up sequence comparison. We are interested in the statistics of the number of q-grams common to two random texts (where multiplicities are not counted) in the non-uniform Bernoulli model. In the exact and dependent model, when omitting border effects, a q-gram in a random sequence depends on the q-1 preceding q-grams. In an approximate and independent model, we...
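Counting the q-grams common to two texts, with multiplicities ignored as in the abstract, can be sketched as:

```python
def qgrams(text, q):
    """Set of distinct q-grams (substrings of length q) of a string;
    using a set discards multiplicities, matching the statistic studied."""
    return {text[i:i + q] for i in range(len(text) - q + 1)}

def common_qgrams(a, b, q):
    """Number of q-grams shared by two texts, as used in q-gram filtration."""
    return len(qgrams(a, q) & qgrams(b, q))
```

For example, common_qgrams("banana", "ananas", 2) counts the shared 2-grams "an" and "na"; filtration uses such counts as a cheap lower-bound screen before running an exact sequence comparison.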
Structural dynamic analysis with generalized damping models analysis
Adhikari , Sondipon
2013-01-01
Since Lord Rayleigh introduced the idea of viscous damping in his classic work "The Theory of Sound" in 1877, it has become standard practice to use this approach in dynamics, covering a wide range of applications from aerospace to civil engineering. However, in the majority of practical cases this approach is adopted more for mathematical convenience than for modeling the physics of vibration damping. Over the past decade, extensive research has been undertaken on more general "non-viscous" damping models and vibration of non-viscously damped systems. This book, along with a related book
Horizontal crash testing and analysis of model flatrols
International Nuclear Information System (INIS)
To assess the behaviour of a full-scale flask and flatrol during a proposed demonstration impact into a tunnel abutment, a mathematical modelling technique was developed and validated. The work was performed at quarter scale and comprised both scale-model tests and mathematical analysis in one and two dimensions. Good agreement between the model test results of the 26.8 m/s (60 mph) abutment impacts and the mathematical analysis validated the modelling techniques. The modelling method may be used with confidence to predict the outcome of the proposed full-scale demonstration. (author)
Flood Progression Modelling and Impact Analysis
DEFF Research Database (Denmark)
Mioc, Darka; Anton, François; Nickerson, B.;
People living in the lower valley of the St. John River, New Brunswick, Canada, frequently experience flooding when the river overflows its banks during spring ice melt and rain. To better prepare the population of New Brunswick for extreme flooding, we developed a new flood prediction model that...... computes floodplain polygons before the flood occurs. This allows emergency managers to assess the impact of the flood before it occurs and to make early decisions for evacuation of the population and flood rescue. This research shows that the use of GIS and LiDAR technologies combined with hydrological...... modelling can significantly improve the decision making and visualization of flood impact needed for emergency planning and flood rescue. Furthermore, the 3D GIS application we developed for modelling flooded buildings and infrastructure provides a better platform for modelling and visualizing flood...
Modeling and analysis of web portals performance
Abdul Rahim, Rahela; Ibrahim, Haslinda; Syed Yahaya, Sharipah Soaad; Khalid, Khairini
2011-10-01
The main objective of this study is to develop a model based on queuing theory at a system level of web portals performance for a university. A system-level performance model views the system being modeled as a 'black box', considering the arrival rate of packets to the portals server and the service rate of the portals server. These two parameters are the key elements for measuring web portals performance metrics such as server utilization, average server throughput, average number of packets in the server, and mean response time. This study assumes an infinite population and a finite queue. The proposed analytical model is simple, easy to define, and fast to interpret, yet still represents the real situation.
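One standard reading of "infinite population and finite queue" with the two parameters above is an M/M/1/K queue (Poisson arrivals, exponential service, at most K packets in the system), for which the cited metrics follow in closed form. A sketch under that assumption (the function name is ours):

```python
def mm1k_metrics(lam, mu, K):
    """Performance metrics of an M/M/1/K queue: arrival rate lam,
    service rate mu, at most K packets in the system.
    Returns (utilization, throughput, avg packets in system,
    mean response time)."""
    rho = lam / mu
    if rho == 1.0:
        probs = [1.0 / (K + 1)] * (K + 1)          # uniform when rho = 1
    else:
        norm = (1 - rho) / (1 - rho ** (K + 1))
        probs = [norm * rho ** n for n in range(K + 1)]
    p_block = probs[K]                              # arriving packet dropped
    throughput = lam * (1 - p_block)                # effective arrival rate
    utilization = 1 - probs[0]                      # server busy probability
    avg_in_system = sum(n * p for n, p in enumerate(probs))
    mean_response = avg_in_system / throughput      # Little's law
    return utilization, throughput, avg_in_system, mean_response
```

For example, with lam = mu and K = 1 the system is empty or full with equal probability, so utilization is 0.5 and an accepted packet's mean response time is exactly one service time.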
Complex networks analysis in socioeconomic models
Varela, Luis M; Ausloos, Marcel; Carrete, Jesus
2014-01-01
This chapter aims at reviewing complex networks models and methods that were either developed for or applied to socioeconomic issues, and pertinent to the theme of New Economic Geography. After an introduction to the foundations of the field of complex networks, the present summary adds insights on the statistical mechanical approach, and on the most relevant computational aspects for the treatment of these systems. As the most frequently used model for interacting agent-based systems, a brief description of the statistical mechanics of the classical Ising model on regular lattices is included, together with recent extensions of the same model on small-world Watts-Strogatz and scale-free Barabási-Albert complex networks. Other sections of the chapter are devoted to applications of complex networks to economics, finance, spreading of innovations, and regional trade and developments. The chapter also reviews results involving applications of complex networks to other relevant socioeconomic issues, including res...
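The two ingredients named above, a Watts-Strogatz small-world graph and the Ising model with Metropolis dynamics, can be sketched in pure Python; the adjacency-set representation and function names are our illustrative choices, not the chapter's code:

```python
import math
import random

def watts_strogatz(n, k, beta, rng):
    """Small-world graph: ring lattice with k neighbours per node
    (k even), each ring edge rewired with probability beta.
    Returns a list of adjacency sets."""
    adj = [set() for _ in range(n)]
    edges = [(i, (i + j) % n) for i in range(n) for j in range(1, k // 2 + 1)]
    for u, v in edges:
        if rng.random() < beta:                    # rewire this edge
            w = rng.randrange(n)
            while w == u or w in adj[u]:           # avoid loops/duplicates
                w = rng.randrange(n)
            v = w
        adj[u].add(v)
        adj[v].add(u)
    return adj

def ising_sweep(spins, adj, T, rng):
    """One Metropolis sweep of the Ising model (J = 1, zero field)
    on an arbitrary graph given by adjacency sets."""
    n = len(spins)
    for _ in range(n):
        i = rng.randrange(n)
        dE = 2 * spins[i] * sum(spins[j] for j in adj[i])
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spins[i] = -spins[i]
```

At low temperature an initially aligned configuration stays strongly magnetized, while raising T (or the rewiring probability beta) changes the ordering behaviour, which is the kind of extension to non-regular topologies the chapter reviews.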
Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging
Kiuru Aaro; Kormano Martti; Svedström Erkki; Liang Jianming; Järvi Timo
2003-01-01
The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion ana...
Modeling, Analysis, and Numerics in Electrohydrodynamics
Schmuck, Markus
2008-01-01
The main subject of this thesis is to analyze the incompressible Navier-Stokes-Nernst-Planck-Poisson system for bounded domains. Such a system is used as a model in electrohydrodynamics and in physicochemical models. First, we verify the existence of weak and strong solutions. Moreover, we are able to characterize the weak solutions by an energy and an entropy law. The concentrations in the Nernst-Planck equations are additionally non-negative and bounded. These results motivate the construction of conv...
Tradeoff Analysis for Optimal Multiobjective Inventory Model
Longsheng Cheng; Ching-Shih Tsou; Ming-Chang Lee; Li-Hua Huang; Dingwei Song; Wei-Shan Teng
2013-01-01
The deterministic inventory model, the economic order quantity (EOQ), reveals a tradeoff between inventory carrying and ordering frequency. For probabilistic demand, the tradeoff surface among annual orders, expected inventory, and shortage is useful because it quantifies what the firm must pay in terms of ordering workload and inventory investment to meet the desired customer service. Based on a triobjective inventory model, this paper employs the successive approximation to obtai...
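The EOQ tradeoff mentioned above is the classical balance between annual ordering cost D·S/Q and holding cost H·Q/2, minimized at Q* = sqrt(2DS/H), where the two costs are exactly equal. A sketch in standard textbook notation (not necessarily the paper's symbols):

```python
import math

def eoq(demand, order_cost, holding_cost):
    """Economic order quantity Q* = sqrt(2*D*S/H) and the two costs it
    balances: annual ordering cost D*S/Q and holding cost H*Q/2."""
    q = math.sqrt(2 * demand * order_cost / holding_cost)
    ordering = demand * order_cost / q
    holding = holding_cost * q / 2
    return q, ordering, holding
```

For D = 1200 units/year, S = 100 per order, and H = 6 per unit per year, Q* is 200 units and the two annual cost components are both 600, illustrating the tradeoff relation the abstract refers to.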
Model-driven Engineering for Requirements Analysis
Baudry, Benoit; Nebut, Clementine; Le Traon,Yves
2007-01-01
Requirements engineering (RE) encompasses a set of activities for eliciting, modelling, agreeing, communicating and validating requirements that precisely define the problem domain for a software system. Several tools and methods exist to perform each of these activities, but they mainly remain separate, making it difficult to capture the global consistency of large requirement documents. In this paper we introduce model-driven engineering (MDE) as a possible technical solution to integrate the...
Digital Avionics Information System (DAIS): Training Requirements Analysis Model (TRAMOD).
Czuchry, Andrew J.; And Others
The training requirements analysis model (TRAMOD) described in this report represents an important portion of the larger effort called the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study. TRAMOD is the second of three models that comprise an LCC impact modeling system for use in the early stages of system development. As…
Mathematical Modelling and Experimental Analysis of Early Age Concrete
DEFF Research Database (Denmark)
Hauggaard-Nielsen, Anders Boe
1997-01-01
lead to cracks in the later cooling phase. The material model has intricate couplings between the mechanisms involved, and in the thesis special emphasis is put on the creep behaviour. The mathematical models are based on experimental analysis and numerical implementation of the models in a finite...
Adversarial Scheduling Analysis of Game Theoretic Models of Norm Diffusion
Istrate, Gabriel; Ravi, S S
2008-01-01
In (Istrate, Marathe, Ravi SODA 2001) we advocated investigating the robustness of results in the theory of learning in games under adversarial scheduling models. We provide evidence that such an analysis is feasible and can lead to nontrivial results by investigating, in an adversarial scheduling setting, Peyton Young's model of diffusion of norms. In particular, our main result incorporates into Peyton Young's model.
Integration of Design and Control Through Model Analysis
DEFF Research Database (Denmark)
Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay;
2000-01-01
phenomena models representing the process model identify the relationships between the important process and design variables, which help to understand, define and address some of the issues related to integration of design and control issues. The model analysis is highlighted through examples involving...... processes with mass and/or energy recycle. (C) 2000 Elsevier Science Ltd. All rights reserved....
Comparative Analysis of Uncertainties in Urban Surface Runoff Modelling
DEFF Research Database (Denmark)
Thorndahl, Søren; Schaarup-Jensen, Kjeld
In the present paper a comparison between three different surface runoff models in the numerical urban drainage tool MOUSE is conducted. Analysing parameter uncertainty, it is shown that the models are very sensitive with regard to the choice of hydrological parameters, when combined overflow...... analysis, further research in improved parameter assessment for surface runoff models is needed....
Analysis of multistate models for electromigration failure
Dwyer, V. M.
2010-02-01
The application of a multistate Markov chain is considered as a model of electromigration interconnect degradation and eventual failure. Such a model has already been used [Tan et al., J. Appl. Phys. 102, 103703 (2007)], maintaining that, in general, it leads to a failure distribution described by a gamma mixture, and that as a result, this type of distribution (rather than a lognormal) should be used as a prior in any Bayesian model fitting and subsequent reliability budgeting. Although it appears that the model is able to produce reasonably realistic resistance curves R(t), we are unable to find any evidence that the failure distribution is a simple gamma mixture except under contrived conditions. The distributions generated are largely sums of exponentials (phase-type distributions), convolutions of gamma distributions with different scales, or roughly normal. We note also some inconsistencies in the derivation of the gamma mixture in the work cited above and conclude that, as it stands, the Markov chain model is probably unsuitable for electromigration modeling and a change from lognormal to gamma mixture distribution generally cannot be justified in this way. A hidden Markov model, which describes the interconnect behavior at time t rather than its resistance, in terms of generally observed physical processes such as void nucleating, slitlike growth (where the growth is slow and steady), transverse growth, current shunting (where the resistance jumps in value), etc., seems a more likely prospect, but treating failure in such a manner would still require significant justification.
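The "sums of exponentials (phase-type distributions)" noted above can be illustrated by simulating a purely sequential multistate chain in which state i advances to state i+1 at rate rates[i]; the time to absorption is then a hypoexponential random variable with mean equal to the sum of the reciprocal rates. A sketch (the sequential structure and names are our simplifying assumptions, not Tan et al.'s model):

```python
import random

def failure_time(rates, rng):
    """Time to absorption of a sequential multistate Markov chain in
    which state i moves to state i+1 at rate rates[i].  The total is a
    sum of independent exponentials: a hypoexponential (phase-type)
    distribution, not a simple gamma mixture in general."""
    return sum(rng.expovariate(r) for r in rates)

def mean_failure_time(rates, rng, trials=20000):
    """Monte Carlo estimate of the expected failure time."""
    return sum(failure_time(rates, rng) for _ in range(trials)) / trials
```

For rates (1, 2, 4) the theoretical mean is 1 + 1/2 + 1/4 = 1.75; with distinct rates the resulting density is a convolution of exponentials with different scales, matching the families identified in the analysis above.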