WorldWideScience

Sample records for analysis model gsam

  1. Development of a gas systems analysis model (GSAM)

    Energy Technology Data Exchange (ETDEWEB)

    Godec, M.L. [ICF Resources Inc., Fairfax, VA (United States)]

    1995-04-01

    The objective of developing the Gas Systems Analysis Model (GSAM) is to create a comprehensive, non-proprietary, PC-based model of domestic gas industry activity. The system is capable of assessing the impacts of various changes in the natural gas system within North America. The individual and collective impacts of changes in technology and economic conditions are explicitly modeled in GSAM. All major gas resources are modeled, including conventional, tight, Devonian Shale, coalbed methane, and low-quality gas sources. The modeling system assesses all key components of the gas industry, including available resources, exploration, drilling, completion, production, and processing practices, both current and future. The model similarly assesses the distribution, storage, and utilization of natural gas in a dynamic, market-based analytical structure. GSAM is designed to provide METC managers with a tool to project the impacts of future research, development, and demonstration (RD&D) benefits in order to set priorities in a rapidly changing, market-driven gas industry.

  2. Development of a natural gas systems analysis model (GSAM). Annual report, January 1994--January 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-07-01

    The objective of GSAM development is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the system, including the resource base, exploration and development practices, extraction technology performance and costs, project economics, transportation costs and restrictions, storage, and end-use. The primary focus is the detailed characterization of the resource base at the reservoir and sub-reservoir level. This disaggregation allows direct evaluation of alternative extraction technologies based on discretely estimated, individual well productivity, required investments, and associated operating costs. GSAM's design allows users to evaluate complex interactions of current and alternative future technology and policy initiatives as they directly impact the gas market. Key activities completed during the past year include: a comparative analysis of commercial reservoir databases; licensing and screening of the NRG Associates Significant Oil and Gas Fields of the US reservoir database; development and testing of reduced-form reservoir-model production type curves; full development of database structures for use in GSAM and linkage to other systems; development of a methodology for the exploration module; collection and updating of upstream capital and operating cost parameters; initial integration of the downstream/demand models; presentation of research results at the METC Contractor Review Meeting; other briefings for METC managers, including initiation of the GSAM Environmental Module; and delivery of draft topical reports on technology review, model review, and GSAM methodology.
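
    As a reading aid, the reservoir-level economics described above can be miniaturized: a reduced-form production type curve (here a simple exponential decline) feeding a per-well discounted cash flow. The sketch below is illustrative Python, not GSAM code, and every parameter value (rate, decline, price, costs, discount rate) is invented:

        # Hypothetical sketch: an exponential-decline type curve feeding
        # per-well discounted cash flow; all parameter values are invented.

        def exponential_type_curve(qi, decline, years):
            """Annual production (MMcf/year) starting at qi with constant decline."""
            return [qi * (1.0 - decline) ** t for t in range(years)]

        def well_npv(production, price, opex_per_mcf, capex, discount_rate):
            """Net present value of one well; capex is spent at time zero."""
            npv = -capex
            for t, q in enumerate(production):
                cash = q * 1000.0 * (price - opex_per_mcf)   # MMcf -> Mcf
                npv += cash / (1.0 + discount_rate) ** (t + 1)
            return npv

        production = exponential_type_curve(qi=300.0, decline=0.15, years=15)
        npv = well_npv(production, price=2.50, opex_per_mcf=0.60,
                       capex=1_800_000, discount_rate=0.10)
        print(f"well NPV: ${npv:,.0f}")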

  3. Development of a natural Gas Systems Analysis Model (GSAM)

    International Nuclear Information System (INIS)

    Lacking a detailed characterization of the resource base and a comprehensive borehole-to-burnertip evaluation model of the North American natural gas system, past R&D, tax, and regulatory policies have been formulated without a full understanding of their likely direct and indirect impacts on future gas supply and demand. The recent disappearance of the deliverability surplus, pipeline deregulation, and current policy debates about regulatory initiatives in taxation, environmental compliance, and leasing make the need for a comprehensive gas evaluation system critical. Traditional econometric or highly aggregated energy models are increasingly regarded as unable to incorporate the available geologic detail and the explicit technology performance and costing algorithms necessary to evaluate resource-technology-economic interactions in a market context. The objective of this research is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the natural gas system, including the resource base, exploration and development, extraction technology performance and costs, transportation and storage, and end use. The primary focus is the detailed characterization of the resource base at the reservoir and sub-reservoir level and the impact of alternative extraction technologies on well productivity and economics. GSAM evaluates the complex interactions of current and alternative future technology and policy initiatives in the context of evolving gas markets. Scheduled for completion in 1995, a prototype is planned for early 1994. ICF Resources reviewed relevant natural gas upstream, downstream, and market models to identify appropriate analytic capabilities to incorporate into GSAM. We have also reviewed extraction technologies to better characterize performance and costs in terms of GSAM parameters.

  4. Development of a natural gas systems analysis model (GSAM). Annual report, July 1996--July 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    The objective of GSAM development is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the system, including the resource base, exploration and development practices, extraction technology performance and costs, project economics, transportation costs and restrictions, storage, and end-use. The primary focus is the detailed characterization of the resource base at the reservoir and subreservoir level. This disaggregation allows direct evaluation of alternative extraction technologies based on discretely estimated, individual well productivity, required investments, and associated operating costs. GSAM's design allows users to evaluate complex interactions of current and alternative future technology and policy initiatives as they directly impact the gas market. GSAM development has been ongoing for the past five years. Key activities completed during the past year are described.

  5. Development of a natural gas systems analysis model (GSAM). Annual report, July 1996--July 1997

    International Nuclear Information System (INIS)

    The objective of GSAM development is to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas system. GSAM explicitly evaluates the key components of the system, including the resource base, exploration and development practices, extraction technology performance and costs, project economics, transportation costs and restrictions, storage, and end-use. The primary focus is the detailed characterization of the resource base at the reservoir and subreservoir level. This disaggregation allows direct evaluation of alternative extraction technologies based on discretely estimated, individual well productivity, required investments, and associated operating costs. GSAM's design allows users to evaluate complex interactions of current and alternative future technology and policy initiatives as they directly impact the gas market. GSAM development has been ongoing for the past five years. Key activities completed during the past year are described.

  6. Development of a natural gas systems analysis model (GSAM). Annual report, July 1994--June 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    North American natural gas markets have changed dramatically over the past decade. A competitive, cost-conscious production, transportation, and distribution system has emerged from the highly regulated transportation and wellhead pricing structure of the 1980s. Technology advances have played an important role in the evolution of the gas industry, a role likely to expand substantially as alternative fuel price competition and a maturing natural gas resource base force operators to maximize efficiency. Finally, significant changes continue in regional gas demand patterns, industry practices, and infrastructure needs. As the complexity of the gas system grows, so does the need to evaluate and plan for alternative future resource, technology, and market scenarios. Traditional gas modeling systems focused solely on the econometric aspects of gas marketing. These systems, developed to assess a regulated industry at a high level of aggregation, rely on simple representations of complex and evolving systems, thereby precluding insight into how the industry will change over time. Credible evaluations of specific policy initiatives and research activities require a different approach. Also, the mounting pressure on energy producers from environmental compliance activities requires development of analysis that incorporates relevant geologic, engineering, and project economic details. The objective of policy, research and development (R&D), and market analysis is to integrate fundamental understanding of natural gas resources, technology, and markets to fully describe the potential of the gas resource under alternative future scenarios. This report summarizes work over the past twelve months on DOE Contract DE-AC21-92MC28138, Development of a Natural Gas Systems Analysis Model (GSAM). The products developed under this project directly support the Morgantown Energy Technology Center (METC) in carrying out its natural gas R&D mission.

  7. DEVELOPMENT OF A NATURAL GAS SYSTEMS ANALYSIS MODEL (GSAM) VOLUME I - SUMMARY REPORT VOLUME II - USER'S GUIDE VOLUME IIIA - RP PROGRAMMER'S GUIDE VOLUME IIIB - SRPM PROGRAMMER'S GUIDE VOLUME IIIC - E&P PROGRAMMER'S GUIDE VOLUME IIID - D&I PROGRAMMER'S GUIDE

    Energy Technology Data Exchange (ETDEWEB)

    Unknown

    2001-02-01

    This report summarizes work completed on DOE Contract DE-AC21-92MC28138, Development of a Natural Gas Systems Analysis Model (GSAM). The products developed under this project directly support the National Energy Technology Laboratory (NETL) in carrying out its natural gas R&D mission. The objective of this research effort has been to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas market. GSAM has been developed to explicitly evaluate components of the natural gas system, including the entire in-place gas resource base, exploration and development technologies, extraction technology and performance parameters, transportation and storage factors, and end-use demand issues. The system has been fully tested and calibrated and has been used for multiple natural gas metrics analyses at NETL, in which metrics associated with NETL natural gas upstream R&D technologies and strategies have been evaluated. NETL's Natural Gas Strategic Plan requires that R&D activities be evaluated for their ability to provide adequate supplies of reasonably priced natural gas. GSAM provides the capability to assess potential and ongoing R&D projects using a full fuel cycle, cost-benefit approach. This method yields realistic, market-based assessments of the benefits and costs of alternative or related technology advances. GSAM is capable of estimating both technical and commercial successes, quantifying the potential benefits to the market as well as to other related research. GSAM, therefore, represents an integration of research activities and a method for planning and prioritizing efforts to maximize benefits and minimize costs. Without an analytical tool like GSAM, NETL natural gas upstream R&D activities cannot be appropriately ranked or focused on the most important aspects of natural gas extraction efforts or utilization considerations.

  8. DEVELOPMENT OF A NATURAL GAS SYSTEMS ANALYSIS MODEL (GSAM) VOLUME I - SUMMARY REPORT VOLUME II - USER'S GUIDE VOLUME IIIA - RP PROGRAMMER'S GUIDE VOLUME IIIB - SRPM PROGRAMMER'S GUIDE VOLUME IIIC - E&P PROGRAMMER'S GUIDE VOLUME IIID - D&I PROGRAMMER'S GUIDE

    International Nuclear Information System (INIS)

    This report summarizes work completed on DOE Contract DE-AC21-92MC28138, Development of a Natural Gas Systems Analysis Model (GSAM). The products developed under this project directly support the National Energy Technology Laboratory (NETL) in carrying out its natural gas R&D mission. The objective of this research effort has been to create a comprehensive, non-proprietary, microcomputer model of the North American natural gas market. GSAM has been developed to explicitly evaluate components of the natural gas system, including the entire in-place gas resource base, exploration and development technologies, extraction technology and performance parameters, transportation and storage factors, and end-use demand issues. The system has been fully tested and calibrated and has been used for multiple natural gas metrics analyses at NETL, in which metrics associated with NETL natural gas upstream R&D technologies and strategies have been evaluated. NETL's Natural Gas Strategic Plan requires that R&D activities be evaluated for their ability to provide adequate supplies of reasonably priced natural gas. GSAM provides the capability to assess potential and ongoing R&D projects using a full fuel cycle, cost-benefit approach. This method yields realistic, market-based assessments of the benefits and costs of alternative or related technology advances. GSAM is capable of estimating both technical and commercial successes, quantifying the potential benefits to the market as well as to other related research. GSAM, therefore, represents an integration of research activities and a method for planning and prioritizing efforts to maximize benefits and minimize costs. Without an analytical tool like GSAM, NETL natural gas upstream R&D activities cannot be appropriately ranked or focused on the most important aspects of natural gas extraction efforts or utilization considerations.

  9. A generalized approach to the modeling of the species-area relationship.

    Directory of Open Access Journals (Sweden)

    Katiane Silva Conceição

    This paper proposes a statistical generalized species-area model (GSAM) to represent various patterns of the species-area relationship (SAR), one of the fundamental patterns in ecology. The approach generalizes many earlier models, such as the power-curve model commonly used to describe the SAR mathematically. The GSAM is applied to a simulated data set of species diversity in areas of different sizes and to real-world data on insects of the order Hymenoptera. We show that the GSAM enables identification of the best statistical model and estimates the number of species as a function of area.
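
    For orientation, the classical power-curve SAR that the GSAM generalizes is S = cA^z. The sketch below fits it by ordinary least squares on log-transformed values; the species counts are made up and the snippet is not the authors' generalized model:

        # Fit the power-law SAR, S = c * A**z, by least squares on
        # log-transformed values; the species counts below are made up.
        import math

        areas   = [1, 2, 4, 8, 16, 32, 64]       # sampled areas
        species = [3, 5, 8, 11, 17, 24, 35]      # observed species counts

        xs = [math.log(a) for a in areas]
        ys = [math.log(s) for s in species]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        z = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        c = math.exp(my - z * mx)
        print(f"S ~= {c:.2f} * A^{z:.2f}")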

  10. Measuring the influence of Canadian carbon stabilization programs on natural gas exports to the United States via a 'bottom-up' intertemporal spatial price equilibrium model

    International Nuclear Information System (INIS)

    In this paper, we present the results of a study of the impact of Canadian carbon stabilization programs on exports of natural gas to the United States. This work was based on a study conducted for the US Environmental Protection Agency. The Gas Systems Analysis Model (GSAM), developed by ICF Consulting for the US Department of Energy, was used to gauge the overall impact of the stabilization programs on the North American natural gas market. GSAM is an intertemporal, spatial price equilibrium (SPE) type model of the North American natural gas system. Salient features of this model include characterization of over 17 000 gas production reservoirs, with explicit reservoir-level geologic and economic information used to build up the supply side of the market. On the demand side, four sectors (residential, commercial, industrial, and electric power generation) are characterized in the model. Lastly, both above- and below-ground storage facilities as well as a comprehensive pipeline network are used with the supply and demand side characterizations to arrive at estimates of market-equilibrium prices, quantities, and flows. 35 refs
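
    The SPE mechanics can be illustrated in miniature with two regions, linear supply and demand, and a fixed unit transport cost: at equilibrium, the regional price difference cannot exceed the transport cost. The sketch below is a toy with invented coefficients, not GSAM's 17,000-reservoir formulation:

        # Toy two-region spatial price equilibrium: linear supply/demand in
        # each region, one link with a fixed unit transport cost.  All
        # coefficients are invented.

        def clearing_price(a, b, c, d, net_export):
            # Demand Qd = (a - P) / b, supply Qs = (P - c) / d.
            # Local balance Qs - Qd = net_export is linear in P.
            return (net_export + c / d + a / b) / (1.0 / d + 1.0 / b)

        def equilibrium_flow(r1, r2, tcost, hi=100.0):
            lo = 0.0
            for _ in range(60):               # bisect on the pipeline flow
                f = 0.5 * (lo + hi)
                gap = clearing_price(*r2, -f) - clearing_price(*r1, f) - tcost
                lo, hi = (f, hi) if gap > 0 else (lo, f)
            return 0.5 * (lo + hi)

        region1 = (40.0, 1.0, 5.0, 0.5)    # (a, b, c, d): low-cost producer
        region2 = (60.0, 1.0, 20.0, 0.5)   # higher-cost consuming region
        f = equilibrium_flow(region1, region2, tcost=3.0)
        print(f"flow {f:.2f}: price {clearing_price(*region1, f):.2f} vs "
              f"{clearing_price(*region2, -f):.2f}")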

  11. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example, the Event Summary Data (ESD) stores calorimeter cells and tracking system hits, thereby permitting many calibration and alignment tasks, but will be accessible only at particular computing sites with potentially large latency. In contrast, the Analysis...

  12. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    The risk analysis model developed here supports deciding whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh its implementation costs.

  13. Deeper model endgame analysis

    OpenAIRE

    Andrist, Rafael B.; Haworth, Guy McCrossan

    2005-01-01

    A reference model of Fallible Endgame Play has been implemented and exercised with the chess engine WILHELM. Past experiments have demonstrated the value of the model and the robustness of decisions based on it: the experiments agree well with Markov-model theory. Here, the reference model is exercised on the well-known endgame KBBKN.

  14. Analysis of Business Models

    OpenAIRE

    Slavik Stefan; Bednar Richard

    2014-01-01

    The term business model has been used in practice for a few years, but companies have created, defined, and innovated their models subconsciously since the start of business. Our paper aims to clarify the theory of the business model, i.e., its definition and all the components that form each business. In the second part, we create an analytical tool and analyze real business models in Slovakia, defining the characteristics of each part of the business model, i.e., customers, distribution, value, resour...

  15. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS)

    Science.gov (United States)

    The Exposure Analysis Modeling System (EXAMS), first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals--pesti...

  16. Model Checking as Static Analysis

    DEFF Research Database (Denmark)

    Zhang, Fuyuan

    Both model checking and static analysis are prominent approaches to detecting software errors. Model checking is a successful formal method for verifying properties specified in temporal logics with respect to transition systems. Static analysis is also a powerful method for validating program properties which can predict safe approximations to program behaviors. In this thesis, we have developed several static analysis based techniques to solve model checking problems, aiming at showing the link between static analysis and model checking. We focus on logical approaches to static analysis in a multi-valued setting, and we therefore obtain a multi-valued analysis for temporal properties specified by CTL formulas. In particular, we have shown that the three-valued CTL model checking problem over Kripke modal transition systems can be exactly encoded in three-valued ALFP. Last, we come back to two...
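
    The model checking side of this link can be sketched in its simplest, two-valued form: EF(phi) over a finite transition system is the least fixpoint of Z = phi OR pre(Z). The snippet below is the standard textbook algorithm on an invented four-state system, not the thesis's three-valued ALFP encoding:

        # Two-valued CTL check of EF(goal) on a finite transition system:
        # least fixpoint of Z = goal | pre(Z).  Standard algorithm, not the
        # thesis's three-valued ALFP encoding; the system below is invented.

        transitions = {0: {1}, 1: {2}, 2: {2}, 3: {0}}   # state -> successors
        goal = {2}                                       # states satisfying phi

        def ef(goal, transitions):
            sat = set(goal)
            changed = True
            while changed:                   # iterate to the least fixpoint
                changed = False
                for s, succs in transitions.items():
                    if s not in sat and succs & sat:
                        sat.add(s)
                        changed = True
            return sat

        print(sorted(ef(goal, transitions)))   # states where EF(phi) holds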

  17. Model endgame analysis

    OpenAIRE

    Haworth, Guy McCrossan; Andrist, Rafael B.

    2004-01-01

    A reference model of Fallible Endgame Play has been implemented and exercised with the chess engine WILHELM. Various experiments have demonstrated the value of the model and the robustness of decisions based on it. Experimental results have also been compared with the theoretical predictions of a Markov model of the endgame and found to be in close agreement.
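
    The Markov-model comparison admits a miniature illustration: if a fallible player converts a won position with some per-move probability and occasionally slips into a drawn state, the eventual outcome follows from the chain's absorption probabilities. The transition numbers below are invented, and the snippet is not the WILHELM reference model:

        # Miniature Markov model of fallible endgame play, with invented
        # transition probabilities: per move the player converts a winning
        # position, holds it, or errs into a dead draw.

        P = {
            "winning": {"winning": 0.80, "won": 0.15, "drawn": 0.05},
            "won":     {"won": 1.0},
            "drawn":   {"drawn": 1.0},
        }

        def eventual(P, start, target, steps=1000):
            dist = {s: 0.0 for s in P}
            dist[start] = 1.0
            for _ in range(steps):            # push probability mass forward
                nxt = {s: 0.0 for s in P}
                for s, mass in dist.items():
                    for t, p in P[s].items():
                        nxt[t] += mass * p
                dist = nxt
            return dist[target]

        # Closed form: 0.15 / (0.15 + 0.05) = 0.75
        print(round(eventual(P, "winning", "won"), 3))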

  18. Survival analysis models and applications

    CERN Document Server

    Liu, Xian

    2012-01-01

    Survival analysis concerns sequential occurrences of events governed by probabilistic laws. Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: presents basic techniques before leading on to some of the most advanced topics in survival analysis; assumes only a minimal knowledge of SAS, whilst enabling...
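
    The book's illustrations are in SAS; as a language-neutral sketch of the most basic estimator it covers, here is a Kaplan-Meier product-limit computation on made-up right-censored data:

        # Kaplan-Meier product-limit estimator on made-up right-censored
        # data; each observation is (time, event): event=1 failure, 0 censored.

        data = [(2, 1), (3, 1), (3, 0), (5, 1), (7, 0), (8, 1), (9, 0)]

        def kaplan_meier(data):
            s, curve = 1.0, []
            for t in sorted({t for t, e in data if e == 1}):
                at_risk = sum(1 for u, _ in data if u >= t)
                deaths = sum(1 for u, e in data if u == t and e == 1)
                s *= 1.0 - deaths / at_risk   # survival drops at event times
                curve.append((t, s))
            return curve

        for t, s in kaplan_meier(data):
            print(f"t={t}: S(t)={s:.3f}")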

  19. CMS analysis school model

    International Nuclear Information System (INIS)

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials, and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of its myriad talents in the development of physics, service, upgrade, education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  20. CMS Analysis School Model

    Energy Technology Data Exchange (ETDEWEB)

    Malik, S. [Nebraska U.; Shipsey, I. [Purdue U.; Cavanaugh, R. [Illinois U., Chicago; Bloom, K. [Nebraska U.; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.; D' Hondt, J. [Vrije U., Brussels; Klima, B. [Fermilab; Narain, M. [Brown U.; Palla, F. [INFN, Pisa; Rolandi, G. [CERN; Schörner-Sadenius, T. [DESY

    2014-01-01

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials, and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of its myriad talents in the development of physics, service, upgrade, education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  1. ORGANISATIONAL CULTURE ANALYSIS MODEL

    OpenAIRE

    Mihaela Simona Maracine

    2012-01-01

    The studies and research undertaken have demonstrated the importance of studying organisational culture because of its practical value and its contribution to increasing the organisation's performance. The analysis of the organisational culture's dimensions allows observing human behaviour within the organisation and highlighting reality, identifying the strengths and also the weaknesses which have an impact on its functionality and development. In this paper, we try to...

  2. Multiscale Signal Analysis and Modeling

    CERN Document Server

    Zayed, Ahmed

    2013-01-01

    Multiscale Signal Analysis and Modeling presents recent advances in multiscale analysis and modeling using wavelets and other systems. This book also presents applications in digital signal processing using sampling theory and techniques from various function spaces, filter design, feature extraction and classification, signal and image representation/transmission, coding, nonparametric statistical signal processing, and statistical learning theory. This book also: discusses recently developed signal modeling techniques, such as the multiscale method for complex time series modeling, multiscale positive density estimations, Bayesian shrinkage strategies, and algorithms for data-adaptive statistics; introduces new sampling algorithms for multidimensional signal processing; provides comprehensive coverage of wavelets, with presentations on waveform design and modeling, wavelet analysis of ECG signals, and wavelet filters; and reviews feature extraction and classification algorithms for multiscale signal and image proce...
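
    The simplest instance of the multiscale decompositions surveyed here is a one-level Haar transform: pairwise averages capture the coarse scale and pairwise differences capture detail. A minimal sketch (not taken from the book):

        # One-level Haar wavelet decomposition with exact reconstruction:
        # scaled pairwise sums give the coarse signal, differences the detail.
        from math import sqrt

        def haar_step(x):
            approx = [(a + b) / sqrt(2) for a, b in zip(x[::2], x[1::2])]
            detail = [(a - b) / sqrt(2) for a, b in zip(x[::2], x[1::2])]
            return approx, detail

        def haar_inverse(approx, detail):
            x = []
            for a, d in zip(approx, detail):
                x += [(a + d) / sqrt(2), (a - d) / sqrt(2)]
            return x

        signal = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
        approx, detail = haar_step(signal)
        print(approx, detail)
        print(haar_inverse(approx, detail))   # recovers the original signal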

  3. ROCK PROPERTIES MODEL ANALYSIS MODEL REPORT

    International Nuclear Information System (INIS)

    The purpose of this Analysis and Model Report (AMR) is to document Rock Properties Model (RPM) 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties models are intended principally for use as input to numerical physical-process modeling, such as of ground-water flow and/or radionuclide transport. This work was conducted in accordance with the following planning documents: WA-0344, ''3-D Rock Properties Modeling for FY 1998'' (SNL 1997); WA-0358, ''3-D Rock Properties Modeling for FY 1999'' (SNL 1999); and the technical development plan, Rock Properties Model Version 3.1 (CRWMS M&O 1999c). The Interim Change Notices (ICNs), ICN 02 and ICN 03, of this AMR were prepared as part of activities being conducted under the Technical Work Plan, TWP-NBS-GS-000003, ''Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01'' (CRWMS M&O 2000b). The purpose of ICN 03 is to record changes in data input status due to data qualification and verification activities. These work plans describe the scope, objectives, tasks, methodology, and implementing procedures for model construction. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The work scope for this activity consists of the following: (1) Conversion of the input data (laboratory measured porosity data, x-ray diffraction mineralogy, petrophysical calculations of bound water, and petrophysical calculations of porosity) for each borehole into stratigraphic coordinates; (2) Re-sampling and merging of data sets; (3) Development of geostatistical simulations of porosity; (4

  4. ROCK PROPERTIES MODEL ANALYSIS MODEL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Clinton Lum

    2002-02-04

    The purpose of this Analysis and Model Report (AMR) is to document Rock Properties Model (RPM) 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties models are intended principally for use as input to numerical physical-process modeling, such as of ground-water flow and/or radionuclide transport. This work was conducted in accordance with the following planning documents: WA-0344, ''3-D Rock Properties Modeling for FY 1998'' (SNL 1997); WA-0358, ''3-D Rock Properties Modeling for FY 1999'' (SNL 1999); and the technical development plan, Rock Properties Model Version 3.1 (CRWMS M&O 1999c). The Interim Change Notices (ICNs), ICN 02 and ICN 03, of this AMR were prepared as part of activities being conducted under the Technical Work Plan, TWP-NBS-GS-000003, ''Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01'' (CRWMS M&O 2000b). The purpose of ICN 03 is to record changes in data input status due to data qualification and verification activities. These work plans describe the scope, objectives, tasks, methodology, and implementing procedures for model construction. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The work scope for this activity consists of the following: (1) Conversion of the input data (laboratory measured porosity data, x-ray diffraction mineralogy, petrophysical calculations of bound water, and petrophysical calculations of porosity) for each borehole into stratigraphic coordinates; (2) Re-sampling and merging of data sets; (3

  5. Local models for spatial analysis

    CERN Document Server

    Lloyd, Christopher D

    2006-01-01

    In both the physical and social sciences, there are now available large spatial data sets with detailed local information. Global models for analyzing these data are not suitable for investigating local variations; consequently, local models are the subject of much recent research. Collecting a variety of models into a single reference, Local Models for Spatial Analysis explains in detail a variety of approaches for analyzing univariate and multivariate spatial data. Different models make use of data in unique ways, and this book offers perspectives on various definitions of what constitutes

  6. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se
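
    Of the processes the book treats, the Poisson process is the quickest to sketch: interarrival times are independent exponentials, so a sample path is a running sum of exponential draws. Illustrative parameters, not an example from the book:

        # Homogeneous Poisson process: a sample path is a running sum of
        # exponential interarrival times.  Rate and horizon are invented.
        import random

        def poisson_arrivals(rate, horizon, rng):
            t, arrivals = 0.0, []
            while True:
                t += rng.expovariate(rate)    # exponential interarrival time
                if t > horizon:
                    return arrivals
                arrivals.append(t)

        rng = random.Random(42)
        counts = [len(poisson_arrivals(rate=2.0, horizon=10.0, rng=rng))
                  for _ in range(1000)]
        print(sum(counts) / len(counts))      # close to rate * horizon = 20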

  7. Hydraulic Modeling: Pipe Network Analysis

    OpenAIRE

    Datwyler, Trevor T.

    2012-01-01

    Water modeling is becoming an increasingly important part of hydraulic engineering. One application of hydraulic modeling is pipe network analysis. Using programmed algorithms to repeatedly solve continuity and energy equations, computer software can greatly reduce the amount of time required to analyze a closed conduit system. Such hydraulic models can become a valuable tool for cities to maintain their water systems and plan for future growth. The Utah Division of Drinking Water regulations...
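
    The "repeatedly solve continuity and energy equations" step is classically done with the Hardy Cross method: assume flows that satisfy continuity, then correct each loop by dQ = -sum(h) / sum(dh/dQ) until head losses around the loop cancel. The single-loop sketch below uses made-up pipe coefficients and is not tied to any particular software package:

        # Hardy Cross correction on a single pipe loop.  Head loss per pipe
        # is h = k * Q * |Q|; signs encode assumed direction around the loop.
        # The k-values and initial flows are made up.

        pipes = [(2.0, 3.0), (1.5, 1.0), (3.0, -2.0), (2.5, -1.5)]  # (k, Q0)

        flows = [q for _, q in pipes]
        for _ in range(50):
            head = sum(k * q * abs(q) for (k, _), q in zip(pipes, flows))
            grad = sum(2.0 * k * abs(q) for (k, _), q in zip(pipes, flows))
            dq = -head / grad                 # loop flow correction
            flows = [q + dq for q in flows]
            if abs(dq) < 1e-9:
                break

        print([round(q, 4) for q in flows])   # loop head losses now cancel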

  8. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command process. These models include simulation analysis and probabilistic risk assessment models.

  9. Analysis of aircraft maintenance models

    Directory of Open Access Journals (Sweden)

    Vlada S. Sokolović

    2011-10-01

    This paper addresses several organizational models of aircraft maintenance. All the models presented have been in use in air forces, so the advantages and disadvantages of the different models are known. The paper first shows the current model of aircraft maintenance and its basic characteristics, and then discusses two further organizational models with their advantages and disadvantages. The advantages and disadvantages of the different models are analyzed against the criteria of the operational capabilities of military units. In addition to operational capabilities, the paper presents other criteria which should be taken into account in the evaluation and selection of an optimal model of aircraft maintenance. A qualitative analysis of the models alone may not be sufficient for selecting the optimal maintenance model against the chosen set of operational-capability criteria. In order to choose the optimal model, it is necessary to conduct a detailed economic and technical analysis of the individual tactical maintenance models. A high-quality aircraft maintenance organization requires the involvement of the highest state and army authorities. It is necessary to set clear objectives for all the elements of modern air force technical support programs based on the given evaluation criteria.

  10. Model selection for amplitude analysis

    International Nuclear Information System (INIS)

    Model complexity in amplitude analyses is often a priori under-constrained, since the underlying theory permits a large number of possible amplitudes to contribute to most physical processes. The use of an overly complex model results in reduced predictive power and worse resolution on unknown parameters of interest. Therefore, it is common to reduce the complexity by removing from consideration some subset of the allowed amplitudes. This paper studies a method for limiting model complexity from the data sample itself through regularization during regression in the context of a multivariate (Dalitz-plot) analysis. The regularization technique applied greatly improves the performance. An outline of how to obtain the significance of a resonance in a multivariate amplitude analysis is also provided.
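
    The regularization idea generalizes beyond amplitude analysis; in its simplest linear form, an L1 penalty on the fit coefficients drives unneeded components to zero. The toy below applies coordinate-descent LASSO to synthetic data with five candidate "amplitudes", of which only two are real; it is not the paper's Dalitz-plot likelihood fit:

        # Coordinate-descent LASSO on synthetic data: five candidate
        # "amplitudes", only two truly present; the L1 penalty zeroes the rest.
        import random

        rng = random.Random(0)
        true_w = [1.5, 0.0, -2.0, 0.0, 0.0]
        X = [[rng.gauss(0, 1) for _ in range(5)] for _ in range(200)]
        y = [sum(w * x for w, x in zip(true_w, row)) + rng.gauss(0, 0.1)
             for row in X]

        w = [0.0] * 5
        lam = 10.0                            # regularization strength
        for _ in range(100):
            for j in range(5):
                # correlation of feature j with the partial residual
                rho = sum(X[i][j] * (y[i]
                          - sum(w[k] * X[i][k] for k in range(5))
                          + w[j] * X[i][j]) for i in range(len(X)))
                zj = sum(X[i][j] ** 2 for i in range(len(X)))
                soft = max(abs(rho) - lam, 0.0)   # soft-thresholding
                w[j] = soft / zj if rho > 0 else -soft / zj

        print([round(v, 2) for v in w])       # spurious weights near zero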

  11. Accelerated life models modeling and statistical analysis

    CERN Document Server

    Bagdonavicius, Vilijandas

    2001-01-01

    Contents: Failure Time Distributions (Introduction; Parametric Classes of Failure Time Distributions). Accelerated Life Models (Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary). Accelerated Degradation Models (Introduction; Degradation Models; Modeling the Influence of Explanatory Varia...

  12. Geometric simplification of analysis models

    Energy Technology Data Exchange (ETDEWEB)

    Watterberg, P.A.

    1999-12-01

    Analysis programs have been having to deal with more and more complex objects as the capability to model fine detail increases, which can make them unacceptably slow. This project attempts to find heuristics for removing features from models automatically in order to reduce polygon count. The approach is not one of theoretical completeness but rather one of trying to achieve useful results with scattered practical ideas. By removing a few simple things such as screw holes, slots, chamfers, and fillets, large gains can be realized. Results varied, but a reduction in the number of polygons by a factor of 10 is not unusual.

  13. Marine systems analysis and modeling

    Science.gov (United States)

    Fedra, K.

    1995-03-01

    Oceanography and marine ecology have a considerable history in the use of computers for modeling both physical and ecological processes. With increasing stress on the marine environment due to human activities such as fisheries and numerous forms of pollution, the analysis of marine problems must increasingly and jointly consider physical, ecological, and socio-economic aspects in a broader systems framework that transcends more traditional disciplinary boundaries. This often introduces difficult-to-quantify, “soft” elements, such as values and perceptions, into formal analysis. Thus, the problem domain combines a solid foundation in the physical sciences with strong elements of ecological, socio-economic, and political considerations. At the same time, the domain is also characterized by both a very large volume of some data and an extremely data-poor situation for other variables, as well as a very high degree of uncertainty, partly due to the temporal and spatial heterogeneity of the marine environment. Consequently, marine systems analysis and management require tools that can integrate these diverse aspects into efficient information systems that can support research as well as planning and also policy- and decision-making processes. Supporting scientific research, as well as decision-making processes and the diverse groups and actors involved, requires better access to and direct understanding of the information basis, as well as easy-to-use but powerful tools for analysis. Advanced information technology provides the tools to design and implement smart software where, in a broad sense, the emphasis is on the man-machine interface. Symbolic and analogous, graphical interaction, visual representation of problems, integrated data sources, and built-in domain knowledge can effectively support users of complex and complicated software systems. Integration, interaction, visualization and intelligence are key concepts that are discussed in detail, using an...

  14. Analysis by fracture network modelling

    International Nuclear Information System (INIS)

    This report describes the Fracture Network Modelling and Performance Assessment Support performed by Golder Associates Inc. during the Heisei-11 (1999-2000) fiscal year. The primary objective of the Golder Associates work scope during HY-11 was to provide theoretical and review support to the JNC HY-12 performance assessment effort. In addition, Golder Associates provided technical support to JNC for the Aespoe Project. Major efforts for performance assessment support included analysis of PAWorks pathways and software documentation, verification, and performance assessment visualization. Support for the Aespoe Project included 'Task 4' predictive modelling of sorbing tracer transport in the TRUE-1 rock block, and integrated hydrogeological and geochemical modelling of Aespoe island for 'Task 5'. Technical information about Golder Associates' HY-11 support to JNC is provided in the appendices to this report. (author)

  15. Ventilation Model and Analysis Report

    International Nuclear Information System (INIS)

    This model and analysis report develops, validates, and implements a conceptual model for heat transfer in and around a ventilated emplacement drift. This conceptual model includes thermal radiation between the waste package and the drift wall, convection from the waste package and drift wall surfaces into the flowing air, and conduction in the surrounding host rock. These heat transfer processes are coupled and vary both temporally and spatially, so numerical and analytical methods are used to implement the mathematical equations which describe the conceptual model. These numerical and analytical methods predict the transient response of the system, at the drift scale, in terms of spatially varying temperatures and ventilation efficiencies. The ventilation efficiency describes the effectiveness of the ventilation process in removing radionuclide decay heat from the drift environment. An alternative conceptual model is also developed which evaluates the influence of water and water vapor mass transport on the ventilation efficiency. These effects are described using analytical methods which bound the contribution of latent heat to the system, quantify the effects of varying degrees of host rock saturation (and hence host rock thermal conductivity) on the ventilation efficiency, and evaluate the effects of vapor and enhanced vapor diffusion on the host rock thermal conductivity.
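
    The ventilation efficiency has a simple steady-state reading: the fraction of the generated decay heat that the air stream carries away, eta = m_dot * c_p * (T_out - T_in) / Q_total, with the remainder conducted into the host rock. A back-of-envelope sketch with invented values, not the report's coupled numerical model:

        # Back-of-envelope ventilation efficiency: the fraction of decay
        # heat removed by the air stream.  All values are invented.

        m_dot = 15.0       # air mass flow rate, kg/s
        cp_air = 1006.0    # specific heat of air, J/(kg K)
        t_in, t_out = 25.0, 38.0   # drift inlet/outlet air temperature, C
        q_total = 720e3    # decay heat generated along the drift, W

        q_air = m_dot * cp_air * (t_out - t_in)   # heat advected by the air
        eta = q_air / q_total                     # ventilation efficiency
        print(f"removed {q_air / 1e3:.0f} kW, efficiency {eta:.0%}")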

  16. ANALYSIS MODEL FOR INVENTORY MANAGEMENT

    Directory of Open Access Journals (Sweden)

    CAMELIA BURJA

    2010-01-01

    The inventory represents an essential component of the assets of the enterprise, and economic analysis gives it special importance because its accurate management determines the achievement of the object of activity and the financial results. The efficient management of inventory requires ensuring an optimum level, which will guarantee the normal functioning of the activity with minimum inventory expenses and immobilised funds. The paper presents an analysis model for inventory management based on rotation speed and the correlation with sales volume, illustrated in an adequate study. Highlighting the factors that influence efficient inventory management provides the information needed to justify managerial decisions, which will lead to a balanced financial position and to increased company performance.
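
    The rotation-speed indicator at the core of this model is simple arithmetic: turnover equals cost of goods sold divided by average inventory, and days of stock equals the period length divided by turnover. A sketch with invented figures:

        # Inventory rotation speed and days of stock; figures are invented.

        cost_of_goods_sold = 1_200_000.0   # over the period
        opening_inventory = 180_000.0
        closing_inventory = 220_000.0
        period_days = 365

        avg_inventory = (opening_inventory + closing_inventory) / 2.0
        turnover = cost_of_goods_sold / avg_inventory   # rotations per period
        days_of_stock = period_days / turnover

        print(f"turnover {turnover:.1f}x, {days_of_stock:.0f} days of stock")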

  17. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2002-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures.Distribution System Modeling and Analysis helps prevent those errors. It gives re

  18. Simplified model for DNB analysis

    International Nuclear Information System (INIS)

    In a pressurized water nuclear reactor (PWR), the operating power is restricted by the possibility of departure from nucleate boiling (DNB) in the hottest channel of the core. The present work proposes a simplified model that analyses the thermal-hydraulic conditions of the coolant in the hottest channel of PWRs with the objective of evaluating DNB in this channel. For this, coupling between the hot channel and typical nominal channels is assumed, imposing the existence of a cross flow between these channels such that a uniform axial pressure distribution results along the channels. The model is applied to the Angra-I reactor and the results are compared with those of the Final Safety Analysis Report (FSAR) obtained by Westinghouse through the THINC program, being considered satisfactory. (Author)

  19. Probabilistic Model-Based Safety Analysis

    CERN Document Server

    Güdemann, Matthias; 10.4204/EPTCS.28.8

    2010-01-01

    Model-based safety analysis approaches aim at finding critical failure combinations by analysis of models of the whole system (i.e. software, hardware, failure modes, and environment). The advantage of these methods compared to traditional approaches is that the analysis of the whole system gives more precise results. Only a few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to analysis of specific failure propagation models, limited types of failure modes, or models without system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model-checking now allow this problem to be overcome. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation, and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...

  20. The gmdl Modeling and Analysis System

    OpenAIRE

    Gillblad, Daniel; Holst, Anders; Kreuger, Per; Levin, Björn

    2004-01-01

    This report describes the gmdl modeling and analysis environment. gmdl was designed to provide powerful data analysis, modeling, and visualization with simple, clear semantics and easy-to-use, well-defined syntactic conventions. It provides an extensive set of tools necessary for general data preparation, analysis, and modeling tasks.

  1. Geologic Framework Model Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  2. Geologic Framework Model Analysis Model Report

    International Nuclear Information System (INIS)

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and

  3. Colour model analysis for microscopic image processing

    OpenAIRE

    García-Rojo Marcial; González Jesús; Déniz Oscar; González Roberto; Bueno Gloria

    2008-01-01

    This article presents a comparative study between different colour models (RGB, HSI and CIEL*a*b*) applied to a very large microscopic image analysis. Such analysis of different colour models is needed in order to carry out a successful detection and therefore a classification of different regions of interest (ROIs) within the image. This, in turn, allows both distinguishing possible ROIs and retrieving their proper colour for further ROI analysis. This analysis is not commonly done ...

  4. FAME, the Flux Analysis and Modeling Environment

    OpenAIRE

    Boele Joost; Olivier Brett G; Teusink Bas

    2012-01-01

    Background: The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routine and, in our experience, renders these applications suboptimal for routine use by (systems) biologists. Results: The Flux Analysis and Modeling Environment (FAME) is the first web-based modeling tool that combines th...

  5. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  6. Statistical Modelling of Wind Profiles - Data Analysis and Modelling

    DEFF Research Database (Denmark)

    Jónsson, Tryggvi; Pinson, Pierre

    The aim of the analysis presented in this document is to investigate whether statistical models can be used to make very short-term predictions of wind profiles.

  7. Uncertainty analysis and probabilistic modelling

    International Nuclear Information System (INIS)

    Many factors affect the accuracy and precision of probabilistic assessment. This report discusses sources of uncertainty and ways of addressing them. Techniques for propagating uncertainties in model input parameters through to model prediction are discussed as are various techniques for examining how sensitive and uncertain model predictions are to one or more input parameters. Various statements of confidence which can be made concerning the prediction of a probabilistic assessment are discussed as are several matters of potential regulatory interest. 55 refs
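
    In their simplest form, the propagation techniques discussed here reduce to Monte Carlo sampling: draw the uncertain inputs from their assumed distributions, run the model, and summarize the spread of the predictions. The toy model and input distributions below are invented for illustration:

        # Monte Carlo propagation of input uncertainty through a toy model
        # y = a * exp(-b); both input distributions are invented.
        import math
        import random

        rng = random.Random(1)
        N = 20_000
        ys = sorted(rng.lognormvariate(0.0, 0.3)        # uncertain input a
                    * math.exp(-rng.uniform(0.5, 1.5))  # uncertain input b
                    for _ in range(N))

        print(f"median {ys[N // 2]:.3f}, "
              f"95th percentile {ys[int(0.95 * N)]:.3f}")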

  8. Evolution of Gross Domestic Product - Analysis Models

    OpenAIRE

    Constantin ANGHELACHE; Catalin DEATCU; Daniel DUMITRESCU; Adina Mihaela DINU

    2013-01-01

    This paper describes a use case for macroeconomic models, the objective being the structural analysis of the Gross Domestic Product. The authors first introduce the theoretical foundation of the model, then offer a snapshot of GDP evolution. The econometric models proposed for analysis are designed with the help of EViews software, and their performance and reliability are described through the lens of statistical tests.

  9. Introduction to Models in Spatial Analysis

    OpenAIRE

    Sanders, Lena

    2007-01-01

    The book provides a broad overview of the different types of models used in advanced spatial analysis. The models concern spatial organization, location factors, and spatial interaction patterns from both static and dynamic perspectives. This introductory chapter proposes a discussion of the different meanings given to models in the field of spatial analysis depending on the formalization framework (statistics, GIS, computational approach). Core concepts such as spatial interaction and le...

  10. Sensitivity Analysis in the Model Web

    Science.gov (United States)

    Jones, R.; Cornford, D.; Boukouvalas, A.

    2012-04-01

    The Model Web, and in particular the Uncertainty-enabled Model Web being developed in the UncertWeb project, aims to allow model developers and model users to deploy and discover models exposed as services on the Web. In particular, model users will be able to compose model and data resources to construct and evaluate complex workflows. When discovering such workflows and models on the Web, it is likely that users might not have prior experience of the model behaviour in detail. It would be particularly beneficial if users could undertake a sensitivity analysis of the models and workflows they have discovered and constructed, to allow them to assess the sensitivity to their assumptions and parameters. This work presents a Web-based sensitivity analysis tool which provides computationally efficient sensitivity analysis methods for models exposed on the Web. In particular, the tool is tailored to the UncertWeb profiles for both information models (NetCDF and Observations and Measurements) and service specifications (WPS and SOAP/WSDL). The tool employs emulation technology where this is found to be possible, constructing statistical surrogate models for the models or workflows, to allow very fast variance-based sensitivity analysis. Where models are too complex for emulation to be possible, or evaluate too fast for this to be necessary, the original models are used with a carefully designed sampling strategy. A particular benefit of constructing emulators of the models or workflow components is that within the framework these can be communicated and evaluated at any physical location. The Web-based tool and backend API provide several functions to facilitate the process of creating an emulator and performing sensitivity analysis. A user can select a model exposed on the Web and specify the input ranges. Once this process is complete, they are able to perform screening to discover important inputs, train an emulator, and validate the accuracy of the trained emulator. In
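
    The variance-based indices such a tool computes can be sketched directly: the first-order index of input X_i is S_i = Var(E[Y|X_i]) / Var(Y), which can be estimated by binning Monte Carlo samples on X_i. The toy model below is invented and is not the UncertWeb service:

        # First-order variance-based sensitivity by binning Monte Carlo
        # samples: S_i = Var(E[Y | X_i]) / Var(Y).  Toy model, invented inputs.
        import random

        rng = random.Random(7)
        N, BINS = 50_000, 50
        x1 = [rng.uniform(0, 1) for _ in range(N)]
        x2 = [rng.uniform(0, 1) for _ in range(N)]
        y = [4.0 * a + b + rng.gauss(0, 0.1) for a, b in zip(x1, x2)]

        def variance(v):
            m = sum(v) / len(v)
            return sum((u - m) ** 2 for u in v) / len(v)

        def first_order_index(x, y):
            bins = [[] for _ in range(BINS)]
            for xi, yi in zip(x, y):
                bins[min(int(xi * BINS), BINS - 1)].append(yi)
            means = [sum(b) / len(b) for b in bins if b]
            return variance(means) / variance(y)

        print(f"S1 ~ {first_order_index(x1, y):.2f}, "
              f"S2 ~ {first_order_index(x2, y):.2f}")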

  11. [Dimensional modeling analysis for outpatient payments].

    Science.gov (United States)

    Guo, Yi-zhong; Guo, Yi-min

    2008-09-01

    This paper introduces a data warehouse model for outpatient payments, designed according to the requirements of hospital financial management by combining the dimensional modeling technique with an analysis of those requirements. This data warehouse model can not only improve the accuracy of financial management, but also greatly increase the efficiency and quality of hospital management. PMID:19119657

  12. Qualitative Analysis of Somitogenesis Models

    Directory of Open Access Journals (Sweden)

    Maschke-Dutz E.

    2007-12-01

    Full Text Available Although the properties of a single somite cell oscillator have recently been intensively investigated, the system-level nature of the segmentation clock remains largely unknown. To address this question qualitatively, we examine the possibility of transforming a well-known time-delay somite cell oscillator into a dynamical system of ordinary differential equations that allows qualitative analysis.
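
    As an illustration of the kind of transformation the abstract alludes to, the sketch below replaces the delay in a generic delayed-negative-feedback oscillator with a chain of first-order stages (the "linear chain trick"), which yields an ODE system amenable to standard qualitative analysis. The equation form and every parameter value are assumptions chosen for illustration, not the authors' somite model.

        import numpy as np
        from scipy.integrate import solve_ivp

        # dp/dt = k / (1 + (p(t - tau)/p0)**n) - b*p, with the delay tau
        # approximated by m linear stages: s1 <- p, s2 <- s1, ..., sm ~ p(t - tau)
        k, b, p0, n, tau, m = 1.0, 0.2, 1.0, 4, 10.0, 20

        def rhs(t, y):
            p, chain = y[0], y[1:]
            dp = k / (1 + (chain[-1] / p0) ** n) - b * p
            dchain = (m / tau) * (np.concatenate(([p], chain[:-1])) - chain)
            return np.concatenate(([dp], dchain))

        sol = solve_ivp(rhs, (0, 300), np.full(m + 1, 0.5), max_step=0.1)
        p_late = sol.y[0, -1000:]                  # behaviour after transients
        print("late-time range of p:", p_late.min().round(3), "to", p_late.max().round(3))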

  13. System of systems modeling and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, James E.; Anderson, Dennis James; Longsine, Dennis E. (Intera, Inc., Austin, TX); Shirah, Donald N.

    2005-01-01

    This report documents the results of an LDRD program entitled 'System of Systems Modeling and Analysis' that was conducted during FY 2003 and FY 2004. Systems that themselves consist of multiple systems (referred to here as System of Systems or SoS) introduce a level of complexity to systems performance analysis and optimization that is not readily addressable by existing capabilities. The objective of the 'System of Systems Modeling and Analysis' project was to develop an integrated modeling and simulation environment that addresses the complex SoS modeling and analysis needs. The approach to meeting this objective involved two key efforts. First, a static analysis approach, called state modeling, has been developed that is useful for analyzing the average performance of systems over defined use conditions. The state modeling capability supports analysis and optimization of multiple systems and multiple performance measures or measures of effectiveness. The second effort involves time simulation which represents every system in the simulation using an encapsulated state model (State Model Object or SMO). The time simulation can analyze any number of systems including cross-platform dependencies and a detailed treatment of the logistics required to support the systems in a defined mission.

  14. Modeling and Analysis of SLED

    OpenAIRE

    Li, Lin; Fang, Wen-Cheng; Wang, Chao-Peng; Gu, Qiang

    2014-01-01

    SLED (SLAC Energy Doubler) is a crucial component of the C-band microwave acceleration unit of SXFEL. To study the behavior of SLED, a mathematical model is commonly built and analyzed. In this paper, a new method is proposed to build the model of SLED at SINAP. With this method, the parameters of the two cavities can be analyzed separately. It is also suitable for studying parameter optimization of SLED and analyzing the effect of parameter variations. Simulation results of our method are also pre...

  15. Analysis of radiology business models.

    Science.gov (United States)

    Enzmann, Dieter R; Schomer, Donald F

    2013-03-01

    As health care moves to value orientation, radiology's traditional business model faces challenges to adapt. The authors describe a strategic value framework that radiology practices can use to best position themselves in their environments. This simplified construct encourages practices to define their dominant value propositions. There are 3 main value propositions that form a conceptual triangle, whose vertices represent the low-cost provider, the product leader, and the customer intimacy models. Each vertex has been a valid market position, but each demands specific capabilities and trade-offs. The underlying concepts help practices select value propositions they can successfully deliver in their competitive environments. PMID:23245438

  16. Hypersonic - Model Analysis as a Service

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald

    2014-01-01

    Hypersonic is a Cloud-based tool that proposes a new approach to the deployment of model analysis facilities. It is implemented as a RESTful Web service API offering analysis features such as model clone detection. This approach allows the migration of resource intensive analysis algorithms from...... monolithic desktop modeling tools to a wide range of mobile and Web-based clients. As a technology demonstrator, a Web application acting as a client for the Hypersonic API has been implemented and made publicly available....

  17. Hierarchical modeling and analysis for spatial data

    CERN Document Server

    Banerjee, Sudipto; Gelfand, Alan E

    2003-01-01

    Among the many uses of hierarchical modeling, its application to the statistical analysis of spatial and spatio-temporal data from areas such as epidemiology and environmental science has proven particularly fruitful. Yet to date, the few books that address the subject have been either too narrowly focused on specific aspects of spatial analysis, or written at a level often inaccessible to those lacking a strong background in mathematical statistics. Hierarchical Modeling and Analysis for Spatial Data is the first accessible, self-contained treatment of hierarchical methods, modeling, and dat...

  18. Material modeling and structural analysis with the microplane constitutive model

    Science.gov (United States)

    Brocca, Michele

    The microplane model is a versatile and powerful approach to constitutive modeling in which the stress-strain relations are defined in terms of vectors rather than tensors on planes of all possible orientations. Such planes are called the microplanes and are representative of the microstructure of the material. The microplane model with kinematic constraint has been successfully employed in the past in the modeling of concrete, soils, ice, rocks, fiber composites and other quasibrittle materials. The microplane model provides a powerful and efficient numerical and theoretical framework for the development and implementation of constitutive models for any kind of material. The dissertation presents a review of the background from which the microplane model stems, highlighting differences and similarities with other approaches. The basic structure of the microplane model is then presented, together with its extension to finite strain deformation. To show the effectiveness of the microplane model approach, some examples are given demonstrating applications of microplane models in structural analysis with the finite element method. Some new constitutive models are also introduced for materials characterized by very different properties and microstructures, showing that the approach is indeed very versatile and provides a robust basis for the study of a broad range of problems. New models are introduced for metal plasticity, shape memory alloys and cellular materials. The new models are compared quantitatively with the existing models and experimental data. In particular, the newly introduced microplane models for metal plasticity are compared with the classical J2-flow theory for incremental plasticity. An existing microplane model for concrete is employed in finite element analysis of the 'tube-squash' test, in which concrete undergoes very large deviatoric deformation, and of the size effect in compressive failure of concrete columns. The microplane model for shape

  19. Representing Uncertainty on Model Analysis Plots

    OpenAIRE

    Smith, Trevor I.

    2016-01-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a met...

  20. Influence analysis for the factor analysis model with ranking data.

    Science.gov (United States)

    Xu, Liang; Poon, Wai-Yin; Lee, Sik-Yum

    2008-05-01

    Influence analysis is an important component of data analysis, and the local influence approach has been widely applied to many statistical models to identify influential observations and assess minor model perturbations since the pioneering work of Cook (1986). It is natural to adopt this approach to develop influence analysis procedures for factor analysis models with ranking data. However, as the approach is based on the observed-data likelihood, which involves multidimensional integrals, applying it directly to factor analysis models with ranking data is difficult. To address this difficulty, a Monte Carlo expectation and maximization (MCEM) algorithm is used to obtain the maximum-likelihood estimate of the model parameters, and measures for influence analysis are then obtained on the basis of the conditional expectation of the complete-data log-likelihood at the E-step of the MCEM algorithm. Very little additional computation is needed to compute the influence measures, because it is possible to make use of the by-products of the estimation procedure. Influence measures based on several typical perturbation schemes are discussed in detail, and the proposed method is illustrated with two real examples and an artificial example. PMID:18482479

  1. Representing Uncertainty on Model Analysis Plots

    CERN Document Server

    Smith, Trevor I

    2016-01-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.
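
    A minimal sketch of such a plot, assuming hypothetical class-level probabilities and standard errors (illustrative plotting code only, not Smith's actual procedure for deriving the error bars):

        import numpy as np
        import matplotlib.pyplot as plt

        # hypothetical classes: P(correct model) vs P(dominant incorrect model)
        p_correct = np.array([0.62, 0.35, 0.80])
        p_incorrect = np.array([0.30, 0.55, 0.12])
        se_c, se_i = [0.05, 0.07, 0.04], [0.06, 0.08, 0.03]

        fig, ax = plt.subplots()
        ax.errorbar(p_correct, p_incorrect, xerr=se_c, yerr=se_i, fmt="o", capsize=3)
        ax.plot([0, 1], [1, 0], "k--", lw=0.8)    # boundary where the two probabilities sum to 1
        ax.set(xlim=(0, 1), ylim=(0, 1),
               xlabel="P(correct model)", ylabel="P(incorrect model)")
        plt.show()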

  2. Phrasal Document Analysis for Modeling

    OpenAIRE

    Sojitra, Ritesh D.

    1998-01-01

    Specifications of digital hardware systems are typically written in a natural language. The objective of this research is automatic information extraction from specifications to aid model generation for system level design automation. This is done by automatic extraction of the noun phrases and the verbs from the natural language specification statements. First, the natural language sentences are parsed using a chart parser. Then, a noun phrase and verb extractor scans these charts to obt...

  3. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  4. Two sustainable energy system analysis models

    DEFF Research Database (Denmark)

    Lund, Henrik; Goran Krajacic, Neven Duic; da Graca Carvalho, Maria

    2005-01-01

    This paper presents a comparative study of two energy system analysis models both designed with the purpose of analysing electricity systems with a substantial share of fluctuating renewable energy....

  5. Requirement Analysis: Specify the Environmental Model

    OpenAIRE

    2005-01-01

    This interactive case study illustrates the specification of the environment model in software requirements analysis using the spec language to refine and formalize the requirements in the initial problem statement. Last modified: 5/18/2009

  6. Analysis model of structure-HDS

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Presents the model established for Structure-HDS (hydraulic damper system) analysis on the basis of a theoretical analysis model of incompressible fluid in a round pipe, with a uniform velocity used as the basic variable and pressure losses resulting from cross-section changes of the fluid route taken into consideration. This provides the necessary basis for research on the earthquake response of a structure with a spacious first story equipped with an HDS at the first floor.

  7. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties, views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method used to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independently of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  8. Combustion instability modeling and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Santoro, R.J.; Yang, V.; Santavicca, D.A. [Pennsylvania State Univ., University Park, PA (United States)] [and others]

    1995-10-01

    It is well known that the two key elements for achieving low emissions and high performance in a gas turbine combustor are to simultaneously establish (1) a lean combustion zone for maintaining low NO{sub x} emissions and (2) rapid mixing for good ignition and flame stability. However, these requirements, when coupled with the short combustor lengths used to limit the residence time for NO formation typical of advanced gas turbine combustors, can lead to problems regarding unburned hydrocarbons (UHC) and carbon monoxide (CO) emissions, as well as the occurrence of combustion instabilities. Clearly, the key to successful gas turbine development is based on understanding the effects of geometry and operating conditions on combustion instability, emissions (including UHC, CO and NO{sub x}) and performance. The concurrent development of suitable analytical and numerical models that are validated with experimental studies is important for achieving this objective. A major benefit of the present research will be to provide for the first time an experimentally verified model of emissions and performance of gas turbine combustors.

  9. Credit Risk Evaluation : Modeling - Analysis - Management

    OpenAIRE

    Wehrspohn, Uwe

    2002-01-01

    An analysis and further development of the building blocks of modern credit risk management: definitions of default, estimation of default probabilities, exposures, recovery rates, pricing, concepts of portfolio dependence, time horizons for risk calculations, quantification of portfolio risk, estimation of risk measures, portfolio analysis and portfolio improvement, evaluation and comparison of credit risk models, and analytic portfolio loss distributions. The thesis contributes to the evaluatio...

  10. Analysis of variance for model output

    NARCIS (Netherlands)

    Jansen, M.J.W.

    1999-01-01

    A scalar model output Y is assumed to depend deterministically on a set of stochastically independent input vectors of different dimensions. The composition of the variance of Y is considered; variance components of particular relevance for uncertainty analysis are identified. Several analysis of va

  11. Ignalina NPP Safety Analysis: Models and Results

    International Nuclear Information System (INIS)

    The research directions of the scientific safety analysis group, linked to the safety assessment of the Ignalina NPP, are presented: thermal-hydraulic analysis of accidents and operational transients; thermal-hydraulic assessment of the Ignalina NPP Accident Localization System and other compartments; structural analysis of plant components, piping and other parts of the Main Circulation Circuit; assessment of the RBMK-1500 reactor core; and others. The models and the main work carried out last year are described. (author)

  12. Analysis and evaluation of collaborative modeling processes

    NARCIS (Netherlands)

    Ssebuggwawo, D.

    2012-01-01

    Analysis and evaluation of collaborative modeling processes is confronted with many challenges. On the one hand, many systems design and re-engineering projects require collaborative modeling approaches that can enhance their productivity. But, such collaborative efforts, which often consist of the

  13. Independent Component Analysis in Multimedia Modeling

    DEFF Research Database (Denmark)

    Larsen, Jan

    Modeling of multimedia and multimodal data becomes increasingly important with the digitalization of the world. The objective of this paper is to demonstrate the potential of independent component analysis and blind sources separation methods for modeling and understanding of multimedia data, which...

  14. Independent Component Analysis in Multimedia Modeling

    DEFF Research Database (Denmark)

    Larsen, Jan; Hansen, Lars Kai; Kolenda, Thomas;

    2003-01-01

    Modeling of multimedia and multimodal data becomes increasingly important with the digitalization of the world. The objective of this paper is to demonstrate the potential of independent component analysis and blind sources separation methods for modeling and understanding of multimedia data, which...

  15. FAME, the Flux Analysis and Modeling Environment

    NARCIS (Netherlands)

    Boele, J.; Olivier, B.G.; Teusink, B.

    2012-01-01

    Background

    The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routi

  16. Perturbation analysis of nonlinear matrix population models

    Directory of Open Access Journals (Sweden)

    Hal Caswell

    2008-03-01

    Full Text Available Perturbation analysis examines the response of a model to changes in its parameters. It is commonly applied to population growth rates calculated from linear models, but there has been no general approach to the analysis of nonlinear models. Nonlinearities in demographic models may arise due to density-dependence, frequency-dependence (in 2-sex models), feedback through the environment or the economy, and recruitment subsidy due to immigration, or from the scaling inherent in calculations of proportional population structure. This paper uses matrix calculus to derive the sensitivity and elasticity of equilibria, cycles, ratios (e.g. dependency ratios), age averages and variances, temporal averages and variances, life expectancies, and population growth rates, for both age-classified and stage-classified models. Examples are presented, applying the results to both human and non-human populations.
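
    For the linear special case, the classical result that the sensitivity of the growth rate lambda to entry a_ij is v_i * w_j / <v, w> (with w and v the right and left dominant eigenvectors) can be computed directly. A small sketch with an assumed stage-classified matrix:

        import numpy as np

        A = np.array([[0.0, 1.5, 2.0],      # assumed fertilities
                      [0.5, 0.0, 0.0],      # assumed survival/transition rates
                      [0.0, 0.8, 0.0]])
        vals, vecs = np.linalg.eig(A)
        lam = vals.real.max()                               # dominant eigenvalue
        w = np.abs(vecs[:, vals.real.argmax()].real)        # stable stage structure
        vals_l, vecs_l = np.linalg.eig(A.T)
        v = np.abs(vecs_l[:, vals_l.real.argmax()].real)    # reproductive values

        S = np.outer(v, w) / (v @ w)        # sensitivities d(lambda)/d(a_ij)
        E = (A / lam) * S                   # elasticities (proportional sensitivities)
        print("lambda =", lam.round(3))
        print(np.round(E, 3))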

  17. A Simulation Analysis of Bivariate Availability Models

    OpenAIRE

    Caruso, Elise M.

    2000-01-01

    Equipment behavior is often discussed in terms of age and use. For example, an automobile is frequently referred to as 3 years old with 30,000 miles. Bivariate failure modeling provides a framework for studying system behavior as a function of two variables. This is meaningful when studying the reliability/availability of systems and equipment. This thesis extends work done in the area of bivariate failure modeling. Four bivariate failure models are selected for analysis. The study in...

  18. Formalising responsibility modelling for automatic analysis

    OpenAIRE

    Simpson, Robbie; Storer, Tim

    2015-01-01

    Modelling the structure of socio-technical systems as a basis for informing software system design is a difficult compromise. Formal methods struggle to capture the scale and complexity of the heterogeneous organisations that use technical systems. Conversely, informal approaches lack the rigour needed to inform the software design and construction process or to enable automated analysis. We revisit the concept of responsibility modelling, which models socio-technical systems as a collec...

  19. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...

  20. HVDC dynamic modelling for small signal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yang, X.; Chen, C. [Shanghai Jiaotong Univ. (China). Dept. of Electrical Engineering

    2004-11-01

    The conventional quasi-steady model of HVDC is not able to describe the dynamic switching behaviour of HVDC converters. By means of the sampled-data modelling approach, a linear time-invariant (LTI) small-signal dynamic model is developed for the HVDC main circuit in the synchronous rotating d-q reference frame. The linearised model is validated by time-domain simulation, and it can be seen that the model represents the dynamic response of the static switching circuits to perturbations in operating points. The model is valid for analysing oscillations including high frequency modes such as subsynchronous oscillation (SSO) and high frequency instability. The model is applied in two cases: (i) SSO analysis, where the results are compared with the quasi-steady approach, which has been validated for normal SSO analysis; (ii) high-frequency eigenvalue analysis for the HVDC benchmark system, in which the results of root locus analysis and simulation show that an increased gain of the rectifier DC PI controller may result in high-frequency oscillatory instability. (author)

  1. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports the requirements analysis for ERP projects.

  2. Sensitivity analysis of periodic matrix population models.

    Science.gov (United States)

    Caswell, Hal; Shyu, Esther

    2012-12-01

    Periodic matrix models are frequently used to describe cyclic temporal variation (seasonal or interannual) and to account for the operation of multiple processes (e.g., demography and dispersal) within a single projection interval. In either case, the models take the form of periodic matrix products. The perturbation analysis of periodic models must trace the effects of parameter changes, at each phase of the cycle, on output variables that are calculated over the entire cycle. Here, we apply matrix calculus to obtain the sensitivity and elasticity of scalar-, vector-, or matrix-valued output variables. We apply the method to linear models for periodic environments (including seasonal harvest models), to vec-permutation models in which individuals are classified by multiple criteria, and to nonlinear models including both immediate and delayed density dependence. The results can be used to evaluate management strategies and to study selection gradients in periodic environments. PMID:23316494
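
    A quick numerical check of such phase-specific sensitivities is central finite differences on the annual matrix product. The seasonal matrices below are assumed values, and the finite-difference derivative is a numerical stand-in for the paper's exact matrix-calculus expressions:

        import numpy as np

        # annual matrix A = B3 @ B2 @ B1 over three seasonal phases (assumed values)
        B1 = np.array([[0.8, 0.0], [0.2, 0.9]])
        B2 = np.array([[0.7, 1.1], [0.1, 0.8]])
        B3 = np.array([[0.9, 0.3], [0.0, 0.95]])

        def lam(B1, B2, B3):
            return np.linalg.eigvals(B3 @ B2 @ B1).real.max()

        h, i, j = 1e-6, 0, 1                 # perturb entry (0, 1) of phase B2
        Bp, Bm = B2.copy(), B2.copy()
        Bp[i, j] += h
        Bm[i, j] -= h
        dlam = (lam(B1, Bp, B3) - lam(B1, Bm, B3)) / (2 * h)
        print("d(lambda)/d(B2[0,1]) ~", round(dlam, 4))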

  3. Decision variables analysis for structured modeling

    Institute of Scientific and Technical Information of China (English)

    潘启树; 赫东波; 张洁; 胡运权

    2002-01-01

    Structured modeling is the most commonly used modeling method, but it is not quite adaptive to significant changes in environmental conditions. Therefore, Decision Variables Analysis (DVA), a new modelling method, is proposed to deal with linear programming modeling and changing environments. In variant linear programming, the most complicated relationships are those among decision variables. DVA classifies the decision variables into different levels using different index sets, and divides a model into different elements so that any change can only have its effect on part of the whole model. DVA takes into consideration the complicated relationships among decision variables at different levels, and can therefore successfully solve any modeling problem in dramatically changing environments.

  4. [Integrated model system for environmental policy analysis].

    Science.gov (United States)

    Jiang, Lin

    2006-05-01

    An integrated model system for environmental policy analysis is built up with a Computable General Equilibrium (CGE) model as its core, which is linked with an environmental model, an air dispersion model, and a health effect model (exposure-response functions) in an explicit way; the model system is therefore capable of comprehensively evaluating the effects of policies on environment, health and economy and their interactions. This method is used to analyze the effects of Beijing presumptive (energy) taxes on air quality, health, welfare and economic growth, and the conclusion is that presumptive taxes alone may slow down economic growth, but presumptive taxes with a green tax reform can promote Beijing's sustainable development. PMID:16850855

  5. Model Selection in Data Analysis Competitions

    DEFF Research Database (Denmark)

    Wind, David Kofoed; Winther, Ole

    2014-01-01

    The use of data analysis competitions for selecting the most appropriate model for a problem is a recent innovation in the field of predictive machine learning. Two of the most well-known examples of this trend were the Netflix Competition and recently the competitions hosted on the online platform...... Kaggle. In this paper, we will state and try to verify a set of qualitative hypotheses about predictive modelling, both in general and in the scope of data analysis competitions. To verify our hypotheses we will look at previous competitions and their outcomes, use qualitative interviews with top...

  6. Modeling Controller Tasks for Safety Analysis

    Science.gov (United States)

    Brown, Molly; Leveson, Nancy G.

    1998-01-01

    As control systems become more complex, the use of automated control has increased. At the same time, the role of the human operator has changed from primary system controller to supervisor or monitor. Safe design of the human computer interaction becomes more difficult. In this paper, we present a visual task modeling language that can be used by system designers to model human-computer interactions. The visual models can be translated into SpecTRM-RL, a blackbox specification language for modeling the automated portion of the control system. The SpecTRM-RL suite of analysis tools allow the designer to perform formal and informal safety analyses on the task model in isolation or integrated with the rest of the modeled system.

  7. Analysis of Kelvin Probe Operational Models

    OpenAIRE

    Popescu, Eugeniu M.

    2011-01-01

    We present a study of several models on which Kelvin Probe instruments with flat and spherical tips rely for operation and for the determination of the contact potential difference. Using covariance analysis, we have investigated the precision limits of each model as imposed by the Cramer-Rao bound. Where the situation demanded, we have evaluated the bias introduced by the method in the estimation of the contact potential difference.

  8. Analysis of N Category Privacy Models

    Directory of Open Access Journals (Sweden)

    Marn-Ling Shing

    2012-10-01

    Full Text Available File sharing has become popular in social networking, and the disclosure of private information without the user’s consent can easily be found. Password management is increasingly necessary for maintaining privacy policies. Monitoring of violations of a privacy policy is needed to support the confidentiality of information security. This paper extends the analysis of the two-category confidentiality model to N categories, and illustrates how to use it to monitor the security state transitions in information security privacy modeling.

  9. Structuring multi-criteria portfolio analysis models

    OpenAIRE

    Montibeller, Gilberto; Franco, Alberto; Lord, Ewan; Iglesias, Aline

    2008-01-01

    Multi-Criteria Portfolio Analysis (MCPA) models have been extensively employed as an effective means to allocate scarce resources for investment in projects or services, considering different organisational areas and balancing costs, benefits & risks. However, structuring this type of model in practice is not a trivial task. How should areas be defined? Where should new projects be included? How should one define the criteria to evaluate performance? As far as the authors are aware, there is...

  10. FAME, the Flux Analysis and Modeling Environment

    Directory of Open Access Journals (Sweden)

    Boele Joost

    2012-01-01

    Full Text Available Abstract Background The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routine and, in our experience, renders these applications suboptimal for routine use by (systems) biologists. Results The Flux Analysis and Modeling Environment (FAME) is the first web-based modeling tool that combines the tasks of creating, editing, running, and analyzing/visualizing stoichiometric models into a single program. Analysis results can be automatically superimposed on familiar KEGG-like maps. FAME is written in PHP and uses the Python-based PySCeS-CBM for its linear solving capabilities. It comes with a comprehensive manual and a quick-start tutorial, and can be accessed online at http://f-a-m-e.org/. Conclusions With FAME, we present the community with an open source, user-friendly, web-based "one stop shop" for stoichiometric modeling. We expect the application will be of substantial use to investigators and educators alike.

  11. Review and analysis of biomass gasification models

    DEFF Research Database (Denmark)

    Puig Arnavat, Maria; Bruno, Joan Carles; Coronas, Alberto

    2010-01-01

    The use of biomass as a source of energy has been further enhanced in recent years and special attention has been paid to biomass gasification. Due to the increasing interest in biomass gasification, several models have been proposed in order to explain and understand this complex process, and the...... design, simulation, optimisation and process analysis of gasifiers have been carried out. This paper presents and analyses several gasification models based on thermodynamic equilibrium, kinetics and artificial neural networks. The thermodynamic models are found to be a useful tool for preliminary...

  12. Numerical analysis of the rebellious voter model

    Czech Academy of Sciences Publication Activity Database

    Swart, Jan M.; Vrbenský, Karel

    2010-01-01

    Vol. 140, No. 5 (2010), pp. 873-899. ISSN 0022-4715 R&D Projects: GA ČR GA201/09/1931; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords: rebellious voter model * parity conservation * exactly solvable model * coexistence * interface tightness * cancellative systems * Markov chain Monte Carlo Subject RIV: BA - General Mathematics Impact factor: 1.447, year: 2010 http://library.utia.cas.cz/separaty/2010/SI/swart-numerical analysis of the rebellious voter model.pdf

  13. Modeling and analysis of stochastic systems

    CERN Document Server

    Kulkarni, Vidyadhar G

    2011-01-01

    Based on the author's more than 25 years of teaching experience, Modeling and Analysis of Stochastic Systems, Second Edition covers the most important classes of stochastic processes used in the modeling of diverse systems, from supply chains and inventory systems to genetics and biological systems. For each class of stochastic process, the text includes its definition, characterization, applications, transient and limiting behavior, first passage times, and cost/reward models. Along with reorganizing the material, this edition revises and adds new exercises and examples. New to the second edi

  14. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of the hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We demonstrate the flexibility of the hazard modeling distribution, which can approach different distributions.
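
    The role of the hazard function h(t) = f(t)/S(t) in separating candidate distributions can be sketched briefly; the snippet below contrasts a closed-form Weibull hazard with a simulation-based estimate (the distribution and its parameters are assumptions for illustration):

        import numpy as np

        # Weibull hazard: constant when shape k = 1 (exponential), increasing when k > 1
        k, scale = 1.8, 2.0
        hazard = lambda t: (k / scale) * (t / scale) ** (k - 1)

        # simulation check: estimate the hazard near t0 from 200,000 draws
        rng = np.random.default_rng(1)
        x = scale * rng.weibull(k, 200_000)
        t0, dt = 2.0, 0.05
        h_sim = ((x >= t0) & (x < t0 + dt)).mean() / (dt * (x >= t0).mean())
        print(f"hazard at t0={t0}: closed form {hazard(t0):.3f}, simulated {h_sim:.3f}")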

  15. Power system stability modelling, analysis and control

    CERN Document Server

    Sallam, Abdelhay A

    2015-01-01

    This book provides a comprehensive treatment of the subject from both a physical and mathematical perspective and covers a range of topics including modelling, computation of load flow in the transmission grid, stability analysis under both steady-state and disturbed conditions, and appropriate controls to enhance stability.

  16. Texture Analysis by Means of Model Functions

    OpenAIRE

    Eschner, Th.

    1993-01-01

    The concept of texture components is widely used in texture analysis. Mostly it is used to describe the orientation distribution function (ODF) qualitatively, and there are only a few special functions used to provide texture component calculations. This paper attempts to introduce another model function describing common texture components and offering a compromise between universality and computational efficiency.

  17. Modeling uncertainty in geographic information and analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Uncertainty modeling and data quality for spatial data and spatial analyses are important topics in geographic information science, together with space and time in geography, as well as spatial analysis. In the past two decades, a lot of effort has been made to research uncertainty modeling for spatial data and analyses. This paper presents our work in this research. In particular, four advances in the research are given: (a) from determinedness- to uncertainty-based representation of geographic objects in GIS; (b) from uncertainty modeling for static data to dynamic spatial analyses; (c) from modeling uncertainty for spatial data to models; and (d) from error descriptions to quality control for spatial data.

  18. Computational Models for Analysis of Illicit Activities

    DEFF Research Database (Denmark)

    Nizamani, Sarwat

    devise policies to minimize them. These activities include cybercrimes, terrorist attacks or violent actions in response to certain world issues. Besides such activities, there are several other related activities worth analyzing, for which computational models have been presented in this thesis....... These models include a model for analyzing the evolution of terrorist networks; a text classification model for detecting suspicious text and identifying suspected authors of anonymous emails; and a semantic analysis model for news reports, which may help analyze illicit activities in a certain area...... with location and temporal information. For the network evolution, the hierarchical agglomerative clustering approach has been applied to terrorist networks as case studies. The networks' evolutions show how individual actors who are initially isolated from each other are converted into small groups, which...

  19. Model authoring system for fail safe analysis

    Science.gov (United States)

    Sikora, Scott E.

    1990-01-01

    The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which then are searched to generate either a graphical fault tree analysis or failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.

  20. A Dynamic Model for Energy Structure Analysis

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Energy structure is a complicated system concerning economic development, natural resources, technological innovation, ecological balance, social progress and many other elements. It is not easy to explain clearly the developmental mechanism of an energy system and the mutual relations between the energy system and its related environments by the traditional methods. It is necessary to develop a suitable dynamic model, which can reflect the dynamic characteristics and the mutual relations of the energy system and its related environments. In this paper, the historical development of China's energy structure was analyzed. A new quantitative analysis model was developed based on system dynamics principles through analysis of energy resources, and the production and consumption of energy in China and comparison with the world. Finally, this model was used to predict China's future energy structures under different conditions.

  1. Analysis hierarchical model for discrete event systems

    Science.gov (United States)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. Following the hierarchical approach, the Petri net is analysed as a network spanning the highest conceptual level down to the lowest level of local control, for the modelling and control of complex robotic systems using extended Petri nets. Such a system is structured, controlled and analysed in this paper by using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical model of discrete event systems as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of a subsystem of the global robotic system. Since Petri models are simple to implement on general computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets. Discrete event systems are a pragmatic tool for modelling industrial systems, and Petri nets are used here because the system is a discrete event system. To highlight the timing, the Petri model of the transport stream is divided into hierarchical levels and the sections are analysed successively. The proposed robotic system simulation using timed Petri nets offers the opportunity to view the robot timing. From the transport and transmission times obtained by spot measurements, graphics are obtained showing the average time for the transport activity, using the parameter sets of finished products individually.
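
    The token-game semantics behind such Petri-net models is easy to sketch. The minimal executor below (an illustrative stand-in for the Visual Object Net++ models, with a hypothetical two-transition robot cell) fires enabled transitions at random until the net deadlocks:

        import random

        marking = {"robot_idle": 1, "part_waiting": 1, "robot_busy": 0, "part_done": 0}
        transitions = {  # name: (tokens required per input place, tokens produced per output place)
            "pick":    ({"robot_idle": 1, "part_waiting": 1}, {"robot_busy": 1}),
            "release": ({"robot_busy": 1}, {"robot_idle": 1, "part_done": 1}),
        }

        def enabled(name):
            pre, _ = transitions[name]
            return all(marking[p] >= n for p, n in pre.items())

        def fire(name):
            pre, post = transitions[name]
            for p, n in pre.items():
                marking[p] -= n
            for p, n in post.items():
                marking[p] = marking.get(p, 0) + n

        while any(enabled(t) for t in transitions):
            t = random.choice([t for t in transitions if enabled(t)])
            fire(t)
            print(f"fired {t}: {marking}")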

  2. Ferrofluids: Modeling, numerical analysis, and scientific computation

    Science.gov (United States)

    Tomas, Ignacio

    This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) is a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from Rosensweig's model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model.

  3. To model bolted parts for tolerance analysis using variational model

    Directory of Open Access Journals (Sweden)

    Wilma Polini

    2015-01-01

    Full Text Available Mechanical products are usually made by assembling many parts. Among the different types of links, bolts are widely used to join the components of an assembly. In a bolted joint, a clearance exists between the bolt and the holes of the parts to be joined. This clearance has to be modeled in order to define the possible movements allowed to the joined parts. The model of the clearance takes part in the global model that builds the stack-up functions by accumulating the tolerances applied to the assembly components. The stack-up functions are then solved to evaluate the influence of the tolerances assigned to the assembly components on the functional requirements of the assembled product. The aim of this work is to model the joining of two parts by a planar contact surface and two bolts inside the model that builds and solves the stack-up functions of the tolerance analysis; it adopts the variational solid model. The proposed model uses the simplifying hypothesis that each surface maintains its nominal shape, i.e. the effects of form errors are neglected. The proposed model has been applied to a case study in which the holes have dimensional and positional tolerances, in order to demonstrate its effectiveness.

  4. Identifiability analysis in conceptual sewer modelling.

    Science.gov (United States)

    Kleidorfer, M; Leonhardt, G; Rauch, W

    2012-01-01

    For a sufficient calibration of an environmental model, not only parameter sensitivity but also parameter identifiability is an important issue. In identifiability analysis it is possible to analyse whether changes in one parameter can be compensated by appropriate changes of the other ones within a given uncertainty range. Parameter identifiability is conditional on the information content of the calibration data and consequently conditional on a certain measurement layout (i.e. types of measurements, number and location of measurement sites, temporal resolution of measurements, etc.). Hence the influence of the number and location of measurement sites on the number of identifiable parameters can be investigated. In the present study, identifiability analysis is applied to a conceptual model of a combined sewer system aiming to predict the combined sewer overflow emissions. Different measurement layouts are tested, and it can be shown that only 13 of the most sensitive catchment areas (represented by the model parameter 'effective impervious area') can be identified when overflow measurements of the 20 highest overflows and the runoff to the waste water treatment plant are used for calibration. The main advantage of this method is its very low computational cost, as the number of required model runs equals the total number of model parameters. Hence, this method is a valuable tool when analysing large models with a long runtime and many parameters. PMID:22864432
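
    One common way to formalise "changes in one parameter can be compensated by changes of the other ones" is the collinearity index of Brun et al. (2001), computed from a normalised sensitivity matrix. The sketch below uses a synthetic sensitivity matrix and illustrates the general technique, not necessarily the exact procedure of this study:

        import numpy as np
        from itertools import combinations

        rng = np.random.default_rng(2)
        S = rng.normal(size=(100, 4))                     # stand-in sensitivities (n_obs x n_par)
        S[:, 3] = S[:, 0] + 0.05 * rng.normal(size=100)   # parameter 3 nearly redundant with 0

        Sn = S / np.linalg.norm(S, axis=0)                # normalise each parameter column
        for subset in combinations(range(4), 2):
            M = Sn[:, list(subset)]
            gamma = 1 / np.sqrt(np.linalg.eigvalsh(M.T @ M).min())
            print(subset, f"collinearity index {gamma:.1f}")  # large => poorly identifiable together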

  5. Guideliness for system modeling: fault tree [analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Hwan; Yang, Joon Eon; Kang, Dae Il; Hwang, Mee Jeong

    2004-07-01

    This document, the guidelines for system modeling related to Fault Tree Analysis (FTA), is intended to provide the analyzer with guidelines for constructing fault trees at the level of capability category II of the ASME PRA standard. In particular, it provides the essential and basic guidelines and related contents to be used in support of revising the Ulchin 3 and 4 PSA model for the risk monitor within capability category II of the ASME PRA standard. Normally the main objective of system analysis is to assess the reliability of the systems modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA is the method used for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to the system unavailability. This should include contributions due to the mechanical failures of the components, Common Cause Failures (CCFs), human errors and outages for testing and maintenance. This document identifies and describes the definitions and general procedures of FTA and the essential and basic guidelines for revising the fault trees. Accordingly, the guidelines will be able to guide the FTA to the level of capability category II of the ASME PRA standard.
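
    Under the usual independence assumption for basic events, the AND/OR gate logic maps directly onto probability arithmetic. A toy sketch with hypothetical basic-event probabilities (not taken from the Ulchin model):

        # basic event probabilities (illustrative values)
        p = {"pump_fails": 0.01, "valve_fails": 0.02, "operator_error": 0.005}

        def gate_or(*probs):   # P(at least one event) = 1 - prod(1 - p_i)
            out = 1.0
            for q in probs:
                out *= 1.0 - q
            return 1.0 - out

        def gate_and(*probs):  # P(all events occur) = prod(p_i)
            out = 1.0
            for q in probs:
                out *= q
            return out

        # top event: cooling lost if (pump OR valve fails) AND the operator errs
        top = gate_and(gate_or(p["pump_fails"], p["valve_fails"]), p["operator_error"])
        print(f"P(top event) = {top:.2e}")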

  6. Analysis of mathematical models of radioisotope gauges

    International Nuclear Information System (INIS)

    Radioisotope gauges as industrial sensors are briefly reviewed. Regression models of instruments based on various principles developed in the Institute of Nuclear Research and the Institute of Nuclear Chemistry and Technology were analysed and their mathematical models assessed. It was found that for one-dimensional models the lowest value of the standard error of estimate was achieved when the calibration procedure was modelled by a logarithmic function. Mathematical expressions for the variance and mean value of the intrinsic error for linear and non-linear one-dimensional as well as multi-dimensional models of radioisotope gauges were derived. A conclusion was drawn that the optimal model of the calibration procedure determined by the regression analysis method does not always correspond to the minimum value of the intrinsic error variance. The influence of cutting off the probability distribution function of the measured quantity and its error at the lower and upper limits of the measurement range on the variance and mean value of the intrinsic error was evaluated. A feasibility study for the application of some aspects of Shannon's information theory to the evaluation of mathematical models of radioisotope gauges was accomplished. Its usefulness for the complex evaluation of multidimensional models was confirmed. 105 refs. (author)

  7. Sensitivity analysis of a modified energy model

    International Nuclear Information System (INIS)

    Sensitivity analysis is carried out to validate the model formulation. A modified model has been developed to predict the future energy requirements of coal, oil and electricity, considering price, income, technological and environmental factors. The impact and sensitivity of the independent variables on the dependent variable are analysed. The error distribution pattern in the modified model, as compared to a conventional time series model, indicated the absence of clusters. The residual plot of the modified model showed no distinct pattern of variation. The percentage variation of error in the conventional time series model for coal and oil ranges from -20% to +20%, while for electricity it ranges from -80% to +20%. However, in the case of the modified model the percentage variation in error is greatly reduced: for coal it ranges from -0.25% to +0.15%, for oil from -0.6% to +0.6%, and for electricity from -10% to +10%. The upper and lower limit consumption levels at 95% confidence are determined. The consumption at varying percentage changes in price and population is analysed. The gap between the modified model predictions at varying percentage changes in price and population over the years from 1990 to 2001 is found to be increasing. This is because of the increasing rate of energy consumption over the years, and also because the confidence level decreases as the projection is made farther into the future. (author)

  8. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    Science.gov (United States)

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  9. Social phenomena from data analysis to models

    CERN Document Server

    Perra, Nicola

    2015-01-01

    This book focuses on the new possibilities and approaches to social modeling currently being made possible by an unprecedented variety of datasets generated by our interactions with modern technologies. This area has witnessed a veritable explosion of activity over the last few years, yielding many interesting and useful results. Our aim is to provide an overview of the state of the art in this area of research, merging an extremely heterogeneous array of datasets and models. Social Phenomena: From Data Analysis to Models is divided into two parts. Part I deals with modeling social behavior under normal conditions: How we live, travel, collaborate and interact with each other in our daily lives. Part II deals with societal behavior under exceptional conditions: Protests, armed insurgencies, terrorist attacks, and reactions to infectious diseases. This book offers an overview of one of the most fertile emerging fields bringing together practitioners from scientific communities as diverse as social sciences, p...

  10. Modeling and analysis of calcium bromide hydrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Lottes, Steven A.; Lyczkowski, Robert W.; Panchal, Chandrakant B.; Doctor, Richard D. [Energy Systems Division, Argonne National Laboratory, 9700 S. Cass Avenue, Argonne, IL 60439 (United States)

    2009-05-15

    The main focus of this paper is the modeling, simulation, and analysis of the calcium bromide hydrolysis reactor stage in the calcium-bromine thermochemical water-splitting cycle for nuclear hydrogen production. One reactor concept is to use a spray of calcium bromide into steam, in which the heat of fusion supplies the heat of reaction. Droplet models were built up in a series of steps incorporating various physical phenomena, including droplet flow, heat transfer, phase change, and reaction, separately. Given the large heat reservoir contained in a pool of molten calcium bromide that allows bubbles to rise easily, using a bubble column reactor for the hydrolysis appears to be a feasible and promising alternative to the spray reactor concept. The two limiting cases of bubble geometry, spherical and spherical-cap, are considered in the modeling. Results for both droplet and bubble modeling with COMSOL MULTIPHYSICS trademark are presented, with recommendations for the path forward. (author)

  11. SRMAFTE facility checkout model flow field analysis

    Science.gov (United States)

    Dill, Richard A.; Whitesides, Harold R.

    1992-07-01

    The Solid Rocket Motor Air Flow Equipment (SRMAFTE) facility was constructed for the purpose of evaluating the internal propellant, insulation, and nozzle configurations of solid propellant rocket motor designs. This makes the characterization of the facility's internal flow field very important in assuring that no facility-induced flow field features exist which would corrupt the model-related measurements. In order to verify the design and operation of the facility, a three-dimensional computational flow field analysis was performed on the facility checkout model setup. The checkout model measurement data and the one-dimensional and three-dimensional estimates were compared, and the design and proper operation of the facility were verified. The proper operation of the metering nozzles, adapter chamber transition, model nozzle, and diffuser was verified. The one-dimensional and three-dimensional flow field estimates, along with the available measurement data, are compared.

  12. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated the use of impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there is a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to have a brighter risk reward it is important to control the likelihood of occurrence of risks rather than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student and infrastructure based), and the business impact. Lastly, the study revealed that although business cycles vary considerably depending on the industry and/or the institution, most impacts in the HEI (university) fell within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEI.
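
    The Bayesian component of such an analysis can be sketched with a conjugate update; the prior, incident counts, and impact figure below are purely illustrative assumptions, not the study's data:

        from scipy import stats

        prior_a, prior_b = 2, 18                  # Beta prior: roughly 10% monthly likelihood
        incidents, months = 4, 12                 # observed occurrences over a year
        post = stats.beta(prior_a + incidents, prior_b + months - incidents)

        impact = 250_000                          # assumed monetary impact per occurrence
        print(f"posterior mean likelihood: {post.mean():.3f}")
        print(f"expected monthly impact:   {post.mean() * impact:,.0f}")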

  13. 3D face modeling, analysis and recognition

    CERN Document Server

    Daoudi, Mohamed; Veltkamp, Remco

    2013-01-01

    3D Face Modeling, Analysis and Recognition presents methodologies for analyzing shapes of facial surfaces, develops computational tools for analyzing 3D face data, and illustrates them using state-of-the-art applications. The methodologies chosen are based on efficient representations, metrics, comparisons, and classifications of features that are especially relevant in the context of 3D measurements of human faces. These frameworks have a long-term utility in face analysis, taking into account the anticipated improvements in data collection, data storage, processing speeds, and applications.

  14. Advances in statistical models for data analysis

    CERN Document Server

    Minerva, Tommaso; Vichi, Maurizio

    2015-01-01

    This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and the environment. The papers in this volume were first presented at the 9th biennial meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.

  15. Extrudate Expansion Modelling through Dimensional Analysis Method

    DEFF Research Database (Denmark)

    A new model framework is proposed to correlate extrudate expansion and extrusion operation parameters for a food extrusion cooking process through the dimensional analysis principle, i.e. the Buckingham pi theorem. Three dimensionless groups, i.e. energy, water content and temperature, are suggested to...... describe the extrudate expansion. From the three dimensionless groups, an equation with three experimentally determined parameters is derived to express the extrudate expansion. The model is evaluated with whole wheat flour and aquatic feed extrusion experimental data. The average deviations of the...
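
    A generic sketch of the fitting step such a framework implies: a power-law correlation in three dimensionless groups, linearized by taking logs and fitted by least squares. The synthetic data and the power-law form are assumptions for illustration; the paper's actual groups and equation are not reproduced here.

    ```python
    # Fit expansion = k * P1^a * P2^b * P3^c by log-linear least squares.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 50
    pi_energy, pi_water, pi_temp = rng.uniform(0.5, 2.0, (3, n))   # dimensionless groups
    expansion = 1.8 * pi_energy**0.4 * pi_water**-0.7 * pi_temp**0.2 \
                * rng.lognormal(sigma=0.05, size=n)                # synthetic "measurements"

    # log-transform turns the power law into a linear model: ln E = ln k + a ln P1 + ...
    X = np.column_stack([np.ones(n), np.log(pi_energy), np.log(pi_water), np.log(pi_temp)])
    coef, *_ = np.linalg.lstsq(X, np.log(expansion), rcond=None)
    print("k =", np.exp(coef[0]), "exponents =", coef[1:])
    ```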

  16. Computer modelling for LOCA analysis in PHWRs

    International Nuclear Information System (INIS)

    The computer code THYNAC, developed for the analysis of thermal-hydraulic transient phenomena during a LOCA in a PHWR-type reactor and its primary coolant system, is described. The code predicts the coolant voiding rate in the core, the coolant discharge rate from the break, the primary system depressurization history, and the temperature histories of both fuel and fuel clad. The reactor system is modelled as a set of connected fluid segments which represent piping, feeders, coolant channels, etc. A finite-difference method is used in the code. The modelling of various specific phenomena in the code, e.g. two-phase pressure drop, slip flow, and pumps, is described. (M.G.B.)

  17. Handwriting Analysis with Online Fuzzy Models

    OpenAIRE

    Bouillon, Manuel; Anquetil, Eric

    2015-01-01

    This paper presents the early work, done in the context of the IntuiScript project, on handwriting quality analysis. The IntuiScript project aims at developing a digital workbook to help teach children how to write by hand. To do so, we must be able to analyse their handwriting, to evaluate whether the letters are correctly written, and to detail which aspects of the child's symbols (letters, numbers, and geometric forms) do not correspond to the teacher models. We use an online fuzzy model to e...

  18. Model correction factor method for system analysis

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Johannesen, Johannes M.

    2000-01-01

    The Model Correction Factor Method (MCFM) is an intelligent response surface method based on simplified modeling. MCFM is aimed at reliability analysis in the case of a limit state defined by an elaborate model. Herein it is demonstrated that the method is applicable for elaborate limit state surfaces on which...... clearly defined failure modes, the MCFM can be started from each idealized single-mode limit state in turn to identify a locally most central point on the elaborate limit state surface. Typically this procedure leads to a smaller number of locally most central failure points on the elaborate limit state...

  19. Formal Modeling and Analysis of Timed Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Niebert, Peter

    This book constitutes the thoroughly refereed post-proceedings of the First International Workshop on Formal Modeling and Analysis of Timed Systems, FORMATS 2003, held in Marseille, France, in September 2003. The 19 revised full papers presented together with an invited paper and the abstracts of...... two invited talks were carefully selected from 36 submissions during two rounds of reviewing and improvement. All current aspects of formal methods for modeling and analyzing timed systems are addressed; among the timed systems dealt with are timed automata, timed Petri nets, max-plus algebras, real...

  20. A factor analysis model for functional genomics

    OpenAIRE

    Shioda Romy; Kustra Rafal; Zhu Mu

    2006-01-01

    Background: Expression array data are used to predict biological functions of uncharacterized genes by comparing their expression profiles to those of characterized genes. While biologically plausible, this is both statistically and computationally challenging. Typical approaches are computationally expensive and ignore correlations among expression profiles and functional categories. Results: We propose a factor analysis model (FAM) for functional genomics and give a two-step algorith...
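
    As a rough illustration of the modeling idea (not the paper's specific two-step algorithm), the sketch below runs a generic factor analysis on a synthetic genes-by-arrays expression matrix with scikit-learn; all dimensions and data are invented.

    ```python
    # Generic factor analysis on a synthetic expression matrix: genes sharing
    # latent factors end up close together in factor-score space.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(1)
    n_genes, n_arrays, n_factors = 200, 30, 5
    loadings = rng.normal(size=(n_arrays, n_factors))
    scores = rng.normal(size=(n_genes, n_factors))
    X = scores @ loadings.T + 0.1 * rng.normal(size=(n_genes, n_arrays))

    fa = FactorAnalysis(n_components=n_factors, random_state=0)
    Z = fa.fit_transform(X)     # per-gene factor scores
    print(Z.shape)              # (200, 5): compare gene profiles in factor space
    ```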

  1. Bayesian Analysis of Multivariate Probit Models

    OpenAIRE

    Siddhartha Chib; Edward Greenberg

    1996-01-01

    This paper provides a unified simulation-based Bayesian and non-Bayesian analysis of correlated binary data using the multivariate probit model. The posterior distribution is simulated by Markov chain Monte Carlo methods, and maximum likelihood estimates are obtained by a Markov chain Monte Carlo version of the E-M algorithm. Computation of Bayes factors from the simulation output is also considered. The methods are applied to a bivariate data set, to a 534-subject, four-year longitudinal dat...
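
    The simulation machinery can be illustrated in the univariate special case: the sketch below implements Albert-Chib style data augmentation for a Bayesian probit on synthetic data, the building block that the multivariate sampler generalizes. The prior and data are assumptions for illustration, not the paper's application.

    ```python
    # Gibbs sampling for Bayesian probit via truncated-normal data augmentation.
    import numpy as np
    from scipy.stats import truncnorm

    rng = np.random.default_rng(2)
    n, p = 300, 2
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    beta_true = np.array([-0.5, 1.0])
    y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)

    B0_inv = np.eye(p) / 100.0                    # weak N(0, 100 I) prior
    V = np.linalg.inv(B0_inv + X.T @ X)           # fixed posterior covariance
    beta, draws = np.zeros(p), []
    for it in range(2000):
        mu = X @ beta
        # latent z_i ~ N(mu_i, 1), truncated to (0, inf) if y=1, (-inf, 0) if y=0
        lo = np.where(y == 1, -mu, -np.inf)
        hi = np.where(y == 1, np.inf, -mu)
        z = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
        beta = rng.multivariate_normal(V @ (X.T @ z), V)
        if it >= 500:                             # discard burn-in
            draws.append(beta)
    print("posterior mean:", np.mean(draws, axis=0))
    ```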

  2. ANALYSIS MODEL FOR RETURN ON CAPITAL EMPLOYED

    OpenAIRE

    BURJA CAMELIA

    2013-01-01

    At the microeconomic level, the appreciation of capital profitability is a very complex undertaking which is of interest to stakeholders. The main purpose of this study is to extend the traditional analysis model for capital profitability, based on the ratio “Return on capital employed”. In line with this, the objectives of the work are the identification of factors that exert an influence on the profitability of the capital utilized by a company and the measurement of their contribution in the...

  3. Economic Modeling and Analysis of Educational Vouchers

    OpenAIRE

    Dennis Epple; Richard Romano

    2012-01-01

    The analysis of educational vouchers has evolved from market-based analogies to models that incorporate distinctive features of the educational environment. These distinctive features include peer effects, scope for private school pricing and admissions based on student characteristics, the linkage of household residential and school choices in multidistrict settings, the potential for rent seeking in public and private schools, the role of school reputations, incentives for student effort, a...

  4. Micromechatronics modeling, analysis, and design with Matlab

    CERN Document Server

    Giurgiutiu, Victor

    2009-01-01

    Focusing on recent developments in engineering science, enabling hardware, advanced technologies, and software, Micromechatronics: Modeling, Analysis, and Design with MATLAB®, Second Edition provides clear, comprehensive coverage of mechatronic and electromechanical systems. It applies cornerstone fundamentals to the design of electromechanical systems, covers emerging software and hardware, introduces the rigorous theory, examines the design of high-performance systems, and helps develop problem-solving skills. Along with more streamlined material, this edition adds many new sections to exist

  5. Scripted Building Energy Modeling and Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hale, E.; Macumber, D.; Benne, K.; Goldwasser, D.

    2012-08-01

    Building energy modeling and analysis is currently a time-intensive, error-prone, and nonreproducible process. This paper describes the scripting platform of the OpenStudio tool suite (http://openstudio.nrel.gov) and demonstrates its use in several contexts. Two classes of scripts are described and demonstrated: measures and free-form scripts. Measures are small, single-purpose scripts that conform to a predefined interface. Because measures are fairly simple, they can be written or modified by inexperienced programmers.

  6. An analysis of penalized interaction models

    OpenAIRE

    Zhao, Junlong; Leng, Chenlei

    2016-01-01

    An important consideration for variable selection in interaction models is to design an appropriate penalty that respects hierarchy of the importance of the variables. A common theme is to include an interaction term only after the corresponding main effects are present. In this paper, we study several recently proposed approaches and present a unified analysis on the convergence rate for a class of estimators, when the design satisfies the restricted eigenvalue condition. In particular, we s...

  7. Automating Risk Analysis of Software Design Models

    OpenAIRE

    Maxime Frydman; Guifré Ruiz; Elisa Heymann; Eduardo César; Barton P. Miller

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security e...

  8. Energy Systems Modelling Research and Analysis

    DEFF Research Database (Denmark)

    Møller Andersen, Frits; Alberg Østergaard, Poul

    2015-01-01

    This editorial introduces the seventh volume of the International Journal of Sustainable Energy Planning and Management. The volume presents part of the outcome of the project Energy Systems Modelling Research and Analysis (ENSYMORA) funded by the Danish Innovation Fund. The project carried out...... by 11 university and industry partners has improved the basis for decision-making within energy planning and energy scenario making by providing new and improved tools and methods for energy systems analyses....

  9. Multivariate Probabilistic Analysis of an Hydrological Model

    Science.gov (United States)

    Franceschini, Samuela; Marani, Marco

    2010-05-01

    Model predictions based on rainfall measurements and hydrological model results are often limited by the systematic error of measuring instruments, by the intrinsic variability of the natural processes and by the uncertainty of the mathematical representation. We propose a means to identify such sources of uncertainty and to quantify their effects based on point-estimate approaches, as a valid alternative to cumbersome Monte Carlo methods. We present uncertainty analyses of the hydrologic response to selected meteorological events, in the mountain streamflow-generating portion of the Brenta basin at Bassano del Grappa, Italy. The Brenta river catchment has a relatively uniform morphology and quite a heterogeneous rainfall pattern. In the present work, we evaluate two sources of uncertainty: data uncertainty (the uncertainty due to data handling and analysis) and model uncertainty (the uncertainty related to the formulation of the model). We thus evaluate the effects of the measurement error of tipping-bucket rain gauges, the uncertainty in estimating spatially distributed rainfall through block kriging, and the uncertainty associated with estimated model parameters. To this end, we coupled a deterministic model based on the geomorphological theory of the hydrologic response to probabilistic methods. In particular we compare the results of Monte Carlo Simulations (MCS) to the results obtained, in the same conditions, using Li's Point Estimate Method (LiM). The LiM is a probabilistic technique that approximates the continuous probability distribution function of the considered stochastic variables by means of discrete points and associated weights. This makes it possible to reproduce results satisfactorily with only a few evaluations of the model function. The comparison between the LiM and MCS results highlights the pros and cons of using an approximating method. LiM is less computationally demanding than MCS, but has limited applicability especially when the model
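
    The flavor of the comparison can be sketched with a Rosenblueth-style two-point estimate, used here as a generic stand-in for Li's method (whose exact points and weights are not reproduced), against a Monte Carlo reference on a toy response function with invented inputs.

    ```python
    # Two-point estimate (2^n model runs) vs. Monte Carlo (100k runs) for the
    # mean and standard deviation of a toy response; assumes independent,
    # roughly symmetric inputs.
    import itertools
    import numpy as np

    def response(rain, k):                 # hypothetical hydrologic response
        return k * rain**1.5

    mu = np.array([50.0, 0.8])             # input means
    sigma = np.array([10.0, 0.1])          # input standard deviations

    rng = np.random.default_rng(3)
    samples = rng.normal(mu, sigma, size=(100_000, 2))
    q_mc = response(samples[:, 0], samples[:, 1])

    # evaluate at the 2^n corners mu +/- sigma, each with weight 1/2^n
    corners = [response(*(mu + np.array(s) * sigma))
               for s in itertools.product((-1, 1), repeat=2)]
    pe_mean = np.mean(corners)
    pe_std = np.sqrt(np.mean(np.square(corners)) - pe_mean**2)

    print(f"MCS: mean={q_mc.mean():.1f}  std={q_mc.std():.1f}")
    print(f"PEM: mean={pe_mean:.1f}  std={pe_std:.1f}")
    ```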

  10. Modeling late entry bias in survival analysis.

    Science.gov (United States)

    Matsuura, Masaaki; Eguchi, Shinto

    2005-06-01

    In a failure time analysis, we sometimes observe additional study subjects who enter during the study period. These late entries are treated as left-truncated data in the statistical literature. However, with real data, there is a substantial possibility that the delayed entries may have extremely different hazards compared to the other standard subjects. We focus on a situation in which such entry bias might arise in the analysis of survival data. The purpose of the present article is to develop an appropriate methodology for making inference about data including late entries. We construct a model that includes parameters for the effect of delayed entry bias having no specification for the distribution of entry time. We also discuss likelihood inference based on this model and derive the asymptotic behavior of estimates. A simulation study is conducted for a finite sample size in order to compare the analysis results using our method with those using the standard method, where independence between entry time and failure time is assumed. We apply this method to mortality analysis among atomic bomb survivors defined in a geographical study region. PMID:16011705
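
    The left-truncation idea can be made concrete with a minimal Kaplan-Meier sketch in which each subject only joins the risk set after its entry time; the data below are invented, and the sketch is standard left-truncated survival estimation rather than the authors' bias-correction model.

    ```python
    # Kaplan-Meier with delayed entry: subject i is at risk only on (entry_i, exit_i].
    import numpy as np

    entry = np.array([0.0, 0.0, 2.0, 3.0, 5.0])   # late entries for subjects 3-5
    exit_ = np.array([4.0, 6.0, 7.0, 9.0, 8.0])
    event = np.array([1,   0,   1,   1,   0  ])   # 1 = failure, 0 = censored

    surv = 1.0
    for t in np.unique(exit_[event == 1]):
        at_risk = np.sum((entry < t) & (exit_ >= t))   # risk set honours entry times
        deaths = np.sum((exit_ == t) & (event == 1))
        surv *= 1.0 - deaths / at_risk
        print(f"t={t:.0f}  at risk={at_risk}  S(t)={surv:.3f}")
    ```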

  11. Modeling and analysis of advanced binary cycles

    Energy Technology Data Exchange (ETDEWEB)

    Gawlik, K.

    1997-12-31

    A computer model (Cycle Analysis Simulation Tool, CAST) and a methodology have been developed to perform value analysis for small, low- to moderate-temperature binary geothermal power plants. The value analysis method allows incremental changes in the levelized electricity cost (LEC) to be determined between a baseline plant and a modified plant. Thermodynamic cycle analyses and component sizing are carried out in the model, followed by economic analysis which provides LEC results. The emphasis of the present work is on evaluating the effect of mixed working fluids instead of pure fluids on the LEC of a geothermal binary plant that uses a simple Organic Rankine Cycle. Four resources were studied, spanning the range of 265 °F to 375 °F. A variety of isobutane- and propane-based mixtures, in addition to pure fluids, were used as working fluids. This study shows that the use of propane mixtures at a 265 °F resource can reduce the LEC by 24% when compared to a base case value that utilizes commercial isobutane as its working fluid. The cost savings drop to 6% for a 375 °F resource, where an isobutane mixture is favored. Supercritical cycles were found to have the lowest cost at all resources.

  12. Computer modeling for neutron activation analysis methods

    International Nuclear Information System (INIS)

    The INP AS RU develops databases for neutron activation analysis - ND INAA [1] and ELEMENT [2]. Based on these databases, an automated complex is under construction aimed at modeling methods for the analysis of natural and technogenic materials. It is well known that there is a variety of analysis objects with wide spectra and different compositions and concentrations of elements, which makes it impossible to develop universal methods applicable to every analytical investigation. The modelling is based on an algorithm that computes the period of time for which the sample must be irradiated in a nuclear reactor to provide the sample's total absorption and activity analytical peak areas with given errors. The analytical complex was tested for low-element analysis (determination of Fe and Zn in vegetation samples, and of Cu, Ag and Au in technological objects). At present, the complex is applied to the multielement analysis of sediment samples. In this work, modern achievements in analytical chemistry (measurement facilities, high-resolution detectors, IAEA and IUPAC databases) and information technology (Java software, database management systems (DBMS), internet technologies) are applied. References: 1. Tillaev T., Umaraliev A., Gurvich L.G., Yuldasheva K., Kadirova J. Specialized database for instrumental neutron activation analysis - ND INAA 1.0, The 3rd Eurasian Conference on Nuclear Science and its Applications, 2004, pp. 270-271; 2. Gurvich L.G., Tillaev T., Umaraliev A. The information-analytical database on the element contents of natural objects, The 4th International Conference on Modern Problems of Nuclear Physics, Samarkand, 2003, p. 337. (authors)

  13. Mathematical analysis of epidemiological models with heterogeneity

    Energy Technology Data Exchange (ETDEWEB)

    Van Ark, J.W.

    1992-01-01

    For many diseases in human populations the disease shows dissimilar characteristics in separate subgroups of the population; for example, the probability of disease transmission for gonorrhea or AIDS is much higher from male to female than from female to male. There is reason to construct and analyze epidemiological models which allow for this heterogeneity of the population, and to use these models to run computer simulations of the disease to predict its incidence and prevalence. In the models considered here the heterogeneous population is separated into subpopulations whose internal and external interactions are homogeneous, in the sense that each person in a subpopulation can be assumed to act according to the average behaviour of that subpopulation. The first model considered is an SIRS model; i.e., a Susceptible can become Infected, and if so eventually Recovers with temporary immunity, after a period of time becoming Susceptible again. Special cases allow for permanent immunity or other variations. This model is analyzed and threshold conditions are given which determine whether the disease dies out or persists. A deterministic model is presented; this model is constructed using difference equations, and it has been used in computer simulations of the AIDS epidemic in the homosexual population in San Francisco. The homogeneous and heterogeneous versions of both the differential-equation and difference-equation forms of the deterministic model are analyzed mathematically. In the analysis, equilibria are identified and threshold conditions are set forth: below the threshold the disease dies out, and the disease-free equilibrium is globally asymptotically stable; above the threshold the disease persists, the disease-free equilibrium is unstable, and there is a unique endemic equilibrium.
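
    A minimal sketch of the homogeneous SIRS dynamics with its threshold behaviour is given below; the rate constants are illustrative, not taken from the thesis.

    ```python
    # SIRS compartment model: the threshold R0 = beta/gamma determines whether
    # the disease dies out (R0 < 1) or persists at an endemic equilibrium (R0 > 1).
    import numpy as np
    from scipy.integrate import odeint

    beta, gamma, xi = 0.4, 0.2, 0.05   # transmission, recovery, loss-of-immunity rates

    def sirs(y, t):
        s, i, r = y
        return [-beta * s * i + xi * r,
                beta * s * i - gamma * i,
                gamma * i - xi * r]

    t = np.linspace(0, 400, 2001)
    s, i, r = odeint(sirs, [0.99, 0.01, 0.0], t).T
    print(f"R0 = {beta / gamma:.1f}, endemic infected fraction ~ {i[-1]:.3f}")
    ```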

  14. Model reduction using a posteriori analysis

    KAUST Repository

    Whiteley, Jonathan P.

    2010-05-01

    Mathematical models in biology and physiology are often represented by large systems of non-linear ordinary differential equations. In many cases, an observed behaviour may be written as a linear functional of the solution of this system of equations. A technique is presented in this study for automatically identifying key terms in the system of equations that are responsible for a given linear functional of the solution. This technique is underpinned by ideas drawn from a posteriori error analysis. This concept has been used in finite element analysis to identify regions of the computational domain and components of the solution where a fine computational mesh should be used to ensure accuracy of the numerical solution. We use this concept to identify regions of the computational domain and components of the solution where accurate representation of the mathematical model is required for accuracy of the functional of interest. The technique presented is demonstrated by application to a model problem, and then to automatically deduce known results from a cell-level cardiac electrophysiology model. © 2010 Elsevier Inc.

  15. Ontological Modeling for Integrated Spacecraft Analysis

    Science.gov (United States)

    Wicks, Erica

    2011-01-01

    Current spacecraft work as a cooperative group of a number of subsystems. Each of these requires modeling software for development, testing, and prediction. It is the goal of my team to create an overarching software architecture called the Integrated Spacecraft Analysis (ISCA) to aid in deploying the discrete subsystems' models. Such a plan has been attempted in the past, and has failed due to the excessive scope of the project. Our goal in this version of ISCA is to use new resources to reduce the scope of the project, including using ontological models to help link the internal interfaces of subsystems' models with the ISCA architecture. I have created an ontology of functions specific to the modeling system of the navigation system of a spacecraft. The resulting ontology not only links, at an architectural level, language-specific instantiations of the modeling system's code, but also is web-viewable and can act as a documentation standard. This ontology is proof of concept that ontological modeling can aid in the integration necessary for ISCA to work, and can act as the prototype for future ISCA ontologies.

  16. Analysis of software for modeling atmospheric dispersion

    International Nuclear Information System (INIS)

    During the last few years, a number of software packages for microcomputers have appeared which aim to simulate the diffusion of atmospheric pollutants. These codes, which simplify the models used for safety analyses of industrial plants, are becoming more useful and are even used for post-accident conditions. The report presents, for the first time in a critical manner, the principal models available to date. The problem lies in adapting the models to the required post-accident interventions. In parallel to this action an analysis of performance was carried out, i.e., identifying the need to forecast the most appropriate actions to be taken, bearing in mind the short available time and the lack of information. Because of these difficulties, it is possible to simplify the software so that it does not include all the options but can deal with a specific situation. This would enable minimisation of the data to be collected on the site

  17. Modeling and Hazard Analysis Using STPA

    Science.gov (United States)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analyst than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process, where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis

  18. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  19. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  20. Gentrification and models for real estate analysis

    Directory of Open Access Journals (Sweden)

    Gianfranco Brusa

    2013-08-01

    This research proposes a deep analysis of the Milanese real estate market, based on data supplied by three real estate organizations; gentrification appears in some neighborhoods, such as Tortona, Porta Genova, Bovisa, and Isola Garibaldi: the last is the subject of the final analysis, a survey of the physical and social state of the area. The survey took place in two periods (2003 and 2009) to compare the evolution of gentrification. The results of the surveys were employed in a simulation with a multi-agent system model to foresee the long-term evolution of the phenomenon. These neighborhood micro-indicators make it possible to highlight actual trends conditioning a local real estate market, which can translate into phenomena such as gentrification. In the present analysis, the use of cellular automata models applied to a neighborhood in Milan (Isola Garibaldi) produced a dynamic simulation of the gentrification trend over a very long time: the cyclical phenomenon (one loop spans a period of twenty to thirty years) appears several times during a theoretical time of 100-120-150 years. Simulation of long-period scenarios by multi-agent systems and cellular automata provides the appraiser with a powerful tool, without limits in implementing it, able to support appraisal judgment. It also stands to reason that such a tool can sustain urban planning and related evaluation processes.

  1. Erosion Modeling Analysis for SME Tank Cavity

    International Nuclear Information System (INIS)

    Previous computational work to evaluate erosion in the DWPF Slurry Mix Evaporator vessel has been extended to address the potential for the erosion to accelerate because of changes to the tank bottom profile. The same erosion mechanism identified in the previous work, abrasive erosion driven by high wall shear stress, was applied to the current evaluation. The current work extends the previous analysis by incorporating the observed changes to the tank bottom and coil support structure in the vicinity of the coil guides. The results show that wall shear on the tank bottom is about the same magnitude as found in previous results. Shear stresses in the eroded cavities are reduced compared to those that caused the initial erosion, to the extent that anticipated continued erosion of those locations is minimal. If SR operations were continued at an agitator speed of 130 rpm, the edges of the existing eroded cavities would probably smooth out, while the rate of erosion at the bottom of the cavity would decrease significantly with time. Further, reducing the agitator speed to 103 rpm will reduce shear stresses throughout the bottom region of the tank enough to essentially preclude any significant continued erosion. Because this report is an extension of previously documented work, most background information has been omitted. A complete discussion of the motivation for both the analysis and the modeling is provided in Lee et al., "Erosion Modeling Analysis for Modified DWPF SR Tank".

  2. Global sensitivity analysis of thermomechanical models in modelling of welding

    International Nuclear Information System (INIS)

    The current approach of most welding modellers is to content themselves with available material data and to choose a mechanical model that seems appropriate. Among the inputs, those controlling the material properties are one of the key problems of welding simulation: material data are never characterized over a sufficiently wide temperature range. This way of proceeding neglects the influence of the uncertainty of the input data on the result given by the computer code. In this case, how can the credibility of the prediction be assessed? This thesis represents a step towards implementing an innovative approach in welding simulation in order to answer this question, with an illustration on some concrete welding cases. Global sensitivity analysis is chosen to determine which material properties are the most sensitive in a numerical welding simulation and in which range of temperature. Using this methodology required some developments to sample and explore the input space covering the welding of different steel materials. Finally, the input data have been divided into two groups according to their influence on the output of the model (residual stress or distortion). In this work, the complete methodology of global sensitivity analysis has been successfully applied to welding simulation, leading to a reduction of the input space to only the important variables. The sensitivity analysis has provided answers to what can be considered one of the most frequently asked questions regarding welding simulation: for a given material, which properties must be measured with good accuracy and which ones can simply be extrapolated or taken from a similar material? (author)
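
    The kind of variance-based global sensitivity analysis described here can be sketched with a Saltelli-type first-order Sobol estimator. The stand-in response function and input ranges below are invented; the thesis's actual model is a welding simulation code.

    ```python
    # First-order Sobol indices via the Saltelli/Jansen estimator:
    # S1_i = E[f(B) * (f(A with column i from B) - f(A))] / Var(Y).
    import numpy as np

    def model(x):                                     # hypothetical surrogate
        yield_stress, conductivity, expansion = x.T
        return yield_stress * np.sin(conductivity) + 0.3 * expansion**2

    rng = np.random.default_rng(4)
    N, k = 20_000, 3
    A, B = rng.uniform(0.0, 1.0, (2, N, k))           # two independent sample blocks
    fA, fB = model(A), model(B)
    var_y = np.var(np.concatenate([fA, fB]))

    for i, name in enumerate(["yield stress", "conductivity", "expansion coeff."]):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                           # swap in column i from B
        S1 = np.mean(fB * (model(ABi) - fA)) / var_y
        print(f"S1[{name}] = {S1:.2f}")
    ```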

  3. Inducer analysis/pump model development

    Science.gov (United States)

    Cheng, Gary C.

    1994-01-01

    Current design of high performance turbopumps for rocket engines requires effective and robust analytical tools to provide design information in a productive manner. The main goal of this study was to develop a robust and effective computational fluid dynamics (CFD) pump model for general turbopump design and analysis applications. A finite difference Navier-Stokes flow solver, FDNS, which includes an extended k-epsilon turbulence model and appropriate moving zonal interface boundary conditions, was developed to analyze turbulent flows in turbomachinery devices. In the present study, three key components of the turbopump, the inducer, impeller, and diffuser, were investigated by the proposed pump model, and the numerical results were benchmarked by the experimental data provided by Rocketdyne. For the numerical calculation of inducer flows with tip clearance, the turbulence model and grid spacing are very important. Meanwhile, the development of the cross-stream secondary flow, generated by curved blade passage and the flow through tip leakage, has a strong effect on the inducer flow. Hence, the prediction of the inducer performance critically depends on whether the numerical scheme of the pump model can simulate the secondary flow pattern accurately or not. The impeller and diffuser, however, are dominated by pressure-driven flows such that the effects of turbulence model and grid spacing (except near leading and trailing edges of blades) are less sensitive. The present CFD pump model has been proved to be an efficient and robust analytical tool for pump design due to its very compact numerical structure (requiring small memory), fast turnaround computing time, and versatility for different geometries.

  4. Modeling and Exergy Analysis of District Cooling

    DEFF Research Database (Denmark)

    Nguyen, Chan

    /or economically which is the objective of the PhD project. A thermodynamic (energy and exergy) model of a transcritical CO2 cooling and heating system has been developed. The coefficient of performance (COP) of the system is the characteristic of interest. A sensitivity analysis of the parameters: compressor...... the gas cooler, pinch temperature in the evaporator and effectiveness of the IHX. These results are complemented by the exergy analysis, where the exergy destruction ratio of the CO2 system’s component is found. Heat recovery from vapour compression heat pumps has been investigated. The heat is to be...... used in a district heating system based on combined heat and power plants (CHP). A theoretical comparison of trigeneration (cooling, heating and electricity) systems, a traditional system and a recovery system is carried out. The comparison is based on the systems overall exergy efficiency. The...

  5. Modelling structural systems for transient response analysis

    International Nuclear Information System (INIS)

    This paper introduces and reports the success of a direct means of determining the time periods in which a structural system behaves as a linear system. Numerical results are based on post-fracture transient analyses of simplified nuclear piping systems. Knowledge of the linear response ranges will lead to improved analysis-test correlation and more efficient analyses. It permits direct use of data from physical tests in analysis and simplification of the analytical model and interpretation of its behaviour. The paper presents a procedure for deducing linearity based on transient responses. Given the forcing functions and responses of discrete points of the system at various times, the process produces evidence of linearity and quantifies an adequate set of equations of motion. Results of use of the process with linear and nonlinear analyses of piping systems with damping illustrate its success. Results cover the application to data from mathematical system responses. (Auth.)

  6. Spatiochromatic Context Modeling for Color Saliency Analysis.

    Science.gov (United States)

    Zhang, Jun; Wang, Meng; Zhang, Shengping; Li, Xuelong; Wu, Xindong

    2016-06-01

    Visual saliency is one of the most noteworthy perceptual abilities of human vision. Recent progress in cognitive psychology suggests that: 1) visual saliency analysis is mainly completed by the bottom-up mechanism consisting of feedforward low-level processing in primary visual cortex (area V1) and 2) color interacts with spatial cues and is influenced by the neighborhood context, and thus plays an important role in visual saliency analysis. From a computational perspective, most existing saliency modeling approaches exploit multiple independent visual cues, irrespective of their interactions (or do not compute them explicitly), and ignore contextual influences induced by neighboring colors. In addition, the use of color is often underestimated in visual saliency analysis. In this paper, we propose a simple yet effective color saliency model that considers color as the only visual cue and mimics the color processing in V1. Our approach uses region-/boundary-defined color features with spatiochromatic filtering by considering local color-orientation interactions, and therefore captures homogeneous color elements, subtle textures within the object, and the overall salient object from the color image. To account for color contextual influences, we present a divisive normalization method for chromatic stimuli through the pooling of contrary/complementary color units. We further define a color perceptual metric over the entire scene to produce saliency maps for color regions and color boundaries individually. These maps are finally integrated into one single saliency map. The final saliency map is produced by Gaussian blurring for robustness. We evaluate the proposed method on both synthetic stimuli and several benchmark saliency data sets, from visual saliency analysis to salient object detection. The experimental results demonstrate that the use of color as a unique visual cue achieves competitive results, on par with or better than 12 state-of-the-art methods.

  7. Non standard analysis, polymer models, quantum fields

    International Nuclear Information System (INIS)

    We give an elementary introduction to nonstandard analysis and its applications to the theory of stochastic processes. This is based on a joint book with J.E. Fenstad, R. Høegh-Krohn and T. Lindstrøm. In particular we give a discussion of a hyperfinite theory of Dirichlet forms with applications to the study of the Hamiltonian for a quantum mechanical particle in the potential created by a polymer. We also discuss new results on the existence of attractive polymer measures in dimension d = 1, 2 and on the (φ⁴)_d-model of interacting quantum fields. (orig.)

  8. Modelling and analysis of global coal markets

    International Nuclear Information System (INIS)

    The thesis comprises four interrelated essays featuring modelling and analysis of coal markets. Each of the four essays has a dedicated chapter in this thesis. Chapters 2 to 4 have, from a topical perspective, a backward-looking focus and deal with explaining recent market outcomes in the international coal trade. The findings of those essays may serve as guidance for assessing current coal market outcomes as well as expected market outcomes in the near to medium-term future. Chapter 5 has a forward-looking focus and builds a bridge between explaining recent market outcomes and projecting long-term market equilibria. Chapter 2, Strategic Behaviour in International Metallurgical Coal Markets, deals with market conduct of large exporters in the market of coals used in steel-making in the period 2008 to 2010. In this essay I analyse whether prices and trade-flows in the international market for metallurgical coals were subject to non-competitive conduct in the period 2008 to 2010. To do so, I develop mathematical programming models - a Stackelberg model, two varieties of a Cournot model, and a perfect competition model - for computing spatial equilibria in international resource markets. Results are analysed with various statistical measures to assess the prediction accuracy of the models. The results show that real market equilibria cannot be reproduced with a competitive model. However, real market outcomes can be accurately simulated with the non-competitive models, suggesting that market equilibria in the international metallurgical coal trade were subject to the strategic behaviour of coal exporters. Chapter 3 and chapter 4 deal with market power issues in the steam coal trade in the period 2006 to 2008. Steam coals are typically used to produce steam either for electricity generation or for heating purposes. In Chapter 3 we analyse market behaviour of key exporting countries in the steam coal trade. This chapter features the essay Market Structure Scenarios in

  9. Modelling and analysis of global coal markets

    Energy Technology Data Exchange (ETDEWEB)

    Trueby, Johannes

    2013-01-17

    The thesis comprises four interrelated essays featuring modelling and analysis of coal markets. Each of the four essays has a dedicated chapter in this thesis. Chapters 2 to 4 have, from a topical perspective, a backward-looking focus and deal with explaining recent market outcomes in the international coal trade. The findings of those essays may serve as guidance for assessing current coal market outcomes as well as expected market outcomes in the near to medium-term future. Chapter 5 has a forward-looking focus and builds a bridge between explaining recent market outcomes and projecting long-term market equilibria. Chapter 2, Strategic Behaviour in International Metallurgical Coal Markets, deals with market conduct of large exporters in the market of coals used in steel-making in the period 2008 to 2010. In this essay I analyse whether prices and trade-flows in the international market for metallurgical coals were subject to non-competitive conduct in the period 2008 to 2010. To do so, I develop mathematical programming models - a Stackelberg model, two varieties of a Cournot model, and a perfect competition model - for computing spatial equilibria in international resource markets. Results are analysed with various statistical measures to assess the prediction accuracy of the models. The results show that real market equilibria cannot be reproduced with a competitive model. However, real market outcomes can be accurately simulated with the non-competitive models, suggesting that market equilibria in the international metallurgical coal trade were subject to the strategic behaviour of coal exporters. Chapter 3 and chapter 4 deal with market power issues in the steam coal trade in the period 2006 to 2008. Steam coals are typically used to produce steam either for electricity generation or for heating purposes. In Chapter 3 we analyse market behaviour of key exporting countries in the steam coal trade. This chapter features the essay Market Structure Scenarios in
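
    The Cournot variant can be illustrated in its simplest non-spatial form: two exporters facing linear inverse demand, solved by best-response iteration. All numbers are illustrative; the thesis's models are spatial equilibrium programs with far more structure.

    ```python
    # Toy Cournot equilibrium for two exporters with inverse demand
    # P = a - b*(q1 + q2) and constant marginal costs, via best-response iteration.
    a, b = 100.0, 0.5          # demand intercept and slope
    c1, c2 = 20.0, 30.0        # marginal costs of the two exporters

    q1 = q2 = 0.0
    for _ in range(100):       # iterate best responses to a fixed point
        q1 = max(0.0, (a - c1 - b * q2) / (2 * b))
        q2 = max(0.0, (a - c2 - b * q1) / (2 * b))

    price = a - b * (q1 + q2)
    print(f"q1={q1:.1f}  q2={q2:.1f}  price={price:.1f}")   # 60.0, 40.0, 50.0
    ```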

  10. MODELING ANALYSIS FOR GROUT HOPPER WASTE TANK

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.

    2012-01-04

    The Saltstone facility at the Savannah River Site (SRS) has a grout hopper tank to provide agitator stirring of the Saltstone feed materials. The tank has about a 300-gallon capacity to provide a larger working volume for the grout nuclear waste slurry to be held in case of a process upset, and it is equipped with a mechanical agitator, which is intended to keep the grout in motion and agitated so that it won't start to set up. The primary objective of the work was to evaluate the flow performance of mechanical agitators to prevent vortex pull-through for adequate stirring of the feed materials and to estimate an agitator speed which provides acceptable flow performance with a 45° pitched four-blade agitator. In addition, the power consumption required for agitator operation was estimated. The modeling calculations were performed in two steps of a Computational Fluid Dynamics (CFD) modeling approach. As a first step, a simple single-stage agitator model with 45° pitched propeller blades was developed for the initial scoping analysis of the flow pattern behaviors over a range of different operating conditions. Based on the initial phase-1 results, a phase-2 model with a two-stage agitator was developed for the final performance evaluations. A series of sensitivity calculations for different agitator designs and operating conditions was performed to investigate the impact of key parameters on the grout hydraulic performance in a 300-gallon hopper tank. For the analysis, viscous shear was modeled using the Bingham plastic approximation. Steady-state analyses with a two-equation turbulence model were performed. All analyses were based on three-dimensional results. Recommended operational guidance was developed by using the basic concept that local shear rate profiles and flow patterns can be used as a measure of hydraulic performance and spatial stirring. Flow patterns were estimated by a Lagrangian integration technique along

  11. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of the Business and Information Technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, this process can be complicated, and omissions or miscalculations are very likely. This situation has fostered research into automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool. PMID:27047732

  12. Modeling human reliability analysis using MIDAS

    Energy Technology Data Exchange (ETDEWEB)

    Boring, R. L. [Human Factors, Instrumentation and Control Systems Dept., Idaho National Laboratory, Idaho Falls, ID 83415 (United States)

    2006-07-01

    This paper documents current efforts to infuse human reliability analysis (HRA) into human performance simulation. The Idaho National Laboratory is teamed with NASA Ames Research Center to bridge the SPAR-H HRA method with NASA's Man-machine Integration Design and Analysis System (MIDAS) for use in simulating and modeling the human contribution to risk in nuclear power plant control room operations. It is anticipated that the union of MIDAS and SPAR-H will pave the path for cost-effective, timely, and valid simulated control room operators for studying current and next generation control room configurations. This paper highlights considerations for creating the dynamic HRA framework necessary for simulation, including event dependency and granularity. This paper also highlights how the SPAR-H performance shaping factors can be modeled in MIDAS across static, dynamic, and initiator conditions common to control room scenarios. This paper concludes with a discussion of the relationship of the workload factors currently in MIDAS and the performance shaping factors in SPAR-H. (authors)

  13. Modeling human reliability analysis using MIDAS

    International Nuclear Information System (INIS)

    This paper documents current efforts to infuse human reliability analysis (HRA) into human performance simulation. The Idaho National Laboratory is teamed with NASA Ames Research Center to bridge the SPAR-H HRA method with NASA's Man-machine Integration Design and Analysis System (MIDAS) for use in simulating and modeling the human contribution to risk in nuclear power plant control room operations. It is anticipated that the union of MIDAS and SPAR-H will pave the path for cost-effective, timely, and valid simulated control room operators for studying current and next generation control room configurations. This paper highlights considerations for creating the dynamic HRA framework necessary for simulation, including event dependency and granularity. This paper also highlights how the SPAR-H performance shaping factors can be modeled in MIDAS across static, dynamic, and initiator conditions common to control room scenarios. This paper concludes with a discussion of the relationship of the workload factors currently in MIDAS and the performance shaping factors in SPAR-H. (authors)

  14. ANALYSIS MODEL FOR RETURN ON CAPITAL EMPLOYED

    Directory of Open Access Journals (Sweden)

    BURJA CAMELIA

    2013-02-01

    At the microeconomic level, the appreciation of capital profitability is a very complex undertaking which is of interest to stakeholders. The main purpose of this study is to extend the traditional analysis model for capital profitability, based on the ratio “Return on capital employed”. In line with this, the objectives of the work are the identification of factors that exert an influence on the profitability of the capital utilized by a company and the measurement of their contribution in the manifestation of the phenomenon. The proposed analysis model is validated on the use case of a representative company from the agricultural sector. The results obtained reveal that there are some factors which can act positively on capital profitability: capital turnover, sales efficiency, an increase in the share of sales in total revenues, and improvement of expense efficiency. The findings are useful both for decision-making factors in substantiating economic strategies and for capital owners who are interested in the efficiency of their investments.
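
    The decomposition behind the model can be illustrated with invented figures: ROCE factors into capital turnover times sales efficiency (margin), the first two factors named above.

    ```python
    # Illustrative DuPont-style factoring of Return on Capital Employed;
    # all figures are hypothetical.
    sales = 1_200_000.0
    operating_profit = 150_000.0
    capital_employed = 800_000.0

    capital_turnover = sales / capital_employed      # how hard the capital works
    sales_efficiency = operating_profit / sales      # margin earned on sales
    roce = capital_turnover * sales_efficiency       # = profit / capital employed

    print(f"ROCE = {roce:.1%} "
          f"(turnover {capital_turnover:.2f} x margin {sales_efficiency:.1%})")
    ```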

  15. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    OpenAIRE

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verificat...

  16. Data analysis and source modelling for LISA

    International Nuclear Information System (INIS)

    Gravitational waves (GWs) are one of the most important predictions of general relativity. To obtain direct proof of the existence of GWs, several ground-based detectors (such as LIGO and GEO) are already operating, and a future space mission (LISA) is planned, all aiming to detect GWs directly. A GW carries a large amount of information about its source; extracting this information can help us uncover the physical properties of the source and even open a new window for understanding the Universe. Hence, GW data analysis is a challenging task in the search for GWs. In this thesis, I present two works on data analysis for LISA. In the first work, we introduce an extended multimodal genetic algorithm which utilizes the properties of the signal and the detector response function to analyze the data from the third round of the Mock LISA Data Challenge. We found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio and sky location with reasonable accuracy. As for the orbital angular momentum and the two spins of the black holes, we found a large number of widely separated modes in the parameter space with similar maximum likelihood values. The performance of this method is comparable, if not superior, to already existing algorithms. In the second work, we introduce a new phenomenological waveform model for the extreme mass ratio inspiral (EMRI) system. This waveform consists of a set of harmonics with constant amplitude and slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in the simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is well suited to the data analysis of EMRI signals.

  17. Sensitivity analysis of Smith's AMRV model

    International Nuclear Information System (INIS)

    Multiple-expert hazard/risk assessments have considerable precedent, particularly in the Yucca Mountain site characterization studies. In this paper, we present a Bayesian approach to statistical modeling in volcanic hazard assessment for the Yucca Mountain site. Specifically, we show that the expert opinion on the site disruption parameter p is elicited through the prior distribution, π(p), based on the geological information that is available. Moreover, π(p) can combine all available geological information motivated by conflicting but realistic arguments (e.g., simulation, cluster analysis, structural control, etc.). The incorporated uncertainties about the probability of repository disruption p will eventually be averaged out by taking the expectation over π(p). We use the following priors in the analysis: priors chosen for mathematical convenience, Beta(r, s) for (r, s) = (2, 2), (3, 3), (5, 5), (2, 1), (2, 8), (8, 2), and (1, 1); and three priors motivated by expert knowledge. Sensitivity analysis is performed for each prior distribution. Estimated values of hazard based on the priors chosen for mathematical simplicity are uniformly higher than those obtained based on the priors motivated by expert knowledge. The model using the prior Beta(8, 2) yields the highest hazard (2.97 x 10^-2). The minimum hazard is produced by the "three-expert prior" (i.e., values of p are equally likely at 10^-3, 10^-2, and 10^-1). That estimate of the hazard is 1.39 x 10^-3, which is only about one order of magnitude smaller than the maximum value. The term "hazard" is defined as the probability of at least one disruption of a repository at the Yucca Mountain site by basaltic volcanism over the next 10,000 years
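
    The prior-sensitivity computation can be sketched generically: average a hazard function over each candidate Beta prior by quadrature. The hazard function below is hypothetical; the paper's actual hazard model is not reproduced.

    ```python
    # Expected hazard E[h(p)] = integral of h(p) * pi(p) dp for several Beta priors.
    import numpy as np
    from scipy import integrate, stats

    def hazard(p, rate=1e-3):
        # hypothetical stand-in: scale a small baseline disruption rate by p
        return 1.0 - np.exp(-rate * p * 10)

    for a, b in [(2, 2), (3, 3), (5, 5), (2, 1), (2, 8), (8, 2), (1, 1)]:
        prior = stats.beta(a, b)
        h, _ = integrate.quad(lambda p: hazard(p) * prior.pdf(p), 0.0, 1.0)
        print(f"Beta({a},{b}): expected hazard = {h:.2e}")
    ```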

  18. Data analysis and source modelling for LISA

    Energy Technology Data Exchange (ETDEWEB)

    Shang, Yu

    2014-07-01

    Gravitational waves (GWs) are one of the most important predictions of general relativity. To obtain direct proof of the existence of GWs, several ground-based detectors (such as LIGO and GEO) are already operating, and a future space mission (LISA) is planned, all aiming to detect GWs directly. A GW carries a large amount of information about its source; extracting this information can help us uncover the physical properties of the source and even open a new window for understanding the Universe. Hence, GW data analysis is a challenging task in the search for GWs. In this thesis, I present two works on data analysis for LISA. In the first work, we introduce an extended multimodal genetic algorithm which utilizes the properties of the signal and the detector response function to analyze the data from the third round of the Mock LISA Data Challenge. We found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio and sky location with reasonable accuracy. As for the orbital angular momentum and the two spins of the black holes, we found a large number of widely separated modes in the parameter space with similar maximum likelihood values. The performance of this method is comparable, if not superior, to already existing algorithms. In the second work, we introduce a new phenomenological waveform model for the extreme mass ratio inspiral (EMRI) system. This waveform consists of a set of harmonics with constant amplitude and slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in the simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is well suited to the data analysis of EMRI signals.

  19. Application of Statistical Analysis Software in Food Scientific Modeling

    OpenAIRE

    Miaochao Chen; Kong Xiangsheng; Kan Chen

    2014-01-01

    In food science research, sophisticated statistical analysis problems are often encountered. In this study, the curve regression model and the multiple regression model, both common in food science, were established using the SPSS statistical analysis software. The experimental results show that the method can be used effectively in statistical analysis modeling for food science.
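
    The abstract builds its two model forms in SPSS; as a hedged illustration of the same models, the Python sketch below (hypothetical data, not the study's) fits a logarithmic curve model and a two-predictor multiple regression by least squares.

    ```python
    # Sketch of the two model forms named in the abstract, on synthetic data:
    # (1) curve regression y = a*ln(x) + b, (2) multiple regression on two predictors.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(1, 10, 50)
    y = 2.0 * np.log(x) + 1.0 + rng.normal(0, 0.1, x.size)

    # Curve regression via least squares on the transformed predictor ln(x).
    A = np.column_stack([np.log(x), np.ones_like(x)])
    a, b = np.linalg.lstsq(A, y, rcond=None)[0]
    print("curve model coefficients:", a, b)

    # Multiple regression y = b0 + b1*x1 + b2*x2.
    x2 = rng.normal(size=x.size)
    y2 = 1.0 + 0.5 * x + 0.3 * x2 + rng.normal(0, 0.1, x.size)
    X = np.column_stack([np.ones_like(x), x, x2])
    print("multiple regression coefficients:", np.linalg.lstsq(X, y2, rcond=None)[0])
    ```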

  20. A structural analysis model for clay caps

    International Nuclear Information System (INIS)

    This paper presents a structural analysis model for clay caps used in landfills for low-level nuclear waste to minimize the migration of fluid through the soil. The clay cap resting on the soil foundation is treated as an axially symmetric elastic plate supported by an elastic foundation. A circular hole (concentric with the plate) in the elastic foundation represents an underlying cavity formed in the landfill due to waste decomposition and volume reduction. Unlike models that commonly represent the soil foundation with equivalent springs, this model treats the foundation as a semi-infinite space and accounts for the work done by both compression and shear stresses in the foundation. The governing equation of the plate is based upon the classical theory of plate bending, whereas the governing equation derived by using Vlasov's general variational method describes the soil foundation. The solutions are expressed in terms of Basset functions. A FORTRAN program was written to carry out the numerical calculations

  1. Dynamical Systems Analysis of Various Dark Energy Models

    CERN Document Server

    Roy, Nandan

    2015-01-01

    In this thesis, we used dynamical systems analysis to find the qualitative behaviour of some dark energy models. Specifically, dynamical systems analysis of quintessence scalar field models, chameleon scalar field models and holographic models of dark energy are discussed in this thesis.

  2. Saturn Ring Data Analysis and Thermal Modeling

    Science.gov (United States)

    Dobson, Coleman

    2011-01-01

    CIRS, VIMS, UVIS, and ISS (Cassini's Composite Infrared Spectrometer, Visual and Infrared Mapping Spectrometer, Ultraviolet Imaging Spectrograph, and Imaging Science Subsystem, respectively) have each operated in a multidimensional observation space and have acquired scans of the lit and unlit rings at multiple phase angles. To better understand physical and dynamical ring particle parametric dependence, we co-registered profiles from these instruments, taken at a wide range of wavelengths from the ultraviolet through the thermal infrared, to associate changes in ring particle temperature with changes in observed brightness, specifically with albedos inferred by ISS, UVIS and VIMS. We work in a parameter space where the solar elevation range is constrained to 12 deg - 14 deg and the chosen radial region is the B3 region of the B ring; this region is the most optically thick region in Saturn's rings. From this compilation of multiple-wavelength data, we construct and fit phase curves and color ratios using independent dynamical thermal models for ring structure, and overplot Saturn, Saturn ring, and solar spectra. Analysis of the phase curves and color ratios reveals thermal emission to fall within the extrema of the ISS bandwidth and a geometrical dependence of reddening on phase angle, respectively. Analysis of spectra reveals that Cassini CIRS Saturn spectra dominate Cassini CIRS B3 ring spectra from 19 to 1000 microns, while the Earth-based B ring spectrum dominates the Earth-based Saturn spectrum from 0.4 to 4 microns. From our fits we test our dynamical thermal models; from the phase curves we derive ring albedos and non-lambertian properties of the ring particle surfaces; and from the color ratios we examine multiple scattering within the regolith of ring particles.

  3. A visual analysis of the process of process modeling

    OpenAIRE

    Claes, J Jan; Vanderfeesten, ITP Irene; Pinggera, J.; Reijers, HA Hajo; Weber, B.; Poels, G

    2015-01-01

    The construction of business process models has become an important requisite in the analysis and optimization of processes. The success of the analysis and optimization efforts heavily depends on the quality of the models. Therefore, a research domain emerged that studies the process of process modeling. This paper contributes to this research by presenting a way of visualizing the different steps a modeler undertakes to construct a process model, in a so-called process of process modeling C...

  4. Comparative Analysis of Parametric Engine Model and Engine Map Model

    OpenAIRE

    Zeeshan Ali Memon; Sadiq Ali Shah; Muhammas Saleh Jumani

    2015-01-01

    Two different engine models, parametric engine model and engine map model are employed to analyze the dynamics of an engine during the gear shifting. The models are analyzed under critical transitional manoeuvres to investigate their appropriateness for vehicle longitudinal dynamics. The simulation results for both models have been compared. The results show the engine map model matches well with the parametric model and can be used for the vehicle longitudinal dynamics model. The proposed ap...

  5. Comparative analysis of enterprise risk management models

    OpenAIRE

    Nikolaev Igor V.

    2012-01-01

    The article is devoted to the analysis and comparison of modern enterprise risk management models used in domestic and world practice. Some principles on which such models should be based are proposed.

  6. Production TTR modeling and dynamic buckling analysis

    Institute of Scientific and Technical Information of China (English)

    Hugh Liu; John Wei; Edward Huang

    2013-01-01

    In a typical tension leg platform (TLP) design, the top tension factor (TTF), measuring the top tension of a top tensioned riser (TTR) relative to its submerged weight in water, is one of the most important design parameters and has to be specified properly. While a very small TTF may lead to excessive vortex induced vibration (VIV), clashing issues and possible compression close to the seafloor, an unnecessarily high TTF may translate into excessive riser cost and vessel payload, and even has an impact on the TLP sizing and design in general. In the process of a production TTR design, it was found that the outer casing can be subjected to compression in a worst-case scenario with some extreme metocean and hardware conditions. The present paper shows how finite element analysis (FEA) models using beam elements and two different software packages (Flexcom and ABAQUS) are constructed to simulate the TTR properly, especially the pipe-in-pipe effects. An ABAQUS model with hybrid elements (beam elements globally + shell elements locally) can be used to investigate how the outer casing behaves under compression. It is shown that for the specified TTR design, even with the outer casing under some local compression in the worst-case scenario, dynamic buckling would not occur; therefore the TTR design is adequate.

  7. Statistical Analysis and Modeling of Elastic Functions

    CERN Document Server

    Srivastava, Anuj; Kurtek, Sebastian; Klassen, Eric; Marron, J S

    2011-01-01

    We introduce a novel geometric framework for separating, analyzing and modeling the $x$ (or horizontal) and the $y$ (or vertical) variability in time-warped functional data of the type frequently studied in growth curve analysis. This framework is based on the use of the Fisher-Rao Riemannian metric that provides a proper distance for: (1) aligning, comparing and modeling functions and (2) analyzing the warping functions. A convenient square-root velocity function (SRVF) representation transforms the Fisher-Rao metric to the standard $L^2$ metric, a tool that is applied twice in this framework. Firstly, it is applied to the given functions, where it leads to a parametric family of penalized-$L^2$ distances in SRVF space. The parameter controls the levels of elasticity of the individual functions. These distances are then used to define Karcher means and the individual functions are optimally warped to align them to the Karcher means to extract the $y$ variability. Secondly, the resulting warping functions,...
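
    The SRVF transform at the heart of this framework is simple to state: q(t) = sign(f'(t)) sqrt(|f'(t)|). A minimal Python sketch on synthetic curves (not the paper's data) is given below; the $L^2$ distance between SRVFs then plays the role of the Fisher-Rao distance.

    ```python
    # Sketch of the square-root velocity function (SRVF): q = sign(f') sqrt(|f'|).
    # Under this representation the Fisher-Rao metric becomes the standard L2 metric.
    import numpy as np

    def srvf(f, t):
        """SRVF of a function f sampled on the grid t."""
        df = np.gradient(f, t)
        return np.sign(df) * np.sqrt(np.abs(df))

    t = np.linspace(0.0, 1.0, 200)
    f1 = np.sin(2 * np.pi * t)
    f2 = np.sin(2 * np.pi * t ** 2)          # a time-warped variant of f1
    q1, q2 = srvf(f1, t), srvf(f2, t)

    # L2 distance between the SRVFs (proxy for the Fisher-Rao distance).
    print(np.sqrt(np.trapz((q1 - q2) ** 2, t)))
    ```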

  8. Linking advanced fracture models to structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chiesa, Matteo

    2001-07-01

    Shell structures with defects occur in many situations. The defects are usually introduced during the welding process necessary for joining different parts of the structure. Higher utilization of structural materials leads to a need for accurate numerical tools for reliable prediction of structural response. The direct discretization of the cracked shell structure with solid finite elements in order to perform an integrity assessment of the structure in question leads to large problems, making such analysis infeasible in structural applications. In this study a link between local material models and structural analysis is outlined. An ''ad hoc'' element formulation is used in order to connect complex material models to the finite element framework used for structural analysis. An improved elasto-plastic line spring finite element formulation, used in order to take cracks into account, is linked to shell elements which are further linked to beam elements. In this way one obtains a global model of the shell structure that also accounts for local flexibilities and fractures due to defects. An important advantage of such an approach is a direct fracture mechanics assessment, e.g. via the computed J-integral or CTOD. A recent development in this approach is the notion of two-parameter fracture assessment. This means that the crack tip stress tri-axiality (constraint) is employed in determining the corresponding fracture toughness, giving a much more realistic capacity of cracked structures. The present thesis is organized in six research articles and an introductory chapter that reviews important background literature related to this work. Papers I and II address the performance of shell and line spring finite elements as a cost effective tool for performing the numerical calculation needed to perform a fracture assessment. In Paper II a failure assessment, based on the testing of a constraint-corrected fracture mechanics specimen under tension, is

  9. Trend analysis model to forecast energy supply and demand

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-01

    A particular approach to energy forecasting which was studied in considerable detail was trend extrapolation. This technique, termed the trend analysis model, was suggested by Dr. S. Scott Sutton, the EIA contract technical officer. While a variety of equations were explored during this part of the study, they are all variations of a basic formulation. This report describes and demonstrates the trend analysis model and documents the computer program used to produce the model results.

  10. Model Checking Is Static Analysis of Modal Logic

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nielson, Hanne Riis

    2010-01-01

    it can give an exact characterisation of the semantics of formulae in a modal logic. This shows that model checking can be performed by means of state-of-the-art approaches to static analysis and allows us to conclude that the problems of model checking and static analysis are reducible to each other.... In terms of computational complexity we show that model checking by means of static analysis gives the same complexity bounds as are known for traditional approaches to model checking....

  11. Comparison of Statistical Models for Regional Crop Trial Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qun-yuan; KONG Fan-ling

    2002-01-01

    Based on a review and comparison of the main statistical analysis models for estimating variety-environment cell means in regional crop trials, a new statistical model, the LR-PCA composite model, was proposed, and the predictive precision of these models was compared by cross-validation on an example data set. Results showed that the order of model precision was LR-PCA model > AMMI model > PCA model > Treatment Means (TM) model > Linear Regression (LR) model > Additive Main Effects ANOVA model. The precision gain factor of the LR-PCA model was 1.55, an increase of 8.4% compared with AMMI.
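
    To make the AMMI comparison above concrete, here is a hedged Python sketch of an AMMI-1 fit (additive main effects plus the first multiplicative term of the interaction residual) on a hypothetical variety x environment table; the LR-PCA composite model of the paper adds a linear regression stage not shown here.

    ```python
    # Sketch of an AMMI-1 decomposition: grand mean + genotype and environment
    # main effects + first singular component of the interaction residual
    # (hypothetical data, not the paper's trial data).
    import numpy as np

    rng = np.random.default_rng(0)
    Y = rng.random((6, 5))                              # 6 varieties x 5 environments
    grand = Y.mean()
    g = Y.mean(axis=1, keepdims=True) - grand           # genotype main effects
    e = Y.mean(axis=0, keepdims=True) - grand           # environment main effects
    resid = Y - grand - g - e                           # interaction residual

    U, S, Vt = np.linalg.svd(resid, full_matrices=False)
    Y_ammi1 = grand + g + e + S[0] * np.outer(U[:, 0], Vt[0])
    print("mean abs error of AMMI-1 fit:", np.abs(Y - Y_ammi1).mean())
    ```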

  12. Applied data analysis and modeling for energy engineers and scientists

    CERN Document Server

    Reddy, T Agami

    2011-01-01

    ""Applied Data Analysis and Modeling for Energy Engineers and Scientists"" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

  13. Towards a controlled sensitivity analysis of model development decisions

    Science.gov (United States)

    Clark, Martyn; Nijssen, Bart

    2016-04-01

    The current generation of hydrologic models has followed a myriad of different development paths, making it difficult for the community to test underlying hypotheses and identify a clear path to model improvement. Model comparison studies have been undertaken to explore model differences, but these studies have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than a systematic analysis of model shortcomings. This presentation will discuss a unified approach to process-based hydrologic modeling to enable controlled and systematic analysis of multiple model representations (hypotheses) of hydrologic processes and scaling behavior. Our approach, which we term the Structure for Unifying Multiple Modeling Alternatives (SUMMA), formulates a general set of conservation equations, providing the flexibility to experiment with different spatial representations, different flux parameterizations, different model parameter values, and different time stepping schemes. We will discuss the use of SUMMA to systematically analyze different model development decisions, focusing on analysis of simulations both for intensively instrumented research watersheds and across a global dataset of FLUXNET sites. The intent of the presentation is to demonstrate how the systematic analysis of model shortcomings can help identify model weaknesses and inform future model development priorities.

  14. Comparative analysis of parametric engine model and engine map model

    International Nuclear Information System (INIS)

    Two different engine models, parametric engine model and engine map model are employed to analyze the dynamics of an engine during the gear shifting. The models are analyzed under critical transitional manoeuvres to investigate their appropriateness for vehicle longitudinal dynamics. The simulation results for both models have been compared. The results show the engine map model matches well with the parametric model and can be used for the vehicle longitudinal dynamics model. The proposed approach can be useful for the selection of the appropriate vehicle for the given application. (author)

  15. Translation model, translation analysis, translation strategy: an integrated methodology

    OpenAIRE

    VOLKOVA TATIANA A.

    2014-01-01

    The paper revisits the concepts of translation model, translation analysis, and translation strategy from an integrated perspective: a translation strategy naturally follows translation analysis performed on a given set of textual, discursive and communicative parameters that form a valid translation model. Translation modeling is reconsidered in terms of a paradigm shift and a distinction between a process-oriented (descriptive) model and an action-oriented (prescriptive) model. Following th...

  16. Analysis on the Logarithmic Model of Relationships

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The logarithmic model is often used to describe the relationships between factors, and it often gives good statistical characteristics. Yet, in the process of modeling for soil and water conservation, we found that this "good" model cannot guarantee good results. In this paper we inquire into the intrinsic reasons. It is shown that the logarithmic model has the property of enlarging or reducing model errors, and the disadvantages of the logarithmic model are analyzed.

  17. Economic analysis model for total energy and economic systems

    International Nuclear Information System (INIS)

    This report describes the framing of an economic analysis model developed as a tool for total energy systems. To project and analyze future energy systems, it is important to analyze the relation between the energy system and the economic structure. We prepared an economic analysis model suited for this purpose. A distinguishing feature of our model is that it can analyze energy-related matters in more detail than other economic models and can forecast long-term economic progress rather than short-term economic fluctuations. From the viewpoint of economics, our model is a long-term multi-sectoral economic analysis model of open Leontief type. The model gave appropriate results in fitting tests and forecasting estimation. (author)

  18. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model, using an example taken from a management study.
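
    As a hedged illustration of the kind of validation pipeline the paper reviews (the paper's specific performance measures may differ), the sketch below fits a logistic regression on synthetic data and reports two common holdout measures, accuracy and AUC, using scikit-learn.

    ```python
    # Sketch of holdout validation of a logistic regression with two common
    # performance measures (synthetic data; not the management study's data).
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score, roc_auc_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, n_features=8, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
    print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
    ```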

  19. An Extended Analysis of Requirements Traceability Model

    Institute of Scientific and Technical Information of China (English)

    Jiang Dandong(蒋丹东); Zhang Shensheng; Chen Lu

    2004-01-01

    A new extended meta-model of traceability is presented. Then, a formalized fine-grained model of traceability is described. Some major issues about this model, including trace units, requirements and relations within the model, are further analyzed. Finally, a case study from a key project of the 863 Program is given.

  20. Managing Analysis Models in the Design Process

    Science.gov (United States)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  1. Loss Given Default Modelling: Comparative Analysis

    OpenAIRE

    Yashkir, Olga; Yashkir, Yuriy

    2013-01-01

    In this study we investigated several of the most popular Loss Given Default (LGD) models (LSM, Tobit, Three-Tiered Tobit, Beta Regression, Inflated Beta Regression, Censored Gamma Regression) in order to compare their performance. We show that for a given input data set, the quality of the model calibration depends mainly on the proper choice (and availability) of explanatory variables (model factors), but not on the fitting model. Model factors were chosen based on the amplitude of their correlati...

  2. Comparison of Integrated Analysis Methods for Two Model Scenarios

    Science.gov (United States)

    Amundsen, Ruth M.

    1999-01-01

    Integrated analysis methods have the potential to substantially decrease the time required for analysis modeling. Integration with computer aided design (CAD) software can also allow a model to be more accurate by facilitating import of exact design geometry. However, the integrated method utilized must sometimes be tailored to the specific modeling situation, in order to make the process most efficient. Two cases are presented here that illustrate different processes used for thermal analysis on two different models. These examples are used to illustrate how the requirements, available input, expected output, and tools available all affect the process selected by the analyst for the most efficient and effective analysis.

  3. Model Analysis: Assessing the dynamics of student learning

    CERN Document Server

    Bao, Lei; Redish, Edward F.

    2002-01-01

    In this paper we present a method of modeling and analysis that permits the extraction and quantitative display of detailed information about the effects of instruction on a class's knowledge. The method relies on a cognitive model that represents student thinking in terms of mental models. Students frequently fail to recognize relevant conditions that lead to appropriate uses of their models. As a result they can use multiple models inconsistently. Once the most common mental models have been determined by qualitative research, they can be mapped onto a multiple-choice test. Model analysis permits the interpretation of such a situation. We illustrate the use of our method by analyzing results from the FCI.
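
    The central object of model analysis is a class "model density matrix" built from per-student model-use probabilities; its dominant eigenvector summarizes how consistently the class uses the common mental models. The sketch below (hypothetical probabilities, not FCI data) follows that construction.

    ```python
    # Sketch of a model density matrix: each student's model-use probabilities
    # become an amplitude vector u = sqrt(p); D is the class average of u u^T.
    import numpy as np

    # Hypothetical: 3 common mental models; one row per student.
    probs = np.array([[0.8, 0.2, 0.0],
                      [0.5, 0.5, 0.0],
                      [0.2, 0.3, 0.5]])
    vecs = np.sqrt(probs)
    D = sum(np.outer(u, u) for u in vecs) / len(vecs)

    evals, evecs = np.linalg.eigh(D)
    print("eigenvalues:", evals)           # a single dominant eigenvalue indicates
    print("dominant mode:", evecs[:, -1])  # consistent model use across the class
    ```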

  4. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    Science.gov (United States)

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  5. [Analysis of the stability and adaptability of near infrared spectra qualitative analysis model].

    Science.gov (United States)

    Cao, Wu; Li, Wei-jun; Wang, Ping; Zhang, Li-ping

    2014-06-01

    The stability and adaptability of near infrared spectra qualitative analysis models were studied. The method of separate modeling can significantly improve the stability and adaptability of a model, but its ability to improve adaptability is limited. The method of joint modeling can improve not only the adaptability of the model but also its stability; at the same time, compared to separate modeling, it can shorten the modeling time, reduce the modeling workload, extend the term of validity of the model, and improve modeling efficiency. The model adaptability experiment shows that the correct recognition rate of the separate modeling method is relatively low and cannot meet the requirements of application, while the joint modeling method reaches a correct recognition rate of 90% and significantly enhances the recognition effect. The model stability experiment shows that the identification results of the joint model are better than those of the separate model, and the method has good application value. PMID:25358155

  6. Priors from DSGE models for dynamic factor analysis

    OpenAIRE

    Bäurle, Gregor

    2008-01-01

    We propose a method to incorporate information from Dynamic Stochastic General Equilibrium (DSGE) models into Dynamic Factor Analysis. The method combines a procedure previously applied for Bayesian Vector Autoregressions and a Gibbs Sampling approach for Dynamic Factor Models. The factors in the model are rotated such that they can be interpreted as variables from a DSGE model. In contrast to standard Dynamic Factor Analysis, a direct economic interpretation of the factors is given. We evalu...

  7. Modeling Paradigms Applied to the Analysis of European Air Quality

    OpenAIRE

    Makowski, M.

    2000-01-01

    The paper presents an overview of various modeling paradigms applicable to the analysis of complex decision-making that can be represented by large non-linear models. Such paradigms are illustrated by their application to the analysis of a model that helps to identify and analyze various cost-effective policy options aimed at improving European air quality. Also presented is the application of this model to support intergovernmental negotiations.

  8. Computational models for the nonlinear analysis of reinforced concrete plates

    Science.gov (United States)

    Hinton, E.; Rahman, H. H. A.; Huq, M. M.

    1980-01-01

    A finite element computational model for the nonlinear analysis of reinforced concrete solid, stiffened and cellular plates is briefly outlined. Typically, Mindlin elements are used to model the plates whereas eccentric Timoshenko elements are adopted to represent the beams. The layering technique, common in the analysis of reinforced concrete flexural systems, is incorporated in the model. The proposed model provides an inexpensive and reasonably accurate approach which can be extended for use with voided plates.

  9. Modeling for ultrasonic testing accuracy in probabilistic fracture mechanics analysis

    International Nuclear Information System (INIS)

    This study proposes models for ultrasonic testing (UT) accuracy at in-service inspection (ISI) for use in probabilistic fracture mechanics (PFM) analysis. Regression analysis of the data produced by the Ultrasonic Test and Evaluation for Maintenance Standards (UTS) project, together with modeling of successful candidates of the performance demonstration certification system, provided the models for the accuracy of flaw detection and sizing. A new PFM analysis code, which evaluates failure probabilities at weld lines in piping aged by stress corrosion cracking, has been developed by JAEA. The models were introduced into the code, and failure probabilities at a weld line were evaluated under the UT models. (author)

  10. Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits

    DEFF Research Database (Denmark)

    Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo

    2013-01-01

    concentrates on longevity studies. The framework presented allows models based on continuous time to be combined with models based on discrete time in a joint analysis. The continuous time models are approximations of the frailty model, in which the hazard function is assumed to be piece-wise constant. The...

  11. Multidimensional Data Modeling for Business Process Analysis

    Science.gov (United States)

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.

  12. Analysis of a computer model of emotions

    OpenAIRE

    Moffat, D.; Frijda, N.H.; Phaf, R.H.

    1993-01-01

    In the fields of psychology, AI, and philosophy there has recently been theoretical activity in the cognitively-based modelling of emotions. Using AI methodology it is possible to implement and test these complex models, and in this paper we examine an emotion model called ACRES. We propose a set of requirements any such model should satisfy, and compare ACRES against them. Then, analysing its behaviour in detail, we formulate more requirements and criteria that can be applied to future compu...

  13. Multiattribute shopping models and ridge regression analysis

    OpenAIRE

    Timmermans, HJP Harry

    1981-01-01

    Policy decisions regarding retailing facilities essentially involve multiple attributes of shopping centres. If mathematical shopping models are to contribute to these decision processes, their structure should reflect the multiattribute character of retailing planning. Examination of existing models shows that most operational shopping models include only two policy variables. A serious problem in the calibration of the existing multiattribute shopping models is that of multicollinearity ari...

  14. Analysis on Some of Software Reliability Models

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The software reliability & maintainability evaluation tool (SRMET 3.0), developed by the Software Evaluation and Test Center of China Aerospace Mechanical Corporation, is introduced in detail in this paper. SRMET 3.0 is supported by seven software reliability models and four software maintainability models. Numerical characteristics of all those models are studied in depth in this paper, and corresponding numerical algorithms for each model are also given.

  15. Analysis and modeling of parking behavior

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper analyzes the spatial structure of parking behavior and establishes a basic parking behavior model to represent the parking problem in downtown areas. It then establishes a parking pricing model to analyze the parking equilibrium with a positive parking fee, and uses a paired combinatorial logit model to analyze the effect of integrated trip cost on parking behavior. Empirical results show that the parking behavior model performs well.
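
    For illustration, the sketch below computes choice probabilities with a plain multinomial logit on hypothetical utilities for three parking options; the paired combinatorial logit used in the record generalizes this by allowing correlated error terms between pairs of alternatives.

    ```python
    # Sketch of multinomial logit choice probabilities (the paired combinatorial
    # logit in the record extends this with pairwise correlation structure).
    import numpy as np

    def mnl_probs(utilities):
        """Softmax over alternative utilities, shifted for numerical stability."""
        e = np.exp(utilities - np.max(utilities))
        return e / e.sum()

    # Hypothetical utilities for three parking options (fee and walk time folded in).
    print(mnl_probs(np.array([-1.2, -0.8, -2.0])))
    ```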

  16. Likelihood analysis of the I(2) model

    DEFF Research Database (Denmark)

    Johansen, Søren

    1997-01-01

    The I(2) model is defined as a submodel of the general vector autoregressive model, by two reduced rank conditions. The model describes stochastic processes with stationary second difference. A parametrization is suggested which makes likelihood inference feasible. Consistency of the maximum...

  17. The Model and Analysis of Mechatronics Systems

    OpenAIRE

    Bačkys, Gediminas

    2004-01-01

    This work models mechatronic systems. The software is used with a PLC and simulates real equipment. The system includes several sample models with pneumatic elements and a model with an analog device, together with educational materials for students. The software has been used for educational purposes at university and college level.

  18. Projected principal component analysis in factor models

    OpenAIRE

    Fan, Jianqing; Liao, Yuan; Wang, Weichen

    2014-01-01

    This paper introduces Projected Principal Component Analysis (Projected-PCA), which applies principal component analysis to the data matrix projected (smoothed) onto a given linear space spanned by covariates. When applied to high-dimensional factor analysis, the projection removes noise components. We show that the unobserved latent factors can be more accurately estimated than with conventional PCA if the projection is genuine, or more precisely, when the factor loading matrices are r...

  19. Evaluation of Thermal Margin Analysis Models for SMART

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Kyong Won; Kwon, Hyuk; Hwang, Dae Hyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-05-15

    The thermal margin of SMART is analyzed by three different methods. The first method is subchannel analysis by the MATRA-S code, which provides reference data for the other two methods. The second method is an on-line few-channel analysis by the FAST code, which will be integrated into SCOPS/SCOMS. The last one is a single-channel module analysis used in the safety analysis. Several thermal margin analysis models for the SMART reactor core based on subchannel analysis were set up and tested. We adopted a single-stage analysis strategy for the thermal analysis of the SMART reactor core. The model should represent the characteristics of the SMART reactor core, including the hot channel, and should be as simple as possible so that it can be evaluated within reasonable time and cost

  20. Practical Use of Computationally Frugal Model Analysis Methods.

    Science.gov (United States)

    Hill, Mary C; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2016-03-01

    Three challenges compromise the utility of mathematical models of groundwater and other environmental systems: (1) a dizzying array of model analysis methods and metrics make it difficult to compare evaluations of model adequacy, sensitivity, and uncertainty; (2) the high computational demands of many popular model analysis methods (requiring thousands, tens of thousands, or more model runs) make them difficult to apply to complex models; and (3) many models are plagued by unrealistic nonlinearities arising from the numerical model formulation and implementation. This study proposes a strategy to address these challenges through a careful combination of model analysis and implementation methods. In this strategy, computationally frugal model analysis methods (often requiring a few dozen parallelizable model runs) play a major role, and computationally demanding methods are used for problems where (relatively) inexpensive diagnostics suggest the frugal methods are unreliable. We also argue in favor of detecting and, where possible, eliminating unrealistic model nonlinearities; this increases the realism of the model itself and facilitates the application of frugal methods. Literature examples are used to demonstrate the use of frugal methods and associated diagnostics. We suggest that the strategy proposed in this paper would allow the environmental sciences community to achieve greater transparency and falsifiability of environmental models, and obtain greater scientific insight from ongoing and future modeling efforts. PMID:25810333
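
    A concrete example of a "frugal" method in the paper's sense is a one-at-a-time local sensitivity analysis, which needs only about two model runs per parameter. The sketch below applies central differences to a stand-in model (the function and parameter values are hypothetical).

    ```python
    # Sketch of a computationally frugal, one-at-a-time local sensitivity
    # analysis: central differences need ~2 runs per parameter, not thousands.
    import numpy as np

    def model(params):
        """Stand-in for an expensive environmental model."""
        k, s = params
        return k * np.exp(-s)

    base = np.array([2.0, 0.5])
    sensitivities = []
    for i in range(base.size):
        dp = 1e-4 * max(abs(base[i]), 1.0)
        hi, lo = base.copy(), base.copy()
        hi[i] += dp
        lo[i] -= dp
        sensitivities.append((model(hi) - model(lo)) / (2 * dp))
    print(sensitivities)
    ```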

  1. Solar Advisor Model; Session: Modeling and Analysis (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Blair, N.

    2008-04-01

    This project supports the Solar America Initiative by: (1) providing a consistent framework for analyzing and comparing power system costs and performance across the range of solar technologies and markets (PV, solar heat systems, and CSP; residential, commercial, and utility markets); (2) developing and validating performance models to enable accurate calculation of the levelized cost of energy (LCOE); (3) providing a consistent modeling platform for all TPPs; and (4) supporting implementation and usage of cost models.

  2. Modelling Immune System: Principles, Models,Analysis and Perspectives

    Institute of Scientific and Technical Information of China (English)

    Xiang-hua Li; Zheng-xuan Wang; Tian-yang Lu; Xiang-jiu Che

    2009-01-01

    The biological immune system is a complex adaptive system, and there are many benefits to building models of it. Biological researchers can test hypotheses about the infection process or simulate the responses to drugs. Computer researchers can build distributed, robust and fault-tolerant networks inspired by the functions of the immune system. This paper provides a comprehensive survey of the literature on modelling the immune system. From a methodological perspective, the paper compares and analyzes the existing approaches and models, and also highlights the research directions for immune models in the next few years.

  3. Modeling Composite Laminate Crushing for Crash Analysis

    Science.gov (United States)

    Fleming, David C.; Jones, Lisa (Technical Monitor)

    2002-01-01

    Crash modeling of composite structures remains limited in application and has not been effectively demonstrated as a predictive tool. While the global response of composite structures may be well modeled, when composite structures act as energy-absorbing members through direct laminate crushing the modeling accuracy is greatly reduced. The most efficient composite energy-absorbing structures, in terms of energy absorbed per unit mass, are those that absorb energy through a complex progressive crushing response in which fiber and matrix fractures on a small scale dominate the behavior. Such failure modes simultaneously include delamination of plies, failure of the matrix to produce fiber bundles, and subsequent failure of fiber bundles either in bending or in shear. In addition, the response may include the significant action of friction, both internally (between delaminated plies or fiber bundles) and externally (between the laminate and the crushing surface). A figure shows the crushing damage observed in a fiberglass composite tube specimen, illustrating the complexity of the response. Achieving a finite element model of such complex behavior is an extremely challenging problem. A practical crushing model based on detailed modeling of the physical mechanisms of crushing behavior is not expected in the foreseeable future. The present research describes attempts to model composite crushing behavior using a novel hybrid modeling procedure. Experimental testing is done in support of the modeling efforts, and a test specimen is developed to provide data for validating laminate crushing models.

  4. Analysis of nonlinear systems using ARMA [autoregressive moving average] models

    International Nuclear Information System (INIS)

    While many vibration systems exhibit primarily linear behavior, a significant percentage of the systems encountered in vibration and model testing are mildly to severely nonlinear. Analysis methods for such nonlinear systems are not yet well developed and the response of such systems is not accurately predicted by linear models. Nonlinear ARMA (autoregressive moving average) models are one method for the analysis and response prediction of nonlinear vibratory systems. In this paper we review the background of linear and nonlinear ARMA models, and illustrate the application of these models to nonlinear vibration systems. We conclude by summarizing the advantages and disadvantages of ARMA models and emphasizing prospects for future development. 14 refs., 11 figs
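
    As a hedged, linear-case illustration of the ARMA machinery the record reviews (its nonlinear extensions add nonlinear terms to the same recursion), the sketch below simulates an ARMA(2,1) series and recovers its coefficients with statsmodels.

    ```python
    # Sketch: simulate a linear ARMA(2,1) series and fit it with statsmodels.
    # Nonlinear ARMA models, as discussed in the record, extend this recursion.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    n = 500
    e = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(2, n):
        y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + e[t] + 0.4 * e[t - 1]

    fit = ARIMA(y, order=(2, 0, 1)).fit()
    print(fit.params)   # estimated AR, MA coefficients and innovation variance
    ```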

  5. An Object Extraction Model Using Association Rules and Dependence Analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Extracting objects from legacy systems is a basic step in a system's object-orientation, undertaken to improve the maintainability and understandability of the system. A new object extraction model using association rules and dependence analysis is proposed. In this model, data are classified by association rules and the corresponding operations are partitioned by dependence analysis.

  6. Probabilistic Models of Analysis of Loan Activity of Internet Banking

    OpenAIRE

    Kondrateva Irina G.; Ostapenko Irina N.

    2012-01-01

    The article considers the main advantages of electronic banking in comparison with traditional banking, as well as methods for the analysis of credit activity. A special role is given to the probabilistic method of analyzing credit activity in Internet banking. The bank's activity is modeled on the basis of probabilistic models of credit operations.

  7. Book review: Statistical Analysis and Modelling of Spatial Point Patterns

    DEFF Research Database (Denmark)

    Møller, Jesper

    2009-01-01

    Statistical Analysis and Modelling of Spatial Point Patterns by J. Illian, A. Penttinen, H. Stoyan and D. Stoyan. Wiley (2008), ISBN 9780470014912.

  8. Multidimensional data modeling for business process analysis

    OpenAIRE

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    2007-01-01

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging thes...

  9. Multifractal modelling and 3D lacunarity analysis

    International Nuclear Information System (INIS)

    This study presents a comparative evaluation of lacunarity of 3D grey level models with different types of inhomogeneity. A new method based on the 'Relative Differential Box Counting' was developed to estimate the lacunarity features of grey level volumes. To validate our method, we generated a set of 3D grey level multifractal models with random, anisotropic and hierarchical properties. Our method gives a lacunarity measurement correlated with the theoretical one and allows a better model classification compared with a classical approach.
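
    As a simplified illustration of lacunarity estimation on a 3D volume, the sketch below computes gliding-box lacunarity, Lambda(r) = E[M^2]/E[M]^2 of box masses M, on a random binary volume; the paper's 'Relative Differential Box Counting' variant adapts the box mass to grey-level data.

    ```python
    # Sketch of gliding-box lacunarity for a 3D volume (binary masses here; the
    # grey-level "Relative Differential Box Counting" variant in the paper differs).
    import numpy as np

    def lacunarity(volume, r):
        """Lambda(r) = E[M^2] / E[M]^2 over all r x r x r gliding boxes."""
        n = volume.shape[0]
        masses = np.array([volume[i:i + r, j:j + r, k:k + r].sum()
                           for i in range(n - r + 1)
                           for j in range(n - r + 1)
                           for k in range(n - r + 1)], dtype=float)
        return (masses ** 2).mean() / masses.mean() ** 2

    vol = (np.random.default_rng(0).random((16, 16, 16)) < 0.3).astype(int)
    print({r: round(lacunarity(vol, r), 3) for r in (2, 4, 8)})
    ```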

  10. Finite-Element Modeling For Structural Analysis

    Science.gov (United States)

    Min, J. B.; Androlake, S. G.

    1995-01-01

    Report presents study of finite-element mathematical modeling as used in analyzing stresses and strains at joints between thin, shell-like components (e.g., ducts) and thicker components (e.g., flanges or engine blocks). First approach uses global/local model to evaluate system. Provides correct total response and correct representation of stresses away from any discontinuities. Second approach involves development of special transition finite elements to model transitions between shells and thicker structural components.

  11. The Modeling Analysis of Huangshan Tourism Data

    Science.gov (United States)

    Hu, Shanfeng; Yan, Xinhu; Zhu, Hongbing

    2016-06-01

    Tourism is the major industry in Huangshan city. This paper analyzes time series of tourism data for Huangshan from 2000 to 2013. The yearly data set comprises the total arrivals of tourists, total income, urban resident disposable income per capita and net income per peasant. A mathematical model based on the binomial approximation and an inverse quadratic radial basis function (RBF) is set up to model the tourist arrivals. The total income, urban resident disposable income per capita and net income per peasant are also modeled. It is shown that the established mathematical model can be used to forecast tourism information and support good management of Huangshan tourism.
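
    To illustrate the inverse quadratic RBF component of the model (the binomial-approximation part and the actual Huangshan data are not reproduced here), the sketch below interpolates a hypothetical yearly arrivals series with inverse quadratic basis functions, phi(r) = 1 / (1 + (eps * r)^2).

    ```python
    # Sketch of an inverse quadratic RBF fit to a yearly series (hypothetical
    # numbers, not the Huangshan data; the paper also uses a binomial approximation).
    import numpy as np

    def inv_quad_rbf(x, centers, eps=1.0):
        """Inverse quadratic RBF design matrix: phi(r) = 1 / (1 + (eps * r)^2)."""
        r = x[:, None] - centers[None, :]
        return 1.0 / (1.0 + (eps * r) ** 2)

    years = np.arange(2000, 2014, dtype=float)
    arrivals = 10 + 0.8 * (years - 2000) + np.sin(years)      # hypothetical data

    Phi = inv_quad_rbf(years, years)                           # centers at samples
    w = np.linalg.solve(Phi + 1e-8 * np.eye(years.size), arrivals)
    print(inv_quad_rbf(np.array([2014.0]), years) @ w)         # one-year forecast
    ```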

  12. Evaluating Network Models: A Likelihood Analysis

    CERN Document Server

    Wang, Wen-Qiang; Zhou, Tao

    2011-01-01

    Many models are put forward to mimic the evolution of real networked systems. A well-accepted way to judge the validity is to compare the modeling results with real networks subject to several structural features. Even for a specific real network, we cannot fairly evaluate the goodness of different models since there are too many structural features while there is no criterion to select and assign weights on them. Motivated by the studies on link prediction algorithms, we propose a unified method to evaluate the network models via the comparison of the likelihoods of the currently observed network driven by different models, with an assumption that the higher the likelihood is, the better the model is. We test our method on the real Internet at the Autonomous System (AS) level, and the results suggest that the Generalized Linear Preferential (GLP) model outperforms the Tel Aviv Network Generator (Tang), while both models are better than the Barabási-Albert (BA) and Erdős-Rényi (ER) models. Our metho...
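
    The likelihood comparison the authors propose has a closed form for the simplest candidate, the Erdős-Rényi G(n, p) model, shown in the sketch below (the observed edge count is hypothetical); richer models such as BA or GLP require evaluating the probability of the observed growth history instead.

    ```python
    # Sketch: log-likelihood of an observed graph under the Erdos-Renyi G(n, p)
    # model, with p set to its maximum-likelihood estimate. Growth models (BA,
    # GLP, Tang) require likelihoods over the network's construction history.
    import math

    def er_loglik(n_nodes, n_edges):
        pairs = n_nodes * (n_nodes - 1) // 2
        p = n_edges / pairs                       # MLE of the edge probability
        return n_edges * math.log(p) + (pairs - n_edges) * math.log(1.0 - p)

    print(er_loglik(100, 300))                    # hypothetical observed network
    ```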

  13. Mineralogic Model (MM3.0) Analysis Model Report

    International Nuclear Information System (INIS)

    The purpose of this report is to document the Mineralogic Model (MM), Version 3.0 (MM3.0) with regard to data input, modeling methods, assumptions, uncertainties, limitations and validation of the model results, qualification status of the model, and the differences between Version 3.0 and previous versions. A three-dimensional (3-D) Mineralogic Model was developed for Yucca Mountain to support the analyses of hydrologic properties, radionuclide transport, mineral health hazards, repository performance, and repository design. Version 3.0 of the MM was developed from mineralogic data obtained from borehole samples. It consists of matrix mineral abundances as a function of x (easting), y (northing), and z (elevation), referenced to the stratigraphic framework defined in Version 3.1 of the Geologic Framework Model (GFM). The MM was developed specifically for incorporation into the 3-D Integrated Site Model (ISM). The MM enables project personnel to obtain calculated mineral abundances at any position, within any region, or within any stratigraphic unit in the model area. The significance of the MM for key aspects of site characterization and performance assessment is explained in the following subsections. This work was conducted in accordance with the Development Plan for the MM (CRWMS M and O 2000). The planning document for this Rev. 00, ICN 02 of this AMR is Technical Work Plan, TWP-NBS-GS-000003, Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01 (CRWMS M and O 2000). The purpose of this ICN is to record changes in the classification of input status by the resolution of the use of TBV software and data in this report. Constraints and limitations of the MM are discussed in the appropriate sections that follow. The MM is one component of the ISM, which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components

  14. Mineralogic Model (MM3.0) Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    C. Lum

    2002-02-12

    The purpose of this report is to document the Mineralogic Model (MM), Version 3.0 (MM3.0) with regard to data input, modeling methods, assumptions, uncertainties, limitations and validation of the model results, qualification status of the model, and the differences between Version 3.0 and previous versions. A three-dimensional (3-D) Mineralogic Model was developed for Yucca Mountain to support the analyses of hydrologic properties, radionuclide transport, mineral health hazards, repository performance, and repository design. Version 3.0 of the MM was developed from mineralogic data obtained from borehole samples. It consists of matrix mineral abundances as a function of x (easting), y (northing), and z (elevation), referenced to the stratigraphic framework defined in Version 3.1 of the Geologic Framework Model (GFM). The MM was developed specifically for incorporation into the 3-D Integrated Site Model (ISM). The MM enables project personnel to obtain calculated mineral abundances at any position, within any region, or within any stratigraphic unit in the model area. The significance of the MM for key aspects of site characterization and performance assessment is explained in the following subsections. This work was conducted in accordance with the Development Plan for the MM (CRWMS M&O 2000). The planning document for this Rev. 00, ICN 02 of this AMR is Technical Work Plan, TWP-NBS-GS-000003, Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01 (CRWMS M&O 2000). The purpose of this ICN is to record changes in the classification of input status by the resolution of the use of TBV software and data in this report. Constraints and limitations of the MM are discussed in the appropriate sections that follow. The MM is one component of the ISM, which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1

  15. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo

  16. An Intelligent Analysis Model for Multisource Volatile Memory

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhang

    2013-09-01

    With the rapid development of networks and distributed computing environments, it has become harder for researchers to perform analysis from only one or a few data sources in persistent data-oriented approaches, and the same holds for volatile memory analysis. Therefore, automatic analysis of mass data and action modeling need to be considered in order to report an entire network attack process. Modeling situations with multiple volatile data sources can help understand and describe both the thinking process of the investigator and the possible action steps of the attacker. This paper presents a game model for multisource volatile data and applies it to the analysis of main memory images, with a definition of space-time features for volatile element information. Abstract modeling allows the lessons gleaned in performing intelligent analysis, evidence filing and automated presentation to be captured. Finally, a test demo based on the model is presented to illustrate the whole procedure

  17. SBKF Modeling and Analysis Plan: Buckling Analysis of Compression-Loaded Orthogrid and Isogrid Cylinders

    Science.gov (United States)

    Lovejoy, Andrew E.; Hilburger, Mark W.

    2013-01-01

    This document outlines a Modeling and Analysis Plan (MAP) to be followed by the SBKF analysts. It includes instructions on modeling and analysis formulation and execution, model verification and validation, identifying sources of error and uncertainty, and documentation. The goal of this MAP is to provide a standardized procedure that ensures uniformity and quality of the results produced by the project and corresponding documentation.

  18. Meta-analysis of clinical prediction models

    NARCIS (Netherlands)

    Debray, T.P.A.

    2013-01-01

    The past decades there has been a clear shift from implicit to explicit diagnosis and prognosis. This includes appreciation of clinical -diagnostic and prognostic- prediction models, which is likely to increase with the introduction of fully computerized patient records. Prediction models aim to pro

  19. Wellness Model of Supervision: A Comparative Analysis

    Science.gov (United States)

    Lenz, A. Stephen; Sangganjanavanich, Varunee Faii; Balkin, Richard S.; Oliver, Marvarene; Smith, Robert L.

    2012-01-01

    This quasi-experimental study compared the effectiveness of the Wellness Model of Supervision (WELMS; Lenz & Smith, 2010) with alternative supervision models for developing wellness constructs, total personal wellness, and helping skills among counselors-in-training. Participants were 32 master's-level counseling students completing their…

  20. Multivariate model for test response analysis

    NARCIS (Netherlands)

    Krishnan, S.; Kerkhoff, H.G.

    2010-01-01

    A systematic approach to construct an effective multivariate test response model for capturing manufacturing defects in electronic products is described. The effectiveness of the model is demonstrated by its capability in reducing the number of test-points, while achieving the maximal coverage attai

  1. Modeling and motion analysis of autonomous paragliders

    OpenAIRE

    Chiara Toglia; Marilena Vendittelli

    2010-01-01

    This report describes a preliminary study on modeling and control of parafoil and payload systems with the twofold objective of developing tools for automatic testing and classification of parafoils and of devising autonomous paragliders able to accomplish long-range delivery or monitoring tasks. Three different models of decreasing complexity are derived and their accuracy compared by simulation.

  2. Temporal analysis of text data using latent variable models

    DEFF Research Database (Denmark)

    Mølgaard, Lasse Lohilahti; Larsen, Jan; Goutte, Cyril

    2009-01-01

    Probabilistic Latent Semantic Analysis (PLSA) approach and a global multiway PLSA method. The analysis indicates that the global analysis method is able to identify relevant trends which are difficult to get using a step-by-step approach. Furthermore we show that inspection of PLSA models with different number...

  3. Analysis and modelization of lightweight structures subjected to impact

    OpenAIRE

    Barbero Pozuelo, Enrique; López-Puente, Jorge

    2008-01-01

    The Mechanics of Advanced Materials research group (Department of Continuum Mechanics and Structural Analysis) of the University Carlos III of Madrid (Spain) offers its experience in the analysis and modelization of the high and low velocity impact behaviour of composite structures. Its research focuses on both numerical analysis and non-standard experimental methodologies.

  4. GLOBAL ANALYSIS OF AGRICULTURAL TRADE LIBERALIZATION: ASSESSING MODEL VALIDITY

    OpenAIRE

    Hertel, Thomas W.; Keeney, Roman; Valenzuela, Ernesto

    2004-01-01

    This paper presents a validation experiment of a global CGE trade model widely used for analysis of trade liberalization. We focus on the ability of the model to reproduce price volatility in wheat markets. The literature on model validation is reviewed with an eye towards designing an appropriate methodology for validating large scale CGE models. The validation experiment results indicate that in its current form, the GTAP-AGR model is incapable of reproducing wheat market price volatility a...

  5. Analysis and modeling of solar irradiance variations

    CERN Document Server

    Yeo, K L

    2014-01-01

    A prominent manifestation of the solar dynamo is the 11-year activity cycle, evident in indicators of solar activity, including solar irradiance. Although a relationship between solar activity and the brightness of the Sun had long been suspected, it was only directly observed after regular satellite measurements became available with the launch of Nimbus-7 in 1978. The measurement of solar irradiance from space is accompanied by the development of models aimed at describing the apparent variability by the intensity excess/deficit effected by magnetic structures in the photosphere. The more sophisticated models, termed semi-empirical, rely on the intensity spectra of photospheric magnetic structures generated with radiative transfer codes from semi-empirical model atmospheres. An established example of such models is SATIRE-S (Spectral And Total Irradiance REconstruction for the Satellite era). One key limitation of current semi-empirical models is the fact that the radiant properties of network and faculae a...

  6. Quark model analysis of the Sivers function

    CERN Document Server

    Courtoy, A; Scopetta, S; Vento, V

    2008-01-01

    A formalism is developed aimed at evaluating the Sivers function entering single spin asymmetries. The approach is well suited for calculations which use constituent quark models to describe the structure of the nucleon. A non-relativistic approximation of the scheme is performed to calculate the Sivers function using the Isgur-Karl model. The results we have obtained are consistent with a sizable Sivers effect, with an opposite sign for the u and d flavors. This pattern is in agreement with the one found analysing, in the same model, the impact parameter dependent generalized parton distributions. Although a consistent QCD evolution of the results from the momentum scale of the model to the experimental one is not yet possible, an estimate shows that a reasonable agreement with the available data is obtained once the evolution of the model results is performed.

  7. Analysis of Modeling Parameters on Threaded Screws.

    Energy Technology Data Exchange (ETDEWEB)

    Vigil, Miquela S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brake, Matthew Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vangoethem, Douglas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    Assembled mechanical systems often contain a large number of bolted connections. These bolted connections (joints) are integral aspects of the load path for structural dynamics, and, consequently, are paramount for calculating a structure's stiffness and energy dissipation properties. However, analysts have not found the optimal method to appropriately model these bolted joints. The complexity of the screw geometry causes issues when generating a mesh of the model. This paper explores different approaches to model a screw-substrate connection. Model parameters such as mesh continuity, node alignment, wedge angles, and thread-to-body element size ratios are examined. The results of this study will give analysts a better understanding of the influences of these parameters and will aid in finding the optimal method to model bolted connections.

  8. Application of dimensional analysis in systems modeling and control design

    CERN Document Server

    Balaguer, Pedro

    2013-01-01

    Dimensional analysis is an engineering tool that is widely applied to numerous engineering problems, but has only recently been applied to control theory and problems such as identification and model reduction, robust control, adaptive control, and PID control. Application of Dimensional Analysis in Systems Modeling and Control Design provides an introduction to the fundamentals of dimensional analysis for control engineers, and shows how they can exploit the benefits of the technique to theoretical and practical control problems.

  9. Gentrification and models for real estate analysis

    OpenAIRE

    Gianfranco Brusa; Alessandra Armiraglio

    2013-01-01

    This research proposes an in-depth analysis of the Milanese real estate market, based on data supplied by three real estate organizations; gentrification appears in some neighborhoods, such as Tortona, Porta Genova, Bovisa and Isola Garibaldi: the last of these is the subject of the final analysis, through a survey of the physical and social state of the area. The survey takes place in two periods (2003 and 2009) to compare the evolution of gentrification. The results of the surveys have been employed in a simulation by mult...

  10. Sensitivity Analysis of the Gap Heat Transfer Model in BISON.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Schmidt, Rodney C.; Williamson, Richard (INL); Perez, Danielle (INL)

    2014-10-01

    This report summarizes the results of a NEAMS project focused on sensitivity analysis of the heat transfer model in the gap between the fuel rod and the cladding used in the BISON fuel performance code of Idaho National Laboratory. Using the gap heat transfer models in BISON, the sensitivity of the associated responses to the modeling parameters is investigated. The study results in a quantitative assessment of the role of various parameters in the analysis of gap heat transfer in nuclear fuel.
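
    As a minimal illustration of the kind of parameter screening such a study performs, the sketch below perturbs each input of a toy gap conductance law, h = k_gas / (gap + jump), one at a time. The formula and all parameter values are assumptions for illustration only, not BISON's actual gap heat transfer model.

    ```python
    # One-at-a-time (OAT) sensitivity sketch for a *toy* gap conductance model.
    def gap_conductance(k_gas, gap, jump):
        """Toy conductance of the fuel-cladding gap, W/m^2-K (assumed form)."""
        return k_gas / (gap + jump)

    nominal = {"k_gas": 0.3, "gap": 50e-6, "jump": 10e-6}  # W/m-K, m, m

    base = gap_conductance(**nominal)
    for name, value in nominal.items():
        perturbed = dict(nominal, **{name: value * 1.10})  # +10% perturbation
        delta = gap_conductance(**perturbed) / base - 1.0
        print(f"{name:6s}: +10% input -> {delta:+.1%} change in h_gap")
    ```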

  11. European Climate - Energy Security Nexus. A model based scenario analysis

    International Nuclear Information System (INIS)

    In this research, we have provided an overview of the climate-security nexus in the European sector through a model-based scenario analysis with the POLES model. The analysis underlines that under stringent climate policies, Europe benefits from a double dividend: the capacity to develop a new, cleaner energy model, and a lower vulnerability to potential shocks on the international energy markets. (authors)

  12. Policy Sensitivity Analysis: simple versus complex fishery models

    OpenAIRE

    Moxnes, Erling

    2003-01-01

    Sensitivity analysis is often used to judge the sensitivity of model behaviour to uncertain assumptions about model formulations and parameter values. Since the ultimate goal of modelling is typically policy recommendation, one may suspect that it is even more useful to test the sensitivity of policy recommendations. A major reason for this is that behaviour sensitivity is not necessarily a reliable predictor of policy sensitivity. Policy sensitivity analysis is greatly simplified if one can ...

  13. Multifractal modelling and 3D lacunarity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hanen, Akkari, E-mail: bettaieb.hanen@topnet.t [Laboratoire de biophysique, TIM, Faculte de Medecine (Tunisia); Imen, Bhouri, E-mail: bhouri_imen@yahoo.f [Unite de recherche ondelettes et multifractals, Faculte des sciences (Tunisia); Asma, Ben Abdallah, E-mail: asma.babdallah@cristal.rnu.t [Laboratoire de biophysique, TIM, Faculte de Medecine (Tunisia); Patrick, Dubois, E-mail: pdubois@chru-lille.f [INSERM, U 703, Lille (France); Hedi, Bedoui Mohamed, E-mail: medhedi.bedoui@fmm.rnu.t [Laboratoire de biophysique, TIM, Faculte de Medecine (Tunisia)

    2009-09-28

    This study presents a comparative evaluation of lacunarity of 3D grey level models with different types of inhomogeneity. A new method based on the 'Relative Differential Box Counting' was developed to estimate the lacunarity features of grey level volumes. To validate our method, we generated a set of 3D grey level multifractal models with random, anisotropic and hierarchical properties. Our method gives a lacunarity measurement correlated with the theoretical one and allows a better model classification compared with a classical approach.
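
    For orientation, the sketch below computes gliding-box lacunarity for a 3D grey-level volume, taking box 'mass' as the summed voxel intensity. That mass definition is a common grey-level generalization and an assumption here; it is not the paper's 'Relative Differential Box Counting' method.

    ```python
    import numpy as np

    def gliding_box_lacunarity(volume, box):
        """Lacunarity of a 3D grey-level volume for one gliding-box size.

        Box mass = summed voxel intensity (assumed generalization), and
        Lambda = E[M^2] / E[M]^2 over all box positions.
        """
        masses = []
        nx, ny, nz = volume.shape
        for i in range(nx - box + 1):
            for j in range(ny - box + 1):
                for k in range(nz - box + 1):
                    masses.append(volume[i:i+box, j:j+box, k:k+box].sum())
        m = np.asarray(masses, dtype=float)
        return m.var() / m.mean() ** 2 + 1.0  # var/mean^2 + 1 = E[M^2]/E[M]^2

    rng = np.random.default_rng(0)
    vol = rng.integers(0, 256, size=(16, 16, 16))  # toy grey-level volume
    print(gliding_box_lacunarity(vol, box=4))
    ```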

  14. Analysis of Brown camera distortion model

    Science.gov (United States)

    Nowakowski, Artur; Skarbek, Władysław

    2013-10-01

    Contemporary image acquisition devices introduce optical distortion into images. It results in pixel displacement and therefore needs to be compensated for in many computer vision applications. The distortion is usually modeled by the Brown distortion model, whose parameters can be included in the camera calibration task. In this paper we describe the original model and its dependencies, and analyze the orthogonality with regard to radius of its decentering distortion component. We also report experiments with the camera calibration algorithm included in the OpenCV library; in particular, the stability of the distortion parameter estimation is evaluated.
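
    The radial and decentering terms of the Brown model are compact enough to state in code. The sketch below applies the standard form used, for example, in OpenCV calibration, on normalized image coordinates; the coefficient values in the example are illustrative, not calibrated.

    ```python
    def brown_distort(x, y, k1, k2, k3, p1, p2):
        """Apply the Brown(-Conrady) distortion model to normalized image
        coordinates (x, y): radial terms k1..k3, decentering terms p1, p2."""
        r2 = x * x + y * y
        radial = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3
        x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
        y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
        return x_d, y_d

    # Example: mild barrel distortion (illustrative coefficients).
    print(brown_distort(0.5, 0.3, k1=-0.2, k2=0.05, k3=0.0, p1=1e-3, p2=-5e-4))
    ```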

  15. q-gram analysis and urn models

    OpenAIRE

    Nicodème, Pierre

    2003-01-01

    Words of fixed size q are commonly referred to as q-grams. We consider the problem of q-gram filtration, a method commonly used to speed up sequence comparison. We are interested in the statistics of the number of q-grams common to two random texts (where multiplicities are not counted) in the non-uniform Bernoulli model. In the exact and dependent model, when omitting border effects, a q-gram in a random sequence depends on the q-1 preceding q-grams. In an approximate and independent model, we...
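
    A minimal sketch of the basic quantity studied, the number of q-grams common to two texts with multiplicities not counted, follows; the toy strings are illustrative.

    ```python
    def common_qgrams(text1, text2, q):
        """Number of q-grams common to two texts, multiplicities not counted."""
        grams1 = {text1[i:i+q] for i in range(len(text1) - q + 1)}
        grams2 = {text2[i:i+q] for i in range(len(text2) - q + 1)}
        return len(grams1 & grams2)

    print(common_qgrams("ACGTACGT", "CGTACGGA", q=3))  # shared 3-grams
    ```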

  16. Structural dynamic analysis with generalized damping models analysis

    CERN Document Server

    Adhikari, Sondipon

    2013-01-01

    Since Lord Rayleigh introduced the idea of viscous damping in his classic work "The Theory of Sound" in 1877, it has become standard practice to use this approach in dynamics, covering a wide range of applications from aerospace to civil engineering. However, in the majority of practical cases this approach is adopted more for mathematical convenience than for modeling the physics of vibration damping. Over the past decade, extensive research has been undertaken on more general "non-viscous" damping models and vibration of non-viscously damped systems. This book, along with a related book

  17. Horizontal crash testing and analysis of model flatrols

    International Nuclear Information System (INIS)

    To assess the behaviour of a full-scale flask and flatrol during a proposed demonstration impact into a tunnel abutment, a mathematical modelling technique was developed and validated. The work was performed at quarter scale and comprised both scale-model tests and mathematical analysis in one and two dimensions. Good agreement between the model test results of the 26.8 m/s (60 mph) abutment impacts and the mathematical analysis validated the modelling techniques. The modelling method may be used with confidence to predict the outcome of the proposed full-scale demonstration. (author)

  18. Flood Progression Modelling and Impact Analysis

    DEFF Research Database (Denmark)

    Mioc, Darka; Anton, François; Nickerson, B.;

    People living in the lower valley of the St. John River, New Brunswick, Canada, frequently experience flooding when the river overflows its banks during spring ice melt and rain. To better prepare the population of New Brunswick for extreme flooding, we developed a new flood prediction model that computes floodplain polygons before the flood occurs. This allows emergency managers to assess the impact of the flood before it occurs and to make early decisions for evacuation of the population and flood rescue. This research shows that the use of GIS and LiDAR technologies combined with hydrological modelling can significantly improve the decision making and visualization of flood impact needed for emergency planning and flood rescue. Furthermore, the 3D GIS application we developed for modelling flooded buildings and infrastructure provides a better platform for modelling and visualizing flood…

  19. Modeling and analysis of web portals performance

    Science.gov (United States)

    Abdul Rahim, Rahela; Ibrahim, Haslinda; Syed Yahaya, Sharipah Soaad; Khalid, Khairini

    2011-10-01

    The main objective of this study is to develop a queuing-theory model, at the system level, of web portal performance for a university. A system-level performance model views the system being modeled as a 'black box', characterized by the arrival rate of packets to the portal server and the service rate of the portal server. These two parameters are the key inputs for Web portal performance metrics such as server utilization, average server throughput, average number of packets in the server and mean response time. The study assumes an infinite calling population and a finite queue. The proposed analytical model is simple to define and fast to interpret, while still representing the real situation.
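
    The abstract specifies an infinite population and a finite queue; assuming the standard M/M/1/K queue as a concrete instance (the paper's exact variant is not stated here), the sketch below computes the cited metrics: server utilization, throughput, mean number in system, and mean response time.

    ```python
    def mm1k_metrics(lam, mu, K):
        """Steady-state metrics of an M/M/1/K queue (finite capacity K)."""
        rho = lam / mu
        if abs(rho - 1.0) < 1e-12:
            p = [1.0 / (K + 1)] * (K + 1)
        else:
            p0 = (1.0 - rho) / (1.0 - rho ** (K + 1))
            p = [p0 * rho ** n for n in range(K + 1)]
        L = sum(n * pn for n, pn in enumerate(p))   # mean number in system
        lam_eff = lam * (1.0 - p[K])                # accepted arrival rate
        return {
            "utilization": 1.0 - p[0],
            "throughput": lam_eff,
            "mean_in_system": L,
            "mean_response_time": L / lam_eff,      # Little's law
        }

    # Illustrative rates: 80 packets/s arriving, 100 packets/s served, buffer 20.
    print(mm1k_metrics(lam=80.0, mu=100.0, K=20))
    ```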

  20. Complex networks analysis in socioeconomic models

    CERN Document Server

    Varela, Luis M; Ausloos, Marcel; Carrete, Jesus

    2014-01-01

    This chapter aims at reviewing complex networks models and methods that were either developed for or applied to socioeconomic issues, and pertinent to the theme of New Economic Geography. After an introduction to the foundations of the field of complex networks, the present summary adds insights on the statistical mechanical approach, and on the most relevant computational aspects for the treatment of these systems. As the most frequently used model for interacting agent-based systems, a brief description of the statistical mechanics of the classical Ising model on regular lattices, together with recent extensions of the same model on small-world Watts-Strogatz and scale-free Albert-Barabasi complex networks is included. Other sections of the chapter are devoted to applications of complex networks to economics, finance, spreading of innovations, and regional trade and developments. The chapter also reviews results involving applications of complex networks to other relevant socioeconomic issues, including res...

  1. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    OpenAIRE

    Kiuru Aaro; Kormano Martti; Svedström Erkki; Liang Jianming; Järvi Timo

    2003-01-01

    The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion ana...

  2. Modeling, Analysis, and Numerics in Electrohydrodynamics

    OpenAIRE

    Schmuck, Markus

    2008-01-01

    The main subject of this thesis is to analyze the incompressible Navier-Stokes-Nernst-Planck-Poisson system for bounded domains. Such a system is used as a model in electrohydrodynamics or physicochemical models. First, we verify existence of weak and strong solutions. Moreover, we are able to characterize the weak solutions by an energy and an entropy law. The concentrations in the Nernst-Planck equations additionally are non-negative and bounded. These results motivate to construct conv...

  3. Tradeoff Analysis for Optimal Multiobjective Inventory Model

    OpenAIRE

    Longsheng Cheng; Ching-Shih Tsou; Ming-Chang Lee; Li-Hua Huang; Dingwei Song; Wei-Shan Teng

    2013-01-01

    The deterministic inventory model, the economic order quantity (EOQ), reveals a tradeoff between carrying inventory and ordering frequency. For probabilistic demand, the tradeoff surface among annual orders, expected inventory and shortage is useful because it quantifies what the firm must pay in terms of ordering workload and inventory investment to meet the customer service desired. Based on a triobjective inventory model, this paper employs the successive approximation to obtai...
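
    For reference, the classical EOQ tradeoff the paper starts from can be computed directly with the textbook formula Q* = sqrt(2DS/H); the demand and cost figures below are invented for illustration, and the paper's triobjective procedure is not reproduced here.

    ```python
    from math import sqrt

    def eoq(annual_demand, order_cost, holding_cost):
        """Economic order quantity: balances ordering and carrying costs."""
        return sqrt(2.0 * annual_demand * order_cost / holding_cost)

    D, S, H = 12000.0, 50.0, 2.4          # units/yr, $/order, $/unit-yr
    Q = eoq(D, S, H)
    print(f"EOQ = {Q:.0f} units, {D / Q:.1f} orders per year")
    ```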

  4. Model-driven Engineering for Requirements Analysis

    OpenAIRE

    Baudry, Benoit; Nebut, Clementine; Le Traon,Yves

    2007-01-01

    Requirements engineering (RE) encompasses a set of activities for eliciting, modelling, agreeing, communicating and validating requirements that precisely define the problem domain for a software system. Several tools and methods exist to perform each of these activities, but they mainly remain separate, making it difficult to capture the global consistency of large requirement documents. In this paper we introduce model-driven engineering (MDE) as a possible technical solution to integrate the...

  5. Digital Avionics Information System (DAIS): Training Requirements Analysis Model (TRAMOD).

    Science.gov (United States)

    Czuchry, Andrew J.; And Others

    The training requirements analysis model (TRAMOD) described in this report represents an important portion of the larger effort called the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study. TRAMOD is the second of three models that comprise an LCC impact modeling system for use in the early stages of system development. As…

  6. Mathematical Modelling and Experimental Analysis of Early Age Concrete

    DEFF Research Database (Denmark)

    Hauggaard-Nielsen, Anders Boe

    1997-01-01

    lead to cracks in the later cooling phase. The material model has intricate couplings between the mechanisms involved, and in the thesis special emphasis is put on the creep behaviour. The mathematical models are based on experimental analysis and numerical implementation of the models in a finite…

  7. Adversarial Scheduling Analysis of Game Theoretic Models of Norm Diffusion

    CERN Document Server

    Istrate, Gabriel; Ravi, S S

    2008-01-01

    In (Istrate, Marathe, Ravi SODA 2001) we advocated the investigation of robustness of results in the theory of learning in games under adversarial scheduling models. We provide evidence that such an analysis is feasible and can lead to nontrivial results by investigating, in an adversarial scheduling setting, Peyton Young's model of diffusion of norms. In particular, our main result incorporates into Peyton Young's model…

  8. Integration of Design and Control Through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay;

    2000-01-01

    phenomena models representing the process model identify the relationships between the important process and design variables, which help to understand, define and address some of the issues related to the integration of design and control. The model analysis is highlighted through examples involving processes with mass and/or energy recycle. (C) 2000 Elsevier Science Ltd. All rights reserved.

  9. Comparative Analysis of Uncertainties in Urban Surface Runoff Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Schaarup-Jensen, Kjeld

    In the present paper a comparison between three different surface runoff models, in the numerical urban drainage tool MOUSE, is conducted. Analysing parameter uncertainty, it is shown that the models are very sensitive with regard to the choice of hydrological parameters, when combined overflow … analysis, further research into improved parameter assessment for surface runoff models is needed…

  10. Analysis of multistate models for electromigration failure

    Science.gov (United States)

    Dwyer, V. M.

    2010-02-01

    The application of a multistate Markov chain is considered as a model of electromigration interconnect degradation and eventual failure. Such a model has already been used [Tan et al., J. Appl. Phys. 102, 103703 (2007)], maintaining that, in general, it leads to a failure distribution described by a gamma mixture, and that as a result, this type of distribution (rather than a lognormal) should be used as a prior in any Bayesian mode fitting and subsequent reliability budgeting. Although it appears that the model is able to produce reasonably realistic resistance curves R(t), we are unable to find any evidence that the failure distribution is a simple gamma mixture except under contrived conditions. The distributions generated are largely sums of exponentials (phase-type distributions), convolutions of gamma distributions with different scales, or roughly normal. We note also some inconsistencies in the derivation of the gamma mixture in the work cited above and conclude that, as it stands, the Markov chain model is probably unsuitable for electromigration modeling and a change from lognormal to gamma mixture distribution generally cannot be justified in this way. A hidden Markov model, which describes the interconnect behavior at time t rather than its resistance, in terms of generally observed physical processes such as void nucleating, slitlike growth (where the growth is slow and steady), transverse growth, current shunting (where the resistance jumps in value), etc., seems a more likely prospect, but treating failure in such a manner would still require significant justification.
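
    To make the phase-type point concrete, the sketch below simulates failure times of a toy linear degradation chain, whose time to absorption is a sum of exponentials (a hypoexponential, one of the simplest phase-type distributions). The states and rates are invented for illustration, not fitted to electromigration data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical 4-state degradation chain: 0 -> 1 -> 2 -> 3 (failed).
    rates = [1.0, 0.5, 2.0]               # exit rates of states 0, 1, 2 (1/h)

    def time_to_failure():
        # Sum of independent exponential sojourn times = a phase-type variate.
        return sum(rng.exponential(1.0 / r) for r in rates)

    samples = np.array([time_to_failure() for _ in range(100_000)])
    print(f"mean TTF = {samples.mean():.2f} h, std = {samples.std():.2f} h")
    ```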

  11. COMPUTER DATA ANALYSIS AND MODELING: COMPLEX STOCHASTIC DATA AND SYSTEMS

    OpenAIRE

    2010-01-01

    This collection of papers includes proceedings of the Ninth International Conference “Computer Data Analysis and Modeling: Complex Stochastic Data and Systems” organized by the Belarusian State University and held in September 2010 in Minsk. The papers are devoted to the topical problems: robust and nonparametric data analysis; statistical analysis of time series and forecasting; multivariate data analysis; design of experiments; statistical signal and image processing...

  12. Urban Sprawl Analysis and Modeling in Asmara, Eritrea

    OpenAIRE

    Tewolde, Mussie G.; Pedro Cabral

    2011-01-01

    The extension of urban perimeter markedly cuts available productive land. Hence, studies in urban sprawl analysis and modeling play an important role to ensure sustainable urban development. The urbanization pattern of the Greater Asmara Area (GAA), the capital of Eritrea, was studied. Satellite images and geospatial tools were employed to analyze the spatiotemporal urban landuse changes. Object-Based Image Analysis (OBIA), Landuse Cover Change (LUCC) analysis and urban sprawl analysis using ...

  13. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available The REFII model is an original mathematical model for time series data mining. Its main purpose is to automate time series analysis through a unique transformation model for time series. An advantage of this approach is that it links different methods for time series analysis, connecting traditional data mining tools for time series and supporting the construction of new algorithms. It is worth mentioning that the REFII model is not a closed system, meaning that the set of methods is not fixed. First of all, it is a model for transforming the values of a time series, which prepares data to be used by different sets of methods based on the same transformation model within the problem domain. The REFII model thus gives a new approach to time series analysis based on a unique transformation model, which serves as a basis for all kinds of time series analysis. A further advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.
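
    As a loose sketch of the symbolic transformation such a model builds on, the code below encodes consecutive differences of a series as rise/equality/fall symbols. This is our simplified reading for illustration; the actual REFII transformation carries further attributes (e.g., slope and area measures) not shown here.

    ```python
    def ref_encode(series, tolerance=0.0):
        """Encode consecutive differences as R (rise), E (equality), F (fall)."""
        symbols = []
        for prev, curr in zip(series, series[1:]):
            diff = curr - prev
            if diff > tolerance:
                symbols.append("R")
            elif diff < -tolerance:
                symbols.append("F")
            else:
                symbols.append("E")
        return "".join(symbols)

    print(ref_encode([10, 12, 12, 9, 14]))  # -> "REFR"
    ```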

  14. Simplified Analysis Model for Predicting Pyroshock Responses on Composite Panel

    Science.gov (United States)

    Iwasa, Takashi; Shi, Qinzhong

    A simplified analysis model based on frequency response analysis and wave propagation analysis was established for predicting the Shock Response Spectrum (SRS) of a composite panel subjected to pyroshock loadings. The complex composite panel was modeled as an isotropic single-layer panel as defined in the NASA Lewis method. An impact excitation test on a composite panel with no equipment mounted showed that the simplified analysis model could accurately estimate the SRS, as well as the acceleration peak values, in both the near and far field. In addition, through simulation of actual pyroshock tests on an actual satellite system, the simplified analysis model proved applicable to predicting actual pyroshock responses, while bringing forth several technical issues in estimating pyroshock test specifications in early design stages.

  15. Statistical Performance Analysis and Modeling Techniques for Nanometer VLSI Designs

    CERN Document Server

    Shen, Ruijing; Yu, Hao

    2012-01-01

    Since process variation and chip performance uncertainties have become more pronounced as technologies scale down into the nanometer regime, accurate and efficient modeling or characterization of variations from the device to the architecture level have  become imperative for the successful design of VLSI chips. This book provides readers with tools for variation-aware design methodologies and computer-aided design (CAD) of VLSI systems, in the presence of process variations at the nanometer scale. It presents the latest developments for modeling and analysis, with a focus on statistical interconnect modeling, statistical parasitic extractions, statistical full-chip leakage and dynamic power analysis considering spatial correlations, statistical analysis and modeling for large global interconnects and analog/mixed-signal circuits.  Provides readers with timely, systematic and comprehensive treatments of statistical modeling and analysis of VLSI systems with a focus on interconnects, on-chip power grids and ...

  16. Modeling Mass Spectrometry Based Protein Analysis

    OpenAIRE

    Eriksson, Jan; Fenyö, David

    2011-01-01

    The success of mass spectrometry based proteomics depends on efficient methods for data analysis. These methods require a detailed understanding of the information value of the data. Here, we describe how the information value can be elucidated by performing simulations using synthetic data.

  17. The case for repeatable analysis with energy economy optimization models

    International Nuclear Information System (INIS)

    Energy economy optimization (EEO) models employ formal search techniques to explore the future decision space over several decades in order to deliver policy-relevant insights. EEO models are a critical tool for decision-makers who must make near-term decisions with long-term effects in the face of large future uncertainties. While the number of model-based analyses proliferates, insufficient attention is paid to transparency in model development and application. Given the complex, data-intensive nature of EEO models and the general lack of access to source code and data, many of the assumptions underlying model-based analysis are hidden from external observers. This paper discusses the simplifications and subjective judgments involved in the model building process, which cannot be fully articulated in journal papers, reports, or model documentation. In addition, we argue that for all practical purposes, EEO model-based insights cannot be validated through comparison to real world outcomes. As a result, modelers are left without credible metrics to assess a model's ability to deliver reliable insight. We assert that EEO models should be discoverable through interrogation of publicly available source code and data. In addition, third parties should be able to run a specific model instance in order to independently verify published results. Yet a review of twelve EEO models suggests that in most cases, replication of model results is currently impossible. We provide several recommendations to help develop and sustain a software framework for repeatable model analysis.

  18. The ECOGEM-Chile Model: A CGE Model for Environmental and Trade Policy Analysis

    OpenAIRE

    Raúl O’Ryan; De Miguel, Carlos J.; Sebastián Miller

    2003-01-01

    Computable General Equilibrium (CGE) models are a powerful economic tool for multidimensional/multisectoral analysis. They improve traditional input-output analysis generating quantities and prices endogenously and reflecting market incentives. They complement partial equilibrium analysis with a broader scope of analysis and the quantification of indirect and often non-intuitive effects. Environmental applications of CGE models include trade and environment, climate change, energy problems, n...

  19. MODELING HUMAN RELIABILITY ANALYSIS USING MIDAS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; Donald D. Dudenhoeffer; Bruce P. Hallbert; Brian F. Gore

    2006-05-01

    This paper summarizes an emerging collaboration between Idaho National Laboratory and NASA Ames Research Center regarding the utilization of high-fidelity MIDAS simulations for modeling control room crew performance at nuclear power plants. The key envisioned uses for MIDAS-based control room simulations are: (i) the estimation of human error with novel control room equipment and configurations, (ii) the investigative determination of risk significance in recreating past event scenarios involving control room operating crews, and (iii) the certification of novel staffing levels in control rooms. It is proposed that MIDAS serves as a key component for the effective modeling of risk in next generation control rooms.

  20. MODELING HUMAN RELIABILITY ANALYSIS USING MIDAS

    International Nuclear Information System (INIS)

    This paper summarizes an emerging collaboration between Idaho National Laboratory and NASA Ames Research Center regarding the utilization of high-fidelity MIDAS simulations for modeling control room crew performance at nuclear power plants. The key envisioned uses for MIDAS-based control room simulations are: (1) the estimation of human error with novel control room equipment and configurations, (2) the investigative determination of risk significance in recreating past event scenarios involving control room operating crews, and (3) the certification of novel staffing levels in control rooms. It is proposed that MIDAS serves as a key component for the effective modeling of risk in next generation control rooms

  1. A critical analysis of the hydrino model

    CERN Document Server

    Rathke, A

    2005-01-01

    Recently, spectroscopic and calorimetric observations of hydrogen plasmas and chemical reactions with them have been interpreted as evidence for the existence of electronic states of the hydrogen atom with a binding energy of more than 13.6 eV. The theoretical basis for such states, which have been dubbed hydrinos, is investigated. We discuss both the novel deterministic model of the hydrogen atom, in which the existence of hydrinos was predicted, and standard quantum mechanics. Severe inconsistencies in the deterministic model are pointed out and the incompatibility of hydrino states with quantum mechanics is reviewed.

  2. ANALYSIS OF SOFTWARE COST ESTIMATION MODELS

    OpenAIRE

    Tahir Abdullah; Rabia Saleem; Shahbaz Nazeer; Muhammad Usman

    2012-01-01

    Software cost estimation is the process of forecasting the cost of a project in terms of budget, time, and other resources needed to complete a software system; estimating the cost before initiating the project is a core issue in software project management. Different models have been developed to estimate the cost of software projects over the last several years. Most of these models rely on the analysts' experience, the size of the software project and some other sof...
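
    As one canonical example of the size-driven models such surveys cover, the sketch below evaluates the basic COCOMO effort equation, E = a * KLOC^b, with the published 'organic-mode' coefficients; it is included for orientation and is not necessarily among the specific models analyzed in this paper.

    ```python
    # Basic COCOMO effort equation (person-months), organic-mode coefficients.
    def cocomo_effort(kloc, a=2.4, b=1.05):
        return a * kloc ** b

    for size in (10, 50, 200):
        print(f"{size:4d} KLOC -> {cocomo_effort(size):7.1f} person-months")
    ```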

  3. Applications of model theory to functional analysis

    CERN Document Server

    Iovino, Jose

    2014-01-01

    During the last two decades, methods that originated within mathematical logic have exhibited powerful applications to Banach space theory, particularly set theory and model theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the

  4. CRITICAL ANALYSIS OF EVALUATION MODEL LOMCE

    Directory of Open Access Journals (Sweden)

    José Luis Bernal Agudo

    2015-06-01

    Full Text Available The evaluation model put forward by the LOMCE is rooted in neoliberal beliefs, reflecting a specific way of understanding the world. What matters is not the process but the results, with evaluation at the centre of the teaching-learning processes. The model is poorly planned, since the theory that justifies it is not developed into coherent proposals; there is an excessive concern for excellence, and diversity is left out. A comprehensive way of understanding education should be recovered.

  5. Non-compartment model to compartment model pharmacokinetics transformation meta-analysis – a multivariate nonlinear mixed model

    OpenAIRE

    2010-01-01

    Background To fulfill model-based drug development, the very first step is usually to establish a model from the published literature. The pharmacokinetics model is the central piece of model-based drug development. This paper proposes an approach to transform published non-compartment-model pharmacokinetic (PK) parameters into compartment-model PK parameters. The meta-analysis was performed with a multivariate nonlinear mixed model. A conditional first-order linearization approach ...

  6. Theory and application of experimental model analysis in earthquake engineering

    Science.gov (United States)

    Moncarz, P. D.

    The feasibility and limitations of small-scale model studies in earthquake engineering research and practice are considered with emphasis on dynamic modeling theory, a study of the mechanical properties of model materials, the development of suitable model construction techniques and an evaluation of the accuracy of prototype response prediction through model case studies on components and simple steel and reinforced concrete structures. It is demonstrated that model analysis can be used in many cases to obtain quantitative information on the seismic behavior of complex structures which cannot be analyzed confidently by conventional techniques. Methodologies for model testing and response evaluation are developed in the project and applications of model analysis in seismic response studies on various types of civil engineering structures (buildings, bridges, dams, etc.) are evaluated.

  7. Coverage Modeling and Reliability Analysis Using Multi-state Function

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Fault tree analysis is an effective method for predicting the reliability of a system. It gives a pictorial representation and logical framework for analyzing the reliability. Also, it has been used for a long time as an effective method for the quantitative and qualitative analysis of the failure modes of critical systems. In this paper, we propose a new general coverage model (GCM) based on hardware-independent faults. Using this model, an effective software tool can be constructed to detect, locate and recover faults in the faulty system. This model can be applied to identify the key component that can cause the failure of the system, using failure mode effect analysis (FMEA).

  8. Evaluation of Cost Models and Needs & Gaps Analysis

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2014-01-01

    This report, ’D3.1—Evaluation of Cost Models and Needs & Gaps Analysis’, provides an analysis of existing research related to the economics of digital curation and cost & benefit modelling. It reports upon the investigation of how well current models and tools meet stakeholders’ needs for calculating … for a more efficient use of resources for digital curation. To facilitate and clarify the model evaluation, the report first outlines a basic terminology and a general description of the characteristics of cost and benefit models. The report then describes how the ten current and emerging cost and benefit … breakdown costs. This is followed by an in-depth analysis of stakeholders’ needs for financial information derived from the 4C project stakeholder consultation. The stakeholders’ needs analysis indicated that models should: support accounting, but more importantly enable budgeting; be able to…

  9. Analysis of mathematical model of leukemia

    Directory of Open Access Journals (Sweden)

    Helal Mohamed

    2015-01-01

    Full Text Available In this paper, a model describing the dynamics of chronic myeloid leukemia is studied. By analyzing the corresponding characteristic equations, the local stability of trivial and nontrivial equilibria is discussed. By establishing appropriate Lyapunov functions, we prove the global stability of the positive constant equilibrium solutions.

  10. Transient analysis models for nuclear power plants

    International Nuclear Information System (INIS)

    The modelling used for the simulation of the Angra-1 reactor start-up tests with the RETRAN computer code is presented. Three tests are simulated: a) nuclear power plant trip from 100% power; b) large power excursion tests; and c) 'load swing' tests. (E.G.)

  11. Social Ecological Model Analysis for ICT Integration

    Science.gov (United States)

    Zagami, Jason

    2013-01-01

    ICT integration of teacher preparation programmes was undertaken by the Australian Teaching Teachers for the Future (TTF) project in all 39 Australian teacher education institutions and highlighted the need for guidelines to inform systemic ICT integration approaches. A Social Ecological Model (SEM) was used to positively inform integration…

  12. Characteristic Analysis of Fire Modeling Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Hwan; Yang, Joon Eon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Jong Hoon [Kyeongmin College, Ujeongbu (Korea, Republic of)

    2004-04-15

    This report documents and compares key features of four zone models: CFAST, COMPBRN IIIE, MAGIC and the Fire Induced Vulnerability Evaluation (FIVE) methodology. CFAST and MAGIC handle multi-compartment, multi-fire problems, using many equations; COMPBRN and FIVE handle single-compartment, single-fire-source problems, using simpler equations. The increased rigor of the formulation of CFAST and MAGIC does not mean that these codes are more accurate in every domain; for instance, the FIVE methodology uses a single-zone approximation with a plume/ceiling jet sublayer, while the other models use a two-zone treatment without a plume/ceiling jet sublayer. Comparisons with enclosure fire data indicate that inclusion of plume/ceiling jet sublayer temperatures is more conservative, and generally more accurate, than neglecting them. Adding a plume/ceiling jet sublayer to the two-zone models should be relatively straightforward, but it has not been done yet for any of the two-zone models. Such an improvement is in progress for MAGIC.

  13. Feature Analysis for Modeling Game Content Quality

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Togelius, Julian

    2011-01-01

    … preferences, and by defining the smallest game session size for which the model can still predict reported emotion with acceptable accuracy. Neuroevolutionary preference learning is used to approximate the function from game content to reported emotional preferences. The experiments are based on a modified...

  14. Vibration analysis with MADYMO human models

    NARCIS (Netherlands)

    Verver, M.M.; Hoof, J.F.A.M. van

    2002-01-01

    The importance of comfort for the automotive industry is increasing. Car manufacturers use comfort to distinguish their products from their competitors. However, the development and design of a new car seat or interior is very time consuming and expensive. The introduction of computer models of huma

  15. Electrical Power Distribution and Control Modeling and Analysis

    Science.gov (United States)

    Fu, Johnny S.; Liffring, Mark; Mehdi, Ishaque S.

    2001-01-01

    This slide presentation reviews the use of Electrical Power Distribution and Control (EPD&C) Modeling and how modeling can support analysis. The presentation discusses using the EASY5 model to simulate and analyze the Space Shuttle Electric Auxiliary Power Unit. Diagrams of the model schematics are included, as well as graphs of the battery cell impedance, hydraulic load dynamics, and EPD&C response to hydraulic load variations.

  16. Continuum methods of physical modeling continuum mechanics, dimensional analysis, turbulence

    CERN Document Server

    Hutter, Kolumban

    2004-01-01

    The book unifies classical continuum mechanics and turbulence modeling, i.e. the same fundamental concepts are used to derive model equations for material behaviour and turbulence closure and complements these with methods of dimensional analysis. The intention is to equip the reader with the ability to understand the complex nonlinear modeling in material behaviour and turbulence closure as well as to derive or invent his own models. Examples are mostly taken from environmental physics and geophysics.

  17. Employing Power Graph Analysis to Facilitate Modeling Molecular Interaction Networks

    Directory of Open Access Journals (Sweden)

    Momchil Nenov

    2015-04-01

    Full Text Available Mathematical modeling is used to explore and understand complex systems ranging from weather patterns to social networks to gene-expression regulatory mechanisms. There is an upper limit, imposed by finite computational resources, to the amount of detail that can be reflected in a model. Thus, there are methods to reduce the complexity of the modeled system to its most significant parameters. We discuss the suitability of clustering techniques, in particular Power Graph Analysis, as an intermediate step of modeling.

  18. Photovoltaic Generation Model for Power System Transient Stability Analysis

    OpenAIRE

    Linan Qu; Dawei Zhao; Tao Shi; Ning Chen; Jie Ding

    2013-01-01

    For the stability analysis of power systems with large-scale PV connected, it is necessary to model the photovoltaic generation system on power system electromechanical transient time scales. The model should reflect the non-linear output characteristics, fault ride-through response characteristics and output limits of the photovoltaic generation system. A PV model meeting these demands is proposed in this paper. Based on a 3-generator, 9-bus power system, the comparison and verification of the model is ...

  19. Human Modeling For Ground Processing Human Factors Engineering Analysis

    Science.gov (United States)

    Tran, Donald; Stambolian, Damon; Henderson, Gena; Barth, Tim

    2011-01-01

    There have been many advancements and accomplishments over the last few years in the use of human modeling for human factors engineering analysis in the design of spacecraft and launch vehicles. The key methods used for this are motion capture and computer-generated human models. The focus of this paper is to explain the different types of human modeling used currently and in the past at Kennedy Space Center (KSC), and to explain the future plans for human modeling for future spacecraft designs.

  20. Non-linear coupled CNN models for multiscale image analysis

    OpenAIRE

    Corinto, Fernando; Biey, Mario; Gilli, Marco

    2006-01-01

    A CNN model of partial differential equations (PDEs) for image multiscale analysis is proposed. The model is based on a polynomial representation of the diffusivity function and defines a paradigm of polynomial CNNs, for approximating a large class of nonlinear isotropic and/or anisotropic PDEs. The global dynamics of space-discrete polynomial CNN models is analyzed and compared with the dynamic behavior of the corresponding space-continuous PDE models. It is shown that in the isotropic case th...

  1. Tram wheel modelling for dynamic analysis

    Czech Academy of Sciences Publication Activity Database

    Pešek, Luděk; Veselý, Jan; Podruh, J.

    Brno : SVS FEN s.r.o., 2003, s. 1-7. ISBN 80-239-1598-3. [ANSYS & STAR-CD Users' Meeting /11./. Znojmo (CZ), 25.09.2003-26.09.2003] R&D Projects: GA ČR GA101/02/0241 Institutional research plan: CEZ:AV0Z2076919 Keywords : tram wheel * experimental modal analysis Subject RIV: BI - Acoustics

  2. Modeling of Supply Chain Contextual-Load Model for Instability Analysis

    OpenAIRE

    Saad, Nordin; Kadirkamanathan, Visakan; Bennett, Stuart

    2008-01-01

    Simulation using a DES (discrete event simulation) is an effective tool for handling dynamically changing supply chain variables, allowing the system to be modeled more realistically. The modelling, simulation, and analysis of a supply chain discussed in this chapter are a preliminary attempt to establish a methodology for modelling, simulation and analysis of a supply chain using a DES. There are further problems to be considered, both in the development of the model and in the experimental design. The main contributions o...

  3. ANALYSIS/MODEL COVER SHEET, MULTISCALE THERMOHYDROLOGIC MODEL

    International Nuclear Information System (INIS)

    The purpose of the Multiscale Thermohydrologic Model (MSTHM) is to describe the thermohydrologic evolution of the near-field environment (NFE) and engineered barrier system (EBS) throughout the potential high-level nuclear waste repository at Yucca Mountain for a particular engineering design (CRWMS M&O 2000c). The process-level model will provide thermohydrologic (TH) information and data (such as in-drift temperature, relative humidity, liquid saturation, etc.) for use in other technical products. This data is provided throughout the entire repository area as a function of time. The MSTHM couples the Smeared-heat-source Drift-scale Thermal-conduction (SDT), Line-average-heat-source Drift-scale Thermohydrologic (LDTH), Discrete-heat-source Drift-scale Thermal-conduction (DDT), and Smeared-heat-source Mountain-scale Thermal-conduction (SMT) submodels such that the flow of water and water vapor through partially-saturated fractured rock is considered. The MSTHM accounts for 3-D drift-scale and mountain-scale heat flow, repository-scale variability of stratigraphy and infiltration flux, and waste package (WP)-to-WP variability in heat output from WPs. All submodels use the nonisothermal unsaturated-saturated flow and transport (NUFT) simulation code. The MSTHM is implemented in several data-processing steps. The four major steps are: (1) submodel input-file preparation, (2) execution of the four submodel families with the use of the NUFT code, (3) execution of the multiscale thermohydrologic abstraction code (MSTHAC), and (4) binning and post-processing (i.e., graphics preparation) of the output from MSTHAC. Section 6 describes the MSTHM in detail. The objectives of this Analyses and Model Report (AMR) are to investigate near field (NF) and EBS thermohydrologic environments throughout the repository area at various evolution periods, and to provide TH data that may be used in other process model reports

  4. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    Science.gov (United States)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
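
    A minimal sketch of the probabilistic step described, propagating uncertain inputs through a screening-level relation (indoor concentration = source vapor concentration x attenuation factor) by Monte Carlo sampling, is shown below. Both the relation and the distributions are illustrative assumptions, not the study's 3D model or its fitted parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Hypothetical input distributions (illustrative only).
    c_source = rng.lognormal(mean=np.log(100.0), sigma=0.5, size=N)  # ug/m^3
    alpha = rng.lognormal(mean=np.log(1e-3), sigma=1.0, size=N)      # attenuation

    c_indoor = c_source * alpha
    p50, p95 = np.percentile(c_indoor, [50, 95])
    print(f"indoor air conc.: median {p50:.3f}, 95th pct {p95:.3f} ug/m^3")
    ```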

  5. QuantUM: Quantitative Safety Analysis of UML Models

    Directory of Open Access Journals (Sweden)

    Florian Leitner-Fischer

    2011-07-01

    Full Text Available When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods it is still difficult for software and system architects to integrate these techniques into their every day work. This is mainly due to the lack of methods that can be directly applied to architecture level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial strength case study.

  6. DNB analysis with mechanistic models for PWR fuel assemblies

    International Nuclear Information System (INIS)

    In order to predict the DNB heat flux of PWR fuel assemblies and the critical power of BWR fuel bundles, the boiling transition analysis code CAPE has been developed in the IMPACT project. The CAPE code for PWRs includes three analysis modules: a subchannel analysis module, a three-dimensional two-phase flow analysis module, and a DNB evaluation module. The subchannel module uses a drift-flux model to identify the hottest subchannel. The three-dimensional two-phase flow analysis module uses a nonhomogeneous, nonequilibrium two-fluid model to analyze detailed three-dimensional two-phase flow behaviors such as the void distribution. For DNB heat flux prediction, the DNB evaluation module uses the Weisman model, which is a mechanistic DNB evaluation model. This paper describes the analysis models, analysis techniques and the results of validation by rod bundle test analysis. To date, the average difference between calculated and measured values was -0.6% with a standard deviation of 7.0%. (author)

  7. Modeling for insight using Tools for Energy Model Optimization and Analysis (Temoa)

    International Nuclear Information System (INIS)

    This paper introduces Tools for Energy Model Optimization and Analysis (Temoa), an open source framework for conducting energy system analysis. The core component of Temoa is an energy economy optimization (EEO) model, which minimizes the system-wide cost of energy supply by optimizing the deployment and utilization of energy technologies over a user-specified time horizon. The design of Temoa is intended to fill a unique niche within the energy modeling landscape by addressing two critical shortcomings associated with existing models: an inability to perform third party verification of published model results and the difficulty of conducting uncertainty analysis with large, complex models. Temoa leverages a modern revision control system to publicly archive model source code and data, which ensures repeatability of all published modeling work. From its initial conceptualization, Temoa was also designed for operation within a high performance computing environment to enable rigorous uncertainty analysis. We present the algebraic formulation of Temoa and conduct a verification exercise by implementing a simple test system in both Temoa and MARKAL, a widely used commercial model of the same type. In addition, a stochastic optimization of the test system is presented as a proof-of-concept application of uncertainty analysis using the Temoa framework. - Highlights: • We introduce Temoa, an open source, bottom-up energy system model. • Temoa is designed to facilitate third party replication of model-based results. • Temoa can utilize high performance computing resources for uncertainty analysis. • We present the model's algebraic formulation and a simple application
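
    As a toy analogue of the EEO core, a cost-minimizing deployment choice subject to demand and resource constraints can be posed as a small linear program. The technologies, costs and limits below are invented for illustration, and this sketch is in no way Temoa's actual formulation.

    ```python
    from scipy.optimize import linprog

    # Toy problem: choose generation x = [coal, wind] (GWh) to meet demand
    # at least cost. All figures are illustrative assumptions.
    cost = [60.0, 80.0]                       # objective coefficients
    A_ub = [[-1.0, -1.0],                     # -(coal + wind) <= -demand
            [0.0, 1.0]]                       # wind <= resource limit
    b_ub = [-100.0, 40.0]
    res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
    print(res.x, res.fun)                     # optimal mix and system cost
    ```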

  8. Application of visual modelization technology to reactor thermohydraulics analysis programmes

    International Nuclear Information System (INIS)

    Visual modelization and XML were used to build a thermal-hydraulic safety analysis model library for the RELAP5 programme. The modeling system frame was constructed from components transferred from this model library. Interfaces were described with thermal-hydraulic parameters based on the Windows platform. Every component was linked with the corresponding interface and input parameters, and a RELAP input deck was created after error checking. The analysis of a simple fluid flow in a pipe verified that this technology can simplify the use of RELAP and improve the efficiency of studies. (authors)

  9. Modelling Analysis of Sewage Sludge Amended Soil

    DEFF Research Database (Denmark)

    Sørensen, P. B.; Carlsen, L.; Vikelsøe, J.; Rasmussen, A. G.

    The topic is risk assessment of sludge supply to agricultural soil in relation to xenobiotics. A large variety of xenobiotics arrive at the wastewater treatment plant in the wastewater. Many of these components are hydrophobic and thus accumulate in the sludge solids and are removed from the plant effluent. The focus in this work is the top soil, as this layer is important for the fate of a xenobiotic substance due to the high biological activity. A simple model for the top soil is used in which the substance is assumed homogeneously distributed, as suggested in the European Union System for the Evaluation of Substances (EUSES). It is shown that the fraction of substance mass which is leached from the top soil is a simple function of the ratio between the degradation half-life and the adsorption coefficient. This model can be used in probabilistic risk assessment of agricultural soils and…
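
    One way to read the cited result: if degradation and leaching are treated as competing first-order losses, the leached fraction depends only on the ratio of the two rates. The sketch below adopts that assumed form for illustration; the functional form, the mobility proxy and all numbers are our assumptions, not the EUSES model itself.

    ```python
    import math

    # Assumed competing first-order losses: leached fraction = k_leach / (k_leach + k_deg).
    def leached_fraction(half_life_days, adsorption_coeff, water_flux=1.0):
        k_deg = math.log(2.0) / half_life_days       # degradation rate (1/d)
        k_leach = water_flux / adsorption_coeff      # mobility ~ 1/adsorption (assumed)
        return k_leach / (k_leach + k_deg)

    for t_half in (10.0, 100.0, 1000.0):             # days
        print(t_half, round(leached_fraction(t_half, adsorption_coeff=50.0), 3))
    ```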

  10. Simulation modeling and analysis in safety. II

    International Nuclear Information System (INIS)

    The paper introduces and illustrates simulation modeling as a viable approach for dealing with complex issues and decisions in safety and health. The author details two studies: evaluation of employee exposure to airborne radioactive materials and effectiveness of the safety organization. The first study seeks to define a policy to manage a facility used in testing employees for radiation contamination. An acceptable policy is one that would permit the testing of all employees as defined under regulatory requirements, while not exceeding available resources. The second study evaluates the relationship between safety performance and the characteristics of the organization, its management, its policy, and communication patterns among various functions and levels. Both studies use models where decisions are reached based on the prevailing conditions and occurrence of key events within the simulation environment. Finally, several problem areas suitable for simulation studies are highlighted. (Auth.)

  11. Numerical model for atomtronic circuit analysis

    CERN Document Server

    Chow, Weng W; Anderson, Dana Z

    2015-01-01

    A model for studying atomtronic devices and circuits based on finite temperature Bose-condensed gases is presented. The approach involves numerically solving equations of motion for atomic populations and coherences, derived using the Bose-Hubbard Hamiltonian and the Heisenberg picture. The resulting cluster expansion is truncated at a level giving balance between physics rigor and numerical demand mitigation. This approach allows parametric studies involving time scales that cover both the rapid population dynamics relevant to non-equilibrium state evolution, as well as the much longer time durations typical for reaching steady-state device operation. The model is demonstrated by studying the evolution of a Bose-condensed gas in the presence of atom injection and extraction in a double-well potential. In this configuration phase-locking between condensates in each well of the potential is readily observed, and its influence on the evolution of the system is studied.

  12. A Numerical Model for Atomtronic Circuit Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chow, Weng W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Straatsma, Cameron J. E. [Univ. of Colorado, Boulder, CO (United States); Anderson, Dana Z. [Univ. of Colorado and National Inst. of Standards and Technology, Boulder, CO (United States)

    2015-07-16

    A model for studying atomtronic devices and circuits based on finite-temperature Bose-condensed gases is presented. The approach involves numerically solving equations of motion for atomic populations and coherences, derived using the Bose-Hubbard Hamiltonian and the Heisenberg picture. The resulting cluster expansion is truncated at a level giving balance between physics rigor and numerical demand mitigation. This approach allows parametric studies involving time scales that cover both the rapid population dynamics relevant to nonequilibrium state evolution, as well as the much longer time durations typical for reaching steady-state device operation. This model is demonstrated by studying the evolution of a Bose-condensed gas in the presence of atom injection and extraction in a double-well potential. In this configuration phase locking between condensates in each well of the potential is readily observed, and its influence on the evolution of the system is studied.

  13. Aspects of uncertainty analysis in accident consequence modeling

    International Nuclear Information System (INIS)

    Mathematical models are frequently used to determine probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact to humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and comparison of model predictions to observational data

  14. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types...

  15. Stochastic modeling and analysis of telecoms networks

    CERN Document Server

    Decreusefond, Laurent

    2012-01-01

    This book addresses the stochastic modeling of telecommunication networks, introducing the main mathematical tools for that purpose, such as Markov processes, real and spatial point processes and stochastic recursions, and presenting a wide list of results on stability, performance and comparison of systems. The authors propose a comprehensive mathematical construction of the foundations of stochastic network theory: Markov chains and continuous-time Markov chains are extensively studied using an original martingale-based approach. A complete presentation of stochastic recursions from an

  16. Mathematical Models of Hydraulic Systems, Examples, Analysis

    Czech Academy of Sciences Publication Activity Database

    Straškraba, Ivan

    Praha : ÚT AV ČR, 2006 - (Příhoda, J.; Kozel, K.), s. 159-162 ISBN 80-85918-98-6. [Conference Topical Problems of Fluid Mechanics 2006. Praha (CZ), 22.02.2006-24.02.2006] R&D Projects: GA ČR(CZ) GA201/05/0005 Institutional research plan: CEZ:AV0Z10190503 Keywords : hydraulic systems * fluid flow * mathematical models Subject RIV: BA - General Mathematics

  17. Flood Progression Modelling and Impact Analysis

    OpenAIRE

    D. Mioc; F. Anton; Nickerson, B.; M. Santos; Adda, P.; T. Tienaah; Ahmad, A; Mezouaghi, M.; MacGillivray, E.; Morton, A.; P. Tang

    2011-01-01

    People living in the lower valley of the St. John River, New Brunswick, Canada, frequently experience flooding when the river overflows its banks during spring ice melt and rain. To better prepare the population of New Brunswick for extreme flooding, we developed a new flood prediction model that computes floodplain polygons before the flood occurs. This allows emergency managers to assess the impact of the flood before it occurs and make early decisions for evacuation of the population an...

  18. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathaniel O.

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.

  19. Data perturbation analysis of a linear model

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The features of the linear model were carefully studied in the cases of data perturbation and mean-shift perturbation. Some important features were also proved mathematically. The results show that the mean-shift perturbation is equivalent to the data perturbation; that is, adding a parameter to an observation equation means that this set of data is deleted from the data set. The estimate of this parameter is in fact its predicted residual.

  20. Simulation analysis of a dynamic ridesharing model

    OpenAIRE

    Guasch Petit, Antonio; Figueras Jové, Jaume; Fonseca Casas, Pau; Montañola Sales, Cristina; Casanovas Garcia, Josep

    2014-01-01

    A dynamic ridesharing service is a system that enables drivers and riders to arrange one-time shared rides, with sufficient convenience and flexibility to be used on a daily basis. The quality of a dynamic ridesharing service is critical for commuters who need to reach their end destination on time every day. To ensure satisfactory quality, the waiting times in a ridesharing service must be low. This paper describes a dynamic ridesharing model proposal for commuters living in a small comm...

  1. Strain localization analysis using a multiscale model

    OpenAIRE

    FRANZ, Gérald; ABED-MERAIM, Farid; BEN ZINEB, Tarak; LEMOINE, Xavier; Berveiller, Marcel

    2009-01-01

    The development of a relevant constitutive model adapted to sheet metal forming simulations requires an accurate description of the most important sources of anisotropy, i.e. the slip processes, the intragranular substructure changes and the texture development. During plastic deformation of thin metallic sheets, strain-path changes often occur in the material, resulting in macroscopic effects. These softening/hardening effects must be correctly predicted because they can significantly influen...

  2. Modeling and analysis of caves using voxelization

    Science.gov (United States)

    Szeifert, Gábor; Szabó, Tivadar; Székely, Balázs

    2014-05-01

    Although there are many ways to create three-dimensional representations of caves using modern information technology, modeling caves has challenged researchers for a long time. One promising new modeling alternative is the use of voxels. We use geodetic measurements as the input for our voxelization project. These geodetic underground surveys recorded the azimuth, altitude and distance of corner points of cave systems relative to each other. The diameter of each cave section is estimated from separate databases originating from different surveys. We have developed a simple but efficient method (it covers more than 99.9 % of the volume of the input model on average) to convert these vector-type datasets to voxels. We have also developed software components to make visualization of the voxel and vector models easier. Since each corner-point position is measured relative to other corner-point positions, propagation of uncertainties is an important issue in the case of long caves with many separate sections. We use Monte Carlo simulations to analyze the effect of the error of each geodetic instrument possibly involved in a survey. Cross-sections of the simulated three-dimensional distributions show that even tiny uncertainties in individual measurements can result in high variation of positions, which could be reduced by distributing the closing errors if such data are available. Using the results of our simulations, we can estimate cave volume and the error of the calculated cave volume depending on the complexity of the cave. Acknowledgements: the authors are grateful to Ariadne Karst and Cave Exploring Association and the State Department of Environmental and Nature Protection of the Hungarian Ministry of Rural Development, Department of National Parks and Landscape Protection, Section Landscape and Cave Protection and Ecotourism for providing the cave measurement data. BS contributed as an Alexander von Humboldt Research
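
    The Monte Carlo propagation described here can be sketched in a few lines: perturb each leg's azimuth, inclination and distance with assumed instrument errors and look at the scatter of the traverse endpoint. The legs and standard deviations below are hypothetical, not the Ariadne survey data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # hypothetical survey legs: (azimuth deg, inclination deg, distance m)
    legs = [(45.0, -5.0, 12.3), (80.0, -2.0, 8.7), (120.0, 1.0, 15.2)]
    sig_az, sig_inc, sig_d = 0.5, 0.5, 0.05   # assumed instrument sigmas

    def endpoint(sample):
        # accumulate the traverse in east/north/up coordinates
        pos = np.zeros(3)
        for az, inc, d in sample:
            az, inc = np.radians(az), np.radians(inc)
            pos += d * np.array([np.cos(inc) * np.sin(az),
                                 np.cos(inc) * np.cos(az),
                                 np.sin(inc)])
        return pos

    ends = np.array([
        endpoint([(az + rng.normal(0, sig_az),
                   inc + rng.normal(0, sig_inc),
                   d + rng.normal(0, sig_d)) for az, inc, d in legs])
        for _ in range(10000)])
    print("endpoint standard deviation (m):", ends.std(axis=0))
    ```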

  3. Security Trend Analysis with CVE Topic Models

    OpenAIRE

    Neuhaus, Stephan; Zimmermann, Thomas

    2010-01-01

    We study the vulnerability reports in the Common Vulnerability and Exposures (CVE) database by using topic models on their description texts to find prevalent vulnerability types and new trends semi-automatically. In our study of the 39,393 unique CVEs until the end of 2009, we identify the following trends, given here in the form of a weather forecast: PHP: declining, with occasional SQL injection. Buffer Overflows: flattening out after decline. Format Strings: in steep decline. SQL Injectio...
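
    The mechanics of the approach can be sketched with scikit-learn; the three stand-in description strings below are invented, and the original study's corpus, preprocessing and model selection are not reproduced.

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # hypothetical stand-ins for CVE description texts
    docs = ["sql injection in php application allows remote attackers to execute queries",
            "buffer overflow in file parser allows attackers to execute arbitrary code",
            "format string vulnerability in logging function allows denial of service"]

    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

    # print the top words of each inferred topic
    terms = vec.get_feature_names_out()
    for k, comp in enumerate(lda.components_):
        top = [terms[i] for i in comp.argsort()[-5:][::-1]]
        print(f"topic {k}:", ", ".join(top))
    ```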

  4. Dispersion analysis with inverse dielectric function modelling.

    Science.gov (United States)

    Mayerhöfer, Thomas G; Ivanovski, Vladimir; Popp, Jürgen

    2016-11-01

    We investigate how dispersion analysis can profit from the use of a Lorentz-type description of the inverse dielectric function. In particular at higher angles of incidence, reflectance spectra using p-polarized light are dominated by bands from modes that have their transition moments perpendicular to the surface. Accordingly, the spectra increasingly resemble inverse dielectric functions. A corresponding description can therefore eliminate the complex dependencies of the dispersion parameters, allow their determination and facilitate a more accurate description of the optical properties of single crystals. PMID:27294550
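
    To make the distinction concrete, the snippet below evaluates a single-oscillator Lorentz dielectric function and its inverse; the oscillator parameters are invented. Im(ε) peaks at the transverse (TO) frequency, while Im(-1/ε) peaks near the longitudinal (LO) frequency, the feature that increasingly dominates p-polarized spectra at high angles of incidence.

    ```python
    import numpy as np

    nu = np.linspace(400.0, 1400.0, 2000)            # wavenumber axis (cm^-1)
    eps_inf, nu0, S, gamma = 2.3, 1000.0, 3e5, 8.0   # assumed oscillator parameters

    eps = eps_inf + S / (nu0**2 - nu**2 - 1j * gamma * nu)   # Lorentz model
    inv_eps = 1.0 / eps                                      # inverse dielectric function

    # Im(eps) peaks at the TO frequency; Im(-1/eps) peaks near the LO frequency
    print("TO peak:", nu[np.argmax(eps.imag)], "cm^-1")
    print("LO peak:", nu[np.argmax(-inv_eps.imag)], "cm^-1")
    ```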

  5. Analysis of Jingdong Mall Logistics Distribution Model

    Science.gov (United States)

    Shao, Kang; Cheng, Feng

    In recent years, the pace of electronic commerce development in China has accelerated. The role of logistics has been highlighted, and more and more electronic commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performs a SWOT analysis of the current state of its self-built logistics system, identifies the problems existing in the current Jingdong Mall logistics distribution, and gives appropriate recommendations.

  6. Models and analysis for distributed systems

    CERN Document Server

    Haddad, Serge; Pautet, Laurent; Petrucci, Laure

    2013-01-01

    Nowadays, distributed systems are increasingly present, for public software applications as well as critical systems. This title and Distributed Systems: Design and Algorithms - from the same editors - introduce the underlying concepts, the associated design techniques and the related security issues. The objective of this book is to describe the state of the art of the formal methods for the analysis of distributed systems. Numerous issues remain open and are the topics of major research projects. One current research trend consists of pro

  7. A global sensitivity analysis approach for morphogenesis models

    KAUST Repository

    Boas, Sonja E. M.

    2015-11-21

    Background: Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such ‘black-box’ models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. Results: To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, also provided new insights into the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. Conclusions: We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all ‘black-box’ models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
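
    A compact sketch of a variance-based (Sobol) global sensitivity workflow in the same spirit, using the SALib package with a cheap analytic stand-in for the morphogenesis simulation; the parameter names and bounds are invented.

    ```python
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    # cheap stand-in for an expensive simulation (Ishigami-type test function)
    def model(x):
        return np.sin(x[0]) + 7 * np.sin(x[1])**2 + 0.1 * x[2]**4 * np.sin(x[0])

    problem = {"num_vars": 3,
               "names": ["adhesion", "chemotaxis", "stiffness"],  # hypothetical
               "bounds": [[-np.pi, np.pi]] * 3}

    X = saltelli.sample(problem, 1024)      # Saltelli sampling scheme
    Y = np.array([model(x) for x in X])     # one model run per sample
    Si = sobol.analyze(problem, Y)          # first-order and total-order indices
    print("S1:", Si["S1"])
    print("ST:", Si["ST"])
    ```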

  8. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  9. EXPOSURE ANALYSIS MODELING SYSTEM: REFERENCE MANUAL FOR EXAMS 2

    Science.gov (United States)

    The Exposure Analysis Modeling System (EXAMS), published in 1982 (EPA-600/3-82-023), provides rapid evaluations of the behavior of synthetic organic chemicals in aquatic ecosystems. EXAMS combines laboratory data describing reactivity and thermodynamic properties of chemicals wit...

  10. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    Full Text Available As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  11. Microscopic Analysis and Modeling of Airport Surface Sequencing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Although a number of airportal surface models exist and have been successfully used for analysis of airportal operations, only recently has it become possible to...

  12. Analysis of the Retina in the Zebrafish Model

    OpenAIRE

    Avanesov, Andrei; Malicki, Jarema

    2010-01-01

    The zebrafish is one of the leading models for the analysis of the vertebrate visual system. A wide assortment of molecular, genetic, and cell biological approaches is available to study zebrafish visual system development and function. As new techniques become available, genetic analysis and imaging continue to be the strengths of the zebrafish model. In particular, recent developments in the use of transposons and zinc finger nucleases to produce new generations of mutant strains enhance bo...

  13. Coping with Complexity: Model Reduction and Data Analysis

    CERN Document Server

    Gorban, Alexander N

    2011-01-01

    This volume contains the extended version of selected talks given at the international research workshop 'Coping with Complexity: Model Reduction and Data Analysis', Ambleside, UK, August 31 - September 4, 2009. This book is deliberately broad in scope and aims at promoting new ideas and methodological perspectives. The topics of the chapters range from theoretical analysis of complex and multiscale mathematical models to applications in e.g., fluid dynamics and chemical kinetics.

  14. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression.

  15. Empirical Bayes Model Comparisons for Differential Methylation Analysis

    Directory of Open Access Journals (Sweden)

    Mingxiang Teng

    2012-01-01

    Full Text Available A number of empirical Bayes models (each with different statistical distribution assumptions) have now been developed to analyze differential DNA methylation using high-density oligonucleotide tiling arrays. However, it remains unclear which model performs best. For example, for analysis of differentially methylated regions for conservative and functional sequence characteristics (e.g., enrichment of transcription factor-binding sites (TFBSs)), the sensitivity of such analyses, using various empirical Bayes models, remains unclear. In this paper, five empirical Bayes models were constructed, based on either a gamma distribution or a log-normal distribution, for the identification of differentially methylated loci and their cell-division- (1, 3, and 5) and drug-treatment- (cisplatin) dependent methylation patterns. While differential methylation patterns generated by log-normal models were enriched with numerous TFBSs, we observed almost no TFBS-enriched sequences using gamma assumption models. Statistical and biological results suggest the log-normal, rather than gamma, empirical Bayes model distribution to be a highly accurate and precise method for differential methylation microarray analysis. In addition, we presented one of the log-normal models for differential methylation analysis and tested its reproducibility by simulation study. We believe this research to be the first extensive comparison of statistical modeling for the analysis of differential DNA methylation, an important biological phenomenon that precisely regulates gene transcription.

  16. PSAMM: A Portable System for the Analysis of Metabolic Models.

    Directory of Open Access Journals (Sweden)

    Jon Lund Steffensen

    2016-02-01

    Full Text Available The genome-scale models of metabolic networks have been broadly applied in phenotype prediction, evolutionary reconstruction, community functional analysis, and metabolic engineering. Despite the development of tools that support individual steps along the modeling procedure, it is still difficult to associate mathematical simulation results with the annotation and biological interpretation of metabolic models. In order to solve this problem, here we developed a Portable System for the Analysis of Metabolic Models (PSAMM), a new open-source software package that supports the integration of heterogeneous metadata in model annotations and provides a user-friendly interface for the analysis of metabolic models. PSAMM is independent of paid software environments like MATLAB, and all its dependencies are freely available for academic users. Compared to existing tools, PSAMM significantly reduced the running time of constraint-based analysis and enabled flexible settings of simulation parameters using simple one-line commands. The integration of heterogeneous, model-specific annotation information in PSAMM is achieved with a novel format of YAML-based model representation, which has several advantages, such as providing a modular organization of model components and simulation settings, enabling model version tracking, and permitting the integration of multiple simulation problems. PSAMM also includes a number of quality checking procedures to examine stoichiometric balance and to identify blocked reactions. Applying PSAMM to 57 models collected from current literature, we demonstrated how the software can be used for managing and simulating metabolic models. We identified a number of common inconsistencies in existing models and constructed an updated model repository to document the resolution of these inconsistencies.
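
    The record does not include code, but the flux-balance computation at the heart of such constraint-based analysis can be sketched as a linear program; the toy network below is invented for illustration and uses scipy rather than PSAMM's own interface.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # toy network: uptake -> A, A -> B, B -> biomass
    # rows = metabolites (A, B), columns = reactions (uptake, conversion, biomass)
    S = np.array([[1, -1,  0],
                  [0,  1, -1]])
    bounds = [(0, 10), (0, 1000), (0, 1000)]   # flux bounds per reaction
    c = np.array([0, 0, -1])                   # maximize biomass flux (minimize -v3)

    # steady-state constraint S v = 0, i.e. flux balance
    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
    print("optimal fluxes:", res.x)
    ```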

  17. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    International Nuclear Information System (INIS)

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air-ingress-related models and verification and validation data are a very high priority. Following a loss-of-coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics model developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe will be compared with experimental results.

  18. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air-ingress-related models and verification and validation data are a very high priority. Following a loss-of-coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics model developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe will be compared with experimental results.

  19. Analysis and modeling of "focus" in context

    DEFF Research Database (Denmark)

    Hovy, Dirk; Anumanchipalli, Gopala; Parlikar, Alok;

    2013-01-01

    This paper uses a crowd-sourced definition of a speech phenomenon we have called focus. Given sentences, text and speech, in isolation and in context, we asked annotators to identify what we term the focus word. We present their consistency in identifying the focused word when presented with text or speech stimuli. We then build models to show how well we predict that focus word from lexical (and higher) level features. Also, using spectral and prosodic information, we show the differences in these focus words when spoken with and without context. Finally, we show how we can improve speech synthesis of these utterances given focus information.

  20. Explicit model predictive control accuracy analysis

    OpenAIRE

    Knyazev, Andrew; Zhu, Peizhen; Di Cairano, Stefano

    2015-01-01

    Model Predictive Control (MPC) can efficiently control constrained systems in real-time applications. The MPC feedback law for a linear system with linear inequality constraints can be explicitly computed off-line, which results in an off-line partition of the state space into non-overlapping convex regions, with affine control laws associated with each region of the partition. An actual implementation of this explicit MPC in low-cost micro-controllers requires the data to be "quantized", i.e. repre...

  1. Compartmentalization analysis using discrete fracture network models

    Energy Technology Data Exchange (ETDEWEB)

    La Pointe, P.R.; Eiben, T.; Dershowitz, W. [Golder Associates, Redmond, VA (United States); Wadleigh, E. [Marathon Oil Co., Midland, TX (United States)

    1997-08-01

    This paper illustrates how Discrete Fracture Network (DFN) technology can serve as a basis for the calculation of reservoir engineering parameters for the development of fractured reservoirs. It describes the development of quantitative techniques for defining the geometry and volume of structurally controlled compartments. These techniques are based on a combination of stochastic geometry, computational geometry, and graph theory. The parameters addressed are compartment size, matrix block size and tributary drainage volume. The concept of DFN models is explained and methodologies to compute these parameters are demonstrated.

  2. Sensitivity and uncertainty analysis of the PATHWAY radionuclide transport model

    International Nuclear Information System (INIS)

    Procedures were developed for the uncertainty and sensitivity analysis of a dynamic model of radionuclide transport through human food chains. Uncertainty in model predictions was estimated by propagation of parameter uncertainties using a Monte Carlo simulation technique. Sensitivity of model predictions to individual parameters was investigated using the partial correlation coefficient of each parameter with model output. Random values produced for the uncertainty analysis were used in the correlation analysis for sensitivity. These procedures were applied to the PATHWAY model which predicts concentrations of radionuclides in foods grown in Nevada and Utah and exposed to fallout during the period of atmospheric nuclear weapons testing in Nevada. Concentrations and time-integrated concentrations of iodine-131, cesium-136, and cesium-137 in milk and other foods were investigated. 9 figs., 13 tabs
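
    As a sketch of the procedure (parameter sampling, model evaluation, partial correlation of residuals), the snippet below uses an invented multiplicative transfer chain in place of PATHWAY and assumed parameter distributions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000
    # hypothetical transfer-model parameters sampled from assumed uncertainty ranges
    k_dep  = rng.lognormal(0.0, 0.3, n)   # deposition velocity
    k_wea  = rng.lognormal(-1.0, 0.2, n)  # weathering rate
    f_milk = rng.uniform(0.002, 0.01, n)  # feed-to-milk transfer factor

    # stand-in model: milk concentration from a simple multiplicative chain
    y = k_dep * f_milk / k_wea
    X = np.column_stack([k_dep, k_wea, f_milk])

    def partial_corr(x, y, others):
        # correlate the residuals after linearly regressing out the other parameters
        A = np.column_stack([others, np.ones(len(y))])
        rx = x - A @ np.linalg.lstsq(A, x, rcond=None)[0]
        ry = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
        return np.corrcoef(rx, ry)[0, 1]

    for i, name in enumerate(["k_dep", "k_wea", "f_milk"]):
        print(name, partial_corr(X[:, i], y, np.delete(X, i, axis=1)))
    ```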

  3. Similar words analysis based on POS-CBOW language model

    Directory of Open Access Journals (Sweden)

    Dongru RUAN

    2015-10-01

    Full Text Available Similar-words analysis is one of the important aspects of natural language processing, with important research and application value in text classification, machine translation and information recommendation. Focusing on the features of Sina Weibo's short texts, this paper presents a language model named POS-CBOW, a continuous bag-of-words language model with a filtering layer and a part-of-speech tagging layer. The proposed approach can adjust the word vectors' similarity according to the cosine similarity and the word vectors' part-of-speech metrics. It can also filter the set of similar words on the basis of the statistical analysis model. The experimental results show that the similar-words analysis algorithm based on the proposed POS-CBOW language model is better than that based on the traditional CBOW language model.
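
    The core similarity computation reduces to cosine similarity over word vectors combined with a part-of-speech filter; a minimal sketch with invented three-dimensional vectors and tags follows.

    ```python
    import numpy as np

    # hypothetical embeddings produced by a CBOW-style model
    vectors = {"happy": np.array([0.8, 0.1, 0.3]),
               "glad":  np.array([0.7, 0.2, 0.35]),
               "table": np.array([0.1, 0.9, 0.0])}
    pos = {"happy": "ADJ", "glad": "ADJ", "table": "NOUN"}

    def cosine(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    def similar(word, topn=2):
        # rank by cosine similarity, keeping only candidates with a matching POS tag
        cands = [(w, cosine(vectors[word], v)) for w, v in vectors.items()
                 if w != word and pos[w] == pos[word]]
        return sorted(cands, key=lambda t: -t[1])[:topn]

    print(similar("happy"))
    ```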

  4. Stability analysis of an autocatalytic protein model

    Science.gov (United States)

    Lee, Julian

    2016-05-01

    A self-regulatory genetic circuit, where a protein acts as a positive regulator of its own production, is known to be the simplest biological network with a positive feedback loop. Although at least three components—DNA, RNA, and the protein—are required to form such a circuit, stability analysis of the fixed points of this self-regulatory circuit has been performed only after reducing the system to a two-component system, either by assuming a fast equilibration of the DNA component or by removing the RNA component. Here, stability of the fixed points of the three-component positive feedback loop is analyzed by obtaining eigenvalues of the full three-dimensional Hessian matrix. In addition to rigorously identifying the stable fixed points and saddle points, detailed information about the system can be obtained, such as the existence of complex eigenvalues near a fixed point.
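
    A minimal numerical version of such an analysis, with assumed toy kinetics for a promoter-mRNA-protein positive feedback loop (not the paper's equations): locate a fixed point, build the matrix of the linearized dynamics, and check the signs of the real parts of its eigenvalues.

    ```python
    import numpy as np
    from scipy.optimize import fsolve

    # toy three-component circuit: active-promoter fraction D, mRNA R, protein P,
    # with the protein activating its own promoter (all rate constants assumed)
    kon, koff, a0, a1, dR, b, dP = 1.0, 1.0, 0.05, 2.0, 1.0, 2.0, 0.5

    def f(x):
        D, R, P = x
        return [kon * P * (1 - D) - koff * D,
                a0 + a1 * D - dR * R,
                b * R - dP * P]

    def linearization(x, h=1e-6):
        # central-difference approximation of the 3 x 3 matrix of linearized dynamics
        J = np.zeros((3, 3))
        for j in range(3):
            dx = np.zeros(3); dx[j] = h
            J[:, j] = (np.array(f(x + dx)) - np.array(f(x - dx))) / (2 * h)
        return J

    fp = fsolve(f, [0.5, 1.0, 1.0])
    eig = np.linalg.eigvals(linearization(fp))
    print("fixed point:", fp)
    print("stable:", bool(np.all(eig.real < 0)), "eigenvalues:", eig)
    ```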

  5. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Noonan, Nicholas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  6. Representing the Past by Solid Modeling + Golden Ratio Analysis

    Science.gov (United States)

    Ding, Suining

    2008-01-01

    This paper describes the procedures for reconstructing ancient architecture using solid modeling with geometric analysis, especially Golden Ratio analysis. In the past, the recovery and reconstruction of ruins required bringing together fragments of evidence and a vast amount of measurements from the archaeological site. Although researchers and…

  7. Current status of uncertainty analysis methods for computer models

    International Nuclear Information System (INIS)

    This report surveys several existing uncertainty analysis methods for estimating computer output uncertainty caused by input uncertainties, illustrating application examples of those methods to three computer models: MARCH/CORRAL II, TERFOC and SPARC. Merits and limitations of the methods are assessed in the application, and a recommendation for selecting uncertainty analysis methods is provided. (author)

  8. Sparse Principal Component Analysis in Medical Shape Modeling

    DEFF Research Database (Denmark)

    Sjöstrand, Karl; Stegmann, Mikkel Bille; Larsen, Rasmus

    2006-01-01

    Principal component analysis (PCA) is a widely used tool in medical image analysis for data reduction, model building, and data understanding and exploration. While PCA is a holistic approach where each new variable is a linear combination of all original variables, sparse PCA (SPCA) aims at prod...
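
    The practical difference is visible in the loadings: ordinary PCA mixes all variables into every component, while SPCA drives many loadings to exactly zero. A small scikit-learn sketch on random stand-in data (a real shape model would use aligned landmark coordinates) follows.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA, SparsePCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 30))          # stand-in for aligned shape coordinates

    pca = PCA(n_components=5).fit(X)
    spca = SparsePCA(n_components=5, alpha=1.0, random_state=0).fit(X)

    # sparse loadings contain exactly-zero entries, so each mode involves few variables
    print("zero loadings, dense PCA: ", int((pca.components_ == 0).sum()))
    print("zero loadings, sparse PCA:", int((spca.components_ == 0).sum()))
    ```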

  9. Domain Endurants: An Analysis and Description Process Model

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2014-01-01

    We present a summary, Sect. 2, of a structure of domain analysis and description concepts: techniques and tools. And we link, in Sect. 3, these concepts, embodied in domain analysis prompts and domain description prompts, in a model of how a diligent domain analyser cum describer would use them. We...

  10. Influence of pipeline modeling in stability analysis for severe slugging

    Science.gov (United States)

    Azevedo, G. R.; Baliño, J. L.; Burr, K. P.

    2016-06-01

    In this paper a numerical linear stability analysis is performed on a mathematical model of the two-phase flow in a pipeline-riser system. Most stability criteria and most models are based on a simplified pipeline, where it is assumed that void fraction fluctuations can be neglected. Clearly, a pipeline with a constant void fraction cannot capture the flow pattern transition or void fraction propagation waves. Three different models for the pipeline are considered: a lumped-parameter model with constant void fraction; a lumped-parameter model with time-dependent void fraction; and a distributed-parameter model, with void fraction dependent on time and position. The results showed that the simplified model loses part of the stable region of operational conditions, whereas the complete models do not. As a result, a more general model is necessary to capture the full influence of the stratified flow on stability and on the pipeline dynamics.

  11. Model Construction and Analysis of Respiration in Halobacterium salinarum.

    Directory of Open Access Journals (Sweden)

    Cherryl O Talaue

    Full Text Available The archaeon Halobacterium salinarum can produce energy using three different processes, namely photosynthesis, oxidative phosphorylation and fermentation of arginine, and is thus a model organism in bioenergetics. Compared to its bacteriorhodopsin-driven photosynthesis, less attention has been devoted to modeling its respiratory pathway. We created a system of ordinary differential equations that models its oxidative phosphorylation. The model consists of the electron transport chain, the ATP synthase, the potassium uniport and the sodium-proton antiport. By fitting the model parameters to experimental data, we show that the model can explain data on proton motive force generation, ATP production, and the charge balancing of ions between the sodium-proton antiporter and the potassium uniport. We performed sensitivity analysis of the model parameters to determine how the model will respond to perturbations in parameter values. The model and the parameters we derived provide a resource that can be used for analytical studies of the bioenergetics of H. salinarum.

  12. Beauty and the beast: Some perspectives on efficient model analysis, surrogate models, and the future of modeling

    Science.gov (United States)

    Hill, M. C.; Jakeman, J.; Razavi, S.; Tolson, B.

    2015-12-01

    For many environmental systems, model runtimes have remained very long as more capable computers have been used to add more processes and more time and space discretization. Scientists have also added more parameters and kinds of observations, and many model runs are needed to explore the models. Computational demand equals run time multiplied by the number of model runs divided by parallelization opportunities. Model exploration is conducted using sensitivity analysis, optimization, and uncertainty quantification. Sensitivity analysis is used to reveal consequences of what may be very complex simulated relations, optimization is used to identify parameter values that fit the data best, or at least better, and uncertainty quantification is used to evaluate the precision of simulated results. The long execution times make such analyses a challenge. Methods for addressing this challenge include computationally frugal analysis of the demanding original model and a number of ingenious surrogate modeling methods. Both commonly use about 50-100 runs of the demanding original model. In this talk we consider the tradeoffs between (1) original model development decisions, (2) computationally frugal analysis of the original model, and (3) using many model runs of the fast surrogate model. Some questions of interest are as follows. If the added processes and discretization invested in (1) are compared with the restrictions and approximations in model analysis produced by long model execution times, is there a net benefit related to the goals of the model? Are there changes to the numerical methods that could reduce the computational demands while giving up less fidelity than is compromised by using computationally frugal methods or surrogate models for model analysis? Both the computationally frugal methods and surrogate models require that the solution of interest be a smooth function of the parameters of interest. How does the information obtained from the local methods typical

  13. Sensitivity analysis for computational models of biochemical systems

    OpenAIRE

    Maj,

    2014-01-01

    Systems biology is an integrated area of science which aims at the analysis of biochemical systems from a holistic perspective. In this context, sensitivity analysis, a technique studying how the output variation of a computational model can be associated with its input state, plays a pivotal role. The thesis describes how to properly apply the different sensitivity analysis techniques according to the specific case study (i.e., continuous deterministic rather than discrete stochastic...

  14. Stability Analysis for Car Following Model Based on Control Theory

    International Nuclear Information System (INIS)

    Stability analysis is one of the key issues in car-following theory. A stability analysis with a Lyapunov function for the two-velocity-difference car-following model (TVDM for short) is conducted, and a control method to suppress traffic congestion is introduced. Numerical simulations are given and the results are consistent with the theoretical analysis.

  15. 10031 Executive Summary -- Quantitative Models: Expressiveness and Analysis

    OpenAIRE

    Baier, Christel; Droste, Manfred; Gastin, Paul; Larsen, Kim Guldstrand

    2010-01-01

    Quantitative models and quantitative analysis in Computer Science are currently intensively studied, resulting in a revision of the foundation of Computer Science where classical yes/no answers are replaced by quantitative analyses. The potential application areas are huge, e.g., performance analysis, operational research or embedded systems. The aim of the seminar was to address three fundamental topics which are closely related: quantitative analysis of real-time and h...

  16. Geometric analysis of the capital asset pricing model

    OpenAIRE

    Kure, Thomas

    2010-01-01

    The derivation of the capital asset pricing model is in most of the literature limited to a graphical analysis. Since this method avoids a complicated mathematical framework, the derivation is more intuitive for people who are unfamiliar with the topic. This approach, however, can produce misleading or even wrong results if the analysis is imprecise. Some of the main mistakes seem to be already established in financial textbooks. This thesis gives a deeper analysis of the so often used ...

  17. Sensitivity analysis of fine sediment models using heterogeneous data

    Science.gov (United States)

    Kamel, A. M. Yousif; Bhattacharya, B.; El Serafy, G. Y.; van Kessel, T.; Solomatine, D. P.

    2012-04-01

    Sediments play an important role in many aquatic systems. Their transportation and deposition have significant implications for morphology, navigability and water quality. Understanding the dynamics of sediment transportation in time and space is therefore important for drawing up interventions and making management decisions. This research is related to the fine sediment dynamics in the Dutch coastal zone, which is subject to human interference through constructions, fishing, navigation, sand mining, etc. These activities affect the natural flow of sediments and sometimes lead to environmental concerns or affect the siltation rates in harbours and fairways. Numerical models are widely used in studying fine sediment processes. The accuracy of numerical models depends upon the estimation of model parameters through calibration. Studying the model uncertainty related to these parameters is important in improving the spatio-temporal prediction of suspended particulate matter (SPM) concentrations, and in determining the limits of their accuracy. This research deals with the analysis of a 3D numerical model of the North Sea covering the Dutch coast using the Delft3D modelling tool (developed at Deltares, The Netherlands). The methodology in this research was divided into three main phases. The first phase focused on analysing the performance of the numerical model in simulating SPM concentrations near the Dutch coast by comparing the model predictions with SPM concentrations estimated from NASA's MODIS sensors at different time scales. The second phase focused on carrying out a sensitivity analysis of model parameters. Four model parameters were identified for the uncertainty and sensitivity analysis: the sedimentation velocity, the critical shear stress above which re-suspension occurs, the Shields shear stress for re-suspension pick-up, and the re-suspension pick-up factor. By adopting different values of these parameters the numerical model was run and a comparison between the

  18. Strong absorption model analysis of alpha scattering

    International Nuclear Information System (INIS)

    Angular distributions of alpha-particles at several energies, Eα = 21 ∼ 85.6 MeV, from a number of nuclei between 20Ni and 119Sn, extending over a wide angular range (up to ∼ 160° in some cases), have been analyzed in terms of the three-parameter strong absorption model of Frahn and Venter. The interaction radius and surface diffuseness are obtained from the parameter values rendering the best fit to the elastic scattering data. The inelastic scattering of alpha-particles from a number of nuclei, leading to quadrupole and octupole excitations, has also been studied, giving the deformation parameters βL. (author). 14 refs, 7 figs, 3 tabs

  19. Materials Analysis and Modeling of Underfill Materials.

    Energy Technology Data Exchange (ETDEWEB)

    Wyatt, Nicholas B [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chambers, Robert S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The thermal-mechanical properties of three potential underfill candidate materials for PBGA applications are characterized and reported. Two of the materials are formulations developed at Sandia for underfill applications, while the third is a commercial product that utilizes a snap-cure chemistry to drastically reduce cure time. Viscoelastic models were calibrated and fit using the property data collected for one of the Sandia formulated materials. Along with the thermal-mechanical analyses performed, a series of simple bi-material strip tests were conducted to comparatively analyze the relative effects of cure and thermal shrinkage amongst the materials under consideration. Finally, current knowledge gaps as well as questions arising from the present study are identified and a path forward presented.

  20. Expatriates Selection: An Essay of Model Analysis

    Directory of Open Access Journals (Sweden)

    Rui Bártolo-Ribeiro

    2015-03-01

    Full Text Available The expansion of business to geographical areas with cultures different from those in which organizations were created and developed leads to the expatriation of employees to these destinations. Recruitment and selection procedures for expatriates do not always have the intended success, leading to an early return of these professionals with consequent organizational disorders. In this study, several articles published in the last five years were analyzed in order to identify the dimensions most frequently mentioned in the selection of expatriates in terms of success and failure. The characteristics in the selection process that may improve prediction of expatriates' adaptation to the new cultural contexts of the same organization were studied according to the KSAOs model. Few references to the Knowledge, Skills and Abilities dimensions were found in the analyzed papers. There was a strong predominance of the evaluation of Other Characteristics, and more importance was given to dispositional factors than to situational factors in promoting the integration of the expatriates.

  1. Parametric analysis of fire model CFAST

    International Nuclear Information System (INIS)

    This paper describes a pump-room fire in a nuclear power plant simulated using the CFAST fire modeling code developed by NIST. The input parameters of CFAST considered are the choice of constrained or unconstrained fire, the lower oxygen limit (LOL), the radiative fraction (RF), and the times at which doors are opened. According to the results, the pump-room fire is a ventilation-controlled fire, so the default LOL value of 10% is adequate. It appears that the RF does not change the temperature of the upper gas layer, but the degree of opening of the penetration area and the times at which it opens do affect the upper-layer temperature, so those results should be analyzed carefully.

  2. Model and Analysis of Individual Rehearsals

    DEFF Research Database (Denmark)

    Jensen, Kristoffer; Frimodt-Møller, Søren

    2013-01-01

    This work focuses on the quantitative improvement over time made by a musician rehearsing a specific piece of music. In particular, this study considers the attention paid by the musician alternately to the instrument and the score during rehearsal, in order to describe the evolution of this ... only to look at the instrument or the score when necessary and otherwise to keep their focus on the camera (to emulate the presence of an audience or conductor). This process was annotated and modeled using UML diagrams, in particular the Use Case diagram. The annotation includes the attention to the score and the instrument, respectively, when applicable. Via these observations, hypotheses are formed and discussed regarding the development of a musician's Long Term Memory (LTM) in relation to the score, as well as how much s/he is able to store in Short Term Memory (STM) while playing.

  3. Folding model analysis of alpha radioactivity

    CERN Document Server

    Basu, D N

    2003-01-01

    Radioactive decay of nuclei via emission of α particles has been studied theoretically in the framework of a superasymmetric fission model using the double folding (DF) procedure for obtaining the α-nucleus interaction potential. The DF nuclear potential has been obtained by folding in the density distribution functions of the α nucleus and the daughter nucleus with a realistic effective interaction. The M3Y effective interaction has been used for calculating the nuclear interaction potential, supplemented by a zero-range pseudo-potential for exchange along with the density dependence. The microscopic α-nucleus potential thus obtained has been used along with the Coulomb interaction potential to calculate the action integral within the WKB approximation. This subsequently yields microscopic calculations for the half lives of α decays of nuclei. The density dependence and the exchange effects have not been found to be very significant. These calculations...
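
    To indicate the scale of the WKB calculation, here is a toy version for 212Po that replaces the double-folded M3Y potential with a bare Coulomb barrier between assumed turning points; the assault frequency is likewise an assumed round number, so only the order of magnitude is meaningful.

    ```python
    import numpy as np
    from scipy.integrate import quad

    # toy WKB estimate for 212Po -> 208Pb + alpha (bare Coulomb barrier only)
    hbarc = 197.327                         # MeV fm
    e2 = 1.4400                             # MeV fm (e^2 / 4 pi eps0)
    Q = 8.954                               # MeV, alpha-decay Q-value of 212Po
    Zd, Ad = 82, 208                        # daughter charge and mass numbers
    mu = 4.0 * Ad / (4.0 + Ad) * 931.494    # reduced mass, MeV/c^2

    R1 = 1.2 * (4**(1/3) + Ad**(1/3))       # assumed inner turning point (touching radius)
    R2 = 2 * Zd * e2 / Q                    # outer turning point, where V(R2) = Q

    def k(r):
        # local WKB wavenumber under the barrier, in fm^-1
        return np.sqrt(max(2 * mu * (2 * Zd * e2 / r - Q), 0.0)) / hbarc

    K, _ = quad(k, R1, R2)                  # WKB action integral
    nu = 1e21                               # assumed assault frequency (s^-1)
    half_life = np.log(2) / (nu * np.exp(-2 * K))
    print(f"half-life ~ {half_life:.1e} s")  # measured value for 212Po: ~3e-7 s
    ```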

  4. Model-free model elimination: A new step in the model-free dynamic analysis of NMR relaxation data

    International Nuclear Information System (INIS)

    Model-free analysis is a technique commonly used within the field of NMR spectroscopy to extract atomic resolution, interpretable dynamic information on multiple timescales from the R1, R2, and steady state NOE. Model-free approaches employ two disparate areas of data analysis, the discipline of mathematical optimisation, specifically the minimisation of a χ2 function, and the statistical field of model selection. By searching through a large number of model-free minimisations, which were set up using synthetic relaxation data whereby the true underlying dynamics is known, certain model-free models have been identified to, at times, fail. This has been characterised as either the internal correlation times, τe, τf, or τs, or the global correlation time parameter, local τm, heading towards infinity, the result being that the final parameter values are far from the true values. In a number of cases the minimised χ2 value of the failed model is significantly lower than that of all other models and, hence, will be the model which is chosen by model selection techniques. If these models are not removed prior to model selection, the final model-free results could be far from the truth. By implementing a series of empirical rules involving inequalities these models can be specifically isolated and removed. Model-free analysis should therefore consist of three distinct steps: model-free minimisation, model-free model elimination, and finally model-free model selection. Failure has also been identified to affect the individual Monte Carlo simulations used within error analysis. Each simulation involves an independent randomised relaxation data set and model-free minimisation, thus simulations suffer from exactly the same types of failure as model-free models. Therefore, to prevent these outliers from causing a significant overestimation of the errors, the failed Monte Carlo simulations need to be culled prior to calculating the parameter standard deviations

  5. JSim, an open-source modeling system for data analysis.

    Science.gov (United States)

    Butterworth, Erik; Jardine, Bartholomew E; Raymond, Gary M; Neal, Maxwell L; Bassingthwaighte, James B

    2013-01-01

    JSim is a simulation system for developing models, designing experiments, and evaluating hypotheses on physiological and pharmacological systems through the testing of model solutions against data. It is designed for interactive, iterative manipulation of the model code, handling of multiple data sets and parameter sets, and for making comparisons among different models running simultaneously or separately. Interactive use is supported by a large collection of graphical user interfaces for model writing and compilation diagnostics, defining input functions, model runs, selection of algorithms solving ordinary and partial differential equations, run-time multidimensional graphics, parameter optimization (8 methods), sensitivity analysis, and Monte Carlo simulation for defining confidence ranges. JSim uses Mathematical Modeling Language (MML), a declarative syntax specifying algebraic and differential equations. Imperative constructs written in other languages (MATLAB, FORTRAN, C++, etc.) are accessed through procedure calls. MML syntax is simple, basically defining the parameters and variables, then writing the equations in a straightforward, easily read and understood mathematical form. This makes JSim good for teaching modeling as well as for model analysis for research. For high-throughput applications, JSim can be run as a batch job. JSim can automatically translate models from the repositories for Systems Biology Markup Language (SBML) and CellML models. Stochastic modeling is supported. MML supports assigning physical units to constants and variables and automates checking dimensional balance as the first step in verification testing. Automatic unit scaling follows, e.g. seconds to minutes, if needed. The JSim Project File sets a standard for reproducible modeling analysis: it includes in one file everything for analyzing a set of experiments: the data, the models, the data fitting, and evaluation of parameter confidence ranges. JSim is open source; it

  6. Bifurcation Analysis in a Delayed Diffusive Leslie-Gower Model

    Directory of Open Access Journals (Sweden)

    Shuling Yan

    2013-01-01

    Full Text Available We investigate a modified delayed Leslie-Gower model under homogeneous Neumann boundary conditions. We give the stability analysis of the equilibria of the model and show the existence of Hopf bifurcation at the positive equilibrium under some conditions. Furthermore, we investigate the stability and direction of bifurcating periodic orbits by using the normal form theory and the center manifold theorem.

  7. Dynamic Modelling and Statistical Analysis of Event Times

    OpenAIRE

    Peña, Edsel A.

    2006-01-01

    This review article provides an overview of recent work in the modelling and analysis of recurrent events arising in engineering, reliability, public health, biomedical, and other areas. Recurrent event modelling possesses unique facets making it different and more difficult to handle than single event settings. For instance, the impact of an increasing number of event occurrences ...

  8. Modelling, authoring and publishing the "document analysis" learning object

    OpenAIRE

    Flament, Alexandre; Villiot Leclercq, Emmanuelle

    2004-01-01

    This article describes the modelling and implementation of a document-analysis learning object. The objective of this research is twofold: providing teachers with models and tools allowing them to produce learning objects, and building a publishing chain that can be applied to other kinds of learning objects. Implementation choices rely on interoperability and the use of standards.

  9. A Noncentral "t" Regression Model for Meta-Analysis

    Science.gov (United States)

    Camilli, Gregory; de la Torre, Jimmy; Chiu, Chia-Yi

    2010-01-01

    In this article, three multilevel models for meta-analysis are examined. Hedges and Olkin suggested that effect sizes follow a noncentral "t" distribution and proposed several approximate methods. Raudenbush and Bryk further refined this model; however, this procedure is based on a normal approximation. In the current research literature, this…

  10. Standard model for safety analysis report of fuel fabrication plants

    International Nuclear Information System (INIS)

    A standard model for a safety analysis report of fuel fabrication plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) for evaluating requests for construction permits and operation licenses made according to the legislation in force. (E.G.)

  11. ATLAS analysis model and SUSY searches in lepton channels

    International Nuclear Information System (INIS)

    The ATLAS experiment built at CERN will start to take data in a few months. The computing model for data analysis includes many tools. The new ATLAS Event Data Model is investigated here. As an example, the sensitivity of a SUSY search requiring 2/3/4 jets plus one lepton is shown.

  12. A Systemic Cause Analysis Model for Human Performance Technicians

    Science.gov (United States)

    Sostrin, Jesse

    2011-01-01

    This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…

  13. Stability analysis for a general age-dependent vaccination model

    International Nuclear Information System (INIS)

    An SIR epidemic model with a general age-dependent vaccination strategy is investigated when the fertility, mortality and removal rates depend on age. We give threshold criteria for the existence of equilibria and perform a stability analysis. Furthermore, a critical vaccination coverage that is sufficient to eradicate the disease is determined. (author). 12 refs
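
    In the age-independent special case, the critical coverage reduces to the familiar p_c = 1 - 1/R0. The sketch below checks this numerically with assumed rates; the paper's age-structured model is not reproduced here.

    ```python
    from scipy.integrate import solve_ivp

    beta, gamma = 0.3, 0.1          # assumed transmission and removal rates
    R0 = beta / gamma
    p_c = 1 - 1 / R0
    print(f"R0 = {R0:.1f}, critical vaccination coverage p_c = {p_c:.2f}")

    def sir(t, y):
        S, I, _ = y
        return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

    for p in (0.5, 0.7):            # one coverage below and one above p_c = 2/3
        # fraction p is vaccinated (counted as removed); a small seed is infected
        y0 = [(1 - p) * 0.999, (1 - p) * 0.001, p]
        sol = solve_ivp(sir, (0.0, 400.0), y0, max_step=1.0)
        print(f"coverage {p:.1f}: outbreak size = {sol.y[2][-1] - p:.3f}")
    ```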

  14. Standard model for safety analysis report of fuel reprocessing plants

    International Nuclear Information System (INIS)

    A standard model for the safety analysis report of fuel reprocessing plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) for evaluating applications for construction permits and operating licenses submitted under the legislation in force. (E.G.)

  15. Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm

    Directory of Open Access Journals (Sweden)

    Raj Kumar

    2012-12-01

    In this paper, we illustrate the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood-based inferential procedures, both classical and Bayesian. The quasi Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using MCMC methods). R functions are developed to study the statistical properties of the model, to provide model validation and comparison tools, and to analyse the MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated, and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
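
    The classical side of such an analysis can be sketched in a few lines; the snippet below fits a Gumbel model by maximum likelihood with SciPy on simulated failure times (not the paper's data set) and runs a simple goodness-of-fit check. The OpenBUGS/MCMC step is not reproduced.

      import numpy as np
      from scipy import stats

      # Simulated "failure time" data standing in for a real reliability data set
      rng = np.random.default_rng(1)
      data = stats.gumbel_r.rvs(loc=100.0, scale=25.0, size=200, random_state=rng)

      loc_hat, scale_hat = stats.gumbel_r.fit(data)   # MLE of location and scale
      print(f"mu = {loc_hat:.1f}, sigma = {scale_hat:.1f}")

      # Model validation: Kolmogorov-Smirnov test against the fitted Gumbel
      ks = stats.kstest(data, "gumbel_r", args=(loc_hat, scale_hat))
      print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")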

  16. Verification of cladding performance analysis models in INFRA

    International Nuclear Information System (INIS)

    The recent trend in PWR fuel development is a continuous increase in burnup to improve both economy and safety. The development of high-burnup fuel raised new fuel-behaviour issues that had not previously been considered. The high-burnup fuel performance code INFRA (INtegrated Fuel Rod Analysis) was developed for the prediction of high-burnup fuel behavior. Cladding performance models such as the creep model, creep-out model, corrosion model and axial irradiation growth model were developed to analyze the performance of high-burnup Zircaloy-4 cladding. Cladding performance analyses were performed to verify the cladding performance models in INFRA using performance data from commercial PWRs and the Halden reactor, among others. INFRA predicted the measured data reasonably well.

  17. Analysis and synthesis of solutions for the agglomeration process modeling

    Science.gov (United States)

    Babuk, V. A.; Dolotkazin, I. N.; Nizyaev, A. A.

    2013-03-01

    The present work is devoted to the development of a model of the agglomeration process for propellants based on ammonium perchlorate (AP), ammonium dinitramide (ADN), HMX, inactive binder, and nanoaluminum. Generalization of experimental data, development of a physical picture of agglomeration for the listed propellants, and development and analysis of mathematical models are carried out. Synthesis of models of the various phenomena taking place during agglomeration allows prediction of the size, quantity, chemical composition and structure of the forming agglomerates, and of their fraction in the set of condensed combustion products. This became possible largely due to the development of a new model of agglomerating-particle evolution on the surface of the burning propellant. The obtained results correspond to the available experimental data. It is expected that an analogous method, based on the analysis of mathematical models of the particular phenomena and their synthesis, will allow modelling of the agglomeration process for other types of metallized solid propellants.

  18. Wind Energy Conversion System Analysis Model (WECSAM) computer program documentation

    Science.gov (United States)

    Downey, W. T.; Hendrick, P. L.

    1982-07-01

    Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation.

  19. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to developing high-quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model, so that we can confirm the validity of input/output data for each page and of page transitions on the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve traceability from the valid requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department. PMID:23565356

  20. Comparative analysis of model assessment in community detection

    CERN Document Server

    Kawamoto, Tatsuro

    2016-01-01

    Bayesian cluster inference with a flexible generative model allows us to detect various types of structures. However, it has problems stemming from computational complexity and difficulties in model assessment. We consider the stochastic block model with restricted hyperparameter space, which is known to correspond to modularity maximization. We show that it not only reduces computational complexity, but is also beneficial for model assessment. Using various criteria, we conduct a comparative analysis of the model assessments, and analyze whether each criterion tends to overfit or underfit. We also show that the learning of hyperparameters leads to qualitative differences in Bethe free energy and cross-validation errors.
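
    The modularity-maximization side of the correspondence noted above can be illustrated with NetworkX's greedy modularity heuristic on a toy graph; this is a hedged illustration, not the paper's Bayesian inference for the restricted stochastic block model.

      import networkx as nx
      from networkx.algorithms import community

      G = nx.karate_club_graph()                          # standard toy network
      parts = community.greedy_modularity_communities(G)  # modularity maximization
      Q = community.modularity(G, parts)                  # quality of the partition

      print(f"{len(parts)} communities, modularity Q = {Q:.3f}")
      for i, block in enumerate(parts):
          print(i, sorted(block))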

  1. Hidden-Markov-Model Analysis Of Telemanipulator Data

    Science.gov (United States)

    Hannaford, Blake; Lee, Paul

    1991-01-01

    Mathematical model and procedure based on hidden-Markov-model concept undergoing development for use in analysis and prediction of outputs of force and torque sensors of telerobotic manipulators. In model, overall task broken down into subgoals, and transition probabilities encode ease with which operator completes each subgoal. Process portion of model encodes task-sequence/subgoal structure, and probability-density functions for forces and torques associated with each state of manipulation encode sensor signals that one expects to observe at subgoal. Parameters of model constructed from engineering knowledge of task.
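
    A minimal sketch of this kind of analysis, assuming the hmmlearn package: a Gaussian hidden Markov model is fitted to a synthetic one-dimensional force-like trace and the most likely state (subgoal) sequence is decoded. The three-state structure and the signal are illustrative assumptions, not the actual telemanipulator data.

      import numpy as np
      from hmmlearn.hmm import GaussianHMM

      rng = np.random.default_rng(0)
      # Synthetic 1-D "sensor" trace: three regimes with different mean force
      segments = [rng.normal(m, 0.5, 200) for m in (0.0, 3.0, 1.0)]
      X = np.concatenate(segments).reshape(-1, 1)

      model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=100)
      model.fit(X)
      states = model.predict(X)          # Viterbi decoding of the state sequence
      print("transition matrix:\n", model.transmat_.round(2))
      print("decoded states (first 10):", states[:10])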

  2. IMAGE ANALYSIS FOR MODELLING SHEAR BEHAVIOUR

    Directory of Open Access Journals (Sweden)

    Philippe Lopez

    2011-05-01

    Through laboratory research performed over the past ten years, many of the critical links between fracture characteristics and hydromechanical and mechanical behaviour have been made for individual fractures. One of the remaining challenges at the laboratory scale is to directly link fracture morphology and shear behaviour with changes in stress and shear direction. A series of laboratory experiments were performed on cement mortar replicas of a granite sample with a natural fracture perpendicular to the axis of the core. Results show that there is a strong relationship between the fracture's geometry and its mechanical behaviour under shear stress and the resulting damage. Image analysis, geostatistical, stereological and directional data techniques are applied in combination to the experimental data. The results highlight the role of the geometric characteristics of the fracture surfaces (surface roughness; size, shape, locations and orientations of the asperities to be damaged) in shear behaviour. A notable improvement in the understanding of shear is that shear behaviour is controlled by the apparent dip in the shear direction of the elementary facets forming the fracture.

  3. Landsat analysis of tropical forest succession employing a terrain model

    Science.gov (United States)

    Barringer, T. H.; Robinson, V. B.; Coiner, J. C.; Bruce, R. C.

    1980-01-01

    Landsat multispectral scanner (MSS) data have yielded a dual classification of rain forest and shadow in an analysis of a semi-deciduous forest on Mindoro Island, Philippines. Both a spatial terrain model, using a fifth-order polynomial trend surface analysis to quantitatively estimate the general spatial variation in the data set, and a spectral terrain model, based on the MSS data, have been set up. A discriminant analysis, using both sets of data, has suggested that shadowing effects may be due primarily to local variations in the spectral regions and can therefore be compensated for through the decomposition of the spatial variation in both elevation and MSS data.

  4. A Quotient Space Approximation Model of Multiresolution Signal Analysis

    Institute of Scientific and Technical Information of China (English)

    Ling Zhang; Bo Zhang

    2005-01-01

    In this paper, we present a quotient space approximation model of multiresolution signal analysis and discuss the properties and characteristics of the model. Then the comparison between wavelet transform and the quotient space approximation is made. First, when wavelet transform is viewed from the new quotient space approximation perspective, it may help us to gain an insight into the essence of multiresolution signal analysis. Second, from the similarity between wavelet and quotient space approximations, it is possible to transfer the rich wavelet techniques into the latter so that a new way for multiresolution analysis may be found.

  5. MMA, A Computer Code for Multi-Model Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations.
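
    The ranking arithmetic MMA automates can be sketched directly; the snippet below computes AIC, AICc and BIC from maximized log-likelihoods and converts an information criterion into posterior model weights. The log-likelihoods, parameter counts and sample size are hypothetical.

      import numpy as np

      n = 50                                       # number of observations
      logL = np.array([-120.3, -118.9, -118.5])    # maximized log-likelihoods
      k = np.array([3, 5, 8])                      # parameters per model

      aic  = -2 * logL + 2 * k
      aicc = aic + 2 * k * (k + 1) / (n - k - 1)   # small-sample correction
      bic  = -2 * logL + k * np.log(n)

      def weights(ic):
          """Posterior model probabilities from an information criterion."""
          d = ic - ic.min()
          w = np.exp(-0.5 * d)
          return w / w.sum()

      for name, ic in (("AIC", aic), ("AICc", aicc), ("BIC", bic)):
          print(name, ic.round(1), "weights:", weights(ic).round(3))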

  6. ANALYSIS OF ORGANIZATIONAL CULTURE WITH SOCIAL NETWORK MODELS

    OpenAIRE

    Titov, S.

    2015-01-01

    Organizational culture is nowadays the object of numerous scientific papers. However, only a marginal part of existing research attempts to use formal models of organizational cultures. The lack of organizational culture models significantly limits further research in this area and restricts the application of theory to the practice of organizational culture change projects. The article consists of general views on the potential application of network models and social network analysis to th...

  7. Schizophrenia: the testing of genetic models by pedigree analysis.

    OpenAIRE

    Stewart, J.; Debray, Q; Caillard, V

    1980-01-01

    Simulated pedigrees of schizophrenia generally show a clear peak in their likelihood surface corresponding to analysis by the genetic model that served as the basis for the simulation. The likelihood surface obtained with real data permits determination of the allelic frequency and the selection of an optimal one-locus, two-locus, and four-locus model. These three models have certain features in common, notably, a relatively high frequency of the allele predisposing to schizophrenia (about...

  8. Modelling and analysis of behaviour of biomedical scaffolds

    OpenAIRE

    Reali, Luca

    2015-01-01

    Since articular cartilage related diseases are an increasing issue and are nowadays treated by invasive prosthesis implantations, there is a strong demand for new solutions such as those offered by scaffold engineering. This work deals with the characterization and modelling of polymeric fabrics for cartilage repair. Creep test data at three different applied forces were successfully modelled both analytically, using viscoelastic models, and by finite element analysis, which embraced the...

  9. Synthesis analysis of regression models with a continuous outcome

    OpenAIRE

    Zhou, Xiao-Hua; Hu, Nan; Hu, Guizhou; Root, Martin

    2009-01-01

    To estimate a multivariate regression model from multiple individual studies, it would be challenging to obtain results if the input from individual studies provides only univariate or incomplete multivariate regression information. Samsa et al. (J. Biomed. Biotechnol. 2005; 2:113–123) proposed a simple method to combine coefficients from univariate linear regression models into a multivariate linear regression model, a method known as synthesis analysis. However, the validity of this method...

  10. Comparative analysis of credit risk models for loan portfolios.

    OpenAIRE

    Han, C

    2014-01-01

    This study is distinct from previous studies in its inclusion of new models, its consideration of sector correlation and its comprehensive sensitivity analysis. CreditRisk+, CreditMetrics, the Basel II internal-ratings-based method and the Mercer Oliver Wyman model are considered. The risk factor distribution and the relationship between risk components and risk factors are the key distinguishing characteristics of each model. CreditRisk+, due to its extra degree of freedom, has the high...

  11. A Novel Two-Dimensional Customer Knowledge Analysis Model

    OpenAIRE

    Liu Xuelian; Nopasit Chakpitak; Pitipong Yodmongkol

    2015-01-01

    Customer knowledge has increasing importance in the customer-oriented enterprise. A customer knowledge management process with models can help managers to identify the real value chain in a business process. The purpose of the paper is to develop a tool for the classification and processing of customer knowledge from the perspective of knowledge management. By reviewing previous customer knowledge management models, this paper proposes a novel two-dimensional customer knowledge analysis model, which makes custom...

  12. ANALYSIS OF THE MECHANISM MODELS OF TECHNOLOGICAL INNOVATION DIFFUSION

    Institute of Scientific and Technical Information of China (English)

    XU Jiuping; HU Minan

    2004-01-01

    This paper analyzes the mechanism and principles of technological innovation diffusion on the basis of quantitative analysis. It then sets up a diffusion model of innovation incorporating price, advertising and distribution, a diffusion model of innovation including various kinds of consumers, and a substitution model between the new technology and the old one, applying system dynamics, optimization methods, probabilistic methods and computer simulation. Finally, this paper concludes with some practical observations from a case study.

  13. Semiparametric theory based MIMO model and performance analysis

    Institute of Scientific and Technical Information of China (English)

    XU Fang-min; XU Xiao-dong; ZHANG Ping

    2007-01-01

    In this article, a new approach for modeling multi-input multi-output (MIMO) systems with unknown nonlinear interference is introduced. A semiparametric-theory-based MIMO model is established, and kernel estimation is applied to combat the nonlinear interference. Furthermore, we derive the MIMO capacity for these systems and explore the asymptotic properties of the new channel matrix via theoretical analysis. The simulation results show that the semiparametric-theory-based modeling and kernel estimation are valid for combating this kind of interference.

  14. Apply Functional Modelling to Consequence Analysis in Supervision Systems

    OpenAIRE

    Zhang, Xinxin; Lind, Morten; Gola, Giulio; Ravn, Ole

    2013-01-01

    This paper first presents the purpose and goals of applying the functional modelling approach to consequence analysis by adopting Multilevel Flow Modelling (MFM). MFM models describe a complex system at multiple abstraction levels in both the means-end dimension and the whole-part dimension, and contain causal relations between functions and goals. A rule-based system can be developed to trace the causal relations and perform consequence propagation. This paper illustrates how to use MFM for cons...

  15. Computer-based modelling and analysis in engineering geology

    OpenAIRE

    Giles, David

    2014-01-01

    This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my...

  16. SOA Modeling Patterns for Service Oriented Discovery and Analysis

    CERN Document Server

    Bell, Michael

    2010-01-01

    Learn the essential tools for developing a sound service-oriented architecture. SOA Modeling Patterns for Service-Oriented Discovery and Analysis introduces a universal, easy-to-use, and nimble SOA modeling language to facilitate the service identification and examination life cycle stage. This business and technological vocabulary will benefit your service development endeavors and foster organizational software asset reuse and consolidation, and reduction of expenditure. Whether you are a developer, business architect, technical architect, modeler, business analyst, team leader, or manager,

  17. Flood modeling for risk evaluation: a MIKE FLOOD sensitivity analysis

    OpenAIRE

    Vanderkimpen, P.; Peeters, P

    2008-01-01

    The flood risk for a section of the Belgian coastal plain was evaluated by means of dynamically linked 1D (breach) and 2D (floodplain) hydraulic models. First, a one-at-a-time factor screening was performed to evaluate the relative importance of various model processes and parameters. Subsequently, a systematic sensitivity analysis was added to establish the contribution of the most influential factors (breach growth and surface roughness) to hydraulic modeling uncertainty. Finally, the uncer...

  18. Evolution analysis of the states of the EZ model

    International Nuclear Information System (INIS)

    Based on a suitable choice of states, this paper studies the stability of the equilibrium state of the EZ model by regarding the evolution of the EZ model as a Markov chain and by showing that the Markov chain is ergodic. The Markov analysis is applied to the EZ model with a small number of agents; the exact equilibrium state for N = 5 and numerical results for N = 18 are obtained. (cross-disciplinary physics and related areas of science and technology)
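
    The underlying machinery, finding the equilibrium state of an ergodic Markov chain, can be sketched generically; the transition matrix below is a hypothetical 3-state example, not the EZ model's actual state space.

      import numpy as np

      # Hypothetical row-stochastic transition matrix over 3 states
      P = np.array([[0.5, 0.3, 0.2],
                    [0.2, 0.6, 0.2],
                    [0.1, 0.4, 0.5]])

      # Equilibrium distribution: left eigenvector of P for eigenvalue 1
      evals, evecs = np.linalg.eig(P.T)
      pi = np.real(evecs[:, np.argmax(np.real(evals))])
      pi /= pi.sum()                    # normalise (also fixes the sign)
      print("equilibrium distribution:", pi.round(4))
      assert np.allclose(pi @ P, pi)    # pi is invariant under the dynamics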

  19. QuantUM: Quantitative Safety Analysis of UML Models

    CERN Document Server

    Leitner-Fischer, Florian; 10.4204/EPTCS.57.2

    2011-01-01

    When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods it is still difficult for software and system architects to integrate these techniques into their every day work. This is mainly due to the lack of methods that can be directly applied to architecture level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysi...

  20. Analysis of solar models - neutrinos and oscillations

    International Nuclear Information System (INIS)

    The theory of stellar structure and evolution is used to calculate the properties of a variety of objects from red giants and supernova precursors to white dwarfs and neutron stars. Accurate tests of the theory in the context of these applications are generally not available. The sun, as the nearest star, provides a unique example of a star which can be subjected to a variety of precise tests not possible with remote stars. We will concentrate on two of these tests - solar neutrinos and solar oscillations - which currently indicate that there is something seriously wrong with our standard solar model. Although we do not yet know what the source of the error is, it is quite possible that its correction will require some modification of the results of other applications of stellar structure theory. It now seems unlikely that the difficulty with the solar neutrino experiment lies in the experiment itself. The combination of the difficulty with the solar neutrino flux and the difficulty with the solar oscillation frequencies suggests that the solar neutrino problem is a failure of stellar structure theory rather than a failure of weak interaction theory, although this latter possibility cannot yet be firmly ruled out

  1. COMPARATIVE ANALYSIS OF EMPLOYMENT POLICY MODELS: RUSSIAN AND INTERNATIONAL EXPERIENCE

    Directory of Open Access Journals (Sweden)

    Konstantin Khrabrov

    2016-04-01

    The article provides a comparative analysis of existing theoretical models for the regulation of employment, identified by studying the public administration experience of the EU countries, the USA and Japan. In particular, the paper shows that international experience with employment policy includes various models that define the existing relationships in the labour market. The resulting substantive analysis of the models indicates how the applied employment-regulation policy depends on the state of the institutional environment of the national economy.

  2. Reflector modeling in advanced nodal analysis of pressurized water reactors

    International Nuclear Information System (INIS)

    Recent progress in the modeling of the reflector regions of pressurized water reactors within the framework of advanced nodal diffusion analysis methods is reviewed. Attention is focused on the modeling of the radial reflector of a PWR which is most problematic because of its irregular and heterogeneous structure. Numerical results are presented to demonstrate the high accuracy of the methods which are now available for generating nodal reflector parameters and it is shown that errors due to reflector modeling in multi-dimensional nodal reactor analysis can be practically eliminated. (author). 23 refs, 1 fig., 2 tabs

  3. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: ► The paper discusses the validation of creep rupture models derived from statistical analysis. ► It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. ► The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. ► The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  4. A Grammar Analysis Model for the Unified Multimedia Query Language

    Institute of Scientific and Technical Information of China (English)

    Zhong-Sheng Cao; Zong-Da Wu; Yuan-Zhen Wang

    2008-01-01

    The unified multimedia query language (UMQL) is a powerful general-purpose multimedia query language, well suited to multimedia information retrieval. The paper proposes a grammar analysis model to implement effective grammatical processing for the language. It separates the grammar analysis of a UMQL query specification into two phases, syntactic analysis and semantic analysis, and then uses extended Backus-Naur form (EBNF) and logical algebra, respectively, to specify the restrictive grammar rules. As a result, the model can present error-guiding information for a query specification with incorrect grammar. The model not only suits the processing of UMQL queries well, but also offers guidance for other projects concerning query processing for descriptive query languages.

  5. Microeconomic co-evolution model for financial technical analysis signals

    CERN Document Server

    Rotundo, G

    2006-01-01

    Technical analysis (TA) was used for a long time before more sophisticated instruments for financial forecasting became available, in order to suggest decisions on the basis of the occurrence of data patterns. Many mathematical and statistical tools for the quantitative analysis of financial markets have experienced fast and wide growth and have the power to overcome classical technical analysis methods. This paper aims to give a measure of the reliability of some information used in TA by exploring the probability of its occurrence within a particular microeconomic agent-based model of markets, i.e., the co-evolution Bak-Sneppen model originally invented for describing species population evolutions. After having proved the practical interest of such a model in describing so-called avalanches in financial indices during the pre-bursting rise of a bubble, the attention focuses on the occurrence of trend line detection crossing of meaningful barriers, those that give rise to some usual technical analysis str...

  6. Demographics of reintroduced populations: estimation, modeling, and decision analysis

    Science.gov (United States)

    Converse, Sarah J.; Moore, Clinton T.; Armstrong, Doug P.

    2013-01-01

    Reintroduction can be necessary for recovering populations of threatened species. However, the success of reintroduction efforts has been poorer than many biologists and managers would hope. To increase the benefits gained from reintroduction, management decision making should be couched within formal decision-analytic frameworks. Decision analysis is a structured process for informing decision making that recognizes that all decisions have a set of components—objectives, alternative management actions, predictive models, and optimization methods—that can be decomposed, analyzed, and recomposed to facilitate optimal, transparent decisions. Because the outcome of interest in reintroduction efforts is typically population viability or related metrics, models used in decision analysis efforts for reintroductions will need to include population models. In this special section of the Journal of Wildlife Management, we highlight examples of the construction and use of models for informing management decisions in reintroduced populations. In this introductory contribution, we review concepts in decision analysis, population modeling for analysis of decisions in reintroduction settings, and future directions. Increased use of formal decision analysis, including adaptive management, has great potential to inform reintroduction efforts. Adopting these practices will require close collaboration among managers, decision analysts, population modelers, and field biologists.

  7. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...

  8. Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods

    Science.gov (United States)

    Berry, J. K.; Tomlin, C. D.

    1982-01-01

    Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.

  9. Conservative RIA analysis with use of spatial kinetic model

    International Nuclear Information System (INIS)

    A description of the methodology of conservative RIA analysis using a spatial kinetic reactor core model is presented. It is shown that its application yields a more conservative assessment of the reactor core parameters for which acceptance criteria for rod-ejection RIA are established, in comparison with a point/one-dimensional kinetic model. Application of a methodology based on a point/one-dimensional kinetic model and a power peaking factor obtained from stationary calculations of states that can be realized during the RIA is also allowable if the choice of the given state is substantiated. However, as is shown, the choice of the reactor core state for the definition of the power peaking factor is not trivial, and it can be calculated on the basis of rod-ejection RIA analysis with a 3-D spatial kinetic reactor core model. The studies performed lead to the conclusion that normative documents should require the use of spatial kinetics software for RIA analysis. (authors)

  10. Development of Wolsong Unit 2 Containment Analysis Model

    Energy Technology Data Exchange (ETDEWEB)

    Hoon, Choi [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of); Jin, Ko Bong; Chan, Park Young [Hanbat National Univ., Daejeon (Korea, Republic of)

    2014-05-15

    To prepare for the full-scope safety analysis of Wolsong unit 2 with modified fuel, input decks for various objectives, readable by GOTHIC 7.2b(QA), were developed and tested in steady-state simulation. A detailed nodalization of 39 control volumes and 92 flow paths was constructed to determine the differential pressure across internal walls and the hydrogen concentration and distribution inside containment. A lumped model with 15 control volumes and 74 flow paths has also been developed to reduce the computer run time for assessments whose results are not sensitive to the detailed thermal-hydraulic distribution inside containment, such as peak pressure, pressure-dependent signals and radionuclide release. The input data files provide simplified representations of the geometric layout of the containment building (volumes, dimensions, flow paths, doors, panels, etc.) and the performance characteristics of the various containment subsystems. The parameter values are based on best-estimate or design values. The analysis values are determined by the conservatism appropriate to the analysis objective and may differ between analysis objectives. Basic input decks of Wolsong unit 2 were developed for various analysis purposes with GOTHIC 7.2b(QA). Depending on the analysis objective, two types of models are prepared. The detailed model represents each confined room in the containment as a separate node; all of the geometric data are based on the drawings of Wolsong unit 2. The developed containment models simulate the steady state well at the designated initial conditions. These base models will be used for Wolsong unit 2 whenever full-scope safety analysis is needed.

  11. Mixed waste treatment model: Basis and analysis

    International Nuclear Information System (INIS)

    The Department of Energy's Programmatic Environmental Impact Statement (PEIS) required treatment system capacities for risk and cost calculation. Los Alamos was tasked with providing these capacities to the PEIS team. This involved understanding the Department of Energy (DOE) Complex waste, making the necessary changes to correct for problems, categorizing the waste for treatment, and determining the treatment system requirements. The treatment system requirements depended on the incoming waste, which varied for each PEIS case. The treatment system requirements also depended on the type of treatment that was desired. Because different groups contributing to the PEIS needed specific types of results, we provided the treatment system requirements in a variety of forms. In total, some 40 data files were created for the TRU cases, and for the MLLW case, there were 105 separate data files. Each data file represents one treatment case consisting of the selected waste from various sites, a selected treatment system, and the reporting requirements for such a case. The treatment system requirements in their most basic form are the treatment process rates for unit operations in the desired treatment system, based on a 10-year working life and 20-year accumulation of the waste. These results were reported in cubic meters and for the MLLW case, in kilograms as well. The treatment system model consisted of unit operations that are linked together. Each unit operation's function depended on the input waste streams, waste matrix, and contaminants. Each unit operation outputs one or more waste streams whose matrix, contaminants, and volume/mass may have changed as a result of the treatment. These output streams are then routed to the appropriate unit operation for additional treatment until the output waste stream meets the treatment requirements for disposal. The total waste for each unit operation was calculated as well as the waste for each matrix treated by the unit

  12. Modelling and analysis of turbulent datasets using ARMA processes

    CERN Document Server

    Faranda, Davide; Dubrulle, Bérèngere; Daviaud, François; Saint-Michel, Brice; Herbert, Éric; Cortet, Pierre-Philippe

    2014-01-01

    We introduce a novel way to extract information from turbulent datasets by applying an ARMA statistical analysis. Such analysis goes well beyond the analysis of the mean flow and of the fluctuations and links the behavior of the recorded time series to a discrete version of a stochastic differential equation which is able to describe the correlation structure in the dataset. We introduce a new intermittency parameter Υ that measures the difference between the resulting analysis and the Obukhov model of turbulence, the simplest stochastic model reproducing both Richardson law and the Kolmogorov spectrum. We test the method on datasets measured in a von Kármán swirling flow experiment. We found that the ARMA analysis is well correlated with spatial structures of the flow, and can discriminate between two different flows with comparable mean velocities, obtained by changing the forcing. Moreover, we show that the intermittency parameter is highest in regions where shear layer vortices are present, t...
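
    A minimal sketch of the fitting step, assuming statsmodels: an ARMA(p, q) model is estimated for a recorded time series (here a synthetic AR(1) signal standing in for a measured velocity trace); the intermittency diagnostic itself is not reproduced.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(2)
      x = np.zeros(2000)
      for t in range(1, x.size):                 # synthetic AR(1) "signal"
          x[t] = 0.8 * x[t - 1] + rng.normal()

      res = ARIMA(x, order=(1, 0, 1)).fit()      # ARMA(1,1) = ARIMA(1,0,1)
      print(res.params)                          # const, AR, MA, sigma2 estimates
      print("AIC:", round(res.aic, 1))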

  13. The bivariate combined model for spatial data analysis.

    Science.gov (United States)

    Neyens, Thomas; Lawson, Andrew B; Kirby, Russell S; Faes, Christel

    2016-08-15

    To describe the spatial distribution of diseases, a number of methods have been proposed to model relative risks within areas. Most models use Bayesian hierarchical methods, in which one models both spatially structured and unstructured extra-Poisson variance present in the data. For modelling a single disease, the conditional autoregressive (CAR) convolution model has been very popular. More recently, a combined model was proposed that 'combines' ideas from the CAR convolution model and the well-known Poisson-gamma model. The combined model was shown to be a good alternative to the CAR convolution model when there was a large amount of uncorrelated extra-variance in the data. Fewer solutions exist for modelling two diseases simultaneously or modelling a disease in two sub-populations simultaneously. Furthermore, existing models are typically based on the CAR convolution model. In this paper, a bivariate version of the combined model is proposed in which the unstructured heterogeneity term is split up into terms that are shared and terms that are specific to the disease or subpopulation, while spatial dependency is introduced via a univariate or multivariate Markov random field. The proposed method is illustrated by analysis of disease data in Georgia (USA) and Limburg (Belgium) and in a simulation study. We conclude that the bivariate combined model constitutes an interesting model when two diseases are possibly correlated. As the choice of the preferred model differs between data sets, we suggest using the new and existing modelling approaches together and choosing the best model via goodness-of-fit statistics. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26928309

  14. Structural Simulations and Conservation Analysis - Historic Building Information Model (HBIM)

    Directory of Open Access Journals (Sweden)

    C. Dore

    2015-02-01

    In this paper the current findings of the Historic Building Information Model (HBIM) of the Four Courts in Dublin are presented. The HBIM forms the basis for both structural and conservation analysis to measure the impact of war damage, which still affects the building. A laser scan survey of the internal and external structure was carried out in the summer of 2014. After registration and processing of the laser scan survey, the HBIM of the damaged section of the building was created; it is presented as two separate workflows in this paper. The first is the model created from historic data; the second is a procedural and segmented model developed from the laser scan survey of the war-damaged drum and dome. From both models, structural damage and decay simulations will be developed for documentation and conservation analysis.

  15. An analysis of urban collisions using an artificial intelligence model.

    Science.gov (United States)

    Mussone, L; Ferrari, A; Oneta, M

    1999-11-01

    Traditional studies of road accidents estimate the effect of variables (such as vehicular flows, road geometry and vehicle characteristics) on the number of accidents. A descriptive statistical analysis of the accidents used in the model, over the period 1992-1995, is presented. The paper describes an alternative method based on artificial neural networks (ANN) to build a model for the analysis of vehicular accidents in Milan. The degree of danger of urban intersections under different scenarios is quantified by the ANN model. The first result is the methodology itself, which allows urban vehicular accidents to be modelled through the innovative use of ANN. Other results concern the model outputs: intersection complexity may determine a higher accident index depending on the regulation of the intersection, and the highest index for pedestrians being run over occurs at non-signalised intersections at night-time. PMID:10487346
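
    A hedged sketch of this type of model, assuming scikit-learn: a small feed-forward network is trained to predict a danger class from intersection descriptors. The features, the generating rule and the data are synthetic illustrations, not the Milan accident records.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(4)
      n = 1000
      # columns: normalised vehicular flow, signalised (0/1), night-time (0/1)
      X = np.column_stack([rng.uniform(0, 1, n),
                           rng.integers(0, 2, n),
                           rng.integers(0, 2, n)])
      # toy generating rule: danger grows with flow and is worst at
      # non-signalised intersections at night (echoing the paper's finding)
      risk = 1.2 * X[:, 0] + 0.8 * (1 - X[:, 1]) * X[:, 2]
      y = (risk + rng.normal(0, 0.2, n) > 1.0).astype(int)

      Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
      clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                          random_state=0).fit(Xtr, ytr)
      print("held-out accuracy:", round(clf.score(Xte, yte), 3))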

  16. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory in more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing and in computer and communication systems. • A chapter on ...
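
    As a worked example of the basic models such a course treats, the snippet below evaluates the closed-form steady-state measures of the M/M/1 queue; the arrival and service rates are arbitrary.

      def mm1_metrics(lam: float, mu: float) -> dict:
          """Steady-state M/M/1 measures for arrival rate lam < service rate mu."""
          if lam >= mu:
              raise ValueError("queue is unstable: need lam < mu")
          rho = lam / mu                  # server utilization
          L = rho / (1 - rho)             # mean number in system (Little: L = lam * W)
          W = 1 / (mu - lam)              # mean time in system
          Lq = rho ** 2 / (1 - rho)       # mean number waiting
          Wq = rho / (mu - lam)           # mean waiting time
          return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

      print(mm1_metrics(lam=8.0, mu=10.0))   # e.g. 8 arrivals/hr, 10 served/hr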

  17. Optimization model for air quality analysis in energy facility siting

    Energy Technology Data Exchange (ETDEWEB)

    Emanuel, W. R.; Murphy, B. D.; Huff, D. D.; Begovich, C. L.; Hurt, J. F.

    1977-09-01

    The siting of energy facilities on a regional scale is discussed with particular attention to environmental planning criteria. A multiple objective optimization model is proposed as a framework for the analysis of siting problems. Each planning criterion (e.g., air quality, water quality, or power demand) is treated as an objective function to be minimized or maximized subject to constraints in this optimization procedure. The formulation of the objective functions is illustrated by the development of a siting model for the minimization of human exposure to air pollutants. This air quality siting model takes the form of a linear programming problem. A graphical analysis of this type of problem, which provides insight into the nature of the siting model, is given. The air quality siting model is applied to an illustrative siting example for the Tennessee Valley area.
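
    The linear-programming structure described above can be sketched as follows: choose the capacity to site at each candidate location so that demand is met and per-site limits are respected while total population exposure is minimized. All coefficients are hypothetical.

      from scipy.optimize import linprog

      exposure = [4.0, 2.5, 6.0]       # exposure per MW sited at each candidate site
      demand = 100.0                   # MW that must be sited in total
      cap = [60.0, 50.0, 80.0]         # per-site capacity limits, MW

      res = linprog(
          c=exposure,                             # minimize total exposure
          A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand],  # meet demand exactly
          bounds=[(0.0, u) for u in cap],         # site capacity constraints
          method="highs",
      )
      print("capacity per site (MW):", res.x, "total exposure:", res.fun)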

  18. Discussion of OECD LWR Uncertainty Analysis in Modelling Benchmark

    International Nuclear Information System (INIS)

    The demand for best estimate calculations in nuclear reactor design and safety evaluations has increased in recent years. Uncertainty quantification has been highlighted as part of these best estimate calculations. The modelling aspects of uncertainty and sensitivity analysis are to be further developed and validated on scientific grounds in support of their performance and application to multi-physics reactor simulations. The Organization for Economic Co-operation and Development (OECD) / Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) has endorsed the creation of an Expert Group on Uncertainty Analysis in Modelling (EGUAM). Within the framework of the activities of EGUAM/NSC, the OECD/NEA initiated the Benchmark for Uncertainty Analysis in Modelling for Design, Operation, and Safety Analysis of Light Water Reactors (OECD LWR UAM benchmark). The general objective of the benchmark is to propagate the predictive uncertainties of code results through complex coupled multi-physics and multi-scale simulations. The benchmark is divided into three phases, with Phase I highlighting uncertainty propagation in stand-alone neutronics calculations, while Phases II and III focus on uncertainty analysis of the reactor core and the full system, respectively. This paper discusses the progress made in the Phase I calculations, the specifications for Phase II, and the upcoming challenges in defining the Phase III exercises. The main challenges of applying uncertainty quantification to complex code systems, in particular time-dependent coupled-physics models, are the large computational burden and the use of non-linear models (expected due to the physics coupling). (authors)

  19. Interval process model and non-random vibration analysis

    Science.gov (United States)

    Jiang, C.; Ni, B. Y.; Liu, N. Y.; Han, X.; Liu, J.

    2016-07-01

    This paper develops an interval process model for time-varying or dynamic uncertainty analysis when information of the uncertain parameter is inadequate. By using the interval process model to describe a time-varying uncertain parameter, only its upper and lower bounds are required at each time point rather than its precise probability distribution, which is quite different from the traditional stochastic process model. A correlation function is defined for quantification of correlation between the uncertain-but-bounded variables at different times, and a matrix-decomposition-based method is presented to transform the original dependent interval process into an independent one for convenience of subsequent uncertainty analysis. More importantly, based on the interval process model, a non-random vibration analysis method is proposed for response computation of structures subjected to time-varying uncertain external excitations or loads. The structural dynamic responses thus can be derived in the form of upper and lower bounds, providing an important guidance for practical safety analysis and reliability design of structures. Finally, two numerical examples and one engineering application are investigated to demonstrate the feasibility of the interval process model and corresponding non-random vibration analysis method.

  20. Global Sensitivity Analysis of Environmental Models: Convergence, Robustness and Validation

    Science.gov (United States)

    Sarrazin, Fanny; Pianosi, Francesca; Khorashadi Zadeh, Farkhondeh; Van Griensven, Ann; Wagener, Thorsten

    2015-04-01

    Global Sensitivity Analysis aims to characterize the impact that variations in model input factors (e.g. the parameters) have on the model output (e.g. simulated streamflow). In sampling-based Global Sensitivity Analysis, the sample size has to be chosen carefully in order to obtain reliable sensitivity estimates while spending computational resources efficiently. Furthermore, insensitive parameters are typically identified through the definition of a screening threshold: the theoretical value of their sensitivity index is zero, but in a sampling-based framework they regularly take non-zero values. There is little guidance available for these two steps in environmental modelling though. The objective of the present study is to support modellers in making appropriate choices, regarding both sample size and screening threshold, so that a robust sensitivity analysis can be implemented. We performed sensitivity analysis for the parameters of three hydrological models with increasing level of complexity (Hymod, HBV and SWAT), and tested three widely used sensitivity analysis methods (Elementary Effect Test or method of Morris, Regional Sensitivity Analysis, and Variance-Based Sensitivity Analysis). We defined criteria based on a bootstrap approach to assess three different types of convergence: the convergence of the value of the sensitivity indices, of the ranking (the ordering among the parameters) and of the screening (the identification of the insensitive parameters). We investigated the screening threshold through the definition of a validation procedure. The results showed that full convergence of the value of the sensitivity indices is not necessarily needed to rank or to screen the model input factors. Furthermore, typical values of the sample sizes that are reported in the literature can be well below the sample sizes that actually ensure convergence of ranking and screening.
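
    A minimal sketch of the bootstrap-based convergence check, using a simple rank-correlation sensitivity proxy rather than the study's EET/RSA/variance-based estimators: resample the input-output sample with replacement, recompute the indices, and inspect the spread; ranking and screening can be considered converged once the intervals stabilize.

      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(3)
      N = 500                                       # sample size under test
      X = rng.uniform(size=(N, 3))                  # three "model parameters"
      y = 5 * X[:, 0] + 2 * X[:, 1] ** 2 + 0.1 * rng.normal(size=N)  # toy model

      def indices(X, y):
          """Absolute rank correlation of each input with the output."""
          return np.array([abs(spearmanr(X[:, j], y)[0]) for j in range(X.shape[1])])

      boot = np.array([indices(X[idx], y[idx])
                       for idx in (rng.integers(0, N, size=N) for _ in range(200))])
      lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
      for j, (a, b) in enumerate(zip(lo, hi)):
          print(f"x{j}: sensitivity index in [{a:.2f}, {b:.2f}]")
      # Ranking/screening can be declared converged when these intervals
      # separate cleanly (or stop changing as N grows).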

  1. Model for Analysis of Energy Demand (MAED-2). User's manual

    International Nuclear Information System (INIS)

    The IAEA has been supporting its Member States in the area of energy planning for sustainable development. Development and dissemination of appropriate methodologies and their computer codes are important parts of this support. This manual has been produced to facilitate the use of the MAED model: Model for Analysis of Energy Demand. The methodology of the MAED model was originally developed by B. Chateau and B. Lapillonne of the Institut Economique et Juridique de l'Energie (IEJE) of the University of Grenoble, France, and was presented as the MEDEE model. Since then the MEDEE model has been developed and adapted to be appropriate for modelling various energy demand systems. The IAEA adopted the MEDEE-2 model and incorporated important modifications to make it more suitable for application in developing countries; it was named the MAED model. The first version of the MAED model was designed for DOS-based systems and was later converted for Windows. This manual presents the latest version of the MAED model. The most prominent feature of this version is its flexibility for representing the structure of energy consumption. The model now allows country-specific representations of energy consumption patterns using the MAED methodology. The user can now disaggregate energy consumption according to the needs and/or data availability in her/his country. As such, MAED has now become a powerful tool for modelling widely diverse energy consumption patterns. This manual presents the model in detail and provides guidelines for its application

  3. Precise methods for conducted EMI modeling,analysis,and prediction

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Focusing on state-of-the-art conducted EMI prediction, this paper presents a noise-source lumped-circuit modeling and identification method, an EMI modeling method based on multiple-slope approximation of switching transitions, and a double Fourier integral method for modeling PWM conversion units, to achieve accurate modeling of the EMI noise source. Meanwhile, a new sensitivity analysis method, a general coupling model for steel ground loops, and a partial element equivalent circuit method are proposed to identify and characterize conducted EMI coupling paths. The EMI noise and propagation modeling provide an accurate prediction of conducted EMI over the entire frequency range (0-10 MHz) with good practicability and generality. Finally, a new measurement approach is presented to identify the surface current of a large-dimension metal shell. The proposed analytical modeling methodology is verified by experimental results.

  4. Static analysis of a Model of the LDL degradation pathway

    DEFF Research Database (Denmark)

    Pilegaard, Henrik; Nielson, Flemming; Nielson, Hanne Riis

    2005-01-01

    BioAmbients is a derivative of mobile ambients that has shown promise for describing interesting features of the behaviour of biological systems. As for other ambient calculi, static program analysis can be used to compute safe approximations of the behavior of modelled systems. We use these tools to model and analyse the production of cholesterol in living cells and show that we are able to pinpoint the difference in behaviour between models of healthy systems and models of mutated systems giving rise to known diseases.

  5. Perturbation analysis for Monte Carlo continuous cross section models

    International Nuclear Information System (INIS)

    Sensitivity analysis, including both its forward and adjoint applications, collectively referred to hereinafter as Perturbation Analysis (PA), is an essential tool to complete Uncertainty Quantification (UQ) and Data Assimilation (DA). PA-assisted UQ and DA have traditionally been carried out for reactor analysis problems using deterministic, as opposed to stochastic, models for radiation transport. This is because PA requires many model executions to quantify how variations in input data, primarily cross sections, affect variations in the model's responses, e.g. detector readings, flux distribution, multiplication factor, etc. Although stochastic models are often sought for their higher accuracy, their repeated execution is at best computationally expensive and in reality intractable for typical reactor analysis problems involving many input data and output responses. Deterministic methods, however, achieve the computational efficiency needed to carry out PA by reducing problem dimensionality via various spatial and energy homogenization assumptions. This, however, introduces modeling error components into the PA results, which propagate to the subsequent UQ and DA analyses. The introduced errors are problem specific and are therefore expected to limit the applicability of UQ and DA analyses to reactor systems that satisfy the introduced assumptions. This manuscript introduces a new method to complete PA employing a continuous cross section stochastic model, performed in a computationally efficient manner. If successful, the modeling error components introduced by deterministic methods could be eliminated, thereby allowing for wider applicability of DA and UQ results. Two MCNP models demonstrate the application of the new method: a critical Pu sphere (Jezebel) and a Pu fast metal array (the Russian BR-1). The PA is completed for reaction rate densities, reaction rate ratios, and the multiplication factor. (author)
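
    As a toy illustration of the forward-PA idea (not the record's MCNP-based method), the sketch below differentiates a one-group infinite-medium multiplication factor with respect to a cross section and checks the finite-difference estimate against the analytic derivative; all cross-section values are invented.

```python
# Toy forward perturbation analysis: sensitivity of k_inf = nu*Sigma_f / Sigma_a
# to the absorption cross section. Numbers are illustrative assumptions only.

def k_inf(nu_sigma_f, sigma_a):
    return nu_sigma_f / sigma_a

nu_sigma_f, sigma_a = 0.055, 0.050     # hypothetical macroscopic cross sections (1/cm)
k0 = k_inf(nu_sigma_f, sigma_a)

h = 1e-6                               # relative perturbation
fd = (k_inf(nu_sigma_f, sigma_a * (1 + h)) - k0) / (sigma_a * h)
analytic = -nu_sigma_f / sigma_a**2    # d k_inf / d Sigma_a

print(f"k_inf = {k0:.4f}")
print(f"dk/dSigma_a: finite difference {fd:.4f}, analytic {analytic:.4f}")
# The relative sensitivity (Sigma_a / k) dk/dSigma_a is exactly -1 here.
print(f"relative sensitivity = {fd * sigma_a / k0:+.4f}")
```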

  6. Computer Models for IRIS Control System Transient Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gary D. Storrick; Bojan Petrovic; Luca Oriani

    2007-01-31

    This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled “Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor” focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design – such as the lack of a detailed secondary system or I&C system designs – makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I&C development process

  7. Computer Models for IRIS Control System Transient Analysis

    International Nuclear Information System (INIS)

    This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled 'Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor' focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design – such as the lack of a detailed secondary system or I and C system designs – makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I and C development process.

  8. Chromatin structure analysis based on a hierarchic texture model.

    Science.gov (United States)

    Wolf, G; Beil, M; Guski, H

    1995-02-01

    The quantification of chromatin structures is an important part of nuclear grading of malignant and premalignant lesions. In order to achieve high accuracy, computerized image analysis systems have been applied in this process. Chromatin texture analysis of cell nuclei requires a suitable texture model, and a hierarchic model seemed the most suitable for this purpose. It assumes that texture consists of homogeneous regions (textons). Based on this model, two approaches to texture segmentation and feature extraction were investigated using sections of cervical tissue. We examined the reproducibility of the measurement under changing optical conditions. The coefficients of variation of the texture features ranged from 2.1% to 16.9%. The features were tested for their discriminating capability in a pilot study including 30 cases of cervical dysplasia and carcinoma. The overall classification accuracy reached 65%. This study presents an automated technique for texture analysis that is similar to human perception. PMID:7766266

  9. Thermal and mechanical analysis for the detailed model using submodel

    Energy Technology Data Exchange (ETDEWEB)

    Kuh, Jung Eui; Kang, Chul Hyung; Park, Jeong Hwa [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-11-01

    A very large model is required for the TM analysis of an HLRW repository, while at the same time a very fine mesh is needed to simulate precisely the main parts of the analysis, e.g., the canister and buffer. Satisfying both requirements in a single model is practically impossible because of the memory size and computing time involved. In this report, the submodel concept in ABAQUS is used to handle this difficulty: only the part of interest is modelled in detail, and the result of the full-scale model is used as the boundary condition of this detailed submodel. Following this computation procedure, the temperature distribution in the buffer and canister could be computed precisely. This approach can be applied to the TM analysis of the buffer and canister, or of a finite-size repository. 12 refs., 28 figs., 9 tabs. (Author)

  10. Analysis of deterministic cyclic gene regulatory network models with delays

    CERN Document Server

    Ahsen, Mehmet Eren; Niculescu, Silviu-Iulian

    2015-01-01

    This brief examines a deterministic, ODE-based model for gene regulatory networks (GRN) that incorporates nonlinearities and time-delayed feedback. An introductory chapter provides some insights into molecular biology and GRNs. The mathematical tools necessary for studying the GRN model are then reviewed, in particular Hill functions and Schwarzian derivatives. One chapter is devoted to the analysis of GRNs under negative feedback with time delays and a special case of a homogenous GRN is considered. Asymptotic stability analysis of GRNs under positive feedback is then considered in a separate chapter, in which conditions leading to bi-stability are derived. Graduate and advanced undergraduate students and researchers in control engineering, applied mathematics, systems biology and synthetic biology will find this brief to be a clear and concise introduction to the modeling and analysis of GRNs.
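
    A minimal sketch of the kind of system the brief studies: a cyclic three-gene network under negative feedback with Hill-type repression, written here as plain ODEs (the time delays treated in the book are omitted) and with invented parameter values.

```python
# Cyclic GRN under negative feedback with Hill-type repression (delays omitted).
# Parameters are illustrative assumptions, not taken from the brief.
import numpy as np
from scipy.integrate import solve_ivp

beta, n, gamma = 4.0, 4.0, 1.0        # production rate, Hill coefficient, decay

def hill_repression(x):
    return beta / (1.0 + x**n)        # decreasing Hill function

def grn(t, x):
    # Each gene is repressed by the previous one in the cycle.
    return [hill_repression(x[i - 1]) - gamma * x[i] for i in range(3)]

sol = solve_ivp(grn, (0, 50), [1.0, 1.5, 2.0], dense_output=True)
# Depending on beta and n, trajectories converge or settle into oscillation.
print("final state:", sol.y[:, -1])
```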

  11. Development of Seismic Analysis Model and Time History Analysis for KALIMER-600

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Gyeong Hoi; Lee, J. H

    2007-02-15

    This report describes a simple seismic analysis model of the KALIMER-600 sodium-cooled fast reactor and its application to seismic time history analysis. To develop the simple seismic analysis model, detailed 3-D finite element analyses of the main components, the IHTS piping system, and the reactor building were carried out to verify the dynamic characteristics of each part of the simple model. Using the developed simple model, seismic time history analyses were performed for both the seismically isolated and the non-isolated designs of KALIMER-600. From a comparison of the calculated floor response spectra, it is verified that the seismically isolated KALIMER-600 reactor building shows excellent seismic isolation performance and assures seismic integrity.

  12. Which biomechanical models are currently used in standing posture analysis?

    Science.gov (United States)

    Crétual, A

    2015-11-01

    In 1995, David Winter concluded that postural analysis of upright stance was often restricted to studying the trajectory of the center of pressure (CoP). However, postural control means regulation of the center of mass (CoM) with respect to CoP. As CoM is only accessible by using a biomechanical model of the human body, the present article proposes to determine which models are actually used in postural analysis, twenty years after Winter's observation. To do so, a selection of 252 representative articles dealing with upright posture and published during the four last years has been checked. It appears that the CoP model largely remains the most common one (accounting for nearly two thirds of the selection). Other models, CoP/CoM and segmental models (with one, two or more segments) are much less used. The choice of the model does not appear to be guided by the population studied. Conversely, while some confusion remains between postural control and the associated concepts of stability or strategy, this choice is better justified for real methodological concerns when dealing with such high-level parameters. Finally, the computation of the CoM continues to be a limitation in achieving a more complete postural analysis. This unfortunately implies that the model is chosen for technological reasons in many cases (choice being a euphemism here). Some effort still has to be made so that bioengineering developments allow us to go beyond this limit. PMID:26388359

  13. ORMONTE, Uncertainty Analysis for User-Developed System Models

    International Nuclear Information System (INIS)

    1 - Description of program or function: ORMONTE is a generic multivariable uncertainty analysis driver which can be linked to any FORTRAN model supplied by the user. The user tells ORMONTE which variables in his model are uncertain and describes the associated probability distributions. The user also tells ORMONTE which outputs from his model are of interest and for which uncertainty profiles are desired. Given the uncertainties in the inputs, ORMONTE samples the user-defined input distributions and 'drives' or runs the user's model enough times such that a probability histogram or profile is constructed for the user-defined outputs of interest. ORMONTE can also perform sequential one-variable-at-a-time sensitivity studies and elasticity analysis. The user-supplied model is not restricted to a shielding model. Any FORTRAN model where uncertain outputs can be represented as functions of uncertain, independent inputs can be used. The ORMONTE package includes a set of Probability Data Analysis (PDA) routines for converting raw probability data into probability distribution format suitable for input to ORMONTE. 2 - Method of solution: ORMONTE uses the Monte Carlo technique to sample user-defined input probability distributions. 3 - Restrictions on the complexity of the problem: None noted
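
    A minimal Python sketch of ORMONTE's mode of operation, assuming a stand-in shielding-like response function in place of the user's FORTRAN model; the distributions and constants below are illustrative only.

```python
# Monte Carlo uncertainty driver: sample user-defined input distributions and
# "drive" a user-supplied model to build an output profile. The model function
# is a hypothetical stand-in, not ORMONTE's FORTRAN interface.
import numpy as np

rng = np.random.default_rng(42)

def user_model(thickness_cm, source_strength):
    # Hypothetical shielding-like response: exponential attenuation.
    return source_strength * np.exp(-0.5 * thickness_cm)

n = 10_000
thickness = rng.normal(10.0, 1.0, n)                        # uncertain input 1
source = rng.lognormal(mean=0.0, sigma=0.2, size=n) * 1e6   # uncertain input 2

dose = user_model(thickness, source)
print(f"mean = {dose.mean():.3e}, 5th/95th percentiles = "
      f"{np.percentile(dose, 5):.3e} / {np.percentile(dose, 95):.3e}")
```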

  14. Analysis of Sting Balance Calibration Data Using Optimized Regression Models

    Science.gov (United States)

    Ulbrich, N.; Bader, Jon B.

    2010-01-01

    Calibration data of a wind tunnel sting balance was processed using a candidate math model search algorithm that recommends an optimized regression model for the data analysis. During the calibration the normal force and the moment at the balance moment center were selected as independent calibration variables. The sting balance itself had two moment gages. Therefore, after analyzing the connection between calibration loads and gage outputs, it was decided to choose the difference and the sum of the gage outputs as the two responses that best describe the behavior of the balance. The math model search algorithm was applied to these two responses. An optimized regression model was obtained for each response. Classical strain gage balance load transformations and the equations of the deflection of a cantilever beam under load are used to show that the search algorithm's two optimized regression models are supported by a theoretical analysis of the relationship between the applied calibration loads and the measured gage outputs. The analysis of the sting balance calibration data set is a rare example of a situation when terms of a regression model of a balance can directly be derived from first principles of physics. In addition, it is interesting to note that the search algorithm recommended the correct regression model term combinations using only a set of statistical quality metrics that were applied to the experimental data during the algorithm's term selection process.
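
    A hedged sketch of the regression step: fitting the sum and the difference of two gage outputs as linear functions of the applied normal force and moment. The synthetic data and coefficients are invented; the paper's optimized term combinations came from its search algorithm, which is not reproduced here.

```python
# Fit linear regression models for the difference and sum of two gage outputs
# as functions of normal force N and moment M. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
N = rng.uniform(-100, 100, 50)        # applied normal force, hypothetical units
M = rng.uniform(-50, 50, 50)          # moment at the balance moment center

# Assumed "true" behavior: difference tracks N, sum tracks M (plus noise).
r_diff = 0.8 * N + 0.02 * M + rng.normal(0, 0.5, 50)
r_sum  = 0.05 * N + 1.1 * M + rng.normal(0, 0.5, 50)

X = np.column_stack([np.ones_like(N), N, M])      # intercept + linear terms
for name, r in [("difference", r_diff), ("sum", r_sum)]:
    coef, *_ = np.linalg.lstsq(X, r, rcond=None)
    print(name, "model coefficients (b0, bN, bM):", np.round(coef, 3))
```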

  15. Investigation of turbulence modelling in thermal stratification analysis

    International Nuclear Information System (INIS)

    In-vessel thermal stratification analysis was carried out using a multidimensional thermohydraulic analysis code, in which a higher-order finite difference scheme was applied to the convection terms. Discussions centred on the buoyancy modelling in the vicinity of the stratification interface through comparisons between experiment and calculation. Computational results were obtained from the following three turbulence models: (i) the k-ε model with a constant turbulent Prandtl number Prt, (ii) the k-ε model with the turbulent Prandtl number being dependent on the local Richardson number Ri, and (iii) the algebraic stress model. Numerical analysis of the stratification phenomena using the higher-order scheme showed that, in general, the modelling of the buoyancy terms appearing in the turbulence transport equations was the most important key to successful results. When the k-ε model was used, it was pointed out that a dependence on the local Richardson number must be carefully included in the turbulent Prandtl number. In this case, however, the range of applicability was limited to the phenomena observed in water systems, because the model was constructed and calibrated for water experiments. Overall it was found that the calculated stratification interface rise agreed well with experimental results in water and sodium insofar as the algebraic stress model was utilized. In conclusion, for predicting the behaviour of thermal stratification phenomena in liquid metal cooled reactors, the coupled use of the higher-order difference scheme and the algebraic stress model was most appropriate and recommended. ((orig.))

  16. Surrogate models for efficient stability analysis of brake systems

    Science.gov (United States)

    Nechak, Lyes; Gillot, Frédéric; Besset, Sébastien; Sinou, Jean-Jacques

    2015-07-01

    This study assesses the capacity of global sensitivity analysis, combined with the kriging formalism, to support the robust stability analysis of brake systems, which is too costly when performed with the classical complex eigenvalue analysis (CEA) based on finite element models (FEMs). Considering a simplified brake system, the global sensitivity analysis is first shown to be very helpful for understanding the effects of design parameters on the brake system's stability. This is enabled by the so-called Sobol indices, which discriminate design parameters with respect to their influence on the stability. Consequently, only the uncertainty of the influential parameters is taken into account in the following step, namely, surrogate modelling based on kriging. The latter is then demonstrated to be an interesting alternative to FEMs since it allows, at lower cost, an accurate estimation of the system's proportions of instability corresponding to the influential parameters.
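
    The two-step idea can be sketched as follows: fit a kriging (Gaussian process) surrogate to a toy stand-in for the brake model, then estimate first-order Sobol indices cheaply on the surrogate by the pick-freeze method. The toy function, kernel and sample sizes are assumptions.

```python
# Kriging surrogate + variance-based (Sobol) sensitivity on a toy function
# standing in for an expensive complex-eigenvalue brake analysis.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def brake_model(x):                       # toy stand-in, not a real CEA
    return np.sin(x[:, 0]) + 0.3 * x[:, 1] ** 2

rng = np.random.default_rng(1)
X_train = rng.uniform(-1, 1, (60, 2))     # a few "expensive" model runs
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(
    X_train, brake_model(X_train))

N = 20_000                                # cheap surrogate evaluations
A = rng.uniform(-1, 1, (N, 2))
B = rng.uniform(-1, 1, (N, 2))
fA = gp.predict(A)
var = fA.var()
for i in range(2):
    ABi = B.copy()
    ABi[:, i] = A[:, i]                   # pick-freeze: keep variable i from A
    S_i = np.mean(fA * gp.predict(ABi)) - fA.mean() ** 2
    print(f"first-order Sobol index S_{i + 1} = {S_i / var:.2f}")
```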

  17. Prediction Model of Data Envelopment Analysis with Undesirable Outputs

    Institute of Scientific and Technical Information of China (English)

    边馥萍; 范宇

    2004-01-01

    Data envelopment analysis (DEA) has become a standard non-parametric approach to productivity analysis, especially to the relative efficiency analysis of decision making units (DMUs). Extended to the prediction field, it can solve prediction problems with multiple inputs and outputs which cannot be solved easily by regression analysis. But traditional DEA models cannot handle problems with undesirable outputs. In this paper, therefore, the inherent relationship between goal programming and the DEA method is explored, based on the relationship between multiple goal programming and goal programming, and a mixed DEA model is built in which all factors of inputs and undesirable outputs decrease in different proportions while, at the same time, all factors of desirable outputs increase in different proportions.
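
    For orientation, the sketch below solves the classical input-oriented CCR DEA efficiency problem with scipy's linear programming. This is a generic DEA formulation, not the paper's mixed model; one common workaround for undesirable outputs (treating them like inputs to be reduced) is indicated in the comments.

```python
# Input-oriented CCR DEA: for each DMU j0, minimize theta subject to
#   X @ lam <= theta * x_j0   and   Y @ lam >= y_j0,  lam >= 0.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0],      # inputs, shape (m, n): m inputs, n DMUs
              [1.0, 2.0, 1.5]])     # undesirable outputs could be appended here
Y = np.array([[3.0, 5.0, 4.0]])     # desirable outputs, shape (s, n)
m, n = X.shape
s = Y.shape[0]

for j0 in range(n):
    # decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # minimize theta
    A_in = np.hstack([-X[:, [j0]], X])           # X @ lam - theta * x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # -Y @ lam <= -y0
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[:, j0]]),
                  bounds=[(None, None)] + [(0, None)] * n)
    print(f"DMU {j0 + 1}: efficiency = {res.x[0]:.3f}")
```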

  18. Automatic simplification of solid models for engineering analysis independent of modeling sequences

    International Nuclear Information System (INIS)

    Although solid models can represent complex and detailed geometry of parts, it is often necessary to simplify solid models by removing the detailed geometry in some applications such as finite element analysis and similarity assessment of CAD models. There are no standards for judging the goodness of a simplification method, but one essential criterion would be that it should generate a consistent and acceptable simplification for the same solid model, regardless of how the solid model has been created. Since a design-feature-based approach is tightly dependent on modeling sequences and designer's modeling preferences, it sometimes produces inconsistent and unacceptable simplifications. In this paper, a new method is proposed to simplify solid models of machined parts. Independently of user-specified design features, this method directly recognizes and generates subtractive features from the final model of the part, and then simplifies the solid model by removing the detailed geometry by using these subtractive features

  19. Using automatic differentiation in sensitivity analysis of nuclear simulation models.

    Energy Technology Data Exchange (ETDEWEB)

    Alexe, M.; Roderick, O.; Anitescu, M.; Utke, J.; Fanning, T.; Hovland, P.; Virginia Tech.

    2010-01-01

    Sensitivity analysis is an important tool in the study of nuclear systems. In our recent work, we introduced a hybrid method that combines sampling techniques with first-order sensitivity analysis to approximate the effects of uncertainty in parameters of a nuclear reactor simulation model. For elementary examples, the approach offers a substantial advantage (in precision, computational efficiency, or both) over classical methods of uncertainty quantification.

  20. Dendritic spine shape analysis using disjunctive normal shape models

    OpenAIRE

    Ghani, Muhammad Usman; Mesadi, Fitsum; Demir Kanık, Sümerya Ümmühan; Argunşah, Ali Özgür; Israely, Inbal; Ünay, Devrim; Taşdizen, Tolga; Çetin, Müjdat

    2016-01-01

    Analysis of dendritic spines is an essential task to understand the functional behavior of neurons. Their shape variations are known to be closely linked with neuronal activities. Spine shape analysis in particular, can assist neuroscientists to identify this relationship. A novel shape representation has been proposed recently, called Disjunctive Normal Shape Models (DNSM). DNSM is a parametric shape representation and has proven to be successful in several segmentation problems. In this pap...

  1. Hidden Markov Models and their Applications in Biological Sequence Analysis

    OpenAIRE

    Yoon, Byung-Jun

    2009-01-01

    Hidden Markov models (HMMs) have been extensively used in biological sequence analysis. In this paper, we give a tutorial review of HMMs and their applications in a variety of problems in molecular biology. We especially focus on three types of HMMs: the profile-HMMs, pair-HMMs, and context-sensitive HMMs. We show how these HMMs can be used to solve various sequence analysis problems, such as pairwise and multiple sequence alignments, gene annotation, classification, similarity search, and ma...
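
    The core recursion behind these applications is the forward algorithm, sketched below for a discrete-emission HMM with invented toy parameters.

```python
# Forward algorithm for a discrete HMM: computes P(observations | model).
# The two-state toy parameters are illustrative, not from the paper.
import numpy as np

A = np.array([[0.9, 0.1],            # state transition probabilities
              [0.2, 0.8]])
B = np.array([[0.5, 0.4, 0.1],       # emission probabilities per state
              [0.1, 0.3, 0.6]])
pi = np.array([0.6, 0.4])            # initial state distribution
obs = [0, 1, 2, 2, 1]                # observed symbol indices

alpha = pi * B[:, obs[0]]            # initialization
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]    # induction step
print("P(observations | model) =", alpha.sum())
```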

  2. Empirical validation and comparison of models for customer base analysis

    OpenAIRE

    Persentili Batislam, Emine; Denizel, Meltem; Filiztekin, Alpay

    2007-01-01

    The benefits of retaining customers lead companies to search for means to profile their customers individually and track their retention and defection behaviors. To this end, the main issues addressed in customer base analysis are identification of customer active/inactive status and prediction of future purchase levels. We compare the predictive performance of Pareto/NBD and BG/NBD models from the customer base analysis literature — in terms of repeat purchase levels and active status — usi...

  3. Modelling and analysis of multiagent systems concerning cooperation problems

    OpenAIRE

    Reinhold, Thomas

    2005-01-01

    The subject of this diploma thesis is the modelling and analysis of mechanisms that enable multiagent systems to establish communication relations and to use them to control interaction. With regard to the emergence of such symbol systems, one foundation of this paper is the realization that pure coordination problems are not suitable for driving the evolution of "higher communication capabilities". With this in mind, the analysis uses a class of problems with explicit conflicts of inte...

  4. Ducted propeller performance analysis using a boundary element model

    OpenAIRE

    Salvatore, Francesco; Calcagni, Danilo; Greco, Luca

    2006-01-01

    This report describes the computational analysis of the inviscid flow around a ducted propeller using a BEM model. The activity is performed in the framework of a research program co-funded by the European Union under the "SUPERPROP" Project TST4-CT-2005-516219. The theoretical and computational methodology is described, and the results of a validation exercise on several test cases are presented and discussed. In particular, the proposed formulation is applied to the analysis of ducted propellers...

  5. Stochastic modelling of landfill leachate and biogas production incorporating waste heterogeneity. Model formulation and uncertainty analysis

    International Nuclear Information System (INIS)

    A mathematical model simulating the hydrological and biochemical processes occurring in landfilled waste is presented and demonstrated. The model combines biochemical and hydrological models into an integrated representation of the landfill environment. Waste decomposition is modelled using traditional biochemical waste decomposition pathways combined with a simplified methodology for representing the rate of decomposition. Water flow through the waste is represented using a statistical velocity model capable of representing the effects of waste heterogeneity on leachate flow through the waste. Given the limitations in data capture from landfill sites, significant emphasis is placed on improving parameter identification and reducing parameter requirements. A sensitivity analysis is performed, highlighting the model's response to changes in input variables. A model test run is also presented, demonstrating the model capabilities. A parameter perturbation model sensitivity analysis was also performed. This has been able to show that although the model is sensitive to certain key parameters, its overall intuitive response provides a good basis for making reasonable predictions of the future state of the landfill system. Finally, due to the high uncertainty associated with landfill data, a tool for handling input data uncertainty is incorporated in the model's structure. It is concluded that the model can be used as a reasonable tool for modelling landfill processes and that further work should be undertaken to assess the model's performance

  6. First experience with the new ATLAS analysis model

    CERN Document Server

    Cranshaw, Jack; The ATLAS collaboration

    2016-01-01

    During the Long shutdown of the LHC, the ATLAS collaboration overhauled its analysis model based on experience gained during Run 1. The main components are a new analysis format and Event Data Model which can be read directly by ROOT, as well as a "Derivation Framework" that takes the Petabyte-scale output from ATLAS reconstruction and produces smaller samples targeted at specific analyses, using the central production system. We will discuss the technical and operational aspects of this new system and review its performance during the first year of 13 TeV data taking.

  7. Experimental and numerical analysis of a knee endoprosthesis numerical model

    Directory of Open Access Journals (Sweden)

    L. Zach

    2016-07-01

    The aim of this study is to create and verify a numerical model for a Medin Modular orthopedic knee-joint implant by investigating contact pressure, its distribution and contact surfaces. An experiment using Fuji Prescale pressure-sensitive films and a finite element analysis (FEA) using Abaqus software were carried out. The experimental data were evaluated using a specially designed program and were compared with the results of the analysis. The evaluation program had been constructed on the basis of results obtained from a supplementary calibration experiment. The applicability of the numerical model for predicting real endoprosthesis behavior was proven on the basis of their good correlation.

  8. Multivariable modeling and multivariate analysis for the behavioral sciences

    CERN Document Server

    Everitt, Brian S

    2009-01-01

    Multivariable Modeling and Multivariate Analysis for the Behavioral Sciences shows students how to apply statistical methods to behavioral science data in a sensible manner. Assuming some familiarity with introductory statistics, the book analyzes a host of real-world data to provide useful answers to real-life issues.The author begins by exploring the types and design of behavioral studies. He also explains how models are used in the analysis of data. After describing graphical methods, such as scatterplot matrices, the text covers simple linear regression, locally weighted regression, multip

  9. Model-based analysis and simulation of regenerative heat wheel

    DEFF Research Database (Denmark)

    Wu, Zhuang; Melnik, Roderick V. N.; Borup, F.

    2006-01-01

    The rotary regenerator (also called the heat wheel) is an important component of energy intensive sectors, which is used in many heat recovery systems. In this paper, a model-based analysis of a rotary regenerator is carried out with a major emphasis given to the development and implementation of mathematical models for the thermal analysis of the fluid and wheel matrix. The effect of heat conduction in the direction of the fluid flow is taken into account and the influence of variations in rotating speed of the wheel as well as other characteristics (ambient temperature, airflow and geometric size) on...

  10. Structure Model Analysis of the Kashima 34m Telescope

    Science.gov (United States)

    Nakajima, Junichi; Nakamura, Toshio; Saita, Takeshi; Horiguchi, Junji; Yuge, Kouhei

    2001-03-01

    A deformation analysis of the Kashima 34-m radio telescope is performed. Although the telescope has a large aperture and accurate reflector panels, the dish support structures determine the high-frequency performance; especially at millimeter wavelengths, deformations above 1 mm seriously affect the telescope efficiency. We modelled the 34-m telescope into elements and used a finite element method (FEM) to simulate telescope deformations accurately. The first results we obtained agreed well with the actual deformation. Future analyses and telescope evaluations based on computer simulations are possible with this FEM model.

  11. Performance Analysis of Hybrid Forecasting Model in Stock Market Forecasting

    Directory of Open Access Journals (Sweden)

    Mahesh S. Khadka

    2012-09-01

    This paper presents a performance analysis of a hybrid model, comprising concordance measures and Genetic Programming (GP), for forecasting financial markets against some existing models. This scheme can be used for in-depth analysis of the stock market. Different measures of concordance, such as Kendall's Tau, Gini's Mean Difference, Spearman's Rho, and a weak interpretation of concordance, are used to search the past for patterns that look similar to the present. Genetic Programming is then used to match the past trend to the present trend as closely as possible, and the genetic program estimates what will happen next based on what happened next after the matched pattern. The concept is validated using financial time series data (S&P 500 and NASDAQ indices) as sample data sets. The forecast is then compared with the standard ARIMA model and other models to analyse its performance.
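
    A sketch of the concordance step under stated assumptions: scan a synthetic historical series for the window most concordant (Kendall's tau) with the present trend and read off what happened next; the GP refinement stage of the hybrid model is not shown.

```python
# Find the historical window whose ranking pattern is most concordant with the
# present trend, using Kendall's tau. The series is synthetic stand-in data.
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(7)
history = np.cumsum(rng.normal(0, 1, 500))   # stand-in for an index series
present = history[-20:]                      # current 20-step trend

best_tau, best_start = -np.inf, None
for start in range(len(history) - 40):       # leave room for "what happened next"
    tau, _ = kendalltau(history[start:start + 20], present)
    if tau > best_tau:
        best_tau, best_start = tau, start

nxt = history[best_start + 20]               # what followed the matched pattern
print(f"best match at t={best_start}, tau={best_tau:.2f}, next value {nxt:.2f}")
```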

  12. Causal Analysis for Performance Modeling of Computer Programs

    Directory of Open Access Journals (Sweden)

    Jan Lemeire

    2007-01-01

    Causal modeling and the accompanying learning algorithms provide useful extensions for in-depth statistical investigation and automation of performance modeling. We enlarged the scope of existing causal structure learning algorithms by using the form-free information-theoretic concept of mutual information and by introducing a complexity criterion for selecting direct relations among equivalent relations. The underlying probability distribution of the experimental data is estimated by kernel density estimation. We then report on the benefits of a dependency analysis and the decompositional capacities of causal models. Useful qualitative models, providing insight into the role of every performance factor, were inferred from experimental data. This paper reports on the results for an LU decomposition algorithm and on a study of the parameter sensitivity of the Kakadu implementation of the JPEG-2000 standard. Next, the analysis was used to search for generic performance characteristics of the applications.
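
    The dependency-analysis step can be sketched by ranking candidate performance factors by their mutual information with a measured runtime; the factors and the runtime formula below are synthetic stand-ins, not the paper's benchmarks.

```python
# Rank candidate performance factors by mutual information with runtime.
# All factors and the runtime relation are invented for illustration.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(3)
n = 500
block_size = rng.integers(8, 256, n)         # hypothetical tuning factor
threads = rng.integers(1, 9, n)              # hypothetical parallelism factor
noise_factor = rng.normal(0, 1, n)           # irrelevant by construction

runtime = 100 / threads + 0.01 * block_size**1.5 + rng.normal(0, 1, n)

X = np.column_stack([block_size, threads, noise_factor])
mi = mutual_info_regression(X, runtime, random_state=0)
for name, score in zip(["block_size", "threads", "noise"], mi):
    print(f"MI({name}; runtime) = {score:.3f}")
```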

  13. A finite element model for nonlinear structural earthquake analysis

    International Nuclear Information System (INIS)

    Towards the analysis of damage in reinforced concrete structures subjected to earthquakes, we propose a numerical model capable of describing the non-linear behaviour of reinforced concrete beams and columns under alternating cyclic loading, which can also be used efficiently in dynamic analysis: with the assumption of a local uniaxial state of stress, we are able to obtain the rapidity needed for the time integration of the dynamic equations of equilibrium for real structures, within a time interval corresponding to a seismic action. The model is presented: path-dependent constitutive material law and finite element formulation. A short example of validation serves to evaluate some characteristics of the model. A methodology is then developed to extend the applicability of the model to limit cases regarding slenderness and semi-rigid limit conditions. (author)

  14. Accounting for Errors in Model Analysis Theory: A Numerical Approach

    Science.gov (United States)

    Sommer, Steven R.; Lindell, Rebecca S.

    2004-09-01

    By studying the patterns of a group of individuals' responses to a series of multiple-choice questions, researchers can utilize Model Analysis Theory to create a probability distribution of mental models for a student population. The eigenanalysis of this distribution yields information about what mental models the students possess, as well as how consistently they utilize said mental models. Although the theory considers the probabilistic distribution to be fundamental, there exist opportunities for random errors to occur. In this paper we will discuss a numerical approach for mathematically accounting for these random errors. As an example of this methodology, analysis of data obtained from the Lunar Phases Concept Inventory will be presented. Limitations and applicability of this numerical approach will be discussed.
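
    A hedged sketch of the eigenanalysis at the heart of Model Analysis Theory: build a class "density matrix" from per-student model-probability vectors and inspect its eigenvalues. The three-student data set is invented, and the construction follows the commonly published formulation rather than this paper's specific error-accounting variant.

```python
# Eigenanalysis of a class density matrix built from per-student model
# probabilities (two candidate mental models). Data are invented.
import numpy as np

# Each row: one student's probabilities of using model 1 and model 2.
p = np.array([[0.8, 0.2],
              [0.7, 0.3],
              [0.3, 0.7]])
u = np.sqrt(p)                               # amplitude vectors, |u|^2 sums to 1

D = sum(np.outer(v, v) for v in u) / len(u)  # class density matrix
vals, vecs = np.linalg.eigh(D)
print("eigenvalues:", np.round(vals[::-1], 3))    # dominant value ~ consistency
print("dominant model mix:", np.round(vecs[:, -1] ** 2, 3))
```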

  15. Global analysis of a supersymmetric Pati-Salam model

    International Nuclear Information System (INIS)

    We perform a complete global phenomenological analysis of a realistic string-inspired model based on the supersymmetric Pati-Salam SU(4)xSU(2)LxSU(2)R gauge group supplemented by a U(1) family symmetry, and present predictions for all observables including muon g-2, τγ, and the CHOOZ angle. Our analysis demonstrates the compatibility of such a model with all laboratory data including charged fermion masses and mixing angles, LMA MSW and atmospheric neutrino masses and mixing angles, and b→sγ, allowing for small deviations from third family Yukawa unification. We show that in such models the squark and slepton masses may be rather light compared to similar models with exact Yukawa unification. (author)

  16. Sensitivity analysis techniques for models of human behavior.

    Energy Technology Data Exchange (ETDEWEB)

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn about which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods create similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.

  17. Sensitivity analysis of the terrestrial food chain model FOOD III

    International Nuclear Information System (INIS)

    As a first step in constructing a terrestrial food chain model suitable for long-term waste management situations, a numerical sensitivity analysis of FOOD III was carried out to identify important model parameters. The analysis involved 42 radionuclides, four pathways, 14 food types, 93 parameters and three percentages of parameter variation. We also investigated the importance of radionuclides, pathways and food types. The analysis involved a simple contamination model to render results from individual pathways comparable. The analysis showed that radionuclides vary greatly in their dose contribution to each of the four pathways, but relative contributions to each pathway are very similar. Man's and animals' drinking water pathways are much more important than the leaf and root pathways. However, this result depends on the contamination model used. All the pathways contain unimportant food types. Considering the number of parameters involved, FOOD III has too many different food types. Many of the parameters of the leaf and root pathway are important. However, this is true for only a few of the parameters of the animals' drinking water pathway, and for neither of the two parameters of man's drinking water pathway. The radiological decay constant increases the variability of these results. The dose factor is consistently the most important variable, and it explains most of the variability of radionuclide doses within pathways. Consideration of the variability of dose factors is important in contemporary as well as long-term waste management assessment models, if realistic estimates are to be made. (auth)

  18. Engineering approach for medium modeling in piping dynamic analysis

    International Nuclear Information System (INIS)

    Two approaches to the problem of dynamic interaction between pipe and medium are compared in this paper: 1) the first treats the medium as mass rigidly connected to the nodes of the pipe's finite-element model; 2) in the second, the medium is modeled by a finite-element system of rod elements, in which case the basic fluid-structure interaction (FSI) effects are taken into account. The main techniques for FE modeling of some pipeline elements are presented in the paper. The second approach can be implemented with general purpose FE programs. A model of a feed water pipeline of a VVER-440 type NPP has been developed to study how FSI affects the pipeline response. The results of the analysis, which allow estimation of the inaccuracy arising from neglecting the medium dynamics, are as follows: 1. calculation of eigenfrequencies and mode shapes; 2. seismic analysis using the response-spectrum method; 3. accidental blast impact assessment with the use of time history analysis; 4. operating vibration assessment on the basis of harmonic analysis. It has become apparent that the way of modeling the medium has an essential influence on the dynamic behavior of pipelines. (author)

  19. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  20. Accuracy Analysis for SST Gravity Field Model in China

    Institute of Scientific and Technical Information of China (English)

    LUO Jia; LUO Zhicai; ZOU Xiancai; WANG Haihong

    2006-01-01

    Taking China as the test region, the potential of the new satellite gravity technique, satellite-to-satellite tracking (SST), for improving the accuracy of regional gravity field models is studied. With WDM94 as reference, the gravity anomaly residuals of three models, the latest two GRACE global gravity field models (EIGEN_GRACE02S, GGM02S) and EGM96, are computed and compared, and the causes of the differences among the residuals of the three models are discussed. The comparison shows that in the selected region, EIGEN_GRACE02S or GGM02S is better than EGM96 in the lower-degree part (below degree 110). Additionally, through the analysis of the model gravity anomaly residuals, it is found that some systematic errors with periodic properties exist in the higher-degree part of the EIGEN and GGM models; these results can also be taken as references in the validation of SST gravity data.

  1. Analysis of CPN-1 sigma models via projective structures

    International Nuclear Information System (INIS)

    This paper presents a study of projector solutions to the Euclidean CPN-1 sigma model in two dimensions and their associated surfaces immersed in the su(N) Lie algebra. Any solution of the CPN-1 sigma model defined on the extended complex plane with finite action can be written as a raising operator acting on a holomorphic one. Here the proof is formulated in terms of rank-1 projectors, so it is explicitly gauge invariant. We apply these results to the analysis of surfaces associated with the CPN-1 models defined using the generalized Weierstrass formula for immersion. We show that the surfaces are conformally parametrized by the Lagrangian density, with finite area equal to the action of the model, and express several other geometrical characteristics of the surface in terms of the physical quantities of the model. Finally, we provide necessary and sufficient conditions for a surface to be related to a CPN-1 sigma model.

  2. Network and adaptive system of systems modeling and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Campbell, James E. Dr. (.; .); Anderson, Dennis James; Eddy, John P.

    2007-05-01

    This report documents the results of an LDRD program entitled ''Network and Adaptive System of Systems Modeling and Analysis'' that was conducted during FY 2005 and FY 2006. The purpose of this study was to determine and implement ways to incorporate network communications modeling into existing System of Systems (SoS) modeling capabilities. Current SoS modeling, particularly for the Future Combat Systems (FCS) program, is conducted under the assumption that communication between the various systems is always possible and occurs instantaneously. A more realistic representation of these communications allows for better, more accurate simulation results. The current approach to meeting this objective has been to use existing capabilities to model network hardware reliability and adding capabilities to use that information to model the impact on the sustainment supply chain and operational availability.

  3. [Stability Analysis of Susceptible-Infected-Recovered Epidemic Model].

    Science.gov (United States)

    Pan, Duotao; Shi, Hongyan; Huang, Mingzhong; Yuan, Decheng

    2015-10-01

    With the range of application of computational biology and systems biology gradually expanding, the complexity of bioprocess models is also increasing, and suitable alternative analysis methods are required to cope with it. Taking the dynamic model of an epidemic control process as the research object, we established an evaluation model in our laboratory. First, the model was solved with a nonlinear programming method, with good results. Based on biochemical systems theory, the ODE dynamic model was then transformed into an S-system. The eigenvalues of the model showed that the system was stable and exhibited oscillation. Next, the sensitivities of the rate constants and the logarithmic gains of the three key parameters were analyzed, as well as the robustness of the system. The results indicate that biochemical systems theory can be applied more widely in different fields. PMID:26964304
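
    A generic numerical counterpart to such a stability analysis: evaluate the Jacobian of a basic SIR model with vital dynamics at its endemic equilibrium and inspect the eigenvalues (negative real parts imply local asymptotic stability). Parameters are illustrative, not the paper's.

```python
# Local stability of the endemic equilibrium of an SIR model with births/deaths:
#   dS/dt = mu - beta*S*I - mu*S,   dI/dt = beta*S*I - (gamma + mu)*I
import numpy as np

beta, gamma, mu = 0.5, 0.1, 0.02       # transmission, recovery, birth/death rates
R0 = beta / (gamma + mu)
S_star = 1.0 / R0                      # endemic equilibrium (population fractions)
I_star = mu * (R0 - 1) / beta

J = np.array([[-beta * I_star - mu, -beta * S_star],
              [ beta * I_star,       beta * S_star - (gamma + mu)]])
print(f"R0 = {R0:.2f}, endemic point S* = {S_star:.3f}, I* = {I_star:.3f}")
print("Jacobian eigenvalues:", np.linalg.eigvals(J))   # negative real parts: stable
```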

  4. Pattern mixture models for the analysis of repeated attempt designs.

    Science.gov (United States)

    Daniels, Michael J; Jackson, Dan; Feng, Wei; White, Ian R

    2015-12-01

    It is not uncommon in follow-up studies to make multiple attempts to collect a measurement after baseline. Recording whether these attempts are successful or not provides useful information for the purposes of assessing the missing at random (MAR) assumption and facilitating missing not at random (MNAR) modeling. This is because measurements from subjects who provide this data after multiple failed attempts may differ from those who provide the measurement after fewer attempts. This type of "continuum of resistance" to providing a measurement has hitherto been modeled in a selection model framework, where the outcome data is modeled jointly with the success or failure of the attempts given these outcomes. Here, we present a pattern mixture approach to model this type of data. We re-analyze the repeated attempt data from a trial that was previously analyzed using a selection model approach. Our pattern mixture model is more flexible and is more transparent in terms of parameter identifiability than the models that have previously been used to model repeated attempt data and allows for sensitivity analysis. We conclude that our approach to modeling this type of data provides a fully viable alternative to the more established selection model. PMID:26149119

  5. Performance Analysis of a 3D Ionosphere Tomographic Model

    Institute of Scientific and Technical Information of China (English)

    Liu Zhi-zhao; Gao Yang

    2003-01-01

    A 3D high-precision ionospheric model is developed based on the tomography technique. This tomographic model employs GPS data observed by an operational network of dual-frequency GPS receivers. The methodology of developing a 3D ionospheric tomography model is briefly summarized, but emphasis is put on analyzing and evaluating how the accuracy of 3D ionosphere modeling varies with the GPS data cutoff angle. Three typical cutoff angles (15°, 20° and 25°) are tested. For each cutoff angle, the performance of the 3D ionospheric model constructed using the tomography technique is assessed by calibrating the model-predicted ionospheric TEC against the GPS-measured TEC and by applying the model-predicted TEC to a practical GPS positioning application, single point positioning (SPP). Test results indicate the 3D model-predicted VTEC improves in accuracy by about 0.4 TECU when the cutoff angle rises from 15° to 20°; however, no apparent improvement is found from 20° to 25°. The model's improvement is also validated by the better SPP accuracy of the 3D model compared with its counterpart, the dual-frequency model, in the 20° and 25° cases.

  6. Non-stationarity in GARCH models: A Bayesian analysis

    OpenAIRE

    Kleibergen, Frank; Dijk, Herman

    1993-01-01

    First, the non-stationarity properties of the conditional variances in the GARCH(1,1) model are analysed using the concept of infinite persistence of shocks. Given a time sequence of probabilities for increasing/decreasing conditional variances, a theoretical formula for quasi-strict non-stationarity is defined. The resulting conditions for the GARCH(1,1) model are shown to differ from the weak stationarity conditions mainly used in the literature. Bayesian statistical analysis us...
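
    A quick simulation makes the persistence notion concrete: in a GARCH(1,1) model the sum alpha + beta governs how shocks to the conditional variance die out. The parameter values below are illustrative.

```python
# Simulate a GARCH(1,1) process and compare sample vs. theoretical variance.
import numpy as np

omega, alpha, beta = 0.05, 0.10, 0.85    # alpha + beta = 0.95 < 1: weakly stationary
rng = np.random.default_rng(0)

T = 5000
h = np.empty(T)                          # conditional variance
y = np.empty(T)                          # returns
h[0] = omega / (1 - alpha - beta)        # unconditional variance
y[0] = np.sqrt(h[0]) * rng.standard_normal()
for t in range(1, T):
    h[t] = omega + alpha * y[t - 1] ** 2 + beta * h[t - 1]
    y[t] = np.sqrt(h[t]) * rng.standard_normal()

print(f"persistence alpha + beta = {alpha + beta:.2f}")
print(f"sample variance {y.var():.3f} vs theoretical {omega / (1 - alpha - beta):.3f}")
```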

  7. Configurational analysis as an alternative way of modeling sales response

    OpenAIRE

    Aarnio, Susanna

    2013-01-01

    Objectives of the Study: The objectives of the study are both managerial and methodological. On the one hand, the aim is to apply a novel research approach, fuzzy set qualitative comparative analysis or fsQCA (see e.g. Ragin, 2000; Rihoux & Ragin, 2009), to sales response modeling and thus create a response model for the case company to identify complex, configurational causalities affecting the company's sales volumes within the chosen product category. On the other hand, the goal is to ...

  8. Stability Analysis of Some Nonlinear Anaerobic Digestion Models

    OpenAIRE

    Ivan Simeonov; Sette Diop

    2010-01-01

    The paper deals with local asymptotic stability analysis of some mass balance dynamic models (based on one- and two-stage reaction schemes) of anaerobic digestion (AD) in a CSTR. The equilibrium states for models based on one-stage (with Monod, Contois and Haldane shapes for the specific growth rate) and two-stage (only with Monod shapes for the specific growth rates of the acidogenic and methanogenic bacterial populations) reaction schemes have been determined by solving sets of nonl...

  9. Kinematic Modeling, Linearization and First-Order Error Analysis

    OpenAIRE

    Pott, Andreas; Hiller, Manfred

    2008-01-01

    The contribution describes a general method for kinematic modeling of many widespread parallel kinematic machines, i.e. the Stewart-Gough platform, the Delta robot, and Linaglide machines. The kinetostatic method is applied for a comprehensive kinematic analysis of these machines. Based on that model, a general method is proposed to compute the linearization of the transmission behaviour from geometric parameters to the end-effector motion of these machines. By applying the force transmis...

  10. Analysis of a Model for Computer Virus Transmission

    OpenAIRE

    Peng Qin

    2015-01-01

    Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers to the network and the removing of old computers from the network are considered. Meanwhile, the computers are equipped with antivirus software on the computer network. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our t...
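
    A minimal sketch of this kind of compartmental virus model, with new computers joining and old ones leaving the network; the equations and the threshold expression are a generic toy formulation, not necessarily the paper's exact model.

```python
# Toy susceptible/infected computer-virus model with machine turnover:
#   dS/dt = b - beta*S*I - d*S + r*I,   dI/dt = beta*S*I - (d + r)*I
import numpy as np
from scipy.integrate import solve_ivp

b, d, beta, r = 0.02, 0.02, 0.4, 0.1   # joining, leaving, infection, cure rates

def virus(t, x):
    S, I = x
    return [b - beta * S * I - d * S + r * I,
            beta * S * I - (d + r) * I]

R0 = (b / d) * beta / (d + r)          # threshold for this toy formulation
sol = solve_ivp(virus, (0, 400), [0.99, 0.01])
# R0 > 1 here, so the run settles at the endemic (virus-persistent) equilibrium.
print(f"R0 = {R0:.2f}; final (S, I) = ({sol.y[0, -1]:.3f}, {sol.y[1, -1]:.3f})")
```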

  11. Bifurcation analysis of parametrically excited bipolar disorder model

    Science.gov (United States)

    Nana, Laurent

    2009-02-01

    Bipolar II disorder is characterized by alternating hypomanic and major depressive episode. We model the periodic mood variations of a bipolar II patient with a negatively damped harmonic oscillator. The medications administrated to the patient are modeled via a forcing function that is capable of stabilizing the mood variations and of varying their amplitude. We analyze analytically, using perturbation method, the amplitude and stability of limit cycles and check this analysis with numerical simulations.
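
    A sketch of the setup under stated assumptions: a negatively damped oscillator whose mood swings grow untreated, stabilized by a periodic velocity-dependent forcing standing in for medication. All parameter values and the specific forcing form are invented.

```python
# Negatively damped oscillator (mood variable x) with and without a periodic
# "medication" term modeled as pulsed extra damping. Parameters are invented.
import numpy as np
from scipy.integrate import solve_ivp

zeta, w0 = -0.05, 1.0                  # negative damping: episodes grow untreated
A, wf = 0.8, 1.0                       # forcing amplitude and frequency

def mood(t, y, treated):
    x, v = y
    # Treatment modeled as pulsed damping (always >= 0 via cos**2).
    force = -A * np.cos(wf * t) ** 2 * v if treated else 0.0
    return [v, -2 * zeta * w0 * v - w0 ** 2 * x + force]

for treated in (False, True):
    sol = solve_ivp(mood, (0, 60), [0.1, 0.0], args=(treated,), max_step=0.05)
    print(f"treated={treated}: max |mood swing| = {np.abs(sol.y[0]).max():.2f}")
```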

  12. A Numerical Model for Torsion Analysis of Composite Ship Hulls

    Directory of Open Access Journals (Sweden)

    Ionel Chirica

    2012-01-01

    A new methodology for the torsional behaviour of a ship hull made of composite material, based on a macroelement model, is proposed in this paper. A computer program has been developed for the elastic analysis of linear torsion. The results are compared with results from the licensed FEM software COSMOS/M and with measurements on a simplified scale model of a container ship made of composite materials.

  13. Mathematical Modeling and Analysis of Classified Marketing of Agricultural Products

    Institute of Scientific and Technical Information of China (English)

    Fengying; WANG

    2014-01-01

    Classified marketing of agricultural products was analyzed using the logistic regression model. This method can take full advantage of the information in an agricultural product database to find the factors influencing how well agricultural products sell, and to make a quantitative analysis accordingly. Using this model, it is also possible to predict sales of agricultural products, providing a reference for mapping out individualized sales strategies for popularizing agricultural products.
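
    An illustrative logistic-regression sketch in the spirit of the record: predict whether a product is a best seller from catalog features. The features, coefficients and data are all invented.

```python
# Logistic regression on synthetic "agricultural product" data: predict whether
# a product is a best seller from invented catalog features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 300
price = rng.uniform(1, 20, n)                 # hypothetical factors
freshness = rng.uniform(0, 1, n)
ad_spend = rng.uniform(0, 5, n)

# Assumed "true" relation used only to generate labels.
logit = -0.2 * price + 3.0 * freshness + 0.8 * ad_spend - 1.0
best_seller = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([price, freshness, ad_spend])
model = LogisticRegression().fit(X, best_seller)
print("coefficients:", np.round(model.coef_[0], 2))
print("P(best seller) for a new product:",
      model.predict_proba([[10.0, 0.9, 2.0]])[0, 1].round(2))
```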

  14. Modeling Motivational Deficits in Mouse Models of Schizophrenia: Behavior Analysis as a Guide for Neuroscience

    OpenAIRE

    Ryan D Ward; Simpson, Eleanor H.; Kandel, Eric R.; Balsam, Peter D.

    2011-01-01

    In recent years it has become possible to develop animal models of psychiatric disease in genetically modified mice. While great strides have been made in the development of genetic and neurobiological tools with which to model psychiatric disease, elucidation of neural and molecular mechanisms thought to underlie behavioral phenotypes has been hindered by an inadequate analysis of behavior. This is unfortunate given the fact that the experimental analysis of behavior has created powerful met...

  15. Advanced accident sequence precursor analysis level 1 models

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.; Schroeder, J.A.; Siu, N.O. [Idaho National Engineering Lab., Idaho National Lab., Idaho Falls, ID (United States)

    1996-03-01

    INEL has been involved in the development of plant-specific Accident Sequence Precursor (ASP) models for the past two years. These models were developed for use with the SAPHIRE suite of PRA computer codes. They contained event tree/linked fault tree Level 1 risk models for the following initiating events: general transient, loss-of-offsite-power, steam generator tube rupture, small loss-of-coolant-accident, and anticipated transient without scram. Early in 1995 the ASP models were revised based on review comments from the NRC and an independent peer review. These models were released as Revision 1. The Office of Nuclear Regulatory Research has sponsored several projects at the INEL this fiscal year to further enhance the capabilities of the ASP models. Revision 2 models incorporates more detailed plant information into the models concerning plant response to station blackout conditions, information on battery life, and other unique features gleaned from an Office of Nuclear Reactor Regulation quick review of the Individual Plant Examination submittals. These models are currently being delivered to the NRC as they are completed. A related project is a feasibility study and model development of low power/shutdown (LP/SD) and external event extensions to the ASP models. This project will establish criteria for selection of LP/SD and external initiator operational events for analysis within the ASP program. Prototype models for each pertinent initiating event (loss of shutdown cooling, loss of inventory control, fire, flood, seismic, etc.) will be developed. A third project concerns development of enhancements to SAPHIRE. In relation to the ASP program, a new SAPHIRE module, GEM, was developed as a specific user interface for performing ASP evaluations. This module greatly simplifies the analysis process for determining the conditional core damage probability for a given combination of initiating events and equipment failures or degradations.

  16. Hydraulic modeling support for conflict analysis: The Manayunk canal revisited

    International Nuclear Information System (INIS)

    This paper presents a study which used a standard, hydraulic computer model to generate detailed design information to support conflict analysis of a water resource use issue. As an extension of previous studies, the conflict analysis in this case included several scenarios for stability analysis - all of which reached the conclusion that compromising, shared access to the water resources available would result in the most benefits to society. This expected equilibrium outcome was found to maximize benefit-cost estimates. 17 refs., 1 fig., 2 tabs

  17. Comparative study of models for pipe-whip analysis

    International Nuclear Information System (INIS)

    Analysis of the response of high-energy lines after the occurrence of pipe rupture has received considerable attention in the recent past, with an array of different modelling techniques proposed and available in the literature. Information on the relative merits of such methods is scarce, and the present study provides tentative guidelines to the designer confronted with the selection of an appropriate model for a given system. The criteria of evaluation are, implicitly, the time required for analysis and the computer cost involved and, explicitly, the degree of accuracy of the solutions. The models compared are grouped into four classes: (i) one-degree-of-freedom systems that introduce a stationary plastic hinge and require simple hand calculations, (ii) kinematic models incorporating the concept of a travelling hinge, easily analysed by means of inexpensive computer programs, (iii) engineering beam models considering elasto-plastic pipe behavior, and (iv) two-dimensional finite element systems based on plane stress theory. The kinematic model, proposed earlier by the authors, is used to conduct a qualitative analysis of the response as a function of the magnitude of the external force, gap size and rigidity of the restraint. The engineering beam model gives results that practically coincide with those generated by the plane stress approach, although consistently a little higher than the latter. The methods were applied to typical configurations with bilinear restraints, and the results obtained prove that the single-degree-of-freedom model may lead to non-conservative solutions, whereas the straight application of the kinematic model would always underestimate the maximum deformation of the restraint. (orig.)
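
    For orientation, a minimal sketch of a class (i) hand calculation, assuming a rigid-plastic single-degree-of-freedom idealization and invented numbers: a constant blowdown thrust accelerates the segment across the gap, and an energy balance gives the restraint deformation. As the study warns, such single-degree-of-freedom estimates may be non-conservative.

    ```python
    # Rigid-plastic SDOF pipe-whip estimate: pipe segment driven by a
    # constant blowdown thrust F, resisted by an equivalent plastic hinge
    # force R, hitting a restraint after gap g. All numbers hypothetical;
    # only the first (linear) branch of a bilinear restraint is used.
    import math

    F = 2.0e5   # blowdown thrust on the broken end, N
    R = 0.8e5   # equivalent plastic hinge resistance, N
    m = 150.0   # effective lumped mass of the whipping segment, kg
    g = 0.05    # gap between pipe and restraint, m
    k = 5.0e7   # restraint stiffness (first branch), N/m

    # Kinetic energy gained while closing the gap: (F - R) * g
    ke = (F - R) * g
    v = math.sqrt(2.0 * ke / m)           # impact velocity

    # Restraint deformation x from energy balance:
    # 0.5*k*x^2 = ke + (F - R)*x
    a, b, c = 0.5 * k, -(F - R), -ke
    x = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

    print(f"impact velocity      : {v:.2f} m/s")
    print(f"restraint deformation: {x*1000:.1f} mm")
    ```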

  18. Uncertainty Analysis of Multi-Model Flood Forecasts

    Directory of Open Access Journals (Sweden)

    Erich J. Plate

    2015-12-01

    Full Text Available This paper demonstrates, by means of a systematic uncertainty analysis, that the use of outputs from more than one model can significantly improve conditional forecasts of discharges or water stages, provided the models are structurally different. Discharge forecasts from two models and the actual forecasted discharge are assumed to form a three-dimensional joint probability density distribution (jpdf), calibrated on long time series of data. The jpdf is decomposed into conditional probability density distributions (cpdf) by means of Bayes' formula, as suggested and explored by Krzysztofowicz in a series of papers. In this paper his approach is simplified to optimize conditional forecasts for any set of two forecast models. Its application is demonstrated by means of models developed in a study of flood forecasting for station Stung Treng on the middle reach of the Mekong River in South-East Asia. Four different forecast models were used and pairwise combined: forecast with no model, with a persistence model, with a regression model, and with a rainfall-runoff model. Working with cpdfs requires determination of dependency among variables, for which linear regressions are required, as was done by Krzysztofowicz. His Bayesian approach, based on transforming observed probability distributions of discharges and forecasts into normal distributions, is also explored. Results obtained with his method for normal prior and likelihood distributions are identical to results from direct multiple regressions. Furthermore, it is shown that in the present case forecast accuracy is only marginally improved if Weibull-distributed basic data are converted into normally distributed variables.
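
    The paper's central simplification, that the normal-theory Bayesian combination reduces to a direct multiple regression of observed discharge on the two model forecasts, can be sketched with synthetic data:

    ```python
    # Combining two structurally different forecasts by multiple linear
    # regression; with normal prior/likelihood this coincides with the
    # Bayesian conditional forecast. All series here are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000
    q_true = 1000 + 300 * rng.standard_normal(n)            # observed discharge
    f1 = q_true + 150 * rng.standard_normal(n)              # e.g. persistence model
    f2 = 0.9 * q_true + 100 + 120 * rng.standard_normal(n)  # e.g. rainfall-runoff

    X = np.column_stack([np.ones(n), f1, f2])
    beta, *_ = np.linalg.lstsq(X, q_true, rcond=None)
    resid = q_true - X @ beta

    print("regression weights (b0, b1, b2):", np.round(beta, 3))
    print(f"single-model RMSE (f1): {np.std(q_true - f1):.1f}")
    print(f"combined-forecast RMSE: {np.std(resid):.1f}")
    ```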

  19. An accuracy analysis of Army Material System Analysis Activity discrete reliability growth model

    OpenAIRE

    Thalieb, Rio M.

    1988-01-01

    The accuracy of the discrete reliability growth model developed by Army Material System Analysis Activity (AMSAA) is analysed. The mean, standard deviation, and 95 percent confidence interval of the estimate of reliability resulting from simulating the AMSAA discrete reliability growth model are computed. The mean of the estimate of reliability from the AMSAA discrete reliability growth model is compared with the mean of the reliability estimate using the Exponential discrete reliability grow...

  20. Two-lump fission product model for fast reactor analysis

    International Nuclear Information System (INIS)

    As part of the Fast-Mixed Spectrum Reactor (FMSR) Project, a study was made of the adequacy of the conventional fission product lump models for the analysis of the different FMSR core concepts. A two-lump fission product model, consisting of an odd-A fission product lump and an even-A fission product lump with transmutation between the odd- and even-A lumps, was developed. This two-lump model is capable of predicting the exact burnup-dependent behavior of the fission products to within a few percent over a wide range of spectra and is therefore also applicable to the conventional fast breeder reactor
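
    The structure of such a two-lump balance can be sketched as a pair of coupled ODEs; the yields and one-group cross sections below are placeholders, not the report's fitted values:

    ```python
    # Two-lump fission product balance: fission feeds an odd-A and an
    # even-A lump, and neutron capture transmutes odd-A nuclei into the
    # even-A lump. All physics constants are illustrative placeholders.
    from scipy.integrate import solve_ivp

    F   = 1.0e13   # fission rate density, fissions/cm^3/s
    phi = 1.0e15   # neutron flux, n/cm^2/s
    y_odd, y_even = 1.0, 1.0              # lump yields per fission
    sig_odd, sig_even = 2.0e-24, 0.5e-24  # lump capture cross sections, cm^2

    def rhs(t, N):
        n_odd, n_even = N
        dn_odd  = y_odd  * F - sig_odd * phi * n_odd           # capture removes odd-A
        dn_even = (y_even * F + sig_odd * phi * n_odd          # odd-A captures feed even-A
                   - sig_even * phi * n_even)
        return [dn_odd, dn_even]

    t_end = 3.0e7  # roughly one year of irradiation, s
    sol = solve_ivp(rhs, [0.0, t_end], [0.0, 0.0], rtol=1e-6)
    print("lump densities at end of irradiation:", sol.y[:, -1])
    ```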

  1. Human Performance Modeling for Dynamic Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory]; Joe, Jeffrey Clark [Idaho National Laboratory]; Mandelli, Diego [Idaho National Laboratory]

    2015-08-01

    As part of the U.S. Department of Energy's (DOE's) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.

  2. On selecting policy analysis models by forecast accuracy

    OpenAIRE

    D. F. Hendry; Mizon, G.E.

    1999-01-01

    The value of selecting the best forecasting model as the basis for empirical economic policy analysis is questioned. When no model coincides with the data generation process, non-causal statistical devices may provide the best available forecasts: examples from recent work include intercept corrections and differenced-data VARs. However, the resulting models need have no policy implications. A ‘paradox’ may result if their forecasts induce policy changes which can be used to improve the s...

  3. Evaluating statistical analysis models for RNA sequencing experiments

    Directory of Open Access Journals (Sweden)

    Pablo eReeb

    2013-09-01

    Full Text Available Validating statistical analysis methods for RNA sequencing (RNA-seq) experiments is a complex task. Researchers often find themselves having to decide between competing models or assessing the reliability of results obtained with a designated analysis program. Computer simulation has been the most frequently used procedure to verify the adequacy of a model. However, datasets generated by simulations depend on the parameterization and the assumptions of the selected model. Moreover, such datasets may constitute a partial representation of reality, as the complexity of RNA-seq data is hard to mimic. We present the use of plasmode datasets to complement the evaluation of statistical models for RNA-seq data. A plasmode is a dataset obtained from experimental data but for which some truth is known. Using a set of simulated scenarios of technical and biological replicates, and publicly available datasets, we illustrate how to design algorithms to construct plasmodes under different experimental conditions. We contrast results from two types of methods for RNA-seq: (i) models based on the negative binomial distribution (edgeR and DESeq), and (ii) Gaussian models applied after transformation of the data (MAANOVA). Results emphasize the fact that deciding what method to use may be experiment-specific due to the unknown distributions of expression levels. Plasmodes may contribute to choosing which method to apply by using a similar pre-existing dataset. The promising results obtained from this approach emphasize the need to promote and improve systematic data sharing across the research community to facilitate plasmode building. Although we illustrate the use of plasmodes for comparing differential expression analysis models, the flexibility of plasmode construction also allows comparing upstream analyses, such as normalization procedures or alignment pipelines.
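
    The plasmode idea, resampling real data under a null split and spiking in known fold changes, can be sketched as follows (a synthetic count matrix stands in for the experimental data):

    ```python
    # Sketch of plasmode construction for RNA-seq: split real samples into
    # two null groups, then spike a known fold change into a chosen gene
    # set so that "some truth is known". Synthetic counts stand in for
    # real experimental data here.
    import numpy as np

    rng = np.random.default_rng(1)
    genes, samples = 5000, 12
    counts = rng.negative_binomial(n=5, p=0.01, size=(genes, samples))  # stand-in data

    # Null split: random assignment of the real samples to two groups.
    perm = rng.permutation(samples)
    group_a, group_b = perm[:6], perm[6:]

    # Spike a known 2-fold change into 250 randomly chosen genes in group B
    # by binomial thinning (preserves the count nature of the data).
    de_genes = rng.choice(genes, size=250, replace=False)
    plasmode = counts.copy()
    plasmode[np.ix_(de_genes, group_b)] = rng.binomial(
        counts[np.ix_(de_genes, group_b)], 0.5)  # halve expression

    truth = np.zeros(genes, dtype=bool)
    truth[de_genes] = True  # known truth against which methods are scored
    print("spiked genes:", truth.sum(), "of", genes)
    ```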

  4. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Science.gov (United States)

    Liang, Jianming; Järvi, Timo; Kiuru, Aaro; Kormano, Martti; Svedström, Erkki

    2003-12-01

    The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT) and nuclear medicine (NM) studies. This fluoroscopical examination takes only about 2 seconds for perfusion study with only low radiation dose to patient, involving no preparation, no radioactive isotopes, and no contrast media.

  5. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Directory of Open Access Journals (Sweden)

    Kiuru Aaro

    2003-01-01

    Full Text Available The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT and nuclear medicine (NM studies. This fluoroscopical examination takes only about 2 seconds for perfusion study with only low radiation dose to patient, involving no preparation, no radioactive isotopes, and no contrast media.

  6. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Science.gov (United States)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...
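
    The model-averaging step can be sketched as follows, with synthetic predictions standing in for calibrated SWAT variants and Gaussian likelihoods as an assumed error model (the GA calibration is not reproduced here):

    ```python
    # Bayesian-model-averaging sketch: weight the predictions of several
    # model variants by their (here, Gaussian) likelihoods on a validation
    # record and form the BMA mean. Data and error model are synthetic.
    import numpy as np

    rng = np.random.default_rng(2)
    obs = 50 + 10 * rng.standard_normal(20)               # observed streamflow
    preds = np.stack([obs + s * rng.standard_normal(20)   # three model variants
                      for s in (4.0, 6.0, 12.0)])

    # Log-likelihood per model, iid Gaussian errors with each model's own
    # residual standard deviation.
    sig = np.sqrt(((preds - obs) ** 2).mean(axis=1, keepdims=True))
    loglik = (-0.5 * np.log(2 * np.pi * sig**2)
              - 0.5 * ((preds - obs) / sig) ** 2).sum(axis=1)

    w = np.exp(loglik - loglik.max())
    w /= w.sum()                          # posterior model weights (equal priors)
    bma_mean = (w[:, None] * preds).sum(axis=0)

    print("BMA weights:", np.round(w, 3))
    print(f"best single-model RMSE: {min(np.std(p - obs) for p in preds):.2f}")
    print(f"BMA mean RMSE         : {np.std(bma_mean - obs):.2f}")
    ```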

  7. Modeling and analysis of ground target radiation cross section

    Institute of Scientific and Technical Information of China (English)

    SHI Xiang; LOU GuoWei; LI XingGuo

    2008-01-01

    Based on an analysis of passive millimeter wave (MMW) radiometer detection, the ground target radiation cross section is modeled as a new descriptor for the target's MMW radiation characteristics. Its application and actual testing are discussed and analyzed. The essence of passive MMW stealth is the reduction of the target radiation cross section.

  8. Algebraic analysis of a model of two-dimensional gravity

    CERN Document Server

    Frolov, A M; Kuzmin, S V

    2009-01-01

    An algebraic analysis of the Hamiltonian formulation of the model of two-dimensional gravity is performed. The crucial fact is the exact coincidence of the Poisson bracket algebra of the secondary constraints of this Hamiltonian formulation with the SO(2,1) algebra. The eigenvectors of the canonical Hamiltonian $H_{c}$ are obtained and explicitly written in closed form.

  9. Mathematical model for safety analysis of heavy water power reactor

    International Nuclear Information System (INIS)

    The fundamental information needed to formulate the mathematical model for accident analysis concerns reactivity changes of the system. The relevant parameters are: changes of fuel and moderator temperature, changes of the upper reflector thickness, and reactivity changes due to moderator density variation, which depends on the steam quantity and the neutron flux distribution in the core

  10. Modeling and Analysis of A Rotary Direct Drive Servovalve

    Institute of Scientific and Technical Information of China (English)

    YU Jue; ZHUANG Jian; YU Dehong

    2014-01-01

    Direct drive servovalves are mostly restricted to low flow rate and low bandwidth applications due to the considerable flow forces. Current studies mainly focus on enhancing the driving force, which in turn is limited by the development of the magnetic material. Aiming at reducing the flow forces, a novel rotary direct drive servovalve (RDDV) is introduced in this paper. This RDDV servovalve is designed with a rotating structure, and its axially symmetric spool rotates within a certain angle range in the valve chamber. The servovalve orifices are formed by the matching between the square-wave-shaped land on the spool and the rectangular ports on the sleeve. In order to study the RDDV servovalve performance, a flow rate model and a mechanical model are established, from which flow rates and flow-induced torques at different spool rotation angles or spool radii are obtained. The model analysis shows that the driving torque can be alleviated due to the proposed valve structure. Computational fluid dynamics (CFD) analysis using ANSYS/FLUENT is applied to evaluate and validate the theoretical analysis. In addition, experiments on the flow rate and the mechanical characteristics of the RDDV servovalve are carried out. Both simulation and experimental results conform to the results of the theoretical model analysis, which proves that this novel structure for direct drive servovalves can reduce the flow force on the spool and improve valve frequency response characteristics. This research proposes a novel rotary direct drive servovalve, which can reduce the flow forces effectively.
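
    A minimal sketch of the flow-rate side of such a model, assuming a sharp-edged orifice law and an invented geometry in which spool rotation opens a rectangular window:

    ```python
    # Orifice flow model Q = Cd * A(theta) * sqrt(2*dp/rho), with the
    # metering area A(theta) taken as the rectangular overlap opened by
    # spool rotation. Geometry and coefficients are illustrative, not the
    # paper's design values.
    import math

    Cd  = 0.61     # discharge coefficient
    rho = 850.0    # hydraulic oil density, kg/m^3
    dp  = 7.0e6    # pressure drop across the orifice, Pa
    r   = 0.01     # spool radius, m
    h   = 0.004    # axial height of the rectangular port, m

    def area(theta_rad):
        """Orifice area opened by spool rotation (zero-lapped, one port)."""
        arc = r * max(theta_rad, 0.0)   # circumferential opening = r * theta
        return h * arc

    def flow(theta_deg):
        A = area(math.radians(theta_deg))
        return Cd * A * math.sqrt(2.0 * dp / rho)   # m^3/s

    for th in (0.5, 1.0, 2.0, 4.0):
        print(f"theta = {th:4.1f} deg -> Q = {flow(th) * 6.0e4:7.2f} L/min")
    ```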

  11. Rasch Model Based Analysis of the Force Concept Inventory

    Science.gov (United States)

    Planinic, Maja; Ivanjek, Lana; Susac, Ana

    2010-01-01

    The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One such tool is the stochastic Rasch model, which enables construction of linear…
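
    For reference, the dichotomous Rasch model itself is compact; the sketch below simulates responses from it with illustrative person and item parameters (operational analyses use dedicated estimation software):

    ```python
    # Dichotomous Rasch model: the probability of a correct answer depends
    # only on the difference between person ability theta and item
    # difficulty b. Parameters below are illustrative.
    import numpy as np

    def p_correct(theta, b):
        """Item characteristic curve of the Rasch model."""
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    rng = np.random.default_rng(3)
    abilities = rng.normal(0.0, 1.0, size=500)       # simulated students
    difficulties = np.array([-1.0, 0.0, 0.5, 1.5])   # four FCI-like items

    # Simulate responses and check that the difficulty ordering shows up
    # in the raw proportions correct.
    P = p_correct(abilities[:, None], difficulties[None, :])
    responses = rng.random(P.shape) < P
    print("proportion correct per item:", responses.mean(axis=0).round(3))
    print("model-implied proportions  :", P.mean(axis=0).round(3))
    ```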

  12. Alphabet Knowledge in Preschool: A Rasch Model Analysis

    Science.gov (United States)

    Drouin, Michelle; Horner, Sherri L.; Sondergeld, Toni A.

    2012-01-01

    In this study, we used Rasch model analyses to examine (1) the unidimensionality of the alphabet knowledge construct and (2) the relative difficulty of different alphabet knowledge tasks (uppercase letter recognition, names, and sounds, and lowercase letter names) within a sample of preschoolers (n=335). Rasch analysis showed that the four…

  13. Video Analysis of the Flight of a Model Aircraft

    Science.gov (United States)

    Tarantino, Giovanni; Fazio, Claudio

    2011-01-01

    A video-analysis software tool has been employed in order to measure the steady-state values of the kinematics variables describing the longitudinal behaviour of a radio-controlled model aircraft during take-off, climbing and gliding. These experimental results have been compared with the theoretical steady-state configurations predicted by the…

  14. An Analysis of E-Commerce Models and Strategies

    OpenAIRE

    Mishra Rohita Kumar; Mahalik Debendra Kumar

    2009-01-01

    E-Commerce is the buzzword of today's global business. In order to take advantage of it, organizations are adopting different strategies and utilizing a lot of resources. This paper focuses on the analysis of different e-commerce web sites and their categorization with respect to different e-commerce models. It also emphasizes the strategic issues relating to e-commerce.

  15. A new analysis of a simple model of fair allocation

    OpenAIRE

    Juan D. Moreno-Ternero

    2012-01-01

    In a recent article, Fragnelli and Gagliardo [Cooperative models for allocating an object, Economics Letters 117 (2012) 227-229] propose several procedures to solve a basic problem of fair allocation. We scrutinize their proposal and place it in the context of recent developments in the literature on bankruptcy problems. Our analysis supports two of the procedures they propose, namely the Shapley and Talmud rules.
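
    The Talmud rule they support is fully algorithmic; a sketch (using the standard characterization via constrained equal awards on half-claims) reproduces the classic contested-estate numbers:

    ```python
    # Talmud rule for a bankruptcy problem (estate E, claims c): the
    # constrained-equal-awards (CEA) rule on half-claims when the estate
    # is small, and its dual when the estate is large.
    import numpy as np

    def cea(E, caps):
        """Constrained equal awards: min(cap_i, lam) with sum = E (bisection)."""
        lo, hi = 0.0, float(np.max(caps))
        for _ in range(100):
            lam = 0.5 * (lo + hi)
            if np.minimum(caps, lam).sum() < E:
                lo = lam
            else:
                hi = lam
        return np.minimum(caps, 0.5 * (lo + hi))

    def talmud(E, c):
        c = np.asarray(c, dtype=float)
        C = c.sum()
        if E <= C / 2.0:
            return cea(E, c / 2.0)          # award min(c_i/2, lam)
        return c - cea(C - E, c / 2.0)      # lose   min(c_i/2, mu)

    # Classic Talmud example: claims 100, 200, 300.
    for E in (100.0, 200.0, 300.0):
        print(f"E = {E:5.1f} ->", np.round(talmud(E, [100, 200, 300]), 2))
    ```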

  16. Elements of Constitutive Modelling and Numerical Analysis of Frictional Soils

    DEFF Research Database (Denmark)

    Jakobsen, Kim Parsberg

    This thesis deals with elements of elasto-plastic constitutive modelling and numerical analysis of frictional soils. The thesis is based on a number of scientific papers and reports in which central characteristics of soil behaviour and applied numerical techniques are considered. The development...

  17. An unsupervised aspect detection model for sentiment analysis of reviews

    NARCIS (Netherlands)

    Bagheri, A.; Saraee, M.; Jong, de F.M.G.

    2013-01-01

    With the rapid growth of user-generated content on the internet, sentiment analysis of online reviews has become a hot research topic recently, but due to the variety and wide range of products and services, supervised and domain-specific models are often not practical. As the number of reviews expa

  18. VHTR Prismatic Super Lattice Model for Equilibrium Fuel Cycle Analysis

    Energy Technology Data Exchange (ETDEWEB)

    G. S. Chang

    2006-09-01

    The advanced Very High Temperature gas-cooled Reactor (VHTR), which is currently being developed, achieves simplification of safety through reliance on innovative features and passive systems. One of the VHTR's innovative features is the reliance on ceramic-coated fuel particles to retain the fission products under extreme accident conditions. The effect of the random fuel kernel distribution in the fuel prismatic block is addressed through the use of the Dancoff correction factor in the resonance treatment. However, if the fuel kernels are not perfect black absorbers, the Dancoff correction factor is a function of burnup and fuel kernel packing factor, which requires that the Dancoff correction factor be updated during Equilibrium Fuel Cycle (EqFC) analysis. An advanced Kernel-by-Kernel (K-b-K) hexagonal super lattice model can be used to address and update the burnup-dependent Dancoff effect during the EqFC analysis. The developed Prismatic Super Homogeneous Lattice Model (PSHLM) is verified by comparison with the calculated burnup characteristics of the double-heterogeneous Prismatic Super Kernel-by-Kernel Lattice Model (PSK-b-KLM). This paper summarizes and compares the PSHLM and PSK-b-KLM burnup analysis study and results. This paper also discusses the coupling of a Monte-Carlo code with a fuel depletion and buildup code, which provides the fuel burnup analysis tool used to produce the results of the VHTR EqFC burnup analysis.

  19. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    Science.gov (United States)

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate, and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least-squares error approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach. PMID:16466842
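
    A bare-bones version of the approach can be sketched with a Metropolis sampler calibrating a one-compartment model on synthetic data; priors, error model and parameter values are all assumed for illustration:

    ```python
    # Metropolis MCMC calibration of a one-compartment toxicokinetic model
    # C(t) = (D/V) * exp(-k*t). Real PBTK calibrations involve many more
    # parameters; everything below is a synthetic stand-in.
    import numpy as np

    rng = np.random.default_rng(4)
    D = 100.0                                    # dose, mg
    t = np.array([0.5, 1, 2, 4, 8, 12, 24.0])    # sampling times, h
    true_k, true_V, sd = 0.20, 40.0, 0.05
    conc = (D / true_V) * np.exp(-true_k * t) * np.exp(sd * rng.standard_normal(t.size))

    def log_post(k, V):
        if k <= 0 or V <= 0:
            return -np.inf
        pred = (D / V) * np.exp(-k * t)
        loglik = -0.5 * np.sum(((np.log(conc) - np.log(pred)) / sd) ** 2)
        # weakly informative lognormal priors (an assumed choice)
        logprior = -0.5 * ((np.log(k / 0.3)) ** 2 + (np.log(V / 50.0)) ** 2)
        return loglik + logprior

    k, V = 0.3, 50.0
    lp = log_post(k, V)
    samples = []
    for i in range(20000):
        k_new = k * np.exp(0.05 * rng.standard_normal())
        V_new = V * np.exp(0.05 * rng.standard_normal())
        lp_new = log_post(k_new, V_new)
        # log-scale random walk: include the proposal Jacobian term
        if np.log(rng.random()) < lp_new - lp + np.log(k_new * V_new / (k * V)):
            k, V, lp = k_new, V_new, lp_new
        if i >= 5000:
            samples.append((k, V))

    post = np.array(samples)
    print("posterior mean k, V:", post.mean(axis=0).round(3), "(true: 0.2, 40)")
    ```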

  20. Radionuclide migration analysis using a discrete fracture network model

    International Nuclear Information System (INIS)

    This paper describes an approach for assessing the geosphere performance of nuclear waste disposal in fractured rock. In this approach, a three-dimensional heterogeneous channel-network model is constructed using a stochastic discrete fracture network (DFN) code. Radionuclide migration in the channel-network model is solved using the Laplace transform Galerkin finite element method, taking into account advection-dispersion in a fracture network, matrix diffusion, sorption in the rock matrix as well as radioactive chain decay. Preliminary radionuclide migration analysis was performed for fifty realizations of a synthetic block-scale DFN model. The total radionuclide release from all packages in the repository was estimated from the statistics of the results of fifty realizations under the hypothesis of ergodicity. The interpretation of the result of the three-dimensional network model by a combination of simpler one-dimensional parallel plate models is also discussed

  1. Coarse Analysis of Microscopic Models using Equation-Free Methods

    DEFF Research Database (Denmark)

    Marschler, Christian

    factor for the complexity of models, e.g., in real-time applications. With the increasing amount of data generated by computer simulations, a challenge is to extract valuable information from the models in order to help scientists and managers in a decision-making process. Although the dynamics......-dimensional models. The goal of this thesis is to investigate such high-dimensional multiscale models and extract relevant low-dimensional information from them. Recently developed mathematical tools make it possible to reach this goal: a combination of so-called equation-free methods with numerical bifurcation analysis...... using short simulation bursts of computationally-expensive complex models. This information is subsequently used to construct bifurcation diagrams that show the parameter dependence of solutions of the system. The methods developed for this thesis have been applied to a wide range of relevant problems...

  2. Modeling and analysis of transport in the mammary glands

    International Nuclear Information System (INIS)

    The transport of three toxins moving from the blood stream into the ducts of the mammary glands is analyzed in this work. The model predictions are compared with experimental data from the literature. The utility of the model lies in its potential to improve our understanding of toxin transport as a pre-disposing factor to breast cancer. This work is based on a multi-layer transport model to analyze the toxins present in the breast milk. The breast milk in comparison with other sampling strategies allows us to understand the mass transport of toxins once inside the bloodstream of breastfeeding women. The multi-layer model presented describes the transport of caffeine, DDT and cimetidine. The analysis performed takes into account the unique transport mechanisms for each of the toxins. Our model predicts the movement of toxins and/or drugs within the mammary glands as well as their bioaccumulation in the tissues. (paper)

  3. Modeling and analysis of transport in the mammary glands

    Science.gov (United States)

    Quezada, Ana; Vafai, Kambiz

    2014-08-01

    The transport of three toxins moving from the blood stream into the ducts of the mammary glands is analyzed in this work. The model predictions are compared with experimental data from the literature. The utility of the model lies in its potential to improve our understanding of toxin transport as a pre-disposing factor to breast cancer. This work is based on a multi-layer transport model to analyze the toxins present in the breast milk. The breast milk in comparison with other sampling strategies allows us to understand the mass transport of toxins once inside the bloodstream of breastfeeding women. The multi-layer model presented describes the transport of caffeine, DDT and cimetidine. The analysis performed takes into account the unique transport mechanisms for each of the toxins. Our model predicts the movement of toxins and/or drugs within the mammary glands as well as their bioaccumulation in the tissues.
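
    The steady-state core of such a multi-layer description is a resistances-in-series law; the layer properties below are illustrative placeholders, not the paper's fitted values for caffeine, DDT or cimetidine:

    ```python
    # Steady diffusive flux of a solute across stacked tissue layers:
    # J = (C_blood - C_milk) / sum_i (L_i / (D_i * K_i)), i.e. layer
    # resistances add in series. All layer data are invented.
    layers = [
        # (name,             L [m],  D [m^2/s], K [-])
        ("capillary wall",   5e-6,   1.0e-10,   1.0),
        ("interstitium",     20e-6,  2.0e-10,   1.0),
        ("alveolar epith.",  10e-6,  0.5e-10,   0.8),
    ]

    C_blood, C_milk = 10.0, 0.0     # solute concentrations, umol/L

    R_total = sum(L / (D * K) for _, L, D, K in layers)   # s/m
    J = (C_blood - C_milk) / R_total                      # (umol/L) * m/s

    print(f"total resistance: {R_total:.3e} s/m")
    for name, L, D, K in layers:
        frac = (L / (D * K)) / R_total
        print(f"  {name:16s} {100 * frac:5.1f}% of resistance")
    print(f"steady flux ~ {J:.3e} (umol/L)*m/s")
    ```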

  4. The modelling and analysis of the mechanics of ropes

    CERN Document Server

    Leech, C M

    2014-01-01

    This book considers the modelling and analysis of the many types of ropes, i.e., linear fibre assemblies. The construction of these structures is very diverse, and in this work they are considered from the modelling point of view. As well as the conventional twisted structures, braid and plaited structures and parallel assemblies are modelled and analysed, first for their assembly and secondly for their mechanical behaviour. Also, since the components are themselves assemblies (fibres into yarns, yarns into strands, and strands into ropes), the hierarchical nature of the construction is considered. The focus of the modelling is essentially on load-extension behaviour, but there is reference to the bending of ropes, encompassed by the two extremes: no slip between the components and zero friction resistance to component slip. Friction in ropes is considered both between the rope components (sliding, sawing and scissoring) and within the components (dilation and distortion), these latter modes being used to model component set, the p...

  5. A stochastic model for the analysis of maximum daily temperature

    Science.gov (United States)

    Sirangelo, B.; Caloiero, T.; Coscarelli, R.; Ferrari, E.

    2016-08-01

    In this paper, a stochastic model for the analysis of daily maximum temperature is proposed. First, a deseasonalization procedure based on a truncated Fourier expansion is adopted. Then, Johnson transformation functions are applied for data normalization. Finally, a fractionally integrated autoregressive moving average (FARIMA) model is used to reproduce both the short- and long-memory behavior of the temperature series. The model was applied to the data of the Cosenza gauge (Calabria region) and verified on four other gauges of southern Italy. Through a Monte Carlo simulation procedure based on the proposed model, 10^5 years of daily maximum temperature have been generated. Among the possible applications of the model, the occurrence probabilities of the annual maximum values have been evaluated. Moreover, the procedure was applied to estimate the return periods of long sequences of days with maximum temperature above prefixed thresholds.
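
    The first step, deseasonalization by a truncated Fourier expansion fitted by least squares, can be sketched on a synthetic series (the Johnson transformation and FARIMA steps are not reproduced here):

    ```python
    # Fit mu(d) = a0 + sum_k [a_k cos(2*pi*k*d/365.25) + b_k sin(...)] to
    # daily maximum temperatures by least squares and work with the
    # residuals. The series here is synthetic.
    import numpy as np

    rng = np.random.default_rng(5)
    days = np.arange(20 * 365)
    season = 24 + 9 * np.sin(2 * np.pi * days / 365.25 - 1.8)
    tmax = season + 3.0 * rng.standard_normal(days.size)   # synthetic daily maxima

    K = 2  # number of harmonics kept in the truncated expansion
    cols = [np.ones(days.size)]
    for k in range(1, K + 1):
        w = 2 * np.pi * k * days / 365.25
        cols += [np.cos(w), np.sin(w)]
    X = np.column_stack(cols)

    coef, *_ = np.linalg.lstsq(X, tmax, rcond=None)
    resid = tmax - X @ coef                 # deseasonalized series

    print("fitted Fourier coefficients:", np.round(coef, 2))
    print(f"residual std: {resid.std():.2f} (noise std used: 3.0)")
    ```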

  6. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) of programs for predicting atmospheric kinetics with sensitivity analysis is presented. The new direct method of calculating the first-order sensitivity coefficients, using sparse matrix technology applied to chemical kinetics, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their Jacobian analytical expressions are generated automatically from a chemical mechanism. The kinetic representation for the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines in packages, such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
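
    The direct method's core idea, integrating the model ODE together with its first-order sensitivity equations dS/dt = J·S + ∂f/∂p, can be sketched on a toy A→B→C mechanism in place of the dimethyl disulfide scheme:

    ```python
    # Direct (forward) sensitivity method: integrate dy/dt = f(y, p)
    # together with the sensitivities dS/dt = J(y) S + df/dp, where J is
    # the Jacobian. Toy first-order mechanism A -> B -> C.
    import numpy as np
    from scipy.integrate import solve_ivp

    k1, k2 = 1.0, 0.5   # rate constants; we compute d[y]/d(k1)

    def rhs(t, z):
        A, B, C, sA, sB, sC = z
        # model equations
        dA, dB, dC = -k1 * A, k1 * A - k2 * B, k2 * B
        # J @ s plus df/dk1 = (-A, A, 0)
        dsA = -k1 * sA - A
        dsB = k1 * sA - k2 * sB + A
        dsC = k2 * sB
        return [dA, dB, dC, dsA, dsB, dsC]

    z0 = [1.0, 0.0, 0.0, 0.0, 0.0, 0.0]
    sol = solve_ivp(rhs, [0.0, 5.0], z0, rtol=1e-8, atol=1e-10)

    A, B, C, sA, sB, sC = sol.y[:, -1]
    print(f"concentrations at t=5: A={A:.4f} B={B:.4f} C={C:.4f}")
    print(f"d/dk1 at t=5         : {sA:.4f} {sB:.4f} {sC:.4f}")
    # cross-check against the analytic result dA/dk1 = -t*exp(-k1*t)
    print(f"analytic dA/dk1      : {-5.0 * np.exp(-k1 * 5.0):.4f}")
    ```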

  7. Sensitivity analysis in severe accidents semi-mechanistic modeling

    International Nuclear Information System (INIS)

    A sensitivity analysis to determine the most influential phenomena to be considered in semi-mechanistic modeling of core melt progression has been performed in the present work. The semi-mechanistic program MARCH3 and the TMI-2 plant parameters were used for the TMI-2 severe accident. The sensitivity analysis was performed by comparing the results obtained by the program with the plant data recorded during the accident. The results enabled us to verify that, although many phenomena are present in the accident, modelling the most important ones was enough to reproduce, at least in a qualitative way, the accident progression. This fact reflects the importance of sensitivity analysis in selecting the most influential phenomena in a core melting process. (author). 48 refs., 28 figs., 6 tabs

  8. Modeling and structural analysis of honeycomb structure mirror

    Science.gov (United States)

    Li, Yeping

    2012-09-01

    In the development of large-scale astronomical telescopes, promising new technologies and methods such as honeycomb structure mirrors and silicon carbide mirrors are applied to primary mirrors. Especially in space telescopes, mirror lightweighting is becoming the key technology, and honeycomb structure mirrors are increasingly required to reduce the cost and increase the feasibility of the telescope system. In this paper, a parametric FEA model of a two-meter honeycomb structure mirror has been built using the engineering analysis software ANSYS. With this model, the structural analysis, thermal deformation analysis and simulation of active correction of low-order frequency aberration by the finite element method are presented.

  9. How Many Separable Sources? Model Selection In Independent Components Analysis

    DEFF Research Database (Denmark)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis...... computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher’s iris data set and Howells’ craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources...... might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian....

  10. State space modelling and data analysis exercises in LISA Pathfinder

    CERN Document Server

    Nofrarias, M; Armano, M; Audley, H; Auger, G; Benedetti, M; Binetruy, P; Bogenstahl, J; Bortoluzzi, D; Bosetti, P; Brandt, N; Caleno, M; Cañizares, P; Cavalleri, A; Cesa, M; Chmeissani, M; Conchillo, A; Congedo, G; Cristofolin, I; Cruise, M; Danzmann, K; De Marchi, F; Diaz-Aguilo, M; Diepholz, I; Dixon, G; Dolesi, R; Dunbar, N; Fauste, J; Ferraioli, L; Fichter, V Ferroni W; Fitzsimons, E; Freschi, M; Marin, A García; Marirrodriga, C García; Gesa, R Gerndt L; Gibert, F; Giardini, D; Grimani, C; Grynagier, A; Guillaume, B; Guzmán, F; Harrison, I; Heinzel, G; Hernández, V; Hewitson, M; Hollington, D; Hough, J; Hoyland, D; Hueller, M; Huesler, J; Jennrich, O; Jetzer, P; Johlander, B; Killow, C; Llamas, X; Lloro, I; Lobo, A; Maarschalkerweerd, R; Madden, S; Mance, D; Mateos, I; McNamara, P W; Mendes, J; Mitchell, E; Monsky, A; Nicolini, D; Nicolodi, D; Pedersen, F; Perreur-Lloyd, M; Plagnol, E; Prat, P; Racca, G D; Ramos-Castro, J; Reiche, J; Perez, J A Romera; Robertson, D; Rozemeijer, H; Sanjuan, J; Schleicher, A; Schulte, M; Shaul, D; Stagnaro, L; Strandmoe, S; Steier, F; Sumner, T J; Taylor, A; Texier, D; Trenkel, C; Vitale, H-B Tu S; Wanner, G; Ward, H; Waschke, S; Wass, P; Weber, W J; Ziegler, T; Zweifel, P

    2013-01-01

    LISA Pathfinder is a mission planned by the European Space Agency to test the key technologies that will allow the detection of gravitational waves in space. The instrument on board, the LISA Technology Package, will undergo an exhaustive campaign of calibration and noise characterisation in order to fully describe the noise model. Data analysis plays an important role in the mission, and for that reason the data analysis team has been developing a toolbox which contains all the functionalities required during operations. In this contribution we give an overview of recent activities, focusing on the improvements in the modelling of the instrument and in the data analysis campaigns performed both with real and simulated data.

  11. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than those of the sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to their optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher-order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. PMID:24130119
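
    The key reduction, expanding the genotype function and the effect function beta(t) on a finite basis so the functional regression becomes ordinary least squares with a K-degree-of-freedom F test, can be sketched with simulated genotypes (the paper uses B-spline/Fourier bases with covariate adjustment):

    ```python
    # Fixed-effect functional linear model sketch: genotypes X_i(t_j) at
    # variant positions t_j are projected onto K basis functions, so
    # beta(t) = sum_k beta_k phi_k(t) reduces the model to OLS plus an
    # F test with K degrees of freedom. Data are simulated.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    n, m, K = 1000, 30, 5                   # subjects, variants, basis size
    pos = np.sort(rng.random(m))            # variant positions in [0, 1]
    maf = rng.uniform(0.01, 0.5, m)
    G = rng.binomial(2, maf, size=(n, m)).astype(float)   # genotype matrix

    # Fourier basis evaluated at the variant positions.
    phi = np.column_stack([np.ones(m)] +
        [f(2 * np.pi * k * pos) for k in range(1, (K - 1) // 2 + 1)
         for f in (np.cos, np.sin)])[:, :K]

    U = G @ phi / m                          # functional scores, n x K
    beta_t = np.sin(2 * np.pi * pos)         # true effect function
    y = G @ (0.15 * beta_t) + rng.standard_normal(n)   # quantitative trait

    # F test: full model (intercept + K scores) vs intercept only.
    X = np.column_stack([np.ones(n), U])
    rss1 = np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
    rss0 = np.sum((y - y.mean()) ** 2)
    Fstat = ((rss0 - rss1) / K) / (rss1 / (n - K - 1))
    print(f"F = {Fstat:.2f}, p = {stats.f.sf(Fstat, K, n - K - 1):.2e}")
    ```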

  12. Patent portfolio analysis model based on legal status information

    Institute of Scientific and Technical Information of China (English)

    Xuezhao WANG; Yajuan ZHAO; Jing ZHANG; Ping ZHAO

    2014-01-01

    Purpose: This research proposes a patent portfolio analysis model based on legal status information to chart out a competitive landscape in a particular field, enabling organizations to position themselves within the overall technology landscape. Design/methodology/approach: Three indicators were selected for the proposed model: patent grant rate, valid patent rate and patent maintenance period. The model uses legal status information to perform a qualitative evaluation of the relative values of individual patents, countries' or regions' technological capabilities, and the competitiveness of patent applicants. The results are visualized by a four-quadrant bubble chart. To test the effectiveness of the model, it is used to present a competitive landscape in the lithium ion battery field. Findings: The model can be used to evaluate the values of individual patents, highlight countries' or regions' positions in the field, and rank the competitiveness of patent applicants in the field. Research limitations: The model currently takes into consideration only three legal status indicators. It is feasible to introduce more indicators, such as the reason for invalid patents and the distribution of patent maintenance time, and associate them with those in the proposed model. Practical implications: Analysis of legal status information in combination with patent application information can help an organization to spot gaps in its patent claim coverage, as well as evaluate the patent quality and maintenance situation of its granted patents. The study results can be used to support technology assessment, technology innovation and intellectual property management. Originality/value: Prior studies attempted to assess patent quality or competitiveness by using either a single patent legal status indicator or comparative analysis of the impacts of each indicator. However, they are insufficient in presenting the combined effects of the evaluation indicators. Using our model, it appears possible to get a

  13. The business models of the patent market: an empirical analysis of IP business model characteristics

    OpenAIRE

    Seissonen, Julia

    2014-01-01

    This thesis surveys previous academic literature and performs an empirical analysis to reach two main objectives. The first objective is to build a broader and more detailed picture of the different IP specialized business models present in the US business environment. Additionally, the thesis studies their effects on the economy and the patent system efficiency. The second objective is to build an econometric model of the IP business model strategy and empirically test the hypothesis that IP...

  14. Comparative analysis of calculation models of railway subgrade

    Directory of Open Access Journals (Sweden)

    I.O. Sviatko

    2013-08-01

    Full Text Available Purpose. In the design of transport engineering structures, the primary task is to determine the parameters of the foundation soil and the nuances of its behaviour under load. When calculating the interaction between the soil subgrade and the upper track structure, it is very important to determine the shear resistance parameters and the parameters governing the development of deep deformations in foundation soils. The aim is to find generalized numerical modeling methods for the embankment foundation soil that include not only the analysis of the foundation stress state but also of its deformed state. Methodology. An analysis of existing modern and classical methods of numerical simulation of soil samples under static load was made. Findings. According to traditional methods of analysis of ground masses, limitation and qualitative estimation of subgrade deformations is possible only indirectly, through the estimation of stresses and comparison of the obtained values with the boundary ones. Originality. A new computational model is proposed that applies not only the classical analysis of the soil subgrade stress state but also takes the deformed state into account. Practical value. The analysis showed that an accurate analysis of ground masses requires a generalized methodology for analyzing the rolling stock - railway subgrade interaction, one that uses not only the classical approach of analyzing the soil subgrade stress state but also takes into account its deformed state.

  15. Source modelling in seismic risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of the uncertainties that are involved in seismic risk analysis for nuclear power plants. The potential earthquake activity zones are idealized as point, line or area sources. For these seismic source types, expressions to evaluate their contribution to seismic risk are derived, considering all possible site-source configurations. The seismic risk at a site is found to depend not only on the inherent randomness of earthquake occurrences with respect to magnitude, time and space, but also on the uncertainties associated with the predicted values of the seismic and geometric parameters, as well as the uncertainty in the attenuation model. The uncertainty due to the attenuation equation is incorporated into the analysis through the use of random correction factors. The influence of the uncertainty resulting from insufficient information on the seismic parameters and source geometry is introduced into the analysis by computing a mean risk curve averaged over the various alternative assumptions on the parameters and source geometry. Seismic risk analysis is carried out for the city of Denizli, which is located in the seismically most active zone of Turkey. The second analysis is for Akkuyu
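
    For orientation, the underlying hazard integral for a single source can be sketched numerically; the activity rate, magnitude bounds and toy attenuation coefficients below are invented, and the mean risk curve described above would average such curves over alternative parameter and geometry assumptions:

    ```python
    # Hazard sketch for one point source: the annual rate of exceeding
    # acceleration a is lambda(a) = nu * integral P[A > a | m, r] f(m) dm,
    # with a truncated Gutenberg-Richter magnitude density and a lognormal
    # attenuation law. All values are illustrative.
    import numpy as np
    from scipy import stats

    nu, b = 0.2, 1.0            # activity rate (M >= m0, per yr), G-R b-value
    m0, m_max = 4.0, 7.5        # magnitude bounds
    r = 30.0                    # site-source distance, km
    beta = b * np.log(10.0)

    m = np.linspace(m0, m_max, 400)
    f_m = beta * np.exp(-beta * (m - m0)) / (1 - np.exp(-beta * (m_max - m0)))

    # Toy attenuation: ln A = c0 + c1*m - c2*ln(r + 10), scatter s
    c0, c1, c2, s = -3.5, 1.0, 1.5, 0.6
    ln_med = c0 + c1 * m - c2 * np.log(r + 10.0)

    for a in (0.05, 0.1, 0.2, 0.4):                 # PGA thresholds, g
        p_exc = stats.norm.sf((np.log(a) - ln_med) / s)
        lam = nu * np.trapz(p_exc * f_m, m)         # annual exceedance rate
        print(f"a = {a:4.2f} g : lambda = {lam:.2e} /yr "
              f"(return period ~ {1/lam:,.0f} yr)")
    ```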

  16. Molecular structure based property modeling: Development/ improvement of property models through a systematic property-data-model analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol Shivajirao; Sarup, Bent; Sin, Gürkan;

    2013-01-01

    The objective of this work is to develop a method for performing property-data-model analysis so that efficient use of knowledge of properties could be made in the development/improvement of property prediction models. The method includes: (i) analysis of property data and its consistency check...... models. To make the property-data-model analysis fast and efficient, an approach based on the “molecular structure similarity criteria” to identify molecules (mono-functional, bi-functional, etc.) containing a specified set of structural parameters (that is, groups) is employed. The method has been applied...... to a wide range of properties of pure compounds. In this work, however, the application of the method is illustrated for the property modeling of normal melting point, enthalpy of fusion, enthalpy of formation, and critical temperature. For all the properties listed above, it has been possible to achieve

  17. A fuzzy set preference model for market share analysis

    Science.gov (United States)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    Consumer preference models are widely used in new product design, marketing management, pricing, and market segmentation. The success of new products depends on accurate market share prediction and design decisions based on consumer preferences. The vague linguistic nature of consumer preferences and product attributes, combined with the substantial differences between individuals, creates a formidable challenge to marketing models. The most widely used methodology is conjoint analysis. Conjoint models, as currently implemented, represent linguistic preferences as ratio or interval-scaled numbers, use only numeric product attributes, and require aggregation of individuals for estimation purposes. It is not surprising that these models are costly to implement, are inflexible, and have a predictive validity that is not substantially better than chance. This affects the accuracy of market share estimates. A fuzzy set preference model can easily represent linguistic variables either in consumer preferences or product attributes with minimal measurement requirements (ordinal scales), while still estimating overall preferences suitable for market share prediction. This approach results in flexible individual-level conjoint models which can provide more accurate market share estimates from a smaller number of more meaningful consumer ratings. Fuzzy sets can be incorporated within existing preference model structures, such as a linear combination, using the techniques developed for conjoint analysis and market share estimation. The purpose of this article is to develop and fully test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation), and how much to make (market share
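
    A minimal sketch of the core mechanics, with invented linguistic terms, weights and products: ratings become triangular fuzzy numbers, attribute preferences combine linearly, and a centroid defuzzification yields a crisp preference score:

    ```python
    # Fuzzy-set preference sketch: linguistic ratings as triangular fuzzy
    # numbers, a weighted linear combination across attributes, and a
    # centroid defuzzification. All terms, weights and products invented.
    import numpy as np

    # triangular fuzzy number (a, b, c): support [a, c], peak at b
    TERMS = {
        "poor":      (0.0, 0.0, 0.3),
        "fair":      (0.2, 0.5, 0.8),
        "good":      (0.6, 0.8, 1.0),
        "excellent": (0.8, 1.0, 1.0),
    }

    def weighted_tfn(ratings, weights):
        """Linear combination of triangular fuzzy numbers (vertex-wise)."""
        tfns = np.array([TERMS[r] for r in ratings])
        w = np.asarray(weights) / np.sum(weights)
        return tuple(w @ tfns)

    def centroid(tfn):
        a, b, c = tfn
        return (a + b + c) / 3.0   # centroid of a triangular membership

    # Two products rated on (price, quality, style) by one consumer.
    weights = [0.5, 0.3, 0.2]
    for name, ratings in [("brand A", ["good", "fair", "excellent"]),
                          ("brand B", ["fair", "good", "good"])]:
        pref = weighted_tfn(ratings, weights)
        print(f"{name}: fuzzy preference {np.round(pref, 2)}, "
              f"crisp score {centroid(pref):.3f}")
    ```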

  18. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions of strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
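
    The descriptor idea can be sketched with a Fourier decomposition (Zernike moments play the same role on circular domains); the strain fields below are synthetic stand-ins for measured and predicted data:

    ```python
    # Treat the measured and predicted strain maps as images, keep only
    # the low-order 2D Fourier coefficients as shape descriptors, and
    # compare the two descriptor sets. Fields are synthetic stand-ins.
    import numpy as np

    rng = np.random.default_rng(7)
    ny, nx, K = 128, 128, 6              # field size; keep a K x K corner
    y, x = np.mgrid[0:ny, 0:nx] / 128.0

    field_exp = np.exp(-((x - 0.55)**2 + (y - 0.5)**2) / 0.02)  # "experiment"
    field_exp += 0.03 * rng.standard_normal(field_exp.shape)    # measurement noise
    field_fem = np.exp(-((x - 0.50)**2 + (y - 0.5)**2) / 0.02)  # "model"

    def descriptors(field, K):
        """Low-order Fourier descriptors: ~K^2 numbers instead of 10^4 pixels."""
        F = np.fft.fft2(field) / field.size
        return np.concatenate([F[:K, :K].ravel().real, F[:K, :K].ravel().imag])

    d_exp, d_fem = descriptors(field_exp, K), descriptors(field_fem, K)
    resid = d_exp - d_fem
    print(f"descriptors kept: {d_exp.size} (vs {field_exp.size} pixels)")
    print(f"relative misfit : {np.linalg.norm(resid) / np.linalg.norm(d_exp):.3f}")
    ```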

  19. Conditions for transmission path analysis in energy distribution models

    Science.gov (United States)

    Aragonès, Àngels; Guasch, Oriol

    2016-02-01

    In this work, we explore under which conditions transmission path analysis (TPA), developed for statistical energy analysis (SEA), can be applied to the less restrictive energy distribution (ED) models. It is shown that TPA can be extended without problems to proper-SEA systems, whereas the situation is not so clear for quasi-SEA systems. In the general case, it has been found that a TPA can always be performed on an ED model if its inverse influence energy coefficient (EIC) matrix turns out to have negative off-diagonal entries. If this condition is satisfied, it can be shown that the inverse EIC matrix automatically becomes an M-matrix. An ED graph can then be defined for it, and use can be made of graph-theoretic path ranking algorithms, previously developed for SEA systems, to classify dominant paths in ED models. A small mechanical system consisting of connected plates is used to illustrate some of the exposed theoretical results.
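
    The stated condition is directly checkable on a small example; the 3-subsystem EIC matrix below is invented for illustration:

    ```python
    # Invert the EIC matrix, verify that the off-diagonal entries of the
    # inverse are negative (so it is an M-matrix), and read off the ED
    # graph used for path ranking. The EIC matrix is invented.
    import numpy as np

    # EIC matrix A: subsystem energies from injected powers, E = A @ Pi
    A = np.array([[2.0, 0.4, 0.1],
                  [0.4, 1.5, 0.3],
                  [0.1, 0.3, 1.0]])

    B = np.linalg.inv(A)
    off = B[~np.eye(3, dtype=bool)]
    print("inverse EIC matrix:\n", np.round(B, 3))
    print("all off-diagonal entries negative (TPA applicable):", np.all(off < 0))

    # ED graph edges: subsystems i, j are connected when B[i, j] != 0,
    # with coupling strength related to -B[i, j] (as with SEA loss factors).
    edges = [(i, j, -B[i, j]) for i in range(3) for j in range(3)
             if i != j and abs(B[i, j]) > 1e-12]
    print("ED graph edges (i, j, weight):")
    for e in edges:
        print("  ", e[0], e[1], round(e[2], 3))
    ```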

  20. Sensitivity analysis in a Lassa fever deterministic mathematical model

    Science.gov (United States)

    Abdullahi, Mohammed Baba; Doko, Umar Chado; Mamuda, Mamman

    2015-05-01

    Lassa virus, which causes Lassa fever, is on the list of potential bio-weapon agents. It was recently imported into Germany, the Netherlands, the United Kingdom and the United States as a consequence of the rapid growth of international traffic. A model with five mutually exclusive compartments related to Lassa fever is presented and the basic reproduction number analyzed. A sensitivity analysis of the deterministic model is performed in order to determine the relative importance of the model parameters to disease transmission. The result of the sensitivity analysis shows that the most sensitive parameter is human immigration, followed by the human recovery rate, then person-to-person contact. This suggests that control strategies should target human immigration, effective drugs for treatment, and education to reduce person-to-person contact.
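
    The sensitivity indices behind such statements are normalized forward sensitivities, Upsilon_p = (dR0/dp)(p/R0); the sketch below evaluates them by finite differences on an illustrative R0 expression, not the paper's five-compartment model:

    ```python
    # Normalized forward sensitivity indices of a basic reproduction
    # number, Upsilon_p = (dR0/dp) * (p / R0), by central finite
    # differences. The R0 expression and parameter values are stand-ins.
    def R0(p):
        # transmission via person-to-person contact plus via the rodent
        # reservoir, discounted by recovery and removal rates
        return (p["beta_hh"] / (p["gamma"] + p["mu"])
                + p["beta_rh"] * p["pi_r"] / (p["mu_r"] * (p["gamma"] + p["mu"])))

    params = {"beta_hh": 0.15, "beta_rh": 0.10, "pi_r": 0.5,
              "gamma": 0.1, "mu": 0.02, "mu_r": 0.3}

    base = R0(params)
    print(f"R0 = {base:.3f}")
    for name in params:
        h = 1e-6 * params[name]
        up = dict(params); up[name] += h
        dn = dict(params); dn[name] -= h
        dR0 = (R0(up) - R0(dn)) / (2 * h)
        print(f"  Upsilon_{name:7s} = {dR0 * params[name] / base:+.3f}")
    ```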