WorldWideScience

Sample records for model promet-v process

  1. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  2. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models....... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety...... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision-making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners....

  3. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  4. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that often occur in activated sludge tanks are initially......-process models, the last part of the thesis, where the integrated process tank model is tested on three examples of activated sludge systems, is initiated. The three case studies are introduced with an increasing degree of model complexity. All three cases are based on Danish municipal wastewater treatment...... plants. The first case study involves the modeling of an activated sludge tank undergoing a special controlling strategy with the intention of minimizing the sludge loading on the subsequent secondary settlers during storm events. The applied model is a two-phase model, where the sedimentation of sludge...

  5. Model Process Control Language

    Data.gov (United States)

    National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

  6. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    strategic preference, as part of their planned business model innovation activity. Practical implications – This paper aimed at strengthening researchers' and, particularly, practitioners' perspectives on the field of business model process configurations. By ensuring an [abstracted] alignment between......Purpose – The paper aims: 1) To develop systematically a structural list of various business model process configurations and to group (deductively) these selected configurations in a structured typological categorization list. 2) To facilitate companies in the process of BM innovation......, by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction...

  7. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  8. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  9. Foam process models.

    Energy Technology Data Exchange (ETDEWEB)

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production-level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion, as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
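
    The empirical density model described above can be sketched in a few lines. The following Python toy (with made-up parameter values and function names; the report's actual fits are not reproduced here) shows how a time-dependent density alone drives self-expansion, since a fixed mass at falling density must occupy a growing volume.

```python
import math

def density(t, rho_final=60.0, rho_initial=1200.0, k=0.5):
    """Empirical foam density [kg/m^3], decaying exponentially in time.
    All parameter values are illustrative, not taken from the report."""
    return rho_final + (rho_initial - rho_final) * math.exp(-k * t)

def expansion_ratio(t, m=1.0, **kw):
    """Volume of a fixed foam mass m relative to its initial volume."""
    return (m / density(t, **kw)) / (m / density(0.0, **kw))

# Density falls monotonically, so the foam volume grows until the
# final (cured) density is approached.
ratios = [expansion_ratio(t) for t in (0.0, 1.0, 5.0, 20.0)]
```

    In a full simulation this density change would supply the source of motion for the free-surface flow; the toy only exposes the volume-growth mechanism.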

  10. GREENSCOPE: Sustainable Process Modeling

    Science.gov (United States)

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  11. Auditory processing models

    DEFF Research Database (Denmark)

    Dau, Torsten

    2008-01-01

    The Handbook of Signal Processing in Acoustics will compile the techniques and applications of signal processing as they are used in the many varied areas of Acoustics. The Handbook will emphasize the interdisciplinary nature of signal processing in acoustics. Each Section of the Handbook...... will present topics on signal processing which are important in a specific area of acoustics. These will be of interest to specialists in these areas because they will be presented from their technical perspective, rather than a generic engineering approach to signal processing. Non-specialists, or specialists...

  12. INNOVATION PROCESS MODELLING

    Directory of Open Access Journals (Sweden)

    JANUSZ K. GRABARA

    2011-01-01

    Modelling phenomena in accordance with the structural approach enables one to simplify the observed relations and to present the grounds for classification. An example may be a model of organisational structure identifying the logical relations between particular units and presenting the division of authority and work.

  13. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    The subject of this thesis is to develop a methodological framework that can systematically guide mathematical model building for better understanding of multi-enzyme processes. In this way, opportunities for process improvements can be identified by analyzing simulations of either existing...... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases efficiency of the model development process with respect to time and resources needed (fast and effective...... in the scientific literature. Reliable mathematical models of such multi-catalytic schemes can exploit the potential benefit of these processes. In this way, the best outcome of the process can be obtained understanding the types of modification that are required for process optimization. An effective evaluation...

  14. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.
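
    A minimal numerical sketch of a convection-diffusion type column model, assuming a steady 1-D column with a first-order reaction and upwind finite differences (Python; the parameter values are illustrative, not from the book):

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal)."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def column_profile(u=1.0, D=0.01, k=0.5, L=1.0, n=200, c_in=1.0):
    """Steady 1-D convection-diffusion-reaction: u c' = D c'' - k c,
    with upwind convection, c(0) = c_in, zero gradient at the outlet."""
    h = L / (n - 1)
    a = [0.0] * n; b = [0.0] * n; c = [0.0] * n; d = [0.0] * n
    b[0], d[0] = 1.0, c_in                    # inlet Dirichlet condition
    for i in range(1, n - 1):
        a[i] = -u / h - D / h**2              # upwind, valid for u > 0
        b[i] = u / h + 2 * D / h**2 + k
        c[i] = -D / h**2
    a[-1], b[-1] = -1.0, 1.0                  # outlet: zero gradient
    return thomas(a, b, c, d)

profile = column_profile()
```

    For small dispersion the outlet value stays close to the plug-flow result exp(-kL/u); averaging the profile over the column cross-section is the step the book's average-concentration models formalize.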

  15. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities, as well as on a proper understanding of the functionality of the information systems that support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. This paper assesses whether one of the most flexible and strictly standardized contemporary business process modeling notations, the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all stakeholders. After the introduction, the methodology of the research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  16. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  17. Chemical Process Modeling and Control.

    Science.gov (United States)

    Bartusiak, R. Donald; Price, Randel M.

    1987-01-01

    Describes some of the features of Lehigh University's (Pennsylvania) process modeling and control program. Highlights the creation and operation of the Chemical Process Modeling and Control Center (PMC). Outlines the program's philosophy, faculty, technical program, current research projects, and facilities. (TW)

  18. Chapter 1: Standard Model processes

    OpenAIRE

    Becher, Thomas

    2017-01-01

    This chapter documents the production rates and typical distributions for a number of benchmark Standard Model processes, and discusses new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  19. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  20. Modeling nuclear processes by Simulink

    Science.gov (United States)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts of the study of the behaviour of dynamic systems. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. The effort expended to solve the equations using analytical or numerical solutions consumes time and distracts attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
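
    The describing-equations approach outlined above does not require Simulink; for instance, the point kinetics equations with one delayed neutron group can be integrated in a few lines of plain Python. This is a sketch with illustrative parameter values, not a reproduction of the paper's Simulink models:

```python
def point_kinetics(rho=0.001, beta=0.0065, Lambda=1e-3, lam=0.08,
                   dt=1e-4, t_end=1.0):
    """Point reactor kinetics with one delayed neutron group,
    integrated with forward Euler (illustrative parameters):
        dn/dt = ((rho - beta)/Lambda) * n + lam * C
        dC/dt = (beta/Lambda) * n - lam * C
    """
    n = 1.0                          # relative neutron population
    C = beta * n / (Lambda * lam)    # equilibrium precursor level
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n, C = n + dt * dn, C + dt * dC
    return n

power = point_kinetics()  # rho < beta: prompt jump, then a slow rise
```

    With zero reactivity the equilibrium initial condition leaves the power unchanged, which is a convenient sanity check on any implementation.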

  1. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts of the study of the behaviour of dynamic systems. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. The effort expended to solve the equations using analytical or numerical solutions consumes time and distracts attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  2. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    and having three or more stages. The methods are applied to process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables......Many production processes are carried out in stages. At the end of each stage, the production engineer can analyze the intermediate results and correct process parameters (variables) of the next stage. Both analysis of the process and correction of process parameters at the next stage should...... be performed regarding the foreseeable output property y, and with respect to an admissible range of correcting actions for the parameters of the next stage. In this paper the basic principles of path modeling are presented. The mathematics is presented for processes having only one stage, having two stages...

  3. Markov Decision Process Measurement Model.

    Science.gov (United States)

    LaMar, Michelle M

    2018-03-01

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than a partial-credit IRT model based on outcome data alone.
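
    For readers unfamiliar with the underlying formalism, the following toy (not the paper's measurement model) illustrates a Markov decision process solved by value iteration on a four-state chain, in Python:

```python
def value_iteration(n_states=4, gamma=0.9, tol=1e-9):
    """Value iteration on a deterministic chain 0 -> 1 -> ... -> n-1.
    Action 'right' moves toward the terminal state, 'left' away from it;
    reaching the terminal state yields reward 1 (toy example)."""
    terminal = n_states - 1
    V = [0.0] * n_states
    while True:
        delta = 0.0
        for s in range(terminal):            # terminal state keeps V = 0
            q = {}
            for action, s2 in (("right", min(s + 1, terminal)),
                               ("left", max(s - 1, 0))):
                r = 1.0 if s2 == terminal else 0.0
                q[action] = r + gamma * V[s2]
            best = max(q.values())
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

V = value_iteration()  # V decays geometrically with distance from reward
```

    In the measurement setting, observed within-task actions are compared against such action values to infer latent traits; the toy only shows the planning half of that machinery.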

  4. Simple Models for Process Control

    Czech Academy of Sciences Publication Activity Database

    Gorez, R.; Klán, Petr

    2011-01-01

    Roč. 22, č. 2 (2011), s. 58-62 ISSN 0929-2268 Institutional research plan: CEZ:AV0Z10300504 Keywords : process models * PID control * second order dynamics Subject RIV: JB - Sensors, Measurment, Regulation
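
    As a minimal illustration of the keywords above, PI control of a first-order process yields a second-order closed loop. The following Python sketch (with illustrative gains and time constant, not values from the paper) simulates such a loop:

```python
def simulate_pi(Kp=2.0, Ki=1.0, tau=1.0, setpoint=1.0,
                dt=0.01, t_end=10.0):
    """PI control of a first-order process dy/dt = (u - y)/tau.
    The closed loop is second order; gains are illustrative only."""
    y, integral = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        e = setpoint - y
        integral += e * dt
        u = Kp * e + Ki * integral
        y += dt * (u - y) / tau        # forward Euler process update
    return y

final = simulate_pi()  # integral action removes steady-state error
```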

  5. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    A model of raw-material supply for processing enterprises belonging to a vertically integrated structure for the production and processing of dairy raw materials is developed. The model is distinguished by its orientation toward achieving a cumulative effect for the integrated structure, which acts as the criterion function; this function is maximized by optimizing capacities, the volumes and quality characteristics of raw-material deliveries, the costs of industrial processing of the raw materials, and the demand for dairy products.

  6. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...

  7. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative...

  8. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  9. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

    integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined processes perspectives. Second, the paper translates this information into a business process model and mathematical formalization. The study provides a useful guide to the various relevant technology‐related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  10. The Brookhaven Process Optimization Models

    Energy Technology Data Exchange (ETDEWEB)

    Pilati, D. A.; Sparrow, F. T.

    1979-01-01

    The Brookhaven National Laboratory Industry Model Program (IMP) has undertaken the development of a set of industry-specific process-optimization models. These models are to be used for energy-use projections, energy-policy analyses, and process technology assessments. Applications of the models currently under development show that system-wide energy impacts may be very different from engineering estimates, selected investment tax credits for cogeneration (or other conservation strategies) may have the perverse effect of increasing industrial energy use, and that a proper combination of energy taxes and investment tax credits is more socially desirable than either policy alone. A section is included describing possible extensions of these models to answer questions or address other systems (e.g., a single plant instead of an entire industry).
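
    The policy interactions described (for example, an investment tax credit for cogeneration changing the cost-minimizing technology) can be illustrated with a deliberately simple cost model in Python; all technologies, costs, and parameter values below are invented for illustration:

```python
# Hypothetical technology options: (capital cost, annual energy use).
technologies = {"boiler": (100.0, 40.0), "cogen": (300.0, 30.0)}

def lifetime_cost(tech, energy_price=1.0, cogen_credit=0.0, years=10.0):
    """Capital plus lifetime energy cost; an investment tax credit
    reduces only the cogeneration capital cost (toy assumption)."""
    capital, energy = technologies[tech]
    if tech == "cogen":
        capital *= (1.0 - cogen_credit)
    return capital + years * energy * energy_price

def best(energy_price=1.0, cogen_credit=0.0):
    """Cost-minimizing technology choice under the given policy."""
    return min(technologies,
               key=lambda t: lifetime_cost(t, energy_price, cogen_credit))
```

    Even this toy reproduces the qualitative point that system-wide outcomes depend jointly on prices and credits: either a large enough credit or a high enough energy price flips the choice to cogeneration.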

  11. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occur within the porous adsorbent. The theoret...
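
    One common candidate for the adsorption model mentioned above is the Langmuir isotherm, q = q_max K c / (1 + K c). A brief Python sketch with illustrative parameter values (not taken from the paper):

```python
def langmuir(c, q_max=120.0, K=2.5):
    """Langmuir isotherm: adsorbed concentration q at liquid-phase
    concentration c. q_max (capacity) and K (affinity) are illustrative."""
    return q_max * K * c / (1.0 + K * c)

# Loading rises monotonically and saturates below the capacity q_max.
loadings = [langmuir(c) for c in (0.01, 0.1, 1.0, 10.0, 100.0)]
```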

  12. Neuroscientific model of motivational process.

    Science.gov (United States)

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: generating motivation, maintaining motivation, and regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area), in which basic stimulus-action associations are formed, and is classified as automatic motivation, to which relatively little attention is assigned. By contrast, value-based decision-making is the process of evaluating the various outcomes of actions, learning through positive prediction error, and calculating value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to the regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational processes suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.
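
    The reward prediction error signal central to this account is commonly formalized as the temporal-difference (TD) error. The following Python toy (an illustration of standard TD(0), not the author's model) shows the error delta = r + gamma * V(s') - V(s) driving value learning on a small chain of states:

```python
def td_learning(rewards, alpha=0.1, gamma=0.9, episodes=500):
    """Tabular TD(0) on a deterministic chain of states; rewards[s] is
    received on leaving state s. delta is the reward prediction error."""
    n = len(rewards)
    V = [0.0] * (n + 1)            # V[n] is the terminal state, value 0
    for _ in range(episodes):
        for s in range(n):
            delta = rewards[s] + gamma * V[s + 1] - V[s]
            V[s] += alpha * delta  # prediction-error-driven update
    return V[:-1]

# Reward only at the end of the chain: earlier states acquire
# discounted value, mirroring anticipatory dopaminergic signaling.
values = td_learning([0.0, 0.0, 1.0])
```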

  13. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

    This paper presents a model for an integrated security system which can be implemented in any organization. It is based on security-specific standards and taxonomies such as ISO 7498-2 and the Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality and also focus on the specific components.

  14. Mathematical modelling in economic processes.

    Directory of Open Access Journals (Sweden)

    L.V. Kravtsova

    2008-06-01

    This article considers a number of methods for the mathematical modelling of economic processes, and the possibilities of using Excel spreadsheets to obtain optimal solutions to problems or to calculate financial operations with the help of built-in functions.
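
    The built-in financial functions alluded to (for example, a net-present-value calculation) have direct counterparts outside spreadsheets. A brief Python sketch; note that this convention takes the first cash flow at time 0, whereas Excel's NPV discounts the first flow by one period:

```python
def npv(rate, cashflows):
    """Net present value of a cash-flow sequence, the first flow
    occurring at time 0 (unlike Excel's NPV convention)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Invest 100 now, receive 60 at the end of each of two years, at 10%.
value = npv(0.10, [-100.0, 60.0, 60.0])
```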

  15. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret...
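Of the adsorption models such a chromatography description selects from, the simplest common choice is a Langmuir isotherm. A minimal sketch (parameter values invented, not from the article):

```python
# Langmuir adsorption isotherm: equilibrium adsorbed concentration q as a
# function of mobile-phase concentration c, with saturation capacity q_max
# and binding constant binding_k (both illustrative values here).
def langmuir_q(c, q_max, binding_k):
    """Equilibrium adsorbed concentration for mobile-phase concentration c."""
    return q_max * binding_k * c / (1.0 + binding_k * c)

# Half saturation occurs at c = 1 / binding_k; q approaches q_max for large c.
half = langmuir_q(1.0, q_max=2.0, binding_k=1.0)
```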

  16. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  17. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  18. Animal models and conserved processes

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-09-01

    Full Text Available Abstract Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes not only to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is

  19. Animal models and conserved processes.

    Science.gov (United States)

    Greek, Ray; Rice, Mark J

    2012-09-10

    The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Evolution through natural selection has employed components and processes not only to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes.
We conclude that even the presence of conserved processes is insufficient for inter-species extrapolation when the trait or response

  20. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes the amorphous-aggregation experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.
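As a toy version of the kind of kinetics such turbidimetry fits rest on (this is not the authors' model, only a sketch), one can let monomers deplete by a single second-order collision step and take turbidity as proportional to the aggregated mass:

```python
# Toy aggregation kinetics: dm/dt = -k * m^2 (Smoluchowski-like collision
# term), integrated with explicit Euler steps. Turbidity is taken as
# proportional to aggregated mass m0 - m. All values are illustrative.
def simulate_turbidity(m0=1.0, k=1.0, dt=1e-3, steps=5000):
    m = m0
    turbidity = []
    for _ in range(steps):
        m -= k * m * m * dt      # monomer depletion by pairwise collisions
        turbidity.append(m0 - m)
    return turbidity
```

The analytic solution m(t) = m0 / (1 + k m0 t) gives a saturating turbidity curve of the sigmoid-free shape typical of amorphous (as opposed to nucleated fibrillar) aggregation.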

  1. Models of memory: information processing.

    Science.gov (United States)

    Eysenck, M W

    1988-01-01

    A complete understanding of human memory will necessarily involve consideration of the active processes involved at the time of learning and of the organization and nature of representation of information in long-term memory. In addition to process and structure, it is important for theory to indicate the ways in which stimulus-driven and conceptually driven processes interact with each other in the learning situation. Not surprisingly, no existent theory provides a detailed specification of all of these factors. However, there are a number of more specific theories which are successful in illuminating some of the component structures and processes. The working memory model proposed by Baddeley and Hitch (1974) and modified subsequently has shown how the earlier theoretical construct of the short-term store should be replaced with the notion of working memory. In essence, working memory is a system which is used both to process information and to permit the transient storage of information. It comprises a number of conceptually distinct, but functionally interdependent components. So far as long-term memory is concerned, there is evidence of a number of different kinds of representation. Of particular importance is the distinction between declarative knowledge and procedural knowledge, a distinction which has received support from the study of amnesic patients. Kosslyn has argued for a distinction between literal representation and propositional representation, whereas Tulving has distinguished between episodic and semantic memories. While Tulving's distinction is perhaps the best known, there is increasing evidence that episodic and semantic memory differ primarily in content rather than in process, and so the distinction may be of less theoretical value than was originally believed.(ABSTRACT TRUNCATED AT 250 WORDS)

  2. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes with a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axon, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and biological question that requires the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary differential equations and partial differential equations; background in biology is not required. Students will gain knowledge on how to program with MATLAB without previous programming experience and how to use codes in order to test biological hypothesis.
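One of the book's opening topics, enzyme dynamics, can be sketched with the simplest Michaelis-Menten rate law integrated by explicit Euler steps (the book itself works in MATLAB; the Python below and its parameter values are illustrative only):

```python
# Michaelis-Menten substrate depletion: ds/dt = -Vmax * s / (Km + s),
# integrated with a fixed-step explicit Euler scheme. Parameters invented.
def simulate_substrate(s0=10.0, v_max=1.0, k_m=2.0, dt=0.01, t_end=5.0):
    s = s0
    for _ in range(int(t_end / dt)):
        s -= v_max * s / (k_m + s) * dt
    return s

s_final = simulate_substrate()  # substrate remaining at t_end
```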

  3. Modeling pellet impact drilling process

    Science.gov (United States)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high-velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling the process of pellet impact drilling, which creates the scientific and methodological basis for the engineering design of drilling operations under different geo-technical conditions.
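The energy argument behind the method can be sketched back-of-envelope (all figures below are hypothetical, not from the paper):

```python
# Power delivered to the rock face by a stream of pellets: each pellet
# carries E = m v^2 / 2, and the jet circulates many pellets per second.
def pellet_kinetic_power(pellet_mass_kg, velocity_ms, pellets_per_second):
    energy_per_pellet = 0.5 * pellet_mass_kg * velocity_ms ** 2  # joules
    return energy_per_pellet * pellets_per_second                # watts

# 10 g pellets at 50 m/s, 100 pellets per second (made-up numbers)
power = pellet_kinetic_power(pellet_mass_kg=0.01, velocity_ms=50.0,
                             pellets_per_second=100.0)
```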

  4. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing...... processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  5. A visual analysis of the process of process modeling

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.

    2015-01-01

    The construction of business process models has become an important requisite in the analysis and optimization of processes. The success of the analysis and optimization efforts heavily depends on the quality of the models. Therefore, a research domain emerged that studies the process of process

  6. Collapse models and perceptual processes

    International Nuclear Information System (INIS)

    Ghirardi, Gian Carlo; Romano, Raffaele

    2014-01-01

    Theories including a collapse mechanism were presented several years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merits derive from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems and for the reduction associated with measurement processes and the classical behavior of macroscopic objects. Since such theories qualify themselves not as new interpretations but as modifications of the standard theory, they can, in principle, be tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, it seems that recent technological developments allow at least precise limits to be put on the parameters characterizing the modifications of the evolution equation. Here we will simply mention some of the recent investigations in this direction, while we will mainly concentrate our attention on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures by means of standard macroscopic devices will be discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. We make it plausible, by discussing in detail a toy model, that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  7. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    Probabilistic properties of Cox processes of relevance for statistical modelling and inference are studied. Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties...... and point process operations such as thinning, displacements, and superpositioning. We also discuss how to simulate specific Cox processes....
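The thinning operation mentioned in the abstract also underlies a standard way to simulate point processes. A minimal sketch of simulating an inhomogeneous Poisson process on [0, T] by thinning (for a Cox process, the intensity function would itself first be drawn at random; here it is fixed for simplicity):

```python
import math
import random

# Lewis-Shedler thinning: generate candidates from a homogeneous Poisson
# process with rate rate_max, then accept each candidate t with
# probability rate_fn(t) / rate_max.
def thinning(rate_fn, rate_max, T, rng):
    points, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)            # next candidate arrival
        if t > T:
            return points
        if rng.random() < rate_fn(t) / rate_max:  # accept with prob rate/rate_max
            points.append(t)

# Example intensity (invented): lambda(t) = 2 + 2 sin(t), bounded by 4.
pts = thinning(lambda t: 2.0 + 2.0 * math.sin(t), 4.0, 100.0, random.Random(0))
```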

  8. Cupola Furnace Computer Process Model

    Energy Technology Data Exchange (ETDEWEB)

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing furnaces (electric) and the ability to melt less expensive metallic scrap than the competing furnaces. However the chemical and physical processes that take place in the cupola furnace are highly complex making it difficult to operate the furnace in optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004 under the auspices of the Department of Energy, the American Foundry Society and General Motors Corp. a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near optimum operating conditions with just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an ''Expert System'' to permit optimization in real time. The program has been combined with ''neural network'' programs to affect very easy scanning of a wide range of furnace operation. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems will be found in the ''Cupola Handbook''. Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  9. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...
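The grey-box combination of a deterministic (physical) part and a stochastic (residual) part can be sketched as an Euler-Maruyama integration of a one-state model; all names and values below are invented for illustration:

```python
import random

# One-state grey-box sketch: deterministic drift toward a set point plus a
# diffusion term, dX = k (setpoint - X) dt + sigma dW, integrated by
# Euler-Maruyama. Parameter values are arbitrary.
def euler_maruyama(x0=0.0, setpoint=1.0, k=2.0, sigma=0.1,
                   dt=0.01, steps=1000, seed=42):
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        drift = k * (setpoint - x)                       # white-box part
        noise = sigma * rng.gauss(0.0, 1.0) * dt ** 0.5  # stochastic residual
        x += drift * dt + noise
    return x

x_end = euler_maruyama()  # settles near the set point, with small noise
```

In an identification setting, k and sigma would be estimated from batch data rather than fixed by hand.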

  10. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  11. Analog modelling of obduction processes

    Science.gov (United States)

    Agard, P.; Zuo, X.; Funiciello, F.; Bellahsen, N.; Faccenna, C.; Savva, D.

    2012-04-01

    Obduction corresponds to one of plate tectonics oddities, whereby dense, oceanic rocks (ophiolites) are presumably 'thrust' on top of light, continental ones, as for the short-lived, almost synchronous Peri-Arabic obduction (which took place along thousands of km from Turkey to Oman in c. 5-10 Ma). Analog modelling experiments were performed to study the mechanisms of obduction initiation and test various triggering hypotheses (i.e., plate acceleration, slab hitting the 660 km discontinuity, ridge subduction; Agard et al., 2007). The experimental setup comprises (1) an upper mantle, modelled as a low-viscosity transparent Newtonian glucose syrup filling a rigid Plexiglas tank and (2) high-viscosity silicone plates (Rhodrosil Gomme with PDMS iron fillers to reproduce densities of continental or oceanic plates), located at the centre of the tank above the syrup to simulate the subducting and the overriding plates - and avoid friction on the sides of the tank. Convergence is simulated by pushing on a piston at one end of the model with velocities comparable to those of plate tectonics (i.e., in the range 1-10 cm/yr). The reference set-up includes, from one end to the other (~60 cm): (i) the piston, (ii) a continental margin containing a transition zone to the adjacent oceanic plate, (iii) a weakness zone with variable resistance and dip (W), (iv) an oceanic plate - with or without a spreading ridge, (v) a subduction zone (S) dipping away from the piston and (vi) an upper, active continental margin, below which the oceanic plate is being subducted at the start of the experiment (as is known to have been the case in Oman). Several configurations were tested and over thirty different parametric tests were performed. Special emphasis was placed on comparing different types of weakness zone (W) and the extent of mechanical coupling across them, particularly when plates were accelerated. 
Displacements, together with along-strike and across-strike internal deformation in all

  12. From business value model to coordination process model

    NARCIS (Netherlands)

    Fatemi, Hassan; Wieringa, Roelf J.; Poler, R.; van Sinderen, Marten J.; Sanchis, R.

    2009-01-01

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary

  13. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    Full Text Available The article outlines the application of the processing approach to the functional description of the designed IT system supporting the operations of the secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented in a manual manner (“as is”) and the target processes (“to be”), using the RFID technology for the purpose of their automation. Additionally, examples of applying the method of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency were presented. An extension of the process analysis method is the possibility of applying a warehouse of processes and process mining methods.

  14. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
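As a hypothetical illustration of correlation analysis over assessment results (the data and process-area names below are invented, and the paper's actual model is defined against CMMI and empirical improvement data):

```python
# Pearson correlation between assessment scores of process elements across
# several projects; strongly correlated elements are candidates for a
# joint improvement plan.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

scores = {                       # invented assessment data (scale 1-5)
    "requirements_mgmt": [2, 3, 4, 4, 5],
    "project_planning":  [2, 3, 3, 4, 5],
    "verification":      [5, 4, 3, 3, 2],
}
r = pearson(scores["requirements_mgmt"], scores["project_planning"])
```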

  15. Towards the Automated Annotation of Process Models

    NARCIS (Netherlands)

    Leopold, H.; Meilicke, C.; Fellmann, M.; Pittke, F.; Stuckenschmidt, H.; Mendling, J.

    2016-01-01

    Many techniques for the advanced analysis of process models build on the annotation of process models with elements from predefined vocabularies such as taxonomies. However, the manual annotation of process models is cumbersome and sometimes even hardly manageable taking the size of taxonomies into

  16. Business Process Modelling based on Petri nets

    Directory of Open Access Journals (Sweden)

    Qin Jianglong

    2017-01-01

    Full Text Available Business process modelling is the way business processes are expressed. Business process modelling is the foundation of business process analysis, reengineering, reorganization and optimization. It can not only help enterprises achieve internal information system integration and reuse, but also help them collaborate with external partners. Based on the prototype Petri net, this paper adds time and cost factors to form an extended generalized stochastic Petri net, a formal description of the business process. A semi-formalized business process modelling algorithm based on Petri nets is proposed. Finally, a case from a logistics company proves that the modelling algorithm is correct and effective.
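The token-firing semantics on which such Petri-net models rest can be sketched minimally (place and transition names below are invented; the paper's extended net additionally attaches time and cost to transitions):

```python
# A transition is enabled when every input place holds at least one token;
# firing consumes one token from each input and produces one in each output.
def fire(marking, transition):
    inputs, outputs = transition
    if any(marking.get(p, 0) < 1 for p in inputs):
        return None                     # transition not enabled
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

# Tiny invented business step: an incoming order becomes ready to ship.
receive_order = (["order_in"], ["ready_to_ship"])
m0 = {"order_in": 1}
m1 = fire(m0, receive_order)
```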

  17. Modeling process flow using diagrams

    NARCIS (Netherlands)

    Kemper, B.; de Mast, J.; Mandjes, M.

    2010-01-01

    In the practice of process improvement, tools such as the flowchart, the value-stream map (VSM), and a variety of ad hoc variants of such diagrams are commonly used. The purpose of this paper is to present a clear, precise, and consistent framework for the use of such flow diagrams in process

  18. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Heng [Pacific Northwest National Laboratory, Richland Washington USA; Ye, Ming [Department of Scientific Computing, Florida State University, Tallahassee Florida USA; Walker, Anthony P. [Environmental Sciences Division and Climate Change Science Institute, Oak Ridge National Laboratory, Oak Ridge Tennessee USA; Chen, Xingyuan [Pacific Northwest National Laboratory, Richland Washington USA

    2017-04-01

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
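The variance-based idea can be sketched in a toy setting (this is not the paper's exact formulation): a "recharge" process represented by two competing models, each with its own random parameter, and a process sensitivity measured as the share of total output variance explained by that process:

```python
import random
import statistics

# Draw recharge from one of two competing process models, chosen with
# equal prior probability; each model has its own parameter range.
def sample_recharge(rng):
    if rng.random() < 0.5:
        return rng.uniform(0.0, 1.0)   # process model A
    return rng.uniform(1.0, 2.0)       # process model B

def process_sensitivity(n=100_000, rng=None):
    rng = rng or random.Random(1)
    recharge = [sample_recharge(rng) for _ in range(n)]
    geology = [rng.gauss(0.0, 0.2) for _ in range(n)]   # second process (toy)
    y = [r + g for r, g in zip(recharge, geology)]
    # Because y = recharge + geology with independent terms,
    # Var(E[y | recharge]) reduces to Var(recharge).
    return statistics.variance(recharge) / statistics.variance(y)
```

Here the recharge variance folds together model choice and parameter uncertainty, mirroring the paper's point that both sources belong in the index.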

  19. Modeling grinding processes as micro processes

    African Journals Online (AJOL)

    eobe

    into two parts: static specific chip formation energy and dynamic specific chip formation ... the ratio of static normal chip formation force to static tangential chip formation force and the ratio ... grinding processing parameters to the friction coefficient between workpiece and grinding wheel. From equation (20), the calculation ...

  20. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  1. Modelling of Batch Process Operations

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli; Cameron, Ian; Gani, Rafiqul

    2011-01-01

    Here a batch cooling crystalliser is modelled and simulated as is a batch distillation system. In the batch crystalliser four operational modes of the crystalliser are considered, namely: initial cooling, nucleation, crystal growth and product removal. A model generation procedure is shown that s...

  2. Mathematical Modeling: A Structured Process

    Science.gov (United States)

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  3. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubinina

    2015-06-01

    Full Text Available The essence of process-oriented enterprise management is examined in the article. Owing to the complexity and differentiation of existing methods, as well as the specificity of the language and terminology of enterprise business process modeling, the content and types of the relevant information technologies are analyzed. The theoretical aspects of business process modeling are reviewed, and modern traditional modeling techniques that have found practical application are studied in a visualization model of a retailer's activity. The theoretical analysis of the modeling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes owing to its integrated systemological capabilities. A visualized simulation model of the retailer's business process "sales" ("as is") was designed using a combination of UFO-elements, with the aim of further practical formalization and optimization of the given business process.

  4. Modelling heat processing of dairy products

    NARCIS (Netherlands)

    Hotrum, N.; Fox, M.B.; Lieverloo, H.; Smit, E.; Jong, de P.; Schutyser, M.A.I.

    2010-01-01

    This chapter discusses the application of computer modelling to optimise the heat processing of milk. The chapter first reviews types of heat processing equipment used in the dairy industry. Then, the types of objectives that can be achieved using model-based process optimisation are discussed.

  5. How visual cognition influences process model comprehension

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2017-01-01

    Process analysts and other professionals extensively use process models to analyze business processes and identify performance improvement opportunities. Therefore, it is important that such models can be easily and properly understood. Previous research has mainly focused on two types of factors

  6. Modeling process flow using diagrams

    OpenAIRE

    Kemper, B.; de Mast, J.; Mandjes, M.

    2010-01-01

    In the practice of process improvement, tools such as the flowchart, the value-stream map (VSM), and a variety of ad hoc variants of such diagrams are commonly used. The purpose of this paper is to present a clear, precise, and consistent framework for the use of such flow diagrams in process improvement projects. The paper finds that traditional diagrams, such as the flowchart, the VSM, and OR-type of diagrams, have severe limitations, miss certain elements, or are based on implicit but cons...

  7. Steady-State Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    illustrate the “equation oriented” approach as well as the “sequential modular” approach to solving complex flowsheets for steady state applications. The applications include the Williams-Otto plant, the hydrodealkylation (HDA) of toluene, conversion of ethylene to ethanol and a bio-ethanol process....

  8. Numerical modelling of reflood processes

    International Nuclear Information System (INIS)

    Glynn, D.R.; Rhodes, N.; Tatchell, D.G.

    1983-01-01

    The use of a detailed computer model to investigate the effects of grid size and the choice of wall-to-fluid heat-transfer correlations on the predictions obtained for reflooding of a vertical heated channel is described. The model employs equations for the momentum and enthalpy of vapour and liquid and hence accounts for both thermal non-equilibrium and slip between the phases. Empirical correlations are used to calculate interphase and wall-to-fluid friction and heat-transfer as functions of flow regime and local conditions. The empirical formulae have remained fixed with the exception of the wall-to-fluid heat-transfer correlations. These have been varied according to the practices adopted in other computer codes used to model reflood, namely REFLUX, RELAP and TRAC. Calculations have been performed to predict the CSNI standard problem number 7, and the results are compared with experiment. It is shown that the results are substantially grid-independent, and that the choice of correlation has a significant influence on the general flow behaviour, the rate of quenching and on the maximum cladding temperature predicted by the model. It is concluded that good predictions of reflooding rates can be obtained with particular correlation sets. (author)

  9. Branching process models of cancer

    CERN Document Server

    Durrett, Richard

    2015-01-01

    This volume develops results on continuous time branching processes and applies them to study rate of tumor growth, extending classic work on the Luria-Delbruck distribution. As a consequence, the authors calculate the probability that mutations that confer resistance to treatment are present at detection and quantify the extent of tumor heterogeneity. As applications, the authors evaluate ovarian cancer screening strategies and give rigorous proofs for results of Haeno and Michor concerning tumor metastasis. These notes should be accessible to students who are familiar with Poisson processes and continuous time. Richard Durrett is mathematics professor at Duke University, USA. He is the author of 8 books, over 200 journal articles, and has supervised more than 40 Ph.D. students. Most of his current research concerns the applications of probability to biology: ecology, genetics, and most recently cancer.
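
    The continuous time branching processes described above can be illustrated with a minimal two-type (sensitive/resistant) birth-death simulation. This is a toy sketch, not the authors' model; the function name, rates, and the mutation probability `mu` are all invented for the example:

```python
import random

def resistant_at_detection(birth=1.0, death=0.3, mu=1e-3, detect=500, seed=1):
    """Embedded-chain simulation of a two-type branching process:
    a uniformly chosen cell divides (prob. birth/(birth+death)) or
    dies; a dividing sensitive cell produces a resistant daughter
    with probability mu.  Returns True if a resistant cell is present
    when the total population first reaches `detect` cells."""
    rng = random.Random(seed)
    s, r = 1, 0  # sensitive and resistant cell counts
    while 0 < s + r < detect:
        n = s + r
        picked_sensitive = rng.random() < s / n        # pick a cell uniformly
        if rng.random() < birth / (birth + death):     # it divides
            if picked_sensitive and rng.random() < mu:
                r += 1                                 # resistant daughter
            elif picked_sensitive:
                s += 1
            else:
                r += 1
        else:                                          # it dies
            if picked_sensitive:
                s -= 1
            else:
                r -= 1
    return s + r >= detect and r > 0

# crude estimate of P(resistance present at detection) over 200 replicates
p_hat = sum(resistant_at_detection(seed=k) for k in range(200)) / 200
```

    Runs that go extinct before detection contribute False, so `p_hat` estimates the joint probability of reaching detection size with at least one resistant cell.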

  10. Discovering Process Reference Models from Process Variants Using Clustering Techniques

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    2008-01-01

    In today's dynamic business world, success of an enterprise increasingly depends on its ability to react to changes in a quick and flexible way. In response to this need, process-aware information systems (PAIS) emerged, which support the modeling, orchestration and monitoring of business processes

  11. Systematic approach for the identification of process reference models

    CSIR Research Space (South Africa)

    Van Der Merwe, A

    2009-02-01

    Full Text Available Process models are used in different application domains to capture knowledge on the process flow. Process reference models (PRM) are used to capture reusable process models, which should simplify the identification process of process models...

  12. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always tied to an experimental case. Empirical validation is residual in nature, because its conclusions rest on comparisons between simulated outputs and experimental measurements. The methodology helps detect the failures of the simulation model, and it can also be used as a guide in the design of subsequent experiments. Three well-differentiated steps are involved. Sensitivity analysis: it can be performed with DSA, differential sensitivity analysis, and with MCSA, Monte-Carlo sensitivity analysis. Finding the optimal domains of the input parameters: a procedure based on Monte-Carlo methods and cluster techniques has been developed to find these domains. Residual analysis: this analysis has been carried out in both the time domain and the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation of a thermal simulation model of buildings in Spain is presented, studying the behavior of building components in a test cell of the LECE laboratory at CIEMAT. (Author) 17 refs
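
    The Monte-Carlo sensitivity analysis (MCSA) step can be illustrated with a toy stand-in for the detailed simulation model. The `thermal_model` function and all of its constants are invented for this sketch; only the MCSA pattern itself (sample each input around its nominal value, record the output spread) reflects the methodology:

```python
import random
import statistics

def thermal_model(u_wall, solar_gain, ach):
    """Toy stand-in for a detailed building simulation: steady-state
    indoor temperature rise (K) over ambient for one zone."""
    envelope_loss = u_wall * 25.0        # W/K for 25 m2 of wall (illustrative)
    vent_loss = ach * 0.33 * 50.0        # W/K for a 50 m3 volume (illustrative)
    return solar_gain / (envelope_loss + vent_loss)

def mc_sensitivity(n=2000, seed=0):
    """MCSA sketch: vary each input alone within a +/-10% uniform band
    around its nominal value and record the resulting output spread."""
    rng = random.Random(seed)
    nominal = {"u_wall": 0.5, "solar_gain": 300.0, "ach": 1.0}
    spread = {}
    for name in nominal:
        outs = []
        for _ in range(n):
            args = dict(nominal)
            args[name] *= rng.uniform(0.9, 1.1)
            outs.append(thermal_model(**args))
        spread[name] = statistics.stdev(outs)
    return spread

spread = mc_sensitivity()
ranking = sorted(spread, key=spread.get, reverse=True)  # most influential first
```

    The ranking points to the parameters whose uncertainty dominates the output, which is where subsequent experiments are best targeted.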

  13. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we want to present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore makes it possible to anticipate out-of-specification (OOS) events, identify critical process parameters, and take risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
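
    As a toy illustration of the Monte Carlo idea, not the authors' actual IPM, the following sketch propagates parameter variation through two hypothetical stacked unit operations and estimates an OOS probability. All names, distributions, and limits are invented:

```python
import random

def integrated_process_mc(n=5000, seed=42, spec_min=8.0):
    """MC over a toy two-unit integrated process model: an upstream
    titer (g/L) feeds a capture step whose yield drops when the column
    is overloaded, so upstream variation propagates into the final CQA
    surrogate.  Returns the estimated probability of an OOS batch."""
    rng = random.Random(seed)
    oos = 0
    for _ in range(n):
        titer = rng.gauss(10.0, 0.8)              # upstream process parameter
        load_factor = min(1.0, 12.0 / titer)      # overload penalty (invented)
        step_yield = rng.gauss(0.92, 0.02) * load_factor
        final = titer * step_yield                # CQA surrogate, g/L
        if final < spec_min:
            oos += 1
    return oos / n

p_oos = integrated_process_mc()
```

    Repeating the estimate with one parameter's variance tightened shows which PP contributes most to OOS risk, which is the criticality-assessment use mentioned above.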

  14. Process modelling on a canonical basis[Process modelling; Canonical modelling

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase-interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints is considered, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows an efficient solving strategy and a concise error diagnosis to be implemented. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it

  15. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

    Full Text Available Business process modelling and analysis is undoubtedly one of the most important parts of Applied (Business) Informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst's work is to create generally understandable, explicit and error-free models. If a process is properly described, the created models can be used as an input into deep analysis and optimization. It can be assumed that properly designed business process models (similarly to correctly written algorithms) contain characteristics that can be mathematically described, and that it should therefore be possible to create a tool that helps process analysts design proper models. As part of this review, a systematic literature review was conducted in order to find and analyse measures of business process model design and quality. It was found that this area had already been the subject of research investigation in the past. Thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed scientific publications and existing quality measures do not reflect all important attributes of business process model clarity, simplicity and completeness. Therefore it would be appropriate to add new measures of quality.

  16. Modelling income processes with lots of heterogeneity

    DEFF Research Database (Denmark)

    Browning, Martin; Ejrnæs, Mette; Alvarez, Javier

    2010-01-01

    We model earnings processes allowing for lots of heterogeneity across agents. We also introduce an extension to the linear ARMA model which allows the initial convergence in the long run to be different from that implied by the conventional ARMA model. This is particularly important for unit root...
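
    A minimal sketch of what "lots of heterogeneity" means in simulation terms: each agent gets its own persistence, shock scale, and fixed effect instead of sharing common ARMA parameters. This is not the authors' estimator; all names and distributions are illustrative:

```python
import random

def simulate_earnings(n_agents=200, periods=30, seed=7):
    """Simulate a toy earnings panel with agent-level heterogeneity:
    each agent draws an own AR(1) persistence rho, innovation scale
    sigma, and fixed effect alpha, then follows
    y_t = rho * y_{t-1} + eps_t, observed earnings = alpha + y_t."""
    rng = random.Random(seed)
    panel = []
    for _ in range(n_agents):
        rho = rng.uniform(0.3, 0.99)     # agent-specific persistence
        sigma = rng.uniform(0.05, 0.3)   # agent-specific shock scale
        alpha = rng.gauss(0.0, 0.5)      # agent fixed effect
        y, path = 0.0, []
        for _ in range(periods):
            y = rho * y + rng.gauss(0.0, sigma)
            path.append(alpha + y)
        panel.append(path)
    return panel

panel = simulate_earnings()
```

    With heterogeneous rho, agents converge to their long-run distributions at different speeds, which is the kind of initial-convergence behaviour a common-parameter ARMA model cannot capture.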

  17. Counting Processes for Retail Default Modeling

    DEFF Research Database (Denmark)

    Kiefer, Nicholas Maximilian; Larson, C. Erik

    in a discrete state space. In a simple case, the states could be default/non-default; in other models relevant for credit modeling the states could be credit scores or payment status (30 dpd, 60 dpd, etc.). Here we focus on the use of stochastic counting processes for mortgage default modeling, using data...

  18. Distillation modeling for a uranium refining process

    International Nuclear Information System (INIS)

    Westphal, B.R.

    1996-01-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process
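
    The equilibrium-expression approach to incremental batch distillation can be sketched as a stepwise Rayleigh-type calculation. The constant relative volatility and every number below are illustrative assumptions, not properties of the LiCl-KCl/uranium system:

```python
def rayleigh_batch(x0=0.9, alpha=50.0, steps=1000, frac_per_step=0.001):
    """Incremental batch (Rayleigh) distillation of a binary mixture,
    here a volatile component plus an effectively involatile heavy one.
    Equilibrium model: vapor mole fraction from a constant relative
    volatility alpha.  Returns a list of (moles left, liquid x)."""
    x, n = x0, 1.0
    history = [(n, x)]
    for _ in range(steps):
        y = alpha * x / (1.0 + (alpha - 1.0) * x)   # equilibrium vapor comp.
        dn = n * frac_per_step                      # moles distilled this step
        x = (n * x - y * dn) / (n - dn)             # component balance
        n -= dn
        history.append((n, x))
    return history

profile = rayleigh_batch()
```

    Because `y > x` whenever `alpha > 1`, the volatile component is progressively stripped overhead and the liquid composition drifts toward the heavy component, mirroring salt removal from the uranium-bearing residue.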

  19. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...
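
    A minimal sketch of a PDMP of the "switching dynamics" type mentioned above, using the gene-activity example: jump times are exponential, and the continuous variable follows a deterministic flow (integrated exactly here) between jumps. The rates and the model form are illustrative assumptions:

```python
import math
import random

def simulate_pdmp(t_end=100.0, k_on=0.1, k_off=0.2, prod=1.0, deg=0.05, seed=3):
    """PDMP with switching dynamics: a promoter jumps OFF->ON at rate
    k_on and ON->OFF at rate k_off; between jumps the protein level x
    follows the deterministic flow dx/dt = prod*state - deg*x, solved
    in closed form.  Returns the (time, x) path at jump instants."""
    rng = random.Random(seed)
    t, x, state = 0.0, 0.0, 0          # start OFF with no protein
    path = [(t, x)]
    while t < t_end:
        rate = k_on if state == 0 else k_off
        tau = min(rng.expovariate(rate), t_end - t)   # time to next jump
        target = prod * state / deg                   # fixed point of the flow
        x = target + (x - target) * math.exp(-deg * tau)
        t += tau
        state = 1 - state
        path.append((t, x))
    return path

path = simulate_pdmp()
```

    The protein level stays between 0 and the ON-state fixed point `prod/deg`, since each deterministic segment relaxes exponentially toward one of those two values.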

  20. Modeling closed nuclear fuel cycles processes

    Energy Technology Data Exchange (ETDEWEB)

    Shmidt, O.V. [A.A. Bochvar All-Russian Scientific Research Institute for Inorganic Materials, Rogova, 5a street, Moscow, 123098 (Russian Federation); Makeeva, I.R. [Zababakhin All-Russian Scientific Research Institute of Technical Physics, Vasiliev street 13, Snezhinsk, Chelyabinsk region, 456770 (Russian Federation); Liventsov, S.N. [Tomsk Polytechnic University, Tomsk, Lenin Avenue, 30, 634050 (Russian Federation)

    2016-07-01

    Computer models of processes are necessary for determination of optimal operating conditions for closed nuclear fuel cycle (NFC) processes. Computer models can be quickly changed in accordance with new and fresh data from experimental research. 3 kinds of process simulation are necessary. First, the VIZART software package is a balance model development used for calculating the material flow in technological processes. VIZART involves taking into account of equipment capacity, transport lines and storage volumes. Secondly, it is necessary to simulate the physico-chemical processes that are involved in the closure of NFC. The third kind of simulation is the development of software that allows the optimization, diagnostics and control of the processes which implies real-time simulation of product flows on the whole plant or on separate lines of the plant. (A.C.)

  1. MODELLING PURCHASING PROCESSES FROM QUALITY ASPECTS

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2008-12-01

    Full Text Available Management has a fundamental task to identify and direct primary and specific processes within purchasing function, applying the up-to-date information infrastructure. ISO 9001:2000 defines a process as a number of interrelated or interactive activities transforming inputs and outputs, and the "process approach" as a systematic identification in management processes employed with the organization and particularly - relationships among the processes. To direct a quality management system using process approach, the organization is to determine the map of its general (basic processes. Primary processes are determined on the grounds of their interrelationship and impact on satisfying customers' needs. To make a proper choice of general business processes, it is necessary to determine the entire business flow, beginning with the customer demand up to the delivery of products or service provided. In the next step the process model is to be converted into data model which is essential for implementation of the information system enabling automation, monitoring, measuring, inspection, analysis and improvement of key purchase processes. In this paper are given methodology and some results of investigation of development of IS for purchasing process from aspects of quality.

  2. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  3. Value-Oriented Coordination Process Modeling

    NARCIS (Netherlands)

    Fatemi, Hassan; van Sinderen, Marten J.; Wieringa, Roelf J.; Hull, Richard; Mendling, Jan; Tai, Stefan

    Business webs are collections of enterprises designed to jointly satisfy a consumer need. Designing business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business value and coordination process perspectives, and for mutually aligning these

  4. Process and Context in Choice Models

    DEFF Research Database (Denmark)

    Ben-Akiva, Moshe; Palma, André de; McFadden, Daniel

    2012-01-01

    We develop a general framework that extends choice models by including an explicit representation of the process and context of decision making. Process refers to the steps involved in decision making. Context refers to factors affecting the process, focusing in this paper on social networks. ... The extended choice framework includes more behavioral richness through the explicit representation of the planning process preceding an action and its dynamics and the effects of context (family, friends, and market) on the process leading to a choice, as well as the inclusion of new types of subjective data...

  5. Modeling of Dielectric Heating within Lyophilization Process

    Directory of Open Access Journals (Sweden)

    Jan Kyncl

    2014-01-01

    Full Text Available A process of lyophilization of paper books is modeled. The process of drying is controlled by a dielectric heating system. From the physical viewpoint, the task represents a 2D coupled problem described by two partial differential equations for the electric and temperature fields. The material parameters are supposed to be temperature-dependent functions. The continuous mathematical model is solved numerically. The methodology is illustrated with some examples whose results are discussed.

  6. Cost Models for MMC Manufacturing Processes

    Science.gov (United States)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory, and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  7. Physical and mathematical modelling of extrusion processes

    DEFF Research Database (Denmark)

    Arentoft, Mogens; Gronostajski, Z.; Niechajowics, A.

    2000-01-01

    The main objective of the work is to study the extrusion process using physical modelling and to compare the findings of the study with finite element predictions. The possibilities and advantages of the simultaneous application of both of these methods for the analysis of metal forming processes...

  8. Business Process Modeling Notation - An Overview

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

    Full Text Available BPMN is an industry standard created to offer a common, user-friendly notation to all participants in a business process. The present paper briefly presents the main features of this notation, as well as an interpretation of some of the main workflow patterns characterizing a business process.

  9. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Being represented in some appropriate representation language, the purpose of these models is to model process plants and plant automatics in a unified way to allow verification and computer aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report aims at investigating how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated from the potential of such a combination in safety analysis based on models comprising both software, hardware, and operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories (author) (ml)

  10. The (Mathematical) Modeling Process in Biosciences.

    Science.gov (United States)

    Torres, Nestor V; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures used during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this work the main features and shortcomings of the process are analyzed, and a set of rules that could help in the task of modeling any biological system is presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology.

  11. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments..., which can be expensive and time consuming. An alternative approach is the use of a systematic model-based framework according to an established work-flow in product-process design, replacing some of the time consuming and/or repetitive experimental steps. The advantages of the use of a model... Illustrative examples highlighting the need for efficient model-based systems will be presented, where the need for predictive models for innovative chemical product-process design will be highlighted. The examples will cover aspects of chemical product-process design where the idea of the grand...

  12. Extending Model Checking To Object Process Validation

    NARCIS (Netherlands)

    van Rein, H.

    2002-01-01

    Object-oriented techniques allow the gathering and modelling of system requirements in terms of an application area. The expression of data and process models at that level is a great asset in communication with non-technical people in that area, but it does not necessarily lead to consistent

  13. Hierarchical Structured Model for Nonlinear Dynamical Processes ...

    African Journals Online (AJOL)

    The mathematical representation of the process, in this context, is given by a set of linear stochastic differential equations (SDEs) with unique solutions. The problem of realization is that of constructing the dynamical system by looking at the problem of scientific model building. In model building, one must be able to calculate the ...

  14. Filament winding cylinders. I - Process model

    Science.gov (United States)

    Lee, Soo-Yong; Springer, George S.

    1990-01-01

    A model was developed which describes the filament winding process of composite cylinders. The model relates the significant process variables such as winding speed, fiber tension, and applied temperature to the thermal, chemical and mechanical behavior of the composite cylinder and the mandrel. Based on the model, a user friendly code was written which can be used to calculate (1) the temperature in the cylinder and the mandrel, (2) the degree of cure and viscosity in the cylinder, (3) the fiber tensions and fiber positions, (4) the stresses and strains in the cylinder and in the mandrel, and (5) the void diameters in the cylinder.
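
    The degree-of-cure and viscosity calculation listed under (2) can be illustrated with a toy cure-kinetics sub-model of the kind such a process code integrates. The Arrhenius constants and the autocatalytic rate form below are illustrative, not fitted to any real resin system:

```python
import math

def cure_and_viscosity(temp_c=120.0, dt=1.0, t_end=3600.0):
    """Toy cure-kinetics sub-model: degree of cure alpha advances at an
    autocatalytic rate with an Arrhenius rate constant, and viscosity
    rises exponentially with cure.  Explicit Euler integration in
    seconds; all constants are invented for illustration."""
    R, Ea, A = 8.314, 60e3, 5e5            # J/(mol K), J/mol, 1/s
    T = temp_c + 273.15
    k = A * math.exp(-Ea / (R * T))        # Arrhenius rate constant, 1/s
    alpha, t = 0.0, 0.0
    while t < t_end and alpha < 0.99:
        dalpha = k * (1.0 - alpha) * (0.1 + alpha) * dt  # autocatalytic form
        alpha = min(alpha + dalpha, 0.99)
        t += dt
    mu = 1.0 * math.exp(8.0 * alpha)       # Pa*s, rises with cure (invented)
    return alpha, mu
```

    Raising the applied temperature increases the rate constant, so cure (and hence viscosity) advances at least as far in the same time, which is the coupling between applied temperature and chemical state that the winding model tracks.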

  15. Modeling Aspects of Activated Sludge Processes Part II: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    International Nuclear Information System (INIS)

    AbdElHaleem, H.S.; EI-Ahwany, A. H.; Ibrahim, H.I.; Ibrahim, G.

    2004-01-01

    Mathematical process modeling and the biokinetics of the activated sludge process were reviewed, considering different types of models. The task group models ASM1, ASM2, and ASM3 of Henze et al. were evaluated, considering the conditions of each model and the different processes of which each model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase; in this type of model, the internal mass transfer inside the flocs is neglected, and hence the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model, which considers mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.
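
    ASM-type biokinetics are built on Monod-form rate expressions; the homogeneous (one-phase) simplification discussed above can be sketched as a batch Monod model with no floc mass-transfer resistance. All parameter values are illustrative, not calibrated:

```python
def monod_batch(s0=200.0, x0=50.0, mu_max=0.25, ks=20.0,
                y=0.5, dt=0.01, t_end=48.0):
    """Homogeneous-type activated sludge biokinetics: Monod growth with
    no internal floc mass-transfer limitation (the simplification the
    review flags as a source of inaccurate kinetic parameters).
    s = substrate (mg/L), x = biomass (mg/L); explicit Euler in hours."""
    s, x, t = s0, x0, 0.0
    while t < t_end and s > 1e-6:
        mu = mu_max * s / (ks + s)       # Monod specific growth rate, 1/h
        ds = -(mu / y) * x * dt          # substrate uptake
        dx = mu * x * dt                 # biomass growth
        s = max(s + ds, 0.0)
        x += dx
        t += dt
    return s, x

s_end, x_end = monod_batch()
```

    Because `dx = -y * ds` at every step, the biomass produced equals the yield times the substrate consumed, a mass balance that a heterogeneous model would modify via diffusion limitation inside the flocs.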

  16. Process modeling study of the CIF incinerator

    International Nuclear Information System (INIS)

    Hang, T.

    1995-01-01

    The Savannah River Site (SRS) plans to begin operating the Consolidated Incineration Facility (CIF) in 1996. The CIF will treat liquid and solid low-level radioactive, mixed, and RCRA hazardous wastes generated at SRS. In addition to experimental test programs, process modeling was applied to provide guidance in the areas of safety, environmental regulation compliance, process improvement, and optimization. A steady-state flowsheet model was used to calculate material/energy balances and to track key chemical constituents throughout the process units. Dynamic models were developed to predict the CIF transient characteristics in normal and abnormal operation scenarios. Predictions include the rotary kiln heat transfer, dynamic responses of the CIF to fluctuations in the solid waste feed or upsets in the system equipment, performance of the control system, air inleakage in the kiln, etc. This paper reviews the modeling study performed to assist in the deflagration risk assessment

  17. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the subject-features development of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logic and regularity of the development of the process; discreteness (stageability) indicates the qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Two steps for singling out a particular stage, and an algorithm for developing an integrative model of it, are offered. The suggested conclusions might be of use for further theoretical research, analyses of educational practices, and realistic prediction of pedagogical phenomena.

  18. From Business Value Model to Coordination Process Model

    Science.gov (United States)

    Fatemi, Hassan; van Sinderen, Marten; Wieringa, Roel

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary for a good e-business design, but these activities have different goals and use different concepts. Nevertheless, the resulting models should be consistent with each other because they refer to the same system from different perspectives. Hence, checking the consistency between these models or producing one based on the other would be of high value. In this paper we discuss the issue of achieving consistency in multi-level e-business design and give guidelines to produce consistent coordination process models from business value models in a stepwise manner.

  19. Numerical modeling of atmospheric washout processes

    International Nuclear Information System (INIS)

    Bayer, D.; Beheng, K.D.; Herbert, F.

    1987-01-01

    For the washout of particles from the atmosphere by clouds and rain, one must distinguish between processes active in the first phase of cloud development, when condensation nuclei grow in saturated air (Nucleation Aerosol Scavenging, NAS), and processes active during subsequent cloud development. In the latter case particles are collected by cloud droplets or by falling raindrops via collision (Collision Aerosol Scavenging, CAS). The physics of both processes is described. For the CAS process a numerical model is presented. The report contains a documentation of the mathematical equations and the computer programs (FORTRAN). (KW) [de
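    The CAS stage described above is often summarized by a first-order scavenging coefficient, so that below-cloud particle concentration obeys dC/dt = -ΛC. A minimal Python sketch, in which the rate value and step size are illustrative assumptions and not taken from the report:

```python
import math

def washout(c0, lam, t, dt=1.0):
    """First-order below-cloud scavenging dC/dt = -lam * C,
    integrated with explicit Euler steps (illustrative sketch)."""
    c = c0
    for _ in range(int(t / dt)):
        c += -lam * c * dt
    return c

# one hour of rain with a hypothetical scavenging coefficient of 1e-4 s^-1
c_num = washout(100.0, 1e-4, 3600.0, dt=1.0)
c_exact = 100.0 * math.exp(-1e-4 * 3600.0)   # analytic solution for comparison
print(round(c_num, 2), round(c_exact, 2))
```

    With a small enough step the Euler result stays close to the analytic exponential decay, which is why this simple parameterization is a common stand-in for the full collision model.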

  20. Modeling nonhomogeneous Markov processes via time transformation.

    Science.gov (United States)

    Hubbard, R A; Inoue, L Y T; Fann, J R

    2008-09-01

    Longitudinal studies are a powerful tool for characterizing the course of chronic disease. These studies are usually carried out with subjects observed at periodic visits giving rise to panel data. Under this observation scheme the exact times of disease state transitions and sequence of disease states visited are unknown and Markov process models are often used to describe disease progression. Most applications of Markov process models rely on the assumption of time homogeneity, that is, that the transition rates are constant over time. This assumption is not satisfied when transition rates depend on time from the process origin. However, limited statistical tools are available for dealing with nonhomogeneity. We propose models in which the time scale of a nonhomogeneous Markov process is transformed to an operational time scale on which the process is homogeneous. We develop a method for jointly estimating the time transformation and the transition intensity matrix for the time transformed homogeneous process. We assess maximum likelihood estimation using the Fisher scoring algorithm via simulation studies and compare performance of our method to homogeneous and piecewise homogeneous models. We apply our methodology to a study of delirium progression in a cohort of stem cell transplantation recipients and show that our method identifies temporal trends in delirium incidence and recovery.
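    The operational-time idea can be illustrated with the simplest possible case: a two-state survival process whose hazard is nonhomogeneous in calendar time but constant on the transformed scale. The Weibull-type hazard and all parameter values below are hypothetical illustrations, not the authors' model:

```python
import math, random

def survival_nonhom(lam0, a, t):
    """Nonhomogeneous hazard lam(t) = lam0 * a * t**(a-1) (Weibull-type).
    On the operational time scale s = t**a the process is homogeneous with
    constant rate lam0, so the survival function is exp(-lam0 * t**a)."""
    return math.exp(-lam0 * t ** a)

def simulate_event_time(lam0, a, rng):
    """Sample on the homogeneous (operational) scale, then invert s = t**a."""
    s = rng.expovariate(lam0)      # exponential waiting time on operational scale
    return s ** (1.0 / a)          # map back to calendar time

rng = random.Random(0)
n = 200_000
t_star = 1.5
frac = sum(simulate_event_time(0.5, 2.0, rng) > t_star for _ in range(n)) / n
print(round(frac, 3), round(survival_nonhom(0.5, 2.0, t_star), 3))
```

    The Monte Carlo survival fraction matches the closed form, showing why estimating the time transformation jointly with a homogeneous intensity recovers the nonhomogeneous dynamics.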

  1. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  2. Causally nonseparable processes admitting a causal model

    International Nuclear Information System (INIS)

    Feix, Adrien; Araújo, Mateus; Brukner, Caslav

    2016-01-01

    A recent framework of quantum theory with no global causal order predicts the existence of ‘causally nonseparable’ processes. Some of these processes produce correlations incompatible with any causal order (they violate so-called ‘causal inequalities’ analogous to Bell inequalities) while others do not (they admit a ‘causal model’ analogous to a local model). Here we show for the first time that bipartite causally nonseparable processes with a causal model exist, and give evidence that they have no clear physical interpretation. We also provide an algorithm to generate processes of this kind and show that they have nonzero measure in the set of all processes. We demonstrate the existence of processes which stop violating causal inequalities but are still causally nonseparable when mixed with a certain amount of ‘white noise’. This is reminiscent of the behavior of Werner states in the context of entanglement and nonlocality. Finally, we provide numerical evidence for the existence of causally nonseparable processes which have a causal model even when extended with an entangled state shared among the parties. (paper)

  3. Stochastic differential equation model to Prendiville processes

    International Nuclear Information System (INIS)

    Granita; Bahar, Arifah

    2015-01-01

    The Prendiville process is another variation of the logistic model which assumes a linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in a finite interval. The continuous time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work started with the forward Kolmogorov equation of the continuous time Markov chain of the Prendiville process. It was then formulated in the form of a central-difference approximation. The approximation was then used in the Fokker-Planck equation in relation to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance functions of the Prendiville process could be easily found from the explicit solution.

  4. Stochastic differential equation model to Prendiville processes

    Energy Technology Data Exchange (ETDEWEB)

    Granita, E-mail: granitafc@gmail.com [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); Bahar, Arifah [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); UTM Center for Industrial & Applied Mathematics (UTM-CIAM) (Malaysia)

    2015-10-22

    The Prendiville process is another variation of the logistic model which assumes a linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in a finite interval. The continuous time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work started with the forward Kolmogorov equation of the continuous time Markov chain of the Prendiville process. It was then formulated in the form of a central-difference approximation. The approximation was then used in the Fokker-Planck equation in relation to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance functions of the Prendiville process could be easily found from the explicit solution.
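    An SDE approximation of this kind can be sketched with an Euler-Maruyama scheme. The birth and death rates below are a hypothetical linear parameterization chosen only for illustration; the drift is the mean jump rate and the squared diffusion is the total jump rate, the standard CTMC-to-SDE recipe:

```python
import math, random

def prendiville_em(x0, lam, mu, N, T, dt, rng):
    """Euler-Maruyama for an SDE approximation of a Prendiville-type
    birth-death process. Hypothetical rates: birth lam*(N-x), death mu*x,
        dX = [lam*(N-X) - mu*X] dt + sqrt(lam*(N-X) + mu*X) dW."""
    x = float(x0)
    for _ in range(int(T / dt)):
        drift = lam * (N - x) - mu * x
        diff2 = max(lam * (N - x) + mu * x, 0.0)
        x += drift * dt + math.sqrt(diff2 * dt) * rng.gauss(0.0, 1.0)
        x = min(max(x, 0.0), N)    # keep paths inside the finite interval
    return x

rng = random.Random(1)
paths = [prendiville_em(10, 0.8, 0.2, 50, 5.0, 0.01, rng) for _ in range(2000)]
mean = sum(paths) / len(paths)
# mean ODE: m' = lam*N - (lam+mu)*m, so m(5) = 40 - 30*exp(-5) for these values
print(round(mean, 2))
```

    Averaging many paths recovers the deterministic mean of the linear drift, which is the sense in which the SDE reproduces the CTMC's mean and variance functions.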

  5. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    Full Text Available This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before the buying behavior and the perceived performance after purchase. An experiment is designed to measure expectation disconfirmation effects, and the collected data are used to estimate overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. The model has applications in business marketing for managing relationship satisfaction.

  6. Chain binomial models and binomial autoregressive processes.

    Science.gov (United States)

    Weiss, Christian H; Pollett, Philip K

    2012-09-01

    We establish a connection between a class of chain-binomial models of use in ecology and epidemiology and binomial autoregressive (AR) processes. New results are obtained for the latter, including expressions for the lag-conditional distribution and related quantities. We focus on two types of chain-binomial model, extinction-colonization and colonization-extinction models, and present two approaches to parameter estimation. The asymptotic distributions of the resulting estimators are studied, as well as their finite-sample performance, and we give an application to real data. A connection is made with standard AR models, which also has implications for parameter estimation. © 2011, The International Biometric Society.
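    A binomial AR(1) process of the kind connected to chain-binomial models can be simulated directly with binomial thinning. The parameter values below are illustrative; the stationary-mean formula N*beta/(1 - alpha + beta) is the standard one for this model:

```python
import random

def bar1(n_sites, alpha, beta, T, rng):
    """Binomial AR(1): X_t = alpha o X_{t-1} + beta o (N - X_{t-1}),
    where 'o' is binomial thinning: each occupied site survives with
    probability alpha, each empty site is colonized with probability beta."""
    def binom(n, p):
        return sum(rng.random() < p for _ in range(n))
    x = n_sites // 2
    xs = []
    for _ in range(T):
        x = binom(x, alpha) + binom(n_sites - x, beta)
        xs.append(x)
    return xs

rng = random.Random(42)
xs = bar1(20, 0.6, 0.2, 50_000, rng)
mean = sum(xs) / len(xs)
print(round(mean, 2))   # stationary mean = 20*0.2/(1 - 0.6 + 0.2) ~ 6.67
```

    The extinction-colonization reading is direct: alpha is per-site persistence and beta per-site colonization, so the long-run occupancy settles at the stationary mean above.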

  7. A neurolinguistic model of grammatical construction processing.

    Science.gov (United States)

    Dominey, Peter Ford; Hoen, Michel; Inui, Toshio

    2006-12-01

    One of the functions of everyday human language is to communicate meaning. Thus, when one hears or reads the sentence, "John gave a book to Mary," some aspect of an event concerning the transfer of possession of a book from John to Mary is (hopefully) transmitted. One theoretical approach to language referred to as construction grammar emphasizes this link between sentence structure and meaning in the form of grammatical constructions. The objective of the current research is to (1) outline a functional description of grammatical construction processing based on principles of psycholinguistics, (2) develop a model of how these functions can be implemented in human neurophysiology, and then (3) demonstrate the feasibility of the resulting model in processing languages of typologically diverse natures, that is, English, French, and Japanese. In this context, particular interest will be directed toward the processing of novel compositional structure of relative phrases. The simulation results are discussed in the context of recent neurophysiological studies of language processing.

  8. Similarity metrics for surgical process models.

    Science.gov (United States)

    Neumuth, Thomas; Loebe, Frank; Jannin, Pierre

    2012-01-01

    The objective of this work is to introduce a set of similarity metrics for comparing surgical process models (SPMs). SPMs are progression models of surgical interventions that support quantitative analyses of surgical activities, supporting systems engineering or process optimization. Five different similarity metrics are presented and proven. These metrics deal with several dimensions of process compliance in surgery, including granularity, content, time, order, and frequency of surgical activities. The metrics were experimentally validated using 20 clinical data sets each for cataract interventions, craniotomy interventions, and supratentorial tumor resections. The clinical data sets were controllably modified in simulations, which were iterated ten times, resulting in a total of 600 simulated data sets. The simulated data sets were subsequently compared to the original data sets to empirically assess the predictive validity of the metrics. We show that the results of the metrics for the surgical process models correlate significantly with the degree of modification; the metrics thus meet predictive validity. The clinical use of the metrics was demonstrated by assessing the learning curves of observers during surgical process model acquisition. Measuring similarity between surgical processes is a complex task. However, metrics for computing the similarity between surgical process models are needed in many uses in the field of medical engineering. These metrics are essential whenever two SPMs need to be compared, such as during the evaluation of technical systems, the education of observers, or the determination of surgical strategies. These metrics are key figures that provide a solid base for medical decisions, such as during validation of sensor systems for use in operating rooms in the future. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. A process algebra model of QED

    International Nuclear Information System (INIS)

    Sulis, William

    2016-01-01

    The process algebra approach to quantum mechanics posits a finite, discrete, determinate ontology of primitive events which are generated by processes (in the sense of Whitehead). In this ontology, primitive events serve as elements of an emergent space-time and of emergent fundamental particles and fields. Each process generates a set of primitive elements, using only local information, causally propagated as a discrete wave, forming a causal space termed a causal tapestry. Each causal tapestry forms a discrete and finite sampling of an emergent causal manifold (space-time) M and emergent wave function. Interactions between processes are described by a process algebra which possesses 8 commutative operations (sums and products) together with a non-commutative concatenation operator (transitions). The process algebra possesses a representation via nondeterministic combinatorial games. The process algebra connects to quantum mechanics through the set valued process and configuration space covering maps, which associate each causal tapestry with sets of wave functions over M. Probabilities emerge from interactions between processes. The process algebra model has been shown to reproduce many features of the theory of non-relativistic scalar particles to a high degree of accuracy, without paradox or divergences. This paper extends the approach to a semi-classical form of quantum electrodynamics. (paper)

  10. Determinantal point process models on the sphere

    DEFF Research Database (Denmark)

    Møller, Jesper; Nielsen, Morten; Porcu, Emilio

    We consider determinantal point processes on the d-dimensional unit sphere Sd . These are finite point processes exhibiting repulsiveness and with moment properties determined by a certain determinant whose entries are specified by a so-called kernel which we assume is a complex covariance function...... and eigenfunctions in a spectral representation for the kernel, and we figure out how repulsive isotropic DPPs can be. Moreover, we discuss the shortcomings of adapting existing models for isotropic covariance functions and consider strategies for developing new models, including a useful spectral approach....

  11. Performance assessment modeling of pyrometallurgical process wasteforms

    International Nuclear Information System (INIS)

    Nutt, W.M.; Hill, R.N.; Bullen, D.B.

    1995-01-01

    Performance assessment analyses have been completed to estimate the behavior of high-level nuclear wasteforms generated from the pyrometallurgical processing of liquid metal reactor (LMR) and light water reactor (LWR) spent nuclear fuel. Waste emplaced in the proposed repository at Yucca Mountain is investigated as the basis for the study. The resulting cumulative actinide and fission product releases to the accessible environment within a 100,000 year period from the various pyrometallurgical process wasteforms are compared to those of directly disposed LWR spent fuel using the same total repository system model. The impact of differing radionuclide transport models on the overall release characteristics is investigated

  12. Retort process modelling for Indian traditional foods.

    Science.gov (United States)

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack food is typically a heterogeneous recipe that incorporates varieties of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate cold point temperature. Initial process conditions, retort temperature and % solid content were the significantly influential independent variables. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot based sweet product) and Upama (wheat based snack product). The predicted and experimental temperature profiles matched within ±10 % error, which is a good match considering the food is a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize the retorting of various Indian traditional vegetarian foods.

  13. Models of transport processes in concrete

    International Nuclear Information System (INIS)

    Pommersheim, J.M.; Clifton, J.R.

    1991-01-01

    An approach being considered by the US Nuclear Regulatory Commission for disposal of low-level radioactive waste is to place the waste forms in concrete vaults buried underground. The vaults would need a service life of 500 years. Approaches for predicting the service life of concrete of such vaults include the use of mathematical models. Mathematical models are presented in this report for the major degradation processes anticipated for the concrete vaults, which are corrosion of steel reinforcement, sulfate attack, acid attack, and leaching. The models mathematically represent rate controlling processes including diffusion, convection, and reaction and sorption of chemical species. These models can form the basis for predicting the life of concrete under in-service conditions. 33 refs., 6 figs., 7 tabs
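    The diffusion-controlled transport models mentioned above can be illustrated with Fick's second law for ion ingress into concrete, solved by an explicit finite-difference scheme and checked against the semi-infinite analytic solution. The diffusivity, geometry, and boundary conditions below are generic assumptions, not values from the report:

```python
import math

def diffuse_1d(c_surface, D, L, nx, t_end, dt):
    """Explicit finite differences for Fick's second law dC/dt = D d2C/dx2
    on [0, L], fixed surface concentration at x=0, zero flux at x=L
    (a sketch of chloride/sulfate ingress into a concrete wall)."""
    dx = L / (nx - 1)
    assert D * dt / dx**2 <= 0.5, "explicit scheme stability limit"
    c = [0.0] * nx
    c[0] = c_surface
    for _ in range(int(t_end / dt)):
        new = c[:]
        for i in range(1, nx - 1):
            new[i] = c[i] + D * dt / dx**2 * (c[i+1] - 2*c[i] + c[i-1])
        new[-1] = new[-2]        # zero-flux boundary
        new[0] = c_surface
        c = new
    return c

# semi-infinite analytic solution: C = Cs * erfc(x / (2*sqrt(D*t)))
D, t = 1e-12, 10 * 365 * 24 * 3600.0   # ~1e-12 m^2/s over 10 years (assumed)
c = diffuse_1d(1.0, D, 0.1, 101, t, 2e5)
x = 0.02                                # 2 cm depth
exact = math.erfc(x / (2.0 * math.sqrt(D * t)))
print(round(c[20], 3), round(exact, 3))
```

    Convection, reaction, and sorption terms would be added to the same discretized balance; the pure-diffusion case is the baseline against which those extensions are compared.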

  14. Deterministic geologic processes and stochastic modeling

    International Nuclear Information System (INIS)

    Rautman, C.A.; Flint, A.L.

    1991-01-01

    Recent outcrop sampling at Yucca Mountain, Nevada, has produced significant new information regarding the distribution of physical properties at the site of a potential high-level nuclear waste repository. Consideration of the spatial distribution of measured values and geostatistical measures of spatial variability indicates that there are a number of widespread deterministic geologic features at the site that have important implications for numerical modeling of such performance aspects as ground water flow and radionuclide transport. These deterministic features have their origin in the complex, yet logical, interplay of a number of deterministic geologic processes, including magmatic evolution; volcanic eruption, transport, and emplacement; post-emplacement cooling and alteration; and late-stage (diagenetic) alteration. Because the geologic processes responsible for the formation of Yucca Mountain are relatively well understood and operate on a more-or-less regional scale, understanding of these processes can be used in modeling the physical properties and performance of the site. Information reflecting these deterministic geologic processes may be incorporated into the modeling program explicitly, using geostatistical concepts such as soft information, or implicitly, through the adoption of a particular approach to modeling. It is unlikely that any single representation of physical properties at the site will be suitable for all modeling purposes. Instead, the same underlying physical reality will need to be described many times, each in a manner conducive to assessing specific performance issues.

  15. A Mathematical Model of Cigarette Smoldering Process

    Directory of Open Access Journals (Sweden)

    Chen P

    2014-12-01

    Full Text Available A mathematical model for a smoldering cigarette has been proposed. In the analysis of the cigarette combustion and pyrolysis processes, a receding burning front is defined, which has a constant temperature (~450 °C and divides the cigarette into two zones, the burning zone and the pyrolysis zone. The char combustion processes in the burning zone and the pyrolysis of virgin tobacco and evaporation of water in the pyrolysis zone are included in the model. The hot gases flowing out of the burning zone are assumed to leave as sidestream smoke during smoldering. The internal heat transport is characterized by effective thermal conductivities in each zone. Thermal conduction of the cigarette paper and convective and radiative heat transfer at the outer surface were also considered. The governing partial differential equations were solved using an integral method. Model predictions of smoldering speed as well as temperature and density profiles in the pyrolysis zone for different kinds of cigarettes were found to agree with the experimental data. The model also predicts the coal length and the maximum coal temperatures during smoldering conditions. The model provides a relatively fast and efficient way to simulate the cigarette burning processes. It offers a practical tool for exploring important parameters for cigarette smoldering processes, such as tobacco components, properties of cigarette paper, and heat generation in the burning zone and its dependence on the mass burn rate.

  16. Modeling of Reaction Processes Controlled by Diffusion

    International Nuclear Information System (INIS)

    Revelli, Jorge

    2003-01-01

    Stochastic modeling is quite powerful in science and technology. The techniques derived from this process have been used with great success in laser theory, biological systems and chemical reactions. Besides, they provide a theoretical framework for the analysis of experimental results in the field of particle diffusion in ordered and disordered materials. In this work we analyze transport processes in one-dimensional fluctuating media, which are media that change their state in time. This fact induces changes in the movements of the particles, giving rise to different phenomena and dynamics that will be described and analyzed in this work. We present some random walk models to describe these fluctuating media. These models include state transitions governed by different dynamical processes. We also analyze the trapping problem in a lattice by means of a simple model which predicts a resonance-like phenomenon. Also we study effective diffusion processes over surfaces due to random walks in the bulk. We consider different boundary conditions and transition movements. We derive expressions that describe diffusion behaviors constrained by bulk restrictions and the dynamics of the particles. Finally, it is important to mention that the theoretical results obtained from the models proposed in this work are compared with Monte Carlo simulations. We find, in general, excellent agreement between the theory and the simulations.

  17. Internet User Behaviour Model Discovery Process

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The Academy of Economic Studies has more than 45000 students and about 5000 computers with Internet access which are connected to AES network. Students can access internet on these computers through a proxy server which stores information about the way the Internet is accessed. In this paper, we describe the process of discovering internet user behavior models by analyzing proxy server raw data and we emphasize the importance of such models for the e-learning environment.

  18. Process model development for optimization of forged disk manufacturing processes

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, C.E.; Gunasekera, J.S. [Ohio Univ., Athens, OH (United States). Center for Advanced Materials Processing; Malas, J.C. [Wright Labs., Wright Patterson AFB, OH (United States). Materials Directorate

    1997-12-31

    This paper addresses the development of a system which will enable the optimization of an entire processing sequence for a forged part. Typically such a sequence may involve several stages and alternative routes of manufacturing a given part. It is important that such a system be optimized globally (rather than locally, as is the current practice) in order to achieve improvements in affordability, producibility, and performance. This paper demonstrates the development of a simplified forging model, discusses techniques for searching and reducing a very large design space, and presents an objective function to evaluate the cost of a design sequence.

  19. Derivative processes for modelling metabolic fluxes

    Science.gov (United States)

    Žurauskienė, Justina; Kirk, Paul; Thorne, Thomas; Pinney, John; Stumpf, Michael

    2014-01-01

    Motivation: One of the challenging questions in modelling biological systems is to characterize the functional forms of the processes that control and orchestrate molecular and cellular phenotypes. Recently proposed methods for the analysis of metabolic pathways, for example, dynamic flux estimation, can only provide estimates of the underlying fluxes at discrete time points but fail to capture the complete temporal behaviour. To describe the dynamic variation of the fluxes, we additionally require the assumption of specific functional forms that can capture the temporal behaviour. However, it also remains unclear how to address the noise which might be present in experimentally measured metabolite concentrations. Results: Here we propose a novel approach to modelling metabolic fluxes: derivative processes that are based on multiple-output Gaussian processes (MGPs), which are a flexible non-parametric Bayesian modelling technique. The main advantages of the MGP approach include the natural non-parametric representation of the fluxes and the ability to impute missing data between the measurements. Our derivative process approach allows us to model changes in metabolite derivative concentrations and to characterize the temporal behaviour of metabolic fluxes from time course data. Because the derivative of a Gaussian process is itself a Gaussian process, we can readily link metabolite concentrations to metabolic fluxes and vice versa. Here we discuss how this can be implemented in an MGP framework and illustrate its application to simple models, including nitrogen metabolism in Escherichia coli. Availability and implementation: R code is available from the authors upon request. Contact: j.norkunaite@imperial.ac.uk; m.stumpf@imperial.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24578401
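    The key fact exploited above, that the derivative of a Gaussian process is again a Gaussian process, can be sketched in a few lines: differentiating the kernel gives the cross-covariance needed to read off posterior derivative (flux-like) estimates from noisy observations. The RBF kernel, test function, and hyperparameters below are illustrative assumptions, not the paper's model:

```python
import numpy as np

def rbf(a, b, ell=1.0, sf=1.0):
    """Squared-exponential kernel k(x, x') = sf^2 * exp(-(x-x')^2 / (2 ell^2))."""
    d = a[:, None] - b[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell) ** 2)

def drbf(a, b, ell=1.0, sf=1.0):
    """Cross-covariance cov(f'(a), f(b)) = d/da k(a, b) for the RBF kernel."""
    d = a[:, None] - b[None, :]
    return -(d / ell**2) * rbf(a, b, ell, sf)

# Posterior mean of f' given noisy y = f(x) + eps is  K'_* (K + s2 I)^{-1} y,
# since (f, f') are jointly Gaussian with the kernel derivatives above.
x = np.linspace(0.0, 2 * np.pi, 30)
y = np.sin(x) + 0.01 * np.random.default_rng(0).normal(size=x.size)
xs = np.array([0.0, np.pi / 2, np.pi])
K = rbf(x, x) + 1e-4 * np.eye(x.size)
dmean = drbf(xs, x) @ np.linalg.solve(K, y)
print(np.round(dmean, 2))   # should track cos(xs) = [1, 0, -1]
```

    The same joint-Gaussian construction extends to multiple outputs, which is how metabolite concentrations and their flux derivatives are linked in the MGP framework.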

  20. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.

  1. Model Identification of Integrated ARMA Processes

    Science.gov (United States)

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
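    SCAN and ESACF are SAS procedures, but the basic signal they formalize, that an integrated series shows a slowly decaying sample ACF until it is differenced, can be sketched without them. The simulated ARIMA(0,1,1) model and seed below are illustrative assumptions:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation function up to max_lag."""
    x = np.asarray(x, float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(1, max_lag + 1)])

# simulate an integrated process: ARIMA(0,1,1) with theta = 0.5
rng = np.random.default_rng(7)
e = rng.normal(size=5000)
w = e[1:] + 0.5 * e[:-1]   # MA(1) innovations
y = np.cumsum(w)           # integration step
acf_raw = sample_acf(y, 10)
acf_diff = sample_acf(np.diff(y), 10)
print(np.round(acf_raw[:3], 2), np.round(acf_diff[:3], 2))
```

    The raw series has near-unit autocorrelations that die off very slowly, while the differenced series shows the MA(1) cutoff after lag 1, the classical identification signature for an integrated process.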

  2. Modeling as a Decision-Making Process

    Science.gov (United States)

    Bleiler-Baxter, Sarah K.; Stephens, D. Christopher; Baxter, Wesley A.; Barlow, Angela T.

    2017-01-01

    The goal in this article is to support teachers in better understanding what it means to model with mathematics by focusing on three key decision-making processes: Simplification, Relationship Mapping, and Situation Analysis. The authors use the Theme Park task to help teachers develop a vision of how students engage in these three decision-making…

  3. Kinetics and modeling of anaerobic digestion process

    DEFF Research Database (Denmark)

    Gavala, Hariklia N.; Angelidaki, Irini; Ahring, Birgitte Kiær

    2003-01-01

    Anaerobic digestion modeling started in the early 1970s when the need for design and efficient operation of anaerobic systems became evident. At that time not only was the knowledge about the complex process of anaerobic digestion inadequate but also there were computational limitations. Thus...

  4. Querying Business Process Models with VMQL

    DEFF Research Database (Denmark)

    Störrle, Harald; Acretoaie, Vlad

    2013-01-01

    In this paper, we apply VMQL to the Business Process Modeling Notation (BPMN) to evaluate the second claim. We explore the adaptations required, and re-evaluate the usability of VMQL in this context. We find similar results to earlier work, thus both supporting our claims and establishing the usability of VMQL

  5. Numerical modeling and simulation in various processes

    Directory of Open Access Journals (Sweden)

    Eliza Consuela ISBĂŞOIU

    2011-12-01

    Economic modeling offers the manager rigor in his actions and multiple ways to connect existing resources with the objectives pursued over a certain period of time, offering the possibility of a better and faster thinking and decision process without distorting reality.

  6. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  7. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  8. Modeling of the mechanical alloying process

    Science.gov (United States)

    Maurice, D.; Courtney, T. H.

    1992-01-01

    Two programs have been developed to compute the dimensional and property changes that occur with repetitive impacts during the mechanical alloying process. The more sophisticated of the programs also maintains a running count of the fractions of particles present and from this calculates a population distribution. The programs predict powder particle size and shape changes in accord with the accepted stages of powder development during mechanical alloying of ductile species. They also predict hardness and lamellar thickness changes with processing, again with reasonable agreement with experimental results. These predictions offer support of the model (and thereby give insight into the possible 'actual' happenings of mechanical alloying) and hence allow refinement and calibration of the myriad aspects of the model. They also provide a vehicle for establishing control over the dimensions and properties of the output powders used for consolidation, thereby facilitating optimization of the consolidation process.

  9. Managing risks in business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2010-01-01

Companies today, in some industries more than others, invest more capital and resources just to stay competitive, develop more diverse solutions, and increasingly start thinking more radically when considering their business models. However, despite the understanding that business model (BM) innovation is a risky enterprise, many companies are still choosing not to apply any risk management in the BM innovation process. The objective of this paper is to develop a better understanding of how risks are handled in the practice of BM innovation. An analysis of the BM innovation experiences of two industrial companies shows that both companies are experiencing high levels of uncertainty and complexity during their innovation processes and are, consequently, struggling to find new processes for handling the risks involved. Based on the two companies’ experiences, various testable propositions are put forward, which link success and failure to the way companies appreciate and handle the risks involved in BM innovation.

  10. Statistical model for high energy inclusive processes

    International Nuclear Information System (INIS)

    Pomorisac, B.

    1980-01-01

We propose a statistical model of inclusive processes. The model is an extension of the model proposed by Scalapino and Sugar for the inclusive distributions in rapidity. The model is defined in terms of a random variable on the full phase space of the produced particles and in terms of a Lorentz-invariant probability distribution. We suggest that the Lorentz invariance is broken spontaneously; this may describe the observed anisotropy of the inclusive distributions. Based on this model we calculate the distribution in transverse momentum. An explicit calculation is given of the one-particle inclusive cross sections and the two-particle correlation. The results give a fair representation of the shape of one-particle inclusive cross sections, and positive correlation for the particles emitted. The relevance of our results to experiments is discussed

  11. Fundamentals of Numerical Modelling of Casting Processes

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri; Pryds, Nini; Thorborg, Jesper

Fundamentals of Numerical Modelling of Casting Processes comprises a thorough presentation of the basic phenomena that need to be addressed in numerical simulation of casting processes. The main philosophy of the book is to present the topics in view of their physical meaning, whenever possible, rather than relying strictly on mathematical formalism. The book, aimed both at the researcher and the practicing engineer, as well as the student, is naturally divided into four parts. Part I (Chapters 1-3) introduces the fundamentals of modelling in a 1-dimensional framework. Part II (Chapter 4) presents the most important aspects of solidification theory related to modelling. Part III (Chapter 5) describes the fluid flow phenomena and in Part IV (Chapter 6) the stress-strain analysis is addressed. For all parts, both numerical formulations as well as some important analytical solutions…

  12. Temperature Modelling of the Biomass Pretreatment Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Blanke, Mogens; Jensen, Jakob M.

    2012-01-01

In a second generation biorefinery, the biomass pretreatment stage has an important contribution to the efficiency of the downstream processing units involved in biofuel production. Most of the pretreatment process occurs in a large pressurized thermal reactor that presents an irregular temperature distribution. Therefore, an accurate temperature model is critical for observing the biomass pretreatment. More than that, the biomass is also pushed with a constant horizontal speed along the reactor in order to ensure a continuous throughput. The goal of this paper is to derive a temperature model that captures the environmental temperature differences inside the reactor using distributed parameters. A Kalman filter is then added to account for any missing dynamics and the overall model is embedded into a temperature soft sensor. The operator of the plant will be able to observe the temperature in any…
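The Kalman-filter correction step mentioned in the abstract can be illustrated with a minimal scalar filter. This is a generic sketch with made-up noise variances and a hypothetical setpoint, not the paper's distributed-parameter reactor model:

```python
import numpy as np

def kalman_1d(measurements, x0, p0, q, r):
    """Scalar Kalman filter with a random-walk state model.
    q: process-noise variance, r: measurement-noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                # predict: uncertainty grows between samples
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update with the new measurement
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(1)
true_temp = 180.0                                # hypothetical reactor temperature, degC
noisy = true_temp + rng.normal(0.0, 2.0, 200)    # noisy sensor readings
filtered = kalman_1d(noisy, x0=150.0, p0=10.0, q=0.01, r=4.0)
```

The filtered trace converges from the wrong initial guess to the true level and is substantially less noisy than the raw measurements, which is the role a soft sensor plays for an operator.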

  13. Modeling Veterans Healthcare Administration Disclosure Processes

    Energy Technology Data Exchange (ETDEWEB)

    Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.

    2013-09-01

    As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.

  14. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash-Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against an independent dataset to that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent…
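The Nash-Sutcliffe statistic the checklist criticises is straightforward to compute; a minimal sketch with hypothetical phosphorus observations (not the paper's monitoring data):

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 = perfect fit, 0 = no better than
    predicting the observed mean, negative = worse than the mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    residual = np.sum((observed - simulated) ** 2)
    variance = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - residual / variance

obs = np.array([0.10, 0.15, 0.30, 0.60, 0.25, 0.12])   # hypothetical TP, mg/l
mean_model = np.full_like(obs, obs.mean())             # "predict the mean" baseline
```

Because any simulation that tracks the mean scores near zero regardless of how it misses individual events, the statistic can fail to separate realistic from unrealistic simulations, which is the limitation the checklist flags.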

  15. Modelling Of Manufacturing Processes With Membranes

    Science.gov (United States)

    Crăciunean, Daniel Cristian; Crăciunean, Vasile

    2015-07-01

    The current objectives of increasing quality and efficiency standards in manufacturing processes can be achieved only through the best combination of inputs, independent of the spatial distance between them. This paper proposes modelling production processes based on the membrane structures introduced in [4]. Inspired by biochemistry, membrane computation [4] is based on the concept of the membrane, represented in its formalism by the mathematical concept of a multiset. The manufacturing process is the evolution of a super-cell system from its initial state according to the given actions of aggregation. In this paper we consider that the atomic production unit of the process is the action. The actions, and the resources on which the actions are performed, are distributed in a virtual network of companies working together. The destination of the output resources is specified by corresponding output events.

  16. Dimensional modeling: beyond data processing constraints.

    Science.gov (United States)

    Bunardzic, A

    1995-01-01

    The focus of information processing requirements is shifting from the on-line transaction processing (OLTP) issues to the on-line analytical processing (OLAP) issues. While the former serves to ensure the feasibility of the real-time on-line transaction processing (which has already exceeded a level of up to 1,000 transactions per second under normal conditions), the latter aims at enabling more sophisticated analytical manipulation of data. The OLTP requirements, or how to efficiently get data into the system, have been solved by applying relational theory in the form of the Entity-Relationship model. There is presently no theory related to OLAP that would resolve the analytical processing requirements as efficiently as relational theory did for transaction processing. The "relational dogma" also provides the mathematical foundation for the Centralized Data Processing paradigm, in which mission-critical information is incorporated as 'one and only one instance' of data, thus ensuring data integrity. In such surroundings, the information that supports business analysis and decision support activities is obtained by running predefined reports and queries that are provided by the IS department. In today's intensified competitive climate, businesses are finding that this traditional approach is not good enough. The only way to stay on top of things, and to survive and prosper, is to decentralize the IS services. The newly emerging Distributed Data Processing, with its increased emphasis on empowering the end user, does not seem to find enough merit in the relational database model to justify relying upon it. Relational theory proved too rigid and complex to accommodate the analytical processing needs. In order to satisfy the OLAP requirements, or how to efficiently get the data out of the system, different models, metaphors, and theories have been devised. All of them point to the need for simplifying the highly non-intuitive mathematical constraints found…

  17. Mathematical modeling of the flash converting process

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, H.Y.; Perez-Tello, M.; Riihilahti, K.M. [Utah Univ., Salt Lake City, UT (United States)

    1996-12-31

    An axisymmetric mathematical model for the Kennecott-Outokumpu flash converting process for converting solid copper matte to copper is presented. The model is an adaptation of the comprehensive mathematical model formerly developed at the University of Utah for the flash smelting of copper concentrates. The model incorporates the transport of momentum, heat, mass, and reaction kinetics between gas and particles in a particle-laden turbulent gas jet. The standard k-{epsilon} model is used to describe gas-phase turbulence in an Eulerian framework. The particle phase is treated from a Lagrangian viewpoint which is coupled to the gas phase via the source terms in the Eulerian gas-phase governing equations. Matte particles were represented as Cu{sub 2}S·yFeS, and assumed to undergo homogeneous oxidation to Cu{sub 2}O, Fe{sub 3}O{sub 4}, and SO{sub 2}. A reaction kinetics mechanism involving both external mass transfer of oxygen gas to the particle surface and diffusion of oxygen through the porous oxide layer is proposed to estimate the particle oxidation rate. Predictions of the mathematical model were compared with the experimental data collected in a bench-scale flash converting facility. Good agreement between the model predictions and the measurements was obtained. The model was used to study the effect of different gas-injection configurations on the overall fluid dynamics in a commercial size flash converting shaft. (author)

  18. Modelling and control of crystallization process

    Directory of Open Access Journals (Sweden)

    S.K. Jha

    2017-03-01

    Batch crystallizers are predominantly used in chemical industries such as pharmaceuticals, food and specialty chemicals. The nonlinear nature of the batch process leads to difficulties when the objective is to obtain a uniform Crystal Size Distribution (CSD). In this study, a linear PI controller is designed using classical controller tuning methods for controlling the crystallizer outlet temperature by manipulating the inlet jacket temperature; however, the response is not satisfactory. Since a simple PID controller cannot guarantee a satisfactory response, an optimal controller is designed to keep the concentration and temperature in a range that suits our needs. Any typical process operation has constraints on states, inputs and outputs, so a nonlinear process needs to be operated satisfying those constraints. Hence, a nonlinear controller such as the Generic Model Controller (GMC), which is similar in structure to the PI controller, is implemented. It minimizes the derivative of the squared error, thus improving the output response of the process. Minimization of crystal size variation is considered as the objective function in this study. A model predictive controller is also designed, using an advanced optimization algorithm to minimize the error while linearizing the process. Constraints are fed into the MPC toolbox in MATLAB, and the prediction and control horizons and performance weights are tuned using the Sridhar and Cooper method. The performances of all three controllers (PID, GMC and MPC) are compared, and MPC is found to be superior in terms of settling time and percentage overshoot.
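The PI loop that the study starts from can be sketched on a hypothetical first-order temperature response. The plant time constant, gains, and setpoint below are illustrative, not the crystallizer model from the paper:

```python
import numpy as np

def simulate_pi(kp, ki, setpoint=40.0, dt=1.0, steps=300):
    """Discrete PI control of a first-order lag: tau * dT/dt = u - T,
    where u is the manipulated jacket temperature."""
    tau = 20.0                     # hypothetical process time constant
    temp, integral = 25.0, 0.0     # initial temperature and integrator state
    history = []
    for _ in range(steps):
        error = setpoint - temp
        integral += error * dt
        u = kp * error + ki * integral     # PI control law
        temp += dt / tau * (u - temp)      # first-order plant update
        history.append(temp)
    return np.array(history)

response = simulate_pi(kp=2.0, ki=0.05)
```

The integral term drives the steady-state error to zero; GMC and MPC then improve on this baseline by exploiting a process model and constraint handling, as the abstract describes.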

  19. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from ordering a book online until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated … the number of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  20. Theoretical modelling of carbon deposition processes

    International Nuclear Information System (INIS)

    Marsh, G.R.; Norfolk, D.J.; Skinner, R.F.

    1985-01-01

    Work based on capsule experiments in the BNL Gamma Facility, aimed at elucidating the chemistry involved in the formation of carbonaceous deposit on CAGR fuel pin surfaces, is described. Using a data-base derived from capsule experiments together with literature values for the kinetics of the fundamental reactions, a chemical model of the gas-phase processes has been developed. This model successfully reproduces the capsule results, whilst preliminary application to the WAGR coolant circuit indicates the likely concentration profiles of various radical species within the fuel channels. (author)

  1. Theoretical Modelling of Intercultural Communication Process

    Directory of Open Access Journals (Sweden)

    Mariia Soter

    2016-08-01

    The concepts of “communication”, “intercultural communication”, and “model of communication” are analyzed in the article. The basic components of the communication process are singled out. A model of intercultural communication is developed. Communicative, behavioral and complex skills for the optimal organization of intercultural communication, establishment of productive contact with a foreign partner to achieve mutual understanding, and the search for acceptable ways of organizing interaction and cooperation for both communicants are highlighted in the article. It is noted that intercultural communication, through interaction between people, affects the development of different aspects of cultures.

  2. Empirical process modeling in fast breeder reactors

    International Nuclear Information System (INIS)

    Ikonomopoulos, A.; Endou, A.

    1998-01-01

    A non-linear multi-input/single-output (MISO) empirical model is introduced for monitoring vital system parameters in a nuclear reactor environment. The proposed methodology employs a scheme of non-parametric smoothing that models the local dynamics of each fitting point individually, as opposed to global modeling techniques, such as multi-layer perceptrons (MLPs), that attempt to capture the dynamics of the entire design space. The motivation for employing local models in monitoring arises from one's desire to capture localized idiosyncrasies of the dynamic system utilizing independent estimators. This approach alleviates the effect of negative interference between old and new observations, enhancing the model's prediction capabilities. Modeling the behavior of any given system comes down to a trade-off between variance and bias. The building blocks of the proposed approach are tailored to each data set through two separate, adaptive procedures in order to optimize the bias-variance reconciliation. Hetero-associative schemes of the technique presented exhibit insensitivity to sensor noise and provide the operator with accurate predictions of the actual process signals. A comparison between the local model and MLP prediction capabilities is performed and the results appear in favor of the first method. The data used to demonstrate the potential of local regression have been obtained during two startup periods of the Monju fast breeder reactor (FBR).
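The contrast between local and global fitting can be illustrated with a simple kernel-weighted (Nadaraya-Watson) estimator, which forms an independent weighted estimate at each query point. This is a generic sketch on synthetic data, not the authors' MISO scheme or the Monju measurements:

```python
import numpy as np

def local_predict(x_train, y_train, x_query, bandwidth=0.2):
    """Nadaraya-Watson kernel regression: each query point gets its own
    locally weighted estimate instead of one global fit."""
    preds = []
    for xq in np.atleast_1d(x_query):
        w = np.exp(-0.5 * ((x_train - xq) / bandwidth) ** 2)
        preds.append(np.dot(w, y_train) / w.sum())
    return np.array(preds)

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 400))
y = np.sin(x) + rng.normal(0.0, 0.05, x.size)     # noisy observations

yhat = local_predict(x, y, np.array([np.pi / 2.0, np.pi]))
```

Because each prediction uses only nearby observations, adding new data in one region does not disturb fits elsewhere, which is the "negative interference" advantage the abstract attributes to local models.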

  3. Modeling of processes in the tourism sector

    Directory of Open Access Journals (Sweden)

    Salamatina Victoriya, S.

    2015-06-01

    In modern conditions, tourism is becoming a budget-forming sector for a number of Russian regions. It is therefore of interest to simulate the processes occurring in the tourism business, because they are affected by many random parameters arising from various economic, political, geographic, and other factors. To improve and develop systems for managing the tourism business, the economic-mathematical apparatus is being systematically introduced in this area, because increased competitiveness requires continuous and constructive change. Applying this apparatus makes it possible to analyze and evaluate further processes in tourism more systematically and with internal unity. A typical feature of some economic processes in tourist activities is that the effect of a factor on the indicators of a process appears not immediately but gradually, after some certain time, that is, with a certain lag. This delay has to be accounted for when developing mathematical models of tourism business processes. In such cases it is advisable to apply the economic-mathematical formalism of optimal control known as game theory.

  4. Aqueous Electrolytes: Model Parameters and Process Simulation

    DEFF Research Database (Denmark)

    Thomsen, Kaj

    This thesis deals with aqueous electrolyte mixtures. The Extended UNIQUAC model is being used to describe the excess Gibbs energy of such solutions. Extended UNIQUAC parameters for the twelve ions Na+, K+, NH4+, H+, Cl-, NO3-, SO42-, HSO4-, OH-, CO32-, HCO3-, and S2O82- are estimated. A computer program including a steady state process simulator for the design, simulation, and optimization of fractional crystallization processes is presented.

  5. Markov State Model of Ion Assembling Process.

    Science.gov (United States)

    Shevchuk, Roman

    2016-05-12

    We study the process of ion assembling in aqueous solution by means of molecular dynamics. In this article, we present a method to study many-particle assembly using the Markov state model formalism. We observed that at NaCl concentrations higher than 1.49 mol/kg, the system tends to form a big ionic cluster composed of roughly 70-90% of the total number of ions. Using Markov state models, we estimated the average time needed for the system to make a transition from a disordered state to a state with a big ionic cluster. Our results suggest that the characteristic time to form an ionic cluster is a negative exponential function of the salt concentration. Moreover, we defined and analyzed three different kinetic states of a single ion particle. These states correspond to different particle locations during the nucleation process.
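The Markov-state-model bookkeeping behind such an analysis (counting lagged transitions, then solving for mean first-passage times) can be sketched in a few lines. The three states and transition probabilities below are hypothetical stand-ins, not the paper's NaCl trajectories:

```python
import numpy as np

def transition_matrix(traj, n_states, lag=1):
    """Row-stochastic transition matrix from lagged transition counts."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(traj[:-lag], traj[lag:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def mean_first_passage(T, target):
    """Mean first-passage time (in lag units) into `target` from each state,
    solving (I - T_reduced) m = 1 over the non-target states."""
    n = T.shape[0]
    keep = [i for i in range(n) if i != target]
    A = np.eye(len(keep)) - T[np.ix_(keep, keep)]
    m = np.linalg.solve(A, np.ones(len(keep)))
    out = np.zeros(n)
    out[keep] = m
    return out

# Hypothetical 3-state labelling: free ion (0), small cluster (1), big cluster (2).
P_true = np.array([[0.90, 0.09, 0.01],
                   [0.20, 0.70, 0.10],
                   [0.02, 0.08, 0.90]])
rng = np.random.default_rng(3)
traj = [0]
for _ in range(20000):
    traj.append(rng.choice(3, p=P_true[traj[-1]]))

T = transition_matrix(traj, 3)
mfpt = mean_first_passage(T, target=2)
```

The estimated mean first-passage time into the clustered state is what yields the "average time needed for the transition" reported in the abstract.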

  6. Environmental Modeling Framework using Stacked Gaussian Processes

    OpenAIRE

    Abdelfatah, Kareem; Bao, Junshu; Terejanu, Gabriel

    2016-01-01

    A network of independently trained Gaussian processes (StackedGP) is introduced to obtain predictions of quantities of interest with quantified uncertainties. The main applications of the StackedGP framework are to integrate different datasets through model composition, enhance predictions of quantities of interest through a cascade of intermediate predictions, and to propagate uncertainties through emulated dynamical systems driven by uncertain forcing variables. By using analytical first an...
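The cascade idea (feeding one GP's prediction into the next) can be sketched with a plain RBF-kernel posterior mean. This is a toy stand-in for the StackedGP framework, with synthetic data and an assumed two-stage chain:

```python
import numpy as np

def rbf(a, b, ls):
    """Squared-exponential kernel matrix for 1-D inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

def gp_mean(x_train, y_train, x_query, ls=0.5, noise=1e-4):
    """Posterior mean of a zero-mean GP with an RBF kernel."""
    K = rbf(x_train, x_train, ls) + noise * np.eye(x_train.size)
    alpha = np.linalg.solve(K, y_train)
    return rbf(x_query, x_train, ls) @ alpha

# Stage 1 maps x -> z, stage 2 maps z -> y; stacking feeds the stage-1
# prediction into stage 2. The data are synthetic.
x = np.linspace(0.0, 3.0, 20)
z = 2.0 * x                  # hidden intermediate quantity
y = np.sin(z)

xq = np.array([1.5])
z_hat = gp_mean(x, z, xq)          # stage-1 prediction
y_hat = gp_mean(z, y, z_hat)       # stage-2 prediction from stage-1 output
```

The full framework also propagates the stage-1 predictive *variance* through the chain rather than only the mean, which this sketch omits.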

  7. Deep inelastic processes and the parton model

    International Nuclear Information System (INIS)

    Altarelli, G.

    The lecture was intended as an elementary introduction to the physics of deep inelastic phenomena from the point of view of theory. General formulae and facts concerning inclusive deep inelastic processes of the form l + N → l' + hadrons (electroproduction, neutrino scattering) are first recalled. The deep inelastic annihilation e⁺e⁻ → hadrons is then envisaged. The light-cone approach, the parton model and their relation are mainly emphasized

  8. Process Modeling With Inhomogeneous Thin Films

    Science.gov (United States)

    Machorro, R.; Macleod, H. A.; Jacobson, M. R.

    1986-12-01

    Designers of optical multilayer coatings commonly assume that the individual layers will be ideally homogeneous and isotropic. In practice, it is very difficult to control the conditions involved in the complex evaporation process sufficiently to produce such ideal films. Clearly, changes in process parameters, such as evaporation rate, chamber pressure, and substrate temperature, affect the microstructure of the growing film, frequently producing inhomogeneity in structure or composition. In many cases, these effects are interdependent, further complicating the situation. However, this process can be simulated on powerful, interactive, and accessible microcomputers. In this work, we present such a model and apply it to estimate the influence of an inhomogeneous layer on multilayer performance. Presently, the program simulates film growth, thermal expansion and contraction, and thickness monitoring procedures, and includes the effects of uncertainty in these parameters or noise. Although the model is being developed to cover very general cases, we restrict the present discussion to isotropic and nondispersive quarterwave layers to understand the particular effects of inhomogeneity. We studied several coating designs and related results and tolerances to variations in evaporation conditions. The model is composed of several modular subprograms, is written in Fortran, and is executed on an IBM-PC with 640 K of memory. The results can be presented in graphic form on a monochrome monitor. We are currently installing and implementing color capability to improve the clarity of the multidimensional output.

  9. Near Field Environment Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Wagner

    2000-11-14

    Waste emplacement and activities associated with construction of a repository system potentially will change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect distribution of water, increase kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers as well as potentially changing the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.

  10. MODELLING OF POSTSEISMIC PROCESSES IN SUBDUCTION ZONES

    Directory of Open Access Journals (Sweden)

    Irina S. Vladimirova

    2012-01-01

    Large intraplate subduction earthquakes are generally accompanied by prolonged and intense postseismic anomalies. In the present work, viscoelastic relaxation in the upper mantle and the asthenosphere is considered as the main mechanism responsible for the occurrence of such postseismic effects. The study of transient processes is performed on the basis of data on postseismic processes accompanying the first Simushir earthquake of 15 November 2006 and the Maule earthquake of 27 February 2010. The methodology of modelling a viscoelastic relaxation process after a large intraplate subduction earthquake is presented. A priori parameters of the selected model describing observed postseismic effects are adjusted by minimizing deviations between modeled surface displacements and actual surface displacements recorded by geodetic methods through solving corresponding inverse problems. The presented methodology yielded estimations of Maxwell’s viscosity of the asthenosphere of the central Kuril Arc and also of central Chile. Besides, postseismic slip distribution patterns were obtained for the focus of the Simushir earthquake of 15 November 2006 (Mw=8.3) (Figure 3), and distribution patterns of seismic and postseismic slip were determined for the focus of the Maule earthquake of 27 February 2010 (Mw=8.8) (Figure 6). These estimations and patterns can provide for prediction of the intensity of viscoelastic stress attenuation in the asthenosphere; anomalous values should be taken into account as adjustment factors when analyzing inter-seismic deformation in order to ensure correct estimation of the accumulated elastic seismogenic potential.

  11. Multimodal Similarity Gaussian Process Latent Variable Model.

    Science.gov (United States)

    Song, Guoli; Wang, Shuhui; Huang, Qingming; Tian, Qi

    2017-09-01

    Data from real applications involve multiple modalities representing content with the same semantics from complementary aspects. However, relations among heterogeneous modalities are simply treated as observation-to-fit by existing work, and the parameterized modality specific mapping functions lack flexibility in directly adapting to the content divergence and semantic complicacy in multimodal data. In this paper, we build our work based on the Gaussian process latent variable model (GPLVM) to learn the non-parametric mapping functions and transform heterogeneous modalities into a shared latent space. We propose multimodal Similarity Gaussian Process latent variable model (m-SimGP), which learns the mapping functions between the intra-modal similarities and latent representation. We further propose multimodal distance-preserved similarity GPLVM (m-DSimGP) to preserve the intra-modal global similarity structure, and multimodal regularized similarity GPLVM (m-RSimGP) by encouraging similar/dissimilar points to be similar/dissimilar in the latent space. We propose m-DRSimGP, which combines the distance preservation in m-DSimGP and semantic preservation in m-RSimGP to learn the latent representation. The overall objective functions of the four models are solved by simple and scalable gradient decent techniques. They can be applied to various tasks to discover the nonlinear correlations and to obtain the comparable low-dimensional representation for heterogeneous modalities. On five widely used real-world data sets, our approaches outperform existing models on cross-modal content retrieval and multimodal classification.

  12. Spherical Process Models for Global Spatial Statistics

    KAUST Repository

    Jeong, Jaehong

    2017-11-28

    Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture the spatial and temporal behavior of these global data sets. Though the geodesic distance is the most natural metric for measuring distance on the surface of a sphere, mathematical limitations have compelled statisticians to use the chordal distance to compute the covariance matrix in many applications instead, which may cause physically unrealistic distortions. Therefore, covariance functions directly defined on a sphere using the geodesic distance are needed. We discuss the issues that arise when dealing with spherical data sets on a global scale and provide references to recent literature. We review the current approaches to building process models on spheres, including the differential operator, the stochastic partial differential equation, the kernel convolution, and the deformation approaches. We illustrate realizations obtained from Gaussian processes with different covariance structures and the use of isotropic and nonstationary covariance models through deformations and geographical indicators for global surface temperature data. To assess the suitability of each method, we compare their log-likelihood values and prediction scores, and we end with a discussion of related research problems.
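The geodesic/chordal distinction is easy to make concrete. Below is a minimal sketch with an exponential covariance, which happens to be valid on the sphere with either metric; the range parameter is illustrative.

```python
import numpy as np

R_EARTH = 6371.0  # mean Earth radius, km

def geodesic(lat1, lon1, lat2, lon2):
    """Great-circle (geodesic) distance on a sphere via the haversine formula, km."""
    p1, l1, p2, l2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = np.sin((p2 - p1) / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin((l2 - l1) / 2) ** 2
    return 2 * R_EARTH * np.arcsin(np.sqrt(a))

def chordal(lat1, lon1, lat2, lon2):
    """Straight-line distance through the sphere, km."""
    theta = geodesic(lat1, lon1, lat2, lon2) / R_EARTH  # central angle
    return 2 * R_EARTH * np.sin(theta / 2)

def exp_cov(d, variance=1.0, range_km=3000.0):
    """Exponential covariance; positive definite with either distance on the sphere."""
    return variance * np.exp(-d / range_km)

d_geo = geodesic(0.0, 0.0, 0.0, 180.0)  # antipodal points: pi * R
d_ch = chordal(0.0, 0.0, 0.0, 180.0)    # antipodal points: 2 * R
```

The gap between `d_geo` and `d_ch` grows with separation (it is largest at antipodes), which is the source of the distortions the abstract mentions for chordal-distance covariances.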

  13. Modeling Dynamic Regulatory Processes in Stroke

    Science.gov (United States)

    McDermott, Jason E.; Jarman, Kenneth; Taylor, Ronald; Lancaster, Mary; Shankaran, Harish; Vartanian, Keri B.; Stevens, Susan L.; Stenzel-Poore, Mary P.; Sanfilippo, Antonio

    2012-01-01

    The ability to examine the behavior of biological systems in silico has the potential to greatly accelerate the pace of discovery in diseases, such as stroke, where in vivo analysis is time intensive and costly. In this paper we describe an approach for in silico examination of responses of the blood transcriptome to neuroprotective agents and subsequent stroke through the development of dynamic models of the regulatory processes observed in the experimental gene expression data. First, we identified functional gene clusters from these data. Next, we derived ordinary differential equations (ODEs) from the data relating these functional clusters to each other in terms of their regulatory influence on one another. Dynamic models were developed by coupling these ODEs into a model that simulates the expression of regulated functional clusters. By changing the magnitude of gene expression in the initial input state it was possible to assess the behavior of the networks through time under varying conditions since the dynamic model only requires an initial starting state, and does not require measurement of regulatory influences at each time point in order to make accurate predictions. We discuss the implications of our models on neuroprotection in stroke, explore the limitations of the approach, and report that an optimized dynamic model can provide accurate predictions of overall system behavior under several different neuroprotective paradigms. PMID:23071432
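The modeling idea, functional clusters coupled through ODEs that encode regulatory influence and driven only by an initial state, can be sketched with a hypothetical three-cluster network. The influence matrix below is invented for illustration and is not taken from the paper.

```python
import numpy as np
from scipy.integrate import odeint

# Hypothetical regulatory network: W[i, j] is the influence of cluster j on cluster i
# (negative diagonal = decay; off-diagonal = activation/repression). Values invented.
W = np.array([[-1.0,  0.5,  0.0],
              [ 0.8, -1.0, -0.4],
              [ 0.0,  0.6, -1.0]])

def dxdt(x, t):
    """Linear regulatory-influence model: dx/dt = W x."""
    return W @ x

t = np.linspace(0.0, 10.0, 101)
x0 = np.array([1.0, 0.0, 0.0])   # perturb one cluster in the initial input state
traj = odeint(dxdt, x0, t)       # expression of all clusters through time
```

As in the paper, only the initial state is supplied; changing `x0` simulates different treatment conditions without re-measuring regulatory influences at each time point.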

  14. Modeling dynamic regulatory processes in stroke.

    Directory of Open Access Journals (Sweden)

    Jason E McDermott

    Full Text Available The ability to examine the behavior of biological systems in silico has the potential to greatly accelerate the pace of discovery in diseases, such as stroke, where in vivo analysis is time intensive and costly. In this paper we describe an approach for in silico examination of responses of the blood transcriptome to neuroprotective agents and subsequent stroke through the development of dynamic models of the regulatory processes observed in the experimental gene expression data. First, we identified functional gene clusters from these data. Next, we derived ordinary differential equations (ODEs from the data relating these functional clusters to each other in terms of their regulatory influence on one another. Dynamic models were developed by coupling these ODEs into a model that simulates the expression of regulated functional clusters. By changing the magnitude of gene expression in the initial input state it was possible to assess the behavior of the networks through time under varying conditions since the dynamic model only requires an initial starting state, and does not require measurement of regulatory influences at each time point in order to make accurate predictions. We discuss the implications of our models on neuroprotection in stroke, explore the limitations of the approach, and report that an optimized dynamic model can provide accurate predictions of overall system behavior under several different neuroprotective paradigms.

  15. GREENSCOPE: A Method for Modeling Chemical Process ...

    Science.gov (United States)

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Efficiency, and Energy, can evaluate processes with over a hundred different indicators. These indicators provide a means for realizing the principles of green chemistry and green engineering in the context of sustainability. Development of the methodology has centered around three focal points. One is a taxonomy of impacts that describes the indicators and provides absolute scales for their evaluation. The setting of best and worst limits for the indicators allows the user to know the status of the process under study in relation to understood values. Thus, existing or imagined processes can be evaluated according to their relative indicator scores, and process modifications can strive towards realizable targets. A second area of focus is advancing definitions of data needs for the many indicators of the taxonomy. Each indicator has specific data that is necessary for its calculation. Values needed and data sources have been identified. These needs can be mapped according to the information source (e.g., input stream, output stream, external data, etc.) for each of the bases. The user can visualize data-indicator relationships on the way to choosing selected ones for evalua
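The best/worst limits turn each indicator into a bounded score. A minimal sketch of that normalization follows; the indicator, its limits and the measured value are invented for illustration, and GREENSCOPE's actual indicator definitions are given in the methodology itself.

```python
def indicator_score(actual, best, worst):
    """Percent-of-target score: 100% at the best-case limit, 0% at the worst case."""
    return 100.0 * (actual - worst) / (best - worst)

# Hypothetical energy-intensity indicator, where lower is better (values invented).
best, worst = 2.0, 10.0   # GJ per tonne of product
score = indicator_score(actual=6.0, best=best, worst=worst)
```

Because `best` and `worst` bound the scale, the same formula works whether an indicator improves upward or downward; only the roles of the two limits swap.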

  16. Discovering Reference Process Models by Mining Process Variants

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    Recently, a new generation of adaptive Process-Aware Information Systems (PAIS) has emerged, which allows for dynamic process and service changes (e.g., to insert, delete, and move activities and service executions in a running process). This, in turn, has led to a large number of process variants

  17. Multifunctional multiscale composites: Processing, modeling and characterization

    Science.gov (United States)

    Qiu, Jingjing

    Carbon nanotubes (CNTs) demonstrate extraordinary properties and show great promise in enhancing out-of-plane properties of traditional polymer/fiber composites and enabling functionality. However, current manufacturing challenges hinder the realization of their potential. In the dissertation research, both experimental and computational efforts have been conducted to investigate effective manufacturing techniques of CNT integrated multiscale composites. The fabricated composites demonstrated significant improvements in physical properties, such as tensile strength, tensile modulus, inter-laminar shear strength, thermal dimension stability and electrical conductivity. Such multiscale composites were truly multifunctional with the addition of CNTs. Furthermore, a novel hierarchical multiscale modeling method was developed in this research. Molecular dynamic (MD) simulation offered reasonable explanation of CNTs dispersion and their motion in polymer solution. Bi-mode finite-extensible-nonlinear-elastic (FENE) dumbbell simulation was used to analyze the influence of CNT length distribution on the stress tensor and shear-rate-dependent viscosity. Based on the simulated viscosity profile and empirical equations from experiments, a macroscale flow simulation model on the finite element method (FEM) method was developed and validated to predict resin flow behavior in the processing of CNT-enhanced multiscale composites. The proposed multiscale modeling method provided a comprehensive understanding of micro/nano flow in both atomistic details and mesoscale. The simulation model can be used to optimize process design and control of the mold-filling process in multiscale composite manufacturing. This research provided systematic investigations into the CNT-based multiscale composites. The results from this study may be used to leverage the benefits of CNTs and open up new application opportunities for high-performance multifunctional multiscale composites. Keywords. 
Carbon

  18. Innovative process engineering: a generic model of the innovation process

    OpenAIRE

    Pénide, Thomas; Gourc, Didier; Pingaud, Hervé; Peillon, Philippe

    2013-01-01

    International audience; Innovation can be represented as a knowledge transformation process perceived at different levels of granularity. The milestones of this process allow assessment at each step and set up feedback loops that will be highlighted. This innovation process is a good starting point for understanding innovation and then managing it. Best practices being patterns of processes, we describe innovation best practices as compulsory steps in our innovation process. To put into p...

  19. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is

  20. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches and describe their differences, at the same time. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested in both business-like event logs as recorded in a higher educational enterprise resource planning system and a real case scenario involving a set of Dutch municipalities.
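The greedy-versus-exhaustive trade-off can be illustrated on a toy configurable model. The nodes, options and log frequencies below are invented, and the simple coverage measure stands in for the process-mining fitness the paper's strategies compute against real event logs.

```python
from itertools import product

# Hypothetical configurable model: each configurable node offers activity variants.
config_nodes = {
    "register": ["register_online", "register_desk"],
    "payment": ["pay_card", "pay_cash", "pay_invoice"],
}

# Event-log activity frequencies observed at one branch (invented).
log_freq = {"register_online": 90, "register_desk": 10,
            "pay_card": 70, "pay_cash": 5, "pay_invoice": 25}

def fitness(choice):
    """Fraction of logged events whose activity is kept by this configuration."""
    kept = sum(log_freq.get(a, 0) for a in choice.values())
    return kept / sum(log_freq.values())

# Greedy heuristic: configure one node at a time, keeping the locally best option.
greedy = {node: max(opts, key=lambda a: log_freq.get(a, 0))
          for node, opts in config_nodes.items()}

# Exhaustive search over the full (exponentially large) configuration space.
best = max((dict(zip(config_nodes, combo))
            for combo in product(*config_nodes.values())),
           key=fitness)
```

On this toy instance greedy and exhaustive search agree; the paper's point is that on realistic models with interacting choices they need not, which motivates the genetic-algorithm middle ground.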

  1. Modeling and optimization of wet sizing process

    International Nuclear Information System (INIS)

    Thai Ba Cau; Vu Thanh Quang and Nguyen Ba Tien

    2004-01-01

    Mathematical simulation on the basis of Stokes' law has been carried out for the wet sizing process in cylindrical equipment at laboratory and semi-industrial scale. The model consists of mathematical equations describing relations between variables, such as: - the residence time distribution function of emulsion particles in the separating zone of the equipment, depending on flow rate and on the height, diameter and structure of the equipment; - the size-distribution function of the fine and coarse fractions, depending on the residence time distribution function of the emulsion particles, the characteristics of the material being processed (such as specific density and shape) and the characteristics of the classification medium (such as specific density and viscosity). - An experimental model was developed on data collected from an experimental cylindrical apparatus with a sedimentation chamber of diameter x height equal to 50 x 40 cm, for an emulsion of zirconium silicate in water. - Using this experimental model allows determination of the optimal flow rate in order to obtain a product with the desired grain size, in terms of average size or size distribution function. (author)
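The settling physics underlying such a model is Stokes' law. A minimal sketch follows, with illustrative values for zirconium silicate particles in water; the densities and viscosity are assumptions, not the paper's calibrated values.

```python
def stokes_settling_velocity(d, rho_p, rho_f, mu, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere by Stokes' law:
    v = (rho_p - rho_f) * g * d**2 / (18 * mu)."""
    return (rho_p - rho_f) * g * d**2 / (18.0 * mu)

def cutoff_diameter(v, rho_p, rho_f, mu, g=9.81):
    """Particle diameter whose settling velocity equals an upward flow velocity v,
    i.e. the separation size of the classifier (inverse of Stokes' law)."""
    return (18.0 * mu * v / ((rho_p - rho_f) * g)) ** 0.5

# Zirconium silicate (~4560 kg/m^3, assumed) in water: a 10 micron particle.
v = stokes_settling_velocity(10e-6, 4560.0, 1000.0, 1.0e-3)
```

Choosing the flow rate so that the upflow velocity in the sedimentation chamber equals `v` for the desired cut size is the optimization the abstract describes.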

  2. Process-Based Modeling of Constructed Wetlands

    Science.gov (United States)

    Baechler, S.; Brovelli, A.; Rossi, L.; Barry, D. A.

    2007-12-01

    Constructed wetlands (CWs) are widespread facilities for wastewater treatment. In subsurface flow wetlands, contaminated wastewater flows through a porous matrix, where oxidation and detoxification phenomena occur. Despite the large number of working CWs, system design and optimization are still mainly based upon empirical equations or simplified first-order kinetics. This results from an incomplete understanding of the system functioning, and may in turn hinder the performance and effectiveness of the treatment process. As a result, CWs are often considered not suitable to meet high water quality-standards, or to treat water contaminated with recalcitrant anthropogenic contaminants. To date, only a limited number of detailed numerical models have been developed and successfully applied to simulate constructed wetland behavior. Among these, one of the most complete and powerful is CW2D, which is based on Hydrus2D. The aim of this work is to develop a comprehensive simulator tailored to model the functioning of horizontal flow constructed wetlands and in turn provide a reliable design and optimization tool. The model is based upon PHWAT, a general reactive transport code for saturated flow. PHWAT couples MODFLOW, MT3DMS and PHREEQC-2 using an operator-splitting approach. The use of PHREEQC to simulate reactions allows great flexibility in simulating biogeochemical processes. The biogeochemical reaction network is similar to that of CW2D, and is based on the Activated Sludge Model (ASM). Kinetic oxidation of carbon sources and nutrient transformations (nitrogen and phosphorous primarily) are modeled via Monod-type kinetic equations. Oxygen dissolution is accounted for via a first-order mass-transfer equation. While the ASM model only includes a limited number of kinetic equations, the new simulator permits incorporation of an unlimited number of both kinetic and equilibrium reactions. Changes in pH, redox potential and surface reactions can be easily incorporated
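The Monod-type kinetics and first-order oxygen transfer mentioned above can be sketched as follows. All parameter values are illustrative rather than calibrated, and oxygen limitation of growth is omitted for brevity.

```python
def monod_step(S, X, O2, dt, mu_max=6.0 / 86400, Ks=2.0, Y=0.5,
               kla=5.0 / 86400, O2_sat=9.0):
    """One explicit Euler step of Monod growth with first-order oxygen dissolution.
    Concentrations in mg/L, time in s; parameter values are illustrative.
    Oxygen limitation of the growth rate is deliberately not modeled here."""
    mu = mu_max * S / (Ks + S)                  # Monod-type specific growth rate, 1/s
    dS = -mu * X / Y * dt                       # substrate consumed
    dX = mu * X * dt                            # biomass grown
    dO2 = (kla * (O2_sat - O2) - mu * X) * dt   # first-order transfer minus uptake
    return S + dS, X + dX, max(O2 + dO2, 0.0)

S, X, O2 = 100.0, 50.0, 6.0                     # initial state, mg/L (invented)
for _ in range(86400 // 60):                    # one day in 60 s steps
    S, X, O2 = monod_step(S, X, O2, dt=60.0)
```

Simulators like the one described chain many such kinetic and equilibrium reactions and couple them to flow and transport; this fragment shows only the kinetic core.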

  3. Modelling and control of dynamic systems using gaussian process models

    CERN Document Server

    Kocijan, Juš

    2016-01-01

    This monograph opens up new horizons for engineers and researchers in academia and in industry dealing with or interested in new developments in the field of system identification and control. It emphasizes guidelines for working solutions and practical advice for their implementation rather than the theoretical background of Gaussian process (GP) models. The book demonstrates the potential of this recent development in probabilistic machine-learning methods and gives the reader an intuitive understanding of the topic. The current state of the art is treated along with possible future directions for research. Systems control design relies on mathematical models and these may be developed from measurement data. This process of system identification, when based on GP models, can play an integral part of control design in data-based control and its description as such is an essential aspect of the text. The background of GP regression is introduced first with system identification and incorporation of prior know...
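In its simplest form, system identification with GP models reduces to GP regression on lagged inputs and outputs. Below is a self-contained numpy-only sketch with fixed hyperparameters, applied to an invented first-order nonlinear system; it illustrates the idea, not the book's full treatment (which covers hyperparameter learning, uncertainty propagation and control design).

```python
import numpy as np

def rbf(A, B, ell=1.0, sf2=1.0):
    """RBF kernel between two sets of row-vector inputs."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sf2 * np.exp(-0.5 * d2 / ell**2)

def gp_predict(Xtr, ytr, Xte, noise=1e-4):
    """GP regression posterior mean and variance with a fixed RBF kernel."""
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    Ks = rbf(Xte, Xtr)
    mean = Ks @ np.linalg.solve(K, ytr)
    var = rbf(Xte, Xte).diagonal() - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mean, var

# Identify y[k+1] = 0.8*y[k] + tanh(u[k]) (an invented system) from simulated data.
rng = np.random.default_rng(1)
u = rng.uniform(-2, 2, 200)
y = np.zeros(201)
for k in range(200):
    y[k + 1] = 0.8 * y[k] + np.tanh(u[k])
X = np.column_stack([y[:-1], u])                 # regressors (y[k], u[k])
mean, var = gp_predict(X[:150], y[1:151], X[150:])
rmse = np.sqrt(np.mean((mean - y[151:]) ** 2))   # one-step-ahead accuracy
```

The predictive variance `var` is what distinguishes GP models in control: it quantifies model confidence, which cautious control schemes exploit.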

  4. Model systems for life processes on Mars

    Science.gov (United States)

    Mitz, M. A.

    1974-01-01

    In the evolution of life forms nonphotosynthetic mechanisms are developed. The question remains whether a total life system could evolve which is not dependent upon photosynthesis. In trying to visualize life on other planets, the photosynthetic process has problems. On Mars, the high intensity of light at the surface is a concern and alternative mechanisms need to be defined and analyzed. In the UV search for alternate mechanisms, several different areas may be identified. These involve activated inorganic compounds in the atmosphere, such as the products of photodissociation of carbon dioxide and the organic material which may be created by natural phenomena. In addition, a life system based on the pressure of the atmospheric constituents, such as carbon dioxide, is a possibility. These considerations may be important for the understanding of evolutionary processes of life on another planet. Model systems which depend on these alternative mechanisms are defined and related to presently planned and future planetary missions.

  5. Comments on: Spatiotemporal models for skewed processes

    KAUST Repository

    Genton, Marc G.

    2017-09-04

    We would first like to thank the authors for this paper that highlights the important problem of building models for non-Gaussian space-time processes. We will hereafter refer to the paper as SGV, and we also would like to acknowledge and thank them for providing us with the temporally detrended temperatures, plotted in their Figure 1, along with the coordinates of the twenty-one locations and the posterior means of the parameters for the MA1 model. We find much of interest to discuss in this paper, and as we progress through points of interest, we pose some questions to the authors that we hope they will be able to address.

  6. Specification of e-business process model for PayPal online payment process using Reo

    OpenAIRE

    Xie, M.

    2005-01-01

    E-business process modeling allows business analysts to better understand and analyze the business processes, and eventually to use software systems to automate (parts of) these business processes to achieve higher profit. To support e-business process modeling, many business process modeling languages have been used as tools. However, many existing business process modeling languages lack (a) formal semantics, (b) formal computational model, and (c) an integrated view of the busi...

  7. Privatization processes in banking: Motives and models

    Directory of Open Access Journals (Sweden)

    Ristić Života

    2006-01-01

    Full Text Available The paper consists of three methodologically and causally connected thematic parts. The first part deals with the crucial motives and models of privatization processes in the USA and the EU, with a particular analytical focus on the Herfindahl-Hirschman doctrine of the collective domination index, as well as on the essence of merger-acquisition and take-over models. The second part, a logical continuation of the first, is a brief comparative analysis of the motives and models implemented in bank privatization in the south-eastern European countries, with particular focus on identifying the interests of foreign investors, the optimal volume and price of the investment, and assessment of finalized privatizations in those countries. The final part, which stems theoretically and practically from the first two and thus forms an interdependent and compatible thematic whole with them, presents qualitative and quantitative aspects of the analysis of finalized privatizations and/or sale-purchases of Serbian banks, with particular focus on IPO and IPOPLUS as the prevailing models of future sale-purchase in privatizing Serbian banks.
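The Herfindahl-Hirschman index mentioned above is straightforward to compute. A sketch with invented market shares shows how a merger raises the index:

```python
def hhi(shares):
    """Herfindahl-Hirschman index from market shares given in percent
    (sum of squared shares; ranges from near 0 up to 10000 for a monopoly)."""
    return sum(s ** 2 for s in shares)

# Invented banking-market shares, in percent.
pre_merger = [30, 25, 20, 15, 10]
post_merger = [30, 25, 35, 10]        # the 20% and 15% banks merge
delta = hhi(post_merger) - hhi(pre_merger)
```

Competition authorities typically read the post-merger level together with this delta when assessing concentration and possible collective dominance.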

  8. Modeling Aspects Of Activated Sludge Processes Part I: Process Modeling Of Activated Sludge Flocculation And Sedimentation

    International Nuclear Information System (INIS)

    Ibrahim, H. I.; EI-Ahwany, A.H.; Ibrahim, G.

    2004-01-01

    Process modeling of activated sludge flocculation and sedimentation is reviewed, considering activated sludge floc characteristics such as morphology, viable and non-viable cell ratio, density and water content. Bioflocculation and its kinetics were studied, considering the characteristics of bioflocculation and explaining the theory of Divalent Cation Bridging, which describes the major role of cations in bioflocculation. Modeling of the activated sludge flocculation process was studied considering mass transfer limitations, from Clifft and Andrews, 1981, through Benefield and Molz, 1983, and Henze, 1987, to Tyagi, 1996 and G. Ibrahim et al., 2002. Models of aggregation and breakage of flocs were studied by Spicer and Pratsinis, 1996, and Biggs, 2002. The size distribution of flocs influences mass transfer and biomass separation in the activated sludge process. Therefore, it is of primary importance to establish the role of specific process operation factors, such as sludge loading, dynamic sludge age and dissolved oxygen, on this distribution, with special emphasis on the formation of primary particles.

  9. Measuring the precision of multi-perspective process models

    NARCIS (Netherlands)

    Mannhardt, Felix; De Leoni, Massimiliano; Reijers, Hajo A.; Van Der Aalst, Wil M P

    2016-01-01

    Process models need to reflect the real behavior of an organization’s processes to be beneficial for several use cases, such as process analysis, process documentation and process improvement. One quality criterion for a process model is that it should be precise and not express more behavior than

  10. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    Full Text Available The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST, a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  11. Geochemical modelization of differentiation processes by crystallization

    International Nuclear Information System (INIS)

    Cebria, J.M.; Lopez Ruiz, J.

    1994-01-01

    During crystallization processes, major and trace elements and stable isotopes fractionate, whereas radiogenic isotopes do not change. The different equations proposed allow us to reproduce the variation in major and trace elements during these differentiation processes. In the case of simple fractional crystallization, the residual liquid is impoverished in compatible elements faster than it is enriched in incompatible elements as crystallization proceeds. During in situ crystallization the incompatible elements evolve in a similar way to the case of simple fractional crystallization, but the enrichment rate of the moderately incompatible elements is slower and the compatible elements do not suffer a depletion as strong as that observed during simple fractional crystallization, even for higher f values. In a periodically replenished magma chamber, if all the liquid present is removed at the end of each cycle, the magma follows patterns similar to those generated by simple fractional crystallization. On the contrary, if the liquid fraction that crystallizes during each cycle and the one that is extruded at the end of the cycle are small, the residual liquid shows compositions similar to those that would be obtained by equilibrium crystallization. Modelling crystallization processes is in general less difficult than modelling partial melting. If a rock series is the result of simple fractional crystallization, a C_L^i vs. C_L^j plot, in which i is a compatible element and j a highly incompatible one, allows us to obtain a good approximation to the initial liquid composition. Additionally, log C_L^i vs. log C_L^j diagrams, in which i is a highly incompatible element, allow us to identify steps in the process and to calculate the bulk distribution coefficients of the trace elements during each step
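For simple fractional crystallization the trace-element evolution follows the Rayleigh equation, C_L = C_0 * F^(D-1). A minimal sketch (the concentrations and coefficients are illustrative):

```python
def rayleigh_fractionation(c0, F, D):
    """Residual-liquid concentration after fractional crystallization:
    C_L = C_0 * F**(D - 1), where F is the melt fraction remaining and
    D is the bulk solid/liquid distribution coefficient."""
    return c0 * F ** (D - 1.0)

# Illustrative values: 50% crystallization (F = 0.5), initial concentration 10 ppm.
incompatible = rayleigh_fractionation(c0=10.0, F=0.5, D=0.01)  # enriched in the melt
compatible = rayleigh_fractionation(c0=10.0, F=0.5, D=5.0)     # strongly depleted
```

This reproduces the behavior described above: incompatible elements (D << 1) are enriched slowly, while compatible elements (D > 1) are stripped from the residual liquid much faster.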

  12. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    2005-01-01

    Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties and point process operations such as thinning, displacements, and superpositioning. We also discuss how to simulate specific Cox processes.

  13. Heat source model for welding process

    International Nuclear Information System (INIS)

    Doan, D.D.

    2006-10-01

    One of the major industrial stakes of welding simulation is the control of the mechanical effects of the process (residual stresses, distortions, fatigue strength...). These effects depend directly on the temperature evolutions imposed during the welding process. To model this thermal loading, an original method is proposed instead of the usual approaches such as the equivalent heat source approach or the multi-physical approach. This method is based on the estimation of the weld pool shape, together with the heat flux crossing the liquid/solid interface, from experimental data measured in the solid part. Its originality consists in solving an inverse Stefan problem specific to the welding process, and it is shown how to estimate the parameters of the weld pool shape. To solve the heat transfer problem, the liquid/solid interface is modeled by a Bezier curve (2-D) or a Bezier surface (3-D). This approach is well adapted to the wide diversity of weld pool shapes met in the majority of current welding processes (TIG, MIG-MAG, laser, FE, hybrid). The number of parameters to be estimated is quite small: from 2 to 5 in 2-D and from 7 to 16 in 3-D, depending on the case considered. A sensitivity study leads to specifying the location of the sensors, their number and the set of measurements required for a good estimate. The application of the method to test results of TIG welding on thin stainless steel sheets, in fully and partially penetrating configurations, shows that only one measurement point is enough to estimate the various weld pool shapes in 2-D, and two points in 3-D, whether the penetration is full or not. In the last part of the work, a methodology is developed for the transient analysis. It is based on Duvaut's transformation, which overcomes the discontinuity at the liquid metal interface and therefore gives a continuous variable over the whole spatial domain. Moreover, it allows working on a fixed mesh grid, and the new inverse problem is equivalent to identifying a source
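The Bezier parameterization of the liquid/solid interface can be sketched with de Casteljau's algorithm. The control points below are invented; in the method described they are the unknowns estimated from the solid-side temperature measurements.

```python
import numpy as np

def bezier(control, t):
    """Evaluate a Bezier curve at parameter values t via de Casteljau's algorithm.
    control: (n_ctrl, 2) array; t: (nt,) array in [0, 1]; returns (nt, 2) points."""
    pts = np.repeat(control[None, :, :], len(t), axis=0)   # (nt, n_ctrl, 2)
    while pts.shape[1] > 1:
        pts = (1 - t)[:, None, None] * pts[:, :-1] + t[:, None, None] * pts[:, 1:]
    return pts[:, 0, :]

# Hypothetical 2-D weld pool front, from the top surface (y = 0) to the root.
control = np.array([[0.0, 0.0], [2.0, 0.0], [2.5, -1.5], [0.0, -2.0]])  # mm, invented
t = np.linspace(0.0, 1.0, 50)
interface = bezier(control, t)   # sampled liquid/solid interface
```

With only a few control points per shape, the inverse Stefan problem reduces to the low-dimensional parameter counts the abstract quotes (2 to 5 in 2-D).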

  14. Modelling and application of stochastic processes

    CERN Document Server

    1986-01-01

    The subject of modelling and application of stochastic processes is too vast to be exhausted in a single volume. In this book, attention is focused on a small subset of this vast subject. The primary emphasis is on realization and approximation of stochastic systems. Recently there has been considerable interest in the stochastic realization problem, and hence, an attempt has been made here to collect in one place some of the more recent approaches and algorithms for solving the stochastic realization problem. Various different approaches for realizing linear minimum-phase systems, linear nonminimum-phase systems, and bilinear systems are presented. These approaches range from time-domain methods to spectral-domain methods. An overview of the chapter contents briefly describes these approaches. Also, in most of these chapters special attention is given to the problem of developing numerically efficient algorithms for obtaining reduced-order (approximate) stochastic realizations. On the application side,...

  15. Mechanical-mathematical modeling for landslide process

    Science.gov (United States)

    Svalova, V.

    2009-04-01

    500 m and displacement of a landslide in plan of over 1 m. The last serious activation of the landslide took place in 2002, with a motion of 53 cm. Catastrophic activation of the deep blockglide landslide in the Khoroshevo area of Moscow took place in 2006-2007. A crack 330 m long appeared in the old sliding circus, along which a new creeping block, 220 m long, separated from the plateau and began sinking, with the displaced surface of the plateau reaching 12 m. Such activation of the landslide process had not been observed in Moscow since the mid-XIX century. The Khoroshevo sliding area had been stable for a long time without manifestations of activity. Revealing the reasons for the deformation and developing means of protection against deep landslide motions is an extremely important and difficult problem, whose solution is necessary for the preservation of valuable historical monuments and modern city constructions. The reasons for the activation and protective measures are discussed. The structure of a monitoring system for urban territories is elaborated. A mechanical-mathematical model of a highly viscous fluid was used for modeling the behavior of matter on landslide slopes. The equation of continuity and an approximated Navier-Stokes equation for slow motions in a thin layer were used. The results of the modelling make it possible to define the place of highest velocity on the landslide surface, which could be the best position for a monitoring post. The model can be used for calibration of monitoring equipment and makes it possible to investigate some fundamental aspects of the movement of matter on a landslide slope.
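For a gravity-driven viscous layer of this kind, the thin-layer approximation yields a parabolic downslope velocity profile with its maximum at the free surface, which is the basis for choosing monitoring-post positions. A sketch with illustrative material parameters (density, viscosity and geometry are assumptions, not the paper's values):

```python
import numpy as np

def velocity_profile(z, h, slope_deg, rho=1800.0, mu=1e10, g=9.81):
    """Downslope velocity u(z) of a gravity-driven viscous layer with a no-slip
    base (z = 0) and a stress-free surface (z = h):
        u(z) = (rho * g * sin(theta) / mu) * (h*z - z**2 / 2)
    rho and mu are illustrative values for landslide material."""
    theta = np.radians(slope_deg)
    return rho * g * np.sin(theta) / mu * (h * z - z**2 / 2.0)

h = 10.0                          # layer thickness, m (assumed)
z = np.linspace(0.0, h, 101)      # height above the sliding base
u = velocity_profile(z, h, slope_deg=10.0)   # m/s
```

The maximum of `u` at `z = h` illustrates the statement above that the model identifies the place of highest velocity on the landslide surface.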

  16. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    Full Text Available In general, any activity requires a long sequence of actions, often characterized by a degree of uncertainty or insecurity with respect to the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the different variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods while covering all the issues relevant to management decision analysis over a realistic economic horizon. Often in such cases, the simulation technique is considered the only alternative available. Using simulation techniques to study real-world systems often requires laborious work. Carrying out a simulation experiment is a process that takes place in several stages.

  17. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported by Discrete Event Simulation (DES). To that end, the basics of the KANBAN system are first presented, with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is then presented. The paper concludes with a practical example which shows that, by understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created to serve as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.
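
The combination of a kanban loop with discrete-event simulation can be illustrated with a minimal sketch. The example below is not the paper's model: it simulates a single production stage where a machine may only produce when a free kanban card authorises it, and estimates the demand fill rate. All rates and the card count are assumed.

```python
import random

# Minimal discrete-event sketch of a one-stage kanban loop (illustrative only).
# Each of N_KANBAN cards is either attached to a finished part in stock or
# free (authorising production). Demand withdraws stock and frees a card.
random.seed(1)

N_KANBAN = 3        # number of kanban cards = max stock (assumed)
PROC_MEAN = 1.0     # mean processing time (assumed)
DEMAND_MEAN = 1.2   # mean demand inter-arrival time (assumed)

t = 0.0
stock = N_KANBAN            # start with a full supermarket
free_cards = 0
next_done = float("inf")    # machine idle
next_demand = random.expovariate(1.0 / DEMAND_MEAN)
served = lost = 0

while t < 10_000.0:
    t = min(next_done, next_demand)
    if next_done <= next_demand:        # machine finishes a part
        stock += 1
        free_cards -= 1                 # card re-attaches to the finished part
        next_done = float("inf")
    else:                               # a demand arrives
        if stock > 0:
            stock -= 1
            free_cards += 1             # card returns, authorising production
            served += 1
        else:
            lost += 1                   # stockout: demand lost
        next_demand = t + random.expovariate(1.0 / DEMAND_MEAN)
    if next_done == float("inf") and free_cards > 0:
        next_done = t + random.expovariate(1.0 / PROC_MEAN)   # start next part

fill_rate = served / (served + lost)
print(f"fill rate with {N_KANBAN} kanbans: {fill_rate:.3f}")
```

Re-running the loop for different values of `N_KANBAN` is exactly the kind of short experiment the abstract describes: the simulation exposes the trade-off between the number of cards (inventory) and the fill rate.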

  18. Elliptic Determinantal Processes and Elliptic Dyson Models

    Science.gov (United States)

    Katori, Makoto

    2017-10-01

    We introduce seven families of stochastic systems of interacting particles in one dimension corresponding to the seven families of irreducible reduced affine root systems. We prove that they are determinantal in the sense that all spatio-temporal correlation functions are given by determinants controlled by a single function called the spatio-temporal correlation kernel. For the four families {A}_{N-1}, {B}_N, {C}_N and {D}_N, we identify the systems of stochastic differential equations solved by these determinantal processes, which will be regarded as the elliptic extensions of the Dyson model. Here we use the notion of martingales in probability theory and the elliptic determinant evaluations of the Macdonald denominators of irreducible reduced affine root systems given by Rosengren and Schlosser.

  19. [Standardization and modeling of surgical processes].

    Science.gov (United States)

    Strauss, G; Schmitz, P

    2016-12-01

    Due to the technological developments around the operating room, surgery in the twenty-first century is undergoing a paradigm shift. Which technologies have already been integrated into the surgical routine? How can a favorable cost-benefit balance be achieved by the implementation of new software-based assistance systems? This article presents the state-of-the-art technology as exemplified by a semi-automated operation system for otorhinolaryngology surgery. The main focus is on systems for the implementation of digital handbooks and navigational functions in situ. On the basis of continuous development in digital imaging, decisions may be facilitated by individual patient models, thus allowing procedures to be optimized. The ongoing digitization and linking of all relevant information enable a high level of standardization in terms of operating procedures. This may be used by assistance systems as a basis for complete documentation and high process reliability. Automation of processes in the operating room results in an increase in quality, precision and standardization, so that the effectiveness and efficiency of treatment can be improved; however, care must be taken that detrimental consequences, such as loss of skills and overreliance on technology, are avoided through adapted training concepts.

  20. Numerical approaches to expansion process modeling

    Directory of Open Access Journals (Sweden)

    G. V. Alekseev

    2017-01-01

    Full Text Available Forage production is currently undergoing a period of intensive renovation and introduction of the most advanced technologies and equipment. Methods such as barley toasting, grain extrusion, steaming and flattening of grain, fluidized-bed explosion, infrared treatment of cereals and legumes followed by flattening, and one-time or two-time granulation of purified whole grain without humidification in matrix presses with grinding of the granules are used more and more often. These methods require special apparatuses, machines and auxiliary equipment, designed on the basis of mathematical models compiled by different methods. In roasting, simulation of the heat fields arising in the working chamber provides conditions under which a portion of the starch decomposes to monosaccharides, which makes the grain sweetish, although protein denaturation somewhat reduces protein digestibility and amino acid availability. Grain is roasted mainly for young animals, in order to accustom them to eating feed at an early age, stimulate the secretory activity of digestion, and promote better development of the masticatory muscles. In addition, the high temperature is detrimental to bacterial contamination and various types of fungi, which largely avoids possible diseases of the gastrointestinal tract. This method has found wide application directly on farms. Legumes such as peas, soy, lupine and lentils are also used in animal feeding. These feeds are preliminarily ground and then cooked for 1 hour or steamed for 30-40 minutes in the feed mill. Such processing inactivates the anti-nutrients they contain, which otherwise reduce the effectiveness of their use. After processing, legumes are used as protein supplements in an amount of 25-30% of the total nutritional value of the diet. However, only grain of good quality should be cooked or steamed. A poor-quality grain that has been stored for a long time and damaged by pathogenic micro flora is subject to

  1. Towards a data processing plane: An automata-based distributed dynamic data processing model

    NARCIS (Netherlands)

    Cushing, R.; Belloum, A.; Bubak, M.; de Laat, C.

    Data processing complexity, partitionability, locality and provenance play a crucial role in the effectiveness of distributed data processing. The dynamics of data processing necessitate effective modeling that allows understanding of, and reasoning about, the fluidity of data processing. Through

  2. Business Process Simulation: Requirements for Business and Resource Models

    OpenAIRE

    Audrius Rima; Olegas Vasilecas

    2015-01-01

    The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of a business process. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  3. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of a business process. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  4. Computer Forensics Field Triage Process Model

    Directory of Open Access Journals (Sweden)

    Marcus K. Rogers

    2006-06-01

    Full Text Available With the proliferation of digital based evidence, the need for the timely identification, analysis and interpretation of digital evidence is becoming more crucial. In many investigations critical information is required while at the scene or within a short period of time - measured in hours as opposed to days. The traditional cyber forensics approach of seizing a system(s)/media, transporting it to the lab, making a forensic image(s), and then searching the entire system for potential evidence, is no longer appropriate in some circumstances. In cases such as child abduction, pedophilia, and missing or exploited persons, time is of the essence. In these types of cases, investigators dealing with the suspect or crime scene need investigative leads quickly; in some cases it is the difference between life and death for the victim(s). The Cyber Forensic Field Triage Process Model (CFFTPM) proposes an onsite or field approach for providing the identification, analysis and interpretation of digital evidence in a short time frame, without the requirement of having to take the system(s)/media back to the lab for an in-depth examination or of acquiring a complete forensic image(s). The proposed model adheres to commonly held forensic principles, and does not negate the ability, once the initial field triage is concluded, to transport the system(s)/storage media back to a lab environment for a more thorough examination and analysis. The CFFTPM has been successfully used in various real world cases, and its investigative importance and pragmatic approach have been amply demonstrated. Furthermore, the derived evidence from these cases has not been challenged in the court proceedings where it has been introduced. The current article describes the CFFTPM in detail, discusses the model's forensic soundness, investigative support capabilities and practical considerations.

  5. The Formalization of the Business Process Modeling Goals

    Directory of Open Access Journals (Sweden)

    Ligita Bušinska

    2016-10-01

    Full Text Available In business process modeling the de facto standard BPMN has emerged. However, applications of this notation use many subsets of its elements and various extensions. BPMN also coexists with many other modeling languages, forming a large set of available options for business process modeling languages and dialects. While, in general, the modeler's goal is a central notion in the choice of modeling languages and notations, most research that proposes guidelines, techniques, and methods for evaluating and/or selecting business process modeling languages does not formalize the business process modeling goal or take it into account transparently. To overcome this gap, and to explicate and help handle business process modeling complexity, an approach to formalizing the business process modeling goal, and a supporting three-dimensional business process modeling framework, are proposed.

  6. Modelling of fiberglass pipe destruction process

    Directory of Open Access Journals (Sweden)

    А. К. Николаев

    2017-03-01

    Full Text Available The article deals with an important current issue for the oil and gas industry: the use of tubes made of high-strength, corrosion-resistant composite materials. In order to improve the operational safety of industrial pipes it is feasible to use composite fiberglass tubes. More than half of the accidents at oil and gas sites happen in oil gathering systems due to the high corrosiveness of the pumped fluid. To reduce the number of accidents and improve environmental protection, the issue of industrial pipe durability must be solved. This problem can be addressed by using composite fiberglass materials, which have the physical and mechanical properties required for oil pipes. Durability and strength can be controlled through the fiberglass winding method and the number of layers in the composite material, together with the high corrosion resistance of fiberglass. The use of high-strength composite materials in oil production is economically feasible: fiberglass pipe production is cheaper than steel pipe production, and fiberglass has a low volume weight, which simplifies pipe transportation and installation. In order to assess the efficiency of using high-strength composite materials at oil production sites, we investigated their physical-mechanical properties and modelled the fiberglass pipe destruction process.
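
A first-order strength check of the kind underlying such pipe studies can be sketched with Barlow's formula for hoop stress, σ = pD/(2t). The pressure, diameter, strength and safety factor below are illustrative assumptions for a generic glass-fibre reinforced pipe, not values from the article.

```python
# Minimum wall thickness of a pressurised pipe from Barlow's formula
#   sigma = p * D / (2 * t)  =>  t_min = p * D / (2 * sigma_allow).
# All material and load figures are assumed, for illustration only.

p = 4.0e6                    # internal pressure, Pa (40 bar, assumed)
D = 0.20                     # outer diameter, m (assumed)
sigma_allow = 120e6 / 4.0    # hoop strength 120 MPa / safety factor 4 (assumed)

t_min = p * D / (2.0 * sigma_allow)   # minimum wall thickness, m
print(f"minimum wall thickness: {t_min * 1000:.1f} mm")
```

Such a calculation shows how winding pattern and layer count, which set the allowable hoop strength, translate directly into the required wall thickness.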

  7. Atmospheric pollution. From processes to modelling

    International Nuclear Information System (INIS)

    Sportisse, B.

    2008-01-01

    Air quality, greenhouse effect, ozone hole, chemical or nuclear accidents... All these phenomena are tightly linked to the chemical composition of the atmosphere and to the atmospheric dispersion of pollutants. This book aims at supplying the main elements for an understanding of atmospheric pollution: the stakes, the physical processes involved, and the role of scientific expertise in decision making. Content: 1 - classifications and scales: chemical composition of the atmosphere, vertical structure, time scales (transport, residence); 2 - matter/light interaction: notions of radiative transfer, application to the Earth's atmosphere; 3 - some elements about the atmospheric boundary layer: notion of scales in meteorology, atmospheric boundary layer (ABL), thermal stratification and stability, description of ABL turbulence, elements of atmospheric dynamics, some elements about the urban climate; 4 - notions of atmospheric chemistry: characteristics, stratospheric ozone chemistry, tropospheric ozone chemistry, brief introduction to indoor air quality; 5 - aerosols, clouds and rains: aerosols and particulates, aerosols and clouds, acid rains and leaching; 6 - towards numerical simulation: equation of reactive dispersion, numerical methods for chemistry-transport models, numerical resolution of the general dynamic equation of aerosols (GDE), modern simulation chains, perspectives. (J.S.)

  8. Mathematical Modelling of Coal Gasification Processes

    Science.gov (United States)

    Sundararajan, T.; Raghavan, V.; Ajilkumar, A.; Vijay Kumar, K.

    2017-07-01

    Coal is by far the most commonly employed fuel for electrical power generation around the world. While combustion could be the route for coal utilization for high grade coals, gasification becomes the preferred process for low grade coals having a higher composition of volatiles or ash. Indian coals suffer from high ash content, nearly 50% by weight in some cases. Instead of transporting such high ash coals, it is more energy efficient to gasify the coal and transport the product syngas. Integrated Gasification Combined Cycle (IGCC) plants and underground gasification of coal have become attractive technologies for the best utilization of high ash coals. Gasification can be carried out in fixed beds, fluidized beds and entrained beds; faster rates of gasification are possible in fluidized beds and entrained flow systems, because of the small particle sizes and higher gas velocities. The media employed for gasification may involve air/oxygen and steam. Use of oxygen yields a syngas of relatively higher calorific value because of the absence of nitrogen. Sequestration of the carbon dioxide after combustion of the syngas is also easier if oxygen is used for gasification. Addition of steam can increase the hydrogen yield in the syngas and thereby also increase the calorific value. Gasification in the presence of suitable catalysts can increase the proportion of methane in the product gas. Several competing heterogeneous and homogeneous reactions occur during coal gasification: reactions of the solid char constitute the major heterogeneous reaction pathways, while interactions between carbon monoxide, oxygen, hydrogen, water vapour, methane and carbon dioxide result in several simultaneous gas-phase (homogeneous) reactions. The overall product composition of the coal gasification process depends on the input reactant composition, the particle size and type of gasifier, and the pressure and temperature of the gasifier. The use of catalysts can also selectively change the product composition. At IIT Madras, over the last one decade, both
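
The effect of the gasification medium on syngas quality noted above can be illustrated with a rough heating-value calculation. The per-component lower heating values (MJ/Nm³) are standard handbook figures; the two gas compositions are illustrative assumptions, not results from the article.

```python
# Rough lower-heating-value comparison of syngas from air-blown vs
# oxygen-blown gasification. LHVs per combustible component are standard
# handbook values; the compositions are assumed for illustration.

LHV = {"H2": 10.8, "CO": 12.6, "CH4": 35.8}   # MJ/Nm^3

def syngas_lhv(frac):
    """LHV of a gas mixture from volume fractions of combustible species."""
    return sum(frac.get(k, 0.0) * v for k, v in LHV.items())

air_blown = {"H2": 0.15, "CO": 0.20, "CH4": 0.02}   # balance N2/CO2 (assumed)
oxy_blown = {"H2": 0.30, "CO": 0.40, "CH4": 0.05}   # little N2 (assumed)

print(f"air-blown:    {syngas_lhv(air_blown):.1f} MJ/Nm^3")
print(f"oxygen-blown: {syngas_lhv(oxy_blown):.1f} MJ/Nm^3")
```

The nitrogen dilution in the air-blown case roughly halves the calorific value of the product gas, which is the point made in the abstract.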

  9. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil

    2009-01-01

    A separation process could be defined as a process that transforms a given mixture of chemicals into two or more compositionally distinct end-use products. One way to design these separation processes is to employ a model-based approach, where mathematical models that reliably predict the process behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures is considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation, are presented. The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding ...
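
The kind of flux relation at the core of membrane gas-separation models can be sketched with the standard solution-diffusion description, J_i = (P_i/l)(p_f·x_i − p_p·y_i). This is not the paper's general model; the permeances, pressures and compositions below are illustrative assumptions for a CO2/CH4 separation.

```python
# Solution-diffusion sketch for membrane gas separation:
#   J_i = (P_i / l) * (p_feed * x_i - p_perm * y_i)
# with permeance P_i/l given directly. All values are assumed.

GPU = 3.35e-10   # mol / (m^2 s Pa), one gas permeation unit

permeance = {"CO2": 1000 * GPU, "CH4": 25 * GPU}    # assumed membrane
p_feed, p_perm = 30e5, 1e5                          # feed/permeate pressure, Pa
x = {"CO2": 0.10, "CH4": 0.90}                      # feed mole fractions
y = {"CO2": 0.70, "CH4": 0.30}                      # permeate fractions (assumed)

flux = {k: permeance[k] * (p_feed * x[k] - p_perm * y[k]) for k in x}
selectivity = permeance["CO2"] / permeance["CH4"]   # ideal selectivity
print(f"CO2 flux {flux['CO2']:.2e} mol/m2/s, ideal selectivity {selectivity:.0f}")
```

A process-level model then integrates such local flux expressions over the membrane area together with mass balances on the feed and permeate sides.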

  10. Process modeling - It's history, current status, and future

    Science.gov (United States)

    Duttweiler, Russell E.; Griffith, Walter M.; Jain, Sulekh C.

    1991-04-01

    The development of process modeling is reviewed to examine the potential of process applications to prevent and solve problems associated with the aerospace industry. The business and global environments are assessed, and the traditional approach to product/process design is argued to be obsolete. A revised engineering process is described which involves planning and prediction before production by means of process simulation. Process simulation can permit simultaneous engineering of unit processes and complex processes, and examples are given of the cross-coupling of forging-process variance. The implementation of process modeling, CAE, and computer simulations is found to reduce the costs and time associated with technological development when incorporated judiciously.

  11. Compliance in Resource-based Process Models

    NARCIS (Netherlands)

    Colombo Tosatto, S.; Elrakaiby, Y.; Ziafati, P.

    2013-01-01

    Execution of business processes often requires resources, the use of which is usually subject to constraints. In this paper, we study the compliance of business processes with resource usage policies. To this end, we relate the execution of a business process to its resource requirements in terms of

  12. Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods

    Science.gov (United States)

    Soroush, Masoud; Weinberger, Charles B.

    2010-01-01

    This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…

  13. Model-Driven and Pattern-Based Integration of Process-Driven SOA Models

    OpenAIRE

    Zdun, Uwe; Dustdar, Schahram

    2006-01-01

    Service-oriented architectures (SOA) are increasingly used in the context of business processes. However, the modeling approaches for process-driven SOAs do not yet sufficiently integrate the various kinds of models relevant for a process-driven SOA -- ranging from process models to software architectural models to software design models. We propose to integrate process-driven SOA models via a model-driven software development approach that is based on proven practices do...

  14. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  15. Model reduction methods for vector autoregressive processes

    CERN Document Server

    Brüggemann, Ralf

    2004-01-01

    1.1 Objective of the Study. Vector autoregressive (VAR) models have become one of the dominant research tools in the analysis of macroeconomic time series during the last two decades. The great success of this modeling class started with Sims' (1980) critique of the traditional simultaneous equation models (SEM). Sims criticized the use of 'too many incredible restrictions' based on 'supposed a priori knowledge' in the large scale macroeconometric models which were popular at that time. Therefore, he advocated largely unrestricted reduced form multivariate time series models, unrestricted VAR models in particular. Ever since his influential paper these models have been employed extensively to characterize the underlying dynamics in systems of time series. In particular, tools to summarize the dynamic interaction between the system variables, such as impulse response analysis or forecast error variance decompositions, have been developed over the years. The econometrics of VAR models and related quantities i...
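
An unrestricted reduced-form VAR of the kind described can be illustrated in a few lines: simulate a bivariate VAR(1), y_t = A·y_{t−1} + ε_t, and recover the coefficient matrix by least squares. The coefficient matrix and noise scale are assumed for illustration.

```python
import numpy as np

# Simulate a stable bivariate VAR(1) and estimate its coefficients by OLS.
rng = np.random.default_rng(0)

A = np.array([[0.5, 0.1],
              [0.2, 0.3]])        # assumed coefficients (eigenvalues inside unit circle)
T = 5000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(size=2) * 0.5

# Stack the regression Y = X A' + E with X = lagged values, and solve by OLS.
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(np.round(A_hat, 2))
```

Impulse responses and forecast error variance decompositions, mentioned in the abstract, are then computed from powers of the estimated `A_hat` and the residual covariance.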

  16. Modeling microbial processes in porous media

    Science.gov (United States)

    Murphy, Ellyn M.; Ginn, Timothy R.

    The incorporation of microbial processes into reactive transport models has generally proceeded along two separate lines of investigation: (1) transport of bacteria as inert colloids in porous media, and (2) the biodegradation of dissolved contaminants by a stationary phase of bacteria. Research over the last decade has indicated that these processes are closely linked. This linkage may occur when a change in metabolic activity alters the attachment/detachment rates of bacteria to surfaces, either promoting or retarding bacterial transport in a groundwater-contaminant plume. Changes in metabolic activity, in turn, are controlled by the time of exposure of the microbes to electron acceptors/donors and other components affecting activity. Similarly, metabolic activity can affect the reversibility of attachment, depending on the residence time of active microbes. Thus, improvements in quantitative analysis of active subsurface biota necessitate direct linkages between substrate availability, metabolic activity, growth, and attachment/detachment rates. This linkage requires both a detailed understanding of the biological processes and robust quantitative representations of these processes that can be tested experimentally. This paper presents an overview of current approaches used to represent physicochemical and biological processes in porous media, along with new conceptual approaches that link metabolic activity with partitioning of the microorganism between the aqueous and solid phases.
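
The link between metabolic activity and aqueous/solid partitioning can be illustrated with a toy two-compartment kinetic sketch. This is not the paper's model: the rate constants and the assumed activity dependence of detachment are hypothetical.

```python
# Toy kinetics linking metabolic activity to cell partitioning between the
# aqueous phase C and the attached phase S:
#   dC/dt = -k_att*C + k_det(act)*S
#   dS/dt =  k_att*C - k_det(act)*S
# where higher metabolic activity `act` raises the detachment rate (assumed).

def simulate(act, k_att=0.5, k_det0=0.05, dt=0.01, t_end=50.0):
    k_det = k_det0 * (1.0 + 4.0 * act)   # activity-enhanced detachment (assumed)
    C, S = 1.0, 0.0                      # all cells start in the aqueous phase
    t = 0.0
    while t < t_end:
        dC = -k_att * C + k_det * S
        C, S = C + dt * dC, S - dt * dC  # explicit Euler; total mass conserved
        t += dt
    return C / (C + S)                   # aqueous fraction near steady state

print(f"aqueous fraction, dormant cells: {simulate(0.0):.2f}")
print(f"aqueous fraction, active cells:  {simulate(1.0):.2f}")
```

At steady state the aqueous fraction is k_det/(k_att + k_det), so activity-enhanced detachment keeps a larger share of cells mobile, which is the qualitative coupling the abstract describes.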

  17. Learning Markov Decision Processes for Model Checking

    DEFF Research Database (Denmark)

    Mao, Hua; Chen, Yingke; Jaeger, Manfred

    2012-01-01

    Constructing an accurate system model for formal model verification can be both resource demanding and time-consuming. To alleviate this shortcoming, algorithms have been proposed for automatically learning system models based on observed system behaviors. In this paper we extend the algorithm...... is performed by analyzing the probabilistic linear temporal logic properties of the system as well as by analyzing the schedulers, in particular the optimal schedulers, induced by the learned models....
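
The core of learning a system model from observed behaviors can be sketched by estimating transition probabilities from execution traces by frequency counting. The state names and traces below are hypothetical, and this sketch omits the state-merging and scheduler analysis the paper deals with.

```python
from collections import Counter, defaultdict

# Estimate Markov-chain transition probabilities from observed traces
# by relative frequency. States and traces are hypothetical.

traces = [
    ["init", "req", "ok", "req", "ok", "done"],
    ["init", "req", "err", "req", "ok", "done"],
    ["init", "req", "ok", "done"],
]

counts = defaultdict(Counter)
for trace in traces:
    for a, b in zip(trace, trace[1:]):   # consecutive state pairs
        counts[a][b] += 1

model = {s: {t: n / sum(c.values()) for t, n in c.items()}
         for s, c in counts.items()}
print(model["req"])   # estimated distribution over successors of "req"
```

A learned model of this kind can then be handed to a probabilistic model checker, with the usual caveat that estimates from few traces carry sampling error.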

  18. Evaluation of EOR Processes Using Network Models

    DEFF Research Database (Denmark)

    Winter, Anatol; Larsen, Jens Kjell; Krogsbøll, Anette

    1998-01-01

    The report consists of the following parts: 1) Studies of wetting properties of model fluids and fluid mixtures aimed at an optimal selection of candidates for micromodel experiments. 2) Experimental studies of multiphase transport properties using physical models of porous networks (micromodels......) including estimation of their "petrophysical" properties (e.g. absolute permeability). 3) Mathematical modelling and computer studies of multiphase transport through pore space using mathematical network models. 4) Investigation of link between pore-scale and macroscopic recovery mechanisms....

  19. Explosive Bubble Modelling by Noncausal Process

    OpenAIRE

    Christian Gouriéroux; Jean-Michel Zakoian

    2013-01-01

    Linear mixed causal and noncausal autoregressive processes often provide a better fit to economic and financial time series than standard causal linear autoregressive processes. Considering the example of the noncausal Cauchy autoregressive process, we show that this might be explained by the special associated nonlinear causal dynamics. Indeed, these causal dynamics can include unit roots, bubble phenomena, or the asymmetric cycles often observed in financial markets. The noncausal Cauchy...

  20. Dynamic modeling of ultrafiltration membranes for whey separation processes

    NARCIS (Netherlands)

    Saltık, M.B.; Özkan, Leyla; Jacobs, Marc; Padt, van der Albert

    2017-01-01

    In this paper, we present a control relevant rigorous dynamic model for an ultrafiltration membrane unit in a whey separation process. The model consists of a set of differential algebraic equations and is developed for online model based applications such as model based control and process

  1. Modeling Large Time Series for Efficient Approximate Query Processing

    DEFF Research Database (Denmark)

    Perera, Kasun S; Hahmann, Martin; Lehner, Wolfgang

    2015-01-01

    -wise aggregation to derive the models. These models are initially created from the original data and are kept in the database along with it. Subsequent queries are answered using the stored models rather than scanning and processing the original datasets. In order to support model query processing, we maintain...
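
The idea sketched above, answering aggregate queries from stored models rather than from the raw data, can be illustrated with the simplest possible "model": one summary per fixed-width segment. The data, segment width and proration rule are illustrative assumptions, not the techniques of the paper.

```python
# Model-based approximate query processing sketch: summarise a large series
# per segment, then answer a range-SUM query from the summaries alone.

data = [float(i % 97) for i in range(100_000)]   # stand-in "large" series
SEG = 1000                                       # segment width (assumed)

# "Model" construction: (count, sum) per fixed-width segment.
models = [(len(s), sum(s)) for s in
          (data[i:i + SEG] for i in range(0, len(data), SEG))]

def approx_sum(lo, hi):
    """SUM over [lo, hi) from segment models: exact at segment boundaries,
    prorated inside partially covered segments."""
    total = 0.0
    for k, (n, s) in enumerate(models):
        a, b = k * SEG, k * SEG + n
        overlap = max(0, min(hi, b) - max(lo, a))
        if overlap:
            total += s * overlap / n   # prorate the segment's sum
    return total

exact = sum(data[2000:52000])
print(f"approx {approx_sum(2000, 52000):.0f} vs exact {exact:.0f}")
```

The query touches 100 tuples of summary state instead of 50,000 raw values; richer models (e.g. per-segment regressions) trade storage for accuracy inside partial segments.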

  2. Regime-switching models to study psychological process

    NARCIS (Netherlands)

    Hamaker, E.L.; Grasman, R.P.P.P.; Kamphuis, J.H.

    2010-01-01

    Many psychological processes are characterized by recurrent shifts between different states. To model these processes at the level of the individual, regime-switching models may prove useful. In this chapter we discuss two of these models: the threshold autoregressive model and the Markov
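
A threshold autoregressive process of the kind mentioned can be illustrated in a few lines: the series follows one AR(1) regime below a threshold and another above it. The coefficients, threshold and noise scale are illustrative assumptions.

```python
import numpy as np

# Two-regime threshold autoregressive (TAR) sketch: persistence of x_t
# depends on which side of the threshold the previous value fell.
rng = np.random.default_rng(42)

T, threshold = 500, 0.0
x = np.zeros(T)
for t in range(1, T):
    phi = 0.9 if x[t - 1] <= threshold else 0.3   # regime-dependent AR coefficient
    x[t] = phi * x[t - 1] + rng.normal(scale=0.5)

below = np.mean(x <= threshold)
print(f"fraction of time below threshold: {below:.2f}")
```

For an individual's data, estimation would proceed by searching over the threshold and fitting each regime's AR coefficients; a Markov regime-switching model instead lets an unobserved state govern the regime.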

  3. Lipid Processing Technology: Building a Multilevel Modeling Network

    DEFF Research Database (Denmark)

    Díaz Tovar, Carlos Axel; Mustaffa, Azizul Azri; Mukkerikar, Amol

    2011-01-01

    The aim of this work is to present the development of a computer aided multilevel modeling network for the systematic design and analysis of processes employing lipid technologies. This is achieved by decomposing the problem into four levels of modeling: i) pure component property modeling and a lipid-database of collected experimental data from industry and generated data from validated predictive property models, as well as modeling tools for fast adoption-analysis of property prediction models; ii) modeling of the phase behavior of relevant lipid mixtures using the UNIFAC-CI model, development of a master parameter table; iii) development of a model library consisting of new and adopted process models of unit operations involved in lipid processing technologies, validation of the developed models using operating data collected from existing process plants, and application of validated models ...

  4. Property Modelling for Applications in Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Physical-chemical properties of pure chemicals and their mixtures play an important role in the design of chemicals based products and the processes that manufacture them. Although the use of experimental data in design and analysis of chemicals based products and their processes is desirable ... such as database, property model library, model parameter regression, and property-model based product-process design will be presented. The database contains pure component and mixture data for a wide range of organic chemicals. The property models are based on the combined group contribution and atom ... modeling tools in design and analysis of chemical product-process design, including biochemical processes, will be highlighted.

  5. Modelling of injection processes in ladle metallurgy

    NARCIS (Netherlands)

    Visser, H.

    2016-01-01

    Ladle metallurgical processes constitute a portion of the total production chain of steel from iron ore. With these batch processes, the hot metal or steel transfer ladle is being used as a reactor vessel and a reagent is often injected in order to bring the composition of the hot metal or steel to

  6. The Model of the Production Process for the Quality Management

    Directory of Open Access Journals (Sweden)

    Alot Zbigniew

    2017-02-01

    Full Text Available This article is a result of research on models of production processes for quality management and their identification. It discusses the classical model and the indicators for evaluating process capability, taking as its starting point the assumption of a normal distribution of the process characteristics. The division of process types proposed by the ISO 21747:2006 standard, which introduces models for non-stationary processes, is presented. A general process model that allows, in any real case, a precise description of the statistical characteristics of the process is proposed. Compared to the model proposed by the ISO 21747:2006 standard, it gives the opportunity for a more detailed description of the process characteristics and for determining the process capability. This model comprises the type of process, its statistical distribution, and the method for determining the capability and performance (long-term capability) of the process. One of the elements of the model is a proposed original classification and the resulting set of process types. The classification follows the recommendations of ISO 21747:2006 introducing models for non-stationary processes. However, the set of process types allows not only a more precise description of the process characteristics but also its use for monitoring the process.
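
The classical capability indicators mentioned above can be illustrated with the standard Cp and Cpk formulas under the normality assumption. The measurement data and specification limits below are illustrative, not from the article.

```python
import statistics

# Classical process capability indices under the normal-distribution
# assumption:  Cp = (USL - LSL) / (6*sigma),
#              Cpk = min(USL - mu, mu - LSL) / (3*sigma).
# Data and specification limits are assumed for illustration.

data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.1, 10.0, 9.9]
LSL, USL = 9.0, 11.0                 # specification limits (assumed)

mu = statistics.mean(data)
sigma = statistics.stdev(data)       # sample standard deviation

Cp = (USL - LSL) / (6 * sigma)
Cpk = min(USL - mu, mu - LSL) / (3 * sigma)
print(f"Cp = {Cp:.2f}, Cpk = {Cpk:.2f}")
```

Cp measures potential capability from spread alone, while Cpk also penalises off-centre processes; for the non-stationary process types of ISO 21747:2006 these formulas must be replaced by time-dependent performance estimates.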

  7. Modelling the Active Hearing Process in Mosquitoes

    Science.gov (United States)

    Avitabile, Daniele; Homer, Martin; Jackson, Joe; Robert, Daniel; Champneys, Alan

    2011-11-01

    A simple microscopic mechanistic model is described of the active amplification within the Johnston's organ of the mosquito species Toxorhynchites brevipalpis. The model is based on a description of the antenna as a forced-damped oscillator coupled to a set of active threads (ensembles of scolopidia) that provide an impulsive force when they twitch. This twitching is in turn controlled by channels that are opened and closed if the antennal oscillation reaches a critical amplitude. The model matches recent experiments both qualitatively and quantitatively. New results are presented using mathematical homogenization techniques to derive a mesoscopic model as a simple oscillator with nonlinear force and damping characteristics. It is shown how the results from this new model closely resemble those from the microscopic model as the number of threads approaches physiologically correct values.
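
The passive backbone of such a model, the antenna as a forced-damped oscillator, can be integrated numerically in a few lines. The parameters below are illustrative, not the fitted values from the paper, and the active thread forcing is omitted.

```python
import math

# Forced-damped oscillator sketch of the antenna:
#   x'' + 2*zeta*w0*x' + w0**2 * x = F*cos(w*t),
# integrated with a semi-implicit Euler step. Parameters are assumed.

w0 = 2 * math.pi * 400.0   # natural frequency, rad/s (assumed)
zeta, F = 0.05, 1.0        # damping ratio and forcing amplitude (assumed)
w = w0                     # drive at resonance

dt, t_end = 1e-6, 0.05
x = v = t = amp = 0.0
while t < t_end:
    a = F * math.cos(w * t) - 2 * zeta * w0 * v - w0**2 * x
    v += dt * a            # semi-implicit Euler: update velocity first
    x += dt * v
    t += dt
    if t > 0.04:           # record amplitude after the transient has decayed
        amp = max(amp, abs(x))

print(f"steady-state amplitude at resonance: {amp:.3e}")
```

At resonance the steady-state amplitude is F/(2ζω0²), which the integration reproduces; the paper's active threads effectively reduce ζ near threshold, sharpening this resonance.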

  8. Study of dissolution process and its modelling

    Directory of Open Access Journals (Sweden)

    Juan Carlos Beltran-Prieto

    2017-01-01

    Full Text Available The use of mathematical concepts and language to describe and represent the interactions and dynamics of a system is known as mathematical modelling. Mathematical modelling finds a huge number of successful applications in a vast range of scientific, social and engineering fields, including biology, chemistry, physics, computer science, artificial intelligence, bioengineering, finance and economics. In this research, we propose a mathematical model that predicts the dissolution of a solid material immersed in a fluid. The developed model can be used to evaluate the rate of mass transfer and the mass transfer coefficient. Further research is expected to use the model as a base for developing models useful to the pharmaceutical industry for gaining information about the dissolution of medicaments in the bloodstream, where this could play a key role in the formulation of medicaments.
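
A common starting point for such dissolution models is a Noyes-Whitney-type rate law; whether it matches the paper's model is not stated, so treat the sketch below, including all parameter values, as an assumed illustration.

```python
import math

# Noyes-Whitney-type dissolution into a well-mixed fluid:
#   dC/dt = (k*A/V) * (Cs - C),  C(0) = 0
# which integrates to C(t) = Cs * (1 - exp(-k*A*t/V)). Values are assumed.

k = 2.0e-5    # mass transfer coefficient, m/s (assumed)
A = 5.0e-3    # solid surface area, m^2 (assumed constant)
V = 1.0e-3    # fluid volume, m^3
Cs = 150.0    # saturation concentration, kg/m^3 (assumed)

def conc(t):
    """Dissolved concentration at time t [s]."""
    return Cs * (1.0 - math.exp(-k * A * t / V))

t_half = math.log(2) * V / (k * A)   # time to reach half saturation
print(f"half-saturation time: {t_half:.0f} s")
```

Fitting measured C(t) data to this curve yields the mass transfer coefficient k, the quantity the abstract says the model is meant to evaluate.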

  9. Bayesian Modeling of Cerebral Information Processing

    OpenAIRE

    Labatut, Vincent; Pastor, Josette

    2001-01-01

    International audience; Modeling explicitly the links between cognitive functions and networks of cerebral areas is necessitated both by the understanding of the clinical outcomes of brain lesions and by the interpretation of activation data provided by functional neuroimaging techniques. At this global level of representation, the human brain can be best modeled by a probabilistic functional causal network. Our modeling approach is based on the anatomical connection pattern, the information ...

  10. Computer modeling of lung cancer diagnosis-to-treatment process.

    Science.gov (United States)

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U; Yu, Xinhua; Faris, Nick; Li, Jingshan

    2015-08-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model of the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their applications in healthcare are introduced, and the approach to deriving a lung cancer diagnosis process model is presented. Similarly, the procedure for deriving closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed.
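The Markov chain style of model mentioned in the abstract can be illustrated with a toy care-pathway chain; the stages and transition probabilities below are invented for illustration and are not taken from the paper.

```python
# Hypothetical care-pathway stages; self-loops model repeat visits/testing.
states = ["referral", "diagnosis", "staging", "treatment", "done"]
P = [
    [0.20, 0.80, 0.00, 0.00, 0.00],  # referral
    [0.00, 0.30, 0.70, 0.00, 0.00],  # diagnosis (0.30 = repeat testing)
    [0.00, 0.00, 0.25, 0.75, 0.00],  # staging
    [0.00, 0.00, 0.00, 0.10, 0.90],  # treatment selection
    [0.00, 0.00, 0.00, 0.00, 1.00],  # done (absorbing state)
]

def step(dist, P):
    """One step of the chain: multiply the distribution by the matrix."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# start every patient at referral and propagate 30 steps (e.g. weeks)
dist = [1.0, 0.0, 0.0, 0.0, 0.0]
for _ in range(30):
    dist = step(dist, P)
```

Quantities like expected time-to-treatment follow from the same matrix (e.g. via the fundamental matrix of the transient states), which is the sort of closed-formula performance evaluation the abstract alludes to.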

  11. An Abstract Model of Historical Processes

    Directory of Open Access Journals (Sweden)

    Michael Poulshock

    2017-06-01

    A theoretical model is presented which provides a way to simulate, at a very abstract level, power struggles in the social world. In the model, agents can benefit or harm each other, to varying degrees and with differing levels of influence. The agents interact over time, using the power they have to try to get more of it, while being constrained in their strategic choices by social inertia. The outcomes of the model are probabilistic. More research is needed to determine whether the model has any empirical validity.
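A minimal sketch, under assumptions of my own, of the style of abstract agent model this record describes: agents interact pairwise, power flows from weaker to stronger, and an inertia parameter constrains how much can change per round. None of these rules are taken from the paper.

```python
import random

def power_struggle(n_agents=10, rounds=200, inertia=0.9, seed=1):
    """Toy power-struggle simulation (illustrative rules, not the paper's):
    each round a random pair interacts and the stronger agent takes a
    fraction of the weaker one's power, damped by social inertia."""
    rng = random.Random(seed)
    power = [1.0] * n_agents
    for _ in range(rounds):
        a, b = rng.sample(range(n_agents), 2)
        if power[a] < power[b]:
            a, b = b, a                       # a is the stronger agent
        transfer = (1 - inertia) * power[b]   # inertia limits the change
        power[a] += transfer
        power[b] -= transfer
    return power

final = power_struggle()
```

Total power is conserved by construction, so the dynamics only redistribute it; with repeated interactions the initially equal agents drift toward an unequal distribution, which is the qualitative behaviour such models are built to explore.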

  12. Modeling and Advanced Control for Sustainable Process ...

    Science.gov (United States)

    This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-inspired, multi-agent-based method. The sustainability and performance assessment of process operating points is carried out using the U.S. E.P.A.’s GREENSCOPE assessment tool that provides scores for the selected economic, material management, environmental and energy indicators. The indicator results supply information on whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous bioethanol fermentation process whose dynamics are characterized by steady-state multiplicity and oscillatory behavior. This book chapter contribution demonstrates the application of novel process control strategies for sustainability by increasing material management, energy efficiency, and pollution prevention, as needed for SHC Sustainable Uses of Wastes and Materials Management.

  13. Modeling of Heating During Food Processing

    Science.gov (United States)

    Zheleva, Ivanka; Kamburova, Veselka

    Heat transfer processes are important for almost all aspects of food preparation and play a key role in determining food safety. Whether it is cooking, baking, boiling, frying, grilling, blanching, drying, sterilizing, or freezing, heat transfer is part of the processing of almost every food. Heat transfer is a dynamic process in which thermal energy is transferred from one body with higher temperature to another body with lower temperature. Temperature difference between the source of heat and the receiver of heat is the driving force in heat transfer.
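The driving-force idea in the paragraph above can be made concrete with a lumped-capacitance sketch (Newton-style heating), where a single illustrative coefficient k stands in for hA/(mc); the temperatures and k are assumed values, not from the chapter.

```python
import math

def heat_food(T0=5.0, T_env=180.0, k=0.01, dt=1.0, t_end=600.0):
    """Lumped-capacitance heating sketch: dT/dt = k*(T_env - T).
    k bundles h*A/(m*c); all numbers are illustrative."""
    T, t = T0, 0.0
    while t < t_end:
        T += k * (T_env - T) * dt   # heat flows down the temperature difference
        t += dt
    return T

T_final = heat_food()
# continuous-limit check: T(t) = T_env + (T0 - T_env) * exp(-k*t)
```

The rate is proportional to the temperature difference between source and receiver, so heating is fast at first and slows as the food approaches the environment temperature, matching the driving-force description above.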

  14. A computational model of human auditory signal processing and perception

    OpenAIRE

    Jepsen, Morten Løve; Ewert, Stephan D.; Dau, Torsten

    2008-01-01

    A model of computational auditory signal-processing and perception that accounts for various aspects of simultaneous and nonsimultaneous masking in human listeners is presented. The model is based on the modulation filterbank model described by Dau et al. [J. Acoust. Soc. Am. 102, 2892 (1997)] but includes major changes at the peripheral and more central stages of processing. The model contains outer- and middle-ear transformations, a nonlinear basilar-membrane processing stage, a hair-cell t...

  15. Multiscale soil-landscape process modeling

    NARCIS (Netherlands)

    Schoorl, J.M.; Veldkamp, A.

    2006-01-01

    The general objective of this chapter is to illustrate the role of soils and geomorphological processes in the multiscale soil-landscape context. Included in this context are the fourth dimension (temporal dimension) and the human role (fifth dimension).

  16. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    When developing an information system, it is important to create clear models and choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule-specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages. It presents a theoretical comparison of business rules and business process modeling languages, comparing the different business process modeling languages and business rules representation languages according to selected modeling aspects. Finally, the best-fitting set of languages is selected for a three-layer framework for business-rule-based software modeling.

  17. Difference-based Model Synchronization in an Industrial MDD Process

    DEFF Research Database (Denmark)

    Könemann, Patrick; Kindler, Ekkart; Unland, Ludger

    2009-01-01

    Models play a central role in model-driven software engineering. There are different kinds of models during the development process, which are related to each other and change over time. Therefore, it is difficult to keep the different models consistent with each other. Consistency of different m...... model versions, and for synchronizing other types of models. The main concern is to apply our concepts to an industrial process, in particular keeping usability and performance in mind. Keywords: Model Differencing, Model Merging, Model Synchronization...

  18. Dynamic process model of a plutonium oxalate precipitator. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.L.; Hammelman, J.E.; Borgonovi, G.M.

    1977-11-01

    In support of LLL material safeguards program, a dynamic process model was developed which simulates the performance of a plutonium (IV) oxalate precipitator. The plutonium oxalate precipitator is a component in the plutonium oxalate process for making plutonium oxide powder from plutonium nitrate. The model is based on state-of-the-art crystallization descriptive equations, the parameters of which are quantified through the use of batch experimental data. The dynamic model predicts performance very similar to general Hanford oxalate process experience. The utilization of such a process model in an actual plant operation could promote both process control and material safeguards control by serving as a baseline predictor which could give early warning of process upsets or material diversion. The model has been incorporated into a FORTRAN computer program and is also compatible with the DYNSYS 2 computer code which is being used at LLL for process modeling efforts.

  19. Dynamic process model of a plutonium oxalate precipitator. Final report

    International Nuclear Information System (INIS)

    Miller, C.L.; Hammelman, J.E.; Borgonovi, G.M.

    1977-11-01

    In support of LLL material safeguards program, a dynamic process model was developed which simulates the performance of a plutonium (IV) oxalate precipitator. The plutonium oxalate precipitator is a component in the plutonium oxalate process for making plutonium oxide powder from plutonium nitrate. The model is based on state-of-the-art crystallization descriptive equations, the parameters of which are quantified through the use of batch experimental data. The dynamic model predicts performance very similar to general Hanford oxalate process experience. The utilization of such a process model in an actual plant operation could promote both process control and material safeguards control by serving as a baseline predictor which could give early warning of process upsets or material diversion. The model has been incorporated into a FORTRAN computer program and is also compatible with the DYNSYS 2 computer code which is being used at LLL for process modeling efforts

  20. MODELLING OF THE PROCESS OF TEACHING READING ENGLISH LANGUAGE PERIODICALS

    Directory of Open Access Journals (Sweden)

    Тетяна Глушко

    2014-07-01

    The article reveals a scientifically substantiated process of teaching reading English-language periodicals in all its components, which are consistently developed and form the interconnection of the structural elements in the process of teaching reading. This process is presented as a few interconnected and interdetermined models: (1) models of the process of acquiring standard and expressive lexical knowledge; (2) models of the process of forming skills to use such vocabulary; (3) models of the development of skills to read texts of different linguistic levels.

  1. MODELING OF MANAGEMENT PROCESSES IN AN ORGANIZATION

    Directory of Open Access Journals (Sweden)

    Stefan Iovan

    2016-05-01

    When driving any major change within an organization, strategy and execution are intrinsic to a project's success. Nevertheless, closing the gap between strategy and execution remains a challenge for many organizations [1]. Companies tend to focus more on execution than strategy for quick results, instead of taking the time needed to understand the parts that make up the whole, so the right execution plan can be put in place to deliver the best outcomes. A large part of this is understanding that business operations don't fit neatly within the traditional organizational hierarchy. Business processes are often messy, collaborative efforts that cross teams, departments and systems, making them difficult to manage within a hierarchical structure [2]. Business process management (BPM) fills this gap by redefining an organization according to its end-to-end processes, so opportunities for improvement can be identified and processes streamlined for growth, revenue and transformation. This white paper provides guidelines on what to consider when using business process applications to solve your BPM initiatives, and the unique capabilities software systems provide that can help ensure both your project's success and the success of your organization as a whole, for the majority of medium and small businesses, big companies and even some governmental organizations [2].

  2. Modelling of additive manufacturing processes: a review and classification

    Science.gov (United States)

    Stavropoulos, Panagiotis; Foteinopoulos, Panagis

    2018-03-01

    Additive manufacturing (AM) is a very promising technology; however, there are a number of open issues related to the different AM processes. The literature on modelling the existing AM processes is reviewed and classified. A categorization of the different AM processes in process groups, according to the process mechanism, has been conducted and the most important issues are stated. Suggestions are made as to which approach is more appropriate according to the key performance indicator desired to be modelled and a discussion is included as to the way that future modelling work can better contribute to improving today's AM process understanding.

  3. Simulation Model Development for Mail Screening Process

    National Research Council Canada - National Science Library

    Vargo, Trish; Marvin, Freeman; Kooistra, Scott

    2005-01-01

    STUDY OBJECTIVE: Provide decision analysis support to the Homeland Defense Business Unit, Special Projects Team, in developing a simulation model to help determine the most effective way to eliminate backlog...

  4. The NPS Virtual Thermal Image Processing Model

    National Research Council Canada - National Science Library

    Lenter, Yucel

    2001-01-01

    ...). The MRTD is a standard performance measure for forward-looking infrared (FLIR) imaging systems. It takes into account thermal imaging system modeling concerns, such as modulation transfer functions...

  5. Task-specific visual cues for improving process model understanding

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Context: Business process models support various stakeholders in managing business processes and designing process-aware information systems. In order to make effective use of these models, they have to be readily understandable. Objective: Prior research has emphasized the potential of visual cues to

  6. A model of the gas analysis system operation process

    Science.gov (United States)

    Yakimenko, I. V.; Kanishchev, O. A.; Lyamets, L. L.; Volkova, I. V.

    2017-12-01

    The characteristic features of modeling the gas-analysis measurement system operation process on the basis of the semi-Markov process theory are discussed. The model of the measuring gas analysis system operation process is proposed, which makes it possible to take into account the influence of the replacement interval, the level of reliability and maintainability and to evaluate the product reliability.

  7. Computer-Aided Multiscale Modelling for Chemical Process Engineering

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Gani, Rafiqul

    2007-01-01

    Chemical processes are generally modeled through monoscale approaches, which, while not adequate, satisfy a useful role in product-process design. In this case, use of a multi-dimensional and multi-scale model-based approach has importance in product-process development. A computer-aided framewor...

  8. Capability Maturity Model (CMM) for Software Process Improvements

    Science.gov (United States)

    Ling, Robert Y.

    2000-01-01

    This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.

  9. Support of Modelling in Process-Engineering Education

    NARCIS (Netherlands)

    Schaaf, van der H.; Vermuë, M.H.; Tramper, J.; Hartog, R.J.M.

    2006-01-01

    An objective of the Process Technology curriculum at Wageningen University is to teach students a stepwise modeling approach in the context of process engineering. Many process-engineering students have difficulty with learning to design a model. Some common problems are lack of structure in the

  10. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    This work presents a modelling framework based on a phenomena description of the process. The approach is taken to ease the understanding and construction of process models in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, and the Modelica modelling language is used to check the proposed concept. The partial results are promising, and the research effort will be extended into a computer-aided modelling environment based on phenomena.

  11. Statistical image processing and multidimensional modeling

    CERN Document Server

    Fieguth, Paul

    2010-01-01

    Images are all around us! The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something - an artery, a road, a DNA marker, an oil spill - from imagery, possibly noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However this book does not so much focus on images, per se, but rather on spatial data sets, with one or more measurements taken over

  12. Sensitivity study of reduced models of the activated sludge process ...

    African Journals Online (AJOL)

    2009-08-07

    ... order to fit the reduced model behaviour to the real data for the process behaviour. Keywords: wastewater treatment, activated sludge process, reduced model, model parameters, sensitivity function, Matlab simulation. Introduction. The problem of effective and optimal control of wastewater treatment plants ...

  13. Process models as tools in forestry research and management

    Science.gov (United States)

    Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox

    2001-01-01

    Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...

  14. Neuro-fuzzy model for evaluating the performance of processes ...

    Indian Academy of Sciences (India)

    In this work an Adaptive Neuro-Fuzzy Inference System (ANFIS) was used to model the periodic performance of some multi-input single-output (MISO) processes, namely: brewery operations (case study 1) and soap production (case study 2) processes. Two ANFIS models were developed to model the performance of the ...

  15. The evolution of process-based hydrologic models

    NARCIS (Netherlands)

    Clark, Martyn P.; Bierkens, Marc F.P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R.N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-01-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this

  16. PtProcess: An R Package for Modelling Marked Point Processes Indexed by Time

    Directory of Open Access Journals (Sweden)

    David Harte

    2010-10-01

    This paper describes the package PtProcess, which uses the R statistical language. The package provides a unified approach to fitting and simulating a wide variety of temporal point process or temporal marked point process models. The models are specified by an intensity function which is conditional on the history of the process. The user needs to provide routines for calculating the conditional intensity function. Then the package enables one to carry out maximum likelihood fitting, goodness of fit testing, simulation and comparison of models. The package includes the routines for the conditional intensity functions for a variety of standard point process models. The package is intended to simplify the fitting of point process models indexed by time in much the same way as generalized linear model programs have simplified the fitting of various linear models. The primary examples used in this paper are earthquake sequences but the package is intended to have a much wider applicability.
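The conditional-intensity formulation the abstract describes can be sketched with Ogata's thinning algorithm for a self-exciting (Hawkes-type) intensity. This is a generic illustration in Python, not code from the R PtProcess package, and the parameter values are assumptions.

```python
import math
import random

def simulate_hawkes(mu=0.5, alpha=0.3, beta=1.0, t_end=100.0, seed=42):
    """Ogata thinning for the conditional intensity
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)).
    alpha/beta < 1 keeps the process subcritical.  Parameters illustrative."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        # intensity just after time t upper-bounds lambda until the next event,
        # because every exponential kernel only decays between events
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)          # candidate inter-event time
        if t >= t_end:
            break
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() * lam_bar <= lam_t:    # accept w.p. lambda(t)/lam_bar
            events.append(t)
    return events

events = simulate_hawkes()
```

The same history-dependent intensity is what one would hand to a package like PtProcess for maximum likelihood fitting; simulation by thinning is the standard counterpart for model checking.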

  17. A compositional process control model and its application to biochemical processes.

    NARCIS (Netherlands)

    Jonker, C.M.; Treur, J.

    1999-01-01

    A compositional generic process control model is presented which has been applied to control enzymatic biochemical processes. The model has been designed at a conceptual and formal level using the compositional development method DESIRE, and includes processes for analysis, planning and simulation.

  18. Beyond dual-process models: A categorisation of processes underlying intuitive judgement and decision making

    NARCIS (Netherlands)

    Glöckner, A.; Witteman, C.L.M.

    2010-01-01

    Intuitive-automatic processes are crucial for making judgements and decisions. The fascinating complexity of these processes has attracted many decision researchers, prompting them to start investigating intuition empirically and to develop numerous models. Dual-process models assume a clear

  19. A Compositional Process Control Model and its Application to Biochemical Processes

    NARCIS (Netherlands)

    Jonker, C.M.; Treur, J.

    2002-01-01

    A compositional generic process control model is presented which has been applied to control enzymatic biochemical processes. The model has been designed at a conceptual and formal level using the compositional development method DESIRE, and includes processes for analysis, planning and simulation.

  20. Business Process Modeling Languages Supporting Collaborative Networks

    NARCIS (Netherlands)

    Soleimani Malekan, H.; Afsarmanesh, H.; Hammoudi, S.; Maciaszek, L.A.; Cordeiro, J.; Dietz, J.L.G.

    2013-01-01

    Formalizing the definition of Business Processes (BPs) performed within each enterprise is fundamental for effective deployment of their competencies and capabilities within Collaborative Networks (CN). In our approach, every enterprise in the CN is represented by its set of BPs, so that other

  1. Model based optimization of MSWC process control

    NARCIS (Netherlands)

    Kessel, L.B.M. van; Leskens, M.

    2002-01-01

    Optimization of municipal solid waste combustion (MSWC) processes is an important issue due to the ever-lasting need for emission reduction, more optimal use of raw materials and overall cost reduction. The key of the approach of TNO (Netherlands Organisation for Applied Scientific Research) to

  2. Understanding Modeling Requirements of Unstructured Business Processes

    NARCIS (Netherlands)

    Allah Bukhsh, Zaharah; van Sinderen, Marten J.; Sikkel, Nicolaas; Quartel, Dick

    2017-01-01

    Management of structured business processes is of interest to both academia and industry, where academia focuses on the development of methods and techniques while industry focuses on the development of supporting tools. With the shift from routine to knowledge work, the relevance of management of

  3. Deconstructing crop processes and models via identities

    DEFF Research Database (Denmark)

    Porter, John Roy; Christensen, Svend

    2013-01-01

    This paper is part review and part opinion piece; it has three parts of increasing novelty and speculation in approach. The first presents an overview of how some of the major crop simulation models approach the issue of simulating the responses of crops to changing climatic and weather variables, mainly atmospheric CO2 concentration and increased and/or varying temperatures. It illustrates an important principle in models of a single cause having alternative effects and vice versa. The second part suggests some features, mostly missing in current crop models, that need to be included in the future, focussing on extreme events such as high temperature or extreme drought. The final opinion part is speculative but novel. It describes an approach to deconstruct resource use efficiencies into their constituent identities or elements based on the Kaya-Porter identity, each of which can...

  4. Model-Based Methods in the Biopharmaceutical Process Lifecycle.

    Science.gov (United States)

    Kroll, Paul; Hofer, Alexandra; Ulonska, Sophia; Kager, Julian; Herwig, Christoph

    2017-12-01

    Model-based methods are increasingly used in all areas of biopharmaceutical process technology. They can be applied in the field of experimental design, process characterization, process design, monitoring and control. Benefits of these methods are lower experimental effort, process transparency, clear rationality behind decisions and increased process robustness. The possibility of applying methods adopted from different scientific domains accelerates this trend further. In addition, model-based methods can help to implement regulatory requirements as suggested by recent Quality by Design and validation initiatives. The aim of this review is to give an overview of the state of the art of model-based methods, their applications, further challenges and possible solutions in the biopharmaceutical process life cycle. Today, despite these advantages, the potential of model-based methods is still not fully exhausted in bioprocess technology. This is due to a lack of (i) acceptance of the users, (ii) user-friendly tools provided by existing methods, (iii) implementation in existing process control systems and (iv) clear workflows to set up specific process models. We propose that model-based methods be applied throughout the lifecycle of a biopharmaceutical process, starting with the set-up of a process model, which is used for monitoring and control of process parameters, and ending with continuous and iterative process improvement via data mining techniques.

  5. Modeling of processing technologies in food industry

    Science.gov (United States)

    Korotkov, V. G.; Sagitov, R. F.; Popov, V. P.; Bachirov, V. D.; Akhmadieva, Z. R.; TSirkaeva, E. A.

    2018-03-01

    Currently, society is facing an urgent need to solve problems of nutrition (products with increased nutritional value) and to develop energy-saving technologies for food products. Mathematical modelling of heat and mass transfer of polymer materials in the extruder has been rather successful in recent years. A mathematical description of movement and heat exchange during extrusion of a gluten-protein-starch-containing material, similar to pasta dough in its structure, was taken as the framework for the mathematical model presented in this paper.

  6. Analysis of Using Resources in Business Process Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Vasilecas Olegas

    2014-12-01

    One of the key purposes of Business Process Model and Notation (BPMN) is to support graphical representation of the process model. However, such models lack support for the graphical representation of the resources that process instances use during simulation or execution. The paper analyzes different methods and their extensions for resource modeling. Further, this article presents a selected set of resource properties that are relevant for resource modeling. The paper proposes an approach that explains how to use the selected set of resource properties to extend process modeling and simulation tools based on BPMN, where business process instances use resources concurrently.

  7. A Comprehensive and Harmonized Digital Forensic Investigation Process Model.

    Science.gov (United States)

    Valjarevic, Aleksandar; Venter, Hein S

    2015-11-01

    Performing a digital forensic investigation (DFI) requires a standardized and formalized process. There is currently neither an international standard nor does a global, harmonized DFI process (DFIP) exist. The authors studied existing state-of-the-art DFIP models and concluded that there are significant disparities pertaining to the number of processes, the scope, the hierarchical levels, and concepts applied. This paper proposes a comprehensive model that harmonizes existing models. An effort was made to incorporate all types of processes proposed by the existing models, including those aimed at achieving digital forensic readiness. The authors introduce a novel class of processes called concurrent processes. This is a novel contribution that should, together with the rest of the model, enable more efficient and effective DFI, while ensuring admissibility of digital evidence. Ultimately, the proposed model is intended to be used for different types of DFI and should lead to standardization. © 2015 American Academy of Forensic Sciences.

  8. Modelling spray drying processes for dairy products

    NARCIS (Netherlands)

    Verdurmen, Ruud E.M.; Straatsma, Han; Verschueren, Maykel; van Haren, Jan; Smit, Erik; Bargeman, Gerrald; de Jong, Peter

    2002-01-01

    NIZO food research (The Netherlands) has been working for the food industry, the dairy industry in particular, for over 50 years. During the past 15 years NIZO food research has put a lot of effort into developing predictive computer models for the food industry. Nowadays the main challenges in the

  9. Mathematical modelling of the calcination process | Olayiwola ...

    African Journals Online (AJOL)

    High quality lime is an essential raw material for Electric Arc Furnaces and Basic Oxygen Furnaces, steelmaking, alumina production etc. Decrease in fuel consumption in metallurgical furnaces is a tremendous opportunity for reduction of greenhouse gas emissions into the atmosphere. In this paper, a mathematical model ...

  10. Process modeling of a HLA research lab

    Science.gov (United States)

    Ribeiro, Bruna G. C.; Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.

    2017-11-01

    Bioinformatics has provided tremendous breakthroughs in the field of molecular biology. All this evolution has generated a large volume of biological data that increasingly requires the use of computing for the analysis and storage of this information. The identification of human leukocyte antigen (HLA) genotypes is critical to the success of organ transplants in humans. HLA typing involves not only laboratory tests but also DNA sequencing, with the participation of several professionals responsible for different stages of the process. Thus, the objective of this paper is to map the main steps in HLA typing in a laboratory specialized in performing such procedures, analyzing each process and proposing solutions to speed up these steps and avoid mistakes.

  11. Modeling bacterial decay coefficient during SSDML process

    Energy Technology Data Exchange (ETDEWEB)

    Sreekrishnan, T.R.; Tyagi, R.D.; Blais, J.F.; Meunier, N.; Cambell, P.G.C. [Univ. de Quebec, Ste-Foy, Quebec (Canada)

    1996-11-01

    The simultaneous sludge digestion and metal leaching (SSDML) process can leach out heavy metals, achieve sludge solids reduction, and eliminate sludge pathogens. The potential for application in the wastewater treatment industry requires a sound knowledge of the system kinetics. The present work targets a better understanding of the qualitative as well as quantitative relationships between solids reduction rate and other parameters such as sludge pH, initial MLSS concentration, and availability of oxygen during the SSDML process. Experiments were carried out in laboratory batch reactors (20 L working volume) as well as in a 4,000 L capacity pilot facility. Based on the results of these experiments, it was concluded that the degradation rate of sludge volatile matter is influenced by (1) sludge pH; (2) availability of oxygen; and (3) initial mixed liquor suspended solids (MLSS) concentration of the sludge. The degradation rate constant for the biodegradable fraction of the mixed liquor volatile suspended solids [MLVSS(B)] was computed for various initial MLVSS concentrations and sludge pH ranges. The value of k_d decreased with decreasing pH in all cases. The effect of initial MLSS concentration on the value of k_d was found to be minimal for the sludge studied. The relation between the sludge pH and k_d for this sludge was expressed in the form of two polynomials. The relations developed were used in conjunction with previous results on the SSDML process kinetics to simulate the overall SSDML process. Results of these simulation studies were found satisfactory when compared to actual experimental results.
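The pH-dependent decay the abstract reports can be sketched as first-order kinetics, X(t) = X0·exp(−k_d·t), with k_d an increasing function of pH. The polynomial below is a made-up stand-in for the two polynomials fitted in the paper; its coefficients are not from the study.

```python
import math

def kd_of_ph(ph):
    """Hypothetical relation between decay coefficient and sludge pH
    (the paper fits two polynomials; these coefficients are invented).
    Returns k_d in 1/day; k_d decreases as pH decreases, as reported."""
    return max(0.0, 0.02 * ph - 0.04)

def degrade(x0, ph, days):
    """First-order decay of biodegradable MLVSS: X(t) = X0 * exp(-k_d * t)."""
    return x0 * math.exp(-kd_of_ph(ph) * days)
```

With such a relation in hand, the overall SSDML simulation reduces to integrating the decay law with k_d updated from the current sludge pH at each step.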

  12. Stochastic Models in the Identification Process

    Czech Academy of Sciences Publication Activity Database

    Slovák, Dalibor; Zvárová, Jana

    2011-01-01

    Roč. 7, č. 1 (2011), s. 44-50 ISSN 1801-5603 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords: identification process * weight-of-evidence formula * coancestry coefficient * beta-binomial sampling formula * DNA mixtures Subject RIV: IN - Informatics, Computer Science http://www.ejbi.eu/images/2011-1/Slovak_en.pdf

  13. A process model of global purchasing

    OpenAIRE

    MATTHYSSENS, Paul; QUINTENS, Lieven; FAES, Wouter

    2003-01-01

    Inward internationalisation has received more and more attention in recent literature. This article contributes to this developing domain by providing a holistic description of the underlying processes of global purchasing. By means of case study research, carried out in eight companies, drivers and inhibitors of globalisation are highlighted. Conditions that could make global purchasing more efficient and effective are suggested. Attention is drawn to key factors on which companies strategie...

  14. A model for dealing with parallel processes in supervision

    OpenAIRE

    Lilja Cajvert

    2011-01-01

    Supervision in social work is essential for successful outcomes when working with clients. In social work, unconscious difficulties may arise, and similar difficulties may occur in supervision as parallel processes. In this article, the development of a practice-based model of supervision to deal with parallel processes in supervision is described. The model has six phases. In the first phase, the focus is on the supervisor’s inner ...

  15. Catastrophe insurance modeled by shot-noise processes

    OpenAIRE

    Schmidt, Thorsten

    2014-01-01

    Shot-noise processes generalize compound Poisson processes in the following way: a jump (the shot) is followed by a decline (noise). This constitutes a useful model for insurance claims in many circumstances; claims due to natural disasters or self-exciting processes exhibit similar features. We give a general account of shot-noise processes with time-inhomogeneous drivers inspired by recent results in credit risk. Moreover, we derive a number of useful results for modeling and pricing with s...
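    A shot-noise path of the kind described, a jump at each Poisson arrival followed by exponential decline, can be simulated in a few lines. The sketch below is a generic constant-rate illustration with illustrative parameter names, not the time-inhomogeneous drivers the paper develops:

    ```python
    import math
    import random

    def shot_noise_path(rate, jump_mean, decay, horizon, seed=0):
        """Build S(t) = sum over arrivals T_i <= t of X_i * exp(-decay*(t - T_i)):
        shots arrive at Poisson times T_i with exponentially distributed sizes
        X_i, and each shot declines exponentially afterwards (the 'noise').
        With decay = 0 this reduces to a compound Poisson process."""
        rng = random.Random(seed)
        shots, t = [], 0.0
        while True:
            t += rng.expovariate(rate)       # next Poisson inter-arrival time
            if t > horizon:
                break
            shots.append((t, rng.expovariate(1.0 / jump_mean)))
        def S(s):
            return sum(x * math.exp(-decay * (s - ti)) for ti, x in shots if ti <= s)
        return S

    S = shot_noise_path(rate=2.0, jump_mean=1.0, decay=0.5, horizon=10.0)
    # S(0.0) is 0 before the first claim arrives, and between shots the
    # path declines, e.g. as claims following a catastrophe are settled.
    ```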

  16. Measures of quality of process models created in BPMN

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-12-01

    Full Text Available Description, documentation, evaluation and redesign of key processes during their execution should be an essential part of the strategic management of any organization. All organizations live in a dynamically changing environment and must therefore adapt their internal processes to market changes. These processes must be described; a suitable way of describing them is the BPMN notation. Once processes are described via BPMN, they should be checked to ensure their expected quality. A system (which could be automated) based on mathematical expressions of the qualitative characteristics of process models (i.e. measures of quality of process models) can support such checks. The research team is trying to design such a tool and bring it into practical use. The aim of this publication is to describe this system, based on measures of the quality of process models, and to answer the associated scientific questions.

  17. Validation of the filament winding process model

    Science.gov (United States)

    Calius, Emilo P.; Springer, George S.; Wilson, Brian A.; Hanson, R. Scott

    1987-01-01

    Tests were performed toward validating the WIND model developed previously for simulating the filament winding of composite cylinders. In these tests two 24 in. long, 8 in. diam and 0.285 in. thick cylinders, made of IM-6G fibers and HBRF-55 resin, were wound at + or - 45 deg angle on steel mandrels. The temperatures on the inner and outer surfaces and inside the composite cylinders were recorded during oven cure. The temperatures inside the cylinders were also calculated by the WIND model. The measured and calculated temperatures were then compared. In addition, the degree of cure and resin viscosity distributions inside the cylinders were calculated for the conditions which existed in the tests.

  18. Modified Invasion Percolation Models for Multiphase Processes

    Energy Technology Data Exchange (ETDEWEB)

    Karpyn, Zuleima [Pennsylvania State Univ., State College, PA (United States)

    2015-01-31

    This project extends current understanding and modeling capabilities of pore-scale multiphase flow physics in porous media. High-resolution X-ray computed tomography imaging experiments are used to investigate structural and surface properties of the medium that influence immiscible displacement. Using experimental and computational tools, we investigate the impact of wetting characteristics, as well as radial and axial loading conditions, on the development of percolation pathways, residual phase trapping and fluid-fluid interfacial areas.

  19. Modeling non-Gaussian time-varying vector autoregressive process

    Data.gov (United States)

    National Aeronautics and Space Administration — We present a novel and general methodology for modeling time-varying vector autoregressive processes which are widely used in many areas such as modeling of chemical...

  20. Modeling Resource Hotspots: Critical Linkages and Processes

    Science.gov (United States)

    Daher, B.; Mohtar, R.; Pistikopoulos, E.; McCarl, B. A.; Yang, Y.

    2017-12-01

    Growing demands for interconnected resources emerge in the form of hotspots of varying characteristics. The business-as-usual allocation model cannot address the current, let alone anticipated, complex and highly interconnected resource challenges we face. A new paradigm for resource allocation must be adopted: one that identifies cross-sectoral synergies and moves away from silos toward recognition and integration of the nexus. Doing so will result in new opportunities for business growth, economic development, and improved social well-being. Solutions and interventions must be multi-faceted; opportunities should be identified with holistic trade-offs in mind. No single solution fits all: different hotspots will require distinct interventions, since hotspots have varying resource constraints, stakeholders, goals and targets. The San Antonio region represents a complex resource hotspot with promising potential: its rapidly growing population, the Eagle Ford shale play, and the major agricultural activity there make it a hotspot with many competing demands. Stakeholders need tools that allow them to knowledgeably address impending resource challenges. This study will identify contemporary WEF nexus questions and critical system interlinkages that will inform the modeling of the tightly interconnected resource systems and stresses, using the San Antonio region as a base; it will conceptualize a WEF nexus modeling framework and develop assessment criteria to inform integrative planning and decision making.

  1. Modelling of chemical reactions in metallurgical processes

    OpenAIRE

    Kinaci, M. Efe; Lichtenegger, Thomas; Schneiderbauer, Simon

    2017-01-01

    Iron-ore reduction has attracted much interest in the last three decades since it can be considered as a core process in steel industry. The iron-ore is reduced to iron with the use of blast furnace and fluidized bed technologies. To investigate the harsh conditions inside fluidized bed reactors, computational tools can be utilized. One such tool is the CFD-DEM method, in which the gas phase reactions and governing equations are calculated in the Eulerian (CFD) side, whereas the particle reac...

  2. Holonic Business Process Modeling in Small to Medium Sized Enterprises

    OpenAIRE

    Nur Budi Mulyono; Tezar Yuliansyah Saputra; Nur Arief Rahmatsyah

    2012-01-01

    Holonic modeling analysis, the application of systems thinking to design, management, and improvement, is used in a novel context for business process modeling. An approach and techniques based on holons and holarchies are presented specifically for process modeling development in small and medium sized enterprises. The fitness of the approach is compared with the well-known reductionist, or task breakdown, approach. The strengths and weaknesses of holonic modeling are discussed with illustrating case exa...

  3. Modeling interdependencies between business and communication processes in hospitals.

    Science.gov (United States)

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, there are no tools available that specialize in modeling information-driven business processes and their consequences for communication between information processing tools. Therefore, we present an approach that facilitates the representation and analysis of business processes and the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.

  4. Detecting Difference between Process Models Based on the Refined Process Structure Tree

    Directory of Open Access Journals (Sweden)

    Jing Fan

    2017-01-01

    Full Text Available The development of mobile workflow management systems (mWfMS) leads to a large number of business process models. In the meantime, the location restrictions embedded in mWfMS may result in different process models for a single business process. In order to help users quickly locate the differences and rebuild the process model, detecting the difference between process models is needed. Existing detection methods either provide a dissimilarity value to represent the difference or use predefined difference templates to generate the result, which cannot reflect the entire composition of the difference. Hence, in this paper, we present a new approach to solve this problem. Firstly, we parse the process models into their corresponding refined process structure trees (PSTs), that is, we decompose a process model into a hierarchy of subprocess models. Then we design a method to convert the PST to its corresponding task-based process structure tree (TPST). As a consequence, the problem of detecting the difference between two process models is transformed into detecting the difference between their corresponding TPSTs. Finally, we obtain the difference between two TPSTs based on a divide and conquer strategy, where the difference is described by an edit script whose cost is kept close to the minimum. An extensive experimental evaluation shows that our method can meet real requirements in terms of precision and efficiency.
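    The idea of describing the difference between two process trees as an edit script can be sketched compactly. The toy below encodes trees as `(label, [children])` and emits relabel/insert/delete operations by naive positional comparison; the paper's TPST method instead matches subtrees so that the script's cost is close to minimal, which this sketch does not attempt:

    ```python
    def diff_trees(t1, t2, path=""):
        """Edit script between two labelled ordered trees, each encoded as
        (label, [children]). Naive positional sketch: compares children by
        index and records relabel/insert/delete operations with tree paths."""
        ops = []
        if t1[0] != t2[0]:
            ops.append(("relabel", path, t1[0], t2[0]))
        c1, c2 = t1[1], t2[1]
        for i in range(max(len(c1), len(c2))):
            p = f"{path}/{i}"
            if i >= len(c1):
                ops.append(("insert", p, c2[i][0]))   # extra node in t2
            elif i >= len(c2):
                ops.append(("delete", p, c1[i][0]))   # node missing from t2
            else:
                ops.extend(diff_trees(c1[i], c2[i], p))
        return ops

    # Two variants of one business process, e.g. due to a location restriction:
    old = ("seq", [("receive order", []), ("ship", [])])
    new = ("seq", [("receive order", []), ("check stock", []), ("ship", [])])
    script = diff_trees(old, new)
    ```

    Because matching is positional, inserting "check stock" shows up as a relabel plus an insert rather than a single minimal insert, which is exactly the cost problem a proper tree-matching strategy addresses.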

  5. Modelling energy spot prices by Lévy semistationary processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole; Benth, Fred Espen; Veraart, Almut

    This paper introduces a new modelling framework for energy spot prices based on Lévy semistationary processes. Lévy semistationary processes are special cases of the general class of ambit processes. We provide a detailed analysis of the probabilistic properties of such models and we show how they are able to capture many of the stylised facts observed in energy markets. Furthermore, we derive forward prices based on our spot price model. As it turns out, many of the classical spot models can be embedded into our novel modelling framework.

  6. A Software Development Simulation Model of a Spiral Process

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.
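    As a rough illustration of the kind of comparison such a model enables, the toy Monte Carlo below contrasts a single-pass process with late, expensive rework against an iterative process with early, cheaper rework plus per-iteration overhead. All cost factors are made-up assumptions, far simpler than a real discrete event simulation of a spiral process:

    ```python
    import random

    def waterfall_cost(scope, rng):
        """Single pass: the full scope is built, then defects surface late and
        rework is expensive. The 20-60% rework band is an assumption."""
        return scope + scope * rng.uniform(0.2, 0.6)

    def spiral_cost(scope, iterations, rng):
        """Scope split across iterations: defects are found early so rework is
        cheap (5-20%), but every iteration pays a fixed planning/review
        overhead. All numbers are assumptions."""
        per = scope / iterations
        overhead = 0.5
        return sum(per * (1 + rng.uniform(0.05, 0.2)) + overhead
                   for _ in range(iterations))

    rng = random.Random(42)
    n = 1000
    avg_waterfall = sum(waterfall_cost(100, rng) for _ in range(n)) / n
    avg_spiral = sum(spiral_cost(100, 4, rng) for _ in range(n)) / n
    # Under these assumptions the spiral's early rework outweighs its overhead.
    ```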

  7. Diff-based model synchronization in an industrial MDD process

    DEFF Research Database (Denmark)

    Kindler, Ekkart; Könemann, Patrick; Unland, Ludger

    of different models is maintained manually in many cases today. This paper presents an approach for automated model differencing, so that the differences between two model versions (called delta) can be extracted and stored. It can then be re-used independently of the models it was created from...... to interactively merge different model versions, and for synchronizing other types of models. The main concern was to apply our concepts to an industrial process, so usability and performance were important issues....

  8. A Garbage Can Model of the Psychological Research Process.

    Science.gov (United States)

    Martin, Joanne

    1981-01-01

    Reviews models commonly used in psychological research, and, particularly, in organizational decision making. An alternative model of organizational decision making is suggested. The model, referred to as the garbage can model, describes a process in which members of an organization collect the problems and solutions they generate by dumping them…

  9. Modeling Grinding Processes as Micro-Machining Operation ...

    African Journals Online (AJOL)

    A computational based model for surface grinding process as a micro-machined operation has been developed. In this model, grinding forces are made up of chip formation force and sliding force. Mathematical expressions for Modeling tangential grinding force and normal grinding force were obtained. The model was ...

  10. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user / developer to identify the processing field at the top of the sequence and to send into the computing module only the data related to the requested result. The remaining data is not relevant and would slow down the processing. The biggest challenge nowadays is to get high quality processing results with reduced computing time and costs. To do so, we must review the processing sequence by adding several modeling tools. The existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  11. Effect of Linked Rules on Business Process Model Understanding

    DEFF Research Database (Denmark)

    Wang, Wei; Indulska, Marta; Sadiq, Shazia

    2017-01-01

    Business process models are widely used in organizations by information systems analysts to represent complex business requirements and by business users to understand business operations and constraints. This understanding is extracted from graphical process models as well as business rules. Prior research advocated integrating business rules into business process models to improve the effectiveness of important organizational activities, such as developing shared understanding, effective communication, and process improvement. However, whether such integrated modeling can improve the understanding of business processes has not been empirically evaluated. In this paper, we report on an experiment that investigates the effect of linked rules, a specific rule integration approach, on business process model understanding. Our results indicate that linked rules are associated with better time efficiency...

  12. Concept of a cognitive-numeric plant and process modelizer

    International Nuclear Information System (INIS)

    Vetterkind, D.

    1990-01-01

    To achieve automatic modeling of plant disturbances and failure limitation procedures, the system's hardware and the present media (water, steam, coolant fluid) are first formalized into fully computable matrices, called topographies. Secondly, a microscopic cellular automaton model, using lattice gases and state transition rules, is combined with a semi-microscopic cellular process model and with a macroscopic model. At the semi-microscopic level, a cellular data compressor, a feature detection device and the Intelligent Physical Element's process dynamics are acting. At the macroscopic level, the Walking Process Elements, a process evolving module, a test-and-manage device and an abstracting process net are involved. Additionally, a diagnosis-coordinating and a countermeasure-coordinating device are used. In order to automatically gain process insights, object transformations, elementary process functions and associative methods are used. Developments of optoelectronic hardware language components are under consideration.

  13. Towards simplification of hydrologic modeling: identification of dominant processes

    Directory of Open Access Journals (Sweden)

    S. L. Markstrom

    2016-11-01

    Full Text Available The Precipitation-Runoff Modeling System (PRMS), a distributed-parameter hydrologic model, has been applied to the conterminous US (CONUS). Parameter sensitivity analysis was used to identify (1) the sensitive input parameters and (2) particular model output variables that could be associated with the dominant hydrologic process(es). Sensitivity values of 35 PRMS calibration parameters were computed using the Fourier amplitude sensitivity test procedure on 110 000 independent hydrologically based spatial modeling units covering the CONUS, and then summarized by process (snowmelt, surface runoff, infiltration, soil moisture, evapotranspiration, interflow, baseflow, and runoff) and model performance statistic (mean, coefficient of variation, and autoregressive lag 1). Identified parameters and processes provide insight into model performance at the location of each unit and allow the modeler to identify the most dominant process on the basis of which processes are associated with the most sensitive parameters. The results of this study indicate that: (1) the choice of performance statistic and output variables has a strong influence on parameter sensitivity, (2) the apparent model complexity to the modeler can be reduced by focusing on those processes that are associated with sensitive parameters and disregarding those that are not, (3) different processes require different numbers of parameters for simulation, and (4) some sensitive parameters influence only one hydrologic process, while others may influence many.
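    The core idea, rank parameters by their influence on a chosen output statistic and drop the processes whose parameters are insensitive, can be illustrated with a much simpler one-at-a-time scheme. The study itself uses the Fourier amplitude sensitivity test (FAST); the toy model and parameter names below are invented for illustration only:

    ```python
    import random

    def toy_model(params):
        """Stand-in for a hydrologic response: strongly driven by 'melt_rate',
        weakly by 'soil_cap', and not at all by 'unused'."""
        return 5.0 * params["melt_rate"] + 0.1 * params["soil_cap"]

    def oat_sensitivity(model, base, spans, n=200, seed=1):
        """One-at-a-time sensitivity proxy: perturb each parameter across its
        range with the others held at base values, and report the variance
        of the model output. Only illustrates the idea of ranking parameters
        by influence; FAST explores the parameter space very differently."""
        rng = random.Random(seed)
        sens = {}
        for name, (lo, hi) in spans.items():
            outs = []
            for _ in range(n):
                p = dict(base)
                p[name] = rng.uniform(lo, hi)
                outs.append(model(p))
            mean = sum(outs) / n
            sens[name] = sum((o - mean) ** 2 for o in outs) / n
        return sens

    base = {"melt_rate": 1.0, "soil_cap": 1.0, "unused": 1.0}
    spans = {k: (0.0, 2.0) for k in base}
    sens = oat_sensitivity(toy_model, base, spans)
    # 'melt_rate' dominates; 'unused' can be disregarded when simplifying.
    ```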

  14. Probabilistic modeling of discourse-aware sentence processing.

    Science.gov (United States)

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  15. A Kinetic Ladle Furnace Process Simulation Model: Effective Equilibrium Reaction Zone Model Using FactSage Macro Processing

    Science.gov (United States)

    Van Ende, Marie-Aline; Jung, In-Ho

    2017-02-01

    The ladle furnace (LF) is widely used in the secondary steelmaking process in particular for the de-sulfurization, alloying, and reheating of liquid steel prior to the casting process. The Effective Equilibrium Reaction Zone model using the FactSage macro processing code was applied to develop a kinetic LF process model. The slag/metal interactions, flux additions to slag, various metallic additions to steel, and arcing in the LF process were taken into account to describe the variations of chemistry and temperature of steel and slag. The LF operation data for several steel grades from different plants were accurately described using the present kinetic model.

  16. Modelling Template for the Development of the Process Flowsheet

    DEFF Research Database (Denmark)

    Fedorova, Marina; Gani, Rafiqul

    2015-01-01

    Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and point to candidates where the experimental effort could be focused. In this contribution a general modelling framework for systematic model building through modelling templates, which supports the reuse of existing models via its tools integration and model import and export capabilities, is presented. Modelling templates... provides building blocks for the templates (generic models previously developed); 3) computer aided methods and tools, that include procedures to perform model translation, model analysis, model verification/validation, model solution and model documentation. In this work, the integrated use of all three...

  17. Case study modelling for an ettringite treatment process ...

    African Journals Online (AJOL)

    The process modelled in this study includes the formation of ettringite and the recovery of gibbsite through the decomposition of recycled ettringite. The modelling of this process was done using PHREEQC and the results presented in this paper are based on the outcome of different case studies that investigated how the ...

  18. Parallel direct solver for finite element modeling of manufacturing processes

    DEFF Research Database (Denmark)

    Nielsen, Chris Valentin; Martins, P.A.F.

    2017-01-01

    The central processing unit (CPU) time is of paramount importance in finite element modeling of manufacturing processes. Because the most significant part of the CPU time is consumed in solving the main system of equations resulting from finite element assemblies, different approaches have been...... developed to optimize solutions and reduce the overall computational costs of large finite element models....

  19. Product Trial Processing (PTP): a model approach from ...

    African Journals Online (AJOL)

    Product Trial Processing (PTP): a model approach from the consumer's perspective. ... Global Journal of Social Sciences ... Among the constructs used in the model of the consumer's processing of product trial are: experiential and non-experiential attributes, perceived validity of product trial, consumer perceived expertise, ...

  20. MODELING OF AUTOMATION PROCESSES CONCERNING CROP CULTIVATION BY AVIATION

    Directory of Open Access Journals (Sweden)

    V. I. Ryabkov

    2010-01-01

    Full Text Available The paper considers modeling of automation processes concerning crop cultivation by aviation. Processes that take place in three interconnected environments (human, technical, and movable air objects) are described by a model based on set theory. A stochastic network theory of queueing (mass service) systems is proposed for describing the real-time human-machine system.

  1. DEVELOPMENT OF SMALL-SCALE CONSTRUCTION ENTERPRISE PROCESS MANAGEMENT MODEL

    OpenAIRE

    E. V. Folomeev

    2012-01-01

    The process approach is one of the most effective ways of managing construction companies. Using models based on this approach, the company’s structure becomes flexible enough to be quickly tuned, functionally and structurally, for specific projects. This article demonstrates how to develop a process management model for a small-scale construction company.

  2. Measurement and modeling of advanced coal conversion processes

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G. (Advanced Fuel Research, Inc., East Hartford, CT (United States)); Smoot, L.D.; Brewster, B.S. (Brigham Young Univ., Provo, UT (United States))

    1991-09-25

    The objectives of this study are to establish the mechanisms and rates of basic steps in coal conversion processes, to integrate and incorporate this information into comprehensive computer models for coal conversion processes, to evaluate these models and to apply them to gasification, mild gasification and combustion in heat engines. (VC)

  3. Simple models of the hydrofracture process

    KAUST Repository

    Marder, M.

    2015-12-29

    Hydrofracturing to recover natural gas and oil relies on the creation of a fracture network with pressurized water. We analyze the creation of the network in two ways. First, we assemble a collection of analytical estimates for pressure-driven crack motion in simple geometries, including crack speed as a function of length, energy dissipated by fluid viscosity and used to break rock, and the conditions under which a second crack will initiate while a first is running. We develop a pseudo-three-dimensional numerical model that couples fluid motion with solid mechanics and can generate branching crack structures not specified in advance. One of our main conclusions is that the typical spacing between fractures must be on the order of a meter, and this conclusion arises in two separate ways. First, it arises from analysis of gas production rates, given the diffusion constants for gas in the rock. Second, it arises from the number of fractures that should be generated given the scale of the affected region and the amounts of water pumped into the rock.
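    The first of the two spacing arguments, drainage limited by gas diffusion through the rock, reduces to a one-line estimate: over a production time t, gas can reach a fracture from at most a half-width of order sqrt(D*t). In the sketch below, the diffusivity and production window are illustrative values chosen to land in the metre range, not figures taken from the paper:

    ```python
    import math

    def diffusion_half_width(diffusivity, years):
        """Characteristic distance gas diffuses in a given time: sqrt(D * t).
        For production to drain the rock, fractures can be at most about
        twice this half-width apart. Both inputs below are illustrative
        assumptions."""
        seconds = years * 365.25 * 24 * 3600
        return math.sqrt(diffusivity * seconds)

    # An effective diffusivity of ~1e-8 m^2/s over a ~3-year production
    # window gives a half-width on the order of one metre, consistent with
    # the conclusion that fracture spacing must be of order a metre.
    half_width = diffusion_half_width(diffusivity=1e-8, years=3)
    ```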

  4. BUSINESS PROCESS MODELLING: A FOUNDATION FOR KNOWLEDGE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Vesna Bosilj-Vukšić

    2006-12-01

    Full Text Available Knowledge management (KM) is increasingly recognised as a strategic practice of knowledge-intensive companies, becoming an integral part of an organisation's strategy to improve business performance. This paper provides an overview of business process modelling applications and analyses the relationship between business process modelling and knowledge management projects. It presents a case study of leading Croatian banks and an insurance company, discussing their practical experience in conducting business process modelling projects and investigating the opportunity for integrating a business process repository and organisational knowledge as the foundation for knowledge management system development.

  5. Ontological Analysis of Integrated Process Models: testing hypotheses

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    2001-11-01

    Full Text Available Integrated process modeling is achieving prominence in helping to document and manage business administration and IT processes in organizations. The ARIS framework is a popular example of a framework for integrated process modeling, not least because it underlies the 800 or more reference models embedded in the world's most popular ERP package, SAP R/3. This paper demonstrates the usefulness of the Bunge-Wand-Weber (BWW) representation model for evaluating modeling grammars such as those constituting ARIS. It reports some initial insights gained from pilot testing Green and Rosemann's (2000) evaluative propositions. Even when considering all five views of ARIS, modelers have problems representing business rules, the scope and boundary of systems, and decomposing models. However, even though it is completely ontologically redundant, users still find the function view useful in modeling.

  6. Object Oriented Business Process Modelling in RFID Applied Computing Environments

    Science.gov (United States)

    Zhao, Xiaohui; Liu, Chengfei; Lin, Tao

    As a tracking technology, Radio Frequency Identification (RFID) is now widely applied to enhance the context awareness of enterprise information systems. Such awareness provides great opportunities to facilitate business process automation and thereby improve operation efficiency and accuracy. With the aim to incorporate business logics into RFID-enabled applications, this book chapter addresses how RFID technologies impact current business process management and the characteristics of object-oriented business process modelling. This chapter first discusses the rationality and advantages of applying object-oriented process modelling in RFID applications, then addresses the requirements and guidelines for RFID data management and process modelling. Two typical solutions are introduced to further illustrate the modelling and incorporation of business logics/business processes into RFID edge systems. To demonstrate the applicability of these two approaches, a detailed case study is conducted within a distribution centre scenario.

  7. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. Firstly, a gap in contemporary business process modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated through use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems encountered during the project. Concluding from these problems, together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  8. Process modeling for the Integrated Nonthermal Treatment System (INTS) study

    Energy Technology Data Exchange (ETDEWEB)

    Brown, B.W.

    1997-04-01

    This report describes the process modeling done in support of the Integrated Nonthermal Treatment System (INTS) study. This study was performed to supplement the Integrated Thermal Treatment System (ITTS) study and comprises five conceptual treatment systems that treat DOE contact-handled mixed low-level wastes (MLLW) at temperatures of less than 350°F. ASPEN PLUS, a chemical process simulator, was used to model the systems. Nonthermal treatment systems were developed as part of the INTS study and include sufficient processing steps to treat the entire inventory of MLLW. The final result of the modeling is a process flowsheet with a detailed mass and energy balance. In contrast to the ITTS study, which modeled only the main treatment system, the INTS study modeled each of the various processing steps with ASPEN PLUS, release 9.1-1. Trace constituents, such as radionuclides and minor pollutant species, were not included in the calculations.

  9. QUALITY IMPROVEMENT MODEL AT THE MANUFACTURING PROCESS PREPARATION LEVEL

    Directory of Open Access Journals (Sweden)

    Dusko Pavletic

    2009-12-01

    Full Text Available The paper presents the basis for an operational quality improvement model at the manufacturing process preparation level. Numerous relevant quality assurance and improvement methods and tools are identified. Main manufacturing process principles are investigated in order to scrutinize one general model of the manufacturing process and to define a manufacturing process preparation level. Development and introduction of the operational quality improvement model are based on conducted research and on the results of applying these methods and tools in real manufacturing processes in the shipbuilding and automotive industries. The basic model structure is described and presented by an appropriate general algorithm. The developed operational quality improvement model lays down the main guidelines for practical and systematic application of quality improvement methods and tools.

  10. Space in multi-agent systems modelling spatial processes

    Directory of Open Access Journals (Sweden)

    Petr Rapant

    2007-06-01

    Full Text Available The need for modelling spatial processes has recently arisen in the sphere of geoinformation systems. Some processes (especially natural ones can be modelled using external tools, e. g. for modelling contaminant transport in the environment. But in the case of socio-economic processes, suitable tools interconnected with GIS are still the subject of research and development. One of the candidate technologies is so-called multi-agent systems. Their theory is quite well developed, but they lack suitable means for dealing with space. This article deals with this problem and proposes a solution for the field of road transport modelling.

  11. Updating parameters of the chicken processing line model

    DEFF Research Database (Denmark)

    Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna

    2010-01-01

    A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. Recent availability of data allows...... updating parameters of the model to better describe processes observed in slaughterhouses. We propose Bayesian updating as a suitable technique to update expert judgment with microbiological data. Berrang and Dickens’s data are used to demonstrate performance of this method in updating parameters...... of the chicken processing line model....
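
The Bayesian updating step described here can be sketched with Beta-binomial conjugacy. This is an illustrative toy, not the authors' chicken processing line model: the Beta prior on a single transfer probability and the count data below are invented for the example.

```python
# Illustrative only: Beta-binomial conjugate updating of a single
# transfer-probability parameter. Prior and data are hypothetical,
# not taken from the chicken processing line model itself.

def update_beta(alpha, beta, positives, trials):
    """Posterior Beta parameters after observing `positives` out of `trials`."""
    return alpha + positives, beta + (trials - positives)

# Expert judgment encoded as a Beta(2, 8) prior (mean 0.2) -- assumed.
alpha_prior, beta_prior = 2.0, 8.0

# Hypothetical microbiological data: 30 contaminated carcasses out of 100.
alpha_post, beta_post = update_beta(alpha_prior, beta_prior, 30, 100)

posterior_mean = alpha_post / (alpha_post + beta_post)
print(round(posterior_mean, 3))  # -> 0.291; prior mean 0.2 shifts toward the data
```

The same mechanism, applied per parameter, lets expert-elicited distributions be revised as slaughterhouse data become available.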

  12. Modelling the pultrusion process of off shore wind turbine blades

    NARCIS (Netherlands)

    Baran, Ismet

    This thesis is devoted to the numerical modelling of the pultrusion process for industrial products such as wind turbine blades and structural profiles. The main focus is on the thermo-chemical and mechanical analyses of the process in which the process-induced stresses and shape distortions together

  13. process setting models for the minimization of costs of defectives

    African Journals Online (AJOL)

    Dr Obe

    2. Optimal Setting Process Models. 2.1 Optimal setting of process mean in the case of a one-sided limit. In a filling operation, the process average net weight must be set. The standards prescribe the minimum weight, which is printed on the packet. This set of quality control problems has a one-sided limit (the minimum net weight).
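
For the one-sided minimum-weight setting described above, one standard calculation (assuming normally distributed fill weights) places the process mean just far enough above the printed minimum that the underfill fraction equals a target defective rate. This is a hedged illustration with invented numbers, not the cost models of the paper:

```python
from statistics import NormalDist

def optimal_fill_mean(min_weight, sigma, defective_rate):
    """Process mean such that P(weight < min_weight) equals `defective_rate`,
    assuming fill weights are normally distributed with std dev `sigma`."""
    z = NormalDist().inv_cdf(1.0 - defective_rate)
    return min_weight + z * sigma

# Hypothetical packet: printed minimum 500 g, sigma 2 g, 1% underfill allowed.
mu = optimal_fill_mean(min_weight=500.0, sigma=2.0, defective_rate=0.01)
print(round(mu, 2))  # -> 504.65
```

Raising the mean lowers the defective rate but increases the average give-away; balancing those two costs is the trade-off that process setting models optimize.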

  14. The two-process model : Origin and perspective

    NARCIS (Netherlands)

    Daan, S.; Hut, R. A.; Beersma, D.

    In the two-process model as developed in the early 1980's sleep is controlled by a process-S, representing the rise and fall of sleep demand resulting from prior sleep-wake history, interacting with a process-C representing circadian variation in sleep propensity. S and C together optimize sleep

  15. An Information-Processing Model of Crisis Management.

    Science.gov (United States)

    Egelhoff, William G.; Sen, Falguni

    1992-01-01

    Develops a contingency model for managing a variety of corporate crises. Views crisis management as an information-processing situation and organizations that must cope with crisis as information-processing systems. Attempts to fit appropriate information-processing mechanisms to different categories of crises. (PRA)

  16. Software engineering with process algebra: Modelling client / server architecures

    NARCIS (Netherlands)

    Diertens, B.

    2009-01-01

    In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. We also described this software development process more formally by presenting the

  17. Understanding uncertainty in process-based hydrological models

    Science.gov (United States)

    Clark, M. P.; Kavetski, D.; Slater, A. G.; Newman, A. J.; Marks, D. G.; Landry, C.; Lundquist, J. D.; Rupp, D. E.; Nijssen, B.

    2013-12-01

    Building an environmental model requires making a series of decisions regarding the appropriate representation of natural processes. While some of these decisions can already be based on well-established physical understanding, gaps in our current understanding of environmental dynamics, combined with incomplete knowledge of properties and boundary conditions of most environmental systems, make many important modeling decisions far more ambiguous. There is consequently little agreement regarding what a 'correct' model structure is, especially at relatively larger spatial scales such as catchments and beyond. In current practice, faced with such a range of decisions, different modelers will generally make different modeling decisions, often on an ad hoc basis, based on their balancing of process understanding, the data available to evaluate the model, the purpose of the modeling exercise, and their familiarity with or investment in an existing model infrastructure. This presentation describes development and application of multiple-hypothesis models to evaluate process-based hydrologic models. Our numerical model uses robust solutions of the hydrology and thermodynamic governing equations as the structural core, and incorporates multiple options to represent the impact of different modeling decisions, including multiple options for model parameterizations (e.g., below-canopy wind speed, thermal conductivity, storage and transmission of liquid water through soil, etc.), as well as multiple options for model architecture, that is, the coupling and organization of different model components (e.g., representations of sub-grid variability and hydrologic connectivity, coupling with groundwater, etc.). 
Application of this modeling framework across a collection of different research basins demonstrates that differences among model parameterizations are often overwhelmed by differences among equally-plausible model parameter sets, while differences in model architecture lead

  18. Catastrophe Insurance Modeled by Shot-Noise Processes

    Directory of Open Access Journals (Sweden)

    Thorsten Schmidt

    2014-02-01

    Full Text Available Shot-noise processes generalize compound Poisson processes in the following way: a jump (the shot) is followed by a decline (noise). This constitutes a useful model for insurance claims in many circumstances; claims due to natural disasters or self-exciting processes exhibit similar features. We give a general account of shot-noise processes with time-inhomogeneous drivers inspired by recent results in credit risk. Moreover, we derive a number of useful results for modeling and pricing with shot-noise processes. Besides this, we obtain some highly tractable examples, which constitute a useful modeling tool for dynamic claims processes. The results can in particular be used for pricing Catastrophe Bonds (CAT bonds), a traded risk-linked security. Additionally, current results regarding the estimation of shot-noise processes are reviewed.
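
A minimal simulation of this shot-and-decline mechanism can be sketched as follows. Note the hedges: a time-homogeneous Poisson driver, exponential jump sizes, and exponential decay are assumed purely for illustration, whereas the paper treats time-inhomogeneous drivers.

```python
import math
import random

def shot_noise_path(rate, decay, mean_jump, t_grid, rng):
    """Simulate S(t) = sum over arrivals T_i <= t of X_i * exp(-decay*(t - T_i)):
    each Poisson arrival contributes a jump X_i followed by exponential decline."""
    horizon = t_grid[-1]
    arrivals, t = [], 0.0
    while True:
        t += rng.expovariate(rate)          # next Poisson arrival time
        if t > horizon:
            break
        arrivals.append((t, rng.expovariate(1.0 / mean_jump)))  # jump size X_i
    return [sum(x * math.exp(-decay * (s - ti)) for ti, x in arrivals if ti <= s)
            for s in t_grid]

rng = random.Random(42)
grid = [i * 0.1 for i in range(101)]        # t in [0, 10]
path = shot_noise_path(rate=2.0, decay=1.5, mean_jump=1.0, t_grid=grid, rng=rng)
print(len(path), min(path) >= 0.0)
```

With positive jumps, the resulting claims-intensity path is non-negative and spikes at each shot before relaxing back toward zero.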

  19. Modeling technique for the process of liquid film disintegration

    Science.gov (United States)

    Modorskii, V. Ya.; Sipatov, A. M.; Babushkina, A. V.; Kolodyazhny, D. Yu.; Nagorny, V. S.

    2016-10-01

    In the course of numerical experiments, a method for calculating two-phase flows was developed by solving a model problem. The results of the study were compared between two models that describe the processes of two-phase flow and the collapse of the liquid jet into droplets. Two mathematical models, the VoF model and the QMOM model, were considered for the implementation of the spray simulation.

  20. Measuring the Compliance of Processes with Reference Models

    Science.gov (United States)

    Gerke, Kerstin; Cardoso, Jorge; Claus, Alexander

    Reference models provide a set of generally accepted best practices to create efficient processes to be deployed inside organizations. However, a central challenge is to determine how these best practices are implemented in practice. One limitation of existing approaches for measuring compliance is the assumption that compliance can be determined using the notion of process equivalence. Nonetheless, the use of equivalence algorithms is not adequate, since two models can have different structures but one process can still be compliant with the other. This paper presents a new approach and algorithm that allow the compliance of process models with reference models to be measured. We evaluate our approach by measuring the compliance of a model currently used by a German passenger airline with the IT Infrastructure Library (ITIL) reference model and by comparing our results with existing approaches.

  1. Detection and quantification of flow consistency in business process models

    DEFF Research Database (Denmark)

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel

    2017-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics...

  2. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code

  3. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  4. Investigation of Mediational Processes Using Parallel Process Latent Growth Curve Modeling

    Science.gov (United States)

    Cheong, JeeWon; MacKinnon, David P.; Khoo, Siek Toon

    2010-01-01

    This study investigated a method to evaluate mediational processes using latent growth curve modeling. The mediator and the outcome measured across multiple time points were viewed as 2 separate parallel processes. The mediational process was defined as the independent variable influencing the growth of the mediator, which, in turn, affected the growth of the outcome. To illustrate modeling procedures, empirical data from a longitudinal drug prevention program, Adolescents Training and Learning to Avoid Steroids, were used. The program effects on the growth of the mediator and the growth of the outcome were examined first in a 2-group structural equation model. The mediational process was then modeled and tested in a parallel process latent growth curve model by relating the prevention program condition, the growth rate factor of the mediator, and the growth rate factor of the outcome. PMID:20157639

  5. A Measurable Model of the Creative Process in the Context of a Learning Process

    Science.gov (United States)

    Ma, Min; Van Oystaeyen, Fred

    2016-01-01

    The authors' aim was to arrive at a measurable model of the creative process by putting creativity in the context of a learning process. The authors aimed to provide a rather detailed description of how creative thinking fits in a general description of the learning process without trying to go into an analysis of a biological description of the…

  6. Toward Cognitively Constrained Models of Language Processing: A Review

    Directory of Open Access Journals (Sweden)

    Margreet Vogelzang

    2017-09-01

    Full Text Available Language processing is not an isolated capacity, but is embedded in other aspects of our cognition. However, it is still largely unexplored to what extent and how language processing interacts with general cognitive resources. This question can be investigated with cognitively constrained computational models, which simulate the cognitive processes involved in language processing. The theoretical claims implemented in cognitive models interact with general architectural constraints such as memory limitations. In this way, a model generates new predictions that can be tested in experiments, thus generating new data that can give rise to new theoretical insights. This theory-model-experiment cycle is a promising method for investigating aspects of language processing that are difficult to investigate with more traditional experimental techniques. This review specifically examines the language processing models of Lewis and Vasishth (2005), Reitter et al. (2011), and Van Rij et al. (2010), all implemented in the cognitive architecture Adaptive Control of Thought-Rational (Anderson et al., 2004). These models are all limited by the assumptions about cognitive capacities provided by the cognitive architecture, but use different linguistic approaches. Because of this, their comparison provides insight into the extent to which assumptions about general cognitive resources influence concretely implemented models of linguistic competence. For example, the sheer speed and accuracy of human language processing is a current challenge in the field of cognitive modeling, as it does not seem to adhere to the same memory and processing capacities that have been found in other cognitive processes. Architecture-based cognitive models of language processing may be able to make explicit which language-specific resources are needed to acquire and process natural language. The review sheds light on cognitively constrained models of language processing from two angles: we

  7. Multicriteria framework for selecting a process modelling language

    Science.gov (United States)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM), since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and also due to the lack of guidelines on evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.
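
As a toy illustration of combining per-criterion quality assessments with an MCDA step, the sketch below scores candidate languages with simple additive weighting. The criteria, weights, and scores are all invented for the example; the paper's actual SEQUAL criteria and MCDA method are richer than this.

```python
# Hypothetical illustration: ranking modelling languages with a simple
# additive weighting model over SEQUAL-inspired criteria. All weights
# and scores are invented, not taken from the paper.

criteria_weights = {
    "domain_appropriateness": 0.40,
    "comprehensibility": 0.35,
    "tool_support": 0.25,
}

scores = {  # criterion scores on a 0-10 scale, purely illustrative
    "BPMN":      {"domain_appropriateness": 9, "comprehensibility": 7, "tool_support": 9},
    "EPC":       {"domain_appropriateness": 7, "comprehensibility": 8, "tool_support": 6},
    "Petri net": {"domain_appropriateness": 8, "comprehensibility": 5, "tool_support": 7},
}

def weighted_score(lang):
    """Aggregate a language's criterion scores with the criteria weights."""
    return sum(criteria_weights[c] * scores[lang][c] for c in criteria_weights)

best = max(scores, key=weighted_score)
print(best, round(weighted_score(best), 2))  # -> BPMN 8.3
```

Real MCDA methods add steps such as criterion normalization and sensitivity analysis of the weights, which a plain weighted sum omits.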

  8. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals-based formulated products is also given.

  9. Using CASE to Exploit Process Modeling in Technology Transfer

    Science.gov (United States)

    Renz-Olar, Cheryl

    2003-01-01

    A successful business will be one that has processes in place to run that business. Creating processes, reengineering processes, and continually improving processes can be accomplished through extensive modeling. Casewise(R) Corporate Modeler(TM) CASE is a computer-aided software engineering tool that will enable the Technology Transfer Department (TT) at NASA Marshall Space Flight Center (MSFC) to capture these abilities. After successful implementation of CASE, it could then go on to be applied in other departments at MSFC and other centers at NASA. The success of a business process is dependent upon the players working as a team and continuously improving the process. A good process fosters customer satisfaction as well as internal satisfaction in the organizational infrastructure. CASE provides a method for business process success through functions consisting of system and process business models; specialized diagrams; matrix management; simulation; report generation and publishing; and linking, importing, and exporting documents and files. The software has an underlying repository or database to support these functions. The Casewise manual informs us that dynamics modeling is a technique used in business design and analysis. Feedback is used as a tool for the end users and generates different ways of dealing with the process. Feedback on this project was collected through a systems-analyst approach of interviews with process coordinators and Technical Points of Contact (TPOCs).

  10. Lipid Processing Technology: Building a Multilevel Modeling Network

    DEFF Research Database (Denmark)

    Díaz Tovar, Carlos Axel; Mustaffa, Azizul Azri; Mukkerikar, Amol

    2011-01-01

    The aim of this work is to present the development of a computer aided multilevel modeling network for the systematic design and analysis of processes employing lipid technologies. This is achieved by decomposing the problem into four levels of modeling: i) pure component property modeling...

  11. modelling of queuing process at airport check-in system

    African Journals Online (AJOL)

    HOD

    models in queue studies. The study adopted travel demand data for Manchester and Leeds-Bradford airports from the United Kingdom Civil Aviation Authority database. 1.2 Analytical Models for Queuing Studies. Previous researchers have examined the queuing process extensively and developed analytical models used for.
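
The analytical baseline most often used in such queuing studies is the M/M/1 queue. A hedged sketch of its steady-state formulas, with hypothetical arrival and service rates for a single check-in desk:

```python
# Illustrative M/M/1 steady-state metrics (Poisson arrivals, exponential
# service, one server). The rates below are invented example values.

def mm1_metrics(arrival_rate, service_rate):
    """Return utilization, mean number in system, and mean time in system."""
    rho = arrival_rate / service_rate            # server utilization
    if rho >= 1.0:
        raise ValueError("queue is unstable (utilization >= 1)")
    mean_in_system = rho / (1.0 - rho)           # L = rho / (1 - rho)
    mean_time = 1.0 / (service_rate - arrival_rate)  # W = 1 / (mu - lambda)
    return rho, mean_in_system, mean_time

# Hypothetical: 1.5 passengers/min arriving, 2.0 passengers/min served.
rho, n_in_system, wait = mm1_metrics(arrival_rate=1.5, service_rate=2.0)
print(rho, n_in_system, wait)  # -> 0.75 3.0 2.0
```

By Little's law, the mean number in system equals arrival rate times mean time in system (1.5 × 2.0 = 3.0), which the two formulas above satisfy.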

  12. A mathematical model for the leukocyte filtration process

    NARCIS (Netherlands)

    Bruil, Anton; Beugeling, Tom; Feijen, Jan

    1995-01-01

    Leukocyte filters are applied clinically to remove leukocytes from blood. In order to optimize leukocyte filters, a mathematical model to describe the leukocyte filtration process was developed by modification of a general theoretical model for depth filtration. The model presented here can be used

  13. A consolidation based extruder model to explore GAME process configurations

    NARCIS (Netherlands)

    Willems, P.; Kuipers, N.J.M.; de Haan, A.B.

    2009-01-01

    A mathematical model from literature was adapted to predict the pressure profile and oil yield for canola in a lab-scale extruder. Changing the description of the expression process from filtration to consolidation significantly improved the performance and physical meaning of the model. The model

  14. Modelling the embedded rainfall process using tipping bucket data

    DEFF Research Database (Denmark)

    Thyregod, Peter; Arnbjerg-Nielsen, Karsten; Madsen, Henrik

    1998-01-01

    A new method for modelling the dynamics of rain measurement processes is suggested. The method takes the discrete nature and autocorrelation of measurements from the tipping bucket rain gauge into consideration. The considered model is a state space model with a Poisson marginal distribution. In ...
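
The flavor of such a model can be illustrated with a generic latent-intensity count simulation: an autocorrelated latent state drives a Poisson number of bucket tips per interval. This is a simplified stand-in (a doubly stochastic Poisson sketch with invented parameters), not the paper's exact state space formulation.

```python
import math
import random

# Illustrative only: autocorrelated, discrete tip counts from a latent AR(1)
# log-intensity. All parameters are invented for the example.

def simulate_counts(n, phi, sigma, base_log_rate, rng):
    """Simulate n Poisson counts whose intensity follows a latent AR(1) state."""
    counts, state = [], 0.0
    for _ in range(n):
        state = phi * state + rng.gauss(0.0, sigma)  # latent AR(1) update
        lam = math.exp(base_log_rate + state)        # tip intensity this interval
        # Poisson draw via Knuth's multiplication method (stdlib only)
        k, p, threshold = 0, 1.0, math.exp(-lam)
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            k += 1
        counts.append(k)
    return counts

rng = random.Random(7)
tips = simulate_counts(200, phi=0.8, sigma=0.3, base_log_rate=0.0, rng=rng)
print(len(tips), min(tips) >= 0)
```

The persistence parameter `phi` is what produces the autocorrelation between successive tipping-bucket counts that the paper's model accounts for.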

  15. Conflicts Analysis for Inter-Enterprise Business Process Model

    OpenAIRE

    Wei Ding; Zhong Tian; Jian Wang; Jun Zhu; Haiqi Liang; Lei Zhang

    2003-01-01

    Business process (BP) management systems facilitate the understanding and execution of business processes, which tend to change frequently due to both internal and external changes in an enterprise. Therefore, the need for analysis methods to verify the correctness of business process models is becoming more prominent. One key element of such business processes is their control flow. We show how a flow specification may contain certain structural conflicts that could compromise its correct execution...

  16. Modeling and knowledge acquisition processes using case-based inference

    Directory of Open Access Journals (Sweden)

    Ameneh Khadivar

    2017-03-01

    Full Text Available The acquisition and presentation of organizational process knowledge has been considered by many KM researchers. In this research, a model for process knowledge acquisition and presentation is presented using the Case-Based Reasoning approach. The validity of the presented model was evaluated by conducting an expert panel. Software was then developed based on the presented model and implemented in Eghtesad Novin Bank of Iran. In this company, following the stages of the presented model, the knowledge-intensive processes were first identified, and the process knowledge was then stored in a knowledge base in the problem/solution/consequence format. Knowledge retrieval was based on nearest-neighbour similarity. To validate the implemented system, its results were compared with the decisions made by experts in the process.
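
Nearest-neighbour retrieval over a case base, the core of the retrieval step described here, can be sketched in a few lines. The case base, feature vectors, and solutions below are invented for illustration; the implemented system's problem/solution/consequence cases are domain-specific.

```python
import math

# Hypothetical case base: each case pairs a numeric feature vector
# (the "problem" part) with a stored solution.
case_base = [
    {"features": (0.9, 0.2, 0.4), "solution": "route to credit committee"},
    {"features": (0.1, 0.8, 0.7), "solution": "automated approval"},
    {"features": (0.5, 0.5, 0.1), "solution": "request more documents"},
]

def retrieve(query, cases):
    """Return the stored solution of the most similar case (Euclidean distance)."""
    return min(cases, key=lambda c: math.dist(query, c["features"]))["solution"]

print(retrieve((0.2, 0.7, 0.8), case_base))  # -> automated approval
```

In a full CBR cycle, the retrieved solution would then be adapted to the new problem and the outcome stored back as a new case.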

  17. Event Modeling in UML. Unified Modeling Language and Unified Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2002-01-01

    We show how events can be modeled in terms of UML. We view events as change agents that have consequences and as information objects that represent information. We show how to create object-oriented structures that represent events in terms of attributes, associations, operations, state charts, a...

  18. Mechanistic Fermentation Models for Process Design, Monitoring, and Control.

    Science.gov (United States)

    Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Sin, Gürkan; Gernaey, Krist V

    2017-10-01

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Holonic Business Process Modeling in Small to Medium Sized Enterprises

    Directory of Open Access Journals (Sweden)

    Nur Budi Mulyono

    2012-01-01

    Full Text Available Holonic modeling analysis, the application of systems thinking to design, management, and improvement, is used in a novel context for business process modeling. An approach and techniques based on holons and holarchies are presented specifically for small and medium sized enterprise process modeling development. The fitness of the approach is compared with the well-known reductionist, or task-breakdown, approach. The strengths and weaknesses of holonic modeling are discussed with an illustrative case example in terms of its suitability for an Indonesian small and medium sized industry. The novel ideas in this paper have great impact on the way analysts should perceive business processes. Future research will apply the approach in a supply chain context. Key words: Business process, holonic modeling, operations management, small to medium sized enterprise

  20. Mechanistic Fermentation Models for Process Design, Monitoring, and Control

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads Orla

    2017-01-01

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted...... for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool...... for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding....

  1. An extension of clarke's model with stochastic amplitude flip processes

    KAUST Repository

    Hoel, Hakon

    2014-07-01

    Stochastic modeling is an essential tool for studying statistical properties of wireless channels. In multipath fading channel (MFC) models, the signal reception is modeled by a sum of wave path contributions, and Clarke's model is an important example of such a model, which has been widely accepted in many wireless applications. However, since Clarke's model is temporally deterministic, Feng and Field noted that it does not model real wireless channels with time-varying randomness well. Here, we extend Clarke's model to a novel time-varying stochastic MFC model with scatterers randomly flipping on and off. Statistical properties of the MFC model are analyzed and shown to fit well with real signal measurements, and a limit Gaussian process is derived from the model when the number of active wave paths tends to infinity. A second focus of this work is a comparison study of the error and computational cost of generating signal realizations from the MFC model and from its limit Gaussian process. By rigorous analysis and numerical studies, we show that in many settings, signal realizations are generated more efficiently by Gaussian process algorithms than by the MFC model's algorithm. Numerical examples that strengthen these observations are also presented. © 2014 IEEE.
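
The classical sum-of-wave-paths construction underlying Clarke's model can be sketched as a sum of sinusoids with random angles of arrival and phases. This is a hedged illustration of the baseline model only; the paper's extension (scatterers randomly flipping on and off) is not reproduced, and all parameter values are invented.

```python
import cmath
import math
import random

def make_clarke_channel(n_paths, doppler_max_hz, rng):
    """Return g(t): complex gain of one fixed random scatterer configuration."""
    params = [(rng.uniform(0.0, 2.0 * math.pi),   # angle of arrival theta_i
               rng.uniform(0.0, 2.0 * math.pi))   # initial phase phi_i
              for _ in range(n_paths)]

    def gain(t):
        # Each path contributes a unit-amplitude sinusoid whose Doppler shift
        # depends on its angle of arrival; power is normalized to ~1.
        return sum(
            cmath.exp(1j * (2.0 * math.pi * doppler_max_hz * math.cos(th) * t + ph))
            for th, ph in params
        ) / math.sqrt(n_paths)

    return gain

gain = make_clarke_channel(n_paths=64, doppler_max_hz=100.0, rng=random.Random(1))
samples = [gain(k * 1e-3) for k in range(100)]    # 100 samples, 1 ms spacing
avg_power = sum(abs(g) ** 2 for g in samples) / len(samples)
print(len(samples), avg_power > 0.0)
```

Because the angles and phases are drawn once and then held fixed, successive samples of `gain(t)` are temporally correlated, which is the deterministic-in-time property the paper's flip-process extension relaxes.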

  2. Introduction of Virtualization Technology to Multi-Process Model Checking

    Science.gov (United States)

    Leungwattanakit, Watcharin; Artho, Cyrille; Hagiya, Masami; Tanabe, Yoshinori; Yamamoto, Mitsuharu

    2009-01-01

    Model checkers find failures in software by exploring every possible execution schedule. Java PathFinder (JPF), a Java model checker, has been extended recently to cover networked applications by caching data transferred in a communication channel. A target process is executed by JPF, whereas its peer process runs on a regular virtual machine outside. However, non-deterministic target programs may produce different output data in each schedule, causing the cache to restart the peer process to handle the different set of data. Virtualization tools could help us restore previous states of peers, eliminating peer restart. This paper proposes the application of virtualization technology to networked model checking, concentrating on JPF.

  3. Remote sensing models and methods for image processing

    CERN Document Server

    Schowengerdt, Robert A

    1997-01-01

    This book is a completely updated, greatly expanded version of the previously successful volume by the author. The Second Edition includes new results and data, and discusses a unified framework and rationale for designing and evaluating image processing algorithms.Written from the viewpoint that image processing supports remote sensing science, this book describes physical models for remote sensing phenomenology and sensors and how they contribute to models for remote-sensing data. The text then presents image processing techniques and interprets them in terms of these models. Spectral, s

  4. Enzymatic corn wet milling: engineering process and cost model.

    Science.gov (United States)

    Ramírez, Edna C; Johnston, David B; McAloon, Andrew J; Singh, Vijay

    2009-01-21

    Enzymatic corn wet milling (E-milling) is a process derived from conventional wet milling for the recovery and purification of starch and co-products using proteases to eliminate the need for sulfites and decrease the steeping time. In 2006, the total starch production in USA by conventional wet milling equaled 23 billion kilograms, including modified starches and starches used for sweeteners and ethanol production [1]. Process engineering and cost models for an E-milling process have been developed for a processing plant with a capacity of 2.54 million kg of corn per day (100,000 bu/day). These models are based on the previously published models for a traditional wet milling plant with the same capacity. The E-milling process includes grain cleaning, pretreatment, enzymatic treatment, germ separation and recovery, fiber separation and recovery, gluten separation and recovery and starch separation. Information for the development of the conventional models was obtained from a variety of technical sources including commercial wet milling companies, industry experts and equipment suppliers. Additional information for the present models was obtained from our own experience with the development of the E-milling process and trials in the laboratory and at the pilot plant scale. The models were developed using process and cost simulation software (SuperPro Designer) and include processing information such as composition and flow rates of the various process streams, descriptions of the various unit operations and detailed breakdowns of the operating and capital cost of the facility. Based on the information from the model, we can estimate the cost of production per kilogram of starch using the input prices for corn, enzyme and other wet milling co-products. The work presented here describes the E-milling process and compares the process, the operation and costs with the conventional process. The E-milling process was found to be cost competitive with the conventional

  5. Enzymatic corn wet milling: engineering process and cost model

    Directory of Open Access Journals (Sweden)

    McAloon Andrew J

    2009-01-01

    Full Text Available Abstract Background Enzymatic corn wet milling (E-milling) is a process derived from conventional wet milling for the recovery and purification of starch and co-products using proteases to eliminate the need for sulfites and decrease the steeping time. In 2006, the total starch production in USA by conventional wet milling equaled 23 billion kilograms, including modified starches and starches used for sweeteners and ethanol production [1]. Process engineering and cost models for an E-milling process have been developed for a processing plant with a capacity of 2.54 million kg of corn per day (100,000 bu/day). These models are based on the previously published models for a traditional wet milling plant with the same capacity. The E-milling process includes grain cleaning, pretreatment, enzymatic treatment, germ separation and recovery, fiber separation and recovery, gluten separation and recovery and starch separation. Information for the development of the conventional models was obtained from a variety of technical sources including commercial wet milling companies, industry experts and equipment suppliers. Additional information for the present models was obtained from our own experience with the development of the E-milling process and trials in the laboratory and at the pilot plant scale. The models were developed using process and cost simulation software (SuperPro Designer®) and include processing information such as composition and flow rates of the various process streams, descriptions of the various unit operations and detailed breakdowns of the operating and capital cost of the facility. Results Based on the information from the model, we can estimate the cost of production per kilogram of starch using the input prices for corn, enzyme and other wet milling co-products. The work presented here describes the E-milling process and compares the process, the operation and costs with the conventional process. Conclusion The E-milling process

  6. Adaptive Gaussian Predictive Process Models for Large Spatial Datasets

    Science.gov (United States)

    Guhaniyogi, Rajarshi; Finley, Andrew O.; Banerjee, Sudipto; Gelfand, Alan E.

    2011-01-01

    Large point referenced datasets occur frequently in the environmental and natural sciences. Use of Bayesian hierarchical spatial models for analyzing these datasets is undermined by onerous computational burdens associated with parameter estimation. Low-rank spatial process models attempt to resolve this problem by projecting spatial effects to a lower-dimensional subspace. This subspace is determined by a judicious choice of “knots” or locations that are fixed a priori. One such representation yields a class of predictive process models (e.g., Banerjee et al., 2008) for spatial and spatial-temporal data. Our contribution here expands upon predictive process models with fixed knots to models that accommodate stochastic modeling of the knots. We view the knots as emerging from a point pattern and investigate how such adaptive specifications can yield more flexible hierarchical frameworks that lead to automated knot selection and substantial computational benefits. PMID:22298952
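    The fixed-knot predictive process that this work generalizes projects the spatial effect onto a small knot set via a conditional expectation. A minimal 1-D sketch with an exponential covariance follows; the knot count, decay parameter, and locations are all illustrative choices, not values from the paper:

```python
import numpy as np

def exp_cov(a, b, phi=1.0):
    """Exponential covariance between two sets of 1-D locations."""
    return np.exp(-phi * np.abs(a[:, None] - b[None, :]))

rng = np.random.default_rng(1)
s = np.linspace(0.0, 10.0, 200)       # observation locations
knots = np.linspace(0.0, 10.0, 15)    # knots fixed a priori (toy choice)

C_star = exp_cov(knots, knots)        # covariance among the knots
# draw the parent process at the knots
w_star = np.linalg.cholesky(C_star + 1e-10 * np.eye(len(knots))) @ rng.standard_normal(len(knots))
# low-rank predictive process: w_tilde(s) = c(s)' C*^{-1} w*
w_tilde = exp_cov(s, knots) @ np.linalg.solve(C_star, w_star)
print(w_tilde.shape)
```

    The expensive 200 x 200 covariance never has to be factorised; only the small 15 x 15 knot system is solved, which is the computational benefit the abstract refers to.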

  7. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    International Nuclear Information System (INIS)

    Sonnenthale, E.

    2001-01-01

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [1534471]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: Continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies described in this AMR are

  8. A note on the criticisms against the internationalization process model

    OpenAIRE

    Hadjikhani, Amjad

    1997-01-01

    The internationalization process model introduced three decades ago still influences international business studies. Since that time, a growing number of researchers have tested the model to show its strengths and weaknesses. Among the critics, some focus on the weakness of the theoretical aspects, while others argue against parts of the model. This paper will review these criticisms and compare them with the original ideas in the internationalization model. One criticized aspect of the inter...

  9. Dispersive processes in models of regional radionuclide migration. Technical memorandum

    International Nuclear Information System (INIS)

    Evenson, D.E.; Dettinger, M.D.

    1980-05-01

    Three broad areas of concern in the development of aquifer scale transport models will be local scale diffusion and dispersion processes, regional scale dispersion processes, and numerical problems associated with the advection-dispersion equation. Local scale dispersion processes are fairly well understood and accessible to observation. These processes will generally be dominated in large scale systems by regional processes, or macro-dispersion. Macro-dispersion is primarily the result of large scale heterogeneities in aquifer properties. In addition, the effects of many modeling approximations are often included in the process. Because difficulties arise in parameterization of this large scale phenomenon, parameterization should be based on field measurements made at the same scale as the transport process of interest or else partially circumvented through the application of a probabilistic advection model. Other problems associated with numerical transport models include difficulties with conservation of mass, stability, numerical dissipation, overshoot, flexibility, and efficiency. We recommend the random-walk model formulation for Lawrence Livermore Laboratory's purposes as the most flexible, accurate and relatively efficient modeling approach that overcomes these difficulties

  10. Testing and modelling autoregressive conditional heteroskedasticity of streamflow processes

    Directory of Open Access Journals (Sweden)

    W. Wang

    2005-01-01

    Full Text Available Conventional streamflow models operate under the assumption of constant variance or season-dependent variances (e.g., ARMA (AutoRegressive Moving Average) models for deseasonalized streamflow series and PARMA (Periodic AutoRegressive Moving Average) models for seasonal streamflow series). However, with the McLeod-Li test and Engle's Lagrange Multiplier test, clear evidence is found for the existence of autoregressive conditional heteroskedasticity (i.e., the ARCH (AutoRegressive Conditional Heteroskedasticity) effect), a nonlinear phenomenon of the variance behaviour, in the residual series from linear models fitted to daily and monthly streamflow processes of the upper Yellow River, China. It is shown that the major cause of the ARCH effect is the seasonal variation in variance of the residual series. However, while the seasonal variation in variance can fully explain the ARCH effect for monthly streamflow, it is only a partial explanation for daily flow. It is also shown that while the periodic autoregressive moving average model is adequate in modelling monthly flows, no model is adequate in modelling daily streamflow processes because none of the conventional time series models takes the seasonal variation in variance, as well as the ARCH effect in the residuals, into account. Therefore, an ARMA-GARCH (Generalized AutoRegressive Conditional Heteroskedasticity) error model is proposed to capture the ARCH effect present in daily streamflow series, as well as to preserve seasonal variation in variance in the residuals. The ARMA-GARCH error model combines an ARMA model for modelling the mean behaviour and a GARCH model for modelling the variance behaviour of the residuals from the ARMA model. Since the GARCH model is not widely used in statistical hydrology, this work can be a useful addition in terms of statistical modelling of daily streamflow processes for the hydrological community.
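    The GARCH component of such an ARMA-GARCH error model captures conditional heteroskedasticity through a variance recursion. A minimal GARCH(1,1) simulation is sketched below; the parameter values are illustrative, not the fitted Yellow River values:

```python
import numpy as np

def simulate_garch11(n, omega=0.1, alpha=0.1, beta=0.8, rng=None):
    """Simulate a GARCH(1,1) error series e[t] = sqrt(sigma2[t]) * z[t],
    where sigma2[t] = omega + alpha*e[t-1]**2 + beta*sigma2[t-1].
    Parameter values are illustrative only.
    """
    rng = np.random.default_rng(rng)
    e = np.zeros(n)
    # start at the unconditional variance omega / (1 - alpha - beta)
    sigma2 = np.full(n, omega / (1.0 - alpha - beta))
    for t in range(1, n):
        sigma2[t] = omega + alpha * e[t - 1] ** 2 + beta * sigma2[t - 1]
        e[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return e, sigma2

e, s2 = simulate_garch11(5000, rng=0)
print(np.var(e))   # sample variance near omega/(1-alpha-beta) = 1.0
```

    Fitting the ARMA mean model first and then a GARCH model to its residuals, as the abstract describes, reproduces exactly this two-stage structure.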

  11. Multivariate Hawkes process models of the occurrence of regulatory elements

    DEFF Research Database (Denmark)

    Carstensen, L; Sandelin, A; Winther, Ole

    2010-01-01

    RESULTS: We present a model of TRE occurrences known as the Hawkes process. We illustrate the use of this model by analyzing two different publically available data sets. We are able to model, in detail, how the occurrence of one TRE is affected.... For each of the two data sets we provide two results: first, a qualitative description of the dependencies among the occurrences of the TREs, and second, quantitative results on the favored or avoided distances between the different TREs. CONCLUSIONS: The Hawkes process is a novel way of modeling the joint distribution of the occurrences of these TREs along the genome.
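    A univariate Hawkes process, the self-exciting building block of the multivariate model above, can be simulated with Ogata's thinning algorithm. The exponential kernel and all parameter values below are illustrative, not taken from the paper's genomic fits:

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, rng=None):
    """Ogata thinning for a univariate Hawkes process with intensity
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i)).
    """
    rng = np.random.default_rng(rng)
    events, t = [], 0.0
    while True:
        # with an exponential kernel the intensity decays between events,
        # so its value just after t bounds it until the next candidate
        lam_bar = mu + alpha * np.exp(-beta * (t - np.array(events))).sum()
        t += rng.exponential(1.0 / lam_bar)
        if t >= T:
            break
        lam_t = mu + alpha * np.exp(-beta * (t - np.array(events))).sum()
        if rng.uniform() <= lam_t / lam_bar:   # accept candidate point
            events.append(t)
    return np.array(events)

ev = simulate_hawkes(mu=0.5, alpha=0.8, beta=2.0, T=200.0, rng=0)
print(len(ev))   # mean count is roughly mu*T/(1 - alpha/beta), about 167
```

    In the multivariate version used for TREs, occurrences of one element type raise (or lower) the intensity of the others, which is what yields the favored/avoided distances reported in the abstract.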

  12. Ultrasonic-assisted manufacturing processes: Variational model and numerical simulations

    KAUST Repository

    Siddiq, Amir

    2012-04-01

    We present a computational study of ultrasonic assisted manufacturing processes including sheet metal forming, upsetting, and wire drawing. A fully variational porous plasticity model is modified to include ultrasonic softening effects and then utilized to account for instantaneous softening when ultrasonic energy is applied during deformation. Material model parameters are identified via inverse modeling, i.e. by using experimental data. The versatility and predictive ability of the model are demonstrated and the effect of ultrasonic intensity on the manufacturing process at hand is investigated and compared qualitatively with experimental results reported in the literature. © 2011 Elsevier B.V. All rights reserved.

  13. Modeling cancer registration processes with an enhanced activity diagram.

    Science.gov (United States)

    Lyalin, D; Williams, W

    2005-01-01

    Adequate instruments are needed to reflect the complexity of routine cancer registry operations properly in a business model. The activity diagram is a key instrument of the Unified Modeling Language (UML) for the modeling of business processes. The authors aim to improve descriptions of processes in cancer registration, as well as in other public health domains, through enhancements of the activity diagram notation within the standard semantics of UML. The authors introduced a practical approach to enhance a conventional UML activity diagram, complementing it with the following business process concepts: timeline, duration for individual activities, responsibilities for individual activities within swimlanes, and descriptive text. The authors used an enhanced activity diagram for modeling surveillance processes in the cancer registration domain. A specific example illustrates the use of an enhanced activity diagram to visualize a process of linking cancer registry records with external mortality files. An enhanced activity diagram allows for the addition of more business concepts to a single diagram and can improve descriptions of processes in cancer registration, as well as in other domains. Additional features of an enhanced activity diagram make it possible to advance the visualization of cancer registration processes. That, in turn, promotes the clarification of issues related to the process timeline, responsibilities for particular operations, and collaborations among process participants. Our first experiences in a cancer registry best practices development workshop setting support the usefulness of such an approach.

  14. Residence time modeling of hot melt extrusion processes.

    Science.gov (United States)

    Reitz, Elena; Podhaisky, Helmut; Ely, David; Thommes, Markus

    2013-11-01

    The hot melt extrusion process is a widespread technique to mix viscous melts. The residence time of material in the process frequently determines the product properties. An experimental setup and a corresponding mathematical model were developed to evaluate residence time and residence time distribution in twin screw extrusion processes. The extrusion process was modeled as the convolution of a mass transport process described by a Gaussian probability function, and a mixing process represented by an exponential function. The residence time of the extrusion process was determined by introducing a tracer at the extruder inlet and measuring the tracer concentration at the die. These concentrations were fitted to the residence time model, and an adequate correlation was found. Different parameters were derived to characterize the extrusion process including the dead time, the apparent mixing volume, and a transport related axial mixing. A 2³ design of experiments was performed to evaluate the effect of powder feed rate, screw speed, and melt viscosity of the material on the residence time. All three parameters affect the residence time of material in the extruder. In conclusion, a residence time model was developed to interpret experimental data and to get insights into the hot melt extrusion process. Copyright © 2013 Elsevier B.V. All rights reserved.
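    The convolution model described above (Gaussian transport term convolved with an exponential mixing term) can be evaluated numerically. The dead time, spread, and mixing constant below are illustrative placeholders, not fitted extruder values:

```python
import numpy as np

def residence_time_dist(t, t0=60.0, sigma=5.0, tau=20.0):
    """Residence time distribution as the numerical convolution of a
    Gaussian transport term (dead time t0, spread sigma) with an
    exponential mixing term (time constant tau).  Values illustrative.
    """
    dt = t[1] - t[0]
    gauss = np.exp(-0.5 * ((t - t0) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    mixing = np.exp(-t / tau) / tau
    # discrete convolution, truncated back to the original time grid
    return np.convolve(gauss, mixing)[: t.size] * dt

t = np.linspace(0.0, 300.0, 3001)
f = residence_time_dist(t)
dt = t[1] - t[0]
print(f.sum() * dt)       # integrates to ~1, i.e. a proper distribution
print(t[np.argmax(f)])    # the mode sits shortly after the dead time t0
```

    Fitting such a curve to measured tracer concentrations at the die is what yields the dead time and apparent mixing volume mentioned in the abstract.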

  15. A simplified computational memory model from information processing.

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper is intended to propose a computational model for memory from the view of information processing. The model, called simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to express the neuron or brain cortices based on biology and graph theory, and we develop an intra-modular network with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening by information-processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from the information-processing view.

  16. A simplified computational memory model from information processing

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper is intended to propose a computational model for memory from the view of information processing. The model, called simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to express the neuron or brain cortices based on biology and graph theory, and we develop an intra-modular network with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening by information-processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from the information-processing view. PMID:27876847

  17. Fault Management: Degradation Signature Detection, Modeling, and Processing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault to Failure Progression (FFP) signature modeling and processing is a new method for applying condition-based signal data to detect degradation, to identify...

  18. Mathematical modeling of electromechanical processes in a brushless DC motor

    Directory of Open Access Journals (Sweden)

    V.I. Tkachuk

    2014-03-01

    Full Text Available On the basis of initial assumptions, a mathematical model that describes electromechanical processes in a brushless DC electric motor with a salient-pole stator and permanent-magnet excitation is created.

  19. Fault Management: Degradation Signature Detection, Modeling, and Processing, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault to Failure Progression (FFP) signature modeling and processing is a new method for applying condition-based signal data to detect degradation, to identify...

  20. Modeling of Cloud/Radiation Processes for Cirrus Cloud Formation

    National Research Council Canada - National Science Library

    Liou, K

    1997-01-01

    This technical report includes five reprints and pre-prints of papers associated with the modeling of cirrus cloud and radiation processes as well as remote sensing of cloud optical and microphysical...

  1. Sketch of a Noisy Channel Model for the Translation Process

    DEFF Research Database (Denmark)

    Carl, Michael

    The paper develops a Noisy Channel Model for the translation process that is based on actual user activity data. It builds on the monitor model and makes a distinction between early, automatic and late, conscious translation processes: while early priming processes are at the basis of a "literal default rendering" procedure, later conscious processes are triggered by a monitor who interferes when something goes wrong. An attempt is made to explain monitor activities with relevance theoretic concepts, according to which a translator needs to ensure the similarity of explicatures and implicatures of the source and the target texts. It is suggested that events and parameters in the model need be measurable and quantifiable in the user activity data so as to trace back monitoring activities in the translation process data. Michael Carl is a Professor with special responsibilities at the Department...

  2. A Dirichlet process mixture model for brain MRI tissue classification.

    Science.gov (United States)

    Ferreira da Silva, Adelino R

    2007-04-01

    Accurate classification of magnetic resonance images according to tissue type or region of interest has become a critical requirement in diagnosis, treatment planning, and cognitive neuroscience. Several authors have shown that finite mixture models give excellent results in the automated segmentation of MR images of the human normal brain. However, performance and robustness of finite mixture models deteriorate when the models have to deal with a variety of anatomical structures. In this paper, we propose a nonparametric Bayesian model for tissue classification of MR images of the brain. The model, known as Dirichlet process mixture model, uses Dirichlet process priors to overcome the limitations of current parametric finite mixture models. To validate the accuracy and robustness of our method we present the results of experiments carried out on simulated MR brain scans, as well as on real MR image data. The results are compared with similar results from other well-known MRI segmentation methods.
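    The Dirichlet process prior that lets such a mixture model sidestep fixing the number of components can be illustrated with a truncated stick-breaking construction. This toy draws a mixture that could stand in for intensity classes; it is not the paper's MRI segmentation procedure, and all values are illustrative:

```python
import numpy as np

def stick_breaking(alpha, n_atoms, rng):
    """Truncated stick-breaking weights of a Dirichlet process:
    w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha)."""
    v = rng.beta(1.0, alpha, n_atoms)
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return w / w.sum()   # renormalise to absorb the truncation error

rng = np.random.default_rng(0)
w = stick_breaking(alpha=2.0, n_atoms=50, rng=rng)   # mixture weights
means = rng.normal(0.0, 10.0, 50)    # toy component means ("tissue intensities")
z = rng.choice(50, size=1000, p=w)   # component assignments
x = rng.normal(means[z], 1.0)        # simulated observations
print(w.sum(), x.shape)
```

    In inference the process runs the other way: the DP prior concentrates mass on however many components the data support, which is the robustness advantage over finite mixtures noted in the abstract.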

  3. The Use of Reference Models in Business Process Renovation

    Directory of Open Access Journals (Sweden)

    Dejan Pajk

    2010-01-01

    Full Text Available Enterprise resource planning (ERP) systems are often used by companies to automate and enhance their business processes. The capabilities of ERP systems can be described by best-practice reference models. The purpose of the article is to demonstrate the business process renovation approach with the use of reference models. Although the use of reference models brings many positive effects for business, they are still rarely used in Slovenian small and medium-sized companies. The reasons for this may be found in the reference models themselves as well as in project implementation methodologies. In the article a reference model based on Microsoft Dynamics NAV is suggested. The reference model is designed using upgraded BPMN notation with additional business objects, which help to describe the models in more detail.

  4. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    Energy Technology Data Exchange (ETDEWEB)

    P. Dixon

    2004-04-05

    The purpose of this Model Report (REV02) is to document the unsaturated zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrological-chemical (THC) processes on UZ flow and transport. This Model Report has been developed in accordance with the ''Technical Work Plan for: Performance Assessment Unsaturated Zone'' (Bechtel SAIC Company, LLC (BSC) 2002 [160819]). The technical work plan (TWP) describes planning information pertaining to the technical scope, content, and management of this Model Report in Section 1.12, Work Package AUZM08, ''Coupled Effects on Flow and Seepage''. The plan for validation of the models documented in this Model Report is given in Attachment I, Model Validation Plans, Section I-3-4, of the TWP. Except for variations in acceptance criteria (Section 4.2), there were no deviations from this TWP. This report was developed in accordance with AP-SIII.10Q, ''Models''. This Model Report documents the THC Seepage Model and the Drift Scale Test (DST) THC Model. The THC Seepage Model is a drift-scale process model for predicting the composition of gas and water that could enter waste emplacement drifts and the effects of mineral alteration on flow in rocks surrounding drifts. The DST THC model is a drift-scale process model relying on the same conceptual model and much of the same input data (i.e., physical, hydrological, thermodynamic, and kinetic) as the THC Seepage Model. The DST THC Model is the primary method for validating the THC Seepage Model. The DST THC Model compares predicted water and gas compositions, as well as mineral alteration patterns, with observed data from the DST. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal-loading conditions, and predict the evolution of mineral alteration and fluid chemistry around potential waste emplacement drifts. The

  5. How should mathematical models of geomorphic processes be judged?

    Science.gov (United States)

    Iverson, Richard M.

    Mathematical models of geomorphic processes can have value as both predictive tools and precise conceptual frameworks. Well-posed mechanistic models have great conceptual value because they link geomorphic processes to universal scientific principles, such as conservation of energy, momentum, and mass. Models without this linkage (e.g., models based exclusively on cellular rules or empirical correlations) have less conceptual value but offer logical methodology for making practical predictions in some circumstances. Clear tests of the predictive power of mechanistic models can be achieved in controlled experiments, whereas natural landscapes typically have uncontrolled initial and boundary conditions and unresolved geological heterogeneities that preclude decisive tests. The best mechanistic models have a simplicity that results from minimizing assumptions and postulates, rather than minimizing mathematics, and this simplicity promotes conclusive tests. Optimal models also employ only parameters that are defined and measured outside the model context. Common weaknesses in geomorphic models result from use of freely coined equations without clear links to conservation laws or compelling data, use of fitted rather than measured values of parameters, lack of clear distinction between assumptions and approximations, and neglect of the four-dimensional (space + time) nature of most geomorphic processes. Models for predicting landslide runout illustrate principles and pitfalls that are common to all geomorphic modeling.

  6. A multi-phase flow model for electrospinning process

    Directory of Open Access Journals (Sweden)

    Xu Lan

    2013-01-01

    Full Text Available An electrospinning process is a multi-phase and multi-physical process with flow, electric and magnetic fields coupled together. This paper deals with establishing a multi-phase model for numerical study and explains how to prepare nanofibers and nanoporous materials. The model provides a powerful tool for controlling electrospinning parameters such as voltage, flow rate, and others.

  7. A Markov Process Inspired Cellular Automata Model of Road Traffic

    OpenAIRE

    Wang, Fa; Li, Li; Hu, Jianming; Ji, Yan; Yao, Danya; Zhang, Yi; Jin, Xuexiang; Su, Yuelong; Wei, Zheng

    2008-01-01

    To provide a more accurate description of the driving behaviors in vehicle queues, a model named the Markov-Gap cellular automata model is proposed in this paper. It views the variation of the gap between two consecutive vehicles as a Markov process whose stationary distribution corresponds to the observed distribution of practical gaps. The multiformity of this Markov process provides the model enough flexibility to describe various driving behaviors. Two examples are given to show how to specialize i...
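    The core idea, a gap process whose stationary distribution matches observed gaps, can be illustrated with a small discrete Markov chain. The three gap states and the transition matrix below are invented purely for illustration:

```python
import numpy as np

# Toy 3-state gap process (small / medium / large gap between vehicles).
# The transition probabilities are illustrative, not from the paper.
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

pi = np.ones(3) / 3          # start from a uniform gap distribution
for _ in range(200):         # power iteration: pi <- pi P
    pi = pi @ P
print(pi)                    # stationary gap distribution, approx [0.286 0.429 0.286]
```

    Calibrating P so that this stationary vector matches the empirically observed gap distribution is the kind of fitting step the abstract describes.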

  8. Reverse Osmosis Processing of Organic Model Compounds and Fermentation Broths

    Science.gov (United States)

    2006-04-01

    ...key species found in the fermentation broth: ethanol, butanol, acetic acid, oxalic acid, lactic acid, and butyric acid. Correlations of the rejection... [Postprint AFRL-ML-TY-TP-2007-4545; Robert Diltz et al., "Reverse osmosis processing of organic model compounds and fermentation broths," Bioresource Technology 98 (2007) 686–695.]

  9. Stochastic process corrosion growth models for pipeline reliability

    International Nuclear Information System (INIS)

    Bazán, Felipe Alexander Vargas; Beck, André Teófilo

    2013-01-01

    Highlights: •Novel non-linear stochastic process corrosion growth model is proposed. •Corrosion rate modeled as random Poisson pulses. •Time to corrosion initiation and inherent time-variability properly represented. •Continuous corrosion growth histories obtained. •Model is shown to precisely fit actual corrosion data at two time points. -- Abstract: Linear random variable corrosion models are extensively employed in reliability analysis of pipelines. However, linear models grossly neglect well-known characteristics of the corrosion process. Herein, a non-linear model is proposed, where corrosion rate is represented as a Poisson square wave process. The resulting model represents inherent time-variability of corrosion growth, produces continuous growth and leads to mean growth at less-than-one power of time. Different corrosion models are adjusted to the same set of actual corrosion data for two inspections. The proposed non-linear random process corrosion growth model leads to the best fit to the data, while better representing problem physics
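    The Poisson-square-wave idea, a corrosion rate that holds a random value between Poisson-distributed jump times, can be sketched by integrating the rate to get a continuous depth history. The intensity, amplitude distribution, and horizon below are illustrative choices, and this simplified sketch does not reproduce the paper's less-than-one power-of-time mean growth:

```python
import numpy as np

def corrosion_depth(T, lam=0.5, mean_rate=0.1, rng=None):
    """Corrosion depth (arbitrary units) after time T from a
    Poisson-square-wave rate: the rate holds a random value
    (exponential with mean mean_rate, an illustrative choice)
    between jumps of a Poisson process with intensity lam."""
    rng = np.random.default_rng(rng)
    t, depth = 0.0, 0.0
    rate = rng.exponential(mean_rate)
    while True:
        dt = rng.exponential(1.0 / lam)       # holding time to next jump
        if t + dt >= T:
            return depth + rate * (T - t)     # integrate the final pulse
        depth += rate * dt                    # integrate this pulse
        t += dt
        rate = rng.exponential(mean_rate)     # draw a new pulse amplitude

depths = [corrosion_depth(30.0, rng=i) for i in range(200)]
print(np.mean(depths))   # near mean_rate * T = 3.0 in expectation
```

    Because each realization is a continuous, random growth history, two inspection measurements at different times can constrain both the mean and the inherent time-variability, which is the fitting advantage the highlights claim.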

  10. Numerical Model based Reliability Estimation of Selective Laser Melting Process

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2014-01-01

    Selective laser melting is developing into a standard manufacturing technology with applications in various sectors. However, the process is still far from being at par with conventional processes such as welding and casting, the primary reason of which is the unreliability of the process. While... of the selective laser melting process. A validated 3D finite-volume alternating-direction-implicit numerical technique is used to model the selective laser melting process, and is calibrated against results from single track formation experiments. Correlation coefficients are determined for process input parameters such as laser power, speed, beam profile, etc. Subsequently, uncertainties in the processing parameters are utilized to predict a range for the various outputs, using a Monte Carlo method based uncertainty analysis methodology, and the reliability of the process is established....
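    The Monte Carlo uncertainty analysis described can be sketched with a toy surrogate standing in for the finite-volume model. The surrogate formula, the input distributions, and the tolerance band are all invented for illustration:

```python
import numpy as np

def melt_depth(power, speed):
    """Toy surrogate for a melt-pool model: depth grows with the
    energy density power/speed.  Purely illustrative, not the
    paper's calibrated finite-volume model."""
    return 0.05 * power / speed

rng = np.random.default_rng(0)
n = 100_000
power = rng.normal(200.0, 10.0, n)     # W: uncertain laser power
speed = rng.normal(1000.0, 50.0, n)    # mm/s: uncertain scan speed

depth = melt_depth(power, speed)                 # propagated output range
spec_ok = (depth > 0.009) & (depth < 0.011)      # invented tolerance band
print(depth.mean(), spec_ok.mean())
```

    The fraction of samples inside the tolerance band plays the role of the process-reliability estimate; in practice each sample would be a full simulation run rather than a one-line surrogate.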

  11. Exposing earth surface process model simulations to a large audience

    Science.gov (United States)

    Overeem, I.; Kettner, A. J.; Borkowski, L.; Russell, E. L.; Peddicord, H.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) represents a diverse group of >1300 scientists who develop and apply numerical models to better understand the Earth's surface. CSDMS has a mandate to make the public more aware of model capabilities and therefore started sharing state-of-the-art surface process modeling results with large audiences. One platform to reach audiences outside the science community is through museum displays on 'Science on a Sphere' (SOS). Developed by NOAA, SOS is a giant globe, linked with computers and multiple projectors, that can display data and animations on a sphere. CSDMS has developed and contributed model simulation datasets for the SOS system since 2014, including hydrological processes, coastal processes, and human interactions with the environment. Model simulations of a hydrological and sediment transport model (WBM-SED) illustrate global river discharge patterns. WAVEWATCH III simulations have been specifically processed to show the impacts of hurricanes on ocean waves, with focus on hurricane Katrina and super storm Sandy. A large world dataset of dams built over the last two centuries gives an impression of the profound influence of humans on water management. Given the exposure of SOS, CSDMS aims to contribute at least 2 model datasets a year, and will soon provide displays of global river sediment fluxes and changes of the sea ice free season along the Arctic coast. Over 100 facilities worldwide show these numerical model displays to an estimated 33 million people every year. Datasets, storyboards, and teacher follow-up materials associated with the simulations are developed to address Common Core science K-12 standards. CSDMS dataset documentation aims to make people aware of the fact that they look at numerical model results, that underlying models have inherent assumptions and simplifications, and that limitations are known. CSDMS contributions aim to familiarize large audiences with the use of numerical

  12. Ecosystem management via interacting models of political and ecological processes

    Directory of Open Access Journals (Sweden)

    Haas, T. C.

    2004-01-01

    Full Text Available The decision to implement environmental protection options is a political one. Political realities may cause a country not to heed the most persuasive scientific analysis of an ecosystem's future health. A predictive understanding of the political processes that result in ecosystem management decisions may help guide ecosystem management policymaking. To this end, this article develops a stochastic, temporal model of how political processes influence and are influenced by ecosystem processes. This model is realized in a system of interacting influence diagrams that model the decision making of a country's political bodies. These decisions interact with a model of the ecosystem enclosed by the country. As an example, a model for cheetah (Acinonyx jubatus) management in Kenya is constructed and fitted to decision and ecological data.

  13. Virtual models of the HLA class I antigen processing pathway.

    Science.gov (United States)

    Petrovsky, Nikolai; Brusic, Vladimir

    2004-12-01

    Antigen recognition by cytotoxic CD8 T cells is dependent upon a number of critical steps in MHC class I antigen processing, including proteasomal cleavage, TAP transport into the endoplasmic reticulum, and MHC class I binding. Based on extensive experimental data relating to each of these steps, there is now the capacity to model individual antigen processing steps with a high degree of accuracy. This paper demonstrates the potential to bring together models of individual antigen processing steps, for example proteasome cleavage, TAP transport, and MHC binding, to build highly informative models of functional pathways. In particular, we demonstrate how an artificial neural network model of TAP transport was used to mine an HLA-binding database so as to identify HLA-binding peptides transported by TAP. This integrated model of antigen processing provided the unique insight that HLA class I alleles apparently constitute two separate classes: those that are TAP-efficient for peptide loading (HLA-B27, -A3, and -A24) and those that are TAP-inefficient (HLA-A2, -B7, and -B8). Hence, using this integrated model we were able to generate novel hypotheses regarding antigen processing, and these hypotheses are now capable of being tested experimentally. This model confirms the feasibility of constructing a virtual immune system, whereby each additional step in antigen processing is incorporated into a single modular model. Accurate models of antigen processing have implications for the study of basic immunology as well as for the design of peptide-based vaccines and other immunotherapies.

  14. Innovation Process Planning Model in the Bpmn Standard

    Directory of Open Access Journals (Sweden)

    Jurczyk-Bunkowska Magdalena

    2013-12-01

    Full Text Available The aim of the article is to show the relations in the innovation process planning model. The relations argued here guarantee a stable and reliable way to achieve increased competitiveness through professionally directed development of the company. When initiating the realisation of the process, the manager needs to specify the intended effect, which is then achieved through a system of indirect goals. The original model proposed here shows the standard of dependence between the plans of the fragments of the innovation process that together achieve its final goal. The relations in the present article were shown using the Business Process Model and Notation (BPMN) standard. This enabled the specification of interrelations between the decision levels at which subsequent fragments of the innovation process are planned. This allows better coordination of the process, reducing the time needed to achieve its effect. The model has been compiled on the basis of practices followed in Polish companies. It is not, however, a reflection of these practices, but rather an idealised standard of proceedings that aims at improving the effectiveness of innovation management at the operational level. The model could be the basis for creating systems that support decision making, knowledge management or communication in innovation processes.

  15. Multivariate Product-Shot-noise Cox Point Process Models

    DEFF Research Database (Denmark)

    Jalilian, Abdollah; Guan, Yongtao; Mateu, Jorge

    We introduce a new multivariate product-shot-noise Cox process which is useful for modeling multi-species spatial point patterns with clustering intra-specific interactions and neutral, negative or positive inter-specific interactions. The auto and cross pair correlation functions of the process...

  16. Parallel direct solver for finite element modeling of manufacturing processes

    DEFF Research Database (Denmark)

    Nielsen, Chris Valentin; Martins, P.A.F.

    2017-01-01

    The central processing unit (CPU) time is of paramount importance in finite element modeling of manufacturing processes. Because the most significant part of the CPU time is consumed in solving the main system of equations resulting from finite element assemblies, different approaches have been d...

  17. FibreChain: characterization and modeling of thermoplastic composites processing

    NARCIS (Netherlands)

    Rietman, Bert; Niazi, Muhammad Sohail; Akkerman, Remko; Lomov, S.V.

    2013-01-01

    Thermoplastic composites feature the advantage of melting and shaping. The material properties during processing and the final product properties are to a large extent determined by the thermal history of the material. The approach in the FP7-project FibreChain for process chain modeling of

  18. PROGRAM COMPLEX FOR MODELING OF THE DETAILS HARDENING PROCESS

    Directory of Open Access Journals (Sweden)

    S. P. Kundas

    2004-01-01

    Full Text Available The article presents the program complex ThermoSim, consisting of a preprocessor, processor and postprocessor, intended for modeling (analysis) of thermophysical processes and characteristics of instrument-making and machine-building parts, and for diagnostics and optimization of technological processes of heat treatment and part designs without using destructive control methods.

  19. Innovative model of business process reengineering at machine building enterprises

    Science.gov (United States)

    Nekrasov, R. Yu; Tempel, Yu A.; Tempel, O. A.

    2017-10-01

    The paper provides consideration of business process reengineering viewed as a managerial innovation accepted by present-day machine building enterprises, as well as ways to improve its procedure. A developed innovative model of reengineering measures is described, based on the process approach and other principles of company management.

  20. Biomolecular Modeling in a Process Dynamics and Control Course

    Science.gov (United States)

    Gray, Jeffrey J.

    2006-01-01

    I present modifications to the traditional course entitled, "Process dynamics and control," which I renamed "Modeling, dynamics, and control of chemical and biological processes." Additions include the central dogma of biology, pharmacokinetic systems, population balances, control of gene transcription, and large-scale…

  1. Modelling of the aqueous debittering process of Lupinus mutabilis Sweet

    NARCIS (Netherlands)

    Carvajal-Larenas, F.E.; Nout, M.J.R.; Boekel, van M.A.J.S.; Linnemann, A.R.

    2013-01-01

    We investigated the process of lupin debittering by soaking, cooking and washing in water using a newly designed hydroagitator. The effect on alkaloids content, solids in the product, final weight, processing time and water and energy consumption were expressed in a mathematical model for

  2. CFD modelling of condensers for freeze-drying processes

    Indian Academy of Sciences (India)

    ... the condenser, in order to evaluate condenser efficiency and gain deeper insights of the process to be used for the improvement of its design. Both a complete laboratory-scale freeze-drying apparatus and an industrial-scale condenser have been investigated in this work, modelling the process of water vapour deposition.

  3. A Unified Toolset for Business Process Model Formalization

    NARCIS (Netherlands)

    B. Changizi (Behnaz); N. Kokash (Natallia); F. Arbab (Farhad)

    2010-01-01

    In this paper, we present a toolset to automate the transformation of Business Process Modeling Notation (BPMN), UML Sequence Diagrams, and Business Process Execution Language (BPEL) into their proposed formal semantics expressed in the channel-based coordination language Reo. Such

  4. Process chain modeling and selection in an additive manufacturing context

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn; Stolfi, Alessandro; Mischkot, Michael

    2016-01-01

    This paper introduces a new two-dimensional approach to modeling manufacturing process chains. This approach is used to consider the role of additive manufacturing technologies in process chains for a part with micro scale features and no internal geometry. It is shown that additive manufacturing...... evolving fields like additive manufacturing....

  5. Modelling aspects of distributed processing in telecommunication networks

    NARCIS (Netherlands)

    Tomasgard, A; Audestad, JA; Dye, S; Stougie, L; van der Vlerk, MH; Wallace, SW

    1998-01-01

    The purpose of this paper is to formally describe new optimization models for telecommunication networks with distributed processing. Modern distributed networks put more focus on the processing of information and less on the actual transportation of data than we are traditionally used to in

  6. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: ''Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model''. One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  7. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    International Nuclear Information System (INIS)

    E.L. Hardin

    2000-01-01

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: ''Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model''. One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II)

  8. Modeling a novel glass immobilization waste treatment process using FLOW

    International Nuclear Information System (INIS)

    Ferrada, J.J.; Nehls, J.W. Jr.; Welch, T.D.; Giardina, J.L.

    1996-01-01

    One option for control and disposal of surplus fissile materials is the Glass Material Oxidation and Dissolution System (GMODS), a process developed at ORNL for directly converting Pu-bearing material into a durable high-quality glass waste form. This paper presents a preliminary assessment of the GMODS process flowsheet using FLOW, a chemical process simulator. The simulation showed that the glass chemistry postulated in the models has acceptable levels of risk

  9. Numerical modelling of the tilt casting processes of titanium aluminides

    OpenAIRE

    Wang, Hong

    2008-01-01

    This research has investigated the modelling and optimisation of the tilt casting process for titanium aluminides (TiAl). The study was carried out in parallel with experimental research undertaken at the IRC at the University of Birmingham, which proposed using tilt casting inside a vacuum chamber and combining this tilt casting process with Induction Skull Melting (ISM). A totally novel investment casting process, suitable for casting gamma TiAl, is being developed. As ...

  10. Modeling process-structure-property relationships for additive manufacturing

    Science.gov (United States)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Yu, Cheng; Liu, Zeliang; Lian, Yanping; Wolff, Sarah; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-02-01

    This paper presents our latest work on comprehensive modeling of process-structure-property relationships for additive manufacturing (AM) materials, including using data-mining techniques to close the cycle of design-predict-optimize. To illustrate the process-structure relationship, the multi-scale multi-physics process modeling starts from the micro-scale to establish a mechanistic heat source model, to the meso-scale models of individual powder particle evolution, and finally to the macro-scale model to simulate the fabrication process of a complex product. To link structure and properties, a high-efficiency mechanistic model, self-consistent clustering analysis, is developed to capture a variety of material responses. The model incorporates factors such as voids, phase composition, inclusions, and grain structures, which are the differentiating features of AM metals. Furthermore, we propose data-mining as an effective solution for novel rapid design and optimization, which is motivated by the numerous influencing factors in the AM process. We believe this paper will provide a roadmap to advance AM fundamental understanding and guide the monitoring and advanced diagnostics of AM processing.

  11. Eye Tracking Meets the Process of Process Modeling: a Visual Analytic Approach

    DEFF Research Database (Denmark)

    Burattin, Andrea; Kaiser, M.; Neurauter, Manuel

    2017-01-01

    Research on the process of process modeling (PPM) studies how process models are created. It typically uses the logs of the interactions with the modeling tool to assess the modeler’s behavior. In this paper we suggest introducing an additional stream of data (i.e., eye tracking) to improve...... diagram, heat maps, fixations distributions) both static and dynamic (i.e., movies with the evolution of the model and eye tracking data on top)....... the analysis of the PPM. We show that, by exploiting this additional source of information, we can refine the detection of comprehension phases (introducing activities such as “semantic validation” or “problem understanding”) as well as provide more exploratory visualizations (e.g., combined modeling phase

  12. A decision model for the risk management of hazardous processes

    International Nuclear Information System (INIS)

    Holmberg, J.E.

    1997-03-01

    A decision model for risk management of hazardous processes as an optimisation problem of a point process is formulated in the study. In the approach, the decisions made by the management are divided into three categories: (1) planned process lifetime, (2) selection of the design and, (3) operational decisions. These three controlling methods play quite different roles in the practical risk management, which is also reflected in our approach. The optimisation of the process lifetime is related to the licensing problem of the process. It provides a boundary condition for a feasible utility function that is used as the actual objective function, i.e., maximizing the process lifetime utility. By design modifications, the management can affect the inherent accident hazard rate of the process. This is usually a discrete optimisation task. The study particularly concentrates upon the optimisation of the operational strategies given a certain design and licensing time. This is done by a dynamic risk model (marked point process model) representing the stochastic process of events observable or unobservable to the decision maker. An optimal long term control variable guiding the selection of operational alternatives in short term problems is studied. The optimisation problem is solved by the stochastic quasi-gradient procedure. The approach is illustrated by a case study. (23 refs.)

  13. Continuation-like semantics for modeling structural process anomalies

    Directory of Open Access Journals (Sweden)

    Grewe Niels

    2012-09-01

    Full Text Available Abstract Background Biomedical ontologies usually encode knowledge that applies always or at least most of the time, that is, in normal circumstances. But for some applications, like phenotype ontologies, it is becoming increasingly important to represent information about aberrations from a norm. These aberrations may be modifications of physiological structures, but also modifications of biological processes. Methods To facilitate precise definitions of process-related phenotypes, such as delayed eruption of the primary teeth or disrupted ocular pursuit movements, I introduce a modeling approach that draws inspiration from the use of continuations in the analysis of programming languages and apply a similar idea to ontological modeling. This approach characterises processes by describing their outcome up to a certain point and the way they will continue in the canonical case. Definitions of process types are then given in terms of their continuations, and anomalous phenotypes are defined by their differences to the canonical definitions. Results The resulting model is capable of accurately representing structural process anomalies. It allows distinguishing between different anomaly kinds (delays, interruptions), gives identity criteria for interrupted processes, and explains why normal and anomalous process instances can be subsumed under a common type, thus establishing the connection between canonical and anomalous process-related phenotypes. Conclusion This paper shows how to give semantically rich definitions of process-related phenotypes. These allow the application areas of phenotype ontologies to be expanded beyond literature annotation and the establishment of genotype-phenotype associations to the detection of anomalies in suitably encoded datasets.

  14. Control of automatic processes: A parallel distributed-processing model of the stroop effect. Technical report

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J.D.; Dunbar, K.; McClelland, J.L.

    1988-06-16

    A growing body of evidence suggests that traditional views of automaticity are in need of revision. For example, automaticity has often been treated as an all-or-none phenomenon, and traditional theories have held that automatic processes are independent of attention. Yet recent empirical data suggest that automatic processes are continuous, and furthermore are subject to attentional control. In this paper we present a model of attention which addresses these issues. Using a parallel distributed processing framework we propose that the attributes of automaticity depend upon the strength of a process and that strength increases with training. Using the Stroop effect as an example, we show how automatic processes are continuous and emerge gradually with practice. Specifically, we present a computational model of the Stroop task which simulates the time course of processing as well as the effects of learning.
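
    The core claim, that interference depends on the relative strength of pathways acquired through training, can be sketched with a toy evidence accumulator. This is a drastic simplification of the paper's connectionist network, and the strengths, leak factor and step size below are all invented:

    ```python
    def cycles_to_threshold(task_strength, distractor_strength, congruent,
                            leak=0.3, threshold=1.0, step=0.01):
        """Toy PDP-style accumulator: response evidence grows with the attended
        pathway's strength plus a leaked contribution from the irrelevant
        pathway, whose sign depends on stimulus congruence."""
        sign = 1.0 if congruent else -1.0
        drift = task_strength + sign * leak * distractor_strength
        evidence, cycles = 0.0, 0
        while evidence < threshold:
            evidence += step * drift
            cycles += 1
        return cycles

    WORD, COLOR = 0.9, 0.4   # word reading is the overtrained, stronger pathway

    rt_color_incong = cycles_to_threshold(COLOR, WORD, congruent=False)
    rt_color_cong = cycles_to_threshold(COLOR, WORD, congruent=True)
    rt_word_incong = cycles_to_threshold(WORD, COLOR, congruent=False)
    rt_word_cong = cycles_to_threshold(WORD, COLOR, congruent=True)

    # Stroop asymmetry: conflict costs color naming far more than word reading
    print(rt_color_incong - rt_color_cong, rt_word_incong - rt_word_cong)
    ```

    Increasing the color pathway's strength (mimicking practice) shrinks its interference cost gradually, which is the continuous, training-dependent notion of automaticity the abstract argues for.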

  15. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    Science.gov (United States)

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  16. ARTA process model of maritime clutter and targets

    CSIR Research Space (South Africa)

    McDonald, A

    2012-10-01

    Full Text Available . The validity and practicality of the ARTA process model is demonstrated by deriving models for a maritime target and for sea clutter, both from measurements and without any prior assumption regarding the distribution of measurements. This ability to generate...
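
    The ARTA (autoregressive-to-anything) idea can be sketched compactly: drive a Gaussian AR(1) base process, then push it through the normal CDF and the inverse CDF of the desired marginal. The exponential marginal and the parameter values below are chosen purely for illustration, not taken from the paper:

    ```python
    import numpy as np
    from math import erf, sqrt

    rng = np.random.default_rng(1)

    def arta_exponential(n, rho, scale=1.0):
        """ARTA sketch: AR(1) Gaussian base process with lag-1 correlation rho,
        mapped through Phi and the inverse exponential CDF, so the output keeps
        autocorrelation while its marginal distribution becomes exponential."""
        z = np.empty(n)
        z[0] = rng.standard_normal()
        for i in range(1, n):
            z[i] = rho * z[i - 1] + sqrt(1.0 - rho**2) * rng.standard_normal()
        u = np.array([0.5 * (1.0 + erf(zi / sqrt(2.0))) for zi in z])  # Phi(z)
        return -scale * np.log(1.0 - u)        # inverse exponential CDF

    x = arta_exponential(20_000, rho=0.8)
    lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]
    print(f"mean ~ {np.mean(x):.2f}, lag-1 autocorrelation ~ {lag1:.2f}")
    ```

    In a full ARTA fit the base-process correlations are chosen numerically so that the transformed series matches a target autocorrelation structure; here rho is simply set by hand, and for clutter modeling the empirical marginal would replace the exponential.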

  17. In-situ biogas upgrading process: modeling and simulations aspects

    DEFF Research Database (Denmark)

    Lovato, Giovanna; Alvarado-Morales, Merlin; Kovalovszki, Adam

    2017-01-01

    Biogas upgrading processes by in-situ hydrogen (H2) injection are still challenging and could benefit from a mathematical model to predict system performance. Therefore, a previous model on anaerobic digestion was updated and expanded to include the effect of H2 injection into the liquid phase of...

  18. Model Based Monitoring and Control of Chemical and Biochemical Processes

    DEFF Research Database (Denmark)

    Huusom, Jakob Kjøbsted

    This presentation will give an overview of the work performed at the department of Chemical and Biochemical Engineering related to process control. A research vision is formulated and related to a number of active projects at the department. In more detail a project describing model estimation...... and controller tuning in Model Predictive Control application is discussed....

  19. NEURO-FUZZY MODELLING OF BLENDING PROCESS IN CEMENT PLANT

    Directory of Open Access Journals (Sweden)

    Dauda Olarotimi Araromi

    2015-11-01

    Full Text Available The profitability of a cement plant depends largely on the efficient operation of the blending stage; therefore, there is a need to control the process at the blending stage in order to maintain the chemical composition of the raw mix near or at the desired value with minimum variance, despite variation in the raw material composition. In this work, a neuro-fuzzy model of the dynamic behaviour of the system is developed to predict the total carbonate content in the raw mix at different clay feed rates. The data used for parameter estimation and model validation were obtained from one of the cement plants in Nigeria. The data were pre-processed to remove outliers and filtered using a smoothing technique in order to reveal their dynamic nature. An autoregressive exogenous (ARX) model was developed for comparison purposes. The ARX model gave high root mean square errors (RMSE) of 5.408 and 4.0199 for training and validation, respectively. The poor fit of the ARX model is an indication of the nonlinear nature of the process. However, both visual and statistical analyses of the neuro-fuzzy (ANFIS) model gave far better results: RMSE values for training and validation are 0.28167 and 0.7436, respectively, and the sum of squared errors (SSE) and R-square are 39.6692 and 0.9969, respectively. All these indicate good performance of the ANFIS model, which can be used for control design of the process.
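
    The goodness-of-fit figures quoted above (RMSE, R-square) are straightforward to reproduce for any pair of measured and predicted series. A self-contained sketch with synthetic stand-in data follows; the carbonate values and noise levels are invented, not the paper's:

    ```python
    import numpy as np

    def rmse(y, y_hat):
        return float(np.sqrt(np.mean((y - y_hat) ** 2)))

    def r_square(y, y_hat):
        ss_res = float(np.sum((y - y_hat) ** 2))
        ss_tot = float(np.sum((y - np.mean(y)) ** 2))
        return 1.0 - ss_res / ss_tot

    # Synthetic stand-in for measured total carbonate content and two model fits
    rng = np.random.default_rng(7)
    y = 77.0 + rng.normal(0.0, 3.0, 200)            # "measured" % carbonate
    arx_pred = y + rng.normal(0.0, 2.0, 200)        # looser, ARX-like fit
    anfis_pred = y + rng.normal(0.0, 0.5, 200)      # tighter, ANFIS-like fit

    rmse_arx, r2_arx = rmse(y, arx_pred), r_square(y, arx_pred)
    rmse_anfis, r2_anfis = rmse(y, anfis_pred), r_square(y, anfis_pred)
    print(f"ARX-like:   RMSE={rmse_arx:.3f}  R2={r2_arx:.3f}")
    print(f"ANFIS-like: RMSE={rmse_anfis:.3f}  R2={r2_anfis:.3f}")
    ```

    Computing the metrics on held-out validation data, as the paper does, is what distinguishes genuine predictive quality from overfitting to the training set.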

  20. The quark-gluon model for particle production processes

    International Nuclear Information System (INIS)

    Volkovitskij, P.E.

    1983-01-01

    The quark-gluon model for hadronization of strings produced in soft and hard processes is suggested. The model is based on the distribution functions of valence quarks in hadrons which have correct Regge behaviour. The simplest case is discussed in which only the longitudinal degrees of freedom are taken into account

  1. Models as instruments for optimizing hospital processes: a systematic review

    NARCIS (Netherlands)

    van Sambeek, J. R. C.; Cornelissen, F. A.; Bakker, P. J. M.; Krabbendam, J. J.

    2010-01-01

    PURPOSE: The purpose of this article is to find decision-making models for the design and control of processes regarding patient flows, considering various problem types, and to find out how usable these models are for managerial decision making. DESIGN/METHODOLOGY/APPROACH: A systematic review of

  2. Automatic Detection and Resolution of Lexical Ambiguity in Process Models

    NARCIS (Netherlands)

    Pittke, F.; Leopold, H.; Mendling, J.

    2015-01-01

    System-related engineering tasks are often conducted using process models. In this context, it is essential that these models do not contain structural or terminological inconsistencies. To this end, several automatic analysis techniques have been proposed to support quality assurance. While formal

  3. Consolidation process model for film stacking glass/PPS laminates

    NARCIS (Netherlands)

    Grouve, Wouter Johannes Bernardus; Akkerman, Remko

    2010-01-01

    A model is proposed to optimise the processing parameters for the consolidation of glass/polyphenylene sulphide (PPS) laminates using a film stacking procedure. In a split approach, the heating and consolidation phase are treated separately. The heating phase is modelled using the one-dimensional

  5. Stochastic Greybox Modeling of an Alternating Activated Sludge Process

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus Fogtmann; Munk-Nielsen, T.; Tychsen, P.

    Summary of key findings We found a greybox model for state estimation and control of the BioDenitro process based on a reduced ASM1. We then applied Maximum Likelihood Estimation on measurements from a real full-scale waste water treatment plant to estimate the model parameters. The estimation me...... forecasts of the load....
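
    The maximum likelihood step of greybox identification can be illustrated on a one-state discrete-time surrogate. This merely stands in for the reduced ASM1 model; the dynamics and parameter values are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # One-state linear surrogate process: x[k+1] = a * x[k] + process noise
    a_true, sigma = 0.9, 0.2
    x = np.empty(500)
    x[0] = 1.0
    for k in range(499):
        x[k + 1] = a_true * x[k] + sigma * rng.standard_normal()

    def neg_log_lik(a):
        """Negative Gaussian log-likelihood with the noise variance profiled out."""
        resid = x[1:] - a * x[:-1]
        return 0.5 * len(resid) * np.log(np.mean(resid ** 2))

    # Maximum likelihood estimate by a simple grid search over the parameter
    grid = np.linspace(0.5, 0.99, 491)
    a_hat = grid[np.argmin([neg_log_lik(a) for a in grid])]
    print(f"true a = {a_true}, estimated a = {a_hat:.3f}")
    ```

    A real greybox toolchain replaces the grid search with a numerical optimizer and adds a measurement equation handled by a Kalman filter, so that both process and sensor noise are estimated from plant data.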

  6. A computational model of human auditory signal processing and perception.

    Science.gov (United States)

    Jepsen, Morten L; Ewert, Stephan D; Dau, Torsten

    2008-07-01

    A model of computational auditory signal-processing and perception that accounts for various aspects of simultaneous and nonsimultaneous masking in human listeners is presented. The model is based on the modulation filterbank model described by Dau et al. [J. Acoust. Soc. Am. 102, 2892 (1997)] but includes major changes at the peripheral and more central stages of processing. The model contains outer- and middle-ear transformations, a nonlinear basilar-membrane processing stage, a hair-cell transduction stage, a squaring expansion, an adaptation stage, a 150-Hz lowpass modulation filter, a bandpass modulation filterbank, a constant-variance internal noise, and an optimal detector stage. The model was evaluated in experimental conditions that reflect, to a different degree, effects of compression as well as spectral and temporal resolution in auditory processing. The experiments include intensity discrimination with pure tones and broadband noise, tone-in-noise detection, spectral masking with narrow-band signals and maskers, forward masking with tone signals and tone or noise maskers, and amplitude-modulation detection with narrow- and wideband noise carriers. The model can account for most of the key properties of the data and is more powerful than the original model. The model might be useful as a front end in technical applications.
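
    One of the central stages listed above, the 150-Hz lowpass modulation filter, can be illustrated with a first-order filter. This is only a crude stand-in for the published stage; the sampling rate and test signal are invented:

    ```python
    import numpy as np

    def one_pole_lowpass(x, cutoff_hz, fs):
        """First-order IIR lowpass; crude stand-in for a modulation lowpass stage."""
        alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / fs)
        y = np.empty_like(x)
        acc = 0.0
        for i, xi in enumerate(x):
            acc += alpha * (xi - acc)
            y[i] = acc
        return y

    fs = 16_000
    t = np.arange(fs) / fs   # one second of signal
    # Envelope with a slow (4 Hz) and a fast (400 Hz) modulation component
    env = 1.0 + 0.5 * np.sin(2 * np.pi * 4 * t) + 0.5 * np.sin(2 * np.pi * 400 * t)
    out = one_pole_lowpass(env, cutoff_hz=150.0, fs=fs)

    # With a 1-s window the FFT bin index equals frequency in Hz
    spec = np.abs(np.fft.rfft(out - np.mean(out)))
    print("4 Hz passed, 400 Hz attenuated:", spec[4] > spec[400])
    ```

    In the full model this stage sits between the adaptation stage and the bandpass modulation filterbank, limiting how fast the envelope fluctuations passed on to the decision stage can be.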

  7. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  8. Analysis and synthesis of solutions for the agglomeration process modeling

    Science.gov (United States)

    Babuk, V. A.; Dolotkazin, I. N.; Nizyaev, A. A.

    2013-03-01

The present work is devoted to the development of a model of the agglomeration process for propellants based on ammonium perchlorate (AP), ammonium dinitramide (ADN), HMX, an inactive binder, and nano-aluminum. Generalization of experimental data, development of a physical picture of agglomeration for the listed propellants, and development and analysis of mathematical models are carried out. Synthesis of models of the various phenomena taking place during agglomeration allows prediction of the size, quantity, chemical composition, and structure of the forming agglomerates, as well as their fraction in the set of condensed combustion products. This became possible largely due to the development of a new model of agglomerating-particle evolution on the surface of the burning propellant. The obtained results correspond to the available experimental data. It is expected that an analogous method, based on the analysis of mathematical models of the particular phenomena and their synthesis, will allow modeling of the agglomeration process for other types of metalized solid propellants.

  9. Process optimization of friction stir welding based on thermal models

    DEFF Research Database (Denmark)

    Larsen, Anders Astrup

    2010-01-01

    This thesis investigates how to apply optimization methods to numerical models of a friction stir welding process. The work is intended as a proof-of-concept using different methods that are applicable to models of high complexity, possibly with high computational cost, and without the possibility...... information of the high-fidelity model. The optimization schemes are applied to stationary thermal models of differing complexity of the friction stir welding process. The optimization problems considered are based on optimizing the temperature field in the workpiece by finding optimal translational speed...... and the backingplate by solving an inverse modelling problem in which experimental data and a numerical model are used for determining the contact heat transfer coefficient. Different parametrizations of the spatial distribution of the heat transfer coefficient are studied and discussed, and the optimization problem...

  10. Representing vegetation processes in hydrometeorological simulations using the WRF model

    DEFF Research Database (Denmark)

    Nielsen, Joakim Refslund

For accurate predictions of weather and climate, it is important that the land surface and its processes are well represented. In a mesoscale model the land surface processes are calculated in a land surface model (LSM). These processes include exchanges of energy, water and momentum between...... data and the default vegetation data in WRF were further used in high-resolution simulations over Denmark down to cloud-resolving scale (3 km). Results from two spatial resolutions were compared to investigate the influence of parametrized and resolved convection. The simulations using the parametrized...

  11. TECHNOLOGICAL PROCESS MODELING AIMING TO IMPROVE ITS OPERATIONS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ivan Mihajlović

    2011-11-01

Full Text Available This paper presents the modeling procedure for one real technological system. In this study, copper extraction from the copper flotation waste generated at the Bor Copper Mine (Serbia) was the object of modeling. A sufficient database for statistical modeling was constructed using an orthogonal factorial design of experiments. A mathematical model of the investigated system was developed using a combination of linear and multiple linear statistical analysis approaches. The purpose of such a model is to obtain optimal states of the system that enable efficient operations management. Besides technological and economical parameters, ecological parameters of the process were considered as crucial input variables.

  12. New process model proves accurate in tests on catalytic reformer

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar-Rodriguez, E.; Ancheyta-Juarez, J. (Inst. Mexicano del Petroleo, Mexico City (Mexico))

    1994-07-25

    A mathematical model has been devised to represent the process that takes place in a fixed-bed, tubular, adiabatic catalytic reforming reactor. Since its development, the model has been applied to the simulation of a commercial semiregenerative reformer. The development of mass and energy balances for this reformer led to a model that predicts both concentration and temperature profiles along the reactor. A comparison of the model's results with experimental data illustrates its accuracy at predicting product profiles. Simple steps show how the model can be applied to simulate any fixed-bed catalytic reformer.
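The coupled mass and energy balances described above can be sketched as a simple marching scheme along the reactor length. This is a generic illustration, not the Instituto Mexicano del Petroleo model: the first-order lumped kinetics and every numerical constant below are hypothetical placeholders.

```python
import math

def simulate_reformer(c_in=100.0, t_in=770.0, length=10.0, steps=2000):
    """Euler march of coupled mass and energy balances along an adiabatic bed.

    dC/dz = -k(T) * C / u                   first-order lumped kinetics
    dT/dz = -dH * k(T) * C / (rho_cp * u)   endothermic reaction cools the bed
    Returns a list of (z, concentration, temperature) tuples.
    """
    k0, ea, r_gas = 5.0e4, 9.0e4, 8.314  # hypothetical kinetics [1/s], [J/mol]
    dh = 6.0e4                           # heat of reaction [J/mol], endothermic
    u, rho_cp = 1.0, 3.0e5               # velocity [m/s], heat capacity [J/(m^3 K)]
    dz = length / steps
    c, t = c_in, t_in
    profile = [(0.0, c, t)]
    for i in range(1, steps + 1):
        rate = k0 * math.exp(-ea / (r_gas * t)) * c
        c += dz * (-rate / u)
        t += dz * (-dh * rate / (rho_cp * u))
        profile.append((i * dz, c, t))
    return profile
```

Both profiles fall monotonically along the bed, reproducing the qualitative shape of the concentration and temperature profiles such a model predicts for an endothermic reforming reaction.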

  13. Modelling of the Heating Process in a Thermal Screw

    Science.gov (United States)

    Zhang, Xuan; Veje, Christian T.; Lassen, Benny; Willatzen, Morten

    2012-11-01

The procedure of efficiently separating dry-stuff (proteins), fat, and water is an important process in the handling of waste products from industrial and commercial meat manufacturers. One of the sub-processes in a separation facility is a thermal screw, where the raw material (after proper mincing) is heated in order to melt fat, coagulate protein, and free water. This process is very energy consuming, and product quality is highly dependent on accurate temperature control of the process. A key quality parameter is the time that the product is maintained at temperatures within a certain threshold. A detailed mathematical model for the heating process in the thermal screw is developed and analysed. The model is formulated as a set of partial differential equations including the latent heat for the melting of fat and the boiling of water, respectively. The product is modelled by three components: water, fat, and dry-stuff (bones and proteins). The melting of the fat component appears as a plateau in the product temperature. The model effectively captures the product outlet temperature and the energy consumed. Depending on the raw material composition, "soft" or "dry", the model outlines the heat injection and screw speeds necessary to obtain optimal output quality.
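The melting plateau described above can be captured with an enthalpy formulation: temperature is recovered from accumulated enthalpy through a piecewise relation that stays flat while latent heat is absorbed. This sketch is a deliberately reduced 0-D analogue of the paper's PDE model, with made-up property values.

```python
def temperature_from_enthalpy(h, cp=3.0, t_melt=40.0, latent=50.0):
    """Piecewise enthalpy-temperature relation with a fat-melting plateau.

    h in kJ/kg, cp in kJ/(kg K), latent heat in kJ/kg absorbed at t_melt.
    """
    h_melt = cp * t_melt
    if h < h_melt:
        return h / cp                     # sensible heating below melting
    if h < h_melt + latent:
        return t_melt                     # latent heat: temperature holds flat
    return t_melt + (h - h_melt - latent) / cp

def heat_product(q_rate=5.0, dt=0.1, t_end=60.0):
    """March enthalpy forward under constant heat injection; (time, T) pairs."""
    h, history = 0.0, []
    for i in range(round(t_end / dt) + 1):
        history.append((i * dt, temperature_from_enthalpy(h)))
        h += q_rate * dt
    return history
```

Plotting the result shows the plateau at the melting temperature, the feature the model uses to track how long the product stays within a temperature threshold.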

  14. Rapid Prototyping of wax foundry models in an incremental process

    Directory of Open Access Journals (Sweden)

    B. Kozik

    2011-04-01

Full Text Available The paper presents an analysis of incremental methods of creating wax foundry models. There are two methods of Rapid Prototyping of wax models in an incremental process which are more and more often used in industrial practice and in scientific research. Applying Rapid Prototyping methods in the process of making casts allows for acceleration of work on preparing prototypes. It is especially important in the case of elements having complicated shapes. The time of making a wax model, depending on the size and the applied RP method, may vary from several to a few dozen hours.

  15. Numerical modelling of the jet nozzle enrichment process

    International Nuclear Information System (INIS)

    Vercelli, P.

    1983-01-01

A numerical model was developed for the simulation of the isotopic enrichment produced by the jet nozzle process. The flow was considered stationary and under ideal gas conditions. The model calculates, for any position of the skimmer piece: (a) values of radial mass concentration profiles for each isotopic species and (b) values of the elementary separation effect (Σ_A) and the uranium cut (θ). The comparison of the numerical results obtained with the experimental values given in the literature proves the validity of the present work as an initial step in the modelling of the process. (Author) [pt

  16. Modelling the Pultrusion Process of Off Shore Wind Turbine Blades

    DEFF Research Database (Denmark)

    Baran, Ismet

    to the quasi-static mechanical model in which the finite element method is employed. In the mechanical model, the composite part is assumed to advance along the pulling direction meanwhile tracking the corresponding temperature and degree of cure profiles. Modelling the pultrusion process containing both uni....... The compaction, viscous and frictional forces have been predicted for a pultruded composite rod. The viscous drag is found to be the main contribution in terms of the frictional force to the overall pulling force, while the contribution due to material compaction at the inlet is found to be negligible. Process...

  17. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...... need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work...... of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted....

  18. A model of algorithmic representation of a business process

    Directory of Open Access Journals (Sweden)

    E. I. Koshkarova

    2014-01-01

Full Text Available This article presents and justifies the possibility of developing a method for the estimation and optimization of enterprise business processes; the proposed method is based on the identity of two notions: an algorithm and a business process. The described method relies on extracting a recursive model from the business process, using the example of a process automated by a BPM system, and on subsequent estimation and optimization of that process in accordance with estimation and optimization techniques applied to algorithms. The results of this investigation could be used by experts working in the field of reengineering of enterprise business processes, automation of business processes, and development of enterprise information systems.

  19. Modeling and Control of Multivariable Process Using Intelligent Techniques

    Directory of Open Access Journals (Sweden)

    Subathra Balasubramanian

    2010-10-01

Full Text Available For nonlinear dynamic systems, first-principles-based modeling and control is difficult to implement. In this study, a fuzzy controller and a recurrent fuzzy controller are developed for a MIMO process. A fuzzy logic controller is a model-free controller designed based on knowledge about the process. Two types of rule-based fuzzy models are available: the linguistic (Mamdani) model and the Takagi-Sugeno (TS) model. Of these two, the Takagi-Sugeno model has attracted the most attention. The application of static fuzzy controllers is limited to static processes due to their feedforward structure, but most real-time processes are dynamic and require a history of input/output data. In order to store past values a memory unit is needed, which is introduced by the recurrent structure. The proposed recurrent fuzzy structure is used to develop a controller for a two-tank heating process. Both controllers are designed and implemented in a real-time environment and their performance is compared.
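The Takagi-Sugeno scheme mentioned above combines rule firing strengths with linear consequents through a weighted average. A minimal sketch follows; the rule base and gains are invented for illustration and are not taken from the paper's two-tank controller.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ts_inference(error, rules):
    """First-order Takagi-Sugeno inference: weighted average of linear
    consequents p*error + q, weighted by each rule's firing strength."""
    num = den = 0.0
    for membership, (p, q) in rules:
        w = membership(error)
        num += w * (p * error + q)
        den += w
    return num / den if den else 0.0

# Hypothetical three-rule base for a temperature-error input:
rules = [
    (lambda e: tri(e, -2.0, -1.0, 0.0), (0.5, -1.0)),  # large negative error
    (lambda e: tri(e, -1.0,  0.0, 1.0), (1.0,  0.0)),  # error near zero
    (lambda e: tri(e,  0.0,  1.0, 2.0), (0.5,  1.0)),  # large positive error
]
print(ts_inference(0.0, rules))  # -> 0.0 (only the middle rule fires)
```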

  20. Validating Computational Cognitive Process Models across Multiple Timescales

    Science.gov (United States)

    Myers, Christopher; Gluck, Kevin; Gunzelmann, Glenn; Krusmark, Michael

    2010-12-01

    Model comparison is vital to evaluating progress in the fields of artificial general intelligence (AGI) and cognitive architecture. As they mature, AGI and cognitive architectures will become increasingly capable of providing a single model that completes a multitude of tasks, some of which the model was not specifically engineered to perform. These models will be expected to operate for extended periods of time and serve functional roles in real-world contexts. Questions arise regarding how to evaluate such models appropriately, including issues pertaining to model comparison and validation. In this paper, we specifically address model validation across multiple levels of abstraction, using an existing computational process model of unmanned aerial vehicle basic maneuvering to illustrate the relationship between validity and timescales of analysis.

  1. Functional State Modelling of Cultivation Processes: Dissolved Oxygen Limitation State

    Directory of Open Access Journals (Sweden)

    Olympia Roeva

    2015-04-01

Full Text Available A new functional state, namely the dissolved oxygen limitation state, is presented in this study for fed-batch cultivation processes of both the bacterium Escherichia coli and the yeast Saccharomyces cerevisiae. The functional state modelling approach is applied to cultivation processes in order to overcome the main disadvantages of using a global process model, namely a complex model structure and a large number of model parameters. Along with the newly introduced dissolved oxygen limitation state, a second acetate production state and a first acetate production state are recognized during the fed-batch cultivation of E. coli, while a mixed oxidative state and a first ethanol production state are recognized during the fed-batch cultivation of S. cerevisiae. For all of the above functional states, both structural and parameter identification is performed based on experimental data of E. coli and S. cerevisiae fed-batch cultivations.

  2. Markov modulated Poisson process models incorporating covariates for rainfall intensity.

    Science.gov (United States)

    Thayakaran, R; Ramesh, N I

    2013-01-01

    Time series of rainfall bucket tip times at the Beaufort Park station, Bracknell, in the UK are modelled by a class of Markov modulated Poisson processes (MMPP) which may be thought of as a generalization of the Poisson process. Our main focus in this paper is to investigate the effects of including covariate information into the MMPP model framework on statistical properties. In particular, we look at three types of time-varying covariates namely temperature, sea level pressure, and relative humidity that are thought to be affecting the rainfall arrival process. Maximum likelihood estimation is used to obtain the parameter estimates, and likelihood ratio tests are employed in model comparison. Simulated data from the fitted model are used to make statistical inferences about the accumulated rainfall in the discrete time interval. Variability of the daily Poisson arrival rates is studied.
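A two-state MMPP of the kind fitted in the paper can be simulated directly: a hidden continuous-time Markov chain switches the Poisson arrival rate. The rates below are arbitrary illustration values, and the paper's covariate dependence is omitted.

```python
import random

def simulate_mmpp(lambdas, q01, q10, t_end, seed=1):
    """Simulate a two-state Markov modulated Poisson process.

    lambdas: arrival rate in each hidden state; q01, q10: switching rates
    out of states 0 and 1.  Returns the arrival times up to t_end."""
    rng = random.Random(seed)
    switch = (q01, q10)
    t, state, arrivals = 0.0, 0, []
    while True:
        # Competing exponentials: the next event is an arrival or a switch.
        total = lambdas[state] + switch[state]
        t += rng.expovariate(total)
        if t >= t_end:
            return arrivals
        if rng.random() < lambdas[state] / total:
            arrivals.append(t)           # e.g. a rain-gauge bucket tip
        else:
            state = 1 - state            # hidden environment changes
```

Bursts of closely spaced tips appear while the chain sits in the high-rate state, which is the clustering behaviour that motivates the MMPP over a plain Poisson process.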

  3. Efficient Adoption and Assessment of Multiple Process Improvement Reference Models

    Directory of Open Access Journals (Sweden)

    Simona Jeners

    2013-06-01

Full Text Available A variety of reference models such as CMMI, COBIT or ITIL support IT organizations in improving their processes. These process improvement reference models (IRMs) cover different domains such as IT development, IT services or IT governance, but also share some similarities. As there are organizations that address multiple domains and need to coordinate their processes in their improvement, we present MoSaIC, an approach to support organizations in efficiently adopting and conforming to multiple IRMs. Our solution realizes a semantic integration of IRMs based on common meta-models. The resulting IRM integration model enables organizations to efficiently implement and assess multiple IRMs and to benefit from synergy effects.

  4. Mathematical modelling of anaerobic digestion processes: applications and future needs

    DEFF Research Database (Denmark)

    Batstone, Damien J.; Puyol, Daniel; Flores Alsina, Xavier

    2015-01-01

    of the role of the central carbon catabolic metabolism in anaerobic digestion, with an increased importance of phosphorous, sulfur, and metals as electron source and sink, and consideration of hydrogen and methane as potential electron sources. The paradigm of anaerobic digestion is challenged by anoxygenic...... phototrophism, where energy is relatively cheap, but electron transfer is expensive. These new processes are commonly not compatible with the existing structure of anaerobic digestion models. These core issues extend to application of anaerobic digestion in domestic plant-wide modelling, with the need......Anaerobic process modelling is a mature and well-established field, largely guided by a mechanistic model structure that is defined by our understanding of underlying processes. This led to publication of the IWA ADM1, and strong supporting, analytical, and extension research in the 15 years since...

  5. FDA 2011 process validation guidance: lifecycle compliance model.

    Science.gov (United States)

    Campbell, Cliff

    2014-01-01

This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  6. MODEL OF QUALITY MANAGEMENT OF TECHNOLOGICAL PROCESSES OF THE GRAIN PROCESSING AND MILL ENTERPRISES

    Directory of Open Access Journals (Sweden)

    M. M. Blagoveshchenskaia

    2014-01-01

Full Text Available In this work a model of quality management for the technological processes of grain processing and mill enterprises is presented. Flour-milling production is an important part of the agro-industrial complex because it provides the production of a staple food product: flour. Analytical indicators of the quality of the technological process are presented. A matrix of expert estimates of the i-th level of quality for the set combinations of parameter values, following the scheme of a complete factorial experiment, is constructed. A model for assessing the preparation of raw material for milling, characterizing the main qualities of the processed raw material, is considered. For the purpose of quality management of the technological processes of a flour mill, a mathematical model is developed which includes the calculation of two groups of assessment indicators: the quality of preparation of raw materials for grinding and the quality of conducting the technological process. An algorithm for the analytical assessment of the quality indicators of the technological process of flour-milling enterprises is offered, including the selection of waste, the selection of bran, the compliance rate of the output of milling products, and the compliance rate of product moisture. An assessment of the quality management of the technological process of high-quality grinding is carried out on the example of several leading flour-milling enterprises of the Central Federal District. A two-dimensional model of quality management of the technological process is constructed, based on the analytical assessment indicators: the quality of preparation of raw materials for grinding and an optimally effective condition of the technological process. It is shown that quality management at the enterprise provides the collection, processing, and analysis of information on the condition of material streams and production at all of their stages.

  7. Models as instruments for optimizing hospital processes: a systematic review.

    Science.gov (United States)

    van Sambeek, J R C; Cornelissen, F A; Bakker, P J M; Krabbendam, J J

    2010-01-01

    The purpose of this article is to find decision-making models for the design and control of processes regarding patient flows, considering various problem types, and to find out how usable these models are for managerial decision making. A systematic review of the literature was carried out. Relevant literature from three databases was selected based on inclusion and exclusion criteria and the results were analyzed. A total of 68 articles were selected. Of these, 31 contained computer simulation models, ten contained descriptive models, and 27 contained analytical models. The review showed that descriptive models are only applied to process design problems, and that analytical and computer simulation models are applied to all types of problems to approximately the same extent. Only a few models have been validated in practice, and it seems that most models are not used for their intended purpose: to support management in decision making. The comparability of the relevant databases appears to be limited and there is an insufficient number of suitable keywords and MeSH headings, which makes searching systematically within the broad field of health care management relatively hard to accomplish. The findings give managers insight into the characteristics of various types of decision-support models and into the kinds of situations in which they are used. This is the first time literature on various kinds of models for supporting managerial decision making in hospitals has been systematically collected and assessed.

  8. A simple hyperbolic model for communication in parallel processing environments

    Science.gov (United States)

    Stoica, Ion; Sultan, Florin; Keyes, David

    1994-01-01

We introduce a model for communication costs in parallel processing environments called the 'hyperbolic model,' which generalizes two-parameter dedicated-link models in an analytically simple way. Dedicated interprocessor links parameterized by a latency and a transfer rate that are independent of load are assumed by many existing communication models; such models are unrealistic for workstation networks. The communication system is modeled as a directed communication graph in which terminal nodes represent the application processes that initiate the sending and receiving of the information and in which internal nodes, called communication blocks (CBs), reflect the layered structure of the underlying communication architecture. The direction of graph edges specifies the flow of the information carried through messages. Each CB is characterized by a two-parameter hyperbolic function of the message size that represents the service time needed for processing the message. The parameters are evaluated in the limits of very large and very small messages. Rules are given for reducing a communication graph consisting of many CBs to an equivalent two-parameter form, while maintaining an approximation for the service time that is exact in both large and small limits. The model is validated on a dedicated Ethernet network of workstations by experiments with communication subprograms arising in scientific applications, for which a tight fit of the model predictions with actual measurements of the communication and synchronization time between end processes is demonstrated. The model is then used to evaluate the performance of two simple parallel scientific applications from partial differential equations: domain decomposition and time-parallel multigrid. In an appropriate limit, we also show the compatibility of the hyperbolic model with the recently proposed LogP model.
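One way to picture the reduction rule is with the simplest member of the family: a linear service time, whose achieved throughput m/t(m) traces a hyperbola in the message size m. The sketch below only illustrates the serial-composition case, preserving both limits; it is not the paper's full graph-reduction algorithm, and the layer parameters are invented.

```python
class CommBlock:
    """Communication block (CB) with service time t(m) = a + b*m.

    a: per-message overhead [s]; b: per-byte cost [s/byte].  The achieved
    throughput m / (a + b*m) is a hyperbolic function of the message size m."""

    def __init__(self, a, b):
        self.a, self.b = a, b

    def service_time(self, m):
        return self.a + self.b * m

    def serial(self, other):
        """Collapse two CBs traversed in sequence into one equivalent CB.

        Exact in both limits: for tiny messages t -> a1 + a2, and for huge
        messages the slope tends to b1 + b2."""
        return CommBlock(self.a + other.a, self.b + other.b)

# Hypothetical two-layer path: a user-space copy followed by the NIC.
copy = CommBlock(5e-6, 1.0 / 2.0e9)
nic = CommBlock(20e-6, 1.0 / 1.25e9)
path = copy.serial(nic)
```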

  9. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM) MODELS

    International Nuclear Information System (INIS)

    Y.S. Wu

    2005-01-01

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on water and gas

  10. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM)MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Y.S. Wu

    2005-08-24

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on

  11. Process control for sheet-metal stamping process modeling, controller design and shop-floor implementation

    CERN Document Server

    Lim, Yongseob; Ulsoy, A Galip

    2014-01-01

    Process Control for Sheet-Metal Stamping presents a comprehensive and structured approach to the design and implementation of controllers for the sheet metal stamping process. The use of process control for sheet-metal stamping greatly reduces defects in deep-drawn parts and can also yield large material savings from reduced scrap. Sheet-metal forming is a complex process and most often characterized by partial differential equations that are numerically solved using finite-element techniques. In this book, twenty years of academic research are reviewed and the resulting technology transitioned to the industrial environment. The sheet-metal stamping process is modeled in a manner suitable for multiple-input multiple-output control system design, with commercially available sensors and actuators. These models are then used to design adaptive controllers and real-time controller implementation is discussed. Finally, experimental results from actual shopfloor deployment are presented along with ideas for further...

  12. A Fully Coupled Computational Model of the Silylation Process

    Energy Technology Data Exchange (ETDEWEB)

    G. H. Evans; R. S. Larson; V. C. Prantil; W. S. Winters

    1999-02-01

    This report documents the development of a new finite element model of the positive tone silylation process. Model development makes use of pre-existing Sandia technology used to describe coupled thermal-mechanical behavior in deforming metals. Material properties and constitutive models were obtained from the literature. The model is two-dimensional and transient and focuses on the part of the lithography process in which crosslinked and uncrosslinked resist is exposed to a gaseous silylation agent. The model accounts for the combined effects of mass transport (diffusion of silylation agent and reaction product), chemical reaction resulting in the uptake of silicon and material swelling, the generation of stresses, and the resulting material motion. The influence of stress on diffusion and reaction rates is also included.

  13. Embedding a State Space Model Into a Markov Decision Process

    DEFF Research Database (Denmark)

    Nielsen, Lars Relund; Jørgensen, Erik; Højsgaard, Søren

    2011-01-01

    In agriculture Markov decision processes (MDPs) with finite state and action space are often used to model sequential decision making over time. For instance, states in the process represent possible levels of traits of the animal and transition probabilities are based on biological models estimated from data collected from the animal or herd. State space models (SSMs) are a general tool for modeling repeated measurements over time where the model parameters can evolve dynamically. In this paper we consider methods for embedding an SSM into an MDP with finite state and action space. Different ways of discretizing an SSM are discussed and methods for reducing the state space of the MDP are presented. An example from dairy production is given...
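
One common way to turn a linear Gaussian state equation into a finite MDP state space is a Tauchen-style discretization. The sketch below uses an illustrative AR(1) trait model x' = rho*x + eps (parameters invented; the paper's own discretization methods may differ):

```python
import math
import numpy as np

# Tauchen-style discretization of x' = rho*x + eps, eps ~ N(0, sigma^2),
# into m grid states with an m-by-m MDP transition matrix P.
rho, sigma, m = 0.9, 0.2, 7
std_x = sigma / math.sqrt(1 - rho**2)      # stationary std of the AR(1) trait
grid = np.linspace(-3*std_x, 3*std_x, m)   # 3-sigma grid width is a choice
step = grid[1] - grid[0]

def cdf(z):
    # standard normal CDF via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

P = np.zeros((m, m))
for i, x in enumerate(grid):
    for j, xp in enumerate(grid):
        lo = (xp - step/2 - rho*x) / sigma
        hi = (xp + step/2 - rho*x) / sigma
        if j == 0:
            P[i, j] = cdf(hi)              # mass below the first midpoint
        elif j == m - 1:
            P[i, j] = 1 - cdf(lo)          # mass above the last midpoint
        else:
            P[i, j] = cdf(hi) - cdf(lo)
```

Each row of `P` is a proper distribution by construction, so the discretized trait can serve directly as the state component of a finite MDP.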

  14. Filament winding cylinders. II - Validation of the process model

    Science.gov (United States)

    Calius, Emilio P.; Lee, Soo-Yong; Springer, George S.

    1990-01-01

    Analytical and experimental studies were performed to validate the model developed by Lee and Springer for simulating the manufacturing process of filament wound composite cylinders. First, results calculated by the Lee-Springer model were compared to results of the Calius-Springer thin cylinder model. Second, temperatures and strains calculated by the Lee-Springer model were compared to data. The data used in these comparisons were generated during the course of this investigation with cylinders made of Hercules IM-6G/HBRF-55 and Fiberite T-300/976 graphite-epoxy tows. Good agreement was found between the calculated and measured stresses and strains, indicating that the model is a useful representation of the winding and curing processes.

  15. A time fractional model to represent rainfall process

    Directory of Open Access Journals (Sweden)

    Jacques Golder

    2014-01-01

    Full Text Available This paper deals with a stochastic representation of the rainfall process. The analysis of a rainfall time series shows that the cumulative representation of a rainfall time series can be modeled as a non-Gaussian random walk with a log-normal jump distribution and a waiting-time distribution following a tempered α-stable probability law. Based on the random walk model, a fractional Fokker-Planck equation (FFPE) with tempered α-stable waiting times was obtained. Through the comparison of observed data with simulated results from the random walk model and the FFPE model with tempered α-stable waiting times, it can be concluded that the behavior of the rainfall process is globally reproduced, and the FFPE model with tempered α-stable waiting times is more efficient in reproducing the observed behavior.
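
The random-walk picture can be sketched directly. Below, log-normal jumps are separated by heavy-tailed waiting times with exponential tempering; as a simplified stand-in for the paper's tempered α-stable waits, a tempered Pareto is sampled by rejection (all parameters invented, not fitted to rainfall data):

```python
import numpy as np

# Continuous-time random walk proxy for cumulative rainfall:
# log-normal jump sizes, exponentially tempered heavy-tailed waits.
rng = np.random.default_rng(42)

def tempered_wait(alpha=0.7, lam=0.05):
    # rejection sampling: Pareto proposal, accept with prob exp(-lam * t)
    while True:
        t = rng.pareto(alpha) + 1.0        # classical Pareto, support t >= 1
        if rng.random() < np.exp(-lam * t):
            return t

n = 500
waits = np.array([tempered_wait() for _ in range(n)])
jumps = rng.lognormal(mean=0.0, sigma=1.0, size=n)
times = np.cumsum(waits)            # event times of rain increments
cumulative_rain = np.cumsum(jumps)  # cumulative rainfall at each event
```

The tempering cuts off the extreme tail of the waiting times, which is what makes the corresponding Fokker-Planck description tractable in the paper.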

  16. Pavement maintenance optimization model using Markov Decision Processes

    Science.gov (United States)

    Mandiartha, P.; Duffield, C. F.; Razelan, I. S. b. M.; Ismail, A. b. H.

    2017-09-01

    This paper presents an optimization model for the selection of pavement maintenance interventions using the theory of Markov Decision Processes (MDP). Some particular characteristics of the MDP developed in this paper distinguish it from other similar studies and optimization models intended for pavement maintenance policy development. These unique characteristics include the direct inclusion of constraints into the formulation of the MDP, the use of an average-cost MDP method, and a policy development process based on the dual linear programming solution. The limited information and discussion available on these matters for stochastic optimization models in road network management motivates this study. This paper uses a data set acquired from the road authority of the state of Victoria, Australia, to test the model and recommends steps in the computation of the MDP-based stochastic optimization model, leading to the development of an optimum pavement maintenance policy.
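
The average-cost LP formulation behind such a model can be sketched on a toy network. The variables are stationary state-action frequencies x(s,a); the policy falls out of where the optimal mass sits (the dual of this LP is what the paper exploits). States, actions, and costs below are invented:

```python
import numpy as np
from scipy.optimize import linprog

# Toy average-cost MDP: state 0 = good pavement, 1 = poor;
# action 0 = do nothing, 1 = rebuild. All numbers illustrative.
P = {  # P[a][s, s'] = transition probability under action a
    0: np.array([[0.8, 0.2],    # do nothing: good may deteriorate
                 [0.0, 1.0]]),  # poor stays poor
    1: np.array([[1.0, 0.0],    # rebuild restores good condition
                 [1.0, 0.0]]),
}
cost = np.array([[0.0, 10.0],   # cost[s, a]
                 [5.0, 10.0]])

nS, nA = 2, 2
c = cost.flatten()               # x ordered (s0,a0),(s0,a1),(s1,a0),(s1,a1)
A_eq = np.zeros((nS + 1, nS * nA))
for s in range(nS):              # flow balance: out-rate = in-rate per state
    for s2 in range(nS):
        for a in range(nA):
            col = s2 * nA + a
            A_eq[s, col] = (1.0 if s2 == s else 0.0) - P[a][s2, s]
A_eq[nS, :] = 1.0                # frequencies sum to one
b_eq = np.zeros(nS + 1)
b_eq[nS] = 1.0

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (nS * nA))
x = res.x.reshape(nS, nA)
policy = x.argmax(axis=1)        # act where the stationary mass sits
```

Here `res.fun` is the minimal long-run average cost per period, and the recovered policy is "do nothing while good, rebuild when poor". Extra budget or performance constraints enter as additional rows, which is the direct-constraint feature the abstract highlights.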

  17. A model for optimization of process integration investments under uncertainty

    International Nuclear Information System (INIS)

    Svensson, Elin; Stroemberg, Ann-Brith; Patriksson, Michael

    2011-01-01

    The long-term economic outcome of energy-related industrial investment projects is difficult to evaluate because of uncertain energy market conditions. In this article, a general, multistage, stochastic programming model for the optimization of investments in process integration and industrial energy technologies is proposed. The problem is formulated as a mixed-binary linear programming model where uncertainties are modelled using a scenario-based approach. The objective is to maximize the expected net present value of the investments, which enable heat savings and decreased energy imports or increased energy exports at an industrial plant. The proposed modelling approach enables long-term planning of industrial, energy-related investments through the simultaneous optimization of immediate and later decisions. The stochastic programming approach is also suitable for modelling possibly complex process integration constraints. The general model formulation presented here is a suitable basis for more specialized case studies dealing with optimization of investments in energy efficiency. -- Highlights: → Stochastic programming approach to long-term planning of process integration investments. → Extensive mathematical model formulation. → Multi-stage investment decisions and scenario-based modelling of uncertain energy prices. → Results illustrate how investments made now affect later investment and operation opportunities. → Approach for evaluation of robustness with respect to variations in probability distribution.
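
The scenario-based expected-NPV core of such a model can be sketched by enumerating the binary first-stage choices. The options, price scenarios, and probabilities below are invented, and the later-stage recourse decisions of the full multistage program are omitted:

```python
from itertools import combinations

# Illustrative first-stage investment selection under energy-price
# uncertainty: maximize expected net present value over scenarios.
options = {  # name: (investment cost, annual saving as a function of price)
    "heat_recovery":  (100.0, lambda price: 18.0),         # price-independent
    "export_turbine": (150.0, lambda price: 2.0 * price),  # sells 2 units/yr
}
scenarios = [(8.0, 0.3), (12.0, 0.5), (16.0, 0.2)]  # (energy price, probability)
years, rate = 10, 0.08
annuity = (1 - (1 + rate) ** -years) / rate         # present value of 1/yr

def expected_npv(chosen):
    npv = -sum(options[name][0] for name in chosen)
    for price, prob in scenarios:
        saving = sum(options[name][1](price) for name in chosen)
        npv += prob * annuity * saving
    return npv

best = max(
    (set(combo) for r in range(len(options) + 1)
     for combo in combinations(options, r)),
    key=expected_npv,
)
```

A real instance replaces this enumeration with a mixed-binary LP solver and adds scenario-dependent operating variables, but the objective structure — investment cost now against probability-weighted discounted savings — is the same.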

  18. Model-based processing for underwater acoustic arrays

    CERN Document Server

    Sullivan, Edmund J

    2015-01-01

    This monograph presents a unified approach to model-based processing for underwater acoustic arrays. The use of physical models in passive array processing is not a new idea, but it has been used on a case-by-case basis, and as such, lacks any unifying structure. This work views all such processing methods as estimation procedures, which then can be unified by treating them all as a form of joint estimation based on a Kalman-type recursive processor, which can be recursive either in space or time, depending on the application. This is done for three reasons. First, the Kalman filter provides a natural framework for the inclusion of physical models in a processing scheme. Second, it allows poorly known model parameters to be jointly estimated along with the quantities of interest. This is important, since in certain areas of array processing already in use, such as those based on matched-field processing, the so-called mismatch problem either degrades performance or, indeed, prevents any solution at all. Third...
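
The "Kalman-type recursive processor" idea reduces, in its simplest form, to the standard predict-update recursion. The scalar sketch below jointly filters noisy measurements of an unknown constant (the tracked quantity and noise levels are invented, not taken from the monograph):

```python
import numpy as np

# Minimal scalar Kalman filter: a physical model supplies the state
# transition F, and the recursion fuses each new measurement.
rng = np.random.default_rng(1)
F, H = 1.0, 1.0          # state transition and measurement models
Q, R = 1e-4, 0.25        # process and measurement noise variances

truth = 2.0              # constant parameter to be estimated
z = truth + rng.normal(0.0, np.sqrt(R), size=200)  # noisy measurements

x, P = 0.0, 1.0          # initial estimate and its variance
for zk in z:
    # predict
    x, P = F * x, F * P * F + Q
    # update
    K = P * H / (H * P * H + R)
    x = x + K * (zk - H * x)
    P = (1 - K * H) * P
```

In the array-processing setting described above, the same recursion runs in space or time with poorly known model parameters appended to the state, which is how joint estimation addresses the mismatch problem.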

  19. Multilevel modeling of damage accumulation processes in metals

    Science.gov (United States)

    Kurmoiartseva, K. A.; Trusov, P. V.; Kotelnikova, N. V.

    2017-12-01

    To predict the behavior of components and structures it is necessary to develop methods and mathematical models that take into account the self-organization of microstructural processes and strain localization. It is also important to account for damage accumulation and the evolution of material properties during deformation. The heterogeneity of the damage accumulation process is due to the corresponding physical mechanisms at scale levels below the macro-level. The purpose of this work is to develop a mathematical model for analyzing the behavior of polycrystalline materials that describes the damage accumulation processes. Fracture is a multistage and multiscale process of the build-up of micro- and mesodefects over a wide range of loading rates. Microcracks form through mechanisms driven by the interactions of dislocations of different slip systems with barriers, boundaries and second-phase inclusions. This paper reviews some of the most well-known models of crack nucleation and proposes the structure of a mathematical model based on crystal plasticity and dislocation models of crack nucleation.

  20. Purex process modelling - do we really need speciation data?

    International Nuclear Information System (INIS)

    Taylor, R.J.; May, I.

    2001-01-01

    The design of reprocessing flowsheets has become a complex process requiring sophisticated simulation models, containing both chemical and engineering features. Probably the most basic chemical data needed is the distribution of process species between solvent and aqueous phases at equilibrium, which is described by mathematical algorithms. These algorithms have been constructed from experimentally determined distribution coefficients over a wide range of conditions. Distribution algorithms can either be empirical fits of the data or semi-empirical equations, which describe extraction as functions of process variables such as temperature, activity coefficients, uranium loading, etc. Speciation data is not strictly needed in the accumulation of distribution coefficients, which are simple ratios of analyte concentration in the solvent phase to that in the aqueous phase. However, as we construct process models of increasing complexity, speciation data becomes much more important both to raise confidence in the model and to understand the process chemistry at a more fundamental level. UV/vis/NIR spectrophotometry has been our most commonly used speciation method since it is a well-established method for the analysis of actinide ion oxidation states in solution at typical process concentrations. However, with the increasing availability to actinide science of more sophisticated techniques (e.g. NMR; EXAFS) complementary structural information can often be obtained. This paper will, through examples, show how we have used spectrophotometry as a primary tool in distribution and kinetic experiments to obtain data for process models, which are then validated through counter-current flowsheet trials. It will also discuss how spectrophotometry and other speciation methods are allowing us to study the link between molecular structure and extraction behaviour, showing how speciation data really is important in PUREX process modelling. (authors)

  1. Use of mathematical modelling in electron beam processing: A guidebook

    International Nuclear Information System (INIS)

    2010-01-01

    The use of electron beam irradiation for industrial applications, like the sterilization of medical devices or cross-linking of polymers, has a long and successful track record and has proven itself to be a key technology. Emerging fields, including environmental applications of ionizing radiation, the sterilization of complex medical and pharmaceutical products or advanced material treatment, require the design and control of even more complex irradiators and irradiation processes. Mathematical models can aid the design process, for example by calculating absorbed dose distributions in a product, long before any prototype is built. They support process qualification through impact assessment of process variable uncertainties, and can be an indispensable teaching tool for technologists in training in the use of radiation processing. The IAEA, through various mechanisms, including its technical cooperation programme, coordinated research projects, technical meetings, guidelines and training materials, is promoting the use of radiation technologies to minimize the effects of harmful contaminants and develop value added products originating from low cost natural and human made raw materials. The need to publish a guidebook on the use of mathematical modelling for design processes in the electron beam treatment of materials was identified through the increased interest of radiation processing laboratories in Member States and as a result of recommendations from several IAEA expert meetings. In response, the IAEA has prepared this report using the services of an expert in the field. This publication should serve as both a guidebook and introductory tutorial for the use of mathematical modelling (using mostly Monte Carlo methods) in electron beam processing. The emphasis of this guide is on industrial irradiation methodologies with a strong reference to existing literature and applicable standards. 
Its target audience is readers who have a basic understanding of electron

  2. Material model validation for laser shock peening process simulation

    International Nuclear Information System (INIS)

    Amarchinta, H K; Grandhi, R V; Langer, K; Stargel, D S

    2009-01-01

    Advanced mechanical surface enhancement techniques have been used successfully to increase the fatigue life of metallic components. These techniques impart deep compressive residual stresses into the component to counter potentially damage-inducing tensile stresses generated under service loading. Laser shock peening (LSP) is an advanced mechanical surface enhancement technique used predominantly in the aircraft industry. To reduce costs and make the technique available on a large-scale basis for industrial applications, simulation of the LSP process is required. Accurate simulation of the LSP process is a challenging task, because the process has many parameters such as laser spot size, pressure profile and material model that must be precisely determined. This work focuses on investigating the appropriate material model that could be used in simulation and design. In the LSP process material is subjected to strain rates of 10⁶ s⁻¹, which is very high compared with conventional strain rates. The importance of an accurate material model increases because the material behaves significantly different at such high strain rates. This work investigates the effect of multiple nonlinear material models for representing the elastic–plastic behavior of materials. Elastic perfectly plastic, Johnson–Cook and Zerilli–Armstrong models are used, and the performance of each model is compared with available experimental results.

  3. The Development and Application of an Integrated VAR Process Model

    Science.gov (United States)

    Ballantyne, A. Stewart

    2016-07-01

    The VAR ingot has been the focus of several modelling efforts over the years with the result that the thermal regime in the ingot can be simulated quite realistically. Such models provide important insight into solidification of the ingot but present some significant challenges to the casual user such as a process engineer. To provide the process engineer with a tool to assist in the development of a melt practice, a comprehensive model of the complete VAR process has been developed. A radiation heat transfer simulation of the arc has been combined with electrode and ingot models to develop a platform which accepts typical operating variables (voltage, current, and gap) together with process parameters (electrode size, crucible size, orientation, water flow, etc.) as input data. The output consists of heat flow distributions and solidification parameters in the form of text, comma-separated value, and visual toolkit files. The resulting model has been used to examine the relationship between the assumed energy distribution in the arc and the actual energy flux which arrives at the ingot top surface. Utilizing heat balance information generated by the model, the effects of electrode-crucible orientation and arc gap have been explored with regard to the formation of ingot segregation defects.

  4. Liveness and Reachability Analysis of BPMN Process Models

    Directory of Open Access Journals (Sweden)

    Anass Rachdi

    2016-06-01

    Full Text Available Business processes are usually defined by business experts who require intuitive and informal graphical notations such as BPMN (Business Process Model and Notation) for documenting and communicating their organization's activities and behavior. However, BPMN has not been provided with a formal semantics, which limits the analysis of BPMN models to informal techniques such as simulation. In order to address this limitation and use formal verification, it is necessary to define a mapping between BPMN and a formal language such as Communicating Sequential Processes (CSP) or Petri Nets (PN). This paper proposes a method for the verification of BPMN models by defining a formal semantics of BPMN in terms of a mapping to Time Petri Nets (TPN), which are equipped with very efficient analytical techniques. After the translation of BPMN models to TPN, verification is done to ensure that certain functional properties are satisfied by the model under investigation, namely liveness and reachability. The main advantage of our approach over existing ones is that it takes the time component into account when modeling business processes. An example is used throughout the paper to illustrate the proposed method.
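
Reachability analysis on the resulting net amounts to exploring the marking graph. The sketch below checks, for a hypothetical Petri-net translation of a trivial start → task A → task B → end process (time annotations of TPN omitted), whether the end marking is reachable:

```python
from collections import deque

# A Petri net as a list of transitions, each a (pre, post) pair mapping
# place index -> token count. Places p0..p3; p3 is the end place.
transitions = [
    ({0: 1}, {1: 1}),   # start  : p0 -> p1
    ({1: 1}, {2: 1}),   # task A : p1 -> p2
    ({2: 1}, {3: 1}),   # task B : p2 -> p3
]
initial = (1, 0, 0, 0)

def fire(marking, pre, post):
    # return the successor marking, or None if the transition is disabled
    if any(marking[p] < n for p, n in pre.items()):
        return None
    m = list(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] += n
    return tuple(m)

# breadth-first exploration of the reachable marking graph
reachable, queue = {initial}, deque([initial])
while queue:
    m = queue.popleft()
    for pre, post in transitions:
        nxt = fire(m, pre, post)
        if nxt is not None and nxt not in reachable:
            reachable.add(nxt)
            queue.append(nxt)

end_reachable = (0, 0, 0, 1) in reachable
```

Liveness-style checks work on the same graph (e.g. no reachable marking from which the end place is unreachable); TPN tools add timing constraints on top of this state-space construction.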

  5. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    Energy Technology Data Exchange (ETDEWEB)

    E. Gonnenthal; N. Spyoher

    2001-02-05

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data

  6. Modeling, estimation and optimal filtration in signal processing

    CERN Document Server

    Najim, Mohamed

    2010-01-01

    The purpose of this book is to provide graduate students and practitioners with traditional methods and more recent results for model-based approaches in signal processing. Firstly, discrete-time linear models such as AR, MA and ARMA models, their properties and their limitations are introduced. In addition, sinusoidal models are addressed. Secondly, estimation approaches based on least squares methods and instrumental variable techniques are presented. Finally, the book deals with optimal filters, i.e. Wiener and Kalman filtering, and adaptive filters such as the RLS, the LMS and the
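
The least-squares estimation of an AR model, one of the methods covered, can be sketched directly: generate a synthetic AR(2) series and recover its coefficients by regressing each sample on its two predecessors (the coefficients and series length below are illustrative):

```python
import numpy as np

# Least-squares AR(2) parameter estimation on synthetic data.
rng = np.random.default_rng(0)
a1, a2 = 0.5, -0.3                 # true AR(2) coefficients
n = 5000
e = rng.normal(0.0, 1.0, size=n)   # driving white noise
x = np.zeros(n)
for t in range(2, n):
    x[t] = a1 * x[t-1] + a2 * x[t-2] + e[t]

# regress x[t] on (x[t-1], x[t-2])
X = np.column_stack([x[1:-1], x[:-2]])
y = x[2:]
est, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With enough samples `est` approaches (a1, a2); instrumental-variable variants, also treated in the book, address the bias that appears when the regressors are themselves noisy.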

  7. Multilevel Flow Modelling of Process Plant for Diagnosis and Control

    DEFF Research Database (Denmark)

    Lind, Morten

    1982-01-01

    The paper describes the multilevel flow modelling methodology which can be used to construct functional models of energy and material processing systems. The models describe mass and energy flow topology on different levels of abstraction and represent the hierarchical functional structure...... operator. Plant control requirements can be derived from the models and due to independence of the actual controller implementation the method may be used as a basis for design of control strategies and for the allocation of control tasks to the computer and the plant operator....

  8. RTD modeling of a continuous dry granulation process for process control and materials diversion.

    Science.gov (United States)

    Kruisz, Julia; Rehrl, Jakob; Sacher, Stephan; Aigner, Isabella; Horn, Martin; Khinast, Johannes G

    2017-08-07

    Disturbance propagation during continuous manufacturing processes can be predicted by evaluating the residence time distribution (RTD) of the specific unit operations. In this work, a dry granulation process was modelled and four feeding-event scenarios were simulated. We characterized the feeders and developed RTD models for the blender and the roller compactor based on impulse-response measurements with color tracers. Out-of-specification material was defined based on the active pharmaceutical ingredient (API) concentration. We calculated the amount of waste material at various diversion points, considering four feeder-related process-upset scenarios, and formulated considerations for the development of a control concept. The developed RTD models allow tracking of materials, which may be used to follow the spread of contaminants within the process and for batch definition. The results show that RTD modeling is a valuable tool for process development and design, as well as for process monitoring and material tracking. Copyright © 2017 Elsevier B.V. All rights reserved.
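
The core calculation — propagating a feeder upset through a unit's RTD to decide when to divert — can be sketched with a tanks-in-series RTD, a common empirical form identified from exactly the kind of tracer impulse responses the paper measures. All parameters and the spec limit below are invented:

```python
import math
import numpy as np

# Tanks-in-series RTD (gamma density) for a unit with mean residence
# time tau, and its step response F, on a 1 s grid.
dt = 1.0
t = np.arange(0.0, 600.0, dt)
n_tanks, tau = 5, 60.0
tau_i = tau / n_tanks
E = (t**(n_tanks - 1) * np.exp(-t / tau_i)
     / (math.factorial(n_tanks - 1) * tau_i**n_tanks))
E /= E.sum() * dt                  # normalize the discrete RTD
F = np.cumsum(E) * dt              # cumulative RTD (unit step response)

def shifted(F, k):
    # F delayed by k samples, zero before the shift
    out = np.zeros_like(F)
    out[k:] = F[:len(F) - k]
    return out

# Feeder upset: inlet API concentration dips from 1.0 to 0.8 between
# 100 s and 130 s. Assuming steady operation beforehand, superposition
# gives the outlet concentration:
c_out = 1.0 - 0.2 * (shifted(F, 100) - shifted(F, 130))
out_of_spec = c_out < 0.95         # illustrative diversion criterion
```

The RTD spreads and delays the 30 s inlet dip, so the diversion window at the outlet is wider and later than the upset itself — which is what makes model-based material tracking necessary for waste minimization.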

  9. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    International Nuclear Information System (INIS)

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker, Charles L. III; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.

    2010-01-01

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction Section (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project, no standard model was developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work are presented in Section 8.

  10. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker III, Charles L.; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.

    2010-02-23

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction Section (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project, no standard model was developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work are presented in Section 8.

  11. Modelling of transport and biogeochemical processes in pollution plumes: Literature review of model development

    DEFF Research Database (Denmark)

    Brun, A.; Engesgaard, Peter Knudegaard

    2002-01-01

    A literature survey shows how biogeochemical (coupled organic and inorganic reaction processes) transport models are based on considering the complete biodegradation process as either a single- or as a two-step process. It is demonstrated that some two-step process models rely on the Partial Equi....... A second paper [J. Hydrol. 256 (2002) 230-249], reports the application of the model to a field study of biogeochemical transport processes in a landfill plume in Denmark (Vejen). (C) 2002 Elsevier Science B.V. All rights reserved....

  12. Application of Computer Simulation Modeling to Medication Administration Process Redesign

    Directory of Open Access Journals (Sweden)

    Nathan Huynh

    2012-01-01

    Full Text Available The medication administration process (MAP is one of the most high-risk processes in health care. MAP workflow redesign can precipitate both unanticipated and unintended consequences that can lead to new medication safety risks and workflow inefficiencies. Thus, it is necessary to have a tool to evaluate the impact of redesign approaches in advance of their clinical implementation. This paper discusses the development of an agent-based MAP computer simulation model that can be used to assess the impact of MAP workflow redesign on MAP performance. The agent-based approach is adopted in order to capture Registered Nurse medication administration performance. The process of designing, developing, validating, and testing such a model is explained. Work is underway to collect MAP data in a hospital setting to provide more complex MAP observations to extend development of the model to better represent the complexity of MAP.

  13. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  14. Experimentation and numerical modeling of forging induced bending (FIB) process

    Science.gov (United States)

    Naseem, S.; van den Boogaard, A. H.

    2016-10-01

    Accurate prediction of the final shape using numerical modeling has been a top priority in the field of sheet and bulk forming. Better shape prediction is the result of a better estimation of the physical stress and strain state. For experimental and numerical investigations of such estimations, simple benchmark processes are used. In this paper a benchmark process involving forging (flattening) of sheet metal between a punch and a die with negative clearance is proposed. The induced material flow results in bending. Easy measurability of the resulting bend angle makes this process suitable for validation purposes. Physical experiments are performed to characterize the bending angle due to flattening. Furthermore, a numerical model is developed to capture this phenomenon. The main focus of this paper is the validation of the numerical model in terms of accurate prediction of the physical results.

  15. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for achieving maximum liquefaction of the plant under the constraints on the other parameters. The analysis results so obtained give a clear idea for deciding various parameter values before implementation of the actual plant in the field. They also indicate the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.

  16. Interactive, process-oriented climate modeling with CLIMLAB

    Science.gov (United States)

    Rose, B. E. J.

    2016-12-01

    Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) is invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The Jupyter Notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields.
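
The kind of component model such a framework composes can be illustrated in plain numpy: a zero-dimensional energy balance model stepped to radiative equilibrium. This is not CLIMLAB's API, just a sketch of the process-oriented idea with standard textbook constants:

```python
import numpy as np

# Zero-dimensional energy balance: C dT/dt = ASR - OLR,
# with absorbed shortwave vs blackbody longwave (no greenhouse term).
S0, albedo = 1361.0, 0.3     # solar constant (W/m^2) and planetary albedo
sigma = 5.67e-8              # Stefan-Boltzmann constant (W/m^2/K^4)
C = 4.0e8                    # heat capacity (J/m^2/K), sets the timescale

T = 200.0                    # initial temperature (K)
dt = 86400.0                 # one-day time step
for _ in range(20000):       # integrate ~55 years, well past equilibrium
    asr = S0 * (1 - albedo) / 4.0   # absorbed shortwave radiation
    olr = sigma * T**4              # outgoing longwave radiation
    T += dt * (asr - olr) / C
```

The run settles near the classic 255 K no-greenhouse equilibrium; swapping the longwave term for a grey-gas scheme, or coupling in a diffusion process for heat transport, is exactly the mix-and-match composition the abstract describes.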

  17. Recursive Gaussian Process Regression Model for Adaptive Quality Monitoring in Batch Processes

    Directory of Open Access Journals (Sweden)

    Le Zhou

    2015-01-01

    Full Text Available In chemical batch processes with slow responses and long durations, it is time-consuming and expensive to obtain sufficient normal data for statistical analysis. With the persistent accumulation of newly evolving data, the model gradually becomes adequate, while subsequent batches change only slightly owing to the slow time-varying behavior. To make efficient use of the small amount of initial data and the newly evolving data sets, an adaptive monitoring scheme based on a recursive Gaussian process (RGP) model is designed in this paper. Based on the initial data, a Gaussian process model and the corresponding SPE statistic are constructed first. When new batches of data are included, a strategy based on the RGP model is used to choose the proper data for model updating. The performance of the proposed method is demonstrated on a penicillin fermentation batch process, and the results indicate that the proposed monitoring scheme is effective for adaptive modelling and online monitoring.
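
    A rough sketch of the GP-plus-SPE monitoring idea, using scikit-learn and a full refit as a stand-in for the paper's true recursive update (the data and control limit are synthetic assumptions):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Small initial set of normal batch data: quality y vs. a process variable x.
X0 = rng.uniform(0, 10, size=(20, 1))
y0 = np.sin(X0[:, 0]) + 0.05 * rng.standard_normal(20)

gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01),
    normalize_y=True)
gp.fit(X0, y0)

def spe(model, X, y):
    """Squared prediction error statistic for a new batch."""
    return float(np.mean((y - model.predict(X)) ** 2))

# A new normal batch arrives: if its SPE is in control, absorb it and refit
# (a naive stand-in for the recursive update of the RGP model).
Xn = rng.uniform(0, 10, size=(10, 1))
yn = np.sin(Xn[:, 0]) + 0.05 * rng.standard_normal(10)
stat = spe(gp, Xn, yn)
limit = 0.05   # control limit (in practice, from the training SPE distribution)
if stat < limit:
    gp.fit(np.vstack([X0, Xn]), np.concatenate([y0, yn]))
print(f"SPE on new batch: {stat:.4f}")
```
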

  18. A Tri-network Model of Human Semantic Processing

    Directory of Open Access Journals (Sweden)

    Yangwen Xu

    2017-09-01

    Full Text Available Humans process the meaning of the world via both verbal and nonverbal modalities. It has been established that widely distributed cortical regions are involved in semantic processing, yet the global wiring pattern of this brain system has not been considered in the current neurocognitive semantic models. We review evidence from the brain-network perspective, which shows that the semantic system is topologically segregated into three brain modules. Revisiting previous region-based evidence in light of these new network findings, we postulate that these three modules support multimodal experiential representation, language-supported representation, and semantic control. A tri-network neurocognitive model of semantic processing is proposed, which generates new hypotheses regarding the network basis of different types of semantic processes.

  19. Designing a Process for Tracking Business Model Change

    DEFF Research Database (Denmark)

    Groskovs, Sergejs

    that may alter the business model of the firm. The decision-making process about which metrics to track affects what management’s attention is focused on during the year. The rather streamlined process outlined here is capable of facilitating swift responses to environmental changes in local markets...... by establishing new KPIs on an ongoing basis together with the business units on the ground, and is thus of key importance to strategic management of the firm. The paper concludes with a discussion of its methodological compliance to design science research guidelines and revisits the literature in process......The paper has adopted a design science research approach to design and verify with key stakeholders a fundamental management process of revising KPIs (key performance indicators), including those indicators that are related to business model change. The paper proposes a general guide...

  20. Computer-Aided Modeling of Lipid Processing Technology

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel

    2011-01-01

    increase along with growing interest in biofuels, the oleochemical industry faces in the upcoming years major challenges in terms of design and development of better products and more sustainable processes to make them. Computer-aided methods and tools for process synthesis, modeling and simulation...... are widely used for design, analysis, and optimization of processes in the chemical and petrochemical industries. These computer-aided tools have helped the chemical industry to evolve beyond commodities toward specialty chemicals and ‘consumer oriented chemicals based products’. Unfortunately...... to develop systematic computer-aided methods (property models) and tools (database) related to the prediction of the necessary physical properties suitable for design and analysis of processes employing lipid technologies. The methods and tools include: the development of a lipid-database (CAPEC...

  1. Usability Briefing - a process model for healthcare facilities

    DEFF Research Database (Denmark)

    Fronczek-Munter, Aneta

    2014-01-01

    briefing for hospitals”, where methods for capturing user needs and experiences at hospital facilities are investigated in order to feed into design processes and satisfy the users’ needs and maximise the effectiveness of facilities. Purpose: This paper introduces the concept of usability briefing......Background: In complex buildings with many types of users it can be difficult to satisfy the numerous, often contradictory requirements. Research in usability mostly focuses on evaluating products or facilities with users, after they were built. This paper is part of a PhD project “Usability...... and the purpose is to develop a process model for applying it on complex building projects. Usability briefing is a process in which users are actively involved, not only in evaluations and data gathering, but also in a continuous briefing process with focus on usability. Approach: The model is inductively...

  2. Mathematical modeling of phase interaction taking place in materials processing

    International Nuclear Information System (INIS)

    Zinigrad, M.

    2002-01-01

    The quality of metallic products depends on their composition and structure, which are determined by various physico-chemical and technological factors. One of the most important and complicated problems in modern industry is to obtain materials with the required composition, structure and properties. For example, deep refining is a difficult task in itself, but the problem of obtaining a material with a required specific level of refining is much more complicated. Solving this problem empirically would take a long time and considerable expense, and the result would be far from the optimal solution. The most effective way to solve such problems is to carry out research in two parallel directions: comprehensive analysis of the thermodynamics, kinetics and mechanisms of the processes taking place at the solid-liquid-gas phase interfaces, leading to a clear, well-founded physico-chemical model of these processes that takes their interaction into account; and development of mathematical models of the specific technologies, which allow optimizing technological processes and ensuring the required product properties by choosing the optimal composition of the raw materials. We have developed unique methods of mathematical modeling of phase interaction at high temperatures. These methods allow us to build models taking into account: the thermodynamic characteristics of the processes; the influence of the initial composition and temperature on the equilibrium state of the reactions; the kinetics of homogeneous and heterogeneous processes; and the influence of temperature, composition, gas flow rates, and hydrodynamic and thermal factors on the rates of the chemical and diffusion processes. The models can be applied to the optimization of various metallurgical processes in the manufacturing of steels and non-ferrous alloys, as well as in materials refining and alloying with special additives

  3. Modeling Adsorption-Desorption Processes at the Intermolecular Interactions Level

    Science.gov (United States)

    Varfolomeeva, Vera V.; Terentev, Alexey V.

    2018-01-01

    Modeling of surface adsorption and desorption processes, as well as diffusion, is of considerable interest for the physical phenomena studied under ground-test conditions. When imitating physical processes and phenomena, it is important to choose the correct parameters to describe the adsorption of gases and the formation of films on the surfaces of structural materials. In the present research the adsorption-desorption processes at the gas-solid interface are modeled with allowance for diffusion. Approaches are proposed to describe the adsorbate distribution on the solid surface at the level of intermolecular interactions. The intermolecular interaction potentials of the water-water, water-methane and methane-methane pairs were used to adequately model the real physical and chemical processes; the energies were calculated by the B3LYP/aug-cc-pVDZ method. Computational algorithms for determining the average area per molecule in a dense monolayer are considered here. Differences between the modeling approach proposed in this work and the previously validated probabilistic cellular automaton (PCA) method are also given. It has been shown that the main difference is due to certain limitations of the PCA method. The importance of accounting for intermolecular interactions via hydrogen bonding is indicated. Further development of adsorption-desorption process modeling will make it possible to find conditions for regulating surface processes by controlling the quantity of adsorbed molecules. The proposed approach to representing the molecular system significantly shortens the calculation time in comparison with the use of atom-atom potentials. In the future, this will allow modeling of multilayer adsorption at a reasonable computational cost.
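
    As a contrast to the potential-based approach described above, a toy lattice kinetic Monte Carlo model of adsorption-desorption (rates are illustrative, not derived from B3LYP energies) can be sketched as:

```python
import random

# Toy kinetic Monte Carlo on a 2D lattice: each site is empty (0) or holds an
# adsorbed molecule (1). Adsorption/desorption probabilities are illustrative;
# the steady-state coverage approaches p_ads / (p_ads + p_des).
random.seed(42)
N = 20                         # lattice is N x N sites
p_ads, p_des = 0.6, 0.2
lattice = [[0] * N for _ in range(N)]

def sweep(lattice):
    """One Monte Carlo sweep: attempt adsorption on empty sites, desorption on filled."""
    for _ in range(N * N):
        i, j = random.randrange(N), random.randrange(N)
        if lattice[i][j] == 0:
            if random.random() < p_ads:
                lattice[i][j] = 1
        elif random.random() < p_des:
            lattice[i][j] = 0

for _ in range(200):
    sweep(lattice)

coverage = sum(map(sum, lattice)) / (N * N)
print(f"steady-state coverage: {coverage:.2f}")
```
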

  4. Mathematical Formulation Requirements and Specifications for the Process Models

    International Nuclear Information System (INIS)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-01-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. 
The graded and iterative approach to assessments naturally

  5. Nonparametric Bayesian models through probit stick-breaking processes.

    Science.gov (United States)

    Rodríguez, Abel; Dunson, David B

    2011-03-01

    We describe a novel class of Bayesian nonparametric priors based on stick-breaking constructions where the weights of the process are constructed as probit transformations of normal random variables. We show that these priors are extremely flexible, allowing us to generate a great variety of models while preserving computational simplicity. Particular emphasis is placed on the construction of rich temporal and spatial processes, which are applied to two problems in finance and ecology.
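
    The probit stick-breaking construction itself is compact; a sketch for a finite truncation (K = 25 is an arbitrary choice):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def probit_stick_breaking(z):
    """Turn latent normals z_k into weights w_k = Phi(z_k) * prod_{j<k} (1 - Phi(z_j))."""
    v = norm.cdf(z)                                        # break proportions in (0, 1)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)[:-1]])
    return v * remaining

z = rng.standard_normal(25)                                # truncation level K = 25
w = probit_stick_breaking(z)
print(f"total mass of first 25 sticks: {w.sum():.4f}")
```

    The flexibility in the abstract comes from letting the latent normals `z_k` depend on time, space, or covariates, e.g. through a Gaussian process, while the weights stay valid by construction.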

  6. Observational Data-Driven Modeling and Optimization of Manufacturing Processes

    OpenAIRE

    Sadati, Najibesadat; Chinnam, Ratna Babu; Nezhad, Milad Zafar

    2017-01-01

    The dramatic increase of observational data across industries provides unparalleled opportunities for data-driven decision making and management, including the manufacturing industry. In the context of production, data-driven approaches can exploit observational data to model, control and improve the process performance. When supplied by observational data with adequate coverage to inform the true process performance dynamics, they can overcome the cost associated with intrusive controlled de...

  7. Finite element modeling of the filament winding process using ABAQUS

    OpenAIRE

    Miltenberger, Louis C.

    1992-01-01

    A comprehensive stress model of the filament winding fabrication process, previously implemented in the finite element program, WACSAFE, was implemented using the ABAQUS finite element software package. This new implementation, referred to as the ABWACSAFE procedure, consists of the ABAQUS software and a pre/postprocessing routine that was developed to prepare necessary ABAQUS input files and process ABAQUS displacement results for stress and strain computation. The ABWACSAF...

  8. Coupled Modeling of Rhizosphere and Reactive Transport Processes

    Science.gov (United States)

    Roque-Malo, S.; Kumar, P.

    2017-12-01

    The rhizosphere, as a bio-diverse plant root-soil interface, hosts many hydrologic and biochemical processes, including nutrient cycling, hydraulic redistribution, and soil carbon dynamics among others. The biogeochemical function of root networks, including the facilitation of nutrient cycling through absorption and rhizodeposition, interaction with micro-organisms and fungi, contribution to biomass, etc., plays an important role in myriad Critical Zone processes. Despite this knowledge, the role of the rhizosphere on watershed-scale ecohydrologic functions in the Critical Zone has not been fully characterized, and specifically, the extensive capabilities of reactive transport models (RTMs) have not been applied to these hydrobiogeochemical dynamics. This study uniquely links rhizospheric processes with reactive transport modeling to couple soil biogeochemistry, biological processes, hydrologic flow, hydraulic redistribution, and vegetation dynamics. Key factors in the novel modeling approach are: (i) bi-directional effects of root-soil interaction, such as simultaneous root exudation and nutrient absorption; (ii) multi-state biomass fractions in soil (i.e. living, dormant, and dead biological and root materials); (iii) expression of three-dimensional fluxes to represent both vertical and lateral interconnected flows and processes; and (iv) the potential to include the influence of non-stationary external forcing and climatic factors. We anticipate that the resulting model will demonstrate the extensive effects of plant root dynamics on ecohydrologic functions at the watershed scale and will ultimately contribute to a better characterization of efflux from both agricultural and natural systems.

  9. Process simulation and parametric modeling for strategic project management

    CERN Document Server

    Morales, Peter J

    2013-01-01

    Process Simulation and Parametric Modeling for Strategic Project Management will offer CIOs, CTOs, software development managers, and IT graduate students an introduction to a set of technologies that will help them understand how to better plan software development projects, manage risk, and gain insight into the complexities of the software development process. A novel methodology will be introduced that allows a software development manager to better plan and assess risks in the early planning of a project. By providing a better model for early software development estimation and softw

  10. Influence Processes in Climate Change Negotiations. Modelling the Rounds

    International Nuclear Information System (INIS)

    Courtois, P.

    2002-10-01

    An integrated framework for structuring and evaluating dynamic and sequential climate change decision making in the international arena is presented, taking into account influence processes occurring during negotiation rounds. The analysis integrates imitation, persuasion and dissuasion behaviours. The main innovation of the approach is a stochastic model framework derived from thermodynamics: the so-called master equation is introduced in order to better understand strategic switching and the influence games exerted. The model is illustrated with a simulation of decision-making processes at climate change conferences. Characteristics of the regions' behaviours are derived from the simulations; in particular, the bargaining behaviours that allow an agreement to emerge are presented
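
    A minimal illustration of the master-equation machinery: evolving a probability distribution over three hypothetical negotiation stances under an invented rate matrix:

```python
import numpy as np
from scipy.linalg import expm

# Master equation dP/dt = P Q for three stances (oppose, neutral, support).
# Q is a generator matrix (rows sum to zero); all rates are invented for
# illustration, not taken from the paper.
Q = np.array([[-0.40, 0.30, 0.10],
              [ 0.20, -0.50, 0.30],
              [ 0.05, 0.15, -0.20]])

P0 = np.array([0.7, 0.2, 0.1])       # initial distribution of stances
P_t = P0 @ expm(Q * 5.0)             # distribution after 5 time units ("rounds")
print("stance probabilities:", np.round(P_t, 3))
```

    Because `Q`'s rows sum to zero, `expm(Q*t)` is a stochastic matrix and `P_t` remains a valid probability distribution at every time.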

  11. Dynamic modeling and validation of a lignocellulosic enzymatic hydrolysis process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Sin, Gürkan

    2013-01-01

    The enzymatic hydrolysis process is one of the key steps in second generation biofuel production. After being thermally pretreated, the lignocellulosic material is liquefied by enzymes prior to fermentation. The scope of this paper is to evaluate a dynamic model of the hydrolysis process...... on a demonstration scale reactor. The following novel features are included: the application of the Convection–Diffusion–Reaction equation to a hydrolysis reactor to assess transport and mixing effects; the extension of a competitive kinetic model with enzymatic pH dependency and hemicellulose hydrolysis...

  12. Methodology and Results of Mathematical Modelling of Complex Technological Processes

    Science.gov (United States)

    Mokrova, Nataliya V.

    2018-03-01

    The methodology of system analysis allows us to draw up a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process is proposed. The importance of the quenching rate and of the initial temperature decrease time for producing the maximum amount of the target product was confirmed. The results of numerical integration of the system of differential equations can be used to describe the reagent concentrations, plasma jet rate and temperature in order to achieve the optimal quenching mode. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.
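
    A toy version of such a quench calculation: numerically integrating a product concentration under an Arrhenius-type loss rate while the temperature falls at a fixed quench rate (all constants are illustrative, not taken from the paper):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy quench model: first-order loss of the target product C with an
# Arrhenius-type rate constant while temperature T falls at a fixed
# quench rate until it reaches 400 K. All constants are invented.
def rhs(t, y, quench_rate=2000.0, A=10.0, Ea_over_R=8000.0):
    C, T = y
    k = A * np.exp(-Ea_over_R / T)          # decomposition rate constant, 1/s
    dC = -k * C
    dT = -quench_rate if T > 400.0 else 0.0
    return [dC, dT]

sol = solve_ivp(rhs, (0.0, 2.0), [1.0, 3500.0], max_step=1e-3)
C_final = sol.y[0, -1]
print(f"surviving product fraction: {C_final:.3f}")
```

    Re-running with a smaller `quench_rate` lets the hot gas spend more time at high temperature and visibly reduces the surviving fraction, which is the optimization lever the abstract describes.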

  13. USING THE BUSINESS MODEL CANVAS TO IMPROVE INVESTMENT PROCESSES

    DEFF Research Database (Denmark)

    Sort, Jesper Chrautwald; Nielsen, Christian

    2017-01-01

    and the business angels did not fully agree on the value proposition of the investment opportunity. Practical implications — The findings show that entrepreneurs who market their business cases to investors obtain better feedback and a higher chance of funding using the business model canvas. Implications...... of this paper also relate to the preparation of the entrepreneurs and that matchmakers between entrepreneurs and investors can use the business model canvas to facilitate such processes. Originality/value — This paper contributes to both the theory of the investment process as well as the application...

  14. Modeling transport phenomena and uncertainty quantification in solidification processes

    Science.gov (United States)

    Fezi, Kyle S.

    Direct chill (DC) casting is the primary processing route for wrought aluminum alloys. This semicontinuous process consists of primary cooling, as the metal is pulled through a water-cooled mold, followed by secondary cooling with a water jet spray and free-falling water. To gain insight into this complex solidification process, a fully transient model of DC casting was developed to predict the transport phenomena of aluminum alloys under various conditions. This model is capable of solving the mixture mass, momentum, energy, and species conservation equations during multicomponent solidification. Various DC casting process parameters were examined for their effect on transport phenomena predictions in an alloy of commercial interest (aluminum alloy 7050). The practice of placing a wiper to divert cooling water from the ingot surface was studied, and the results showed that placement closer to the mold causes remelting at the surface and increases susceptibility to bleed-outs. Numerical models of metal alloy solidification, like the one described above, are used to gain insight into physical phenomena that cannot be observed experimentally. However, uncertainty in model inputs causes uncertainty in the results and in those insights. The effect of model assumptions and probable input variability on the level of uncertainty in model predictions has not yet been quantified in solidification modeling. As a step toward understanding the effect of uncertain inputs on solidification modeling, uncertainty quantification (UQ) and sensitivity analysis were first performed on a transient solidification model of a simple binary alloy (Al-4.5wt.%Cu) in a rectangular cavity with both columnar and equiaxed solid growth models. This analysis was followed by quantifying the uncertainty in predictions from the recently developed transient DC casting model. The PRISM Uncertainty Quantification (PUQ) framework quantified the uncertainty and sensitivity in macrosegregation, solidification
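
    The Monte Carlo flavor of such a UQ study can be sketched with a toy stand-in for the casting model (the inputs and their distributions are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy stand-in for a casting model: the output depends nonlinearly on two
# uncertain inputs (e.g. a heat-transfer coefficient h and a casting speed v).
def toy_model(h, v):
    return 0.3 * h ** 0.5 + 0.1 * v ** 2

# Propagate input uncertainty by plain Monte Carlo sampling.
h = rng.normal(100.0, 10.0, size=20000)
v = rng.normal(2.0, 0.2, size=20000)
out = toy_model(h, v)
print(f"mean = {out.mean():.3f}, std = {out.std():.3f}")

# One-at-a-time sensitivity: output variance with only one input varying.
var_h = toy_model(h, 2.0).var()
var_v = toy_model(100.0, v).var()
print(f"variance share: h {var_h / out.var():.2f}, v {var_v / out.var():.2f}")
```

    Because this toy model is additive in its two inputs, the one-at-a-time variance shares sum to roughly one; for interacting inputs a full variance decomposition (e.g. Sobol indices) is needed.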

  15. A process Approach to Information Services: Information Search Process (ISP Model

    Directory of Open Access Journals (Sweden)

    Hamid Keshavarz

    2010-12-01

    Full Text Available Information seeking is a behavior emerging from the interaction between the information seeker and the information system; it should be regarded as an episodic process in order to meet the information needs of users and to assign different roles to its different stages. The present article introduces a process approach to information services in libraries using Carol Collier Kuhlthau's model. In this model, information seeking is regarded as a process consisting of six stages, in each of which users have different thoughts, feelings and actions, and librarians correspondingly take different roles at each stage. These six stages are derived from constructivist learning theory and the uncertainty principle. Despite some acceptable shortcomings, this model may be regarded as a new approach for rendering modern information services in libraries, especially in relation to new information environments and media.

  16. Developing a model for assessing biomass processing technologies within a local biomass processing depot.

    Science.gov (United States)

    Bals, Bryan D; Dale, Bruce E

    2012-02-01

    One solution to the supply chain challenges of cellulosic biofuels is a network of local biomass processing depots (LBPDs) that can produce stable, dense, intermediate commodities and valuable co-products prior to shipping to a refinery. A techno-economic model of an LBPD facility that could incorporate multiple technologies and products was developed in Microsoft Excel to be used to economically and environmentally evaluate potential LBPD systems. In this study, three technologies (ammonia fiber expansion or AFEX™ pretreatment, fast pyrolysis, and leaf protein processing) were assessed for profitability. Pyrolysis was slightly profitable under the base conditions, leaf protein processing was highly unprofitable, and AFEX was profitable if biomass drying was not required. This model can be adapted to multiple feedstocks and end uses, including both economic and environmental modeling. Copyright © 2011 Elsevier Ltd. All rights reserved.
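
    The spreadsheet's profitability logic reduces to a simple annual balance; a sketch with hypothetical placeholder prices, yields and costs:

```python
# Back-of-the-envelope profitability screen for one depot technology,
# mirroring the spreadsheet logic described above. All prices, yields and
# costs are hypothetical placeholders, not the paper's values.
def annual_profit(feed_tonnes, product_yield, product_price,
                  feed_cost, opex_per_tonne, annual_capex):
    revenue = feed_tonnes * product_yield * product_price
    costs = feed_tonnes * (feed_cost + opex_per_tonne) + annual_capex
    return revenue - costs

profit = annual_profit(feed_tonnes=50_000, product_yield=0.6,
                       product_price=180.0, feed_cost=60.0,
                       opex_per_tonne=25.0, annual_capex=900_000)
print(f"annual profit: ${profit:,.0f}")
```

    Swapping in a different technology's yield, operating cost, and annualized capital cost is how such a model compares, say, pretreatment against pyrolysis for the same feedstock.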

  17. Modeling of flash calcination process during clay activation

    International Nuclear Information System (INIS)

    Borrajo Perez, Ruben; Gonzalez Bayon, Juan Jose; Sanchez Rodriguez, Andy A.

    2011-01-01

    The pozzolanic activity of some materials can be increased by means of different processes; among them, thermal activation is one of the most promising. The activation process, occurring at high temperatures and velocities, produces a material with better characteristics. In the last few years, pozzolans with high reactivity during the early days of curing have been produced. Temperature is an important parameter in the activation process and, as a consequence, activation units must accommodate temperature variation to allow the use of different raw materials, each with different characteristics. Considering the high price of kaolin in the market, new materials are being tested, such as clayey soils, which after a sedimentation process yield a clay that has turned out to be a suitable raw material when the kinetics of the pozzolanic reaction is considered. Additionally, other materials with higher kaolin contents are being used with good results. This paper concerns the modeling of the thermal, hydrodynamic and dehydroxylation processes undergone by solid particles exposed to a hot gas stream. The models employed are discussed; the velocity and temperature of the particles are obtained as functions of the carrier gas parameters. The calculation includes the heat losses, and finally the model predicts the residence time needed to complete the activation process. (author)

  18. Mathematical modeling of the voloxidation process. Final report

    International Nuclear Information System (INIS)

    Stanford, T.G.

    1979-06-01

    A mathematical model of the voloxidation process, a head-end reprocessing step for the removal of volatile fission products from spent nuclear fuel, has been developed. Three types of voloxidizer operation have been considered: co-current operation, in which the gas and solid streams flow in the same direction; countercurrent operation, in which the gas and solid streams flow in opposite directions; and semi-batch operation, in which the gas stream passes through the reactor while the solids remain in it and are processed batchwise. Because of the complexity of the physical and chemical processes which occur during voloxidation and the lack of currently available kinetic data, a global kinetic model has been adopted for this study. Test cases for each mode of operation have been simulated using representative values of the model parameters. To process 714 kg/day of spent nuclear fuel using an oxidizing atmosphere containing 20 mole percent oxygen, it was found that a reactor 0.7 m in diameter and 2.49 m in length would be required for both the cocurrent and countercurrent modes of operation, while for semi-batch operation a 0.3 m³ reactor and an 88200 s batch processing time would be required
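
    With a global first-order kinetic model, the semi-batch processing time for a target release fraction follows directly; the rate constant below is an illustrative assumption, not the paper's value:

```python
import math

# Global first-order kinetics stand-in: for dX/dt = k (1 - X), the batch time
# needed to reach a target fractional conversion X is t = -ln(1 - X) / k.
# The rate constant is an illustrative assumption, not a fitted value.
k = 5.0e-5          # effective rate constant, 1/s (hypothetical)
X_target = 0.99     # desired fission-product release fraction

t_batch = -math.log(1.0 - X_target) / k
print(f"required batch time: {t_batch:.0f} s ({t_batch / 3600:.1f} h)")
```
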

  19. Self-similar Gaussian processes for modeling anomalous diffusion

    Science.gov (United States)

    Lim, S. C.; Muniandy, S. V.

    2002-08-01

    We study some Gaussian models for anomalous diffusion, which include the time-rescaled Brownian motion, two types of fractional Brownian motion, and models associated with fractional Brownian motion based on the generalized Langevin equation. Gaussian processes associated with these models satisfy the anomalous diffusion relation, which requires the mean-square displacement to vary as t^α, 0 < α < 2. Since the fractional Brownian motions and the time-rescaled Brownian motion all have the same probability distribution function, the Slepian theorem can be used to compare their first passage time distributions, which are different. Finally, in order to model anomalous diffusion with a variable exponent α(t) it is necessary to consider the multifractional extensions of these Gaussian processes.
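
    The fBm covariance makes the anomalous-diffusion relation explicit: the mean-square displacement is the covariance diagonal, t^(2H). A sketch that builds the covariance and draws one sample path:

```python
import numpy as np

# Fractional Brownian motion B_H(t) has covariance
#   Cov(t, s) = 0.5 * (t^{2H} + s^{2H} - |t - s|^{2H}),
# so its mean-square displacement grows as t^alpha with alpha = 2H.
H = 0.75                                  # Hurst exponent (alpha = 1.5)
t = np.linspace(0.01, 1.0, 100)
cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
             - np.abs(t[:, None] - t[None, :]) ** (2 * H))

# Sample one path via a Cholesky factor (small jitter for numerical safety).
L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(t)))
path = L @ np.random.default_rng(3).standard_normal(len(t))

# The covariance diagonal is exactly the mean-square displacement t^alpha.
msd = np.diag(cov)
print(f"MSD(t=1) = {msd[-1]:.3f}")
```
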

  20. Dynamic frailty models based on compound birth-death processes.

    Science.gov (United States)

    Putter, Hein; van Houwelingen, Hans C

    2015-07-01

    Frailty models are used in survival analysis to model unobserved heterogeneity. They accommodate such heterogeneity by the inclusion of a random term, the frailty, which is assumed to multiply the hazard of a subject (individual frailty) or the hazards of all subjects in a cluster (shared frailty). Typically, the frailty term is assumed to be constant over time. This is a restrictive assumption and extensions to allow for time-varying or dynamic frailties are of interest. In this paper, we extend the auto-correlated frailty models of Henderson and Shimakura and of Fiocco, Putter and van Houwelingen, developed for longitudinal count data and discrete survival data, to continuous survival data. We present a rigorous construction of the frailty processes in continuous time based on compound birth-death processes. When the frailty processes are used as mixtures in models for survival data, we derive the marginal hazards and survival functions and the marginal bivariate survival functions and cross-ratio function. We derive distributional properties of the processes, conditional on observed data, and show how to obtain the maximum likelihood estimators of the parameters of the model using a (stochastic) expectation-maximization algorithm. The methods are applied to a publicly available data set. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. Roll levelling semi-analytical model for process optimization

    Science.gov (United States)

    Silvestre, E.; Garcia, D.; Galdos, L.; Saenz de Argandoña, E.; Mendiguren, J.

    2016-08-01

    Roll levelling is a primary manufacturing process used to remove residual stresses and imperfections from metal strips in order to make them suitable for subsequent forming operations. In recent years the importance of this process has been underlined by the appearance of Ultra High Strength Steels with strengths above 900 MPa. Optimal setting of the machine, as well as a robust machine design, has become critical for the correct processing of these materials. Finite Element Method (FEM) analysis is the technique widely used for both aspects. However, in this case, FEM simulation times exceed the admissible limits for both machine development and process optimization. In the present work, a semi-analytical model based on a discrete bending theory is presented. This model is able to calculate the critical levelling parameters, i.e. force, plastification rate and residual stresses, in a few seconds. First the semi-analytical model is presented. Next, some industrial cases are analyzed by both the semi-analytical model and the conventional FEM model. Finally, the results and computation times of both methods are compared.
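
    The discrete bending idea can be illustrated by slicing the strip thickness into layers with an elastic-perfectly-plastic law; the material and grid values below are illustrative:

```python
import numpy as np

# Discrete bending sketch: the cross-section is split into layers, strain
# varies linearly through the thickness, and an elastic-perfectly-plastic law
# gives each layer's stress; summing gives the bending moment. Values are
# illustrative (UHSS-like yield strength).
E, sigma_y = 210e3, 900.0            # Young's modulus and yield stress, MPa
t, w = 2.0, 1.0                      # strip thickness and width, mm
z = np.linspace(-t / 2, t / 2, 201)  # through-thickness layer positions

def trapezoid(f, x):
    """Plain trapezoidal integration through the thickness."""
    return float(np.sum((f[1:] + f[:-1]) * np.diff(x)) / 2)

def bending_moment(curvature):
    """Bending moment (N*mm) for a given curvature (1/mm)."""
    stress = np.clip(E * curvature * z, -sigma_y, sigma_y)
    return trapezoid(stress * z * w, z)

kappa_e = 2 * sigma_y / (E * t)      # curvature at first yield
M_e = bending_moment(kappa_e)        # elastic limit moment ~ sigma_y*w*t^2/6
M_p = bending_moment(50 * kappa_e)   # deep plastic moment  ~ sigma_y*w*t^2/4
print(f"M_p / M_e = {M_p / M_e:.3f}")
```

    For a rectangular section the deep-plastic to elastic-limit moment ratio approaches the shape factor 1.5, which the layer sum reproduces; evaluating such sums over the roll positions is what keeps the semi-analytical model in the seconds range.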

  2. Modelling of spectroscopic batch process data using grey models to incorporate external information

    NARCIS (Netherlands)

    Gurden, S. P.; Westerhuis, J. A.; Bijlsma, S.; Smilde, A. K.

    2001-01-01

    In both analytical and process chemistry, one common aim is to build models describing measured data. In cases where additional information about the chemical system is available, this can be incorporated into the model with the aim of improving model fit and interpretability. A model which consists

  3. A computational model of human auditory signal processing and perception

    DEFF Research Database (Denmark)

    Jepsen, Morten Løve; Ewert, Stephan D.; Dau, Torsten

    2008-01-01

    A model of computational auditory signal-processing and perception that accounts for various aspects of simultaneous and nonsimultaneous masking in human listeners is presented. The model is based on the modulation filterbank model described by Dau et al. [J. Acoust. Soc. Am. 102, 2892 (1997...... discrimination with pure tones and broadband noise, tone-in-noise detection, spectral masking with narrow-band signals and maskers, forward masking with tone signals and tone or noise maskers, and amplitude-modulation detection with narrow- and wideband noise carriers. The model can account for most of the key...... properties of the data and is more powerful than the original model. The model might be useful as a front end in technical applications....

  4. Modelling the Hydraulic Processes on Constructed Stormwater Wetland

    Directory of Open Access Journals (Sweden)

    Isri Ronald Mangangka

    2017-03-01

    Full Text Available Constructed stormwater wetlands are manmade, shallow, and extensively vegetated water bodies which promote runoff volume and peak flow reduction, and also treat stormwater runoff quality. Researchers have noted that treatment processes of runoff in a constructed wetland are influenced by a range of hydraulic factors, which can vary during a rainfall event, and their influence on treatment can also vary as the event progresses. Variation in hydraulic factors during an event can only be generated using a detailed modelling approach, which was adopted in this research by developing a hydraulic conceptual model. The developed model was calibrated using trial and error procedures by comparing the model outflow with the measured field outflow data. The accuracy of the developed model was analyzed using a well-known statistical analysis method developed based on the regression analysis technique. The analysis results show that the developed model is satisfactory.

  5. Bayesian network modeling of operator's state recognition process

    International Nuclear Information System (INIS)

    Hatakeyama, Naoki; Furuta, Kazuo

    2000-01-01

    Nowadays we face the difficult problem of establishing a good relationship between humans and machines. To solve this problem, we suppose that machine systems need to have a model of human behavior. In this study we model the state cognition process of a PWR plant operator as an example, using a Bayesian network as an inference engine. We incorporate the knowledge hierarchy into the Bayesian network and confirm its validity using the example of a PWR plant operator. (author)
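
    In the simplest discrete case, the inference step of such a model reduces to Bayes' rule over plant states given an observation. The states, the observation, and all probabilities below are invented for illustration and are not taken from the paper.

```python
# Hypothetical plant states with prior probabilities (invented numbers).
priors = {"normal": 0.90, "leak": 0.07, "pump_fault": 0.03}

# Assumed likelihoods P(high_level_alarm | state) for a single observation.
likelihood = {"normal": 0.01, "leak": 0.60, "pump_fault": 0.20}

def posterior(priors, likelihood):
    """Bayes' rule: P(state | obs) = P(obs | state) P(state) / P(obs)."""
    joint = {s: priors[s] * likelihood[s] for s in priors}
    z = sum(joint.values())  # normalising constant P(obs)
    return {s: p / z for s, p in joint.items()}

post = posterior(priors, likelihood)
```

A Bayesian network generalises this single update to a whole graph of conditionally dependent variables, which is what lets the knowledge hierarchy mentioned in the abstract be encoded structurally.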

  6. A decision-making process model of young online shoppers.

    Science.gov (United States)

    Lin, Chin-Feng; Wang, Hui-Fang

    2008-12-01

    Based on the concepts of brand equity, means-end chain, and Web site trust, this study proposes a novel model called the consumption decision-making process of adolescents (CDMPA) to understand adolescents' Internet consumption habits and behavioral intention toward particular sporting goods. The findings of the CDMPA model can help marketers understand adolescents' consumption preferences and habits for developing effective Internet marketing strategies.

  7. Visual spatial localization and the two-process model

    OpenAIRE

    Uddin, Muhammad Kamal

    2006-01-01

    This review paper begins with a brief history of research on localization followed by its definition and classification. It also presents important parameters of localization and factors that affect localization. The paper gives an overview of the two-process model and highlights its limitations. A careful review exposed inadequacies in the model in particular and in localization research in general warranting a clear need for further investigations. Here the author reports findings of his se...

  8. A Harmonized Process Model for Digital Forensic Investigation Readiness

    OpenAIRE

    Valjarevic , Aleksandar; Venter , Hein

    2013-01-01

    Part 2: FORENSIC MODELS; International audience; Digital forensic readiness enables an organization to prepare itself to perform digital forensic investigations in an efficient and effective manner. The benefits include enhancing the admissibility of digital evidence, better utilization of resources and greater incident awareness. However, a harmonized process model for digital forensic readiness does not currently exist and, thus, there is a lack of effective and standardized implementations...

  9. A first packet processing subdomain cluster model based on SDN

    Science.gov (United States)

    Chen, Mingyong; Wu, Weimin

    2017-08-01

    To address the packet processing performance bottlenecks and controller downtime problems of current controller clusters, an SDN (Software Defined Network) controller model is proposed in which the controller allocates a priority to each device in the network. A domain contains several network devices and a controller; the controller is responsible for managing the network equipment within its domain, while switches perform data delivery based on the load of the controller. The experimental results show that the model can effectively mitigate the risk of single-point controller failure and resolve the performance bottleneck of first-packet processing.

  10. A kinetic model for the burst phase of processive cellulases

    DEFF Research Database (Denmark)

    Præstgaard, Eigil; Olsen, Jens Elmerdahl; Murphy, Leigh

    2011-01-01

    Cellobiohydrolases (exocellulases) hydrolyze cellulose processively, i.e. by sequentially cleaving soluble sugars from one end of a cellulose strand. Their activity generally shows an initial burst, followed by a pronounced slowdown, even when substrate is abundant and product accumulation ... of the model, which can be solved analytically, shows that the burst and slowdown can be explained by the relative rates of the sequential reactions in the hydrolysis process and by the occurrence of obstacles to the processive movement along the cellulose strand. More specifically, the maximum enzyme activity ...

  11. Abdominal surgery process modeling framework for simulation using spreadsheets.

    Science.gov (United States)

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach to modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". Patients' movements from one activity to the next are tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
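
    The spreadsheet mechanics described, random activity durations via rand() and first-in-first-served sequencing via nested if() logic, can be mirrored in a few lines of Python. The activities, duration ranges, and single-resource assumption below are invented for illustration, not taken from the paper's 28 modeling elements.

```python
import random

random.seed(42)

# Hypothetical three-activity surgery cycle with uniform random durations
# in hours, mirroring the spreadsheet's rand()-based uncertainty modelling.
ACTIVITIES = [("admission", 1, 2), ("surgery", 2, 5), ("recovery", 24, 72)]

def simulate(n_patients):
    """First-in-first-served: each activity starts when both the patient
    and the (single, assumed) resource for that activity are free."""
    free_at = {name: 0.0 for name, _, _ in ACTIVITIES}
    discharge_times = []
    for p in range(n_patients):
        t = float(p)  # patients assumed to arrive one hour apart
        for name, lo, hi in ACTIVITIES:
            start = max(t, free_at[name])       # wait for the resource (FIFO)
            t = start + random.uniform(lo, hi)  # rand()-style duration
            free_at[name] = t
        discharge_times.append(t)
    return discharge_times

times = simulate(10)
```

The nested max/assignment here plays the role of the spreadsheet's nested "if()" functions: a patient advances only once the previous patient has released the activity's resource.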

  12. Foundations for Streaming Model Transformations by Complex Event Processing.

    Science.gov (United States)

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations for manipulating models whose elements are continuously produced or modified in high volume and with a rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing allows relevant patterns and sequences of events to be identified over an event stream. Our approach enables event streams to include model change events which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations, together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow, and one in the context of on-the-fly gesture recognition.

  13. Application of Process Modeling in a Software- Engineering Course

    Directory of Open Access Journals (Sweden)

    Gabriel Alberto García Mireles

    2001-11-01

    Full Text Available Coordination in a software development project is a critical issue in delivering a successful software product within the constraints of time, functionality and budget agreed upon with the customer. One strategy for approaching this problem consists in the use of process modeling to document, evaluate, and redesign the software development process. The appraisal of the projects done in the Engineering and Methodology course of a program given at the Ensenada Center of Scientific Research and Higher Education (CICESE), from a process perspective, facilitated the identification of strengths and weaknesses in the development process used. This paper presents the evaluation of the practical portion of the course, the improvements made, and the preliminary results of using the process approach in the analysis phase of a software-development project.

  14. The role of business intelligence in decision process modeling

    Directory of Open Access Journals (Sweden)

    Višnja Istrat

    2015-10-01

    Full Text Available Decision making is a very significant and complex function of management that requires methods and techniques to simplify the process of choosing the best alternative. In modern business, the challenge for managers is to find alternatives for improving the decision-making process. Decisions directly affect profit generation and the positioning of the company in the market. People have dealt with the phenomenon of decision making in every phase of the development of society, which has triggered the need to learn more about this process. The main contribution of this paper is to show the significance of business intelligence tools and techniques as support for managers' decision making. Research results have shown that business intelligence plays an enormous role in modern decision process modeling.

  15. Evolution of quantum-like modeling in decision making processes

    International Nuclear Information System (INIS)

    Khrennikova, Polina

    2012-01-01

    The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so-called 'quantum-like' models is to model decision-making processes in a macroscopic setting, capturing the particular 'context' in which the decisions are taken. Several empirical findings have shown that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe the decision-making process more accurately. A next step in the development of QL-modeling in decision making was the application of the Schrödinger equation to describe the evolution of people's mental states. A shortcoming of the Schrödinger equation is its inability to capture the dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling human decision making. The external environment and memory can be regarded as a complex 'context' influencing the final decision outcomes. The master equation can be considered a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.
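
    A classical two-state master equation, a far simpler, fully classical cousin of the quantum master equation discussed here, already illustrates the core idea: decision-state probabilities relax toward a context-determined equilibrium regardless of the initial mental state. The states and rates below are invented.

```python
# Toy classical master equation for two decision states ("accept", "reject"):
#   dp1/dt = -k12*p1 + k21*p2,   dp2/dt = +k12*p1 - k21*p2
# where k12, k21 are context-dependent transition rates (assumed values).
def evolve(p1, k12, k21, dt, steps):
    """Forward-Euler integration of the two-state master equation."""
    p2 = 1.0 - p1
    for _ in range(steps):
        dp1 = (-k12 * p1 + k21 * p2) * dt
        p1 += dp1
        p2 -= dp1  # total probability is conserved exactly
    return p1

# The stationary state is p1* = k21 / (k12 + k21), independent of the start.
p1_final = evolve(p1=0.9, k12=2.0, k21=1.0, dt=0.01, steps=2000)
```

In the quantum-like setting the scalar probabilities become a density matrix and the rates become Lindblad-type operators, but the qualitative behaviour, relaxation toward an environment-determined outcome distribution, is the same.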

  17. Processing tree point clouds using Gaussian Mixture Models

    Directory of Open Access Journals (Sweden)

    D. Belton

    2013-10-01

    Full Text Available While traditionally used in surveying and photogrammetry, laser scanning is increasingly being used for a wider range of more general applications. In addition to the issues typically associated with processing point data, such applications raise a number of new complications, such as the complexity of the scenes scanned, along with the sheer volume of data. Consequently, automated procedures are required for processing and analysing such data. This paper introduces a method for modelling multi-modal, geometrically complex objects in terrestrial laser scanning point data; specifically, the modelling of trees. The modelling method combines a number of geometric features with a multi-modal machine learning technique. The model can then be used for contextually dependent region growing by separating the tree into its component parts at the point level. Subsequently, object analysis can be performed, for example volumetric analysis of a tree after removing points associated with leaves. The workflow for this process is as follows: isolate individual trees within the scanned scene, train a Gaussian mixture model (GMM), separate clusters within the mixture model according to exemplar points determined by the GMM, grow the structure of the tree, and then perform volumetric analysis on the structure.
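
    The core fitting step, expectation-maximisation for a Gaussian mixture, can be sketched self-contained in one dimension. The paper uses richer multi-dimensional geometric features; here an invented 1-D point feature (imagine a value that separates trunk from foliage points) stands in for them.

```python
import math
import random

random.seed(0)

# Synthetic 1-D "point feature" data: two clusters with invented parameters.
data = [random.gauss(0.0, 0.3) for _ in range(200)] + \
       [random.gauss(3.0, 0.5) for _ in range(200)]

def norm_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def em_gmm(xs, iters=50):
    """Expectation-maximisation for a two-component 1-D Gaussian mixture."""
    mu = [min(xs), max(xs)]   # crude but effective initialisation
    sd = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in xs:
            p = [w[k] * norm_pdf(x, mu[k], sd[k]) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means and standard deviations
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, xs)) / nk
            sd[k] = math.sqrt(max(var, 1e-9))
    return w, mu, sd

w, mu, sd = em_gmm(data)
```

The responsibilities computed in the E-step are what allow the "separate clusters according to exemplar points" step in the abstract's workflow: each point can be assigned to the component that claims it most strongly.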

  18. How processing digital elevation models can affect simulated water budgets

    Science.gov (United States)

    Kuniansky, E.L.; Lowery, M.A.; Campbell, B.G.

    2009-01-01

    For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevation available from digital elevation models (DEM). Two methods of processing DEM for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is due to the one-third greater average ground water gradients associated with the centroid value than the mean value. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate because the model cell water table value should be representative of the entire cell area, not the centroid of the model cell.
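
    The centroid-versus-mean distinction can be made concrete with a toy aggregation: a fine DEM grid reduced to coarser model cells by each of the two methods. The grid values and cell size below are invented; real DEMs and model grids vary.

```python
# Hypothetical 6x6 DEM (elevations in metres, a simple linear ramp),
# aggregated into model cells of 2x2 DEM pixels each.
dem = [[float(r * 6 + c) for c in range(6)] for r in range(6)]

def cell_mean(dem, r0, c0, size=2):
    """Mean of all DEM pixels falling inside the model cell."""
    vals = [dem[r][c] for r in range(r0, r0 + size)
                      for c in range(c0, c0 + size)]
    return sum(vals) / len(vals)

def cell_centroid(dem, r0, c0, size=2):
    """Value of the DEM pixel nearest the cell centre (one convention)."""
    return dem[r0 + size // 2][c0 + size // 2]

m = cell_mean(dem, 0, 0)      # averages all four pixels of the cell
c = cell_centroid(dem, 0, 0)  # picks a single pixel near the centre
```

Even on this smooth ramp the two estimates differ for every cell; on real topography the differences vary from cell to cell, which is how they propagate into simulated gradients and, as the abstract reports, into the water budget.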

  19. On the Control of Automatic Processes: A Parallel Distributed Processing Model of the Stroop Effect

    Science.gov (United States)

    1988-06-16

    [Abstract not available: the record text consists of OCR fragments of the report's front matter and reference list. Recoverable items include the report title "On the Control of Automatic Processes: A Parallel Distributed Processing Model of the Stroop Effect", Technical Report AIP-40, and citations such as "The Stroop phenomenon and its use in the study of perceptual, cognitive, and response processes", Memory and Cognition, 1, 106-120 (1973), and Logan, G.D. (1980), "Attention and automaticity in Stroop and priming tasks: Theory and data", Cognitive Psychology, 12, 523-553.]

  20. Object-oriented process dose modeling for glovebox operations

    International Nuclear Information System (INIS)

    Boerigter, S.T.; Fasel, J.H.; Kornreich, D.E.

    1999-01-01

    The Plutonium Facility at Los Alamos National Laboratory supports several defense and nondefense-related missions for the country by performing fabrication, surveillance, and research and development for materials and components that contain plutonium. Most operations occur in rooms with one or more arrays of gloveboxes connected to each other via trolley gloveboxes. Minimizing the effective dose equivalent (EDE) is a growing concern as a result of steadily declining allowable dose limits and a growing general awareness of safety in the workplace. In general, the authors distinguish three components of a worker's total EDE: the primary EDE, the secondary EDE, and the background EDE. A particular background source of interest is the nuclear materials vault. The distinction between sources inside and outside a particular room is arbitrary, with the underlying assumption that building walls and floors provide significant shielding, justifying the inclusion of sources in other rooms in the background category. Los Alamos has developed the Process Modeling System (ProMoS) primarily for performing process analyses of nuclear operations. ProMoS is an object-oriented, discrete-event simulation package that has been used to analyze operations at Los Alamos and proposed facilities such as the new fabrication facilities for the Complex-21 effort. In the past, crude estimates of the process dose (the EDE received when a particular process occurred), room dose (the EDE received when a particular process occurred in a given room), and facility dose (the EDE received when a particular process occurred in the facility) were used to obtain an integrated EDE for a given process. Modifications to the ProMoS package were made to incorporate secondary dose information so that dose modeling enhances the process modeling efforts.

  1. Dynamic Modelling of the Two-stage Gasification Process

    DEFF Research Database (Denmark)

    Gøbel, Benny; Henriksen, Ulrik B.; Houbak, Niels

    1999-01-01

    A two-stage gasification pilot plant was designed and built as a co-operative project between the Technical University of Denmark and the company REKA. A dynamic, mathematical model of the two-stage pilot plant was developed to serve as a tool for optimising the process and the operating conditions ... of the gasification plant. The model consists of modules corresponding to the different elements in the plant. The modules are coupled together through mass and heat conservation. Results from the model are compared with experimental data obtained during steady and unsteady operation of the pilot plant. A good ...

  2. The Cognitive Complexity in Modelling the Group Decision Process

    Directory of Open Access Journals (Sweden)

    Barna Iantovics

    2010-06-01

    Full Text Available The paper investigates, for some basic contextual factors (such as the problem complexity, the users' creativity and the problem space complexity), the cognitive complexity associated with modelling the group decision processes (GDP) in e-meetings. The analysis is done by conducting a socio-simulation experiment for an envisioned collaborative software tool that acts as a stigmergic environment for modelling the GDP. The simulation results reveal some interesting design guidelines for engineering contextual functionalities that minimize the cognitive complexity associated with modelling the GDP.

  3. Analysis of Mental Processes Represented in Models of Artificial Consciousness

    Directory of Open Access Journals (Sweden)

    Luana Folchini da Costa

    2013-12-01

    Full Text Available The Artificial Consciousness concept has been used in the engineering area as an evolution of Artificial Intelligence. However, consciousness is a complex subject and is often used without formalism. As its main contribution, this work proposes an analysis of four recent models of artificial consciousness published in the engineering area. The mental processes represented by these models are highlighted, and correlations with the theoretical perspective of cognitive psychology are made. Finally, considerations about consciousness in such models are discussed.

  4. On some approaches to model reversible magnetization processes

    Science.gov (United States)

    Chwastek, K.; Baghel, A. P. S.; Sai Ram, B.; Borowik, B.; Daniel, L.; Kulkarni, S. V.

    2018-04-01

    This paper focuses on the problem of how reversible magnetization processes are taken into account in contemporary descriptions of hysteresis curves. For comparison, three versions of the phenomenological T(x) model based on hyperbolic tangent mapping are considered. Two of them are based on summing the output of the hysteresis operator with a linear or nonlinear mapping. The third description is inspired by the concept of the product Preisach model. Total susceptibility is modulated with a magnetization-dependent function. The models are verified using measurement data for grain-oriented electrical steel. The proposed third description represents minor loops most accurately.
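
    The first variant mentioned, a hysteresis operator summed with a linear (reversible) mapping, can be sketched with a tanh-based branch description. The equations and parameter values below are illustrative only and are not the authors' exact T(x) formulation.

```python
import math

# Illustrative branch model: the irreversible part follows a coercive-field-
# shifted hyperbolic tangent, and a linear reversible term chi_rev*H is added.
MS, A, HC, CHI_REV = 1.5, 40.0, 30.0, 1e-4   # invented material parameters

def branch(h, direction):
    """Magnetization on the ascending (direction=+1) or descending (-1)
    hysteresis branch at field h, with a reversible linear contribution."""
    m_irr = MS * math.tanh((h - direction * HC) / A)  # irreversible part
    return m_irr + CHI_REV * h                        # + reversible part

m_up = branch(0.0, +1)  # ascending branch at H = 0: negative remanence side
m_dn = branch(0.0, -1)  # descending branch at H = 0: positive remanence
```

The third variant in the abstract would instead modulate the total susceptibility by a magnetization-dependent function, in the spirit of the product Preisach model, rather than adding a fixed linear term.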

  5. Modeling Autoregressive Processes with Moving-Quantiles-Implied Nonlinearity

    Directory of Open Access Journals (Sweden)

    Isao Ishida

    2015-01-01

    Full Text Available We introduce and investigate some properties of a class of nonlinear time series models based on moving sample quantiles in the autoregressive data generating process. We derive a test to detect this type of nonlinearity. Using the daily realized volatility data of Standard & Poor's 500 (S&P 500) and several other indices, these models performed well in an out-of-sample forecasting exercise compared with forecasts based on the usual linear heterogeneous autoregressive and other models of realized volatility.
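
    The flavour of such a data generating process can be sketched as an AR(1) with an extra term driven by a moving sample quantile of recent observations. This specification, the window, and all coefficients are my illustrative guess at the model class, not the authors' exact formulation.

```python
import random

random.seed(1)

def moving_quantile(window, q):
    """Empirical q-quantile of a window (simple order-statistic version)."""
    s = sorted(window)
    idx = min(int(q * len(s)), len(s) - 1)
    return s[idx]

def simulate(n, phi=0.5, gamma=0.3, window=20, q=0.9):
    """AR(1) plus a term driven by the moving 90% sample quantile of the
    last `window` observations (illustrative nonlinear specification)."""
    xs = [0.0] * window  # burn-in buffer for the moving window
    for _ in range(n):
        mq = moving_quantile(xs[-window:], q)
        eps = random.gauss(0.0, 1.0)
        xs.append(phi * xs[-1] + gamma * mq + eps)
    return xs[window:]

series = simulate(500)
```

Because the quantile term reacts asymmetrically to recent extremes, the process responds differently after volatility spikes than after calm spells, which is the kind of nonlinearity the proposed test is designed to detect.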

  6. Measurement and modeling of advanced coal conversion processes, Volume III

    Energy Technology Data Exchange (ETDEWEB)

    Ghani, M.U.; Hobbs, M.L.; Hamblen, D.G. [and others

    1993-08-01

    A generalized one-dimensional, heterogeneous, steady-state, fixed-bed model for coal gasification and combustion is presented. The model, FBED-1, is a design and analysis tool that can be used to simulate a variety of gasification, devolatilization, and combustion processes. The model considers separate gas and solid temperatures, axially variable solid and gas flow rates, variable bed void fraction, coal drying, devolatilization based on chemical functional group composition, depolymerization, vaporization and crosslinking, oxidation, and gasification of char, and partial equilibrium in the gas phase.

  7. Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes

    Science.gov (United States)

    Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico

    2017-12-01

    Chemical compositional simulation of enhanced oil recovery and surfactant-enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of a newly developed mathematical formulation which satisfies the conservation laws of mass and energy and allows a sequential solution approach in which the governing equations are solved separately and implicitly. Through its application to a numerical experiment using a wettability alteration model, and through comparisons with an existing chemical compositional model's numerical results, the new model has proven to be practical, reliable and stable.

  8. Modelling and simulation of diffusive processes methods and applications

    CERN Document Server

    Basu, SK

    2014-01-01

    This book addresses the key issues in the modeling and simulation of diffusive processes from a wide spectrum of different applications across a broad range of disciplines. Features: discusses diffusion and molecular transport in living cells and suspended sediment in open channels; examines the modeling of peristaltic transport of nanofluids, and isotachophoretic separation of ionic samples in microfluidics; reviews thermal characterization of non-homogeneous media and scale-dependent porous dispersion resulting from velocity fluctuations; describes the modeling of nitrogen fate and transport

  9. Integrated chemical/physical and biological processes modeling Part 2

    African Journals Online (AJOL)

    The approach of characterising sewage sludge into carbohydrates, lipids and proteins, as is done in the International Water Association (IWA) AD Model No 1 ... found to be 64 to 68% biodegradable (depending on the kinetic formulation selected for the hydrolysis process) and to have a C3.5H7O2N0.196 composition.

  10. Product Trial Processing (PTP): a model approach from ...

    African Journals Online (AJOL)

    Product trial is described as a consumer's first usage experience with a company's brand or product, which is most important in determining brand attributes and the intention to make a purchase. The constructs used in the model of consumers' processing of product trial include experiential and non-experiential ...

  11. Epidemic Processes on Complex Networks : Modelling, Simulation and Algorithms

    NARCIS (Netherlands)

    Van de Bovenkamp, R.

    2015-01-01

    Local interactions on a graph lead to global dynamic behaviour. In this thesis we focus on two types of dynamic processes on graphs: the Susceptible-Infected-Susceptible (SIS) virus spreading model, and gossip-style epidemic algorithms. The largest part of this thesis is devoted to the SIS
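
    A minimal discrete-time SIS simulation shows how the local infect/cure rules generate global prevalence dynamics. The ring graph, the infection probability per infected neighbour, and the cure probability below are all invented for illustration; the thesis analyses the model far more generally.

```python
import random

random.seed(3)

# Ring graph of 50 nodes; discrete-time SIS with infection probability BETA
# per infected neighbour per step and cure probability DELTA per step.
N, BETA, DELTA = 50, 0.3, 0.1
neighbours = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}

def sis_step(infected):
    """One synchronous update of the infected set."""
    new = set()
    for i in range(N):
        if i in infected:
            if random.random() > DELTA:       # node fails to recover
                new.add(i)
        else:
            for j in neighbours[i]:
                if j in infected and random.random() < BETA:
                    new.add(i)                # infected by a neighbour
                    break
    return new

infected = {0}                                # single initial seed
for _ in range(200):
    infected = sis_step(infected)
prevalence = len(infected) / N
```

Whether the long-run prevalence is zero or endemic depends on the ratio BETA/DELTA relative to a threshold set by the graph's spectral properties, which is the kind of question the thesis studies.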

  12. School Processes Mediate School Compositional Effects: Model Specification and Estimation

    Science.gov (United States)

    Liu, Hongqiang; Van Damme, Jan; Gielen, Sarah; Van Den Noortgate, Wim

    2015-01-01

    School composition effects have been consistently verified, but few studies ever attempted to study how school composition affects school achievement. Based on prior research findings, we employed multilevel mediation modeling to examine whether school processes mediate the effect of school composition upon school outcomes based on the data of 28…

  13. The Extended Parallel Process Model: Illuminating the Gaps in Research

    Science.gov (United States)

    Popova, Lucy

    2012-01-01

    This article examines constructs, propositions, and assumptions of the extended parallel process model (EPPM). Review of the EPPM literature reveals that its theoretical concepts are thoroughly developed, but the theory lacks consistency in operational definitions of some of its constructs. Out of the 12 propositions of the EPPM, a few have not…

  14. How can Product Development Process Modelling be made more useful?

    DEFF Research Database (Denmark)

    Wynn, David C; Maier, Anja; Clarkson, John P

    2010-01-01

    A significant body of research exists in the area of Product Development (PD) process modelling. This is highlighted by Browning and Ramasesh (2007), who recently reviewed over 400 papers in this field. However, despite hundreds, probably thousands of publications in this area, few of the proposed...

  15. Process modeling for the Integrated Thermal Treatment System (ITTS) study

    Energy Technology Data Exchange (ETDEWEB)

    Liebelt, K.H.; Brown, B.W.; Quapp, W.J.

    1995-09-01

    This report describes the process modeling done in support of the integrated thermal treatment system (ITTS) study, Phases 1 and 2. ITTS consists of an integrated systems engineering approach for uniform comparison of widely varying thermal treatment technologies proposed for treatment of the contact-handled mixed low-level wastes (MLLW) currently stored in the U.S. Department of Energy complex. In the overall study, 19 systems were evaluated. Preconceptual designs were developed that included all of the various subsystems necessary for a complete installation, from waste receiving through to primary and secondary stabilization and disposal of the processed wastes. Each system included the necessary auxiliary treatment subsystems so that all of the waste categories in the complex were fully processed. The objective of the modeling task was to perform mass and energy balances of the major material components in each system. Modeling of trace materials, such as pollutants and radioactive isotopes, were beyond the present scope. The modeling of the main and secondary thermal treatment, air pollution control, and metal melting subsystems was done using the ASPEN PLUS process simulation code, Version 9.1-3. These results were combined with calculations for the remainder of the subsystems to achieve the final results, which included offgas volumes, and mass and volume waste reduction ratios.

  16. An Empirical Investigation into a Subsidiary Absorptive Capacity Process Model

    DEFF Research Database (Denmark)

    Schleimer, Stephanie; Pedersen, Torben

    2011-01-01

    and empirically test a process model of absorptive capacity. The setting of our empirical study is 213 subsidiaries of multinational enterprises and the focus is on the capacity of these subsidiaries to successfully absorb best practices in marketing strategy from their headquarters. This setting allows us...

  17. Integrated Modeling and Analysis of Physical Oceanographic and Acoustic Processes

    Science.gov (United States)

    2014-09-30

    dynamics of the ocean, surface and internal waves, and seabed and acoustics processes with atmospheric forcing, all in a fully synoptic and evolving ... rays with the eKdVf model and a sine-wave starter is shown (synthetic SAR pictures, surface convergences, are shown). Waves computed with advection

  18. A price adjustment process in a model of monopolistic competition

    NARCIS (Netherlands)

    Tuinstra, J.

    2004-01-01

    We consider a price adjustment process in a model of monopolistic competition. Firms have incomplete information about the demand structure. When they set a price they observe the amount they can sell at that price and they observe the slope of the true demand curve at that price. With this
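
    The adjustment rule described, in which a firm observes only the quantity sold and the local slope of demand, can be sketched as a profit-gradient step against an underlying (unknown to the firm) linear demand curve. The demand curve, cost, and step size below are invented for illustration.

```python
# True demand q(p) = A - B*p with constant marginal cost C (invented numbers);
# the firm never sees A or B directly, only q(p) and q'(p) at its own price.
A, B, C = 100.0, 2.0, 10.0

def demand(p):
    return A - B * p

def demand_slope(p):
    return -B   # observed local slope of the true demand curve

def adjust(p, step=0.05, iters=200):
    """Move the price in the direction of increasing profit, using only
    the two observables: d(profit)/dp = q(p) + (p - C) * q'(p)."""
    for _ in range(iters):
        grad = demand(p) + (p - C) * demand_slope(p)
        p += step * grad
    return p

p_star = adjust(20.0)   # converges to the monopoly price (A + B*C)/(2B) = 30
```

With linear demand this gradient process contracts to the profit-maximising price; the interest of the model lies in what happens when demand is nonlinear or when many firms adjust simultaneously under incomplete information.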

  19. Transfer as a two-way process: testing a model

    NARCIS (Netherlands)

    Vermeulen, R.; Admiraal, W.

    2009-01-01

    Purpose - The purpose of this exploratory research is to test the model of training transfer as a two-way process. Design/methodology/approach - Based on self-report data gathered from 58 to 44 respondents in a field experiment, it is argued that there is not just learning in the context of training

  20. Semantic Similarity, Predictability, and Models of Sentence Processing

    Science.gov (United States)

    Roland, Douglas; Yun, Hongoak; Koenig, Jean-Pierre; Mauner, Gail

    2012-01-01

    The effects of word predictability and shared semantic similarity between a target word and other words that could have taken its place in a sentence on language comprehension are investigated using data from a reading time study, a sentence completion study, and linear mixed-effects regression modeling. We find that processing is facilitated if…