WorldWideScience

Sample records for model development process

  1. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering… activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration… with a major international engineering company.

  2. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code…
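
    To make the spiral/waterfall contrast concrete, the sketch below compares the two lifecycles with a toy Monte Carlo schedule model. It is not the PATT/IEEE 12207 model described in the record; the phase durations, iteration count, and per-iteration work fraction are invented purely for illustration.

```python
import random

# Toy comparison of waterfall vs. spiral schedules (illustrative only).
# Each phase duration is drawn from a hypothetical triangular distribution
# (low, mode, high) in weeks; these are NOT calibrated PATT inputs.
PHASES = {"requirements": (4, 6, 10), "design": (6, 8, 14),
          "coding": (8, 12, 20), "testing": (6, 9, 16)}

def waterfall():
    """One pass through every phase in sequence."""
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in PHASES.values())

def spiral(iterations=4, work_fraction=0.35, risk_overhead=2.0):
    """Repeated mini-waterfalls: each iteration performs a fraction of the
    full work plus a fixed risk-assessment overhead (weeks)."""
    return sum(risk_overhead + work_fraction * waterfall() for _ in range(iterations))

runs = 10_000
wf = sorted(waterfall() for _ in range(runs))
sp = sorted(spiral() for _ in range(runs))
print(f"waterfall median: {wf[runs // 2]:6.1f} weeks")
print(f"spiral median:    {sp[runs // 2]:6.1f} weeks")
```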

  3. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
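
    The failure-effect propagation that an FFM simulates can be pictured as reachability in a directed graph from failure modes to observation points. Below is a minimal sketch of that idea; the component names and edges are hypothetical placeholders, not drawn from the AGSM IHM models.

```python
from collections import deque

# Failure-effect propagation as reachability in a directed graph.
# Nodes and edges are hypothetical placeholders.
EDGES = {
    "valve_stuck_closed": ["no_flow_line_A"],
    "no_flow_line_A": ["low_pressure_sensor_P1", "pump_cavitation"],
    "pump_cavitation": ["vibration_sensor_V2"],
}

def effects(failure_mode):
    """Return every node reachable from failure_mode (its effect path)."""
    seen, queue = set(), deque([failure_mode])
    while queue:
        for nxt in EDGES.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(effects("valve_stuck_closed"))  # all downstream effects and sensors
```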

  4. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as the basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  5. Modelling Template for the Development of the Process Flowsheet

    DEFF Research Database (Denmark)

    Fedorova, Marina; Gani, Rafiqul

    2015-01-01

    Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and point to candidates… in connection to other modelling tools within the modelling framework are forming a user-friendly system, which will make the model development process easier and faster and provide the way for unified and consistent model documentation. The modeller can use the template for their specific problem or to extend… models systematically, efficiently and reliably. In this way, development of products and processes can be faster, cheaper and very efficient. The developed modelling framework involves three main parts: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which…

  6. A Software Development Simulation Model of a Spiral Process

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.

  7. A Software Development Simulation Model of a Spiral Process

    OpenAIRE

    Carolyn Mizell; Linda Malone

    2009-01-01

    This paper will present a discrete event simulation model of a spiral development lifecycle that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process. There is a need for simulation models of software development processes other than the waterfall due to new processes becoming more widely used in order to overcome the limitations of the traditional waterfall lifecycle. The use of a spiral process can make the inherently difficult job of...

  8. Comparing single- and dual-process models of memory development.

    Science.gov (United States)

    Hayes, Brett K; Dunn, John C; Joubert, Amy; Taylor, Robert

    2017-11-01

    This experiment examined single-process and dual-process accounts of the development of visual recognition memory. The participants, 6-7-year-olds, 9-10-year-olds and adults, were presented with a list of pictures which they encoded under shallow or deep conditions. They then made recognition and confidence judgments about a list containing old and new items. We replicated the main trends reported by Ghetti and Angelini, in that recognition hit rates increased from 6 to 9 years of age, with larger age changes following deep than shallow encoding. Formal versions of the dual-process high-threshold signal detection model and several single-process models (equal variance signal detection, unequal variance signal detection, mixture signal detection) were fit to the developmental data. The unequal variance and mixture signal detection models gave a better account of the data than either of the other models. A state-trace analysis found evidence for only one underlying memory process across the age range tested. These results suggest that single-process memory models based on memory strength are a viable alternative to dual-process models for explaining memory development. © 2016 John Wiley & Sons Ltd.
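
    For readers unfamiliar with the competing models, the sketch below generates the predicted recognition ROC curves for the equal- and unequal-variance signal detection accounts; the d′ and variance values are arbitrary illustrations, not the paper's fitted parameters.

```python
import numpy as np
from scipy.stats import norm

# Predicted recognition ROC curves under Gaussian signal detection.
# New items ~ N(0, 1); old items ~ N(d', sigma_old). Values illustrative.
def roc(d_prime, sigma_old=1.0, criteria=np.linspace(-2, 4, 200)):
    false_alarms = norm.sf(criteria)                  # P("old" | new item)
    hits = norm.sf((criteria - d_prime) / sigma_old)  # P("old" | old item)
    return false_alarms, hits

fa_ev, hit_ev = roc(d_prime=1.5, sigma_old=1.0)   # equal-variance model
fa_uv, hit_uv = roc(d_prime=1.5, sigma_old=1.25)  # unequal-variance model
```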

  9. Development of an equipment management model to improve effectiveness of processes

    International Nuclear Information System (INIS)

    Chang, H. S.; Ju, T. Y.; Song, T. Y.

    2012-01-01

    The nuclear industry has developed, and continues to refine, performance models to improve the effectiveness of the processes implemented at nuclear plants. Most high performing nuclear stations seek to continually improve the quality of their operations by identifying and closing important performance gaps. Thus, many utilities have implemented performance models adjusted to their plant's configuration and have instituted policies for such models. KHNP is developing a standard performance model to integrate the engineering processes and to improve the inter-relation among processes. The model, called the Standard Equipment Management Model (SEMM), is under development, focusing first on engineering processes and performance improvement processes related to plant equipment used at the site. This model includes performance indicators for each process that allow the process performance to be evaluated and compared across the 21 operating units. The model will later be expanded to incorporate cost and management processes. (authors)

  10. A neuroconstructivist model of past tense development and processing.

    Science.gov (United States)

    Westermann, Gert; Ruh, Nicolas

    2012-07-01

    We present a neural network model of learning and processing the English past tense that is based on the notion that experience-dependent cortical development is a core aspect of cognitive development. During learning the model adds and removes units and connections to develop a task-specific final architecture. The model provides an integrated account of characteristic errors during learning the past tense, adult generalization to pseudoverbs, and dissociations between verbs observed after brain damage in aphasic patients. We put forward a theory of verb inflection in which a functional processing architecture develops through interactions between experience-dependent brain development and the structure of the environment, in this case, the statistical properties of verbs in the language. The outcome of this process is a structured processing system giving rise to graded dissociations between verbs that are easy and verbs that are hard to learn and process. In contrast to dual-mechanism accounts of inflection, we argue that describing dissociations as a dichotomy between regular and irregular verbs is a post hoc abstraction and is not linked to underlying processing mechanisms. We extend current single-mechanism accounts of inflection by highlighting the role of structural adaptation in development and in the formation of the adult processing system. In contrast to some single-mechanism accounts, we argue that the link between irregular inflection and verb semantics is not causal and that existing data can be explained on the basis of phonological representations alone. This work highlights the benefit of taking brain development seriously in theories of cognitive development. Copyright 2012 APA, all rights reserved.

  11. 3D physical modeling for patterning process development

    Science.gov (United States)

    Sarma, Chandra; Abdo, Amr; Bailey, Todd; Conley, Will; Dunn, Derren; Marokkey, Sajan; Talbi, Mohamed

    2010-03-01

    In this paper we will demonstrate how a 3D physical patterning model can act as a forensic tool for OPC and ground-rule development. We discuss examples where 2D modeling shows no issues in printing gate lines but 3D modeling shows severe resist loss in the middle. In the absence of corrective measures, there is a high likelihood of line discontinuity post etch. Such early insight into the process limitations of prospective ground rules can be invaluable for early technology development. We will also demonstrate how the root cause of a broken poly-line after etch could be traced to resist necking in the region of the STI step with the help of 3D models. We discuss different cases of metal and contact layouts where 3D modeling gives an early insight into technology limitations. In addition, such a 3D physical model could be used for early resist evaluation and selection against required ground-rule challenges, which can substantially reduce the cycle time for process development.

  12. An innovative service process development based on a reference model

    Directory of Open Access Journals (Sweden)

    Lorenzo Sanfelice Frazzon

    2015-06-01

    This article examines the new service development (NSD) process, focusing specifically on the case of a financial service, guided by the following research questions: what are the processes and practices used in the development and design of new financial services? How do the results of the financial NSD proposal reflect on the NSD area as a whole? The study therefore aims to describe a financial service development conducted at Helpinveste. The paper focuses on the Conceptual Design phase (activities: definition of specifications and development of alternative solutions for the service) and the Service Process Design (service representation) phase. The methodological procedures are based on the process approach, using a reference model for developing new services. In order to operationalize the model, several techniques were used at the various stages of the project, e.g. QFD and Service Blueprint. Lastly, the conclusions report contributions from the reference model application, both theoretical and practical, as well as the limitations and recommendations for further research.

  13. Development of climate data storage and processing model

    Science.gov (United States)

    Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.

    2016-11-01

    We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on socio-economic processes on local and regional scales. The model is based on a «shared nothing» distributed computing architecture and assumes a computing network where each computing node is independent and self-sufficient. Each node holds dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data is represented by collections of netCDF files stored in a hierarchy of directories within a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed according to the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for the development of a high-performance virtual research environment for climatic and environmental monitoring.
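
    A minimal sketch of the kind of metadata lookup such a database enables is shown below; the schema, column names, and dataset entry are assumptions for illustration, not the authors' design.

```python
import sqlite3

# Toy metadata database: spatio-temporal extents and locations of netCDF
# collections, queried before dispatching work to a node.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE datasets (
    name TEXT, node TEXT, path TEXT, var TEXT,
    t_start TEXT, t_end TEXT,
    lat_min REAL, lat_max REAL, lon_min REAL, lon_max REAL)""")
db.execute("INSERT INTO datasets VALUES "
           "('reanalysis-t2m', 'node-03', '/data/rean/t2m', 't2m', "
           "'1979-01-01', '2015-12-31', -90, 90, 0, 360)")

# Find collections covering a given variable, period, and latitude band.
rows = db.execute("""SELECT name, node, path FROM datasets
    WHERE var = ? AND t_start <= ? AND t_end >= ?
      AND lat_min <= ? AND lat_max >= ?""",
    ("t2m", "1990-01-01", "2000-12-31", 50.0, 60.0)).fetchall()
print(rows)
```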

  14. The Development and Application of an Integrated VAR Process Model

    Science.gov (United States)

    Ballantyne, A. Stewart

    2016-07-01

    The VAR ingot has been the focus of several modelling efforts over the years with the result that the thermal regime in the ingot can be simulated quite realistically. Such models provide important insight into solidification of the ingot but present some significant challenges to the casual user such as a process engineer. To provide the process engineer with a tool to assist in the development of a melt practice, a comprehensive model of the complete VAR process has been developed. A radiation heat transfer simulation of the arc has been combined with electrode and ingot models to develop a platform which accepts typical operating variables (voltage, current, and gap) together with process parameters (electrode size, crucible size, orientation, water flow, etc.) as input data. The output consists of heat flow distributions and solidification parameters in the form of text, comma-separated value, and visual toolkit files. The resulting model has been used to examine the relationship between the assumed energy distribution in the arc and the actual energy flux which arrives at the ingot top surface. Utilizing heat balance information generated by the model, the effects of electrode-crucible orientation and arc gap have been explored with regard to the formation of ingot segregation defects.

  15. An Implicit Model Development Process for Bounding External, Seemingly Intangible/Non-Quantifiable Factors

    Science.gov (United States)

    2017-06-01

    This research expands the modeling and simulation (M&S) body of knowledge through the development of an Implicit Model Development Process (IMDP)… When augmented to traditional Model Development Processes (MDP), the IMDP enables the development of models that can address a broader array of… potential impacts on operational effectiveness. Specifically, the IMDP provides a formalized methodology for developing an improved model definition…

  16. Towards a Business Process Modeling Technique for Agile Development of Case Management Systems

    Directory of Open Access Journals (Sweden)

    Ilia Bider

    2017-12-01

    A modern organization needs to adapt its behavior to changes in the business environment by changing its Business Processes (BP) and the corresponding Business Process Support (BPS) systems. One way of achieving such adaptability is via separation of the system code from the process description/model by applying the concept of executable process models. Furthermore, to ease the introduction of changes, such a process model should separate different perspectives, for example control-flow, human resources, and data perspectives, from each other. In addition, for developing a completely new process, it should be possible to start with a reduced process model to get a BPS system quickly running, and then continue to develop it in an agile manner. This article consists of two parts: the first sets requirements on modeling techniques that could be used in tools that support agile development of BPs and BPS systems. The second part suggests a business process modeling technique that allows one to start modeling with the data/information perspective, which is appropriate for processes supported by Case or Adaptive Case Management (CM/ACM) systems. In a model produced by this technique, called a data-centric business process model, a process instance/case is defined as a sequence of states in a specially designed instance database, while the process model is defined as a set of rules that set restrictions on allowed states and transitions between them. The article details the background for the project of developing the data-centric process modeling technique, presents the outline of the structure of the model, and gives formal definitions for a substantial part of the model.
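
    The states-plus-rules idea can be illustrated in a few lines: a case is a record of data fields, and a transition is admitted only if every rule accepts the new state. The rule and field names below are hypothetical, not from the article's formal model.

```python
# A case is a dict of data fields; rules constrain allowed transitions.
def rule_no_decision_without_review(old, new):
    """A decision may only be recorded once the case has been reviewed."""
    return new.get("decision") is None or old.get("reviewed") is True

RULES = [rule_no_decision_without_review]

def transition(case_state, updates):
    """Apply updates only if every rule admits the resulting state."""
    new_state = {**case_state, **updates}
    if all(rule(case_state, new_state) for rule in RULES):
        return new_state
    raise ValueError("transition violates a process rule")

case = {"reviewed": False, "decision": None}
case = transition(case, {"reviewed": True})
case = transition(case, {"decision": "approved"})
print(case)
```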

  17. Process development

    Energy Technology Data Exchange (ETDEWEB)

    Schuegerl, K

    1984-01-01

    The item 'process development' comprises the production of acetone/butanol with C. acetobutylicum and the fermentation of potato waste. The target is to increase productivity by taking the following measures: optimization of media, on-line process analysis, reaction analysis, mathematical modelling and parameter identification, process simulation, development of a state estimator with the help of the on-line process analysis and the model, and optimization and adaptive control.

  18. Modelling of transport and biogeochemical processes in pollution plumes: Literature review of model development

    DEFF Research Database (Denmark)

    Brun, A.; Engesgaard, Peter Knudegaard

    2002-01-01

    A literature survey shows how biogeochemical (coupled organic and inorganic reaction processes) transport models are based on considering the complete biodegradation process as either a single- or as a two-step process. It is demonstrated that some two-step process models rely on the Partial Equilibrium Approach (PEA). The PEA assumes the organic degradation step, and not the electron acceptor consumption step, is rate limiting. This distinction is not possible in one-step process models, where consumption of both the electron donor and acceptor are treated kinetically. A three-dimensional, two-step PEA model is developed. The model allows for Monod kinetics and biomass growth, features usually included only in one-step process models. The biogeochemical part of the model is tested for a batch system with degradation of organic matter under the consumption of a sequence of electron acceptors…
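
    As a concrete illustration of the two-step idea, the sketch below treats organic-matter degradation as the kinetic (Monod) step and then, in the PEA spirit, allocates the resulting electron-acceptor demand to O2 before NO3-. All rate constants, yields, and acceptor pool sizes are invented for illustration, not taken from the cited model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Step 1 (kinetic): Monod degradation of organic matter with biomass growth.
k_max, K_s, Y = 0.5, 1.0, 0.1   # 1/day, mg/L, yield; illustrative values

def rhs(t, y):
    doc, bio = y                              # organic carbon, biomass
    r = k_max * bio * doc / (K_s + doc)       # Monod rate
    return [-r, Y * r]

sol = solve_ivp(rhs, (0, 30), [10.0, 0.1], max_step=0.1)

# Step 2 (equilibrium, PEA): acceptors consumed in redox order to meet
# the electron-donor demand (all in arbitrary consistent units).
demand = 10.0 - sol.y[0]                      # donor consumed over time
o2_pool, no3_pool = 6.0, 4.0
o2_used = np.minimum(demand, o2_pool)
no3_used = np.minimum(demand - o2_used, no3_pool)
```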

  19. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in organization parameters and user requirements. The approach uses a timed Petri net and an object-oriented, top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations and in performing continuous process improvement and optimization.

  20. Interprofessional practice in primary care: development of a tailored process model

    Directory of Open Access Journals (Sweden)

    Stans SEA

    2013-04-01

    Steffy EA Stans, JG Anita Stevens, Anna JHM Beurskens (Research Center of Autonomy and Participation for Persons with a Chronic Illness, Zuyd University of Applied Sciences, Heerlen, The Netherlands). Purpose: This study investigated the improvement of interprofessional practice in primary care by performing the first three steps of the implementation model described by Grol et al. This article describes the targets for improvement in a setting for children with complex care needs (step 1), the identification of barriers and facilitators influencing interprofessional practice (step 2), and the development of a tailored interprofessional process model (step 3). Methods: In step 2, thirteen qualitative semistructured interviews were held with several stakeholders, including parents of children, an occupational therapist, a speech and language therapist, a physical therapist, the manager of the team, two general practitioners, a psychologist, and a primary school teacher. The data were analyzed using directed content analysis, with the domains of the Chronic Care Model as a framework. In step 3, a project group was formed to develop helpful strategies, including the development of an interprofessional process through process mapping. Results: In step 2, it was found that the most important barriers to implementing interprofessional practice related to the lack of structure in the care process. A process model for interprofessional primary care was developed for the target group. Conclusion: The lack of a shared view of what is involved in the process of interprofessional practice was the most important barrier to its successful implementation. It is suggested that the tailored process developed, supported with the appropriate tools, may provide both professional staff and their clients, in this setting but also in other areas of primary care, with insight into the care process and a clear representation of "who should do what, when, and how."

  1. Integrated approaches to the application of advanced modeling technology in process development and optimization

    Energy Technology Data Exchange (ETDEWEB)

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E. [Massachusetts Institute of Technology, Cambridge, MA (United States)] [and others]

    1995-12-31

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  2. Development and implementation of a process model for improvement in manufacturing organisations

    International Nuclear Information System (INIS)

    Ideboeen, F.; Varildengen, R.

    1998-01-01

    The Institute for Information Technology has developed a holistic and analytic model to improve the competitive power of organisations. The goal of the work was to develop a practical and holistic tool. The Process Model is a general model for organisations and has to be adjusted to each organisation. It is based on the fact that products are created as they go through the value creating processes (what the customer is willing to pay for). All products and services can be considered a service for the customer. The product itself has less value; the customer is interested in what a product can provide, including status. The organisation is looked upon as a system, which in turn is an interdependent group of items, people, or processes working together toward a common purpose. A process is a set of causes and conditions that repeatedly occur in a sequential series of steps to transform inputs into outputs. This model divides the company into 3 major process groups: value creating processes, management processes, and support processes. Value creating processes are activities that the customer is willing to pay for. Management processes are the long-term processes to obtain optimal profitability through satisfied customers and employees, both in the present and in the future. Support processes are those processes necessary to support the value creating processes. By using the Process Model a company can re-engineer processes and the linkage between them and take out unnecessary processes. One can also work with a single process individually. The main goal is to have a model of the company and an overview of the processes and the linkage between them. Changes then have to be predicted and their consequences foreseen within the model.

  3. Process-Based Development of Competence Models to Computer Science Education

    Science.gov (United States)

    Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter

    2016-01-01

    A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…

  4. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    Science.gov (United States)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.

  5. Green Pea and Garlic Puree Model Food Development for Thermal Pasteurization Process Quality Evaluation.

    Science.gov (United States)

    Bornhorst, Ellen R; Tang, Juming; Sablani, Shyam S; Barbosa-Cánovas, Gustavo V; Liu, Fang

    2017-07-01

    Development and selection of model foods is a critical part of microwave thermal process development, simulation validation, and optimization. Previously developed model foods for pasteurization process evaluation utilized Maillard reaction products as the time-temperature integrators, which resulted in similar temperature sensitivity among the models. The aim of this research was to develop additional model foods based on different time-temperature integrators, determine their dielectric properties and color change kinetics, and validate the optimal model food in hot water and microwave-assisted pasteurization processes. Color, quantified using the a* value, was selected as the time-temperature indicator for green pea and garlic puree model foods. Results showed 915 MHz microwaves had a greater penetration depth into the green pea model food than the garlic. a* value reaction rates for the green pea model were approximately 4 times slower than in the garlic model food; slower reaction rates were preferred for the application of model food in this study, that is, quality evaluation for a target process of 90 °C for 10 min at the cold spot. Pasteurization validation used the green pea model food, and results showed that there were quantifiable differences in color between the unheated control, hot water pasteurization, and the microwave-assisted thermal pasteurization system. Both model foods developed in this research could be utilized for quality assessment and optimization of various thermal pasteurization processes. © 2017 Institute of Food Technologists®.
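
    Color-based time-temperature integrators of this kind are commonly modeled as first-order kinetics with Arrhenius temperature dependence; the sketch below shows that generic form with invented parameter values, not the paper's fitted constants.

```python
import numpy as np

# First-order color-change kinetics with Arrhenius temperature dependence.
# All parameter values are illustrative placeholders.
R = 8.314                     # gas constant, J/(mol K)
Ea = 100e3                    # activation energy, J/mol (assumed)
k_ref, T_ref = 0.05, 363.15   # rate 1/min at reference 90 C (assumed)

def a_star(t_min, T_K, a0=-12.0, a_inf=2.0):
    """a*(t) relaxing from a0 (green) toward a_inf at temperature T_K."""
    k = k_ref * np.exp(-Ea / R * (1.0 / T_K - 1.0 / T_ref))
    return a_inf + (a0 - a_inf) * np.exp(-k * t_min)

print(a_star(10.0, 363.15))   # target process: 90 C for 10 min
```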

  6. Development of a Systems Engineering Model of the Chemical Separations Process

    International Nuclear Information System (INIS)

    Sun, Lijian; Li, Jianhong; Chen, Yitung; Clarksean, Randy; Ladler, Jim; Vandergrift, George

    2002-01-01

    Work is being performed to develop a general-purpose systems engineering model for the AAA separation process. The work centers on the development of a new user interface for the AMUSE code and on the specification of a systems engineering model. This paper presents background information and an overview of work completed to date. (authors)

  7. Intentional Modelling: A Process for Clinical Leadership Development in Mental Health Nursing.

    Science.gov (United States)

    Ennis, Gary; Happell, Brenda; Reid-Searl, Kerry

    2016-05-01

    Clinical leadership is becoming more relevant for nurses, as the positive impact that it can have on the quality of care and outcomes for consumers is better understood and more clearly articulated in the literature. As clinical leadership continues to become more relevant, the need to gain an understanding of how clinical leaders in nursing develop will become increasingly important. While the attributes associated with effective clinical leadership are recognized in the current literature, there remains a paucity of research on how clinical leaders develop these attributes. This study utilized a grounded theory methodology to generate new insights into the experiences of peer-identified clinical leaders in mental health nursing and the process of developing clinical leadership skills. Participants in this study were nurses working in a mental health setting who were identified as clinical leaders by their peers, as opposed to being identified by their role or organizational position. A process of intentional modelling emerged as the substantive theory identified in this study. Intentional modelling was described by participants as a process that enabled them to purposefully identify models that assisted them in developing the characteristics of effective clinical leaders, as well as allowing them to model these characteristics to others. Reflection on practice is an important contributor to intentional modelling. Intentional modelling could be developed as a framework for promoting knowledge and skill development in the area of clinical leadership.

  8. Development of flexible process-centric web applications: An integrated model driven approach

    NARCIS (Netherlands)

    Bernardi, M.L.; Cimitile, M.; Di Lucca, G.A.; Maggi, F.M.

    2012-01-01

    In recent years, Model Driven Engineering (MDE) approaches have been proposed and used to develop and evolve WAs. However, the definition of appropriate MDE approaches for the development of flexible process-centric WAs is still limited. In particular, (flexible) workflow models have never been…

  9. Strategic Alliance Development - A Process Model A Case Study Integrating Elements of Strategic Alliances

    OpenAIRE

    Mohd Yunos, Mohd Bulkiah

    2007-01-01

    There has been an enormous increase in the formation of strategic alliances, and in the research efforts devoted to understanding the alliance development process, over the last few decades. However, the critical elements that influence each stage of alliance development are as yet unexplored. This dissertation aims to fill this gap by introducing an integrated process model of strategic alliance development and its critical elements. The process model for strategic alliance developm…

  10. Mechanistic Models for Process Development and Optimization of Fed-batch Fermentation Systems

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads O.

    2016-01-01

    This work discusses the application of mechanistic models to pilot scale filamentous fungal fermentation systems operated at Novozymes A/S. For on-line applications, a state estimator model is developed based on a stoichiometric balance in order to predict the biomass and product concentration. This is based on on-line gas measurements and ammonia addition flow rate measurements. Additionally, a mechanistic model is applied offline as a tool for batch planning, based on definition of the process back pressure, aeration rate and stirrer speed. This allows the batch starting fill to be planned, taking into account the oxygen transfer conditions, as well as the evaporation rates of the system. Mechanistic models are valuable tools which are applicable for both process development and optimization. The state estimator described will be a valuable tool for future work as part of control strategy development…
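
    The gas-balance idea behind such a state estimator can be sketched in a few lines: each on-line measurement implies a biomass-growth increment through an assumed yield coefficient. The yields and rates below are placeholder values for illustration, not Novozymes parameters.

```python
# Stoichiometric biomass estimation from on-line measurements (sketch).
Y_XO = 1.2    # g biomass per g O2 consumed (assumed yield)
Y_XN = 8.0    # g biomass per g N fed as ammonia (assumed yield)

def update_biomass(X, our_g_per_h, nh3_n_g_per_h, dt_h):
    """Combine two independent biomass-rate estimates (O2- and N-based)."""
    dx_o2 = Y_XO * our_g_per_h * dt_h       # from oxygen uptake rate
    dx_n = Y_XN * nh3_n_g_per_h * dt_h      # from ammonia addition rate
    return X + 0.5 * (dx_o2 + dx_n)         # simple average of the two

X = 5.0  # g/L equivalent, illustrative starting biomass
X = update_biomass(X, our_g_per_h=2.0, nh3_n_g_per_h=0.3, dt_h=0.5)
print(X)
```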

  11. Development of transformations from business process models to implementations by reuse

    NARCIS (Netherlands)

    Dirgahayu, T.; Quartel, Dick; van Sinderen, Marten J.; Ferreira Pires, Luis; Hammoudi, S.

    2007-01-01

    This paper presents an approach for developing transformations from business process models to implementations that facilitates reuse. A transformation is developed as a composition of three smaller tasks: pattern recognition, pattern realization and activity transformation. The approach allows one…

  12. The Optimization of the Local Public Policies’ Development Process Through Modeling And Simulation

    Directory of Open Access Journals (Sweden)

    Minodora URSĂCESCU

    2012-06-01

    The development of local public policies in Romania is an empirically driven exercise: strategic management practice in this domain is not based on a scientific instrument capable of anticipating and evaluating the results of implementing a local public policy in a needs-policies-effects logic. Starting from this motivation, the purpose of the paper is to reconceptualize the public policy process on the functioning principles of dynamic systems with feedback, by means of mathematical modeling and simulation techniques. The research is therefore oriented towards developing an optimization method for the local public policy development process, using mathematical modeling and simulation techniques as instruments. The main results of the research are, on the one hand, a new concept of the local public policy process and, on the other hand, a proposed conceptual model of a complex software product that will permit the parameterized modeling, in a virtual environment, of the development process of these policies. The purpose of this software product is to model and simulate each type of local public policy, taking into account the characteristics of the respective policy as well as the values of the parameters of its application environment at a given moment.

  13. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the software process improvement effort as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  14. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    …more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models, and to quantify the uncertainties in the estimated property values from a process design point-of-view. This includes: (i) parameter estimation using… The comparison of model prediction uncertainties with the reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column… The developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce the most critical uncertainties…
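
    For orientation, GC-based property models of this family generally take the linear group-contribution form below, shown here in the Marrero-Gani style with first-, second-, and third-order groups; the exact functional form used in the paper is not reproduced in this record, so this is an assumed generic template.

```latex
% Generic group-contribution form: N_i, M_j, O_k count occurrences of
% first-, second-, and third-order groups; C_i, D_j, E_k are their fitted
% contributions; w and z switch the higher-order terms on or off.
f(X) \;=\; \sum_i N_i C_i \;+\; w \sum_j M_j D_j \;+\; z \sum_k O_k E_k
```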

  15. Modelling coupled microbial processes in the subsurface: Model development, verification, evaluation and application

    Science.gov (United States)

    Masum, Shakil A.; Thomas, Hywel R.

    2018-06-01

    To study subsurface microbial processes, a coupled model which has been developed within a Thermal-Hydraulic-Chemical-Mechanical (THCM) framework is presented. The work presented here, focuses on microbial transport, growth and decay mechanisms under the influence of multiphase flow and bio-geochemical reactions. In this paper, theoretical formulations and numerical implementations of the microbial model are presented. The model has been verified and also evaluated against relevant experimental results. Simulated results show that the microbial processes have been accurately implemented and their impacts on porous media properties can be predicted either qualitatively or quantitatively or both. The model has been applied to investigate biofilm growth in a sandstone core that is subjected to a two-phase flow and variable pH conditions. The results indicate that biofilm growth (if not limited by substrates) in a multiphase system largely depends on the hydraulic properties of the medium. When the change in porewater pH which occurred due to dissolution of carbon dioxide gas is considered, growth processes are affected. For the given parameter regime, it has been shown that the net biofilm growth is favoured by higher pH; whilst the processes are considerably retarded at lower pH values. The capabilities of the model to predict microbial respiration in a fully coupled multiphase flow condition and microbial fermentation leading to production of a gas phase are also demonstrated.

  16. Process, cost modeling and simulations for integrated project development of biomass for fuel and protein

    International Nuclear Information System (INIS)

    Pannir Selvam, P.V.; Wolff, D.M.B.; Souza Melo, H.N.

    1998-01-01

    The construction of models for biomass project development is described. These models, first constructed using the QPRO electronic spreadsheet for Windows, are now being developed with visual and object-oriented programming tools, using DELPHI V.1 for Windows and the process simulator SUPERPRO V.2.7 (Intelligent Inc.). These models allow process development problems with economic objectives to be solved very rapidly. The preliminary analyses of cost and investment for biomass utilisation projects included in this study cover: steam, ammonia, carbon dioxide and alkali pretreatment processes, methane gas production using anaerobic digestion, aerobic composting, ethanol fermentation and distillation, effluent treatment using high rate algae production, as well as cogeneration of energy for drying. The main projects under development are biomass valorisation projects with elephant (Napier) grass, sugar cane bagasse and microalgae, using models for mass balance, equipment and production cost. Sensitivity analyses are carried out to account for stochastic variations in process yield, production volume and prices, using the Monte Carlo method. These models allow the identification of economic and scale-up problems of the technology. Results obtained from the preliminary development of a few case studies are reported for integrated project development for fuel and protein using process and cost simulation models. (author)

  17. Integration of Fast Predictive Model and SLM Process Development Chamber, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This STTR project seeks to develop a fast predictive model for selective laser melting (SLM) processes and then integrate that model with an SLM chamber that allows...

  18. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret…

  19. Real-time control data wrangling for development of mathematical control models of technological processes

    Science.gov (United States)

    Vasilyeva, N. V.; Koteleva, N. I.; Fedorova, E. R.

    2018-05-01

    The relevance of the research is due to the need to stabilize the composition of the melting products of copper-nickel sulfide raw materials in the Vanyukov furnace. The goal of this research is to identify the most suitable methods for the aggregation of real-time data for the development of a mathematical model for control of the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace. Statistical methods for analyzing the historical data of the real technological object and the correlation analysis of process parameters are described. Factors that exert the greatest influence on the main output parameter (copper content in matte) and ensure the physical-chemical transformations are revealed. An approach to the processing of real-time data for the development of a mathematical model for control of the melting process is proposed. The stages of processing the real-time information are considered. The adopted methodology for the aggregation of data suitable for the development of a control model for the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace allows us to interpret the obtained results for their further practical application.

  20. Model-based high-throughput process development for chromatographic whey proteins separation

    NARCIS (Netherlands)

    Nfor, B.; Ripic, J.; Padt, van der A.; Jacobs, M.; Ottens, M.

    2012-01-01

    In this study, an integrated approach involving the combined use of high-throughput screening (HTS) and column modeling during process development was applied to an industrial case involving the evaluation of four anion-exchange chromatography (AEX) resins and four hydrophobic interaction…

  1. Process development of a New Haemophilus influenzae type b conjugate vaccine and the use of mathematical modeling to identify process optimization possibilities.

    Science.gov (United States)

    Hamidi, Ahd; Kreeftenberg, Hans; V D Pol, Leo; Ghimire, Saroj; V D Wielen, Luuk A M; Ottens, Marcel

    2016-05-01

    Vaccination is one of the most successful public health interventions, being a cost-effective tool in preventing deaths among young children. The earliest vaccines were developed following empirical methods, creating vaccines by trial and error. New process development tools, for example mathematical modeling, as well as new regulatory initiatives requiring better understanding of both the product and the process, are being applied to well-characterized biopharmaceuticals (for example recombinant proteins). The vaccine industry still lags behind these industries. A production process for a new Haemophilus influenzae type b (Hib) conjugate vaccine, including the related quality control (QC) tests, was developed and transferred to a number of emerging vaccine manufacturers. This contributed to a sustainable global supply of affordable Hib conjugate vaccines, as illustrated by the market launch of the first Hib vaccine based on this technology in 2007 and the concomitant price reduction of Hib vaccines. This paper describes the development approach followed for this Hib conjugate vaccine as well as the mathematical modeling tool applied recently in order to indicate options for further improvements of the initial Hib process. The strategy followed during the process development of this Hib conjugate vaccine was a targeted and integrated approach based on prior knowledge and experience with similar products, using multi-disciplinary expertise. Mathematical modeling was used to develop a predictive model for the initial Hib process (the 'baseline' model) as well as an 'optimized' model, by proposing a number of process changes which could lead to further price reductions. © 2016 American Institute of Chemical Engineers. Biotechnol. Prog., 32:568-580, 2016.

  2. Business process model repositories : efficient process retrieval

    NARCIS (Netherlands)

    Yan, Z.

    2012-01-01

    As organizations increasingly work in a process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a…

  3. Developing Pavement Distress Deterioration Models for Pavement Management System Using Markovian Probabilistic Process

    Directory of Open Access Journals (Sweden)

    Promothes Saha

    2017-01-01

    In the state of Colorado, the Colorado Department of Transportation (CDOT) utilizes its pavement management system (PMS) to manage approximately 9,100 miles of interstate, highways, and low-volume roads. Three types of deterioration models are currently being used in the existing PMS: site-specific, family, and expert opinion curves. These curves are developed using deterministic techniques. In the deterministic technique, the uncertainties of pavement deterioration related to traffic and weather are not considered. Probabilistic models that take into account the uncertainties result in more accurate curves. In this study, probabilistic models using the discrete-time Markov process were developed for five distress indices: transverse, longitudinal, fatigue, rut, and ride indices, as a case study on low-volume roads. Regression techniques were used to develop the deterioration paths using the predicted distribution of indices estimated from the Markov process. Results indicated that the longitudinal, fatigue, and rut indices had very slow deterioration over time, whereas the transverse and ride indices showed faster deterioration. The developed deterioration models had a coefficient of determination (R²) above 0.84. As probabilistic models provide more accurate results, it is recommended that these models be used as the family curves in the CDOT PMS for low-volume roads.
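
    The discrete-time Markov mechanism is easy to make concrete: a condition-state distribution is propagated year by year through a transition matrix, and an expected index is read off the distribution. The five-state matrix and index weights below are invented for illustration, not CDOT's fitted values.

```python
import numpy as np

# Condition states 1 (best) to 5 (worst); each year a section either
# stays in its state or drops one state. Probabilities are illustrative.
P = np.array([[0.85, 0.15, 0.00, 0.00, 0.00],
              [0.00, 0.80, 0.20, 0.00, 0.00],
              [0.00, 0.00, 0.75, 0.25, 0.00],
              [0.00, 0.00, 0.00, 0.70, 0.30],
              [0.00, 0.00, 0.00, 0.00, 1.00]])
index_weights = np.array([100, 80, 60, 40, 20])  # assumed index per state

state = np.array([1.0, 0, 0, 0, 0])              # new pavement
for year in range(1, 21):
    state = state @ P                            # propagate one year
    if year % 5 == 0:
        print(f"year {year:2d}: expected index {state @ index_weights:5.1f}")
```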

  4. Clinical, information and business process modeling to promote development of safe and flexible software.

    Science.gov (United States)

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  5. Development strategy and process models for phased automation of design and digital manufacturing electronics

    Science.gov (United States)

    Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.

    2018-03-01

    The strategy for assuring the quality of electronics is presented as most important. To provide quality, the sequence of processes is considered and modelled as a Markov chain. The improvement is distinguished by simple database means of design for manufacturing, allowing future step-by-step development. Phased automation of design and digital manufacturing of electronics is proposed. MatLab modelling results showed an increase in effectiveness. New tools and software should be more effective. A primary digital model is proposed to represent the product across the sequence of processes and throughout the whole life cycle.

  6. Is there room for 'development' in developmental models of information processing biases to threat in children and adolescents?

    Science.gov (United States)

    Field, Andy P; Lester, Kathryn J

    2010-12-01

    Clinical and experimental theories assume that processing biases in attention and interpretation are a causal mechanism through which anxiety develops. Despite growing evidence that these processing biases are present in children and, therefore, develop long before adulthood, these theories ignore the potential role of child development. This review attempts to place information processing biases within a theoretical developmental framework. We consider whether child development has no impact on information processing biases to threat (integral bias model), or whether child development influences information processing biases and if so whether it does so by moderating the expression of an existing bias (moderation model) or by affecting the acquisition of a bias (acquisition model). We examine the extent to which these models fit with existing theory and research evidence and outline some methodological issues that need to be considered when drawing conclusions about the potential role of child development in the information processing of threat stimuli. Finally, we speculate about the developmental processes that might be important to consider in future research.

  7. Managing Service Development (SaaS) as a project: business process modeling

    OpenAIRE

    Iliadi, Vasiliki; Ηλιάδη, Βασιλική

    2017-01-01

    In the context of the present thesis, we will be studying core principles of Business Process Management, and how we can take advantage of them in combination with Project Management Methodologies and modeling tools in the context of Software as a Service businesses and their development. Initially we provide the reader with an introduction to Business Process Management, how it can be used, and how the life cycle is structured. We further define the first three phases of the life cycle to...

  8. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling....... the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension...

  9. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    possible to retrieve symbolically obtained derivatives of arbitrary process properties with respect to process parameters efficiently as a post calculation. The approach is therefore perfectly suitable to perform advanced process systems engineering tasks, such as sensitivity analysis, process optimisation, and data reconciliation. The concept of canonical modelling yields a natural definition of a general exergy state function for second law analysis. By partitioning of exergy into latent, mechanical, and chemical contributions, irreversible effects can be identified specifically, even for black-box models. The calculation core of a new process simulator called Yasim is developed and implemented. The software design follows the concepts described in the theoretical part of this thesis. Numerous exemplary process models are presented to address various subtopics of canonical modelling (author)
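
    The claim that derivatives of arbitrary process properties with respect to process parameters can be retrieved symbolically as a post-calculation can be illustrated in spirit with a computer algebra sketch (the property and parameters below are invented for illustration and are not from the thesis):

        import sympy as sp

        # Hypothetical property: outlet temperature of a heated stream as a
        # function of inlet temperature T_in, duty Q, mass flow m and heat
        # capacity cp.
        T_in, Q, m, cp = sp.symbols("T_in Q m cp", positive=True)
        T_out = T_in + Q / (m * cp)

        # Symbolic sensitivities, available cheaply once the model is built.
        print(sp.diff(T_out, Q))  # 1/(cp*m)
        print(sp.diff(T_out, m))  # -Q/(cp*m**2)

    Such sensitivities are exactly what sensitivity analysis, process optimisation and data reconciliation consume.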

  10. Models development for natural circulation and its transition process in nuclear power plant

    International Nuclear Information System (INIS)

    Yu Lei; Cai Qi; Cai Zhangsheng; Xie Haiyan

    2008-01-01

    On the basis of the nuclear power plant (NPP) best-estimate transient analysis code RELAP5/MOD3, the point reactor kinetics model in RELAP5/MOD3 was replaced by a two-group, 3-D space- and time-dependent neutron kinetics model, in order to analyze exactly the responses of key parameters in natural circulation and its transition process, considering reactivity feedback. The coupled model for three-dimensional physics and thermohydraulics was established and the corresponding computing code was developed. Using the developed code, natural circulation of the NPP and its transition process were calculated and analyzed. Compared with the experimental data, the calculated results show high precision, avoiding the shortcoming that the point reactor equations cannot reflect the reactivity exactly. This code can serve as a computing and analysis tool for forced circulation, natural circulation and their transitions. (authors)
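
    For reference, the lumped point reactor kinetics model that the work replaces is conventionally written (standard notation, not reproduced from the paper) as

        \frac{dn}{dt} = \frac{\rho(t) - \beta}{\Lambda}\, n(t) + \sum_{i=1}^{6} \lambda_i C_i(t),
        \qquad
        \frac{dC_i}{dt} = \frac{\beta_i}{\Lambda}\, n(t) - \lambda_i C_i(t),

    where n is the neutron density, ρ the reactivity, β = Σβ_i the delayed-neutron fraction, Λ the prompt generation time, and C_i, λ_i the precursor concentrations and decay constants. Replacing these lumped equations with a two-group, spatially resolved kinetics model is what allows local reactivity feedback to be represented during the transition between circulation modes.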

  11. [The development of an organizational socialization process model for new nurses using a system dynamics approach].

    Science.gov (United States)

    Choi, Soon-Ook

    2005-04-01

    The purpose of this study was to examine the problems and relevant variables for effective Organizational Socialization of new nurses, to produce a causal map, to build a simulation model and to test its validity. The basic data were collected from Sep. 2002 to July 2003. The Organizational Socialization process of new nurses was analyzed through a model simulation. The VENSIM 5.0b DSS program was used to develop the study model. The model shows the interrelation of these result variables: organizational commitment, job satisfaction, job performance, intention of leaving the work setting, decision-making ability, and general results of Organizational Socialization. The model's factors are characteristics of organizational and individual values, task-related knowledge and skills, and the emotion and communication that affect new nurses' socialization process. These elements go through the processes of anticipatory socialization, encounter, change and acquisition. The model was devised to induce effective Organizational Socialization results within 24 months of its implementation. The basic model is the most efficient and will also contribute to the development of the body of nursing knowledge. This study provides proper direction for new nurses' Organizational Socialization; therefore, developing an Organizational Socialization Process Model is meaningful in the sense that it could provide a framework for creating effective Organizational Socialization for new nurses.

  12. Development of Computer Aided Modelling Templates for Model Re-use in Chemical and Biochemical Process and Product Design: Import and export of models

    DEFF Research Database (Denmark)

    Fedorova, Marina; Tolksdorf, Gregor; Fillinger, Sandra

    2015-01-01

    been established, in order to provide a wider range of modelling capabilities. Through this link, developed models can be exported/imported to/from other modelling-simulation software environments to allow model reusability in chemical and biochemical product and process design. The use of this link...

  13. Study of alternative strategies to the task clarification activity of the market-pull product development process model

    OpenAIRE

    Motte, Damien

    2009-01-01

    The vast majority of current product development process models put forward in textbooks present a homogeneous structure, what Ulrich & Eppinger [1] call the market-pull model, which is presented as generic while other possible product development process models are merely seen as variants. This paper focuses on the task clarification and derived activities (mainly the systematic search for customer needs through market study and the supplementary development costs it entails) and in...

  14. How can Product Development Process Modelling be made more useful?

    DEFF Research Database (Denmark)

    Wynn, David C; Maier, Anja; Clarkson, John P

    2010-01-01

    and on the way they are applied. The paper draws upon established principles of cybernetic systems in an attempt to explain the role played by process modelling in operating and improving PD processes. We use this framework to identify eight key factors which influence the utility of modelling in the context...... of use. Further, we indicate how these factors can be interpreted to identify opportunities to improve modelling utility. The paper is organised as follows. Section 2 provides background and motivation for the paper by discussing an example of PD process modelling practice. After highlighting from......, and the process being modelled. Section 5 draws upon established principles of cybernetic systems theory to incorporate this view in an explanation of the role of modelling in PD process operation and improvement. This framework is used to define modelling utility and to progressively identify influences upon it...

  15. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    Science.gov (United States)

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
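
    The core AHP computation, deriving criteria weights from a pairwise comparison matrix via its principal eigenvector and checking consistency, can be sketched as follows (the 3x3 matrix is a made-up example, not the paper's inspection criteria):

        import numpy as np

        # Hypothetical pairwise comparisons on Saaty's 1-9 scale
        # (reciprocal matrix: A[j, i] = 1 / A[i, j]).
        A = np.array([
            [1.0, 3.0, 5.0],
            [1 / 3, 1.0, 2.0],
            [1 / 5, 1 / 2, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)           # principal eigenvalue
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()              # normalised criteria weights

        n = A.shape[0]
        CI = (eigvals[k].real - n) / (n - 1)  # consistency index
        RI = 0.58                             # Saaty's random index for n = 3
        print(weights, CI / RI)               # consistency ratio, ideally < 0.1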

  16. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  17. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  18. Systematic Multi‐Scale Model Development Strategy for the Fragrance Spraying Process and Transport

    DEFF Research Database (Denmark)

    Heitzig, M.; Rong, Y.; Gregson, C.

    2012-01-01

    The fast and efficient development and application of reliable models with appropriate degree of detail to predict the behavior of fragrance aerosols are challenging problems of high interest to the related industries. A generic modeling template for the systematic derivation of specific fragrance......‐aided modeling framework, which is structured based on workflows for different general modeling tasks. The benefits of the fragrance spraying template are highlighted by a case study related to the derivation of a fragrance aerosol model that is able to reflect measured dynamic droplet size distribution profiles...... aerosol models is proposed. The main benefits of the fragrance spraying template are the speed‐up of the model development/derivation process, the increase in model quality, and the provision of structured domain knowledge where needed. The fragrance spraying template is integrated in a generic computer...

  19. Models of neural dynamics in brain information processing - the developments of 'the decade'

    International Nuclear Information System (INIS)

    Borisyuk, G N; Borisyuk, R M; Kazanovich, Yakov B; Ivanitskii, Genrikh R

    2002-01-01

    Neural network models are discussed that have been developed during the last decade with the purpose of reproducing spatio-temporal patterns of neural activity in different brain structures. The main goal of the modeling was to test hypotheses of synchronization, temporal and phase relations in brain information processing. The models being considered are those of temporal structure of spike sequences, of neural activity dynamics, and oscillatory models of attention and feature integration. (reviews of topical problems)

  20. Modelling and Development of a High Performance Milling Process with Monolithic Cutting Tools

    International Nuclear Information System (INIS)

    Ozturk, E.; Taylor, C. M.; Turner, S.; Devey, M.

    2011-01-01

    Critical aerospace components usually require difficult-to-machine workpiece materials like nickel-based alloys. Moreover, there is a pressing need to maximize the productivity of machining operations. This need can be satisfied by selecting a higher feed velocity and larger axial and radial depths, but several problems may then arise during machining. Due to the high cutting speeds of high performance machining, tool life may be unacceptably low. If the magnitudes of the cutting forces are high, out-of-tolerance static form errors may result; in extreme cases, the cutting tool may break apart. Forced vibrations may deteriorate the surface quality, and chatter vibrations may develop if the selected parameters result in instability. In this study, in order to deal with the tool life issue, several experimental cuts were made with different tool geometries, and the best combination in terms of tool life was selected. A force model was developed and its results verified against experimental results. The force model is used to predict the effect of process parameters on cutting forces. In order to account for the other concerns, such as static form errors and forced and chatter vibrations, additional process models are currently under development.
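
    For context, mechanistic milling force models of the kind described typically express the tangential and radial force components per flute through cutting-force coefficients and the instantaneous chip thickness (a standard formulation, not the authors' exact model):

        F_t(\phi) = K_{tc}\, a\, h(\phi) + K_{te}\, a, \qquad
        F_r(\phi) = K_{rc}\, a\, h(\phi) + K_{re}\, a,

    with a the axial depth of cut, h(φ) ≈ f_t sin φ the chip thickness for feed per tooth f_t, and the K coefficients identified from calibration cuts. Summing over the engaged flutes and integrating over the rotation angle yields the predicted cutting forces used to assess form errors and stability.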

  1. A conceptual model for developing KPIs for early phases of the construction process

    NARCIS (Netherlands)

    Haponava, T.; Al-Jibouri, Saad H.S.; Mawdesley, M.; Ahmed, Syed M.; Azhar, Salman; Mohamed, Sherif

    2007-01-01

    The pre-project stage in construction is where most of the decisions about project investment and development are taken. It is therefore very important to be able to control and influence the process at the very beginning of the project. This paper proposes a model for developing a set of KPIs for

  2. Development of Three-Layer Simulation Model for Freezing Process of Food Solution Systems

    Science.gov (United States)

    Kaminishi, Koji; Araki, Tetsuya; Shirakashi, Ryo; Ueno, Shigeaki; Sagara, Yasuyuki

    A numerical model has been developed for simulating freezing phenomena in food solution systems. The cell model was simplified for application to food solution systems, incorporating three regions: an unfrozen layer, a frozen layer and a moving boundary layer. Moreover, a model for the moving rate of the freezing front was introduced and calculated using the variable space network method proposed by Murray and Landis (1957). To demonstrate the validity of the model, it was applied to the freezing processes of coffee solutions. Since the model required the phase diagram of the material to be frozen, the initial freezing temperatures of 1-55 % coffee solutions were measured by the DSC method. The effective thermal conductivity of coffee solutions was determined as a function of temperature and solute concentration by using the Maxwell-Eucken model. The one-dimensional freezing process of a 10 % coffee solution was simulated based on its phase diagram and thermo-physical properties. The results were in good agreement with the experimental data, showing that the model could accurately describe the change in the location of the freezing front and the distributions of temperature as well as ice fraction during a freezing process.
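
    The Maxwell-Eucken relation used for the effective thermal conductivity has the standard form (k_c: continuous-phase conductivity, k_d: dispersed-phase conductivity, v_d: dispersed-phase volume fraction):

        k_{eff} = k_c\, \frac{2 k_c + k_d - 2 (k_c - k_d)\, v_d}{2 k_c + k_d + (k_c - k_d)\, v_d}

    In a freezing solution the ice fraction, and hence v_d, changes with temperature, which is why the effective conductivity must be evaluated as a function of both temperature and solute concentration.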

  3. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    are affected (in a positive or negative way) by the presence of the other enzymes and compounds in the media. In this thesis the concept of multi-enzyme in-pot term is adopted for processes that are carried out by the combination of enzymes in a single reactor and implemented at pilot or industrial scale...... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases efficiency of the model development process with respect to time and resources needed (fast and effective....... In this way the model parameters that drives the main dynamic behavior can be identified and thus a better understanding of this type of processes. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid...

  4. A conceptual model for the development process of confirmatory adaptive clinical trials within an emergency research network.

    Science.gov (United States)

    Mawocha, Samkeliso C; Fetters, Michael D; Legocki, Laurie J; Guetterman, Timothy C; Frederiksen, Shirley; Barsan, William G; Lewis, Roger J; Berry, Donald A; Meurer, William J

    2017-06-01

    Adaptive clinical trials use accumulating data from enrolled subjects to alter trial conduct in pre-specified ways based on quantitative decision rules. In this research, we sought to characterize the perspectives of key stakeholders during the development process of confirmatory-phase adaptive clinical trials within an emergency clinical trials network and to build a model to guide future development of adaptive clinical trials. We used an ethnographic, qualitative approach to evaluate key stakeholders' views about the adaptive clinical trial development process. Stakeholders participated in a series of multidisciplinary meetings during the development of five adaptive clinical trials and completed a Strengths-Weaknesses-Opportunities-Threats questionnaire. In the analysis, we elucidated overarching themes across the stakeholders' responses to develop a conceptual model. Four major overarching themes emerged during the analysis of stakeholders' responses to questioning: the perceived statistical complexity of adaptive clinical trials and the roles of collaboration, communication, and time during the development process. Frequent and open communication and collaboration were viewed by stakeholders as critical during the development process, as were the careful management of time and logistical issues related to the complexity of planning adaptive clinical trials. The Adaptive Design Development Model illustrates how statistical complexity, time, communication, and collaboration are moderating factors in the adaptive design development process. The intensity and iterative nature of this process underscores the need for funding mechanisms for the development of novel trial proposals in academic settings.

  5. Framework for developing hybrid process-driven, artificial neural network and regression models for salinity prediction in river systems

    Science.gov (United States)

    Hunter, Jason M.; Maier, Holger R.; Gibbs, Matthew S.; Foale, Eloise R.; Grosvenor, Naomi A.; Harders, Nathan P.; Kikuchi-Miller, Tahali C.

    2018-05-01

    Salinity modelling in river systems is complicated by a number of processes, including in-stream salt transport and various mechanisms of saline accession that vary dynamically as a function of water level and flow, often at different temporal scales. Traditionally, salinity models in rivers have either been process- or data-driven. The primary problem with process-based models is that in many instances, not all of the underlying processes are fully understood or able to be represented mathematically. There are also often insufficient historical data to support model development. The major limitation of data-driven models, such as artificial neural networks (ANNs) in comparison, is that they provide limited system understanding and are generally not able to be used to inform management decisions targeting specific processes, as different processes are generally modelled implicitly. In order to overcome these limitations, a generic framework for developing hybrid process and data-driven models of salinity in river systems is introduced and applied in this paper. As part of the approach, the most suitable sub-models are developed for each sub-process affecting salinity at the location of interest based on consideration of model purpose, the degree of process understanding and data availability, which are then combined to form the hybrid model. The approach is applied to a 46 km reach of the Murray River in South Australia, which is affected by high levels of salinity. In this reach, the major processes affecting salinity include in-stream salt transport, accession of saline groundwater along the length of the reach and the flushing of three waterbodies in the floodplain during overbank flows of various magnitudes. Based on trade-offs between the degree of process understanding and data availability, a process-driven model is developed for in-stream salt transport, an ANN model is used to model saline groundwater accession and three linear regression models are used
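
    Structurally, the hybrid model amounts to composing independently developed sub-models, one per salinity process, into a single predictor. A schematic sketch of that composition (all names and signatures below are placeholders, not the authors' code):

        from dataclasses import dataclass

        @dataclass
        class HybridSalinityModel:
            transport_model: callable    # process-driven in-stream salt transport
            groundwater_ann: callable    # ANN for saline groundwater accession
            flushing_models: list        # regression models, one per waterbody

            def predict(self, flow, water_level, upstream_salinity):
                load = self.transport_model(flow, upstream_salinity)
                load += self.groundwater_ann(flow, water_level)
                for m in self.flushing_models:
                    # floodplain flushing contributes only during overbank flows
                    load += m(flow)
                return load

    The framework's contribution is the selection rule: each sub-model type (process-driven, ANN or regression) is chosen from the trade-off between process understanding and data availability for that sub-process.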

  6. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes.

    Science.gov (United States)

    Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-11-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors. Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers. Biotechnol. Prog., 33:1647-1661, 2017.
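
    The system identification step the authors automate, fitting a low-order dynamic model from serialized step-change data, can be illustrated with an ordinary least-squares ARX fit (simulated data and parameters; not the authors' dataset or code):

        import numpy as np

        rng = np.random.default_rng(1)

        # Simulated step-change experiment: input u = galactose level,
        # output y = %galactosylation, first-order response plus noise.
        u = np.repeat([0.0, 1.0, 0.5, 1.5], 50)
        y = np.zeros_like(u)
        for t in range(1, len(u)):
            y[t] = 0.9 * y[t - 1] + 0.2 * u[t - 1] + 0.01 * rng.standard_normal()

        # ARX(1, 1) fit: y[t] = a * y[t-1] + b * u[t-1]
        X = np.column_stack([y[:-1], u[:-1]])
        a, b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
        print(a, b)  # should recover roughly 0.9 and 0.2

    A fitted model of this form is then sufficient for a model predictive controller to plan galactose additions toward a %galactosylation set point.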

  7. Process-Based Quality (PBQ) Tools Development

    Energy Technology Data Exchange (ETDEWEB)

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts.

  8. Process of optimization of retail trade spatial development with application of location-allocation models

    Directory of Open Access Journals (Sweden)

    Kukrika Milan

    2008-01-01

    Full Text Available This article gives a brief overview of the structure and usage of location-allocation models in the planning of retail networks, pointing out the main shortcomings of the given models and the primary directions for their future improvement. We review their main uses and explain the basic factors that the models take into consideration during the process of demand allocation. Location-allocation models are an important segment of the process of optimizing the spatial development of a retail network, and their future improvement is moving towards approximation and integration with spatial-interaction models, which would enable a much better methodology for planning and directing the spatial development of trade in general. The methodology used in this research is based on the literature and on research projects in the area. Applying this methodology to the analysis of parts of Serbian territory with location-allocation models showed the need to create special software for calculating matrices with recursions. Considering the fact that the integration of location-allocation models with GIS has not yet occurred, all the results acquired during the calculations were brought into the ArcGIS 9.2 software and presented as maps.

  9. Assessing local population vulnerability to wind energy development with branching process models: an application to wind energy development

    Science.gov (United States)

    Erickson, Richard A.; Eager, Eric A.; Stanton, Jessica C.; Beston, Julie A.; Diffendorfer, James E.; Thogmartin, Wayne E.

    2015-01-01

    Quantifying the impact of anthropogenic development on local populations is important for conservation biology and wildlife management. However, these local populations are often subject to demographic stochasticity because of their small population size. Traditional modeling efforts such as population projection matrices do not consider this source of variation, whereas individual-based models, which include demographic stochasticity, are computationally intense and lack analytical tractability. One compromise between approaches is branching process models, because they accommodate demographic stochasticity and are easily calculated. These models are known within some sub-fields of probability and mathematical ecology but are not often applied in conservation biology and applied ecology. We applied branching process models to quantitatively compare and prioritize species locally vulnerable to the development of wind energy facilities. Specifically, we examined species vulnerability using branching process models for four representative species: a cave bat (a long-lived, low fecundity species), a tree bat (a short-lived, moderate fecundity species), a grassland songbird (a short-lived, high fecundity species), and an eagle (a long-lived, slow maturation species). Wind turbine-induced mortality has been observed for all of these species types, raising conservation concerns. We simulated different mortality rates from wind farms while calculating local extinction probabilities. The longer-lived species types (e.g., cave bats and eagles) had much more pronounced transitions from low extinction risk to high extinction risk than short-lived species types (e.g., tree bats and grassland songbirds). High-offspring-producing species types had a much greater variability in baseline risk of extinction than the lower-offspring-producing species types. Long-lived species types may appear stable until a critical level of incidental mortality occurs. After this threshold, the risk of
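
    One convenience of branching process models that makes such screening cheap is that the local extinction probability is the smallest fixed point q = f(q) of the offspring probability generating function. A minimal sketch assuming Poisson-distributed offspring (the parameter values are invented for illustration):

        import numpy as np

        def extinction_probability(mean_offspring, tol=1e-10, max_iter=200000):
            """Smallest fixed point of q = exp(m * (q - 1)) for Poisson(m) offspring."""
            q = 0.0
            for _ in range(max_iter):
                q_new = np.exp(mean_offspring * (q - 1.0))
                if abs(q_new - q) < tol:
                    break
                q = q_new
            return q

        # Turbine-induced mortality effectively lowers the mean offspring m;
        # extinction risk rises sharply as m approaches 1.
        for m in [1.5, 1.2, 1.05]:
            print(m, extinction_probability(m))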

  10. Development and Performance of a Highly Sensitive Model Formulation Based on Torasemide to Enhance Hot-Melt Extrusion Process Understanding and Process Development.

    Science.gov (United States)

    Evans, Rachel C; Kyeremateng, Samuel O; Asmus, Lutz; Degenhardt, Matthias; Rosenberg, Joerg; Wagner, Karl G

    2018-02-27

    The aim of this work was to investigate the use of torasemide as a highly sensitive indicator substance and to develop a formulation thereof for establishing quantitative relationships between hot-melt extrusion process conditions and critical quality attributes (CQAs). Using solid-state characterization techniques and a 10 mm lab-scale co-rotating twin-screw extruder, we studied torasemide in a Soluplus® (SOL)-polyethylene glycol 1500 (PEG 1500) matrix, and developed and characterized a formulation which was used as a process indicator to study thermal- and hydrolysis-induced degradation, as well as residual crystallinity. We found that torasemide first dissolved into the matrix and then degraded. Based on this mechanism, extrudates with measurable levels of degradation and residual crystallinity were produced, depending strongly on the main barrel and die temperature and residence time applied. In addition, we found that 10% w/w PEG 1500 as plasticizer resulted in the widest operating space with the widest range of measurable residual crystallinity and degradant levels. Torasemide as an indicator substance behaves like a challenging-to-process API, only with higher sensitivity and more pronounced effects, e.g., degradation and residual crystallinity. Application of a model formulation containing torasemide will enhance the understanding of the dynamic environment inside an extruder and elucidate the cumulative thermal and hydrolysis effects of the extrusion process. The use of such a formulation will also facilitate rational process development and scaling by establishing clear links between process conditions and CQAs.

  11. Letter Report: Progress in developing EQ3/6 for modeling boiling processes

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T. J., LLNL

    1995-08-28

    EQ3/6 is a software package for geochemical modeling of aqueous systems, such as water/rock or waste/water/rock. It is being developed for a variety of applications in geochemical studies for the Yucca Mountain Site Characterization Project. The present focus is on development of capabilities to be used in studies of geochemical processes which will take place in the near-field environment and the altered zone of the potential repository. We have completed the first year of a planned two-year effort to develop capabilities for modeling boiling processes. These capabilities will interface with other existing and future modeling capabilities to provide a means of integrating the effects of various kinds of geochemical processes in complex systems. This year, the software has been modified to allow the formation of a generalized gas phase in a closed system for which the temperature and pressure are known (but not necessarily constant). The gas phase forms when its formation is thermodynamically favored; that is, when the system pressure is equal to the sum of the partial pressures of the gas species as computed from their equilibrium fugacities. It disappears when this sum falls below that pressure. 'Boiling' is the special case in which the gas phase which forms consists mostly of water vapor. The reverse process is then 'condensation.' To support calculations of boiling and condensation, we have added a capability to calculate the fugacity coefficients of gas species in the system H2O-CO2-CH4-H2-O2-N2-H2S-NH3. This capability at present is accurate only at relatively low pressures, but is adequate for all likely repository boiling conditions. We have also modified the software to calculate changes in enthalpy (heat) and volume functions. Next year we will be extending the boiling capability to calculate the pressure or the temperature at known enthalpy. We will also add an option for open-system boiling.
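
    In equation form, the criterion described in the abstract is that a gas phase appears when the sum of the equilibrium partial pressures reaches the system pressure (standard thermodynamics, paraphrased rather than quoted from the code):

        \sum_i p_i = \sum_i \frac{f_i}{\phi_i} \ \geq\ P,

    where f_i is the equilibrium fugacity of gas species i, φ_i its fugacity coefficient, and P the total pressure; boiling is the special case in which the water-vapor term dominates the sum, and condensation is the reverse.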

  12. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among the various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and to perform postmortem assessments.
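
    SEPS itself is JPL software, but the feedback principle it rests on can be conveyed with a toy system-dynamics loop in which defects flow back into the remaining work (all rates below are invented for illustration):

        # Toy system-dynamics sketch of a software project (Euler integration).
        tasks_remaining = 1000.0
        staff, productivity = 10.0, 1.0   # tasks per person-day
        error_rate, dt, day = 0.15, 1.0, 0

        while tasks_remaining > 1.0:
            completed = staff * productivity * dt
            rework = error_rate * completed   # defects re-enter the backlog
            tasks_remaining += rework - completed
            day += 1

        print(f"finished in about {day} days")

    Raising the error rate or cutting staff mid-project immediately shifts the schedule, which is the kind of policy trade-off a simulator like SEPS is designed to expose.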

  13. Using process algebra to develop predator-prey models of within-host parasite dynamics.

    Science.gov (United States)

    McCaig, Chris; Fenton, Andy; Graham, Andrea; Shankland, Carron; Norman, Rachel

    2013-07-21

    As a first approximation of immune-mediated within-host parasite dynamics we can consider the immune response as a predator, with the parasite as its prey. In the ecological literature of predator-prey interactions there are a number of different functional responses used to describe how a predator reproduces in response to consuming prey. Until recently most of the models of the immune system that have taken a predator-prey approach have used simple mass action dynamics to capture the interaction between the immune response and the parasite. More recently Fenton and Perkins (2010) employed three of the most commonly used prey-dependent functional response terms from the ecological literature. In this paper we make use of a technique from computing science, process algebra, to develop mathematical models. The novelty of the process algebra approach is to allow stochastic models of the population (parasite and immune cells) to be developed from rules of individual cell behaviour. By using this approach in which individual cellular behaviour is captured we have derived a ratio-dependent response similar to that seen in the previous models of immune-mediated parasite dynamics, confirming that, whilst this type of term is controversial in ecological predator-prey models, it is appropriate for models of the immune system. Copyright © 2013 Elsevier Ltd. All rights reserved.
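
    For context, the contrast at stake can be written out (standard ecological notation, not quoted from the paper): a prey-dependent Holling type II response has the form

        f(N) = \frac{aN}{1 + ahN},

    while the ratio-dependent counterpart replaces the prey density N by the prey-to-predator ratio N/P (here, parasite density per immune effector):

        f(N, P) = \frac{a(N/P)}{1 + ah(N/P)} = \frac{aN}{P + ahN}.

    The paper's process-algebra derivation from individual cell behaviour arrives at a response of the latter, ratio-dependent type.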

  14. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  15. Parametric Cost Modeling of Space Missions Using the Develop New Projects (DNP) Implementation Process

    Science.gov (United States)

    Rosenberg, Leigh; Hihn, Jairus; Roust, Kevin; Warfield, Keith

    2000-01-01

    This paper presents an overview of a parametric cost model that has been built at JPL to estimate the costs of future deep space robotic science missions. Due to the recent dramatic changes in JPL business practices brought about by an internal reengineering effort known as Develop New Products (DNP), high-level historic cost data is no longer considered analogous to future missions. Therefore, the historic data is of little value in forecasting costs for projects developed using the DNP process. This has led to the development of an approach for obtaining expert opinion and for combining actual data with expert opinion to provide a cost database for future missions. In addition, the DNP cost model relies as far as possible on objective cost drivers, which reduces the likelihood of model input error. Version 2, now under development, expands the model capabilities, links it more tightly with key design technical parameters, and is grounded in more rigorous statistical techniques. The challenges faced in building this model are discussed, as well as its background, development approach, status, validation, and future plans.

  16. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia

    2017-01-01

    Full Text Available Given the highly diverse tasks which employees have to fulfil due to the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products which meet market expectations. Therefore, organizations encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of the aforementioned process, which allows enterprises to develop a specific methodology. In order to design the conceptual model, Business Process Modelling instruments were employed - the Adonis Community Edition Business Process Management Toolkit using the ADONIS BPMS Notation. The conceptual model was developed based on in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended considering the enterprise's requirements when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling as they enable process analysis and evaluation (predefined/specific queries) and also model optimization (simulations).

  17. Cost Models for MMC Manufacturing Processes

    Science.gov (United States)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes and the resulting predicted quality-cost curves are presented and discussed.

  18. Developing a Data Driven Process-Based Model for Remote Sensing of Ecosystem Production

    Science.gov (United States)

    Elmasri, B.; Rahman, A. F.

    2010-12-01

    Estimating ecosystem carbon fluxes at various spatial and temporal scales is essential for quantifying the global carbon cycle. Numerous models have been developed for this purpose using several environmental variables as well as vegetation indices derived from remotely sensed data. Here we present a data-driven modeling approach for gross primary production (GPP) that is based on the process-based model BIOME-BGC. The proposed model was run using available remote sensing data and does not depend on look-up tables. Furthermore, this approach combines the merits of both empirical and process models, and empirical models were used to estimate certain input variables such as light use efficiency (LUE). This was achieved by applying remotely sensed data to the mathematical equations that represent biophysical photosynthesis processes in the BIOME-BGC model. Moreover, a new spectral index for estimating maximum photosynthetic activity, the maximum photosynthetic rate index (MPRI), is also developed and presented here. This new index is based on the ratio between the near-infrared and the green bands (ρ858.5/ρ555). The model was tested and validated against the MODIS GPP product and flux measurements from two eddy covariance flux towers located at Morgan Monroe State Forest (MMSF) in Indiana and Harvard Forest in Massachusetts. Satellite data acquired by the Advanced Microwave Scanning Radiometer (AMSR-E) and MODIS were used. The data-driven model showed a strong correlation between the predicted and measured GPP at the two eddy covariance flux tower sites. This methodology produced better predictions of GPP than did the MODIS GPP product, and the proportion of error in the predicted GPP for MMSF and Harvard Forest was dominated by unsystematic errors, suggesting that the results are unbiased. The analysis indicated that maintenance respiration is one of the main factors that dominate the overall model outcome errors and improvement in maintenance respiration estimation
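
    In the notation of the abstract, the proposed index is simply the band ratio

        \mathrm{MPRI} = \frac{\rho_{858.5}}{\rho_{555}},

    i.e. near-infrared reflectance divided by green reflectance, used as a proxy for maximum photosynthetic rate within the model's light-use-efficiency component.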

  19. Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.

    Science.gov (United States)

    Tute, Erik; Steiner, Jochen

    2018-01-01

    Literature describes a big potential for the reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means for that. The objective was to support the management and maintenance of the processes extracting, transforming and loading (ETL) data into CDWHs, and to ease the reuse of metadata between regular IT management, the CDWH and secondary data users, by providing a modeling approach. An expert survey and a literature review were conducted to find requirements and existing modeling techniques. An ETL modeling technique was developed by extending existing modeling techniques, and was evaluated by exemplarily modeling an existing ETL process and by a second expert survey. Nine experts participated in the first survey. The literature review yielded 15 included publications, and six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated; seven experts participated in the evaluation. The developed approach can help in the management and maintenance of ETL processes and could serve as an interface between regular IT management, the CDWH and secondary data users.

  20. Self-optimisation and model-based design of experiments for developing a C–H activation flow process

    Directory of Open Access Journals (Sweden)

    Alexander Echtermeyer

    2017-01-01

    Full Text Available A recently described C(sp3)–H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the fewest experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled rapid generation of a process model.

  1. A Biopsychological Model of Anti-drug PSA Processing: Developing Effective Persuasive Messages.

    Science.gov (United States)

    Hohman, Zachary P; Keene, Justin Robert; Harris, Breanna N; Niedbala, Elizabeth M; Berke, Collin K

    2017-11-01

    For the current study, we developed and tested a biopsychological model to combine research on psychological tension, the Limited Capacity Model of Motivated Mediated Message Processing, and the endocrine system to predict and understand how people process anti-drug PSAs. We predicted that co-presentation of pleasant and unpleasant information, vs. solely pleasant or unpleasant, will trigger evaluative tension about the target behavior in persuasive messages and result in a biological response (increase in cortisol, alpha amylase, and heart rate). In experiment 1, we assessed the impact of co-presentation of pleasant and unpleasant information in persuasive messages on evaluative tension (conceptualized as attitude ambivalence), in experiment 2, we explored the impact of co-presentation on endocrine system responses (salivary cortisol and alpha amylase), and in experiment 3, we assessed the impact of co-presentation on heart rate. Across all experiments, we demonstrated that co-presentation of pleasant and unpleasant information, vs. solely pleasant or unpleasant, in persuasive communications leads to increases in attitude ambivalence, salivary cortisol, salivary alpha amylase, and heart rate. Taken together, the results support the initial paths of our biopsychological model of persuasive message processing and indicate that including both pleasant and unpleasant information in a message impacts the viewer. We predict that increases in evaluative tension and biological responses will aid in memory and cognitive processing of the message. However, future research is needed to test that hypothesis.

  2. Process-Based Quality (PBQ) Tools Development

    International Nuclear Information System (INIS)

    Cummins, J.L.

    2001-01-01

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts

  3. Transition management as a model for managing processes of co-evolution towards sustainable development

    NARCIS (Netherlands)

    R. Kemp (René); D.A. Loorbach (Derk); J. Rotmans (Jan)

    2007-01-01

    Sustainable development requires changes in socio-technical systems and wider societal change - in beliefs, values and governance that co-evolve with technology changes. In this article we present a practical model for managing processes of co-evolution: transition management. Transition

  4. High-throughput micro-scale cultivations and chromatography modeling: Powerful tools for integrated process development.

    Science.gov (United States)

    Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen

    2015-10-01

    Upstream processes are rather complex to design and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimized system need not be the one with the highest product titers, but the one resulting in an overall superior process performance in up- and downstream processing. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using a chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capturing were validated subsequently. The overall best system was chosen based on a combination of excellent up- and downstream performance. © 2015 Wiley Periodicals, Inc.

  5. Modeling closed nuclear fuel cycles processes

    Energy Technology Data Exchange (ETDEWEB)

    Shmidt, O.V. [A.A. Bochvar All-Russian Scientific Research Institute for Inorganic Materials, Rogova, 5a street, Moscow, 123098 (Russian Federation); Makeeva, I.R. [Zababakhin All-Russian Scientific Research Institute of Technical Physics, Vasiliev street 13, Snezhinsk, Chelyabinsk region, 456770 (Russian Federation); Liventsov, S.N. [Tomsk Polytechnic University, Tomsk, Lenin Avenue, 30, 634050 (Russian Federation)

    2016-07-01

    Computer models of processes are necessary for determining optimal operating conditions for closed nuclear fuel cycle (NFC) processes, and they can be quickly updated in accordance with fresh data from experimental research. Three kinds of process simulation are necessary. First, the VIZART software package supports balance model development, used for calculating the material flow in technological processes; VIZART takes into account equipment capacity, transport lines and storage volumes. Secondly, it is necessary to simulate the physico-chemical processes that are involved in the closure of the NFC. The third kind of simulation is the development of software that allows the optimization, diagnostics and control of the processes, which implies real-time simulation of product flows for the whole plant or for separate lines of the plant. (A.C.)

  6. Using the Knowledge, Process, Practice (KPP) model for driving the design and development of online postgraduate medical education.

    Science.gov (United States)

    Shaw, Tim; Barnet, Stewart; Mcgregor, Deborah; Avery, Jennifer

    2015-01-01

    Online learning is a primary delivery method for continuing health education programs. It is critical that programs have curricula objectives linked to educational models that support learning. Using a proven educational modelling process ensures that curricula objectives are met and that a solid basis for learning and assessment is achieved. The aim was to develop an educational design model that produces an educationally sound program development plan for use by anyone involved in online course development. We describe the development of a generic educational model designed for continuing health education programs. The Knowledge, Process, Practice (KPP) model is founded on recognised educational theory and online education practice. This paper presents a step-by-step guide to using the model for program development that supports reliable learning and evaluation. The model follows a three-step approach, KPP, based on learning outcomes and supporting appropriate assessment activities. It provides a program structure for online or blended learning that is explicit, educationally defensible, and supports multiple assessment points for health professionals. The KPP model is based on best-practice educational design, using a structure that can be adapted for a variety of online or flexibly delivered postgraduate medical education programs.

  7. Measuring similarity between business process models

    NARCIS (Netherlands)

    Dongen, van B.F.; Dijkman, R.M.; Mendling, J.

    2007-01-01

    Quality aspects become increasingly important when business process modeling is used in a large-scale enterprise setting. In order to facilitate a storage without redundancy and an efficient retrieval of relevant process models in model databases it is required to develop a theoretical understanding

  8. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    , by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction...

  9. Adapting the unified software development process for user interface development

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2006-01-01

    In this paper we describe how existing software developing processes, such as Rational Unified Process, can be adapted in order to allow disciplined and more efficient development of user interfaces. The main objective of this paper is to demonstrate that standard modeling environments, based on the

  10. Exploring C-water-temperature interactions and non-linearities in soils through developments in process-based models

    Science.gov (United States)

    Esteban Moyano, Fernando; Vasilyeva, Nadezda; Menichetti, Lorenzo

    2016-04-01

    Soil carbon models developed over the last couple of decades are limited in their capacity to accurately predict the magnitudes and temporal variations of observed carbon fluxes and stocks. New process-based models are now emerging that attempt to address the shortcomings of their simpler, empirical counterparts. While a spectrum of ideas and hypothetical mechanisms is finding its way into new models, the addition of only a few processes known to significantly affect soil carbon (e.g. enzymatic decomposition, adsorption, Michaelis-Menten kinetics) has shown the potential to resolve a number of previous model-data discrepancies (e.g. priming, Birch effects). Through model-data validation, such models are a means of testing hypothetical mechanisms. In addition, they can lead to new insights into what soil carbon pools are and how they respond to external drivers. In this study we develop a model of soil carbon dynamics based on enzymatic decomposition and other key features of process-based models, i.e. simulation of carbon in particulate, soluble and adsorbed states, as well as enzyme and microbial components. Here we focus on understanding how moisture affects C decomposition at different levels, both directly (e.g. by limiting diffusion) and through interactions with other components. As the medium where most reactions and transport take place, water is central in every aspect of soil C dynamics. We compare results from a number of alternative models with experimental data in order to test different processes and parameterizations. Among other observations, we try to understand: 1. typical moisture response curves and associated temporal changes, 2. moisture-temperature interactions, and 3. diffusion effects under changing C concentrations. While the model aims at being a process-based approach and at simulating fluxes at short time scales, it remains a simplified representation using the same inputs as classical soil C models, and is thus potentially
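
    A minimal sketch of the enzymatic-decomposition core described here, assuming Michaelis-Menten kinetics with a simple moisture modifier (parameter values are illustrative, not those of the authors' model):

        def decomposition_rate(c_soluble, enzymes, moisture, v_max=1.0, km=50.0):
            """Michaelis-Menten decomposition scaled by a diffusion-type moisture modifier."""
            f_theta = moisture ** 1.5                 # assumed moisture limitation, 0..1
            return v_max * enzymes * f_theta * c_soluble / (km + c_soluble)

        c, e = 100.0, 2.0                             # soluble C and enzyme pools (gC/m2)
        for theta in (0.1, 0.3, 0.6, 0.9):            # relative volumetric moisture
            print(f"moisture={theta:.1f}  rate={decomposition_rate(c, e, theta):.3f} gC/m2/d")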

  11. Online Rule Generation Software Process Model

    OpenAIRE

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    For production systems such as expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required to be executed for the development of a web-based software model for decision rule generation. Royce's final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of various steps of the modified wat...
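
    The core step such a system automates, extracting decision rules from a fitted tree, can be sketched with scikit-learn (shown here only as one possible implementation choice, not the stack used in the paper):

        from sklearn.datasets import load_iris
        from sklearn.tree import DecisionTreeClassifier, export_text

        data = load_iris()
        tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(data.data, data.target)
        # export the fitted branches as human-readable decision rules
        print(export_text(tree, feature_names=list(data.feature_names)))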

  12. Models of neural dynamics in brain information processing - the developments of 'the decade'

    Energy Technology Data Exchange (ETDEWEB)

    Borisyuk, G N; Borisyuk, R M; Kazanovich, Yakov B [Institute of Mathematical Problems of Biology, Russian Academy of Sciences, Pushchino, Moscow region (Russian Federation); Ivanitskii, Genrikh R [Institute for Theoretical and Experimental Biophysics, Russian Academy of Sciences, Pushchino, Moscow region (Russian Federation)

    2002-10-31

    Neural network models are discussed that have been developed during the last decade with the purpose of reproducing spatio-temporal patterns of neural activity in different brain structures. The main goal of the modeling was to test hypotheses of synchronization, temporal and phase relations in brain information processing. The models being considered are those of temporal structure of spike sequences, of neural activity dynamics, and oscillatory models of attention and feature integration. (reviews of topical problems)

  13. Process mining using BPMN: relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; van der Aalst, W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2017-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  14. Process mining using BPMN : relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; Aalst, van der W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2015-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  15. Developing mathematical modelling competence

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Jensen, Tomas Højgaard

    2003-01-01

    In this paper we introduce the concept of mathematical modelling competence, by which we mean being able to carry through a whole mathematical modelling process in a certain context. Analysing the structure of this process, six sub-competences are identified. Mathematical modelling competence...... cannot be reduced to these six sub-competences, but they are necessary elements in the development of mathematical modelling competence. Experience from the development of a modelling course is used to illustrate how the different nature of the sub-competences can be used as a tool for finding...... the balance between different kinds of activities in a particular educational setting. Obstacles of social, cognitive and affective nature for the students' development of mathematical modelling competence are reported and discussed in relation to the sub-competences....

  16. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    Science.gov (United States)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
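
    A toy illustration of the coupling idea: a discrete-event task loop whose durations depend on a continuously updated, system-dynamics-style state (all numbers are invented for illustration):

        def simulate(n_tasks=6, base_hours=10.0):
            clock, morale = 0.0, 1.0                  # morale: continuous (system-dynamics) state
            for task in range(n_tasks):
                duration = base_hours / morale        # event duration depends on morale
                clock += duration                     # discrete event: task completion
                morale = max(0.5, morale - 0.07)      # sustained pressure erodes morale
                print(f"task {task}: done at t={clock:6.2f} h (morale now {morale:.2f})")

        simulate()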

  17. Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development

    Science.gov (United States)

    Xu, Fei; Zhang, Yaning; Jin, Guangri; Li, Bingxi; Kim, Yong-Song; Xie, Gongnan; Fu, Zhongbin

    2018-04-01

    A three-phase model capable of predicting the heat transfer and moisture migration during the soil freezing process was developed based on the Shen-Chen model and the mechanisms of heat and mass transfer in unsaturated freezing soil. The pre-melted film was taken into consideration, and the relationship between film thickness and soil temperature was used to calculate the liquid water fraction in both the frozen zone and the freezing fringe. The force that causes the moisture migration was calculated as the sum of several interactive forces, and the suction in the pre-melted film was regarded as an interactive force between ice and water. Two kinds of resistance were treated as body forces related to the water films between the ice grains and soil grains, and a blocking force was introduced to balance gravity before soil freezing. The lattice Boltzmann method was used in the simulation, and the input variables included the size of the computational domain, obstacle fraction, liquid water fraction, air fraction and soil porosity. The model is capable of predicting the water content distribution along soil depth and the variations in water content and temperature during the soil freezing process.

  18. Development and evaluation of spatial point process models for epidermal nerve fibers.

    Science.gov (United States)

    Olsbo, Viktor; Myllymäki, Mari; Waller, Lance A; Särkkä, Aila

    2013-06-01

    We propose two spatial point process models for the spatial structure of epidermal nerve fibers (ENFs) across human skin. The models derive from two point processes, Φb and Φe, describing the locations of the base and end points of the fibers. Each point of Φe (the end point process) is connected to a unique point in Φb (the base point process). In the first model, both Φe and Φb are Poisson processes, yielding a null model of uniform coverage of the skin by end points and general baseline results and reference values for moments of key physiologic indicators. The second model provides a mechanistic model to generate end points for each base, and we model the branching structure more directly by defining Φe as a cluster process conditioned on the realization of Φb as its parent points. In both cases, we derive distributional properties for observable quantities of direct interest to neurologists such as the number of fibers per base, and the direction and range of fibers on the skin. We contrast both models by fitting them to data from skin blister biopsy images of ENFs and provide inference regarding physiological properties of ENFs. Copyright © 2013 Elsevier Inc. All rights reserved.
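
    The two-layer structure of the models can be sketched as follows, with a Poisson base process Φb and clustered end points Φe attached to each base (intensities and displacement scale are illustrative, not fitted values):

        import numpy as np

        rng = np.random.default_rng(1)
        n_base = rng.poisson(50.0)                          # Poisson number of bases on the unit square
        bases = rng.uniform(0.0, 1.0, size=(n_base, 2))     # Phi_b: base point process

        ends = []                                           # Phi_e: cluster process given Phi_b
        for b in bases:
            for _ in range(rng.poisson(2.0)):               # mean 2 fibers per base
                ends.append(b + rng.normal(0.0, 0.02, size=2))  # short fiber displacement
        print(f"{n_base} base points, {len(ends)} end points")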

  19. Hybrid modeling as a QbD/PAT tool in process development: an industrial E. coli case study.

    Science.gov (United States)

    von Stosch, Moritz; Hamelink, Jan-Martijn; Oliveira, Rui

    2016-05-01

    Process understanding is emphasized in the process analytical technology initiative and the quality by design paradigm as essential for the manufacturing of biopharmaceutical products with consistently high quality. A typical approach to developing process understanding is to combine design of experiments with statistical data analysis. Hybrid semi-parametric modeling is investigated here as an alternative to purely statistical data analysis. The hybrid model framework provides the flexibility to select model complexity based on available data and knowledge. Here, a parametric dynamic bioreactor model is integrated with a nonparametric artificial neural network that describes biomass and product formation rates as functions of varied fed-batch fermentation conditions for high cell density heterologous protein production with E. coli. The model can accurately describe biomass growth and product formation across variations in induction temperature, pH and feed rates. The model indicates that while the product expression rate is a function of early induction phase conditions, it is negatively impacted as productivity increases. This could correspond with physiological changes due to cytoplasmic product accumulation. Due to the dynamic nature of the model, rational process timing decisions can be made, and the impact of temporal variations in process parameters on product formation and process performance can be assessed, which is central for process understanding.
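
    A minimal sketch of the hybrid structure, assuming a parametric fed-batch balance whose specific rates come from a black-box regressor standing in for the trained neural network (the rate function and all numbers are invented):

        import numpy as np

        def rates_blackbox(temperature, ph, feed_rate):
            """Placeholder for the trained ANN: returns (mu, q_p) for given conditions."""
            mu = 0.2 * np.exp(-((temperature - 30.0) / 5.0) ** 2)   # 1/h
            return mu, 0.05 * mu                                     # g product / gX / h

        def fed_batch(hours=20.0, dt=0.1, temp=30.0, ph=7.0, feed=0.1):
            x, p, v = 1.0, 0.0, 1.0                  # biomass g/L, product g/L, volume L
            for _ in np.arange(0.0, hours, dt):
                mu, q_p = rates_blackbox(temp, ph, feed)
                x += (mu * x - feed / v * x) * dt    # parametric balance with dilution
                p += (q_p * x - feed / v * p) * dt
                v += feed * dt
            return x, p

        print("final biomass %.2f g/L, product %.3f g/L" % fed_batch())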

  20. Deferred Action: Theoretical model of process architecture design for emergent business processes

    Directory of Open Access Journals (Sweden)

    Patel, N.V.

    2007-01-01

    E-business modelling and e-business systems development assume fixed company resources, structures, and business processes. Empirical and theoretical evidence suggests that company resources and structures are emergent rather than fixed. Planning business activity in emergent contexts requires flexible e-business models based on better management theories and models. This paper builds and proposes a theoretical model of e-business systems capable of catering for emergent factors that affect business processes. Drawing on the development of theories of the 'action and design' class, the Theory of Deferred Action is invoked as the base theory for the theoretical model. A theoretical model of flexible process architecture is presented by identifying its core components and their relationships, and then illustrated with exemplar flexible process architectures capable of responding to emergent factors. Managerial implications of the model are considered and the model's generic applicability is discussed.

  1. Mechanistic Fermentation Models for Process Design, Monitoring, and Control

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads Orla

    2017-01-01

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted...... for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool...... for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding....
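
    The mechanistic core that such models reuse across planning, design, monitoring and control can be sketched with textbook Monod kinetics in a batch (generic parameter values, not those of the article):

        mu_max, ks, y_xs = 0.4, 0.5, 0.5      # 1/h, g/L, gX/gS (textbook values)
        x, s, dt = 0.1, 20.0, 0.05            # biomass g/L, substrate g/L, step h
        for _ in range(int(30 / dt)):
            mu = mu_max * s / (ks + s)        # Monod specific growth rate
            dx = mu * x * dt
            x, s = x + dx, max(s - dx / y_xs, 0.0)
        print(f"after 30 h: biomass {x:.2f} g/L, residual substrate {s:.2f} g/L")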

  2. Modeling Business Processes in Public Administration

    Science.gov (United States)

    Repa, Vaclav

    During more than 10 years of its existence, business process modeling has become a regular part of organization management practice. It is mostly regarded as a part of information system development or even as a way to implement some supporting technology (for instance a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is methodology, which postulates that information systems development provide business process management with exact methods and tools for modeling business processes. The methodology underlying the approach presented in this paper also has its roots in information systems development methodology.

  3. Mathematical model development of heat and mass exchange processes in the outdoor swimming pool

    OpenAIRE

    M. V. Shaptala; D. E. Shaptala

    2014-01-01

    Purpose. Currently the exploitation of outdoor swimming pools is often not cost-effective and, despite their relevance, such pools are closed in large quantities. At this time there is no complete mathematical model which would allow a qualitative assessment of the effect of energy-saving measures. The aim of this work is to develop a mathematical model of heat and mass exchange processes for calculating the basic heat and mass losses that occur during a pool's exploitation. Methodology. The m...

  4. Software Process Validation: Quantitatively Measuring the Correspondence of a Process to a Model

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1997-01-01

    .... When process models and process executions diverge, something significant is happening. The authors have developed techniques for uncovering and measuring the discrepancies between models and executions, which they call process validation...

  5. Process development and modeling of fluidized-bed reactor with coimmobilized biocatalyst for fuel ethanol production

    Science.gov (United States)

    Sun, May Yongmei

    This research focuses on two steps of commercial fuel ethanol production processes: the starch hydrolysis process and the fermentation process. The goal of this research is to evaluate the performance of co-immobilized biocatalysts in a fluidized bed reactor (FBR) with emphasis on economic and engineering aspects and to develop a predictive mathematical model for this system. The productivity of an FBR is higher than that of a traditional batch reactor or CSTR. Fluidized beds offer great advantages over packed beds for immobilized cells when small particles are used or when the reactant feed contains suspended solids. Plugging problems, excessive pressure drops (and thus attrition), or crushing risks may be avoided. No mechanical stirring is required, as mixing occurs due to the natural turbulence of the fluidized process. Both enzyme and microorganism are immobilized in one catalyst bead, which is called co-immobilization. Inside this biocatalyst matrix, starch is hydrolyzed by the enzyme glucoamylase to form glucose, which is then converted to ethanol and carbon dioxide by microorganisms. Two biocatalysts were evaluated: (1) co-immobilized yeast strain Saccharomyces cerevisiae and glucoamylase; (2) co-immobilized Zymomonas mobilis and glucoamylase. A co-immobilized biocatalyst accomplishes simultaneous saccharification and fermentation (the SSF process). When compared to a two-step process involving separate saccharification and fermentation stages, the SSF process had productivity values twice those given by the pre-saccharified process when the time required for pre-saccharification (15--25 h) was taken into account. The SSF process should also save capital cost. Information about productivity, fermentation yield, concentration profiles along the bed, ethanol inhibition, etc., was obtained from the experimental data. For the yeast system, experimental results showed that no apparent decrease of productivity occurred after two and a half months; the productivity
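
    A toy balance for the SSF idea, assuming first-order hydrolysis and first-order glucose uptake inside the co-immobilized bead (rate constants are illustrative only):

        k_h, k_f, y_pe = 0.30, 0.50, 0.51     # 1/h hydrolysis, 1/h uptake, g EtOH/g glucose
        starch, glucose, ethanol, dt = 100.0, 0.0, 0.0, 0.05
        for _ in range(int(24 / dt)):
            d_s = k_h * starch * dt           # glucoamylase: starch -> glucose
            d_g = k_f * glucose * dt          # microorganism: glucose -> ethanol + CO2
            starch -= d_s
            glucose += 1.11 * d_s - d_g       # 1.11 g glucose per g starch hydrolyzed
            ethanol += y_pe * d_g
        print(f"24 h: starch {starch:.1f}, glucose {glucose:.1f}, ethanol {ethanol:.1f} g/L")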

  6. Derivative Process Model of Development Power in Industry: Empirical Research and Forecast for Chinese Software Industry and US Economy

    OpenAIRE

    Feng Dai; Bao-hua Sun; Jie Sun

    2004-01-01

    Based on the concept and theory of Development Power [1], this paper analyzes the transferability and diffusibility of industrial development power, points out that chaos is the extreme of DP release and order the highest degree of DP accumulation, puts forward A-C strength, an index of adjusting and controlling strength, and sets up the derivative process model for industrial development power on the Partial Distribution [2]-[4]. By the derivative process model, a kind of time seri...

  7. Recent Developments in Multiscale and Multiphase Modelling of the Hydraulic Fracturing Process

    Directory of Open Access Journals (Sweden)

    Yong Sheng

    2015-01-01

    Recently the hydraulic fracturing of rocks has received much attention, not only for its economic importance but also for its potential environmental impact. The hydraulic fracturing technique has been widely used in the oil (EOR) and gas (EGR) industries, especially in the USA, to extract more oil/gas through deep rock formations. There has also been increasing interest in utilising the hydraulic fracturing technique in the geological storage of CO2 in recent years. In all cases, the design and implementation of the hydraulic fracturing process play a central role, highlighting the significance of research and development of this technique. However, the uncertainty behind the fracking mechanism has triggered public debates regarding the possible effect of this technique on human health and the environment. This has presented new challenges in the study of the hydraulic fracturing process. This paper describes the hydraulic fracturing mechanism and provides an overview of past and recent developments of the research performed towards better understanding of hydraulic fracturing and its potential impacts, with particular emphasis on the development of modelling techniques and their application to hydraulic fracturing.

  8. Assessment and Development of Engineering Design Processes

    DEFF Research Database (Denmark)

    Ulrikkeholm, Jeppe Bjerrum

    , the engineering companies need to have efficient engineering design processes in place, so they can design customised product variants faster and more efficiently. It is however not an easy task to model and develop such processes. To conduct engineering design is often a highly iterative, ill-defined and complex...... the process can be fully understood and eventually improved. Taking its starting point in this proposition, the outcome of the research is an operational 5-phased procedure for assessing and developing engineering design processes through integrated modelling of product and process, designated IPPM......, and eventually the results are discussed, overall conclusions are made and future research is proposed. The results produced throughout the research project are developed in close collaboration with the Marine Low Speed business unit within the company MAN Diesel & Turbo. The business unit is the world market...

  9. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    The paper deals with some current problems of modeling the dynamics of the subject-features development of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Offered are two steps for singling out a particular stage and the algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretical research, analyses of educational practices and for realistic predicting of pedagogical phenomena.

  10. Modeling biochemical transformation processes and information processing with Narrator.

    Science.gov (United States)

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport and transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is

  11. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  12. Development and Testing of a Model for Simulation of Process Operators' During Emergencies in Nuclear Power Plants

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1986-01-01

    The paper describes a program for the development and testing of a model of cognitive processes intended for simulation of operator responses to plant disturbances. It will be a part of a computer program complex called DYLAM for automatic identification of accident scenarios to be included...... to develop this data base is proposed. The human element is introduced in the model by a perturbation function derived from human error modes. A program for testing the model is briefly mentioned.

  13. Diagnosing differences between business process models

    NARCIS (Netherlands)

    Dijkman, R.M.; Dumas, M.; Reichert, M.; Shan, M.-C.

    2008-01-01

    This paper presents a technique to diagnose differences between business process models in the EPC notation. The diagnosis returns the exact position of a difference in the business process models and diagnoses the type of a difference, using a typology of differences developed in previous work.

  14. Study of Research and Development Processes through Fuzzy Super FRM Model and Optimization Solutions

    Directory of Open Access Journals (Sweden)

    Flavius Aurelian Sârbu

    2015-01-01

    The aim of this study is to measure resources for R&D (research and development) at the regional level in Romania and to obtain primary data that will be important in making the right decisions to increase competitiveness and development based on a knowledge economy. As motivation, we would like to emphasize that by using the Super Fuzzy FRM model we want to determine the state of R&D processes at the regional level by a means different from the statistical survey, while the two optimization methods provide optimization solutions for the R&D actions of enterprises. Therefore, to fulfil the above-mentioned aim, in this application-oriented paper we use a questionnaire and, for the interpretation of the results, the Super Fuzzy FRM model, which represents the main novelty of our paper: this theory provides a formalism based on matrix calculus, which allows the processing of large volumes of information and delivers results difficult or impossible to obtain through statistical processing. A further novelty of the paper is the optimization solutions presented for the situation in which the sales price is variable and the quantity sold constant in time, and for the reverse situation.

  15. Developing a Steady-state Kinetic Model for Industrial Scale Semi-Regenerative Catalytic Naphtha Reforming Process

    Directory of Open Access Journals (Sweden)

    Seif Mohaddecy, R.

    2014-05-01

    Due to the demand for high-octane gasoline as a transportation fuel, the catalytic naphtha reformer has become one of the most important processes in petroleum refineries. In this research, steady-state modelling of a catalytic fixed-bed naphtha reforming process to predict the key output variables was studied. These variables were octane number, yield, hydrogen purity, and the temperature of all reforming reactors. To this end, an industrial-scale semi-regenerative catalytic naphtha reforming unit was studied and modelled. In addition, to evaluate the developed model, the predicted variables (i.e. outlet temperatures of the reactors, research octane number, gasoline yield and hydrogen purity) were compared against actual data. The results showed a close mapping between the actual and predicted variables; the mean relative absolute deviations of the mentioned process variables were 0.38 %, 0.52 %, 0.54 %, 0.32 %, 4.8 % and 3.2 %, respectively.

  16. Analyzing empowerment oriented email consultation for parents : Development of the Guiding the Empowerment Process model

    NARCIS (Netherlands)

    dr. Christa C.C. Nieuwboer

    2014-01-01

    Background. Online consultation is increasingly offered by parenting practitioners, but it is not clear whether it is feasible to provide empowerment-oriented support in single-session email consultation. Method. Based on empowerment theory we developed the Guiding the Empowerment Process model (GEP

  17. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2004-01-01

    This paper presents a process-group-contribution method to model, simulate and synthesize a flowsheet. The process-group based representation of a flowsheet together with a process "property" model are presented. The process-group based synthesis method is developed on the basis of the computer...... aided molecular design methods and gives the ability to screen numerous process alternatives without the need to use rigorous process simulation models. The process "property" model calculates the design targets for the generated flowsheet alternatives while a reverse modelling method (also...... developed) determines the design variables matching the target. A simple illustrative example highlighting the main features of the methodology is also presented....
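
    The group-contribution idea that the method lifts from molecules to process-groups can be sketched as a sum of tabulated contributions (the contribution values below are fictitious placeholders, not the paper's tables):

        CONTRIB = {"reactor": 2.4, "distillation": 1.1, "flash": 0.4}   # fictitious values

        def flowsheet_property(process_groups):
            """Estimate a flowsheet 'property' as the sum of its group contributions."""
            return sum(CONTRIB[g] for g in process_groups)

        alternative = ["reactor", "distillation", "distillation"]
        print("design target estimate:", flowsheet_property(alternative))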

  18. Model Development and Process Analysis for Lean Cellular Design Planning in Aerospace Assembly and Manufacturing

    Science.gov (United States)

    Hilburn, Monty D.

    Successful lean manufacturing and cellular manufacturing execution relies upon a foundation of leadership commitment and strategic planning built upon solid data and robust analysis. The problem for this study was to create and employ a simple lean transformation planning model and review process that could be used to identify the functional support staff resources required to plan and execute lean manufacturing cells within aerospace assembly and manufacturing sites. The lean planning model was developed using available literature on lean manufacturing kaizen best practices and validated through a Delphi panel of lean experts. The resulting model and a standardized review process were used to assess the state of lean transformation planning at five sites of an international aerospace manufacturing and assembly company. The results of the three-day on-site reviews were compared with baseline plans collected from each of the five sites to determine if there were differences, with focus on three critical areas of lean planning: the number and type of manufacturing cells identified; the number, type, and duration of planned lean and continuous kaizen events; and the quantity and type of functional staffing resources planned to support the kaizen schedule. Summarized data from the baseline and on-site reviews was analyzed with descriptive statistics. ANOVAs and paired t-tests at the 95% significance level were conducted on the means of the data sets to determine if null hypotheses related to cell, kaizen event, and support resources could be rejected. The research found significant differences between lean transformation plans developed by site leadership and plans developed utilizing the structured on-site review process and lean transformation planning model. The null hypothesis that there was no difference between the means of pre-review and on-site cell counts was rejected, as was the null hypothesis that there was no significant difference in kaizen event plans. These

  19. Understanding Quality in Process Modelling: Towards a Holistic Perspective

    Directory of Open Access Journals (Sweden)

    Jan Recker

    2007-09-01

    Quality is one of the main topics in current conceptual modelling research, as is the field of business process modelling. Yet widely acknowledged academic contributions towards an understanding or measurement of business process model quality are limited at best. In this paper I argue that the development of methodical theories concerning the measurement or establishment of process model quality must be preceded by methodological elaborations on business process modelling. I further argue that existing epistemological foundations of process modelling are insufficient for describing all extrinsic and intrinsic traits of model quality. This in turn has led to a lack of holistic understanding of process modelling. Taking into account the inherent social and purpose-oriented character of process modelling in contemporary organizations, I present a socio-pragmatic constructionist methodology of business process modelling and sketch out the implications of this perspective for an understanding of process model quality. I anticipate that, based on this research, theories can be developed that facilitate the evaluation of the 'goodness' of a business process model.

  20. Application of high-throughput mini-bioreactor system for systematic scale-down modeling, process characterization, and control strategy development.

    Science.gov (United States)

    Janakiraman, Vijay; Kwiatkowski, Chris; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming

    2015-01-01

    High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench-scale bioreactors have been the system of choice. Due to the need to run different process conditions for multiple process parameters, process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system, the Advanced Microscale Bioreactor (ambr15™), to perform process characterization in less than a month and develop an input control strategy. As a prerequisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study, and product quality results were generated. Upon comparison with DoE data from the bench-scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data to set action limits for the critical controlled parameters (CCPs), which were comparable to those from bench-scale bioreactor data. In other words, the current work shows that the ambr15 system is capable of replacing the bench-scale bioreactor system for routine process development and process characterization. © 2015 American Institute of Chemical Engineers.

  1. Strategies for developing competency models.

    Science.gov (United States)

    Marrelli, Anne F; Tondora, Janis; Hoge, Michael A

    2005-01-01

    There is an emerging trend within healthcare to introduce competency-based approaches in the training, assessment, and development of the workforce. The trend is evident in various disciplines and specialty areas within the field of behavioral health. This article is designed to inform those efforts by presenting a step-by-step process for developing a competency model. An introductory overview of competencies, competency models, and the legal implications of competency development is followed by a description of the seven steps involved in creating a competency model for a specific function, role, or position. This modeling process is drawn from advanced work on competencies in business and industry.

  2. Process modeling of the platform choice for development of the multimedia educational complex

    Directory of Open Access Journals (Sweden)

    Ірина Олександрівна Бондар

    2016-10-01

    The article presents a methodical approach to choosing the platform that will serve as the technological basis for building an open and functional structure and for the further implementation of the substantive content of the modules of a networked multimedia complex for a discipline. The proposed approach is implemented through the use of mathematical tools. The result of the process modeling is the selection of the most appropriate platform for the development of the multimedia complex.

  3. Models of development and educational styles in the process of emancipation of Latin America: the case of Brazil

    Directory of Open Access Journals (Sweden)

    Dermeval SAVIANI

    2011-07-01

    On the occasion of the commemoration of 200 years of independence of Latin American countries, this paper analyses the models of development and educational styles in the process of the emancipation of Ibero-America, focusing specifically on the Brazilian case. To do this, we use two key texts as references: Gregorio Weinberg's Modelos educativos en el desarrollo histórico de América Latina (Models of Education in the Historical Development of Latin America) and Germán Rama's Estilos educacionales (Educational Styles). Both texts elaborate the educational models or styles that took part in the historical development of Latin American societies. Bearing in mind the polarization between tradition and modernization displayed in the educational models and styles proposed by Weinberg and Rama, this work shows how the process of conservative modernization, which characterized, with different nuances, the general emancipation movement in Ibero-American countries, took place in Brazilian society.

  4. Reference model for apparel product development

    Directory of Open Access Journals (Sweden)

    Isabel Cristina Moretti

    2017-03-01

    The purpose of this paper was to develop a reference model for the implementation of the process of product development (PDP) for apparel. The tool was developed through an interactive process of comparison with theoretical models. Managers in companies and professionals working in this market can utilize the reference model as a source for the organization and improvement of the PDP for apparel, and universities can use it as a reference source for the systematized teaching of this process. This model represents the first comprehensive attempt to develop an instrument at a detailed level (macro-phases, phases, activities, inputs and outputs at each stage and at the gates) to systematize the PDP for fashion products and to consider its particularities.

  5. Image Processing of Welding Procedure Specification and Pre-process program development for Finite Element Modelling

    International Nuclear Information System (INIS)

    Kim, K. S.; Lee, H. J.

    2009-11-01

    The PRE-WELD program, which automatically generates the input file for finite element analysis of 2D butt welding at the dissimilar metal weld part, was developed. This program is a pre-processing program for the FEM code used for analyzing residual stress at welding parts. Even if users do not have detailed knowledge of FEM modelling, they can easily prepare the ABAQUS input by entering the shape data of the welding part and welding parameters such as weld current and voltage. By using the PRE-WELD program, we can greatly save the time and effort of preparing the ABAQUS input for residual stress analysis at welding parts, and produce exact input files without human error.

  6. Modeling biochemical transformation processes and information processing with Narrator

    Directory of Open Access Journals (Sweden)

    Palfreyman Niall M

    2007-03-01

    Background: Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport and transformation as well as biological information processing. Results: Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion: Narrator is a

  7. A process model in continuing professional development: Exploring diagnostic radiographers' views

    International Nuclear Information System (INIS)

    Henwood, Suzanne M.; Taket, Ann

    2008-01-01

    This article is based on an exploratory, interpretative grounded theory study that looked at practitioners' perceptions of continuing professional development (CPD) in diagnostic radiography in the UK. Using a combination of in-depth interviews and secondary analysis of published material, a dynamic CPD process model was generated. The study aimed to explore what radiographers understood by the term CPD and whether it was perceived to have any impact on clinical practice. The study also aimed to identify and investigate the components of CPD and how they interact with one another, to help explain what is happening within CPD and what contributes to its effectiveness. The CPD process was shown to be complex, dynamic and centred on the Individual. The supporting components of Facilitation and External Influences were identified as important in maximising the potential impact of CPD. The three main categories were shown to interact dynamically and, prior to Participation, were shown to have a 'superadditive' effect, where the total effect was greater than the sum of the three individual parts. This study showed that radiographers are generally unaware of the holistic concept of CPD, using instead narrow definitions of CPD with little or no expectation of any impact on practice, focusing predominantly on personal gain. The model produced in the study provided a tool that practitioners reported was helpful in reflecting on their own involvement in CPD.

  8. Simulation Models of Human Decision-Making Processes

    Directory of Open Access Journals (Sweden)

    Nina RIZUN

    2014-10-01

    The main purpose of the paper is the presentation of a new concept for modeling the human decision-making process via an analogy with Automatic Control Theory. From the author's point of view this concept makes it possible to develop and improve the theory of decision-making in terms of the study and classification of the specificity of human intellectual processes under different conditions. It was proved that the main distinguishing feature between the Heuristic/Intuitive and Rational Decision-Making Models is the presence of the so-called phenomenon of "enrichment" of the input information with human propensities, hobbies, tendencies, expectations, axioms and judgments, presumptions or biases and their justification. In order to obtain additional knowledge about the basic intellectual processes, as well as the possibility of modeling the decision results for various parameters characterizing the decision-maker, a complex of simulation models was developed. These models are based on the assumptions that: the basic intellectual processes of the Rational Decision-Making Model can be adequately simulated and identified by the transient processes of a proportional-integral-derivative controller; and the basic intellectual processes of the Bounded Rationality and Intuitive Models can be adequately simulated and identified by the transient processes of nonlinear elements. A taxonomy of the most typical automatic control theory elements and their correspondence to certain decision-making models, from the point of view of decision-making process specificity and decision-maker behavior during a certain time of professional activity, was obtained.
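
    The first assumption can be made concrete with a discretized PID controller driving a first-order plant; the step response below is the kind of transient the paper maps onto rational decision-making (gains are chosen only to make the transient visible):

        def pid_step(kp=2.0, ki=1.0, kd=0.5, dt=0.05, steps=100):
            y, integral, prev_err, out = 0.0, 0.0, 1.0, []
            for _ in range(steps):
                err = 1.0 - y                        # setpoint 1.0 = "desired decision"
                integral += err * dt
                deriv = (err - prev_err) / dt
                u = kp * err + ki * integral + kd * deriv
                y += (u - y) * dt                    # first-order plant responding to u
                prev_err = err
                out.append(y)
            return out

        response = pid_step()
        print("response at t=1 s and t=5 s:", round(response[19], 3), round(response[-1], 3))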

  9. Development and application of a processing model for the Irish dairy industry.

    Science.gov (United States)

    Geary, U; Lopez-Villalobos, N; Garrick, D J; Shalloo, L

    2010-11-01

    A processing-sector model was developed that simulates (i) milk collection, (ii) standardization, and (iii) product manufacture. The model estimates the product yield, net milk value, and component values of milk based on milk quantity, composition, product portfolio, and product values. Product specifications of cheese, butter, skim and whole milk powders, liquid milk, and casein are met through milk separation followed by reconstitution in appropriate proportions. Excess cream or skim milk are used in other product manufacture. Volume-related costs, including milk collection, standardization, and processing costs, and product-related costs, including processing costs per tonne, packaging, storage, distribution, and marketing, are quantified. Operating costs, incurred irrespective of milk received and processing activities, are included in the model on a fixed-rate basis. The net milk value is estimated as sale value less total costs. The component values of fat and protein were estimated from net milk value using the marginal rate of technical substitution. Two product portfolio scenarios were examined: scenario 1 was representative of the Irish product mix in 2000, in which 27, 39, 13, and 21% of the milk pool was processed into cheese (€ 3,291.33/t), butter (€ 2,766.33/t), whole milk powder (€ 2,453.33/t), and skim milk powder (€ 2,017.00/t), respectively, and scenario 2 was representative of the 2008 product mix, in which 43, 30, 14, and 13% was processed into cheese, butter, whole milk powder, and skim milk powder, respectively, and sold at the same market prices. Within both scenarios 3 milk compositions were considered, which were representative of (i) typical Irish Holstein-Friesian, (ii) Jersey, and (iii) the New Zealand strain of Holstein-Friesian, each of which had differing milk constituents. The effect each milk composition had on product yield, processing costs, total revenue, component values of milk, and the net value of milk was examined
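
    A stylized version of the valuation logic, assuming product yield rises linearly with fat and protein content (all prices, costs and coefficients are invented for illustration; the model's actual product-mix accounting is far richer):

        def net_milk_value(milk_kg, fat_pct, protein_pct, price_per_t=3000.0,
                           volume_cost_per_kg=0.02, product_cost_per_t=400.0):
            yield_t_per_t = 0.012 * fat_pct + 0.014 * protein_pct   # assumed linear yield
            product_t = milk_kg / 1000.0 * yield_t_per_t
            revenue = product_t * price_per_t
            costs = milk_kg * volume_cost_per_kg + product_t * product_cost_per_t
            return revenue - costs

        base = net_milk_value(1000.0, 3.9, 3.3)
        marginal_fat = net_milk_value(1000.0, 4.0, 3.3) - base      # value of +0.1 % fat
        print(f"net value {base:.2f} EUR/t milk; +0.1 % fat adds {marginal_fat:.2f} EUR")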

  10. Mathematical model development of heat and mass exchange processes in the outdoor swimming pool

    Directory of Open Access Journals (Sweden)

    M. V. Shaptala

    2014-12-01

    Purpose. Currently the exploitation of outdoor swimming pools is often not cost-effective and, despite their relevance, such pools are closed in large quantities. At this time there is no complete mathematical model which would allow a qualitative assessment of the effect of energy-saving measures. The aim of this work is to develop a mathematical model of the heat and mass exchange processes for calculating the basic heat and mass losses that occur during a pool's exploitation. Methodology. A method for determining heat and mass losses based on the theory of similarity criteria equations is used. Findings. The main types of heat and mass losses of an outdoor pool were analyzed. The most significant types were singled out and described mathematically, namely: evaporation of water from the surface of the pool, natural and forced convection, radiation to the environment, and heat consumption for water heating. Originality. A mathematical model of the heat and mass exchange processes of an outdoor swimming pool was developed, which allows calculating the basic heat and mass losses that occur during its exploitation. Practical value. The method for determining the heat and mass losses of an outdoor swimming pool was developed and implemented as a software system. It is based on the mathematical model proposed by the authors. This method can be used for the conceptual design of energy-efficient outdoor pool structures, to assess their energy intensity and to select optimal energy-saving measures. A further step in this research is the experimental validation of the method of calculating heat losses in outdoor swimming pools, using as an example the pool of Dnipropetrovsk National University of Railway Transport named after Academician V. Lazaryan. This outdoor pool, with water heated from the university's boiler room, is operated year-round.
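
    Evaporation is typically the dominant loss; as a rough companion to the criteria-equation approach, a widely used empirical correlation can be sketched as follows (treat the coefficient as an assumption, not the authors' formulation):

        def evaporation_kg_per_h(area_m2, wind_m_s, x_sat, x_air):
            theta = 25.0 + 19.0 * wind_m_s        # kg/(m2*h), empirical evaporation coefficient
            return theta * area_m2 * (x_sat - x_air)

        # 50 m2 pool, light wind, humidity ratios at water temperature vs ambient air
        m_evap = evaporation_kg_per_h(50.0, 2.0, 0.020, 0.010)
        print(f"evaporation ~ {m_evap:.0f} kg/h -> latent heat loss ~ {m_evap * 2430 / 3600:.0f} kW")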

  11. Computer modeling of lung cancer diagnosis-to-treatment process.

    Science.gov (United States)

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U; Yu, Xinhua; Faris, Nick; Li, Jingshan

    2015-08-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model for the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their application in healthcare are introduced, and the approach to deriving a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed.
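
    The Markov chain approach can be sketched with a tiny diagnosis-to-treatment chain and its fundamental matrix (states and probabilities are invented to show the mechanics, not clinical estimates):

        import numpy as np

        states = ["referral", "imaging", "biopsy", "staging", "treatment"]
        P = np.array([[0.0, 1.0, 0.0, 0.0, 0.0],
                      [0.1, 0.0, 0.9, 0.0, 0.0],   # 10% loop back to referral
                      [0.0, 0.0, 0.0, 1.0, 0.0],
                      [0.0, 0.0, 0.0, 0.0, 1.0],
                      [0.0, 0.0, 0.0, 0.0, 1.0]])  # treatment is absorbing

        Q = P[:4, :4]                              # transitions among transient states
        N = np.linalg.inv(np.eye(4) - Q)           # fundamental matrix
        print(dict(zip(states[:4], N[0].round(2))))  # expected visits starting at referral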

  12. Unified Approach in the DSS Development Process

    Directory of Open Access Journals (Sweden)

    2007-01-01

    The structure of today's decision support environment has become very complex due to the new generation of Business Intelligence applications and technologies such as Data Warehouse, OLAP (On-Line Analytical Processing) and Data Mining. In this respect the DSS development process is not simple and needs an adequate methodology or framework able to manage different tools and platforms to meet managers' requirements. The DSS development process must be viewed as a unified and iterative set of activities and operations. New techniques based on the Unified Process (UP) methodology and UML (Unified Modeling Language) seem to be appropriate for DSS development using prototyping and RAD (Rapid Application Development) techniques. In this paper we present a conceptual framework for developing and integrating Decision Support Systems using the Unified Process methodology and UML.

  13. User's manual for a process model code

    International Nuclear Information System (INIS)

    Kern, E.A.; Martinez, D.P.

    1981-03-01

    The MODEL code has been developed for computer modeling of materials processing facilities associated with the nuclear fuel cycle. However, it can also be used in other modeling applications. This report provides sufficient information for a potential user to apply the code to specific process modeling problems. Several examples that demonstrate most of the capabilities of the code are provided

  14. THE MATHEMATICAL MODEL DEVELOPMENT OF THE ETHYLBENZENE DEHYDROGENATION PROCESS KINETICS IN A TWO-STAGE ADIABATIC CONTINUOUS REACTOR

    Directory of Open Access Journals (Sweden)

    V. K. Bityukov

    2015-01-01

    The article is devoted to mathematical modeling of the kinetics of ethylbenzene dehydrogenation in a two-stage adiabatic reactor with a catalytic bed operating on continuous technology. An analysis of the chemical reactions taking place in parallel with the main styrene-forming reaction was carried out; on its basis a number of assumptions were made, from which a kinetic scheme describing the mechanism of the chemical reactions during the dehydrogenation process was developed. A mathematical model of the dehydrogenation process, describing the dynamics of the chemical reactions taking place in each of the two stages of the reactor block at constant temperature, is developed. The rate constants of the direct and reverse reactions of formation and consumption of each component of the reaction mixture were estimated. The dynamics of the starting material (ethylbenzene) concentration was obtained, as well as the formation dynamics of styrene and of all by-products of dehydrogenation (benzene, toluene, ethylene, carbon, hydrogen, etc.). The calculated variations of the component composition of the reaction mixture during its passage through the first and second stages of the reactor showed that the proposed mathematical description adequately reproduces the kinetics of the process under investigation. This demonstrates the advantage of the developed model, as well as the reliability of the values found for the rate constants, which enables the use of the model for calculating the kinetics of ethylbenzene dehydrogenation under nonisothermal conditions in order to determine the optimal temperature trajectory of reactor operation. In the future, this will reduce energy and resource consumption, increase the volume of styrene produced and improve the economic indicators of the process.
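
    The main reversible reaction, ethylbenzene <-> styrene + H2, can be sketched at constant temperature as follows (rate constants are placeholders; the paper estimates constants for the full network including by-products):

        dt, kf, kr = 0.01, 0.8, 0.1        # s, 1/s, L/(mol*s) -- placeholder values
        eb, st, h2 = 1.0, 0.0, 0.0         # mol/L
        for _ in range(int(10 / dt)):
            r = kf * eb - kr * st * h2     # net forward rate of EB <-> ST + H2
            eb -= r * dt
            st += r * dt
            h2 += r * dt
        print(f"t=10 s: ethylbenzene {eb:.3f}, styrene {st:.3f}, H2 {h2:.3f} mol/L")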

  15. Computer-Aided Modeling of Lipid Processing Technology

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel

    2011-01-01

    increase along with growing interest in biofuels, the oleochemical industry faces in the upcoming years major challenges in terms of design and development of better products and more sustainable processes to make them. Computer-aided methods and tools for process synthesis, modeling and simulation...... are widely used for design, analysis, and optimization of processes in the chemical and petrochemical industries. These computer-aided tools have helped the chemical industry to evolve beyond commodities toward specialty chemicals and ‘consumer oriented chemicals based products’. Unfortunately...... to develop systematic computer-aided methods (property models) and tools (database) related to the prediction of the necessary physical properties suitable for design and analysis of processes employing lipid technologies. The methods and tools include: the development of a lipid-database (CAPEC...

  16. Ontological Model of Business Process Management Systems

    Science.gov (United States)

    Manoilov, G.; Deliiska, B.

    2008-10-01

    The activities which constitute business process management (BPM) can be grouped into five categories: design, modeling, execution, monitoring and optimization. Dedicated software packages for business process management systems (BPMS) are available on the market. But the efficiency of their exploitation depends on the ontological model used at development time and run time of the system. In the article an ontological model of BPMS in the area of the software industry is investigated. The model building is preceded by a conceptualization of the domain and a taxonomy of BPMS development. On the basis of the taxonomy, a simple online thesaurus is created.

  17. MEASUREMENT PROCESS OF SOFTWARE DEVELOPMENT PROJECTS FOR SUPPORTING STRATEGIC BUSINESS OBJECTIVES IN SOFTWARE DEVELOPING COMPANIES

    Directory of Open Access Journals (Sweden)

    Sandra Lais Pedroso

    2013-08-01

    Full Text Available Software developing companies work in a competitive market and are often challenged to make business decisions with an impact on competitiveness. Models assessing the maturity of software development processes, such as CMMI and MPS-BR, comprise process measurement systems (PMS). However, these models are not necessarily suitable to support business decisions, nor to achieve strategic goals. The objective of this work is to analyze how the PMS of software development projects could support business strategies in software developing companies. Results from this work show that PMS results from maturity models for software processes can be adapted to help evaluate operating capabilities and support strategic business decisions.

  18. Parsing multiple processes of high temperature impacts on corn/soybean yield using a newly developed CLM-APSIM modeling framework

    Science.gov (United States)

    Peng, B.; Guan, K.; Chen, M.

    2016-12-01

    Future agricultural production faces a grand challenge of higher temperatures under climate change. There are multiple physiological and metabolic processes through which high temperature affects crop yield. Specifically, we consider the following major processes: (1) direct temperature effects on photosynthesis and respiration; (2) accelerated growth rate and the shortening of the growing season; (3) heat stress during the reproductive stage (flowering and grain-filling); (4) high-temperature-induced increase of atmospheric water demand. In this work, we use a newly developed modeling framework (CLM-APSIM) to simulate corn and soybean growth and explicitly parse the above four processes. By combining the strengths of CLM in modeling surface biophysical (e.g., hydrology and energy balance) and biogeochemical (e.g., photosynthesis and carbon-nitrogen interactions) processes with those of APSIM in modeling crop phenology and reproductive stress, the newly developed CLM-APSIM framework enables us to diagnose the impacts of high temperature stress through different processes at various crop phenology stages. Ground measurements from the advanced SoyFACE facility at the University of Illinois are used here to calibrate, validate, and improve the CLM-APSIM modeling framework at the site level. We finally use the CLM-APSIM modeling framework to project crop yield for the whole US Corn Belt under different climate scenarios.

  19. Development of advanced spent fuel management process. System analysis of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Ro, S.G.; Kang, D.S.; Seo, C.S.; Lee, H.H.; Shin, Y.J.; Park, S.W.

    1999-03-01

    The system analysis of an advanced spent fuel management process to establish a non-proliferation model for long-term spent fuel management is performed by comparing several dry processes: a salt transport process, a lithium process, the IFR process developed in America, and the DDP developed in Russia. In our system analysis, the non-proliferation concept is focused on the separation factor between uranium and plutonium and the decontamination factors of products in each process, and the non-proliferation model for long-term spent fuel management has finally been introduced. (Author). 29 refs., 17 tabs., 12 figs

  20. Assessing healthcare process maturity: challenges of using a business process maturity model

    NARCIS (Netherlands)

    Tarhan, A.; Turetken, O.; van den Biggelaar, F.J.H.M.

    2015-01-01

    Doi: 10.4108/icst.pervasivehealth.2015.259105 The quality of healthcare services is influenced by the maturity of the healthcare processes used to deliver them. A maturity model is an instrument to assess and continually improve organizational processes. In the last decade, a number of maturity models

  1. Development and Application of an Integrated Model for Representing Hydrologic Processes and Irrigation at Residential Scale in Semiarid and Mediterranean Regions

    Science.gov (United States)

    Herrera, J. B.; Gironas, J. A.; Bonilla, C. A.; Vera, S.; Reyes, F. R.

    2015-12-01

    Urbanization alters physical and biological processes that take place in natural environments. New impervious areas change the hydrological processes, reducing infiltration and evapotranspiration and increasing direct runoff volumes and flow discharges. To reduce these effects at the local scale, sustainable urban drainage systems, low impact development and best management practices have been developed and implemented. These technologies, which typically consider some type of green infrastructure (GI), simulate natural processes of capture, retention and infiltration to control flow discharges from frequent events and preserve the hydrological cycle. Applying these techniques in semiarid regions requires accounting for aspects related to the maintenance of green areas, such as irrigation needs and the selection of vegetation. This study develops the Integrated Hydrological Model at Residential Scale, IHMORS, a continuous model that simulates the most relevant hydrological processes together with the irrigation of green areas. In the model, contributing areas and drainage control practices are modeled by combining and connecting different subareas subjected to surface processes (i.e. interception, evapotranspiration, infiltration and surface runoff) and sub-surface processes (percolation, redistribution and subsurface runoff). The model simulates these processes and accounts for the dynamics of the water content in different soil layers. The different components of the model were first tested using laboratory and numerical experiments, and then an application to a case study was carried out. In this application we assess the long-term performance, in terms of runoff control and irrigation needs, of green gardens with different vegetation under different climate and irrigation practices. The model identifies significant differences in the performance of the alternatives and provides good insight into the maintenance needs of GI for runoff control.

  2. Advanced oxidation processes: overall models

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, M. [Univ. de los Andes, Escuela Basica de Ingenieria, La Hechicera, Merida (Venezuela); Curco, D.; Addardak, A.; Gimenez, J.; Esplugas, S. [Dept. de Ingenieria Quimica. Univ. de Barcelona, Barcelona (Spain)

    2003-07-01

    Modelling AOPs implies considering all the steps included in the process, that is, mass transfer, kinetic (reaction) and photonic (radiation) steps. In this way, recent works have developed models which relate the global reaction rate to catalyst concentration and radiation absorption. However, the application of such models requires knowing the controlling step for the overall process. In this paper, a simple method is explained which allows the controlling step to be determined. It is assumed that the reactor is divided into two hypothetical zones (dark and illuminated), and according to the experimental results, obtained by varying only the reaction volume, it can be decided whether reaction occurs only in the illuminated zone or in the whole reactor, including the dark zone. The photocatalytic degradation of phenol, using titania Degussa P-25 as catalyst, is studied as the model reaction. The preliminary results obtained are presented here, showing that, in this case, reaction seems to occur only in the illuminated zone of the photoreactor. A model is developed to explain this behaviour. (orig.)
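
    As a rough illustration of the volume-variation diagnostic described above (not the authors' actual procedure), the hypothetical check below compares measured global rates against the two limiting hypotheses: rate proportional to total reactor volume (reaction everywhere) versus rate independent of the added dark volume (reaction confined to the illuminated zone). All data values are invented.

        import numpy as np

        # Hypothetical runs: fixed illuminated volume, increasing total volume (L).
        total_volume = np.array([1.0, 1.5, 2.0, 3.0])
        observed_rate = np.array([0.52, 0.49, 0.51, 0.50])   # mol/h, made up

        # Reaction in the whole reactor -> rate grows with total volume;
        # reaction only in the illuminated zone -> rate roughly constant.
        corr = np.corrcoef(total_volume, observed_rate)[0, 1]
        spread = np.ptp(observed_rate) / observed_rate.mean()
        if abs(corr) < 0.5 and spread < 0.1:
            print("Rate insensitive to dark volume: illuminated zone controls.")
        else:
            print("Rate tracks total volume: dark zone also reacts.")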

  3. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from ordering a book online until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information, and relational integration) and three 'process of process modelling' phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers, and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling. PMID:27157858

  4. Development of a Population Balance Model of a pharmaceutical drying process and testing of solution methods

    DEFF Research Database (Denmark)

    Mortier, Séverine Thérèse F.C.; Gernaey, Krist; De Beer, Thomas

    2013-01-01

    Drying is frequently used in the production of pharmaceutical tablets. Simulation-based control strategy development for such a drying process requires a detailed model. First, the drying of wet granules is modelled using a Population Balance Model. A growth term based on a reduced model was used......, which describes the decrease of the moisture content, to follow the moisture content distribution for a batch of granules. Secondly, different solution methods for solving the PBM are compared. The effect of grid size (discretization methods) is analyzed in terms of accuracy and calculation time. All...
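
    The record is truncated, so the following is only a generic sketch of the kind of computation involved: a one-dimensional population balance dn/dt + d(G(x)·n)/dx = 0 over moisture content x with a negative (drying) growth rate, solved with a first-order upwind scheme. The grid, the growth law and all parameters are assumptions, not the paper's reduced model.

        import numpy as np

        # Upwind solution of dn/dt + d(G(x) n)/dx = 0 on a moisture grid.
        # G(x) < 0: the moisture content of every granule decreases while drying.
        nx, dx, dt, steps = 200, 0.005, 0.001, 400
        x = (np.arange(nx) + 0.5) * dx                  # moisture content grid
        n = np.exp(-0.5 * ((x - 0.6) / 0.08) ** 2)      # assumed initial distribution
        G = -0.2 * x                                    # assumed first-order drying law

        for _ in range(steps):
            flux = G * n
            # G < 0 everywhere, so the upwind flux at each interface comes
            # from the cell on the right.
            n[:-1] -= dt / dx * (flux[1:] - flux[:-1])

        print("mean moisture content:", round((x * n).sum() / n.sum(), 4))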

  5. QUALITY IMPROVEMENT MODEL AT THE MANUFACTURING PROCESS PREPARATION LEVEL

    Directory of Open Access Journals (Sweden)

    Dusko Pavletic

    2009-12-01

    Full Text Available The paper presents the basis for an operational quality improvement model at the manufacturing process preparation level. Numerous appropriate quality assurance and improvement methods and tools are identified. Main manufacturing process principles are investigated in order to scrutinize a general model of the manufacturing process and to define the manufacturing process preparation level. Development and introduction of the operational quality improvement model are based on research conducted on the application possibilities of these methods and tools in real manufacturing processes in the shipbuilding and automotive industries. The basic model structure is described and presented by an appropriate general algorithm. The operational quality improvement model developed lays down the main guidelines for practical and systematic application of quality improvement methods and tools.

  6. Beyond dual-process models: A categorisation of processes underlying intuitive judgement and decision making

    NARCIS (Netherlands)

    Glöckner, A.; Witteman, C.L.M.

    2010-01-01

    Intuitive-automatic processes are crucial for making judgements and decisions. The fascinating complexity of these processes has attracted many decision researchers, prompting them to start investigating intuition empirically and to develop numerous models. Dual-process models assume a clear

  7. Developing Materials Processing to Performance Modeling Capabilities and the Need for Exascale Computing Architectures (and Beyond)

    Energy Technology Data Exchange (ETDEWEB)

    Schraad, Mark William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Physics and Engineering Models; Luscher, Darby Jon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Advanced Simulation and Computing

    2016-09-06

    Additive Manufacturing techniques are presenting the Department of Energy and the NNSA Laboratories with new opportunities to consider novel component production and repair processes, and to manufacture materials with tailored response and optimized performance characteristics. Additive Manufacturing technologies already are being applied to primary NNSA mission areas, including Nuclear Weapons. These mission areas are adapting to these new manufacturing methods because of potential advantages, such as smaller manufacturing footprints, reduced needs for specialized tooling, an ability to embed sensing, novel part repair options, an ability to accommodate complex geometries, and lighter weight materials. To realize the full potential of Additive Manufacturing as a game-changing technology for the NNSA's national security missions, however, significant progress must be made in several key technical areas. In addition to advances in engineering design, process optimization and automation, and accelerated feedstock design and manufacture, significant progress must be made in modeling and simulation. First and foremost, a more mature understanding of the process-structure-property-performance relationships must be developed. Because Additive Manufacturing processes change the nature of a material's structure below the engineering scale, new models are required to predict materials response across the spectrum of relevant length scales, from the atomistic to the continuum. New diagnostics will be required to characterize materials response across these scales. And not just models, but advanced algorithms, next-generation codes, and advanced computer architectures will be required to complement the associated modeling activities. Based on preliminary work in each of these areas, a strong argument for the need for Exascale computing architectures can be made, if a legitimate predictive capability is to be developed.

  8. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user/developer to identify the processing field at the top of the sequence and to send into the computing module only the data related to the requested result. The remaining data is not relevant and will only slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and cost. To do so, we must review the processing sequence by adding several modeling tools. The existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and cost. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  9. A model for ageing-home-care service process improvement

    OpenAIRE

    Yu, Shu-Yan; Shie, An-Jin

    2017-01-01

    The purpose of this study was to develop an integrated model to improve service processes in ageing-home-care. According to the literature, existing service processes have potential service failures that affect service quality and efficacy. However, most previous studies have only focused on conceptual model development using New Service Development (NSD) and fail to provide a systematic model to analyse potential service failures and facilitate managers developing solutions to improve the se...

  10. Research on Process-oriented Spatio-temporal Data Model

    Directory of Open Access Journals (Sweden)

    XUE Cunjin

    2016-02-01

    Full Text Available Based on an analysis of the present status and existing problems of the spatio-temporal data models developed in the last 20 years, this paper proposes a process-oriented spatio-temporal data model (POSTDM), aiming at representing, organizing and storing continuous and gradually changing geographical entities. The dynamic geographical entities are graded and abstracted into process object series according to their intrinsic characteristics, namely process objects, process stage objects, process sequence objects and process state objects. The logical relationships among process entities are further studied, and the structure of the UML models and their storage is also designed. In addition, through the mechanisms of continuous and gradual changes implicitly recorded by process objects, and the procedure interfaces offered by the customized ObjectStorageTable, the POSTDM can carry out process representation, storage and dynamic analysis of continuous and gradually changing geographic entities. Taking process organization and storage of marine data as an example, a prototype system (consisting of an object-relational database and a functional analysis platform) is developed for validating and evaluating the model's practicability.

  11. Numerical Simulation of a Grinding Process Model for the Spatial Work-pieces: Development of Modeling Techniques

    Directory of Open Access Journals (Sweden)

    S. A. Voronov

    2015-01-01

    Full Text Available The article presents a literature review on the simulation of grinding processes. It takes into consideration the statistical, energy-based, and imitation approaches to the simulation of grinding forces. The main stages of interaction between abrasive grains and the machined surface are shown. The article describes the main approaches to modeling the geometry of new surface formation in grinding. A review of approaches to numerical modeling of chip formation and pile-up effects is given. Advantages and disadvantages of modeling grain-to-surface interaction by means of the finite element method and the molecular dynamics method are considered. The article points out that it is necessary to take into consideration the system dynamics and its effect on the finished surface. A structure for a complex imitation model of grinding process dynamics for flexible work-pieces with spatial surface geometry is proposed from the literature review. The proposed model of spatial grinding includes a model of work-piece dynamics, a model of grinding wheel dynamics, and a phenomenological model of grinding forces based on a 3D geometry modeling algorithm. The model gives the following results for the spatial grinding process: vibration of the machined part and grinding wheel, machined surface geometry, static deflection of the surface, and grinding forces under various cutting conditions.

  12. Non-isothermal processes during the drying of bare soil: Model Development and Validation

    Science.gov (United States)

    Sleep, B.; Talebi, A.; O'Carrol, D. M.

    2017-12-01

    Several coupled liquid water, water vapor, and heat transfer models have been developed either to study non-isothermal processes in the subsurface immediately below the ground surface, or to predict the evaporative flux from the ground surface. Equilibrium phase change between water and gas phases is typically assumed in these models. Recently, a few studies have questioned this assumption and proposed coupled models considering kinetic phase change. However, none of these models were validated against real field data. In this study, a non-isothermal coupled model incorporating kinetic phase change was developed and examined against measured data from a green roof test module. The model also incorporated a new surface boundary condition for water vapor transport at the ground surface. The measured field data included soil moisture content and temperature at different depths down to 15 cm below the ground surface. Lysimeter data were collected to determine the evaporation rates. Short- and long-wave radiation, wind velocity, ambient air temperature and relative humidity were measured and used as model input. Field data were collected for a period of three months during the warm seasons in southeastern Canada. The model was calibrated using one drying period, and several other drying periods were then simulated. In general, the model underestimated the evaporation rates in the early stage of the drying period; however, the cumulative evaporation was in good agreement with the field data. The model predicted the trends in temperature and moisture content at the different depths in the green roof module. The simulated temperature was lower than the measured temperature for most of the simulation time, with a maximum difference of 5 °C. The simulated moisture content changes had the same temporal trend as the lysimeter data for the events simulated.

  13. GRA model development at Bruce Power

    International Nuclear Information System (INIS)

    Parmar, R.; Ngo, K.; Cruchley, I.

    2011-01-01

    In 2007, Bruce Power undertook a project, in partnership with AMEC NSS Limited, to develop a Generation Risk Assessment (GRA) model for its Bruce B Nuclear Generating Station. The model is intended to be used as a decision-making tool in support of plant operations. Bruce Power has recognized the strategic importance of GRA in the plant decision-making process and is currently implementing a pilot GRA application. The objective of this paper is to present the scope of the GRA model development project, methodology employed, and the results and path forward for the model implementation at Bruce Power. The required work was split into three phases. Phase 1 involved development of GRA models for the twelve systems most important to electricity production. Ten systems were added to the model during each of the next two phases. The GRA model development process consists of developing system Failure Modes and Effects Analyses (FMEA) to identify the components critical to the plant reliability and determine their impact on electricity production. The FMEAs were then used to develop the logic for system fault tree (FT) GRA models. The models were solved and post-processed to provide model outputs to the plant staff in a user-friendly format. The outputs consisted of the ranking of components based on their production impact expressed in terms of lost megawatt hours (LMWH). Another key model output was the estimation of the predicted Forced Loss Rate (FLR). (author)

  14. Cutting force model for high speed machining process

    International Nuclear Information System (INIS)

    Haber, R. E.; Jimenez, J. E.; Jimenez, A.; Lopez-Coronado, J.

    2004-01-01

    This paper presents cutting-force-based models able to describe a high speed machining process. The models consider the cutting force as the output variable, essential to the physical processes taking place in high speed machining. Moreover, this paper shows the mathematical development to derive the integral-differential equations, and the algorithms implemented in MATLAB to predict the cutting force in real time. MATLAB is a software tool for doing numerical computations with matrices and vectors; it can also display information graphically and includes many toolboxes for several research and application areas. Two end mill shapes are considered (i.e. cylindrical and ball-end mill) for real-time implementation of the developed algorithms. The developed models are validated in slot milling operations. The results corroborate the importance of the cutting force variable for predicting tool wear in high speed machining operations. The developed models are the starting point for future work related to vibration analysis, process stability and dimensional surface finish in high speed machining processes. (Author) 19 refs

  15. Additive Manufacturing of IN100 Superalloy Through Scanning Laser Epitaxy for Turbine Engine Hot-Section Component Repair: Process Development, Modeling, Microstructural Characterization, and Process Control

    Science.gov (United States)

    Acharya, Ranadip; Das, Suman

    2015-09-01

    This article describes additive manufacturing (AM) of IN100, a high gamma-prime nickel-based superalloy, through scanning laser epitaxy (SLE), aimed at the creation of thick deposits onto like-chemistry substrates for enabling repair of turbine engine hot-section components. SLE is a metal powder bed-based laser AM technology developed for nickel-base superalloys with equiaxed, directionally solidified, and single-crystal microstructural morphologies. Here, we combine process modeling, statistical design-of-experiments (DoE), and microstructural characterization to demonstrate fully metallurgically bonded, crack-free and dense deposits exceeding 1000 μm of SLE-processed IN100 powder onto IN100 cast substrates produced in a single pass. A combined thermal-fluid flow-solidification model of the SLE process complements DoE-based process development. A customized quantitative metallography technique analyzes digital cross-sectional micrographs and extracts various microstructural parameters, enabling process model validation and process parameter optimization. Microindentation measurements show an increase in the hardness by 10 pct in the deposit region compared to the cast substrate due to microstructural refinement. The results illustrate one of the very few successes reported for the crack-free deposition of IN100, a notoriously "non-weldable" hot-section alloy, thus establishing the potential of SLE as an AM method suitable for hot-section component repair and for future new-make components in high gamma-prime containing crack-prone nickel-based superalloys.

  16. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle, from the delivery of requirements to the start of acceptance testing, are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  17. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubininа

    2015-06-01

    Full Text Available The essence of process-oriented enterprise management is examined in the article. The content and types of information technology are analyzed, given the complexity and differentiation of existing methods as well as the specificity of the language and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and modern traditional modeling techniques that have received practical application in visualizing the activity of retailers are studied. The theoretical analysis of the modeling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes owing to its integrated systemological capabilities. A visualized simulation model of the retailers' business process "sales" as-is was designed using a combination of UFO elements, with the aim of further practical formalization and optimization of the given business process.

  18. Computer-Aided Multiscale Modelling for Chemical Process Engineering

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Gani, Rafiqul

    2007-01-01

    Chemical processes are generally modeled through monoscale approaches, which, while not adequate, satisfy a useful role in product-process design. In this case, use of a multi-dimensional and multi-scale model-based approach has importance in product-process development. A computer-aided framework...

  19. Guided interaction exploration in artifact-centric process models

    NARCIS (Netherlands)

    van Eck, M.L.; Sidorova, N.; van der Aalst, W.M.P.

    2017-01-01

    Artifact-centric process models aim to describe complex processes as a collection of interacting artifacts. Recent developments in process mining allow for the discovery of such models. However, the focus is often on the representation of the individual artifacts rather than their interactions. Based

  20. Model medication management process in Australian nursing homes using business process modeling.

    Science.gov (United States)

    Qian, Siyu; Yu, Ping

    2013-01-01

    One of the reasons for end user avoidance or rejection of health information systems is poor alignment of the system with the healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step for the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviews with two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, the major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  1. Integrating ergonomics into the product development process

    DEFF Research Database (Denmark)

    Broberg, Ole

    1997-01-01

    and production engineers regarding information sources in problem solving, communication patterns, perception of ergonomics, motivation and requests for support tools and methods. These differences and the social and organizational contexts of the development process must be taken into account when considering......A cross-sectional case study was performed in a large company producing electro-mechanical products for industrial application. The purpose was to elucidate conditions and strategies for integrating ergonomics into the product development process, thereby preventing ergonomic problems at the time... of manufacture of new products. In reality the product development process is not a rational problem solving process and does not proceed in a sequential manner as described in engineering models. Instead it is a complex organizational process involving uncertainties, iterative elements and negotiation between...

  2. Developing a Comprehensive Model of Intensive Care Unit Processes: Concept of Operations.

    Science.gov (United States)

    Romig, Mark; Tropello, Steven P; Dwyer, Cindy; Wyskiel, Rhonda M; Ravitz, Alan; Benson, John; Gropper, Michael A; Pronovost, Peter J; Sapirstein, Adam

    2015-04-23

    This study aimed to use a systems engineering approach to improve performance and stakeholder engagement in the intensive care unit to reduce several different patient harms. We developed a conceptual framework or concept of operations (ConOps) to analyze different types of harm that included 4 steps as follows: risk assessment, appropriate therapies, monitoring and feedback, as well as patient and family communications. This framework used a transdisciplinary approach to inventory the tasks and work flows required to eliminate 7 common types of harm experienced by patients in the intensive care unit. The inventory gathered both implicit and explicit information about how the system works or should work and converted the information into a detailed specification that clinicians could understand and use. Using the ConOps document, we created highly detailed work flow models to reduce harm and offer an example of its application to deep venous thrombosis. In the deep venous thrombosis model, we identified tasks that were synergistic across different types of harm. We will use a system of systems approach to integrate the variety of subsystems and coordinate processes across multiple types of harm to reduce the duplication of tasks. Through this process, we expect to improve efficiency and demonstrate synergistic interactions that ultimately can be applied across the spectrum of potential patient harms and patient locations. Engineering health care to be highly reliable will first require an understanding of the processes and work flows that comprise patient care. The ConOps strategy provided a framework for building complex systems to reduce patient harm.

  3. Modelling Of Monazite Ore Break-Down By Alkali Process Spectrometry

    International Nuclear Information System (INIS)

    Visetpotjanakit, Suputtra; Changkrueng, Kalaya; Pichestapong, Pipat

    2005-10-01

    A computer model has been developed for the calculation of the mass balance of monazite ore break-down by the alkali process at the Rare Earth Research and Development Center. The process includes the following units: ore digestion by concentrated NaOH, dissolution of the digested ore by HCl, uranium and thorium precipitation, and crystallization of Na3PO4, which is a by-product of this process. The model, named RRDCMBP, was prepared in the Visual Basic language. The modelling program can be run on a personal computer and is interactive and easy to use. The user is able to choose any equipment in each unit process and input data to get the mass balance results as output. The model could be helpful in process analysis for further process adjustment and development
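
    As a generic illustration of the kind of unit-by-unit mass balance such a code performs (the RRDCMBP program itself is not reproduced here), the sketch below chains hypothetical split fractions through digestion, dissolution and precipitation steps. All stream names and recovery fractions are invented for illustration.

        # Toy steady-state mass balance over chained unit processes (illustrative).
        # Each unit splits its feed into a product stream and a reject stream.
        feed = {"Th": 50.0, "U": 3.0, "REE": 250.0, "PO4": 120.0}  # kg/batch, assumed

        def split(stream, fractions):
            """Return (product, reject) given per-component recovery fractions."""
            product = {k: v * fractions.get(k, 0.0) for k, v in stream.items()}
            reject = {k: v - product[k] for k, v in stream.items()}
            return product, reject

        digested, undigested = split(feed, {"Th": 0.98, "U": 0.98, "REE": 0.97, "PO4": 0.99})
        dissolved, residue = split(digested, {"Th": 0.95, "U": 0.96, "REE": 0.99, "PO4": 0.98})
        th_u_cake, ree_liquor = split(dissolved, {"Th": 0.99, "U": 0.97})

        for name, s in [("Th/U cake", th_u_cake), ("REE liquor", ree_liquor)]:
            print(name, {k: round(v, 2) for k, v in s.items()})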

  4. Technology development life cycle processes.

    Energy Technology Data Exchange (ETDEWEB)

    Beck, David Franklin

    2013-05-01

    This report and set of appendices are a collection of memoranda originally drafted in 2009 for the purpose of providing motivation and the necessary background material to support the definition and integration of engineering and management processes related to technology development. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. As presented herein, the material begins with a survey of open literature perspectives on technology development life cycles, including published data on "what went wrong." The main thrust of the material presents a rational exposé of a structured technology development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of the systems engineering process. The material concludes with a discussion on the use of multiple measures to assess technology maturity, including consideration of the viewpoint of potential users.

  5. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid...... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types......, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals-based formulated products is also given....

  6. MODELLING OF THE PROCESS OF TEACHING READING ENGLISH LANGUAGE PERIODICALS

    Directory of Open Access Journals (Sweden)

    Тетяна Глушко

    2014-07-01

    Full Text Available The article reveals a scientifically substantiated process of teaching the reading of English-language periodicals in all its components, which are consistently developed and form an interconnection of structural elements in the process of teaching reading. This process is presented as a few interconnected and interdetermined models: (1) models of the process of acquiring standard and expressive lexical knowledge; (2) models of the process of forming skills to use such vocabulary; (3) models of the development of skills to read texts of different linguistic levels.

  7. Modeling styles in business process modeling

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Zugal, S.; Weber, B.; Weidlich, M.; Fahland, D.; Reijers, H.A.; Mendling, J.; Bider, I.; Halpin, T.; Krogstie, J.; Nurcan, S.; Proper, E.; Schmidt, R.; Soffer, P.; Wrycza, S.

    2012-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models. As a consequence, the question arises whether different ways of creating process models exist. In this vein, we observed 115 students engaged in the act of modeling, recording

  8. A generalized logarithmic image processing model based on the gigavision sensor model.

    Science.gov (United States)

    Deng, Guang

    2012-03-01

    The logarithmic image processing (LIP) model is a mathematical theory providing generalized linear operations for image processing. The gigavision sensor (GVS) is a new imaging device that can be described by a statistical model. In this paper, by studying these two seemingly unrelated models, we develop a generalized LIP (GLIP) model. With the LIP model being its special case, the GLIP model not only provides new insights into the LIP model but also defines new image representations and operations for solving general image processing problems that are not necessarily related to the GVS. A new parametric LIP model is also developed. To illustrate the application of the new scalar multiplication operation, we propose an energy-preserving algorithm for tone mapping, which is a necessary step in image dehazing. By comparing with results using two state-of-the-art algorithms, we show that the new scalar multiplication operation is an effective tool for tone mapping.
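
    For context (these formulas come from the classical LIP literature, not from this paper): on gray tones in [0, M), LIP addition is a (+) b = a + b - ab/M and LIP scalar multiplication is c (x) a = M - M(1 - a/M)^c. The sketch below applies the classical scalar multiplication as a tone-mapping curve; it is only a simplified stand-in for the paper's GLIP operation and energy-preserving algorithm.

        import numpy as np

        M = 256.0   # gray-tone upper bound of the LIP model

        def lip_add(a, b):
            # Classical LIP addition: a (+) b = a + b - a*b/M
            return a + b - a * b / M

        def lip_scalar(c, a):
            # Classical LIP scalar multiplication: c (x) a = M - M*(1 - a/M)**c
            return M - M * (1.0 - a / M) ** c

        print(lip_add(100.0, 100.0))   # ~160.9, stays below M

        # Tone mapping: c > 1 pushes gray tones toward M, c < 1 toward 0.
        img = np.random.default_rng(0).uniform(0.0, 255.0, size=(4, 4))
        print(lip_scalar(0.5, img).round(1))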

  9. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model of raw material supply for processing enterprises belonging to a vertically integrated structure for the production and processing of dairy raw materials is developed. Its distinguishing feature is an orientation toward achieving a cumulative effect for the integrated structure, which acts as the criterion function; its maximization is reached by optimizing capacities, volumes of raw material deliveries and their qualitative characteristics, the costs of industrial processing of raw materials, and the demand for dairy products.

  10. Detecting Difference between Process Models Based on the Refined Process Structure Tree

    Directory of Open Access Journals (Sweden)

    Jing Fan

    2017-01-01

    Full Text Available The development of mobile workflow management systems (mWfMS) leads to a large number of business process models. In the meantime, the location restrictions embedded in mWfMS may result in different process models for a single business process. In order to help users quickly locate the differences and rebuild the process model, detecting the difference between different process models is needed. Existing detection methods either provide a dissimilarity value to represent the difference or use predefined difference templates to generate the result, which cannot reflect the entire composition of the difference. Hence, in this paper, we present a new approach to solve this problem. Firstly, we parse the process models into their corresponding refined process structure trees (PSTs), that is, decomposing a process model into a hierarchy of subprocess models. Then we design a method to convert the PST to its corresponding task-based process structure tree (TPST). As a consequence, the problem of detecting the difference between two process models is transformed into detecting the difference between their corresponding TPSTs. Finally, we obtain the difference between two TPSTs based on a divide and conquer strategy, where the difference is described by an edit script whose cost we make close to the minimum. The extensive experimental evaluation shows that our method can meet real requirements in terms of precision and efficiency.
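
    The paper's TPST comparison algorithm is not reproduced here; as a loose illustration of producing an edit script between two labeled trees, the sketch below recursively matches children by label and emits insert/delete/relabel operations, a much cruder scheme than the divide-and-conquer method the abstract describes.

        from dataclasses import dataclass, field

        @dataclass
        class Node:
            label: str
            children: list = field(default_factory=list)

        def diff(a, b, script, path="/"):
            """Naive recursive diff of two labeled trees; appends edit operations."""
            if a.label != b.label:
                script.append(("relabel", path, a.label, b.label))
            left = {c.label: c for c in a.children}
            right = {c.label: c for c in b.children}
            for lbl in left.keys() - right.keys():
                script.append(("delete", path + lbl))
            for lbl in right.keys() - left.keys():
                script.append(("insert", path + lbl))
            for lbl in left.keys() & right.keys():
                diff(left[lbl], right[lbl], script, path + lbl + "/")

        old = Node("seq", [Node("register"), Node("check"), Node("ship")])
        new = Node("seq", [Node("register"), Node("pay"), Node("ship")])
        ops = []
        diff(old, new, ops)
        print(ops)   # [('delete', '/check'), ('insert', '/pay')]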

  11. Mathematical modelling of the laser processing of compose materials

    International Nuclear Information System (INIS)

    Gromyko, G.F.; Matsuka, N.P.

    2009-01-01

    Expansion of the scope of protective coatings has led to the necessity of working out lower-priced methods of treatment of machine elements. Making an adequate mathematical model, consistent with the process features, and developing effective methods for solving it are promising directions in this field. In this paper a mathematical model is developed of high-temperature laser treatment, via a moving source, of a padding pre-sprayed with composite powder. The presented model describes accurately enough the heat processes taking place in laser processing of machine elements. By varying the input parameters of the model (laser power, temperature and composition of the environment, characteristics and quantitative composition of the materials used, etc.) one can get a cheap tool for preliminary estimates for a wide range of similar problems. A difference method, based on the physical features of the process and taking into account the main process-dependent parameters, has been developed for solving the resulting system of nonlinear equations. (authors)

  12. Welding process modelling and control

    Science.gov (United States)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system supporting Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not intended that this system be used for welding process control.

  13. Modelling the evolution of compacted bentonite clays in engineered barrier systems: process model development of the bentonite-water-air system

    International Nuclear Information System (INIS)

    Bond, A.E.; Wilson, J.C.; Maul, P.R.; Robinson, P.C.; Savage, D.

    2010-01-01

    considered to be 'bound' or otherwise immobile (specifically water held in bentonite interlayer sites and double layers) and water which is 'free' or mobile, comprising liquid water and water vapour. The disposition of the water is then constrained using thermodynamic data derived directly from laboratory studies to give a localised energy balance (including bentonite free energy and rock stress) which allows a bound water retention curve to be dynamically evaluated. In addition, a simple mass and volume balancing approach allows the micro-scale changes in porosity and bentonite grain volume to be converted into macro-scale bulk volume changes and water retention capacity. Indeed, the model largely abandons the concept of 'porosity' as a useful term when describing the state of fluids in bentonite, naturally considering 'capacities' to hold different types of water dependent on the physical and chemical condition of the bentonite. Migration of liquid water, air and water vapour is handled using conventional multi-phase-flow theory with some simple adjustments to selected parameterisation (mainly relative permeability and suction curves) to take into account the different water and air distribution model. The new model has been successfully applied to a series of benchmarking studies in the THERESA project, and examples of comparisons between model calculations and laboratory and field-test data are described in the paper. This model has been implemented in software based on Quintessa's general-purpose modelling code QPAC, which employs a fundamentally different approach to system discretization and process representation from most THM codes. The rapid prototyping and coupled process model development that the QPAC code facilitates has enabled the revised bentonite model to be implemented, producing a test-bed for investigating key features of EBS evolution. Although the application of the model is at an early stage, and further

  14. Graphene growth process modeling: a physical-statistical approach

    Science.gov (United States)

    Wu, Jian; Huang, Qiang

    2014-09-01

    As a zero-bandgap semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among the various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since fabrication of high-quality graphene sheets requires an understanding of growth mechanisms, along with methods to characterize and control the grain size of graphene flakes, analytical modeling of the graphene growth process is essential for controlled fabrication. The graphene growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from the area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has a clear physical explanation but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of the graphene growth process on Cu foil.
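
    The abstract names a "confined exponential" area-growth model without giving its form; a common confined-exponential form is A(t) = A_max(1 - e^(-kt)), and the sketch below fits that assumed form to made-up island-area data with SciPy. The actual model and data in the paper may differ.

        import numpy as np
        from scipy.optimize import curve_fit

        def confined_exp(t, a_max, k):
            # Assumed confined-exponential area growth: saturates at a_max
            return a_max * (1.0 - np.exp(-k * t))

        t = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])         # minutes, made up
        area = np.array([4.1, 7.8, 13.5, 19.7, 23.8, 24.9])   # um^2, made up

        (p_amax, p_k), _ = curve_fit(confined_exp, t, area, p0=(25.0, 0.3))
        print(f"A_max = {p_amax:.1f} um^2, k = {p_k:.2f} 1/min")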

  15. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  16. The application of feature selection to the development of Gaussian process models for percutaneous absorption.

    Science.gov (United States)

    Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P

    2010-06-01

    The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset and to employ these methods, particularly feature selection, to determine the key physicochemical descriptors which exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance determination (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MATLAB software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures to determine model quality was used. The inherently nonlinear nature of the skin data set was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (where the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity, compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information. However, it was also shown that significant synergy existed between certain parameters, and as such it
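
    Not the authors' code: as a generic illustration of ARD-style feature relevance with Gaussian process regression, the sketch below fits scikit-learn's GaussianProcessRegressor with an anisotropic RBF kernel to synthetic data standing in for a permeability target; descriptors that end up with large learned length-scales are effectively irrelevant. The descriptor names are placeholders.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(1)
        # Synthetic descriptors standing in for (logP, MP, H-bond donors, noise)
        X = rng.normal(size=(80, 4))
        y = 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 2] + 0.1 * rng.normal(size=80)

        # Anisotropic RBF: one length-scale per descriptor (ARD behaviour)
        kernel = RBF(length_scale=[1.0] * 4) + WhiteKernel(noise_level=0.1)
        gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

        for name, ls in zip(["logP", "MP", "HBD", "noise"], gpr.kernel_.k1.length_scale):
            print(f"{name}: length-scale {ls:.2f}")   # large => descriptor irrelevant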

  17. Mashup Model and Verification Using Mashup Processing Network

    Science.gov (United States)

    Zahoor, Ehtesham; Perrin, Olivier; Godart, Claude

    Mashups are defined to be lightweight Web applications aggregating data from different Web services, built using ad-hoc composition and not concerned with long-term stability and robustness. In this paper we present a pattern-based approach, called Mashup Processing Network (MPN). The idea is based on Event Processing Networks and is intended to facilitate the creation, modeling and verification of mashups. MPN provides a view of how different actors interact in mashup development, namely the producer, consumer, mashup processing agent and the communication channels. It also supports modeling transformations and validations of data and offers validation of both functional and non-functional requirements, such as reliable messaging and security, that are key issues within the enterprise context. We have enriched the model with a set of processing operations and categorized them into data composition, transformation and validation categories. These processing operations can be seen as a set of patterns facilitating the mashup development process. MPN also paves the way for realizing a Mashup Oriented Architecture where mashups, along with services, are used as building blocks for application development.

  18. Numerical modeling of atmospheric washout processes

    International Nuclear Information System (INIS)

    Bayer, D.; Beheng, K.D.; Herbert, F.

    1987-01-01

    For the washout of particles from the atmosphere by clouds and rain one has to distinguish between processes which work in the first phase of cloud development, when condensation nuclei build up in saturated air (Nucleation Aerosol Scavenging, NAS), and those processes which work during the subsequent cloud development. In the second case particles are taken up by cloud droplets or by falling rain drops via collision (Collision Aerosol Scavenging, CAS). The physics of both processes is described. For the CAS process a numerical model is presented. The report contains documentation of the mathematical equations and the computer programs (FORTRAN). (KW) [de

  19. Dual processing model of medical decision-making

    Science.gov (United States)

    2012-01-01

    Background Dual processing theory of human cognition postulates that reasoning and decision-making can be described as a function of both an intuitive, experiential, affective system (system I) and/or an analytical, deliberative (system II) processing system. To date no formal descriptive model of medical decision-making based on dual processing theory has been developed. Here we postulate such a model and apply it to a common clinical situation: whether treatment should be administered to a patient who may or may not have a disease. Methods We developed a mathematical model in which we linked a recently proposed descriptive psychological model of cognition with the threshold model of medical decision-making and show how this approach can be used to better understand decision-making at the bedside and explain the widespread variation in treatments observed in clinical practice. Results We show that physicians' beliefs about whether to treat at higher (lower) probability levels compared to the prescriptive therapeutic thresholds obtained via system II processing are moderated by system I and the ratio of benefits and harms as evaluated by both systems I and II. Under some conditions, the system I decision maker's threshold may dramatically drop below the expected utility threshold derived by system II. This can explain the overtreatment often seen in contemporary practice. The opposite can also occur, as in situations where empirical evidence is considered unreliable, or when the cognitive processes of decision-makers are biased through recent experience: the threshold will increase relative to the normative threshold value derived via system II using the expected utility threshold. This inclination toward higher diagnostic certainty may, in turn, explain undertreatment that is also documented in current medical practice. Conclusions We have developed the first dual processing model of medical decision-making that has the potential to enrich the current medical

  20. Dual processing model of medical decision-making.

    Science.gov (United States)

    Djulbegovic, Benjamin; Hozo, Iztok; Beckstead, Jason; Tsalatsanis, Athanasios; Pauker, Stephen G

    2012-09-03

    Dual processing theory of human cognition postulates that reasoning and decision-making can be described as a function of both an intuitive, experiential, affective system (system I) and/or an analytical, deliberative (system II) processing system. To date no formal descriptive model of medical decision-making based on dual processing theory has been developed. Here we postulate such a model and apply it to a common clinical situation: whether treatment should be administered to a patient who may or may not have a disease. We developed a mathematical model in which we linked a recently proposed descriptive psychological model of cognition with the threshold model of medical decision-making and show how this approach can be used to better understand decision-making at the bedside and explain the widespread variation in treatments observed in clinical practice. We show that physicians' beliefs about whether to treat at higher (lower) probability levels compared to the prescriptive therapeutic thresholds obtained via system II processing are moderated by system I and the ratio of benefits and harms as evaluated by both systems I and II. Under some conditions, the system I decision maker's threshold may dramatically drop below the expected utility threshold derived by system II. This can explain the overtreatment often seen in contemporary practice. The opposite can also occur, as in situations where empirical evidence is considered unreliable, or when the cognitive processes of decision-makers are biased through recent experience: the threshold will increase relative to the normative threshold value derived via system II using the expected utility threshold. This inclination toward higher diagnostic certainty may, in turn, explain undertreatment that is also documented in current medical practice. We have developed the first dual processing model of medical decision-making that has the potential to enrich the current medical decision-making field, which is still to the
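
    For reference, the classical threshold model in the Pauker-Kassirer tradition, which the abstract builds on (the dual-processing moderation itself is not reproduced here), sets the system II benchmark threshold from the benefit B and harm H of treatment; a minimal statement in LaTeX:

        % Treat when the probability of disease p exceeds the threshold p_t:
        \[
          p_t \;=\; \frac{H}{B + H} \;=\; \frac{1}{1 + B/H},
        \]
        % where B is the net benefit of treating the diseased and H the net
        % harm of treating the non-diseased; the paper's model shifts p_t
        % up or down according to system I appraisal of B and H.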

  1. A process model in continuing professional development: Exploring diagnostic radiographers' views

    Energy Technology Data Exchange (ETDEWEB)

    Henwood, Suzanne M. [Henwood Associates (South East) Ltd, Coaching and Training, 38 Tudor Crescent, Otford, TN14 5QT, Sevenoaks, Kent (United Kingdom)], E-mail: henwoodassociates@btinternet.com; Taket, Ann [Centre for Health through Action on Social Exclusion (CHASE), School of Health and Social Development, Faculty of Health and Behavioural Sciences, Deakin University, 221 Burwood Highway, Burwood, Vic 3125 (Australia)], E-mail: ann.taket@deakin.edu.au

    2008-08-15

    This article is based on an exploratory, interpretative grounded theory study that looked at practitioners' perceptions of continuing professional development (CPD) in diagnostic radiography in the UK. Using a combination of in-depth interviews and secondary analysis of published material, a dynamic CPD process model was generated. The study aimed to explore what radiographers understood by the term CPD and whether it was perceived to have any impact on clinical practice. The study also aimed to identify and investigate the components of CPD and how they interact with one another, to help explain what is happening within CPD and what contributes to its effectiveness. The CPD process was shown to be complex, dynamic and centred on the Individual. The supporting components of Facilitation and External Influences were identified as important in maximising the potential impact of CPD. The three main categories were shown to interact dynamically and, prior to Participation, to have a 'superadditive' effect, where the total effect was greater than the sum of the three individual parts. This study showed that radiographers are generally unaware of the holistic concept of CPD, using instead narrow definitions of CPD with little or no expectation of any impact on practice, focusing predominantly on personal gain. The model produced in the study provided a tool that practitioners reported was helpful in reflecting on their own involvement in CPD.

  2. Development and validation of a CFD model predicting the backfill process of a nuclear waste gallery

    International Nuclear Information System (INIS)

    Gopala, Vinay Ramohalli; Lycklama a Nijeholt, Jan-Aiso; Bakker, Paul; Haverkate, Benno

    2011-01-01

    Research highlights: → This work presents the CFD simulation of the backfill process of Supercontainers with nuclear waste emplaced in a disposal gallery. → The cement-based material used for backfill is grout and the flow of grout is modelled as a Bingham fluid. → The model is verified against an analytical solution and validated against flowability tests for concrete. → Comparison between the backfill Plexiglas experiment and the simulation shows a distinct difference in the filling pattern. → The numerical model needs to be further developed to include segregation effects and the thixotropic behavior of grout. - Abstract: Nuclear waste material may be stored in underground tunnels for long-term storage. The example treated in this article is based on the current Belgian disposal concept for High-Level Waste (HLW), in which the nuclear waste material is packed in concrete shielded packages, called Supercontainers, which are inserted into these tunnels. After placement of the packages in the underground tunnels, the remaining voids between the packages and the tunnel lining are filled with a cement-based material called grout in order to encase the stored containers in the underground spacing. This encasement of the stored containers inside the tunnels is known as the backfill process. A good backfill process is necessary to stabilize the waste gallery against ground settlements. A numerical model to simulate the backfill process can help to improve and optimize the process by ensuring a homogeneous filling with no air voids and by optimizing the injection positions. The objective of the present work is to develop such a numerical code that can predict the backfill process well and to validate the model against the available experiments and analytical solutions. In the present work the rheology of grout is modelled as a Bingham fluid which is implemented in OpenFOAM - a finite volume-based open source computational fluid
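
    For readers who want to experiment with the rheology described above: a Bingham fluid resists deformation below a yield stress and flows with a plastic viscosity above it. The sketch below implements the widely used Papanastasiou regularization of the Bingham law, which gives a smooth effective viscosity suitable for a finite-volume solver; the parameter values are illustrative and are not the grout properties from the study.

    ```python
    import numpy as np

    def bingham_effective_viscosity(gamma_dot, tau_y, mu_p, m=1000.0):
        """Papanastasiou-regularized Bingham viscosity:
            mu_eff = mu_p + tau_y * (1 - exp(-m * gamma_dot)) / gamma_dot
        gamma_dot : shear rate [1/s]; tau_y : yield stress [Pa];
        mu_p : plastic viscosity [Pa.s]; m : regularization exponent [s]."""
        gamma_dot = np.maximum(gamma_dot, 1e-12)   # avoid division by zero
        return mu_p + tau_y * (1.0 - np.exp(-m * gamma_dot)) / gamma_dot

    # Illustrative values (not from the study): tau_y = 10 Pa, mu_p = 0.5 Pa.s
    rates = np.logspace(-3, 2, 6)
    print(bingham_effective_viscosity(rates, tau_y=10.0, mu_p=0.5))
    ```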

  3. Development of a model describing virus removal process in an activated sludge basin

    Energy Technology Data Exchange (ETDEWEB)

    Kim, T.; Shiragami, N.; Unno, H. [Tokyo Institute of Technology, Tokyo (Japan)]

    1995-06-20

    The virus removal process from the liquid phase in an activated sludge basin possibly consists of physicochemical processes, such as adsorption onto sludge flocs, and biological processes, such as microbial predation and inactivation by virucidal components excreted by microbes. To properly describe the virus behavior in an activated sludge basin, a simple model is proposed based on experimental data obtained using a poliovirus type 1. A three-compartment model, which includes the virus in the liquid phase and in the peripheral and inner regions of sludge flocs, is employed. By using the model, the virus removal process was successfully simulated to highlight the implication of its distribution in the activated sludge basin. 17 refs., 8 figs.
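
    A minimal numerical sketch of a three-compartment balance of this kind is given below, with assumed first-order exchange and inactivation rates; the rate constants are placeholders, not the fitted poliovirus parameters from the study.

    ```python
    # Illustrative three-compartment virus balance: liquid phase, peripheral
    # floc region and inner floc region, with assumed first-order kinetics.
    import numpy as np
    from scipy.integrate import solve_ivp

    k_lp, k_pl = 0.8, 0.1   # liquid <-> peripheral exchange [1/h] (assumed)
    k_pi, k_ip = 0.3, 0.05  # peripheral <-> inner exchange [1/h] (assumed)
    k_inact = 0.2           # inactivation within the flocs [1/h] (assumed)

    def rhs(t, y):
        v_liq, v_per, v_inn = y
        dv_liq = -k_lp * v_liq + k_pl * v_per
        dv_per = k_lp * v_liq - (k_pl + k_pi + k_inact) * v_per + k_ip * v_inn
        dv_inn = k_pi * v_per - (k_ip + k_inact) * v_inn
        return [dv_liq, dv_per, dv_inn]

    sol = solve_ivp(rhs, (0.0, 24.0), [1.0, 0.0, 0.0],
                    t_eval=np.linspace(0.0, 24.0, 7))
    print(sol.y[0])  # virus remaining in the liquid phase over time
    ```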

  4. Dual processing model of medical decision-making

    OpenAIRE

    Djulbegovic, Benjamin; Hozo, Iztok; Beckstead, Jason; Tsalatsanis, Athanasios; Pauker, Stephen G

    2012-01-01

    Abstract Background Dual processing theory of human cognition postulates that reasoning and decision-making can be described as a function of both an intuitive, experiential, affective system (system I) and/or an analytical, deliberative (system II) processing system. To date no formal descriptive model of medical decision-making based on dual processing theory has been developed. Here we postulate such a model and apply it to a common clinical situation: whether treatment should be administe...

  5. Gsflow-py: An integrated hydrologic model development tool

    Science.gov (United States)

    Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.

    2017-12-01

    Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHMs) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model-resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorological parameters over the model domain is difficult in complex terrain at the model-resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHMs has required extensive GIS and computer programming expertise, which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular-grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.

  6. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  7. Pavement maintenance optimization model using Markov Decision Processes

    Science.gov (United States)

    Mandiartha, P.; Duffield, C. F.; Razelan, I. S. b. M.; Ismail, A. b. H.

    2017-09-01

    This paper presents an optimization model for the selection of pavement maintenance interventions using the theory of Markov Decision Processes (MDP). Several characteristics of the MDP developed in this paper distinguish it from similar studies and from other optimization models intended for pavement maintenance policy development. These characteristics include the direct inclusion of constraints in the formulation of the MDP, the use of the average-cost method of MDP, and a policy development process based on the dual linear programming solution. The limited information available on these matters for stochastic optimization models in road network management motivates this study. The paper uses a data set acquired from the road authorities of the state of Victoria, Australia, to test the model, and recommends steps in the computation of the MDP-based stochastic optimization model, leading to the development of an optimum pavement maintenance policy.
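
    To make the solution approach concrete, the sketch below solves a toy average-cost MDP for pavement maintenance via the linear program over state-action occupancy measures, mirroring the dual linear programming solution mentioned above. The states, costs and transition probabilities are invented for illustration, not the Victorian road network data used in the paper.

    ```python
    # Toy average-cost MDP (3 pavement states, 2 actions) solved as an LP
    # over occupancy measures x[s, a]; all numbers are assumptions.
    import numpy as np
    from scipy.optimize import linprog

    S, A = 3, 2                # states: 0 good, 1 fair, 2 poor; actions: 0 nothing, 1 maintain
    P = np.zeros((S, A, S))    # P[s, a, s'] transition probabilities (assumed)
    P[0, 0] = [0.7, 0.3, 0.0]; P[0, 1] = [0.95, 0.05, 0.0]
    P[1, 0] = [0.0, 0.6, 0.4]; P[1, 1] = [0.8, 0.2, 0.0]
    P[2, 0] = [0.0, 0.0, 1.0]; P[2, 1] = [0.6, 0.4, 0.0]
    c = np.array([[0.0, 2.0],  # c[s, a]: user plus maintenance cost (assumed)
                  [1.0, 3.0],
                  [5.0, 6.0]])

    # Flow balance: sum_a x[j,a] = sum_{s,a} P[s,a,j] x[s,a]; plus sum(x) = 1.
    n = S * A
    A_eq = np.zeros((S, n))
    for j in range(S):
        for s in range(S):
            for a in range(A):
                A_eq[j, s * A + a] = (1.0 if s == j else 0.0) - P[s, a, j]
    A_eq = np.vstack([A_eq[:-1], np.ones(n)])  # drop one redundant balance row
    b_eq = np.append(np.zeros(S - 1), 1.0)

    res = linprog(c.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * n)
    x = res.x.reshape(S, A)
    policy = x.argmax(axis=1)  # states with zero occupancy keep an arbitrary action
    print("average cost:", res.fun, "policy per state:", policy)
    ```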

  8. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    Science.gov (United States)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

  9. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    International Nuclear Information System (INIS)

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker, Charles L. III; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.

    2010-01-01

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model, as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project no standard model had been developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work are presented in Section 8.

  10. Modelling a uranium ore bioleaching process

    International Nuclear Information System (INIS)

    Chien, D.C.H.; Douglas, P.L.; Herman, D.H.; Marchbank, A.

    1990-01-01

    A dynamic simulation model for the bioleaching of uranium ore in a stope leaching process has been developed. The model incorporates design and operating conditions, reaction kinetics enhanced by the Thiobacillus ferrooxidans present in the leaching solution, and transport properties. Model predictions agree well with experimental data, with an average deviation of about ±3%. The model is sensitive to small errors in the estimates of fragment size and ore grade. Because accurate estimates are difficult to obtain, a parameter estimation approach was developed to update the values of fragment size and ore grade using on-line plant information

  11. New methods for clinical pathways-Business Process Modeling Notation (BPMN) and Tangible Business Process Modeling (t.BPM).

    Science.gov (United States)

    Scheuerlein, Hubert; Rauchfuss, Falk; Dittmar, Yves; Molle, Rüdiger; Lehmann, Torsten; Pienkos, Nicole; Settmacher, Utz

    2012-06-01

    Clinical pathways (CP) are nowadays used in numerous institutions, but their real impact is still a matter of debate. The optimal design of a clinical pathway remains unclear and is mainly determined by the expectations of the individual institution. The purpose of the pilot project described here was the development of two CP (colon and rectal carcinoma) according to Business Process Modeling Notation (BPMN) and Tangible Business Process Modeling (t.BPM). BPMN is an established standard for business process modelling in industry and economy. It is, in the broadest sense, a notation supported by software tools which enables the description and relatively easy graphical representation of complex processes. t.BPM is a modular construction kit of the BPMN symbols which enables the creation of an outline or raw model, e.g. by placing the symbols on a spread-out paper sheet. The outline created in this way can then be transferred to the computer and modified further as required. CP for the treatment of colon and rectal cancer were developed with the support of an external IT coach. The pathways were developed in an interdisciplinary and interprofessional manner (55 man-days over 15 working days). During this time, the necessary interviews with medical, nursing and administrative staff were conducted as well. Both pathways were developed in parallel. Subsequent analysis focussed on feasibility, expenditure, clarity and suitability for daily clinical practice. Familiarization with BPMN was relatively quick and intuitive. The use of t.BPM enabled the pragmatic, effective and results-directed creation of outlines for the CP. The development of both CP was completed from diagnostic evaluation through to the adjuvant/neoadjuvant therapy and rehabilitation phase. The integration of checklists, guidelines and important medical or other documents is easily accomplished. A direct integration into the hospital computer system is currently not possible for technical reasons. BPMN and t.BPM are sufficiently

  12. Dynamic modeling of ultrafiltration membranes for whey separation processes

    NARCIS (Netherlands)

    Saltik, M.B.; Ozkan, L.; Jacobs, M.; van der Padt, A.

    2017-01-01

    In this paper, we present a control-relevant, rigorous dynamic model for an ultrafiltration membrane unit in a whey separation process. The model consists of a set of differential algebraic equations and is developed for online model-based applications such as model-based control and process

  13. Sustainable Chemical Process Development through an Integrated Framework

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil; Kumar Tula, Anjan; Anantpinijwatna, Amata

    2016-01-01

    This paper describes the development and the application of a general integrated framework based on systematic model-based methods and computer-aided tools with the objective to achieve more sustainable process designs and to improve the process understanding. The developed framework can be appli...... studies involve multiphase reaction systems for the synthesis of active pharmaceutical ingredients....

  14. Modelling and control of dynamic systems using gaussian process models

    CERN Document Server

    Kocijan, Juš

    2016-01-01

    This monograph opens up new horizons for engineers and researchers in academia and in industry dealing with or interested in new developments in the field of system identification and control. It emphasizes guidelines for working solutions and practical advice for their implementation rather than the theoretical background of Gaussian process (GP) models. The book demonstrates the potential of this recent development in probabilistic machine-learning methods and gives the reader an intuitive understanding of the topic. The current state of the art is treated along with possible future directions for research. Systems control design relies on mathematical models and these may be developed from measurement data. This process of system identification, when based on GP models, can play an integral part of control design in data-based control and its description as such is an essential aspect of the text. The background of GP regression is introduced first with system identification and incorporation of prior know...

  15. Development and implementation of computational geometric model for simulation of plate type fuel fabrication process with microspheres dispersed in metallic matrix

    International Nuclear Information System (INIS)

    Lage, Aldo M.F.; Reis, Sergio C.; Braga, Daniel M.; Santos, Armindo; Ferraz, Wilmar B.

    2005-01-01

    This report presents the development of a geometric model to simulate the plate-type fuel fabrication process with fuel microspheres dispersed in a metallic matrix, as well as its software implementation. The geometric model covers the steps of pellet pressing and sintering, as well as the plate rolling passes. The model permits the simulation of structures in which the values of the various fabrication process variables can be studied and modified. The following variables were analyzed: microsphere diameter, density of the powder/microsphere mixture, microsphere density, fuel volume fraction, sintering densification, and number of rolling passes. In the model implementation, which was coded in the DELPHI programming language, structured analysis techniques were used. The simulated structures were visualized using AutoCAD, which made it possible to obtain plane sections in various directions. The objective of this model is to enable the analysis of the simulated structures and to supply information that can help improve the fabrication process of dispersed-microsphere fuel plates, now under development at CDTN (Centro de Desenvolvimento da Tecnologia Nuclear) in cooperation with the CTMSP (Centro Tecnologico da Marinha em Sao Paulo). (author)

  16. Residence time modeling of hot melt extrusion processes.

    Science.gov (United States)

    Reitz, Elena; Podhaisky, Helmut; Ely, David; Thommes, Markus

    2013-11-01

    The hot melt extrusion process is a widespread technique to mix viscous melts. The residence time of material in the process frequently determines the product properties. An experimental setup and a corresponding mathematical model were developed to evaluate residence time and residence time distribution in twin screw extrusion processes. The extrusion process was modeled as the convolution of a mass transport process described by a Gaussian probability function, and a mixing process represented by an exponential function. The residence time of the extrusion process was determined by introducing a tracer at the extruder inlet and measuring the tracer concentration at the die. These concentrations were fitted to the residence time model, and an adequate correlation was found. Different parameters were derived to characterize the extrusion process including the dead time, the apparent mixing volume, and a transport related axial mixing. A 2³ design of experiments was performed to evaluate the effect of powder feed rate, screw speed, and melt viscosity of the material on the residence time. All three parameters affect the residence time of material in the extruder. In conclusion, a residence time model was developed to interpret experimental data and to get insights into the hot melt extrusion process. Copyright © 2013 Elsevier B.V. All rights reserved.
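
    The convolution structure of the model is straightforward to evaluate numerically. The sketch below builds the residence time distribution as the convolution of a Gaussian transport term and an exponential mixing term; the dead time, spread and mixing constant are illustrative values, not fits from the study.

    ```python
    # Residence time distribution E(t) as the convolution of a Gaussian
    # transport term and an exponential mixing term, as described above.
    import numpy as np

    dt = 0.5                               # time step [s]
    t = np.arange(0.0, 300.0, dt)          # time grid [s]
    t_dead, sigma = 60.0, 8.0              # transport: dead time, spread [s] (assumed)
    tau_mix = 25.0                         # mixing time constant [s] (assumed)

    transport = np.exp(-0.5 * ((t - t_dead) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    mixing = np.exp(-t / tau_mix) / tau_mix
    rtd = np.convolve(transport, mixing)[: t.size] * dt   # E(t)

    print("area under E(t):", np.trapz(rtd, t))           # ~1.0
    print("mean residence time:", np.trapz(t * rtd, t))   # ~ t_dead + tau_mix
    ```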

  17. A Model-driven and Service-oriented framework for the business process improvement

    Directory of Open Access Journals (Sweden)

    Andrea Delgado

    2010-07-01

    Full Text Available The importance and benefits of Business Process Management (BPM) for organizations focusing on their business processes are nowadays broadly recognized, as business and technology areas embrace and adopt the paradigm. The Service Oriented Computing (SOC) paradigm bases software development on services in order to realize business processes. The implementation of business processes as services helps reduce the gap between these two areas, easing the communication and understanding of business needs. The Model Driven Development (MDD) paradigm bases software development on models, metamodels and languages that allow transformations between them. The automatic generation of service models from business process models is a key issue in supporting the separation of the definition of a business process from its technical implementation. In this article, we present the MINERVA framework, which applies the Model Driven Development (MDD) and Service Oriented Computing (SOC) paradigms to business processes for continuous business process improvement in organizations, giving support to the stages defined in the business process lifecycle from modeling to evaluation of its execution.

  18. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. In...by ARL modelers. 2. Development Environment: The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5. Python was selected...because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool. In

  19. BUSINESS PROCESS MODELLING: A FOUNDATION FOR KNOWLEDGE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Vesna Bosilj-Vukšić

    2006-12-01

    Full Text Available Knowledge management (KM) is increasingly recognised as a strategic practice of knowledge-intensive companies, becoming an integral part of an organisation's strategy to improve business performance. This paper provides an overview of business process modelling applications and analyses the relationship between business process modelling and knowledge management projects. It presents a case study of leading Croatian banks and an insurance company, discussing their practical experience in conducting business process modelling projects and investigating the opportunity for integrating a business process repository and organisational knowledge as the foundation for knowledge management system development.

  20. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided...... methods introduce. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task....... The methodology has been implemented into a computer-aided modeling framework, which combines expert skills, tools, and database connections that are required for the different steps of the model development work-flow with the goal to increase the efficiency of the modeling process. The framework has two main...

  1. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  2. Dual processing model of medical decision-making

    Directory of Open Access Journals (Sweden)

    Djulbegovic Benjamin

    2012-09-01

    Full Text Available Abstract Background Dual processing theory of human cognition postulates that reasoning and decision-making can be described as a function of both an intuitive, experiential, affective system (system I) and/or an analytical, deliberative (system II) processing system. To date no formal descriptive model of medical decision-making based on dual processing theory has been developed. Here we postulate such a model and apply it to a common clinical situation: whether treatment should be administered to the patient who may or may not have a disease. Methods We developed a mathematical model in which we linked a recently proposed descriptive psychological model of cognition with the threshold model of medical decision-making and show how this approach can be used to better understand decision-making at the bedside and explain the widespread variation in treatments observed in clinical practice. Results We show that physicians' beliefs about whether to treat at higher (lower) probability levels compared to the prescriptive therapeutic thresholds obtained via system II processing are moderated by system I and the ratio of benefits and harms as evaluated by both system I and II. Under some conditions, the system I decision maker's threshold may drop dramatically below the expected utility threshold derived by system II. This can explain the overtreatment often seen in contemporary practice. The opposite can also occur, as in situations where empirical evidence is considered unreliable, or when the cognitive processes of decision-makers are biased through recent experience: the threshold will increase relative to the normative threshold value derived via system II using the expected utility threshold. This inclination toward higher diagnostic certainty may, in turn, explain the undertreatment that is also documented in current medical practice. Conclusions We have developed the first dual processing model of medical decision-making that has potential to

  3. Updating parameters of the chicken processing line model

    DEFF Research Database (Denmark)

    Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna

    2010-01-01

    A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. Recent availability of data allows...... updating parameters of the model to better describe processes observed in slaughterhouses. We propose Bayesian updating as a suitable technique to update expert judgment with microbiological data. Berrang and Dickens’s data are used to demonstrate performance of this method in updating parameters...... of the chicken processing line model....
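
    The updating idea can be illustrated with the simplest conjugate case. The sketch below refreshes an expert-elicited Beta prior for a transfer probability with hypothetical count data; it shows the mechanics of Bayesian updating only and is not the authors' model or the Berrang and Dickens data.

    ```python
    # Generic conjugate (Beta-Binomial) update of an expert-elicited
    # probability parameter with plant data -- a minimal sketch only.
    from scipy import stats

    # Expert judgment encoded as a Beta prior for, e.g., a Campylobacter
    # transfer probability in one processing step (assumed values).
    a_prior, b_prior = 2.0, 8.0            # prior mean 0.2

    # Hypothetical microbiological data: k positive transfers in n carcasses.
    k, n = 35, 100

    a_post, b_post = a_prior + k, b_prior + (n - k)
    posterior = stats.beta(a_post, b_post)
    print("posterior mean:", posterior.mean())
    print("95% credible interval:", posterior.interval(0.95))
    ```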

  4. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes the experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  5. Modeling and simulation of heterogeneous catalytic processes

    CERN Document Server

    Dixon, Anthony

    2014-01-01

    Heterogeneous catalysis and mathematical modeling are essential components of the continuing search for better utilization of raw materials and energy, with reduced impact on the environment. Numerical modeling of chemical systems has progressed rapidly due to increases in computer power, and is used extensively for analysis, design and development of catalytic reactors and processes. This book presents reviews of the state-of-the-art in modeling of heterogeneous catalytic reactors and processes. Key features: reviews by leading authorities in the respective areas; up-to-date reviews of the latest techniques in modeling of catalytic processes; a mix of US and European authors, as well as academic, industrial, and research-institute perspectives; and connections between computational and experimental methods in some of the chapters.

  6. Fuzzy Pruning Based LS-SVM Modeling Development for a Fermentation Process

    Directory of Open Access Journals (Sweden)

    Weili Xiong

    2014-01-01

    Full Text Available Due to the complexity and uncertainty of microbial fermentation processes, data coming from the plants often contain outliers. However, these data may be treated as normal support vectors, which deteriorates the performance of soft sensor modeling. Since the outliers also contaminate the correlation structure of the least squares support vector machine (LS-SVM), a fuzzy pruning method is provided to deal with the problem. Furthermore, by assigning different fuzzy membership scores to data samples, the sensitivity of the model to the outliers can be reduced greatly. The effectiveness and efficiency of the proposed approach are demonstrated through two numerical examples as well as a simulator case study of a penicillin fermentation process.
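
    A compact way to see how membership scores enter such a method: in weighted LS-SVM regression, each sample's regularization term is scaled by its membership, so suspected outliers barely constrain the fit. The sketch below is a generic weighted LS-SVM with a simple residual-based weighting heuristic; the kernel width, regularization and weighting rule are assumptions, not the paper's fuzzy pruning procedure.

    ```python
    import numpy as np

    def rbf_kernel(X1, X2, width=1.0):
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * width ** 2))

    def lssvm_fit(X, y, v, gamma=10.0, width=1.0):
        """Solve [[0, 1^T], [1, K + diag(1/(gamma*v_i))]] [b; alpha] = [0; y]."""
        n = len(y)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = rbf_kernel(X, X, width) + np.diag(1.0 / (gamma * v))
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        return sol[0], sol[1:]              # bias b, dual coefficients alpha

    rng = np.random.default_rng(0)
    X = np.linspace(0.0, 6.0, 40)[:, None]
    y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
    y[10] += 2.0                            # inject a gross outlier

    b0, a0 = lssvm_fit(X, y, np.ones(40))   # initial unweighted fit
    resid = np.abs(rbf_kernel(X, X) @ a0 + b0 - y)  # residuals flag the outlier
    v = np.where(resid > 3.0 * np.median(resid), 0.05, 1.0)  # membership scores
    b1, a1 = lssvm_fit(X, y, v)             # re-fit with outliers down-weighted
    print("membership score of the outlier:", v[10])
    ```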

  7. Modeling the defrost process in complex geometries – Part 1: Development of a one-dimensional defrost model

    Directory of Open Access Journals (Sweden)

    van Buren Simon

    2017-01-01

    Full Text Available Frost formation is a common, often undesired phenomenon in heat exchangers such as air coolers. Air coolers therefore have to be defrosted periodically, causing significant energy consumption. For design and optimization, prediction of defrosting by a CFD tool is desired. This paper presents a one-dimensional transient model approach suitable for use as a zero-dimensional wall function in CFD for modeling the defrost process at the fin and tube interfaces. In accordance with previous work, a multi-stage defrost model is introduced (e.g. [1, 2]). In the first instance the multi-stage model is implemented and validated using MATLAB, and the defrost process of a one-dimensional frost segment is investigated. Fixed boundary conditions are provided at the frost interfaces. The simulation results verify the plausibility of the designed model: the evaluation of the simulated defrost process shows the expected convergent behavior of the three-stage sequence.

  8. The two-process model : Origin and perspective

    NARCIS (Netherlands)

    Daan, S.; Hut, R. A.; Beersma, D.

    In the two-process model as developed in the early 1980s, sleep is controlled by a process-S, representing the rise and fall of sleep demand resulting from prior sleep-wake history, interacting with a process-C representing circadian variation in sleep propensity. S and C together optimize sleep
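
    The model's two interacting processes are easy to simulate. The sketch below uses the commonly cited time constants for the homeostatic process (about 18.2 h build-up during wake, 4.2 h decay during sleep) and an assumed sinusoidal circadian modulation of the sleep and wake thresholds; the threshold levels and amplitude are illustrative.

    ```python
    # Minimal two-process model simulation: homeostatic pressure S rises
    # toward 1 while awake and decays exponentially while asleep, switching
    # state when it crosses circadian-modulated thresholds (process C).
    import numpy as np

    dt = 0.1                                  # time step [h]
    tau_i, tau_d = 18.2, 4.2                  # wake build-up / sleep decay [h]
    t = np.arange(0.0, 96.0, dt)

    S, awake, trace = 0.5, True, []
    for ti in t:
        C = 0.12 * np.sin(2 * np.pi * (ti - 8.0) / 24.0)  # circadian modulation
        upper, lower = 0.85 + C, 0.15 + C                 # sleep/wake thresholds
        if awake:
            S += dt * (1.0 - S) / tau_i                   # saturating rise
            if S >= upper:
                awake = False                             # fall asleep
        else:
            S -= dt * S / tau_d                           # exponential decay
            if S <= lower:
                awake = True                              # wake up
        trace.append(awake)

    print("fraction of time asleep:", 1.0 - np.mean(trace))
    ```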

  9. Process simulation and parametric modeling for strategic project management

    CERN Document Server

    Morales, Peter J

    2013-01-01

    Process Simulation and Parametric Modeling for Strategic Project Management will offer CIOs, CTOs, software development managers, and IT graduate students an introduction to a set of technologies that will help them understand how to better plan software development projects, manage risk, and gain better insight into the complexities of the software development process. A novel methodology will be introduced that allows a software development manager to better plan and assess risks in the early planning of a project. By providing a better model for early software development estimation and softw

  10. Development Instrument’s Learning of Physics Through Scientific Inquiry Model Based Batak Culture to Improve Science Process Skill and Student’s Curiosity

    Science.gov (United States)

    Nasution, Derlina; Syahreni Harahap, Putri; Harahap, Marabangun

    2018-03-01

    This research aims to: (1) develop physics learning instruments (lesson plan, worksheet, student's book, teacher's guide book, and test instrument) for the scientific inquiry learning model based on Batak culture, in order to improve students' science process skills and curiosity; and (2) describe the quality of the learning instruments developed for high school use with the scientific inquiry learning model based on Batak culture. This is development research, in which the physics learning instruments were developed using a development model adapted from Thiagarajan, Semmel, and Semmel. The stages traversed until valid, practical, and effective instruments were obtained were: (1) the definition phase, (2) the planning phase, and (3) the development phase. Testing included expert validation, small-group trials, and limited classroom trials. The limited classroom trials were conducted in class X MIA at SMAN 1 Padang Bolak. This research produced: (1) physics learning instruments on static fluid material for high school grade 10 (lesson plan, worksheet, student's book, teacher's guide book, and test instrument) of a quality worthy of use in the learning process; and (2) instrument components that each meet the criteria of validity, practicality, and effectiveness for improving students' science process skills and curiosity.

  11. Modeling and Control of Multivariable Process Using Intelligent Techniques

    Directory of Open Access Journals (Sweden)

    Subathra Balasubramanian

    2010-10-01

    Full Text Available For nonlinear dynamic systems, first-principles-based modeling and control are difficult to implement. In this study, a fuzzy controller and a recurrent fuzzy controller are developed for a MIMO process. A fuzzy logic controller is a model-free controller designed based on knowledge about the process. Two types of rule-based fuzzy models are available: the linguistic (Mamdani) model and the Takagi-Sugeno (TS) model. Of these two, the Takagi-Sugeno model has attracted the most attention. The application of fuzzy controllers is limited to static processes due to their feedforward structure. However, most real-time processes are dynamic and require the history of input/output data; in order to store past values, a memory unit is needed, which is introduced by the recurrent structure. The proposed recurrent fuzzy structure is used to develop a controller for a two-tank heating process. Both controllers are designed and implemented in a real-time environment and their performance is compared.
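
    The Takagi-Sugeno scheme mentioned above reduces to a few lines: rule activations from membership functions, and a weighted average of (here constant) rule consequents. The membership shapes and consequent values below are illustrative, not the controllers designed in the paper.

    ```python
    # Minimal zero-order Takagi-Sugeno inference for a single error input.
    def tri(x, a, b, c):
        """Triangular membership function with feet a, c and peak b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def ts_controller(error):
        mu_neg = tri(error, -2.0, -1.0, 0.5)   # "error is negative"
        mu_pos = tri(error, -0.5, 1.0, 2.0)    # "error is positive"
        u_neg, u_pos = -4.0, 4.0               # rule consequents (assumed)
        w = mu_neg + mu_pos
        return (mu_neg * u_neg + mu_pos * u_pos) / w if w > 0 else 0.0

    for e in (-1.0, -0.2, 0.0, 0.3, 1.0):
        print(e, ts_controller(e))
    ```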

  12. Process models as tools in forestry research and management

    Science.gov (United States)

    Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox

    2001-01-01

    Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...

  13. Plasma Processing of Model Residential Solid Waste

    Science.gov (United States)

    Messerle, V. E.; Mossé, A. L.; Nikonchuk, A. N.; Ustimenko, A. B.; Baimuldin, R. V.

    2017-09-01

    The authors have tested a technology for processing model residential solid waste. They developed and built a pilot plasma unit based on a plasma-chamber incinerator. The waste processing technology has been tested and prepared for commercialization.

  14. A Systematic Process for Developing High Quality SaaS Cloud Services

    Science.gov (United States)

    La, Hyun Jung; Kim, Soo Dong

    Software-as-a-Service (SaaS) is a type of cloud service which provides software functionality through Internet. Its benefits are well received in academia and industry. To fully utilize the benefits, there should be effective methodologies to support the development of SaaS services which provide high reusability and applicability. Conventional approaches such as object-oriented methods do not effectively support SaaS-specific engineering activities such as modeling common features, variability, and designing quality services. In this paper, we present a systematic process for developing high quality SaaS and highlight the essentiality of commonality and variability (C&V) modeling to maximize the reusability. We first define criteria for designing the process model and provide a theoretical foundation for SaaS; its meta-model and C&V model. We clarify the notion of commonality and variability in SaaS, and propose a SaaS development process which is accompanied with engineering instructions. Using the proposed process, SaaS services with high quality can be effectively developed.

  15. Identification of the main processes in new towns Development ...

    African Journals Online (AJOL)

    Identification of the main processes in new towns Development Company in Iran and provision of the model of ideal processes for optimal management of ... The most important result of this project is that after identifying the status quo, mapping the processes, revising the processes and applying revised processes, the ...

  16. A production model and maintenance planning model for the process industry

    NARCIS (Netherlands)

    Ashayeri, J.; Teelen, A.; Selen, W.J.

    1995-01-01

    In this paper a model is developed to simultaneously plan preventive maintenance and production in a process industry environment, where maintenance planning is extremely important. The model schedules production jobs and preventive maintenance jobs, while minimizing costs associated with

  17. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications are shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
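
    The core idea, propagating parameter distributions through stacked unit operations and reading off the probability of an out-of-specification result, can be sketched with toy transfer functions. Everything below, including the unit-operation relations and the specification limit, is invented for illustration and is not the CMO study's model.

    ```python
    # Toy Monte Carlo propagation through an "integrated process model" of
    # two stacked unit operations; all relations and numbers are assumed.
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Unit operation 1 (e.g., fermentation): titer depends on pH and temperature.
    ph = rng.normal(7.0, 0.05, N)          # process parameter distributions
    temp = rng.normal(37.0, 0.3, N)
    titer = 5.0 - 2.0 * (ph - 7.0) ** 2 - 0.1 * np.abs(temp - 37.0)

    # Unit operation 2 (e.g., capture chromatography): purity depends on load.
    load = titer * rng.normal(0.95, 0.02, N)
    purity = 98.0 - 0.8 * np.maximum(load - 5.0, 0.0) + rng.normal(0.0, 0.3, N)

    spec = 97.0                            # CQA specification limit (assumed)
    print("P(out of specification):", np.mean(purity < spec))
    ```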

  18. Development of the physical model

    International Nuclear Information System (INIS)

    Liu Zunqi; Morsy, Samir

    2001-01-01

    Full text: The Physical Model was developed during Program 93+2 as a technical tool to aid enhanced information analysis and now is an integrated part of the Department's on-going State evaluation process. This paper will describe the concept of the Physical Model, including its objectives, overall structure and the development of indicators with designated strengths, followed by a brief description of using the Physical Model in implementing the enhanced information analysis. The work plan for expansion and update of the Physical Model is also presented at the end of the paper. The development of the Physical Model is an attempt to identify, describe and characterize every known process for carrying out each step necessary for the acquisition of weapons-usable material, i.e., all plausible acquisition paths for highly enriched uranium (HEU) and separated plutonium (Pu). The overall structure of the Physical Model has a multilevel arrangement. It includes at the top level all the main steps (technologies) that may be involved in the nuclear fuel cycle from the source material production up to the acquisition of weapons-usable material, and then beyond the civilian fuel cycle to the development of nuclear explosive devices (weaponization). Each step is logically interconnected with the preceding and/or succeeding steps by nuclear material flows. It contains at its lower levels every known process that is associated with the fuel cycle activities presented at the top level. For example, uranium enrichment is broken down into three branches at the second level, i.e., enrichment of UF6, UCl4 and U-metal respectively; and then further broken down at the third level into nine processes: gaseous diffusion, gas centrifuge, aerodynamic, electromagnetic, molecular laser (MLIS), atomic vapor laser (AVLIS), chemical exchange, ion exchange and plasma. Narratives are presented at each level, beginning with a general process description then proceeding with detailed

  19. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  20. Process and Context in Choice Models

    DEFF Research Database (Denmark)

    Ben-Akiva, Moshe; Palma, André de; McFadden, Daniel

    2012-01-01

    … The extended choice framework includes more behavioral richness through the explicit representation of the planning process preceding an action and its dynamics and the effects of context (family, friends, and market) on the process leading to a choice, as well as the inclusion of new types of subjective data...... in choice models. We discuss the key issues involved in applying the extended framework, focusing on richer data requirements, theories, and models, and present three partial demonstrations of the proposed framework. Future research challenges include the development of more comprehensive empirical tests...

  1. Visualizing the process of process modeling with PPMCharts

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the quest for knowledge about how to make good process models, recent research focus is shifting from studying the quality of process models to studying the process of process modeling (often abbreviated as PPM) itself. This paper reports on our efforts to visualize this specific process in such

  2. Development of rubber mixing process mathematical model and synthesis of control correction algorithm by process temperature mode using an artificial neural network

    Directory of Open Access Journals (Sweden)

    V. S. Kudryashov

    2016-01-01

    Full Text Available The article is devoted to the development of a correction control algorithm for the temperature mode of a batch rubber mixing process at JSC "Voronezh tire plant". The algorithm is designed to run in the main controller of the rubber mixing section, a Siemens S7 CPU319F-3 PN/DP, which generates setpoints for the local temperature controllers HESCH HE086 and Jumo dTRON304 operating the tempering stations. To develop the algorithm, a systematic analysis of the rubber mixing process as a control object was performed, and a mathematical model of the process was built based on heat balance equations describing heat transfer through the walls of the technological devices, the change of coolant temperature, and the temperature of the rubber compound during mixing until discharge from the mixer chamber. Given the complexity and nonlinearity of the control object (the rubber mixer) and the available methods and wide experience of controlling this device in an industrial environment, the correction algorithm is implemented on the basis of a single-layer artificial neural network; it corrects the setpoints of the local controllers for the cooling water temperature and the workshop air temperature, which may vary considerably depending on the season, during prolonged operation of the equipment, or during its downtime. The tempering stations are controlled by changing the flow of cold water from the cooler and by on/off control of the heating elements. Analysis of the model experiment results, and practical tests of the main controller programmed in the STEP 7 environment at the enterprise, showed a decrease in mixing time for different types of rubber by reducing the heat transfer control error.
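
    As a toy illustration of the correction principle, the sketch below trains a single-layer (linear) network to map cooling water and workshop air temperatures to a setpoint correction. The training data, learning rate and the assumed "true" correction law are invented, since the plant's actual network and signals are not given in the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Inputs: cooling water temp and workshop air temp, centred on a nominal
    # operating point (15 C water, 20 C air); target: setpoint correction [C].
    X = np.column_stack((rng.uniform(8, 25, 200), rng.uniform(10, 35, 200)))
    Xc = X - np.array([15.0, 20.0])
    target = Xc @ np.array([0.4, 0.2])     # assumed "true" correction law

    w = np.zeros(2); b = 0.0; lr = 0.01
    for _ in range(500):                   # gradient descent on mean squared error
        err = Xc @ w + b - target
        w -= lr * (Xc.T @ err) / len(Xc)
        b -= lr * err.mean()

    print("learned weights:", w, "bias:", b)  # weights approach (0.4, 0.2)
    ```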

  3. Consistency and Reconciliation Model In Regional Development Planning

    Directory of Open Access Journals (Sweden)

    Dina Suryawati

    2016-10-01

    Full Text Available The aim of this study was to identify the problems in and determine a conceptual model of regional development planning. Regional development planning is a systemic, complex and unstructured process. Therefore, this study used soft systems methodology to outline unstructured issues with a structured approach. The conceptual models that were successfully constructed in this study are a model of consistency and a model of reconciliation. Regional development planning is a process that must be well integrated with central planning and inter-regional planning documents. Integration and consistency of regional planning documents are very important in order to achieve the development goals that have been set. On the other hand, the development planning process in the region involves a technocratic system, that is, both top-down and bottom-up participation. Both must be balanced and must not overlap or dominate each other. Keywords: regional, development, planning, consistency, reconciliation.

  4. A Queuing Model of the Airport Departure Process

    OpenAIRE

    Balakrishnan, Hamsa; Simaiakis, Ioannis

    2013-01-01

    This paper presents an analytical model of the aircraft departure process at an airport. The modeling procedure includes the estimation of unimpeded taxi-out time distributions and the development of a queuing model of the departure runway system based on the transient analysis of D/E/1 queuing systems. The parameters of the runway service process are estimated using operational data. Using the aircraft pushback schedule as input, the model predicts the expected runway schedule and takeoff ti...
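
    The queue part of such a model is simple to simulate directly. The sketch below runs a D/E/1 runway queue: deterministic arrivals at the runway and Erlang-distributed runway service times with a single server; the headway, Erlang shape and mean service time are illustrative values, not the estimates from the paper.

    ```python
    # Minimal simulation of a D/E/1 departure runway queue.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 500                       # number of departures
    headway = 70.0                # deterministic arrival spacing [s]
    k, mean_service = 3, 60.0     # Erlang shape and mean service time [s]

    arrivals = headway * np.arange(1, n + 1)
    services = rng.gamma(shape=k, scale=mean_service / k, size=n)  # Erlang(k)

    takeoff = np.empty(n)
    free_at = 0.0                 # time the runway becomes free
    for i in range(n):
        start = max(arrivals[i], free_at)   # wait if the runway is busy
        takeoff[i] = start + services[i]
        free_at = takeoff[i]

    wait = takeoff - services - arrivals    # queueing delay per departure
    print("mean queueing delay [s]:", wait.mean())
    ```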

  5. Development of hydrological models and surface process modelization Study case in High Mountain slopes

    International Nuclear Information System (INIS)

    Loaiza, Juan Carlos; Pauwels, Valentijn R

    2011-01-01

    Hydrological models are useful because they allow fluxes in hydrological systems to be predicted, which is useful for forecasting floods and other violent phenomena associated with water fluxes, especially in materials with a high degree of weathering. The combination of these models with meteorological predictions, especially with rainfall models, allows the behaviour of water in the soil to be modelled. In most cases, this type of model is very sensitive to evapotranspiration. In climatic studies, the surface processes have to be represented adequately. Calibration and validation of these models are necessary to obtain reliable results. This paper is a practical exercise in the application of complete hydrological information at a detailed scale in a high mountain catchment, considering the most representative soil uses and types. Soil moisture, infiltration, runoff and rainfall data are used to calibrate and validate the TOPLATS hydrological model to simulate the behaviour of soil moisture. The findings show that it is possible to implement a hydrological model by means of soil moisture information and a calibration equation obtained via an Extended Kalman Filter (EKF).
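
    The EKF calibration step can be illustrated on a scalar toy problem. The sketch below runs predict/update cycles of an extended Kalman filter on a one-bucket soil moisture model with invented forcing, observations and noise levels; it shows the filter mechanics only and is not the TOPLATS configuration used in the study.

    ```python
    # Scalar (extended) Kalman filter on a toy soil moisture bucket model:
    #   theta' = theta + dt * (P - ET - k * theta) / z
    # All parameter values below are assumed for illustration.
    import numpy as np

    dt, z = 1.0, 0.3              # time step [d], soil depth [m]
    k = 0.02                      # linear drainage coefficient [m/d per unit theta]
    Q, R = 1e-4, 4e-4             # process and observation noise variances

    theta, P_cov = 0.25, 1e-3     # initial state estimate and its variance
    forcing = [(0.010, 0.004), (0.000, 0.005), (0.020, 0.003)]  # (P, ET) [m/d]
    obs = [0.262, 0.247, 0.285]   # in-situ soil moisture observations (invented)

    for (Pr, ET), y in zip(forcing, obs):
        # Predict: propagate state and variance (F = d(theta')/d(theta)).
        theta = theta + dt * (Pr - ET - k * theta) / z
        F = 1.0 - dt * k / z
        P_cov = F * P_cov * F + Q
        # Update with the soil moisture observation (observation operator H = 1).
        K_gain = P_cov / (P_cov + R)
        theta = theta + K_gain * (y - theta)
        P_cov = (1.0 - K_gain) * P_cov
        print(f"theta = {theta:.3f}, variance = {P_cov:.2e}")
    ```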

  6. Product/Process (P/P) Models For The Defense Waste Processing Facility (DWPF): Model Ranges And Validation Ranges For Future Processing

    Energy Technology Data Exchange (ETDEWEB)

    Jantzen, C. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Edwards, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-09-25

    Radioactive high level waste (HLW) at the Savannah River Site (SRS) has successfully been vitrified into borosilicate glass in the Defense Waste Processing Facility (DWPF) since 1996. Vitrification requires stringent product/process (P/P) constraints since the glass cannot be reworked once it is poured into ten-foot-tall by two-foot-diameter canisters. A unique “feed forward” statistical process control (SPC) was developed for this control rather than statistical quality control (SQC). In SPC, the feed composition to the DWPF melter is controlled prior to vitrification. In SQC, the glass product would be sampled after it is vitrified. Individual glass property-composition models form the basis for the “feed forward” SPC. The models transform constraints on the melt and glass properties into constraints on the feed composition going to the melter in order to guarantee, at the 95% confidence level, that the feed will be processable and that the durability of the resulting waste form will be acceptable to a geologic repository.

  7. Development and Application of a Low Impact Development (LID-Based District Unit Planning Model

    Directory of Open Access Journals (Sweden)

    Cheol Hee Son

    2017-01-01

    Full Text Available The purpose of this study was to develop a low impact development-based district unit planning (LID-DP) model and to verify the model by applying it to a test site. To develop the model, we identified various barriers in the urban planning process and examined the advantages of various LID-related techniques to determine where in the urban development process LID would provide the greatest benefit. The resulting model provides (1) a set of district unit planning processes that consider LID standards and (2) a set of evaluation methods that measure the benefits of the LID-DP model over standard urban development practices. The developed LID-DP process is composed of status analysis, comprehensive analysis, a basic plan, and sectoral plans. To determine whether the LID-DP model met the proposed LID targets, we applied the model to a test site in Cheongju City, Chungcheongbuk-do Province, Republic of Korea. The test simulation showed that the LID-DP plan reduced nonpoint source pollutants (total nitrogen, 113%; total phosphorus, 193%; and biological oxygen demand, 199%), reduced rainfall runoff (infiltration volume, 102%; surface runoff, 101%), and improved the conservation rate of the natural environment area (132%). The successful application of this model also lent support to the greater importance of non-structural techniques over structural techniques in urban planning when taking ecological factors into account.

  8. THE CONCEPTUAL FOUR-SECTOR MODEL OF DEVELOPMENT OF THE COGNITIVE PROCESS DIMENSIONS IN ABSTRACT VISUAL THINKING

    Directory of Open Access Journals (Sweden)

    Kateřina Berková

    2018-04-01

    Full Text Available The research deals with the development of cognitive process dimensions in economic education. The aim is to research factors that influence the academic achievement of students according to their intellectual level and grades. The researchers used a quantitative research design based on a standardized assessment of intelligence and a non-standardized questionnaire. The questionnaire was used to analyse the pedagogical competences of the teachers of economic subjects from the students' point of view, in close relation to the management of teaching and the impact on students' motivation to learn and their achievement in these subjects. The respondents were 277 Czech students aged 16-17 who were divided into groups according to their intellectual level and grades. The data were analysed by correlation analysis and a multiple regression model. In conclusion, the following can be stated: (a) From the point of view of above-average intelligent students, expertise can be considered an important competency of the teacher; for teaching students of average intelligence, communication and presentation skills seem to be important. (b) It is desirable to develop cognitive processes and critical thinking actively, to lead students to become aware of changes in their own thinking, and to orient them towards mastery goals. (c) Particularly for students with weaker results it is necessary to create intrinsic motivation, which develops cognition and is thus able to develop higher cognitive dimensions further. The links between these areas are of utmost importance for education and, above all, for developing students' scholarship. Each student can be educated, and it is necessary to influence them to develop their personality and all of their potential abilities. The conceptual four-sector model represents the initial pathway to lead students who are differentiated according to intellectual level and academic achievement to the active development of thinking, learning

  9. Role of the national energy system modelling in the process of the policy development

    Directory of Open Access Journals (Sweden)

    Merse Stane

    2012-01-01

    Full Text Available Strategic planning and decision making, not least the making of energy policies and strategies, is a very extensive process that has to follow multiple and often contradictory objectives. During the preparation of the new Slovenian Energy Programme proposal, a complete update of the technology- and sector-oriented bottom-up model of the Reference Energy and Environmental System of Slovenia (REES-SLO) was carried out. During the redevelopment of the REES-SLO model, a trade-off between the simulation and optimisation approaches was made, favouring the presentation of relations between controls and their effects rather than the elusive optimality of results, which can be misleading for small energy systems. Scenario-based planning was integrated into the MESAP (Modular Energy System Analysis and Planning) environment, allowing integration of past, present and planned (calculated) data in a comprehensive overall system. Within the paper, the main technical, economic and environmental characteristics of the Slovenian energy system model REES-SLO are described. This paper presents a new approach to modelling relatively small energy systems which goes beyond investment in particular technologies or categories of technology and allows a smooth transition to a low-carbon economy. The presented research work confirms that the transition from an environmentally unfriendly fossil-fuelled economy to sustainable and climate-friendly development requires a new approach, which must be based on excellent knowledge of alternative development possibilities and especially awareness of new opportunities in the exploitation of energy efficiency and renewable energy sources.

  10. Difference-based Model Synchronization in an Industrial MDD Process

    DEFF Research Database (Denmark)

    Könemann, Patrick; Kindler, Ekkart; Unland, Ludger

    2009-01-01

    Models play a central role in model-driven software engineering. There are different kinds of models during the development process, which are related to each other and change over time. Therefore, it is difficult to keep the different models consistent with each other. Consistency of different m...... model versions, and for synchronizing other types of models. The main concern is to apply our concepts to an industrial process, in particular keeping usability and performance in mind. Keywords: Model Differencing, Model Merging, Model Synchronization...

  11. On the suitability of BPMN for business process modelling

    NARCIS (Netherlands)

    Wohed, P.; Aalst, van der W.M.P.; Dumas, M.; Hofstede, ter A.H.M.; Russell, N.C.; Dustdar, S.; Fiadeiro, J.L.; Sheth, A.

    2006-01-01

    In this paper we examine the suitability of the Business Process Modelling Notation (BPMN) for business process modelling, using the Workflow Patterns as an evaluation framework. The Workflow Patterns are a collection of patterns developed for assessing control-flow, data and resource capabilities

  12. Advanced computational modelling for drying processes – A review

    International Nuclear Information System (INIS)

    Defraeye, Thijs

    2014-01-01

    Highlights: • Understanding the product dehydration process is a key aspect in drying technology. • Advanced modelling thereof plays an increasingly important role for developing next-generation drying technology. • Dehydration modelling should be more energy-oriented. • An integrated “nexus” modelling approach is needed to produce more energy-smart products. • Multi-objective process optimisation requires development of more complete multiphysics models. - Abstract: Drying is one of the most complex and energy-consuming chemical unit operations. R and D efforts in drying technology have skyrocketed in the past decades, as new drivers emerged in this industry next to procuring prime product quality and high throughput, namely reduction of energy consumption and carbon footprint as well as improving food safety and security. Solutions are sought in optimising existing technologies or developing new ones which increase energy and resource efficiency, use renewable energy, recuperate waste heat and reduce product loss, thus also the embodied energy therein. Novel tools are required to push such technological innovations and their subsequent implementation. Particularly computer-aided drying process engineering has a large potential to develop next-generation drying technology, including more energy-smart and environmentally-friendly products and dryer systems. This review paper deals with rapidly emerging advanced computational methods for modelling dehydration of porous materials, particularly for foods. Drying is approached as a combined multiphysics, multiscale and multiphase problem. These advanced methods include computational fluid dynamics, several multiphysics modelling methods (e.g. conjugate modelling), multiscale modelling and modelling of material properties and the associated propagation of material property variability. Apart from the current challenges for each of these, future perspectives should be directed towards material property
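
    As a point of contrast with the advanced multiphysics models surveyed here, the simplest dehydration description is a single-physics Fickian diffusion model; the sketch below is such a baseline, with all parameter values invented for illustration.

        import numpy as np

        # Minimal 1D Fickian moisture-diffusion sketch for a drying slab, a
        # single-physics baseline far simpler than the conjugate/multiscale
        # models the review discusses. Parameter values are illustrative only.
        D = 1e-9          # effective moisture diffusivity, m^2/s (assumed)
        L = 5e-3          # slab half-thickness, m
        n = 51            # grid points
        dx = L / (n - 1)
        dt = 0.4 * dx**2 / D          # explicit stability limit (Fo <= 0.5)
        X = np.full(n, 5.0)           # initial moisture content, kg/kg dry basis
        X_surf = 0.5                  # surface equilibrium moisture content

        for _ in range(20000):
            X[-1] = X_surf                          # surface boundary condition
            lap = (X[2:] - 2 * X[1:-1] + X[:-2]) / dx**2
            X[1:-1] += dt * D * lap                 # interior update
            X[0] = X[1]                             # symmetry at the slab centre

        print("mean moisture after %.0f s: %.3f" % (20000 * dt, X.mean()))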

  13. DEVELOPMENT OF PERFORMANCE MODEL FOR QUALITY AND PROCESS IMPROVEMENT IN BUSINESS PROCESS SERVICE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Samson Oludapo

    2017-06-01

    Full Text Available When it comes to performance improvement processes, literature abounds with lean, agile and lean-agile. Over the years, the implementation of the lean and agile improvement processes has met with resounding success in the manufacturing, production, and construction industries. For this reason, there is an interest in developing a performance process for the business process service industry incorporating the key aspects of lean and agile theory extracted from the extant literature. The researcher reviewed a total of 750 scholarly articles, grouped them according to their relationship to the central theme - lean or agile - and thereafter used factor analysis under the principal component method to explain the relationships among the items. The result of this study showed that firms focusing on cost will minimize the investment of resources in business operations; this, in turn, will lead to difficulties in responding to changing customers' requirements in terms of volume, delivery, and new products. The implication is that in the long run a cost-focused strategy negatively influences flexibility.
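
    The principal-component step used in the paper can be illustrated with a short sketch; the simulated 'item' matrix stands in for the article-coding data, which is not reproduced here.

        import numpy as np

        # Sketch of the principal-component extraction behind a factor analysis.
        # The 'survey item' data are simulated, not the 750-article coding used
        # in the study.
        rng = np.random.default_rng(1)
        items = rng.normal(size=(200, 10))            # 200 responses, 10 items

        Z = (items - items.mean(0)) / items.std(0)    # standardize each item
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)

        explained = s**2 / np.sum(s**2)               # variance explained per component
        loadings = Vt.T * s / np.sqrt(len(Z) - 1)     # item loadings on components
        print("variance explained:", np.round(explained[:3], 3))
        print("loadings on first component:", np.round(loadings[:, 0], 2))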

  14. The development of a sustainable development model framework

    International Nuclear Information System (INIS)

    Hannoura, Alim P.; Cothren, Gianna M.; Khairy, Wael M.

    2006-01-01

    The emergence of the 'sustainable development' concept as a response to the mining of natural resources for the benefit of multinational corporations has advanced the cause of long-term environmental management. A sustainable development model (SDM) framework that is inclusive of the 'whole' natural environment is presented to illustrate the integration of the sustainable development of the 'whole' ecosystem. The ecosystem approach is an inclusive framework that covers the natural environment's relevant features and constraints. These are dynamically interconnected and constitute the determinants of the resources development component of the SDM. The second component of the SDM framework is the resources development patterns, i.e., the use of land, water, and atmospheric resources. All of these patterns include practices that utilize environmental resources to achieve a predefined outcome, producing waste and by-products that require disposal into the environment. The water quality management practices represent the third component of the framework. These practices are governed by standards, limitations and available disposal means subject to quantity and quality permits. These interconnected standards, practices and permits shape the resulting environmental quality of the ecosystem under consideration. A fourth component of the SDM framework, environmental indicators, provides a measure of the ecosystem productivity and status that may differ based on societal values and culture. The four components of the SDM are interwoven into an outcome assessment process to form the management and feedback models. The concept of sustainable development is expressed in the management model as an objective function subject to desired constraints imposing the required bounds for achieving ecosystem sustainability. The development of the objective function and constraints requires monetary values for ecosystem functions, resources development activities and environmental costs.

  15. Integrated modelling of near field and engineered barrier system processes

    International Nuclear Information System (INIS)

    Lamont, A.; Gansemer, J.

    1994-01-01

    The Yucca Mountain Integrating Model (YMIM) is an integrated model of the Engineered Barrier System that has been developed to assist project managers at LLNL in identifying areas where research emphasis should be placed. The model was designed to be highly modular so that a model of an individual process could be easily modified or replaced without interfering with the models of other processes. The modules modelling container failure and the dissolution of nuclides include particularly detailed, temperature-dependent models of their corresponding processes.

  16. Model of diffusers / permeators for hydrogen processing

    International Nuclear Information System (INIS)

    Jacobs, W. D.; Hang, T.

    2008-01-01

    Palladium-silver (Pd-Ag) diffusers are mainstays of hydrogen processing. Diffusers separate hydrogen from inert species such as nitrogen, argon or helium. The tubing becomes permeable to hydrogen when heated to more than 250 °C and a differential pressure is created across the membrane. The hydrogen diffuses better at higher temperatures. Experimental or experiential results have been the basis for determining or predicting a diffuser's performance. However, the process can be mathematically modeled, and comparison to experimental or other operating data can be utilized to improve the fit of the model. The goal is a reliable model-based diffuser system design, which will have impacts on tritium and hydrogen processing. A computer model has been developed to solve the differential equations for diffusion given the operating boundary conditions. The model was compared to operating data for a low-pressure diffuser system. The modeling approach and the results are presented in this paper. (authors)
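
    The paper's own equations are not given in the abstract; the sketch below instead uses the textbook Sieverts'-law description of hydrogen permeation through Pd-Ag, with illustrative numbers, to show the kind of calculation such a model performs.

        import numpy as np

        # Minimal Sieverts'-law sketch of hydrogen permeation along a Pd-Ag
        # tube, discretized into segments. A textbook description, not the
        # authors' actual model; all parameter values are illustrative.
        phi = 1e-8      # permeability, mol m^-1 s^-1 Pa^-0.5 (assumed)
        wall = 2e-4     # wall thickness, m
        diam = 3e-3     # tube diameter, m
        length = 0.5    # tube length, m
        n_seg = 100
        area = np.pi * diam * length / n_seg   # membrane area per segment

        feed = 1e-3                # feed flow of H2, mol/s
        p_ret, p_perm = 2e5, 1e3   # retentate / permeate pressures, Pa

        remaining = feed
        for _ in range(n_seg):
            # Sieverts' law: flux scales with the square-root pressure difference
            flux = phi / wall * (np.sqrt(p_ret) - np.sqrt(p_perm))  # mol m^-2 s^-1
            remaining = max(remaining - flux * area, 0.0)

        print("H2 recovered: %.1f %%" % (100 * (1 - remaining / feed)))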

  17. A Case for Declarative Process Modelling: Agile Development of a Grant Application System

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2014-01-01

    We present a new declarative model with composition and hierarchical definition of processes, featuring (a) incremental refinement, (b) adaptation of processes, and (c) dynamic creation of sub-processes. The approach is motivated and exemplified by a recent case management solution delivered by our...... (complex) events, which dynamically instantiate sub-processes. The extensions are realised and supported by a prototype simulation tool....

  18. Modeling process-structure-property relationships for additive manufacturing

    Science.gov (United States)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Yu, Cheng; Liu, Zeliang; Lian, Yanping; Wolff, Sarah; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-02-01

    This paper presents our latest work on comprehensive modeling of process-structure-property relationships for additive manufacturing (AM) materials, including using data-mining techniques to close the cycle of design-predict-optimize. To illustrate the process-structure relationship, the multi-scale multi-physics process modeling starts from the micro-scale to establish a mechanistic heat source model, to the meso-scale models of individual powder particle evolution, and finally to the macro-scale model to simulate the fabrication process of a complex product. To link structure and properties, a high-efficiency mechanistic model, self-consistent clustering analyses, is developed to capture a variety of material responses. The model incorporates factors such as voids, phase composition, inclusions, and grain structures, which are the differentiating features of AM metals. Furthermore, we propose data-mining as an effective solution for novel rapid design and optimization, which is motivated by the numerous influencing factors in the AM process. We believe this paper will provide a roadmap to advance AM fundamental understanding and guide the monitoring and advanced diagnostics of AM processing.

  19. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane-wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile etc.) developed from the solution of the associated boundary value problem as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  20. 3rd International Conference on Modelling and Management of Engineering Processes

    CERN Document Server

    Gericke, Kilian; Szélig, Nikoletta; Vajna, Sándor

    2015-01-01

    Innovative processes for the development of products and services are more and more considered as an organisational capability, which is recognised to be increasingly important for business success in today’s competitive environment. However, management and academia need a more profound understanding of these processes in order to develop improved management approaches to exploit business potentials. This book contains the proceedings of the 3rd International Conference on Modelling and Management of Engineering Processes (MMEP2013) held in Magdeburg, Germany, in November 2013. It includes contributions from international leading researchers in the fields of process modelling and process management. The conference topics were recent trends in modelling and management of engineering processes, potential synergies between different modelling approaches, future challenges for the management of engineering processes as well as future research in these areas.

  1. On Support Functions for the Development of MFM Models

    DEFF Research Database (Denmark)

    Heussen, Kai; Lind, Morten

    2012-01-01

    A modeling environment and methodology are necessary to ensure quality and reusability of models in any domain. For MFM in particular, as a tool for modeling complex systems, awareness has been increasing for this need. Introducing the context of modeling support functions, this paper provides a review of MFM applications, and contextualizes the model development with respect to process design and operation knowledge. Developing a perspective for an environment for MFM-oriented model- and application-development, a tool-chain is outlined and relevant software functions are discussed. With a perspective on MFM-modeling for existing processes and automation design, modeling stages and corresponding formal model properties are identified. Finally, practically feasible support functions and model-checks to support the model development are suggested.

  2. Process development

    International Nuclear Information System (INIS)

    Zapata G, G.

    1989-01-01

    Process development: The paper describes the organization and laboratory facilities of the group working on radioactive ore processing studies. It contains a review of the research carried out and plans for the near future. A list of the published reports is also presented

  3. Semantics and analysis of business process models in BPMN

    NARCIS (Netherlands)

    Dijkman, R.M.; Dumas, M.; Ouyang, C.

    2008-01-01

    The Business Process Modelling Notation (BPMN) is a standard for capturing business processes in the early phases of systems development. The mix of constructs found in BPMN makes it possible to create models with semantic errors. Such errors are especially serious, because errors in the early phases of systems development are costly to correct.

  4. Waste immobilization process development at the Savannah River Plant

    International Nuclear Information System (INIS)

    Charlesworth, D.L.

    1986-01-01

    Processes to immobilize various wasteforms, including waste salt solution, transuranic waste, and low-level incinerator ash, are being developed. Wasteform characteristics, process and equipment details, and results from field/pilot tests and mathematical modeling studies are discussed

  5. Modeling Aspects of Activated Sludge Processes Part II: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    Energy Technology Data Exchange (ETDEWEB)

    AbdElHaleem, H S [Cairo Univ. - Civil Eng. Dept., Giza (Egypt)]; El-Ahwany, A H [Cairo University - Faculty of Engineering - Chemical Engineering Department, Giza (Egypt)]; Ibrahim, H I [Helwan University - Faculty of Engineering - Biomedical Engineering Department, Helwan (Egypt)]; Ibrahim, G [Menofia University - Faculty of Engineering Shebin El Kom - Basic Eng. Sc. Dept., Menofia (Egypt)]

    2004-07-01

    Mathematical process modeling and biokinetics of the activated sludge process were reviewed considering different types of models. The task group models ASM1, ASM2 and ASM3, versioned by Henze et al., were evaluated considering the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase. In this type of model, the internal mass transfer inside the flocs is neglected; hence, the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model. This type considers the mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.

  6. Modeling Aspects of Activated Sludge Processes Part II: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    International Nuclear Information System (INIS)

    AbdElHaleem, H.S.; El-Ahwany, A. H.; Ibrahim, H.I.; Ibrahim, G.

    2004-01-01

    Mathematical process modeling and biokinetics of the activated sludge process were reviewed considering different types of models. The task group models ASM1, ASM2 and ASM3, versioned by Henze et al., were evaluated considering the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase. In this type of model, the internal mass transfer inside the flocs is neglected; hence, the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model. This type considers the mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.
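
    The Monod growth kinetics at the core of the ASM family can be sketched as a small ODE system; this is the one-phase (homogeneous) picture discussed above, with illustrative parameter values rather than calibrated ones.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Homogeneous (one-phase) activated-sludge sketch: Monod growth kinetics
        # with substrate consumption, the building block of the ASM family.
        mu_max, K_s, Y, b = 4.0, 10.0, 0.6, 0.1   # 1/d, mg/L, gX/gS, 1/d

        def asm_rhs(t, y):
            S, X = y
            S = max(S, 0.0)                        # guard against overshoot
            mu = mu_max * S / (K_s + S)            # Monod specific growth rate
            return [-mu * X / Y,                   # substrate utilization
                    (mu - b) * X]                  # biomass growth minus decay

        sol = solve_ivp(asm_rhs, (0, 5), [200.0, 50.0])
        print("S, X after 5 d:", np.round(sol.y[:, -1], 1))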

  7. Modelling Of Flotation Processes By Classical Mathematical Methods - A Review

    Science.gov (United States)

    Jovanović, Ivana; Miljanović, Igor

    2015-12-01

    Flotation process modelling is not a simple task, mostly because of the process complexity, i.e. the presence of a large number of variables that (to a lesser or a greater extent) affect the final outcome of the mineral particle separation based on the differences in their surface properties. The attempts toward the development of a quantitative predictive model that would fully describe the operation of an industrial flotation plant started in the middle of the past century and continue to this day. This paper gives a review of published research activities directed toward the development of flotation models based on classical mathematical rules. The description and systematization of classical flotation models were performed according to the available references, with emphasis given exclusively to the modelling of the flotation process, regardless of the model application in a certain control system. In accordance with contemporary considerations, models were classified as the empirical, probabilistic, kinetic and population balance types. Each model type is presented through the aspects of flotation modelling at the macro and micro process levels.
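
    Among the kinetic model types mentioned, the classical first-order model R(t) = R_inf (1 - exp(-k t)) is the simplest; the sketch below fits it to an invented recovery-time series.

        import numpy as np
        from scipy.optimize import curve_fit

        # Classical first-order flotation kinetics, one of the kinetic model
        # types the review systematizes. The recovery-time data are invented.
        t = np.array([0.5, 1, 2, 4, 8, 12])            # flotation time, min
        R = np.array([18, 31, 48, 66, 79, 83]) / 100   # cumulative recovery

        def first_order(t, R_inf, k):
            return R_inf * (1 - np.exp(-k * t))

        (R_inf, k), _ = curve_fit(first_order, t, R, p0=(0.9, 0.3))
        print("R_inf = %.2f, k = %.2f 1/min" % (R_inf, k))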

  8. SPEEDUP modeling of the defense waste processing facility at the SRS

    International Nuclear Information System (INIS)

    Smith, F.G. III.

    1997-01-01

    A computer model has been developed for the dynamic simulation of batch process operations within the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS). The DWPF chemically treats high-level waste materials from the site tank farm and vitrifies the resulting slurry into a borosilicate glass for permanent disposal. The DWPF consists of three major processing areas: Salt Processing Cell (SPC), Chemical Processing Cell (CPC) and the Melt Cell. A fully integrated model of these process units has been developed using the SPEEDUP trademark software from Aspen Technology. Except for glass production in the Melt Cell, all of the chemical operations within DWPF are batch processes. Since SPEEDUP is designed for dynamic modeling of continuous processes, considerable effort was required to devise batch process algorithms. This effort was successful and the model is able to simulate batch operations and the dynamic behavior of the process. The model also includes an optimization calculation that maximizes the waste content in the final glass product. In this paper, we will describe the process model in some detail and present preliminary results from a few simulation studies.

  9. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety...... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  10. A model-based approach to on-line process disturbance management

    International Nuclear Information System (INIS)

    Kim, I.S.

    1988-01-01

    The methodology developed can be applied to the design of a real-time expert system to aid control-room operators in coping with process abnormalities. The approach encompasses the diverse functional aspects required for effective on-line process disturbance management: (1) intelligent process monitoring and alarming, (2) on-line sensor data validation, (3) on-line sensor and hardware (except sensors) fault diagnosis, and (4) real-time corrective measure synthesis. Accomplishment of these functions is made possible through the application of various models: goal-tree/success-tree, process monitor-tree, sensor failure diagnosis, and hardware failure diagnosis models. The models used in the methodology facilitate not only the knowledge-acquisition process - a bottleneck in the development of an expert system - but also the reasoning process of the knowledge-based system. These transparent models and model-based reasoning significantly enhance the maintainability of real-time expert systems. The proposed approach was applied to the feedwater control system of a nuclear power plant, and implemented in a real-time expert system, MOAS II, using the expert system shell PICON on the LMI machine.

  11. Dry process fuel performance technology development

    International Nuclear Information System (INIS)

    Kang, Kweon Ho; Kim, K. W.; Kim, B. K.

    2006-06-01

    The objective of the project is to establish the performance evaluation system of DUPIC fuel during the Phase III R and D. In order to fulfil these objectives, property model development for DUPIC fuel and an irradiation test in Hanaro using the instrumented rig were carried out. Also, the analysis of the in-reactor behavior of DUPIC fuel, out-pile tests using simulated DUPIC fuel, as well as performance and integrity assessments in a commercial reactor were performed during this Phase. The R and D results of Phase III are summarized as follows: fabrication process establishment of simulated DUPIC fuel for property measurement; property model development for the DUPIC fuel; performance evaluation of DUPIC fuel via irradiation test in Hanaro; post-irradiation examination of irradiated fuel and performance analysis; development of the DUPIC fuel performance code (KAOS)

  12. Dry process fuel performance technology development

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Kweon Ho; Kim, K. W.; Kim, B. K. (and others)

    2006-06-15

    The objective of the project is to establish the performance evaluation system of DUPIC fuel during the Phase III R and D. In order to fulfil these objectives, property model development for DUPIC fuel and an irradiation test in Hanaro using the instrumented rig were carried out. Also, the analysis of the in-reactor behavior of DUPIC fuel, out-pile tests using simulated DUPIC fuel, as well as performance and integrity assessments in a commercial reactor were performed during this Phase. The R and D results of Phase III are summarized as follows: fabrication process establishment of simulated DUPIC fuel for property measurement; property model development for the DUPIC fuel; performance evaluation of DUPIC fuel via irradiation test in Hanaro; post-irradiation examination of irradiated fuel and performance analysis; development of the DUPIC fuel performance code (KAOS)

  13. The software development process at the Chandra X-ray Center

    Science.gov (United States)

    Evans, Janet D.; Evans, Ian N.; Fabbiano, Giuseppina

    2008-08-01

    Software development for the Chandra X-ray Center Data System began in the mid-1990s, and the waterfall model of development was mandated by our documents. Although we initially tried this approach, we found that a process with elements of the spiral model worked better in our science-based environment. High-level science requirements are usually established by scientists, and provided to the software development group. We follow with review and refinement of those requirements prior to the design phase. Design reviews are conducted for substantial projects within the development team, and include scientists whenever appropriate. Development follows agreed-upon schedules that include several internal releases of the task before completion. Feedback from science testing early in the process helps to identify and resolve misunderstandings present in the detailed requirements, and allows review of intangible requirements. The development process includes specific testing of requirements, developer and user documentation, and support after deployment to operations or to users. We discuss the process we follow at the Chandra X-ray Center (CXC) to develop software and support operations. We review the role of the science and development staff from conception to release of software, and some lessons learned from managing CXC software development for over a decade.

  14. Guideline validation in multiple trauma care through business process modeling.

    Science.gov (United States)

    Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen

    2003-07-01

    Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery, a specific guideline is available in paper-based form as a set of flowcharts. This format is appropriate for use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal and logically consistent version, represented with a standardized meta-model, is necessary for automatic processing. In our project we transferred the paper-based version into an electronic format and analyzed the structure with respect to formal errors. Several errors were detected in seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step, the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process modeling tools, which check the content in comparison to a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure the sustainability of guideline development, a representation independent of specific applications or a specific provider is necessary. Then, clinical guidelines could be used for eLearning, process optimization and workflow management additionally.

  15. Bridging process-based and empirical approaches to modeling tree growth

    Science.gov (United States)

    Harry T. Valentine; Annikki Makela

    2005-01-01

    The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...

  16. Development of the Fischer-Tropsch Process: From the Reaction Concept to the Process Book

    Directory of Open Access Journals (Sweden)

    Boyer C.

    2016-05-01

    Full Text Available The process development by IFP Energies nouvelles (IFPEN)/ENI/Axens of a Fischer-Tropsch process is described. This development is based on upstream process studies to choose the process scheme, reactor technology and operating conditions, and downstream work to summarize all development work in a process guide. A large amount of work was devoted to the catalyst performances on one hand and the scale-up of the slurry bubble reactor with dedicated complementary tools on the other hand. Finally, an original approach was implemented to validate both the process and catalyst on an industrial scale by combining a 20 bpd unit in ENI’s Sannazzaro refinery with cold mock-ups equivalent to 20 and 1 000 bpd at IFPEN and a special “Large Validation Tool” (LVT) which reproduces the combined effect of chemical reaction condition stress and mechanical stress equivalent to a 15 000 bpd industrial unit. Dedicated analytical techniques and a dedicated model were developed to simulate the whole process (reactor and separation train), integrating a high level of complexity and phenomena coupling, to scale up the process on a robust, reliable basis to an industrial scale.

  17. Thermal analysis of fused deposition modeling process using infrared thermography imaging and finite element modeling

    Science.gov (United States)

    Zhou, Xunfei; Hsieh, Sheng-Jen

    2017-05-01

    After years of development, Fused Deposition Modeling (FDM) has become the most popular technique in commercial 3D printing due to its cost effectiveness and easy-to-operate fabrication process. Mechanical strength and dimensional accuracy are two of the most important factors for the reliability of FDM products. However, the solid-liquid-solid state changes of material in the FDM process make it difficult to monitor and model. In this paper, an experimental model was developed to apply a cost-effective infrared thermography imaging method to acquire the temperature history of filaments at the interface and their corresponding cooling mechanism. A three-dimensional finite element model was constructed to simulate the same process using the element "birth and death" feature and validated with the thermal response from the experimental model. In 6 of 9 experimental conditions, a maximum difference of 13% existed between the experimental and numerical models. This work suggests that numerical modeling of the FDM process is reliable and can facilitate better understanding of bead spreading and road-to-road bonding mechanics during fabrication.

  18. Simulation and Development of Internal Model Control Applications in the Bayer Process

    Science.gov (United States)

    Colombé, Ph.; Dablainville, R.; Vacarisas, J.

    A traditional PID feedback control system is limited in its use in the Bayer cycle due to the important and omnipresent time delays, which can lead to stability problems and sluggish response. Advanced modern control techniques are available, but in an industrial environment they suffer from a lack of simplicity and robustness. In this respect the Internal Model Control (IMC) method may be considered an exception. After a brief review of the basic theoretical principles behind IMC, an IMC scheme is developed to work with single-input, single-output, discrete-time, nonlinear systems. Two applications of IMC in the Bayer process, both in simulations and on industrial plants, are then described: control of the caustic soda concentration of the aluminate liquor and control of the Al2O3/Na2O caustic ratio of the digested slurry. Finally, the results obtained make this technique quite attractive for the alumina industry.
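
    The IMC structure (an internal model running in parallel with the plant, and a filtered model inverse as controller) can be shown with a toy first-order, discrete-time loop; the plant and model numbers below are invented and deliberately mismatched, and this is a sketch of the structure only, not the Bayer-process controllers themselves.

        # Minimal discrete-time IMC sketch for a first-order plant with a
        # plant/model mismatch. Offset-free tracking emerges from feeding back
        # only the plant-model difference (the estimated disturbance).
        a_p, b_p = 0.90, 0.10      # "true" plant:  y+ = a_p*y + b_p*u
        a_m, b_m = 0.85, 0.12      # internal model (deliberately mismatched)
        alpha = 0.5                # IMC filter pole: smaller = more aggressive

        y = y_model = v = 0.0
        setpoint = 1.0
        for _ in range(60):
            d_hat = y - y_model                # estimated disturbance/mismatch
            e = setpoint - d_hat               # feedback enters here only
            v += (1 - alpha) * (e - v)         # first-order IMC filter
            u = (v - a_m * y_model) / b_m      # model inverse: drive model to v
            y_model = a_m * y_model + b_m * u  # internal model update
            y = a_p * y + b_p * u              # plant update
        print("output after 60 steps: %.3f" % y)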

  19. Modelling of innovative SANEX process mal-operations

    International Nuclear Information System (INIS)

    McLachlan, F.; Taylor, R.; Whittaker, D.; Woodhead, D.; Geist, A.

    2016-01-01

    The innovative (i-) SANEX process for the separation of minor actinides from PUREX highly active raffinate is expected to employ a solvent phase comprising 0.2 M TODGA with 5 v/v% 1-octanol in an inert diluent. An initial extract / scrub section would be used to extract trivalent actinides and lanthanides from the feed whilst leaving other fission products in the aqueous phase, before the loaded solvent is contacted with a low acidity aqueous phase containing a sulphonated bis-triazinyl pyridine ligand (BTP) to effect a selective strip of the actinides, so yielding separate actinide (An) and lanthanide (Ln) product streams. This process has been demonstrated in lab scale trials at Juelich (FZJ). The SACSESS (Safety of Actinide Separation processes) project is focused on the evaluation and improvement of the safety of such future systems. A key element of this is the development of an understanding of the response of a process to upsets (mal-operations). It is only practical to study a small subset of possible mal-operations experimentally and consideration of the majority of mal-operations entails the use of a validated dynamic model of the process. Distribution algorithms for HNO3, Am, Cm and the lanthanides have been developed and incorporated into a dynamic flowsheet model that has, so far, been configured to correspond to the extract-scrub section of the i-SANEX flowsheet trial undertaken at FZJ in 2013. Comparison is made between the steady state model results and experimental results. Results from modelling of low acidity and high temperature mal-operations are presented. (authors)

  20. Applying the Business Process and Practice Alignment Meta-model: Daily Practices and Process Modelling

    Directory of Open Access Journals (Sweden)

    Ventura Martins Paula

    2017-03-01

    Full Text Available Background: Business Process Modelling (BPM) is one of the most important phases of information system design. Business Process (BP) meta-models allow capturing informational and behavioural aspects of business processes. Unfortunately, standard BP meta-modelling approaches focus just on process description, providing different BP models. It is not possible to compare and identify related daily practices in order to improve BP models. This lack of information implies that further research in BP meta-models is needed to reflect the evolution/change in BP. Considering this limitation, this paper introduces a new BP meta-model, designated the Business Process and Practice Alignment Meta-model (BPPAMeta-model). Our intention is to present a meta-model that addresses features related to the alignment between daily work practices and BP descriptions. Objectives: This paper intends to present a meta-model which integrates daily work information into coherent and sound process definitions. Methods/Approach: The methodology employed in the research follows a design-science approach. Results: The results of the case study are related to the application of the proposed meta-model to align the specification of a BP model with work practices models. Conclusions: This meta-model can be used within the BPPAM methodology to specify or improve business process models based on work practice descriptions.

  1. Developing a framework to model the primary drying step of a continuous freeze-drying process based on infrared radiation

    DEFF Research Database (Denmark)

    Van Bockstal, Pieter-Jan; Corver, Jos; Mortier, Séverine Thérèse F.C.

    2018-01-01

    ...... requires the fundamental mechanistic modelling of each individual process step. Therefore, a framework is presented for the modelling and control of the continuous primary drying step based on non-contact IR radiation. The IR radiation emitted by the radiator filaments passes through various materials...... These results assist in the selection of proper materials which could serve as an IR window in the continuous freeze-drying prototype. The modelling framework presented in this paper fits the model-based design approach used for the development of this prototype and shows the potential benefits of this design approach.

  2. Kinetics model development of cocoa bean fermentation

    Science.gov (United States)

    Kresnowati, M. T. A. P.; Gunawan, Agus Yodi; Muliyadini, Winny

    2015-12-01

    Although Indonesia is one of the biggest cocoa bean producers in the world, Indonesian cocoa beans are often of low quality and thereby frequently priced low in the world market. In order to improve the quality, adequate post-harvest cocoa processing techniques are required. Fermentation is the vital stage in the series of cocoa bean post-harvest processing steps that can improve the quality of cocoa beans, in particular taste, aroma, and colour. During the fermentation process, a combination of microbes grows, producing metabolites that serve as the precursors for cocoa bean flavour. The microbial composition and thereby their activities will affect the fermentation performance and influence the properties of cocoa beans. The correlation can be reviewed using a kinetic model that includes unstructured microbial growth, substrate utilization and metabolic product formation. The developed kinetic model can be further used to design the cocoa bean fermentation process to meet the expected quality. Furthermore, the development of a kinetic model of cocoa bean fermentation also serves as a good case study of mixed-culture solid-state fermentation, which has rarely been studied. This paper presents the development of a kinetic model for solid-state cocoa bean fermentation using an empirical approach. A series of lab-scale cocoa bean fermentations, either natural fermentations without starter addition or fermentations with mixed yeast and lactic acid bacteria starter addition, was used for model parameter estimation. The results showed that cocoa bean fermentation can be modelled mathematically, and the best model included substrate utilization, microbial growth, metabolite production and its transport. Although the developed model still cannot explain the dynamics in the microbial population, it can sufficiently explain the observed changes in sugar concentration as well as metabolic products in the cocoa bean pulp.
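
    An empirical kinetic structure of the kind the paper describes (growth, sugar utilization, metabolite formation) can be sketched with logistic growth and Luedeking-Piret product kinetics; these equations and all parameter values are stand-ins, not the authors' fitted model.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Empirical fermentation-kinetics sketch: microbial growth, sugar
        # consumption in the pulp, and metabolite (e.g. ethanol) formation.
        mu, X_max = 0.25, 8.0        # 1/h, g/L
        Y_xs, m_s = 0.5, 0.01        # biomass yield, maintenance coefficient
        alpha, beta = 0.4, 0.02      # growth / non-growth associated production

        def rhs(t, y):
            X, S, P = y
            dX = mu * X * (1 - X / X_max) * (S > 0)   # logistic growth while sugar lasts
            dS = -(dX / Y_xs + m_s * X) if S > 0 else 0.0
            dP = alpha * dX + beta * X                # Luedeking-Piret product formation
            return [dX, dS, dP]

        sol = solve_ivp(rhs, (0, 72), [0.1, 60.0, 0.0], max_step=0.5)
        print("biomass, sugar, product at 72 h:", np.round(sol.y[:, -1], 2))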

  3. Theoretical Modelling of Intercultural Communication Process

    Directory of Open Access Journals (Sweden)

    Mariia Soter

    2016-08-01

    Full Text Available The definitions of the concepts of “communication”, “intercultural communication” and “model of communication” are analyzed in the article. The basic components of the communication process are singled out, and a model of intercultural communication is developed. Communicative, behavioral and complex skills for the optimal organization of intercultural communication, the establishment of productive contact with a foreign partner to achieve mutual understanding, and the search for acceptable ways of organizing interaction and cooperation for both communicants are highlighted in the article. It is noted that intercultural communication, through interaction between people, affects the development of different aspects of cultures.

  4. Development of anti-inflammatory drugs - the research and development process.

    Science.gov (United States)

    Knowles, Richard Graham

    2014-01-01

    The research and development process for novel drugs to treat inflammatory diseases is described, and several current issues and debates relevant to this are raised: the decline in productivity, attrition, challenges and trends in developing anti-inflammatory drugs, the poor clinical predictivity of experimental models of inflammatory diseases, heterogeneity within inflammatory diseases, 'improving on the Beatles' in treating inflammation, and the relationships between big pharma and biotechs. The pharmaceutical research and development community is responding to these challenges in multiple ways which it is hoped will lead to the discovery and development of a new generation of anti-inflammatory medicines. © 2013 Nordic Pharmacological Society. Published by John Wiley & Sons Ltd.

  5. Modelling of a Naphtha Recovery Unit (NRU) with Implications for Process Optimization

    Directory of Open Access Journals (Sweden)

    Jiawei Du

    2018-06-01

    Full Text Available The naphtha recovery unit (NRU) is an integral part of the processes used in the oil sands industry for bitumen extraction. The principal role of the NRU is to recover naphtha from the tailings for reuse in this process. This process is energy-intensive, and environmental guidelines for naphtha recovery must be met. Steady-state models for the NRU system are developed in this paper using two different approaches. The first approach is a statistical, data-based modelling approach where linear regression models have been developed using Minitab® from plant data collected during a performance test. The second approach involves the development of a first-principles model in Aspen Plus® based on the NRU process flow diagram. A novel refinement to this latter model, called “withdraw and remix”, is proposed based on comparing actual plant data to model predictions around the two units used to separate water and naphtha. The models developed in this paper suggest some interesting ideas for the further optimization of the process, in that it may be possible to achieve the required naphtha recovery using less energy. More plant tests are required to validate these ideas.

  6. Development of pure component property models for chemical product-process design and analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol Shivajirao

    information on the degree of accuracy of the property estimates. In addition, a method based on the ‘molecular structural similarity criteria’ is developed so that efficient use of knowledge of properties could be made in the development/improvement of property models. This method, in principle, can...... modeling such as: (i) quantity of property data used for the parameter regression; (ii) selection of the most appropriate form of the property model function; and (iii) the accuracy and thermodynamic consistency of predicted property values are also discussed. The developed models have been implemented...

  7. Process modeling study of the CIF incinerator

    International Nuclear Information System (INIS)

    Hang, T.

    1995-01-01

    The Savannah River Site (SRS) plans to begin operating the Consolidated Incineration Facility (CIF) in 1996. The CIF will treat liquid and solid low-level radioactive, mixed and RCRA hazardous wastes generated at SRS. In addition to experimental test programs, process modeling was applied to provide guidance in the areas of safety, environmental regulatory compliance, process improvement and optimization. A steady-state flowsheet model was used to calculate material/energy balances and to track key chemical constituents throughout the process units. Dynamic models were developed to predict the CIF transient characteristics in normal and abnormal operation scenarios. Predictions include the rotary kiln heat transfer, dynamic responses of the CIF to fluctuations in the solid waste feed or upsets in the system equipment, performance of the control system, air in-leakage in the kiln, etc. This paper reviews the modeling study performed to assist in the deflagration risk assessment.

  8. Modeling of ultrasonic processes utilizing a generic software framework

    Science.gov (United States)

    Bruns, P.; Twiefel, J.; Wallaschek, J.

    2017-06-01

    Modeling of ultrasonic processes is typically characterized by a high degree of complexity. Different domains and size scales must be regarded, so that it is rather difficult to build up a single detailed overall model. Developing partial models is a common approach to overcome this difficulty. In this paper a generic but simple software framework is presented which allows coupling arbitrary partial models by slave modules with well-defined interfaces and a master module for coordination. Two examples are given to present the developed framework. The first one is the parameterization of a load model for ultrasonically-induced cavitation. The piezoelectric oscillator, its mounting, and the process load are described individually by partial models. These partial models are then coupled using the framework. The load model is composed of spring-damper elements which are parameterized by experimental results. In the second example, the ideal mounting position for an oscillator utilized in ultrasonic-assisted machining of stone is determined. Partial models for the ultrasonic oscillator, its mounting, the simplified contact process, and the workpiece’s material characteristics are presented. For both applications input and output variables are defined to meet the requirements of the framework’s interface.
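
    The master/slave coupling idea can be sketched as a small interface plus a coordinator; the two modules below are placeholders standing in for the paper's partial models, and the dynamics are invented for illustration.

        # Sketch of the master/slave coupling idea: partial models implement a
        # small, well-defined interface and a master module coordinates them by
        # circulating a shared signal dictionary.
        from typing import Dict

        class SlaveModule:
            """Interface every partial model implements."""
            def step(self, inputs: Dict[str, float], dt: float) -> Dict[str, float]:
                raise NotImplementedError

        class Oscillator(SlaveModule):
            def __init__(self):
                self.amplitude = 0.0
            def step(self, inputs, dt):
                # crude first-order response to drive minus process load
                drive = inputs.get("drive", 1.0) - inputs.get("load_force", 0.0)
                self.amplitude += dt * (drive - self.amplitude)
                return {"amplitude": self.amplitude}

        class ProcessLoad(SlaveModule):
            def step(self, inputs, dt):
                # load force proportional to oscillation amplitude
                return {"load_force": 0.3 * inputs.get("amplitude", 0.0)}

        class Master:
            """Coordinates slaves; each slave only sees the shared signals."""
            def __init__(self, slaves):
                self.slaves = slaves
            def run(self, n_steps, dt):
                signals: Dict[str, float] = {}
                for _ in range(n_steps):
                    for slave in self.slaves:
                        signals.update(slave.step(signals, dt))
                return signals

        print(Master([Oscillator(), ProcessLoad()]).run(200, 0.05))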

  9. Determinantal point process models on the sphere

    DEFF Research Database (Denmark)

    Møller, Jesper; Nielsen, Morten; Porcu, Emilio

    We consider determinantal point processes on the d-dimensional unit sphere Sd. These are finite point processes exhibiting repulsiveness and with moment properties determined by a certain determinant whose entries are specified by a so-called kernel which we assume is a complex covariance function defined on Sd × Sd. We review the appealing properties of such processes, including their specific moment properties, density expressions and simulation procedures. Particularly, we characterize and construct isotropic DPP models on Sd, where it becomes essential to specify the eigenvalues and eigenfunctions in a spectral representation for the kernel, and we figure out how repulsive isotropic DPPs can be. Moreover, we discuss the shortcomings of adapting existing models for isotropic covariance functions and consider strategies for developing new models, including a useful spectral approach.
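
    For d = 2 the spectral representation mentioned above takes the standard Mercer-type form in spherical harmonics, sketched here; the constraint on the eigenvalues is the classical DPP existence condition.

        % Spectral representation of an isotropic kernel on the unit sphere S^2,
        % with Y_{lm} the spherical harmonics; for a DPP to exist the
        % eigenvalues must satisfy 0 <= \lambda_\ell <= 1.
        K(x, y) \;=\; \sum_{\ell = 0}^{\infty} \lambda_\ell
                      \sum_{m = -\ell}^{\ell} Y_{\ell m}(x)\,
                      \overline{Y_{\ell m}(y)},
        \qquad x, y \in \mathbb{S}^2 .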

  10. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten-salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort for ER process optimization is made by using computer models. A number of models have been developed for this purpose. But as validation of these models is incomplete and oftentimes problematic, the simulation results from these models are inherently uncertain.

  11. A simplified computational memory model from information processing.

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper proposes a computational model for memory from the viewpoint of information processing. The model, called simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to express the neuron or brain cortices based on biology and graph theories, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information processing view.

  12. Preparatory selection of sterilization regime for canned Natural Atlantic Mackerel with oil based on developed mathematical models of the process

    Directory of Open Access Journals (Sweden)

    Maslov A. A.

    2016-12-01

    Full Text Available The definition of preparatory parameters for the sterilization regime of canned "Natural Atlantic Mackerel with Oil" is the aim of the current study. PRSC software developed at the department of automation and computer engineering is used for the preparatory selection. To determine the parameters of the process model, the pre-trial process of sterilization and cooling in water with backpressure of canned "Natural Atlantic Mackerel with Oil" in can N 3 was performed in the laboratory autoclave AVK-30M. Information about the temperature in the autoclave sterilization chamber and in the can with product was gathered using Ellab TrackSense PRO loggers. From the obtained information, three transfer functions for the product model were identified: in the least heated area of the autoclave, the average heated and the most heated. Temporal temperature dependences in the sterilization chamber were built in the PRSC programme using this information. The model of the sterilization process of canned "Natural Atlantic Mackerel with Oil" was obtained after the pre-trial process. Then, in automatic mode, the sterilization regime of canned "Natural Atlantic Mackerel with Oil" was selected using a value of the actual effect close to the normative sterilizing effect (5.9 conditional minutes). Furthermore, in this study a step-mode sterilization of canned "Natural Atlantic Mackerel with Oil" was selected. Utilization of step-mode sterilization with a maximum temperature of 125 °C in the sterilization chamber allows reducing the process duration by 10 %. However, the application of this regime in practice requires additional research. The described approach, based on the developed mathematical models of the process, allows obtaining optimal step and variable canned-food sterilization regimes with high energy efficiency and product quality.
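
    The sterilizing effect ("conditional minutes") that drives the regime selection is commonly computed as F = ∫ 10^((T - T_ref)/z) dt; the sketch below evaluates it for an invented temperature history, with typical T_ref and z values that are not necessarily those used by the PRSC software.

        import numpy as np

        # Sterilizing-effect (F-value) sketch: F = integral of 10**((T - T_ref)/z) dt.
        # The temperature history is invented; T_ref and z are typical low-acid
        # canning values, assumptions rather than the paper's settings.
        T_ref, z = 121.1, 10.0                     # deg C

        t = np.linspace(0, 80, 161)                # process time, min
        T = np.clip(40 + 1.6 * t, 40, 125)         # crude heat-up to a 125 deg C hold
        T[t > 60] = 125 - 3.0 * (t[t > 60] - 60)   # cooling phase

        lethal_rate = 10.0 ** ((T - T_ref) / z)
        F = np.trapz(lethal_rate, t)
        print("sterilizing effect F = %.1f conditional minutes" % F)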

  13. ANALYTICAL NETWORK PROCESS (ANP) MODEL IN TOURISM DEVELOPMENT IN JEMBER

    Directory of Open Access Journals (Sweden)

    Sukidin Sukidin

    2015-04-01

    Full Text Available Abstract: Analytical Network Process (ANP) Model in Tourism Development in Jember. The purpose of this study is to conduct a review of the policy of tourism development in Jember, especially development policies for coffee plantation agro-tourism using the Jember Fashion Carnival (JFC) as event marketing. The research method used is soft system methodology using the Analytical Network Process. The result shows that tourism development in Jember is done using a conventional approach, lacks coordination, and merely focuses on a single tourism event, i.e. the JFC, as the locomotive of tourism attraction in Jember. This conventional development model needs to be redesigned to achieve sustainable tourism development in Jember. Keywords: paradigm shift, tourism industry, event tourism, agro-tourism

  14. Modeling of Multicomponent Mixture Separation Processes Using Hollow fiber Membrane

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sin-Ah; Kim, Jin-Kuk; Lee, Young Moo; Yeo, Yeong-Koo [Hanyang University, Seoul (Korea, Republic of)

    2015-02-15

    So far, most research activities on the modeling of membrane separation processes have focused on binary feed mixtures. But in actual separation operations, binary feeds are hard to find, and most separation processes involve multicomponent feed mixtures. In this work, models for membrane separation processes treating multicomponent feed mixtures are developed. Various model types are investigated, and the validity of the proposed models is analysed based on experimental data obtained using hollow-fiber membranes. The proposed separation models show quick convergence and exhibit good tracking performance.

  15. Numerical Modeling of Anaerobic Microzones Development in Bulk Oxic Porous Media: An Assessment of Different Microzone Formation Processes

    Science.gov (United States)

    Roy Chowdhury, S.; Zarnetske, J. P.; Briggs, M. A.; Day-Lewis, F. D.; Singha, K.

    2017-12-01

    Soil and groundwater research indicates that unique biogeochemical "microzones" commonly form within bulk soil masses. The formation of these microzones at the pore-scale has been attributed to a number of causes, including variability of in situ carbon or nutrient sources, intrinsic physical conditions that lead to dual-porosity and mass transfer conditions, or microbial bioclogging of the porous media. Each of these causes, while documented in different porous media systems, potentially can lead to the presence of anaerobic pores residing in a bulk oxic domain. The relative role of these causes operating independently or in conjunction with each other to form microzones is not known. Here, we use a single numerical modeling framework to assess the relative roles of each process in creating anaerobic microzones. Using a two-dimensional pore-network model, coupled with a microbial growth model based on Monod kinetics, simulations were performed to explore the development of these anoxic microzones and their fate under a range of hydrologic, nutrient, and microbial conditions. Initial results parameterized for a stream-groundwater exchange environment (i.e., a hyporheic zone) indicate that external forcing of fluid flux in the domain is a key control on anaerobic microzone development, as fluid flux governs the nutrient flux. The initial amount of biomass present in the system also plays a major role in the development of the microzones. In terms of dominant in situ causes, the intrinsic physical structure of the local pore space is found to play the key role in the development of anaerobic sites by regulating fluxes to reaction sites. Acknowledging and understanding the drivers of these microzones will improve the ability of multiple disciplines to measure and model reactive mass transport in soils and assess if they play a significant role for particular biogeochemical processes and ecosystem functions, such as denitrification and greenhouse gas production.
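
    The pore-scale balance that decides whether a pore turns anoxic, Monod-type uptake versus supply, can be caricatured in a few lines; this toy calculation is not the authors' 2-D pore-network model, and every number is illustrative.

        import numpy as np

        # Toy pore-scale oxygen balance: a pore is flagged anoxic when microbial
        # uptake (Monod kinetics) exceeds advective O2 supply.
        rng = np.random.default_rng(2)
        n_pores = 1000
        q = rng.lognormal(mean=-1.0, sigma=1.0, size=n_pores)        # relative flux per pore
        biomass = rng.lognormal(mean=0.0, sigma=0.7, size=n_pores)   # relative biomass

        C_o2, K_o2, v_max = 8.0, 1.0, 2.0     # mg/L, mg/L, mg/(L*h) per unit biomass
        supply = q * C_o2                      # advective O2 supply (relative units)
        uptake = v_max * biomass * C_o2 / (K_o2 + C_o2)   # Monod uptake

        anoxic = uptake > supply
        print("fraction of anoxic pores: %.2f" % anoxic.mean())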

  16. A subjective and objective fuzzy-based analytical hierarchy process model for prioritization of lean product development practices

    Directory of Open Access Journals (Sweden)

    Daniel O. Aikhuele

    2017-06-01

    Full Text Available In this paper, a subjective and objective fuzzy-based Analytical Hierarchy Process (AHP) model is proposed. The model, which is based on a newly defined evaluation matrix, replaces the fuzzy comparison matrix (FCM) used in the traditional fuzzy AHP model, which has been found ineffective and time-consuming as the number of criteria/alternatives increases. The main advantage of the new model is that it is straightforward and completely eliminates the repetitive adjustment of data that is common with the FCM in the traditional AHP model. The model reduces the complete dependency on human judgment in prioritization assessment, since the weight values are solved automatically using the evaluation matrix and the modified priority weight formula in the proposed model. By means of a numerical case study, the model is successfully applied to determine the implementation priorities of lean practices for a product development environment and is compared with similar computational methods in the literature.
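
    The paper's modified priority-weight formula is not given in the record. As a generic stand-in, the sketch below derives normalized weights from a triangular-fuzzy evaluation matrix by centroid defuzzification and geometric-mean aggregation; the scores are invented for illustration.

      import numpy as np

      # Triangular fuzzy scores (l, m, u) for three lean practices on two criteria.
      scores = np.array([[[2, 3, 4], [4, 5, 6]],
                         [[1, 2, 3], [3, 4, 5]],
                         [[5, 6, 7], [2, 3, 4]]], dtype=float)

      crisp = scores.mean(axis=2)                             # centroid of each (l, m, u)
      weights = crisp.prod(axis=1) ** (1.0 / crisp.shape[1])  # geometric mean over criteria
      weights /= weights.sum()                                # normalized priorities
      print(weights)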

  17. Hydrothermal processing of Hanford tank wastes: Process modeling and control

    International Nuclear Information System (INIS)

    Currier, R.P.

    1994-01-01

    In the Los Alamos National Laboratory (LANL) hydrothermal process, waste streams are first pressurized and heated as they pass through a continuous-flow tubular reactor vessel. The waste is maintained at reaction temperatures of 300-550 C, where organic destruction and sludge reformation occur. This report documents LANL activities in process modeling and control undertaken in FY94 to support hydrothermal process development. Key issues discussed include non-ideal flow patterns (e.g., axial dispersion) and their effect on reactor performance, the use and interpretation of inert tracer experiments, and the use of computational fluid mechanics to evaluate novel hydrothermal reactor designs. In addition, the effects of axial dispersion (and of simplifications to rate expressions) on the estimated kinetic parameters are explored by non-linear regression to experimental data. Safety-related calculations are reported which estimate the explosion limits of effluent gases and the fate of hydrogen as it passes through the reactor. Development and numerical solution of a generalized one-dimensional mathematical model is also summarized. The difficulties encountered in using commercially available software to correlate the behavior of high-temperature, high-pressure aqueous electrolyte mixtures are summarized. Finally, details of the control system and experiments conducted to empirically determine the system response are reported.
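
    The generalized one-dimensional model itself is not reproduced in the record; a standard axial-dispersion species balance of the kind implied reads

      \frac{\partial C}{\partial t} = D_{ax}\,\frac{\partial^{2} C}{\partial x^{2}} - u\,\frac{\partial C}{\partial x} - k(T)\,C

    where D_{ax} is the axial dispersion coefficient, u the mean axial velocity and k(T) a first-order destruction rate; letting D_{ax} go to zero recovers plug flow, while large D_{ax} approaches well-mixed behavior.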

  18. Development of a theory of implementation and integration: Normalization Process Theory

    Directory of Open Access Journals (Sweden)

    May Carl R

    2009-05-01

    Full Text Available Abstract Background Theories are important tools in the social and natural sciences. The methods by which they are derived are rarely described and discussed. Normalization Process Theory explains how new technologies, ways of acting, and ways of working become routinely embedded in everyday practice, and has applications in the study of implementation processes. This paper describes the process by which it was built. Methods Between 1998 and 2008, we developed a theory. We derived a set of empirical generalizations from analysis of data collected in qualitative studies of healthcare work and organization. We developed an applied theoretical model through analysis of empirical generalizations. Finally, we built a formal theory through a process of extension and implication analysis of the applied theoretical model. Results Each phase of theory development showed that the constructs of the theory did not conflict with each other, had explanatory power, and possessed sufficient robustness for formal testing. As the theory developed, its scope expanded from a set of observed regularities in data with procedural explanations, to an applied theoretical model, to a formal middle-range theory. Conclusion Normalization Process Theory has been developed through procedures that were properly sceptical and critical, and which were open to review at each stage of development. The theory has been shown to merit formal testing.

  19. A process-based model for cattle manure compost windrows: Model performance and application

    Science.gov (United States)

    A model was developed and incorporated in the Integrated Farm System Model (IFSM, v.4.3) that simulates important processes occurring during windrow composting of manure. The model, documented in an accompanying paper, predicts changes in windrow properties and conditions and the resulting emissions...

  20. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  1. Mathematical Formulation Requirements and Specifications for the Process Models

    International Nuclear Information System (INIS)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-01-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments naturally

  2. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it, need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments......-based framework is that in the design, development and/or manufacturing of a chemical product-process, the knowledge of the applied phenomena together with the product-process design details can be provided with diverse degrees of abstractions and details. This would allow the experimental resources...... to be employed for validation and fine-tuning of the solutions from the model-based framework, thereby, removing the need for trial and error experimental steps. Also, questions related to economic feasibility, operability and sustainability, among others, can be considered in the early stages of design. However...

  3. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  4. Modelling and simulating decision processes of linked lives: An approach based on concurrent processes and stochastic race

    NARCIS (Netherlands)

    Warnke, T.; Reinhardt, O.; Klabunde, A.; Willekens, F.J.; Uhrmacher, A.

    2017-01-01

    Individuals’ decision processes play a central role in understanding modern migration phenomena and other demographic processes. Their integration into agent-based computational demography depends largely on suitable support by a modelling language. We are developing the Modelling Language for

  5. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    Full Text Available When developing an information system, it is important to create clear models and to choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule-specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages. It presents a theoretical comparison of business rule and business process modeling languages, and compares the different business process modeling languages and business rule representation languages according to selected modeling aspects. Finally, the best-fitting set of languages is selected for a three-layer framework for business rule based software modeling.

  6. Property Modelling and Databases in Product-Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Sansonetti, Sascha

    of the PC-SAFT is used. The developed database and property prediction models have been combined into a properties-software that allows different product-process design related applications. The presentation will also briefly highlight applications of the software for virtual product-process design...

  7. Modeling nonhomogeneous Markov processes via time transformation.

    Science.gov (United States)

    Hubbard, R A; Inoue, L Y T; Fann, J R

    2008-09-01

    Longitudinal studies are a powerful tool for characterizing the course of chronic disease. These studies are usually carried out with subjects observed at periodic visits giving rise to panel data. Under this observation scheme the exact times of disease state transitions and sequence of disease states visited are unknown and Markov process models are often used to describe disease progression. Most applications of Markov process models rely on the assumption of time homogeneity, that is, that the transition rates are constant over time. This assumption is not satisfied when transition rates depend on time from the process origin. However, limited statistical tools are available for dealing with nonhomogeneity. We propose models in which the time scale of a nonhomogeneous Markov process is transformed to an operational time scale on which the process is homogeneous. We develop a method for jointly estimating the time transformation and the transition intensity matrix for the time transformed homogeneous process. We assess maximum likelihood estimation using the Fisher scoring algorithm via simulation studies and compare performance of our method to homogeneous and piecewise homogeneous models. We apply our methodology to a study of delirium progression in a cohort of stem cell transplantation recipients and show that our method identifies temporal trends in delirium incidence and recovery.
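
    A common way to formalize such a time transformation (the authors' exact parameterization is not given in the record) is to choose an increasing function h and set

      P(t_0, t_1) = \exp\{\,Q\,[h(t_1) - h(t_0)]\,\}

    so that on the operational time scale s = h(t) the process is homogeneous with intensity matrix Q; h(t) = t recovers the homogeneous model, while a form such as h(t) = (t/\lambda)^{\gamma} allows monotone trends in the transition rates.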

  18. DEVELOPING A MATHEMATICAL MODEL FOR THE PROCESS OF SEDIMENTARY TANKS

    Directory of Open Access Journals (Sweden)

    Valeria Victoria IOVANOV

    2013-05-01

    Full Text Available The model is reformulated by means of stochastic differential equations, and the parameters are estimated by a maximum likelihood method. VESILIND (1968; 1979) proposed a sludge settling velocity model of exponential form. During recent years, several refinements to the original model have been proposed, see e.g. GRIJSPEERDT et al. (1995), DUPONT and DAHL (1995), EKAMA et al. (1997). In the proposed models several layers in the settling tank are incorporated to permit the calculation of SS profiles over the tank depth and to predict the SS concentrations in the return sludge and in the effluent from the clarifier. Here, the original VESILIND model combined with a simple suction-depth model is used to enable prediction of the SS concentration in the effluent from the tank. In order to make the model applicable for real-time control purposes, only two layers of variable height in the tank are considered.
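
    For reference, the Vesilind settling velocity model cited above has the exponential form

      v_s(X) = v_0 \, e^{-r X}

    where v_s is the zone settling velocity, X the local suspended-solids concentration, and v_0 and r the parameters to be estimated; in the layered tank models each layer applies this law at its own concentration.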

  9. Advanced Mirror & Modelling Technology Development

    Science.gov (United States)

    Effinger, Michael; Stahl, H. Philip; Abplanalp, Laura; Maffett, Steven; Egerman, Robert; Eng, Ron; Arnold, William; Mosier, Gary; Blaurock, Carl

    2014-01-01

    The 2020 Decadal technology survey is starting in 2018. Technology on the shelf at that time will help guide selection of future low-risk and low-cost missions. The Advanced Mirror Technology Development (AMTD) team has identified development priorities based on science goals and engineering requirements for Ultraviolet Optical near-Infrared (UVOIR) missions in order to contribute to the selection process. One key development identified was lightweight mirror fabrication and testing. A monolithic, stacked, deep-core mirror was fused and replicated twice to achieve the desired radius of curvature. It was subsequently successfully polished and tested. A recently awarded second phase of the AMTD project will develop larger mirrors to demonstrate the lateral scaling of the deep-core mirror technology. Another key development was rapid modeling for the mirror. One model focused on generating optical and structural model results in minutes instead of months. Many variables could be accounted for regarding the core, face plate and back structure details. A portion of a spacecraft model was also developed. The spacecraft model incorporated direct integration to transform optical path difference to Point Spread Function (PSF) and from PSF to modulation transfer function. The second phase of the project will take the results of the rapid mirror modeler and integrate them into the rapid spacecraft modeler.

  10. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. Creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary Business Process Modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated through use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems encountered during the project. Drawing on these problems and on the results of the methodology evaluation, the needed future development of the methodology is outlined.

  11. Concept of a cognitive-numeric plant and process modelizer

    International Nuclear Information System (INIS)

    Vetterkind, D.

    1990-01-01

    To achieve automatic modeling of plant disturbances and failure-limitation procedures, first the system's hardware and the media present (water, steam, coolant fluid) are formalized into fully computable matrices, called topographies. Secondly, a microscopic cellular automaton model, using lattice gases and state transition rules, is combined with a semi-microscopic cellular process model and with a macroscopic model as well. At the semi-microscopic level, a cellular data compressor, a feature-detection device and the Intelligent Physical Element's process dynamics are acting; at the macroscopic level, the Walking Process Elements, a process-evolving module, a test-and-manage device and an abstracting process net are involved. Additionally, a diagnosis-coordinating and a countermeasure-coordinating device are used. In order to obtain process insights automatically, object transformations, elementary process functions and associative methods are used. Developments of optoelectronic hardware language components are under consideration.

  12. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Full Text Available Background: Competitive intelligence (CI) provides actionable intelligence, which gives enterprises a competitive edge. However, without a proper process, it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI, to identify and analyse existing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, in which the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases whose output feeds the next phase. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  13. Development Model for Research Infrastructures

    Science.gov (United States)

    Wächter, Joachim; Hammitzsch, Martin; Kerschke, Dorit; Lauterjung, Jörn

    2015-04-01

    The maturity of individual scientific domains differs considerably. Technologically and organisationally, many different RI components have to be integrated: individual systems are often complex and have a long-term history, and existing approaches are on different maturity levels, e.g. in relation to the standardisation of interfaces. The concrete implementation process consists of independent and often parallel development activities; in many cases no detailed architectural blueprint for the envisioned system exists. Most of the funding currently available for RI implementation is provided on a project basis. To increase the synergies in infrastructure development the authors propose a specific RI Maturity Model (RIMM) that is specifically qualified for open system-of-system environments. RIMM is based on the concepts of Capability Maturity Models for organisational development, concretely the Levels of Conceptual Interoperability Model (LCIM) specifying the technical, syntactical, semantic, pragmatic, dynamic, and conceptual layers of interoperation [1]. The model is complemented by the identification and integration of growth factors (according to the Nolan Stages Theory [2]). These factors include supply and demand factors. Supply factors comprise available resources, e.g., data, services and IT-management capabilities including organisations and IT personnel. Demand factors are the overall application portfolio for RIs but also the skills and requirements of scientists and communities using the infrastructure. RIMM thus enables a balanced development process of RI and RI components by evaluating the status of the supply and demand factors in relation to specific levels of interoperability. [1] Tolk, A., Diallo, A., Turnitsa, C. (2007): Applying the Levels of Conceptual Interoperability Model in Support of Integratability, Interoperability, and Composability for System-of-Systems Engineering. Systemics, Cybernetics and Informatics, Volume 5 - Number 5. [2

  14. Developing a theory to guide the process of designing information retrieval systems

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1992-01-01

    This research attempts to develop a descriptive design model that accounts for communication among users, designers, and developers throughout the design process. A pilot study has been completed, and a preliminary model that represents a first step in understanding participants' evolving perceptions and expectations of the design process and its outcomes is described in this paper.

  15. Aggression and Moral Development: Integrating Social Information Processing and Moral Domain Models

    Science.gov (United States)

    Arsenio, William F.; Lemerise, Elizabeth A.

    2004-01-01

    Social information processing and moral domain theories have developed in relative isolation from each other despite their common focus on intentional harm and victimization, and mutual emphasis on social cognitive processes in explaining aggressive, morally relevant behaviors. This article presents a selective summary of these literatures with…

  16. Antecedents of Absorptive Capacity: A New Model for Developing Learning Processes

    Science.gov (United States)

    Rezaei-Zadeh, Mohammad; Darwish, Tamer K.

    2016-01-01

    Purpose: The purpose of this paper is to provide an integrated framework to indicate which antecedents of absorptive capacity (AC) influence its learning processes, and to propose testing of this model in future work. Design/methodology/approach: Relevant literature on the antecedents of AC was critically reviewed and analysed with the objective…

  17. A simplified computational memory model from information processing

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices based on biology and graph theory; an intra-modular network is developed with the modeling algorithm by mapping nodes and edges, and the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information-processing view. PMID:27876847

  18. Development of numerical dispersion model for radioactive nuclei including resuspension processes

    International Nuclear Information System (INIS)

    Chiba, Masaru; Kurita, Susumu; Sasaki, Hidetaka

    2003-01-01

    Global-scale and local-scale dispersion models are developed in combination with global- and local-scale meteorological forecasting models. When applied to other minor constituents, such as mineral dust blown by strong winds in arid regions, the system shows very good performance in monitoring and predicting their distribution. (author)

  19. Semantic Business Process Modeling

    OpenAIRE

    Markovic, Ivan

    2010-01-01

    This book presents a process-oriented business modeling framework based on semantic technologies. The framework consists of modeling languages, methods, and tools that allow for semantic modeling of business motivation, business policies and rules, and business processes. Quality of the proposed modeling framework is evaluated based on the modeling content of SAP Solution Composer and several real-world business scenarios.

  20. Development of Flexible Software Process Lines with Variability Operations

    DEFF Research Database (Denmark)

    Dohrmann, Patrick; Schramm, Joachim; Kuhrmann, Marco

    2016-01-01

    the development of flexible software process lines. Method: We conducted a longitudinal study in which we studied 5 variants of the V-Modell XT process line for 2 years. Results: Our results show that the variability operation instrument is feasible in practice. We analyzed 616 operation exemplars addressing various...

  1. Processes Of Self-Concept Development Among Children and Adolescents

    DEFF Research Database (Denmark)

    Spaten, Ole Michael

    on children's development may be overcome by broader perspectives in theory and methodology. He proposed a scientific perspective on the ecology of human development and the Person-Process-Context-Time model (ibid). Our results include that children's and adolescents' active internalization (Valsiner & Van der Veer, 1988) and dialogical, cultural self-authorship are important themes for an understanding of processes of self-concept development among Danish children and adolescents from diverse cultural backgrounds. Limitations of this research as well as further directions for new studies...

  2. Dynamic process model of a plutonium oxalate precipitator. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.L.; Hammelman, J.E.; Borgonovi, G.M.

    1977-11-01

    In support of LLL material safeguards program, a dynamic process model was developed which simulates the performance of a plutonium (IV) oxalate precipitator. The plutonium oxalate precipitator is a component in the plutonium oxalate process for making plutonium oxide powder from plutonium nitrate. The model is based on state-of-the-art crystallization descriptive equations, the parameters of which are quantified through the use of batch experimental data. The dynamic model predicts performance very similar to general Hanford oxalate process experience. The utilization of such a process model in an actual plant operation could promote both process control and material safeguards control by serving as a baseline predictor which could give early warning of process upsets or material diversion. The model has been incorporated into a FORTRAN computer program and is also compatible with the DYNSYS 2 computer code which is being used at LLL for process modeling efforts.

  3. Dynamic process model of a plutonium oxalate precipitator. Final report

    International Nuclear Information System (INIS)

    Miller, C.L.; Hammelman, J.E.; Borgonovi, G.M.

    1977-11-01

    In support of LLL material safeguards program, a dynamic process model was developed which simulates the performance of a plutonium (IV) oxalate precipitator. The plutonium oxalate precipitator is a component in the plutonium oxalate process for making plutonium oxide powder from plutonium nitrate. The model is based on state-of-the-art crystallization descriptive equations, the parameters of which are quantified through the use of batch experimental data. The dynamic model predicts performance very similar to general Hanford oxalate process experience. The utilization of such a process model in an actual plant operation could promote both process control and material safeguards control by serving as a baseline predictor which could give early warning of process upsets or material diversion. The model has been incorporated into a FORTRAN computer program and is also compatible with the DYNSYS 2 computer code which is being used at LLL for process modeling efforts

  4. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  5. Development of expert systems for modeling of technological process of pressure casting on the basis of artificial intelligence

    Science.gov (United States)

    Gavarieva, K. N.; Simonova, L. A.; Pankratov, D. L.; Gavariev, R. V.

    2017-09-01

    In this article, the main component of an expert system for the high-pressure die casting process, consisting of algorithms united into logical models, is considered. The characteristics of the system, which present data on the state of the controlled object, are described. A number of logically interconnected steps is developed that makes it possible to increase the quality of the castings produced.

  6. Extracting business vocabularies from business process models: SBVR and BPMN standards-based approach

    Science.gov (United States)

    Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis

    2013-10-01

    Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in the common practice of Information Systems Development, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business-modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics for Business Vocabularies and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
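
    The record does not detail the extraction mechanics. A minimal sketch of the first step, harvesting candidate vocabulary terms from the named elements of a BPMN 2.0 XML model, might look as follows; the file name is hypothetical, and a real SBVR mapping would require considerably more linguistic processing.

      import xml.etree.ElementTree as ET

      NS = "{http://www.omg.org/spec/BPMN/20100524/MODEL}"  # standard BPMN 2.0 namespace

      def extract_terms(path):
          # Collect candidate vocabulary terms from task and data-object names.
          root = ET.parse(path).getroot()
          terms = set()
          for tag in ("task", "userTask", "serviceTask", "dataObject"):
              for el in root.iter(NS + tag):
                  name = el.get("name")
                  if name:
                      terms.add(name.strip().lower())
          return sorted(terms)

      print(extract_terms("order_process.bpmn"))  # hypothetical model file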

  7. How Qualitative Methods Can be Used to Inform Model Development.

    Science.gov (United States)

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  8. Modeling and managing risk early in software development

    Science.gov (United States)

    Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.

    1993-01-01

    In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.

  9. Case Studies in Modelling, Control in Food Processes.

    Science.gov (United States)

    Glassey, J; Barone, A; Montague, G A; Sabou, V

    This chapter discusses the importance of modelling and control in increasing food process efficiency and ensuring product quality. Various approaches to both modelling and control in food processing are set in the context of the specific challenges in this industrial sector and latest developments in each area are discussed. Three industrial case studies are used to demonstrate the benefits of advanced measurement, modelling and control in food processes. The first case study illustrates the use of knowledge elicitation from expert operators in the process for the manufacture of potato chips (French fries) and the consequent improvements in process control to increase the consistency of the resulting product. The second case study highlights the economic benefits of tighter control of an important process parameter, moisture content, in potato crisp (chips) manufacture. The final case study describes the use of NIR spectroscopy in ensuring effective mixing of dry multicomponent mixtures and pastes. Practical implementation tips and infrastructure requirements are also discussed.

  10. Guiding gate-etch process development using 3D surface reaction modeling for 7nm and beyond

    Science.gov (United States)

    Dunn, Derren; Sporre, John R.; Deshpande, Vaibhav; Oulmane, Mohamed; Gull, Ronald; Ventzek, Peter; Ranjan, Alok

    2017-03-01

    Increasingly, advanced process nodes such as 7nm (N7) are fundamentally 3D and require stringent control of critical dimensions over high aspect ratio features. Process integration in these nodes requires a deep understanding of complex physical mechanisms to control critical dimensions from lithography through final etch. Polysilicon gate etch processes are critical steps in several device architectures for advanced nodes that rely on self-aligned patterning approaches to gate definition. These processes are required to meet several key metrics: (a) vertical etch profiles over high aspect ratios; (b) clean gate sidewalls free of etch process residue; (c) minimal erosion of liner oxide films protecting key architectural elements such as fins; and (d) residue-free corners at gate interfaces with critical device elements. In this study, we explore how hybrid modeling approaches can be used to model a multi-step finFET polysilicon gate etch process. Initial parts of the patterning process through hardmask assembly are modeled using process emulation. Important aspects of gate definition are then modeled using a particle Monte Carlo (PMC) feature scale model that incorporates surface chemical reactions.1 When necessary, species and energy flux inputs to the PMC model are derived from simulations of the etch chamber. The modeled polysilicon gate etch process consists of several steps including a hard mask breakthrough step (BT), main feature etch steps (ME), and over-etch steps (OE) that control gate profiles at the gate fin interface. An additional constraint on this etch flow is that fin spacer oxides are left intact after final profile tuning steps. A natural optimization required of these processes is to maximize vertical gate profiles while minimizing erosion of fin spacer films.2

  11. Materials measurement and accounting in an operating plutonium conversion and purification process. Phase I. Process modeling and simulation

    International Nuclear Information System (INIS)

    Thomas, C.C. Jr.; Ostenak, C.A.; Gutmacher, R.G.; Dayem, H.A.; Kern, E.A.

    1981-04-01

    A model of an operating conversion and purification process for the production of reactor-grade plutonium dioxide was developed as the first component in the design and evaluation of a nuclear materials measurement and accountability system. The model accurately simulates process operation and can be used to identify process problems and to predict the effect of process modifications

  12. COMPLEX SIMULATION MODEL OF TRAIN BREAKING-UP PROCESS AT THE HUMPS

    Directory of Open Access Journals (Sweden)

    E. B. Demchenko

    2015-11-01

    Full Text Available Purpose. One of the priorities in improving the functioning of a station sorting complex is reducing the energy consumption of the breaking-up process, namely fuel consumption for train pushing and electric energy consumption for cut braking. In this regard, an effective solution to the problem of reducing energy consumption in the breaking-up subsystem requires comprehensive handling of the train pushing and cut rolling-down processes. At the same time, the analysis showed that the tasks of improving the pushing process and increasing the effectiveness of cut rolling-down are currently solved separately. Solving this problem requires the development of a complex simulation model of the train breaking-up process at humps. Methodology. The pushing process was simulated on the basis of traction calculations adapted to shunting conditions. In addition, the features of shunting locomotive operation at the humps were taken into account. In order to realize the current pushing mode, a special hump-locomotive control algorithm was applied which, along with the requirements of safe shunting operation, takes into account behavioral factors associated with the engineer's control actions. This algorithm provides smooth train acceleration and further movement at a speed close to the set speed. Hump locomotive fuel consumption was determined from the amount of mechanical work performed by locomotive traction. Findings. The simulation model of the train pushing process was developed and combined with the existing cut rolling-down model. The cut initial velocity is determined during the simulation and is then used for further modeling of the cut rolling process. In addition, the modeling resulted in a sufficiently accurate determination of the fuel consumed for train breaking-up. Originality. The simulation model of the train breaking-up process at the humps, which in contrast to the existing models allows all the elements of this process to be reproduced together and in detail

  13. Dynamic Modeling of Process Technologies for Closed-Loop Water Recovery Systems

    Science.gov (United States)

    Allada, Rama Kumar; Lange, Kevin E.; Anderson, Molly S.

    2012-01-01

    Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which becomes extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents dynamic simulations of chemical processes for primary processor technologies including: the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system, the Wiped-Film Rotating Disk (WFRD), and post-distillation water polishing processes such as the Volatiles Removal Assembly (VRA). These dynamic models were developed using the Aspen Custom Modeler® and Aspen Plus® process simulation tools. The results expand upon previous work for water recovery technology models and emphasize dynamic process modeling and results. The paper discusses system design, modeling details, and model results for each technology and presents some comparisons between the model results and available test data. Following these initial comparisons, some general conclusions and forward work are discussed.

  14. Development of a process model for intelligent control of gas metal arc welding

    International Nuclear Information System (INIS)

    Smartt, H.B.; Johnson, J.A.; Einerson, C.J.; Watkins, A.D.; Carlson, N.M.

    1991-01-01

    This paper discusses work in progress on the development of an intelligent control scheme for arc welding. A set of four sensors is used to detect weld bead cooling rate, droplet transfer mode, weld pool and joint location and configuration, and weld defects during welding. A neural network is being developed as the bridge between the multiple-sensor set and a conventional proportional-integral controller that provides independent control of process variables. This approach is being developed for the gas metal arc welding process. 20 refs., 8 figs

  15. Resin infusion of large composite structures modeling and manufacturing process

    Energy Technology Data Exchange (ETDEWEB)

    Loos, A.C. [Michigan State Univ., Dept. of Mechanical Engineering, East Lansing, MI (United States)

    2006-07-01

    The resin infusion processes resin transfer molding (RTM), resin film infusion (RFI) and vacuum assisted resin transfer molding (VARTM) are cost-effective techniques for the fabrication of complex-shaped composite structures. The dry fibrous preform is placed in the mold, consolidated, resin impregnated and cured in a single-step process. The fibrous preforms are often constructed near net shape using highly automated textile processes such as knitting, weaving and braiding. In this paper, the infusion processes RTM, RFI and VARTM are discussed along with the advantages of each technique compared with traditional composite fabrication methods such as prepreg tape lay-up and autoclave cure. The large number of processing variables and the complex material behavior during infiltration and cure make experimental optimization of the infusion processes costly and inefficient. Numerical models have been developed which can be used to simulate the resin infusion processes. The model formulation and solution procedures for the VARTM process are presented. A VARTM process simulation of a carbon fiber preform is presented to demonstrate the type of information that can be generated by the model and to compare the model predictions with experimental measurements. Overall, the predicted flow front positions, resin pressures and preform thicknesses agree well with the measured values. The results of the simulation show the potential cost and performance benefits that can be realized by using a simulation model as part of the development process. (au)
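
    Infusion simulations of this kind are commonly built on Darcy flow through the fibrous preform; in its simplest isothermal form,

      \mathbf{u} = -\frac{\mathbf{K}}{\mu}\,\nabla p

    where u is the volume-averaged resin velocity, K the preform permeability tensor, \mu the resin viscosity and p the resin pressure; combined with mass conservation this yields the moving-flow-front problem that such simulations solve.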

  16. Method for modeling social care processes for national information exchange.

    Science.gov (United States)

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target state processes from the perspective of national electronic archive, increased interoperability between systems and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  17. Development of a Scale-up Tool for Pervaporation Processes

    Directory of Open Access Journals (Sweden)

    Holger Thiess

    2018-01-01

    Full Text Available In this study, an engineering tool for the design and optimization of pervaporation processes is developed based on physico-chemical modelling coupled with laboratory/mini-plant experiments. The model incorporates the solution-diffusion mechanism, polarization effects (concentration and temperature), axial dispersion, pressure drop and the temperature drop in the feed channel due to vaporization of the permeating components. The permeance, being the key model parameter, was determined via dehydration experiments on a mini-plant scale for the binary mixtures ethanol/water and ethyl acetate/water. A second set of experimental data was utilized for the validation of the model for two chemical systems. The industrially relevant ternary mixture, ethanol/ethyl acetate/water, was investigated close to its azeotropic point and compared to a simulation conducted with the determined binary permeance data. Experimental and simulation data proved to agree very well for the investigated process conditions. In order to test the scalability of the developed engineering tool, large-scale data from an industrial pervaporation plant used for the dehydration of ethanol was compared to a process simulation conducted with the validated physico-chemical model. Since the membranes employed in both mini-plant and industrial scale were of the same type, the permeance data could be transferred. The comparison of the measured and simulated data proved the scalability of the derived model.
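
    The solution-diffusion mechanism named above is commonly written as a permeance-based partial flux (generic notation, not necessarily the authors'):

      J_i = Q_i \left( x_i\,\gamma_i\,p_i^{\mathrm{sat}}(T) - y_i\,p_{\mathrm{perm}} \right)

    where Q_i is the permeance determined from the dehydration experiments, x_i and y_i the feed and permeate mole fractions, \gamma_i the activity coefficient, p_i^{sat} the pure-component vapor pressure and p_{perm} the permeate pressure.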

  18. A Dual-Process Account of the Development of Scientific Reasoning: The Nature and Development of Metacognitive Intercession Skills

    Science.gov (United States)

    Amsel, Eric; Klaczynski, Paul A.; Johnston, Adam; Bench, Shane; Close, Jason; Sadler, Eric; Walker, Rick

    2008-01-01

    Metacognitive knowledge of the dual-processing basis of judgment is critical to resolving conflict between analytic and experiential processing responses [Klaczynski, P. A. (2004). A dual-process model of adolescent development: Implications for decision making, reasoning, and identity. In R. V. Kail (Ed.), "Advances in child development and…

  19. MODELING OF PATTERN FORMING PROCESS OF AUTOMATIC RADIO DIRECTION FINDER OF PHASE VHF IN THE DEVELOPMENT ENVIRONMENT OF LabVIEW APPLIED PROGRAMS

    Directory of Open Access Journals (Sweden)

    G. K. Aslanov

    2015-01-01

    Full Text Available In this article, a model is developed that demonstrates the pattern-forming process of the antenna system of an aerodrome quasi-Doppler automatic radio direction-finder station, in the LabVIEW application development environment of the National Instruments company.

  20. A systematic literature review of studies on business process modeling quality

    NARCIS (Netherlands)

    Moreno-Montes de Oca, I.; Snoeck, M.; Reijers, H.A.; Rodríguez-Morffi, A.

    2015-01-01

    Context. Business process modeling is an essential part of understanding and redesigning the activities that a typical enterprise uses to achieve its business goals. The quality of a business process model has a significant impact on the development of any enterprise and IT support for that process.

  1. A systematic literature review of studies on business process modeling quality

    NARCIS (Netherlands)

    Moreno-Montes de Oca, I.; Snoeck, M.; Reijers, H.A.; Rodríguez-Morffi, A.

    2015-01-01

    Context: Business process modeling is an essential part of understanding and redesigning the activities that a typical enterprise uses to achieve its business goals. The quality of a business process model has a significant impact on the development of any enterprise and IT support for that process.

  2. Dynamic Modeling of Process Technologies for Closed-Loop Water Recovery Systems

    Science.gov (United States)

    Allada, Rama Kumar; Lange, Kevin; Anderson, Molly

    2011-01-01

    Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which becomes extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents dynamic simulations of chemical processes for primary processor technologies including: the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system, the Wiped-Film Rotating Disk (WFRD), and post-distillation water polishing processes such as the Volatiles Removal Assembly (VRA) that were developed using the Aspen Custom Modeler and Aspen Plus process simulation tools. The results expand upon previous work for water recovery technology models and emphasize dynamic process modeling and results. The paper discusses system design, modeling details, and model results for each technology and presents some comparisons between the model results and available test data. Following these initial comparisons, some general conclusions and forward work are discussed.

  3. Theoretical and Practical Aspects of Logistic Quality Management System Documentation Development Process

    Directory of Open Access Journals (Sweden)

    Linas Šaulinskas

    2013-12-01

    Full Text Available This paper addresses aspects of logistics quality management system documentation development and suggests models for quality management system documentation development, hierarchical documentation systems and authorization approval. It also identifies logistic processes and a responsibilities model, as well as a detailed document development and approval process that can be applied in practice. Our results are based upon an analysis of advanced Lithuanian and foreign corporate business practices, a review of the current literature and recommendations for quality management system standards.

  4. Modeling the curing process of thermosetting resin matrix composites

    Science.gov (United States)

    Loos, A. C.

    1986-01-01

    A model is presented for simulating the curing process of a thermosetting resin matrix composite. The model relates the cure temperature, the cure pressure, and the properties of the prepreg to the thermal, chemical, and rheological processes occurring in the composite during cure. The results calculated with the computer code developed on the basis of the model were compared with experimental data obtained from autoclave-cured composite laminates. Good agreement between the two sets of results was obtained.
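
    Cure models of this type typically couple an Arrhenius rate constant with an autocatalytic conversion law; generically (the paper's exact kinetic expression is not reproduced in the record),

      \frac{d\alpha}{dt} = A\,e^{-E/RT}\,\alpha^{m}(1-\alpha)^{n}

    where \alpha is the degree of cure, T the local temperature from the thermal sub-model, and A, E, m and n empirically fitted constants; resin viscosity is then correlated with \alpha and T for the flow and compaction calculations.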

  5. NEURO-FUZZY MODELLING OF BLENDING PROCESS IN CEMENT PLANT

    Directory of Open Access Journals (Sweden)

    Dauda Olarotimi Araromi

    2015-11-01

    Full Text Available The profitability of a cement plant depends largely on the efficient operation of the blending stage; therefore, the process must be controlled at the blending stage in order to maintain the chemical composition of the raw mix near or at the desired value with minimum variance, despite variation in the raw material composition. In this work, a neuro-fuzzy model is developed for the dynamic behaviour of the system to predict the total carbonate content in the raw mix at different clay feed rates. The data used for parameter estimation and model validation were obtained from one of the cement plants in Nigeria. The data were pre-processed to remove outliers and filtered using a smoothing technique in order to reveal their dynamic nature. An autoregressive exogenous (ARX) model was developed for comparison purposes. The ARX model gave high root mean square errors (RMSE) of 5.408 and 4.0199 for training and validation respectively. The poor fit of the ARX model is an indication of the nonlinear nature of the process. However, both visual and statistical analyses of the neuro-fuzzy (ANFIS) model gave far better results: the RMSE values for training and validation are 0.28167 and 0.7436 respectively, and the sum of squared errors (SSE) and R-square are 39.6692 and 0.9969 respectively. All these indicate good performance of the ANFIS model, which can be used for control design of the process.
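
    For a comparison of the kind described, a first-order ARX model y(k) = a y(k-1) + b u(k-1) + e(k) can be fitted by ordinary least squares. The sketch below uses synthetic data, since the plant data are not available in the record.

      import numpy as np

      rng = np.random.default_rng(0)
      u = rng.standard_normal(200)                  # clay feed rate deviations (synthetic)
      y = np.zeros(200)                             # total carbonate deviations
      for k in range(1, 200):
          y[k] = 0.8 * y[k-1] + 0.5 * u[k-1] + 0.05 * rng.standard_normal()

      Phi = np.column_stack([y[:-1], u[:-1]])       # regressor matrix [y(k-1), u(k-1)]
      theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
      print(theta)                                  # estimates close to [0.8, 0.5]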

  6. How Students Learn: Information Processing, Intellectual Development and Confrontation

    Science.gov (United States)

    Entwistle, Noel

    1975-01-01

    A model derived from information processing theory is described, which helps to explain the complex verbal learning of students and suggests implications for lecturing techniques. Other factors affecting learning that are not covered by the model are discussed in relation to it: students' intellectual development and the effects of individual…

  7. A dynamic uranium-leaching model for process-control studies

    International Nuclear Information System (INIS)

    Vetter, D.A.; Barker, I.J.; Turner, G.A.

    1989-01-01

    The modelling of the uranium-leaching process, and the logging of data from a plant for the evaluation of the model, are reported. A phenomenological approach was adopted in the development of the model. A set of eight chemical reactions was chosen to represent the complex chemistry of the process, and kinetic expressions for these reactions were incorporated in differential equations representing mass and energy balances. These equations were coded in FORTRAN to form a program that simulated the process and allowed averaged and continuous data from the plant to be compared with the model. This allowed the model to be 'tuned' and revealed a number of minor problems with the control infrastructure on the plant. 7 figs., 21 refs

  8. Structural assessment of aerospace components using image processing algorithms and Finite Element models

    DEFF Research Database (Denmark)

    Stamatelos, Dimtrios; Kappatos, Vassilios

    2017-01-01

    Purpose – This paper presents the development of an advanced structural assessment approach for aerospace components (metallic and composites). This work focuses on developing an automatic image processing methodology based on Non Destructive Testing (NDT) data and numerical models for predicting...... the residual strength of these components. Design/methodology/approach – An image processing algorithm, based on the threshold method, has been developed to process and quantify the geometric characteristics of damages. Then, a parametric Finite Element (FE) model of the damaged component is developed based...... on the inputs acquired from the image processing algorithm. The analysis of the metallic structures employs the Extended FE Method (XFEM), while for the composite structures the Cohesive Zone Model (CZM) technique with Progressive Damage Modelling (PDM) is used. Findings – The numerical analyses...
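    The thresholding step can be illustrated with a short sketch: a synthetic NDT "C-scan" array is binarized and the damage area and bounding box are quantified, which is the kind of geometric input the parametric FE model would take. The array, threshold and pixel size are invented placeholders.

      # Sketch of threshold-based damage quantification from NDT data.
      import numpy as np

      rng = np.random.default_rng(1)
      cscan = rng.normal(0.0, 0.05, (64, 64))   # background NDT signal, synthetic
      cscan[20:30, 15:40] += 1.0                # injected "delamination" patch

      threshold = 0.5                           # assumed signal threshold
      mask = cscan > threshold                  # binary damage map
      pixel_area = 0.25                         # mm^2 per pixel, assumed
      damage_area = mask.sum() * pixel_area

      rows, cols = np.nonzero(mask)             # bounding box of the damage
      extent = (rows.max() - rows.min() + 1, cols.max() - cols.min() + 1)
      print(f"damage area: {damage_area} mm^2, bounding box (px): {extent}")
      # These geometric characteristics would parameterize the FE model of the flaw.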

  9. Barriers in green lean six sigma product development process

    DEFF Research Database (Denmark)

    Kumar, Sanjay; Luthra, Sunil; Govindan, Kannan

    2016-01-01

    In today’s competitive globalised business environment, production cost cutting is a primary issue for operations managers. As a research area, green lean six sigma (GLS) is proposed to have strategic importance in product development towards cutting costs, contributing to optimisation...... experts’ opinions towards developing a hierarchical model structuring these barriers. Twenty-one barriers were identified and sorted from the review of literature and were then validated through discussions with experts. Relationships (contextual in nature) among these barriers were realised...... during a brainstorming session. An interpretive structural modelling (ISM) technique has been utilised for developing a hierarchical model of barriers to implementing the GLSPD process in the automobile sector of India. A nine-level structural model was deduced after application of the ISM technique...
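    The core ISM computation can be sketched compactly: build the transitive closure (final reachability matrix) of a binary contextual-relation matrix, then partition it into hierarchy levels. The five-barrier adjacency matrix below is a toy stand-in for the twenty-one barriers of the study.

      # Sketch of the ISM reachability-matrix and level-partitioning steps.
      import numpy as np

      A = np.array([[1,1,0,0,0],    # A[i,j]=1 means barrier i influences barrier j
                    [0,1,1,0,0],    # (toy contextual relations, not study data)
                    [0,0,1,1,0],
                    [0,0,0,1,1],
                    [0,0,0,0,1]], dtype=bool)

      R = A.copy()                  # Warshall's algorithm: transitive closure
      for k in range(len(R)):
          R |= np.outer(R[:, k], R[k, :])

      levels, remaining = [], set(range(len(R)))
      while remaining:              # level i is "top" when reachability set is a
          level = {i for i in remaining           # subset of its antecedent set
                   if {j for j in remaining if R[i, j]} <=
                      {j for j in remaining if R[j, i]}}
          levels.append(sorted(level))
          remaining -= level
      print("hierarchy levels (top first):", levels)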

  10. Towards Using Reo for Compliance-Aware Business Process Modeling

    Science.gov (United States)

    Arbab, Farhad; Kokash, Natallia; Meng, Sun

    Business process modeling and implementation of process-supporting infrastructures are two challenging tasks that are not fully aligned. On the one hand, languages such as the Business Process Modeling Notation (BPMN) exist to capture business processes at the level of domain analysis. On the other hand, programming paradigms and technologies such as Service-Oriented Computing (SOC) and web services have emerged to simplify the development of the distributed web systems that underlie business processes. BPMN is the most recognized language for specifying process workflows at the early design steps. However, it is rather declarative and may lead to executable models that are incomplete or semantically erroneous. Therefore, an approach for expressing and analyzing BPMN models in a formal setting is required. In this paper we describe how BPMN diagrams can be represented by means of a semantically precise channel-based coordination language called Reo, which admits formal analysis using model checking and bisimulation techniques. Moreover, since additional requirements may come from various regulatory/legislative documents, we discuss the opportunities offered by Reo and its mathematical abstractions for expressing process-related constraints such as Quality of Service (QoS) or time-aware conditions on process states.

  11. Numerical modelling of the jet nozzle enrichment process

    International Nuclear Information System (INIS)

    Vercelli, P.

    1983-01-01

    A numerical model was developed for the simulation of the isotopic enrichment produced by the jet nozzle process. The flow was considered stationary and under ideal gas conditions. The model calculates, for any position of the skimmer piece: (a) values of radial mass concentration profiles for each isotopic species and (b) values of the elementary separation effect (Σ_A) and the uranium cut (θ). The comparison of the numerical results obtained with the experimental values given in the literature proves the validity of the present work as an initial step in the modelling of the process. (Author) [pt

  12. Combined object-oriented approach for development of process control systems

    International Nuclear Information System (INIS)

    Antonova, I.; Batchkova, I.

    2013-01-01

    Full text: The traditional approaches to developing software control systems in automation and information technology, based on direct code creation, are no longer effective or successful enough. The response to these challenges is Model Driven Engineering (MDE) or its counterpart in the field of architectures, Model Driven Architecture (MDA). One of the most promising approaches supporting MDE and MDA is UML. It does not specify a methodology for software or system design but aims to provide an integrated modeling framework for structural, functional and behavioural descriptions. The success of UML in many object-oriented approaches led to the idea of applying UML to the design of multi-agent systems. The approach proposed in this paper applies a modified Harmony methodology and is based on the combined use of the UML profile for systems engineering, the IEC 61499 standard and FIPA standard protocols. The benefits of the object-oriented paradigm and the models of the IEC 61499 standard are extended with UML/SysML and FIPA notations. The development phases are illustrated with the UML models of a simple process control system. The main benefits of the proposed approach can be summarized as follows: it provides consistency in the syntax and underlying semantics; it increases the potential and likelihood of reuse; and it supports the whole software development life cycle in the field of process control. Including the SysML features, based on extended activity and parametric diagrams, flow ports and items, in the proposed approach opens possibilities for modeling continuous systems and supports development in the field of process control. Another advantage, connected to the UML/MARTE profile, is the possibility of analysing the designed system and of detailed design of the hardware and software platform of the modeled application. Key words: object-oriented modeling, control system, UML, SysML, IEC 61499

  13. Statistical metrology - measurement and modeling of variation for advanced process development and design rule generation

    International Nuclear Information System (INIS)

    Boning, Duane S.; Chung, James E.

    1998-01-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including the design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of 'dummy fill' or 'metal fill' to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
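    As a hedged illustration of the statistical decomposition phase, the sketch below separates wafer-level, within-chip systematic, and random components of synthetic ILD thickness data; the layout, units and magnitudes are invented, not measurements or the authors' method.

      # Sketch of a nested variance decomposition: chip-to-chip (wafer-level)
      # variation vs. within-chip systematic and random components.
      import numpy as np

      rng = np.random.default_rng(2)
      n_chips, n_sites = 20, 9
      chip_means = 8000 + rng.normal(0, 40, n_chips)   # wafer-level variation, Angstrom
      pattern = np.linspace(-60, 60, n_sites)          # systematic within-chip trend
      thickness = (chip_means[:, None] + pattern[None, :]
                   + rng.normal(0, 10, (n_chips, n_sites)))

      grand = thickness.mean()
      site_means = thickness.mean(axis=0)              # per-site average over chips
      systematic = site_means - grand                  # within-chip systematic part
      residual = thickness - thickness.mean(axis=1, keepdims=True) - systematic

      print("wafer-level std (chip means): %.1f" % thickness.mean(axis=1).std(ddof=1))
      print("within-chip systematic std:   %.1f" % systematic.std(ddof=1))
      print("random residual std:          %.1f" % residual.std(ddof=1))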

  14. Lipid Processing Technology: Building a Multilevel Modeling Network

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel; Mustaffa, Azizul Azri; Hukkerikar, Amol

    2011-01-01

    of a computer aided multilevel modeling network consisting of a collection of new and adopted models, methods and tools for the systematic design and analysis of processes employing lipid technology. This is achieved by decomposing the problem into four levels of modeling: 1. pure component properties; 2. mixtures...... and phase behavior; 3. unit operations; and 4. process synthesis and design. The methods and tools in each level include: for the first level, a lipid database of collected experimental data from the open literature, confidential data from industry and generated data from validated predictive property...... of these unit operations with respect to performance parameters such as minimum total cost, product yield improvement, operability etc., and process intensification for the retrofit of existing biofuel plants. In the fourth level the information and models developed are used as building blocks...

  15. Application of Computer Simulation Modeling to Medication Administration Process Redesign

    Directory of Open Access Journals (Sweden)

    Nathan Huynh

    2012-01-01

    Full Text Available The medication administration process (MAP is one of the most high-risk processes in health care. MAP workflow redesign can precipitate both unanticipated and unintended consequences that can lead to new medication safety risks and workflow inefficiencies. Thus, it is necessary to have a tool to evaluate the impact of redesign approaches in advance of their clinical implementation. This paper discusses the development of an agent-based MAP computer simulation model that can be used to assess the impact of MAP workflow redesign on MAP performance. The agent-based approach is adopted in order to capture Registered Nurse medication administration performance. The process of designing, developing, validating, and testing such a model is explained. Work is underway to collect MAP data in a hospital setting to provide more complex MAP observations to extend development of the model to better represent the complexity of MAP.

  16. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    The present thesis considers numerical modeling of activated sludge tanks at municipal wastewater treatment plants. Focus is aimed at integrated modeling, where the detailed microbiological model, the Activated Sludge Model 3 (ASM3), is combined with a detailed hydrodynamic model based on a numerical...... solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that often occur in activated sludge tanks are initially...... hydrofoil shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements...

  17. Quality-by-Design (QbD): An integrated process analytical technology (PAT) approach for a dynamic pharmaceutical co-precipitation process characterization and process design space development.

    Science.gov (United States)

    Wu, Huiquan; White, Maury; Khan, Mansoor A

    2011-02-28

    The aim of this work was to develop an integrated process analytical technology (PAT) approach for a dynamic pharmaceutical co-precipitation process characterization and design space development. A dynamic co-precipitation process by gradually introducing water to the ternary system of naproxen-Eudragit L100-alcohol was monitored in real time in situ via Lasentec FBRM and PVM. A 3D map of count-time-chord length revealed three distinguishable process stages: incubation, transition, and steady-state. The effects of high risk process variables (slurry temperature, stirring rate, and water addition rate) on both derived co-precipitation process rates and final chord-length-distribution were evaluated systematically using a 3³ full factorial design. Critical process variables were identified via ANOVA for both transition and steady state. General linear models (GLM) were then used for parameter estimation for each critical variable. Clear trends about the effects of each critical variable during transition and steady state were found by GLM and were interpreted using fundamental process principles and Nyvlt's transfer model. Neural network models were able to link process variables with response variables at transition and steady state with R² of 0.88-0.98. PVM images evidenced nucleation and crystal growth. Contour plots illustrated the design space via critical process variables' ranges. It demonstrated the utility of the integrated PAT approach for QbD development. Published by Elsevier B.V.

  18. Mathematical modeling of biomass fuels formation process

    International Nuclear Information System (INIS)

    Gaska, Krzysztof; Wandrasz, Andrzej J.

    2008-01-01

    The increasing demand for thermal and electric energy in many branches of industry and municipal management is drastically diminishing natural resources (fossil fuels). Meanwhile, numerous technical processes produce a huge mass of wastes. A segregated and converted combustible fraction of the wastes, with relatively high calorific value, may be used as a component of formed fuels. The utilization of formed fuel components from segregated groups of waste in associated processes of co-combustion with conventional fuels brings significant savings resulting from the partial replacement of fossil fuels, and reduces environmental pollution directly by limiting the migration of waste into the environment (soil, atmospheric air, surface and underground water). The realization of technological processes utilizing formed fuel in associated thermal systems should be qualified by technical criteria, which means that elementary processes, as well as the factors of sustainable development viewed globally, must not be disturbed. The utilization of post-process waste should be preceded by detailed technical, ecological and economic analyses. In order to optimize the mixing process of fuel components, a mathematical model of the forming process was created. The model is defined as a group of data structures that uniquely identify a real process, together with algorithms that convert these data by solving a linear programming problem. The paper also presents the optimization of parameters in the fuel forming process using a modified simplex algorithm with polynomial running time. This model is a reference point in the numerical modeling of real processes, allowing a precise determination of the optimal elementary composition of formed fuel components, under the assumed constraints and decision variables of the task
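    The blending optimization can be sketched as a small linear program of the kind the paper describes; the component set, costs, heating values and bounds below are illustrative assumptions, not data from the paper.

      # Sketch of fuel blending as a linear program: choose component mass
      # fractions to minimize cost subject to a calorific-value floor.
      from scipy.optimize import linprog

      # components: [coal, RDF (refuse-derived fuel), biomass] -- assumed data
      cost = [90.0, 20.0, 45.0]      # cost per tonne, assumed
      lhv = [25.0, 14.0, 17.0]       # lower heating value, MJ/kg, assumed
      lhv_min = 18.0                 # required calorific value of the blend

      # minimize cost.x  s.t.  sum(x) == 1,  lhv.x >= lhv_min,  0 <= x_i <= 0.6
      res = linprog(c=cost,
                    A_ub=[[-v for v in lhv]], b_ub=[-lhv_min],  # sign flip for >=
                    A_eq=[[1.0, 1.0, 1.0]], b_eq=[1.0],
                    bounds=[(0.0, 0.6)] * 3, method="highs")
      print("optimal mass fractions:", [round(x, 3) for x in res.x])
      print("blend cost per tonne: %.2f, LHV: %.2f MJ/kg"
            % (res.fun, sum(f * v for f, v in zip(res.x, lhv))))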

  19. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    the conceptual model on which it is based. In this study, a number of model structural shortcomings were identified, such as a lack of dissolved phosphorus transport via infiltration excess overland flow, potential discrepancies in the particulate phosphorus simulation and a lack of spatial granularity. (4) Conceptual challenges, as conceptual models on which predictive models are built are often outdated, having not kept up with new insights from monitoring and experiments. For example, soil solution dissolved phosphorus concentration in INCA-P is determined by the Freundlich adsorption isotherm, which could potentially be replaced using more recently-developed adsorption models that take additional soil properties into account. This checklist could be used to assist in identifying why model performance may be poor or unreliable. By providing a model evaluation framework, it could help prioritise which areas should be targeted to improve model performance or model credibility, whether that be through using alternative calibration techniques and statistics, improved data collection, improving or simplifying the model structure or updating the model to better represent current understanding of catchment processes.
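    For reference, the Freundlich adsorption isotherm mentioned above takes a one-line form; the parameter values in this sketch are placeholders, not calibrated INCA-P inputs.

      # The Freundlich isotherm: sorbed concentration as a power law of
      # solution concentration. Parameter values below are illustrative only.
      def freundlich(c, k_f, n):
          """Sorbed P amount q = K_f * c**(1/n) for solution concentration c."""
          return k_f * c ** (1.0 / n)

      # e.g., sorbed amount (mg/kg) over a range of solution concentrations (mg/l)
      for c in (0.01, 0.05, 0.1, 0.5):
          print(c, round(freundlich(c, k_f=32.0, n=1.6), 2))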

  20. Development methodology for the software life cycle process of the safety software

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. H.; Lee, S. S. [BNF Technology, Taejon (Korea, Republic of); Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B. [KAERI, Taejon (Korea, Republic of)

    2002-05-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). The software life cycle model selected is a hybrid model mixing the waterfall, prototyping, and spiral models, composed of two stages: development of the ESF-CCS prototype and development of the ESF-CCS itself. To produce the software life cycle (SLC) for the Development of the Digital Reactor Safety System, the Activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available OPAs (Organizational Process Assets) are applied to the SLC Activities and the known constraints are reconciled. The established SLCP provides a sound description of the software life cycle activities required by the Regulatory Authority.

  1. Development methodology for the software life cycle process of the safety software

    International Nuclear Information System (INIS)

    Kim, D. H.; Lee, S. S.; Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B.

    2002-01-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). The software life cycle model selected is a hybrid model mixing the waterfall, prototyping, and spiral models, composed of two stages: development of the ESF-CCS prototype and development of the ESF-CCS itself. To produce the software life cycle (SLC) for the Development of the Digital Reactor Safety System, the Activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available OPAs (Organizational Process Assets) are applied to the SLC Activities and the known constraints are reconciled. The established SLCP provides a sound description of the software life cycle activities required by the Regulatory Authority

  2. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating the very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  3. Multi-dimensional population balance models of crystallization processes

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; von Solms, Nicolas

    A generic and model-based framework for batch cooling crystallization operations has been extended to incorporate continuous and fed-batch processes. Modules for the framework have been developed, including a module for reactions, allowing the study of reactive crystallization within the framework....... A kinetic model library together with an ontology for knowledge representation has been developed, in which kinetic models and relations from the literature are stored along with the references and data. The model library connects to the generic modelling framework as well, as models can be retrieved......, analyzed, used for simulation and stored again. The model library facilitates comparison of expressions for kinetic phenomena and is tightly integrated with the model analysis tools of the framework.Through the framework, a model for a crystallization operation may be systematically generated...

  4. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    Science.gov (United States)

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. We are not aware of previous experiences using this notation for process modelling within Pathology, in Spain or in any other country. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dept. of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology using BPMN is presented. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, in which management and improvements are more easily implemented by health professionals.

  5. I. WORKING MEMORY CAPACITY IN CONTEXT: MODELING DYNAMIC PROCESSES OF BEHAVIOR, MEMORY, AND DEVELOPMENT.

    Science.gov (United States)

    Simmering, Vanessa R

    2016-09-01

    Working memory is a vital cognitive skill that underlies a broad range of behaviors. Higher cognitive functions are reliably predicted by working memory measures from two domains: children's performance on complex span tasks, and infants' performance in looking paradigms. Despite the similar predictive power across these research areas, theories of working memory development have not connected these different task types and developmental periods. The current project takes a first step toward bridging this gap by presenting a process-oriented theory, focusing on two tasks designed to assess visual working memory capacity in infants (the change-preference task) versus children and adults (the change detection task). Previous studies have shown inconsistent results, with capacity estimates increasing from one to four items during infancy, but only two to three items during early childhood. A probable source of this discrepancy is the different task structures used with each age group, but prior theories were not sufficiently specific to explain how performance relates across tasks. The current theory focuses on cognitive dynamics, that is, how memory representations are formed, maintained, and used within specific task contexts over development. This theory was formalized in a computational model to generate three predictions: 1) capacity estimates in the change-preference task should continue to increase beyond infancy; 2) capacity estimates should be higher in the change-preference versus change detection task when tested within individuals; and 3) performance should correlate across tasks because both rely on the same underlying memory system. I also tested a fourth prediction, that development across tasks could be explained through increasing real-time stability, realized computationally as strengthening connectivity within the model. Results confirmed these predictions, supporting the cognitive dynamics account of performance and developmental changes in real

  6. Toward a General Research Process for Using Dubin's Theory Building Model

    Science.gov (United States)

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  7. Modeling and knowledge acquisition processes using case-based inference

    Directory of Open Access Journals (Sweden)

    Ameneh Khadivar

    2017-03-01

    Full Text Available The method of acquiring and presenting organizational process knowledge has been considered by many KM researchers. In this research, a model for process knowledge acquisition and presentation is presented using a Case-Based Reasoning approach. The presented model was validated by an expert panel. A software system was then developed based on the presented model and implemented in Eghtesad Novin Bank of Iran. In this company, based on the stages of the presented model, the knowledge-intensive processes were first identified; the process knowledge was then stored in a knowledge base in the format of problem/solution/consequent. Knowledge retrieval was based on the similarity of the nearest-neighbor algorithm. To validate the implemented system, the results of the system were compared with the decisions made by the experts of the process.
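    A minimal sketch of the retrieval step is given below, assuming Euclidean nearest-neighbor similarity over feature vectors describing the "problem" part of each case; the case base, feature encoding and solution labels are invented for illustration, not the bank's system.

      # Sketch of case retrieval by nearest-neighbor similarity over
      # problem/solution cases stored in a knowledge base.
      import numpy as np

      # each case: a feature vector describing the "problem" part (invented)
      case_features = np.array([[1.0, 0.2, 0.0],
                                [0.1, 0.9, 0.3],
                                [0.0, 0.4, 1.0]])
      case_solutions = ["escalate to credit committee",       # hypothetical
                        "apply standard KYC workflow",        # solution labels
                        "route to branch operations"]

      def retrieve(query, k=1):
          """Return the k most similar cases by Euclidean distance."""
          d = np.linalg.norm(case_features - query, axis=1)
          return [(case_solutions[i], round(d[i], 3)) for i in np.argsort(d)[:k]]

      print(retrieve(np.array([0.2, 0.8, 0.2]), k=2))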

  8. Systematic Development of Miniaturized (Bio)Processes using Process Systems Engineering (PSE) Methods and Tools

    DEFF Research Database (Denmark)

    Krühne, Ulrich; Larsson, Hilde; Heintz, Søren

    2014-01-01

    The focus of this work is on process systems engineering (PSE) methods and tools, and especially on how such PSE methods and tools can be used to accelerate and support systematic bioprocess development at a miniature scale. After a short presentation of the PSE methods and the bioprocess...... development drivers, three case studies are presented. In the first example it is demonstrated how experimental investigations of the bi-enzymatic production of lactobionic acid can be modeled with the help of a new mechanistic mathematical model. The reaction was performed at lab scale and the prediction quality...

  9. Developments in regional scale simulation: modelling ecologically sustainable development in the Northern Territory

    International Nuclear Information System (INIS)

    Moffatt, I.

    1992-01-01

    This paper outlines one way in which researchers can make a positive methodological contribution to the debate on ecologically sustainable development (ESD) by integrating dynamic modelling and geographical information systems to form the basis for regional scale simulations. Some of the orthodox uses of Geographic Information Systems (GIS) are described and it is argued that most applications do not incorporate process-based causal models. A description of a pilot study into developing a process-based model of ESD in the Northern Territory is given. This dynamic process-based simulation model consists of two regions, namely the 'Top End' and the 'Central' district. Each region consists of ten sub-sectors, and the pattern of land use represents a sector common to both regions. The role of environmental defence expenditure, including the environmental rehabilitation of uranium mines, in the model is noted. Similarly, it is hypothesized that exogenous changes such as the greenhouse effect and global economic fluctuations can have a differential impact on the behaviour of several sectors of the model. Some of the problems associated with calibrating and testing the model are reviewed. Finally, it is suggested that further refinement of this model can be achieved by pooling data sets and developing PC-based transputers for more detailed and accurate regional scale simulations. When fully developed, it is anticipated that this pilot model can be of service to environmental managers and other groups involved in promoting ESD in the Northern Territory. 54 refs., 6 figs

  10. Modeling grinding processes as micro processes

    African Journals Online (AJOL)

    Industrial precision grinding processes are cylindrical, centerless and ... Several models have been proposed and used to study grinding ..... grinding force for the two cases were 9.07237 N/mm ..... International Journal of Machine Tools &.

  11. A Companion Model Approach to Modelling and Simulation of Industrial Processes

    International Nuclear Information System (INIS)

    Juslin, K.

    2005-09-01

    Modelling and simulation offer huge possibilities if broadly taken up by engineers as a working method. However, when considering the launching of modelling and simulation tools in an engineering design project, they must be easy to learn and use: there is no time to write equations, to consult suppliers' experts, or to manually transfer data from one tool to another. The answer seems to lie in the integration of easy-to-use and dependable simulation software with engineering tools. Accordingly, the modelling and simulation software shall accept as input such structured design information on industrial unit processes and their connections as is provided by e.g. CAD software and product databases. The software technology, including the required specification and communication standards, is already available. Internet based service repositories make it possible for equipment manufacturers to supply 'extended products', including such design data as needed by engineers engaged in process and automation integration. There is a market niche evolving for simulation service centres, operating in co-operation with project consultants, equipment manufacturers, process integrators, automation designers, plant operating personnel, and maintenance centres. The companion model approach for the specification and solution of process simulation models, as presented herein, is developed from the above premises. The focus is on how to tackle real-world processes, which from the modelling point of view are heterogeneous, dynamic, very stiff, very nonlinear and only piecewise continuous, without extensive manual interventions by human experts. An additional challenge, to solve the arising equations fast and reliably, is dealt with as well. (orig.)

  12. A Multi-scale Modeling System with Unified Physics to Study Precipitation Processes

    Science.gov (United States)

    Tao, W. K.

    2017-12-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional scale model (the NASA unified Weather Research and Forecasting model, WRF), and (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF). The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling system to study precipitation processes and their sensitivity to model resolution and microphysics schemes will be presented. The use of the multi-satellite simulator to improve simulated precipitation processes will also be discussed.

  13. Hydrologic and Water Quality Model Development Using Simulink

    Directory of Open Access Journals (Sweden)

    James D. Bowen

    2014-11-01

    Full Text Available A stormwater runoff model based on the Soil Conservation Service (SCS) method and a finite-volume based water quality model have been developed to investigate the use of Simulink in teaching and research. Simulink, a MATLAB extension, is a graphically based model development environment for system modeling and simulation. Widely used for mechanical and electrical systems, Simulink has seen less use in the modeling of hydrologic systems. The watershed model is being considered for use in teaching graduate-level courses in hydrology and/or stormwater modeling. Simulink’s block (data process) and arrow (data transfer) object model, the copy-and-paste user interface, the large number of existing blocks, and the absence of computer code allow students to become model developers almost immediately. The visual depiction of systems, their component subsystems, and the flow of data through the systems are ideal attributes for hands-on teaching of hydrologic and mass balance processes to today’s computer-savvy visual learners. Model development with Simulink for research purposes is also investigated. A finite volume, multi-layer pond model using the water quality kinetics present in CE-QUAL-W2 has been developed using Simulink. The model is one of the first uses of Simulink for modeling eutrophication dynamics in stratified natural systems. The model structure and a test case are presented. One use of the model for teaching a graduate-level water quality modeling class is also described.
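    The SCS curve-number relation that the Simulink watershed model implements graphically can be written out directly, as in the sketch below; the curve number and rainfall depths are example values, and the standard 0.2S initial-abstraction assumption is used.

      # The SCS curve-number runoff relation in its standard form:
      # Q = (P - Ia)^2 / (P - Ia + S), with S = 1000/CN - 10 and Ia = 0.2*S.
      def scs_runoff(p_in, cn):
          """Runoff depth Q (inches) for rainfall P (inches) and curve number CN."""
          s = 1000.0 / cn - 10.0      # potential maximum retention, inches
          ia = 0.2 * s                # initial abstraction (standard assumption)
          if p_in <= ia:
              return 0.0
          return (p_in - ia) ** 2 / (p_in - ia + s)

      for p in (0.5, 1.0, 2.0, 4.0):  # example storm depths, CN=80 is an example
          print(p, round(scs_runoff(p, cn=80), 3))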

  14. A Linked Model for Simulating Stand Development and Growth Processes of Loblolly Pine

    Science.gov (United States)

    V. Clark Baldwin; Phillip M. Dougherty; Harold E. Burkhart

    1998-01-01

    Linking models of different scales (e.g., process, tree-stand-ecosystem) is essential for furthering our understanding of stand, climatic, and edaphic effects on tree growth and forest productivity. Moreover, linking existing models that differ in scale and levels of resolution quickly identifies knowledge gaps in information required to scale from one level to another...

  15. Modeling as a Decision-Making Process

    Science.gov (United States)

    Bleiler-Baxter, Sarah K.; Stephens, D. Christopher; Baxter, Wesley A.; Barlow, Angela T.

    2017-01-01

    The goal in this article is to support teachers in better understanding what it means to model with mathematics by focusing on three key decision-making processes: Simplification, Relationship Mapping, and Situation Analysis. The authors use the Theme Park task to help teachers develop a vision of how students engage in these three decision-making…

  16. Artificial neural network modelling approach for a biomass gasification process in fixed bed gasifiers

    International Nuclear Information System (INIS)

    Mikulandrić, Robert; Lončar, Dražen; Böhning, Dorith; Böhme, Rene; Beckmann, Michael

    2014-01-01

    Highlights: • Two different equilibrium models are developed and their performance is analysed. • Neural network prediction models for two different fixed bed gasifier types are developed. • The influence of different input parameters on neural network model performance is analysed. • A methodology for neural network model development for different gasifier types is described. • Neural network models are verified for various operating conditions based on measured data. - Abstract: The number of small and middle-scale biomass gasification combined heat and power plants as well as syngas production plants has increased significantly in the last decade, mostly due to extensive incentives. However, existing issues regarding syngas quality, process efficiency, emissions and environmental standards are preventing biomass gasification technology from becoming more economically viable. To address these issues, special attention is given to the development of mathematical models which can be used for process analysis or plant control purposes. The presented paper analyses the potential of neural networks to predict process parameters with high speed and accuracy. After a related literature review and measurement data analysis, different modelling approaches for process parameter prediction that can be used for on-line process control were developed and their performance was analysed. Neural network models showed good capability to predict biomass gasification process parameters with reasonable accuracy and speed. Measurement data for the model development, verification and performance analysis were derived from a biomass gasification plant operated by Technical University Dresden
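    As a hedged illustration of such a prediction model (not the models of the paper), the sketch below fits a small scikit-learn neural network mapping two operating variables to a syngas property; the training data are synthetic, not the Dresden plant measurements.

      # Sketch of a neural-network prediction model for gasification parameters.
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      # inputs: bed temperature (deg C) and air ratio -- assumed variables
      X = rng.uniform([650, 1.0], [900, 2.5], size=(400, 2))
      # fake "CO content (%)" target with a simple hidden relationship plus noise
      y = 0.02 * X[:, 0] - 4.0 * X[:, 1] + rng.normal(0, 0.3, 400)

      scaler = StandardScaler().fit(X[:300])
      model = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=2000,
                           random_state=0).fit(scaler.transform(X[:300]), y[:300])
      print("validation R^2: %.3f" % model.score(scaler.transform(X[300:]), y[300:]))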

  17. Automated data acquisition technology development:Automated modeling and control development

    Science.gov (United States)

    Romine, Peter L.

    1995-01-01

    This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rackmounted PCs. This research was initiated because the Metal Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.

  18. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    Energy Technology Data Exchange (ETDEWEB)

    E. Sonnenthal; N. Spycher

    2001-02-05

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data

  19. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    International Nuclear Information System (INIS)

    Sonnenthal, E.

    2001-01-01

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [1534471]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: Continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies described in this AMR are

  20. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model of the information processing of NPP operators and ii) quantifying that model. To resolve the problems of previous approaches based on information theory, i.e. the problems of single-channel approaches, we first develop an information processing model having multiple stages, which contains information flows. The uncertainty of the information is then quantified using Conant's model, a kind of information theory
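    The quantification step can be illustrated with Shannon entropy: the sketch below computes the information content per observation for an invented distribution of control-room cues. The probabilities and observation count are placeholders, and the multi-stage model itself is not reproduced here.

      # Sketch of the information-theoretic quantification: the Shannon entropy
      # (bits) of the signals an operator must discriminate during a task.
      import math

      def entropy_bits(p):
          """Shannon entropy H = -sum p_i log2 p_i of a discrete distribution."""
          return -sum(pi * math.log2(pi) for pi in p if pi > 0.0)

      alarm_probs = [0.5, 0.25, 0.125, 0.125]   # relative frequencies of 4 cues (invented)
      h = entropy_bits(alarm_probs)
      print("information per observation: %.2f bits" % h)       # 1.75 bits here
      print("over 40 observations: %.1f bits processed" % (40 * h))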

  1. Understanding Coupled Earth-Surface Processes through Experiments and Models (Invited)

    Science.gov (United States)

    Overeem, I.; Kim, W.

    2013-12-01

    Traditionally, both numerical models and experiments have been purposefully designed to 'isolate' singular components or certain processes of a larger, interconnected mountain-to-deep-ocean source-to-sink (S2S) transport system. Controlling factors driven by processes outside the domain of immediate interest were treated and simplified as input or as boundary conditions. Increasingly, earth surface processes scientists appreciate feedbacks and explore them with more dynamically coupled approaches to their experiments and models. Here, we discuss key concepts and recent advances made in coupled modeling and experimental setups. In addition, we emphasize challenges and new frontiers in coupled experiments. Experiments have highlighted the important role of self-organization; river and delta systems do not always need to be forced by external processes to change or to develop characteristic morphologies. Similarly, modeling has shown, for example, that the intricate channel networks in tidal deltas are stable because of the interplay between river avulsions and tidal current scouring, with both processes being important to develop and maintain the dendritic networks. Both models and experiments have demonstrated that seemingly stable systems can be perturbed slightly and show dramatic responses. Source-to-sink models were developed for both the Fly River system in Papua New Guinea and the Waipaoa River in New Zealand. These models pointed to the importance of upstream-downstream effects and reinforced our view of the S2S system as a signal transfer and dampening conveyor belt. Coupled modeling showed that deforestation had extreme effects on sediment fluxes draining from the catchment of the Waipaoa River in New Zealand, and that this increase in sediment production rapidly shifted the locus of offshore deposition. The challenge in designing coupled models and experiments is both technological and intellectual. Our community advances to make numerical model coupling more

  2. Agent Behavior-Based Simulation Study on Mass Collaborative Product Development Process

    Directory of Open Access Journals (Sweden)

    Shuo Zhang

    2015-01-01

    Full Text Available Mass collaborative product development (MCPD) benefits people with highly innovative products at lower cost and shorter lead time, thanks to the quick development of group innovation, Internet-based customization, and prototype manufacturing. Simulation is an effective way to study the evolution process and therefore to guarantee the success of MCPD. In this paper, an agent behavior-based simulation approach to MCPD is developed, which models the MCPD process as the interactive process of design agents and environment objects based on Complex Adaptive System (CAS) theory. Next, the structure model of the design agent is proposed, and the modification and collaboration behaviors are described. Third, the agent behavior-based simulation flow of MCPD is designed. At last, simulation experiments are carried out based on an engineering case of mobile phone design. The experiment results show the following: (1) the community scale has significant influence on the MCPD process; (2) the simulation process can explicitly represent the modification and collaboration behaviors of design agents; (3) the community evolution process can be observed and analyzed dynamically based on simulation data.

  3. Building an Economic and Mathematical Model of Influence of Integration Processes Upon Development of Tourism in Ukraine

    Directory of Open Access Journals (Sweden)

    Yemets Mariya S.

    2013-12-01

    Full Text Available Today Ukraine is actively searching for its own way into world integration processes, pursuing a multi-vector foreign economic policy and moving towards integration with both the EU and the CIS countries. Given the establishment of international tourism relations, the main task for Ukraine is to capture a bigger share of world tourist arrivals. That is why, in order to study the influence of integration processes upon the development of tourism in the country, the author offers the following model: building regression equations for the share of Ukraine's exports of tourist services to CIS and EU countries, with the aim of further comparative analysis. The conducted analysis allows the conclusion that integration factors influence the development of international tourism; however, it is shown that this influence is not unequivocal and in some cases even inconsistent. Identifying the directions of such interdependency allows an efficient tourism policy to be built by selecting adaptive directions of integration.

  4. Devil is in the details: Using logic models to investigate program process.

    Science.gov (United States)

    Peyton, David J; Scicchitano, Michael

    2017-12-01

    Theory-based logic models are commonly developed as part of requirements for grant funding. As a tool to communicate complex social programs, theory-based logic models are an effective form of visual communication. However, after initial development, theory-based logic models are often abandoned and remain in their initial form despite changes in the program process. This paper examines the potential benefits of committing time and resources to revising the initial theory-driven logic model and developing detailed logic models that describe key activities, to accurately reflect the program and assist in effective program management. The authors use a funded special education teacher preparation program to exemplify the utility of drill-down logic models. The paper concludes with lessons learned from the iterative revision process and suggests how the process can lead to more flexible and calibrated program management. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.
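    A minimal sketch of the convection-diffusion model type discussed is given below, assuming steady one-dimensional flow with axial dispersion and a first-order reaction, solved by upwind finite differences; all parameter values are illustrative, not taken from the book.

      # Steady 1D convection-diffusion-reaction column profile:
      # u*dc/dz = D*d2c/dz2 - k*c, upwind convection, Dirichlet inlet,
      # zero-gradient outlet. All parameters are assumed example values.
      import numpy as np

      n, L = 200, 2.0                 # grid points, column height (m)
      u, D, k = 0.05, 1e-3, 0.08      # velocity (m/s), dispersion (m^2/s), rate (1/s)
      dz = L / (n - 1)
      c_in = 1.0                      # inlet concentration (normalized)

      A = np.zeros((n, n)); b = np.zeros(n)
      A[0, 0], b[0] = 1.0, c_in       # Dirichlet inlet
      A[-1, -1], A[-1, -2] = 1.0, -1.0  # zero-gradient outlet
      for i in range(1, n - 1):
          A[i, i-1] = -u / dz - D / dz**2
          A[i, i]   =  u / dz + 2 * D / dz**2 + k
          A[i, i+1] = -D / dz**2
      c = np.linalg.solve(A, b)
      print("outlet conversion: %.1f %%" % (100 * (1 - c[-1] / c_in)))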

  6. Development of an expert system for analysis of plutonium processing operations

    International Nuclear Information System (INIS)

    Boeringter, S.T.; Fasel, J.H.; Kornreich, D.E.

    2001-01-01

    At Los Alamos National Laboratory (LANL) an expert system has been developed for the analysis and assessment of plutonium processing operations. This system is based upon an object-oriented simulation environment specifically developed for the needs of nuclear material processing. The simulation environment, called the ''Process Modeling System'' (ProMoS), contains a library of over 250 plutonium-based unit process operations ranging from analytical chemistry, oxide operations, recycle and recovery, waste management, and component fabrication. (author)

  7. Process Cost Modeling for Multi-Disciplinary Design Optimization

    Science.gov (United States)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to

  8. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    International Nuclear Information System (INIS)

    Dixon, P.

    2004-01-01

    The purpose of this Model Report (REV02) is to document the unsaturated zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrological-chemical (THC) processes on UZ flow and transport. This Model Report has been developed in accordance with the ''Technical Work Plan for: Performance Assessment Unsaturated Zone'' (Bechtel SAIC Company, LLC (BSC) 2002 [160819]). The technical work plan (TWP) describes planning information pertaining to the technical scope, content, and management of this Model Report in Section 1.12, Work Package AUZM08, ''Coupled Effects on Flow and Seepage''. The plan for validation of the models documented in this Model Report is given in Attachment I, Model Validation Plans, Section I-3-4, of the TWP. Except for variations in acceptance criteria (Section 4.2), there were no deviations from this TWP. This report was developed in accordance with AP-SIII.10Q, ''Models''. This Model Report documents the THC Seepage Model and the Drift Scale Test (DST) THC Model. The THC Seepage Model is a drift-scale process model for predicting the composition of gas and water that could enter waste emplacement drifts and the effects of mineral alteration on flow in rocks surrounding drifts. The DST THC model is a drift-scale process model relying on the same conceptual model and much of the same input data (i.e., physical, hydrological, thermodynamic, and kinetic) as the THC Seepage Model. The DST THC Model is the primary method for validating the THC Seepage Model. The DST THC Model compares predicted water and gas compositions, as well as mineral alteration patterns, with observed data from the DST. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal-loading conditions, and predict the evolution of mineral alteration and fluid chemistry around potential waste emplacement drifts. The DST THC Model is used solely for the validation of the THC

  9. Effect of process operating conditions in the biomass torrefaction: A simulation study using one-dimensional reactor and process model

    International Nuclear Information System (INIS)

    Park, Chansaem; Zahid, Umer; Lee, Sangho; Han, Chonghun

    2015-01-01

    A torrefaction reactor model is required for the development of reactor and process designs for biomass torrefaction. In this study, a one-dimensional reactor model is developed based on a kinetic model describing the evolution of volatile components and solids, and on an existing thermochemical model accounting for the heat and mass balance. The developed reactor model uses the temperature and flow rate of the recycled gas as the practical manipulated variables instead of the torrefaction temperature. Temperature profiles of the gas and solid phases were generated for practical thermal conditions using the developed model. Moreover, the effect of each selected operating variable on the parameters of the torrefaction process, and the combined effect of the operating variables at a specified energy yield, were analyzed. The sensitivity analysis shows that the residence time has an insignificant influence on the energy yield when the flow rate of recycled gas is low. Moreover, a higher recycled-gas temperature combined with a low flow rate and short residence time produces attractive properties of the torrefied biomass, including HHV and grindability, when the energy yield is specified. - Highlights: • A one-dimensional reactor model for biomass torrefaction is developed considering the heat and mass balance. • The developed reactor model uses the temperature and flow rate of the recycled gas as the practical manipulated variables. • The effect of operating variables on the parameters of the torrefaction process is analyzed. • The sensitivity analysis provides notable insights not addressed by previous research.
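
    As an illustration of the kind of calculation such a one-dimensional reactor model performs, the following sketch integrates gas and solid energy balances together with a single first-order solid-decomposition step over the solid residence time; the Arrhenius constants, heat-transfer coefficient, and holdups are generic placeholders rather than the paper's multi-component kinetic scheme:

```python
# Toy 1-D torrefaction sketch: recycled gas heats the solid, which slowly
# decomposes. All constants below are assumed, illustrative values.
import numpy as np
from scipy.integrate import solve_ivp

R = 8.314                    # J/(mol K)
A, Ea = 2.0e4, 7.5e4         # assumed Arrhenius constants: 1/s, J/mol
h_vs = 150.0                 # assumed volumetric gas-solid heat transfer, W/(m^3 K)
cp_g, cp_s = 1100.0, 1500.0  # heat capacities, J/(kg K)
G, S = 5.0, 1.0              # assumed gas and solid holdups, kg/m^3

def rhs(t, y):
    """t is solid residence time; y = [gas T, solid T, remaining solid fraction]."""
    Tg, Ts, m = y
    k = A * np.exp(-Ea / (R * Ts))   # first-order decomposition rate, 1/s
    q = h_vs * (Tg - Ts)             # recycled gas heats the solid
    return [-q / (G * cp_g), q / (S * cp_s), -k * m]

# Inlet: recycled gas at 573 K (a manipulated variable), solid at 300 K
sol = solve_ivp(rhs, [0.0, 1800.0], [573.0, 300.0, 1.0], max_step=5.0)
Tg, Ts, m = sol.y[:, -1]
print(f"after 30 min: Tg = {Tg:.0f} K, Ts = {Ts:.0f} K, solid yield = {m:.2f}")
```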

  10. Multiphase porous media modelling: A novel approach to predicting food processing performance.

    Science.gov (United States)

    Khan, Md Imran H; Joardder, M U H; Kumar, Chandan; Karim, M A

    2018-03-04

    The development of a physics-based model of food processing is essential to improve the quality of processed food and optimize energy consumption. Food materials, particularly plant-based food materials, are complex in nature as they are porous and have hygroscopic properties. A multiphase porous media model for simultaneous heat and mass transfer can provide a realistic understanding of transport processes and thus can help to optimize energy consumption and improve food quality. Although the development of a multiphase porous media model for food processing is a challenging task because of its complexity, many researchers have attempted it. The primary aim of this paper is to present a comprehensive review of the multiphase models available in the literature for different methods of food processing, such as drying, frying, cooking, baking, heating, and roasting. A critical review of the parameters that should be considered for multiphase modelling is presented which includes input parameters, material properties, simulation techniques and the hypotheses. A discussion on the general trends in outcomes, such as moisture saturation, temperature profile, pressure variation, and evaporation patterns, is also presented. The paper concludes by considering key issues in the existing multiphase models and future directions for development of multiphase models.

  11. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  12. Modeling surface topography of state-of-the-art x-ray mirrors as a result of stochastic polishing process: recent developments

    Science.gov (United States)

    Yashchuk, Valeriy V.; Centers, Gary; Tyurin, Yuri N.; Tyurina, Anastasia

    2016-09-01

    Recently, an original method for the statistical modeling of surface topography of state-of-the-art mirrors for use in x-ray optical systems at light source facilities and in astronomical telescopes [Opt. Eng. 51(4), 046501, 2012; ibid. 53(8), 084102 (2014); and ibid. 55(7), 074106 (2016)] has been developed. In this modeling, the mirror surface topography is considered to be the result of a stationary uniform stochastic polishing process, and the best-fit time-invariant linear filter (TILF) that optimally parameterizes the polishing process with a limited number of parameters is determined. The TILF model allows the surface slope profile of an optic with a newly desired specification to be reliably forecast before fabrication. With the forecast data, representative numerical evaluations of the expected performance of the prospective mirrors in optical systems under development become possible [Opt. Eng., 54(2), 025108 (2015)]. Here, we suggest and demonstrate an analytical approach for accounting for the imperfections of the metrology instruments used, described by the instrumental point spread function, in the TILF modeling. The efficacy of the approach is demonstrated with numerical simulations for correction of measurements performed with an autocollimator-based surface slope profiler. Besides solving this major metrological problem, the results of the present work open an avenue for developing analytical and computational tools for stitching, in the statistical domain, data obtained using multiple metrology instruments measuring significantly different bandwidths of spatial wavelengths.
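
    A toy stand-in for the statistical idea - treating a residual slope profile as the output of a time-invariant linear filter driven by white noise, fitting the filter with a limited number of parameters, and then forecasting a statistically equivalent profile - might look as follows; the autoregressive least-squares fit and the surrogate data are illustrative simplifications of the published TILF method:

```python
# Fit a low-order linear (AR) filter to a surrogate slope profile, then use it
# to simulate a statistically equivalent profile for a prospective mirror.
import numpy as np

rng = np.random.default_rng(1)
# Surrogate "measured" slope profile - stands in for metrology data
true = np.convolve(rng.normal(size=2048), np.exp(-np.arange(40) / 8.0), "same")

p = 12                                    # filter order (few parameters)
X = np.column_stack([true[p - k - 1 : -k - 1] for k in range(p)])
a, *_ = np.linalg.lstsq(X, true[p:], rcond=None)   # AR coefficients
resid = true[p:] - X @ a                  # innovation (white-noise) sequence
sigma = resid.std()

# Forecast a statistically equivalent profile of a different length
n = 1024
sim = np.zeros(n)
for i in range(p, n):
    sim[i] = sim[i - p : i][::-1] @ a + rng.normal(0.0, sigma)
print("measured rms:", true.std().round(3), " simulated rms:", sim[p:].std().round(3))
```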

  13. Application of the recurrent multilayer perceptron in modeling complex process dynamics.

    Science.gov (United States)

    Parlos, A G; Chong, K T; Atiya, A F

    1994-01-01

    A nonlinear dynamic model is developed for a process system, namely a heat exchanger, using the recurrent multilayer perceptron network as the underlying model structure. The perceptron is a dynamic neural network, which appears effective in the input-output modeling of complex process systems. Dynamic gradient descent learning is used to train the recurrent multilayer perceptron, resulting in an order-of-magnitude improvement in convergence speed over a static learning algorithm used to train the same network. In developing the empirical process model, the effects of actuator, process, and sensor noise on the training and testing sets are investigated. Learning and prediction both appear very effective, despite the presence of training and testing set noise, respectively. The recurrent multilayer perceptron appears to learn the deterministic part of a stochastic training set, and it predicts approximately a moving-average response of various testing sets. Extensive model validation studies with signals that are encountered in the operation of the process system modeled, that is, steps and ramps, indicate that the empirical model can substantially generalize operational transients, including accurate prediction of instabilities not in the training set. However, the accuracy of the model beyond these operational transients has not been investigated. Furthermore, online learning is necessary during some transients and for tracking slowly varying process dynamics. Neural-network-based empirical models appear in some cases to provide a serious alternative to first-principles models.
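
    A minimal sketch of this style of input-output dynamic modeling is shown below, assuming PyTorch and plain backpropagation through time in place of the paper's recurrent multilayer perceptron and dynamic gradient descent; the synthetic first-order process stands in for heat-exchanger data:

```python
# Train a small recurrent network to reproduce the input-output dynamics of a
# synthetic first-order process (a stand-in for heat-exchanger data).
import torch
from torch import nn

torch.manual_seed(0)
# Synthetic process: y[k+1] = 0.9*y[k] + 0.1*u[k] + sensor noise
u = torch.randn(1, 400, 1)
y = torch.zeros(1, 400, 1)
for k in range(399):
    y[0, k + 1, 0] = 0.9 * y[0, k, 0] + 0.1 * u[0, k, 0] + 0.01 * torch.randn(())

class RecurrentModel(nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.rnn = nn.RNN(1, hidden, batch_first=True)  # recurrent hidden layer
        self.head = nn.Linear(hidden, 1)                # static output layer
    def forward(self, u):
        h, _ = self.rnn(u)
        return self.head(h)

model = RecurrentModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for epoch in range(300):                 # backpropagation through time
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(u), y)
    loss.backward()
    opt.step()
print(f"final training MSE: {loss.item():.5f}")
```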

  14. What makes process models understandable?

    NARCIS (Netherlands)

    Mendling, J.; Reijers, H.A.; Cardoso, J.; Alonso, G.; Dadam, P.; Rosemann, M.

    2007-01-01

    Although formal and informal quality aspects are of significant importance to business process modeling, there is little empirical work reported on process model quality and its impact factors. In this paper we investigate understandability as a proxy for quality of process models and focus

  15. Dispersive processes in models of regional radionuclide migration. Technical memorandum

    International Nuclear Information System (INIS)

    Evenson, D.E.; Dettinger, M.D.

    1980-05-01

    Three broad areas of concern in the development of aquifer scale transport models will be local scale diffusion and dispersion processes, regional scale dispersion processes, and numerical problems associated with the advection-dispersion equation. Local scale dispersion processes are fairly well understood and accessible to observation. These processes will generally be dominated in large scale systems by regional processes, or macro-dispersion. Macro-dispersion is primarily the result of large scale heterogeneities in aquifer properties. In addition, the effects of many modeling approximations are often included in the process. Because difficulties arise in parameterization of this large scale phenomenon, parameterization should be based on field measurements made at the same scale as the transport process of interest or else partially circumvented through the application of a probabilistic advection model. Other problems associated with numerical transport models include difficulties with conservation of mass, stability, numerical dissipation, overshoot, flexibility, and efficiency. We recommend the random-walk model formulation for Lawrence Livermore Laboratory's purposes as the most flexible, accurate and relatively efficient modeling approach that overcomes these difficulties.
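
    For concreteness, a minimal random-walk (particle-tracking) advection-dispersion sketch of the kind recommended here is given below, with illustrative velocity and macrodispersion values:

```python
# 1-D advection-dispersion solved by particle tracking: each particle drifts
# with the velocity and takes a Gaussian dispersive jump each step.
import numpy as np

rng = np.random.default_rng(42)
n, steps, dt = 20000, 200, 1.0   # particles, time steps, step size (days)
v, D = 0.5, 2.0                  # advection (m/d), macrodispersion (m^2/d)

x = np.zeros(n)                  # all mass released at x = 0
for _ in range(steps):
    # advective drift plus a dispersive jump with variance 2*D*dt
    x += v * dt + rng.normal(0.0, np.sqrt(2.0 * D * dt), size=n)

# Particle statistics recover the analytical plume: mean v*t, variance 2*D*t
t = steps * dt
print("mean:", x.mean().round(2), "(theory", v * t, ")")
print("var :", x.var().round(1), "(theory", 2 * D * t, ")")
```

    Because the plume is carried by particles rather than by a discretized concentration field, the scheme is inherently mass-conservative and free of the numerical dissipation and overshoot problems listed above.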

  16. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is

  17. Expected Influence of Ethics on Product Development Process

    Directory of Open Access Journals (Sweden)

    Stig Larsson

    2008-07-01

    Full Text Available Product development efficiency and effectiveness depend on a well-executed process. The actions of individuals involved in the process are influenced by the ethical and moral orientations each individual has selected, whether this selection is conscious or not. This paper describes different ethical choices and the expected effects they may have on the development process, exemplified by the product integration process for software products. The frameworks analyzed are utilitarianism, rights ethics, duty ethics, virtue ethics and ethical egoism. The expected effects on the goals of product integration may be debated. This is a result in itself, as it triggers discussions about ethical considerations and increases awareness of the influence of moral decisions. Our conclusion is that adherence to specific moral frameworks simplifies the alignment of actions with the practices described in product development models and standards, and thereby supports a more successful execution of product development projects. This conclusion is also confirmed through a comparison between the different directions and several codes of ethics for engineers issued by organizations such as IEEE, as these combine features from several of the discussed ethical directions.

  18. Recent developments in numerical simulation techniques of thermal recovery processes

    Energy Technology Data Exchange (ETDEWEB)

    Tamim, M. [Bangladesh University of Engineering and Technology, Bangladesh (Bangladesh); Abou-Kassem, J.H. [Chemical and Petroleum Engineering Department, UAE University, Al-Ain 17555 (United Arab Emirates); Farouq Ali, S.M. [University of Alberta, Alberta (Canada)

    2000-05-01

    Numerical simulation of thermal processes (steam flooding, steam stimulation, SAGD, in-situ combustion, electrical heating, etc.) is an integral part of a thermal project design. The general tendency in the last 10 years has been to use commercial simulators. During the last decade, only a few new models have been reported in the literature. More work has been done to modify and refine solutions to existing problems to improve the efficiency of simulators. The paper discusses some of the recent developments in simulation techniques of thermal processes such as grid refinement, grid orientation, effect of temperature on relative permeability, mathematical models, and solution methods. The various aspects of simulation discussed here promote better understanding of the problems encountered in the simulation of thermal processes and will be of value to both simulator users and developers.

  19. Process modelling and optimization of osmotic dehydration assisted ...

    African Journals Online (AJOL)

    ... ash content, water loss and solid gain were estimated as quality parameters. Model equations were developed with Essential Regression (ESSREG) software package which related output parameters to process variables and validated.

  20. Determination of cognitive development: postnonclassical theoretical model

    Directory of Open Access Journals (Sweden)

    Irina N. Pogozhina

    2015-09-01

    Full Text Available The aim of this research is to develop a postnonclassical model of the content determination of cognitive processes, in which mental processes are considered as open, self-developing, self-organizing systems. Three types of systems (dynamic, statistical, developing) were analysed and compared on the basis of the external and internal characteristics of causation, the types of causal chains (dependent, independent) and their interactions, and the nature of the relationships between the elements of the system (hard, probabilistic, mixed). Mechanisms of open non-equilibrium nonlinear (dissipative) systems and four conditions for the emergence of dissipative structures are described. Determination models of the formation and development of mind and behaviour developed under various theoretical approaches (associationism, behaviorism, gestaltism, Piaget's psychology of intelligence, Vygotsky's cultural-historical approach, the activity approach and others) are mapped onto each other as models that describe the behaviour of the three system types mentioned above. The development models of the mental sphere are shown to differ by the following criteria: 1) the number of determinants allocated; 2) the presence or absence of the system's own activity, which determines whether the model selects only external or also internal determinants; 3) the types of causal chains (dependent, independent, blended); 4) the types of relationships between the causal chains, which ultimately determine whether the subsequent determination is deterministic (a hard dynamic pattern) or stochastic (a statistical regularity). The continuity of postnonclassical, classical and non-classical models of mental development determination is described, characterizing the process of gradual refinement, increasing complexity, and «absorption» of earlier determination models by the later ones. The human mind can be viewed as the functioning of an open, developing, non-equilibrium nonlinear (dissipative) system. The mental sphere is

  1. Development of a global 1-D chemically radiatively coupled model and an introduction to the development of a chemically coupled General Circulation Model

    International Nuclear Information System (INIS)

    Akiyoshi, H.

    1997-01-01

    A global one-dimensional, chemically and radiatively coupled model has been developed. The basic concept of the coupled model, the definition of globally averaged zenith angles, the formulation of the model chemistry, radiation, and coupled processes, and the profiles and diurnal variations of temperature and chemical species at a normal steady state are presented. Furthermore, a suddenly doubled CO2 experiment and a Pinatubo aerosol increase experiment were performed with the model. The time scales of variations in ozone and temperature in the lower stratosphere of the coupled system in the doubled CO2 experiment were long, due to a feedback process among ultraviolet radiation, O(1D), NOy, NOx, and O3. From the Pinatubo aerosol experiment, a delay of the maximum ozone decrease relative to the maximum aerosol loading is shown and discussed. Developments of 3-D chemical models with coupled processes are briefly described, and the ozone distribution from the first version of the 3-D model is presented. Chemical model development at the National Institute for Environmental Studies (NIES) is briefly described. (author)

  2. Model-based Rational and Systematic Protein Purification Process Development : A Knowledge-based Approach

    NARCIS (Netherlands)

    Kungah Nfor, B.

    2011-01-01

    The increasing market and regulatory (quality and safety) demands on therapeutic proteins calls for radical improvement in their manufacturing processes. Addressing these challenges requires the adoption of strategies and tools that enable faster and more efficient process development. This thesis

  3. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    OpenAIRE

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    1999-01-01

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  4. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  5. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technology such as selective laser melting, SLM, there are gaps in process modeling and material...

  6. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technology such as selective laser melting, SLM, there are gaps in process modeling and material...

  7. Learning while (re-)configuring: business model innovation processes in established firms

    NARCIS (Netherlands)

    Berends, J.J.; Smits, A.; Reymen, I.M.M.J.; Podoynitsyna, K.S.

    2016-01-01

    This study addresses the question of how established organizations develop new business models over time, using a process research approach to trace how four business model innovation trajectories unfold. With organizational learning as analytical lens, we discern two process patterns: “drifting”

  8. Modelling the Pultrusion Process of Off Shore Wind Turbine Blades

    DEFF Research Database (Denmark)

    Baran, Ismet

    together with the thermal and cure developments are addressed. A detailed survey on pultrusion is presented including numerical and experimental studies available in the literature since the 1980s. Keeping the multi-physics and large amount of variables involved in the pultrusion process in mind...... and shape distortions in the pultrusion process. Together these models present a thermo-chemical-mechanical model framework for the process which is unprecedented in literature. In this framework, the temperature and degree of cure fields already calculated in the thermo-chemical model are mapped...

  9. Stochastic model of milk homogenization process using Markov's chain

    Directory of Open Access Journals (Sweden)

    A. A. Khvostov

    2016-01-01

    Full Text Available The development of a mathematical model of the homogenization process for dairy products is considered in this work. The theory of Markov chains was used in developing the model: a Markov chain with discrete states and a continuous parameter, for which the homogenization pressure is taken, forms the basis of the model structure. The model is implemented in the MathWorks Simulink™ structural modeling environment. Identification of the model parameters was carried out by minimizing the standard deviation from the experimental data for each fraction of the fat phase of the dairy product. The experimental data set comprised processed micrographic images of the fat-globule size distributions of whole milk samples subjected to homogenization at different pressures. The Pattern Search method with the Latin Hypercube search algorithm from the Global Optimization Toolbox library was used for optimization. The calculation error averaged over all fractions was 0.88% (relative units), with a maximum relative error of 3.7% at a homogenization pressure of 30 MPa, which may be due to the very abrupt change in the particle size distribution relative to the original milk at the beginning of the homogenization process and the lack of experimental data at homogenization pressures below this value. The proposed mathematical model allows the volume and mass distribution profiles of the fat phase (fat globules) in the product to be calculated as a function of homogenization pressure, and can be used in laboratory research on dairy product composition, as well as in the calculation, design and modeling of process equipment for dairy industry enterprises.
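
    The core of such a model can be sketched in a few lines: fat-globule size fractions as discrete states and homogenization pressure as the continuous chain parameter, with the state distribution evolved through the matrix exponential of a generator. The generator entries below are hypothetical stand-ins for the coefficients identified from the micrographic data:

```python
# Markov-chain sketch: size fractions as discrete states, pressure as the
# continuous parameter. Generator entries are invented for illustration.
import numpy as np
from scipy.linalg import expm

# States: coarse -> medium -> fine -> very fine size fractions
Q = np.array([[-0.12, 0.10, 0.02, 0.00],
              [ 0.00,-0.08, 0.06, 0.02],
              [ 0.00, 0.00,-0.04, 0.04],
              [ 0.00, 0.00, 0.00, 0.00]])   # per-MPa transition intensities

p0 = np.array([0.70, 0.20, 0.08, 0.02])     # raw-milk fraction distribution
for pressure in (0.0, 10.0, 30.0):          # MPa
    p = p0 @ expm(Q * pressure)             # Kolmogorov forward solution
    print(f"{pressure:4.0f} MPa:", np.round(p, 3))
```

    Breakage only moves mass toward finer fractions, so the generator is upper-triangular and the finest fraction is absorbing; fitting reduces to identifying the off-diagonal intensities from the measured distributions.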

  10. Mathematical modelling of the process of quality control of construction products

    Directory of Open Access Journals (Sweden)

    Pogorelov Vadim

    2017-01-01

    Full Text Available The study presents the results of years of research in the field of quality management of industrial construction production, based on mathematical modelling techniques, together with the process and results of implementing the developed programme of monitoring and quality control in the production process of an enterprise. The aim of this work is to present to the scientific community the practical results of mathematical modelling in applied programs. The research covers a description of the applied mathematical models and the practical results of their application to quality control assessment. The authors have used this mathematical model in practice, and the article presents the results of its application. The authors developed experimental software for management and quality assessment using mathematical modeling methods, and continue research in this direction to improve diagnostic and quality management systems based on mathematical models of prognostic and diagnostic processes.

  11. Enhanced Geothermal Systems Research and Development: Models of Subsurface Chemical Processes Affecting Fluid Flow

    Energy Technology Data Exchange (ETDEWEB)

    Moller, Nancy; Weare J. H.

    2008-05-29

    Successful exploitation of the vast amount of heat stored beneath the earth’s surface in hydrothermal and fluid-limited, low permeability geothermal resources would greatly expand the Nation’s domestic energy inventory and thereby promote a more secure energy supply, a stronger economy and a cleaner environment. However, a major factor limiting the expanded development of current hydrothermal resources as well as the production of enhanced geothermal systems (EGS) is insufficient knowledge about the chemical processes controlling subsurface fluid flow. With funding from past grants from the DOE geothermal program and other agencies, we successfully developed advanced equation of state (EOS) and simulation technologies that accurately describe the chemistry of geothermal reservoirs and energy production processes via their free energies for wide XTP ranges. Using the specific interaction equations of Pitzer, we showed that our TEQUIL chemical models can correctly simulate behavior (e.g., mineral scaling and saturation ratios, gas break out, brine mixing effects, down hole temperatures and fluid chemical composition, spent brine incompatibilities) within the compositional range (Na-K-Ca-Cl-SO4-CO3-H2O-SiO2-CO2(g)) and temperature range (T < 350°C) associated with many current geothermal energy production sites that produce brines with temperatures below the critical point of water. The goal of research carried out under DOE grant DE-FG36-04GO14300 (10/1/2004-12/31/2007) was to expand the compositional range of our Pitzer-based TEQUIL fluid/rock interaction models to include the important aluminum and silica interactions (T < 350°C). Aluminum is the third most abundant element in the earth’s crust; and, as a constituent of aluminosilicate minerals, it is found in two thirds of the minerals in the earth’s crust. The ability to accurately characterize effects of temperature, fluid mixing and interactions between major rock-forming minerals and hydrothermal and

  12. Development of a Mathematical Model for Multivariate Process by Balanced Six Sigma

    Directory of Open Access Journals (Sweden)

    Díaz-Castellanos Elizabeth Eugenia

    2015-07-01

    Full Text Available The Six Sigma methodology is widely used in business to improve quality, increase productivity and lower costs, impacting on business improvement. However, the challenge today is to use those tools for improvements that have a direct impact on the differentiation of value, which requires the alignment of Six Sigma with the competitive strategies of the organization. Hence the importance of a strategic management system to measure, analyze, improve and control corporate performance, while setting out the responsibilities of leadership and commitment. The specific purpose of this research is to provide a mathematical model, through the alignment of strategic objectives (Balanced Scorecard) and tools for productivity improvement (Six Sigma), for processes with multiple responses, which is sufficiently robust to serve as a basis for application in manufacturing and thus effectively link strategy, performance and customer satisfaction. Specifically, we worked with a case study: Córdoba, Ver. The model proposes that if strategy, performance and customer satisfaction are aligned, the organization will benefit from the strong relationship between process performance and strategic initiatives. These changes can be measured by productivity and process metrics such as cycle time, production rates, production efficiency and percentage of reprocessing, among others.

  13. An integrated numerical and physical modeling system for an enhanced in situ bioremediation process

    International Nuclear Information System (INIS)

    Huang, Y.F.; Huang, G.H.; Wang, G.Q.; Lin, Q.G.; Chakma, A.

    2006-01-01

    Groundwater contamination due to releases of petroleum products is a major environmental concern in many urban districts and industrial zones. Over the past years, a few studies were undertaken to address in situ bioremediation processes coupled with contaminant transport in two- or three-dimensional domains. However, they were concentrated on natural attenuation processes for petroleum contaminants or enhanced in situ bioremediation processes in laboratory columns. In this study, an integrated numerical and physical modeling system is developed for simulating an enhanced in situ biodegradation (EISB) process coupled with three-dimensional multiphase multicomponent flow and transport simulation in a multi-dimensional pilot-scale physical model. The designed pilot-scale physical model is effective in tackling natural attenuation and EISB processes for site remediation. The simulation results demonstrate that the developed system is effective in modeling the EISB process, and can thus be used for investigating the effects of various uncertainties. - An integrated modeling system was developed to enhance in situ bioremediation processes

  14. The Context, Process, and Outcome Evaluation Model for Organisational Health Interventions

    Science.gov (United States)

    Fridrich, Annemarie; Jenny, Gregor J.; Bauer, Georg F.

    2015-01-01

    To facilitate evaluation of complex, organisational health interventions (OHIs), this paper aims at developing a context, process, and outcome (CPO) evaluation model. It builds on previous model developments in the field and advances them by clearly defining and relating generic evaluation categories for OHIs. Context is defined as the underlying frame that influences and is influenced by an OHI. It is further differentiated into the omnibus and discrete contexts. Process is differentiated into the implementation process, as the time-limited enactment of the original intervention plan, and the change process of individual and collective dynamics triggered by the implementation process. These processes lead to proximate, intermediate, and distal outcomes, as all results of the change process that are meaningful for various stakeholders. Research questions that might guide the evaluation of an OHI according to the CPO categories and a list of concrete themes/indicators and methods/sources applied within the evaluation of an OHI project at a hospital in Switzerland illustrate the model's applicability in structuring evaluations of complex OHIs. In conclusion, the model supplies a common language and a shared mental model for improving communication between researchers and company members and will improve the comparability and aggregation of evaluation study results. PMID:26557665

  15. The Context, Process, and Outcome Evaluation Model for Organisational Health Interventions.

    Science.gov (United States)

    Fridrich, Annemarie; Jenny, Gregor J; Bauer, Georg F

    2015-01-01

    To facilitate evaluation of complex, organisational health interventions (OHIs), this paper aims at developing a context, process, and outcome (CPO) evaluation model. It builds on previous model developments in the field and advances them by clearly defining and relating generic evaluation categories for OHIs. Context is defined as the underlying frame that influences and is influenced by an OHI. It is further differentiated into the omnibus and discrete contexts. Process is differentiated into the implementation process, as the time-limited enactment of the original intervention plan, and the change process of individual and collective dynamics triggered by the implementation process. These processes lead to proximate, intermediate, and distal outcomes, as all results of the change process that are meaningful for various stakeholders. Research questions that might guide the evaluation of an OHI according to the CPO categories and a list of concrete themes/indicators and methods/sources applied within the evaluation of an OHI project at a hospital in Switzerland illustrate the model's applicability in structuring evaluations of complex OHIs. In conclusion, the model supplies a common language and a shared mental model for improving communication between researchers and company members and will improve the comparability and aggregation of evaluation study results.

  16. Process Modelling of Curing Process-Induced Internal Stress and Deformation of Composite Laminate Structure with Elastic and Viscoelastic Models

    Science.gov (United States)

    Li, Dongna; Li, Xudong; Dai, Jianfeng

    2018-06-01

    In this paper, two kinds of transient models, a viscoelastic model and a linear elastic model, are established to analyze the curing deformation of thermosetting resin composites, and are solved with the COMSOL Multiphysics software. The two models consider the complicated coupling between physical and chemical changes during the curing process of the composites and the time-variant character of the material performance parameters. Subsequently, the two proposed models are each applied to a three-dimensional composite laminate structure, and a simple and convenient local-coordinate-system method is used to calculate the development of residual stresses, curing shrinkage and curing deformation for the composite laminate. The results show that the temperature, degree of cure (DOC) and residual stresses during the curing process are consistent with studies in the literature, so the curing shrinkage and curing deformation obtained on this basis have referential value. Comparison of the two numerical results indicates that the residual stress and deformation calculated by the viscoelastic model are closer to the reference values than those of the linear elastic model.
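
    A common building block of such curing simulations is a cure-kinetics ODE driven by the temperature cycle, whose degree-of-cure output feeds a chemical shrinkage strain into the mechanical model. The sketch below uses a generic autocatalytic (Kamal-type) rate law with placeholder constants, not the paper's material data:

```python
# Generic autocatalytic cure kinetics under a ramp-and-hold temperature cycle;
# degree of cure drives a proportional chemical shrinkage strain.
import numpy as np
from scipy.integrate import solve_ivp

R = 8.314
A, Ea, m, n = 1.5e5, 6.0e4, 0.5, 1.5   # assumed Kamal-type parameters
eps_sh_total = -0.02                    # assumed total shrinkage strain

def T(t):
    """Cure cycle: ramp from 20 C to 180 C over one hour, then hold."""
    return 293.15 + min(t, 3600.0) / 3600.0 * 160.0

def dalpha(t, y):
    a = min(y[0], 1.0)                  # clamp so (1 - a)^n stays well defined
    k = A * np.exp(-Ea / (R * T(t)))
    return [k * (a ** m + 1e-3) * (1.0 - a) ** n]   # small seed starts reaction

sol = solve_ivp(dalpha, [0.0, 7200.0], [0.0], max_step=30.0)
alpha = sol.y[0]
shrinkage = eps_sh_total * alpha        # strain input to the mechanical model
print(f"final DOC = {alpha[-1]:.3f}, final shrinkage strain = {shrinkage[-1]:.4f}")
```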

  17. Innovation Process Planning Model in the Bpmn Standard

    Directory of Open Access Journals (Sweden)

    Jurczyk-Bunkowska Magdalena

    2013-12-01

    Full Text Available The aim of the article is to show the relations in an innovation process planning model. The relations argued for here provide a stable and reliable way to achieve increased competitiveness through professionally directed development of the company. The manager needs to specify the intended effect when initiating the process, and this is achieved through a system of intermediate goals. The original model proposed here defines the dependencies between the plans of the fragments of the innovation process that together achieve its final goal. The relations in the present article are expressed using the standard Business Process Model and Notation. This enables the specification of the interrelations between the decision levels at which subsequent fragments of the innovation process are planned, allowing better coordination of the process and reducing the time needed to achieve its effect. The model has been compiled on the basis of practices followed in Polish companies. It is not, however, a reflection of these practices, but rather an idealised standard of proceedings which aims at improving the effectiveness of the management of innovation at the operational level. The model could form the basis for systems supporting decision making, knowledge management or communication in innovation processes.

  18. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: ''Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model''. One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  19. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    International Nuclear Information System (INIS)

    E.L. Hardin

    2000-01-01

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: ''Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model''. One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II)

  20. IT vendor selection model by using structural equation model & analytical hierarchy process

    Science.gov (United States)

    Maitra, Sarit; Dominic, P. D. D.

    2012-11-01

    Selecting and evaluating the right vendors is imperative for an organization's global marketplace competitiveness. Improper selection and evaluation of potential vendors can degrade an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research intends to develop a new hybrid model for the vendor selection process with better decision making. The proposed model provides a suitable tool for assisting decision makers and managers to make the right decisions and select the most suitable vendor. This paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five-step framework of the model has been designed after a thorough literature study. The proposed hybrid model will be applied using a real-life case study to assess its effectiveness. In addition, a what-if analysis technique will be used for model validation purposes.
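
    The AHP half of such a hybrid can be sketched compactly: criterion weights are taken from the principal eigenvector of a pairwise-comparison matrix and checked for consistency. The criteria and judgments below are invented for illustration; in the proposed model the criteria structure would be informed by the SEM step:

```python
# AHP: priority weights from the principal eigenvector of a Saaty-scale
# pairwise-comparison matrix, with a consistency check. Judgments are invented.
import numpy as np

criteria = ["quality", "cost", "delivery", "support"]
P = np.array([[1.0, 3.0, 5.0, 7.0],
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])      # reciprocal judgment matrix

vals, vecs = np.linalg.eig(P)
i = np.argmax(vals.real)
w = np.abs(vecs[:, i].real)
w /= w.sum()                              # normalized priority weights

lam = vals.real[i]
CI = (lam - len(P)) / (len(P) - 1)        # consistency index
CR = CI / 0.90                            # random index RI = 0.90 for n = 4
for c, wi in zip(criteria, w):
    print(f"{c:9s} {wi:.3f}")
print(f"consistency ratio = {CR:.3f} (acceptable if < 0.1)")
```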

  1. Process model repositories and PNML

    NARCIS (Netherlands)

    Hee, van K.M.; Post, R.D.J.; Somers, L.J.A.M.; Werf, van der J.M.E.M.; Kindler, E.

    2004-01-01

    Bringing system and process models together in repositories facilitates the interchange of model information between modelling tools, and allows the combination and interlinking of complementary models. Petriweb is a web application for managing such repositories. It supports hierarchical process

  2. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five-stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  3. MVP: A Volunteer Development & Recognition Model.

    Science.gov (United States)

    Gerhard, Gary W.

    This model was developed to provide a systematic, staged approach to volunteer personnel management. It provides a general process for dealing with volunteers from the point of organization entry through volunteer career stages to the time of exiting the organization. The model provides the structural components necessary to (1) plan, coordinate,…

  4. Efficient Simulation Modeling of an Integrated High-Level-Waste Processing Complex

    International Nuclear Information System (INIS)

    Gregory, Michael V.; Paul, Pran K.

    2000-01-01

    An integrated computational tool named the Production Planning Model (ProdMod) has been developed to simulate the operation of the entire high-level-waste complex (HLW) at the Savannah River Site (SRS) over its full life cycle. ProdMod is used to guide SRS management in operating the waste complex in an economically efficient and environmentally sound manner. SRS HLW operations are modeled using coupled algebraic equations. The dynamic nature of plant processes is modeled in the form of a linear construct in which the time dependence is implicit. Batch processes are modeled in discrete event-space, while continuous processes are modeled in time-space. The ProdMod methodology maps between event-space and time-space such that the inherent mathematical discontinuities in batch process simulation are avoided without sacrificing any of the necessary detail in the batch recipe steps. Modeling the processes separately in event- and time-space using linear constructs, and then coupling the two spaces, has accelerated the speed of simulation compared to a typical dynamic simulation. The ProdMod simulator models have been validated against operating data and other computer codes. Case studies have demonstrated the usefulness of the ProdMod simulator in developing strategies that demonstrate significant cost savings in operating the SRS HLW complex and in verifying the feasibility of newly proposed processes
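
    The event-space/time-space coupling can be illustrated with a toy loop in which batch recipe steps advance an event index, each event is mapped to a point on the time axis, and a continuous (time-space) model is integrated between events; the step names, durations, and rates are invented:

```python
# Toy illustration of mapping batch events to the time axis while a continuous
# process is integrated between event times. All values are invented.
batch_recipe = [("receive", 2.0), ("wash", 5.0), ("evaporate", 12.0), ("vitrify", 8.0)]

feed_rate = 3.0                         # continuous process: kg/h into the system

t = 0.0
inventory = 0.0
for event_idx, (step, duration) in enumerate(batch_recipe):
    t += duration                       # event k -> time t(k): the space mapping
    inventory += feed_rate * duration   # continuous model integrated between events
    print(f"event {event_idx} ({step:9s}) ends at t = {t:5.1f} h, "
          f"inventory = {inventory:5.1f} kg")
```

    Because each batch event is simply assigned a timestamp, the discrete recipe steps never introduce discontinuities into the continuous balance equations - the coupling idea the abstract describes.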

  5. Theoretical modelling of carbon deposition processes

    International Nuclear Information System (INIS)

    Marsh, G.R.; Norfolk, D.J.; Skinner, R.F.

    1985-01-01

    Work based on capsule experiments in the BNL Gamma Facility, aimed at elucidating the chemistry involved in the formation of carbonaceous deposit on CAGR fuel pin surfaces is described. Using a data-base derived from capsule experiments together with literature values for the kinetics of the fundamental reactions, a chemical model of the gas-phase processes has been developed. This model successfully reproduces the capsule results, whilst preliminary application to the WAGR coolant circuit indicates the likely concentration profiles of various radical species within the fuel channels. (author)

  6. Process modeling for the Integrated Nonthermal Treatment System (INTS) study

    Energy Technology Data Exchange (ETDEWEB)

    Brown, B.W.

    1997-04-01

    This report describes the process modeling done in support of the Integrated Nonthermal Treatment System (INTS) study. This study was performed to supplement the Integrated Thermal Treatment System (ITTS) study and comprises five conceptual treatment systems that treat DOE contact-handled mixed low-level wastes (MLLW) at temperatures of less than 350°F. ASPEN PLUS, a chemical process simulator, was used to model the systems. Nonthermal treatment systems were developed as part of the INTS study and include sufficient processing steps to treat the entire inventory of MLLW. The final result of the modeling is a process flowsheet with a detailed mass and energy balance. In contrast to the ITTS study, which modeled only the main treatment system, the INTS study modeled each of the various processing steps with ASPEN PLUS, release 9.1-1. Trace constituents, such as radionuclides and minor pollutant species, were not included in the calculations.

  7. Fuzzy model for Laser Assisted Bending Process

    Directory of Open Access Journals (Sweden)

    Giannini Oliviero

    2016-01-01

    Full Text Available In the present study, a fuzzy model was developed to predict the residual bending in a conventional metal bending process assisted by a high-power diode laser. The study focused on AA6082T6 aluminium thin sheets. In most dynamic sheet metal forming operations, the highly nonlinear deformation processes cause large amounts of elastic strain energy to be stored in the formed material. The novel hybrid forming process was thus aimed at inducing local heating of the mechanically bent workpiece in order to decrease or eliminate the related springback phenomena. In particular, the influence of laser process parameters such as source power, scan speed and the starting elastic deformation of the mechanically bent sheets on the extent of springback was experimentally assessed. Consistent trends in the experimental response as a function of the operational parameters were found. Accordingly, 3D process maps of the extent of the springback phenomena over the operational parameters were constructed. The effect on the predicted residual bending of the inherent uncertainties caused by the approximation of the model parameters was evaluated. In particular, a fuzzy-logic-based approach was used to describe the model uncertainties, and the transformation method was applied to propagate their effect on the residual bending.
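
    A minimal version of such a fuzzy mapping from laser parameters to residual springback, using triangular membership functions and weighted-average defuzzification, might look as follows; the membership ranges and rule outputs are illustrative placeholders, not the study's data:

```python
# Tiny fuzzy-rule system: laser power and scan speed -> residual springback
# fraction. Memberships and rule outputs are invented for illustration.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def springback(power_w, speed_mm_s):
    low_p, high_p = tri(power_w, 0, 200, 400), tri(power_w, 200, 400, 600)
    slow, fast = tri(speed_mm_s, 0, 5, 10), tri(speed_mm_s, 5, 10, 15)
    # Rules: more heat input (high power, slow scan) -> less residual springback
    rules = [(min(high_p, slow), 0.2),   # strong stress relaxation
             (min(high_p, fast), 0.6),
             (min(low_p, slow), 0.7),
             (min(low_p, fast), 0.9)]    # springback fraction of elastic bend
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else None    # weighted-average defuzzification

print(f"predicted springback fraction: {springback(350.0, 6.0):.2f}")
```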

  8. Extension of internationalisation models drivers and processes for the globalisation of product development

    DEFF Research Database (Denmark)

    Søndergaard, Erik Stefan; Oehmen, Josef; Ahmed-Kristensen, Saeema

    2016-01-01

    of product development and collaborative distributed development beyond sourcing, sales and production elements. The paper then provides propositions for how to further develop the suggested model, and how western companies can learn from the Chinese approaches, and globalise their product development...

  9. Development of a standard equipment management model for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Hee Seung; Ju, Tae Young; Kim, Jung Wun [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2012-10-15

    Most utilities that have achieved high performance have introduced a management model to improve performance and operate plants safely. The Nuclear Energy Institute has developed and updated its Standard Nuclear Performance Model (SNPM) in order to provide a summary of nuclear processes, cost definitions, and key business performance measures for business performance comparison and benchmarking. Over the past decade, Korea Hydro and Nuclear Power Co. (KHNP) has introduced and implemented many engineering processes such as Equipment Reliability (ER), Maintenance Rule (MR), Single Point Vulnerability (SPV), Corrective Action Program (CAP), and Self Assessment (SA) to improve plant performance and to sustain high performance. Some processes, however, are not well interfaced with other processes, because they were developed separately and were focused on the process itself. KHNP is developing a Standard Equipment Management Model (SEMM) to integrate these engineering processes and to improve the interrelation among the processes. In this paper, a draft model and attributes of the SEMM are discussed.

  10. Development of a standard equipment management model for nuclear power plants

    International Nuclear Information System (INIS)

    Chang, Hee Seung; Ju, Tae Young; Kim, Jung Wun

    2012-01-01

    Most utilities that have achieved high performance have introduced a management model to improve performance and operate plants safely. The Nuclear Energy Institute has developed and updated its Standard Nuclear Performance Model (SNPM) in order to provide a summary of nuclear processes, cost definitions, and key business performance measures for business performance comparison and benchmarking. Over the past decade, Korea Hydro and Nuclear Power Co. (KHNP) has introduced and implemented many engineering processes such as Equipment Reliability (ER), Maintenance Rule (MR), Single Point Vulnerability (SPV), Corrective Action Program (CAP), and Self Assessment (SA) to improve plant performance and to sustain high performance. Some processes, however, are not well interfaced with other processes, because they were developed separately and were focused on the process itself. KHNP is developing a Standard Equipment Management Model (SEMM) to integrate these engineering processes and to improve the interrelation among the processes. In this paper, a draft model and attributes of the SEMM are discussed

  11. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    Science.gov (United States)

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding, support systematic appraisal, and identify areas for improvement in a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase - parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  12. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    Science.gov (United States)

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  13. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on proper understanding of the functionality of information systems that support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, we assess whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of the research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  14. Parallel Development of Products and New Business Models

    DEFF Research Database (Denmark)

    Lund, Morten; Hansen, Poul H. Kyvsgård

    2014-01-01

    The perception of product development and the practical execution of product development in professional organizations have undergone dramatic changes in recent years. Many of these changes relate to the introduction of broader and more cross-disciplinary views that involve new organizational functions and new concepts. These changes can be captured in various generations of practice. This paper will discuss the recent development of 3rd generation product development process models and the emergence of a 4th generation. While the 3rd generation models included the concept of innovation and innovation management, the 4th generation models are increasingly including the concepts of business models and business model innovation.

  15. A model-based systems approach to pharmaceutical product-process design and analysis

    DEFF Research Database (Denmark)

    Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    This is a perspective paper highlighting the need for systematic model-based design and analysis in pharmaceutical product-process development. A model-based framework is presented and the role, development and use of models of various types are discussed together with the structure of the models...

  16. Automatic extraction of process categories from process model collections

    NARCIS (Netherlands)

    Malinova, M.; Dijkman, R.M.; Mendling, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Many organizations build up their business process management activities in an incremental way. As a result, there is no overarching structure defined at the beginning. However, as business process modeling initiatives often yield hundreds to thousands of process models, there is a growing need for

  17. Mathematical modeling of heat treatment processes conserving biological activity of plant bioresources

    Science.gov (United States)

    Rodionova, N. S.; Popov, E. S.; Pozhidaeva, E. A.; Pynzar, S. S.; Ryaskina, L. O.

    2018-05-01

    The aim of this study is to develop a mathematical model of the heat exchange process during LT-processing to estimate the dynamics of temperature field changes and to optimize the regime parameters, taking into account the non-stationarity of the process and the physicochemical and thermophysical properties of food systems. The application of LT-processing, based on the use of low-temperature modes in thermal culinary processing of raw materials with preliminary vacuum packaging in a polymer heat-resistant film, is a promising trend in the development of techniques and technology in the catering field. LT-processing of food raw materials guarantees the preservation of biologically active substances in food environments, which are characterized by a certain thermolability, and also extends the shelf life and maintains the high consumer characteristics of food systems, which are capillary-porous bodies. For the mathematical modelling of the LT-processing process, the symbolic mathematics package Maple was used, as well as the mathematical package FlexPDE, which uses the finite element method for modelling objects with distributed parameters. The processing of experimental results was carried out with the help of software developed in the programming language Python 3.4. To calculate and optimize the parameters of the LT-processing of polycomponent food systems, the differential equation of non-stationary thermal conductivity was used, the solution of which makes it possible to identify the temperature change at any point of the solid at different moments. The present study specifies data on the thermophysical characteristics of a polycomponent food system based on plant raw materials, with the help of which the physico-mathematical model of the LT-processing process has been developed. The obtained mathematical model allows the dynamics of the temperature field in different sections of the LT-processed polycomponent food systems to be determined on the basis of calculating the
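
    The abstract above reduces the problem to the non-stationary heat conduction equation. As a minimal sketch of that idea - not the authors' Maple/FlexPDE implementation - the following Python script solves the 1D heat equation through a vacuum-packed slab with an explicit finite-difference scheme; all property values are assumed for illustration.

        # Explicit finite-difference solution of dT/dt = alpha * d2T/dx2 for a
        # slab heated from one face, as a stand-in for tracking the temperature
        # field during LT-processing. All values below are illustrative.
        import numpy as np

        alpha = 1.4e-7               # thermal diffusivity of a moist food, m^2/s (assumed)
        L = 0.02                     # slab half-thickness, m (assumed)
        nx = 51
        dx = L / (nx - 1)
        dt = 0.4 * dx**2 / alpha     # within the explicit stability limit dt <= dx^2/(2*alpha)

        T = np.full(nx, 10.0)        # initial product temperature, deg C
        T_bath = 65.0                # LT-processing water bath temperature, deg C

        t, t_end = 0.0, 3600.0       # simulate one hour
        while t < t_end:
            T[0] = T_bath            # surface held at bath temperature
            T[-1] = T[-2]            # symmetry (insulated) condition at the slab centre
            T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
            t += dt

        print(f"centre temperature after 1 h: {T[-1]:.1f} deg C")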

  18. Mathematical Optimal Sequence Model Development to Process Planes and Other Interconnected Surfaces of Complex Body Parts

    Directory of Open Access Journals (Sweden)

    I. I. Kravchenko

    2016-01-01

    Full Text Available Experience in the application of multi-operational CNC machines (MOM CNC) shows that they are efficient only in case of significantly increased productivity and a dramatically reduced time-to-market cycle for new products. The full technological capabilities of MOM are revealed when processing complex body parts: the more complex a part design is and the more machined surfaces it has, the more tools are necessary for its processing and positioning, and the greater the efficiency of their application. At the same time, the case history of using these machines in industry shows that MOM CNC are often, in effect, used to run technological processes designed for universal equipment, which is absolutely unacceptable. One way to improve the processing performance on MOM CNC is to reduce non-productive machine time by reducing the mutual idle movements of the working machine. This problem is solved using dynamic programming methods, one of which is the solution of the traveling salesman problem (Bellman's method). With a known plan for treatment of all elementary surfaces of the body part, i.e. a known number of performed transitions, each transition is represented as a vertex of a graph, while technological links between the vertices are its edges. A mathematical model is developed on the Bellman principle, adapted to technological tasks, to minimize the mutual idle movements of the working machine and perform all transitions in the optimal sequence. The initial data to fill the matrix of time expenditures are the time consumed by the machine after executing the i-th transition and needed to begin the j-th transition. The programmer fills in the matrix cells according to the known routing of the body part, taking into account the time for part and table positioning, tool exchange, spindle and table approach to the working zone, the time of table rotation, etc. The mathematical model was tested when machining a body part with 36 transitions on the
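
    The dynamic program the abstract describes can be illustrated with the Bellman (Held-Karp) recursion below, applied to a small, purely hypothetical matrix of idle-movement times between transitions.

        # Bellman (Held-Karp) dynamic program for sequencing machining
        # transitions with minimal idle-movement time. The 4x4 matrix of
        # idle times between transitions is purely illustrative.
        from itertools import combinations

        cost = [
            [0, 12, 30, 21],     # cost[i][j]: idle time from transition i to j, s
            [12, 0, 18, 25],
            [30, 18, 0, 9],
            [21, 25, 9, 0],
        ]
        n = len(cost)

        # best[(S, j)]: minimal time starting from transition 0, visiting the
        # set S of transitions, and ending at j (j in S)
        best = {(frozenset([j]), j): cost[0][j] for j in range(1, n)}
        for size in range(2, n):
            for subset in combinations(range(1, n), size):
                S = frozenset(subset)
                for j in S:
                    best[(S, j)] = min(best[(S - {j}, k)] + cost[k][j]
                                       for k in S - {j})

        full = frozenset(range(1, n))
        # closing the tour back to the start; drop cost[j][0] for an open path
        total = min(best[(full, j)] + cost[j][0] for j in full)
        print("minimal idle-movement time:", total)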

  19. Conceptual models of information processing

    Science.gov (United States)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  20. Cognitive components underpinning the development of model-based learning.

    Science.gov (United States)

    Potter, Tracey C S; Bryce, Nessa V; Hartley, Catherine A

    2017-06-01

    Reinforcement learning theory distinguishes "model-free" learning, which fosters reflexive repetition of previously rewarded actions, from "model-based" learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9-25, we examined whether the abilities to infer sequential regularities in the environment ("statistical learning"), maintain information in an active state ("working memory") and integrate distant concepts to solve problems ("fluid reasoning") predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Software Development Process Changes in the Telecommunications Industry

    Directory of Open Access Journals (Sweden)

    John Kevin Doyle

    2006-06-01

    Full Text Available The tremendous changes in the telecommunications business in the last several years drove changes in the software development processes of telecommunications equipment providers. We compare changes in these very large projects, in two companies, with those proposed in the Theory of Constraints / Critical Chains, Extreme Programming, and Agile development models. The 2000s have been a time of significant challenge in the telecommunications equipment business. Telecommunications service providers have excess equipment capacity. Many are waiting for next generation telephone switches that will simultaneously lower operating costs and enable additional revenue generation. The large service providers have drastically reduced their capital and expense purchases. Many small service providers, particularly the dot-coms, went bankrupt; much of their equipment is on the secondary market, at a fraction of the original cost. Thus the equipment market has significantly shrunk, and the equipment providers have been reducing expenses, while continuing to deliver software and hardware equipment at the high quality level required by the service providers. This drove many changes in the software development process. While the process changes are reported in two telecommunication equipment development organizations, the changes are applicable in any product development organization.

  2. The processes of strategy development

    OpenAIRE

    Bailey, Andy; Johnson, Gerry

    1995-01-01

    This paper is concerned with the processes by which strategy is developed within organisations. It builds on research into the nature of strategy development being undertaken within the Centre for Strategic Management and Organisational Change at Cranfield School of Management. Initially the process of strategy development is discussed, a number of explanations of the process are presented and an integrated framework is developed. This framework is subsequently used to illustra...

  3. A continuum based fem model for friction stir welding-model development

    Energy Technology Data Exchange (ETDEWEB)

    Buffa, G. [Ohio State University, Department of Industrial, Welding and Systems Engineering, 1971 Neil Avenue, 210 Baker Systems, Columbus, OH 43210 (United States) and Dipartimento di Tecnologia Meccanica, Produzione e Ingegneria Gestionale, Universita di Palermo, Viale delle Scienze, 90128 Palermo (Italy)]. E-mail: g.buffa@dtpm.unipa.it; Hua, J. [Ohio State University, Department of Industrial, Welding and Systems Engineering, 1971 Neil Avenue, 210 Baker Systems, Columbus, OH 43210 (United States)]. E-mail: hua.14@osu.edu; Shivpuri, R. [Ohio State University, Department of Industrial, Welding and Systems Engineering, 1971 Neil Avenue, 210 Baker Systems, Columbus, OH 43210 (United States)]. E-mail: shivpuri.1@osu.edu; Fratini, L. [Dipartimento di Tecnologia Meccanica, Produzione e Ingegneria Gestionale, Universita di Palermo, Viale delle Scienze, 90128 Palermo (Italy)]. E-mail: abaqus@dtpm.unipa.it

    2006-03-15

    Although friction stir welding (FSW) has been successfully used to join materials that are difficult to weld or unweldable by fusion welding methods, it is still in its early development stage and, therefore, a scientific knowledge based predictive model is of significant help for a thorough understanding of the FSW process. In this paper, a continuum based FEM model for the friction stir welding process is proposed that is 3D Lagrangian, implicit, coupled, and rigid-viscoplastic. This model is calibrated by comparison with experimental results for force and temperature distribution, then is used to investigate the distribution of temperature and strain in the heat affected zone and the weld nugget. The model correctly predicts the non-symmetric nature of the FSW process, and the relationships between the tool forces and the variation in the process parameters. It is found that the effective strain distribution is non-symmetric about the weld line while the temperature profile is almost symmetric in the weld zone.

  4. A continuum based fem model for friction stir welding-model development

    International Nuclear Information System (INIS)

    Buffa, G.; Hua, J.; Shivpuri, R.; Fratini, L.

    2006-01-01

    Although friction stir welding (FSW) has been successfully used to join materials that are difficult to weld or unweldable by fusion welding methods, it is still in its early development stage and, therefore, a scientific knowledge based predictive model is of significant help for a thorough understanding of the FSW process. In this paper, a continuum based FEM model for the friction stir welding process is proposed that is 3D Lagrangian, implicit, coupled, and rigid-viscoplastic. This model is calibrated by comparison with experimental results for force and temperature distribution, then is used to investigate the distribution of temperature and strain in the heat affected zone and the weld nugget. The model correctly predicts the non-symmetric nature of the FSW process, and the relationships between the tool forces and the variation in the process parameters. It is found that the effective strain distribution is non-symmetric about the weld line while the temperature profile is almost symmetric in the weld zone.

  5. A finite difference model of the iron ore sinter process

    OpenAIRE

    Muller, J.; de Vries, T.L.; Dippenaar, B.A.; Vreugdenburg, J.C.

    2015-01-01

    Iron ore fines are agglomerated to produce sinter, which is an important feed material for blast furnaces worldwide. A model of the iron ore sintering process has been developed with the objective of being representative of the sinter pot test, the standard laboratory process in which the behaviour of specific sinter feed mixtures is evaluated. The model aims to predict sinter quality, including chemical quality and physical strength, as well as key sinter process performance parameters such ...

  6. Modeling and analysis of chill and fill processes for the cryogenic storage and transfer engineering development unit tank

    Science.gov (United States)

    Hedayat, A.; Cartagena, W.; Majumdar, A. K.; LeClair, A. C.

    2016-03-01

    NASA's future missions may require long-term storage and transfer of cryogenic propellants. The Engineering Development Unit (EDU), a NASA in-house effort supported by both Marshall Space Flight Center (MSFC) and Glenn Research Center, is a cryogenic fluid management (CFM) test article that primarily serves as a manufacturing pathfinder and a risk reduction task for a future CFM payload. The EDU test article comprises a flight-like tank, internal components, insulation, and attachment struts. The EDU is designed to perform integrated passive thermal control performance testing with liquid hydrogen (LH2) in a test-like vacuum environment. A series of tests, with LH2 as a testing fluid, was conducted at Test Stand 300 at MSFC during the summer of 2014. The objective of this effort was to develop a thermal/fluid model for evaluating the thermodynamic behavior of the EDU tank during the chill and fill processes. The Generalized Fluid System Simulation Program, an MSFC in-house general-purpose computer program for flow network analysis, was utilized to model and simulate the chill and fill portion of the testing. The model contained the LH2 supply source, feed system, EDU tank, and vent system. The test setup, modeling description, and comparison of model predictions with the test data are presented.

  7. Ongoing Model Development Analyzing Glass Fracture

    DEFF Research Database (Denmark)

    Molnar, G.; Bojtar, I.; Nielsen, Jens Henrik

    2013-01-01

    The present subject deals with an ongoing experimental and numerical analysis of in-plane loaded glass plates. The main goal of the investigation is to develop a hybrid – discrete and finite element – model which could follow the fracture process in annealed and in tempered glass. Measurements of the residual stress state before failure and high-speed camera recordings of the failure are being performed in order to verify the numerical model. The primary goal of this research is to follow the overall fracture of a structural element – e.g. a beam – loaded in-plane. The present paper gives an overview of the structure of the research and a summary of the current status achieved so far.

  8. Modeling critical zone processes in intensively managed environments

    Science.gov (United States)

    Kumar, Praveen; Le, Phong; Woo, Dong; Yan, Qina

    2017-04-01

    Processes in the Critical Zone (CZ), which sustain terrestrial life, are tightly coupled across hydrological, physical, biochemical, and many other domains over both short and long timescales. In addition, vegetation acclimation resulting from elevated atmospheric CO2 concentration, along with response to increased temperature and altered rainfall pattern, is expected to result in emergent behaviors in ecologic and hydrologic functions, subsequently controlling CZ processes. We hypothesize that the interplay between micro-topographic variability and these emergent behaviors will shape complex responses of a range of ecosystem dynamics within the CZ. Here, we develop a modeling framework ('Dhara') that explicitly incorporates micro-topographic variability based on lidar topographic data with coupling of multi-layer modeling of the soil-vegetation continuum and 3-D surface-subsurface transport processes to study ecological and biogeochemical dynamics. We further couple a C-N model with a physically based hydro-geomorphologic model to quantify (i) how topographic variability controls the spatial distribution of soil moisture, temperature, and biogeochemical processes, and (ii) how farming activities modify the interaction between soil erosion and soil organic carbon (SOC) dynamics. To address the intensive computational demand from high-resolution modeling at lidar data scale, we use a hybrid CPU-GPU parallel computing architecture run over large supercomputing systems for simulations. Our findings indicate that rising CO2 concentration and air temperature have opposing effects on soil moisture, surface water and ponding in topographic depressions. Further, the relatively higher soil moisture and lower soil temperature contribute to decreased soil microbial activities in the low-lying areas due to anaerobic conditions and reduced temperatures. The decreased microbial relevant processes cause the reduction of nitrification rates, resulting in relatively lower nitrate

  9. A review on pilot plant development models

    International Nuclear Information System (INIS)

    Rosli Darmawan

    2005-01-01

    After more than 30 years, MINT has been able to produce many new findings, products and processes. Some of these have been able to penetrate local and international markets. This was achieved through a systematic commercialisation program practiced in MINT with its technological chain and MINT Technology Park program. This paper reviews the development process of MINT pilot plants and compares it with a few other models from other institutions in Malaysia and abroad. The advantages and disadvantages of each model are reviewed and a comparison with MINT's model is presented. (Author)

  10. Object-oriented process dose modeling for glovebox operations

    International Nuclear Information System (INIS)

    Boerigter, S.T.; Fasel, J.H.; Kornreich, D.E.

    1999-01-01

    The Plutonium Facility at Los Alamos National Laboratory supports several defense and nondefense-related missions for the country by performing fabrication, surveillance, and research and development for materials and components that contain plutonium. Most operations occur in rooms with one or more arrays of gloveboxes connected to each other via trolley gloveboxes. Minimizing the effective dose equivalent (EDE) is a growing concern as a result of steadily declining allowable dose limits and a growing general awareness of safety in the workplace. In general, the authors distinguish three components of a worker's total EDE: the primary EDE, the secondary EDE, and the background EDE. A particular background source of interest is the nuclear materials vault. The distinction between sources inside and outside of a particular room is arbitrary, with the underlying assumption that building walls and floors provide significant shielding, justifying the inclusion of sources in other rooms in the background category. Los Alamos has developed the Process Modeling System (ProMoS) primarily for performing process analyses of nuclear operations. ProMoS is an object-oriented, discrete-event simulation package that has been used to analyze operations at Los Alamos and proposed facilities such as the new fabrication facilities for the Complex-21 effort. In the past, crude estimates of the process dose (the EDE received when a particular process occurred), the room dose (the EDE received when a particular process occurred in a given room), and the facility dose (the EDE received when a particular process occurred in the facility) were used to obtain an integrated EDE for a given process. Modifications to the ProMoS package were made to utilize secondary dose information so that dose modeling enhances the process modeling efforts.

  11. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    Science.gov (United States)

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

    The application of the developed property models for the estimation of environment-related properties and uncertainties of the estimated property values is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes and allow one to evaluate the effect of uncertainties of estimated property values on the calculated performance of processes, giving useful insights into the quality and reliability of the design of sustainable processes.

  12. Retort process modelling for Indian traditional foods.

    Science.gov (United States)

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack food is typically a heterogeneous recipe that incorporates varieties of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate the cold point temperature. Initial process conditions, retort temperature and % solid content were the independent variables with significant effects. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot based sweet product) and Upama (wheat based snack product). The predicted and experimental values of the temperature profile matched within ±10 % error, which is a good match considering the food was a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize the retorting of various Indian traditional vegetarian foods.
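
    As an illustration of the lumped-parameter idea - not the paper's fitted model - the sketch below uses a single first-order lag for the cold point, with an assumed time constant standing in for the fitted effects of pouch geometry and % solids.

        # First-order lumped cold-point model: dT/dt = (T_retort - T) / tau,
        # where tau lumps pouch geometry and the effect of the solids fraction.
        # All values are illustrative, not the paper's fitted parameters.
        import math

        def cold_point(T0, T_retort, tau, t):
            """Analytical solution of the first-order lumped heating model."""
            return T_retort - (T_retort - T0) * math.exp(-t / tau)

        tau = 900.0        # time constant, s (would be fitted per product)
        T0 = 30.0          # initial product temperature, deg C
        T_retort = 121.0   # retort temperature, deg C

        for t in (0, 600, 1200, 1800, 2400):
            T = cold_point(T0, T_retort, tau, t)
            print(f"t = {t:4d} s  ->  cold point = {T:6.1f} deg C")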

  13. Development of KIAPS Observation Processing Package for Data Assimilation System

    Science.gov (United States)

    Kang, Jeon-Ho; Chun, Hyoung-Wook; Lee, Sihye; Han, Hyun-Jun; Ha, Su-Jin

    2015-04-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) was founded in 2011 by the Korea Meteorological Administration (KMA) to develop Korea's own global Numerical Weather Prediction (NWP) system as a nine-year (2011-2019) project. The data assimilation team at KIAPS has been developing the observation processing system (KIAPS Package for Observation Processing: KPOP) to provide optimal observations to the data assimilation system for the KIAPS Global Model (KIAPS Integrated Model - Spectral Element method based on HOMME: KIM-SH). Currently, KPOP is capable of processing satellite radiance data (AMSU-A, IASI), GPS Radio Occultation (GPS-RO), AIRCRAFT (AMDAR, AIREP, etc.), and synoptic observations (SONDE and SURFACE). KPOP adopted Radiative Transfer for TOVS version 10 (RTTOV_v10) to get the brightness temperature (TB) for each channel at the top of the atmosphere (TOA), and the Radio Occultation Processing Package (ROPP) 1-dimensional forward module to get the bending angle (BA) at each tangent point. The observation data are obtained from the KMA in BUFR format and converted to ODB, which is used for operational data assimilation and monitoring at the KMA. The Unified Model (UM), Community Atmosphere - Spectral Element (CAM-SE), and KIM-SH model outputs are used for the bias correction (BC) and quality control (QC) of the observations. KPOP provides radiance and RO data for the Local Ensemble Transform Kalman Filter (LETKF) and also provides SONDE, SURFACE, and AIRCRAFT data for Three-Dimensional Variational Assimilation (3DVAR). We expect that all of the observation types processed in KPOP will soon be usable with both data assimilation methods. Preliminary results for each observation type will be introduced together with the current development status of KPOP.

  14. Modeling post-wildfire hydrological processes with ParFlow

    Science.gov (United States)

    Escobar, I. S.; Lopez, S. R.; Kinoshita, A. M.

    2017-12-01

    Wildfires alter the natural processes within a watershed, such as surface runoff, evapotranspiration rates, and subsurface water storage. Post-fire hydrologic models are typically one-dimensional, empirically-based models or two-dimensional, conceptually-based models with lumped parameter distributions. These models are useful for modeling and predictions at the watershed outlet; however, do not provide detailed, distributed hydrologic processes at the point scale within the watershed. This research uses ParFlow, a three-dimensional, distributed hydrologic model to simulate post-fire hydrologic processes by representing the spatial and temporal variability of soil burn severity (via hydrophobicity) and vegetation recovery. Using this approach, we are able to evaluate the change in post-fire water components (surface flow, lateral flow, baseflow, and evapotranspiration). This work builds upon previous field and remote sensing analysis conducted for the 2003 Old Fire Burn in Devil Canyon, located in southern California (USA). This model is initially developed for a hillslope defined by a 500 m by 1000 m lateral extent. The subsurface reaches 12.4 m and is assigned a variable cell thickness to explicitly consider soil burn severity throughout the stages of recovery and vegetation regrowth. We consider four slope and eight hydrophobic layer configurations. Evapotranspiration is used as a proxy for vegetation regrowth and is represented by the satellite-based Simplified Surface Energy Balance (SSEBOP) product. The pre- and post-fire surface runoff, subsurface storage, and surface storage interactions are evaluated at the point scale. Results will be used as a basis for developing and fine-tuning a watershed-scale model. Long-term simulations will advance our understanding of post-fire hydrological partitioning between water balance components and the spatial variability of watershed processes, providing improved guidance for post-fire watershed management. In reference

  15. Dynamic process model of a plutonium oxalate precipitator

    International Nuclear Information System (INIS)

    Borgonovi, G.M.; Hammelman, J.E.; Miller, C.L.

    1980-01-01

    A dynamic model of a plutonium oxalate precipitator is developed to provide a means of predicting plutonium inventory on a continuous basis. The model is based on state-of-the-art crystallization equations, which describe nucleation and growth phenomena. The model parameters were obtained through the use of batch experimental data. The model has been used to study the approach to steady state, to investigate the response to input transients, and to simulate the control of the precipitation process. 12 refs
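
    A minimal sketch of how nucleation and growth kinetics translate into a dynamic inventory estimate is given below, using the standard method of moments for a continuous, well-mixed precipitator; the kinetic constants are illustrative, not those of the paper.

        # Method-of-moments model of a continuous, well-mixed precipitator with
        # nucleation rate B0 and size-independent growth rate G (held constant
        # here for clarity; in practice both depend on supersaturation).
        #   dm0/dt = B0          - m0/tau
        #   dmj/dt = j*G*m_{j-1} - mj/tau     (j = 1, 2, 3)
        # m3 tracks suspended solids volume, i.e. the in-process inventory.
        import numpy as np
        from scipy.integrate import solve_ivp

        B0, G, tau = 1e6, 1e-8, 600.0          # #/(m^3 s), m/s, s (all assumed)

        def moments(t, m):
            dm = np.empty(4)
            dm[0] = B0 - m[0] / tau
            for j in range(1, 4):
                dm[j] = j * G * m[j - 1] - m[j] / tau
            return dm

        sol = solve_ivp(moments, (0.0, 7200.0), np.zeros(4))
        print("m3 after 2 h       :", sol.y[3, -1])
        print("m3 at steady state :", 6 * B0 * G**3 * tau**4)   # analytical check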

  16. Prediction of strong earthquake motions on rock surface using evolutionary process models

    International Nuclear Information System (INIS)

    Kameda, H.; Sugito, M.

    1984-01-01

    Stochastic process models are developed for the prediction of strong earthquake motions for engineering design purposes. Earthquake motions with nonstationary frequency content are modeled by using the concept of evolutionary processes. Discussion is focused on earthquake motions on bedrock, which are important for the construction of nuclear power plants in seismic regions. On this basis, two earthquake motion prediction models are developed: one (EMP-IB Model) for prediction with given magnitude and epicentral distance, and the other (EMP-IIB Model) to account for the successive fault ruptures and the site location relative to the fault of great earthquakes. (Author) [pt
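
    As a rough illustration of the modeling idea - a uniformly modulated simplification rather than the paper's EMP-IB/IIB models, which also evolve the frequency content - the sketch below generates a nonstationary accelerogram as envelope-modulated, band-pass-filtered white noise.

        # Nonstationary accelerogram as envelope-modulated filtered white noise
        # (a uniformly modulated simplification of an evolutionary process).
        import numpy as np
        from scipy.signal import butter, lfilter

        rng = np.random.default_rng(0)
        fs, dur = 100.0, 20.0                    # sample rate (Hz), duration (s)
        t = np.arange(0.0, dur, 1.0 / fs)

        b, a = butter(4, [1.0, 10.0], btype="bandpass", fs=fs)   # 1-10 Hz content
        stationary = lfilter(b, a, rng.standard_normal(t.size))

        envelope = (t / 2.0) ** 2 * np.exp(-t / 2.0)             # build-up then decay
        envelope /= envelope.max()
        accel = envelope * stationary                            # simulated motion
        print("peak |a| (arbitrary units):", np.abs(accel).max())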

  17. Integrated durability process in product development

    International Nuclear Information System (INIS)

    Pompetzki, M.; Saadetian, H.

    2002-01-01

    This presentation describes the integrated durability process in product development. Each of the major components of the integrated process are described along with a number of examples of how integrated durability assessment has been used in the ground vehicle industry. The durability process starts with the acquisition of loading information, either physically through loads measurement or virtually through multibody dynamics. The loading information is then processed and characterized for further analysis. Durability assessment was historically test based and completed through field or laboratory evaluation. Today, it is common that both the test and CAE environments are used together in durability assessment. Test based durability assessment is used for final design sign-off but is also critically important for correlating CAE models, in order to investigate design alternatives. There is also a major initiative today to integrate the individual components into a process, by linking applications and providing a framework to communicate information as well as manage all the data involved in the entire process. Although a single process is presented, the details of the process can vary significantly for different products and applications. Recent applications that highlight different parts of the durability process are given. As well as an example of how integration of software tools between different disciplines (MBD, FE and fatigue) not only simplifies the process, but also significantly improves it. (author)

  18. Flotation process diagnostics and modelling by coal grain analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ofori, P; O' Brien, G.; Firth, B.; Jenkins, B. [CSIRO Energy Technology, Brisbane, Qld. (Australia)

    2006-05-15

    In coal flotation, particles of different components of the coal such as maceral groups and mineral matter and their associations have different hydrophobicities and therefore different flotation responses. By using a new coal grain analysis method for characterising individual grains, more detailed flotation performance analysis and modelling approaches have been developed. The method involves the use of microscopic imaging techniques to obtain estimates of size, compositional and density information on individual grains of fine coal. The density and composition partitioning of coal processed through different flotation systems provides an avenue to pinpoint the actual cause of poor process performance so that corrective action may be initiated. The information on grain size, density and composition is being used as input data to develop more detailed flotation process models to provide better predictions of process performance for both mechanical and column flotation devices. A number of approaches may be taken to flotation modelling such as the probability approach and the kinetic model approach or a combination of the two. In the work reported here, a simple probability approach has been taken, which will be further refined in due course. The use of grain data to map the responses of different types of coal grains through various fine coal cleaning processes provided a more advanced diagnostic capability for fine coal cleaning circuits. This enabled flotation performance curves analogous to partition curves for density separators to be produced for flotation devices.

  19. Improved model management with aggregated business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Mans, R.S.; Toorn, van der R.A.

    2009-01-01

    Contemporary organizations invest much efforts in creating models of their business processes. This raises the issue of how to deal with large sets of process models that become available over time. This paper proposes an extension of Event-driven Process Chains, called the aggregate EPC (aEPC),

  20. In-situ biogas upgrading process: Modeling and simulations aspects.

    Science.gov (United States)

    Lovato, Giovanna; Alvarado-Morales, Merlin; Kovalovszki, Adam; Peprah, Maria; Kougias, Panagiotis G; Rodrigues, José Alberto Domingues; Angelidaki, Irini

    2017-12-01

    Biogas upgrading processes by in-situ hydrogen (H2) injection are still challenging and could benefit from a mathematical model to predict system performance. Therefore, a previous model on anaerobic digestion was updated and expanded to include the effect of H2 injection into the liquid phase of a fermenter with the aim of modeling and simulating these processes. This was done by including hydrogenotrophic methanogen kinetics for H2 consumption and an inhibition effect on the acetogenic steps. Special attention was paid to gas-to-liquid transfer of H2. The final model was successfully validated considering a set of case studies. Biogas composition and H2 utilization were correctly predicted, with overall deviation below 10% compared to experimental measurements. Parameter sensitivity analysis revealed that the model is highly sensitive to the H2 injection rate and mass transfer coefficient. The model developed is an effective tool for predicting process performance in scenarios with biogas upgrading. Copyright © 2017 Elsevier Ltd. All rights reserved.
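
    The sketch below illustrates the transfer-limited liquid-phase H2 balance that makes the mass transfer coefficient so influential in the sensitivity analysis; it is a single-equation caricature of the full model, with assumed constants.

        # Single liquid-phase H2 balance: transfer in, Monod-type uptake out.
        #   dC/dt = kLa*(C_sat - C) - q_max*X*C/(K + C)
        # Constants are assumed for illustration, not taken from the paper.
        from scipy.integrate import solve_ivp

        kLa = 100.0      # gas-liquid transfer coefficient, 1/d (assumed)
        C_sat = 1.0e-3   # H2 solubility at the injected partial pressure, kg/m^3
        q_max = 10.0     # max specific uptake by hydrogenotrophs, kg/(kg*d)
        K = 1.0e-4       # half-saturation constant, kg/m^3
        X = 0.5          # hydrogenotrophic biomass, kg/m^3

        def dC(t, C):
            c = C[0]
            return [kLa * (C_sat - c) - q_max * X * c / (K + c)]

        # stiff (fast uptake against slow transfer), hence the BDF integrator
        sol = solve_ivp(dC, (0.0, 1.0), [0.0], method="BDF")
        C_end = sol.y[0, -1]
        print(f"dissolved H2 after 1 d: {C_end:.2e} kg/m^3")
        print(f"transfer ~= uptake: {kLa * (C_sat - C_end):.3f} kg/(m^3 d)")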

  1. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling - using the Business Process Modelling Notation (BPMN) standard - is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in health care processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  2. Scripting MODFLOW model development using Python and FloPy

    Science.gov (United States)

    Bakker, Mark; Post, Vincent E. A.; Langevin, Christian D.; Hughes, Joseph D.; White, Jeremy; Starn, Jeffrey; Fienen, Michael N.

    2016-01-01

    Graphical user interfaces (GUIs) are commonly used to construct and postprocess numerical groundwater flow and transport models. Scripting model development with the programming language Python is presented here as an alternative approach. One advantage of Python is that there are many packages available to facilitate the model development process, including packages for plotting, array manipulation, optimization, and data analysis. For MODFLOW-based models, the FloPy package was developed by the authors to construct model input files, run the model, and read and plot simulation results. Use of Python with the available scientific packages and FloPy facilitates data exploration, alternative model evaluations, and model analyses that can be difficult to perform with GUIs. Furthermore, Python scripts are a complete, transparent, and repeatable record of the modeling process. The approach is introduced with a simple FloPy example to create and postprocess a MODFLOW model. A more complicated capture-fraction analysis with a real-world model is presented to demonstrate the types of analyses that can be performed using Python and FloPy.
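
    In the spirit of the simple example the abstract mentions, the sketch below builds, runs, and postprocesses a one-layer MODFLOW model with FloPy; it assumes flopy is installed and a MODFLOW-2005 executable named mf2005 is on the path, and all grid and property values are illustrative.

        # Scripted MODFLOW model with FloPy: build, run, and read results.
        # Assumes flopy is installed and an executable 'mf2005' is on the path;
        # the grid, properties, and pumping rate are illustrative only.
        import numpy as np
        import flopy

        m = flopy.modflow.Modflow(modelname="demo", exe_name="mf2005")
        dis = flopy.modflow.ModflowDis(m, nlay=1, nrow=100, ncol=100,
                                       delr=100.0, delc=100.0, top=0.0, botm=-50.0)

        ibound = np.ones((1, 100, 100), dtype=int)
        ibound[:, :, 0] = ibound[:, :, -1] = -1        # constant heads east and west
        bas = flopy.modflow.ModflowBas(m, ibound=ibound, strt=0.0)
        lpf = flopy.modflow.ModflowLpf(m, hk=10.0)     # K = 10 m/d (assumed)
        wel = flopy.modflow.ModflowWel(m, stress_period_data={0: [[0, 50, 50, -1000.0]]})
        oc = flopy.modflow.ModflowOc(m)                # save heads
        pcg = flopy.modflow.ModflowPcg(m)              # solver

        m.write_input()
        success, _ = m.run_model(silent=True)
        if success:
            head = flopy.utils.HeadFile("demo.hds").get_data()
            print("head at the well:", head[0, 50, 50])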

  3. Modelling of slag emulsification and slag reduction in CAS-OB process

    OpenAIRE

    Sulasalmi, P. (Petri)

    2016-01-01

    Composition Adjustment by Sealed argon bubbling – Oxygen Blowing (CAS-OB) is a ladle treatment process that was developed for chemical heating and alloying of steel. The main stages of the process are heating, (possible) alloying and reduction of slag. The CAS-OB process aims for homogenization and control of the composition and temperature of steel. In this dissertation, a mathematical reaction model was developed for the slag reduction stage of the CAS-OB process. Sl...

  4. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing a software package, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques, including negative imaging, contrast stretching, compression of dynamic range, neon, diffuse, emboss, etc., have been studied. Segmentation techniques, including point detection, line detection, and edge detection, have been studied, and some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the perceptron model have been applied for face and character recognition. (author)
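
    Two of the enhancement techniques listed above are easy to state precisely; the sketch below (in Python rather than the project's Visual Basic) implements negative imaging and linear contrast stretching.

        # Two of the listed enhancements for an 8-bit grayscale image, in
        # Python/numpy rather than the project's Visual Basic.
        import numpy as np

        def negative(img):
            """Negative imaging: invert 8-bit intensities."""
            return 255 - img

        def contrast_stretch(img):
            """Linearly map [min, max] of the image onto the full [0, 255] range."""
            lo, hi = int(img.min()), int(img.max())
            if hi == lo:                  # flat image: nothing to stretch
                return img.copy()
            scaled = (img.astype(np.float32) - lo) * 255.0 / (hi - lo)
            return scaled.astype(np.uint8)

        img = np.random.randint(80, 170, size=(4, 4), dtype=np.uint8)  # dull test image
        print("original range :", img.min(), "-", img.max())
        print("stretched range:", contrast_stretch(img).min(), "-", contrast_stretch(img).max())
        print("negative corner:", negative(img)[0, 0], "from", img[0, 0])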

  5. MATHEMATICAL MODELING OF ELECTROCHEMICAL PROCESSES IN LITHIUM-ION BATTERIES POTENTIALLY STREAMING METHOD

    Directory of Open Access Journals (Sweden)

    S. P. Halutin

    2014-01-01

    Full Text Available Mathematical models of the physico-chemical processes in lithium-ion batteries are developed in terms of electrical parameters. The parameters of the developed model (discharge mode) are identified from a family of discharge curves. Using these parameters, a numerical model of the lithium-ion battery is obtained.
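
    A common concrete form of such an electrical-parameter model is a first-order Thevenin equivalent circuit identified from discharge curves; the sketch below simulates a constant-current discharge with assumed parameter values, purely for illustration.

        # First-order Thevenin equivalent circuit: OCV source behind R0 plus one
        # RC pair, discharged at constant current. All values are assumed.
        import math

        def ocv(soc):
            # assumed open-circuit-voltage curve, V
            return 3.0 + 1.2 * soc - 0.2 * math.cos(3.0 * soc)

        Q = 2.5 * 3600                     # capacity, A*s (2.5 Ah, assumed)
        R0, R1, C1 = 0.05, 0.03, 2000.0    # ohmic / polarization parameters (assumed)

        dt, I = 1.0, 2.5                   # time step (s), discharge current (A)
        soc, u1, t = 1.0, 0.0, 0.0         # state of charge, RC voltage, time
        while soc > 0.05:
            u1 += dt * (I / C1 - u1 / (R1 * C1))   # RC polarization dynamics
            soc -= dt * I / Q                      # coulomb counting
            t += dt

        v = ocv(soc) - I * R0 - u1
        print(f"after {t/60:.0f} min: SOC = {soc:.2f}, terminal voltage = {v:.2f} V")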

  6. Efficient Adoption and Assessment of Multiple Process Improvement Reference Models

    Directory of Open Access Journals (Sweden)

    Simona Jeners

    2013-06-01

    Full Text Available A variety of reference models such as CMMI, COBIT or ITIL support IT organizations in improving their processes. These process improvement reference models (IRMs) cover different domains such as IT development, IT Services or IT Governance but also share some similarities. As there are organizations that address multiple domains and need to coordinate their processes in their improvement, we present MoSaIC, an approach to support organizations in efficiently adopting and conforming to multiple IRMs. Our solution realizes a semantic integration of IRMs based on common meta-models. The resulting IRM integration model enables organizations to efficiently implement and assess multiple IRMs and to benefit from synergy effects.

  7. Synchronized mammalian cell culture: part II--population ensemble modeling and analysis for development of reproducible processes.

    Science.gov (United States)

    Jandt, Uwe; Barradas, Oscar Platas; Pörtner, Ralf; Zeng, An-Ping

    2015-01-01

    The consideration of inherent population inhomogeneities of mammalian cell cultures becomes increasingly important for systems biology study and for developing more stable and efficient processes. However, variations of cellular properties belonging to different sub-populations and their potential effects on cellular physiology and kinetics of culture productivity under bioproduction conditions have not yet been much in the focus of research. Culture heterogeneity is strongly determined by the advance of the cell cycle. The assignment of cell-cycle specific cellular variations to large-scale process conditions can be optimally determined based on the combination of (partially) synchronized cultivation under otherwise physiological conditions and subsequent population-resolved model adaptation. The first step has been achieved using the physical selection method of countercurrent flow centrifugal elutriation, recently established in our group for different mammalian cell lines which is presented in Part I of this paper series. In this second part, we demonstrate the successful adaptation and application of a cell-cycle dependent population balance ensemble model to describe and understand synchronized bioreactor cultivations performed with two model mammalian cell lines, AGE1.HNAAT and CHO-K1. Numerical adaptation of the model to experimental data allows for detection of phase-specific parameters and for determination of significant variations between different phases and different cell lines. It shows that special care must be taken with regard to the sampling frequency in such oscillation cultures to minimize phase shift (jitter) artifacts. Based on predictions of long-term oscillation behavior of a culture depending on its start conditions, optimal elutriation setup trade-offs between high cell yields and high synchronization efficiency are proposed. © 2014 American Institute of Chemical Engineers.

  8. Demand-based maintenance and operators support based on process models; Behovsstyrt underhaall och operatoersstoed baserat paa process modeller

    Energy Technology Data Exchange (ETDEWEB)

    Dahlquist, Erik; Widarsson, Bjoern; Tomas-Aparicio, Elena

    2012-02-15

    There is a strong demand for systems that can give early warnings of upcoming problems in process performance or sensor measurements. In this project we have developed and implemented such a system on-line. The goal of the system is to give warnings both about faults needing urgent action and about roughly when service may be needed for specific functions. The use of process simulation models on-line can offer a significant tool for operators and process engineers to analyse the performance of the process and make the most correct and fastest decision when problems arise. In this project physical simulation models are used in combination with decision support tools. By using a physical model it is possible to compare the measured data to the data obtained from the simulation and give these deviations as input to a decision support tool with Bayesian Networks (BN) that will result in information about the probability of wrong measurements in the instruments, process problems and maintenance needs. The application has been implemented in a CFB boiler at Maelarenergi AB. After tuning the model, the system was used online during September - October 2010 and May - October 2011, showing that the system works on-line with respect to running the simulation model, but with batch runs with respect to the BN. Examples have been made for several variables, where trends of the deviation between simulation results and measured data have been used as input to a BN and the probability of different faults has been calculated. Combustion up in the separator/cyclones has been detected several times, as have problems with fuel feed on both sides of the boiler, a moisture sensor not functioning as it should, and suspected malfunctioning temperature meters. Deeper investigations of the true cause of problems have been used as input to tune the BN.

  9. Biomass Torrefaction Process Review and Moving Bed Torrefaction System Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Jaya Shakar Tumuluru; Shahab Sokhansanj; Christopher T. Wright; Richard D. Boardman

    2010-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300 °C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range of 200-230 °C and 270-280 °C. Thus, the process can also be called a mild pyrolysis, as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, producing a final product that has a lower mass but a higher heating value. An important aspect of research is to establish a degree of torrefaction where gains in heating value offset the loss of mass. There is a lack of literature on torrefaction reactor designs and on design sheets for estimating the dimensions of the torrefier based on capacity. This study includes (a) conducting a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account the basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed for different capacities, designing the heat loads and gas flow rates, and developing an interactive Excel sheet where the user can define design specifications. In this report, capacities of 25-1000 kg/hr are used in the design equations for the torrefier, with examples of calculations and specifications for the torrefier.
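
    The design-sheet arithmetic can be illustrated in a few lines of Python: required bed volume from capacity and residence time, then diameter and height for an assumed slenderness ratio. The numbers are illustrative, not the report's.

        # Design-sheet arithmetic for a moving packed-bed torrefier: bed volume
        # from capacity and residence time, then diameter and height for an
        # assumed slenderness ratio. All numbers are illustrative.
        import math

        m_dot = 500.0      # biomass feed, kg/hr (within the report's 25-1000 kg/hr range)
        tau = 0.5          # solids residence time in the bed, hr (assumed)
        rho_bulk = 250.0   # bulk density of the feed, kg/m^3 (assumed)
        h_over_d = 3.0     # bed height-to-diameter ratio (assumed)

        volume = m_dot * tau / rho_bulk                          # bed volume, m^3
        d = (4.0 * volume / (math.pi * h_over_d)) ** (1.0 / 3.0)
        h = h_over_d * d
        print(f"bed volume = {volume:.2f} m^3, D = {d:.2f} m, H = {h:.2f} m")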

  10. A Process Towards Societal Value within a Community-Based Regional Development Project

    Directory of Open Access Journals (Sweden)

    Anna Åslund

    2012-12-01

    Full Text Available Processes, activities and tasks of a community-based area development project are described. The main process has been used three times and a model is presented. An earlier developed process map has been verified. The description of the project can help other communities to plan development projects. The illustration can be valuable for entrepreneurs who are planning a societal value initiative and for decision-makers and stakeholders who can contribute to, are concerned with, or may be affected by societal entrepreneurship. Observation, participatory studies, documentation, and an interview with the project leader have been carried out. Data have been analyzed and compared with the previously developed process map to achieve a deeper understanding of the processes within societal entrepreneurship. The purpose was to study and describe the processes of a community-based area development project, to compare them with a previously developed process map, and to verify the process map.

  11. Development of enhanced sulfur rejection processes

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, R.H.; Luttrell, G.H.; Adel, G.T.; Richardson, P.E.

    1996-03-01

    Research at Virginia Tech led to the development of two complementary concepts for improving the removal of inorganic sulfur from many eastern U.S. coals. These concepts are referred to as Electrochemically Enhanced Sulfur Rejection (EESR) and Polymer Enhanced Sulfur Rejection (PESR) processes. The EESR process uses electrochemical techniques to suppress the formation of hydrophobic oxidation products believed to be responsible for the floatability of coal pyrite. The PESR process uses polymeric reagents that react with pyrite and convert floatable middlings, i.e., composite particles composed of pyrite with coal inclusions, into hydrophilic particles. These new pyritic-sulfur rejection processes do not require significant modifications to existing coal preparation facilities, thereby enhancing their adoptability by the coal industry. It is believed that these processes can be used simultaneously to maximize the rejection of both well-liberated pyrite and composite coal-pyrite particles. The project was initiated on October 1, 1992 and all technical work has been completed. This report is based on the research carried out under Tasks 2-7 described in the project proposal. These tasks include Characterization, Electrochemical Studies, In Situ Monitoring of Reagent Adsorption on Pyrite, Bench Scale Testing of the EESR Process, Bench Scale Testing of the PESR Process, and Modeling and Simulation.

  12. Modeling the Hydrologic Processes of a Permeable Pavement ...

    Science.gov (United States)

    A permeable pavement system can capture stormwater to reduce runoff volume and flow rate, improve onsite groundwater recharge, and enhance pollutant controls within the site. A new unit process model for evaluating the hydrologic performance of a permeable pavement system has been developed in this study. The developed model can continuously simulate infiltration through the permeable pavement surface, exfiltration from the storage to the surrounding in situ soils, and clogging impacts on infiltration/exfiltration capacity at the pavement surface and the bottom of the subsurface storage unit. The exfiltration modeling component simulates vertical and horizontal exfiltration independently, based on Darcy's formula with the Green-Ampt approximation. The developed model is parameterized with physically based modeling parameters, such as hydraulic conductivity, Manning's friction flow parameters, saturated and field-capacity volumetric water contents, porosity, and density. The developed model was calibrated using high-frequency observed data. The modeled water depths match the observed values well (R2 = 0.90). The modeling results show that horizontal exfiltration through the side walls of the subsurface storage unit is a prevailing factor in determining the hydrologic performance of the system, especially where the storage unit is developed in a long, narrow shape or with a high risk of bottom compaction and clogging. This paper presents unit
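
    The Green-Ampt approximation mentioned above reduces, under ponded conditions, to an implicit equation for cumulative infiltration that is easy to solve by fixed-point iteration, as in the sketch below (soil values assumed for illustration).

        # Green-Ampt cumulative infiltration under ponded conditions: solve
        #   F = K*t + psi*dtheta*ln(1 + F/(psi*dtheta))
        # by fixed-point iteration. Soil values are assumed for illustration.
        import math

        K = 0.01        # saturated hydraulic conductivity, cm/h (assumed)
        psi = 11.0      # wetting-front suction head, cm (assumed)
        dtheta = 0.3    # saturated minus initial water content (assumed)

        def green_ampt_F(t, tol=1e-8):
            F = max(K * t, 1e-6)          # starting guess
            while True:
                F_new = K * t + psi * dtheta * math.log(1.0 + F / (psi * dtheta))
                if abs(F_new - F) < tol:
                    return F_new
                F = F_new

        for t in (1.0, 6.0, 24.0):
            F = green_ampt_F(t)
            f = K * (1.0 + psi * dtheta / F)   # infiltration capacity, cm/h
            print(f"t = {t:4.1f} h:  F = {F:.3f} cm,  f = {f:.4f} cm/h")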

  13. Development of a working Hovercraft model

    Science.gov (United States)

    Noor, S. H. Mohamed; Syam, K.; Jaafar, A. A.; Mohamad Sharif, M. F.; Ghazali, M. R.; Ibrahim, W. I.; Atan, M. F.

    2016-02-01

    This paper presents the development process of fabricating a working hovercraft model. The purpose of this study is to design and investigate a fully functional hovercraft, based on the studies that had been done. Different designs of the hovercraft model were made and tested, but only one of them is presented in this paper. In this thesis, the weight, thrust, lift and drag force of the model were measured, and the electrical and mechanical parts are also presented. The processing unit of this model is an Arduino Uno, with a PSP2 (PlayStation 2) pad as the controller. Since the prototype should function on all kinds of earth surface, the model was also tested under different floor conditions, including water, grass, cement and tile. The speed of the model was measured in every case as the response variable, with current (I) as the manipulated variable and voltage (V) as the constant variable.

  14. Three-dimensional model for fusion processes

    International Nuclear Information System (INIS)

    Olson, A.P.

    1984-01-01

    Active galactic nuclei (AGN) emit unusual spectra of radiation which are interpreted to signify extreme distance, extreme power, or both. The status of AGNs was recently reviewed by Balick and Heckman. It seems that the greatest conceptual difficulty with understanding AGNs is how to form a coherent phenomenological model of their properties. What drives the galactic engine? What and where are the mass-flows of fuel to this engine? Is there more than one engine? Do the engines have any symmetry properties? Is the observed radiation isotropically emitted from the source? If it is polarized, what causes the polarization? Why is there a roughly spherical cloud of ionized gas about the center of our own galaxy, the Milky Way? The purpose of this paper is to discuss a new model, based on fusion processes which are not axisymmetric, uniform, isotropic, or even time-invariant. Then, the relationship to these questions will be developed. A unified model of fusion processes applicable to many astronomical phenomena will be proposed and discussed.

  15. Integrating textual and model-based process descriptions for comprehensive process search

    NARCIS (Netherlands)

    Leopold, Henrik; van der Aa, Han; Pittke, Fabian; Raffel, Manuel; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Documenting business processes using process models is common practice in many organizations. However, not all process information is best captured in process models. Hence, many organizations complement these models with textual descriptions that specify additional details. The problem with this

  16. Modeling laser velocimeter signals as triply stochastic Poisson processes

    Science.gov (United States)

    Mayo, W. T., Jr.

    1976-01-01

    Previous models of laser Doppler velocimeter (LDV) systems have not adequately described dual-scatter signals in a manner useful for analysis and simulation of low-level photon-limited signals. At low photon rates, an LDV signal at the output of a photomultiplier tube is a compound nonhomogeneous filtered Poisson process, whose intensity function is another (slower) Poisson process with the nonstationary rate and frequency parameters controlled by a random flow (slowest) process. In the present paper, generalized Poisson shot noise models are developed for low-level LDV signals. Theoretical results useful in detection error analysis and simulation are presented, along with measurements of burst amplitude statistics. Computer generated simulations illustrate the difference between Gaussian and Poisson models of low-level signals.
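
    A standard way to realize such a nonhomogeneous Poisson photon stream is Lewis-Shedler thinning; the sketch below uses a Gaussian-envelope Doppler-burst rate function as an illustrative stand-in for the paper's doubly and triply stochastic hierarchy.

        # Photon arrivals for a low-level LDV burst as a nonhomogeneous Poisson
        # process, generated by Lewis-Shedler thinning. The Gaussian-envelope
        # Doppler-burst rate below is an illustrative stand-in.
        import numpy as np

        rng = np.random.default_rng(1)

        def burst_rate(t, r0=5e5, t0=50e-6, sigma=10e-6, fd=2e5, vis=0.9):
            """Photon rate (1/s): pedestal envelope times Doppler modulation."""
            env = np.exp(-0.5 * ((t - t0) / sigma) ** 2)
            return r0 * env * (1.0 + vis * np.cos(2.0 * np.pi * fd * t))

        T, rate_max = 100e-6, 5e5 * 1.9     # window length and a rate upper bound
        t, arrivals = 0.0, []
        while True:
            t += rng.exponential(1.0 / rate_max)          # candidate arrival
            if t > T:
                break
            if rng.random() < burst_rate(t) / rate_max:   # thinning acceptance
                arrivals.append(t)

        print(f"{len(arrivals)} photon events in a {T * 1e6:.0f} microsecond window")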

  17. Process Model for Friction Stir Welding

    Science.gov (United States)

    Adams, Glynn

    1996-01-01

    Friction stir welding (FSW) is a relatively new process being applied for the joining of metal alloys. The process was initially developed by The Welding Institute (TWI) in Cambridge, UK. The FSW process is being investigated at NASA/MSFC as a repair/initial weld procedure for fabrication of the super-light-weight aluminum-lithium shuttle external tank. The FSW investigations at MSFC were conducted on a horizontal mill to produce butt welds of flat plate material. The weldment plates are butted together and fixed to a backing plate on the mill bed. A pin tool is placed into the tool holder of the mill spindle and rotated at approximately 400 rpm. The pin tool is then plunged into the plates such that the center of the probe lies at one end of the line of contact between the plates and the shoulder of the pin tool penetrates the top surface of the weldment. The weld is produced by traversing the tool along the line of contact between the plates. A lead angle allows the leading edge of the shoulder to remain above the top surface of the plate. The work presented here is a first attempt at modeling a complex phenomenon. The mechanical aspects of conducting the weld process are easily defined and the process itself is controlled by relatively few input parameters. However, in the region of the weld, plasticizing and forging of the parent material occur. These are difficult processes to model. The model presented here addresses only variations in the radial dimension outward from the pin tool axis. Examinations of the grain structure of the weld reveal that a considerable amount of material deformation also occurs in the direction parallel to the pin tool axis of rotation, through the material thickness. In addition, measurements of the axial load on the pin tool demonstrate that the forging effect of the pin tool shoulder is an important process phenomenon. Therefore, the model needs to be expanded to account for the deformations through the material thickness and the

  18. The Complexity of Developmental Predictions from Dual Process Models

    Science.gov (United States)

    Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.

    2011-01-01

    Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…

  19. Modelling and control of a flotation process

    International Nuclear Information System (INIS)

    Ding, L.; Gustafsson, T.

    1999-01-01

    A general description of a flotation process is given. The dynamic model of a MIMO nonlinear subprocess in flotation, i.e., the pulp levels in five compartments in series, is developed and the model is verified with real data from a production plant. In order to reject constant disturbances, five extra states are introduced and the model is modified. An exact linearization has been made for the nonlinear model and a linear quadratic Gaussian (LQG) controller is proposed based on the linearized model. The simulation results show improved performance of the pulp level control when the set points are changed or a disturbance occurs. In the future the controller will be tested in production. (author)

  20. PSE in Pharmaceutical Process Development

    DEFF Research Database (Denmark)

    Gernaey, Krist; Cervera Padrell, Albert Emili; Woodley, John

    2011-01-01

    The pharmaceutical industry is under growing pressure to increase efficiency, both in production and in process development. This paper will discuss the use of Process Systems Engineering (PSE) methods in pharmaceutical process development, and searches for answers to questions such as: Which PSE...

  1. Development of a Mantle Convection Physical Model to Assist with Teaching about Earth's Interior Processes

    Science.gov (United States)

    Glesener, G. B.; Aurnou, J. M.

    2010-12-01

    The Modeling and Educational Demonstrations Laboratory (MEDL) at UCLA is developing a mantle convection physical model to assist educators with the pedagogy of Earth’s interior processes. Our design goal consists of two components to help the learner gain conceptual understanding by means of visual interactions without the burden of distracters, which may promote alternative conceptions. Distracters may be any feature of the conceptual model that causes the learner to use inadequate mental artifacts to help him or her understand what the conceptual model is intended to convey. The first component, and the most important, is a psychological component that links properties of “everyday things” (Norman, 1988) to the natural phenomenon, mantle convection. Some examples of everyday things may be heat rising out of a freshly popped bag of popcorn, or cold humid air falling from an open freezer. The second component is the scientific accuracy of the conceptual model. We would like to simplify the concepts for the learner without sacrificing key information that is linked to other natural phenomena the learner will come across in future science lessons. By taking into account the learner’s mental artifacts in combination with a simplified, but accurate, representation of what scientists know of the Earth’s interior, we expect the learner to have the ability to create an adequate qualitative mental simulation of mantle convection. We will be presenting some of our prototypes of this mantle convection physical model at this year’s poster session and invite constructive input from our colleagues.

  2. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    Science.gov (United States)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
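
    The core idea, pushing manufacturing uncertainty through to component stress, can be illustrated with plain Monte Carlo sampling. In the sketch below the input distributions and the stress response surface are placeholders standing in for the physics-based process models and the NESSUS analysis; nothing here is taken from the paper itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Placeholder manufacturing uncertainties (distributions are invented):
    forging_temp = rng.normal(1100.0, 15.0, n)   # forging temperature, deg C
    weld_offset = rng.normal(0.0, 0.05, n)       # weld misalignment, mm

    # Placeholder response surface linking process variables to peak stress (MPa).
    stress = 450.0 + 0.8 * (forging_temp - 1100.0) + 120.0 * np.abs(weld_offset)

    allowable = 480.0
    p_exceed = np.mean(stress > allowable)
    print(f"mean stress {stress.mean():.1f} MPa, "
          f"P(stress > {allowable} MPa) = {p_exceed:.4f}")
    ```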

  3. Integration of Tuyere, Raceway and Shaft Models for Predicting Blast Furnace Process

    Science.gov (United States)

    Fu, Dong; Tang, Guangwu; Zhao, Yongfu; D'Alessio, John; Zhou, Chenn Q.

    2018-06-01

    A novel modeling strategy is presented for simulating the blast furnace iron making process. The physical and chemical phenomena involved take place across a wide range of length and time scales, and three models are developed to simulate different regions of the blast furnace, i.e., the tuyere model, the raceway model and the shaft model. This paper focuses on the integration of the three models to predict the entire blast furnace process. Mapping of outputs and inputs between models and an iterative scheme are developed to establish communication between the models. The effects of tuyere operation and burden distribution on blast furnace fuel efficiency are investigated numerically. The integration of different models provides a way to realistically simulate the blast furnace by improving the modeling resolution of local phenomena and minimizing the model assumptions.
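
    The iterative scheme, mapping each submodel's outputs onto its neighbour's inputs until the exchanged quantities stop changing, is in essence an under-relaxed fixed-point loop. The sketch below uses two toy algebraic stand-ins for the raceway and shaft models; the relations, values, and relaxation factor are invented and carry no furnace physics.

    ```python
    # Toy stand-ins for two coupled submodels exchanging boundary conditions.
    def raceway_model(blast_temp):
        # returns the gas temperature entering the shaft (placeholder relation)
        return 0.6 * blast_temp + 900.0

    def shaft_model(gas_temp_in):
        # returns an updated blast temperature estimate (placeholder relation)
        return 1200.0 + 0.1 * (gas_temp_in - 2000.0)

    blast_temp, relax = 1200.0, 0.5
    for it in range(100):
        gas_in = raceway_model(blast_temp)
        new_blast = shaft_model(gas_in)
        if abs(new_blast - blast_temp) < 1e-6:
            break
        blast_temp += relax * (new_blast - blast_temp)  # under-relaxed update
    print(f"converged in {it} iterations: blast {blast_temp:.1f}, gas {gas_in:.1f}")
    ```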

  4. Developing Elementary Math and Science Process Skills Through Engineering Design Instruction

    Science.gov (United States)

    Strong, Matthew G.

    This paper examines how elementary students can develop math and science process skills through an engineering design approach to instruction. The performance and development of individual process skills overall and by gender were also examined. The study, preceded by a pilot, took place in a grade four extracurricular engineering design program in a public, suburban school district. Students worked in pairs and small groups to design and construct airplane models from styrofoam, paper clips, and toothpicks. The development and performance of process skills were assessed through a student survey of learning gains, an engineering design packet rubric (student work), observation field notes, and focus group notes. The results indicate that students can significantly develop process skills, that female students may develop process skills through engineering design better than male students, and that engineering design is most helpful for developing the measuring, suggesting improvements, and observing process skills. The study suggests that a more regular engineering design program or curriculum could be beneficial for students' math and science abilities both in this school and for the elementary field as a whole.

  5. Modeling and computational simulation of the osmotic evaporation process

    Directory of Open Access Journals (Sweden)

    Freddy Forero Longas

    2016-09-01

    Conclusions: It was found that, for the conditions studied, the Knudsen diffusion model is the most suitable to describe the transfer of water vapor through the hydrophobic membrane. The simulations developed adequately describe the osmotic evaporation process, providing a tool for the faster and more economical development of this technology.

  6. Modeling cancer registration processes with an enhanced activity diagram.

    Science.gov (United States)

    Lyalin, D; Williams, W

    2005-01-01

    Adequate instruments are needed to reflect the complexity of routine cancer registry operations properly in a business model. The activity diagram is a key instrument of the Unified Modeling Language (UML) for the modeling of business processes. The authors aim to improve descriptions of processes in cancer registration, as well as in other public health domains, through enhancements of the activity diagram notation within the standard semantics of UML. The authors introduce a practical approach to enhance a conventional UML activity diagram, complementing it with the following business process concepts: timeline, duration for individual activities, responsibilities for individual activities within swimlanes, and descriptive text. The authors used an enhanced activity diagram for modeling surveillance processes in the cancer registration domain. A specific example illustrates the use of an enhanced activity diagram to visualize the process of linking cancer registry records with external mortality files. The enhanced activity diagram allows for the addition of more business concepts to a single diagram and can improve descriptions of processes in cancer registration, as well as in other domains. Its additional features make it possible to advance the visualization of cancer registration processes. That, in turn, promotes the clarification of issues related to the process timeline, responsibilities for particular operations, and collaborations among process participants. Our first experiences in a cancer registry best-practices development workshop setting support the usefulness of such an approach.

  7. Multifunctional multiscale composites: Processing, modeling and characterization

    Science.gov (United States)

    Qiu, Jingjing

    Carbon nanotubes (CNTs) demonstrate extraordinary properties and show great promise in enhancing out-of-plane properties of traditional polymer/fiber composites and enabling functionality. However, current manufacturing challenges hinder the realization of their potential. In the dissertation research, both experimental and computational efforts have been conducted to investigate effective manufacturing techniques of CNT integrated multiscale composites. The fabricated composites demonstrated significant improvements in physical properties, such as tensile strength, tensile modulus, inter-laminar shear strength, thermal dimension stability and electrical conductivity. Such multiscale composites were truly multifunctional with the addition of CNTs. Furthermore, a novel hierarchical multiscale modeling method was developed in this research. Molecular dynamics (MD) simulation offered a reasonable explanation of CNT dispersion and motion in polymer solution. Bi-mode finite-extensible-nonlinear-elastic (FENE) dumbbell simulation was used to analyze the influence of CNT length distribution on the stress tensor and shear-rate-dependent viscosity. Based on the simulated viscosity profile and empirical equations from experiments, a macroscale flow simulation model based on the finite element method (FEM) was developed and validated to predict resin flow behavior in the processing of CNT-enhanced multiscale composites. The proposed multiscale modeling method provided a comprehensive understanding of micro/nano flow in both atomistic details and mesoscale. The simulation model can be used to optimize process design and control of the mold-filling process in multiscale composite manufacturing. This research provided systematic investigations into CNT-based multiscale composites. The results from this study may be used to leverage the benefits of CNTs and open up new application opportunities for high-performance multifunctional multiscale composites.

  8. Modeling of excavation induced coupled hydraulic-mechanical processes in claystone

    Energy Technology Data Exchange (ETDEWEB)

    Massmann, Jobst

    2009-07-01

    Concepts for the numerical modeling of excavation induced processes in claystone are investigated. The study has been motivated by the international discussion on the adequacy of claystone as a potential host rock for a final repository of radioactive waste. The processes, which could impact the safety of such a repository, are manifold and strongly interacting. Thus, a multiphysics approach is needed, regarding solid mechanics and fluid mechanics within a geological context. A coupled modeling concept is therefore indispensable. Based on observations and measurements at an argillaceous test site (the underground laboratory Tournemire, operated by the Institute of Radioprotection and Nuclear Safety, France) the modeling concept is developed. Two main processes constitute the basis of the applied model: deformation (linear elasticity considering damage) and fluid flow (unsaturated one-phase flow). Several coupling phenomena are considered: Terzaghi's effective stress concept, mass conservation of the liquid in a deformable porous medium, drying-induced shrinkage, and a permeability which depends on deformation and damage. In addition, transversely isotropic material behavior is considered. The numerical simulations are done with the finite element code RockFlow, which is extended to include: an orthotropic non-linear shrinkage model, a continuum damage model, and an orthotropic permeability model. For these new methods the theory and a literature review are presented, followed by applications, which illustrate the capability to model excavation induced processes in principle. In a comprehensive case study, the modeling concept is used to simulate the response of the Tournemire argillite to excavation. The results are compared with observations and measurements of three different excavations (century old tunnel, two galleries excavated in 1996 and 2003). In summary, it can be concluded that the developed model concept provides a prediction of the excavation induced

  9. Modeling of excavation induced coupled hydraulic-mechanical processes in claystone

    International Nuclear Information System (INIS)

    Massmann, Jobst

    2009-01-01

    Concepts for the numerical modeling of excavation induced processes in claystone are investigated. The study has been motivated by the international discussion on the adequacy of claystone as a potential host rock for a final repository of radioactive waste. The processes, which could impact the safety of such a repository, are manifold and strongly interacting. Thus, a multiphysics approach is needed, regarding solid mechanics and fluid mechanics within a geological context. A coupled modeling concept is therefore indispensable. Based on observations and measurements at an argillaceous test site (the underground laboratory Tournemire, operated by the Institute of Radioprotection and Nuclear Safety, France) the modeling concept is developed. Two main processes constitute the basis of the applied model: deformation (linear elasticity considering damage) and fluid flow (unsaturated one-phase flow). Several coupling phenomena are considered: Terzaghi's effective stress concept, mass conservation of the liquid in a deformable porous medium, drying-induced shrinkage, and a permeability which depends on deformation and damage. In addition, transversely isotropic material behavior is considered. The numerical simulations are done with the finite element code RockFlow, which is extended to include: an orthotropic non-linear shrinkage model, a continuum damage model, and an orthotropic permeability model. For these new methods the theory and a literature review are presented, followed by applications, which illustrate the capability to model excavation induced processes in principle. In a comprehensive case study, the modeling concept is used to simulate the response of the Tournemire argillite to excavation. The results are compared with observations and measurements of three different excavations (century old tunnel, two galleries excavated in 1996 and 2003). In summary, it can be concluded that the developed model concept provides a prediction of the excavation induced

  11. Process modeling for Humanities: tracing and analyzing scientific processes

    OpenAIRE

    Hug, Charlotte; Salinesi, Camille; Deneckere, Rebecca; Lamasse, Stéphane

    2011-01-01

    This paper concerns epistemology and the understanding of research processes in Humanities, such as Archaeology. We believe that to properly understand research processes, it is essential to trace them. The collected traces depend on the process model established, which has to be as accurate as possible to exhaustively record the traces. In this paper, we briefly explain why the existing process models for Humanities are not sufficient to represent traces. We then pres...

  12. Modeling of processes in the tourism sector

    Directory of Open Access Journals (Sweden)

    Salamatina Victoriya, S.

    2015-06-01

    In modern conditions, tourism is becoming a budget-forming sector for a number of Russian regions. It is therefore of interest to simulate the processes occurring in the tourism business, because they are affected by many random parameters arising from various economic, political, geographic, and other factors. To improve and develop management systems for the tourism business, economic-mathematical methods are being systematically introduced in this area, because increased competitiveness requires continuous and constructive change. The results of applying these methods allow the processes in tourism to be analyzed and evaluated more systematically and coherently. A feature of some economic processes typical of tourist activities is that the effect of a factor on the indicators of the process appears not immediately but gradually, after a certain time, with a certain lag. This delay must be accounted for when developing mathematical models of tourism business processes. In such cases, it is advisable to apply the economic-mathematical formalism of optimal control called game theory.

  13. Modeling of a Large-Scale High Temperature Regenerative Sulfur Removal Process

    DEFF Research Database (Denmark)

    Konttinen, Jukka T.; Johnsson, Jan Erik

    1999-01-01

    Regenerable mixed metal oxide sorbents are prime candidates for the removal of hydrogen sulfide from hot gasifier gas in the simplified integrated gasification combined cycle (IGCC) process. As part of the regenerative sulfur removal process development, reactor models are needed for scale-up. Steady-state kinetic reactor models are needed for reactor sizing, and dynamic models can be used for process control design and operator training. The regenerative sulfur removal process to be studied in this paper consists of two side-by-side fluidized bed reactors operating at temperatures of 400… …model that does not account for bed hydrodynamics. The pilot-scale test run results, obtained in the test runs of the sulfur removal process with real coal gasifier gas, have been used for parameter estimation. The validity of the reactor model for commercial-scale design applications is discussed.

  14. Ecosystem management via interacting models of political and ecological processes

    Directory of Open Access Journals (Sweden)

    Haas, T. C.

    2004-01-01

    The decision to implement environmental protection options is a political one. Political realities may cause a country not to heed the most persuasive scientific analysis of an ecosystem's future health. A predictive understanding of the political processes that result in ecosystem management decisions may help guide ecosystem management policymaking. To this end, this article develops a stochastic, temporal model of how political processes influence and are influenced by ecosystem processes. This model is realized in a system of interacting influence diagrams that model the decision making of a country's political bodies. These decisions interact with a model of the ecosystem enclosed by the country. As an example, a model for cheetah (Acinonyx jubatus) management in Kenya is constructed and fitted to decision and ecological data.

  15. A functional-dynamic reflection on participatory processes in modeling projects.

    Science.gov (United States)

    Seidl, Roman

    2015-12-01

    The participation of nonscientists in modeling projects/studies is increasingly employed to fulfill different functions. However, it is not well investigated whether and how explicitly these functions and the dynamics of a participatory process are reflected upon by modeling projects in particular. In this review study, I explore participatory modeling projects from a functional-dynamic process perspective. The main differences among projects relate to the functions of participation (most often, more than one per project can be identified) and to the degree of explicit reflection (i.e., awareness and anticipation) on the dynamic process perspective. Moreover, two main approaches are revealed: participatory modeling, covering diverse approaches, and companion modeling. It becomes apparent that the degree of reflection on the participatory process itself is not always explicit and perfectly visible in the descriptions of the modeling projects. Thus, the use of common protocols or templates is discussed to facilitate project planning, as well as the publication of project results. A generic template may help, not in providing details of a project or model development, but in explicitly reflecting on the participatory process. It can serve to systematize the particular project's approach to stakeholder collaboration, and thus quality management.

  16. Locating the Seventh Cervical Spinous Process: Development and Validation of a Multivariate Model Using Palpation and Personal Information.

    Science.gov (United States)

    Ferreira, Ana Paula A; Póvoa, Luciana C; Zanier, José F C; Ferreira, Arthur S

    2017-02-01

    The aim of this study was to develop and validate a multivariate prediction model, guided by palpation and personal information, for locating the seventh cervical spinous process (C7SP). A single-blinded, cross-sectional study at a primary-to-tertiary health care center was conducted for model development and temporal validation. One hundred sixty participants were prospectively included for the model development (n = 80) and time-split validation (n = 80) stages. The C7SP was located using the thorax-rib static method (TRSM). Participants underwent chest radiography for assessment of the inner body structure located with TRSM, using radio-opaque markers placed over the skin. Age, sex, height, body mass, body mass index, and vertex-marker distance (D_V-M) were used to predict the distance from the C7SP to the vertex (D_V-C7). Multivariate linear regression modeling, limits-of-agreement plots, histograms of residuals, receiver operating characteristic curves, and confusion tables were analyzed. The multivariate linear prediction model for D_V-C7 (in centimeters) was D_V-C7 = 0.986·D_V-M + 0.018·(mass) + 0.014·(age) − 1.008. Receiver operating characteristic curves had better discrimination for D_V-C7 (area under the curve = 0.661; 95% confidence interval = 0.541-0.782; P = .015) than for D_V-M (area under the curve = 0.480; 95% confidence interval = 0.345-0.614; P = .761), with respective cutoff points at 23.40 cm (sensitivity = 41%, specificity = 63%) and 24.75 cm (sensitivity = 69%, specificity = 52%). The C7SP was correctly located more often when using predicted D_V-C7 in the validation sample than when using the TRSM in the development sample: n = 53 (66%) vs n = 32 (40%), P … information. Copyright © 2016. Published by Elsevier Inc.
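
    The published regression is simple enough to evaluate directly. In the sketch below, the coefficient units (body mass in kilograms, age in years) are assumed from context; the abstract does not restate them, and the example subject is hypothetical.

    ```python
    def predict_dv_c7(dv_m_cm: float, mass_kg: float, age_years: float) -> float:
        """Vertex-to-C7SP distance (cm) from the published regression;
        mass in kg and age in years are assumptions, not stated in the abstract."""
        return 0.986 * dv_m_cm + 0.018 * mass_kg + 0.014 * age_years - 1.008

    # Hypothetical subject: palpated marker 23.0 cm from vertex, 70 kg, 35 years.
    print(f"predicted D_V-C7 = {predict_dv_c7(23.0, 70.0, 35.0):.2f} cm")
    ```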

  17. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve these equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
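
    For a flavour of the reactor examples listed, the point kinetics equations with a single effective delayed-neutron group can be integrated in a few lines. The record builds these blocks in Simulink; the sketch below instead uses a plain fixed-step Euler loop in Python, and all parameter values are illustrative.

    ```python
    # One-delayed-group point kinetics:
    #   dn/dt = ((rho - beta) / Lambda) * n + lam * C
    #   dC/dt = (beta / Lambda) * n - lam * C
    beta, lam, Lambda = 0.0065, 0.08, 1e-4  # delayed fraction, decay const (1/s), generation time (s)
    rho = 0.001                             # step reactivity insertion (illustrative)

    n = 1.0                                 # relative power
    C = beta * n / (lam * Lambda)           # precursor level in equilibrium with n
    dt, t_end = 1e-5, 1.0
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n, C = n + dt * dn, C + dt * dC
    print(f"relative power after {t_end:.0f} s: {n:.3f}")
    ```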

  18. Modeling nuclear processes by Simulink

    International Nuclear Information System (INIS)

    Rashid, Nahrul Khair Alang Md

    2015-01-01

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve these equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  19. Development of kinetic models for photoassisted electrochemical process using Ti/RuO2 anode and carbon nanotube-based O2-diffusion cathode

    International Nuclear Information System (INIS)

    Akbarpour, Amaneh; Khataee, Alireza; Fathinia, Mehrangiz; Vahid, Behrouz

    2016-01-01

    Highlights: • Preparation and characterization of a carbon nanotube-based O2-diffusion cathode. • Photoassisted electrochemical process using a Ti/RuO2 anode and O2-diffusion cathode. • Degradation of C.I. Basic Yellow 28 under recirculation mode. • Development of kinetic models for the photoassisted electrochemical process. - Abstract: A coupled photoassisted electrochemical system was utilized for degradation of C.I. Basic Yellow 28 (BY28), a cationic azomethine dye, under recirculation mode. Experiments were carried out utilizing an active titanium/ruthenium oxide (Ti/RuO2) anode and an O2-diffusion cathode with carbon nanotubes (CNTs). A transmission electron microscopy (TEM) image of the CNTs demonstrated that they had an inner diameter of approximately 5 nm and an outer diameter of approximately 19 nm. The dye degradation kinetics was then experimentally examined under various operational parameters, including BY28 initial concentration (mg/L), current density (mA/cm²), flow rate (L/h) and pH. Based on the generally accepted intrinsic elementary reactions for the photoassisted electrochemical process (PEP), a novel kinetic model was proposed and validated for predicting k_app. The developed kinetic model explicitly describes the dependency of k_app on BY28 initial concentration and current density. A good agreement was obtained between the predicted values of k_app and the experimental results (correlation coefficient R² = 0.996, mean squared error MSE = 2.10 × 10⁻⁴ and mean absolute error MAE = 1.10 × 10⁻²). Finally, in order to evaluate and compare the accuracy of the suggested intrinsic kinetic model in more depth, an empirical kinetic model was also developed as a function of the main operational parameters, along with an artificial neural network (ANN) model, a 3-layer feed-forward back-propagation network with a 5:9:1 topology. The performance of these models was compared based on the error functions and analysis of variance (ANOVA). A…
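
    An apparent first-order rate constant of the kind these models predict can be recovered from concentration-time data by a least-squares fit of ln(C0/C) against time. The measurements below are fabricated to follow k_app = 0.03 1/min; only the fitting procedure is the point.

    ```python
    import numpy as np

    t = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])  # time, min
    c = np.array([20.0, 14.8, 11.1, 8.2, 6.1, 4.6])    # dye concentration, mg/L

    y = np.log(c[0] / c)                    # pseudo-first-order: y = k_app * t
    k_app = np.sum(t * y) / np.sum(t * t)   # least-squares slope through origin
    print(f"k_app = {k_app:.4f} 1/min")
    ```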

  20. Mathematical modeling of the voloxidation process. Final report

    International Nuclear Information System (INIS)

    Stanford, T.G.

    1979-06-01

    A mathematical model of the voloxidation process, a head-end reprocessing step for the removal of volatile fission products from spent nuclear fuel, has been developed. Three types of voloxidizer operation have been considered: co-current operation, in which the gas and solid streams flow in the same direction; countercurrent operation, in which the gas and solid streams flow in opposite directions; and semi-batch operation, in which the gas stream passes through the reactor while the solids remain in it and are processed batchwise. Because of the complexity of the physical and chemical processes which occur during the voloxidation process and the lack of currently available kinetic data, a global kinetic model has been adapted for this study. Test cases for each mode of operation have been simulated using representative values of the model parameters. To process 714 kg/day of spent nuclear fuel, using an oxidizing atmosphere containing 20 mole percent oxygen, it was found that a reactor 0.7 m in diameter and 2.49 m in length would be required for both the cocurrent and countercurrent modes of operation, while for semibatch operation a 0.3 m³ reactor and an 88,200 s batch processing time would be required.

  1. A literature review on business process modelling: new frontiers of reusability

    Science.gov (United States)

    Aldin, Laden; de Cesare, Sergio

    2011-08-01

    Business process modelling (BPM) has become fundamental for modern enterprises due to the increasing rate of organisational change. As a consequence, business processes need to be continuously (re-)designed as well as subsequently aligned with the corresponding enterprise information systems. One major problem associated with the design of business processes is reusability. Reuse of business process models has the potential of increasing the efficiency and effectiveness of BPM. This article critically surveys the existing literature on the problem of BPM reusability, and more specifically on the state-of-the-art research that can provide or suggest the 'elements' required for the development of a methodology aimed at discovering reusable conceptual artefacts in the form of patterns. The article initially clarifies the definitions of business process and business process model; then, it sets out to explore the previous research conducted in areas that have an impact on reusability in BPM. The article concludes by distilling directions for future research towards the development of a patterns-based approach to BPM; an approach that brings together the contributions made by the research community in the areas of process mining and discovery, declarative approaches and ontologies.

  2. High-Throughput Process Development for Biopharmaceuticals.

    Science.gov (United States)

    Shukla, Abhinav A; Rameez, Shahid; Wolfe, Leslie S; Oien, Nathan

    2017-11-14

    The ability to conduct multiple experiments in parallel significantly reduces the time that it takes to develop a manufacturing process for a biopharmaceutical. This is particularly significant before clinical entry, because process development and manufacturing are on the "critical path" for a drug candidate to enter clinical development. High-throughput process development (HTPD) methodologies can be similarly impactful during late-stage development, both for developing the final commercial process as well as for process characterization and scale-down validation activities that form a key component of the licensure filing package. This review examines the current state of the art for HTPD methodologies as they apply to cell culture, downstream purification, and analytical techniques. In addition, we provide a vision of how HTPD activities across all of these spaces can integrate to create a rapid process development engine that can accelerate biopharmaceutical drug development.

  3. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore we also analyze specifications obtained via a simple deterministic time-change of a homogeneous Lévy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared…

  4. Activated sludge model (ASM) based modelling of membrane bioreactor (MBR) processes: a critical review with special regard to MBR specificities.

    Science.gov (United States)

    Fenu, A; Guglielmi, G; Jimenez, J; Spèrandio, M; Saroj, D; Lesjean, B; Brepols, C; Thoeye, C; Nopens, I

    2010-08-01

    Membrane bioreactors (MBRs) have been increasingly employed for municipal and industrial wastewater treatment in the last decade. The efforts for modelling of such wastewater treatment systems have always targeted both the biological processes (treatment quality target) and the various aspects of engineering (cost-effective design and operation). The development of Activated Sludge Models (ASM) was an important evolution in the modelling of Conventional Activated Sludge (CAS) processes and their use is now very well established. However, although they were initially developed to describe CAS processes, they have simply been transferred and applied to MBR processes. Recent studies on MBR biological processes have reported several crucial specificities: medium to very high sludge retention times, high mixed liquor concentration, accumulation of soluble microbial products (SMP) rejected by the membrane filtration step, and high aeration rates for scouring purposes. These aspects raise the question as to what extent the ASM framework is applicable to MBR processes. Several studies highlighting some of the aforementioned issues are scattered through the literature. Hence, through a concise and structured overview of the past developments and current state-of-the-art in biological modelling of MBR, this review explores ASM-based modelling applied to MBR processes. The work aims to synthesize previous studies and differentiates between unmodified and modified applications of ASM to MBR. Particular emphasis is placed on influent fractionation, biokinetics, and soluble microbial products (SMPs)/exo-polymeric substances (EPS) modelling, and suggestions are put forward as to good modelling practice with regard to MBR modelling both for end-users and academia. A last section highlights shortcomings and future needs for improved biological modelling of MBR processes. (c) 2010 Elsevier Ltd. All rights reserved.

  5. On the Use of Variability Operations in the V-Modell XT Software Process Line

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel; Ternité, Thomas

    2016-01-01

    Software process lines provide a systematic approach to develop and manage software processes. A software process line defines a reference process containing general process assets, whereas a well-defined customization approach allows process engineers to create new process variants, e.g., by extending or modifying… In this article, we present a study on the feasibility of variability operations to support the development of software process lines in the context of the V-Modell XT. We analyze which variability operations are defined and practically used. We provide an initial catalog of variability operations as an improvement proposal for other process models. Our findings show that 69 variability operation types are defined across several metamodel versions, of which, however, 25 remain unused. The found variability operations allow for systematically modifying the content of process model elements and the process…

  6. Evaluation of End-Products in Architecture Design Process: A Fuzzy Decision-Making Model

    Directory of Open Access Journals (Sweden)

    Serkan PALABIYIK

    2012-06-01

    This paper presents a study on the development of a fuzzy multi-criteria decision-making model for the evaluation of end products of the architectural design process. The potential of the developed model was investigated within the scope of architectural design education, specifically an international design studio titled “Design for Disassembly and Reuse: Design & Building Multipurpose Transformable Pavilions.” The studio work followed a design process that integrated systematic and heuristic thinking. The design objectives and assessment criteria were clearly set out at the beginning of the process by the studio coordinator, with the aim of narrowing the design space and increasing awareness of the consequences of design decisions. At the end of the design process, the designs produced in the studio were evaluated using the developed model to support decision making. The model facilitated the identification of positive and negative aspects of the designs and the selection of the design alternative that best met the studio objectives set at the beginning.
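
    One minimal flavour of such an evaluation: criterion scores expressed as triangular fuzzy numbers, a weighted aggregation, and centroid defuzzification to rank the alternatives. The criteria, weights, and scores below are invented for illustration and are not the studio's actual model.

    ```python
    # Triangular fuzzy numbers (low, mid, high) scored per criterion per design.
    designs = {
        "pavilion_A": {"disassembly": (6, 7, 8), "reuse": (5, 6, 8), "aesthetics": (7, 8, 9)},
        "pavilion_B": {"disassembly": (8, 9, 9), "reuse": (6, 7, 8), "aesthetics": (5, 6, 7)},
    }
    weights = {"disassembly": 0.4, "reuse": 0.35, "aesthetics": 0.25}

    def centroid(tfn):
        """Defuzzify a triangular fuzzy number by its centroid."""
        return sum(tfn) / 3.0

    for name, scores in designs.items():
        agg = tuple(sum(weights[c] * scores[c][i] for c in weights) for i in range(3))
        print(f"{name}: fuzzy {tuple(round(v, 2) for v in agg)} -> crisp {centroid(agg):.2f}")
    ```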

  7. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes with a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axon, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and biological question that requires the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary differential equations and partial differential equations; background in biology is not required. Students will gain knowledge on how to program with MATLAB without previous programming experience and how to use codes in order to test biological hypothesis.

  8. Physically based modelling and optimal operation for product drying during post-harvest processing.

    NARCIS (Netherlands)

    Boxtel, van A.J.B.; Lukasse, L.; Farkas, I.; Rendik, Z.

    1996-01-01

    The development of new procedures for crop production and post-harvest processing requires models. Models based on physical backgrounds are most useful for this purpose because of their extrapolation potential. An optimal procedure is developed for alfalfa drying using a physical model. The model

  9. Development of process simulation code for reprocessing plant and process analysis for solvent degradation and solvent washing waste

    International Nuclear Information System (INIS)

    Tsukada, Tsuyoshi; Takahashi, Keiki

    1999-01-01

    We developed a process simulation code for an entire nuclear fuel reprocessing plant. The code can be used on a PC. Almost all of the equipment in the reprocessing plant is included in the code, and the mass balance model of each item of equipment is based on the distribution factors of its flow-out streams. All models are connected between the outlet flow and the inlet flow according to the process flow sheet. We estimated the amount of DBP from TBP degradation in the entire process by using the developed code. Most of the DBP is generated in the Pu refining process by the effect of α radiation from Pu, which is extracted in the solvent. On the other hand, very little DBP is generated in the U refining process. We therefore propose simplification of the solvent washing process and volume reduction of the alkali washing waste in the U refining process. The first Japanese commercial reprocessing plant is currently under construction at Rokkasho Mura. Recently, for the sake of process simplification, the original process design has been changed. Using our code, we analyzed the original process and the simplified process. According to our results, the volume of alkali waste solution in the low-level liquid treatment process will be reduced by half in the simplified process. (author)
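
    The distribution-factor bookkeeping the code rests on is easy to sketch: each item of equipment splits every entering species among its outlet streams by fixed fractions, and units are chained outlet-to-inlet following the flow sheet. The factors, species, and stream names below are illustrative, not plant data or values from the code.

    ```python
    def split(feed, factors):
        """feed: {species: kg/h}; factors: {outlet: {species: fraction}}."""
        return {out: {sp: feed.get(sp, 0.0) * fr.get(sp, 0.0) for sp in feed}
                for out, fr in factors.items()}

    feed = {"U": 100.0, "Pu": 1.0, "FP": 4.0}   # fission products lumped as FP
    extraction = split(feed, {
        "organic":   {"U": 0.998, "Pu": 0.995, "FP": 0.001},
        "raffinate": {"U": 0.002, "Pu": 0.005, "FP": 0.999},
    })
    # Chain outlet to inlet: the organic stream feeds the next unit.
    scrub = split(extraction["organic"], {
        "product": {"U": 0.999, "Pu": 0.999, "FP": 0.10},
        "waste":   {"U": 0.001, "Pu": 0.001, "FP": 0.90},
    })
    print(scrub["product"])
    ```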

  10. Development of interactive graphic user interfaces for modeling reaction-based biogeochemical processes in batch systems with BIOGEOCHEM

    Science.gov (United States)

    Chang, C.; Li, M.; Yeh, G.

    2010-12-01

    The BIOGEOCHEM numerical model (Yeh and Fang, 2002; Fang et al., 2003) was developed in FORTRAN for simulating reaction-based geochemical and biochemical processes with mixed equilibrium and kinetic reactions in batch systems. A complete suite of reactions, including aqueous complexation, adsorption/desorption, ion exchange, redox, precipitation/dissolution, acid-base reactions, and microbially mediated reactions, is embodied in this unique modeling tool. Any reaction can be treated as a fast/equilibrium or slow/kinetic reaction. An equilibrium reaction is modeled with an implicit finite rate governed by a mass action equilibrium equation or by a user-specified algebraic equation. A kinetic reaction is modeled with an explicit finite rate with an elementary rate, microbially mediated enzymatic kinetics, or a user-specified rate equation. None of the existing models has encompassed this wide array of scopes. To ease the input/output learning curve of using the unique features of BIOGEOCHEM, an interactive graphical user interface was developed with the Microsoft Visual Studio and .Net tools. Several robust, user-friendly features, such as pop-up help windows, typo warning messages, and on-screen input hints, were implemented. All input data can be viewed in real time and are automatically formatted to conform to the input file format of BIOGEOCHEM. A post-processor for graphic visualization of simulated results was also embedded for immediate demonstrations. By following the data input windows step by step, error-free BIOGEOCHEM input files can be created even if users have little prior experience with FORTRAN. With this user-friendly interface, the time and effort needed to conduct simulations with BIOGEOCHEM can be greatly reduced.

  11. The CREST Simulation Development Process: Training the Next Generation.

    Science.gov (United States)

    Sweet, Robert M

    2017-04-01

    The challenges of training and assessing endourologic skill have driven the development of new training systems. The Center for Research in Education and Simulation Technologies (CREST) has developed a team and a methodology to facilitate this development process. Backwards design principles were applied. A panel of experts first defined desired clinical and educational outcomes. Outcomes were subsequently linked to learning objectives. Gross task deconstruction was performed, and the primary domain was classified as involving decision-making, psychomotor skill, or communication. A more detailed cognitive task analysis was performed to elicit and prioritize relevant anatomy/tissues, metrics, and errors. Reference anatomy was created using a digital anatomist and a clinician working from a clinical data set. Three-dimensional printing can facilitate this process. When possible, synthetic or virtual tissue behavior and textures were recreated using data derived from human tissue. Embedded sensors/markers and/or computer-based systems were used to facilitate the collection of objective metrics. Verification and validation occurred throughout the engineering development process. Nine endourology-relevant training systems were created by CREST with this approach. Systems include basic laparoscopic skills (BLUS), vesicourethral anastomosis, pyeloplasty, cystoscopic procedures, stent placement, rigid and flexible ureteroscopy, GreenLight PVP (GL Sim), percutaneous access with C-arm (CAT), nephrolithotomy (NLM), and a vascular injury model. Mixed modalities have been used, including "smart" physical models, virtual reality, augmented reality, and video. Substantial validity evidence for training and assessment has been collected on these systems. An open-source manikin-based modular platform is under development by CREST with the Department of Defense that will unify these and other commercial task trainers through the common physiology engine, learning…

  12. Modelling and simulating decision processes of linked lives: An approach based on concurrent processes and stochastic race.

    Science.gov (United States)

    Warnke, Tom; Reinhardt, Oliver; Klabunde, Anna; Willekens, Frans; Uhrmacher, Adelinde M

    2017-10-01

    Individuals' decision processes play a central role in understanding modern migration phenomena and other demographic processes. Their integration into agent-based computational demography depends largely on suitable support by a modelling language. We are developing the Modelling Language for Linked Lives (ML3) to describe the diverse decision processes of linked lives succinctly in continuous time. The context of individuals is modelled by networks the individual is part of, such as family ties and other social networks. Central concepts, such as behaviour conditional on agent attributes, age-dependent behaviour, and stochastic waiting times, are tightly integrated in the language. Thereby, alternative decisions are modelled by concurrent processes that compete by stochastic race. Using a migration model, we demonstrate how this allows for compact description of complex decisions, here based on the Theory of Planned Behaviour. We describe the challenges for the simulation algorithm posed by stochastic race between multiple concurrent complex decisions.
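
    The stochastic race between concurrent alternatives can be sketched with competing exponential waiting times: every enabled option samples a time-to-fire, and the earliest one wins. The decision options and rates below are invented, and ML3 itself is a dedicated modelling language; this Python fragment only mimics its race semantics.

    ```python
    import random

    random.seed(1)

    def race(alternatives):
        """alternatives: {name: rate}; returns (winner, waiting_time)."""
        draws = {name: random.expovariate(rate) for name, rate in alternatives.items()}
        winner = min(draws, key=draws.get)
        return winner, draws[winner]

    # An agent deliberating between staying, moving internally, and emigrating.
    for _ in range(3):
        print(race({"stay": 1.0, "move_internal": 0.3, "emigrate": 0.05}))
    ```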

  13. Sketch of a Noisy Channel Model for the Translation Process

    DEFF Research Database (Denmark)

    Carl, Michael

    default rendering" procedure, later conscious processes are triggered by a monitor who interferes when something goes wrong. An attempt is made to explain monitor activities with relevance theoretic concepts according to which a translator needs to ensure the similarity of explicatures and implicatures......The paper develops a Noisy Channel Model for the translation process that is based on actual user activity data. It builds on the monitor model and makes a distinction between early, automatic and late, conscious translation processes: while early priming processes are at the basis of a "literal...... of the source and the target texts. It is suggested that events and parameters in the model need be measurable and quantifiable in the user activity data so as to trace back monitoring activities in the translation process data. Michael Carl is a Professor with special responsibilities at the Department...

  14. Multiphysics modelling of manufacturing processes: A review

    DEFF Research Database (Denmark)

    Jabbari, Masoud; Baran, Ismet; Mohanty, Sankhya

    2018-01-01

    Numerical modelling is increasingly supporting the analysis and optimization of manufacturing processes in the production industry. Even if mostly applied to multistep processes, single process steps may be so complex by nature that the models needed to describe them must include multiphysics. Five examples illustrate the diversity in the field of modelling of manufacturing processes as regards process, materials, generic disciplines as well as length scales: (1) modelling of tape casting for thin ceramic layers; (2) modelling the flow of polymers in extrusion; (3) modelling the deformation process of flexible stamps for nanoimprint lithography; (4) modelling the manufacturing of composite parts; and (5) modelling the selective laser melting process. For all five examples, the emphasis is on modelling results as well as on describing the models in brief mathematical detail. Alongside relevant references to the original work…

  15. DEVELOPMENT OF A KINETIC MODEL OF BOEHMITE DISSOLUTION IN CAUSTIC SOLUTIONS APPLIED TO OPTIMIZE HANFORD WASTE PROCESSING

    International Nuclear Information System (INIS)

    Disselkamp, R.S.

    2011-01-01

    Boehmite (i.e., aluminum oxyhydroxide) is a major non-radioactive component in Hanford and Savannah River nuclear tank waste sludge. Boehmite dissolution from sludge using caustic at elevated temperatures is being planned at Hanford to minimize the mass of material disposed of as high-level waste (HLW) during operation of the Waste Treatment Plant (WTP). To more thoroughly understand the chemistry of this dissolution process, we have developed an empirical kinetic model for aluminate production due to boehmite dissolution. Application of this model to Hanford tank wastes would allow predictability and optimization of the caustic leaching of aluminum solids, potentially yielding significant improvements to overall processing time, disposal cost, and schedule. This report presents an empirical kinetic model that can be used to estimate the aluminate production from the leaching of boehmite in Hanford waste as a function of the following parameters: (1) hydroxide concentration; (2) temperature; (3) specific surface area of boehmite; (4) initial soluble aluminate plus gibbsite present in waste; (5) concentration of boehmite in the waste; and (6) (pre-fit) Arrhenius kinetic parameters. The model was fit to laboratory, non-radioactive (e.g. 'simulant boehmite') leaching results, providing best-fit values of the Arrhenius A-factor, A, and apparent activation energy, E_A, of A = 5.0 × 10¹² h⁻¹ and E_A = 90 kJ/mol. These parameters were then used to predict boehmite leaching behavior observed in previously reported actual waste leaching studies. Acceptable aluminate versus leaching time profiles were predicted for waste leaching data from both Hanford and Savannah River site studies.
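
    With the reported best-fit parameters, the Arrhenius rate constant follows directly, and a rough leach estimate can be attached to it. Treating dissolution as first order in the undissolved boehmite, as below, is a simplification for illustration: the report's model also depends on hydroxide concentration and specific surface area, which are folded into the rate constant here.

    ```python
    import math

    A, EA, R = 5.0e12, 90_000.0, 8.314   # 1/h, J/mol, J/(mol K); A and EA from the report

    def k_per_hour(T_kelvin):
        """Arrhenius rate constant k(T) = A * exp(-E_A / (R T))."""
        return A * math.exp(-EA / (R * T_kelvin))

    for T_C in (60, 80, 100):
        k = k_per_hour(T_C + 273.15)
        frac_left = math.exp(-k * 8.0)   # assumed first-order, 8-hour leach
        print(f"{T_C:>3} degC: k = {k:.3e} 1/h, boehmite left after 8 h = {frac_left:.2f}")
    ```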

  16. New process model proves accurate in tests on catalytic reformer

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar-Rodriguez, E.; Ancheyta-Juarez, J. (Inst. Mexicano del Petroleo, Mexico City (Mexico))

    1994-07-25

    A mathematical model has been devised to represent the process that takes place in a fixed-bed, tubular, adiabatic catalytic reforming reactor. Since its development, the model has been applied to the simulation of a commercial semiregenerative reformer. The development of mass and energy balances for this reformer led to a model that predicts both concentration and temperature profiles along the reactor. A comparison of the model's results with experimental data illustrates its accuracy at predicting product profiles. Simple steps show how the model can be applied to simulate any fixed-bed catalytic reformer.
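
    The structure of such a model, coupled mass and energy balances marched along an adiabatic fixed bed, can be sketched with a single lumped first-order reaction. The kinetic constants and the adiabatic temperature drop below are placeholders, not values from the published reformer model.

    ```python
    import math

    n_steps, L = 1000, 10.0        # axial cells, bed length (m)
    dz = L / n_steps
    T_in, dT_ad = 770.0, -60.0     # inlet temperature (K); endothermic drop at full conversion
    k0, Ea_R = 4.0e5, 9000.0       # pre-exponential (1/m), activation temperature Ea/R (K)

    T, x = T_in, 0.0
    for _ in range(n_steps):
        k = k0 * math.exp(-Ea_R / T)      # local rate constant
        x += k * (1.0 - x) * dz           # species balance (lumped conversion)
        T = T_in + dT_ad * x              # adiabatic energy balance
    print(f"outlet conversion {x:.3f}, outlet temperature {T:.1f} K")
    ```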

  17. Managing the TDM process : developing MPO institutional capacity - technical report.

    Science.gov (United States)

    2015-04-01

    Within Texas, the development of urban travel demand models (TDMs) is a cooperative process between the : Texas Department of Transportation and Metropolitan Planning Organizations (MPOs). Though TxDOT-Transportation Planning and Programming Division...

  18. Model and Analytic Processes for Export License Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.; Wood, Thomas W.; Daly, Don S.; Brothers, Alan J.; Sanfilippo, Antonio P.; Cook, Diane; Holder, Larry

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that, alone or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided according to whether they could support micro-level assessment (e.g., help improve individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An

  19. A Business Process Model and Reengineering Plan for the Student Services Department of the Marine Corps Institute

    National Research Council Canada - National Science Library

    Baden, Kurt

    1997-01-01

    ...). The objective of this thesis is to develop the As-Is process model, redesign the processes to increase efficiency and reduce costs, and develop a To-Be process model to improve the current business processes...

  20. Finite Element Modeling of Adsorption Processes for Gas Separation and Purification

    International Nuclear Information System (INIS)

    Humble, Paul H.; Williams, Richard M.; Hayes, James C.

    2009-01-01

    Pacific Northwest National Laboratory (PNNL) has expertise in the design and fabrication of automated radioxenon collection systems for nuclear explosion monitoring. In developing new systems there is an ever present need to reduce size, power consumption and complexity. Most of these systems have used adsorption based techniques for gas collection and/or concentration and purification. These processes include pressure swing adsorption, vacuum swing adsorption, temperature swing adsorption, gas chromatography and hybrid processes that combine elements of these techniques. To better understand these processes, and help with the development of improved hardware, a finite element software package (COMSOL Multiphysics) has been used to develop complex models of these adsorption based operations. The partial differential equations used include a mass balance for each gas species and adsorbed species along with a convection conduction energy balance equation. These equations in conjunction with multicomponent temperature dependent isotherm models are capable of simulating separation processes ranging from complex multibed PSA processes, and multicomponent temperature programmed gas chromatography, to simple two component temperature swing adsorption. These numerical simulations have been a valuable tool for assessing the capability of proposed processes and optimizing hardware and process parameters.
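
    A minimal fixed-bed sketch in the same spirit: an upwind species balance along the bed, linear-driving-force uptake, and a Langmuir isotherm, which together produce a breakthrough curve. All parameter values are illustrative and are not those of the PNNL systems or of their COMSOL models.

    ```python
    import numpy as np

    nz, L, dt, t_end = 100, 0.2, 0.01, 200.0   # grid cells, bed length (m), time step/end (s)
    u, eps = 0.02, 0.4                          # interstitial velocity (m/s), bed porosity
    qmax, b, k_ldf = 0.01, 5.0, 0.05            # Langmuir q_max (mol/kg), b (m3/mol), LDF rate (1/s)
    rho_b, c_feed = 500.0, 1.0                  # bulk solid density (kg/m3), feed conc (mol/m3)

    dz = L / nz
    F = rho_b * (1.0 - eps) / eps               # solid-to-fluid capacity factor
    c, q = np.zeros(nz), np.zeros(nz)
    for _ in range(int(t_end / dt)):
        q_eq = qmax * b * c / (1.0 + b * c)     # Langmuir equilibrium loading
        dq = k_ldf * (q_eq - q)                 # linear-driving-force uptake
        c_up = np.concatenate(([c_feed], c[:-1]))
        c += dt * (-u * (c - c_up) / dz - F * dq)   # upwind convection + uptake sink
        q += dt * dq
    print(f"outlet/feed concentration after {t_end:.0f} s: {c[-1] / c_feed:.2f}")
    ```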