WorldWideScience

Sample records for model development process

  1. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering...... activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration...... with a major international engineering company....

  2. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.
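
    For illustration, the sketch below shows a minimal timed Petri net simulation of the kind such an approach relies on: places hold tokens, transitions consume and produce tokens after a firing delay. The four-phase net and the delays are invented for demonstration and are not the authors' model.

```python
# Minimal sketch of a timed Petri net simulation (illustrative net and delays,
# not the model described in the abstract above).
import heapq

# Each transition: (name, input places, output places, firing delay)
TRANSITIONS = [
    ("analyse", {"req": 1},   {"spec": 1},  5.0),
    ("design",  {"spec": 1},  {"arch": 1},  8.0),
    ("code",    {"arch": 1},  {"build": 1}, 12.0),
    ("test",    {"build": 1}, {"done": 1},  6.0),
]

def simulate(marking, transitions):
    """Fire enabled transitions, queueing their completions by firing delay."""
    clock, pending = 0.0, []              # pending = (finish_time, outputs)
    while True:
        fired = False
        for name, inputs, outputs, delay in transitions:
            if all(marking.get(p, 0) >= n for p, n in inputs.items()):
                for p, n in inputs.items():     # consume input tokens
                    marking[p] -= n
                heapq.heappush(pending, (clock + delay, tuple(outputs.items())))
                fired = True
        if not fired and not pending:
            break                          # nothing enabled, nothing pending
        if pending:
            clock, outputs = heapq.heappop(pending)
            for p, n in outputs:           # deposit output tokens
                marking[p] = marking.get(p, 0) + n
    return clock, marking

finish, final_marking = simulate({"req": 1}, TRANSITIONS)
print(f"process finished at t={finish}, marking={final_marking}")
```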

  3. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
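
    As a minimal illustration of failure-effect propagation between failure modes and observation points, the sketch below encodes a hypothetical two-failure, two-sensor system as a directed graph and derives which observation points can see each failure mode. The example system and its propagation paths are assumptions, not part of the AGSM/IHM models.

```python
# Minimal sketch of qualitative failure-effect propagation in the spirit of a
# functional fault model; the small feed-line system here is hypothetical.
from collections import deque

edges = {                      # effect propagation paths
    "valve_stuck":  ["no_flow"],
    "pump_failure": ["no_flow", "low_pressure"],
    "no_flow":      ["flow_sensor"],
    "low_pressure": ["pressure_sensor"],
}
observation_points = {"flow_sensor", "pressure_sensor"}
failure_modes = {"valve_stuck", "pump_failure"}

def observable_effects(source):
    """Breadth-first search: which observation points see this failure mode?"""
    seen, queue = set(), deque([source])
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen & observation_points

# Build a qualitative failure mode -> observation mapping
for fm in sorted(failure_modes):
    print(fm, "->", sorted(observable_effects(fm)))
```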

  4. A Software Development Simulation Model of a Spiral Process

    OpenAIRE

    Carolyn Mizell; Linda Malone

    2009-01-01

    This paper will present a discrete event simulation model of a spiral development lifecycle that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process. There is a need for simulation models of software development processes other than the waterfall due to new processes becoming more widely used in order to overcome the limitations of the traditional waterfall lifecycle. The use of a spiral process can make the inherently difficult job of...

  5. A Software Development Simulation Model of a Spiral Process

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.

  6. Modelling Template for the Development of the Process Flowsheet

    DEFF Research Database (Denmark)

    Fedorova, Marina; Gani, Rafiqul

    2015-01-01

    Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and point to candidates...... in connection to other modelling tools within the modelling framework are forming a user-friendly system, which will make the model development process easier and faster and provide the way for unified and consistent model documentation. The modeller can use the template for their specific problem or to extend...... models systematically, efficiently and reliably. In this way, development of products and processes can be faster, cheaper and very efficient. The developed modelling framework involves three main parts: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which...

  7. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
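
    A much-simplified Monte Carlo sketch of the comparison described above is shown below: a single waterfall pass versus several spiral iterations, each adding a risk-assessment step and a reduced repeat of the waterfall phases. All durations, the iteration count, and the scaling factor are assumptions for illustration; they are not inputs or outputs of the PATT-based model.

```python
# Illustrative Monte Carlo comparison of waterfall vs. spiral schedule
# (assumed phase durations in weeks; not the parameters of the PATT model).
import random

WATERFALL_PHASES = {"requirements": 8, "design": 10, "code": 14, "test": 8}
SPIRAL_ITERATIONS = 4        # each iteration repeats a reduced waterfall
SPIRAL_SCALE = 0.35          # fraction of full-phase effort per iteration
RISK_ASSESSMENT = 2          # extra weeks per iteration

def sample_duration(mean):
    """Triangular noise around the nominal duration."""
    return random.triangular(0.7 * mean, 1.5 * mean, mean)

def waterfall():
    return sum(sample_duration(m) for m in WATERFALL_PHASES.values())

def spiral():
    total = 0.0
    for _ in range(SPIRAL_ITERATIONS):
        total += RISK_ASSESSMENT
        total += sum(sample_duration(SPIRAL_SCALE * m)
                     for m in WATERFALL_PHASES.values())
    return total

random.seed(1)
runs = 5000
wf = sorted(waterfall() for _ in range(runs))
sp = sorted(spiral() for _ in range(runs))
print(f"waterfall median: {wf[runs // 2]:.1f} weeks")
print(f"spiral    median: {sp[runs // 2]:.1f} weeks")
```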

  8. Process development

    Energy Technology Data Exchange (ETDEWEB)

    Schuegerl, K

    1984-01-01

    The item 'process development' comprises the production of acetone/butanol with C. acetobutylicum and the yeasting of potato waste. The target is to increase productivity by taking the following measures: optimization of media, on-line process analysis, analysis of reaction, mathematical modelling and identification of parameters, process simulation, development of a state estimator with the help of the on-line process analysis and the model, and optimization and adaptive control.

  9. Comparing single- and dual-process models of memory development.

    Science.gov (United States)

    Hayes, Brett K; Dunn, John C; Joubert, Amy; Taylor, Robert

    2017-11-01

    This experiment examined single-process and dual-process accounts of the development of visual recognition memory. The participants, 6-7-year-olds, 9-10-year-olds and adults, were presented with a list of pictures which they encoded under shallow or deep conditions. They then made recognition and confidence judgments about a list containing old and new items. We replicated the main trends reported by Ghetti and Angelini in that recognition hit rates increased from 6 to 9 years of age, with larger age changes following deep than shallow encoding. Formal versions of the dual-process high threshold signal detection model and several single-process models (equal variance signal detection, unequal variance signal detection, mixture signal detection) were fit to the developmental data. The unequal variance and mixture signal detection models gave a better account of the data than either of the other models. A state-trace analysis found evidence for only one underlying memory process across the age range tested. These results suggest that single-process memory models based on memory strength are a viable alternative to dual-process models for explaining memory development. © 2016 John Wiley & Sons Ltd.
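
    For readers unfamiliar with the model classes being compared, the sketch below computes the two parameters of the equal-variance signal detection model (sensitivity d' and criterion c) from hit and false-alarm rates; the rates used are made-up numbers, not data from this experiment.

```python
# Equal-variance signal detection sketch: d' and criterion from hit/FA rates.
# The rates below are hypothetical, not the study's data.
from scipy.stats import norm

def sdt_parameters(hit_rate, fa_rate):
    """Equal-variance Gaussian model: d' = z(H) - z(FA), c = -(z(H) + z(FA)) / 2."""
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa
    criterion = -0.5 * (z_hit + z_fa)
    return d_prime, criterion

# Hypothetical hit/false-alarm rates for two age groups under deep encoding
for group, (h, fa) in {"6-7 years": (0.70, 0.25), "adults": (0.88, 0.20)}.items():
    d, c = sdt_parameters(h, fa)
    print(f"{group}: d'={d:.2f}, c={c:.2f}")
```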

  10. A neuroconstructivist model of past tense development and processing.

    Science.gov (United States)

    Westermann, Gert; Ruh, Nicolas

    2012-07-01

    We present a neural network model of learning and processing the English past tense that is based on the notion that experience-dependent cortical development is a core aspect of cognitive development. During learning the model adds and removes units and connections to develop a task-specific final architecture. The model provides an integrated account of characteristic errors during learning the past tense, adult generalization to pseudoverbs, and dissociations between verbs observed after brain damage in aphasic patients. We put forward a theory of verb inflection in which a functional processing architecture develops through interactions between experience-dependent brain development and the structure of the environment, in this case, the statistical properties of verbs in the language. The outcome of this process is a structured processing system giving rise to graded dissociations between verbs that are easy and verbs that are hard to learn and process. In contrast to dual-mechanism accounts of inflection, we argue that describing dissociations as a dichotomy between regular and irregular verbs is a post hoc abstraction and is not linked to underlying processing mechanisms. We extend current single-mechanism accounts of inflection by highlighting the role of structural adaptation in development and in the formation of the adult processing system. In contrast to some single-mechanism accounts, we argue that the link between irregular inflection and verb semantics is not causal and that existing data can be explained on the basis of phonological representations alone. This work highlights the benefit of taking brain development seriously in theories of cognitive development. Copyright 2012 APA, all rights reserved.

  11. The Development and Application of an Integrated VAR Process Model

    Science.gov (United States)

    Ballantyne, A. Stewart

    2016-07-01

    The VAR ingot has been the focus of several modelling efforts over the years with the result that the thermal regime in the ingot can be simulated quite realistically. Such models provide important insight into solidification of the ingot but present some significant challenges to the casual user such as a process engineer. To provide the process engineer with a tool to assist in the development of a melt practice, a comprehensive model of the complete VAR process has been developed. A radiation heat transfer simulation of the arc has been combined with electrode and ingot models to develop a platform which accepts typical operating variables (voltage, current, and gap) together with process parameters (electrode size, crucible size, orientation, water flow, etc.) as input data. The output consists of heat flow distributions and solidification parameters in the form of text, comma-separated value, and visual toolkit files. The resulting model has been used to examine the relationship between the assumed energy distribution in the arc and the actual energy flux which arrives at the ingot top surface. Utilizing heat balance information generated by the model, the effects of electrode-crucible orientation and arc gap have been explored with regard to the formation of ingot segregation defects.

  12. An innovative service process development based on a reference model

    Directory of Open Access Journals (Sweden)

    Lorenzo Sanfelice Frazzon

    2015-06-01

    This article examines the new service development (NSD) process, focusing specifically on a case of a financial service, guided by the following research questions: what are the processes and practices used in the development and design of new financial services? How do the results of the financial NSD proposal reflect on the NSD area as a whole? The study aims to show and describe a financial service development conducted at Helpinveste. The paper focuses on the Conceptual Design phase (activities: definition of specifications and development of alternative solutions for the service) and the Service Process Design (service representation) phase. The methodological procedures are based on the process approach, using a reference model for developing new services. In order to operationalize the model, several techniques were used at the various stages of the project, e.g. QFD and Service Blueprint. Lastly, the conclusions report both theoretical and practical contributions from the reference model application, as well as the limitations and recommendations for further research.

  13. 3D physical modeling for patterning process development

    Science.gov (United States)

    Sarma, Chandra; Abdo, Amr; Bailey, Todd; Conley, Will; Dunn, Derren; Marokkey, Sajan; Talbi, Mohamed

    2010-03-01

    In this paper we will demonstrate how a 3D physical patterning model can act as a forensic tool for OPC and ground-rule development. We discuss examples where the 2D modeling shows no issues in printing gate lines but 3D modeling shows severe resist loss in the middle. In the absence of corrective measures, there is a high likelihood of line discontinuity post etch. Such early insight into process limitations of prospective ground rules can be invaluable for early technology development. We will also demonstrate how the root cause of a broken poly line after etch could be traced to resist necking in the region of the STI step with the help of 3D models. We discuss different cases of metal and contact layouts where 3D modeling gives an early insight into technology limitations. In addition, such a 3D physical model could be used for early resist evaluation and selection for required ground-rule challenges, which can substantially reduce the cycle time for process development.

  14. Development of climate data storage and processing model

    Science.gov (United States)

    Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.

    2016-11-01

    We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on socio-economic processes on local and regional scales. The model is based on a «shared nothing» distributed computing architecture and assumes a computing network where each computing node is independent and self-sufficient. Each node holds dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data is represented by collections of netCDF files stored in a hierarchy of directories within a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed according to the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for the development of a high-performance virtual research environment for climatic and environmental monitoring.
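
    The sketch below illustrates the metadata-database idea with a minimal SQLite catalogue: each row records the variable, node, path, and space-time extent of a netCDF collection, and a query finds the collections covering a requested point and period. The schema and entries are illustrative assumptions, not the project's actual design.

```python
# Minimal metadata catalogue sketch for netCDF collections (assumed schema
# and entries; not the actual database of the VRE described above).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE datasets (
    name TEXT, variable TEXT, node TEXT, path TEXT,
    t_start TEXT, t_end TEXT,
    lat_min REAL, lat_max REAL, lon_min REAL, lon_max REAL)""")
con.executemany("INSERT INTO datasets VALUES (?,?,?,?,?,?,?,?,?,?)", [
    ("ERA_T2M_SIB",  "t2m",  "node-01", "/data/era/t2m/siberia",
     "1979-01-01", "2015-12-31", 50, 75, 60, 140),
    ("ERA_PREC_SIB", "prec", "node-02", "/data/era/prec/siberia",
     "1979-01-01", "2015-12-31", 50, 75, 60, 140),
])

# Find collections covering the point and period requested by the user
rows = con.execute("""
    SELECT name, node, path FROM datasets
    WHERE variable = ? AND t_start <= ? AND t_end >= ?
      AND lat_min <= ? AND lat_max >= ? AND lon_min <= ? AND lon_max >= ?""",
    ("t2m", "2000-01-01", "2010-12-31", 56.5, 56.5, 85.0, 85.0)).fetchall()
print(rows)
```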

  15. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  16. How can Product Development Process Modelling be made more useful?

    DEFF Research Database (Denmark)

    Wynn, David C; Maier, Anja; Clarkson, John P

    2010-01-01

    and on the way they are applied. The paper draws upon established principles of cybernetic systems in an attempt to explain the role played by process modelling in operating and improving PD processes. We use this framework to identify eight key factors which influence the utility of modelling in the context...... of use. Further, we indicate how these factors can be interpreted to identify opportunities to improve modelling utility. The paper is organised as follows. Section 2 provides background and motivation for the paper by discussing an example of PD process modelling practice. After highlighting from......, and the process being modelled. Section 5 draws upon established principles of cybernetic systems theory to incorporate this view in an explanation of the role of modelling in PD process operation and improvement. This framework is used to define modelling utility and to progressively identify influences upon it...

  17. Modelling of transport and biogeochemical processes in pollution plumes: Literature review of model development

    DEFF Research Database (Denmark)

    Brun, A.; Engesgaard, Peter Knudegaard

    2002-01-01

    A literature survey shows how biogeochemical (coupled organic and inorganic reaction processes) transport models are based on considering the complete biodegradation process as either a single- or as a two-step process. It is demonstrated that some two-step process models rely on the Partial...... Equilibrium Approach (PEA). The PEA assumes the organic degradation step, and not the electron acceptor consumption step, is rate limiting. This distinction is not possible in one-step process models, where consumption of both the electron donor and acceptor are treated kinetically. A three-dimensional, two......-step PEA model is developed. The model allows for Monod kinetics and biomass growth, features usually included only in one-step process models. The biogeochemical part of the model is tested for a batch system with degradation of organic matter under the consumption of a sequence of electron acceptors...

  18. An Implicit Model Development Process for Bounding External, Seemingly Intangible/Non-Quantifiable Factors

    Science.gov (United States)

    2017-06-01

    This research expands the modeling and simulation (M and S) body of knowledge through the development of an Implicit Model Development Process (IMDP). ... When augmented to traditional Model Development Processes (MDP), the IMDP enables the development of models that can address a broader array of ... potential impacts on operational effectiveness. Specifically, the IMDP provides a formalized methodology for developing an improved model definition

  19. Process development

    International Nuclear Information System (INIS)

    Zapata G, G.

    1989-01-01

    Process development: The paper describes the organization and laboratory facilities of the group working on radioactive ore processing studies. It contains a review of the research carried out and the plans for the near future. A list of the published reports is also presented

  20. Integrated approaches to the application of advanced modeling technology in process development and optimization

    Energy Technology Data Exchange (ETDEWEB)

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E. [Massachusetts Institute of Technology, Cambridge, MA (United States)] [and others]

    1995-12-31

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  1. Development of an equipment management model to improve effectiveness of processes

    International Nuclear Information System (INIS)

    Chang, H. S.; Ju, T. Y.; Song, T. Y.

    2012-01-01

    The nuclear industries have developed and are trying to create a performance model to improve the effectiveness of the processes implemented at nuclear plants in order to enhance performance. Most high-performing nuclear stations seek to continually improve the quality of their operations by identifying and closing important performance gaps. Thus, many utilities have implemented performance models adjusted to their plant's configuration and have instituted policies for such models. KHNP is developing a standard performance model to integrate the engineering processes and to improve the inter-relation among processes. The model, called the Standard Equipment Management Model (SEMM), is under development, focusing first on engineering processes and performance improvement processes related to plant equipment used at the site. This model includes performance indicators for each process that allow evaluating and comparing process performance among the 21 operating units. The model will later be expanded to incorporate cost and management processes. (authors)

  2. Modelling coupled microbial processes in the subsurface: Model development, verification, evaluation and application

    Science.gov (United States)

    Masum, Shakil A.; Thomas, Hywel R.

    2018-06-01

    To study subsurface microbial processes, a coupled model which has been developed within a Thermal-Hydraulic-Chemical-Mechanical (THCM) framework is presented. The work presented here, focuses on microbial transport, growth and decay mechanisms under the influence of multiphase flow and bio-geochemical reactions. In this paper, theoretical formulations and numerical implementations of the microbial model are presented. The model has been verified and also evaluated against relevant experimental results. Simulated results show that the microbial processes have been accurately implemented and their impacts on porous media properties can be predicted either qualitatively or quantitatively or both. The model has been applied to investigate biofilm growth in a sandstone core that is subjected to a two-phase flow and variable pH conditions. The results indicate that biofilm growth (if not limited by substrates) in a multiphase system largely depends on the hydraulic properties of the medium. When the change in porewater pH which occurred due to dissolution of carbon dioxide gas is considered, growth processes are affected. For the given parameter regime, it has been shown that the net biofilm growth is favoured by higher pH; whilst the processes are considerably retarded at lower pH values. The capabilities of the model to predict microbial respiration in a fully coupled multiphase flow condition and microbial fermentation leading to production of a gas phase are also demonstrated.

  3. Image Processing of Welding Procedure Specification and Pre-process program development for Finite Element Modelling

    International Nuclear Information System (INIS)

    Kim, K. S.; Lee, H. J.

    2009-11-01

    The PRE-WELD program, which automatically generates the input file for finite element analysis of 2D butt welding at the dissimilar metal weld part, was developed. This program is a pre-processing program for the FEM code used to analyze the residual stress at the welding parts. Even if users do not have detailed knowledge of FEM modelling, they can easily create the ABAQUS input by entering the shape data of the welding part and the welding parameters such as weld current and voltage. By using the PRE-WELD program, we can greatly reduce the time and effort needed to prepare the ABAQUS input for residual stress analysis at the welding parts, and create accurate input without human error

  4. Competency Model 101. The Process of Developing Core Competencies.

    Science.gov (United States)

    Eichelberger, Lisa Wright; Hewlett, Peggy O'Neill

    1999-01-01

    The Mississippi Competency Model defines nurses' roles as provider (caregiver, teacher, counselor, advocate), professional (scholar, collaborator, ethicist, researcher), and manager (leader, facilitator, intrapreneur, decision maker, technology user) for four levels of nursing: licensed practical nurse, associate degree, bachelor's degree, and…

  5. DEVELOPING A MATHEMATICAL MODEL FOR THE PROCESS OF SEDIMENTARY TANKS

    Directory of Open Access Journals (Sweden)

    Valeria Victoria IOVANOV

    2013-05-01

    The model is reformulated by means of stochastic differential equations, and the parameters are estimated by a maximum likelihood method. VESILIND (1968; 1979) proposed a sludge settling velocity model of exponential form. During recent years, several refinements to the original model have been proposed, see e.g. GRIJSPEERDT et al. (1995), DUPONT and DAHL (1995), EKAMA et al. (1997). In the proposed models several layers in the settling tank are incorporated to permit the calculation of SS profiles over the tank depth and predict the SS concentrations in the return sludge and in the effluent from the clarifier. Here, the original VESILIND model combined with a simple suction depth model is used to enable prediction of the SS concentration in the effluent from the tank. In order to make the model applicable for real-time control purposes, only two layers of variable height in the tank are considered
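
    For reference, the exponential Vesilind settling-velocity model mentioned above, and the gravity solids flux it implies for a settler layer, can be written as in the sketch below; the parameter values are typical textbook magnitudes, not those estimated in the paper.

```python
# Vesilind settling-velocity sketch, v_s = v0 * exp(-n * X), with the
# resulting gravity solids flux. Parameter values are assumed, not fitted.
import math

V0 = 7.0      # maximum settling velocity, m/h (assumed)
N  = 0.45     # settling parameter, m^3/kg (assumed)

def settling_velocity(x):
    """Hindered settling velocity at suspended-solids concentration x (kg/m^3)."""
    return V0 * math.exp(-N * x)

def gravity_flux(x):
    """Solids flux J = v_s(X) * X carried downwards by settling."""
    return settling_velocity(x) * x

for x in (1.0, 3.0, 6.0, 9.0):
    print(f"X = {x:4.1f} kg/m3  v_s = {settling_velocity(x):5.2f} m/h  "
          f"J = {gravity_flux(x):5.2f} kg/(m2 h)")
```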

  6. Understanding Personality Development: An Integrative State Process Model

    Science.gov (United States)

    Geukes, Katharina; van Zalk, Maarten; Back, Mitja D.

    2018-01-01

    While personality is relatively stable over time, it is also subject to change across the entire lifespan. On a macro-analytical level, empirical research has identified patterns of normative and differential development that are affected by biological and environmental factors, specific life events, and social role investments. On a…

  7. Integration of Fast Predictive Model and SLM Process Development Chamber, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This STTR project seeks to develop a fast predictive model for selective laser melting (SLM) processes and then integrate that model with an SLM chamber that allows...

  8. Strategic Alliance Development - A Process Model A Case Study Integrating Elements of Strategic Alliances

    OpenAIRE

    Mohd Yunos, Mohd Bulkiah

    2007-01-01

    There has been an enormous increase in the formation of strategic alliances, and in the research efforts devoted to understanding the alliance development process, over the last few decades. However, the critical elements that influence each stage of alliance development are as yet unexplored. This dissertation aims to fill this gap by introducing an integrated process model of strategic alliance development and its critical elements. The process model for strategic alliance developm...

  9. Development of transformations from business process models to implementations by reuse

    NARCIS (Netherlands)

    Dirgahayu, T.; Quartel, Dick; van Sinderen, Marten J.; Ferreira Pires, Luis; Hammoudi, S.

    2007-01-01

    This paper presents an approach for developing transformations from business process models to implementations that facilitates reuse. A transformation is developed as a composition of three smaller tasks: pattern recognition, pattern realization and activity transformation. The approach allows one

  10. Towards a Business Process Modeling Technique for Agile Development of Case Management Systems

    Directory of Open Access Journals (Sweden)

    Ilia Bider

    2017-12-01

    A modern organization needs to adapt its behavior to changes in the business environment by changing its Business Processes (BP) and corresponding Business Process Support (BPS) systems. One way of achieving such adaptability is via separation of the system code from the process description/model by applying the concept of executable process models. Furthermore, to ease the introduction of changes, such a process model should separate different perspectives, for example, control-flow, human resources, and data perspectives, from each other. In addition, for developing a completely new process, it should be possible to start with a reduced process model to get a BPS system quickly running, and then continue to develop it in an agile manner. This article consists of two parts: the first sets requirements on modeling techniques that could be used in tools that support agile development of BPs and BPS systems. The second part suggests a business process modeling technique that allows modeling to start with the data/information perspective, which would be appropriate for processes supported by Case or Adaptive Case Management (CM/ACM) systems. In a model produced by this technique, called a data-centric business process model, a process instance/case is defined as a sequence of states in a specially designed instance database, while the process model is defined as a set of rules that set restrictions on allowed states and transitions between them. The article details the background for the project of developing the data-centric process modeling technique, presents the outline of the structure of the model, and gives formal definitions for a substantial part of the model.
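
    A minimal sketch of the data-centric idea, with a case as a sequence of states and the process model as rules restricting transitions, is given below; the claim-handling states and rules are invented purely for illustration.

```python
# Data-centric process sketch: a case is a sequence of states, and the model
# is a set of rules restricting transitions. Example states/rules are invented.
from typing import Callable, Dict, List

State = Dict[str, object]

RULES: List[Callable[[State, State], bool]] = [
    # a claim may only be paid after it has been assessed
    lambda old, new: not (new.get("paid") and not new.get("assessed")),
    # the claimed amount may not change once assessment has been done
    lambda old, new: not (old.get("assessed") and new["amount"] != old["amount"]),
]

def transition(case: List[State], new_state: State) -> None:
    """Append new_state to the case history if every rule allows it."""
    old = case[-1]
    if all(rule(old, new_state) for rule in RULES):
        case.append(new_state)
    else:
        raise ValueError(f"transition rejected: {new_state}")

case = [{"amount": 900, "assessed": False, "paid": False}]
transition(case, {"amount": 900, "assessed": True, "paid": False})
transition(case, {"amount": 900, "assessed": True, "paid": True})
print(case)
```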

  11. Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development

    Science.gov (United States)

    Xu, Fei; Zhang, Yaning; Jin, Guangri; Li, Bingxi; Kim, Yong-Song; Xie, Gongnan; Fu, Zhongbin

    2018-04-01

    A three-phase model capable of predicting the heat transfer and moisture migration during the soil freezing process was developed based on the Shen-Chen model and the mechanisms of heat and mass transfer in unsaturated soil freezing. The pre-melted film was taken into consideration, and the relationship between film thickness and soil temperature was used to calculate the liquid water fraction in both the frozen zone and the freezing fringe. The force that causes the moisture migration was calculated as the sum of several interactive forces, and the suction in the pre-melted film was regarded as an interactive force between ice and water. Two kinds of resistance were regarded as a kind of body force related to the water films between the ice grains and soil grains, and a block force instead of gravity was introduced to keep balance with gravity before soil freezing. The lattice Boltzmann method was used in the simulation, and the input variables for the simulation included the size of the computational domain, obstacle fraction, liquid water fraction, air fraction and soil porosity. The model is capable of predicting the water content distribution along soil depth and variations in water content and temperature during the soil freezing process.

  12. Green Pea and Garlic Puree Model Food Development for Thermal Pasteurization Process Quality Evaluation.

    Science.gov (United States)

    Bornhorst, Ellen R; Tang, Juming; Sablani, Shyam S; Barbosa-Cánovas, Gustavo V; Liu, Fang

    2017-07-01

    Development and selection of model foods is a critical part of microwave thermal process development, simulation validation, and optimization. Previously developed model foods for pasteurization process evaluation utilized Maillard reaction products as the time-temperature integrators, which resulted in similar temperature sensitivity among the models. The aim of this research was to develop additional model foods based on different time-temperature integrators, determine their dielectric properties and color change kinetics, and validate the optimal model food in hot water and microwave-assisted pasteurization processes. Color, quantified using the a* value, was selected as the time-temperature indicator for green pea and garlic puree model foods. Results showed 915 MHz microwaves had a greater penetration depth into the green pea model food than the garlic. a* value reaction rates for the green pea model were approximately 4 times slower than in the garlic model food; slower reaction rates were preferred for the application of model food in this study, that is, quality evaluation for a target process of 90 °C for 10 min at the cold spot. Pasteurization validation used the green pea model food and results showed that there were quantifiable differences between the color of the unheated control, hot water pasteurization, and microwave-assisted thermal pasteurization system. Both model foods developed in this research could be utilized for quality assessment and optimization of various thermal pasteurization processes. © 2017 Institute of Food Technologists®.
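
    The sketch below shows the usual form for such a colour-based time-temperature integrator: first-order change of the a* value with an Arrhenius temperature dependence of the rate constant. All parameter values are assumed for illustration, not the kinetics fitted for the green pea or garlic model foods.

```python
# First-order colour kinetics with Arrhenius temperature dependence.
# All constants are illustrative assumptions, not fitted values.
import math

A_REF = -6.0        # initial a* (green)
A_INF = 2.0         # fully converted a*
K_REF = 0.08        # rate constant at T_ref, 1/min (assumed)
EA    = 90e3        # activation energy, J/mol (assumed)
T_REF = 363.15      # reference temperature, K (90 degC)
R     = 8.314

def rate_constant(temp_k):
    """Arrhenius shift of the rate constant relative to T_REF."""
    return K_REF * math.exp(-EA / R * (1.0 / temp_k - 1.0 / T_REF))

def a_value(t_min, temp_k):
    """First-order approach of a* towards its final value at constant temperature."""
    k = rate_constant(temp_k)
    return A_INF + (A_REF - A_INF) * math.exp(-k * t_min)

for t in (0, 5, 10, 20):
    print(f"t = {t:2d} min at 90 degC -> a* = {a_value(t, 363.15):5.2f}")
```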

  13. Development and implementation of a process model for improvement in manufacturing organisations

    International Nuclear Information System (INIS)

    Ideboeen, F.; Varildengen, R.

    1998-01-01

    The Institute for Information Technology has developed a holistic and analytic model to improve the competitive power of organisations. The goals for the work were to develop a practical and holistic tool. The Process Model is a general model for organisations and has to be adjusted to each organisation. It is based on the fact that products are created while they go through the value creating processes (what the customer is willing to pay for). All products and services can be considered a service for the customer. The product itself has less value, but the customer is interested in what a product can provide, including status. The organisation is looked upon as a system which, in turn, is an independent group of items, people, or processes working together toward a common purpose. A process is a set of causes and conditions that repeatedly occur in a sequential series of steps to transform inputs to outputs. This model divides the company into 3 major process groups: value creating processes, management processes, and support processes. Value creating processes are activities that the customer is willing to pay for. Management processes are the long term processes to obtain optimal profitability through satisfied customers and employees, both in the present and in the future. Support processes are those processes necessary to support the value creating processes. By using the Process Model a company can re-engineer processes and the linkage between them and take out unnecessary processes. One can also work with one process individually. The main goal is to have a model of the company and an overview of the processes and the linkage between them. Changes have to be predicted and the consequences foreseen within the model

  14. Development of a Systems Engineering Model of the Chemical Separations Process

    International Nuclear Information System (INIS)

    Sun, Lijian; Li, Jianhong; Chen, Yitung; Clarksean, Randy; Ladler, Jim; Vandergrift, George

    2002-01-01

    Work is being performed to develop a general-purpose systems engineering model for the AAA separation process. The work centers on the development of a new user interface for the AMUSE code and on the specification of a systems engineering model. This paper presents background information and an overview of work completed to date. (authors)

  15. Development of flexible process-centric web applications: An integrated model driven approach

    NARCIS (Netherlands)

    Bernardi, M.L.; Cimitile, M.; Di Lucca, G.A.; Maggi, F.M.

    2012-01-01

    In recent years, Model Driven Engineering (MDE) approaches have been proposed and used to develop and evolve Web Applications (WAs). However, the definition of appropriate MDE approaches for the development of flexible process-centric WAs is still limited. In particular, (flexible) workflow models have never been

  16. DEVELOPMENT OF PERFORMANCE MODEL FOR QUALITY AND PROCESS IMPROVEMENT IN BUSINESS PROCESS SERVICE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Samson Oludapo

    2017-06-01

    When it comes to performance improvement processes, the literature abounds with lean, agile and lean-agile. Over the years, the implementation of lean and agile improvement processes has met with resounding success in the manufacturing, production, and construction industries. For this reason, there is an interest in developing a performance process for the business process service industry incorporating the key aspects of lean and agile theory extracted from the extant literature. The researcher reviewed a total of 750 scholarly articles, grouped them according to their relationship to the central theme (lean or agile), and thereafter used factor analysis under the principal component method to explain the relationships among the items. The result of this study showed that firms focusing on cost will minimize the investment of resources in business operations; this, in turn, will lead to difficulties in responding to changing customer requirements in terms of volume, delivery, and new products. The implication is that in the long run a cost-focus strategy negatively influences flexibility.
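
    The grouping step can be illustrated with a principal-component factor analysis on synthetic survey items, as in the sketch below; the two underlying factors ("lean" and "agile") and the item loadings are simulated, not the study's data.

```python
# Principal-component factor analysis sketch on synthetic survey items
# (two simulated factors, "lean" and "agile"; not the study's data).
import numpy as np

rng = np.random.default_rng(0)
n = 200
lean, agile = rng.normal(size=n), rng.normal(size=n)
items = np.column_stack([                      # six Likert-style items
    lean + 0.3 * rng.normal(size=n),           # waste reduction
    lean + 0.3 * rng.normal(size=n),           # standardized work
    lean + 0.3 * rng.normal(size=n),           # cost focus
    agile + 0.3 * rng.normal(size=n),          # response to volume change
    agile + 0.3 * rng.normal(size=n),          # delivery flexibility
    agile + 0.3 * rng.normal(size=n),          # new-product introduction
])

z = (items - items.mean(axis=0)) / items.std(axis=0)
corr = np.corrcoef(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)        # principal components
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order[:2]] * np.sqrt(eigvals[order[:2]])
print(np.round(loadings, 2))                   # items x 2 components
```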

  17. Process, cost modeling and simulations for integrated project development of biomass for fuel and protein

    International Nuclear Information System (INIS)

    Pannir Selvam, P.V.; Wolff, D.M.B.; Souza Melo, H.N.

    1998-01-01

    The construction of models for biomass project development is described. These models, first constructed using the QPRO electronic spreadsheet for Windows, are now being developed with the aid of visual and object-oriented programming tools using DELPHI V.1 for Windows and the process simulator SUPERPRO V.2.7 (Intelligent Inc.). These models allow process development problems with economic objectives to be solved very rapidly. The preliminary analyses of cost and investments of biomass utilisation projects included in this study are: steam, ammonia, carbon dioxide and alkali pretreatment processes; methane gas production using an anaerobic digestion process; aerobic composting; ethanol fermentation and distillation; effluent treatment using high-rate algae production; as well as cogeneration of energy for drying. The main projects under development are the biomass valuation projects with elephant (Napier) grass, sugar cane bagasse and microalgae, using models for mass balance, equipment and production cost. Sensitivity analyses are carried out to account for stochastic variation of the process yield, production volume and prices, using the Monte Carlo method. These models allow the identification of economic and scale-up problems of the technology. The results obtained from a few preliminary case studies are reported for integrated project development for fuel and protein using process and cost simulation models. (author)

  18. Models of neural dynamics in brain information processing - the developments of 'the decade'

    International Nuclear Information System (INIS)

    Borisyuk, G N; Borisyuk, R M; Kazanovich, Yakov B; Ivanitskii, Genrikh R

    2002-01-01

    Neural network models are discussed that have been developed during the last decade with the purpose of reproducing spatio-temporal patterns of neural activity in different brain structures. The main goal of the modeling was to test hypotheses of synchronization, temporal and phase relations in brain information processing. The models being considered are those of temporal structure of spike sequences, of neural activity dynamics, and oscillatory models of attention and feature integration. (reviews of topical problems)

  19. Development strategy and process models for phased automation of design and digital manufacturing electronics

    Science.gov (United States)

    Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.

    2018-03-01

    The strategy for assuring the quality of electronics is considered most important. To provide quality, the sequence of processes is considered and modeled by a Markov chain. The improvement is distinguished by simple database means of design for manufacturing, enabling future step-by-step development. Phased automation of design and digital manufacturing of electronics is proposed. The MatLab modelling results showed an increase in effectiveness. New tools and software should be more effective. A primary digital model is proposed to represent the product across the sequence of processes, from individual processes up to the whole life cycle.
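
    As an illustration of modelling a process sequence with a Markov chain, the sketch below defines a small design-review-manufacture-test chain with rework loops and computes the expected number of visits to each state before completion; the states and transition probabilities are assumptions, not those of the cited work.

```python
# Markov chain sketch of a process sequence with rework loops.
# States and probabilities are illustrative assumptions.
import numpy as np

states = ["design", "review", "manufacture", "test", "done"]
P = np.array([                   # row = current state, column = next state
    [0.0, 1.0, 0.0, 0.0, 0.0],   # design -> review
    [0.2, 0.0, 0.8, 0.0, 0.0],   # review -> rework design or release
    [0.0, 0.0, 0.0, 1.0, 0.0],   # manufacture -> test
    [0.0, 0.1, 0.1, 0.0, 0.8],   # test -> back to review/manufacture or done
    [0.0, 0.0, 0.0, 0.0, 1.0],   # done (absorbing)
])

# Expected visits for an absorbing chain: N = (I - Q)^-1, Q = transient part
Q = P[:4, :4]
N = np.linalg.inv(np.eye(4) - Q)
for s, visits in zip(states[:4], N[0]):        # starting in "design"
    print(f"expected visits to {s:12s}: {visits:.2f}")
```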

  20. Process-Based Development of Competence Models to Computer Science Education

    Science.gov (United States)

    Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter

    2016-01-01

    A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…

  1. Clinical, information and business process modeling to promote development of safe and flexible software.

    Science.gov (United States)

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  2. A conceptual model for developing KPIs for early phases of the construction process

    NARCIS (Netherlands)

    Haponava, T.; Al-Jibouri, Saad H.S.; Mawdesley, M.; Ahmed, Syed M.; Azhar, Salman; Mohamed, Sherif

    2007-01-01

    The pre-project stage in construction is where most of the decisions about project investment and development are taken. It is therefore very important to be able to control and influence the process at the very beginning of the project. This paper proposes a model for developing a set of KPIs for

  3. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    Science.gov (United States)

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
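
    The core AHP step can be illustrated as below: criteria weights are obtained as the principal eigenvector of a pairwise-comparison matrix, with a consistency-ratio check. The three criteria and the judgements are hypothetical, not those of the inspection framework.

```python
# AHP sketch: priority weights from a pairwise-comparison matrix via the
# principal eigenvector, plus a consistency ratio. Judgements are hypothetical.
import numpy as np

criteria = ["students' performance", "teaching quality", "leadership"]
A = np.array([                 # Saaty 1-9 scale pairwise judgements
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
cr = ci / 0.58                                 # random index for n = 3
for name, weight in zip(criteria, w):
    print(f"{name:22s}: {weight:.3f}")
print(f"consistency ratio: {cr:.3f} (acceptable if < 0.1)")
```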

  4. [The development of an organizational socialization process model for new nurses using a system dynamics approach].

    Science.gov (United States)

    Choi, Soon-Ook

    2005-04-01

    The purpose of this study was to examine the problems and relevant variables for effective Organizational Socialization of new nurses, to produce a causal map, to build a simulation model and to test its validity. The basic data were collected from Sep. 2002 to July 2003. The Organizational Socialization process of new nurses was analyzed through a model simulation. The VENSIM 5.0b DSS program was used to develop the study model. The model shows the interrelations among these result variables: organizational commitment, job satisfaction, job performance, intention of leaving the work setting, decision-making ability, and general results of Organizational Socialization. The model's factors are characteristics of organizational and individual values, task-related knowledge and skills, and emotion and communication that affect new nurses' socialization process. These elements go through the processes of anticipatory socialization, encounter, and change and acquisition. The model was devised to induce effective Organizational Socialization results within 24 months of its implementation. The basic model is the most efficient and will also contribute to the development of knowledge in the body of nursing. This study will provide proper direction for new nurses' Organizational Socialization. Therefore, developing an Organizational Socialization Process Model is meaningful in the sense that it could provide a framework for creating effective Organizational Socialization for new nurses.

  5. Intentional Modelling: A Process for Clinical Leadership Development in Mental Health Nursing.

    Science.gov (United States)

    Ennis, Gary; Happell, Brenda; Reid-Searl, Kerry

    2016-05-01

    Clinical leadership is becoming more relevant for nurses, as the positive impact that it can have on the quality of care and outcomes for consumers is better understood and more clearly articulated in the literature. As clinical leadership continues to become more relevant, the need to gain an understanding of how clinical leaders in nursing develop will become increasingly important. While the attributes associated with effective clinical leadership are recognized in current literature there remains a paucity of research on how clinical leaders develop these attributes. This study utilized a grounded theory methodology to generate new insights into the experiences of peer identified clinical leaders in mental health nursing and the process of developing clinical leadership skills. Participants in this study were nurses working in a mental health setting who were identified as clinical leaders by their peers as opposed to identifying them by their role or organizational position. A process of intentional modeling emerged as the substantive theory identified in this study. Intentional modeling was described by participants in this study as a process that enabled them to purposefully identify models that assisted them in developing the characteristics of effective clinical leaders as well as allowing them to model these characteristics to others. Reflection on practice is an important contributor to intentional modelling. Intentional modelling could be developed as a framework for promoting knowledge and skill development in the area of clinical leadership.

  6. Systematic Multi‐Scale Model Development Strategy for the Fragrance Spraying Process and Transport

    DEFF Research Database (Denmark)

    Heitzig, M.; Rong, Y.; Gregson, C.

    2012-01-01

    The fast and efficient development and application of reliable models with appropriate degree of detail to predict the behavior of fragrance aerosols are challenging problems of high interest to the related industries. A generic modeling template for the systematic derivation of specific fragrance......‐aided modeling framework, which is structured based on workflows for different general modeling tasks. The benefits of the fragrance spraying template are highlighted by a case study related to the derivation of a fragrance aerosol model that is able to reflect measured dynamic droplet size distribution profiles...... aerosol models is proposed. The main benefits of the fragrance spraying template are the speed‐up of the model development/derivation process, the increase in model quality, and the provision of structured domain knowledge where needed. The fragrance spraying template is integrated in a generic computer...

  7. Development of hydrological models and surface process modelization: study case in high mountain slopes

    International Nuclear Information System (INIS)

    Loaiza, Juan Carlos; Pauwels, Valentijn R

    2011-01-01

    Hydrological models are useful because they allow fluxes in hydrological systems to be predicted, which is useful for predicting floods and other violent phenomena associated with water fluxes, especially in materials with a high degree of weathering (meteorization). The combination of these models with meteorological predictions, especially with rainfall models, allows the behaviour of water in the soil to be modelled. In most cases, this type of model is very sensitive to evapotranspiration. In climatic studies, the surface processes have to be represented adequately. Calibration and validation of these models is necessary to obtain reliable results. This paper is a practical exercise in the application of complete hydrological information at a detailed scale in a high mountain catchment, considering the most representative soil uses and types. Information on soil moisture, infiltration, runoff and rainfall is used to calibrate and validate the TOPLATS hydrological model to simulate the behaviour of soil moisture. The findings show that it is possible to implement a hydrological model by using soil moisture information and a calibration equation obtained by means of an Extended Kalman Filter (EKF).
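
    A heavily simplified sketch of the assimilation idea is given below: a one-parameter soil-moisture bucket model updated with noisy observations via a scalar extended Kalman filter. The study itself uses the TOPLATS model; the bucket model, forcing, and error statistics here are assumptions for illustration only.

```python
# Scalar extended Kalman filter on a toy soil-moisture bucket model
# (illustrative stand-in for the TOPLATS calibration described above).
import numpy as np

DT, W_MAX, ET_MAX = 1.0, 0.45, 0.004      # day, saturation, max loss (m3/m3/day)

def propagate(w, rain):
    """Nonlinear bucket model: losses grow with relative saturation."""
    return min(W_MAX, w + DT * (rain - ET_MAX * (w / W_MAX) ** 2))

def jacobian(w):
    """Derivative of the propagation step with respect to the state."""
    return 1.0 - DT * 2.0 * ET_MAX * w / W_MAX ** 2

rng = np.random.default_rng(3)
rain = rng.uniform(0.0, 0.01, size=30)            # synthetic forcing
truth = [0.30]
for r in rain:
    truth.append(propagate(truth[-1], r))
obs = np.array(truth[1:]) + rng.normal(0, 0.02, size=30)   # noisy sensor

w, p = 0.20, 0.05 ** 2                            # biased first guess, variance
Q, R = 1e-5, 0.02 ** 2                            # model / observation error
for r, y in zip(rain, obs):
    w_f = propagate(w, r)                         # forecast
    F = jacobian(w)
    p_f = F * p * F + Q
    K = p_f / (p_f + R)                           # Kalman gain (scalar)
    w = w_f + K * (y - w_f)                       # analysis update
    p = (1.0 - K) * p_f

print(f"final truth {truth[-1]:.3f}, EKF estimate {w:.3f}")
```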

  8. Interprofessional practice in primary care: development of a tailored process model

    Directory of Open Access Journals (Sweden)

    Stans SEA

    2013-04-01

    Steffy EA Stans, JG Anita Stevens, Anna JHM Beurskens, Research Center of Autonomy and Participation for Persons with a Chronic Illness, Zuyd University of Applied Sciences, Heerlen, The Netherlands. Purpose: This study investigated the improvement of interprofessional practice in primary care by performing the first three steps of the implementation model described by Grol et al. This article describes the targets for improvement in a setting for children with complex care needs (step 1), the identification of barriers and facilitators influencing interprofessional practice (step 2), and the development of a tailored interprofessional process model (step 3). Methods: In step 2, thirteen qualitative semistructured interviews were held with several stakeholders, including parents of children, an occupational therapist, a speech and language therapist, a physical therapist, the manager of the team, two general practitioners, a psychologist, and a primary school teacher. The data were analyzed using directed content analysis, with the domains of the Chronic Care Model as a framework. In step 3, a project group was formed to develop helpful strategies, including the development of an interprofessional process through process mapping. Results: In step 2, it was found that the most important barriers to implementing interprofessional practice related to the lack of structure in the care process. A process model for interprofessional primary care was developed for the target group. Conclusion: The lack of a shared view of what is involved in the process of interprofessional practice was the most important barrier to its successful implementation. It is suggested that the tailored process developed, supported with the appropriate tools, may provide both professional staff and their clients, in this setting but also in other areas of primary care, with insight into the care process and a clear representation of "who should do what, when, and how."

  9. Mechanistic Models for Process Development and Optimization of Fed-batch Fermentation Systems

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads O.

    2016-01-01

    This work discusses the application of mechanistic models to pilot scale filamentous fungal fermentation systems operated at Novozymes A/S. For on-line applications, a state estimator model is developed based on a stoichiometric balance in order to predict the biomass and product concentration....... This is based on on-line gas measurements and ammonia addition flow rate measurements. Additionally, a mechanistic model is applied offline as a tool for batch planning, based on definition of the process back pressure, aeration rate and stirrer speed. This allows the batch starting fill to be planned, taking...... into account the oxygen transfer conditions, as well as the evaporation rates of the system. Mechanistic models are valuable tools which are applicable for both process development and optimization. The state estimator described will be a valuable tool for future work as part of control strategy development...
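
    The sketch below illustrates the stoichiometric state-estimation idea in its simplest form: integrating the oxygen uptake rate obtained from off-gas measurements and applying a yield coefficient to track biomass. The yield, the OUR profile, and the neglect of product formation and ammonia data are illustrative assumptions.

```python
# Simplest-form stoichiometric biomass estimator: integrate OUR from off-gas
# and apply a yield coefficient. All numbers are illustrative assumptions.
import numpy as np

Y_XO = 0.5          # g biomass per g O2 consumed (assumed overall yield)
X0 = 0.2            # biomass concentration after inoculation, g/L (assumed)

t = np.arange(0.0, 40.0, 1.0)                   # process time, h
our = np.minimum(0.1 + 0.08 * t, 2.0)           # oxygen uptake rate, g O2/(L h), synthetic

dt = t[1] - t[0]
cumulative_o2 = np.cumsum(our) * dt             # g O2 consumed per litre of broth
biomass = X0 + Y_XO * cumulative_o2             # on-line biomass estimate, g/L

print(f"estimated biomass after {t[-1]:.0f} h: {biomass[-1]:.1f} g/L")
```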

  10. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    Science.gov (United States)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.

  11. The Optimization of the Local Public Policies’ Development Process Through Modeling And Simulation

    Directory of Open Access Journals (Sweden)

    Minodora URSĂCESCU

    2012-06-01

    Full Text Available The development of local public policies in Romania is largely an empirical exercise: strategic management practices in this domain are not based on a scientific instrument capable of anticipating and evaluating the results of implementing a local public policy within a needs-policies-effects logic. Starting from this motivation, the purpose of the paper is to reconceptualize the public policy process on the principles of dynamic systems with feedback, by means of mathematical modeling and simulation techniques. The research is therefore oriented toward developing an optimization method for the local public policy development process, using mathematical modeling and simulation techniques as instruments. The main results of the research are, on the one hand, a new process concept for local public policies and, on the other hand, the conceptual model of a complex software product that will permit the parameterized modeling of the policy development process in a virtual environment. The purpose of this software product is to model and simulate each type of local public policy, taking into account the characteristics of the respective policy as well as the values of the parameters of its application environment at a given moment.

  12. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models and to quantify the uncertainties in the estimated property values from a process design point-of-view. This includes: (i) parameter estimation using....... The comparison of model prediction uncertainties with reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column......, the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce most critical uncertainties....

  13. Managing Service Development (SaaS) as a project: business process modeling

    OpenAIRE

    Iliadi, Vasiliki; Ηλιάδη, Βασιλική

    2017-01-01

    In the context of the present thesis, we will be studying core principles of Business Process Management, and how we can take advantage of them in combination with Project Management Methodologies and modeling tools in the context of Software as a Service businesses and their development. Initially we provide the reader with an introduction to Business Process Management, how it can be used, and how the life cycle is structured. We further define the first three phases of the life cycle to...

  14. Development of a Population Balance Model of a pharmaceutical drying process and testing of solution methods

    DEFF Research Database (Denmark)

    Mortier, Séverine Thérèse F.C.; Gernaey, Krist; De Beer, Thomas

    2013-01-01

    Drying is frequently used in the production of pharmaceutical tablets. Simulation-based control strategy development for such a drying process requires a detailed model. First, the drying of wet granules is modelled using a Population Balance Model. A growth term based on a reduced model was used......, which describes the decrease of the moisture content, to follow the moisture content distribution for a batch of granules. Secondly, different solution methods for solving the PBM are compared. The effect of grid size (discretization methods) is analyzed in terms of accuracy and calculation time. All...

  15. Real-time control data wrangling for development of mathematical control models of technological processes

    Science.gov (United States)

    Vasilyeva, N. V.; Koteleva, N. I.; Fedorova, E. R.

    2018-05-01

    The relevance of this research stems from the need to stabilize the composition of the melting products of copper-nickel sulfide raw materials in the Vanyukov furnace. The goal of this research is to identify the most suitable methods of aggregating real-time data for the development of a mathematical model for control of the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace. Statistical methods for analyzing historical data from the real technological object and correlation analysis of the process parameters are described. The factors that exert the greatest influence on the main output parameter (copper content in matte) and drive the physical-chemical transformations are identified. An approach to processing the real-time data for the development of a mathematical model for control of the melting process is proposed, and the stages of processing the real-time information are considered. The adopted data-aggregation methodology yields results suitable for developing a control model for the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace and for further practical application.
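
As one hedged illustration of the correlation analysis mentioned above, the sketch below ranks process parameters by the strength of their correlation with the main output parameter (copper content in matte); the column names and values are hypothetical placeholders, not plant historian data.

```python
# Hypothetical historian extract: rank process parameters by their correlation
# with the main output parameter (copper content in matte). All names and
# values here are illustrative placeholders.
import pandas as pd

df = pd.DataFrame({
    "cu_in_matte": [52.1, 54.3, 50.8, 55.0, 53.2, 51.7],
    "oxygen_flow": [410, 455, 395, 470, 440, 405],
    "feed_rate":   [98, 101, 95, 104, 100, 96],
    "blast_temp":  [1180, 1195, 1170, 1205, 1190, 1175],
})

# Pearson correlation of each parameter with the output, strongest first.
corr = df.corr()["cu_in_matte"].drop("cu_in_matte")
print(corr.abs().sort_values(ascending=False))
```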

  16. Analyzing empowerment oriented email consultation for parents: Development of the Guiding the Empowerment Process model

    NARCIS (Netherlands)

    dr. Christa C.C. Nieuwboer

    2014-01-01

    Background. Online consultation is increasingly offered by parenting practitioners, but it is not clear if it is feasible to provide empowerment oriented support in single session email consultation. Method. Based on empowerment theory we developed the Guiding the Empowerment Process model (GEP

  17. Model-based high-throughput process development for chromatographic whey proteins separation

    NARCIS (Netherlands)

    Nfor, B.; Ripic, J.; Padt, van der A.; Jacobs, M.; Ottens, M.

    2012-01-01

    In this study, an integrated approach involving the combined use of high-throughput screening (HTS) and column modeling during process development was applied to an industrial case involving the evaluation of four anion-exchange chromatography (AEX) resins and four hydrophobic interaction

  18. Transition management as a model for managing processes of co-evolution towards sustainable development

    NARCIS (Netherlands)

    R. Kemp (René); D.A. Loorbach (Derk); J. Rotmans (Jan)

    2007-01-01

    textabstractSustainable development requires changes in socio-technical systems and wider societal change - in beliefs, values and governance that co-evolve with technology changes. In this article we present a practical model for managing processes of co-evolution: transition management. Transition

  19. Mathematical model development of heat and mass exchange processes in the outdoor swimming pool

    OpenAIRE

    M. V. Shaptala; D. E. Shaptala

    2014-01-01

    Purpose. Currently exploitation of outdoor swimming pools is often not cost-effective and, despite of their relevance, such pools are closed in large quantities. At this time there is no the whole mathematical model which would allow assessing qualitatively the effect of energy-saving measures. The aim of this work is to develop a mathematical model of heat and mass exchange processes for calculating basic heat and mass losses that occur during its exploitation. Methodology. The m...

  20. Developing Pavement Distress Deterioration Models for Pavement Management System Using Markovian Probabilistic Process

    Directory of Open Access Journals (Sweden)

    Promothes Saha

    2017-01-01

    Full Text Available In the state of Colorado, the Colorado Department of Transportation (CDOT) utilizes their pavement management system (PMS) to manage approximately 9,100 miles of interstate, highways, and low-volume roads. Three types of deterioration models are currently being used in the existing PMS: site-specific, family, and expert opinion curves. These curves are developed using deterministic techniques. In the deterministic technique, the uncertainties of pavement deterioration related to traffic and weather are not considered. Probabilistic models that take into account the uncertainties result in more accurate curves. In this study, probabilistic models using the discrete-time Markov process were developed for five distress indices: transverse, longitudinal, fatigue, rut, and ride indices, as a case study on low-volume roads. Regression techniques were used to develop the deterioration paths using the predicted distribution of indices estimated from the Markov process. Results indicated that longitudinal, fatigue, and rut indices had very slow deterioration over time, whereas transverse and ride indices showed faster deterioration. The developed deterioration models had a coefficient of determination (R2) above 0.84. As probabilistic models provide more accurate results, it is recommended that these models be used as the family curves in the CDOT PMS for low-volume roads.
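
To make the probabilistic idea above concrete, the sketch below steps a discrete-time Markov chain forward and reads off the expected condition index over time; the five-state transition matrix is an invented illustration, not a CDOT-estimated matrix.

```python
# Hypothetical sketch of a discrete-time Markov deterioration curve, assuming a
# five-state condition index (5 = best, 1 = worst) and a made-up transition
# matrix; in practice such matrices are estimated from observed condition data.
import numpy as np

# Row i gives the probability of moving from state i to each state in one year.
# Deterioration only (no maintenance), so mass only moves toward worse states.
P = np.array([
    [0.85, 0.15, 0.00, 0.00, 0.00],   # state 5
    [0.00, 0.80, 0.20, 0.00, 0.00],   # state 4
    [0.00, 0.00, 0.75, 0.25, 0.00],   # state 3
    [0.00, 0.00, 0.00, 0.70, 0.30],   # state 2
    [0.00, 0.00, 0.00, 0.00, 1.00],   # state 1 (absorbing)
])
states = np.array([5, 4, 3, 2, 1])

def expected_index(years, p0=None):
    """Expected condition index over time from the predicted state distribution."""
    dist = np.array([1.0, 0, 0, 0, 0]) if p0 is None else np.asarray(p0, float)
    path = []
    for _ in range(years + 1):
        path.append(states @ dist)   # expectation of the index under dist
        dist = dist @ P              # one-step Markov update
    return np.array(path)

print(expected_index(20))  # a deterioration path to fit a regression curve against
```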

  1. Models development for natural circulation and its transition process in nuclear power plant

    International Nuclear Information System (INIS)

    Yu Lei; Cai Qi; Cai Zhangsheng; Xie Haiyan

    2008-01-01

    Starting from the nuclear power plant (NPP) best-estimate transient analysis code RELAP5/MOD3, the point reactor kinetics model in RELAP5/MOD3 was replaced by a two-group, 3-D space- and time-dependent neutron kinetics model in order to analyze accurately the responses of key parameters during natural circulation and its transition process, taking reactivity feedback into account. A coupled model for three-dimensional physics and thermohydraulics was established and the corresponding computing code was developed. Using the developed code, natural circulation in the NPP and its transition process were calculated and analyzed. Comparison with the experimental data shows that the code's higher precision overcomes the limitation of the point reactor equations, which cannot represent the reactivity exactly. The code can serve as a computing and analysis tool for forced circulation, natural circulation, and the transitions between them. (authors)

  2. The application of feature selection to the development of Gaussian process models for percutaneous absorption.

    Science.gov (United States)

    Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P

    2010-06-01

    The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset and to employ these methods, particularly feature selection, to determine the key physicochemical descriptors which exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance detection (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures to determine model quality were used. The inherently nonlinear nature of the skin data set was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (where the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity, compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information. However, it was also shown that significant synergy existed between certain parameters, and as such it
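
The record above describes Gaussian process regression with automatic relevance detection of descriptors. The sketch below is a hedged, scikit-learn approximation of that idea (the study itself used MatLab software); the descriptor matrix, target values, and kernel settings are assumptions for illustration, with per-descriptor length scales playing the ARD role.

```python
# Minimal GPR-with-ARD sketch on a toy descriptor matrix whose columns stand in
# for [logP, melting point, H-bond donors]; all data are invented placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))                                    # hypothetical descriptors
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.1 * rng.normal(size=60)   # surrogate permeability

# One length scale per descriptor: after fitting, a short length scale marks a
# descriptor the model considers relevant (the ARD idea behind GPRARD).
kernel = RBF(length_scale=[1.0, 1.0, 1.0]) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

print(gpr.kernel_.k1.length_scale)                 # relevance ranking of descriptors
mean, std = gpr.predict(X[:5], return_std=True)    # predictions with uncertainty
```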

  3. Development of Three-Layer Simulation Model for Freezing Process of Food Solution Systems

    Science.gov (United States)

    Kaminishi, Koji; Araki, Tetsuya; Shirakashi, Ryo; Ueno, Shigeaki; Sagara, Yasuyuki

    A numerical model has been developed for simulating freezing phenomena in food solution systems. The cell model was simplified for application to food solution systems, incorporating three regions: an unfrozen layer, a frozen layer, and a moving boundary layer. Moreover, a model for the moving rate of the freezing front was introduced and calculated using the variable space network method proposed by Murray and Landis (1957). To demonstrate the validity of the model, it was applied to the freezing processes of coffee solutions. Since the model requires the phase diagram of the material to be frozen, the initial freezing temperatures of 1-55 % coffee solutions were measured by the DSC method. The effective thermal conductivity of the coffee solutions was determined as a function of temperature and solute concentration using the Maxwell-Eucken model. The one-dimensional freezing process of a 10 % coffee solution was simulated based on its phase diagram and thermo-physical properties. The results were in good agreement with the experimental data and showed that the model could accurately describe the change in the location of the freezing front and the distributions of temperature as well as ice fraction during a freezing process.

  4. Process of optimization of retail trade spatial development with application of locational-alocational models

    Directory of Open Access Journals (Sweden)

    Kukrika Milan

    2008-01-01

    Full Text Available This article gives a brief overview of the structure and use of location-allocation models in the planning of retail networks, pointing out the main shortcomings of existing models and the primary directions for their future improvement. We review their main applications and explain the basic factors that the models take into consideration during the process of demand allocation. Location-allocation models are an important element in optimizing the spatial development of a retail network. Their future improvement points toward closer approximation and integration with spatial-interaction models, which would yield a much better methodology for planning and directing the spatial development of trade in general. The methodology used in this research is based on the literature and on research projects in the area. Applying it to parts of Serbian territory through the use of location-allocation models showed the need to create dedicated software for calculating matrices with recursions. Since the integration of location-allocation models with GIS has not yet occurred, all the results obtained from the calculations were brought into ArcGIS 9.2 software and presented as maps.

  5. Parametric Cost Modeling of Space Missions Using the Develop New Projects (DMP) Implementation Process

    Science.gov (United States)

    Rosenberg, Leigh; Hihn, Jairus; Roust, Kevin; Warfield, Keith

    2000-01-01

    This paper presents an overview of a parametric cost model that has been built at JPL to estimate the costs of future deep space robotic science missions. Due to the recent dramatic changes in JPL business practices brought about by an internal reengineering effort known as Develop New Products (DNP), high-level historic cost data is no longer considered analogous to future missions. Therefore, the historic data is of little value in forecasting costs for projects developed using the DNP process. This has led to the development of an approach for obtaining expert opinion and for combining actual data with expert opinion to provide a cost database for future missions. In addition, the DNP cost model uses a maximum of objective cost drivers, which reduces the likelihood of model input error. Version 2 is now under development; it expands the model capabilities, links them more tightly with key design technical parameters, and is grounded in more rigorous statistical techniques. The challenges faced in building this model are discussed, as well as its background, development approach, status, validation, and future plans.

  6. Using process algebra to develop predator-prey models of within-host parasite dynamics.

    Science.gov (United States)

    McCaig, Chris; Fenton, Andy; Graham, Andrea; Shankland, Carron; Norman, Rachel

    2013-07-21

    As a first approximation of immune-mediated within-host parasite dynamics we can consider the immune response as a predator, with the parasite as its prey. In the ecological literature of predator-prey interactions there are a number of different functional responses used to describe how a predator reproduces in response to consuming prey. Until recently most of the models of the immune system that have taken a predator-prey approach have used simple mass action dynamics to capture the interaction between the immune response and the parasite. More recently Fenton and Perkins (2010) employed three of the most commonly used prey-dependent functional response terms from the ecological literature. In this paper we make use of a technique from computing science, process algebra, to develop mathematical models. The novelty of the process algebra approach is to allow stochastic models of the population (parasite and immune cells) to be developed from rules of individual cell behaviour. By using this approach in which individual cellular behaviour is captured we have derived a ratio-dependent response similar to that seen in the previous models of immune-mediated parasite dynamics, confirming that, whilst this type of term is controversial in ecological predator-prey models, it is appropriate for models of the immune system. Copyright © 2013 Elsevier Ltd. All rights reserved.
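
The record above derives a ratio-dependent functional response from rules of individual cell behaviour. The following sketch is not the paper's process-algebra derivation; it is a simple Gillespie-style stochastic simulation, with invented rate constants, that illustrates how a ratio-dependent immune ("predator") response to a parasite ("prey") can be simulated from event rates.

```python
# Illustrative stochastic simulation of an immune response consuming a parasite;
# rates and the ratio-dependent proliferation term are assumptions for the
# sketch, not values or terms taken from the paper.
import random

def simulate(P=50, I=10, t_end=30.0, r=1.0, k=0.05, a=0.5, d=0.3):
    t = 0.0
    while t < t_end and P > 0:
        rates = {
            "parasite_birth": r * P,                 # parasite replication
            "parasite_kill":  k * P * I,             # killing on encounter
            "immune_growth":  a * I * P / (P + I),   # ratio-dependent response
            "immune_death":   d * I,                 # immune cell turnover
        }
        total = sum(rates.values())
        if total == 0:
            break
        t += random.expovariate(total)               # time to next event
        pick = random.uniform(0, total)
        for event, rate in rates.items():            # choose the event
            pick -= rate
            if pick <= 0:
                break
        if event == "parasite_birth":
            P += 1
        elif event == "parasite_kill":
            P -= 1
        elif event == "immune_growth":
            I += 1
        else:
            I = max(I - 1, 0)
    return t, P, I

print(simulate())
```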

  7. Process modeling of the platform choice for development of the multimedia educational complex

    Directory of Open Access Journals (Sweden)

    Ірина Олександрівна Бондар

    2016-10-01

    Full Text Available The article presents a methodical approach to choosing the platform that will serve as the technological basis for building an open and functional structure and for the subsequent implementation of the substantive content of the modules of a network multimedia complex for a discipline. The proposed approach is implemented using mathematical tools. The result of the process modeling is the selection of the most appropriate platform for developing the multimedia complex

  8. High-throughput micro-scale cultivations and chromatography modeling: Powerful tools for integrated process development.

    Science.gov (United States)

    Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen

    2015-10-01

    Upstream processes are rather complex to design and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimized system must not necessarily be the one with highest product titers, but the one resulting in an overall superior process performance in up- and downstream. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium) which can be fused to any target protein allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using a chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capturing were validated subsequently. The overall best system was chosen based on a combination of excellent up- and downstream performance. © 2015 Wiley Periodicals, Inc.

  9. Modelling and Development of a High Performance Milling Process with Monolithic Cutting Tools

    International Nuclear Information System (INIS)

    Ozturk, E.; Taylor, C. M.; Turner, S.; Devey, M.

    2011-01-01

    Critical aerospace components usually require difficult-to-machine workpiece materials such as nickel-based alloys. Moreover, there is a pressing need to maximize the productivity of machining operations. This need can be satisfied by selecting higher feed velocities and axial and radial depths of cut, but several problems may then arise during machining. Due to the high cutting speeds in high performance machining, the tool life may be unacceptably low. If the magnitudes of the cutting forces are high, out-of-tolerance static form errors may result; in extreme cases, the cutting tool may break apart. Forced vibrations may degrade the surface quality, and chatter vibrations may develop if the selected parameters result in instability. In this study, to address the tool life issue, several experimental cuts are made with different tool geometries, and the best combination in terms of tool life is selected. A force model is developed and its results are verified against experimental results. The force model is used to predict the effect of process parameters on cutting forces. To account for the other concerns, such as static form errors and forced and chatter vibrations, additional process models are currently under development.

  10. Developing a Data Driven Process-Based Model for Remote Sensing of Ecosystem Production

    Science.gov (United States)

    Elmasri, B.; Rahman, A. F.

    2010-12-01

    Estimating ecosystem carbon fluxes at various spatial and temporal scales is essential for quantifying the global carbon cycle. Numerous models have been developed for this purpose using several environmental variables as well as vegetation indices derived from remotely sensed data. Here we present a data-driven modeling approach for gross primary production (GPP) that is based on the process-based model BIOME-BGC. The proposed model was run using available remote sensing data and does not depend on look-up tables. Furthermore, this approach combines the merits of both empirical and process models; empirical models were used to estimate certain input variables such as light use efficiency (LUE). This was achieved by applying remotely sensed data to the mathematical equations that represent biophysical photosynthesis processes in the BIOME-BGC model. Moreover, a new spectral index for estimating maximum photosynthetic activity, the maximum photosynthetic rate index (MPRI), is also developed and presented here. This new index is based on the ratio between the near infrared and the green bands (ρ858.5/ρ555). The model was tested and validated against the MODIS GPP product and flux measurements from two eddy covariance flux towers located at Morgan Monroe State Forest (MMSF) in Indiana and Harvard Forest in Massachusetts. Satellite data acquired by the Advanced Microwave Scanning Radiometer (AMSR-E) and MODIS were used. The data-driven model showed a strong correlation between the predicted and measured GPP at the two eddy covariance flux tower sites. This methodology produced better predictions of GPP than the MODIS GPP product. Moreover, the proportion of error in the predicted GPP for MMSF and Harvard Forest was dominated by unsystematic errors, suggesting that the results are unbiased. The analysis indicated that maintenance respiration is one of the main factors that dominate the overall model outcome errors and improvement in maintenance respiration estimation
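
The MPRI index described above is simply the ratio of near-infrared (~858.5 nm) to green (~555 nm) reflectance. The sketch below computes it for a toy reflectance array; the array values and the divide-by-zero guard are assumptions for illustration, not the paper's code.

```python
# Minimal sketch of the MPRI spectral index: rho_858.5 / rho_555.
import numpy as np

def mpri(nir_858, green_555, eps=1e-6):
    """Maximum photosynthetic rate index from NIR and green reflectance."""
    nir = np.asarray(nir_858, dtype=float)
    green = np.asarray(green_555, dtype=float)
    return nir / np.maximum(green, eps)   # guard against divide-by-zero

# Toy reflectance values for a 2 x 2 pixel window.
nir = np.array([[0.42, 0.45], [0.40, 0.47]])
green = np.array([[0.08, 0.09], [0.07, 0.10]])
print(mpri(nir, green))
```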

  11. A Biopsychological Model of Anti-drug PSA Processing: Developing Effective Persuasive Messages.

    Science.gov (United States)

    Hohman, Zachary P; Keene, Justin Robert; Harris, Breanna N; Niedbala, Elizabeth M; Berke, Collin K

    2017-11-01

    For the current study, we developed and tested a biopsychological model to combine research on psychological tension, the Limited Capacity Model of Motivated Mediated Message Processing, and the endocrine system to predict and understand how people process anti-drug PSAs. We predicted that co-presentation of pleasant and unpleasant information, vs. solely pleasant or unpleasant, will trigger evaluative tension about the target behavior in persuasive messages and result in a biological response (increase in cortisol, alpha amylase, and heart rate). In experiment 1, we assessed the impact of co-presentation of pleasant and unpleasant information in persuasive messages on evaluative tension (conceptualized as attitude ambivalence), in experiment 2, we explored the impact of co-presentation on endocrine system responses (salivary cortisol and alpha amylase), and in experiment 3, we assessed the impact of co-presentation on heart rate. Across all experiments, we demonstrated that co-presentation of pleasant and unpleasant information, vs. solely pleasant or unpleasant, in persuasive communications leads to increases in attitude ambivalence, salivary cortisol, salivary alpha amylase, and heart rate. Taken together, the results support the initial paths of our biopsychological model of persuasive message processing and indicate that including both pleasant and unpleasant information in a message impacts the viewer. We predict that increases in evaluative tension and biological responses will aid in memory and cognitive processing of the message. However, future research is needed to test that hypothesis.

  12. Development of Computer Aided Modelling Templates for Model Re-use in Chemical and Biochemical Process and Product Design: Import and export of models

    DEFF Research Database (Denmark)

    Fedorova, Marina; Tolksdorf, Gregor; Fillinger, Sandra

    2015-01-01

    been established, in order to provide a wider range of modelling capabilities. Through this link, developed models can be exported/imported to/from other modelling-simulation software environments to allow model reusability in chemical and biochemical product and process design. The use of this link...

  13. Non-isothermal processes during the drying of bare soil: Model Development and Validation

    Science.gov (United States)

    Sleep, B.; Talebi, A.; O'Carrol, D. M.

    2017-12-01

    Several coupled liquid water, water vapor, and heat transfer models have been developed either to study non-isothermal processes in the subsurface immediately below the ground surface, or to predict the evaporative flux from the ground surface. Equilibrium phase change between water and gas phases is typically assumed in these models. Recently, a few studies have questioned this assumption and proposed a coupled model considering kinetic phase change. However, none of these models were validated against real field data. In this study, a non-isothermal coupled model incorporating kinetic phase change was developed and examined against measured data from a green roof test module. The model also incorporated a new surface boundary condition for water vapor transport at the ground surface. The measured field data included soil moisture content and temperature at different depths down to 15 cm below the ground surface. Lysimeter data were collected to determine the evaporation rates. Short- and long-wave radiation, wind velocity, ambient air temperature, and relative humidity were measured and used as model input. Field data were collected for a period of three months during the warm seasons in southeastern Canada. The model was calibrated using one drying period and then several other drying periods were simulated. In general, the model underestimated the evaporation rates in the early stage of the drying period; however, the cumulative evaporation was in good agreement with the field data. The model predicted the trends in temperature and moisture content at the different depths in the green roof module. The simulated temperature was lower than the measured temperature for most of the simulation time, with a maximum difference of 5 °C. The simulated moisture content changes had the same temporal trend as the lysimeter data for the events simulated.

  14. Development and validation of a CFD model predicting the backfill process of a nuclear waste gallery

    International Nuclear Information System (INIS)

    Gopala, Vinay Ramohalli; Lycklama a Nijeholt, Jan-Aiso; Bakker, Paul; Haverkate, Benno

    2011-01-01

    Research highlights: → This work presents the CFD simulation of the backfill process of Supercontainers with nuclear waste emplaced in a disposal gallery. → The cement-based material used for backfill is grout and the flow of grout is modelled as a Bingham fluid. → The model is verified against an analytical solution and validated against the flowability tests for concrete. → Comparison between backfill plexiglas experiment and simulation shows a distinct difference in the filling pattern. → The numerical model needs to be further developed to include segregation effects and thixotropic behavior of grout. - Abstract: Nuclear waste material may be stored in underground tunnels for long term storage. The example treated in this article is based on the current Belgian disposal concept for High-Level Waste (HLW), in which the nuclear waste material is packed in concrete shielded packages, called Supercontainers, which are inserted into these tunnels. After placement of the packages in the underground tunnels, the remaining voids between the packages and the tunnel lining is filled-up with a cement-based material called grout in order to encase the stored containers into the underground spacing. This encasement of the stored containers inside the tunnels is known as the backfill process. A good backfill process is necessary to stabilize the waste gallery against ground settlements. A numerical model to simulate the backfill process can help to improve and optimize the process by ensuring a homogeneous filling with no air voids and also optimization of the injection positions to achieve a homogeneous filling. The objective of the present work is to develop such a numerical code that can predict the backfill process well and validate the model against the available experiments and analytical solutions. In the present work the rheology of Grout is modelled as a Bingham fluid which is implemented in OpenFOAM - a finite volume-based open source computational fluid

  15. Developing Materials Processing to Performance Modeling Capabilities and the Need for Exascale Computing Architectures (and Beyond)

    Energy Technology Data Exchange (ETDEWEB)

    Schraad, Mark William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Physics and Engineering Models; Luscher, Darby Jon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Advanced Simulation and Computing

    2016-09-06

    Additive Manufacturing techniques are presenting the Department of Energy and the NNSA Laboratories with new opportunities to consider novel component production and repair processes, and to manufacture materials with tailored response and optimized performance characteristics. Additive Manufacturing technologies are already being applied to primary NNSA mission areas, including Nuclear Weapons. These mission areas are adapting to these new manufacturing methods because of potential advantages, such as smaller manufacturing footprints, reduced needs for specialized tooling, an ability to embed sensing, novel part repair options, an ability to accommodate complex geometries, and lighter weight materials. To realize the full potential of Additive Manufacturing as a game-changing technology for the NNSA's national security missions, however, significant progress must be made in several key technical areas. In addition to advances in engineering design, process optimization and automation, and accelerated feedstock design and manufacture, significant progress must be made in modeling and simulation. First and foremost, a more mature understanding of the process-structure-property-performance relationships must be developed. Because Additive Manufacturing processes change the nature of a material's structure below the engineering scale, new models are required to predict materials response across the spectrum of relevant length scales, from the atomistic to the continuum. New diagnostics will be required to characterize materials response across these scales. And not just models, but advanced algorithms, next-generation codes, and advanced computer architectures will be required to complement the associated modeling activities. Based on preliminary work in each of these areas, a strong argument for the need for Exascale computing architectures can be made, if a legitimate predictive capability is to be developed.

  16. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Full Text Available Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  17. Process development and modeling of fluidized-bed reactor with coimmobilized biocatalyst for fuel ethanol production

    Science.gov (United States)

    Sun, May Yongmei

    This research focuses on two steps of commercial fuel ethanol production processes: the starch hydrolysis process and the fermentation process. The goal of this research is to evaluate the performance of co-immobilized biocatalysts in a fluidized bed reactor (FBR) with emphasis on economic and engineering aspects and to develop a predictive mathematical model for this system. The productivity of an FBR is higher than that of a traditional batch reactor or CSTR. Fluidized beds offer great advantages over packed beds for immobilized cells when small particles are used or when the reactant feed contains suspended solids. Plugging problems, excessive pressure drops (and thus attrition), and crushing risks may be avoided. No mechanical stirring is required, as mixing occurs due to the natural turbulence in the fluidized process. Both the enzyme and the microorganism are immobilized in a single catalyst bead, which is called co-immobilization. Inside this biocatalyst matrix, starch is hydrolyzed by the enzyme glucoamylase to form glucose, which is then converted to ethanol and carbon dioxide by the microorganism. Two biocatalysts were evaluated: (1) the yeast strain Saccharomyces cerevisiae co-immobilized with glucoamylase, and (2) Zymomonas mobilis co-immobilized with glucoamylase. A co-immobilized biocatalyst accomplishes simultaneous saccharification and fermentation (the SSF process). When compared to a two-step process involving separate saccharification and fermentation stages, the SSF process has productivity values twice those given by the pre-saccharified process when the time required for pre-saccharification (15-25 h) is taken into account. The SSF process should also save capital cost. Information about productivity, fermentation yield, concentration profiles along the bed, ethanol inhibition, and related quantities was obtained from the experimental data. For the yeast system, experimental results showed that no apparent decrease in productivity occurred after two and a half months; the productivity

  18. Study of Research and Development Processes through Fuzzy Super FRM Model and Optimization Solutions

    Directory of Open Access Journals (Sweden)

    Flavius Aurelian Sârbu

    2015-01-01

    Full Text Available The aim of this study is to measure resources for R&D (research and development) at the regional level in Romania and to obtain primary data that will be important in making the right decisions to increase competitiveness and development based on a knowledge economy. As our motivation, we would like to emphasize that through the Super Fuzzy FRM model we want to determine the state of R&D processes at the regional level using a means other than the statistical survey, while with the two optimization methods we aim to provide optimization solutions for the R&D actions of enterprises. Therefore, to fulfill the above-mentioned aim in this application-oriented paper, we use a questionnaire and, for the interpretation of the results, the Super Fuzzy FRM model, which represents the main novelty of our paper; this theory provides a formalism based on matrix calculus, which allows large volumes of information to be processed and delivers results that are difficult or impossible to obtain through statistical processing. A further novelty of the paper is the set of optimization solutions presented, given for the situation in which the sales price is variable and the quantity sold is constant in time, and for the reverse situation.

  19. Assessing local population vulnerability to wind energy development with branching process models: an application to wind energy development

    Science.gov (United States)

    Erickson, Richard A.; Eager, Eric A.; Stanton, Jessica C.; Beston, Julie A.; Diffendorfer, James E.; Thogmartin, Wayne E.

    2015-01-01

    Quantifying the impact of anthropogenic development on local populations is important for conservation biology and wildlife management. However, these local populations are often subject to demographic stochasticity because of their small population size. Traditional modeling efforts such as population projection matrices do not consider this source of variation whereas individual-based models, which include demographic stochasticity, are computationally intense and lack analytical tractability. One compromise between approaches is branching process models because they accommodate demographic stochasticity and are easily calculated. These models are known within some sub-fields of probability and mathematical ecology but are not often applied in conservation biology and applied ecology. We applied branching process models to quantitatively compare and prioritize species locally vulnerable to the development of wind energy facilities. Specifically, we examined species vulnerability using branching process models for four representative species: A cave bat (a long-lived, low fecundity species), a tree bat (short-lived, moderate fecundity species), a grassland songbird (a short-lived, high fecundity species), and an eagle (a long-lived, slow maturation species). Wind turbine-induced mortality has been observed for all of these species types, raising conservation concerns. We simulated different mortality rates from wind farms while calculating local extinction probabilities. The longer-lived species types (e.g., cave bats and eagles) had much more pronounced transitions from low extinction risk to high extinction risk than short-lived species types (e.g., tree bats and grassland songbirds). High-offspring-producing species types had a much greater variability in baseline risk of extinction than the lower-offspring-producing species types. Long-lived species types may appear stable until a critical level of incidental mortality occurs. After this threshold, the risk of
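
As a rough illustration of the branching-process idea above, the sketch below estimates local extinction probability for a hypothetical long-lived, low-fecundity population under increasing added (e.g., turbine-induced) mortality; the demographic rates are invented for the example, not the paper's parameter estimates.

```python
# Hedged sketch of a Galton-Watson-style branching process with demographic
# stochasticity; survival and fecundity values are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)

def extinct_prob(n0=50, years=50, survival=0.85, fecundity=0.3,
                 added_mortality=0.02, reps=2000):
    """Fraction of stochastic trajectories that hit zero within `years`."""
    n = np.full(reps, n0, dtype=np.int64)
    s = survival * (1.0 - added_mortality)            # annual survival after take
    for _ in range(years):
        survivors = rng.binomial(n, s)                # demographic stochasticity
        offspring = rng.binomial(survivors, fecundity)  # simple Bernoulli fecundity
        n = survivors + offspring
    return float(np.mean(n == 0))

# Long-lived, low-fecundity species can look stable until added mortality
# pushes them past a threshold.
for m in (0.0, 0.02, 0.05, 0.10):
    print(m, extinct_prob(added_mortality=m))
```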

  20. Mathematical model development of heat and mass exchange processes in the outdoor swimming pool

    Directory of Open Access Journals (Sweden)

    M. V. Shaptala

    2014-12-01

    Full Text Available Purpose. Currently the operation of outdoor swimming pools is often not cost-effective and, despite their relevance, such pools are being closed in large numbers. At present there is no complete mathematical model that would allow a qualitative assessment of the effect of energy-saving measures. The aim of this work is to develop a mathematical model of the heat and mass exchange processes for calculating the basic heat and mass losses that occur during pool operation. Methodology. A method for determining heat and mass losses based on the theory of similarity criteria equations is used. Findings. The main types of heat and mass losses of the outdoor pool were analyzed. The most significant types were identified and mathematically described: evaporation of water from the pool surface, natural and forced convection, radiation to the environment, and heat consumption for water heating. Originality. A mathematical model of the heat and mass exchange processes of the outdoor swimming pool was developed, which allows the basic heat and mass losses that occur during its operation to be calculated. Practical value. The method of determining the heat and mass losses of an outdoor swimming pool was developed and implemented as a software system based on the mathematical model proposed by the authors. This method can be used for the conceptual design of energy-efficient outdoor pool structures, to assess their energy intensity, and to select the optimum energy-saving measures. A further step in research in this area is the experimental validation of the heat loss calculation method using, as an example, the pool of Dnipropetrovsk National University of Railway Transport named after Academician V. Lazaryan. The outdoor pool, with water heated from the university boiler room, is operated year-round.

  1. Development of a Mantle Convection Physical Model to Assist with Teaching about Earth's Interior Processes

    Science.gov (United States)

    Glesener, G. B.; Aurnou, J. M.

    2010-12-01

    The Modeling and Educational Demonstrations Laboratory (MEDL) at UCLA is developing a mantle convection physical model to assist educators with the pedagogy of Earth’s interior processes. Our design goal consists of two components to help the learner gain conceptual understanding by means of visual interactions without the burden of distracters, which may promote alternative conceptions. Distracters may be any feature of the conceptual model that causes the learner to use inadequate mental artifact to help him or her understand what the conceptual model is intended to convey. The first component, and most important, is a psychological component that links properties of “everyday things” (Norman, 1988) to the natural phenomenon, mantle convection. Some examples of everyday things may be heat rising out from a freshly popped bag of popcorn, or cold humid air falling from an open freezer. The second component is the scientific accuracy of the conceptual model. We would like to simplify the concepts for the learner without sacrificing key information that is linked to other natural phenomena the learner will come across in future science lessons. By taking into account the learner’s mental artifacts in combination with a simplified, but accurate, representation of what scientists know of the Earth’s interior, we expect the learner to have the ability to create an adequate qualitative mental simulation of mantle convection. We will be presenting some of our prototypes of this mantle convection physical model at this year’s poster session and invite constructive input from our colleagues.

  2. Development and Performance of a Highly Sensitive Model Formulation Based on Torasemide to Enhance Hot-Melt Extrusion Process Understanding and Process Development.

    Science.gov (United States)

    Evans, Rachel C; Kyeremateng, Samuel O; Asmus, Lutz; Degenhardt, Matthias; Rosenberg, Joerg; Wagner, Karl G

    2018-02-27

    The aim of this work was to investigate the use of torasemide as a highly sensitive indicator substance and to develop a formulation thereof for establishing quantitative relationships between hot-melt extrusion process conditions and critical quality attributes (CQAs). Using solid-state characterization techniques and a 10 mm lab-scale co-rotating twin-screw extruder, we studied torasemide in a Soluplus® (SOL)-polyethylene glycol 1500 (PEG 1500) matrix, and developed and characterized a formulation which was used as a process indicator to study thermal- and hydrolysis-induced degradation, as well as residual crystallinity. We found that torasemide first dissolved into the matrix and then degraded. Based on this mechanism, extrudates with measurable levels of degradation and residual crystallinity were produced, depending strongly on the main barrel and die temperature and residence time applied. In addition, we found that 10% w/w PEG 1500 as plasticizer resulted in the widest operating space with the widest range of measurable residual crystallinity and degradant levels. Torasemide as an indicator substance behaves like a challenging-to-process API, only with higher sensitivity and more pronounced effects, e.g., degradation and residual crystallinity. Application of a model formulation containing torasemide will enhance the understanding of the dynamic environment inside an extruder and elucidate the cumulative thermal and hydrolysis effects of the extrusion process. The use of such a formulation will also facilitate rational process development and scaling by establishing clear links between process conditions and CQAs.

  3. A Case for Declarative Process Modelling: Agile Development of a Grant Application System

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2014-01-01

    We present a new declarative model with composition and hierarchical definition of processes, featuring (a) incremental refinement, (b) adaptation of processes, and (c) dynamic creation of sub-processes. The approach is motivated and exemplified by a recent case management solution delivered by our...... (complex) events, which dynamically instantiate sub-processes. The extensions are realised and supported by a prototype simulation tool....

  4. Study of alternative strategies to the task clarification activity of the market-pull product development process model

    OpenAIRE

    Motte, Damien

    2009-01-01

    A very large majority of the current product development process models put forward in textbooks present a homogenous structure, what Ulrich & Eppinger [1] call the market-pull model, presented as a generic one, while other possible product development process models are merely seen as variants. This paper focuses on the task clarification and derived activities (mainly the systematic search for customer needs through market study and the supplementary development costs it entails) and in...

  5. Role of the national energy system modelling in the process of the policy development

    Directory of Open Access Journals (Sweden)

    Merse Stane

    2012-01-01

    Full Text Available Strategic planning and decision making, not least the making of energy policies and strategies, is a very extensive process that has to pursue multiple and often contradictory objectives. During the preparation of the new Slovenian Energy Programme proposal, a complete update of the technology- and sector-oriented bottom-up model of the Reference Energy and Environmental System of Slovenia (REES-SLO) was carried out. During the redevelopment of the REES-SLO model, a trade-off was made between the simulation and optimisation approaches, favouring the presentation of relations between controls and their effects rather than the elusive optimality of results, which can be misleading for small energy systems. Scenario-based planning was integrated into the MESAP (Modular Energy System Analysis and Planning) environment, allowing the integration of past, present and planned (calculated) data in a comprehensive overall system. Within the paper, the main technical, economic and environmental characteristics of the Slovenian energy system model REES-SLO are described. This paper presents a new approach to modelling relatively small energy systems which goes beyond investment in particular technologies or categories of technology and allows a smooth transition to a low-carbon economy. The presented research confirms that the transition from an environmentally unfriendly, fossil-fuelled economy to sustainable and climate-friendly development requires a new approach, one based on excellent knowledge of alternative development possibilities and, especially, awareness of new opportunities in exploiting energy efficiency and renewable energy sources.

  6. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technology such as selective laser melting, SLM, there are gaps in process modeling and material...

  7. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technology such as selective laser melting, SLM, there are gaps in process modeling and material...

  8. Development and application of a processing model for the Irish dairy industry.

    Science.gov (United States)

    Geary, U; Lopez-Villalobos, N; Garrick, D J; Shalloo, L

    2010-11-01

    A processing-sector model was developed that simulates (i) milk collection, (ii) standardization, and (iii) product manufacture. The model estimates the product yield, net milk value, and component values of milk based on milk quantity, composition, product portfolio, and product values. Product specifications of cheese, butter, skim and whole milk powders, liquid milk, and casein are met through milk separation followed by reconstitution in appropriate proportions. Excess cream or skim milk are used in other product manufacture. Volume-related costs, including milk collection, standardization, and processing costs, and product-related costs, including processing costs per tonne, packaging, storage, distribution, and marketing, are quantified. Operating costs, incurred irrespective of milk received and processing activities, are included in the model on a fixed-rate basis. The net milk value is estimated as sale value less total costs. The component values of fat and protein were estimated from net milk value using the marginal rate of technical substitution. Two product portfolio scenarios were examined: scenario 1 was representative of the Irish product mix in 2000, in which 27, 39, 13, and 21% of the milk pool was processed into cheese (€ 3,291.33/t), butter (€ 2,766.33/t), whole milk powder (€ 2,453.33/t), and skim milk powder (€ 2,017.00/t), respectively, and scenario 2 was representative of the 2008 product mix, in which 43, 30, 14, and 13% was processed into cheese, butter, whole milk powder, and skim milk powder, respectively, and sold at the same market prices. Within both scenarios 3 milk compositions were considered, which were representative of (i) typical Irish Holstein-Friesian, (ii) Jersey, and (iii) the New Zealand strain of Holstein-Friesian, each of which had differing milk constituents. The effect each milk composition had on product yield, processing costs, total revenue, component values of milk, and the net value of milk was examined
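
The sketch below illustrates the net-milk-value structure described above (revenue less volume-related, product-related, and fixed costs). The product prices are those quoted in the abstract for scenario 1, but the yields and cost figures are invented placeholders rather than the published model's parameters.

```python
# Simplified net-milk-value sketch; yields and costs are hypothetical.
prices = {"cheese": 3291.33, "butter": 2766.33, "wmp": 2453.33, "smp": 2017.00}   # EUR/t
portfolio = {"cheese": 0.27, "butter": 0.39, "wmp": 0.13, "smp": 0.21}            # scenario 1 shares
yield_t_per_kl = {"cheese": 0.095, "butter": 0.045, "wmp": 0.115, "smp": 0.085}   # assumed yields

def net_milk_value(milk_kl, volume_cost_per_kl=18.0, product_cost_per_t=250.0,
                   fixed_costs=2.0e6):
    revenue = cost = 0.0
    for product, share in portfolio.items():
        tonnes = milk_kl * share * yield_t_per_kl[product]    # product output
        revenue += tonnes * prices[product]
        cost += tonnes * product_cost_per_t                   # packaging, storage, ...
    cost += milk_kl * volume_cost_per_kl + fixed_costs        # collection/processing + fixed
    return revenue - cost

print(net_milk_value(500_000))   # net value of a hypothetical 500,000 kL milk pool
```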

  9. Model Development and Process Analysis for Lean Cellular Design Planning in Aerospace Assembly and Manufacturing

    Science.gov (United States)

    Hilburn, Monty D.

    Successful lean manufacturing and cellular manufacturing execution relies upon a foundation of leadership commitment and strategic planning built upon solid data and robust analysis. The problem for this study was to create and employ a simple lean transformation planning model and review process that could be used to identify the functional support staff resources required to plan and execute lean manufacturing cells within aerospace assembly and manufacturing sites. The lean planning model was developed using the available literature on lean manufacturing kaizen best practices and validated through a Delphi panel of lean experts. The resulting model and a standardized review process were used to assess the state of lean transformation planning at five sites of an international aerospace manufacturing and assembly company. The results of the three-day, on-site reviews were compared with baseline plans collected from each of the five sites to determine whether differences existed. The data were analyzed with a focus on three critical areas of lean planning: the number and type of manufacturing cells identified; the number, type, and duration of planned lean and continuous kaizen events; and the quantity and type of functional staffing resources planned to support the kaizen schedule. Summarized data from the baseline and on-site reviews were analyzed with descriptive statistics. ANOVAs and paired t-tests at the 95% significance level were conducted on the means of the data sets to determine if null hypotheses related to cell, kaizen event, and support resources could be rejected. The results of the research found significant differences between lean transformation plans developed by site leadership and plans developed utilizing the structured, on-site review process and lean transformation planning model. The null hypothesis that there was no difference between the means of pre-review and on-site cell counts was rejected, as was the null hypothesis that there was no significant difference in kaizen event plans. These
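
A generic sketch of the paired comparison described above (a paired t-test at the 95% significance level on baseline versus on-site planning figures); the counts are invented placeholders, not the study's data.

```python
# Paired t-test on hypothetical pre-review vs. on-site cell counts.
from scipy import stats

baseline_cells = [12, 9, 15, 11, 8]     # cells in each site's own plan (assumed)
onsite_cells   = [18, 14, 19, 16, 13]   # cells from the structured review (assumed)

t_stat, p_value = stats.ttest_rel(baseline_cells, onsite_cells)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: plans differ between baseline and structured review.")
```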

  10. Letter Report: Progress in developing EQ3/6 for modeling boiling processes

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T. J., LLNL

    1995-08-28

    EQ3/6 is a software package for geochemical modeling of aqueous systems, such as water/rock or waste/water rock. It is being developed for a variety of applications in geochemical studies for the Yucca Mountain Site Characterization Project. The present focus is on development of capabilities to be used in studies of geochemical processes which will take place in the near-field environment and the altered zone of the potential repository. We have completed the first year of a planned two-year effort to develop capabilities for modeling boiling processes. These capabilities will interface with other existing and future modeling capabilities to provide a means of integrating the effects of various kinds of geochemical processes in complex systems. This year, the software has been modified to allow the formation of a generalized gas phase in a closed system for which the temperature and pressure are known (but not necessarily constant). The gas phase forms when its formation is thermodynamically favored; that is, when the system pressure is equal to the sum of the partial pressures of the gas species as computed from their equilibrium fugacities. It disappears when this sum falls below that pressure. 'Boiling' is the special case in which the gas phase which forms consists mostly of water vapor. The reverse process is then 'condensation.' To support calculations of boiling and condensation, we have added a capability to calculate the fugacity coefficients of gas species in the system H2O-CO2-CH4-H2-O2-N2-H2S-NH3. This capability at present is accurate only at relatively low pressures, but is adequate for all likely repository boiling conditions. We have also modified the software to calculate changes in enthalpy (heat) and volume functions. Next year we will be extending the boiling capability to calculate the pressure or the temperature at known enthalpy. We will also add an option for open system boiling.
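
The gas-phase criterion described above reduces to a simple check: a bulk gas phase is favoured when the sum of the equilibrium partial pressures of the gas species reaches the system pressure, and it disappears when that sum falls below it. The toy sketch below illustrates only that check, with invented partial pressures and ideal-gas behaviour assumed; the actual code computes fugacity coefficients for the multicomponent gas system listed in the record.

```python
# Toy gas-phase appearance check; species partial pressures are illustrative
# numbers, not EQ3/6 output, and fugacity coefficients of 1 are assumed.
def gas_phase_present(partial_pressures_bar, system_pressure_bar):
    """Return True if a bulk gas phase is thermodynamically favoured."""
    return sum(partial_pressures_bar.values()) >= system_pressure_bar

# Near-boiling conditions: water vapour dominates the candidate gas phase.
p_eq = {"H2O": 0.96, "CO2": 0.03, "N2": 0.015, "H2S": 0.001}   # bar, assumed
print(gas_phase_present(p_eq, system_pressure_bar=1.0))         # True -> boiling
```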

  11. Model-based Rational and Systematic Protein Purification Process Development : A Knowledge-based Approach

    NARCIS (Netherlands)

    Kungah Nfor, B.

    2011-01-01

    The increasing market and regulatory (quality and safety) demands on therapeutic proteins calls for radical improvement in their manufacturing processes. Addressing these challenges requires the adoption of strategies and tools that enable faster and more efficient process development. This thesis

  12. Recent Developments in Multiscale and Multiphase Modelling of the Hydraulic Fracturing Process

    Directory of Open Access Journals (Sweden)

    Yong Sheng

    2015-01-01

    Full Text Available Recently hydraulic fracturing of rocks has received much attention not only for its economic importance but also for its potential environmental impact. The hydraulic fracturing technique has been widely used in the oil (EOR) and gas (EGR) industries, especially in the USA, to extract more oil/gas through the deep rock formations. There has also been increasing interest in utilising the hydraulic fracturing technique in geological storage of CO2 in recent years. In all cases, the design and implementation of the hydraulic fracturing process play a central role, highlighting the significance of research and development of this technique. However, the uncertainty behind the fracking mechanism has triggered public debates regarding the possible effect of this technique on human health and the environment. This has presented new challenges in the study of the hydraulic fracturing process. This paper describes the hydraulic fracturing mechanism and provides an overview of past and recent developments of the research performed towards a better understanding of hydraulic fracturing and its potential impacts, with particular emphasis on the development of modelling techniques and their implementation in hydraulic fracturing.

  13. I. WORKING MEMORY CAPACITY IN CONTEXT: MODELING DYNAMIC PROCESSES OF BEHAVIOR, MEMORY, AND DEVELOPMENT.

    Science.gov (United States)

    Simmering, Vanessa R

    2016-09-01

    Working memory is a vital cognitive skill that underlies a broad range of behaviors. Higher cognitive functions are reliably predicted by working memory measures from two domains: children's performance on complex span tasks, and infants' performance in looking paradigms. Despite the similar predictive power across these research areas, theories of working memory development have not connected these different task types and developmental periods. The current project takes a first step toward bridging this gap by presenting a process-oriented theory, focusing on two tasks designed to assess visual working memory capacity in infants (the change-preference task) versus children and adults (the change detection task). Previous studies have shown inconsistent results, with capacity estimates increasing from one to four items during infancy, but only two to three items during early childhood. A probable source of this discrepancy is the different task structures used with each age group, but prior theories were not sufficiently specific to explain how performance relates across tasks. The current theory focuses on cognitive dynamics, that is, how memory representations are formed, maintained, and used within specific task contexts over development. This theory was formalized in a computational model to generate three predictions: 1) capacity estimates in the change-preference task should continue to increase beyond infancy; 2) capacity estimates should be higher in the change-preference versus change detection task when tested within individuals; and 3) performance should correlate across tasks because both rely on the same underlying memory system. I also tested a fourth prediction, that development across tasks could be explained through increasing real-time stability, realized computationally as strengthening connectivity within the model. Results confirmed these predictions, supporting the cognitive dynamics account of performance and developmental changes in real

  14. Statistical metrology - measurement and modeling of variation for advanced process development and design rule generation

    International Nuclear Information System (INIS)

    Boning, Duane S.; Chung, James E.

    1998-01-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including the design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of 'dummy fill' or 'metal fill' to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal
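    As an illustration of the statistical decomposition step mentioned above, the sketch below separates simulated thickness measurements into a die-to-die (wafer-level) term, a within-die systematic (pattern-dependent) profile, and a random residual. The array shapes, pattern and noise levels are assumed; this is not the authors' decomposition procedure.

```python
# Minimal sketch: decompose thickness data arranged as (n_dies, n_sites_per_die)
# into die-to-die, within-die systematic, and random components. Data are simulated.
import numpy as np

rng = np.random.default_rng(0)
n_dies, n_sites = 20, 50
pattern = np.linspace(-30.0, 30.0, n_sites)                              # nm, assumed systematic profile
thickness = 800.0 + pattern + rng.normal(0.0, 10.0, (n_dies, n_sites))   # nm, simulated measurements

die_mean = thickness.mean(axis=1, keepdims=True)        # die-to-die (wafer-level) component
within_die = (thickness - die_mean).mean(axis=0)        # estimated systematic within-die pattern
residual = thickness - die_mean - within_die            # remaining random component

print("die-to-die std      [nm]:", round(float(die_mean.std()), 1))
print("systematic range    [nm]:", round(float(np.ptp(within_die)), 1))
print("random residual std [nm]:", round(float(residual.std()), 1))
```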

  15. A process model in continuing professional development: Exploring diagnostic radiographers' views

    Energy Technology Data Exchange (ETDEWEB)

    Henwood, Suzanne M. [Henwood Associates (South East) Ltd, Coaching and Training, 38 Tudor Crescent, Otford, TN14 5QT, Sevenoaks, Kent (United Kingdom)], E-mail: henwoodassociates@btinternet.com; Taket, Ann [Centre for Health through Action on Social Exclusion (CHASE), School of Health and Social Development, Faculty of Health and Behavioural Sciences, Deakin University, 221 Burwood Highway, Burwood, Vic 3125 (Australia)], E-mail: ann.taket@deakin.edu.au

    2008-08-15

    This article is based on an exploratory, interpretative grounded theory study that looked at practitioners' perceptions of continuing professional development (CPD) in diagnostic radiography in the UK. Using a combination of in-depth interviews and secondary analysis of published material, a dynamic CPD process model was generated. The study aimed to explore what radiographers understood by the term CPD and whether it was perceived to have any impact on clinical practice. The study aimed to identify and investigate the components of CPD and how they interact with one another, to help to explain what is happening within CPD and what contributes to its effectiveness. The CPD process was shown to be complex, dynamic and centred on the Individual. Supporting components of Facilitation and External Influences were identified as important in maximising the potential impact of CPD. The three main categories were shown to interact dynamically and prior to Participation were shown to have a 'superadditive' effect, where the total effect was greater than the sum of the three individual parts. This study showed that radiographers are generally unaware of the holistic concept of CPD, using instead narrow definitions of CPD with little or no expectation of any impact on practice, focusing predominantly on personal gain. The model produced in the study provided a tool that practitioners reported was helpful in reflecting on their own involvement in CPD.

  16. A process model in continuing professional development: Exploring diagnostic radiographers' views

    International Nuclear Information System (INIS)

    Henwood, Suzanne M.; Taket, Ann

    2008-01-01

    This article is based on an exploratory, interpretative grounded theory study that looked at practitioners' perceptions of continuing professional development (CPD) in diagnostic radiography in the UK. Using a combination of in-depth interviews and secondary analysis of published material, a dynamic CPD process model was generated. The study aimed to explore what radiographers understood by the term CPD and whether it was perceived to have any impact on clinical practice. The study aimed to identify and investigate the components of CPD and how they interact with one another, to help to explain what is happening within CPD and what contributes to its effectiveness. The CPD process was shown to be complex, dynamic and centred on the Individual. Supporting components of Facilitation and External Influences were identified as important in maximising the potential impact of CPD. The three main categories were shown to interact dynamically and prior to Participation were shown to have a 'superadditive' effect, where the total effect was greater than the sum of the three individual parts. This study showed that radiographers are generally unaware of the holistic concept of CPD, using instead narrow definitions of CPD with little or no expectation of any impact on practice, focusing predominantly on personal gain. The model produced in the study provided a tool that practitioners reported was helpful in reflecting on their own involvement in CPD

  17. Derivative Process Model of Development Power in Industry: Empirical Research and Forecast for Chinese Software Industry and US Economy

    OpenAIRE

    Feng Dai; Bao-hua Sun; Jie Sun

    2004-01-01

    Based on the concept and theory of Development Power [1], this paper analyzes the transferability and the diffusibility of industrial development power, points out that chaos is the extreme of DP release and order is the highest degree of DP accumulation, puts forward A-C strength, an index of adjusting and controlling strength, and sets up the derivative process model for industrial development power based on the Partial Distribution [2]-[4]. By means of the derivative process model, a kind of time seri...

  18. Development and evaluation of spatial point process models for epidermal nerve fibers.

    Science.gov (United States)

    Olsbo, Viktor; Myllymäki, Mari; Waller, Lance A; Särkkä, Aila

    2013-06-01

    We propose two spatial point process models for the spatial structure of epidermal nerve fibers (ENFs) across human skin. The models derive from two point processes, Φb and Φe, describing the locations of the base and end points of the fibers. Each point of Φe (the end point process) is connected to a unique point in Φb (the base point process). In the first model, both Φe and Φb are Poisson processes, yielding a null model of uniform coverage of the skin by end points and general baseline results and reference values for moments of key physiologic indicators. The second model provides a mechanistic model to generate end points for each base, and we model the branching structure more directly by defining Φe as a cluster process conditioned on the realization of Φb as its parent points. In both cases, we derive distributional properties for observable quantities of direct interest to neurologists such as the number of fibers per base, and the direction and range of fibers on the skin. We contrast both models by fitting them to data from skin blister biopsy images of ENFs and provide inference regarding physiological properties of ENFs. Copyright © 2013 Elsevier Inc. All rights reserved.
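    A minimal simulation sketch of the two constructions described above is given below: a homogeneous Poisson process for the base points, and a cluster-type construction that attaches end points to each base (the bases acting as parent points). The intensity, mean number of fibers per base and displacement spread are assumed values, not the fitted ENF parameters from the paper.

```python
# Minimal sketch of (1) a homogeneous Poisson base-point process on a unit square
# and (2) end points generated per base point (cluster construction). All
# parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def poisson_points(intensity, area=1.0):
    """Homogeneous Poisson process on the unit square."""
    n = rng.poisson(intensity * area)
    return rng.uniform(0.0, 1.0, size=(n, 2))

def end_points_for_bases(bases, mean_fibers=2.0, spread=0.03):
    """Each base gets Poisson(mean_fibers) end points, displaced with Gaussian spread."""
    ends, parents = [], []
    for i, b in enumerate(bases):
        k = rng.poisson(mean_fibers)
        ends.append(b + rng.normal(0.0, spread, size=(k, 2)))
        parents.extend([i] * k)
    return (np.vstack(ends) if ends else np.empty((0, 2))), np.array(parents)

bases = poisson_points(intensity=100)          # base point process (Phi_b)
ends, parents = end_points_for_bases(bases)    # end point process (Phi_e)
print(len(bases), "bases,", len(ends), "end points,",
      "mean fibers per base:", round(len(ends) / max(len(bases), 1), 2))
```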

  19. Simulation and Development of Internal Model Control Applications in the Bayer Process

    Science.gov (United States)

    Colombé, Ph.; Dablainville, R.; Vacarisas, J.

    Traditional PID feedback control systems are of limited use in the Bayer cycle due to the important and omnipresent time delays, which can lead to stability problems and sluggish response. Advanced modern control techniques are available, but suffer in an industrial environment from a lack of simplicity and robustness. In this respect the Internal Model Control (IMC) method may be considered an exception. After a brief review of the basic theoretical principles behind IMC, an IMC scheme is developed to work with single-input, single-output, discrete-time, nonlinear systems. Two applications of IMC in the Bayer process, both in simulations and on industrial plants, are then described: control of the caustic soda concentration of the aluminate liquor and control of the Al2O3/Na2O caustic ratio of the digested slurry. Finally, the results obtained make this technique quite attractive for the alumina industry.
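    A minimal discrete-time, single-input single-output IMC loop along the lines sketched above can be written in a few lines: an internal model runs in parallel with the (here simulated) plant, the model-plant mismatch is fed back, and a first-order filter sets the trade-off between speed and robustness. The first-order dynamics, dead time, filter pole and setpoint are illustrative assumptions, not the Bayer-plant models from the paper.

```python
# Minimal IMC sketch: first-order "true" plant with transport delay, a slightly
# mismatched first-order internal model, and a first-order IMC filter. All
# numbers are assumed for illustration only.
import numpy as np

a_p, b_p, delay = 0.90, 0.12, 5      # assumed plant: y[k+1] = a_p*y[k] + b_p*u[k-delay]
a_m, b_m = 0.88, 0.12                # assumed internal model (no delay, small mismatch)
lam = 0.85                           # IMC filter pole (robustness vs speed)

n = 200
r = np.ones(n)                                   # unit setpoint
y = np.zeros(n); y_m = np.zeros(n); u = np.zeros(n)
y_f = 0.0                                        # filtered target for the model output

for k in range(n - 1):
    d_hat = y[k] - y_m[k]                          # estimated disturbance / model mismatch
    y_f = lam * y_f + (1 - lam) * (r[k] - d_hat)   # IMC filter on (setpoint - mismatch)
    u[k] = (y_f - a_m * y_m[k]) / b_m              # model inverse: drive the model to y_f
    y_m[k + 1] = a_m * y_m[k] + b_m * u[k]         # internal model update
    u_delayed = u[k - delay] if k >= delay else 0.0
    y[k + 1] = a_p * y[k] + b_p * u_delayed        # "true" plant update

print("final output:", round(float(y[-1]), 3), "(setpoint 1.0)")
```

    Despite the gain mismatch and the unmodelled dead time, the mismatch feedback gives offset-free tracking at steady state, which is the property that makes IMC attractive for delay-dominated loops of this kind.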

  20. Developing a Comprehensive Model of Intensive Care Unit Processes: Concept of Operations.

    Science.gov (United States)

    Romig, Mark; Tropello, Steven P; Dwyer, Cindy; Wyskiel, Rhonda M; Ravitz, Alan; Benson, John; Gropper, Michael A; Pronovost, Peter J; Sapirstein, Adam

    2015-04-23

    This study aimed to use a systems engineering approach to improve performance and stakeholder engagement in the intensive care unit to reduce several different patient harms. We developed a conceptual framework or concept of operations (ConOps) to analyze different types of harm that included 4 steps as follows: risk assessment, appropriate therapies, monitoring and feedback, as well as patient and family communications. This framework used a transdisciplinary approach to inventory the tasks and work flows required to eliminate 7 common types of harm experienced by patients in the intensive care unit. The inventory gathered both implicit and explicit information about how the system works or should work and converted the information into a detailed specification that clinicians could understand and use. Using the ConOps document, we created highly detailed work flow models to reduce harm and offer an example of its application to deep venous thrombosis. In the deep venous thrombosis model, we identified tasks that were synergistic across different types of harm. We will use a system of systems approach to integrate the variety of subsystems and coordinate processes across multiple types of harm to reduce the duplication of tasks. Through this process, we expect to improve efficiency and demonstrate synergistic interactions that ultimately can be applied across the spectrum of potential patient harms and patient locations. Engineering health care to be highly reliable will first require an understanding of the processes and work flows that comprise patient care. The ConOps strategy provided a framework for building complex systems to reduce patient harm.

  1. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    , by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction...

  2. Mathematical Optimal Sequence Model Development to Process Planes and Other Interconnected Surfaces of Complex Body Parts

    Directory of Open Access Journals (Sweden)

    I. I. Kravchenko

    2016-01-01

    Full Text Available Experience in applying multi-operational CNC machines (MOM CNC) shows that they are efficient only when they significantly increase productivity and dramatically reduce the time-to-market cycle of new products. The full technological capabilities of MOM are revealed most completely when processing complex body parts. The more complex the part design and the greater its number of machined surfaces, the more tools are necessary for its processing and positioning, and the greater the efficiency of their application. At the same time, the history of using these machines in industry shows that MOM CNC are, in practice, used mostly to run technological processes designed for universal equipment, which is absolutely unacceptable. One way to improve processing performance on MOM CNC is to reduce nonproductive machine time by reducing the mutual idle movements of the working machine. This problem is solved using dynamic programming methods, one of which is the solution of the traveling salesman problem (Bellman's method). With a known plan for treatment of all elementary surfaces of the body part, i.e. a known number of performed transitions, each transition is represented as a vertex of a graph, while the technological links between the vertices are its edges. A mathematical model is developed on the Bellman principle, adapted to technological tasks so as to minimize the time lost to mutual idle movements of the working machine when performing all transitions in the optimal sequence. The initial data used to fill the matrix of time expenditures are the times consumed by the machine after executing the i-th transition and required before the j-th transition can begin. The programmer fills in the matrix cells according to the known routing of the body part, taking into account the time for part and table positioning, tool exchange, spindle and table approach to the working zone, the time of table rotation, etc. The mathematical model was tested when machining a body part with 36 transitions on the
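    The dynamic programming formulation described above can be sketched as a small Held-Karp (Bellman) recursion over a matrix of idle-time expenditures t[i][j], i.e. the time lost between finishing transition i and starting transition j. The 5x5 matrix and the choice of starting transition below are invented for illustration; a real part would have one row and column per machining transition.

```python
# Minimal Bellman/Held-Karp sketch: find the transition sequence minimising
# total idle-movement time. The matrix t is an illustrative placeholder.
from itertools import combinations

t = [
    [0, 7, 4, 6, 9],
    [7, 0, 3, 8, 5],
    [4, 3, 0, 2, 6],
    [6, 8, 2, 0, 3],
    [9, 5, 6, 3, 0],
]
n = len(t)
start = 0  # assume the sequence begins with transition 0

# best[(S, j)] = minimal idle time to perform the set S of transitions,
# starting at `start` and currently ending at transition j (j in S).
best = {(frozenset([start]), start): 0}
parent = {}
for size in range(2, n + 1):
    for subset in combinations(range(n), size):
        if start not in subset:
            continue
        S = frozenset(subset)
        for j in subset:
            if j == start:
                continue
            prev_S = S - {j}
            cands = [(best[(prev_S, k)] + t[k][j], k)
                     for k in prev_S if (prev_S, k) in best]
            best[(S, j)], parent[(S, j)] = min(cands)

full = frozenset(range(n))
cost, last = min((best[(full, j)], j) for j in range(n) if j != start)

# reconstruct the optimal transition sequence
order, S = [last], full
while last != start:
    prev = parent[(S, last)]
    order.append(prev)
    S, last = S - {order[-2]}, prev
order.reverse()
print("optimal sequence:", order, "total idle time:", cost)
```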

  3. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  4. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  5. Extension of internationalisation models drivers and processes for the globalisation of product development

    DEFF Research Database (Denmark)

    Søndergaard, Erik Stefan; Oehmen, Josef; Ahmed-Kristensen, Saeema

    2016-01-01

    of product development and collaborative distributed development beyond sourcing, sales and production elements. The paper then provides propositions for how to further develop the suggested model, and how western companies can learn from the Chinese approaches, and globalise their product development...

  6. Business process model repositories : efficient process retrieval

    NARCIS (Netherlands)

    Yan, Z.

    2012-01-01

    As organizations increasingly work in a process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a

  7. Development of pure component property models for chemical product-process design and analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol Shivajirao

    information on the degree of accuracy of the property estimates. In addition, a method based on the ‘molecular structural similarity criteria’ is developed so that efficient use of knowledge of properties could be made in the development/improvement of property models. This method, in principle, can...... modeling such as: (i) quantity of property data used for the parameter regression; (ii) selection of the most appropriate form of the property model function; and (iii) the accuracy and thermodynamic consistency of predicted property values are also discussed. The developed models have been implemented...

  8. Advanced autonomous model-based operation of industrial process systems (Autoprofit) : technological developments and future perspectives

    NARCIS (Netherlands)

    Ozkan, L.; Bombois, X.J.A.; Ludlage, J.H.A.; Rojas, C.R.; Hjalmarsson, H.; Moden, P.E.; Lundh, M.; Backx, A.C.P.M.; Van den Hof, P.M.J.

    2016-01-01

    Model-based operation support technology such as Model Predictive Control (MPC) is a proven and accepted technology for multivariable and constrained large scale control problems in process industry. Despite the growing number of successful implementations, the low level of operational efficiency of

  9. A Linked Model for Simulating Stand Development and Growth Processes of Loblolly Pine

    Science.gov (United States)

    V. Clark Baldwin; Phillip M. Dougherty; Harold E. Burkhart

    1998-01-01

    Linking models of different scales (e.g., process, tree-stand-ecosystem) is essential for furthering our understanding of stand, climatic, and edaphic effects on tree growth and forest productivity. Moreover, linking existing models that differ in scale and levels of resolution quickly identifies knowledge gaps in information required to scale from one level to another...

  10. Development of a model describing virus removal process in an activated sludge basin

    Energy Technology Data Exchange (ETDEWEB)

    Kim, T.; Shiragami, N.; Unno, H. [Tokyo Institute of Technology, Tokyo (Japan)]

    1995-06-20

    The virus removal process from the liquid phase in an activated sludge basin possibly consists of physicochemical processes, such as adsorption onto sludge flocs, and biological processes, such as microbial predation and inactivation by virucidal components excreted by microbes. To describe properly the virus behavior in an activated sludge basin, a simple model is proposed based on the experimental data obtained using poliovirus type 1. A three-compartment model, which includes the virus in the liquid phase and in the peripheral and inner regions of sludge flocs, is employed. By using the model, the virus removal process was successfully simulated to highlight the implication of its distribution in the activated sludge basin. 17 refs., 8 figs.
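    A minimal sketch of such a three-compartment balance is shown below, with first-order transfer between the liquid phase, the peripheral floc region and the inner floc region, plus first-order inactivation. The rate constants and the 24 h horizon are assumed for illustration and are not the values fitted in the study.

```python
# Minimal three-compartment sketch: liquid phase (L), peripheral floc region (P)
# and inner floc region (I), with assumed first-order rate constants.
from scipy.integrate import solve_ivp

k_ads   = 0.8   # 1/h, liquid -> floc periphery (adsorption), assumed
k_des   = 0.1   # 1/h, periphery -> liquid (desorption), assumed
k_in    = 0.3   # 1/h, periphery -> inner floc region, assumed
k_inact = 0.2   # 1/h, inactivation in the floc (predation, virucidal excretions), assumed

def rhs(t, y):
    L, P, I = y
    dL = -k_ads * L + k_des * P
    dP =  k_ads * L - (k_des + k_in + k_inact) * P
    dI =  k_in * P - k_inact * I
    return [dL, dP, dI]

sol = solve_ivp(rhs, (0.0, 24.0), y0=[1.0, 0.0, 0.0], dense_output=True)
for t in (0, 6, 12, 24):
    L, P, I = sol.sol(t)
    print(f"t={t:4.1f} h  liquid={L:.3f}  periphery={P:.3f}  inner={I:.3f}")
```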

  11. A Model for the Development of Hospital Beds Using Fuzzy Analytical Hierarchy Process (Fuzzy AHP).

    Science.gov (United States)

    Ravangard, Ramin; Bahadori, Mohammadkarim; Raadabadi, Mehdi; Teymourzadeh, Ehsan; Alimomohammadzadeh, Khalil; Mehrabian, Fardin

    2017-11-01

    This study aimed to identify and prioritize factors affecting the development of military hospital beds and provide a model using the fuzzy analytical hierarchy process (Fuzzy AHP). This applied study was conducted in 2016 in Iran using a mixed method. The sample included experts in the field of the military health care system. The MAXQDA 10.0 and Expert Choice 10.0 software were used for analyzing the collected data. Geographic situation, demographic status, economic status, health status, health care centers and organizations, financial and human resources, laws, regulations and by-laws, and the military nature of service recipients had effects on the development of military hospital beds. The military nature of service recipients (S=0.249) and economic status (S=0.040) received the highest and lowest priorities, respectively. Providing direct health care services to the military forces in order to maintain their dignity, given its effects in times of crisis, as well as the necessity of maintaining the security of the armed forces, and the provision of hospital beds per capita based on the existing laws, regulations and by-laws, are of utmost importance.

  12. Development of mathematical models for automation of strength calculation during plastic deformation processing

    Science.gov (United States)

    Steposhina, S. V.; Fedonin, O. N.

    2018-03-01

    Dependencies that make it possible to automate the force calculation during surface plastic deformation (SPD) processing and, thus, to shorten the time for technological preparation of production have been developed.

  13. Aggression and Moral Development: Integrating Social Information Processing and Moral Domain Models

    Science.gov (United States)

    Arsenio, William F.; Lemerise, Elizabeth A.

    2004-01-01

    Social information processing and moral domain theories have developed in relative isolation from each other despite their common focus on intentional harm and victimization, and mutual emphasis on social cognitive processes in explaining aggressive, morally relevant behaviors. This article presents a selective summary of these literatures with…

  14. Enhanced Geothermal Systems Research and Development: Models of Subsurface Chemical Processes Affecting Fluid Flow

    Energy Technology Data Exchange (ETDEWEB)

    Moller, Nancy; Weare, J. H.

    2008-05-29

    Successful exploitation of the vast amount of heat stored beneath the earth’s surface in hydrothermal and fluid-limited, low permeability geothermal resources would greatly expand the Nation’s domestic energy inventory and thereby promote a more secure energy supply, a stronger economy and a cleaner environment. However, a major factor limiting the expanded development of current hydrothermal resources as well as the production of enhanced geothermal systems (EGS) is insufficient knowledge about the chemical processes controlling subsurface fluid flow. With funding from past grants from the DOE geothermal program and other agencies, we successfully developed advanced equation of state (EOS) and simulation technologies that accurately describe the chemistry of geothermal reservoirs and energy production processes via their free energies for wide XTP ranges. Using the specific interaction equations of Pitzer, we showed that our TEQUIL chemical models can correctly simulate behavior (e.g., mineral scaling and saturation ratios, gas break out, brine mixing effects, down hole temperatures and fluid chemical composition, spent brine incompatibilities) within the compositional range (Na-K-Ca-Cl-SO4-CO3-H2O-SiO2-CO2(g)) and temperature range (T < 350°C) associated with many current geothermal energy production sites that produce brines with temperatures below the critical point of water. The goal of research carried out under DOE grant DE-FG36-04GO14300 (10/1/2004-12/31/2007) was to expand the compositional range of our Pitzer-based TEQUIL fluid/rock interaction models to include the important aluminum and silica interactions (T < 350°C). Aluminum is the third most abundant element in the earth’s crust; and, as a constituent of aluminosilicate minerals, it is found in two thirds of the minerals in the earth’s crust. The ability to accurately characterize effects of temperature, fluid mixing and interactions between major rock-forming minerals and hydrothermal and

  15. A conceptual model for the development process of confirmatory adaptive clinical trials within an emergency research network.

    Science.gov (United States)

    Mawocha, Samkeliso C; Fetters, Michael D; Legocki, Laurie J; Guetterman, Timothy C; Frederiksen, Shirley; Barsan, William G; Lewis, Roger J; Berry, Donald A; Meurer, William J

    2017-06-01

    Adaptive clinical trials use accumulating data from enrolled subjects to alter trial conduct in pre-specified ways based on quantitative decision rules. In this research, we sought to characterize the perspectives of key stakeholders during the development process of confirmatory-phase adaptive clinical trials within an emergency clinical trials network and to build a model to guide future development of adaptive clinical trials. We used an ethnographic, qualitative approach to evaluate key stakeholders' views about the adaptive clinical trial development process. Stakeholders participated in a series of multidisciplinary meetings during the development of five adaptive clinical trials and completed a Strengths-Weaknesses-Opportunities-Threats questionnaire. In the analysis, we elucidated overarching themes across the stakeholders' responses to develop a conceptual model. Four major overarching themes emerged during the analysis of stakeholders' responses to questioning: the perceived statistical complexity of adaptive clinical trials and the roles of collaboration, communication, and time during the development process. Frequent and open communication and collaboration were viewed by stakeholders as critical during the development process, as were the careful management of time and logistical issues related to the complexity of planning adaptive clinical trials. The Adaptive Design Development Model illustrates how statistical complexity, time, communication, and collaboration are moderating factors in the adaptive design development process. The intensity and iterative nature of this process underscores the need for funding mechanisms for the development of novel trial proposals in academic settings.

  16. Development of thermodynamically-based models for simulation of hydrogeochemical processes coupled to channel flow processes in abandoned underground mines

    Energy Technology Data Exchange (ETDEWEB)

    Kruse, N.A., E-mail: natalie.kruse@ncl.ac.uk [Sir Joseph Swan Institute for Energy Research, Newcastle University, Newcastle upon Tyne NE1 7RU (United Kingdom); Younger, P.L. [Sir Joseph Swan Institute for Energy Research, Newcastle University, Newcastle upon Tyne NE1 7RU (United Kingdom)

    2009-07-15

    Accurate modeling of changing geochemistry in mine water can be an important tool in post-mining site management. The Pollutant Sources and Sinks in Underground Mines (POSSUM) model and Pollutant Loadings Above Average Pyrite Influenced Geochemistry POSSUM (PLAYING POSSUM) model were developed using object-oriented programming techniques to simulate changing geochemistry in abandoned underground mines over time. The conceptual model was created to avoid significant simplifying assumptions that decrease the accuracy and defensibility of model solutions. POSSUM and PLAYING POSSUM solve for changes in flow rate and depth of flow using a finite difference hydrodynamics model and then, subsequently, solve for geochemical changes at distinct points along the flow path. Geochemical changes are modeled based on a suite of 28 kinetically controlled mineral weathering reactions. Additional geochemical transformations due to reversible sorption, dissolution and precipitation of acid-generating salts, and mineral precipitation are also simulated using simplified expressions. Contaminant transport is simulated using a novel application of the Random-Walk method. By simulating hydrogeochemical changes with a physically and thermodynamically controlled model, the 'state of the art' in post-mining management can be advanced.
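    The random-walk transport idea mentioned above can be sketched in a few lines: each particle is advected by the local channel velocity and given a Gaussian step whose variance is set by the dispersion coefficient. The velocity, dispersion coefficient and release scenario below are illustrative assumptions, not POSSUM's calibrated inputs.

```python
# Minimal random-walk sketch of 1-D contaminant transport along a mine channel:
# advection by the mean velocity plus a Gaussian dispersion step per time step.
import numpy as np

rng = np.random.default_rng(42)
n_particles = 5000
u = 0.05          # m/s, mean channel velocity (assumed)
D = 0.5           # m^2/s, longitudinal dispersion coefficient (assumed)
dt = 10.0         # s
n_steps = 360     # one hour of simulated transport

x = np.zeros(n_particles)                 # instantaneous pulse release at x = 0
for _ in range(n_steps):
    x += u * dt + rng.normal(0.0, np.sqrt(2.0 * D * dt), n_particles)

print(f"plume centroid: {x.mean():7.1f} m (advective estimate {u*dt*n_steps:.1f} m)")
print(f"plume spread  : {x.std():7.1f} m (theory sqrt(2*D*t) = {np.sqrt(2*D*dt*n_steps):.1f} m)")
```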

  17. Conceptual process models and quantitative analysis of classification problems in Scrum software development practices

    NARCIS (Netherlands)

    Helwerda, L.S.; Niessink, F.; Verbeek, F.J.

    2017-01-01

    We propose a novel classification method that integrates into existing agile software development practices by collecting data records generated by software and tools used in the development process. We extract features from the collected data and create visualizations that provide insights,

  18. DEVELOPMENT OF SCIENCE PROCESS SKILLS STUDENTS WITH PROJECT BASED LEARNING MODEL-BASED TRAINING IN LEARNING PHYSICS

    Directory of Open Access Journals (Sweden)

    Ratna Malawati

    2016-06-01

    Full Text Available This study aims to improve students' physics Science Process Skills in the cognitive and psychomotor aspects by using a training-based Project Based Learning model. The object of this study is the Project Based Learning model used in the learning process of Computational Physics. The method used is classroom action research through two learning cycles, each cycle consisting of the stages of planning, implementation, observation and reflection. In the first cycle, the treatment emphasized training in the first through third phases of the Project Based Learning model, while in the second cycle additional treatment was given with an emphasis on discussion and collaboration to achieve the best product results for each group. The results of the data analysis showed an increase in students' cognitive abilities and in their Science Process Skills on the psychomotor aspect.

  19. Development of a process model for intelligent control of gas metal arc welding

    International Nuclear Information System (INIS)

    Smartt, H.B.; Johnson, J.A.; Einerson, C.J.; Watkins, A.D.; Carlson, N.M.

    1991-01-01

    This paper discusses work in progress on the development of an intelligent control scheme for arc welding. A set of four sensors is used to detect weld bead cooling rate, droplet transfer mode, weld pool and joint location and configuration, and weld defects during welding. A neural network is being developed as the bridge between the multiple sensor set and a conventional proportional-integral controller that provides independent control of process variables. This approach is being developed for the gas metal arc welding process. 20 refs., 8 figs

  20. Development and Testing of a Model for Simulation of Process Operators' During Emergencies in Nuclear Power Plants

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1986-01-01

    The paper describes a program for the development and testing of a model of cognitive processes intended for simulation of operator responses to plant disturbances. It will be a part of a computer program complex called DYLAM for automatic identification of accident scenarios to be included...... to develop this data base is proposed. The human element is introduced in the model by a perturbation function derived from human error modes. A program for testing the model is briefly mentioned....

  1. Development of numerical dispersion model for radioactive nuclei including resuspension processes

    International Nuclear Information System (INIS)

    Chiba, Masaru; Kurita, Susumu; Sasaki, Hidetaka

    2003-01-01

    Global-scale and local-scale dispersion models are developed, coupled to global- and local-scale meteorological forecasting models. By applying this system to another minor constituent, such as mineral dust blown by strong winds in arid regions, the system shows very good performance in monitoring and predicting its distribution. (author)

  2. Is there room for 'development' in developmental models of information processing biases to threat in children and adolescents?

    Science.gov (United States)

    Field, Andy P; Lester, Kathryn J

    2010-12-01

    Clinical and experimental theories assume that processing biases in attention and interpretation are a causal mechanism through which anxiety develops. Despite growing evidence that these processing biases are present in children and, therefore, develop long before adulthood, these theories ignore the potential role of child development. This review attempts to place information processing biases within a theoretical developmental framework. We consider whether child development has no impact on information processing biases to threat (integral bias model), or whether child development influences information processing biases and if so whether it does so by moderating the expression of an existing bias (moderation model) or by affecting the acquisition of a bias (acquisition model). We examine the extent to which these models fit with existing theory and research evidence and outline some methodological issues that need to be considered when drawing conclusions about the potential role of child development in the information processing of threat stimuli. Finally, we speculate about the developmental processes that might be important to consider in future research.

  3. Role of the national energy system modelling in the process of the policy development

    OpenAIRE

    Merse Stane; Urbancic Andreja; Sucic Boris; Pusnik Matevz

    2012-01-01

    Strategic planning and decision making, not least the making of energy policies and strategies, is a very extensive process and has to follow multiple and often contradictory objectives. During the preparation of the new Slovenian Energy Programme proposal, a complete update of the technology- and sector-oriented bottom-up model of the Reference Energy and Environmental System of Slovenia (REES-SLO) was carried out. During the redevelopment of the REES-SLO model, a trade-off between the simulation and opt...

  4. Models of neural dynamics in brain information processing - the developments of 'the decade'

    Energy Technology Data Exchange (ETDEWEB)

    Borisyuk, G N; Borisyuk, R M; Kazanovich, Yakov B [Institute of Mathematical Problems of Biology, Russian Academy of Sciences, Pushchino, Moscow region (Russian Federation); Ivanitskii, Genrikh R [Institute for Theoretical and Experimental Biophysics, Russian Academy of Sciences, Pushchino, Moscow region (Russian Federation)

    2002-10-31

    Neural network models are discussed that have been developed during the last decade with the purpose of reproducing spatio-temporal patterns of neural activity in different brain structures. The main goal of the modeling was to test hypotheses of synchronization, temporal and phase relations in brain information processing. The models being considered are those of temporal structure of spike sequences, of neural activity dynamics, and oscillatory models of attention and feature integration. (reviews of topical problems)

  5. A Structural Equation Model of the Writing Process in Typically-Developing Sixth Grade Children

    Science.gov (United States)

    Koutsoftas, Anthony D.; Gray, Shelley

    2013-01-01

    The purpose of this study was to evaluate how sixth grade children planned, translated, and revised written narrative stories using a task reflecting current instructional and assessment practices. A modified version of the Hayes and Flower (1980) writing process model was used as the theoretical framework for the study. Two hundred one…

  6. IT process architectures for enterprises development: A survey from a maturity model perspective

    NARCIS (Netherlands)

    Santana Tapia, R.G.

    During the last years much has been published about IT governance. Close to the success of many governance efforts are the business frameworks, quality models, and technology standards that help enterprises improve processes, customer service, quality of products, and control. In this paper we i)

  7. Antecedents of Absorptive Capacity: A New Model for Developing Learning Processes

    Science.gov (United States)

    Rezaei-Zadeh, Mohammad; Darwish, Tamer K.

    2016-01-01

    Purpose: The purpose of this paper is to provide an integrated framework to indicate which antecedents of absorptive capacity (AC) influence its learning processes, and to propose testing of this model in future work. Design/methodology/approach: Relevant literature on the antecedents of AC was critically reviewed and analysed with the objective…

  8. Development of Energy Models for Production Systems and Processes to Inform Environmentally Benign Decision-Making

    Science.gov (United States)

    Diaz-Elsayed, Nancy

    Between 2008 and 2035 global energy demand is expected to grow by 53%. While most industry-level analyses of manufacturing in the United States (U.S.) have traditionally focused on high energy consumers such as the petroleum, chemical, paper, primary metal, and food sectors, the remaining sectors account for the majority of establishments in the U.S. Specifically, of the establishments participating in the Energy Information Administration's Manufacturing Energy Consumption Survey in 2006, the "non-energy intensive" sectors still consumed 4×10^9 GJ of energy, i.e., one-quarter of the energy consumed by the manufacturing sectors, which is enough to power 98 million homes for a year. The increasing use of renewable energy sources and the introduction of energy-efficient technologies in manufacturing operations support the advancement towards a cleaner future, but having a good understanding of how the systems and processes function can reduce the environmental burden even further. To facilitate this, methods are developed to model the energy of manufacturing across three hierarchical levels: production equipment, factory operations, and industry; these methods are used to accurately assess the current state and provide effective recommendations to further reduce energy consumption. First, the energy consumption of production equipment is characterized to provide machine operators and product designers with viable methods to estimate the environmental impact of the manufacturing phase of a product. The energy model of production equipment is tested and found to have an average accuracy of 97% for a product requiring machining with a variable material removal rate profile. However, changing the use of production equipment alone will not result in an optimal solution since machines are part of a larger system. Which machines to use, how to schedule production runs while accounting for idle time, the design of the factory layout to facilitate production, and even the
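    A common way to express the kind of production-equipment energy model referred to above is a constant (tare) power term plus a term proportional to the material removal rate, integrated over the machining profile. The sketch below uses that form; the coefficients and the removal-rate profile are invented for illustration and are not the calibrated values from the dissertation.

```python
# Minimal sketch of a machine-tool energy estimate, P(t) = P_idle + k * MRR(t),
# integrated over an assumed variable material-removal-rate profile.
import numpy as np

P_idle = 1.8e3      # W, constant (tare) power while the machine is on (assumed)
k      = 3.0        # J/mm^3, specific energy per unit volume removed (assumed)

t   = np.linspace(0.0, 120.0, 1201)                 # s, a two-minute operation
mrr = np.where(t < 60.0, 40.0, 15.0)                # mm^3/s, roughing then finishing
mrr[t > 110.0] = 0.0                                 # idle at the end of the cycle

power  = P_idle + k * mrr                            # W
energy = float(np.sum(0.5 * (power[1:] + power[:-1]) * np.diff(t)))  # trapezoidal integration, J
print(f"estimated energy: {energy/1e3:.1f} kJ over {t[-1]:.0f} s")
```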

  9. Development and Implementation of a Telecommuting Evaluation Framework, and Modeling the Executive Telecommuting Adoption Process

    Science.gov (United States)

    Vora, V. P.; Mahmassani, H. S.

    2002-02-01

    This work proposes and implements a comprehensive evaluation framework to document the telecommuter, organizational, and societal impacts of telecommuting through telecommuting programs. Evaluation processes and materials within the outlined framework are also proposed and implemented. As the first component of the evaluation process, the executive survey is administered within a public sector agency. The survey data is examined through exploratory analysis and is compared to a previous survey of private sector executives. The ordinal probit, dynamic probit, and dynamic generalized ordinal probit (DGOP) models of telecommuting adoption are calibrated to identify factors which significantly influence executive adoption preferences and to test the robustness of such factors. The public sector DGOP model of executive willingness to support telecommuting under different program scenarios is compared with an equivalent private sector DGOP model. Through the telecommuting program, a case study of telecommuting travel impacts is performed to further substantiate research.

  10. Modeling the defrost process in complex geometries – Part 1: Development of a one-dimensional defrost model

    Directory of Open Access Journals (Sweden)

    van Buren Simon

    2017-01-01

    Full Text Available Frost formation is a common, often undesired phenomenon in heat exchangers such as air coolers. Thus, air coolers have to be defrosted periodically, causing significant energy consumption. For the design and optimization, prediction of defrosting by a CFD tool is desired. This paper presents a one-dimensional transient model approach suitable to be used as a zero-dimensional wall function in CFD for modeling the defrost process at the fin and tube interfaces. In accordance with previous work, a multi-stage defrost model is introduced (e.g. [1, 2]). In the first instance, the multi-stage model is implemented and validated using MATLAB. The defrost process of a one-dimensional frost segment is investigated. Fixed boundary conditions are provided at the frost interfaces. The simulation results verify the plausibility of the designed model. The evaluation of the simulated defrost process shows the expected convergent behavior of the three-stage sequence.

  11. Self-optimisation and model-based design of experiments for developing a C–H activation flow process

    Directory of Open Access Journals (Sweden)

    Alexander Echtermeyer

    2017-01-01

    Full Text Available A recently described C(sp3)–H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the fewest experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled a rapid generation of a process model.
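    A self-optimisation loop of the kind compared above can be sketched as an optimiser proposing flow conditions, an experiment returning an objective that combines yield and cost, and iteration until the stopping criteria are met. In the sketch below the "experiment" is a made-up response surface and the cost weighting is arbitrary; it stands in for the automated reactor platform and is not the chemistry or objective used in the paper.

```python
# Minimal self-optimisation sketch: Nelder-Mead searches over (residence time,
# temperature, reagent equivalents) against a simulated yield/cost objective.
import numpy as np
from scipy.optimize import minimize

def run_experiment(x):
    """Simulated experiment: returns a yield in [0, 1] for the given conditions."""
    res_time, temp, equiv = x
    return (np.exp(-((res_time - 12.0) / 6.0) ** 2)
            * np.exp(-((temp - 110.0) / 25.0) ** 2)
            * (1.0 - np.exp(-1.5 * equiv)))

def objective(x):
    y = run_experiment(x)
    material_cost = 0.05 * x[2]          # penalise excess reagent equivalents (assumed weight)
    return -(y - material_cost)          # maximise yield minus cost

x0 = np.array([5.0, 80.0, 1.0])          # initial guess: time (min), T (C), equivalents
result = minimize(objective, x0, method="Nelder-Mead",
                  options={"xatol": 0.1, "fatol": 1e-3, "maxiter": 200})
print("suggested conditions:", np.round(result.x, 2),
      "predicted yield:", round(float(run_experiment(result.x)), 3))
```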

  12. Framework for developing hybrid process-driven, artificial neural network and regression models for salinity prediction in river systems

    Science.gov (United States)

    Hunter, Jason M.; Maier, Holger R.; Gibbs, Matthew S.; Foale, Eloise R.; Grosvenor, Naomi A.; Harders, Nathan P.; Kikuchi-Miller, Tahali C.

    2018-05-01

    Salinity modelling in river systems is complicated by a number of processes, including in-stream salt transport and various mechanisms of saline accession that vary dynamically as a function of water level and flow, often at different temporal scales. Traditionally, salinity models in rivers have either been process- or data-driven. The primary problem with process-based models is that in many instances, not all of the underlying processes are fully understood or able to be represented mathematically. There are also often insufficient historical data to support model development. The major limitation of data-driven models, such as artificial neural networks (ANNs) in comparison, is that they provide limited system understanding and are generally not able to be used to inform management decisions targeting specific processes, as different processes are generally modelled implicitly. In order to overcome these limitations, a generic framework for developing hybrid process and data-driven models of salinity in river systems is introduced and applied in this paper. As part of the approach, the most suitable sub-models are developed for each sub-process affecting salinity at the location of interest based on consideration of model purpose, the degree of process understanding and data availability, which are then combined to form the hybrid model. The approach is applied to a 46 km reach of the Murray River in South Australia, which is affected by high levels of salinity. In this reach, the major processes affecting salinity include in-stream salt transport, accession of saline groundwater along the length of the reach and the flushing of three waterbodies in the floodplain during overbank flows of various magnitudes. Based on trade-offs between the degree of process understanding and data availability, a process-driven model is developed for in-stream salt transport, an ANN model is used to model saline groundwater accession and three linear regression models are used

  13. Development and Validation of a Constitutive Model for Dental Composites during the Curing Process

    Science.gov (United States)

    Wickham Kolstad, Lauren

    Debonding is a critical failure mode of dental composites used for dental restorations. Debonding of dental composites can be assessed by comparing the shrinkage stress of the composite to the debonding strength of the adhesive that bonds it to the tooth surface. It is difficult to measure shrinkage stress experimentally. In this study, finite element analysis is used to predict the stress in the composite during cure. A new constitutive law is presented that will allow composite developers to evaluate composite shrinkage stress at early stages in material development. Shrinkage stress and shrinkage strain experimental data were gathered for three dental resins, Z250, Z350, and P90. The experimental data were used to develop a constitutive model for the Young's modulus of the dental composite as a function of time during cure. A Maxwell model, a spring and dashpot in series, was used to simulate the composite. The compliance of the shrinkage stress device was also taken into account by including a spring in series with the Maxwell model. A coefficient of thermal expansion was also determined for internal loading of the composite by dividing shrinkage strain by time. Three FEA models are presented. A spring-disk model validates that the constitutive law is self-consistent. A quarter cuspal deflection model uses separate experimental data to verify that the constitutive law is valid. Finally, an axisymmetric tooth model is used to predict interfacial stresses in the composite. These stresses are compared to the debonding strength to check whether the composite debonds. The new constitutive model accurately predicted cuspal deflection data. Predictions for interfacial bond stress in the tooth model compare favorably with debonding characteristics observed in practice for dental resins.
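    The incremental stress update for a Maxwell element whose modulus develops during cure, driven by an imposed shrinkage-strain history, can be sketched as below. The modulus growth law, relaxation time and shrinkage magnitude are assumptions for illustration, not the calibrated parameters for Z250, Z350 or P90, and the extra instrument-compliance spring mentioned in the abstract is omitted for brevity.

```python
# Minimal sketch of shrinkage-stress build-up in a Maxwell element (spring and
# dashpot in series) with a cure-dependent modulus. All parameters are assumed.
import numpy as np

t_end, dt = 60.0, 0.01                       # s
t = np.arange(0.0, t_end + dt, dt)

E_final, tau_cure = 8.0e9, 10.0              # Pa, s : modulus develops during cure (assumed)
E = E_final * (1.0 - np.exp(-t / tau_cure))  # assumed cure-dependent Young's modulus

eps_final = -0.003                           # 0.3 % linear shrinkage, compressive (assumed)
eps = eps_final * (1.0 - np.exp(-t / tau_cure))  # assumed shrinkage-strain history

tau_relax = 30.0                             # s, Maxwell relaxation time (assumed)
sigma = np.zeros_like(t)
for n in range(len(t) - 1):
    d_eps = eps[n + 1] - eps[n]
    # Maxwell element: d(sigma) = E * d(eps) - (sigma / tau_relax) * dt
    sigma[n + 1] = sigma[n] + E[n] * d_eps - (sigma[n] / tau_relax) * dt

print(f"peak shrinkage stress magnitude: {abs(sigma).max()/1e6:.1f} MPa")
```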

  14. Developing the Model of Fuel Injection Process Efficiency Analysis for Injector for Diesel Engines

    Science.gov (United States)

    Anisimov, M. Yu; Kayukov, S. S.; Gorshkalev, A. A.; Belousov, A. V.; Gallyamov, R. E.; Lysenko, Yu D.

    2018-01-01

    The article proposes an approach for assessing the quality of fuel injection by the injector, based on the development of calculation blocks in a common injector model within LMS Imagine.Lab AMESim. The parameters of the injector model in the article correspond to a serial Common Rail-type injector with a solenoid. The possibilities of this approach are demonstrated by providing results for the example of modelling a modified injector. Following the research results, the advantages of the proposed approach to analysing and assessing fuel injection quality were identified.

  15. Factors Models of Scrum Adoption in the Software Development Process: A Systematic Literature Review

    Directory of Open Access Journals (Sweden)

    Marilyn Sihuay

    2018-05-01

    Full Text Available (Background) The adoption of Agile Software Development (ASD), in particular Scrum, has grown significantly since its introduction in 2001. However, in Lima, many ASD implementations have not been suitable (incomplete or inconsistent), thus losing the benefits obtainable by this approach, and the critical success factors in this context are unknown. (Objective) To analyze the factors models used in the evaluation of the adoption of ASD, as these factors models can contribute to explaining the success or failure of these adoptions. (Method) In this study we used a systematic literature review. (Result) Ten models have been identified; their similarities and differences are presented. (Conclusion) Each identified model considers different factors; however, some of them are shared by five of these models, such as team member attributes, engaging the customer, customer collaboration, experience and work environment.

  16. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes.

    Science.gov (United States)

    Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-11-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647-1661, 2017.
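    The system-identification step can be illustrated with a minimal sketch: fit a first-order discrete model y[k+1] = a*y[k] + b*u[k] to data from serialized step changes in an input (for example, feed galactose concentration) against a measured attribute (for example, %galactosylation); the identified model can then be embedded in a model predictive controller. The data below are simulated stand-ins, not the perfusion-apparatus measurements from the study.

```python
# Minimal system-identification sketch: least-squares fit of a first-order
# discrete model to simulated step-change data.
import numpy as np

rng = np.random.default_rng(7)
n = 300
u = np.repeat([0.0, 2.0, 1.0, 3.0], n // 4)           # serialized step-change input schedule
a_true, b_true = 0.95, 0.08                            # "true" process (assumed)
y = np.zeros(n)
for k in range(n - 1):
    y[k + 1] = a_true * y[k] + b_true * u[k] + rng.normal(0, 0.02)   # process + measurement noise

# least-squares fit of y[k+1] = a*y[k] + b*u[k]
X = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
a_hat, b_hat = theta
print(f"a = {a_hat:.3f} (true {a_true}), b = {b_hat:.3f} (true {b_true})")
print(f"steady-state gain = {b_hat / (1 - a_hat):.2f}")
```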

  17. Numerical Simulation of a Grinding Process Model for the Spatial Work-pieces: Development of Modeling Techniques

    Directory of Open Access Journals (Sweden)

    S. A. Voronov

    2015-01-01

    Full Text Available The article presents a literature review on the simulation of grinding processes. It considers the statistical, energy-based, and simulation approaches to modeling grinding forces. The main stages of interaction between abrasive grains and the machined surface are shown. The article describes the main approaches to geometric modeling of the formation of new surfaces during grinding. A review of approaches to numerical modeling of chip formation and pile-up effects is given. The advantages and disadvantages of modeling grain-to-surface interaction by means of the finite element method and the molecular dynamics method are considered. The article points out that it is necessary to take into consideration the system dynamics and its effect on the finished surface. Based on the literature review, the structure of a complex simulation model of grinding process dynamics for flexible work-pieces with spatial surface geometry is proposed. The proposed model of spatial grinding includes a model of work-piece dynamics, a model of grinding wheel dynamics, and a phenomenological model of grinding forces based on a 3D geometry modeling algorithm. The model gives the following results for the spatial grinding process: vibrations of the machined part and the grinding wheel, machined surface geometry, static deflection of the surface, and grinding forces under various cutting conditions.

  18. Fuzzy Pruning Based LS-SVM Modeling Development for a Fermentation Process

    Directory of Open Access Journals (Sweden)

    Weili Xiong

    2014-01-01

    Full Text Available Due to the complexity and uncertainty of microbial fermentation processes, data coming from the plants often contain some outliers. However, these data may be treated as the normal support vectors, which always deteriorate the performance of soft sensor modeling. Since the outliers also contaminate the correlation structure of the least square support vector machine (LS-SVM, the fuzzy pruning method is provided to deal with the problem. Furthermore, by assigning different fuzzy membership scores to data samples, the sensitivity of the model to the outliers can be reduced greatly. The effectiveness and efficiency of the proposed approach are demonstrated through two numerical examples as well as a simulator case of penicillin fermentation process.
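    For readers unfamiliar with the formulation, the sketch below solves the dual linear system of a weighted LS-SVM regression in which each sample carries a fuzzy membership score that down-weights suspected outliers. The RBF kernel, the membership values and all hyperparameters are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def weighted_lssvm_fit(X, y, s, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM dual system; s holds fuzzy membership scores in (0, 1]."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * s))   # small s => weaker influence on the fit
    rhs = np.concatenate([[0.0], y])
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]                        # bias b, dual coefficients alpha

def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Tiny illustration with one injected outlier that gets a low membership score.
rng = np.random.default_rng(1)
X = np.linspace(0, 6, 40).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(40)
y[10] += 3.0                                      # outlier
s = np.ones(40); s[10] = 0.05                     # fuzzy membership scores (assumed)
b, alpha = weighted_lssvm_fit(X, y, s)
print(lssvm_predict(X, alpha, b, np.array([[1.5]])))
```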

  19. Process development of a New Haemophilus influenzae type b conjugate vaccine and the use of mathematical modeling to identify process optimization possibilities.

    Science.gov (United States)

    Hamidi, Ahd; Kreeftenberg, Hans; V D Pol, Leo; Ghimire, Saroj; V D Wielen, Luuk A M; Ottens, Marcel

    2016-05-01

    Vaccination is one of the most successful public health interventions being a cost-effective tool in preventing deaths among young children. The earliest vaccines were developed following empirical methods, creating vaccines by trial and error. New process development tools, for example mathematical modeling, as well as new regulatory initiatives requiring better understanding of both the product and the process are being applied to well-characterized biopharmaceuticals (for example recombinant proteins). The vaccine industry is still running behind in comparison to these industries. A production process for a new Haemophilus influenzae type b (Hib) conjugate vaccine, including related quality control (QC) tests, was developed and transferred to a number of emerging vaccine manufacturers. This contributed to a sustainable global supply of affordable Hib conjugate vaccines, as illustrated by the market launch of the first Hib vaccine based on this technology in 2007 and concomitant price reduction of Hib vaccines. This paper describes the development approach followed for this Hib conjugate vaccine as well as the mathematical modeling tool applied recently in order to indicate options for further improvements of the initial Hib process. The strategy followed during the process development of this Hib conjugate vaccine was a targeted and integrated approach based on prior knowledge and experience with similar products using multi-disciplinary expertise. Mathematical modeling was used to develop a predictive model for the initial Hib process (the 'baseline' model) as well as an 'optimized' model, by proposing a number of process changes which could lead to further reduction in price. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:568-580, 2016. © 2016 American Institute of Chemical Engineers.

  20. MCD Process Model: A Systematic Approach to Curriculum Development in Black Studies.

    Science.gov (United States)

    Miller, Howard J.

    1986-01-01

    Holds that Black Studies programs have had problems surviving because of (1) resistance to curriculum change in colleges and universities, (2) their lack of supporters in positions of administrative power, and (3) lack of an organized, conceptual approach to developing and implementing a Black Studies curriculum. Presents a model designed to…

  1. Hybrid modeling as a QbD/PAT tool in process development: an industrial E. coli case study.

    Science.gov (United States)

    von Stosch, Moritz; Hamelink, Jan-Martijn; Oliveira, Rui

    2016-05-01

    Process understanding is emphasized in the process analytical technology initiative and the quality by design paradigm to be essential for manufacturing of biopharmaceutical products with consistent high quality. A typical approach to developing a process understanding is applying a combination of design of experiments with statistical data analysis. Hybrid semi-parametric modeling is investigated as an alternative method to pure statistical data analysis. The hybrid model framework provides flexibility to select model complexity based on available data and knowledge. Here, a parametric dynamic bioreactor model is integrated with a nonparametric artificial neural network that describes biomass and product formation rates as function of varied fed-batch fermentation conditions for high cell density heterologous protein production with E. coli. Our model can accurately describe biomass growth and product formation across variations in induction temperature, pH and feed rates. The model indicates that while product expression rate is a function of early induction phase conditions, it is negatively impacted as productivity increases. This could correspond with physiological changes due to cytoplasmic product accumulation. Due to the dynamic nature of the model, rational process timing decisions can be made and the impact of temporal variations in process parameters on product formation and process performance can be assessed, which is central for process understanding.
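    The hybrid structure described above can be pictured as a material balance whose rate term comes from a data-driven block. The sketch below is only a schematic illustration under that assumption, not the authors' model: it couples a simple fed-batch biomass balance with a small feedforward network (random, untrained weights standing in for the fitted nonparametric part) that maps process conditions to a specific growth rate.

```python
import numpy as np

rng = np.random.default_rng(2)
W1, b1 = rng.normal(size=(3, 4)) * 0.1, np.zeros(4)   # untrained demo weights
W2, b2 = rng.normal(size=(4, 1)) * 0.1, np.zeros(1)

def nn_growth_rate(temp, pH, feed):
    """Nonparametric part: maps conditions to a specific growth rate mu (1/h)."""
    x = np.array([temp, pH, feed])
    h = np.tanh(x @ W1 + b1)
    return abs(float((h @ W2 + b2)[0]))               # keep mu non-negative

def simulate_fed_batch(hours=24.0, dt=0.1, temp=31.0, pH=6.9, feed=0.2):
    """Parametric part: dX/dt = mu(conditions) * X, explicit Euler integration."""
    t, X = 0.0, 1.0                                   # biomass in g/L (assumed)
    while t < hours:
        mu = nn_growth_rate(temp, pH, feed)
        X += dt * mu * X
        t += dt
    return X

print("final biomass (illustrative units):", simulate_fed_batch())
```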

  2. Using the Knowledge, Process, Practice (KPP) model for driving the design and development of online postgraduate medical education.

    Science.gov (United States)

    Shaw, Tim; Barnet, Stewart; Mcgregor, Deborah; Avery, Jennifer

    2015-01-01

    Online learning is a primary delivery method for continuing health education programs. It is critical that programs have curricula objectives linked to educational models that support learning. Using a proven educational modelling process ensures that curricula objectives are met and a solid basis for learning and assessment is achieved. To develop an educational design model that produces an educationally sound program development plan for use by anyone involved in online course development. We have described the development of a generic educational model designed for continuing health education programs. The Knowledge, Process, Practice (KPP) model is founded on recognised educational theory and online education practice. This paper presents a step-by-step guide on using this model for program development that encases reliable learning and evaluation. The model supports a three-step approach, KPP, based on learning outcomes and supporting appropriate assessment activities. It provides a program structure for online or blended learning that is explicit, educationally defensible, and supports multiple assessment points for health professionals. The KPP model is based on best practice educational design using a structure that can be adapted for a variety of online or flexibly delivered postgraduate medical education programs.

  3. Development of a Mathematical Model for Multivariate Process by Balanced Six Sigma

    Directory of Open Access Journals (Sweden)

    Díaz-Castellanos Elizabeth Eugenia

    2015-07-01

    Full Text Available The Six Sigma methodology is widely used in business to improve quality, increase productivity and lower costs, thereby driving business improvement. However, the challenge today is to use those tools for improvements that have a direct impact on value differentiation, which requires the alignment of Six Sigma with the competitive strategies of the organization. Hence the importance of a strategic management system to measure, analyze, improve and control corporate performance, while setting out the responsibilities of leadership and commitment. The specific purpose of this research is to provide a mathematical model through the alignment of strategic objectives (Balanced Scorecard) and tools for productivity improvement (Six Sigma) for processes with multiple responses, which is sufficiently robust that it can serve as a basis for application in manufacturing and thus effectively link strategy, performance and customer satisfaction. Specifically, we worked with a case study: Córdoba, Ver. The model proposes that if the strategy, performance and customer satisfaction are aligned, the organization will benefit from the intense relationship between process performance and strategic initiatives. These changes can be measured by productivity and process metrics such as cycle time, production rates, production efficiency and percentage of reprocessing, among others.

  4. NASA Standard for Models and Simulations (M and S): Development Process and Rationale

    Science.gov (United States)

    Zang, Thomas A.; Blattnig, Steve R.; Green, Lawrence L.; Hemsch, Michael J.; Luckring, James M.; Morison, Joseph H.; Tripathi, Ram K.

    2009-01-01

    After the Columbia Accident Investigation Board (CAIB) report, the NASA Administrator at that time chartered an executive team (known as the Diaz Team) to identify the CAIB report elements with Agency-wide applicability, and to develop corrective measures to address each element. This report documents the chronological development and release of an Agency-wide Standard for Models and Simulations (M&S) (NASA Standard 7009) in response to Action #4 from the report, "A Renewed Commitment to Excellence: An Assessment of the NASA Agency-wide Applicability of the Columbia Accident Investigation Board Report, January 30, 2004".

  5. Semantic Business Process Modeling

    OpenAIRE

    Markovic, Ivan

    2010-01-01

    This book presents a process-oriented business modeling framework based on semantic technologies. The framework consists of modeling languages, methods, and tools that allow for semantic modeling of business motivation, business policies and rules, and business processes. Quality of the proposed modeling framework is evaluated based on the modeling content of SAP Solution Composer and several real-world business scenarios.

  6. Exploring C-water-temperature interactions and non-linearities in soils through developments in process-based models

    Science.gov (United States)

    Esteban Moyano, Fernando; Vasilyeva, Nadezda; Menichetti, Lorenzo

    2016-04-01

    Soil carbon models developed over the last couple of decades are limited in their capacity to accurately predict the magnitudes and temporal variations in observed carbon fluxes and stocks. New process-based models are now emerging that attempt to address the shortcomings of their simpler, empirical counterparts. While a spectrum of ideas and hypothetical mechanisms are finding their way into new models, the addition of only a few processes known to significantly affect soil carbon (e.g. enzymatic decomposition, adsorption, Michaelis-Menten kinetics) has shown the potential to resolve a number of previous model-data discrepancies (e.g. priming, Birch effects). Through model-data validation, such models are a means of testing hypothetical mechanisms. In addition, they can lead to new insights into what soil carbon pools are and how they respond to external drivers. In this study we develop a model of soil carbon dynamics based on enzymatic decomposition and other key features of process based models, i.e. simulation of carbon in particulate, soluble and adsorbed states, as well as enzyme and microbial components. Here we focus on understanding how moisture affects C decomposition at different levels, either directly (e.g. by limiting diffusion) or through interactions with other components. As the medium where most reactions and transport take place, water is central in every aspect of soil C dynamics. We compare results from a number of alternative models with experimental data in order to test different processes and parameterizations. Among other observations, we try to understand: 1. typical moisture response curves and associated temporal changes, 2. moisture-temperature interactions, and 3. diffusion effects under changing C concentrations. While the model aims at being a process based approach and at simulating fluxes at short time scales, it remains a simplified representation using the same inputs as classical soil C models, and is thus potentially
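    As a minimal numerical illustration of the kind of process formulation discussed above (not the authors' model), the sketch below integrates a Michaelis-Menten enzymatic decomposition term whose rate is scaled by a crude moisture function standing in for diffusion limitation; all parameter values are arbitrary.

```python
import numpy as np

def moisture_scalar(theta, theta_opt=0.65):
    """Crude stand-in for diffusion limitation: 0 at dryness, 1 at optimum moisture."""
    return float(np.clip(theta / theta_opt, 0.0, 1.0))

def simulate_decomposition(C0=100.0, E=1.0, theta=0.4,
                           vmax=0.8, km=50.0, days=365, dt=0.1):
    """dC/dt = -vmax * E * f(theta) * C / (km + C)   (Michaelis-Menten uptake)."""
    C, t = C0, 0.0
    while t < days:
        dC = -vmax * E * moisture_scalar(theta) * C / (km + C)
        C += dt * dC
        t += dt
    return C

for theta in (0.2, 0.4, 0.65):
    print(f"moisture {theta:.2f}: C after 1 yr = {simulate_decomposition(theta=theta):.1f}")
```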

  7. Modelling and developing a decision-making process of hazard zone identification in ship power plants

    International Nuclear Information System (INIS)

    Podsiadlo, Antoni; Tarelko, Wieslaw

    2006-01-01

    The most dangerous places in ships are their power plants. In particular, they are very unsafe for operators carrying out various necessary operation and maintenance activities. For this reason, ship machinery should be designed to ensure maximum safety for its operators. This is a very difficult task and cannot be solved by means of the conventional design methods used for designing uncomplicated technical equipment. One possible way of solving this problem is to provide appropriate tools that allow the operator's safety to be taken into account during the design process, especially at its early stages. A computer-aided system supporting the design of safe ship power plants could be such a tool. This paper deals with the development process of a prototype of a computer-aided system for hazard zone identification in ship power plants

  8. Modelling and developing a decision-making process of hazard zone identification in ship power plants

    Energy Technology Data Exchange (ETDEWEB)

    Podsiadlo, Antoni [Department of Engineering Sciences, Gdynia Maritime University, ul. Morska 83, 81-225 Gdynia (Poland)]. E-mail: topo@am.gdynia.pl; Tarelko, Wieslaw [Department of Engineering Sciences, Gdynia Maritime University, ul. Morska 83, 81-225 Gdynia (Poland)]. E-mail: tar@am.gdynia.pl

    2006-04-15

    The most dangerous places in ships are their power plants. In particular, they are very unsafe for operators carrying out various necessary operation and maintenance activities. For this reason, ship machinery should be designed to ensure maximum safety for its operators. This is a very difficult task and cannot be solved by means of the conventional design methods used for designing uncomplicated technical equipment. One possible way of solving this problem is to provide appropriate tools that allow the operator's safety to be taken into account during the design process, especially at its early stages. A computer-aided system supporting the design of safe ship power plants could be such a tool. This paper deals with the development process of a prototype of a computer-aided system for hazard zone identification in ship power plants.

  9. Biomass Torrefaction Process Review and Moving Bed Torrefaction System Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Jaya Shakar Tumuluru; Shahab Sokhansanj; Christopher T. Wright

    2010-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties, and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300 °C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range of 200–230 °C and 270–280 °C. Thus, the process can also be called a mild pyrolysis as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, producing a final product that will have a lower mass but a higher heating value. An important aspect of research is to establish a degree of torrefaction where gains in heating value offset the loss of mass. There is a lack of literature on torrefaction reactor designs and a design sheet for estimating the dimensions of the torrefier based on capacity. This study includes (a) conducting a detailed review on the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account the basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed for different capacities, designing the heat loads and gas flow rates, and developing an interactive Excel sheet where the user can define design specifications. In this report, capacities of 25–1000 kg/hr are used in the design equations for the torrefier, in example calculations, and in the torrefier specifications.
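    To give a flavour of the capacity-based sizing referred to above, the sketch below estimates the bed volume, diameter and height of a moving packed bed from throughput, bulk density, residence time and an assumed height-to-diameter ratio. The simple hold-up balance and all numbers are illustrative assumptions; the report's actual design sheet also covers heat loads and gas flow rates, which are not shown here.

```python
import math

def size_moving_bed(capacity_kg_per_hr, bulk_density_kg_m3=250.0,
                    residence_time_min=30.0, height_to_diameter=3.0):
    """Estimate bed volume, diameter and height from a simple hold-up balance."""
    holdup_kg = capacity_kg_per_hr * residence_time_min / 60.0   # mass held in the bed
    volume_m3 = holdup_kg / bulk_density_kg_m3                   # V = m / rho_bulk
    # V = (pi/4) * D^2 * H with H = k * D  =>  D = (4V / (pi*k))^(1/3)
    diameter_m = (4.0 * volume_m3 / (math.pi * height_to_diameter)) ** (1.0 / 3.0)
    height_m = height_to_diameter * diameter_m
    return volume_m3, diameter_m, height_m

for cap in (25, 250, 1000):   # kg/hr, spanning the capacity range mentioned in the report
    v, d, h = size_moving_bed(cap)
    print(f"{cap:5d} kg/hr -> V = {v:.2f} m^3, D = {d:.2f} m, H = {h:.2f} m")
```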

  10. Biomass Torrefaction Process Review and Moving Bed Torrefaction System Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Jaya Shakar Tumuluru; Shahab Sokhansanj; Christopher T. Wright; Richard D. Boardman

    2010-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties, and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300 °C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range of 200-230 °C and 270-280 °C. Thus, the process can also be called a mild pyrolysis as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, producing a final product that will have a lower mass but a higher heating value. An important aspect of research is to establish a degree of torrefaction where gains in heating value offset the loss of mass. There is a lack of literature on torrefaction reactor designs and a design sheet for estimating the dimensions of the torrefier based on capacity. This study includes (a) conducting a detailed review on the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account the basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed for different capacities, designing the heat loads and gas flow rates, and developing an interactive Excel sheet where the user can define design specifications. In this report, capacities of 25-1000 kg/hr are used in the design equations for the torrefier, in example calculations, and in the torrefier specifications.

  11. The development of stochastic process modeling through risk analysis derived from scheduling of NPP project

    International Nuclear Information System (INIS)

    Lee, Kwang Ho; Roh, Myung Sub

    2013-01-01

    There are many different factors to consider when constructing a nuclear power plant successfully, from planning to decommissioning. According to the PMBOK, all projects have nine knowledge domains from a holistic project management perspective. They are equally important to all projects; however, this study focuses mostly on the processes required to manage timely completion of the project and to conduct risk management. The overall objective of this study is to explain what risk analysis derived from NPP project scheduling is, and to show how to implement stochastic process modeling through risk management. Building a nuclear power plant requires a great deal of time and fundamental knowledge across all engineering disciplines, which means that integrated scheduling management of the project's many activities is necessary and very important. Simulation techniques for scheduling an NPP project using the Open Plan, Crystal Ball, and Minitab programs can be useful tools for designing an optimal schedule. Thus far, the Open Plan and Monte Carlo programs have been used to calculate the critical path in the scheduling network analysis, and the Minitab program has been applied to monitor the scheduling risk. This approach to stochastic modeling through risk analysis of project activities is very useful for optimizing activity schedules using the Critical Path Method and for managing schedule control of an NPP project. This study has shown a new approach to optimal scheduling of an NPP project; however, it does not consider the characteristics of activities according to the NPP site conditions. Hence, further research considering those factors is needed
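    A stripped-down version of the stochastic scheduling idea described above can be reproduced in a few lines: sample activity durations, run the critical-path forward pass, and inspect the distribution of completion times. The toy network, the triangular duration ranges and the sample size below are hypothetical and unrelated to the actual NPP schedule data or to the Open Plan, Crystal Ball and Minitab workflows used in the study.

```python
import numpy as np

# Toy activity network (hypothetical): id -> (predecessors, (min, mode, max) duration in days)
activities = {
    "A": ([],         (10, 12, 18)),
    "B": (["A"],      (20, 25, 35)),
    "C": (["A"],      (15, 18, 30)),
    "D": (["B", "C"], (8, 10, 15)),
}

def sample_completion(rng):
    """Forward pass of CPM with triangular-distributed activity durations."""
    finish = {}
    for act, (preds, (lo, mode, hi)) in activities.items():   # insertion order is topological
        start = max((finish[p] for p in preds), default=0.0)
        finish[act] = start + rng.triangular(lo, mode, hi)
    return finish["D"]

rng = np.random.default_rng(3)
totals = np.array([sample_completion(rng) for _ in range(10_000)])
print("mean completion:", totals.mean().round(1), "days")
print("P80 completion :", np.percentile(totals, 80).round(1), "days")
```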

  12. The development of stochastic process modeling through risk analysis derived from scheduling of NPP project

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kwang Ho; Roh, Myung Sub [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-10-15

    There are many different factors to consider when constructing a nuclear power plant successfully, from planning to decommissioning. According to the PMBOK, all projects have nine knowledge domains from a holistic project management perspective. They are equally important to all projects; however, this study focuses mostly on the processes required to manage timely completion of the project and to conduct risk management. The overall objective of this study is to explain what risk analysis derived from NPP project scheduling is, and to show how to implement stochastic process modeling through risk management. Building a nuclear power plant requires a great deal of time and fundamental knowledge across all engineering disciplines, which means that integrated scheduling management of the project's many activities is necessary and very important. Simulation techniques for scheduling an NPP project using the Open Plan, Crystal Ball, and Minitab programs can be useful tools for designing an optimal schedule. Thus far, the Open Plan and Monte Carlo programs have been used to calculate the critical path in the scheduling network analysis, and the Minitab program has been applied to monitor the scheduling risk. This approach to stochastic modeling through risk analysis of project activities is very useful for optimizing activity schedules using the Critical Path Method and for managing schedule control of an NPP project. This study has shown a new approach to optimal scheduling of an NPP project; however, it does not consider the characteristics of activities according to the NPP site conditions. Hence, further research considering those factors is needed.

  13. Possibilities of application of process modelling when developing a proposal of the business process management system for a university department

    Directory of Open Access Journals (Sweden)

    Pavel Máchal

    2009-01-01

    Full Text Available Today's global environment requires a sustained effort to keep ahead in both competition and innovation. Top organizations of all types – governments, non-profit organizations, companies, institutions and universities – try to solve the following difficult questions: How to improve the standard of customer service and raise productivity without a concurrent growth of expenses? How to control risks and observe the rules without losing entrepreneurial (competitive) benefits? How to stimulate all employees to participate in innovation, the development of new products and services, finding new markets and more efficient satisfaction of customers? The paper deals with the possibilities of process simulation, both for the improvement and innovation of present processes and for the formation of completely new directions for the Institute of Lifelong Education of Mendel University of Agriculture and Forestry in Brno. An important condition for the design of process simulation is expert activity provided by highly professional experts or consultants in the field of education as well as in business matters.

  14. Process control program development

    International Nuclear Information System (INIS)

    Dameron, H.J.

    1985-01-01

    This paper details the development and implementation of a ''Process Control Program'' at Duke Power's three nuclear stations - Oconee, McGuire, and Catawba. Each station is required by Technical Specification to have a ''Process Control Program'' (PCP) to control all dewatering and/or solidification activities for radioactive wastes

  15. Developing a framework to model the primary drying step of a continuous freeze-drying process based on infrared radiation

    DEFF Research Database (Denmark)

    Van Bockstal, Pieter-Jan; Corver, Jos; Mortier, Séverine Thérèse F.C.

    2018-01-01

    . These results assist in the selection of proper materials which could serve as IR window in the continuous freeze-drying prototype. The modelling framework presented in this paper fits the model-based design approach used for the development of this prototype and shows the potential benefits of this design...... requires the fundamental mechanistic modelling of each individual process step. Therefore, a framework is presented for the modelling and control of the continuous primary drying step based on non-contact IR radiation. The IR radiation emitted by the radiator filaments passes through various materials...

  16. Development of expert systems for modeling of technological process of pressure casting on the basis of artificial intelligence

    Science.gov (United States)

    Gavarieva, K. N.; Simonova, L. A.; Pankratov, D. L.; Gavariev, R. V.

    2017-09-01

    The article considers the main component of an expert system for the pressure casting process, which consists of algorithms united into logical models. The characteristics of the system, showing data on the condition of the controlled object, are described. A number of logically interconnected steps allowing the quality of the produced castings to be increased is developed

  17. iPSC-Based Models to Unravel Key Pathogenetic Processes Underlying Motor Neuron Disease Development

    Directory of Open Access Journals (Sweden)

    Irene Faravelli

    2014-10-01

    Full Text Available Motor neuron diseases (MNDs) are neuromuscular disorders affecting almost exclusively upper motor neurons (UMNs) and/or lower motor neurons (LMNs). The clinical phenotype is characterized by muscular weakness and atrophy leading to paralysis and, almost invariably, death due to respiratory failure. Adult MNDs include sporadic and familial amyotrophic lateral sclerosis (sALS-fALS), while the most common infantile MND is represented by spinal muscular atrophy (SMA). No effective treatment is currently available for MNDs, as for the vast majority of neurodegenerative disorders, and cures are limited to supportive care and symptom relief. The lack of a deep understanding of MND pathogenesis accounts for the difficulties in finding a cure, together with the scarcity of reliable in vitro models. Recent progress in the stem cell field, in particular in the generation of induced pluripotent stem cells (iPSCs), has made it possible for the first time to obtain substantial amounts of human cells to recapitulate in vitro some of the key pathogenetic processes underlying MNDs. In the present review, recently published studies involving the use of iPSCs to unravel aspects of ALS and SMA pathogenesis are discussed, with an overview of their implications in the process of finding a cure for these still orphan disorders.

  18. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  19. Development of Mathematical Model for Lifecycle Management Process of New Type of Multirip Saw Machine

    Directory of Open Access Journals (Sweden)

    B. V. Phung

    2017-01-01

    variables. Based on the obtained unified information model, a multi-criterion problem has been formulated for the process of automated synthesis and rational choice to design and manufacture the multirip saw machine of new generation.

  20. Additive Manufacturing of IN100 Superalloy Through Scanning Laser Epitaxy for Turbine Engine Hot-Section Component Repair: Process Development, Modeling, Microstructural Characterization, and Process Control

    Science.gov (United States)

    Acharya, Ranadip; Das, Suman

    2015-09-01

    This article describes additive manufacturing (AM) of IN100, a high gamma-prime nickel-based superalloy, through scanning laser epitaxy (SLE), aimed at the creation of thick deposits onto like-chemistry substrates for enabling repair of turbine engine hot-section components. SLE is a metal powder bed-based laser AM technology developed for nickel-base superalloys with equiaxed, directionally solidified, and single-crystal microstructural morphologies. Here, we combine process modeling, statistical design-of-experiments (DoE), and microstructural characterization to demonstrate fully metallurgically bonded, crack-free and dense deposits exceeding 1000 μm of SLE-processed IN100 powder onto IN100 cast substrates produced in a single pass. A combined thermal-fluid flow-solidification model of the SLE process compliments DoE-based process development. A customized quantitative metallography technique analyzes digital cross-sectional micrographs and extracts various microstructural parameters, enabling process model validation and process parameter optimization. Microindentation measurements show an increase in the hardness by 10 pct in the deposit region compared to the cast substrate due to microstructural refinement. The results illustrate one of the very few successes reported for the crack-free deposition of IN100, a notoriously "non-weldable" hot-section alloy, thus establishing the potential of SLE as an AM method suitable for hot-section component repair and for future new-make components in high gamma-prime containing crack-prone nickel-based superalloys.

  1. Development of polymer nano composite patterns using fused deposition modeling for rapid investment casting process

    Science.gov (United States)

    Vivek, Tiwary; Arunkumar, P.; Deshpande, A. S.; Vinayak, Malik; Kulkarni, R. M.; Asif, Angadi

    2018-04-01

    Conventional investment casting is one of the oldest and most economical manufacturing techniques for producing intricate and complex part geometries. However, investment casting is considered economical only if the volume of production is large. Design iterations and design optimisations in this technique prove to be very costly due to the time and tooling cost of making dies for producing wax patterns. With the advent of additive manufacturing technology, however, plastic patterns show very good potential to replace wax patterns. This approach can be very useful for low-volume production and lab requirements, since the cost and time required to incorporate changes in the design are very low. This research paper discusses the steps involved in developing polymer nanocomposite filaments and checking their suitability for investment casting. The process parameters of the 3D printer are also optimized using the DOE technique to obtain mechanically stronger plastic patterns. The study is carried out to develop a framework for rapid investment casting for lab as well as industrial requirements.

  2. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work products

  3. Neighbor-dependent Ramachandran probability distributions of amino acids developed from a hierarchical Dirichlet process model.

    Directory of Open Access Journals (Sweden)

    Daniel Ting

    2010-04-01

    Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.

  4. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety of application areas, including biotechnology, food, polymer and human health applications. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  5. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  6. Toward the Development of a Canadian Less Lethal Weapon Approval Process: A Study of Contemporary Process Models

    Science.gov (United States)

    2011-10-01

    they must be ISO / IEC 17025 compliant. A list of laboratories accredited to also certify terminal apparatus is available on the Industry Canada...accredited by Standards Council of Canada or Certified to ISO / IEC 17025 . The emphasis in the approval process is on the independence of testing or the...of Canada. Industry Canada takes a similar approach depending on ISO / IEC 17025 certified labs for most testing. In summary, technical/testing

  7. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret...

  8. Modelling management process of key drivers for economic sustainability in the modern conditions of economic development

    Directory of Open Access Journals (Sweden)

    Pishchulina E.S.

    2017-01-01

    Full Text Available The text addresses issues concerning the management of drivers of manufacturing enterprise economic sustainability, and the assessment of enterprise sustainability as the key aspect of managing enterprise economic sustainability. These issues become topical as new requirements for the methods of manufacturing enterprise management arise in the modern conditions of a market economy. The economic sustainability model considered in the article integrates enterprise economic growth, the economic balance of the external and internal environment, and economic sustainability. The method for assessing the economic sustainability of a manufacturing enterprise proposed in the study reveals weaknesses in enterprise performance and untapped reserves, which can then be used to improve the economic sustainability and efficiency of the enterprise. The management of manufacturing enterprise economic sustainability is one of the most important factors of business functioning and development in a modern market economy. The relevance of this topic is increasing in accordance with the objective requirements of growing volumes of production and sales, the increasing complexity of economic relations, and the changing external environment of an enterprise.

  9. Preparatory selection of sterilization regime for canned Natural Atlantic Mackerel with oil based on developed mathematical models of the process

    Directory of Open Access Journals (Sweden)

    Maslov A. A.

    2016-12-01

    Full Text Available The aim of the current study is to define preparatory parameters for the sterilization regime of canned "Natural Atlantic Mackerel with Oil". PRSC software developed at the department of automation and computer engineering is used for the preparatory selection. To determine the parameters of the process model, a pre-trial process of sterilization and cooling in water with backpressure of canned "Natural Atlantic Mackerel with Oil" in can N 3 has been performed in the laboratory autoclave AVK-30M. Information about the temperature in the autoclave sterilization chamber and in the can with product has been gathered using Ellab TrackSense PRO loggers. Based on the obtained information, three transfer functions for the product model have been identified: in the least heated area of the autoclave, the average heated area, and the most heated area. Using this information, time-temperature dependences in the sterilization chamber have been built in the PRSC programme. The model of the sterilization process of canned "Natural Atlantic Mackerel with Oil" has been obtained after the pre-trial process. Then, in automatic mode, the sterilization regime of canned "Natural Atlantic Mackerel with Oil" has been selected using a value of the actual effect close to the normative sterilizing effect (5.9 conditional minutes). Furthermore, a step-mode sterilization of canned "Natural Atlantic Mackerel with Oil" has been selected in this study. Using step-mode sterilization with a maximum temperature of 125 °C in the sterilization chamber allows the process duration to be reduced by 10 %. However, the application of this regime in practice requires additional research. The described approach, based on the developed mathematical models of the process, makes it possible to obtain optimal stepwise and variable canned food sterilization regimes with high energy efficiency and product quality.
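    The sterilizing effect quoted above (in conditional minutes) is conventionally obtained by integrating the lethality of the cold-spot time-temperature history. The sketch below shows a standard F-value calculation for a hypothetical temperature profile; the reference temperature and z-value are common defaults assumed here for illustration, not values taken from the paper or from the PRSC software.

```python
import numpy as np

def sterilizing_effect(times_min, temps_c, t_ref=121.1, z=10.0):
    """F = integral of 10**((T - Tref)/z) dt, evaluated with the trapezoidal rule."""
    t = np.asarray(times_min, dtype=float)
    lethality = 10.0 ** ((np.asarray(temps_c, dtype=float) - t_ref) / z)
    return float(np.sum(0.5 * (lethality[1:] + lethality[:-1]) * np.diff(t)))

# Hypothetical cold-spot profile logged every 5 minutes (heating, holding, cooling).
t = np.arange(0, 65, 5)
T = np.array([20, 60, 90, 105, 115, 120, 122, 122, 122, 118, 100, 70, 40])
print(f"F-value = {sterilizing_effect(t, T):.1f} conditional minutes")
```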

  10. Further developments of the Neyman-Scott clustered point process for modeling rainfall

    Science.gov (United States)

    Cowpertwait, Paul S. P.

    1991-07-01

    This paper provides some useful results for modeling rainfall. It extends work on the Neyman-Scott cluster model for simulating rainfall time series. Several important properties have previously been found for the model, for example, the expectation and variance of the amount of rain captured in an arbitrary time interval (Rodriguez-Iturbe et al., 1987a). In this paper additional properties are derived, such as the probability of an arbitrary interval of any chosen length being dry. In applications this is a desirable property to have, and is often used for fitting stochastic rainfall models to field data. The model is currently being used in rainfall time series research directed toward improving sewage systems in the United Kingdom. To illustrate the model's performance, an example is given where the model is fitted to 10 years of hourly data taken from Blackpool, England.
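    For readers who want to experiment, the sketch below simulates a basic Neyman-Scott rectangular-pulse rainfall series: Poisson storm origins, a Poisson number of cells per storm, exponentially delayed cell origins, and exponentially distributed cell durations and intensities, aggregated to hourly depths. The parameter values are arbitrary placeholders, not those fitted to the Blackpool record, and the cell-number and displacement distributions are assumptions chosen for simplicity.

```python
import numpy as np

def simulate_nsrp(hours, lam=0.02, mu_c=5.0, beta=0.1, eta=0.5, xi=2.0, seed=0):
    """Neyman-Scott rectangular pulses (illustrative parameterisation).
    lam : storm arrival rate (1/h)        mu_c : mean number of cells per storm
    beta: cell displacement rate (1/h)    eta  : cell duration rate (1/h)
    xi  : mean cell intensity (mm/h)
    """
    rng = np.random.default_rng(seed)
    rain = np.zeros(hours)
    n_storms = rng.poisson(lam * hours)
    storm_origins = rng.uniform(0, hours, n_storms)
    for t0 in storm_origins:
        for _ in range(rng.poisson(mu_c)):
            start = t0 + rng.exponential(1.0 / beta)      # cell origin after storm origin
            duration = rng.exponential(1.0 / eta)
            intensity = rng.exponential(xi)               # mm/h, constant over the cell
            a, b = start, min(start + duration, hours)
            for h in range(int(a), int(np.ceil(b))):      # add depth to overlapped hours
                if 0 <= h < hours:
                    overlap = min(b, h + 1) - max(a, h)
                    rain[h] += intensity * max(overlap, 0.0)
    return rain

series = simulate_nsrp(24 * 365 * 10)                     # ten years of hourly data
print("mean hourly depth (mm):", series.mean().round(3))
print("proportion of dry hours:", (series == 0).mean().round(3))
```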

  11. Acidic deposition: State of science and technology. Report 2. Atmospheric processes research and process model development. Final report

    International Nuclear Information System (INIS)

    Hicks, B.B.; Draxler, R.R.; Albritton, D.L.; Fehsenfeld, F.C.; Davidson, C.I.

    1990-10-01

    The document represents an attempt to put together, in one place, a summary of the present state of knowledge concerning those processes that affect air concentrations of acidic and acidifying pollutants, during their transport, from emission to deposition. It is not intended to be an all-encompassing review of the entire breadth of each of the contributing disciplines, but instead focuses on those areas where the state of science has improved over the last decade--the period of the National Acid Precipitation Assessment Program. The discussion is not limited to NAPAP activities, although it is clear that the products of NAPAP research are perhaps given greater attention than are the results obtained elsewhere. This bias is partially intentional, since it is the INTEGRATED ASSESSMENT that is currently being prepared by NAPAP that constitutes the 'client' for the material presented here. The integrated assessment pays attention to the North American situation alone, and hence the present work gives greatest attention to the North American case, but with awareness of the need to place this particular situation in the context of the rest of the world

  12. Models of development and educational styles in the process of emancipation of Latin America: the case of Brazil

    Directory of Open Access Journals (Sweden)

    Dermeval SAVIANI

    2011-07-01

    Full Text Available On the occasion of the commemoration of the 200 years of Independence of Latin American countries, this paper analyses the models of development and educational styles in the process of the emancipation of Ibero-America, focusing specifically on the Brazilian case. In order to do this, we use two key texts as a reference: Gregorio Weinberg’s Modelos educativos en el desarrollo histórico de América Latina (Models of Education in the Historical Development of Latin America and Germán Rama’s Estilos educacionales (Educational Styles. Both texts elaborate the educational models or styles that took part in the historical development of Latin American societies. Bearing in mind the polarization between tradition and the modernization displayed in the educational models and styles proposed by Weinberg and Rama, this work shows how the process of conservative modernization, which characterized —with different nuances— the general emancipation movement in Ibero-American countries, took place in Brazilian society.

  13. THE MATHEMATICAL MODEL DEVELOPMENT OF THE ETHYLBENZENE DEHYDROGENATION PROCESS KINETICS IN A TWO-STAGE ADIABATIC CONTINUOUS REACTOR

    Directory of Open Access Journals (Sweden)

    V. K. Bityukov

    2015-01-01

    Full Text Available The article is devoted to the mathematical modeling of the kinetics of ethylbenzene dehydrogenation in a two-stage adiabatic reactor with a catalytic bed operating in continuous mode. The chemical reactions taking place in parallel with the main styrene-forming reaction were analysed; on this basis a number of assumptions were made, from which a kinetic scheme describing the mechanism of the chemical reactions during the dehydrogenation process was developed. A mathematical model of the dehydrogenation process is developed that describes the dynamics of the chemical reactions taking place in each of the two stages of the reactor block at a constant temperature. The rate constants of the direct and reverse reactions for the formation and consumption of each component of the reaction mixture were estimated. The dynamics of the starting material (ethylbenzene feed) concentration was obtained, as well as the dynamics of styrene formation and of all byproducts of dehydrogenation (benzene, toluene, ethylene, carbon, hydrogen, etc.). The calculated variations of the component composition of the reaction mixture during its passage through the first and second stages of the reactor showed that the proposed mathematical description adequately reproduces the kinetics of the process under investigation. This demonstrates the adequacy of the developed model and the reliability of the values found for the reaction rate constants, which enables the use of the model for calculating the kinetics of ethylbenzene dehydrogenation in nonisothermal mode in order to determine the optimal temperature trajectory of reactor operation. In the future, this will reduce energy and resource consumption, increase the volume of produced styrene, and improve the economic indexes of the process.
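    A minimal sketch of the kind of kinetic integration described above is given below, retaining only the main reversible reaction (ethylbenzene <-> styrene + hydrogen) under isothermal conditions. The rate constants, the initial composition and the neglect of side reactions are assumptions made for illustration, not the values identified in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Main reversible reaction only: EB <-> ST + H2 (side reactions neglected here).
K_F, K_R = 0.15, 0.03          # illustrative forward (1/s) and reverse (L/(mol*s)) constants

def rhs(t, c):
    eb, st, h2 = c
    r = K_F * eb - K_R * st * h2      # net rate of the main reaction
    return [-r, r, r]

c0 = [1.0, 0.0, 0.0]                  # mol/L, assumed feed composition
sol = solve_ivp(rhs, (0.0, 60.0), c0, dense_output=True)
eb, st, h2 = sol.y[:, -1]
print(f"conversion of ethylbenzene after 60 s: {(1 - eb / c0[0]) * 100:.1f} %")
print(f"styrene concentration: {st:.3f} mol/L")
```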

  14. MODELLING THE DEVELOPMENT OF THE INTEGRATION PROCESSES DIRECTION IN THE BAKING INDUSTRY

    OpenAIRE

    Tetyana Kublikova; Svetlana Stupak

    2013-01-01

    The paper presents the characteristics of the economic interaction between organizations and enterprises within the system of cluster type and the direction of their investment and innovation transformation through the implementation of the integration processes in the bakery industry.

  15. Development of Physics-Based Numerical Models for Uncertainty Quantification of Selective Laser Melting Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the proposed research is to characterize the influence of process parameter variability inherent to Selective Laser Melting (SLM) and performance effect...

  16. Development of a computer model (REASON) for the simulation of behavioural decisions on the basis of inference and valuation processes

    International Nuclear Information System (INIS)

    Engemann, A.; Radtke, M.; Sachs, S.

    1981-07-01

    A computer model for the simulation of behavioural decisions and the preceding inference and valuation processes is under development under the sponsorship of the 'Stiftung Volkswagenwerk'. The present paper describes the basic ideas of the model from both the psychological and the mathematical point of view. The interdisciplinary character of the project is demonstrated quite clearly. In a semantic network, which contains knowledge, values and standards related to the field under consideration, feasible actions and their consequences are evaluated. According to the behavioural model of Ajzen and Fishbein, the valuations of the consequences are multiplied by the expectations and added up. The command language of the program allows an algebraic definition of the Fishbein formula. The concept, consisting of object structures, predicates and implications, can be described in nearly natural language. In addition to this 'expected utility model', the program provides the possibility to exclude options by thresholds for the simulation of simplistic heuristics. (orig.) [de
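    In its simplest algebraic form, the Ajzen-Fishbein aggregation mentioned above reduces to a weighted sum of consequence valuations and their subjective expectations, as in the hypothetical sketch below; the option names, numbers and threshold are invented, and the REASON program's own command language and semantic network are not represented.

```python
# Minimal sketch of the Fishbein-style aggregation: score(option) = sum_i e_i * v_i,
# where e_i is the expectation (subjective probability) of consequence i and v_i its valuation.
options = {
    # option: list of (expectation, valuation) pairs -- hypothetical values
    "take the car": [(0.9, +2.0), (0.4, -3.0), (0.2, -1.0)],
    "take the bus": [(0.7, +1.0), (0.1, -3.0), (0.6, -0.5)],
}

MIN_SCORE = -1.0   # simple threshold used to exclude unacceptable options (assumed)

def fishbein_score(consequences):
    return sum(e * v for e, v in consequences)

scores = {name: fishbein_score(cons) for name, cons in options.items()}
feasible = {name: s for name, s in scores.items() if s >= MIN_SCORE}
best = max(feasible, key=feasible.get)
print(scores)
print("chosen option:", best)
```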

  17. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    The present thesis considers numerical modeling of activated sludge tanks at municipal wastewater treatment plants. Focus is aimed at integrated modeling where the detailed microbiological model, the Activated Sludge Model 3 (ASM3), is combined with a detailed hydrodynamic model based on a numerical... solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that often occur in activated sludge tanks are initially... hydrofoil shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements

  18. Organizational and Functional Modeling of Business Processes for Management of the Development Project Efficiency

    OpenAIRE

    Soboleva Elena

    2017-01-01

    The article is devoted to development project management issues in the current economic situation in Russia, to the construction industry, to the problems of development projects implementation in Russia and to the impact of the quality of projects in the construction industry, to assessment of the impact of external project environment on the effectiveness of project activities in crisis, as well as to project management quality. A methodological approach to qualitative management of develop...

  19. INCORPORATION OF MECHANISTIC INFORMATION IN THE ARSENIC PBPK MODEL DEVELOPMENT PROCESS

    Science.gov (United States)

    INCORPORATING MECHANISTIC INSIGHTS IN A PBPK MODEL FOR ARSENIC. Elaina M. Kenyon, Michael F. Hughes, Marina V. Evans, David J. Thomas, U.S. EPA; Miroslav Styblo, University of North Carolina; Michael Easterling, Analytical Sciences, Inc. A physiologically based phar...

  20. A subjective and objective fuzzy-based analytical hierarchy process model for prioritization of lean product development practices

    Directory of Open Access Journals (Sweden)

    Daniel O. Aikhuele

    2017-06-01

    Full Text Available In this paper, a subjective and objective fuzzy-based Analytical Hierarchy Process (AHP) model is proposed. The model, which is based on a newly defined evaluation matrix, replaces the fuzzy comparison matrix (FCM) in the traditional fuzzy AHP model, which has been found ineffective and time-consuming when the number of criteria/alternatives is increased. The main advantage of the new model is that it is straightforward and completely eliminates the repetitive adjustment of data that is common with the FCM in the traditional AHP model. The model reduces the complete dependency on human judgment in prioritization assessment, since the weight values are solved automatically using the evaluation matrix and the modified priority weight formula in the proposed model. By virtue of a numerical case study, the model is successfully applied in the determination of the implementation priorities of lean practices for a product development environment and compared with similar computational methods in the literature.
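    Because the paper's evaluation matrix and modified priority-weight formula are specific to that work, the sketch below only illustrates the general idea of deriving priority weights from a direct evaluation matrix (alternatives rated against criteria) by column normalisation and weighted aggregation; it should not be read as the authors' method, and all ratings and criteria weights are invented.

```python
import numpy as np

# Hypothetical direct evaluation matrix: rows = alternatives (lean practices),
# columns = criteria, entries = ratings on a 1-9 scale (invented values).
ratings = np.array([
    [7, 5, 8],    # practice A
    [4, 6, 5],    # practice B
    [9, 3, 6],    # practice C
], dtype=float)
criteria_weights = np.array([0.5, 0.2, 0.3])   # assumed criteria importance, sums to 1

col_normalised = ratings / ratings.sum(axis=0)        # each criterion column sums to 1
priorities = col_normalised @ criteria_weights        # weighted aggregation per alternative
for name, p in zip(["A", "B", "C"], priorities):
    print(f"practice {name}: priority {p:.3f}")
print("implementation order:", ["ABC"[i] for i in np.argsort(priorities)[::-1]])
```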

  1. MODELLING THE DEVELOPMENT OF THE INTEGRATION PROCESSES DIRECTION IN THE BAKING INDUSTRY

    Directory of Open Access Journals (Sweden)

    Tetyana Kublikova

    2013-09-01

    Full Text Available The paper presents the characteristics of the economic interaction between organizations and enterprises within the system of cluster type and the direction of their investment and innovation transformation through the implementation of the integration processes in the bakery industry.

  2. Development of Landscape Metrics to Support Process-Driven Ecological Modeling

    Science.gov (United States)

    2014-04-01

    channel experiences shoaling due to strong tidal currents transporting sediments and has a symmetrical north-south, tide-dominant ebb delta. A 350...quantitative relationships can be established between landscape pattern formation and environmental or geomorphic processes, then those relationships could...

  3. Integrated water system simulation by considering hydrological and biogeochemical processes: model development, with parameter sensitivity and autocalibration

    Science.gov (United States)

    Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.; Xia, J.

    2016-02-01

    Integrated water system modeling is a feasible approach to understanding severe water crises in the world and promoting the implementation of integrated river basin management. In this study, a classic hydrological model (the time variant gain model: TVGM) was extended to an integrated water system model by coupling multiple water-related processes in hydrology, biogeochemistry, water quality, and ecology, and considering the interference of human activities. A parameter analysis tool, which included sensitivity analysis, autocalibration and model performance evaluation, was developed to improve modeling efficiency. To demonstrate the model performances, the Shaying River catchment, which is the largest highly regulated and heavily polluted tributary of the Huai River basin in China, was selected as the case study area. The model performances were evaluated on the key water-related components including runoff, water quality, diffuse pollution load (or nonpoint sources) and crop yield. Results showed that our proposed model simulated most components reasonably well. The simulated daily runoff at most regulated and less-regulated stations matched well with the observations. The average correlation coefficient and Nash-Sutcliffe efficiency were 0.85 and 0.70, respectively. Both the simulated low and high flows at most stations were improved when the dam regulation was considered. The daily ammonium-nitrogen (NH4-N) concentration was also well captured with the average correlation coefficient of 0.67. Furthermore, the diffuse source load of NH4-N and the corn yield were reasonably simulated at the administrative region scale. This integrated water system model is expected to improve the simulation performances with extension to more model functionalities, and to provide a scientific basis for the implementation in integrated river basin managements.
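
    The runoff evaluation metrics cited here, the correlation coefficient and the Nash-Sutcliffe efficiency, are standard and easy to reproduce; the sketch below computes both for a pair of observed/simulated daily runoff series. The sample arrays are placeholders, not data from the Shaying River study.

      import numpy as np

      # Placeholder daily runoff series (m^3/s); replace with observed/simulated data.
      obs = np.array([12.0, 15.5, 30.2, 80.1, 45.3, 22.8, 18.4])
      sim = np.array([10.5, 16.2, 28.9, 72.4, 50.1, 24.0, 17.1])

      def nash_sutcliffe(obs, sim):
          """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      r = np.corrcoef(obs, sim)[0, 1]      # Pearson correlation coefficient
      nse = nash_sutcliffe(obs, sim)
      print(f"r = {r:.2f}, NSE = {nse:.2f}")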

  4. Numerical Modeling of Anaerobic Microzones Development in Bulk Oxic Porous Media: An Assessment of Different Microzone Formation Processes

    Science.gov (United States)

    Roy Chowdhury, S.; Zarnetske, J. P.; Briggs, M. A.; Day-Lewis, F. D.; Singha, K.

    2017-12-01

    Soil and groundwater research indicates that unique biogeochemical "microzones" commonly form within bulk soil masses. The formation of these microzones at the pore-scale has been attributed to a number of causes, including variability of in situ carbon or nutrient sources, intrinsic physical conditions that lead to dual-porosity and mass transfer conditions, or microbial bioclogging of the porous media. Each of these causes, while documented in different porous media systems, potentially can lead to the presence of anaerobic pores residing in a bulk oxic domain. The relative role of these causes operating independently or in conjunction with each other to form microzones is not known. Here, we use a single numerical modeling framework to assess the relative roles of each process in creating anaerobic microzones. Using a two-dimensional pore-network model, coupled with a microbial growth model based on Monod kinetics, simulations were performed to explore the development of these anoxic microzones and their fate under a range of hydrologic, nutrient, and microbial conditions. Initial results parameterized for a stream-groundwater exchange environment (i.e., a hyporheic zone) indicate that external forcing of fluid flux in the domain is a key soil characteristic to anaerobic microzone development as fluid flux governs the nutrient flux. The initial amount of biomass present in the system also plays a major role in the development of the microzones. In terms of dominant in situ causes, the intrinsic physical structure of the local pore space is found to play the key role in development of anaerobic sites by regulating fluxes to reaction sites. Acknowledging and understanding the drivers of these microzones will improve the ability of multiple disciplines to measure and model reactive mass transport in soils and assess if they play a significant role for particular biogeochemical processes and ecosystem functions, such as denitrification and greenhouse gas production.
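
    As a rough illustration of how Monod-limited microbial respiration can drive a pore anoxic when oxygen resupply is slow, the sketch below integrates a single well-mixed pore with first-order exchange to the bulk oxic water. All parameter values (Monod constants, exchange rate, initial biomass, anoxia threshold) are illustrative assumptions, not values from the pore-network model described above.

      import numpy as np
      from scipy.integrate import solve_ivp

      mu_max, K_s, Y = 0.3, 1.0, 0.5      # 1/h, mg O2/L, biomass yield (illustrative)
      k_ex, C_bulk = 0.05, 8.0            # 1/h exchange rate to bulk water, bulk O2 (mg/L)

      def rhs(t, y):
          C, X = y                        # pore O2 concentration, biomass
          growth = mu_max * C / (K_s + C) * X
          dC = k_ex * (C_bulk - C) - growth / Y   # resupply minus respiration
          dX = growth
          return [dC, dX]

      sol = solve_ivp(rhs, (0.0, 72.0), [8.0, 0.5], max_step=0.1)
      anoxic = sol.t[sol.y[0] < 0.5]      # times when pore O2 < 0.5 mg/L ("anaerobic microzone")
      if anoxic.size:
          print(f"pore turns anoxic after ~{anoxic[0]:.1f} h")
      else:
          print("pore stays oxic over the simulated period")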

  5. Model Process Control Language

    Data.gov (United States)

    National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

  6. Processes and models for serious game design and development

    NARCIS (Netherlands)

    Braad, Eelco; Žavcer, Gregor; Sandovar, Alyea; Dörner, Ralf; Göbel, Stefan; Kickmeier-Rust, Michael; Masuch, Maic; Zweig, Katharina

    2016-01-01

    A serious game needs to combine a number of different aspects to help the end user in reaching the desired effects. This requires incorporating a broad range of different aspects in the design, stemming from a broad range of different fields of expertise. For designers, developers, researchers, and

  7. Heterogeneity, uncertainty and process identification in early diagenesis : New model developments with applications to biological mixing

    NARCIS (Netherlands)

    Meile, C.D.

    2003-01-01

    Within the last decades, there have been spectacular developments in experimental and analytical techniques that allow geochemists and biologists to acquire ever more detailed data sets on aquatic sediments. These data sets often combine high-resolution chemical distributions with rate

  9. Modelling the dynamics of agricultural development : a process approach : the case of Koutiala (Mali)

    NARCIS (Netherlands)

    Struif Bontkes, T.

    1999-01-01

    Introduction

    Sustainability of agricultural production and food supply is threatened in many developing countries by human population growth. The increasing food requirement forces the population to extend the cultivated areas to less fertile areas, often without taking

  10. THE CONCEPTUAL FOUR-SECTOR MODEL OF DEVELOPMENT OF THE COGNITIVE PROCESS DIMENSIONS IN ABSTRACT VISUAL THINKING

    Directory of Open Access Journals (Sweden)

    Kateřina Berková

    2018-04-01

    Full Text Available The research deals with the development of cognitive process dimensions in economic education. The aim is to research factors that influence the academic achievement of students according to their intellectual level and grades. The researchers used a quantitative research design based on a standardized assessment of intelligence and a non-standardized questionnaire. The questionnaire was used to analyse the pedagogical competences of the teachers of economic subjects from the students' point of view, in close relation to the management of teaching and its impact on the motivation to learn and the achievement of students in these subjects. The respondents were 277 Czech students aged 16-17 who were divided into groups according to their intellectual level and grades. The data were analysed by a correlation analysis and a multiple regression model. In conclusion, the following can be stated: (a) From the point of view of above-average intelligent students, expertise can be considered an important competency of the teacher; when teaching average intelligent students, communication and presentation skills seem to be important. (b) It is desirable to develop cognitive processes and critical thinking actively, to lead students to become aware of changes in their own thinking, and to orient them towards mastery goals. (c) Particularly for students with weaker results it is necessary to create intrinsic motivation, which develops cognition and thus is able to develop higher cognitive dimensions further. The links between these areas are of utmost importance for education and, above all, for developing students' scholarship. Each student can be educated, and it is necessary to influence them to develop their personality and all of their potential abilities. The conceptual four-sector model represents the initial pathway to lead students who are differentiated according to the intellectual level and academic achievement to the active development of thinking, learning

  11. Modeling and analysis of chill and fill processes for the cryogenic storage and transfer engineering development unit tank

    Science.gov (United States)

    Hedayat, A.; Cartagena, W.; Majumdar, A. K.; LeClair, A. C.

    2016-03-01

    NASA's future missions may require long-term storage and transfer of cryogenic propellants. The Engineering Development Unit (EDU), a NASA in-house effort supported by both Marshall Space Flight Center (MSFC) and Glenn Research Center, is a cryogenic fluid management (CFM) test article that primarily serves as a manufacturing pathfinder and a risk reduction task for a future CFM payload. The EDU test article comprises a flight-like tank, internal components, insulation, and attachment struts. The EDU is designed to perform integrated passive thermal control performance testing with liquid hydrogen (LH2) in a test-like vacuum environment. A series of tests, with LH2 as a testing fluid, was conducted at Test Stand 300 at MSFC during the summer of 2014. The objective of this effort was to develop a thermal/fluid model for evaluating the thermodynamic behavior of the EDU tank during the chill and fill processes. The Generalized Fluid System Simulation Program, an MSFC in-house general-purpose computer program for flow network analysis, was utilized to model and simulate the chill and fill portion of the testing. The model contained the LH2 supply source, feed system, EDU tank, and vent system. The test setup, modeling description, and comparison of model predictions with the test data are presented.

  12. Developing a Steady-state Kinetic Model for Industrial Scale Semi-Regenerative Catalytic Naphtha Reforming Process

    Directory of Open Access Journals (Sweden)

    Seif Mohaddecy, R.

    2014-05-01

    Full Text Available Due to the demand for high octane gasoline as a transportation fuel, the catalytic naphtha reformer has become one of the most important processes in petroleum refineries. In this research, steady-state modelling of a catalytic fixed-bed naphtha reforming process to predict the key output variables was studied. These variables were octane number, yield, hydrogen purity, and the temperature of all reforming reactors. To do so, an industrial-scale semi-regenerative catalytic naphtha reforming unit was studied and modelled. In addition, to evaluate the developed model, the predicted variables, i.e. outlet temperatures of the reactors, research octane number, yield of gasoline, and hydrogen purity, were compared against actual data. The results showed close agreement between the actual and predicted variables; the mean relative absolute deviations of the mentioned process variables were 0.38 %, 0.52 %, 0.54 %, 0.32 %, 4.8 % and 3.2 %, respectively.
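
    The validation statistic used above, the mean relative absolute deviation, is straightforward to compute; the sketch below shows it for one predicted variable, using placeholder values rather than the plant data of the study.

      import numpy as np

      def mrad(actual, predicted):
          """Mean relative absolute deviation, in percent."""
          actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
          return 100.0 * np.mean(np.abs(actual - predicted) / np.abs(actual))

      # Placeholder reactor outlet temperatures (deg C): measured vs. model-predicted.
      measured  = [495.0, 488.5, 482.0, 478.3]
      predicted = [493.2, 490.1, 480.5, 479.9]
      print(f"MRAD = {mrad(measured, predicted):.2f} %")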

  13. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    are affected (in a positive or negative way) by the presence of the other enzymes and compounds in the media. In this thesis the concept of multi-enzyme in-pot term is adopted for processes that are carried out by the combination of enzymes in a single reactor and implemented at pilot or industrial scale...... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases efficiency of the model development process with respect to time and resources needed (fast and effective....... In this way the model parameters that drives the main dynamic behavior can be identified and thus a better understanding of this type of processes. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid...

  14. Eliciting the Functional Processes of Apologizing for Errors in Health Care: Developing an Explanatory Model of Apology.

    Science.gov (United States)

    Prothero, Marie M; Morse, Janice M

    2017-01-01

    The purpose of this article was to analyze the concept development of apology in the context of errors in health care, including the administrative response and the policy and format/process of the subsequent apology. Using pragmatic utility and a systematic review of the literature, 29 articles and one book provided attributes involved in apologizing. Analytic questions were developed to guide the data synthesis, and types of apologies used in different circumstances were identified. The antecedents of apologizing, and the attributes and outcomes, were identified. A model was constructed illustrating the components of a complete apology, other types of apologies, and the ramifications/outcomes of each. Clinical implications of developing formal policies for correcting medical errors through apologies are recommended. Defining the essential elements of apology is the first step in establishing a just culture in health care. Respect for patient-centered care reduces retaliatory consequences following an error, and may even restore the physician-patient relationship.

  15. Development of rubber mixing process mathematical model and synthesis of control correction algorithm by process temperature mode using an artificial neural network

    Directory of Open Access Journals (Sweden)

    V. S. Kudryashov

    2016-01-01

    Full Text Available The article is devoted to the development of a correction control algorithm for the temperature mode of a batch rubber mixing process at JSC "Voronezh tire plant". The algorithm is designed to run in the main controller of the rubber mixing section, a Siemens S7 CPU319F-3 PN/DP, which generates setpoints for the local temperature controllers HESCH HE086 and Jumo dTRON304 operating the tempering stations. To develop the algorithm, a systematic analysis of the rubber mixing process as a control object was performed, and a mathematical model of the process was built from heat balance equations describing heat transfer through the walls of the technological devices, the change of coolant temperature, and the temperature of the rubber compound during mixing until discharge from the mixer chamber. Because of the complexity and nonlinearity of the control object (the rubber mixer), and given the available methods and extensive experience of controlling this device in an industrial environment, the correction algorithm is implemented on the basis of an artificial single-layer neural network; it corrects the setpoints for the local controllers according to the cooling water temperature and the air temperature in the workshop, which may vary considerably depending on the season, during prolonged operation of the equipment, or during its downtime. The tempering stations are controlled by changing the flow of cold water from the cooler and by on/off control of the heating elements. The analysis of the model experiment results and of practical tests, with the main controller programmed in the STEP 7 environment at the enterprise, showed a decrease in the mixing time for different types of rubber through a reduction of the heat transfer control error.
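
    A single-layer (one-neuron) correction network of the kind described can be written in a few lines: it maps the measured cooling-water and workshop-air temperatures to a correction added to the local controllers' temperature setpoints. The weights, reference temperatures, and output bound below are hypothetical placeholders; in the reported application such parameters would be trained on plant data.

      import numpy as np

      # Hypothetical trained parameters of a single-layer (one-neuron) correction network.
      w = np.array([0.8, 0.3])      # weights for [cooling water temp, workshop air temp]
      b = 0.0                       # bias
      ref = np.array([14.0, 22.0])  # nominal temperatures the network was trained around (deg C)

      def setpoint_correction(t_water, t_air):
          """Correction (deg C) added to the local controller temperature setpoint."""
          x = np.array([t_water, t_air]) - ref          # deviation from nominal conditions
          z = float(w @ x) + b
          return 5.0 * np.tanh(z / 5.0)                 # bounded correction, +/- 5 deg C

      print(setpoint_correction(t_water=18.0, t_air=30.0))   # e.g. warm summer conditions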

  16. Model integrating the processes of consumer perceived value creation and consumer relationship development

    OpenAIRE

    Vaitkienė, Rimgailė; Pilibaitytė, Vestina

    2008-01-01

    The article analyses the links between the processes of consumer perceived value creation and relationship development. It reviews relationship benefits as a component of consumer perceived value, the process of developing relationships with consumers and its stages, and the consumer perceived value creation process in the context of relationship marketing, and it presents a model integrating the processes of consumer perceived value creation and the development of relationships with consumers.

  17. Locating the Seventh Cervical Spinous Process: Development and Validation of a Multivariate Model Using Palpation and Personal Information.

    Science.gov (United States)

    Ferreira, Ana Paula A; Póvoa, Luciana C; Zanier, José F C; Ferreira, Arthur S

    2017-02-01

    The aim of this study was to develop and validate a multivariate prediction model, guided by palpation and personal information, for locating the seventh cervical spinous process (C7SP). A single-blinded, cross-sectional study at a primary to tertiary health care center was conducted for model development and temporal validation. One hundred sixty participants were prospectively included for the model development (n = 80) and time-split validation (n = 80) stages. The C7SP was located using the thorax-rib static method (TRSM). Participants underwent chest radiography for assessment of the inner body structure located with TRSM and using radio-opaque markers placed over the skin. Age, sex, height, body mass, body mass index, and the vertex-marker distance (DV-M) were used to predict the distance from the C7SP to the vertex (DV-C7). Multivariate linear regression modeling, a limits of agreement plot, a histogram of residuals, receiver operating characteristic (ROC) curves, and confusion tables were analyzed. The multivariate linear prediction model for DV-C7 (in centimeters) was DV-C7 = 0.986 DV-M + 0.018(mass) + 0.014(age) - 1.008. ROC curves had better discrimination of DV-C7 (area under the curve = 0.661; 95% confidence interval = 0.541-0.782; P = .015) than DV-M (area under the curve = 0.480; 95% confidence interval = 0.345-0.614; P = .761), with respective cutoff points at 23.40 cm (sensitivity = 41%, specificity = 63%) and 24.75 cm (sensitivity = 69%, specificity = 52%). The C7SP was correctly located more often when using predicted DV-C7 in the validation sample than when using the TRSM in the development sample: n = 53 (66%) vs n = 32 (40%), P information. Copyright © 2016. Published by Elsevier Inc.
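
    The published regression equation can be applied directly; the function below implements it as reported, with distances in centimeters and mass and age in the units used by the authors (presumably kilograms and years). The example inputs are hypothetical.

      def predict_dv_c7(dv_m_cm, mass, age):
          """Predicted vertex-to-C7SP distance (cm) from the reported regression:
          DV-C7 = 0.986*DV-M + 0.018*mass + 0.014*age - 1.008."""
          return 0.986 * dv_m_cm + 0.018 * mass + 0.014 * age - 1.008

      # Hypothetical participant: vertex-marker distance 24.0 cm, 70 kg, 35 years old.
      print(f"predicted DV-C7 = {predict_dv_c7(24.0, 70, 35):.1f} cm")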

  18. Developing mathematical modelling competence

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Jensen, Tomas Højgaard

    2003-01-01

    In this paper we introduce the concept of mathematical modelling competence, by which we mean being able to carry through a whole mathematical modelling process in a certain context. Analysing the structure of this process, six sub-competences are identified. Mathematical modelling competence...... cannot be reduced to these six sub-competences, but they are necessary elements in the development of mathematical modelling competence. Experience from the development of a modelling course is used to illustrate how the different nature of the sub-competences can be used as a tool for finding...... the balance between different kinds of activities in a particular educational setting. Obstacles of social, cognitive and affective nature for the students' development of mathematical modelling competence are reported and discussed in relation to the sub-competences....

  19. Development and Modeling of a Novel Self-Assembly Process for Polymer and Polymeric Composite Nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Sumpter, Bobby G. [ORNL; Carrillo, Jan-Michael Y. [ORNL; Ahn, Suk-Kyun [ORNL; Barnes, Mike D. [University of Massachusetts, Amherst; Shelton, William A. [Pacific Northwest National Laboratory (PNNL); Harrison, Robert J. [Stony Brook University (SUNY); W. Noid, Donald [Retired

    2017-10-01

    Extensive computational simulations and experiments have been used to investigate the structure, dynamics, and resulting photophysical properties of a number of para-phenylenevinylene (PPV)-based polymers and oligomers. These studies have shown how the morphology and structure are controlled to a large extent by the nature of the solute-solvent interactions in the initial solution-phase preparation. A good solvent such as dichloromethane generates non-compact structures with more of a defect-extended, chain-like morphology, while a poor solvent such as toluene leads to compact, organized, and folded structures with rod-like morphologies. Secondary structural organization is induced by using the solution-phase structures to generate solvent-free single-molecule nanoparticles. These nanoparticles are very compact and rod shaped, consisting of near-cofacial ordering of the conjugated PPV chain backbones between folds located at tetrahedral defects (sp3 C-C bonds). The resulting photophysical properties exhibit a significant enhancement in the photoluminescence quantum yield, lifetime, and stability. In addition, the single-molecule nanoparticles have Gaussian-like emission spectra with discrete center frequencies that are correlated to a conjugation length, allowing the design of nanoparticles that luminesce at a particular frequency. We followed a similar approach and applied a comparable methodology in our recent work on polythiophenes in order to study the effect of polymer architecture on nanoscale assembly. Unlike linear chains of comparable size, we observed aggregation of the bottlebrush architecture of poly(norbornene)-g-poly(3-hexylthiophene) (PNB-g-P3HT) after the freeze-drying and dissolution processes. This behavior can be attributed to a significant enhancement in the number of π-π interactions between grafted P3HT side chains.

  20. A database of wavefront measurements for laser system modeling, optical component development and fabrication process qualification

    International Nuclear Information System (INIS)

    Wolfe, C.R.; Lawson, J.K.; Aikens, D.M.; English, R.E.

    1995-01-01

    In the second half of the 1990's, LLNL and others anticipate designing and beginning construction of the National Ignition Facility (NIF). The NIF will be capable of producing the world's first laboratory-scale fusion ignition and burn reaction by imploding a small target. The NIF will utilize approximately 192 simultaneous laser beams for this purpose. The laser will be capable of producing a shaped energy pulse of at least 1.8 million joules (MJ) with a peak power of at least 500 trillion watts (TW). In total, the facility will require more than 7,000 large optical components. The performance of a high power laser of this kind can be seriously degraded by the presence of low amplitude, periodic modulations in the surface and transmitted wavefronts of the optics used. At high peak power, these phase modulations can convert into large intensity modulations by non-linear optical processes. This in turn can lead to loss of energy on target via many well known mechanisms. In some cases laser damage to the optics downstream of the source of the phase modulation can occur. The database described here contains wavefront phase maps of early prototype optical components for the NIF. It has only recently become possible to map the wavefront of these large aperture components with high spatial resolution. Modern large aperture static fringe and phase shifting interferometers equipped with large area solid state detectors have made this possible. In a series of measurements with these instruments, wide spatial bandwidth can be detected in the wavefront

  1. Integration of membrane distillation into traditional salt farming method: Process development and modelling

    Science.gov (United States)

    Hizam, S.; Bilad, M. R.; Putra, Z. A.

    2017-10-01

    Farmers still practice traditional salt farming in many regions, particularly in Indonesia. This archaic method not only produces low yields and poor salt quality, it is also laborious. Furthermore, the farming locations typically have poor access to fresh water and are far from the electricity grid, which restricts upgrades to more advanced salt production technology. This paper proposes a new salt harvesting concept that improves the salt yield and at the same time facilitates the recovery of fresh water from seawater. The new concept integrates solar-powered membrane distillation (MD) and photovoltaic cells to drive the pumping. We performed basic solar still experiments to quantify the heat flux received by a pond. The data were used as insight for designing the proposed concept, particularly the operational strategy and the most effective way to integrate MD. After the conceptual design had been developed, we formulated mass and energy balances to estimate the performance of the proposed concept. Based on our data and design, the system is expected to improve the yield and quality of the salt production, maximize fresh water harvesting, and eventually provide economic gains for salt farmers, hence improving their quality of life. The key performance can only be measured via experiment, using the gain output ratio (GOR) as the performance indicator, which will be done in a future study.
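
    The gain output ratio mentioned as the key performance indicator relates distillate production to thermal energy input; a minimal balance is sketched below. The flow and heat-input numbers are placeholders for the planned experiments, and the latent heat value is a standard approximation, not a figure from this work.

      H_FG = 2.33e6        # latent heat of vaporization near 60-70 deg C, J/kg (approximate)

      def gain_output_ratio(distillate_kg_per_h, thermal_input_W):
          """GOR = latent heat carried by the distillate / thermal energy supplied."""
          m_dot = distillate_kg_per_h / 3600.0          # kg/s
          return m_dot * H_FG / thermal_input_W

      # Placeholder operating point: 2.5 kg/h of distillate from 2.0 kW of solar-thermal input.
      print(f"GOR = {gain_output_ratio(2.5, 2000.0):.2f}")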

  2. Development of an empirical model for fluoride removal from photovoltaic wastewater by electrocoagulation process

    KAUST Repository

    Drouiche, Nadjib

    2011-05-01

    Electrocoagulation experiments were conducted with bipolar aluminium electrodes to determine the optimum conditions for fluoride removal from synthetic photovoltaic wastewater. A high fluoride concentration in community water supplies can cause fluorosis, which has a detrimental effect on human health, in particular on teeth and bones. A full 2³ factorial design of experiments was used to obtain the best conditions for fluoride removal from water solutions. The three factors considered were initial fluoride concentration, applied potential, and supporting electrolyte dosage. Two levels for each factor were used: supporting electrolyte (0 and 100), applied potential (10 and 30 V), and initial fluoride concentration (20 and 25 mg/L). Results showed that the optimum conditions for fluoride removal from photovoltaic wastewater containing an initial fluoride concentration of 20 mg/L were a supporting electrolyte dose of 100 mg/L and an applied potential of 30 V. These gave a residual fluoride concentration of 8.6 mg/L, which was below the standard discharge limit. A mathematical equation showing the relation between residual fluoride concentration and the effective variables was also developed. © 2011 Desalination Publications. All rights reserved.
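
    A full 2³ design and the corresponding first-order model can be set up in a few lines: code each factor at -1/+1, build the model matrix, and solve for the effects by least squares. The response values below are placeholders, not the measured residual fluoride concentrations from the study.

      import itertools
      import numpy as np

      # Coded levels (-1/+1) for: electrolyte dose (0/100 mg/L), potential (10/30 V), initial F (20/25 mg/L).
      runs = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

      # Placeholder responses: residual fluoride (mg/L) for the 8 runs, in the same order.
      y = np.array([15.2, 13.8, 11.5, 9.1, 16.0, 14.4, 12.3, 8.6])

      X = np.column_stack([np.ones(8), runs])            # intercept + main effects
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)
      for name, c in zip(["intercept", "electrolyte", "potential", "initial F"], coef):
          print(f"{name:12s} {c:+.2f}")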

  3. Development of an empirical model for fluoride removal from photovoltaic wastewater by electrocoagulation process

    KAUST Repository

    Drouiche, Nadjib; Aoudj, Salaheddine; Lounici, Hakim; Mahmoudi, Hacè ne; Ghaffour, NorEddine; Goosen, Mattheus F A

    2011-01-01

    Electrocoagulation experiments were conducted with bipolar aluminium electrodes to determine the optimum conditions for fluoride removal from synthetic photovoltaic wastewater. A high fluoride concentration in community water supplies can cause fluorosis, which has a detrimental effect on human health, in particular on teeth and bones. A full 2³ factorial design of experiments was used to obtain the best conditions for fluoride removal from water solutions. The three factors considered were initial fluoride concentration, applied potential, and supporting electrolyte dosage. Two levels for each factor were used: supporting electrolyte (0 and 100), applied potential (10 and 30 V), and initial fluoride concentration (20 and 25 mg/L). Results showed that the optimum conditions for fluoride removal from photovoltaic wastewater containing an initial fluoride concentration of 20 mg/L were a supporting electrolyte dose of 100 mg/L and an applied potential of 30 V. These gave a residual fluoride concentration of 8.6 mg/L, which was below the standard discharge limit. A mathematical equation showing the relation between residual fluoride concentration and the effective variables was also developed. © 2011 Desalination Publications. All rights reserved.

  4. Empirical Study on Sustainable Opportunities Recognition. A Polyvinyl Chloride (PVC) Joinery Industry Analysis Using Augmented Sustainable Development Process Model

    Directory of Open Access Journals (Sweden)

    Eduard-Gabriel Ceptureanu

    2017-09-01

    Full Text Available This paper analyzes factors influencing the recognition of sustainable opportunities using an augmented sustainability process model. The conceptual model uses two main factors, knowledge and motivation, and one moderating variable, social embeddedness. We investigated entrepreneurs from the PVC joinery industry and concluded that while market orientation and sustainable entrepreneurial orientation definitely and positively influence sustainable opportunity recognition, other variables such as knowledge of the natural/communal environment, awareness of sustainable development, or focus on success have less support. Among all variables analyzed, perception of the threat to the natural/communal environment and altruism toward others have the weakest impact on opportunity recognition. Finally, we concluded that social embeddedness has a moderating effect on sustainable opportunity recognition, even though the results were mixed.

  5. Aerospace Materials Process Modelling

    Science.gov (United States)

    1988-08-01

    Continuous Cooling Transformation diagram (CCT diagram): when an IT diagram is used in heat-treatment process modelling, a sudden (instantaneous) cooling is assumed... CE chooses instead to study thermo-mechanical properties with reference to a CCT diagram, which is thought to be more reliable in giving a true... This determination is, however, based on the following approximations: (i) a CCT diagram is valid only for the

  6. Is There Room for "Development" in Developmental Models of Information Processing Biases to Threat in Children and Adolescents?

    Science.gov (United States)

    Field, Andy P.; Lester, Kathryn J.

    2010-01-01

    Clinical and experimental theories assume that processing biases in attention and interpretation are a causal mechanism through which anxiety develops. Despite growing evidence that these processing biases are present in children and, therefore, develop long before adulthood, these theories ignore the potential role of child development. This…

  7. Computerized prediction of intensive care unit discharge after cardiac surgery: development and validation of a Gaussian processes model

    Directory of Open Access Journals (Sweden)

    Meyfroidt Geert

    2011-10-01

    Full Text Available Abstract Background The intensive care unit (ICU) length of stay (LOS) of patients undergoing cardiac surgery may vary considerably, and is often difficult to predict within the first hours after admission. The early clinical evolution of a cardiac surgery patient might be predictive of LOS. The purpose of the present study was to develop a predictive model for ICU discharge after non-emergency cardiac surgery, by analyzing the first 4 hours of data in the computerized medical record of these patients with Gaussian processes (GP), a machine learning technique. Methods Non-interventional study. Predictive modeling, separate development (n = 461) and validation (n = 499) cohorts. GP models were developed to predict the probability of ICU discharge the day after surgery (classification task), and to predict the day of ICU discharge as a discrete variable (regression task). GP predictions were compared with predictions by EuroSCORE, nurses and physicians. The classification task was evaluated using aROC for discrimination, and Brier Score, Brier Score Scaled, and the Hosmer-Lemeshow test for calibration. The regression task was evaluated by comparing median actual and predicted discharge, the loss penalty function (LPF, (actual - predicted)/actual), and by calculating root mean squared relative errors (RMSRE). Results Median (P25-P75) ICU length of stay was 3 (2-5) days. For classification, the GP model showed an aROC of 0.758, which was significantly higher than the predictions by nurses, but not better than EuroSCORE and physicians. The GP had the best calibration, with a Brier Score of 0.179 and a Hosmer-Lemeshow p-value of 0.382. For regression, GP had the highest proportion of patients with a correctly predicted day of discharge (40%), which was significantly better than the EuroSCORE (p Conclusions A GP model that uses PDMS data of the first 4 hours after admission in the ICU of scheduled adult cardiac surgery patients was able to predict discharge from the ICU as a
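
    A Gaussian-process classifier for this kind of task can be prototyped with standard tooling; the sketch below uses scikit-learn with synthetic admission features in place of the study's PDMS variables, so it illustrates the modelling step only, not the published model or its performance.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessClassifier
      from sklearn.gaussian_process.kernels import RBF
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n = 400
      X = rng.normal(size=(n, 5))                      # synthetic stand-ins for first-4-hour features
      logit = 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 2]
      y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # 1 = discharged the next day

      X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.5, random_state=0)
      gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
      gpc.fit(X_dev, y_dev)
      p = gpc.predict_proba(X_val)[:, 1]
      print(f"aROC on held-out half: {roc_auc_score(y_val, p):.3f}")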

  8. Process research and development

    Science.gov (United States)

    Bickler, D. B.

    1986-01-01

    The following major processes involved in the production of crystalline-silicon solar cells were discussed: surface preparation, junction formation, metallization, and assembly. The status of each of these processes, and the sequence in which these processes are applied, were described as they were in 1975, as they were in 1985, and what they might be in the future.

  9. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model of raw material supply is developed for processing enterprises that belong to a vertically integrated structure for the production and processing of dairy raw materials. The model is distinguished by its orientation toward achieving a cumulative effect for the integrated structure, which acts as the criterion function; this function is maximized by optimizing capacities, the volumes of raw material deliveries and their quality characteristics, the costs of industrial processing of raw materials, and the demand for dairy products.

  10. Development of interactive graphic user interfaces for modeling reaction-based biogeochemical processes in batch systems with BIOGEOCHEM

    Science.gov (United States)

    Chang, C.; Li, M.; Yeh, G.

    2010-12-01

    The BIOGEOCHEM numerical model (Yeh and Fang, 2002; Fang et al., 2003) was developed in FORTRAN for simulating reaction-based geochemical and biochemical processes with mixed equilibrium and kinetic reactions in batch systems. A complete suite of reactions, including aqueous complexation, adsorption/desorption, ion exchange, redox, precipitation/dissolution, acid-base reactions, and microbially mediated reactions, was embodied in this unique modeling tool. Any reaction can be treated as a fast/equilibrium or slow/kinetic reaction. An equilibrium reaction is modeled with an implicit finite rate governed by a mass action equilibrium equation or by a user-specified algebraic equation. A kinetic reaction is modeled with an explicit finite rate with an elementary rate, microbially mediated enzymatic kinetics, or a user-specified rate equation. None of the existing models has encompassed this wide array of scopes. To ease the input/output learning curve for the unique features of BIOGEOCHEM, an interactive graphic user interface was developed with the Microsoft Visual Studio and .NET tools. Several robust, user-friendly features, such as pop-up help windows, typo warning messages, and on-screen input hints, were implemented. All input data can be viewed in real time and are automatically formatted to conform to the input file format of BIOGEOCHEM. A post-processor for graphic visualization of simulated results was also embedded for immediate demonstrations. By following the data input windows step by step, error-free BIOGEOCHEM input files can be created even if users have little prior experience with FORTRAN. With this user-friendly interface, the time and effort needed to conduct simulations with BIOGEOCHEM can be greatly reduced.

  11. Synchronized mammalian cell culture: part II--population ensemble modeling and analysis for development of reproducible processes.

    Science.gov (United States)

    Jandt, Uwe; Barradas, Oscar Platas; Pörtner, Ralf; Zeng, An-Ping

    2015-01-01

    The consideration of inherent population inhomogeneities of mammalian cell cultures becomes increasingly important for systems biology study and for developing more stable and efficient processes. However, variations of cellular properties belonging to different sub-populations and their potential effects on cellular physiology and kinetics of culture productivity under bioproduction conditions have not yet been much in the focus of research. Culture heterogeneity is strongly determined by the advance of the cell cycle. The assignment of cell-cycle specific cellular variations to large-scale process conditions can be optimally determined based on the combination of (partially) synchronized cultivation under otherwise physiological conditions and subsequent population-resolved model adaptation. The first step has been achieved using the physical selection method of countercurrent flow centrifugal elutriation, recently established in our group for different mammalian cell lines which is presented in Part I of this paper series. In this second part, we demonstrate the successful adaptation and application of a cell-cycle dependent population balance ensemble model to describe and understand synchronized bioreactor cultivations performed with two model mammalian cell lines, AGE1.HNAAT and CHO-K1. Numerical adaptation of the model to experimental data allows for detection of phase-specific parameters and for determination of significant variations between different phases and different cell lines. It shows that special care must be taken with regard to the sampling frequency in such oscillation cultures to minimize phase shift (jitter) artifacts. Based on predictions of long-term oscillation behavior of a culture depending on its start conditions, optimal elutriation setup trade-offs between high cell yields and high synchronization efficiency are proposed. © 2014 American Institute of Chemical Engineers.

  12. DEVELOPMENT OF A KINETIC MODEL OF BOEHMITE DISSOLUTION IN CAUSTIC SOLUTIONS APPLIED TO OPTIMIZE HANFORD WASTE PROCESSING

    International Nuclear Information System (INIS)

    Disselkamp, R.S.

    2011-01-01

    Boehmite (i.e., aluminum oxyhydroxide) is a major non-radioactive component in Hanford and Savannah River nuclear tank waste sludge. Boehmite dissolution from sludge using caustic at elevated temperatures is being planned at Hanford to minimize the mass of material disposed of as high-level waste (HLW) during operation of the Waste Treatment Plant (WTP). To more thoroughly understand the chemistry of this dissolution process, we have developed an empirical kinetic model for aluminate production due to boehmite dissolution. Application of this model to Hanford tank wastes would allow predictability and optimization of the caustic leaching of aluminum solids, potentially yielding significant improvements to overall processing time, disposal cost, and schedule. This report presents an empirical kinetic model that can be used to estimate the aluminate production from the leaching of boehmite in Hanford waste as a function of the following parameters: (1) hydroxide concentration; (2) temperature; (3) specific surface area of boehmite; (4) initial soluble aluminate plus gibbsite present in waste; (5) concentration of boehmite in the waste; and (6) (pre-fit) Arrhenius kinetic parameters. The model was fit to laboratory, non-radioactive (e.g. 'simulant boehmite') leaching results, providing best-fit values of the Arrhenius A-factor, A, and apparent activation energy, EA, of A = 5.0 × 10^12 hour^-1 and EA = 90 kJ/mole. These parameters were then used to predict the boehmite leaching behavior observed in previously reported actual waste leaching studies. Acceptable aluminate versus leaching time profiles were predicted for waste leaching data from both Hanford and Savannah River site studies.
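
    The Arrhenius parameters quoted above fix the temperature dependence of the dissolution rate constant; the sketch below evaluates k(T) from them and, under the simplifying assumption of first-order loss of the remaining boehmite (the full model also depends on hydroxide concentration, surface area, and the other listed parameters), estimates the leach fraction over time.

      import numpy as np

      R = 8.314          # J/(mol K)
      A = 5.0e12         # 1/hour, pre-exponential factor from the fitted model
      EA = 90.0e3        # J/mol, apparent activation energy from the fitted model

      def k_boehmite(T_celsius):
          """Arrhenius rate constant (1/hour) at the given leach temperature."""
          T = T_celsius + 273.15
          return A * np.exp(-EA / (R * T))

      def fraction_dissolved(t_hours, T_celsius):
          """Simplified first-order estimate; the report's rate law has additional terms."""
          return 1.0 - np.exp(-k_boehmite(T_celsius) * t_hours)

      for T in (60, 80, 100):
          print(f"T = {T:3d} C: k = {k_boehmite(T):.3e} 1/h, "
                f"~{100 * fraction_dissolved(24.0, T):.0f}% dissolved in 24 h")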

  13. Building an Economic and Mathematical Model of Influence of Integration Processes Upon Development of Tourism in Ukraine

    Directory of Open Access Journals (Sweden)

    Yemets Mariya S.

    2013-12-01

    Full Text Available Today Ukraine is actively searching for its own way in world integration processes, pursuing a multi-vector foreign economic policy and moving toward integration with both the EU and the CIS countries. Taking into account the establishment of international tourist relations, the main task for Ukraine is to capture a larger share of world tourist arrivals. Therefore, in order to study the influence of integration processes on the development of tourism in the country, the author proposes the following model: building regression equations for the share of exports of tourist services of Ukraine to the CIS and EU countries, with the aim of further comparative analysis. The analysis leads to the conclusion that integration factors influence the development of international tourism; however, this influence is shown to be not unequivocal and in some cases even inconsistent. Identifying the directions of this interdependency makes it possible to build an efficient tourism policy by selecting adaptive directions of integration.

  14. Optical modeling and electrical properties of cadmium oxide nanofilms: Developing a meta–heuristic calculation process model

    Energy Technology Data Exchange (ETDEWEB)

    Abdolahzadeh Ziabari, Ali, E-mail: ali.abd.ziabari@gmail.com [Nano Research Lab, Lahijan Branch, Islamic Azad University, P.O. Box 1616, Lahijan (Iran, Islamic Republic of); Refahi Sheikhani, A. H. [Department of Applied Mathematics, Lahijan Branch, Islamic Azad University, Lahijan (Iran, Islamic Republic of); Nezafat, Reza Vatani [Department of Civil Engineering, Faculty of Technology, University of Guilan, Rasht (Iran, Islamic Republic of); Haghighidoust, Kasra Monsef [Department of Mechanical Engineering, Faculty of Technology, University of Guilan, Rasht (Iran, Islamic Republic of)

    2015-04-07

    Cadmium oxide thin films were deposited onto glass substrates by a sol–gel dip-coating method and annealed in air. The normal-incidence transmittance of the films was measured by a spectrophotometer. DC electrical parameters such as carrier concentration and mobility were analyzed by Hall effect measurements. A combination of the Forouhi–Bloomer and standard Drude models was used to simulate the optical constants and thicknesses of the films from the transmittance data. The transmittance spectra of the films in the visible domain of wavelengths were successfully fitted using a hybrid of a particle swarm optimization method and a genetic algorithm. The simulated transmittance is in good accordance with the measured spectrum over the whole measurement wavelength range. The electrical parameters obtained from the optical simulation are well consistent with those measured electrically by Hall effect measurements.
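
    The fitting step, an analytic optical model paired with a global stochastic optimizer, can be prototyped compactly. The sketch below fits a bare Drude dielectric function to synthetic data with SciPy's differential evolution as a stand-in for the paper's hybrid particle-swarm/genetic algorithm; it omits the Forouhi–Bloomer interband term and the thin-film transmittance calculation, and all parameter values are illustrative.

      import numpy as np
      from scipy.optimize import differential_evolution

      def drude_eps(omega, eps_inf, omega_p, gamma):
          """Complex Drude dielectric function."""
          return eps_inf - omega_p**2 / (omega**2 + 1j * gamma * omega)

      # Synthetic "measured" data generated from known parameters plus noise.
      omega = np.linspace(0.5, 5.0, 80)                 # angular frequency, arbitrary units
      true = (4.0, 2.5, 0.3)
      rng = np.random.default_rng(1)
      meas = drude_eps(omega, *true) + 0.02 * (rng.normal(size=80) + 1j * rng.normal(size=80))

      def cost(p):
          return np.sum(np.abs(drude_eps(omega, *p) - meas) ** 2)

      bounds = [(1.0, 10.0), (0.5, 5.0), (0.01, 1.0)]   # eps_inf, omega_p, gamma
      result = differential_evolution(cost, bounds, seed=1, tol=1e-8)
      print("fitted (eps_inf, omega_p, gamma):", np.round(result.x, 3))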

  15. Guiding gate-etch process development using 3D surface reaction modeling for 7nm and beyond

    Science.gov (United States)

    Dunn, Derren; Sporre, John R.; Deshpande, Vaibhav; Oulmane, Mohamed; Gull, Ronald; Ventzek, Peter; Ranjan, Alok

    2017-03-01

    Increasingly, advanced process nodes such as 7nm (N7) are fundamentally 3D and require stringent control of critical dimensions over high-aspect-ratio features. Process integration in these nodes requires a deep understanding of complex physical mechanisms to control critical dimensions from lithography through final etch. Polysilicon gate etch processes are critical steps in several device architectures for advanced nodes that rely on self-aligned patterning approaches to gate definition. These processes are required to meet several key metrics: (a) vertical etch profiles over high aspect ratios; (b) clean gate sidewalls free of etch process residue; (c) minimal erosion of liner oxide films protecting key architectural elements such as fins; and (d) residue-free corners at gate interfaces with critical device elements. In this study, we explore how hybrid modeling approaches can be used to model a multi-step finFET polysilicon gate etch process. Initial parts of the patterning process through hardmask assembly are modeled using process emulation. Important aspects of gate definition are then modeled using a particle Monte Carlo (PMC) feature-scale model that incorporates surface chemical reactions.1 When necessary, species and energy flux inputs to the PMC model are derived from simulations of the etch chamber. The modeled polysilicon gate etch process consists of several steps, including a hard mask breakthrough step (BT), main feature etch steps (ME), and over-etch steps (OE) that control gate profiles at the gate-fin interface. An additional constraint on this etch flow is that fin spacer oxides are left intact after the final profile tuning steps. A natural optimization required from these processes is to maximize vertical gate profiles while minimizing erosion of fin spacer films.2

  16. Development of a MODIS-Derived Surface Albedo Data Set: An Improved Model Input for Processing the NSRDB

    Energy Technology Data Exchange (ETDEWEB)

    Maclaurin, Galen [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Lab. (NREL), Golden, CO (United States); Xie, Yu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gilroy, Nicholas [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-12-01

    A significant source of bias in the transposition of global horizontal irradiance to plane-of-array (POA) irradiance arises from inaccurate estimations of surface albedo. The current physics-based model used to produce the National Solar Radiation Database (NSRDB) relies on model estimations of surface albedo from a reanalysis climatology produced at relatively coarse spatial resolution compared to that of the NSRDB. As an input to spectral decomposition and transposition models, more accurate surface albedo data from remotely sensed imagery at finer spatial resolutions would improve accuracy in the final product. The National Renewable Energy Laboratory (NREL) developed an improved white-sky (bi-hemispherical reflectance) broadband (0.3-5.0 μm) surface albedo data set for processing the NSRDB from two existing data sets: a gap-filled albedo product and a daily snow cover product. The Moderate Resolution Imaging Spectroradiometer (MODIS) sensors onboard the Terra and Aqua satellites have provided high-quality measurements of surface albedo at 30 arc-second spatial resolution and 8-day temporal resolution since 2001. The high spatial and temporal resolutions and the temporal coverage of the MODIS sensor will allow for improved modeling of POA irradiance in the NSRDB. However, cloud and snow cover interfere with MODIS observations of ground surface albedo, and thus they require post-processing. The MODIS production team applied a gap-filling methodology to interpolate observations obscured by clouds or ephemeral snow. This approach filled pixels with ephemeral snow cover because the 8-day temporal resolution is too coarse to accurately capture the variability of snow cover and its impact on albedo estimates. However, for this project, accurate representation of daily snow cover change is important in producing the NSRDB. Therefore, NREL also used the Integrated Multisensor Snow and Ice Mapping System data set, which provides daily snow cover observations of the
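
    One simple way the two inputs described above could be combined is a masked substitution: wherever the daily snow product reports cover, the gap-filled (snow-free) albedo is replaced by a snow albedo. The array values and the snow albedo of 0.6 below are illustrative assumptions, not the NSRDB production values or NREL's exact procedure.

      import numpy as np

      gapfilled_albedo = np.array([[0.15, 0.18], [0.22, 0.25]])   # MODIS-derived, snow-free (illustrative)
      snow_cover = np.array([[1, 0], [0, 1]], dtype=bool)         # daily snow flag (illustrative)
      SNOW_ALBEDO = 0.6                                           # assumed broadband snow albedo

      daily_albedo = np.where(snow_cover, SNOW_ALBEDO, gapfilled_albedo)
      print(daily_albedo)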

  17. Transient simulation of an endothermic chemical process facility coupled to a high temperature reactor: Model development and validation

    International Nuclear Information System (INIS)

    Brown, Nicholas R.; Seker, Volkan; Revankar, Shripad T.; Downar, Thomas J.

    2012-01-01

    Highlights: ► Models for a PBMR and a thermochemical sulfur cycle based hydrogen plant are developed. ► The models are validated against available data in the literature. ► A transient in the coupled reactor and hydrogen plant system is studied. ► For a loss-of-heat-sink accident, temperature feedback within the reactor core enables shutdown of the reactor. - Abstract: A high temperature reactor (HTR) is a candidate to drive high temperature water-splitting using process heat. While both high temperature nuclear reactors and hydrogen generation plants have individually reached high degrees of development, study of the coupled plant is lacking. Particularly absent are considerations of the transient behavior of the coupled plant, as well as studies of the safety of the overall plant. The aim of this document is to contribute knowledge to the effort of nuclear hydrogen generation. In particular, this study concerns the identification of safety issues in the coupled plant and the transient modeling of some leading candidates for implementation in the Nuclear Hydrogen Initiative (NHI). The Sulfur Iodine (SI) and Hybrid Sulfur (HyS) cycles are considered as candidate hydrogen generation schemes. Three thermodynamically derived chemical reaction chamber models are coupled to a well-known reference design of a high temperature nuclear reactor. These chemical reaction chamber models have several dimensions of validation, including detailed steady-state flowsheets, integrated loop test data, and bench-scale chemical kinetics. The models and coupling scheme are presented here, as well as a transient test case initiated within the chemical plant. A 50% feed flow failure within the chemical plant results in a slow loss-of-heat-sink (LOHS) accident in the nuclear reactor. Due to the temperature feedback within the reactor core, the nuclear reactor partially shuts down over 1500 s. Two distinct regions are identified within the coupled plant response: (1) immediate LOHS due to the loss of the sulfuric

  18. Silicon web process development

    Science.gov (United States)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1981-01-01

    The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.

  19. Decision making for business model development : A process study of effectuation and causation in new technology-based ventures

    NARCIS (Netherlands)

    Reymen, Isabelle; Berends, Hans; Oudehand, Rob; Stultiëns, Rutger

    2017-01-01

    This study investigates the decision-making logics used by new ventures to develop their business models. In particular, the authors focus on the logics of effectuation and causation and how their dynamics shape the development of business models over time. They found that the effectual decision-making

  20. Uranium processing developments

    International Nuclear Information System (INIS)

    Jones, J.Q.

    1977-01-01

    The basic methods for processing ore to recover the contained uranium have not changed significantly since the 1954-62 period. Improvements in mill operations have been the result of better or less expensive reagents, changes in equipment, and the successful resolution of many environmental matters. There is also an apparent trend toward large mills that can profitably process lower grade ores. The major thrust in the near future will not be on process technology but on the remaining environmental constraints associated with milling. At this time the main 'spotlight' is on tailings dam and impoundment area construction and reclamation. Plans must provide for an adequate safety factor for stability, no surface or groundwater contamination, and minimal discharge of radionuclides to unrestricted areas, as may be required by law. Solution mining methods must also provide plans to restore the groundwater to its original condition as defined by local groundwater regulations. Basic flowsheets (each to finished product) plus modified versions of the basic types are shown

  1. Process model repositories and PNML

    NARCIS (Netherlands)

    Hee, van K.M.; Post, R.D.J.; Somers, L.J.A.M.; Werf, van der J.M.E.M.; Kindler, E.

    2004-01-01

    Bringing system and process models together in repositories facilitates the interchange of model information between modelling tools, and allows the combination and interlinking of complementary models. Petriweb is a web application for managing such repositories. It supports hierarchical process

  2. Implementation of a Web-Based Organ Donation Educational Intervention: Development and Use of a Refined Process Evaluation Model

    Science.gov (United States)

    Harker, Laura; Bamps, Yvan; Flemming, Shauna St. Clair; Perryman, Jennie P; Thompson, Nancy J; Patzer, Rachel E; Williams, Nancy S DeSousa; Arriola, Kimberly R Jacob

    2017-01-01

    Background The lack of available organs is often considered to be the single greatest problem in transplantation today. Internet use is at an all-time high, creating an opportunity to increase public commitment to organ donation through the broad reach of Web-based behavioral interventions. Implementing Internet interventions, however, presents challenges including preventing fraudulent respondents and ensuring intervention uptake. Although Web-based organ donation interventions have increased in recent years, process evaluation models appropriate for Web-based interventions are lacking. Objective The aim of this study was to describe a refined process evaluation model adapted for Web-based settings and used to assess the implementation of a Web-based intervention aimed to increase organ donation among African Americans. Methods We used a randomized pretest-posttest control design to assess the effectiveness of the intervention website that addressed barriers to organ donation through corresponding videos. Eligible participants were African American adult residents of Georgia who were not registered on the state donor registry. Drawing from previously developed process evaluation constructs, we adapted reach (the extent to which individuals were found eligible, and participated in the study), recruitment (online recruitment mechanism), dose received (intervention uptake), and context (how the Web-based setting influenced study implementation) for Internet settings and used the adapted model to assess the implementation of our Web-based intervention. Results With regard to reach, 1415 individuals completed the eligibility screener; 948 (67.00%) were determined eligible, of whom 918 (96.8%) completed the study. After eliminating duplicate entries (n=17), those who did not initiate the posttest (n=21) and those with an invalid ZIP code (n=108), 772 valid entries remained. Per the Internet protocol (IP) address analysis, only 23 of the 772 valid entries (3.0%) were

  3. Implementation of a Web-Based Organ Donation Educational Intervention: Development and Use of a Refined Process Evaluation Model.

    Science.gov (United States)

    Redmond, Nakeva; Harker, Laura; Bamps, Yvan; Flemming, Shauna St Clair; Perryman, Jennie P; Thompson, Nancy J; Patzer, Rachel E; Williams, Nancy S DeSousa; Arriola, Kimberly R Jacob

    2017-11-30

    The lack of available organs is often considered to be the single greatest problem in transplantation today. Internet use is at an all-time high, creating an opportunity to increase public commitment to organ donation through the broad reach of Web-based behavioral interventions. Implementing Internet interventions, however, presents challenges including preventing fraudulent respondents and ensuring intervention uptake. Although Web-based organ donation interventions have increased in recent years, process evaluation models appropriate for Web-based interventions are lacking. The aim of this study was to describe a refined process evaluation model adapted for Web-based settings and used to assess the implementation of a Web-based intervention aimed to increase organ donation among African Americans. We used a randomized pretest-posttest control design to assess the effectiveness of the intervention website that addressed barriers to organ donation through corresponding videos. Eligible participants were African American adult residents of Georgia who were not registered on the state donor registry. Drawing from previously developed process evaluation constructs, we adapted reach (the extent to which individuals were found eligible, and participated in the study), recruitment (online recruitment mechanism), dose received (intervention uptake), and context (how the Web-based setting influenced study implementation) for Internet settings and used the adapted model to assess the implementation of our Web-based intervention. With regard to reach, 1415 individuals completed the eligibility screener; 948 (67.00%) were determined eligible, of whom 918 (96.8%) completed the study. After eliminating duplicate entries (n=17), those who did not initiate the posttest (n=21) and those with an invalid ZIP code (n=108), 772 valid entries remained. Per the Internet protocol (IP) address analysis, only 23 of the 772 valid entries (3.0%) were within Georgia, and only 17 of those

  4. Development and validation of predictive simulation model of multi-layer repair welding process by temper bead technique

    International Nuclear Information System (INIS)

    Okano, Shigetaka; Miyasaka, Fumikazu; Mochizuki, Masahito; Tanaka, Manabu

    2015-01-01

    Stress corrosion cracking (SCC) has recently been observed in the nickel-base alloy weld metal of dissimilar pipe joints used in pressurized water reactors (PWR). The temper bead technique has been developed as a repair procedure against SCC that is applicable when post-weld heat treatment (PWHT) is difficult to carry out. Before the technique is used in practice, however, it is essential to pass the property and performance qualification tests that confirm the effect of tempering on the mechanical properties of the repair welds. The appropriate welding procedure conditions for temper bead welding are therefore determined on the basis of property and performance qualification testing. This is necessary for certifying the structural soundness and reliability of repair welds, but it currently takes a great deal of work and time. It is therefore desirable to establish reasonable alternatives for qualifying the property and performance of repair welds. In this study, mathematical modeling and numerical simulation procedures were developed for predicting the weld bead configuration and temperature distribution during a multi-layer repair welding process using the temper bead technique. In the developed simulation technique, the characteristics of the heat source in temper bead welding are calculated from the weld heat input conditions through an arc plasma simulation, and the weld bead configuration and temperature distribution during temper bead welding are then calculated from the obtained heat source characteristics through a coupled analysis of bead surface shape and thermal conduction. The simulation results were compared with experimental results obtained under the same welding heat input conditions. The bead surface shape and temperature distribution, such as the Ac1 lines, were in good agreement between simulation and experiment. It was concluded that the developed simulation technique has the potential to become useful for

  5. INFORMATION MODELLING OF PROCESS OF ADOPTION OF ADMINISTRATIVE DECISIONS AT THE ORGANIZATION OF PROFESSIONAL DEVELOPMENT OF THE PERSONNEL

    Directory of Open Access Journals (Sweden)

    Yaroslav E. Prokushev

    2015-01-01

    Full Text Available The article is devoted to the problem of organizing the professional development of personnel. It considers two interconnected tasks: first, estimating the degree to which a specific worker needs professional development; and second, choosing the professional development programme. Functional information models of the corresponding administrative decision-making procedures are developed.

  6. Technology development life cycle processes.

    Energy Technology Data Exchange (ETDEWEB)

    Beck, David Franklin

    2013-05-01

    This report and set of appendices are a collection of memoranda originally drafted in 2009 for the purpose of providing motivation and the necessary background material to support the definition and integration of engineering and management processes related to technology development. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. As presented herein, the material begins with a survey of open literature perspectives on technology development life cycles, including published data on “what went wrong.” The main thrust of the material presents a rational exposé of a structured technology development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of the systems engineering process. The material concludes with a discussion on the use of multiple measures to assess technology maturity, including consideration of the viewpoint of potential users.

  7. Development of a Model for Measuring Scientific Processing Skills Based on Brain-Imaging Technology: Focused on the Experimental Design Process

    Science.gov (United States)

    Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju

    2014-01-01

    The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…

  8. Modeling styles in business process modeling

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Zugal, S.; Weber, B.; Weidlich, M.; Fahland, D.; Reijers, H.A.; Mendling, J.; Bider, I.; Halpin, T.; Krogstie, J.; Nurcan, S.; Proper, E.; Schmidt, R.; Soffer, P.; Wrycza, S.

    2012-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models. As a consequence, the question arises whether different ways of creating process models exist. In this vein, we observed 115 students engaged in the act of modeling, recording

  9. Development and modelling of a steel slag filter effluent neutralization process with CO2-enriched air from an upstream bioprocess.

    Science.gov (United States)

    Bove, Patricia; Claveau-Mallet, Dominique; Boutet, Étienne; Lida, Félix; Comeau, Yves

    2018-02-01

    The main objective of this project was to develop a steel slag filter effluent neutralization process by acidification with CO2-enriched air coming from a bioprocess. Sub-objectives were to evaluate the neutralization capacity of different configurations of neutralization units under lab-scale conditions and to propose a design model for steel slag effluent neutralization. Two lab-scale column neutralization units fed with two different types of influent were operated at a hydraulic retention time of 10 h. Tested variables were the mode of flow (saturated or percolating), type of media (none, gravel, Bionest and AnoxKaldnes K3), type of air (ambient or CO2-enriched) and airflow rate. One neutralization field test (saturated and no media, 2000-5000 ppm CO2, sequential feeding, hydraulic retention time of 7.8 h) was conducted for 7 days. Lab-scale and field-scale tests resulted in an effluent pH of 7.5-9.5 when the aeration rate was sufficiently high. A model based on the carbonate system, CO2 transfer and calcite precipitation was implemented in the PHREEQC software and calibrated on ambient-air lab tests. The model was validated with CO2-enriched air lab and field tests, providing satisfactory validation results over a wide range of CO2 concentrations. The flow mode had a major impact on CO2 transfer and hydraulic efficiency, while the type of media had little influence. The flow mode also had a major impact on the calcite surface concentration in the reactor: it was constant in saturated mode and increasing in percolating mode. Predictions could be made for different steel slag effluent pH values and different operating conditions (hydraulic retention time, CO2 concentration, media and mode of flow). The pH of the steel slag filter effluent and the CO2 concentration of the enriched air were the factors that most influenced the effluent pH of the neutralization process. An increased CO2 concentration in the enriched air reduced calcite precipitation
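    As a rough illustration of why CO2-enriched air pulls the effluent pH down (this is not the paper's PHREEQC model, which also handles calcite precipitation and CO2 transfer kinetics), the sketch below solves the carbonate-system charge balance for water of a given alkalinity equilibrated with a given CO2 fraction in the aeration gas. Equilibrium constants are standard 25 °C values; the alkalinity and gas compositions in the example are illustrative.

```python
# Illustrative sketch (not the paper's PHREEQC model): equilibrium pH of a
# high-alkalinity slag-filter effluent in contact with CO2-enriched air.
# Constants are standard 25 degC values; calcite precipitation is ignored.

def equilibrium_ph(alkalinity_eq_per_L, co2_fraction, pressure_atm=1.0):
    KH = 10**-1.47   # Henry's constant for CO2, mol/(L*atm)
    K1 = 10**-6.35   # H2CO3* <-> HCO3- + H+
    K2 = 10**-10.33  # HCO3- <-> CO3-- + H+
    Kw = 1e-14
    h2co3 = KH * co2_fraction * pressure_atm     # dissolved CO2 fixed by the gas phase

    def charge_imbalance(ph):
        h = 10**-ph
        hco3 = K1 * h2co3 / h
        co3 = K2 * hco3 / h
        oh = Kw / h
        # cations (alkalinity as net strong-base equivalents) + H+ vs anions
        return (alkalinity_eq_per_L + h) - (hco3 + 2*co3 + oh)

    lo, hi = 4.0, 13.0                           # bisection on pH
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if charge_imbalance(mid) > 0:            # excess positive charge -> pH must rise
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Ambient air (~400 ppm CO2) vs CO2-enriched air (~5000 ppm) for a 5 meq/L effluent
print(equilibrium_ph(0.005, 400e-6))   # roughly pH 8.9
print(equilibrium_ph(0.005, 5000e-6))  # roughly pH 7.8, inside the 7.5-9.5 target range
```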

  10. The processes of strategy development

    OpenAIRE

    Bailey, Andy; Johnson, Gerry

    1995-01-01

    This paper is concerned with the processes by which strategy is developed within organisations. It builds on research into the nature of strategy development being undertaken within the Centre for Strategic Management and Organisational Change at Cranfield School of Management. Initially the process of strategy development is discussed, a number of explanations of the process are presented and an integrated framework is developed. This framework is subsequently used to illustra...

  11. Process-Based Quality (PBQ) Tools Development

    Energy Technology Data Exchange (ETDEWEB)

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts.

  12. PSE in Pharmaceutical Process Development

    DEFF Research Database (Denmark)

    Gernaey, Krist; Cervera Padrell, Albert Emili; Woodley, John

    2011-01-01

    The pharmaceutical industry is under growing pressure to increase efficiency, both in production and in process development. This paper discusses the use of Process Systems Engineering (PSE) methods in pharmaceutical process development and searches for answers to questions such as: Which PSE

  13. Co-development of Problem Gambling and Depression Symptoms in Emerging Adults: A Parallel-Process Latent Class Growth Model.

    Science.gov (United States)

    Edgerton, Jason D; Keough, Matthew T; Roberts, Lance W

    2018-02-21

    This study examines whether there are multiple joint trajectories of depression and problem gambling co-development in a sample of emerging adults. Data were from the Manitoba Longitudinal Study of Young Adults (n = 679), which was collected in 4 waves across 5 years (age 18-20 at baseline). Parallel process latent class growth modeling was used to identify 5 joint trajectory classes: low decreasing gambling, low increasing depression (81%); low stable gambling, moderate decreasing depression (9%); low stable gambling, high decreasing depression (5%); low stable gambling, moderate stable depression (3%); moderate stable problem gambling, no depression (2%). There was no evidence of reciprocal growth in problem gambling and depression in any of the joint classes. Multinomial logistic regression analyses of baseline risk and protective factors found that only neuroticism, escape-avoidance coping, and perceived level of family social support were significant predictors of joint trajectory class membership. Consistent with the pathways model framework, we observed that individuals in the problem gambling only class were more likely to be using gambling as a stable way to cope with negative emotions. Similarly, high levels of neuroticism and low levels of family support were associated with increased odds of being in a class with moderate to high levels of depressive symptoms (but low gambling problems). The results suggest that interventions for problem gambling and/or depression need to focus on promoting more adaptive coping skills among more "at-risk" young adults, and such interventions should be tailored in relation to specific subtypes of comorbid mental illness.

  14. Welding process modelling and control

    Science.gov (United States)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not intended to be used for welding process control.

  15. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    This study investigated the role of cognitive processes as well as modelling processes in creating a process model (PM) in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information, and relational integration) and three process of process modelling phases (comprehension, …) were examined. The number of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and for teaching the craft of process modelling.

  16. Modelling the evolution of compacted bentonite clays in engineered barrier systems: process model development of the bentonite-water-air system

    International Nuclear Information System (INIS)

    Bond, A.E.; Wilson, J.C.; Maul, P.R.; Robinson, P.C.; Savage, D.

    2010-01-01

    considered to be 'bound' or otherwise immobile (specifically water held in bentonite interlayer sites and double layers) and water which is 'free' or mobile, comprising liquid water and water vapour. The disposition of the water is then constrained using thermodynamic data derived directly from laboratory studies to give a localised energy balance (including bentonite free energy and rock stress) which allows a bound water retention curve to be dynamically evaluated. In addition, a simple mass and volume balancing approach allows the micro-scale changes in porosity and bentonite grain volume to be converted into macro-scale bulk volume changes and water retention capacity. Indeed, the model largely abandons the concept of 'porosity' as a useful term when describing the state of fluids in bentonite, naturally considering 'capacities' to hold different types of water dependent on the physical and chemical condition of the bentonite. Migration of liquid water, air and water vapour is handled using conventional multi-phase-flow theory with some simple adjustments to selected parameterisation (mainly relative permeability and suction curves) to take into account the different water and air distribution model. The new model has been successfully applied to a series of benchmarking studies in the THERESA project, and examples of comparisons between model calculations and laboratory and field-test data are described in the paper. This model has been implemented in software based on Quintessa's general-purpose modelling code QPAC, which employs a fundamentally different approach to system discretization and process representation from most THM codes. The rapid prototyping and coupled process model development that the QPAC code facilitates has enabled the revised bentonite model to be implemented, producing a test-bed for investigating key features of EBS evolution. Although the application of the model is at an early stage, and further

  17. An Attack Model Development Process for the Cyber Security of Safety Related Nuclear Digital I and C Systems

    Energy Technology Data Exchange (ETDEWEB)

    Khand, Parvaiz Ahmed; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2007-10-15

    as a root node and different ways to achieve that attack as leaf nodes. The structure, syntax and semantics of attack trees can be seen in. In attack trees, the leaf nodes can take many kinds of values to evaluate different aspects of system security. For example, the possible/impossible value can be assigned to enumerate all sets of possible attacks that achieve the attack goal, probability values to evaluate the probability that the attack goal can be achieved, cost values to evaluate the minimum cost needed to reach the attack goal, and the special equipment value to obtain the most probable attack sets with no special equipment required. Although it is possible to implement security controls against almost any type of attack, it is not practical to protect everything. Attack trees also provide a systematic way to model security controls and plant-specific procedures as safeguards against attacks, and to check their effectiveness. In this paper, we will present a process for developing an attack model for the cyber security of safety related nuclear digital I and C systems using attack trees.
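    The value-propagation idea described above (possible/impossible flags and minimum attack cost) can be sketched in a few lines; the tree below is a hypothetical example with made-up node names and costs, not a model of an actual plant system.

```python
# Minimal attack-tree sketch of the value propagation described above:
# leaf nodes carry (possible, cost) attributes; AND nodes require all children
# and sum their costs, OR nodes require any child and take the cheapest one.

class Node:
    def __init__(self, name, gate=None, children=None, possible=True, cost=0.0):
        self.name = name
        self.gate = gate              # "AND", "OR", or None for a leaf
        self.children = children or []
        self.possible = possible      # leaf attribute
        self.cost = cost              # leaf attribute

    def evaluate(self):
        """Return (possible, minimum cost) for achieving this node's goal."""
        if not self.children:
            return self.possible, self.cost
        results = [c.evaluate() for c in self.children]
        if self.gate == "AND":
            possible = all(p for p, _ in results)
            cost = sum(c for _, c in results) if possible else float("inf")
        else:  # OR
            feasible = [c for p, c in results if p]
            possible = bool(feasible)
            cost = min(feasible) if feasible else float("inf")
        return possible, cost

# Hypothetical example (node names and costs are illustrative only)
root = Node("Disable safety I&C channel", gate="OR", children=[
    Node("Remote compromise", gate="AND", children=[
        Node("Bypass network segregation", cost=50.0),
        Node("Exploit maintenance interface", cost=20.0),
    ]),
    Node("Physical access to cabinet", possible=False, cost=5.0),
])
print(root.evaluate())   # -> (True, 70.0)
```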

  18. An Attack Model Development Process for the Cyber Security of Safety Related Nuclear Digital I and C Systems

    International Nuclear Information System (INIS)

    Khand, Parvaiz Ahmed; Seong, Poong Hyun

    2007-01-01

    root node and different ways to achieve that attack as leaf nodes. The structure, syntax and semantics of attack trees can be seen in. In attack trees, the leaf nodes can take many kinds of values to evaluate different aspects of system security. For example, the possible/impossible value can be assigned to enumerate all sets of possible attacks that achieve the attack goal, probability values to evaluate the probability that the attack goal can be achieved, cost value to evaluate the minimum cost needed to reach attack goal, and the special equipment value to obtain the most probable attack sets with no special equipment required. Although it is possible to implement security controls almost any type of attack, it is not practical to protect everything. Attack trees also provide a systematic way to model security controls and plant specific procedures as a safeguard against attacks, and check their effectiveness. In this paper, we will present a process for developing an attack model for the cyber security of safety related nuclear digital I and C systems using attack trees

  19. DEVELOPMENT OF MATHEMATICAL MODEL OF PROCESS OF BLACK CURRANT BERRIES DRYING IN VACUUM DEVICE WITH THE MICROWAVE POWER SUPPLY

    Directory of Open Access Journals (Sweden)

    S. T. Antipov

    2014-01-01

    Full Text Available Summary. The mathematical model allows the change of berry shape and of the structure of the berry layer during drying to be reproduced and studied at a qualitative level. During drying an individual berry gradually loses its elasticity, decreases in volume, its peel gathers in folds, and internal voids appear. The berry layer decreases in thickness, contacting berries stick firmly to each other as the folds of their peel align, and the layer is compacted as berries that have lost elasticity penetrate the voids between them. The model describes the black currant drying process in detail and therefore has a large number of adjustable parameters. Among them, the three technological parameters with the greatest influence on productivity and drying quality were chosen: the microwave radiation power P, the thickness of the berry layer h, and the ambient pressure p. The most important outputs of the model are three functions of time: the average moisture content of the layer Wcp(t), the rate of change of the average moisture content dWcp(t)/dt, and the average layer temperature Tcp(t). According to the standard classification of models, the proposed model is algorithmic rather than analytical: output characteristics are computed from the inputs not by analytical transformations (which is impossible in principle for the modelled process) but by means of spatial and temporal discretization and a corresponding computational algorithm. Detailed study of the microwave drying process by means of the model allows the following stages to be distinguished: rapid heating, rapid dehydration, slowed dehydration, consolidation of the product layer, final drying, and heating after dehydration.
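    A heavily simplified, lumped sketch of how such a model can turn the three inputs (microwave power P, layer thickness h, ambient pressure p) into the output curves Wcp(t) and dWcp(t)/dt is given below. The paper's model is spatially discretized and algorithmic, whereas this sketch assumes a simple thin-layer drying law and an invented dependence of the drying constant on P, h and p.

```python
# Simplified lumped sketch (not the paper's spatially discretized model):
# thin-layer drying kinetics dW/dt = -k (W - We), with an assumed drying
# constant k that increases with microwave power P and decreases with layer
# thickness h and ambient pressure p. All coefficients are illustrative.

def simulate_drying(P_watt, h_mm, p_kpa, W0=6.0, We=0.15, dt=10.0, t_end=7200.0):
    k = 2.0e-4 * P_watt / (h_mm * max(p_kpa, 1.0))   # assumed parameter dependence, 1/s
    t, W = 0.0, W0
    history = []                                      # (time, Wcp, dWcp/dt)
    while t <= t_end:
        rate = -k * (W - We)
        history.append((t, W, rate))
        W += rate * dt                                # explicit Euler step
        t += dt
    return history

curve = simulate_drying(P_watt=600, h_mm=20, p_kpa=10)
print(curve[0], curve[-1])   # initial and final (time, moisture content, drying rate)
```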

  20. Process mining using BPMN: relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; van der Aalst, W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2017-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  1. Process mining using BPMN : relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; Aalst, van der W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2015-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  2. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment that identifies strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout the software process improvement effort. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  3. Pre-Service Teachers' Material Development Process Based on the ADDIE Model: E-Book Design

    Science.gov (United States)

    Usta, Necla Dönmez; Güntepe, Ebru Turan

    2017-01-01

    With the developments in information and communication technologies, books which are fundamental information sources for students throughout their education and training process are being transformed into electronic book (e-book) formats. E-books provide interactive environments, and they are also updateable materials, which shows that, in time,…

  4. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    Science.gov (United States)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
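    A toy sketch of the combination described above is given below: a discrete-event loop completes individual development tasks while a continuously integrated, system-dynamics-style productivity state responds to schedule pressure. All coefficients and distributions are invented for illustration and are not taken from the paper's models.

```python
# Illustrative hybrid sketch (not the paper's model): discrete-event completion
# of development tasks combined with a continuously integrated "productivity"
# state that erodes under schedule pressure, system-dynamics style.

import heapq, random

random.seed(1)

def simulate(n_tasks=50, nominal_task_days=2.0, deadline=90.0, dt=0.25):
    productivity = 1.0                 # continuous state: fraction of nominal output
    remaining = n_tasks
    t = 0.0
    events = []                        # discrete events: (completion_time,)
    heapq.heappush(events, (t + random.expovariate(1.0 / nominal_task_days),))

    while remaining > 0 and t < 5 * deadline:
        # continuous part: schedule pressure slowly erodes productivity
        pressure = max(0.0, remaining * nominal_task_days / max(deadline - t, 1e-6) - 1.0)
        productivity += (-0.05 * pressure * productivity + 0.02 * (1.0 - productivity)) * dt
        productivity = max(0.3, min(1.0, productivity))
        t += dt
        # discrete part: pop any task completions that fall inside this time step
        while events and events[0][0] <= t:
            heapq.heappop(events)
            remaining -= 1
            if remaining > 0:
                duration = random.expovariate(productivity / nominal_task_days)
                heapq.heappush(events, (t + duration,))
    return t

print("simulated completion time (days):", round(simulate(), 1))
```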

  5. User's manual for a process model code

    International Nuclear Information System (INIS)

    Kern, E.A.; Martinez, D.P.

    1981-03-01

    The MODEL code has been developed for computer modeling of materials processing facilities associated with the nuclear fuel cycle. However, it can also be used in other modeling applications. This report provides sufficient information for a potential user to apply the code to specific process modeling problems. Several examples that demonstrate most of the capabilities of the code are provided

  6. MODELING OF PATTERN FORMING PROCESS OF AUTOMATIC RADIO DIRECTION FINDER OF PHASE VHF IN THE DEVELOPMENT ENVIRONMENT OF LabVIEW APPLIED PROGRAMS

    Directory of Open Access Journals (Sweden)

    G. K. Aslanov

    2015-01-01

    Full Text Available The article develops a model demonstrating the pattern-forming process of the antenna system of an aerodrome quasi-Doppler automatic radio direction-finder station in the LabVIEW application development environment of the National Instruments company.

  7. Application of high-throughput mini-bioreactor system for systematic scale-down modeling, process characterization, and control strategy development.

    Science.gov (United States)

    Janakiraman, Vijay; Kwiatkowski, Chris; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming

    2015-01-01

    High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need to test different process conditions for multiple process parameters, process characterization studies typically span several months and are considered time- and resource-intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system, viz. the Advanced Microscale Bioreactor (ambr15(TM)), to perform process characterization in less than a month and develop an input control strategy. As a pre-requisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15(TM) system is capable of replacing the bench scale bioreactor system for routine process development and process characterization. © 2015 American Institute of Chemical Engineers.
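    The volumetric sparge-rate matching mentioned above amounts to holding gas flow per unit working volume (vvm) constant across scales; a minimal sketch is shown below, with an illustrative manufacturing-scale flow rather than study data.

```python
# Sketch of volumetric sparge-rate matching: keep gas flow per unit working
# volume (vvm) constant across scales. Working volumes and the
# manufacturing-scale flow below are illustrative numbers, not study data.

def scaled_sparge_flow(reference_flow_L_min, reference_volume_L, target_volume_L):
    vvm = reference_flow_L_min / reference_volume_L   # gas volumes per broth volume per minute
    return vvm * target_volume_L                      # flow (L/min) at the target scale

large_scale_flow = 750.0      # L/min at 15,000 L (illustrative)
large_scale_volume = 15000.0
for target in (5.0, 0.015):   # 5 L bench bioreactor, 15 mL ambr vessel
    flow = scaled_sparge_flow(large_scale_flow, large_scale_volume, target)
    print(f"{target} L working volume -> {flow * 1000:.3f} mL/min sparge")
```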

  8. Development of a neural network model to predict distortion during the metal forming process by line heating

    OpenAIRE

    Pinzón, César; Plazaola, Carlos; Banfield, Ilka; Fong, Amaly; Vega, Adán

    2013-01-01

    In order to achieve automation of the plate forming process by line heating, it is necessary to know in advance the deformation to be obtained under specific heating conditions. Currently, different methods exist to predict deformation, but these are limited to specific applications and most of them depend on the computational capacity so that only simple structures can be analyzed. In this paper, a neural network model that can accurately predict distortions produced during the plate forming...
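    A minimal sketch of the underlying idea, fitting a small feed-forward network that maps heating conditions to a distortion value, is shown below. It uses scikit-learn and synthetic training data generated from an arbitrary smooth function; it does not reproduce the paper's network architecture or experimental dataset.

```python
# Sketch of the idea (not the paper's network or data): fit a small
# feed-forward network that maps line-heating conditions (heat input, speed,
# plate thickness) to an angular distortion value. The training data are
# synthetic, generated from an arbitrary smooth function.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.uniform([1.0, 2.0, 6.0], [5.0, 10.0, 20.0], size=(500, 3))  # kJ/mm, mm/s, mm
heat, speed, thickness = X.T
y = 0.8 * heat / (speed * thickness**0.5) + rng.normal(0, 0.005, 500)  # synthetic distortion

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0))
model.fit(X, y)
print(model.predict([[3.0, 5.0, 12.0]]))   # predicted distortion for one heating condition
```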

  9. Scientific-technical level of developments of technological process control and management systems. Modeling of processes of defective fuel pellets formation

    International Nuclear Information System (INIS)

    Troshchenko, V.G.

    2005-01-01

    The automation section of the SverdNIIkhimmash Institute was founded to develop management systems for the equipment created at the Institute and to provide these systems with facilities for control and automation. To solve these problems, the section takes part in investigations of technological units as objects of automation and in the development of their mathematical models.

  10. The Processes of Location Study for Developing Economic Zones under Public Private Partnership Model: Country Study on Bangladesh

    Directory of Open Access Journals (Sweden)

    Mahmudul Alam

    2011-02-01

    Full Text Available In spite of the complexity in defining the boundary, the concept of Economic Zones (EZ) has evolved as a way forward for the governments of developing countries to enhance national trade. Similarly, the recent phenomenon of widespread Public Private Partnership (PPP) practices, especially in the infrastructure sector, is providing a window to develop many such economic zones through the PPP model, as EZ development is typically capital intensive. Bangladesh has had discrete successes under both the PPP and EZ regimes. However, developing an EZ under the PPP model involves a few commercial complexities, as both the public and the private sector need to bear certain roles and obligations, one of which is the selection of an appropriate location for EZ development. The location study for PPP EZ development therefore receives paramount attention from both the developer's and the lenders' perspectives. Such a location study is generally not a typical project site study; rather, it is more economically focused. This paper tries to identify the factors that are essential to consider when conducting these location studies, based on the example of Bangladesh. The paper also identifies the appropriate methods and approaches required for successful EZ development through PPP.

  11. Advanced oxidation processes: overall models

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, M. [Univ. de los Andes, Escuela Basica de Ingenieria, La Hechicera, Merida (Venezuela); Curco, D.; Addardak, A.; Gimenez, J.; Esplugas, S. [Dept. de Ingenieria Quimica. Univ. de Barcelona, Barcelona (Spain)

    2003-07-01

    Modelling AOPs implies considering all the steps included in the process, that is, the mass transfer, kinetic (reaction) and light absorption steps. Recent works therefore develop models which relate the global reaction rate to catalyst concentration and radiation absorption. However, the application of such models requires knowing which step controls the overall process. In this paper, a simple method is explained which allows the controlling step to be determined. It is assumed that the reactor is divided into two hypothetical zones (dark and illuminated), and according to the experimental results, obtained by varying only the reaction volume, it can be decided whether reaction occurs only in the illuminated zone or in the whole reactor, including the dark zone. The photocatalytic degradation of phenol, using titania Degussa P-25 as catalyst, is studied as the model reaction. The preliminary results obtained are presented here, showing that, in this case, reaction seems to occur only in the illuminated zone of the photoreactor. A model is developed to explain this behaviour. (orig.)

  12. Generating process model collections

    NARCIS (Netherlands)

    Yan, Z.; Dijkman, R.M.; Grefen, P.W.P.J.

    2017-01-01

    Business process management plays an important role in the management of organizations. More and more organizations describe their operations as business processes. It is common for organizations to have collections of thousands of business processes, but for reasons of confidentiality these

  13. Development and Application of an Integrated Model for Representing Hydrologic Processes and Irrigation at Residential Scale in Semiarid and Mediterranean Regions

    Science.gov (United States)

    Herrera, J. B.; Gironas, J. A.; Bonilla, C. A.; Vera, S.; Reyes, F. R.

    2015-12-01

    Urbanization alters physical and biological processes that take place in natural environments. New impervious areas change the hydrological processes, reducing infiltration and evapotranspiration and increasing direct runoff volumes and flow discharges. To reduce these effects at the local scale, sustainable urban drainage systems, low impact development and best management practices have been developed and implemented. These technologies, which typically consider some type of green infrastructure (GI), simulate natural processes of capture, retention and infiltration to control flow discharges from frequent events and preserve the hydrological cycle. Applying these techniques in semiarid regions requires accounting for aspects related to the maintenance of green areas, such as irrigation needs and the selection of vegetation. This study develops the Integrated Hydrological Model at Residential Scale, IHMORS, which is a continuous model that simulates the most relevant hydrological processes together with irrigation processes of green areas. In the model, contributing areas and drainage control practices are modeled by combining and connecting different subareas subjected to surface processes (i.e. interception, evapotranspiration, infiltration and surface runoff) and subsurface processes (percolation, redistribution and subsurface runoff). The model simulates these processes and accounts for the dynamics of the water content in different soil layers. The different components of the model were first tested using laboratory and numerical experiments, and then an application to a case study was carried out. In this application we assess the long-term performance in terms of runoff control and irrigation needs of green gardens with different vegetation, under different climate and irrigation practices. The model identifies significant differences in the performance of the alternatives and provides good insight into the maintenance needs of GI for runoff control.
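    A single-bucket sketch of the kind of coupled water-balance and irrigation logic IHMORS resolves in far more detail is shown below; the capacity, crop coefficient and irrigation rule are illustrative assumptions, not IHMORS parameters.

```python
# Single-bucket sketch of the processes such a model couples (this is not the
# IHMORS code): daily water balance of one soil layer under a garden, with
# irrigation triggered when soil moisture falls below a threshold.

def run_bucket(rain_mm, et0_mm, capacity_mm=80.0, kc=0.8,
               trigger=0.5, irrigation_dose_mm=15.0):
    storage = capacity_mm
    runoff_total = irrigation_total = 0.0
    for p, et0 in zip(rain_mm, et0_mm):
        irrigation = irrigation_dose_mm if storage < trigger * capacity_mm else 0.0
        irrigation_total += irrigation
        inflow = p + irrigation
        # infiltration-excess: whatever does not fit in the bucket runs off
        infiltration = min(inflow, capacity_mm - storage)
        runoff_total += inflow - infiltration
        storage += infiltration
        # actual evapotranspiration limited by available water
        storage -= min(kc * et0, storage)
    return runoff_total, irrigation_total, storage

# One dry month with two storms (illustrative forcing, mm/day)
rain = [0.0] * 30
rain[10], rain[20] = 35.0, 60.0
et0 = [5.0] * 30
print(run_bucket(rain, et0))   # (total runoff, total irrigation, final storage)
```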

  14. Plasma Processing of Model Residential Solid Waste

    Science.gov (United States)

    Messerle, V. E.; Mossé, A. L.; Nikonchuk, A. N.; Ustimenko, A. B.; Baimuldin, R. V.

    2017-09-01

    The authors have tested the technology of processing of model residential solid waste. They have developed and created a pilot plasma unit based on a plasma chamber incinerator. The waste processing technology has been tested and prepared for commercialization.

  15. What makes process models understandable?

    NARCIS (Netherlands)

    Mendling, J.; Reijers, H.A.; Cardoso, J.; Alonso, G.; Dadam, P.; Rosemann, M.

    2007-01-01

    Although formal and informal quality aspects are of significant importance to business process modeling, there is only little empirical work reported on process model quality and its impact factors. In this paper we investigate understandability as a proxy for the quality of process models and focus

  16. Review on Biomass Torrefaction Process and Product Properties and Design of Moving Bed Torrefaction System Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Jaya Shankar Tumuluru; Christopher T. Wright; Shahab Sokhansanj

    2011-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300 °C. It can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in the temperature ranges of 200-230 °C and 270-280 °C. Thus, the process can also be called a mild pyrolysis, as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, which produces a final product that has a lower mass but a higher heating value. There is a lack of literature on the design aspects of torrefaction reactors and of a design sheet for estimating the dimensions of the torrefier based on capacity. This study includes (a) conducting a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed torrefier for capacities ranging from 25-1000 kg/hr, designing the heat loads and gas flow rates, and
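    A back-of-envelope version of the sizing logic described above (bed volume from throughput, residence time and bulk density, then dimensions from an assumed height-to-diameter ratio) is sketched below; all parameter values are illustrative and are not taken from the report's design sheet.

```python
# Back-of-envelope sizing sketch for a moving packed-bed torrefier (not the
# report's design sheet): the bed volume follows from throughput, solids
# residence time and bulk density; an assumed height-to-diameter ratio then
# fixes the vessel dimensions. All parameter values are illustrative.

import math

def size_torrefier(capacity_kg_hr, residence_time_min=30.0,
                   bulk_density_kg_m3=250.0, aspect_ratio=3.0, void_margin=1.2):
    holdup_kg = capacity_kg_hr * residence_time_min / 60.0      # solids inside the bed
    bed_volume_m3 = void_margin * holdup_kg / bulk_density_kg_m3
    # V = (pi/4) * D^2 * H with H = aspect_ratio * D
    diameter_m = (4.0 * bed_volume_m3 / (math.pi * aspect_ratio)) ** (1.0 / 3.0)
    height_m = aspect_ratio * diameter_m
    return diameter_m, height_m

for capacity in (25, 100, 500, 1000):                            # kg/hr, as in the review
    d, h = size_torrefier(capacity)
    print(f"{capacity:5d} kg/hr -> D = {d:.2f} m, H = {h:.2f} m")
```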

  17. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  18. Adapting the unified software development process for user interface development

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2006-01-01

    In this paper we describe how existing software developing processes, such as Rational Unified Process, can be adapted in order to allow disciplined and more efficient development of user interfaces. The main objective of this paper is to demonstrate that standard modeling environments, based on the

  19. Capability Maturity Model Integration (CMMISM), Version 1.1 CMMISM for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing (CMMI-SE/SW/IPPD/SS, V1.1). Staged Representation

    National Research Council Canada - National Science Library

    2002-01-01

    .... Concepts covered by this model include systems engineering, software engineering, integrated product and process development, and supplier sourcing as well as traditional CMM concepts such as process...

  20. Process developments in gasoil hydrotreating

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, R.C.; Kinley, D.H.; Wood, M.A. [Davy Process Technology Limited, London (United Kingdom)

    1997-07-01

    Changing demand patterns and legislation increase the pressure upon hydrotreating capacities at many refineries. To meet these pressures, improvements have been and will be necessary not only in catalysts, but also in the hydrotreating process. On the basis of its hydrogenation experience, Davy Process Technology has developed and tested a number of concepts aimed at improving the effectiveness of the basic process - enabling economic deep desulfurisation and opening up the potential for an integrated HDS/HDA flowsheet using sulphur tolerant HDA Catalysts.

  1. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatinlike protein, and α -lactalbumin. The model is very robust and describes amorphous experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  2. Measuring similarity between business process models

    NARCIS (Netherlands)

    Dongen, van B.F.; Dijkman, R.M.; Mendling, J.

    2007-01-01

    Quality aspects become increasingly important when business process modeling is used in a large-scale enterprise setting. In order to facilitate a storage without redundancy and an efficient retrieval of relevant process models in model databases it is required to develop a theoretical understanding

  3. RSMASS system model development

    International Nuclear Information System (INIS)

    Marshall, A.C.; Gallup, D.R.

    1998-01-01

    RSMASS system mass models have been used for more than a decade to make rapid estimates of space reactor power system masses. This paper reviews the evolution of the RSMASS models and summarizes present capabilities. RSMASS has evolved from a simple model used to make rough estimates of space reactor and shield masses to a versatile space reactor power system model. RSMASS uses unique reactor and shield models that permit rapid mass optimization calculations for a variety of space reactor power and propulsion systems. The RSMASS-D upgrade of the original model includes algorithms for the balance of the power system, a number of reactor and shield modeling improvements, and an automatic mass optimization scheme. The RSMASS-D suite of codes cover a very broad range of reactor and power conversion system options as well as propulsion and bimodal reactor systems. Reactor choices include in-core and ex-core thermionic reactors, liquid metal cooled reactors, particle bed reactors, and prismatic configuration reactors. Power conversion options include thermoelectric, thermionic, Stirling, Brayton, and Rankine approaches. Program output includes all major component masses and dimensions, efficiencies, and a description of the design parameters for a mass optimized system. In the past, RSMASS has been used as an aid to identify and select promising concepts for space power applications. The RSMASS modeling approach has been demonstrated to be a valuable tool for guiding optimization of the power system design; consequently, the model is useful during system design and development as well as during the selection process. An improved in-core thermionic reactor system model RSMASS-T is now under development. The current development of the RSMASS-T code represents the next evolutionary stage of the RSMASS models. RSMASS-T includes many modeling improvements and is planned to be more user-friendly. RSMASS-T will be released as a fully documented, certified code at the end of

  4. The MINERVA Software Development Process

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
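    The geo-containment check at the heart of that application reduces to a point-in-polygon test; a plain ray-casting sketch is shown below as an unverified illustration, not the formally verified MINERVA algorithms themselves.

```python
# Minimal ray-casting sketch of the polygon containment check that the
# geo-containment application revolves around. This is an unverified
# illustration, not the formally verified MINERVA algorithms.

from typing import List, Tuple

Point = Tuple[float, float]

def inside_polygon(p: Point, polygon: List[Point]) -> bool:
    """Return True if point p lies inside the simple polygon (ray casting)."""
    x, y = p
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        crosses = (y1 > y) != (y2 > y)               # edge straddles the horizontal ray
        if crosses and x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
            inside = not inside
    return inside

# Hypothetical operating region (local x/y coordinates in kilometres)
region = [(0.0, 0.0), (10.0, 0.0), (10.0, 6.0), (4.0, 9.0), (0.0, 6.0)]
print(inside_polygon((5.0, 5.0), region))   # True  -> aircraft inside the region
print(inside_polygon((11.0, 2.0), region))  # False -> trigger a recovery manoeuvre
```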

  5. Precipitation processes developed during TOGA COARE (1992), GATE (1974), SCSMEX (1998), and KWAJEX (1999): 3D Cloud Resolving Model Simulation

    Science.gov (United States)

    Tao, W.-K.

    2006-01-01

    Real clouds and cloud systems are inherently three-dimensional (3D). Because of the limitations in computer resources, however, most cloud-resolving models (CRMs) today are still two-dimensional (2D). A few 3D CRMs have been used to study the response of clouds to large-scale forcing. In these 3D simulations, the model domain was small, and the integration time was 6 hours. Only recently have 3D experiments been performed for multi-day periods for tropical cloud systems with large horizontal domains at the National Center for Atmospheric Research (NCAR), NOAA GFDL, the U.K. Met. Office, Colorado State University and NASA Goddard Space Flight Center. An improved 3D Goddard Cumulus Ensemble (GCE) model was recently used to simulate periods during TOGA COARE (December 19-27, 1992), GATE (September 1-7, 1974), SCSMEX (May 18-26, June 2-11, 1998) and KWAJEX (August 7-13, August 18-21, and August 29-September 12, 1999) using a 512 by 512 km domain and 41 vertical layers. The major objectives of this paper are: (1) to identify the differences and similarities in the simulated precipitation processes and their associated surface and water energy budgets in TOGA COARE, GATE, KWAJEX, and SCSMEX, and (2) to assess the impact of microphysics, radiation budget and surface fluxes on the organization of convection in the tropics.

  6. Strategies for developing competency models.

    Science.gov (United States)

    Marrelli, Anne F; Tondora, Janis; Hoge, Michael A

    2005-01-01

    There is an emerging trend within healthcare to introduce competency-based approaches in the training, assessment, and development of the workforce. The trend is evident in various disciplines and specialty areas within the field of behavioral health. This article is designed to inform those efforts by presenting a step-by-step process for developing a competency model. An introductory overview of competencies, competency models, and the legal implications of competency development is followed by a description of the seven steps involved in creating a competency model for a specific function, role, or position. This modeling process is drawn from advanced work on competencies in business and industry.

  7. Parsing multiple processes of high temperature impacts on corn/soybean yield using a newly developed CLM-APSIM modeling framework

    Science.gov (United States)

    Peng, B.; Guan, K.; Chen, M.

    2016-12-01

    Future agricultural production faces the grand challenge of higher temperatures under climate change. There are multiple physiological and metabolic processes through which high temperature affects crop yield. Specifically, we consider the following major processes: (1) direct temperature effects on photosynthesis and respiration; (2) the speed-up of growth rate and the shortening of the growing season; (3) heat stress during the reproductive stage (flowering and grain-filling); (4) the high-temperature-induced increase of atmospheric water demand. In this work, we use a newly developed modeling framework (CLM-APSIM) to simulate corn and soybean growth and explicitly parse the above four processes. By combining the strengths of CLM in modeling surface biophysics (e.g., hydrology and energy balance) and biogeochemistry (e.g., photosynthesis and carbon-nitrogen interactions) with those of APSIM in modeling crop phenology and reproductive stress, the newly developed CLM-APSIM modeling framework enables us to diagnose the impacts of high temperature stress through different processes at various crop phenology stages. Ground measurements from the advanced SoyFACE facility at the University of Illinois are used here to calibrate, validate, and improve the CLM-APSIM modeling framework at the site level. We finally use the CLM-APSIM modeling framework to project crop yield for the whole US Corn Belt under different climate scenarios.

  8. Neuroscientific Model of Motivational Process

    OpenAIRE

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Rewa...

  9. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
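    A toy sketch of the kind of feedback loop such system-dynamics models capture (schedule pressure raises both output and error injection, and undiscovered errors return as rework) is given below; the structure and coefficients are invented for illustration and are far simpler than SEPS.

```python
# Toy system-dynamics sketch of a feedback loop of the kind SEPS captures
# (this is not the JPL model): schedule pressure boosts apparent progress but
# also the error-generation rate, and undiscovered errors come back as rework.

def run_project(size_tasks=400.0, staff=8.0, nominal_rate=0.5, deadline=120.0, dt=1.0):
    done, undiscovered_errors, t = 0.0, 0.0, 0.0
    while done < size_tasks and t < 3 * deadline:
        remaining = size_tasks - done
        pressure = max(0.0, remaining / (nominal_rate * staff * max(deadline - t, 1.0)) - 1.0)
        rate = nominal_rate * staff * (1.0 + 0.3 * min(pressure, 1.0))   # push harder...
        error_rate = 0.1 * rate * (1.0 + 1.5 * min(pressure, 1.0))       # ...inject more errors
        rework = 0.2 * undiscovered_errors                               # errors found per day
        done += (rate - rework) * dt                                     # rework slows net progress
        undiscovered_errors += (error_rate - rework) * dt
        t += dt
    return t

print("simulated completion time (days):", round(run_project(), 1))
```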

  10. A model for landscape development in terms of shoreline displacement, sediment dynamics, lake formation, and lake choke-up processes

    International Nuclear Information System (INIS)

    Brydsten, Lars

    2006-12-01

    repeated until the calculated number of pixels is marked. However, vegetation is only permitted to colonize bottoms shallower than 2 metres. The lake module steps forward until the former lake basin is totally covered with vegetation. Outputs from the module are written to a text file with the following values: time, mean water depth, water area, added sediment volume since lake isolation, and area and volume of organic material. The model is applied to a large number of objects at both the Forsmark and Oskarshamn sites. Most of the objects are existing or future lakes, but some terrestrial objects are also processed. For future lakes in Forsmark, the results from the simulations show that the length of the lacustrine phase is 3,000-4,000 years for the small lakes and > 9,000 years for the large and deep lakes situated in the so-called Graesoeraennan. Two of the future lakes in the Simpevarp area will also be long-lived (> 1,000 years); both will be formed in the existing Granholmsfjaerden

  11. A model for landscape development in terms of shoreline displacement, sediment dynamics, lake formation, and lake choke-up processes

    Energy Technology Data Exchange (ETDEWEB)

    Brydsten, Lars [Umeaa University, Dept. of Ecology and Environmental Science (Sweden)

    2006-12-15

    process is repeated until the calculated number of pixels is marked. However, vegetation is only permitted to colonize bottoms shallower than 2 metres. The lake module steps forward until the former lake basin is totally covered with vegetation. Outputs from the module are written to a text file with the following values: time, mean water depth, water area, added sediment volume since lake isolation, and area and volume of organic material. The model is applied to a large number of objects at both the Forsmark and Oskarshamn sites. Most of the objects are existing or future lakes, but some terrestrial objects are also processed. For future lakes in Forsmark, the results from the simulations show that the length of the lacustrine phase is 3,000-4,000 years for the small lakes and > 9,000 years for the large and deep lakes situated in the so-called Graesoeraennan. Two of the future lakes in the Simpevarp area will also be long-lived (> 1,000 years); both will be formed in the existing Granholmsfjaerden.

  12. Unified Approach in the DSS Development Process

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The structure of today's decision support environment has become very complex due to the new generation of Business Intelligence applications and technologies such as Data Warehouses, OLAP (On-Line Analytical Processing) and Data Mining. In this respect the DSS development process is not simple and needs an adequate methodology or framework able to manage different tools and platforms to achieve the manager's requirements. The DSS development process must be viewed as a unified and iterative set of activities and operations. The new techniques based on the Unified Process (UP) methodology and UML (Unified Modeling Language) seem to be appropriate for DSS development using prototyping and RAD (Rapid Application Development) techniques. In this paper we present a conceptual framework for developing and integrating Decision Support Systems using the Unified Process methodology and UML.

  13. Evaluation of Structural Changes in the Coal Specimen Heating Process and UCG Model Experiments for Developing Efficient UCG Systems

    Directory of Open Access Journals (Sweden)

    Gota Deguchi

    2013-05-01

    Full Text Available In the underground coal gasification (UCG) process, cavity growth with crack extension inside the coal seam is an important phenomenon that directly influences gasification efficiency. An efficient and environmentally friendly UCG system also relies upon the precise control and evaluation of the gasification zone. This paper presents details of laboratory studies undertaken to evaluate structural changes that occur inside the coal under thermal stress and to evaluate underground coal-oxygen gasification simulated in an ex-situ reactor. The effects of feed temperature, the direction of the stratified plane, and the inherent microcracks on the coal fracture and crack extension were investigated in heating experiments performed on plate-shaped and cylindrical coal specimens. To monitor the failure process and to measure the microcrack distribution inside the coal specimen before and after heating, acoustic emission (AE) analysis and X-ray CT were applied. We also introduce a laboratory-scale UCG model experiment conducted with set design and operating parameters. The temperature profiles, AE activities, product gas concentrations as well as the gasifier weight losses were measured successively during gasification. The product gas mainly comprised combustible components such as CO, CH4, and H2 (27.5, 5.5, and 17.2 vol%, respectively), which produced a high average calorific value (9.1 MJ/m3).

  14. Assessment and Development of Engineering Design Processes

    DEFF Research Database (Denmark)

    Ulrikkeholm, Jeppe Bjerrum

    , the engineering companies need to have efficient engineering design processes in place, so they can design customised product variants faster and more efficiently. It is however not an easy task to model and develop such processes. To conduct engineering design is often a highly iterative, ill-defined and complex...... the process can be fully understood and eventually improved. Taking its starting point in this proposition, the outcome of the research is an operational 5-phased procedure for assessing and developing engineering design processes through integrated modelling of product and process, designated IPPM......, and eventually the results are discussed, overall conclusions are made and future research is proposed. The results produced throughout the research project are developed in close collaboration with the Marine Low Speed business unit within the company MAN Diesel & Turbo. The business unit is the world market...

  15. Developing a Model Component

    Science.gov (United States)

    Fields, Christina M.

    2013-01-01

    The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) was a Space Shuttle Orbiter support piece of the Ground Servicing Equipment (GSE). The initial purpose of the UCTS was to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The UCTS is designed with the capability of servicing future space vehicles, including all Space Station requirements necessary for the MPLM Modules. The Simulation uses GSE Models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at Kennedy Space Center (KSC), my assignment was to develop a model component for the UCTS. I was given a fluid component (dryer) to model in Simulink. I completed training for UNIX and Simulink. The dryer is a Catch All replaceable core type filter-dryer. The filter-dryer provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system. The filter-dryer also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. The filter-dryer was modeled by determining the effects it has on the pressure and velocity of the system. I used Bernoulli's Equation to calculate the pressure and velocity differential through the dryer. I created my filter-dryer model in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements. I participated in Simulation meetings and was involved in the subsystem design process and team collaborations. I gained valuable work experience and insight into a career path as an engineer.
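
    The abstract mentions using Bernoulli's equation to compute the pressure and velocity differential across the filter-dryer. A stand-alone sketch of that kind of calculation is shown below; the density, loss coefficient and geometry are invented example numbers, not UCTS data, and the real SCCS model was built in Simulink rather than Python.

      # Illustrative only (not the SCCS/UCTS Simulink model): pressure and velocity change
      # across a filter-dryer treated as an area restriction with an assumed loss factor.
      RHO = 1200.0      # coolant density [kg/m^3] -- assumed value
      K_LOSS = 4.5      # minor-loss coefficient of the dryer core -- assumed value

      def dryer_outlet_state(p_in, q, a_in, a_out):
          """Return (p_out, v_in, v_out) for volumetric flow q [m^3/s] through the dryer."""
          v_in = q / a_in
          v_out = q / a_out
          # Bernoulli between inlet and outlet plus a minor-loss term K * 0.5 * rho * v_out^2
          p_out = p_in + 0.5 * RHO * (v_in**2 - v_out**2) - K_LOSS * 0.5 * RHO * v_out**2
          return p_out, v_in, v_out

      p_out, v_in, v_out = dryer_outlet_state(p_in=6.0e5, q=2.0e-4, a_in=5.0e-4, a_out=2.0e-4)
      print(f"v_in={v_in:.2f} m/s, v_out={v_out:.2f} m/s, dP={(6.0e5 - p_out)/1e3:.2f} kPa")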

  16. Modelling of processes occurring in deep geological repository - development of new modules in the GoldSim environment

    International Nuclear Information System (INIS)

    Vopalka, D.; Lukin, D.; Vokal, A.

    2006-01-01

    Three new modules modelling the processes that occur in a deep geological repository have been prepared in the GoldSim computer code environment (using its Transport Module). These modules help to understand the role of selected parameters in the near-field region of the final repository and to prepare one's own complex model of the repository behaviour. The source term module includes radioactive decay and ingrowth in the canister, first-order degradation of the fuel matrix, solubility limitation of the concentration of the studied nuclides, and diffusive migration through the surrounding bentonite layer controlled by the output boundary condition formulated with respect to the rate of water flow in the rock. The corrosion module describes corrosion of canisters made of carbon steel and transport of corrosion products in the near-field region. This module computes balance equations between dissolving species and species transported by diffusion and/or advection from the surface of a solid material. The diffusion module, which also includes a non-linear form of the interaction isotherm, can be used for the evaluation of small-scale diffusion experiments. (author)
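
    To make the source-term bookkeeping described above more concrete, the sketch below tracks a single nuclide through first-order matrix degradation, radioactive decay and a solubility cap on the dissolved concentration. All rates, volumes and the simple time-stepping are illustrative assumptions and do not reproduce the GoldSim modules.

      # Rough sketch of source-term bookkeeping (first-order fuel degradation, decay,
      # solubility limit); all parameter values are illustrative, not GoldSim inputs.
      import math

      def source_term(inventory0, half_life, k_degradation, solubility, water_volume,
                      dt=50.0, t_end=10_000.0):
          """Return (time, dissolved concentration) samples for a single nuclide."""
          lam = math.log(2.0) / half_life
          bound = inventory0          # amount still held in the fuel matrix [mol]
          dissolved = 0.0             # amount in canister water [mol]
          out, t = [], 0.0
          while t <= t_end:
              released = bound * (1.0 - math.exp(-k_degradation * dt))
              bound = bound * math.exp(-k_degradation * dt) * math.exp(-lam * dt)
              dissolved = (dissolved + released) * math.exp(-lam * dt)
              # solubility limitation: the excess would precipitate (not tracked here)
              out.append((t, min(dissolved / water_volume, solubility)))
              t += dt
          return out

      samples = source_term(inventory0=1.0, half_life=2.1e5, k_degradation=1e-4,
                            solubility=1e-6, water_volume=500.0)
      print(samples[:3])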

  17. Modelling of processes occurring in deep geological repository - Development of new modules in the GoldSim environment

    Science.gov (United States)

    Vopálka, D.; Lukin, D.; Vokál, A.

    2006-01-01

    Three new modules modelling the processes that occur in a deep geological repository have been prepared in the GoldSim computer code environment (using its Transport Module). These modules help to understand the role of selected parameters in the near-field region of the final repository and to prepare one's own complex model of the repository behaviour. The source term module includes radioactive decay and ingrowth in the canister, first-order degradation of the fuel matrix, solubility limitation of the concentration of the studied nuclides, and diffusive migration through the surrounding bentonite layer controlled by the output boundary condition formulated with respect to the rate of water flow in the rock. The corrosion module describes corrosion of canisters made of carbon steel and transport of corrosion products in the near-field region. This module computes balance equations between dissolving species and species transported by diffusion and/or advection from the surface of a solid material. The diffusion module, which also includes a non-linear form of the interaction isotherm, can be used for the evaluation of small-scale diffusion experiments.

  18. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes within a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axons, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and the biological question that requires the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary differential equations and partial differential equations; a background in biology is not required. Students will gain knowledge of how to program with MATLAB without previous programming experience and how to use the codes to test biological hypotheses.
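
    As a flavour of the kind of model the book sets up (though in Python rather than the MATLAB used there), here is a minimal Michaelis-Menten enzyme-kinetics ODE; the rate constants are arbitrary example values and the example is not taken from the book.

      # Tiny illustration: Michaelis-Menten enzyme kinetics as an ordinary differential
      # equation; VMAX and KM are arbitrary example values.
      import numpy as np
      from scipy.integrate import solve_ivp

      VMAX, KM = 1.0, 0.5

      def substrate_rate(t, s):
          return [-VMAX * s[0] / (KM + s[0])]

      sol = solve_ivp(substrate_rate, t_span=(0.0, 10.0), y0=[2.0], dense_output=True)
      t = np.linspace(0.0, 10.0, 6)
      print(np.round(sol.sol(t)[0], 3))    # substrate concentration over time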

  19. Testing a model of science process skills acquisition: An interaction with parents' education, preferred language, gender, science attitude, cognitive development, academic ability, and biology knowledge

    Science.gov (United States)

    Germann, Paul J.

    Path analysis techniques were used to test a hypothesized structural model of direct and indirect causal effects of student variables on science process skills. The model was tested twice using data collected at the beginning and end of the school year from 67 9th- and 10th-grade biology students who lived in a rural Franco-American community in New England. Each student variable was found to have significant effects, accounting for approximately 80% of the variance in science process skills achievement. Academic ability, biology knowledge, and language preference had significant direct effects. There were significant mediated effects by cognitive development, parents' education, and attitude toward science in school. The variables of cognitive development and academic ability had the greatest total effects on science process skills. Implications for practitioners and researchers are discussed.
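
    The path-analysis logic of direct versus mediated effects can be illustrated with a few ordinary regressions on standardized variables, as sketched below. The data here are synthetic and the variable names only mirror the study; this is not a re-analysis of its results.

      # Sketch of the path-analysis idea (direct vs. mediated effects) on synthetic,
      # standardized data; variable names mirror the study, the numbers do not.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 200
      ability = rng.standard_normal(n)
      cognitive_dev = 0.5 * ability + 0.8 * rng.standard_normal(n)
      process_skills = 0.6 * ability + 0.3 * cognitive_dev + 0.5 * rng.standard_normal(n)

      def std_beta(y, X):
          """Standardized OLS regression coefficients of y on the columns of X."""
          Xs = (X - X.mean(axis=0)) / X.std(axis=0)
          ys = (y - y.mean()) / y.std()
          return np.linalg.lstsq(Xs, ys, rcond=None)[0]

      a = std_beta(cognitive_dev, ability.reshape(-1, 1))[0]    # ability -> cognitive development
      b_direct, b_mediator = std_beta(process_skills, np.column_stack([ability, cognitive_dev]))
      print(f"direct effect of ability: {b_direct:.2f}, mediated effect: {a * b_mediator:.2f}")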

  20. Online Rule Generation Software Process Model

    OpenAIRE

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    For production systems like expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required to be executed for the development of a web-based software model for decision rule generation. Royce's final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of various steps of the modified wat...

  1. Development Instrument’s Learning of Physics Through Scientific Inquiry Model Based Batak Culture to Improve Science Process Skill and Student’s Curiosity

    Science.gov (United States)

    Nasution, Derlina; Syahreni Harahap, Putri; Harahap, Marabangun

    2018-03-01

    Full Text Available This research aims to: (1) develop physics learning instruments (lesson plan, worksheet, student's book, teacher's guide book, and test instrument) for a scientific inquiry learning model based on Batak culture, in order to improve students' science process skills and curiosity; (2) describe the quality of the resulting learning instruments for senior high school physics using the scientific inquiry learning model based on Batak culture. This is development research. The physics learning instruments were developed using a model adapted from the development model of Thiagarajan, Semmel, and Semmel. The stages traversed until valid, practical, and effective physics learning instruments were obtained include: (1) the definition phase, (2) the planning phase, and (3) the development phase. The tests performed include expert validation, small-group trials, and limited classroom trials. The limited classroom trials were conducted in class X MIA of SMAN 1 Padang Bolak. This research resulted in: 1) physics learning instruments on static fluid material for senior high school grade 10 (lesson plan, worksheet, student's book, teacher's guide book, and test instrument) of a quality worthy of use in the learning process; 2) each component of the learning instruments meeting the criteria of being valid, practical, and effective for improving students' science process skills and curiosity.

  2. Development and implementation of computational geometric model for simulation of plate type fuel fabrication process with microspheres dispersed in metallic matrix

    International Nuclear Information System (INIS)

    Lage, Aldo M.F.; Reis, Sergio C.; Braga, Daniel M.; Santos, Armindo; Ferraz, Wilmar B.

    2005-01-01

    This report presents the development of a geometric model to simulate the plate-type fuel fabrication process with fuel microspheres dispersed in a metallic matrix, as well as its software implementation. The developed geometric model covers the steps of pellet pressing and sintering, as well as the plate rolling passes. The model permits the simulation of structures in which the values of the various fabrication process variables can be studied and modified. The following variables were analyzed: microsphere diameter, density of the powder/microsphere mixture, microsphere density, fuel volume fraction, sintering densification, and number of rolling passes. The model was implemented in the DELPHI programming language using structured analysis techniques. The simulated structures were visualized using AutoCAD, which made it possible to obtain plane sections in various directions. The objective of this model is to enable the analysis of the simulated structures and to supply information that can help improve the fabrication process for dispersion-microsphere fuel plates, currently under development at CDTN (Centro de Desenvolvimento da Tecnologia Nuclear) in cooperation with the CTMSP (Centro Tecnologico da Marinha em Sao Paulo). (author)
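
    A back-of-the-envelope helper in the spirit of the variables listed above (microsphere diameter, densities, fuel volume fraction) is sketched below; all input values are illustrative and porosity/densification effects are ignored, so it is not a substitute for the CDTN geometric model.

      # Illustrative helper only: fuel volume fraction and microsphere count for a
      # dispersion pellet; inputs are example values, porosity/densification ignored.
      import math

      def dispersion_stats(d_sphere_um, rho_fuel, rho_matrix, fuel_mass_frac, pellet_volume_cm3):
          """Return (fuel volume fraction, number of microspheres in the pellet)."""
          v_fuel_per_g = fuel_mass_frac / rho_fuel               # cm^3 of fuel per g of mixture
          v_matrix_per_g = (1.0 - fuel_mass_frac) / rho_matrix   # cm^3 of matrix per g of mixture
          vol_frac = v_fuel_per_g / (v_fuel_per_g + v_matrix_per_g)
          v_sphere = math.pi / 6.0 * (d_sphere_um * 1e-4) ** 3   # cm^3 per sphere
          return vol_frac, vol_frac * pellet_volume_cm3 / v_sphere

      vf, n = dispersion_stats(d_sphere_um=200.0, rho_fuel=10.4, rho_matrix=8.9,
                               fuel_mass_frac=0.45, pellet_volume_cm3=1.2)
      print(f"fuel volume fraction ~ {vf:.2f}, microspheres per pellet ~ {n:.0f}")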

  3. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubinina

    2015-06-01

    Full Text Available The essence of process-oriented enterprise management is examined in the article. The content and types of information technology are analyzed in view of the complexity and differentiation of existing methods, as well as the specificity of the language and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and modern traditional modeling techniques that have found practical application in visualizing retailers' activity are studied. The theoretical analysis of modeling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes owing to its integrated systemological capabilities. A visualized simulation model of the retailers' business process "sales" (as is) was designed using a combination of UFO elements, with the aim of further practical formalization and optimization of the given business process.

  4. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.
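
    For orientation only, a generic steady-state convection-diffusion balance of the kind referred to above, together with its cross-section-averaged counterpart, can be written as follows; the notation (including the scale-transfer coefficient alpha) is generic and may differ from the book's exact formulation.

      \[
        u(r)\,\frac{\partial c}{\partial z}
          = D\left(\frac{\partial^{2} c}{\partial z^{2}}
                   + \frac{1}{r}\,\frac{\partial}{\partial r}\!\left(r\,\frac{\partial c}{\partial r}\right)\right)
            - k\,c ,
        \qquad
        \alpha(z)\,\bar{u}\,\frac{d\bar{c}}{dz}
          = D\,\frac{d^{2}\bar{c}}{dz^{2}} - k\,\bar{c},
        \qquad
        \bar{c}(z) = \frac{2}{r_{0}^{2}}\int_{0}^{r_{0}} r\,c(r,z)\,dr .
      \]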

  5. A Goal Programming R&D (Research and Development) Project Funding Model of the U.S. Army Strategic Defense Command Using the Analytic Hierarchy Process.

    Science.gov (United States)

    1987-09-01

    A Goal Programming R&D (Research and Development) Project Funding Model of the U.S. Army Strategic Defense Command Using the Analytic Hierarchy Process, by Steven M. Anderson; Naval Postgraduate School, Monterey, CA; September 1987.

  6. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on a proper understanding of the functionality of information systems that shall support the activity of the organization. A number of business process modeling notations were popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, it is assessed whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of the research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  7. Patterns of Software Development Process

    Directory of Open Access Journals (Sweden)

    Sandro Javier Bolaños Castro

    2011-12-01

    Full Text Available "Times New Roman","serif";mso-fareast-font-family:"Times New Roman";mso-ansi-language:EN-US;mso-fareast-language:EN-US;mso-bidi-language:AR-SA">This article presents a set of patterns that can be found to perform best practices in software processes that are directly related to the problem of implementing the activities of the process, the roles involved, the knowledge generated and the inputs and outputs belonging to the process. In this work, a definition of the architecture is encouraged by using different recurrent configurations that strengthen the process and yield efficient results for the development of a software project. The patterns presented constitute a catalog, which serves as a vocabulary for communication among project participants [1], [2], and also can be implemented through software tools, thus facilitating patterns implementation [3]. Additionally, a tool that can be obtained under GPL (General Public license is provided for this purpose

  8. Diagnosing differences between business process models

    NARCIS (Netherlands)

    Dijkman, R.M.; Dumas, M.; Reichert, M.; Shan, M.-C.

    2008-01-01

    This paper presents a technique to diagnose differences between business process models in the EPC notation. The diagnosis returns the exact position of a difference in the business process models and diagnoses the type of a difference, using a typology of differences developed in previous work.

  9. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  10. Hillslope runoff processes and models

    Science.gov (United States)

    Kirkby, Mike

    1988-07-01

    Hillslope hydrology is concerned with the partition of precipitation as it passes through the vegetation and soil between overland flow and subsurface flow. Flow follows routes which attenuate and delay the flow to different extents, so that a knowledge of the relevant mechanisms is important. In the 1960s and 1970s, hillslope hydrology developed as a distinct topic through the application of new field observations to develop a generation of physically based forecasting models. In its short history, theory has continually been overturned by field observation. Thus the current tendency, particularly among temperate zone hydrologists, to dismiss all Hortonian overland flow as a myth, is now being corrected by a number of significant field studies which reveal the great range in both climatic and hillslope conditions. Some recent models have generally attempted to simplify the processes acting, for example including only vertical unsaturated flow and lateral saturated flows. Others explicitly forecast partial or contributing areas. With hindsight, the most complete and distributed models have generally shown little forecasting advantage over simpler approaches, perhaps trending towards reliable models which can run on desk top microcomputers. The variety now being recognised in hillslope hydrological responses should also lead to models which take account of more complex interactions, even if initially with a less secure physical and mathematical basis than the Richards equation. In particular, there is a need to respond to the variety of climatic responses, and to spatial variability on and beneath the surface, including the role of seepage macropores and pipes which call into question whether the hillside can be treated as a Darcian flow system.

  11. Developmental changes in reading do not alter the development of visual processing skills: An application of explanatory item response models in grades K-2

    Directory of Open Access Journals (Sweden)

    Kristi L Santi

    2015-02-01

    Full Text Available Visual processing has been widely studied in regard to its impact on a student's ability to read. A less researched area is the role of reading in the development of visual processing skills. A cohort-sequential, accelerated-longitudinal design was utilized with 932 kindergarten, first, and second grade students to examine the impact of reading acquisition on the processing of various types of visual discrimination and visual motor test items. Students were assessed four times per year on a variety of reading measures and reading precursors and two popular measures of visual processing over a three-year period. Explanatory item response models were used to examine the roles of person and item characteristics on changes in visual processing abilities and changes in item difficulties over time. Results showed different developmental patterns for five types of visual processing test items, but most importantly failed to show consistent effects of learning to read on changes in item difficulty. Thus, the present study failed to find support for the hypothesis that learning to read alters performance on measures of visual processing. Rather, visual processing and reading ability improved together over time with no evidence to suggest cross-domain influences from reading to visual processing. Results are discussed in the context of developmental theories of visual processing and brain-based research on the role of visual skills in learning to read.

  12. Conceptual models of information processing

    Science.gov (United States)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  13. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  14. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
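
    The "describing equations" mentioned above can be stated compactly; as an illustration (in Python rather than Simulink, and with one effective delayed-neutron group instead of several), a minimal point-kinetics integration looks like the sketch below. The reactivity step and constants are typical textbook values, not parameters from the paper.

      # Not the Simulink models themselves: point kinetics with one effective delayed
      # neutron group; the reactivity step and constants are illustrative textbook values.
      from scipy.integrate import solve_ivp

      BETA, GEN_TIME, LAM = 0.0065, 1.0e-4, 0.08   # delayed fraction, generation time [s], precursor decay [1/s]
      RHO = 0.0010                                  # assumed step reactivity insertion

      def point_kinetics(t, y):
          n, c = y
          dn = (RHO - BETA) / GEN_TIME * n + LAM * c
          dc = BETA / GEN_TIME * n - LAM * c
          return [dn, dc]

      y0 = [1.0, BETA / (GEN_TIME * LAM)]           # steady-state precursor concentration
      sol = solve_ivp(point_kinetics, (0.0, 10.0), y0, method="LSODA", rtol=1e-8)
      print(f"relative power after 10 s: {sol.y[0, -1]:.2f}")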

  15. Modeling nuclear processes by Simulink

    International Nuclear Information System (INIS)

    Rashid, Nahrul Khair Alang Md

    2015-01-01

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples

  16. Development process of subjects society

    Directory of Open Access Journals (Sweden)

    A. V. Reshetnichenko

    2014-08-01

    Full Text Available The background of the study lies in the defining role of people in the development of society and the almost complete absence of scientific management of the processes capable of ensuring the progressive development of individuals, social communities, nations, and civilization in general. In order to overcome the subjectivist, psychologizing, hyper-politicized and utilitarian approaches inherent in the methodology of knowledge, the authors propose a three-level system of the development processes of society's subjects. The conceptual core of the approach consists in identifying both the logical-mathematical laws governing subjects of primary, secondary and higher levels of development and the mechanisms of their formation and practice. Solving these tasks allowed the authors to reveal the structure of both the ascending and descending processes of economic society. Thus, the analysis of such individual carriers of ascending change as the «individual», «individuality», «person» and «personality» showed that their activity is conditioned by «anthropometric», «ethnic», «demographic» and «ideological» mechanisms. The nature of the common carriers of descending change was revealed using the correlatively related «group», «groups» and «communities», whose activity is conditioned by «vitalistic», «educational», «professional» and «stratification» mechanisms. To disclose the nature and organization of the secondary and higher levels of economic society, the authors introduce the categories of «citizen», «heneralista», «human space» and «human galactic», whose formation and development is driven by «status», «personological», «humanocentric», «institutional», «cluster», «contamination» and other mechanisms. As one of the main achievements of the work, the authors consider the possibility of further development and practical implementation of qualitatively new management of the development processes of economic society based on multimodal dialectical logic.

  17. Integrating ergonomics into the product development process

    DEFF Research Database (Denmark)

    Broberg, Ole

    1997-01-01

    and production engineers regarding information sources in problem solving, communication pattern, perception of ergonomics, motivation and requests to support tools and methods. These differences and the social and organizational contexts of the development process must be taken into account when considering......A cross-sectional case study was performed in a large company producing electro-mechanical products for industrial application. The purpose was to elucidate conditions and strategies for integrating ergonomics into the product development process thereby preventing ergonomic problems at the time...... of manufacture of new products. In reality the product development process is not a rational problem solving process and does not proceed in a sequential manner as described in engineering models. Instead it is a complex organizational process involving uncertainties, iterative elements and negotiation between...

  18. Modeling grinding processes as micro processes

    African Journals Online (AJOL)

    eobe

    industrial precision grinding processes are cylindrical, centerless and ... Several models have been proposed and used to study grinding ..... grinding force for the two cases were 9.07237 N/mm ..... International Journal of Machine Tools &.

  19. Collapse models and perceptual processes

    International Nuclear Information System (INIS)

    Ghirardi, Gian Carlo; Romano, Raffaele

    2014-01-01

    Theories including a collapse mechanism have been presented various years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merits derive from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems as well as for the reduction associated to measurement processes and for the classical behavior of macroscopic objects. Since such theories qualify themselves not as new interpretations but as modifications of the standard theory they can be, in principle, tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests it seems that recent technological developments allow at least to put precise limits on the parameters characterizing the modifications of the evolution equation. Here we will simply mention some of the recent investigations in this direction, while we will mainly concentrate our attention on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures by means of standard macroscopic devices will be discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. We make plausible, by discussing in detail a toy model, that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  20. Biochemical Process Development and Integration | Bioenergy | NREL

    Science.gov (United States)

    Our work ranges from conversion and separation processes to pilot-scale integrated process development and scale-up. Publications include "Accounting for all sugar produced during integrated production of ethanol from lignocellulosic...".

  1. Software Development Standard Processes (SDSP)

    Science.gov (United States)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  2. A parallel process model of the development of positive smoking expectancies and smoking behavior during early adolescence in Caucasian and African American girls

    OpenAIRE

    Chung, Tammy; White, Helene R.; Hipwell, Alison E.; Stepp, Stephanie D.; Loeber, Rolf

    2010-01-01

    This study examined the development of positive smoking expectancies and smoking behavior in an urban cohort of girls followed annually over ages 11-14. Longitudinal data from the oldest cohort of the Pittsburgh Girls Study (N=566, 56% African American, 44% Caucasian) were used to estimate a parallel process growth model of positive smoking expectancies and smoking behavior. Average level of positive smoking expectancies was relatively stable over ages 11-14, although there was significant va...

  3. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    -change of a homogeneous Levy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared......In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...... specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore we also analyze specifications obtained via a simple deterministic time...

  4. Processing of (in)tractable polymers using reactive solvents, 4: Structure development in the model system poly(ethylene)/styrene

    NARCIS (Netherlands)

    Goossens, J.G.P.; Rastogi, S.; Meijer, H.E.H.; Lemstra, P.J.

    1998-01-01

    The use of reactive solvents provides a unique opportunity to extend the processing characteristics of both intractable and standard (tractable) polymers beyond existing limits. The polymer to be processed is dissolved in the reactive solvent (monomer) and the solution is transferred into a mould.

  5. Modeling Business Processes in Public Administration

    Science.gov (United States)

    Repa, Vaclav

    During more than 10 years of its existence business process modeling became a regular part of organization management practice. It is mostly regarded as a part of information system development or even as a way to implement some supporting technology (for instance a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is methodology, which postulates that information systems development provide business process management with exact methods and tools for modeling business processes. Also the methodology underlying the approach presented in this paper has its roots in the information systems development methodology.

  6. The Process of Trust Development

    DEFF Research Database (Denmark)

    Jagd, Søren; Højland, Jeppe

    in management among employees. Trust is found to be higher among employees interacting regularly with managers, as in the project coordination group. It is found that personal relations are very important for the development of trust. The success of the project may be explained by the involvement of an ‘elite...... and discuss with colleagues from other departments and develop personal knowledge of each other....... by high trust and co-operation? In this paper we explore the process of trust development during an organisational change project in a Danish SME by looking at two kinds of trust relations: employee trust in management and trust relations among employees. We find substantial differences in trust...

  7. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia

    2017-01-01

    Full Text Available Taking into account the highly diverse tasks which employees have to fulfil due to the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products which meet the market expectations. Therefore, organizations encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of the aforementioned process, which allows enterprises to develop a specific methodology. In order to design the conceptual model, Business Process Modelling instruments were employed - the Adonis Community Edition Business Process Management Toolkit using the ADONIS BPMS Notation. The conceptual model was developed based on an in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended considering the enterprise's requirements when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling as they enable process analysis and evaluation (predefined/specific queries) and also model optimization (simulations).

  8. Modeling surface topography of state-of-the-art x-ray mirrors as a result of stochastic polishing process: recent developments

    Science.gov (United States)

    Yashchuk, Valeriy V.; Centers, Gary; Tyurin, Yuri N.; Tyurina, Anastasia

    2016-09-01

    Recently, an original method for the statistical modeling of the surface topography of state-of-the-art mirrors for use in x-ray optical systems at light source facilities and in astronomical telescopes [Opt. Eng. 51(4), 046501, 2012; ibid. 53(8), 084102 (2014); and ibid. 55(7), 074106 (2016)] has been developed. In this modeling, the mirror surface topography is considered to be a result of a stationary uniform stochastic polishing process, and the best-fit time-invariant linear filter (TILF) that optimally parameterizes the polishing process with a limited number of parameters is determined. The TILF model allows the surface slope profile of an optic with a newly desired specification to be reliably forecast before fabrication. With the forecast data, representative numerical evaluations of the expected performance of the prospective mirrors in optical systems under development become possible [Opt. Eng., 54(2), 025108 (2015)]. Here, we suggest and demonstrate an analytical approach for accounting for the imperfections of the metrology instruments used, which are described by the instrumental point spread function, in the TILF modeling. The efficacy of the approach is demonstrated with numerical simulations for the correction of measurements performed with an autocollimator-based surface slope profiler. Besides solving this major metrological problem, the results of the present work open an avenue for developing analytical and computational tools for stitching, in the statistical domain, data obtained using multiple metrology instruments measuring significantly different bandwidths of spatial wavelengths.
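
    The core idea, treating a measured profile as white noise passed through a linear filter and then synthesizing statistically similar profiles, can be illustrated very simply as below. This spectral-synthesis sketch is only a conceptual stand-in; it does not reproduce the TILF fitting procedure, the instrument PSF correction, or any data from the cited papers.

      # Conceptual sketch only: treat a measured slope profile as filtered white noise,
      # take the filter magnitude from its spectrum, and synthesize similar profiles.
      import numpy as np

      rng = np.random.default_rng(42)
      measured = np.cumsum(rng.standard_normal(1024)) * 1e-7        # stand-in slope profile [rad]

      magnitude = np.abs(np.fft.rfft(measured - measured.mean()))   # "filter" magnitude response

      def synthesize(n_profiles=3):
          """Profiles whose spectra match the measured magnitude response (random phases)."""
          out = []
          for _ in range(n_profiles):
              phases = rng.uniform(0.0, 2.0 * np.pi, size=magnitude.size)
              phases[0] = phases[-1] = 0.0                          # keep DC and Nyquist terms real
              out.append(np.fft.irfft(magnitude * np.exp(1j * phases), n=measured.size))
          return out

      print(f"rms slope, measured profile:    {measured.std():.2e} rad")
      for p in synthesize():
          print(f"rms slope, synthesized profile: {p.std():.2e} rad")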

  9. Markov Decision Process Measurement Model.

    Science.gov (United States)

    LaMar, Michelle M

    2018-03-01

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than a partial-credit IRT model based on outcome data alone.
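
    For readers unfamiliar with the formalism, the sketch below spells out the MDP ingredients (states, actions, transition probabilities, rewards) with generic value iteration on a two-state toy problem. The psychometric estimation layer of the cited measurement model is not reproduced; the states and actions here are invented.

      # Generic value iteration on a toy MDP, to make states/actions/transitions/rewards
      # concrete; the states and actions are invented, not those of the cited model.
      # transitions[s][a] = [(next_state, probability), ...]; rewards[s][a] = immediate reward
      transitions = {0: {"guess": [(0, 0.7), (1, 0.3)], "plan": [(1, 0.9), (0, 0.1)]},
                     1: {"guess": [(1, 1.0)],           "plan": [(1, 1.0)]}}
      rewards = {0: {"guess": 0.0, "plan": -0.1}, 1: {"guess": 1.0, "plan": 1.0}}
      GAMMA = 0.9

      def value_iteration(tol=1e-6):
          v = {s: 0.0 for s in transitions}
          while True:
              v_new = {s: max(rewards[s][a] + GAMMA * sum(p * v[s2] for s2, p in transitions[s][a])
                              for a in transitions[s])
                       for s in transitions}
              if max(abs(v_new[s] - v[s]) for s in v) < tol:
                  return v_new
              v = v_new

      print(value_iteration())   # expected discounted return per state under the optimal policy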

  10. Social Models: Blueprints or Processes?

    Science.gov (United States)

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  11. Simple Models for Process Control

    Czech Academy of Sciences Publication Activity Database

    Gorez, R.; Klán, Petr

    2011-01-01

    Roč. 22, č. 2 (2011), s. 58-62 ISSN 0929-2268 Institutional research plan: CEZ:AV0Z10300504 Keywords: process models * PID control * second order dynamics Subject RIV: JB - Sensors, Measurement, Regulation

  12. Ontological Model of Business Process Management Systems

    Science.gov (United States)

    Manoilov, G.; Deliiska, B.

    2008-10-01

    The activities which constitute business process management (BPM) can be grouped into five categories: design, modeling, execution, monitoring and optimization. Dedicated software packages for business process management systems (BPMS) are available on the market. But the efficiency of their exploitation depends on the ontological model used at development time and run time of the system. In the article an ontological model of BPMS in the area of the software industry is investigated. The model building is preceded by a conceptualization of the domain and a taxonomy of BPMS development. On the basis of the taxonomy a simple online thesaurus is created.

  13. Development of a new version of the Liverpool Malaria Model. I. Refining the parameter settings and mathematical formulation of basic processes based on a literature review

    Directory of Open Access Journals (Sweden)

    Jones Anne E

    2011-02-01

    Full Text Available Abstract Background A warm and humid climate triggers several water-associated diseases such as malaria. Climate- or weather-driven malaria models, therefore, allow for a better understanding of malaria transmission dynamics. The Liverpool Malaria Model (LMM) is a mathematical-biological model of malaria parasite dynamics using daily temperature and precipitation data. In this study, the parameter settings of the LMM are refined and a new mathematical formulation of key processes related to the growth and size of the vector population is developed. Methods One of the most comprehensive studies to date in terms of gathering entomological and parasitological information from the literature was undertaken for the development of a new version of an existing malaria model. This knowledge was needed to justify the new settings of various model parameters and motivated changes to the mathematical formulation of the LMM. Results The first part of the present study developed an improved set of parameter settings and an improved mathematical formulation of the LMM. Important modules of the original LMM version were enhanced in order to achieve a higher biological and physical accuracy. The oviposition as well as the survival of immature mosquitoes were adjusted to field conditions via the application of a fuzzy distribution model. Key model parameters, including the mature age of mosquitoes, the survival probability of adult mosquitoes, the human blood index, the mosquito-to-human (and human-to-mosquito) transmission efficiency, the human infectious age, the recovery rate, as well as the gametocyte prevalence, were reassessed by means of entomological and parasitological observations. This paper also revealed that various malaria variables lack the information from field studies needed to set them properly in a malaria modelling approach. Conclusions Due to the multitude of model parameters and the uncertainty involved in the setting of parameters, an extensive
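
    Several of the parameters reassessed above (daily survival probability, human blood index, the sporogonic/extrinsic incubation period) are the same ingredients that enter the classical Macdonald-style vectorial-capacity formula. The sketch below shows that standard formula with illustrative numbers; it is not the LMM itself, which is a daily time-stepping model.

      # Standard Macdonald-style vectorial capacity, shown only to illustrate how the
      # entomological parameters listed above enter such models; numbers are illustrative.
      import math

      def vectorial_capacity(m, biting_rate, human_blood_index, daily_survival, eip_days):
          a = biting_rate * human_blood_index        # human bites per mosquito per day
          p = daily_survival
          return m * a * a * p**eip_days / (-math.log(p))

      vc = vectorial_capacity(m=10.0, biting_rate=0.33, human_blood_index=0.8,
                              daily_survival=0.9, eip_days=11)
      print(f"vectorial capacity ~ {vc:.2f} potential secondary inoculations per case per day")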

  14. Modeling closed nuclear fuel cycles processes

    Energy Technology Data Exchange (ETDEWEB)

    Shmidt, O.V. [A.A. Bochvar All-Russian Scientific Research Institute for Inorganic Materials, Rogova, 5a street, Moscow, 123098 (Russian Federation); Makeeva, I.R. [Zababakhin All-Russian Scientific Research Institute of Technical Physics, Vasiliev street 13, Snezhinsk, Chelyabinsk region, 456770 (Russian Federation); Liventsov, S.N. [Tomsk Polytechnic University, Tomsk, Lenin Avenue, 30, 634050 (Russian Federation)

    2016-07-01

    Computer models of processes are necessary for the determination of optimal operating conditions for closed nuclear fuel cycle (NFC) processes. Computer models can be quickly changed in accordance with new and fresh data from experimental research. Three kinds of process simulation are necessary. First, the VIZART software package is a balance-model development used for calculating the material flows in technological processes. VIZART takes into account equipment capacity, transport lines and storage volumes. Secondly, it is necessary to simulate the physico-chemical processes that are involved in the closure of the NFC. The third kind of simulation is the development of software that allows the optimization, diagnostics and control of the processes, which implies real-time simulation of product flows across the whole plant or on separate lines of the plant. (A.C.)

  15. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    ... and, secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. ... by ARL modelers. 2. Development Environment: The automation of the Point-Stat processes (i.e., PSA) was developed using Python 3.5. Python was selected because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool. ...

  16. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative haz...

  17. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...

  18. Metrics for Business Process Models

    Science.gov (United States)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores to what extent certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.

  19. Development of kinetic models for photoassisted electrochemical process using Ti/RuO2 anode and carbon nanotube-based O2-diffusion cathode

    International Nuclear Information System (INIS)

    Akbarpour, Amaneh; Khataee, Alireza; Fathinia, Mehrangiz; Vahid, Behrouz

    2016-01-01

    Highlights: • Preparation and characterization of a carbon nanotube-based O2-diffusion cathode. • Photoassisted electrochemical process using a Ti/RuO2 anode and an O2-diffusion cathode. • Degradation of C.I. Basic Yellow 28 under recirculation mode. • Development of kinetic models for the photoassisted electrochemical process. - Abstract: A coupled photoassisted electrochemical system was utilized for the degradation of C.I. Basic Yellow 28 (BY28), a cationic azomethine dye, under recirculation mode. Experiments were carried out utilizing an active titanium/ruthenium oxide (Ti/RuO2) anode and an O2-diffusion cathode with carbon nanotubes (CNTs). Transmission electron microscopy (TEM) images of the CNTs demonstrated inner and outer diameters of approximately 5 nm and 19 nm, respectively. The dye degradation kinetics was then examined experimentally under various operational parameters, including BY28 initial concentration (mg/L), current density (mA/cm2), flow rate (L/h) and pH. Based on the generally accepted intrinsic elementary reactions for the photoassisted electrochemical process (PEP), a novel kinetic model was proposed and validated for predicting the apparent rate constant k_app. The developed kinetic model explicitly describes the dependency of k_app on BY28 initial concentration and current density. A good agreement was obtained between the predicted values of k_app and the experimental results (correlation coefficient R2 = 0.996, mean squared error MSE = 2.10 x 10^-4 and mean absolute error MAE = 1.10 x 10^-2). Finally, in order to thoroughly evaluate and compare the accuracy of the suggested intrinsic kinetic model, an empirical kinetic model was also developed as a function of the main operational parameters, along with an artificial neural network (ANN) model built as a 3-layer feed-forward back-propagation network with a 5:9:1 topology. The performance of the mentioned models was compared based on the error functions and analysis of variance (ANOVA).
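
    The apparent rate constant k_app referred to above is the slope of a pseudo-first-order plot, ln(C0/C) versus time. A generic fit of that kind is sketched below on synthetic concentration data; it only shows how a k_app value is extracted, not the intrinsic or ANN models of the paper.

      # Generic pseudo-first-order fit of the kind used to obtain k_app; the
      # concentration-time data below are synthetic, not the BY28 measurements.
      import numpy as np

      t = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 60.0])    # min
      c = np.array([20.0, 14.8, 11.0, 8.1, 6.0, 3.3])       # mg/L (synthetic)

      # ln(C0/C) = k_app * t  ->  zero-intercept least-squares slope gives k_app
      y = np.log(c[0] / c)
      k_app = float(np.sum(t * y) / np.sum(t * t))
      print(f"k_app ~ {k_app:.3f} 1/min, half-life ~ {np.log(2.0) / k_app:.1f} min")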

  20. A Multilevel Design Model : The mutual relationship between product-service system development and societal change processes

    NARCIS (Netherlands)

    Joore, J.P.; Brezet, J.C.

    2014-01-01

    Change actors like designers play a strategic role in innovation and transition processes towards a sustainable society. They act at all levels of society and need help to find their way through increasingly interrelated innovation systems. To support their efforts, there is a need for a design

  1. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the development of the subject-features of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates the qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Offered are two steps for singling out a particular stage and the algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretical research, analyses of educational practices and for realistic prediction of pedagogical phenomena.

  2. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Full Text Available Results of the numerical modeling of dynamic problems are summarized in the article. These problems are characteristic of various areas of human activity, in particular of problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas cleaning equipment, and modeling of biogas formation processes.

  3. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    Science.gov (United States)

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

    of the developed property models for the estimation of environment-related properties and uncertainties of the estimated property values is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes and allow one to evaluate the effect of uncertainties of estimated property values on the calculated performance of processes giving useful insights into quality and reliability of the design of sustainable processes.
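
    The "estimated value plus uncertainty" outputs described above follow the usual group-contribution pattern: sum the group contributions, then propagate their standard errors. The sketch below illustrates that pattern with invented group names, contribution values and standard errors; it does not use the GC+ parameter tables of the paper.

      # Generic group-contribution estimate with linear error propagation; the group
      # contributions and their standard errors below are invented, not GC+ values.
      import math

      contributions = {"CH3": (0.85, 0.04), "CH2": (0.55, 0.03), "OH": (2.10, 0.10)}  # (C_i, SE_i)
      molecule = {"CH3": 1, "CH2": 2, "OH": 1}                                        # group counts n_i

      estimate = sum(n * contributions[g][0] for g, n in molecule.items())
      variance = sum((n * contributions[g][1]) ** 2 for g, n in molecule.items())
      half_width = 1.96 * math.sqrt(variance)        # ~95% interval, assuming independent contributions

      print(f"property estimate = {estimate:.2f} +/- {half_width:.2f} (assumed units)")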

  4. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

    integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined processes perspectives. Second, the paper translates this information into a business process model and mathematical formalization. The study provides a useful guide to the various relevant technology‐related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  5. Process Development for Nanostructured Photovoltaics

    Energy Technology Data Exchange (ETDEWEB)

    Elam, Jeffrey W.

    2015-01-01

    Photovoltaic manufacturing is an emerging industry that promises a carbon-free, nearly limitless source of energy for our nation. However, the high-temperature manufacturing processes used for conventional silicon-based photovoltaics are extremely energy-intensive and expensive. This high cost imposes a critical barrier to the widespread implementation of photovoltaic technology. Argonne National Laboratory and its partners recently invented new methods for manufacturing nanostructured photovoltaic devices that allow dramatic savings in materials, process energy, and cost. These methods are based on atomic layer deposition, a thin film synthesis technique that has been commercialized for the mass production of semiconductor microelectronics. The goal of this project was to develop these low-cost fabrication methods for the high efficiency production of nanostructured photovoltaics, and to demonstrate these methods in solar cell manufacturing. We achieved this goal in two ways: 1) we demonstrated the benefits of these coatings in the laboratory by scaling-up the fabrication of low-cost dye sensitized solar cells; 2) we used our coating technology to reduce the manufacturing cost of solar cells under development by our industrial partners.

  6. Declarative modeling for process supervision

    International Nuclear Information System (INIS)

    Leyval, L.

    1989-01-01

    Our work is a contribution to the computer-aided supervision of continuous processes. It is inspired by an area of Artificial Intelligence: qualitative physics. Here, supervision is based on a model which continuously provides operators with a synthetic view of the process; but this model is founded on general principles of control theory rather than on physics. It involves concepts such as high gain or small time response. It helps in temporally linking the evolution of various variables. Moreover, the model provides predictions of the future behaviour of the process, which allows action advice and alarm filtering. This should greatly reduce the well-known cognitive overload associated with any complex and dangerous evolution of the process.

  7. The development of global energy supply as a succession of energy-related innovation processes. A qualitative model approach to assess the use of nuclear power

    International Nuclear Information System (INIS)

    Herrmann, Dieter

    2017-01-01

    Often, the development of the world energy supply is perceived as a painful sequence of the exhaustible and polluting use of primary energy sources, and the expectations placed in practically inexhaustible and environmentally neutral renewable energy sources are therefore high. In fact, however, it is the available production, conversion, and utilization technology that determines which energy sources are suitable to meet given demands and requirements. In particular, the development of energy demand requires energy technology innovations to exploit new energy sources, to use known energy sources more efficiently, and to replace exhaustible energy sources with others at an early stage. The historical development of the global energy supply is thus a sequence of interrelated energy technology innovation processes. This also makes it possible to analyse the historical development of nuclear power and to derive a model of the future role of nuclear power worldwide.

  8. ENTERPRISES DEVELOPMENT: MANAGEMENT MODEL

    Directory of Open Access Journals (Sweden)

    Lina Shenderivska

    2018-01-01

    Full Text Available The paper's purpose is to provide recommendations for effectively managing companies' development, taking into account the transformation of the sector's key elements. Methodology. An econometric simulation of enterprise profits is conducted to determine the most significant factors influencing their development. Testing of the model revealed multicollinearity among its regressors. To remove the multicollinearity from the profit models, the collinear regressors are excluded, namely return on assets, material returns, and return on equity. To obtain high-quality models with a small error in the estimation of the model parameters and, accordingly, a reliable conclusion about the interrelation between the model factors and the resulting feature, only factors that are not closely interrelated, i.e. not multicollinear, are included in the income model. The determination coefficient R2 and the F-criterion were calculated to check model quality. The key elements of modern printing enterprises of Ukraine, connected with their integration into the global information space, are analysed. Results. The study identifies the interrelation between a company's development and its earning capacity. The importance of profit as the main source of enterprise financing is substantiated. The factors with the greatest impact on the enterprises' development are labour productivity, financial autonomy, and working capital turnover, and the character of their influence is most adequately reflected by a power model. Peculiarities of the enterprises' activity include increased competition at the inter-branch level, poorly developed industrial relations, and a shortage of own sources for financing activities. Practical implications. Based on the most significant development impact factors, directions for prospective enterprise development aimed at increasing competitiveness are proposed: diversification based on the activity expansion
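
    As a rough illustration of the workflow described above (detecting multicollinear regressors, refitting without them, and checking model quality via R2 and the F-criterion), the sketch below uses statsmodels on synthetic data. The variance inflation factor is one common diagnostic for multicollinearity, not necessarily the authors' procedure; all variable names and figures are invented, not the enterprises' data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Synthetic illustration only: random data, hypothetical variable names.
rng = np.random.default_rng(0)
n = 60
X = pd.DataFrame({
    "labour_productivity":  rng.normal(100, 15, n),
    "financial_autonomy":   rng.normal(0.5, 0.1, n),
    "working_cap_turnover": rng.normal(4.0, 0.8, n),
})
# an artificially collinear regressor that a diagnostic should flag for removal
X["return_on_assets"] = 0.6 * X["labour_productivity"] / 100 + rng.normal(0, 0.01, n)
y = 2.0 * X["labour_productivity"] + 50 * X["financial_autonomy"] + rng.normal(0, 5, n)

# variance inflation factors (rule of thumb: VIF > 10 suggests multicollinearity)
Xc = sm.add_constant(X)
vif = {col: variance_inflation_factor(Xc.values, i) for i, col in enumerate(Xc.columns)}
print(vif)

# refit without the collinear regressor and report R^2 and the F-statistic
model = sm.OLS(y, sm.add_constant(X.drop(columns=["return_on_assets"]))).fit()
print(model.rsquared, model.fvalue, model.f_pvalue)
```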

  9. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
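
    A minimal sketch of the Monte Carlo idea behind such an integrated process model: sample process parameters within their ranges, chain simple unit-operation response functions, and estimate the probability that the final CQA falls out of specification. The response functions, parameter distributions and specification limit below are invented for illustration and are not the IPM described in the paper.

```python
import numpy as np

# Monte Carlo propagation through two stacked (hypothetical) unit operations
# to estimate a CQA distribution and the out-of-specification probability.
rng = np.random.default_rng(42)
N = 100_000

# process parameters sampled within assumed normal operating ranges
ph   = rng.normal(7.0, 0.1, N)      # upstream pH
temp = rng.normal(30.0, 0.5, N)     # upstream temperature [degC]
load = rng.normal(25.0, 1.5, N)     # downstream column load [g/L resin]

# unit operation 1 (e.g. fermentation): intermediate attribute depends on PPs
titer = 5.0 + 0.8 * (ph - 7.0) - 0.05 * (temp - 30.0) + rng.normal(0, 0.1, N)

# unit operation 2 (e.g. capture step): CQA depends on the intermediate
# attribute and on this operation's own process parameters
impurity = 1.2 - 0.1 * titer + 0.02 * (load - 25.0) + rng.normal(0, 0.05, N)

spec_limit = 0.9                    # upper specification limit (placeholder)
oos_probability = np.mean(impurity > spec_limit)
print(f"estimated OOS probability: {oos_probability:.4%}")
```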

  10. Benefits of user-oriented software development based on an iterative cyclic process model for simultaneous engineering

    NARCIS (Netherlands)

    Rauterberg, G.W.M.; Strohm, O.; Kirsch, C.

    1995-01-01

    The current state of traditional software development is surveyed and essential problems are investigated on the basis of empirical data and theoretical considerations. The concept of optimisation cycle is proposed as a solution for simultaneous engineering. The relationships of several different

  11. Neuroscientific Model of Motivational Process

    Science.gov (United States)

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  12. Neuroscientific model of motivational process.

    Science.gov (United States)

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  13. A development process meta-model for Web based expert systems: The Web engineering point of view

    DEFF Research Database (Denmark)

    Dokas, I.M.; Alapetite, Alexandre

    2006-01-01

    raised their complexity. Unfortunately, there is so far no clear answer to the question: How may the methods and experience of Web engineering and expert systems be combined and applied in order to develop effective and successful Web based expert systems? In an attempt to answer this question...... on Web based expert systems – will be presented. The idea behind the presentation of the accessibility evaluation and its conclusions is to show to Web based expert system developers, who typically have little Web engineering background, that Web engineering issues must be considered when developing Web......Similar to many legacy computer systems, expert systems can be accessed via the Web, forming a set of Web applications known as Web based expert systems. The tough Web competition, the way people and organizations rely on Web applications and the increasing user requirements for better services have...

  14. Software Process Validation: Quantitatively Measuring the Correspondence of a Process to a Model

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1997-01-01

    .... When process models and process executions diverge, something significant is happening. The authors have developed techniques for uncovering and measuring the discrepancies between models and executions, which they call process validation...

  15. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

    Full Text Available To study economic phenomena and processes using mathematical modelling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package. Any economic process or phenomenon has a mathematical description of its behaviour, from which an economic-mathematical model is drawn up in the following stages: formulation of the problem, analysis of the process to be modelled, production of the model and design verification, and validation and implementation of the model. This article presents an economic model whose formulation uses mathematical equations and the MatLab software package, which helps us approximate an effective solution. The input data considered are the net cost, the direct and total cost, and the link between them. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with a graphic representation and an interpretation of the results achieved in terms of our specific problem.

  16. The Modeling of the Processes of Assessment and Analysis of the Level of Socio-Ecological-Economic Development of a Region

    Directory of Open Access Journals (Sweden)

    Pilko Andriy D.

    2017-06-01

    Full Text Available The paper presents the results of research into existing scientific, methodological and theoretical approaches to managing the socio-ecological-economic development of a region, with subsequent definition of regional policy priorities that take into account the concepts of security and development. Based on a study of literature sources and an analysis of the available statistical base, the problem of assessing the level of social, economic and ecological development, the level of sustainable development, and the degree of harmonization of sustainable development of the territorial systems of a region is formulated, and a possible method of its solution is suggested. The direction and nature of the cause-effect relationships between social tension and the levels of economic, environmental and social development and the level of sustainable development of a region are determined. A scheme is proposed for building models to assess the effectiveness of levers for managing social, economic and environmental processes at the level of territorial systems in a region, taking into account the level of social tension and indicators of the investment component of development.

  17. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    and having three or more stages. The methods are applied to process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables...... be performed regarding the foreseeable output property y, and with respect to an admissible range of correcting actions for the parameters of the next stage. In this paper the basic principles of path modeling are presented. The mathematics is presented for processes having only one stage, having two stages...... of the next stage with the purpose of obtaining optimal or almost optimal quality of the output variable. An important aspect of the methods presented is the possibility of extensive graphic analysis of data that can provide the engineer with a detailed view of the multi-variate variation in data....

  18. GEODAT. Development of thermodynamic data for the thermodynamic equilibrium modeling of processes in deep geothermal formations. Combined report

    International Nuclear Information System (INIS)

    Moog, Helge C.; Regenspurg, Simona; Voigt, Wolfgang

    2015-02-01

    The concept of geothermal energy use for electricity generation can be differentiated into three compartments: in the geological compartment, cooled fluid is pressed into a porous or fractured rock formation; in the borehole compartment, hot fluid is pumped to the surface and back into the geothermal reservoir; and in the above-ground facility, the energy is extracted from the geothermal fluid by heat exchangers. Pressure and temperature changes influence the thermodynamic equilibrium of a system. The modeling of a geothermal system therefore has to consider, besides mass transport, the heat transport and consequently the changing solution compositions and the pressure/temperature-affected chemical equilibrium. The GEODAT project aims to simulate the reactive mass transport in a geothermal reservoir in the North German basin (Gross Schoenebeck). The project was performed in cooperation between three partners: Geoforschungsinstitut Potsdam, Bergakademie Freiberg and GRS.

  19. Study of hybrid laser / MAG welding process: characterization of the geometry and the hydrodynamics of the melt pool and development of a 3D thermal model

    International Nuclear Information System (INIS)

    Le Guen, E.

    2010-11-01

    Hybrid laser/MIG-MAG welding offers clear advantages compared to laser welding or GMAW arc welding used separately. Thanks to this process, higher productivity can be gained through higher welding speed and higher squeeze tolerance; moreover, possible improvements of the metallurgical properties of the weld seam can be obtained. However, many operating parameters have to be set up in order to achieve an optimal process. The complex physical phenomena which govern the welding process have to be understood in order to use this technique efficiently in mass production. Understanding these phenomena is also necessary to develop numerical simulations suited to this process. In a first step, experimental studies were carried out with GMAW, laser and hybrid welding on samples of S355 steel. The influence of the operating parameters was analysed through high-speed camera recordings and macrographs of weld seam cross sections. Surface deformations of the melt pool induced by the arc pressure, the weld pool length, droplet detachment and welding speed were analysed precisely from images of the melt pool surface. In a second step, a numerical model was developed using the COMSOL Multiphysics software for the MAG, laser and hybrid laser/MAG welding processes. A 3D quasi-stationary model of the temperature field within the metal was calculated. The originality of the MAG and hybrid models lies in the prediction of the melt pool surface profile used to determine the 3D geometry, by taking into account the material input. The influence of different parameters, such as arc power and welding speed, on the efficiency, as well as the distribution radius of the arc power and the arc pressure, is analysed through validations against different experimental results and different calculation configurations. (author)

  20. Modelling of processes occurring in deep geological repository - development of new modules in the GoldSim environment

    International Nuclear Information System (INIS)

    Vopalka, D.; Lukin, D.; Vokal, A.

    2006-01-01

    Three new modules were prepared in the GoldSim environment (using its Transport Module). The source-term module includes radioactive decay and ingrowth in the canister, first-order degradation of the fuel matrix, solubility limitation of the concentration of the studied nuclides, and diffusive migration through the surrounding bentonite layer, controlled by an output boundary condition formulated with respect to the rate of water flow in the rock. The module was successfully compared with the results of similar codes (MIVCYL and Pagoda), and the possibilities of the module were extended by a more realistic model of matrix degradation. A better quantification of the role of radionuclide sorption on the bentonite surface was enabled by a module that includes a non-linear form of the interaction isotherm. Using this module, both the influence of the shape of the sorption isotherm on the values of the diffusion coefficients and the limits of the Kd approach that dominates most codes used in performance assessment studies were discussed. The third GoldSim module presented was developed for the description of the corrosion of canisters made of carbon steel and for the transport of corrosion products in the near-field region. This module evaluates balance equations between the dissolving species and the species transported by diffusion and/or advection from the surface of the solid material. The model also includes transport of iron directly to a fracture in the surrounding rock or into a layer of granite host rock without fractures, and takes into account the reduction of the actual corrosion rate of the canister by the growth of the corrosion layer thickness
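
    A heavily simplified sketch of the source-term ideas listed above (first-order degradation of the fuel matrix, radioactive decay of the released nuclide, and a solubility cap on the dissolved concentration), written as an explicit time-stepping balance. All rate constants, the water volume and the solubility limit are invented placeholders; this is not the GoldSim module itself, and diffusion through the bentonite is omitted.

```python
import numpy as np

# Simplified source-term balance: first-order matrix degradation releases a
# nuclide into canister water, both pools decay, and the dissolved
# concentration is capped at a solubility limit (excess is treated as
# precipitated and simply ignored here).
k_matrix = 1e-5      # fuel matrix degradation rate [1/yr] (placeholder)
lam      = 2.4e-4    # decay constant of the nuclide [1/yr] (placeholder)
c_sol    = 1e-6      # solubility limit [mol/m3] (placeholder)
V_water  = 1.0       # water volume in the canister [m3] (placeholder)

n_matrix = 1.0       # nuclide inventory still bound in the matrix [mol]
n_diss   = 0.0       # dissolved (mobile) inventory [mol]
dt, t_end = 1.0, 10_000.0

for _ in np.arange(0.0, t_end, dt):
    released  = k_matrix * n_matrix * dt          # congruent release from the matrix
    n_matrix -= released + lam * n_matrix * dt    # matrix loses by release and decay
    n_diss   += released - lam * n_diss * dt      # dissolved pool gains and decays
    n_diss    = min(n_diss, c_sol * V_water)      # solubility-limited concentration

print(f"dissolved concentration after {t_end:.0f} yr: {n_diss / V_water:.3e} mol/m3")
```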

  1. Birth/death process model

    Science.gov (United States)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
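
    As an illustration of what such a birth/death population model computes, the following is a generic continuous-time birth/death simulation using the Gillespie algorithm; the birth and death rates and the initial population are arbitrary values, and this is not the 1976 interactive program itself.

```python
import numpy as np

# Minimal Gillespie-style simulation of a continuous-time birth/death process:
# birth rate b*n and death rate d*n for population size n (placeholder rates).
def simulate_birth_death(n0=50, b=0.11, d=0.10, t_end=100.0, seed=1):
    rng = np.random.default_rng(seed)
    t, n = 0.0, n0
    history = [(t, n)]
    while t < t_end and n > 0:
        birth_rate, death_rate = b * n, d * n
        total = birth_rate + death_rate
        t += rng.exponential(1.0 / total)                      # time to next event
        n += 1 if rng.random() < birth_rate / total else -1    # birth or death
        history.append((t, n))
    return history

trajectory = simulate_birth_death()
print("final time, population:", trajectory[-1])
```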

  2. Cost Models for MMC Manufacturing Processes

    Science.gov (United States)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes and the resulting predicted quality-cost curves are presented and discussed.

  3. Development of a Mass Transfer Model and Its Application to the Behavior of the Cs, Sr, Ba, and Oxygen ions in an Electrolytic Reduction Process for SF

    International Nuclear Information System (INIS)

    Park, Byung Heung; Kang, Dae Seung; Seo, Chung Seok; Park, Seong Won

    2005-01-01

    Isotopes of alkali and alkaline earth metals (AM and AEM) are the main contributors to the heat load and the radiotoxicity of spent fuel (SF). These components are separated from the SF and dissolved in molten LiCl in an electrolytic reduction process. A mass transfer model is developed to describe the diffusion behavior of Cs, Sr, and Ba from the SF into the molten salt. The model is an analytical solution of Fick's second law of diffusion for a cylinder, which is the shape of a cathode in the electrolytic reduction process. The model is also applied to depict the concentration profile of the oxygen ion, which is produced by the electrolysis of Li2O. The diffusion coefficients regressed by correlating the model with the experimentally measured data are evaluated to be greater in the order of Ba, Cs, and Sr for the metal ions, and the diffusion of the oxygen ion is slower than that of the metal ions, which implies that different mechanisms govern the diffusion of the metal ions and the oxygen ions in molten LiCl.
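
    The paper's own analytical solution is not given in the abstract; as an indication of the kind of expression involved, the sketch below evaluates the standard series solution (Crank) for diffusion out of a long cylinder whose surface concentration is held at zero. The diffusion coefficient, radius and times are placeholders, not the regressed values for Cs, Sr, Ba or the oxygen ion.

```python
import numpy as np
from scipy.special import jn_zeros

# Fractional release M_t/M_inf from an infinite cylinder of radius a with
# uniform initial concentration and zero surface concentration:
#   M_t/M_inf = 1 - sum_n (4/beta_n^2) * exp(-D * beta_n^2 * t / a^2),
# where beta_n are the positive roots of the Bessel function J0.
def fractional_release(t, D, a, n_terms=50):
    beta = jn_zeros(0, n_terms)                      # first n_terms roots of J0(x) = 0
    return 1.0 - np.sum(4.0 / beta**2 * np.exp(-D * beta**2 * t / a**2))

D = 1.0e-9      # diffusion coefficient [m2/s] (placeholder)
a = 5.0e-3      # cylinder (cathode) radius [m] (placeholder)
for t in (600.0, 3600.0, 7200.0):                    # seconds
    print(t, fractional_release(t, D, a))
```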

  4. Numerical modeling of atmospheric washout processes

    International Nuclear Information System (INIS)

    Bayer, D.; Beheng, K.D.; Herbert, F.

    1987-01-01

    For the washout of particles from the atmosphere by clouds and rain, one has to distinguish between processes which act in the first phase of cloud development, when condensation nuclei build up in saturated air (Nucleation Aerosol Scavenging, NAS), and those which act during the subsequent cloud development. In the second case, particles are collected by cloud droplets or by falling rain drops via collision (Collision Aerosol Scavenging, CAS). The physics of both processes is described. For the CAS process a numerical model is presented. The report contains a documentation of the mathematical equations and the computer programs (FORTRAN). (KW) [de

  5. Integrated durability process in product development

    International Nuclear Information System (INIS)

    Pompetzki, M.; Saadetian, H.

    2002-01-01

    This presentation describes the integrated durability process in product development. Each of the major components of the integrated process are described along with a number of examples of how integrated durability assessment has been used in the ground vehicle industry. The durability process starts with the acquisition of loading information, either physically through loads measurement or virtually through multibody dynamics. The loading information is then processed and characterized for further analysis. Durability assessment was historically test based and completed through field or laboratory evaluation. Today, it is common that both the test and CAE environments are used together in durability assessment. Test based durability assessment is used for final design sign-off but is also critically important for correlating CAE models, in order to investigate design alternatives. There is also a major initiative today to integrate the individual components into a process, by linking applications and providing a framework to communicate information as well as manage all the data involved in the entire process. Although a single process is presented, the details of the process can vary significantly for different products and applications. Recent applications that highlight different parts of the durability process are given. As well as an example of how integration of software tools between different disciplines (MBD, FE and fatigue) not only simplifies the process, but also significantly improves it. (author)

  6. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

    Full Text Available This paper presents a model for an integrated security system, which can be implemented in any organization. It is based on security-specific standards and taxonomies as ISO 7498-2 and Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality and also we focus on the specific components.

  7. Model of diffusers / permeators for hydrogen processing

    International Nuclear Information System (INIS)

    Jacobs, W. D.; Hang, T.

    2008-01-01

    Palladium-silver (Pd-Ag) diffusers are mainstays of hydrogen processing. Diffusers separate hydrogen from inert species such as nitrogen, argon or helium. The tubing becomes permeable to hydrogen when heated to more than 250 C and a differential pressure is created across the membrane. The hydrogen diffuses better at higher temperatures. Experimental or experiential results have been the basis for determining or predicting a diffuser's performance. However, the process can be mathematically modeled, and comparison to experimental or other operating data can be utilized to improve the fit of the model. A reliable model-based diffuser system design is the goal which will have impacts on tritium and hydrogen processing. A computer model has been developed to solve the differential equations for diffusion given the operating boundary conditions. The model was compared to operating data for a low pressure diffuser system. The modeling approach and the results are presented in this paper. (authors)
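
    A common way such Pd-Ag membrane models are written (not necessarily the exact equations used in this paper) is Sieverts'-law permeation: the hydrogen flux is proportional to the difference of the square roots of the upstream and downstream H2 partial pressures, with an Arrhenius temperature dependence of the permeability. The pre-factor, activation energy and tube geometry below are generic illustrative values, not the parameters of the model described here.

```python
import numpy as np

# Sieverts'-law hydrogen permeation through a Pd-Ag wall (illustrative values).
R = 8.314                       # J/(mol K)
PHI0 = 3.2e-7                   # permeability pre-factor [mol/(m s Pa^0.5)] (illustrative)
EA = 1.38e4                     # activation energy [J/mol] (illustrative)

def h2_flux(T, p_up, p_down, wall_thickness):
    """Hydrogen flux through the membrane wall [mol H2 / (m^2 s)]."""
    permeability = PHI0 * np.exp(-EA / (R * T))
    return permeability / wall_thickness * (np.sqrt(p_up) - np.sqrt(p_down))

# example: 350 C, 100 kPa upstream H2, 1 kPa downstream, 0.2 mm wall
T = 623.15
flux = h2_flux(T, 1.0e5, 1.0e3, 2.0e-4)
area = np.pi * 6.0e-3 * 0.5     # 6 mm diameter, 0.5 m long tube (illustrative)
print(f"flux = {flux:.3e} mol H2/(m2 s), tube throughput = {flux * area:.3e} mol/s")
```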

  8. Advancements in Hydrology and Erosion Process Understanding and Post-Fire Hydrologic and Erosion Model Development for Semi-Arid Landscapes

    Science.gov (United States)

    Williams, C. Jason; Pierson, Frederick B.; Al-Hamdan, Osama Z.; Robichaud, Peter R.; Nearing, Mark A.; Hernandez, Mariano; Weltz, Mark A.; Spaeth, Kenneth E.; Goodrich, David C.

    2017-04-01

    Fire activity continues to increase in semi-arid regions around the globe. Private and governmental land management entities are challenged with predicting and mitigating post-fire hydrologic and erosion responses on these landscapes. For more than a decade, a team of scientists with the US Department of Agriculture has collaborated on extensive post-fire hydrologic field research and the application of field research to development of post-fire hydrology and erosion predictive technologies. Experiments funded through this research investigated the impacts of fire on vegetation and soils and the effects of these fire-induced changes on infiltration, runoff generation, erodibility, and soil erosion processes. The distribution of study sites spans diverse topography across grassland, shrubland, and woodland landscapes throughout the western United States. Knowledge gleaned from the extensive field experiments was applied to develop and enhance physically-based models for hillslope- to watershed-scale runoff and erosion prediction. Our field research and subsequent data syntheses have identified key knowledge gaps and challenges regarding post-fire hydrology and erosion modeling. Our presentation details some consistent trends across a diverse domain and varying landscape conditions based on our extensive field campaigns. We demonstrate how field data have advanced our understanding of post-fire hydrology and erosion for semi-arid landscapes and highlight remaining key knowledge gaps. Lastly, we briefly show how our well-replicated experimental methodologies have contributed to advancements in hydrologic and erosion model development for the post-fire environment.

  9. Light measurement model to assess software development process improvement (MLM-PDS)

    Directory of Open Access Journals (Sweden)

    Diana Vásquez

    2010-12-01

    Full Text Available Companies in software development in Colombia face a number of problems, such as building software in an artisanal, empirical and disorganized way. It is therefore necessary for these companies to implement projects to improve their development processes, because ensuring the quality of their products by improving their software processes is a step they must take in order to compete in the market. When implementing process improvement models, it is not enough to state whether a company is actually obtaining benefits; one of the first actions in an improvement project is to determine the current status of the process. Only by measuring is it possible to know the state of a process in an objective way, and only through this is it possible to plan strategies and solutions concerning the improvements to make, depending on the objectives of the organization. This paper proposes a light model to assess the software development process, which seeks to help Colombian software development companies determine whether their process of implementing improvements is being effective in achieving the objectives and goals set, through the use of measures to evaluate the improvement of their development processes. The model makes it possible to characterize the current practices of the company, identifying the weaknesses, strengths and capabilities of the processes carried out within it, and thus to control or prevent the causes of low quality or of deviations in costs or planning.

  10. Mathematical modelling in economic processes.

    Directory of Open Access Journals (Sweden)

    L.V. Kravtsova

    2008-06-01

    Full Text Available The article considers a number of methods for the mathematical modelling of economic processes and the possibilities of using Excel spreadsheets to obtain the optimum solution of problems or to calculate financial operations with the help of the built-in functions.

  11. 0-6759 : developing a business process and logical model to support a tour-based travel demand model design for TxDOT.

    Science.gov (United States)

    2013-08-01

    The Texas Department of Transportation (TxDOT) created a standardized trip-based modeling approach for travel demand modeling called the Texas Package Suite of Travel Demand Models (referred to as the Texas Package) to oversee the travel de...

  12. Modelling a uranium ore bioleaching process

    International Nuclear Information System (INIS)

    Chien, D.C.H.; Douglas, P.L.; Herman, D.H.; Marchbank, A.

    1990-01-01

    A dynamic simulation model for the bioleaching of uranium ore in a stope leaching process has been developed. The model incorporates design and operating conditions, reaction kinetics enhanced by the Thiobacillus ferrooxidans present in the leaching solution, and transport properties. Model predictions agree well with experimental data, with an average deviation of about ± 3%. The model is sensitive to small errors in the estimates of fragment size and ore grade. Because accurate estimates are difficult to obtain, a parameter estimation approach was developed to update the values of fragment size and ore grade using on-line plant information.

  13. Visualizing the process of process modeling with PPMCharts

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the quest for knowledge about how to make good process models, recent research focus is shifting from studying the quality of process models to studying the process of process modeling (often abbreviated as PPM) itself. This paper reports on our efforts to visualize this specific process in such

  14. Neuroscientific Model of Motivational Process

    Directory of Open Access Journals (Sweden)

    Sung-Il eKim

    2013-03-01

    Full Text Available Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  15. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    possible to retrieve symbolically obtained derivatives of arbitrary process properties with respect to process parameters efficiently as a post calculation. The approach is therefore perfectly suitable to perform advanced process systems engineering tasks, such as sensitivity analysis, process optimisation, and data reconciliation. The concept of canonical modelling yields a natural definition of a general exergy state function for second law analysis. By partitioning of exergy into latent, mechanical, and chemical contributions, irreversible effects can be identified specifically, even for black-box models. The calculation core of a new process simulator called Yasim is developed and implemented. The software design follows the concepts described in the theoretical part of this thesis. Numerous exemplary process models are presented to address various subtopics of canonical modelling (author)

  16. New droplet model developments

    International Nuclear Information System (INIS)

    Dorso, C.O.; Myers, W.D.; Swiatecki, W.J.; Moeller, P.; Treiner, J.; Weiss, M.S.

    1985-09-01

    A brief summary is given of three recent contributions to the development of the Droplet Model. The first concerns the electric dipole moment induced in octupole deformed nuclei by the Coulomb redistribution. The second concerns a study of squeezing in nuclei and the third is a study of the improved predictive power of the model when an empirical ''exponential'' term is included. 25 refs., 3 figs

  17. Advanced Mirror & Modelling Technology Development

    Science.gov (United States)

    Effinger, Michael; Stahl, H. Philip; Abplanalp, Laura; Maffett, Steven; Egerman, Robert; Eng, Ron; Arnold, William; Mosier, Gary; Blaurock, Carl

    2014-01-01

    The 2020 Decadal technology survey is starting in 2018. Technology on the shelf at that time will help guide selection to future low risk and low cost missions. The Advanced Mirror Technology Development (AMTD) team has identified development priorities based on science goals and engineering requirements for Ultraviolet Optical near-Infrared (UVOIR) missions in order to contribute to the selection process. One key development identified was lightweight mirror fabrication and testing. A monolithic, stacked, deep core mirror was fused and replicated twice to achieve the desired radius of curvature. It was subsequently successfully polished and tested. A recently awarded second phase to the AMTD project will develop larger mirrors to demonstrate the lateral scaling of the deep core mirror technology. Another key development was rapid modeling for the mirror. One model focused on generating optical and structural model results in minutes instead of months. Many variables could be accounted for regarding the core, face plate and back structure details. A portion of a spacecraft model was also developed. The spacecraft model incorporated direct integration to transform optical path difference to Point Spread Function (PSF) and between PSF to modulation transfer function. The second phase to the project will take the results of the rapid mirror modeler and integrate them into the rapid spacecraft modeler.

  18. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  19. Probabilistic Model Development

    Science.gov (United States)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) will not be exceeded at a user-specified confidence level; and 2) will provide reference environments for: a) peak flux; b) event-integrated fluence; and c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium and heavier ions.
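
    A sketch of the basic statistical step implied above: given many sampled mission-integrated fluences, the reference environment at a user-specified confidence level is the corresponding percentile of that distribution. The Poisson event rate and the lognormal event-fluence parameters below are invented placeholders, not the model's calibrated values.

```python
import numpy as np

# Simulate many possible missions and report the proton fluence that will not
# be exceeded at a user-specified confidence level (a sample percentile).
rng = np.random.default_rng(7)
n_missions = 20_000
mission_years = 2.0
events_per_year = 6.0                     # assumed mean solar particle event rate

n_events = rng.poisson(events_per_year * mission_years, n_missions)
fluences = np.array([
    rng.lognormal(mean=20.0, sigma=1.5, size=k).sum() if k else 0.0
    for k in n_events
])                                         # protons/cm^2 above some energy threshold

confidence = 0.95
reference_fluence = np.quantile(fluences, confidence)
print(f"{confidence:.0%} confidence mission fluence: {reference_fluence:.3e} protons/cm^2")
```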

  20. Understanding flexible and distributed software development processes

    OpenAIRE

    Agerfalk, Par J.; Fitzgerald, Brian

    2006-01-01

    The minitrack on Flexible and Distributed Software Development Processes addresses two important and partially intertwined current themes in software development: process flexibility and globally distributed software development.

  1. An ecological vegetation-activated sludge process (V-ASP) for decentralized wastewater treatment: system development, treatment performance, and mathematical modeling.

    Science.gov (United States)

    Yuan, Jiajia; Dong, Wenyi; Sun, Feiyun; Li, Pu; Zhao, Ke

    2016-05-01

    An environment-friendly decentralized wastewater treatment process comprising an activated sludge process (ASP) and wetland vegetation, named the vegetation-activated sludge process (V-ASP), was developed for decentralized wastewater treatment. The long-term experimental results evidenced that the vegetation sequencing batch reactor (V-SBR) process had consistently higher and more stable removal efficiencies of organic substances and nutrients from domestic wastewater compared with a traditional sequencing batch reactor (SBR). The vegetation allocated to the V-SBR system could not only remove nutrients through vegetation transpiration but also provide a large surface area that enhances microorganism activity. This high vegetation transpiration ratio enhanced nutrient removal from wastewater mainly by flux enhancement, acceleration of oxygen and substrate transport, and stimulation of vegetation respiration. A mathematical model based on ASM2d was successfully established, incorporating the specific function of the vegetation to simulate system performance. Simulation of the influence of operational parameters on V-ASP treatment effectiveness demonstrated that V-SBR had a high resistance to seasonal temperature fluctuations and influent loading shocks.

  2. Dry process fuel performance technology development

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Kweon Ho; Kim, K. W.; Kim, B. K. (and others)

    2006-06-15

    The objective of the project is to establish the performance evaluation system for DUPIC fuel during the Phase III R and D. In order to fulfil these objectives, property model development for DUPIC fuel and an irradiation test in Hanaro using an instrumented rig were carried out. In addition, analysis of the in-reactor behavior of DUPIC fuel, out-of-pile tests using simulated DUPIC fuel, and performance and integrity assessments in a commercial reactor were performed during this Phase. The R and D results of Phase III are summarized as follows: establishment of the fabrication process for simulated DUPIC fuel for property measurement, property model development for DUPIC fuel, performance evaluation of DUPIC fuel via the irradiation test in Hanaro, post-irradiation examination of the irradiated fuel and performance analysis, and development of the DUPIC fuel performance code (KAOS)

  3. Dry process fuel performance technology development

    International Nuclear Information System (INIS)

    Kang, Kweon Ho; Kim, K. W.; Kim, B. K.

    2006-06-01

    The objective of the project is to establish the performance evaluation system for DUPIC fuel during the Phase III R and D. In order to fulfil these objectives, property model development for DUPIC fuel and an irradiation test in Hanaro using an instrumented rig were carried out. In addition, analysis of the in-reactor behavior of DUPIC fuel, out-of-pile tests using simulated DUPIC fuel, and performance and integrity assessments in a commercial reactor were performed during this Phase. The R and D results of Phase III are summarized as follows: establishment of the fabrication process for simulated DUPIC fuel for property measurement, property model development for DUPIC fuel, performance evaluation of DUPIC fuel via the irradiation test in Hanaro, post-irradiation examination of the irradiated fuel and performance analysis, and development of the DUPIC fuel performance code (KAOS)

  4. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Full Text Available Background: Competitive intelligence (CI) provides actionable intelligence, which provides a competitive edge in enterprises. However, without a proper process, it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI with the aim of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, where the output of one phase is the input of the next phase. Conclusion: The CI process is a cycle of interrelated phases. The output of one phase is the input of the next phase. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  5. Multiphysics modelling of the spray forming process

    International Nuclear Information System (INIS)

    Mi, J.; Grant, P.S.; Fritsching, U.; Belkessam, O.; Garmendia, I.; Landaberea, A.

    2008-01-01

    An integrated, multiphysics numerical model has been developed through the joint efforts of the University of Oxford (UK), University of Bremen (Germany) and Inasmet (Spain) to simulate the spray forming process. The integrated model consisted of four sub-models: (1) an atomization model simulating the fragmentation of a continuous liquid metal stream into droplet spray during gas atomization; (2) a droplet spray model simulating the droplet spray mass and enthalpy evolution in the gas flow field prior to deposition; (3) a droplet deposition model simulating droplet deposition, splashing and re-deposition behavior and the resulting preform shape and heat flow; and (4) a porosity model simulating the porosity distribution inside a spray formed ring preform. The model has been validated against experiments of the spray forming of large diameter IN718 Ni superalloy rings. The modelled preform shape, surface temperature and final porosity distribution showed good agreement with experimental measurements

  6. Stem Cell Differentiation Stage Factors and Their Role in Triggering Symmetry Breaking Processes during Cancer Development: A Quantum Field Theory Model for Reprogramming Cancer Cells to Healthy Phenotypes.

    Science.gov (United States)

    Biava, Pier Mario; Burigana, Fabio; Germano, Roberto; Kurian, Philip; Verzegnassi, Claudio; Vitiello, Giuseppe

    2017-09-20

    A long history of research has pursued the use of embryonic factors isolated during cell differentiation processes for the express purpose of transforming cancer cells back to healthy phenotypes. Recent results have clarified that the substances present at different stages of cell differentiation-which we call stem cell differentiation stage factors (SCDSFs)-are proteins with low molecular weight and nucleic acids that regulate genomic expression. The present review summarizes how these substances, taken at different stages of cellular maturation, are able to retard proliferation of many human tumor cell lines and thereby reprogram cancer cells to healthy phenotypes. The model presented here is a quantum field theory (QFT) model in which SCDSFs are able to trigger symmetry breaking processes during cancer development. These symmetry breaking processes, which lie at the root of many phenomena in elementary particle physics and condensed matter physics, govern the phase transitions of totipotent cells to higher degrees of diversity and order, resulting in cell differentiation. In cancers, which share many genomic and metabolic similarities with embryonic stem cells, stimulated re-differentiation often signifies the phenotypic reversion back to health and non-proliferation. In addition to acting on key components of the cellular cycle, SCDSFs are able to reprogram cancer cells by delicately influencing the cancer microenvironment, modulating the electrochemistry and thus the collective electrodynamic behaviors between dipole networks in biomacromolecules and the interstitial water field. Coherent effects in biological water, which are derived from a dissipative QFT framework, may offer new diagnostic and therapeutic targets at a systemic level, before tumor instantiation occurs in specific tissues or organs. Thus, by including the environment as an essential component of our model, we may push the prevailing paradigm of mutation-driven oncogenesis toward a closer

  7. Development of a data driven process-based model for remote sensing of terrestrial ecosystem productivity, evapotranspiration, and above-ground biomass

    Science.gov (United States)

    El Masri, Bassil

    2011-12-01

    Modeling terrestrial ecosystem functions and structure has been a subject of increasing interest because of the importance of the terrestrial carbon cycle in the global carbon budget and climate change. In this study, satellite data were used to estimate gross primary production (GPP) and evapotranspiration (ET) for two deciduous forests: Morgan Monroe State Forest (MMSF) in Indiana and Harvard Forest in Massachusetts. In addition, above-ground biomass (AGB) was estimated for the MMSF and the Howland Forest (mixed forest) in Maine. Surface reflectance and temperature, vegetation indices, soil moisture, tree height and canopy area derived from the Moderate Resolution Imaging Spectroradiometer (MODIS), the Advanced Microwave Scanning Radiometer (AMSR-E), LIDAR, and aerial imagery, respectively, were used for this purpose. These variables, along with others derived from remotely sensed data, were used as input variables to process-based models which estimated GPP and ET and to a regression model which estimated AGB. The process-based models were BIOME-BGC and the Penman-Monteith equation. Measured values for the carbon and water fluxes obtained from the eddy covariance flux towers were compared to the modeled GPP and ET. The data-driven methods produced good estimates of GPP and ET, with an average root mean square error (RMSE) of 0.17 molC/m2 and 0.40 mm/day, respectively, for the MMSF and the Harvard Forest. In addition, allometric data for the MMSF were used to develop the regression model relating AGB to stem volume. The performance of the AGB regression model was compared to site measurements using remotely sensed data for the MMSF and the Howland Forest, where the model AGB RMSE ranged between 2.92 and 3.30 kg C/m2. Sensitivity analysis revealed that improvements in the estimation of maintenance respiration and of remotely sensed maximum photosynthetic activity, as well as accurate estimates of canopy resistance, would result in improved GPP and ET predictions. Moreover, AGB estimates were
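
    The ET component rests on the Penman-Monteith equation; one standard form of that equation is sketched below as a generic implementation. The surface and aerodynamic resistances and the meteorological values in the example are placeholders, and the study's remote-sensing parameterisation of these inputs is not reproduced.

```python
import numpy as np

# Generic Penman-Monteith latent heat flux:
#   lambda*E = [Delta*(Rn - G) + rho_a*cp*(es - ea)/ra] / [Delta + gamma*(1 + rs/ra)]
def penman_monteith(Rn, G, Ta, ea, es, ra, rs, pressure=101.3):
    """Latent heat flux [W/m2].
    Rn, G in W/m2; Ta in degC; ea, es in kPa; ra, rs in s/m; pressure in kPa."""
    rho_a = 1.2                                   # air density [kg/m3] (approx.)
    cp = 1013.0                                   # specific heat of air [J/(kg K)]
    gamma = 0.000665 * pressure                   # psychrometric constant [kPa/degC]
    delta = 4098.0 * (0.6108 * np.exp(17.27 * Ta / (Ta + 237.3))) / (Ta + 237.3) ** 2
    num = delta * (Rn - G) + rho_a * cp * (es - ea) / ra
    den = delta + gamma * (1.0 + rs / ra)
    return num / den                              # W/m2; divide by ~2.45e6 J/kg for mm/s

# example mid-day values over a deciduous forest (illustrative only)
LE = penman_monteith(Rn=450.0, G=40.0, Ta=22.0, ea=1.4, es=2.6, ra=30.0, rs=80.0)
print(f"latent heat flux: {LE:.0f} W/m2  (~{LE / 2.45e6 * 3600:.2f} mm/h)")
```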

  8. Identification of the main processes in new towns Development ...

    African Journals Online (AJOL)

    Identification of the main processes in new towns Development Company in Iran and provision of the model of ideal processes for optimal management of ... The most important result of this project is that after identifying the status quo, mapping the processes, revising the processes and applying revised processes, the ...

  9. Development of enhanced sulfur rejection processes

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, R.H.; Luttrell, G.H.; Adel, G.T.; Richardson, P.E.

    1996-03-01

    Research at Virginia Tech led to the development of two complementary concepts for improving the removal of inorganic sulfur from many eastern U.S. coals. These concepts are referred to as Electrochemically Enhanced Sulfur Rejection (EESR) and Polymer Enhanced Sulfur Rejection (PESR) processes. The EESR process uses electrochemical techniques to suppress the formation of hydrophobic oxidation products believed to be responsible for the floatability of coal pyrite. The PESR process uses polymeric reagents that react with pyrite and convert floatable middlings, i.e., composite particles composed of pyrite with coal inclusions, into hydrophilic particles. These new pyritic-sulfur rejection processes do not require significant modifications to existing coal preparation facilities, thereby enhancing their adoptability by the coal industry. It is believed that these processes can be used simultaneously to maximize the rejection of both well-liberated pyrite and composite coal-pyrite particles. The project was initiated on October 1, 1992 and all technical work has been completed. This report is based on the research carried out under Tasks 2-7 described in the project proposal. These tasks include Characterization, Electrochemical Studies, In Situ Monitoring of Reagent Adsorption on Pyrite, Bench Scale Testing of the EESR Process, Bench Scale Testing of the PESR Process, and Modeling and Simulation.

  10. DEVELOPMENT AND TESTING OF GEO-PROCESSING MODELS FOR THE AUTOMATIC GENERATION OF REMEDIATION PLAN AND NAVIGATION DATA TO USE IN INDUSTRIAL DISASTER REMEDIATION

    Directory of Open Access Journals (Sweden)

    G. Lucas

    2015-08-01

    This paper introduces research on the automatic preparation of remediation plans and navigation data for the precise guidance of heavy machinery in clean-up work after an industrial disaster. The input test data consist of a pollution-extent shapefile derived from the processing of hyperspectral aerial survey data from the Kolontár red mud disaster. Three algorithms were developed and the respective scripts were written in Python. The first model aims at drawing a parcel clean-up plan. The model tests four different parcel orientations (0, 45, 90 and 135 degrees) and keeps the plan in which the clean-up parcels are least numerous, considering it the optimal spatial configuration. The second model shifts the clean-up parcels of a work plan both vertically and horizontally following a grid pattern with a sampling distance of one fifth of a parcel width, and keeps the most favourable shifted version, again with the aim of reducing the final number of parcel features. The last model draws a navigation line in the middle of each clean-up parcel. The models work efficiently and achieve automatic, optimized plan generation (parcels and navigation lines). Applying the first model, we demonstrated that, depending on the size and geometry of the features of the contaminated-area layer, the number of clean-up parcels generated by the model varies by 4% to 38% from plan to plan. Such a significant variation in the resulting feature counts shows that identifying the optimal orientation can save work, time and money in remediation. The various tests demonstrated that the model gains efficiency when (1) the individual features of the contaminated area have a pronounced orientation in their geometry (features are elongated), and (2) the size of the pollution-extent features approaches the size of the parcels (scale effect). The second model shows only a 1% difference in feature number, so it is less interesting for
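
    The orientation-testing idea of the first model can be sketched as follows. This is a minimal illustration assuming shapely geometries and an invented parcel size; it is not the authors' published scripts.

```python
# Sketch: choose the parcel-grid orientation (0, 45, 90 or 135 degrees) that
# covers a pollution-extent polygon with the fewest clean-up parcels.
# Parcel dimensions and the example polygon are illustrative only.
import math
from shapely.geometry import Polygon, box
from shapely.affinity import rotate

def parcels_for_orientation(extent, width, length, angle):
    """Rotate the extent by -angle, tile its bounding box with axis-aligned
    parcels, keep the cells intersecting the polluted area, rotate them back."""
    centre = extent.centroid
    rotated = rotate(extent, -angle, origin=centre)
    minx, miny, maxx, maxy = rotated.bounds
    parcels = []
    for i in range(math.ceil((maxx - minx) / width)):
        for j in range(math.ceil((maxy - miny) / length)):
            cell = box(minx + i * width, miny + j * length,
                       minx + (i + 1) * width, miny + (j + 1) * length)
            if cell.intersects(rotated):
                parcels.append(rotate(cell, angle, origin=centre))
    return parcels

def best_plan(extent, width, length):
    """Return (angle, parcels) for the orientation giving the fewest parcels."""
    plans = {a: parcels_for_orientation(extent, width, length, a)
             for a in (0, 45, 90, 135)}
    angle = min(plans, key=lambda a: len(plans[a]))
    return angle, plans[angle]

if __name__ == "__main__":
    # elongated dummy "pollution extent" to show the orientation effect
    spill = Polygon([(0, 0), (100, 40), (110, 55), (10, 15)])
    angle, parcels = best_plan(spill, width=10.0, length=20.0)
    print(f"best orientation: {angle} deg, {len(parcels)} clean-up parcels")
```

    The second model (grid shifting) and the third (navigation lines along parcel centrelines) would follow the same pattern of generating candidate plans and keeping the one with the fewest features.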

  11. Probabilistic evaluation of process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2016-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  12. Professionals' views on the development process of a structural collaboration between child and adolescent psychiatry and child welfare: an exploration through the lens of the life cycle model.

    Science.gov (United States)

    Van den Steene, Helena; van West, Dirk; Peeraer, Griet; Glazemakers, Inge

    2018-03-23

    This study, as a part of a participatory action research project, reports the development process of an innovative collaboration between child and adolescent psychiatry and child welfare, for adolescent girls with multiple and complex needs. The findings emerge from a qualitative descriptive analysis of four focus groups with 30 professionals closely involved in this project, and describe the evolution of the collaborative efforts and outcomes through time. Participants describe large investments and negative consequences of rapid organizational change in the beginning of the collaboration project, while benefits of the intensive collaboration only appeared later. A shared person-centred vision and enhanced professionals' confidence were pointed out as important contributors in the evolution of the collaboration. Findings were compared to the literature and showed significant analogy with the life cycle model for shared service centres that describe the maturation of collaborations from a management perspective. These findings enrich the knowledge about the development process of collaboration in health and social care. In increasingly collaborative services, child and adolescent psychiatrists and policy makers should be aware that gains from a collaboration will possibly only be achieved in the longer term, and benefit from knowing which factors have an influence on the evolution of a collaboration project.

  13. Process modeling study of the CIF incinerator

    International Nuclear Information System (INIS)

    Hang, T.

    1995-01-01

    The Savannah River Site (SRS) plans to begin operating the Consolidated Incineration Facility (CIF) in 1996. The CIF will treat liquid and solid low-level radioactive, mixed and RCRA hazardous wastes generated at SRS. In addition to experimental test programs, process modeling was applied to provide guidance in the areas of safety, environmental regulation compliance, process improvement and optimization. A steady-state flowsheet model was used to calculate material/energy balances and to track key chemical constituents throughout the process units. Dynamic models were developed to predict the CIF transient characteristics in normal and abnormal operation scenarios. Predictions include the rotary kiln heat transfer, dynamic responses of the CIF to fluctuations in the solid waste feed or upsets in the system equipment, performance of the control system, air in-leakage in the kiln, etc. This paper reviews the modeling study performed to assist in the deflagration risk assessment.

  14. Modeling and simulation of heterogeneous catalytic processes

    CERN Document Server

    Dixon, Anthony

    2014-01-01

    Heterogeneous catalysis and mathematical modeling are essential components of the continuing search for better utilization of raw materials and energy, with reduced impact on the environment. Numerical modeling of chemical systems has progressed rapidly due to increases in computer power, and is used extensively for analysis, design and development of catalytic reactors and processes. This book presents reviews of the state of the art in modeling of heterogeneous catalytic reactors and processes: reviews by leading authorities in the respective areas; up-to-date reviews of the latest techniques in modeling of catalytic processes; a mix of US and European authors, as well as academic/industrial/research-institute perspectives; and connections between computational and experimental methods in some of the chapters.

  15. Waste immobilization process development at the Savannah River Plant

    International Nuclear Information System (INIS)

    Charlesworth, D.L.

    1986-01-01

    Processes to immobilize various wasteforms, including waste salt solution, transuranic waste, and low-level incinerator ash, are being developed. Wasteform characteristics, process and equipment details, and results from field/pilot tests and mathematical modeling studies are discussed

  16. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where IC's are assembled in dust free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  17. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2004-01-01

    This paper presents a process-group-contribution method to model, simulate and synthesize a flowsheet. The process-group-based representation of a flowsheet together with a process "property" model are presented. The process-group-based synthesis method is developed on the basis of computer-aided molecular design methods and gives the ability to screen numerous process alternatives without the need to use rigorous process simulation models. The process "property" model calculates the design targets for the generated flowsheet alternatives, while a reverse modelling method (also developed) determines the design variables matching the targets. A simple illustrative example highlighting the main features of the methodology is also presented.

  18. Modeling as a Decision-Making Process

    Science.gov (United States)

    Bleiler-Baxter, Sarah K.; Stephens, D. Christopher; Baxter, Wesley A.; Barlow, Angela T.

    2017-01-01

    The goal in this article is to support teachers in better understanding what it means to model with mathematics by focusing on three key decision-making processes: Simplification, Relationship Mapping, and Situation Analysis. The authors use the Theme Park task to help teachers develop a vision of how students engage in these three decision-making…

  19. Rectenna thermal model development

    Science.gov (United States)

    Kadiramangalam, Murall; Alden, Adrian; Speyer, Daniel

    1992-01-01

    Deploying rectennas in space requires adapting existing designs developed for terrestrial applications to the space environment. One of the major issues in doing so is to understand the thermal performance of existing designs in the space environment. Toward that end, a 3D rectenna thermal model has been developed, which involves analyzing shorted rectenna elements and finite size rectenna element arrays. A shorted rectenna element is a single element whose ends are connected together by a material of negligible thermal resistance. A shorted element is a good approximation to a central element of a large array. This model has been applied to Brown's 2.45 GHz rectenna design. Results indicate that Brown's rectenna requires redesign or some means of enhancing the heat dissipation in order for the diode temperature to be maintained below 200 C above an output power density of 620 W/sq.m. The model developed in this paper is very general and can be used for the analysis and design of any type of rectenna design of any frequency.

  20. The Development of Analogical Reasoning Processes.

    Science.gov (United States)

    Sternberg, Robert J.; Rifkin, Bathsheva

    1979-01-01

    Two experiments were conducted to test the generalizability to children of a theory of analogical reasoning processes, originally proposed for adults, and to examine the development of analogical reasoning processes in terms of five proposed sources of cognitive development. (MP)

  1. Integrated modelling of near field and engineered barrier system processes

    International Nuclear Information System (INIS)

    Lamont, A.; Gansemer, J.

    1994-01-01

    The Yucca Mountain Integrating Model (YMIM), an integrated model of the Engineered Barrier System, has been developed to assist project managers at LLNL in identifying areas where research emphasis should be placed. The model was designed to be highly modular so that a model of an individual process can be easily modified or replaced without interfering with the models of other processes. The modules modelling container failure and the dissolution of nuclides include particularly detailed, temperature-dependent models of their corresponding processes

  2. Instrumental development and data processing

    International Nuclear Information System (INIS)

    Franzen, J.

    1978-01-01

    A review of recent developments in mass spectrometry instrumentation is presented under the following headings: introduction (scope of mass spectrometry compared with neighbouring fields); ion sources and ionization techniques; spectrometers (instrumental developments); measuring procedures; coupling techniques; data systems; conclusions (that mass spectrometry should have a broader basis and that there would be mutual profit from a better penetration of mass spectrometry into fields of routine application). (U.K.)

  3. Neurocognitive and electrophysiological evidence of altered face processing in parents of children with autism: implications for a model of abnormal development of social brain circuitry in autism.

    Science.gov (United States)

    Dawson, Geraldine; Webb, Sara Jane; Wijsman, Ellen; Schellenberg, Gerard; Estes, Annette; Munson, Jeffrey; Faja, Susan

    2005-01-01

    Neuroimaging and behavioral studies have shown that children and adults with autism have impaired face recognition. Individuals with autism also exhibit atypical event-related brain potentials to faces, characterized by a failure to show a negative component (N170) latency advantage to face compared to nonface stimuli and a bilateral, rather than right lateralized, pattern of N170 distribution. In this report, performance by 143 parents of children with autism on standardized verbal, visual-spatial, and face recognition tasks was examined. It was found that parents of children with autism exhibited a significant decrement in face recognition ability relative to their verbal and visual spatial abilities. Event-related brain potentials to face and nonface stimuli were examined in 21 parents of children with autism and 21 control adults. Parents of children with autism showed an atypical event-related potential response to faces, which mirrored the pattern shown by children and adults with autism. These results raise the possibility that face processing might be a functional trait marker of genetic susceptibility to autism. Discussion focuses on hypotheses regarding the neurodevelopmental and genetic basis of altered face processing in autism. A general model of the normal emergence of social brain circuitry in the first year of life is proposed, followed by a discussion of how the trajectory of normal development of social brain circuitry, including cortical specialization for face processing, is altered in individuals with autism. The hypothesis that genetic-mediated dysfunction of the dopamine reward system, especially its functioning in social contexts, might account for altered face processing in individuals with autism and their relatives is discussed.

  4. Modeling pellet impact drilling process

    Science.gov (United States)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high-velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling the process of pellet impact drilling, which creates the scientific and methodological basis for the engineering design of drilling operations under different geotechnical conditions.

  5. Research and development of models and instruments to define, measure, and improve shared information processing with government oversight agencies. An analysis of the literature, August 1990--January 1992

    Energy Technology Data Exchange (ETDEWEB)

    1992-12-31

    This document identifies elements of sharing, plus key variables of each and their interrelationships. The document's model of sharing is intended to help management systems' users understand what sharing is and how to integrate it with information processing.

  6. Process and Context in Choice Models

    DEFF Research Database (Denmark)

    Ben-Akiva, Moshe; Palma, André de; McFadden, Daniel

    2012-01-01

    The extended choice framework includes more behavioral richness through the explicit representation of the planning process preceding an action and its dynamics and the effects of context (family, friends, and market) on the process leading to a choice, as well as the inclusion of new types of subjective data in choice models. We discuss the key issues involved in applying the extended framework, focusing on richer data requirements, theories, and models, and present three partial demonstrations of the proposed framework. Future research challenges include the development of more comprehensive empirical tests...

  7. Modeling biochemical transformation processes and information processing with Narrator

    Directory of Open Access Journals (Sweden)

    Palfreyman Niall M

    2007-03-01

    Background: Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Results: Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion: Narrator is a

  8. Modeling biochemical transformation processes and information processing with Narrator.

    Science.gov (United States)

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is
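
    The "Gillespie's direct method" mentioned above is the standard stochastic simulation algorithm for chemical kinetics; a generic, self-contained sketch is given below. The tiny reaction network and rate constants are invented for illustration and are unrelated to Narrator's internal mapping.

```python
# Gillespie direct-method (SSA) sketch for a toy network:
#   A -> B  (rate k1 * A),   B -> 0  (rate k2 * B)
import random

def gillespie(state, reactions, propensity, t_end):
    """state: dict of species counts; reactions: list of stoichiometry dicts;
    propensity(state, i) -> rate of reaction i; returns a trajectory."""
    t, trajectory = 0.0, [(0.0, dict(state))]
    while t < t_end:
        rates = [propensity(state, i) for i in range(len(reactions))]
        total = sum(rates)
        if total <= 0.0:
            break                        # no reaction can fire any more
        t += random.expovariate(total)   # exponentially distributed waiting time
        r, acc, chosen = random.uniform(0.0, total), 0.0, 0
        for i, rate in enumerate(rates): # pick reaction i with prob. rate/total
            acc += rate
            if r <= acc:
                chosen = i
                break
        for species, change in reactions[chosen].items():
            state[species] += change
        trajectory.append((t, dict(state)))
    return trajectory

if __name__ == "__main__":
    k1, k2 = 0.1, 0.05
    reactions = [{"A": -1, "B": +1}, {"B": -1}]
    propensity = lambda s, i: k1 * s["A"] if i == 0 else k2 * s["B"]
    traj = gillespie({"A": 100, "B": 0}, reactions, propensity, t_end=50.0)
    print(f"{len(traj)} events, final state: {traj[-1][1]}")
```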

  9. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps, are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin-walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  10. Branching process models of cancer

    CERN Document Server

    Durrett, Richard

    2015-01-01

    This volume develops results on continuous time branching processes and applies them to study rate of tumor growth, extending classic work on the Luria-Delbruck distribution. As a consequence, the authors calculate the probability that mutations that confer resistance to treatment are present at detection and quantify the extent of tumor heterogeneity. As applications, the authors evaluate ovarian cancer screening strategies and give rigorous proofs for results of Heano and Michor concerning tumor metastasis. These notes should be accessible to students who are familiar with Poisson processes and continuous time. Richard Durrett is mathematics professor at Duke University, USA. He is the author of 8 books, over 200 journal articles, and has supervised more than 40 Ph.D. students. Most of his current research concerns the applications of probability to biology: ecology, genetics, and most recently cancer.

  11. Determinantal point process models on the sphere

    DEFF Research Database (Denmark)

    Møller, Jesper; Nielsen, Morten; Porcu, Emilio

    We consider determinantal point processes (DPPs) on the d-dimensional unit sphere Sd. These are finite point processes exhibiting repulsiveness and with moment properties determined by a certain determinant whose entries are specified by a so-called kernel, which we assume is a complex covariance function defined on Sd × Sd. We review the appealing properties of such processes, including their specific moment properties, density expressions and simulation procedures. Particularly, we characterize and construct isotropic DPP models on Sd, where it becomes essential to specify the eigenvalues and eigenfunctions in a spectral representation for the kernel, and we figure out how repulsive isotropic DPPs can be. Moreover, we discuss the shortcomings of adapting existing models for isotropic covariance functions and consider strategies for developing new models, including a useful spectral approach.
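
    For orientation, an isotropic kernel on the sphere admits a spectral expansion in spherical harmonics; a sketch of this representation is given below, written for d = 2 and with notation chosen here rather than copied from the paper.

```latex
C(x, y) \;=\; \sum_{\ell = 0}^{\infty} \alpha_\ell
              \sum_{m = -\ell}^{\ell} Y_{\ell m}(x)\,\overline{Y_{\ell m}(y)},
\qquad x, y \in \mathbb{S}^2
```

    Here the Y_{ℓm} are spherical harmonics and the eigenvalues α_ℓ ≥ 0; for the kernel to define a DPP the eigenvalues must not exceed 1, and it is this constraint that bounds how repulsive an isotropic DPP on the sphere can be.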

  12. Extending the agile development process to develop acceptably secure software

    NARCIS (Netherlands)

    Ben Othmane, L.; Angin, P.; Weffers, H.T.G.; Bhargava, B.

    2013-01-01

    The agile software development approach makes developing secure software challenging. Existing approaches for extending the agile development process, which enables incremental and iterative software development, fall short of providing a method for efficiently ensuring the security of the software

  13. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as

  14. Mathematical modeling of biomass fuels formation process

    International Nuclear Information System (INIS)

    Gaska, Krzysztof; Wandrasz, Andrzej J.

    2008-01-01

    The increasing demand for thermal and electric energy in many branches of industry and municipal management accounts for a drastic diminishing of natural resources (fossil fuels). Meanwhile, in numerous technical processes, a huge mass of waste is produced. A segregated and converted combustible fraction of the waste, with relatively high calorific value, may be used as a component of formed fuels. The utilization of formed fuel components from segregated groups of waste in associated processes of co-combustion with conventional fuels brings significant savings resulting from the partial replacement of fossil fuels, and reduces environmental pollution directly by limiting waste migration to the environment (soil, atmospheric air, surface and underground water). The realization of technological processes utilizing formed fuel in associated thermal systems should be qualified by technical criteria, which means that elementary processes as well as factors of sustainable development, from a global viewpoint, must not be disturbed. The utilization of post-process waste should be preceded by detailed technical, ecological and economic analyses. In order to optimize the mixing process of fuel components, a mathematical model of the forming process was created. The model is defined as a group of data structures which uniquely identify a real process, together with the conversion of these data by algorithms based on a linear programming problem. The paper also presents the optimization of parameters in the fuel-forming process using a modified simplex algorithm with polynomial running time. This model is a reference point for the numerical modeling of real processes, allowing a precise determination of the optimal elementary composition of formed fuel components, with assumed constraints and decision variables of the task.
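
    The linear-programming formulation described above is, in essence, a blending problem. A minimal generic sketch follows; the component data, costs and constraints are invented for illustration, and the authors' modified simplex variant is replaced here by SciPy's stock solver.

```python
# Generic fuel-blending LP: choose mass fractions x_i of fuel components to
# minimise cost subject to a minimum calorific value and a chlorine limit.
# All numbers are illustrative placeholders.
import numpy as np
from scipy.optimize import linprog

cost = np.array([20.0, 35.0, 5.0])           # cost per kg of each component
heating_value = np.array([15.0, 25.0, 8.0])  # MJ/kg
chlorine = np.array([0.002, 0.001, 0.015])   # mass fraction of Cl

# minimise cost @ x subject to:
#   heating_value @ x >= 18  ->  -heating_value @ x <= -18
#   chlorine @ x <= 0.005,  sum(x) == 1,  0 <= x_i <= 1
res = linprog(
    c=cost,
    A_ub=np.vstack([-heating_value, chlorine]),
    b_ub=np.array([-18.0, 0.005]),
    A_eq=np.ones((1, 3)),
    b_eq=np.array([1.0]),
    bounds=[(0.0, 1.0)] * 3,
    method="highs",
)
if res.success:
    print("optimal blend (mass fractions):", np.round(res.x, 3))
    print("blend cost per kg:", round(res.fun, 2))
```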

  15. Modeling nonhomogeneous Markov processes via time transformation.

    Science.gov (United States)

    Hubbard, R A; Inoue, L Y T; Fann, J R

    2008-09-01

    Longitudinal studies are a powerful tool for characterizing the course of chronic disease. These studies are usually carried out with subjects observed at periodic visits giving rise to panel data. Under this observation scheme the exact times of disease state transitions and sequence of disease states visited are unknown and Markov process models are often used to describe disease progression. Most applications of Markov process models rely on the assumption of time homogeneity, that is, that the transition rates are constant over time. This assumption is not satisfied when transition rates depend on time from the process origin. However, limited statistical tools are available for dealing with nonhomogeneity. We propose models in which the time scale of a nonhomogeneous Markov process is transformed to an operational time scale on which the process is homogeneous. We develop a method for jointly estimating the time transformation and the transition intensity matrix for the time transformed homogeneous process. We assess maximum likelihood estimation using the Fisher scoring algorithm via simulation studies and compare performance of our method to homogeneous and piecewise homogeneous models. We apply our methodology to a study of delirium progression in a cohort of stem cell transplantation recipients and show that our method identifies temporal trends in delirium incidence and recovery.
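
    The time-transformation idea can be written compactly. If h(·) is a monotone map from calendar time to an operational time scale on which the process is homogeneous with transition intensity matrix Q, then (in notation chosen for this sketch):

```latex
P(t_0, t_1) \;=\; \exp\bigl\{\, Q\,[\,h(t_1) - h(t_0)\,] \,\bigr\},
\qquad
q_{rs}(t) \;=\; q_{rs}\, h'(t)
```

    The transition probability matrix between two observation times thus depends only on the elapsed operational time, while the intensities on the original scale are modulated by h'(t); estimation then amounts to jointly fitting Q and the parameters of h, e.g. by Fisher scoring as described above.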

  16. Theoretical Modelling of Intercultural Communication Process

    Directory of Open Access Journals (Sweden)

    Mariia Soter

    2016-08-01

    Full Text Available The definition of the concepts of “communication”, “intercultural communication”, “model of communication” are analyzed in the article. The basic components of the communication process are singled out. The model of intercultural communication is developed. Communicative, behavioral and complex skills for optimal organization of intercultural communication, establishment of productive contact with a foreign partner to achieve mutual understanding, searching for acceptable ways of organizing interaction and cooperation for both communicants are highlighted in the article. It is noted that intercultural communication through interaction between people affects the development of different cultures’ aspects.

  17. Development of mathematical model and optimal control system of internal temperatures of hot-blast stove process in staggered parallel operation; Netsufuro sushiki model to parallel sofu ni okeru ronai ondo saiteki seigyo system no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Matoba, Y. [Sumitomo Metal Industries, Ltd., Osaka (Japan)]; Otsuka, K.

    1998-07-01

    A mathematical model and an optimal control system for the hot-blast stove process are described. A precise mathematical simulation model of the hot-blast stove was developed and the accuracy of the model has been confirmed. An optimal control system for the thermal conditions of hot-blast stoves in staggered parallel operation was also developed. By using a multivariable optimal regulator together with feedforward compensation for changes in the target blast temperature and blast volume, the system is able to control the hot-blast temperature and the brick temperature efficiently. The system has been applied at the Kashima works. The variations of the blast temperature and the silica brick temperature have been decreased, ultimate low-heat-level operation has been realized, and the thermal efficiency has been further raised by about 1%. 8 refs., 14 figs., 1 tab.
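
    A generic sketch of the "multivariable optimal regulator" ingredient is given below, assuming a linear state-space model of the stove's thermal state; the matrices are placeholders, not the plant model developed in the paper, and the feedforward compensation would be added on top of this feedback law.

```python
# Continuous-time LQR sketch: u = -K x, with K obtained from the algebraic
# Riccati equation. A, B, Q, R are illustrative placeholders.
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """Optimal state-feedback gain K minimising the integral of x'Qx + u'Ru."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)   # K = R^{-1} B' P

# toy 2-state, 1-input "thermal" system: slow brick temperature, fast gas path
A = np.array([[-0.02,  0.01],
              [ 0.05, -0.20]])
B = np.array([[0.0],
              [0.1]])
Q = np.diag([10.0, 1.0])   # penalise brick-temperature deviation most
R = np.array([[1.0]])

K = lqr_gain(A, B, Q, R)
x = np.array([5.0, -2.0])  # deviation from the desired thermal state
u = -K @ x                 # feedback part of the control move
print("LQR gain:", np.round(K, 3), "control:", np.round(u, 3))
```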

  18. Development of functionally-oriented technological processes of electroerosive processing

    Science.gov (United States)

    Syanov, S. Yu

    2018-03-01

    The stages of developing functionally oriented technological processes of electroerosive processing are described, from the separation of the surfaces of parts and their service functions to the determination of the parameters of the electroerosion process that provide not only the required quality parameters of the surface layer but also the required operational properties.

  19. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing a software package, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques, including negative imaging, contrast stretching, compression of dynamic range, neon, diffuse, emboss, etc., have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied, and some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the perceptron model have been applied for face and character recognition. (author)
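
    As an illustration of one of the enhancement techniques listed (contrast stretching), a minimal NumPy sketch is given below; it is a generic textbook formulation, not code from the IMAGE GALLERY program.

```python
# Linear contrast stretching: map the intensity range between the 2nd and
# 98th percentiles of the input image onto the full 0..255 range.
import numpy as np

def contrast_stretch(img, p_low=2.0, p_high=98.0):
    lo, hi = np.percentile(img, [p_low, p_high])
    stretched = (img.astype(np.float64) - lo) / max(hi - lo, 1e-9)
    return np.clip(stretched * 255.0, 0, 255).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dull = rng.integers(100, 150, size=(64, 64), dtype=np.uint8)  # low-contrast input
    out = contrast_stretch(dull)
    print("input range:", dull.min(), dull.max(), "-> output range:", out.min(), out.max())
```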

  20. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem; it is based on state-space representations of the normal-mode vertical velocity and plane-wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, model-based solutions to the signal enhancement problem for internal waves are investigated.

  1. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  2. Precipitation Processes developed during ARM (1997), TOGA COARE(1992), GATE(1 974), SCSMEX(1998) and KWAJEX(1999): Consistent 2D and 3D Cloud Resolving Model Simulations

    Science.gov (United States)

    Tao, W.-K.; Shie, C.-H.; Simpson, J.; Starr, D.; Johnson, D.; Sud, Y.

    2003-01-01

    Real clouds and cloud systems are inherently three-dimensional (3D). Because of the limitations in computer resources, however, most cloud-resolving models (CRMs) today are still two-dimensional (2D). A few 3D CRMs have been used to study the response of clouds to large-scale forcing. In these 3D simulations, the model domain was small, and the integration time was 6 hours. Only recently have 3D experiments been performed for multi-day periods for tropical cloud systems with large horizontal domains at the National Center for Atmospheric Research. The results indicate that surface precipitation and latent heating profiles are very similar between the 2D and 3D simulations of these same cases. The reason for the strong similarity between the 2D and 3D CRM simulations is that the observed large-scale advective tendencies of potential temperature, water vapor mixing ratio, and horizontal momentum were used as the main forcing in both the 2D and 3D models. Interestingly, the 2D and 3D versions of the CRMs used at CSU and the U.K. Met Office showed significant differences in the rainfall and cloud statistics for three ARM cases. The major objectives of this project are to calculate and examine: (1) the surface energy and water budgets, (2) the precipitation processes in the convective and stratiform regions, (3) the cloud upward and downward mass fluxes in the convective and stratiform regions, (4) cloud characteristics such as size, updraft intensity and lifetime, and (5) the entrainment and detrainment rates associated with clouds and cloud systems that developed in TOGA COARE, GATE, SCSMEX, ARM and KWAJEX. Of special note is that the analyzed (model-generated) data sets are all produced by the same current version of the GCE model, i.e. consistent model physics and configurations. Trajectory analyses and inert tracer calculations will be conducted to identify the differences and similarities in the organization of convection between simulated 2D and 3D cloud systems.

  3. Retort process modelling for Indian traditional foods.

    Science.gov (United States)

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack food is typically a heterogeneous recipe that incorporates a variety of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer-pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate the cold point temperature. Initial process conditions, retort temperature and % solids content were the significantly influential independent variables. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot-based sweet product) and Upama (wheat-based snack product). The predicted and experimental temperature profiles matched within ±10% error, which is good agreement considering that the food is a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize the retorting of various Indian traditional vegetarian foods.
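
    A lumped-parameter description of cold-point heating typically reduces to a first-order energy balance of the form sketched below; the symbols and the suggested dependence of the time constant on solids content are assumptions for illustration, not the authors' exact model.

```latex
\frac{dT_{cp}}{dt} \;=\; \frac{T_{retort}(t) - T_{cp}(t)}{\tau},
\qquad
\tau \;=\; \frac{m\,c_p}{U A}
```

    Here T_cp is the cold-point temperature, T_retort the retort temperature, and the effective time constant τ lumps the mass, specific heat and overall heat-transfer conductance of the pouch contents; in a heterogeneous recipe τ would be expected to grow with the % solids content, which is consistent with solids content appearing above as a significant variable.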

  4. High-Throughput Process Development for Biopharmaceuticals.

    Science.gov (United States)

    Shukla, Abhinav A; Rameez, Shahid; Wolfe, Leslie S; Oien, Nathan

    2017-11-14

    The ability to conduct multiple experiments in parallel significantly reduces the time that it takes to develop a manufacturing process for a biopharmaceutical. This is particularly significant before clinical entry, because process development and manufacturing are on the "critical path" for a drug candidate to enter clinical development. High-throughput process development (HTPD) methodologies can be similarly impactful during late-stage development, both for developing the final commercial process as well as for process characterization and scale-down validation activities that form a key component of the licensure filing package. This review examines the current state of the art for HTPD methodologies as they apply to cell culture, downstream purification, and analytical techniques. In addition, we provide a vision of how HTPD activities across all of these spaces can integrate to create a rapid process development engine that can accelerate biopharmaceutical drug development.

  5. Process-Based Quality (PBQ) Tools Development

    International Nuclear Information System (INIS)

    Cummins, J.L.

    2001-01-01

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts

  6. Modelling and control of dynamic systems using gaussian process models

    CERN Document Server

    Kocijan, Juš

    2016-01-01

    This monograph opens up new horizons for engineers and researchers in academia and in industry dealing with or interested in new developments in the field of system identification and control. It emphasizes guidelines for working solutions and practical advice for their implementation rather than the theoretical background of Gaussian process (GP) models. The book demonstrates the potential of this recent development in probabilistic machine-learning methods and gives the reader an intuitive understanding of the topic. The current state of the art is treated along with possible future directions for research. Systems control design relies on mathematical models and these may be developed from measurement data. This process of system identification, when based on GP models, can play an integral part of control design in data-based control and its description as such is an essential aspect of the text. The background of GP regression is introduced first with system identification and incorporation of prior know...
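
    A minimal sketch of GP-based identification of an input-output mapping from measured data is shown below, using scikit-learn rather than the tooling discussed in the monograph; the data-generating system is invented, and for a dynamic system one would regress on lagged inputs and outputs (NARX-style) rather than on a single static input.

```python
# Identify a static nonlinearity y = f(u) from noisy data with a GP model,
# then predict with uncertainty. Kernel choice and data are illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
u_train = rng.uniform(-3.0, 3.0, size=(40, 1))                        # input samples
y_train = np.tanh(u_train).ravel() + 0.05 * rng.standard_normal(40)   # noisy outputs

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(u_train, y_train)

u_test = np.linspace(-3.0, 3.0, 7).reshape(-1, 1)
mean, std = gp.predict(u_test, return_std=True)   # predictive mean and std dev
for u, m, s in zip(u_test.ravel(), mean, std):
    print(f"u={u:+.2f}  y_hat={m:+.3f}  +/- {2 * s:.3f}")
```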

  7. Development of the physical model

    International Nuclear Information System (INIS)

    Liu Zunqi; Morsy, Samir

    2001-01-01

    Full text: The Physical Model was developed during Program 93+2 as a technical tool to aid enhanced information analysis and now is an integrated part of the Department's on-going State evaluation process. This paper will describe the concept of the Physical Model, including its objectives, overall structure and the development of indicators with designated strengths, followed by a brief description of using the Physical Model in implementing the enhanced information analysis. The work plan for expansion and update of the Physical Model is also presented at the end of the paper. The development of the Physical Model is an attempt to identify, describe and characterize every known process for carrying out each step necessary for the acquisition of weapons-usable material, i.e., all plausible acquisition paths for highly enriched uranium (HEU) and separated plutonium (Pu). The overall structure of the Physical Model has a multilevel arrangement. It includes at the top level all the main steps (technologies) that may be involved in the nuclear fuel cycle from the source material production up to the acquisition of weapons-usable material, and then beyond the civilian fuel cycle to the development of nuclear explosive devices (weaponization). Each step is logically interconnected with the preceding and/or succeeding steps by nuclear material flows. It contains at its lower levels every known process that is associated with the fuel cycle activities presented at the top level. For example, uranium enrichment is broken down into three branches at the second level, i.e., enrichment of UF 6 , UCl 4 and U-metal respectively; and then further broken down at the third level into nine processes: gaseous diffusion, gas centrifuge, aerodynamic, electromagnetic, molecular laser (MLIS), atomic vapor laser (AVLIS), chemical exchange, ion exchange and plasma. Narratives are presented at each level, beginning with a general process description then proceeding with detailed

  8. Discovery of a Novel Immune Gene Signature with Profound Prognostic Value in Colorectal Cancer: A Model of Cooperativity Disorientation Created in the Process from Development to Cancer.

    Directory of Open Access Journals (Sweden)

    Ning An

    Immune response-related genes play a major role in colorectal carcinogenesis by mediating inflammation or immune-surveillance evasion. Although remarkable progress has been made in investigating the underlying mechanism, understanding of the complicated carcinogenesis process has been enormously hindered by large-scale tumor heterogeneity. Development and carcinogenesis share striking similarities in their cellular behavior and underlying molecular mechanisms. The association between embryonic development and carcinogenesis makes embryonic development a viable reference model for studying cancer, thereby circumventing the potentially misleading complexity of tumor heterogeneity. Here we proposed that the immune genes responsible for intra-immune cooperativity disorientation (defined in this study as disruption of developmental expression correlation patterns during carcinogenesis) probably contain an untapped prognostic resource for colorectal cancer. In this study, we determined the mRNA expression profile of 137 human biopsy samples, including samples from different stages of human colonic development, colorectal precancerous progression and colorectal cancer samples, among which 60 were also used to generate miRNA expression profiles. We originally established a Spearman correlation transition model to quantify the cooperativity disorientation associated with the transition from normal to precancerous to cancer tissue, in conjunction with a miRNA-mRNA regulatory network and a machine learning algorithm, to identify genes with prognostic value. Finally, a 12-gene signature was extracted, whose prognostic value was evaluated using Kaplan-Meier survival analysis in five independent datasets. Using the log-rank test, the 12-gene signature was closely related to overall survival in four datasets (GSE17536, n = 177, p = 0.0054; GSE17537, n = 55, p = 0.0039; GSE39582, n = 562, p = 0.13; GSE39084, n = 70, p = 0.11), and significantly associated with disease
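
    A schematic reading of "cooperativity disorientation" as defined above (the change in pairwise expression-correlation patterns between stages) can be sketched as follows; the gene names, data and the particular distance used are placeholders, and this is not the authors' actual pipeline.

```python
# Sketch: quantify how much the pairwise Spearman correlation structure of a
# gene set changes between a "development" stage and a "cancer" stage.
# Expression matrices (samples x genes) are simulated placeholders.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
genes = ["GENE_A", "GENE_B", "GENE_C", "GENE_D"]   # hypothetical gene symbols
dev_expr = rng.normal(size=(30, len(genes)))        # developmental samples
cancer_expr = rng.normal(size=(25, len(genes)))     # tumour samples

rho_dev, _ = spearmanr(dev_expr)        # gene-by-gene correlation matrices
rho_cancer, _ = spearmanr(cancer_expr)

# one simple "disorientation" score per gene pair: absolute correlation shift
shift = np.abs(rho_dev - rho_cancer)
i, j = np.triu_indices(len(genes), k=1)
for a, b, s in zip(i, j, shift[i, j]):
    print(f"{genes[a]} - {genes[b]}: correlation shift = {s:.2f}")
```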

  9. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always tied to an experimental case, and it has a residual character in the sense that the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures of the simulation model, and it can also be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated. (1) Sensitivity analysis: this can be carried out with a DSA, differential sensitivity analysis, and with an MCSA, Monte-Carlo sensitivity analysis. (2) Searching for the optimal domains of the input parameters: a procedure based on Monte-Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. (3) Residual analysis: this analysis has been made in the time domain and in the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings in Spain is presented, studying the behavior of building components in a Test Cell of LECE at CIEMAT. (Author) 17 refs.
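
    A minimal sketch of the Monte-Carlo sensitivity analysis (MCSA) step is given below, assuming the simulation model is available as a callable; the stand-in model and parameter ranges are invented placeholders.

```python
# Monte-Carlo sensitivity analysis sketch: sample the input parameters,
# run the model, and rank inputs by their correlation with the output.
import numpy as np

def model(u_wall, solar_gain, infiltration):
    """Stand-in for a detailed thermal simulation: returns an energy demand."""
    return 120.0 * u_wall + 0.8 * solar_gain ** 0.5 + 35.0 * infiltration

rng = np.random.default_rng(42)
n = 2000
samples = {
    "u_wall":       rng.uniform(0.2, 1.5, n),    # W/m2K
    "solar_gain":   rng.uniform(50.0, 400.0, n), # W/m2
    "infiltration": rng.uniform(0.1, 1.0, n),    # air changes per hour
}
outputs = model(samples["u_wall"], samples["solar_gain"], samples["infiltration"])

for name, values in samples.items():
    r = np.corrcoef(values, outputs)[0, 1]       # crude sensitivity index
    print(f"{name:>12}: correlation with output = {r:+.2f}")
```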

  10. Development of estimation method for crop yield using MODIS satellite imagery data and process-based model for corn and soybean in US Corn-Belt region

    Science.gov (United States)

    Lee, J.; Kang, S.; Jang, K.; Ko, J.; Hong, S.

    2012-12-01

    Crop productivity is associated with food security, and hence several models have been developed to estimate crop yield by combining remote sensing data with carbon cycle processes. In the present study, we attempted to estimate crop GPP and NPP using an algorithm based on the LUE model and a simplified respiration model. The states of Iowa and Illinois were chosen as the study site for estimating crop yield over a 5-year period (2006-2010), as they form the main Corn-Belt area in the US. The present study focuses on developing crop-specific parameters for corn and soybean to estimate crop productivity and on yield mapping using satellite remote sensing data. We utilized 10 km spatial resolution daily meteorological data from WRF to provide cloudy-day meteorological variables, while on clear-sky days MODIS-based meteorological data were utilized to estimate daily GPP, NPP, and biomass. County-level statistics on yield, area harvested, and production were used to test the model-predicted crop yield. The estimated input meteorological variables from MODIS and WRF showed good agreement with ground observations from 6 Ameriflux tower sites in 2006. For example, correlation coefficients ranged from 0.93 to 0.98 for Tmin and Tavg, from 0.68 to 0.85 for daytime mean VPD, and from 0.85 to 0.96 for daily shortwave radiation, respectively. We developed county-specific crop conversion coefficients, i.e., the ratio of yield to biomass on DOY 260, and then validated the estimated county-level crop yield with the statistical yield data. The estimated corn and soybean yields at the county level ranged from 671 g m-2 y-1 to 1393 g m-2 y-1 and from 213 g m-2 y-1 to 421 g m-2 y-1, respectively. The county-specific yield estimates mostly showed errors of less than 10%. Furthermore, we estimated crop yields at the state level, which were validated against the statistics data and showed errors of less than 1%. Further analysis of the crop conversion coefficient was conducted for DOY 200 and DOY 280
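
    The LUE-based GPP calculation referred to above generally follows the MOD17-style form GPP = ε_max · f(Tmin) · f(VPD) · fAPAR · PAR; a bare-bones sketch is shown below, with the ramp limits and ε_max purely illustrative rather than the crop-specific parameters developed in the study.

```python
# Light-use-efficiency GPP sketch (MOD17-style form); all parameter values
# are illustrative, not the crop-specific coefficients of the study.
def ramp(x, x_min, x_max):
    """Linear scalar between 0 and 1 over [x_min, x_max]."""
    if x <= x_min:
        return 0.0
    if x >= x_max:
        return 1.0
    return (x - x_min) / (x_max - x_min)

def gpp(par, fapar, tmin_c, vpd_pa, eps_max=2.5):
    """GPP in g C m-2 d-1: eps_max (g C per MJ of APAR) down-regulated by
    low minimum temperature and by high vapour pressure deficit."""
    f_t = ramp(tmin_c, -8.0, 10.0)              # Tmin scalar
    f_vpd = 1.0 - ramp(vpd_pa, 650.0, 2500.0)   # high VPD limits GPP
    return eps_max * f_t * f_vpd * fapar * par  # par in MJ m-2 d-1

if __name__ == "__main__":
    print(f"GPP ~ {gpp(par=10.0, fapar=0.7, tmin_c=12.0, vpd_pa=900.0):.1f} g C m-2 d-1")
```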

  11. Development of novel microencapsulation processes

    Science.gov (United States)

    Yin, Weisi

    of polymer solution suspended in water or from a spray. Hollow PS particles were obtained by swelling PS latex with solvent, freezing in liquid nitrogen, and drying in vacuum. It is shown that the particle morphology is due to phase separation in the polymer emulsion droplets upon freezing in liquid nitrogen, and that morphological changes are driven largely by lowering interfacial free energy. The dried hollow particles were resuspended in a dispersing media and exposed to a plasticizer, which imparts mobility to polymer chains, to close the surface opening and form microcapsules surrounding an aqueous core. The interfacial free energy difference between the hydrophobic inside and hydrophilic outside surfaces is the major driving force for closing the hole on the surface. A controlled release biodegradable vehicle for drug was made by encapsulating procaine hydrochloride, a water-soluble drug, into the core of poly(DL-lactide) (PLA) microcapsules, which were made by the freeze-drying and subsequent closing process. The encapsulation efficiency is affected by the hollow particle morphology, amount of closing agent, exposure time, surfactant, and method of dispersing the hollow particles in water. Controlled release of procaine hydrochloride from the microcapsules into phosphate buffer was observed. The use of benign solvents dimethyl carbonate in spray/freeze-drying and CO2 for closing would eliminate concerns of residual harmful solvent in the product. The ease of separation of CO2 from the drug solution may also enable recycling of the drug solution to increase the overall encapsulation efficiency using these novel hollow particles.

  12. Hydrothermal processing of Hanford tank wastes: Process modeling and control

    International Nuclear Information System (INIS)

    Currier, R.P.

    1994-01-01

    In the Los Alamos National Laboratory (LANL) hydrothermal process, waste streams are first pressurized and heated as they pass through a continuous flow tubular reactor vessel. The waste is maintained at reaction temperature of 300--550 C where organic destruction and sludge reformation occur. This report documents LANL activities in process modeling and control undertaken in FY94 to support hydrothermal process development. Key issues discussed include non-ideal flow patterns (e.g. axial dispersion) and their effect on reactor performance, the use and interpretation of inert tracer experiments, and the use of computational fluid mechanics to evaluate novel hydrothermal reactor designs. In addition, the effects of axial dispersion (and simplifications to rate expressions) on the estimated kinetic parameters are explored by non-linear regression to experimental data. Safety-related calculations are reported which estimate the explosion limits of effluent gases and the fate of hydrogen as it passes through the reactor. Development and numerical solution of a generalized one-dimensional mathematical model is also summarized. The difficulties encountered in using commercially available software to correlate the behavior of high temperature, high pressure aqueous electrolyte mixtures are summarized. Finally, details of the control system and experiments conducted to empirically determine the system response are reported
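
    The one-dimensional axial-dispersion picture discussed above is commonly written, for a reacting species of concentration C carried at superficial velocity u with dispersion coefficient D_ax, in the generic form below (not necessarily the exact generalized model of the report):

```latex
\frac{\partial C}{\partial t}
\;=\;
D_{ax}\,\frac{\partial^{2} C}{\partial z^{2}}
\;-\;
u\,\frac{\partial C}{\partial z}
\;-\;
k\,C
```

    Here k is a first-order destruction rate constant, and the Peclet number Pe = uL/D_ax measures how far the tubular reactor departs from ideal plug flow; neglecting dispersion (or oversimplifying the rate expression) therefore biases kinetic parameters fitted to effluent data, which is the effect explored by the regression study mentioned above.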

  13. Development Model for Research Infrastructures

    Science.gov (United States)

    Wächter, Joachim; Hammitzsch, Martin; Kerschke, Dorit; Lauterjung, Jörn

    2015-04-01

    The maturity of individual scientific domains differs considerably. • Technologically and organisationally, many different RI components have to be integrated. Individual systems are often complex and have a long-term history. Existing approaches are at different maturity levels, e.g. in relation to the standardisation of interfaces. • The concrete implementation process consists of independent and often parallel development activities. In many cases no detailed architectural blueprint for the envisioned system exists. • Most of the funding currently available for RI implementation is provided on a project basis. To increase the synergies in infrastructure development the authors propose a specific RI Maturity Model (RIMM) that is specifically qualified for open system-of-systems environments. RIMM is based on the concepts of Capability Maturity Models for organisational development, concretely the Levels of Conceptual Interoperability Model (LCIM), specifying the technical, syntactical, semantic, pragmatic, dynamic, and conceptual layers of interoperation [1]. The model is complemented by the identification and integration of growth factors (according to the Nolan Stages Theory [2]). These factors include supply and demand factors. Supply factors comprise available resources, e.g. data, services and IT-management capabilities, including organisations and IT personnel. Demand factors are the overall application portfolio for RIs but also the skills and requirements of scientists and communities using the infrastructure. RIMM thus enables a balanced development process of RIs and RI components by evaluating the status of the supply and demand factors in relation to specific levels of interoperability. [1] Tolk, A., Diallo, A., Turnitsa, C. (2007): Applying the Levels of Conceptual Interoperability Model in Support of Integratability, Interoperability, and Composability for System-of-Systems Engineering. Systemics, Cybernetics and Informatics, Volume 5 - Number 5. [2

  14. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    Probabilistic properties of Cox processes of relevance for statistical modelling and inference are studied. Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties and point process operations such as thinning, displacements, and superpositioning. We also discuss how to simulate specific Cox processes.
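
    For reference, the two most common classes mentioned can be written through their random intensity Λ (generic textbook forms, with notation chosen for this sketch):

```latex
\text{log Gaussian:}\quad \Lambda(s) \;=\; \exp\{Z(s)\}, \quad Z \text{ a Gaussian random field};
\qquad
\text{shot noise:}\quad \Lambda(s) \;=\; \sum_{c \in \Phi} \gamma\, k(c, s)
```

    Here Φ is a Poisson process of cluster centres, γ > 0 and k(·,·) is a kernel density. Conditional on Λ the point pattern is an inhomogeneous Poisson process, which is what keeps moment calculations and operations such as thinning, displacement and superposition tractable.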

  15. Automatic extraction of process categories from process model collections

    NARCIS (Netherlands)

    Malinova, M.; Dijkman, R.M.; Mendling, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Many organizations build up their business process management activities in an incremental way. As a result, there is no overarching structure defined at the beginning. However, as business process modeling initiatives often yield hundreds to thousands of process models, there is a growing need for

  16. Development and Application of a Category System to Describe Pre-Service Science Teachers' Activities in the Process of Scientific Modelling

    Science.gov (United States)

    Krell, Moritz; Walzer, Christine; Hergert, Susann; Krüger, Dirk

    2017-09-01

    As part of their professional competencies, science teachers need elaborate meta-modelling knowledge as well as modelling skills in order to guide and monitor the modelling practices of their students. However, qualitative studies about (pre-service) science teachers' modelling practices are rare. This study provides a category system suitable for analysing and describing pre-service science teachers' modelling activities and for inferring modelling strategies. The category system was developed based on theoretical considerations and was inductively refined within the methodological frame of qualitative content analysis. For the inductive refinement, the modelling practices of pre-service teachers (n = 4) were video-taped and analysed. In this study, one case was selected to demonstrate the application of the category system to infer modelling strategies. The contribution of this study to science education research and science teacher education is discussed.

  17. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  18. A methodology for development of biocatalytic processes

    DEFF Research Database (Denmark)

    Lima Ramos, Joana

    are available. The first case study presents a rational approach for defining a development strategy for multi-enzymatic processes. The proposed methodology requires a profound and structured knowledge of the multi-enzyme systems, integrating chemistry, biological and process engineering. In order to suggest......). These process metrics can often be attained by improvements in the reaction chemistry, the biocatalyst, and/or by process engineering, which often requires a complex process development strategy. Interestingly this complexity, which arises from the need for integration of biological and process technologies...... and their relationship with the overall process is not clear.The work described in this thesis presents a methodological approach for early stage development of biocatalytic processes, understanding and dealing with the reaction, biocatalyst and process constraints. When applied, this methodology has a decisive role...

  19. Modelling and control of a flotation process

    International Nuclear Information System (INIS)

    Ding, L.; Gustafsson, T.

    1999-01-01

    A general description of a flotation process is given. The dynamic model of a MIMO nonlinear subprocess in flotation, i.e. the pulp levels in five compartments in series, is developed and the model is verified with real data from a production plant. In order to reject constant disturbances five extra states are introduced and the model is modified. An exact linearization has been made for the nonlinear model and a linear quadratic Gaussian (LQG) controller is proposed based on the linearized model. The simulation results show improved performance of the pulp level control when the set points are changed or a disturbance occurs. In the future the controller will be tested in production. (author)
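
    The record mentions exact linearization followed by a linear quadratic Gaussian design. As a hedged sketch of the deterministic half of such a design, the snippet below computes an LQR state-feedback gain for a hypothetical linearized five-compartment level model; the (A, B) matrices and weights are invented placeholders, not the plant model from the paper.

```python
# Sketch: LQR state-feedback gain for a linearized chain of five pulp levels.
# Model matrices and weights are illustrative only.
import numpy as np
from scipy.linalg import solve_continuous_are

n = 5                                    # five compartments in series
a = 0.2                                  # illustrative outflow coefficient, 1/s
A = -a * np.eye(n) + a * np.eye(n, k=-1) # level i is fed by the outflow of level i-1
B = np.eye(n)                            # one control valve per compartment
Q = np.eye(n)                            # penalise level deviations
R = 0.1 * np.eye(n)                      # penalise valve moves

P = solve_continuous_are(A, B, Q, R)     # continuous algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)          # optimal feedback u = -K x
print("closed-loop poles:", np.linalg.eigvals(A - B @ K))
```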

  20. Theory Creation, Modification, and Testing: An Information-Processing Model and Theory of the Anticipated and Unanticipated Consequences of Research and Development

    Science.gov (United States)

    Perla, Rocco J.; Carifio, James

    2011-01-01

    Background: Extending Merton's (1936) work on the consequences of purposive social action, the model, theory and taxonomy outlined here incorporates and formalizes both anticipated and unanticipated research findings in a unified theoretical framework. The model of anticipated research findings was developed initially by Carifio (1975, 1977) and…

  1. Integrated Intelligent Modeling, Design and Control of Crystal Growth Processes

    National Research Council Canada - National Science Library

    Prasad, V

    2000-01-01

    .... This MURI program took an integrated approach towards modeling, design and control of crystal growth processes and in conjunction with growth and characterization experiments developed much better...

  2. Radioactive Dry Process Material Treatment Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. J.; Hung, I. H.; Kim, K. K. (and others)

    2007-06-15

    The project 'Radioactive Dry Process Material Treatment Technology Development' aims at normal operation of the experiments at the DUPIC fuel development facility (DFDF) and safe operation of the facility through technology developments such as remote operation, maintenance and repair of the facility, treatment of various high-level process wastes and trapping of volatile process gases. The DUPIC Fuel Development Facility (DFDF) can accommodate highly active nuclear materials and is now used for fabrication of oxide fuel by a dry process characterized by proliferation resistance. During the second stage, from March 2005 to February 2007, we carried out technology development for remote maintenance and safe operation of the DFDF, development of treatment technology for process off-gas, and development of treatment technology for PWR cladding hull, and the results are described in this report.

  3. Dynamic modeling of ultrafiltration membranes for whey separation processes

    NARCIS (Netherlands)

    Saltik, M.B.; Ozkan, L.; Jacobs, M.; van der Padt, A.

    2017-01-01

    In this paper, we present a control-relevant rigorous dynamic model for an ultrafiltration membrane unit in a whey separation process. The model consists of a set of differential algebraic equations and is developed for online model-based applications such as model-based control and process

  4. Recent developments in multiperipheral models

    International Nuclear Information System (INIS)

    De Tar, C.

    1977-01-01

    Experiments do not provide all the detailed information required to select the correct formulation among the possible formulations of the multiperipheral model (the 'uniqueness problem'). There are at least three directions which lead away from the uniqueness problem. The first is simplified models with only enough complexity to satisfy the data approximately. The second involves invoking theoretical constraints which limit the theoretical flexibility of the model. The third and ultimate solution may be provided by the quark-gluon models or string models. The recent interest in the role of clusters in multiple production is a good illustration of the phenomenological problems facing multiperipheral models. The existence of clusters is certainly agreed upon, but when their size is determined directly from rapidity distributions the result so far depends on what one assumes about how they are produced. Theoretical work toward a unified picture of strong interactions has also led to some novel developments in multiperipheral models and Regge pole theory. The problem now is to choose between the more traditional picture of two vacuum singularities and the more novel approach, which makes an effort to deal not merely with four-body amplitudes but, in a more profound way, with the multiple production processes which are related to them through unitarity.

  5. Dual elaboration models in attitude change processes

    Directory of Open Access Journals (Sweden)

    Žeželj Iris

    2005-01-01

    This article examines empirical and theoretical developments in research on attitude change over the past 50 years. It focuses on the period from 1980 to the present and on cognitive response theories as the dominant theoretical approach in the field. The postulates of the Elaboration Likelihood Model, the most-researched representative of dual-process theories, are studied based on a review of the accumulated research evidence. The main research findings are grouped into four basic factors: message source, message content, message recipient and message context. The most influential criticisms of the theory are then presented, regarding its empirical base and its dual-process assumption. Some possible applications and further research perspectives are discussed at the end.

  6. Development of advanced spent fuel management process. System analysis of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Ro, S.G.; Kang, D.S.; Seo, C.S.; Lee, H.H.; Shin, Y.J.; Park, S.W.

    1999-03-01

    A system analysis of an advanced spent fuel management process, intended to establish a non-proliferation model for long-term spent fuel management, is performed by comparing several dry processes, such as a salt transport process, a lithium process, the IFR process developed in America, and the DDP developed in Russia. In our system analysis, the non-proliferation concept is focused on the separation factor between uranium and plutonium and the decontamination factors of products in each process, and a non-proliferation model for long-term spent fuel management has finally been introduced. (Author). 29 refs., 17 tabs., 12 figs

  7. Multiphysics modelling of manufacturing processes: A review

    DEFF Research Database (Denmark)

    Jabbari, Masoud; Baran, Ismet; Mohanty, Sankhya

    2018-01-01

    Numerical modelling is increasingly supporting the analysis and optimization of manufacturing processes in the production industry. Even though it is mostly applied to multistep processes, individual process steps may be so complex by nature that the models needed to describe them must include multiphysics...... the diversity in the field of modelling of manufacturing processes as regards process, materials, generic disciplines as well as length scales: (1) modelling of tape casting for thin ceramic layers, (2) modelling the flow of polymers in extrusion, (3) modelling the deformation process of flexible stamps...... for nanoimprint lithography, (4) modelling manufacturing of composite parts and (5) modelling the selective laser melting process. For all five examples, the emphasis is on modelling results as well as on describing the models in brief mathematical detail. Alongside relevant references to the original work...

  8. Modeling Suspension and Continuation of a Process

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2012-04-01

    This work focuses on the difficulties an analyst encounters when modeling suspension and continuation of a process in contemporary process modeling languages. As a basis, a general lifecycle of an activity is introduced and then compared to the activity lifecycles supported by individual process modeling languages. The comparison shows that contemporary process modeling languages cover the defined general lifecycle of an activity only partially. Two popular process modeling languages are then selected and a real example is modeled, reviewing how these languages cope with their lack of native support for suspension and continuation of an activity. Given the unsatisfactory results of the contemporary process modeling languages in the modeled example, a new process modeling language is presented which, as demonstrated, is capable of capturing suspension and continuation of an activity in a much simpler and more precise way.

  9. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...
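
    As a toy illustration of the grey-box idea (a deterministic drift plus a stochastic diffusion term), the sketch below simulates the stochastic differential equation dX = -kX dt + σ dW with an Euler-Maruyama scheme. The model and parameter values are invented and are not the tool or models referred to in the record.

```python
# Sketch: Euler-Maruyama simulation of a grey-box model, i.e. an ODE drift plus a
# stochastic diffusion term, dX = -k*X dt + sigma dW. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
k, sigma = 0.5, 0.05          # illustrative rate constant and noise intensity
dt, n_steps = 0.01, 1000
x = np.empty(n_steps + 1)
x[0] = 1.0                    # initial state (arbitrary units)

for i in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt))            # Brownian increment
    x[i + 1] = x[i] - k * x[i] * dt + sigma * dw # drift + diffusion update

print(f"mean of last 100 samples: {x[-100:].mean():.3f}")
```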

  10. A model for ageing-home-care service process improvement

    OpenAIRE

    Yu, Shu-Yan; Shie, An-Jin

    2017-01-01

    The purpose of this study was to develop an integrated model to improve service processes in ageing-home-care. According to the literature, existing service processes have potential service failures that affect service quality and efficacy. However, most previous studies have only focused on conceptual model development using New Service Development (NSD) and fail to provide a systematic model to analyse potential service failures and help managers develop solutions to improve the se...

  11. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  12. Fuzzy model for Laser Assisted Bending Process

    Directory of Open Access Journals (Sweden)

    Giannini Oliviero

    2016-01-01

    In the present study, a fuzzy model was developed to predict the residual bending in a conventional metal bending process assisted by a high-power diode laser. The study was focused on AA6082T6 aluminium thin sheets. In most dynamic sheet metal forming operations, the highly nonlinear deformation processes cause large amounts of elastic strain energy to be stored in the formed material. The novel hybrid forming process was thus aimed at inducing local heating of the mechanically bent workpiece in order to decrease or eliminate the related springback phenomena. In particular, the influence of laser process parameters such as source power, scan speed and starting elastic deformation of the mechanically bent sheets on the extent of springback was experimentally assessed. Consistent trends in the experimental response with respect to the operational parameters were found. Accordingly, 3D process maps of the extent of the springback phenomena as a function of the operational parameters were constructed. The effect of the inherent uncertainties in the model parameters on the predicted residual bending was evaluated. In particular, a fuzzy-logic based approach was used to describe the model uncertainties and the transformation method was applied to propagate their effect on the residual bending.
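
    As a hedged sketch of how a fuzzy model can map process parameters to a predicted residual bend, the snippet below implements a tiny zero-order Sugeno-style rule base over laser power and scan speed. The membership functions, rules and consequents are invented for illustration and are not the model identified in the study.

```python
# Sketch: a tiny zero-order Sugeno-style fuzzy model mapping laser power and scan
# speed to residual bending. All membership functions, rules and outputs are invented.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def residual_bend(power_W, speed_mm_s):
    low_p, high_p = tri(power_W, 100, 200, 400), tri(power_W, 200, 400, 500)
    slow, fast = tri(speed_mm_s, 1, 5, 15), tri(speed_mm_s, 5, 15, 25)
    # rule weights (AND = product) and crisp consequents in degrees
    rules = [
        (low_p * slow, 8.0),    # little heat input -> most springback remains
        (low_p * fast, 9.0),
        (high_p * slow, 2.0),   # strong local heating -> springback mostly removed
        (high_p * fast, 5.0),
    ]
    w = np.array([r[0] for r in rules])
    z = np.array([r[1] for r in rules])
    return float(np.dot(w, z) / w.sum()) if w.sum() > 0 else float("nan")

print(residual_bend(300.0, 10.0))   # weighted-average defuzzification
```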

  13. Process modeling for Humanities: tracing and analyzing scientific processes

    OpenAIRE

    Hug , Charlotte; Salinesi , Camille; Deneckere , Rebecca; Lamasse , Stéphane

    2011-01-01

    This paper concerns epistemology and the understanding of research processes in Humanities, such as Archaeology. We believe that to properly understand research processes, it is essential to trace them. The collected traces depend on the process model established, which has to be as accurate as possible to exhaustively record the traces. In this paper, we briefly explain why the existing process models for Humanities are not sufficient to represent traces. We then pres...

  14. Theoretical modelling of carbon deposition processes

    International Nuclear Information System (INIS)

    Marsh, G.R.; Norfolk, D.J.; Skinner, R.F.

    1985-01-01

    Work based on capsule experiments in the BNL Gamma Facility, aimed at elucidating the chemistry involved in the formation of carbonaceous deposits on CAGR fuel pin surfaces, is described. Using a database derived from the capsule experiments together with literature values for the kinetics of the fundamental reactions, a chemical model of the gas-phase processes has been developed. This model successfully reproduces the capsule results, while preliminary application to the WAGR coolant circuit indicates the likely concentration profiles of various radical species within the fuel channels. (author)

  15. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it, need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments......-based framework is that in the design, development and/or manufacturing of a chemical product-process, the knowledge of the applied phenomena together with the product-process design details can be provided with diverse degrees of abstractions and details. This would allow the experimental resources...... to be employed for validation and fine-tuning of the solutions from the model-based framework, thereby, removing the need for trial and error experimental steps. Also, questions related to economic feasibility, operability and sustainability, among others, can be considered in the early stages of design. However...

  16. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    the conceptual model on which it is based. In this study, a number of model structural shortcomings were identified, such as a lack of dissolved phosphorus transport via infiltration excess overland flow, potential discrepancies in the particulate phosphorus simulation and a lack of spatial granularity. (4) Conceptual challenges, as conceptual models on which predictive models are built are often outdated, having not kept up with new insights from monitoring and experiments. For example, soil solution dissolved phosphorus concentration in INCA-P is determined by the Freundlich adsorption isotherm, which could potentially be replaced using more recently-developed adsorption models that take additional soil properties into account. This checklist could be used to assist in identifying why model performance may be poor or unreliable. By providing a model evaluation framework, it could help prioritise which areas should be targeted to improve model performance or model credibility, whether that be through using alternative calibration techniques and statistics, improved data collection, improving or simplifying the model structure or updating the model to better represent current understanding of catchment processes.

  17. Features of the Manufacturing Vision Development Process

    DEFF Research Database (Denmark)

    Dukovska-Popovska, Iskra; Riis, Jens Ove; Boer, Harry

    2005-01-01

    of action research. The methodology recommends wide participation of people from different hierarchical and functional positions, who engage in a relatively short, playful and creative process and come up with a vision (concept) for the future manufacturing system in the company. Based on three case studies......This paper discusses the key features of the process of Manufacturing Vision Development, a process that enables companies to develop their future manufacturing concept. The basis for the process is a generic five-phase methodology (Riis and Johansen, 2003) developed as a result of ten years...... of companies going through the initial phases of the methodology, this research identified the key features of the Manufacturing Vision Development process. The paper elaborates the key features by defining them, discussing how and when they can appear, and how they influence the process....

  18. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    The article outlines the application of a process-oriented approach to the functional description of a designed IT system supporting the operations of a secret office which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to describe the processes currently implemented manually ("as is") and the target processes ("to be") that use RFID technology for the purpose of their automation. Additionally, examples of applying structural and dynamic process analysis (process simulation) to verify the correctness and efficiency of the processes are presented. The process analysis method is extended by the possibility of applying a process warehouse and process mining methods.

  19. Modeling of processes in the tourism sector

    Directory of Open Access Journals (Sweden)

    Salamatina Victoriya, S.

    2015-06-01

    In modern conditions, tourism is becoming a budget-forming sector for a number of Russian regions. In this regard, it is of interest to simulate the processes occurring in the tourism business, because they are affected by many random parameters arising from various economic, political, geographic, and other factors. To improve and develop systems for managing the tourism business, economic-mathematical methods are being systematically introduced in this area, because increased competitiveness requires continuous and constructive change. The results of applying this economic-mathematical apparatus allow a more systematic and internally consistent analysis and evaluation of the applicability of further processes in tourism. A feature of some economic processes typical of tourist activities is that the effect of a factor on the indicators of the process appears not immediately but gradually, after a certain time, i.e. with a certain lag. This delay has to be accounted for when developing mathematical models of tourism business processes. In such cases it is advisable to apply the economic-mathematical formalism of optimal control known as game theory to the simulation of these processes.

  20. Sustainable Chemical Process Development through an Integrated Framework

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil; Kumar Tula, Anjan; Anantpinijwatna, Amata

    2016-01-01

    This paper describes the development and the application of a general integrated framework based on systematic model-based methods and computer-aided tools with the objective to achieve more sustainable process designs and to improve the process understanding. The developed framework can be appli...... studies involve multiphase reaction systems for the synthesis of active pharmaceutical ingredients....

  1. Syntax highlighting in business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Freytag, T.; Mendling, J.; Eckleder, A.

    2011-01-01

    Sense-making of process models is an important task in various phases of business process management initiatives. Despite this, there is currently hardly any support in business process modeling tools to adequately support model comprehension. In this paper we adapt the concept of syntax

  2. Configurable multi-perspective business process models

    NARCIS (Netherlands)

    La Rosa, M.; Dumas, M.; Hofstede, ter A.H.M.; Mendling, J.

    2011-01-01

    A configurable process model provides a consolidated view of a family of business processes. It promotes the reuse of proven practices by providing analysts with a generic modeling artifact from which to derive individual process models. Unfortunately, the scope of existing notations for

  3. On Assumptions in Development of a Mathematical Model of Thermo-gravitational Convection in the Large Volume Process Tanks Taking into Account Fermentation

    Directory of Open Access Journals (Sweden)

    P. M. Shkapov

    2015-01-01

    The paper provides a mathematical model of thermo-gravitational convection in a large-volume vertical cylinder. Heat is removed from the product via the cooling jacket at the top of the cylinder. We suppose that laminar fluid motion takes place. The model is based on the Navier-Stokes equations, the equation of heat transfer through the wall, and the heat transfer equation. A peculiarity of the process in large-volume tanks is the spatial distribution of the physical parameters, which was taken into account when constructing the model. The model corresponds to the process of wort beer fermentation in cylindrical-conical tanks (CCT). The CCT volume is divided into three zones and model equations were obtained for each zone. The first zone has an annular cross-section and is limited in height by the cooling jacket. In this zone the heat flow from the cooling jacket to the product is uppermost. The model equation of the first zone describes the process of heat transfer through the wall and is presented by a linear inhomogeneous partial differential equation that is solved analytically. For the description of the second and third zones a number of engineering assumptions were made. The fluid is considered Newtonian, viscous and incompressible. Convective motion is considered in the Boussinesq approximation. The effect of viscous dissipation is not considered. The topology of the fluid motion is similar to cylindrical Poiseuille flow. The second zone model consists of the Navier-Stokes equations in cylindrical coordinates, introduced in simplified form, and the heat equation in the liquid layer. The third zone is the volume occupied by the upward convective flow. The convective flows do not mix and do not exchange heat. At the start of the process the medium has a uniform temperature and zero velocity throughout the volume, which allows us to specify the initial conditions for the process. The paper shows the

  4. MODELLING OF THE PROCESS OF TEACHING READING ENGLISH LANGUAGE PERIODICALS

    Directory of Open Access Journals (Sweden)

    Тетяна Глушко

    2014-07-01

    The article reveals a scientifically substantiated process of teaching the reading of English-language periodicals in all its components, which are consistently developed, and the interconnection of the structural elements in the process of teaching reading. This process is presented as a few interconnected and interdetermined models: (1) models of the process of acquiring standard and expressive lexical knowledge; (2) models of the process of forming the skills to use such vocabulary; (3) models of the development of the skills to read texts of different linguistic levels.

  5. EUV mask process specifics and development challenges

    Science.gov (United States)

    Nesladek, Pavel

    2014-07-01

    EUV lithography is currently the favorite and most promising candidate among the next generation lithography (NGL) technologies. A decade ago NGL was expected to be used for the 45 nm technology node. Due to the introduction of 193 nm immersion lithography, double/triple patterning and further techniques, the capabilities of 193 nm lithography were greatly improved, so it is expected to be used successfully down to 10 nm logic, depending on the business decisions of the end user. Subsequent technology nodes will require EUV or an alternative technology such as DSA. Manufacturing, and especially process development, for EUV technology requires a significant number of unique processes, in several cases performed on dedicated tools. Currently several of these tools, e.g. the EUV AIMS or an actinic reflectometer, are not yet available on site. Process development is therefore done using external services/tools, with an impact on the single unit process development timeline and uncertainty in the estimation of process performance; compromises in process development, caused by assumptions about similarities between optical and EUV masks made in experiment planning and by the omission of tests, are further reasons for challenges to unit process development. Increased defect risk and uncertainty in process qualification are just two examples which can impact mask quality and process development. The aim of this paper is to identify critical aspects of EUV mask manufacturing with respect to defects on the mask, with a focus on mask cleaning and defect repair, and to discuss the impact of the EUV-specific requirements on the experiments needed.

  6. Process Model for Friction Stir Welding

    Science.gov (United States)

    Adams, Glynn

    1996-01-01

    Friction stir welding (FSW) is a relatively new process being applied for joining of metal alloys. The process was initially developed by The Welding Institute (TWI) in Cambridge, UK. The FSW process is being investigated at NASA/MSFC as a repair/initial weld procedure for fabrication of the super-light-weight aluminum-lithium shuttle external tank. The FSW investigations at MSFC were conducted on a horizontal mill to produce butt welds of flat plate material. The weldment plates are butted together and fixed to a backing plate on the mill bed. A pin tool is placed into the tool holder of the mill spindle and rotated at approximately 400 rpm. The pin tool is then plunged into the plates such that the center of the probe lies at one end of the line of contact between the plates and the shoulder of the pin tool penetrates the top surface of the weldment. The weld is produced by traversing the tool along the line of contact between the plates. A lead angle allows the leading edge of the shoulder to remain above the top surface of the plate. The work presented here is a first attempt at modeling a complex phenomenon. The mechanical aspects of conducting the weld process are easily defined and the process itself is controlled by relatively few input parameters. However, in the region of the weld, plasticizing and forging of the parent material occur. These are difficult processes to model. The model presented here addresses only variations in the radial dimension outward from the pin tool axis. Examinations of the grain structure of the weld reveal that a considerable amount of material deformation also occurs in the direction parallel to the pin tool axis of rotation, through the material thickness. In addition, measurements of the axial load on the pin tool demonstrate that the forging effect of the pin tool shoulder is an important process phenomenon. Therefore, the model needs to be expanded to account for the deformations through the material thickness and the

  7. Management of Talent Development Process in Sport

    OpenAIRE

    SEVİMLİ, Dilek

    2015-01-01

    In the development of elite athletes, talent identification and education is a complex and multidimensional process. It is difficult to predict future performance given the increasing technical, tactical, conditioning and psychological demands of a sport. Factors such as children's developmental stages and levels, gender, athlete development programs, social support, the quality of coaches, and access to equipment and facilities can affect the talent development process. Phases of ...

  8. Computer modeling of lung cancer diagnosis-to-treatment process.

    Science.gov (United States)

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U; Yu, Xinhua; Faris, Nick; Li, Jingshan

    2015-08-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined and the necessary data and procedures to develop a DES model for the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their application in healthcare are introduced and the approach to derive a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed.
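
    As a minimal sketch of the Markov chain modelling mentioned above, the snippet below sets up an absorbing chain over hypothetical care-path states and uses the fundamental matrix to estimate the expected number of visits before treatment. The states and transition probabilities are invented, not values from the reviewed studies.

```python
# Sketch: an absorbing Markov chain over hypothetical care-path states, used to
# estimate the expected number of visits before treatment begins.
import numpy as np

# transient states: referral, imaging, biopsy, staging; absorbing state: treatment
P = np.array([
    # referral imaging biopsy staging treatment
    [0.10,     0.80,   0.05,  0.00,   0.05],   # referral
    [0.05,     0.10,   0.70,  0.10,   0.05],   # imaging
    [0.00,     0.05,   0.10,  0.75,   0.10],   # biopsy
    [0.00,     0.00,   0.05,  0.15,   0.80],   # staging
    [0.00,     0.00,   0.00,  0.00,   1.00],   # treatment (absorbing)
])

Q = P[:4, :4]                                  # transient-to-transient block
N = np.linalg.inv(np.eye(4) - Q)               # fundamental matrix
expected_visits = N.sum(axis=1)                # expected visits before absorption
print("expected visits starting from referral:", round(expected_visits[0], 2))
```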

  9. Managing the TDM process : developing MPO institutional capacity - technical report.

    Science.gov (United States)

    2015-04-01

    Within Texas, the development of urban travel demand models (TDMs) is a cooperative process between the Texas Department of Transportation and Metropolitan Planning Organizations (MPOs). Though TxDOT-Transportation Planning and Programming Division...

  10. Mathematical modeling of the flash converting process

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, H.Y.; Perez-Tello, M.; Riihilahti, K.M. [Utah Univ., Salt Lake City, UT (United States)

    1996-12-31

    An axisymmetric mathematical model for the Kennecott-Outokumpu flash converting process for converting solid copper matte to copper is presented. The model is an adaptation of the comprehensive mathematical model formerly developed at the University of Utah for the flash smelting of copper concentrates. The model incorporates the transport of momentum, heat, mass, and reaction kinetics between gas and particles in a particle-laden turbulent gas jet. The standard k-ε model is used to describe gas-phase turbulence in an Eulerian framework. The particle phase is treated from a Lagrangian viewpoint which is coupled to the gas phase via the source terms in the Eulerian gas-phase governing equations. Matte particles were represented as Cu₂S·yFeS and assumed to undergo homogeneous oxidation to Cu₂O, Fe₃O₄, and SO₂. A reaction kinetics mechanism involving both external mass transfer of oxygen gas to the particle surface and diffusion of oxygen through the porous oxide layer is proposed to estimate the particle oxidation rate. Predictions of the mathematical model were compared with the experimental data collected in a bench-scale flash converting facility. Good agreement between the model predictions and the measurements was obtained. The model was used to study the effect of different gas-injection configurations on the overall fluid dynamics in a commercial-size flash converting shaft. (author)

  11. Mathematical modeling of the flash converting process

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, H Y; Perez-Tello, M; Riihilahti, K M [Utah Univ., Salt Lake City, UT (United States)

    1997-12-31

    An axisymmetric mathematical model for the Kennecott-Outokumpu flash converting process for converting solid copper matte to copper is presented. The model is an adaptation of the comprehensive mathematical model formerly developed at the University of Utah for the flash smelting of copper concentrates. The model incorporates the transport of momentum, heat, mass, and reaction kinetics between gas and particles in a particle-laden turbulent gas jet. The standard k-ε model is used to describe gas-phase turbulence in an Eulerian framework. The particle phase is treated from a Lagrangian viewpoint which is coupled to the gas phase via the source terms in the Eulerian gas-phase governing equations. Matte particles were represented as Cu₂S·yFeS and assumed to undergo homogeneous oxidation to Cu₂O, Fe₃O₄, and SO₂. A reaction kinetics mechanism involving both external mass transfer of oxygen gas to the particle surface and diffusion of oxygen through the porous oxide layer is proposed to estimate the particle oxidation rate. Predictions of the mathematical model were compared with the experimental data collected in a bench-scale flash converting facility. Good agreement between the model predictions and the measurements was obtained. The model was used to study the effect of different gas-injection configurations on the overall fluid dynamics in a commercial-size flash converting shaft. (author)

  12. Repairing process models to reflect reality

    NARCIS (Netherlands)

    Fahland, D.; Aalst, van der W.M.P.; Barros, A.; Gal, A.; Kindler, E.

    2012-01-01

    Process mining techniques relate observed behavior (i.e., event logs) to modeled behavior (e.g., a BPMN model or a Petri net). Process models can be discovered from event logs and conformance checking techniques can be used to detect and diagnose differences between observed and modeled behavior.

  13. Development of a dynamic growth-death model for Escherichia coli O157:H7 in minimally processed leafy green vegetables.

    Science.gov (United States)

    McKellar, Robin C; Delaquis, Pascal

    2011-11-15

    Escherichia coli O157:H7, an occasional contaminant of fresh produce, can present a serious health risk in minimally processed leafy green vegetables. A good predictive model is needed for Quantitative Risk Assessment (QRA) purposes which adequately describes the growth or die-off of this pathogen under the variable temperature conditions experienced during processing, storage and shipping. Literature data on the behaviour of this pathogen on fresh-cut lettuce and spinach were taken from published graphs by digitization, from published tables or from personal communications. A three-phase growth function was fitted to the data from 13 studies, and a square-root model for growth rate (μ) as a function of temperature was derived: μ = (0.023*(Temperature - 1.20))^2. Variability in the published data was incorporated into the growth model by the use of weighted regression and the 95% prediction limits. A log-linear die-off function was fitted to the data from 13 studies, and the resulting rate constants were fitted to a shifted lognormal distribution (mean: 0.013; standard deviation: 0.010; shift: 0.001). The combined growth-death model successfully predicted pathogen behaviour under both isothermal and non-isothermal conditions when compared to new published data. By incorporating variability, the resulting model is an improvement over existing ones and is suitable for QRA applications.
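
    The abstract states the fitted square-root growth model and a log-linear die-off explicitly, so a direct reading of those equations can be sketched in a few lines. The code below assumes the usual conventions for such models (temperature in °C, counts in log10 CFU/g, rates per hour); treat it as an illustrative interpretation rather than the authors' implementation.

```python
# Sketch: evaluating the reported square-root growth model and a log-linear die-off.
import numpy as np

def growth_rate(temp_c):
    """Square-root model: mu = (0.023*(T - 1.20))**2, taken as zero below the minimum temperature."""
    return np.where(temp_c > 1.20, (0.023 * (temp_c - 1.20)) ** 2, 0.0)

def log_linear_dieoff(n0_log, k, t):
    """Log-linear inactivation: log10 N(t) = log10 N0 - k*t."""
    return n0_log - k * t

print("mu at 10 C:", float(growth_rate(10.0)))               # ~0.041
print("mu at 25 C:", float(growth_rate(25.0)))                # ~0.30
print("log count after 24 h die-off:", log_linear_dieoff(3.0, 0.013, 24.0))
```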

  14. Modeling Dynamic Regulatory Processes in Stroke

    Science.gov (United States)

    McDermott, Jason E.; Jarman, Kenneth; Taylor, Ronald; Lancaster, Mary; Shankaran, Harish; Vartanian, Keri B.; Stevens, Susan L.; Stenzel-Poore, Mary P.; Sanfilippo, Antonio

    2012-01-01

    The ability to examine the behavior of biological systems in silico has the potential to greatly accelerate the pace of discovery in diseases, such as stroke, where in vivo analysis is time intensive and costly. In this paper we describe an approach for in silico examination of responses of the blood transcriptome to neuroprotective agents and subsequent stroke through the development of dynamic models of the regulatory processes observed in the experimental gene expression data. First, we identified functional gene clusters from these data. Next, we derived ordinary differential equations (ODEs) from the data relating these functional clusters to each other in terms of their regulatory influence on one another. Dynamic models were developed by coupling these ODEs into a model that simulates the expression of regulated functional clusters. By changing the magnitude of gene expression in the initial input state it was possible to assess the behavior of the networks through time under varying conditions since the dynamic model only requires an initial starting state, and does not require measurement of regulatory influences at each time point in order to make accurate predictions. We discuss the implications of our models on neuroprotection in stroke, explore the limitations of the approach, and report that an optimized dynamic model can provide accurate predictions of overall system behavior under several different neuroprotective paradigms. PMID:23071432
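
    As a hedged sketch of the coupled-ODE approach described above, the snippet below integrates two mutually regulating functional clusters from an initial input state with scipy. The coupling form, constants and initial values are invented for illustration and are not the fitted stroke model.

```python
# Sketch: a two-cluster regulatory ODE model integrated from an initial input state.
# Coupling constants and initial values are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, x, a12=0.8, a21=-0.5, d=0.3):
    x1, x2 = x
    dx1 = a12 * np.tanh(x2) - d * x1     # cluster 1 activated by cluster 2, with decay
    dx2 = a21 * np.tanh(x1) - d * x2     # cluster 2 repressed by cluster 1, with decay
    return [dx1, dx2]

sol = solve_ivp(rhs, (0.0, 48.0), y0=[1.0, 0.1], t_eval=np.linspace(0, 48, 25))
print(sol.t[-1], sol.y[:, -1])           # simulated cluster expression at 48 h
```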

  15. Selected sports talent development models

    Directory of Open Access Journals (Sweden)

    Michal Vičar

    2017-06-01

    Background: Sports talent in the Czech Republic is generally viewed as a static, stable phenomenon. This stands in contrast with the widespread praxis carried out in Anglo-Saxon countries, which emphasises its fluctuant nature. This is reflected in the current models describing its development. Objectives: The aim is to introduce current models of talent development in sport. Methods: Comparison and analysis of the following models: Balyi - Long term athlete development model, Côté - Developmental model of sport participation, Csikszentmihalyi - The flow model of optimal expertise, Bailey and Morley - Model of talent development. Conclusion: Current models of sport talent development approach talent as a dynamic phenomenon, varying in time. They are based in particular on the work of Simonton and his Emergenic and epigenetic model and of Gagné and his Differentiated model of giftedness and talent. Balyi's model is characterised by its applicability and implications for practice. Côté's model highlights the role of family and deliberate play. Both models describe the periodization of talent development. Csikszentmihalyi's flow model explains how the athlete acquires experience and develops during puberty based on the structure of attention and flow experience. Bailey and Morley's model accents a situational approach to talent and the development of skills facilitating its growth.

  16. Computer-Aided Multiscale Modelling for Chemical Process Engineering

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Gani, Rafiqul

    2007-01-01

    Chemical processes are generally modeled through monoscale approaches, which, while not always adequate, serve a useful role in product-process design. In this case, the use of a multi-dimensional and multi-scale model-based approach is important in product-process development. A computer-aided framework...

  17. Guided interaction exploration in artifact-centric process models

    NARCIS (Netherlands)

    van Eck, M.L.; Sidorova, N.; van der Aalst, W.M.P.

    2017-01-01

    Artifact-centric process models aim to describe complex processes as a collection of interacting artifacts. Recent developments in process mining allow for the discovery of such models. However, the focus is often on the representation of the individual artifacts rather than their interactions. Based

  18. Genetic Process Mining: Alignment-based Process Model Mutation

    NARCIS (Netherlands)

    Eck, van M.L.; Buijs, J.C.A.M.; Dongen, van B.F.; Fournier, F.; Mendling, J.

    2015-01-01

    The Evolutionary Tree Miner (ETM) is a genetic process discovery algorithm that enables the user to guide the discovery process based on preferences with respect to four process model quality dimensions: replay fitness, precision, generalization and simplicity. Traditionally, the ETM algorithm uses

  19. The Skill Development Processes of Apprenticeship.

    Science.gov (United States)

    Wolek, Francis W.

    1999-01-01

    Case studies of apprenticeship in the Japanese tea ceremony, traditional crafts, and strategic thinking illustrate novices' growth in internal knowledge through reflective practice of skilled processes. As skilled experts, adult educators are engaged in continually improving the skilled processes they model. (SK)

  20. Development of the negative gravity anomaly of the 85 degrees E Ridge, northeastern Indian Ocean – A process oriented modelling approach

    Digital Repository Service at National Institute of Oceanography (India)

    Sreejith, K.M.; Radhakrishna, M.; Krishna, K.S.; Majumdar, T.J.

    Te value. The entire process is repeated for different Te values ranging from 0 to 25 km, until a good fit is obtained between the observed and calculated gravity anomalies, considering the RMS error as well as the amplitude and wavelength of the anomalies... as the goodness of fit. The model parameters used in the computations are given in Table 1. Crustal structure and elastic plate thickness (Te) beneath the ridge: following the approach described above, we have computed the individual gravity anomalies contributed...

  1. Organizational Development: Values, Process, and Technology.

    Science.gov (United States)

    Margulies, Newton; Raia, Anthony P.

    The current state-of-the-art of organizational development is the focus of this book. The five parts into which the book is divided are as follows: Part One--Introduction (Organizational Development in Perspective--the nature, values, process, and technology of organizational development); Part Two--The Components of Organizational Developments…

  2. Robot development for nuclear material processing

    International Nuclear Information System (INIS)

    Pedrotti, L.R.; Armantrout, G.A.; Allen, D.C.; Sievers, R.H. Sr.

    1991-07-01

    The Department of Energy is seeking to modernize its special nuclear material (SNM) production facilities and concurrently reduce radiation exposures and the process and incidental radioactive waste generated. As part of this program, a Lawrence Livermore National Laboratory (LLNL)-led team is developing and adapting generic and specific applications of commercial robotic technologies to SNM pyrochemical processing and other operations. A working gantry robot within a sealed processing glove box and a telerobot control test bed are manifestations of this effort. This paper describes the development challenges and progress in adapting processing, robotic, and nuclear safety technologies to the application. 3 figs

  3. Composing Models of Geographic Physical Processes

    Science.gov (United States)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed by basic models of geographic physical processes, which is shown by means of an example.
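
    As one concrete instance of a geographic physical process expressed as a partial difference equation, the sketch below applies an explicit finite-difference update for two-dimensional diffusion of a substance over a regular grid. Grid size, diffusivity and time step are illustrative; the process description language itself is not reproduced here.

```python
# Sketch: a partial difference equation for a prototypical geographic physical
# process -- 2D diffusion of a substance over a grid. Values are illustrative only.
import numpy as np

nx = ny = 50
D, dx, dt = 1.0, 1.0, 0.2           # diffusivity, cell size, time step (stable: D*dt/dx^2 <= 0.25)
u = np.zeros((ny, nx))
u[ny // 2, nx // 2] = 100.0         # initial point release in the centre cell

for _ in range(200):
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u) / dx ** 2
    u = u + D * dt * lap            # explicit difference update (periodic boundaries)

print("total mass conserved:", round(u.sum(), 2), "peak concentration:", round(u.max(), 3))
```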

  4. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    Science.gov (United States)

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...

  5. GRA model development at Bruce Power

    International Nuclear Information System (INIS)

    Parmar, R.; Ngo, K.; Cruchley, I.

    2011-01-01

    In 2007, Bruce Power undertook a project, in partnership with AMEC NSS Limited, to develop a Generation Risk Assessment (GRA) model for its Bruce B Nuclear Generating Station. The model is intended to be used as a decision-making tool in support of plant operations. Bruce Power has recognized the strategic importance of GRA in the plant decision-making process and is currently implementing a pilot GRA application. The objective of this paper is to present the scope of the GRA model development project, methodology employed, and the results and path forward for the model implementation at Bruce Power. The required work was split into three phases. Phase 1 involved development of GRA models for the twelve systems most important to electricity production. Ten systems were added to the model during each of the next two phases. The GRA model development process consists of developing system Failure Modes and Effects Analyses (FMEA) to identify the components critical to the plant reliability and determine their impact on electricity production. The FMEAs were then used to develop the logic for system fault tree (FT) GRA models. The models were solved and post-processed to provide model outputs to the plant staff in a user-friendly format. The outputs consisted of the ranking of components based on their production impact expressed in terms of lost megawatt hours (LMWH). Another key model output was the estimation of the predicted Forced Loss Rate (FLR). (author)

  6. MORTALITY MODELING WITH LEVY PROCESSES

    Directory of Open Access Journals (Sweden)

    M. Serhat Yucel, FRM

    2012-07-01

    Mortality and longevity risk is usually one of the main risk components in economic capital models of insurance companies. Above all, future mortality expectations are an important input in the modeling and pricing of long-term products. Deviations from the expectation can even lead an insurance company to default if sufficient reserves and capital are not held. Thus, modeling mortality time series accurately is a vital concern for the insurance industry. The aim of this study is to perform distributional and spectral testing of the mortality data and to practice discrete and continuous time modeling. We believe the results and the techniques used in this study will provide a basis for a Value at Risk formula in the case of mortality.
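
    As a hedged illustration of continuous-time mortality modeling with a Levy-type process, the snippet below simulates a simple jump-diffusion path for a log mortality index. The drift, volatility and jump parameters are invented and are not calibrated to the data used in the study.

```python
# Sketch: simulating a jump-diffusion path (a simple Levy-type process) for a
# log mortality index. All parameters are illustrative, not calibrated values.
import numpy as np

rng = np.random.default_rng(7)
mu, sigma = -0.01, 0.02                    # annual drift and diffusion volatility
lam, jump_mu, jump_sd = 0.3, 0.05, 0.02    # jump intensity and jump-size distribution
dt, n_years = 1.0, 30

x = [0.0]                                  # log mortality index, normalised to 0
for _ in range(n_years):
    diffusion = mu * dt + sigma * np.sqrt(dt) * rng.normal()
    n_jumps = rng.poisson(lam * dt)
    jumps = rng.normal(jump_mu, jump_sd, size=n_jumps).sum()
    x.append(x[-1] + diffusion + jumps)

print("simulated 30-year log-index path endpoints:", round(x[0], 3), round(x[-1], 3))
```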

  7. Safety guides development process in Spain

    International Nuclear Information System (INIS)

    Butragueno, J.L.; Perello, M.

    1979-01-01

    Safety guides have become a major factor in the licensing process of nuclear power plants and related nuclear fuel cycle facilities. As experience corroborates better and better engineering methodologies and procedures, their results are set down in the form of standards, guides, and similar issues. This paper presents the current Spanish experience with nuclear standards and safety guides development. The process followed to develop a standard or safety guide is shown. An up-to-date list of issued and in-development nuclear safety guides is included and comments on the future role of nuclear standards in the licensing process are made. (author)

  8. Process models as tools in forestry research and management

    Science.gov (United States)

    Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox

    2001-01-01

    Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...

  9. Beyond dual-process models: A categorisation of processes underlying intuitive judgement and decision making

    NARCIS (Netherlands)

    Glöckner, A.; Witteman, C.L.M.

    2010-01-01

    Intuitive-automatic processes are crucial for making judgements and decisions. The fascinating complexity of these processes has attracted many decision researchers, prompting them to start investigating intuition empirically and to develop numerous models. Dual-process models assume a clear

  10. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five-stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  11. Kopernik : modeling business processes for digital customers

    OpenAIRE

    Estañol Lamarca, Montserrat; Castro, Manuel; Díaz-Montenegro, Sylvia; Teniente López, Ernest

    2016-01-01

    This paper presents the Kopernik methodology for modeling business processes for digital customers. These processes require a high degree of flexibility in the execution of their tasks or actions. We achieve this by using the artifact-centric approach to process modeling and the use of condition-action rules. The processes modeled following Kopernik can then be implemented in an existing commercial tool, Balandra.

  12. The triconnected abstraction of process models

    OpenAIRE

    Polyvyanyy, Artem; Smirnov, Sergey; Weske, Mathias

    2009-01-01

    Contents: Artem Polyvyanyy, Sergey Smirnov, and Mathias Weske, The Triconnected Abstraction of Process Models. 1 Introduction; 2 Business Process Model Abstraction; 3 Preliminaries; 4 Triconnected Decomposition (4.1 Basic Approach for Process Component Discovery; 4.2 SPQR-Tree Decomposition; 4.3 SPQR-Tree Fragments in the Context of Process Models); 5 Triconnected Abstraction (5.1 Abstraction Rules; 5.2 Abstraction Algorithm); 6 Related Work and Conclusions

  13. Modelling of Batch Process Operations

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli; Cameron, Ian; Gani, Rafiqul

    2011-01-01

    Here a batch cooling crystalliser is modelled and simulated as is a batch distillation system. In the batch crystalliser four operational modes of the crystalliser are considered, namely: initial cooling, nucleation, crystal growth and product removal. A model generation procedure is shown that s...

  14. Mathematical Modeling: A Structured Process

    Science.gov (United States)

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  15. Selected sports talent development models

    OpenAIRE

    Michal Vičar

    2017-01-01

    Background: Sports talent in the Czech Republic is generally viewed as a static, stable phenomenon. This stands in contrast with the widespread praxis carried out in Anglo-Saxon countries, which emphasises its fluctuant nature. This is reflected in the current models describing its development. Objectives: The aim is to introduce current models of talent development in sport. Methods: Comparison and analysis of the following models: Balyi - Long term athlete development model, Côté - Developmen...

  16. Estimation of environment-related properties of chemicals for design of sustainable processes: Development of group-contribution+ (GC+) models and uncertainty analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Kalakul, Sawitree; Sarup, Bent

    2012-01-01

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncertainties of estimated...... property values. For this purpose, a systematic methodology for property modeling and uncertainty analysis is used. The methodology includes a parameter estimation step to determine parameters of property models and an uncertainty analysis step to establish statistical information about the quality......, poly-functional chemicals, etc.) taken from the database of the US Environmental Protection Agency (EPA) and from the database of USEtox is used. For property modeling and uncertainty analysis, the Marrero and Gani GC method and the atom connectivity index method have been considered. In total, 22...
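
    As a minimal sketch of the first-order group-contribution idea underlying GC+ property models, the snippet below sums occurrence-weighted group contributions for a molecule. The group table, contribution values and offset are placeholders, not Marrero-Gani or USEtox parameters.

```python
# Sketch: a first-order group-contribution estimate, property = offset + sum_i n_i * C_i,
# of the general form used by GC/GC+ methods. Contribution values are placeholders.
contributions = {"CH3": 0.35, "CH2": 0.14, "OH": 1.20}   # hypothetical group contributions

def gc_estimate(groups, offset=1.0):
    """groups: dict mapping group name -> occurrence count n_i."""
    return offset + sum(n * contributions[g] for g, n in groups.items())

# e.g. a 1-propanol-like molecule: 1x CH3 + 2x CH2 + 1x OH
print("estimated property value:", gc_estimate({"CH3": 1, "CH2": 2, "OH": 1}))
```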

  17. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    Science.gov (United States)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
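
    A minimal sketch of the projection idea behind such a transformation (assuming, purely for illustration, that a collaborative process can be flattened to an ordered list of cross-enterprise interactions; the actual method operates on process metamodels via MDA transformations): the interface process of one enterprise keeps only the interactions it takes part in, with its local send/receive role made explicit.

    # Sketch: project a collaborative process model onto one enterprise's
    # interface process. The collaborative model here is simply an ordered list of
    # interactions (sender, receiver, message); a real MDA transformation would
    # operate on UML/BPMN metamodels instead.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Interaction:
        sender: str
        receiver: str
        message: str

    def interface_process(collaborative: List[Interaction], enterprise: str) -> List[str]:
        """Keep only interactions involving `enterprise` and annotate its role."""
        steps = []
        for i in collaborative:
            if i.sender == enterprise:
                steps.append(f"send '{i.message}' to {i.receiver}")
            elif i.receiver == enterprise:
                steps.append(f"receive '{i.message}' from {i.sender}")
        return steps

    collab = [
        Interaction("Buyer", "Supplier", "PurchaseOrder"),
        Interaction("Supplier", "Buyer", "OrderConfirmation"),
        Interaction("Supplier", "Carrier", "ShippingRequest"),
    ]
    print(interface_process(collab, "Supplier"))
    # -> ["receive 'PurchaseOrder' from Buyer", "send 'OrderConfirmation' to Buyer",
    #     "send 'ShippingRequest' to Carrier"]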

  18. Understanding Quality in Process Modelling: Towards a Holistic Perspective

    Directory of Open Access Journals (Sweden)

    Jan Recker

    2007-09-01

    Full Text Available Quality is one of the main topics in current conceptual modelling research, as is the field of business process modelling. Yet, widely acknowledged academic contributions towards an understanding or measurement of business process model quality are limited at best. In this paper I argue that the development of methodical theories concerning the measurement or establishment of process model quality must be preceded by methodological elaborations on business process modelling. I further argue that existing epistemological foundations of process modelling are insufficient for describing all extrinsic and intrinsic traits of model quality. This in turn has led to a lack of holistic understanding of process modelling. Taking into account the inherent social and purpose-oriented character of process modelling in contemporary organizations I present a socio-pragmatic constructionist methodology of business process modelling and sketch out implications of this perspective towards an understanding of process model quality. I anticipate that, based on this research, theories can be developed that facilitate the evaluation of the ’goodness’ of a business process model.

  19. Fabrication Process Development for Light Deformable Mirrors

    Data.gov (United States)

    National Aeronautics and Space Administration — The project objective is to develop robust, reproducible fabrication processes to realize functional deformable membrane mirrors (DM) for a space mission in which...

  20. AECL's use of FMEA and OPEX for field service tooling and process development, implementation and improvement: a model for the future

    International Nuclear Information System (INIS)

    Cox, E.; Dam, R.F.; Wilson, E.

    2008-01-01

    Failure Modes and Effects Analysis (FMEA) is a systematic and rigorous process applied to new or complex systems to predict system failures and assist with the development of mitigating strategies. The process is especially beneficial when applied to higher-risk applications such as nuclear systems. FMEA may be used for design verification and maintenance program development. For field service tooling, FMEA is complemented well by operating experience (OPEX) and continuous improvement initiatives. FMEA is generally conducted while developing systems and processes to ensure safe and successful implementation, while OPEX is fed back into the system design and operation to improve those systems and processes for subsequent field applications. This paper will explore these techniques as they have been applied to AECL's CANDUclean system. The portable CANDUclean system is employed to mechanically clean the inside of steam generator (SG) tubes in CANDU nuclear power plants. During normal plant operation, the steam generator tubes in the heat transport system develop a build-up of magnetite on their internal diameter, which decreases heat transfer efficiency, impedes SG maintenance activities and increases the radiation fields in and around the boilers. As part of a regular plant aging management routine, the CANDUclean system is used to remove the magnetite layers. The nature of this work includes risks to personnel safety; however, by continually applying FMEA and other improvement initiatives, safety and system effectiveness are maximized. This paper will provide an overview of the integrated continuous improvement approach applied to the CANDUclean system and consider the value of these strategies when applied to field service tooling and CANDU systems. (author)
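
    For readers unfamiliar with the FMEA bookkeeping, the sketch below shows the conventional risk priority number ranking (RPN = severity x occurrence x detection, each on a 1-10 scale); the failure modes and scores are invented placeholders and do not come from AECL's CANDUclean analysis:

    # Generic FMEA ranking sketch: risk priority number RPN = S * O * D on 1-10
    # scales. The failure modes and scores below are invented placeholders.
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        description: str
        severity: int    # 1 (negligible) .. 10 (catastrophic)
        occurrence: int  # 1 (rare) .. 10 (frequent)
        detection: int   # 1 (certain to detect) .. 10 (undetectable)

        @property
        def rpn(self) -> int:
            return self.severity * self.occurrence * self.detection

    modes = [
        FailureMode("Cleaning head jams inside SG tube", 7, 3, 4),
        FailureMode("Debris not fully extracted", 5, 4, 6),
        FailureMode("Tool cable abrasion", 4, 5, 3),
    ]

    # rank by RPN so mitigation effort (and OPEX review) targets the worst risks first
    for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
        print(f"RPN {m.rpn:4d}  {m.description}")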

  1. Process Modeling With Inhomogeneous Thin Films

    Science.gov (United States)

    Machorro, R.; Macleod, H. A.; Jacobson, M. R.

    1986-12-01

    Designers of optical multilayer coatings commonly assume that the individual layers will be ideally homogeneous and isotropic. In practice, it is very difficult to control the conditions involved in the complex evaporation process sufficiently to produce such ideal films. Clearly, changes in process parameters, such as evaporation rate, chamber pressure, and substrate temperature, affect the microstructure of the growing film, frequently producing inhomogeneity in structure or composition. In many cases, these effects are interdependent, further complicating the situation. However, this process can be simulated on powerful, interactive, and accessible microcomputers. In this work, we present such a model and apply it to estimate the influence of an inhomogeneous layer on multilayer performance. Presently, the program simulates film growth, thermal expansion and contraction, and thickness monitoring procedures, and includes the effects of uncertainty in these parameters or noise. Although the model is being developed to cover very general cases, we restrict the present discussion to isotropic and nondispersive quarterwave layers to understand the particular effects of inhomogeneity. We studied several coating designs and related results and tolerances to variations in evaporation conditions. The model is composed of several modular subprograms, is written in Fortran, and is executed on an IBM-PC with 640 K of memory. The results can be presented in graphic form on a monochrome monitor. We are currently installing and implementing color capability to improve the clarity of the multidimensional output.
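
    As background for how such multilayer performance estimates are made, the sketch below computes the normal-incidence reflectance of an ideal homogeneous quarterwave stack with the standard characteristic-matrix method; it is a simplified stand-in for the authors' Fortran model and deliberately omits inhomogeneity, dispersion and monitoring noise. Indices and the layer count are illustrative only.

    # Characteristic-matrix calculation of normal-incidence reflectance for a
    # stack of homogeneous, lossless dielectric layers (a simplified stand-in for
    # the inhomogeneous-film model discussed in the record).
    import numpy as np

    def reflectance(n_layers, d_layers, n_inc, n_sub, wavelength):
        """n_layers, d_layers: refractive indices and physical thicknesses (same
        units as wavelength) of the layers, listed from the incident-medium side."""
        M = np.eye(2, dtype=complex)
        for n, d in zip(n_layers, d_layers):
            delta = 2.0 * np.pi * n * d / wavelength      # phase thickness
            layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                              [1j * n * np.sin(delta), np.cos(delta)]])
            M = M @ layer
        B, C = M @ np.array([1.0, n_sub])                 # [B, C]^T = M [1, n_sub]^T
        r = (n_inc * B - C) / (n_inc * B + C)             # amplitude reflectance
        return abs(r) ** 2

    # example: 9-layer HLHL...H quarterwave stack at 550 nm on glass
    lam0 = 550.0
    nH, nL = 2.35, 1.46
    indices = [nH if i % 2 == 0 else nL for i in range(9)]
    thicknesses = [lam0 / (4.0 * n) for n in indices]     # quarterwave optical thickness
    print(f"R = {reflectance(indices, thicknesses, 1.0, 1.52, lam0):.4f}")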

  2. Recent Developments in Abrasive Hybrid Manufacturing Processes

    Directory of Open Access Journals (Sweden)

    Ruszaj Adam

    2017-06-01

    Full Text Available The recent dynamic development of abrasive hybrid manufacturing processes results from the application of new, difficult-to-machine materials and from the improvement of the technological indicators of manufacturing processes already applied in practice. This tendency also occurs in abrasive machining processes, which are often supported by ultrasonic vibrations, electrochemical dissolution or electrical discharges. In this paper we present a review of new investigation results and new practical applications of Abrasive Electrodischarge Machining (AEDM) and Abrasive Electrochemical Machining (AECM).

  3. Itataia project - Development of the process

    International Nuclear Information System (INIS)

    Coelho, S.V.

    1987-01-01

    A process for treating the phosphorus uraniferous ore from the Itataia-CE mine in Brazil was developed, establishing the basic flow chart for the recovery of two products: uranium concentrate and phosphoric acid. The developed process consists of physical concentration, chemical separation and solvent extraction, and it presented, at laboratory and pilot scales, recovery levels which assure the technical and economic viability of the project. The consolidation of the project and the description of the installations are presented in a documentary film. (M.C.K.) [pt

  4. Itataia project - Development of the process

    International Nuclear Information System (INIS)

    Coelho, S.V.

    1987-01-01

    A process for treating the phosphorus uraniferous ore from the Itataia-CE mine in Brazil was developed, establishing the basic flow chart for the recovery of two products: uranium concentrate and phosphoric acid. The developed process consists of physical concentration, chemical separation and solvent extraction, and it presented, at laboratory and pilot scales, recovery levels which assure the technical and economic viability of the project. The consolidation of the project and the description of the installations are presented in a documentary film. (M.C.K.) [pt

  5. Teaching Information Systems Development via Process Variants

    Science.gov (United States)

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  6. Biocatalytic process development using microfluidic miniaturized systems

    DEFF Research Database (Denmark)

    Krühne, Ulrich; Heintz, Søren; Ringborg, Rolf Hoffmeyer

    2014-01-01

    The increasing interest in biocatalytic processes means there is a clear need for a new systematic development paradigm which encompasses both protein engineering and process engineering. This paper argues that through the use of a new microfluidic platform, data can be collected more rapidly...

  7. Process Consultation: Its Role in Organization Development.

    Science.gov (United States)

    Schein, Edgar H.

    This volume focuses on the process by which the consultant builds readiness for organizational development (OD) programs, actually conducts training, and works with the key individuals of an organization as part of an OD program. Part I describes in some detail the human processes in organizations--communication, functional roles of group members,…

  8. Application of quality by design concept to develop a dual gradient elution stability-indicating method for cloxacillin forced degradation studies using combined mixture-process variable models.

    Science.gov (United States)

    Zhang, Xia; Hu, Changqin

    2017-09-08

    Penicillins are typical of complex ionic samples which are likely to contain a large number of degradation-related impurities (DRIs) with different polarities and charge properties. It is often a challenge to develop selective and robust high performance liquid chromatography (HPLC) methods for the efficient separation of all DRIs. In this study, an analytical quality by design (AQbD) approach was proposed for stability-indicating method development of cloxacillin. Rules for the structures, retention and UV characteristics of penicillins and their impurities were summarized and served as useful prior knowledge. Through quality risk assessment and screening design, 3 critical process parameters (CPPs) were defined, including 2 mixture variables (MVs) and 1 process variable (PV). A combined mixture-process variable (MPV) design was conducted to evaluate the 3 CPPs simultaneously, and a response surface methodology (RSM) was used to determine the optimal experimental parameters. A dual gradient elution was performed to change buffer pH, mobile-phase type and strength simultaneously. The design spaces (DSs) were evaluated using Monte Carlo simulation to estimate their probability of meeting the specifications of the CQAs. A Plackett-Burman design was performed to test the robustness around the working points and to decide the normal operating ranges (NORs). Finally, validation was performed following International Conference on Harmonisation (ICH) guidelines. To our knowledge, this is the first study to use an MPV design and dual gradient elution to develop HPLC methods and improve separations for complex ionic samples. Copyright © 2017 Elsevier B.V. All rights reserved.
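
    To make the design-space evaluation step concrete, the sketch below performs the kind of Monte Carlo check described: sample disturbances around a candidate working point, push them through a fitted response-surface model for a critical quality attribute (here, resolution of a critical pair) and estimate the probability of meeting the specification. The model coefficients, factor ranges, noise levels and the 1.5 specification are hypothetical, not the cloxacillin method's values.

    # Monte Carlo evaluation of the probability of meeting a CQA specification
    # over a candidate operating region, given a (hypothetical) fitted response
    # surface for critical-pair resolution Rs as a function of two coded factors.
    import numpy as np

    rng = np.random.default_rng(0)

    def resolution(ph, grad_time):
        """Hypothetical quadratic response surface in coded units (-1..+1)."""
        return 2.1 + 0.35 * ph - 0.20 * grad_time - 0.25 * ph**2 + 0.10 * ph * grad_time

    def prob_meeting_spec(ph0, gt0, n=20_000, spec=1.5):
        """Probability that Rs >= spec when the factors vary around the working
        point (ph0, gt0) with assumed normal disturbances and model error."""
        ph = rng.normal(ph0, 0.10, n)          # variability in buffer pH (coded)
        gt = rng.normal(gt0, 0.15, n)          # variability in gradient time (coded)
        noise = rng.normal(0.0, 0.08, n)       # residual model/measurement error
        rs = resolution(ph, gt) + noise
        return np.mean(rs >= spec)

    for ph0, gt0 in [(0.0, 0.0), (0.5, -0.5), (-0.8, 0.8)]:
        print(f"working point ({ph0:+.1f}, {gt0:+.1f}): "
              f"P(Rs >= 1.5) = {prob_meeting_spec(ph0, gt0):.3f}")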

  9. Multifunctional multiscale composites: Processing, modeling and characterization

    Science.gov (United States)

    Qiu, Jingjing

    Carbon nanotubes (CNTs) demonstrate extraordinary properties and show great promise in enhancing out-of-plane properties of traditional polymer/fiber composites and enabling functionality. However, current manufacturing challenges hinder the realization of their potential. In the dissertation research, both experimental and computational efforts have been conducted to investigate effective manufacturing techniques of CNT integrated multiscale composites. The fabricated composites demonstrated significant improvements in physical properties, such as tensile strength, tensile modulus, inter-laminar shear strength, thermal dimension stability and electrical conductivity. Such multiscale composites were truly multifunctional with the addition of CNTs. Furthermore, a novel hierarchical multiscale modeling method was developed in this research. Molecular dynamic (MD) simulation offered reasonable explanation of CNT dispersion and motion in polymer solution. Bi-mode finite-extensible-nonlinear-elastic (FENE) dumbbell simulation was used to analyze the influence of CNT length distribution on the stress tensor and shear-rate-dependent viscosity. Based on the simulated viscosity profile and empirical equations from experiments, a macroscale flow simulation model based on the finite element method (FEM) was developed and validated to predict resin flow behavior in the processing of CNT-enhanced multiscale composites. The proposed multiscale modeling method provided a comprehensive understanding of micro/nano flow in both atomistic details and mesoscale. The simulation model can be used to optimize process design and control of the mold-filling process in multiscale composite manufacturing. This research provided systematic investigations into the CNT-based multiscale composites. The results from this study may be used to leverage the benefits of CNTs and open up new application opportunities for high-performance multifunctional multiscale composites. Keywords. Carbon
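
    As a toy illustration of the last link in that modeling chain (feeding a shear-rate-dependent viscosity into a mold-filling estimate), the sketch below uses a generic Carreau viscosity fit in place of the FENE-derived profile and a one-dimensional, constant-pressure Darcy estimate of the resin flow front; all parameter values are invented, and the actual work uses a full FEM flow model.

    # Toy mold-filling estimate: a generic Carreau viscosity model (standing in for
    # the FENE-derived, CNT-length-dependent viscosity) evaluated at a representative
    # shear rate, fed into a 1-D constant-pressure Darcy flow-front estimate.
    # All numbers are invented for illustration.
    import math

    def carreau_viscosity(shear_rate, eta0=1.0, eta_inf=0.05, lam=0.5, n=0.6):
        """Carreau model: eta = eta_inf + (eta0 - eta_inf)*(1 + (lam*g)^2)^((n-1)/2)."""
        return eta_inf + (eta0 - eta_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

    def flow_front(t, dp=1.0e5, K=1.0e-10, porosity=0.5, shear_rate=10.0):
        """1-D Darcy flow front position x_f = sqrt(2*K*dp*t / (phi*eta)) for
        constant-pressure injection, using an effective (constant) viscosity."""
        eta = carreau_viscosity(shear_rate)
        return math.sqrt(2.0 * K * dp * t / (porosity * eta))

    for t in (10.0, 60.0, 300.0):   # seconds
        print(f"t = {t:5.0f} s : flow front ~ {flow_front(t)*100:.1f} cm")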

  10. A Comparative of business process modelling techniques

    Science.gov (United States)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era there are many business process modelling techniques. This article researches the differences between these techniques: for each technique, the definition and the structure are explained. The paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on 2 criteria: notation and how it works when implemented in Somerleyton Animal Park. Each technique is discussed together with its advantages and disadvantages. The final conclusion recommends business process modelling techniques that are easy to use and that serve as a basis for evaluating further modelling techniques.

  11. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    Full Text Available When developing an information system, it is important to create clear models and to choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages. The article presents a theoretical comparison of business rule and business process modeling languages. Based on selected modeling aspects, it compares the different business process modeling languages and business rule representation languages. Finally, the best-fitting set of languages is selected for a three-layer framework for business-rule-based software modeling.

  12. Information-Processing Models and Curriculum Design

    Science.gov (United States)

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  13. How visual cognition influences process model comprehension

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2017-01-01

    Process analysts and other professionals extensively use process models to analyze business processes and identify performance improvement opportunities. Therefore, it is important that such models can be easily and properly understood. Previous research has mainly focused on two types of factors

  14. Social software for business process modeling

    NARCIS (Netherlands)

    Koschmider, A.; Song, M.S.; Reijers, H.A.

    2010-01-01

    Formal models of business processes are used for a variety of purposes. But where the elicitation of the characteristics of a business process usually takes place in a collaborative fashion, the building of the final, formal process model is done mostly by a single person. This article presents the

  15. Difference-based Model Synchronization in an Industrial MDD Process

    DEFF Research Database (Denmark)

    Könemann, Patrick; Kindler, Ekkart; Unland, Ludger

    2009-01-01

    Models play a central role in model-driven software engineering. There are different kinds of models during the development process, which are related to each other and change over time. Therefore, it is difficult to keep the different models consistent with each other. Consistency of different m...... model versions, and for synchronizing other types of models. The main concern is to apply our concepts to an industrial process, in particular keeping usability and performance in mind. Keyword: Model Differencing, Model Merging, Model Synchronization...
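
    A minimal sketch of the difference-based idea, independent of the authors' tooling: compute the difference between two versions of a model and replay it onto another model to keep the two in sync. Representing models as flat attribute dictionaries is a drastic simplification made only for illustration.

    # Difference-based synchronization sketch: diff two versions of a model and
    # apply (merge) that diff onto another model. Models are flat dicts of
    # element-id -> value, which is a drastic simplification of real metamodels.
    def diff(old, new):
        """Return the changes needed to turn `old` into `new`."""
        added   = {k: v for k, v in new.items() if k not in old}
        removed = [k for k in old if k not in new]
        changed = {k: v for k, v in new.items() if k in old and old[k] != v}
        return added, removed, changed

    def apply_diff(target, delta):
        """Replay a diff onto another model (last-writer-wins on conflicts)."""
        added, removed, changed = delta
        merged = dict(target)
        merged.update(added)
        merged.update(changed)
        for k in removed:
            merged.pop(k, None)
        return merged

    design_v1 = {"ClassA.name": "A", "ClassA.attr": "x", "ClassB.name": "B"}
    design_v2 = {"ClassA.name": "A", "ClassA.attr": "y", "ClassC.name": "C"}  # B removed, attr changed, C added
    implementation_model = {"ClassA.name": "A", "ClassA.attr": "x", "ClassB.name": "B", "ClassB.impl": "stub"}

    delta = diff(design_v1, design_v2)
    print(apply_diff(implementation_model, delta))
    # ClassA.attr updated, ClassC added, ClassB.name removed while ClassB.impl is kept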

  16. Advances in the Process Development of Biocatalytic Processes

    DEFF Research Database (Denmark)

    Tufvesson, Pär; Lima Ramos, Joana; Al-Haque, Naweed

    2013-01-01

    Biocatalysis is already established in chemical synthesis on an industrial scale, in particular in the pharmaceutical sector. However, the wider implementation of biocatalysis is currently hindered by the extensive effort required to develop a competitive process. In order that resources spent...

  17. Spherical Process Models for Global Spatial Statistics

    KAUST Repository

    Jeong, Jaehong

    2017-11-28

    Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture the spatial and temporal behavior of these global data sets. Though the geodesic distance is the most natural metric for measuring distance on the surface of a sphere, mathematical limitations have compelled statisticians to use the chordal distance to compute the covariance matrix in many applications instead, which may cause physically unrealistic distortions. Therefore, covariance functions directly defined on a sphere using the geodesic distance are needed. We discuss the issues that arise when dealing with spherical data sets on a global scale and provide references to recent literature. We review the current approaches to building process models on spheres, including the differential operator, the stochastic partial differential equation, the kernel convolution, and the deformation approaches. We illustrate realizations obtained from Gaussian processes with different covariance structures and the use of isotropic and nonstationary covariance models through deformations and geographical indicators for global surface temperature data. To assess the suitability of each method, we compare their log-likelihood values and prediction scores, and we end with a discussion of related research problems.
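
    To illustrate the metric distinction the authors discuss, the sketch below computes both the chordal and the great-circle (geodesic) distance between two points on the unit sphere and evaluates an exponential covariance with each; the exponential family remains valid with the geodesic distance on a sphere, which makes it a safe choice for such a comparison. The locations and covariance parameters are arbitrary.

    # Chordal vs geodesic distance on the unit sphere, and an exponential
    # covariance C(d) = sigma^2 * exp(-d / range) evaluated with each metric.
    # (The exponential family stays positive definite on the sphere when used
    # with the geodesic distance, so the comparison is legitimate.)
    import numpy as np

    def to_xyz(lat_deg, lon_deg):
        lat, lon = np.radians(lat_deg), np.radians(lon_deg)
        return np.array([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)])

    def chordal(p, q):
        return np.linalg.norm(to_xyz(*p) - to_xyz(*q))       # straight-line distance through the sphere

    def geodesic(p, q):
        cos_angle = np.clip(np.dot(to_xyz(*p), to_xyz(*q)), -1.0, 1.0)
        return np.arccos(cos_angle)                           # great-circle distance on the unit sphere

    def exp_cov(d, sigma2=1.0, corr_range=0.8):
        return sigma2 * np.exp(-d / corr_range)

    a, b = (10.0, 20.0), (-40.0, 150.0)                       # (latitude, longitude) in degrees
    dc, dg = chordal(a, b), geodesic(a, b)
    print(f"chordal  d = {dc:.3f}, C = {exp_cov(dc):.3f}")
    print(f"geodesic d = {dg:.3f}, C = {exp_cov(dg):.3f}")    # geodesic >= chordal, so weaker covariance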

  18. Reference model for apparel product development

    Directory of Open Access Journals (Sweden)

    Isabel Cristina Moretti

    2017-03-01

    Full Text Available The purpose of this paper was to develop a reference model for the implementation of the process of product development (PDP) for apparel. The tool was developed through an interactive process of theoretical comparison. Managers and professionals working in this market can utilize the reference model as a source for the organization and improvement of the apparel PDP, and universities can use it as a reference source for the systematized teaching of this process. This model represents the first comprehensive attempt to develop an instrument at a detailed level (macro phases, phases, activities, inputs and outputs at each stage and at the gates) to systematize the PDP for fashion products and to consider its particularities.

  19. Creation of a competency-based professional development program for infection preventionists guided by the APIC Competency Model: steps in the process.

    Science.gov (United States)

    Bernard, Heather; Hackbarth, Diana; Olmsted, Russell N; Murphy, Denise

    2018-06-07

    Infection preventionists (IPs) have varying levels of educational preparation, and many have no prior experience in infection prevention. This diversity makes the design of professional development programs challenging. Recent surveys suggest that only about half of practicing IPs are board certified. There is an urgent need to employ competent IPs to drive improvement in patient outcomes. This project utilized the APIC Competency Model to create a professional development program characterizing three career stages. Methods included a review of the literature on professional development; a survey of IP competence; an assessment of job descriptions and performance evaluations; and a crosswalk of IP competencies. The professional development program includes competency-based IP job descriptions and performance evaluations for each career stage, a professional portfolio, and a toolkit for supervisors. Participants agreed that application of the model resulted in tools that are more closely aligned with current IP roles, and reported increased satisfaction and motivation with the new program. Competent and knowledgeable IPs are crucial to optimizing the efficacy of IPC programs. A professional development program has the potential to guide staff orientation, improve satisfaction and retention, improve patient outcomes and promote a positive trajectory in advancing practice. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  20. Steady-State Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    illustrate the “equation oriented” approach as well as the “sequential modular” approach to solving complex flowsheets for steady state applications. The applications include the Williams-Otto plant, the hydrodealkylation (HDA) of toluene, conversion of ethylene to ethanol and a bio-ethanol process....
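
    As a pocket-sized illustration of the two solution strategies (not the Williams-Otto or HDA flowsheets themselves), consider a mixer-reactor-splitter recycle: the sequential-modular approach tears the recycle stream and iterates unit by unit, while the equation-oriented approach hands all balance equations to a simultaneous nonlinear solver. Flows, conversion and recycle fraction below are invented.

    # Tiny recycle flowsheet (mixer -> reactor with conversion X -> splitter that
    # recycles a fraction R of unreacted feed) solved two ways: sequential-modular
    # (tear-stream successive substitution) and equation-oriented (simultaneous
    # solve of all balances). Numbers are invented for illustration.
    from scipy.optimize import fsolve

    FEED, X, R = 100.0, 0.6, 0.8   # fresh feed, reactor conversion, recycle fraction

    def units(recycle):
        """Run the units once for a guessed recycle flow; return stream values."""
        mixed = FEED + recycle                 # mixer
        unreacted = (1.0 - X) * mixed          # reactor outlet (unconverted part)
        new_recycle = R * unreacted            # splitter: fraction R goes back
        purge = (1.0 - R) * unreacted
        return mixed, unreacted, new_recycle, purge

    # --- sequential-modular: iterate on the torn recycle stream ---
    recycle = 0.0
    for _ in range(50):
        recycle = units(recycle)[2]            # successive substitution on the tear
    print(f"sequential-modular recycle: {recycle:.4f}")

    # --- equation-oriented: solve all balances simultaneously ---
    def residuals(v):
        mixed, unreacted, recycle = v
        return [mixed - (FEED + recycle),      # mixer balance
                unreacted - (1.0 - X) * mixed, # reactor balance
                recycle - R * unreacted]       # splitter balance

    mixed, unreacted, recycle_eo = fsolve(residuals, [FEED, FEED, 0.0])
    print(f"equation-oriented recycle:  {recycle_eo:.4f}")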