WorldWideScience

Sample records for model simplification process

  1. Towards simplification of hydrologic modeling: Identification of dominant processes

    Science.gov (United States)

    Markstrom, Steven; Hay, Lauren E.; Clark, Martyn P.

    2016-01-01

    The Precipitation–Runoff Modeling System (PRMS), a distributed-parameter hydrologic model, has been applied to the conterminous US (CONUS). Parameter sensitivity analysis was used to identify: (1) the sensitive input parameters and (2) particular model output variables that could be associated with the dominant hydrologic process(es). Sensitivity values of 35 PRMS calibration parameters were computed using the Fourier amplitude sensitivity test procedure on 110 000 independent hydrologically based spatial modeling units covering the CONUS and then summarized by process (snowmelt, surface runoff, infiltration, soil moisture, evapotranspiration, interflow, baseflow, and runoff) and model performance statistic (mean, coefficient of variation, and autoregressive lag 1). Identified parameters and processes provide insight into model performance at the location of each unit and allow the modeler to identify the most dominant process on the basis of which processes are associated with the most sensitive parameters. The results of this study indicate that: (1) the choice of performance statistic and output variables has a strong influence on parameter sensitivity, (2) the apparent model complexity to the modeler can be reduced by focusing on those processes that are associated with sensitive parameters and disregarding those that are not, (3) different processes require different numbers of parameters for simulation, and (4) some sensitive parameters influence only one hydrologic process, while others may influence many.
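The FAST screening summarized above ultimately estimates a first-order variance-based sensitivity index for each parameter. A minimal sketch of that idea (the toy runoff function and parameter names are invented, and a plain binned conditional-variance estimator stands in for the Fourier-based FAST machinery):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "hydrologic" response: runoff depends strongly on a melt factor,
# weakly on an interception parameter (names are illustrative only).
def runoff(melt_factor, interception, soil_k):
    return 5.0 * melt_factor + 0.5 * interception + 0.1 * soil_k**2

n = 20000
X = rng.uniform(0.0, 1.0, size=(n, 3))          # three parameters in [0, 1]
Y = runoff(X[:, 0], X[:, 1], X[:, 2])

def first_order_index(x, y, bins=20):
    """Estimate S_i = Var(E[Y|X_i]) / Var(Y) by binning X_i."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x) - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

# The melt factor dominates, mirroring how a FAST screen flags the
# parameters (and hence processes) that matter most.
S = [first_order_index(X[:, i], Y) for i in range(3)]
```

Ranking parameters by such indices is what lets the modeler discard processes whose parameters never score highly.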

  2. Towards simplification of hydrologic modeling: identification of dominant processes

    Directory of Open Access Journals (Sweden)

    S. L. Markstrom

    2016-11-01

    Full Text Available The Precipitation–Runoff Modeling System (PRMS), a distributed-parameter hydrologic model, has been applied to the conterminous US (CONUS). Parameter sensitivity analysis was used to identify: (1) the sensitive input parameters and (2) particular model output variables that could be associated with the dominant hydrologic process(es). Sensitivity values of 35 PRMS calibration parameters were computed using the Fourier amplitude sensitivity test procedure on 110 000 independent hydrologically based spatial modeling units covering the CONUS and then summarized by process (snowmelt, surface runoff, infiltration, soil moisture, evapotranspiration, interflow, baseflow, and runoff) and model performance statistic (mean, coefficient of variation, and autoregressive lag 1). Identified parameters and processes provide insight into model performance at the location of each unit and allow the modeler to identify the most dominant process on the basis of which processes are associated with the most sensitive parameters. The results of this study indicate that: (1) the choice of performance statistic and output variables has a strong influence on parameter sensitivity, (2) the apparent model complexity to the modeler can be reduced by focusing on those processes that are associated with sensitive parameters and disregarding those that are not, (3) different processes require different numbers of parameters for simulation, and (4) some sensitive parameters influence only one hydrologic process, while others may influence many.

  3. SAHM - Simplification of one-dimensional hydraulic networks by automated processes evaluated on 1D/2D deterministic flood models

    DEFF Research Database (Denmark)

    Löwe, Roland; Davidsen, Steffen; Thrysøe, Cecilie

    We present an algorithm for automated simplification of 1D pipe network models. The impact of the simplifications on the flooding simulated by coupled 1D-2D models is evaluated in an Australian case study. Significant reductions of the simulation time of the coupled model are achieved by reducing the 1D network model. The simplifications lead to an underestimation of flooded area because interaction points between network and surface are removed and because water is transported downstream faster. These effects can be mitigated by maintaining nodes in flood-prone areas in the simplification and by adjusting pipe roughness to increase transport times.

  4. Simplification of one-dimensional hydraulic networks by automated processes evaluated on 1D/2D deterministic flood models

    DEFF Research Database (Denmark)

    Davidsen, Steffen; Löwe, Roland; Thrysøe, Cecilie

    2017-01-01

    Evaluation of pluvial flood risk is often based on computations using 1D/2D urban flood models. However, guidelines on choice of model complexity are missing, especially for one-dimensional (1D) network models. This study presents a new automatic approach for simplification of 1D hydraulic network ...

  5. Recomputing Causality Assignments on Lumped Process Models When Adding New Simplification Assumptions

    Directory of Open Access Journals (Sweden)

    Antonio Belmonte

    2018-04-01

    Full Text Available This paper presents a new algorithm for the resolution of over-constrained lumped process systems, where partial differential equations of a continuous time and space model of the system are reduced into ordinary differential equations with a finite number of parameters and where the model equations outnumber the unknown model variables. Our proposal is aimed at the study and improvement of the algorithm proposed by Hangos-Szerkenyi-Tuza. This new algorithm improves the computational cost and solves some of the internal problems of the aforementioned algorithm in its original formulation. The proposed algorithm is based on parameter relaxation that can be modified easily. It retains the necessary information of the lumped process system to reduce the time cost after introducing changes during the system formulation. It also allows adjustment of the system formulations that change its differential index between simulations.
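Causality assignment of the kind discussed above can be viewed as a bipartite matching between equations and the unknowns they are solved for; in an over-constrained system, some equations remain unmatched and act as consistency constraints. A minimal sketch (the model equations are hypothetical, and this plain augmenting-path matching is not the Hangos-Szerkenyi-Tuza algorithm itself):

```python
# Causality assignment as bipartite matching (illustrative only): each
# equation is matched to one unknown it will be solved for; unmatched
# equations are the surplus constraints of the over-constrained system.

# incidence[e] = set of unknowns appearing in equation e (hypothetical model)
incidence = {
    "mass_balance":   {"level"},
    "energy_balance": {"level", "temperature"},
    "valve_eq":       {"outflow", "level"},
    "sensor_eq":      {"level"},              # extra equation -> over-constrained
}

def assign_causality(incidence):
    match = {}                                # unknown -> equation
    def augment(eq, seen):
        # Try to give `eq` an unknown, re-routing earlier assignments if needed.
        for var in incidence[eq]:
            if var in seen:
                continue
            seen.add(var)
            if var not in match or augment(match[var], seen):
                match[var] = eq
                return True
        return False
    unmatched = [eq for eq in incidence if not augment(eq, set())]
    return {eq: var for var, eq in match.items()}, unmatched

assignment, constraints = assign_causality(incidence)
# sensor_eq cannot be assigned an unknown: it over-constrains "level".
```

Recomputing the assignment after adding a simplification assumption amounts to re-running the matching on the modified incidence structure.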

  6. Terrain Simplification Research in Augmented Scene Modeling

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    As one of the most important tasks in augmented scene modeling, terrain simplification research has gained more and more attention. In this paper, we mainly focus on the point selection problem in terrain simplification using a triangulated irregular network. Based on the analysis and comparison of traditional importance measures for each input point, we put forward a new importance measure based on local entropy. The results demonstrate that the local entropy criterion performs better than the traditional methods. In addition, it can effectively overcome the "short-sight" problem associated with the traditional methods.
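The local-entropy importance measure can be sketched as follows (the window size, bin count, and toy terrain are assumptions for illustration, not the paper's settings):

```python
import numpy as np

def local_entropy(elev, i, j, window=1, bins=8):
    """Shannon entropy of elevation values in a (2w+1)^2 neighbourhood.
    High entropy = locally rough terrain = important point to keep."""
    patch = elev[max(i - window, 0):i + window + 1,
                 max(j - window, 0):j + window + 1].ravel()
    hist, _ = np.histogram(patch, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
flat  = np.ones((5, 5))                      # uniform plateau
rough = rng.normal(size=(5, 5))              # noisy, rugged patch

# Flat terrain carries no information and its points can be dropped;
# rough terrain scores high and its points are retained.
e_flat  = local_entropy(flat, 2, 2)
e_rough = local_entropy(rough, 2, 2)
```

Points would then be ranked by this score and the lowest-entropy ones removed first when building the triangulated irregular network.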

  7. An Agent Based Collaborative Simplification of 3D Mesh Model

    Science.gov (United States)

    Wang, Li-Rong; Yu, Bo; Hagiwara, Ichiro

    Large mesh models pose challenges for fast rendering and transmission over the Internet, and mesh models obtained with three-dimensional (3D) scanning technology are typically very large in data volume. This paper develops a mobile-agent-based collaborative environment on the development platform mobile-C. Communication among distributed agents includes capturing images of the visualized mesh model, annotating captured images, and instant messaging. Remote, collaborative simplification can thus be conducted efficiently over the Internet.

  8. Electric Power Distribution System Model Simplification Using Segment Substitution

    Energy Technology Data Exchange (ETDEWEB)

    Reiman, Andrew P.; McDermott, Thomas E.; Akcakaya, Murat; Reed, Gregory F.

    2018-05-01

    Quasi-static time-series (QSTS) simulation is used to simulate the behavior of distribution systems over long periods of time (typically hours to years). The technique involves repeatedly solving the load-flow problem for a distribution system model and is useful for distributed energy resource (DER) planning. When a QSTS simulation has a small time step and a long duration, the computational burden of the simulation can be a barrier to integration into utility workflows. One way to relieve the computational burden is to simplify the system model. The segment substitution method of simplifying distribution system models introduced in this paper offers model bus reduction of up to 98% with a simplification error as low as 0.2% (0.002 pu voltage). In contrast to existing methods of distribution system model simplification, which rely on topological inspection and linearization, the segment substitution method uses black-box segment data and an assumed simplified topology.
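The core idea of treating a network segment as a black box characterised only at its terminals can be sketched as follows (the voltage-drop stand-in, fit order, and numbers are invented for illustration; the paper's actual formulation is not reproduced here):

```python
import numpy as np

# Idea sketch (not the paper's implementation): characterise a feeder
# segment as a black box by fitting receiving-end voltage vs. load from
# a few solutions of a detailed model, then reuse the cheap fit inside
# a long quasi-static time-series (QSTS) loop.

r_pu = 0.02                                   # hypothetical segment resistance
def detailed_segment(v_send, p_load):
    """Stand-in for a detailed load-flow solve of one segment."""
    return v_send - r_pu * p_load / v_send    # simple voltage-drop model

# "Calibration" runs of the detailed model at a few operating points
p_cal = np.linspace(0.1, 1.0, 10)
v_cal = detailed_segment(1.0, p_cal)

# Black-box substitute: quadratic fit of V(p) at nominal sending voltage
substitute = np.poly1d(np.polyfit(p_cal, v_cal, 2))

# QSTS with the substitute: 8760 hourly load points
p_series = 0.5 + 0.4 * np.sin(np.linspace(0.0, 20.0, 8760))
v_series = substitute(p_series)

# Error vs. the detailed model stays negligible here because the toy
# segment response is smooth; real feeders would show larger errors.
err = np.max(np.abs(v_series - detailed_segment(1.0, p_series)))
```

The gain is that the substitute is evaluated once per time step instead of re-solving the detailed segment 8760 times.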

  9. Electric Power Distribution System Model Simplification Using Segment Substitution

    International Nuclear Information System (INIS)

    Reiman, Andrew P.; McDermott, Thomas E.; Akcakaya, Murat; Reed, Gregory F.

    2017-01-01

    Quasi-static time-series (QSTS) simulation is used to simulate the behavior of distribution systems over long periods of time (typically hours to years). The technique involves repeatedly solving the load-flow problem for a distribution system model and is useful for distributed energy resource (DER) planning. When a QSTS simulation has a small time step and a long duration, the computational burden of the simulation can be a barrier to integration into utility workflows. One way to relieve the computational burden is to simplify the system model. The segment substitution method of simplifying distribution system models introduced in this paper offers model bus reduction of up to 98% with a simplification error as low as 0.2% (0.002 pu voltage). Finally, in contrast to existing methods of distribution system model simplification, which rely on topological inspection and linearization, the segment substitution method uses black-box segment data and an assumed simplified topology.

  10. Modeling fructose-load-induced hepatic de-novo lipogenesis by model simplification

    Directory of Open Access Journals (Sweden)

    Richard J Allen

    2017-03-01

    Full Text Available Hepatic de-novo lipogenesis is a metabolic process implemented in the pathogenesis of type 2 diabetes. Clinically, the rate of this process can be ascertained by use of labeled acetate and stimulation by fructose administration. A systems pharmacology model of this process is desirable because it facilitates the description, analysis, and prediction of this experiment. Due to the multiple enzymes involved in de-novo lipogenesis, and the limited data, it is desirable to use single functional expressions to encapsulate the flux between multiple enzymes. To accomplish this we developed a novel simplification technique which uses the available information about the properties of the individual enzymes to bound the parameters of a single governing ‘transfer function’. This method should be applicable to any model with linear chains of enzymes that are well stimulated. We validated this approach with computational simulations and analytical justification in a limiting case. Using this technique we generated a simple model of hepatic de-novo lipogenesis in these experimental conditions that matched prior data. This model can be used to assess pharmacological intervention at specific points on this pathway. We have demonstrated this with prospective simulation of acetyl-CoA carboxylase inhibition. This simplification technique suggests how the constituent properties of an enzymatic chain of reactions give rise to the sensitivity (to substrate) of the pathway as a whole.

  11. Modeling fructose-load-induced hepatic de-novo lipogenesis by model simplification.

    Science.gov (United States)

    Allen, Richard J; Musante, Cynthia J

    2017-01-01

    Hepatic de-novo lipogenesis is a metabolic process implemented in the pathogenesis of type 2 diabetes. Clinically, the rate of this process can be ascertained by use of labeled acetate and stimulation by fructose administration. A systems pharmacology model of this process is desirable because it facilitates the description, analysis, and prediction of this experiment. Due to the multiple enzymes involved in de-novo lipogenesis, and the limited data, it is desirable to use single functional expressions to encapsulate the flux between multiple enzymes. To accomplish this we developed a novel simplification technique which uses the available information about the properties of the individual enzymes to bound the parameters of a single governing 'transfer function'. This method should be applicable to any model with linear chains of enzymes that are well stimulated. We validated this approach with computational simulations and analytical justification in a limiting case. Using this technique we generated a simple model of hepatic de-novo lipogenesis in these experimental conditions that matched prior data. This model can be used to assess pharmacological intervention at specific points on this pathway. We have demonstrated this with prospective simulation of acetyl-CoA carboxylase inhibition. This simplification technique suggests how the constituent properties of an enzymatic chain of reactions give rise to the sensitivity (to substrate) of the pathway as a whole.
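The chain-collapsing idea can be illustrated with a toy two-enzyme chain: when one enzyme is clearly rate-limiting, the whole chain behaves like a single step with that enzyme's kinetics. A minimal sketch (the Michaelis-Menten forms, rate constants, and Euler integration are assumptions for illustration, not the paper's calibrated model):

```python
import numpy as np

def mm(vmax, km, s):
    """Michaelis-Menten rate."""
    return vmax * s / (km + s)

dt, t_end = 0.01, 50.0
steps = int(t_end / dt)

# Full chain: substrate S -> intermediate I -> product P
S, I, P = 10.0, 0.0, 0.0
for _ in range(steps):
    f1 = mm(1.0, 2.0, S)          # enzyme 1 (fast)
    f2 = mm(0.5, 1.0, I)          # enzyme 2 (slow, rate-limiting)
    S += -f1 * dt
    I += (f1 - f2) * dt
    P += f2 * dt

# Simplified model: one step S -> P whose "transfer function" takes the
# parameters of the rate-limiting enzyme (the bounding idea in spirit).
S2, P2 = 10.0, 0.0
for _ in range(steps):
    f = mm(0.5, 1.0, S2)
    S2 += -f * dt
    P2 += f * dt

# Product time courses agree closely once the intermediate equilibrates.
rel_err = abs(P - P2) / P2
```

This is why bounding the single transfer function's parameters by the constituent enzymes' properties is enough to reproduce the pathway's overall substrate sensitivity.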

  12. Simplification of Process Integration Studies in Intermediate Size Industries

    DEFF Research Database (Denmark)

    Dalsgård, Henrik; Petersen, P. M.; Qvale, Einar Bjørn

    2002-01-01

    It can be argued that the largest potential for energy savings based on process integration is in the intermediate size industry. But this is also the industrial scale in which it is most difficult to make the introduction of energy saving measures economically interesting. The reasons … associated with a given process integration study in an intermediate size industry. This is based on the observation that the systems that eventually result from a process integration project and that are economically and operationally most interesting are also quite simple. Four steps that may be used … and therefore lead to non-optimal economic solutions, which may be right. But the objective of the optimisation is not to reach the best economic solution, but to relatively quickly develop the design of a simple and operationally friendly network without losing too much energy saving potential. (C) 2002
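A process integration study of this kind typically starts from a pinch-analysis targeting step. A minimal sketch of the standard problem-table cascade (the stream data are invented; real studies would use plant measurements):

```python
# Problem-table step of a pinch analysis: shift stream temperatures by
# dTmin/2, balance heat over temperature intervals, and cascade the
# surpluses to find the minimum hot utility and the pinch temperature.

dTmin = 10.0
# (kind, T_supply, T_target, CP [kW/K]) -- invented stream data
streams = [
    ("hot",  180.0,  60.0, 2.0),
    ("hot",  150.0,  30.0, 1.5),
    ("cold",  20.0, 150.0, 2.5),
    ("cold",  80.0, 140.0, 1.75),
]

def shifted(kind, T):
    return T - dTmin / 2 if kind == "hot" else T + dTmin / 2

# Shifted temperature interval boundaries, hottest first
temps = sorted({shifted(k, t) for k, ts, tt, cp in streams for t in (ts, tt)},
               reverse=True)

cascade, heat = [0.0], 0.0
for hi, lo in zip(temps, temps[1:]):
    net_cp = 0.0
    for kind, ts, tt, cp in streams:
        a, b = shifted(kind, ts), shifted(kind, tt)
        if min(a, b) <= lo and max(a, b) >= hi:      # stream spans interval
            net_cp += cp if kind == "hot" else -cp
    heat += net_cp * (hi - lo)
    cascade.append(heat)

# The most negative cascade value sets the minimum hot utility; the
# interval boundary where it occurs is the (shifted) pinch temperature.
q_hot_min = max(0.0, -min(cascade))
pinch_T = temps[cascade.index(min(cascade))]
```

Simplified studies of the kind the paper advocates would stop at such targets rather than optimise a full heat-exchanger network.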

  13. Surface Simplification of 3D Animation Models Using Robust Homogeneous Coordinate Transformation

    Directory of Open Access Journals (Sweden)

    Juin-Ling Tseng

    2014-01-01

    Full Text Available The goal of 3D surface simplification is to reduce the storage cost of 3D models. A 3D animation model typically consists of several 3D models. Therefore, to ensure that animation models are realistic, numerous triangles are often required. However, animation models that have a high storage cost have a substantial computational cost. Hence, surface simplification methods are adopted to reduce the number of triangles and computational cost of 3D models. Quadric error metrics (QEM has recently been identified as one of the most effective methods for simplifying static models. To simplify animation models by using QEM, Mohr and Gleicher summed the QEM of all frames. However, homogeneous coordinate problems cannot be considered completely by using QEM. To resolve this problem, this paper proposes a robust homogeneous coordinate transformation that improves the animation simplification method proposed by Mohr and Gleicher. In this study, the root mean square errors of the proposed method were compared with those of the method proposed by Mohr and Gleicher, and the experimental results indicated that the proposed approach can preserve more contour features than Mohr’s method can at the same simplification ratio.
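The quadric error metric at the heart of the method above assigns each vertex a matrix summing the squared distances to its incident triangle planes; edge-collapse cost is then a quadratic form. A minimal sketch of the static-model case (the animation variant sums such quadrics per frame):

```python
import numpy as np

def plane_quadric(p, q, r):
    """Fundamental quadric K = n n^T for the plane through triangle pqr,
    with n = (a, b, c, d), ax + by + cz + d = 0 and (a, b, c) unit length."""
    normal = np.cross(q - p, r - p)
    normal = normal / np.linalg.norm(normal)
    d = -normal.dot(p)
    n = np.append(normal, d)
    return np.outer(n, n)

def vertex_error(Q, v):
    """Quadric error v^T Q v with v in homogeneous coordinates."""
    vh = np.append(v, 1.0)
    return float(vh @ Q @ vh)

# Two triangles sharing an edge, both lying in the z = 0 plane
p0, p1, p2, p3 = map(np.array, ([0., 0., 0.], [1., 0., 0.],
                                [0., 1., 0.], [1., 1., 0.]))
Q = plane_quadric(p0, p1, p2) + plane_quadric(p1, p3, p2)

# Collapsing the shared edge to a point still on the plane costs nothing;
# moving the new vertex off the plane costs the summed squared distances.
cost_on_plane  = vertex_error(Q, np.array([0.5, 0.5, 0.0]))
cost_off_plane = vertex_error(Q, np.array([0.5, 0.5, 1.0]))
```

The homogeneous coordinate (the appended 1) is exactly where the transformation issue the paper addresses arises.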

  14. A new model for the simplification of particle counting data

    Directory of Open Access Journals (Sweden)

    M. F. Fadal

    2012-06-01

    Full Text Available This paper proposes a three-parameter mathematical model to describe the particle size distribution in a water sample. The proposed model offers some conceptual advantages over two other models reported on previously, and also provides a better fit to the particle counting data obtained from 321 water samples taken over three years at a large South African drinking water supplier. Using the data from raw water samples taken from a moderately turbid, large surface impoundment, as well as samples from the same water after treatment, typical ranges of the model parameters are presented for both raw and treated water. Once calibrated, the model allows the calculation and comparison of total particle number and volumes over any randomly selected size interval of interest.
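The abstract does not reproduce the model's functional form, so as a generic illustration of fitting a three-parameter particle size distribution, the sketch below fits the (assumed, not the paper's) form N(d) = A · d^-beta · exp(-d/lambda) by linear least squares on log counts:

```python
import numpy as np

rng = np.random.default_rng(7)
d = np.array([2., 3., 5., 7., 10., 15., 25., 50., 100.])    # size bins, um
true = 1e5 * d**-2.1 * np.exp(-d / 40.0)
counts = true * rng.lognormal(0.0, 0.05, d.size)            # noisy "data"

# Linearise: log N = log A - beta*log d - d/lam  ->  linear in 3 unknowns
X = np.column_stack([np.ones_like(d), -np.log(d), -d])
theta, *_ = np.linalg.lstsq(X, np.log(counts), rcond=None)
logA, beta, inv_lam = theta

# Once calibrated, particle number over any size interval follows by
# summing or integrating the fitted N(d), as the paper describes.
```

The payoff of such a parametric model is exactly the one claimed above: three numbers replace a full particle-count histogram, and any size interval can be compared between samples.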

  15. Simplification of an MCNP model designed for dose rate estimation

    Science.gov (United States)

    Laptev, Alexander; Perry, Robert

    2017-09-01

    A study was made to investigate the methods of building a simplified MCNP model for radiological dose estimation. The research was done using an example of a complicated glovebox with extra shielding. The paper presents several different calculations for neutron and photon dose evaluations where glovebox elements were consecutively excluded from the MCNP model. The analysis indicated that to obtain a fast and reasonable estimation of dose, the model should be realistic in details that are close to the tally. Other details may be omitted.

  16. Simplification of an MCNP model designed for dose rate estimation

    Directory of Open Access Journals (Sweden)

    Laptev Alexander

    2017-01-01

    Full Text Available A study was made to investigate the methods of building a simplified MCNP model for radiological dose estimation. The research was done using an example of a complicated glovebox with extra shielding. The paper presents several different calculations for neutron and photon dose evaluations where glovebox elements were consecutively excluded from the MCNP model. The analysis indicated that to obtain a fast and reasonable estimation of dose, the model should be realistic in details that are close to the tally. Other details may be omitted.

  17. Ecosystem models are by definition simplifications of the real ...

    African Journals Online (AJOL)

    spamer

    to calculate changes in total phytoplankton vegetative biomass with time … into account when modelling phytoplankton population dynamics. … Then, the means whereby the magnitude of … There was increased heat input and slight stratification from mid to … conditions must be optimal and the water should be extremely …

  18. Effects of model layer simplification using composite hydraulic properties

    Science.gov (United States)

    Kuniansky, Eve L.; Sepulveda, Nicasio; Elango, Lakshmanan

    2011-01-01

    Groundwater provides much of the fresh drinking water to more than 1.5 billion people in the world (Clarke et al., 1996), and in the United States more than 50 percent of citizens rely on groundwater for drinking water (Solley et al., 1998). As aquifer systems are developed for water supply, the hydrologic system is changed. Water pumped from the aquifer system initially can come from some combination of inducing more recharge, water permanently removed from storage, and decreased groundwater discharge. Once a new equilibrium is achieved, all of the pumpage must come from induced recharge and decreased discharge (Alley et al., 1999). Further development of groundwater resources may result in reductions of surface water runoff and base flows. Competing demands for groundwater resources require good management. Adequate data to characterize the aquifers and confining units of the system, such as hydrologic boundaries, groundwater levels, streamflow, groundwater pumping, and climatic data for recharge estimation, must be collected in order to quantify the effects of groundwater withdrawals on wetlands, streams, and lakes. Once such data are collected, three-dimensional (3D) groundwater flow models can be developed, calibrated, and used as a tool for groundwater management. The main hydraulic parameters that comprise a regional or subregional model of an aquifer system are the hydraulic conductivity and storage properties of the aquifers and confining units (hydrogeologic units) that confine the system. Many 3D groundwater flow models used to help assess groundwater/surface-water interactions require calculating "effective" or composite hydraulic properties of multilayered lithologic units within a hydrogeologic unit. The calculation of composite hydraulic properties stems from the need to characterize groundwater flow using coarse model layering in order to reduce simulation times while still representing the flow through the system accurately. The accuracy of flow models with …
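The standard composites for collapsing several lithologic layers into one model layer are the thickness-weighted arithmetic mean for horizontal conductivity (layers in parallel) and the thickness-weighted harmonic mean for vertical conductivity (layers in series). A minimal sketch with invented layer data:

```python
import numpy as np

# Composite hydraulic properties for one coarse model layer made of
# three lithologic layers (thicknesses and conductivities invented).
b = np.array([10.0, 2.0, 25.0])        # layer thicknesses, m
k = np.array([30.0, 0.01, 15.0])       # hydraulic conductivities, m/d

kh = np.sum(b * k) / b.sum()           # horizontal: flow in parallel
kv = b.sum() / np.sum(b / k)           # vertical: flow in series

# A thin clay layer barely changes Kh but controls Kv: here the 2 m
# layer with K = 0.01 m/d dominates the harmonic mean.
```

This asymmetry is why layer simplification can be benign for lateral flow yet distort vertical leakage, the effect the paper examines.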

  19. Large regional groundwater modeling - a sensitivity study of some selected conceptual descriptions and simplifications

    International Nuclear Information System (INIS)

    Ericsson, Lars O.; Holmen, Johan

    2010-12-01

    The primary aim of this report is: - To present a supplementary, in-depth evaluation of certain conceptual simplifications, descriptions and model uncertainties in conjunction with regional groundwater simulation, which in the first instance refer to model depth, topography, groundwater table level and boundary conditions. Implementation was based on geo-scientifically available data compilations from the Smaaland region, but different conceptual assumptions have been analysed.

  20. Phonological simplifications, apraxia of speech and the interaction between phonological and phonetic processing.

    Science.gov (United States)

    Galluzzi, Claudia; Bureca, Ivana; Guariglia, Cecilia; Romani, Cristina

    2015-05-01

    Research on aphasia has struggled to identify apraxia of speech (AoS) as an independent deficit affecting a processing level separate from phonological assembly and motor implementation. This is because AoS is characterized by both phonological and phonetic errors and, therefore, can be interpreted as a combination of deficits at the phonological and the motoric level rather than as an independent impairment. We apply novel psycholinguistic analyses to the perceptually phonological errors made by 24 Italian aphasic patients. We show that only patients with a relatively high rate (>10%) of phonetic errors make sound errors which simplify the phonology of the target. Moreover, simplifications are strongly associated with other variables indicative of articulatory difficulties - such as a predominance of errors on consonants rather than vowels - but not with other measures - such as rate of words reproduced correctly or rates of lexical errors. These results indicate that sound errors cannot arise at a single phonological level because they are different in different patients. Instead, different patterns: (1) provide evidence for separate impairments and the existence of a level of articulatory planning/programming intermediate between phonological selection and motor implementation; (2) validate AoS as an independent impairment at this level, characterized by phonetic errors and phonological simplifications; (3) support the claim that linguistic principles of complexity have an articulatory basis since they only apply in patients with associated articulatory difficulties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Use of process indices for simplification of the description of vapor deposition systems

    International Nuclear Information System (INIS)

    Kajikawa, Yuya; Noda, Suguru; Komiyama, Hiroshi

    2004-01-01

    Vapor deposition is a complex process, including gas-phase, surface, and solid-phase phenomena. Because of the complexity of chemical and physical processes occurring in vapor deposition processes, it is difficult to form a comprehensive, fundamental understanding of vapor deposition and to control such systems for obtaining desirable structures and performance. To overcome this difficulty, we present a method for simplifying the complex description of such systems. One simplification method is to separate complex systems into multiple elements, and determine which of these are important elements. We call this method abridgement. The abridgement method retains only the dominant processes in a description of the system, and discards the others. Abridgement can be achieved by using process indices to evaluate the relative importance of the elementary processes. We describe the formulation and use of these process indices through examples of the growth of continuous films, initial deposition processes, and the formation of the preferred orientation of polycrystalline films. In this paper, we propose a method for representing complex vapor deposition processes as a set of simpler processes.

  2. Work Simplification

    Science.gov (United States)

    Ross, Lynne

    1970-01-01

    Excerpts from a talk by Mrs. Ross at the 23rd annual convention of the American School Food Service Association in Detroit, August 5, 1969. A book on work simplification by Mrs. Ross will be available in June from the Iowa State University Press, Ames, Iowa. (Editor)

  3. A New Approach to Line Simplification Based on Image Processing: A Case Study of Water Area Boundaries

    Directory of Open Access Journals (Sweden)

    Yilang Shen

    2018-01-01

    Full Text Available Line simplification is an important component of map generalization. In recent years, algorithms for line simplification have been widely researched, and most of them are based on vector data. However, with the increasing development of computer vision, analysing and processing information from unstructured image data is both meaningful and challenging. Therefore, in this paper, we present a new line simplification approach based on image processing (BIP), which is specifically designed for raster data. First, the key corner points on a multi-scale image feature are detected and treated as candidate points. Then, to capture the essence of the shape within a given boundary using the fewest possible segments, the minimum-perimeter polygon (MPP) is calculated and the points of the MPP are defined as the approximate feature points. Finally, the points after simplification are selected from the candidate points by comparing the distances between the candidate points and the approximate feature points. An empirical example was used to test the applicability of the proposed method. The results showed that (1) when the key corner points are detected based on a multi-scale image feature, the local features of the line can be extracted and retained and the positional accuracy of the proposed method can be maintained well; and (2) by defining the visibility constraint of geographical features, this method is especially suitable for simplifying water areas as it is aligned with people’s visual habits.
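The final selection step described above, keeping the detected candidate corner closest to each approximate feature point of the MPP, can be sketched in a few lines (the coordinates are invented; corner detection and MPP extraction are assumed to have run already):

```python
import numpy as np

# Candidate corner points from multi-scale detection, and the vertices
# of the minimum-perimeter polygon (both invented for illustration).
candidates = np.array([[0, 0], [3, 1], [5, 5], [9, 4], [12, 0]], float)
mpp_points = np.array([[0, 0], [5, 4], [12, 0]], float)

def select(candidates, mpp_points):
    """Keep the candidate nearest to each MPP feature point."""
    keep = []
    for f in mpp_points:
        d = np.linalg.norm(candidates - f, axis=1)
        keep.append(int(d.argmin()))
    return candidates[sorted(set(keep))]

simplified = select(candidates, mpp_points)
# -> three of the five candidates survive, one per MPP vertex
```

Because the surviving points are genuine detected corners rather than MPP vertices, positional accuracy is inherited from the detection step.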

  4. A study of modelling simplifications in ground vibration predictions for railway traffic at grade

    Science.gov (United States)

    Germonpré, M.; Degrande, G.; Lombaert, G.

    2017-10-01

    Accurate computational models are required to predict ground-borne vibration due to railway traffic. Such models generally require a substantial computational effort. Therefore, much research has focused on developing computationally efficient methods, by either exploiting the regularity of the problem geometry in the direction along the track or assuming a simplified track structure. This paper investigates the modelling errors caused by commonly made simplifications of the track geometry. A case study is presented investigating a ballasted track in an excavation. The soil underneath the ballast is stiffened by a lime treatment. First, periodic track models with different cross sections are analyzed, revealing that a prediction of the rail receptance only requires an accurate representation of the soil layering directly underneath the ballast. A much more detailed representation of the cross sectional geometry is required, however, to calculate vibration transfer from track to free field. Second, simplifications in the longitudinal track direction are investigated by comparing 2.5D and periodic track models. This comparison shows that the 2.5D model slightly overestimates the track stiffness, while the transfer functions between track and free field are well predicted. Using a 2.5D model to predict the response during a train passage leads to an overestimation of both train-track interaction forces and free field vibrations. A combined periodic/2.5D approach is therefore proposed in this paper. First, the dynamic axle loads are computed by solving the train-track interaction problem with a periodic model. Next, the vibration transfer to the free field is computed with a 2.5D model. This combined periodic/2.5D approach only introduces small modelling errors compared to an approach in which a periodic model is used in both steps, while significantly reducing the computational cost.

  5. Reconstruction and simplification of urban scene models based on oblique images

    Science.gov (United States)

    Liu, J.; Guo, B.

    2014-08-01

    We describe multi-view stereo reconstruction and simplification algorithms for urban scene models based on oblique images. The complexity, diversity, and density of urban scenes increase the difficulty of building city models from oblique images, but many flat surfaces exist in such scenes. One of our key contributions is a dense matching algorithm based on self-adaptive patches designed for urban scenes. The basic idea of match propagation based on self-adaptive patches is to build patches centred on seed points that are already matched. The extent and shape of the patches adapt automatically to the objects of the urban scene: where the surface is flat, the patch grows larger; where the surface is rough, the patch becomes smaller. The other contribution is that the mesh generated by graph cuts is a 2-manifold surface satisfying the half-edge data structure, obtained by clustering and re-marking tetrahedrons in the s-t graph. The purpose of obtaining a 2-manifold surface is to simplify the mesh with an edge-collapse algorithm that can preserve and accentuate the features of buildings.

  6. Influence of the degree of simplification of the two-phase hydrodynamic model on the simulated behaviour dynamics of a steam generator

    International Nuclear Information System (INIS)

    Dupont, J.F.

    1979-03-01

    The principal simplifications of a mathematical model for the simulation of behaviour dynamics of a two-phase flow with heat exchange are examined, as it appears in a steam generator. The theoretical considerations and numerical solutions permit the evaluation of the validity limits and the influence of these simplifications on the results. (G.T.H.)

  7. Hybrid stochastic simplifications for multiscale gene networks

    Directory of Open Access Journals (Sweden)

    Debussche Arnaud

    2009-09-01

    Full Text Available Abstract Background Stochastic simulation of gene networks by Markov processes has important applications in molecular biology. The complexity of exact simulation algorithms scales with the number of discrete jumps to be performed. Approximate schemes reduce the computational time by reducing the number of simulated discrete events. Also, answering important questions about the relation between network topology and intrinsic noise generation and propagation should be based on general mathematical results. These general results are difficult to obtain for exact models. Results We propose a unified framework for hybrid simplifications of Markov models of multiscale stochastic gene networks dynamics. We discuss several possible hybrid simplifications, and provide algorithms to obtain them from pure jump processes. In hybrid simplifications, some components are discrete and evolve by jumps, while other components are continuous. Hybrid simplifications are obtained by partial Kramers-Moyal expansion, which is equivalent to the application of the central limit theorem to a sub-model. By averaging and variable aggregation we drastically reduce simulation time and eliminate non-critical reactions. Hybrid and averaged simplifications can be used for more effective simulation algorithms and for obtaining general design principles relating noise to topology and time scales. The simplified models reproduce with good accuracy the stochastic properties of the gene networks, including waiting times in intermittence phenomena, fluctuation amplitudes and stationary distributions. The methods are illustrated on several gene network examples. Conclusion Hybrid simplifications can be used for onion-like (multi-layered) approaches to multi-scale biochemical systems, in which various descriptions are used at various scales. Sets of discrete and continuous variables are treated with different methods and are coupled together in a physically justified approach.
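The costly baseline that such hybrid simplifications approximate is the exact pure-jump (Gillespie) simulation. A minimal sketch on a two-state gene model (species, rates, and seed are invented for illustration):

```python
import numpy as np

# Exact stochastic simulation of a minimal gene network: the gene
# switches on/off, transcribes mRNA while on, and mRNA decays.
rng = np.random.default_rng(42)
k_on, k_off, k_tx, k_deg = 0.1, 0.05, 2.0, 0.1

def gillespie(t_end):
    t, gene_on, mrna = 0.0, 0, 0
    while t < t_end:
        rates = np.array([
            k_on * (1 - gene_on),    # gene activates
            k_off * gene_on,         # gene deactivates
            k_tx * gene_on,          # transcription
            k_deg * mrna,            # mRNA decay
        ])
        total = rates.sum()
        t += rng.exponential(1.0 / total)            # waiting time
        event = rng.choice(4, p=rates / total)       # which jump fires
        if event == 0:   gene_on = 1
        elif event == 1: gene_on = 0
        elif event == 2: mrna += 1
        else:            mrna -= 1
    return mrna

# In a hybrid simplification, the frequent mRNA jumps would be replaced
# by a continuous (Langevin/ODE) variable obtained via partial
# Kramers-Moyal expansion, while the slow gene switch stays discrete.
samples = [gillespie(500.0) for _ in range(20)]
```

The cost of the exact scheme scales with the number of jumps, which is exactly what the hybrid schemes cut by making fast species continuous.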

  8. Infrastructure Area Simplification Plan

    CERN Document Server

    Field, L.

    2011-01-01

    The infrastructure area simplification plan was presented at the 3rd EMI All Hands Meeting in Padova. This plan only affects the information and accounting systems as the other areas are new in EMI and hence do not require simplification.

  9. Simplification of the processing of milled aluminium powder and mechanical evaluation properties

    International Nuclear Information System (INIS)

    Cintas, J.; Rodriguez, J. A.; Gallardo, J. M.; Herrera, E. J.

    2001-01-01

    An alternative powder-metallurgy consolidation method for milled aluminium (M Al) powder, consisting of a double cycle of cold pressing and vacuum sintering, has been developed. The aim of the present investigation is to simplify this consolidation method from the original five steps to only three steps. This is possible because milled powders soften during degassing at high temperature. The mechanical properties of compacts (hardness at room and high temperature, ultimate tensile strength and elongation) obtained by the three-step and the five-step processing are comparable. This process could be of special interest for the manufacturing of large series of small parts, such as those used in the automotive industry. (Author) 10 refs

  10. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification.

    Science.gov (United States)

    Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer (REST)/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. 
Top balanced F-scores for gene, chemical and

  11. Analysis of Simplifications Applied in Vibration Damping Modelling for a Passive Car Shock Absorber

    Directory of Open Access Journals (Sweden)

    Łukasz Konieczny

    2016-01-01

    Full Text Available The paper presents results of research on hydraulic automotive shock absorbers. The considerations provided in the paper indicate certain flaws and simplifications resulting from the fact that damping characteristics are assumed to be a function of input velocity only, which is the case in simulation studies. An important aspect taken into account when determining parameters of damping performed by car shock absorbers at a testing station is the permissible range of characteristics of a shock absorber of the same type. The aim of this study was to determine damping characteristics that also account for the stroke value. The stroke and rotary velocities were selected so that, for different combinations, the same maximum linear velocity could be obtained. Thus the influence of excitation parameters, such as the stroke value, on force versus displacement and force versus velocity diagrams was determined. The 3D characteristics, presented as the damping surface as a function of stroke and linear velocity, were determined. An analysis of the results highlights the impact of such factors on the profile of closed-loop graphs of damping forces and point-type damping characteristics.
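For intuition about why different stroke/frequency combinations can share one maximum linear velocity, as the abstract describes, here is a minimal sketch for a sinusoidal excitation x(t) = (s/2)·sin(2πft), for which v_max = πfs; the stroke values are hypothetical:

```python
import math

def peak_velocity(stroke_m, freq_hz):
    """Peak linear velocity of a sinusoidal shock-absorber excitation
    x(t) = (stroke/2)*sin(2*pi*f*t):  v_max = pi * f * stroke."""
    return math.pi * freq_hz * stroke_m

# Different stroke/frequency combinations tuned to the same v_max = 0.5 m/s
# (stroke values are invented for illustration):
target = 0.5
for stroke in (0.025, 0.050, 0.100):
    freq = target / (math.pi * stroke)
    print(f"stroke={stroke*1000:.0f} mm  freq={freq:.2f} Hz  "
          f"v_max={peak_velocity(stroke, freq):.3f} m/s")
```

Comparing force-displacement loops across such combinations is exactly what separates a stroke-dependent 3D damping surface from the usual force-velocity curve.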

  12. Using subdivision surfaces and adaptive surface simplification algorithms for modeling chemical heterogeneities in geophysical flows

    Science.gov (United States)

    Schmalzl, Jörg; Loddoch, Alexander

    2003-09-01

    We present a new method for investigating the transport of an active chemical component in a convective flow. We apply a three-dimensional front tracking method using a triangular mesh. For the refinement of the mesh we use subdivision surfaces which have been developed over the last decade primarily in the field of computer graphics. We present two different subdivision schemes and discuss their applicability to problems related to fluid dynamics. For adaptive refinement we propose a weight function based on the length of triangle edge and the sum of the angles of the triangle formed with neighboring triangles. In order to remove excess triangles we apply an adaptive surface simplification method based on quadric error metrics. We test these schemes by advecting a blob of passive material in a steady state flow in which the total volume is well preserved over a long time. Since for time-dependent flows the number of triangles may increase exponentially in time we propose the use of a subdivision scheme with diffusive properties in order to remove the small scale features of the chemical field. By doing so we are able to follow the evolution of a heavy chemical component in a vigorously convecting field. This calculation is aimed at the fate of a heavy layer at the Earth's core-mantle boundary. Since the viscosity variation with temperature is of key importance we also present a calculation with a strongly temperature-dependent viscosity.
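The quadric-error-metric cost used for the surface simplification step can be sketched as follows — a minimal, hypothetical example in the spirit of Garland-Heckbert quadrics, not the authors' code:

```python
import numpy as np

def plane_quadric(p0, p1, p2):
    """Fundamental error quadric K = q q^T for the plane of one triangle,
    with q = (a, b, c, d) where ax + by + cz + d = 0 and (a,b,c) is the
    unit normal."""
    n = np.cross(p1 - p0, p2 - p0)
    n = n / np.linalg.norm(n)
    d = -np.dot(n, p0)
    q = np.append(n, d)
    return np.outer(q, q)

def vertex_error(Q, v):
    """Sum of squared distances from v to the planes accumulated in Q."""
    h = np.append(v, 1.0)
    return float(h @ Q @ h)

# Two coplanar triangles sharing an edge in the z=0 plane:
tris = [np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]]),
        np.array([[1., 0., 0.], [1., 1., 0.], [0., 1., 0.]])]
Q = sum(plane_quadric(*t) for t in tris)
print(vertex_error(Q, np.array([0.5, 0.5, 0.0])))   # on both planes -> 0
print(vertex_error(Q, np.array([0.5, 0.5, 1.0])))   # off-plane -> 2
```

An edge collapse is then placed at the position minimizing this quadratic form, and collapses are performed in order of increasing cost — which is why flat regions (zero error) are decimated first.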

  13. Homotopic Polygonal Line Simplification

    DEFF Research Database (Denmark)

    Deleuran, Lasse Kosetski

    This thesis presents three contributions to the area of polygonal line simplification, or simply line simplification. A polygonal path, or simply a path is a list of points with line segments between the points. A path can be simplified by morphing it in order to minimize some objective function...

  14. A model to predict element redistribution in unsaturated soil: Its simplification and validation

    International Nuclear Information System (INIS)

    Sheppard, M.I.; Stephens, M.E.; Davis, P.A.; Wojciechowski, L.

    1991-01-01

    A research model has been developed to predict the long-term fate of contaminants entering unsaturated soil at the surface through irrigation or atmospheric deposition, and/or at the water table through groundwater. The model, called SCEMR1 (Soil Chemical Exchange and Migration of Radionuclides, Version 1), uses Darcy's law to model water movement, and the soil solid/liquid partition coefficient, Kd, to model chemical exchange. SCEMR1 has been validated extensively on controlled field experiments with several soils, aeration statuses and the effects of plants. These validation results show that the model is robust and performs well. Sensitivity analyses identified soil Kd, annual effective precipitation, soil type and soil depth as the four most important model parameters. SCEMR1 consumes too much computer time for incorporation into a probabilistic assessment code. Therefore, we have used SCEMR1 output to derive a simple assessment model. The assessment model reflects the complexity of its parent code, and provides a more realistic description of contaminant transport in soils than would a compartment model. Comparison of the performance of the SCEMR1 research model, the simple SCEMR1 assessment model and the TERRA compartment model on a four-year soil-core experiment shows that the SCEMR1 assessment model generally provides conservative soil concentrations. (15 refs., 3 figs.)
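The Kd-based chemical exchange that SCEMR1 relies on is commonly summarized by a retardation factor; a hedged sketch with invented parameter values (not SCEMR1's actual implementation):

```python
def retardation_factor(kd_l_per_kg, bulk_density_kg_per_l, theta):
    """R = 1 + (rho_b * Kd) / theta -- how much more slowly a sorbing solute
    moves than the pore water under equilibrium Kd partitioning."""
    return 1.0 + bulk_density_kg_per_l * kd_l_per_kg / theta

def solute_velocity(darcy_flux_m_per_a, theta, R):
    """Pore-water velocity (Darcy flux / water content) scaled by retardation."""
    return darcy_flux_m_per_a / theta / R

# Illustrative values: Kd = 10 L/kg, bulk density 1.5 kg/L, water content 0.3,
# Darcy flux 0.3 m/a.
R = retardation_factor(kd_l_per_kg=10.0, bulk_density_kg_per_l=1.5, theta=0.3)
v = solute_velocity(darcy_flux_m_per_a=0.3, theta=0.3, R=R)
print(R, v)   # R = 51, v ~ 0.02 m/a
```

The strong sensitivity to Kd reported in the abstract follows directly from R appearing in the denominator of the contaminant velocity.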

  15. On the simplifications for the thermal modeling of tilting-pad journal bearings under thermoelastohydrodynamic regime

    DEFF Research Database (Denmark)

    Cerda Varela, Alejandro Javier; Fillon, Michel; Santos, Ilmar

    2012-01-01

    The relevance of accurately calculating the oil film temperature build-up when modeling tilting-pad journal bearings is well established within the literature on the subject. This work studies the feasibility of using a thermal model for the tilting-pad journal bearing which includes a simplified formulation for the heat transfer effects between oil film and pad surface. Such a simplified approach becomes necessary when modeling the behavior of tilting-pad journal bearings operating on a controllable lubrication regime. Three different simplified heat transfer models are tested, by comparing… The results are strongly dependent on the Reynolds number for the oil flow in the bearing. For bearings operating in the laminar regime, the decoupling of the oil film energy equation solving procedure, with no heat transfer terms included, from the pad heat conduction problem, where the oil film temperature is applied…

  16. Simplification and Validation of a Spectral-Tensor Model for Turbulence Including Atmospheric Stability

    Science.gov (United States)

    Chougule, Abhijit; Mann, Jakob; Kelly, Mark; Larsen, Gunner C.

    2018-02-01

    A spectral-tensor model of non-neutral, atmospheric-boundary-layer turbulence is evaluated using Eulerian statistics from single-point measurements of the wind speed and temperature at heights up to 100 m, assuming constant vertical gradients of mean wind speed and temperature. The model has been previously described in terms of the dissipation rate ε, the length scale of energy-containing eddies L, a turbulence anisotropy parameter Γ, the Richardson number Ri, and the normalized rate of destruction of temperature variance η_θ ≡ ε_θ/ε. Here, the latter two parameters are collapsed into a single atmospheric stability parameter z/L using Monin-Obukhov similarity theory, where z is the height above the Earth's surface, and L is the Obukhov length corresponding to Ri and η_θ. Model outputs of the one-dimensional velocity spectra, as well as cospectra of the streamwise and/or vertical velocity components, and/or temperature, and cross-spectra for the spatial separation of all three velocity components and temperature, are compared with measurements. As a function of the four model parameters, spectra and cospectra are reproduced quite well, but horizontal temperature fluxes are slightly underestimated in stable conditions. In moderately unstable stratification, our model reproduces spectra only up to a scale of ~1 km. The model also overestimates coherences for vertical separations, but is less severe in unstable than in stable cases.
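The stability parameter z/L from Monin-Obukhov similarity theory can be computed as in this minimal sketch; the friction velocity, mean temperature and heat flux values are assumed for illustration:

```python
import math

KAPPA = 0.4   # von Karman constant
G = 9.81      # gravitational acceleration, m/s^2

def obukhov_length(u_star, T_mean, w_theta_flux):
    """Obukhov length L = -u*^3 T / (kappa g <w'theta'>); negative in
    unstable and positive in stable stratification."""
    return -u_star**3 * T_mean / (KAPPA * G * w_theta_flux)

def stability_parameter(z, L):
    """Dimensionless stability z/L at height z."""
    return z / L

# Unstable daytime example (invented values): u* = 0.5 m/s, T = 293 K,
# kinematic heat flux 0.1 K m/s, evaluated at z = 40 m.
L = obukhov_length(u_star=0.5, T_mean=293.0, w_theta_flux=0.1)
zL = stability_parameter(40.0, L)
print(L, zL)
```

Collapsing Ri and η_θ into this single z/L is what reduces the model to the four parameters (ε, L, Γ, z/L) listed in the abstract.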

  17. Modeling canopy-level productivity: is the "big-leaf" simplification acceptable?

    Science.gov (United States)

    Sprintsin, M.; Chen, J. M.

    2009-05-01

    The "big-leaf" approach to calculating the carbon balance of plant canopies assumes that canopy carbon fluxes have the same relative responses to the environment as any single unshaded leaf in the upper canopy. Widely used light-use-efficiency models are essentially simplified versions of the big-leaf model. Despite its wide acceptance, subsequent developments in the modeling of leaf photosynthesis and measurements of canopy physiology have called the assumptions behind this approach into question, showing that the big-leaf approximation is inadequate for simulating canopy photosynthesis because of the additional leaf-internal control on carbon assimilation, the non-linear response of photosynthesis to leaf nitrogen and absorbed light, and changes in the leaf microenvironment with canopy depth. To avoid this problem, a sunlit/shaded leaf separation approach, within which the vegetation is treated as two big leaves under different illumination conditions, is gradually replacing the "big-leaf" strategy for applications at local and regional scales. Such separation is now widely accepted as a more accurate and physiologically based approach for modeling canopy photosynthesis. Here we compare both strategies for Gross Primary Production (GPP) modeling using the Boreal Ecosystem Productivity Simulator (BEPS) at local (tower footprint) scale for different land cover types spread over North America: two broadleaf forests (Harvard, Massachusetts and Missouri Ozark, Missouri); two coniferous forests (Howland, Maine and Old Black Spruce, Saskatchewan); the Lost Creek shrubland site (Wisconsin) and the Mer Bleue peatland (Ontario). BEPS calculates carbon fixation by scaling Farquhar's leaf biochemical model up to canopy level, with stomatal conductance estimated by a modified version of the Ball-Woodrow-Berry model. The "big-leaf" approach was parameterized using derived leaf-level parameters scaled up to canopy level by means of Leaf Area Index. The influence of sunlit
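A common form of the sunlit/shaded separation (in the de Pury-Farquhar style, not necessarily BEPS's exact formulation) partitions leaf area as in this sketch:

```python
import math

def sunlit_shaded_lai(lai, sza_deg):
    """Two-leaf partition of canopy leaf area index:
    LAI_sun = (1 - exp(-k * LAI)) / k, with extinction coefficient
    k = 0.5 / cos(solar zenith angle) for a spherical leaf-angle
    distribution; the remainder is shaded."""
    k = 0.5 / math.cos(math.radians(sza_deg))
    lai_sun = (1.0 - math.exp(-k * lai)) / k
    return lai_sun, lai - lai_sun

# Illustrative values: LAI = 4, solar zenith angle 30 degrees.
sun, shade = sunlit_shaded_lai(lai=4.0, sza_deg=30.0)
print(sun, shade)
```

Because the sunlit fraction saturates with depth while the shaded fraction keeps growing, summing two separately parameterized "big leaves" avoids the non-linear averaging error of the single big-leaf model.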

  18. Streaming simplification of tetrahedral meshes.

    Science.gov (United States)

    Vo, Huy T; Callahan, Steven P; Lindstrom, Peter; Pascucci, Valerio; Silva, Cláudio T

    2007-01-01

    Unstructured tetrahedral meshes are commonly used in scientific computing to represent scalar, vector, and tensor fields in three dimensions. Visualization of these meshes can be difficult to perform interactively due to their size and complexity. By reducing the size of the data, we can accomplish real-time visualization necessary for scientific analysis. We propose a two-step approach for streaming simplification of large tetrahedral meshes. Our algorithm arranges the data on disk in a streaming, I/O-efficient format that allows coherent access to the tetrahedral cells. A quadric-based simplification is sequentially performed on small portions of the mesh in-core. Our output is a coherent streaming mesh which facilitates future processing. Our technique is fast, produces high quality approximations, and operates out-of-core to process meshes too large for main memory.

  19. Large regional groundwater modeling - a sensitivity study of some selected conceptual descriptions and simplifications; Storregional grundvattenmodellering - en kaenslighetsstudie av naagra utvalda konceptuella beskrivningar och foerenklingar

    Energy Technology Data Exchange (ETDEWEB)

    Ericsson, Lars O. (Lars O. Ericsson Consulting AB (Sweden)); Holmen, Johan (Golder Associates (Sweden))

    2010-12-15

    The primary aim of this report is: - To present a supplementary, in-depth evaluation of certain conceptual simplifications, descriptions and model uncertainties in conjunction with regional groundwater simulation, which in the first instance refer to model depth, topography, groundwater table level and boundary conditions. Implementation was based on geo-scientifically available data compilations from the Smaaland region but different conceptual assumptions have been analysed

  20. Modeling as a Decision-Making Process

    Science.gov (United States)

    Bleiler-Baxter, Sarah K.; Stephens, D. Christopher; Baxter, Wesley A.; Barlow, Angela T.

    2017-01-01

    The goal in this article is to support teachers in better understanding what it means to model with mathematics by focusing on three key decision-making processes: Simplification, Relationship Mapping, and Situation Analysis. The authors use the Theme Park task to help teachers develop a vision of how students engage in these three decision-making…

  1. The effects of modeling simplifications on craniofacial finite element models: the alveoli (tooth sockets) and periodontal ligaments.

    Science.gov (United States)

    Wood, Sarah A; Strait, David S; Dumont, Elizabeth R; Ross, Callum F; Grosse, Ian R

    2011-07-07

    Several finite element models of a primate cranium were used to investigate the biomechanical effects of the tooth sockets and the material behavior of the periodontal ligament (PDL) on stress and strain patterns associated with feeding. For examining the effect of tooth sockets, the unloaded sockets were modeled as devoid of teeth and PDL, filled with teeth and PDLs, or simply filled with cortical bone. The third premolar on the left side of the cranium was loaded and the PDL was treated as an isotropic, linear elastic material using published values for Young's modulus and Poisson's ratio. The remaining models, along with one of the socket models, were used to determine the effect of the PDL's material behavior on stress and strain distributions under static premolar biting and dynamic tooth loading conditions. Two models (one static and the other dynamic) treated the PDL as cortical bone. The other two models treated it as a ligament with isotropic, linear elastic material properties. Two models treated the PDL as a ligament with hyperelastic properties, and the other two as a ligament with viscoelastic properties. Both behaviors were defined using published stress-strain data obtained from in vitro experiments on porcine ligament specimens. Von Mises stress and strain contour plots indicate that the effects of the sockets and PDL material behavior are local. Results from this study suggest that modeling the sockets and the PDL in finite element analyses of skulls is project dependent and can be ignored if values of stress and strain within the alveolar region are not required. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Simplification of neural network model for predicting local power distributions of BWR fuel bundle using learning algorithm with forgetting

    International Nuclear Information System (INIS)

    Tanabe, Akira; Yamamoto, Toru; Shinfuku, Kimihiro; Nakamae, Takuji; Nishide, Fusayo.

    1995-01-01

    Previously, a two-layered neural network model was developed to predict the relation between the fissile enrichment of each fuel rod and the local power distribution in a BWR fuel bundle. This model was obtained intuitively, based on 33 patterns of training signals, after an intensive survey of candidate models. Recently, a learning algorithm with forgetting was reported as a way to simplify neural network models. An interesting question is what kind of model is obtained when this algorithm is applied to a more complex three-layered model that learns the same training signals. A three-layered model, expanded to have direct connections between the 1st- and 3rd-layer elements, was constructed, and the normal back-propagation learning method was first applied to it. The forgetting algorithm was then added to this learning process. The connections involving the 2nd-layer elements disappeared, and the 2nd layer became unnecessary. Learning the same training signals took about an order of magnitude more computing time than simple back propagation, but the two-layered model was obtained autonomously from the expanded three-layered model. (author)
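The idea of a "learning algorithm with forgetting" — adding a constant decay that drives unneeded weights to zero so their connections can be pruned — can be sketched on a toy regression; this illustrates the principle only, not the authors' network:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: the target depends only on the first of three inputs.
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0]

w = rng.normal(size=3) * 0.1
eta, decay = 0.05, 0.01   # learning rate and "forgetting" strength

for _ in range(2000):
    grad = (X @ w - y) @ X / len(X)   # squared-error gradient
    w -= eta * grad
    w -= eta * decay * np.sign(w)     # forgetting: constant decay toward zero

print(np.round(w, 3))  # weight on x0 stays near 2; the others shrink to ~0
```

Weights that carry no signal cannot resist the constant decay and vanish, which is the mechanism by which the redundant 2nd-layer connections in the abstract disappeared during training.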

  3. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction

    OpenAIRE

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2011-01-01

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scien...

  4. Complexity and simplification in understanding recruitment in benthic populations

    KAUST Repository

    Pineda, Jesús; Reyns, Nathalie B.; Starczak, Victoria R.

    2008-01-01

    reduces the number of processes and makes the problem manageable. We discuss how simplifications and "broad-brush first-order approaches" may muddle our understanding of recruitment. Lack of empirical determination of the fundamental processes often

  5. Simplifications of rational matrices by using UML

    OpenAIRE

    Tasić, Milan B.; Stanimirović, Ivan P.

    2013-01-01

    The simplification process on rational matrices consists of simplifying each entry represented by a rational function. We follow the classic approach of dividing the numerator and denominator polynomials by their common GCD polynomial, and provide the activity diagram in UML for this process. A rational matrix representation as the quotient of a polynomial matrix and a polynomial is also discussed here and illustrated via activity diagrams. Also, a class diagram giving the links between the c...
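The entry-wise simplification step — dividing numerator and denominator by their common GCD polynomial — can be sketched with a plain Euclidean algorithm on coefficient lists (a minimal illustration, not the UML-modelled implementation):

```python
def poly_divmod(num, den):
    """Divide polynomials given as coefficient lists, highest degree first."""
    num = num[:]
    q = []
    while len(num) >= len(den):
        c = num[0] / den[0]
        q.append(c)
        for i, d in enumerate(den):
            num[i] -= c * d
        num.pop(0)
    return q, num

def poly_gcd(a, b):
    """Euclidean algorithm on polynomial coefficient lists, result monic."""
    while b and any(abs(c) > 1e-9 for c in b):
        _, r = poly_divmod(a, b)
        while r and abs(r[0]) < 1e-9:   # strip leading near-zero coefficients
            r.pop(0)
        a, b = b, r
    return [c / a[0] for c in a]

def simplify_rational(num, den):
    """Cancel the common GCD factor of a rational function's entry."""
    g = poly_gcd(num, den)
    qn, _ = poly_divmod(num, g)
    qd, _ = poly_divmod(den, g)
    return qn, qd

# (x^2 - 1) / (x^2 + 2x + 1)  ->  (x - 1) / (x + 1)
n, d = simplify_rational([1.0, 0.0, -1.0], [1.0, 2.0, 1.0])
print(n, d)  # [1.0, -1.0] [1.0, 1.0]
```

Applying this routine to every entry of a matrix of rational functions is precisely the activity the abstract's UML diagrams describe.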

  6. Simplification and Shift in Cognition of Political Difference: Applying the Geometric Modeling to the Analysis of Semantic Similarity Judgment

    Science.gov (United States)

    Kato, Junko; Okada, Kensuke

    2011-01-01

    Perceiving differences by means of spatial analogies is intrinsic to human cognition. Multi-dimensional scaling (MDS) analysis based on Minkowski geometry has been used primarily on data on sensory similarity judgments, leaving judgments on abstractive differences unanalyzed. Indeed, analysts have failed to find appropriate experimental or real-life data in this regard. Our MDS analysis used survey data on political scientists' judgments of the similarities and differences between political positions expressed in terms of distance. Both distance smoothing and majorization techniques were applied to a three-way dataset of similarity judgments provided by at least seven experts on at least five parties' positions on at least seven policies (i.e., originally yielding 245 dimensions) to substantially reduce the risk of local minima. The analysis found two dimensions, which were sufficient for mapping differences, and fit the city-block dimensions better than the Euclidean metric in all datasets obtained from 13 countries. Most city-block dimensions were highly correlated with the simplified criterion (i.e., the left–right ideology) for differences that are actually used in real politics. The isometry of the city-block and dominance metrics in two-dimensional space carries further implications. More specifically, individuals may pay attention to two dimensions (if represented in the city-block metric) or focus on a single dimension (if represented in the dominance metric) when judging differences between the same objects. Switching between metrics may be expected to occur during cognitive processing as frequently as the apparent discontinuities and shifts in human attention that may underlie changing judgments in real situations occur. Consequently, the result has extended strong support for the validity of the geometric models to represent an important social cognition, i.e., the one of political differences, which is deeply rooted in human nature. PMID:21673959
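The isometry of the city-block and dominance (Chebyshev) metrics in two-dimensional space, from which the abstract draws its implications, can be checked numerically: rotating the plane by 45° (with a √2 scaling) maps one metric onto the other. The point coordinates below are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
pts = rng.normal(size=(6, 2))   # hypothetical 2-D "party positions"

def cityblock(p, q):
    return np.abs(p - q).sum()

def dominance(p, q):
    return np.abs(p - q).max()   # Chebyshev metric

# Rotation by 45 degrees combined with a sqrt(2) scaling:
# (x, y) -> (x - y, x + y).  Then max(|x - y|, |x + y|) = |x| + |y|.
R = np.array([[1.0, -1.0], [1.0, 1.0]])
rot = pts @ R.T

for i in range(len(pts)):
    for j in range(i + 1, len(pts)):
        assert np.isclose(cityblock(pts[i], pts[j]),
                          dominance(rot[i], rot[j]))
print("city-block distances equal dominance distances after rotation")
```

This is why, in two dimensions, data fit equally well by either metric cannot distinguish "attend to two dimensions" from "focus on a single dominant dimension" without further assumptions.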

  7. Computer control system synthesis for nuclear power plants through simplification and partitioning of the complex system model into a set of simple subsystems

    International Nuclear Information System (INIS)

    Zobor, E.

    1978-12-01

    The approach chosen is based on hierarchical control systems theory; however, the fundamentals of other approaches, such as system simplification and system partitioning, are briefly summarized to introduce the problems associated with the control of large-scale systems. The concept of a hierarchical control system acting in a broad variety of operating conditions is developed, and some practical extensions to the hierarchical control system approach are given, e.g. subsystems measured and controlled at different rates, control of the partial state vector, and coordination for autoregressive models. Throughout the work, the WWR-SM research reactor of the Institute has been taken as a guiding example, and simple methods for the identification of the model parameters from a reactor start-up are discussed. Using the PROHYS digital simulation program, elaborated in the course of the present research, detailed simulation studies were carried out to investigate the performance of a control system based on the concept and algorithms developed. Finally, to provide evidence of a real application, a short description is given of the closed-loop computer control system installed, within a project supported by the Hungarian State Office for Technical Development, at the WWR-SM research reactor, where the results obtained in the present IAEA Research Contract were successfully applied and delivered the expected high performance

  8. Streaming Algorithms for Line Simplification

    DEFF Research Database (Denmark)

    Abam, Mohammad; de Berg, Mark; Hachenberger, Peter

    2010-01-01

    this problem in a streaming setting, where we only have a limited amount of storage, so that we cannot store all the points. We analyze the competitive ratio of our algorithms, allowing resource augmentation: we let our algorithm maintain a simplification with 2k (internal) points and compare the error of our...... simplification to the error of the optimal simplification with k points. We obtain the algorithms with O(1) competitive ratio for three cases: convex paths, where the error is measured using the Hausdorff distance (or Fréchet distance), xy-monotone paths, where the error is measured using the Hausdorff distance...... (or Fréchet distance), and general paths, where the error is measured using the Fréchet distance. In the first case the algorithm needs O(k) additional storage, and in the latter two cases the algorithm needs O(k 2) additional storage....
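For contrast with the streaming setting analyzed above, the classic offline line-simplification baseline (Douglas-Peucker) can be sketched as follows; the path and tolerance are invented:

```python
def perp_dist(p, a, b):
    """Distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * (px - ax) - dx * (py - ay)) / (dx * dx + dy * dy) ** 0.5

def douglas_peucker(path, eps):
    """Classic offline simplification: keep the farthest point if its
    distance to the chord exceeds eps, then recurse on both halves."""
    if len(path) < 3:
        return path
    dists = [perp_dist(p, path[0], path[-1]) for p in path[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] <= eps:
        return [path[0], path[-1]]
    left = douglas_peucker(path[: i + 1], eps)
    right = douglas_peucker(path[i:], eps)
    return left[:-1] + right

path = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(path, eps=0.5))
# [(0, 0), (2, -0.1), (3, 5), (7, 9)]
```

Unlike the streaming algorithms above, this baseline needs the whole path in memory and offers no fixed bound on the number of retained points, which is exactly the gap the competitive streaming analysis addresses.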

  9. Control-Oriented Models for SO Fuel Cells from the Angle of V&V: Analysis, Simplification Possibilities, Performance

    Directory of Open Access Journals (Sweden)

    Ekaterina Auer

    2017-12-01

    Full Text Available In this paper, we take a look at the analysis and parameter identification for control-oriented, dynamic models for the thermal subsystem of solid oxide fuel cells (SOFC) from the systematized point of view of verification and validation (V&V). First, we give a possible classification of models according to their verification degree, which depends, for example, on the kind of arithmetic used for both formulation and simulation. Typical SOFC models, consisting of several coupled differential equations for gas preheaters and the temperature distribution in the stack module, do not have analytical solutions because of spatial nonlinearity. Therefore, in the next part of the paper, we describe in detail two possible ways to simplify such models so that the underlying differential equations can be solved analytically while still being sufficiently accurate to serve as the basis for control synthesis. The simplifying assumption is to approximate the heat capacities of the gases by zero-order (or first-order, respectively) polynomials in the temperature. In the last, application-oriented part of the paper, we identify the parameters of these models as well as compare their performance and their ability to reflect reality with the corresponding characteristics of models in which the heat capacities are represented by quadratic polynomials (the usual case). For this purpose, the framework UniVerMeC (Unified Framework for Verified GeoMetric Computations) is used, which allows us to employ different kinds of arithmetics, including the interval one. This latter possibility ensures a high level of reliability of the simulations and of the subsequent validation. Besides, it helps to take into account bounded uncertainty in measurements.

  10. LMJ target implosions: sensitivity of the acceptable gain to physical parameters and simplification of the radiative transport model

    International Nuclear Information System (INIS)

    Charpin, C.; Bonnefille, M.; Charrier, A.; Giorla, J.; Holstein, P.A.; Malinie, G.

    2000-01-01

    Our study concerns the robustness of the LMJ target and the definition of safety margins. It is based on the determination of the 'acceptable gain', defined as 75% of the nominal gain. We have tested the sensitivity of the gain to physical and numerical parameters in the case of deteriorated implosions, i.e. when implosion conditions are not optimized. Moreover, we have simplified the radiative transport model, which enabled us to save a lot of computing time. All our calculations were done with the Lagrangian code FCI2 in a very simplified configuration. (authors)

  11. Simplification of complex kinetic models used for the quantitative analysis of nuclear magnetic resonance or radioactive tracer studies

    International Nuclear Information System (INIS)

    Schuster, R.; Schuster, S.; Holzhuetter, H.-G.

    1992-01-01

    A method for simplifying the mathematical models describing the dynamics of tracers (e.g. 13C, 31P, 14C, as used in NMR studies or radioactive tracer experiments) in (bio-)chemical reaction systems is presented. This method is appropriate in cases where the system includes reactions whose rates differ by several orders of magnitude. The basic idea is to adapt the rapid-equilibrium approximation to tracer systems. It is shown with the aid of the Perron-Frobenius theorem that, for tracer systems, the conditions for applicability of this approximation are satisfied whenever some reactions are near equilibrium. It turns out that the specific enrichments of all of the labelled atoms that are connected by fast reversible reactions can be grouped together as 'pool variables'. The reduced system contains fewer parameters and can, thus, be fitted more easily to experimental data. Moreover, the method can be employed for identifying non-equilibrium and near-equilibrium reactions from experimentally measured specific enrichments of tracer. The reduction algorithm is illustrated by studying a model of the distribution of 13C-tracers in the pentose phosphate pathway. (author)
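The rapid-equilibrium pooling can be illustrated on a minimal linear tracer model: two labelled species exchanging quickly, with slow loss from one of them; the pooled variable then decays with a single effective rate. The rate constants are invented for the example:

```python
import numpy as np

# Label amounts in two species connected by a fast reversible reaction
# (first-order rates kf, kr) with slow irreversible loss from B (rate ks):
#   LA' = -kf*LA + kr*LB
#   LB' =  kf*LA - kr*LB - ks*LB
kf, kr, ks = 100.0, 50.0, 1.0   # fast exchange, slow loss (invented values)

def full_model(T=3.0, dt=1e-4):
    """Explicit Euler integration of the full two-variable tracer model."""
    la, lb = 1.0, 0.0            # label initially placed in A
    for _ in range(int(T / dt)):
        dla = -kf * la + kr * lb
        dlb = kf * la - kr * lb - ks * lb
        la += dla * dt
        lb += dlb * dt
    return la + lb

def pooled_model(T=3.0):
    """Rapid-equilibrium reduction: LB/LA ~ kf/kr, so the pool L = LA + LB
    decays with the single effective rate ks * kf / (kf + kr)."""
    return np.exp(-ks * kf / (kf + kr) * T)

print(full_model(), pooled_model())  # nearly identical
```

The two orders of magnitude between the exchange and loss rates are what make the pooled one-parameter model an accurate stand-in for the stiff two-variable system.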

  12. Simplifications of Einstein supergravity

    International Nuclear Information System (INIS)

    Ferrara, S.; van Nieuwenhuizen, P.

    1979-01-01

    Using a new symmetry of the Einstein supergravity action and defining a new spin connection, the axial-vector auxiliary field cancels in the gauge action and in the gauge algebra. This explains why in some models a first-order formalism with minimal coupling of the spin connection and the tensor calculus agree, while in other models only the tensor calculus gives the correct result but torsion does not.

  13. Assessing the Impact of Canopy Structure Simplification in Common Multilayer Models on Irradiance Absorption Estimates of Measured and Virtually Created Fagus sylvatica (L. Stands

    Directory of Open Access Journals (Sweden)

    Pol Coppin

    2009-11-01

    of leaves differed significantly between a multilayer representation and a 3D architecture canopy of the same LAI. The deviations in irradiance absorbance were caused by canopy structure, clumping and the positioning of leaves. Although the use of canopy simplifications for modelling purposes in closed canopies is shown to be a valid option, special care should be taken when simulating forest stand irradiance for sparse canopies, particularly at higher sun zenith angles, where the surrounding trees strongly affect the absorbed irradiance and results can deviate substantially from the multilayer assumptions.

  14. Equivalent Simplification Method of Micro-Grid

    OpenAIRE

    Cai Changchun; Cao Xiangqin

    2013-01-01

    The paper concentrates on an equivalent simplification method for connecting a micro-grid system into a distribution network. The method is proposed for studying the interaction between the micro-grid and the distribution network. The micro-grid network, composite load, gas-turbine synchronous generation, and wind generation are reduced to equivalents and connected in parallel at the point of common coupling. A micro-grid system is built and three-phase and single-phase grounded faults are per...

  15. SIMPLIFICATION IN CHILD LANGUAGE IN BAHASA INDONESIA: A CASE STUDY ON FILIP

    Directory of Open Access Journals (Sweden)

    Julia Eka Rini

    2000-01-01

    This article aims at giving examples of characteristics of simplification in Bahasa Indonesia and proving that child language has a pattern and that there is a process in learning. Since this is a case study, it might not be enough to say that simplification is universal for all children of any mother tongues, but at least there is a proof that such patterns of simplification also occur in Bahasa Indonesia.

  16. 2D Vector Field Simplification Based on Robustness

    KAUST Repository

    Skraba, Primoz

    2014-03-01

    Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification techniques based on the topological skeleton successively remove pairs of critical points connected by separatrices, using distance or area-based relevance measures. These methods rely on the stable extraction of the topological skeleton, which can be difficult due to instability in numerical integration, especially when processing highly rotational flows. These geometric metrics do not consider the flow magnitude, an important physical property of the flow. In this paper, we propose a novel simplification scheme derived from the recently introduced topological notion of robustness, which provides a complementary view on flow structure compared to the traditional topological-skeleton-based approaches. Robustness enables the pruning of sets of critical points according to a quantitative measure of their stability, that is, the minimum amount of vector field perturbation required to remove them. This leads to a hierarchical simplification scheme that encodes flow magnitude in its perturbation metric. Our novel simplification algorithm is based on degree theory, has fewer boundary restrictions, and so can handle more general cases. Finally, we provide an implementation under the piecewise-linear setting and apply it to both synthetic and real-world datasets. © 2014 IEEE.
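The degree-theoretic primitive underlying this pruning argument can be illustrated directly. The sketch below (the classical Poincaré index computed as a winding number, not the paper's full algorithm) shows that the index of an isolated critical point is the number of times the field direction winds around a small circle enclosing it; degree theory constrains which sets of critical points can be removed together, since the total index inside a region is fixed by the field's behavior on its boundary.

```python
import math

# Poincare index (topological degree) of an isolated critical point,
# computed as the winding number of the field direction along a small
# circle around it. Sample fields: a source has index +1, a saddle -1.

def degree(field, cx, cy, r=0.1, samples=720):
    total = 0.0
    prev = None
    for i in range(samples + 1):
        t = 2 * math.pi * i / samples
        vx, vy = field(cx + r * math.cos(t), cy + r * math.sin(t))
        ang = math.atan2(vy, vx)
        if prev is not None:
            d = ang - prev
            # unwrap the angle increment to (-pi, pi]
            while d > math.pi:
                d -= 2 * math.pi
            while d <= -math.pi:
                d += 2 * math.pi
            total += d
        prev = ang
    return round(total / (2 * math.pi))

source = lambda x, y: (x, y)     # radially outward flow, index +1
saddle = lambda x, y: (x, -y)    # saddle flow, index -1
```

A source/sink pair cannot be cancelled against a single saddle of index -1 without leaving residue, which is why topological simplification schemes cancel critical points in index-compatible groups.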

  17. Simplification: A Viewpoint in Outline. Appendix.

    Science.gov (United States)

    Tickoo, Makhan L.

    This essay examines language simplification for second language learners as a linguistic and a pedagogic phenomenon, posing questions for further study by considering past research. It discusses linguistic simplification (LS) in relation to the development of artificial languages, such as Esperanto, "pidgin" languages, Basic English,…

  18. Extreme simplification and rendering of point sets using algebraic multigrid

    NARCIS (Netherlands)

    Reniers, D.; Telea, A.C.

    2009-01-01

    We present a novel approach for extreme simplification of point set models, in the context of real-time rendering. Point sets are often rendered using simple point primitives, such as oriented discs. However, this requires using many primitives to render even moderately simple shapes. Often, one

  19. Extreme Simplification and Rendering of Point Sets using Algebraic Multigrid

    NARCIS (Netherlands)

    Reniers, Dennie; Telea, Alexandru

    2005-01-01

    We present a novel approach for extreme simplification of point set models in the context of real-time rendering. Point sets are often rendered using simple point primitives, such as oriented discs. However efficient, simple primitives are less effective in approximating large surface areas. A large

  20. Simplifications and Idealizations in High School Physics in Mechanics: A Study of Slovenian Curriculum and Textbooks

    Science.gov (United States)

    Forjan, Matej; Sliško, Josip

    2014-01-01

    This article presents the results of an analysis of three Slovenian textbooks for high school physics, from the point of view of simplifications and idealizations in the field of mechanics. In modeling of physical systems, making simplifications and idealizations is important, since one ignores minor effects and focuses on the most important…

  1. On Simplification of Database Integrity Constraints

    DEFF Research Database (Denmark)

    Christiansen, Henning; Martinenghi, Davide

    2006-01-01

    Without proper simplification techniques, database integrity checking can be prohibitively time consuming. Several methods have been developed for producing simplified incremental checks for each update but none until now of sufficient quality and generality for providing a true practical impact,...
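A minimal sketch of why simplified incremental checks matter, using a uniqueness constraint as a stand-in (an illustration of the general idea, not Christiansen and Martinenghi's method): the naive check re-examines the whole table after every update, while the check simplified for a single insertion only inspects the new value.

```python
# Uniqueness constraint on a column of emails (values are made up).
# The full check scans the entire table; the incremental check derived
# for the update "insert one row" only tests the inserted value.

def full_check(table):
    return len(table) == len(set(table))      # rebuilds a set over all rows

def incremental_check(existing, new_value):
    return new_value not in existing          # inspects only the delta

emails = {"a@x.org", "b@x.org"}
ok = incremental_check(emails, "c@x.org")     # safe to insert
dup = incremental_check(emails, "a@x.org")    # would violate uniqueness
```

The incremental check is equivalent to the full check under the assumption that the database was consistent before the update, which is precisely the premise simplification methods exploit.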

  2. The complexities of HIPAA and administration simplification.

    Science.gov (United States)

    Mozlin, R

    2000-11-01

    The Health Insurance Portability and Accountability Act (HIPAA) was signed into law in 1996. Although focused on information technology issues, HIPAA will ultimately impact day-to-day operations at multiple levels within any clinical setting. Optometrists must begin to familiarize themselves with HIPAA in order to prepare themselves to practice in a technology-enriched environment. Title II of HIPAA, entitled "Administration Simplification," is intended to reduce the costs and administrative burden of healthcare by standardizing the electronic transmission of administrative and financial transactions. The Department of Health and Human Services is expected to publish the final rules and regulations that will govern HIPAA's implementation this year. The rules and regulations will cover three key aspects of healthcare delivery: electronic data interchange (EDI), security and privacy. EDI will standardize the format for healthcare transactions. Health plans must accept and respond to all transactions in the EDI format. Security refers to policies and procedures that protect the accuracy and integrity of information and limit access. Privacy focuses on how information is used and on the disclosure of identifiable health information. Security and privacy regulations apply to all information that is maintained and transmitted in a digital format and require administrative, physical, and technical safeguards. HIPAA will force the healthcare industry to adopt an e-commerce paradigm and provide opportunities to improve patient care processes. Optometrists should take advantage of the opportunity to develop more efficient and profitable practices.

  3. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    Science.gov (United States)

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. The tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested the tool's impact on the task of protein-protein interaction (PPI) extraction: it improved the F-score of a PPI extraction tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.
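As a rough, hypothetical illustration of the "shot-gun" idea (not BioSimplify's actual model), one can enumerate simpler variants of a sentence by independently keeping or dropping optional constituents; here only parenthesized asides and ", which ...," clauses are treated as droppable.

```python
import itertools
import re

# Toy "shot-gun" simplifier: every subset of the optional spans is either
# kept or dropped, yielding many simpler versions of one sentence.

def variants(sentence):
    # Droppable spans: parenthesized asides and ", which ...," clauses.
    optional = [m.span() for m in re.finditer(r" ?\([^)]*\)|, which [^,.]*,", sentence)]
    out = set()
    for keep in itertools.product([True, False], repeat=len(optional)):
        s = sentence
        # Drop spans right-to-left so earlier offsets stay valid.
        for (start, end), k in sorted(zip(optional, keep), reverse=True):
            if not k:
                s = s[:start] + s[end:]
        out.add(re.sub(r"\s+", " ", s).strip())
    return sorted(out)

vs = variants("The kinase (a key enzyme), which binds ATP, phosphorylates the substrate.")
```

With two optional spans this yields four variants, down to the fully stripped "The kinase phosphorylates the substrate."; a real system would generate variants from a parse tree rather than regular expressions.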

  4. Pathways of DNA unlinking: A story of stepwise simplification.

    Science.gov (United States)

    Stolz, Robert; Yoshida, Masaaki; Brasher, Reuben; Flanner, Michelle; Ishihara, Kai; Sherratt, David J; Shimokawa, Koya; Vazquez, Mariel

    2017-09-29

    In Escherichia coli, DNA replication yields interlinked chromosomes. Controlling topological changes associated with replication and returning the newly replicated chromosomes to an unlinked monomeric state is essential to cell survival. In the absence of the topoisomerase topoIV, the site-specific recombination complex XerCD-dif-FtsK can remove replication links by local reconnection. We previously showed mathematically that there is a unique minimal pathway of unlinking replication links by reconnection while stepwise reducing the topological complexity. However, the possibility that reconnection preserves or increases topological complexity is biologically plausible. In this case, are there other unlinking pathways? Which is the most probable? We consider these questions in an analytical and numerical study of minimal unlinking pathways. We use a Markov Chain Monte Carlo algorithm with Multiple Markov Chain sampling to model local reconnection on 491 different substrate topologies, 166 knots and 325 links, and distinguish between pathways connecting a total of 881 different topologies. We conclude that the minimal pathway of unlinking replication links that was found under more stringent assumptions is the most probable. We also present exact results on unlinking a 6-crossing replication link. These results point to a general process of topology simplification by local reconnection, with applications going beyond DNA.

  5. WORK SIMPLIFICATION FOR PRODUCTIVITY IMPROVEMENT A ...

    African Journals Online (AJOL)

    Mechanical Engineering Department. Addis Ababa University ... press concerning the work simplification techniques state ... encompassing as it does improved labor-management cooperation ... achievement of business aims or a contribution to attaining ... recommended work methods is done after a thorough study and ...

  6. Simplification of integrity constraints for data integration

    DEFF Research Database (Denmark)

    Christiansen, Henning; Martinenghi, Davide

    2004-01-01

    , because either the global database is known to be consistent or suitable actions have been taken to provide consistent views. The present work generalizes simplification techniques for integrity checking in traditional databases to the combined case. Knowledge of local consistency is employed, perhaps...

  7. Complexity and simplification in understanding recruitment in benthic populations

    KAUST Repository

    Pineda, Jesús

    2008-11-13

    Research of complex systems and problems, entities with many dependencies, is often reductionist. The reductionist approach splits systems or problems into different components, and then addresses these components one by one. This approach has been used in the study of recruitment and population dynamics of marine benthic (bottom-dwelling) species. Another approach examines benthic population dynamics by looking at a small set of processes. This approach is statistical or model-oriented. Simplified approaches identify "macroecological" patterns or attempt to identify and model the essential, "first-order" elements of the system. The complexity of the recruitment and population dynamics problems stems from the number of processes that can potentially influence benthic populations, including (1) larval pool dynamics, (2) larval transport, (3) settlement, and (4) post-settlement biotic and abiotic processes, and larval production. Moreover, these processes are non-linear, some interact, and they may operate on disparate scales. This contribution discusses reductionist and simplified approaches to study benthic recruitment and population dynamics of bottom-dwelling marine invertebrates. We first address complexity in two processes known to influence recruitment, larval transport, and post-settlement survival to reproduction, and discuss the difficulty in understanding recruitment by looking at relevant processes individually and in isolation. We then address the simplified approach, which reduces the number of processes and makes the problem manageable. We discuss how simplifications and "broad-brush first-order approaches" may muddle our understanding of recruitment. Lack of empirical determination of the fundamental processes often results in mistaken inferences, and processes and parameters used in some models can bias our view of processes influencing recruitment. We conclude with a discussion on how to reconcile complex and simplified approaches. Although it

  8. Modeling Attitude Variance in Small UAS’s for Acoustic Signature Simplification Using Experimental Design in a Hardware-in-the-Loop Simulation

    Science.gov (United States)

    2015-03-26

    response. Additionally, choosing correlated levels for multiple factors results in multicollinearity, which can cause problems such as model...misspecification or large variances and covariances for the regression coefficients. A good way to avoid multicollinearity is to use orthogonal, factorial
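The snippet's recommendation can be checked concretely: in a full two-level factorial design the factor columns are mutually orthogonal (pairwise zero dot products), so the information matrix is diagonal and multicollinearity cannot arise. A small sketch:

```python
import itertools

# Build a 2^3 full factorial design with coded levels -1/+1 and verify
# that every pair of factor columns is orthogonal.

design = list(itertools.product([-1, 1], repeat=3))   # 8 runs, 3 factors
cols = list(zip(*design))                             # one tuple per factor
dots = [sum(a * b for a, b in zip(cols[i], cols[j]))
        for i in range(3) for j in range(i + 1, 3)]   # pairwise dot products
```

All three pairwise dot products are zero, so least-squares estimates of the factor effects are uncorrelated; choosing correlated levels instead would make these dot products nonzero and inflate the coefficient variances.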

  9. Hydrothermal processing of Hanford tank wastes: Process modeling and control

    International Nuclear Information System (INIS)

    Currier, R.P.

    1994-01-01

    In the Los Alamos National Laboratory (LANL) hydrothermal process, waste streams are first pressurized and heated as they pass through a continuous flow tubular reactor vessel. The waste is maintained at a reaction temperature of 300-550 °C where organic destruction and sludge reformation occur. This report documents LANL activities in process modeling and control undertaken in FY94 to support hydrothermal process development. Key issues discussed include non-ideal flow patterns (e.g. axial dispersion) and their effect on reactor performance, the use and interpretation of inert tracer experiments, and the use of computational fluid mechanics to evaluate novel hydrothermal reactor designs. In addition, the effects of axial dispersion (and simplifications to rate expressions) on the estimated kinetic parameters are explored by non-linear regression to experimental data. Safety-related calculations are reported which estimate the explosion limits of effluent gases and the fate of hydrogen as it passes through the reactor. Development and numerical solution of a generalized one-dimensional mathematical model is also summarized. The difficulties encountered in using commercially available software to correlate the behavior of high temperature, high pressure aqueous electrolyte mixtures are summarized. Finally, details of the control system and experiments conducted to empirically determine the system response are reported.
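The effect of axial dispersion on reactor performance mentioned in this abstract can be sketched with the classical tanks-in-series surrogate for the axial dispersion model (illustrative numbers, not the LANL reactor's): for a first-order reaction, stronger dispersion (fewer equivalent tanks) lowers conversion relative to ideal plug flow.

```python
import math

# Tanks-in-series surrogate for axial dispersion: an ideal plug-flow
# reactor (PFR) is the N -> infinity limit; small N means heavy dispersion.

def conversion_tanks(k, tau, n):
    # First-order reaction through n equal CSTRs in series.
    return 1.0 - (1.0 + k * tau / n) ** (-n)

def conversion_pfr(k, tau):
    # Ideal plug flow: X = 1 - exp(-k * tau).
    return 1.0 - math.exp(-k * tau)

k, tau = 0.5, 4.0                      # hypothetical rate constant and residence time
x_pfr = conversion_pfr(k, tau)         # ideal plug flow
x_disp = conversion_tanks(k, tau, 5)   # strongly dispersed reactor
```

Ignoring dispersion when fitting kinetics to such a reactor would bias the estimated rate constant upward, which is the coupling between dispersion and kinetic parameters the abstract refers to.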

  10. 2D Vector Field Simplification Based on Robustness

    KAUST Repository

    Skraba, Primoz; Wang, Bei; Chen, Guoning; Rosen, Paul

    2014-01-01

    Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification

  11. Simplification of the helical TEN2 laser

    Science.gov (United States)

    Krahn, K.-H.

    1980-04-01

    The observation that the helical TEN2 laser can be simplified effectively by giving up the use of decoupling elements and by abolishing the segmentation of the electrode structure is examined. Although, as a consequence of this simplification, the operating pressure range was slightly decreased, the output power could be improved by roughly 30%, a result attributed to the new electrode geometry exhibiting lower inductance and lower damping losses.

  12. Semantic Business Process Modeling

    OpenAIRE

    Markovic, Ivan

    2010-01-01

    This book presents a process-oriented business modeling framework based on semantic technologies. The framework consists of modeling languages, methods, and tools that allow for semantic modeling of business motivation, business policies and rules, and business processes. Quality of the proposed modeling framework is evaluated based on the modeling content of SAP Solution Composer and several real-world business scenarios.

  13. Process optimization of friction stir welding based on thermal models

    DEFF Research Database (Denmark)

    Larsen, Anders Astrup

    2010-01-01

    This thesis investigates how to apply optimization methods to numerical models of a friction stir welding process. The work is intended as a proof-of-concept using different methods that are applicable to models of high complexity, possibly with high computational cost, and without the possibility...... information of the high-fidelity model. The optimization schemes are applied to stationary thermal models of differing complexity of the friction stir welding process. The optimization problems considered are based on optimizing the temperature field in the workpiece by finding optimal translational speed....... Also an optimization problem based on a microstructure model is solved, allowing the hardness distribution in the plate to be optimized. The use of purely thermal models represents a simplification of the real process; nonetheless, it shows the applicability of the optimization methods considered...

  14. Robustness-Based Simplification of 2D Steady and Unsteady Vector Fields.

    Science.gov (United States)

    Skraba, Primoz; Bei Wang; Guoning Chen; Rosen, Paul

    2015-08-01

    Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification techniques based on the topological skeleton successively remove pairs of critical points connected by separatrices, using distance or area-based relevance measures. These methods rely on the stable extraction of the topological skeleton, which can be difficult due to instability in numerical integration, especially when processing highly rotational flows. In this paper, we propose a novel simplification scheme derived from the recently introduced topological notion of robustness which enables the pruning of sets of critical points according to a quantitative measure of their stability, that is, the minimum amount of vector field perturbation required to remove them. This leads to a hierarchical simplification scheme that encodes flow magnitude in its perturbation metric. Our novel simplification algorithm is based on degree theory and has minimal boundary restrictions. Finally, we provide an implementation under the piecewise-linear setting and apply it to both synthetic and real-world datasets. We show local and complete hierarchical simplifications for steady as well as unsteady vector fields.

  15. Robustness-Based Simplification of 2D Steady and Unsteady Vector Fields

    KAUST Repository

    Skraba, Primoz

    2015-08-01

    © 2015 IEEE. Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification techniques based on the topological skeleton successively remove pairs of critical points connected by separatrices, using distance or area-based relevance measures. These methods rely on the stable extraction of the topological skeleton, which can be difficult due to instability in numerical integration, especially when processing highly rotational flows. In this paper, we propose a novel simplification scheme derived from the recently introduced topological notion of robustness which enables the pruning of sets of critical points according to a quantitative measure of their stability, that is, the minimum amount of vector field perturbation required to remove them. This leads to a hierarchical simplification scheme that encodes flow magnitude in its perturbation metric. Our novel simplification algorithm is based on degree theory and has minimal boundary restrictions. Finally, we provide an implementation under the piecewise-linear setting and apply it to both synthetic and real-world datasets. We show local and complete hierarchical simplifications for steady as well as unsteady vector fields.

  16. Robustness-Based Simplification of 2D Steady and Unsteady Vector Fields

    KAUST Repository

    Skraba, Primoz; Wang, Bei; Chen, Guoning; Rosen, Paul

    2015-01-01

    © 2015 IEEE. Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification techniques based on the topological skeleton successively remove pairs of critical points connected by separatrices, using distance or area-based relevance measures. These methods rely on the stable extraction of the topological skeleton, which can be difficult due to instability in numerical integration, especially when processing highly rotational flows. In this paper, we propose a novel simplification scheme derived from the recently introduced topological notion of robustness which enables the pruning of sets of critical points according to a quantitative measure of their stability, that is, the minimum amount of vector field perturbation required to remove them. This leads to a hierarchical simplification scheme that encodes flow magnitude in its perturbation metric. Our novel simplification algorithm is based on degree theory and has minimal boundary restrictions. Finally, we provide an implementation under the piecewise-linear setting and apply it to both synthetic and real-world datasets. We show local and complete hierarchical simplifications for steady as well as unsteady vector fields.

  17. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  18. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  19. Simplification of Home Cooking and Its Periphery

    OpenAIRE

    小住, フミ子; 北崎, 康子; Fumiko, OZUMI; Yasuko, KITAZAKI

    1997-01-01

    The sense of home cooking has been changing with the times. Various topics that make us conscious of health and dietary habits, such as delicatessen items, half-ready-made foods, eating out, and the use of home delivery services and food imports, are involved in the simplification of cooking. We asked 64 students to fill in a three-part questionnaire. The recovery rate was 96.4%. The results are as follows: The main reason for purchasing delicatessen or half-ready-made foods was that "they...

  20. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety ... to biotechnology applications, food, polymer and human health application areas. The book highlights the important role of modern product and process modelling in the decision-making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  1. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  2. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    The present thesis considers numerical modeling of activated sludge tanks on municipal wastewater treatment plants. Focus is aimed at integrated modeling, where the detailed microbiological model, the Activated Sludge Model 3 (ASM3), is combined with a detailed hydrodynamic model based on a numerical solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that are often occurring in activated sludge tanks are initially...... hydrofoil-shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements...

  3. Sutural simplification in Physodoceratinae (Aspidoceratidae, Ammonitina

    Directory of Open Access Journals (Sweden)

    Checa, A.

    1987-08-01

    The structural analysis of the shell-septum interrelationship in some Jurassic ammonites allows us to conclude that the sutural simplifications that occurred throughout the phylogeny were caused by alterations in the external morphology of the shell. In the case of the Physodoceratinae, the simplification observed in the morphology of the septal suture may have a double origin. First, an increase in the size of the periumbilical tubercles may determine a shallowing of the sutural elements and a shortening of the saddle and lobe frilling. In other cases, the shallowing is determined by a decrease in the whorl expansion rate, with no apparent shortening of the secondary branching being observed.

  4. Model Process Control Language

    Data.gov (United States)

    National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

  5. Simplification of a dust emission scheme and comparison with data

    Science.gov (United States)

    Shao, Yaping

    2004-05-01

    A simplification of a dust emission scheme is proposed, which takes into account saltation bombardment and aggregate disintegration. The scheme states that dust emission is proportional to the streamwise saltation flux, with a proportionality that depends on soil texture and soil plastic pressure p. For small p values (loose soils), the dust emission rate is proportional to u*^4 (u* is the friction velocity), but not necessarily so in general. The dust emission predictions using the scheme are compared with several data sets published in the literature. The comparison enables the estimation of a model parameter and of the soil plastic pressure for various soils. While more data are needed for further verification, a general guideline for choosing model parameters is recommended.
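The quoted scaling can be checked with a toy calculation (all coefficients hypothetical): if the streamwise saltation flux follows an Owen-type Q ∝ u*³ and, for loose soils, the emission-to-saltation ratio itself grows like u*, then dust emission scales as u*⁴, so doubling the friction velocity multiplies the emission by about 16.

```python
# Hypothetical numerical sketch of the loose-soil limit: F = alpha(u*) * Q,
# with Q ~ c * u*^3 (Owen-type saltation flux) and alpha ~ alpha0 * u*.
# All coefficients are made up for illustration.

def saltation_flux(u_star, c=1.0):
    return c * u_star ** 3

def dust_emission(u_star, alpha0=0.01):
    alpha = alpha0 * u_star                 # loose-soil case: alpha grows with u*
    return alpha * saltation_flux(u_star)

# Doubling u* should multiply F by roughly 2**4 = 16 in this regime.
ratio = dust_emission(0.8) / dust_emission(0.4)
```

For cohesive soils (larger plastic pressure p) the proportionality would no longer grow linearly with u*, which is why the u*⁴ law holds only in the loose-soil limit.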

  6. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  7. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  8. An Integrated Simplification Approach for 3D Buildings with Sloped and Flat Roofs

    Directory of Open Access Journals (Sweden)

    Jinghan Xie

    2016-07-01

    Full Text Available Simplification of three-dimensional (3D) buildings is critical to improve the efficiency of visualizing urban environments while ensuring realistic urban scenes. Moreover, it underpins the construction of multi-scale 3D city models (3DCMs), which can be applied to study various urban issues. In this paper, we design a generic yet effective approach for simplifying 3D buildings. Instead of relying on both semantic information and geometric information, our approach is based solely on geometric information, as many 3D buildings still do not include semantic information. In addition, it provides an integrated means to treat 3D buildings with either sloped or flat roofs. The two case studies, one exploring the simplification of individual 3D buildings at varying levels of complexity and the other investigating the multi-scale simplification of a cityscape, show the effectiveness of our approach.

  9. Impact of pipes networks simplification on water hammer phenomenon

    Indian Academy of Sciences (India)

    Simplification of water supply networks is an indispensable design step to make the original network easier to analyse. The impact of network simplification on the water hammer phenomenon is investigated. This study uses a two-loop network with different diameters, thicknesses, and roughness coefficients. The network is ...

  10. 77 FR 66361 - Reserve Requirements of Depository Institutions: Reserves Simplification

    Science.gov (United States)

    2012-11-05

    ... Requirements of Depository Institutions: Reserves Simplification AGENCY: Board of Governors of the Federal... (Reserve Requirements of Depository Institutions) published in the Federal Register on April 12, 2012. The... simplifications related to the administration of reserve requirements: 1. Create a common two-week maintenance...

  11. The cost of policy simplification in conservation incentive programs

    DEFF Research Database (Denmark)

    Armsworth, Paul R.; Acs, Szvetlana; Dallimer, Martin

    2012-01-01

    Incentive payments to private landowners provide a common strategy to conserve biodiversity and enhance the supply of goods and services from ecosystems. To deliver cost-effective improvements in biodiversity, payment schemes must trade off inefficiencies that result from over-simplified policies with the administrative burden of implementing more complex incentive designs. We examine the effectiveness of different payment schemes using field-parameterized, ecological economic models of extensive grazing farms. We focus on profit-maximising farm management plans and use bird species as a policy-relevant indicator of biodiversity. Common policy simplifications result in a 49-100% loss in biodiversity benefits, depending on the conservation target chosen. Failure to differentiate prices for conservation improvements in space is particularly problematic. Additional implementation costs that accompany more complicated policies ...

  12. Aerospace Materials Process Modelling

    Science.gov (United States)

    1988-08-01

    Continuous Cooling Transformation diagram (CCT diagram): When an IT diagram is used in the heat process modelling, we suppose that a sudden (instantaneous) cooling ... processes. CE chooses instead to study thermo-mechanical properties referring to a CCT diagram. This is thought to be more reliable to give a true ... This determination is however based on the following approximations: (i) a CCT diagram is valid only for the ...

  13. Quantum copying and simplification of the quantum Fourier transform

    Science.gov (United States)

    Niu, Chi-Sheng

    Theoretical studies of quantum computation and quantum information theory are presented in this thesis. Three topics are considered: simplification of the quantum Fourier transform in Shor's algorithm, optimal eavesdropping in the BB84 quantum cryptographic protocol, and quantum copying of one qubit. The quantum Fourier transform preceding the final measurement in Shor's algorithm is simplified by replacing a network of quantum gates with one that has fewer and simpler gates controlled by classical signals. This simplification results from an analysis of the network using the consistent history approach to quantum mechanics. The optimal amount of information which an eavesdropper can gain, for a given level of noise in the communication channel, is worked out for the BB84 quantum cryptographic protocol. The optimal eavesdropping strategy is expressed in terms of various quantum networks. A consistent history analysis of these networks using two conjugate quantum bases shows how the information gain in one basis influences the noise level in the conjugate basis. The no-cloning property of quantum systems, which is the physics behind quantum cryptography, is studied by considering copying machines that generate two imperfect copies of one qubit. The best qualities these copies can have are worked out with the help of the Bloch sphere representation for one qubit, and a quantum network is worked out for an optimal copying machine. If the copying machine does not have additional ancillary qubits, the copying process can be viewed using a 2-dimensional subspace in a product space of two qubits. A special representation of such a two-dimensional subspace makes possible a complete characterization of this type of copying. This characterization in turn leads to simplified eavesdropping strategies in the BB84 and the B92 quantum cryptographic protocols.

  14. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    , by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction...

  15. Organisational simplification and secondary complexity in health services for adults with learning disabilities.

    Science.gov (United States)

    Heyman, Bob; Swain, John; Gillman, Maureen

    2004-01-01

    This paper explores the role of complexity and simplification in the delivery of health care for adults with learning disabilities, drawing upon qualitative data obtained in a study carried out in NE England. It is argued that the requirement to manage complex health needs with limited resources causes service providers to simplify, standardise and routinise care. Simplified service models may work well enough for the majority of clients, but can impede recognition of the needs of those whose characteristics are not congruent with an adopted model. The data were analysed in relation to the core category, identified through thematic analysis, of secondary complexity arising from organisational simplification. Organisational simplification generates secondary complexity when operational routines designed to make health complexity manageable cannot accommodate the needs of non-standard service users. Associated themes, namely the social context of services, power and control, communication skills, expertise and service inclusiveness and evaluation are explored in relation to the core category. The concept of secondary complexity resulting from organisational simplification may partly explain seemingly irrational health service provider behaviour.

  16. Business process model repositories : efficient process retrieval

    NARCIS (Netherlands)

    Yan, Z.

    2012-01-01

    As organizations increasingly work in process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a

  17. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash-Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against a dataset independent of that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent
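    For reference, the Nash-Sutcliffe statistic mentioned in point (1) is simple to compute. A minimal sketch (the function name is ours) showing its two anchor values, a perfect fit and a mean-only predictor:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1.0 is a perfect fit; 0.0 means the model does no better than
    always predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

obs = [1.0, 3.0, 2.0, 5.0, 4.0]
print(nash_sutcliffe(obs, obs))               # perfect fit -> 1.0
print(nash_sutcliffe(obs, [3.0] * len(obs)))  # mean predictor -> 0.0
```

    Because any simulation that tracks the observed mean already scores near zero, very different parameterisations can receive similar scores, which is one way the statistic fails to discriminate.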

  18. Process model repositories and PNML

    NARCIS (Netherlands)

    Hee, van K.M.; Post, R.D.J.; Somers, L.J.A.M.; Werf, van der J.M.E.M.; Kindler, E.

    2004-01-01

    Bringing system and process models together in repositories facilitates the interchange of model information between modelling tools, and allows the combination and interlinking of complementary models. Petriweb is a web application for managing such repositories. It supports hierarchical process

  19. Modeling styles in business process modeling

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Zugal, S.; Weber, B.; Weidlich, M.; Fahland, D.; Reijers, H.A.; Mendling, J.; Bider, I.; Halpin, T.; Krogstie, J.; Nurcan, S.; Proper, E.; Schmidt, R.; Soffer, P.; Wrycza, S.

    2012-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models. As a consequence, the question arises whether different ways of creating process models exist. In this vein, we observed 115 students engaged in the act of modeling, recording

  20. Simplification of arboreal marsupial assemblages in response to increasing urbanization.

    Science.gov (United States)

    Isaac, Bronwyn; White, John; Ierodiaconou, Daniel; Cooke, Raylene

    2014-01-01

    Arboreal marsupials play an essential role in ecosystem function including regulating insect and plant populations, facilitating pollen and seed dispersal and acting as a prey source for higher-order carnivores in Australian environments. Primarily, research has focused on their biology, ecology and response to disturbance in forested and urban environments. We used presence-only species distribution modelling to understand the relationship between occurrences of arboreal marsupials and eco-geographical variables, and to infer habitat suitability across an urban gradient. We used post-proportional analysis to determine whether increasing urbanization affected potential habitat for arboreal marsupials. The key eco-geographical variables that influenced disturbance intolerant species and those with moderate tolerance to disturbance were natural features such as tree cover and proximity to rivers and to riparian vegetation, whereas variables for disturbance tolerant species were anthropogenic-based (e.g., road density) but also included some natural characteristics such as proximity to riparian vegetation, elevation and tree cover. Arboreal marsupial diversity was subject to substantial change along the gradient, with potential habitat for disturbance-tolerant marsupials distributed across the complete gradient and potential habitat for less tolerant species being restricted to the natural portion of the gradient. This resulted in highly-urbanized environments being inhabited by a few generalist arboreal marsupial species. Increasing urbanization therefore leads to functional simplification of arboreal marsupial assemblages, thus impacting on the ecosystem services they provide.

  1. Simplification of arboreal marsupial assemblages in response to increasing urbanization.

    Directory of Open Access Journals (Sweden)

    Bronwyn Isaac

    Full Text Available Arboreal marsupials play an essential role in ecosystem function including regulating insect and plant populations, facilitating pollen and seed dispersal and acting as a prey source for higher-order carnivores in Australian environments. Primarily, research has focused on their biology, ecology and response to disturbance in forested and urban environments. We used presence-only species distribution modelling to understand the relationship between occurrences of arboreal marsupials and eco-geographical variables, and to infer habitat suitability across an urban gradient. We used post-proportional analysis to determine whether increasing urbanization affected potential habitat for arboreal marsupials. The key eco-geographical variables that influenced disturbance intolerant species and those with moderate tolerance to disturbance were natural features such as tree cover and proximity to rivers and to riparian vegetation, whereas variables for disturbance tolerant species were anthropogenic-based (e.g., road density) but also included some natural characteristics such as proximity to riparian vegetation, elevation and tree cover. Arboreal marsupial diversity was subject to substantial change along the gradient, with potential habitat for disturbance-tolerant marsupials distributed across the complete gradient and potential habitat for less tolerant species being restricted to the natural portion of the gradient. This resulted in highly-urbanized environments being inhabited by a few generalist arboreal marsupial species. Increasing urbanization therefore leads to functional simplification of arboreal marsupial assemblages, thus impacting on the ecosystem services they provide.

  2. Decomposition and Simplification of Multivariate Data using Pareto Sets.

    Science.gov (United States)

    Huettenberger, Lars; Heine, Christian; Garth, Christoph

    2014-12-01

    Topological and structural analysis of multivariate data is aimed at improving the understanding and usage of such data through identification of intrinsic features and structural relationships among multiple variables. We present two novel methods for simplifying so-called Pareto sets that describe such structural relationships. Such simplification is a precondition for meaningful visualization of structurally rich or noisy data. As a framework for simplification operations, we introduce a decomposition of the data domain into regions of equivalent structural behavior and the reachability graph that describes global connectivity of Pareto extrema. Simplification is then performed as a sequence of edge collapses in this graph; to determine a suitable sequence of such operations, we describe and utilize a comparison measure that reflects the changes to the data that each operation represents. We demonstrate and evaluate our methods on synthetic and real-world examples.
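    The decomposition and reachability-graph machinery above is specific to Pareto sets, but the core simplification loop, a sequence of edge collapses ordered by a comparison measure, can be sketched generically. In the sketch below, the function name, the scalar node values and the cost function are our own placeholders, not the paper's measure, and edge costs are computed once up front for brevity:

```python
import heapq

def simplify_graph(nodes, edges, target):
    """Greedy simplification: repeatedly collapse the cheapest edge
    until only `target` nodes remain.  `nodes` maps a node id to a
    scalar value; `edges` is a list of (u, v) pairs; the cost of
    collapsing (u, v) is |value[u] - value[v]|."""
    parent = {n: n for n in nodes}

    def find(n):  # union-find representative with path halving
        while parent[n] != n:
            parent[n] = parent[parent[n]]
            n = parent[n]
        return n

    heap = [(abs(nodes[u] - nodes[v]), u, v) for u, v in edges]
    heapq.heapify(heap)
    alive = len(nodes)
    while alive > target and heap:
        cost, u, v = heapq.heappop(heap)
        ru, rv = find(u), find(v)
        if ru == rv:
            continue          # endpoints already merged
        parent[rv] = ru       # collapse: keep ru, absorb rv
        alive -= 1
    return {find(n) for n in nodes}

reps = simplify_graph({"a": 0.0, "b": 0.1, "c": 5.0},
                      [("a", "b"), ("b", "c")], target=2)
print(sorted(reps))  # "a" and "b" merge first (smallest cost) -> ['a', 'c']
```

    In the paper the collapse order instead comes from a comparison measure reflecting how much each operation changes the data.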

  3. Generating process model collections

    NARCIS (Netherlands)

    Yan, Z.; Dijkman, R.M.; Grefen, P.W.P.J.

    2017-01-01

    Business process management plays an important role in the management of organizations. More and more organizations describe their operations as business processes. It is common for organizations to have collections of thousands of business processes, but for reasons of confidentiality these

  4. What makes process models understandable?

    NARCIS (Netherlands)

    Mendling, J.; Reijers, H.A.; Cardoso, J.; Alonso, G.; Dadam, P.; Rosemann, M.

    2007-01-01

    Although formal and informal quality aspects are of significant importance to business process modeling, there is little empirical work reported on process model quality and its impact factors. In this paper we investigate understandability as a proxy for quality of process models and focus

  5. Regulatory simplification of fission product chemistry

    International Nuclear Information System (INIS)

    Read, J.B.J.; Soffer, L.

    1986-01-01

    The requirements for design provisions intended to limit fission product escape during reactor accidents have been based since 1962 upon a small number of simply-stated assumptions. These assumptions permeate current reactor regulation, but are too simple to deal with the complex processes that can reasonably be expected to occur during real accidents. Potential chemical processes of fission products in severe accidents are compared with existing plant safety features designed to minimize off-site consequences, and the possibility of a new set of simply-stated assumptions to replace the 1962 set is discussed.

  6. Application of a power plant simplification methodology: The example of the condensate feedwater system

    International Nuclear Information System (INIS)

    Seong, P.H.; Manno, V.P.; Golay, M.W.

    1988-01-01

    A novel framework for the systematic simplification of power plant design is described, with a focus on its application to the optimization of condensate feedwater system (CFWS) design. The evolution of design complexity of CFWS is reviewed with emphasis upon the underlying optimization process. A new evaluation methodology is described which explicitly accounts for human as well as mechanical effects upon system availability. The unifying figure of merit for an operating system is taken to be net electricity production cost. The evaluation methodology is applied to the comparative analysis of three designs. The results illustrate how including explicit availability-related costs in the evaluation leads to optimal configurations. These differ from those of current system design practice in that thermodynamic efficiency and capital cost optimization are not overemphasized; rather, a more complete set of design-dependent variables is taken into account, and other important variables that remain neglected in current practice are identified. A critique of the new optimization approach and a discussion of future work areas, including improved human performance modeling and different optimization constraints, are provided. (orig.)

  7. Cutting red tape: national strategies for administrative simplification

    National Research Council Canada - National Science Library

    Cerri, Fabienne; Hepburn, Glen; Barazzoni, Fiorenza

    2006-01-01

    ... when the topic was new, and had a strong focus on the tools used to simplify administrative regulations. Expectations are greater today, and ad hoc simplification initiatives have in many cases been replaced by comprehensive government programmes to reduce red tape. Some instruments, such as one-stop shops, which were new then, have become widely adop...

  8. Viewpoint-Driven Simplification of Plant and Tree Foliage

    Directory of Open Access Journals (Sweden)

    Cristina Gasch

    2018-03-01

    Full Text Available Plants and trees are an essential part of outdoor scenes. They are represented by such a vast number of polygons that performing real-time visualization is still a problem in spite of the advantages of the hardware. Some methods have appeared to solve this drawback based on point- or image-based rendering. However, geometry representation is required in some interactive applications. This work presents a simplification method that deals with the geometry of the foliage, reducing the number of primitives that represent these objects and making their interactive visualization possible. It is based on an image-based simplification that establishes an order of leaf pruning and reduces the complexity of the canopies of trees and plants. The proposed simplification method is viewpoint-driven and uses the mutual information in order to choose the leaf to prune. Moreover, this simplification method avoids the pruned appearance of the tree that is usually produced when a foliage representation is formed by a reduced number of leaves. The error introduced every time a leaf is pruned is compensated for if the size of the nearest leaf is altered to preserve the leafy appearance of the foliage. Results demonstrate the good quality and time performance of the presented work.
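    The pruning-plus-compensation loop described above can be sketched as a greedy procedure. The paper orders leaves by viewpoint-driven mutual information; the sketch below substitutes a simple area-based score and is purely illustrative (all names are ours):

```python
import math

def prune_leaves(leaves, keep):
    """Greedy foliage simplification sketch: repeatedly prune the
    lowest-scoring leaf and grow its nearest neighbour so the total
    leaf area is preserved.  Each leaf is {'pos': (x, y), 'area': a};
    the paper scores leaves by mutual information, we use area."""
    leaves = [dict(l) for l in leaves]  # work on copies
    while len(leaves) > keep:
        victim = min(leaves, key=lambda l: l["area"])
        leaves.remove(victim)
        nearest = min(leaves,
                      key=lambda l: math.dist(l["pos"], victim["pos"]))
        nearest["area"] += victim["area"]  # compensate the pruning error
    return leaves

canopy = [{"pos": (0, 0), "area": 1.0},
          {"pos": (0, 1), "area": 0.2},
          {"pos": (5, 5), "area": 3.0}]
kept = prune_leaves(canopy, keep=2)
print(len(kept))                          # -> 2
print(sum(l["area"] for l in kept))       # total area preserved -> 4.2
```

    Growing the nearest surviving leaf is what keeps the canopy from acquiring the sparse, pruned appearance the abstract mentions.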

  9. 77 FR 21846 - Reserve Requirements of Depository Institutions: Reserves Simplification

    Science.gov (United States)

    2012-04-12

    ... Requirements of Depository Institutions: Reserves Simplification AGENCY: Board of Governors of the Federal Reserve System. ACTION: Final rule. SUMMARY: The Board is amending Regulation D, Reserve Requirements of Depository Institutions, to simplify the administration of reserve requirements. The final rule creates a...

  10. Neuroscientific Model of Motivational Process

    OpenAIRE

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Rewa...

  11. Modelling and dynamics analysis of heat exchanger as a distributed parameter process

    International Nuclear Information System (INIS)

    Savic, B.; Debeljkovic, D.Lj.

    2004-01-01

    A non-linear and subsequently linearized mathematical model of a fuel oil cooling chamber has been developed. This chamber is part of a recuperative heat exchanger of the tube-in-tube, counter-flow type, set in a heavy oil fraction discharge tubing. The model is built from a set of assumptions and simplifications, from which energy balance equations under non-stationary operating conditions are derived. The model takes the form of a set of partial differential equations with constant coefficients. Using numerical simulation of the transfer function, the dynamics of this process are shown in the form of transient responses that correspond quite well to the real process behavior.
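    A transient response of the kind described can be illustrated in drastically lumped form: a first-order lag driven by a unit step. The actual model is a set of partial differential equations, so this sketch (all names and numbers are our own) only shows the shape of such a response:

```python
def step_response(gain, tau, t_end, dt=0.01):
    """Euler integration of a first-order lag  tau*dy/dt + y = gain*u
    driven by a unit step: a lumped stand-in for the distributed
    heat-exchanger dynamics described above."""
    y, out = 0.0, []
    for _ in range(int(t_end / dt)):
        y += dt * (gain * 1.0 - y) / tau
        out.append(y)
    return out

# Ten time constants: the outlet-temperature analogue settles at the gain.
resp = step_response(gain=2.0, tau=5.0, t_end=50.0)
print(round(resp[-1], 2))  # -> 2.0
```

    A distributed-parameter model would additionally show the transport delay along the tube, which a single first-order lag cannot reproduce.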

  12. Modelling and dynamics analysis of heat exchanger as a distributed parameter process

    Energy Technology Data Exchange (ETDEWEB)

    Savic, B.; Debeljkovic, D.Lj. [University of Belgrade, Department of Control Engineering, Belgrade (Yugoslavia)

    2004-07-01

    A non-linear and subsequently linearized mathematical model of a fuel oil cooling chamber has been developed. This chamber is part of a recuperative heat exchanger of the tube-in-tube, counter-flow type, set in a heavy oil fraction discharge tubing. The model is built from a set of assumptions and simplifications, from which energy balance equations under non-stationary operating conditions are derived. The model takes the form of a set of partial differential equations with constant coefficients. Using numerical simulation of the transfer function, the dynamics of this process are shown in the form of transient responses that correspond quite well to the real process behavior.

  13. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.

  14. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on a proper understanding of the functionality of information systems that shall support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, it is assessed whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of the research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  15. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  16. THE ELITISM OF LEGAL LANGUAGE AND THE NEED OF SIMPLIFICATION

    Directory of Open Access Journals (Sweden)

    Antonio Escandiel de Souza

    2016-12-01

    Full Text Available This article presents the results of the research project entitled "Simplification of legal language: a study on the view of the academic community of the University of Cruz Alta". It is a qualitative study on simplifying legal language as a means of democratizing/pluralizing access to justice, in the view of scholars and Law Course teachers. Society has great difficulty understanding legal terms, which hinders access to justice. At the same time, the legal field has not moved far from its traditional formalities, which creates a divide: on one side stands society, with its problems of understanding, and on the other the law, with its inherent and intrinsic procedures. Society, however, should not have its access to the judiciary hampered by formalities arising from the law and its flowery language. Preliminary results indicate that simplification of legal language is essential to a real democratization of access to Law/Justice.

  17. Computing Strongly Homotopic Line Simplification in the Plane

    DEFF Research Database (Denmark)

    Daneshpajou, Shervin; Abam, Mohammad; Deleuran, Lasse Kosetski

    We study a variant of the line-simplification problem where we are given a polygonal path P = p1, p2, . . . , pn and a set S of m point obstacles in the plane, and the goal is to find the optimal homotopic simplification, that is, a minimum subsequence Q = q1, q2, . . . , qk (q1 = p1 and qk = pn) of P defining a polygonal path which approximates P within the given error ε and is homotopic to P. We assume all shortcuts pipj whose errors under a distance function F are at most ε can be computed in TF(n) time, where TF(n) is polynomial for all widely-used distance functions. We define the new
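    Setting aside the obstacle/homotopy constraint, which is the hard part this work addresses, the underlying minimum-vertex simplification can be sketched as an Imai-Iri style dynamic program over valid shortcuts. The names and the segment-distance error function below are our own illustration, not the paper's algorithm:

```python
import math

def point_segment_dist(p, a, b):
    """Euclidean distance from point p to segment ab."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def simplify(path, eps):
    """Minimum-vertex simplification: a shortcut (i, j) is valid if every
    skipped vertex lies within eps of segment path[i]..path[j].  The paper
    additionally requires the result to be homotopic to the input with
    respect to the obstacles; that check is omitted here."""
    n = len(path)
    valid = lambda i, j: all(
        point_segment_dist(path[k], path[i], path[j]) <= eps
        for k in range(i + 1, j))
    best = [0] + [math.inf] * (n - 1)   # fewest segments to reach vertex i
    prev = [None] * n
    for j in range(1, n):
        for i in range(j):
            if best[i] + 1 < best[j] and valid(i, j):
                best[j], prev[j] = best[i] + 1, i
    out, k = [], n - 1                  # walk back through predecessors
    while k is not None:
        out.append(path[k]); k = prev[k]
    return out[::-1]

path = [(0, 0), (1, 0.05), (2, -0.05), (3, 0), (4, 2)]
print(simplify(path, eps=0.1))  # -> [(0, 0), (3, 0), (4, 2)]
```

    Checking all O(n^2) shortcuts this way costs O(n^3) in the worst case; the TF(n) assumption in the abstract is exactly about computing those valid shortcuts faster.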

  18. Conceptual models of information processing

    Science.gov (United States)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  19. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  20. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve these equations using analytical or numerical methods consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the effect of delayed neutrons, the reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
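The radionuclide decay model mentioned among the examples rests on a single ordinary differential equation, dN/dt = -λN, with λ derived from the half-life. As a hedged illustration of what such a Simulink block computes, here is a minimal Python sketch (the nuclide count, half-life, and step size are invented for illustration) that integrates the equation with forward Euler and compares it against the analytic solution:

```python
import math

def simulate_decay(n0, half_life, t_end, dt=0.01):
    """Forward-Euler integration of dN/dt = -lambda * N."""
    lam = math.log(2) / half_life  # decay constant from half-life
    n, t = n0, 0.0
    while t < t_end:
        n += -lam * n * dt  # explicit Euler step
        t += dt
    return n

# Illustrative run: 1000 nuclei, half-life 10 s, simulated over one half-life,
# so the analytic answer is exactly half the initial population.
n_numeric = simulate_decay(1000.0, 10.0, 10.0)
n_exact = 1000.0 * math.exp(-math.log(2))
print(n_numeric, n_exact)
```

Smaller step sizes move the Euler result closer to the analytic value; a solver environment such as Simulink automates this accuracy/cost trade-off with variable-step solvers.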

  1. Modeling nuclear processes by Simulink

    International Nuclear Information System (INIS)

    Rashid, Nahrul Khair Alang Md

    2015-01-01

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve these equations using analytical or numerical methods consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the effect of delayed neutrons, the reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  2. modeling grinding modeling grinding processes as micro processes

    African Journals Online (AJOL)

    eobe

    industrial precision grinding processes are cylindrical, centerless and ... Several models have been proposed and used to study grinding ..... grinding force for the two cases were 9.07237 N/mm ..... International Journal of Machine Tools &.

  3. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    are affected (in a positive or negative way) by the presence of the other enzymes and compounds in the media. In this thesis the concept of multi-enzyme in-pot term is adopted for processes that are carried out by the combination of enzymes in a single reactor and implemented at pilot or industrial scale...... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases efficiency of the model development process with respect to time and resources needed (fast and effective....... In this way the model parameters that drives the main dynamic behavior can be identified and thus a better understanding of this type of processes. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid...

  4. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    -change of a homogeneous Levy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared......In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...... specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore we also analyze specifications obtained via a simple deterministic time...

  5. Methodologies for Systematic Assessment of Design Simplification. Annex II

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-12-15

    Nuclear power plants are sophisticated engineered systems. To achieve a commercial nuclear power plant, its functions, systems and components need to be elaborated from design ideas to technical solutions and to the appropriate hardware over a long period of time. Along the way, several design alternatives usually compete for implementation in the final plant. Engineering teams perform assessments, comparing different proposed engineering options in order to select an appropriate solution for the specific plant aimed at specific customers. This is a common process in design evolution. During such assessments, the trade-offs associated with different options are not always as simple as they seem at very early design stages. Any requirement (e.g. relevant to safety, availability or competitiveness) usually has several dimensions; therefore, a change in the design aimed at producing the targeted effect (e.g. simplification of passive safety systems) as a rule produces other effects not directly related to the original idea. This means that the assessment needs to be carried out in iterations, so as not to bypass any meaningful feedback. The assessment then becomes a challenge for those designers who are interested in exploring innovative approaches and simplified systems. Unlike in several developed countries, nuclear energy has so far been only marginally used in small and medium sized developing countries. One important reason for this has been the lack of competitive commercial nuclear options with small and medium sized reactors (SMRs). The challenge for SMR designers has thus been to design simpler plants in order to counterbalance the well known penalties of the economy of scale. The lack of experience with SMRs in small and medium sized developing countries could be viewed as practical proof of the lack of commercial success of such reactors. Fossil fuelled gas turbine technologies offer very competitive energy options available from tens to hundreds of MW(e), with

  6. Markov Decision Process Measurement Model.

    Science.gov (United States)

    LaMar, Michelle M

    2018-03-01

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than a partial-credit IRT model based on outcome data alone.
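A Markov decision process of the kind used in this measurement model is defined by states, actions, transition probabilities, rewards, and a discount factor; optimal behavior follows from iterating the Bellman equation. Below is a minimal value-iteration sketch on a hypothetical two-state game (the states, rewards, and discount factor are invented for illustration and are not taken from the paper):

```python
def value_iteration(states, actions, transition, reward, gamma=0.9, tol=1e-6):
    """Compute optimal state values V(s) by iterating the Bellman equation."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(
                sum(p * (reward(s, a, s2) + gamma * V[s2])
                    for s2, p in transition(s, a).items())
                for a in actions
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

# Hypothetical game: from 'play' you may 'risk' (game may continue) or 'stop'.
states = ['play', 'done']
actions = ['risk', 'stop']

def transition(s, a):
    if s == 'done':
        return {'done': 1.0}               # absorbing terminal state
    if a == 'risk':
        return {'play': 0.5, 'done': 0.5}  # half the time the game ends
    return {'done': 1.0}                   # stopping always ends the game

def reward(s, a, s2):
    if s == 'done':
        return 0.0
    return 2.0 if a == 'risk' else 1.0     # risking pays more per step

V = value_iteration(states, actions, transition, reward)
print(V['play'] > V['done'])  # True: continuing to play carries positive value
```

In the psychometric setting, the latent trait would enter through the reward or transition parameters, and observed action sequences would be scored against the resulting optimal policy.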

  7. Social Models: Blueprints or Processes?

    Science.gov (United States)

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  8. Simple Models for Process Control

    Czech Academy of Sciences Publication Activity Database

    Gorez, R.; Klán, Petr

    2011-01-01

    Roč. 22, č. 2 (2011), s. 58-62 ISSN 0929-2268 Institutional research plan: CEZ:AV0Z10300504 Keywords: process models * PID control * second order dynamics Subject RIV: JB - Sensors, Measurement, Regulation

  9. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model is developed for the raw-material supply of processing enterprises that belong to a vertically integrated structure for the production and processing of dairy raw materials. The model is distinguished by its orientation toward the cumulative effect achieved by the integrated structure, which serves as the criterion function. This function is maximized by optimizing capacities, the volumes and qualitative characteristics of raw-material deliveries, the costs of industrial processing of raw materials, and the demand for dairy products.

  10. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative haz...

  11. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...

  12. Metrics for Business Process Models

    Science.gov (United States)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores to what extent certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.
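Complexity metrics of the kind evaluated in such studies can be as simple as counts over the model graph. A hedged sketch, assuming a process model represented as a plain node/arc graph (the two metrics shown, size and the coefficient of connectivity CNC, are common candidates, not necessarily the exact set studied here):

```python
def model_metrics(nodes, arcs):
    """Two simple structural metrics used as candidate error determinants:
    size = |nodes|, and the coefficient of connectivity CNC = |arcs| / |nodes|."""
    size = len(nodes)
    cnc = len(arcs) / size if size else 0.0
    return {'size': size, 'CNC': cnc}

# Hypothetical mini process model: start -> A -> XOR split -> (B | C) -> end
nodes = ['start', 'A', 'xor', 'B', 'C', 'end']
arcs = [('start', 'A'), ('A', 'xor'), ('xor', 'B'), ('xor', 'C'),
        ('B', 'end'), ('C', 'end')]
print(model_metrics(nodes, arcs))  # {'size': 6, 'CNC': 1.0}
```

Testing the hypothesis then reduces to correlating such metric values with observed error rates across a model collection.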

  13. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Full Text Available Results of the numerical modeling of dynamic problems are summed up in the article. These problems are characteristic of various areas of human activity, in particular of problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas-cleaning equipment, and modeling of biogas formation processes.

  14. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

    integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, and from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined-process perspectives. Second, the paper translates this information into a business process model and a mathematical formalization. The study provides a useful guide to the various relevant technology-related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  15. Declarative modeling for process supervision

    International Nuclear Information System (INIS)

    Leyval, L.

    1989-01-01

    Our work is a contribution to the computer-aided supervision of continuous processes. It is inspired by an area of Artificial Intelligence: qualitative physics. Here, supervision is based on a model which continuously provides operators with a synthetic view of the process; this model is founded on general principles of control theory rather than on physics. It involves concepts such as high gain or small time response, and it helps in temporally linking the evolution of various variables. Moreover, the model provides predictions of the future behaviour of the process, which allows action advice and alarm filtering. This should greatly reduce the famous cognitive overload associated with any complex and dangerous evolution of the process.

  16. The minimum attention plant inherent safety through LWR simplification

    International Nuclear Information System (INIS)

    Turk, R.S.; Matzie, R.A.

    1987-01-01

    The Minimum Attention Plant (MAP) is a unique small LWR that achieves greater inherent safety, improved operability, and reduced costs through design simplification. The MAP is a self-pressurized, indirect-cycle light water reactor with full natural circulation primary coolant flow and multiple once-through steam generators located within the reactor vessel. A fundamental tenet of the MAP design is its complete reliance on existing LWR technology. This reliance on conventional technology provides an extensive experience base which gives confidence in judging the safety and performance aspects of the design.

  17. Ecosystem simplification, biodiversity loss and plant virus emergence.

    Science.gov (United States)

    Roossinck, Marilyn J; García-Arenal, Fernando

    2015-02-01

    Plant viruses can emerge into crops from wild plant hosts, or conversely from domestic (crop) plants into wild hosts. Changes in ecosystems, including loss of biodiversity and increases in managed croplands, can impact the emergence of plant virus disease. Although data are limited, in general the loss of biodiversity is thought to contribute to disease emergence. More in-depth studies have been done for human viruses, but studies with plant viruses suggest similar patterns, and indicate that simplification of ecosystems through increased human management may increase the emergence of viral diseases in crops. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret......Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent...

  19. Neuroscientific Model of Motivational Process

    Science.gov (United States)

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  20. Neuroscientific model of motivational process.

    Science.gov (United States)

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  1. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

    Full Text Available To study economic phenomena and processes using mathematical modeling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package. Any economic process or phenomenon has a mathematical description of its behavior, from which an economic-mathematical model is drawn up in the following stages: formulation of the problem, analysis of the process being modeled, production of the model, and design verification, validation and implementation of the model. This article presents an economic model built with mathematical equations and the MatLab software package, which helps us approximate an effective solution. The input data are the net cost, the direct cost and the total cost, and the link between them. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with graphic representation and interpretation of the results achieved in terms of our specific problem.
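The article's exact cost equations are not reproduced in the record above. As a purely hypothetical illustration of the kind of link between direct, indirect, total and per-unit (net) cost that such a model computes (all figures and the linear-cost assumption are invented):

```python
def total_cost(direct, indirect):
    """Total cost as the sum of direct and indirect cost components."""
    return direct + indirect

def net_cost_per_unit(direct, indirect, units):
    """Per-unit cost obtained by spreading the total cost over output volume."""
    return total_cost(direct, indirect) / units

print(total_cost(1200.0, 300.0))             # 1500.0
print(net_cost_per_unit(1200.0, 300.0, 50))  # 30.0
```

In MatLab the same relations would typically be expressed as vectorized expressions over ranges of output volumes, which is what enables the graphic representation the article mentions.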

  2. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    and having three or more stages. The methods are applied to a process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables...... be performed regarding the foreseeable output property y, and with respect to an admissible range of correcting actions for the parameters of the next stage. In this paper the basic principles of path modeling is presented. The mathematics is presented for processes having only one stage, having two stages...... of the next stage with the purpose of obtaining optimal or almost optimal quality of the output variable. An important aspect of the methods presented is the possibility of extensive graphic analysis of data that can provide the engineer with a detailed view of the multi-variate variation in data....

  3. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

    Full Text Available This paper presents a model for an integrated security system, which can be implemented in any organization. It is based on security-specific standards and taxonomies such as ISO 7498-2 and the Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality, and we also focus on the specific components.

  4. Mathematical modelling in economic processes.

    Directory of Open Access Journals (Sweden)

    L.V. Kravtsova

    2008-06-01

    Full Text Available This article considers a number of methods of mathematical modelling of economic processes, and the opportunities for using Excel spreadsheets to obtain the optimum solution of tasks or to calculate financial operations with the help of the built-in functions.

  5. Visualizing the process of process modeling with PPMCharts

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the quest for knowledge about how to make good process models, recent research focus is shifting from studying the quality of process models to studying the process of process modeling (often abbreviated as PPM) itself. This paper reports on our efforts to visualize this specific process in such

  6. Neuroscientific Model of Motivational Process

    Directory of Open Access Journals (Sweden)

    Sung-Il eKim

    2013-03-01

    Full Text Available Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area), in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  7. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  8. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  9. Welding process modelling and control

    Science.gov (United States)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system supporting Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for the use of data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not the intention for this system to be used for welding process control.

  10. Efficient Simplification Methods for Generating High Quality LODs of 3D Meshes

    Institute of Scientific and Technical Information of China (English)

    Muhammad Hussain

    2009-01-01

    Two simplification algorithms are proposed for automatic decimation of polygonal models and for generating their LODs. Each algorithm orders vertices according to their priority values and then removes them iteratively. For setting the priority value of each vertex, exploiting the normal field of its one-ring neighborhood, we introduce a new measure of geometric fidelity that reflects well the local geometric features of the vertex. After a vertex is selected, using other measures of geometric distortion based on normal field deviation and a distance measure, it is decided which of the edges incident on the vertex is to be collapsed for removing it. The collapsed edge is substituted with a new vertex whose position is found by minimizing the local quadric error measure. A comparison with the state-of-the-art algorithms reveals that the proposed algorithms are simple to implement, are computationally more efficient, generate LODs of better quality, and preserve salient features even after drastic simplification. The methods are useful for applications such as 3D computer games and virtual reality, where the focus is on fast running time, reduced memory overhead, and high-quality LODs.
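    The abstract does not spell out its quadric error measure, but the quantity it minimizes is the standard one from Garland-Heckbert-style edge-collapse simplification. As a rough, hedged illustration (the toy mesh, the midpoint fallback, and all names below are assumptions for this sketch, not details from the paper), the cost of collapsing an edge can be computed from per-vertex sums of plane quadrics:

    ```python
    import numpy as np

    def face_quadric(p0, p1, p2):
        """Fundamental quadric K = n n^T for the plane of a triangle,
        where n = (a, b, c, d) and ax + by + cz + d = 0 with (a, b, c) unit."""
        normal = np.cross(p1 - p0, p2 - p0)
        normal = normal / np.linalg.norm(normal)
        d = -normal.dot(p0)
        n = np.append(normal, d)          # homogeneous plane vector
        return np.outer(n, n)             # 4x4 quadric

    def vertex_quadrics(vertices, faces):
        """Sum the quadrics of all faces incident on each vertex."""
        Q = [np.zeros((4, 4)) for _ in vertices]
        for i, j, k in faces:
            K = face_quadric(vertices[i], vertices[j], vertices[k])
            for v in (i, j, k):
                Q[v] += K
        return Q

    def collapse_cost(Qi, Qj, pi, pj):
        """Optimal new position and cost v^T (Qi+Qj) v for collapsing edge (i, j)."""
        Q = Qi + Qj
        A = Q.copy()
        A[3] = [0.0, 0.0, 0.0, 1.0]       # enforce homogeneous coordinate = 1
        try:
            v = np.linalg.solve(A, [0.0, 0.0, 0.0, 1.0])
        except np.linalg.LinAlgError:
            v = np.append((pi + pj) / 2.0, 1.0)   # singular: fall back to midpoint
        return v[:3], float(v @ Q @ v)

    # Toy mesh: a unit square split into two coplanar triangles.
    verts = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [0., 1., 0.]])
    faces = [(0, 1, 2), (0, 2, 3)]
    Q = vertex_quadrics(verts, faces)
    pos, cost = collapse_cost(Q[0], Q[2], verts[0], verts[2])
    # Flat mesh: collapsing the diagonal costs exactly 0.0.
    print(pos, cost)
    ```

    In a full decimation loop, all candidate collapses would sit in a priority queue keyed on this cost; the abstract's contribution is a different priority measure (normal-field fidelity) for choosing which vertex to remove first.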

  11. A New Algorithm for Cartographic Simplification of Streams and Lakes Using Deviation Angles and Error Bands

    Directory of Open Access Journals (Sweden)

    Türkay Gökgöz

    2015-10-01

    Multi-representation databases (MRDBs) are used in several geographical information system applications for different purposes. MRDBs are mainly obtained through model and cartographic generalizations. Simplification is the essential operator of cartographic generalization, and streams and lakes are essential features in hydrography. In this study, a new algorithm was developed for the simplification of streams and lakes. In this algorithm, deviation angles and error bands are used to determine the characteristic vertices and the planimetric accuracy of the features, respectively. The algorithm was tested using a high-resolution national hydrography dataset of Pomme de Terre, a sub-basin in the USA. To assess the performance of the new algorithm, the Bend Simplify and Douglas-Peucker algorithms, the medium-resolution hydrography dataset of the sub-basin, and Töpfer's radical law were used. For quantitative analysis, the vertex numbers, the lengths, and the sinuosity values were computed. Consequently, it was shown that the new algorithm was able to meet the main requirements (i.e., accuracy, legibility and aesthetics, and storage).
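    The deviation-angle algorithm itself cannot be reconstructed from the abstract, but the Douglas-Peucker baseline it is compared against is standard. A minimal sketch (the point representation and the sample polyline are illustrative):

    ```python
    import math

    def perp_dist(p, a, b):
        """Perpendicular distance from point p to the line through a and b."""
        (px, py), (ax, ay), (bx, by) = p, a, b
        dx, dy = bx - ax, by - ay
        seg = math.hypot(dx, dy)
        if seg == 0.0:
            return math.hypot(px - ax, py - ay)
        return abs(dx * (ay - py) - dy * (ax - px)) / seg

    def douglas_peucker(points, tol):
        """Classic recursive line simplification with tolerance `tol`."""
        if len(points) < 3:
            return list(points)
        dmax, idx = 0.0, 0
        for i in range(1, len(points) - 1):
            d = perp_dist(points[i], points[0], points[-1])
            if d > dmax:
                dmax, idx = d, i
        if dmax <= tol:                      # the whole span fits in the error band
            return [points[0], points[-1]]
        left = douglas_peucker(points[: idx + 1], tol)
        right = douglas_peucker(points[idx:], tol)
        return left[:-1] + right             # drop the duplicated split vertex

    # A small synthetic stream centreline with one sharp bend.
    stream = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
    print(douglas_peucker(stream, 1.0))
    ```

    Note that Douglas-Peucker guarantees a maximum perpendicular error (an error band around the simplified line) but, unlike bend-based or angle-based methods, pays no attention to legibility or aesthetics, which is precisely the gap the abstract's algorithm targets.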

  12. The Study of Simplification and Explicitation Techniques in Khaled Hosseini's “A Thousand Splendid Suns”

    OpenAIRE

    Reza Kafipour

    2016-01-01

    Teaching and learning strategies help facilitate teaching and learning. Among them, simplification and explicitation strategies are those which help transfer the meaning to the learners and readers of a translated text. The aim of this study was to investigate explicitation and simplification in the Persian translation of Khaled Hosseini's novel “A Thousand Splendid Suns”. The study also attempted to find out the frequencies of the simplification and explicitation techniques used by the translator...

  13. Advanced oxidation processes: overall models

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, M. [Univ. de los Andes, Escuela Basica de Ingenieria, La Hechicera, Merida (Venezuela); Curco, D.; Addardak, A.; Gimenez, J.; Esplugas, S. [Dept. de Ingenieria Quimica. Univ. de Barcelona, Barcelona (Spain)

    2003-07-01

    Modelling AOPs implies considering all the steps included in the process, that is, the mass transfer, kinetic (reaction), and luminic steps. Accordingly, recent works develop models which relate the global reaction rate to catalyst concentration and radiation absorption. However, the application of such models requires knowing the controlling step of the overall process. In this paper, a simple method is explained which allows the controlling step to be determined. It is assumed that the reactor is divided into two hypothetical zones (dark and illuminated), and according to the experimental results, obtained by varying only the reaction volume, it can be decided whether reaction occurs only in the illuminated zone or in the whole reactor, including the dark zone. The photocatalytic degradation of phenol, using titania Degussa P-25 as catalyst, is studied as the model reaction. The preliminary results obtained are presented here, showing that, in this case, reaction seems to occur only in the illuminated zone of the photoreactor. A model is developed to explain this behaviour. (orig.)

  14. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes amorphous experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  15. Subthalamic stimulation: toward a simplification of the electrophysiological procedure.

    Science.gov (United States)

    Fetter, Damien; Derrey, Stephane; Lefaucheur, Romain; Borden, Alaina; Wallon, David; Chastan, Nathalie; Maltete, David

    2016-06-01

    The aim of the present study was to assess the consequences of a simplification of the electrophysiological procedure on the post-operative clinical outcome after subthalamic nucleus implantation in Parkinson disease. Microelectrode recordings were performed on 5 parallel trajectories in group 1 and on fewer than 5 trajectories in group 2. Clinical evaluations were performed 1 month before and 6 months after surgery. After surgery, the UPDRS III score in the off-drug/on-stimulation and on-drug/on-stimulation conditions significantly improved by 66.9% and 82%, respectively, in group 1, and by 65.8% and 82.3% in group 2 (P<0.05). Meanwhile, the total number of words (P<0.05) significantly decreased for fluency tasks in both groups. Motor disability improvement and medication reduction were similar in both groups. Our results suggest that the electrophysiological procedure can be simplified as the team's experience increases.

  16. System Model of Heat and Mass Transfer Process for Mobile Solvent Vapor Phase Drying Equipment

    Directory of Open Access Journals (Sweden)

    Shiwei Zhang

    2014-01-01

    The solvent vapor phase drying process is one of the most important processes during the production and maintenance of large oil-immersed power transformers. In this paper, the working principle, system composition, and technological process of mobile solvent vapor phase drying (MVPD) equipment for transformers are introduced in detail. On the basis of necessary simplifications and assumptions for the MVPD equipment and process, a heat and mass transfer mathematical model comprising 40 equations is established, which completely represents the thermodynamic laws of the phase change and transport processes of solvent, water, and air in the MVPD technological process, and describes in detail the quantitative relationships among important physical quantities such as temperature, pressure, and flux in the key equipment units and process stages. Taking a practical field drying process of a 500 kV/750 MVA power transformer as an example, the simulation of a complete technological process is carried out by programming with MATLAB software, and relation curves of key process parameters changing with time are obtained, such as body temperature, tank pressure, and water yield. The trend of the theoretical simulation results is very consistent with the actual production record data, which verifies the correctness of the established mathematical model.

  17. Probabilistic evaluation of process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2016-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  18. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes with a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axons, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and the biological questions that require the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary differential equations and partial differential equations; background in biology is not required. Students will gain knowledge of how to program with MATLAB without previous programming experience and how to use codes in order to test biological hypotheses.

  19. Geological heterogeneity: Goal-oriented simplification of structure and characterization needs

    Science.gov (United States)

    Savoy, Heather; Kalbacher, Thomas; Dietrich, Peter; Rubin, Yoram

    2017-11-01

    Geological heterogeneity, i.e. the spatial variability of discrete hydrogeological units, is investigated in an aquifer analog of glacio-fluvial sediments to determine how such a geological structure can be simplified for characterization needs. The aquifer analog consists of ten hydrofacies, whereas the scarcity of measurements in typical field studies precludes such detailed spatial models of hydraulic properties. Of particular interest is the role of connectivity of the hydrofacies structure, along with its effect on the connectivity of mass transport, in site characterization for predicting early arrival times. Transport through three realizations of the aquifer analog is modeled with numerical particle tracking to ascertain the fast flow channel through which early arriving particles travel. Three simplification schemes of two-facies models are considered to represent the aquifer analogs, and the velocity within the fast flow channel is used to estimate the apparent hydraulic conductivity of the new facies. The facies models in which the discontinuous patches of high hydraulic conductivity are separated from the rest of the domain yield the closest match in early arrival times compared to the aquifer analog, but assuming a continuous high hydraulic conductivity channel connecting these patches yields underestimated early arrival times within the range of variability between the realizations, which implies that the three simplification schemes could all be recommended but pose different implications for field measurement campaigns. Overall, the results suggest that the result of transport connectivity, i.e. early arrival times, within realistic geological heterogeneity can be conserved even when the underlying structural connectivity is modified.

  20. Modeling pellet impact drilling process

    Science.gov (United States)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high-velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling the process of pellet impact drilling, which creates the scientific and methodological basis for the engineering design of drilling operations under different geo-technical conditions.

  1. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps, are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  2. The limits of simplification in translated isiZulu health texts | Ndlovu ...

    African Journals Online (AJOL)

    Simplification, defined as the practice of simplifying the language used in translation, is regarded as one of the universal features of translation. This article investigates the limitations of simplification encountered in efforts to make translated isiZulu health texts more accessible to the target readership. The focus is on public ...

  3. 7 CFR 3015.311 - Simplification, consolidation, or substitution of State plans.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 Simplification, consolidation, or substitution of... (Continued) OFFICE OF THE CHIEF FINANCIAL OFFICER, DEPARTMENT OF AGRICULTURE UNIFORM FEDERAL ASSISTANCE... Simplification, consolidation, or substitution of State plans. (a) As used in this section: (1) Simplify means...

  4. Collapse models and perceptual processes

    International Nuclear Information System (INIS)

    Ghirardi, Gian Carlo; Romano, Raffaele

    2014-01-01

    Theories including a collapse mechanism were presented various years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merits derive from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems as well as for the reduction associated with measurement processes and for the classical behavior of macroscopic objects. Since such theories qualify themselves not as new interpretations but as modifications of the standard theory, they can, in principle, be tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, it seems that recent technological developments at least allow precise limits to be put on the parameters characterizing the modifications of the evolution equation. Here we will simply mention some of the recent investigations in this direction, while we will mainly concentrate our attention on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures by means of standard macroscopic devices will be discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. We make plausible, by discussing a toy model in detail, that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  5. Hillslope runoff processes and models

    Science.gov (United States)

    Kirkby, Mike

    1988-07-01

    Hillslope hydrology is concerned with the partition of precipitation, as it passes through the vegetation and soil, between overland flow and subsurface flow. Flow follows routes which attenuate and delay the flow to different extents, so that a knowledge of the relevant mechanisms is important. In the 1960s and 1970s, hillslope hydrology developed as a distinct topic through the application of new field observations to develop a generation of physically based forecasting models. In its short history, theory has continually been overturned by field observation. Thus the current tendency, particularly among temperate zone hydrologists, to dismiss all Hortonian overland flow as a myth is now being corrected by a number of significant field studies which reveal the great range in both climatic and hillslope conditions. Some recent models have generally attempted to simplify the processes acting, for example including only vertical unsaturated flow and lateral saturated flow. Others explicitly forecast partial or contributing areas. With hindsight, the most complete and distributed models have generally shown little forecasting advantage over simpler approaches, perhaps trending towards reliable models which can run on desktop microcomputers. The variety now being recognised in hillslope hydrological responses should also lead to models which take account of more complex interactions, even if initially with a less secure physical and mathematical basis than the Richards equation. In particular, there is a need to respond to the variety of climatic responses, and to spatial variability on and beneath the surface, including the role of seepage macropores and pipes, which call into question whether the hillside can be treated as a Darcian flow system.

  6. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as

  7. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    Probabilistic properties of Cox processes of relevance for statistical modelling and inference are studied. Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties and point process operations such as thinning, displacements, and superpositioning. We also discuss how to simulate specific Cox processes.

  8. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer-specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration with a major international engineering company.

  9. Automatic extraction of process categories from process model collections

    NARCIS (Netherlands)

    Malinova, M.; Dijkman, R.M.; Mendling, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Many organizations build up their business process management activities in an incremental way. As a result, there is no overarching structure defined at the beginning. However, as business process modeling initiatives often yield hundreds to thousands of process models, there is a growing need for

  10. Four Common Simplifications of Multi-Criteria Decision Analysis do not hold for River Rehabilitation.

    Science.gov (United States)

    Langhans, Simone D; Lienert, Judit

    2016-01-01

    River rehabilitation aims at alleviating negative effects of human impacts such as loss of biodiversity and reduction of ecosystem services. Such interventions entail difficult trade-offs between different ecological and often socio-economic objectives. Multi-Criteria Decision Analysis (MCDA) is a very suitable approach that helps assess the current ecological state and prioritize river rehabilitation measures in a standardized way, based on stakeholder or expert preferences. Applications of MCDA in river rehabilitation projects are often simplified, i.e. using a limited number of objectives and indicators, assuming linear value functions, aggregating individual indicator assessments additively, and/or assuming risk neutrality of experts. Here, we demonstrate an application of MCDA expert preference assessments to river rehabilitation and provide ample material for other applications. To test whether the above simplifications reflect common expert opinion, we carried out very detailed interviews with five river ecologists and a hydraulic engineer. We defined essential objectives and measurable quality indicators (attributes), elicited the experts' preferences for objectives on a standardized scale (value functions) and their risk attitude, and identified suitable aggregation methods. The experts recommended an extensive objectives hierarchy including between 54 and 93 essential objectives and between 37 and 61 essential attributes. For 81% of these, they defined non-linear value functions and in 76% recommended multiplicative aggregation. The experts were risk averse or risk prone (but never risk neutral), depending on the current ecological state of the river and the experts' personal importance of objectives. We conclude that the four commonly applied simplifications clearly do not reflect the opinion of river rehabilitation experts. The optimal level of model complexity, however, remains highly case-study specific depending on data and resource
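    As a toy illustration of why two of these simplifications matter, the sketch below contrasts additive and multiplicative aggregation under a hypothetical exponential value function. The weights, scores, and the functional form are invented for illustration; they are not the elicited values from the study:

    ```python
    import math

    def exp_value(x, x_worst, x_best, c):
        """Hypothetical non-linear (exponential) value function mapping an
        attribute level x onto [0, 1]; c != 0 bends the curve."""
        z = (x - x_worst) / (x_best - x_worst)
        return (1 - math.exp(-c * z)) / (1 - math.exp(-c))

    def additive(values, weights):
        """Weighted arithmetic mean: good scores can compensate bad ones."""
        return sum(w * v for w, v in zip(weights, values))

    def multiplicative(values, weights):
        """Weighted geometric mean: a zero on any objective drags the overall
        score to 0, i.e. poor performance cannot be fully compensated."""
        return math.prod(v ** w for w, v in zip(weights, values))

    # Two hypothetical rehabilitation options scored on three objectives (0..1).
    weights = [0.5, 0.3, 0.2]
    balanced = [0.6, 0.6, 0.6]
    extreme = [1.0, 1.0, 0.0]   # excellent on two objectives, fails the third

    for option in (balanced, extreme):
        vals = [exp_value(x, 0.0, 1.0, 2.0) for x in option]
        print(round(additive(vals, weights), 3),
              round(multiplicative(vals, weights), 3))
    ```

    The "extreme" option scores well additively but collapses to zero multiplicatively, which is exactly the non-compensatory behavior the interviewed experts recommended for most objectives.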

  11. Process mining using BPMN: relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; van der Aalst, W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2017-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  12. Process mining using BPMN : relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; Aalst, van der W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2015-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  13. Multiphysics modelling of manufacturing processes: A review

    DEFF Research Database (Denmark)

    Jabbari, Masoud; Baran, Ismet; Mohanty, Sankhya

    2018-01-01

    Numerical modelling is increasingly supporting the analysis and optimization of manufacturing processes in the production industry. Even if being mostly applied to multistep processes, single process steps may be so complex by nature that the needed models to describe them must include multiphysics...... the diversity in the field of modelling of manufacturing processes as regards process, materials, generic disciplines as well as length scales: (1) modelling of tape casting for thin ceramic layers, (2) modelling the flow of polymers in extrusion, (3) modelling the deformation process of flexible stamps for nanoimprint lithography, (4) modelling manufacturing of composite parts and (5) modelling the selective laser melting process. For all five examples, the emphasis is on modelling results as well as describing the models in brief mathematical details. Alongside with relevant references to the original work...

  14. Modeling Suspension and Continuation of a Process

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2012-04-01

    This work focuses on the difficulties an analyst encounters when modeling suspension and continuation of a process in contemporary process modeling languages. As a basis, a general lifecycle of an activity is introduced and then compared to the activity lifecycles supported by individual process modeling languages. The comparison shows that the contemporary process modeling languages cover the defined general lifecycle of an activity only partially. Two popular process modeling languages are then selected, and a real example is modeled to review how these languages cope with their lack of native support for suspension and continuation of an activity. Given the unsatisfactory results of the contemporary process modeling languages in the modeled example, a new process modeling language is presented which, as demonstrated, is capable of capturing suspension and continuation of an activity in a much simpler and more precise way.

  15. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  16. Emergency planning simplification: Why ALWR designs shall support this goal

    International Nuclear Information System (INIS)

    Tripputi, I.

    2004-01-01

    Emergency Plan simplification could be achieved only if it can be proved, in a context of balanced national health protection policies, that there is a reduced or no technical need for some elements of it, and that public protection is assured in all considered situations regardless of protective actions outside the plant. These objectives may be technically supported if one or more of the following conditions are complied with: 1. Accidents potentially releasing large amounts of fission products can be ruled out by characteristics of the designs. 2. Plant engineered features (and the containment system in particular) are able to drastically mitigate the radioactive releases under all conceivable scenarios. 3. A realistic approach to the consequence evaluation can reduce the expected consequences to effects below any concern. Unfortunately, no single approach is either technically feasible or justified in a perspective of defense in depth, and only a mix of them may provide the necessary conditions. It appears that most or all proposed ALWR designs address the technical issues whose solutions are the bases for eliminating the need for a number of protective actions (evacuation, relocation, sheltering, iodine tablets administration, etc.) even in the case of a severe accident. Some designs are mainly oriented to preventing the need for short-term protective actions; they credit simplified Emergency Plans or the capabilities of existing civil protection organizations for public relocation in the long term, if needed. Others also take into account the overall releases, to exclude or minimize public relocation and land contamination. Design targets for population individual doses and for land contamination proposed in Italy are discussed in the paper. It is also shown that these limits, while challenging, appear to be within the reach of the next-generation designs currently studied in Italy. (author)

  17. Process modeling for Humanities: tracing and analyzing scientific processes

    OpenAIRE

    Hug , Charlotte; Salinesi , Camille; Deneckere , Rebecca; Lamasse , Stéphane

    2011-01-01

    International audience; This paper concerns epistemology and the understanding of research processes in Humanities, such as Archaeology. We believe that to properly understand research processes, it is essential to trace them. The collected traces depend on the process model established, which has to be as accurate as possible to exhaustively record the traces. In this paper, we briefly explain why the existing process models for Humanities are not sufficient to represent traces. We then pres...

  18. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    The article outlines the application of the process approach to the functional description of a designed IT system supporting the operations of a secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented manually (“as is”) and the target processes (“to be”) using RFID technology for the purpose of their automation. Additionally, examples of applying methods of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency are presented. An extension of the process analysis method is the possibility of applying a process warehouse and process mining methods.

  19. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the software process improvement effort as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
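    The paper's correlation model is built on CMMI structure and empirical data not reproduced in the abstract. As a minimal hedged sketch of the underlying idea, one can compute plain Pearson correlations between assessment ratings of process elements across projects; all practice names and ratings below are hypothetical, not CMMI data from the paper:

    ```python
    import math

    def pearson(xs, ys):
        """Plain Pearson correlation between two rating series."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical assessment ratings (0..3) for three practices over five projects.
    ratings = {
        "requirements_mgmt": [1, 2, 2, 3, 3],
        "project_planning":  [1, 2, 3, 3, 3],
        "config_mgmt":       [3, 2, 2, 1, 1],
    }

    pairs = [("requirements_mgmt", "project_planning"),
             ("requirements_mgmt", "config_mgmt")]
    for a, b in pairs:
        r = pearson(ratings[a], ratings[b])
        print(f"{a} ~ {b}: r = {r:+.2f}")
    ```

    Strongly correlated elements would then be grouped in the improvement plan so that one set of actions addresses them together, which is the efficiency gain the paper is after.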

  20. Syntax highlighting in business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Freytag, T.; Mendling, J.; Eckleder, A.

    2011-01-01

    Sense-making of process models is an important task in various phases of business process management initiatives. Despite this, there is currently hardly any functionality in business process modeling tools to adequately support model comprehension. In this paper we adapt the concept of syntax

  1. Configurable multi-perspective business process models

    NARCIS (Netherlands)

    La Rosa, M.; Dumas, M.; Hofstede, ter A.H.M.; Mendling, J.

    2011-01-01

    A configurable process model provides a consolidated view of a family of business processes. It promotes the reuse of proven practices by providing analysts with a generic modeling artifact from which to derive individual process models. Unfortunately, the scope of existing notations for

  2. Model reduction and physical understanding of slowly oscillating processes : the circadian cycle.

    Energy Technology Data Exchange (ETDEWEB)

    Goussis, Dimitris A. (Ploutonos 7, Palaio Faliro, Greece); Najm, Habib N.

    2006-01-01

    A differential system that models the circadian rhythm in Drosophila is analyzed with the computational singular perturbation (CSP) algorithm. Reduced nonstiff models of prespecified accuracy are constructed, the form and size of which are time-dependent. When compared with conventional asymptotic analysis, CSP exhibits superior performance in constructing reduced models, since it can algorithmically identify and apply all the required order-of-magnitude estimates and algebraic manipulations. A similar performance is demonstrated by CSP in generating data that allow for the acquisition of physical understanding. It is shown that the processes driving the circadian cycle are (i) mRNA translation into monomer protein, and monomer protein destruction by phosphorylation and degradation (along the largest portion of the cycle); and (ii) mRNA synthesis (along a short portion of the cycle). These are slow processes. Their action in driving the cycle is allowed by the equilibration of the fastest processes: (1) the monomer dimerization with the dimer dissociation (along the largest portion of the cycle); and (2) the net production of monomer+dimer proteins with that of mRNA (along the short portion of the cycle). Additional results (regarding the time scales of the established equilibria, their origin, the rate-limiting steps, the couplings among the variables, etc.) highlight the utility of CSP for automated identification of the important underlying dynamical features, otherwise accessible only for simple systems whose various suitable simplifications can easily be recognized.
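The fast–slow equilibration idea behind such reductions can be illustrated with a minimal sketch (not the CSP algorithm itself, and a toy linear system rather than the circadian model): once the fast variable equilibrates, a nonstiff reduced model reproduces the slow dynamics.

```python
# Minimal illustration (not the CSP algorithm): reducing a stiff fast-slow
# linear system by equilibrating the fast variable.
# Full system:   x' = -y,   eps * y' = x - y    (y relaxes quickly to x)
# Reduced model after equilibration:  x' = -x  =>  x(t) = x0 * exp(-t)

import math

def integrate_full(x0, y0, eps, t_end, dt):
    """Explicit Euler on the stiff full system (dt must resolve eps)."""
    x, y = x0, y0
    for _ in range(int(t_end / dt)):
        dx = -y
        dy = (x - y) / eps
        x, y = x + dt * dx, y + dt * dy
    return x

eps = 1e-3
x_full = integrate_full(x0=1.0, y0=1.0, eps=eps, t_end=1.0, dt=1e-4)
x_reduced = math.exp(-1.0)          # closed form of the nonstiff reduced model
print(abs(x_full - x_reduced))      # small: the reduced model tracks the full one
```

Integrating the full system requires a time step on the order of eps; the reduced model has a closed-form solution, which is the payoff of the reduction.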

  3. User's manual for a process model code

    International Nuclear Information System (INIS)

    Kern, E.A.; Martinez, D.P.

    1981-03-01

    The MODEL code has been developed for computer modeling of materials processing facilities associated with the nuclear fuel cycle. However, it can also be used in other modeling applications. This report provides sufficient information for a potential user to apply the code to specific process modeling problems. Several examples that demonstrate most of the capabilities of the code are provided

  4. Repairing process models to reflect reality

    NARCIS (Netherlands)

    Fahland, D.; Aalst, van der W.M.P.; Barros, A.; Gal, A.; Kindler, E.

    2012-01-01

    Process mining techniques relate observed behavior (i.e., event logs) to modeled behavior (e.g., a BPMN model or a Petri net). Processes models can be discovered from event logs and conformance checking techniques can be used to detect and diagnose differences between observed and modeled behavior.

  5. Genetic Process Mining: Alignment-based Process Model Mutation

    NARCIS (Netherlands)

    Eck, van M.L.; Buijs, J.C.A.M.; Dongen, van B.F.; Fournier, F.; Mendling, J.

    2015-01-01

    The Evolutionary Tree Miner (ETM) is a genetic process discovery algorithm that enables the user to guide the discovery process based on preferences with respect to four process model quality dimensions: replay fitness, precision, generalization and simplicity. Traditionally, the ETM algorithm uses

  6. Composing Models of Geographic Physical Processes

    Science.gov (United States)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.
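As a toy illustration of a geographic physical process written as a partial difference equation (the grid, quantity, and coefficient below are illustrative, not from the paper): diffusion of a quantity over grid cells, with the update rule stated directly in difference form.

```python
# A 1-D diffusion process as a partial difference equation on grid cells:
#   h[i](t+1) = h[i] + k * (h[i-1] - 2*h[i] + h[i+1])
# with reflecting (no-flux) boundaries, so the total quantity is conserved.

def diffuse(h, k=0.25, steps=50):
    """Apply the explicit diffusion difference equation `steps` times."""
    h = list(h)
    for _ in range(steps):
        nxt = h[:]
        for i in range(len(h)):
            left = h[i - 1] if i > 0 else h[i]            # reflecting edge
            right = h[i + 1] if i < len(h) - 1 else h[i]  # reflecting edge
            nxt[i] = h[i] + k * (left - 2 * h[i] + right)
        h = nxt
    return h

initial = [0.0, 0.0, 10.0, 0.0, 0.0]   # a single pile in the middle cell
final = diffuse(initial)
print(final)   # the pile spreads out while the total stays constant
```

The explicit scheme is stable for k <= 0.5; conservation of the total follows because every cell-to-cell exchange appears with opposite signs in the two cells.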

  7. An Improved Surface Simplification Method for Facial Expression Animation Based on Homogeneous Coordinate Transformation Matrix and Maximum Shape Operator

    Directory of Open Access Journals (Sweden)

    Juin-Ling Tseng

    2016-01-01

    Full Text Available Facial animation is one of the most popular 3D animation topics researched in recent years. However, using facial animation requires storing a 3D facial animation model. This model requires many triangles to accurately describe and demonstrate facial expression animation, because the face often presents a number of different expressions. Consequently, the costs associated with facial animation have increased rapidly. In an effort to reduce storage costs, researchers have sought to simplify 3D animation models using techniques such as Deformation Sensitive Decimation and Feature Edge Quadric. Prior studies have examined the problems of establishing a homogeneous local coordinate system across different expression models and of retaining the characteristic features of the simplified models. This paper proposes a method that applies a Homogeneous Coordinate Transformation Matrix to solve the problem of homogeneity of the local coordinate system and a Maximum Shape Operator to detect shape changes in facial animation so as to properly preserve the features of facial expressions. Further, root mean square error and perceived quality error are used to compare the errors generated by different simplification methods in experiments. Experimental results show that, compared with Deformation Sensitive Decimation and Feature Edge Quadric, our method can not only reduce the errors caused by simplification of facial animation, but also retain more facial features.
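The plane-quadric machinery that underlies quadric-based simplification methods such as Feature Edge Quadric can be sketched as follows (a generic quadric error metric, not the paper's extended operator): each face contributes the quadric of its supporting plane, and the cost of placing a vertex at position v is v^T Q v, the sum of squared distances to the contributing planes.

```python
# Generic quadric error metric: K = p p^T for a plane p = (a, b, c, d),
# accumulated per vertex over its adjacent faces; moving the vertex to v
# costs v^T Q v with v in homogeneous coordinates (x, y, z, 1).

def plane_quadric(a, b, c, d):
    """4x4 quadric of the plane ax + by + cz + d = 0 (unit normal assumed)."""
    p = (a, b, c, d)
    return [[p[i] * p[j] for j in range(4)] for i in range(4)]

def add_quadrics(q1, q2):
    return [[q1[i][j] + q2[i][j] for j in range(4)] for i in range(4)]

def error(q, v):
    """Evaluate v^T Q v with v extended to (x, y, z, 1)."""
    h = (v[0], v[1], v[2], 1.0)
    return sum(h[i] * q[i][j] * h[j] for i in range(4) for j in range(4))

# A vertex shared by two faces lying in the plane z = 0 (normal (0,0,1), d = 0)
q = add_quadrics(plane_quadric(0, 0, 1, 0), plane_quadric(0, 0, 1, 0))
print(error(q, (2.0, 3.0, 0.0)))   # 0.0: staying in the plane costs nothing
print(error(q, (2.0, 3.0, 1.0)))   # 2.0: unit offset, squared distance per plane
```

A simplification pass repeatedly collapses the edge whose optimal collapse position has the lowest such cost.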

  8. MORTALITY MODELING WITH LEVY PROCESSES

    Directory of Open Access Journals (Sweden)

    M. Serhat Yucel, FRM

    2012-07-01

    Full Text Available Mortality and longevity risk is usually one of the main risk components in the economic capital models of insurance companies. Above all, future mortality expectations are an important input in the modeling and pricing of long-term products. Deviations from expectations can even lead an insurance company to default if sufficient reserves and capital are not held. Thus, modeling mortality time series accurately is a vital concern for the insurance industry. The aim of this study is to perform distributional and spectral testing on mortality data and to carry out discrete and continuous time modeling. We believe the results and the techniques used in this study will provide a basis for a Value at Risk formula in the case of mortality.
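A minimal sketch of continuous-time modeling with a Levy process (a generic jump-diffusion with illustrative parameters, not the paper's fitted model): the terminal value combines a Gaussian drift-diffusion part with compound-Poisson jumps, a common way to capture mortality shocks.

```python
# Jump-diffusion Levy process X_t = mu*t + sigma*W_t + compound Poisson,
# sampled exactly at time T.  E[X_T] = (mu + lam * jump_mean) * T.
# All parameters below are illustrative.

import random

def terminal_value(mu, sigma, lam, jump_mean, jump_sd, T, rng):
    """Sample X_T: Gaussian part plus Poisson(lam*T) many Gaussian jumps."""
    x = rng.gauss(mu * T, sigma * T ** 0.5)
    t = rng.expovariate(lam)           # first jump arrival time
    while t < T:
        x += rng.gauss(jump_mean, jump_sd)
        t += rng.expovariate(lam)
    return x

rng = random.Random(7)
mu, sigma, lam, jm, jsd, T = 0.05, 0.1, 2.0, -0.02, 0.01, 1.0
sample = [terminal_value(mu, sigma, lam, jm, jsd, T, rng) for _ in range(20000)]
mean = sum(sample) / len(sample)
print(mean)   # close to E[X_T] = (mu + lam * jump_mean) * T = 0.01
```

The heavy left tail produced by the negative jumps is exactly what a mortality Value at Risk calculation would quantify.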

  9. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  10. Kopernik : modeling business processes for digital customers

    OpenAIRE

    Estañol Lamarca, Montserrat; Castro, Manuel; Díaz-Montenegro, Sylvia; Teniente López, Ernest

    2016-01-01

    This paper presents the Kopernik methodology for modeling business processes for digital customers. These processes require a high degree of flexibility in the execution of their tasks or actions. We achieve this by using the artifact-centric approach to process modeling and the use of condition-action rules. The processes modeled following Kopernik can then be implemented in an existing commercial tool, Balandra.

  11. The triconnected abstraction of process models

    OpenAIRE

    Polyvyanyy, Artem; Smirnov, Sergey; Weske, Mathias

    2009-01-01

    Contents: Artem Polyvyanyy, Sergey Smirnov, and Mathias Weske The Triconnected Abstraction of Process Models 1 Introduction 2 Business Process Model Abstraction 3 Preliminaries 4 Triconnected Decomposition 4.1 Basic Approach for Process Component Discovery 4.2 SPQR-Tree Decomposition 4.3 SPQR-Tree Fragments in the Context of Process Models 5 Triconnected Abstraction 5.1 Abstraction Rules 5.2 Abstraction Algorithm 6 Related Work and Conclusions

  12. Modelling of Batch Process Operations

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli; Cameron, Ian; Gani, Rafiqul

    2011-01-01

    Here a batch cooling crystalliser is modelled and simulated as is a batch distillation system. In the batch crystalliser four operational modes of the crystalliser are considered, namely: initial cooling, nucleation, crystal growth and product removal. A model generation procedure is shown that s...

  13. Mathematical Modeling: A Structured Process

    Science.gov (United States)

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  14. Birth/death process model

    Science.gov (United States)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.

  15. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    Science.gov (United States)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

  16. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubininа

    2015-06-01

    Full Text Available The essence of process-oriented enterprise management is examined in the article. The content and types of the relevant information technology are analyzed, given the complexity and differentiation of existing methods as well as the specificity of the language and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and the modern traditional modeling techniques that have received practical application in visualizing retailers' activity are studied. The theoretical analysis of the modeling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes, owing to its integrated systemology capabilities. A visualized simulation model of the business process "sales" "as is" for retailers was designed using a combination of UFO-elements, with the aim of further practical formalization and optimization of the given business process.

  17. A Comparative of business process modelling techniques

    Science.gov (United States)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modeling techniques. This article presents research on the differences among business process modeling techniques. For each technique, the definition and the structure are explained. The paper presents a comparative analysis of some popular business process modeling techniques. The comparative framework is based on two criteria: notation and how each technique works when implemented at Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The final conclusion recommends business process modeling techniques that are easy to use and serves as a basis for evaluating further modeling techniques.

  18. Information-Processing Models and Curriculum Design

    Science.gov (United States)

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  19. How visual cognition influences process model comprehension

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2017-01-01

    Process analysts and other professionals extensively use process models to analyze business processes and identify performance improvement opportunities. Therefore, it is important that such models can be easily and properly understood. Previous research has mainly focused on two types of factors

  20. Social software for business process modeling

    NARCIS (Netherlands)

    Koschmider, A.; Song, M.S.; Reijers, H.A.

    2010-01-01

    Formal models of business processes are used for a variety of purposes. But where the elicitation of the characteristics of a business process usually takes place in a collaborative fashion, the building of the final, formal process model is done mostly by a single person. This article presents the

  1. Measuring similarity between business process models

    NARCIS (Netherlands)

    Dongen, van B.F.; Dijkman, R.M.; Mendling, J.

    2007-01-01

    Quality aspects become increasingly important when business process modeling is used in a large-scale enterprise setting. In order to facilitate a storage without redundancy and an efficient retrieval of relevant process models in model databases it is required to develop a theoretical understanding

  2. Steady-State Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    illustrate the “equation oriented” approach as well as the “sequential modular” approach to solving complex flowsheets for steady state applications. The applications include the Williams-Otto plant, the hydrodealkylation (HDA) of toluene, conversion of ethylene to ethanol and a bio-ethanol process....

  3. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling....... the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension...

  4. Styles in business process modeling: an exploration and a model

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Fahland, D.; Weidlich, M.; Zugal, S.; Weber, B.; Reijers, H.A.; Mendling, J.

    2015-01-01

    Business process models are an important means to design, analyze, implement, and control business processes. As with every type of conceptual model, a business process model has to meet certain syntactic, semantic, and pragmatic quality requirements to be of value. For many years, such quality

  5. Process generalization in conceptual models

    NARCIS (Netherlands)

    Wieringa, Roelf J.

    In conceptual modeling, the universe of discourse (UoD) is divided into classes which have a taxonomic structure. The classes are usually defined in terms of attributes (all objects in a class share attribute names) and possibly of events. For enmple, the class of employees is the set of objects to

  6. Numerical modelling of reflood processes

    International Nuclear Information System (INIS)

    Glynn, D.R.; Rhodes, N.; Tatchell, D.G.

    1983-01-01

    The use of a detailed computer model to investigate the effects of grid size and the choice of wall-to-fluid heat-transfer correlations on the predictions obtained for reflooding of a vertical heated channel is described. The model employs equations for the momentum and enthalpy of vapour and liquid and hence accounts for both thermal non-equilibrium and slip between the phases. Empirical correlations are used to calculate interphase and wall-to-fluid friction and heat-transfer as functions of flow regime and local conditions. The empirical formulae have remained fixed with the exception of the wall-to-fluid heat-transfer correlations. These have been varied according to the practices adopted in other computer codes used to model reflood, namely REFLUX, RELAP and TRAC. Calculations have been performed to predict the CSNI standard problem number 7, and the results are compared with experiment. It is shown that the results are substantially grid-independent, and that the choice of correlation has a significant influence on the general flow behaviour, the rate of quenching and on the maximum cladding temperature predicted by the model. It is concluded that good predictions of reflooding rates can be obtained with particular correlation sets. (author)

  7. Branching process models of cancer

    CERN Document Server

    Durrett, Richard

    2015-01-01

    This volume develops results on continuous time branching processes and applies them to study rate of tumor growth, extending classic work on the Luria-Delbruck distribution. As a consequence, the authors calculate the probability that mutations that confer resistance to treatment are present at detection and quantify the extent of tumor heterogeneity. As applications, the authors evaluate ovarian cancer screening strategies and give rigorous proofs for results of Heano and Michor concerning tumor metastasis. These notes should be accessible to students who are familiar with Poisson processes and continuous time. Richard Durrett is mathematics professor at Duke University, USA. He is the author of 8 books, over 200 journal articles, and has supervised more than 40 Ph.D. students. Most of his current research concerns the applications of probability to biology: ecology, genetics, and most recently cancer.
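A minimal sketch of the continuous-time branching processes studied here (a binary birth-death process with illustrative rates, started from a single cell): since only the embedded jump chain matters for survival versus extinction, holding times are skipped.

```python
# Birth-death branching process started from one cell: each cell divides at
# rate b and dies at rate d.  For b > d the classical survival probability
# is 1 - d/b; we estimate it by simulating the embedded jump chain, where
# each event is a birth with probability b/(b+d) and a death otherwise.

import random

def survives(b, d, cap, rng):
    """Reaching `cap` cells is treated as long-term survival (the chance of
    extinction from that size, (d/b)**cap, is negligible)."""
    n = 1
    while 0 < n < cap:
        if rng.random() < b / (b + d):
            n += 1
        else:
            n -= 1
    return n >= cap

rng = random.Random(42)
b, d = 1.0, 0.5
trials = 5000
p_hat = sum(survives(b, d, cap=200, rng=rng) for _ in range(trials)) / trials
print(p_hat)   # close to the theoretical survival probability 1 - d/b = 0.5
```

The same simulation scheme, with mutation events added, underlies Luria-Delbruck-style calculations of pre-existing resistance.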

  8. MODELING OF PROCESSES OF OVERCOMING CONTRADICTIONS OF THE STATE AND ECONOMIC OPERATORS FOR THE SECURITY AND FACILITATION OF CUSTOMS PROCEDURES

    Directory of Open Access Journals (Sweden)

    Berezhnyuk Ivan

    2018-03-01

    Full Text Available Introduction. The issue of simultaneously ensuring the economic security of the state and simplifying customs procedures is highly topical today. The study stresses the importance of creating a «safe» business environment from the point of view of the customs sphere, based on «security», «justice» and «stability». Purpose. To develop methodical recommendations for modeling the processes of overcoming contradictions between the state and subjects of foreign economic activity in the field of security and simplification of customs procedures. Results. The research indicates that the purpose of revenue and fee bodies is to create favorable conditions for the development of foreign economic activity, to ensure the safety of society, and to protect the customs interests of Ukraine. When the SFS performs its customs duties, the tasks assigned to it, aimed at ensuring the correct application of, strict observance of, and prevention of non-compliance with the requirements of Ukrainian legislation on state customs issues, may present risks that are inherently contradictory and conflict with each other in the vector of their action: namely, the probability of non-compliance by the subjects of foreign trade with the norms of customs legislation, versus the creation of significant bureaucratic barriers to economic operators. There is a peculiar conflict of interests between the state and the subjects of foreign economic activity. In accordance with the recommendations of the WCO, the main direction for creating a favorable business environment is the further simplification of customs procedures for subjects with a high degree of trust, fighting corruption, and facilitating the movement of goods, vehicles and people in general. Conclusions. Thus, the scheme of «relations» between the state and the subjects of foreign economic activity can be modeled by the means of game theory, which is

  9. Plasma Processing of Model Residential Solid Waste

    Science.gov (United States)

    Messerle, V. E.; Mossé, A. L.; Nikonchuk, A. N.; Ustimenko, A. B.; Baimuldin, R. V.

    2017-09-01

    The authors have tested the technology of processing of model residential solid waste. They have developed and created a pilot plasma unit based on a plasma chamber incinerator. The waste processing technology has been tested and prepared for commercialization.

  10. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.
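Token play on a Petri net, the formalism underlying the proposed model, can be sketched minimally as follows (untimed and with a hypothetical three-step software process, unlike the paper's timed nets): a transition is enabled when all its input places hold enough tokens, and firing it moves tokens from input to output places.

```python
# Minimal Petri-net semantics: a marking maps places to token counts, and
# a transition is a (pre, post) pair of place -> token-count dicts.

def enabled(marking, transition):
    pre, _ = transition
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, transition):
    """Return the new marking after firing (assumes the transition is enabled)."""
    pre, post = transition
    m = dict(marking)
    for p, n in pre.items():
        m[p] = m[p] - n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Tiny software-process net: specify -> code -> test (hypothetical example)
t_code = ({"spec_done": 1}, {"code_done": 1})
t_test = ({"code_done": 1}, {"tested": 1})

m = {"spec_done": 1}
assert enabled(m, t_code) and not enabled(m, t_test)
m = fire(m, t_code)
m = fire(m, t_test)
print(m)   # {'spec_done': 0, 'code_done': 0, 'tested': 1}
```

A timed variant, as used in the paper, would additionally attach firing delays to transitions and advance a simulation clock.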

  11. Discovering Process Reference Models from Process Variants Using Clustering Techniques

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    2008-01-01

    In today's dynamic business world, success of an enterprise increasingly depends on its ability to react to changes in a quick and flexible way. In response to this need, process-aware information systems (PAIS) emerged, which support the modeling, orchestration and monitoring of business processes

  12. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always tied to an experimental case. Empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failings of the simulation model, and it can also be used as a guide in the design of later experiments. Three steps can be clearly differentiated: sensitivity analysis, which can be performed with DSA (differential sensitivity analysis) and with MCSA (Monte-Carlo sensitivity analysis); the search for the optimal domains of the input parameters, for which a procedure based on Monte-Carlo methods and cluster techniques has been developed; and residual analysis, performed in both the time domain and the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings in Spain is presented, studying the behavior of building components in a Test Cell of LECE of CIEMAT. (Author) 17 refs
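The MCSA step can be sketched as follows (the toy building model and its coefficients are illustrative only): sample the inputs at random, run the model, and rank the inputs by the magnitude of their correlation with the output.

```python
# Monte-Carlo sensitivity analysis sketch: sample inputs, evaluate the
# model, and use Pearson correlation to rank input influence.

import random

def model(insulation, glazing):
    # hypothetical heating-demand model: insulation matters far more
    return 10.0 - 5.0 * insulation - 0.5 * glazing

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

rng = random.Random(1)
insulation = [rng.uniform(0.0, 1.0) for _ in range(2000)]
glazing = [rng.uniform(0.0, 1.0) for _ in range(2000)]
demand = [model(i, g) for i, g in zip(insulation, glazing)]
p_ins = pearson(insulation, demand)
p_glz = pearson(glazing, demand)
print(p_ins, p_glz)   # the dominant input shows a much larger |correlation|
```

In the full methodology, the same samples would then feed the clustering step that narrows the optimal domains of the influential parameters.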

  13. Towards Model Checking Stochastic Process Algebra

    NARCIS (Netherlands)

    Hermanns, H.; Grieskamp, W.; Santen, T.; Katoen, Joost P.; Stoddart, B.; Meyer-Kayser, J.; Siegle, M.

    2000-01-01

    Stochastic process algebras have been proven useful because they allow behaviour-oriented performance and reliability modelling. As opposed to traditional performance modelling techniques, the behaviour- oriented style supports composition and abstraction in a natural way. However, analysis of

  14. Parameter identification in multinomial processing tree models

    NARCIS (Netherlands)

    Schmittmann, V.D.; Dolan, C.V.; Raijmakers, M.E.J.; Batchelder, W.H.

    2010-01-01

    Multinomial processing tree models form a popular class of statistical models for categorical data that have applications in various areas of psychological research. As in all statistical models, establishing which parameters are identified is necessary for model inference and selection on the basis

  15. Simplification of Markov chains with infinite state space and the mathematical theory of random gene expression bursts

    Science.gov (United States)

    Jia, Chen

    2017-09-01

    Here we develop an effective approach to simplify two-time-scale Markov chains with infinite state spaces by removal of states with fast leaving rates, which improves the simplification method of finite Markov chains. We introduce the concept of fast transition paths and show that the effective transitions of the reduced chain can be represented as the superposition of the direct transitions and the indirect transitions via all the fast transition paths. Furthermore, we apply our simplification approach to the standard Markov model of single-cell stochastic gene expression and provide a mathematical theory of random gene expression bursts. We give the precise mathematical conditions for the bursting kinetics of both mRNAs and proteins. It turns out that random bursts exactly correspond to the fast transition paths of the Markov model. This helps us gain a better understanding of the physics behind the bursting kinetics as an emergent behavior from the fundamental multiscale biochemical reaction kinetics of stochastic gene expression.
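The removal of a fast state can be sketched for a finite continuous-time Markov chain (a generic Schur-complement elimination consistent with the superposition of direct and indirect transitions described above; the generator below is illustrative): the effective rate from i to j becomes the direct rate plus the rate of the indirect path through the eliminated fast state.

```python
# Eliminating a fast state f from a CTMC generator Q:
#   q'_ij = q_ij + q_if * q_fj / (total leaving rate of f)
# i.e. direct transitions plus indirect transitions through f.  The reduced
# chain preserves the stationary distribution of the slow states.

def eliminate_state(Q, f):
    """Schur-complement elimination of state f from generator matrix Q."""
    keep = [i for i in range(len(Q)) if i != f]
    r = -Q[f][f]                       # total leaving rate of the fast state
    return [[Q[i][j] + Q[i][f] * Q[f][j] / r for j in keep] for i in keep]

def stationary(Q, iters=20000):
    """Stationary distribution via uniformization and power iteration."""
    n = len(Q)
    lam = max(-Q[i][i] for i in range(n)) * 2.0
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * ((1.0 if i == j else 0.0) + Q[i][j] / lam)
                  for i in range(n)) for j in range(n)]
    return pi

# State 2 leaves at rate 200 -- much faster than states 0 and 1.
Q = [[-1.0, 0.2, 0.8],
     [0.5, -0.5, 0.0],
     [100.0, 100.0, -200.0]]

pi_full = stationary(Q)
pi_red = stationary(eliminate_state(Q, 2))
s = pi_full[0] + pi_full[1]            # renormalize over the slow states
print([pi_full[0] / s, pi_full[1] / s], pi_red)   # the two agree
```

The paper's contribution is extending this kind of reduction to infinite state spaces and interpreting the indirect paths as transcriptional bursts.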

  16. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation-oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase-interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints are considered, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are in general dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows the implementation of an efficient solving strategy and a concise error diagnosis. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it

  17. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications are shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipating out-of-specification (OOS) events, identifying critical process parameters, and taking risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
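The MC-simulation idea behind an IPM can be sketched as follows (the unit-operation models, parameter distributions, and specification limit are all hypothetical): propagate per-batch parameter variation through two stacked unit operations and estimate process capability as the fraction of simulated batches whose final CQA meets specification.

```python
# Monte Carlo over an integrated (stacked) process model: PP variation is
# sampled per batch, pushed through both unit operations, and the pass
# rate against the specification limit estimates process capability.

import random

def fermentation(ph, temp):
    # hypothetical: titer responds to both process parameters
    return 5.0 + 0.8 * (ph - 7.0) - 0.05 * (temp - 30.0) ** 2

def purification(titer, load):
    # hypothetical: yield degrades when the column is overloaded
    return titer * max(0.0, 0.95 - 0.01 * max(0.0, load - 20.0))

rng = random.Random(0)
LSL = 4.0                               # lower specification limit (illustrative)
n = 10000
ok = 0
for _ in range(n):
    ph = rng.gauss(7.0, 0.1)            # per-batch PP variation
    temp = rng.gauss(30.0, 1.0)
    load = rng.gauss(20.0, 2.0)
    cqa = purification(fermentation(ph, temp), load)
    ok += cqa >= LSL
print(ok / n)   # estimated probability of passing specification
```

Because the two unit operations are evaluated in sequence, interactions between their parameters (e.g., a low titer meeting an overloaded column) are captured automatically, which is the point of the integrated model.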

  18. Revising process models through inductive learning

    NARCIS (Netherlands)

    Maggi, F.M.; Corapi, D.; Russo, A.; Lupu, E.; Visaggio, G.; Muehlen, zur M.; Su, J.

    2011-01-01

    Discovering the Business Process (BP) model underpinning existing practices through analysis of event logs allows users to understand, analyse and modify the process. But, to be useful, the BP model must be kept in line with practice throughout its lifetime, as changes occur to the business

  19. Diagnosing differences between business process models

    NARCIS (Netherlands)

    Dijkman, R.M.; Dumas, M.; Reichert, M.; Shan, M.-C.

    2008-01-01

    This paper presents a technique to diagnose differences between business process models in the EPC notation. The diagnosis returns the exact position of a difference in the business process models and diagnoses the type of a difference, using a typology of differences developed in previous work.

  20. APROMORE : an advanced process model repository

    NARCIS (Netherlands)

    La Rosa, M.; Reijers, H.A.; Aalst, van der W.M.P.; Dijkman, R.M.; Mendling, J.; Dumas, M.; García-Bañuelos, L.

    2011-01-01

    Business process models are becoming available in large numbers due to their widespread use in many industrial applications such as enterprise and quality engineering projects. On the one hand, this raises a challenge as to their proper management: how can it be ensured that the proper process model

  1. Efficient querying of large process model repositories

    NARCIS (Netherlands)

    Jin, Tao; Wang, Jianmin; La Rosa, M.; Hofstede, ter A.H.M.; Wen, Lijie

    2013-01-01

    Recent years have seen an increased uptake of business process management technology in industries. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business

  2. THE STUDY OF SIMPLIFICATION AND EXPLICITATION TECHNIQUES IN KHALED HOSSEINI'S “A THOUSAND SPLENDID SUNS”

    Directory of Open Access Journals (Sweden)

    Reza Kafipour

    2016-12-01

    Full Text Available Teaching and learning strategies help facilitate teaching and learning. Among them, simplification and explicitation are strategies that help transfer meaning to the learners and readers of a translated text. The aim of this study was to investigate explicitation and simplification in the Persian translation of Khaled Hosseini's novel “A Thousand Splendid Suns”. The study also attempted to find out how frequently the translators used simplification and explicitation techniques in translating the novel. To do so, 359 of the 6000 sentences in the original text were selected by a systematic random sampling procedure. Then the percentage and total count of each strategy were calculated. The results showed that both translators used simplification and explicitation techniques significantly in their translations, whereas Saadvandian, the first translator, applied significantly more simplification techniques than Ghabrai, the second translator. However, no significant difference was found between the translators in the application of explicitation techniques. The study implies that these two translation strategies were fully familiar to the translators, as both used them significantly to make the translation more understandable to the readers.

  4. Distillation modeling for a uranium refining process

    Energy Technology Data Exchange (ETDEWEB)

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed “cathode processing”. The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.
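The two modelling viewpoints named in the abstract, an equilibrium expression and a molecular basis, can each be caricatured briefly. The functional forms are standard (an incremental Rayleigh-style batch update; the Hertz-Knudsen free-evaporation flux), but every numeric parameter below is illustrative, not taken from the cathode-processing work:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def hertz_knudsen_flux(p_sat, molar_mass, temp_k, alpha=1.0):
    """Molecular-basis view: free-evaporation flux into vacuum,
    mol/(m^2 s), from the Hertz-Knudsen equation."""
    return alpha * p_sat / math.sqrt(2.0 * math.pi * molar_mass * R * temp_k)

def rayleigh_residue(x0, n_steps, rel_volatility):
    """Equilibrium-expression view: crude incremental batch
    distillation; returns the liquid mole fraction of the volatile
    component after vaporising 1% of the charge per step."""
    x = x0
    for _ in range(n_steps):
        y = rel_volatility * x / (1.0 + (rel_volatility - 1.0) * x)  # VLE
        x = max(0.0, x - 0.01 * (y - x))  # vapour richer, liquid depletes
    return x
```

Comparing the two predictions against operating data, as the abstract proposes, would show which regime (equilibrium-limited or effusion-limited) better describes the still.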

  5. CSP-based chemical kinetics mechanisms simplification strategy for non-premixed combustion: An application to hybrid rocket propulsion

    KAUST Repository

    Ciottoli, Pietro P.

    2017-08-14

    A set of simplified chemical kinetics mechanisms for hybrid rocket applications using gaseous oxygen (GOX) and hydroxyl-terminated polybutadiene (HTPB) is proposed. The starting point is a 561-species, 2538-reactions, detailed chemical kinetics mechanism for hydrocarbon combustion. This mechanism is used for predictions of the oxidation of butadiene, the primary HTPB pyrolysis product. A Computational Singular Perturbation (CSP) based simplification strategy for non-premixed combustion is proposed. The simplification algorithm is fed with the steady-solutions of classical flamelet equations, these being representative of the non-premixed nature of the combustion processes characterizing a hybrid rocket combustion chamber. The adopted flamelet steady-state solutions are obtained employing pure butadiene and gaseous oxygen as fuel and oxidizer boundary conditions, respectively, for a range of imposed values of strain rate and background pressure. Three simplified chemical mechanisms, each comprising less than 20 species, are obtained for three different pressure values, 3, 17, and 36 bar, selected in accordance with an experimental test campaign of lab-scale hybrid rocket static firings. Finally, a comprehensive strategy is shown to provide simplified mechanisms capable of reproducing the main flame features in the whole pressure range considered.

  6. Improved model management with aggregated business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Mans, R.S.; Toorn, van der R.A.

    2009-01-01

    Contemporary organizations invest much effort in creating models of their business processes. This raises the issue of how to deal with large sets of process models that become available over time. This paper proposes an extension of Event-driven Process Chains, called the aggregate EPC (aEPC),

  7. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia

    2017-01-01

    Full Text Available Taking into account the highly diverse tasks which employees have to fulfil due to the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products which meet market expectations. Therefore, organizations encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of the aforementioned process, which allows enterprises to develop a specific methodology. In order to design the conceptual model, Business Process Modelling instruments were employed - the Adonis Community Edition Business Process Management Toolkit using the ADONIS BPMS Notation. The conceptual model was developed based on in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended considering the enterprise's needs and requirements when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling as they enable process analysis and evaluation (predefined/specific queries) and also model optimization (simulations).

  8. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

    Full Text Available Business process modelling and analysis is undoubtedly one of the most important parts of Applied (Business) Informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst's work is to create generally understandable, explicit and error-free models. If a process is properly described, the created models can be used as input into deep analysis and optimization. It can be assumed that properly designed business process models (similarly to correctly written algorithms) contain characteristics that can be mathematically described, and that it will be possible to create a tool that helps process analysts design proper models. As part of this review, a systematic literature review was conducted in order to find and analyse business process model design and quality measures. It was found that this area has already been the subject of research investigation in the past. Thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed scientific publications and existing quality measures do not reflect all important attributes of a business process model's clarity, simplicity and completeness. Therefore it would be appropriate to add new measures of quality.
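Measures of the kind this record surveys are typically simple statistics over the model graph. A minimal sketch of two measures that recur in this literature, model size and the coefficient of connectivity (CNC, arcs per node):

```python
def model_metrics(nodes, arcs):
    """Two classic process-model quality measures:
    size (node count) and coefficient of connectivity
    CNC = |arcs| / |nodes|."""
    return {"size": len(nodes), "cnc": len(arcs) / len(nodes)}

# Hypothetical five-node model with one XOR split.
example = model_metrics(
    ["start", "t1", "xor", "t2", "end"],
    [("start", "t1"), ("t1", "xor"), ("xor", "t2"),
     ("xor", "end"), ("t2", "end")],
)
```

Empirical studies in this literature have linked higher connectivity to higher error-proneness of models, which is why such structural measures are used as quality proxies.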

  9. Online Rule Generation Software Process Model

    OpenAIRE

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    For production systems like expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required to be executed for the development of a web-based software model for decision rule generation. Royce's final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of various steps of the modified wat...

  10. Evaluation of Models of the Reading Process.

    Science.gov (United States)

    Balajthy, Ernest

    A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…

  11. Edgar Schein's Process versus Content Consultation Models.

    Science.gov (United States)

    Rockwood, Gary F.

    1993-01-01

    Describes Schein's three models of consultation based on assumptions inherent in different helping styles: purchase of expertise and doctor-patient models, which focus on content of organization problems; and process consultation model, which focuses on how organizational problems are solved. Notes that Schein has suggested that consultants begin…

  12. Fatigue crack growth spectrum simplification: Facilitation of on-board damage prognosis systems

    Science.gov (United States)

    Adler, Matthew Adam

    2009-12-01

    monitoring and management of aircraft. A spectrum reduction method was proposed and experimentally validated that reduces a variable-amplitude spectrum to a constant-amplitude equivalent. The reduction from a variable-amplitude (VA) spectrum to a constant-amplitude equivalent (CAE) was proposed as a two-part process. Preliminary spectrum reduction is first performed by eliminating those loading events shown to be too small to contribute significantly to fatigue crack growth; this is accomplished by rainflow counting. The next step is to calculate the appropriate equivalent maximum and minimum loads by means of a root-mean-square average. This reduced spectrum defines the CAE and replaces the original spectrum. The simplified model was experimentally shown to produce approximately the same fatigue crack growth as the original spectrum. Fatigue crack growth experiments for two dissimilar aircraft spectra across a wide range of stress-intensity levels validated the proposed spectrum reduction procedure. Irrespective of the initial K-level, the constant-amplitude equivalent spectra were always conservative in crack growth rate, by an average of 50% over the full range tested. This corresponds to a maximum 15% overestimation in the driving force Delta K. Given the other typical sources of scatter in fatigue crack growth, a consistently 50% conservative prediction of crack growth rate is very satisfactory, especially considering the reduction in cost gained by the simplification. We now have a seamless system that gives an acceptably good approximation of the damage occurring in the aircraft. This contribution is significant because, in a very simple way, it provides a path to bypass the current infrastructure and ground-support requirements. Decision-making is now much simpler. In managing an entire fleet, we now have a workable system whose strength is that there is no need for a massive, isolated computational center.
The fidelity of the model
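The two-part reduction described above can be sketched as follows. The gating rule below (drop reversals whose smaller adjacent excursion falls under a fraction of the overall range, then RMS-average the survivors) is a simplified stand-in for the full rainflow-based screening, not the validated procedure, and the gate fraction is an invented parameter:

```python
import math

def turning_points(series):
    """Collapse a load history to its sequence of peaks and valleys."""
    tp = [series[0]]
    for x in series[1:]:
        if len(tp) >= 2 and (tp[-2] < tp[-1] <= x or tp[-2] > tp[-1] >= x):
            tp[-1] = x          # still rising/falling: extend the excursion
        elif x != tp[-1]:
            tp.append(x)
    return tp

def constant_amplitude_equivalent(series, gate=0.1):
    """Reduce a variable-amplitude history to one (S_max, S_min) pair:
    reversals whose smaller adjacent excursion is below `gate` times
    the overall range are dropped (standing in for the rainflow
    screening step), and the surviving peaks and valleys are
    RMS-averaged. Assumes a history oscillating about zero."""
    tp = turning_points(series)
    overall = max(tp) - min(tp)
    peaks, valleys = [], []
    for i in range(1, len(tp) - 1):
        excursion = min(abs(tp[i] - tp[i - 1]), abs(tp[i] - tp[i + 1]))
        if excursion < gate * overall:
            continue            # negligible loading event: eliminate
        (peaks if tp[i] > tp[i - 1] else valleys).append(tp[i])
    rms = lambda v: math.sqrt(sum(x * x for x in v) / len(v)) if v else 0.0
    return rms(peaks), -rms(valleys)
```

On a history such as `[0, 5, -5, 4, -4, 0.2, 0, 6, -6, 0]` the small 0.2-range excursion is gated out before the RMS average, mirroring the elimination of negligible loading events.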

  13. THE EFFECT OF TAX SIMPLIFICATION ON TAXPAYERS’ COMPLIANCE BEHAVIOR: RELIGIOSITY AS MODERATING VARIABLE

    Directory of Open Access Journals (Sweden)

    Muslichah Muslichah

    2017-03-01

    Full Text Available Tax compliance is an important issue for nations around the world as governments search for revenue to meet public needs. The importance of tax simplification has long been recognised as a determinant of compliance behavior, and it has become an important issue in taxation research. The primary objective of this study was to investigate the effect of tax simplification and religiosity on compliance behavior. This study was conducted in Malang, East Java. Survey questionnaires were sent to 200 taxpayers, and only 122 responded. Consistent with prior research, this study suggests that the effect of religiosity on compliance behavior is positive and significant. Religiosity plays a moderating role in the relationship between tax simplification and compliance behavior. This study contributes to the compliance literature. The present study also provides practical significance, because the empirical results offer information about compliance behavior that can help government develop strategies toward increasing voluntary compliance.

  14. Distillation modeling for a uranium refining process

    International Nuclear Information System (INIS)

    Westphal, B.R.

    1996-01-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed “cathode processing”. The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.

  15. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...

  16. Modeling closed nuclear fuel cycles processes

    Energy Technology Data Exchange (ETDEWEB)

    Shmidt, O.V. [A.A. Bochvar All-Russian Scientific Research Institute for Inorganic Materials, Rogova, 5a street, Moscow, 123098 (Russian Federation); Makeeva, I.R. [Zababakhin All-Russian Scientific Research Institute of Technical Physics, Vasiliev street 13, Snezhinsk, Chelyabinsk region, 456770 (Russian Federation); Liventsov, S.N. [Tomsk Polytechnic University, Tomsk, Lenin Avenue, 30, 634050 (Russian Federation)

    2016-07-01

    Computer models of processes are necessary for determining optimal operating conditions for closed nuclear fuel cycle (NFC) processes. Computer models can be quickly updated in accordance with new data from experimental research. Three kinds of process simulation are necessary. First, the VIZART software package is a balance model development tool used for calculating the material flow in technological processes; VIZART takes into account equipment capacity, transport lines and storage volumes. Secondly, it is necessary to simulate the physico-chemical processes involved in the closure of the NFC. The third kind of simulation is the development of software that allows optimization, diagnostics and control of the processes, which implies real-time simulation of product flows in the whole plant or in separate lines of the plant. (A.C.)

  17. Fermentation process diagnosis using a mathematical model

    Energy Technology Data Exchange (ETDEWEB)

    Yerushalmi, L; Volesky, B; Votruba, J

    1988-09-01

    The intriguing physiology of a solvent-producing strain of Clostridium acetobutylicum led to the synthesis of a mathematical model of the acetone-butanol fermentation process. The model presented is capable of describing the process dynamics and the culture behavior during both a standard and a substandard acetone-butanol fermentation. In addition to the process kinetic parameters, the model includes the culture's physiological parameters, such as the cellular membrane permeability and the number of membrane sites for active transport of sugar. Computer simulation studies of the process under different culture conditions used the model and quantitatively pointed out the importance of selected culture parameters that characterize the cell membrane behaviour and play an important role in the control of solvent synthesis by the cell. The theoretical predictions of the new model were confirmed by experimental determination of the cellular membrane permeability.
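The flavour of such a fermentation model can be conveyed with a toy kinetic sketch. The Monod-with-product-inhibition rate law and every constant below are generic placeholders, not the published model with its membrane-permeability parameters:

```python
def simulate_fermentation(hours=40.0, dt=0.01):
    """Toy batch kinetics: Monod growth on sugar with linear product
    inhibition, integrated by explicit Euler. x = biomass, s = sugar,
    p = solvents (all g/L). Constants are invented placeholders."""
    mu_max, ks, yxs, ypx, p_max = 0.35, 2.0, 0.4, 0.8, 20.0
    x, s, p = 0.1, 60.0, 0.0
    for _ in range(int(hours / dt)):
        mu = mu_max * s / (ks + s) * max(0.0, 1.0 - p / p_max)
        dx = mu * x                       # biomass growth rate
        x += dx * dt
        s = max(0.0, s - dx * dt / yxs)   # sugar consumed per biomass
        p += ypx * dx * dt                # solvents produced per biomass
    return x, s, p
```

Even this caricature reproduces the qualitative behaviour a diagnostic model exploits: growth stalls either when sugar is exhausted or when accumulated solvent shuts the culture down, and which limit binds depends on the parameter values.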

  18. Integrating Tax Preparation with FAFSA Completion: Three Case Models

    Science.gov (United States)

    Daun-Barnett, Nathan; Mabry, Beth

    2012-01-01

    This research compares three different models implemented in four cities. The models integrated free tax-preparation services to assist low-income families with their completion of the Free Application for Federal Student Aid (FAFSA). There has been an increased focus on simplifying the FAFSA process. However, simplification is not the only…

  19. Generalizing on best practices in image processing: a model for promoting research integrity: Commentary on: Avoiding twisted pixels: ethical guidelines for the appropriate use and manipulation of scientific digital images.

    Science.gov (United States)

    Benos, Dale J; Vollmer, Sara H

    2010-12-01

    Modifying images for scientific publication is now quick and easy due to changes in technology. This has created a need for new image processing guidelines and attitudes, such as those offered to the research community by Doug Cromey (Cromey 2010). We suggest that related changes in technology have simplified the task of detecting misconduct for journal editors as well as researchers, and that this simplification has caused a shift in the responsibility for reporting misconduct. We also argue that the concept of best practices in image processing can serve as a general model for education in best practices in research.

  20. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
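The system-dynamics idea behind such models, feedback between perceived remaining work, schedule pressure, output, and rework, can be shown in miniature. This toy loop and all its coefficients only echo the approach; they are not the JPL SEPS model:

```python
def simulate_project(size=100.0, staff=5.0, months=36, rate=1.0):
    """Toy feedback loop: perceived remaining work raises schedule
    pressure; pressure raises monthly output but also the rework it
    generates. Returns (work done, rework generated, months used)."""
    done, rework, used = 0.0, 0.0, 0
    for _ in range(months):
        remaining = size + rework - done
        if remaining <= 0:
            break                           # project complete
        pressure = min(2.0, remaining / (size / 2.0))
        output = staff * rate * (0.8 + 0.2 * pressure)
        rework += 0.1 * pressure * output   # pressure breeds defects
        done += output
        used += 1
    return done, rework, used
```

Running the loop under different policies (staffing, pressure caps) is the kind of what-if tradeoff analysis the abstract describes.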

  1. Model of diffusers / permeators for hydrogen processing

    International Nuclear Information System (INIS)

    Jacobs, W. D.; Hang, T.

    2008-01-01

    Palladium-silver (Pd-Ag) diffusers are mainstays of hydrogen processing. Diffusers separate hydrogen from inert species such as nitrogen, argon or helium. The tubing becomes permeable to hydrogen when heated to more than 250 C and a differential pressure is created across the membrane. The hydrogen diffuses better at higher temperatures. Experimental or experiential results have been the basis for determining or predicting a diffuser's performance. However, the process can be mathematically modeled, and comparison to experimental or other operating data can be utilized to improve the fit of the model. A reliable model-based diffuser system design is the goal which will have impacts on tritium and hydrogen processing. A computer model has been developed to solve the differential equations for diffusion given the operating boundary conditions. The model was compared to operating data for a low pressure diffuser system. The modeling approach and the results are presented in this paper. (authors)
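The differential model referred to above rests on a standard flux law: a Richardson-type permeability with Sieverts' square-root pressure dependence. A sketch with illustrative constants (k0 and e_act are not fitted values from the diffuser system):

```python
import math

def permeation_rate(p_up, p_down, temp_k, area, thickness,
                    k0=1.0e-7, e_act=12000.0):
    """Hydrogen throughput (mol/s) of a Pd-Ag membrane from a
    Richardson-type law with Sieverts' square-root pressure terms:
    Q = k(T) * A / d * (sqrt(P_up) - sqrt(P_down)),
    with k(T) = k0 * exp(-e_act / (R * T))."""
    R = 8.314  # J/(mol K)
    k = k0 * math.exp(-e_act / (R * temp_k))
    return k * area / thickness * (math.sqrt(p_up) - math.sqrt(p_down))
```

The Arrhenius factor captures the observation in the abstract that hydrogen diffuses better at higher temperatures, and the flux vanishes when the differential pressure across the membrane is removed.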

  2. Modeling Business Processes in Public Administration

    Science.gov (United States)

    Repa, Vaclav

    During more than 10 years of its existence, business process modeling became a regular part of organization management practice. It is mostly regarded as a part of information system development, or even as a way to implement some supporting technology (for instance, a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself, because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is methodology, which postulates that information systems development provide business process management with exact methods and tools for modeling business processes. The methodology underlying the approach presented in this paper also has its roots in information systems development methodology.

  3. Mathematical model of seed germination process

    International Nuclear Information System (INIS)

    Gładyszewska, B.; Koper, R.; Kornarzyński, K.

    1999-01-01

    An analytical model of the seed germination process is described. The model, based on the proposed working hypothesis, leads by analogy to a law corresponding to the Verhulst-Pearl law known from the theory of population kinetics. The model was applied to describe the germination kinetics of tomato seeds of the Promyk field cultivar, biostimulated by laser treatment. Close agreement between experimental and model data was obtained.
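A law corresponding to the Verhulst-Pearl law means germination follows a logistic curve. A sketch with made-up parameters (the published work fits such a curve to counts of germinated seeds over time):

```python
import math

def germinated_fraction(t_days, capacity=0.95, rate=0.45, t50=4.0):
    """Logistic (Verhulst-Pearl) germination curve: fraction of seeds
    germinated by day t. capacity = final germination fraction,
    rate = steepness, t50 = day at half capacity. All parameter
    values here are invented for illustration."""
    return capacity / (1.0 + math.exp(-rate * (t_days - t50)))
```

Fitting `capacity`, `rate`, and `t50` to counts from treated and untreated seed lots is how an effect such as laser biostimulation would show up: as a shift in the fitted parameters.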

  4. Computer modelling for better diagnosis and therapy of patients by cardiac resynchronisation therapy

    NARCIS (Netherlands)

    Pluijmert, Marieke; Lumens, Joost; Potse, Mark; Delhaas, Tammo; Auricchio, Angelo; Prinzen, Frits W

    2015-01-01

    Mathematical or computer models have become increasingly popular in biomedical science. Although they are a simplification of reality, computer models are able to link a multitude of processes to each other. In the fields of cardiac physiology and cardiology, models can be used to describe the

  5. Ontological Model of Business Process Management Systems

    Science.gov (United States)

    Manoilov, G.; Deliiska, B.

    2008-10-01

    The activities which constitute business process management (BPM) can be grouped into five categories: design, modeling, execution, monitoring and optimization. Dedicated software packages for business process management systems (BPMS) are available on the market, but the efficiency of their exploitation depends on the ontological model used at development time and run time of the system. In this article, an ontological model of BPMS in the area of the software industry is investigated. The model building is preceded by a conceptualization of the domain and a taxonomy of BPMS development. On the basis of the taxonomy, a simple online thesaurus is created.

  6. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Full Text Available Background: Competitive intelligence (CI) provides actionable intelligence, which provides a competitive edge in enterprises. However, without a proper process, it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI with the aim of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature, and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, where the output of one phase is the input of the next phase. Conclusion: The CI process is a cycle of interrelated phases. The output of one phase is the input of the next phase. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  7. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  8. Modeling of Dielectric Heating within Lyophilization Process

    Directory of Open Access Journals (Sweden)

    Jan Kyncl

    2014-01-01

    Full Text Available A process of lyophilization of paper books is modeled. The process of drying is controlled by a dielectric heating system. From the physical viewpoint, the task represents a 2D coupled problem described by two partial differential equations for the electric and temperature fields. The material parameters are supposed to be temperature-dependent functions. The continuous mathematical model is solved numerically. The methodology is illustrated with some examples whose results are discussed.
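The coupled electro-thermal problem described above can be miniaturised to one dimension: an explicit finite-difference slab with a volumetric dielectric source whose loss factor depends on the local temperature. All coefficients below are invented for illustration, and the real task is a 2D coupled problem:

```python
def heat_slab(steps=2000, n=21, dt=0.05, dx=0.01):
    """Explicit 1D finite-difference sketch of dielectric heating:
    a slab with edges clamped at chamber temperature (20 C) and a
    volumetric source that grows linearly with local temperature,
    mimicking temperature-dependent material parameters."""
    alpha = 1.0e-5                      # thermal diffusivity, m^2/s
    temp = [20.0] * n                   # deg C
    for _ in range(steps):
        new = temp[:]
        for i in range(1, n - 1):
            lap = (temp[i - 1] - 2.0 * temp[i] + temp[i + 1]) / dx ** 2
            source = 0.05 * 0.5 * (1.0 + 0.01 * temp[i])  # T-dependent loss
            new[i] = temp[i] + dt * (alpha * lap + source)
        temp = new                      # edges never updated: clamped
    return temp
```

The explicit scheme is stable here because `alpha * dt / dx**2` is far below the 0.5 limit; the temperature-dependent source is the one-way caricature of the two-field coupling solved in the paper.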

  9. Cost Models for MMC Manufacturing Processes

    Science.gov (United States)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory, and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  10. Model visualization for evaluation of biocatalytic processes

    DEFF Research Database (Denmark)

    Law, HEM; Lewis, DJ; McRobbie, I

    2008-01-01

    Biocatalysis offers great potential as an additional, and in some cases as an alternative, synthetic tool for organic chemists, especially as a route to introduce chirality. However, the implementation of scalable biocatalytic processes nearly always requires the introduction of process and/or bi......,S-EDDS), a biodegradable chelant, and is characterised by the use of model visualization using "windows of operation"....

  11. Business process modeling using Petri nets

    NARCIS (Netherlands)

    Hee, van K.M.; Sidorova, N.; Werf, van der J.M.E.M.; Jensen, K.; Aalst, van der W.M.P.; Balbo, G.; Koutny, M.; Wolf, K.

    2013-01-01

    Business process modeling has become a standard activity in many organizations. We start by going back into the history of this activity to explain why it appeared and became so important for organizations seeking to achieve their business targets. We discuss the context in which business process

  12. Business Process Modeling Notation - An Overview

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

    Full Text Available BPMN is an industrial standard created to offer a common and user-friendly notation to all participants in a business process. The present paper aims to briefly present the main features of this notation, as well as an interpretation of some of the main patterns characterizing a business process modeled by means of workflows.

  13. Workflow Patterns for Business Process Modeling

    NARCIS (Netherlands)

    Thom, Lucineia Heloisa; Lochpe, Cirano; Reichert, M.U.

    Because of their reuse advantages, workflow patterns (e.g., control flow patterns, data patterns, resource patterns) are increasingly attracting the interest of both researchers and vendors. Frequently, business process or workflow models can be assembled out of a set of recurrent process fragments (or

  14. Physical and mathematical modelling of extrusion processes

    DEFF Research Database (Denmark)

    Arentoft, Mogens; Gronostajski, Z.; Niechajowics, A.

    2000-01-01

    The main objective of the work is to study the extrusion process using physical modelling and to compare the findings of the study with finite element predictions. The possibilities and advantages of the simultaneous application of both of these methods for the analysis of metal forming processes...

  15. Further Simplification of the Simple Erosion Narrowing Score With Item Response Theory Methodology.

    Science.gov (United States)

    Oude Voshaar, Martijn A H; Schenk, Olga; Ten Klooster, Peter M; Vonkeman, Harald E; Bernelot Moens, Hein J; Boers, Maarten; van de Laar, Mart A F J

    2016-08-01

    To further simplify the simple erosion narrowing score (SENS) by removing scored areas that contribute the least to its measurement precision according to analysis based on item response theory (IRT) and to compare the measurement performance of the simplified version to the original. Baseline and 18-month data of the Combinatietherapie Bij Reumatoide Artritis (COBRA) trial were modeled using longitudinal IRT methodology. Measurement precision was evaluated across different levels of structural damage. SENS was further simplified by omitting the least reliably scored areas. Discriminant validity of SENS and its simplification were studied by comparing their ability to differentiate between the COBRA and sulfasalazine arms. Responsiveness was studied by comparing standardized change scores between versions. SENS data showed good fit to the IRT model. Carpal and feet joints contributed the least statistical information to both erosion and joint space narrowing scores. Omitting the joints of the foot reduced measurement precision for the erosion score in cases with below-average levels of structural damage (relative efficiency compared with the original version ranged 35-59%). Omitting the carpal joints had minimal effect on precision (relative efficiency range 77-88%). Responsiveness of a simplified SENS without carpal joints closely approximated the original version (i.e., all Δ standardized change scores were ≤0.06). Discriminant validity was also similar between versions for both the erosion score (relative efficiency = 97%) and the SENS total score (relative efficiency = 84%). Our results show that the carpal joints may be omitted from the SENS without notable repercussion for its measurement performance. © 2016, American College of Rheumatology.
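
The responsiveness comparison in this abstract rests on standardized change scores (mean change divided by the standard deviation of change). A toy computation with synthetic data, not the COBRA trial data or the longitudinal IRT model, might look like:

```python
# Hypothetical sketch of comparing the responsiveness of a full score with
# a reduced score that omits some joint areas. All data below are synthetic
# per-patient totals (baseline, 18 months), for illustration only.
import statistics

def standardized_change(baseline, follow_up):
    """Mean change divided by the standard deviation of change."""
    changes = [f - b for b, f in zip(baseline, follow_up)]
    return statistics.mean(changes) / statistics.stdev(changes)

full_base   = [4, 7, 2, 9, 5, 3, 8, 6]
full_follow = [6, 9, 3, 12, 7, 4, 10, 8]
red_base    = [3, 6, 2, 8, 4, 2, 7, 5]    # e.g. carpal joints omitted
red_follow  = [5, 8, 3, 10, 6, 3, 9, 7]

delta = abs(standardized_change(full_base, full_follow)
            - standardized_change(red_base, red_follow))
print(round(delta, 3))
```

In the study itself, all such deltas between the original and simplified SENS were at most 0.06, i.e., the versions were nearly equally responsive.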

  16. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Represented in some appropriate representation language, the purpose of these models is to model process plants and plant automatics in a unified way, to allow verification and computer-aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report aims at investigating how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long-term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated by the potential of such a combination in safety analysis based on models comprising software, hardware, and operator. It is anticipated that the research results from this activity will benefit verification and validation (V&V) in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories (author)

  17. Multiphysics modelling of the spray forming process

    International Nuclear Information System (INIS)

    Mi, J.; Grant, P.S.; Fritsching, U.; Belkessam, O.; Garmendia, I.; Landaberea, A.

    2008-01-01

    An integrated, multiphysics numerical model has been developed through the joint efforts of the University of Oxford (UK), University of Bremen (Germany) and Inasmet (Spain) to simulate the spray forming process. The integrated model consisted of four sub-models: (1) an atomization model simulating the fragmentation of a continuous liquid metal stream into droplet spray during gas atomization; (2) a droplet spray model simulating the droplet spray mass and enthalpy evolution in the gas flow field prior to deposition; (3) a droplet deposition model simulating droplet deposition, splashing and re-deposition behavior and the resulting preform shape and heat flow; and (4) a porosity model simulating the porosity distribution inside a spray formed ring preform. The model has been validated against experiments of the spray forming of large diameter IN718 Ni superalloy rings. The modelled preform shape, surface temperature and final porosity distribution showed good agreement with experimental measurements
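
One small ingredient of sub-model (2), the thermal history of a single droplet in flight, can be sketched with a lumped-capacitance (Newtonian cooling) solution. The property values below are hypothetical order-of-magnitude choices, not parameters from the validated IN718 simulations:

```python
# Lumped-capacitance cooling of a single spherical droplet in the gas flow
# before deposition. A toy stand-in for one piece of the droplet spray
# sub-model; all property values are hypothetical.
import math

def droplet_temperature(t, T0=1600.0, T_gas=400.0, d=50e-6,
                        h=5000.0, rho=8000.0, cp=600.0):
    """Analytic lumped-capacitance temperature T(t) [K] for a sphere.

    d   : droplet diameter [m]     h  : heat-transfer coeff. [W/(m^2*K)]
    rho : density [kg/m^3]         cp : specific heat [J/(kg*K)]
    """
    tau = rho * cp * d / (6.0 * h)   # thermal time constant of the sphere
    return T_gas + (T0 - T_gas) * math.exp(-t / tau)

# Temperature after 10 ms of flight (tau is 8 ms with these values)
print(round(droplet_temperature(0.01), 1))  # -> 743.8
```

The real sub-model tracks mass and enthalpy distributions over the whole droplet size spectrum in the computed gas flow field; this closed form only shows the governing time constant.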

  18. Modelling long-term redox processes and oxygen scavenging in fractured crystalline rocks

    International Nuclear Information System (INIS)

    Sidborn, Magnus

    2007-10-01

    Advanced plans for the construction of a deep geological repository for highly radioactive wastes from nuclear power plants have evolved during the past decades in many countries, including Sweden. As part of the Swedish concept, the waste is to be encapsulated in canisters surrounded by low-permeability backfill material. The copper canisters will be deposited at around 500 metres depth in granitic rock, which acts as a natural barrier for the transport of radionuclides to the ground surface. These natural and engineered barriers are chosen and designed to ensure the safety of the repository over hundreds of thousands of years. One issue of interest for the safety assessment of such a repository is the redox evolution over long times. An oxidising environment would enhance the corrosion of the copper canisters and increase the mobility of any released radionuclides. In the first part of the present thesis, the ability of the host rock to ensure a reducing environment at repository depth over long times was studied. A model framework was developed with the aim of capturing all processes that are deemed to be important for the scavenging of intruding oxygen from the ground surface over long times. Simplifications allowing for analytical solutions were introduced for transparency, so that evaluation of results is straightforward and uncertain parameter values can easily be adjusted. More complex systems were solved numerically for cases where the analytical simplifications are not applicable, and to validate the simplifications underlying the analytical solutions. Results were presented for prevailing present-day conditions as well as for conditions deemed likely during the melting phase of a period of glaciation. It was shown that the hydraulic properties have a great influence on the oxygen intrusion length downstream along flow paths in the rock.
An important parameter that determines the extent of interaction between the dissolved oxygen and

  19. The (Mathematical) Modeling Process in Biosciences.

    Science.gov (United States)

    Torres, Nestor V; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model and optimization and system management derived from the analysis of the mathematical model. All along this work the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system are presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology.
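
The three stages described above can be walked through on a deliberately tiny example: a conceptual statement ("a population grows but saturates"), its mathematical formalization as the logistic equation dN/dt = rN(1 - N/K), and analysis by numerical integration. All parameter values are hypothetical:

```python
# Toy illustration of the three modeling stages: conceptual model ->
# mathematical formalization (logistic growth) -> analysis (explicit Euler
# integration). Parameter values are hypothetical.

def simulate_logistic(n0=10.0, r=0.5, k=1000.0, dt=0.01, steps=2000):
    """Integrate dN/dt = r*N*(1 - N/K) with explicit Euler steps."""
    n = n0
    for _ in range(steps):
        n += dt * r * n * (1.0 - n / k)
    return n

final = simulate_logistic()   # population after t = steps*dt = 20 time units
print(round(final, 1))
```

The analysis stage would then interrogate this model (steady states, sensitivity to r and K) before returning to the biology for validation.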

  20. The semantics of hybrid process models

    NARCIS (Netherlands)

    Slaats, T.; Schunselaar, D.M.M.; Maggi, F.M.; Reijers, H.A.; Debruyne, C.; Panetto, H.; Meersman, R.; Dillon, T.; Kuhn, E.; O'Sullivan, D.; Agostino Ardagna, C.

    2016-01-01

    In the area of business process modelling, declarative notations have been proposed as alternatives to notations that follow the dominant, imperative paradigm. Yet, the choice between an imperative or declarative style of modelling is not always easy to make. Instead, a mixture of these styles is

  1. Modeling biochemical transformation processes and information processing with Narrator

    Directory of Open Access Journals (Sweden)

    Palfreyman Niall M

    2007-03-01

    Full Text Available Abstract Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs, which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a

  2. Modeling biochemical transformation processes and information processing with Narrator.

    Science.gov (United States)

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is
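
One of the target formalisms mentioned, Gillespie's direct method, can be sketched for a single hypothetical reaction A -> B. This standalone toy is not Narrator code; it only illustrates the simulation scheme that graphical models can be mapped onto:

```python
# Gillespie's direct method for the single hypothetical reaction A -> B
# with stochastic rate constant c. A toy, not Narrator's implementation.
import math
import random

def gillespie_decay(n_a=100, c=1.0, t_end=10.0, seed=42):
    """Simulate A -> B; return (time of last event, remaining A) at t_end."""
    rng = random.Random(seed)
    t = 0.0
    while n_a > 0:
        a0 = c * n_a                         # total propensity
        tau = -math.log(rng.random()) / a0   # exponential waiting time
        if t + tau > t_end:
            break
        t += tau
        n_a -= 1                             # the single reaction fires
    return t, n_a

t, remaining = gillespie_decay()
print(remaining)
```

With several reaction channels, the method additionally draws which channel fires in proportion to its share of the total propensity.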

  3. Planar simplification and texturing of dense point cloud maps

    NARCIS (Netherlands)

    Ma, L.; Whelan, T.; Bondarau, Y.; With, de P.H.N.; McDonald, J.

    2013-01-01

    Dense RGB-D based SLAM techniques and high-fidelity LIDAR scanners are examples from an abundant set of systems capable of providing multi-million point datasets. These large datasets quickly become difficult to process and work with due to the sheer volume of data, which typically contains

  4. Incremental and batch planar simplification of dense point cloud maps

    NARCIS (Netherlands)

    Whelan, T.; Ma, L.; Bondarev, E.; With, de P.H.N.; McDonald, J.

    2015-01-01

    Dense RGB-D SLAM techniques and high-fidelity LIDAR scanners are examples from an abundant set of systems capable of providing multi-million point datasets. These datasets quickly become difficult to process due to the sheer volume of data, typically containing significant redundant information,

  5. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    ...secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. ... by ARL modelers. The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5. Python was selected because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool.

  6. New helical-shape magnetic pole design for Magnetic Lead Screw enabling structure simplification

    DEFF Research Database (Denmark)

    Lu, Kaiyuan; Xia, Yongming; Wu, Weimin

    2015-01-01

    Magnetic lead screw (MLS) is a new type of high performance linear actuator that is attractive for many potential applications. The main difficulty of the MLS technology lies in the manufacturing of its complicated helical-shape magnetic poles. Structure simplification is, therefore, quite...

  7. Utilizing 'hot words' in ParaConc to verify lexical simplification ...

    African Journals Online (AJOL)

    Lexical simplification strategies investigated are: using a superordinate or more general word, using a general word with extended meaning and using more familiar or common synonyms. The analysis gives the reader an idea about how some general words are used to translate technical language. It also displays that 'hot ...

  8. 76 FR 64250 - Reserve Requirements of Depository Institutions: Reserves Simplification and Private Sector...

    Science.gov (United States)

    2011-10-18

    ... Simplification and Private Sector Adjustment Factor AGENCY: Board of Governors of the Federal Reserve System... comment on several issues related to the methodology used for the Private Sector Adjustment Factor that is... Analyst (202) 452- 3674, Division of Monetary Affairs, or, for questions regarding the Private Sector...

  9. Perceptual Recovery from Consonant-Cluster Simplification in Korean Using Language-Specific Phonological Knowledge

    NARCIS (Netherlands)

    Cho, T.; McQueen, J.M.

    2011-01-01

    Two experiments examined whether perceptual recovery from Korean consonant-cluster simplification is based on language-specific phonological knowledge. In tri-consonantal C1C2C3 sequences such as /lkt/ and /lpt/ in Seoul Korean, either C1 or C2 can be completely deleted. Seoul Koreans monitored for

  10. Between-Word Simplification Patterns in the Continuous Speech of Children with Speech Sound Disorders

    Science.gov (United States)

    Klein, Harriet B.; Liu-Shea, May

    2009-01-01

    Purpose: This study was designed to identify and describe between-word simplification patterns in the continuous speech of children with speech sound disorders. It was hypothesized that word combinations would reveal phonological changes that were unobserved with single words, possibly accounting for discrepancies between the intelligibility of…

  11. Applying the Business Process and Practice Alignment Meta-model: Daily Practices and Process Modelling

    Directory of Open Access Journals (Sweden)

    Ventura Martins Paula

    2017-03-01

    Full Text Available Background: Business Process Modelling (BPM) is one of the most important phases of information system design. Business Process (BP) meta-models allow capturing informational and behavioural aspects of business processes. Unfortunately, standard BP meta-modelling approaches focus only on process description, providing different BP models. It is not possible to compare and identify related daily practices in order to improve BP models. This lack of information implies that further research in BP meta-models is needed to reflect the evolution/change in BPs. Considering this limitation, this paper introduces a new BP meta-model, the Business Process and Practice Alignment Meta-model (BPPAMeta-model). Our intention is to present a meta-model that addresses features related to the alignment between daily work practices and BP descriptions. Objectives: This paper presents a meta-model which integrates daily work information into coherent and sound process definitions. Methods/Approach: The methodology employed in the research follows a design-science approach. Results: The results of the case study relate to the application of the proposed meta-model to align the specification of a BP model with work-practice models. Conclusions: This meta-model can be used within the BPPAM methodology to specify or improve business process models based on work-practice descriptions.

  12. Thermochemical equilibrium modelling of a gasifying process

    International Nuclear Information System (INIS)

    Melgar, Andres; Perez, Juan F.; Laget, Hannes; Horillo, Alfonso

    2007-01-01

    This article discusses a mathematical model for the thermochemical processes in a downdraft biomass gasifier. The model combines the chemical equilibrium and the thermodynamic equilibrium of the global reaction, predicting the final composition of the producer gas as well as its reaction temperature. Once the composition of the producer gas is obtained, a range of parameters can be derived, such as the cold gas efficiency of the gasifier, the amount of dissociated water in the process and the heating value and engine fuel quality of the gas. The model has been validated experimentally. This work includes a parametric study of the influence of the gasifying relative fuel/air ratio and the moisture content of the biomass on the characteristics of the process and the producer gas composition. The model helps to predict the behaviour of different biomass types and is a useful tool for optimizing the design and operation of downdraft biomass gasifiers
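
Among the derived parameters listed, the cold gas efficiency is simply the chemical energy leaving in the producer gas divided by that entering with the biomass. A back-of-envelope sketch with hypothetical heating values and gas yield (not outputs of the equilibrium model):

```python
# Cold gas efficiency of a gasifier: chemical energy in the producer gas
# over chemical energy in the biomass feed. All numbers are hypothetical
# placeholders, not results of the thermochemical equilibrium model.

def cold_gas_efficiency(gas_yield_nm3_per_kg, gas_lhv_mj_per_nm3,
                        biomass_lhv_mj_per_kg):
    return (gas_yield_nm3_per_kg * gas_lhv_mj_per_nm3) / biomass_lhv_mj_per_kg

# e.g. 2.2 Nm^3 of producer gas per kg biomass at 5.0 MJ/Nm^3, wood at 18 MJ/kg
cge = cold_gas_efficiency(2.2, 5.0, 18.0)
print(round(cge, 3))  # -> 0.611
```

In the article, the gas composition (and hence its heating value) comes from the combined chemical and thermodynamic equilibrium solution; this snippet only shows the bookkeeping step downstream of it.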

  13. Process modeling study of the CIF incinerator

    International Nuclear Information System (INIS)

    Hang, T.

    1995-01-01

    The Savannah River Site (SRS) plans to begin operating the Consolidated Incineration Facility (CIF) in 1996. The CIF will treat liquid and solid low-level radioactive, mixed and RCRA hazardous wastes generated at SRS. In addition to experimental test programs, process modeling was applied to provide guidance in the areas of safety, compliance with environmental regulations, process improvement and optimization. A steady-state flowsheet model was used to calculate material/energy balances and to track key chemical constituents throughout the process units. Dynamic models were developed to predict the CIF transient characteristics in normal and abnormal operation scenarios. Predictions include the rotary kiln heat transfer, dynamic responses of the CIF to fluctuations in the solid waste feed or upsets in system equipment, performance of the control system, air in-leakage in the kiln, etc. This paper reviews the modeling study performed to assist in the deflagration risk assessment.

  14. Deterministic geologic processes and stochastic modeling

    International Nuclear Information System (INIS)

    Rautman, C.A.; Flint, A.L.

    1992-01-01

    This paper reports that recent outcrop sampling at Yucca Mountain, Nevada, has produced significant new information regarding the distribution of physical properties at the site of a potential high-level nuclear waste repository. Consideration of the spatial variability indicates that there are a number of widespread deterministic geologic features at the site that have important implications for numerical modeling of such performance aspects as groundwater flow and radionuclide transport. Because the geologic processes responsible for the formation of Yucca Mountain are relatively well understood and operate on a more-or-less regional scale, understanding of these processes can be used in modeling the physical properties and performance of the site. Information reflecting these deterministic geologic processes may be incorporated into the modeling program explicitly, using geostatistical concepts such as soft information, or implicitly, through the adoption of a particular approach to modeling.

  15. Modeling and simulation of heterogeneous catalytic processes

    CERN Document Server

    Dixon, Anthony

    2014-01-01

    Heterogeneous catalysis and mathematical modeling are essential components of the continuing search for better utilization of raw materials and energy, with reduced impact on the environment. Numerical modeling of chemical systems has progressed rapidly due to increases in computer power, and is used extensively for analysis, design and development of catalytic reactors and processes. This book presents reviews of the state of the art in modeling of heterogeneous catalytic reactors and processes: reviews by leading authorities in the respective areas; up-to-date reviews of the latest techniques in modeling of catalytic processes; a mix of US and European authors, as well as academic, industrial, and research-institute perspectives; and connections between computational and experimental methods in some of the chapters.

  16. Modeling Aspects of Activated Sludge Processes Part II: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    Energy Technology Data Exchange (ETDEWEB)

    AbdElHaleem, H S [Cairo Univ. - Civil Eng. Dept., Giza (Egypt)]; EI-Ahwany, A H [Cairo University - Faculty of Engineering - Chemical Engineering Department, Giza (Egypt)]; Ibrahim, H I [Helwan University - Faculty of Engineering - Biomedical Engineering Department, Helwan (Egypt)]; Ibrahim, G [Menofia University - Faculty of Engineering, Shebin El Kom - Basic Eng. Sc. Dept., Menofia (Egypt)]

    2004-07-01

    Mathematical process modeling and biokinetics of the activated sludge process were reviewed, considering different types of models. The task group models ASM1, ASM2, and ASM3, developed by Henze et al., were evaluated with respect to the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase; in this type of model, internal mass transfer inside the flocs is neglected, and hence the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model, which considers mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.
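
The homogeneous ("one-phase") model type discussed in this review can be illustrated with plain Monod kinetics, which ignores internal mass transfer in the flocs. Parameter values below are hypothetical:

```python
# One-phase (homogeneous) activated sludge toy model: Monod growth kinetics
# for substrate S and biomass X, with no internal floc mass-transfer
# resistance. Parameter values are hypothetical.

def monod_step(s, x, dt=0.01, mu_max=4.0, ks=10.0, y=0.5):
    """One Euler step of the substrate/biomass balances [mg/L], dt in days."""
    mu = mu_max * s / (ks + s)        # Monod specific growth rate [1/d]
    ds = -(mu / y) * x * dt           # substrate consumed (yield Y)
    dx = mu * x * dt                  # biomass grown
    return s + ds, x + dx

s, x = 200.0, 20.0                    # initial substrate and biomass
for _ in range(500):                  # 5 days of batch operation
    s, x = monod_step(s, x)
    s = max(s, 0.0)
print(round(s, 2), round(x, 2))
```

A heterogeneous model of the kind the review prefers would add diffusion of substrate into the floc, so the effective uptake rate (and the fitted kinetic parameters) would differ.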

  17. Modeling Aspects of Activated Sludge Processes Part II: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    International Nuclear Information System (INIS)

    AbdElHaleem, H.S.; EI-Ahwany, A. H.; Ibrahim, H.I.; Ibrahim, G.

    2004-01-01

    Mathematical process modeling and biokinetics of the activated sludge process were reviewed, considering different types of models. The task group models ASM1, ASM2, and ASM3, developed by Henze et al., were evaluated with respect to the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase; in this type of model, internal mass transfer inside the flocs is neglected, and hence the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model, which considers mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.

  18. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the development of the subject-features of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Offered are two steps for singling out a particular stage and the algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretical research, analyses of educational practices and for realistic prediction of pedagogical phenomena.

  19. Operation Windshield and the simplification of emergency management.

    Science.gov (United States)

    Andrews, Michael

    2016-01-01

    Large, complex, multi-stakeholder exercises are the culmination of years of gradual progression through a comprehensive training and exercise programme. Exercises intended to validate training, refine procedures and test processes initially tested in isolation are combined to ensure seamless response and coordination during actual crises. The challenges of integrating timely and accurate situational awareness from an array of sources, including response agencies, municipal departments, partner agencies and the public, on an ever-growing range of media platforms, increase information management complexity in emergencies. Considering that many municipal emergency operations centre roles are filled by staff whose day jobs have little to do with crisis management, there is a need to simplify emergency management and make it more intuitive. North Shore Emergency Management has accepted the challenge of making emergency management less onerous to occasional practitioners through a series of initiatives aimed to build competence and confidence by making processes easier to use as well as by introducing technical tools that can simplify processes and enhance efficiencies. These efforts culminated in the full-scale earthquake exercise, Operation Windshield, which preceded the 2015 Emergency Preparedness and Business Continuity Conference in Vancouver, British Columbia.

  20. Numerical modeling of atmospheric washout processes

    International Nuclear Information System (INIS)

    Bayer, D.; Beheng, K.D.; Herbert, F.

    1987-01-01

    For the washout of particles from the atmosphere by clouds and rain, one has to distinguish between processes that act in the first phase of cloud development, when condensation nuclei build up in saturated air (Nucleation Aerosol Scavenging, NAS), and those that act during the subsequent cloud development. In the second case, particles are collected by cloud droplets or by falling raindrops via collision (Collision Aerosol Scavenging, CAS). The physics of both processes is described. For the CAS process a numerical model is presented. The report contains a documentation of the mathematical equations and the computer programs (FORTRAN). (KW) [de
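
If the CAS process is condensed into a single first-order scavenging coefficient, the airborne particle concentration decays exponentially. A minimal sketch with a hypothetical coefficient (the actual model resolves the drop-particle collision kernels that such a coefficient summarizes):

```python
# First-order washout sketch: dC/dt = -lambda * C, so C(t) = C0*exp(-lambda*t).
# The scavenging coefficient here is a hypothetical bulk value; the report's
# CAS model computes it from collision processes between drops and particles.
import math

def washout(c0, scavenging_coeff_per_s, t_s):
    """Particle concentration remaining after t_s seconds of rain."""
    return c0 * math.exp(-scavenging_coeff_per_s * t_s)

# e.g. lambda = 1e-4 1/s during one hour of rain, starting from 100 units
c = washout(100.0, 1e-4, 3600.0)
print(round(c, 1))  # -> 69.8
```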

  1. Various Models for Reading Comprehension Process

    Directory of Open Access Journals (Sweden)

    Parastoo Babashamsi

    2013-11-01

    Full Text Available In recent years, reading has been viewed as a process, as a form of thinking, as a true experience, and as a tool subject. As a process, reading includes visual discrimination, independent recognition of words, rhythmic progression along a line of print, precision in the return sweep of the eyes, and adjustment of rate. Along the same lines, the present paper aims at considering the various models of the reading process. Moreover, the paper takes a look at various factors, such as schema and vocabulary knowledge, which affect the reading comprehension process.

  2. Mathematical modeling of biomass fuels formation process

    International Nuclear Information System (INIS)

    Gaska, Krzysztof; Wandrasz, Andrzej J.

    2008-01-01

    The increasing demand for thermal and electric energy in many branches of industry and municipal management is leading to a drastic depletion of natural resources (fossil fuels). Meanwhile, in numerous technical processes, a huge mass of waste is produced. A segregated and converted combustible fraction of the waste, with relatively high calorific value, may be used as a component of formed fuels. The utilization of formed fuel components from segregated groups of waste in associated processes of co-combustion with conventional fuels yields significant savings, resulting from partial replacement of fossil fuels, and reduces environmental pollution directly through the limitation of waste migration to the environment (soil, atmospheric air, surface and underground water). The realization of technological processes with the utilization of formed fuel in associated thermal systems should be qualified by technical criteria, which means that elementary processes, as well as factors of sustainable development from a global viewpoint, must not be disturbed. The utilization of post-process waste should be preceded by detailed technical, ecological and economic analyses. In order to optimize the mixing process of fuel components, a mathematical model of the forming process was created. The model is defined as a group of data structures which uniquely identify a real process, together with the conversion of these data by algorithms based on a linear programming problem. The paper also presents the optimization of parameters in the process of forming fuels using a modified simplex algorithm with polynomial run time. This model is a datum point in the numerical modeling of real processes, allowing a precise determination of the optimal elementary composition of formed fuel components, with assumed constraints and decision variables of the task.
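    The forming-process model reduces blend design to a linear program: maximize a linear objective (e.g. calorific value) over component mass fractions subject to linear quality constraints. The sketch below, with entirely hypothetical component data, solves such a program by coarse grid search rather than the paper's modified simplex algorithm, purely to illustrate the structure of the optimization:

```python
# Hypothetical data for three formed-fuel components (e.g. RDF,
# biomass, sludge); maximize blend calorific value subject to
# linear limits on ash and moisture, with mass fractions summing to 1.
CALORIFIC = [28.0, 16.5, 12.0]   # MJ/kg
ASH       = [18.0, 4.0, 30.0]    # %
MOISTURE  = [8.0, 25.0, 45.0]    # %

def blend(prop, x):
    return sum(p * xi for p, xi in zip(prop, x))

best = None
steps = 100                      # grid resolution of 1% mass fraction
for i in range(steps + 1):
    for j in range(steps + 1 - i):
        x = (i / steps, j / steps, (steps - i - j) / steps)
        if blend(ASH, x) <= 15.0 and blend(MOISTURE, x) <= 20.0:
            value = blend(CALORIFIC, x)
            if best is None or value > best[0]:
                best = (value, x)

print(best)   # optimal calorific value and component fractions
```

    A simplex solver reaches the same vertex of the feasible region directly; the grid search is only a didactic stand-in.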

  3. Karst Aquifer Recharge: A Case History of over Simplification from the Uley South Basin, South Australia

    Directory of Open Access Journals (Sweden)

    Nara Somaratne

    2015-02-01

    Full Text Available The article “Karst aquifer recharge: Comments on ‘Characteristics of Point Recharge in Karst Aquifers’, by Adrian D. Werner, 2014, Water 6, doi:10.3390/w6123727” misrepresents some parts of Somaratne [1]. The description of the Uley South Quaternary Limestone (QL) as unconsolidated or poorly consolidated aeolianite sediments with the presence of well-mixed groundwater in Uley South [2] appears unsubstantiated. Examination of 98 lithological descriptions with corresponding drillers’ logs shows only two wells containing bands of unconsolidated sediments. In the Uley South basin, about 70% of salinity profiles obtained by electrical conductivity (EC) logging from monitoring wells show stratification. The central and north-central areas of the basin receive leakage from the Tertiary Sand (TS) aquifer, thereby influencing QL groundwater characteristics such as chemistry, age and isotope composition. The presence of conduit pathways is evident in salinity profiles taken away from TS-water-affected areas. Aquifer parameters derived from pumping tests show strong heterogeneity, a typical characteristic of karst aquifers. Uley South QL aquifer recharge is derived from three sources: diffuse recharge, point recharge from sinkholes and continuous leakage of TS water. This limits the application of recharge estimation methods such as the conventional chloride mass balance (CMB), as the basic premise of the CMB is violated. The conventional CMB is not suitable for accounting for the chloride mass balance in groundwater systems displaying an extreme range of chloride concentrations and complex mixing [3]. Oversimplification of karst aquifer systems to suit the application of the conventional CMB or 1-D unsaturated modelling, as described in Werner [2], is not a suitable use of these recharge estimation methods.
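    The conventional CMB calculation at issue is a one-line ratio; the sketch below (with illustrative numbers, not Uley South data) shows both the estimate and why an extreme chloride range makes it ill-posed:

```python
def cmb_recharge(P_mm, cl_rain, cl_gw):
    # Conventional chloride mass balance: R = P * Cl_rain / Cl_gw.
    # Valid only if all recharge carries one chloride signature,
    # i.e. no point recharge or inter-aquifer leakage.
    return P_mm * cl_rain / cl_gw

print(cmb_recharge(550.0, 12.0, 150.0))   # recharge in mm/yr, one sample
# Stratified salinity breaks the premise: the same basin yields very
# different "recharge" depending on which groundwater sample is used.
for cl_gw in (60.0, 150.0, 600.0):
    print(cl_gw, cmb_recharge(550.0, 12.0, cl_gw))
```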

  4. Quantum mechanical Hamiltonian models of discrete processes

    International Nuclear Information System (INIS)

    Benioff, P.

    1981-01-01

    Here the results of other work on quantum mechanical Hamiltonian models of Turing machines are extended to include any discrete process T on a countably infinite set A. The models are constructed here by use of scattering phase shifts from successive scatterers to turn on successive step interactions. Also a locality requirement is imposed. The construction is done by first associating with each process T a model quantum system M with associated Hilbert space H_M and step operator U_T. Since U_T is not unitary in general, M, H_M, and U_T are extended into a (continuous time) Hamiltonian model on a larger space which satisfies the locality requirement. The construction is compared with the minimal unitary dilation of U_T. It is seen that the model constructed here is larger than the minimal one. However, the minimal one does not satisfy the locality requirement.

  5. Modeling nonhomogeneous Markov processes via time transformation.

    Science.gov (United States)

    Hubbard, R A; Inoue, L Y T; Fann, J R

    2008-09-01

    Longitudinal studies are a powerful tool for characterizing the course of chronic disease. These studies are usually carried out with subjects observed at periodic visits giving rise to panel data. Under this observation scheme the exact times of disease state transitions and sequence of disease states visited are unknown and Markov process models are often used to describe disease progression. Most applications of Markov process models rely on the assumption of time homogeneity, that is, that the transition rates are constant over time. This assumption is not satisfied when transition rates depend on time from the process origin. However, limited statistical tools are available for dealing with nonhomogeneity. We propose models in which the time scale of a nonhomogeneous Markov process is transformed to an operational time scale on which the process is homogeneous. We develop a method for jointly estimating the time transformation and the transition intensity matrix for the time transformed homogeneous process. We assess maximum likelihood estimation using the Fisher scoring algorithm via simulation studies and compare performance of our method to homogeneous and piecewise homogeneous models. We apply our methodology to a study of delirium progression in a cohort of stem cell transplantation recipients and show that our method identifies temporal trends in delirium incidence and recovery.
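    The core idea — transform calendar time to an operational time scale on which the process is homogeneous — can be sketched for a single transition intensity. Here the transform h(t) = t² (so the intensity grows linearly in calendar time) and the baseline rate are arbitrary illustrative choices, not the article's fitted model:

```python
import math
import random

random.seed(1)

q0 = 1.0                 # transition rate on the operational time scale
h_inv = math.sqrt        # inverse of the time transform h(t) = t**2

def transition_times(n_events):
    # Waiting times are exponential (homogeneous) in operational time
    # u = h(t); mapping each event back through h_inv yields the
    # nonhomogeneous calendar-time process with q(t) = q0 * h'(t).
    u, times = 0.0, []
    for _ in range(n_events):
        u += random.expovariate(q0)
        times.append(h_inv(u))
    return times

times = transition_times(5)
print(times)   # transitions cluster as the intensity 2*q0*t increases
```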

  6. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle, from the delivery of requirements to the start of acceptance testing, are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  7. Causally nonseparable processes admitting a causal model

    International Nuclear Information System (INIS)

    Feix, Adrien; Araújo, Mateus; Brukner, Caslav

    2016-01-01

    A recent framework of quantum theory with no global causal order predicts the existence of ‘causally nonseparable’ processes. Some of these processes produce correlations incompatible with any causal order (they violate so-called ‘causal inequalities’, analogous to Bell inequalities) while others do not (they admit a ‘causal model’, analogous to a local model). Here we show for the first time that bipartite causally nonseparable processes with a causal model exist, and give evidence that they have no clear physical interpretation. We also provide an algorithm to generate processes of this kind and show that they have nonzero measure in the set of all processes. We demonstrate the existence of processes which stop violating causal inequalities but are still causally nonseparable when mixed with a certain amount of ‘white noise’. This is reminiscent of the behavior of Werner states in the context of entanglement and nonlocality. Finally, we provide numerical evidence for the existence of causally nonseparable processes which have a causal model even when extended with an entangled state shared among the parties. (paper)

  8. Stochastic differential equation model to Prendiville processes

    Energy Technology Data Exchange (ETDEWEB)

    Granita, E-mail: granitafc@gmail.com [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); Bahar, Arifah [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); UTM Center for Industrial & Applied Mathematics (UTM-CIAM) (Malaysia)

    2015-10-22

    The Prendiville process is a variation of the logistic model which assumes a linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in a finite interval. The continuous time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work started with the forward Kolmogorov equation of the continuous time Markov chain of the Prendiville process. This was then formulated in the form of a central-difference approximation. The approximation was then used in the Fokker-Planck equation to relate it to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance function of the Prendiville process could be easily found from the explicit solution.
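    A quick way to see such an SDE in action is an Euler–Maruyama simulation of the diffusion approximation dX = (λ(N−X) − μX)dt + sqrt(λ(N−X) + μX)dW, whose drift encodes the linearly decreasing growth rate. The parameter values and the clipping to the finite interval are illustrative choices, not taken from the paper:

```python
import math
import random

random.seed(42)

N, lam, mu = 100, 0.8, 0.2       # finite state range and birth/death rates
dt, T = 0.01, 20.0

def drift(x):
    return lam * (N - x) - mu * x          # linearly decreasing growth

def diffusion(x):
    return math.sqrt(max(lam * (N - x) + mu * x, 0.0))

x = 10.0
for _ in range(int(T / dt)):
    x += drift(x) * dt + diffusion(x) * math.sqrt(dt) * random.gauss(0.0, 1.0)
    x = min(max(x, 0.0), float(N))         # stay in the finite interval

print(x)   # fluctuates near the equilibrium N*lam/(lam + mu) = 80
```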

  9. Stochastic differential equation model to Prendiville processes

    International Nuclear Information System (INIS)

    Granita; Bahar, Arifah

    2015-01-01

    The Prendiville process is a variation of the logistic model which assumes a linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in a finite interval. The continuous time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work started with the forward Kolmogorov equation of the continuous time Markov chain of the Prendiville process. This was then formulated in the form of a central-difference approximation. The approximation was then used in the Fokker-Planck equation to relate it to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance function of the Prendiville process could be easily found from the explicit solution.

  10. Modelling a uranium ore bioleaching process

    International Nuclear Information System (INIS)

    Chien, D.C.H.; Douglas, P.L.; Herman, D.H.; Marchbank, A.

    1990-01-01

    A dynamic simulation model for the bioleaching of uranium ore in a stope leaching process has been developed. The model incorporates design and operating conditions, reaction kinetics enhanced by Thiobacillus ferrooxidans present in the leaching solution, and transport properties. Model predictions agree well with experimental data, with an average deviation of about ±3%. The model is sensitive to small errors in the estimates of fragment size and ore grade. Because accurate estimates are difficult to obtain, a parameter estimation approach was developed to update the values of fragment size and ore grade using on-line plant information.

  11. Chain binomial models and binomial autoregressive processes.

    Science.gov (United States)

    Weiss, Christian H; Pollett, Philip K

    2012-09-01

    We establish a connection between a class of chain-binomial models of use in ecology and epidemiology and binomial autoregressive (AR) processes. New results are obtained for the latter, including expressions for the lag-conditional distribution and related quantities. We focus on two types of chain-binomial model, extinction-colonization and colonization-extinction models, and present two approaches to parameter estimation. The asymptotic distributions of the resulting estimators are studied, as well as their finite-sample performance, and we give an application to real data. A connection is made with standard AR models, which also has implications for parameter estimation. © 2011, The International Biometric Society.
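    The extinction–colonization chain-binomial model maps onto a binomial AR(1) process: occupied patches survive independently with probability α and empty patches are colonized with probability β. A minimal simulation with hypothetical parameter values:

```python
import random

random.seed(7)

def binom(m, p):
    # Draw from Binomial(m, p) using stdlib tools only.
    return sum(random.random() < p for _ in range(m))

def simulate(n, alpha, beta, steps):
    # X_t = survivors among the X_{t-1} occupied patches (prob alpha)
    #       + colonizations among the n - X_{t-1} empty ones (prob beta)
    x = binom(n, beta / (1.0 - alpha + beta))   # start near the marginal mean
    path = [x]
    for _ in range(steps):
        x = binom(x, alpha) + binom(n - x, beta)
        path.append(x)
    return path

path = simulate(n=20, alpha=0.7, beta=0.2, steps=1000)
print(min(path), max(path))
print(sum(path) / len(path))   # near n*beta/(1 - alpha + beta) = 8
```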

  12. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    Full Text Available This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before the buying behavior and the perceived performance after purchase. An experimental method is designed to measure expectation disconfirmation effects, and the collected data are used to estimate overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. This model has applications in business marketing for managing relationship satisfaction.

  13. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    Science.gov (United States)

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. Previous experience with the use of this notation in process modelling within Pathology, in Spain or in other countries, is not known to us. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Department of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology using BPMN is presented. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, in which management and improvements are more easily implemented by health professionals.

  14. Simplification of Visual Rendering in Simulated Prosthetic Vision Facilitates Navigation.

    Science.gov (United States)

    Vergnieux, Victor; Macé, Marc J-M; Jouffrais, Christophe

    2017-09-01

    Visual neuroprostheses are still limited, and simulated prosthetic vision (SPV) is used to evaluate potential and forthcoming functionality of these implants. SPV has been used to evaluate the minimum requirements on visual neuroprosthetic characteristics to restore various functions such as reading, object and face recognition, object grasping, etc. Some of these studies focused on obstacle avoidance, but only a few investigated orientation or navigation abilities with prosthetic vision. The resolution of current electrode arrays is not sufficient to allow navigation tasks without additional processing of the visual input. In this study, we simulated a low-resolution array (15 × 18 electrodes, similar to a forthcoming generation of arrays) and evaluated the navigation abilities restored when visual information was processed with various computer vision algorithms to enhance the visual rendering. Three main visual rendering strategies were compared to a control rendering in a wayfinding task within an unknown environment. The control rendering corresponded to a resizing of the original image onto the electrode array size, according to the average brightness of the pixels. In the first rendering strategy, the viewing distance was limited to 3, 6, or 9 m. In the second strategy, the rendering was based not on the brightness of the image pixels but on the distance between the user and the elements in the field of view. In the last rendering strategy, only the edges of the environment were displayed, similar to a wireframe rendering. All the tested renderings, except the 3 m limitation of the viewing distance, improved navigation performance and decreased cognitive load. Interestingly, the distance-based and wireframe renderings also improved the cognitive mapping of the unknown environment. These results show that low-resolution implants are usable for wayfinding if specific computer vision algorithms are used to select and display appropriate information.
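    The control rendering described above (resizing the image onto the electrode array according to average pixel brightness) is essentially average pooling. A self-contained sketch for the 15 × 18 array, using a synthetic grayscale image rather than real camera input:

```python
ROWS, COLS = 15, 18   # simulated electrode array size from the study

def to_electrode_array(img):
    # Average-pool an h x w grayscale image (rows of 0-255 values)
    # down to ROWS x COLS phosphene brightness levels.
    h, w = len(img), len(img[0])
    out = []
    for r in range(ROWS):
        row = []
        for c in range(COLS):
            ys = range(r * h // ROWS, (r + 1) * h // ROWS)
            xs = range(c * w // COLS, (c + 1) * w // COLS)
            cell = [img[y][x] for y in ys for x in xs]
            row.append(sum(cell) // len(cell))
        out.append(row)
    return out

# Synthetic 150 x 180 scene: dark left half, bright right half.
img = [[255 if x >= 90 else 0 for x in range(180)] for y in range(150)]
arr = to_electrode_array(img)
print(len(arr), len(arr[0]))   # 15 18
print(arr[0])                  # nine dark then nine bright electrodes
```

    The distance-based and wireframe renderings would replace the brightness input with depth values or detected edges before the same pooling step.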

  15. Determinantal point process models on the sphere

    DEFF Research Database (Denmark)

    Møller, Jesper; Nielsen, Morten; Porcu, Emilio

    We consider determinantal point processes (DPPs) on the d-dimensional unit sphere Sd. These are finite point processes exhibiting repulsiveness and with moment properties determined by a certain determinant whose entries are specified by a so-called kernel, which we assume is a complex covariance function defined on Sd × Sd. We review the appealing properties of such processes, including their specific moment properties, density expressions and simulation procedures. Particularly, we characterize and construct isotropic DPP models on Sd, where it becomes essential to specify the eigenvalues and eigenfunctions in a spectral representation for the kernel, and we figure out how repulsive isotropic DPPs can be. Moreover, we discuss the shortcomings of adapting existing models for isotropic covariance functions and consider strategies for developing new models, including a useful spectral approach.

  16. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  17. A process algebra model of QED

    International Nuclear Information System (INIS)

    Sulis, William

    2016-01-01

    The process algebra approach to quantum mechanics posits a finite, discrete, determinate ontology of primitive events which are generated by processes (in the sense of Whitehead). In this ontology, primitive events serve as elements of an emergent space-time and of emergent fundamental particles and fields. Each process generates a set of primitive elements, using only local information, causally propagated as a discrete wave, forming a causal space termed a causal tapestry. Each causal tapestry forms a discrete and finite sampling of an emergent causal manifold (space-time) M and emergent wave function. Interactions between processes are described by a process algebra which possesses 8 commutative operations (sums and products) together with a non-commutative concatenation operator (transitions). The process algebra possesses a representation via nondeterministic combinatorial games. The process algebra connects to quantum mechanics through the set valued process and configuration space covering maps, which associate each causal tapestry with sets of wave functions over M. Probabilities emerge from interactions between processes. The process algebra model has been shown to reproduce many features of the theory of non-relativistic scalar particles to a high degree of accuracy, without paradox or divergences. This paper extends the approach to a semi-classical form of quantum electrodynamics. (paper)

  18. Fundamentals of Numerical Modelling of Casting Processes

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri; Pryds, Nini; Thorborg, Jesper

    Fundamentals of Numerical Modelling of Casting Processes comprises a thorough presentation of the basic phenomena that need to be addressed in numerical simulation of casting processes. The main philosophy of the book is to present the topics in view of their physical meaning, whenever possible, rather than relying strictly on mathematical formalism. The book, aimed both at the researcher and the practicing engineer, as well as the student, is naturally divided into four parts. Part I (Chapters 1-3) introduces the fundamentals of modelling in a 1-dimensional framework. Part II (Chapter 4

  19. Process and Context in Choice Models

    DEFF Research Database (Denmark)

    Ben-Akiva, Moshe; Palma, André de; McFadden, Daniel

    2012-01-01

    The extended choice framework includes more behavioral richness through the explicit representation of the planning process preceding an action and its dynamics and the effects of context (family, friends, and market) on the process leading to a choice, as well as the inclusion of new types of subjective data in choice models. We discuss the key issues involved in applying the extended framework, focusing on richer data requirements, theories, and models, and present three partial demonstrations of the proposed framework. Future research challenges include the development of more comprehensive empirical tests.

  20. Performance assessment modeling of pyrometallurgical process wasteforms

    International Nuclear Information System (INIS)

    Nutt, W.M.; Hill, R.N.; Bullen, D.B.

    1995-01-01

    Performance assessment analyses have been completed to estimate the behavior of high-level nuclear wasteforms generated from the pyrometallurgical processing of liquid metal reactor (LMR) and light water reactor (LWR) spent nuclear fuel. Waste emplaced in the proposed repository at Yucca Mountain is investigated as the basis for the study. The resulting cumulative actinide and fission product releases to the accessible environment within a 100,000 year period from the various pyrometallurgical process wasteforms are compared to those of directly disposed LWR spent fuel using the same total repository system model. The impact of differing radionuclide transport models on the overall release characteristics is investigated

  1. Models of transport processes in concrete

    International Nuclear Information System (INIS)

    Pommersheim, J.M.; Clifton, J.R.

    1991-01-01

    An approach being considered by the US Nuclear Regulatory Commission for disposal of low-level radioactive waste is to place the waste forms in concrete vaults buried underground. The vaults would need a service life of 500 years. Approaches for predicting the service life of concrete of such vaults include the use of mathematical models. Mathematical models are presented in this report for the major degradation processes anticipated for the concrete vaults, which are corrosion of steel reinforcement, sulfate attack, acid attack, and leaching. The models mathematically represent rate controlling processes including diffusion, convection, and reaction and sorption of chemical species. These models can form the basis for predicting the life of concrete under in-service conditions. 33 refs., 6 figs., 7 tabs
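    Of the degradation processes listed, leaching and chloride ingress (driving reinforcement corrosion) are classically modelled as Fickian diffusion. For a constant surface concentration the one-dimensional solution is C(x,t) = Cs·erfc(x / (2√(Dt))); the diffusion coefficient below is only a typical order of magnitude for concrete, used as an illustration:

```python
import math

def concentration(x_m, t_s, Cs, D):
    # Fick's second law, semi-infinite medium, constant surface
    # concentration Cs: C(x, t) = Cs * erfc(x / (2 * sqrt(D * t))).
    return Cs * math.erfc(x_m / (2.0 * math.sqrt(D * t_s)))

D = 1.0e-12                 # m^2/s, illustrative value for concrete
YEAR = 3.15e7               # seconds per year
for depth_mm in (10, 30, 50):
    c = concentration(depth_mm / 1000.0, 500 * YEAR, 1.0, D)
    print(depth_mm, round(c, 3))   # relative concentration at depth
```

    The 500-year horizon matches the vault service life discussed in the report; a full service-life model couples such profiles to reaction, sorption and convection terms.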

  2. Retort process modelling for Indian traditional foods.

    Science.gov (United States)

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack foods are typically heterogeneous recipes that incorporate varieties of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer-pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate the cold point temperature. Initial process conditions, retort temperature and % solid content were the significant independent variables. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot based sweet product) and Upama (wheat based snack product). The predicted and experimental temperature profiles matched within ±10 % error, which is good agreement considering that the food is a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize retorting of various Indian traditional vegetarian foods.
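    A lumped-parameter cold-point model reduces the pouch thermodynamics to a single first-order lag. The time constant below is hypothetical; in the study it would be fitted from the initial conditions, retort temperature and % solid content:

```python
import math

def cold_point_temp(t_min, T0, T_retort, tau_min):
    # First-order lumped response of the slowest-heating point:
    # T(t) = Tr - (Tr - T0) * exp(-t / tau)
    return T_retort - (T_retort - T0) * math.exp(-t_min / tau_min)

for t in (0, 10, 20, 40, 60):
    T = cold_point_temp(t, T0=30.0, T_retort=121.0, tau_min=18.0)
    print(t, round(T, 1))   # cold point approaches retort temperature
```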

  3. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Objectives: The purpose of this research is to review the current literature on CI, to identify and analyse existing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, in which the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases in which the output of one phase is the input of the next. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  4. Modeling of Reaction Processes Controlled by Diffusion

    International Nuclear Information System (INIS)

    Revelli, Jorge

    2003-01-01

    Stochastic modeling is quite powerful in science and technology. The techniques derived from this process have been used with great success in laser theory, biological systems and chemical reactions. Besides, they provide a theoretical framework for the analysis of experimental results in the field of particle diffusion in ordered and disordered materials. In this work we analyze transport processes in one-dimensional fluctuating media, which are media that change their state in time. This fact induces changes in the movements of the particles, giving rise to different phenomena and dynamics that will be described and analyzed in this work. We present some random walk models to describe these fluctuating media. These models include state transitions governed by different dynamical processes. We also analyze the trapping problem in a lattice by means of a simple model which predicts a resonance-like phenomenon. Also we study effective diffusion processes over surfaces due to random walks in the bulk. We consider different boundary conditions and transition movements. We derive expressions that describe diffusion behaviors constrained by bulk restrictions and the dynamics of the particles. Finally it is important to mention that the theoretical results obtained from the models proposed in this work are compared with Monte Carlo simulations. We find, in general, excellent agreement between the theory and the simulations.
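    The flavor of a random walk in a fluctuating medium can be captured in a few lines: the medium occupies one of two states, flips at random, and biases the walker's steps while it persists. This toy dynamic-disorder model and its parameters are illustrative, not the specific models of the work; the Monte Carlo estimate of the mean-square displacement plays the role of the simulations mentioned above:

```python
import random

random.seed(3)

def walk(steps, flip_prob=0.1, bias=0.1):
    # 1-D walker; the medium's state (+1/-1) flips with probability
    # flip_prob each step and shifts the step probability by +/- bias.
    x, state = 0, 1
    for _ in range(steps):
        if random.random() < flip_prob:
            state = -state
        x += 1 if random.random() < 0.5 + bias * state else -1
    return x

n_walkers, steps = 2000, 500
msd = sum(walk(steps) ** 2 for _ in range(n_walkers)) / n_walkers
print(msd)   # grows roughly linearly in time (effective diffusion)
```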

  5. Integrating textual and model-based process descriptions for comprehensive process search

    NARCIS (Netherlands)

    Leopold, Henrik; van der Aa, Han; Pittke, Fabian; Raffel, Manuel; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Documenting business processes using process models is common practice in many organizations. However, not all process information is best captured in process models. Hence, many organizations complement these models with textual descriptions that specify additional details. The problem with this

  6. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system, including the design, operation and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
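    At its core an FFM links failure modes to observation points through directed effect-propagation edges, and diagnosis amounts to graph traversal. A minimal qualitative sketch with hypothetical component names (not from the AGSM project):

```python
from collections import deque

EDGES = {                      # effect propagates from key to values
    "valve_stuck":   ["low_flow"],
    "pump_degraded": ["low_flow", "high_current"],
    "low_flow":      ["low_pressure_sensor"],
    "high_current":  ["motor_current_sensor"],
}
OBSERVATIONS = {"low_pressure_sensor", "motor_current_sensor"}

def affected_observations(failure_mode):
    # Breadth-first walk of the propagation graph; the observation
    # points reached are those where the failure becomes visible.
    seen, queue = set(), deque([failure_mode])
    while queue:
        node = queue.popleft()
        for nxt in EDGES.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen & OBSERVATIONS)

print(affected_observations("pump_degraded"))
# ['low_pressure_sensor', 'motor_current_sensor']
print(affected_observations("valve_stuck"))
# ['low_pressure_sensor']
```

    A generic component model would package such edge sets per component type so that system-level graphs can be assembled with common appearance and diagnostic behavior.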

  7. Hencky's model for elastomer forming process

    Science.gov (United States)

    Oleinikov, A. A.; Oleinikov, A. I.

    2016-08-01

    In the numerical simulation of elastomer forming processes, Hencky's isotropic hyperelastic material model can guarantee relatively accurate prediction of the strain range under large deformations. It is shown that this material model extends Hooke's law from the range of infinitesimal strains to that of moderate strains. A new representation of the fourth-order elasticity tensor for Hencky's hyperelastic isotropic material is obtained; it possesses both minor symmetries and the major symmetry. The constitutive relations of the considered model are implemented into the MSC.Marc code. By calculating and fitting curves, the polyurethane elastomer material constants are selected. Simulation of equipment for elastomer sheet forming is considered.
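    The extension of Hooke's law mentioned above can be written compactly with the logarithmic (Hencky) strain; the following is the standard textbook form, not copied from the paper itself:

```latex
% Hencky (logarithmic) strain built from the left stretch tensor V
\mathbf{H} = \ln \mathbf{V}, \qquad \mathbf{V} = \sqrt{\mathbf{F}\mathbf{F}^{\mathsf T}}
% Hooke-type law for the Kirchhoff stress with Lam\'e constants \lambda, \mu
\boldsymbol{\tau} = \lambda\,(\operatorname{tr}\mathbf{H})\,\mathbf{I} + 2\mu\,\mathbf{H}
% Associated strain-energy density
W = \tfrac{\lambda}{2}\,(\operatorname{tr}\mathbf{H})^{2} + \mu\,\operatorname{tr}(\mathbf{H}^{2})
```

    For infinitesimal deformations, H reduces to the small-strain tensor and the law reduces to classical linear elasticity, which is the sense in which the model "extends" Hooke's law.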

  8. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane-wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Väisälä frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, model-based solutions to the signal enhancement problem for internal waves are investigated.
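    The vertical mode functions come from a boundary-value problem of the form phi'' + k^2 phi = 0 (per mode); the state-space idea is to propagate the state x = [phi, phi'] in depth rather than solve globally. A minimal sketch with a hypothetical modal wavenumber:

```python
import math

def propagate_mode(k, depth, steps=100_000):
    """Propagate the state x = [phi, phi'] of phi'' + k^2 phi = 0 in depth using
    a forward-Euler state-transition step, x_{j+1} = (I + A*dz) x_j,
    with A = [[0, 1], [-k^2, 0]]."""
    dz = depth / steps
    phi, dphi = 0.0, k          # initial conditions matching phi(z) = sin(k z)
    for _ in range(steps):
        phi, dphi = phi + dz * dphi, dphi - dz * k**2 * phi
    return phi

k = math.pi                     # hypothetical modal wavenumber
phi_end = propagate_mode(k, 0.5)
```

    In the full scheme, the same state-space form feeds a Kalman-type processor, which is what makes model-based signal enhancement possible.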

  9. Modeling and simulation of pressurizer dynamic process in PWR nuclear power plant

    International Nuclear Information System (INIS)

    Ma Jin; Liu Changliang; Li Shu'na

    2010-01-01

    By analyzing the actual operating characteristics of the pressurizer in a pressurized water reactor (PWR) nuclear power plant, and based on some reasonable simplifications and basic assumptions, the mass and energy conservation equations for the pressurizer's steam and liquid regions are set up. The purpose of this paper is to build a pressurizer model with two non-equilibrium regions. A water level and pressure control system for the pressurizer is formed through model encapsulation. Dynamic simulation curves of the main parameters are also shown. Finally, comparisons between the theoretical analysis and the simulation results show that the two-region non-equilibrium pressurizer model is reasonable. (authors)
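    The mass-balance part of such a two-region model can be sketched as a pair of coupled lumped equations; the flow names and numbers below are hypothetical, and the energy balances (which determine pressure) are deliberately omitted:

```python
def step(M_liq, M_stm, W_surge, W_spray, W_flash, W_cond, dt):
    """One Euler step of the two-region mass balances of a pressurizer.
    Flashing transfers mass from liquid to steam; spray-driven condensation
    does the reverse. Surge and spray are external inflows to the liquid."""
    dM_liq = (W_surge + W_spray + W_cond - W_flash) * dt
    dM_stm = (W_flash - W_cond) * dt
    return M_liq + dM_liq, M_stm + dM_stm

M_liq, M_stm = 20_000.0, 3_000.0   # kg, hypothetical inventories
for _ in range(1000):               # 10 s at dt = 0.01 s, no surge or spray
    M_liq, M_stm = step(M_liq, M_stm, 0.0, 0.0, 1.5, 0.4, 0.01)
total = M_liq + M_stm
```

    With no external flows, the interfacial flashing/condensation terms cancel exactly, so total inventory is conserved, which is a useful sanity check on any implementation of the conservation equations.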

  10. Model Identification of Integrated ARMA Processes

    Science.gov (United States)

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
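    SCAN and ESACF are SAS procedures, so they are not reproduced here; as a stand-in, the classic first step of ARIMA identification that they automate, inspecting the sample autocorrelation of a series and of its difference, can be sketched in a few lines:

```python
import random

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

random.seed(0)
# A random walk is integrated of order 1: its ACF decays very slowly...
walk = [0.0]
for _ in range(2000):
    walk.append(walk[-1] + random.gauss(0, 1))
# ...while its first difference is white noise with negligible autocorrelation.
diff = [b - a for a, b in zip(walk, walk[1:])]
```

    A near-unity lag-1 autocorrelation that persists across lags signals that differencing is needed before fitting the ARMA part, which is exactly the situation (integrated processes) the article evaluates SCAN and ESACF on.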

  11. Kinetics and modeling of anaerobic digestion process

    DEFF Research Database (Denmark)

    Gavala, Hariklia N.; Angelidaki, Irini; Ahring, Birgitte Kiær

    2003-01-01

    Anaerobic digestion modeling started in the early 1970s when the need for design and efficient operation of anaerobic systems became evident. At that time not only was the knowledge about the complex process of anaerobic digestion inadequate but also there were computational limitations. Thus...

  12. Designing equivalent semantic models for process creation

    NARCIS (Netherlands)

    P.H.M. America (Pierre); J.W. de Bakker (Jaco)

    1986-01-01

    Operational and denotational semantic models are designed for languages with process creation, and the relationships between the two semantics are investigated. The presentation is organized in four sections dealing with a uniform and static, a uniform and dynamic, a nonuniform and

  13. Mathematical Modelling of Continuous Biotechnological Processes

    Science.gov (United States)

    Pencheva, T.; Hristozov, I.; Shannon, A. G.

    2003-01-01

    Biotechnological processes (BTP) are characterized by a complicated structure of organization and interdependent characteristics. Partial differential equations or systems of partial differential equations are used for their behavioural description as objects with distributed parameters. Modelling of substrate without regard to dispersion…

  14. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  15. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  16. Fuzzy model for Laser Assisted Bending Process

    Directory of Open Access Journals (Sweden)

    Giannini Oliviero

    2016-01-01

    Full Text Available In the present study, a fuzzy model was developed to predict the residual bending in a conventional metal bending process assisted by a high power diode laser. The study was focused on AA6082T6 aluminium thin sheets. In most dynamic sheet metal forming operations, the highly nonlinear deformation processes cause large amounts of elastic strain energy stored in the formed material. The novel hybrid forming process was thus aimed at inducing the local heating of the mechanically bent workpiece in order to decrease or eliminate the related springback phenomena. In particular, the influence on the extent of springback phenomena of laser process parameters such as source power, scan speed and starting elastic deformation of mechanically bent sheets, was experimentally assessed. Consistent trends in experimental response according to operational parameters were found. Accordingly, 3D process maps of the extent of the springback phenomena according to operational parameters were constructed. The effect of the inherent uncertainties on the predicted residual bending caused by the approximation in the model parameters was evaluated. In particular, a fuzzy-logic based approach was used to describe the model uncertainties and the transformation method was applied to propagate their effect on the residual bending.
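    The transformation method mentioned above propagates fuzzy parameters by evaluating the model at the corners of each alpha-cut interval and taking the min/max per cut. A minimal sketch, with a purely hypothetical response surface and triangular fuzzy numbers (the paper's actual model and values are not reproduced here):

```python
from itertools import product

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (left, peak, right) at membership alpha."""
    left, peak, right = tri
    return (left + alpha * (peak - left), right - alpha * (right - peak))

def transform(model, fuzzy_params, alphas=(0.0, 0.5, 1.0)):
    """Propagate fuzzy parameters through `model`; returns {alpha: (min, max)}.
    Evaluating at interval corners is exact for monotonic models."""
    out = {}
    for a in alphas:
        cuts = [alpha_cut(p, a) for p in fuzzy_params]
        values = [model(*combo) for combo in product(*cuts)]
        out[a] = (min(values), max(values))
    return out

# Hypothetical surface: residual bend grows with laser power, shrinks with scan speed.
model = lambda power, speed: 2.0 * power / speed
power = (90.0, 100.0, 110.0)    # W, triangular fuzzy parameter
speed = (8.0, 10.0, 12.0)       # mm/s, triangular fuzzy parameter

result = transform(model, (power, speed))
```

    The alpha = 1 cut collapses to the crisp nominal prediction, while lower cuts give progressively wider intervals, which is how the uncertainty in the model parameters maps onto a fuzzy residual bending.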

  17. Process Model for Friction Stir Welding

    Science.gov (United States)

    Adams, Glynn

    1996-01-01

    Friction stir welding (FSW) is a relatively new process being applied for the joining of metal alloys. The process was initially developed by The Welding Institute (TWI) in Cambridge, UK. The FSW process is being investigated at NASA/MSFC as a repair/initial weld procedure for fabrication of the super-light-weight aluminum-lithium shuttle external tank. The FSW investigations at MSFC were conducted on a horizontal mill to produce butt welds of flat plate material. The weldment plates are butted together and fixed to a backing plate on the mill bed. A pin tool is placed into the tool holder of the mill spindle and rotated at approximately 400 rpm. The pin tool is then plunged into the plates such that the center of the probe lies at one end of the line of contact between the plates and the shoulder of the pin tool penetrates the top surface of the weldment. The weld is produced by traversing the tool along the line of contact between the plates. A lead angle allows the leading edge of the shoulder to remain above the top surface of the plate. The work presented here is a first attempt at modeling a complex phenomenon. The mechanical aspects of conducting the weld process are easily defined, and the process itself is controlled by relatively few input parameters. However, in the region of the weld, plasticizing and forging of the parent material occur. These are difficult processes to model. The model presented here addresses only variations in the radial dimension outward from the pin tool axis. Examinations of the grain structure of the weld reveal that a considerable amount of material deformation also occurs in the direction parallel to the pin tool axis of rotation, through the material thickness. In addition, measurements of the axial load on the pin tool demonstrate that the forging effect of the pin tool shoulder is an important process phenomenon. Therefore, the model needs to be expanded to account for the deformations through the material thickness and the

  18. Modelling and control of a flotation process

    International Nuclear Information System (INIS)

    Ding, L.; Gustafsson, T.

    1999-01-01

    A general description of a flotation process is given. The dynamic model of a MIMO nonlinear subprocess in flotation, i.e., the pulp levels in five compartments in series, is developed, and the model is verified with real data from a production plant. In order to reject constant disturbances, five extra states are introduced and the model is modified. An exact linearization has been made for the nonlinear model, and a linear quadratic Gaussian controller is proposed based on the linearized model. The simulation results show improved performance of the pulp level control when the set points are changed or a disturbance occurs. In the future the controller will be tested in production. (author)
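    The combination described, exact linearization plus extra states for constant-disturbance rejection, can be illustrated on a single level compartment; the square-root outflow plant and all gains below are hypothetical stand-ins, not the paper's five-compartment model:

```python
import math

# Single-compartment stand-in for one pulp level: A*dh/dt = u - k*sqrt(h) + d,
# with an unknown constant disturbance d. The control cancels the k*sqrt(h)
# nonlinearity (exact linearization); an extra integrator state z rejects d.
A, k, d = 1.0, 1.0, 0.3          # hypothetical plant parameters
kp, ki = 2.0, 1.0                # places both closed-loop poles at s = -1
h, z, href, dt = 1.0, 0.0, 2.0, 0.01

for _ in range(3000):            # 30 s of simulated time
    e = h - href
    u = k * math.sqrt(max(h, 1e-9)) + A * (-kp * e - ki * z)  # linearizing control
    h += dt * (u - k * math.sqrt(max(h, 1e-9)) + d) / A
    z += dt * e
```

    After cancellation the error dynamics are linear (e'' + kp*e' + ki*e = 0 driven by the disturbance), and the integrator state settles at z = d/(A*ki), absorbing the unknown offset so the level reaches its set point exactly.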

  19. Learning Markov Decision Processes for Model Checking

    DEFF Research Database (Denmark)

    Mao, Hua; Chen, Yingke; Jaeger, Manfred

    2012-01-01

    Constructing an accurate system model for formal model verification can be both resource demanding and time-consuming. To alleviate this shortcoming, algorithms have been proposed for automatically learning system models based on observed system behaviors. In this paper we extend the algorithm on learning probabilistic automata to reactive systems, where the observed system behavior is in the form of alternating sequences of inputs and outputs. We propose an algorithm for automatically learning a deterministic labeled Markov decision process model from the observed behavior of a reactive system. The proposed learning algorithm is adapted from algorithms for learning deterministic probabilistic finite automata, and extended to include both probabilistic and nondeterministic transitions. The algorithm is empirically analyzed and evaluated by learning system models of slot machines. The evaluation…
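    The core counting step of such learning algorithms estimates, for each (state, input) pair, a distribution over outputs from the observed alternating traces. A minimal sketch; treating the most recent output as the state is a simplifying assumption of this illustration, not the paper's state-merging construction:

```python
from collections import Counter, defaultdict

def estimate(traces, start="s0"):
    """Estimate P(output | state, input) from alternating [in, out, in, out, ...] traces."""
    counts = defaultdict(Counter)
    for trace in traces:
        state = start
        for i in range(0, len(trace) - 1, 2):
            inp, out = trace[i], trace[i + 1]
            counts[(state, inp)][out] += 1
            state = out          # assumption: the last output identifies the state
    return {key: {o: n / sum(c.values()) for o, n in c.items()}
            for key, c in counts.items()}

# Toy slot-machine-style traces: input "pull", outputs "win"/"lose".
traces = [
    ["pull", "win", "pull", "lose"],
    ["pull", "lose", "pull", "lose"],
    ["pull", "win", "pull", "win"],
]
probs = estimate(traces)
```

    Real algorithms in this family (Alergia-style) additionally merge states whose estimated output distributions are statistically indistinguishable, which is what keeps the learned model small.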

  20. Advances in modeling plastic waste pyrolysis processes

    Energy Technology Data Exchange (ETDEWEB)

    Safadi, Y. [Department of Mechanical Engineering, American University of Beirut, PO Box 11-0236, Beirut (Lebanon); Zeaiter, J. [Chemical Engineering Program, American University of Beirut, PO Box 11-0236, Beirut (Lebanon)

    2013-07-01

    The tertiary recycling of plastics via pyrolysis is recently gaining momentum due to promising economic returns from the generated products, which can be used as a chemical feedstock or fuel. Prediction models to simulate such processes are essential for an in-depth understanding of the mechanisms that take place during the thermal or catalytic degradation of the waste polymer. This paper presents key models used successfully in the literature so far. Three modeling schemes are identified: power-law, lumped-empirical, and population-balance based equations. The categorization is based mainly on the level of detail and the prediction capability of each modeling scheme. The data show that the reliability of these modeling approaches varies with the level of detail that the experimental work and product analysis aim to achieve.
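    The simplest of the three schemes, the power-law model, treats degradation as a single conversion variable obeying d(alpha)/dt = k*(1 - alpha)^n. A minimal sketch with hypothetical rate parameters, checked against the closed form available for first order:

```python
import math

def convert(k, n, t_end, dt=1e-4):
    """Integrate the power-law conversion model d(alpha)/dt = k * (1 - alpha)**n
    by forward Euler, starting from alpha = 0 (no conversion)."""
    alpha = 0.0
    for _ in range(int(t_end / dt)):
        alpha += dt * k * (1.0 - alpha) ** n
    return alpha

# For n = 1 the model has the closed form alpha(t) = 1 - exp(-k*t),
# which the numerical integration should reproduce.
alpha_num = convert(k=0.5, n=1, t_end=2.0)
alpha_exact = 1.0 - math.exp(-0.5 * 2.0)
```

    Lumped-empirical and population-balance schemes refine this by tracking product lumps or the full molecular-weight distribution, at the cost of many more parameters to fit.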

  1. An ecological process model of systems change.

    Science.gov (United States)

    Peirson, Leslea J; Boydell, Katherine M; Ferguson, H Bruce; Ferris, Lorraine E

    2011-06-01

    In June 2007 the American Journal of Community Psychology published a special issue focused on theories, methods and interventions for systems change which included calls from the editors and authors for theoretical advancement in this field. We propose a conceptual model of systems change that integrates familiar and fundamental community psychology principles (succession, interdependence, cycling of resources, adaptation) and accentuates a process orientation. To situate our framework we offer a definition of systems change and a brief review of the ecological perspective and principles. The Ecological Process Model of Systems Change is depicted, described and applied to a case example of policy driven systems level change in publicly funded social programs. We conclude by identifying salient implications for thinking and action which flow from the Model.

  2. Modeling veterans healthcare administration disclosure processes

    Energy Technology Data Exchange (ETDEWEB)

    Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.

    2013-09-01

    As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.

  3. Dimensional modeling: beyond data processing constraints.

    Science.gov (United States)

    Bunardzic, A

    1995-01-01

    The focus of information processing requirements is shifting from on-line transaction processing (OLTP) issues to on-line analytical processing (OLAP) issues. While the former serves to ensure the feasibility of real-time on-line transaction processing (which has already exceeded levels of 1,000 transactions per second under normal conditions), the latter aims at enabling more sophisticated analytical manipulation of data. The OLTP requirements, or how to efficiently get data into the system, have been solved by applying relational theory in the form of the Entity-Relation model. There is presently no theory related to OLAP that would resolve the analytical processing requirements as efficiently as relational theory did for transaction processing. The "relational dogma" also provides the mathematical foundation for the Centralized Data Processing paradigm, in which mission-critical information is incorporated as 'one and only one instance' of data, thus ensuring data integrity. In such surroundings, the information that supports business analysis and decision support activities is obtained by running predefined reports and queries provided by the IS department. In today's intensified competitive climate, businesses are finding that this traditional approach is not good enough. The only way to stay on top of things, and to survive and prosper, is to decentralize the IS services. The newly emerging Distributed Data Processing, with its increased emphasis on empowering the end user, does not seem to find enough merit in the relational database model to justify relying upon it. Relational theory proved too rigid and complex to accommodate the analytical processing needs. In order to satisfy the OLAP requirements, or how to efficiently get the data out of the system, different models, metaphors, and theories have been devised. All of them point to the need for simplifying the highly non-intuitive mathematical constraints found

  4. Mathematical modeling of the flash converting process

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, H.Y.; Perez-Tello, M.; Riihilahti, K.M. [Utah Univ., Salt Lake City, UT (United States)

    1996-12-31

    An axisymmetric mathematical model for the Kennecott-Outokumpu flash converting process for converting solid copper matte to copper is presented. The model is an adaptation of the comprehensive mathematical model formerly developed at the University of Utah for the flash smelting of copper concentrates. The model incorporates the transport of momentum, heat, mass, and reaction kinetics between gas and particles in a particle-laden turbulent gas jet. The standard k-{epsilon} model is used to describe gas-phase turbulence in an Eulerian framework. The particle phase is treated from a Lagrangian viewpoint, which is coupled to the gas phase via the source terms in the Eulerian gas-phase governing equations. Matte particles were represented as Cu{sub 2}S·yFeS and assumed to undergo homogeneous oxidation to Cu{sub 2}O, Fe{sub 3}O{sub 4}, and SO{sub 2}. A reaction kinetics mechanism involving both external mass transfer of oxygen gas to the particle surface and diffusion of oxygen through the porous oxide layer is proposed to estimate the particle oxidation rate. Predictions of the mathematical model were compared with the experimental data collected in a bench-scale flash converting facility. Good agreement between the model predictions and the measurements was obtained. The model was used to study the effect of different gas-injection configurations on the overall fluid dynamics in a commercial-size flash converting shaft. (author)
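    The two-resistance oxidation mechanism described, external mass transfer in series with diffusion through the porous oxide layer, reduces in a flat-layer sketch to a harmonic combination of transfer coefficients. All numbers below are hypothetical, for illustration only:

```python
def overall_rate(c_bulk, k_g, D_eff, delta):
    """Oxygen flux to the reacting surface with film mass transfer (k_g) in
    series with diffusion through a porous oxide layer of thickness delta."""
    resistance = 1.0 / k_g + delta / D_eff   # resistances in series add
    return c_bulk / resistance

c_bulk = 8.0      # mol/m^3, hypothetical bulk O2 concentration
k_g = 0.05        # m/s, film mass-transfer coefficient
D_eff = 1e-6      # m^2/s, effective diffusivity of the oxide layer
delta = 50e-6     # m, oxide-layer thickness

flux = overall_rate(c_bulk, k_g, D_eff, delta)
```

    The flux is always below both single-resistance limits (film-only and layer-only), and as the oxide layer grows the diffusion term increasingly dominates, slowing the particle oxidation rate.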

  5. Mathematical modeling of the flash converting process

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, H Y; Perez-Tello, M; Riihilahti, K M [Utah Univ., Salt Lake City, UT (United States)

    1997-12-31

    An axisymmetric mathematical model for the Kennecott-Outokumpu flash converting process for converting solid copper matte to copper is presented. The model is an adaptation of the comprehensive mathematical model formerly developed at the University of Utah for the flash smelting of copper concentrates. The model incorporates the transport of momentum, heat, mass, and reaction kinetics between gas and particles in a particle-laden turbulent gas jet. The standard k-{epsilon} model is used to describe gas-phase turbulence in an Eulerian framework. The particle phase is treated from a Lagrangian viewpoint, which is coupled to the gas phase via the source terms in the Eulerian gas-phase governing equations. Matte particles were represented as Cu{sub 2}S·yFeS and assumed to undergo homogeneous oxidation to Cu{sub 2}O, Fe{sub 3}O{sub 4}, and SO{sub 2}. A reaction kinetics mechanism involving both external mass transfer of oxygen gas to the particle surface and diffusion of oxygen through the porous oxide layer is proposed to estimate the particle oxidation rate. Predictions of the mathematical model were compared with the experimental data collected in a bench-scale flash converting facility. Good agreement between the model predictions and the measurements was obtained. The model was used to study the effect of different gas-injection configurations on the overall fluid dynamics in a commercial-size flash converting shaft. (author)

  6. Process models and model-data fusion in dendroecology

    Directory of Open Access Journals (Sweden)

    Joel eGuiot

    2014-08-01

    Full Text Available Dendrochronology (i.e. the study of annually dated tree-ring time series) has proved to be a powerful technique for understanding tree growth. This paper aims to show the value of using ecophysiological modeling not only to understand and predict tree growth (dendroecology) but also to reconstruct past climates (dendroclimatology). Process models have been used for several decades in dendroclimatology, but it is only recently that methods of model-data fusion have led to significant progress in modeling tree growth as a function of climate and in reconstructing past climates. These model-data fusion (MDF) methods, mainly based on the Bayesian paradigm, have been shown to be powerful for both model calibration and model inversion. After a brief survey of tree-growth modeling, we illustrate MDF with examples based on tree-ring series of Aleppo pines from southern France and oaks from central France. These examples show that if plants experienced CO2 fertilization, this would have a significant effect on tree growth, which in turn would bias the climate reconstructions. This bias could extend to other environmental non-climatic factors that directly or indirectly affect annual ring formation and are not taken into account in classical empirical models, which supports the use of more complex process-based models. Finally, we conclude by showing the value of applying the data assimilation methods used in climatology to produce climate re-analyses.

  7. Reversibility in Quantum Models of Stochastic Processes

    Science.gov (United States)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking, and successive measurements in spin systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states, and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory, and patterns of that process. However, ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ ≥ Cq ≥ E. q-machines can be constructed to consider longer strings, resulting in greater compression. With sufficiently long code-words, the statistical complexity becomes time-symmetric, a feature apparently novel to this quantum representation. This result has ramifications for the compression of classical information in quantum computing and quantum communication technology.
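    The statistical complexity Cμ referred to above is the Shannon entropy of the stationary distribution over the ɛ-machine's causal states. A sketch for the Golden Mean process (binary sequences with no two consecutive 0s), a standard textbook example rather than one from the abstract:

```python
import math

def stationary(T, iters=200):
    """Stationary distribution of a row-stochastic matrix by power iteration."""
    n = len(T)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * T[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_bits(p):
    """Shannon entropy of a distribution, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Golden Mean epsilon-machine: state A emits 0 or 1 with probability 1/2
# (0 leads to B, 1 stays in A); state B must emit 1 and return to A.
T = [[0.5, 0.5],
     [1.0, 0.0]]
pi = stationary(T)
C_mu = entropy_bits(pi)      # statistical complexity in bits
```

    For this process the excess entropy E is strictly smaller than Cμ, which is precisely the gap the q-machine construction narrows.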

  8. Exploring the spatial distribution of light interception and photosynthesis of canopies by means of a functional-structural plant model

    NARCIS (Netherlands)

    Sarlikioti, V.; Visser, de P.H.B.; Marcelis, L.F.M.

    2011-01-01

    Background and Aims - At present most process-based models and the majority of three-dimensional models include simplifications of plant architecture that can compromise the accuracy of light interception simulations and, accordingly, canopy photosynthesis. The aim of this paper is to analyse canopy

  9. Dual elaboration models in attitude change processes

    Directory of Open Access Journals (Sweden)

    Žeželj Iris

    2005-01-01

    Full Text Available This article examines empirical and theoretical developments in research on attitude change over the past 50 years. It focuses on the period from 1980 to the present, and on cognitive response theories as the dominant theoretical approach in the field. The postulates of the Elaboration Likelihood Model, the most-researched representative of dual process theories, are examined on the basis of a review of the accumulated research evidence. The main research findings are grouped into four basic factors: message source, message content, message recipient, and context. The most influential criticisms of the theory, regarding its empirical base and its dual-process assumption, are then presented. Some possible applications and further research perspectives are discussed at the end.

  10. Temperature Modelling of the Biomass Pretreatment Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Blanke, Mogens; Jensen, Jakob M.

    2012-01-01

    In a second generation biorefinery, the biomass pretreatment stage has an important contribution to the efficiency of the downstream processing units involved in biofuel production. Most of the pretreatment process occurs in a large pressurized thermal reactor that presents an irregular temperature...... that captures the environmental temperature differences inside the reactor using distributed parameters. A Kalman filter is then added to account for any missing dynamics and the overall model is embedded into a temperature soft sensor. The operator of the plant will be able to observe the temperature in any...
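    The soft-sensor idea mentioned above, a model corrected by a Kalman filter, can be illustrated in the scalar case; the reactor temperature, noise levels, and tuning below are all hypothetical:

```python
import random

def kalman_step(x, P, z, Q, R):
    """One predict/update cycle of a scalar Kalman filter with identity dynamics:
    the model says the temperature stays put (plus process noise Q), and each
    noisy measurement z (variance R) corrects the estimate."""
    P = P + Q                    # predict: state unchanged, uncertainty grows
    K = P / (P + R)              # Kalman gain
    x = x + K * (z - x)          # update toward the measurement
    P = (1.0 - K) * P
    return x, P

random.seed(1)
true_temp = 180.0                # degC, hypothetical reactor temperature
x, P = 150.0, 100.0              # poor initial guess, large initial uncertainty
for _ in range(500):
    z = true_temp + random.gauss(0.0, 2.0)   # noisy sensor reading
    x, P = kalman_step(x, P, z, Q=1e-4, R=4.0)
```

    In the distributed-parameter setting of the abstract, the same predict/update cycle runs on a spatial temperature model, letting the operator observe the temperature at locations with no physical sensor.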

  11. Adaptive simplification and the evolution of gecko locomotion: Morphological and biomechanical consequences of losing adhesion

    Science.gov (United States)

    Higham, Timothy E.; Birn-Jeffery, Aleksandra V.; Collins, Clint E.; Hulsey, C. Darrin; Russell, Anthony P.

    2015-01-01

    Innovations permit the diversification of lineages, but they may also impose functional constraints on behaviors such as locomotion. Thus, it is not surprising that secondary simplification of novel locomotory traits has occurred several times among vertebrates and could potentially lead to exceptional divergence when constraints are relaxed. For example, the gecko adhesive system is a remarkable innovation that permits locomotion on surfaces unavailable to other animals, but has been lost or simplified in species that have reverted to a terrestrial lifestyle. We examined the functional and morphological consequences of this adaptive simplification in the Pachydactylus radiation of geckos, which exhibits multiple unambiguous losses or bouts of simplification of the adhesive system. We found that the rates of morphological and 3D locomotor kinematic evolution are elevated in those species that have simplified or lost adhesive capabilities. This finding suggests that the constraints associated with adhesion have been circumvented, permitting these species to either run faster or burrow. The association between a terrestrial lifestyle and the loss/reduction of adhesion suggests a direct link between morphology, biomechanics, and ecology. PMID:25548182

  12. An integrated computer aided system for integrated design of chemical processes

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Hytoft, Glen; Jaksland, Cecilia

    1997-01-01

    In this paper, an Integrated Computer Aided System (ICAS), which is particularly suitable for solving problems related to the integrated design of chemical processes, is presented. ICAS features include a model generator (generation of problem-specific models including model simplification and model ...... form the basis for the toolboxes. The available features of ICAS are highlighted through a case study involving the separation of binary azeotropic mixtures. (C) 1997 Elsevier Science Ltd.

  13. Theoretical modelling of carbon deposition processes

    International Nuclear Information System (INIS)

    Marsh, G.R.; Norfolk, D.J.; Skinner, R.F.

    1985-01-01

    Work based on capsule experiments in the BNL Gamma Facility, aimed at elucidating the chemistry involved in the formation of carbonaceous deposit on CAGR fuel pin surfaces is described. Using a data-base derived from capsule experiments together with literature values for the kinetics of the fundamental reactions, a chemical model of the gas-phase processes has been developed. This model successfully reproduces the capsule results, whilst preliminary application to the WAGR coolant circuit indicates the likely concentration profiles of various radical species within the fuel channels. (author)

  14. Theoretical Modelling of Intercultural Communication Process

    Directory of Open Access Journals (Sweden)

    Mariia Soter

    2016-08-01

    Full Text Available The concepts of “communication”, “intercultural communication” and “model of communication” are defined and analyzed in the article. The basic components of the communication process are singled out, and a model of intercultural communication is developed. Communicative, behavioral and complex skills for the optimal organization of intercultural communication, the establishment of productive contact with a foreign partner to achieve mutual understanding, and the search for acceptable ways of organizing interaction and cooperation for both communicants are highlighted in the article. It is noted that intercultural communication, through interaction between people, affects the development of various aspects of the cultures involved.

  15. Symmetries and modelling functions for diffusion processes

    International Nuclear Information System (INIS)

    Nikitin, A G; Spichak, S V; Vedula, Yu S; Naumovets, A G

    2009-01-01

    A constructive approach to the theory of diffusion processes is proposed, based on the application of both symmetry analysis and the method of modelling functions. An algorithm for the construction of the modelling functions is suggested. This algorithm is based on the error function expansion (ERFEX) of experimental concentration profiles. The high-accuracy analytical description of the profiles provided by the ERFEX approximation allows convenient extraction of the concentration dependence of the diffusivity from experimental data and prediction of the diffusion process. Our analysis is exemplified by applying it to experimental results obtained for surface diffusion of lithium on the molybdenum (1 1 2) surface precovered with dysprosium. The ERFEX approximation can be directly extended to many other diffusion systems.
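    The building block of such an expansion is the error-function profile of constant-diffusivity spreading from a step source; ERFEX then sums several shifted and scaled terms of this kind. A single-term sketch with hypothetical diffusivity and time:

```python
import math

def profile(x, t, D, c0=1.0):
    """Concentration at distance x from an initial step at x = 0 after diffusion
    time t, for constant diffusivity D: c = (c0/2) * erfc(x / (2*sqrt(D*t)))."""
    return 0.5 * c0 * math.erfc(x / (2.0 * math.sqrt(D * t)))

D, t = 1e-9, 100.0          # hypothetical diffusivity (m^2/s) and time (s)
c_mid = profile(0.0, t, D)  # concentration at the original step position
c_far = profile(1e-3, t, D) # concentration 1 mm ahead of the step
```

    Fitting a sum of such terms to a measured profile gives a smooth analytic description from which a concentration-dependent diffusivity can be extracted by differentiation, which is the step the high accuracy of the ERFEX fit makes reliable.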

  16. Modeling of processes in the tourism sector

    Directory of Open Access Journals (Sweden)

    Salamatina Victoriya, S.

    2015-06-01

    Full Text Available In modern conditions, tourism is becoming a budget-forming sector for a number of Russian regions. It is therefore of interest to model the processes occurring in the tourism business, because they are affected by many random parameters arising from various economic, political, geographic and other factors. To improve and develop systems for managing the tourism business, economic-mathematical methods are being systematically introduced in this area, because increased competition requires continuous and constructive change. Applying such methods allows processes in tourism to be analyzed and evaluated more systematically and with greater internal consistency. A typical feature of some economic processes in tourist activities is that the effect of a factor on the indicators of a process appears not immediately but gradually, after some time, with a certain lag. This delay has to be taken into account when developing mathematical models of tourism business processes. In such cases, it is advisable to apply the economic-mathematical formalism of optimal control known as game theory.

  17. Deep inelastic processes and the parton model

    International Nuclear Information System (INIS)

    Altarelli, G.

    The lecture was intended as an elementary introduction to the physics of deep inelastic phenomena from the point of view of theory. General formulae and facts concerning inclusive deep inelastic processes of the form l + N → l' + hadrons (electroproduction, neutrino scattering) are first recalled. The deep inelastic annihilation e⁺e⁻ → hadrons is then considered. The light-cone approach, the parton model and their relation are mainly emphasized.

  18. Survivability Assessment: Modeling A Recovery Process

    OpenAIRE

    Paputungan, Irving Vitra; Abdullah, Azween

    2009-01-01

    Survivability is the ability of a system to continue operating, in a timely manner, in the presence of attacks, failures, or accidents. Recovery in survivability is the process by which a system heals or recovers from damage as early as possible so as to fulfill its mission as conditions permit. In this paper, we show a preliminary recovery model to enhance system survivability. The model focuses on how we preserve the system and resume its critical services under attack as soon as possible. Keywords: surv...

  19. Process Modeling With Inhomogeneous Thin Films

    Science.gov (United States)

    Machorro, R.; Macleod, H. A.; Jacobson, M. R.

    1986-12-01

    Designers of optical multilayer coatings commonly assume that the individual layers will be ideally homogeneous and isotropic. In practice, it is very difficult to control the conditions involved in the complex evaporation process sufficiently to produce such ideal films. Clearly, changes in process parameters, such as evaporation rate, chamber pressure, and substrate temperature, affect the microstructure of the growing film, frequently producing inhomogeneity in structure or composition. In many cases, these effects are interdependent, further complicating the situation. However, this process can be simulated on powerful, interactive, and accessible microcomputers. In this work, we present such a model and apply it to estimate the influence of an inhomogeneous layer on multilayer performance. Presently, the program simulates film growth, thermal expansion and contraction, and thickness monitoring procedures, and includes the effects of uncertainty in these parameters or noise. Although the model is being developed to cover very general cases, we restrict the present discussion to isotropic and nondispersive quarterwave layers to understand the particular effects of inhomogeneity. We studied several coating designs and related results and tolerances to variations in evaporation conditions. The model is composed of several modular subprograms, is written in Fortran, and is executed on an IBM-PC with 640 K of memory. The results can be presented in graphic form on a monochrome monitor. We are currently installing and implementing color capability to improve the clarity of the multidimensional output.
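The effect of an inhomogeneous layer can be illustrated with the standard characteristic-matrix method for thin films, approximating a graded layer by many thin homogeneous sublayers. The indices, substrate, and grading profile below are illustrative assumptions, not the paper's designs or software.

```python
import cmath
import math

def layer_matrix(n, phase):
    """2x2 characteristic matrix of one homogeneous, non-absorbing layer at
    normal incidence (n = refractive index, phase = 2*pi*n*d/lambda)."""
    return [[cmath.cos(phase), 1j * cmath.sin(phase) / n],
            [1j * n * cmath.sin(phase), cmath.cos(phase)]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def reflectance(layers, n_inc=1.0, n_sub=1.52):
    """Reflectance of a stack given as (index, phase-thickness) pairs,
    ordered from the incident medium toward the substrate."""
    M = [[1, 0], [0, 1]]
    for n, phase in layers:
        M = matmul(M, layer_matrix(n, phase))
    B = M[0][0] + M[0][1] * n_sub
    C = M[1][0] + M[1][1] * n_sub
    r = (n_inc * B - C) / (n_inc * B + C)
    return abs(r) ** 2

# Ideal quarterwave layer (n = 2.3) vs. the same optical thickness split into
# 20 sublayers whose index drifts linearly from 2.3 to 2.1 (inhomogeneity).
qw = math.pi / 2
ideal = reflectance([(2.3, qw)])
N = 20
graded = reflectance([(2.3 + (2.1 - 2.3) * (i + 0.5) / N, qw / N) for i in range(N)])
print(f"ideal quarterwave R = {ideal:.4f}, graded-layer R = {graded:.4f}")
```

Subdividing the layer this way is the usual numerical treatment of inhomogeneity: as the sublayers become thin, the stack converges to the continuously graded film.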

  20. Near Field Environment Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Wagner

    2000-11-14

    Waste emplacement and activities associated with construction of a repository system potentially will change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect distribution of water, increase kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers, while also potentially changing the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.

  1. [The dual process model of addiction. Towards an integrated model?].

    Science.gov (United States)

    Vandermeeren, R; Hebbrecht, M

    2012-01-01

    Neurobiology and cognitive psychology have provided us with a dual process model of addiction. According to this model, behavior is considered to be the dynamic result of a combination of automatic and controlling processes. In cases of addiction the balance between these two processes is severely disturbed. Automated processes will continue to produce impulses that ensure the continuance of addictive behavior. Weak, reflective or controlling processes are both the reason for and the result of the inability to forgo addiction. The aim of this paper is to identify features that are common to current neurocognitive insights into addiction and psychodynamic views on addiction. The picture that emerges from research is not clear. There is some evidence that attentional bias has a causal effect on addiction. There is no evidence that automatic associations have a causal effect, but there is some evidence that automatic action-tendencies do have a causal effect. Current neurocognitive views on the dual process model of addiction can be integrated with an evidence-based approach to addiction and with psychodynamic views on addiction.

  2. Improving the process of process modelling by the use of domain process patterns

    NARCIS (Netherlands)

    Koschmider, A.; Reijers, H.A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process

  3. Heat and water transport in soils and across the soil-atmosphere interface: 1. Theory and different model concepts

    DEFF Research Database (Denmark)

    Vanderborght, Jan; Fetzer, Thomas; Mosthaf, Klaus

    2017-01-01

    Evaporation is an important component of the soil water balance. It is composed of water flow and transport processes in a porous medium that are coupled with heat fluxes and free air flow. This work provides a comprehensive review of model concepts used in different research fields to describe evaporation on a theoretical level by identifying the underlying simplifications that are made for the different compartments of the system: porous medium, free flow and their interface, and by discussing how processes not explicitly considered are parameterized. Simplifications can be grouped into three sets depending...

  4. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  5. Equifinality and process-based modelling

    Science.gov (United States)

    Khatami, S.; Peel, M. C.; Peterson, T. J.; Western, A. W.

    2017-12-01

    Equifinality is understood as one of the fundamental difficulties in the study of open complex systems, including catchment hydrology. A review of the hydrologic literature reveals that the term equifinality has been widely used, but in many cases inconsistently and without coherent recognition of the various facets of equifinality, which can lead to ambiguity and even methodological fallacies. Therefore, in this study we first characterise the term equifinality within the context of hydrological modelling by reviewing the genesis of the concept of equifinality and then presenting a theoretical framework. During past decades, equifinality has mainly been studied as a subset of aleatory (arising due to randomness) uncertainty and for the assessment of model parameter uncertainty. Although the connection between parameter uncertainty and equifinality is undeniable, we argue there is more to equifinality than just aleatory parameter uncertainty. That is, the importance of equifinality and epistemic uncertainty (arising due to lack of knowledge) and their implications are overlooked in our current practice of model evaluation. Equifinality and epistemic uncertainty in studying, modelling, and evaluating hydrologic processes are treated as if they can be simply discussed in (or often reduced to) probabilistic terms (as for aleatory uncertainty). The deficiencies of this approach to conceptual rainfall-runoff modelling are demonstrated for selected Australian catchments by examination of parameter and internal flux distributions and interactions within SIMHYD. On this basis, we present a new approach that expands the equifinality concept beyond model parameters to inform epistemic uncertainty. The new approach potentially facilitates the identification and development of more physically plausible models and model evaluation schemes, particularly within the multiple working hypotheses framework, and is generalisable to other fields of environmental modelling as well.
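A toy demonstration of parameter equifinality (not SIMHYD itself): two linear reservoirs in series produce identical discharge when their rate constants are swapped, so streamflow alone cannot identify which store is fast and which is slow.

```python
def two_store(rain, k1, k2, dt=0.1):
    """Discharge from two linear reservoirs in series (explicit Euler):
    dS1/dt = P - k1*S1,  dS2/dt = k1*S1 - k2*S2,  Q = k2*S2."""
    S1 = S2 = 0.0
    Q = []
    for P in rain:
        S1 += dt * (P - k1 * S1)
        S2 += dt * (k1 * S1 - k2 * S2)
        Q.append(k2 * S2)
    return Q

rain = [10.0] * 5 + [0.0] * 45        # a 5-step storm, then recession
qa = two_store(rain, k1=0.3, k2=0.7)
qb = two_store(rain, k1=0.7, k2=0.3)  # swapped rate parameters
print("max |Qa - Qb| =", max(abs(a - b) for a, b in zip(qa, qb)))
```

The two parameter sets are distinguishable only through internal states (the individual store levels), which is exactly why the abstract argues for examining internal flux distributions rather than output fit alone.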

  6. Spherical Process Models for Global Spatial Statistics

    KAUST Repository

    Jeong, Jaehong

    2017-11-28

    Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture the spatial and temporal behavior of these global data sets. Though the geodesic distance is the most natural metric for measuring distance on the surface of a sphere, mathematical limitations have compelled statisticians to use the chordal distance to compute the covariance matrix in many applications instead, which may cause physically unrealistic distortions. Therefore, covariance functions directly defined on a sphere using the geodesic distance are needed. We discuss the issues that arise when dealing with spherical data sets on a global scale and provide references to recent literature. We review the current approaches to building process models on spheres, including the differential operator, the stochastic partial differential equation, the kernel convolution, and the deformation approaches. We illustrate realizations obtained from Gaussian processes with different covariance structures and the use of isotropic and nonstationary covariance models through deformations and geographical indicators for global surface temperature data. To assess the suitability of each method, we compare their log-likelihood values and prediction scores, and we end with a discussion of related research problems.
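The geodesic-versus-chordal issue can be made concrete with a small sketch: for the same exponential covariance (a model that remains valid on the sphere under either metric), the chordal distance understates separation and hence overstates dependence between far-apart points. All parameter values are illustrative.

```python
import math

def geodesic(lat1, lon1, lat2, lon2, radius=6371.0):
    """Great-circle (geodesic) distance in km via the haversine formula."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * radius * math.asin(math.sqrt(a))

def chordal(lat1, lon1, lat2, lon2, radius=6371.0):
    """Straight-line (chordal) distance cutting through the sphere."""
    return 2 * radius * math.sin(geodesic(lat1, lon1, lat2, lon2, radius) / (2 * radius))

def exp_cov(d, sigma2=1.0, rang=2000.0):
    """Exponential covariance C(d) = sigma^2 * exp(-d / range);
    sigma2 and rang are illustrative, not fitted values."""
    return sigma2 * math.exp(-d / rang)

# Antipodal points: the chordal distance (a diameter) is much shorter than
# the geodesic (half the circumference), inflating the implied covariance.
g = geodesic(0.0, 0.0, 0.0, 180.0)
c = chordal(0.0, 0.0, 0.0, 180.0)
print(f"geodesic = {g:.0f} km, chordal = {c:.0f} km")
print(f"cov(geodesic) = {exp_cov(g):.2e}, cov(chordal) = {exp_cov(c):.2e}")
```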

  7. Modeling Dynamic Regulatory Processes in Stroke

    Science.gov (United States)

    McDermott, Jason E.; Jarman, Kenneth; Taylor, Ronald; Lancaster, Mary; Shankaran, Harish; Vartanian, Keri B.; Stevens, Susan L.; Stenzel-Poore, Mary P.; Sanfilippo, Antonio

    2012-01-01

    The ability to examine the behavior of biological systems in silico has the potential to greatly accelerate the pace of discovery in diseases, such as stroke, where in vivo analysis is time intensive and costly. In this paper we describe an approach for in silico examination of responses of the blood transcriptome to neuroprotective agents and subsequent stroke through the development of dynamic models of the regulatory processes observed in the experimental gene expression data. First, we identified functional gene clusters from these data. Next, we derived ordinary differential equations (ODEs) from the data relating these functional clusters to each other in terms of their regulatory influence on one another. Dynamic models were developed by coupling these ODEs into a model that simulates the expression of regulated functional clusters. By changing the magnitude of gene expression in the initial input state it was possible to assess the behavior of the networks through time under varying conditions since the dynamic model only requires an initial starting state, and does not require measurement of regulatory influences at each time point in order to make accurate predictions. We discuss the implications of our models on neuroprotection in stroke, explore the limitations of the approach, and report that an optimized dynamic model can provide accurate predictions of overall system behavior under several different neuroprotective paradigms. PMID:23071432
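A minimal sketch of the modelling idea, assuming a hypothetical 3-cluster regulatory network rather than the paper's fitted ODEs: the coupled system is integrated forward from an initial state only, so changing that state is enough to probe different conditions.

```python
def simulate(W, x0, dt=0.01, steps=500):
    """Integrate the coupled linear ODE system dx_i/dt = sum_j W[i][j] * x_j
    by explicit Euler, starting from initial cluster expression x0."""
    x = list(x0)
    n = len(x)
    for _ in range(steps):
        dx = [sum(W[i][j] * x[j] for j in range(n)) for i in range(n)]
        x = [x[i] + dt * dx[i] for i in range(n)]
    return x

# Hypothetical network: cluster 0 activates 1, 1 activates 2, 2 represses 0,
# and every cluster decays toward baseline (the diagonal terms).
W = [[-1.0, 0.0, -0.5],
     [0.8, -1.0, 0.0],
     [0.0, 0.8, -1.0]]
# Changing the magnitude of the initial state mimics, for example, a
# neuroprotective pre-treatment that alters baseline expression.
baseline = simulate(W, [1.0, 0.0, 0.0])
treated = simulate(W, [2.0, 0.0, 0.0])
print("baseline:", [round(v, 4) for v in baseline])
print("treated: ", [round(v, 4) for v in treated])
```

In this linear toy the doubled input simply doubles the trajectory; the interest of the real data-derived models is that their regulatory couplings produce condition-specific, nonproportional responses.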

  8. A probabilistic evaluation procedure for process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2018-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  9. Process-based modelling of NH3 exchange with grazed grasslands

    Science.gov (United States)

    Móring, Andrea; Vieno, Massimo; Doherty, Ruth M.; Milford, Celia; Nemitz, Eiko; Twigg, Marsailidh M.; Horváth, László; Sutton, Mark A.

    2017-09-01

    In this study the GAG model, a process-based ammonia (NH3) emission model for urine patches, was extended and applied for the field scale. The new model (GAG_field) was tested over two modelling periods, for which micrometeorological NH3 flux data were available. Acknowledging uncertainties in the measurements, the model was able to simulate the main features of the observed fluxes. The temporal evolution of the simulated NH3 exchange flux was found to be dominated by NH3 emission from the urine patches, offset by simultaneous NH3 deposition to areas of the field not affected by urine. The simulations show how NH3 fluxes over a grazed field in a given day can be affected by urine patches deposited several days earlier, linked to the interaction of volatilization processes with soil pH dynamics. Sensitivity analysis showed that GAG_field was more sensitive to soil buffering capacity (β), field capacity (θfc) and permanent wilting point (θpwp) than the patch-scale model. The reason for these different sensitivities is dual. Firstly, the difference originates from the different scales. Secondly, the difference can be explained by the different initial soil pH and physical properties, which determine the maximum volume of urine that can be stored in the NH3 source layer. It was found that in the case of urine patches with a higher initial soil pH and higher initial soil water content, the sensitivity of NH3 exchange to β was stronger. Also, in the case of a higher initial soil water content, NH3 exchange was more sensitive to the changes in θfc and θpwp. The sensitivity analysis showed that the nitrogen content of urine (cN) is associated with high uncertainty in the simulated fluxes. However, model experiments based on cN values randomized from an estimated statistical distribution indicated that this uncertainty is considerably smaller in practice. Finally, GAG_field was tested with a constant soil pH of 7.5. The variation of NH3 fluxes simulated in this way
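One pH-dependent mechanism central to such volatilization models can be sketched from textbook chemistry: the volatilizable NH3 fraction of total ammoniacal nitrogen (TAN) follows the NH4+/NH3 acid-base equilibrium (pKa about 9.25 at 25 degrees C), which is why soil pH dynamics in urine patches control the emission flux.

```python
import math

def nh3_fraction(pH, pKa=9.25):
    """Fraction of TAN present as volatile NH3 at a given pH
    (pKa of NH4+ is about 9.25 at 25 C)."""
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))

# One pH unit changes the volatilizable fraction by roughly a factor of ten
# in the sub-pKa range typical of soils.
for pH in (6.5, 7.5, 8.5, 9.25):
    print(f"pH {pH}: NH3 fraction = {nh3_fraction(pH):.4f}")
```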

  10. Discovering Reference Process Models by Mining Process Variants

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    Recently, a new generation of adaptive Process-Aware Information Systems (PAIS) has emerged, which allows for dynamic process and service changes (e.g., to insert, delete, and move activities and service executions in a running process). This, in turn, has led to a large number of process variants

  11. Multifunctional multiscale composites: Processing, modeling and characterization

    Science.gov (United States)

    Qiu, Jingjing

    Carbon nanotubes (CNTs) demonstrate extraordinary properties and show great promise in enhancing out-of-plane properties of traditional polymer/fiber composites and enabling functionality. However, current manufacturing challenges hinder the realization of their potential. In the dissertation research, both experimental and computational efforts have been conducted to investigate effective manufacturing techniques of CNT integrated multiscale composites. The fabricated composites demonstrated significant improvements in physical properties, such as tensile strength, tensile modulus, inter-laminar shear strength, thermal dimension stability and electrical conductivity. Such multiscale composites were truly multifunctional with the addition of CNTs. Furthermore, a novel hierarchical multiscale modeling method was developed in this research. Molecular dynamic (MD) simulation offered reasonable explanation of CNTs dispersion and their motion in polymer solution. Bi-mode finite-extensible-nonlinear-elastic (FENE) dumbbell simulation was used to analyze the influence of CNT length distribution on the stress tensor and shear-rate-dependent viscosity. Based on the simulated viscosity profile and empirical equations from experiments, a macroscale flow simulation model on the finite element method (FEM) method was developed and validated to predict resin flow behavior in the processing of CNT-enhanced multiscale composites. The proposed multiscale modeling method provided a comprehensive understanding of micro/nano flow in both atomistic details and mesoscale. The simulation model can be used to optimize process design and control of the mold-filling process in multiscale composite manufacturing. This research provided systematic investigations into the CNT-based multiscale composites. The results from this study may be used to leverage the benefits of CNTs and open up new application opportunities for high-performance multifunctional multiscale composites. Keywords. 
Carbon

  12. Event Modeling in UML. Unified Modeling Language and Unified Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2002-01-01

    We show how events can be modeled in terms of UML. We view events as change agents that have consequences and as information objects that represent information. We show how to create object-oriented structures that represent events in terms of attributes, associations, operations, state charts, and messages. We outline a run-time environment for the processing of events with multiple participants.

  13. Modelling and control of dynamic systems using gaussian process models

    CERN Document Server

    Kocijan, Juš

    2016-01-01

    This monograph opens up new horizons for engineers and researchers in academia and in industry dealing with or interested in new developments in the field of system identification and control. It emphasizes guidelines for working solutions and practical advice for their implementation rather than the theoretical background of Gaussian process (GP) models. The book demonstrates the potential of this recent development in probabilistic machine-learning methods and gives the reader an intuitive understanding of the topic. The current state of the art is treated along with possible future directions for research. Systems control design relies on mathematical models and these may be developed from measurement data. This process of system identification, when based on GP models, can play an integral part of control design in data-based control and its description as such is an essential aspect of the text. The background of GP regression is introduced first with system identification and incorporation of prior know...

  14. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is

  15. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    OpenAIRE

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    1999-01-01

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  16. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches and describe their differences, at the same time. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested in both business-like event logs as recorded in a higher educational enterprise resource planning system and a real case scenario involving a set of Dutch municipalities.
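The exhaustive-search strategy can be sketched as follows; the model, log, and replay check are toy stand-ins (the real strategies score candidate configurations against recorded event data in the same spirit), and all names are illustrative.

```python
from itertools import product

def derive_model(configurable_nodes, traces, can_replay):
    """Exhaustively try every keep/hide choice for the configurable nodes and
    return the configuration that replays the most observed traces.
    The search space grows as 2^(number of configurable nodes)."""
    best_config, best_score = None, -1
    for choices in product([True, False], repeat=len(configurable_nodes)):
        config = dict(zip(configurable_nodes, choices))
        score = sum(1 for t in traces if can_replay(config, t))
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

# Toy example: two optional activities, and a branch-specific event log in
# which "review" always occurs and "archive" never does.
nodes = ["review", "archive"]
log = [["start", "review", "end"], ["start", "review", "end"]]

def can_replay(config, trace):
    """Crude stand-in for replay: the trace must contain exactly the
    mandatory activities plus the kept optional ones."""
    required = {"start", "end"} | {n for n, kept in config.items() if kept}
    return set(trace) == required

config, score = derive_model(nodes, log, can_replay)
print(config, score)
```

The exponential loop makes the article's point directly: exhaustive search is only feasible for few configurable nodes, which is what motivates the genetic-algorithm and greedy variants.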

  17. Modeling and optimization of wet sizing process

    International Nuclear Information System (INIS)

    Thai Ba Cau; Vu Thanh Quang and Nguyen Ba Tien

    2004-01-01

    Mathematical simulation based on Stokes' law has been carried out for the wet sizing process in cylindrical equipment at laboratory and semi-industrial scale. The model consists of mathematical equations describing relations between variables, such as: - the residence time distribution function of emulsion particles in the separating zone of the equipment, depending on flow rate, height, diameter and structure of the equipment; - the size distribution function in the fine and coarse fractions, depending on the residence time distribution function of emulsion particles, the characteristics of the material being processed (such as specific density and shape), and the characteristics of the classification medium (such as specific density and viscosity). An experimental model was developed from data collected on an experimental cylindrical apparatus with a sedimentation chamber of diameter x height equal to 50 x 40 cm for an emulsion of zirconium silicate in water. Using this experimental model allows determination of the optimal flow rate in order to obtain product with the desired grain size, in terms of average size or size distribution function. (author)
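The Stokes'-law relation underlying such wet sizing can be sketched directly: a particle settles at terminal velocity v = g d^2 (rho_p - rho_f) / (18 mu), and inverting this gives the cut size at a given upward flow velocity. The material properties below are illustrative values for zircon in water, not the paper's calibrated model.

```python
import math

def stokes_velocity(d, rho_p=4600.0, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a sphere of diameter d (m)
    in the Stokes (creeping-flow) regime."""
    return g * d ** 2 * (rho_p - rho_f) / (18.0 * mu)

def cut_diameter(v, rho_p=4600.0, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Invert Stokes' law: the diameter whose settling velocity equals the
    upward flow velocity v -- finer particles report to the overflow."""
    return math.sqrt(18.0 * mu * v / (g * (rho_p - rho_f)))

d = 10e-6                           # a 10 micron particle
v = stokes_velocity(d)
print(f"settling velocity of {d*1e6:.0f} um particle: {v*1e3:.3f} mm/s")
print(f"cut size at that upflow velocity: {cut_diameter(v)*1e6:.1f} um")
```

Choosing the flow rate through the sedimentation chamber sets this upflow velocity, and hence the grain-size split between the fine and coarse fractions.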

  18. Three-dimensional model for fusion processes

    International Nuclear Information System (INIS)

    Olson, A.P.

    1984-01-01

    Active galactic nuclei (AGN) emit unusual spectra of radiation which are interpreted to signify extreme distance, extreme power, or both. The status of AGNs was recently reviewed by Balick and Heckman. It seems that the greatest conceptual difficulty with understanding AGNs is how to form a coherent phenomenological model of their properties. What drives the galactic engine? What and where are the mass-flows of fuel to this engine? Is there more than one engine? Do the engines have any symmetry properties? Is observed radiation isotropically emitted from the source? If it is polarized, what causes the polarization? Why is there a roughly spherical cloud of ionized gas about the center of our own galaxy, the Milky Way? The purpose of this paper is to discuss a new model, based on fusion processes which are not axisymmetric, uniform, isotropic, or even time-invariant. The relationship to these questions will then be developed. A unified model of fusion processes applicable to many astronomical phenomena is proposed and discussed.

  19. Process-Based Modeling of Constructed Wetlands

    Science.gov (United States)

    Baechler, S.; Brovelli, A.; Rossi, L.; Barry, D. A.

    2007-12-01

    Constructed wetlands (CWs) are widespread facilities for wastewater treatment. In subsurface flow wetlands, contaminated wastewater flows through a porous matrix, where oxidation and detoxification phenomena occur. Despite the large number of working CWs, system design and optimization are still mainly based upon empirical equations or simplified first-order kinetics. This results from an incomplete understanding of the system functioning, and may in turn hinder the performance and effectiveness of the treatment process. As a result, CWs are often considered not suitable to meet high water-quality standards, or to treat water contaminated with recalcitrant anthropogenic contaminants. To date, only a limited number of detailed numerical models have been developed and successfully applied to simulate constructed wetland behavior. Among these, one of the most complete and powerful is CW2D, which is based on Hydrus2D. The aim of this work is to develop a comprehensive simulator tailored to model the functioning of horizontal flow constructed wetlands and in turn provide a reliable design and optimization tool. The model is based upon PHWAT, a general reactive transport code for saturated flow. PHWAT couples MODFLOW, MT3DMS and PHREEQC-2 using an operator-splitting approach. The use of PHREEQC to simulate reactions allows great flexibility in simulating biogeochemical processes. The biogeochemical reaction network is similar to that of CW2D, and is based on the Activated Sludge Model (ASM). Kinetic oxidation of carbon sources and nutrient transformations (primarily nitrogen and phosphorus) are modeled via Monod-type kinetic equations. Oxygen dissolution is accounted for via a first-order mass-transfer equation. While the ASM model only includes a limited number of kinetic equations, the new simulator permits incorporation of an unlimited number of both kinetic and equilibrium reactions. Changes in pH, redox potential and surface reactions can be easily incorporated.
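The Monod-type kinetics and first-order oxygen transfer mentioned above can be sketched as a toy simulation; the rate constants are illustrative ASM-style values, not the simulator's calibrated parameters.

```python
def monod_rate(mu_max, S, K_s, O2, K_O2):
    """Dual-substrate Monod rate: growth limited by both carbon and oxygen."""
    return mu_max * (S / (K_s + S)) * (O2 / (K_O2 + O2))

def step(S, O2, X, dt=0.01, mu_max=6.0, K_s=4.0, K_O2=0.2, Y=0.63,
         k_La=5.0, O2_sat=9.1):
    """One explicit Euler step for substrate S, dissolved oxygen O2 and
    biomass X (all mg/L); oxygen dissolves via first-order mass transfer."""
    mu = monod_rate(mu_max, S, K_s, O2, K_O2)
    dS = -(mu / Y) * X                               # substrate oxidation
    dO2 = k_La * (O2_sat - O2) - ((1 - Y) / Y) * mu * X  # aeration - uptake
    dX = mu * X                                       # biomass growth
    return S + dt * dS, O2 + dt * dO2, X + dt * dX

S, O2, X = 100.0, 9.1, 50.0
for _ in range(200):                 # march 2 (arbitrary) time units
    S, O2, X = step(S, O2, X)
    S, O2 = max(S, 0.0), max(O2, 0.0)    # crude positivity guard
print(f"S = {S:.2f}, O2 = {O2:.2f}, X = {X:.2f}")
```

Even this toy shows the characteristic coupling: rapid substrate oxidation draws dissolved oxygen down until mass transfer becomes the limiting step, which is exactly the interplay a first-order design equation cannot capture.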

  20. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it, need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments......-based framework is that in the design, development and/or manufacturing of a chemical product-process, the knowledge of the applied phenomena together with the product-process design details can be provided with diverse degrees of abstractions and details. This would allow the experimental resources...... to be employed for validation and fine-tuning of the solutions from the model-based framework, thereby, removing the need for trial and error experimental steps. Also, questions related to economic feasibility, operability and sustainability, among others, can be considered in the early stages of design. However...

  1. Model systems for life processes on Mars

    Science.gov (United States)

    Mitz, M. A.

    1974-01-01

    In the evolution of life forms nonphotosynthetic mechanisms are developed. The question remains whether a total life system could evolve which is not dependent upon photosynthesis. In trying to visualize life on other planets, the photosynthetic process has problems. On Mars, the high intensity of light at the surface is a concern and alternative mechanisms need to be defined and analyzed. In the search for alternate mechanisms, several different areas may be identified. These involve activated inorganic compounds in the atmosphere, such as the products of photodissociation of carbon dioxide and the organic material which may be created by natural phenomena. In addition, a life system based on the pressure of the atmospheric constituents, such as carbon dioxide, is a possibility. These considerations may be important for the understanding of evolutionary processes of life on another planet. Model systems which depend on these alternative mechanisms are defined and related to presently planned and future planetary missions.

  2. Comments on: Spatiotemporal models for skewed processes

    KAUST Repository

    Genton, Marc G.; Hering, Amanda S.

    2017-01-01

    We would first like to thank the authors for this paper that highlights the important problem of building models for non-Gaussian space-time processes. We will hereafter refer to the paper as SGV, and we also would like to acknowledge and thank them for providing us with the temporally detrended temperatures, plotted in their Figure 1, along with the coordinates of the twenty-one locations and the posterior means of the parameters for the MA1 model. We find much of interest to discuss in this paper, and as we progress through points of interest, we pose some questions to the authors that we hope they will be able to address.

  3. Comments on: Spatiotemporal models for skewed processes

    KAUST Repository

    Genton, Marc G.

    2017-09-04

    We would first like to thank the authors for this paper that highlights the important problem of building models for non-Gaussian space-time processes. We will hereafter refer to the paper as SGV, and we also would like to acknowledge and thank them for providing us with the temporally detrended temperatures, plotted in their Figure 1, along with the coordinates of the twenty-one locations and the posterior means of the parameters for the MA1 model. We find much of interest to discuss in this paper, and as we progress through points of interest, we pose some questions to the authors that we hope they will be able to address.

  4. Specification of e-business process model for PayPal online payment process using Reo

    OpenAIRE

    Xie, M.

    2005-01-01

    E-business process modeling allows business analysts to better understand and analyze the business processes, and eventually to use software systems to automate (parts of) these business processes to achieve higher profit. To support e-business process modeling, many business process modeling languages have been used as tools. However, many existing business process modeling languages lack (a) formal semantics, (b) formal computational model, and (c) an integrated view of the busi...

  5. Modeling Aspects Of Activated Sludge Processes Part I: Process Modeling Of Activated Sludge Facilitation And Sedimentation

    International Nuclear Information System (INIS)

    Ibrahim, H. I.; EI-Ahwany, A.H.; Ibrahim, G.

    2004-01-01

    Process modeling of activated sludge flocculation and sedimentation is reviewed, considering activated sludge floc characteristics such as morphology, viable to non-viable cell ratio, density, and water content. Bioflocculation and its kinetics were studied, considering the characteristics of bioflocculation and the theory of Divalent Cation Bridging, which describes the major role of cations in bioflocculation. Activated sludge flocculation process modeling was studied considering mass transfer limitations, from Clifft and Andrew, 1981, and Benefild and Molz, 1983, through Henze, 1987, to Tyagi, 1996 and G. Ibrahim et al., 2002. Models of aggregation and breakage of flocs were studied by Spicer and Pratsinis, 1996, and Biggs, 2002. The size distribution of flocs influences mass transfer and biomass separation in the activated sludge process. Therefore, it is of primary importance to establish the role of specific process operation factors, such as sludge loading, dynamic sludge age, and dissolved oxygen, on this distribution, with special emphasis on the formation of primary particles

  6. Text Simplification Using Consumer Health Vocabulary to Generate Patient-Centered Radiology Reporting: Translation and Evaluation.

    Science.gov (United States)

    Qenam, Basel; Kim, Tae Youn; Carroll, Mark J; Hogarth, Michael

    2017-12-18

    Radiology reporting is a clinically oriented form of documentation that reflects critical information for patients about their health care processes. Realizing its importance, many medical institutions have started providing radiology reports in patient portals. The gain, however, can be limited because of medical language barriers, which require a way for customizing these reports for patients. The open-access, collaborative consumer health vocabulary (CHV) is a terminology system created for such purposes and can be the basis of lexical simplification processes for clinical notes. The aim of this study was to examine the comprehensibility and suitability of CHV in simplifying radiology reports for consumers. This was done by characterizing the content coverage and the lexical similarity between the terms in the reports and the CHV-preferred terms. The overall procedure was divided into the following two main stages: (1) translation and (2) evaluation. The translation process involved using MetaMap to link terms in the reports to CHV concepts. This is followed by replacing the terms with CHV-preferred terms using the concept names and sources table (MRCONSO) in the Unified Medical Language System (UMLS) Metathesaurus. In the second stage, medical terms in the reports and general terms that are used to describe medical phenomena were selected and evaluated by comparing the words in the original reports with the translated ones. The evaluation includes measuring the content coverage, investigating lexical similarity, and finding trends in missing concepts. Of the 792 terms selected from the radiology reports, 695 of them could be mapped directly to CHV concepts, indicating a content coverage of 88.5%. A total of 51 of the concepts (53%, 51/97) that could not be mapped are names of human anatomical structures and regions, followed by 28 anatomical descriptions and pathological variations (29%, 28/97). 
In addition, 12 radiology techniques and projections represented
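The mapping-and-replacement step described above can be sketched as follows; the CHV lookup table and the report terms here are invented stand-ins, not MetaMap output or actual study data:

```python
# Sketch of the translation/evaluation idea: map report terms to
# CHV-preferred terms and measure content coverage (mapped / total).
# The chv_map dictionary and term list are hypothetical examples.

def content_coverage(report_terms, chv_map):
    """Return the fraction of report terms mapping to a CHV concept."""
    mapped = [t for t in report_terms if t.lower() in chv_map]
    return len(mapped) / len(report_terms), mapped

chv_map = {  # toy stand-in for a MetaMap -> CHV-preferred-term lookup
    "pneumothorax": "collapsed lung",
    "edema": "swelling",
    "fracture": "broken bone",
}

terms = ["Pneumothorax", "edema", "glenoid", "fracture"]
coverage, mapped = content_coverage(terms, chv_map)
print(f"coverage = {coverage:.1%}")  # 3 of 4 terms map -> 75.0%
simplified = [chv_map.get(t.lower(), t) for t in terms]
print(simplified)
```

Unmapped terms (here "glenoid", an anatomical name, mirroring the paper's finding) are left unchanged in the simplified output.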

  7. Extraction and Simplification of Building Façade Pieces from Mobile Laser Scanner Point Clouds for 3D Street View Services

    Directory of Open Access Journals (Sweden)

    Yan Li

    2016-12-01

    Full Text Available Extraction and analysis of building façades are key processes in three-dimensional (3D) building reconstruction and realistic geometrical modeling of the urban environment, with many applications such as smart city management, autonomous navigation through the urban environment, fly-through rendering, 3D street view, virtual tourism, and urban mission planning. This paper proposes a building façade piece extraction and simplification algorithm based on morphological filtering with point clouds obtained by a mobile laser scanner (MLS). First, this study presents a point cloud projection algorithm, using high-accuracy orientation parameters from the position and orientation system (POS) of the MLS, that can convert large volumes of point cloud data to a raster image. Second, this study proposes a feature extraction approach based on morphological filtering with point cloud projection that can obtain building façade features in image space. Third, this study designs an inverse transformation of the point cloud projection to convert building façade features from image space to 3D space. A building façade feature extraction algorithm with restricted façade plane detection is implemented to reconstruct façade pieces for street view services. The results of building façade extraction experiments with large volumes of point cloud data from MLS show that the proposed approach is suitable for various types of building façade extraction. The geometric accuracy of the building façades is 0.66 m in the x direction, 0.64 m in the y direction, and 0.55 m in the vertical direction, which is the same level as the spatial resolution (0.5 m) of the point cloud.
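The first step, projecting the point cloud onto a raster, can be illustrated with a minimal sketch (the cell size and toy points are assumptions; the paper's projection additionally uses the POS orientation parameters, which are omitted here):

```python
# Minimal sketch of rasterizing a 3-D point cloud: bin x-y coordinates
# into grid cells and keep the maximum height per cell. A crude stand-in
# for the POS-based projection described in the paper.

import numpy as np

def project_to_raster(points, cell=0.5):
    """Rasterize x-y coordinates; each cell stores the max height (z)."""
    xy = points[:, :2]
    mins = xy.min(axis=0)
    idx = np.floor((xy - mins) / cell).astype(int)
    shape = tuple(idx.max(axis=0) + 1)
    raster = np.full(shape, np.nan)
    for (i, j), z in zip(idx, points[:, 2]):
        if np.isnan(raster[i, j]) or z > raster[i, j]:
            raster[i, j] = z
    return raster

pts = np.array([[0.1, 0.1, 1.0], [0.2, 0.3, 5.0], [1.4, 0.1, 2.0]])
r = project_to_raster(pts, cell=0.5)
print(r.shape)  # 3 cells along x, 1 along y
```

Cells never hit by a point stay NaN; a real pipeline would next apply morphological filtering to this image.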

  8. Simplification and optimisation of treatment/conditioning processes for radioactive waste at Indian Peshawar

    International Nuclear Information System (INIS)

    Sharma, P.D.; Chopra, S.K.

    2001-01-01

    Past experience of more than 160 reactor-years of operation and of management of the waste generated has established that the methods followed and the controls adopted are adequate, though improvements in design with respect to techno-economic considerations are under development

  9. Modelling chemical behavior of water reactor fuel

    Energy Technology Data Exchange (ETDEWEB)

    Ball, R G.J.; Hanshaw, J; Mason, P K; Mignanelli, M A [AEA Technology, Harwell (United Kingdom)

    1997-08-01

    For many applications, large computer codes have been developed which use correlations, simplifications, and approximations in order to describe the complex situations which may occur during the operation of nuclear power plant or during fault scenarios. However, it is important to have a firm physical basis for the simplifications and approximations in such codes and, therefore, there has been an emphasis on modelling the behaviour of materials and processes on a more detailed or fundamental basis. The application of fundamental modelling techniques to simulate various chemical phenomena in thermal reactor fuel systems is described in this paper. These methods include thermochemical modelling, kinetic and mass transfer modelling, and atomistic simulation, and examples of each approach are presented. In each of these applications a summary of the methods is given, together with the assessment process adopted to provide the fundamental parameters which form the basis of the calculation. (author). 25 refs, 9 figs, 2 tabs.

  10. An Analysis of Simplification Strategies in a Reading Textbook of Japanese as a Foreign Language

    Directory of Open Access Journals (Sweden)

    Kristina HMELJAK SANGAWA

    2016-06-01

    Full Text Available Reading is one of the bases of second language learning, and it can be most effective when the linguistic difficulty of the text matches the reader's level of language proficiency. The present paper reviews previous research on the readability and simplification of Japanese texts, and presents an analysis of a collection of simplified texts for learners of Japanese as a foreign language. The simplified texts are compared to their original versions to uncover different strategies used to make the texts more accessible to learners. The list of strategies thus obtained can serve as useful guidelines for assessing, selecting, and devising texts for learners of Japanese as a foreign language.

  11. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    Full Text Available The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST, a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  12. GRAPHICAL MODELS OF THE AIRCRAFT MAINTENANCE PROCESS

    Directory of Open Access Journals (Sweden)

    Stanislav Vladimirovich Daletskiy

    2017-01-01

    Full Text Available Aircraft maintenance is realized by a rapid sequence of organizational and technical maintenance states, whose research and analysis are carried out by statistical methods. The maintenance process comprises aircraft technical states, connected with the objective patterns of change in the technical qualities of the aircraft as a maintenance object, and organizational states, which determine the subjective organization and planning of aircraft use. The objective maintenance process is realized in the Maintenance and Repair System, which does not include maintenance organization and planning and is a set of related elements: aircraft, Maintenance and Repair measures, executors, and documentation that sets the rules of their interaction for maintaining aircraft reliability and readiness for flight. The aircraft organizational and technical states are considered, and their characteristics and heuristic estimates of the connections in the knots and arcs of the graphs, and of aircraft organizational states during regular maintenance and at technical state failure, are given. It is shown that in real conditions of aircraft maintenance, planned control of the aircraft technical state, and maintenance control through it, is defined only by Maintenance and Repair conditions for a given Maintenance and Repair type and form structure, and correspondingly by the principles of assigning Maintenance and Repair work types for execution, through the maintenance and reconstruction strategies of the aircraft and all its units. The realization of the planned Maintenance and Repair process determines one of the constant maintenance components. The proposed graphical models allow revealing quantitative correlations between graph knots to improve maintenance processes by statistical research methods, which reduces manning, timetables, and expenses for providing safe civil aviation aircraft maintenance.

  13. Software Process Validation: Quantitatively Measuring the Correspondence of a Process to a Model

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1997-01-01

    .... When process models and process executions diverge, something significant is happening. The authors have developed techniques for uncovering and measuring the discrepancies between models and executions, which they call process validation...
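As a toy illustration of measuring model/execution discrepancy, one simple choice (not necessarily the authors' actual metric) is an edit distance between the model's expected event sequence and a recorded execution trace:

```python
# Toy "process validation" sketch: quantify how far a recorded execution
# diverges from a model's event sequence using Levenshtein edit distance.
# The event names are invented examples.

def edit_distance(a, b):
    """Classic Levenshtein distance between two event sequences."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

model = ["checkout", "review", "test", "merge"]
execution = ["checkout", "test", "review", "test", "merge"]
dist = edit_distance(model, execution)
print(dist)  # 1: the execution has one extra "test" event
```

A distance of zero means the execution conforms exactly; larger values flag divergence worth investigating.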

  14. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    2005-01-01

    Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties and point process operations such as thinning, displacements, and superpositioning. We also discuss how to simulate specific Cox processes.
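A log Gaussian Cox process, the first class mentioned, can be simulated on a coarse grid by exponentiating a latent Gaussian field to get a random intensity and then drawing Poisson counts; the kernel and parameters below are illustrative choices, not ones from the paper:

```python
# Sketch: simulating a log Gaussian Cox process on a 1-D grid.
# Intensity lambda(x) = exp(GP(x)); counts are Poisson given lambda.
# Kernel form, parameters, and seed are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def simulate_lgcp(n_cells=100, length=10.0, sigma2=1.0, scale=1.0):
    x = np.linspace(0, length, n_cells)
    # exponential covariance kernel for the latent Gaussian field
    cov = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / scale)
    gp = rng.multivariate_normal(np.zeros(n_cells), cov)
    lam = np.exp(gp)                  # random (doubly stochastic) intensity
    dx = length / n_cells
    counts = rng.poisson(lam * dx)    # Poisson counts given the intensity
    return x, lam, counts

x, lam, counts = simulate_lgcp()
print(counts.sum(), "points over", len(x), "cells")
```

Conditional on the realized field, this is an inhomogeneous Poisson process, which is exactly the Cox construction.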

  15. Heat source model for welding process

    International Nuclear Information System (INIS)

    Doan, D.D.

    2006-10-01

    One of the major industrial stakes of welding simulation relates to the control of the mechanical effects of the process (residual stress, distortions, fatigue strength...). These effects are directly dependent on the temperature evolutions imposed during the welding process. To model this thermal loading, an original method is proposed instead of the usual methods such as the equivalent heat source approach or the multi-physical approach. This method is based on the estimation of the weld pool shape, together with the heat flux crossing the liquid/solid interface, from experimental data measured in the solid part. Its originality consists in solving an inverse Stefan problem specific to the welding process, and it is shown how to estimate the parameters of the weld pool shape. To solve the heat transfer problem, the liquid/solid interface is modeled by a Bezier curve (2-D) or a Bezier surface (3-D). This approach is well adapted to the wide diversity of weld pool shapes met in the majority of current welding processes (TIG, MIG-MAG, laser, FE, hybrid). The number of parameters to be estimated is small, from 2 to 5 in 2-D and from 7 to 16 in 3-D according to the cases considered. A sensitivity study leads to specifying the location of the sensors, their number, and the set of measurements required for a good estimate. The application of the method to test results of TIG welding on thin stainless steel sheets, in fully and partially penetrating configurations, shows that only one measurement point is enough to estimate the various weld pool shapes in 2-D, and two points in 3-D, whether the penetration is full or not. In the last part of the work, a methodology is developed for the transient analysis. It is based on Duvaut's transformation, which overcomes the discontinuity at the liquid metal interface and therefore gives a continuous variable over the whole spatial domain. Moreover, it allows working on a fixed mesh grid, and the new inverse problem is equivalent to identifying a source
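Evaluating the Bezier parameterization of the interface is straightforward with de Casteljau's algorithm; the control points below are illustrative, not values estimated from welding data:

```python
# De Casteljau evaluation of a Bezier curve in 2-D, the kind of
# parameterization the thesis uses for the liquid/solid interface.
# Control points are invented for illustration.

def bezier(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1]."""
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:  # repeated linear interpolation between neighbours
        pts = [
            ((1 - t) * p[0] + t * q[0], (1 - t) * p[1] + t * q[1])
            for p, q in zip(pts, pts[1:])
        ]
    return pts[0]

# three control points -> quadratic curve standing in for a pool boundary
ctrl = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]
print(bezier(ctrl, 0.5))  # apex of the symmetric arc: (1.0, 1.0)
```

With 3 to 6 control points, a curve like this carries the "2 to 5 parameters in 2-D" the estimation works with, once symmetry and endpoint constraints are imposed.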

  16. Mechanical-mathematical modeling for landslide process

    Science.gov (United States)

    Svalova, V.

    2009-04-01

    500 m and displacement of a landslide in plan of over 1 m. The last serious activization of the landslide took place in 2002, with a motion of 53 cm. Catastrophic activization of the deep blockglide landslide in the area of Khoroshevo in Moscow took place in 2006-2007. A crack 330 m long appeared in the old sliding circus, along which a new 220 m long creeping block was separated from the plateau and began sinking, with displacement of the plateau surface reaching 12 m. Such activization of the landslide process had not been observed in Moscow since the mid XIX century. The sliding area of Khoroshevo was stable for a long time without manifestations of activity. Revealing the reasons for the deformation and developing ways of protection from deep landslide motions is an extremely actual and difficult problem, whose solution is necessary for the preservation of valuable historical monuments and modern city constructions. The reasons for activization and protective measures are discussed. The structure of a monitoring system for urban territories is elaborated. A mechanical-mathematical model of a highly viscous fluid was used for modeling the behavior of matter on landslide slopes. The equation of continuity and an approximated Navier-Stokes equation for slow motions in a thin layer were used. The results of modelling make it possible to define the place of highest velocity on the landslide surface, which could be the best position for a monitoring post. The model can be used for calibration of monitoring equipment and makes it possible to investigate some fundamental aspects of matter movement on a landslide slope.

  17. R-process nucleosynthesis: a dynamical model

    Energy Technology Data Exchange (ETDEWEB)

    Hillebrandt, W; Takahashi, K [Technische Hochschule Darmstadt (Germany, F.R.). Inst. fuer Kernphysik; Kodama, T [Centro Brasileiro de Pesquisas Fisicas, Rio de Janeiro

    1976-10-01

    The synthesis of heavy and neutron-rich elements (with the mass number A > approximately 70) is reconsidered in the framework of a dynamical supernova model. The synthesis equation for the rapid neutron-capture (or r-) process and the hydrodynamical equations for the supernova explosion are solved simultaneously. Improved systematics of nuclear parameters are used, and the energy release due to β-decays as well as the energy loss due to neutrinos is taken into account. It is shown that the observed solar-system abundance curve can be reproduced fairly well by assuming only one supernova event on a time-scale of the order of 1 s. However, there are still some discrepancies which may be explained by uncertainties in the nuclear data used.
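The flavour of the synthesis equations can be conveyed by a toy two-nuclide beta-decay chain integrated with explicit Euler; the rates and step size are invented, and a real r-process network couples thousands of nuclides to the hydrodynamics:

```python
# Toy fragment of a nucleosynthesis network: a two-species decay chain
#   dN1/dt = -l1*N1,   dN2/dt = l1*N1 - l2*N2,
# integrated with explicit Euler. Rates l1, l2 and the step are invented.

def decay_chain(n1, n2, l1, l2, dt, steps):
    for _ in range(steps):
        d1 = -l1 * n1            # species 1 decays away
        d2 = l1 * n1 - l2 * n2   # species 2 is fed by 1 and decays itself
        n1 += d1 * dt
        n2 += d2 * dt
    return n1, n2

# integrate to t = 1 with dt = 1e-3
n1, n2 = decay_chain(1.0, 0.0, l1=1.0, l2=0.5, dt=1e-3, steps=1000)
print(round(n1, 3), round(n2, 3))  # n1 ~ exp(-1) ~ 0.368
```

In the paper's setting, terms of this kind also feed an energy-release source back into the hydrodynamical equations, which is why the two systems must be solved simultaneously.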

  18. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excess inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported with Discrete Event Simulation (DES). In that direction, the basics of the KANBAN system are first presented, with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is presented. The paper concludes with a practical example which shows that, through understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created which can serve as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.
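A minimal discrete-event sketch of a one-stage kanban loop, in the spirit of the DES support described, might look as follows; the card count and timing parameters are invented for illustration:

```python
# Toy DES of a single-stage kanban loop: the station may only start a job
# when a kanban card is free; serving a demand from inventory frees a card.
# All parameters (cards, times, horizon) are illustrative assumptions.

import heapq

def simulate(n_cards=3, produce_time=1.0, demand_interval=1.5, horizon=20.0):
    free_cards, inventory, served, lost = n_cards, 0, 0, 0
    events = [(0.0, "demand")]  # (time, kind) min-heap
    t = 0.0
    while events and t <= horizon:
        t, kind = heapq.heappop(events)
        if kind == "demand":
            if inventory > 0:
                inventory -= 1
                served += 1
                free_cards += 1          # card travels back upstream
            else:
                lost += 1                # stockout: demand lost
            heapq.heappush(events, (t + demand_interval, "demand"))
        elif kind == "done":
            inventory += 1               # finished item joins the buffer
        if free_cards > 0:               # a free card authorizes production
            free_cards -= 1
            heapq.heappush(events, (t + produce_time, "done"))
    return served, lost, inventory

served, lost, inventory = simulate()
print(served, lost, inventory)
```

The card count caps work-in-process plus inventory, which is the kanban mechanism for limiting excess stock; experiments would vary `n_cards` and the timing parameters.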

  19. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    Full Text Available In general, any activity requires a longer action, often characterized by a degree of uncertainty and insecurity in terms of the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the different variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods while covering all the issues of real economic management decision analysis. Often in such cases, it is considered that the simulation technique is the only alternative available. Using simulation techniques to study real-world systems often requires laborious work. Running a simulation experiment is a process that takes place in several stages.

  20. Modelling and application of stochastic processes

    CERN Document Server

    1986-01-01

    The subject of modelling and application of stochastic processes is too vast to be exhausted in a single volume. In this book, attention is focused on a small subset of this vast subject. The primary emphasis is on realization and approximation of stochastic systems. Recently there has been considerable interest in the stochastic realization problem, and hence, an attempt has been made here to collect in one place some of the more recent approaches and algorithms for solving the stochastic realization problem. Various different approaches for realizing linear minimum-phase systems, linear nonminimum-phase systems, and bilinear systems are presented. These approaches range from time-domain methods to spectral-domain methods. An overview of the chapter contents briefly describes these approaches. Also, in most of these chapters special attention is given to the problem of developing numerically efficient algorithms for obtaining reduced-order (approximate) stochastic realizations. On the application side,...

  1. Elliptic Determinantal Processes and Elliptic Dyson Models

    Science.gov (United States)

    Katori, Makoto

    2017-10-01

    We introduce seven families of stochastic systems of interacting particles in one dimension corresponding to the seven families of irreducible reduced affine root systems. We prove that they are determinantal in the sense that all spatio-temporal correlation functions are given by determinants controlled by a single function called the spatio-temporal correlation kernel. For the four families A_{N-1}, B_N, C_N, and D_N, we identify the systems of stochastic differential equations solved by these determinantal processes, which will be regarded as the elliptic extensions of the Dyson model. Here we use the notion of martingales in probability theory and the elliptic determinant evaluations of the Macdonald denominators of irreducible reduced affine root systems given by Rosengren and Schlosser.

  2. Numerical approaches to expansion process modeling

    Directory of Open Access Journals (Sweden)

    G. V. Alekseev

    2017-01-01

    Full Text Available Forage production is currently undergoing a period of intensive renovation and introduction of the most advanced technologies and equipment. Methods such as barley toasting, grain extrusion, steaming and flattening of grain, fluidized-bed explosion, infrared treatment of cereals and legumes followed by flattening, and one-time or two-time granulation of purified whole grain without humidification in matrix presses with grinding of the granules are used more and more often. These methods require special apparatuses, machines, and auxiliary equipment, created on the basis of mathematical models compiled by different methods. In roasting, simulating the heat fields arising in the working chamber provides conditions for the decomposition of a portion of the starch to monosaccharides, which makes the grain sweetish, although due to protein denaturation the digestibility of the protein and the availability of amino acids decrease somewhat. Grain is roasted mainly for young animals in order to teach them to eat feed at an early age, to stimulate the secretory activity of digestion, and to better develop the masticatory muscles. In addition, the high temperature is detrimental to bacterial contamination and various types of fungi, which largely avoids possible diseases of the gastrointestinal tract. This method has found wide application directly on farms. Legumes such as peas, soy, lupine, and lentils are also used in feeding animals. These feeds are preliminarily ground and then cooked for 1 hour or steamed for 30-40 minutes in the feed mill. Such processing allows inactivating the anti-nutrients in them, which reduce the effectiveness of their use. After processing, legumes are used as protein supplements in an amount of 25-30% of the total nutritional value of the diet. But it is recommended to cook and steam only grain of good quality. A poor-quality grain that has been stored for a long time and damaged by pathogenic microflora is subject to

  3. Integrated modelling of near field and engineered barrier system processes

    International Nuclear Information System (INIS)

    Lamont, A.; Gansemer, J.

    1994-01-01

    The Yucca Mountain Integrating Model (YMIM), an integrated model of the Engineered Barrier System, has been developed to assist project managers at LLNL in identifying areas where research emphasis should be placed. The model was designed to be highly modular so that a model of an individual process could be easily modified or replaced without interfering with the models of other processes. The modules modelling container failure and the dissolution of nuclides include particularly detailed, temperature-dependent models of their corresponding processes

  4. Behavioral conformance of artifact-centric process models

    NARCIS (Netherlands)

    Fahland, D.; Leoni, de M.; Dongen, van B.F.; Aalst, van der W.M.P.; Abramowicz, W.

    2011-01-01

    The use of process models in business information systems for analysis, execution, and improvement of processes assumes that the models describe reality. Conformance checking is a technique to validate how well a given process model describes recorded executions of the actual process. Recently,

  5. On the correlation between process model metrics and errors

    NARCIS (Netherlands)

    Mendling, J.; Neumann, G.; Aalst, van der W.M.P.; Grundy, J.; Hartmann, S.; Laender, S.; Maciaszek, L.; Roddick, J.F.

    2007-01-01

    Business process models play an important role for the management, design, and improvement of process organizations and process-aware information systems. Despite the extensive application of process modeling in practice, there are hardly any empirical results available on quality aspects of process

  6. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models enabling their use for business process simulation.

  7. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2004-01-01

    This paper presents a process-group-contribution method to model, simulate, and synthesize a flowsheet. The process-group based representation of a flowsheet, together with a process "property" model, is presented. The process-group based synthesis method is developed on the basis of computer-aided molecular design methods and gives the ability to screen numerous process alternatives without the need to use rigorous process simulation models. The process "property" model calculates the design targets for the generated flowsheet alternatives, while a reverse modelling method (also developed) determines the design variables matching the target. A simple illustrative example highlighting the main features of the methodology is also presented.
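The additive group-contribution idea underlying the method can be shown in miniature; the numbers below loosely follow Joback-style boiling-point contributions but are used here purely as illustrative assumptions, not as the paper's process-group values:

```python
# Group-contribution estimation in miniature: a property is estimated as a
# base value plus a sum of contributions from structural groups, the same
# additive idea the process-group method lifts from molecules to flowsheets.
# The numeric values are illustrative assumptions.

def estimate_property(groups, contributions, base=0.0):
    """Sum occurrence-weighted group contributions onto a base value."""
    return base + sum(n * contributions[g] for g, n in groups.items())

contrib = {"CH3": 23.58, "CH2": 22.88, "OH": 92.88}  # assumed values
# toy molecule CH3-CH2-OH, with an assumed base constant of 198.0
tb = estimate_property({"CH3": 1, "CH2": 1, "OH": 1}, contrib, base=198.0)
print(round(tb, 2))  # 198.0 + 23.58 + 22.88 + 92.88 = 337.34
```

In the flowsheet setting, "groups" become process-groups (reaction, separation, etc.) and the summed "property" becomes a design target, which is what makes fast screening of alternatives possible.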

  8. a Geometric Processing Workflow for Transforming Reality-Based 3d Models in Volumetric Meshes Suitable for Fea

    Science.gov (United States)

    Gonizzi Barsanti, S.; Guidi, G.

    2017-02-01

    Conservation of Cultural Heritage is a key issue and structural changes and damages can influence the mechanical behaviour of artefacts and buildings. The use of Finite Elements Methods (FEM) for mechanical analysis is largely used in modelling stress behaviour. The typical workflow involves the use of CAD 3D models made by Non-Uniform Rational B-splines (NURBS) surfaces, representing the ideal shape of the object to be simulated. Nowadays, 3D documentation of CH has been widely developed through reality-based approaches, but the models are not suitable for a direct use in FEA: the mesh has in fact to be converted to volumetric, and the density has to be reduced since the computational complexity of a FEA grows exponentially with the number of nodes. The focus of this paper is to present a new method aiming at generate the most accurate 3D representation of a real artefact from highly accurate 3D digital models derived from reality-based techniques, maintaining the accuracy of the high-resolution polygonal models in the solid ones. The approach proposed is based on a wise use of retopology procedures and a transformation of this model to a mathematical one made by NURBS surfaces suitable for being processed by volumetric meshers typically embedded in standard FEM packages. The strong simplification with little loss of consistency possible with the retopology step is used for maintaining as much coherence as possible between the original acquired mesh and the simplified model, creating in the meantime a topology that is more favourable for the automatic NURBS conversion.

  9. Modeling nutrient in-stream processes at the watershed scale using Nutrient Spiralling metrics

    Science.gov (United States)

    Marcé, R.; Armengol, J.

    2009-07-01

    One of the fundamental problems of using large-scale biogeochemical models is the uncertainty involved in aggregating the components of fine-scale deterministic models in watershed applications, and in extrapolating the results of field-scale measurements to larger spatial scales. Although spatial or temporal lumping may reduce the problem, information obtained during fine-scale research may not apply to lumped categories. Thus, the use of knowledge gained through fine-scale studies to predict coarse-scale phenomena is not straightforward. In this study, we used the nutrient uptake metrics defined in the Nutrient Spiralling concept to formulate the equations governing total phosphorus in-stream fate in a deterministic, watershed-scale biogeochemical model. Once the model was calibrated, the fitted phosphorus retention metrics were put in the context of global patterns of phosphorus retention variability. For this purpose, we calculated power regressions between phosphorus retention metrics, streamflow, and phosphorus concentration in water, using published data from 66 streams worldwide, including both pristine and nutrient-enriched streams. Performance of the calibrated model confirmed that the Nutrient Spiralling formulation is a convenient simplification of the biogeochemical transformations involved in total phosphorus in-stream fate. Thus, this approach may be helpful even for customary deterministic applications working at short time steps. The calibrated phosphorus retention metrics were comparable to field estimates from the study watershed, and showed high coherence with global patterns of retention metrics from streams of the world. In this sense, the fitted phosphorus retention metrics were similar to field values measured in other nutrient-enriched streams.
Analysis of the bibliographical data supports the view that nutrient enriched streams have lower phosphorus retention efficiency than pristine streams, and that this efficiency loss is maintained in a wide
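In the simplest first-order case, the Nutrient Spiralling metrics referred to above reduce to an exponential downstream decline with uptake length Sw = Q/(w·vf); the discharge, width, and uptake velocity below are illustrative numbers, not fitted values from the study:

```python
# First-order sketch of Nutrient Spiralling metrics: uptake length
#   Sw = Q / (w * vf)
# and the resulting downstream decline C(x) = C0 * exp(-x / Sw).
# Q (discharge), w (width), and vf (uptake velocity) are invented values.

import math

def uptake_length(Q, width, vf):
    """Sw = specific discharge divided by uptake velocity, in metres."""
    return Q / (width * vf)

def downstream_conc(c0, x, sw):
    """Concentration x metres downstream under first-order uptake."""
    return c0 * math.exp(-x / sw)

Sw = uptake_length(Q=0.5, width=5.0, vf=1e-4)  # 0.5/(5*1e-4) = 1000 m
c = downstream_conc(100.0, 1000.0, Sw)         # one uptake length downstream
print(Sw, round(c, 1))
```

A short Sw means efficient retention; the cross-stream comparison in the paper amounts to comparing how these metrics scale with streamflow and concentration across the 66 streams.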

  10. Radiative processes in gauge theories

    International Nuclear Information System (INIS)

    Berends, F.A.; Kleiss, R.; Danckaert, D.; Causmaecker, P. De; Gastmans, R.; Troost, W.; Tai Tsun Wu

    1982-01-01

    It is shown how the introduction of explicit polarization vectors of the radiated gauge particles leads to great simplifications in the calculation of bremsstrahlung processes at high energies. (author)

  11. Specification of e-business process model for PayPal online payment process using Reo

    NARCIS (Netherlands)

    M. Xie

    2005-01-01

    E-business process modeling allows business analysts to better understand and analyze the business processes, and eventually to use software systems to automate (parts of) these business processes to achieve higher profit. To support e-business process modeling, many business process

  12. Computer Forensics Field Triage Process Model

    Directory of Open Access Journals (Sweden)

    Marcus K. Rogers

    2006-06-01

    Full Text Available With the proliferation of digital-based evidence, the need for the timely identification, analysis and interpretation of digital evidence is becoming more crucial. In many investigations critical information is required while at the scene or within a short period of time - measured in hours as opposed to days. The traditional cyber forensics approach of seizing a system(s)/media, transporting it to the lab, making a forensic image(s), and then searching the entire system for potential evidence, is no longer appropriate in some circumstances. In cases such as child abductions, pedophile investigations, and missing or exploited persons, time is of the essence. In these types of cases, investigators dealing with the suspect or crime scene need investigative leads quickly; in some cases it is the difference between life and death for the victim(s). The Cyber Forensic Field Triage Process Model (CFFTPM) proposes an onsite or field approach for providing the identification, analysis and interpretation of digital evidence in a short time frame, without the requirement of having to take the system(s)/media back to the lab for an in-depth examination or acquiring a complete forensic image(s). The proposed model adheres to commonly held forensic principles, and does not preclude the system(s)/storage media being transported back to a lab environment for a more thorough examination and analysis once the initial field triage is concluded. The CFFTPM has been successfully used in various real world cases, and its investigative importance and pragmatic approach have been amply demonstrated. Furthermore, the derived evidence from these cases has not been challenged in the court proceedings where it has been introduced. The current article describes the CFFTPM in detail, discusses the model’s forensic soundness, investigative support capabilities and practical considerations.

  13. The Formalization of the Business Process Modeling Goals

    Directory of Open Access Journals (Sweden)

    Ligita Bušinska

    2016-10-01

    Full Text Available In business process modeling, BPMN has emerged as the de facto standard. However, applications of this notation use many subsets of its elements and various extensions. BPMN also coexists with many other modeling languages, forming a large set of available options for business process modeling languages and dialects. While, in general, the goal of the modeler is a central notion in the choice of modeling languages and notations, most research that proposes guidelines, techniques, and methods for business process modeling language evaluation and/or selection does not formalize the business process modeling goal or take it into account transparently. To overcome this gap, and to explicate and help handle business process modeling complexity, an approach to formalizing the business process modeling goal, and a supporting three-dimensional business process modeling framework, are proposed.

  14. Atmospheric pollution. From processes to modelling

    International Nuclear Information System (INIS)

    Sportisse, B.

    2008-01-01

    Air quality, greenhouse effect, ozone hole, chemical or nuclear accidents... All these phenomena are tightly linked to the chemical composition of the atmosphere and to the atmospheric dispersion of pollutants. This book aims at supplying the main elements for an understanding of atmospheric pollution: the stakes, the physical processes involved, and the role of scientific expertise in decision making. Content: 1 - classifications and scales: chemical composition of the atmosphere, vertical structure, time scales (transport, residence); 2 - matter/light interaction: notions of radiative transfer, application to the Earth's atmosphere; 3 - some elements about the atmospheric boundary layer: notion of scales in meteorology, atmospheric boundary layer (ABL), thermal stratification and stability, description of ABL turbulence, elements of atmospheric dynamics, some elements about the urban climate; 4 - notions of atmospheric chemistry: characteristics, ozone stratospheric chemistry, ozone tropospheric chemistry, brief introduction to indoor air quality; 5 - aerosols, clouds and rains: aerosols and particulates, aerosols and clouds, acid rains and leaching; 6 - towards numerical simulation: equation of reactive dispersion, numerical methods for chemistry-transport models, numerical resolution of the general dynamic equation (GDE) of aerosols, modern simulation chains, perspectives. (J.S.)

  15. Signal Processing Model for Radiation Transport

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, D H

    2008-07-28

    This note describes the design of a simplified gamma ray transport model for use in designing a sequential Bayesian signal processor for low-count detection and classification. It uses a simple one-dimensional geometry to describe the emitting source, shield effects, and detector (see Fig. 1). At present, only Compton scattering and photoelectric absorption are implemented for the shield and the detector. Other effects may be incorporated in the future by revising the expressions for the probabilities of escape and absorption. Pair production would require a redesign of the simulator to incorporate photon correlation effects. The initial design incorporates the physical effects that were present in the previous event mode sequence simulator created by Alan Meyer. The main difference is that this simulator transports the rate distributions instead of single photons. Event mode sequences and other time-dependent photon flux sequences are assumed to be marked Poisson processes that are entirely described by their rate distributions. Individual realizations can be constructed from the rate distribution using a random Poisson point sequence generator.
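The note states that event mode sequences are marked Poisson processes fully described by their rate distributions, with individual realizations drawn by a random Poisson point sequence generator. One standard way to do this (an implementation choice here, not necessarily the note's) is Lewis-Shedler thinning of a homogeneous process; the decaying rate function below is an arbitrary stand-in, not the model's actual rate distribution:

```python
import math
import random

def poisson_thinning(rate, t_max, rate_max, seed=42):
    """Draw ordered event times on [0, t_max] from an inhomogeneous
    Poisson process with intensity rate(t) <= rate_max, by thinning
    a homogeneous process of intensity rate_max (Lewis-Shedler)."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)          # next candidate arrival
        if t > t_max:
            return events
        if rng.random() < rate(t) / rate_max:   # keep with prob rate(t)/rate_max
            events.append(t)

# Hypothetical decaying source rate in counts/s (illustrative only)
rate = lambda t: 5.0 * math.exp(-0.1 * t)
times = poisson_thinning(rate, t_max=60.0, rate_max=5.0)
print(len(times), "events in 60 s")
```

The acceptance step requires only that `rate_max` bounds the rate function from above; a tighter bound makes the generator reject fewer candidates.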

  16. Evaluating Translational Research: A Process Marker Model

    Science.gov (United States)

    Trochim, William; Kane, Cathleen; Graham, Mark J.; Pincus, Harold A.

    2011-01-01

    Objective: We examine the concept of translational research from the perspective of evaluators charged with assessing translational efforts. One of the major tasks for evaluators involved in translational research is to help assess efforts that aim to reduce the time it takes to move research to practice and health impacts. Another is to assess efforts that are intended to increase the rate and volume of translation. Methods: We offer an alternative to the dominant contemporary tendency to define translational research in terms of a series of discrete “phases.” Results: We contend that this phased approach has been confusing and that it is insufficient as a basis for evaluation. Instead, we argue for the identification of key operational and measurable markers along a generalized process pathway from research to practice. Conclusions: This model provides a foundation for the evaluation of interventions designed to improve translational research and the integration of these findings into a field of translational studies. Clin Trans Sci 2011; Volume 4: 153–162 PMID:21707944

  17. Modelling of fiberglass pipe destruction process

    Directory of Open Access Journals (Sweden)

    А. К. Николаев

    2017-03-01

    Full Text Available The article deals with an important current issue for the oil and gas industry: the use of tubes made of high-strength, corrosion-resistant composite materials. In order to improve the operational safety of industrial pipes it is feasible to use composite fiberglass tubes. More than half of the accidents at oil and gas sites happen at oil gathering systems due to the high corrosiveness of the pumped fluid. To reduce the number of accidents and improve environmental protection we need to solve the issue of industrial pipe durability. This problem could be solved by using composite fiberglass materials, which have the physical and mechanical properties required for oil pipes. The durability and strength can be controlled through the fiberglass winding method, the number of layers in the composite material, and the high corrosion-resistance properties of fiberglass. Usage of high-strength composite materials in oil production is economically feasible; fiberglass pipe production is cheaper than steel pipe production. Fiberglass also has a low volume weight, which simplifies pipe transportation and installation. In order to assess the efficiency of using high-strength composite materials at oil production sites we studied their physical and mechanical properties and modelled the fiberglass pipe destruction process.

  18. Modeling spatial processes with unknown extremal dependence class

    KAUST Repository

    Huser, Raphaë l G.; Wadsworth, Jennifer L.

    2017-01-01

    Many environmental processes exhibit weakening spatial dependence as events become more extreme. Well-known limiting models, such as max-stable or generalized Pareto processes, cannot capture this, which can lead to a preference for models

  19. Querying Business Process Models with VMQL

    DEFF Research Database (Denmark)

    Störrle, Harald; Acretoaie, Vlad

    2013-01-01

    The Visual Model Query Language (VMQL) has been invented with the objectives (1) to make it easier for modelers to query models effectively, and (2) to be universally applicable to all modeling languages. In previous work, we have applied VMQL to UML, and validated the first of these two claims. ...

  20. Transforming Process Models to Problem Frames

    NARCIS (Netherlands)

    Fassbender, Stephan; Aysolmaz, Banu; Weske, M.; Rinderle-Ma, S.

    2015-01-01

    An increase of process awareness within organizations and advances in IT systems led to a development of process-aware information systems (PAIS) in many organizations. UPROM is developed as a unified BPM methodology to conduct business process and user requirements analysis for PAIS in an

  1. Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods

    Science.gov (United States)

    Soroush, Masoud; Weinberger, Charles B.

    2010-01-01

    This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…

  2. A linear time layout algorithm for business process models

    NARCIS (Netherlands)

    Gschwind, T.; Pinggera, J.; Zugal, S.; Reijers, H.A.; Weber, B.

    2014-01-01

    The layout of a business process model influences how easily it can be understood. Existing layout features in process modeling tools often rely on graph representations, but do not take the specific properties of business process models into account. In this paper, we propose an algorithm that is

  3. User-guided discovery of declarative process models

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, van der W.M.P.; Chawla, N.; King, I.; Sperduti, A.

    2011-01-01

    Process mining techniques can be used to effectively discover process models from logs with example behaviour. Cross-correlating a discovered model with information in the log can be used to improve the underlying process. However, existing process discovery techniques have two important drawbacks.

  4. Reduction of sources of error and simplification of the Carbon-14 urea breath test

    International Nuclear Information System (INIS)

    Bellon, M.S.

    1997-01-01

    Full text: Carbon-14 urea breath testing is established in the diagnosis of H. pylori infection. The aim of this study was to investigate possible further simplification and identification of error sources in the 14C urea kit extensively used at the Royal Adelaide Hospital. Thirty-six patients with validated H. pylori status were tested with breath samples taken at 10, 15, and 20 min. Using the single sample value at 15 min, there was no change in the diagnostic category. Reduction of errors in analysis depends on attention to the following details: stability of the absorption solution (now > 2 months), compatibility of the scintillation cocktail/absorption solution (with particular regard to photoluminescence and chemiluminescence), reduction in chemical quenching (moisture reduction), understanding of the counting hardware and its relevance, and appropriate response to deviations in quality assurance. With this experience, we are confident of the performance and reliability of the RAPID-14 urea breath test kit now available commercially.

  5. Model medication management process in Australian nursing homes using business process modeling.

    Science.gov (United States)

    Qian, Siyu; Yu, Ping

    2013-01-01

    One of the reasons for end-user avoidance or rejection of health information systems is poor alignment of the system with the healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step in the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high-risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviewing two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, the major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  6. Model reduction methods for vector autoregressive processes

    CERN Document Server

    Brüggemann, Ralf

    2004-01-01

    1.1 Objective of the Study. Vector autoregressive (VAR) models have become one of the dominant research tools in the analysis of macroeconomic time series during the last two decades. The great success of this modeling class started with Sims' (1980) critique of the traditional simultaneous equation models (SEM). Sims criticized the use of 'too many incredible restrictions' based on 'supposed a priori knowledge' in large scale macroeconometric models which were popular at that time. Therefore, he advocated largely unrestricted reduced form multivariate time series models, unrestricted VAR models in particular. Ever since his influential paper these models have been employed extensively to characterize the underlying dynamics in systems of time series. In particular, tools to summarize the dynamic interaction between the system variables, such as impulse response analysis or forecast error variance decompositions, have been developed over the years. The econometrics of VAR models and related quantities i...
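The impulse response analysis mentioned above traces how a one-off unit shock propagates through the system; for a VAR(1), y_t = A·y_{t-1} + e_t, the response at horizon h is simply the matrix power A^h. A minimal sketch for a bivariate case with an illustrative (not empirical) coefficient matrix:

```python
def impulse_responses(A, horizon):
    """Impulse responses of a bivariate VAR(1) y_t = A y_{t-1} + e_t.

    The response at lag h to a unit shock is the matrix power A**h;
    returns the list [I, A, A**2, ..., A**horizon] as nested lists.
    """
    def matmul(X, Y):  # 2x2 matrix product
        return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
                for i in range(2)]
    irf = [[[1.0, 0.0], [0.0, 1.0]]]  # lag 0: identity (contemporaneous)
    for _ in range(horizon):
        irf.append(matmul(A, irf[-1]))
    return irf

A = [[0.5, 0.1],   # illustrative, stable coefficient matrix
     [0.0, 0.8]]
irf = impulse_responses(A, horizon=3)
print(irf[1])  # the one-step response equals A itself
```

Because all eigenvalues of `A` lie inside the unit circle, the responses decay to zero, which is the stability condition an unrestricted VAR estimate is usually checked against.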

  7. Spherical Process Models for Global Spatial Statistics

    KAUST Repository

    Jeong, Jaehong; Jun, Mikyoung; Genton, Marc G.

    2017-01-01

    Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture

  8. Managing risks in business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2010-01-01

    Companies today, in some industries more than others, invest more capital and resources just to stay competitive, develop more diverse solutions, and increasingly start thinking more radically when considering their business models. However, despite the understanding that business model (BM...

  9. Plasma Process Modeling for Integrated Circuits Manufacturing

    OpenAIRE

    M. Meyyappan; T. R. Govindan

    1998-01-01

    A reactor model for plasma-based deposition and etching is presented. Two-dimensional results are discussed in terms of plasma density, ion flux, and ion energy. Approaches to develop rapid CAD-type models are discussed.

  10. Modeling microbial processes in porous media

    Science.gov (United States)

    Murphy, Ellyn M.; Ginn, Timothy R.

    The incorporation of microbial processes into reactive transport models has generally proceeded along two separate lines of investigation: (1) transport of bacteria as inert colloids in porous media, and (2) the biodegradation of dissolved contaminants by a stationary phase of bacteria. Research over the last decade has indicated that these processes are closely linked. This linkage may occur when a change in metabolic activity alters the attachment/detachment rates of bacteria to surfaces, either promoting or retarding bacterial transport in a groundwater-contaminant plume. Changes in metabolic activity, in turn, are controlled by the time of exposure of the microbes to electron acceptors/donor and other components affecting activity. Similarly, metabolic activity can affect the reversibility of attachment, depending on the residence time of active microbes. Thus, improvements in quantitative analysis of active subsurface biota necessitate direct linkages between substrate availability, metabolic activity, growth, and attachment/detachment rates. This linkage requires both a detailed understanding of the biological processes and robust quantitative representations of these processes that can be tested experimentally. This paper presents an overview of current approaches used to represent physicochemical and biological processes in porous media, along with new conceptual approaches that link metabolic activity with partitioning of the microorganism between the aqueous and solid phases.

  11. Evaluation of EOR Processes Using Network Models

    DEFF Research Database (Denmark)

    Winter, Anatol; Larsen, Jens Kjell; Krogsbøll, Anette

    1998-01-01

    The report consists of the following parts: 1) Studies of wetting properties of model fluids and fluid mixtures aimed at an optimal selection of candidates for micromodel experiments. 2) Experimental studies of multiphase transport properties using physical models of porous networks (micromodels), including estimation of their "petrophysical" properties (e.g. absolute permeability). 3) Mathematical modelling and computer studies of multiphase transport through pore space using mathematical network models. 4) Investigation of the link between pore-scale and macroscopic recovery mechanisms.

  12. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  13. Eye Tracking Meets the Process of Process Modeling: a Visual Analytic Approach

    DEFF Research Database (Denmark)

    Burattin, Andrea; Kaiser, M.; Neurauter, Manuel

    2017-01-01

    Research on the process of process modeling (PPM) studies how process models are created. It typically uses the logs of the interactions with the modeling tool to assess the modeler’s behavior. In this paper we suggest to introduce an additional stream of data (i.e., eye tracking) to improve the ...

  14. Estimating Diurnal Courses of Gross Primary Production for Maize: A Comparison of Sun-Induced Chlorophyll Fluorescence, Light-Use Efficiency and Process-Based Models

    Directory of Open Access Journals (Sweden)

    Tianxiang Cui

    2017-12-01

    Full Text Available Accurately quantifying gross primary production (GPP is of vital importance to understanding the global carbon cycle. Light-use efficiency (LUE models and process-based models have been widely used to estimate GPP at different spatial and temporal scales. However, large uncertainties remain in quantifying GPP, especially for croplands. Recently, remote measurements of solar-induced chlorophyll fluorescence (SIF have provided a new perspective to assess actual levels of plant photosynthesis. In the presented study, we evaluated the performance of three approaches, including the LUE-based multi-source data synergized quantitative (MuSyQ GPP algorithm, the process-based boreal ecosystem productivity simulator (BEPS model, and the SIF-based statistical model, in estimating the diurnal courses of GPP at a maize site in Zhangye, China. A field campaign was conducted to acquire synchronous far-red SIF (SIF760 observations and flux tower-based GPP measurements. Our results showed that both SIF760 and GPP were linearly correlated with APAR, and the SIF760-GPP relationship was adequately characterized using a linear function. The evaluation of the modeled GPP against the GPP measured from the tower demonstrated that all three approaches provided reasonable estimates, with R2 values of 0.702, 0.867, and 0.667 and RMSE values of 0.247, 0.153, and 0.236 mg m−2 s−1 for the MuSyQ-GPP, BEPS and SIF models, respectively. This study indicated that the BEPS model simulated the GPP best due to its efficiency in describing the underlying physiological processes of sunlit and shaded leaves. The MuSyQ-GPP model was limited by its simplification of some critical ecological processes and its weakness in characterizing the contribution of shaded leaves. The SIF760-based model demonstrated a relatively limited accuracy but showed its potential in modeling GPP without dependency on climate inputs in short-term studies.
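The R² and RMSE scores above come from comparing each model's GPP estimates against the flux tower measurements; for the SIF-based statistical model this reduces to an ordinary least-squares line of GPP on SIF760 and the usual goodness-of-fit metrics. A minimal sketch on synthetic pairs (values and units are illustrative, not the paper's data):

```python
import math

def linear_fit_metrics(x, y):
    """Fit y = a + b*x by OLS; return (a, b, r2, rmse) of the fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(x, y))
         / sum((u - mx) ** 2 for u in x))
    a = my - b * mx
    pred = [a + b * u for u in x]
    ss_res = sum((v - p) ** 2 for v, p in zip(y, pred))
    ss_tot = sum((v - my) ** 2 for v in y)
    r2 = 1.0 - ss_res / ss_tot
    rmse = math.sqrt(ss_res / n)
    return a, b, r2, rmse

# Illustrative SIF760 vs GPP (mg m-2 s-1) pairs, roughly linear
sif = [0.4, 0.8, 1.1, 1.5, 2.0]
gpp = [0.21, 0.39, 0.52, 0.74, 0.98]
a, b, r2, rmse = linear_fit_metrics(sif, gpp)
print(round(r2, 3), round(rmse, 3))
```

The same two metrics apply unchanged to the MuSyQ-GPP and BEPS outputs, with the model estimates in place of the fitted line's predictions.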

  15. Regime-switching models to study psychological process

    NARCIS (Netherlands)

    Hamaker, E.L.; Grasman, R.P.P.P.; Kamphuis, J.H.

    2010-01-01

    Many psychological processes are characterized by recurrent shifts between different states. To model these processes at the level of the individual, regime-switching models may prove useful. In this chapter we discuss two of these models: the threshold autoregressive model and the Markov
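The first of the two models discussed, the threshold autoregressive (TAR) model, switches its autoregressive dynamics whenever the previous observation crosses a threshold. A minimal two-regime TAR(1) simulation sketch; the coefficients and threshold are illustrative, not taken from the chapter:

```python
import random

def simulate_tar(n, phi_low=0.3, phi_high=0.9, threshold=0.0,
                 sigma=1.0, seed=1):
    """Simulate a two-regime TAR(1): y_t = phi * y_{t-1} + e_t,
    where phi = phi_low if y_{t-1} <= threshold, else phi_high,
    and e_t ~ N(0, sigma**2)."""
    rng = random.Random(seed)
    y = [0.0]
    for _ in range(n - 1):
        phi = phi_low if y[-1] <= threshold else phi_high
        y.append(phi * y[-1] + rng.gauss(0.0, sigma))
    return y

series = simulate_tar(200)
print(len(series))  # 200 observations
```

With `phi_high` close to one, excursions above the threshold are persistent while returns below it decay quickly, which is the kind of recurrent state-shifting the chapter describes.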

  16. Dynamic modeling of ultrafiltration membranes for whey separation processes

    NARCIS (Netherlands)

    Saltik, M.B.; Ozkan, L.; Jacobs, M.; van der Padt, A.

    2017-01-01

    In this paper, we present a control relevant rigorous dynamic model for an ultrafiltration membrane unit in a whey separation process. The model consists of a set of differential algebraic equations and is developed for online model based applications such as model based control and process

  17. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  18. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil

    2009-01-01

    behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures is considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation, are presented. The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding

  19. Modelling of injection processes in ladle metallurgy

    NARCIS (Netherlands)

    Visser, H.

    2016-01-01

    Ladle metallurgical processes constitute a portion of the total production chain of steel from iron ore. With these batch processes, the hot metal or steel transfer ladle is being used as a reactor vessel and a reagent is often injected in order to bring the composition of the hot metal or steel to

  20. The Process of Horizontal Differentiation: Two Models.

    Science.gov (United States)

    Daft, Richard L.; Bradshaw, Patricia J.

    1980-01-01

    Explores the process of horizontal differentiation by examining events leading to the establishment of 30 new departments in five universities. Two types of horizontal differentiation processes--administrative and academic--were observed and each was associated with different organizational conditions. (Author/IRT)

  1. Evolutionary Regeneration Model of Thought Process

    OpenAIRE

    Noboru, HOKKYO; Hitachi Energy Research Laboratory

    1982-01-01

    A preliminary attempt is made to understand the thought process and the evolution of the nervous system on the same footing as regeneration processes obeying certain recursive algebraic rules which possibly economize the information content of the increasingly complex structural-functional correlate of the evolving and thinking nervous system.

  2. Animal models for information processing during sleep

    NARCIS (Netherlands)

    Coenen, A.M.L.; Drinkenburg, W.H.I.M.

    2002-01-01

    Information provided by external stimuli does reach the brain during sleep, although the amount of information is reduced during sleep compared to wakefulness. The process controlling this reduction is called `sensory' gating and evidence exists that the underlying neurophysiological processes take

  3. Business process model repositories : framework and survey

    NARCIS (Netherlands)

    Yan, Z.; Dijkman, R.M.; Grefen, P.W.P.J.

    2009-01-01

    Large organizations often run hundreds or even thousands of business processes. Managing such large collections of business processes is a challenging task. Intelligent software can assist in that task by providing common repository functions such as storage, search and version management. They can

  4. A framework for business process model repositories

    NARCIS (Netherlands)

    Yan, Z.; Grefen, P.W.P.J.; Muehlen, zur M.; Su, J.

    2010-01-01

    Large organizations often run hundreds or even thousands of business processes. Managing such large collections of business processes is a challenging task. Intelligent software can assist in that task by providing common repository functions such as storage, search and version management. They can

  5. The Model of the Production Process for the Quality Management

    Directory of Open Access Journals (Sweden)

    Alot Zbigniew

    2017-02-01

    Full Text Available This article is a result of research on models of production processes for quality management and their identification. It discusses the classical model and the indicators for evaluating process capability, taking as its starting point the assumption of a normal distribution of the process characteristics. The division of process types proposed by the ISO 21747:2006 standard, which introduces models for non-stationary processes, is presented. A general process model is proposed that allows, in any real case, a precise description of the statistical characteristics of the process. It gives the opportunity for a more detailed description, in comparison to the model proposed by the ISO 21747:2006 standard, of the process characteristics and for determining its capability. This model contains the type of process, the statistical distribution, and the method for determining the capability and performance (long-term capability) of the process. As one of the model elements, an own classification and a resulting set of process types are proposed. The classification follows the recommendations of ISO 21747:2006 introducing models for non-stationary processes, while the set of process types allows, beyond a more precise description of the process characteristics, its use for monitoring the process.
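For the classical, normal-distribution case the article starts from, the capability indicators are conventionally the indices Cp (tolerance width versus process spread) and Cpk (the same, penalizing an off-center mean). A minimal sketch, assuming these are the indicators meant and using illustrative measurements:

```python
import math

def capability_indices(data, lsl, usl):
    """Classical Cp and Cpk, assuming normally distributed,
    stationary process output within limits [lsl, usl]."""
    n = len(data)
    mean = sum(data) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))  # sample sd
    cp = (usl - lsl) / (6 * sd)                    # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sd)   # accounts for centering
    return cp, cpk

# Illustrative measurements against tolerance limits [9.0, 11.0]
data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3, 10.1, 9.9]
cp, cpk = capability_indices(data, lsl=9.0, usl=11.0)
print(round(cp, 2), round(cpk, 2))
```

Cpk equals Cp only when the process is centered between the limits; for the non-stationary process types of ISO 21747:2006 these formulas no longer apply directly, which is the gap the article's general model addresses.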

  6. Study of dissolution process and its modelling

    Directory of Open Access Journals (Sweden)

    Juan Carlos Beltran-Prieto

    2017-01-01

    Full Text Available The use of mathematical concepts and language to describe and represent the interactions and dynamics of a system is known as a mathematical model. Mathematical modelling finds a huge number of successful applications across science, social, and engineering fields, including biology, chemistry, physics, computer science, artificial intelligence, bioengineering, finance, economics, and others. In this research, we propose a mathematical model that predicts the dissolution of a solid material immersed in a fluid. The developed model can be used to evaluate the rate of mass transfer and the mass transfer coefficient. Further research is expected to be carried out to use the model as a base to develop models useful for the pharmaceutical industry to gain information about the dissolution of medicaments in the bloodstream, which could play a key role in the formulation of medicaments.
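A common starting point for such a dissolution model (an assumption here, not necessarily the authors' formulation) is the Noyes-Whitney rate law, dC/dt = k·A·(Cs − C), where k is the mass transfer coefficient, A the exposed surface area, and Cs the saturation concentration. A minimal explicit-Euler integration with illustrative parameter values:

```python
def dissolve(k, area, c_sat, c0=0.0, dt=0.1, t_end=60.0):
    """Integrate dC/dt = k * area * (c_sat - C) by explicit Euler.

    Returns the concentration profile over time (one entry per step).
    Stable as long as dt * k * area < 1.
    """
    c, profile = c0, [c0]
    for _ in range(int(t_end / dt)):
        c += dt * k * area * (c_sat - c)
        profile.append(c)
    return profile

# Illustrative parameters: mass transfer coeff k, surface area, saturation conc.
profile = dissolve(k=0.05, area=2.0, c_sat=1.0)
print(round(profile[-1], 3))  # approaches c_sat as the fluid saturates
```

Fitting k to a measured concentration profile is how the mass transfer coefficient the abstract mentions would be evaluated in practice.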

  7. Understanding Quality in Process Modelling: Towards a Holistic Perspective

    Directory of Open Access Journals (Sweden)

    Jan Recker

    2007-09-01

    Full Text Available Quality is one of the main topics in current conceptual modelling research, as is the field of business process modelling. Yet, widely acknowledged academic contributions towards an understanding or measurement of business process model quality are limited at best. In this paper I argue that the development of methodical theories concerning the measurement or establishment of process model quality must be preceded by methodological elaborations on business process modelling. I further argue that existing epistemological foundations of process modelling are insufficient for describing all extrinsic and intrinsic traits of model quality. This in turn has led to a lack of holistic understanding of process modelling. Taking into account the inherent social and purpose-oriented character of process modelling in contemporary organizations I present a socio-pragmatic constructionist methodology of business process modelling and sketch out implications of this perspective towards an understanding of process model quality. I anticipate that, based on this research, theories can be developed that facilitate the evaluation of the 'goodness' of a business process model.

  8. Centrifuge modelling of contaminant transport processes

    OpenAIRE

    Culligan, P. J.; Savvidou, C.; Barry, D. A.

    1996-01-01

    Over the past decade, research workers have started to investigate problems of subsurface contaminant transport through physical modelling on a geotechnical centrifuge. A major advantage of this apparatus is its ability to model complex natural systems in a controlled laboratory environment. In this paper, we discuss the principles and scaling laws related to the centrifugal modelling of contaminant transport, and present four examples of recent work that has bee...

  9. An Abstract Model of Historical Processes

    Directory of Open Access Journals (Sweden)

    Michael Poulshock

    2017-06-01

    Full Text Available A theoretical model is presented which provides a way to simulate, at a very abstract level, power struggles in the social world. In the model, agents can benefit or harm each other, to varying degrees and with differing levels of influence. The agents interact over time, using the power they have to try to get more of it, while being constrained in their strategic choices by social inertia. The outcomes of the model are probabilistic. More research is needed to determine whether the model has any empirical validity.

  10. Modeling and Advanced Control for Sustainable Process ...

    Science.gov (United States)

    This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-inspired, multi-agent-based method. The sustainability and performance assessment of process operating points is carried out using the U.S. E.P.A.’s GREENSCOPE assessment tool that provides scores for the selected economic, material management, environmental and energy indicators. The indicator results supply information on whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous bioethanol fermentation process whose dynamics are characterized by steady-state multiplicity and oscillatory behavior. This book chapter contribution demonstrates the application of novel process control strategies for sustainability by increasing material management, energy efficiency, and pollution prevention, as needed for SHC Sustainable Uses of Wastes and Materials Management.

  11. Computer modeling of lung cancer diagnosis-to-treatment process.

    Science.gov (United States)

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U; Yu, Xinhua; Faris, Nick; Li, Jingshan

    2015-08-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model of the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their application in healthcare are introduced, and the approach to deriving a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed.
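
    As a hedged illustration of the analytical side mentioned in the abstract, the sketch below computes expected time to absorption in a toy absorbing Markov chain; the states and transition probabilities are invented, not taken from the paper.

    ```python
    import numpy as np

    # Toy absorbing Markov chain for a diagnosis pathway (states and numbers
    # invented for illustration): transient states 0=referral, 1=imaging,
    # 2=biopsy; the absorbing state is the treatment decision.
    # Q holds transition probabilities among transient states only; each row's
    # missing mass is the probability of moving straight to absorption.
    Q = np.array([
        [0.0, 0.8, 0.1],
        [0.0, 0.2, 0.7],
        [0.0, 0.0, 0.1],
    ])
    # Fundamental matrix N = (I - Q)^-1; its row sums give the expected number
    # of process steps before absorption, starting from each transient state.
    N = np.linalg.inv(np.eye(3) - Q)
    expected_steps = N.sum(axis=1)
    ```

    Replacing step counts with per-state sojourn times turns the same computation into an expected diagnosis-to-treatment duration.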

  12. Modeling of Heating During Food Processing

    Science.gov (United States)

    Zheleva, Ivanka; Kamburova, Veselka

    Heat transfer processes are important for almost all aspects of food preparation and play a key role in determining food safety. Whether it is cooking, baking, boiling, frying, grilling, blanching, drying, sterilizing, or freezing, heat transfer is part of the processing of almost every food. Heat transfer is a dynamic process in which thermal energy is transferred from one body with higher temperature to another body with lower temperature. Temperature difference between the source of heat and the receiver of heat is the driving force in heat transfer.
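
    The driving-force idea in the passage above can be made concrete with Newton's law of cooling, a standard lumped model in which the rate of temperature change is proportional to the temperature difference; the time constant used here is illustrative, not a measured food property.

    ```python
    import math

    # Newton's law of cooling, a lumped illustration of the driving-force idea:
    # dT/dt = -(T - T_env) / tau, so T(t) = T_env + (T0 - T_env) * exp(-t / tau).
    # tau (the time constant) below is illustrative.
    def cool(T0, T_env, tau, t):
        return T_env + (T0 - T_env) * math.exp(-t / tau)

    # A food item at 90 C placed in a 20 C room with tau = 600 s:
    T_after_10min = cool(90.0, 20.0, 600.0, 600.0)   # one time constant elapsed
    ```

    After one time constant the temperature difference to the environment has decayed to about 37% of its initial value, which is the sense in which the difference "drives" the transfer.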

  13. A QCD motivated model for soft processes

    International Nuclear Information System (INIS)

    Kormilitzin, A.; Levin, E.

    2009-01-01

    In this talk we give a brief description of a QCD motivated model for both hard and soft interactions at high energies. In this model the long distance behaviour of the scattering amplitude is determined by the dipole scattering amplitude in the saturation domain.

  14. Health care management modelling: a process perspective

    NARCIS (Netherlands)

    Vissers, J.M.H.

    1998-01-01

    Modelling-based health care management ought to become just as popular as evidence based medicine. Making managerial decisions based on evidence by modelling efforts is certainly a step forward. Examples can be given of many successful applications in different areas of decision making: disease

  15. GRAPHICAL MODELS OF THE AIRCRAFT MAINTENANCE PROCESS

    OpenAIRE

    Stanislav Vladimirovich Daletskiy; Stanislav Stanislavovich Daletskiy

    2017-01-01

    The aircraft maintenance is realized by a rapid sequence of maintenance organizational and technical states, its research and analysis are carried out by statistical methods. The maintenance process concludes aircraft technical states connected with the objective patterns of technical qualities changes of the aircraft as a maintenance object and organizational states which determine the subjective organization and planning process of aircraft using. The objective maintenance process is ...

  16. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    Full Text Available When developing an information system, it is important to create clear models and to choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule-specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages. The article presents a theoretical comparison of business rules and business process modeling languages, comparing the different business process modeling languages and business rules representation languages according to selected modeling aspects. Finally, the best-fitting set of languages is selected for a three-layer framework for business-rule-based software modeling.

  17. Difference-based Model Synchronization in an Industrial MDD Process

    DEFF Research Database (Denmark)

    Könemann, Patrick; Kindler, Ekkart; Unland, Ludger

    2009-01-01

    Models play a central role in model-driven software engineering. There are different kinds of models during the development process, which are related to each other and change over time. Therefore, it is difficult to keep the different models consistent with each other. Consistency of different m...... model versions, and for synchronizing other types of models. The main concern is to apply our concepts to an industrial process, in particular keeping usability and performance in mind. Keyword: Model Differencing, Model Merging, Model Synchronization...

  18. Modeling fixation locations using spatial point processes.

    Science.gov (United States)

    Barthelmé, Simon; Trukenbrod, Hans; Engbert, Ralf; Wichmann, Felix

    2013-10-01

    Whenever eye movements are measured, a central part of the analysis has to do with where subjects fixate and why they fixated where they fixated. To a first approximation, a set of fixations can be viewed as a set of points in space; this implies that fixations are spatial data and that the analysis of fixation locations can be beneficially thought of as a spatial statistics problem. We argue that thinking of fixation locations as arising from point processes is a very fruitful framework for eye-movement data, helping turn qualitative questions into quantitative ones. We provide a tutorial introduction to some of the main ideas of the field of spatial statistics, focusing especially on spatial Poisson processes. We show how point processes help relate image properties to fixation locations. In particular we show how point processes naturally express the idea that image features' predictability for fixations may vary from one image to another. We review other methods of analysis used in the literature, show how they relate to point process theory, and argue that thinking in terms of point processes substantially extends the range of analyses that can be performed and clarifies their interpretation.
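
    A minimal sketch of the spatial Poisson process idea discussed above: in the homogeneous case, the number of points in a window is Poisson with mean equal to intensity times area, and given that count the locations are i.i.d. uniform. This is a generic illustration, not the authors' code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_poisson_process(lam, width=1.0, height=1.0):
        """Homogeneous spatial Poisson process with intensity lam on a rectangle."""
        n = rng.poisson(lam * width * height)   # point count ~ Poisson(intensity * area)
        xs = rng.uniform(0.0, width, size=n)    # given n, locations are i.i.d. uniform
        ys = rng.uniform(0.0, height, size=n)
        return np.column_stack([xs, ys])

    pts = simulate_poisson_process(100.0)       # ~100 "fixations" on the unit square
    ```

    Inhomogeneous processes, where the intensity varies with image features, are the case of interest for fixation modelling; they can be simulated from this homogeneous sketch by thinning.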

  19. A Software Development Simulation Model of a Spiral Process

    OpenAIRE

    Carolyn Mizell; Linda Malone

    2009-01-01

    This paper will present a discrete event simulation model of a spiral development lifecycle that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process. There is a need for simulation models of software development processes other than the waterfall due to new processes becoming more widely used in order to overcome the limitations of the traditional waterfall lifecycle. The use of a spiral process can make the inherently difficult job of...

  20. Deconstructing crop processes and models via identities

    DEFF Research Database (Denmark)

    Porter, John Roy; Christensen, Svend

    2013-01-01

    This paper is part review and part opinion piece; it has three parts of increasing novelty and speculation in approach. The first presents an overview of how some of the major crop simulation models approach the issue of simulating the responses of crops to changing climatic and weather variables......, mainly atmospheric CO2 concentration and increased and/or varying temperatures. It illustrates an important principle in models of a single cause having alternative effects and vice versa. The second part suggests some features, mostly missing in current crop models, that need to be included...

  1. NONLINEAR MODEL PREDICTIVE CONTROL OF CHEMICAL PROCESSES

    Directory of Open Access Journals (Sweden)

    SILVA R. G.

    1999-01-01

    Full Text Available A new algorithm for model predictive control is presented. The algorithm utilizes a simultaneous solution and optimization strategy to solve the model's differential equations. The equations are discretized by equidistant collocation, and along with the algebraic model equations are included as constraints in a nonlinear programming (NLP problem. This algorithm is compared with the algorithm that uses orthogonal collocation on finite elements. The equidistant collocation algorithm results in simpler equations, providing a decrease in computation time for the control moves. Simulation results are presented and show a satisfactory performance of this algorithm.
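
    The simultaneous solution-and-optimization strategy described above can be sketched, under stated assumptions, by transcribing a tiny optimal control problem into an NLP: the ODE dx/dt = -x + u is discretized at equidistant points (backward Euler here, a simpler stand-in for the paper's collocation scheme) and the discretized dynamics become equality constraints.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Illustrative sketch only (not the paper's algorithm or code): transcribe
    # min integral(u^2) + terminal penalty, s.t. dx/dt = -x + u, x(0) = 1,
    # into an NLP by discretizing at N equidistant points with backward Euler.
    N, T = 20, 2.0
    h = T / N
    x0 = 1.0

    def unpack(z):
        return z[:N + 1], z[N + 1:]        # states x_0..x_N, controls u_1..u_N

    def objective(z):
        x, u = unpack(z)
        return h * np.sum(u ** 2) + 100.0 * x[-1] ** 2   # effort + drive x(T) to 0

    def defects(z):
        x, u = unpack(z)
        # Discretized dynamics as equality constraints: x_k - x_{k-1} = h*(-x_k + u_k)
        return np.concatenate(([x[0] - x0],
                               x[1:] - x[:-1] - h * (-x[1:] + u)))

    sol = minimize(objective, np.zeros(2 * N + 1),
                   constraints={"type": "eq", "fun": defects}, method="SLSQP")
    x_opt, u_opt = unpack(sol.x)
    ```

    Because the dynamics enter as constraints rather than being integrated forward, states and controls are found simultaneously by the NLP solver, which is the essence of the simultaneous strategy the abstract describes.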

  2. The Structured Process Modeling Theory (SPMT) : a cognitive view on why and how modelers benefit from structuring the process of process modeling

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Gailly, F.; Grefen, P.W.P.J.; Poels, G.

    2015-01-01

    After observing various inexperienced modelers constructing a business process model based on the same textual case description, it was noted that great differences existed in the quality of the produced models. The impression arose that certain quality issues originated from cognitive failures

  3. Dynamic process model of a plutonium oxalate precipitator. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.L.; Hammelman, J.E.; Borgonovi, G.M.

    1977-11-01

    In support of LLL material safeguards program, a dynamic process model was developed which simulates the performance of a plutonium (IV) oxalate precipitator. The plutonium oxalate precipitator is a component in the plutonium oxalate process for making plutonium oxide powder from plutonium nitrate. The model is based on state-of-the-art crystallization descriptive equations, the parameters of which are quantified through the use of batch experimental data. The dynamic model predicts performance very similar to general Hanford oxalate process experience. The utilization of such a process model in an actual plant operation could promote both process control and material safeguards control by serving as a baseline predictor which could give early warning of process upsets or material diversion. The model has been incorporated into a FORTRAN computer program and is also compatible with the DYNSYS 2 computer code which is being used at LLL for process modeling efforts.

  4. Dynamic process model of a plutonium oxalate precipitator. Final report

    International Nuclear Information System (INIS)

    Miller, C.L.; Hammelman, J.E.; Borgonovi, G.M.

    1977-11-01

    In support of LLL material safeguards program, a dynamic process model was developed which simulates the performance of a plutonium (IV) oxalate precipitator. The plutonium oxalate precipitator is a component in the plutonium oxalate process for making plutonium oxide powder from plutonium nitrate. The model is based on state-of-the-art crystallization descriptive equations, the parameters of which are quantified through the use of batch experimental data. The dynamic model predicts performance very similar to general Hanford oxalate process experience. The utilization of such a process model in an actual plant operation could promote both process control and material safeguards control by serving as a baseline predictor which could give early warning of process upsets or material diversion. The model has been incorporated into a FORTRAN computer program and is also compatible with the DYNSYS 2 computer code which is being used at LLL for process modeling efforts

  5. MODELLING OF THE PROCESS OF TEACHING READING ENGLISH LANGUAGE PERIODICALS

    Directory of Open Access Journals (Sweden)

    Тетяна Глушко

    2014-07-01

    Full Text Available The article reveals a scientifically substantiated process of teaching reading English language periodicals in all its components, which are consistently developed and form the interconnection of the structural elements in the process of teaching reading. This process is presented as a set of interconnected and interdetermined models: (1) models of the process of acquiring standard and expressive lexical knowledge; (2) models of the process of forming skills to use such vocabulary; (3) models of developing skills to read texts of different linguistic levels.

  6. Aspect-Oriented Business Process Modeling with AO4BPMN

    Science.gov (United States)

    Charfi, Anis; Müller, Heiko; Mezini, Mira

    Many crosscutting concerns in business processes need to be addressed already at the business process modeling level such as compliance, auditing, billing, and separation of duties. However, existing business process modeling languages including OMG's Business Process Modeling Notation (BPMN) lack appropriate means for expressing such concerns in a modular way. In this paper, we motivate the need for aspect-oriented concepts in business process modeling languages and propose an aspect-oriented extension to BPMN called AO4BPMN. We also present a graphical editor supporting that extension.

  7. The Role(s) of Process Models in Design Practice

    DEFF Research Database (Denmark)

    Iversen, Søren; Jensen, Mads Kunø Nyegaard; Vistisen, Peter

    2018-01-01

    This paper investigates how design process models are implemented and used in design-driven organisations. The archetypical theoretical framing of process models, describe their primary role as guiding the design process, and assign roles and deliverables throughout the process. We hypothesise...... that the process models also take more communicative roles in practice, both in terms of creating an internal design rationale, as well as demystifying the black box of design thinking to external stakeholders. We investigate this hypothesis through an interview study of four major danish design......-driven organisations, and analyse the different roles their archetypical process models take in their organisations. The main contribution is the identification of three, often overlapping roles, which design process models showed to assume in design-driven organisations: process guidance, adding transparency...

  8. MODELING OF MANAGEMENT PROCESSES IN AN ORGANIZATION

    Directory of Open Access Journals (Sweden)

    Stefan Iovan

    2016-05-01

    Full Text Available When driving any major change within an organization, strategy and execution are intrinsic to a project’s success. Nevertheless, closing the gap between strategy and execution remains a challenge for many organizations [1]. Companies tend to focus more on execution than on strategy for quick results, instead of taking the time needed to understand the parts that make up the whole, so that the right execution plan can be put in place to deliver the best outcomes. A large part of this is understanding that business operations don’t fit neatly within the traditional organizational hierarchy. Business processes are often messy, collaborative efforts that cross teams, departments and systems, making them difficult to manage within a hierarchical structure [2]. Business process management (BPM) fills this gap by redefining an organization according to its end-to-end processes, so opportunities for improvement can be identified and processes streamlined for growth, revenue and transformation. This white paper provides guidelines on what to consider when using business process applications to solve BPM initiatives, and on the unique capabilities software systems provide that can help ensure both the project’s success and the success of the organization as a whole, for the majority of medium and small businesses, big companies and even some governmental organizations [2].

  9. Biomedical Simulation Models of Human Auditory Processes

    Science.gov (United States)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models explore noise propagation mechanisms associated with noise attenuation and transmission paths created when using hearing protectors such as earplugs and headsets in high noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data which provides explicit external ear, ear canal, middle ear ossicular bones and cochlea geometry. Results from these studies have enabled a greater understanding of hearing-protector-to-flesh dynamics as well as prioritizing noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast-related impulse noise on human auditory mechanisms and brain tissue.

  10. Modelling of additive manufacturing processes: a review and classification

    Science.gov (United States)

    Stavropoulos, Panagiotis; Foteinopoulos, Panagis

    2018-03-01

    Additive manufacturing (AM) is a very promising technology; however, there are a number of open issues related to the different AM processes. The literature on modelling the existing AM processes is reviewed and classified. A categorization of the different AM processes in process groups, according to the process mechanism, has been conducted and the most important issues are stated. Suggestions are made as to which approach is more appropriate according to the key performance indicator desired to be modelled and a discussion is included as to the way that future modelling work can better contribute to improving today's AM process understanding.

  11. Sensitivity study of reduced models of the activated sludge process ...

    African Journals Online (AJOL)

    2009-08-07

    Aug 7, 2009 ... Sensitivity study of reduced models of the activated sludge process, for the purposes of parameter estimation and process optimisation: Benchmark process with ASM1 and UCT reduced biological models. S du Plessis and R Tzoneva*. Department of Electrical Engineering, Cape Peninsula University of ...

  12. Task-specific visual cues for improving process model understanding

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Context Business process models support various stakeholders in managing business processes and designing process-aware information systems. In order to make effective use of these models, they have to be readily understandable. Objective Prior research has emphasized the potential of visual cues to

  13. Computer-Aided Multiscale Modelling for Chemical Process Engineering

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Gani, Rafiqul

    2007-01-01

    Chemical processes are generally modeled through monoscale approaches, which, while not adequate, satisfy a useful role in product-process design. In this case, use of a multi-dimensional and multi-scale model-based approach has importance in product-process development. A computer-aided framework...

  14. Guided interaction exploration in artifact-centric process models

    NARCIS (Netherlands)

    van Eck, M.L.; Sidorova, N.; van der Aalst, W.M.P.

    2017-01-01

    Artifact-centric process models aim to describe complex processes as a collection of interacting artifacts. Recent development in process mining allow for the discovery of such models. However, the focus is often on the representation of the individual artifacts rather than their interactions. Based

  15. The Formalization of the Business Process Modeling Goals

    OpenAIRE

    Bušinska, Ligita; Kirikova, Mārīte

    2016-01-01

    In business process modeling the de facto standard BPMN has emerged. However, the applications of this notation have many subsets of elements and various extensions. Also, BPMN still coincides with many other modeling languages, forming a large set of available options for business process modeling languages and dialects. While, in general, the goal of modelers is a central notion in the choice of modeling languages and notations, in most researches that propose guidelines, techniques, and me...

  16. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    Full Text Available This work presents a modelling framework based on a phenomena description of the process. The approach is taken to make process models easy to understand and construct in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, and the Modelica modelling language is used to check the proposed concept. The partial results are promising and the research effort will be extended in a computer-aided modelling environment based on phenomena.

  17. An integrated model for supplier selection process

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In today's highly competitive manufacturing environment, the supplier selection process is one of the crucial activities in supply chain management. In order to select the best supplier(s), it is necessary not only to continuously track and benchmark the performance of suppliers but also to make a tradeoff between tangible and intangible factors, some of which may conflict. In this paper an integration of case-based reasoning (CBR), the analytical network process (ANP) and linear programming (LP) is proposed to solve the supplier selection problem.
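
    The paper's actual method integrates CBR, ANP and LP; as a loose stand-in, the sketch below uses a plain weighted-sum score to show the tangible/intangible tradeoff idea. Criteria, weights and supplier scores are invented for illustration.

    ```python
    # Loose stand-in for the paper's CBR/ANP/LP integration: a plain
    # weighted-sum score over mixed tangible/intangible criteria.
    # Criteria, weights and scores below are invented for illustration.
    criteria_weights = {"price": 0.4, "quality": 0.35, "flexibility": 0.25}
    suppliers = {
        "A": {"price": 0.9, "quality": 0.6, "flexibility": 0.7},
        "B": {"price": 0.7, "quality": 0.9, "flexibility": 0.8},
    }

    def score(profile):
        """Weighted sum of normalized criterion scores (higher is better)."""
        return sum(w * profile[c] for c, w in criteria_weights.items())

    best = max(suppliers, key=lambda s: score(suppliers[s]))
    ```

    In the paper's fuller approach, the weights themselves would come from an ANP analysis that captures interdependence among criteria, rather than being fixed by hand.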

  18. Statistical image processing and multidimensional modeling

    CERN Document Server

    Fieguth, Paul

    2010-01-01

    Images are all around us! The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something - an artery, a road, a DNA marker, an oil spill - from imagery, possibly noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However this book does not so much focus on images, per se, but rather on spatial data sets, with one or more measurements taken over

  19. From BPMN process models to BPEL web services

    NARCIS (Netherlands)

    Ouyang, C.; Aalst, van der W.M.P.; Dumas, M.; Hofstede, ter A.H.M.; Feig, E.; Kumar, A.

    2006-01-01

    The Business Process Modelling Notation (BPMN) is a graph-oriented language in which control and action nodes can be connected almost arbitrarily. It is supported by various modelling tools but so far no systems can directly execute BPMN models. The Business Process Execution Language for Web

  20. The evolution of process-based hydrologic models

    NARCIS (Netherlands)

    Clark, Martyn P.; Bierkens, Marc F.P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R.N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-01-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this

  1. Correctness-preserving configuration of business process models

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Dumas, M.; Gottschalk, F.; Hofstede, ter A.H.M.; La Rosa, M.; Mendling, J.; Fiadeiro, J.; Inverardi, P.

    2008-01-01

    Reference process models capture recurrent business operations in a given domain such as procurement or logistics. These models are intended to be configured to fit the requirements of specific organizations or projects, leading to individualized process models that are subsequently used for domain

  2. Process models as tools in forestry research and management

    Science.gov (United States)

    Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox

    2001-01-01

    Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...

  3. Development of process simulation code for reprocessing plant and process analysis for solvent degradation and solvent washing waste

    International Nuclear Information System (INIS)

    Tsukada, Tsuyoshi; Takahashi, Keiki

    1999-01-01

    We developed a process simulation code for an entire nuclear fuel reprocessing plant. The code can be used on a PC. Almost all of the equipment in the reprocessing plant is included in the code, and the mass balance model of each item of equipment is based on the distribution factors of flow-out streams. All models are connected between the outlet flow and the inlet flow according to the process flow sheet. We estimated the amount of DBP from TBP degradation in the entire process by using the developed code. Most of the DBP is generated in the Pu refining process by the effect of α radiation from Pu, which is extracted in a solvent. On the other hand, very little DBP is generated in the U refining process. We therefore propose simplification of the solvent washing process and volume reduction of the alkali washing waste in the U refining process. The first Japanese commercial reprocessing plant is currently under construction at Rokkasho Mura. Recently, for the sake of process simplification, the original process design has been changed. Using our code, we analyzed the original process and the simplified process. According to our results, the volume of alkali waste solution in the low-level liquid treatment process will be reduced by half in the simplified process. (author)

  4. Business Process Modeling Languages Supporting Collaborative Networks

    NARCIS (Netherlands)

    Soleimani Malekan, H.; Afsarmanesh, H.; Hammoudi, S.; Maciaszek, L.A.; Cordeiro, J.; Dietz, J.L.G.

    2013-01-01

    Formalizing the definition of Business Processes (BPs) performed within each enterprise is fundamental for effective deployment of their competencies and capabilities within Collaborative Networks (CN). In our approach, every enterprise in the CN is represented by its set of BPs, so that other

  5. Anode baking process optimization through computer modelling

    Energy Technology Data Exchange (ETDEWEB)

    Wilburn, D.; Lancaster, D.; Crowell, B. [Noranda Aluminum, New Madrid, MO (United States); Ouellet, R.; Jiao, Q. [Noranda Technology Centre, Pointe Claire, PQ (Canada)

    1998-12-31

    Carbon anodes used in aluminum electrolysis are produced in vertical or horizontal type anode baking furnaces. The carbon blocks are formed from petroleum coke aggregate mixed with a coal tar pitch binder. Before the carbon block can be used in a reduction cell it must be heated to pyrolysis. The baking process represents a large portion of the aluminum production cost, and also has a significant effect on anode quality. To ensure that the baking of the anode is complete, it must be heated to about 1100 degrees C. To improve the understanding of the anode baking process and to improve its efficiency, a menu-driven heat, mass and fluid flow simulation tool, called NABSIM (Noranda Anode Baking SIMulation), was developed and calibrated in 1993 and 1994. It has been used since then to evaluate and screen firing practices, and to determine which firing procedure will produce the optimum heat-up rate, final temperature, and soak time, without allowing unburned tar to escape. NABSIM is used as a furnace simulation tool on a daily basis by Noranda plant process engineers and much effort is expended in improving its utility by creating new versions, and the addition of new modules. In the immediate future, efforts will be directed towards optimizing the anode baking process to improve temperature uniformity from pit to pit. 3 refs., 4 figs.

  6. PRODUCT TRIAL PROCESSING (PTP): A MODEL APPROACH ...

    African Journals Online (AJOL)

    Admin

    This study is a theoretical approach to consumer's processing of product trial, and equally explored ... consumer's first usage experience with a company's brand or product that is most important in determining ... product, what it is really marketing is the expected ..... confidence, thus there is a positive relationship between ...

  7. Understanding Modeling Requirements of Unstructured Business Processes

    NARCIS (Netherlands)

    Allah Bukhsh, Zaharah; van Sinderen, Marten J.; Sikkel, Nicolaas; Quartel, Dick

    2017-01-01

    Management of structured business processes is of interest to both academia and industry, where academia focuses on the development of methods and techniques while industry focuses on the development of supporting tools. With the shift from routine to knowledge work, the relevance of management of

  8. Modeling Kanban Processes in Systems Engineering

    Science.gov (United States)

    2012-06-01

    results in lower change traffic and defect incidence. • KSS: Incremental SE, with some design up-front and design continuing throughout development...

  9. Beyond dual-process models: A categorisation of processes underlying intuitive judgement and decision making

    NARCIS (Netherlands)

    Glöckner, A.; Witteman, C.L.M.

    2010-01-01

    Intuitive-automatic processes are crucial for making judgements and decisions. The fascinating complexity of these processes has attracted many decision researchers, prompting them to start investigating intuition empirically and to develop numerous models. Dual-process models assume a clear

  10. Process model simulations of the divergence effect

    Science.gov (United States)

    Anchukaitis, K. J.; Evans, M. N.; D'Arrigo, R. D.; Smerdon, J. E.; Hughes, M. K.; Kaplan, A.; Vaganov, E. A.

    2007-12-01

    We explore the extent to which the Vaganov-Shashkin (VS) model of conifer tree-ring formation can explain evidence for changing relationships between climate and tree growth over recent decades. The VS model is driven by daily environmental forcing (temperature, soil moisture, and solar radiation), and simulates tree-ring growth cell-by-cell as a function of the most limiting environmental control. This simplified representation of tree physiology allows us to examine using a selection of case studies whether instances of divergence may be explained in terms of changes in limiting environmental dependencies or transient climate change. Identification of model-data differences permits further exploration of the effects of tree-ring standardization, atmospheric composition, and additional non-climatic factors.
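    The principle of limiting environmental control that the VS model is built on can be reduced to a minimal sketch: each day's growth is governed by the least favorable of several environmental responses. The ramp thresholds below are illustrative placeholders, not the published VS parameterization.

```python
def ramp(x, lo, hi):
    """Piecewise-linear response: 0 below lo, rising to 1 at hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def daily_growth_rate(temp_c, soil_moisture, radiation):
    """Growth limited by the least favorable of temperature and moisture,
    modulated by radiation (illustrative thresholds, not VS values)."""
    g_t = ramp(temp_c, 5.0, 18.0)         # temperature response
    g_w = ramp(soil_moisture, 0.1, 0.3)   # soil-moisture response
    g_e = ramp(radiation, 50.0, 200.0)    # radiation response
    return min(g_t, g_w) * g_e            # most limiting control dominates
```

    In this picture, a "divergence" event corresponds to a change in which argument of the min() is active, e.g. moisture replacing temperature as the limiting factor over recent decades.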

  11. Modelling Template for the Development of the Process Flowsheet

    DEFF Research Database (Denmark)

    Fedorova, Marina; Gani, Rafiqul

    2015-01-01

    Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and point to candidates...... in connection to other modelling tools within the modelling framework are forming a user-friendly system, which will make the model development process easier and faster and provide the way for unified and consistent model documentation. The modeller can use the template for their specific problem or to extend...... models systematically, efficiently and reliably. In this way, development of products and processes can be faster, cheaper and very efficient. The developed modelling framework involves three main parts: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which...

  12. Influence of Model Simplifications on Excitation Force in Surge for a Floating Foundation for Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Andersen, Morten Thøtt; Hindhede, Dennis; Lauridsen, Jimmy

    2015-01-01

    As offshore wind turbines move towards deeper and more distant sites, the concept of floating foundations is a potential technically and economically attractive alternative to the traditional fixed foundations. Unlike the well-studied monopile, the geometry of a floating foundation is complex and...

  13. Impacts of Modelling Simplifications on Predicted Dispersion of Human Expiratory Droplets

    DEFF Research Database (Denmark)

    Liu, Li; Nielsen, Peter Vilhelm; Xu, Chunwen

    2016-01-01

    simplifying the room air condition into isothermal condition, or neglecting the body plume of the manikin. It will also change the microenvironment completely by simplifying the shape of the human grid into a robot shape. The trajectories of both the exhalation airflows and droplet nuclei are significantly...

  14. Modeling of processing technologies in food industry

    Science.gov (United States)

    Korotkov, V. G.; Sagitov, R. F.; Popov, V. P.; Bachirov, V. D.; Akhmadieva, Z. R.; TSirkaeva, E. A.

    2018-03-01

    Currently, society faces an urgent need to solve problems of nutrition (products with increased nutritional value) and to develop energy-saving technologies for food products. Mathematical modeling of heat and mass transfer of polymer materials in the extruder has been quite successful in recent years. A mathematical description of movement and heat exchange during extrusion of a gluten-protein-starch-containing material, similar to pasta dough in its structure, was taken as the framework for the mathematical model presented in this paper.

  15. Analysis of Using Resources in Business Process Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Vasilecas Olegas

    2014-12-01

    Full Text Available One of the key purposes of Business Process Model and Notation (BPMN) is to support graphical representation of the process model. However, such models lack support for the graphical representation of the resources that processes use during simulation or execution of a process instance. The paper analyzes different methods and their extensions for resource modeling. Further, this article presents a selected set of resource properties that are relevant for resource modeling. The paper proposes an approach that explains how to use the selected set of resource properties to extend process modeling using BPMN and simulation tools. They are based on BPMN, where business process instances use resources concurrently.

  16. A Comprehensive and Harmonized Digital Forensic Investigation Process Model.

    Science.gov (United States)

    Valjarevic, Aleksandar; Venter, Hein S

    2015-11-01

    Performing a digital forensic investigation (DFI) requires a standardized and formalized process. There is currently neither an international standard nor a global, harmonized DFI process (DFIP). The authors studied existing state-of-the-art DFIP models and concluded that there are significant disparities pertaining to the number of processes, the scope, the hierarchical levels, and the concepts applied. This paper proposes a comprehensive model that harmonizes the existing models. An effort was made to incorporate all types of processes proposed by the existing models, including those aimed at achieving digital forensic readiness. The authors introduce a novel class of processes called concurrent processes. This is a novel contribution that should, together with the rest of the model, enable more efficient and effective DFI, while ensuring admissibility of digital evidence. Ultimately, the proposed model is intended to be used for different types of DFI and should lead to standardization. © 2015 American Academy of Forensic Sciences.

  17. Statistical properties of several models of fractional random point processes

    Science.gov (United States)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
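    The reduced-variance criterion used in this record is, in counting terms, the Fano factor: the variance of window counts divided by their mean. As a hedged illustration (using an ordinary Poisson process, not the fractional models of the paper), a value near 1 marks Poissonian statistics, while values below 1 would indicate the nonclassical, sub-Poissonian behavior discussed:

```python
import random

def fano_factor(counts):
    """Variance-to-mean ratio of a list of window counts."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / n
    return var / mean

def poisson_counts(rate, window, trials, seed=0):
    """Counts of a homogeneous Poisson process in repeated fixed windows."""
    random.seed(seed)
    out = []
    for _ in range(trials):
        t, k = 0.0, 0
        while True:
            t += random.expovariate(rate)
            if t > window:
                break
            k += 1
        out.append(k)
    return out

f = fano_factor(poisson_counts(rate=5.0, window=10.0, trials=2000))  # close to 1
```

    A dead-time-modified or otherwise regularized process would push this ratio below 1.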

  18. Process modeling of a HLA research lab

    Science.gov (United States)

    Ribeiro, Bruna G. C.; Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.

    2017-11-01

    Bioinformatics has provided tremendous breakthroughs in the field of molecular biology. All this evolution has generated a large volume of biological data that increasingly requires the use of computing for analysis and storage. The identification of human leukocyte antigen (HLA) genotypes is critical to the success of organ transplants in humans. HLA typing involves not only laboratory tests but also DNA sequencing, with the participation of several professionals responsible for different stages of the process. Thus, the objective of this paper is to map the main steps of HLA typing in a laboratory specialized in performing such procedures, analyzing each process and proposing solutions to speed up these steps while avoiding mistakes.

  19. Business process modeling in the cloud

    OpenAIRE

    Yarahmadi, Aziz

    2014-01-01

    In this study, I have defined the first steps of creating a methodological framework to implement a cloud business application. The term 'cloud' here refers to applying the processing power of a network of computing tools to business solutions in order to move on from legacy systems. I have introduced the hardware and software requirements of cloud computing in business and the procedure by which the business needs will be found, analyzed and recorded as a decision making system. But first we...

  20. Multiscale Modeling and Simulation of Material Processing

    Science.gov (United States)

    2006-07-01

    challenge is how to develop methods that permit simulation of a process with a fewer number of atoms (e.g. 10^6 instead of 10^14 atoms in a cube) or ... In dynamic simulations, the mass and momentum ... involving rapidly varying stress, such as the stress field near a ... significant, as indicated by numerical examples that will follow. We next summarize the coupling scheme with the aid of flowchart Fig. 8. The material

  1. Stochastic Models in the Identification Process

    Czech Academy of Sciences Publication Activity Database

    Slovák, Dalibor; Zvárová, Jana

    2011-01-01

    Roč. 7, č. 1 (2011), s. 44-50 ISSN 1801-5603 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords: identification process * weight-of-evidence formula * coancestry coefficient * beta-binomial sampling formula * DNA mixtures Subject RIV: IN - Informatics, Computer Science http://www.ejbi.eu/images/2011-1/Slovak_en.pdf

  2. How can Product Development Process Modelling be made more useful?

    DEFF Research Database (Denmark)

    Wynn, David C; Maier, Anja; Clarkson, John P

    2010-01-01

    and on the way they are applied. The paper draws upon established principles of cybernetic systems in an attempt to explain the role played by process modelling in operating and improving PD processes. We use this framework to identify eight key factors which influence the utility of modelling in the context...... of use. Further, we indicate how these factors can be interpreted to identify opportunities to improve modelling utility. The paper is organised as follows. Section 2 provides background and motivation for the paper by discussing an example of PD process modelling practice. After highlighting from......, and the process being modelled. Section 5 draws upon established principles of cybernetic systems theory to incorporate this view in an explanation of the role of modelling in PD process operation and improvement. This framework is used to define modelling utility and to progressively identify influences upon it...

  3. Multivariate Product-Shot-noise Cox Point Process Models

    DEFF Research Database (Denmark)

    Jalilian, Abdollah; Guan, Yongtao; Mateu, Jorge

    We introduce a new multivariate product-shot-noise Cox process which is useful for modeling multi-species spatial point patterns with clustering intra-specific interactions and neutral, negative or positive inter-specific interactions. The auto and cross pair correlation functions of the process...... can be obtained in closed analytical forms and approximate simulation of the process is straightforward. We use the proposed process to model interactions within and among five tree species in the Barro Colorado Island plot....

  4. Study on a Process-oriented Knowledge Management Model

    OpenAIRE

    Zhang, Lingling; Li, Jun; Zheng, Xiuyu; Li, Xingsen; Shi, Yong

    2007-01-01

    Now knowledge has become the most important resource of enterprises. Process-oriented knowledge management (POKM) is a new and valuable research field. It may be the most practical method to deal with difficulties in knowledge management. The paper analyzes the background, hypotheses and purposes of POKM, defines process knowledge, and gives a process-oriented knowledge management model. The model integrates knowledge, process, human, and technology. It can improve the decision support capabili...

  5. Simplified Model and Response Analysis for Crankshaft of Air Compressor

    Science.gov (United States)

    Chao-bo, Li; Jing-jun, Lou; Zhen-hai, Zhang

    2017-11-01

    The original model of the crankshaft is simplified to an appropriate degree to balance calculation precision and calculation speed, and the finite element method is then used to analyse the vibration response of the structure. To study the simplification and stress concentration for the crankshaft of an air compressor, this paper compares the calculated and experimental mode frequencies of the air compressor crankshaft before and after the simplification; the vibration response at a reference point under constraint conditions is calculated using the simplified model, and the stress distribution of the original model is calculated. The results show that the error between calculated and experimental mode frequencies is kept below 7%, that the constraint changes the modal density of the system, and that stress concentration appears at the position between the crank arm and the shaft, so this part of the crankshaft should be treated carefully during manufacture.

  6. Measures of quality of process models created in BPMN

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-12-01

    Full Text Available Description, documentation, evaluation and redesign of key processes during their execution should be an essential part of the strategic management of any organization. All organizations live in a dynamically changing environment and must therefore adapt their internal processes to market changes. These processes must be described; a suitable way of describing them is BPMN notation. Once processes are described via BPMN, they should be controlled to ensure their expected quality. A system (which could be automated) based on the mathematical expression of qualitative characteristics of process models (i.e. measures of the quality of process models) can support such process controls. The research team is trying to design such a tool and bring it into practical use. The aim of this publication is a description of the mentioned system, based on measures of the quality of process models, and answers to the associated scientific questions.

  7. Integrated Intelligent Modeling, Design and Control of Crystal Growth Processes

    National Research Council Canada - National Science Library

    Prasad, V

    2000-01-01

    .... This MURI program took an integrated approach towards modeling, design and control of crystal growth processes and in conjunction with growth and characterization experiments developed much better...

  8. Self-exciting point process in modeling earthquake occurrences

    International Nuclear Information System (INIS)

    Pratiwi, H.; Slamet, I.; Respatiwulan; Saputro, D. R. S.

    2017-01-01

    In this paper, we present a procedure for modeling earthquake based on spatial-temporal point process. The magnitude distribution is expressed as truncated exponential and the event frequency is modeled with a spatial-temporal point process that is characterized uniquely by its associated conditional intensity process. The earthquakes can be regarded as point patterns that have a temporal clustering feature so we use self-exciting point process for modeling the conditional intensity function. The choice of main shocks is conducted via window algorithm by Gardner and Knopoff and the model can be fitted by maximum likelihood method for three random variables. (paper)
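    A hedged sketch of the self-exciting idea: the conditional intensity λ(t) = μ + Σ_{t_i&lt;t} α·exp(−β(t − t_i)) jumps after every event and decays back, which produces the temporal clustering of shocks described above. The simulation below uses Ogata-style thinning with illustrative parameters; it is not the fitted earthquake model of the paper.

```python
import math
import random

def simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=50.0, seed=1):
    """Simulate event times of a Hawkes process on [0, horizon) by thinning.

    Between events the intensity only decays, so the intensity evaluated at
    the current time is a valid upper bound for the next candidate.
    """
    random.seed(seed)
    events = []
    t = 0.0
    while True:
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += random.expovariate(lam_bar)           # candidate event time
        if t >= horizon:
            return events
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if random.random() <= lam_t / lam_bar:     # accept with prob lam_t/lam_bar
            events.append(t)
```

    Choosing α/β &lt; 1 keeps the process subcritical, so event cascades eventually die out.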

  9. Modified Invasion Percolation Models for Multiphase Processes

    Energy Technology Data Exchange (ETDEWEB)

    Karpyn, Zuleima [Pennsylvania State Univ., State College, PA (United States)

    2015-01-31

    This project extends current understanding and modeling capabilities of pore-scale multiphase flow physics in porous media. High-resolution X-ray computed tomography imaging experiments are used to investigate structural and surface properties of the medium that influence immiscible displacement. Using experimental and computational tools, we investigate the impact of wetting characteristics, as well as radial and axial loading conditions, on the development of percolation pathways, residual phase trapping and fluid-fluid interfacial areas.

  10. Animated-simulation modeling facilitates clinical-process costing.

    Science.gov (United States)

    Zelman, W N; Glick, N D; Blackmore, C C

    2001-09-01

    Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making.
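    The cost-range point can be made concrete with a small Monte Carlo sketch. The step durations and cost rate below are invented for illustration, not taken from the article: sampling uncertain step durations yields a distribution of process cost, reported as a percentile range rather than a single estimate.

```python
import random

def simulate_process_cost(trials=5000, seed=11):
    """Return a (5th, 95th percentile) cost range for a toy clinical process."""
    random.seed(seed)
    steps = [            # (low, mode, high) duration in minutes -- illustrative
        ("registration", 3.0, 5.0, 10.0),
        ("imaging", 10.0, 15.0, 30.0),
        ("review", 5.0, 8.0, 20.0),
    ]
    rate_per_min = 2.0   # combined staff/equipment cost rate -- illustrative
    costs = []
    for _ in range(trials):
        minutes = sum(random.triangular(lo, hi, mode) for _, lo, mode, hi in steps)
        costs.append(minutes * rate_per_min)
    costs.sort()
    return costs[int(0.05 * trials)], costs[int(0.95 * trials)]
```

    The returned pair is the kind of cost range the article argues is more useful for decision making than a single point estimate.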

  11. Method for modeling social care processes for national information exchange.

    Science.gov (United States)

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target state processes from the perspective of national electronic archive, increased interoperability between systems and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  12. Modeling Resource Hotspots: Critical Linkages and Processes

    Science.gov (United States)

    Daher, B.; Mohtar, R.; Pistikopoulos, E.; McCarl, B. A.; Yang, Y.

    2017-12-01

    Growing demands for interconnected resources emerge in the form of hotspots of varying characteristics. The business-as-usual allocation model cannot address the current, let alone anticipated, complex and highly interconnected resource challenges we face. A new paradigm for resource allocation must be adopted: one that identifies cross-sectoral synergies, moves away from silos, and recognizes and integrates the nexus. Doing so will result in new opportunities for business growth, economic development, and improved social well-being. Solutions and interventions must be multi-faceted; opportunities should be identified with holistic trade-offs in mind. No single solution fits all: different hotspots will require distinct interventions. Hotspots have varying resource constraints, stakeholders, goals and targets. The San Antonio region represents a complex resource hotspot with promising potential: its rapidly growing population, the Eagle Ford shale play, and the major agricultural activity there make it a hotspot with many competing demands. Stakeholders need tools to allow them to knowledgeably address impending resource challenges. This study will identify contemporary WEF nexus questions and critical system interlinkages that will inform the modeling of the tightly interconnected resource systems and stresses, using the San Antonio Region as a base; it will conceptualize a WEF nexus modeling framework and develop assessment criteria to inform integrative planning and decision making.

  13. Model for analyzing decontamination process systems

    International Nuclear Information System (INIS)

    Boykin, R.F.; Rolland, C.W.

    1979-06-01

    Selection of equipment and the design of a new facility, with the aim of minimizing cost and maximizing capacity, is a problem managers face many times in the operations of a manufacturing organization. This paper deals with the analysis of equipment and facility design for a decontamination operation. Discussions of the selection method for the equipment and the development of the facility design criteria are presented, along with insight into the problems encountered in the equipment analysis for a new decontamination facility. The presentation also includes a review of the transition from the old facility into the new facility and the process used to minimize the cost and conveyance problems of the transition.

  14. Modelling of chemical reactions in metallurgical processes

    OpenAIRE

    Kinaci, M. Efe; Lichtenegger, Thomas; Schneiderbauer, Simon

    2017-01-01

    Iron-ore reduction has attracted much interest in the last three decades since it can be considered as a core process in steel industry. The iron-ore is reduced to iron with the use of blast furnace and fluidized bed technologies. To investigate the harsh conditions inside fluidized bed reactors, computational tools can be utilized. One such tool is the CFD-DEM method, in which the gas phase reactions and governing equations are calculated in the Eulerian (CFD) side, whereas the particle reac...

  15. A Queuing Model of the Airport Departure Process

    OpenAIRE

    Balakrishnan, Hamsa; Simaiakis, Ioannis

    2013-01-01

    This paper presents an analytical model of the aircraft departure process at an airport. The modeling procedure includes the estimation of unimpeded taxi-out time distributions and the development of a queuing model of the departure runway system based on the transient analysis of D/E/1 queuing systems. The parameters of the runway service process are estimated using operational data. Using the aircraft pushback schedule as input, the model predicts the expected runway schedule and takeoff ti...
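    A hedged event-driven sketch of the runway-queue idea: deterministic arrivals to a single runway with Erlang-k service, in the spirit of the D/E/1 system mentioned above. Parameter values are invented, not the estimates from operational data.

```python
import random

def simulate_departure_queue(n=1000, interarrival=60.0, k=3,
                             mean_service=50.0, seed=7):
    """Mean wait (seconds) at a single-runway queue with deterministic
    arrivals and Erlang-k distributed service (takeoff) times."""
    random.seed(seed)
    free_at = 0.0        # time at which the runway next becomes free
    total_wait = 0.0
    for i in range(n):
        arrival = i * interarrival
        start = max(arrival, free_at)
        total_wait += start - arrival
        # Erlang-k service: sum of k exponentials with mean mean_service/k each
        service = sum(random.expovariate(k / mean_service) for _ in range(k))
        free_at = start + service
    return total_wait / n
```

    At utilization 50/60 ≈ 0.83 the queue is stable but congested, so the mean taxi-out delay is positive even though arrivals are evenly spaced.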

  16. Modelling energy spot prices by Lévy semistationary processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole; Benth, Fred Espen; Veraart, Almut

    This paper introduces a new modelling framework for energy spot prices based on Lévy semistationary processes. Lévy semistationary processes are special cases of the general class of ambit processes. We provide a detailed analysis of the probabilistic properties of such models and we show how...... they are able to capture many of the stylised facts observed in energy markets. Furthermore, we derive forward prices based on our spot price model. As it turns out, many of the classical spot models can be embedded into our novel modelling framework....

  17. Modeling of Multicomponent Mixture Separation Processes Using Hollow fiber Membrane

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sin-Ah; Kim, Jin-Kuk; Lee, Young Moo; Yeo, Yeong-Koo [Hanyang University, Seoul (Korea, Republic of)

    2015-02-15

    So far, most research on the modeling of membrane separation processes has focused on binary feed mixtures. In actual separation operations, however, binary feeds are rare and most separation processes involve multicomponent feed mixtures. In this work, models for membrane separation processes treating multicomponent feed mixtures are developed. Various model types are investigated and the validity of the proposed models is analysed based on experimental data obtained using hollow-fiber membranes. The proposed separation models show quick convergence and exhibit good tracking performance.

  18. Modeling interdependencies between business and communication processes in hospitals.

    Science.gov (United States)

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, there are no tools available that specialize in modeling information-driven business processes and their consequences for the communication between information-processing tools. Therefore, we present an approach which facilitates the representation and analysis of business processes and the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information-processing infrastructure that hinder the smooth implementation of the business processes.

  19. Evolutionary image simplification for lung nodule classification with convolutional neural networks.

    Science.gov (United States)

    Lückehe, Daniel; von Voigt, Gabriele

    2018-05-29

    Understanding decisions of deep learning techniques is important. Especially in the medical field, the reasons for a decision in a classification task are as crucial as the pure classification results. In this article, we propose a new approach to compute relevant parts of a medical image. Knowing the relevant parts makes it easier to understand decisions. In our approach, a convolutional neural network is employed to learn structures of images of lung nodules. Then, an evolutionary algorithm is applied to compute a simplified version of an unknown image based on the learned structures by the convolutional neural network. In the simplified version, irrelevant parts are removed from the original image. In the results, we show simplified images which allow the observer to focus on the relevant parts. In these images, more than 50% of the pixels are simplified. The simplified pixels do not change the meaning of the images based on the learned structures by the convolutional neural network. An experimental analysis shows the potential of the approach. Besides the examples of simplified images, we analyze the run time development. Simplified images make it easier to focus on relevant parts and to find reasons for a decision. The combination of an evolutionary algorithm employing a learned convolutional neural network is well suited for the simplification task. From a research perspective, it is interesting which areas of the images are simplified and which parts are taken as relevant.
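    The combination described, a fixed learned model plus an evolutionary search over pixel masks, can be sketched in a few lines. The "classifier" below is a toy linear scorer standing in for the convolutional network, and the fitness trades off the number of simplified pixels against any change in the model's output; everything here is an illustrative assumption, not the authors' implementation.

```python
import random

def toy_score(image, weights):
    """Stand-in for a trained model's output: a fixed linear score."""
    return sum(w * p for w, p in zip(weights, image))

def evolve_simplification(image, weights, generations=200, pop=20, seed=3):
    """Evolve a pixel mask that zeroes out as many pixels as possible
    without changing the stand-in model's score."""
    random.seed(seed)
    n = len(image)
    target = toy_score(image, weights)

    def fitness(mask):
        simplified = [0 if m else p for m, p in zip(mask, image)]
        # reward removed pixels, heavily penalize any change in model output
        return sum(mask) - 10.0 * abs(toy_score(simplified, weights) - target)

    # seed one all-False mask so a score-preserving solution always exists
    population = [[False] * n] + [
        [random.random() < 0.1 for _ in range(n)] for _ in range(pop - 1)
    ]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop // 2]
        children = [
            [(not b) if random.random() < 1.0 / n else b for b in p]
            for p in parents
        ]
        population = parents + children          # elitist truncation selection
    best = max(population, key=fitness)
    return [0 if m else p for m, p in zip(best, image)]
```

    With a weight vector that is zero over most pixels, the evolved mask tends to remove exactly the irrelevant pixels, mirroring the paper's observation that over half of the pixels can be simplified without changing the decision.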

  20. Modeling and simulation for process and safeguards system design

    International Nuclear Information System (INIS)

    Gutmacher, R.G.; Kern, E.A.; Duncan, D.R.; Benecke, M.W.

    1983-01-01

    A computer modeling and simulation approach that meets the needs of both the process and safeguards system designers is described. The results have been useful to Westinghouse Hanford Company process designers in optimizing the process scenario and operating scheme of the Secure Automated Fabrication line. The combined process/measurements model will serve as the basis for design of the safeguards system. Integration of the process design and the safeguards system design should result in a smoothly operating process that is easier to safeguard