WorldWideScience

Sample records for model simplification process

  1. Towards simplification of hydrologic modeling: Identification of dominant processes

    Science.gov (United States)

    Markstrom, Steven; Hay, Lauren E.; Clark, Martyn P.

    2016-01-01

    The Precipitation–Runoff Modeling System (PRMS), a distributed-parameter hydrologic model, has been applied to the conterminous US (CONUS). Parameter sensitivity analysis was used to identify: (1) the sensitive input parameters and (2) particular model output variables that could be associated with the dominant hydrologic process(es). Sensitivity values of 35 PRMS calibration parameters were computed using the Fourier amplitude sensitivity test procedure on 110 000 independent hydrologically based spatial modeling units covering the CONUS and then summarized by process (snowmelt, surface runoff, infiltration, soil moisture, evapotranspiration, interflow, baseflow, and runoff) and by model performance statistic (mean, coefficient of variation, and autoregressive lag 1). Identified parameters and processes provide insight into model performance at the location of each unit and allow the modeler to identify the dominant process on the basis of which processes are associated with the most sensitive parameters. The results of this study indicate that: (1) the choice of performance statistic and output variables has a strong influence on parameter sensitivity, (2) the apparent model complexity can be reduced by focusing on those processes that are associated with sensitive parameters and disregarding those that are not, (3) different processes require different numbers of parameters for simulation, and (4) some sensitive parameters influence only one hydrologic process, while others may influence many.
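
    As an illustration of the screening step, the sketch below ranks the parameters of a toy three-parameter runoff model with the Fourier amplitude sensitivity test (FAST). It assumes the SALib Python package; the parameter names, bounds, and the toy output statistic are placeholders, not PRMS values.

        # FAST-based parameter ranking (sketch); the "model" is a stand-in for PRMS.
        import numpy as np
        from SALib.sample import fast_sampler
        from SALib.analyze import fast

        problem = {
            "num_vars": 3,
            "names": ["snow_melt_coef", "soil_moist_max", "gw_recession"],
            "bounds": [[1.0, 8.0], [25.0, 500.0], [0.001, 0.1]],
        }

        X = fast_sampler.sample(problem, 1000)   # FAST sampling design
        Y = np.array([x[0] * 0.5 + np.log(x[1]) - 40.0 * x[2] for x in X])  # toy statistic

        Si = fast.analyze(problem, Y)            # first-order (S1) and total (ST) indices
        for name, s1 in sorted(zip(problem["names"], Si["S1"]), key=lambda t: -t[1]):
            print(f"{name:15s} S1 = {s1:.3f}")

    Parameters with near-zero indices for every output statistic would be candidates for fixing, which is the sense in which sensitivity analysis simplifies the apparent model.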

  2. Solid model design simplification

    Energy Technology Data Exchange (ETDEWEB)

    Ames, A.L.; Rivera, J.J.; Webb, A.J.; Hensinger, D.M.

    1997-12-01

    This paper documents an investigation of approaches to improving the quality of Pro/Engineer-created solid model data for use by downstream applications. The investigation identified a number of sources of problems caused by deficiencies in Pro/Engineer's geometric engine, and developed prototype software capable of detecting many of these problems and guiding users towards simplified, usable models. The prototype software was tested using Sandia production solid models, and provided significant leverage in attacking the simplification problem.

  3. Advanced Spacecraft EM Modelling Based on Geometric Simplification Process and Multi-Methods Simulation

    Science.gov (United States)

    Leman, Samuel; Hoeppe, Frederic

    2016-05-01

    This paper presents the first results of a new generation of ElectroMagnetic (EM) methodology applied to spacecraft system modelling in the low frequency range (where the system's dimensions are of the same order of magnitude as the wavelength). This innovative approach aims at implementing appropriate simplifications of the real system based on the identification of the dominant electrical and geometrical parameters driving the global EM behaviour. One rigorous but expensive simulation is performed to quantify the error introduced by the use of simpler multi-models. If both the simulation speed-up and the quality of the EM response are satisfactory, uncertainty simulation can be performed on the simple-model library, implemented in a flexible and robust Kron's network formalism. This methodology is expected to open up new perspectives concerning fast parametric analysis and deep understanding of system behaviour. It will enable the identification of the main radiated and conducted coupling paths and the sensitive EM parameters, in order to optimize the protections and control the disturbance sources in spacecraft design phases.

  4. Practical simplifications for radioimmunotherapy dosimetric models

    Energy Technology Data Exchange (ETDEWEB)

    Shen, S.; DeNardo, G.L.; O'Donnell, R.T.; Yuan, A.; DeNardo, D.A.; Macey, D.J.; DeNardo, S.J. [Univ. of California, Sacramento, CA (United States). Davis Medical Center

    1999-01-01

    Radiation dosimetry is potentially useful for assessment and prediction of efficacy and toxicity for radionuclide therapy. The usefulness of these dose estimates relies on the establishment of a dose-response model using accurate pharmacokinetic data and a radiation dosimetric model. Due to the complexity in radiation dose estimation, many practical simplifications have been introduced in the dosimetric modeling for clinical trials of radioimmunotherapy. Although research efforts are generally needed to improve the simplifications used at each stage of model development, practical simplifications are often possible for specific applications without significant consequences to the dose-response model. In the development of dosimetric methods for radioimmunotherapy, practical simplifications in the dosimetric models were introduced. This study evaluated the magnitude of uncertainty associated with practical simplifications for: (1) organ mass of the MIRD phantom; (2) radiation contribution from target alone; (3) interpolation of S value; (4) macroscopic tumor uniformity; and (5) fit of tumor pharmacokinetic data.
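
    The sketch below illustrates simplification (3), interpolation of S values, for a single source-target pair as organ mass varies. Because S values are close to a power law in mass, interpolation is done in log-log space. The tabulated numbers are illustrative placeholders, not MIRD data.

        # Log-log interpolation of an S value over organ mass (sketch).
        import numpy as np

        masses = np.array([100.0, 200.0, 400.0, 800.0])        # g, hypothetical grid
        s_values = np.array([4.0e-4, 2.1e-4, 1.1e-4, 5.6e-5])  # mGy/(MBq*s), hypothetical

        def s_interp(mass_g):
            """S value at an arbitrary organ mass via log-log interpolation."""
            return float(np.exp(np.interp(np.log(mass_g), np.log(masses), np.log(s_values))))

        print(s_interp(300.0))   # patient-specific organ mass between table entries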

  5. Terrain Simplification Research in Augmented Scene Modeling

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    As one of the most important tasks in augmented scene modeling, terrain simplification research has gained more and more attention. In this paper, we mainly focus on the point selection problem in terrain simplification using a triangulated irregular network. Based on the analysis and comparison of traditional importance measures for each input point, we put forward a new importance measure based on local entropy. The results demonstrate that the local entropy criterion performs better than the traditional measures. In addition, it can effectively overcome the "short-sight" problem associated with the traditional methods.
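
    A minimal sketch of a local-entropy importance measure follows, here computed on a regular elevation grid with a 3x3 neighbourhood and a fixed histogram binning; the paper's exact definition may differ. Points whose neighbourhood has high entropy (rough, information-rich terrain) are the ones retained during simplification.

        # Local Shannon entropy of the elevation neighbourhood around grid point (i, j).
        import numpy as np

        def local_entropy(dem, i, j, bins=8):
            window = dem[i - 1:i + 2, j - 1:j + 2].ravel()
            hist, _ = np.histogram(window, bins=bins)
            p = hist / hist.sum()
            p = p[p > 0]
            return float(-(p * np.log2(p)).sum())

        dem = np.random.rand(64, 64) * 100.0      # toy terrain, metres
        print(local_entropy(dem, 10, 10))          # high value => keep this point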

  6. A Novel Fast Method for Point-sampled Model Simplification

    Directory of Open Access Journals (Sweden)

    Cao Zhi

    2016-01-01

    A novel fast simplification method for point-sampled statue models is proposed. Simplification for 3D model reconstruction is a hot topic in the field of 3D surface construction, but it is difficult because the point clouds of many 3D models are very large, so running times become very long. In this paper, a two-stage simplification method is proposed. Firstly, a feature-preserving non-uniform simplification method for cloud points is presented, which simplifies the data set to remove redundancy while preserving the features of the model. Secondly, an affinity clustering simplification method is used to classify each point as a sharp point or a simple point. The advantages of affinity propagation clustering are that it passes messages among data points and processes them quickly. Together with re-sampling, it can dramatically reduce the duration of the process while keeping memory cost low. Both theoretical analysis and experimental results show that the proposed method is efficient and that the details of the surface are preserved well.
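
    The clustering stage can be sketched with scikit-learn's affinity propagation, whose exemplar points can serve directly as the re-sampled, simplified point set. The point cloud and damping value below are illustrative; the paper's sharp/simple point classification step is omitted.

        # Affinity propagation on a toy point cloud; exemplars form the simplified set.
        import numpy as np
        from sklearn.cluster import AffinityPropagation

        points = np.random.rand(500, 3)            # toy (x, y, z) samples
        ap = AffinityPropagation(damping=0.7, random_state=0).fit(points)
        simplified = ap.cluster_centers_            # exemplar points survive
        print(f"{len(points)} points reduced to {len(simplified)}")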

  7. In-process modeling method applying blend feature simplification

    Institute of Scientific and Technical Information of China (English)

    唐健钧; 田锡天; 耿俊浩

    2013-01-01

    To build in-process models rapidly, an in-process modeling method was proposed that combines blend feature simplification with boundary extraction of machining features. The boundary of the machining feature is obtained by simplifying the blend features in the machining feature; the boundary is then selected to build the machining volume feature by rotation, sweeping, or stretching. A Boolean subtraction between the previous in-process model and the machining volume feature yields the new in-process model. Edge blends and vertex blends are distinguished when blend features are simplified, and the loss of support surfaces is taken into account. The turning of a typical shaft part is used as an example to analyze how the dimensions of in-process models built from different machining feature boundaries change, and the effectiveness of the proposed method is verified.

  8. Towards Effective Sentence Simplification for Automatic Processing of Biomedical Text

    CERN Document Server

    Jonnalagadda, Siddhartha; Hakenberg, Jorg; Baral, Chitta; Gonzalez, Graciela

    2010-01-01

    The complexity of sentences characteristic to biomedical articles poses a challenge to natural language parsers, which are typically trained on large-scale corpora of non-technical text. We propose a text simplification process, bioSimplify, that seeks to reduce the complexity of sentences in biomedical abstracts in order to improve the performance of syntactic parsers on the processed sentences. Syntactic parsing is typically one of the first steps in a text mining pipeline. Thus, any improvement in performance would have a ripple effect over all processing steps. We evaluated our method using a corpus of biomedical sentences annotated with syntactic links. Our empirical results show an improvement of 2.90% for the Charniak-McClosky parser and of 4.23% for the Link Grammar parser when processing simplified sentences rather than the original sentences in the corpus.

  9. Model simplification and optimization of a passive wind turbine generator

    OpenAIRE

    Sareni, Bruno; Abdelli, Abdenour; Roboam, Xavier; Tran, Duc-Hoan

    2009-01-01

    In this paper, the design of a "low cost full passive structure" wind turbine system without active electronic parts (power and control) is investigated. The efficiency of such a device can be obtained only if the design parameters are mutually adapted through an optimization design approach. For this purpose, sizing and simulation models are developed to characterize the behavior and the efficiency of the wind turbine system. A model simplification approach is presented...

  10. Study of Simplification of Markov Model for Analyzing System Dependability

    Energy Technology Data Exchange (ETDEWEB)

    Son, Gwang Seop; Kim, Dong Hoon; Choi, Jong Gyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    In this paper, we introduce a simplification methodology for Markov models used in system dependability analysis, based on the concept of a system failure rate. The system failure rate is the probability that the system fails or becomes unavailable at a given time, given that it was functioning up to that time. Using this parameter, the Markov model of a subsystem can be replaced by its system failure rate, and only this parameter then needs to be considered in the Markov model of the whole system. We propose this method to simplify Markov models of complex system architectures: we define the system failure rate and, using this parameter, the Markov model of the system can be simplified.
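
    One concrete way to collapse a subsystem chain into a single rate is to compute its mean time to failure (MTTF) and use the reciprocal as an equivalent failure rate, assuming the subsystem is close to memoryless at the system level. The sketch below does this for a hypothetical two-state repairable subsystem; the rates are illustrative, not from the paper.

        # Equivalent failure rate of a subsystem CTMC via mean time to absorption.
        import numpy as np

        lam, mu, gamma = 1e-4, 1e-2, 5e-4            # hypothetical rates (1/h)
        # Generator restricted to the transient (working) states; absorption = failure.
        Q_T = np.array([[-lam,  lam],
                        [  mu, -(mu + gamma)]])

        mttf = np.linalg.solve(Q_T, -np.ones(2))[0]  # E[time to failure] from state 0
        lambda_eq = 1.0 / mttf                       # rate used in the system-level model
        print(f"lambda_eq = {lambda_eq:.3e} per hour")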

  11. A Memory Insensitive Technique for Large Model Simplification

    Energy Technology Data Exchange (ETDEWEB)

    Lindstrom, P; Silva, C

    2001-08-07

    In this paper we propose three simple, but significant improvements to the OoCS (Out-of-Core Simplification) algorithm of Lindstrom [20] which increase the quality of approximations and extend the applicability of the algorithm to an even larger class of computer systems. The original OoCS algorithm has memory complexity that depends on the size of the output mesh, but no dependency on the size of the input mesh. That is, it can be used to simplify meshes of arbitrarily large size, but the complexity of the output mesh is limited by the amount of memory available. Our first contribution is a version of OoCS that removes the dependency of having enough memory to hold (even) the simplified mesh. With our new algorithm, the whole process is made essentially independent of the available memory on the host computer. Our new technique uses disk instead of main memory, but it is carefully designed to avoid costly random accesses. Our two other contributions improve the quality of the approximations generated by OoCS. We propose a scheme for preserving surface boundaries which does not use connectivity information, and a scheme for constraining the position of the "representative vertex" of a grid cell to an optimal position inside the cell.
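
    The core of OoCS is uniform vertex clustering: vertices are quantized to a grid and each occupied cell keeps one representative vertex. The sketch below uses the cell centroid as the representative; Lindstrom's algorithm instead places it by minimizing a quadric error, which is omitted here for brevity.

        # Grid-based vertex clustering (sketch of the OoCS core idea).
        import numpy as np

        def cluster_vertices(verts, cell):
            keys = np.floor(verts / cell).astype(np.int64)   # grid cell of each vertex
            _, inv, counts = np.unique(keys, axis=0,
                                       return_inverse=True, return_counts=True)
            reps = np.zeros((counts.size, 3))
            np.add.at(reps, inv, verts)                      # accumulate per cell
            return reps / counts[:, None], inv               # centroids, vertex->cell map

        verts = np.random.rand(100000, 3)                    # toy vertex soup
        reps, vmap = cluster_vertices(verts, cell=0.05)
        print(f"{len(verts)} vertices -> {len(reps)} representatives")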

  12. Simplification of Process Integration Studies in Intermediate Size Industries

    DEFF Research Database (Denmark)

    Dalsgård, Henrik; Petersen, P. M.; Qvale, Einar Bjørn

    2002-01-01

    It can be argued that the largest potential for energy savings based on process integration is in the intermediate size industry. But this is also the industrial scale in which it is most difficult to make the introduction of energy saving measures economically interesting. The reasons ... associated with a given process integration study in an intermediate size industry. This is based on the observation that the systems that eventually result from a process integration project and that are economically and operationally most interesting are also quite simple. Four steps that may be used ... separately or in series ahead of or simultaneously with the conventional process integration procedures (for example, the pinch point method) are described and are applied to an industrial case study. It might be feared that the use of preselections and groupings would limit the "freedom of movement...

  13. A New Skeleton Feature Extraction Method for Terrain Model Using Profile Recognition and Morphological Simplification

    Directory of Open Access Journals (Sweden)

    Huijie Zhang

    2013-01-01

    It is always difficult to preserve rings and main trunk lines in real engineering applications of feature extraction for terrain models. In this paper, a new skeleton feature extraction method is proposed to solve these problems, which puts forward a simplification algorithm based on morphological theory to eliminate the noise points among the target points produced by classical profile recognition. Noise points are a key factor influencing the accuracy and efficiency of feature extraction. Our method connects the optimized feature point subset after morphological simplification; therefore, the efficiency of ring processing and pruning is improved markedly, and the accuracy is enhanced without the negative effect of noisy points. An outbranching concept is defined, and related algorithms are proposed to extract sufficiently long trunks, capable of being consistent with the real terrain skeleton. All algorithms are tested on real experimental data, including GTOPO30 and benchmark data provided by PPA, to verify the performance and accuracy of our method. The results show that our method outperforms PPA as a whole.

  14. A motion retargeting algorithm based on model simplification

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    A new motion retargeting algorithm is presented, which adapts the motion capture data to a new character. To make the resulting motion realistic, the physically-based optimization method is adopted. However, the optimization process is difficult to converge to the optimal value because of high complexity of the physical human model. In order to address this problem, an appropriate simplified model automatically determined by a motion analysis technique is utilized, and then motion retargeting with this simplified model as an intermediate agent is implemented. The entire motion retargeting algorithm involves three steps of nonlinearly constrained optimization: forward retargeting, motion scaling and inverse retargeting. Experimental results show the validity of this algorithm.

  15. Investigation of Model Simplification and Its Influence on the Accuracy in FEM Magnetic Calculations of Gearless Drives

    DEFF Research Database (Denmark)

    Andersen, Søren Bøgh; Santos, Ilmar F.; Fuerst, Axel

    2012-01-01

    Finite-element models of electrical motors often become very complex and time-consuming to evaluate when taking into account every little detail. There is therefore a need for simplifications to make the models computable within a reasonable time frame. This is especially important in an optimization process, as many iterations usually have to be performed. The focus of this work is an investigation of the electromagnetic part of a gearless mill drive based on real system data, which is part of a larger project building a multiphysics model including electromagnetic, thermal, and structural...

  16. Surface Simplification of 3D Animation Models Using Robust Homogeneous Coordinate Transformation

    Directory of Open Access Journals (Sweden)

    Juin-Ling Tseng

    2014-01-01

    The goal of 3D surface simplification is to reduce the storage cost of 3D models. A 3D animation model typically consists of several 3D models. Therefore, to ensure that animation models are realistic, numerous triangles are often required. However, animation models that have a high storage cost have a substantial computational cost. Hence, surface simplification methods are adopted to reduce the number of triangles and the computational cost of 3D models. Quadric error metrics (QEM) has recently been identified as one of the most effective methods for simplifying static models. To simplify animation models by using QEM, Mohr and Gleicher summed the QEM of all frames. However, homogeneous coordinate problems cannot be considered completely by using QEM. To resolve this problem, this paper proposes a robust homogeneous coordinate transformation that improves the animation simplification method proposed by Mohr and Gleicher. In this study, the root mean square errors of the proposed method were compared with those of the method proposed by Mohr and Gleicher, and the experimental results indicated that the proposed approach can preserve more contour features than Mohr's method at the same simplification ratio.
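
    For reference, the summed-QEM baseline can be sketched as follows: each vertex accumulates the fundamental quadrics of its incident triangle planes in every frame, and the per-frame quadrics are added so that a single collapse cost covers the whole animation. The two toy planes below stand in for real per-frame geometry.

        # Summed quadric error metric across animation frames (sketch).
        import numpy as np

        def plane_quadric(a, b, c, d):
            p = np.array([a, b, c, d])      # plane ax + by + cz + d = 0, (a,b,c) unit
            return np.outer(p, p)           # 4x4 fundamental quadric

        def quadric_error(Q, v):
            vh = np.append(v, 1.0)
            return float(vh @ Q @ vh)       # error of placing the vertex at v

        Q_frame1 = plane_quadric(0.0, 0.0, 1.0, -0.2)
        Q_frame2 = plane_quadric(0.0, 1.0, 0.0, 0.1)
        Q_anim = Q_frame1 + Q_frame2        # Mohr-Gleicher: sum over frames
        print(quadric_error(Q_anim, np.array([0.0, 0.0, 0.0])))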

  17. Modeling gene regulatory networks: A network simplification algorithm

    Science.gov (United States)

    Ferreira, Luiz Henrique O.; de Castro, Maria Clicia S.; da Silva, Fabricio A. B.

    2016-12-01

    Boolean networks have been used for some time to model Gene Regulatory Networks (GRNs), which describe cell functions. Those models can help biologists to make predictions, prognoses and even choose specialized treatments when a disturbance of the GRN leads to a disease condition. However, the amount of information related to a GRN can be huge, making the task of inferring its Boolean network representation quite a challenge. The method shown here takes into account information about the interactome to build a network, where each node represents a protein, and uses the entropy of each node as a key to reduce the size of the network, allowing the subsequent inference process to focus only on the main protein hubs, the ones with the most potential to interfere in overall network behavior.
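
    As a sketch of the reduction step, node entropy can be computed from binarized expression samples: nodes whose states barely vary carry little information and can be pruned before inference. The data and threshold below are illustrative.

        # Shannon entropy of each node over binary state samples; prune low-entropy nodes.
        import numpy as np

        def node_entropy(states):
            """states: (samples, nodes) binary matrix."""
            p1 = states.mean(axis=0)
            p = np.clip(np.stack([1.0 - p1, p1]), 1e-12, 1.0)
            return -(p * np.log2(p)).sum(axis=0)

        states = (np.random.rand(200, 50) > 0.5).astype(int)   # toy expression data
        H = node_entropy(states)
        kept = np.where(H > 0.9)[0]          # candidate hubs for network inference
        print(f"kept {kept.size} of {H.size} nodes")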

  18. Application-oriented simplification of actuation mechanism and physical model for ionic polymer-metal composites

    Science.gov (United States)

    Zhu, Zicai; Wang, Yanjie; Liu, Yanfa; Asaka, Kinji; Sun, Xiaofei; Chang, Longfei; Lu, Pin

    2016-07-01

    Water-containing ionic polymer-metal composites (IPMCs) show complex deformation properties that vary with water content. In order to develop a simple application-oriented model for engineering use, actuation mechanisms and model equations should be simplified as necessary. Beginning from our previous comprehensive multi-physical model of an IPMC actuator, numerical analysis was performed to obtain the main factors influencing the bending deformation and the corresponding simplified model. In this paper, three aspects are mainly concerned. (1) Regarding the mass transport process, the diffusion caused by the concentration gradient mainly influences the concentrations of cation and water at the two electrode boundaries. (2) By specifying the transport components as hydrated cations and free water in the model, at the cathode, the hydrated cation concentration profile is flatter, whereas the concentrations of both free water and total water show drastic changes. In general, the two influence the redistribution of cation and water but have little impact on deformation prediction. Thus, they can be ignored in the simplification. (3) An extended osmotic pressure is proposed to cover all eigenstresses simply with an effective osmotic coefficient. Combining these with a few other linearization methods, a simplified model has been obtained by sacrificing prediction precision on the transport process. Furthermore, the improved model has been verified by fitting IPMC deformation as it evolves with water content. This shows that the simplified model is able to predict the complex deformations of IPMCs.

  19. Simplification of a pharmacokinetic model for red blood cell methotrexate disposition.

    Science.gov (United States)

    Pan, Shan; Korell, Julia; Stamp, Lisa K; Duffull, Stephen B

    2015-12-01

    A pharmacokinetic (PK) model is available for describing the time course of the concentrations of methotrexate (MTX or MTXGlu1) and its active polyglutamated metabolites (MTXGlu2-5) in red blood cells (RBCs). In this study, we aimed to simplify the MTX PK model and to optimise the blood sampling schedules for use in future studies. A proper lumping technique was used to simplify the original MTX RBC PK model. The sum of predicted RBC MTXGlu3-5 concentrations in both the simplified and original models was compared. The sampling schedules for MTXGlu3-5 or all MTX polyglutamates in RBCs were optimised using the Population OPTimal design (POPT) software. The MTX RBC PK model was simplified into a three-state model. The maximum of the absolute value of relative difference in the sum of predicted RBC MTXGlu3-5 concentrations over time was 6.3 %. A five blood sample design was identified for estimating parameters of the simplified model. This study illustrates the application of model simplification processes to an existing model for MTX RBC PK. The same techniques illustrated in our study may be adopted by other studies with similar interest.
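
    A sketch of proper lumping for a linear compartmental model: given a lumping matrix L that assigns each original state to exactly one lumped state, the reduced rate matrix is A_hat = L A L+, with L+ a generalized inverse of L. The three-state rate matrix below is illustrative, not the published MTX polyglutamate values.

        # Proper lumping of a linear PK system (sketch).
        import numpy as np

        A = np.array([[-0.5,  0.0,  0.0],
                      [ 0.5, -0.3,  0.2],
                      [ 0.0,  0.3, -0.2]])   # toy first-order rate matrix

        L = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 1.0]])       # lump states 2 and 3 into one state

        A_hat = L @ A @ np.linalg.pinv(L)     # reduced (2-state) rate matrix
        print(A_hat)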

  20. A new model for the simplification of particle counting data

    Directory of Open Access Journals (Sweden)

    M. F. Fadal

    2012-06-01

    This paper proposes a three-parameter mathematical model to describe the particle size distribution in a water sample. The proposed model offers some conceptual advantages over two other models reported on previously, and also provides a better fit to the particle counting data obtained from 321 water samples taken over three years at a large South African drinking water supplier. Using the data from raw water samples taken from a moderately turbid, large surface impoundment, as well as samples from the same water after treatment, typical ranges of the model parameters are presented for both raw and treated water. Once calibrated, the model allows the calculation and comparison of total particle number and volume over any randomly selected size interval of interest.

  1. Simplification of physics-based electrochemical model for lithium ion battery on electric vehicle. Part I: Diffusion simplification and single particle model

    Science.gov (United States)

    Han, Xuebing; Ouyang, Minggao; Lu, Languang; Li, Jianqiu

    2015-03-01

    Lithium ion batteries are now widely used in electric vehicles (EVs), where battery modeling and state estimation are of great significance. A rigorous physics-based electrochemical model is too complicated for on-line simulation in a vehicle. In this work, a simplification of the physics-based lithium ion battery model for application in battery management systems (BMS) on real electric vehicles is proposed. An approximate method for solving the solid-phase diffusion and electrolyte concentration distribution problems is introduced. The approximate result is very close to the rigorous model but requires far fewer computations. An extended single particle model is founded on these approximated results, and an on-line state of charge (SOC) estimation algorithm using the extended Kalman filter with this single particle model is discussed. This SOC estimation algorithm could be used in the BMS of a real vehicle.
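
    A minimal sketch of the estimation idea follows: a one-state EKF tracking SOC, with coulomb counting as the state equation and a hypothetical linear OCV curve plus ohmic resistance as the measurement model. This toy cell model stands in for the paper's extended single particle model; all values are illustrative.

        # One-state EKF for SOC (sketch); the cell model is a toy stand-in.
        import numpy as np

        dt, Q_cap, R0 = 1.0, 2.0 * 3600.0, 0.01   # s, A*s capacity, ohm
        ocv = lambda soc: 3.0 + 1.2 * soc          # hypothetical OCV(SOC) curve
        docv = 1.2                                 # dOCV/dSOC of that curve

        x, P = 0.9, 1e-2                           # initial SOC estimate, variance
        Qn, Rn = 1e-7, 1e-3                        # process / measurement noise
        for k in range(100):
            i_k = 1.0                              # discharge current (A)
            x -= dt * i_k / Q_cap                  # predict: coulomb counting
            P += Qn
            soc_true = 0.9 - dt * (k + 1) / Q_cap  # synthetic "truth" for the demo
            v_meas = ocv(soc_true) - R0 * i_k      # measured terminal voltage
            K = P * docv / (docv * P * docv + Rn)  # Kalman gain
            x += K * (v_meas - (ocv(x) - R0 * i_k))
            P *= (1.0 - K * docv)
        print(f"SOC estimate after 100 s: {x:.4f}")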

  2. Effects of model layer simplification using composite hydraulic properties

    Science.gov (United States)

    Kuniansky, Eve L.; Sepulveda, Nicasio; Elango, Lakshmanan

    2011-01-01

    Groundwater provides much of the fresh drinking water to more than 1.5 billion people in the world (Clarke et al., 1996), and in the United States more than 50 percent of citizens rely on groundwater for drinking water (Solley et al., 1998). As aquifer systems are developed for water supply, the hydrologic system is changed. Water pumped from the aquifer system initially can come from some combination of inducing more recharge, water permanently removed from storage, and decreased groundwater discharge. Once a new equilibrium is achieved, all of the pumpage must come from induced recharge and decreased discharge (Alley et al., 1999). Further development of groundwater resources may result in reductions of surface water runoff and base flows. Competing demands for groundwater resources require good management. Adequate data to characterize the aquifers and confining units of the system, such as hydrologic boundaries, groundwater levels, streamflow, groundwater pumping, and climatic data for recharge estimation, must be collected in order to quantify the effects of groundwater withdrawals on wetlands, streams, and lakes. Once collected, three-dimensional (3D) groundwater flow models can be developed, calibrated, and used as a tool for groundwater management. The main hydraulic parameters that comprise a regional or subregional model of an aquifer system are the hydraulic conductivity and storage properties of the aquifers and confining units (hydrogeologic units) of the system. Many 3D groundwater flow models used to help assess groundwater/surface-water interactions require calculating "effective" or composite hydraulic properties of multilayered lithologic units within a hydrogeologic unit. The calculation of composite hydraulic properties stems from the need to characterize groundwater flow using coarse model layering in order to reduce simulation times while still representing the flow through the system accurately. The accuracy of flow models with...

  3. Inductive Voltage Adder Network Analysis and Model Simplification

    Science.gov (United States)

    2007-06-01

    (No abstract available: the record contains only report-documentation-page fragments, naming Brookhaven National Laboratory, Upton, NY 11973 USA as the performing organization, and citations on inductive adder kicker pulsers for DARHT-II and a proton radiography system.)

  4. Phonological simplifications, apraxia of speech and the interaction between phonological and phonetic processing.

    Science.gov (United States)

    Galluzzi, Claudia; Bureca, Ivana; Guariglia, Cecilia; Romani, Cristina

    2015-05-01

    Research on aphasia has struggled to identify apraxia of speech (AoS) as an independent deficit affecting a processing level separate from phonological assembly and motor implementation. This is because AoS is characterized by both phonological and phonetic errors and, therefore, can be interpreted as a combination of deficits at the phonological and the motoric level rather than as an independent impairment. We apply novel psycholinguistic analyses to the perceptually phonological errors made by 24 Italian aphasic patients. We show that only patients with a relatively high rate (>10%) of phonetic errors make sound errors which simplify the phonology of the target. Moreover, simplifications are strongly associated with other variables indicative of articulatory difficulties, such as a predominance of errors on consonants rather than vowels, but not with other measures, such as the rate of words reproduced correctly or the rate of lexical errors. These results indicate that sound errors cannot arise at a single phonological level because they differ across patients. Instead, the different patterns: (1) provide evidence for separate impairments and for the existence of a level of articulatory planning/programming intermediate between phonological selection and motor implementation; (2) validate AoS as an independent impairment at this level, characterized by phonetic errors and phonological simplifications; and (3) support the claim that linguistic principles of complexity have an articulatory basis, since they only apply in patients with associated articulatory difficulties.

  5. Effects of model layer simplification using composite hydraulic properties

    Science.gov (United States)

    Sepulveda, Nicasio; Kuniansky, Eve L.

    2010-01-01

    The effects of simplifying hydraulic property layering within an unconfined aquifer and the underlying confining unit were assessed. The hydraulic properties of lithologic units within the unconfined aquifer and confining unit were computed by analyzing the aquifer-test data using radial, axisymmetric two-dimensional (2D) flow. Time-varying recharge to the unconfined aquifer and pumping from the confined Upper Floridan aquifer (UFA) were simulated using 3D flow. Conceptual flow models were developed by gradually reducing the number of lithologic units in the unconfined aquifer and confining unit and calculating composite hydraulic properties for the simplified lithologic units. Composite hydraulic properties were calculated using either thickness-weighted averages or inverse modeling with regression-based parameter estimation. No significant residuals were simulated when all lithologic units comprising the unconfined aquifer were simulated as one layer. The largest residuals occurred when the unconfined aquifer and confining unit were aggregated into a single layer (quasi-3D), with residuals over 100% for the leakage rates to the confined aquifer and the heads in the confining unit. Residuals increased with contrasts in vertical hydraulic conductivity between the unconfined aquifer and confining unit. Residuals also increased when the constant-head boundary at the bottom of the Upper Floridan aquifer was replaced with a no-flow boundary.

  6. Minor actinide separation: simplification of the DIAMEX-SANEX strategy by means of novel SANEX processes

    Energy Technology Data Exchange (ETDEWEB)

    Geist, A. [Karlsruher Institut fuer Technologie - KIT, INE, P. O. Box 3640, 76021 Karlsruhe (Germany); Modolo, G.; Wilden, A.; Kaufholz, P. [Forschungszentrum Juelich GmbH, IEK-6, Juelich (Germany)

    2013-07-01

    The separation of An(III) from PUREX raffinate has previously been demonstrated by applying a DIAMEX process (i.e., co-extraction of An(III) and Ln(III) from HAR) followed by a SANEX process (i.e., selective extraction of An(III) from the DIAMEX product containing An(III) + Ln(III)). In line with process intensification issues, more compact processes have been developed: recently, a 1c-SANEX process test was successfully performed, directly extracting An(III) from PUREX HAR. More recently, a new i-SANEX process was successfully tested. This process is based on the co-extraction of An(III) + Ln(III) into a TODGA solvent, followed by a selective back-extraction of An(III) by a water-soluble complexing agent, in this case SO3-Ph-BTP. In both cases, good recoveries were achieved, and very pure product solutions were obtained. However, both 1c-SANEX and i-SANEX used non-CHON chemicals. Nevertheless, these processes are a simplification of the DIAMEX + SANEX process, as only one solvent is used. Finally, the new i-SANEX process is the most compact process. (authors)

  7. Simplifications in modelling of dynamical response of coupled electro-mechanical system

    Science.gov (United States)

    Darula, Radoslav; Sorokin, Sergey

    2016-12-01

    The choice of the most suitable model of an electro-mechanical system depends on many variables, such as the scale of the system, the type and frequency range of its operation, or power requirements. The article focuses on a model of the electromagnetic element used in a passive regime (no feedback loops are assumed). A general lumped-parameter model (a conventional mass-spring-damper system coupled to an electric circuit consisting of a resistance, an inductance and a capacitance) is compared with its simplified version, where the full RLC circuit is replaced by its RL simplification, i.e. the capacitance of the electric system is neglected and just the inductance and the resistance are considered. From the comparison of the dynamical responses of these systems, the range of applicability of the simplified model is assessed for free as well as forced vibration.
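
    The flavour of such a comparison can be sketched by checking where the series RL impedance stays close to the full series RLC impedance. The element values below are illustrative, and the actual paper compares coupled mechanical-electrical responses rather than a bare circuit.

        # Compare series RLC impedance with its RL simplification over frequency.
        import numpy as np

        R, L, C = 10.0, 1e-2, 1e-6                   # ohm, henry, farad (illustrative)
        w = 2 * np.pi * np.logspace(1, 6, 400)       # rad/s

        Z_rlc = R + 1j * w * L + 1.0 / (1j * w * C)  # full circuit
        Z_rl = R + 1j * w * L                        # capacitance neglected

        rel_err = np.abs(np.abs(Z_rlc) - np.abs(Z_rl)) / np.abs(Z_rlc)
        ok = w > 2 * np.pi * 1e4                     # well above the capacitive corner
        print(f"max relative error above 10 kHz: {rel_err[ok].max():.3f}")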

  8. Simplification of the Flux Function for a Higher-order Gas-kinetic Evolution Model

    CERN Document Server

    Zhou, Guangzhao; Liu, Feng

    2016-01-01

    The higher-order gas-kinetic scheme for solving the Navier-Stokes equations has been studied in recent years. In addition to the use of higher-order reconstruction techniques, many terms are used in the Taylor expansion of the gas distribution functions. Therefore, a large number of coefficients need to be determined in the calculation of the time evolution of the gas distribution function at cell interfaces. As a consequence, the higher-order flux function takes much more computational time than that of a second-order gas-kinetic scheme. This paper aims to simplify the evolution model by two steps. Firstly, the coefficients related to the higher-order spatial and temporal derivatives of a distribution function are redefined to reduce the computational cost. Secondly, based on the physical analysis, some terms can be removed without loss of accuracy. Through the simplifications, the computational efficiency of the higher-order scheme is increased significantly. In addition, a self-adaptive numerical viscosity...

  9. A "Kane's Dynamics" Model for the Active Rack Isolation System Part Two: Nonlinear Model Development, Verification, and Simplification

    Science.gov (United States)

    Beech, G. S.; Hampton, R. D.; Rupert, J. K.

    2004-01-01

    Many microgravity space-science experiments require vibratory acceleration levels that are unachievable without active isolation. The Boeing Corporation's active rack isolation system (ARIS) employs a novel combination of magnetic actuation and mechanical linkages to address these isolation requirements on the International Space Station. Effective model-based vibration isolation requires: (1) an isolation device, (2) an adequate dynamic (i.e., mathematical) model of that isolator, and (3) a suitable, corresponding controller. This Technical Memorandum documents the validation of a high-fidelity dynamic model of ARIS. The verification of this dynamics model was achieved by utilizing two commercial off-the-shelf (COTS) software tools: Deneb's ENVISION (registered trademark) and Online Dynamics' Autolev (trademark). ENVISION is a robotics software package developed for the automotive industry that employs three-dimensional computer-aided design models to facilitate both forward and inverse kinematics analyses. Autolev is a DOS-based interpreter designed, in general, to solve vector-based mathematical problems and, specifically, to solve dynamics problems using Kane's method. The simplification of this model was achieved using the small-angle theorem for the joint angle of the ARIS actuators. This simplification has a profound effect on the overall complexity of the model while still yielding a closed-form solution easily employed using COTS control hardware.

  10. [Study on simplification of extraction kinetics model and adaptability of total flavonoids model of Scutellariae radix].

    Science.gov (United States)

    Chen, Yang; Zhang, Jin; Ni, Jian; Dong, Xiao-Xu; Xu, Meng-Jie; Dou, Hao-Ran; Shen, Ming-Rui; Yang, Bo-Di; Fu, Jing

    2014-01-01

    Because of the irregular shapes of Chinese herbal pieces, we simplified a previously derived general extraction kinetics model for traditional Chinese medicines by folding the particle diameter, which is hard to determine directly, into the final parameter "a". Removing the direct determination of particle diameters increases the accuracy of the model, expands its scope of application, and brings it closer to actual production conditions. A simplified model was thus established, and the corresponding experimental methods and data-processing methods were determined. With total flavonoids in Scutellariae Radix as the index, we studied the adaptability of the model for total flavonoids extracted from Scutellariae Radix by water decoction. The results showed a good linear correlation among the natural logarithm of the mass concentration of total flavonoids, time, and the natural logarithm of the solvent multiple. Through calculation and fitting, a kinetic model of extracting total flavonoids from Scutellariae Radix by water decoction was established and verified, with a good degree of fit and deviations within the range of industrial production requirements. This indicates that the model established by this method has good adaptability.

  11. Infrastructure Area Simplification Plan

    CERN Document Server

    Field, L.

    2011-01-01

    The infrastructure area simplification plan was presented at the 3rd EMI All Hands Meeting in Padova. This plan only affects the information and accounting systems as the other areas are new in EMI and hence do not require simplification.

  12. Regulatory forum opinion piece: Clarification and simplification of the pathology peer review documentation process.

    Science.gov (United States)

    Tomlinson, Michael J; Leininger, Joel R

    2014-01-01

    The transparency and documentation of the peer review process have been discussed recently. Our position is that transparency is best achieved when peer review is a collaborative process, in which both parties are open-minded but both also realize that the study pathologist retains complete control over the findings (raw data) and over the content of the pathology report. For these reasons, we believe that histopathology raw data should be defined as the observations made by the study pathologist (in printed and/or electronic formats) rather than as the tissue slides, as recommended by the Organisation for Economic Co-operation and Development (OECD). Also, because the study pathologist retains control over the histopathology raw data, any notes or tabulations of findings made by the study pathologist and peer review pathologist during the peer review are interim notes and should not be included as an appendix to the pathology report, though they may be retained if desired, as currently recommended. Because the histopathology raw data have not been created until completion of the peer review, the performance of a peer review should be documented in the study report, as currently recommended, but it need not be a GLP-compliant process.

  13. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification.

    Science.gov (United States)

    Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms, using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer (REST)/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. Top balanced F-scores for gene, chemical and...

  14. About Bifurcational Parametric Simplification

    CERN Document Server

    Gol'dshtein, V; Yablonsky, G

    2015-01-01

    A concept of "critical" simplification was proposed by Yablonsky and Lazman in 1996 for the oxidation of carbon monoxide over a platinum catalyst using a Langmuir-Hinshelwood mechanism. The main observation was a simplification of the mechanism at ignition and extinction points. The critical simplification is an example of a much more general phenomenon that we call a bifurcational parametric simplification. Ignition and extinction points are points of equilibrium multiplicity bifurcations, i.e., they are points of a corresponding bifurcation set for parameters. Any bifurcation produces a dependence between system parameters. This is a mathematical explanation and/or justification of the "parametric simplification". It leads us to a conjecture that "maximal bifurcational parametric simplification" corresponds to the "maximal bifurcation complexity." This conjecture can have practical applications for experimental study, because at points of "maximal bifurcation complexity" the number of independent sys...

  15. Impact of model structure simplifications on the performance of a distributed physically-based soil erosion model at the hillslope scale

    Science.gov (United States)

    Cea, Luis; Legoût, Cédric; Grangeon, Thomas; Nord, Guillaume

    2016-04-01

    In order to make the use of physically based soil erosion models affordable in field applications, it is often necessary to reduce the number of parameters or adapt the calibration method to the available data sets. In this study we analyse how the performance and calibration of a distributed event-based soil erosion model at the hillslope scale are affected by different simplifications of the parameterisations used to compute the production of suspended sediment by rainfall and runoff. Six modelling scenarios of different complexity are used to evaluate the temporal variability of the sedimentograph at the outlet of a 60 m long cultivated hillslope. The six scenarios are calibrated within the GLUE framework in order to account for parameter uncertainty, and their performance is evaluated against experimental data registered during five storm events. The NSE, PBIAS and coverage performance ratios show that the sedimentary response of the hillslope in terms of mass flux of eroded soil can be efficiently captured by a model structure including only two soil erodibility parameters, which control the rainfall and runoff production of suspended sediment. Increasing the number of parameters makes the calibration process more complex without noticeably increasing the predictive capability of the model.
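
    A sketch of the GLUE step for the two-parameter scenario is given below: sample the erodibility parameters, score each simulation with the Nash-Sutcliffe efficiency (NSE), and keep the behavioural sets above a threshold. The sediment-flux "model" and data are toy stand-ins, not the paper's model.

        # GLUE-style behavioural sampling with NSE (sketch).
        import numpy as np

        def nse(obs, sim):
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 50)
        obs = 5.0 * np.exp(-3.0 * t) + rng.normal(0.0, 0.1, t.size)  # toy sedimentograph

        behavioural = []
        for _ in range(5000):
            k_rain = rng.uniform(1.0, 10.0)          # rainfall erodibility (toy)
            k_flow = rng.uniform(0.5, 6.0)           # runoff erodibility (toy)
            sim = k_rain * np.exp(-k_flow * t)       # toy flux model
            if nse(obs, sim) > 0.7:                  # behavioural threshold
                behavioural.append((k_rain, k_flow))
        print(f"{len(behavioural)} behavioural parameter sets retained")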

  16. Toward the simplification of the design process chain aimed at optimizing the productive processes to improve innovation and competitiveness

    Directory of Open Access Journals (Sweden)

    Emilio Pizzi

    2013-10-01

    The demands of the building construction process necessarily require a deeper definition of time and cost management, which means that new instruments of control must be inserted in the first phases of design. The prefiguration of scenarios and the presetting of rules and constraints, along with the adoption of new digital fabrication software, assume particular importance in a process increasingly oriented toward a file-to-factory fabrication method.

  17. On the simplifications for the thermal modeling of tilting-pad journal bearings under thermoelastohydrodynamic regime

    DEFF Research Database (Denmark)

    Cerda Varela, Alejandro Javier; Fillon, Michel; Santos, Ilmar

    2012-01-01

    The relevance of accurately calculating the oil film temperature build-up when modeling tilting-pad journal bearings is well established within the literature on the subject. This work studies the feasibility of using a thermal model for the tilting-pad journal bearing which includes a simplified...

  18. Modeling Attitude Variance in Small UAS’s for Acoustic Signature Simplification Using Experimental Design in a Hardware-in-the-Loop Simulation

    Science.gov (United States)

    2015-03-26

    (No abstract available: the record contains only title-page fragments of AFIT thesis AFIT-ENS-MS-15-M-110, "Modeling Attitude Variance in Small UAS's for Acoustic Signature Simplification Using Experimental Design in a Hardware-in-the-Loop Simulation", March 2015. Distribution Statement A: approved for public release; distribution unlimited.)

  19. Visual salience guided feature-aware shape simplification

    Institute of Scientific and Technical Information of China (English)

    Yong-wei MIAO; Fei-xia HU; Min-yan CHEN; Zhen LIU; Hua-hao SHOU

    2014-01-01

    In the area of 3D digital engineering and 3D digital geometry processing, shape simplification is an important task for reducing the large memory requirements and high time complexity of detailed models. By incorporating a content-aware visual salience measure of a polygonal mesh into the simplification operation, a novel feature-aware shape simplification approach is presented in this paper. Owing to the robust extraction of relief heights on 3D highly detailed meshes, our visual salience measure is defined by a center-surround operator on Gaussian-weighted relief heights in a scale-dependent manner. Guided by our visual salience map, the feature-aware shape simplification algorithm weights the high-dimensional feature-space quadric error metric of vertex-pair contractions with the weight map derived from our visual salience map. The weighted quadric error metric is calculated in a six-dimensional feature space combining the position and normal information of mesh vertices. Experimental results demonstrate that our visual salience guided shape simplification scheme can adaptively and effectively re-sample the underlying models in a feature-aware manner, accounting for the visually salient features of complex shapes and thus yielding better visual fidelity.

  20. Simplification and improvement of prediction model for elastic modulus of particulate reinforced metal matrix composite

    Institute of Scientific and Technical Information of China (English)

    WANG Wen-ming; PAN Fu-sheng; LU Yun; ZENG Su-min

    2006-01-01

    In this paper, we propose a five-zone model to predict the elastic modulus of particulate-reinforced metal matrix composites. We simplify the calculation by ignoring structural parameters, including particulate shape, arrangement pattern and dimensional variance mode, which have no obvious influence on the elastic modulus of a composite, and improve the precision of the method by stressing the interaction of the interfaces with the particulates and matrix of the composite. The five-zone model can reflect the effect of interface modulus on the elastic modulus of a composite. It overcomes the limitations of the rigidity and flexibility mixture-law expressions. The original idea of the five-zone model is to put forward a particulate/interface interactive zone and a matrix/interface interactive zone. By organically integrating the rigidity mixture law and the flexibility mixture law, the model can predict the engineering elastic constants of a composite effectively.

  1. Simplification of the tug-of-war model for cellular transport in cells

    CERN Document Server

    Zhang, Yunxin

    2010-01-01

    The transport of organelles and vesicles in living cells can be well described by the kinetic tug-of-war model advanced by Müller, Klumpp and Lipowsky, in which the cargo is attached to two motor species, kinesin and dynein, and the direction of motion is determined by the number of motors bound to the track. In recent work [Phys. Rev. E 79, 061918 (2009)], this model was studied by mean field theory, and it was found that the tug-of-war model usually has one, two, or three distinct stable stationary points. However, the results there were mostly obtained by numerical calculations, since it is hard to carry out a detailed theoretical study of a two-dimensional nonlinear system. In this paper, we carry out a further detailed analysis of this model and try to establish more of its properties theoretically. Firstly, the tug-of-war model is simplified to a one-dimensional equation. Then we claim that the stationary points of the tug-of-war model correspond to the roots of the simplified equation, and the stable station...
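
    The reduction means that locating and classifying the stationary points amounts to root-finding on a scalar function: a root x* of f(x) = 0 is stable when f'(x*) < 0. The cubic below is a hypothetical placeholder for the derived one-dimensional equation; it happens to have three roots, matching the one-to-three stationary points reported for the model.

        # Stationary points of a 1D reduction as roots of f(x) = 0 (sketch).
        import numpy as np
        from scipy.optimize import brentq

        f = lambda x: -x**3 + 0.9 * x - 0.1       # placeholder reduced equation
        xs = np.linspace(-2.0, 2.0, 400)
        roots = [brentq(f, a, b) for a, b in zip(xs, xs[1:]) if f(a) * f(b) < 0]
        for r in roots:
            slope = (f(r + 1e-6) - f(r - 1e-6)) / 2e-6
            print(f"x* = {r:+.3f}  {'stable' if slope < 0 else 'unstable'}")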

  2. Simplification of high order polynomial calibration model for fringe projection profilometry

    Science.gov (United States)

    Yu, Liandong; Zhang, Wei; Li, Weishi; Pan, Chengliang; Xia, Haojie

    2016-10-01

    In fringe projection profilometry systems, high-order polynomial calibration models can be employed to improve accuracy. However, fitting a high-order polynomial model with least-squares algorithms is not stable. In this paper, a novel method is presented to analyze the significance of each polynomial term and simplify the high-order polynomial calibration model. Term significance is evaluated by comparing the loading-vector elements of the first few principal components, obtained with principal component analysis, and trivial terms are identified and removed from the high-order polynomial calibration model. As a result, the high-order model is simplified with a significant improvement in computational stability and little loss of reconstruction accuracy. An interesting finding is that some terms of order 0 and 1, as well as some high-order terms related to the image direction perpendicular to the phase-change direction, are trivial terms for this specific problem. Experimental results are shown to validate the proposed method.
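
    The screening idea can be sketched as follows: assemble the design matrix of candidate polynomial terms, run PCA, and compare the magnitude of each term's loadings on the first few components. The terms, data, and threshold below are illustrative, not the paper's exact procedure.

        # PCA-loading-based significance screening of polynomial terms (sketch).
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(1)
        u, v = rng.uniform(size=(2, 1000))            # toy image coordinates
        terms = {"1": np.ones_like(u), "u": u, "v": v, "u*v": u * v,
                 "u^2": u**2, "v^2": v**2, "u^3": u**3}
        X = StandardScaler().fit_transform(np.column_stack(list(terms.values())))

        pca = PCA(n_components=3).fit(X)
        peak_loading = np.abs(pca.components_).max(axis=0)   # per-term peak loading
        for name, w in zip(terms, peak_loading):
            print(f"{name:4s} loading {w:.2f}", "(keep)" if w > 0.3 else "(trivial)")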

  3. Modeling canopy-level productivity: is the "big-leaf" simplification acceptable?

    Science.gov (United States)

    Sprintsin, M.; Chen, J. M.

    2009-05-01

    The "big-leaf" approach to calculating the carbon balance of plant canopies assumes that canopy carbon fluxes have the same relative responses to the environment as any single unshaded leaf in the upper canopy. Widely used light use efficiency models are essentially simplified versions of the big-leaf model. Despite its wide acceptance, subsequent developments in the modeling of leaf photosynthesis and measurements of canopy physiology have brought into question the assumptions behind this approach showing that big leaf approximation is inadequate for simulating canopy photosynthesis because of the additional leaf internal control on carbon assimilation and because of the non-linear response of photosynthesis on leaf nitrogen and absorbed light, and changes in leaf microenvironment with canopy depth. To avoid this problem a sunlit/shaded leaf separation approach, within which the vegetation is treated as two big leaves under different illumination conditions, is gradually replacing the "big-leaf" strategy, for applications at local and regional scales. Such separation is now widely accepted as a more accurate and physiologically based approach for modeling canopy photosynthesis. Here we compare both strategies for Gross Primary Production (GPP) modeling using the Boreal Ecosystem Productivity Simulator (BEPS) at local (tower footprint) scale for different land cover types spread over North America: two broadleaf forests (Harvard, Massachusetts and Missouri Ozark, Missouri); two coniferous forests (Howland, Maine and Old Black Spruce, Saskatchewan); Lost Creek shrubland site (Wisconsin) and Mer Bleue petland (Ontario). BEPS calculates carbon fixation by scaling Farquhar's leaf biochemical model up to canopy level with stomatal conductance estimated by a modified version of the Ball-Woodrow-Berry model. The "big-leaf" approach was parameterized using derived leaf level parameters scaled up to canopy level by means of Leaf Area Index. The influence of sunlit

  4. Simplification and analysis of a model of social interaction in voting

    CERN Document Server

    Lafuerza, Luis F; Edmonds, Bruce; McKane, Alan J

    2015-01-01

    A recently proposed model of social interaction in voting is investigated by simplifying it down into a version that is more analytically tractable and which allows a mathematical analysis to be performed. This analysis clarifies the interplay of the different elements present in the system - social influence, heterogeneity and noise - and leads to a better understanding of its properties. The origin of a regime of bistability is identified. The insight gained in this way gives further intuition into the behaviour of the original model.

  5. Simplification and analysis of a model of social interaction in voting

    Science.gov (United States)

    Lafuerza, Luis F.; Dyson, Louise; Edmonds, Bruce; McKane, Alan J.

    2016-06-01

    A recently proposed model of social interaction in voting is investigated by simplifying it down into a version that is more analytically tractable and which allows a mathematical analysis to be performed. This analysis clarifies the interplay of the different elements present in the system - social influence, heterogeneity and noise - and leads to a better understanding of its properties. The origin of a regime of bistability is identified. The insight gained in this way gives further intuition into the behaviour of the original model.

  6. Impediments to predicting site response: Seismic property estimation and modeling simplifications

    Science.gov (United States)

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Guzina, B.B.

    2009-01-01

    We compare estimates of the empirical transfer function (ETF) to the plane SH-wave theoretical transfer function (TTF) within a laterally constant medium for invasive and noninvasive estimates of the seismic shear-wave slownesses at 13 Kiban-Kyoshin network stations throughout Japan. The difference between the ETF and either of the TTFs is substantially larger than the difference between the two TTFs computed from different estimates of the seismic properties. We show that the plane SH-wave TTF through a laterally homogeneous medium at vertical incidence inadequately models observed amplifications at most sites for both slowness estimates, obtained via downhole measurements and the spectral analysis of surface waves. Strategies to improve the predictions can be separated into two broad categories: improving the measurement of soil properties and improving the theory that maps the 1D soil profile onto spectral amplification. Using an example site where the 1D plane SH-wave formulation poorly predicts the ETF, we find a more satisfactory fit to the ETF by modeling the full wavefield and incorporating spatially correlated variability of the seismic properties. We conclude that our ability to model the observed site response transfer function is limited largely by the assumptions of the theoretical formulation rather than the uncertainty of the soil property estimates.
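
    For orientation, the plane SH-wave TTF has a simple closed form in the most reduced case of one uniform soil layer over an elastic half-space at vertical incidence: |TTF| = 1/|cos(kH) + i*alpha*sin(kH)|, where kH is the layer phase and alpha the soil-to-rock impedance ratio. The sketch below evaluates it for illustrative properties; the paper's sites require multi-layer profiles.

        # Single-layer SH-wave theoretical transfer function (sketch).
        import numpy as np

        h, vs, rho = 30.0, 200.0, 1800.0     # layer thickness (m), Vs (m/s), density
        vr, rhor = 800.0, 2200.0             # half-space (rock) Vs and density
        alpha = (rho * vs) / (rhor * vr)     # impedance ratio

        f = np.linspace(0.1, 20.0, 500)      # frequency (Hz)
        kH = 2.0 * np.pi * f * h / vs        # layer phase
        ttf = 1.0 / np.abs(np.cos(kH) + 1j * alpha * np.sin(kH))

        print(f"fundamental resonance ~ {vs / (4 * h):.2f} Hz, "
              f"peak amplification {ttf.max():.1f}")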

  7. Large regional groundwater modeling - a sensitivity study of some selected conceptual descriptions and simplifications; Storregional grundvattenmodellering - en kaenslighetsstudie av naagra utvalda konceptuella beskrivningar och foerenklingar

    Energy Technology Data Exchange (ETDEWEB)

    Ericsson, Lars O. (Lars O. Ericsson Consulting AB (Sweden)); Holmen, Johan (Golder Associates (Sweden))

    2010-12-15

    The primary aim of this report is to present a supplementary, in-depth evaluation of certain conceptual simplifications, descriptions and model uncertainties in conjunction with regional groundwater simulation, which in the first instance refer to model depth, topography, groundwater table level and boundary conditions. Implementation was based on geo-scientifically available data compilations from the Smaaland region, but different conceptual assumptions have been analysed.

  8. Influence of Model Simplifications on Excitation Force in Surge for a Floating Foundation for Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Andersen, Morten Thøtt; Hindhede, Dennis; Lauridsen, Jimmy

    2015-01-01

    As offshore wind turbines move towards deeper and more distant sites, the concept of floating foundations is a potentially technically and economically attractive alternative to the traditional fixed foundations. Unlike the well-studied monopile, the geometry of a floating foundation is complex and, thereby, increases the difficulty in wave force determination due to limitations of the commonly used simplified methods. This paper deals with a physical model test of the hydrodynamic excitation force in surge on a fixed three-columned structure intended as a floating foundation for offshore wind...

  9. Web-based road 3D model simplification method considering constraints

    Institute of Scientific and Technical Information of China (English)

    蒲浩; 李伟; 赵海峰; 宋占峰

    2013-01-01

    Considering that road 3D models involve massive data and a large number of constraint boundaries, a web-based road 3D model simplification method that takes constraints into account is proposed. With the integrated road 3D model already built on the server, a half-edge collapse error metric that accounts for the many road constrained edges is first proposed. The original road model is then simplified as a whole on the server by half-edge collapse, while an operating hierarchical tree is built. Finally, remote view-dependent reconstruction criteria are established; according to these criteria, the minimum set of view-dependent node data that needs to be transferred to the client is quickly selected from the operating hierarchical tree and, combined with a constrained-edges-priority strategy, fast view-dependent reconstruction of the road 3D model is realized on the client. The results show that a high simplification rate is obtained, the necessary constraint boundaries are retained during remote dynamic browsing, only a small amount of data needs to be transmitted through the network, and the requirements of remote real-time interactive visualization of roads are met.

  10. Influence of Model Simplifications on Excitation Force in Surge for a Floating Foundation for Offshore Wind Turbines

    Directory of Open Access Journals (Sweden)

    Morten Thøtt Andersen

    2015-04-01

    As offshore wind turbines move towards deeper and more distant sites, the concept of floating foundations is a potentially technically and economically attractive alternative to the traditional fixed foundations. Unlike the well-studied monopile, the geometry of a floating foundation is complex and, thereby, increases the difficulty in wave force determination due to limitations of the commonly used simplified methods. This paper deals with a physical model test of the hydrodynamic excitation force in surge on a fixed three-columned structure intended as a floating foundation for offshore wind turbines. The experiments were conducted in a wave basin at Aalborg University. The test results are compared with a Boundary Element Method code based on linear diffraction theory for different wave force regimes defined by the column diameter, wave heights and lengths. Furthermore, the study investigates the influence of incident wave direction and stabilizing heave-plates. The structure can be divided into primary, secondary and tertiary parts, defined by the columns, heave-plates and braces, to determine the excitation force in surge. The test results are in good agreement with the numerical computation for the primary parts only, which leads to simplified determination of peak frequencies and the corresponding dominant force regime.

  11. Equivalent Simplification Method of Micro-Grid

    Directory of Open Access Journals (Sweden)

    Cai Changchun

    2013-09-01

    The paper concentrates on an equivalent simplification method for connecting a micro-grid system into a distribution network. The equivalent simplification method is proposed for studying the interaction between the micro-grid and the distribution network. The micro-grid network, composite load, gas-turbine synchronous generation and wind generation are reduced to equivalents and connected in parallel at the point of common coupling. A micro-grid system is built, and three-phase and single-phase grounded faults are simulated to test the equivalent model of the micro-grid. The simulation results show that the equivalent model of the micro-grid is effective and that its dynamics are similar to those of the detailed model. The equivalent simplification method for the micro-grid network and distributed components is suitable for micro-grid studies.

  12. Homotopic Polygonal Line Simplification

    DEFF Research Database (Denmark)

    Deleuran, Lasse Kosetski

    of the paths. For an input consisting of n paths with total size m, our algorithm improves the running time from O(n log^(1+e) n + m log n) to O(n log^(1+e) n + m), where e > 0. - Heuristic algorithms are simplification algorithms where the reduction based on the complexity measure is not necessarily...

  13. Simplification of Wind Farm Model for Dynamic Simulation

    Institute of Scientific and Technical Information of China (English)

    黄梅; 万航羽

    2009-01-01

    Simplification of wind farm models for dynamic simulation is studied in this paper for induction generator (IG) farms and doubly-fed induction generator (DFIG) farms. The rule for simplifying a wind farm dynamic model is that the merged wind turbines and generators have the same or similar operating points. According to the wake effect, the wind farm is divided into regions, and the wind turbine-generator units within each region are merged into one to establish the simplified model of the wind farm. Focusing on the type of wind turbine-generator units and on dynamic processes such as wind fluctuation and short-circuit faults in the power system, the validity of the simplified models is verified by comparing the detailed model with simplified models of varying degrees in simulation, and suggestions for applying the simplified models in dynamic simulation are proposed.
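
    A rough sketch of the aggregation rule described above: turbines are grouped by operating point (here approximated with a simple Jensen-type wake deficit by downstream row) and each group is merged into one equivalent machine whose MVA ratings add. The wake constants and turbine data below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def wake_deficit(v_free, row, ct=0.8, k=0.075, spacing_d=5.0):
    """Very rough Jensen wake model: wind speed at the row-th downstream
    row for turbines spaced `spacing_d` rotor diameters apart."""
    v = v_free
    for _ in range(row):
        v *= 1 - (1 - np.sqrt(1 - ct)) / (1 + 2 * k * spacing_d) ** 2
    return v

def aggregate(turbines):
    """Merge turbines with similar operating points into one equivalent
    generator: MVA ratings add, and the merged unit sees the mean wind.
    Per-unit parameters on the machine base are kept, so impedances on
    the system base scale by 1/n."""
    return {"mva": sum(t["mva"] for t in turbines),
            "wind": float(np.mean([t["wind"] for t in turbines])),
            "n": len(turbines)}

rows = [{"mva": 2.0, "wind": wake_deficit(12.0, r)} for r in range(4)]
# Two wake-effect regions, each merged into one equivalent machine:
print(aggregate(rows[:2]), aggregate(rows[2:]))
```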

  14. Improving the Financial Aid Process for Community College Students: A Literature Review of FAFSA Simplification, Information, and Verification

    Science.gov (United States)

    Davidson, J. Cody

    2015-01-01

    Research has shown that community college and other low income students have the most need regarding the financial aid process. Community college students are less likely to complete the free application for federal student aid (FAFSA) than students at public and private four-year and for-profit institutions. Surveys have shown that the complexity…

  15. USER STORY SOFTWARE ESTIMATION: A SIMPLIFICATION OF SOFTWARE ESTIMATION MODEL WITH DISTRIBUTED EXTREME PROGRAMMING ESTIMATION TECHNIQUE

    OpenAIRE

    Ridi Ferdiana; Paulus Insap Santoso; Lukito Edi Nugroho; Ahmad Ashari

    2011-01-01

    Software estimation is an area of software engineering concerned with the identification, classification and measurement of features of software that affect the cost of developing and sustaining computer programs [19]. Measuring software through estimation aims to determine the complexity of the software, estimate the required human resources, and gain better visibility of execution and the process model. There are many software estimation techniques that work sufficiently in certain conditions or s...

  16. USER STORY SOFTWARE ESTIMATION: A SIMPLIFICATION OF SOFTWARE ESTIMATION MODEL WITH DISTRIBUTED EXTREME PROGRAMMING ESTIMATION TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Ridi Ferdiana

    2011-01-01

    Software estimation is an area of software engineering concerned with the identification, classification and measurement of features of software that affect the cost of developing and sustaining computer programs [19]. Measuring software through estimation aims to determine the complexity of the software, estimate the required human resources, and gain better visibility of execution and the process model. There are many software estimation techniques that work sufficiently in certain conditions or steps of software engineering, for example measuring lines of code, function points, COCOMO, or use case points. This paper proposes another estimation technique called Distributed eXtreme Programming Estimation (DXP Estimation). DXP estimation provides a basic technique for teams using the eXtreme Programming method in onsite or distributed development. To the writers' knowledge, this is the first estimation technique applied to an agile method in eXtreme Programming.

  17. Effect of simplifications of bone and components inclination on the elastohydrodynamic lubrication modeling of metal-on-metal hip resurfacing prosthesis.

    Science.gov (United States)

    Meng, Qingen; Liu, Feng; Fisher, John; Jin, Zhongmin

    2013-05-01

    It is important to study the lubrication mechanism of metal-on-metal hip resurfacing prostheses in order to understand their overall tribological performance and thereby minimize wear particles. Previous elastohydrodynamic lubrication studies of metal-on-metal hip resurfacing prostheses neglected the effects of the orientations of the cup and head, and simplified pelvic and femoral bone models were also adopted. These simplifications may lead to unrealistic predictions. For the first time, an elastohydrodynamic lubrication model was developed and solved for a full metal-on-metal hip resurfacing arthroplasty. The effects of the orientations of the components and of realistic bones on the lubrication performance of the metal-on-metal hip resurfacing prosthesis were investigated by comparing the full model with simplified models. It was found that the orientation of the head played a very important role in the prediction of pressure distributions and film profiles of the metal-on-metal hip resurfacing prosthesis. The inclination of the hemispherical cup up to 45° had no appreciable effect on the lubrication performance of the metal-on-metal hip resurfacing prosthesis. Moreover, the combined effect of material properties and structures of the bones was negligible. Future studies should focus on higher inclination angles, smaller coverage angles and microseparation related to the occurrence of edge loading.

  18. Simplification and shift in cognition of political difference: applying the geometric modeling to the analysis of semantic similarity judgment.

    Science.gov (United States)

    Kato, Junko; Okada, Kensuke

    2011-01-01

    Perceiving differences by means of spatial analogies is intrinsic to human cognition. Multi-dimensional scaling (MDS) analysis based on Minkowski geometry has been used primarily on data on sensory similarity judgments, leaving judgments on abstractive differences unanalyzed. Indeed, analysts have failed to find appropriate experimental or real-life data in this regard. Our MDS analysis used survey data on political scientists' judgments of the similarities and differences between political positions expressed in terms of distance. Both distance smoothing and majorization techniques were applied to a three-way dataset of similarity judgments provided by at least seven experts on at least five parties' positions on at least seven policies (i.e., originally yielding 245 dimensions) to substantially reduce the risk of local minima. The analysis found two dimensions, which were sufficient for mapping differences, and fit the city-block dimensions better than the Euclidean metric in all datasets obtained from 13 countries. Most city-block dimensions were highly correlated with the simplified criterion (i.e., the left-right ideology) for differences that are actually used in real politics. The isometry of the city-block and dominance metrics in two-dimensional space carries further implications. More specifically, individuals may pay attention to two dimensions (if represented in the city-block metric) or focus on a single dimension (if represented in the dominance metric) when judging differences between the same objects. Switching between metrics may be expected to occur during cognitive processing as frequently as the apparent discontinuities and shifts in human attention that may underlie changing judgments in real situations occur. Consequently, the result has extended strong support for the validity of the geometric models to represent an important social cognition, i.e., the one of political differences, which is deeply rooted in human nature.
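
    The metric comparison at the heart of the study can be illustrated in a few lines: compute city-block (p = 1) and Euclidean (p = 2) Minkowski distances between recovered party positions and check which correlates better with judged dissimilarities. The positions and judgments below are made-up stand-ins for the survey data, used only to show the mechanics.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Hypothetical 2D positions of five parties in a recovered MDS space
positions = np.array([[-2.0, 0.5], [-1.0, -0.3], [0.0, 0.8],
                      [1.2, -0.6], [2.1, 0.2]])

# Hypothetical expert dissimilarity judgments (condensed upper triangle,
# 10 pairs for 5 objects)
judged = np.array([1.1, 2.3, 3.4, 4.2, 1.2, 2.5, 3.3, 1.4, 2.2, 1.0])

for name, p in [("city-block", 1), ("Euclidean", 2)]:
    d = pdist(positions, metric="minkowski", p=p)
    rho, _ = spearmanr(d, judged)
    print(f"{name:10s} rank correlation with judgments: {rho:.3f}")
```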

  19. A simplification of Cobelli's glucose-insulin model for type 1 diabetes mellitus and its FPGA implementation.

    Science.gov (United States)

    Li, Peng; Yu, Lei; Fang, Qiang; Lee, Shuenn-Yuh

    2016-10-01

    Cobelli's glucose-insulin model is the only computer simulator of glucose-insulin interactions accepted by the Food and Drug Administration as a substitute for animal trials. However, it consists of multiple differential equations that make it hard to implement on a hardware platform. In this investigation, Cobelli's model is simplified by the Padé approximant method and implemented on a field-programmable gate array-based platform as a hardware model for predicting glucose changes in subjects with type 1 diabetes mellitus. Compared with the original Cobelli's model, the implemented hardware model provides a nearly perfect approximation in predicting glucose changes, with rather small root-mean-square errors and maximum errors. The RMSE results for 30 subjects show that the method for simplifying and implementing Cobelli's model has good robustness and applicability. The successful hardware implementation of Cobelli's model will promote a wider adoption of this model that can substitute for animal trials, provide fast and reliable glucose and insulin estimation, and ultimately assist the further development of an artificial pancreas system.
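
    The Padé-approximant step can be illustrated on a pure transport delay exp(-s), a term typical of absorption dynamics in physiological models (a generic stand-in, not Cobelli's actual equations): scipy turns the Taylor coefficients into a low-order rational function that is cheap to realize in fixed hardware.

```python
import numpy as np
from scipy.interpolate import pade
from math import factorial

# Taylor coefficients of exp(-s): sum_k (-s)^k / k!
an = [(-1) ** k / factorial(k) for k in range(5)]

# [2/2] Pade approximant: a low-order rational function replacing the
# transcendental term -- the kind of simplification that makes a model
# implementable within the fixed resources of an FPGA.
p, q = pade(an, 2)

s = np.linspace(0.0, 2.0, 5)
print(np.exp(-s))        # original
print(p(s) / q(s))       # rational approximation
```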

  20. Streaming Algorithms for Line Simplification

    DEFF Research Database (Denmark)

    Abam, Mohammad; de Berg, Mark; Hachenberger, Peter

    2010-01-01

    this problem in a streaming setting, where we only have a limited amount of storage, so that we cannot store all the points. We analyze the competitive ratio of our algorithms, allowing resource augmentation: we let our algorithm maintain a simplification with 2k (internal) points and compare the error of our...
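
    For contrast with the streaming setting, the classical offline Douglas-Peucker line simplification, which assumes all points fit in memory, fits in a few lines; the tolerance and sample path below are arbitrary.

```python
import numpy as np

def perp_dist(pt, a, b):
    """Perpendicular distance from pt to the infinite line through a, b."""
    if np.allclose(a, b):
        return float(np.linalg.norm(pt - a))
    d = b - a
    return abs(d[0] * (pt - a)[1] - d[1] * (pt - a)[0]) / np.linalg.norm(d)

def douglas_peucker(points, eps):
    """Keep the point farthest from the chord if it deviates more than
    eps, recursing on both halves; otherwise keep only the endpoints."""
    pts = [np.asarray(p, dtype=float) for p in points]
    dists = [perp_dist(p, pts[0], pts[-1]) for p in pts[1:-1]]
    if not dists or max(dists) <= eps:
        return [points[0], points[-1]]
    i = int(np.argmax(dists)) + 1
    return douglas_peucker(points[: i + 1], eps)[:-1] + douglas_peucker(points[i:], eps)

path = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(path, eps=0.5))
```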

  1. Cuckoo Filter: Simplification and Analysis

    OpenAIRE

    Eppstein, David

    2016-01-01

    The cuckoo filter data structure of Fan, Andersen, Kaminsky, and Mitzenmacher (CoNEXT 2014) performs the same approximate set operations as a Bloom filter in less memory, with better locality of reference, and adds the ability to delete elements as well as to insert them. However, until now it has lacked theoretical guarantees on its performance. We describe a simplified version of the cuckoo filter using fewer hash function calls per query. With this simplification, we provide the first theo...
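
    The core of the cuckoo filter is partial-key cuckoo hashing: only a short fingerprint is stored, and an item's alternate bucket is derived from its fingerprint alone, so evicted fingerprints can be rehoused without knowing the original key. A minimal sketch follows; the bucket count (a power of two, so the XOR-and-mask trick works) and fingerprint size are chosen arbitrarily.

```python
import random, hashlib

class CuckooFilter:
    """Minimal cuckoo filter sketch (partial-key cuckoo hashing)."""

    def __init__(self, n_buckets=1024, bucket_size=4, max_kicks=500):
        # n_buckets must be a power of two for the XOR indexing below
        self.n, self.size, self.max_kicks = n_buckets, bucket_size, max_kicks
        self.buckets = [[] for _ in range(n_buckets)]

    def _h(self, x):
        return int(hashlib.sha1(str(x).encode()).hexdigest(), 16)

    def _fp(self, item):
        return self._h(item) & 0xFF or 1      # 8-bit nonzero fingerprint

    def _indices(self, item):
        fp = self._fp(item)
        i1 = self._h(item) % self.n
        i2 = (i1 ^ self._h(fp)) % self.n      # alt index from fingerprint only
        return fp, i1, i2

    def insert(self, item):
        fp, i1, i2 = self._indices(item)
        for i in (i1, i2):
            if len(self.buckets[i]) < self.size:
                self.buckets[i].append(fp)
                return True
        i = random.choice((i1, i2))           # both full: start kicking
        for _ in range(self.max_kicks):
            j = random.randrange(len(self.buckets[i]))
            fp, self.buckets[i][j] = self.buckets[i][j], fp
            i = (i ^ self._h(fp)) % self.n    # evicted fp moves to its alt bucket
            if len(self.buckets[i]) < self.size:
                self.buckets[i].append(fp)
                return True
        return False                          # filter considered full

    def contains(self, item):
        fp, i1, i2 = self._indices(item)
        return fp in self.buckets[i1] or fp in self.buckets[i2]

    def delete(self, item):
        fp, i1, i2 = self._indices(item)
        for i in (i1, i2):
            if fp in self.buckets[i]:
                self.buckets[i].remove(fp)
                return True
        return False

cf = CuckooFilter()
cf.insert("item_A")
print(cf.contains("item_A"), cf.contains("item_B"))  # True False (w.h.p.)
```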

  2. Assessing the Impact of Canopy Structure Simplification in Common Multilayer Models on Irradiance Absorption Estimates of Measured and Virtually Created Fagus sylvatica (L.) Stands

    Directory of Open Access Journals (Sweden)

    Pol Coppin

    2009-11-01

    of leaves differed significantly between a multilayer representation and a 3D architecture canopy of the same LAI. The deviations in irradiance absorbance were caused by canopy structure, clumping and positioning of leaves. Although the use of canopy simplifications for modelling purposes in closed canopies was demonstrated to be a valid option, special care should be taken when considering irradiance simulation of forest stands with sparse canopies, particularly at higher sun zenith angles, where the surrounding trees strongly affect the absorbed irradiance and results can deviate strongly from the multilayer assumptions.

  3. On Simplification of Database Integrity Constraints

    DEFF Research Database (Denmark)

    Christiansen, Henning; Martinenghi, Davide

    2006-01-01

    Without proper simplification techniques, database integrity checking can be prohibitively time consuming. Several methods have been developed for producing simplified incremental checks for each update, but none until now of sufficient quality and generality for providing a true practical impact, and the present paper is an attempt to fill this gap. On the theoretical side, a general characterization is introduced of the problem of simplification of integrity constraints and a natural definition is given of what it means for a simplification procedure to be ideal. We prove that ideality of simplification is strictly related to query containment; in fact, an ideal simplification procedure can only exist in database languages for which query containment is decidable. However, simplifications that do not qualify as ideal may also be relevant for practical purposes. We present a concrete approach based...

  4. SIMPLIFICATION IN CHILD LANGUAGE IN BAHASA INDONESIA: A CASE STUDY ON FILIP

    Directory of Open Access Journals (Sweden)

    Julia Eka Rini

    2000-01-01

    This article aims at giving examples of characteristics of simplification in Bahasa Indonesia and proving that child language has a pattern and that there is a process in learning. Since this is a case study, it might not be enough to say that simplification is universal for all children of any mother tongue, but at least there is proof that such patterns of simplification also occur in Bahasa Indonesia.

  5. 2D Vector Field Simplification Based on Robustness

    KAUST Repository

    Skraba, Primoz

    2014-03-01

    Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification techniques based on the topological skeleton successively remove pairs of critical points connected by separatrices, using distance or area-based relevance measures. These methods rely on the stable extraction of the topological skeleton, which can be difficult due to instability in numerical integration, especially when processing highly rotational flows. These geometric metrics do not consider the flow magnitude, an important physical property of the flow. In this paper, we propose a novel simplification scheme derived from the recently introduced topological notion of robustness, which provides a complementary view on flow structure compared to the traditional topological-skeleton-based approaches. Robustness enables the pruning of sets of critical points according to a quantitative measure of their stability, that is, the minimum amount of vector field perturbation required to remove them. This leads to a hierarchical simplification scheme that encodes flow magnitude in its perturbation metric. Our novel simplification algorithm is based on degree theory, has fewer boundary restrictions, and so can handle more general cases. Finally, we provide an implementation under the piecewise-linear setting and apply it to both synthetic and real-world datasets. © 2014 IEEE.

  6. Equivalent Simplification Method of Micro-Grid

    OpenAIRE

    Cai Changchun; Cao Xiangqin

    2013-01-01

    The paper concentrates on an equivalent simplification method for connecting a micro-grid system into a distribution network. The equivalent simplification method is proposed for studying the interaction between the micro-grid and the distribution network. The micro-grid network, composite load, gas-turbine synchronous generation and wind generation are reduced to equivalents and connected in parallel at the point of common coupling. A micro-grid system is built and three phase and single phase grounded faults are per...

  7. Moving Beyond Readability Metrics for Health-Related Text Simplification.

    Science.gov (United States)

    Kauchak, David; Leroy, Gondy

    2016-01-01

    Limited health literacy is a barrier to understanding health information. Simplifying text can reduce this barrier and possibly other known disparities in health. Unfortunately, few tools exist to simplify text with demonstrated impact on comprehension. By leveraging modern data sources integrated with natural language processing algorithms, we are developing the first semi-automated text simplification tool. We present two main contributions. First, we introduce our evidence-based development strategy for designing effective text simplification software and summarize initial, promising results. Second, we present a new study examining existing readability formulas, which are the most commonly used tools for text simplification in healthcare. We compare syllable count, the proxy for word difficulty used by most readability formulas, with our new metric 'term familiarity' and find that syllable count measures how difficult words 'appear' to be, but not their actual difficulty. In contrast, term familiarity can be used to measure actual difficulty.
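
    The contrast between the two word-difficulty measures can be sketched directly; the corpus frequencies below are invented stand-ins (a real 'term familiarity' score would be estimated from a large corpus such as Wikipedia or medical text collections).

```python
import re

def count_syllables(word):
    """Crude vowel-group syllable counter -- the proxy for word
    difficulty used by most readability formulas."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

# Hypothetical corpus frequencies (per-token probability); real systems
# would derive these from a large corpus.
FREQ = {"understanding": 5e-5, "apnea": 4e-7}

def term_familiarity(word):
    """Higher = more familiar. Frequency-based actual difficulty, as
    opposed to how difficult the word merely *appears* (syllables)."""
    return FREQ.get(word.lower(), 1e-9)

# 'understanding' has more syllables (appears harder) yet is far more
# familiar than the shorter but rarer 'apnea'.
for w in ["understanding", "apnea"]:
    print(w, count_syllables(w), term_familiarity(w))
```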

  8. For the sake of simplicity: Unsupervised extraction of lexical simplifications from Wikipedia

    CERN Document Server

    Yatskar, Mark; Danescu-Niculescu-Mizil, Cristian; Lee, Lillian

    2010-01-01

    We report on work in progress on extracting lexical simplifications (e.g., "collaborate" -> "work together"), focusing on utilizing edit histories in Simple English Wikipedia for this task. We consider two main approaches: (1) deriving simplification probabilities via an edit model that accounts for a mixture of different operations, and (2) using metadata to focus on edits that are more likely to be simplification operations. We find our methods to outperform a reasonable baseline and yield many high-quality lexical simplifications not included in an independently-created manually prepared list.

  9. Simplification of Fluid Force in Rotordynamic Model of Centrifugal Pumps

    Institute of Scientific and Technical Information of China (English)

    蒋爱华; 华宏星; 陈长盛; 李国平; 周璞; 章艺

    2014-01-01

    An appropriate simplification of the fluid force applied on the impeller can significantly raise the accuracy of computing the fluid-excited vibration of centrifugal pumps. In this paper, a rotordynamic model including four discs, three shaft sections and a pump base is built for the test workbench based on d'Alembert's principle. The fluid force on the impeller is then simplified as 20% of the fluid mass in the impeller, as 40% of the fluid mass in the impeller, and as a concentrated force plus a torque obtained by CFD, respectively. Finally, transient response analysis is carried out by the Newmark implicit algorithm. The results show that the base vibration excited by the fluid force during centrifugal pump operation can be effectively obtained by simplifying the fluid force on the impeller to a concentrated force and a torque, and that the acceleration and displacement amplitudes of the base vibration obtained in this way are much larger than those obtained by simplifying the fluid force as 20% or 40% of the fluid mass in the impeller. In addition, the acceleration and displacement amplitudes obtained with 40% of the fluid mass are larger than those obtained with 20% of the fluid mass in the impeller.

  10. Sentence Simplification Aids Protein-Protein Interaction Extraction

    CERN Document Server

    Jonnalagadda, Siddhartha

    2010-01-01

    Accurate systems for extracting Protein-Protein Interactions (PPIs) automatically from biomedical articles can help accelerate biomedical research. Biomedical Informatics researchers are collaborating to provide metaservices and advance the state of the art in PPI extraction. One problem often neglected by current Natural Language Processing systems is the characteristic complexity of the sentences in biomedical literature. In this paper, we report on the impact that automatic simplification of sentences has on the performance of a state-of-the-art PPI extraction system, showing a substantial improvement in recall (8%) when the sentence simplification method is applied, without significant impact on precision.

  11. Simplification Rules for Birdtrack Operators

    CERN Document Server

    Alcock-Zeilinger, Judith

    2016-01-01

    This paper derives a set of easy-to-use tools designed to simplify calculations with birdtrack operators comprised of symmetrizers and antisymmetrizers. In particular, we present cancellation rules allowing one to shorten the birdtrack expressions of operators, and propagation rules identifying the circumstances under which it is possible to propagate symmetrizers past antisymmetrizers and vice versa. We exhibit the power of these simplification rules by means of a short example in which we apply the tools derived in this paper to a typical operator that can be encountered in the representation theory of SU(N) over the product space $V^{\otimes m}$. These rules form the basis for the construction of compact Hermitian Young projection operators and their transition operators addressed in companion papers.

  12. A Review for Model Plant Mismatch Measures in Process Monitoring

    Institute of Scientific and Technical Information of China (English)

    王洪; 谢磊; 宋执环

    2012-01-01

    A model is usually necessary for the design of a control loop. Due to simplification and unknown dynamics, model plant mismatch is inevitable in the control loop. In process monitoring, detection of mismatch and evaluation of its influence are required. In this paper several mismatch measures are presented based on different model descriptions. They are categorized into groups from different perspectives and their potential in detection and diagnosis is evaluated. Two case studies on a mixing process and a distillation process demonstrate the efficacy of the mismatch monitoring framework.

  13. Simplifications and Idealizations in High School Physics in Mechanics: A Study of Slovenian Curriculum and Textbooks

    Science.gov (United States)

    Forjan, Matej; Sliško, Josip

    2014-01-01

    This article presents the results of an analysis of three Slovenian textbooks for high school physics, from the point of view of simplifications and idealizations in the field of mechanics. In modeling of physical systems, making simplifications and idealizations is important, since one ignores minor effects and focuses on the most important…

  14. Radiolysis Process Model

    Energy Technology Data Exchange (ETDEWEB)

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time-scale up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2−, eaq−, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. Because H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment, the most sensitive parameters have been identified with respect to predictions of a radiolysis model under typical conditions. Compared with the full model of about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^5 and to preserve most of the predictions for major species. This allows a systematic approach to model simplification and offers guidance in designing experiments for validation.
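
    The reduction strategy, dropping reactions that barely affect the tracked concentration and verifying the error bound, can be sketched on a toy network (hypothetical species and rate constants, not the actual radiolysis mechanism):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy 2-species system: zeroth-order production of X, first-order
# consumption, and a minor side channel converting X via species B.
def full(t, y, k=(1.0, 0.5, 1e-4)):
    x, b = y
    return [k[0] - k[1] * x - k[2] * x * b, k[2] * x * b]

def reduced(t, y, k=(1.0, 0.5)):
    x, b = y
    return [k[0] - k[1] * x, 0.0]   # minor channel pruned from the model

t = np.linspace(0, 20, 200)
yf = solve_ivp(full, (0, 20), [0.0, 1.0], t_eval=t).y[0]
yr = solve_ivp(reduced, (0, 20), [0.0, 1.0], t_eval=t).y[0]

# Maximum relative error introduced by the reduction -- the kind of
# check used to decide which reactions a simplified model may drop.
print(np.max(np.abs(yf - yr)) / yf.max())
```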

  15. Monte Carlo modelling of diode detectors for small field MV photon dosimetry: detector model simplification and the sensitivity of correction factors to source parameterization.

    Science.gov (United States)

    Cranmer-Sargison, G; Weston, S; Evans, J A; Sidhu, N P; Thwaites, D I

    2012-08-21

    The goal of this work was to examine the use of simplified diode detector models within a recently proposed Monte Carlo (MC) based small field dosimetry formalism and to investigate the influence electron source parameterization has on MC calculated correction factors. BEAMnrc was used to model Varian 6 MV jaw-collimated square field sizes down to 0.5 cm. The IBA stereotactic field diode (SFD), PTW T60016 (shielded) and PTW T60017 (un-shielded) diodes were modelled in DOSRZnrc and isocentric output ratios ($OR^{f_{clin}}_{det,MC}$) calculated at depths of d = 1.5, 5.0 and 10.0 cm. Simplified detector models were then tested by evaluating the percent difference in $OR^{f_{clin}}_{det,MC}$ between the simplified and complete detector models. The influence of active volume dimension on simulated output ratio and response factor was also investigated. The sensitivity of each MC calculated replacement correction factor ($k^{f_{clin},f_{msr}}_{Q_{clin},Q_{msr}}$), as a function of electron FWHM between 0.100 and 0.150 cm and energy between 5.5 and 6.5 MeV, was investigated for the same set of small field sizes using the simplified detector models. The SFD diode can be approximated simply as a silicon chip in water, the T60016 shielded diode can be modelled as a chip in water plus the entire shielding geometry, and the T60017 unshielded diode as a chip in water plus the filter plate located upstream. The detector-specific $k^{f_{clin},f_{msr}}_{Q_{clin},Q_{msr}}$ required to correct measured output ratios using the SFD, T60016 and T60017 diode detectors are insensitive to incident electron energy between 5.5 and 6.5 MeV and spot size variation between FWHM = 0.100 and 0.150 cm. Three general conclusions come out of this work: (1) detector models can be simplified to produce $OR^{f_{clin}}_{det,MC}$ to within 1.0% of those calculated using the complete geometry, where typically not only the silicon chip, but also any high density components close to the chip, such as scattering plates or shielding material is necessary

  16. The complexities of HIPAA and administration simplification.

    Science.gov (United States)

    Mozlin, R

    2000-11-01

    The Health Insurance Portability and Accountability Act (HIPAA) was signed into law in 1996. Although focused on information technology issues, HIPAA will ultimately impact day-to-day operations at multiple levels within any clinical setting. Optometrists must begin to familiarize themselves with HIPAA in order to prepare themselves to practice in a technology-enriched environment. Title II of HIPAA, entitled "Administrative Simplification," is intended to reduce the costs and administrative burden of healthcare by standardizing the electronic transmission of administrative and financial transactions. The Department of Health and Human Services is expected to publish the final rules and regulations that will govern HIPAA's implementation this year. The rules and regulations will cover three key aspects of healthcare delivery: electronic data interchange (EDI), security, and privacy. EDI will standardize the format for healthcare transactions. Health plans must accept and respond to all transactions in the EDI format. Security refers to policies and procedures that protect the accuracy and integrity of information and limit access. Privacy focuses on how the information is used and on disclosure of identifiable health information. Security and privacy regulations apply to all information that is maintained and transmitted in a digital format and require administrative, physical, and technical safeguards. HIPAA will force the healthcare industry to adopt an e-commerce paradigm and provide opportunities to improve patient care processes. Optometrists should take advantage of the opportunity to develop more efficient and profitable practices.

  17. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction

    CERN Document Server

    Jonnalagadda, Siddhartha

    2011-01-01

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested our tool on its impact on the task of PPI extraction and it improved the f-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.

  18. On Simplification of Database Integrity Constraints

    DEFF Research Database (Denmark)

    Christiansen, Henning; Martinenghi, Davide

    2006-01-01

    Without proper simplification techniques, database integrity checking can be prohibitively time consuming. Several methods have been developed for producing simplified incremental checks for each update but none until now of sufficient quality and generality for providing a true practical impact... We prove that ideality of simplification is strictly related to query containment; in fact, an ideal simplification procedure can only exist in database languages for which query containment is decidable. However, simplifications that do not qualify as ideal may also be relevant for practical purposes. We present a concrete approach based... ...take place before the execution of the update, so that only consistency-preserving updates are eventually given to the database. The extension to more expressive languages and the application of the framework to other contexts, such as data integration and concurrent database systems, are also...

  19. Quadratic Error Metric Mesh Simplification Algorithm Based on Discrete Curvature

    Directory of Open Access Journals (Sweden)

    Li Yao

    2015-01-01

    Complex and highly detailed polygon meshes have been adopted for model representation in many areas of computer graphics. Existing works mainly focus on quadric error metric based approximation of complex models, which does not take the retention of important model details into account and may lead to visual degeneration. In this paper, we improve Garland and Heckbert's quadric error metric based algorithm by using discrete curvature to preserve more features during mesh simplification. Our experiments on various models show that the geometry and topology structure as well as the features of the original models are precisely retained by employing discrete curvature.
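
    The quadric error metric core, with an optional curvature weight standing in for the paper's discrete-curvature term, can be sketched as follows; the mesh bookkeeping for actual collapses is omitted, and the geometry and weights are illustrative.

```python
import numpy as np

def plane_quadric(p0, p1, p2):
    """Fundamental error quadric K = pl pl^T for the supporting plane of
    a triangle, with the plane written as (a, b, c, d), |(a,b,c)| = 1."""
    n = np.cross(p1 - p0, p2 - p0)
    n = n / np.linalg.norm(n)
    pl = np.append(n, -np.dot(n, p0))
    return np.outer(pl, pl)

def vertex_quadric(v, faces, verts):
    """Sum of the quadrics of all triangles incident to vertex index v."""
    Q = np.zeros((4, 4))
    for f in faces:
        if v in f:
            Q += plane_quadric(*(verts[i] for i in f))
    return Q

def edge_cost(Q1, Q2, target, curvature_weight=1.0):
    """QEM collapse cost for moving both endpoints to `target`; scaling
    by a discrete-curvature weight (>1 in highly curved regions) is one
    way to bias the queue toward preserving features, as discussed above."""
    v = np.append(target, 1.0)
    return curvature_weight * float(v @ (Q1 + Q2) @ v)

verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0.2]], float)
faces = [(0, 1, 2), (1, 3, 2)]
Q0, Q1 = vertex_quadric(0, faces, verts), vertex_quadric(1, faces, verts)
mid = (verts[0] + verts[1]) / 2
print(edge_cost(Q0, Q1, mid))  # cheap collapse: nearly planar neighborhood
```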

  20. Complexity and simplification in understanding recruitment in benthic populations

    KAUST Repository

    Pineda, Jesús

    2008-11-13

    Research of complex systems and problems, entities with many dependencies, is often reductionist. The reductionist approach splits systems or problems into different components, and then addresses these components one by one. This approach has been used in the study of recruitment and population dynamics of marine benthic (bottom-dwelling) species. Another approach examines benthic population dynamics by looking at a small set of processes. This approach is statistical or model-oriented. Simplified approaches identify "macroecological" patterns or attempt to identify and model the essential, "first-order" elements of the system. The complexity of the recruitment and population dynamics problems stems from the number of processes that can potentially influence benthic populations, including (1) larval pool dynamics, (2) larval transport, (3) settlement, and (4) post-settlement biotic and abiotic processes, and larval production. Moreover, these processes are non-linear, some interact, and they may operate on disparate scales. This contribution discusses reductionist and simplified approaches to study benthic recruitment and population dynamics of bottom-dwelling marine invertebrates. We first address complexity in two processes known to influence recruitment, larval transport, and post-settlement survival to reproduction, and discuss the difficulty in understanding recruitment by looking at relevant processes individually and in isolation. We then address the simplified approach, which reduces the number of processes and makes the problem manageable. We discuss how simplifications and "broad-brush first-order approaches" may muddle our understanding of recruitment. Lack of empirical determination of the fundamental processes often results in mistaken inferences, and processes and parameters used in some models can bias our view of processes influencing recruitment. We conclude with a discussion on how to reconcile complex and simplified approaches. Although it

  1. Process optimization of friction stir welding based on thermal models

    DEFF Research Database (Denmark)

    Larsen, Anders Astrup

    2010-01-01

    This thesis investigates how to apply optimization methods to numerical models of a friction stir welding process. The work is intended as a proof-of-concept using different methods that are applicable to models of high complexity, possibly with high computational cost, and without the possibility...... information of the high-fidelity model. The optimization schemes are applied to stationary thermal models of differing complexity of the friction stir welding process. The optimization problems considered are based on optimizing the temperature field in the workpiece by finding optimal translational speed....... Also an optimization problem based on a microstructure model is solved, allowing the hardness distribution in the plate to be optimized. The use of purely thermal models represents a simplification of the real process; nonetheless, it shows the applicability of the optimization methods considered...
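
    In the same spirit, a stationary analytical thermal model (Rosenthal's thick-plate moving point-source solution, a much cruder stand-in for the thesis's finite element models) can be driven by an optimizer to pick a translational speed; all material and process numbers below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical parameters loosely resembling friction stir welding in Al
Q, k, alpha, T0 = 3500.0, 120.0, 5e-5, 25.0   # W, W/(m K), m^2/s, degC

def rosenthal_T(xi, y, v):
    """Rosenthal thick-plate moving point-source solution: steady-state
    temperature at (xi, y) in coordinates moving with the tool."""
    R = np.hypot(xi, y)
    return T0 + Q / (2 * np.pi * k * R) * np.exp(-v * (xi + R) / (2 * alpha))

def objective(v, target=450.0, probe=(0.0, 0.01)):
    """Squared deviation of the temperature 10 mm beside the tool from a
    desired welding temperature -- a toy temperature-field objective."""
    return (rosenthal_T(*probe, v) - target) ** 2

res = minimize_scalar(objective, bounds=(1e-4, 0.05), method="bounded")
print(f"optimal translational speed ~ {res.x * 1000:.2f} mm/s")
```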

  2. WORK SIMPLIFICATION FOR PRODUCTIVITY IMPROVEMENT A ...

    African Journals Online (AJOL)

    press concerning the work simplification techniques state ... and social development and its importance as a source of ..... data sheets as per the training given by the authors at site [5]. ... As recorded in the 1993 E.C. budget year Annual.

  3. Simplification Study of FE Model for 1000kV AC Transmission Line Insulator String Voltage and Grading Ring Surface Electric Field Distribution Calculation

    Directory of Open Access Journals (Sweden)

    Guoli Wang

    2013-09-01

    The finite element model for calculating the voltage distribution along the porcelain insulator string and the electric field distribution on the grading ring surface of a 1000 kV Ultra High Voltage (UHV) AC transmission line has the characteristics of large size, complicated structure and multiple media. To ensure accuracy, the relevant influencing factors should be considered so as to simplify the model reasonably and improve computational efficiency. A whole model and a simplified 3D finite element model of the UHV AC transmission line porcelain insulator string were built. The influencing factors, including the tower, phase conductors, hardware fittings, yoke plate and phase interaction, were considered in the analysis, and finally the rationality of the simplified model was validated. The comparison of results shows that building a simplified model of three-phase bundled conductors within a certain length, simplifying the tower reasonably, omitting the hardware fittings and yoke plate, and retaining only a single-phase insulator string model is feasible. The simplified model can replace the whole model to analyze the voltage distribution along the porcelain insulator string and the electric field distribution on the grading ring surface, and it reduces the calculation scale and improves the optimization efficiency of insulator string and grading ring parameters.

  4. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  5. Product Development Process Modeling

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    The use of Concurrent Engineering and other modern methods of product development and maintenance requires that a large number of time-overlapped "processes" be performed by many people. However, successfully describing and optimizing these processes is becoming ever more difficult to achieve. The perspective of industrial process theory (the definition of process) and the perspective of process implementation (process transition, accumulation, and inter-operation between processes) are used to survey the methods used to build a multi-view base process model.

  6. Generation and simplification of software Markov chain usage model

    Institute of Scientific and Technical Information of China (English)

    冯俊池; 于磊; 刘洋

    2015-01-01

    To solve the state space explosion problem of Markov chain usage models in software reliability testing, techniques for generating and simplifying the usage model based on UML models were studied. The messages exchanged between the software and its environment were derived from the sequence diagrams of the UML model, and the states of the usage model were derived from the stimulus and response messages, so that the usage model describes the usage of the software accurately. To address the state space explosion problem, the concepts of equivalent states and redundant states were defined, an algorithm to simplify the state space was proposed, and the related theoretical proof was given. Finally, the effectiveness of the proposed method was verified by experiments.
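
    A toy version of such a usage model, with a simple stand-in for the equivalent-state idea (grouping states with identical outgoing transition rows as merge candidates) and random test-case generation, is sketched below; the states and probabilities are invented.

```python
import numpy as np

# Hypothetical usage model: states and transition probabilities of the
# kind derived from stimulus/response messages in UML sequence diagrams.
states = ["Start", "Login", "Browse", "Search", "Exit"]
P = np.array([[0.0, 1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.6, 0.3, 0.1],
              [0.0, 0.0, 0.3, 0.4, 0.3],   # Browse and Search behave
              [0.0, 0.0, 0.3, 0.4, 0.3],   # identically here
              [0.0, 0.0, 0.0, 0.0, 1.0]])

def merge_candidates(P, tol=1e-9):
    """Group states with identical rows (same outgoing behaviour) --
    a simple stand-in for the paper's notion of equivalent states."""
    groups = []
    for i in range(len(P)):
        for g in groups:
            if np.allclose(P[g[0]], P[i], atol=tol):
                g.append(i)
                break
        else:
            groups.append([i])
    return groups

def random_test_case(P, rng=np.random.default_rng(0)):
    """Draw one usage scenario from Start to the absorbing Exit state."""
    i, seq = 0, ["Start"]
    while i != len(states) - 1:
        i = rng.choice(len(states), p=P[i])
        seq.append(states[i])
    return seq

print(merge_candidates(P))   # Browse/Search grouped: [[0], [1], [2, 3], [4]]
print(random_test_case(P))
```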

  7. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  8. Robustness-Based Simplification of 2D Steady and Unsteady Vector Fields

    KAUST Repository

    Skraba, Primoz

    2015-08-01

    © 2015 IEEE. Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification techniques based on the topological skeleton successively remove pairs of critical points connected by separatrices, using distance or area-based relevance measures. These methods rely on the stable extraction of the topological skeleton, which can be difficult due to instability in numerical integration, especially when processing highly rotational flows. In this paper, we propose a novel simplification scheme derived from the recently introduced topological notion of robustness which enables the pruning of sets of critical points according to a quantitative measure of their stability, that is, the minimum amount of vector field perturbation required to remove them. This leads to a hierarchical simplification scheme that encodes flow magnitude in its perturbation metric. Our novel simplification algorithm is based on degree theory and has minimal boundary restrictions. Finally, we provide an implementation under the piecewise-linear setting and apply it to both synthetic and real-world datasets. We show local and complete hierarchical simplifications for steady as well as unsteady vector fields.

  9. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These ...

  10. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models....... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety...... to biotechnology applications, food, polymer and human health application areas. The book highlights to important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners....

  11. Simplification of the unified gas kinetic scheme

    Science.gov (United States)

    Chen, Songze; Guo, Zhaoli; Xu, Kun

    2016-08-01

    The unified gas kinetic scheme (UGKS) is an asymptotic preserving (AP) scheme for kinetic equations. It is superior for transition flow simulation and has been validated in the past years. However, compared to the well-known discrete ordinate method (DOM), which is a classical numerical method solving the kinetic equations, the UGKS needs more computational resources. In this study, we propose a simplification of the unified gas kinetic scheme. It allows almost identical numerical cost as the DOM, but predicts numerical results as accurate as the UGKS. In the simplified scheme, the numerical flux for the velocity distribution function and the numerical flux for the macroscopic conservative quantities are evaluated separately. The equilibrium part of the UGKS flux is calculated by analytical solution instead of the numerical quadrature in velocity space. The simplification is equivalent to a flux hybridization of the gas kinetic scheme for the Navier-Stokes (NS) equations and the conventional discrete ordinate method. Several simplification strategies are tested, through which we can identify the key ingredient of the Navier-Stokes asymptotic preserving property. Numerical tests show that, as long as the collision effect is built into the macroscopic numerical flux, the numerical scheme is Navier-Stokes asymptotic preserving, regardless the accuracy of the microscopic numerical flux for the velocity distribution function.

  12. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar Saavedra, J.A.; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  13. Memory Insensitive Simplification for View-Dependent Refinement

    Energy Technology Data Exchange (ETDEWEB)

    Lindstrom, P

    2002-04-03

    We present an algorithm for end-to-end out-of-core simplification and view-dependent visualization of large surfaces. The method consists of three phases: (1) memory insensitive simplification; (2) memory insensitive construction of a level-of-detail hierarchy; and (3) run-time, output-sensitive, view-dependent rendering and navigation of the mesh. The first two off-line phases are performed entirely on disk, and use only a small, constant amount of memory, whereas the run-time component relies on memory mapping to page in only the rendered parts of the mesh in a cache coherent manner. As a result, we are able to process and visualize arbitrarily large meshes given a sufficient amount of disk space--a constant multiple of the size of the input mesh. Similar to recent work on out-of-core simplification, our memory insensitive method uses vertex clustering on a uniform octree grid to coarsen a mesh and create a hierarchy, and a quadric error metric to choose vertex positions at all levels of resolution. We show how the quadric information can be used to concisely represent vertex position, surface normal, error, and curvature information for anisotropic view-dependent coarsening and silhouette preservation. The focus of this paper is on the out-of-core construction of a level-of-detail hierarchy---our framework is general enough to incorporate many different aspects of view-dependent rendering. We therefore emphasize the off-line phases of our method, and report on their theoretical and experimental memory and disk usage and execution time. Our results indicate on average one to two orders of magnitude improvement in processing speed over previous out-of-core methods. Meanwhile, all phases of the method are both disk and memory efficient, and are fairly straightforward to implement.
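
    The clustering core of this family of methods can be sketched in memory (the paper's contribution is doing it from disk with constant memory, and positioning representatives using accumulated quadrics rather than the centroids used here):

```python
import numpy as np

def cluster_simplify(verts, faces, cell=0.5):
    """Uniform-grid vertex clustering: all vertices falling in a grid
    cell are merged to their centroid, and faces whose corners collapse
    into fewer than three distinct cells are dropped."""
    keys = np.floor(verts / cell).astype(int)
    cells = {}
    for i, k in enumerate(map(tuple, keys)):
        cells.setdefault(k, []).append(i)
    remap, new_verts = {}, []
    for k, idx in cells.items():
        for i in idx:
            remap[i] = len(new_verts)        # all cell members -> one vertex
        new_verts.append(np.mean(verts[idx], axis=0))
    new_faces = []
    for f in faces:
        g = tuple(remap[i] for i in f)
        if len(set(g)) == 3:                 # drop degenerate faces
            new_faces.append(g)
    return np.array(new_verts), new_faces

verts = np.random.default_rng(1).random((1000, 3))
faces = [(i, i + 1, i + 2) for i in range(998)]
v2, f2 = cluster_simplify(verts, faces, cell=0.25)
print(len(verts), "->", len(v2), "vertices;", len(faces), "->", len(f2), "faces")
```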

  14. Business Process Modeling: Blueprinting

    OpenAIRE

    Al-Fedaghi, Sabah

    2017-01-01

    This paper presents a flow-based methodology for capturing processes specified in business process modeling. The proposed methodology is demonstrated through re-modeling of an IBM Blueworks case study. While the Blueworks approach offers a well-proven tool in the field, this should not discourage workers from exploring other ways of thinking about effectively capturing processes. The diagrammatic representation presented here demonstrates a viable methodology in this context. It is hoped this...

  15. Integrated model construction and simplification methods for Web 3D road

    Institute of Scientific and Technical Information of China (English)

    蒲浩; 李伟; 赵海峰

    2013-01-01

    In order to realize Web 3D visualization of road engineering, key technologies such as integrated road 3D model construction and constraint-aware model simplification were studied. Based on constrained Delaunay triangulation theory, a road 3D model with an integrated appearance and internal topological relationships was created. A half-edge collapse error metric that takes road constraint boundaries into account was proposed; using half-edge collapse operations, the road model is simplified as a whole on the server, and an operating hierarchical tree is built to store the operation records. A view-dependent strategy of refining constrained edges preferentially and deferring their simplification was put forward; combined with view-dependent reconstruction criteria, it reduces the amount of data that must be transmitted for visualization over the network and enables fast reconstruction of the road 3D model on the client. A system based on these methods has been developed and successfully applied to the Web-based construction management of several highways.

  16. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    Purpose – The paper aims: 1) To develop systematically a structural list of various business model process configurations and to group (deductively) these selected configurations in a structured typological categorization list. 2) To facilitate companies in the process of BM innovation, by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction method of data analysis. Findings - A comprehensive literature review and analysis resulted in a list of business model process configurations systematically organized under five classification groups, namely, revenue model; value proposition; value configuration; target customers, and strategic...

  17. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    hydrofoil shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements...

  18. Simplification-driven automated partial evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Boyle, J.M.

    1992-11-21

    I describe an automated approach to partial evaluation based on simplification and implemented by program transformations. The approach emphasizes program algebra and relies on canonical forms and distributive laws to expose instances to which simplifications can be applied. I discuss some of the considerations that led to the design of this approach. This design discussion should be useful both in understanding the structure of the partial evaluation transformations, and as an example of how to approach the design of automated program transformations in general. This approach to partial evaluation has been applied to a number of practical examples of moderate complexity, including: the running example used in this paper, proving an identity for lists, and eliminating a virtual data structure from a specification of practical interest. The chief practical barrier to its wider application is the growth of the intermediate program text during partial evaluation. Despite this limitation, this approach has the virtues of being implemented, automated, and able to partially evaluate specifications containing implicit data, including some specifications of practical interest.
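    A toy illustration of simplification-driven rewriting in this spirit (a sketch only; the author's transformations operate on real program text, not on the tuple expressions assumed here):

```python
def simplify(e):
    """Rewrite nested ('+'/'*', lhs, rhs) tuples to a fixed point using
    constant folding, annihilation by zero, and the distributive law;
    each rule exposes new instances for the others, as described above."""
    if not isinstance(e, tuple):
        return e
    op, a, b = e[0], simplify(e[1]), simplify(e[2])
    if isinstance(a, (int, float)) and isinstance(b, (int, float)):
        return {'+': a + b, '*': a * b}[op]          # constant folding
    if op == '*' and (a == 0 or b == 0):
        return 0                                     # x*0 -> 0
    if op == '*' and isinstance(b, tuple) and b[0] == '+':
        # distributive law as a canonical form: a*(x+y) -> a*x + a*y
        return simplify(('+', ('*', a, b[1]), ('*', a, b[2])))
    return (op, a, b)

print(simplify(('*', 2, ('+', 3, ('*', 0, 'x')))))   # -> 6
```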

  19. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  20. Structural simplification of chemical reaction networks in partial steady states.

    Science.gov (United States)

    Madelaine, Guillaume; Lhoussaine, Cédric; Niehren, Joachim; Tonello, Elisa

    2016-11-01

    We study the structural simplification of chemical reaction networks with partial steady state semantics assuming that the concentrations of some but not all species are constant. We present a simplification rule that can eliminate intermediate species that are in partial steady state, while preserving the dynamics of all other species. Our simplification rule can be applied to general reaction networks with some but few restrictions on the possible kinetic laws. We can also simplify reaction networks subject to conservation laws. We prove that our simplification rule is correct when applied to a module of a reaction network, as long as the partial steady state is assumed with respect to the complete network. Michaelis-Menten's simplification rule for enzymatic reactions falls out as a special case. We have implemented an algorithm that applies our simplification rules repeatedly and applied it to reaction networks from systems biology.
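    The Michaelis-Menten special case mentioned above can be recovered by eliminating the intermediate ES, assumed in partial steady state, from the scheme E + S ⇌ ES → E + P, using the conservation law [E] + [ES] = [E]₀:

```latex
\frac{d[ES]}{dt} = k_1 [E][S] - (k_{-1}+k_2)[ES] = 0
\;\Rightarrow\;
[ES] = \frac{[E]_0 [S]}{K_M + [S]},
\qquad
v = k_2 [ES] = \frac{k_2 [E]_0 [S]}{K_M + [S]},
\quad
K_M = \frac{k_{-1}+k_2}{k_1}.
```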

  1. Simplification of irreversible Markov chains by removal of states with fast leaving rates.

    Science.gov (United States)

    Jia, Chen

    2016-07-07

    In the recent work of Ullah et al. (2012a), the authors developed an effective method to simplify reversible Markov chains by removal of states with low equilibrium occupancies. In this paper, we extend this result to irreversible Markov chains. We show that an irreversible chain can be simplified by removal of states with fast leaving rates. Moreover, we reveal that the irreversibility of the chain will always decrease after model simplification. This suggests that although model simplification can retain almost all the dynamic information of the chain, it will lose some thermodynamic information as a trade-off. Examples from biology are also given to illustrate the main results of this paper.
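    A minimal sketch of the state-removal step (the standard elimination on the generator matrix, not necessarily the paper's exact construction):

```python
import numpy as np

def remove_state(Q, k):
    """Eliminate state k from a generator matrix Q (rows sum to zero)
    by rerouting its probability flux onto the remaining states:
        Q'_ij = Q_ij + Q_ik * Q_kj / (-Q_kk).
    Accurate when the leaving rate -Q_kk of state k is fast; the reduced
    rows still sum to zero, so Q' is again a generator."""
    idx = [i for i in range(Q.shape[0]) if i != k]
    return Q[np.ix_(idx, idx)] + np.outer(Q[idx, k], Q[k, idx]) / (-Q[k, k])

# toy irreversible 3-state chain; state 1 leaves ~100x faster than the others
Q = np.array([[-1.0,    1.0,   0.0],
              [ 0.0, -100.0, 100.0],
              [ 0.5,    0.0,  -0.5]])
print(remove_state(Q, 1))
```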

  2. Foam process models.

    Energy Technology Data Exchange (ETDEWEB)

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production-level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion, as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  3. Refactoring Process Models in Large Process Repositories.

    NARCIS (Netherlands)

    Weber, B.; Reichert, M.U.

    2008-01-01

    With the increasing adoption of process-aware information systems (PAIS), large process model repositories have emerged. Over time respective models have to be re-aligned to the real-world business processes through customization or adaptation. This bears the risk that model redundancies are introduced...

  5. Simplification of integrity constraints for data integration

    DEFF Research Database (Denmark)

    Christiansen, Henning; Martinenghi, Davide

    2004-01-01

    When two or more databases are combined into a global one, integrity may be violated even when each database is consistent with its own local integrity constraints. Efficient methods for checking global integrity in data integration systems are called for: answers to queries can then be trusted, because either the global database is known to be consistent or suitable actions have been taken to provide consistent views. The present work generalizes simplification techniques for integrity checking in traditional databases to the combined case. Knowledge of local consistency is employed, perhaps together with given a priori constraints on the combination, so that only a minimal number of tuples needs to be considered. Combination from scratch, integration of a new source, and absorption of local updates are dealt with for both the local-as-view and global-as-view approaches to data integration.

  6. A numerical experiment on tidal river simplification in simulation of tide dominated estuaries

    Science.gov (United States)

    Yin, X.; Jia, L.; Zhu, L.

    2017-04-01

    In numerical simulation of tide-dominated estuaries, introducing simplified tidal channels in place of real rivers is one strategy for dealing with a lack of topographic data. To understand the effects of this simplification and their sensitivity to the simplifying parameters, a numerical experiment was conducted to test parameters such as channel length L, surface width B, bed slope S, bottom elevation ▽0, bed roughness n and run-off Qr. The results indicated that parameter values tending to reduce the tidal prism or increase flow resistance produced larger simulation errors. For a better simplification, the values of the channel geometry, resistance and upstream inflow parameters need to be as consistent as possible with the averages of the natural river. The simplification method made the computation stable and fast, saved storage space, and was applicable to different time periods and seasons.

  7. Simplification of 3D Graphics for Mobile Devices: Exploring the Trade-off Between Energy Savings and User Perceptions of Visual Quality

    Science.gov (United States)

    Vatjus-Anttila, Jarkko; Koskela, Timo; Lappalainen, Tuomas; Häkkilä, Jonna

    2017-03-01

    3D graphics have quickly become a popular form of media that can also be accessed with today's mobile devices. However, the use of 3D applications on mobile devices is typically a very energy-consuming task due to the processing complexity and the large file size of 3D graphics. As a result, their use may lead to rapid depletion of the limited battery life. In this paper, we investigate how much energy can be saved in the transmission and rendering of 3D graphics by simplifying geometry data. In this connection, we also examine users' perceptions of the visual quality of the simplified 3D models. The results of this paper provide new knowledge on the energy savings that can be gained through geometry simplification, as well as on how much the geometry can be simplified before the visual quality of 3D models becomes unacceptable to mobile users. Based on the results, it can be concluded that geometry simplification can provide significant energy savings for mobile devices without disturbing the users. When geometry simplification was combined with distance-based adjustment of detail, up to 52% energy savings were gained in our experiments compared to using only a single high-quality 3D model.

  8. 77 FR 66361 - Reserve Requirements of Depository Institutions: Reserves Simplification

    Science.gov (United States)

    2012-11-05

    ... AD 83 Reserve Requirements of Depository Institutions: Reserves Simplification AGENCY: Board of... Regulation D (Reserve Requirements of Depository Institutions) published in the Federal Register on April 12... simplifications related to the administration of reserve requirements: 1. Create a common two-week...

  9. Heat transfer process of soil wall in Chinese solar greenhouse and its theoretical simplification methods

    Institute of Scientific and Technical Information of China (English)

    李明; 周长吉; 周涛; 尹义蕾; 富建鲁; 王志强; 齐长红

    2016-01-01

    To reduce the thickness of the soil wall of Chinese solar greenhouses, this study proposed wall simplification approaches based on an analysis of measured soil wall temperatures, and examined them theoretically. According to the tests, the soil wall can be divided into a heat storage layer and an insulation layer that prevents heat in the storage layer from escaping outdoors; about 86.9% of the wall serves as insulation. Simulation showed that a composite wall of 47 cm rammed earth plus 7 cm polystyrene board (thermal resistance equal to that of a 3.13 m thick rammed-earth insulation layer) releases nearly the same heat at night as the 3.6 m thick soil wall, so replacing the earthen insulation layer with insulation material is theoretically feasible. In addition, simulations indicated that once the soil temperature at 20 cm depth is raised to 23°C, the heat supplied by the soil can exceed the combined heat release of the soil and wall under the test conditions. The soil wall can therefore, in theory, be simplified in two ways: 1) building the wall insulation layer from insulating materials; 2) using soil heat storage instead of wall heat storage. Soil wall of the Chinese solar greenhouse (hereafter referred to as "solar greenhouse") has problems of occupying large area and damaging the cultivation land. The simplification of soil wall, which means decreasing the thickness and soil use of the soil wall, becomes very important. The purpose of this study is to develop simplification methods of soil wall. A simplification wall with less soil use was proposed based on the measured temperature of soil wall and analysis of feasibility of those methods. The tested solar greenhouse was located in Yongqing county, Lanfang city, Hebei province (116°44′ E, 36°27′ N). It is 50 m long and 10 m wide. The top and bottom thicknesses of the soil wall were 2.0 and 5.3 m, respectively. Its average thickness was 3.6 m. The test period was from Dec. 01, 2013 to Mar. 30, 2014. During that time, the tested solar greenhouse was used to growing cucumber with surface irrigation. The heat insulation sheet of the solar greenhouse was rolled up and down at 8:30 am and 5:00 pm daily, respectively. The wind vent was open if the indoor air temperature was high during daytime. The indoor and...
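    The claimed equivalence can be checked with series thermal resistances R = L/λ; the conductivities below are illustrative assumptions (moist rammed earth ≈ 1.8 W m⁻¹K⁻¹, expanded polystyrene ≈ 0.04 W m⁻¹K⁻¹), not values from the paper:

```latex
R_{\mathrm{soil}} = \frac{3.13\,\mathrm{m}}{1.8\,\mathrm{W\,m^{-1}K^{-1}}} \approx 1.74\,\mathrm{m^2K\,W^{-1}},
\qquad
R_{\mathrm{EPS}} = \frac{0.07\,\mathrm{m}}{0.04\,\mathrm{W\,m^{-1}K^{-1}}} \approx 1.75\,\mathrm{m^2K\,W^{-1}}.
```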

  10. Quantum copying and simplification of the quantum Fourier transform

    Science.gov (United States)

    Niu, Chi-Sheng

    Theoretical studies of quantum computation and quantum information theory are presented in this thesis. Three topics are considered: simplification of the quantum Fourier transform in Shor's algorithm, optimal eavesdropping in the BB84 quantum cryptographic protocol, and quantum copying of one qubit. The quantum Fourier transform preceding the final measurement in Shor's algorithm is simplified by replacing a network of quantum gates with one that has fewer and simpler gates controlled by classical signals. This simplification results from an analysis of the network using the consistent history approach to quantum mechanics. The optimal amount of information which an eavesdropper can gain, for a given level of noise in the communication channel, is worked out for the BB84 quantum cryptographic protocol. The optimal eavesdropping strategy is expressed in terms of various quantum networks. A consistent history analysis of these networks using two conjugate quantum bases shows how the information gain in one basis influences the noise level in the conjugate basis. The no-cloning property of quantum systems, which is the physics behind quantum cryptography, is studied by considering copying machines that generate two imperfect copies of one qubit. The best qualities these copies can have are worked out with the help of the Bloch sphere representation for one qubit, and a quantum network is worked out for an optimal copying machine. If the copying machine does not have additional ancillary qubits, the copying process can be viewed using a 2-dimensional subspace in a product space of two qubits. A special representation of such a two-dimensional subspace makes possible a complete characterization of this type of copying. This characterization in turn leads to simplified eavesdropping strategies in the BB84 and the B92 quantum cryptographic protocols.

  11. Generalized Topological Simplification of Scalar Fields on Surfaces.

    Science.gov (United States)

    Tierny, J; Pascucci, V

    2012-12-01

    We present a combinatorial algorithm for the general topological simplification of scalar fields on surfaces. Given a scalar field f, our algorithm generates a simplified field g that provably admits only critical points from a constrained subset of the singularities of f, while guaranteeing a small distance ||f - g||∞ for data-fitting purpose. In contrast to previous algorithms, our approach is oblivious to the strategy used for selecting features of interest and allows critical points to be removed arbitrarily. When topological persistence is used to select the features of interest, our algorithm produces a standard ϵ-simplification. Our approach is based on a new iterative algorithm for the constrained reconstruction of sub- and sur-level sets. Extensive experiments show that the number of iterations required for our algorithm to converge is rarely greater than 2 and never greater than 5, yielding O(n log(n)) practical time performances. The algorithm handles triangulated surfaces with or without boundary and is robust to the presence of multi-saddles in the input. It is simple to implement, fast in practice and more general than previous techniques. Practically, our approach allows a user to arbitrarily simplify the topology of an input function and robustly generate the corresponding simplified function. An appealing application area of our algorithm is in scalar field design since it enables, without any threshold parameter, the robust pruning of topological noise as selected by the user. This is needed for example to get rid of inaccuracies introduced by numerical solvers, thereby providing topological guarantees needed for certified geometry processing. Experiments show this ability to eliminate numerical noise as well as validate the time efficiency and accuracy of our algorithm. We provide a lightweight C++ implementation as supplemental material that can be used for topological cleaning on surface meshes.

  12. OPC mask simplification using over-designed timing slack of standard cells

    Science.gov (United States)

    Qu, Yifan; Heng, Chun Huat; Tay, Arthur; Lee, Tong Heng

    2013-05-01

    It is well known that VLSI circuits must be designed to tolerate variations in process, voltage, temperature, etc. As a result, standard cell libraries (collections of the basic circuit components) are usually designed with large margins (also known as "timing slack"). In circuit manufacturing, however, only part of the margin is utilized. Knowledge of the remaining margin (over-designed timing slack), combined with models that link the timing domain and the shape domain, can help to reduce the complexity of mask patterns and the manufacturing cost. This paper proposes a novel methodology to simplify mask patterns in optical proximity correction (OPC) by using the extra margin in timing (over-designed timing slack). The methodology can be applied after a conventional OPC and is compatible with the current application-specific integrated circuit (ASIC) design flow. The iterative method is applied to each occurrence of over-designed timing slack, whose actual value can be estimated from post-OPC simulation. A timing cost function is developed in this work to map timing slack in the timing domain to mask patterns in the shape domain. This enables us to adjust mask patterns selectively based on the outcome of the cost function. All related mask patterns with over-designed timing slack are annotated and simplified using our proposed mask simplification algorithm, which merges nearby edge fragments on the mask patterns. Simulations are conducted on a standard cell library and a full chip design to validate the proposed approach. Compared to existing OPC methods without mask simplification in the literature, our approach achieved a 51% reduction in mask fragment count, which directly leads to a large saving in lithography manufacturing cost. The result also shows that timing closure is ensured, though part of the timing slack has been sacrificed.

  13. Organisational simplification and secondary complexity in health services for adults with learning disabilities.

    Science.gov (United States)

    Heyman, Bob; Swain, John; Gillman, Maureen

    2004-01-01

    This paper explores the role of complexity and simplification in the delivery of health care for adults with learning disabilities, drawing upon qualitative data obtained in a study carried out in NE England. It is argued that the requirement to manage complex health needs with limited resources causes service providers to simplify, standardise and routinise care. Simplified service models may work well enough for the majority of clients, but can impede recognition of the needs of those whose characteristics are not congruent with an adopted model. The data were analysed in relation to the core category, identified through thematic analysis, of secondary complexity arising from organisational simplification. Organisational simplification generates secondary complexity when operational routines designed to make health complexity manageable cannot accommodate the needs of non-standard service users. Associated themes, namely the social context of services, power and control, communication skills, expertise and service inclusiveness and evaluation are explored in relation to the core category. The concept of secondary complexity resulting from organisational simplification may partly explain seemingly irrational health service provider behaviour.

  14. Modular process modeling for OPC

    Science.gov (United States)

    Keck, M. C.; Bodendorf, C.; Schmidtling, T.; Schlief, R.; Wildfeuer, R.; Zumpe, S.; Niehoff, M.

    2007-03-01

    Modular OPC modeling, describing mask, optics, resist and etch processes separately, is an approach to keep the effort for OPC manageable. By exchanging single modules of a modular OPC model, a fast response to process changes during process development is possible. At the same time, effort can be reduced, since only the single modular process steps have to be re-characterized as input for OPC modeling as the process is adjusted and optimized. Commercially available OPC tools for full chip processing typically make use of semi-empirical models. The goal of our work is to investigate to what extent these OPC tools can be applied to model single process steps as separate modules. For an advanced gate level process, we analyze the modeling accuracy over different process conditions (focus and dose) when combining the models for each process step - optics, resist and etch - of different single processes into a model describing the total process.

  15. Exposing earth surface process model simulations to a large audience

    Science.gov (United States)

    Overeem, I.; Kettner, A. J.; Borkowski, L.; Russell, E. L.; Peddicord, H.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) represents a diverse group of >1300 scientists who develop and apply numerical models to better understand the Earth's surface. CSDMS has a mandate to make the public more aware of model capabilities and therefore started sharing state-of-the-art surface process modeling results with large audiences. One platform to reach audiences outside the science community is through museum displays on 'Science on a Sphere' (SOS). Developed by NOAA, SOS is a giant globe, linked with computers and multiple projectors and can display data and animations on a sphere. CSDMS has developed and contributed model simulation datasets for the SOS system since 2014, including hydrological processes, coastal processes, and human interactions with the environment. Model simulations of a hydrological and sediment transport model (WBM-SED) illustrate global river discharge patterns. WAVEWATCH III simulations have been specifically processed to show the impacts of hurricanes on ocean waves, with focus on hurricane Katrina and super storm Sandy. A large world dataset of dams built over the last two centuries gives an impression of the profound influence of humans on water management. Given the exposure of SOS, CSDMS aims to contribute at least 2 model datasets a year, and will soon provide displays of global river sediment fluxes and changes of the sea ice free season along the Arctic coast. Over 100 facilities worldwide show these numerical model displays to an estimated 33 million people every year. Datasets storyboards, and teacher follow-up materials associated with the simulations, are developed to address common core science K-12 standards. CSDMS dataset documentation aims to make people aware of the fact that they look at numerical model results, that underlying models have inherent assumptions and simplifications, and that limitations are known. CSDMS contributions aim to familiarize large audiences with the use of numerical

  16. Linguistic Simplification: A Promising Test Accommodation for LEP Students?

    Directory of Open Access Journals (Sweden)

    Charles W. Stansfield

    2002-07-01

    Full Text Available This article is a synopsis of an experimental study of the effects of linguistic simplification, a test accommodation designed for LEP students. Conducted as part of Delaware's statewide assessment program, this study examined the effects of linguistic simplification of fourth- and sixth-grade science test items and specifically looked at score comparability between LEP and non-LEP examinees.

  17. New technique for system simplification using Cuckoo search and ESA

    Indian Academy of Sciences (India)

    AFZAL SIKANDER; RAJENDRA PRASAD

    2017-09-01

    In this study, a new technique is suggested for the simplification of linear time-invariant systems. Motivated by optimization and the various system simplification techniques available in the literature, the proposed technique is formulated using Cuckoo search in combination with Lévy flight and eigen-spectrum analysis. The efficacy and power of the new technique are illustrated on three benchmark systems taken from previously published work, and the results are compared in terms of performance indices.
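    A generic sketch of the metaheuristic half of such a technique (Cuckoo search with Mantegna-style Lévy flights minimizing a reduced-model error index; the cost function and all parameter values are assumptions, not the authors' formulation):

```python
import math
import numpy as np

def cuckoo_search(cost, dim, n=15, iters=200, pa=0.25, alpha=0.01):
    """Minimize `cost` (e.g. an integral-squared-error between full and
    reduced model step responses) over `dim` reduced-model parameters."""
    beta = 1.5   # Levy exponent; sigma from Mantegna's algorithm
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    nests = np.random.uniform(-2, 2, (n, dim))
    fit = np.array([cost(x) for x in nests])
    for _ in range(iters):
        best = nests[fit.argmin()]
        u = np.random.normal(0, sigma, (n, dim))
        v = np.random.normal(0, 1, (n, dim))
        new = nests + alpha * u / np.abs(v) ** (1 / beta) * (nests - best)
        newfit = np.array([cost(x) for x in new])
        better = newfit < fit
        nests[better], fit[better] = new[better], newfit[better]
        worst = fit.argsort()[-max(1, int(pa * n)):]   # abandon worst nests
        nests[worst] = np.random.uniform(-2, 2, (len(worst), dim))
        fit[worst] = np.array([cost(x) for x in nests[worst]])
    return nests[fit.argmin()], fit.min()
```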

  18. Impact of pipes networks simplification on water hammer phenomenon

    Indian Academy of Sciences (India)

    Ali A M Gad; Hassan I Mohammed

    2014-10-01

    Simplification of water supply networks is an indispensable design step to make the original network easier to analyse. The impact of network simplification on the water hammer phenomenon is investigated. This study uses a two-loop network with different diameters, thicknesses, and roughness coefficients. The network is fed from a boundary head reservoir and loaded by either distributed or concentrated boundary water demands. According to both hydraulic and hydraulic-plus-water-quality equivalence, three simplification levels are performed. The effect of demand concentration on the transient flow is checked. The transient flow is initialized by either concentrated or distributed boundary demands which are suddenly shut off or released. WHAMO software is used for simulation. All scenarios showed that both hydraulic equivalence and demand-concentration simplifications increase the transient pressure and flow rate. However, hydraulic-plus-water-quality equivalence simplification produces an adverse effect. Therefore, network simplification should be done carefully. Also, it was found that pump shut-off gives the same trend as valve shut-off or release.

  19. Fierz relations for Volkov spinors and the simplification of Furry picture traces

    CERN Document Server

    Hartin, A

    2016-01-01

    Transition probability calculations of strong field particle processes in the Furry picture typically use fermion Volkov solutions. These solutions have a relatively complicated spinor due to the interaction of the electron spin with a strong external field, which in turn leads to unwieldy trace calculations. The simplification of these calculations would aid theoretical studies of strong field phenomena such as the predicted resonance behaviour of higher order Furry picture processes. Here, Fierz transformations of Volkov spinors are developed and applied to a 1st order and a 2nd order Furry picture process. Combined with symmetry properties, the techniques presented here are generally applicable and lead to considerable simplification of Furry picture analytic calculations.

  20. On Activity modelling in process modeling

    Directory of Open Access Journals (Sweden)

    Dorel Aiordachioaie

    2001-12-01

    Full Text Available The paper considers the dynamic feature of meta-models of the process modelling process: time. Some principles are considered and discussed as main dimensions of any modelling activity: the compatibility of the substances, the equipresence of phenomena and the solvability of the model. The activity models are considered and represented at the meta-level.

  1. Auditory processing models

    DEFF Research Database (Denmark)

    Dau, Torsten

    2008-01-01

    The Handbook of Signal Processing in Acoustics will compile the techniques and applications of signal processing as they are used in the many varied areas of Acoustics. The Handbook will emphasize the interdisciplinary nature of signal processing in acoustics. Each Section of the Handbook will pr...

  2. Modeling and computational simulation of the osmotic evaporation process

    Directory of Open Access Journals (Sweden)

    Freddy Forero Longas

    2016-09-01

    Full Text Available Context: Among membrane processing technologies, osmotic evaporation is a promising alternative for the transformation of exotic fruits, generating concentrated products that can be used in the daily diet, are easier to consume, reduce transportation costs and have longer shelf life. Method: In this research, a comprehensive strategy for multiphysics modeling and simulation of the mass and momentum transfer phenomena in the osmotic evaporation process was developed using Comsol® and Matlab® software. An axisymmetric two-dimensional geometry was used as a simplification of the real module, with the finite element method for the numerical solution. The simulations were validated experimentally in a laboratory-scale osmotic evaporation system. Results: The models used and the simulations generated were statistically significant (p < 0.05) in predicting the flux behavior, taking into account the effects of feed flow and temperature together with brine flow; correlations above 96% were obtained between experimental and calculated data. Conclusions: For the conditions studied, the Knudsen diffusion model is the most suitable to describe the transfer of water vapor through the hydrophobic membrane. The simulations developed adequately describe the osmotic evaporation process, becoming a tool for faster economic development of this technology.
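    For reference, the standard Knudsen-flux expression for a hydrophobic membrane pore (a textbook form, not necessarily the exact parameterization used in the paper), with pore radius r, porosity ε, tortuosity τ, membrane thickness δ and water molar mass M:

```latex
N_w \;=\; \frac{2\,\varepsilon\,r}{3\,\tau\,\delta}\,
\sqrt{\frac{8}{\pi M R T}}\;\bigl(p_{\text{feed}} - p_{\text{brine}}\bigr).
```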

  3. Simplification of arboreal marsupial assemblages in response to increasing urbanization.

    Directory of Open Access Journals (Sweden)

    Bronwyn Isaac

    Full Text Available Arboreal marsupials play an essential role in ecosystem function in Australian environments, including regulating insect and plant populations, facilitating pollen and seed dispersal, and acting as a prey source for higher-order carnivores. Primarily, research has focused on their biology, ecology and response to disturbance in forested and urban environments. We used presence-only species distribution modelling to understand the relationship between occurrences of arboreal marsupials and eco-geographical variables, and to infer habitat suitability across an urban gradient. We used post-proportional analysis to determine whether increasing urbanization affected potential habitat for arboreal marsupials. The key eco-geographical variables that influenced disturbance-intolerant species and those with moderate tolerance to disturbance were natural features such as tree cover and proximity to rivers and riparian vegetation, whereas the variables for disturbance-tolerant species were anthropogenic (e.g., road density) but also included some natural characteristics such as proximity to riparian vegetation, elevation and tree cover. Arboreal marsupial diversity was subject to substantial change along the gradient, with potential habitat for disturbance-tolerant marsupials distributed across the complete gradient and potential habitat for less tolerant species restricted to the natural portion of the gradient. This resulted in highly urbanized environments being inhabited by a few generalist arboreal marsupial species. Increasing urbanization therefore leads to functional simplification of arboreal marsupial assemblages, thus impacting the ecosystem services they provide.

  4. GREAT Process Modeller user manual

    OpenAIRE

    Rueda, Urko; España, Sergio; Ruiz, Marcela

    2015-01-01

    This report contains instructions to install, uninstall and use GREAT Process Modeller, a tool that supports Communication Analysis, a communication-oriented business process modelling method. GREAT allows creating communicative event diagrams (i.e. business process models), specifying message structures (which describe the messages associated to each communicative event), and automatically generating a class diagram (representing the data model of an information system that would support suc...

  5. INNOVATION PROCESS MODELLING

    Directory of Open Access Journals (Sweden)

    JANUSZ K. GRABARA

    2011-01-01

    Full Text Available Modelling phenomena in accordance with the structural approach enables one to simplify the observed relations and to present the classification grounds. An example may be a model of organisational structure identifying the logical relations between particular units and presenting the division of authority and work.

  6. Decomposition and Simplification of Multivariate Data using Pareto Sets.

    Science.gov (United States)

    Huettenberger, Lars; Heine, Christian; Garth, Christoph

    2014-12-01

    Topological and structural analysis of multivariate data is aimed at improving the understanding and usage of such data through identification of intrinsic features and structural relationships among multiple variables. We present two novel methods for simplifying so-called Pareto sets that describe such structural relationships. Such simplification is a precondition for meaningful visualization of structurally rich or noisy data. As a framework for simplification operations, we introduce a decomposition of the data domain into regions of equivalent structural behavior and the reachability graph that describes global connectivity of Pareto extrema. Simplification is then performed as a sequence of edge collapses in this graph; to determine a suitable sequence of such operations, we describe and utilize a comparison measure that reflects the changes to the data that each operation represents. We demonstrate and evaluate our methods on synthetic and real-world examples.

  7. BPMN Impact on Process Modeling

    OpenAIRE

    Polak, Przemyslaw

    2013-01-01

    Recent years have seen huge rise in popularity of BPMN in the area of business process modeling, especially among business analysts. This notation has characteristics that distinguish it significantly from the previously popular process modeling notations, such as EPC. The article contains the analysis of some important characteristics of BPMN and provides author’s conclusions on the impact that the popularity and specificity of BPMN can have on the practice of process modeling. Author's obse...

  8. A consistent positive association between landscape simplification and insecticide use across the Midwestern US from 1997 through 2012

    Science.gov (United States)

    Meehan, Timothy D.; Gratton, Claudio

    2015-11-01

    During 2007, counties across the Midwestern US with relatively high levels of landscape simplification (i.e., widespread replacement of seminatural habitats with cultivated crops) had relatively high crop-pest abundances which, in turn, were associated with relatively high insecticide application. These results suggested a positive relationship between landscape simplification and insecticide use, mediated by landscape effects on crop pests or their natural enemies. A follow-up study, in the same region but using different statistical methods, explored the relationship between landscape simplification and insecticide use between 1987 and 2007, and concluded that the relationship varied substantially in sign and strength across years. Here, we explore this relationship from 1997 through 2012, using a single dataset and two different analytical approaches. We demonstrate that, when using ordinary least squares (OLS) regression, the relationship between landscape simplification and insecticide use is, indeed, quite variable over time. However, the residuals from OLS models show strong spatial autocorrelation, indicating spatial structure in the data not accounted for by explanatory variables, and violating a standard assumption of OLS. When modeled using spatial regression techniques, relationships between landscape simplification and insecticide use were consistently positive between 1997 and 2012, and model fits were dramatically improved. We argue that spatial regression methods are more appropriate for these data, and conclude that there remains compelling correlative support for a link between landscape simplification and insecticide use in the Midwestern US. We discuss the limitations of inference from this and related studies, and suggest improved data collection campaigns for better understanding links between landscape structure, crop-pest pressure, and pest-management practices.
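    The diagnostic that separates the two analyses can be reproduced in a few lines: fit OLS, then compute Moran's I of the residuals under a spatial weight matrix (everything below is synthetic, hypothetical data, shown only to illustrate the workflow):

```python
import numpy as np

def morans_i(z, W):
    """Moran's I of a residual vector under weight matrix W; values far
    from ~0 signal the spatial autocorrelation that violates OLS and
    motivates a spatial (error) regression instead."""
    z = z - z.mean()
    return len(z) / W.sum() * (z @ W @ z) / (z @ z)

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(0, 1, 50)])  # intercept, % cropland
y = X @ np.array([0.2, 1.5]) + rng.normal(0, 0.1, 50)      # insecticide use
beta, *_ = np.linalg.lstsq(X, y, rcond=None)               # OLS fit
W = (rng.uniform(size=(50, 50)) < 0.1).astype(float)       # stand-in county adjacency
np.fill_diagonal(W, 0.0)
W /= np.maximum(W.sum(axis=1, keepdims=True), 1.0)         # row-standardize
print(beta, morans_i(y - X @ beta, W))
```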

  9. Fermilab experience of post-annealing losses in SRF niobium cavities due to furnace contamination and the ways to its mitigation: a pathway to processing simplification and quality factor improvement

    CERN Document Server

    Grassellino, A; Crawford, A; Melnychuk, O; Rowe, A; Wong, M; Cooper, C; Sergatskov, D; Bice, D; Trenikhina, Y; Cooley, L D; Ginsburg, C; Kephart, R D

    2013-01-01

    We investigate the effect of high temperature treatments followed by only high-pressure water rinsing (HPR) of superconducting radio frequency (SRF) niobium cavities. The objective is to provide a cost-effective alternative to the typical cavity processing sequence by eliminating the material removal step after furnace treatment while preserving or improving the RF performance. The studies have been conducted in the temperature range 800-1000 °C for different conditions of the starting substrate: large grain and fine grain, electro-polished (EP) and centrifugal barrel polished (CBP) to mirror finish. An interesting effect of the grain size on the performance is found. Cavity results and sample characterization show that furnace contaminants cause poor cavity performance, and a practical solution is found to prevent surface contamination. Extraordinary values of residual resistance ~1 nΩ and below are then consistently achieved for the contamination-free cavities. These results lead to a more cost-effective ...

  10. Optimization and simplification of the Allergic and Hypersensitivity conditions classification for the ICD-11.

    Science.gov (United States)

    Tanno, L K; Calderon, M A; Demoly, P

    2016-05-01

    Since 2013, an international collaboration of Allergy Academies, including first the World Allergy Organization (WAO), the American Academy of Allergy Asthma and Immunology (AAAAI), and the European Academy of Allergy and Clinical Immunology (EAACI), and then the American College of Allergy, Asthma and Immunology (ACAAI), the Latin American Society of Allergy, Asthma and Immunology (SLAAI), and the Asia Pacific Association of Allergy, Asthma and Clinical Immunology (APAAACI), has made tremendous efforts to achieve a better and updated classification of allergic and hypersensitivity conditions in the forthcoming International Classification of Diseases (ICD)-11 version, by providing evidence and promoting actions for the need for changes. The latest action was the implementation of a classification proposal of hypersensitivity/allergic diseases built by crowdsourcing the Allergy Academy leaderships. Following bilateral discussions with the representatives of the ICD-11 revision, a face-to-face meeting was held at the United Nations Office in Geneva and a simplification process of the hypersensitivity/allergic disorders classification was carried out to better fit the ICD structure. We present here the end result of what we consider to be a model of good collaboration between the World Health Organization and a specialty. We strongly believe that the outcomes of all past and future actions will positively impact the recognition of the allergy specialty as well as the quality improvement of healthcare systems for allergic and hypersensitivity conditions worldwide. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  11. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.
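    A convection-diffusion model of the kind described takes the familiar form below (a generic statement, with Q(c) the volumetric reaction or interphase mass-transfer source); the average-concentration model then follows by averaging c and the velocity u over the column cross-section:

```latex
\frac{\partial c}{\partial t} + u(r)\,\frac{\partial c}{\partial z}
= D\left(\frac{\partial^2 c}{\partial z^2}
+ \frac{1}{r}\,\frac{\partial}{\partial r}\!\left(r\,\frac{\partial c}{\partial r}\right)\right) + Q(c).
```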

  12. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on proper understanding of the functionality of information systems that shall support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, we assess whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  13. Modeling Software Processes and Artifacts

    NARCIS (Netherlands)

    van den Berg, Klaas; Bosch, Jan; Mitchell, Stuart

    1997-01-01

    The workshop on Modeling Software Processes and Artifacts explored the application of object technology in process modeling. After the introduction and the invited lecture, a number of participants presented their position papers. First, an overview is given on some background work, and the aims, as

  14. Simplification in Graded Readers: Measuring the Authenticity of Graded Texts

    Science.gov (United States)

    Claridge, Gillian

    2005-01-01

    This study examines the characteristics and quality of simplification in graded readers as compared to those of "normal" authentic English. Two passages from graded readers are compared with the original passages. The comparison uses a computer programme, RANGE (Nation and Heatley, 2003) to analyse the distribution of high and low frequency words…

  15. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    ...In this way the model parameters that drive the main dynamic behavior can be identified, and thus a better understanding of this type of process gained. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid... The subject of this thesis is to develop a methodological framework that can systematically guide mathematical model building for better understanding of multi-enzyme processes. In this way, opportunities for process improvements can be identified by analyzing simulations of either existing... in the scientific literature. Reliable mathematical models of such multi-catalytic schemes can exploit the potential benefit of these processes. In this way, the best outcome of the process can be obtained by understanding the types of modification that are required for process optimization. An effective evaluation...
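    A minimal sketch of the kind of multi-enzyme model meant here: a bi-enzyme cascade S → I → P with Michaelis-Menten kinetics (generic values; not the thesis' lactobionic acid model):

```python
from scipy.integrate import solve_ivp

def cascade(t, y, vmax1, km1, vmax2, km2):
    s, i, p = y
    r1 = vmax1 * s / (km1 + s)   # enzyme 1: substrate -> intermediate
    r2 = vmax2 * i / (km2 + i)   # enzyme 2: intermediate -> product
    return [-r1, r1 - r2, r2]

# simulate 60 time units from 10 mM substrate; sensitivity of the output
# to the (vmax, Km) pairs is what identifies the dominant parameters
sol = solve_ivp(cascade, (0, 60), [10.0, 0.0, 0.0], args=(1.0, 2.0, 0.5, 1.0))
print(sol.y[:, -1])              # final [S, I, P]
```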

  16. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  17. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
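    For reference, the point kinetics equations with six delayed-neutron groups that such Simulink blocks implement (standard textbook form):

```latex
\frac{dn}{dt} = \frac{\rho - \beta}{\Lambda}\,n + \sum_{i=1}^{6}\lambda_i C_i,
\qquad
\frac{dC_i}{dt} = \frac{\beta_i}{\Lambda}\,n - \lambda_i C_i,
\quad i = 1,\dots,6,
```

    with n the neutron density, ρ the reactivity, β = Σᵢβᵢ, Λ the neutron generation time, and λᵢ, Cᵢ the decay constant and precursor concentration of delayed group i.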

  19. Elaboration and Simplification in Spanish Discourse

    Science.gov (United States)

    Granena, Gisela

    2008-01-01

    This article compares spoken discourse models in Spanish as a second language textbooks and online language learning resources with naturally occurring conversations. Telephone service encounters are analyzed from the point of view of three different dimensions of authenticity: linguistic, sociolinguistic, and psycholinguistic. An analysis of 20…

  20. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore we also analyze specifications obtained via a simple deterministic time-change of a homogeneous Lévy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared...
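    In symbols, the setup both Sato-process records describe is (standard reduced-form notation):

```latex
P(\tau > t) \;=\; \mathbb{E}\!\left[e^{-\Lambda_t}\right],
\qquad
\Lambda_t = \int_0^t \lambda_s\,ds \ \text{(classical intensity-based case)},
```

    where the new model class specifies the cumulative hazard Λ directly as a Sato process instead of integrating an intensity λ.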

  1. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore, we analyze specifications obtained via a simple deterministic time-change of a homogeneous Lévy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on all the single names included in the iTraxx Europe index. The performances are compared with those of the classical CIR...

  2. Optimization of biopharmaceutical downstream processes supported by mechanistic models and artificial neural networks.

    Science.gov (United States)

    Pirrung, Silvia M; van der Wielen, Luuk A M; van Beckhoven, Ruud F W C; van de Sandt, Emile J A X; Eppink, Michel H M; Ottens, Marcel

    2017-01-05

    Downstream process development is a major area of importance within the field of bioengineering. During the design of such a downstream process, important decisions have to be made regarding the type of unit operations as well as their sequence and their operating conditions. Current computational approaches addressing these issues either show a high level of simplification or struggle with computational speed. Therefore, this article presents a new approach that combines detailed mechanistic models and speed-enhancing artificial neural networks. This approach was able to simultaneously optimize a process with three different chromatographic columns toward yield with a minimum purity of 99.9%. The addition of artificial neural networks greatly accelerated this optimization. Due to high computational speed, the approach is easily extendable to include more unit operations. Therefore, it can be of great help in the acceleration of downstream process development. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 2017.
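    The hybrid idea can be sketched in a few lines: sample the slow mechanistic simulator, train a fast ANN surrogate, and optimize the surrogate (everything here is a hypothetical stand-in, not the authors' chromatography models):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import differential_evolution

def mechanistic_yield(x):                 # stand-in for a slow column simulation
    return -((x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2)

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (200, 2))           # sampled operating conditions
y = np.apply_along_axis(mechanistic_yield, 1, X)
surrogate = MLPRegressor((32, 32), max_iter=5000, random_state=1).fit(X, y)

# optimize the cheap surrogate instead of the expensive simulator
res = differential_evolution(lambda x: -surrogate.predict(x.reshape(1, -1))[0],
                             bounds=[(0, 1), (0, 1)], seed=1)
print(res.x)                              # close to the true optimum (0.3, 0.7)
```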

  3. Modelling of CWS combustion process

    Science.gov (United States)

    Rybenko, I. A.; Ermakova, L. A.

    2016-10-01

    The paper considers the combustion process of coal water slurry (CWS) drops. A physico-chemical process scheme consisting of several independent parallel-sequential stages is proposed. This scheme of the drop combustion process is supported by particle size distribution tests and stereomicroscopic analysis of the combustion products. The results of mathematical modelling and optimization of stationary regimes of CWS combustion are provided. During modelling, the problem of determining the possible equilibrium composition of products obtainable from CWS combustion at different temperatures is solved.

  4. Social Models: Blueprints or Processes?

    Science.gov (United States)

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  5. Parameter simplification of Green-Ampt infiltration models and relationships between infiltration and soil physical parameters

    Institute of Scientific and Technical Information of China (English)

    刘姗姗; 白美健; 许迪; 李益农; 胡卫东

    2012-01-01

    ...correlation with soil compaction and clay content, with coefficients of 0.74 and 0.73 respectively. A high multiple linear correlation was found between A and the soil physical parameters (correlation coefficient 0.90), and a medium multiple linear correlation between Ks and the soil physical parameters (correlation coefficient 0.79). The average relative error between the observed data and the infiltration parameters obtained from the empirical conversion functions was about 10%. The results indicated that the simplified Green-Ampt model simulates the soil infiltration process with reasonable precision.
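    For context, the Green-Ampt model being simplified relates infiltration rate f to cumulative infiltration F as below; a common two-parameter simplification lumps the wetting-front term into a single coefficient A ≈ K_s ψ_f Δθ (the role the abstract's A appears to play, though the paper's exact definition is not shown here):

```latex
f \;=\; K_s\left(1 + \frac{\psi_f\,\Delta\theta}{F}\right) \;\approx\; K_s + \frac{A}{F},
```

    with K_s the saturated hydraulic conductivity, ψ_f the wetting-front suction head and Δθ the initial soil-moisture deficit.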

  6. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model of raw material supply is developed for processing enterprises belonging to a vertically integrated structure for the production and processing of dairy raw materials. Its distinguishing feature is an orientation towards the cumulative effect achieved by the integrated structure, which acts as the criterion function; this is maximized by optimizing capacities, raw material delivery volumes and their quality characteristics, the costs of industrial processing of raw materials, and the demand for dairy products.

  7. Model Checking of Boolean Process Models

    CERN Document Server

    Schneider, Christoph

    2011-01-01

    In the field of Business Process Management, formal models for the control flow of business processes have been designed for more than 15 years. Which methods are best suited to verify the bulk of these models? The first step is to select a formal language which fixes the semantics of the models. We adopt the language of Boolean systems as the reference language for Boolean process models. Boolean systems form a simple subclass of coloured Petri nets. Their characteristics are low token counts, to model states explicitly with subsequent skipping of activations, and arbitrary logical rules of type AND, XOR, OR etc. to model the split and join of the control flow. We apply model checking as a verification method for the safeness and liveness of Boolean systems. Model checking of Boolean systems uses the elementary theory of propositional logic; no modal operators are needed. Our verification builds on a finite complete prefix of a certain T-system attached to the Boolean system. It splits the processes of the Boolean sy...

  8. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Full Text Available Results of numerical modeling of dynamic problems are summarized in the article. These problems are characteristic of various areas of human activity, in particular problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas cleaning equipment, and modeling of biogas formation processes.

  9. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  10. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

Many production processes are carried out in stages. At the end of each stage, the production engineer can analyze the intermediate results and correct process parameters (variables) of the next stage. Both analysis of the process and correction to process parameters at the next stage should ... and having three or more stages. The methods are applied to a process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables ... of the next stage with the purpose of obtaining optimal or almost optimal quality of the output variable. An important aspect of the methods presented is the possibility of extensive graphic analysis of data that can provide the engineer with a detailed view of the multi-variate variation in data.

  11. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined processes perspectives. Second, the paper translates this information into a business process model and mathematical formalization. The study provides a useful guide to the various relevant technology-related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  12. A simplification of the unified gas kinetic scheme

    CERN Document Server

    Chen, Songze; Xu, Kun

    2016-01-01

Unified gas kinetic scheme (UGKS) is an asymptotic preserving scheme for the kinetic equations. It is superior for transition flow simulations and has been validated over the past years. However, compared to the well-known discrete ordinate method (DOM), a classical numerical method for solving the kinetic equations, the UGKS needs more computational resources. In this study, we propose a simplification of the unified gas kinetic scheme. It has almost the same numerical cost as the DOM, but predicts numerical results as accurate as those of the UGKS. Based on the observation that the equilibrium part of the UGKS fluxes can be evaluated analytically, the equilibrium part of the UGKS flux need not be discretized in velocity space. In the simplified scheme, the numerical flux for the velocity distribution function and the numerical flux for the macroscopic conservative quantities are evaluated separately. The simplification is equivalent to a flux hybridization of the gas kinetic scheme for the Navier-S...

  13. Simplification of Training Data for Cross-Project Defect Prediction

    OpenAIRE

    He, Peng; Li, Bing; Zhang, Deguang; Ma, Yutao

    2014-01-01

    Cross-project defect prediction (CPDP) plays an important role in estimating the most likely defect-prone software components, especially for new or inactive projects. To the best of our knowledge, few prior studies provide explicit guidelines on how to select suitable training data of quality from a large number of public software repositories. In this paper, we have proposed a training data simplification method for practical CPDP in consideration of multiple levels of granularity and filte...

  14. Stand management optimization – the role of simplifications

    Directory of Open Access Journals (Sweden)

    Timo Pukkala

    2014-02-01

Background Studies on optimal stand management often make simplifications or restrict the choice of treatments. Examples of simplifications are neglecting the natural regeneration that appears on a plantation site, omitting advance regeneration in simulations, or restricting thinning treatments to low thinning (thinning from below). Methods This study analyzed the impacts of such simplifications on the optimization results for Fennoscandian boreal forests. Management of pine and spruce plantations was optimized by gradually reducing the number of simplifying assumptions. Results Forced low thinning, cleaning the plantation of the natural regeneration of mixed species and ignoring advance regeneration all had a major impact on the optimization results. High thinning (thinning from above) resulted in a higher NPV and longer rotation length than thinning from below. It was profitable to leave a mixed stand in the tending treatment of a young plantation. When advance regeneration was taken into account, it was profitable to increase the number of thinnings and postpone final felling. In the optimal management, both pine and spruce plantations were gradually converted into an uneven-aged mixture of spruce and birch. Conclusions The results suggest that, with current management costs and timber price levels, it may be profitable to switch to continuous cover management on medium growing sites of Fennoscandian boreal forests.

  15. A Graph-Based Min-# and Error-Optimal Trajectory Simplification Algorithm and Its Extension towards Online Services

    Directory of Open Access Journals (Sweden)

    Fan Wu

    2017-01-01

Trajectory simplification has become a research hotspot since it plays a significant role in the data preprocessing, storage, and visualization of many offline and online applications, such as online maps, mobile health applications, and location-based services. Traditional heuristic-based algorithms utilize a greedy strategy to reduce time cost, leading to high approximation error. An Optimal Trajectory Simplification Algorithm based on Graph Model (OPTTS) is proposed to obtain the optimal solution in this paper. Both the min-# and min-ε problems are solved by the construction and regeneration of the breadth-first spanning tree and the shortest path search based on the directed acyclic graph (DAG). Although the proposed OPTTS algorithm can get optimal simplification results, it is difficult to apply in real-time services due to its high time cost. Thus, a new Online Trajectory Simplification Algorithm based on Directed Acyclic Graph (OLTS) is proposed to deal with trajectory streams. The algorithm dynamically constructs the breadth-first spanning tree, followed by real-time minimization of the approximation error and real-time output. Experimental results show that OPTTS reduces the global approximation error by 82% compared to classical heuristic methods, while OLTS reduces the error by 77% and is 32% faster than the traditional online algorithm. Both OPTTS and OLTS show leading superiority and stable performance on different datasets.
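
    The min-# construction underlying such graph-based simplification (classically attributed to Imai and Iri) can be sketched as follows: connect vertex i to vertex j whenever the shortcut segment keeps every intermediate point within ε, then search for the path with fewest vertices. The sketch below omits the spanning-tree regeneration and the min-ε search that distinguish OPTTS.

```python
# Sketch of graph-based min-# polyline simplification (Imai-Iri style):
# edge (i, j) exists iff all points between i and j lie within eps of the
# segment i-j; a BFS shortest path from first to last vertex then minimizes
# the number of retained vertices. Illustrative only.
import math
from collections import deque

def point_segment_dist(p, a, b):
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def min_count_simplify(pts, eps):
    n = len(pts)
    # adjacency: j is reachable from i if the shortcut i->j is within tolerance
    adj = [[j for j in range(i + 1, n)
            if all(point_segment_dist(pts[k], pts[i], pts[j]) <= eps
                   for k in range(i + 1, j))]
           for i in range(n)]
    # BFS finds the path with the fewest edges from vertex 0 to vertex n-1
    prev, queue = {0: None}, deque([0])
    while queue:
        i = queue.popleft()
        if i == n - 1:
            break
        for j in adj[i]:
            if j not in prev:
                prev[j] = i
                queue.append(j)
    path, i = [], n - 1
    while i is not None:
        path.append(i)
        i = prev[i]
    return [pts[i] for i in reversed(path)]

traj = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7)]
print(min_count_simplify(traj, eps=0.2))
```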

  16. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

To study economic phenomena and processes using mathematical modeling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package of programs. Any economic process or phenomenon has a mathematical description of its behavior, from which an economic-mathematical model is drawn up in the following stages: formulation of the problem, analysis of the modeled process, production of the model, and verification, validation and implementation of the model. This article presents an economic model and its modeling using mathematical equations and the MatLab software package, which helps us approximate the effective solution. As input data, the net cost, the direct and total cost and the links between them are considered. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with a graphic representation and an interpretation of the results achieved in terms of our specific problem.

  17. Modeling of the Hydroentanglement Process

    Directory of Open Access Journals (Sweden)

    Ping Xiang

    2006-11-01

The mechanical performance of hydroentangled nonwovens is determined by the degree of fiber entanglement, which depends on parameters of the fibers, fiberweb, forming surface, water jet and the process speed. This paper develops a computational fluid dynamics model of the hydroentanglement process. Extensive comparison with experimental data showed that the degree of fiber entanglement is linearly related to the flow vorticity in the fiberweb, which is induced by the impinging water jets. The fiberweb is modeled as a porous material of uniform porosity, and the actual geometry of the forming wires is accounted for in the model. Simulation results are compared with experimental data for a Perfojet® sleeve and four woven forming surfaces. Additionally, the model is used to predict the effect of fiberweb thickness on the degree of fiber entanglement for different forming surfaces.

  18. Modified Claus process probabilistic model

    Energy Technology Data Exchange (ETDEWEB)

    Larraz Mora, R. [Chemical Engineering Dept., Univ. of La Laguna (Spain)

    2006-03-15

A model is proposed for the simulation of an industrial Claus unit with a straight-through configuration and two catalytic reactors. Process plant design evaluations based on deterministic calculations do not take into account the uncertainties that are associated with the different input variables. A probabilistic simulation method was applied in the Claus model to obtain an impression of how some of these inaccuracies influence plant performance. (orig.)

  19. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

This paper presents a model for an integrated security system which can be implemented in any organization. It is based on security-specific standards and taxonomies such as ISO 7498-2 and the Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality, and we also focus on the specific components.

  20. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    The subject of this thesis is to develop a methodological framework that can systematically guide mathematical model building for better understanding of multi-enzyme processes. In this way, opportunities for process improvements can be identified by analyzing simulations of either existing...... are affected (in a positive or negative way) by the presence of the other enzymes and compounds in the media. In this thesis the concept of multi-enzyme in-pot term is adopted for processes that are carried out by the combination of enzymes in a single reactor and implemented at pilot or industrial scale...

  1. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip;

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent...

  2. Network Model Building (Process Mapping)

    OpenAIRE

    Blau, Gary; Yih, Yuehwern

    2004-01-01

12 slides. Provider notes: see the Project Planning video (Windows Media). Posted at the bottom are Gary Blau's slides. Before watching, please note that "process mapping" and "modeling" are mentioned in the video and notes; here they are meant to refer to the NSCORT "project plan".

  3. Modeling of the reburning process

    Energy Technology Data Exchange (ETDEWEB)

    Rota, R.; Bonini, F.; Servida, A.; Morbidelli, M.; Carra, S. [Politecnico di Milano, Milano (Italy). Dip. di Chimica Fisica Applicata

    1997-07-01

Reburning has become a popular method of abating NOx emission in power plants. Its effectiveness is strongly affected by the interaction between gas phase chemistry and combustion chamber fluid dynamics. Both the mixing of the reactant streams and the elementary reactions in the gas phase control the overall kinetics of the process. This work developed a model coupling a detailed kinetic mechanism to a simplified description of the fluid dynamics of the reburning chamber. The model was checked with reference to experimental data from the literature. Detailed kinetic modeling was found to be essential to describe the reburning process, since the fluid dynamics of the reactor have a strong influence on reactions within. 20 refs., 9 figs., 3 tabs.

  4. Animal models and conserved processes

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-09-01

Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is

  5. Cohesive zone modelling and the fracture process of structural tape

    DEFF Research Database (Denmark)

    Stigh, Ulf; Biel, Anders; Svensson, Daniel

    2016-01-01

    Structural tapes provide comparable toughness as structural adhesives at orders of magnitude lower stresses. This is potentially useful to minimize the effects of differences in thermal expansion in the joining of mixed materials. The strength properties are modelled using the cohesive zone model....... Thus, a cohesive zone represents the tape, i.e. stresses in the tape are transmitted to the substrates through tractions determined by the separations of the surfaces of substrates. This simplification allows for structural analysis of large complex structures. The relation between the traction...

  6. Ecosystem simplification, biodiversity loss and plant virus emergence.

    Science.gov (United States)

    Roossinck, Marilyn J; García-Arenal, Fernando

    2015-02-01

    Plant viruses can emerge into crops from wild plant hosts, or conversely from domestic (crop) plants into wild hosts. Changes in ecosystems, including loss of biodiversity and increases in managed croplands, can impact the emergence of plant virus disease. Although data are limited, in general the loss of biodiversity is thought to contribute to disease emergence. More in-depth studies have been done for human viruses, but studies with plant viruses suggest similar patterns, and indicate that simplification of ecosystems through increased human management may increase the emergence of viral diseases in crops.

  7. Strategy-Enhanced Interactive Proving and Arithmetic Simplification for PVS

    Science.gov (United States)

    diVito, Ben L.

    2003-01-01

    We describe an approach to strategy-based proving for improved interactive deduction in specialized domains. An experimental package of strategies (tactics) and support functions called Manip has been developed for PVS to reduce the tedium of arithmetic manipulation. Included are strategies aimed at algebraic simplification of real-valued expressions. A general deduction architecture is described in which domain-specific strategies, such as those for algebraic manipulation, are supported by more generic features, such as term-access techniques applicable in arbitrary settings. An extended expression language provides access to subterms within a sequent.

  8. Advanced oxidation processes: overall models

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, M. [Univ. de los Andes, Escuela Basica de Ingenieria, La Hechicera, Merida (Venezuela); Curco, D.; Addardak, A.; Gimenez, J.; Esplugas, S. [Dept. de Ingenieria Quimica. Univ. de Barcelona, Barcelona (Spain)

    2003-07-01

Modelling AOPs implies considering all the steps included in the process, that is, the mass transfer, kinetic (reaction) and light (luminic) steps. In this way, recent works develop models which relate the global reaction rate to catalyst concentration and radiation absorption. However, the application of such models requires knowing which step controls the overall process. In this paper, a simple method is explained which allows one to determine the controlling step. Thus, it is assumed that the reactor is divided in two hypothetical zones (dark and illuminated), and according to the experimental results, obtained by varying only the reaction volume, it can be decided whether the reaction occurs only in the illuminated zone or in the whole reactor, including the dark zone. The photocatalytic degradation of phenol, using Degussa P-25 titania as catalyst, is studied as the model reaction. The preliminary results obtained are presented here, showing that, in this case, the reaction seems to occur only in the illuminated zone of the photoreactor. A model is developed to explain this behaviour. (orig.)

  9. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes amorphous experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  10. System Model of Heat and Mass Transfer Process for Mobile Solvent Vapor Phase Drying Equipment

    Directory of Open Access Journals (Sweden)

    Shiwei Zhang

    2014-01-01

The solvent vapor phase drying process is one of the most important processes in the production and maintenance of large oil-immersed power transformers. In this paper, the working principle, system composition, and technological process of mobile solvent vapor phase drying (MVPD) equipment for transformers are introduced in detail. On the basis of necessary simplifications and assumptions for the MVPD equipment and process, a heat and mass transfer mathematical model comprising 40 equations is established. It fully represents the thermodynamic laws of phase change and the transport of solvent, water, and air in the MVPD technological process, and describes in detail the quantitative relationships among important physical quantities such as temperature, pressure, and flux in the key equipment units and process stages. Taking a practical field drying process of a 500 kV/750 MVA power transformer as an example, the simulation of a complete technological process is carried out by programming in MATLAB, and relation curves of key process parameters changing with time are obtained, such as body temperature, tank pressure, and water yield. The trend of the theoretical simulation results is very consistent with the actual production record data, which verifies the correctness of the established mathematical model.
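
    A lumped-parameter flavour of such a heat and mass transfer model can be sketched as a small system of coupled ODEs integrated with SciPy. All states, coefficients and thresholds below are invented for illustration; they stand in for, and drastically compress, the 40-equation model of the paper.

```python
# Toy lumped heat/mass balance in the spirit of the MVPD model (the paper's
# actual model has 40 equations); all values here are invented placeholders.
from scipy.integrate import solve_ivp

def rhs(t, y):
    q_vapor = 120e3        # heating power delivered by solvent vapor [W]
    h_loss = 2000.0        # heat loss coefficient to surroundings [W/K]
    c_body = 2.0e6         # lumped heat capacity of the transformer body [J/K]
    k_dry = 1e-7           # empirical drying-rate constant [1/(s*K)]
    T_body, m_water = y    # body temperature [K], residual water mass [kg]
    dT = (q_vapor - h_loss * (T_body - 300.0)) / c_body
    dm = -k_dry * m_water * max(T_body - 330.0, 0.0)  # drying starts above ~330 K
    return [dT, dm]

sol = solve_ivp(rhs, (0.0, 48 * 3600.0), y0=[300.0, 50.0], max_step=600.0)
print(f"body temperature after 48 h: {sol.y[0, -1]:.1f} K, "
      f"water remaining: {sol.y[1, -1]:.1f} kg")
```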

  11. Face Processing: Models For Recognition

    Science.gov (United States)

    Turk, Matthew A.; Pentland, Alexander P.

    1990-03-01

    The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expression to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers which model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interface, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.

  12. Steady-State Process Modelling

    DEFF Research Database (Denmark)

    2011-01-01

This chapter covers the basic principles of steady state modelling and simulation using a number of case studies. Two principal approaches are illustrated that develop the unit operation models from first principles as well as through application of standard flowsheet simulators. The approaches ... illustrate the “equation oriented” approach as well as the “sequential modular” approach to solving complex flowsheets for steady state applications. The applications include the Williams-Otto plant, the hydrodealkylation (HDA) of toluene, conversion of ethylene to ethanol and a bio-ethanol process.

  13. Switching Processes in Queueing Models

    CERN Document Server

    Anisimov, Vladimir V

    2008-01-01

    Switching processes, invented by the author in 1977, is the main tool used in the investigation of traffic problems from automotive to telecommunications. The title provides a new approach to low traffic problems based on the analysis of flows of rare events and queuing models. In the case of fast switching, averaging principle and diffusion approximation results are proved and applied to the investigation of transient phenomena for wide classes of overloading queuing networks.  The book is devoted to developing the asymptotic theory for the class of switching queuing models which covers  mode

  14. Combustion Process Modelling and Control

    Directory of Open Access Journals (Sweden)

    Vladimír Maduda

    2007-10-01

This paper deals with the realization of a combustion control system on programmable logic controllers. The control system design is based on an analysis of the current state of combustion control systems in technological devices of the raw material processing area. The design is composed of two subsystems. The first subsystem is a software system for processing measured data and data from simulation of the combustion mathematical model; its outputs are parameters for setting the controller algorithms. The second subsystem consists of programme modules. Each programme module implements a specific control algorithm, for example proportional regulation, programmed proportional regulation, proportional regulation with correction for the oxygen in the waste gas, and so on. According to the specific combustion control requirements, a concrete control system can be built up from the programme modules. The programme modules were programmed in Automation Studio, which is used for developing, debugging and testing software for B&R controllers.

  15. Dynamics Modeling and Simplification of a 6-UPS Parallel Multi-dimensional Loading Device

    Institute of Scientific and Technical Information of China (English)

    刘少欣; 王丹; 陈五一

    2014-01-01

By using the Kane equation, the dynamic characteristics of the multi-dimensional loading device were analyzed and a dynamic mathematical model was established. For the different motion parameters of the loading device, the effects of the gravity, inertial force and Coriolis force terms in the dynamic model on the generalized force output of the system were analyzed by simulation under conditions of different accelerations and velocities. The results show that when the velocity or acceleration of the moving platform is within the working limits, the effect of the branch inertia forces and the Coriolis forces on the system is smaller than 2% and can be omitted, while that of gravity is greater than 2% and cannot. A simplified system dynamic model is obtained according to these results, which provides the theoretical basis for the control system of the parallel device.

  16. Modeling the snow cover in climate studies: 2. The sensitivity to internal snow parameters and interface processes

    Science.gov (United States)

    Loth, Bettina; Graf, Hans-F.

    1998-05-01

In order to find an optimal complexity for snow-cover models in climate studies, the influence of single snow processes on both the snow mass balance and the energy fluxes between snow surface and atmosphere has been investigated. Using a sophisticated model, experiments were performed under several different atmospheric and regional conditions (Arctic, midlatitudes, alpine regions). A high simulation quality can be achieved with a multilayered snow-cover model resolving the internal snow processes (cf. part 1 [Loth and Graf, this issue]). Otherwise, large errors can occur, mostly in zones which are of paramount importance for the entire climate dynamics. Owing to simplifications of such a model, the mean energy balance of the snow cover, the turbulent heat fluxes, and the long-wave radiation at the snow surface may alter by between 1 W/m2 and 8 W/m2. The snow-surface temperatures can be systematically changed by about 10 K.

  17. Studies on noise prediction model and its simplification for converter transformers

    Institute of Scientific and Technical Information of China (English)

    阮学云; 李志远; 魏浩征; 黄莹

    2011-01-01

As a main noise source of a high-voltage direct current (HVDC) converter station, the converter transformer's noise prediction precision and the choice of noise control methods directly influence the overall noise prediction level and the treatment effect for the converter station. Through systematic studies of the noise generation mechanism, the noise frequency spectrum and common treatment schemes, the paper highlights the BOX-IN noise control technology for converter transformers, and simplifies and validates the noise prediction model for the BOX-IN equipment. Field tests of the noise reduction indicate that the BOX-IN equipment achieves a reduction of about 20 dB(A), close to the theoretically calculated value, which provides a theoretical basis for further improving the noise prediction precision of HVDC converter stations.

  18. The place of modeling in cognitive science.

    Science.gov (United States)

    McClelland, James L

    2009-01-01

I consider the role of cognitive modeling in cognitive science. Modeling, and the computers that enable it, are central to the field, but the role of modeling is often misunderstood. Models are not intended to capture fully the processes they attempt to elucidate. Rather, they are explorations of ideas about the nature of cognitive processes. In these explorations, simplification is essential: through simplification, the implications of the central ideas become more transparent. This is not to say that simplification has no downsides; it does, and these are discussed. I then consider several contemporary frameworks for cognitive modeling, stressing the idea that each framework is useful in its own particular ways. Increases in computer power (by a factor of about 4 million) since 1958 have enabled new modeling paradigms to emerge, but these also depend on new ways of thinking. Will new paradigms emerge again with the next 1,000-fold increase?

  19. A New Algorithm for Cartographic Simplification of Streams and Lakes Using Deviation Angles and Error Bands

    Directory of Open Access Journals (Sweden)

    Türkay Gökgöz

    2015-10-01

Multi-representation databases (MRDBs) are used in several geographical information system applications for different purposes. MRDBs are mainly obtained through model and cartographic generalizations. Simplification is the essential operator of cartographic generalization, and streams and lakes are essential features in hydrography. In this study, a new algorithm was developed for the simplification of streams and lakes. In this algorithm, deviation angles and error bands are used to determine the characteristic vertices and the planimetric accuracy of the features, respectively. The algorithm was tested using a high-resolution national hydrography dataset of Pomme de Terre, a sub-basin in the USA. To assess the performance of the new algorithm, the Bend Simplify and Douglas-Peucker algorithms, the medium-resolution hydrography dataset of the sub-basin, and Töpfer's radical law were used. For the quantitative analysis, the vertex numbers, the lengths, and the sinuosity values were computed. Consequently, it was shown that the new algorithm was able to meet the main requirements (i.e., accuracy, legibility and aesthetics, and storage).
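
    The deviation-angle part of the idea can be illustrated as follows: a vertex is characteristic when the polyline turns through more than a threshold angle at it. This sketch omits the error-band constraint that the published algorithm uses to control planimetric accuracy.

```python
# Sketch: keep vertices whose deviation (turning) angle exceeds a threshold.
# Illustration only; the published algorithm additionally applies error bands.
import math

def deviation_angle(a, b, c):
    """Angle (radians) by which the path a->b->c deviates from a straight line."""
    ang1 = math.atan2(b[1] - a[1], b[0] - a[0])
    ang2 = math.atan2(c[1] - b[1], c[0] - b[0])
    d = abs(ang2 - ang1)
    return min(d, 2 * math.pi - d)

def simplify_by_angle(pts, threshold_deg=15.0):
    thr = math.radians(threshold_deg)
    kept = [pts[0]]
    for i in range(1, len(pts) - 1):
        if deviation_angle(pts[i - 1], pts[i], pts[i + 1]) >= thr:
            kept.append(pts[i])
    kept.append(pts[-1])
    return kept

stream = [(0, 0), (1, 0.05), (2, 0.1), (3, 1.5), (4, 1.6), (5, 3.0)]
print(simplify_by_angle(stream))
```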

  20. Efficient Simplification Methods for Generating High Quality LODs of 3D Meshes

    Institute of Scientific and Technical Information of China (English)

    Muhammad Hussain

    2009-01-01

    Two simplification algorithms are proposed for automatic decimation of polygonal models, and for generating their LODs. Each algorithm orders vertices according to their priority values and then removes them iteratively. For setting the priority value of each vertex, exploiting normal field of its one-ring neighborhood, we introduce a new measure of geometric fidelity that reflects well the local geometric features of the vertex. After a vertex is selected, using other measures of geometric distortion that are based on normal field deviation and distance measure, it is decided which of the edges incident on the vertex is to be collapsed for removing it. The collapsed edge is substituted with a new vertex whose position is found by minimizing the local quadric error measure. A comparison with the state-of-the-art algorithms reveals that the proposed algorithms are simple to implement, are computationally more efficient, generate LODs with better quality, and preserve salient features even after drastic simplification. The methods are useful for applications such as 3D computer games, virtual reality, where focus is on fast running time, reduced memory overhead, and high quality LODs.
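
    The placement step mentioned above, minimizing a local quadric error, is commonly implemented as in Garland and Heckbert's scheme: accumulate Q = Σ p pᵀ over the surrounding face planes and solve a small linear system for the optimal position. A sketch under that assumption:

```python
# Sketch of quadric error minimization for positioning the vertex that replaces
# a collapsed edge (after Garland-Heckbert; the paper's exact measure may differ).
# Planes are (a, b, c, d) with a*x + b*y + c*z + d = 0 and unit normals.
import numpy as np

def quadric(planes):
    Q = np.zeros((4, 4))
    for p in planes:
        p = np.asarray(p, dtype=float).reshape(4, 1)
        Q += p @ p.T                  # accumulate p * p^T
    return Q

def optimal_position(Q):
    """Solve for v = (x, y, z, 1) minimizing v^T Q v."""
    A = Q.copy()
    A[3, :] = [0.0, 0.0, 0.0, 1.0]    # constrain the homogeneous coordinate
    try:
        v = np.linalg.solve(A, np.array([0.0, 0.0, 0.0, 1.0]))
    except np.linalg.LinAlgError:
        return None                   # degenerate quadric: fall back to an endpoint
    return v[:3]

# Planes of the faces surrounding a hypothetical edge
planes = [(0, 0, 1, 0),   # z = 0
          (0, 1, 0, -1),  # y = 1
          (1, 0, 0, -2)]  # x = 2
print(optimal_position(quadric(planes)))   # -> [2. 1. 0.]
```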

  1. Influence of vocal tract geometry simplifications on the numerical simulation of vowel sounds.

    Science.gov (United States)

    Arnela, Marc; Dabbaghchian, Saeed; Blandin, Rémi; Guasch, Oriol; Engwall, Olov; Van Hirtum, Annemie; Pelorson, Xavier

    2016-09-01

    For many years, the vocal tract shape has been approximated by one-dimensional (1D) area functions to study the production of voice. More recently, 3D approaches allow one to deal with the complex 3D vocal tract, although area-based 3D geometries of circular cross-section are still in use. However, little is known about the influence of performing such a simplification, and some alternatives may exist between these two extreme options. To this aim, several vocal tract geometry simplifications for vowels [ɑ], [i], and [u] are investigated in this work. Six cases are considered, consisting of realistic, elliptical, and circular cross-sections interpolated through a bent or straight midline. For frequencies below 4-5 kHz, the influence of bending and cross-sectional shape has been found weak, while above these values simplified bent vocal tracts with realistic cross-sections are necessary to correctly emulate higher-order mode propagation. To perform this study, the finite element method (FEM) has been used. FEM results have also been compared to a 3D multimodal method and to a classical 1D frequency domain model.

  2. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes with a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axon, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and biological question that requires the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary differential equations and partial differential equations; background in biology is not required. Students will gain knowledge on how to program with MATLAB without previous programming experience and how to use codes in order to test biological hypothesis.

  3. Principles of polymer processing modelling

    Directory of Open Access Journals (Sweden)

    Agassant Jean-François

    2016-01-01

Polymer processing involves three thermo-mechanical stages: plastication of solid polymer granules or powder to a homogeneous fluid, which is shaped under pressure in moulds or dies, and finally cooled and eventually drawn to obtain the final plastic part. The physical properties of polymers (high viscosity, non-linear rheology, low thermal diffusivity) as well as the complex shape of most plastic parts make modelling a challenge. Several examples (film blowing, extrusion dies, injection moulding, blow moulding) are presented and discussed.

  4. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps, are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin-walled ductile cast iron, integrated modelling of spray forming, and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  5. A Review of Process Modeling Language Paradigms

    Institute of Scientific and Technical Information of China (English)

    MA Qin-hai; GUAN Zhi-min; LI Ying; ZHAO Xi-nan

    2002-01-01

Process representation or modeling plays an important role in business process engineering. Process modeling languages can be evaluated by the extent to which they provide constructs useful for representing and reasoning about the aspects of a process, and are subsequently chosen for a certain purpose. This paper reviews process modeling language paradigms and points out their advantages and disadvantages.

  6. Ecosystem models are by definition simplifications of the real ...

    African Journals Online (AJOL)

    spamer

    have few generations, but they appear at opportunistic times in the plankton. In contrast .... Barents Sea data show that biomass starts to increase at the latest around mid ..... (1998) attributed delayed diatom blooms in the St Lawrence Estuary.

  7. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as

  8. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as represent

  9. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration...

  10. The Role of Simplification and Information in College Decisions: Results from the H&R Block FAFSA Experiment. NBER Working Paper No. 15361

    Science.gov (United States)

    Bettinger, Eric P.; Long, Bridget Terry; Oreopoulos, Philip; Sanbonmatsu, Lisa

    2009-01-01

Growing concerns about low awareness and take-up rates for government support programs like college financial aid have spurred calls to simplify the application process and enhance visibility. This project examines the effects of two experimental treatments designed to test the importance of simplification and information, using a random…

  11. Modeling grinding processes as micro processes ...

    African Journals Online (AJOL)

    eobe

    workpiece material dynamics thus allowing for process planning, optimization, and control. In spite of the .... arrangement of the grain vertices at the wheel active surface. ...... on Workpiece Roughness and Process Vibration” J. of the Braz.

  12. The Research of Simplification Of 1.9 TDI Diesel Engine Heat Release Parameters Determination

    Directory of Open Access Journals (Sweden)

    Justas Žaglinskis

    2014-12-01

The investigation of a modified methodology for determining the heat release parameters of an Audi 1.9 TDI 1Z diesel engine is presented in the article. In this research, the AVL BOOST BURN and IMPULS software was used to process data and to simulate the engine work process. The reverse task, determining the indicated pressure from heat release data, was solved here. The methodology of T. Bulaty and W. Glanzman was modified to simplify the determination of the heat release parameters: the maximal cylinder pressure, which requires additional expensive equipment to measure, was replaced with an objective indicator, the exhaust gas temperature. This modification simplified the experimental engine tests and gave simulation results within an error range of up to 2% for the main engine operating parameters. The study results are an important step towards simplifying engine tests under field conditions.
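
    Heat release in such engine models is commonly parameterized with a Wiebe function. The sketch below shows that standard form with illustrative constants; it is not necessarily the exact parameterization of Bulaty and Glanzman that the study modifies.

```python
# Standard Wiebe-function sketch for cumulative heat release (illustrative
# constants; not necessarily the parameterization used in the study).
import math

def wiebe_burn_fraction(theta, theta0, dtheta, a=6.9, m=2.0):
    """Cumulative mass fraction burned at crank angle theta (degrees)."""
    if theta < theta0:
        return 0.0
    x = (theta - theta0) / dtheta
    return 1.0 - math.exp(-a * x ** (m + 1))

# Combustion starting at -5 deg ATDC over 60 deg of crank angle
for theta in range(-10, 61, 10):
    print(f"{theta:4d} deg: x_b = {wiebe_burn_fraction(theta, -5.0, 60.0):.3f}")
```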

  13. THE METHOD OF GRAPHIC SIMPLIFICATION OF AREA FEATURE BOUNDARY WITH RIGHT ANGLES

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

Some rules for the simplification of area feature boundaries and methods for acquiring spatial knowledge, such as maintaining the area and shape of an area feature, are discussed. This paper focuses on a progressive method for the graphic simplification of area feature boundaries with right angles, based on their characteristics.

  14. Managing Analysis Models in the Design Process

    Science.gov (United States)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  15. Cupola Furnace Computer Process Model

    Energy Technology Data Exchange (ETDEWEB)

    Seymour Katz

    2004-12-31

The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing furnaces (electric) and the ability to melt less expensive metallic scrap than the competing furnaces. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near-optimum operating conditions with just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an "Expert System" to permit optimization in real time. The program has been combined with "neural network" programs to effect very easy scanning of a wide range of furnace operation. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems will be found in the "Cupola Handbook", Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  16. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  17. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  18. A Process Model for Establishing Business Process Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Nguyen Hoang Thuan

    2017-06-01

    Full Text Available Crowdsourcing can be an organisational strategy to distribute work to Internet users and harness innovation, information, capacities, and variety of business endeavours. As crowdsourcing is different from other business strategies, organisations are often unsure as to how to best structure different crowdsourcing activities and integrate them with other organisational business processes. To manage this problem, we design a process model guiding how to establish business process crowdsourcing. The model consists of seven components covering the main activities of crowdsourcing processes, which are drawn from a knowledge base incorporating diverse knowledge sources in the domain. The built model is evaluated using case studies, suggesting the adequateness and utility of the model.

  19. Simplification of the CBS-QB3 method for predicting gas-phase deprotonation free energies

    Science.gov (United States)

    Casasnovas, Rodrigo; Frau, Juan; Ortega-Castro, Joaquín; Salvà, Antoni; Donoso, Josefa; Muñoz, Francisco

Simplified versions of the CBS-QB3 model chemistry were used to calculate the free energies of 36 deprotonation reactions in the gas phase. The best such version, S9, excluded the coupled cluster calculation [CCSD(T)] and the empirical (ΔEemp) and spin-orbit (ΔEint) correction terms. The mean absolute deviation and root mean square error thus obtained (viz. 1.24 and 1.56 kcal/mol, respectively) were very close to those provided by the original CBS-QB3 method (1.19 and 1.52 kcal/mol, respectively). The high accuracy of the proposed simplification and its computational expeditiousness make it an excellent choice for energy calculations on gas-phase deprotonation reactions in complex systems.
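
    The quantities compared are gas-phase deprotonation free energies, ΔG = G(A⁻) + G(H⁺) − G(HA), with MAD and RMS summarizing the errors against reference data. A bookkeeping sketch follows; all numerical values in it are invented placeholders, and the proton free energy is the commonly used ideal-gas value.

```python
# Bookkeeping sketch for gas-phase deprotonation free energies; every number
# below is an invented placeholder, not a result from the study.
import math

HARTREE_TO_KCAL = 627.5095
G_PROTON = -6.28  # kcal/mol; commonly used gas-phase free energy of H+ at 298 K

def deprotonation_dg(g_acid_hartree, g_anion_hartree):
    """deltaG = G(A-) + G(H+) - G(HA), converted to kcal/mol."""
    return (g_anion_hartree - g_acid_hartree) * HARTREE_TO_KCAL + G_PROTON

# Hypothetical free energies (hartree) for three acid/anion pairs
pairs = [(-76.445, -75.905), (-115.012, -114.455), (-229.118, -228.540)]
computed = [deprotonation_dg(g_ha, g_a) for g_ha, g_a in pairs]
reference = [332.0, 343.2, 356.5]  # hypothetical benchmarks, kcal/mol

errors = [c - r for c, r in zip(computed, reference)]
mad = sum(map(abs, errors)) / len(errors)
rms = math.sqrt(sum(e * e for e in errors) / len(errors))
print(f"MAD = {mad:.2f} kcal/mol, RMS = {rms:.2f} kcal/mol")
```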

  20. CONVERGENCE TO PROCESS ORGANIZATION BY MODEL OF PROCESS MATURITY

    Directory of Open Access Journals (Sweden)

    Blaženka Piuković Babičković

    2015-06-01

Modern business primarily binds process orientation with process thinking and a process organizational structure. Although business processes are increasingly written and spoken about, a major problem in the business world, especially in countries in transition, is a lack of understanding of the concept of business process management. The aim of this paper is to make a specific contribution to overcoming the identified problem by pointing out the significance of the concept of business process management, and by presenting a model for reviewing process maturity together with the tools recommended for use in process management.

  1. Event-driven process execution model for process virtual machine

    Institute of Scientific and Technical Information of China (English)

    WU Dong-yao; WEI Jun; GAO Chu-shu; DOU Wen-shen

    2012-01-01

Current orchestration and choreography process engines only serve dedicated process languages. To solve this problem, an Event-driven Process Execution Model (EPEM) was developed. Formalization and mapping principles of the model were presented to guarantee the correctness and efficiency of process transformation. As a case study, the EPEM descriptions of the Web Services Business Process Execution Language (WS-BPEL) were represented, and a Process Virtual Machine (PVM), OncePVM, was implemented in compliance with the EPEM.

  2. Analog modelling of obduction processes

    Science.gov (United States)

    Agard, P.; Zuo, X.; Funiciello, F.; Bellahsen, N.; Faccenna, C.; Savva, D.

    2012-04-01

    Obduction corresponds to one of plate tectonics oddities, whereby dense, oceanic rocks (ophiolites) are presumably 'thrust' on top of light, continental ones, as for the short-lived, almost synchronous Peri-Arabic obduction (which took place along thousands of km from Turkey to Oman in c. 5-10 Ma). Analog modelling experiments were performed to study the mechanisms of obduction initiation and test various triggering hypotheses (i.e., plate acceleration, slab hitting the 660 km discontinuity, ridge subduction; Agard et al., 2007). The experimental setup comprises (1) an upper mantle, modelled as a low-viscosity transparent Newtonian glucose syrup filling a rigid Plexiglas tank and (2) high-viscosity silicone plates (Rhodrosil Gomme with PDMS iron fillers to reproduce densities of continental or oceanic plates), located at the centre of the tank above the syrup to simulate the subducting and the overriding plates - and avoid friction on the sides of the tank. Convergence is simulated by pushing on a piston at one end of the model with velocities comparable to those of plate tectonics (i.e., in the range 1-10 cm/yr). The reference set-up includes, from one end to the other (~60 cm): (i) the piston, (ii) a continental margin containing a transition zone to the adjacent oceanic plate, (iii) a weakness zone with variable resistance and dip (W), (iv) an oceanic plate - with or without a spreading ridge, (v) a subduction zone (S) dipping away from the piston and (vi) an upper, active continental margin, below which the oceanic plate is being subducted at the start of the experiment (as is known to have been the case in Oman). Several configurations were tested and over thirty different parametric tests were performed. Special emphasis was placed on comparing different types of weakness zone (W) and the extent of mechanical coupling across them, particularly when plates were accelerated. Displacements, together with along-strike and across-strike internal deformation in all

  3. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

The article outlines the application of the process approach to the functional description of a designed IT system supporting the operations of a secret office which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented manually (“as is”) and the target processes (“to be”) that use RFID technology for the purpose of their automation. Additionally, examples of applying methods of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency are presented. An extension of the process analysis method is the possibility of applying a warehouse of processes and process mining methods.

  4. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  5. Effects on Text Simplification: Evaluation of Splitting Up Noun Phrases.

    Science.gov (United States)

    Leroy, Gondy; Kauchak, David; Hogue, Alan

    2016-01-01

    To help increase health literacy, we are developing a text simplification tool that creates more accessible patient education materials. Tool development is guided by a data-driven feature analysis comparing simple and difficult text. In the present study, we focus on the common advice to split long noun phrases. Our previous corpus analysis showed that easier texts contained shorter noun phrases. Subsequently, we conducted a user study to measure the difficulty of sentences containing noun phrases of different lengths (2-gram, 3-gram, and 4-gram); noun phrases of different conditions (split or not); and, to simulate unknown terms, pseudowords (present or not). We gathered 35 evaluations for 30 sentences in each condition (3 × 2 × 2 conditions) on Amazon's Mechanical Turk (N = 12,600). We conducted a 3-way analysis of variance for perceived and actual difficulty. Splitting noun phrases had a positive effect on perceived difficulty but a negative effect on actual difficulty. The presence of pseudowords increased perceived and actual difficulty. Without pseudowords, longer noun phrases led to increased perceived and actual difficulty. A follow-up study using the phrases (N = 1,350) showed that measuring awkwardness may indicate when to split noun phrases. We conclude that splitting noun phrases benefits perceived difficulty but hurts actual difficulty when the phrasing becomes less natural.
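
    The design is a 3 × 2 × 2 factorial with repeated ratings per condition. An analysis of that shape can be run with statsmodels, as sketched below on synthetic data; the column names and effect sizes are invented.

```python
# Sketch of a 3-way ANOVA (noun-phrase length x split x pseudowords) with
# statsmodels on synthetic ratings; all columns and effects are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
rows = []
for length in ("2gram", "3gram", "4gram"):
    for split in ("split", "unsplit"):
        for pseudo in ("yes", "no"):
            for _ in range(35):
                # pseudowords shift perceived difficulty upward in this toy data
                difficulty = rng.normal(loc=3.0 + (pseudo == "yes"), scale=1.0)
                rows.append((length, split, pseudo, difficulty))
df = pd.DataFrame(rows, columns=["length", "split", "pseudo", "difficulty"])

model = smf.ols("difficulty ~ C(length) * C(split) * C(pseudo)", data=df).fit()
print(anova_lm(model, typ=2))
```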

  6. A Simplification of a Real-Time Verification Problem

    CERN Document Server

    Saha, Indranil; Roy, Suman; 10.1007/978-3-540-75596-8_21

    2010-01-01

We revisit the problem of real-time verification with dense dynamics using timeout and calendar based models, and simplify it to a finite state verification problem. To overcome the complexity of verification of real-time systems with dense dynamics, Dutertre and Sorea proposed timeout and calendar based transition systems to model the behavior of real-time systems, and verified safety properties using k-induction in association with bounded model checking. In this work, we introduce a specification formalism for these models in terms of Timeout Transition Diagrams and capture their behavior in terms of the semantics of Timed Transition Systems. Further, we discuss a technique which reduces the problem of verification of qualitative temporal properties on the infinite state space of (a large fragment of) these timeout and calendar based transition systems into that on clockless finite state models, through a two-step process comprising digitization and canonical finitary reduction. This technique enables us to ve...

  8. Four Common Simplifications of Multi-Criteria Decision Analysis do not hold for River Rehabilitation.

    Science.gov (United States)

    Langhans, Simone D; Lienert, Judit

    2016-01-01

    River rehabilitation aims at alleviating negative effects of human impacts such as loss of biodiversity and reduction of ecosystem services. Such interventions entail difficult trade-offs between different ecological and often socio-economic objectives. Multi-Criteria Decision Analysis (MCDA) is a very suitable approach that helps assess the current ecological state and prioritize river rehabilitation measures in a standardized way, based on stakeholder or expert preferences. Applications of MCDA in river rehabilitation projects are often simplified, i.e. using a limited number of objectives and indicators, assuming linear value functions, aggregating individual indicator assessments additively, and/or assuming risk neutrality of experts. Here, we demonstrate an implementation of MCDA expert preference assessments to river rehabilitation and provide ample material for other applications. To test whether the above simplifications reflect common expert opinion, we carried out very detailed interviews with five river ecologists and a hydraulic engineer. We defined essential objectives and measurable quality indicators (attributes), elicited the experts' preferences for objectives on a standardized scale (value functions) and their risk attitude, and identified suitable aggregation methods. The experts recommended an extensive objectives hierarchy including between 54 and 93 essential objectives and between 37 and 61 essential attributes. For 81% of these, they defined non-linear value functions, and in 76% they recommended multiplicative aggregation. The experts were risk averse or risk prone (but never risk neutral), depending on the current ecological state of the river and the experts' personal importance of objectives. We conclude that the four commonly applied simplifications clearly do not reflect the opinion of river rehabilitation experts. The optimal level of model complexity, however, remains highly case-study specific depending on data and resource
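
    The gap between the common simplifications and the experts' recommendations is easy to state in code. The sketch below contrasts additive aggregation with linear value functions against multiplicative aggregation (here a weighted geometric form) with exponential, non-linear value functions; all weights, ranges and curvatures are invented for illustration and are not taken from the study.

```python
import numpy as np

def linear_value(x, lo, hi):
    """Linear value function: the common simplification."""
    return (x - lo) / (hi - lo)

def exponential_value(x, lo, hi, c=2.0):
    """Non-linear (exponential) value function; c sets the curvature."""
    z = (x - lo) / (hi - lo)
    return (1 - np.exp(-c * z)) / (1 - np.exp(-c))

x  = np.array([4.0, 0.3, 55.0])        # hypothetical attribute levels
lo = np.array([0.0, 0.0, 0.0])
hi = np.array([10.0, 1.0, 100.0])
w  = np.array([0.5, 0.3, 0.2])         # weights summing to one

additive       = np.sum(w * linear_value(x, lo, hi))
multiplicative = np.prod(exponential_value(x, lo, hi) ** w)
print("additive: %.3f  multiplicative: %.3f" % (additive, multiplicative))
```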

  9. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Heng [Pacific Northwest National Laboratory, Richland Washington USA; Ye, Ming [Department of Scientific Computing, Florida State University, Tallahassee Florida USA; Walker, Anthony P. [Environmental Sciences Division and Climate Change Science Institute, Oak Ridge National Laboratory, Oak Ridge Tennessee USA; Chen, Xingyuan [Pacific Northwest National Laboratory, Richland Washington USA

    2017-04-01

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
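
    The idea of folding model choice into a variance-based index can be illustrated on a toy two-process system: each process has two candidate models with prior weights and its own random parameter, and the index is the share of output variance explained by one process (its model choice and parameters together). Everything below is invented for illustration; it follows the variance-decomposition idea, not the paper's exact estimator.

```python
import numpy as np
rng = np.random.default_rng(0)

# Two candidate models per process, each with its own random parameter.
def recharge(m, p):                    # process 1: precipitation -> recharge
    return 0.3 * p if m == 0 else 0.2 * np.sqrt(p)

def conductivity(m, p):                # process 2: geology parameterization
    return 1.0 + p if m == 0 else np.exp(0.5 * p)

def output(r, k):                      # toy model output, e.g. hydraulic head
    return r / k

def draw(probs):
    m = rng.choice(len(probs), p=probs)
    return m, rng.uniform(0.5, 1.5)    # model index and its parameter

N, M = 2000, 200
cond_mean = np.empty(N)
all_out = []
for i in range(N):
    r = recharge(*draw([0.6, 0.4]))    # fix the recharge process ...
    inner = np.array([output(r, conductivity(*draw([0.5, 0.5])))
                      for _ in range(M)])   # ... average over the rest
    cond_mean[i] = inner.mean()
    all_out.append(inner)

PS_recharge = np.var(cond_mean) / np.var(np.concatenate(all_out))
print("process sensitivity of recharge: %.2f" % PS_recharge)
```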

  10. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    Probabilistic properties of Cox processes of relevance for statistical modelling and inference are studied. Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment propertie...
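
    Of the classes named here, the log Gaussian Cox process is the simplest to sketch: the log-intensity is a Gaussian process and, conditional on it, points follow an inhomogeneous Poisson process. The thinning-based simulation on [0, 1] below is a generic illustration with an assumed squared-exponential covariance, not material from the record.

```python
import numpy as np
rng = np.random.default_rng(1)

# Discretise [0, 1] and draw a Gaussian process for the log-intensity.
grid = np.linspace(0.0, 1.0, 200)
cov = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / 0.1) ** 2)
log_lam = rng.multivariate_normal(np.full(200, 3.0), cov + 1e-8 * np.eye(200))
lam = np.exp(log_lam)

# Given the intensity, simulate the Poisson points by thinning.
lam_max = lam.max()
n = rng.poisson(lam_max)                 # candidates from a homogeneous process
cand = rng.uniform(0.0, 1.0, n)
keep = rng.uniform(0.0, 1.0, n) < np.interp(cand, grid, lam) / lam_max
points = cand[keep]
print(len(points), "points")
```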

  11. Modeling process flow using diagrams

    NARCIS (Netherlands)

    Kemper, B.; de Mast, J.; Mandjes, M.

    2010-01-01

    In the practice of process improvement, tools such as the flowchart, the value-stream map (VSM), and a variety of ad hoc variants of such diagrams are commonly used. The purpose of this paper is to present a clear, precise, and consistent framework for the use of such flow diagrams in process

  13. Context Based Reasoning in Business Process Models

    OpenAIRE

    Balabko, Pavel; Wegmann, Alain

    2003-01-01

    Modeling approaches often are not adapted to human reasoning: models are ambiguous and imprecise. The same model element may have multiple meanings in different functional roles of a system. Existing modeling approaches do not explicitly relate these functional roles to model elements. A principle that can solve this problem is that model elements should be defined in a context. We believe that the explicit modeling of context is especially useful in Business Process Modeling (BPM) where the ...

  14. Modelling of Batch Process Operations

    DEFF Research Database (Denmark)

    2011-01-01

    Here a batch cooling crystalliser is modelled and simulated as is a batch distillation system. In the batch crystalliser four operational modes of the crystalliser are considered, namely: initial cooling, nucleation, crystal growth and product removal. A model generation procedure is shown that s...

  15. Birth/death process model

    Science.gov (United States)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
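
    A linear birth/death chain of this kind can be simulated exactly with the Gillespie algorithm; the sketch below uses placeholder per-capita rates, since the record does not state the population characteristics.

```python
import numpy as np

rng = np.random.default_rng(2)
b, d = 0.12, 0.10                  # per-capita birth/death rates (placeholders)
t, n, t_end = 0.0, 50, 100.0

# Gillespie: exponential waiting time to the next event, then birth or death.
while t < t_end and n > 0:
    t += rng.exponential(1.0 / ((b + d) * n))
    n += 1 if rng.uniform() < b / (b + d) else -1
print("population at t = %.1f: %d" % (t, n))
```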

  16. Steady-State Process Modelling

    DEFF Research Database (Denmark)

    2011-01-01

    This chapter covers the basic principles of steady state modelling and simulation using a number of case studies. Two principal approaches are illustrated that develop the unit operation models from first principles as well as through application of standard flowsheet simulators. The approaches i...

  17. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubininа

    2015-06-01

    Full Text Available The essence of process-oriented enterprise management is examined in the article. The content and types of information technology are analyzed, given the complexity and differentiation of existing methods and the specific language and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and modern traditional modeling techniques that have found practical application in visualizing retailers' activity are studied. The theoretical analysis of the modeling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes owing to its integrated systemology capabilities. A visualized simulation model of the retail business process "sales as-is" was designed using a combination of UFO elements, with the aim of further practical formalization and optimization of the given business process.

  18. A Comparative of business process modelling techniques

    Science.gov (United States)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    Many business process modeling techniques are in use today. This article presents research on the differences among them: for each technique, the definition and structure are explained. The paper presents a comparative analysis of some popular business process modelling techniques, with a comparative framework based on two criteria: notation and how the technique works when implemented in Somerleyton Animal Park. Each technique is summarized with its advantages and disadvantages. The conclusion recommends the business process modeling techniques that are easiest to use and serves as a basis for evaluating further modelling techniques.

  19. Modeling Events with Cascades of Poisson Processes

    CERN Document Server

    Simma, Aleksandr

    2012-01-01

    We present a probabilistic model of events in continuous time in which each event triggers a Poisson process of successor events. The ensemble of observed events is thereby modeled as a superposition of Poisson processes. Efficient inference is feasible under this model with an EM algorithm. Moreover, the EM algorithm can be implemented as a distributed algorithm, permitting the model to be applied to very large datasets. We apply these techniques to the modeling of Twitter messages and the revision history of Wikipedia.
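
    The generative side of such a model is compact: immigrant events arrive as a homogeneous Poisson process, and each event triggers a Poisson-distributed number of successors at later times. The sketch below assumes an exponential delay kernel and a subcritical branching mean (both illustrative choices) so the cascade stays finite.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, branch, decay, t_end = 0.5, 0.8, 1.0, 20.0   # illustrative constants

# Immigrant events from a homogeneous Poisson process on [0, t_end].
events = list(rng.uniform(0.0, t_end, rng.poisson(mu * t_end)))
queue = list(events)
while queue:
    parent = queue.pop()
    for _ in range(rng.poisson(branch)):         # offspring of this event
        child = parent + rng.exponential(1.0 / decay)
        if child < t_end:
            events.append(child)
            queue.append(child)
print(len(events), "events in total")
```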

  20. The Computer-Aided Analytic Process Model. Operations Handbook for the Analytic Process Model Demonstration Package

    Science.gov (United States)

    1986-01-01

    Research Note 86-06. The Computer-Aided Analytic Process Model: Operations Handbook for the Analytic Process Model Demonstration Package. Ronald G. ... Keywords: Analytic Process Model; Operations Handbook; Tutorial; Apple; Systems Taxonomy Model; Training System; Bradley Infantry Fighting Vehicle; BIFV. ... Abstract continued in the companion volume, "The Analytic Process Model for ..."

  1. Systematic approach for the identification of process reference models

    CSIR Research Space (South Africa)

    Van Der Merwe, A

    2009-02-01

    Full Text Available Process models are used in different application domains to capture knowledge on the process flow. Process reference models (PRM) are used to capture reusable process models, which should simplify the identification process of process models...

  2. Ada COCOMO and the Ada Process Model

    Science.gov (United States)

    1989-01-01

    ... language, the use of incremental development, and the use of the Ada process model capitalizing on the strengths of Ada to improve the efficiency of software development. This paper presents the portions of the revised Ada COCOMO dealing with the effects of Ada and the Ada process model. The remainder of this section of the paper discusses the objectives of Ada COCOMO. Section 2 describes the Ada Process Model and its overall effects on software ...

  3. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  4. A Study of Cloud Processing of Organic Aerosols Using Models and CHAPS Data

    Energy Technology Data Exchange (ETDEWEB)

    Ervens, Barbara [Univ. of Colorado, Boulder, CO (United States)

    2012-01-17

    The main theme of our work has been the identification of parameters that most affect the formation and modification of aerosol particles and their interaction with water vapor. Our detailed process model studies led to simplifications/parameterizations of these effects that bridge detailed aerosol information from laboratory and field studies and the need for computationally efficient expressions in complex atmospheric models. One focus of our studies has been organic aerosol mass that is formed in the atmosphere by physical and/or chemical processes (secondary organic aerosol, SOA) and represents a large fraction of atmospheric particulate matter. Most current models describe SOA formation only by condensation of low volatility (or semivolatile) gas phase products and neglect processes in the aqueous phase of particles or cloud droplets that affect aerosol size, vertical distribution and chemical composition (hygroscopicity) differently. We developed and applied models of aqueous phase SOA formation in cloud droplets and aerosol particles (aqSOA). Placing our model results into the context of laboratory, model and field studies suggests a potentially significant contribution of aqSOA to the global organic mass loading. The second focus of our work has been the analysis of ambient data of particles that might act as cloud condensation nuclei (CCN) at different locations and emission scenarios. Our model studies showed that the description of particle chemical composition and mixing state can often be greatly simplified, in particular for aged aerosol. While over the past years many CCN studies have been successfully performed using such simplified composition/mixing state assumptions, much more uncertainty exists in aerosol-cloud interactions in cold (ice or mixed-phase) clouds. Therefore we extended our parcel model of warm cloud formation with ice microphysics and explored microphysical parameters that determine the phase state and lifetime of

  5. Branching process models of cancer

    CERN Document Server

    Durrett, Richard

    2015-01-01

    This volume develops results on continuous time branching processes and applies them to study rates of tumor growth, extending classic work on the Luria-Delbruck distribution. As a consequence, the authors calculate the probability that mutations conferring resistance to treatment are present at detection and quantify the extent of tumor heterogeneity. As applications, the authors evaluate ovarian cancer screening strategies and give rigorous proofs for results of Haeno and Michor concerning tumor metastasis. These notes should be accessible to students who are familiar with Poisson processes and continuous time Markov chains. Richard Durrett is a mathematics professor at Duke University, USA. He is the author of 8 books, over 200 journal articles, and has supervised more than 40 Ph.D. students. Most of his current research concerns the applications of probability to biology: ecology, genetics, and most recently cancer.

  6. A Process Model for Establishing Business Process Crowdsourcing

    OpenAIRE

    Nguyen Hoang Thuan; Pedro Antunes; David Johnstone

    2017-01-01

    Crowdsourcing can be an organisational strategy to distribute work to Internet users and harness innovation, information, capacities, and variety of business endeavours. As crowdsourcing is different from other business strategies, organisations are often unsure as to how to best structure different crowdsourcing activities and integrate them with other organisational business processes. To manage this problem, we design a process model guiding how to establish business process crowdsourcing....

  7. Total Ship Design Process Modeling

    Science.gov (United States)

    2012-04-30

    ... Microsoft Project® or Primavera®, and perform process simulations that can investigate risk, cost, and schedule trade-offs. Prior efforts to capture ... planning in the face of disruption, delay, and late-changing requirements. ADePT is interfaced with Primavera, the AEC industry's favorite program ...

  8. An Improved Surface Simplification Method for Facial Expression Animation Based on Homogeneous Coordinate Transformation Matrix and Maximum Shape Operator

    Directory of Open Access Journals (Sweden)

    Juin-Ling Tseng

    2016-01-01

    Full Text Available Facial animation is one of the most popular 3D animation topics researched in recent years. However, when using facial animation, a 3D facial animation model has to be stored. This model requires many triangles to accurately describe and demonstrate facial expression animation, because the face often presents a number of different expressions. Consequently, the costs associated with facial animation have increased rapidly. In an effort to reduce storage costs, researchers have sought to simplify 3D animation models using techniques such as Deformation Sensitive Decimation and Feature Edge Quadric. These studies examined the problems of the homogeneity of the local coordinate system between different expression models and of the retention of simplified model characteristics. This paper proposes a method that applies a Homogeneous Coordinate Transformation Matrix to solve the problem of homogeneity of the local coordinate system, and a Maximum Shape Operator to detect shape changes in facial animation so as to properly preserve the features of facial expressions. Further, root mean square error and perceived quality error are used to compare the errors generated by different simplification methods in experiments. Experimental results show that, compared with Deformation Sensitive Decimation and Feature Edge Quadric, our method can not only reduce the errors caused by simplification of facial animation, but also retain more facial features.
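
    For readers unfamiliar with the Quadric-style simplifiers mentioned above, the core bookkeeping is small. The sketch below shows the standard quadric error metric in homogeneous coordinates, which underlies methods such as Feature Edge Quadric; it is a generic illustration, not the paper's Homogeneous Coordinate Transformation Matrix / Maximum Shape Operator method.

```python
import numpy as np

# Standard quadric error metric: each face contributes K = p p^T for its
# plane p = (a, b, c, d) with ax + by + cz + d = 0.
def face_quadric(v0, v1, v2):
    n = np.cross(v1 - v0, v2 - v0)
    n = n / np.linalg.norm(n)
    p = np.append(n, -n.dot(v0))     # homogeneous plane coefficients
    return np.outer(p, p)

def vertex_error(Q, v):
    vh = np.append(v, 1.0)           # homogeneous vertex
    return vh @ Q @ vh               # squared distance to contributing planes

tri = (np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]),
       np.array([0.0, 1.0, 0.0]))
Q = face_quadric(*tri)
print(vertex_error(Q, np.array([0.2, 0.2, 0.5])))   # 0.25: sq. dist to z = 0
```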

  9. Modeling and simulation of membrane process

    Science.gov (United States)

    Staszak, Maciej

    2017-06-01

    The article presents different approaches to the mathematical modeling of polymer membranes. Traditional models based on experimental physicochemical correlations, as well as balance models, are presented in the first part. Quantum and molecular mechanics models are presented next, as they are increasingly popular for polymer membranes in fuel cells. The first part closes with neural network models, which have found use for different types of processes in polymer membranes. The second part is devoted to models of fluid dynamics. Computational fluid dynamics techniques can be divided into the solving of Navier-Stokes equations and lattice Boltzmann models. Both approaches are presented with a focus on membrane processes.

  10. GPstuff: Bayesian Modeling with Gaussian Processes

    NARCIS (Netherlands)

    Vanhatalo, J.; Riihimaki, J.; Hartikainen, J.; Jylänki, P.P.; Tolvanen, V.; Vehtari, A.

    2013-01-01

    The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for Bayesian inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

  11. Online Rule Generation Software Process Model

    National Research Council Canada - National Science Library

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    .... The software process model for rule generation using decision tree classifier refers to the various steps required to be executed for the development of a web based software model for decision rule generation...

  12. Modeling pellet impact drilling process

    OpenAIRE

    Kovalev, Artem Vladimirovich; Ryabchikov, Sergey Yakovlevich; Isaev, Evgeniy Dmitrievich; Ulyanova, Oksana Sergeevna

    2016-01-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling t...

  13. Interpretive and Formal Models of Discourse Processing.

    Science.gov (United States)

    Bulcock, Jeffrey W.; Beebe, Mona J.

    Distinguishing between interpretive and formal models of discourse processing and between qualitative and quantitative research, this paper argues that formal models are the analogues of interpretive models, and that the two are complementary. It observes that interpretive models of reading are being increasingly derived from qualitative research…

  14. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints is considered, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows an efficient solving strategy and a concise error diagnosis to be implemented. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it

  15. An Extension to the Weibull Process Model

    Science.gov (United States)

    1981-11-01

    An Extension to the Weibull Process Model. ... indicating its importance to applications. 1. INTRODUCTION. Recent papers by Bain and Engelhardt (1980) and Crow ...

  16. Topology simplification: Important biological phenomenon or evolutionary relic?. Comment on "Disentangling DNA molecules" by Alexander Vologodskii

    Science.gov (United States)

    Bates, Andrew D.; Maxwell, Anthony

    2016-09-01

    The review, Disentangling DNA molecules[1], gives an excellent technical description of the phenomenon of topology simplification (TS) by type IIA DNA topoisomerases (topos). In the 20 years since its discovery [2], this effect has attracted a good deal of attention, probably because of its apparently magical nature, and because it seemed to offer a solution to the conundrum that all type II topos rely on ATP hydrolysis, but only bacterial DNA gyrases were known to transduce the free energy of hydrolysis into torsion (supercoiling) in the DNA. It made good sense to think that the other enzymes are using the energy to reduce the level of supercoiling, knotting, and particularly decatenation (unlinking), below equilibrium, since the key activity of the non-supercoiling topos is the removal of links between daughter chromosomes [3]. As Vologodskii discusses [1], there have been a number of theoretical models developed to explain how the local effect of a type II topo can influence the global level of knotting and catenation in large DNA molecules, and he explains how features of two of the most successful models (bent G segment and hooked juxtapositions) may be combined to explain the magnitude of the effect and overcome a kinetic problem with the hooked juxtaposition model.

  17. Hybrid modelling of anaerobic wastewater treatment processes.

    Science.gov (United States)

    Karama, A; Bernard, O; Genovesi, A; Dochain, D; Benhammou, A; Steyer, J P

    2001-01-01

    This paper presents a hybrid approach for the modelling of an anaerobic digestion process. The hybrid model combines a feed-forward network, describing the bacterial kinetics, and the a priori knowledge based on the mass balances of the process components. We have considered an architecture which incorporates the neural network as a static model of unmeasured process parameters (kinetic growth rate) and an integrator for the dynamic representation of the process using a set of dynamic differential equations. The paper contains a description of the neural network component training procedure. The performance of this approach is illustrated with experimental data.
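
    The architecture described, a static neural approximator for the unmeasured growth rate inside first-principles mass balances, can be sketched in a few lines. Below, a tiny untrained network with random weights and a bounded output stands in for mu(S) inside chemostat-style balance equations; all constants are illustrative, not the paper's identified model.

```python
import numpy as np
rng = np.random.default_rng(4)

# Tiny one-hidden-layer network standing in for the unmeasured growth
# rate mu(S); weights are random placeholders, not trained values.
W1, b1 = rng.normal(size=(5, 1)), rng.normal(size=5)
W2, b2 = rng.normal(size=(1, 5)), rng.normal(size=1)

def mu_nn(S):
    h = np.tanh(W1 @ np.array([S]) + b1)
    z = (W2 @ h + b2)[0]
    return 0.5 / (1.0 + np.exp(-z))    # growth rate bounded in (0, 0.5) 1/d

def rhs(x, D=0.1, S_in=10.0, Y=0.5):
    """First-principles mass balances for biomass X and substrate S."""
    X, S = x
    mu = mu_nn(S)
    return np.array([(mu - D) * X, D * (S_in - S) - mu * X / Y])

x, dt = np.array([1.0, 5.0]), 0.01     # initial state, time step (days)
for _ in range(1000):                  # explicit Euler over 10 days
    x = x + dt * rhs(x)
print("X = %.3f, S = %.3f" % (x[0], x[1]))
```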

  18. VARTM Process Modeling of Aerospace Composite Structures

    Science.gov (United States)

    Song, Xiao-Lan; Grimsley, Brian W.; Hubert, Pascal; Cano, Roberto J.; Loos, Alfred C.

    2003-01-01

    A three-dimensional model was developed to simulate the VARTM composite manufacturing process. The model considers the two important mechanisms that occur during the process: resin flow, and compaction and relaxation of the preform. The model was used to simulate infiltration of a carbon preform with an epoxy resin by the VARTM process. The predicted flow patterns and preform thickness changes agreed qualitatively with the measured values. However, the predicted total infiltration times were much longer than measured, most likely due to the inaccurate preform permeability values used in the simulation.

  19. Declarative business process modelling: principles and modelling languages

    Science.gov (United States)

    Goedertier, Stijn; Vanthienen, Jan; Caron, Filip

    2015-02-01

    The business process literature has proposed a multitude of business process modelling approaches or paradigms, each in response to a different business process type with a unique set of requirements. Two polar paradigms, i.e. the imperative and the declarative paradigm, appear to define the extreme positions on the paradigm spectrum. While imperative approaches focus on explicitly defining how an organisational goal should be reached, the declarative approaches focus on the directives, policies and regulations restricting the potential ways to achieve the organisational goal. In between, a variety of hybrid-paradigms can be distinguished, e.g. the advanced and adaptive case management. This article focuses on the less-exposed declarative approach on process modelling. An outline of the declarative process modelling and the modelling approaches is presented, followed by an overview of the observed declarative process modelling principles and an evaluation of the declarative process modelling approaches.

  20. Process and Context in Choice Models

    DEFF Research Database (Denmark)

    Ben-Akiva, Moshe; Palma, André de; McFadden, Daniel

    2012-01-01

    We develop a general framework that extends choice models by including an explicit representation of the process and context of decision making. Process refers to the steps involved in decision making. Context refers to factors affecting the process, focusing in this paper on social networks. The...

  1. Will Rule based BPM obliterate Process Models?

    NARCIS (Netherlands)

    Joosten, S.; Joosten, H.J.M.

    2007-01-01

    Business rules can be used directly for controlling business processes, without reference to a business process model. In this paper we propose to use business rules to specify both business processes and the software that supports them. Business rules expressed in smart mathematical notations bring

  2. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

    Full Text Available Business process modelling and analysis is undoubtedly one of the most important parts of applied (business) informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst's work is to create generally understandable, explicit and error-free models. If a process is properly described, the created models can be used as input into deep analysis and optimization. It can be assumed that properly designed business process models (similarly to correctly written algorithms) contain characteristics that can be mathematically described, and that it should be possible to create a tool that helps process analysts design proper models. As part of this review, a systematic literature review was conducted in order to find and analyse measures of business process model design and quality. It was found that this area has already been the subject of research in the past. Thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed publications and existing quality measures do not reflect all important attributes of business process model clarity, simplicity and completeness. It would therefore be appropriate to add new measures of quality.

  3. Modeling of percolation process in hemicellulose hydrolysis.

    Science.gov (United States)

    Cahela, D R; Lee, Y Y; Chambers, R P

    1983-01-01

    A mathematical model was developed for a percolation reactor in connection with consecutive first-order reactions. The model was designed to simulate acid-catalyzed cellulose or hemicellulose hydrolysis. The modeling process resulted in an analytically derived reactor equation, including mass-transfer effects, which was found to be useful in process design and reactor optimization. The model was verified by experimental data obtained from hemicellulose hydrolysis.
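
    In the batch (no-flow) limit, consecutive first-order reactions, e.g. hemicellulose to sugar to degradation products, have a closed-form solution that the sketch below evaluates with illustrative rate constants; the percolation model itself additionally accounts for flow and mass transfer.

```python
import numpy as np

# A -> B -> C with first-order rate constants k1, k2 (illustrative values).
k1, k2, A0 = 0.05, 0.02, 1.0                 # 1/min, normalised concentration
t = np.linspace(0.0, 120.0, 7)

A = A0 * np.exp(-k1 * t)
B = A0 * k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))
C = A0 - A - B                               # mass balance closes the system
for row in zip(t, A, B, C):
    print("t=%5.1f  A=%.3f  B=%.3f  C=%.3f" % row)
```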

  4. Hybrid Sludge Modeling in Water Treatment Processes

    OpenAIRE

    Brenda, Marian

    2015-01-01

    Sludge occurs in many waste water and drinking water treatment processes. The numeric modeling of sludge is therefore crucial for developing and optimizing water treatment processes. Numeric single-phase sludge models mainly include settling and viscoplastic behavior. Even though many investigators emphasize the importance of modeling the rheology of sludge for good simulation results, it is difficult to measure, because of settling and the viscoplastic behavior. In this thesis, a new method ...

  5. On the computational modeling of FSW processes

    OpenAIRE

    Agelet de Saracibar Bosch, Carlos; Chiumenti, Michèle; Santiago, Diego de; Cervera Ruiz, Miguel; Dialami, Narges; Lombera, Guillermo

    2010-01-01

    This work deals with the computational modeling and numerical simulation of Friction Stir Welding (FSW) processes. Here a quasi-static, transient, mixed stabilized Eulerian formulation is used. Norton-Hoff and Sheppard-Wright rigid thermoplastic material models have been considered. A product formula algorithm, leading to a staggered solution scheme, has been used. The model has been implemented into the in-house developed FE code COMET. Results obtained in the simulation of FSW process are c...

  6. Integrating Tax Preparation with FAFSA Completion: Three Case Models

    Science.gov (United States)

    Daun-Barnett, Nathan; Mabry, Beth

    2012-01-01

    This research compares three different models implemented in four cities. The models integrated free tax-preparation services to assist low-income families with their completion of the Free Application for Federal Student Aid (FAFSA). There has been an increased focus on simplifying the FAFSA process. However, simplification is not the only…

  7. Generalizing on best practices in image processing: a model for promoting research integrity: Commentary on: Avoiding twisted pixels: ethical guidelines for the appropriate use and manipulation of scientific digital images.

    Science.gov (United States)

    Benos, Dale J; Vollmer, Sara H

    2010-12-01

    Modifying images for scientific publication is now quick and easy due to changes in technology. This has created a need for new image processing guidelines and attitudes, such as those offered to the research community by Doug Cromey (Cromey 2010). We suggest that related changes in technology have simplified the task of detecting misconduct for journal editors as well as researchers, and that this simplification has caused a shift in the responsibility for reporting misconduct. We also argue that the concept of best practices in image processing can serve as a general model for education in best practices in research.

  8. Extending Model Checking to Object Process Validation

    NARCIS (Netherlands)

    Rein, van H.

    2002-01-01

    Object-oriented techniques allow the gathering and modelling of system requirements in terms of an application area. The expression of data and process models at that level is a great asset in communication with non-technical people in that area, but it does not necessarily lead to consistent models

  9. Preform Characterization in VARTM Process Model Development

    Science.gov (United States)

    Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal; Loos, Alfred C.; Kellen, Charles B.; Jensen, Brian J.

    2004-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) is a Liquid Composite Molding (LCM) process where both resin injection and fiber compaction are achieved under pressures of 101.3 kPa or less. Originally developed over a decade ago for marine composite fabrication, VARTM is now considered a viable process for the fabrication of aerospace composites (1,2). In order to optimize and further improve the process, a finite element analysis (FEA) process model is being developed to include the coupled phenomena of resin flow, preform compaction and resin cure. The model input parameters are obtained from resin and fiber-preform characterization tests. In this study, the compaction behavior and the Darcy permeability of a commercially available carbon fabric are characterized. The resulting empirical model equations are input to the 3-Dimensional Infiltration, version 5 (3DINFILv.5) process model to simulate infiltration of a composite panel.

  10. Package for calculations and simplifications of expressions with Dirac matrices (MatrixExp)

    Science.gov (United States)

    Poghosyan, V. A.

    2005-08-01

    This paper describes a package for calculations of expressions with Dirac matrices, and its advantages over existing similar packages. The MatrixExp package is intended for simplification of complex expressions involving γ-matrices, providing such tools as automatic Feynman parameterization, integration in d-dimensional space, and sorting and grouping of results in a given order. In comparison with the existing similar package Tracer, MatrixExp has enhanced input possibilities. The user-available functions of the MatrixExp package are described in detail, and an example calculation of the Feynman diagram for the process b→sγg using the package's functions is presented. Program summary: Title of program: MatrixExp. Catalogue identifier: ADWB. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWB. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Licensing provisions: none. Programming language: MATHEMATICA. Computer: PC Pentium. Operating system: Windows. No. of lines in distributed program, including test data, etc.: 1551. No. of bytes in distributed program, including test data, etc.: 16 040. Distribution format: tar.gz. RAM: loading the package uses approx. 3 500 000 bytes of RAM; however, the memory required for calculations depends heavily on the expressions in view, as the package uses recursive functions and MATHEMATICA dynamically allocates memory. The package has been tested to work on a PC Pentium II 233 MHz with 128 Mb of memory, calculating typical diagrams of contemporary calculations. Nature of problem: Feynman diagram calculation, simplification of expressions with γ-matrices. Solution method: analytic transformations, dimensional regularization, Feynman parameterization. Restrictions: the MatrixExp package works only with a single line of expressions (G[l1, …]), in contrast to the Tracer package that works with multiple lines, i.e., the following is possible in Tracer, but not in MatrixExp: G[l1,
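
    The kind of identity such packages manipulate symbolically can be cross-checked numerically. The sketch below builds the Dirac-representation gamma matrices in numpy and verifies Tr(γ^μ γ^ν) = 4 g^{μν}; it is an independent illustration in Python, not part of MatrixExp, which is a MATHEMATICA package.

```python
import numpy as np

# Pauli matrices and the Dirac-representation gamma matrices.
I2 = np.eye(2)
Z2 = np.zeros((2, 2))
sig = [np.array([[0, 1], [1, 0]]), np.array([[0, -1j], [1j, 0]]),
       np.array([[1, 0], [0, -1]])]
gam = [np.block([[I2, Z2], [Z2, -I2]])]
gam += [np.block([[Z2, s], [-s, Z2]]) for s in sig]
g = np.diag([1.0, -1.0, -1.0, -1.0])         # Minkowski metric

# Check the trace identity Tr(gamma^mu gamma^nu) = 4 g^{mu nu}.
for mu in range(4):
    for nu in range(4):
        assert np.allclose(np.trace(gam[mu] @ gam[nu]), 4 * g[mu, nu])
print("Tr(gamma^mu gamma^nu) = 4 g^{mu nu} verified")
```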

  11. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...

  12. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
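
    The feedback flavour of such a model can be conveyed with a toy stock-and-flow loop: schedule pressure speeds up work but breeds rework that flows back into the remaining-work stock. The structure and constants below are invented for illustration and are far simpler than SEPS itself.

```python
# Toy system-dynamics loop (all structure and constants invented).
work_left, t, dt = 100.0, 0.0, 0.25          # tasks, weeks
deadline, staff, nominal = 40.0, 4.0, 0.5    # weeks, people, tasks/person/week

while work_left > 0.5 and t < 80.0:
    needed = work_left / max(deadline - t, 1.0)
    pressure = min(max(needed / (staff * nominal) - 1.0, 0.0), 1.0)
    rate = staff * nominal * (1.0 + 0.3 * pressure)   # overtime speeds work...
    rework = 0.1 * rate * (1.0 + pressure)            # ...but breeds rework
    work_left += dt * (rework - rate)
    t += dt
print("work left %.1f tasks at t = %.1f weeks" % (work_left, t))
```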

  13. Job Aiding/Training Decision Process Model

    Science.gov (United States)

    1992-09-01

    I[ -, . 1’, oo Ii AL-CR-i1992-0004 AD-A256 947lEE = IIEI ifl ll 1l I JOB AIDING/TRAINING DECISION PROCESS MODEL A R M John P. Zenyuh DTIC S Phillip C...March 1990 - April 1990 4. TITLE AND SUBTITLE S. FUNDING NUMBERS C - F33615-86-C-0545 Job Aiding/Training Decision Process Model PE - 62205F PR - 1121 6...Components to Process Model Decision and Selection Points ........... 32 13. Summary of Subject Recommendations for Aiding Approaches

  14. Fuel Conditioning Facility Electrorefiner Process Model

    Energy Technology Data Exchange (ETDEWEB)

    DeeEarl Vaden

    2005-10-01

    The Fuel Conditioning Facility at the Idaho National Laboratory processes spent nuclear fuel from the Experimental Breeder Reactor II using electro-metallurgical treatment. To process fuel without waiting for periodic sample analyses to assess process conditions, an electrorefiner process model predicts the composition of the electrorefiner inventory and effluent streams. For the chemical equilibrium portion of the model, the two common methods for solving chemical equilibrium problems, stoichiometric and non-stoichiometric, were investigated. In conclusion, the stoichiometric method produced equilibrium compositions close to the measured results, whereas the non-stoichiometric method did not.

  15. MULTI-SCALE GAUSSIAN PROCESSES MODEL

    Institute of Scientific and Technical Information of China (English)

    Zhou Yatong; Zhang Taiyi; Li Xiaohe

    2006-01-01

    A novel model named Multi-scale Gaussian Processes (MGP) is proposed. Motivated by the idea of multi-scale representation in wavelet theory, in the new model a Gaussian process is represented at each scale by a linear basis composed of a scale function and its different translations. Finally, the distribution of the targets of the given samples can be obtained at different scales. Compared with the standard Gaussian Processes (GP) model, the MGP model can control its complexity conveniently just by adjusting the scale parameter, so it can trade off generalization ability and empirical risk rapidly. Experiments verify the feasibility of the MGP model and show that its performance is superior to the GP model if appropriate scales are chosen.
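
    One simple way to obtain multi-scale behaviour in a GP, shown below, is a covariance that sums squared-exponential kernels over several length scales; this is a generic stand-in for the wavelet-style basis construction of the paper, not its exact formulation.

```python
import numpy as np

rng = np.random.default_rng(5)

def rbf(x, y, scale):
    return np.exp(-0.5 * ((x[:, None] - y[None, :]) / scale) ** 2)

def multi_scale_K(x, y, scales=(0.05, 0.2, 1.0)):
    """Sum of RBF kernels: one term per scale."""
    return sum(rbf(x, y, s) for s in scales) / len(scales)

x = np.linspace(0.0, 1.0, 40)
f = np.sin(2 * np.pi * x) + 0.3 * np.sin(12 * np.pi * x)   # two scales
y = f + 0.1 * rng.normal(size=x.size)

K = multi_scale_K(x, x) + 0.1 ** 2 * np.eye(x.size)        # noisy Gram matrix
xs = np.linspace(0.0, 1.0, 5)
mean = multi_scale_K(xs, x) @ np.linalg.solve(K, y)        # GP posterior mean
print(mean)
```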

  16. Hybrid modelling of a sugar boiling process

    CERN Document Server

    Lauret, Alfred Jean Philippe; Gatina, Jean Claude

    2012-01-01

    The first and maybe the most important step in designing a model-based predictive controller is to develop a model that is as accurate as possible and that is valid under a wide range of operating conditions. The sugar boiling process is a strongly nonlinear and nonstationary process. The main process nonlinearities are represented by the crystal growth rate. This paper addresses the development of the crystal growth rate model according to two approaches. The first approach is classical and consists of determining the parameters of the empirical expressions of the growth rate through the use of a nonlinear programming optimization technique. The second is a novel modeling strategy that combines an artificial neural network (ANN) as an approximator of the growth rate with prior knowledge represented by the mass balance of sucrose crystals. The first results show that the first type of model performs local fitting while the second offers a greater flexibility. The two models were developed with industrial data...

  17. Probabilistic models of language processing and acquisition.

    Science.gov (United States)

    Chater, Nick; Manning, Christopher D

    2006-07-01

    Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.

  18. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-07-01

    Full Text Available Background: Competitive intelligence (CI) provides actionable intelligence, which gives enterprises a competitive edge. However, without a proper process, it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI with the aim of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, where the output of one phase is the input of the next phase. Conclusion: The CI process is a cycle of interrelated phases; these phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  20. Online Rule Generation Software Process Model

    Directory of Open Access Journals (Sweden)

    Sudeep Marwaha

    2013-07-01

    Full Text Available For production systems like expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required to be executed for the development of a web-based software model for decision rule generation. Royce's final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of the various steps of the modified waterfall model for decision rule generation.

  1. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it, need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments......-based framework is that in the design, development and/or manufacturing of a chemical product-process, the knowledge of the applied phenomena together with the product-process design details can be provided with diverse degrees of abstractions and details. This would allow the experimental resources......, are the needed models for such a framework available? Or, are modelling tools that can help to develop the needed models available? Can such a model-based framework provide the needed model-based work-flows matching the requirements of the specific chemical product-process design problems? What types of models...

  2. MODELLING PURCHASING PROCESSES FROM QUALITY ASPECTS

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2008-12-01

    Full Text Available Management has the fundamental task of identifying and directing the primary and specific processes within the purchasing function, applying up-to-date information infrastructure. ISO 9001:2000 defines a process as a number of interrelated or interacting activities transforming inputs into outputs, and the "process approach" as the systematic identification and management of the processes employed within the organization and, particularly, of the relationships among those processes. To direct a quality management system using the process approach, the organization has to determine the map of its general (basic) processes. Primary processes are determined on the grounds of their interrelationship and impact on satisfying customers' needs. To make a proper choice of general business processes, it is necessary to determine the entire business flow, beginning with the customer demand up to the delivery of the product or service provided. In the next step the process model is converted into a data model, which is essential for implementation of the information system enabling automation, monitoring, measuring, inspection, analysis and improvement of key purchasing processes. This paper presents the methodology and some results of an investigation into the development of an IS for the purchasing process from the quality aspect.

  3. CSP-based chemical kinetics mechanisms simplification strategy for non-premixed combustion: An application to hybrid rocket propulsion

    KAUST Repository

    Ciottoli, Pietro P.

    2017-08-14

    A set of simplified chemical kinetics mechanisms for hybrid rocket applications using gaseous oxygen (GOX) and hydroxyl-terminated polybutadiene (HTPB) is proposed. The starting point is a 561-species, 2538-reactions, detailed chemical kinetics mechanism for hydrocarbon combustion. This mechanism is used for predictions of the oxidation of butadiene, the primary HTPB pyrolysis product. A Computational Singular Perturbation (CSP) based simplification strategy for non-premixed combustion is proposed. The simplification algorithm is fed with the steady-solutions of classical flamelet equations, these being representative of the non-premixed nature of the combustion processes characterizing a hybrid rocket combustion chamber. The adopted flamelet steady-state solutions are obtained employing pure butadiene and gaseous oxygen as fuel and oxidizer boundary conditions, respectively, for a range of imposed values of strain rate and background pressure. Three simplified chemical mechanisms, each comprising less than 20 species, are obtained for three different pressure values, 3, 17, and 36 bar, selected in accordance with an experimental test campaign of lab-scale hybrid rocket static firings. Finally, a comprehensive strategy is shown to provide simplified mechanisms capable of reproducing the main flame features in the whole pressure range considered.
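
    The core CSP diagnostic can be shown on a toy linear system dy/dt = J y: eigenvalues of the Jacobian give mode timescales, and a wide gap separates fast, exhausted modes (removed in the simplified mechanism) from the slow dynamics that are kept. The matrix below is invented for illustration, not part of the HTPB/GOX mechanism.

```python
import numpy as np

# Illustrative stiff Jacobian with well-separated timescales.
J = np.array([[-1000.0,   1.0,  0.0],
              [  998.0, -10.0,  0.5],
              [    1.0,   8.0, -0.1]])

eigvals, _ = np.linalg.eig(J)
tau = 1.0 / np.abs(eigvals.real)       # characteristic timescale of each mode
for lam, ti in sorted(zip(eigvals.real, tau), key=lambda z: z[1]):
    print("lambda = %10.2f   tau = %10.4f s" % (lam, ti))
# The jump between consecutive tau values marks the fast/slow split that
# CSP exploits when eliminating exhausted modes.
```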

  4. Mathematical modelling of the calcination process | Olayiwola ...

    African Journals Online (AJOL)

    Mathematical modelling of the calcination process. ... High quality lime is an essential raw material for Electric Arc Furnaces and Basic Oxygen Furnaces ... From the numerical simulation, it is observed that the gas temperature increases as the ...

  5. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  6. Modeling of Dielectric Heating within Lyophilization Process

    Directory of Open Access Journals (Sweden)

    Jan Kyncl

    2014-01-01

    Full Text Available A process of lyophilization of paper books is modeled. The process of drying is controlled by a dielectric heating system. From the physical viewpoint, the task represents a 2D coupled problem described by two partial differential equations for the electric and temperature fields. The material parameters are supposed to be temperature-dependent functions. The continuous mathematical model is solved numerically. The methodology is illustrated with some examples whose results are discussed.

  7. A Process Model of Quantum Mechanics

    OpenAIRE

    Sulis, William

    2014-01-01

    A process model of quantum mechanics utilizes a combinatorial game to generate a discrete and finite causal space upon which can be defined a self-consistent quantum mechanics. An emergent space-time M and continuous wave function arise through a non-uniform interpolation process. Standard non-relativistic quantum mechanics emerges under the limit of infinite information (the causal space grows to infinity) and infinitesimal scale (the separation between points goes to zero). The model has th...

  8. A Procedural Model for Process Improvement Projects

    OpenAIRE

    Kreimeyer, Matthias;Daniilidis, Charampos;Lindemann, Udo

    2017-01-01

    Process improvement projects are of a complex nature. It is therefore necessary to use experience and knowledge gained in previous projects when executing a new project. Yet, there are few pragmatic planning aids, and transferring the institutional knowledge from one project to the next is difficult. This paper proposes a procedural model that extends common models for project planning to enable staff on a process improvement project to adequately plan their projects, enabling them to documen...

  9. THE STUDY OF SIMPLIFICATION AND EXPLICITATION TECHNIQUES IN KHALED HOSSEINI'S “A THOUSAND SPLENDID SUNS”

    Directory of Open Access Journals (Sweden)

    Reza Kafipour

    2016-12-01

    Teaching and learning strategies help facilitate teaching and learning. Among them, simplification and explicitation strategies help transfer meaning to the learners and readers of a translated text. The aim of this study was to investigate explicitation and simplification in the Persian translation of Khaled Hosseini's novel “A Thousand Splendid Suns”. The study also attempted to find out the frequencies of the simplification and explicitation techniques used by the translators in translating the novel. To do so, 359 sentences out of the 6000 sentences in the original text were selected by a systematic random sampling procedure. Then the percentage and total sums of each of the strategies were calculated. The results showed that both translators used simplification and explicitation techniques significantly in their translations, whereas Saadvandian, the first translator, applied significantly more simplification techniques than Ghabrai, the second translator. However, no significant difference was found between the translators in the application of explicitation techniques. The study implies that these two translation strategies were fully familiar to the translators, as both used them significantly to make the translation more understandable to the readers.

  10. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension...

  11. Flux Analysis in Process Models via Causality

    CERN Document Server

    Kahramanoğullari, Ozan

    2010-01-01

    We present an approach for flux analysis in process algebra models of biological systems. We perceive flux as the flow of resources in stochastic simulations. We resort to an established correspondence between event structures, a broadly recognised model of concurrency, and the state transitions of process models, seen as Petri nets. We show that, in this way, we can extract the causal resource dependencies between individual state transitions in simulations as partial orders of events. We propose transformations on the partial orders that provide means for further analysis, and introduce a software tool which implements these ideas. By means of an example of a published model of the Rho GTP-binding proteins, we argue that this approach can provide a substitute for flux analysis techniques on ordinary differential equation models within the stochastic setting of process algebras.

  12. Cost Models for MMC Manufacturing Processes

    Science.gov (United States)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  13. Physical and mathematical modelling of extrusion processes

    DEFF Research Database (Denmark)

    Arentoft, Mogens; Gronostajski, Z.; Niechajowics, A.

    2000-01-01

    The main objective of the work is to study the extrusion process using physical modelling and to compare the findings of the study with finite element predictions. The possibilities and advantages of the simultaneous application of both of these methods for the analysis of metal forming processes...

  14. Business Process Modeling Notation - An Overview

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

    BPMN is an industrial standard created to offer a common and user-friendly notation to all participants in a business process. The present paper aims to briefly present the main features of this notation, as well as an interpretation of some of the main patterns characterizing a business process modelled as workflows.

  16. [The model of adaptive primary image processing].

    Science.gov (United States)

    Dudkin, K N; Mironov, S V; Dudkin, A K; Chikhman, V N

    1998-07-01

    A computer model of adaptive segmentation of the 2D visual objects was developed. Primary image descriptions are realised via spatial frequency filters and feature detectors performing as self-organised mechanisms. Simulation of the control processes related to attention, lateral, frequency-selective and cross-orientation inhibition, determines the adaptive image processing.

  17. Effect of transport-pathway simplifications on projected releases of radionuclides from a nuclear waste repository (Sweden)

    Science.gov (United States)

    Selroos, Jan-Olof; Painter, Scott L.

    2012-12-01

    The Swedish Nuclear Fuel and Waste Management Company has recently submitted an application for a license to construct a final repository for spent nuclear fuel, at approximately 500 m depth in crystalline bedrock. Migration pathways through the geosphere barrier are geometrically complex, with segments in fractured rock, deformation zones, backfilled tunnels, and near-surface soils. Several simplifications of these complex migration pathways were used in the assessments of repository performance that supported the license application. Specifically, in the geosphere transport calculations, radionuclide transport in soils and tunnels was neglected, and deformation zones were assumed to have transport characteristics of fractured rock. The effects of these simplifications on the projected performance of the geosphere barrier system are addressed. Geosphere performance is shown to be sensitive to how transport characteristics of deformation zones are conceptualized and incorporated into the model. Incorporation of advective groundwater travel time within backfilled tunnels reduces radiological dose from non-sorbing radionuclides such as I-129, while sorption in near-surface soils reduces radiological doses from sorbing radionuclides such as Ra-226. These results help quantify the degree to which geosphere performance was pessimistically assessed, and provide some guidance on how future studies to reduce uncertainty in geosphere performance may be focused.

  18. Laser surface processing and model studies

    CERN Document Server

    Yilbas, Bekir Sami

    2013-01-01

    This book introduces model studies associated with laser surface processing such as conduction limited heating, surface re-melting, Marangoni flow and its effects on the temperature field, re-melting of multi-layered surfaces, laser shock processing, and practical applications. The book provides insight into the physical processes involved with laser surface heating and phase change in laser irradiated region. It is written for engineers and researchers working on laser surface engineering.

  19. The (mathematical) modelling process in biosciences

    Directory of Open Access Journals (Sweden)

    Nestor V. Torres

    2015-12-01

    In this communication we introduce a general framework and discussion on the role of models and the modelling process within scientific activity in the biosciences realm. The objective is to sum up the common procedure during the formalization and analysis of a biological problem under the foundations of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of a mathematical model within the context of biology. Then, we present the modelling and analysis process for biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this presentation the main features and shortcomings of the process are developed, together with a set of rules that could help in the modelling endeavour of any biological system. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modelling currently poses to biology.

  20. Modeling aerosol processes at the local scale

    Energy Technology Data Exchange (ETDEWEB)

    Lazaridis, M.; Isukapalli, S.S.; Georgopoulos, P.G. [Environmental and Occupational Health Sciences Inst., NJ (United States)

    1998-12-31

    This work presents an approach for modeling photochemical gaseous and aerosol phase processes in subgrid plumes from major localized (e.g. point) sources (plume-in-grid modeling), thus improving the ability to quantify the relationship between emission source activity and ambient air quality. This approach employs the Reactive Plume Model (RPM-AERO) which extends the regulatory model RPM-IV by incorporating aerosol processes and heterogeneous chemistry. The physics and chemistry of elemental carbon, organic carbon, sulfate, sodium, chloride and crustal material of aerosols are treated and attributed to the PM size distribution. A modified version of the Carbon Bond IV chemical mechanism is included to model the formation of organic aerosol, and the inorganic multicomponent atmospheric aerosol equilibrium model, SEQUILIB is used for calculating the amounts of inorganic species in particulate matter. Aerosol dynamics modeled include mechanisms of nucleation, condensation and gas/particle partitioning of organic matter. An integrated trajectory-in-grid modeling system, UAM/RPM-AERO, is under continuing development for extracting boundary and initial conditions from the mesoscale photochemical/aerosol model UAM-AERO. The RPM-AERO is applied here to case studies involving emissions from point sources to study sulfate particle formation in plumes. Model calculations show that homogeneous nucleation is an efficient process for new particle formation in plumes, in agreement with previous field studies and theoretical predictions.

  1. Analysis and evaluation of collaborative modeling processes

    NARCIS (Netherlands)

    Ssebuggwawo, D.

    2012-01-01

    Analysis and evaluation of collaborative modeling processes is confronted with many challenges. On the one hand, many systems design and re-engineering projects require collaborative modeling approaches that can enhance their productivity. But, such collaborative efforts, which often consist of the

  2. Model checking Quasi Birth Death processes

    NARCIS (Netherlands)

    Remke, A.K.I.

    2004-01-01

    Quasi-Birth Death processes (QBDs) are a special class of infinite state CTMCs that combines a large degree of modeling expressiveness with efficient solution methods. This work adapts the well-known stochastic logic CSL for use on QBDs as CSL and presents model checking algorithms for so-called lev

  3. Bayesian Network Based XP Process Modelling

    Directory of Open Access Journals (Sweden)

    Mohamed Abouelela

    2010-07-01

    A Bayesian Network based mathematical model has been used for modelling the Extreme Programming software development process. The model is capable of predicting the expected finish time and the expected defect rate for each XP release. Therefore, it can be used to determine the success/failure of any XP project. The model takes into account the effect of three XP practices, namely: Pair Programming, Test Driven Development and Onsite Customer practices. The model's predictions were validated against two case studies. Results show the precision of our model, especially in predicting the project finish time.
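
    The structure and probabilities of the article's network are not reproduced here; the following minimal sketch only shows the core idea of reading a prediction out of a conditional probability table indexed by the three practices, with invented numbers:

    ```python
    import numpy as np

    # Hypothetical conditional probability table:
    # P(release on time | pair programming, TDD, onsite customer).
    # Indices are 0 (practice not used) / 1 (used); all numbers are invented.
    p_on_time = np.zeros((2, 2, 2))
    p_on_time[0, 0, 0] = 0.40
    p_on_time[1, 0, 0] = 0.55
    p_on_time[0, 1, 0] = 0.60
    p_on_time[0, 0, 1] = 0.50
    p_on_time[1, 1, 0] = 0.72
    p_on_time[1, 0, 1] = 0.65
    p_on_time[0, 1, 1] = 0.70
    p_on_time[1, 1, 1] = 0.85

    pp, tdd, onsite = 1, 1, 0   # a team using pair programming and TDD only
    print("P(on time) =", p_on_time[pp, tdd, onsite])
    ```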

  4. Process model patterns for collaborative work

    OpenAIRE

    Lonchamp, Jacques

    1998-01-01

    Conference paper with proceedings and review committee. As most real work is collaborative in nature, process model developers have to model collaborative situations. This paper defines generic collaborative patterns, i.e., pragmatic and abstract building blocks for modelling recurrent situations. The first part specifies the graphical notation for the solution description. The second part gives some current patterns for the collaborative production of a single document in isolation and for the synchronizat...

  5. Using Perspective to Model Complex Processes

    Energy Technology Data Exchange (ETDEWEB)

    Kelsey, R.L.; Bisset, K.R.

    1999-04-04

    The notion of perspective, when supported in an object-based knowledge representation, can facilitate better abstractions of reality for modeling and simulation. The object modeling of complex physical and chemical processes is made more difficult in part due to the poor abstractions of state and phase changes available in these models. The notion of perspective can be used to create different views to represent the different states of matter in a process. These techniques can lead to a more understandable model. Additionally, the ability to record the progress of a process from start to finish is problematic. It is desirable to have a historic record of the entire process, not just the end result of the process. A historic record should facilitate backtracking and re-start of a process at different points in time. The same representation structures and techniques can be used to create a sequence of process markers to represent a historic record. By using perspective, the sequence of markers can have multiple and varying views tailored for a particular user's context of interest.

  6. Filament winding cylinders. I - Process model

    Science.gov (United States)

    Lee, Soo-Yong; Springer, George S.

    1990-01-01

    A model was developed which describes the filament winding process of composite cylinders. The model relates the significant process variables such as winding speed, fiber tension, and applied temperature to the thermal, chemical and mechanical behavior of the composite cylinder and the mandrel. Based on the model, a user friendly code was written which can be used to calculate (1) the temperature in the cylinder and the mandrel, (2) the degree of cure and viscosity in the cylinder, (3) the fiber tensions and fiber positions, (4) the stresses and strains in the cylinder and in the mandrel, and (5) the void diameters in the cylinder.

  7. Social Modification With The Changing Technology in The Case of Simplification Theory

    Directory of Open Access Journals (Sweden)

    Rana Nur Ülker

    2014-10-01

    McLuhan's claim that "the world would become a global village" has come true. The global village, whose main tools are mass media technologies, especially the internet, has made civilizations social. With the rise of global communication, every new invention becomes known easily and the technology can be observed. As Marcuse said, global communication not only makes people the same but also makes them simple. Tools are becoming simple enough to be understood by everybody, from every civilization and indeed every age period. In this study, we debate why and how "simplification theory" can make people and technology simple. Simplification theory can be defined as follows: when media technologies change, people's values and outlook on the world are modified, because societies' values and arguments about the world are translated into technological devices. To sum up, all the instruments surrounding human beings have changed. Key words: Simplification, Social Modification, Mass Media, Technology, Globalization

  8. Modelling of aerosol processes in plumes

    Energy Technology Data Exchange (ETDEWEB)

    Lazaridis, M.; Isukapalli, S.S.; Georgopoulos, P.G. [Norwegian Institute of Air Research, Kjeller (Norway)

    2001-07-01

    A modelling platform for studying photochemical gaseous and aerosol phase processes from localized (e.g., point) sources has been presented. The current approach employs a reactive plume model which extends the regulatory model RPM-IV by incorporating aerosol processes and heterogeneous chemistry. The physics and chemistry of elemental carbon, organic carbon, sulfate, nitrate, ammonium material of aerosols are treated and attributed to the PM size distribution. A modified version of the carbon bond IV chemical mechanism is included to model the formation of organic aerosol. Aerosol dynamics modeled include mechanisms of nucleation, condensation, dry deposition and gas/particle partitioning of organic matter. The model is first applied to a number of case studies involving emissions from point sources and sulfate particle formation in plumes. Model calculations show that homogeneous nucleation is an efficient process for new particle formation in plumes, in agreement with previous field studies and theoretical predictions. In addition, the model is compared with field data from power plant plumes with satisfactory predictions against gaseous species and total sulphate mass measurements. Finally, the plume model is applied to study secondary organic matter formation due to various emission categories such as vehicles and the oil production sector.

  9. Modeling Multivariate Volatility Processes: Theory and Evidence

    Directory of Open Access Journals (Sweden)

    Jelena Z. Minovic

    2009-05-01

    This article presents theoretical and empirical methodology for the estimation and modeling of multivariate volatility processes. It surveys the model specifications and the estimation methods. Multivariate GARCH models covered are VEC (initially due to Bollerslev, Engle and Wooldridge, 1988), diagonal VEC (DVEC), BEKK (named after Baba, Engle, Kraft and Kroner, 1995), the Constant Conditional Correlation model (CCC, Bollerslev, 1990), and the Dynamic Conditional Correlation (DCC) models of Tse and Tsui (2002) and Engle (2002). I illustrate the approach by applying it to daily data from the Belgrade stock exchange: I examine two pairs of daily log returns for stocks and an index, report the results obtained, and compare them with the restricted versions of the BEKK, DVEC and CCC representations. The parameter estimation methods used are maximum likelihood (in the BEKK and DVEC models) and the two-step approach (in the CCC model).
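
    As a sketch of the simplest specification covered, the CCC model, the following simulates two GARCH(1,1) variance recursions tied together by a constant correlation; parameters are invented, and the maximum-likelihood estimation step discussed in the article is not shown:

    ```python
    import numpy as np

    # Simulation of a bivariate CCC-GARCH(1,1): two univariate GARCH variance
    # recursions coupled only through a constant shock correlation rho.
    # All parameter values are invented for illustration.
    rng = np.random.default_rng(1)
    T, rho = 1000, 0.4
    omega = np.array([0.05, 0.08])
    alpha = np.array([0.08, 0.10])
    beta = np.array([0.90, 0.85])

    h = omega / (1 - alpha - beta)          # start at the unconditional variance
    Lc = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
    r = np.empty((T, 2))
    for t in range(T):
        z = Lc @ rng.standard_normal(2)     # correlated standard normal shocks
        r[t] = np.sqrt(h) * z               # returns with conditional variance h
        h = omega + alpha * r[t]**2 + beta * h
    print("sample correlation:", round(float(np.corrcoef(r.T)[0, 1]), 3))
    ```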

  10. Modeling the VARTM Composite Manufacturing Process

    Science.gov (United States)

    Song, Xiao-Lan; Loos, Alfred C.; Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal

    2004-01-01

    A comprehensive simulation model of the Vacuum Assisted Resin Transfer Molding (VARTM) composite manufacturing process has been developed. For isothermal resin infiltration, the model incorporates submodels which describe the cure of the resin and changes in resin viscosity due to cure, resin flow through the reinforcement preform and distribution medium, and compaction of the preform during infiltration. The accuracy of the model was validated by measuring the flow patterns during resin infiltration of flat preforms. The modeling software was used to evaluate the effects of the distribution medium on resin infiltration of a flat preform. Different distribution medium configurations were examined using the model and the results were compared with data collected during resin infiltration of a carbon fabric preform. The results of the simulations show that the approach used to model the distribution medium can significantly affect the predicted resin infiltration times. Resin infiltration into the preform can be accurately predicted only when the distribution medium is modeled correctly.

  11. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    The paper deals with some current problems of modelling the dynamics of the development of the individual's subject features. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logic and regularity of the development of the process; discreteness (stageability) indicates qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Two steps for singling out a particular stage, and the algorithm for developing an integrative model of it, are offered. The suggested conclusions might be of use for further theoretical research, analyses of educational practices and for realistic prediction of pedagogical phenomena.

  12. Ancestral process and diffusion model with selection

    CERN Document Server

    Mano, Shuhei

    2008-01-01

    The ancestral selection graph in population genetics introduced by Krone and Neuhauser (1997) is an analogue of the coalescent genealogy. The number of ancestral particles of a sample of genes, backward in time, is an ancestral process, which is a birth and death process with quadratic death rate and linear birth rate. In this paper an explicit form of the number of ancestral particles is obtained by using the density of the allele frequency in the corresponding diffusion model obtained by Kimura (1955). It is shown that fixation corresponds to convergence of the ancestral process to its stationary measure. The time to fixation of an allele is studied in terms of the ancestral process.
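
    A birth and death process with linear birth and quadratic death rates is easy to simulate with the Gillespie algorithm. In the sketch below, the branching rate sigma*n/2 and coalescence rate n*(n-1)/2 follow the usual ancestral selection graph convention, which may differ in detail from the paper's parameterisation; sigma and the sample size are invented:

    ```python
    import numpy as np

    # Gillespie simulation of the ancestral process: with n particles, branching
    # (birth) occurs at rate sigma*n/2 and coalescence (death) at rate n*(n-1)/2.
    # sigma and the initial sample size are illustrative, not from the paper.
    rng = np.random.default_rng(2)
    sigma, n, t, t_end = 1.5, 10, 0.0, 50.0
    while t < t_end:
        birth, death = sigma * n / 2.0, n * (n - 1) / 2.0
        total = birth + death
        t += rng.exponential(1.0 / total)          # waiting time to the next event
        n += 1 if rng.random() < birth / total else -1
    print("number of ancestral particles at t =", t_end, "is", n)
    ```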

  13. Mathematical modeling of biomass fuels formation process.

    Science.gov (United States)

    Gaska, Krzysztof; Wandrasz, Andrzej J

    2008-01-01

    The increasing demand for thermal and electric energy in many branches of industry and municipal management accounts for a drastic diminishing of natural resources (fossil fuels). Meanwhile, in numerous technical processes, a huge mass of wastes is produced. A segregated and converted combustible fraction of the wastes, with relatively high calorific value, may be used as a component of formed fuels. The utilization of the formed fuel components from segregated groups of waste in associated processes of co-combustion with conventional fuels brings significant savings resulting from partial replacement of fossil fuels, and reduces environmental pollution by limiting waste migration to the environment (soil, atmospheric air, surface and underground water). The realization of technological processes with the utilization of formed fuel in associated thermal systems should be qualified by technical criteria, meaning that elementary processes as well as factors of sustainable development, from a global viewpoint, must not be disturbed. The utilization of post-process waste should be preceded by detailed technical, ecological and economic analyses. In order to optimize the mixing process of fuel components, a mathematical model of the forming process was created. The model is defined as a group of data structures which uniquely identify a real process, together with the conversion of these data in algorithms based on a linear programming problem. The paper also presents the optimization of parameters in the process of forming fuels using a modified simplex algorithm with polynomial running time. This model is a reference point in the numerical modeling of real processes, allowing a precise determination of the optimal elementary composition of formed fuel components, with assumed constraints and decision variables of the task.
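
    The paper's simplex variant is not reproduced here, but the shape of such a blending problem can be sketched with an off-the-shelf LP solver; the component costs, calorific values, and ash limits below are invented:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Illustrative blending LP in the spirit of the paper (numbers invented):
    # choose mass fractions x of three waste-derived components to minimise cost
    # subject to a minimum calorific value and a maximum ash content.
    cost = np.array([20.0, 35.0, 15.0])        # currency per tonne
    cv = np.array([18.0, 25.0, 12.0])          # calorific value, MJ/kg
    ash = np.array([0.10, 0.04, 0.20])         # ash mass fraction

    res = linprog(
        c=cost,
        A_ub=np.array([-cv, ash]),             # -cv.x <= -20  <=>  cv.x >= 20
        b_ub=np.array([-20.0, 0.12]),          #  ash.x <= 0.12
        A_eq=np.array([[1.0, 1.0, 1.0]]),      # fractions sum to one
        b_eq=np.array([1.0]),
        bounds=[(0, 1)] * 3,
        method="highs",
    )
    print("optimal blend:", res.x.round(3), " cost:", round(res.fun, 2))
    ```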

  14. From Business Value Model to Coordination Process Model

    Science.gov (United States)

    Fatemi, Hassan; van Sinderen, Marten; Wieringa, Roel

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary for a good e-business design, but these activities have different goals and use different concepts. Nevertheless, the resulting models should be consistent with each other because they refer to the same system from different perspectives. Hence, checking the consistency between these models or producing one based on the other would be of high value. In this paper we discuss the issue of achieving consistency in multi-level e-business design and give guidelines to produce consistent coordination process models from business value models in a stepwise manner.

  15. Estimation of soil organic carbon based on remote sensing and process model

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The estimation of the soil organic carbon content (SOC) is one of the important issues in research on the global carbon cycle. However, there are great differences among scientists regarding the estimated magnitude of SOC. There are two commonly used methods for the estimation of SOC, each having both advantages and disadvantages. One is the so-called direct method, which is based on samples of measured SOC and maps of soil or vegetation types. The other is the so-called indirect method, which is based on an ecosystem process model of the carbon cycle. The disadvantage of the direct method is that it mainly discloses the difference in SOC among different soil or vegetation types; it can hardly distinguish differences in SOC within the same type of soil or vegetation. The indirect, process-based method is based on the mechanics of carbon transfer in the ecosystem and can potentially improve the spatial resolution of the SOC estimation if the input variables have a high spatial resolution. However, due to the complexity of the process-based model, the model usually replaces some key parameters that have spatial heterogeneity with constants. This simplification produces a great deal of uncertainty in the estimation of SOC, especially in its spatial precision. In this paper, we combined a process-based model (the CASA model) with measured SOC, in which remote sensing data (AVHRR NDVI) were incorporated into the model to enhance the spatial resolution. To model the soil base respiration, the Van't Hoff model was combined with the CASA model. The results show that this method can significantly improve the spatial precision (8 km spatial resolution). The results also show that there is a relationship between soil base respiration and SOC once the influence of environmental factors, i.e., temperature and moisture, has been removed from soil respiration, which makes the SOC the most important factor of soil
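
    The Van't Hoff-type temperature dependence used to couple soil base respiration to the process model is commonly written as a Q10 exponential; a one-function sketch with illustrative parameter values (not the paper's):

    ```python
    import numpy as np

    # Van't Hoff (Q10) temperature correction of soil base respiration:
    # R(T) = R_base * Q10 ** ((T - T_ref) / 10). Parameter values are illustrative.
    def soil_respiration(t_celsius, r_base=1.0, q10=2.0, t_ref=10.0):
        return r_base * q10 ** ((np.asarray(t_celsius) - t_ref) / 10.0)

    print(soil_respiration([0.0, 10.0, 20.0, 30.0]))  # doubles every 10 degC
    ```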

  16. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where IC's are assembled in dust free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  17. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from ordering a book online until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were investigated. The number of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  18. Causally nonseparable processes admitting a causal model

    Science.gov (United States)

    Feix, Adrien; Araújo, Mateus; Brukner, Časlav

    2016-08-01

    A recent framework of quantum theory with no global causal order predicts the existence of ‘causally nonseparable’ processes. Some of these processes produce correlations incompatible with any causal order (they violate so-called ‘causal inequalities’ analogous to Bell inequalities) while others do not (they admit a ‘causal model’ analogous to a local model). Here we show for the first time that bipartite causally nonseparable processes with a causal model exist, and give evidence that they have no clear physical interpretation. We also provide an algorithm to generate processes of this kind and show that they have nonzero measure in the set of all processes. We demonstrate the existence of processes which stop violating causal inequalities but are still causally nonseparable when mixed with a certain amount of ‘white noise’. This is reminiscent of the behavior of Werner states in the context of entanglement and nonlocality. Finally, we provide numerical evidence for the existence of causally nonseparable processes which have a causal model even when extended with an entangled state shared among the parties.

  19. Stochastic differential equation model to Prendiville processes

    Energy Technology Data Exchange (ETDEWEB)

    Granita, E-mail: granitafc@gmail.com [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); Bahar, Arifah [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); UTM Center for Industrial & Applied Mathematics (UTM-CIAM) (Malaysia)

    2015-10-22

    The Prendiville process is another variation of the logistic model, which assumes a linearly decreasing population growth rate. It is a continuous-time Markov chain (CTMC) taking integer values in a finite interval. The continuous-time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work started with the forward Kolmogorov equation of the continuous-time Markov chain of the Prendiville process. This was then formulated as a central-difference approximation, which was used in the Fokker-Planck equation to obtain the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance functions of the Prendiville process can easily be found from the explicit solution.
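
    Assuming the common Prendiville parameterisation with birth rate b(N - x) and death rate d(x - m) on [m, N] (the paper's exact rates may differ), the diffusion approximation has drift b(N - x) - d(x - m) and squared diffusion coefficient b(N - x) + d(x - m), which an Euler-Maruyama loop can integrate:

    ```python
    import numpy as np

    # Euler-Maruyama integration of an SDE approximation to the Prendiville
    # process on [m, N], assuming birth rate b*(N - x) and death rate d*(x - m)
    # (a common parameterisation; the paper's exact rates may differ).
    rng = np.random.default_rng(3)
    b, d, m, N = 0.4, 0.6, 10.0, 100.0
    dt, steps = 0.01, 5000
    x = 50.0
    for _ in range(steps):
        birth, death = b * (N - x), d * (x - m)
        drift = birth - death                       # mean change per unit time
        diff = np.sqrt(max(birth + death, 0.0))     # diffusion coefficient
        x = x + drift * dt + diff * np.sqrt(dt) * rng.standard_normal()
        x = min(max(x, m), N)                       # keep the path in [m, N]
    print("endpoint:", round(x, 2), " equilibrium:", (b * N + d * m) / (b + d))
    ```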

  1. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before the buying behavior and the perceived performance after purchase. An experimental method is designed to measure expectation disconfirmation effects, and the collected data are used to estimate overall satisfaction and to calibrate the model. The results show a good fit between the model and the real data. This model has applications in business marketing for managing relationship satisfaction.

  2. Chain binomial models and binomial autoregressive processes.

    Science.gov (United States)

    Weiss, Christian H; Pollett, Philip K

    2012-09-01

    We establish a connection between a class of chain-binomial models of use in ecology and epidemiology and binomial autoregressive (AR) processes. New results are obtained for the latter, including expressions for the lag-conditional distribution and related quantities. We focus on two types of chain-binomial model, extinction-colonization and colonization-extinction models, and present two approaches to parameter estimation. The asymptotic distributions of the resulting estimators are studied, as well as their finite-sample performance, and we give an application to real data. A connection is made with standard AR models, which also has implications for parameter estimation.
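
    The binomial AR(1) process underlying this connection is straightforward to simulate by binomial thinning: of X_{t-1} occupied sites, each survives with probability alpha, and each of the N - X_{t-1} empty sites is colonised with probability beta (an extinction-colonization reading; parameter values invented):

    ```python
    import numpy as np

    # Binomial AR(1) simulation by binomial thinning: survivors of the occupied
    # sites plus colonisations of the empty sites. Values are invented.
    rng = np.random.default_rng(4)
    N, alpha, beta, T = 30, 0.7, 0.2, 1000
    x = np.empty(T, dtype=int)
    x[0] = 15
    for t in range(1, T):
        survivors = rng.binomial(x[t - 1], alpha)        # alpha o X_{t-1}
        colonisers = rng.binomial(N - x[t - 1], beta)    # beta o (N - X_{t-1})
        x[t] = survivors + colonisers
    print("sample mean:", round(x.mean(), 2), " theory:", N * beta / (1 - alpha + beta))
    ```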

  3. Polarizable Water Model for the Coarse-Grained MARTINI Force Field

    NARCIS (Netherlands)

    Yesylevskyy, Semen O.; Schafer, Lars V.; Sengupta, Durba; Marrink, Siewert J.

    2010-01-01

    Coarse-grained (CG) simulations have become an essential tool to study a large variety of biomolecular processes, exploring temporal and spatial scales inaccessible to traditional models of atomistic resolution. One of the major simplifications of CG models is the representation of the solvent, which ...

  4. Modeling heterogeneous chemical processes on aerosol surface

    Institute of Scientific and Technical Information of China (English)

    Junjun Deng; Tijian Wang; Li Liu; Fei Jiang

    2010-01-01

    To explore the possible impact of heterogeneous chemical processes on atmospheric trace components, a coupled box model including gas-phase chemical processes, aerosol thermodynamic equilibrium processes, and heterogeneous chemical processes on the surface of dust, black carbon (BC) and sea salt is set up to simulate the effects of heterogeneous chemistry on the aerosol surface, and to analyze the primary factors affecting the heterogeneous processes. Results indicate that heterogeneous chemical processes on the aerosol surface in the atmosphere will affect the concentrations of trace gases such as H2O2, HO2, O3, NO2, NO3, HNO3 and SO2, and aerosols such as SO42-, NO3- and NH4+. Sensitivity tests suggest that the magnitude of the impact of heterogeneous processes strongly depends on the aerosol concentration and the surface uptake coefficients used in the box model. However, the impact of temperature on heterogeneous chemical processes is considerably less. The "renoxification" of HNO3 will affect components of the troposphere such as nitrogen oxides and ozone.

  5. Incorporating evolutionary processes into population viability models.

    Science.gov (United States)

    Pierson, Jennifer C; Beissinger, Steven R; Bragg, Jason G; Coates, David J; Oostermeijer, J Gerard B; Sunnucks, Paul; Schumaker, Nathan H; Trotter, Meredith V; Young, Andrew G

    2015-06-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand the influence of evolutionary processes on population persistence. We developed the mechanistic basis of an eco-evo PVA using individual-based models with individual-level genotype tracking and dynamic genotype-phenotype mapping to model emergent population-level effects, such as local adaptation and genetic rescue. We then outline how genomics can allow or improve parameter estimation for PVA models by providing genotypic information at large numbers of loci for neutral and functional genome regions. As climate change and other threatening processes increase in rate and scale, eco-evo PVAs will become essential research tools to evaluate the effects of adaptive potential, evolutionary rescue, and locally adapted traits on persistence.

  6. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  7. A process algebra model of QED

    Science.gov (United States)

    Sulis, William

    2016-03-01

    The process algebra approach to quantum mechanics posits a finite, discrete, determinate ontology of primitive events which are generated by processes (in the sense of Whitehead). In this ontology, primitive events serve as elements of an emergent space-time and of emergent fundamental particles and fields. Each process generates a set of primitive elements, using only local information, causally propagated as a discrete wave, forming a causal space termed a causal tapestry. Each causal tapestry forms a discrete and finite sampling of an emergent causal manifold (space-time) M and emergent wave function. Interactions between processes are described by a process algebra which possesses 8 commutative operations (sums and products) together with a non-commutative concatenation operator (transitions). The process algebra possesses a representation via nondeterministic combinatorial games. The process algebra connects to quantum mechanics through the set valued process and configuration space covering maps, which associate each causal tapestry with sets of wave functions over M. Probabilities emerge from interactions between processes. The process algebra model has been shown to reproduce many features of the theory of non-relativistic scalar particles to a high degree of accuracy, without paradox or divergences. This paper extends the approach to a semi-classical form of quantum electrodynamics.

  8. Model-Free Adaptive Heating Process Control

    OpenAIRE

    Ivana LUKÁČOVÁ; Piteľ, Ján

    2009-01-01

    The aim of this paper is to analyze the dynamic behaviour of a Model-Free Adaptive (MFA) heating process control. The MFA controller is designed as a three-layer neural network with a proportional element. The method of backward propagation of errors was used for neural network training. Visualization and training of the artificial neural network were executed using Netlab in the Matlab environment. Simulation of the MFA heating process control with outdoor temperature compensation has proved better resu...

  9. HYDROLOGICAL PROCESSES MODELLING USING ADVANCED HYDROINFORMATIC TOOLS

    Directory of Open Access Journals (Sweden)

    BEILICCI ERIKA

    2014-03-01

    Water has an essential role in the functioning of ecosystems, integrating the complex physical, chemical, and biological processes that sustain life. Water is a key factor in determining the productivity of ecosystems, biodiversity and species composition. Water is also essential for humanity: water supply systems for the population, agriculture, fisheries, industries, and hydroelectric power depend on water supplies. The modelling of hydrological processes is an important activity for water resources management, especially now, when climate change is one of the major challenges of our century, with a strong influence on the dynamics of hydrological processes. Climate change and the need for more knowledge of water resources require the use of advanced hydroinformatic tools in hydrological process modelling. The rationale and purpose of advanced hydroinformatic tools is to develop a new relationship between the stakeholders and the users and suppliers of the systems: to offer a basis (systems which supply useable results) whose validity cannot be put in reasonable doubt by any of the stakeholders involved. Successful modelling of hydrological processes also needs specialists who are well trained and able to use advanced hydroinformatic tools. The results of modelling can be a useful tool for decision makers in taking efficient measures in the social, economic and ecological domains regarding water resources, towards integrated water resources management.

  10. SWOT Analysis of Software Development Process Models

    Directory of Open Access Journals (Sweden)

    Ashish B. Sasankar

    2011-09-01

    Software worth billions and trillions of dollars has gone to waste in the past due to a lack of proper techniques for developing software, resulting in a software crisis. Historically, the process of software development has played an important role in software engineering. A number of life cycle models have been developed in the last three decades. This paper is an attempt to analyze software process models using the SWOT method. The objective is to identify the strengths, weaknesses, opportunities and threats of the Waterfall, Spiral, Prototype and other models.

  11. Temperature Modelling of the Biomass Pretreatment Process

    DEFF Research Database (Denmark)

    2012-01-01

    In a second-generation biorefinery, the biomass pretreatment stage makes an important contribution to the efficiency of the downstream processing units involved in biofuel production. Most of the pretreatment process occurs in a large pressurized thermal reactor that presents an irregular temperature distribution. Therefore, an accurate temperature model is critical for observing the biomass pretreatment. Moreover, the biomass is pushed with a constant horizontal speed along the reactor in order to ensure a continuous throughput. The goal of this paper is to derive a temperature model

  12. Fundamentals of Numerical Modelling of Casting Processes

    DEFF Research Database (Denmark)

    Pryds, Nini; Thorborg, Jesper; Lipinski, Marek;

    Fundamentals of Numerical Modelling of Casting Processes comprises a thorough presentation of the basic phenomena that need to be addressed in numerical simulation of casting processes. The main philosophy of the book is to present the topics in view of their physical meaning, whenever possible, rather than relying strictly on mathematical formalism. The book, aimed both at the researcher and the practicing engineer, as well as the student, is naturally divided into four parts. Part I (Chapters 1-3) introduces the fundamentals of modelling in a 1-dimensional framework. Part II (Chapter 4

  13. A Mathematical Model of Cigarette Smoldering Process

    Directory of Open Access Journals (Sweden)

    Chen P

    2014-12-01

    A mathematical model for a smoldering cigarette has been proposed. In the analysis of the cigarette combustion and pyrolysis processes, a receding burning front is defined, which has a constant temperature (~450 °C) and divides the cigarette into two zones, the burning zone and the pyrolysis zone. The char combustion processes in the burning zone, and the pyrolysis of virgin tobacco and evaporation of water in the pyrolysis zone, are included in the model. The hot gases flowing from the burning zone are assumed to exit as sidestream smoke during smoldering. The internal heat transport is characterized by effective thermal conductivities in each zone. Thermal conduction through the cigarette paper and convective and radiative heat transfer at the outer surface are also considered. The governing partial differential equations were solved using an integral method. Model predictions of smoldering speed as well as temperature and density profiles in the pyrolysis zone for different kinds of cigarettes were found to agree with the experimental data. The model also predicts the coal length and the maximum coal temperatures during smoldering conditions. The model provides a relatively fast and efficient way to simulate the cigarette burning processes. It offers a practical tool for exploring important parameters of cigarette smoldering processes, such as tobacco components, properties of cigarette paper, and heat generation in the burning zone and its dependence on the mass burn rate.

  14. Processing and Modeling of Porous Copper Using Sintering Dissolution Process

    Science.gov (United States)

    Salih, Mustafa Abualgasim Abdalhakam

    The growth of porous metal has produced materials with improved properties as compared to non-metals and solid metals. Porous metal can be classified as either open cell or closed cell. Open cell allows a fluid media to pass through it. Closed cell is made up of adjacent sealed pores with shared cell walls. Metal foams offer higher strength to weight ratios, increased impact energy absorption, and a greater tolerance to high temperatures and adverse environmental conditions when compared to bulk materials. Copper and its alloys are examples of these, well known for high strength and good mechanical, thermal and electrical properties. In the present study, the porous Cu was made by a powder metallurgy process, using three different space holders, sodium chloride, sodium carbonate and potassium carbonate. Several different samples have been produced, using different ratios of volume fraction. The densities of the porous metals have been measured and compared to the theoretical density calculated using an equation developed for these foams. The porous structure was determined with the removal of spacer materials through sintering process. The sintering process of each spacer material depends on the melting point of the spacer material. Processing, characterization, and mechanical properties were completed. These tests include density measurements, compression tests, computed tomography (CT) and scanning electron microscopy (SEM). The captured morphological images are utilized to generate the object-oriented finite element (OOF) analysis for the porous copper. Porous copper was formed with porosities in the range of 40-66% with density ranges from 3 to 5.2 g/cm3. A study of two different methods to measure porosity was completed. OOF (Object Oriented Finite Elements) is a desktop software application for studying the relationship between the microstructure of a material and its overall mechanical, dielectric, or thermal properties using finite element models based on

  15. Further Simplification of the Simple Erosion Narrowing Score With Item Response Theory Methodology.

    Science.gov (United States)

    Oude Voshaar, Martijn A H; Schenk, Olga; Ten Klooster, Peter M; Vonkeman, Harald E; Bernelot Moens, Hein J; Boers, Maarten; van de Laar, Mart A F J

    2016-08-01

    To further simplify the simple erosion narrowing score (SENS) by removing scored areas that contribute the least to its measurement precision according to analysis based on item response theory (IRT) and to compare the measurement performance of the simplified version to the original. Baseline and 18-month data of the Combinatietherapie Bij Reumatoide Artritis (COBRA) trial were modeled using longitudinal IRT methodology. Measurement precision was evaluated across different levels of structural damage. SENS was further simplified by omitting the least reliably scored areas. Discriminant validity of SENS and its simplification were studied by comparing their ability to differentiate between the COBRA and sulfasalazine arms. Responsiveness was studied by comparing standardized change scores between versions. SENS data showed good fit to the IRT model. Carpal and feet joints contributed the least statistical information to both erosion and joint space narrowing scores. Omitting the joints of the foot reduced measurement precision for the erosion score in cases with below-average levels of structural damage (relative efficiency compared with the original version ranged 35-59%). Omitting the carpal joints had minimal effect on precision (relative efficiency range 77-88%). Responsiveness of a simplified SENS without carpal joints closely approximated the original version (i.e., all Δ standardized change scores were ≤0.06). Discriminant validity was also similar between versions for both the erosion score (relative efficiency = 97%) and the SENS total score (relative efficiency = 84%). Our results show that the carpal joints may be omitted from the SENS without notable repercussion for its measurement performance. © 2016, American College of Rheumatology.

  17. Internet User Behaviour Model Discovery Process

    OpenAIRE

    Dragos Marcel VESPAN

    2007-01-01

    The Academy of Economic Studies has more than 45000 students and about 5000 computers with Internet access which are connected to the AES network. Students can access the internet on these computers through a proxy server which stores information about the way the Internet is accessed. In this paper, we describe the process of discovering internet user behaviour models by analyzing proxy server raw data, and we emphasize the importance of such models for the e-learning environment.

  18. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    [Report documentation and table-of-contents residue; the abstract itself is not preserved. Recoverable details: authors Leelinda P Dawson, John W Raby, and Jeffrey A Smith; the report describes automating nowcast model assessment, including an example PSA log file (ps_auto_log) using DDA with one case-study date, 3 domains, and 3 model runs; setting the case-study date for each run was time-consuming when multiple configurations were required by the user.]

  19. Determinantal point process models on the sphere

    DEFF Research Database (Denmark)

    Møller, Jesper; Nielsen, Morten; Porcu, Emilio

    We consider determinantal point processes (DPPs) on the d-dimensional unit sphere Sd. These are finite point processes exhibiting repulsiveness and with moment properties determined by a certain determinant whose entries are specified by a so-called kernel, which we assume is a complex covariance function defined on Sd × Sd. We review the appealing properties of such processes, including their specific moment properties, density expressions and simulation procedures. Particularly, we characterize and construct isotropic DPP models on Sd, where it becomes essential to specify the eigenvalues ...
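
    The repulsiveness these processes encode can be seen directly from the defining determinant: joint intensities are det[K(x_i, x_j)], which is larger for well-separated points. A small sketch on S² with an assumed isotropic kernel (not one from the paper):

    ```python
    import numpy as np

    # The determinant det[K(x_i, x_j)] that governs a DPP's joint intensities is
    # larger for well-separated points on the sphere. The kernel below
    # (exp(cos(angular distance))) is an assumed isotropic choice for illustration.
    def kernel_matrix(points):
        cosang = np.clip(points @ points.T, -1.0, 1.0)  # cosine of angular distance
        return np.exp(cosang)

    spread = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)  # 90 deg apart
    tight = np.array([[1.0, 0.0, 0.0], [0.999, 0.0447, 0.0], [0.999, 0.0, 0.0447]])
    tight /= np.linalg.norm(tight, axis=1, keepdims=True)

    print("det, spread points:   ", round(np.linalg.det(kernel_matrix(spread)), 4))
    print("det, clustered points:", round(np.linalg.det(kernel_matrix(tight)), 6))
    ```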

  20. Improving the process of process modelling by the use of domain process patterns

    Science.gov (United States)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are being discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.

  1. Impact of Modal Parameters on Milling Process Chatter Stability Lobes

    Institute of Scientific and Technical Information of China (English)

    LI Zhongqun; LIU Qiang

    2006-01-01

    The modal parameters of the machine/tool and machine/part systems are the principal factors affecting the stability of a milling process. Based on a model of the chatter stability of the milling process, the influence of modal parameters on the chatter stability lobes, independently or jointly, has been analyzed by simulation. The peak-to-valley ratio, the lobe coefficient and the corresponding calculation formulas have been put forward. General laws and steps of modal simplification for multi-modal systems have been summarized.
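
    For readers unfamiliar with stability lobes, here is a minimal sketch of the classic single-mode, zero-order chatter model from which such lobe diagrams are commonly computed. This is not the authors' model, and every numeric value below is illustrative.

      # Classic single-mode chatter stability lobes (illustrative parameters).
      import numpy as np

      k, zeta, wn = 8.0e6, 0.03, 2 * np.pi * 600.0  # stiffness [N/m], damping ratio, natural freq [rad/s]
      Ks, n_teeth = 7.0e8, 4                        # specific cutting coefficient [N/m^2], cutter teeth

      w = np.linspace(1.001 * wn, 3 * wn, 2000)     # candidate chatter frequencies (above wn)
      r = w / wn
      G = (1 / k) / (1 - r**2 + 2j * zeta * r)      # frequency response function; Re(G) < 0 here
      b_lim = -1.0 / (2 * Ks * G.real)              # critical depth of cut [m]
      psi = np.angle(G)                             # FRF phase, in (-pi, -pi/2)
      eps = 3 * np.pi + 2 * psi                     # phase shift between successive surface waves

      lobes = []
      for j in range(5):                            # lobe number
          T = (eps + 2 * np.pi * j) / w             # tooth-passing period [s]
          n_rpm = 60.0 / (n_teeth * T)              # spindle speed [rev/min]
          lobes.append((n_rpm, b_lim))              # each (speed, depth) pair traces one lobe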

  2. Modeling stroke rehabilitation processes using the Unified Modeling Language (UML).

    Science.gov (United States)

    Ferrante, Simona; Bonacina, Stefano; Pinciroli, Francesco

    2013-10-01

    In organising and providing rehabilitation procedures for stroke patients, the usual need for many refinements makes it inappropriate to attempt rigid standardisation, but greater detail is required concerning workflow. The aim of this study was to build a model of the post-stroke rehabilitation process. The model, implemented in the Unified Modeling Language, was grounded on international guidelines and refined following the clinical pathway adopted at local level by a specialized rehabilitation centre. The model describes the organisation of the rehabilitation delivery and it facilitates the monitoring of recovery during the process. Indeed, a software system was developed and tested to support clinicians in the digital administration of clinical scales. The model's flexibility assures easy updating as the process evolves. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  4. Counting Processes for Retail Default Modeling

    DEFF Research Database (Denmark)

    Kiefer, Nicholas Maximilian; Larson, C. Erik

    in a discrete state space. In a simple case, the states could be default/non-default; in other models relevant for credit modeling the states could be credit scores or payment status (30 dpd, 60 dpd, etc.). Here we focus on the use of stochastic counting processes for mortgage default modeling, using data...... on high LTV mortgages. Borrowers seeking to finance more than 80% of a house's value with a mortgage usually either purchase mortgage insurance, allowing a first mortgage greater than 80% from many lenders, or use second mortgages. Are there differences in performance between loans financed...

  5. Integrated Process Model on Intercultural Competence

    Directory of Open Access Journals (Sweden)

    Diana Bebenova - Nikolova

    2016-08-01

    Full Text Available The paper proposes an integrated model of intercultural competence, which attempts to present intercultural communication and competence from the standpoint of the dialectical approach described by Martin and Nakayama (2010). The suggested concept builds on previously developed and accepted models, both structure-oriented and process-oriented. At the same time it complies with the principles of the “Theory of Models” as outlined by Balboni and Caon (2014). In the near future, the model will be applied to assess the intercultural competence of cross-border project teams working under the CBC program between Romania and Bulgaria 2007-2014.

  6. Hencky's model for elastomer forming process

    Science.gov (United States)

    Oleinikov, A. A.; Oleinikov, A. I.

    2016-08-01

    In the numerical simulation of elastomer forming processes, Hencky's isotropic hyperelastic material model can guarantee relatively accurate prediction of the strain range in terms of large deformations. It is shown that this material model prolongs Hooke's law from the area of infinitesimal strains to the area of moderate ones. A new representation of the fourth-order elasticity tensor for Hencky's hyperelastic isotropic material is obtained; it possesses both minor symmetries and the major symmetry. The constitutive relations of the considered model are implemented into the MSC.Marc code. By calculating and fitting curves, the polyurethane elastomer material constants are selected. Simulations of equipment for elastomer sheet forming are considered.
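
    For reference, the isotropic Hencky model is commonly written in terms of the logarithmic (Hencky) strain ln V; the notation below is ours, not necessarily the authors':

      W(\ln\mathbf{V}) = \mu\,\lVert \operatorname{dev}\ln\mathbf{V} \rVert^{2}
        + \tfrac{\kappa}{2}\,\bigl(\operatorname{tr}\ln\mathbf{V}\bigr)^{2},
      \qquad
      \boldsymbol{\tau} = \frac{\partial W}{\partial \ln\mathbf{V}}
        = 2\mu\,\operatorname{dev}\ln\mathbf{V}
        + \kappa\,\bigl(\operatorname{tr}\ln\mathbf{V}\bigr)\mathbf{1}.

    Here V is the left stretch tensor, τ the Kirchhoff stress, and μ, κ the shear and bulk moduli. The stress-strain relation is literally Hooke's law applied to the logarithmic strain, which is why the model prolongs Hooke's law to moderate strains.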

  7. Between-Word Simplification Patterns in the Continuous Speech of Children with Speech Sound Disorders

    Science.gov (United States)

    Klein, Harriet B.; Liu-Shea, May

    2009-01-01

    Purpose: This study was designed to identify and describe between-word simplification patterns in the continuous speech of children with speech sound disorders. It was hypothesized that word combinations would reveal phonological changes that were unobserved with single words, possibly accounting for discrepancies between the intelligibility of…

  8. New helical-shape magnetic pole design for Magnetic Lead Screw enabling structure simplification

    DEFF Research Database (Denmark)

    Lu, Kaiyuan; Xia, Yongming; Wu, Weimin

    2015-01-01

    Magnetic lead screw (MLS) is a new type of high performance linear actuator that is attractive for many potential applications. The main difficulty of the MLS technology lies in the manufacturing of its complicated helical-shape magnetic poles. Structure simplification is, therefore, quite essent...

  9. A Data-Driven Point Cloud Simplification Framework for City-Scale Image-Based Localization.

    Science.gov (United States)

    Cheng, Wentao; Lin, Weisi; Zhang, Xinfeng; Goesele, Michael; Sun, Ming-Ting

    2017-01-01

    City-scale 3D point clouds reconstructed via structure-from-motion from a large collection of Internet images are widely used in the image-based localization task to estimate a 6-DOF camera pose of a query image. Due to the prohibitive memory footprint of city-scale point clouds, image-based localization is difficult to implement on devices with limited memory resources. Point cloud simplification aims to select a subset of points that achieves localization performance comparable to that of the original point cloud. In this paper, we propose a data-driven point cloud simplification framework by taking it as a weighted K-Cover problem, which mainly includes two complementary parts. First, a utility-based parameter determination method is proposed to select a reasonable parameter K for K-Cover-based approaches by evaluating the potential of a point cloud for establishing sufficient 2D-3D feature correspondences. Second, we formulate the 3D point cloud simplification problem as a weighted K-Cover problem, and propose an adaptive exponential weight function based on the visibility probability of 3D points. The experimental results on three popular datasets demonstrate that the proposed point cloud simplification framework outperforms the state-of-the-art methods for the image-based localization application with a well-predicted parameter K in the K-Cover problem.
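
    The greedy flavour of weighted K-Cover selection can be sketched as follows. The diminishing-returns weight exp(-alpha * covered / K) only mimics the adaptive exponential weighting described in the record; alpha, the data layout, and the integer point ids are our assumptions.

      # Greedy weighted K-cover sketch: keep 3D points until every camera is
      # covered by at least K selected points (lazy-greedy with a heap).
      # vis[p]  = set of camera ids seeing integer point id p
      # prob[p] = visibility probability of point p
      import heapq, math

      def simplify(vis, prob, K, budget, alpha=5.0):
          need = {}                                   # remaining coverage per camera
          for cams in vis.values():
              for c in cams:
                  need[c] = K
          def gain(p):
              # cameras already close to their quota contribute less
              return prob[p] * sum(math.exp(-alpha * (K - need[c]) / K)
                                   for c in vis[p] if need[c] > 0)
          heap = [(-gain(p), p) for p in vis]
          heapq.heapify(heap)
          chosen = set()
          while heap and len(chosen) < budget:
              g, p = heapq.heappop(heap)
              g_now = gain(p)                          # lazy re-evaluation
              if g_now <= 0:
                  continue                             # covers only satisfied cameras
              if -g > g_now + 1e-12:
                  heapq.heappush(heap, (-g_now, p))    # stale priority: re-queue
                  continue
              chosen.add(p)
              for c in vis[p]:
                  if need[c] > 0:
                      need[c] -= 1
          return chosen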

  10. Perceptual Recovery from Consonant-Cluster Simplification in Korean Using Language-Specific Phonological Knowledge

    NARCIS (Netherlands)

    Cho, T.; McQueen, J.M.

    2011-01-01

    Two experiments examined whether perceptual recovery from Korean consonant-cluster simplification is based on language-specific phonological knowledge. In tri-consonantal C1C2C3 sequences such as /lkt/ and /lpt/ in Seoul Korean, either C1 or C2 can be completely deleted. Seoul Koreans monitored for

  11. Fuzzy model for Laser Assisted Bending Process

    Directory of Open Access Journals (Sweden)

    Giannini Oliviero

    2016-01-01

    Full Text Available In the present study, a fuzzy model was developed to predict the residual bending in a conventional metal bending process assisted by a high power diode laser. The study was focused on AA6082T6 aluminium thin sheets. In most dynamic sheet metal forming operations, the highly nonlinear deformation processes cause large amounts of elastic strain energy to be stored in the formed material. The novel hybrid forming process was thus aimed at inducing the local heating of the mechanically bent workpiece in order to decrease or eliminate the related springback phenomena. In particular, the influence of laser process parameters, such as source power, scan speed and the starting elastic deformation of the mechanically bent sheets, on the extent of the springback phenomena was experimentally assessed. Consistent trends in experimental response according to operational parameters were found. Accordingly, 3D process maps of the extent of the springback phenomena according to operational parameters were constructed. The effect of the inherent uncertainties on the predicted residual bending caused by the approximation in the model parameters was evaluated. In particular, a fuzzy-logic based approach was used to describe the model uncertainties and the transformation method was applied to propagate their effect on the residual bending.

  12. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  13. Model Identification of Integrated ARMA Processes

    Science.gov (United States)

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
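
    SCAN and ESACF are SAS-specific procedures. As a rough Python analogue (not an implementation of either method), one can combine unit-root testing for the order of integration with an information-criterion grid search over ARMA orders; the thresholds and search ranges below are illustrative.

      # Hedged analogue of automated ARIMA identification using statsmodels.
      import numpy as np
      from statsmodels.tsa.stattools import adfuller
      from statsmodels.tsa.arima.model import ARIMA

      def identify(y, max_p=3, max_q=3, max_d=2):
          d = 0
          while d < max_d and adfuller(y)[1] > 0.05:   # difference until stationary
              y, d = np.diff(y), d + 1
          best = (np.inf, None)
          for p in range(max_p + 1):
              for q in range(max_q + 1):
                  try:
                      # ARMA(p, q) on the differenced series ~ ARIMA(p, d, q)
                      aic = ARIMA(y, order=(p, 0, q)).fit().aic
                      best = min(best, (aic, (p, d, q)))
                  except Exception:
                      pass                             # skip non-converging fits
          return best[1]                               # selected (p, d, q)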

  14. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  16. Modeling of Reaction Processes Controlled by Diffusion

    CERN Document Server

    Revelli, J

    2003-01-01

    Stochastic modeling is quite powerful in science and technology. The techniques derived from this process have been used with great success in laser theory, biological systems and chemical reactions. Besides, they provide a theoretical framework for the analysis of experimental results on the field of particle diffusion in ordered and disordered materials. In this work we analyze transport processes in one-dimensional fluctuating media, which are media that change their state in time. This fact induces changes in the movements of the particles, giving rise to different phenomena and dynamics that will be described and analyzed in this work. We present some random walk models to describe these fluctuating media. These models include state transitions governed by different dynamical processes. We also analyze the trapping problem in a lattice by means of a simple model which predicts a resonance-like phenomenon. Also we study effective diffusion processes over surfaces due to random walks in the bulk. We consider differe...

  17. Kinetics and modeling of anaerobic digestion process

    DEFF Research Database (Denmark)

    2003-01-01

    Anaerobic digestion modeling started in the early 1970s when the need for design and efficient operation of anaerobic systems became evident. At that time not only was the knowledge about the complex process of anaerobic digestion inadequate but also there were computational limitations. Thus...

  18. Dynamic Process of Money Transfer Models

    CERN Document Server

    Wang, Y; Wang, Yougui; Ding, Ning

    2005-01-01

    We have studied numerically the statistical mechanics of the dynamic phenomena, including money circulation and economic mobility, in some transfer models. The models on which our investigations were performed are the basic model proposed by A. Dragulescu and V. Yakovenko [1], the model with uniform saving rate developed by A. Chakraborti and B.K. Chakrabarti [2], and its extended model with diverse saving rate [3]. The velocity of circulation is found to be inversely related to the average holding time of money. In order to check the nature of the money transferring process in these models, we demonstrated the probability distributions of holding time. In the model with uniform saving rate, the distribution obeys an exponential law, which indicates that money transfer here is a kind of Poisson process. But when the saving rate is set diversely, the holding time distribution follows a power law. The velocity can also be deduced from a typical individual's optimal choice. In this way, an approach for building the micro-...
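
    The exchange rules of the models cited in this record are simple enough to state in code. The following sketch reproduces the basic random-split exchange and the uniform-saving-rate variant; agent count, step count, and seeding are illustrative.

      # Money-transfer models: basic exchange (lam = 0) and uniform saving rate.
      import numpy as np

      def simulate(n_agents=500, steps=200_000, lam=0.0, seed=0):
          """lam = 0: basic Dragulescu-Yakovenko exchange;
          0 < lam < 1: uniform-saving-rate variant."""
          rng = np.random.default_rng(seed)
          m = np.full(n_agents, 100.0)                # initial money per agent
          for _ in range(steps):
              i, j = rng.choice(n_agents, size=2, replace=False)
              pool = (1.0 - lam) * (m[i] + m[j])      # only non-saved money is re-split
              eps = rng.random()
              m[i], m[j] = lam * m[i] + eps * pool, lam * m[j] + (1.0 - eps) * pool
          return m    # for lam = 0 the money histogram approaches an exponential law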

  19. Iron and steel industry process model

    Energy Technology Data Exchange (ETDEWEB)

    Sparrow, F.T.

    1978-07-01

    The model depicts expected energy-consumption characteristics of the iron and steel industry and ancillary industries for the next 25 years by means of a process model of the major steps in steelmaking from ore mining and scrap recycling to the final finishing of carbon, alloy, and stainless steel into steel products such as structural steel, slabs, plates, tubes, and bars. Two plant types are modelled: fully integrated mills and minimills. User-determined inputs into the model are: (a) projected energy materials prices for the horizon; (b) projected costs of capacity expansion and replacement; (c) energy conserving options - both operating modes and investments; (d) internal rate of return required on projects; and (e) growth in finished steel demand. Nominal input choices in the model are: DOE baseline projections for oil, gas, distillates, residuals, and electricity for energy, and 1975 actual prices for materials; actual 1975 costs; adding new technologies; 15% after taxes; and 1975 actual demand with 1.5% growth/year. Output of the model includes: energy use by type, by process, and by time period, both in total and intensity (Btu/ton); energy-conservation options chosen; and utilization rates for existing capacity, and the capacity expansion decisions of the model.

  20. Process Model for Friction Stir Welding

    Science.gov (United States)

    Adams, Glynn

    1996-01-01

    Friction stir welding (FSW) is a relatively new process being applied for joining of metal alloys. The process was initially developed by The Welding Institute (TWI) in Cambridge, UK. The FSW process is being investigated at NASA/MSFC as a repair/initial weld procedure for fabrication of the super-light-weight aluminum-lithium shuttle external tank. The FSW investigations at MSFC were conducted on a horizontal mill to produce butt welds of flat plate material. The weldment plates are butted together and fixed to a backing plate on the mill bed. A pin tool is placed into the tool holder of the mill spindle and rotated at approximately 400 rpm. The pin tool is then plunged into the plates such that the center of the probe lies at one end of the line of contact between the plates and the shoulder of the pin tool penetrates the top surface of the weldment. The weld is produced by traversing the tool along the line of contact between the plates. A lead angle allows the leading edge of the shoulder to remain above the top surface of the plate. The work presented here is the first attempt at modeling a complex phenomenon. The mechanical aspects of conducting the weld process are easily defined and the process itself is controlled by relatively few input parameters. However, in the region of the weld, plasticizing and forging of the parent material occurs. These are difficult processes to model. The model presented here addresses only variations in the radial dimension outward from the pin tool axis. Examinations of the grain structure of the weld reveal that a considerable amount of material deformation also occurs in the direction parallel to the pin tool axis of rotation, through the material thickness. In addition, measurements of the axial load on the pin tool demonstrate that the forging effect of the pin tool shoulder is an important process phenomenon. Therefore, the model needs to be expanded to account for the deformations through the material thickness and the

  1. Modeling Low-temperature Geochemical Processes

    Science.gov (United States)

    Nordstrom, D. K.

    2003-12-01

    Geochemical modeling has become a popular and useful tool for a wide number of applications, from research on the fundamental processes of water-rock interactions to regulatory requirements and decisions regarding permits for industrial and hazardous wastes. In low-temperature environments, generally thought of as those in the temperature range of 0-100 °C and close to atmospheric pressure (1 atm=1.01325 bar=101,325 Pa), complex hydrobiogeochemical reactions participate in an array of interconnected processes that affect us, and that, in turn, we affect. Understanding these complex processes often requires tools that are sufficiently sophisticated to portray multicomponent, multiphase chemical reactions yet transparent enough to reveal the main driving forces. Geochemical models are such tools. The major processes that they are required to model include mineral dissolution and precipitation; aqueous inorganic speciation and complexation; solute adsorption and desorption; ion exchange; oxidation-reduction, or redox, transformations; gas uptake or production; organic matter speciation and complexation; evaporation; dilution; water mixing; reaction during fluid flow; reaction involving biotic interactions; and photoreaction. These processes occur in rain, snow, fog, dry atmosphere, soils, bedrock weathering, streams, rivers, lakes, groundwaters, estuaries, brines, and diagenetic environments. Geochemical modeling attempts to understand the redistribution of elements and compounds, through anthropogenic and natural means, for a large range of scales from the nanometer to the global. "Aqueous geochemistry" and "environmental geochemistry" are often used interchangeably with "low-temperature geochemistry" to emphasize hydrologic or environmental objectives. Recognition of the strategy or philosophy behind the use of geochemical modeling is not often discussed or explicitly described. Plummer (1984, 1992) and Parkhurst and Plummer (1993) compare and contrast two approaches for

  2. Managing risks in business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2010-01-01

    Companies today, in some industries more than others, invest more capital and resources just to stay competitive, develop more diverse solutions, and increasingly start thinking more radically when considering their business models. However, despite the understanding that business model (BM) innovation is a risky enterprise, many companies are still choosing not to apply any risk management in the BM innovation process. The objective of this paper is to develop a better understanding of how risks are handled in the practice of BM innovation. An analysis of the BM innovation experiences of two industrial companies shows that both companies are experiencing high levels of uncertainty and complexity during their innovation processes and are, consequently, struggling to find new processes for handling the risks involved. Based on the two companies' experiences, various testable propositions are put...

  3. Modelling and control of a flotation process

    Energy Technology Data Exchange (ETDEWEB)

    Ding, L.; Gustafsson, T. [Control Engineering Group, Lulea Univ. of Technology, Lulea (Sweden)

    1999-07-01

    A general description of a flotation process is given. The dynamic model of a MIMO nonlinear subprocess in flotation, i.e. the pulp levels in five compartments in series, is developed and the model is verified with real data from a production plant. In order to reject constant disturbances, five extra states are introduced and the model is modified. An exact linearization has been made for the nonlinear model and a linear quadratic Gaussian controller is proposed based on the linearized model. The simulation results show an improved performance of the pulp level control when the set points are changed or a disturbance occurs. In the future the controller will be tested in production. (author)
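
    As a sketch of the state-feedback half of such an LQG design (the Kalman filter is omitted), one can compute an LQ gain for a toy linearized chain of five level-coupled compartments. The A and B matrices below are our illustration, not the plant model identified in the paper.

      # LQ state feedback for a toy linearized chain of five compartments.
      import numpy as np
      from scipy.linalg import solve_continuous_are

      n = 5
      A = -0.1 * np.eye(n) + 0.1 * np.eye(n, k=-1)   # outflow of tank i feeds tank i+1
      B = np.zeros((n, 1)); B[0, 0] = 1.0            # actuation at the first compartment
      Q, R = np.eye(n), np.array([[1.0]])            # state and input weights

      P = solve_continuous_are(A, B, Q, R)           # algebraic Riccati equation
      K = np.linalg.solve(R, B.T @ P)                # optimal gain: u = -K x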

  4. Advances in modeling plastic waste pyrolysis processes

    Directory of Open Access Journals (Sweden)

    Y. Safadi, J. Zeaiter

    2014-01-01

    Full Text Available The tertiary recycling of plastics via pyrolysis is recently gaining momentum due to promising economic returns from the generated products that can be used as a chemical feedstock or fuel. The need for prediction models to simulate such processes is essential in understanding in depth the mechanisms that take place during the thermal or catalytic degradation of the waste polymer. This paper presents key different models used successfully in literature so far. Three modeling schemes are identified: Power-Law, Lumped-Empirical, and Population-Balance based equations. The categorization is based mainly on the level of detail and prediction capability from each modeling scheme. The data shows that the reliability of these modeling approaches vary with the degree of details the experimental work and product analysis are trying to achieve.

  5. Modeling delayed processes in biological systems

    Science.gov (United States)

    Feng, Jingchen; Sevier, Stuart A.; Huang, Bin; Jia, Dongya; Levine, Herbert

    2016-09-01

    Delayed processes are ubiquitous in biological systems and are often characterized by delay differential equations (DDEs) and their extension to include stochastic effects. DDEs do not explicitly incorporate intermediate states associated with a delayed process but instead use an estimated average delay time. In an effort to examine the validity of this approach, we study systems with significant delays by explicitly incorporating intermediate steps. We show that such explicit models often yield significantly different equilibrium distributions and transition times as compared to DDEs with deterministic delay values. Additionally, different explicit models with qualitatively different dynamics can give rise to the same DDEs revealing important ambiguities. We also show that DDE-based predictions of oscillatory behavior may fail for the corresponding explicit model.
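
    The contrast the abstract draws can be reproduced with the so-called linear chain trick: an N-stage chain of intermediate steps with per-stage rate N/tau has mean transit time tau and approaches a fixed delay as N grows. A minimal sketch, with all values illustrative:

      # Explicit intermediate steps approximating a fixed delay (linear chain).
      import numpy as np
      from scipy.integrate import solve_ivp

      tau, N = 5.0, 20                     # mean delay and number of explicit stages

      def chain(t, y):
          k = N / tau                      # per-stage rate
          inflow = 1.0                     # constant production entering stage 1
          dy = np.empty(N)
          dy[0] = inflow - k * y[0]
          dy[1:] = k * (y[:-1] - y[1:])    # each stage feeds the next
          return dy

      sol = solve_ivp(chain, (0.0, 50.0), np.zeros(N), dense_output=True)
      output_rate = (N / tau) * sol.y[-1]  # material leaving the final stage:
                                           # approximately the inflow delayed by tau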

  6. Modeling veterans healthcare administration disclosure processes

    Energy Technology Data Exchange (ETDEWEB)

    Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.

    2013-09-01

    As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.

  7. Model selection for Poisson processes with covariates

    CERN Document Server

    Sart, Mathieu

    2011-01-01

    We observe $n$ inhomogeneous Poisson processes with covariates and aim at estimating their intensities. To handle this problem, we assume that the intensity of each Poisson process is of the form $s(\cdot, x)$ where $x$ is the covariate and where $s$ is an unknown function. We propose a model selection approach where the models are used to approximate the multivariate function $s$. We show that our estimator satisfies an oracle-type inequality under very weak assumptions both on the intensities and the models. By using a Hellinger-type loss, we establish non-asymptotic risk bounds and specify them under various kinds of assumptions on the target function $s$ such as being smooth or composite. Besides, we show that our estimation procedure is robust with respect to these assumptions.

  8. Exploring the spatial distribution of light interception and photosynthesis of canopies by means of a functional-structural plant model

    NARCIS (Netherlands)

    Sarlikioti, V.; Visser, de P.H.B.; Marcelis, L.F.M.

    2011-01-01

    Background and Aims - At present most process-based models and the majority of three-dimensional models include simplifications of plant architecture that can compromise the accuracy of light interception simulations and, accordingly, canopy photosynthesis. The aim of this paper is to analyse canopy

  9. Centella asiatica attenuates Aβ-induced neurodegenerative spine loss and dendritic simplification.

    Science.gov (United States)

    Gray, Nora E; Zweig, Jonathan A; Murchison, Charles; Caruso, Maya; Matthews, Donald G; Kawamoto, Colleen; Harris, Christopher J; Quinn, Joseph F; Soumyanath, Amala

    2017-04-12

    The medicinal plant Centella asiatica has long been used to improve memory and cognitive function. We have previously shown that a water extract from the plant (CAW) is neuroprotective against the deleterious cognitive effects of amyloid-β (Aβ) exposure in a mouse model of Alzheimer's disease, and improves learning and memory in healthy aged mice as well. This study explores the physiological underpinnings of those effects by examining how CAW, as well as chemical compounds found within the extract, modulate synaptic health in Aβ-exposed neurons. Hippocampal neurons from amyloid precursor protein over-expressing Tg2576 mice and their wild-type (WT) littermates were used to investigate the effect of CAW and various compounds found within the extract on Aβ-induced dendritic simplification and synaptic loss. CAW enhanced arborization and spine densities in WT neurons and prevented the diminished outgrowth of dendrites and loss of spines caused by Aβ exposure in Tg2576 neurons. Triterpene compounds present in CAW were found to similarly improve arborization although they did not affect spine density. In contrast caffeoylquinic acid (CQA) compounds from CAW were able to modulate both of these endpoints, although there was specificity as to which CQAs mediated which effect. These data suggest that CAW, and several of the compounds found therein, can improve dendritic arborization and synaptic differentiation in the context of Aβ exposure which may underlie the cognitive improvement observed in response to the extract in vivo. Additionally, since CAW, and its constituent compounds, also improved these endpoints in WT neurons, these results may point to a broader therapeutic utility of the extract beyond Alzheimer's disease.

  10. Karst Aquifer Recharge: A Case History of over Simplification from the Uley South Basin, South Australia

    Directory of Open Access Journals (Sweden)

    Nara Somaratne

    2015-02-01

    Full Text Available The article “Karst aquifer recharge: Comments on ‘Characteristics of Point Recharge in Karst Aquifers’, by Adrian D. Werner, 2014, Water 6, doi:10.3390/w6123727” misrepresents some parts of Somaratne [1]. The description of the Uley South Quaternary Limestone (QL) as unconsolidated or poorly consolidated aeolianite sediments with the presence of well-mixed groundwater in Uley South [2] appears unsubstantiated. Examination of 98 lithological descriptions with corresponding drillers’ logs shows only two wells containing bands of unconsolidated sediments. In the Uley South basin, about 70% of salinity profiles obtained by electrical conductivity (EC) logging from monitoring wells show stratification. The central and north-central areas of the basin receive leakage from the Tertiary Sand (TS) aquifer, thereby influencing QL groundwater characteristics such as chemistry, age and isotope composition. The presence of conduit pathways is evident in salinity profiles taken away from TS-water-affected areas. Aquifer parameters derived from pumping tests show strong heterogeneity, a typical characteristic of karst aquifers. Uley South QL aquifer recharge is derived from three sources: diffuse recharge, point recharge from sinkholes, and continuous leakage of TS water. This limits the application of recharge estimation methods such as the conventional chloride mass balance (CMB), as the basic premise of the CMB is violated. The conventional CMB is not suitable for accounting for the chloride mass balance in groundwater systems displaying an extreme range of chloride concentrations and complex mixing [3]. Oversimplification of karst aquifer systems to suit the application of the conventional CMB or 1-D unsaturated modelling, as described in Werner [2], is not a suitable use of these recharge estimation methods.
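
    For reference, the conventional CMB referred to here is usually written (our notation):

      R = \frac{P \; Cl_{P}}{Cl_{gw}},

    where R is the recharge rate, P the mean annual precipitation, Cl_P the chloride concentration of precipitation (plus dry deposition, where included), and Cl_gw the chloride concentration of groundwater recharge. Its premise (all chloride arrives via rainfall, behaves conservatively, and is concentrated only by evapotranspiration into a single well-mixed groundwater) is exactly what point recharge through sinkholes and inter-aquifer leakage violate.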

  11. Modelling Of Manufacturing Processes With Membranes

    Science.gov (United States)

    Crăciunean, Daniel Cristian; Crăciunean, Vasile

    2015-07-01

    The current objectives to increase the standards of quality and efficiency in manufacturing processes can be achieved only through the best combination of inputs, independent of the spatial distance between them. This paper proposes modelling production processes based on the membrane structures introduced in [4]. Inspired by biochemistry, membrane computing [4] is based on the concept of a membrane, represented in its formalism by the mathematical concept of a multiset. The manufacturing process is the evolution of a super-cell system from its initial state according to the given aggregation actions. In this paper we consider that the atomic production unit of the process is the action. The actions, and the resources on which the actions are performed, are distributed in a virtual network of companies working together. The destination of the output resources is specified by corresponding output events.

  12. Development of a comprehensive weld process model

    Energy Technology Data Exchange (ETDEWEB)

    Radhakrishnan, B.; Zacharia, T.; Paul, A.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area and that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model several welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model which was developed by LMES and is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, the model does not account for multipass welds, microstructural evolution, distortion and residual stresses. Additionally, the model requires large resources of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above.

  13. Simplifications in the Behavior of Viscoelastic Composites With Growing Damage

    Science.gov (United States)

    1990-07-01

    First, observe that if the energy release rate is constant in time, equation (18) is an exact result. This situation exists for some elementary... change in a process are found from the R equations, $\partial W/\partial q_j = \partial W_s/\partial q_j$ and $\partial W/\partial S_r = \partial W_s/\partial S_r$ (36). Even if the $q_j$ are constant in time, equation (36) yields

  14. A parallel-pipelining software process model

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Software process is a framework for effective and timely delivery of software systems. The framework plays a crucial role in software success. However, the development of large-scale software still faces the crisis of high risks, low quality, high costs and long cycle times. This paper proposes a three-phase parallel-pipelining software process model for improving speed and productivity, and reducing software costs and risks without sacrificing software quality. In this model, two strategies are presented. One strategy, based on subsystem-cost priority, is used to prevent wasted software development cost and to reduce software complexity as well; the other strategy, used for balancing subsystem complexity, is designed to reduce the software complexity in the later development stages. Moreover, the proposed function-detailed and workload-simplified subsystem pipelining software process model presents much higher parallelism than the concurrent incremental model. Finally, the component-based product line technology not only ensures software quality and further reduces cycle time, software costs and software risks, but also sufficiently and rationally utilizes previous software product resources and enhances the competitiveness of software development organizations.

  15. Operation Windshield and the simplification of emergency management.

    Science.gov (United States)

    Andrews, Michael

    2016-01-01

    Large, complex, multi-stakeholder exercises are the culmination of years of gradual progression through a comprehensive training and exercise programme. Exercises intended to validate training, refine procedures and test processes initially tested in isolation are combined to ensure seamless response and coordination during actual crises. The challenges of integrating timely and accurate situational awareness from an array of sources, including response agencies, municipal departments, partner agencies and the public, on an ever-growing range of media platforms, increase information management complexity in emergencies. Considering that many municipal emergency operations centre roles are filled by staff whose day jobs have little to do with crisis management, there is a need to simplify emergency management and make it more intuitive. North Shore Emergency Management has accepted the challenge of making emergency management less onerous to occasional practitioners through a series of initiatives aimed to build competence and confidence by making processes easier to use as well as by introducing technical tools that can simplify processes and enhance efficiencies. These efforts culminated in the full-scale earthquake exercise, Operation Windshield, which preceded the 2015 Emergency Preparedness and Business Continuity Conference in Vancouver, British Columbia.

  16. Mathematical modeling of the flash converting process

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, H.Y.; Perez-Tello, M.; Riihilahti, K.M. [Utah Univ., Salt Lake City, UT (United States)

    1996-12-31

    An axisymmetric mathematical model for the Kennecott-Outokumpu flash converting process for converting solid copper matte to copper is presented. The model is an adaptation of the comprehensive mathematical model formerly developed at the University of Utah for the flash smelting of copper concentrates. The model incorporates the transport of momentum, heat, mass, and reaction kinetics between gas and particles in a particle-laden turbulent gas jet. The standard k-{epsilon} model is used to describe gas-phase turbulence in an Eulerian framework. The particle-phase is treated from a Lagrangian viewpoint which is coupled to the gas-phase via the source terms in the Eulerian gas-phase governing equations. Matte particles were represented as Cu{sub 2}S yFeS, and assumed to undergo homogeneous oxidation to Cu{sub 2}O, Fe{sub 3}O{sub 4}, and SO{sub 2}. A reaction kinetics mechanism involving both external mass transfer of oxygen gas to the particle surface and diffusion of oxygen through the porous oxide layer is proposed to estimate the particle oxidation rate Predictions of the mathematical model were compared with the experimental data collected in a bench-scale flash converting facility. Good agreement between the model predictions and the measurements was obtained. The model was used to study the effect of different gas-injection configurations on the overall fluid dynamics in a commercial size flash converting shaft. (author)

  17. Linear Latent Force Models using Gaussian Processes

    CERN Document Server

    Álvarez, Mauricio A; Lawrence, Neil D

    2011-01-01

    Purely data driven approaches for machine learning present difficulties when data is scarce relative to the complexity of the model or when the model is forced to extrapolate. On the other hand, purely mechanistic approaches need to identify and specify all the interactions in the problem at hand (which may not be feasible) and still leave the issue of how to parameterize the system. In this paper, we present a hybrid approach using Gaussian processes and differential equations to combine data driven modelling with a physical model of the system. We show how different, physically-inspired, kernel functions can be developed through sensible, simple, mechanistic assumptions about the underlying system. The versatility of our approach is illustrated with three case studies from motion capture, computational biology and geostatistics.
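
    The simplest instance of such a physically-inspired kernel (our example, not necessarily one from the paper) comes from a first-order ODE driven by a white-noise latent force, which yields the exponential (Ornstein-Uhlenbeck) covariance:

      \dot{x}(t) + \gamma\,x(t) = u(t), \qquad
      \operatorname{E}\bigl[u(t)\,u(t')\bigr] = \sigma^{2}\,\delta(t-t')
      \;\Longrightarrow\;
      k(t,t') = \operatorname{cov}\bigl(x(t),x(t')\bigr)
              = \frac{\sigma^{2}}{2\gamma}\,e^{-\gamma\,\lvert t-t' \rvert}.

    The mechanistic assumption (first-order decay with rate γ) shows up directly in the kernel's length scale, which is the sense in which sensible mechanistic assumptions produce the kernel functions the abstract describes.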

  18. Process models and model-data fusion in dendroecology

    Directory of Open Access Journals (Sweden)

    Joel eGuiot

    2014-08-01

    Full Text Available Dendrochronology (i.e. the study of annually dated tree-ring time series) has proved to be a powerful technique to understand tree-growth. This paper intends to show the value of using ecophysiological modeling not only to understand and predict tree-growth (dendroecology) but also to reconstruct past climates (dendroclimatology). Process models have been used for several decades in dendroclimatology, but it is only recently that methods of model-data fusion have led to significant progress in modeling tree-growth as a function of climate and in reconstructing past climates. These model-data fusion (MDF) methods, mainly based on the Bayesian paradigm, have been shown to be powerful for both model calibration and model inversion. After a rapid survey of tree-growth modeling, we illustrate MDF with examples based on series of Southern France Aleppo pines and Central France oaks. These examples show that if plants experienced CO2 fertilization, this would have a significant effect on tree-growth, which in turn would bias the climate reconstructions. This bias could extend to other environmental non-climatic factors directly or indirectly affecting annual ring formation and not taken into account in classical empirical models, which supports the use of more complex process-based models. Finally, we conclude by showing the value of the data assimilation methods applied in climatology to produce climate re-analyses.

  19. Theoretical Modelling of Intercultural Communication Process

    Directory of Open Access Journals (Sweden)

    Mariia Soter

    2016-08-01

    Full Text Available The definitions of the concepts of “communication”, “intercultural communication” and “model of communication” are analyzed in the article. The basic components of the communication process are singled out. A model of intercultural communication is developed. Communicative, behavioral and complex skills for the optimal organization of intercultural communication, the establishment of productive contact with a foreign partner to achieve mutual understanding, and the search for acceptable ways of organizing interaction and cooperation for both communicants are highlighted in the article. It is noted that intercultural communication, through interaction between people, affects the development of different aspects of cultures.

  20. Soft sensor modeling based on Gaussian processes

    Institute of Scientific and Technical Information of China (English)

    XIONG Zhi-hua; HUANG Guo-hong; SHAO Hui-he

    2005-01-01

    In order to meet the demand of online optimal running, a novel soft sensor modeling approach based on Gaussian processes is proposed. The approach is moderately simple to implement and use without loss of performance. The model is trained by optimizing the hyperparameters using the scaled conjugate gradient algorithm, with the squared exponential covariance function employed. Experimental simulations on a real-world example from a refinery show the advantage of the soft sensor modeling approach. Meanwhile, the method opens new possibilities for the application of kernel methods to potential fields.
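
    A minimal sketch of a GP soft sensor with the squared exponential covariance follows. Note that scikit-learn tunes hyperparameters with L-BFGS rather than the scaled conjugate gradient used in the paper, and the data here is synthetic.

      # GP soft sensor sketch: predict a hard-to-measure quality variable from
      # easy-to-measure process variables, with predictive uncertainty.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(0)
      X = rng.uniform(0, 10, size=(80, 2))           # two measured process variables
      y = np.sin(X[:, 0]) + 0.1 * X[:, 1] + 0.05 * rng.standard_normal(80)

      kernel = 1.0 * RBF(length_scale=[1.0, 1.0]) + WhiteKernel(noise_level=0.01)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
      y_hat, y_std = gp.predict(X[:5], return_std=True)   # estimate with uncertainty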

  1. Design and intensification of industrial DADPM process

    NARCIS (Netherlands)

    Benneker, Anne Maria; van der Ham, Aloysius G.J.; de Waele, B.; de Zeeuw, A.J.; van den Berg, Henderikus

    2016-01-01

    Process intensification is an essential method for the improvement of energy and material efficiency, waste reduction and simplification of industrial processes. In this research a Process Intensification methodology developed by Lutze, Gani and Woodley at the Computer Aided Process Engineering

  2. Modeling of processes in the tourism sector

    Directory of Open Access Journals (Sweden)

    Salamatina Victoriya, S.

    2015-06-01

    Full Text Available In modern conditions, tourism is becoming a budget-forming sector for a number of Russian regions. In this regard, it is of interest to simulate the processes occurring in the tourism business, because they are affected by many random parameters due to various economic, political, geographic, and other aspects. To improve and develop systems for the management of the tourism business, economic-mathematical methods are systematically embedded in this area, because increased competitiveness requires continuous and constructive changes. The results of applying the economic-mathematical apparatus allow a more systematic and internally unified analysis and evaluation of further processes in tourism. Typical of some economic processes in tourist activities is that the effect of a factor on the indicators of a process appears not immediately but gradually, after some certain time, with a certain lag. This delay has to be accounted for when developing mathematical models of tourist business processes. In this case, for the simulation of such processes it is advisable to apply the economic-mathematical formalism of optimal control and game theory.

  3. A model for processivity of molecular motors

    Institute of Scientific and Technical Information of China (English)

    Xie Ping; Dou Shuo-Xing; Wang Peng-Ye

    2004-01-01

    We propose a two-dimensional model for a complete description of the dynamics of molecular motors, including both the processive movement along track filaments and the dissociation from the filaments. Theoretical results on the distributions of the run length and dwell time at a given adenosine triphosphate (ATP) concentration, the dependences of mean run length, mean dwell time and mean velocity on ATP concentration and load are in good agreement with the previous experimental results.

  4. Modelling the Heterogeneous Markov Attrition Process .

    Directory of Open Access Journals (Sweden)

    Jau Yeu Menq

    1993-01-01

    Full Text Available A model of heterogeneous dynamic combat as a continuous-time Markov process has been studied, and on account of the special form of its infinitesimal generator, recursive algorithms are derived to compute the important characteristics of the combat, such as the combat time distribution, its expected value and variance, and the probability of winning and the expected number of survivors. Numerical results are also presented. This approach can also be used to treat the initial contact forces of both sides as random variables.
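
    The record's recursive algorithms compute these characteristics exactly; as an independent check, the same quantities can be estimated by directly simulating the continuous-time Markov chain. A sketch, with an assumed aimed-fire kill-rate law and illustrative parameters:

      # Monte Carlo estimate of win probability and mean combat time for a
      # two-sided CTMC attrition model (Gillespie-style event simulation).
      import numpy as np

      def combat(x0=10, y0=8, a=0.05, b=0.04, seed=0, n_runs=10_000):
          """a: rate at which side Y kills an X unit; b: the reverse."""
          rng = np.random.default_rng(seed)
          wins, times = 0, []
          for _ in range(n_runs):
              x, y, t = x0, y0, 0.0
              while x > 0 and y > 0:
                  rate_x_loss = a * x * y          # assumed aimed-fire law
                  rate_y_loss = b * x * y
                  total = rate_x_loss + rate_y_loss
                  t += rng.exponential(1.0 / total)   # time to next casualty
                  if rng.random() < rate_x_loss / total:
                      x -= 1
                  else:
                      y -= 1
              wins += (x > 0)
              times.append(t)
          return wins / n_runs, np.mean(times)     # P(X wins), mean combat time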

  5. Building Qualitative Models of Thermodynamic Processes

    Science.gov (United States)

    2007-01-01

    [List-of-figures residue: envisionments for simple flow with and without thermal properties; scenario input and envisionment for a pump and a return path connecting two containers.] Specifically, the domain model is written in the language of QPE [8], an envisioner for Qualitative Process theory. We assume a reading knowledge of QP theory.

  6. Irreversible Processes in Inflationary Cosmological Models

    CERN Document Server

    Kremer, G M

    2002-01-01

    By using the thermodynamic theory of irreversible processes and Einstein general relativity, a cosmological model is proposed where the early universe is considered as a mixture of a scalar field with a matter field. The scalar field refers to the inflaton while the matter field to the classical particles. The irreversibility is related to a particle production process at the expense of the gravitational energy and of the inflaton energy. The particle production process is represented by a non-equilibrium pressure in the energy-momentum tensor. The non-equilibrium pressure is proportional to the Hubble parameter and its proportionality factor is identified with the coefficient of bulk viscosity. The dynamic equations of the inflaton and the Einstein field equations determine the time evolution of the cosmic scale factor, the Hubble parameter, the acceleration and of the energy densities of the inflaton and matter. Among other results it is shown that in some regimes the acceleration is positive which simulate...

  7. Impact of numerical models on fragmentation processes

    Science.gov (United States)

    Renouf, Mathieu; Gezahengn, Belien; Abbas, Micheline; Bourgeois, Florent

    2013-06-01

    The simulation of fragmentation processes in granular assemblies is a challenging problem which dates back to the beginning of the 1990s. While first approaches focused on the fragmentation of a single particle, the development of robust, fast numerical methods makes it possible today to simulate such processes in a large collection of particles. But the question of the fragmentation problem is still open: should the fragmentation be done dynamically (one particle becoming two fragments), and according to which criterion, or should the fragment paths be defined initially, and what is the impact of the discretization and the model of fragments? The present contribution investigates the second aspect, i.e. the impact of fragment modeling on the fragmentation process. First, to perform such an analysis, the geometry of fragments (disks/spheres or polygons/polyhedra), their behavior (rigid/deformable) and the law governing their interactions are investigated. Then such models are used in a grinding application, where the evolution of the fragments and their impact on the behavior of the whole packing are investigated.

  8. Near Field Environment Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Wagner

    2000-11-14

    Waste emplacement and activities associated with construction of a repository system potentially will change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect the distribution of water, increase the kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers and potentially changes the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.

  9. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  10. GREENSCOPE: A Method for Modeling Chemical Process ...

    Science.gov (United States)

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Efficiency, and Energy, can evaluate processes with over a hundred different indicators. These indicators provide a means for realizing the principles of green chemistry and green engineering in the context of sustainability. Development of the methodology has centered around three focal points. One is a taxonomy of impacts that describe the indicators and provide absolute scales for their evaluation. The setting of best and worst limits for the indicators allows the user to know the status of the process under study in relation to understood values. Thus, existing or imagined processes can be evaluated according to their relative indicator scores, and process modifications can strive towards realizable targets. A second area of focus is in advancing definitions of data needs for the many indicators of the taxonomy. Each of the indicators has specific data that is necessary for their calculation. Values needed and data sources have been identified. These needs can be mapped according to the information source (e.g., input stream, output stream, external data, etc.) for each of the bases. The user can visualize data-indicator relationships on the way to choosing selected ones for evalua

  11. Kinetic Model of Biodiesel Processing Using Ultrasound

    Directory of Open Access Journals (Sweden)

    Bambang Susilo

    2009-04-01

    Full Text Available Ultrasound is predicted to be able to accelerate the chemical reaction, to increase the conversion of plant oil into biodiesel, and to decrease the need for catalyst and energy input. The application of ultrasound to biodiesel processing and its mathematical modeling were investigated in this research. The experiments showed that ultrasound increased the reaction rate and raised the conversion of palm oil into biodiesel up to 100%, better than the process with a mechanical stirrer, where the conversion was just 96%. The duration to complete the process using ultrasound was 1 minute, 30 to 120 times faster than with a mechanical stirrer. Ultrasound transforms mechanical energy into the inner energy of the fluids and causes an increase in temperature. Simultaneously, a natural mixing process takes place because of acoustic circulation. Simulation with experimental data showed that the acceleration of transesterification with ultrasound is caused not only by natural mixing and increasing temperature: cavitation, the surface tension of micro-bubbles, and hot spots also accelerate the chemical reaction. In fact, transesterification of palm oil with ultrasound still needs a catalyst, but only about 20% of the catalyst required by the process with a mechanical stirrer.

  12. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    1999-01-01

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  13. Event Modeling in UML. Unified Modeling Language and Unified Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2002-01-01

    We show how events can be modeled in terms of UML. We view events as change agents that have consequences and as information objects that represent information. We show how to create object-oriented structures that represent events in terms of attributes, associations, operations, state charts, and messages. We outline a run-time environment for the processing of events with multiple participants.

  14. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify the commonalities shared by all branches and, at the same time, describe their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective needed to make the right choice for all configurable nodes at once, since the choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how the processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive, from a configurable process model, a process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested on both business-like event logs, as recorded in a higher-education enterprise resource planning system, and a real case scenario involving a set of Dutch municipalities.
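
    Of the three strategies, the greedy heuristic is the easiest to sketch: configure one node at a time, keeping the choice that maximizes fitness against the event log. The snippet below is a toy stand-in; the node names, options, and the subset-based fitness function are invented, whereas the paper uses replay-based fitness on real event data.

        CONFIG_OPTIONS = {              # configurable node -> allowed choices
            "payment":  ["card", "invoice", "hidden"],
            "review":   ["single", "double", "hidden"],
            "shipping": ["standard", "express"],
        }

        def fitness(config, event_log):
            """Toy fitness: fraction of traces fully enabled by the config."""
            enabled = {f"{node}:{choice}" for node, choice in config.items()
                       if choice != "hidden"}
            return sum(set(t) <= enabled for t in event_log) / len(event_log)

        def greedy_configure(options, event_log):
            config = {node: choices[0] for node, choices in options.items()}
            for node, choices in options.items():   # fix one node at a time
                config[node] = max(
                    choices,
                    key=lambda c: fitness({**config, node: c}, event_log))
            return config

        log = [["payment:card", "shipping:express"],
               ["payment:card", "review:single", "shipping:express"]]
        print(greedy_configure(CONFIG_OPTIONS, log))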

  15. Modelling of thermal processes in indoor icerinks

    Energy Technology Data Exchange (ETDEWEB)

    Korsgaard, V.; Forowicz, A.

    1986-01-01

    Heat transfer by radiation between ceiling and ice, together with the high humidity of the air in indoor icerinks, very often causes heavy condensation on the ceiling or roof construction, which has several adverse effects. To check how often condensation will occur, and possible ways of preventing it, a simple computer model of the thermal processes taking place in an indoor icerink was elaborated. The assumptions made concerning the model system geometry, as well as the mathematical problem formulation, are described. Next, the mathematical model of the problem, the method of solution and a short description of the simulation program are presented. The report further shows the results obtained from several executions of the program using different data, regarding changes in the model itself as well as the influence of different ventilation rates, heat input by radiation and convection, etc. These results allow a general comparison between four cases: the model icerink with a ceiling made from ordinary building material, with a bright aluminium foil glued to the ceiling surface, and with a suspended shield of corrugated bright aluminium plates installed below the roof construction, whose surface facing the roof is either unpainted or painted to increase its absorptivity.
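
    The core condensation check in such a model is a comparison between the ceiling surface temperature and the dew point of the hall air. A minimal sketch follows; the Magnus approximation for the dew point is standard, while the icerink conditions below are assumed for illustration, not taken from the report.

        import math

        def dew_point_c(temp_c, rel_humidity):
            """Dew point in deg C via the Magnus approximation."""
            a, b = 17.62, 243.12
            gamma = math.log(rel_humidity) + a * temp_c / (b + temp_c)
            return b * gamma / (a - gamma)

        air_temp_c = 8.0       # assumed hall air temperature
        rel_humidity = 0.80    # assumed relative humidity
        ceiling_temp_c = 2.5   # assumed radiatively cooled ceiling surface

        t_dew = dew_point_c(air_temp_c, rel_humidity)
        print(f"dew point: {t_dew:.1f} C")
        print("condensation on ceiling:", ceiling_temp_c < t_dew)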

  16. Three-dimensional model for fusion processes

    Energy Technology Data Exchange (ETDEWEB)

    Olson, A.P.

    1984-01-01

    Active galactic nuclei (AGN) emit unusual spectra of radiation which are interpreted to signify extreme distance, extreme power, or both. The status of AGNs was recently reviewed by Balick and Heckman. It seems that the greatest conceptual difficulty with understanding AGNs is how to form a coherent phenomenological model of their properties. What drives the galactic engine? What and where are the mass-flows of fuel to this engine? Is there more than one engine? Do the engines have any symmetry properties? Is the observed radiation isotropically emitted from the source? If it is polarized, what causes the polarization? Why is there a roughly spherical cloud of ionized gas about the center of our own galaxy, the Milky Way? The purpose of this paper is to discuss a new model, based on fusion processes which are not axisymmetric, uniform, isotropic, or even time-invariant, and then to develop its relationship to these questions. A unified model of fusion processes applicable to many astronomical phenomena is proposed and discussed.

  17. TUNS/TCIS information model/process model

    Science.gov (United States)

    Wilson, James

    1992-01-01

    An Information Model is comprised of graphical and textual notation suitable for describing and defining the problem domain, in our case TUNS or TCIS. The model focuses on the real world under study. It identifies what is in the problem and organizes the data into a formal structure for documentation and communication purposes. The Information Model is composed of an Entity Relationship Diagram (ERD) and a Data Dictionary component. The combination of these components provides an easy-to-understand methodology for expressing the entities in the problem space, the relationships between entities, and the characteristics (attributes) of the entities. This approach is the first step in information system development. The Information Model identifies the complete set of data elements processed by TUNS. This representation provides a conceptual view of TUNS from the perspective of entities, data, and relationships. The Information Model reflects the business practices and real-world entities that users must deal with.

  18. Simplification of vacuole structure during plant cell death triggered by culture filtrates of Erwinia carotovora

    Institute of Scientific and Technical Information of China (English)

    Yumi Hirakawa; Toshihisa Nomura; Seiichiro Hasezawa; Takumi Higaki

    2015-01-01

    Vacuoles are suggested to play crucial roles in plant defense-related cell death. During programmed cell death, previous live cell imaging studies have observed vacuoles to become simpler in structure and have implicated this simplification as a prelude to the vacuole’s rupture and consequent lysis of the plasma membrane. Here, we examined dynamics of the vacuole in cell cycle-synchronized tobacco BY-2 (Nicotiana tabacum L. cv. Bright Yellow 2) cells during cell death induced by application of culture filtrates of Erwinia carotovora. The filtrate induced death in about 90% of the cells by 24 h. Prior to cell death, vacuole shape simplified and endoplasmic actin filaments disassembled; however, the vacuoles did not rupture until after plasma membrane integrity was lost. Instead of facilitating rupture, the simplification of vacuole structure might play a role in the retrieval of membrane components needed for defense-related cell death.

  19. Modelling and control of dynamic systems using gaussian process models

    CERN Document Server

    Kocijan, Juš

    2016-01-01

    This monograph opens up new horizons for engineers and researchers in academia and in industry dealing with or interested in new developments in the field of system identification and control. It emphasizes guidelines for working solutions and practical advice for their implementation rather than the theoretical background of Gaussian process (GP) models. The book demonstrates the potential of this recent development in probabilistic machine-learning methods and gives the reader an intuitive understanding of the topic. The current state of the art is treated along with possible future directions for research. Systems control design relies on mathematical models and these may be developed from measurement data. This process of system identification, when based on GP models, can play an integral part in control design in data-based control, and its description as such is an essential aspect of the text. The background of GP regression is introduced first, together with system identification and the incorporation of prior knowledge...

  20. Model systems for life processes on Mars

    Science.gov (United States)

    Mitz, M. A.

    1974-01-01

    In the evolution of life forms, nonphotosynthetic mechanisms are developed. The question remains whether a total life system could evolve which is not dependent upon photosynthesis. In trying to visualize life on other planets, the photosynthetic process has problems. On Mars, the high intensity of UV light at the surface is a concern, and alternative mechanisms need to be defined and analyzed. In the search for alternate mechanisms, several different areas may be identified. These involve activated inorganic compounds in the atmosphere, such as the products of photodissociation of carbon dioxide, and the organic material which may be created by natural phenomena. In addition, a life system based on the pressure of the atmospheric constituents, such as carbon dioxide, is a possibility. These considerations may be important for the understanding of evolutionary processes of life on another planet. Model systems which depend on these alternative mechanisms are defined and related to presently planned and future planetary missions.

  1. Automation and work-simplification in the urinalysis laboratory. A pilgrim's progress.

    Science.gov (United States)

    Patterson, P P

    1988-09-01

    The evolution of the modern clinical laboratory has produced a gap between medical/scientific competence on the one hand and management skills on the other. Physician-managers need both sets of competencies. Concepts of operations management and cost accounting shape criteria for strategic decisions in technology improvement programs. Automation and work-simplification are key strategies for improving cost performance in clinical laboratories.

  2. A Markovian Process Modeling for Pickomino

    Science.gov (United States)

    Cardon, Stéphane; Chetcuti-Sperandio, Nathalie; Delorme, Fabien; Lagrue, Sylvain

    This paper deals with a nondeterministic game based on die rolls and on the "stop or continue" principle: Pickomino. During their turn, each participant has to make the best decisions: first choosing which dice to keep, then choosing between continuing and stopping, depending on the previous rolls and on the available resources. Markov Decision Processes (MDPs) offer the formal framework to model this game. The two main problems are first to determine the set of states, and then to compute the transition probabilities.
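
    For the dice-rolling part, the one-step transition probabilities are binomial: each of the n remaining dice shows a given face with probability 1/6. A minimal sketch follows; the full MDP state also tracks kept faces and scores, which is omitted here.

        from math import comb

        def roll_distribution(n_dice, p=1.0 / 6.0):
            """P(exactly k of n dice show a chosen face), for k = 0..n."""
            return {k: comb(n_dice, k) * p**k * (1 - p)**(n_dice - k)
                    for k in range(n_dice + 1)}

        # With all 8 Pickomino dice: how many show the 'worm' face?
        dist = roll_distribution(8)
        for k, prob in dist.items():
            print(f"P(k = {k}) = {prob:.4f}")
        print("expected count:", sum(k * p for k, p in dist.items()))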

  3. Modeling in Accounting, an Imperative Process?

    Directory of Open Access Journals (Sweden)

    Robu Sorin-Adrian

    2014-08-01

    Full Text Available Our approach to this topic was suggested by the fact that a controversy currently persists regarding the elements that decisively influence the qualitative characteristics of useful financial information. Among these elements, we note accounting models and concepts of capital maintenance in terms of the accounting result, which can be under the influence of factors such as subjectivity or even lack of neutrality. Therefore, in formulating a response to the question posed in the title of the paper, we start from the fact that the financial statements prepared by accounting systems must be the result of processing with appropriate models, which can ultimately respond as well as possible to the requirements of external and internal users, in particular knowledge of the financial position and performance of economic entities.

  4. Comments on: Spatiotemporal models for skewed processes

    KAUST Repository

    Genton, Marc G.

    2017-09-04

    We would first like to thank the authors for this paper that highlights the important problem of building models for non-Gaussian space-time processes. We will hereafter refer to the paper as SGV, and we also would like to acknowledge and thank them for providing us with the temporally detrended temperatures, plotted in their Figure 1, along with the coordinates of the twenty-one locations and the posterior means of the parameters for the MA1 model. We find much of interest to discuss in this paper, and as we progress through points of interest, we pose some questions to the authors that we hope they will be able to address.

  5. Time models and cognitive processes: a review

    Directory of Open Access Journals (Sweden)

    Michail Maniadakis

    2014-02-01

    Full Text Available The sense of time is an essential human capacity, with a major role in many of the cognitive processes expressed in our daily lives. So far, in cognitive science and robotics research, mental capacities have been investigated in theoretical and modelling frameworks that largely neglect the flow of time. Only recently has there been a small but constantly increasing interest in the temporal aspects of cognition, integrating time into a range of different models of perceptuo-motor capacities. The current paper aims to review existing work in the field and to suggest directions for fruitful future work. This is particularly important for the newly developed field of artificial temporal cognition, which is expected to contribute significantly to the development of sophisticated artificial agents seamlessly integrated into human societies.

  6. Privatization processes in banking: Motives and models

    Directory of Open Access Journals (Sweden)

    Ristić Života

    2006-01-01

    Full Text Available The paper consists of three methodologically and causally connected thematic parts. The first part deals with the crucial motives and models of the privatization processes in the USA and the EU, with a particular analytical focus on the Herfindahl-Hirschman doctrine of the collective domination index, as well as on the essence of merger-acquisition and takeover models. The second part, as a logical continuation of the first, gives a brief comparative analysis of the motives and models implemented in bank privatization in the south-eastern European countries, with particular focus on identifying the interests of foreign investors, the optimal volume and price of the investment, and an assessment of finalized privatizations in those countries. The final part, which stems theoretically and practically from the first two and thus forms an interdependent and compatible thematic whole with them, presents qualitative and quantitative aspects of the analysis of finalized privatizations and/or sale-purchases of Serbian banks, with particular focus on IPO and IPOPLUS as the prevailing models of future sale-purchase in the privatization of Serbian banks.

  7. Development of a Comprehensive Weld Process Model

    Energy Technology Data Exchange (ETDEWEB)

    Radhakrishnan, B.; Zacharia, T.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area with that of LMES to develop computer models and simulation software for welding processes. This development is of significant value to industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model various welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, that model does not account for multipass welds, microstructural evolution, distortion and residual stresses; additionally, it requires large amounts of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above. The following technical tasks have been accomplished as part of the CRADA. 1. The LMES welding code has been ported to the Intel Paragon parallel computer at

  8. Modeling of vehicular hydrogen storage transfer processes

    Energy Technology Data Exchange (ETDEWEB)

    Viola, J.; Ventner, R.D. [Toronto Univ., ON (Canada). Dept. of Mechanical and Industrial Engineering; Bose, T.; Benard, P. [Quebec Univ., Trois-Rivieres, PQ (Canada)

    2003-07-01

    The acceptance of hydrogen as an alternative fuel for powering vehicles depends on several factors, such as the performance properties of hydrogen fuels, the behaviour of the vehicle in terms of power response, and the handling of the fuel during the transfer operation to the vehicle. This paper presents a study which examined the transfer of fuel and compared the fueling processes of several on-vehicle hydrogen storage methods. The study involved a computer simulation of different hydrogen storage systems to compare the characteristics of the various transfer processes. The thermodynamics of hydrogen transfer from a defined initial condition to its final state was studied, and tabulations of energy requirements, temperature and pressure variations, and limitations on the transfer rate were provided. The fueling procedure was simulated using dynamic models, and the components were characterized from the initial to the final equilibrium state within the vehicle. The fluctuations in the system during the physical transfer operations were illustrated. The safety risks include passive risks from toxic and low-temperature (cryogenic) effects, as well as explosion and combustion. The authors used fuzzy analysis of survey results to examine safety, which is more subjective in nature than the other properties modeled. An introduction to fuzzy logic is presented, followed by a description of the method used. 2 refs., 7 figs.

  9. Computer modeling of complete IC fabrication process

    Science.gov (United States)

    Dutton, Robert W.

    1987-05-01

    The development of fundamental algorithms for process and device modeling, as well as the novel integration of the tools for advanced Integrated Circuit (IC) technology design, is discussed. The development of the first complete 2D process simulator, SUPREM 4, is reported. The algorithms are discussed, as well as their application to the local-oxidation and extrinsic-diffusion conditions which occur in CMOS and BiCMOS technologies. The evolution of 1D (SEDAN) and 2D (PISCES) device analysis is discussed. The application of SEDAN to a variety of non-silicon technologies (GaAs and HgCdTe) is considered. A new multi-window analysis capability for PISCES, which exploits Monte Carlo analysis of hot carriers, has been demonstrated and used to characterize a variety of silicon MOSFET and GaAs MESFET effects. A parallel computer implementation of PISCES has been achieved using a Hypercube architecture. The PISCES program has been used for a range of important device studies including latchup, analog switch analysis, MOSFET capacitance studies and bipolar transient analysis for ECL gates. The program is broadly applicable to RAM and BiCMOS technology analysis and design. In the analog switch technology area, this research effort has produced a variety of important modeling advances.

  10. Specification of e-business process model for PayPal online payment process using Reo

    NARCIS (Netherlands)

    Xie, M.

    2005-01-01

    E-business process modeling allows business analysts to better understand and analyze business processes, and eventually to use software systems to automate (parts of) these business processes to achieve higher profit. To support e-business process modeling, many business process modeling languages have been developed.

  11. ARTA process model of maritime clutter and targets

    CSIR Research Space (South Africa)

    McDonald

    2012-10-01

    Full Text Available A coherent autoregressive–to–anything (ARTA) stationary stochastic process for modelling maritime clutter and targets is presented in this paper. The ARTA stochastic process model is an improvement over previous models in the sense...

  12. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    Full Text Available The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST, a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  13. GRAPHICAL MODELS OF THE AIRCRAFT MAINTENANCE PROCESS

    Directory of Open Access Journals (Sweden)

    Stanislav Vladimirovich Daletskiy

    2017-01-01

    Full Text Available Aircraft maintenance is realized as a rapid sequence of organizational and technical maintenance states, whose research and analysis are carried out by statistical methods. The maintenance process comprises aircraft technical states, connected with the objective patterns of change in the technical qualities of the aircraft as a maintenance object, and organizational states, which determine the subjective organization and planning of aircraft use. The objective maintenance process is realized in the Maintenance and Repair System, which does not include maintenance organization and planning and is a set of related elements: aircraft, Maintenance and Repair measures, executors, and documentation that sets the rules of their interaction for maintaining aircraft reliability and readiness for flight. The aircraft organizational and technical states are considered; their characteristics and heuristic estimates of the connections in the knots and arcs of the graphs, both during regular maintenance and at technical state failure, are given. It is shown that in real conditions of aircraft maintenance, planned control of the aircraft technical state, and maintenance control through it, is defined only by the Maintenance and Repair conditions for a given Maintenance and Repair type and form structure, and correspondingly by the principles of assigning Maintenance and Repair work types for execution, according to the maintenance and reconstruction strategies of the aircraft and all its units. The realization of the planned Maintenance and Repair process determines one of the constant maintenance components. The proposed graphical models allow quantitative correlations between graph knots to be revealed, so that maintenance processes can be improved by statistical research methods, which reduces manning, timetables and expenses for providing safe civil aviation aircraft maintenance.

  14. Mechanical-mathematical modeling for landslide process

    Science.gov (United States)

    Svalova, V.

    2009-04-01

    500 m and displacement of the landslide in plan over 1 m. The last serious activation of this landslide took place in 2002, with a movement of 53 cm. A catastrophic activation of the deep blockglide landslide in the Khoroshevo area of Moscow took place in 2006-2007: a crack 330 m long appeared in the old sliding circus, along which a new creeping block 220 m long separated from the plateau and began sinking, the displaced surface of the plateau reaching 12 m. Such activation of the landslide process had not been observed in Moscow since the mid-nineteenth century. The Khoroshevo sliding area had been stable for a long time without manifestations of activity. Revealing the causes of the deformation and developing ways of protecting against deep landslide motion is an extremely topical and difficult problem, whose solution is necessary for the preservation of valuable historical monuments and modern city constructions. The causes of activation and protective measures are discussed, and the structure of a monitoring system for urban territories is elaborated. A mechanical-mathematical model of a highly viscous fluid was used to model the behaviour of matter on landslide slopes, based on the continuity equation and an approximated Navier-Stokes equation for slow motion in a thin layer. The modelling results make it possible to locate the point of highest velocity on the landslide surface, which could be the best position for a monitoring post. The model can be used for calibrating monitoring equipment and makes it possible to investigate some fundamental aspects of matter movement on landslide slopes.
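
    The abstract refers to the continuity equation and an approximated Navier-Stokes equation for slow motion in a thin layer without writing them out. A standard lubrication-type formulation consistent with that description, under the assumptions of incompressible, highly viscous flow of thickness h on a slope of angle α with inertia neglected, is:

        \begin{align}
          \frac{\partial h}{\partial t}
            + \frac{\partial}{\partial x}\int_0^{h} u \,\mathrm{d}z &= 0,
            && \text{(continuity of the layer)}\\
          0 &= -\frac{\partial p}{\partial x}
            + \mu \frac{\partial^2 u}{\partial z^2} + \rho g \sin\alpha,
            && \text{(downslope momentum, inertia neglected)}\\
          0 &= -\frac{\partial p}{\partial z} - \rho g \cos\alpha .
            && \text{(hydrostatic balance across the layer)}
        \end{align}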

  15. Modelling and application of stochastic processes

    CERN Document Server

    1986-01-01

    The subject of modelling and application of stochastic processes is too vast to be exhausted in a single volume. In this book, attention is focused on a small subset of this vast subject. The primary emphasis is on realization and approximation of stochastic systems. Recently there has been considerable interest in the stochastic realization problem, and hence, an attempt has been made here to collect in one place some of the more recent approaches and algorithms for solving the stochastic realization problem. Various different approaches for realizing linear minimum-phase systems, linear nonminimum-phase systems, and bilinear systems are presented. These approaches range from time-domain methods to spectral-domain methods. An overview of the chapter contents briefly describes these approaches. Also, in most of these chapters special attention is given to the problem of developing numerically efficient algorithms for obtaining reduced-order (approximate) stochastic realizations. On the application side,...

  16. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    Full Text Available In general, any activity takes place over a longer period and is often characterized by a degree of uncertainty and insecurity with respect to the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods while covering all the issues relevant to management decision analysis over a real economic horizon. Often in such cases, the simulation technique is considered the only alternative available. Using simulation techniques to study real-world systems often requires laborious work. Carrying out a simulation experiment is a process that takes place in several stages.

  17. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for improving it through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of a KANBAN implementation supported by Discrete Event Simulation (DES). To that end, the basics of the KANBAN system are first presented, with emphasis on information and material flow, together with a methodology for implementing a KANBAN system and an analysis of combining simulation with this methodology. The paper concludes with a practical example which shows that, by understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created that serves as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.

  18. USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 3. Future to be Asset Sustainment Process Model.

    Science.gov (United States)

    2007-11-02

    Models), contains the To-Be Retail Asset Sustainment Process Model displaying the activities and functions related to the improved processes for receipt...of a logistics process model for a more distant future asset sustainment scenario unconstrained by today’s logistics information systems limitations...It also contains a process model reflecting the Reengineering Team’s vision of the future asset sustainment process.

  19. [Standardization and modeling of surgical processes].

    Science.gov (United States)

    Strauss, G; Schmitz, P

    2016-12-01

    Due to the technological developments around the operating room, surgery in the twenty-first century is undergoing a paradigm shift. Which technologies have already been integrated into the surgical routine? How can a favorable cost-benefit balance be achieved by the implementation of new software-based assistance systems? This article presents the state of the art as exemplified by a semi-automated operation system for otorhinolaryngology surgery. The main focus is on systems for the implementation of digital handbooks and navigational functions in situ. On the basis of continuous development in digital imaging, decisions may be facilitated by individual patient models, thus allowing procedures to be optimized. The ongoing digitization and linking of all relevant information enable a high level of standardization in terms of operating procedures; this may be used by assistance systems as a basis for complete documentation and high process reliability. Automation of processes in the operating room results in an increase in quality, precision and standardization, so that the effectiveness and efficiency of treatment can be improved; however, detrimental consequences, such as loss of skills and placing too much faith in technology, must be avoided by adapted training concepts.

  20. Modeling Sound Processing in Cochlear Nuclei

    Science.gov (United States)

    Meddis, Ray

    2003-03-01

    The cochlear nucleus is an obligatory relay nucleus between the ear and the rest of the brain. It consists of many different types of neurons each responding differently to the same stimulus. Much is known about the wiring diagram of the system but it has so far proved difficult to characterise the signal processing that is going on or what purpose it serves. The solution to this problem is a pre-requisite of any attempt to produce a practical electronic simulation that exploits the brain's unique capacity to recognise the significance of acoustic events and generate appropriate responses. This talk will explain the different types of neural cell and specify hypotheses as to their various functions. Cell-types vary in terms of their size and shape as well as the number and type of minute electrical currents that flow across the cell membranes. Computer models will also be used to illustrate how the physical substrate (the wet-ware) is used to achieve its signal-processing goals.

  1. Extraction and Simplification of Building Façade Pieces from Mobile Laser Scanner Point Clouds for 3D Street View Services

    Directory of Open Access Journals (Sweden)

    Yan Li

    2016-12-01

    Full Text Available Extraction and analysis of building façades are key processes in three-dimensional (3D) building reconstruction and realistic geometrical modeling of the urban environment, with many applications such as smart city management, autonomous navigation through the urban environment, fly-through rendering, 3D street view, virtual tourism, urban mission planning, etc. This paper proposes a building façade piece extraction and simplification algorithm based on morphological filtering of point clouds obtained by a mobile laser scanner (MLS). First, this study presents a point cloud projection algorithm, using high-accuracy orientation parameters from the position and orientation system (POS) of the MLS, that can convert large volumes of point cloud data to a raster image. Second, this study proposes a feature extraction approach based on morphological filtering of the point cloud projection that can obtain building façade features in image space. Third, this study designs an inverse transformation of the point cloud projection to convert building façade features from image space back to 3D space. A building façade feature detection algorithm with restricted façade planes is implemented to reconstruct façade pieces for street view services. The results of building façade extraction experiments with large volumes of point cloud data from MLS show that the proposed approach is suitable for various types of building façade extraction. The geometric accuracy of the building façades is 0.66 m in the x direction, 0.64 m in the y direction and 0.55 m in the vertical direction, which is at the same level as the spatial resolution (0.5 m) of the point cloud.
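
    The projection step described above amounts to binning points into a planar grid at the chosen resolution. A minimal sketch follows (plain gridding of x,y with the lowest z kept per cell); the POS-based orientation correction of the paper is omitted and the data are synthetic.

        import numpy as np

        def project_to_raster(points, cell=0.5):
            """Rasterize the x,y footprint of an (N,3) point cloud;
            each cell stores the lowest z value that falls in it."""
            ij = np.floor((points[:, :2] - points[:, :2].min(axis=0))
                          / cell).astype(int)
            width, height = ij[:, 0].max() + 1, ij[:, 1].max() + 1
            raster = np.full((height, width), np.nan)
            for (i, j), z in zip(ij, points[:, 2]):
                if np.isnan(raster[j, i]) or z < raster[j, i]:
                    raster[j, i] = z
            return raster

        pts = np.random.default_rng(0).uniform([0, 0, 0], [20, 10, 3],
                                               size=(5000, 3))
        print(project_to_raster(pts).shape)   # -> (20, 40) at 0.5 m cells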

  2. Model development for naphthenic acids ozonation process.

    Science.gov (United States)

    Al Jibouri, Ali Kamel H; Wu, Jiangning

    2015-02-01

    Naphthenic acids (NAs) are toxic constituents of oil sands process-affected water (OSPW), which is generated during the extraction of bitumen from oil sands. NAs consist mainly of carboxylic acids which are generally biorefractory. For the treatment of OSPW, ozonation is a very beneficial method: it can significantly reduce the concentration of NAs, and it can also convert NAs from biorefractory to biodegradable. In this study, a two-level, four-factor (2^4) factorial design was used for the ozonation of OSPW to study the influence of the operating parameters (ozone concentration, oxygen/ozone flow rate, pH, and mixing) on the removal of model NAs in a semi-batch reactor. It was found that ozone concentration had the most significant effect on the NA concentration compared to the other parameters. An empirical model was developed to correlate the concentration of NAs with ozone concentration, oxygen/ozone flow rate, and pH. In addition, a theoretical analysis was conducted to gain insight into the relationship between the removal of NAs and the operating parameters.
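
    A 2^4 factorial design with coded levels fits a first-order empirical model by ordinary least squares. The sketch below uses invented response values in which ozone concentration dominates, mirroring the qualitative finding; none of the numbers come from the study.

        import numpy as np

        rng = np.random.default_rng(1)
        # Coded levels (-1/+1): ozone conc., gas flow rate, pH, mixing.
        levels = np.array([[o, f, p, m]
                           for o in (-1, 1) for f in (-1, 1)
                           for p in (-1, 1) for m in (-1, 1)], dtype=float)
        assumed_effects = np.array([-8.0, -1.5, -0.8, -0.3])  # ozone dominates
        response = 40.0 + levels @ assumed_effects + rng.normal(0, 0.5, 16)

        # Fit y = b0 + b1*x1 + ... + b4*x4 by least squares.
        X = np.column_stack([np.ones(16), levels])
        coef, *_ = np.linalg.lstsq(X, response, rcond=None)
        for name, b in zip(["intercept", "ozone", "flow", "pH", "mixing"], coef):
            print(f"{name:9s} {b:7.2f}")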

  3. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    2005-01-01

    Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanental Cox processes. We consider moment properties and point process operations such as thinning, displacements, and superposition. We also discuss how t

  4. Nanowire growth process modeling and reliability models for nanodevices

    Science.gov (United States)

    Fathi Aghdam, Faranak

    Nowadays, nanotechnology is becoming an inescapable part of everyday life. The big barrier to its rapid growth is our inability to produce nanoscale materials in a reliable and cost-effective way. In fact, the current yield of nano-devices is very low (around 10%), which makes the fabrication of nano-devices very expensive and uncertain. To overcome this challenge, the first and most important step is to investigate how to control variations in nano-structure synthesis. The main directions of reliability research in nanotechnology can be classified either from a material perspective or from a device perspective. The first direction focuses on restructuring materials and/or optimizing process conditions at the nano level (nanomaterials). The other direction is linked to nano-devices and includes the creation of nano-electronic and electro-mechanical systems at nano-level architectures, taking into account the reliability of future products. In this dissertation, we have investigated two topics, one on nano-materials and one on nano-devices. In the first research work, we studied the optimization of one of the most important nanowire growth processes using statistical methods. Research on nanowire growth with patterned arrays of catalyst has shown that the wire-to-wire spacing is an important factor affecting the quality of the resulting nanowires. To improve the process yield and the length uniformity of fabricated nanowires, it is important to reduce the resource competition between nanowires during the growth process. We have proposed a physical-statistical nanowire-interaction model, considering the shadowing effect and the shared substrate diffusion area, to determine the optimal pitch that would ensure minimum competition between nanowires. A sigmoid function is used in the model, and the least squares estimation method is used to estimate the model parameters. The estimated model is then used to determine the optimal spatial arrangement of catalyst arrays
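
    The sigmoid-plus-least-squares step can be sketched directly. The functional form, parameter names, and data below are illustrative stand-ins; the dissertation's model links pitch to nanowire competition through shadowing and shared diffusion areas, which is not reproduced here.

        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid(pitch, top, steepness, midpoint):
            """Illustrative mean nanowire length vs. catalyst pitch."""
            return top / (1.0 + np.exp(-steepness * (pitch - midpoint)))

        pitch = np.linspace(0.2, 3.0, 15)                  # micrometres
        rng = np.random.default_rng(2)
        length = sigmoid(pitch, 10.0, 3.0, 1.0) + rng.normal(0, 0.2, pitch.size)

        (top, steep, mid), _ = curve_fit(sigmoid, pitch, length,
                                         p0=[8.0, 2.0, 0.8])
        print(f"fit: top={top:.2f}, steepness={steep:.2f}, midpoint={mid:.2f}")
        # One possible 'optimal pitch': where growth reaches 95% of the top.
        print("pitch at 95% of max:", mid + np.log(0.95 / 0.05) / steep)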

  5. Application of simulation models for the optimization of business processes

    Science.gov (United States)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with applications of modeling and simulation tools to the optimization of business processes, especially to optimizing signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete-event simulation and enables the creation of a visual model of production and distribution processes.

  6. Developing Friction Stir Welding Process Model for ICME Application

    Science.gov (United States)

    Yang, Yu-Ping

    2015-01-01

    A framework for developing a product involving manufacturing processes was developed with an integrated computational materials engineering (ICME) approach. The key component of the framework is a process modeling tool which includes a thermal model, a microstructure model, a thermo-mechanical model, and a property model. Using the friction stir welding (FSW) process as an example, the development of the process modeling tool is introduced in detail. The thermal model and the microstructure model for FSW of steels were validated with experimental data. The model can predict reasonable temperature and hardness distributions, as observed in the experiment. The model was applied to predict the residual stress and joint strength of a pipe girth weld.

  7. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. BPMN is therefore widely used and applied in various areas, one of them being business process simulation. This paper addresses some problems of BPMN model-based business process simulation and formulates requirements for business process and resource models to enable their use in business process simulation.

  8. a Geometric Processing Workflow for Transforming Reality-Based 3d Models in Volumetric Meshes Suitable for Fea

    Science.gov (United States)

    Gonizzi Barsanti, S.; Guidi, G.

    2017-02-01

    Conservation of Cultural Heritage is a key issue, and structural changes and damage can influence the mechanical behaviour of artefacts and buildings. The use of Finite Element Methods (FEM) for mechanical analysis is widespread in modelling stress behaviour. The typical workflow involves CAD 3D models made of Non-Uniform Rational B-Spline (NURBS) surfaces, representing the ideal shape of the object to be simulated. Nowadays, 3D documentation of CH has been widely developed through reality-based approaches, but the resulting models are not suitable for direct use in FEA: the mesh must in fact be converted to a volumetric one, and its density must be reduced, since the computational complexity of a FEA grows exponentially with the number of nodes. The focus of this paper is to present a new method aiming to generate the most accurate 3D representation of a real artefact from highly accurate 3D digital models derived from reality-based techniques, maintaining in the solid models the accuracy of the high-resolution polygonal models. The proposed approach is based on a judicious use of retopology procedures and on transforming the retopologized model into a mathematical one made of NURBS surfaces, suitable for processing by the volumetric meshers typically embedded in standard FEM packages. The strong simplification with little loss of consistency made possible by the retopology step is used to maintain as much coherence as possible between the original acquired mesh and the simplified model, creating in the meantime a topology that is more favourable for the automatic NURBS conversion.

  9. Modeling nutrient in-stream processes at the watershed scale using Nutrient Spiralling metrics

    Directory of Open Access Journals (Sweden)

    J. Armengol

    2009-07-01

    Full Text Available One of the fundamental problems of using large-scale biogeochemical models is the uncertainty involved in aggregating the components of fine-scale deterministic models in watershed applications, and in extrapolating the results of field-scale measurements to larger spatial scales. Although spatial or temporal lumping may reduce the problem, information obtained during fine-scale research may not apply to lumped categories. Thus, the use of knowledge gained through fine-scale studies to predict coarse-scale phenomena is not straightforward. In this study, we used the nutrient uptake metrics defined in the Nutrient Spiralling concept to formulate the equations governing total phosphorus in-stream fate in a deterministic, watershed-scale biogeochemical model. Once the model was calibrated, the fitted phosphorus retention metrics were put into the context of global patterns of phosphorus retention variability. For this purpose, we calculated power regressions between phosphorus retention metrics, streamflow, and phosphorus concentration in water, using published data from 66 streams worldwide, including both pristine and nutrient-enriched streams.
    Performance of the calibrated model confirmed that the Nutrient Spiralling formulation is a convenient simplification of the biogeochemical transformations involved in total phosphorus in-stream fate. Thus, this approach may be helpful even for customary deterministic applications working at short time steps. The calibrated phosphorus retention metrics were comparable to field estimates from the study watershed, and showed high coherence with global patterns of retention metrics from streams of the world. In this sense, the fitted phosphorus retention metrics were similar to field values measured in other nutrient enriched streams. Analysis of the bibliographical data supports the view that nutrient enriched streams have lower phosphorus retention efficiency than pristine streams, and that this efficiency loss
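
    Power regressions of the kind used here are ordinarily fitted as straight lines in log-log space. A minimal sketch with invented numbers follows; the study's actual regressions used data from the 66 published streams.

        import numpy as np

        def power_fit(x, y):
            """Fit y = a * x**b by linear least squares on log-log data."""
            b, log_a = np.polyfit(np.log(x), np.log(y), 1)
            return np.exp(log_a), b

        # Invented example: uptake length S_w (m) vs. streamflow Q (L/s).
        Q = np.array([5.0, 12.0, 30.0, 80.0, 200.0, 500.0])
        Sw = 25.0 * Q**0.7 * np.exp(
            np.random.default_rng(3).normal(0, 0.1, Q.size))
        a, b = power_fit(Q, Sw)
        print(f"S_w ~= {a:.1f} * Q^{b:.2f}")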

  10. Spatial self-organization of vegetation subject to climatic stress—insights from a system dynamics—individual-based hybrid model

    NARCIS (Netherlands)

    Vincenot, Christian E.; Carteni, Fabrizio; Mazzoleni, Stefano; Rietkerk, Max|info:eu-repo/dai/nl/175964866; Giannino, Francesco

    2016-01-01

    In simulation models of populations or communities, individual plants have often been obfuscated in favor of aggregated vegetation. This simplification comes with a loss of biological detail and a smoothing out of the demographic noise engendered by stochastic individual-scale processes and heterogeneity.

  12. The Formalization of the Business Process Modeling Goals

    Directory of Open Access Journals (Sweden)

    Ligita Bušinska

    2016-10-01

    Full Text Available In business process modeling, BPMN has emerged as the de facto standard. However, applications of this notation use many subsets of its elements and various extensions. BPMN also coexists with many other modeling languages, forming a large set of available options for business process modeling languages and dialects. While, in general, the modeler's goal is a central notion in the choice of modeling languages and notations, most research proposing guidelines, techniques, and methods for evaluating and/or selecting business process modeling languages does not formalize the business process modeling goal or take it into account transparently. To close this gap, and to explicate and help handle the complexity of business process modeling, an approach to formalizing the business process modeling goal, together with a supporting three-dimensional business process modeling framework, is proposed.

  13. Computer Forensics Field Triage Process Model

    Directory of Open Access Journals (Sweden)

    Marcus K. Rogers

    2006-06-01

    Full Text Available With the proliferation of digital evidence, the need for timely identification, analysis and interpretation of digital evidence is becoming more crucial. In many investigations critical information is required while at the scene or within a short period of time, measured in hours as opposed to days. The traditional cyber forensics approach of seizing a system(s)/media, transporting it to the lab, making a forensic image(s), and then searching the entire system for potential evidence is no longer appropriate in some circumstances. In cases such as child abductions, pedophiles, and missing or exploited persons, time is of the essence. In these types of cases, investigators dealing with the suspect or crime scene need investigative leads quickly; in some cases it is the difference between life and death for the victim(s). The Cyber Forensic Field Triage Process Model (CFFTPM) proposes an onsite or field approach for providing the identification, analysis and interpretation of digital evidence in a short time frame, without the requirement of taking the system(s)/media back to the lab for an in-depth examination or acquiring a complete forensic image(s). The proposed model adheres to commonly held forensic principles and does not negate the possibility that, once the initial field triage is concluded, the system(s)/storage media be transported back to a lab environment for a more thorough examination and analysis. The CFFTPM has been successfully used in various real-world cases, and its investigative importance and pragmatic approach have been amply demonstrated. Furthermore, the derived evidence from these cases has not been challenged in the court proceedings where it has been introduced. The current article describes the CFFTPM in detail, discusses the model’s forensic soundness, investigative support capabilities and practical considerations.

  14. Modeling of non-stationary autoregressive alpha-stable processes

    Data.gov (United States)

    National Aeronautics and Space Administration — In the literature, impulsive signals are mostly modeled by symmetric alpha-stable processes. To represent their temporal dependencies, usually autoregressive models...

  15. Using shape complexity to guide simplification of geospatial data for use in Location-based Services

    OpenAIRE

    Ying, Fangli; Mooney, Peter; Corcoran, Padraig; Winstanley, Adam C.

    2010-01-01

    A Java-based tool for determining if polygons require simplification before delivery to a mobile device using a Location-based Service (LBS) is described. Visualisation of vector-based spatial data on mobile devices is constrained by: small screen size; small data storage on the device; and potentially poor bandwidth connectivity. Our Java-based tool can download OpenStreetMap (OSM) XML data in real-time and calculate a number of shape complexity measures for each object in the...

  16. NET-SYNTHESIS: a software for synthesis, inference and simplification of signal transduction networks.

    Science.gov (United States)

    Kachalo, Sema; Zhang, Ranran; Sontag, Eduardo; Albert, Réka; DasGupta, Bhaskar

    2008-01-15

    We present software for combined synthesis, inference and simplification of signal transduction networks. The main idea of our method lies in representing observed indirect causal relationships as network paths and using techniques from combinatorial optimization to find the sparsest graph consistent with all experimental observations. We illustrate the biological usability of our software by applying it to a previously published signal transduction network and by using it to synthesize and simplify a novel network corresponding to activation-induced cell death in large granular lymphocyte leukemia. NET-SYNTHESIS is freely downloadable from http://www.cs.uic.edu/~dasgupta/network-synthesis/

  17. Simplification Resilient LDPC-Coded Sparse-QIM Watermarking for 3D-Meshes

    CERN Document Server

    Vasic, Bata

    2012-01-01

    We propose a blind watermarking scheme for 3-D meshes which combines sparse quantization index modulation (QIM) with deletion correction codes. The QIM operates on the vertices in rough concave regions of the surface, thus ensuring imperceptibility, while the deletion correction code recovers the data hidden in vertices that are removed by mesh optimization and/or simplification. The proposed scheme offers two orders of magnitude better performance in terms of recovered watermark bit error rate compared to existing schemes of similar payloads and fidelity constraints.
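
    Scalar QIM itself is compact: embedding a bit means quantizing a value onto one of two interleaved lattices, and decoding means picking the nearer lattice. A minimal sketch of that core follows; the paper's sparse vertex selection and deletion-correction coding are not reproduced, and the step size is an arbitrary example.

        import numpy as np

        DELTA = 0.2   # quantization step: larger is more robust, less faithful

        def qim_embed(x, bit, delta=DELTA):
            """Embed one bit by quantizing x onto a shifted lattice."""
            d = delta / 4.0 if bit else -delta / 4.0
            return delta * np.round((x - d) / delta) + d

        def qim_extract(y, delta=DELTA):
            """Decode by choosing whichever lattice lies nearer to y."""
            return int(min((abs(y - qim_embed(y, b, delta)), b)
                           for b in (0, 1))[1])

        x = 1.2345                    # e.g. a vertex coordinate
        for bit in (0, 1):
            y = qim_embed(x, bit)
            print(f"bit={bit}: watermarked={y:.4f}, decoded={qim_extract(y)}")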

  18. An Analysis of Simplification Strategies in a Reading Textbook of Japanese as a Foreign Language

    Directory of Open Access Journals (Sweden)

    Kristina HMELJAK SANGAWA

    2016-06-01

    Full Text Available Reading is one of the bases of second language learning, and it can be most effective when the linguistic difficulty of the text matches the reader's level of language proficiency. The present paper reviews previous research on the readability and simplification of Japanese texts, and presents an analysis of a collection of simplified texts for learners of Japanese as a foreign language. The simplified texts are compared to their original versions to uncover different strategies used to make the texts more accessible to learners. The list of strategies thus obtained can serve as useful guidelines for assessing, selecting, and devising texts for learners of Japanese as a foreign language.

  19. Signal Processing Model for Radiation Transport

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, D H

    2008-07-28

    This note describes the design of a simplified gamma ray transport model for use in designing a sequential Bayesian signal processor for low-count detection and classification. It uses a simple one-dimensional geometry to describe the emitting source, shield effects, and detector (see Fig. 1). At present, only Compton scattering and photoelectric absorption are implemented for the shield and the detector. Other effects may be incorporated in the future by revising the expressions for the probabilities of escape and absorption. Pair production would require a redesign of the simulator to incorporate photon correlation effects. The initial design incorporates the physical effects that were present in the previous event mode sequence simulator created by Alan Meyer. The main difference is that this simulator transports the rate distributions instead of single photons. Event mode sequences and other time-dependent photon flux sequences are assumed to be marked Poisson processes that are entirely described by their rate distributions. Individual realizations can be constructed from the rate distribution using a random Poisson point sequence generator.
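
    The "random Poisson point sequence generator" mentioned above can be realized by Lewis-Shedler thinning, which draws candidate events at a constant majorizing rate and keeps each with probability rate(t)/rate_max. A minimal sketch with an invented rate distribution (constant background plus a decaying transient):

        import numpy as np

        def sample_poisson_events(rate_fn, t_max, rate_max, seed=None):
            """Event times of an inhomogeneous Poisson process by thinning."""
            rng = np.random.default_rng(seed)
            t, events = 0.0, []
            while True:
                t += rng.exponential(1.0 / rate_max)
                if t > t_max:
                    return np.array(events)
                if rng.random() < rate_fn(t) / rate_max:
                    events.append(t)

        rate = lambda t: 5.0 + 40.0 * np.exp(-t / 2.0)   # counts per second
        times = sample_poisson_events(rate, t_max=10.0, rate_max=45.0, seed=0)
        print(f"{times.size} events; first five: {np.round(times[:5], 3)}")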

  20. Research on point cloud simplification with boundary features reservation

    Institute of Scientific and Technical Information of China (English)

    赵伟玲; 谢雪冬; 程俊廷

    2013-01-01

    To simplify point cloud data effectively, a simplification algorithm that preserves boundary features is proposed. The algorithm uses 3D grid subdivision to establish the spatial topology of the scattered point cloud and to compute the k-nearest neighbors of each data point; the curvature and a directional normal vector of each point are obtained by ball fitting. Boundary points are identified and retained according to the ratio of the number of projected points, the required thresholds are set according to the specific situation, and the non-boundary points are classified through these thresholds. The scattered point cloud is then simplified by comparing each point's curvature with the mean curvature and by the proportion of retained points among its k-nearest neighbors. The algorithm was verified by reducing several typical point clouds with various surface features. The experimental results indicate that the algorithm, in which the threshold size is set according to the simplification requirements, allows direct and effective reduction of the point cloud while preserving the detailed features of the point cloud model, with a simplification ratio of 25% to 40%. The method meets the requirements of simplifying different kinds of point clouds and improves computational efficiency.
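
    A crude version of the curvature-based classification can be sketched with a k-d tree and the standard surface-variation proxy (smallest covariance eigenvalue over their sum). The boundary detection and projection-ratio test described above are omitted; the threshold, neighborhood size, and data are invented.

        import numpy as np
        from scipy.spatial import cKDTree

        def simplify(points, k=8, keep_ratio=0.3):
            """Keep the points with the highest local surface variation."""
            tree = cKDTree(points)
            _, idx = tree.query(points, k=k + 1)  # first hit is the point itself
            variation = np.empty(len(points))
            for i, nbrs in enumerate(idx):
                nb = points[nbrs[1:]]
                eigvals = np.linalg.eigvalsh(np.cov((nb - nb.mean(0)).T))
                variation[i] = eigvals[0] / eigvals.sum()
            keep = np.argsort(variation)[-int(keep_ratio * len(points)):]
            return points[keep]

        pts = np.random.default_rng(4).normal(size=(2000, 3))
        print(simplify(pts).shape)   # -> (600, 3)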

  1. Mathematical Modelling of Coal Gasification Processes

    Science.gov (United States)

    Sundararajan, T.; Raghavan, V.; Ajilkumar, A.; Vijay Kumar, K.

    2017-07-01

    Coal is by far the most commonly employed fuel for electrical power generation around the world. While combustion is the usual route for utilizing high-grade coals, gasification is the preferred process for low-grade coals having a higher proportion of volatiles or ash. Indian coals suffer from high ash content, nearly 50% by weight in some cases. Instead of transporting such high-ash coals, it is more energy efficient to gasify the coal and transport the product syngas. Integrated Gasification Combined Cycle (IGCC) plants and underground gasification of coal have become attractive technologies for the best utilization of high-ash coals. Gasification can be carried out in fixed beds, fluidized beds and entrained beds; faster rates of gasification are possible in fluidized beds and entrained-flow systems, because of the small particle sizes and higher gas velocities. The media employed for gasification can involve air/oxygen and steam. Use of oxygen yields syngas of relatively higher calorific value because of the absence of nitrogen, and sequestration of the carbon dioxide after combustion of the syngas is also easier if oxygen is used for gasification. Addition of steam can increase the hydrogen yield in the syngas and thereby also increase the calorific value. Gasification in the presence of suitable catalysts can increase the proportion of methane in the product gas. Several competing heterogeneous and homogeneous reactions occur during coal gasification: reactions of the char with oxygen, steam, carbon dioxide and hydrogen constitute the major heterogeneous pathways, while interactions between carbon monoxide, oxygen, hydrogen, water vapour, methane and carbon dioxide result in several simultaneous gas-phase (homogeneous) reactions. The overall product composition of the coal gasification process depends on the input reactant composition, the particle size and type of gasifier, and the pressure and temperature of the gasifier. The use of catalysts can also selectively change the product composition. At IIT Madras, over the last decade, both
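
    For reference, the major reaction pathways alluded to above can be written out explicitly; these are the standard gasification reactions, not equations quoted from the abstract:

        \begin{align}
          \mathrm{C} + \mathrm{O_2} &\rightarrow \mathrm{CO_2}
            && \text{(combustion)}\\
          \mathrm{C} + \mathrm{H_2O} &\rightarrow \mathrm{CO} + \mathrm{H_2}
            && \text{(steam gasification)}\\
          \mathrm{C} + \mathrm{CO_2} &\rightarrow 2\,\mathrm{CO}
            && \text{(Boudouard reaction)}\\
          \mathrm{C} + 2\,\mathrm{H_2} &\rightarrow \mathrm{CH_4}
            && \text{(methanation)}\\
          \mathrm{CO} + \mathrm{H_2O} &\rightleftharpoons \mathrm{CO_2} + \mathrm{H_2}
            && \text{(water-gas shift)}
        \end{align}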

  2. A user-study measuring the effects of lexical simplification and coherence enhancement on perceived and actual text difficulty.

    Science.gov (United States)

    Leroy, Gondy; Kauchak, David; Mouradi, Obay

    2013-08-01

    Low patient health literacy has been associated with cost increases in medicine because it contributes to inadequate care. Providing explanatory text is a convenient approach to distribute medical information and increase health literacy. Unfortunately, writing text that is easily understood is challenging. This work tests two text features for their impact on understanding: lexical simplification and coherence enhancement. A user study was conducted to test the features' effect on perceived and actual text difficulty. Individual sentences were used to test perceived difficulty. Using a 5-point Likert scale, participants compared eight pairs of original and simplified sentences. Abstracts were used to test actual difficulty. For each abstract, four versions were created: original, lexically simplified, coherence enhanced, and lexically simplified and coherence enhanced. Using a mixed design, one group of participants worked with the original and lexically simplified documents (no coherence enhancement) while a second group worked with the coherence enhanced versions. Actual difficulty was measured using a Cloze measure and multiple-choice questions. Using Amazon's Mechanical Turk, 200 people participated, of which 187 qualified based on our data qualification tests. A paired-samples t-test for the sentence ratings showed a significant reduction in perceived difficulty after lexical simplification (p < .001). The Cloze measure showed a significant effect of lexical simplification, with the simplification leading to worse scores (p = .004). A follow-up ANOVA showed this effect exists only for function words when coherence was not enhanced (p = .008). In contrast, a two-way ANOVA for answering multiple-choice questions showed a significant beneficial effect of coherence enhancement (p = .003) but no effect of lexical simplification. Lexical simplification reduced the perceived difficulty of texts. Coherence enhancement reduced the actual difficulty of text when measured using multiple-choice questions. However, the Cloze measure results

  3. Methodology for Modeling and Analysis of Business Processes (MMABP)

    OpenAIRE

    Vaclav Repa; Tomas Bruckner

    2015-01-01

    This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary Business Process Modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical identified points of business process modeling are process stat...

  4. Querying Business Process Models with VMQL

    DEFF Research Database (Denmark)

    Störrle, Harald; Acretoaie, Vlad

    2013-01-01

    The Visual Model Query Language (VMQL) has been invented with the objectives (1) to make it easier for modelers to query models effectively, and (2) to be universally applicable to all modeling languages. In previous work, we have applied VMQL to UML, and validated the first of these two claims. ...

  5. Modeling and Execution of Multienterprise Business Processes

    OpenAIRE

    Singer, Robert; Kotremba, Johannes; Raß, Stefan

    2014-01-01

    We discuss a fully featured multienterprise business process platform (ME-BPP) based on the concepts of agent-based business processes. Using the concepts of the subject-oriented business process (S-BPM) methodology, we developed an architecture to realize a platform for the execution of distributed business processes. The platform is implemented based on cloud technology using commercial services. For our discussion we used the well known Service Interaction Patterns, as they are empirically...

  6. Partially decoupled modeling of hydraulic fracturing processes

    Energy Technology Data Exchange (ETDEWEB)

    Settari, A.; Puchyr, P.J.; Bachman, R.C. (Simtech Consulting Services, Calgary (CA))

    1990-02-01

    A new method of partial decoupling of the problem of modeling a hydraulic fracture in a reservoir is described. According to the authors this approach has significant advantages over previous methods with fully coupled or completely uncoupled models. Better accuracy can be achieved in modeling the fracture propagation, and the new system is very efficient and versatile. Virtually any reservoir model can be used for predicting postfracture productivity. Examples of single- and multiphase applications for modeling fractured wells are discussed.

  7. A Speeded Item Response Model with Gradual Process Change

    Science.gov (United States)

    Goegebeur, Yuri; De Boeck, Paul; Wollack, James A.; Cohen, Allan S.

    2008-01-01

    An item response theory model for dealing with test speededness is proposed. The model consists of two random processes, a problem solving process and a random guessing process, with the random guessing gradually taking over from the problem solving process. The involved change point and change rate are considered random parameters in order to…
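    A minimal numerical sketch of this two-process idea (Python): the probability of a correct response mixes a Rasch-type solving process with random guessing, and the guessing weight takes over gradually past a change point. The names and values (theta, b, change_point, rate, the logistic take-over) are illustrative assumptions, not the authors' specification.

        import numpy as np

        rng = np.random.default_rng(0)

        def p_correct(theta, b, pos, change_point, rate, guess=0.25):
            p_solve = 1.0 / (1.0 + np.exp(-(theta - b)))             # problem-solving process
            w = 1.0 / (1.0 + np.exp(-rate * (pos - change_point)))   # gradual take-over by guessing
            return (1.0 - w) * p_solve + w * guess

        positions = np.arange(1, 41)                                 # a 40-item speeded test
        probs = p_correct(theta=0.5, b=0.0, pos=positions, change_point=30.0, rate=0.8)
        responses = rng.random(positions.size) < probs               # simulated item responses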

  8. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    Gani, Rafiqul; d'Anterroches, Loïc

    2004-01-01

    This paper presents a process-group-contribution method to model, simulate and synthesize a flowsheet. The process-group based representation of a flowsheet together with a process "property" model are presented. The process-group based synthesis method is developed on the basis of the computer...

  9. Qualitative vs. quantitative software process simulation modelling: conversion and comparison

    OpenAIRE

    Zhang, He; Kitchenham, Barbara; Jeffery, Ross

    2009-01-01

    Software Process Simulation Modeling (SPSM) research has increased in the past two decades. However, most of these models are quantitative, which requires detailed understanding and accurate measurement. Continuing our previous studies on qualitative modeling of software processes, this paper aims to investigate the structural equivalence and model conversion between quantitative and qualitative process modeling, and to compare the characteristics and performance o...

  10. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  11. Model medication management process in Australian nursing homes using business process modeling.

    Science.gov (United States)

    Qian, Siyu; Yu, Ping

    2013-01-01

    One of the reasons for end user avoidance or rejection of health information systems is poor alignment of the system with the healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step for the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviewing two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, the major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  12. Goal Model to Business Process Model: A Methodology for Enterprise Government Tourism System Development

    National Research Council Canada - National Science Library

    Ahmad Nurul Fajar; Imam Marzuki Shofi

    2016-01-01

    ... However, the goal model cannot be used directly to build a business process model. To solve this problem, this paper presents a methodology, called the GBPM Methodology, for extracting a goal model into a business process model...

  13. Modeling microbial processes in porous media

    Science.gov (United States)

    Murphy, Ellyn M.; Ginn, Timothy R.

    The incorporation of microbial processes into reactive transport models has generally proceeded along two separate lines of investigation: (1) transport of bacteria as inert colloids in porous media, and (2) the biodegradation of dissolved contaminants by a stationary phase of bacteria. Research over the last decade has indicated that these processes are closely linked. This linkage may occur when a change in metabolic activity alters the attachment/detachment rates of bacteria to surfaces, either promoting or retarding bacterial transport in a groundwater-contaminant plume. Changes in metabolic activity, in turn, are controlled by the time of exposure of the microbes to electron acceptors/donors and other components affecting activity. Similarly, metabolic activity can affect the reversibility of attachment, depending on the residence time of active microbes. Thus, improvements in quantitative analysis of active subsurface biota necessitate direct linkages between substrate availability, metabolic activity, growth, and attachment/detachment rates. This linkage requires both a detailed understanding of the biological processes and robust quantitative representations of these processes that can be tested experimentally. This paper presents an overview of current approaches used to represent physicochemical and biological processes in porous media, along with new conceptual approaches that link metabolic activity with partitioning of the microorganism between the aqueous and solid phases.

  14. Model reduction methods for vector autoregressive processes

    CERN Document Server

    Brüggemann, Ralf

    2004-01-01

    1.1 Objective of the Study Vector autoregressive (VAR) models have become one of the dominant research tools in the analysis of macroeconomic time series during the last two decades. The great success of this modeling class started with Sims' (1980) critique of the traditional simultaneous equation models (SEM). Sims criticized the use of 'too many incredible restrictions' based on 'supposed a priori knowledge' in large scale macroeconometric models which were popular at that time. Therefore, he advocated largely unrestricted reduced form multivariate time series models, unrestricted VAR models in particular. Ever since his influential paper these models have been employed extensively to characterize the underlying dynamics in systems of time series. In particular, tools to summarize the dynamic interaction between the system variables, such as impulse response analysis or forecast error variance decompositions, have been developed over the years. The econometrics of VAR models and related quantities i...

  15. An integrated approach for modelling of aircraft maintenance processes

    Directory of Open Access Journals (Sweden)

    D. Yu. Kiselev

    2015-01-01

    Full Text Available The paper deals with modeling of the processes of maintenance and repair of aircraft. The role of information in improving the effectiveness of maintenance systems is described. The methodology for functional modelling of maintenance processes is given. A simulation model is used for modelling possible changes.

  16. Dynamic modeling of ultrafiltration membranes for whey separation processes

    NARCIS (Netherlands)

    Saltık, M.B.; Özkan, Leyla; Jacobs, Marc; Padt, van der Albert

    2017-01-01

    In this paper, we present a control relevant rigorous dynamic model for an ultrafiltration membrane unit in a whey separation process. The model consists of a set of differential algebraic equations and is developed for online model based applications such as model based control and process monitoring.

  17. Refinement of SDBC Business Process Models Using ISDL

    NARCIS (Netherlands)

    Shishkov, Boris; Quartel, Dick; Manolopoulos, Y.; Filipe, J.; Constantopoulos, P.; Cordeiro, J.

    2006-01-01

    Aiming at aligning business process modeling and software specification, the SDBC approach considers a multi-viewpoint modeling where static, dynamic, and data business process aspect models have to be mapped adequately to corresponding static, dynamic, and data software specification aspect models.

  18. Learning Markov Decision Processes for Model Checking

    DEFF Research Database (Denmark)

    Mao, Hua; Chen, Yingke; Jaeger, Manfred

    2012-01-01

    Constructing an accurate system model for formal model verification can be both resource demanding and time-consuming. To alleviate this shortcoming, algorithms have been proposed for automatically learning system models based on observed system behaviors. In this paper we extend the algorithm on...

  19. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  20. The Inference Construct: A Model of the Writing Process

    Science.gov (United States)

    Van Nostrand, A. D.

    1978-01-01

    Presents a taxonomy of writing instruction, a model or paradigm of the writing process, an application of this model to the teaching of writing, and an explanation of the empirical basis of the model. (Author/GW)

  1. Inventory Reduction Using Business Process Reengineering and Simulation Modeling.

    Science.gov (United States)

    1996-12-01

    center is analyzed using simulation modeling and business process reengineering (BPR) concepts. The two simulation models were designed and evaluated by... Business process reengineering and simulation modeling offer powerful tools to aid the manager in reducing cycle time and inventory levels.

  2. Property Modelling for Applications in Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Physical-chemical properties of pure chemicals and their mixtures play an important role in the design of chemicals based products and the processes that manufacture them. Although, the use of experimental data in design and analysis of chemicals based products and their processes is desirable...... such as database, property model library, model parameter regression, and, property-model based product-process design will be presented. The database contains pure component and mixture data for a wide range of organic chemicals. The property models are based on the combined group contribution and atom...... modeling tools in design and analysis of chemical product-process design, including biochemical processes will be highlighted....

  3. Process Definition and Process Modeling Methods Version 01.01.00

    Science.gov (United States)

    1991-09-01

    process model. This generic process model is a state machine model. It permits progress in software development to be characterized as transitions...e.g., Entry-Task-Validation-Exit (ETVX) diagram, Petri Net, two-level state machine model, state machine, and Structured Analysis and Design

  4. Model building and model checking for biochemical processes.

    Science.gov (United States)

    Antoniotti, Marco; Policriti, Alberto; Ugel, Nadia; Mishra, Bud

    2003-01-01

    A central claim of computational systems biology is that, by drawing on mathematical approaches developed in the context of dynamic systems, kinetic analysis, computational theory and logic, it is possible to create powerful simulation, analysis, and reasoning tools for working biologists to decipher existing data, devise new experiments, and ultimately to understand functional properties of genomes, proteomes, cells, organs, and organisms. In this article, a novel computational tool is described that achieves many of the goals of this new discipline. The novelty of this system involves an automaton-based semantics of the temporal evolution of complex biochemical reactions starting from the representation given as a set of differential equations. The related tools also provide the ability to qualitatively reason about the systems using a propositional temporal logic that can express an ordered sequence of events succinctly and unambiguously. The implementation of mathematical and computational models in the Simpathica and XSSYS systems is described briefly. Several example applications of these systems to cellular and biochemical processes are presented: the two most prominent are Leibler et al.'s repressilator (an artificial synthesized oscillatory network) and Curto-Voit-Sorribas-Cascante's purine metabolism reaction model.

  5. How can Product Development Process Modelling be made more useful?

    DEFF Research Database (Denmark)

    Wynn, David C; Maier, Anja; Clarkson, John P

    2010-01-01

    A significant body of research exists in the area of Product Development (PD) process modelling. This is highlighted by Browning and Ramasesh (2007), who recently reviewed over 400 papers in this field. However, despite hundreds, probably thousands of publications in this area, few of the proposed methods appear to have been widely accepted by industry as practical approaches to improve PD processes. To improve the attractiveness of process modelling and model-based methods to industry it is thus worthwhile to ask: How is PD process modelling useful, and how can the utility of modelling to industry be improved? In this paper, we analyse PD process modelling ‘utility’ – which in broad terms we consider to be the degree to which a model-based approach or modelling intervention benefits practice. We view the utility of modelling as a composite property which depends both on the properties of models...

  6. Model visualization for evaluation of biocatalytic processes

    DEFF Research Database (Denmark)

    Law, HEM; Lewis, DJ; McRobbie, I

    2008-01-01

    Biocatalysis offers great potential as an additional, and in some cases as an alternative, synthetic tool for organic chemists, especially as a route to introduce chirality. However, the implementation of scalable biocatalytic processes nearly always requires the introduction of process and/or bi...

  7. Computer-Aided Multiscale Modelling for Chemical Process Engineering

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Gani, Rafiqul

    2007-01-01

    Chemical processes are generally modeled through monoscale approaches, which, while not adequate, satisfy a useful role in product-process design. In this case, use of a multi-dimensional and multi-scale model-based approach has importance in product-process development. A computer-aided framework for model generation, analysis, solution and implementation is necessary for the development and application of the desired model-based approach for product-centric process design/analysis. This goal is achieved through the combination of a system for model development (ModDev), and a modelling tool (MoT) for model translation, analysis and solution. The integration of ModDev, MoT and ICAS or any other external software or process simulator (using COM-Objects) permits the generation of different models and/or process configurations for purposes of simulation, design and analysis. Consequently, it is possible...

  8. Process Model for Defining Space Sensing and Situational Awareness Requirements

    Science.gov (United States)

    2006-04-01

    A process model for defining systems for space sensing and space situational awareness is presented. The paper concentrates on eight steps for determining the requirements, including: decision maker needs, system requirements, exploitation methods and vulnerabilities, critical capabilities, and identifying attack scenarios. Utilization of the USAF anti-tamper (AT) implementation process as a process model departure point for the space sensing and situational awareness (SSSA...is presented. The AT implementation process model, as an

  9. Predictive Modeling of Metal-Catalyzed Polyolefin Processes

    OpenAIRE

    Khare, Neeraj Prasad

    2003-01-01

    This dissertation describes the essential modeling components and techniques for building comprehensive polymer process models for metal-catalyzed polyolefin processes. The significance of this work is that it presents a comprehensive approach to polymer process modeling applied to large-scale commercial processes. Most researchers focus only on polymerization mechanisms and reaction kinetics, and neglect physical properties and phase equilibrium. Both physical properties and phase equilib...

  10. Hot cheese: a processed Swiss cheese model.

    Science.gov (United States)

    Li, Y; Thimbleby, H

    2014-01-01

    James Reason's classic Swiss cheese model is a vivid and memorable way to visualise how patient harm happens only when all system defences fail. Although Reason's model has been criticised for its simplicity and static portrait of complex systems, its use has been growing, largely because of the direct clarity of its simple and memorable metaphor. A more general, more flexible and equally memorable model of accident causation in complex systems is needed. We present the hot cheese model, which is more realistic, particularly in portraying defence layers as dynamic and active - more defences may cause more hazards. The hot cheese model, being more flexible, encourages deeper discussion of incidents than the simpler Swiss cheese model permits.

  11. Modelling the Active Hearing Process in Mosquitoes

    Science.gov (United States)

    Avitabile, Daniele; Homer, Martin; Jackson, Joe; Robert, Daniel; Champneys, Alan

    2011-11-01

    A simple microscopic mechanistic model is described of the active amplification within the Johnston's organ of the mosquito species Toxorhynchites brevipalpis. The model is based on the description of the antenna as a forced-damped oscillator coupled to a set of active threads (ensembles of scolopidia) that provide an impulsive force when they twitch. This twitching is in turn controlled by channels that are opened and closed if the antennal oscillation reaches a critical amplitude. The model matches both qualitatively and quantitatively with recent experiments. New results are presented using mathematical homogenization techniques to derive a mesoscopic model as a simple oscillator with nonlinear force and damping characteristics. It is shown how the results from this new model closely resemble those from the microscopic model as the number of threads approach physiologically correct values.

  12. An Abstract Model of Historical Processes

    Directory of Open Access Journals (Sweden)

    Michael Poulshock

    2017-06-01

    Full Text Available A theoretical model is presented which provides a way to simulate, at a very abstract level, power struggles in the social world. In the model, agents can benefit or harm each other, to varying degrees and with differing levels of influence. The agents interact over time, using the power they have to try to get more of it, while being constrained in their strategic choices by social inertia. The outcomes of the model are probabilistic. More research is needed to determine whether the model has any empirical validity.

  13. Modeling Business Processes with Azzurra: Order Fulfilment

    OpenAIRE

    Canobbio, Giulia; Dalpiaz, Fabiano

    2012-01-01

    Azzurra is a specification language for modeling and enacting business processes. Azzurra is founded on social concepts, such as roles, agents and commitments among them, and Azzurra models are social models consisting of networks of commitments. As such, Azzurra models support the flexible enactment of business processes, and provide a semantic notion of actor accountability and business process compliance. In this technical report, we apply Azzurra to the order fulfilment exemplar from ...

  14. Incorporating Enterprise Risk Management in the Business Model Innovation Process

    OpenAIRE

    Yariv Taran; Harry Boer; Peter Lindgren

    2013-01-01

    Purpose: Relative to other types of innovations, little is known about business model innovation, let alone the process of managing the risks involved in that process. Using the emerging (enterprise) risk management literature, an approach is proposed through which risk management can be embedded in the business model innovation process. Design: The integrated business model innovation risk management model developed in this paper has been tested through an action research study in a Dani...

  15. Dynamics of the two process model of human sleep regulation

    Science.gov (United States)

    Kenngott, Max; McKay, Cavendish

    2011-04-01

    We examine the dynamics of the two process model of human sleep regulation. In this model, sleep propensity is governed by the interaction between a periodic threshold (process C) and a saturating growth/decay (process S). We find that the parameter space of this model admits sleep cycles with a wide variety of characteristics, many of which are not observed in normal human sleepers. We also examine the effects of phase dependent feedback on this model.
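    For readers who want to experiment, a minimal sketch of the standard two process formulation (Python) is given below: the homeostatic pressure S saturates during wake and decays during sleep, and sleep/wake switches occur when S crosses a sinusoidally modulated threshold pair. The time constants and threshold levels are illustrative textbook-style values, not parameters fitted by the authors.

        import numpy as np

        dt, days = 0.01, 5
        t = np.arange(0.0, 24.0 * days, dt)
        S = np.empty_like(t); S[0] = 0.5
        awake = np.empty_like(t, dtype=bool); awake[0] = True
        tau_rise, tau_decay = 18.2, 4.2                  # wake build-up / sleep decay (hours)
        for i in range(1, t.size):
            if awake[i - 1]:
                S[i] = 1.0 + (S[i - 1] - 1.0) * np.exp(-dt / tau_rise)   # saturating growth (process S)
            else:
                S[i] = S[i - 1] * np.exp(-dt / tau_decay)                # exponential decay
            c = 0.12 * np.sin(2.0 * np.pi * t[i] / 24.0)                 # circadian modulation (process C)
            upper, lower = 0.85 + c, 0.25 + c
            awake[i] = awake[i - 1]
            if awake[i - 1] and S[i] >= upper:
                awake[i] = False                                         # upper threshold hit: fall asleep
            elif not awake[i - 1] and S[i] <= lower:
                awake[i] = True                                          # lower threshold hit: wake up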

  16. Modeling of Heating During Food Processing

    Science.gov (United States)

    Zheleva, Ivanka; Kamburova, Veselka

    Heat transfer processes are important for almost all aspects of food preparation and play a key role in determining food safety. Whether it is cooking, baking, boiling, frying, grilling, blanching, drying, sterilizing, or freezing, heat transfer is part of the processing of almost every food. Heat transfer is a dynamic process in which thermal energy is transferred from one body with higher temperature to another body with lower temperature. Temperature difference between the source of heat and the receiver of heat is the driving force in heat transfer.
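    The driving-force statement can be made quantitative with the standard textbook rate laws (quoted here as generic relations, not as equations from this record):

        q = -k \, \nabla T                     (conduction, Fourier's law)
        q = h \, A \, (T_s - T_\infty)         (convection, Newton's law of cooling)

    where k is the thermal conductivity, h the convective heat-transfer coefficient, A the exchange area, and T_s - T_\infty the temperature difference that drives the transfer.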

  17. Managing risks in business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2010-01-01

    Although business model (BM) innovation is a risky enterprise, many companies are still choosing not to apply any risk management in the BM innovation process. The objective of this paper is to develop a better understanding of how risks are handled in the practice of BM innovation. An analysis of the BM innovation experiences of two industrial companies shows that both companies are experiencing high levels of uncertainty and complexity during their innovation processes and are, consequently, struggling to find new processes for handling the risks involved. Based on the two companies’ experiences, various testable propositions are put forward, which link success and failure to the way companies appreciate and handle the risks involved in BM innovation.

  18. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    Full Text Available During the development of an information system it is important to create clear models and choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages. The article presents a theoretical comparison of business rules and business process modeling languages, comparing the different business process modeling languages and business rules representation languages according to selected modeling aspects. Finally, the best-fitting set of languages is selected for a three-layer framework for business-rule-based software modeling.

  19. Modelling Template for the Development of the Process Flowsheet

    DEFF Research Database (Denmark)

    Fedorova, Marina; Gani, Rafiqul

    2015-01-01

    The template, in connection with other modelling tools within the modelling framework, forms a user-friendly system which makes the model development process easier and faster and provides the way for unified and consistent model documentation. The modeller can use the template for their specific problem or extend it. The interplay of the parts in the modelling framework is highlighted through the development of a simple flowsheet model. Initially the model equations are obtained from the tool for model generation and then transferred to the model analysis tool. Further, based on this model, a modelling template is created, which is used later...

  20. Multiscale soil-landscape process modeling

    NARCIS (Netherlands)

    Schoorl, J.M.; Veldkamp, A.

    2006-01-01

    The general objective of this chapter is to illustrate the role of soils and geomorphological processes in the multiscale soil-landscape context. Included in this context are the fourth (temporal) dimension and the human role (fifth dimension).

  1. Evaluation of EOR Processes Using Network Models

    DEFF Research Database (Denmark)

    Larsen, Jens Kjell; Krogsbøll, Anette

    1998-01-01

    The report consists of the following parts: 1) Studies of wetting properties of model fluids and fluid mixtures aimed at an optimal selection of candidates for micromodel experiments. 2) Experimental studies of multiphase transport properties using physical models of porous networks (micromodels)...

  2. Eye Tracking Meets the Process of Process Modeling: a Visual Analytic Approach

    DEFF Research Database (Denmark)

    Burattin, Andrea; Kaiser, M.; Neurauter, Manuel

    2017-01-01

    Research on the process of process modeling (PPM) studies how process models are created. It typically uses the logs of the interactions with the modeling tool to assess the modeler’s behavior. In this paper we suggest introducing an additional stream of data (i.e., eye tracking) to improve... diagram, heat maps, fixation distributions), both static and dynamic (i.e., movies with the evolution of the model and eye tracking data on top).

  3. Modeling of Mixed Decision Making Process

    OpenAIRE

    yahia, Nesrine Ben; Bellamine, Narjès; Ghezala, Henda Ben

    2012-01-01

    Decision making, whenever and wherever it happens, is key to organizations' success. In order to make correct decisions, individuals, teams and organizations need both knowledge management (to manage content) and collaboration (to manage group processes) to make them more effective and efficient. In this paper, we explain the convergence of knowledge management and collaboration. Then, we propose a formal description of a mixed and multimodal decision making (MDM) process where decisions may be mad...

  5. Budget impact analysis of the simplification to atazanavir + ritonavir + lamivudine dual therapy of HIV-positive patients receiving atazanavir-based triple therapies in Italy starting from data of the Atlas-M trial

    Science.gov (United States)

    Restelli, Umberto; Fabbiani, Massimiliano; Di Giambenedetto, Simona; Nappi, Carmela; Croce, Davide

    2017-01-01

    Background This analysis aimed at evaluating the impact of a therapeutic strategy of treatment simplification of atazanavir (ATV)+ ritonavir (r) + lamivudine (3TC) in virologically suppressed patients receiving ATV+r+2 nucleoside reverse transcriptase inhibitors (NRTIs) on the budget of the Italian National Health Service (NHS). Methods A budget impact model with a 5-year time horizon was developed based on the clinical data of Atlas-M trial at 48 weeks (in terms of percentage of patients experiencing virologic failure and adverse events), from the Italian NHS perspective. A scenario in which the simplification strategy was not considered was compared with three scenarios in which, among a target population of 1,892 patients, different simplification strategies were taken into consideration in terms of percentage of patients simplified on a yearly basis among those eligible for simplification. The costs considered were direct medical costs related to antiretroviral drugs, adverse events management, and monitoring activities. Results The percentage of patients of the target population receiving ATV+r+3TC varies among the scenarios and is between 18.7% and 46.9% in year 1, increasing up to 56.3% and 84.4% in year 5. The antiretroviral treatment simplification strategy considered would lead to lower costs for the Italian NHS in a 5-year time horizon between −28.7 million € and −16.0 million €, with a reduction of costs between −22.1% (−3.6 million €) and −8.8% (−1.4 million €) in year 1 and up to −39.9% (−6.9 million €) and −26.6% (−4.6 million €) in year 5. Conclusion The therapy simplification for patients receiving ATV+r+2 NRTIs to ATV+r+3TC at a national level would lead to a reduction of direct medical costs over a 5-year period for the Italian NHS. PMID:28280375

  6. Modeling fixation locations using spatial point processes.

    Science.gov (United States)

    Barthelmé, Simon; Trukenbrod, Hans; Engbert, Ralf; Wichmann, Felix

    2013-10-01

    Whenever eye movements are measured, a central part of the analysis has to do with where subjects fixate and why they fixated where they fixated. To a first approximation, a set of fixations can be viewed as a set of points in space; this implies that fixations are spatial data and that the analysis of fixation locations can be beneficially thought of as a spatial statistics problem. We argue that thinking of fixation locations as arising from point processes is a very fruitful framework for eye-movement data, helping turn qualitative questions into quantitative ones. We provide a tutorial introduction to some of the main ideas of the field of spatial statistics, focusing especially on spatial Poisson processes. We show how point processes help relate image properties to fixation locations. In particular we show how point processes naturally express the idea that image features' predictability for fixations may vary from one image to another. We review other methods of analysis used in the literature, show how they relate to point process theory, and argue that thinking in terms of point processes substantially extends the range of analyses that can be performed and clarify their interpretation.
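    A minimal sketch of the central construction (Python): fixation-like locations drawn from an inhomogeneous spatial Poisson process by thinning, where the intensity plays the role of an image-derived predictability map. The Gaussian "saliency" intensity and its scale are invented for illustration only.

        import numpy as np

        rng = np.random.default_rng(1)

        def intensity(x, y):                 # toy image-based intensity, peaked at the centre
            return 200.0 * np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.05)

        lam_max = 200.0                      # upper bound on the intensity over the unit square
        n = rng.poisson(lam_max)             # homogeneous candidate points
        cand = rng.random((n, 2))
        accept = rng.random(n) < intensity(cand[:, 0], cand[:, 1]) / lam_max
        fixations = cand[accept]             # retained points follow the inhomogeneous process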

  7. Model uncertainty and Bayesian model averaging in vector autoregressive processes

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2006-01-01

    textabstractEconomic forecasts and policy decisions are often informed by empirical analysis based on econometric models. However, inference based upon a single model, when several viable models exist, limits its usefulness. Taking account of model uncertainty, a Bayesian model averaging procedure i

  8. Hierarchy-Based Team Software Process Simulation Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    According to the characteristics of the Team Software Process (TSP), a hierarchy-based model is adopted that combines a discrete event model with a system dynamics model. This model represents the TSP at two levels: the inner level embodies the continuity of the software process, while the outer level embodies the phased software development process. The structure and principle of the model are explained in detail, and a formal description of the model is offered. Finally, an example is presented to demonstrate the simulation process and results. This model can simulate the team software process from various angles, and supervise and predict the software process. It can also make the management of software development more scientific and improve the quality of software.

  9. Support of Modelling in Process-Engineering Education

    NARCIS (Netherlands)

    Schaaf, van der H.; Vermuë, M.H.; Tramper, J.; Hartog, R.J.M.

    2006-01-01

    An objective of the Process Technology curriculum at Wageningen University is to teach students a stepwise modeling approach in the context of process engineering. Many process-engineering students have difficulty with learning to design a model. Some common problems are lack of structure in the des

  10. Computational modeling in the primary processing of titanium: A review

    Science.gov (United States)

    Venkatesh, Vasisht; Wilson, Andrew; Kamal, Manish; Thomas, Matthew; Lambert, Dave

    2009-05-01

    Process modeling is increasingly becoming a vital tool for modern metals manufacturing. This paper reviews process modeling initiatives started at TIMET over the last decade for the primary processing of titanium alloys. SOLAR, a finite volume-based numerical model developed at the Ecole de Mine at Nancy, has been successfully utilized to optimize vacuum arc remelting process parameters, such as electromagnetic stirring profiles in order to minimize macrosegregation and improve ingot quality. Thermo-mechanical modeling of heat treating, billet forging, and slab rolling is accomplished via the commercial finite element analysis model, DEFORM, to determine heating times, cooling rates, strain distributions, etc.

  11. Deconstructing crop processes and models via identities

    DEFF Research Database (Denmark)

    Porter, John Roy; Christensen, Svend

    2013-01-01

    This paper is part review and part opinion piece; it has three parts of increasing novelty and speculation in approach. The first presents an overview of how some of the major crop simulation models approach the issue of simulating the responses of crops to changing climatic and weather variables......, mainly atmospheric CO2 concentration and increased and/or varying temperatures. It illustrates an important principle in models of a single cause having alternative effects and vice versa. The second part suggests some features, mostly missing in current crop models, that need to be included...

  12. An Abstract Model of Historical Processes

    CERN Document Server

    Poulshock, Michael

    2016-01-01

    A game theoretic model is presented which attempts to simulate, at a very abstract level, power struggles in the social world. In the model, agents can benefit or harm each other, to varying degrees and with differing levels of influence. The agents play a dynamic, noncooperative, perfect information game where the goal is to maximize payoffs based on positional utility and intertemporal preference, while being constrained by social inertia. Agents use the power they have in order to get more of it, both in an absolute and relative sense. More research is needed to assess the model's empirical validity.

  13. NONLINEAR MODEL PREDICTIVE CONTROL OF CHEMICAL PROCESSES

    Directory of Open Access Journals (Sweden)

    R. G. SILVA

    1999-03-01

    Full Text Available A new algorithm for model predictive control is presented. The algorithm utilizes a simultaneous solution and optimization strategy to solve the model's differential equations. The equations are discretized by equidistant collocation, and along with the algebraic model equations are included as constraints in a nonlinear programming (NLP) problem. This algorithm is compared with the algorithm that uses orthogonal collocation on finite elements. The equidistant collocation algorithm results in simpler equations, providing a decrease in computation time for the control moves. Simulation results are presented and show a satisfactory performance of this algorithm.
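    A toy version of the simultaneous strategy (Python with SciPy) is sketched below for the scalar model dx/dt = -x + u: the dynamics are discretized on equidistant points (a trapezoidal rule standing in for the paper's equidistant collocation) and imposed as equality constraints of an NLP that trades off tracking error against control effort. The horizon, weights and the model itself are invented for illustration.

        import numpy as np
        from scipy.optimize import minimize

        N, dt, x0, xref = 20, 0.1, 1.0, 0.0

        def unpack(z):
            return z[:N + 1], z[N + 1:]          # states x_0..x_N, controls u_0..u_N

        def objective(z):
            x, u = unpack(z)
            return np.sum((x - xref) ** 2) + 0.1 * np.sum(u ** 2)

        def defects(z):                          # one discretized-ODE residual per interval
            x, u = unpack(z)
            f = -x + u
            return x[1:] - x[:-1] - 0.5 * dt * (f[1:] + f[:-1])

        cons = [{"type": "eq", "fun": defects},
                {"type": "eq", "fun": lambda z: unpack(z)[0][0] - x0}]
        res = minimize(objective, np.zeros(2 * (N + 1)), constraints=cons, method="SLSQP")
        x_opt, u_opt = unpack(res.x)             # optimal state and control trajectories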

  14. MODELING OF MANAGEMENT PROCESSES IN AN ORGANIZATION

    Directory of Open Access Journals (Sweden)

    Stefan Iovan

    2016-05-01

    Full Text Available When driving any major change within an organization, strategy and execution are intrinsic to a project’s success. Nevertheless, closing the gap between strategy and execution remains a challenge for many organizations [1]. Companies tend to focus more on execution than strategy for quick results, instead of taking the time needed to understand the parts that make up the whole, so the right execution plan can be put in place to deliver the best outcomes. A large part of this is understanding that business operations don’t fit neatly within the traditional organizational hierarchy. Business processes are often messy, collaborative efforts that cross teams, departments and systems, making them difficult to manage within a hierarchical structure [2]. Business process management (BPM) fills this gap by redefining an organization according to its end-to-end processes, so opportunities for improvement can be identified and processes streamlined for growth, revenue and transformation. This white paper provides guidelines on what to consider when using business process applications to solve your BPM initiatives, and the unique capabilities software systems provide that can help ensure both your project’s success and the success of your organization as a whole, whether it is one of the majority of medium and small businesses, a big company or even a governmental organization [2].

  15. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    Full Text Available This work presents a modelling framework based on a phenomena description of the process. The approach is intended to make it easy to understand and construct process models in heterogeneous, possibly distributed modelling and simulation environments. A simplified case study of a heat exchanger is considered, and the Modelica modelling language is used to check the proposed concept. The partial results are promising and the research effort will be extended to a computer-aided modelling environment based on phenomena.

  16. PtProcess: An R Package for Modelling Marked Point Processes Indexed by Time

    Directory of Open Access Journals (Sweden)

    David Harte

    2010-10-01

    Full Text Available This paper describes the package PtProcess which uses the R statistical language. The package provides a unified approach to fitting and simulating a wide variety of temporal point process or temporal marked point process models. The models are specified by an intensity function which is conditional on the history of the process. The user needs to provide routines for calculating the conditional intensity function. Then the package enables one to carry out maximum likelihood fitting, goodness of fit testing, simulation and comparison of models. The package includes the routines for the conditional intensity functions for a variety of standard point process models. The package is intended to simplify the fitting of point process models indexed by time in much the same way as generalized linear model programs have simplified the fitting of various linear models. The primary examples used in this paper are earthquake sequences but the package is intended to have a much wider applicability.

  17. Biomedical Simulation Models of Human Auditory Processes

    Science.gov (United States)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models are developed that explore noise propagation mechanisms associated with noise attenuation and transmission paths created when using hearing protectors such as earplugs and headsets in high noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data which provides explicit external ear, ear canal, middle ear ossicular bones and cochlea geometry. Results from these studies have enabled a greater understanding of hearing protector to flesh dynamics as well as prioritizing noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast related impulse noise on human auditory mechanisms and brain tissue.

  18. Beyond dual-process models: A categorisation of processes underlying intuitive judgement and decision making

    NARCIS (Netherlands)

    Glöckner, A.; Witteman, C.L.M.

    2010-01-01

    Intuitive-automatic processes are crucial for making judgements and decisions. The fascinating complexity of these processes has attracted many decision researchers, prompting them to start investigating intuition empirically and to develop numerous models. Dual-process models assume a clear distinc

  19. The evolution of process-based hydrologic models

    NARCIS (Netherlands)

    Clark, Martyn P.; Bierkens, Marc F.P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R.N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-01-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this

  20. Updating parameters of the chicken processing line model

    DEFF Research Database (Denmark)

    Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna

    2010-01-01

    A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. Recent availability of data allows...... of the chicken processing line model....

  1. Modeling Recycling Asphalt Pavement Processing Technologies in Asphalt Mixing Plants

    OpenAIRE

    Simonas Tamaliūnas; Henrikas Sivilevičius

    2011-01-01

    The article presents reclaimed asphalt pavement (RAP) processing technologies and equipment models used in the asphalt mixing plant (AMP). The schematic model indicating all possible ways to process RAP in AMP is shown. The model calculating the needed temperature of mineral materials used for heating RAP is given and an example of such calculation is provided.Article in Lithuanian

  3. An integrated model for supplier selection process

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In today's highly competitive manufacturing environment, the supplier selection process becomes one of the crucial activities in supply chain management. In order to select the best supplier(s) it is necessary not only to continuously track and benchmark the performance of suppliers but also to make a tradeoff between tangible and intangible factors, some of which may conflict. In this paper an integration of case-based reasoning (CBR), analytic network process (ANP) and linear programming (LP) is proposed to solve the supplier selection problem.
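    A toy version of the LP component (Python with SciPy): once CBR has retrieved comparable cases and ANP has scored the shortlisted suppliers, order quantities can be split by a linear program. The costs, capacities and demand below are invented numbers for illustration.

        import numpy as np
        from scipy.optimize import linprog

        cost = np.array([4.0, 5.5, 5.0])      # unit price offered by each shortlisted supplier
        capacity = [60, 80, 50]               # per-supplier supply limits
        demand = 120.0
        res = linprog(c=cost,
                      A_eq=np.ones((1, 3)), b_eq=[demand],       # total order meets demand
                      bounds=[(0, cap) for cap in capacity])
        allocation = res.x                    # cost-minimal order split across suppliers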

  4. Statistical image processing and multidimensional modeling

    CERN Document Server

    Fieguth, Paul

    2010-01-01

    Images are all around us! The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something - an artery, a road, a DNA marker, an oil spill - from imagery, possibly noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However this book does not so much focus on images, per se, but rather on spatial data sets, with one or more measurements taken over

  5. Spinning disc atomisation process: Modelling and computations

    Science.gov (United States)

    Li, Yuan; Sisoev, Grigory; Shikhmurzaev, Yulii

    2016-11-01

    The atomisation of liquids using a spinning disc (SDA), where the centrifugal force is used to generate a continuous flow, with the liquid eventually disintegrating into drops which, on solidification, become particles, is a key element in many technologies. Examples of such technologies range from powder manufacturing in metallurgy to various biomedical applications. In order to be able to control the SDA process, it is necessary to understand it as a whole, from the feeding of the liquid and the wave pattern developing on the disc to the disintegration of the liquid film into filaments and these into drops. The SDA process has been the subject of a number of experimental studies and some elements of it, notably the film on a spinning disc and the dynamics of the jets streaming out from it, have been investigated theoretically. However, to date there have been no studies of the process as a whole, including, most importantly, the transition zone where the film that has already developed a certain wave pattern disintegrates into jets that spiral out. The present work reports some results of an ongoing project aimed at producing a definitive map of regimes occurring in the SDA process and their outcome.

  6. Incorporating evolutionary processes into population viability models

    NARCIS (Netherlands)

    Pierson, J.C.; Beissinger, S.R.; Bragg, J.G.; Coates, D.J.; Oostermeijer, J.G.B.; Sunnucks, P.; Schumaker, N.H.; Trotter, M.V.; Young, A.G.

    2015-01-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand

  7. Computer Modeling of Complete IC Fabrication Process.

    Science.gov (United States)

    1984-01-01

    now makes the correlation between process specification and resulting physically observable parameters a viable and valuable design tool.

  8. Iterative ramp sharpening for structure signature-preserving simplification of images

    Energy Technology Data Exchange (ETDEWEB)

    Grazzini, Jacopo A [Los Alamos National Laboratory; Soille, Pierre [EC-JRC

    2010-01-01

    In this paper, we present a simple and heuristic ramp sharpening algorithm that achieves local contrast enhancement of vector-valued images. The proposed algorithm performs a local comparison of intensity values as well as gradient strength and directional information derived from the gradient structure tensor, so that the sharpening is applied only to pixels found on the ramps around true edges. This way, the contrast between objects and regions separated by a ramp is enhanced correspondingly, avoiding ringing artefacts. It is found that applying this technique in an iterative manner to blurred imagery produces sharpening that preserves both the structure and the signature of the image. The final approach reaches a good compromise between complexity and effectiveness for image simplification, enhancing the image details in an efficient manner while maintaining the overall image appearance.
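    A schematic rendering of the idea (Python): restrict an unsharp-masking step to ramp pixels, i.e. where the smoothed gradient energy (the trace of the gradient structure tensor) is high, and iterate. The scales and thresholds are invented, and the full algorithm additionally exploits the tensor's directional information, which this sketch omits.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def ramp_sharpen(img, iterations=5, strength=0.4, thresh=0.02):
            out = img.astype(float)
            for _ in range(iterations):
                gy, gx = np.gradient(out)
                energy = gaussian_filter(gx * gx + gy * gy, sigma=1.0)  # smoothed tensor trace
                ramps = energy > thresh                                 # sharpen only on ramps
                detail = out - gaussian_filter(out, sigma=1.0)          # high-frequency residual
                out = np.where(ramps, out + strength * detail, out)     # local contrast boost
            return np.clip(out, 0.0, 1.0)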

  9. Arthroscopic anatomical reconstruction of the lateral ankle ligaments: A technical simplification.

    Science.gov (United States)

    Lopes, R; Decante, C; Geffroy, L; Brulefert, K; Noailles, T

    2016-12-01

    Anatomical reconstruction of the lateral ankle ligaments has become a pivotal component of the treatment strategy for chronic ankle instability. The recently described arthroscopic version of this procedure is indispensable to ensure that concomitant lesions are appropriately managed, yet remains technically demanding. Here, we describe a simplified variant involving percutaneous creation of the calcaneal tunnel for the distal attachment of the calcaneo-fibular ligament. The rationale for this technical stratagem was provided by a preliminary cadaver study that demonstrated a correlation between the lateral malleolus and the distal footprint of the calcaneo-fibular ligament. The main objectives are simplification of the operative technique and decreased injury to tissues whose function is crucial to the recovery of proprioception. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  10. A simplification of the fractional Hartley transform applied to image security system in phase

    Science.gov (United States)

    Jimenez, Carlos J.; Vilardy, Juan M.; Perez, Ronal

    2017-01-01

    In this work we develop a new encryption system for encoded image in phase using the fractional Hartley transform (FrHT), truncation operations and random phase masks (RPMs). We introduce a simplification of the FrHT with the purpose of computing this transform in an efficient and fast way. The security of the encryption system is increased by using nonlinear operations, such as the phase encoding and the truncation operations. The image to encrypt (original image) is encoded in phase and the truncation operations applied in the encryption-decryption system are the amplitude and phase truncations. The encrypted image is protected by six keys, which are the two fractional orders of the FrHTs, the two RPMs and the two pseudorandom code images generated by the amplitude and phase truncation operations. All these keys have to be correct for a proper recovery of the original image in the decryption system. We present digital results that confirm our approach.

  11. Statistical properties of several models of fractional random point processes

    Science.gov (United States)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.

  12. Analysis of Using Resources in Business Process Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Vasilecas Olegas

    2014-12-01

    Full Text Available One of the key purposes of Business Process Model and Notation (BPMN) is to support graphical representation of the process model. However, such models lack support for the graphical representation of the resources that processes use during simulation or execution of a process instance. The paper analyzes different methods, and their extensions, for resource modeling, and presents a selected set of resource properties relevant to it. It then proposes an approach explaining how to use this set of resource properties to extend process modeling with BPMN and simulation tools, in which concurrent business process instances compete for shared resources.
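
    As a toy illustration of the resource concurrency such an extension targets (not the paper's tooling), the following sketch uses the simpy library to let several process instances compete for one shared resource:

```python
import simpy

def process_instance(env, name, clerk, service_time=4):
    """One process instance: queue for the shared resource, then be served."""
    arrival = env.now
    with clerk.request() as req:
        yield req                        # wait until the resource is free
        waited = env.now - arrival
        yield env.timeout(service_time)  # perform the activity
    print(f"{name}: waited {waited}, finished at {env.now}")

env = simpy.Environment()
clerk = simpy.Resource(env, capacity=1)  # one unit of the modeled resource
for i in range(3):
    env.process(process_instance(env, f"instance-{i}", clerk))
env.run()
```

    With capacity 1, the three instances serialize and the waiting times grow; raising the capacity property immediately changes the simulated throughput, which is the kind of effect the proposed resource properties make explicit in the model.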

  13. Modelling population processes with random initial conditions.

    Science.gov (United States)

    Pollett, P K; Dooley, A H; Ross, J V

    2010-02-01

    Population dynamics are almost inevitably associated with two predominant sources of variation: the first, demographic variability, a consequence of chance in progenitive and deleterious events; the second, initial state uncertainty, a consequence of partial observability and reporting delays and errors. Here we outline a general method for incorporating random initial conditions in population models where a deterministic model is sufficient to describe the dynamics of the population. Additionally, we show that for a large class of stochastic models the overall variation is the sum of variation due to random initial conditions and variation due to random dynamics, and thus we are able to quantify the variation not accounted for when random dynamics are ignored. Our results are illustrated with reference to both simulated and real data.
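
    The decomposition stated in the abstract is the law of total variance; writing X_0 for the random initial state and X_t for the population state (notation ours):

```latex
\operatorname{Var}(X_t)
  = \underbrace{\operatorname{Var}\bigl(\mathbb{E}[X_t \mid X_0]\bigr)}_{\text{random initial conditions}}
  + \underbrace{\mathbb{E}\bigl[\operatorname{Var}(X_t \mid X_0)\bigr]}_{\text{random dynamics}}
```

    Dropping the second term is exactly the "variation not accounted for when random dynamics are ignored."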

  14. Stochastic Processes via the Pathway Model

    Directory of Open Access Journals (Sweden)

    Arak M. Mathai

    2015-04-01

    Full Text Available After collecting data from observations or experiments, the next step is to analyze the data and build an appropriate mathematical or stochastic model that describes them, so that further studies can be done with the help of the model. In this article, the input-output type mechanism is considered first, into which physical situations of reaction, diffusion, reaction-diffusion, and production-destruction type fit. Techniques are then described to produce thicker or thinner tails (power-law behavior) in stochastic models. Finally, the pathway idea is described, whereby one can switch between different functional forms of the probability density function through a parameter called the pathway parameter. The paper is a continuation of related solar neutrino research published previously in this journal.
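
    For reference, one standard scalar form of Mathai's pathway density (our notation; a, δ, η > 0, x > 0) switches functional families through the pathway parameter α:

```latex
f(x) \;\propto\; x^{\gamma-1}\bigl[1 - a(1-\alpha)\,x^{\delta}\bigr]^{\frac{\eta}{1-\alpha}},
\qquad x > 0,\quad 1 - a(1-\alpha)x^{\delta} > 0
```

    For α < 1 this is an extended type-1 beta form with compact support; for α > 1 the bracket becomes [1 + a(α−1)x^δ]^(−η/(α−1)), giving power-law (thicker) tails; and as α → 1 both cases converge to the generalized gamma form x^(γ−1) e^(−aηx^δ).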

  15. LISP based simulation generators for modeling complex space processes

    Science.gov (United States)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  16. Protein chain pair simplification under the discrete Fréchet distance.

    Science.gov (United States)

    Wylie, Tim; Zhu, Binhai

    2013-01-01

    For protein structure alignment and comparison, a lot of work has been done using RMSD as the distance measure, which has drawbacks under certain circumstances. Thus, the discrete Fréchet distance was recently applied to the problem of protein (backbone) structure alignment and comparison with promising results. For this problem, visualization is also important because protein chain backbones can have as many as 500-600 α-carbon atoms, which constitute the vertices in the comparison. Even with an excellent alignment, the similarity of two polygonal chains can be difficult to visualize unless the chains are nearly identical. Thus, the chain pair simplification problem (CPS-3F) was proposed in 2008 to simultaneously simplify both chains with respect to each other under the discrete Fréchet distance. The complexity of CPS-3F is unknown, so heuristic methods have been developed. Here, we define a variation of CPS-3F, called the constrained CPS-3F problem (CPS-3F+), and prove that it is polynomially solvable by presenting a dynamic programming solution, which we then prove is a factor-2 approximation for CPS-3F. We then compare the CPS-3F+ solutions with previous empirical results, and further demonstrate some of the benefits of the simplified comparisons. Chain pair simplification based on the Hausdorff distance (CPS-2H) is known to be NP-complete, and here we prove that the constrained version (CPS-2H+) is also NP-complete. Finally, we discuss future work and implications along with a software library implementation, named the Fréchet-based Protein Alignment & Comparison Toolkit (FPACT).
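
    The dynamic program underlying such comparisons is easiest to see for the plain discrete Fréchet distance itself; the sketch below (an illustration of the standard recurrence, not the CPS-3F+ algorithm) computes it for two chains of points:

```python
import numpy as np

def discrete_frechet(P, Q):
    """Discrete Frechet distance between point sequences P and Q."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    n, m = len(P), len(Q)
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=2)  # pairwise dists
    ca = np.full((n, m), np.inf)
    ca[0, 0] = d[0, 0]
    for i in range(n):
        for j in range(m):
            if i == 0 and j == 0:
                continue
            # Cheapest coupling reachable from the three allowed moves.
            prev = min(
                ca[i - 1, j] if i > 0 else np.inf,
                ca[i, j - 1] if j > 0 else np.inf,
                ca[i - 1, j - 1] if i > 0 and j > 0 else np.inf,
            )
            ca[i, j] = max(prev, d[i, j])
    return ca[n - 1, m - 1]
```

    The CPS-3F+ dynamic program extends this idea by additionally choosing, at each step, which vertices of each chain survive the simplification.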

  17. Central nervous system HIV infection in "less-drug regimen" antiretroviral therapy simplification strategies.

    Science.gov (United States)

    Ferretti, Francesca; Gianotti, Nicola; Lazzarin, Adriano; Cinque, Paola

    2014-02-01

    Less-drug regimens (LDR) refer to combinations of either two antiretroviral drugs or ritonavir-boosted protease inhibitor (PI) monotherapy. They may represent a simplification strategy in patients with persistently suppressed human immunodeficiency virus (HIV) viremia, with the main benefits of reducing drug-related toxicities and costs. Systemic virological efficacy of LDR is slightly lower as compared with combined antiretroviral therapy (cART), but patients with failure do not usually develop drug resistance and resuppress HIV replication after reintensification. A major concern of LDR is the lower efficacy in the virus reservoirs, especially in the central nervous system (CNS), where viral compartmentalization and independent evolution of infection may lead to CNS viral escape, often associated with neurologic symptoms. The authors reviewed studies of virological and functional CNS efficacy of LDR, particularly of boosted PI monotherapy regimens, for which more information is available. Symptomatic viral CSF escape was observed mainly in PI/r monotherapy patients with plasma failure and low nadir CD4+ cell counts, and resolved upon reintroduction of triple drug cART, whereas asymptomatic viral failure in CSF was not significantly more frequent in patients on PI/r monotherapy compared with patients on standard cART. In addition, there was no difference in functional outcomes between PI monotherapy and cART patients, irrespective of CSF viral escape. More data are needed on the CNS effect of dual ART regimens and, in general, on long-term efficacy of LDR. Simplification with LDR may be an attractive option in patients with suppressed viral load, if they are well selected and monitored for potential CNS complications.

  18. Difference-based Model Synchronization in an Industrial MDD Process

    DEFF Research Database (Denmark)

    Könemann, Patrick; Kindler, Ekkart; Unland, Ludger

    2009-01-01

    Models play a central role in model-driven software engineering. There are different kinds of models during the development process, which are related to each other and change over time. Therefore, it is difficult to keep the different models consistent with each other. Consistency of different models is maintained manually in many cases today. This paper presents an approach for automated model differencing, so that the differences between two model versions can be extracted and stored. It can then be re-used independently of the models it was created from to interactively merge different model versions, and for synchronizing other types of models. The main concern is to apply our concepts to an industrial process, in particular keeping usability and performance in mind. Keywords: Model Differencing, Model Merging, Model Synchronization.
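
    A toy illustration (not the paper's implementation) of the core idea: compute a difference between two model versions as a standalone object, then re-apply it independently of the models it came from.

```python
def diff(old, new):
    """Difference of two models represented as {element: value} dicts."""
    changes = {}
    for key in old.keys() | new.keys():
        if old.get(key) != new.get(key):
            changes[key] = (old.get(key), new.get(key))  # (before, after)
    return changes

def patch(model, changes):
    """Apply a stored difference to any model with matching elements."""
    result = dict(model)
    for key, (before, after) in changes.items():
        if after is None:
            result.pop(key, None)   # element was deleted
        else:
            result[key] = after     # element was added or changed
    return result

v1 = {"Class:Order": "abstract=false", "Attr:Order.id": "int"}
v2 = {"Class:Order": "abstract=true", "Attr:Order.total": "float"}
delta = diff(v1, v2)          # storable, independent of v1 and v2
print(patch(v1, delta) == v2)  # True
```

    Because `delta` stands alone, it can be replayed on another model version, which is the basis for interactive merging and synchronization.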

  19. Modelling, Optimization and Optimal Control of Small Scale Stirred Tank Bioreactors

    Directory of Open Access Journals (Sweden)

    Mitko Petrov

    2004-10-01

    Full Text Available Models of mass transfer in a stirred tank bioreactor are developed as functions of general indexes of the aeration and mixing processes, under specific simplifications of the hydrodynamic flow structure. After parameter identification, the combined model is used to optimize the parameters of the apparatus construction. The optimization problem is solved using fuzzy set theory, which accounts for the uncertainties left unspecified by the model simplification. Finally, optimal control of a fed-batch E. coli fermentation process is computed using neuro-dynamic programming. The results after optimization show a considerable improvement in the mass-transfer indexes and in the end-of-process quantity indexes.

  20. Process Model Construction and Optimization Using Statistical Experimental Design

    Science.gov (United States)

    1988-04-01

    Memo No. 88-442, March 1988. Emanuel Sachs and George Prueger. Abstract: A methodology is presented for the construction of process models by the combination of physically based mechanistic models and statistical experimental design.
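
    A minimal sketch (our illustration, not the memo's methodology) of fitting a process model from a two-level full-factorial experimental design: main effects and the two-factor interaction are estimated by least squares; the factor names and response values are hypothetical.

```python
import numpy as np

# Coded factor levels (-1/+1) for a 2^2 design, e.g. temperature and pressure.
runs = np.array([[-1, -1], [+1, -1], [-1, +1], [+1, +1]], dtype=float)
y = np.array([10.2, 14.1, 11.0, 18.3])  # hypothetical measured responses

# Process model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2
X = np.column_stack([np.ones(len(runs)), runs[:, 0], runs[:, 1],
                     runs[:, 0] * runs[:, 1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b12 = coef
print(f"intercept={b0:.2f}, effects: x1={b1:.2f}, x2={b2:.2f}, x1*x2={b12:.2f}")
```

    In the combined approach the abstract alludes to, such empirically fitted terms would augment a physically based mechanistic model rather than replace it.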