WorldWideScience

Sample records for unit process models

  1. Stochastic Analysis of a Queue Length Model Using a Graphics Processing Unit

    Czech Academy of Sciences Publication Activity Database

    Přikryl, Jan; Kocijan, J.

    2012-01-01

    Vol. 5, No. 2 (2012), pp. 55-62. ISSN 1802-971X. R&D Projects: GA MŠk(CZ) MEB091015. Institutional support: RVO:67985556. Keywords: graphics processing unit; GPU; Monte Carlo simulation; computer simulation; modeling. Subject RIV: BC - Control Systems Theory. http://library.utia.cas.cz/separaty/2012/AS/prikryl-stochastic analysis of a queue length model using a graphics processing unit.pdf

  2. Modelling of a Naphtha Recovery Unit (NRU) with Implications for Process Optimization

    Directory of Open Access Journals (Sweden)

    Jiawei Du

    2018-06-01

    The naphtha recovery unit (NRU) is an integral part of the processes used in the oil sands industry for bitumen extraction. The principal role of the NRU is to recover naphtha from the tailings for reuse in this process. This process is energy-intensive, and environmental guidelines for naphtha recovery must be met. Steady-state models for the NRU system are developed in this paper using two different approaches. The first approach is a statistical, data-based modelling approach where linear regression models have been developed using Minitab® from plant data collected during a performance test. The second approach involves the development of a first-principles model in Aspen Plus® based on the NRU process flow diagram. A novel refinement to this latter model, called “withdraw and remix”, is proposed based on comparing actual plant data to model predictions around the two units used to separate water and naphtha. The models developed in this paper suggest some interesting ideas for the further optimization of the process, in that it may be possible to achieve the required naphtha recovery using less energy. More plant tests are required to validate these ideas.

  3. Testing a model of componential processing of multi-symbol numbers-evidence from measurement units.

    Science.gov (United States)

    Huber, Stefan; Bahnmueller, Julia; Klein, Elise; Moeller, Korbinian

    2015-10-01

    Research on numerical cognition has addressed the processing of nonsymbolic quantities and symbolic digits extensively. However, magnitude processing of measurement units is still a neglected topic in numerical cognition research. Hence, we investigated the processing of measurement units to evaluate whether typical effects of multi-digit number processing such as the compatibility effect, the string length congruity effect, and the distance effect are also present for measurement units. In three experiments, participants had to single out the larger one of two physical quantities (e.g., lengths). In Experiment 1, the compatibility of number and measurement unit (compatible: 3 mm_6 cm, where the larger number goes with the larger unit) as well as string length congruity (congruent: 1 m_2 km, where the physically larger quantity is also the longer character string) were manipulated. We observed reliable compatibility effects with prolonged reaction times (RT) for incompatible trials. Moreover, a string length congruity effect was present in RT with longer RT for incongruent trials. Experiments 2 and 3 served as control experiments showing that compatibility effects persist when controlling for holistic distance and that a distance effect for measurement units exists. Our findings indicate that numbers and measurement units are processed in a componential manner and thus highlight that processing characteristics of multi-digit numbers generalize to measurement units. Thereby, our data lend further support to the recently proposed generalized model of componential multi-symbol number processing.
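
    As a concrete illustration of the comparison task, the sketch below converts two length strings to a common base unit and picks the physically larger one; the unit table and trial strings are invented for illustration and are not the study's stimuli.

    ```python
    # Illustrative sketch (not the study's materials): deciding which of two
    # physical quantities is larger, as in the comparison task of Experiment 1.
    UNIT_TO_METERS = {"mm": 1e-3, "cm": 1e-2, "m": 1.0, "km": 1e3}

    def larger_quantity(q1: str, q2: str) -> str:
        """Return the physically larger of two length strings like '3 mm'."""
        def to_meters(q: str) -> float:
            value, unit = q.split()
            return float(value) * UNIT_TO_METERS[unit]
        return q1 if to_meters(q1) > to_meters(q2) else q2

    # A number-unit incompatible trial: the larger number (6) belongs to the
    # physically smaller quantity, which slows human responses.
    print(larger_quantity("6 mm", "3 cm"))  # -> '3 cm'
    ```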

  4. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret…
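
    One standard model choice for the adsorption step in this setting is a Langmuir isotherm; the minimal sketch below, with illustrative rather than paper-specific parameters, shows the saturating equilibrium relation such a model contributes.

    ```python
    # Minimal sketch, assuming a Langmuir adsorption isotherm, one common
    # choice for modeling adsorption in porous media; parameters are invented.
    import numpy as np

    def langmuir(c, q_max=50.0, k_eq=2.0):
        """Equilibrium bound concentration q* for mobile-phase concentration c."""
        return q_max * k_eq * c / (1.0 + k_eq * c)

    c = np.linspace(0.0, 5.0, 6)     # mg/mL in the mobile phase
    print(np.round(langmuir(c), 2))  # saturates toward q_max as c grows
    ```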

  5. Modeling of yield and environmental impact categories in tea processing units based on artificial neural networks.

    Science.gov (United States)

    Khanali, Majid; Mobli, Hossein; Hosseinzadeh-Bandbafha, Homa

    2017-12-01

    In this study, an artificial neural network (ANN) model was developed for predicting the yield and life cycle environmental impacts based on the energy inputs required in processing of black tea, green tea, and oolong tea in the Guilan province of Iran. A life cycle assessment (LCA) approach was used to investigate the environmental impact categories of processed tea based on the cradle-to-gate approach, i.e., from the production of input materials from raw materials to the gate of the tea processing units, i.e., packaged tea. Thus, all the tea processing operations such as withering, rolling, fermentation, drying, and packaging were considered in the analysis. The initial data were obtained from tea processing units, while the required data about the background system were extracted from the EcoInvent 2.2 database. LCA results indicated that the diesel fuel and corrugated paper box used in the drying and packaging operations, respectively, were the main hotspots. The black tea processing unit caused the highest pollution among the three processing units. Three feed-forward back-propagation ANN models, based on the Levenberg-Marquardt training algorithm with two hidden layers with sigmoid activation functions and a linear transfer function in the output layer, were applied for the three types of processed tea. The neural networks were developed based on the energy equivalents of eight different input parameters (energy equivalents of fresh tea leaves, human labor, diesel fuel, electricity, adhesive, carton, corrugated paper box, and transportation) and 11 output parameters (yield, global warming, abiotic depletion, acidification, eutrophication, ozone layer depletion, human toxicity, freshwater aquatic ecotoxicity, marine aquatic ecotoxicity, terrestrial ecotoxicity, and photochemical oxidation). The results showed that the developed ANN models, with R² values in the range of 0.878 to 0.990, had excellent performance in predicting all the output variables based on the inputs. Energy consumption for …
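
    A minimal NumPy sketch of the described topology (8 energy inputs, two sigmoid hidden layers, and a linear output layer with 11 responses) is given below; the hidden-layer widths and weights are illustrative placeholders, and the Levenberg-Marquardt training step is omitted.

    ```python
    # Forward pass of an 8 -> hidden -> hidden -> 11 network with sigmoid
    # hidden layers and a linear output layer, mirroring the abstract's
    # topology; layer widths and weights here are illustrative, not fitted.
    import numpy as np

    rng = np.random.default_rng(0)
    sizes = [8, 12, 12, 11]                      # input, hidden, hidden, output
    weights = [rng.normal(scale=0.3, size=(m, n)) for m, n in zip(sizes, sizes[1:])]
    biases = [np.zeros(n) for n in sizes[1:]]

    def forward(x):
        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
        for w, b in zip(weights[:-1], biases[:-1]):
            x = sigmoid(x @ w + b)               # sigmoid hidden layers
        return x @ weights[-1] + biases[-1]      # linear output layer

    print(forward(rng.random(8)).shape)          # -> (11,)
    ```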

  6. Model of a programmable quantum processing unit based on a quantum transistor effect

    Science.gov (United States)

    Ablayev, Farid; Andrianov, Sergey; Fetisov, Danila; Moiseev, Sergey; Terentyev, Alexandr; Urmanchev, Andrey; Vasiliev, Alexander

    2018-02-01

    In this paper we propose a model of a programmable quantum processing device realizable with existing nano-photonic technologies. It can be viewed as a basis for new high-performance hardware architectures. Protocols for the physical implementation of the device, based on controlled photon transfer and atomic transitions, are presented. These protocols are designed for executing the basic single-qubit and multi-qubit gates forming a universal set. We analyze the possible operation of this quantum computer scheme. Then we formalize the physical architecture by a mathematical model of a Quantum Processing Unit (QPU), which we use as a basis for the Quantum Programming Framework. This framework makes it possible to perform universal quantum computations in a multitasking environment.

  7. Product- and Process Units in the CRITT Translation Process Research Database

    DEFF Research Database (Denmark)

    Carl, Michael

    The first version of the "Translation Process Research Database" (TPR DB v1.0) was released in August 2012, containing logging data of more than 400 translation and text production sessions. The current version of the TPR DB (v1.4) contains data from more than 940 sessions, which represents more than 300 hours of text production. The database provides the raw logging data, as well as tables of pre-processed product- and processing units. The TPR-DB includes various types of simple and composed product and process units that are intended to support the analysis and modelling of human text…

  8. Developing a Comprehensive Model of Intensive Care Unit Processes: Concept of Operations.

    Science.gov (United States)

    Romig, Mark; Tropello, Steven P; Dwyer, Cindy; Wyskiel, Rhonda M; Ravitz, Alan; Benson, John; Gropper, Michael A; Pronovost, Peter J; Sapirstein, Adam

    2015-04-23

    This study aimed to use a systems engineering approach to improve performance and stakeholder engagement in the intensive care unit to reduce several different patient harms. We developed a conceptual framework or concept of operations (ConOps) to analyze different types of harm that included 4 steps as follows: risk assessment, appropriate therapies, monitoring and feedback, as well as patient and family communications. This framework used a transdisciplinary approach to inventory the tasks and work flows required to eliminate 7 common types of harm experienced by patients in the intensive care unit. The inventory gathered both implicit and explicit information about how the system works or should work and converted the information into a detailed specification that clinicians could understand and use. Using the ConOps document, we created highly detailed work flow models to reduce harm and offer an example of its application to deep venous thrombosis. In the deep venous thrombosis model, we identified tasks that were synergistic across different types of harm. We will use a system of systems approach to integrate the variety of subsystems and coordinate processes across multiple types of harm to reduce the duplication of tasks. Through this process, we expect to improve efficiency and demonstrate synergistic interactions that ultimately can be applied across the spectrum of potential patient harms and patient locations. Engineering health care to be highly reliable will first require an understanding of the processes and work flows that comprise patient care. The ConOps strategy provided a framework for building complex systems to reduce patient harm.

  9. Thermal unit availability modeling in a regional simulation model

    International Nuclear Information System (INIS)

    Yamayee, Z.A.; Port, J.; Robinett, W.

    1983-01-01

    The System Analysis Model (SAM) developed under the umbrella of PNUCC's System Analysis Committee is capable of simulating the operation of a given load/resource scenario. This model employs a Monte-Carlo simulation to incorporate uncertainties. Among the uncertainties modeled is thermal unit availability, both for energy simulations (seasonal) and capacity simulations (hourly). This paper presents the availability modeling in the capacity and energy models. The use of regional and national data in deriving the two availability models, the interaction between the two, and the modifications made to the capacity model to reflect regional practices are presented. A sample problem is presented to show the modification process. Results for modeling a nuclear unit using NERC-GADS are presented.
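
    The hourly availability idea can be sketched as a two-state Monte Carlo draw; the capacity and forced-outage rate below are assumed for illustration and are not SAM's regional data.

    ```python
    # Hedged sketch: sample hourly unit availability from a two-state model
    # with an assumed forced-outage rate, then average the served capacity.
    import random

    def simulate_capacity(hours=8760, capacity_mw=1100.0,
                          forced_outage_rate=0.08, seed=42):
        random.seed(seed)
        served = sum(capacity_mw for _ in range(hours)
                     if random.random() > forced_outage_rate)
        return served / hours                # expected available MW

    print(round(simulate_capacity(), 1))     # ~ capacity * (1 - FOR)
    ```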

  10. On the hazard rate process for imperfectly monitored multi-unit systems

    International Nuclear Information System (INIS)

    Barros, A.; Berenguer, C.; Grall, A.

    2005-01-01

    The aim of this paper is to present a stochastic model to characterize the failure distribution of multi-unit systems when the current units' state is imperfectly monitored. The definition of the hazard rate process existing with perfect monitoring is extended to the realistic case where the units' failure times are not always detected (non-detection events). The observed hazard rate process defined in this way gives a better representation of the system behavior than the classical failure rate calculated without any information on the units' state, and than the hazard rate process based on perfect monitoring information. The quality of this representation is, however, conditioned by the monotonicity property of the process. This problem is discussed and illustrated on a practical example (two parallel units). The results obtained motivate the use of the observed hazard rate process to characterize the stochastic behavior of multi-unit systems and to optimize, for example, preventive maintenance policies
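
    For the two-parallel-unit example, the contrast between the classical failure rate and the hazard rate process under perfect monitoring can be made concrete; the sketch below assumes i.i.d. exponential units with an illustrative rate.

    ```python
    # Illustrative sketch for two parallel units with i.i.d. exponential
    # lifetimes (rate lam): the classical system failure rate is a smooth
    # curve, whereas under perfect monitoring the hazard rate process jumps
    # from 0 to lam once the first unit failure is observed.
    import math

    lam = 0.1

    def classical_hazard(t):
        # survival of a two-unit parallel system: S(t) = 1 - (1 - e^{-lam t})^2
        s = 2 * math.exp(-lam * t) - math.exp(-2 * lam * t)
        f = 2 * lam * math.exp(-lam * t) - 2 * lam * math.exp(-2 * lam * t)
        return f / s

    for t in (1.0, 5.0, 20.0):
        print(f"t={t:>4}: classical h(t)={classical_hazard(t):.4f}")
    # Non-detection events blur the monitored jump from 0 to lam, giving the
    # 'observed' hazard rate process studied in the paper.
    ```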

  11. On the hazard rate process for imperfectly monitored multi-unit systems

    Energy Technology Data Exchange (ETDEWEB)

    Barros, A. [Institut des Sciences et Technologies de l'Information de Troyes (ISTIT-CNRS), Equipe de Modelisation et Surete des Systemes, Universite de Technologie de Troyes (UTT), 12, rue Marie Curie, BP2060, 10010 Troyes cedex (France)]. E-mail: anne.barros@utt.fr; Berenguer, C. [Institut des Sciences et Technologies de l'Information de Troyes (ISTIT-CNRS), Equipe de Modelisation et Surete des Systemes, Universite de Technologie de Troyes (UTT), 12, rue Marie Curie, BP2060, 10010 Troyes cedex (France); Grall, A. [Institut des Sciences et Technologies de l'Information de Troyes (ISTIT-CNRS), Equipe de Modelisation et Surete des Systemes, Universite de Technologie de Troyes (UTT), 12, rue Marie Curie, BP2060, 10010 Troyes cedex (France)

    2005-12-01

    The aim of this paper is to present a stochastic model to characterize the failure distribution of multi-unit systems when the current units' state is imperfectly monitored. The definition of the hazard rate process existing with perfect monitoring is extended to the realistic case where the units' failure times are not always detected (non-detection events). The observed hazard rate process defined in this way gives a better representation of the system behavior than the classical failure rate calculated without any information on the units' state, and than the hazard rate process based on perfect monitoring information. The quality of this representation is, however, conditioned by the monotonicity property of the process. This problem is discussed and illustrated on a practical example (two parallel units). The results obtained motivate the use of the observed hazard rate process to characterize the stochastic behavior of multi-unit systems and to optimize, for example, preventive maintenance policies.

  12. Neural Networks in Modelling Maintenance Unit Load Status

    Directory of Open Access Journals (Sweden)

    Anđelko Vojvoda

    2002-03-01

    This paper deals with a way of applying a neural network for describing service station load in a maintenance unit. Data acquired by measuring the workload of single stations in a maintenance unit were used in the process of training the neural network in order to create a model of the observed system. The model developed in this way enables us to make more accurate predictions of critical overload. Modelling was realised by developing and using m-functions of the Matlab software.

  13. Judicial Process, Grade Eight. Resource Unit (Unit V).

    Science.gov (United States)

    Minnesota Univ., Minneapolis. Project Social Studies Curriculum Center.

    This resource unit, developed by the University of Minnesota's Project Social Studies, introduces eighth graders to the judicial process. The unit was designed with two major purposes in mind. First, it helps pupils understand judicial decision-making, and second, it provides for the study of the rights guaranteed by the federal Constitution. Both…

  14. Instruction Set Architectures for Quantum Processing Units

    OpenAIRE

    Britt, Keith A.; Humble, Travis S.

    2017-01-01

    Progress in quantum computing hardware raises questions about how these devices can be controlled, programmed, and integrated with existing computational workflows. We briefly describe several prominent quantum computational models, their associated quantum processing units (QPUs), and the adoption of these devices as accelerators within high-performance computing systems. Emphasizing the interface to the QPU, we analyze instruction set architectures based on reduced and complex instruction sets…

  15. Modeling and experiment to threshing unit of stripper combine ...

    African Journals Online (AJOL)

    Modeling and experiment to threshing unit of stripper combine. ... were conducted with the different feed rates and drum rotator speeds for the rice stripped mixtures. ... and damage as well as for threshing unit design and process optimization.

  16. The Executive Process, Grade Eight. Resource Unit (Unit III).

    Science.gov (United States)

    Minnesota Univ., Minneapolis. Project Social Studies Curriculum Center.

    This resource unit, developed by the University of Minnesota's Project Social Studies, introduces eighth graders to the executive process. The unit uses case studies of presidential decision making such as the decision to drop the atomic bomb on Hiroshima, the Cuba Bay of Pigs and quarantine decisions, and the Little Rock decision. A case study of…

  17. Image processing unit with fall-back.

    NARCIS (Netherlands)

    2011-01-01

    An image processing unit (100, 200, 300) for computing a sequence of output images on the basis of a sequence of input images comprises: a motion estimation unit (102) for computing a motion vector field on the basis of the input images; a quality measurement unit (104) for computing a value of a …

  18. Portable brine evaporator unit, process, and system

    Science.gov (United States)

    Hart, Paul John; Miller, Bruce G.; Wincek, Ronald T.; Decker, Glenn E.; Johnson, David K.

    2009-04-07

    The present invention discloses a comprehensive, efficient, and cost-effective portable evaporator unit, method, and system for the treatment of brine. The evaporator unit, method, and system require a pretreatment process that removes heavy metals, crude oil, and other contaminants in preparation for the evaporator unit. The pretreatment and the evaporator unit, method, and system process metals and brine at the site where they are generated (the well site), thus saving significant money for producers, who can avoid present and future increases in transportation costs.

  19. Semi-automatic film processing unit

    International Nuclear Information System (INIS)

    Mohamad Annuar Assadat Husain; Abdul Aziz Bin Ramli; Mohd Khalid Matori

    2005-01-01

    The design concept applied in the development of a semi-automatic film processing unit requires creativity and user support in channelling the information needed to select materials and an operating system that suit the resulting design. Low cost and efficient operation are challenges that must be met while keeping abreast of fast technological advancement. In producing this processing unit, a few elements need to be considered in order to produce a high-quality image. Consistent movement and correct time coordination for developing and drying are among the elements that need to be controlled. Other elements that need serious attention are temperature, liquid density, and the reaction time of the chemical liquids. Subsequent chemical reactions that take place cause the liquid chemicals to age, and this adversely affects the quality of the image produced. The unit is also equipped with a liquid chemical drainage system and a disposal chemical tank. This unit would be useful in GP clinics, especially in rural areas that still develop film manually and require low operational cost. (Author)

  20. Data Sorting Using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    M. J. Mišić

    2012-06-01

    Graphics processing units (GPUs) have been increasingly used for general-purpose computation in recent years. GPU-accelerated applications are found in both scientific and commercial domains. Sorting is considered one of the very important operations in many applications, so its efficient implementation is essential for overall application performance. This paper represents an effort to analyze and evaluate implementations of representative sorting algorithms on graphics processing units. Three sorting algorithms (Quicksort, Merge sort, and Radix sort) were evaluated on the Compute Unified Device Architecture (CUDA) platform that is used to execute applications on NVIDIA graphics processing units. The algorithms were tested and evaluated using an automated test environment with input datasets of different characteristics. Finally, the results of this analysis are briefly discussed.
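
    To make one of the benchmarked algorithms concrete, here is a plain-Python LSD radix sort; it only illustrates the algorithmic idea, whereas the paper's implementations run on NVIDIA GPUs through CUDA.

    ```python
    # Least-significant-digit radix sort over byte-sized digits (base 256);
    # a CPU illustration of one of the three evaluated algorithms.
    def radix_sort(values, base=256):
        values = list(values)
        if not values:
            return values
        max_val, shift = max(values), 0
        while (max_val >> shift) > 0:
            buckets = [[] for _ in range(base)]
            for v in values:                       # stable distribution pass
                buckets[(v >> shift) % base].append(v)
            values = [v for bucket in buckets for v in bucket]
            shift += 8                             # advance to the next byte
        return values

    print(radix_sort([170, 45, 75, 90, 2, 802, 24, 66]))
    ```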

  21. Modelling Of Monazite Ore Break-Down By Alkali Process Spectrometry

    International Nuclear Information System (INIS)

    Visetpotjanakit, Suputtra; Changkrueng, Kalaya; Pichestapong, Pipat

    2005-10-01

    A computer model has been developed for the calculation of the mass balance of monazite ore break-down by the alkali process at the Rare Earth Research and Development Center. The process includes the following units: ore digestion by concentrated NaOH, dissolution of digested ore by HCl, uranium and thorium precipitation, and crystallization of Na3PO4, which is a by-product of this process. The model, named RRDCMBP, was written in Visual Basic. The program runs on a personal computer and is interactive and easy to use. The user is able to choose any equipment in each unit process and input data to get mass balance results. The model could be helpful in process analysis for further process adjustment and development
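
    The kind of per-unit mass-balance bookkeeping such a program performs can be sketched as below; the digestion split and stream names are invented for illustration and are not RRDCMBP's actual model.

    ```python
    # Hedged sketch of one unit's mass balance in an ore break-down flowsheet;
    # the efficiency value and stream names are illustrative assumptions.
    def digest(ore_kg, naoh_kg, digestion_efficiency=0.95):
        """NaOH digestion step: split the ore feed into digested ore and residue."""
        digested = ore_kg * digestion_efficiency
        return {"digested_ore_kg": digested,
                "residue_kg": ore_kg - digested,
                "spent_naoh_kg": naoh_kg}

    balance = digest(ore_kg=100.0, naoh_kg=60.0)
    # mass is conserved across the ore streams
    assert abs(balance["digested_ore_kg"] + balance["residue_kg"] - 100.0) < 1e-9
    print(balance)
    ```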

  22. Dynamic modeling of ultrafiltration membranes for whey separation processes

    NARCIS (Netherlands)

    Saltik, M.B.; Ozkan, L.; Jacobs, M.; van der Padt, A.

    2017-01-01

    In this paper, we present a control-relevant rigorous dynamic model for an ultrafiltration membrane unit in a whey separation process. The model consists of a set of differential algebraic equations and is developed for online model-based applications such as model-based control and process …

  23. Lipid Processing Technology: Building a Multilevel Modeling Network

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel; Mustaffa, Azizul Azri; Hukkerikar, Amol

    2011-01-01

    …of a computer aided multilevel modeling network consisting of a collection of new and adopted models, methods and tools for the systematic design and analysis of processes employing lipid technology. This is achieved by decomposing the problem into four levels of modeling: 1. pure component properties; 2. mixtures and phase behavior; 3. unit operations; and 4. process synthesis and design. The methods and tools in each level include: for the first level, a lipid database of collected experimental data from the open literature, confidential data from industry and generated data from validated predictive property models; … of these unit operations with respect to performance parameters such as minimum total cost, product yield improvement, operability etc., and process intensification for the retrofit of existing biofuel plants. In the fourth level the information and models developed are used as building blocks…

  24. Plasma Processing of Model Residential Solid Waste

    Science.gov (United States)

    Messerle, V. E.; Mossé, A. L.; Nikonchuk, A. N.; Ustimenko, A. B.; Baimuldin, R. V.

    2017-09-01

    The authors have tested a technology for processing model residential solid waste. They have developed and created a pilot plasma unit based on a plasma chamber incinerator. The waste processing technology has been tested and prepared for commercialization.

  25. Towards simplification of hydrologic modeling: Identification of dominant processes

    Science.gov (United States)

    Markstrom, Steven; Hay, Lauren E.; Clark, Martyn P.

    2016-01-01

    The Precipitation–Runoff Modeling System (PRMS), a distributed-parameter hydrologic model, has been applied to the conterminous US (CONUS). Parameter sensitivity analysis was used to identify: (1) the sensitive input parameters and (2) particular model output variables that could be associated with the dominant hydrologic process(es). Sensitivity values of 35 PRMS calibration parameters were computed using the Fourier amplitude sensitivity test procedure on 110 000 independent hydrologically based spatial modeling units covering the CONUS and then summarized to process (snowmelt, surface runoff, infiltration, soil moisture, evapotranspiration, interflow, baseflow, and runoff) and model performance statistic (mean, coefficient of variation, and autoregressive lag 1). Identified parameters and processes provide insight into model performance at the location of each unit and allow the modeler to identify the most dominant process on the basis of which processes are associated with the most sensitive parameters. The results of this study indicate that: (1) the choice of performance statistic and output variables has a strong influence on parameter sensitivity, (2) the apparent model complexity to the modeler can be reduced by focusing on those processes that are associated with sensitive parameters and disregarding those that are not, (3) different processes require different numbers of parameters for simulation, and (4) some sensitive parameters influence only one hydrologic process, while others may influence many.
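
    The analysis pattern, a Fourier amplitude sensitivity test of parameters against a model output, can be sketched on a toy function; the snippet below assumes the SALib Python package is available, and the two parameter names are invented stand-ins for PRMS's 35 calibration parameters.

    ```python
    # FAST sensitivity analysis on a toy two-parameter function, assuming the
    # SALib package (pip install SALib); this only mirrors the analysis
    # pattern, not the PRMS model itself.
    import numpy as np
    from SALib.sample import fast_sampler
    from SALib.analyze import fast

    problem = {"num_vars": 2,
               "names": ["snow_coef", "soil_store"],   # illustrative names
               "bounds": [[0.0, 1.0], [0.0, 1.0]]}

    X = fast_sampler.sample(problem, 1000)
    Y = np.array([np.sin(2 * np.pi * x[0]) + 0.1 * x[1] for x in X])
    Si = fast.analyze(problem, Y)
    print(Si["S1"])   # first-order sensitivities: parameter 0 dominates
    ```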

  26. Towards simplification of hydrologic modeling: identification of dominant processes

    Directory of Open Access Journals (Sweden)

    S. L. Markstrom

    2016-11-01

    The Precipitation–Runoff Modeling System (PRMS), a distributed-parameter hydrologic model, has been applied to the conterminous US (CONUS). Parameter sensitivity analysis was used to identify: (1) the sensitive input parameters and (2) particular model output variables that could be associated with the dominant hydrologic process(es). Sensitivity values of 35 PRMS calibration parameters were computed using the Fourier amplitude sensitivity test procedure on 110 000 independent hydrologically based spatial modeling units covering the CONUS and then summarized to process (snowmelt, surface runoff, infiltration, soil moisture, evapotranspiration, interflow, baseflow, and runoff) and model performance statistic (mean, coefficient of variation, and autoregressive lag 1). Identified parameters and processes provide insight into model performance at the location of each unit and allow the modeler to identify the most dominant process on the basis of which processes are associated with the most sensitive parameters. The results of this study indicate that: (1) the choice of performance statistic and output variables has a strong influence on parameter sensitivity, (2) the apparent model complexity to the modeler can be reduced by focusing on those processes that are associated with sensitive parameters and disregarding those that are not, (3) different processes require different numbers of parameters for simulation, and (4) some sensitive parameters influence only one hydrologic process, while others may influence many.

  27. Graphics processing unit accelerated three-dimensional model for the simulation of pulsed low-temperature plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Fierro, Andrew, E-mail: andrew.fierro@ttu.edu; Dickens, James; Neuber, Andreas [Center for Pulsed Power and Power Electronics, Department of Electrical and Computer Engineering, Texas Tech University, Lubbock, Texas 79409 (United States)

    2014-12-15

    A 3-dimensional particle-in-cell/Monte Carlo collision simulation that is fully implemented on a graphics processing unit (GPU) is described and used to determine low-temperature plasma characteristics at high reduced electric field, E/n, in nitrogen gas. Details of implementation on the GPU using the NVIDIA Compute Unified Device Architecture framework are discussed with respect to efficient code execution. The software is capable of tracking around 10 × 10⁶ particles with dynamic weighting and a total mesh size larger than 10⁸ cells. Verification of the simulation is performed by comparing the electron energy distribution function and plasma transport parameters to known Boltzmann Equation (BE) solvers. Under the assumption of a uniform electric field and neglecting the build-up of positive ion space charge, the simulation agrees well with the BE solvers. The model is utilized to calculate plasma characteristics of a pulsed, parallel plate discharge. A photoionization model provides the simulation with additional electrons after the initial seeded electron density has drifted towards the anode. Comparison of the performance benefits between a GPU implementation and a CPU implementation is considered, and a speed-up factor of 13 for a 3D relaxation Poisson solver is obtained. Furthermore, a factor 60 speed-up is realized for parallelization of the electron processes.
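
    A CPU-side NumPy sketch of the Jacobi-type relaxation Poisson solve, the kernel for which the paper reports a factor-13 GPU speed-up, is given below on a small illustrative 3-D grid.

    ```python
    # Jacobi relaxation for laplacian(phi) = -rho on a unit-spaced 3-D grid
    # with phi = 0 on the walls; grid size and charge are illustrative.
    import numpy as np

    def relax_poisson(rho, iterations=200):
        phi = np.zeros_like(rho)
        for _ in range(iterations):
            phi[1:-1, 1:-1, 1:-1] = (
                phi[2:, 1:-1, 1:-1] + phi[:-2, 1:-1, 1:-1] +
                phi[1:-1, 2:, 1:-1] + phi[1:-1, :-2, 1:-1] +
                phi[1:-1, 1:-1, 2:] + phi[1:-1, 1:-1, :-2] +
                rho[1:-1, 1:-1, 1:-1]) / 6.0
        return phi

    rho = np.zeros((16, 16, 16))
    rho[8, 8, 8] = 1.0                        # a single point charge
    print(relax_poisson(rho)[8, 8, 8])        # potential at the charge site
    ```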

  28. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we want to present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
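
    The core IPM idea, propagating parameter variation through stacked unit operations by Monte Carlo to estimate an out-of-specification probability, can be sketched as follows; all distributions and specification limits are invented for illustration.

    ```python
    # Hedged sketch: Monte Carlo propagation of parameter variation through
    # two stacked unit operations to estimate the OOS probability; every
    # number below is an illustrative assumption, not a real process's data.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    titer = rng.normal(5.0, 0.4, n)          # upstream output, g/L
    step_yield = rng.normal(0.85, 0.05, n)   # capture-step yield
    purity_gain = rng.normal(0.08, 0.02, n)  # polish-step purity increase

    final_amount = titer * np.clip(step_yield, 0.0, 1.0)
    final_purity = np.clip(0.90 + purity_gain, 0.0, 1.0)

    oos = (final_amount < 3.5) | (final_purity < 0.95)
    print(f"estimated OOS probability: {oos.mean():.3%}")
    ```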

  29. Development of interface technology between unit processes in E-Refining process

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S. H.; Lee, H. S.; Kim, J. G. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    The pyroprocess is composed mainly of four subprocesses: electrolytic reduction, electrorefining, electrowinning, and waste salt regeneration/solidification. The electrorefining process, one of the main processes of the pyroprocess used to recover the useful elements from spent fuel, is under development by the Korea Atomic Energy Research Institute as a subprocess of the pyrochemical treatment of spent PWR fuel. The CERS (Continuous ElectroRefining System) is composed of unit processes such as an electrorefiner, a salt distiller, a melting furnace for the U-ingot, and a U-chlorinator (UCl₃-making equipment), as shown in Fig. 1. In this study, the interface technology between unit processes in the E-Refining system is investigated and developed for the establishment of an integrated E-Refining operation system as a part of integrated pyroprocessing

  30. Generation unit selection via capital asset pricing model for generation planning

    Energy Technology Data Exchange (ETDEWEB)

    Romy Cahyadi; K. Jo Min; Chung-Hsiao Wang; Nick Abi-Samra [College of Engineering, Ames, IA (USA)

    2003-11-01

    The USA's electric power industry is undergoing substantial regulatory and organizational changes. Such changes introduce substantial financial risk into generation planning. In order to incorporate this financial risk into the capital investment decision process of generation planning, this paper develops and analyses a generation unit selection process via the capital asset pricing model (CAPM). In particular, utilizing realistic data on gas-fired, coal-fired, and wind power generation units, the authors show which concrete steps can be taken for generation planning purposes, and how. It is hoped that the generation unit selection process will help utilities in the area of effective and efficient generation planning when financial risks are considered. 20 refs., 14 tabs.
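
    As a worked illustration of the CAPM step, the required return is E[R] = r_f + beta * (E[R_m] - r_f); the betas and market figures below are assumptions for illustration, not the paper's data.

    ```python
    # CAPM required return for candidate generation units; beta values and
    # market assumptions here are invented for illustration.
    def capm_required_return(beta, risk_free=0.04, market_return=0.10):
        return risk_free + beta * (market_return - risk_free)

    for unit, beta in [("gas-fired", 1.3), ("coal-fired", 0.9), ("wind", 0.6)]:
        print(f"{unit:>10}: required return {capm_required_return(beta):.1%}")
    ```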

  31. Generation unit selection via capital asset pricing model for generation planning

    Energy Technology Data Exchange (ETDEWEB)

    Cahyadi, Romy; Jo Min, K. [College of Engineering, Ames, IA (United States); Chunghsiao Wang [LG and E Energy Corp., Louisville, KY (United States); Abi-Samra, Nick [Electric Power Research Inst., Palo Alto, CA (United States)

    2003-07-01

    The electric power industry in many parts of the U.S.A. is undergoing substantial regulatory and organizational changes. Such changes introduce substantial financial risk into generation planning. In order to incorporate the financial risk into the capital investment decision process of generation planning, in this paper we develop and analyse a generation unit selection process via the capital asset pricing model (CAPM). In particular, utilizing realistic data on gas-fired, coal-fired, and wind power generation units, we show which concrete steps can be taken for generation planning purposes, and how. It is hoped that the generation unit selection process developed in this paper will help utilities in the area of effective and efficient generation planning when financial risks are considered. (Author)

  32. Modelling a process for dimerisation of 2-methylpropene

    Energy Technology Data Exchange (ETDEWEB)

    Ouni, T.

    2005-07-01

    Isooctane can be used to replace methyl tert-butyl ether (MTBE) as a fuel additive. Isooctane is produced by hydrogenating isooctene, which in turn is produced by dimerizing 2-methylpropene. In the dimerization, two 2-methylpropene molecules react on an ion-exchange resin catalyst to produce isooctene isomers (2,4,4-trimethyl-1-pentene and 2,4,4-trimethyl-2-pentene). The presence of 2-methyl-2-propanol (TBA) improves the reaction selectivity. Trimers and tetramers are formed as side products. Water and the alkenes are in reaction equilibrium with the corresponding alcohols. The process configuration for isooctene production is a side-reactor concept and consists of a reactor part, a separation part (distillation tower), and a recycle structure. The units of the miniplant at Helsinki University of Technology imitate the actual units of the isooctene production line on a smaller scale, providing valuable information about the process and about the behaviour of individual units, as well as about the dynamics and operability of the process. The ideology behind the miniplant is to separate thermodynamic models from hardware-specific models, so that they can be used as such in other contexts, e.g. on an industrial scale. In the specific case of 2-methylpropene dimerisation, the key thermodynamic models are vapour-liquid and liquid-liquid equilibrium as well as reaction kinetics. Hardware-specific models include a distillation column with spring-shaped packings and a tubular catalytic reactor with a heating coil and a thermowell. Developing these models through experiments and simulations was the primary target of this work. (orig.)

  33. PREMATH: a Precious-Material Holdup Estimator for unit operations and chemical processes

    International Nuclear Information System (INIS)

    Krichinsky, A.M.; Bruns, D.D.

    1982-01-01

    A computer program, PREMATH (Precious Material Holdup Estimator), has been developed to permit inventory estimation in vessels involved in unit operations and chemical processes. This program has been implemented in an operating nuclear fuel processing plant. PREMATH's purpose is to provide steady-state composition estimates for material residing in process vessels until representative samples can be obtained and chemical analyses can be performed. Since these compositions are used for inventory estimation, the results are determined for and cataloged in container-oriented files. The estimated compositions represent material collected in applicable vessels - including consideration for material previously acknowledged in these vessels. The program utilizes process measurements and simple material balance models to estimate material holdups and distribution within unit operations. During simulated run testing, PREMATH-estimated inventories typically produced material balances within 7% of the associated measured material balances for uranium and within 16% of the associated, measured material balances for thorium (a less valuable material than uranium) during steady-state process operation

  34. NUMATH: a nuclear-material-holdup estimator for unit operations and chemical processes

    International Nuclear Information System (INIS)

    Krichinsky, A.M.

    1981-01-01

    A computer program, NUMATH (Nuclear Material Holdup Estimator), has been developed to permit inventory estimation in vessels involved in unit operations and chemical processes. This program has been implemented in an operating nuclear fuel processing plant. NUMATH's purpose is to provide steady-state composition estimates for material residing in process vessels until representative samples can be obtained and chemical analyses can be performed. Since these compositions are used for inventory estimation, the results are determined for and cataloged in container-oriented files. The estimated compositions represent material collected in applicable vessels, including consideration for material previously acknowledged in these vessels. The program utilizes process measurements and simple material balance models to estimate material holdups and distribution within unit operations. During simulated run testing, NUMATH-estimated inventories typically produced material balances within 7% of the associated measured material balances for uranium and within 16% of the associated measured material balances for thorium during steady-state process operation.

  35. Modeling the Hydrologic Processes of a Permeable Pavement ...

    Science.gov (United States)

    A permeable pavement system can capture stormwater to reduce runoff volume and flow rate, improve onsite groundwater recharge, and enhance pollutant controls within the site. A new unit process model for evaluating the hydrologic performance of a permeable pavement system has been developed in this study. The developed model can continuously simulate infiltration through the permeable pavement surface, exfiltration from the storage to the surrounding in situ soils, and clogging impacts on infiltration/exfiltration capacity at the pavement surface and the bottom of the subsurface storage unit. The exfiltration modeling component simulates vertical and horizontal exfiltration independently based on Darcy's formula with the Green-Ampt approximation. The developed model can be arranged with physically based modeling parameters, such as hydraulic conductivity, Manning's friction flow parameters, saturated and field-capacity volumetric water contents, porosity, density, etc. The developed model was calibrated using high-frequency observed data. The modeled water depths are well matched with the observed values (R² = 0.90). The modeling results show that horizontal exfiltration through the side walls of the subsurface storage unit is a prevailing factor in determining the hydrologic performance of the system, especially where the storage unit is developed in a long, narrow shape, or with a high risk of bottom compaction and clogging. This paper presents unit …
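
    A minimal sketch of such a storage water balance, with a Darcy-type exfiltration term standing in for the Green-Ampt approximation, is shown below; the conductivity, time step, and storage depth are illustrative assumptions.

    ```python
    # Hourly water balance of a subsurface storage unit: inflow fills the
    # storage (capped at its depth) and a Darcy-type term drains it from the
    # bottom; parameter values are illustrative, not the paper's calibration.
    def storage_depth_series(inflows_mm, k_sat_mm_per_hr=5.0, dt_hr=1.0,
                             max_depth_mm=300.0):
        depth, series = 0.0, []
        for inflow in inflows_mm:
            depth = min(depth + inflow, max_depth_mm)          # captured stormwater
            depth = max(depth - k_sat_mm_per_hr * dt_hr, 0.0)  # bottom exfiltration
            series.append(depth)
        return series

    print(storage_depth_series([20, 40, 10, 0, 0, 0]))  # depth recedes between storms
    ```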

  36. A General Accelerated Degradation Model Based on the Wiener Process.

    Science.gov (United States)

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-12-06

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
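
    A hedged sketch of the model class, a Wiener process with drift on a transformed (possibly nonlinear) time scale Lambda(t) = t^q, is given below; the parameter values are illustrative.

    ```python
    # Simulate one degradation path X(t) = d * Lambda(t) + sigma * B(Lambda(t))
    # with a nonlinear time scale Lambda(t) = t**q, echoing the general
    # (linear or nonlinear) Wiener model; parameters are illustrative.
    import numpy as np

    def simulate_path(t, drift=0.5, sigma=0.2, q=1.3, seed=0):
        rng = np.random.default_rng(seed)
        lam = t ** q                                  # transformed time scale
        increments = rng.normal(drift * np.diff(lam),
                                sigma * np.sqrt(np.diff(lam)))
        return np.concatenate(([0.0], np.cumsum(increments)))

    t = np.linspace(0.0, 10.0, 101)
    print(simulate_path(t)[-1])   # degradation level reached at t = 10
    ```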

  37. A General Accelerated Degradation Model Based on the Wiener Process

    Directory of Open Access Journals (Sweden)

    Le Liu

    2016-12-01

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.

  38. Reforging the Wedding Ring: Exploring a Semi-Artificial Model of Population for the United Kingdom with Gaussian process emulators

    Directory of Open Access Journals (Sweden)

    Viet Dung Cao

    2013-10-01

    Background: We extend the 'Wedding Ring' agent-based model of marriage formation to include some empirical information on the natural population change for the United Kingdom together with behavioural explanations that drive the observed nuptiality trends. Objective: We propose a method to explore statistical properties of agent-based demographic models. By coupling rule-based explanations driving the agent-based model with observed data we wish to bring agent-based modelling and demographic analysis closer together. Methods: We present a Semi-Artificial Model of Population, which aims to bridge demographic micro-simulation and agent-based traditions. We then utilise a Gaussian process emulator - a statistical model of the base model - to analyse the impact of selected model parameters on two key model outputs: population size and share of married agents. A sensitivity analysis is attempted, aiming to assess the relative importance of different inputs. Results: The resulting multi-state model of population dynamics has enhanced predictive capacity as compared to the original specification of the Wedding Ring, but there are some trade-offs between the outputs considered. The sensitivity analysis allows identification of the most important parameters in the modelled marriage formation process. Conclusions: The proposed methods allow for generating coherent, multi-level agent-based scenarios aligned with some aspects of empirical demographic reality. Emulators permit a statistical analysis of their properties and help select plausible parameter values. Comments: Given non-linearities in agent-based models such as the Wedding Ring, and the presence of feedback loops, the uncertainty in the model may not be directly computable by using traditional statistical methods. The use of statistical emulators offers a way forward.
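
    The emulator idea can be sketched with an off-the-shelf Gaussian process regressor standing in for the authors' emulator: fit a GP to a handful of (parameter, model output) pairs and predict with uncertainty. The toy simulator and design points below are invented for illustration.

    ```python
    # GP emulation of an expensive simulator, using scikit-learn as a stand-in
    # for the authors' emulator; the toy simulator is an invented placeholder
    # for an agent-based model run.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def toy_simulator(x):                 # stand-in for an expensive ABM run
        return np.sin(3 * x) + 0.5 * x

    X_train = np.linspace(0, 2, 8).reshape(-1, 1)   # design points
    y_train = toy_simulator(X_train).ravel()

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X_train, y_train)
    mean, std = gp.predict(np.array([[1.1]]), return_std=True)
    print(mean[0], std[0])                # emulator prediction with uncertainty
    ```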

  39. 15 CFR 971.209 - Processing outside the United States.

    Science.gov (United States)

    2010-01-01

    15 CFR, Commerce and Foreign Trade (2010 edition), Deep Seabed Mining Regulations for Commercial Recovery Permits, Applications, Contents: § 971.209 Processing outside the United States. (a) Except as provided in this section…

  40. 40 CFR 63.765 - Glycol dehydration unit process vent standards.

    Science.gov (United States)

    2010-07-01

    40 CFR, Protection of Environment (2010 edition), … Facilities: § 63.765 Glycol dehydration unit process vent standards. (a) This section applies to each glycol dehydration unit subject to this subpart with an actual annual average natural gas flowrate equal to or…

  41. 40 CFR 63.1275 - Glycol dehydration unit process vent standards.

    Science.gov (United States)

    2010-07-01

    40 CFR, Protection of Environment (2010 edition), … Facilities: § 63.1275 Glycol dehydration unit process vent standards. (a) This section applies to each glycol dehydration unit subject to this subpart with an actual annual average natural gas flowrate equal to or…

  42. Proton Testing of Advanced Stellar Compass Digital Processing Unit

    DEFF Research Database (Denmark)

    Thuesen, Gøsta; Denver, Troelz; Jørgensen, Finn E

    1999-01-01

    The Advanced Stellar Compass Digital Processing Unit was radiation tested with 300 MeV protons at the Proton Irradiation Facility (PIF), Paul Scherrer Institute, Switzerland.

  43. Landform Evolution Modeling of Specific Fluvially Eroded Physiographic Units on Titan

    Science.gov (United States)

    Moore, J. M.; Howard, A. D.; Schenk, P. M.

    2015-01-01

    Several recent studies have proposed that certain terrain types (i.e., physiographic units) on Titan are formed by fluvial processes acting on local uplands of bedrock or, in some cases, sediment. We have earlier used our landform evolution models to make general comparisons between Titan and other ice-world landscapes (principally those of the Galilean satellites) on which we have modeled the action of fluvial processes. Here we give examples of specific landscapes that, subsequent to modeled fluvial work acting on the surfaces, produce landscapes which resemble mapped terrain types on Titan.

  44. The Best Practice Unit: a model for learning, research and development

    Directory of Open Access Journals (Sweden)

    Jean Pierre Wilken

    2013-06-01

    The Best Practice Unit (BPU) model constitutes a unique form of practice-based research. A variant of the Community of Practice model developed by Wenger, McDermott and Snyder (2002), the BPU has the specific aim of improving professional practice by combining innovation and research. The model is used as a way of working by a group of professionals, researchers and other relevant individuals, who over a period of one to two years work together towards a desired improvement. The model is characterized by interaction between individual and collective learning processes, the development of new or improved working methods, and the implementation of these methods in daily practice. Multiple knowledge resources are used, including experiential knowledge, professional knowledge and scientific knowledge. The research serves diverse purposes: articulating tacit knowledge, documenting learning and innovation processes, systematically describing the working methods that have been revealed or developed, and evaluating the efficacy of the new methods. Each BPU is supported by a facilitator, whose main task is to optimize learning processes. An analysis of ten different BPUs in different professional fields shows that this is a successful model. The article describes the methodology and results of this study.

  45. A decision modeling for phasor measurement unit location selection in smart grid systems

    Science.gov (United States)

    Lee, Seung Yup

    As a key technology for enhancing smart grid systems, the Phasor Measurement Unit (PMU) provides synchronized phasor measurements of voltages and currents across a wide-area electric power grid. Despite the various benefits of its application, one of the critical issues in utilizing PMUs is the optimal selection of unit sites. The main aim of this research is to develop a decision support system which can be used in resource allocation tasks for smart grid system analysis. As an effort to suggest a robust decision model and standardize the decision modeling process, a harmonized modeling framework, which considers the operational circumstances of components, is proposed in connection with a deterministic approach utilizing integer programming. With the results obtained from the optimal PMU placement problem, the advantages and potential of the harmonized modeling process are assessed and discussed.
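
    A minimal sketch of the optimal PMU placement core, finding the fewest buses whose PMUs observe every bus, is given below; it brute-forces a small invented 5-bus topology, whereas the integer-programming formulation mentioned above solves the same covering model at scale.

    ```python
    # Optimal PMU placement as a set-covering toy: a PMU observes its own bus
    # and all neighbours; find the smallest set covering every bus. The 5-bus
    # topology is invented for illustration.
    from itertools import combinations

    adj = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 4}, 3: {1}, 4: {2}}

    def observed(pmus):
        covered = set()
        for b in pmus:
            covered |= {b} | adj[b]
        return covered == set(adj)

    for k in range(1, len(adj) + 1):          # try the smallest sets first
        solution = next((set(c) for c in combinations(adj, k) if observed(c)), None)
        if solution:
            print(f"minimum PMUs: {sorted(solution)}")   # -> [1, 2]
            break
    ```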

  46. Optimization models of the supply of power structures’ organizational units with centralized procurement

    Directory of Open Access Journals (Sweden)

    Sysoiev Volodymyr

    2013-01-01

    Management of the state power structures' organizational units for materiel and technical support requires the use of effective decision-support tools, owing to the complexity, interdependence, and dynamism of supply in a market economy. The corporate nature of power structures makes centralized procurement management of particular interest, as it provides significant advantages through coordination, elimination of duplication, and economies of scale. This article presents optimization models for the supply of state power structures' organizational units with centralized procurement, for different levels of the modeled materiel and technical support processes. The models allow us to find the most profitable supply options for the organizational units in a centre-oriented logistics system under changing needs, volumes of allocated funds, and logistics costs, by maximizing the provision level of the organizational units with the necessary material and technical resources over the entire planning period and by minimizing the total logistics costs, taking into account the diverse nature and different priorities of the organizational units and of the material and technical resources.

  47. Modelling of an industrial NGL-Recovery unit considering environmental and economic impacts

    International Nuclear Information System (INIS)

    Sharratt, P. N.; Hernandez-Enriquez, A.; Flores-Tlacuahuac, A.

    2009-01-01

    In this work, an integrated model is presented that identifies key areas in the operation of a cryogenic NGL-recovery unit. This methodology sets out to provide a deep understanding of the various interrelationships across multiple plant operating factors, including reliability, which could be essential for substantial improvement of process performance. The integrated model has been developed to predict the economic and environmental impacts of a real cryogenic unit (600 MMCUF/D) during normal operation, and has been built in Aspen™. (Author)

  48. Conceptual Model of Quantities, Units, Dimensions, and Values

    Science.gov (United States)

    Rouquette, Nicolas F.; DeKoenig, Hans-Peter; Burkhart, Roger; Espinoza, Huascar

    2011-01-01

    JPL collaborated with experts from industry and other organizations to develop a conceptual model of quantities, units, dimensions, and values based on the current work of the ISO 80000 committee revising the International System of Units & Quantities based on the International Vocabulary of Metrology (VIM). By providing support for ISO 80000 in SysML via the International Vocabulary of Metrology (VIM), this conceptual model provides, for the first time, a standard-based approach for addressing issues of unit coherence and dimensional analysis into the practice of systems engineering with SysML-based tools. This conceptual model provides support for two kinds of analyses specified in the International Vocabulary of Metrology (VIM): coherence of units as well as of systems of units, and dimension analysis of systems of quantities. To provide a solid and stable foundation, the model for defining quantities, units, dimensions, and values in SysML is explicitly based on the concepts defined in VIM. At the same time, the model library is designed in such a way that extensions to the ISQ (International System of Quantities) and SI Units (Système International d'Unités) can be represented, as well as any alternative systems of quantities and units. The model library can be used to support SysML user models in various ways. A simple approach is to define and document libraries of reusable systems of units and quantities for reuse across multiple projects, and to link units and quantity kinds from these libraries to Unit and QuantityKind stereotypes defined in SysML user models.
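
    The kind of unit-coherence and dimensional-analysis checking this conceptual model enables can be illustrated with a small, hypothetical Quantity class carrying dimension exponents; this is a sketch of the idea, not the SysML model library itself.

    ```python
    # Illustrative sketch (not the SysML library): quantities carry dimension
    # exponent vectors (length, mass, time); addition demands matching
    # dimensions, while multiplication adds the exponents.
    class Quantity:
        def __init__(self, value, dim):
            self.value, self.dim = value, dim    # dim = (L, M, T) exponents

        def __add__(self, other):
            if self.dim != other.dim:
                raise TypeError(f"incoherent units: {self.dim} vs {other.dim}")
            return Quantity(self.value + other.value, self.dim)

        def __mul__(self, other):
            dim = tuple(a + b for a, b in zip(self.dim, other.dim))
            return Quantity(self.value * other.value, dim)

    metre = Quantity(1.0, (1, 0, 0))
    second = Quantity(1.0, (0, 0, 1))
    print((metre * metre).dim)                   # (2, 0, 0): an area
    try:
        metre + second                           # dimensional mismatch
    except TypeError as err:
        print(err)
    ```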

  49. Determinantal point process models on the sphere

    DEFF Research Database (Denmark)

    Møller, Jesper; Nielsen, Morten; Porcu, Emilio

    We consider determinantal point processes on the d-dimensional unit sphere S^d. These are finite point processes exhibiting repulsiveness and with moment properties determined by a certain determinant whose entries are specified by a so-called kernel which we assume is a complex covariance function defined on S^d × S^d. We review the appealing properties of such processes, including their specific moment properties, density expressions and simulation procedures. Particularly, we characterize and construct isotropic DPP models on S^d, where it becomes essential to specify the eigenvalues and eigenfunctions in a spectral representation for the kernel, and we figure out how repulsive isotropic DPPs can be. Moreover, we discuss the shortcomings of adapting existing models for isotropic covariance functions and consider strategies for developing new models, including a useful spectral approach.

  50. Non-linear Loudspeaker Unit Modelling

    DEFF Research Database (Denmark)

    Pedersen, Bo Rohde; Agerkvist, Finn T.

    2008-01-01

    Simulations of a 6½-inch loudspeaker unit are performed and compared with a displacement measurement. The non-linear loudspeaker model is based on the major nonlinear functions and expanded with time-varying suspension behaviour and flux modulation. The results are presented as FFT plots for three frequencies and different displacement levels. The model errors are discussed and analysed, including a test with a loudspeaker unit where the diaphragm is removed.

  11. On Tour... Primary Hardwood Processing, Products and Recycling Unit

    Science.gov (United States)

    Philip A. Araman; Daniel L. Schmoldt

    1995-01-01

    Housed within the Department of Wood Science and Forest Products at Virginia Polytechnic Institute is a three-person USDA Forest Service research work unit (with one vacancy) devoted to hardwood processing and recycling research. Phil Araman is the project leader of this truly unique and productive unit, titled "Primary Hardwood Processing, Products and Recycling." The...

  12. Graphics processing units in bioinformatics, computational biology and systems biology.

    Science.gov (United States)

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), which limits their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools reviewed here is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  13. Tomography system having an ultrahigh-speed processing unit

    International Nuclear Information System (INIS)

    Brunnett, C.J.; Gerth, V.W. Jr.

    1977-01-01

    A transverse section tomography system has an ultrahigh-speed data processing unit for performing back projection and updating. An x-ray scanner directs x-ray beams through a planar section of a subject from a sequence of orientations and positions. The data processing unit includes a scan storage section for retrievably storing a set of filtered scan signals in scan storage locations corresponding to predetermined beam orientations. An array storage section is provided for storing image signals as they are generated
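
    As context for the hardware design, back projection is the inner loop the unit accelerates: each filtered projection value is smeared back across the image grid along its beam path and accumulated. A rough software analogue for parallel-beam geometry is sketched below; it is only an illustration of the operation, not the patented storage-and-update architecture described in this record.

```python
import numpy as np

def back_project(filtered_sinogram: np.ndarray, angles_rad: np.ndarray,
                 size: int) -> np.ndarray:
    """Accumulate filtered parallel-beam projections into an image.

    filtered_sinogram: (n_angles, n_detectors), already convolution-filtered.
    Nearest-neighbour detector lookup is used for brevity.
    """
    n_det = filtered_sinogram.shape[1]
    image = np.zeros((size, size))
    coords = np.arange(size) - size / 2.0       # pixel grid centred at origin
    X, Y = np.meshgrid(coords, coords)
    for proj, theta in zip(filtered_sinogram, angles_rad):
        # Detector coordinate of each pixel for this beam orientation.
        t = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2.0
        idx = np.clip(np.round(t).astype(int), 0, n_det - 1)
        image += proj[idx]                      # the "update" step
    return image * np.pi / len(angles_rad)

# Tiny smoke test: uniform projections give a roughly uniform disc.
angles = np.linspace(0.0, np.pi, 60, endpoint=False)
sino = np.ones((60, 64))
print(back_project(sino, angles, 64).shape)     # (64, 64)
```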

  14. Process modeling study of the CIF incinerator

    International Nuclear Information System (INIS)

    Hang, T.

    1995-01-01

    The Savannah River Site (SRS) plans to begin operating the Consolidated Incineration Facility (CIF) in 1996. The CIF will treat liquid and solid low-level radioactive, mixed and RCRA hazardous wastes generated at SRS. In addition to experimental test programs, process modeling was applied to provide guidance in the areas of safety, environmental regulatory compliance, process improvement, and optimization. A steady-state flowsheet model was used to calculate material/energy balances and to track key chemical constituents throughout the process units. Dynamic models were developed to predict the CIF transient characteristics in normal and abnormal operation scenarios. Predictions include the rotary kiln heat transfer, dynamic responses of the CIF to fluctuations in the solid waste feed or upsets in the system equipment, performance of the control system, air inleakage in the kiln, etc. This paper reviews the modeling study performed to assist in the deflagration risk assessment

  15. Unit operation in food manufacturing and processing. Shokuhin seizo/kako ni okeru tan'i sosa

    Energy Technology Data Exchange (ETDEWEB)

    Matsuno, R. (Kyoto Univ., Kyoto (Japan). Faculty of Aguriculture)

    1993-09-05

    Processed foods must be produced in quantity, cheaply and safely, and must suit the delicate human palate. Food preferences are affected by personal outlook and attitudes as well as by the surrounding environment, and these factors are reflected in the unit operations of food manufacturing and processing, where many technical difficulties clearly remain. The characteristic of unit operations in food manufacturing and processing is that food materials form a multicomponent system in which very small amounts of aroma components, taste components, vitamins, physiologically active substances and so on are more important than the main components, so that models centred on the most abundant component are inapplicable. The purpose of unit operations in food manufacturing and processing is to produce material properties matched to the human senses, and many problems therefore remain unsolved. The development of analytical technology also influences manufacturing and processing technology. Consequently, food manufacturing and processing technology must be based on general science, and it is necessary to develop unit operations with an understanding of the mutual effects between food and the human body.

  16. Control system design specification of advanced spent fuel management process units

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, S. H.; Kim, S. H.; Yoon, J. S

    2003-06-01

    In this study, the design specifications of the instrumentation and control system for the advanced spent fuel management process units are presented. The advanced spent fuel management process consists of several process units, such as the slitting device, the dry pulverizing/mixing device, the metallizer, etc. The control and operation characteristics of the advanced spent fuel management mockup process devices, and of the process devices developed in 2001 and 2002, are analysed. An integrated processing system for the unit process control signals is proposed, which improves operating efficiency, and a redundant PLC control system is constructed, which improves reliability. A control scheme is proposed for time-delayed systems that compensates for the control performance degradation caused by the time delay. The control system design specification is presented for the advanced spent fuel management process units. These design specifications can be used effectively for the detailed design of the advanced spent fuel management process.
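
    The record does not say which delay-compensation scheme is proposed, so the sketch below shows one classical answer to the stated problem, a Smith predictor: a PI controller acts on an undelayed internal plant model, corrected by the mismatch between the measured output and a delayed copy of the model. All dynamics, gains and delays here are invented for illustration.

```python
# Illustrative Smith predictor for a first-order plant with input delay.
# Not the controller from the report above; all parameters are invented.
from collections import deque

def simulate(steps=400, dt=0.1, delay_steps=10, tau=2.0, K=1.0,
             Kp=2.0, Ki=0.5, setpoint=1.0):
    buf = deque([0.0] * delay_steps, maxlen=delay_steps)  # transport delay
    y = y_model = y_model_delayed = integ = 0.0
    for _ in range(steps):
        # Predictor feedback: undelayed model + (plant - delayed model).
        feedback = y_model + (y - y_model_delayed)
        e = setpoint - feedback
        integ += e * dt
        u = Kp * e + Ki * integ                      # PI on predicted output
        u_delayed = buf[0]
        buf.append(u)                                # shift the delay line
        y += dt * (K * u_delayed - y) / tau          # real (delayed) plant
        y_model += dt * (K * u - y_model) / tau      # undelayed model copy
        y_model_delayed += dt * (K * u_delayed - y_model_delayed) / tau
    return y

print(f"output after settling: {simulate():.3f}")    # approaches 1.0
```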

  17. Employing the intelligence cycle process model within the Homeland Security Enterprise

    OpenAIRE

    Stokes, Roger L.

    2013-01-01

    The purpose of this thesis was to examine the employment and adherence of the intelligence cycle process model within the National Network of Fusion Centers and the greater Homeland Security Enterprise by exploring the customary intelligence cycle process model established by the United States Intelligence Community (USIC). This thesis revealed there are various intelligence cycle process models used by the USIC and taught to the National Network. Given the numerous differ...

  18. Development of Neutronics Model for ShinKori Unit 1 Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Hong, JinHyuk; Lee, MyeongSoo; Lee, SeungHo; Suh, JungKwan; Hwang, DoHyun [KEPRI, Daejeon (Korea, Republic of)

    2008-05-15

    ShinKori Units 1 and 2 are being built at the Kori site and will be operated at a thermal core power of 2815 MWt. The purpose of this paper is to report on the performance of the neutronics model developed for ShinKori Units 1 and 2. The report also covers a convenience tool (XS2R5) for processing the large quantity of information received from the DIT/ROCS model and for generating cross-sections. The neutronics model is based on the NESTLE code inserted into the RELAP5/MOD3 thermal-hydraulics analysis code, which was funded as FY-93 LDRD Project 7201, and runs in a commercial simulator environment (the 3KeyMaster{sup TM} of the WSC). Several figures are provided as examples of the verification of the developed neutronics model. The output of the developed neutronics model is in accord with the Preliminary Safety Analysis Report (PSAR) of the reference plant.

  19. Modeling of the fatigue damage accumulation processes in the material of NPP design units under thermomechanical unstationary effects. Estimation of spent life and forecast of residual life

    International Nuclear Information System (INIS)

    Kiriushin, A.I.; Korotkikh, Yu.G.; Gorodov, G.F.

    2002-01-01

    Full text: The problems of estimating the spent life and forecasting the residual life of NPP equipment design units operated under non-stationary thermomechanical loads are considered. These loads are, as a rule, irregular; in the most heavily loaded zones of structural elements they cause rotation of the principal planes of the stress tensor and viscoelastic-plastic deformation of the material at stress concentrations. The existing engineering approaches to calculating damage accumulation in the material of structural units, with their advantages and disadvantages, are analyzed. For fatigue damage accumulation a model is proposed which takes into account the irregular pattern of deformation, the multiaxiality of the stressed state, the rotation of the principal planes, and the non-linear summation of damage when the loading mode changes. The model is based on the equations of damaged-medium mechanics, including the equations of viscoplastic deformation of the material and evolutionary equations of damage accumulation. Algorithms for estimating the spent life and forecasting the residual life of the monitored zones of equipment and systems are built on this model from the known real loading history, which is determined by the real mode of NPP operation. Results of numerical experiments based on the model for various thermomechanical loading processes, and their comparison with experimental results, are presented. (author)

  20. Matching soil grid unit resolutions with polygon unit scales for DNDC modelling of regional SOC pool

    Science.gov (United States)

    Zhang, H. D.; Yu, D. S.; Ni, Y. L.; Zhang, L. M.; Shi, X. Z.

    2015-03-01

    Matching soil grid unit resolution with polygon unit map scale is important to minimize the uncertainty of regional soil organic carbon (SOC) pool simulation, given their strong influence on that uncertainty. A series of soil grid units at varying cell sizes was derived from soil polygon units at six map scales, namely 1:50 000 (C5), 1:200 000 (D2), 1:500 000 (P5), 1:1 000 000 (N1), 1:4 000 000 (N4) and 1:14 000 000 (N14), in the Tai lake region of China. Soil units in both formats were used for regional SOC pool simulation with the DeNitrification-DeComposition (DNDC) process-based model, with runs spanning the period 1982 to 2000 at each of the six map scales. Four indices, soil type number (STN), area (AREA), average SOC density (ASOCD) and total SOC stocks (SOCS) of surface paddy soils simulated with the DNDC, were attributed from all these soil polygon and grid units. Relative to the four index values (IV) from the parent polygon units, the variation of an index value (VIV, %) from the grid units was used to assess dataset accuracy and redundancy, which reflect uncertainty in the simulation of SOC. Optimal soil grid unit resolutions were generated and suggested for DNDC simulation of the regional SOC pool, matching the map scales of the soil polygon units. At the optimal raster resolution, the soil grid unit dataset holds the same accuracy as its parent polygon unit dataset without any redundancy, when the VIV indices are taken as the assessment criteria. A quadratic regression model, y = −8.0 × 10⁻⁶x² + 0.228x + 0.211 (R² = 0.9994, p < 0.05), was obtained, describing the relationship between the optimal soil grid unit resolution (y, km) and the soil polygon unit map scale (1:x). This knowledge may serve grid partitioning of regions for the investigation and simulation of SOC pool dynamics at a given map scale.

  1. Development of a transient, lumped hydrologic model for geomorphologic units in a geomorphology based rainfall-runoff modelling framework

    Science.gov (United States)

    Vannametee, E.; Karssenberg, D.; Hendriks, M. R.; de Jong, S. M.; Bierkens, M. F. P.

    2010-05-01

    We propose a modelling framework for distributed hydrological modelling of 10³-10⁵ km² catchments by discretizing the catchment into geomorphologic units. Each of these units is modelled using a lumped model representative of the processes in the unit. Here, we focus on the development and parameterization of this lumped model as a component of our framework. The development of the lumped model requires rainfall-runoff data for an extensive set of geomorphological units. Because such large observational data sets do not exist, we create artificial data. With a high-resolution, physically based rainfall-runoff model, we create artificial rainfall events and the resulting hydrographs for an extensive set of different geomorphological units. This data set is used to identify the lumped model of geomorphologic units. The advantage of this approach is that it results in a lumped model with a physical basis, with representative parameters that can be derived from point-scale measurable physical parameters. The approach starts with the development of the high-resolution rainfall-runoff model, which generates an artificial discharge dataset from rainfall inputs as a surrogate for a real-world dataset. The model is run for approximately 10⁵ scenarios that describe different characteristics of rainfall, properties of the geomorphologic units (i.e. slope gradient, unit length and regolith properties), antecedent moisture conditions and flow patterns. For each scenario run, the results of the high-resolution model (i.e. runoff and state variables) at selected simulation time steps are stored in a database. The second step is to develop the lumped model of a geomorphological unit. This forward model consists of a set of simple equations that calculate Hortonian runoff and the state variables of the geomorphologic unit over time. The lumped model contains only three parameters: a ponding factor, a linear reservoir parameter, and a lag time. The model is capable of giving an appropriate
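
    The three-parameter structure just described (a ponding factor, a linear reservoir, and a lag) can be pictured with a toy discretization. This is a generic sketch of those ingredients, not the authors' calibrated model; every parameter value below is invented.

```python
# Toy Hortonian runoff response of one geomorphologic unit:
# ponding factor -> linear reservoir -> pure lag.  Illustrative only.
def unit_response(rain_mm_h, dt_h=0.1, ponding=0.6, k_h=1.5, lag_steps=5):
    """rain_mm_h: rainfall intensity per time step [mm/h].
    ponding   : fraction of rainfall entering the reservoir
    k_h       : linear-reservoir residence time [h], so Q = S / k
    lag_steps : pure translation delay applied to the outflow
    """
    storage, outflow = 0.0, []
    for r in rain_mm_h:
        inflow = ponding * r
        storage += (inflow - storage / k_h) * dt_h   # water balance of S
        outflow.append(storage / k_h)                # linear reservoir law
    return [0.0] * lag_steps + outflow[: len(outflow) - lag_steps]

event = [20.0] * 10 + [0.0] * 40   # a 1 h block of 20 mm/h rain
q = unit_response(event)
print(f"peak unit runoff ~ {max(q):.2f} mm/h")
```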

  2. Modeling and analysis of chill and fill processes for the cryogenic storage and transfer engineering development unit tank

    Science.gov (United States)

    Hedayat, A.; Cartagena, W.; Majumdar, A. K.; LeClair, A. C.

    2016-03-01

    NASA's future missions may require long-term storage and transfer of cryogenic propellants. The Engineering Development Unit (EDU), a NASA in-house effort supported by both Marshall Space Flight Center (MSFC) and Glenn Research Center, is a cryogenic fluid management (CFM) test article that primarily serves as a manufacturing pathfinder and a risk reduction task for a future CFM payload. The EDU test article comprises a flight-like tank, internal components, insulation, and attachment struts. The EDU is designed to perform integrated passive thermal control performance testing with liquid hydrogen (LH2) in a test-like vacuum environment. A series of tests, with LH2 as a testing fluid, was conducted at Test Stand 300 at MSFC during the summer of 2014. The objective of this effort was to develop a thermal/fluid model for evaluating the thermodynamic behavior of the EDU tank during the chill and fill processes. The Generalized Fluid System Simulation Program, an MSFC in-house general-purpose computer program for flow network analysis, was utilized to model and simulate the chill and fill portion of the testing. The model contained the LH2 supply source, feed system, EDU tank, and vent system. The test setup, modeling description, and comparison of model predictions with the test data are presented.

  3. Meteorite Unit Models for Structural Properties

    Science.gov (United States)

    Agrawal, Parul; Carlozzi, Alexander A.; Karajeh, Zaid S.; Bryson, Kathryn L.

    2017-10-01

    To assess the threat posed by an asteroid entering Earth's atmosphere, one must predict if, when, and how it fragments during entry. A comprehensive understanding of asteroid material properties is needed to achieve this objective. At present, meteorites found on Earth are the only objects from an entering asteroid that can be used as representative material and tested in a laboratory. Because of their complex composition, it is challenging and expensive to obtain reliable material properties from laboratory tests for a family of meteorites. To circumvent this challenge, meteorite unit models were developed to determine effective material properties, including Young's modulus, compressive and tensile strengths, and Poisson's ratio, that in turn would help deduce the properties of asteroids. The meteorite unit model is a representative volume that accounts for diverse minerals, porosity, cracks and matrix composition. The Young's modulus and Poisson's ratio of the meteorite units are calculated by performing several hundred Monte Carlo simulations in which the various phases are randomly distributed inside the units. Once these values are obtained, cracks are introduced in the units. The size, orientation and distribution of the cracks are derived from CT scans and visual scans of various meteorites. Subsequently, simulations are performed to obtain stress-strain relations, strength and effective modulus values in the presence of these cracks. Meteorite unit models are presented for H, L and LL ordinary chondrites, as well as for terrestrial basalt. In the case of the latter, data from the simulations are compared with experimental data to validate the methodology. These meteorite unit models will subsequently be used in fragmentation modeling of full-scale asteroids.

  4. A FPGA-based signal processing unit for a GEM array detector

    International Nuclear Information System (INIS)

    Yen, W.W.; Chou, H.P.

    2013-06-01

    In the present study, a signal processing unit for a GEM one-dimensional array detector is presented to measure the trajectory of photoelectrons produced by cosmic X-rays. The present GEM array detector system has 16 signal channels. The front-end unit provides timing signals from trigger units and energy signals from charge-sensitive amplifiers. The prototype of the processing unit is implemented using commercial field programmable gate array circuit boards. The FPGA-based system is linked to a personal computer for testing and data analysis. Tests using simulated signals indicated that the FPGA-based signal processing unit has good linearity and allows flexible parameter adjustment for various experimental conditions. (authors)

  5. [The nursing process at a burns unit: an ethnographic study].

    Science.gov (United States)

    Rossi, L A; Casagrande, L D

    2001-01-01

    This ethnographic study aimed at understanding the cultural meaning that nursing professionals working at a Burns Unit attribute to the nursing process as well as at identifying the factors affecting the implementation of this methodology. Data were collected through participant observation and semi-structured interviews. The findings indicate that, to the nurses from the investigated unit, the nursing process seems to be identified as bureaucratic management. Some factors determining this perception are: the way in which the nursing process has been taught and interpreted, routine as a guideline for nursing activity, and knowledge and power in the life-world of the Burns Unit.

  6. Performance Recognition for Sulphur Flotation Process Based on Froth Texture Unit Distribution

    Directory of Open Access Journals (Sweden)

    Mingfang He

    2013-01-01

    As an important indicator of flotation performance, froth texture is believed to be related to the operational condition of the sulphur flotation process. A novel fault detection method based on froth texture unit distribution (TUD) is proposed to recognize the fault condition of sulphur flotation in real time. The froth texture unit number is calculated based on the texture spectrum, and the probability density function (PDF) of the froth texture unit number is defined as the texture unit distribution, which can describe the actual textural features more accurately than the grey level dependence matrix approach. As the type of the froth TUD is unknown, a nonparametric kernel estimation method based on a fixed kernel basis is proposed, which overcomes the difficulty that TUDs obtained under various conditions cannot be compared when a traditional varying kernel basis is used. By transforming the nonparametric description into dynamic kernel weight vectors, a principal component analysis (PCA) model is established to reduce the dimensionality of the vectors. A threshold criterion determined by the T²/Q statistics of the PCA model is then proposed to realize the performance recognition. Industrial application results show that accurate performance recognition of froth flotation can be achieved using the proposed method.
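
    Stripped to its essentials, the monitoring chain above is: estimate a texture distribution per froth image, project it onto a PCA subspace fitted on normal-condition data, and flag a fault when a threshold statistic is exceeded. The schematic below uses a squared-residual (Q/SPE) statistic with an empirical control limit as a simplified stand-in for the paper's criterion; the input vectors are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "TUD" feature vectors: rows = froth images, cols = histogram bins.
normal = rng.dirichlet(np.ones(32) * 5.0, size=200)   # normal-condition data

# Fit PCA on the normal-condition TUDs.
mean = normal.mean(axis=0)
_, _, Vt = np.linalg.svd(normal - mean, full_matrices=False)
P = Vt[:5].T                              # loadings of 5 retained components

def q_statistic(x: np.ndarray) -> float:
    """Squared residual (SPE/Q) of a TUD vector outside the PCA subspace."""
    centred = x - mean
    residual = centred - P @ (P.T @ centred)
    return float(residual @ residual)

# Control limit from the empirical normal-condition distribution (99th pct).
limit = np.percentile([q_statistic(x) for x in normal], 99)

test = rng.dirichlet(np.ones(32) * 1.0)   # a differently shaped distribution
print("fault" if q_statistic(test) > limit else "normal")
```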

  7. Parallel direct solver for finite element modeling of manufacturing processes

    DEFF Research Database (Denmark)

    Nielsen, Chris Valentin; Martins, P.A.F.

    2017-01-01

    The central processing unit (CPU) time is of paramount importance in finite element modeling of manufacturing processes. Because the most significant part of the CPU time is consumed in solving the main system of equations resulting from finite element assemblies, different approaches have been...

  8. Mathematical models of power plant units with once-through steam generators

    International Nuclear Information System (INIS)

    Hofmeister, W.; Kantner, A.

    1977-01-01

    Optimization of effective control functions is practically impossible with the current complex control-loop structures and control algorithms. Computer models are therefore required that can be optimized with the process and plant data known before start-up of a thermal power plant. The application of process computers also allows additional predictions of the control-dynamic behavior of a thermal power plant unit. (TK)

  9. FamSeq: a variant calling program for family-based sequencing data using graphics processing units.

    Directory of Open Access Journals (Sweden)

    Gang Peng

    2014-10-01

    Various algorithms have been developed for variant calling using next-generation sequencing data, and various methods have been applied to reduce the associated false positive and false negative rates. Few variant calling programs, however, utilize the pedigree information when family-based sequencing data are available. Here, we present a program, FamSeq, which reduces both false positive and false negative rates by incorporating the pedigree information from the Mendelian genetic model into variant calling. To accommodate variations in data complexity, FamSeq consists of four distinct implementations of the Mendelian genetic model: the Bayesian network algorithm, a graphics processing unit version of the Bayesian network algorithm, the Elston-Stewart algorithm and the Markov chain Monte Carlo algorithm. To make the software efficient and applicable to large families, we parallelized the Bayesian network algorithm, which copes with pedigrees with inbreeding loops without losing calculation precision, on an NVIDIA graphics processing unit. To compare the four methods, we applied FamSeq to pedigree sequencing data with family sizes that varied from 7 to 12. When there is no inbreeding loop in the pedigree, the Elston-Stewart algorithm gives analytical results in a short time. If there are inbreeding loops in the pedigree, we recommend the Bayesian network method, which provides exact answers. To improve the computing speed of the Bayesian network method, we parallelized the computation on a graphics processing unit. This allowed the Bayesian network method to process the whole-genome sequencing data of a family of 12 individuals within two days, a 10-fold time reduction compared to the time required for this computation on a central processing unit.

  10. High Input Voltage, Silicon Carbide Power Processing Unit Performance Demonstration

    Science.gov (United States)

    Bozak, Karin E.; Pinero, Luis R.; Scheidegger, Robert J.; Aulisio, Michael V.; Gonzalez, Marcelo C.; Birchenough, Arthur G.

    2015-01-01

    A silicon carbide brassboard power processing unit has been developed by the NASA Glenn Research Center in Cleveland, Ohio. The power processing unit operates from two sources: a nominal 300 Volt high voltage input bus and a nominal 28 Volt low voltage input bus. The design of the power processing unit includes four low voltage, low power auxiliary supplies, and two parallel 7.5 kilowatt (kW) discharge power supplies that are capable of providing up to 15 kilowatts of total power at 300 to 500 Volts (V) to the thruster. Additionally, the unit contains a housekeeping supply, high voltage input filter, low voltage input filter, and master control board, such that the complete brassboard unit is capable of operating a 12.5 kilowatt Hall effect thruster. The performance of the unit was characterized under both ambient and thermal vacuum test conditions, and the results demonstrate exceptional performance with full power efficiencies exceeding 97%. The unit was also tested with a 12.5 kW Hall effect thruster to verify compatibility and output filter specifications. With space-qualified silicon carbide or similar high voltage, high efficiency power devices, this would provide a design solution to address the need for high power electric propulsion systems.

  11. Scale up risk of developing oil shale processing units

    International Nuclear Information System (INIS)

    Oepik, I.

    1991-01-01

    Experience with oil shale processing in three large countries, China, the U.S.A. and the U.S.S.R., has demonstrated that the relative scale-up risk of developing oil shale processing units is related to the scale-up factor. Against the background of large programmes for developing the oil shale industry, i.e. the $30 billion investments in Colorado and Utah or the 50 million t/year oil shale processing planned in Estonia and the Leningrad Region in the late seventies, the absolute scope of the scale-up risk of developing single retorting plants seems to be justified. But under conditions of low crude oil prices, when large-scale development of the oil shale processing industry is stopped, the absolute scope of the scale-up risk is divided among a small number of units. Therefore, it is reasonable to build new commercial oil shale processing plants with a minimum scale-up risk. For example, in Estonia a new oil shale processing plant with gas combustion retorts, projected to start in the early nineties, will be equipped with four units of 1500 t/day enriched oil shale throughput each, designed with a scale-up factor M=1.5 and with a minimum scale-up risk of only r=2.5-4.5%. The oil shale retorting unit for the PAMA plant in Israel [1] is planned to be developed in three steps, also with minimum scale-up risk: feasibility studies in Colorado with Israel's shale in a 250 t/day Paraho retort and other tests, a demonstration retort of 700 t/day with M=2.8 in Israel, and commercial retorts in the early nineties with a capacity of about 1000 t/day and M=1.4. The scale-up risk of the PAMA project, r=2-4%, is approximately the same as that in Estonia. Knowledge of the scope of the scale-up risk of developing oil shale processing retorts assists in the calculation of production costs when erecting new units. (author). 9 refs., 2 tabs

  12. Technical and economic modelling of processes for liquid fuel production in Europe

    International Nuclear Information System (INIS)

    Bridgwater, A.V.; Double, J.M.

    1991-01-01

    The project described had the objective of examining the full range of technologies for liquid fuel production from renewable feedstocks in a technical and economic evaluation, in order to identify the most promising technologies. The technologies considered are indirect thermochemical liquefaction (i.e. via gasification) to produce methanol, fuel alcohol or hydrocarbon fuels; direct thermochemical liquefaction or pyrolysis to produce hydrocarbon fuels; and fermentation to produce ethanol. The feedstocks considered were wood, refuse-derived fuel, straw, wheat and sugar beet. To carry out the evaluation, a computer model was developed based on a unit process approach. Each unit operation is modelled as a process step, the model calculating the mass balance, energy balance and operating cost of the unit process. The results from the process step models are then combined to generate the mass balance, energy balance, capital cost and operating cost for the total process. The results show that the lowest production cost (£7/GJ) is obtained for methanol generated from a straw feedstock, but there is a moderate level of technical uncertainty associated with this result. The lowest production cost for hydrocarbon fuel (£8.6/GJ) is given by the pyrolysis process using a wood feedstock. This process has a high level of uncertainty. Fermentation processes showed the highest production costs, ranging from £14.4/GJ for a simple wood feedstock process to £25.2/GJ for a process based on sugar beet. The important conclusions are as follows: - In every case, the product cost is above current liquid fuel prices; - In most cases the feedstock cost dominates the production cost; - The most attractive products are thermochemically produced alcohol fuels
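
    The unit process bookkeeping described above, a per-step mass balance, energy balance and operating cost chained into process totals, has a direct encoding. The fragment below is a bare-bones illustration of that chaining only; every figure and conversion factor in it is invented, not taken from the study.

```python
# Minimal unit-process chaining: each step transforms a stream and reports
# an operating cost.  All numbers are invented for illustration only.
def gasifier(stream):
    out = {"mass_t_h": stream["mass_t_h"] * 0.8,       # mass lost to ash/purge
           "energy_GJ_h": stream["energy_GJ_h"] * 0.75}
    return out, 120.0                                   # operating cost, £/h

def methanol_synthesis(stream):
    out = {"mass_t_h": stream["mass_t_h"] * 0.6,
           "energy_GJ_h": stream["energy_GJ_h"] * 0.85}
    return out, 90.0

stream = {"mass_t_h": 10.0, "energy_GJ_h": 180.0}       # e.g. straw feed
total_cost = 0.0
for step in (gasifier, methanol_synthesis):             # the treatment train
    stream, cost = step(stream)
    total_cost += cost

print(f"product energy {stream['energy_GJ_h']:.0f} GJ/h, "
      f"operating cost {total_cost / stream['energy_GJ_h']:.2f} £/GJ")
```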

  13. Technical and economic modelling of processes for liquid fuel production in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Bridgwater, A V; Double, J M [Aston Univ. Birmingham (GB). Dept of Chemical Engineering

    1992-12-31

    The project described had the objective of examining the full range of technologies for liquid fuel production from renewable feedstocks in a technical and economic evaluation, in order to identify the most promising technologies. The technologies considered are indirect thermochemical liquefaction (i.e. via gasification) to produce methanol, fuel alcohol or hydrocarbon fuels; direct thermochemical liquefaction or pyrolysis to produce hydrocarbon fuels; and fermentation to produce ethanol. The feedstocks considered were wood, refuse-derived fuel, straw, wheat and sugar beet. To carry out the evaluation, a computer model was developed based on a unit process approach. Each unit operation is modelled as a process step, the model calculating the mass balance, energy balance and operating cost of the unit process. The results from the process step models are then combined to generate the mass balance, energy balance, capital cost and operating cost for the total process. The results show that the lowest production cost (£7/GJ) is obtained for methanol generated from a straw feedstock, but there is a moderate level of technical uncertainty associated with this result. The lowest production cost for hydrocarbon fuel (£8.6/GJ) is given by the pyrolysis process using a wood feedstock. This process has a high level of uncertainty. Fermentation processes showed the highest production costs, ranging from £14.4/GJ for a simple wood feedstock process to £25.2/GJ for a process based on sugar beet. The important conclusions are as follows: - In every case, the product cost is above current liquid fuel prices; - In most cases the feedstock cost dominates the production cost; - The most attractive products are thermochemically produced alcohol fuels.

  14. Development and Application of a Low Impact Development (LID-Based District Unit Planning Model

    Directory of Open Access Journals (Sweden)

    Cheol Hee Son

    2017-01-01

    The purpose of this study was to develop a low impact development-based district unit planning (LID-DP) model and to verify the model by applying it to a test site. To develop the model, we identified various barriers in the urban planning process and examined the advantages of various LID-related techniques to determine where in the urban development process LID would provide the greatest benefit. The resulting model provides (1) a set of district unit planning processes that consider LID standards and (2) a set of evaluation methods that measure the benefits of the LID-DP model over standard urban development practices. The developed LID-DP process is composed of a status analysis, a comprehensive analysis, a basic plan, and sectoral plans. To determine whether the LID-DP model met the proposed LID targets, we applied the model to a test site in Cheongju City, Chungcheongbuk-do Province, Republic of Korea. The test simulation showed that the LID-DP plan reduced nonpoint source pollutants (total nitrogen, 113%; total phosphorus, 193%; biological oxygen demand, 199%), reduced rainfall runoff (infiltration volume, 102%; surface runoff, 101%), and improved the conservation rate of the natural environment area (132%). The successful application of this model also lent support to the greater importance of non-structural techniques over structural techniques in urban planning when ecological factors are taken into account.

  15. Development of an equipment management model to improve effectiveness of processes

    International Nuclear Information System (INIS)

    Chang, H. S.; Ju, T. Y.; Song, T. Y.

    2012-01-01

    The nuclear industry has developed, and is continuing to refine, performance models to improve the effectiveness of the processes implemented at nuclear plants in order to enhance performance. Most high-performing nuclear stations seek to continually improve the quality of their operations by identifying and closing important performance gaps, and many utilities have therefore implemented performance models adjusted to their plant's configuration and have instituted policies for such models. KHNP is developing a standard performance model to integrate its engineering processes and to improve the inter-relations among processes. The model, called the Standard Equipment Management Model (SEMM), is being developed by focusing first on engineering processes and performance improvement processes related to plant equipment used at the sites. It includes performance indicators for each process that allow process performance to be evaluated and compared across the 21 operating units. The model will later be expanded to incorporate cost and management processes. (authors)

  16. 32 CFR 516.12 - Service of civil process outside the United States.

    Science.gov (United States)

    2010-07-01

    32 CFR 516.12 (National Defense, 2010-07-01), AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS, LITIGATION, Service of Process: Service of civil process outside the United States. (a) Process of foreign courts. In foreign countries, service of process...

  17. Modeling and Control of Multivariable Process Using Intelligent Techniques

    Directory of Open Access Journals (Sweden)

    Subathra Balasubramanian

    2010-10-01

    For nonlinear dynamic systems, first-principles-based modeling and control are difficult to implement. In this study, a fuzzy controller and a recurrent fuzzy controller are developed for a MIMO process. A fuzzy logic controller is a model-free controller designed from knowledge about the process. Two types of rule-based fuzzy models are available: the linguistic (Mamdani) model and the Takagi-Sugeno (TS) model; of these two, the TS model has attracted the most attention. The application of fuzzy controllers is limited to static processes due to their feedforward structure, but most real-time processes are dynamic and require the history of input/output data. In order to store the past values a memory unit is needed, which is introduced by the recurrent structure. The proposed recurrent fuzzy structure is used to develop a controller for a two-tank heating process. Both controllers are designed and implemented in a real-time environment and their performance is compared.
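
    To make the Mamdani/Takagi-Sugeno distinction concrete: a TS rule's consequent is a crisp (typically linear) function of the inputs rather than a fuzzy set, and the controller output is the firing-strength-weighted average of the rule consequents. The two-rule toy below illustrates only that mechanism; the membership functions and consequent coefficients are invented, not taken from the paper.

```python
# Two-rule first-order Takagi-Sugeno inference on a single error input.
# Memberships and consequents are invented for illustration.
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ts_controller(error):
    # Rule 1: IF error is Negative THEN u1 = -1.5*error - 0.2
    # Rule 2: IF error is Positive THEN u2 = -0.8*error + 0.1
    w1 = tri(error, -2.0, -1.0, 0.5)   # firing strength of "Negative"
    w2 = tri(error, -0.5, 1.0, 2.0)    # firing strength of "Positive"
    if w1 + w2 == 0.0:
        return 0.0                     # outside the rule base's support
    u1 = -1.5 * error - 0.2
    u2 = -0.8 * error + 0.1
    return (w1 * u1 + w2 * u2) / (w1 + w2)   # weighted-average defuzzification

for e in (-1.0, 0.0, 1.0):
    print(f"error {e:+.1f} -> control {ts_controller(e):+.3f}")
```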

  18. Penultimate modeling of spatial extremes: statistical inference for max-infinitely divisible processes

    KAUST Repository

    Huser, Raphaël

    2018-01-09

    Extreme-value theory for stochastic processes has motivated the statistical use of max-stable models for spatial extremes. However, fitting such asymptotic models to maxima observed over finite blocks is problematic when the asymptotic stability of the dependence does not prevail in finite samples. This issue is particularly serious when data are asymptotically independent, such that the dependence strength weakens and eventually vanishes as events become more extreme. We here aim to provide flexible sub-asymptotic models for spatially indexed block maxima, which more realistically account for discrepancies between data and asymptotic theory. We develop models pertaining to the wider class of max-infinitely divisible processes, extending the class of max-stable processes while retaining dependence properties that are natural for maxima: max-id models are positively associated, and they yield a self-consistent family of models for block maxima defined over any time unit. We propose two parametric construction principles for max-id models, emphasizing a point process-based generalized spectral representation, that allows for asymptotic independence while keeping the max-stable extremal-$t$ model as a special case. Parameter estimation is efficiently performed by pairwise likelihood, and we illustrate our new modeling framework with an application to Dutch wind gust maxima calculated over different time units.
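
    As background for the self-consistency claim above (standard extreme-value material, not a result of this paper): a process Z is max-stable if pointwise maxima of independent copies retain its dependence type,

```latex
\max_{i=1,\dots,n} Z_i(s) \overset{d}{=} a_n(s)\, Z(s) + b_n(s),
\qquad s \in \mathcal{S},\ a_n(s) > 0,
```

    whereas max-infinite divisibility only requires that the joint distribution function F of the process satisfies that F^t is again a valid distribution function for every t > 0. The latter is what makes block maxima defined over any time unit mutually consistent while leaving room for asymptotic independence.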

  19. Iterative Methods for MPC on Graphical Processing Units

    DEFF Research Database (Denmark)

    Gade-Nielsen, Nicolai Fog; Jørgensen, John Bagterp; Dammann, Bernd

    2012-01-01

    The high floating-point performance and memory bandwidth of Graphical Processing Units (GPUs) make them ideal for a large number of computations which often arise in scientific computing, such as matrix operations. GPUs achieve this performance by utilizing massive parallelism, which requires re... so as to avoid the use of dense matrices, which may be too large for the limited memory capacity of current graphics cards.

  20. Baseline groundwater model update for p-area groundwater operable unit, NBN

    Energy Technology Data Exchange (ETDEWEB)

    Ross, J. [Savannah River Site (SRS), Aiken, SC (United States); Amidon, M. [Savannah River Site (SRS), Aiken, SC (United States)

    2015-09-01

    This report documents the development of a numerical groundwater flow and transport model of the hydrogeologic system of the P-Area Reactor Groundwater Operable Unit at the Savannah River Site (SRS) (Figure 1-1). The P-Area model provides a tool to aid in understanding the hydrologic and geochemical processes that control the development and migration of the current tritium, tetrachloroethene (PCE), and trichloroethene (TCE) plumes in this region.

  1. League of Our Own: Creating a Model United Nations Scrimmage Conference

    Science.gov (United States)

    Ripley, Brian; Carter, Neal; Grove, Andrea K.

    2009-01-01

    Model United Nations (MUN) provides a great forum for students to learn about global issues and political processes, while also practicing communication and negotiation skills that will serve them well for a lifetime. Intercollegiate MUN conferences can be problematic, however, in terms of logistics, budgets, and student participation. In order to…

  2. On the (R,s,Q) Inventory Model when Demand is Modelled as a Compound Process

    NARCIS (Netherlands)

    Janssen, F.B.S.L.P.; Heuts, R.M.J.; de Kok, T.

    1996-01-01

    In this paper we present an approximation method to compute the reorder point s in an (R, s, Q) inventory model with a service level restriction, where demand is modelled as a compound Bernoulli process, that is, with a fixed probability there is positive demand during a time unit; otherwise demand is zero.
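
    A compound Bernoulli demand process is easy to state in code: each period there is positive demand with probability p, and, if so, its size is drawn from a compounding distribution. The snippet below simulates that demand model and scores a candidate reorder point by brute force; it is not the paper's analytical approximation, and all parameters and the demand-size distribution are illustrative.

```python
import random

random.seed(1)

def demand(p=0.3, mean_size=5.0):
    """Compound Bernoulli demand in one time unit: positive w.p. p, else 0."""
    return random.expovariate(1.0 / mean_size) if random.random() < p else 0.0

def fill_rate(s, Q=40.0, lead_time=4, periods=100_000):
    """Crude simulated fill rate of an (s, Q) policy with periodic review."""
    inv, on_order, due, met, total = s + Q, 0.0, -1, 0.0, 0.0
    for t in range(periods):
        if t == due:                        # outstanding order arrives
            inv += on_order
            on_order = 0.0
        d = demand()
        met += min(d, max(inv, 0.0))        # demand served from stock on hand
        total += d
        inv -= d
        if inv <= s and on_order == 0.0:    # reorder point reached
            on_order, due = Q, t + lead_time
    return met / total

print(f"fill rate with s = 12: {fill_rate(12.0):.3f}")
```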

  3. Modeling the Hydrologic Processes of a Permeable Pavement System

    Science.gov (United States)

    A permeable pavement system can capture stormwater to reduce runoff volume and flow rate, improve onsite groundwater recharge, and enhance pollutant controls within the site. A new unit process model for evaluating the hydrologic performance of a permeable pavement system has been developed...

  4. Temperature Modelling of the Biomass Pretreatment Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Blanke, Mogens; Jensen, Jakob M.

    2012-01-01

    In a second generation biorefinery, the biomass pretreatment stage makes an important contribution to the efficiency of the downstream processing units involved in biofuel production. Most of the pretreatment process occurs in a large pressurized thermal reactor that presents an irregular temperature distribution... A temperature model is formulated that captures the environmental temperature differences inside the reactor using distributed parameters. A Kalman filter is then added to account for any missing dynamics, and the overall model is embedded into a temperature soft sensor. The operator of the plant will be able to observe the temperature at any location inside the reactor.
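
    The Kalman filter step above amounts to blending the model's predicted temperature with noisy sensor readings, weighted by their respective uncertainties. The scalar sketch below shows that blend generically; it is not the authors' reactor model, and the noise levels are invented.

```python
# Scalar Kalman update fusing a model-predicted temperature with a noisy
# probe reading.  Generic sketch; all noise levels are invented.
def kalman_step(x, P, z, q=0.05, r=4.0):
    """x, P: prior estimate and variance; z: measurement;
    q, r: process and measurement noise variances."""
    P = P + q                  # predict: the model step inflates uncertainty
    K = P / (P + r)            # Kalman gain: trust in measurement vs. model
    x = x + K * (z - x)        # pull the estimate toward the reading
    P = (1.0 - K) * P          # posterior variance shrinks
    return x, P

x, P = 180.0, 10.0                       # initial temperature estimate [C]
for z in (183.1, 184.9, 183.8, 185.2):   # probe readings
    x, P = kalman_step(x, P, z)
print(f"filtered temperature ~ {x:.1f} C (variance {P:.2f})")
```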

  5. 32 CFR 516.10 - Service of civil process within the United States.

    Science.gov (United States)

    2010-07-01

    32 CFR 516.10 (National Defense, 2010-07-01), AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS, LITIGATION, Service of Process: Service of civil process within the United States. (a) Policy. DA officials will not prevent or evade the service of process in...

  6. Analysis of Unit Process Cost for an Engineering-Scale Pyroprocess Facility Using a Process Costing Method in Korea

    Directory of Open Access Journals (Sweden)

    Sungki Kim

    2015-08-01

    Pyroprocessing, which is a dry recycling method, converts spent nuclear fuel into U (uranium)/TRU (transuranium) metal ingots in a high-temperature molten salt phase. This paper provides the unit process costs of a pyroprocess facility that can process up to 10 tons of pyroprocessing product per year, using the process costing method. Toward this end, the pyroprocess was classified into four unit processes: pretreatment, electrochemical reduction, electrorefining and electrowinning. Each unit process cost was calculated by classifying the cost consumed in that process into raw material and conversion costs. The unit process costs of pretreatment, electrochemical reduction, electrorefining and electrowinning were calculated as 195 US$/kgU-TRU, 310 US$/kgU-TRU, 215 US$/kgU-TRU and 231 US$/kgU-TRU, respectively. The total pyroprocess cost was thus calculated as 951 US$/kgU-TRU. In addition, the cost drivers for the raw material cost were identified as the cost of Li3PO4, needed for the LiCl-KCl purification process, and of platinum, used as an anode electrode in the electrochemical reduction process.
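
    The quoted total is simply the sum of the four unit process costs, which is easy to verify:

```python
# Reproducing the total pyroprocess cost from the four unit process costs.
unit_costs = {                       # US$/kgU-TRU, from the abstract above
    "pretreatment": 195,
    "electrochemical reduction": 310,
    "electrorefining": 215,
    "electrowinning": 231,
}
print(sum(unit_costs.values()))      # -> 951
```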

  7. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation-oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes for calculating thermodynamic equilibrium with respect to phase interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints is considered directly, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained through an additional constitutive equation system. These equations are in general dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information for the active equations, avoiding performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows an efficient solving strategy and a concise error diagnosis to be implemented. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it
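
    The role of canonical state functions of extensive variables is easiest to see on the internal energy, a standard identity included here purely for orientation (the thesis works with whichever canonical potential suits each optimisation node): with U = U(S, V, n_1, ..., n_c), every first derivative is itself an intensive physical property,

```latex
\mathrm{d}U = T\,\mathrm{d}S - p\,\mathrm{d}V + \sum_{i=1}^{c} \mu_i\,\mathrm{d}n_i,
\qquad
T = \left(\frac{\partial U}{\partial S}\right)_{V,\mathbf{n}}, \quad
-p = \left(\frac{\partial U}{\partial V}\right)_{S,\mathbf{n}}, \quad
\mu_i = \left(\frac{\partial U}{\partial n_i}\right)_{S,V,n_{j\neq i}},
```

    which is the sense in which analytical state-function derivatives double as physical properties in the solution process.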

  8. Comparing cropland net primary production estimates from inventory, a satellite-based model, and a process-based model in the Midwest of the United States

    Science.gov (United States)

    Li, Zhengpeng; Liu, Shuguang; Tan, Zhengxi; Bliss, Norman B.; Young, Claudia J.; West, Tristram O.; Ogle, Stephen M.

    2014-01-01

    Accurately quantifying the spatial and temporal variability of net primary production (NPP) for croplands is essential to understand regional cropland carbon dynamics. We compared three NPP estimates for croplands in the Midwestern United States: inventory-based estimates using crop yield data from the U.S. Department of Agriculture (USDA) National Agricultural Statistics Service (NASS); estimates from the satellite-based Moderate Resolution Imaging Spectroradiometer (MODIS) NPP product; and estimates from the General Ensemble biogeochemical Modeling System (GEMS) process-based model. The three methods estimated mean NPP in the range of 469-687 g C m⁻² yr⁻¹ and total NPP in the range of 318-490 Tg C yr⁻¹ for croplands in the Midwest in 2007 and 2008. The NPP estimates from crop yield data and the GEMS model put the mean NPP for croplands above 650 g C m⁻² yr⁻¹, while the MODIS NPP product estimated a mean NPP below 500 g C m⁻² yr⁻¹. MODIS NPP also showed very different spatial variability of cropland NPP from the other two methods. We found these differences were mainly caused by differences in the land cover data and the crop-specific information used in the methods. Our study demonstrated that detailed mapping of the temporal and spatial change of crop species is critical for estimating the spatial and temporal variability of cropland NPP. We suggest that high-resolution land cover data with species-specific crop information should be used in satellite-based and process-based models to improve carbon estimates for croplands.

  9. 32 CFR 516.9 - Service of criminal process within the United States.

    Science.gov (United States)

    2010-07-01

    32 CFR 516.9 (National Defense, 2010-07-01), AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS, LITIGATION, Service of Process: Service of criminal process within the United States. (a) Surrender of personnel. Guidance for surrender of military personnel...

  10. Reactor modeling and process analysis for partial oxidation of natural gas

    NARCIS (Netherlands)

    Albrecht, B.A.

    2004-01-01

    This thesis analyses a novel process for the partial oxidation of natural gas and develops a numerical tool for modeling the partial oxidation reactor. The proposed process generates syngas in an integrated plant comprising a partial oxidation reactor, a syngas turbine and an air separation unit. This is called...

  11. A Companion Model Approach to Modelling and Simulation of Industrial Processes

    International Nuclear Information System (INIS)

    Juslin, K.

    2005-09-01

    Modelling and simulation offer huge possibilities if broadly taken up by engineers as a working method. However, when modelling and simulation tools are launched in an engineering design project, they must be easy to learn and use: there is no time to write equations, to consult suppliers' experts, or to manually transfer data from one tool to another. The answer seems to lie in the integration of easy-to-use and dependable simulation software with engineering tools. Accordingly, the modelling and simulation software shall accept as input such structured design information on industrial unit processes and their connections as is provided by e.g. CAD software and product databases. The software technology, including the required specification and communication standards, is already available. Internet-based service repositories make it possible for equipment manufacturers to supply 'extended products', including such design data as needed by engineers engaged in process and automation integration. A market niche is evolving for simulation service centres, operating in co-operation with project consultants, equipment manufacturers, process integrators, automation designers, plant operating personnel, and maintenance centres. The companion model approach for the specification and solution of process simulation models, as presented herein, is developed from the above premises. The focus is on how to tackle real-world processes, which from the modelling point of view are heterogeneous, dynamic, very stiff, very nonlinear and only piecewise continuous, without extensive manual intervention by human experts. An additional challenge, solving the arising equations fast and reliably, is dealt with as well. (orig.)

  12. Simulation of operational processes in hospital emergency units as lean healthcare tool

    Directory of Open Access Journals (Sweden)

    Andreia Macedo Gomes

    2017-07-01

    Recently, the Lean philosophy has been gaining importance due to a competitive environment and the increasing need to reduce costs. Lean practices and tools have been applied to manufacturing, services, supply chains and startups, and the next frontier is healthcare, since most lean techniques can be easily adapted to health organizations. This paper therefore summarizes Lean practices and tools that are already being applied in health organizations. Among the numerous lean techniques and tools in use, this research highlights simulation. Thus, in order to understand the use of simulation as a Lean Healthcare tool, this research analyzes, through the simulation technique, the operational dynamics of the service process of a fictitious hospital emergency unit. First, a systematic review of the literature on the practices and tools of Lean Healthcare was carried out to identify the main techniques practiced; it highlighted simulation as the sixth most cited tool in the literature. Subsequently, a service model of an emergency unit was simulated in the Arena software. As a main result, the attendants in the model showed a degree of idleness and are therefore able to serve greater demand. Finally, the emergency room was found to be the process with the longest service time and the greatest overload.

  13. Ultra-processed food consumption in children from a Basic Health Unit.

    Science.gov (United States)

    Sparrenberger, Karen; Friedrich, Roberta Roggia; Schiffner, Mariana Dihl; Schuch, Ilaine; Wagner, Mário Bernardes

    2015-01-01

    To evaluate the contribution of ultra-processed food (UPF) to the dietary intake of children treated at a Basic Health Unit, and the associated factors. Cross-sectional study carried out with a convenience sample of 204 children, aged 2-10 years, in Southern Brazil. Children's food intake was assessed using a 24-h recall questionnaire. Food items were classified as minimally processed, processed for culinary use, or ultra-processed. A semi-structured questionnaire was applied to collect socio-demographic and anthropometric variables. Overweight was classified using a Z score >2 for children younger than 5 years and a Z score >+1 for those aged 5-10 years, based on body mass index for age. Overweight frequency was 34% (95% CI: 28-41%). Mean energy consumption was 1672.3 kcal/day, with 47% (95% CI: 45-49%) coming from ultra-processed food. In the multiple linear regression model, maternal education (r=0.23; p=0.001) and child age (r=0.40; p...) ... © Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.

  14. SPEEDUP modeling of the defense waste processing facility at the SRS

    International Nuclear Information System (INIS)

    Smith, F.G. III.

    1997-01-01

    A computer model has been developed for the dynamic simulation of batch process operations within the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS). The DWPF chemically treats high-level waste materials from the site tank farm and vitrifies the resulting slurry into a borosilicate glass for permanent disposal. The DWPF consists of three major processing areas: the Salt Processing Cell (SPC), the Chemical Processing Cell (CPC) and the Melt Cell. A fully integrated model of these process units has been developed using the SPEEDUP (trademark) software from Aspen Technology. Except for glass production in the Melt Cell, all of the chemical operations within DWPF are batch processes. Since SPEEDUP is designed for dynamic modeling of continuous processes, considerable effort was required to devise batch process algorithms. This effort was successful, and the model is able to simulate batch operations and the dynamic behavior of the process. The model also includes an optimization calculation that maximizes the waste content in the final glass product. In this paper, we describe the process model in some detail and present preliminary results from a few simulation studies.

  15. Minimization of entropy production in separate and connected process units

    Energy Technology Data Exchange (ETDEWEB)

    Roesjorde, Audun

    2004-08-01

    The objective of this thesis was to further develop a methodology for minimizing the entropy production of single and connected chemical process units. When chemical process equipment is designed and operated at the lowest entropy production possible, the energy efficiency of the equipment is enhanced. We found for single process units that the entropy production could be reduced by up to 20-40%, given the degrees of freedom in the optimization. In whole processes, our results indicated that even bigger reductions were possible. The states of minimum entropy production were studied, and important parameters for obtaining significant reductions in the entropy production were identified. From both sustainability and economic viewpoints, knowledge of energy-efficient design and operation is important. In some of the systems we studied, nonequilibrium thermodynamics was used to model the entropy production. In Chapter 2, we give a brief introduction to different industrial applications of nonequilibrium thermodynamics. The link between local transport phenomena and the overall system description makes nonequilibrium thermodynamics a useful tool for understanding the design of chemical process units. We developed the methodology of minimization of entropy production in several steps. First, we analyzed and optimized the entropy production of single units: two alternative concepts of adiabatic distillation, diabatic and heat-integrated distillation, were analyzed and optimized in Chapters 3 to 5. In diabatic distillation, heat exchange is allowed along the column, and it is this feature that increases the energy efficiency of the distillation column. In Chapter 3, we found how a given area of heat transfer should be optimally distributed among the trays in a column separating a mixture of propylene and propane. The results showed that heat exchange was most important on the trays close to the reboiler and condenser. In Chapters 4 and 5, we studied how the entropy
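
    For orientation, the quantity being minimized throughout is the total entropy production, conventionally written in nonequilibrium thermodynamics as an integral of local flux-force products (textbook form, not a result of the thesis):

```latex
\sigma = \sum_{i} J_i X_i \;\geq\; 0,
\qquad
\dot{S}_{\mathrm{irr}} = \int_{V} \sigma \,\mathrm{d}V,
```

    where the J_i are local fluxes (of heat, mass, charge) and the X_i their conjugate thermodynamic forces; the design degrees of freedom are tuned so that the irreversible entropy production rate is minimal subject to the duty of the unit.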

  16. 15 CFR 971.427 - Processing outside the United States.

    Science.gov (United States)

    2010-01-01

    Title 15, Commerce and Foreign Trade; the Environmental Data Service; Deep Seabed Mining Regulations for Commercial Recovery Permits; Issuance/Transfer: Terms, Conditions and Restrictions; § 971.427 Processing outside the United States...

  17. Process Options Description for Vitrification Flowsheet Model of INEEL Sodium Bearing Waste

    International Nuclear Information System (INIS)

    Nichols, T.T.; Taylor, D.D.; Lauerhass, L.; Barnes, C.M.

    2002-01-01

    The technical information required for the development of a basic steady-state process simulation of the vitrification treatment train of sodium bearing waste (SBW) at Idaho National Engineering and Environmental Laboratory (INEEL) is presented. The objective of the modeling effort is to provide the predictive capability required to optimize an entire treatment train and assess system-wide impacts of local changes at individual unit operations, with the aim of reducing the schedule and cost of future process/facility design efforts. All the information required a priori for engineers to construct and link unit operation modules in a commercial software simulator to represent the alternative treatment trains is presented. The information is of a mid- to high-level nature and consists of the following: (1) a description of twenty-four specific unit operations--their operating conditions and constraints, primary species and key outputs, and the initial modeling approaches that will be used in the first year of the simulation's development; (2) three potential configurations of the unit operations (trains) and their interdependencies via stream connections; and (3) representative stream compositional makeups

  18. Process Options Description for Vitrification Flowsheet Model of INEEL Sodium Bearing Waste

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, T.T.; Taylor, D.D.; Lauerhass, L.; Barnes, C.M.

    2002-02-21

    The technical information required for the development of a basic steady-state process simulation of the vitrification treatment train of sodium bearing waste (SBW) at Idaho National Engineering and Environmental Laboratory (INEEL) is presented. The objective of the modeling effort is to provide the predictive capability required to optimize an entire treatment train and assess system-wide impacts of local changes at individual unit operations, with the aim of reducing the schedule and cost of future process/facility design efforts. All the information required a priori for engineers to construct and link unit operation modules in a commercial software simulator to represent the alternative treatment trains is presented. The information is of a mid- to high-level nature and consists of the following: (1) a description of twenty-four specific unit operations--their operating conditions and constraints, primary species and key outputs, and the initial modeling approaches that will be used in the first year of the simulation's development; (2) three potential configurations of the unit operations (trains) and their interdependencies via stream connections; and (3) representative stream compositional makeups.

  19. Business process modeling applied to oil pipeline and terminal processes: a proposal for TRANSPETRO's oil pipelines and terminals in Rio de Janeiro and Minas Gerais

    Energy Technology Data Exchange (ETDEWEB)

    Santiago, Adilson da Silva [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil); Caulliraux, Heitor Mansur [Universidade Federal do Rio de Janeiro (COPPE/UFRJ/GPI), RJ (Brazil). Coordenacao de Pos-graduacao em Engenharia. Grupo de Producao Integrada; Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Felippe, Adriana Vieira de Oliveira [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    Business process modeling (BPM) using event-driven process chain (EPC) diagrams to lay out business process work flows is now widely adopted around the world. The EPC method was developed within the framework of the ARIS Toolset by Prof. August-Wilhelm Scheer at the Institut für Wirtschaftsinformatik at the Universität des Saarlandes in the early 1990s. It is used by many companies to model, analyze and redesign business processes. As such it forms the core technique for modeling in ARIS, which serves to link the different aspects of the so-called control view, discussed in the section on ARIS business process modeling. This paper describes a proposal made to TRANSPETRO's Oil Pipelines and Terminals Division in the states of Rio de Janeiro and Minas Gerais, which will be jointly developed by specialists and managers from TRANSPETRO and from COPPETEC, the collaborative research arm of Rio de Janeiro Federal University (UFRJ). The proposal is based on ARIS business process modeling and is presented here according to its seven phases, as follows: information survey and definition of the project structure; mapping and analysis of Campos Eliseos Terminal (TECAM) processes; validation of TECAM process maps; mapping and analysis of the remaining organizational units' processes; validation of the remaining organizational units' process maps; proposal of a business process model for all organizational units of TRANSPETRO's Oil Pipelines and Terminals Division in Rio de Janeiro and Minas Gerais; and critical analysis of the process itself and the results and potential benefits of BPM. (author)

  20. Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)

    Science.gov (United States)

    Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.

    2016-05-01

    This study introduces a practical approach to developing a real-time signal processing chain for general phased-array radar on NVIDIA GPUs (Graphics Processing Units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open-source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from CPUs. Performance, benchmarked as computation time for various input data cube sizes, is compared across GPUs and CPUs. Through the analysis, it is demonstrated that GPGPU (General-Purpose GPU) real-time processing of array radar data is possible with relatively low-cost commercial GPUs.
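
    The core of such a processing chain is typically FFT-based matched filtering (pulse compression). A minimal NumPy sketch of that step is shown below; on a GPU the identical structure maps onto cuFFT calls (for example through a drop-in GPU array library), which is the kind of mapping the paper describes. All signal parameters here are illustrative, not taken from the paper:

      import numpy as np

      def pulse_compress(echoes, chirp):
          # Matched filter via the frequency domain: cross-correlate each echo
          # with the transmitted chirp by multiplying spectra with the conjugate.
          n = echoes.shape[-1] + chirp.size - 1
          E = np.fft.fft(echoes, n=n, axis=-1)
          C = np.conj(np.fft.fft(chirp, n=n))
          return np.fft.ifft(E * C, axis=-1)

      rng = np.random.default_rng(0)
      chirp = np.exp(1j * np.pi * 0.5e-3 * np.arange(128)**2)  # toy LFM reference
      echoes = rng.standard_normal((64, 1024)) + 0j            # 64 channels of noise
      echoes[:, 400:528] += chirp                              # bury a target echo
      compressed = np.abs(pulse_compress(echoes, chirp))
      print(compressed.argmax(axis=-1)[:4])                    # peaks near sample 400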

  1. Modelling of uranium/plutonium splitting in purex process

    International Nuclear Information System (INIS)

    Boullis, B.; Baron, P.

    1987-06-01

    A mathematical model simulating the highly complex uranium/plutonium splitting operation in the PUREX process has been developed by the French Commissariat à l'Énergie Atomique. The development of such a model, which includes transfer and redox reaction kinetics for all the species involved, required substantial experimental work in the acquisition of basic chemical data. The model has been successfully validated by comparing its results with those of specific laboratory-scale trials and with the available operating results of the French reprocessing units. It has then been used for the design of the splitting operations of new French plants.

  2. Ising Processing Units: Potential and Challenges for Discrete Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Coffrin, Carleton James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Nagarajan, Harsha [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bent, Russell Whitford [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-05

    The recent emergence of novel computational devices, such as adiabatic quantum computers, CMOS annealers, and optical parametric oscillators, presents new opportunities for hybrid-optimization algorithms that leverage these kinds of specialized hardware. In this work, we propose the idea of an Ising processing unit as a computational abstraction for these emerging tools. Challenges involved in using and benchmarking these devices are presented, and open-source software tools are proposed to address some of these challenges. The proposed benchmarking tools and methodology are demonstrated by conducting a baseline study of established solution methods on a D-Wave 2X adiabatic quantum computer, one example of a commercially available Ising processing unit.
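
    For readers unfamiliar with the abstraction: an Ising processing unit minimizes the energy E(s) = -sum_{i<j} J_ij s_i s_j - sum_i h_i s_i over spins s_i in {-1, +1}. A classical baseline of the kind such hardware is benchmarked against is simulated annealing; a compact sketch on a random toy instance (not the report's benchmark suite) follows:

      import numpy as np

      def anneal(J, h, steps=20000, t_hot=5.0, t_cold=0.05, seed=1):
          # Single-spin-flip Metropolis with geometric cooling. J must be
          # symmetric with a zero diagonal; energy is -0.5*s@J@s - h@s.
          rng = np.random.default_rng(seed)
          n = h.size
          s = rng.choice([-1, 1], size=n)
          for k in range(steps):
              t = t_hot * (t_cold / t_hot) ** (k / steps)
              i = rng.integers(n)
              dE = 2.0 * s[i] * (J[i] @ s + h[i])   # energy change of flipping spin i
              if dE <= 0 or rng.random() < np.exp(-dE / t):
                  s[i] = -s[i]
          return s, -0.5 * s @ J @ s - h @ s

      rng = np.random.default_rng(0)
      n = 32
      J = rng.standard_normal((n, n))
      J = np.triu(J, 1); J = J + J.T                # symmetric couplings, zero diagonal
      h = rng.standard_normal(n)
      spins, energy = anneal(J, h)
      print(energy)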

  3. Conceptual Design for the Pilot-Scale Plutonium Oxide Processing Unit in the Radiochemical Processing Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Lumetta, Gregg J.; Meier, David E.; Tingey, Joel M.; Casella, Amanda J.; Delegard, Calvin H.; Edwards, Matthew K.; Jones, Susan A.; Rapko, Brian M.

    2014-08-05

    This report describes a conceptual design for a pilot-scale capability to produce plutonium oxide for use as exercise and reference materials, and for use in identifying and validating nuclear forensics signatures associated with plutonium production. This capability is referred to as the Pilot-scale Plutonium oxide Processing Unit (P3U), and it will be located in the Radiochemical Processing Laboratory at the Pacific Northwest National Laboratory. The key unit operations are described, including plutonium dioxide (PuO2) dissolution, purification of the Pu by ion exchange, precipitation, and conversion to oxide by calcination.

  4. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  5. Rubble-mound breakwater armour units displacement analysis by means of digital images processing methods in scale models

    OpenAIRE

    Courela, J.M.; Carvalho, R.; Lemos, R.; Fortes, C. J. E. M.; Leandro, J.

    2015-01-01

    Rubble-mound structures are commonly used for coastal and port protection and need a proper design as well as inspection and maintenance during their lifetime. The design of such breakwaters usually requires a physical scale model to be tested under different irregular incident wave and tide conditions in order to evaluate its hydraulic and structural behaviour, namely the stability of the proposed design. Armour unit displacement and fall analysis in physical models are then a ...

  6. Acceleration of Linear Finite-Difference Poisson-Boltzmann Methods on Graphics Processing Units.

    Science.gov (United States)

    Qi, Ruxi; Botello-Smith, Wesley M; Luo, Ray

    2017-07-11

    Electrostatic interactions play crucial roles in biophysical processes such as protein folding and molecular recognition. Poisson-Boltzmann equation (PBE)-based models have emerged as widely used tools for modeling these important processes. Though great effort has been put into developing efficient PBE numerical models, challenges still remain due to the high dimensionality of typical biomolecular systems. In this study, we implemented and analyzed commonly used linear PBE solvers on the ever-improving graphics processing units (GPUs) for biomolecular simulations, including both standard and preconditioned conjugate gradient (CG) solvers with several alternative preconditioners. Our implementation utilizes the standard Nvidia CUDA libraries cuSPARSE, cuBLAS, and CUSP. Extensive tests show that good numerical accuracy can be achieved, given that single precision is often used for numerical applications on GPU platforms. The optimal GPU performance was observed with the Jacobi-preconditioned CG solver, with a significant speedup over the standard CG solver on CPU in our diversified test cases. Our analysis further shows that different matrix storage formats also considerably affect the efficiency of different linear PBE solvers on GPU, with the diagonal format best suited for our standard finite-difference linear systems. Further efficiency may be possible with matrix-free operations and an integrated grid stencil setup specifically tailored for the banded matrices in PBE-specific linear systems.
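
    The Jacobi-preconditioned CG iteration that performed best in the study has a compact generic form. The sketch below runs it on a 1-D Poisson toy problem stored in SciPy's sparse DIA format (the paper's systems are 3-D finite-difference PBE matrices, and its implementation uses cuSPARSE/cuBLAS/CUSP on the GPU rather than NumPy):

      import numpy as np
      from scipy.sparse import diags

      def jacobi_pcg(A, b, tol=1e-8, max_iter=500):
          # Conjugate gradients with diagonal (Jacobi) preconditioning: z = D^-1 r.
          m_inv = 1.0 / A.diagonal()
          x = np.zeros_like(b)
          r = b - A @ x
          z = m_inv * r
          p = z.copy()
          rz = r @ z
          b_norm = np.linalg.norm(b)
          for it in range(max_iter):
              Ap = A @ p
              alpha = rz / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              if np.linalg.norm(r) <= tol * b_norm:
                  break
              z = m_inv * r
              rz_new = r @ z
              p = z + (rz_new / rz) * p
              rz = rz_new
          return x, it

      n = 200   # 1-D Poisson system in DIA format, mirroring the banded FD structure
      A = diags([-np.ones(n - 1), 2 * np.ones(n), -np.ones(n - 1)], [-1, 0, 1], format='dia')
      b = np.ones(n)
      x, iters = jacobi_pcg(A, b)
      print(iters, np.linalg.norm(A @ x - b))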

  7. Neuro-fuzzy modelling of hydro unit efficiency

    International Nuclear Information System (INIS)

    Iliev, Atanas; Fushtikj, Vangel

    2003-01-01

    This paper presents a neuro-fuzzy method for modeling hydro unit efficiency. The proposed method uses the characteristics of fuzzy systems as universal function approximators, as well as the ability of neural networks to adapt the parameters of the membership functions and the rules in the consequent part of the developed fuzzy system. The developed method is applied in practice to model the efficiency of a unit to be installed in the Kozjak hydro power plant. A comparison of the performance of the derived neuro-fuzzy method with several classical polynomial models is also performed. (Author)

  8. The process of implementation of emergency care units in Brazil.

    Science.gov (United States)

    O'Dwyer, Gisele; Konder, Mariana Teixeira; Reciputti, Luciano Pereira; Lopes, Mônica Guimarães Macau; Agostinho, Danielle Fernandes; Alves, Gabriel Farias

    2017-12-11

    To analyze the process of implementation of emergency care units in Brazil. We have carried out a documentary analysis, with interviews with twenty-four state urgency coordinators and a panel of experts. We have analyzed issues related to policy background and trajectory, players involved in the implementation, expansion process, advances, limits, and implementation difficulties, and state coordination capacity. We have used the theoretical framework of the analysis of strategic conduct from the Giddens theory of structuration. Emergency care units have been implemented after 2007, initially in the Southeast region, and 446 emergency care units were present in all Brazilian regions in 2016. Currently, 620 emergency care units are under construction, which indicates an expectation of expansion. Federal funding was a strong driver for the implementation. The states have planned their emergency care units, but the existence of direct negotiation between municipalities and the Union has contributed to the significant number of emergency care units that have been built but that do not work. In relation to the urgency network, there is tension with the hospitals because of the lack of beds in the country, which generates hospitalizations in the emergency care units. The management of emergency care units is predominantly municipal, and most of the emergency care units are located outside the capitals and classified as Size III. The main challenges identified were: under-funding and difficulty in recruiting physicians. The emergency care unit has the merit of having technological resources and being architecturally differentiated, but it will only succeed within an urgency network. Federal induction has generated contradictory responses, since not all states consider the emergency care unit a priority. The strengthening of state management has been identified as a challenge for the implementation of the urgency network.

  9. The process of implementation of emergency care units in Brazil

    Directory of Open Access Journals (Sweden)

    Gisele O'Dwyer

    2017-12-01

    Full Text Available ABSTRACT OBJECTIVE To analyze the process of implementation of emergency care units in Brazil. METHODS We have carried out a documentary analysis, with interviews with twenty-four state urgency coordinators and a panel of experts. We have analyzed issues related to policy background and trajectory, players involved in the implementation, expansion process, advances, limits, and implementation difficulties, and state coordination capacity. We have used the theoretical framework of the analysis of strategic conduct from the Giddens theory of structuration. RESULTS Emergency care units have been implemented after 2007, initially in the Southeast region, and 446 emergency care units were present in all Brazilian regions in 2016. Currently, 620 emergency care units are under construction, which indicates an expectation of expansion. Federal funding was a strong driver for the implementation. The states have planned their emergency care units, but the existence of direct negotiation between municipalities and the Union has contributed to the significant number of emergency care units that have been built but that do not work. In relation to the urgency network, there is tension with the hospitals because of the lack of beds in the country, which generates hospitalizations in the emergency care units. The management of emergency care units is predominantly municipal, and most of the emergency care units are located outside the capitals and classified as Size III. The main challenges identified were: under-funding and difficulty in recruiting physicians. CONCLUSIONS The emergency care unit has the merit of having technological resources and being architecturally differentiated, but it will only succeed within an urgency network. Federal induction has generated contradictory responses, since not all states consider the emergency care unit a priority. The strengthening of state management has been identified as a challenge for the implementation of the urgency network.

  10. Reflector antenna analysis using physical optics on Graphics Processing Units

    DEFF Research Database (Denmark)

    Borries, Oscar Peter; Sørensen, Hans Henrik Brandenborg; Dammann, Bernd

    2014-01-01

    The Physical Optics approximation is a widely used asymptotic method for calculating the scattering from electrically large bodies. It requires significant computational work and little memory, and is thus well suited for application on a Graphics Processing Unit. Here, we investigate the performance of a GPU implementation of Physical Optics for reflector antenna analysis.

  11. Modeling of flash calcination process during clay activation

    International Nuclear Information System (INIS)

    Borrajo Perez, Ruben; Gonzalez Bayon, Juan Jose; Sanchez Rodriguez, Andy A.

    2011-01-01

    Pozzolanic activity in some materials can be increased by means of different processes; among them, thermal activation is one of the most promising. The activation process, occurring at high temperatures and velocities, produces a material with better characteristics. In the last few years, pozzolans with high reactivity during the early days of curing have been produced. Temperature is an important parameter in the activation process and, as a consequence, activation units must accommodate temperature variation to allow the use of different raw materials, each with different characteristics. Considering the high price of kaolin in the market, new materials are being tested, such as clayey soil, which after a sedimentation process yields a clay that has turned out to be a suitable raw material when the kinetics of the pozzolanic reaction is considered. Additionally, other materials with higher kaolin content are being used with good results. This paper addresses the modeling of the thermal, hydrodynamic, and dehydroxylation processes experienced by solid particles exposed to a hot gas stream. The models employed are discussed; the velocity and temperature of the particles are obtained as functions of the carrier gas parameters. The calculations include heat losses, and the model finally predicts the residence time needed to complete the activation process. (author)
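
    In its simplest lumped-capacitance form, the residence-time prediction reduces to heating a particle to the dehydroxylation temperature in a hot gas stream. The sketch below uses illustrative property values and a constant heat-transfer coefficient; the paper's model additionally resolves hydrodynamics, heat losses, and reaction kinetics:

      import math

      def residence_time(d_p, rho_p, cp_p, h, t_gas, t_in, t_act):
          # Lumped-capacitance heating: m*cp*dT/dt = h*A*(t_gas - T), so
          # t = (rho*cp*d/(6*h)) * ln((t_gas - t_in)/(t_gas - t_act)).
          tau = rho_p * cp_p * d_p / (6.0 * h)   # m/A = rho*d/6 for a sphere
          return tau * math.log((t_gas - t_in) / (t_gas - t_act))

      # 40 um clay particle in 900 degC gas, activation near 600 degC (assumed values)
      print(residence_time(d_p=40e-6, rho_p=2600.0, cp_p=900.0,
                           h=800.0, t_gas=900.0, t_in=25.0, t_act=600.0))

    With these assumptions the heating time comes out in the tens of milliseconds, which is why flash calciners can work with very short gas-solid contact times.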

  12. Assessment of Process Capability: the case of Soft Drinks Processing Unit

    Science.gov (United States)

    Sri Yogi, Kottala

    2018-03-01

    Process capability studies have a significant impact in investigating process variation, which is important for achieving product quality characteristics. Capability indices measure the inherent variability of a process and thus support radical improvement of process performance. The main objective of this paper is to assess whether the output of a soft-drinks processing unit, one of the premier brands marketed in India, is produced within specification. A few selected critical parameters in soft-drinks processing (gas volume concentration, Brix concentration, and crown torque) were considered for this study. Relevant statistical measures were assessed from the process capability indices perspective: short-term capability and long-term capability. For the assessment, we used real-time data from a soft-drinks bottling company located in the state of Chhattisgarh, India. The results suggest reasons for variation in the process, which were validated using ANOVA; a Taguchi loss function was also estimated, and the predicted waste was assessed in monetary terms, which the organization can use to improve its process parameters. This research work has substantially benefited the organization in understanding the variation of the selected critical parameters in pursuit of zero rejection.
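
    For reference, the basic capability indices used in such a study are Cp = (USL - LSL)/(6*sigma) and Cpk = min(USL - mu, mu - LSL)/(3*sigma). A minimal sketch with simulated (not the plant's) Brix measurements:

      import numpy as np

      def capability(x, lsl, usl):
          mu = x.mean()
          sigma = x.std(ddof=1)   # overall sigma; short-term studies estimate
                                  # sigma from within-subgroup variation instead
          cp = (usl - lsl) / (6.0 * sigma)
          cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
          return cp, cpk

      rng = np.random.default_rng(42)
      brix = rng.normal(10.6, 0.08, size=125)    # simulated measurements
      print(capability(brix, lsl=10.4, usl=10.9))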

  13. A model-data based systems approach to process intensification

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Their developments, however, are largely due to experiment-based trial-and-error approaches, and while they do not require validation, they can be time consuming and resource intensive. Also, one may ask, can a truly new intensified unit operation be obtained in this way? An alternative two-stage approach is to apply a model-based synthesis method to systematically generate and evaluate alternatives in the first stage, and an experiment- and model-based validation in the second stage. In this way, the search for alternatives is done very quickly, reliably and systematically over a wide range, while resources are preserved for focused validation of only the promising candidates in the second stage. This approach, however, would be limited to intensification based on "known" unit operations, unless the PI process synthesis/design is considered at a lower level of aggregation, namely the phenomena level. That is, the model-based

  14. Modelling of temperature distribution and pulsations in fast reactor units

    International Nuclear Information System (INIS)

    Ushakov, P.A.; Sorokin, A.P.

    1994-01-01

    Reasons for the occurrence of thermal stresses in reactor units are analyzed. The main ones are: temperature non-uniformity at the outlet of the reactor core and breeder and the ensuing temperature pulsations; temperature pulsations due to the mixing of sodium jets of different temperatures; temperature non-uniformity and pulsations resulting from the shutdown of part of the loops (circuits); and temperature non-uniformity and fluctuations during transients and accidental shutdown of the reactor or transfer to cooling by natural circulation. The results of investigating the thermal-hydraulic characteristics are obtained by modelling the processes mentioned above. The analysis carried out allows the main lines of investigation to be defined, and conclusions can be drawn regarding the problem of temperature distribution and fluctuation in fast reactor units.

  15. Improving Earth/Prediction Models to Improve Network Processing

    Science.gov (United States)

    Wagner, G. S.

    2017-12-01

    The United States Atomic Energy Detection System (USAEDS) primary seismic network consists of a relatively small number of arrays and three-component stations. The relatively small number of stations in the USAEDS primary network makes it both necessary and feasible to optimize both station and network processing. Station processing improvements include detector tuning efforts that use Receiver Operating Characteristic (ROC) curves to help judiciously set acceptable Type 1 (false) vs. Type 2 (miss) error rates. Other station processing improvements include the use of empirical/historical observations and continuous background noise measurements to compute time-varying, maximum-likelihood probability-of-detection thresholds. The USAEDS network processing software makes extensive use of the azimuth and slowness information provided by frequency-wavenumber analysis at array sites, and polarization analysis at three-component sites. Most of the improvements in USAEDS network processing are due to improvements in the models used to predict azimuth, slowness, and probability of detection. Kriged travel-time, azimuth, and slowness corrections, and associated uncertainties, are computed using a ground-truth database. Improvements in station processing and the use of improved models for azimuth, slowness, and probability of detection have led to significant improvements in USAEDS network processing.
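
    The time-varying, noise-adaptive thresholds mentioned above can be illustrated with a Neyman-Pearson-style rule: under a Gaussian background-noise model, pick the threshold that yields a chosen false-alarm (Type 1) rate. This is only a schematic of the idea, not USAEDS code, and the numbers are invented:

      import numpy as np
      from scipy.stats import norm

      def detection_threshold(noise, p_fa):
          # Threshold with false-alarm probability p_fa under a Gaussian noise fit.
          mu = noise.mean()
          sigma = noise.std(ddof=1)
          return mu + sigma * norm.ppf(1.0 - p_fa)

      rng = np.random.default_rng(3)
      background = rng.normal(0.0, 1.2, size=3600)       # one hour of noise samples
      print(detection_threshold(background, p_fa=1e-3))  # ~ mu + 3.09 * sigma

    Recomputing this over a sliding noise window gives the time-varying thresholds; sweeping p_fa traces out the ROC curve used for tuning.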

  16. CALCULATION PECULIARITIES OF RE-PROCESSED ROAD COVERING UNIT COST

    Directory of Open Access Journals (Sweden)

    Dilyara Kyazymovna Izmaylova

    2017-09-01

    Full Text Available In this article, the economic expediency of applying non-waste technology to road covering repair and restoration is considered. The conditions for asphalt-concrete processing at plants are determined. A cost analysis of asphalt granulate is carried out, taking into account the conditions of transportation and pre-production processing. An example is given of calculating the cost of preparing one conventional unit of asphalt-concrete mixture volume with and without reprocessing.

  17. Micromagnetic simulations using Graphics Processing Units

    International Nuclear Information System (INIS)

    Lopez-Diaz, L; Aurelio, D; Torres, L; Martinez, E; Hernandez-Lopez, M A; Gomez, J; Alejos, O; Carpentieri, M; Finocchio, G; Consolo, G

    2012-01-01

    The methodology for adapting a standard micromagnetic code to run on graphics processing units (GPUs) and exploit the potential for parallel calculations of this platform is discussed. GPMagnet, a general purpose finite-difference GPU-based micromagnetic tool, is used as an example. Speed-up factors of two orders of magnitude can be achieved with GPMagnet with respect to a serial code. This allows for running extensive simulations, nearly inaccessible with a standard micromagnetic solver, at reasonable computational times. (topical review)

  18. Modeling and Optimization of the Medium-Term Units Commitment of Thermal Power

    Directory of Open Access Journals (Sweden)

    Shengli Liao

    2015-11-01

    Full Text Available Coal-fired thermal power plants, which represent the largest proportion of China's electric power system, are very sluggish in responding to power system load demands. Thus, a reasonable and feasible scheme for the medium-term optimal commitment of thermal units (MOCTU) can ensure that the generation process runs smoothly and can minimize the start-up and shut-down times of thermal units. In this paper, based on the real-world, practical demands of power dispatch centers in China, a flexible mathematical model for MOCTU is developed that uses equal utilization hours for the installed capacity of all thermal power plants as the optimization goal and that considers the award hours for MOCTU. MOCTU is a unit commitment (UC) problem with characteristics of large scale, high dimensionality, and nonlinearity. For optimization, an improved progressive optimality algorithm (IPOA) is adopted, retaining the advantages of POA while overcoming POA's drawback of easily falling into local optima. In the optimization process, strategies of system operating capacity equalization and single-station operating peak combination are introduced to move the target solution from the boundary constraints along the target isopleths into the feasible solution's interior to guarantee a global optimum. The results of a case study consisting of nine thermal power plants with 27 units show that the presented algorithm can obtain an optimal solution and is competent in solving the MOCTU with high efficiency and accuracy, and that the developed simulation model can be applied to practical engineering needs.
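
    The equal-utilization-hours objective can be stated very simply: in the ideal, unconstrained case every plant's energy target is its capacity times a common number of hours. The sketch below computes that ideal point, which the IPOA then has to reconcile with minimum up/down times, award hours, and the other constraints described above (capacities here are hypothetical):

      import numpy as np

      def equal_utilization_targets(capacity_mw, total_energy_mwh):
          # Common utilization hours h such that sum_i h * cap_i = total energy.
          hours = total_energy_mwh / capacity_mw.sum()
          return hours, hours * capacity_mw   # per-plant energy targets, MWh

      caps = np.array([600.0, 1200.0, 300.0, 900.0])   # plant capacities, MW
      hours, targets = equal_utilization_targets(caps, 9.0e6)
      print(hours, targets)   # 3000 h and capacity-proportional energy targets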

  19. Modeling styles in business process modeling

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Zugal, S.; Weber, B.; Weidlich, M.; Fahland, D.; Reijers, H.A.; Mendling, J.; Bider, I.; Halpin, T.; Krogstie, J.; Nurcan, S.; Proper, E.; Schmidt, R.; Soffer, P.; Wrycza, S.

    2012-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models. As a consequence, the question arises whether different ways of creating process models exist. In this vein, we observed 115 students engaged in the act of modeling, recording

  20. Theoretical and experimental study of a small unit for solar desalination using flashing process

    International Nuclear Information System (INIS)

    Nafey, A. Safwat; Mohamad, M.A.; El-Helaby, S.O.; Sharaf, M.A.

    2007-01-01

    A small unit for water desalination by solar energy and a flash evaporation process is investigated. The system is built at the Faculty of Petroleum and Mining Engineering at Suez, Egypt. The system consists of a solar water heater (flat plate solar collector) working as a brine heater and a vertical flash unit that is attached to a condenser/preheater unit. In this work, the system is investigated theoretically and experimentally under different real environmental conditions along the Julian days of one year (2005). A mathematical model is developed to calculate the productivity of the system under different operating conditions. The BIRD model for the calculation of solar insolation is used to predict the solar insolation instantaneously. Also, the solar insolation is measured by a highly sensitive digital pyranometer. A comparison between the theoretical and experimental results is performed. The average accumulative productivity of the system in November, December and January ranged between 1.04 and 1.45 kg/day/m². The average summer productivity ranged between 5.44 and 7 kg/day/m² in July and August and 4.2 and 5 kg/day/m² in June

  1. Theoretical and experimental study of a small unit for solar desalination using flashing process

    Energy Technology Data Exchange (ETDEWEB)

    Nafey, A. Safwat; El-Helaby, S.O.; Sharaf, M.A. [Department of Engineering Science, Faculty of Petroleum and Mining Engineering, Suez Canal University, Suez 43522 (Egypt); Mohamad, M.A. [Solar Energy Department, National Research Center, Cairo (Egypt)

    2007-02-15

    A small unit for water desalination by solar energy and a flash evaporation process is investigated. The system is built at the Faculty of Petroleum and Mining Engineering at Suez, Egypt. The system consists of a solar water heater (flat plate solar collector) working as a brine heater and a vertical flash unit that is attached to a condenser/preheater unit. In this work, the system is investigated theoretically and experimentally under different real environmental conditions along the Julian days of one year (2005). A mathematical model is developed to calculate the productivity of the system under different operating conditions. The BIRD model for the calculation of solar insolation is used to predict the solar insolation instantaneously. Also, the solar insolation is measured by a highly sensitive digital pyranometer. A comparison between the theoretical and experimental results is performed. The average accumulative productivity of the system in November, December and January ranged between 1.04 and 1.45 kg/day/m². The average summer productivity ranged between 5.44 and 7 kg/day/m² in July and August and 4.2 and 5 kg/day/m² in June. (author)
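
    An order-of-magnitude check on the reported productivities needs only an energy balance: if a fraction eta of the daily insolation ends up flashing brine, the distillate is bounded by that heat divided by the latent heat of vaporization. This back-of-envelope sketch (assumed efficiency, not the paper's detailed model) lands in the reported 4-7 kg/day/m² range for summer insolation:

      def daily_yield_kg_per_m2(insolation_kwh_m2, eta):
          h_fg = 2.33e6                               # J/kg latent heat near 60-70 degC
          q_useful = insolation_kwh_m2 * 3.6e6 * eta  # J absorbed per m2 per day
          return q_useful / h_fg

      print(daily_yield_kg_per_m2(7.0, 0.45))   # summer day -> about 4.9 kg/m2
      print(daily_yield_kg_per_m2(3.5, 0.30))   # winter day -> about 1.6 kg/m2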

  2. Reforging the Wedding Ring: Exploring a Semi-Artificial Model of Population for the United Kingdom with Gaussian process emulators

    OpenAIRE

    Viet Dung Cao; Jason Hilton; Eric Silverman; Jakub Bijak

    2013-01-01

    Background: We extend the "Wedding Ring" agent-based model of marriage formation to include some empirical information on the natural population change for the United Kingdom, together with behavioural explanations that drive the observed nuptiality trends. Objective: We propose a method to explore statistical properties of agent-based demographic models. By coupling rule-based explanations driving the agent-based model with observed data, we wish to bring agent-based modelling and demographic ...

  3. Deriving social relations among organizational units from process models

    NARCIS (Netherlands)

    Song, M.S.; Choi, I.; Kim, K.M.; Aalst, van der W.M.P.

    2008-01-01

    For companies to sustain competitive advantages, it is required to redesign and improve business processes continuously by monitoring and analyzing process enactment results. Furthermore, organizational structures must be redesigned according to the changes in business processes. However, there are

  4. Formalizing the Process of Constructing Chains of Lexical Units

    Directory of Open Access Journals (Sweden)

    Grigorij Chetverikov

    2015-06-01

    Full Text Available The paper investigates mathematical aspects of describing the construction of chains of lexical units on the basis of finite-predicate algebra. An analysis of the construction peculiarities is carried out, and the application of the method of finding the power of a linear logical transformation for removing the characteristic words of a dictionary entry is given. An analysis and the perspectives of the results of the study are provided.

  5. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work products...

  6. Accelerating cardiac bidomain simulations using graphics processing units.

    Science.gov (United States)

    Neic, A; Liebmann, M; Hoetzl, E; Mitchell, L; Vigmond, E J; Haase, G; Plank, G

    2012-08-01

    Anatomically realistic and biophysically detailed multiscale computer models of the heart are playing an increasingly important role in advancing our understanding of integrated cardiac function in health and disease. Such detailed simulations, however, are computationally vastly demanding, which is a limiting factor for a wider adoption of in-silico modeling. While current trends in high-performance computing (HPC) hardware promise to alleviate this problem, exploiting the potential of such architectures remains challenging since strongly scalable algorithms are necessitated to reduce execution times. Alternatively, acceleration technologies such as graphics processing units (GPUs) are being considered. While the potential of GPUs has been demonstrated in various applications, benefits in the context of bidomain simulations where large sparse linear systems have to be solved in parallel with advanced numerical techniques are less clear. In this study, the feasibility of multi-GPU bidomain simulations is demonstrated by running strong scalability benchmarks using a state-of-the-art model of rabbit ventricles. The model is spatially discretized using the finite element methods (FEM) on fully unstructured grids. The GPU code is directly derived from a large pre-existing code, the Cardiac Arrhythmia Research Package (CARP), with very minor perturbation of the code base. Overall, bidomain simulations were sped up by a factor of 11.8 to 16.3 in benchmarks running on 6-20 GPUs compared to the same number of CPU cores. To match the fastest GPU simulation which engaged 20 GPUs, 476 CPU cores were required on a national supercomputing facility.

  7. A Hydrostratigraphic Model and Alternatives for the Groundwater Flow and Contaminant Transport Model of Corrective Action Unit 99: Rainier Mesa-Shoshone Mountain, Nye County, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    NSTec Geotechnical Sciences Group

    2007-03-01

    The three-dimensional hydrostratigraphic framework model for the Rainier Mesa-Shoshone Mountain Corrective Action Unit was completed in Fiscal Year 2006. The model extends from eastern Pahute Mesa in the north to Mid Valley in the south and centers on the former nuclear testing areas at Rainier Mesa, Aqueduct Mesa, and Shoshone Mountain. The model area also includes an overlap with the existing Underground Test Area Corrective Action Unit models for Yucca Flat and Pahute Mesa. The model area is geologically diverse and includes un-extended yet highly deformed Paleozoic terrain and high volcanic mesas between the Yucca Flat extensional basin on the east and caldera complexes of the Southwestern Nevada Volcanic Field on the west. The area also includes a hydrologic divide between two groundwater sub-basins of the Death Valley regional flow system. A diverse set of geological and geophysical data collected over the past 50 years was used to develop a structural model and hydrostratigraphic system for the model area. Three deep characterization wells, a magnetotelluric survey, and reprocessed gravity data were acquired specifically for this modeling initiative. These data and associated interpretive products were integrated using EarthVision® software to develop the three-dimensional hydrostratigraphic framework model. Crucial steps in the model building process included establishing a fault model, developing a hydrostratigraphic scheme, compiling a drill-hole database, and constructing detailed geologic and hydrostratigraphic cross sections and subsurface maps. The more than 100 stratigraphic units in the model area were grouped into 43 hydrostratigraphic units based on each unit's propensity toward aquifer or aquitard characteristics. The authors organized the volcanic units in the model area into 35 hydrostratigraphic units that include 16 aquifers, 12 confining units, 2 composite units (a mixture of aquifer and confining units), and 5 intrusive

  8. Modeling and Simulation of Claus Unit Reaction Furnace

    Directory of Open Access Journals (Sweden)

    Maryam Pahlavan

    2016-01-01

    Full Text Available The reaction furnace is the most important part of the Claus sulfur recovery unit, and its performance has a significant impact on the process efficiency. Many reactions happen in the furnace, and their kinetics and mechanisms are not completely understood; therefore, modeling the reaction furnace is difficult, and several works have been carried out in this regard so far. Equilibrium models are commonly used to simulate the furnace, but the literature states that the outlet of the furnace is not in equilibrium and the furnace reactions are controlled by kinetic laws; therefore, in this study, the reaction furnace is simulated by a kinetic model. The outlet temperature and concentrations predicted by this model are compared with experimental data published in the literature and with data obtained by the PROMAX V2.0 simulator. The results show that the accuracy of the proposed kinetic model and the PROMAX simulator is almost similar, but the kinetic model used in this paper has two important capabilities. Firstly, it is a distributed model and can be used to obtain the temperature and concentration profiles along the furnace. Secondly, it is a dynamic model and can be used for analyzing the transient behavior and designing the control system.
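
    A distributed kinetic model of this kind treats the furnace as a plug-flow reactor and integrates rate laws along its length, which is what yields axial temperature and concentration profiles instead of a single equilibrium point. The toy sketch below integrates one first-order Arrhenius reaction with illustrative parameters only; the actual furnace chemistry involves many coupled reactions and is not reproduced here:

      import numpy as np

      def plug_flow_profile(c_in, t_in, u, length, nz=500,
                            a=2.0e4, ea=8.0e4, dh=-9.0e4, rho_cp=1.3e3):
          # March a first-order reaction r = a*exp(-Ea/RT)*c down the reactor:
          #   u dc/dz = -r,   u dT/dz = -dH*r/(rho*cp)   (adiabatic, no losses)
          R = 8.314
          dz = length / nz
          c, t = c_in, t_in
          profile = []
          for _ in range(nz):
              r = a * np.exp(-ea / (R * t)) * c
              c += dz * (-r / u)
              t += dz * (-dh * r / (rho_cp * u))
              profile.append((c, t))
          return np.array(profile)

      prof = plug_flow_profile(c_in=5.0, t_in=1300.0, u=6.0, length=8.0)
      print(prof[-1])   # outlet concentration (mol/m3) and temperature (K)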

  9. Developing maintenance technologies for FBR's heat exchanger units by advanced laser processing

    International Nuclear Information System (INIS)

    Nishimura, Akihiko; Shimada, Yukihiro

    2011-01-01

    Laser processing technologies were developed for the maintenance of FBR heat exchanger units. Ultrashort-pulse laser processing was used to fabricate fiber Bragg grating sensors for seismic monitoring. Fiber laser welding with a newly developed robot system repairs cracks on the inner wall of heat exchanger tubes. The safe operation of the heat exchanger units will be improved by these advanced laser processing technologies, which are expected to be applied to the maintenance of next-generation FBRs. (author)

  10. Fast, multi-channel real-time processing of signals with microsecond latency using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Rath, N., E-mail: Nikolaus@rath.org; Levesque, J. P.; Mauel, M. E.; Navratil, G. A.; Peng, Q. [Department of Applied Physics and Applied Mathematics, Columbia University, 500 W 120th St, New York, New York 10027 (United States); Kato, S. [Department of Information Engineering, Nagoya University, Nagoya (Japan)

    2014-04-15

    Fast, digital signal processing (DSP) has many applications. Typical hardware options for performing DSP are field-programmable gate arrays (FPGAs), application-specific integrated DSP chips, or general purpose personal computer systems. This paper presents a novel DSP platform that has been developed for feedback control on the HBT-EP tokamak device. The system runs all signal processing exclusively on a Graphics Processing Unit (GPU) to achieve real-time performance with latencies below 8 μs. Signals are transferred into and out of the GPU using PCI Express peer-to-peer direct-memory-access transfers without involvement of the central processing unit or host memory. Tests were performed on the feedback control system of the HBT-EP tokamak using forty 16-bit floating point inputs and outputs each and a sampling rate of up to 250 kHz. Signals were digitized by a D-TACQ ACQ196 module, processing done on an NVIDIA GTX 580 GPU programmed in CUDA, and analog output was generated by D-TACQ AO32CPCI modules.

  11. Fast, multi-channel real-time processing of signals with microsecond latency using graphics processing units

    International Nuclear Information System (INIS)

    Rath, N.; Levesque, J. P.; Mauel, M. E.; Navratil, G. A.; Peng, Q.; Kato, S.

    2014-01-01

    Fast, digital signal processing (DSP) has many applications. Typical hardware options for performing DSP are field-programmable gate arrays (FPGAs), application-specific integrated DSP chips, or general purpose personal computer systems. This paper presents a novel DSP platform that has been developed for feedback control on the HBT-EP tokamak device. The system runs all signal processing exclusively on a Graphics Processing Unit (GPU) to achieve real-time performance with latencies below 8 μs. Signals are transferred into and out of the GPU using PCI Express peer-to-peer direct-memory-access transfers without involvement of the central processing unit or host memory. Tests were performed on the feedback control system of the HBT-EP tokamak using forty 16-bit floating point inputs and outputs each and a sampling rate of up to 250 kHz. Signals were digitized by a D-TACQ ACQ196 module, processing done on an NVIDIA GTX 580 GPU programmed in CUDA, and analog output was generated by D-TACQ AO32CPCI modules

  12. Development of Water Quality Modeling in the United States

    Science.gov (United States)

    This presentation describes historical trends in water quality model development in the United States, reviews current efforts, and projects promising future directions. Water quality modeling has a relatively long history in the United States. While its origins lie in the work...

  13. Fast analytical scatter estimation using graphics processing units.

    Science.gov (United States)

    Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris

    2015-01-01

    To develop a fast, patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first-order scatter in cone-beam image reconstruction improves the contrast-to-noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and, with further acceleration and a method to account for multiple scatter, may be useful for practical scatter correction schemes.

  14. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    Science.gov (United States)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
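
    Conceptually, the transformation projects the global interaction model onto one participant: every message exchange is kept from that enterprise's point of view (send or receive) and everything else is dropped. A minimal, hand-rolled sketch of such a projection using plain data structures (not the paper's MDA tool chain, and with an invented toy notation):

      # Collaborative process as an ordered list of interactions (toy notation).
      collaborative = [
          {"from": "Buyer",    "to": "Supplier", "msg": "PurchaseOrder"},
          {"from": "Supplier", "to": "Buyer",    "msg": "OrderResponse"},
          {"from": "Supplier", "to": "Carrier",  "msg": "ShippingRequest"},
          {"from": "Carrier",  "to": "Buyer",    "msg": "DeliveryNotice"},
      ]

      def interface_process(collab, enterprise):
          # Keep only interactions the enterprise takes part in, annotated with
          # the activity it must perform in its own interface process.
          steps = []
          for ia in collab:
              if ia["from"] == enterprise:
                  steps.append(("send", ia["msg"], ia["to"]))
              elif ia["to"] == enterprise:
                  steps.append(("receive", ia["msg"], ia["from"]))
          return steps

      print(interface_process(collaborative, "Supplier"))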

  15. Modelling process integration and its management – case of a public housing delivery organization in United Arab Emirates

    Directory of Open Access Journals (Sweden)

    Venkatachalam Senthilkumar

    2017-01-01

    Full Text Available Huge volumes of project information are generated during the life cycle of AEC projects. This project information is categorized into technical and administrative information and managed through appropriate processes. There are many tools, such as Document Management Systems and Building Information Modeling (BIM), available to manage and integrate the technical information. However, administrative information and its related processes, such as payment, status, authorization, and approval, are not effectively managed. The current study aims to explore the administrative information management processes of a local public housing delivery agency. This agency manages more than 2000 housing projects at any given time of the year. The administrative processes are characterized by delivery inconsistencies among the various project participants. Though many process management systems are commercially available, there exist limitations on the customization of their modules/systems. Hence there is a need to develop an information management system which can integrate and manage these housing project processes effectively. This requires the modeling of the administrative processes and their interfaces among the various stakeholder processes. Hence this study aims to model the administrative processes and their related information during the life cycle of the project using IDEF0 and IDEF1X modeling. The captured processes and information interfaces are analyzed, and appropriate process integration is suggested to avoid delays in the project delivery processes. Further, the resultant model can be used for effectively managing the housing delivery projects.

  16. Fast ray-tracing of human eye optics on Graphics Processing Units.

    Science.gov (United States)

    Wei, Qi; Patkar, Saket; Pai, Dinesh K

    2014-05-01

    We present a new technique for simulating retinal image formation by tracing a large number of rays from objects in three dimensions as they pass through the optic apparatus of the eye to the retina. Simulating human optics is useful for understanding basic questions of vision science and for studying vision defects and their corrections. Because of the complexity of computing such simulations accurately, most previous efforts used simplified analytical models of the normal eye. This makes them less effective for modeling vision disorders associated with abnormal shapes of the ocular structures, which are hard to represent precisely with analytical surfaces. We have developed a computer simulator that can simulate ocular structures of arbitrary shapes, for instance represented by polygon meshes. Topographic and geometric measurements of the cornea, lens, and retina from keratometer or medical imaging data can be integrated for individualized examination. We utilize parallel processing on modern Graphics Processing Units (GPUs) to efficiently compute retinal images by tracing millions of rays. A stable retinal image can be generated within minutes. We simulated depth of field, accommodation, chromatic aberrations, as well as astigmatism and its correction. We also show an application of the technique to patient-specific vision correction by incorporating geometric models of the orbit reconstructed from clinical medical images. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
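
    At the heart of such a simulator, each of the millions of rays is refracted at every ocular interface using the vector form of Snell's law, which is also the operation that parallelizes trivially on a GPU. A single-ray NumPy sketch follows (the real code traces against polygon meshes rather than a known normal, and the index values are textbook approximations):

      import numpy as np

      def refract(d, n, n1, n2):
          # d: unit incident direction; n: unit surface normal facing the incident
          # side; n1 -> n2: refractive indices. Returns the refracted direction,
          # or None on total internal reflection.
          cos_i = -np.dot(d, n)
          eta = n1 / n2
          k = 1.0 - eta**2 * (1.0 - cos_i**2)
          if k < 0.0:
              return None
          return eta * d + (eta * cos_i - np.sqrt(k)) * n

      d = np.array([0.0, -0.2588, -0.9659])      # ray hitting the cornea at ~15 deg
      n = np.array([0.0, 0.0, 1.0])
      print(refract(d, n, n1=1.000, n2=1.376))   # air -> cornea (approx. indices)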

  17. Capabilities for modelling of conversion processes in LCA

    DEFF Research Database (Denmark)

    Damgaard, Anders; Zarrin, Bahram; Tonini, Davide

    2015-01-01

    substances themselves change through a process chain. A good example of this is bio-refinery processes, where different residual biomass products are converted through different steps into the final energy product. Here it is necessary to know the stoichiometry of the different products going in and being ... little focus on the chemical composition of the functional flows, as flows in the models have mainly been tracked on a mass basis, with the focus on the function of the product and not on its chemical composition. Conversely, for modelling environmental technologies such as wastewater treatment ..., EASETECH (Clavreul et al., 2014) was developed, which integrates a matrix approach for the functional unit containing the full chemical composition of the different material fractions, as well as the number of different material fractions present in the overall mass being handled. These chemical substances

  18. Integration Process for the Habitat Demonstration Unit

    Science.gov (United States)

    Gill, Tracy; Merbitz, Jerad; Kennedy, Kriss; Tri, Terry; Howe, A. Scott

    2010-01-01

    The Habitat Demonstration Unit (HDU) is an experimental exploration habitat technology and architecture test platform designed for analog demonstration activities. The HDU project has required a team to integrate a variety of contributions from NASA centers and outside collaborators, and poses a challenge in integrating these disparate efforts into a cohesive architecture. To complete the development of the HDU from conception in June 2009 to rollout for operations in July 2010, a cohesive integration strategy has been developed to integrate the various systems of HDU and the payloads, such as the Geology Lab, that those systems will support. The utilization of interface design standards and uniquely tailored reviews has allowed for an accelerated design process. Scheduled activities include early fit-checks and the utilization of a Habitat avionics test bed prior to equipment installation into HDU. A coordinated effort to utilize modeling and simulation systems has aided in design and integration concept development. Modeling tools have been effective in hardware systems layout, cable routing and length estimation, and human factors analysis. Decision processes on the shell development, including the assembly sequence and the transportation, have been fleshed out early on HDU to maximize the efficiency of both integration and field operations. Incremental test operations leading up to an integrated systems test allow for an orderly systems test program. The HDU will begin its journey as an emulation of a Pressurized Excursion Module (PEM) for 2010 field testing and then may evolve to a Pressurized Core Module (PCM) for 2011 and later field tests, depending on agency architecture decisions. The HDU deployment will vary slightly from current lunar architecture plans to include developmental hardware and software items and additional systems called opportunities for technology demonstration. One of the HDU challenges has been designing to be prepared for the integration of

  19. Psychiatry training in the United Kingdom--part 2: the training process.

    Science.gov (United States)

    Christodoulou, N; Kasiakogia, K

    2015-01-01

    In the second part of this diptych, we deal with psychiatric training in the United Kingdom in detail, and we compare it, wherever this is meaningful, with the equivalent system in Greece. As explained in the first part of the paper, due to the recently increased emigration of Greek psychiatrists and psychiatric trainees, and the fact that the United Kingdom is a popular destination, it has become necessary to inform those aspiring to train in the United Kingdom of the system and the circumstances they should expect to encounter. This paper principally describes the structure of the United Kingdom's psychiatric training system, including the different stages trainees progress through and their respective requirements and processes. Specifically, specialty and subspecialty options are described and explained, special paths in training are analysed, and the notions of the "special interest day" and the optional "Out of programme experience" schemes are explained. Furthermore, detailed information is offered on the pivotal points of each of the stages of the training process, with special care to explain the important differences and similarities between the systems in Greece and the United Kingdom. Special attention is given to The Royal College of Psychiatrists' Membership Exams (MRCPsych) because they are the only exams towards completing specialisation in Psychiatry in the United Kingdom. Also stressed is the educational culture of progressing according to a set curriculum, of utilising diverse means of professional development, of empowering the trainees' autonomy by allowing initiative-based development, and of applying peer supervision as a tool for professional development. We conclude that psychiatric training in the United Kingdom differs substantially from that in Greece in both structure and process. There are various differences, such as pure psychiatric training in the United Kingdom versus neurological and medical modules in Greece, in

  20. [APPLICATION OF THE SMITH AND DEKKER PREVENTIVE MAINTENANCE MODEL AT PD. INDUSTRI UNIT INKABA]

    Directory of Open Access Journals (Sweden)

    Hari Adianto

    2005-01-01

    Full Text Available In the globalization era, with its fast development of technology, industries must try to increase the quantity and quality of the products they produce. The continual growth of industrial output needs support from a smoothly running production process. In this case, industrial companies want a high-availability system. PD Industri Unit Inkaba is one of the companies operating in the rubber-engineering industry sector. The company wants the production process to run smoothly so that it can maintain its position, increase product quality with cost efficiency, and compete in foreign markets. A smooth production process needs support from machines and production tools in good condition. To keep the machines in good condition, so that they are in optimal condition when used, the machines need to be maintained. M.A.J. Smith and R. Dekker developed a model that combines an availability model with preventive maintenance and considers the expected uptime and downtime of the system. This is a 1-out-of-n system model, which has one operating machine supported by (n - 1) reserve machines. A 1-out-of-n system is also applicable to replaceable components. Smith and Dekker's model approximates the expected uptime and downtime of the system, which yields a good approximation of the long-term average operating cost. The resulting decisions on the preventive replacement age of a component and the optimal number of reserve components are expected to maintain system reliability and to avoid the decrease in system availability caused by maintenance activity. Abstract in Bahasa Indonesia (translated): In an era of increasingly global industrial competition accompanied by rapid technological development, industries continually strive to increase the quantity and quality of the products they produce. Continually increasing industrial output requires support

  1. ARTA process model of maritime clutter and targets

    CSIR Research Space (South Africa)

    McDonald, A

    2012-10-01

    Full Text Available IET Radar 2012, International conference on radar systems, Glasgow, United Kingdom, 22-25 October 2012. ARTA process model of maritime clutter and targets. Andre McDonald and Jacques Cilliers, Council for Scientific and Industrial Research (CSIR), Meiring...

  2. Prototype design of singles processing unit for the small animal PET

    Science.gov (United States)

    Deng, P.; Zhao, L.; Lu, J.; Li, B.; Dong, R.; Liu, S.; An, Q.

    2018-05-01

    Positron Emission Tomography (PET) is an advanced clinical diagnostic imaging technique for nuclear medicine. Small animal PET is increasingly used for studying animal models of disease, new drugs and new therapies. A prototype of a Singles Processing Unit (SPU) for a small animal PET system was designed to obtain the time, energy, and position information. The energy and position are actually calculated through high-precision charge measurement, which is based on amplification, shaping, A/D conversion and area calculation in the digital signal processing domain. Analysis and simulations were also conducted to optimize the key parameters in the system design. Initial tests indicate that the charge and time precisions are better than 3‰ FWHM and 350 ps FWHM respectively, while the position resolution is better than 3.5‰ FWHM. Combined tests of the SPU prototype with the PET detector indicate that the system time precision is better than 2.5 ns, while the flood map and energy spectra accorded well with expectations.
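    The core of the energy and position measurement described above is a charge estimate obtained as the area under the digitized, shaped pulse. The minimal sketch below illustrates that area calculation with baseline subtraction; the sampling period, shaping constant and pulse shape are assumptions for illustration, not the SPU's actual parameters.

      import numpy as np

      def pulse_charge(samples, dt, baseline_window=32):
          """Charge estimate: baseline-corrected area of a digitized, shaped
          detector pulse, via trapezoidal integration."""
          samples = np.asarray(samples, dtype=float)
          baseline = samples[:baseline_window].mean()  # pre-pulse baseline
          return np.trapz(samples - baseline, dx=dt)   # area is proportional to charge

      # Toy CR-RC-shaped pulse (illustrative shape and constants only)
      dt = 2e-9                        # assumed 2 ns sampling period
      t = np.arange(512) * dt
      t0, tau = 100e-9, 50e-9          # assumed arrival time and shaping constant
      arg = np.clip((t - t0) / tau, 0.0, None)
      pulse = arg * np.exp(1.0 - arg)  # unit-amplitude shaped pulse
      print(f"estimated charge (arb. units): {pulse_charge(pulse, dt):.3e}")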

  3. Computerized prediction of intensive care unit discharge after cardiac surgery: development and validation of a Gaussian processes model

    Directory of Open Access Journals (Sweden)

    Meyfroidt Geert

    2011-10-01

    Full Text Available Abstract. Background: The intensive care unit (ICU) length of stay (LOS) of patients undergoing cardiac surgery may vary considerably, and is often difficult to predict within the first hours after admission. The early clinical evolution of a cardiac surgery patient might be predictive of LOS. The purpose of the present study was to develop a predictive model for ICU discharge after non-emergency cardiac surgery, by analyzing the first 4 hours of data in the computerized medical record of these patients with Gaussian processes (GP), a machine learning technique. Methods: Non-interventional study. Predictive modeling, separate development (n = 461) and validation (n = 499) cohorts. GP models were developed to predict the probability of ICU discharge the day after surgery (classification task), and to predict the day of ICU discharge as a discrete variable (regression task). GP predictions were compared with predictions by EuroSCORE, nurses and physicians. The classification task was evaluated using aROC for discrimination, and Brier Score, Brier Score Scaled, and the Hosmer-Lemeshow test for calibration. The regression task was evaluated by comparing median actual and predicted discharge, a loss penalty function (LPF, (actual - predicted)/actual), and root mean squared relative errors (RMSRE). Results: Median (P25-P75) ICU length of stay was 3 (2-5) days. For classification, the GP model showed an aROC of 0.758, which was significantly higher than the predictions by nurses, but not better than EuroSCORE and physicians. The GP had the best calibration, with a Brier Score of 0.179 and a Hosmer-Lemeshow p-value of 0.382. For regression, GP had the highest proportion of patients with a correctly predicted day of discharge (40%), which was significantly better than the EuroSCORE. Conclusions: A GP model that uses PDMS data of the first 4 hours after admission in the ICU of scheduled adult cardiac surgery patients was able to predict discharge from the ICU as a
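    As a rough illustration of the classification task, the sketch below fits a Gaussian process classifier to synthetic stand-in data; the features, labels and kernel choice are assumptions for demonstration, not the study's PDMS variables or model.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessClassifier
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(0)

      # Synthetic stand-in for first-4-hour features (purely illustrative,
      # not the study's clinical variables)
      X = rng.normal(size=(200, 3))
      # Synthetic label: discharged the day after surgery (1) or not (0)
      y = (X @ np.array([1.0, -0.5, 0.8]) + 0.3 * rng.normal(size=200) > 0).astype(int)

      gp = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
      gp.fit(X[:150], y[:150])                  # development cohort
      proba = gp.predict_proba(X[150:])[:, 1]   # P(discharge next day), validation cohort
      print("predicted discharge probabilities:", np.round(proba[:5], 2))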

  4. Undergraduate Game Degree Programs in the United Kingdom and United States: A Comparison of the Curriculum Planning Process

    Science.gov (United States)

    McGill, Monica M.

    2010-01-01

    Digital games are marketed, mass-produced, and consumed by an increasing number of people and the game industry is only expected to grow. In response, post-secondary institutions in the United Kingdom (UK) and the United States (US) have started to create game degree programs. Though curriculum theorists provide insight into the process of…

  5. Ecohydrologic process modeling of mountain block groundwater recharge.

    Science.gov (United States)

    Magruder, Ian A; Woessner, William W; Running, Steve W

    2009-01-01

    Regional mountain block recharge (MBR) is a key component of alluvial basin aquifer systems typical of the western United States. Yet neither water scientists nor resource managers have a commonly available and reasonably invoked quantitative method to constrain MBR rates. Recent advances in landscape-scale ecohydrologic process modeling offer the possibility that meteorological data and land surface physical and vegetative conditions can be used to generate estimates of MBR. A water balance was generated for a temperate 24,600-ha mountain watershed, elevation 1565 to 3207 m, using the ecosystem process model Biome-BGC (BioGeochemical Cycles) (Running and Hunt 1993). Input data included remotely sensed landscape information and climate data generated with the Mountain Climate Simulator (MT-CLIM) (Running et al. 1987). Estimated mean annual MBR flux into the crystalline bedrock terrain is 99,000 m³/d, or approximately 19% of annual precipitation for the 2003 water year. Controls on MBR predictions include evapotranspiration (radiation limited in wet years and moisture limited in dry years), soil properties, vegetative ecotones (significant at lower elevations), and snowmelt (dominant recharge process). The ecohydrologic model is also used to investigate how climatic and vegetative controls influence recharge dynamics within three elevation zones. The ecohydrologic model proves useful for investigating controls on recharge to mountain blocks as a function of climate and vegetation. Future efforts will need to investigate the uncertainty in the modeled water balance by incorporating an advanced understanding of mountain recharge processes, an ability to simulate those processes at varying scales, and independent approaches to calibrating MBR estimates. Copyright © 2009 The Author(s). Journal compilation © 2009 National Ground Water Association.

  6. Health care managers' views on and approaches to implementing models for improving care processes.

    Science.gov (United States)

    Andreasson, Jörgen; Eriksson, Andrea; Dellve, Lotta

    2016-03-01

    To develop a deeper understanding of health-care managers' views on and approaches to the implementation of models for improving care processes. In health care, there are difficulties in implementing models for improving care processes that have been decided on by upper management. Leadership approaches to this implementation can affect the outcome. In-depth interviews with first- and second-line managers in Swedish hospitals were conducted and analysed using grounded theory. 'Coaching for participation' emerged as a central theme for managers in handling top-down initiated process development. The vertical approach in this coaching addresses how managers attempt to sustain unit integrity through adapting and translating orders from top management. The horizontal approach in the coaching refers to managers' strategies for motivating and engaging their employees in implementation work. Implementation models for improving care processes require a coaching leadership built on close manager-employee interaction, mindfulness regarding the pace of change at the unit level, managers with the competence to share responsibility with their teams and engaged employees with the competence to share responsibility for improving the care processes, and organisational structures that support process-oriented work. Implications for nursing management are the importance of giving nurse managers knowledge of change management. © 2015 John Wiley & Sons Ltd.

  7. Utilizing General Purpose Graphics Processing Units to Improve Performance of Computer Modelling and Visualization

    Science.gov (United States)

    Monk, J.; Zhu, Y.; Koons, P. O.; Segee, B. E.

    2009-12-01

    With the introduction of the G8X series of cards by nVidia, an architecture called CUDA was released; virtually all subsequent video cards have had CUDA support. With this new architecture nVidia provided extensions for C/C++ that create an Application Programming Interface (API) allowing code to be executed on the GPU. Since then the concept of GPGPU (general purpose graphics processing unit) computing has been growing: the GPU is very good at algebra and at running things in parallel, so we should make use of that power for other applications. This is highly appealing in the area of geodynamic modeling, as multiple parallel solutions of the same differential equations at different points in space lead to a large speedup in simulation speed. Another benefit of CUDA is a programmatic method of transferring large amounts of data between the computer's main memory and the dedicated GPU memory located on the video card. In addition to being able to compute and render on the video card, the CUDA framework allows for a large speedup in the situation, such as with a tiled display wall, where the rendered pixels are to be displayed in a different location than where they are rendered. A CUDA extension for VirtualGL was developed allowing for faster read-back at high resolutions. This paper examines several aspects of rendering OpenGL graphics on large displays using VirtualGL and VNC. It demonstrates how performance can be significantly improved in rendering on a tiled monitor wall. We present a CUDA-enhanced version of VirtualGL as well as the advantages of having multiple VNC servers. It also discusses restrictions caused by read-back and blitting rates and how they are affected by different sizes of virtual displays being rendered.

  8. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  9. General purpose graphics-processing-unit implementation of cosmological domain wall network evolution.

    Science.gov (United States)

    Correia, J R C C C; Martins, C J A P

    2017-10-01

    Topological defects unavoidably form at symmetry breaking phase transitions in the early universe. To probe the parameter space of theoretical models and set tighter experimental constraints (exploiting the recent advances in astrophysical observations), one requires more and more demanding simulations, and therefore more hardware resources and computation time. Improving the speed and efficiency of existing codes is essential. Here we present a general purpose graphics-processing-unit implementation of the canonical Press-Ryden-Spergel algorithm for the evolution of cosmological domain wall networks. This is ported to the Open Computing Language standard, and as a consequence significant speedups are achieved both in two-dimensional (2D) and 3D simulations.

  10. General purpose graphics-processing-unit implementation of cosmological domain wall network evolution

    Science.gov (United States)

    Correia, J. R. C. C. C.; Martins, C. J. A. P.

    2017-10-01

    Topological defects unavoidably form at symmetry breaking phase transitions in the early universe. To probe the parameter space of theoretical models and set tighter experimental constraints (exploiting the recent advances in astrophysical observations), one requires more and more demanding simulations, and therefore more hardware resources and computation time. Improving the speed and efficiency of existing codes is essential. Here we present a general purpose graphics-processing-unit implementation of the canonical Press-Ryden-Spergel algorithm for the evolution of cosmological domain wall networks. This is ported to the Open Computing Language standard, and as a consequence significant speedups are achieved both in two-dimensional (2D) and 3D simulations.

  11. Designing a Process for Tracking Business Model Change

    DEFF Research Database (Denmark)

    Groskovs, Sergejs

    The paper has adopted a design science research approach to design and verify with key stakeholders a fundamental management process of revising KPIs (key performance indicators), including those indicators that are related to business model change. The paper proposes a general guide for such process design, which is applicable in similar settings, i.e. other multi-subsidiary global firms operating in dynamic industries. The management of the focal case uses a set of KPIs to track performance and thus to allow for bringing about strategic and tactical changes, including initiatives that establish new KPIs on an ongoing basis together with the business units on the ground; the process is thus of key importance to the strategic management of the firm. The paper concludes with a discussion of its methodological compliance with design science research guidelines and revisits the literature in process

  12. [Modeling a clinical process for differentiated thyroid cancer health care in Hospital Base Valdivia, Chile].

    Science.gov (United States)

    Ávila-Schwerter, C; Torres-Andrade, M C; Méndez, C A; Márquez-Manzano, M

    2016-01-01

    To design a clinical process model for the management of differentiated thyroid cancer in order to improve accessibility to this treatment. Based on modified Participatory Action Research, a model design process was conducted using a literature review and meetings with the organisations committed to the redesign, in order to agree an improved and feasible process. The process map was constructed by participatory action, including characterisation of the value chain, detection of faults in the flow of the process, the relevant documents, and the process for proposing the modifications and approvals necessary for this purpose. Links were established between the main process and the support and strategic processes. The participatory model helped to cut the waiting times for diagnosis and treatment of this disease from 12 to 4 months. Enabling each unit to fully visualise the process map and to understand its contribution as part of a set of integrated, rather than fragmented, contributions helps in the comprehensive management of patients and operating processes, within the hierarchical and dominant organisational model in Chilean hospitals. Analysing and remodelling clinical processes by participatory action helps to limit failures in the fluidity of patient care, by presenting each participating unit with a general view of the process, the problems, and the possible solutions. Furthermore, this approach helps to clarify the process in order to make it more efficient, to harmonise relationships, and to improve coordination in order to optimise patient care. Copyright © 2015 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  13. Visualizing the process of process modeling with PPMCharts

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the quest for knowledge about how to make good process models, recent research focus is shifting from studying the quality of process models to studying the process of process modeling (often abbreviated as PPM) itself. This paper reports on our efforts to visualize this specific process in such

  14. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM) MODELS

    International Nuclear Information System (INIS)

    Y.S. Wu

    2005-01-01

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on water and gas

  15. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM)MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Y.S. Wu

    2005-08-24

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on

  16. Model for Sucker-Rod Pumping Unit Operating Modes Analysis Based on SimMechanics Library

    Science.gov (United States)

    Zyuzev, A. M.; Bubnov, M. V.

    2018-01-01

    The article provides basic information about developing a model of a sucker-rod pumping unit (SRPU) by means of the SimMechanics library in the MATLAB Simulink environment. The model is designed for the development of optimal pump-productivity management algorithms, sensorless diagnostics of the plunger pump and pumpjack, acquisition of the dynamometer card and determination of the dynamic fluid level in the well, normalization of faulty unit operation before troubleshooting is performed by staff, as well as determination of the equilibrium ratio from energy indicators and output of manual balancing recommendations to achieve optimal power consumption efficiency. Particular attention is given to the application of various blocks from the SimMechanics library to take into account the principal characteristics of the pumpjack construction and to obtain an adequate model. The article explains in depth the features of the developed tools for collecting and analysing simulated mechanism data. Conclusions are drawn about the practical applicability of the SRPU modelling results and areas for further investigation.

  17. Use of general purpose graphics processing units with MODFLOW

    Science.gov (United States)

    Hughes, Joseph D.; White, Jeremy T.

    2013-01-01

    To evaluate the use of general-purpose graphics processing units (GPGPUs) to improve the performance of MODFLOW, an unstructured preconditioned conjugate gradient (UPCG) solver has been developed. The UPCG solver uses a compressed sparse row storage scheme and includes Jacobi, zero fill-in incomplete, and modified-incomplete lower-upper (LU) factorization, and generalized least-squares polynomial preconditioners. The UPCG solver also includes options for sequential and parallel solution on the central processing unit (CPU) using OpenMP. For simulations utilizing the GPGPU, all basic linear algebra operations are performed on the GPGPU; memory copies between the CPU and GPGPU occur prior to the first iteration of the UPCG solver and after satisfying head and flow criteria or exceeding a maximum number of iterations. The efficiency of the UPCG solver for GPGPU and CPU solutions is benchmarked using simulations of a synthetic, heterogeneous unconfined aquifer with tens of thousands to millions of active grid cells. Testing indicates GPGPU speedups on the order of 2 to 8, relative to the standard MODFLOW preconditioned conjugate gradient (PCG) solver, can be achieved when (1) memory copies between the CPU and GPGPU are optimized, (2) the percentage of time performing memory copies between the CPU and GPGPU is small relative to the calculation time, (3) high-performance GPGPU cards are utilized, and (4) CPU-GPGPU combinations are used to execute sequential operations that are difficult to parallelize. Furthermore, UPCG solver testing indicates GPGPU speedups exceed parallel CPU speedups achieved using OpenMP on multicore CPUs for preconditioners that can be easily parallelized.
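    A minimal serial sketch of the Jacobi-preconditioned conjugate gradient iteration at the heart of a UPCG-style solver is given below; the test matrix is a toy 2D Laplacian standing in for a groundwater-flow grid, and on a GPGPU it is the sparse matrix-vector products and vector updates inside the loop that are offloaded. This is an illustrative reference implementation, not the MODFLOW code.

      import numpy as np
      import scipy.sparse as sp

      def jacobi_pcg(A, b, tol=1e-8, maxiter=500):
          """Jacobi-preconditioned conjugate gradients for a sparse SPD matrix A (CSR)."""
          x = np.zeros_like(b)
          Minv = 1.0 / A.diagonal()          # Jacobi preconditioner
          r = b - A @ x
          z = Minv * r
          p = z.copy()
          rz = r @ z
          for _ in range(maxiter):
              Ap = A @ p                     # sparse matrix-vector product
              alpha = rz / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              if np.linalg.norm(r) < tol * np.linalg.norm(b):
                  break
              z = Minv * r
              rz_new = r @ z
              p = z + (rz_new / rz) * p
              rz = rz_new
          return x

      # Toy SPD system: 2D 5-point Laplacian, a stand-in for a model grid
      n = 50
      L = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
      A = sp.csr_matrix(sp.kronsum(L, L))
      b = np.ones(A.shape[0])
      x = jacobi_pcg(A, b)
      print("residual norm:", np.linalg.norm(b - A @ x))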

  18. Modeling human auditory evoked brainstem responses based on nonlinear cochlear processing

    DEFF Research Database (Denmark)

    Harte, James; Rønne, Filip Munch; Dau, Torsten

    2010-01-01

    To generate AEPs recorded at remote locations, an empirically obtained elementary unit waveform was convolved with the instantaneous discharge rate function for the corresponding AN unit. AEPs to click-trains, as well as to tone pulses at various frequencies, were both modelled and recorded at different stimulation levels and repetition rates. The observed nonlinearities in the recorded potential patterns, with respect to ABR wave V latencies and amplitudes, could be largely accounted for by level-dependent BM processing as well as effects of short-term neural adaptation. The present study
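    The convolution step described above can be sketched in a few lines; the unit waveform and discharge-rate function below are made-up stand-ins for the empirically obtained quantities in the study.

      import numpy as np

      fs = 30_000                              # sampling rate in Hz (assumption)
      t = np.arange(0, 0.010, 1 / fs)          # 10 ms

      # Hypothetical elementary unit waveform (damped oscillation stand-in)
      ur = np.exp(-t / 0.001) * np.sin(2 * np.pi * 1000 * t)

      # Hypothetical instantaneous discharge rate of the AN unit (spikes/s):
      # an onset burst adapting to a sustained rate
      rate = 50 + 250 * np.exp(-t / 0.002)

      # Far-field AEP as convolution of discharge rate with the unit waveform
      aep = np.convolve(rate, ur)[: t.size] / fs
      print("modelled AEP peak (arb. units):", aep.max())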

  19. Numerical modelling of forces, stresses and breakages of concrete armour units

    NARCIS (Netherlands)

    Latham, John Paul; Xiang, Jiansheng; Anastasaki, Eleni; Guo, Liwei; Karantzoulis, Nikolaos; Viré, A.C.; Pain, Christopher

    2014-01-01

    Numerical modelling has the potential to probe the complexity of the interacting physics of rubble mound armour systems. Through forward modelling of armour unit packs, stochastic variables such as unit displacement and maximum contact force per unit during an external oscillatory disturbance can

  20. Tomography system having an ultrahigh speed processing unit

    International Nuclear Information System (INIS)

    Cox, J.P. Jr.; Gerth, V.W. Jr.

    1977-01-01

    A transverse section tomography system has an ultrahigh-speed data processing unit for performing back projection and updating. An x-ray scanner directs x-ray beams through a planar section of a subject from a sequence of orientations and positions. The scanner includes a movably supported radiation detector for detecting the intensity of the beams of radiation after they pass through the subject

  1. Investigation of the Dynamic Melting Process in a Thermal Energy Storage Unit Using a Helical Coil Heat Exchanger

    Directory of Open Access Journals (Sweden)

    Xun Yang

    2017-08-01

    Full Text Available In this study, the dynamic melting process of the phase change material (PCM in a vertical cylindrical tube-in-tank thermal energy storage (TES unit was investigated through numerical simulations and experimental measurements. To ensure good heat exchange performance, a concentric helical coil was inserted into the TES unit to pipe the heat transfer fluid (HTF. A numerical model using the computational fluid dynamics (CFD approach was developed based on the enthalpy-porosity method to simulate the unsteady melting process including temperature and liquid fraction variations. Temperature measurements using evenly spaced thermocouples were conducted, and the temperature variation at three locations inside the TES unit was recorded. The effects of the HTF inlet parameters were investigated by parametric studies with different temperatures and flow rate values. Reasonably good agreement was achieved between the numerical prediction and the temperature measurement, which confirmed the numerical simulation accuracy. The numerical results showed the significance of buoyancy effect for the dynamic melting process. The system TES performance was very sensitive to the HTF inlet temperature. By contrast, no apparent influences can be found when changing the HTF flow rates. This study provides a comprehensive solution to investigate the heat exchange process of the TES system using PCM.

  2. Development of Wolsong Unit 2 Containment Analysis Model

    Energy Technology Data Exchange (ETDEWEB)

    Hoon, Choi [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of); Jin, Ko Bong; Chan, Park Young [Hanbat National Univ., Daejeon (Korea, Republic of)

    2014-05-15

    To be prepared for the full-scope safety analysis of Wolsong unit 2 with modified fuel, input decks for various objectives, which can be read by GOTHIC 7.2b(QA), were developed and tested with a steady-state simulation. A detailed nodalization of 39 control volumes and 92 flow paths was constructed to determine the differential pressure across internal walls and the hydrogen concentration and distribution inside containment. A lumped model with 15 control volumes and 74 flow paths has also been developed to reduce the computer run time for assessments in which the analysis results are not sensitive to the detailed thermal-hydraulic distribution inside containment, such as peak pressure, pressure-dependent signals and radionuclide release. The input data files provide simplified representations of the geometric layout of the containment building (volumes, dimensions, flow paths, doors, panels, etc.) and the performance characteristics of the various containment subsystems. The parameter values are based on best-estimate or design values. The analysis values are determined by conservatism depending on the analysis objective and may differ between analysis objectives. Basic input decks of Wolsong unit 2 were developed for the various analysis purposes with GOTHIC 7.2b(QA). Depending on the analysis objective, two types of models are prepared. The detailed model represents each confined room in the containment as a separate node; all of the geometric data are based on the drawings of Wolsong unit 2. The developed containment models simulate the steady state well for the designated initial conditions. These base models will be used for Wolsong unit 2 whenever a full-scope safety analysis is needed.

  3. The First Prototype for the FastTracker Processing Unit

    CERN Document Server

    Andreani, A; The ATLAS collaboration; Beretta, M; Bogdan, M; Citterio, M; Alberti, F; Giannetti, P; Lanza, A; Magalotti, D; Piendibene, M; Shochet, M; Stabile, A; Tang, J; Tompkins, L

    2012-01-01

    Modern experiments search for extremely rare processes hidden in much larger background levels. As the experiment complexity and the accelerator backgrounds and luminosity increase, we need increasingly complex and exclusive selections. We present the first prototype of a new Processing Unit, the core of the FastTracker processor for ATLAS, whose computing power is such that a couple of hundred of them will be able to reconstruct all the tracks with transverse momentum above 1 GeV in ATLAS events up to Phase II instantaneous luminosities (5×10³⁴ cm⁻² s⁻¹) with an event input rate of 100 kHz and a latency below hundreds of microseconds. We plan extremely powerful, very compact and low-consumption units for the far future, essential to increase the efficiency and purity of the Level 2 selected samples through the intensive use of tracking. This strategy requires massive computing power to minimize the online execution time of complex tracking algorithms. The time-consuming pattern recognition problem, generall...

  4. Stochastic model of milk homogenization process using Markov's chain

    Directory of Open Access Journals (Sweden)

    A. A. Khvostov

    2016-01-01

    Full Text Available The development of a mathematical model of the homogenization of dairy products is considered in this work. The theory of Markov chains was used in developing the model: a Markov chain with discrete states and a continuous parameter, for which the homogenization pressure is taken, forms the basis of the model structure. A machine realization of the model is implemented in the structural modeling environment MathWorks Simulink™. Identification of the model parameters was carried out by minimizing the standard deviation, calculated from the experimental data, for each fraction of the fat phase of the dairy products. The experimental data set consisted of the processed micrographic images of the fat-globule distributions of whole milk samples subjected to homogenization at different pressures. The Pattern Search method was used for optimization, with the Latin Hypercube search algorithm from the Global Optimization Toolbox library. The accuracy of the calculations, averaged over all fractions, was 0.88% (relative share of units); the maximum relative error was 3.7% at a homogenization pressure of 30 MPa, which may be due to the very abrupt change in the particle size distribution from the original milk at the beginning of the homogenization process and the lack of experimental data at homogenization pressures below the specified value. The proposed mathematical model allows the calculation of the volume and mass distribution profile of the fat phase (fat globules) in the product, depending on the homogenization pressure, and can be used in laboratory research on dairy product composition, as well as in the calculation, design and modeling of the process equipment of dairy industry enterprises.
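    A minimal sketch of such a Markov chain with discrete states and the homogenization pressure as the continuous parameter follows; the three size fractions and the generator intensities are illustrative assumptions, not the identified parameters of the paper.

      import numpy as np
      from scipy.linalg import expm

      # Discrete states = fat-globule size fractions (coarse to fine); the
      # chain parameter is the homogenization pressure p, not time.
      Q = np.array([[-0.20, 0.15, 0.05],   # breakage intensities per MPa (assumed)
                    [0.00, -0.10, 0.10],
                    [0.00, 0.00, 0.00]])   # finest fraction is absorbing

      pi0 = np.array([1.0, 0.0, 0.0])      # raw milk: all mass in the coarse state

      for p in (0.0, 10.0, 30.0):          # homogenization pressures, MPa
          pi_p = pi0 @ expm(Q * p)         # Kolmogorov forward solution
          print(f"p = {p:4.0f} MPa -> fractions {np.round(pi_p, 3)}")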

  5. Quality Improvement Process in a Large Intensive Care Unit: Structure and Outcomes.

    Science.gov (United States)

    Reddy, Anita J; Guzman, Jorge A

    2016-11-01

    Quality improvement in the health care setting is a complex process, and even more so in the critical care environment. The development of intensive care unit process measures and quality improvement strategies are associated with improved outcomes, but should be individualized to each medical center as structure and culture can differ from institution to institution. The purpose of this report is to describe the structure of quality improvement processes within a large medical intensive care unit while using examples of the study institution's successes and challenges in the areas of stat antibiotic administration, reduction in blood product waste, central line-associated bloodstream infections, and medication errors. © The Author(s) 2015.

  6. Optimization model of a system of crude oil distillation units with heat integration and metamodeling

    International Nuclear Information System (INIS)

    Lopez, Diana C; Mahecha, Cesar A; Hoyos, Luis J; Acevedo, Leonardo; Villamizar Jaime F

    2009-01-01

    The crude distillation process has a considerable impact on the economics of any refinery. Therefore, it is necessary to improve it, taking good advantage of the available infrastructure and generating products that conform to specifications without violating the equipment operating constraints or plant restrictions at the industrial units. The objective of this paper is to present the development of an optimization model for a Crude Distillation Unit (CDU) system at an ECOPETROL S.A. refinery in Barrancabermeja, involving the typical restrictions (flows according to pipeline capacity, pumps, distillation columns, etc.) and a restriction that has not been included in bibliographic reports for this type of model: the heat integration of streams from Atmospheric Distillation Towers (ADTs) and the Vacuum Distillation Tower (VDT) with the heat exchanger networks for crude pre-heating. The ADTs were modeled with metamodels as functions of column temperatures and pressures, pumparound flows and return temperatures, stripping steam flows, Jet EBP ASTM D-86 and Diesel EBP ASTM D-86. The pre-heating trains were modeled with mass and energy balances and the design equation of each heat exchanger. The optimization model is an NLP maximizing the system profit. The model was implemented in GAMSide 22.2 using the CONOPT solver, and it found new operating points with better economic results than those obtained with normal operation of the real plants. It predicted optimum operating conditions for 3 ADTs for a constant-composition crude and calculated the yields and properties of the atmospheric products, in addition to the temperatures and duties of 27 crude oil exchangers.
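    As a heavily simplified illustration of the approach, the sketch below maximizes a profit surrogate subject to a product-specification constraint, in the spirit of metamodel-based CDU optimization; the two decision variables, both functions and all coefficients are hypothetical (the paper's model is an NLP in GAMS with many more variables and constraints).

      import numpy as np
      from scipy.optimize import minimize, NonlinearConstraint

      # Hypothetical 2-variable stand-in for the CDU metamodel: a pumparound
      # flow and a stripping-steam flow, both scaled to 0-1. profit() and
      # jet_endpoint() play the role of the regression metamodels.
      def profit(x):
          pa, steam = x
          return 100 + 40 * pa - 25 * pa**2 + 20 * steam - 30 * steam**2

      def jet_endpoint(x):                 # ASTM D-86 EBP surrogate, deg C (assumed)
          pa, steam = x
          return 210 + 15 * pa - 20 * steam

      spec = NonlinearConstraint(jet_endpoint, -np.inf, 215.0)   # EBP <= 215 C
      res = minimize(lambda x: -profit(x), x0=[0.5, 0.5],
                     bounds=[(0, 1), (0, 1)], constraints=[spec])
      print("optimal flows:", np.round(res.x, 3), "profit:", round(-res.fun, 2))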

  7. Accelerating Malware Detection via a Graphics Processing Unit

    Science.gov (United States)

    2010-09-01

    [Front-matter acronym list: ...Processing Unit; PE, Portable Executable; COFF, Common Object File Format] ...operating systems for the future [Szo05]. The PE format is an updated version of the common object file format (COFF) [Mic06]. Microsoft released a new... [NAs02]. These alerts can be costly in terms of time and resources for individuals and organizations to investigate each misidentified file [YWL07] [Vak10

  8. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models, and to quantify the uncertainties in the estimated property values from a process design point of view. This includes: (i) parameter estimation using... The comparison of model prediction uncertainties with the reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column... In addition, the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce the most critical uncertainties.

  9. Alternative Procedure for Heat Integration Technique Selection between Two Unit Processes to Improve Energy Saving

    Science.gov (United States)

    Santi, S. S.; Renanto; Altway, A.

    2018-01-01

    The energy-use system in a production process, in this case the heat exchanger network (HEN), is one element that plays a role in the smoothness and sustainability of the industry itself. Optimizing heat exchanger networks (HENs) built from process streams can have a major effect on the economic value of an industry as a whole, so solving design problems with heat integration becomes an important requirement. In a plant, heat integration can be carried out internally or in combination between process units. However, the steps for determining a suitable heat integration technique involve long calculations and take considerable time. In this paper, we propose an alternative procedure for determining the heat integration technique by investigating 6 hypothetical units using a Pinch Analysis approach with the energy target and total annual cost target as objective functions. The six hypothetical units, A through F, differ in the location of their process streams relative to the pinch temperature. The result is a potential heat integration (ΔH') formula that trims the conventional procedure from 7 steps to just 3. The preferred heat integration technique is then determined by calculating the potential heat integration (ΔH') between the hypothetical process units. The calculations are completed using MATLAB programming.
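    The energy-targeting step that underlies any pinch-based comparison of integration options can be sketched with the standard problem-table algorithm, as below; the four-stream data are illustrative, not the paper's hypothetical units.

      import numpy as np

      def energy_targets(streams, dt_min=10.0):
          """Problem-table algorithm: minimum hot/cold utility for a stream set.
          streams: list of (supply_T, target_T, CP) with CP in kW/K.
          Hot streams cool (Ts > Tt); cold streams heat (Ts < Tt)."""
          shifted = []
          for ts, tt, cp in streams:       # shift hot down, cold up, by dt_min/2
              if ts > tt:
                  shifted.append((ts - dt_min / 2, tt - dt_min / 2, cp, 'hot'))
              else:
                  shifted.append((ts + dt_min / 2, tt + dt_min / 2, cp, 'cold'))
          bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)
          deficits = []
          for hi, lo in zip(bounds, bounds[1:]):
              net = 0.0                    # interval deficit: cold demand - hot supply
              for ts, tt, cp, kind in shifted:
                  top, bot = max(ts, tt), min(ts, tt)
                  overlap = max(0.0, min(hi, top) - max(lo, bot))
                  net += cp * overlap if kind == 'cold' else -cp * overlap
              deficits.append(net)
          cascade = np.cumsum(deficits)
          q_hot = max(0.0, cascade.max())  # minimum hot utility
          q_cold = q_hot - cascade[-1]     # minimum cold utility by overall balance
          return q_hot, q_cold

      # Illustrative four-stream example: two hot, two cold streams
      streams = [(250, 40, 0.15), (200, 80, 0.25),
                 (20, 180, 0.20), (140, 230, 0.30)]
      print("Qh_min, Qc_min =", energy_targets(streams))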

  10. Business process model repositories : efficient process retrieval

    NARCIS (Netherlands)

    Yan, Z.

    2012-01-01

    As organizations increasingly work in process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a

  11. Graphics processing unit based computation for NDE applications

    Science.gov (United States)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

    Advances in parallel processing in recent years are helping to improve the cost of numerical simulation. Breakthroughs in graphics processing unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of 'compute unified device architecture' (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general-purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes as applicable to two problems of interest to the NDE community, namely heat diffusion and elastic wave propagation. The implementations are two-dimensional. Performance improvement of the GPU implementation against a serial CPU implementation is then discussed.
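    For reference, the per-node stencil that such GPU ports parallelize is the explicit (FTCS) finite-difference update for heat diffusion; a minimal vectorized NumPy version is sketched below with illustrative parameters (the paper's implementation is CUDA C, not Python).

      import numpy as np

      alpha, dx = 1e-4, 1e-3              # diffusivity (m^2/s), grid step (m), assumed
      dt = 0.2 * dx**2 / alpha            # stable step (FTCS limit is 0.25*dx^2/alpha)

      T = np.zeros((128, 128))
      T[60:68, 60:68] = 100.0             # hot-spot initial condition

      for _ in range(500):
          lap = (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:]
                 - 4.0 * T[1:-1, 1:-1]) / dx**2
          T[1:-1, 1:-1] += alpha * dt * lap   # boundaries held at 0 (Dirichlet)

      print("peak temperature after 500 steps:", round(T.max(), 2))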

  12. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit

    International Nuclear Information System (INIS)

    Badal, Andreu; Badano, Aldo

    2009-01-01

    Purpose: It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: The use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed up factor was obtained using a GPU compared to a single core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.

  13. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit

    Energy Technology Data Exchange (ETDEWEB)

    Badal, Andreu; Badano, Aldo [Division of Imaging and Applied Mathematics, OSEL, CDRH, U.S. Food and Drug Administration, Silver Spring, Maryland 20993-0002 (United States)

    2009-11-15

    Purpose: It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: The use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed up factor was obtained using a GPU compared to a single core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.

  14. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit.

    Science.gov (United States)

    Badal, Andreu; Badano, Aldo

    2009-11-01

    It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: The use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA™ programming model (NVIDIA Corporation, Santa Clara, CA). An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed up factor was obtained using a GPU compared to a single core CPU. The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
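    The compute pattern that makes GPUs attractive here is the independence of photon histories. The toy sketch below runs a vectorized Monte Carlo of photons crossing a two-material slab, treating every interaction as absorption; the attenuation coefficients and geometry are illustrative assumptions and far simpler than PENELOPE-class physics.

      import numpy as np

      rng = np.random.default_rng(42)

      mu = np.array([0.2, 0.5])                 # attenuation (1/cm) of voxels 0 and 1 (assumed)
      edges = np.array([0.0, 5.0, 10.0])        # voxel boundaries (cm)

      n = 1_000_000
      x = np.zeros(n)                           # photons enter at x = 0
      alive = np.ones(n, dtype=bool)

      for v in range(mu.size):                  # march voxel by voxel
          step = rng.exponential(1.0 / mu[v], size=n)  # sampled free path
          x = np.where(alive, x + step, x)
          alive &= x >= edges[v + 1]            # survivors reach the next boundary
          x = np.where(alive, edges[v + 1], x)  # restart there (memoryless)

      analytic = np.exp(-(mu * np.diff(edges)).sum())
      print(f"MC transmission {alive.mean():.4f} vs analytic {analytic:.4f}")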

  15. The Tidal Model as experienced by patients and nurses in a regional forensic unit.

    Science.gov (United States)

    Cook, N R; Phillips, B N; Sadler, D

    2005-10-01

    The Tidal Model has been implemented in Rangipapa, a regional secure mental health forensic unit in New Zealand. A phenomenological study was undertaken to obtain a reflective description of the nursing care experience from the perspectives of four Registered Nurses and four Special Patients. Five major themes were identified that appeared to capture the experiences of the participants. The themes show changes to the unit's unique culture and values following implementation of the model. These changes engendered a sense of hope, where nurses felt they were making a difference and patients were able to communicate in their own words their feelings of hope and optimism. Levelling was experienced as an effect emerging from individual and group processes whereby a shift in power enhanced a sense of self and connectedness in their relationships. These interpersonal transactions were noted by the special patients as being positive for their recovery. This enabled effective nurse-patient collaboration, expressed simply as working together. The participants reported a feeling of humanity, so that there was a human face to a potentially objectifying forensic setting. Implications arising from this study are that the use of the model enables a synergistic interpersonal process wherein nurses are professionally satisfied and patients are validated in their experience, supporting their recovery.

  16. Modeling Aspects of Activated Sludge Processes Part l l: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    Energy Technology Data Exchange (ETDEWEB)

    AbdElHaleem, H S [Cairo Univ.-CivlI Eng. Dept., Giza (Egypt); EI-Ahwany, A H [CairoUlmrsity- Faculty ofEngincering - Chemical Engineering Department, Giza (Egypt); Ibrahim, H I [Helwan University- Faculty of Engineering - Biomedical Engineering Department, Helwan (Egypt); Ibrahim, G [Menofia University- Faculty of Engineering Sbebin EI Kom- Basic Eng. Sc. Dept., Menofia (Egypt)

    2004-07-01

    Mathematical process modeling and biokinetics of the activated sludge process were reviewed, considering different types of models. The task group models ASM1, ASM2, and ASM3, versioned by Henze et al., were evaluated considering the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase; in this type of model, the internal mass transfer inside the flocs is neglected, and hence the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model, which considers the mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.

  17. Modeling Aspects of Activated Sludge Processes Part l l: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    International Nuclear Information System (INIS)

    AbdElHaleem, H.S.; EI-Ahwany, A. H.; Ibrahim, H.I.; Ibrahim, G.

    2004-01-01

    Mathematical process modeling and biokinetics of the activated sludge process were reviewed, considering different types of models. The task group models ASM1, ASM2, and ASM3, versioned by Henze et al., were evaluated considering the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase; in this type of model, the internal mass transfer inside the flocs is neglected, and hence the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model, which considers the mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.

  18. A dynamic model used for controller design of a coal fired once-through boiler-turbine unit

    International Nuclear Information System (INIS)

    Liu, Ji-Zhen; Yan, Shu; Zeng, De-Liang; Hu, Yong; Lv, You

    2015-01-01

    Supercritical OTB (once-through boiler) units with high steam temperature and pressure have been widely used in modern power plants due to their high cycle efficiency and less emissions. To ensure the effective operation of such power generation systems, it is necessary to build a model for the design of the overall control system. There are already detailed models of once-through boilers; however, their complexity prevents them from being applied in the controller design. This study describes a lumped parameter dynamic model that has a relatively low complexity while faithfully capturing the essential overall plant dynamics. The model structure was derived by fundamental physical laws utilizing reasonable simplifications and data analysis to avoid the phase transition position problem. Parameter identification for the model structure was completed using operational data from a 1000 MW ultra-supercritical OTB. The model was determined to be reasonable by comparison tests between computed data and measured data for both steady and dynamic states. The simplified model is verified to have appropriate fidelity in control system design to achieve effective and economic operation of the unit. - Highlights: • A simplified dynamic model of once-through boiler-turbine unit is given. • The essential dynamics of active power and throttle pressure is presented. • The change of phase transition position is avoided in modeling process. • The model has appropriate complexity and fidelity for controller design.

  19. Stress-reducing preventive maintenance model for a unit under stressful environment

    International Nuclear Information System (INIS)

    Park, J.H.; Chang, Woojin; Lie, C.H.

    2012-01-01

    We develop a preventive maintenance (PM) model for a unit operated under a stressful environment. The PM model in this paper consists of a failure rate model and two cost models to determine the optimal PM schedule that minimizes a cost rate. The assumption of the proposed model is that the stressful environment accelerates the failure of the unit and that periodic maintenance reduces stress from outside. The failure rate model handles the maintenance effect of PM using improvement and stress factors. The cost models are categorized into two failure-recognition cases: immediate failure recognition and periodic failure detection. The optimal PM schedule is obtained by considering the trade-off between the related cost and the lifetime of the unit in our model setting. The practical usage of the proposed model is tested through a numerical example.
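    A numeric sketch in the spirit of this trade-off is shown below: a Weibull baseline hazard inflated by a stress factor, minimal repair between PMs, and a cost rate minimized over the PM interval. The functional form and all numbers are assumptions, not the paper's model.

      import numpy as np
      from scipy.integrate import quad
      from scipy.optimize import minimize_scalar

      beta, eta = 2.5, 1000.0        # Weibull shape / scale (hours), assumed
      stress = 1.8                   # environmental acceleration factor, assumed
      c_pm, c_f = 100.0, 2500.0      # cost of one PM / one failure repair, assumed

      def expected_failures(T):
          """Expected failures in one PM cycle (minimal repair between PMs)."""
          h = lambda t: stress * (beta / eta) * (t / eta) ** (beta - 1)
          return quad(h, 0.0, T)[0]

      def cost_rate(T):
          return (c_pm + c_f * expected_failures(T)) / T

      res = minimize_scalar(cost_rate, bounds=(10.0, 5000.0), method='bounded')
      print(f"optimal PM interval ~ {res.x:.0f} h, cost rate {res.fun:.3f} per h")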

  20. Maximizing the retention level for proportional reinsurance under α-regulation of the finite-time surplus process with unit-equalized interarrival time

    Directory of Open Access Journals (Sweden)

    Sukanya Somprom

    2016-07-01

    Full Text Available The research focuses on an insurance model controlled by proportional reinsurance in the finite-time surplus process with a unit-equalized time interval. We prove the existence of the maximal retention level for independent and identically distributed claim processes under α-regulation, i.e., a model where the insurance company has to manage the probability of insolvency to be at most α. In addition, we illustrate the maximal retention level for exponential claims by applying the bisection technique.
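    A minimal Monte Carlo sketch of the α-regulation idea follows: estimate the finite-time ruin probability for a given retention level b and bisect for the largest b whose ruin probability stays at or below α. The premium structure (reinsurance priced with the same loading as the direct business, so ruin probability increases with b) and the exponential claims are illustrative assumptions, not the paper's exact specification.

      import numpy as np

      rng = np.random.default_rng(1)

      u, N, mu, theta = 8.0, 100, 1.0, 0.1   # surplus, horizon, E[claim], loading

      def ruin_prob(b, n_sim=20_000):
          """Finite-time ruin probability when the insurer retains share b
          of each claim and of the loaded premium."""
          claims = rng.exponential(mu, size=(n_sim, N))
          increments = b * ((1 + theta) * mu - claims)   # per-period net result
          surplus = u + np.cumsum(increments, axis=1)
          return (surplus.min(axis=1) < 0).mean()

      def max_retention(alpha, lo=0.0, hi=1.0, iters=25):
          """Bisection for the largest b with ruin probability <= alpha."""
          for _ in range(iters):
              mid = 0.5 * (lo + hi)
              lo, hi = (mid, hi) if ruin_prob(mid) <= alpha else (lo, mid)
          return lo

      print("maximal retention for alpha = 0.05:", round(max_retention(0.05), 3))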

  1. Effect of land cover on atmospheric processes and air quality over the continental United States – a NASA Unified WRF (NU-WRF model study

    Directory of Open Access Journals (Sweden)

    Z. Tao

    2013-07-01

    Full Text Available The land surface plays a crucial role in regulating water and energy fluxes at the land–atmosphere (L–A interface and controls many processes and feedbacks in the climate system. Land cover and vegetation type remains one key determinant of soil moisture content that impacts air temperature, planetary boundary layer (PBL evolution, and precipitation through soil-moisture–evapotranspiration coupling. In turn, it will affect atmospheric chemistry and air quality. This paper presents the results of a modeling study of the effect of land cover on some key L–A processes with a focus on air quality. The newly developed NASA Unified Weather Research and Forecast (NU-WRF modeling system couples NASA's Land Information System (LIS with the community WRF model and allows users to explore the L–A processes and feedbacks. Three commonly used satellite-derived land cover datasets – i.e., from the US Geological Survey (USGS and University of Maryland (UMD, which are based on the Advanced Very High Resolution Radiometer (AVHRR, and from the Moderate Resolution Imaging Spectroradiometer (MODIS – bear large differences in agriculture, forest, grassland, and urban spatial distributions in the continental United States, and thus provide an excellent case to investigate how land cover change would impact atmospheric processes and air quality. The weeklong simulations demonstrate the noticeable differences in soil moisture/temperature, latent/sensible heat flux, PBL height, wind, NO2/ozone, and PM2.5 air quality. These discrepancies can be traced to associate with the land cover properties, e.g., stomatal resistance, albedo and emissivity, and roughness characteristics. It also implies that the rapid urban growth may have complex air quality implications with reductions in peak ozone but more frequent high ozone events.

  2. Mathematical modeling of synthetic unit hydrograph case study: Citarum watershed

    Science.gov (United States)

    Islahuddin, Muhammad; Sukrainingtyas, Adiska L. A.; Kusuma, M. Syahril B.; Soewono, Edy

    2015-09-01

    Deriving the unit hydrograph is very important in analyzing a watershed's hydrologic response to a rainfall event. In most cases, the hourly streamflow measurements needed to derive a unit hydrograph are not available. Hence, one needs methods for deriving unit hydrographs for ungauged watersheds. The methods that have evolved are based on theoretical or empirical formulas relating hydrograph peak discharge and timing to watershed characteristics; the results are usually referred to as synthetic unit hydrographs. In this paper, a gamma probability density function and its variant are used as mathematical approximations of a unit hydrograph for the Citarum watershed. The model is adjusted to real field conditions by translation and scaling. Optimal parameters are determined using the Particle Swarm Optimization method with a weighted objective function. With these models, a synthetic unit hydrograph can be developed and hydrologic parameters can be well predicted.
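    A brief sketch of the gamma-PDF approximation with translation and scaling. The "observed" ordinates are synthetic, and a generic Nelder-Mead search stands in for the Particle Swarm Optimization used in the paper.

```python
import numpy as np
from scipy.stats import gamma
from scipy.optimize import minimize

t = np.arange(1, 25, dtype=float)                 # hours
q_obs = gamma.pdf(t, a=3.2, loc=1.0, scale=2.1)   # stand-in "observed" unit hydrograph

def objective(params):
    a, loc, scale, amp = params
    if a <= 0 or scale <= 0 or amp <= 0:
        return 1e9                                # keep the search in a valid region
    q_fit = amp * gamma.pdf(t, a=a, loc=loc, scale=scale)   # translated and scaled
    w = 1.0 + q_obs / q_obs.max()                 # weight the peak more heavily
    return np.sum(w * (q_fit - q_obs) ** 2)

res = minimize(objective, x0=[2.0, 0.0, 1.0, 1.0], method="Nelder-Mead")
print("fitted (shape, translation, scale, amplitude):", np.round(res.x, 3))
```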

  3. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment that identifies strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the basis throughout software process improvement. CMMI defines a set of process areas involved in software development and what is to be carried out in each area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. In current practice, however, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. The model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  4. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    The present study investigated the role of cognitive processes as well as modelling processes in creating a process model (PM) in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information, and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. The ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  5. Predicting Summer Dryness Under a Warmer Climate: Modeling Land Surface Processes in the Midwestern United States

    Science.gov (United States)

    Winter, J. M.; Eltahir, E. A.

    2009-12-01

    One of the most significant impacts of climate change is the potential alteration of local hydrologic cycles over agriculturally productive areas. As the world's food supply continues to be taxed by its burgeoning population, a greater percentage of arable land will need to be utilized and land currently producing food must become more efficient. This study seeks to quantify the effects of climate change on soil moisture in the American Midwest. A series of 24-year numerical experiments were conducted to assess the ability of the Regional Climate Model Version 3 coupled to the Integrated Biosphere Simulator (RegCM3-IBIS) and to the Biosphere-Atmosphere Transfer Scheme 1e (RegCM3-BATS1e) to simulate the observed hydroclimatology of the midwestern United States. Model results were evaluated using the NASA Surface Radiation Budget, the NASA Earth Radiation Budget Experiment, the Illinois State Water Survey, Climatic Research Unit Time Series 2.1, the Global Soil Moisture Data Bank, and regional-scale estimates of evapotranspiration. The response of RegCM3-IBIS and RegCM3-BATS1e to a surrogate climate change scenario, a warming of 3 °C at the boundaries and a doubling of CO2, was explored. Precipitation increased significantly during the spring and summer in both RegCM3-IBIS and RegCM3-BATS1e, leading to additional runoff. In contrast, enhancements of evapotranspiration and shortwave radiation were modest. Soil moisture remained relatively unchanged in RegCM3-IBIS, while RegCM3-BATS1e exhibited some fall and winter wetting.

  6. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety of application areas, including biotechnology, food, polymer and human health. The book highlights the important role of modern product and process modelling in decision-making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  7. Mathematical model of parking space unit for triangular parking area

    Science.gov (United States)

    Syahrini, Intan; Sundari, Teti; Iskandar, Taufiq; Halfiani, Vera; Munzir, Said; Ramli, Marwan

    2018-01-01

    A parking space unit (PSU) is an effective measure of the area a vehicle requires, including the free space around it and the width of the door opening of the vehicle (car). This article discusses a mathematical model for parking spaces in a triangular area, and an optimization model for the triangular parking lot is developed. The Integer Linear Programming (ILP) method is used to determine the maximum number of PSUs. The parking lots considered are isosceles and equilateral triangles, with four possible row layouts and five possible parking angles for each field; the vehicles considered are cars and motorcycles. The results show that the isosceles triangular parking area holds an optimal 218 PSUs, of which 84 are for cars and 134 for motorcycles, while the equilateral triangular parking area holds an optimal 688 PSUs, of which 175 are for cars and 513 for motorcycles. A toy version of the optimization is sketched below.
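    A toy integer program in the spirit of the model, written with the open-source PuLP library: maximize the PSU count under an area budget. The footprints, total area, and mix constraint are invented; the paper's actual model encodes row layouts and the five parking angles.

```python
import pulp

area = 2000.0               # assumed usable parking area, m^2
a_car, a_moto = 12.5, 2.0   # assumed PSU footprints for cars and motorcycles, m^2

prob = pulp.LpProblem("triangular_parking", pulp.LpMaximize)
cars = pulp.LpVariable("cars", lowBound=0, cat="Integer")
motos = pulp.LpVariable("motos", lowBound=0, cat="Integer")

prob += cars + motos                          # maximize total PSUs
prob += a_car * cars + a_moto * motos <= area # area budget
prob += cars >= 0.2 * (cars + motos)          # keep a minimum share of car stalls

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("cars:", int(cars.value()), "motorcycles:", int(motos.value()))
```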

  8. Smoldyn on graphics processing units: massively parallel Brownian dynamics simulations.

    Science.gov (United States)

    Dematté, Lorenzo

    2012-01-01

    Space is a very important aspect of the simulation of biochemical systems; recently, the need for simulation algorithms able to cope with space has become more and more compelling. Complex and detailed models of biochemical systems need to deal with the movement of single molecules and particles, taking into consideration localized fluctuations, transportation phenomena, and diffusion. A common drawback of spatial models lies in their complexity: models can become very large, and their simulation can be time consuming, especially if we want to capture the system's behavior in a reliable way using stochastic methods in conjunction with a high spatial resolution. To deliver on the promise made by systems biology of understanding a system as a whole, we need to scale up the size of the models we are able to simulate, moving from sequential to parallel simulation algorithms. In this paper, we analyze Smoldyn, a widely used algorithm for stochastic simulation of chemical reactions with spatial resolution and single-molecule detail, and we propose an alternative, innovative implementation that exploits the parallelism of Graphics Processing Units (GPUs). The implementation executes the most computationally demanding steps (computation of diffusion, unimolecular and bimolecular reactions, and the most common cases of molecule-surface interaction) on the GPU, computing them in parallel for each molecule of the system. The implementation offers good speed-ups and real-time, high-quality graphics output.
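    For intuition, here is a CPU sketch of the per-molecule diffusion step that the GPU implementation evaluates in parallel for every molecule; parameters are illustrative and the boundary handling is deliberately crude.

```python
import numpy as np

rng = np.random.default_rng(8)

# Brownian displacement per step has standard deviation sqrt(2*D*dt) per axis.
n_mol, D, dt, box = 100_000, 1e-12, 1e-4, 1e-6   # molecules, m^2/s, s, box size (m)
pos = rng.uniform(0, box, size=(n_mol, 3))

for _ in range(100):
    pos += rng.normal(0.0, np.sqrt(2 * D * dt), size=pos.shape)
    pos = np.clip(pos, 0, box)        # crude stand-in for reflective boundaries

print("per-axis spread (m):", pos.std(axis=0))
```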

  9. A Fast MHD Code for Gravitationally Stratified Media using Graphical Processing Units: SMAUG

    Science.gov (United States)

    Griffiths, M. K.; Fedun, V.; Erdélyi, R.

    2015-03-01

    Parallelization techniques have been exploited most successfully by the gaming/graphics industry with the adoption of graphical processing units (GPUs), which possess hundreds of processor cores. The opportunity has been recognized by the computational sciences and engineering communities, who have recently harnessed the numerical performance of GPUs. For example, parallel magnetohydrodynamic (MHD) algorithms are important for numerical modelling of highly inhomogeneous solar, astrophysical and geophysical plasmas. Here, we describe the implementation of SMAUG, the Sheffield Magnetohydrodynamics Algorithm Using GPUs. SMAUG is a 1-3D MHD code capable of modelling magnetized and gravitationally stratified plasma. The objective of this paper is to present the numerical methods and techniques used for porting the code to this novel and highly parallel compute architecture. The methods employed are justified by the performance benchmarks and validation results, demonstrating that the code successfully simulates the physics for a range of test scenarios including a full 3D realistic model of wave propagation in the solar atmosphere.

  10. Efficient particle-in-cell simulation of auroral plasma phenomena using a CUDA enabled graphics processing unit

    Science.gov (United States)

    Sewell, Stephen

    This thesis introduces a software framework that effectively utilizes low-cost commercially available Graphics Processing Units (GPUs) to simulate complex scientific plasma phenomena that are modeled using the Particle-In-Cell (PIC) paradigm. The framework conforms to the Compute Unified Device Architecture (CUDA), a standard for general-purpose graphics processing introduced by NVIDIA Corporation. It has been verified for correctness and applied to advance the state of understanding of the electromagnetic aspects of the development of the Aurora Borealis and Aurora Australis. For each phase of the PIC methodology, this research has identified one or more methods to exploit the problem's natural parallelism and effectively map it for execution on the graphics processing unit and its host processor. The sources of overhead that can reduce the effectiveness of parallelization for each of these methods have also been identified. One of the novel aspects of this research was the utilization of particle sorting during the grid interpolation phase. The final implementation resulted in simulations that executed about 38 times faster than simulations run on a single-core general-purpose processing system. The scalability of this framework to larger problem sizes and future generation systems has also been investigated.

  11. Study of automatic boat loading unit and horizontal sintering process of uranium dioxide pellet

    International Nuclear Information System (INIS)

    He Zhongjing; Chen Yu; Yao Dengfeng; Wang Youliang; Shu Binhua; Wu Genjiu

    2014-01-01

    Sintering is a key process in the manufacture of nuclear fuel UO_2 pellets. In our plant, a continuous high-temperature sintering furnace is used for this process. During the sintering of green pellets, the furnace, the boat, and the way pellets are stacked can all influence the quality of the final product. In this work, building on earlier process research, the automatic boat loading unit and the horizontal sintering process are studied in turn. The results show that the physical and chemical properties of products manufactured with the automatic boat loading unit and the horizontal sintering process fully meet the technical requirements, and that the system is reliable and supports continuous operation. (authors)

  12. Multi-unit Integration in Microfluidic Processes: Current Status and Future Horizons

    Directory of Open Access Journals (Sweden)

    Pratap R. Patnaik

    2011-07-01

    Full Text Available Microfluidic processes, mainly for biological and chemical applications, have expanded rapidly in recent years. While the initial focus was on single units, principally microreactors, technological and economic considerations have caused a shift to integrated microchips in which a number of microdevices function coherently. These integrated devices have many advantages over conventional macro-scale processes. However, the small scale of operation, complexities in the underlying physics and chemistry, and differences in the time constants of the participating units, in the interactions among them and in the outputs of interest make it difficult to design and optimize integrated microprocesses. These aspects are discussed here, current research and applications are reviewed, and possible future directions are considered.

  13. Computerized nursing process in the Intensive Care Unit: ergonomics and usability

    OpenAIRE

    Almeida, Sônia Regina Wagner de; Sasso, Grace Teresinha Marcon Dal; Barra, Daniela Couto Carvalho

    2016-01-01

    Abstract OBJECTIVE: To analyze the ergonomics and usability criteria of the Computerized Nursing Process based on the International Classification for Nursing Practice in the Intensive Care Unit, according to the International Organization for Standardization (ISO). METHOD: A quantitative, quasi-experimental, before-and-after study with a sample of 16 participants, performed in an Intensive Care Unit. Data collection was performed through the application of five simulated clinical cases and an evaluation...

  14. 32 CFR 516.11 - Service of criminal process outside the United States.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Service of criminal process outside the United... AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS LITIGATION Service of Process § 516.11 Service of... status of forces agreements, govern the service of criminal process of foreign courts and the surrender...

  15. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from ordering a book online until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  16. Automated processing of whole blood units: operational value and in vitro quality of final blood components.

    Science.gov (United States)

    Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz

    2012-01-01

    The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit; interim platelet units are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regard to product quality and operational value. Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. The yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. The Atreus 3C system enabled higher throughput while requiring less space and employee time, by decreasing the amount of equipment and the processing time per unit of whole blood processed. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement.

  17. Applying the Business Process and Practice Alignment Meta-model: Daily Practices and Process Modelling

    Directory of Open Access Journals (Sweden)

    Ventura Martins Paula

    2017-03-01

    Full Text Available Background: Business Process Modelling (BPM) is one of the most important phases of information system design. Business Process (BP) meta-models allow capturing informational and behavioural aspects of business processes. Unfortunately, standard BP meta-modelling approaches focus only on process description and yield disparate BP models, making it impossible to compare models and identify the related daily practices needed to improve them. This lack of information implies that further research in BP meta-models is needed to reflect the evolution/change in BPs. Considering this limitation, this paper introduces a new BP meta-model, the Business Process and Practice Alignment Meta-model (BPPAMeta-model). Our intention is to present a meta-model that addresses the alignment between daily work practices and BP descriptions. Objectives: This paper presents a meta-model that integrates daily work information into coherent and sound process definitions. Methods/Approach: The methodology employed in the research follows a design-science approach. Results: The results of the case study concern the application of the proposed meta-model to align the specification of a BP model with work-practice models. Conclusions: This meta-model can be used within the BPPAM methodology to specify or improve business process models based on work practice descriptions.

  18. Silicon-Carbide Power MOSFET Performance in High Efficiency Boost Power Processing Unit for Extreme Environments

    Science.gov (United States)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Del Castillo, Linda Y.; Fitzpatrick, Fred; Chen, Yuan

    2016-01-01

    Silicon-Carbide device technology has generated much interest in recent years. With superior thermal performance, power ratings and potential switching frequencies compared to its Silicon counterpart, Silicon-Carbide offers a greater possibility for high-power switching applications in extreme environments. In particular, the maturing process technology of Silicon-Carbide Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFETs) has produced a plethora of commercially available, power-dense, low on-state-resistance devices capable of switching at high frequencies. A novel hard-switched power processing unit (PPU) is implemented utilizing Silicon-Carbide power devices. Accelerated life data are captured and assessed in conjunction with a damage accumulation model of gate oxide and drain-source junction lifetime to evaluate potential system performance in high-temperature environments.

  19. The capital asset pricing model versus the three factor model: A United Kingdom Perspective

    Directory of Open Access Journals (Sweden)

    Chandra Shekhar Bhatnagar

    2013-07-01

    Full Text Available The Sharpe (1964), Lintner (1965) and Black (1972) Capital Asset Pricing Model (CAPM) postulates that the equilibrium rates of return on all risky assets are a linear function of their covariance with the market portfolio. Recent work by Fama and French (1996, 2006) introduces a Three Factor Model that questions the "real world application" of the CAPM theorem and its ability to explain stock returns as well as value premium effects in the United States market. This paper provides an out-of-sample perspective on the work of Fama and French (1996, 2006). Multiple regression is used to compare the performance of the CAPM, a split-sample CAPM and the Three Factor Model in explaining observed stock returns and value premium effects in the United Kingdom market, with the methodology of Fama and French (2006) as the framework for the study. The findings show that the Three Factor Model holds for the United Kingdom market and is superior to the CAPM and the split-sample CAPM in explaining both stock returns and value premium effects. The "real world application" of the CAPM is therefore not supported by the United Kingdom data.
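    A compact sketch of the comparison on synthetic data: regress excess returns on the market factor alone (CAPM) and on three factors, then compare explanatory power. The factor series below are random stand-ins, not UK market data.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 240                            # 20 years of monthly observations
mkt = rng.normal(0.5, 4.0, n)      # market excess return, %
smb = rng.normal(0.2, 2.0, n)      # size factor
hml = rng.normal(0.3, 2.5, n)      # value factor
r = 0.1 + 1.1 * mkt + 0.4 * smb + 0.6 * hml + rng.normal(0, 1.0, n)

def ols_r2(X, y):
    X = np.column_stack([np.ones(len(y)), X])    # add intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta, 1 - resid.var() / y.var()

_, r2_capm = ols_r2(mkt.reshape(-1, 1), r)
_, r2_ff3 = ols_r2(np.column_stack([mkt, smb, hml]), r)
print(f"R^2 CAPM: {r2_capm:.3f}  R^2 three-factor: {r2_ff3:.3f}")
```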

  20. Molten Salt Breeder Reactor Analysis Based on Unit Cell Model

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Yongjin; Choi, Sooyoung; Lee, Deokjung [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of)]

    2014-05-15

    Contemporary computer codes like MCNP6 or SCALE can only solve fixed solid-fuel reactor problems out of the box. Because of the molten-salt fuel, MSR analysis requires additional capabilities such as online reprocessing and refueling, and circulating fuel. J. J. Power of Oak Ridge National Laboratory (ORNL) suggested in 2013 a method for simulating the Molten Salt Breeder Reactor (MSBR) with SCALE, which does not support continuous material processing. In order to simulate MSR characteristics, the method proposes dividing a depletion time into short time intervals and performing batchwise reprocessing and refueling at each step. We apply this method to the MSBR using two code systems, MCNP6 scripted with PYTHON and NEWT-TRITON scripted with PYTHON. This paper analyzes various parameters of the MSBR unit cell model, such as the multiplication factor, breeding ratio, change in the amount of fuel, amount of fuel feeding, and neutron flux distribution. The results of MCNP6 and of the NEWT module in SCALE show some differences in the depletion analysis, but both still appear usable for analyzing the MSBR. Using these two code systems, it is possible to analyze various parameters for the MSBR unit cells, such as the multiplication factor, breeding ratio, amount of material, total feeding, and neutron flux distribution. Furthermore, the two code systems will be usable for analyzing other MSR models or whole-core models of MSRs. A driver-loop sketch of the batchwise scheme follows.
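    The batchwise approximation lends itself to a simple driver loop. In this sketch the three material operations are toy stand-ins (a real study scripts MCNP6 or NEWT-TRITON at those points), and all quantities are arbitrary illustrative numbers; no real code API is implied.

```python
# Batchwise emulation of continuous reprocessing/refueling: alternate short
# depletion steps with discrete material adjustments.

def deplete_step(fuel, dt_days):
    burned = min(fuel["U233"], 0.001 * dt_days)   # toy fissile burnup
    fuel["U233"] -= burned
    fuel["fission_products"] += burned
    fuel["Th232"] -= 0.0011 * dt_days             # toy breeding from Th-232
    fuel["U233"] += 0.0011 * dt_days
    return fuel

def remove_fission_products(fuel, fraction):
    fuel["fission_products"] *= 1.0 - fraction    # batch reprocessing
    return fuel

def feed_fertile(fuel, amount):
    fuel["Th232"] += amount                       # batch refueling
    return fuel

fuel = {"Th232": 100.0, "U233": 2.0, "fission_products": 0.0}
for step in range(30):                            # 30 short depletion intervals
    fuel = deplete_step(fuel, dt_days=10)
    fuel = remove_fission_products(fuel, fraction=0.05)
    fuel = feed_fertile(fuel, amount=0.01)
print({k: round(v, 3) for k, v in fuel.items()})
```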

  2. Accelerating Molecular Dynamic Simulation on Graphics Processing Units

    Science.gov (United States)

    Friedrichs, Mark S.; Eastman, Peter; Vaidyanathan, Vishal; Houston, Mike; Legrand, Scott; Beberg, Adam L.; Ensign, Daniel L.; Bruns, Christopher M.; Pande, Vijay S.

    2009-01-01

    We describe a complete implementation of all-atom protein molecular dynamics running entirely on a graphics processing unit (GPU), including all standard force field terms, integration, constraints, and implicit solvent. We discuss the design of our algorithms and important optimizations needed to fully take advantage of a GPU. We evaluate its performance, and show that it can be more than 700 times faster than a conventional implementation running on a single CPU core. PMID:19191337

  3. A neuroconstructivist model of past tense development and processing.

    Science.gov (United States)

    Westermann, Gert; Ruh, Nicolas

    2012-07-01

    We present a neural network model of learning and processing the English past tense that is based on the notion that experience-dependent cortical development is a core aspect of cognitive development. During learning the model adds and removes units and connections to develop a task-specific final architecture. The model provides an integrated account of characteristic errors during learning the past tense, adult generalization to pseudoverbs, and dissociations between verbs observed after brain damage in aphasic patients. We put forward a theory of verb inflection in which a functional processing architecture develops through interactions between experience-dependent brain development and the structure of the environment, in this case, the statistical properties of verbs in the language. The outcome of this process is a structured processing system giving rise to graded dissociations between verbs that are easy and verbs that are hard to learn and process. In contrast to dual-mechanism accounts of inflection, we argue that describing dissociations as a dichotomy between regular and irregular verbs is a post hoc abstraction and is not linked to underlying processing mechanisms. We extend current single-mechanism accounts of inflection by highlighting the role of structural adaptation in development and in the formation of the adult processing system. In contrast to some single-mechanism accounts, we argue that the link between irregular inflection and verb semantics is not causal and that existing data can be explained on the basis of phonological representations alone. This work highlights the benefit of taking brain development seriously in theories of cognitive development. Copyright 2012 APA, all rights reserved.

  4. The Sport Education Model: A Track and Field Unit Application

    Science.gov (United States)

    O'Neil, Kason; Krause, Jennifer M.

    2016-01-01

    Track and field is a traditional instructional unit often taught in secondary physical education settings due to its history, variety of events, and potential for student interest. This article provides an approach to teaching this unit using the sport education model (SEM) of instruction, which has traditionally been presented as a model for team…

  5. Reliability modelling for wear out failure period of a single unit system

    OpenAIRE

    Arekar, Kirti; Ailawadi, Satish; Jain, Rinku

    2012-01-01

    The present paper deals with two time-shifted density models for the wear-out failure period of a single-unit system. The study considers the time-shifted Gamma and Normal distributions. Wear-out failures occur as a result of deterioration processes or mechanical wear, and their probability of occurrence increases with time. The failure rate as a function of time decreases in the early failure period and increases in the wear-out period. Failure rates for time-shifted distributions and expression for m...

  6. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  7. A dual-process perspective on fluency-based aesthetics: the pleasure-interest model of aesthetic liking.

    Science.gov (United States)

    Graf, Laura K M; Landwehr, Jan R

    2015-11-01

    In this article, we develop an account of how aesthetic preferences can be formed as a result of two hierarchical, fluency-based processes. Our model suggests that processing performed immediately upon encountering an aesthetic object is stimulus driven, and aesthetic preferences that accrue from this processing reflect aesthetic evaluations of pleasure or displeasure. When sufficient processing motivation is provided by a perceiver's need for cognitive enrichment and/or the stimulus' processing affordance, elaborate perceiver-driven processing can emerge, which gives rise to fluency-based aesthetic evaluations of interest, boredom, or confusion. Because the positive outcomes in our model are pleasure and interest, we call it the Pleasure-Interest Model of Aesthetic Liking (PIA Model). Theoretically, this model integrates a dual-process perspective and ideas from lay epistemology into processing fluency theory, and it provides a parsimonious framework to embed and unite a wealth of aesthetic phenomena, including contradictory preference patterns for easy versus difficult-to-process aesthetic stimuli. © 2015 by the Society for Personality and Social Psychology, Inc.

  8. Luria’s model of the functional units of the brain and the neuropsychology of dreaming

    Directory of Open Access Journals (Sweden)

    Téllez A.

    2016-12-01

    Full Text Available Traditionally, neuropsychology has focused on identifying the brain mechanisms of specific psychological processes, such as attention, motor skills, perception, memory, language, and consciousness, as well as their corresponding disorders. However, some psychological processes, such as dreaming, have received little attention in this field. This study examined the clinical and experimental neuropsychological research relevant to dreaming, ranging from sleep disorders in patients with brain damage to brain functioning during REM sleep, using different methods of brain imaging. These findings were analyzed within the framework of Luria's Three Functional Unit Model of the Brain, and a proposal was made to explain certain essential characteristics of dreaming. This explanation describes how, during dreaming, the First Functional Unit (the reticular formation of the brainstem) is activated; this in turn activates the Second Functional Unit (formed by the parietal, occipital, and temporal lobes) and Unit L (comprising the limbic system), with simultaneous hypo-functioning of the Third Functional Unit (the frontal lobe). This activity produces the perception of hallucinatory images in various sensory modes, as well as a lack of inhibition, a non-self-reflexive thought process, and a lack of planning and direction of such oneiric images. Dreaming is considered a type of natural confabulation, similar to the one that occurs in patients with frontal lobe damage or schizophrenia. The study also suggests that the confabulatory, bizarre, and impulsive nature of dreaming has a function in the cognitive-emotional homeostasis that aids proper brain function throughout the day.

  9. Modelling and simulation of process control systems for WWER

    Energy Technology Data Exchange (ETDEWEB)

    Pangelov, N [Energoproekt, Sofia (Bulgaria)]

    1996-12-31

    A dynamic modelling method (an identification method) for the simulation of process control systems is developed. It is based on the least squares method and highly efficient linear continuous differential equations. The method has the following advantages: there are no significant limitations on the type of input/output signals or on the length of the data time series; identification at non-zero initial conditions is possible; on-line identification is possible; and high accuracy is observed in the presence of noise. On the basis of real experiments and data time series simulated with known computer codes, it is possible to construct highly efficient models of different systems for solving the following problems: real-time simulation with high accuracy for training purposes; estimation of immeasurable parameters important to safety; malfunction diagnostics based on plant dynamics; prediction of dynamic behaviour; and control vector estimation in a regime adviser. Two real applications of this method are described: dynamic behaviour modelling of steam generator level, and the creation of a Process Control System Simulator (PCSS) based on KASKAD-2 for the WWER-1000 units of the Kozloduy NPP. 6 refs., 8 figs.
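    A least-squares identification sketch in the spirit of the method, reduced to a first-order discrete-time model y[k+1] = a·y[k] + b·u[k] (the report fits continuous differential equations; the data below are simulated, not Kozloduy plant records).

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate noisy input/output data, e.g. a steam generator level response.
n, a_true, b_true = 500, 0.95, 0.08
u = rng.normal(size=n)
y = np.zeros(n)
for k in range(n - 1):
    y[k + 1] = a_true * y[k] + b_true * u[k] + 0.01 * rng.normal()

# Least-squares fit of the model parameters from the recorded time series.
X = np.column_stack([y[:-1], u[:-1]])
a_hat, b_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]
print(f"identified a = {a_hat:.3f}, b = {b_hat:.3f}")
```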

  10. Stochastic model of template-directed elongation processes in biology.

    Science.gov (United States)

    Schilstra, Maria J; Nehaniv, Chrystopher L

    2010-10-01

    We present a novel modular, stochastic model for biological template-based linear chain elongation processes. In this model, elongation complexes (ECs; DNA polymerase, RNA polymerase, or ribosomes associated with nascent chains) that span a finite number of template units step along the template, one after another, with semaphore constructs preventing overtaking. The central elongation module is readily extended with modules that represent initiation and termination processes. The model was used to explore the effect of EC span on motor velocity and dispersion, and the effect of initiation activator and repressor binding kinetics on the overall elongation dynamics. The results demonstrate that (1) motors that move smoothly are able to travel at a greater velocity and closer together than motors that move more erratically, and (2) the rate at which completed chains are released is proportional to the occupancy or vacancy of activator or repressor binding sites only when initiation or activator/repressor dissociation is slow in comparison with elongation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
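    A minimal stochastic sketch of the elongation module: complexes that span several template units advance at random, and the "semaphore" rule blocks any step that would overlap the complex ahead. Initiation and termination are simplified, and all rates and sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

template_len, span, p_step, n_sweeps = 200, 12, 0.5, 2000
positions = []          # leading-edge positions, ordered front to back

for _ in range(n_sweeps):
    for i in range(len(positions)):              # front-most complex first
        pos = positions[i]
        ahead = positions[i - 1] if i > 0 else None
        blocked = ahead is not None and pos + span >= ahead   # semaphore rule
        if not blocked and rng.random() < p_step:
            positions[i] += 1
    if positions and positions[0] > template_len:
        positions.pop(0)                          # termination: chain released
    if (not positions or positions[-1] > span) and rng.random() < 0.2:
        positions.append(0)                       # initiation at template start

print("complexes on template:", len(positions))
```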

  11. AN APPROACH TO EFFICIENT FEM SIMULATIONS ON GRAPHICS PROCESSING UNITS USING CUDA

    Directory of Open Access Journals (Sweden)

    Björn Nutti

    2014-04-01

    Full Text Available The paper presents a highly efficient way of simulating the dynamic behavior of deformable objects by means of the finite element method (FEM), with computations performed on Graphics Processing Units (GPUs). The presented implementation reduces memory-access bottlenecks by grouping the necessary data per node pair, in contrast to the classical per-element arrangement; this strategy avoids memory access patterns that are ill-suited to the GPU memory architecture. Furthermore, the implementation takes advantage of the underlying sparse-block-matrix structure, and it has been demonstrated how to avoid potential bottlenecks in the algorithm. To achieve plausible deformational behavior for large local rotations, the objects are modeled by means of a simplified co-rotational FEM formulation.

  12. FEATURES OF THE SOCIO-POLITICAL PROCESS IN THE UNITED STATES

    Directory of Open Access Journals (Sweden)

    Tatyana Evgenevna Beydina

    2017-06-01

    Full Text Available The subject of this article is the study of political and social developments in the USA at the present stage. Four stages of the American tradition of studying political processes are distinguished. The first stage is connected with the substantiation of the executive, legislative and judicial branches of the political system (the works of F. Pollack and R. Sili). The second includes behavioral studies of politics; besides studying political processes, Charles Merriam studied their similarities and differences. The third stage is characterized by political system studies – the works of T. Parsons, D. Easton, R. Aron, G. Almond and K. Deutsch. The fourth stage is characterized by the problems of superpower and systems democratization (S. Huntington, Zb. Brzezinski). American social processes were characterized by R. Park, P. Sorokin and E. Giddens. The work concentrates on explaining the social and political processes of the US separately while reflecting the unity of American socio-political reality. The academic novelty consists in substantiating the concept of the US socio-political process and characterizing its features. The US socio-political process operates through two channels: soft power and aggression, with soft power manifesting itself in the dominance of the US economy. The main results of the research are the features of the socio-political process in the United States. Purpose: the main goal of the research is to systematize the definition of the socio-political process of the USA and to situate it within the American political tradition. Methodology: methods of systems analysis, comparison, historical analysis, and structural-functional analysis were used. Results: the dynamics of the social and political processes of the United States were analyzed. Practical implications: the results can be applied in international relations theory and practice.

  13. A Multiyear Model of Influenza Vaccination in the United States.

    Science.gov (United States)

    Kamis, Arnold; Zhang, Yuji; Kamis, Tamara

    2017-07-28

    Vaccinating adults against influenza remains a challenge in the United States. Using data from the Centers for Disease Control and Prevention, we present a model for predicting who received influenza vaccination in the United States between 2012 and 2014, inclusive. The logistic regression model contains nine predictors: age, pneumococcal vaccination, time since last checkup, highest education level attained, employment, health care coverage, number of personal doctors, smoker status, and annual household income. The model, which correctly classifies 67 percent of the data in 2013, is consistent with models tested on the 2012 and 2014 datasets. Thus, we have a multiyear model to explain and predict influenza vaccination in the United States. The results indicate room for improvement in vaccination rates. We discuss how cognitive biases may underlie reluctance to obtain vaccination. We argue that targeted communications addressing cognitive biases could be useful for effective framing of vaccination messages, thus increasing the vaccination rate. Finally, we discuss limitations of the current study and questions for future research.
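    An illustrative logistic model with three of the paper's nine predictors, fitted to synthetic survey-like data (the real study uses CDC survey data and reports 67 percent correct classification; the numbers below are invented).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

n = 5000
age = rng.integers(18, 90, n)
pneumo = rng.integers(0, 2, n)      # prior pneumococcal vaccination
coverage = rng.integers(0, 2, n)    # has health care coverage
logit = -4.0 + 0.03 * age + 1.2 * pneumo + 0.8 * coverage
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # synthetic vaccination outcome

X = np.column_stack([age, pneumo, coverage])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```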

  14. Planning intensive care unit design using computer simulation modeling: optimizing integration of clinical, operational, and architectural requirements.

    Science.gov (United States)

    OʼHara, Susan

    2014-01-01

    Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merging of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach, and a standard, for integrating the sciences with real client data to offer solutions for improving patient care.
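    A minimal discrete-event sketch of the kind of question such models answer, written with the SimPy library: how long do patients wait for one of N beds? The arrival and length-of-stay rates are invented, not client data.

```python
import numpy as np
import simpy

rng = np.random.default_rng(5)
waits = []

N_BEDS, ARRIVAL_MEAN_H, LOS_MEAN_H = 10, 6.0, 48.0   # assumed unit parameters

def patient(env, beds):
    arrived = env.now
    with beds.request() as req:
        yield req                                    # wait for a free bed
        waits.append(env.now - arrived)
        yield env.timeout(rng.exponential(LOS_MEAN_H))   # length of stay

def arrivals(env, beds):
    while True:
        yield env.timeout(rng.exponential(ARRIVAL_MEAN_H))
        env.process(patient(env, beds))

env = simpy.Environment()
beds = simpy.Resource(env, capacity=N_BEDS)
env.process(arrivals(env, beds))
env.run(until=24 * 365)                              # simulate one year
print(f"mean wait for a bed: {np.mean(waits):.1f} h")
```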

  15. Massively Parallel Signal Processing using the Graphics Processing Unit for Real-Time Brain-Computer Interface Feature Extraction.

    Science.gov (United States)

    Wilson, J Adam; Williams, Justin C

    2009-01-01

    The clock speeds of modern computer processors have nearly plateaued in the past 5 years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card [graphics processing unit (GPU)] was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a central processing unit-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 1000 channels of 250 ms in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
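    A CPU sketch of the two stages the study moved to the GPU: a spatial filter as a matrix-matrix multiply, followed by per-channel autoregressive coefficients via the Yule-Walker equations (the shapes and AR order are illustrative, and the PSD evaluation from the coefficients is omitted).

```python
import numpy as np

rng = np.random.default_rng(6)

n_ch, n_samp, order = 64, 250, 8
x = rng.normal(size=(n_ch, n_samp))          # one block of channel data
W = np.eye(n_ch) - 1.0 / n_ch                # common-average-reference spatial filter
y = W @ x                                    # spatial filtering (matrix multiply)

def ar_coeffs(sig, p):
    # Solve the Yule-Walker equations from the sample autocorrelation.
    r = np.correlate(sig, sig, "full")[len(sig) - 1:len(sig) + p] / len(sig)
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, r[1:p + 1])

coefs = np.array([ar_coeffs(ch, order) for ch in y])
print("AR coefficient matrix:", coefs.shape)  # (channels, order)
```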

  16. Semantic Business Process Modeling

    OpenAIRE

    Markovic, Ivan

    2010-01-01

    This book presents a process-oriented business modeling framework based on semantic technologies. The framework consists of modeling languages, methods, and tools that allow for semantic modeling of business motivation, business policies and rules, and business processes. Quality of the proposed modeling framework is evaluated based on the modeling content of SAP Solution Composer and several real-world business scenarios.

  17. Modeling of air-gap membrane distillation process: A theoretical and experimental study

    KAUST Repository

    Alsaadi, Ahmad Salem

    2013-06-03

    A one-dimensional (1-D) air gap membrane distillation (AGMD) model for flat-sheet-type modules has been developed. This model is based on mathematical equations that describe the heat and mass transfer mechanisms of a single-stage AGMD process. It can simulate AGMD modules in both co-current and counter-current flow regimes. The theoretical model was validated using AGMD experimental data obtained under different operating conditions and parameters. The predicted water vapor flux was compared to the flux measured at five different feed water temperatures, two different feed water salinities, three different air gap widths, and two MD membranes with different average pore sizes. This comparison showed that the model flux predictions are strongly correlated with the experimental data, with model predictions lying within ±10% of the experimentally determined values. The model was then used to study and analyze the parameters that have a significant effect on scaling up the AGMD process, such as the effect of increasing the membrane length and the feed and coolant flow rates. The model was also used to analyze the maximum thermal efficiency of the AGMD process by tracing changes in the water production rate and the heat input to the process along the membrane length. This was used to understand the gain in both process production and thermal efficiency for different membrane surface areas and the resultant changes in process capital and water unit cost. © 2013 Elsevier B.V.
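    A hedged sketch of the driving force in such a flux model: vapor flux taken proportional to the water vapor-pressure difference between the hot membrane interface and the air gap, with vapor pressure from the Antoine equation. The lumped permeability B is an assumption; the paper resolves the full coupled heat- and mass-transfer network instead.

```python
# Flux sketch: J = B * (P(T_feed_interface) - P(T_air_gap)), with an assumed
# lumped membrane/air-gap permeability B.
def p_water_mmHg(T_C):
    # Antoine equation for water, valid roughly between 1 and 100 deg C.
    return 10 ** (8.07131 - 1730.63 / (233.426 + T_C))

B = 5.0e-8            # assumed lumped permeability, kg/(m^2 s Pa)
T_gap = 25.0          # assumed air-gap-side temperature, deg C
for T_feed in (50, 60, 70, 80):
    dP = (p_water_mmHg(T_feed) - p_water_mmHg(T_gap)) * 133.322  # mmHg -> Pa
    print(f"T_feed={T_feed} C -> flux ~ {B * dP * 3600:.1f} kg/m2/h")
```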

  18. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  19. Classification of hyperspectral imagery using MapReduce on a NVIDIA graphics processing unit (Conference Presentation)

    Science.gov (United States)

    Ramirez, Andres; Rahnemoonfar, Maryam

    2017-04-01

    A hyperspectral image provides a multidimensional picture rich in data, consisting of hundreds of spectral dimensions. Analyzing the spectral and spatial information of such an image with linear and non-linear algorithms results in high computational time. In order to overcome this problem, this research presents a system using a MapReduce-Graphics Processing Unit (GPU) model that can help analyze a hyperspectral image through the use of parallel hardware and a parallel programming model, which is simpler to handle than other low-level parallel programming models. Additionally, Hadoop was used as an open-source implementation of the MapReduce parallel programming model. This research compared classification accuracy and timing results between the Hadoop-GPU system and the following test cases: a combined CPU and GPU test case, a CPU-only test case, and a test case where no dimensional reduction was applied.

  20. Design, manufacturing and commissioning of mobile unit for EDF (Dow Chemical process)

    International Nuclear Information System (INIS)

    Cangini, D.; Cordier, J.P. (PEC Engineering, Osny, France)

    1985-01-01

    To process their spent ion exchange resins and liquid wastes, EDF ordered from PEC a mobile unit using the DOW CHEMICAL binder. This paper presents EDF's design requirements as well as the new French regulations for waste embedding. The mobile unit was started in January 1983 and commissioned successfully in January 1985 at EDF's TRICASTIN power plant.

  1. Software Graphics Processing Unit (sGPU) for Deep Space Applications

    Science.gov (United States)

    McCabe, Mary; Salazar, George; Steele, Glen

    2015-01-01

    A graphics processing capability will be required for deep space missions and must support a range of applications, from safety-critical vehicle health status to telemedicine for crew health. However, preliminary radiation testing of commercial graphics processing cards suggests they cannot operate in the deep space radiation environment. Investigation into a Software Graphics Processing Unit (sGPU) comprised of commercial-equivalent radiation hardened/tolerant single board computers, field programmable gate arrays, and safety-critical display software shows promising results. Preliminary performance of approximately 30 frames per second (FPS) has been achieved. Use of multi-core processors may provide a significant increase in performance.

  2. Computer-aided modeling of aluminophosphate zeolites as packings of building units

    KAUST Repository

    Peskov, Maxim

    2012-03-22

    New building schemes of aluminophosphate molecular sieves from packing units (PUs) are proposed. We have investigated 61 framework types discovered in zeolite-like aluminophosphates and have identified important PU combinations using a recently implemented computational algorithm of the TOPOS package. All PUs whose packing completely determines the overall topology of the aluminophosphate framework were described and catalogued. We have enumerated 235 building models for the aluminophosphates belonging to 61 zeolite framework types, from ring- or cage-like PU clusters. It is indicated that PUs can be considered as precursor species in the zeolite synthesis processes. © 2012 American Chemical Society.

  3. Partial wave analysis using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Niklaus; Liu Beijiang; Wang Jike, E-mail: nberger@ihep.ac.c [Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Lu, Shijingshan, 100049 Beijing (China)

    2010-04-01

    Partial wave analysis is an important tool for determining resonance properties in hadron spectroscopy. For large data samples, however, the un-binned likelihood fits employed are computationally very expensive. At the Beijing Spectrometer (BES) III experiment, an increase in statistics of up to two orders of magnitude compared to earlier experiments is expected. In order to allow for a timely analysis of these datasets, additional computing power with short turnover times has to be made available. It turns out that graphics processing units (GPUs), originally developed for 3D computer games, have an architecture of massively parallel single-instruction multiple-data floating point units that is almost ideally suited to the algorithms employed in partial wave analysis. We have implemented a framework for tensor manipulation and partial wave fits called GPUPWA. The user writes a program in pure C++ whilst the GPUPWA classes handle computations on the GPU, memory transfers, caching and other technical details. In conjunction with a recent graphics processor, the framework provides a speed-up of the partial wave fit by more than two orders of magnitude compared to legacy FORTRAN code.

  4. The ATLAS Fast TracKer Processing Units

    CERN Document Server

    Krizka, Karol; The ATLAS collaboration

    2016-01-01

    The Fast Tracker is a hardware upgrade to the ATLAS trigger and data-acquisition system, with the goal of providing global track reconstruction by the time the High Level Trigger starts. The Fast Tracker can process incoming data from the whole inner detector at the full first-level trigger rate, up to 100 kHz, using custom electronic boards. At the core of the system is a Processing Unit installed in a VMEbus crate, formed by two sets of boards: the first comprises the Associative Memory Board and a powerful rear transition module called the Auxiliary card, while the second is the Second Stage board. The associative memories perform the pattern matching, looking for correlations within the incoming data that are compatible with track candidates at coarse resolution. The pattern matching task is performed using custom application-specific integrated circuits, called associative memory chips. The Auxiliary card prepares the input and rejects bad track candidates obtained from the Associative Memory Board using the full precision a...

  5. Hydrologic consistency as a basis for assessing complexity of monthly water balance models for the continental United States

    Science.gov (United States)

    Martinez, Guillermo F.; Gupta, Hoshin V.

    2011-12-01

    Methods to select parsimonious and hydrologically consistent model structures are useful for evaluating dominance of hydrologic processes and representativeness of data. While information criteria (appropriately constrained to obey underlying statistical assumptions) can provide a basis for evaluating appropriate model complexity, it is not sufficient to rely upon the principle of maximum likelihood (ML) alone. We suggest that one must also call upon a "principle of hydrologic consistency," meaning that selected ML structures and parameter estimates must be constrained (as well as possible) to reproduce desired hydrological characteristics of the processes under investigation. This argument is demonstrated in the context of evaluating the suitability of candidate model structures for lumped water balance modeling across the continental United States, using data from 307 snow-free catchments. The models are constrained to satisfy several tests of hydrologic consistency, a flow space transformation is used to ensure better consistency with underlying statistical assumptions, and information criteria are used to evaluate model complexity relative to the data. The results clearly demonstrate that the principle of consistency provides a sensible basis for guiding selection of model structures and indicate strong spatial persistence of certain model structures across the continental United States. Further work to untangle reasons for model structure predominance can help to relate conceptual model structures to physical characteristics of the catchments, facilitating the task of prediction in ungaged basins.
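    For the statistical side of the argument, a minimal illustration of letting an information criterion pick model complexity on synthetic data; the paper's point is that such a score must be paired with hydrologic-consistency constraints, which a purely statistical criterion cannot enforce on its own.

```python
import numpy as np

rng = np.random.default_rng(9)

n = 120
x = np.linspace(0, 1, n)
y = 1.0 + 2.0 * x - 1.5 * x**2 + 0.1 * rng.normal(size=n)   # synthetic signal

def aic(k):
    # Fit a polynomial of order k and score it with a Gaussian-error AIC
    # (up to an additive constant).
    coeffs = np.polyfit(x, y, k)
    rss = np.sum((y - np.polyval(coeffs, x)) ** 2)
    return n * np.log(rss / n) + 2 * (k + 1)

scores = {k: aic(k) for k in range(1, 7)}      # candidate complexities
print("selected order:", min(scores, key=scores.get))
```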

  6. Preliminary evaluation of the Community Multiscale Air Quality model for 2002 over the Southeastern United States.

    Science.gov (United States)

    Morris, Ralph E; McNally, Dennis E; Tesche, Thomas W; Tonnesen, Gail; Boylan, James W; Brewer, Patricia

    2005-11-01

    The Visibility Improvement State and Tribal Association of the Southeast (VISTAS) is one of five Regional Planning Organizations charged with the management of haze, visibility, and other regional air quality issues in the United States. The VISTAS Phase I work effort modeled three episodes (January 2002, July 1999, and July 2001) to identify the optimal model configuration(s) to be used for the 2002 annual modeling in Phase II. Using model configurations recommended in the Phase I analysis, 2002 annual meteorological (Mesoscale Meteorological Model [MM5]), emissions (Sparse Matrix Operator Kernel Emissions [SMOKE]), and air quality (Community Multiscale Air Quality [CMAQ]) simulations were performed on a 36-km grid covering the continental United States and a 12-km grid covering the Eastern United States. Model estimates were then compared against observations. This paper presents the results of the preliminary CMAQ model performance evaluation for the initial 2002 annual base case simulation. Model performance is presented for the Eastern United States using speciated fine particle concentration and wet deposition measurements from several monitoring networks. Initial results indicate fairly good performance for sulfate, with fractional bias values generally within +/-20%. Nitrate is overestimated in the winter by approximately +50% and underestimated in the summer by more than -100%. Organic carbon exhibits a large summer underestimation bias of approximately -100%, with much improved performance in the winter, where the bias is near zero. Performance for elemental carbon is reasonable, with fractional bias values within +/-40%. Other fine particulate (soil) and coarse particulate matter exhibit large (80-150%) overestimation in the winter but improved performance in the summer. The preliminary 2002 CMAQ runs identified several areas of enhancement to improve model performance, including revised temporal allocation factors for ammonia emissions to improve...
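    Fractional bias, the metric quoted above, takes a few lines to compute; the paired model/observation values below are invented, and note that some studies use a means-based rather than this pairwise definition.

```python
import numpy as np

model = np.array([4.2, 3.1, 5.6, 2.8])   # modeled concentrations, e.g. sulfate (ug/m3)
obs = np.array([3.8, 3.5, 4.9, 3.2])     # paired observations

# Pairwise fractional bias, expressed as a percentage.
fb = 2.0 * np.mean((model - obs) / (model + obs)) * 100.0
print(f"fractional bias: {fb:+.1f}%")
```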

  7. Developing a Model for Identifying Students at Risk of Failure in a First Year Accounting Unit

    Science.gov (United States)

    Smith, Malcolm; Therry, Len; Whale, Jacqui

    2012-01-01

    This paper reports on the process involved in attempting to build a predictive model capable of identifying students at risk of failure in a first year accounting unit in an Australian university. Identifying attributes that contribute to students being at risk can lead to the development of appropriate intervention strategies and support…

  8. Modelling technological process of ion-exchange filtration of fluids in porous media

    Science.gov (United States)

    Ravshanov, N.; Saidov, U. M.

    2018-05-01

    The paper considers the solution of a practical problem related to the filtration and purification of liquid and ionic solutions contaminated with gel particles and heavy ionic compounds. This technological process arises in the preparation and cleaning of chemical solutions, drinking water, pharmaceuticals, liquid fuels, consumer products, etc. A mathematical model is developed for the analysis and study of the process, for determining its main parameters and the operating modes of filter units, and for supporting managerial decision-making. Using the developed model, a series of computational experiments is carried out. The results of the numerical calculations are illustrated as graphs, and conclusions are formulated that serve as a basis for appropriate managerial decisions.

  9. Exploiting graphics processing units for computational biology and bioinformatics.

    Science.gov (United States)

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
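    A vectorised CPU reference for the all-pairs distance computation described above; this is a sketch of the arithmetic the article's CUDA kernel parallelises, not the authors' code:

```python
import numpy as np

def all_pairs_distances(X: np.ndarray) -> np.ndarray:
    """Euclidean distances between all rows of X (n instances x d features),
    using ||a-b||^2 = ||a||^2 + ||b||^2 - 2 a.b to avoid an explicit pair loop."""
    sq = np.einsum("ij,ij->i", X, X)                  # squared norm of each row
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.sqrt(np.maximum(d2, 0.0))               # clamp round-off negatives
```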

  10. Allocating HIV prevention funds in the United States: recommendations from an optimization model.

    Directory of Open Access Journals (Sweden)

    Arielle Lasry

    Full Text Available The Centers for Disease Control and Prevention (CDC) had an annual budget of approximately $327 million to fund health departments and community-based organizations for core HIV testing and prevention programs domestically between 2001 and 2006. Annual HIV incidence has been relatively stable since the year 2000 and was estimated at 48,600 cases in 2006 and 48,100 in 2009. Using estimates of HIV incidence, prevalence, prevention program costs and benefits, and current spending, we created an HIV resource allocation model that can generate a mathematically optimal allocation of the Division of HIV/AIDS Prevention's extramural budget for HIV testing and counseling and education programs. The model's data inputs and methods were reviewed by subject matter experts internal and external to the CDC via an extensive validation process. The model projects the HIV epidemic for the United States under different allocation strategies given a fixed budget. Our objective is to support national HIV prevention planning efforts and inform the decision-making process for HIV resource allocation. Model results can be summarized into three main recommendations. First, more funds should be allocated to testing, and these should further target men who have sex with men and injecting drug users. Second, counseling and education interventions ought to provide a greater focus on HIV-positive persons who are aware of their status. And lastly, interventions should target those at high risk for transmitting or acquiring HIV, rather than lower-risk members of the general population. The main conclusions of the HIV resource allocation model have played a role in the introduction of new programs and provide valuable guidance to target resources and improve the impact of HIV prevention efforts in the United States.
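    To make the allocation idea concrete, a toy greedy sketch that funds programs in order of an assumed cost-effectiveness ratio until the budget is exhausted; all program names and numbers are hypothetical, and the actual CDC model optimises a full epidemic projection rather than a static ratio:

```python
def allocate(budget: float, programs: list[dict]) -> dict[str, float]:
    """Greedy allocation by infections averted per dollar (illustrative only)."""
    plan, remaining = {}, budget
    for p in sorted(programs, key=lambda p: p["averted_per_dollar"], reverse=True):
        spend = min(p["max_spend"], remaining)        # saturate best programs first
        plan[p["name"]] = spend
        remaining -= spend
    return plan

plan = allocate(327e6, [                              # hypothetical inputs
    {"name": "testing_msm_idu", "averted_per_dollar": 5e-6, "max_spend": 150e6},
    {"name": "counseling_hiv_positive", "averted_per_dollar": 3e-6, "max_spend": 120e6},
    {"name": "general_low_risk", "averted_per_dollar": 4e-7, "max_spend": 200e6},
])
```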

  11. Application of Bayesian Techniques to Model the Burden of Human Salmonellosis Attributable to U.S. Food Commodities at the Point of Processing: Adaptation of a Danish Model

    DEFF Research Database (Denmark)

    Guo, Chuanfa; Hoekstra, Robert M.; Schroeder, Carl M.

    2011-01-01

    We developed a point-of-processing foodborne illness attribution model by adapting the Hald Salmonella Bayesian source attribution model. Key model outputs include estimates of the relative proportions of domestically acquired sporadic human Salmonella infections resulting from contamination of raw meat, poultry, and egg products processed in the United States from 1998 through 2003. The current model estimates the relative contribution of chicken (48%), ground beef (28%), turkey (17%), egg products (6%), intact beef (1%), and pork (…)

  12. A systematic review evaluating the role of nurses and processes for delivering early mobility interventions in the intensive care unit.

    Science.gov (United States)

    Krupp, Anna; Steege, Linsey; King, Barbara

    2018-04-19

    To investigate processes for delivering early mobility interventions in adult intensive care unit patients used in research and quality improvement studies, and the role of nurses in early mobility interventions. A systematic review was conducted. The electronic databases PubMed, CINAHL, PEDro, and Cochrane were searched for studies published from 2000 to June 2017 that implemented an early mobility intervention in adult intensive care units. Included studies involved progression to ambulation as a component of the intervention, included the role of the nurse in preparing for or delivering the intervention, and reported at least one patient or organisational outcome measure. The Systems Engineering Initiative for Patient Safety (SEIPS) model, a framework for understanding structure, processes, and healthcare outcomes, was used to evaluate studies. Twenty-five studies were included in the final review. Studies consisted of randomised controlled trials and prospective, retrospective, or mixed designs. A range of processes to support the delivery of early mobility was found. These processes include forming interdisciplinary teams, increasing mobility staff, mobility protocols, interdisciplinary education, champions, communication, and feedback. Variation exists in the process of delivering early mobility in the intensive care unit. In particular, further rigorous studies are needed to better understand the role of nurses in implementing early mobility to maintain a patient's functional status. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Graphics Processing Units for HEP trigger systems

    International Nuclear Information System (INIS)

    Ammendola, R.; Bauce, M.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Fantechi, R.; Fiorini, M.; Giagu, S.; Gianoli, A.; Lamanna, G.; Lonardo, A.; Messina, A.

    2016-01-01

    General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We discuss the use of online parallel computing on GPUs for a synchronous low-level trigger, focusing on the CERN NA62 experiment trigger system. The use of GPUs in higher-level trigger systems is also briefly considered.

  14. Graphics Processing Units for HEP trigger systems

    Energy Technology Data Exchange (ETDEWEB)

    Ammendola, R. [INFN Sezione di Roma “Tor Vergata”, Via della Ricerca Scientifica 1, 00133 Roma (Italy); Bauce, M. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); University of Rome “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); Biagioni, A. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); Chiozzi, S.; Cotta Ramusino, A. [INFN Sezione di Ferrara, Via Saragat 1, 44122 Ferrara (Italy); University of Ferrara, Via Saragat 1, 44122 Ferrara (Italy); Fantechi, R. [INFN Sezione di Pisa, Largo B. Pontecorvo 3, 56127 Pisa (Italy); CERN, Geneve (Switzerland); Fiorini, M. [INFN Sezione di Ferrara, Via Saragat 1, 44122 Ferrara (Italy); University of Ferrara, Via Saragat 1, 44122 Ferrara (Italy); Giagu, S. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); University of Rome “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); Gianoli, A. [INFN Sezione di Ferrara, Via Saragat 1, 44122 Ferrara (Italy); University of Ferrara, Via Saragat 1, 44122 Ferrara (Italy); Lamanna, G., E-mail: gianluca.lamanna@cern.ch [INFN Sezione di Pisa, Largo B. Pontecorvo 3, 56127 Pisa (Italy); INFN Laboratori Nazionali di Frascati, Via Enrico Fermi 40, 00044 Frascati (Roma) (Italy); Lonardo, A. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); Messina, A. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); University of Rome “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); and others

    2016-07-11

    General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We discuss the use of online parallel computing on GPUs for a synchronous low-level trigger, focusing on the CERN NA62 experiment trigger system. The use of GPUs in higher-level trigger systems is also briefly considered.

  15. Prospects and requirements for an operational modelling unit in flood crisis situations

    Directory of Open Access Journals (Sweden)

    Anders Katharina

    2016-01-01

    Full Text Available Dike failure events pose severe flood crisis situations for areas in the hinterland of dikes. In recent decades, the importance of being prepared for dike breaches has been increasingly recognized. However, the pre-assessment of inundation resulting from dike breaches is possible only on the basis of scenarios, which might not reflect the situation of a real event. This paper presents a setup and workflow that make it possible to model dike breach-induced inundation operationally, i.e. when an event is imminent or occurring. A comprehensive system setup of an operational modelling unit has been developed and implemented in the frame of a federal project in Saxony-Anhalt, Germany. The modelling unit setup comprises a powerful flood modelling methodology and elaborated operational guidelines for crisis situations. Nevertheless, it is of fundamental importance that the modelling unit be instated prior to flood events as a permanent system. Moreover, the unit needs to be fully integrated in flood crisis management. If these crucial requirements are met, a modelling unit is capable of fundamentally supporting flood management with operational prognoses of adequate quality, even in the limited timeframe of crisis situations.

  16. Divergent projections of future land use in the United States arising from different models and scenarios

    Science.gov (United States)

    Sohl, Terry L.; Wimberly, Michael; Radeloff, Volker C.; Theobald, David M.; Sleeter, Benjamin M.

    2016-01-01

    A variety of land-use and land-cover (LULC) models operating at scales from local to global have been developed in recent years, including a number of models that provide spatially explicit, multi-class LULC projections for the conterminous United States. This diversity of modeling approaches raises the question: how consistent are their projections of future land use? We compared projections from six LULC modeling applications for the United States and assessed quantitative, spatial, and conceptual inconsistencies. Each set of projections provided multiple scenarios covering a period from roughly 2000 to 2050. Given the unique spatial, thematic, and temporal characteristics of each set of projections, individual projections were aggregated to a common set of basic, generalized LULC classes (i.e., cropland, pasture, forest, range, and urban) and summarized at the county level across the conterminous United States. We found very little agreement in projected future LULC trends and patterns among the different models. Variability among scenarios for a given model was generally lower than variability among different models, in terms of both trends in the amounts of basic LULC classes and their projected spatial patterns. Even when different models assessed the same purported scenario, model projections varied substantially. Projections of agricultural trends were often far above the maximum historical amounts, raising concerns about the realism of the projections. Comparisons among models were hindered by major discrepancies in categorical definitions, and suggest a need for standardization of historical LULC data sources. To capture a broader range of uncertainties, ensemble modeling approaches are also recommended. However, the vast inconsistencies among LULC models raise questions about the theoretical and conceptual underpinnings of current modeling approaches. Given the substantial effects that land-use change can have on ecological and societal processes, there

  17. Junior Leader Training Development in Operational Units

    Science.gov (United States)

    2012-04-01

    Successful operational units do not arise without tough, realistic, and challenging training. Field Manual (FM) 7-0, Training Units and Developing Leaders for Full Spectrum Operations, provides junior leaders with guidance on how to conduct training and training management. Of particular importance is the relationship between the ADDIE process and the Army Training Management Model; both appear in TRADOC PAM 350

  18. Process Improvement to Enhance Quality in a Large Volume Labor and Birth Unit.

    Science.gov (United States)

    Bell, Ashley M; Bohannon, Jessica; Porthouse, Lisa; Thompson, Heather; Vago, Tony

    The goal of the perinatal team at Mercy Hospital St. Louis is to provide a quality patient experience during labor and birth. After the move to a new labor and birth unit in 2013, the team recognized many of the routines and practices needed to be modified based on different demands. The Lean process was used to plan and implement required changes. This technique was chosen because it is based on feedback from clinicians, teamwork, strategizing, and immediate evaluation and implementation of common sense solutions. Through rapid improvement events, presence of leaders in the work environment, and daily huddles, team member engagement and communication were enhanced. The process allowed for team members to offer ideas, test these ideas, and evaluate results, all within a rapid time frame. For 9 months, frontline clinicians met monthly for a weeklong rapid improvement event to create better experiences for childbearing women and those who provide their care, using Lean concepts. At the end of each week, an implementation plan and metrics were developed to help ensure sustainment. The issues that were the focus of these process improvements included on-time initiation of scheduled cases such as induction of labor and cesarean birth, timely and efficient assessment and triage disposition, postanesthesia care and immediate newborn care completed within approximately 2 hours, transfer from the labor unit to the mother baby unit, and emergency transfers to the main operating room and intensive care unit. On-time case initiation for labor induction and cesarean birth improved, length of stay in obstetric triage decreased, postanesthesia recovery care was reorganized to be completed within the expected 2-hour standard time frame, and emergency transfers to the main hospital operating room and intensive care units were standardized and enhanced for efficiency and safety. Participants were pleased with the process improvements and quality outcomes. Working together as a team

  19. The inverse Gamma process: A family of continuous stochastic models for describing state-dependent deterioration phenomena

    International Nuclear Information System (INIS)

    Guida, M.; Pulcini, G.

    2013-01-01

    This paper proposes the family of non-stationary inverse Gamma processes for modeling state-dependent deterioration processes with nonlinear trend. The proposed family of processes, which is based on the assumption that the “inverse” time process is Gamma, is mathematically more tractable than previously proposed state-dependent processes because, unlike the previous models, the inverse Gamma process is time-continuous and state-continuous and does not require discretization of time and state. The conditional distribution of the deterioration growth over a generic time interval, the conditional distribution of the residual life, and the residual reliability of the unit, given the current state, are provided. Point and interval estimation of the parameters which index the proposed process, as well as of several quantities of interest, are also discussed. Finally, the proposed model is applied to the wear process of the liners of some Diesel engines, which had previously been analyzed and shown to be a purely state-dependent process. The comparison of the inferential results obtained under the competing models shows the ability of the inverse Gamma process to adequately model the observed state-dependent wear process.
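    A minimal simulation sketch of the time-inversion idea behind such a process, under an assumed parameterisation: the "inverse" process T(w), the time to reach wear level w, has independent Gamma increments, and the wear path w(t) is read off its inverse (the paper's full model is richer, e.g. in how state dependence enters):

```python
import numpy as np

rng = np.random.default_rng(0)

def inverse_gamma_path(w_grid: np.ndarray, shape_fn, scale: float = 1.0):
    """T(w) has independent Gamma increments whose cumulative shape is shape_fn(w);
    returns the hitting time of each wear level, i.e. the inverse of the path."""
    d_shape = np.diff(shape_fn(w_grid))               # increments of the shape function
    dt = rng.gamma(shape=d_shape, scale=scale)        # Gamma-distributed time increments
    t = np.concatenate([[0.0], np.cumsum(dt)])
    return t, w_grid                                  # plot w_grid against t for w(t)

# Nonlinear trend via a nonlinear cumulative shape function (values invented).
t, w = inverse_gamma_path(np.linspace(0.0, 3.0, 200), lambda w: 2.0 * w ** 1.3)
```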

  20. Heterogeneous Multicore Parallel Programming for Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Francois Bodin

    2009-01-01

    Full Text Available Hybrid parallel multicore architectures based on graphics processing units (GPUs) can provide tremendous computing power. Current NVIDIA and AMD Graphics Product Group hardware displays a peak performance of hundreds of gigaflops. However, exploiting GPUs from existing applications is a difficult task that requires non-portable rewriting of the code. In this paper, we present HMPP, a Heterogeneous Multicore Parallel Programming workbench with compilers, developed by CAPS entreprise, that allows the integration of heterogeneous hardware accelerators in an unintrusive manner while preserving the legacy code.

  1. Nanoscale multireference quantum chemistry: full configuration interaction on graphical processing units.

    Science.gov (United States)

    Fales, B Scott; Levine, Benjamin G

    2015-10-13

    Methods based on a full configuration interaction (FCI) expansion in an active space of orbitals are widely used for modeling chemical phenomena such as bond breaking, multiply excited states, and conical intersections in small-to-medium-sized molecules, but these phenomena occur in systems of all sizes. To scale such calculations up to the nanoscale, we have developed an implementation of FCI in which electron repulsion integral transformation and several of the more expensive steps in σ vector formation are performed on graphical processing unit (GPU) hardware. When applied to a 1.7 × 1.4 × 1.4 nm silicon nanoparticle (Si72H64) described with the polarized, all-electron 6-31G** basis set, our implementation can solve for the ground state of the 16-active-electron/16-active-orbital CASCI Hamiltonian (more than 100,000,000 configurations) in 39 min on a single NVidia K40 GPU.

  2. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    The present thesis considers numerical modeling of activated sludge tanks at municipal wastewater treatment plants. Focus is aimed at integrated modeling, where the detailed microbiological model, the Activated Sludge Model 3 (ASM3), is combined with a detailed hydrodynamic model based on a numerical solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that often occur in activated sludge tanks are initially … hydrofoil-shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements

  3. Application of ion-exchange unit in uranium extraction process in China (to be continued)

    International Nuclear Information System (INIS)

    Gong Chuanwen

    2004-01-01

    The application conditions of five different ion-exchange units in a uranium milling plant and in the wastewater treatment plant of a uranium mine in China are introduced, including working parameters, existing problems, and improvements. The advantages and disadvantages of these units are reviewed briefly. Points of procedure to be followed when selecting an ion-exchange unit in engineering design are recommended. Preliminary views are presented on the application prospects of some ion-exchange units in the uranium extraction process in China.

  4. Unit Process Wetlands for Removal of Trace Organic Contaminants and Pathogens from Municipal Wastewater Effluents

    Science.gov (United States)

    Jasper, Justin T.; Nguyen, Mi T.; Jones, Zackary L.; Ismail, Niveen S.; Sedlak, David L.; Sharp, Jonathan O.; Luthy, Richard G.; Horne, Alex J.; Nelson, Kara L.

    2013-01-01

    Treatment wetlands have become an attractive option for the removal of nutrients from municipal wastewater effluents due to their low energy requirements and operational costs, as well as the ancillary benefits they provide, including creating aesthetically appealing spaces and wildlife habitats. Treatment wetlands also hold promise as a means of removing other wastewater-derived contaminants, such as trace organic contaminants and pathogens. However, concerns about variations in treatment efficacy of these pollutants, coupled with an incomplete mechanistic understanding of their removal in wetlands, hinder the widespread adoption of constructed wetlands for these two classes of contaminants. A better understanding is needed so that wetlands as a unit process can be designed for their removal, with individual wetland cells optimized for the removal of specific contaminants, and connected in series or integrated with other engineered or natural treatment processes. In this article, removal mechanisms of trace organic contaminants and pathogens are reviewed, including sorption and sedimentation, biotransformation and predation, photolysis and photoinactivation, and remaining knowledge gaps are identified. In addition, suggestions are provided for how these treatment mechanisms can be enhanced in commonly employed unit process wetland cells or how they might be harnessed in novel unit process cells. It is hoped that application of the unit process concept to a wider range of contaminants will lead to more widespread application of wetland treatment trains as components of urban water infrastructure in the United States and around the globe. PMID:23983451

  5. Unit Process Wetlands for Removal of Trace Organic Contaminants and Pathogens from Municipal Wastewater Effluents.

    Science.gov (United States)

    Jasper, Justin T; Nguyen, Mi T; Jones, Zackary L; Ismail, Niveen S; Sedlak, David L; Sharp, Jonathan O; Luthy, Richard G; Horne, Alex J; Nelson, Kara L

    2013-08-01

    Treatment wetlands have become an attractive option for the removal of nutrients from municipal wastewater effluents due to their low energy requirements and operational costs, as well as the ancillary benefits they provide, including creating aesthetically appealing spaces and wildlife habitats. Treatment wetlands also hold promise as a means of removing other wastewater-derived contaminants, such as trace organic contaminants and pathogens. However, concerns about variations in treatment efficacy of these pollutants, coupled with an incomplete mechanistic understanding of their removal in wetlands, hinder the widespread adoption of constructed wetlands for these two classes of contaminants. A better understanding is needed so that wetlands as a unit process can be designed for their removal, with individual wetland cells optimized for the removal of specific contaminants, and connected in series or integrated with other engineered or natural treatment processes. In this article, removal mechanisms of trace organic contaminants and pathogens are reviewed, including sorption and sedimentation, biotransformation and predation, photolysis and photoinactivation, and remaining knowledge gaps are identified. In addition, suggestions are provided for how these treatment mechanisms can be enhanced in commonly employed unit process wetland cells or how they might be harnessed in novel unit process cells. It is hoped that application of the unit process concept to a wider range of contaminants will lead to more widespread application of wetland treatment trains as components of urban water infrastructure in the United States and around the globe.

  6. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    Science.gov (United States)

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. No previous experience of the use of this notation for modelling Pathology processes, in Spain or elsewhere, is known to us. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Department of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology using BPMN is presented. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals.

  7. Modeling grinding processes as micro processes

    African Journals Online (AJOL)

    eobe

    Industrial precision grinding processes are cylindrical, centerless and … Several models have been proposed and used to study grinding … grinding force for the two cases was 9.07237 N/mm … International Journal of Machine Tools &

  8. Pre-design safety analyses of cesium ion-exchange compact processing unit

    International Nuclear Information System (INIS)

    Richmond, W.G.; Ballinger, M.Y.

    1993-11-01

    This report describes an innovative radioactive waste pretreatment concept. This cost-effective, highly flexible processing approach is based on the use of Compact Processing Units (CPUs) to treat highly radioactive tank wastes in proximity to the tanks themselves. The units will be designed to treat tank wastes at rates from 8 to 20 liters per minute and have the capacity to remove cesium, and ultimately other radionuclides, from 4,000 cubic meters of waste per year. This new concept is being integrated into Hanford's tank farm management plans by a team of PNL and Westinghouse Hanford Company scientists and engineers. The first CPU to be designed and deployed will be used to remove cesium from Hanford double-shell tank (DST) supernatant waste. Separating Cs from the waste would be a major step toward lowering the radioactivity in the bulk of the waste, allowing it to be disposed of as a low-level solid waste form (e.g., grout), while concentrating the more highly radioactive material for processing as high-level solid waste.

  9. Advanced computational modelling for drying processes – A review

    International Nuclear Information System (INIS)

    Defraeye, Thijs

    2014-01-01

    Highlights: • Understanding the product dehydration process is a key aspect in drying technology. • Advanced modelling thereof plays an increasingly important role for developing next-generation drying technology. • Dehydration modelling should be more energy-oriented. • An integrated “nexus” modelling approach is needed to produce more energy-smart products. • Multi-objective process optimisation requires development of more complete multiphysics models. - Abstract: Drying is one of the most complex and energy-consuming chemical unit operations. R and D efforts in drying technology have skyrocketed in the past decades, as new drivers emerged in this industry next to procuring prime product quality and high throughput, namely reduction of energy consumption and carbon footprint as well as improving food safety and security. Solutions are sought in optimising existing technologies or developing new ones which increase energy and resource efficiency, use renewable energy, recuperate waste heat and reduce product loss, and thus also the embodied energy therein. Novel tools are required to push such technological innovations and their subsequent implementation. Computer-aided drying process engineering in particular has a large potential to develop next-generation drying technology, including more energy-smart and environmentally friendly products and dryer systems. This review paper deals with rapidly emerging advanced computational methods for modelling dehydration of porous materials, particularly foods. Drying is approached as a combined multiphysics, multiscale and multiphase problem. These advanced methods include computational fluid dynamics, several multiphysics modelling methods (e.g. conjugate modelling), multiscale modelling and modelling of material properties and the associated propagation of material property variability. Apart from the current challenges for each of these, future perspectives should be directed towards material property

  10. Methodology for systematic analysis and improvement of manufacturing unit process life-cycle inventory (UPLCI)—CO2PE! initiative (cooperative effort on process emissions in manufacturing). Part 1: Methodology description

    DEFF Research Database (Denmark)

    Kellens, Karel; Dewulf, Wim; Overcash, Michael

    2012-01-01

    This report proposes a life-cycle analysis (LCA)-oriented methodology for systematic inventory analysis of the use phase of manufacturing unit processes, providing unit process datasets to be used in life-cycle inventory (LCI) databases and libraries. The methodology has been developed … the provision of high-quality data for LCA studies of products using these unit process datasets for the manufacturing processes, as well as the in-depth analysis of individual manufacturing unit processes. In addition, the accruing availability of data for a range of similar machines (same process, different …) … and resource efficiency improvements of the manufacturing unit process. To ensure optimal reproducibility and applicability, documentation guidelines for data and metadata are included in both approaches. Guidance on definition of functional unit and reference flow as well as on determination of system

  11. Exergy analysis of a solar-powered vacuum membrane distillation unit using two models

    International Nuclear Information System (INIS)

    Miladi, Rihab; Frikha, Nader; Gabsi, Slimane

    2017-01-01

    A detailed exergy analysis of a solar-powered VMD unit was performed using two models: an ideal-mixture model and a model using the thermodynamic properties of seawater. The exergy flow rates of process steam given by the two models differed by about 18% on average. Despite these differences, the two models agree that the largest fraction of exergy was destroyed during the condensation step. Moreover, two forms of exergy efficiency are calculated in this work. The overall exergy efficiency of the unit with reference to the exergy collected by the solar collector was 3.25% and 2.30% according to the Cerci and Sharqawy models, respectively, but it was 0.182% and 0.128% when referenced to the exergy of solar radiation, according to the Cerci and Sharqawy models, respectively. Besides, the utilitarian exergy efficiency was 9.96%. Since the heat exchanger, the hollow-fiber module and the condenser have a very high exergy performance, it can be concluded that enhancement, i.e. the reduction of exergy losses, will come mainly from recovering the heat lost in brine discharges and in the rejected cooling water. In addition, the influence of the rejection rate on exergy efficiencies was studied. - Highlights: • Two exergy models were compared using a VMD plant dataset. • Two forms of exergy efficiency were evaluated and discussed. • The components responsible for the biggest losses in the system were identified. • The direction for performance enhancement of the desalination device was pointed out. • The influence of the rejection rate on exergy efficiencies was studied.

  12. Lumped Parameter Modeling for Rapid Vibration Response Prototyping and Test Correlation for Electronic Units

    Science.gov (United States)

    Van Dyke, Michael B.

    2013-01-01

    This work presents preliminary results on using lumped parameter models to approximate the dynamic response of electronic units to random vibration: a general N-DOF model is derived for application to electronic units; the parametric influence of the model parameters is illustrated; the implications of coupled dynamics for unit/board design are discussed; and the use of the model to infer printed wiring board (PWB) dynamics from external chassis test measurements is demonstrated.
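    A minimal sketch of such a lumped-parameter chain, with hypothetical masses and stiffnesses for a chassis/board/component ladder, solving the generalised eigenproblem for the coupled natural frequencies:

```python
import numpy as np
from scipy.linalg import eigh

M = np.diag([1.0, 0.25, 0.05])            # kg: chassis, PWB, component (assumed)
k1, k2, k3 = 5e5, 8e4, 2e4                # N/m: mount, standoff, lead (assumed)
K = np.array([[k1 + k2, -k2,      0.0],
              [-k2,      k2 + k3, -k3],
              [0.0,     -k3,       k3]])
w2, modes = eigh(K, M)                    # generalised eigenproblem K x = w^2 M x
freqs_hz = np.sqrt(w2) / (2.0 * np.pi)    # coupled chassis/board/component modes
```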

  13. Automatic pre-processing for an object-oriented distributed hydrological model using GRASS-GIS

    Science.gov (United States)

    Sanzana, P.; Jankowfsky, S.; Branger, F.; Braud, I.; Vargas, X.; Hitschfeld, N.

    2012-04-01

    Landscapes are very heterogeneous, which impacts the hydrological processes occurring in catchments, especially in the modeling of peri-urban catchments. Hydrological Response Units (HRUs), resulting from the intersection of different maps, such as land use, soil types, geology, and flow networks, allow these elements to be represented explicitly, preserving the natural and artificial contours of the different layers. These HRUs are used as the model mesh in some distributed object-oriented hydrological models, allowing the application of a topology-oriented approach. The connectivity between polygons and polylines provides a detailed representation of the water balance and overland flow in these distributed hydrological models, based on irregular hydro-landscape units. When computing fluxes between these HRUs, geometrical parameters, such as the distance between the centroid of an HRU and the river network, and the length of the perimeter, can affect the realism of the calculated overland, sub-surface and groundwater fluxes. Therefore, it is necessary to process the original model mesh in order to avoid these numerical problems. We present an automatic pre-processing tool implemented in the open-source GRASS-GIS software, for which several Python scripts and some already available algorithms, such as the Triangle software, were used. First, some scripts were developed to improve the topology of the various elements, such as snapping the river network to the closest contours. Where data such as vegetation areas are derived from remote sensing, their perimeters contain many right angles, which were smoothed. Second, the algorithms more particularly address badly shaped elements of the model mesh, such as polygons with narrow shapes, marked irregular contours and/or a centroid lying outside the polygon. To identify these elements we used shape descriptors. The convexity index was considered the best descriptor to identify them with a threshold
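    For instance, a convexity index can be computed as the ratio of a polygon's area to the area of its convex hull (one common definition; the exact formula and threshold used in the paper are not stated here, so the cut-off below is hypothetical):

```python
from shapely.geometry import Polygon

def convexity_index(poly: Polygon) -> float:
    """1.0 for convex shapes; drops toward 0 for marked irregular contours."""
    return poly.area / poly.convex_hull.area

l_shape = Polygon([(0, 0), (4, 0), (4, 1), (1, 1), (1, 4), (0, 4)])
needs_repair = convexity_index(l_shape) < 0.75   # ~0.61 < assumed threshold
```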

  14. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average-concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of purifying gases of sulfur dioxide, which form the basis of several patents.

  15. Low cost solar array project production process and equipment task. A Module Experimental Process System Development Unit (MEPSDU)

    Science.gov (United States)

    1981-01-01

    Technical readiness for the production of photovoltaic modules using single crystal silicon dendritic web sheet material is demonstrated by: (1) selection, design and implementation of solar cell and photovoltaic module process sequence in a Module Experimental Process System Development Unit; (2) demonstration runs; (3) passing of acceptance and qualification tests; and (4) achievement of a cost effective module.

  16. Organization of Control Units with Operational Addressing

    OpenAIRE

    Alexander A. Barkalov; Roman M. Babakov; Larysa A. Titarenko

    2012-01-01

    The use of an operational addressing unit as a block of the control unit is proposed. A new structural model of a Moore finite-state machine with a reduced hardware amount is developed, and a generalized structure of the operational addressing unit is suggested. An example of the synthesis process for a Moore finite-state machine with an operational addressing unit is given, and an analytical study of the proposed control unit structure is carried out.
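    For orientation, the defining property of a Moore machine is that its output depends on the current state alone; a minimal table-driven sketch with invented states, inputs and output codes:

```python
# Transition table: (state, input) -> next state; outputs depend on state only.
NEXT = {("idle", "start"): "run", ("run", "stop"): "idle",
        ("run", "fault"): "halt", ("halt", "reset"): "idle"}
OUT = {"idle": 0b00, "run": 0b01, "halt": 0b10}

def moore_step(state: str, signal: str) -> tuple[str, int]:
    nxt = NEXT.get((state, signal), state)    # hold state on undefined input
    return nxt, OUT[nxt]                      # Moore: output = f(state) only

state, out = moore_step("idle", "start")      # -> ("run", 0b01)
```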

  17. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  18. What makes process models understandable?

    NARCIS (Netherlands)

    Mendling, J.; Reijers, H.A.; Cardoso, J.; Alonso, G.; Dadam, P.; Rosemann, M.

    2007-01-01

    Although formal and informal quality aspects are of significant importance to business process modeling, little empirical work has been reported on process model quality and its impact factors. In this paper we investigate understandability as a proxy for quality of process models and focus

  19. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  20. Benchmarking of Generation and Distribution Units in Nepal Using Modified DEA Models

    Science.gov (United States)

    Jha, Deependra Kumar; Yorino, Naoto; Zoka, Yoshifumi

    This paper analyzes the performance of the Nepalese Electricity Supply Industry (ESI) by investigating the relative operational efficiencies of the generating stations as well as the Distribution Centers (DCs) of the Integrated Nepal Power System (INPS). The Nepal Electricity Authority (NEA), a state-owned utility, owns and operates the INPS. Performance evaluation of both generation and distribution systems is carried out by formulating suitable weight-restriction-type Data Envelopment Analysis (DEA) models. The models include a wide range of inputs and outputs representing the essence of the respective processes. The decision maker's preferences as well as the available quantitative information associated with the operation of the Decision Making Units (DMUs) are judiciously incorporated in the DEA models. The proposed models are realized through computer programs written in the General Algebraic Modeling System (GAMS), and the results obtained are compared against those from the conventional DEA models. Sensitivity analysis is performed in order to check the robustness of the results as well as to identify improvement directions for the DMUs. A ranking of the DMUs is presented based on their average overall efficiency scores.
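    As background, the plain input-oriented CCR efficiency of one decision-making unit can be obtained from a small linear program in envelopment form; the sketch below omits the paper's weight restrictions, which would enter as additional constraints:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X: np.ndarray, Y: np.ndarray, k: int) -> float:
    """theta* for DMU k: min theta s.t. X'lam <= theta*x_k, Y'lam >= y_k, lam >= 0.
    X: inputs (n_dmu x n_in), Y: outputs (n_dmu x n_out)."""
    n = X.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                            # minimise theta
    A_in = np.hstack([-X[[k]].T, X.T])                    # input constraints
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])  # output constraints
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(X.shape[1]), -Y[k]]),
                  bounds=[(0, None)] * (n + 1))
    return res.fun                                        # 1.0 means efficient
```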

  1. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    OpenAIRE

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    1999-01-01

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  2. Process quality in the Trade Finance unit from the perspective of corporate banking employees

    OpenAIRE

    Mikkola, Henri

    2013-01-01

    This thesis examines the quality of the processes in the Trade Finance unit of Pohjola Bank, from the perspective of the corporate banking employees at Helsinki OP Bank. The Trade Finance unit provides methods of payment for foreign trade. Such services are intended for companies and the perspective investigated in this thesis is that of corporate banking employees. The purpose of this thesis is to define the quality of the processes and to develop solutions for difficulties discovered. The q...

  3. Utero-fetal unit and pregnant woman modeling using a computer graphics approach for dosimetry studies.

    Science.gov (United States)

    Anquez, Jérémie; Boubekeur, Tamy; Bibin, Lazar; Angelini, Elsa; Bloch, Isabelle

    2009-01-01

    Potential health effects related to electromagnetic field exposure raise public concern, especially for fetuses during pregnancy. Human fetus exposure can only be assessed through simulated dosimetry studies, performed on anthropomorphic models of pregnant women. In this paper, we propose a new methodology to generate a set of detailed utero-fetal unit (UFU) 3D models during the first and third trimesters of pregnancy, based on segmented 3D ultrasound and MRI data. UFU models are built using recent geometry processing methods derived from mesh-based computer graphics techniques and embedded in a synthetic woman body. Nine pregnant woman models have been generated using this approach and validated by obstetricians for anatomical accuracy and representativeness.

  4. Distributed model based control of multi unit evaporation systems

    International Nuclear Information System (INIS)

    Yudi Samyudia

    2006-01-01

    In this paper, we present a new approach to the analysis and design of distributed control systems for multi-unit plants. The approach is established by treating the effect of recycle dynamics as a gap-metric uncertainty, from which a distributed controller can be designed sequentially for each unit to tackle the uncertainty. We then use a single-effect multi-unit evaporation system to illustrate how the proposed method is used to analyze different control strategies and to systematically achieve better closed-loop performance using a distributed model-based controller.

  5. Modeling Intercity Mode Choice and Airport Choice in the United States

    OpenAIRE

    Ashiabor, Senanu Y.

    2007-01-01

    The aim of this study was to develop a framework to model travel choice behavior in order to estimate intercity travel demand at the national level in the United States. Nested and mixed logit models were developed to study national-level intercity transportation in the United States. A separate General Aviation airport choice model that estimates General Aviation person-trips and the number of aircraft operations through more than 3000 airports was also developed. The combination of the General Aviati
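    The workhorse behind such models is the multinomial logit choice probability P_i = exp(V_i) / sum_j exp(V_j); a minimal sketch with invented utilities (nested and mixed logit generalise this by allowing correlation among alternatives):

```python
import numpy as np

def logit_probs(utilities: dict[str, float]) -> dict[str, float]:
    """Multinomial logit mode shares; the max-shift keeps the exponentials stable."""
    v = np.array(list(utilities.values()))
    p = np.exp(v - v.max())
    p /= p.sum()
    return dict(zip(utilities, p))

shares = logit_probs({"auto": -0.5, "air": -1.2, "rail": -1.8})  # assumed V's
```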

  6. Modeling and simulation of a New Design of the SMCEC Desalination Unit Using Solar Energy

    International Nuclear Information System (INIS)

    Zhani, K.; Ben Bacha, H.

    2009-01-01

    The aim of this research is a parametric study of a new process design based on the humidification/dehumidification (HD) technique and solar energy, developed to improve the production of the SMCEC unit (Solar Multiple Condensation Evaporation Cycle). The SMCEC unit is currently operating at Sfax's national engineering school in Tunisia. The production improvement consists of increasing the capacity of the air to carry water vapor by heating and subsequently humidifying the air at the exit of the condensation tower, instead of rejecting or recycling it. To achieve this objective, a flat-plate solar air collector for heating the air and a humidifier for humidifying it are integrated into the SMCEC unit. The newly designed system is thus composed of a flat-plate solar air collector, a flat-plate solar water collector, a humidifier, an evaporation tower and a condensation tower. A general model based on heat and mass transfers in each component of the unit is developed for the steady-state regime. The resulting set of ordinary differential equations is converted into an algebraic system of equations by the functional approximation method of orthogonal collocation. The developed model is used to investigate both the effect of different operating modes on the water condensation rate and the steady-state behavior of each component of the unit and of the entire system when exposed to variations in the inlet parameters and meteorological conditions.

  7. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    … are affected (positively or negatively) by the presence of the other enzymes and compounds in the media. In this thesis, the term multi-enzyme in-pot is adopted for processes that are carried out by a combination of enzymes in a single reactor and implemented at pilot or industrial scale. … features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, the framework increases the efficiency of the model development process with respect to the time and resources needed (fast and effective) … In this way, the model parameters that drive the main dynamic behavior can be identified, and thus a better understanding of this type of process gained. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid …

  8. Process model repositories and PNML

    NARCIS (Netherlands)

    Hee, van K.M.; Post, R.D.J.; Somers, L.J.A.M.; Werf, van der J.M.E.M.; Kindler, E.

    2004-01-01

    Bringing system and process models together in repositories facilitates the interchange of model information between modelling tools, and allows the combination and interlinking of complementary models. Petriweb is a web application for managing such repositories. It supports hierarchical process

  9. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  10. Biodiversity of indigenous staphylococci of naturally fermented dry sausages and manufacturing environments of small-scale processing units.

    Science.gov (United States)

    Leroy, Sabine; Giammarinaro, Philippe; Chacornac, Jean-Paul; Lebert, Isabelle; Talon, Régine

    2010-04-01

    The staphylococcal community of the environments of nine French small-scale processing units and of their naturally fermented meat products was identified by analyzing 676 isolates. Fifteen species were accurately identified using validated molecular methods. The three prevalent species were Staphylococcus equorum (58.4%), Staphylococcus saprophyticus (15.7%) and Staphylococcus xylosus (9.3%). S. equorum was isolated in all the processing units in similar proportions in meat and environmental samples. S. saprophyticus was also isolated in all the processing units, with a higher percentage in environmental samples. S. xylosus was present sporadically in the processing units, and its prevalence was higher in meat samples. The genetic diversity of the strains within the three species isolated from one processing unit was studied by PFGE and revealed a high diversity for S. equorum and S. saprophyticus in both the environment and the meat isolates. The genetic diversity remained high through the manufacturing steps. A small percentage of the strains of the two species shared the two ecological niches. These results highlight that some strains, probably introduced by the meat, will persist in the manufacturing environment, while other strains are more adapted to the meat products.

  11. Modeling number of bacteria per food unit in comparison to bacterial concentration in quantitative risk assessment: impact on risk estimates.

    Science.gov (United States)

    Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin

    2015-02-01

    When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise. Published by Elsevier Ltd.
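    A small Monte Carlo sketch of the effect described above, with invented parameters: integer counts let drastic inactivation drive many units to zero bacteria, and those units stay sterile through regrowth, whereas the concentration view keeps a positive dose everywhere and so inflates the mean risk under a nonlinear dose-response:

```python
import numpy as np

rng = np.random.default_rng(1)
n, vol_ml, r = 200_000, 25.0, 1e-4            # units, serving volume, dose-response slope
log_red, log_grow = 5.0, 5.0                  # drastic inactivation, then regrowth

conc0 = 10 ** rng.normal(-1.0, 0.8, n)        # initial CFU/mL (lognormal, assumed)

# (a) Concentration model: every unit retains a positive, fractional dose.
dose_conc = conc0 * vol_ml * 10 ** (log_grow - log_red)

# (b) Count model: integer survivors; extinct units stay sterile during regrowth.
survivors = rng.binomial(rng.poisson(conc0 * vol_ml), 10 ** -log_red)
dose_count = survivors * 10 ** log_grow

risk = lambda d: 1.0 - np.exp(-r * d)         # exponential dose-response
print(risk(dose_conc).mean() / risk(dose_count).mean())   # ratio >> 1 here
```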

  12. Optimization Solutions for Improving the Performance of the Parallel Reduction Algorithm Using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Ion LUNGU

    2012-01-01

    Full Text Available In this paper, we research, analyze and develop optimization solutions for the parallel reduction function using graphics processing units (GPUs) that implement the Compute Unified Device Architecture (CUDA), a modern and novel approach for improving the software performance of data processing applications and algorithms. Many of these applications and algorithms make use of the reduction function in their computational steps. After having designed the function and its algorithmic steps in CUDA, we progressively developed and implemented optimization solutions for the reduction function. In order to confirm, test and evaluate the solutions' efficiency, we developed a custom-tailored benchmark suite. We analyzed the obtained experimental results regarding: the comparison of execution time and bandwidth when using graphics processing units covering the main CUDA architectures (Tesla GT200, Fermi GF100, Kepler GK104) and a central processing unit; the influence of the data type; and the influence of the binary operator.
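    For context, the kernel being optimised implements a logarithmic-depth tree reduction; a plain sequential sketch of the access pattern (each pass pairs the first half of the active array with the second half, the layout that yields coalesced reads in a CUDA kernel):

```python
import numpy as np

def tree_reduce(values: np.ndarray, op=np.add):
    """Pairwise reduction in O(log n) passes, mirroring a shared-memory GPU kernel."""
    x = values.copy()
    n = x.size
    while n > 1:
        half = n // 2
        x[:half] = op(x[:half], x[n - half:n])   # contiguous halves, no bank conflicts
        n = n - half                              # an odd middle element carries over
    return x[0]

assert tree_reduce(np.arange(5.0)) == 10.0
```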

  13. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    Science.gov (United States)

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding, enable systematic appraisal, and identify areas for improvement of a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase - parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  14. Modeling and simulation of CO methanation process for renewable electricity storage

    International Nuclear Information System (INIS)

    Er-rbib, Hanaâ; Bouallou, Chakib

    2014-01-01

    In this paper, a new approach for converting renewable electricity into methane via syngas (a mixture of CO and H2) and CO methanation is presented. Surplus electricity is used to electrolyze H2O and CO2 to H2 and CO using a SOEC (Solid Oxide Electrolysis Cell). The syngas produced is then converted into methane. When high consumption peaks appear, the methane is used to produce electricity. The main conversion step in this process is CO methanation. A model of the catalytic fixed-bed methanation reactor and a design of a methanation unit composed of multistage adiabatic reactors are carried out using Aspen Plus™ software. The model was validated by comparing the simulated gas compositions (CH4, CO, CO2 and H2) with industrial data. In addition, the effects of the recycle ratio on the number of adiabatic reactor stages, the outlet temperature, and the H2 and CO conversions are carefully investigated. It is found that for storing 10 MW of renewable electricity, the methanation unit is composed of three adiabatic reactors with a recycle loop and intermediate cooling at 553 K and 1.5 MPa. The methanation unit generates 3778.6 kg/h of steam at 523.2 K and 1 MPa (13.67 MW). - Highlights: • A catalytic fixed-bed reactor for CO methanation was modeled. • The maximum relative error of the methanation reactor model is 12%. • For 10 MW storage of renewable electricity, three adiabatic reactors are required. • The recycle ratio affects the reactor outlet temperature and CO conversion.
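
    A back-of-the-envelope energy balance clarifies why the unit needs several adiabatic stages with intermediate cooling. The sketch below is not the Aspen Plus model; the heat of reaction is the standard value for CO + 3H2 -> CH4 + H2O (about -206 kJ/mol CO at 298 K), while the mean heat capacity and the moles of gas fed per mole of CO are assumed round numbers:

```python
# Rough adiabatic temperature rise per stage of CO methanation (illustrative).
dH = -206_000.0        # J per mol CO converted (approx. standard value)
cp_mix = 35.0          # J/(mol K), assumed mean molar heat capacity of the gas
T_in = 553.0           # K, stage inlet temperature quoted in the abstract

def adiabatic_T_out(conversion, mol_gas_per_mol_co=5.0):
    """Stage outlet temperature for a given CO conversion (assumed feed)."""
    heat_released = -dH * conversion               # J per mol CO fed
    return T_in + heat_released / (mol_gas_per_mol_co * cp_mix)

for x in (0.2, 0.4, 0.6):
    print(f"conversion {x:.0%}: T_out ~ {adiabatic_T_out(x):.0f} K")
```

    Even 20% conversion per pass heats the gas by roughly 200 K under these assumptions, which is why recycle dilution (raising the effective moles of gas per mole of CO) and interstage cooling are essential design levers.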

  15. Development of a Paraffin Wax Deposition Unit for Fused Deposition Modelling (FDM)

    DEFF Research Database (Denmark)

    D'Angelo, Greta; Hansen, Hans Nørgaard; Pedersen, David Bue

    2014-01-01

    This project illustrates the redesign of an extrusion unit for the deposition of paraffin wax in Fused Deposition Modelling (FDM) instead of the conventional polymeric materials. Among the benefits brought by the use of paraffin wax in such a system are: the possibility to make highly complex and precise parts for subsequent use in a Lost Wax Casting process, multi-material Additive Manufacturing, and the use of wax as support material during the production of complicated parts. Moreover, it is believed that including waxes among the materials usable in FDM would promote new ways of using and exploring...

  16. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration with a major international engineering company.

  17. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on proper understanding of the functionality of information systems that shall support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, we assess whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  18. Automatic extraction of process categories from process model collections

    NARCIS (Netherlands)

    Malinova, M.; Dijkman, R.M.; Mendling, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Many organizations build up their business process management activities in an incremental way. As a result, there is no overarching structure defined at the beginning. However, as business process modeling initiatives often yield hundreds to thousands of process models, there is a growing need for

  19. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  20. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  1. Conceptual models of information processing

    Science.gov (United States)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  2. A Block-Asynchronous Relaxation Method for Graphics Processing Units

    OpenAIRE

    Anzt, H.; Dongarra, J.; Heuveline, Vincent; Tomov, S.

    2011-01-01

    In this paper, we analyze the potential of asynchronous relaxation methods on Graphics Processing Units (GPUs). For this purpose, we developed a set of asynchronous iteration algorithms in CUDA and compared them with a parallel implementation of synchronous relaxation methods on CPU-based systems. For a set of test matrices taken from the University of Florida Matrix Collection we monitor the convergence behavior, the average iteration time and the total time-to-solution time. Analyzing the r...
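
    The idea can be sketched in a few lines (a Python stand-in for the CUDA/CPU codes; matrix, size and iteration counts are illustrative assumptions). Synchronous Jacobi updates every component from the previous iterate, whereas an asynchronous sweep lets each update see whichever component values have already been written, emulated here by in-place updates in random order:

```python
# Synchronous Jacobi vs. an emulated asynchronous relaxation sweep.
import numpy as np

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n)) + n * np.eye(n)   # strongly diagonally dominant
b = rng.standard_normal(n)
D = np.diag(A)

def jacobi_sync(iters=50):
    x = np.zeros(n)
    for _ in range(iters):
        x = (b - A @ x + D * x) / D               # all updates use old values
    return x

def relax_async(iters=50):
    x = np.zeros(n)
    for _ in range(iters):
        for i in rng.permutation(n):              # updates visible immediately
            x[i] = (b[i] - A[i] @ x + D[i] * x[i]) / D[i]
    return x

exact = np.linalg.solve(A, b)
print("sync error: ", np.linalg.norm(jacobi_sync() - exact))
print("async error:", np.linalg.norm(relax_async() - exact))
```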

  3. Toward a model framework of generalized parallel componential processing of multi-symbol numbers.

    Science.gov (United States)

    Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph

    2015-05-01

    In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining and investigating a sign-decade compatibility effect for the comparison of positive and negative numbers, which extends the unit-decade compatibility effect in 2-digit number processing. Then, we evaluated whether the model is capable of accounting for previous findings in negative number processing. In a magnitude comparison task, in which participants had to single out the larger of 2 integers, we observed a reliable sign-decade compatibility effect with prolonged reaction times for incompatible (e.g., -97 vs. +53; in which the number with the larger decade digit has the smaller, i.e., negative polarity sign) as compared with sign-decade compatible number pairs (e.g., -53 vs. +97). Moreover, an analysis of participants' eye fixation behavior corroborated our model of parallel componential processing of multi-symbol numbers. These results are discussed in light of concurrent theoretical notions about negative number processing. On the basis of the present results, we propose a generalized integrated model framework of parallel componential multi-symbol processing. (c) 2015 APA, all rights reserved.

  4. Regional LLRW processing alternatives applying the DOE REGINALT systems analysis model

    International Nuclear Information System (INIS)

    Beers, G.H.

    1987-01-01

    The DOE Low-Level Waste Management Program has developed a computer-based decision support system of models that may be used by nonprogrammers to evaluate a comprehensive approach to commercial low-level radioactive waste (LLRW) management. The implementation of REGINALT (Regional Waste Management Alternatives Analysis Model) will be described as the model is applied to a hypothetical regional compact for the purpose of examining the technical and economic potential of two waste processing alternatives. Using waste from a typical regional compact, two specific regional waste processing centers are compared for feasibility. Example 1 assumes that a regional supercompaction facility is being developed for the region. Example 2 assumes that a regional facility with both supercompaction and incineration is specified. Both examples include identical disposal facilities, except that capacity may differ due to variation in the volume reduction achieved. The two examples are compared with regard to volume reduction achieved, estimated occupational exposure for the processing facilities, and life cycle costs per unit of waste generated. A base case also illustrates current disposal practices. The results of the comparisons are evaluated, and other steps, if necessary, for additional decision support are identified.

  5. Genome-Wide Mapping of Transcriptional Regulation and Metabolism Describes Information-Processing Units in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Daniela Ledezma-Tejeida

    2017-08-01

    Full Text Available In the face of changes in their environment, bacteria adjust gene expression levels and produce appropriate responses. The individual layers of this process have been widely studied: the transcriptional regulatory network describes the regulatory interactions that produce changes in the metabolic network, both of which are coordinated by the signaling network, but the interplay between them has never been described in a systematic fashion. Here, we formalize the process of detection and processing of environmental information mediated by individual transcription factors (TFs), utilizing a concept termed genetic sensory response units (GENSOR units), which are composed of four components: (1) a signal, (2) signal transduction, (3) a genetic switch, and (4) a response. We used experimentally validated data sets from two databases to assemble a GENSOR unit for each of the 189 local TFs of Escherichia coli K-12 contained in the RegulonDB database. Further analysis suggested that feedback is a common occurrence in signal processing, and there is a gradient of functional complexity in the response mediated by each TF, as opposed to a one regulator/one pathway rule. Finally, we provide examples of other GENSOR unit applications, such as hypothesis generation, detailed description of cellular decision making, and elucidation of indirect regulatory mechanisms.

  6. Process control and product evaluation in micro molding using a screwless/two-plunger injection unit

    DEFF Research Database (Denmark)

    Tosello, Guido; Hansen, Hans Nørgaard; Dormann, B.

    2010-01-01

    A newly developed μ-injection molding machine equipped with a screwless/two-plunger injection unit has been employed to mould miniaturized dog-bone shaped specimens on polyoxymethylene and its process capability and robustness have been analyzed. The influence of process parameters on μ-injection molding was investigated using the Design of Experiments technique. Injection pressure and piston stroke speed as well as part weight and dimensions were considered as quality factors over a wide range of process parameters. Experimental results obtained under different processing conditions were...

  7. Optimized Laplacian image sharpening algorithm based on graphic processing unit

    Science.gov (United States)

    Ma, Tinghuai; Li, Lu; Ji, Sai; Wang, Xin; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah

    2014-12-01

    In classical Laplacian image sharpening, all pixels are processed one by one, which leads to a large amount of computation. Traditional Laplacian sharpening performed on a CPU is considerably time-consuming, especially for large pictures. In this paper, we propose a parallel implementation of Laplacian sharpening based on the Compute Unified Device Architecture (CUDA), a computing platform for Graphic Processing Units (GPUs), and analyze the impact of picture size on performance as well as the relationship between data transfer time and parallel computing time. Further, according to the features of the different memory types, an improved scheme of our method is developed which exploits shared memory on the GPU instead of global memory, further increasing efficiency. Experimental results prove that the two novel algorithms outperform the traditional sequential method based on OpenCV in terms of computing speed.
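
    As a reference for what the CUDA kernels compute, a minimal NumPy/SciPy version of the sharpening step is sketched below (the paper's contribution is the GPU parallelization and memory tuning, not the filter; the 4-neighbour kernel and unit weight are conventional choices rather than values from the paper). Since each output pixel depends only on a fixed neighbourhood, the computation maps naturally onto one GPU thread per pixel:

```python
# Reference (CPU) Laplacian sharpening: out = img - weight * Laplacian(img).
import numpy as np
from scipy.ndimage import convolve

def laplacian_sharpen(img: np.ndarray, weight: float = 1.0) -> np.ndarray:
    kernel = np.array([[0,  1, 0],
                       [1, -4, 1],
                       [0,  1, 0]], dtype=np.float64)   # 4-neighbour Laplacian
    lap = convolve(img.astype(np.float64), kernel, mode="nearest")
    return np.clip(img - weight * lap, 0, 255).astype(np.uint8)
```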

  8. Process Simulation for the Design and Scale Up of Heterogeneous Catalytic Process: Kinetic Modelling Issues

    Directory of Open Access Journals (Sweden)

    Antonio Tripodi

    2017-05-01

    Full Text Available Process simulation represents an important tool for plant design and optimization, applied either to well-established or to newly developed processes. Suitable thermodynamic packages should be selected in order to properly describe the behavior of reactors and unit operations and to precisely define phase equilibria. Moreover, a detailed and representative kinetic scheme should be available to predict correctly the dependence of the process on its main variables. This review points out some models and methods for kinetic analysis specifically applied to the simulation of catalytic processes, as a basis for process design and optimization. Attention is paid also to microkinetic modelling and to methods based on first principles, to elucidate mechanisms and independently calculate thermodynamic and kinetic parameters. Different case studies support the discussion. First, we have selected two basic examples from industrial chemistry practice, e.g., ammonia and methanol synthesis, which may be described through a relatively simple reaction pathway and the available kinetic scheme. Then, a more complex reaction network is discussed in depth to define the conversion of bioethanol into syngas/hydrogen or into building blocks, such as ethylene. In this case, lumped kinetic schemes completely fail to describe the process behavior, and more detailed, e.g., microkinetic, schemes should be available for implementation in the simulator. However, the correct definition of all the kinetic data when complex microkinetic mechanisms are used often leads to unreliable, highly correlated parameters. In such cases, greater effort to independently estimate some relevant kinetic/thermodynamic data through Density Functional Theory (DFT)/ab initio methods may be helpful to improve the process description.

  9. The modelling of dynamic chemical state of paper machine unit operations; Dynaamisen kemiallisen tilan mallintaminen paperikoneen yksikkoeoperaatioissa - MPKT 04

    Energy Technology Data Exchange (ETDEWEB)

    Ylen, J.P.; Jutila, P. [Helsinki Univ. of Technology, Otaniemi (Finland)

    1998-12-31

    The chemical state of the paper mass is considered to be a key factor in the smooth operation of a paper machine. There are simulators that have been developed either for dynamic energy and mass balances or for static chemical phenomena, but the combination of these is not a straightforward task. The Control Engineering Laboratory of Helsinki University of Technology has studied the paper machine wet end phenomena with an emphasis on pH modelling. VTT (Technical Research Centre of Finland) Process Physics has used thermodynamic modelling successfully in, e.g., bleaching processes. In this research the different approaches are combined in order to obtain reliable dynamic models and modelling procedures for various unit operations. A flexible pilot process will be constructed and different materials will be processed, starting from simple inorganic substances (e.g. calcium carbonate and distilled water) and working towards more complex masses (thick pulp with process waters and various reagents). The pilot process is well instrumented with ion-selective electrodes, a total calcium analyser and all basic measurements. (orig.)

  10. The modelling of dynamic chemical state of paper machine unit operations; Dynaamisen kemiallisen tilan mallintaminen paperikoneen yksikkoeoperaatioissa - MPKT 04

    Energy Technology Data Exchange (ETDEWEB)

    Ylen, J P; Jutila, P [Helsinki Univ. of Technology, Otaniemi (Finland)

    1999-12-31

    The chemical state of the paper mass is considered to be a key factor in the smooth operation of a paper machine. There are simulators that have been developed either for dynamic energy and mass balances or for static chemical phenomena, but the combination of these is not a straightforward task. The Control Engineering Laboratory of Helsinki University of Technology has studied the paper machine wet end phenomena with an emphasis on pH modelling. VTT (Technical Research Centre of Finland) Process Physics has used thermodynamic modelling successfully in, e.g., bleaching processes. In this research the different approaches are combined in order to obtain reliable dynamic models and modelling procedures for various unit operations. A flexible pilot process will be constructed and different materials will be processed, starting from simple inorganic substances (e.g. calcium carbonate and distilled water) and working towards more complex masses (thick pulp with process waters and various reagents). The pilot process is well instrumented with ion-selective electrodes, a total calcium analyser and all basic measurements. (orig.)

  11. MODELING AND SIMULATION OF A HYDROCRACKING UNIT

    Directory of Open Access Journals (Sweden)

    HASSAN A. FARAG

    2016-06-01

    Full Text Available Hydrocracking is used in the petroleum industry to convert low-quality feedstocks into highly valued transportation fuels such as gasoline, diesel, and jet fuel. The aim of the present work is to develop a rigorous steady-state two-dimensional mathematical model, comprising conservation equations for mass and energy, to simulate the operation of a hydrocracking unit. Both the catalyst bed and the quench zone have been included in this integrated model. The model equations were solved numerically in both the axial and radial directions using Matlab software. The model was tested against real plant data from Egypt. The results indicated very good agreement between model predictions and industrial values for the temperature profiles, concentration profiles, and conversion in both the radial and axial directions of the hydrocracking unit. Simulated conversion and temperature profiles in the quench zone were also included and showed low deviation from the actual ones. For the concentration profiles, the percentage deviation was found to be 9.28% in the first reactor and 9.6% in the second reactor. The effect of several parameters, such as the pellet heat transfer coefficient, effective radial thermal conductivity, wall heat transfer coefficient, effective radial diffusivity, and cooling medium (quench zone), has been included in this study. Variation of the wall heat transfer coefficient and of the effective radial diffusivity in the near-wall region gave no remarkable changes in the temperature profiles. On the other hand, even small variations of the effective radial thermal conductivity affected the simulated temperature profiles significantly, and this effect could not be compensated by variations of the other parameters of the model.

  12. Improved model management with aggregated business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Mans, R.S.; Toorn, van der R.A.

    2009-01-01

    Contemporary organizations invest much efforts in creating models of their business processes. This raises the issue of how to deal with large sets of process models that become available over time. This paper proposes an extension of Event-driven Process Chains, called the aggregate EPC (aEPC),

  13. Analysis of impact of general-purpose graphics processor units in supersonic flow modeling

    Science.gov (United States)

    Emelyanov, V. N.; Karpenko, A. G.; Kozelkov, A. S.; Teterina, I. V.; Volkov, K. N.; Yalozo, A. V.

    2017-06-01

    Computational methods are widely used in the prediction of complex flowfields associated with off-normal situations in aerospace engineering. Modern graphics processing units (GPUs) provide architectures and new programming models that make it possible to harness their large processing power and to design computational fluid dynamics (CFD) simulations at both high performance and low cost. Possibilities of using GPUs for the simulation of external and internal flows on unstructured meshes are discussed. The finite volume method is applied to solve the three-dimensional unsteady compressible Euler and Navier-Stokes equations on unstructured meshes with high-resolution numerical schemes. CUDA technology is used for the programming implementation of the parallel computational algorithms. Solutions of some benchmark test cases on GPUs are reported, and the computed results are compared with experimental and computational data. Approaches to optimization of the CFD code related to the use of different types of memory are considered. The speedup of the solution on GPUs with respect to the solution on a central processing unit (CPU) is compared. Performance measurements show that the numerical schemes developed achieve a 20-50 times speedup on GPU hardware compared to the CPU reference implementation. The results obtained provide a promising perspective for designing a GPU-based software framework for applications in CFD.
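
    To make the finite volume idea concrete, here is a toy 1-D first-order upwind step for linear advection, a deliberately minimal stand-in for the 3-D compressible solvers discussed (scheme choice and numbers are illustrative). Each cell update depends only on the fluxes through its own faces, so all cells can be updated concurrently; this per-cell independence is the parallelism a GPU kernel exploits:

```python
# Toy 1-D finite-volume update: du/dt + a du/dx = 0, first-order upwind, a > 0.
import numpy as np

def upwind_step(u, a, dt, dx):
    flux = a * u                                  # physical flux F(u) = a*u
    # face flux taken from the upwind (left) cell; periodic boundaries
    return u - dt / dx * (flux - np.roll(flux, 1))

nx = 200
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.exp(-((x - 0.3) ** 2) / 0.005)             # initial Gaussian profile
for _ in range(100):                              # CFL = a*dt/dx = 0.4
    u = upwind_step(u, a=1.0, dt=0.002, dx=1.0 / nx)
```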

  14. Modeling and design of a combined transverse and axial flow threshing unit for rice harvesters

    Directory of Open Access Journals (Sweden)

    Zhong Tang

    2014-11-01

    Full Text Available The thorough investigation of both the grain threshing and grain separating processes is a crucial consideration for effective structural design and variable optimization of the tangential flow threshing cylinder and longitudinal axial flow threshing cylinder composite unit (TLFC unit) of small and medium-sized (SME) combine harvesters. The objective of this paper was to obtain the structural variables of a TLFC unit by theoretical modeling and experimentation on a tangential flow threshing cylinder unit (TFC unit) and a longitudinal axial flow threshing cylinder unit (LFC unit). Threshing and separation equations for five types of threshing teeth (knife bar, trapezoidal tooth, spike tooth, rasp bar, and rectangular bar) were obtained using probability theory. Results demonstrate that the threshing and separation capacity of the knife bar TFC unit was stronger than that of the other threshing teeth. The length of the LFC unit was divided into four sections, with helical blades on the first section (0-0.17 m), spike teeth on the second section (0.17-1.48 m), trapezoidal teeth on the third section (1.48-2.91 m), and the discharge plate on the fourth section (2.91-3.35 m). Test results showed an un-threshed grain rate of 0.243%, an un-separated grain rate of 0.346%, and a broken grain rate of 0.184%. As evidenced by these results, threshing and separation performance is significantly improved by analyzing and optimizing the structure and variables of a TLFC unit. The results of this research can be used to successfully design the TLFC unit of small and medium-sized (SME) combine harvesters.
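
    As a hedged illustration of what probability-based threshing equations can look like (the paper's actual equations differ per tooth type, and the rate constants below are invented for illustration): if each metre of threshing length detaches a given grain with a constant probability rate k, the un-threshed fraction decays exponentially with length:

```python
# Exponential "survival" model of un-threshed grain along the cylinder length.
import numpy as np

def unthreshed_fraction(length_m: float, k_per_m: float) -> float:
    """Fraction of grain still attached after a given threshing length."""
    return float(np.exp(-k_per_m * length_m))

L = 3.35   # m, total LFC length from the abstract
for tooth, k in [("knife bar", 2.4), ("spike tooth", 1.8), ("rasp bar", 1.5)]:
    print(f"{tooth:11s}: {unthreshed_fraction(L, k):.4%} un-threshed")
```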

  15. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling - using the Business Process Modelling Notation (BPMN) standard is described. Main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in health care processes, the need of interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  16. Monitoring and modelling of a continuous from-powder-to-tablet process line

    DEFF Research Database (Denmark)

    Mortier, Séverine T.F.C.; Nopens, Ingmar; De Beer, Thomas

    2014-01-01

    The intention to shift from batch to continuous production processes within the pharmaceutical industry enhances the need to monitor and control the process in-line and real-time to continuously guarantee the end-product quality. Mass and energy balances have been successfully applied to a drying process which is part of a continuous from-powder-to-tablet manufacturing line to calculate the residual moisture content of granules leaving the drying unit on the basis of continuously generated data from univariate sensors. Next to monitoring, the application of continuous processes demands also real-time adjustment of critical input variables to ensure that the process stays within the Design Space. Mechanistic models are very useful for this purpose as, once validated, several tools can be applied to gain further process knowledge, for example uncertainty and sensitivity analysis. In addition, several...
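
    A minimal sketch of the water balance behind such a soft sensor (variable names and values are illustrative assumptions, not the published model): the residual moisture of the granules leaving the dryer follows from the wet feed rate, its inlet moisture, and the evaporation rate inferred from gas-side measurements:

```python
# Dryer water balance: moisture out = (water in - water evaporated) / wet mass out.
def residual_moisture(m_wet_in: float, x_in: float, m_evap: float) -> float:
    """Wet-basis moisture fraction of granules leaving the dryer.

    m_wet_in : wet granule mass flow entering (kg/h)
    x_in     : inlet moisture fraction (kg water / kg wet granules)
    m_evap   : evaporation rate from the gas-side energy balance (kg/h)
    """
    water_out = m_wet_in * x_in - m_evap
    solids = m_wet_in * (1.0 - x_in)
    return water_out / (solids + water_out)

print(residual_moisture(m_wet_in=25.0, x_in=0.12, m_evap=2.6))  # ~0.018
```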

  17. A reliability model of a warm standby configuration with two identical sets of units

    International Nuclear Information System (INIS)

    Huang, Wei; Loman, James; Song, Thomas

    2015-01-01

    This article presents a new reliability model, and the development of its analytical solution, for a warm standby redundant configuration with units that are originally operated in active mode and then, upon turn-on of the originally standby units, are put into warm standby mode. These units can be used later if a standby-turned-active unit fails. Numerical results for an example configuration are presented and discussed, with comparison to other warm standby configurations and to Monte Carlo simulation results obtained from the BlockSim software. Results show that the Monte Carlo simulation model gives a virtually identical reliability value when the simulation uses a high number of replications, confirming the developed model. - Highlights: • A new reliability model is developed for a warm standby redundancy with two sets of identical units. • The units are subject to state changes from active to standby and then back to active mode. • A closed-form analytical solution is developed for the exponential distribution. • To validate the developed model, a Monte Carlo simulation of an exemplary configuration is performed.
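
    The configuration in the article is more elaborate, but its validation logic, deriving a closed form for exponential units and confirming it with Monte Carlo simulation, can be shown on the simplest warm standby pair: one active unit with failure rate lam, one standby unit with reduced rate lam_s until it takes over, and perfect switching (all of which are assumptions of this sketch, not the paper's model):

```python
# Closed-form reliability of a warm standby pair vs. Monte Carlo confirmation.
import numpy as np

lam, lam_s, t = 1.0e-3, 2.0e-4, 1000.0

# R(t) = P(primary survives) + P(primary fails at u < t, warm standby still
# alive at u, then survives in active mode from u to t); with exponentials
# the integral is elementary:
R_exact = np.exp(-lam * t) * (1.0 + lam / lam_s * (1.0 - np.exp(-lam_s * t)))

rng = np.random.default_rng(42)
n = 1_000_000
t1 = rng.exponential(1.0 / lam, n)       # primary failure time
ts = rng.exponential(1.0 / lam_s, n)     # standby failure time while warm
t2 = rng.exponential(1.0 / lam, n)       # standby life once activated
ok = (t1 >= t) | ((ts >= t1) & (t1 + t2 >= t))   # memorylessness justifies t2
print(f"closed form: {R_exact:.5f}   Monte Carlo: {ok.mean():.5f}")
```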

  18. Integrating textual and model-based process descriptions for comprehensive process search

    NARCIS (Netherlands)

    Leopold, Henrik; van der Aa, Han; Pittke, Fabian; Raffel, Manuel; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Documenting business processes using process models is common practice in many organizations. However, not all process information is best captured in process models. Hence, many organizations complement these models with textual descriptions that specify additional details. The problem with this

  19. On process model representation and AlF3 dynamics of aluminium electrolysis cells

    Energy Technology Data Exchange (ETDEWEB)

    Drengstig, Tormod

    1997-12-31

    This thesis develops a formal, graphically based process representation scheme for modelling complex, non-standard unit processes. The scheme is based on topological and phenomenological decompositions. The topological decomposition is the modularization of processes into modules representing volumes and boundaries, whereas the phenomenological decomposition focuses on physical phenomena and characteristics inside these topological modules. This defines legal and illegal connections between components at all levels and facilitates a full implementation of the methodology into a computer-aided modelling tool that can interpret graphical symbols and guide modelers towards a consistent mathematical model of the process. The thesis also presents new results on the excess AlF3 and bath temperature dynamics of an aluminium electrolysis cell. A dynamic model of such a cell is developed and validated against known behaviour and real process data. There are dynamics that the model does not capture, and this is further discussed. It is hypothesized that long-term prediction of bath temperature and excess AlF3 is impossible with a current efficiency model considering only bath composition and temperature. A control strategy for excess AlF3 and bath temperature is proposed, based on an almost constant AlF3 input close to the average consumption and on energy manipulations to compensate for the disturbances. 96 refs., 135 figs., 22 tabs.

  20. Modelling multi-phase liquid-sediment scour and resuspension induced by rapid flows using Smoothed Particle Hydrodynamics (SPH) accelerated with a Graphics Processing Unit (GPU)

    Science.gov (United States)

    Fourtakas, G.; Rogers, B. D.

    2016-06-01

    A two-phase numerical model using Smoothed Particle Hydrodynamics (SPH) is applied to two-phase liquid-sediment flows. The absence of a mesh in SPH is ideal for interfacial and highly non-linear flows with changing fragmentation of the interface, mixing and resuspension. The rheology of sediment under rapid flows passes through several states which are only partially described by previous research in SPH. This paper attempts to bridge the gap between geotechnics, non-Newtonian and Newtonian flows by proposing a model that combines the yielding, shear and suspension layers, which are needed to predict the global erosion phenomena accurately from a hydrodynamics perspective. The numerical SPH scheme is based on the explicit treatment of both phases using Newtonian and non-Newtonian Bingham-type Herschel-Bulkley-Papanastasiou constitutive models. This is supplemented by the Drucker-Prager yield criterion to predict the onset of yielding of the sediment surface and by a concentration suspension model. The multi-phase model has been compared with experimental and 2-D reference numerical models for scour following a dry-bed dam break, yielding satisfactory results and improvements over well-known SPH multi-phase models. With 3-D simulations requiring a large number of particles, the code is accelerated with a graphics processing unit (GPU) in the open-source DualSPHysics code. The implementation and optimisation of the code achieved a speed-up of 58 times over an optimised single-thread serial code. A 3-D dam break over a non-cohesive erodible bed simulation with over 4 million particles yields close agreement with experimental scour and water surface profiles.
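
    For orientation, the Herschel-Bulkley-Papanastasiou closure named in the abstract can be written as an effective viscosity; a small sketch follows (parameter values are illustrative, not those calibrated in the paper). The exponential term regularizes the yield stress so the constitutive law stays smooth as the shear rate vanishes:

```python
# HBP effective viscosity: mu(gd) = K*gd^(n-1) + tau_y*(1 - exp(-m*gd))/gd.
import numpy as np

def hbp_viscosity(gamma_dot, K=1.0, n=0.8, tau_y=10.0, m=100.0):
    gd = np.maximum(gamma_dot, 1e-12)     # guard against division by zero
    return K * gd ** (n - 1.0) + tau_y * (1.0 - np.exp(-m * gd)) / gd

print(hbp_viscosity(np.array([1e-4, 1e-2, 1.0, 100.0])))
```

    As the shear rate tends to zero the yield term tends to tau_y * m rather than diverging, which is what keeps the un-yielded region numerically tractable in an SPH code.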

  1. Computing the Density Matrix in Electronic Structure Theory on Graphics Processing Units.

    Science.gov (United States)

    Cawkwell, M J; Sanville, E J; Mniszewski, S M; Niklasson, Anders M N

    2012-11-13

    The self-consistent solution of a Schrödinger-like equation for the density matrix is a critical and computationally demanding step in quantum-based models of interatomic bonding. This step was tackled historically via diagonalization of the Hamiltonian. We have investigated the performance and accuracy of the second-order spectral projection (SP2) algorithm for the computation of the density matrix via a recursive expansion of the Fermi operator in a series of generalized matrix-matrix multiplications. We demonstrate that, owing to its simplicity, the SP2 algorithm [Niklasson, A. M. N. Phys. Rev. B 2002, 66, 155115] is exceptionally well suited to implementation on graphics processing units (GPUs). The performances in double and single precision arithmetic of a hybrid GPU/central processing unit (CPU) and a full GPU implementation of the SP2 algorithm exceed those of a CPU-only implementation of the SP2 algorithm and of traditional matrix diagonalization when the dimensions of the matrices exceed about 2000 × 2000. Padding schemes for arrays allocated in the GPU memory that optimize the performance of the CUBLAS implementations of the level 3 BLAS DGEMM and SGEMM subroutines for generalized matrix-matrix multiplications are described in detail. The analysis of the relative performance of the hybrid CPU/GPU and full GPU implementations indicates that the transfer of arrays between the GPU and CPU constitutes only a small fraction of the total computation time. The errors measured in the self-consistent density matrices computed using the SP2 algorithm are generally smaller than those measured in matrices computed via diagonalization. Furthermore, the errors in the density matrices computed using the SP2 algorithm do not exhibit any dependence on system size, whereas the errors increase linearly with the number of orbitals when diagonalization is employed.
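
    The recursion itself is short enough to sketch. The NumPy fragment below follows the published SP2 idea, mapping the Hamiltonian spectrum into [0, 1] and then repeatedly replacing X by X^2 or 2X - X^2, whichever drives the trace toward the number of occupied orbitals; the exact spectral bounds and the simple convergence test are simplifications of this sketch rather than details of the paper's implementation:

```python
# SP2 purification: the density matrix as a limit of trace-corrected squarings.
import numpy as np

def sp2_density_matrix(H, n_occ, tol=1e-10, max_iter=100):
    eigs = np.linalg.eigvalsh(H)          # bounds (Gershgorin estimates also work)
    e_min, e_max = eigs[0], eigs[-1]
    X = (e_max * np.eye(H.shape[0]) - H) / (e_max - e_min)
    for _ in range(max_iter):
        X2 = X @ X                        # the dominating matrix-matrix multiply
        tr, tr2 = np.trace(X), np.trace(X2)
        if abs(tr2 - tr) < tol:           # idempotent => converged projector
            break
        if abs(tr2 - n_occ) < abs(2 * tr - tr2 - n_occ):
            X = X2                        # pushes eigenvalues toward 0
        else:
            X = 2 * X - X2                # pushes eigenvalues toward 1
    return X

rng = np.random.default_rng(0)
H = rng.standard_normal((50, 50)); H = (H + H.T) / 2
D = sp2_density_matrix(H, n_occ=20)
print(np.trace(D), np.linalg.norm(D @ D - D))   # ~20 and ~0
```

    Because every iteration is a single large GEMM, the algorithm inherits the near-peak throughput of DGEMM/SGEMM on the GPU, which is the property the paper exploits.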

  2. Evolution of the Power Processing Units Architecture for Electric Propulsion at CRISA

    Science.gov (United States)

    Palencia, J.; de la Cruz, F.; Wallace, N.

    2008-09-01

    Since 2002, the team formed by EADS Astrium CRISA, Astrium GmbH Friedrichshafen, and QinetiQ has participated in several flight programs where Electric Propulsion based on Kaufman-type ion thrusters is the baseline concept. In 2002, CRISA won the contract for the development of the Ion Propulsion Control Unit (IPCU) for GOCE. This unit, together with the T5 thruster by QinetiQ, provides near-perfect atmospheric drag compensation, offering thrust levels in the range of 1 to 20 mN. By the end of 2003, CRISA started the adaptation of the IPCU concept to the QinetiQ T6 ion thruster for the Alphabus program. This paper shows how the Power Processing Unit design evolved over time, including the current developments.

  3. Process modeling for Humanities: tracing and analyzing scientific processes

    OpenAIRE

    Hug , Charlotte; Salinesi , Camille; Deneckere , Rebecca; Lamasse , Stéphane

    2011-01-01

    This paper concerns epistemology and the understanding of research processes in Humanities, such as Archaeology. We believe that to properly understand research processes, it is essential to trace them. The collected traces depend on the process model established, which has to be as accurate as possible to exhaustively record the traces. In this paper, we briefly explain why the existing process models for Humanities are not sufficient to represent traces. We then pres...

  4. Design process and instrumentation of a low NOx wire-mesh duct burner for micro-cogeneration unit

    Energy Technology Data Exchange (ETDEWEB)

    Ramadan, O.B.; Gauthier, J.E.D. [Carleton Univ., Ottawa, ON (Canada). Dept. of Mechanical and Aerospace Engineering; Hughes, P.M.; Brandon, R. [Natural Resources Canada, Ottawa, ON (Canada). CANMET Energy Technology Centre

    2007-07-01

    Air pollution and global climate change have become serious environmental problems leading to increasingly stringent government regulations worldwide. New designs and methods for improving combustion systems to minimize the production of toxic emissions, such as nitrogen oxides (NOx), are therefore needed. In order to control smog, acid rain, ozone depletion, and greenhouse warming, a reduction of nitrogen oxides is necessary. One alternative for combined electrical power and heat generation (CHP) is the micro-cogeneration unit, which uses a micro-turbine as a prime mover. However, to increase the efficiency of these units, micro-cogeneration technology still needs to be developed further. This paper describes the design process, construction, and testing of a new low-NOx wire-mesh duct burner (WMDB) for the development of a more efficient micro-cogeneration unit. The primary goal of the study was to develop a practical and simple WMDB which produces low emissions by using the lean-premixed surface combustion concept, and its objectives were separated into the four phases described in this paper. Phase I involved the design and construction of the burner. Phase II involved a qualitative flow visualization study of the duct burner premixer to assist the new design of the burner by introducing an efficient premixer that could be used in this new application. Phase III of this research program involved non-reacting flow modeling of the burner premixer flow field using a commercial computational fluid dynamics model. In Phase IV, the reacting flow experimental investigation was performed. It was concluded that the burner successfully increased the quantity and quality of the heat released from the micro-CHP unit, and carbon monoxide emissions of less than 9 ppm were achieved. 3 refs., 3 figs.

  5. Controllable unit concept as applied to a hypothetical tritium process

    International Nuclear Information System (INIS)

    Seabaugh, P.W.; Sellers, D.E.; Woltermann, H.A.; Boh, D.R.; Miles, J.C.; Fushimi, F.C.

    1976-01-01

    A methodology (controllable unit accountability) is described that identifies controlling errors for corrective action, locates areas and time frames of suspected diversions, defines time and sensitivity limits of diversion flags, defines the time frame in which pass-through quantities of accountable material (and, by inference, SNM) remain controllable, and provides a basis for identification of the incremental cost associated with purely safeguards considerations. The concept provides a rationale by which measurement variability and specific safeguards criteria can be converted into a numerical value that represents the degree of control or improvement attainable with a specific measurement system or combination of systems. Currently the methodology is being applied to a high-throughput, mixed-oxide fuel fabrication process. The process described is merely used to illustrate a procedure that can be applied to other, more pertinent processes.

  6. Status Report from the United Kingdom [Processing of Low-Grade Uranium Ores

    Energy Technology Data Exchange (ETDEWEB)

    North, A A [Warren Spring Laboratory, Stevenage, Herts. (United Kingdom)

    1967-06-15

    The invitation to present this status report could have been taken literally as a request for information on experience gained in the actual processing of low-grade uranium ores in the United Kingdom, in which case there would have been very little to report; however, the invitation naturally was considered to be a request for a report on the experience gained by the United Kingdom in the processing of uranium ores. Low-grade uranium ores are not treated in the United Kingdom simply because the country does not possess any known significant deposits of uranium ore. It is of interest to record the fact that during the nineteenth century mesothermal vein deposits associated with Hercynian granite were worked at South Terras, Cornwall, and ore that contained approximately 100 tons of uranium oxide was exported to Germany. Now only some 20 tons of contained uranium oxide remain at South Terras; also in Cornwall there is a small number of other vein deposits that each hold about five tons of uranium. Small lodes of uranium ore have been located in the southern uplands of Scotland; in North Wales lower palaeozoic black shales have only as much as 50 to 80 parts per million of uranium oxide, and a slightly lower-grade carbonaceous shale is found near the base of the millstone grit that occurs in the north of England. Thus the experience gained by the United Kingdom has been in the treatment of uranium ores that occur abroad.

  7. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  8. Modeling nuclear processes by Simulink

    International Nuclear Information System (INIS)

    Rashid, Nahrul Khair Alang Md

    2015-01-01

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
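
    One of the example systems mentioned, the radionuclide decay process, translates directly into the ODE set a Simulink block diagram would integrate; an equivalent Python sketch is given below (the two half-lives are illustrative, not values from the paper):

```python
# Two-member decay chain: dN1/dt = -l1*N1, dN2/dt = l1*N1 - l2*N2.
import numpy as np

l1 = np.log(2) / 8.0      # parent decay constant, half-life 8 days (assumed)
l2 = np.log(2) / 2.0      # daughter decay constant, half-life 2 days (assumed)
N1_0 = 1.0e6              # initial parent population; daughter starts at zero

t = np.linspace(0.0, 40.0, 401)
N1 = N1_0 * np.exp(-l1 * t)
N2 = N1_0 * l1 / (l2 - l1) * (np.exp(-l1 * t) - np.exp(-l2 * t))  # Bateman
```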

  9. IRI-2012 MODEL ADAPTABILITY ESTIMATION FOR AUTOMATED PROCESSING OF VERTICAL SOUNDING IONOGRAMS

    Directory of Open Access Journals (Sweden)

    V. D. Nikolaeva

    2014-01-01

    Full Text Available The paper deals with the possibility of applying the IRI-2012 global empirical model to the semi-automatic processing of vertical ionospheric sounding data. The main ionospheric characteristics derived from vertical sounding data at the IZMIRAN Voeikovo station in February 2013 were compared with IRI-2012 model calculation results. 2688 model values and 1866 real values of f0F2, f0E, hmF2 and hmE were processed. The critical frequencies (f0E, f0F2) and peak altitudes (hmE, hmF2) of the E and F2 layers were determined from the ionograms. Vertical profiles of the electron concentration were reconstructed with the IRI-2012 model from the measured frequencies and heights. The model calculation was also made without the inclusion of the real vertical sounding data. Monthly averages and standard deviations (σ) of the parameters f0F2, f0E, hmF2 and hmE for each hour of the day were calculated from both the vertical sounding and the model values. Conditions for the model's applicability to automated processing of ionograms for the subauroral ionosphere were determined. The initial IRI-2012 model can be applied to subauroral ionogram processing in the daytime under undisturbed conditions in the absence of sporadic ionization. In this case model calculations can be adjusted with near-real-time vertical sounding data. IRI-2012 model values for f0E (in the daytime) and hmF2 can be applied to reduce computational costs in systems for automatic parameter search and for preliminary determination of the search range for the main parameters. The IRI-2012 model can be used for a more accurate approximation of real data series in the absence of real values. When sporadic ionization is present, high-latitude ionosphere models that include a corpuscular ionization unit must be applied.

  10. Mathematical modelling of thermal and flow processes in vertical ground heat exchangers

    Directory of Open Access Journals (Sweden)

    Pater Sebastian

    2017-12-01

    Full Text Available The main task of mathematical modelling of thermal and flow processes in a vertical ground heat exchanger (BHE, Borehole Heat Exchanger) is to determine the heat flux per unit borehole depth obtainable or transferable during the operation of the installation. This task is indirectly associated with finding the temperature of the circulating fluid flowing out of the U-tube at a given fluid inlet temperature, with respect to the other operational parameters of the installation.
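
    The quantity in question reduces to an energy balance on the circulating fluid; a minimal sketch with illustrative numbers (mass flow, heat capacity, temperatures and depth are all assumptions) is:

```python
# Heat extracted per metre of borehole from the fluid-side energy balance.
def heat_flux_per_metre(m_dot, cp, t_out, t_in, depth):
    """W/m: m_dot (kg/s), cp (J/(kg K)), temperatures (K or degC), depth (m)."""
    return m_dot * cp * (t_out - t_in) / depth

print(heat_flux_per_metre(m_dot=0.5, cp=3800.0, t_out=7.0, t_in=4.0, depth=120.0))
# ~47.5 W/m, a plausible order of magnitude for a BHE
```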

  11. ENTREPRENEURIAL OPPORTUNITIES IN FOOD PROCESSING UNITS (WITH SPECIAL REFERENCES TO BYADGI RED CHILLI COLD STORAGE UNITS IN THE KARNATAKA STATE)

    Directory of Open Access Journals (Sweden)

    P. ISHWARA

    2010-01-01

    Full Text Available After the green revolution, we are now ushering in the evergreen revolution in the country; food processing is an evergreen activity and the key to the agricultural sector. In this paper an attempt has been made to study the working of food processing units, with special reference to red chilli cold storage units in the Byadgi district of Karnataka State. Byadgi has been famous for red chilli since the days of antiquity. The vast and extensive market yard in Byadagi taluk is famous as the second largest red chilli market in the country. However, the most common and recurring problem faced by the farmer is the inability to store enough red chilli from one harvest to the next. Red chilli that was locally abundant for only a short period of time had to be stored against times of scarcity. In recent years, owing to oleoresin, demand for red chilli has grown in other countries such as Sri Lanka, Bangladesh, America, Europe, Nepal, Indonesia and Mexico. The study reveals that all the cold storage units of the study area use the vapour-compression refrigeration method. All entrepreneurs are satisfied with their turnover and profit and are in a good economic position. Even though average turnover and profits have increased, a few units have shown a negligible decrease in turnover and profit, due to competition from an increasing number of cold storages and from earlier-established units. The cold storages of the study area store red chilli, chilli seeds, chilli powder, tamarind, jeera, dania, turmeric, sunflower, zinger, channa, flower seeds, etc., but about 80 per cent of each cold storage is filled with red chilli, owing to the existence of the vast and extensive red chilli market yard in Byadgi. There is no business without problems. In the same way the entrepreneurs chosen for the study face a few problems in their business like skilled labour, technical and management

  12. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard, defined as the integrated intensity process. Instead, recent literature has shown a tendency towards specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore we also analyze specifications obtained via a simple deterministic time-change of a homogeneous Levy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared...

  13. Research on the pyrolysis of hardwood in an entrained bed process development unit

    Energy Technology Data Exchange (ETDEWEB)

    Kovac, R.J.; Gorton, C.W.; Knight, J.A.; Newman, C.J.; O' Neil, D.J. (Georgia Inst. of Tech., Atlanta, GA (United States). Research Inst.)

    1991-08-01

    An atmospheric flash pyrolysis process, the Georgia Tech Entrained Flow Pyrolysis Process, for the production of liquid biofuels from oak hardwood is described. The development of the process began with bench-scale studies and a conceptual design in the 1978-1981 timeframe. Its development and successful demonstration through research on the pyrolysis of hardwood in an entrained bed process development unit (PDU), in the period 1982-1989, is presented. Oil yields (dry basis) of up to 60% were achieved in the 1.5 ton-per-day PDU, far exceeding the initial target/forecast of 40% oil yields. Experimental data, based on over forty runs under steady-state conditions and supported by material and energy balances with near-100% closures, have been used to establish a process model which indicates that oil yields well in excess of 60% (dry basis) can be achieved in a commercial reactor. Experimental results demonstrate a gross product thermal efficiency of 94% and a net product thermal efficiency of 72% or more, the highest values yet achieved with a large-scale biomass liquefaction process. A conceptual manufacturing process and an economic analysis for liquid biofuel production at 60% oil yield from a 200-TPD commercial plant are reported. The plant appears to be profitable at contemporary fuel costs of $21/barrel oil-equivalent. Total capital investment is estimated at under $2.5 million. A rate of return on investment of 39.4% and a pay-out period of 2.1 years have been estimated. The manufacturing cost of the combustible pyrolysis oil is $2.70 per gigajoule. 20 figs., 87 tabs.

  14. Caesium-137 in soils in relation to the nine unit landsurface model in a semi-arid environment in Western Australia

    International Nuclear Information System (INIS)

    Loughran, R.J.; Pilgrim, A.T.; Conacher, A.J.

    1987-01-01

    The comparison of 137Cs levels at hillslope sites with those at input sites used in this study presented data time-averaged over the period since the onset of fallout in 1954. It also took no account of temporary fluctuations in fallout and soil movement. Consequently, relationships sought between shorter-term measurements of sediment yield and 137Cs loss may not be in total agreement. Furthermore, the soil samples taken for 137Cs in each washtray catchment represented only a small fraction of the landsurface. Despite these apparent shortcomings, the 137Cs technique was able to discriminate between the more stable (unit 1) interfluve landsurfaces and units 5 (transportational midslope) and 6 (colluvial footslope) on Catena One, and to a lesser degree on Catena Two. The unit 6 sites had less 137Cs than unit 1 (input) sites, and some of the profiles were truncated. Indications were that, while there was no 137Cs evidence for net soil gain over the past 30 years, the degree of soil loss was generally lower than on unit 5 sites on the same catena. Since the model does not define most landsurface units in terms of dominance of a particular process, but rather in terms of 'what pedogeomorphic processes and their diagnostic properties distinguish any given unit from other units' (Conacher and Dalrymple, 1977: 65), it may be concluded that 137Cs levels are another of these properties.

  15. Spatial resolution recovery utilizing multi-ray tracing and graphic processing unit in PET image reconstruction

    International Nuclear Information System (INIS)

    Liang, Yicheng; Peng, Hao

    2015-01-01

    Depth-of-interaction (DOI) poses a major challenge for a PET system to achieve uniform spatial resolution across the field-of-view, particularly for small animal and organ-dedicated PET systems. In this work, we implemented an analytical method to model system matrix for resolution recovery, which was then incorporated in PET image reconstruction on a graphical processing unit platform, due to its parallel processing capacity. The method utilizes the concepts of virtual DOI layers and multi-ray tracing to calculate the coincidence detection response function for a given line-of-response. The accuracy of the proposed method was validated for a small-bore PET insert to be used for simultaneous PET/MR breast imaging. In addition, the performance comparisons were studied among the following three cases: 1) no physical DOI and no resolution modeling; 2) two physical DOI layers and no resolution modeling; and 3) no physical DOI design but with a different number of virtual DOI layers. The image quality was quantitatively evaluated in terms of spatial resolution (full-width-half-maximum and position offset), contrast recovery coefficient and noise. The results indicate that the proposed method has the potential to be used as an alternative to other physical DOI designs and achieve comparable imaging performances, while reducing detector/system design cost and complexity. (paper)

  16. Computerized nursing process in the Intensive Care Unit: ergonomics and usability

    Directory of Open Access Journals (Sweden)

    Sônia Regina Wagner de Almeida

    Full Text Available Abstract OBJECTIVE Analyzing the ergonomics and usability criteria of the Computerized Nursing Process based on the International Classification for Nursing Practice in the Intensive Care Unit according to the International Organization for Standardization (ISO). METHOD A quantitative, quasi-experimental, before-and-after study with a sample of 16 participants performed in an Intensive Care Unit. Data collection was performed through the application of five simulated clinical cases and an evaluation instrument. Data analysis was performed by descriptive and inferential statistics. RESULTS The organization, content and technical criteria were considered "excellent", and the interface criteria were considered "very good", obtaining means of 4.54, 4.60, 4.64 and 4.39, respectively. The analyzed standards obtained means above 4.0, being considered "very good" by the participants. CONCLUSION The Computerized Nursing Process met ergonomic and usability standards according to the standards set by ISO. This technology supports nurses' clinical decision-making by providing complete and up-to-date content for Nursing practice in the Intensive Care Unit.

  17. Multiphysics modelling of manufacturing processes: A review

    DEFF Research Database (Denmark)

    Jabbari, Masoud; Baran, Ismet; Mohanty, Sankhya

    2018-01-01

    Numerical modelling is increasingly supporting the analysis and optimization of manufacturing processes in the production industry. Even if mostly applied to multistep processes, single process steps may be so complex by nature that the models needed to describe them must include multiphysics. Five examples illustrate the diversity in the field of modelling of manufacturing processes as regards process, materials, generic disciplines as well as length scales: (1) modelling of tape casting for thin ceramic layers, (2) modelling the flow of polymers in extrusion, (3) modelling the deformation process of flexible stamps for nanoimprint lithography, (4) modelling manufacturing of composite parts and (5) modelling the selective laser melting process. For all five examples, the emphasis is on modelling results as well as on describing the models in brief mathematical detail, alongside relevant references to the original work.

  18. Design of the Laboratory-Scale Plutonium Oxide Processing Unit in the Radiochemical Processing Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Lumetta, Gregg J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Meier, David E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tingey, Joel M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Casella, Amanda J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Delegard, Calvin H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Edwards, Matthew K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Orton, Robert D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rapko, Brian M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smart, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report describes a design for a laboratory-scale capability to produce plutonium oxide (PuO2) for use in identifying and validating nuclear forensics signatures associated with plutonium production, as well as for use as exercise and reference materials. This capability will be located in the Radiochemical Processing Laboratory at the Pacific Northwest National Laboratory. The key unit operations are described, including PuO2 dissolution, purification of the Pu by ion exchange, precipitation, and re-conversion to PuO2 by calcination.

  19. Configurable multi-perspective business process models

    NARCIS (Netherlands)

    La Rosa, M.; Dumas, M.; Hofstede, ter A.H.M.; Mendling, J.

    2011-01-01

    A configurable process model provides a consolidated view of a family of business processes. It promotes the reuse of proven practices by providing analysts with a generic modeling artifact from which to derive individual process models. Unfortunately, the scope of existing notations for

  20. Probabilistic evaluation of process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2016-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  1. An Investigation of the Role of Grapheme Units in Word Recognition

    Science.gov (United States)

    Lupker, Stephen J.; Acha, Joana; Davis, Colin J.; Perea, Manuel

    2012-01-01

    In most current models of word recognition, the word recognition process is assumed to be driven by the activation of letter units (i.e., that letters are the perceptual units in reading). An alternative possibility is that the word recognition process is driven by the activation of grapheme units, that is, that graphemes, rather than letters, are…

  2. A simple rainfall-runoff model based on hydrological units applied to the Teba catchment (south-east Spain)

    Science.gov (United States)

    Donker, N. H. W.

    2001-01-01

    A hydrological model (YWB, yearly water balance) has been developed to model the daily rainfall-runoff relationship of the 202 km2 Teba river catchment, located in semi-arid south-eastern Spain. The period of available data (1976-1993) includes some very rainy years with intensive storms (responsible for flooding parts of the town of Malaga) and also some very dry years. The YWB model is in essence a simple tank model in which the catchment is subdivided into a limited number of meaningful hydrological units. Instead of generating, per unit, surface runoff resulting from infiltration excess, runoff has been made the result of storage excess. Actual evapotranspiration is obtained by means of curves, included in the software, representing the ratio of actual to potential evapotranspiration as a function of soil moisture content for three soil texture classes. The total runoff generated is split between base flow and surface runoff according to a given baseflow index. The two components are routed separately and subsequently joined. A large number of sequential years can be processed, and the results of each year are summarized by a water balance table and a daily based rainfall-runoff time series. An attempt has been made to restrict the amount of input data to the minimum. Interactive manual calibration is advocated in order to allow better incorporation of field evidence and the experience of the model user. Field observations allowed for an approximate calibration at the hydrological unit level.
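
    The storage-excess idea at the core of the YWB model can be sketched compactly. The following toy tank step is illustrative only: the parameter names, the linear evapotranspiration scaling (the actual model uses texture-dependent curves) and all values are assumptions, not the published model.

```python
# Minimal storage-excess "tank" sketch in the spirit of the YWB model described
# above. All parameter names and values are illustrative, not the authors' code.

def tank_step(storage, rain, pet, capacity=120.0, baseflow_index=0.4):
    """One daily step for a single hydrological unit (depths in mm).

    Runoff is generated by storage excess rather than infiltration excess:
    water only leaves as runoff once the store is full.
    """
    storage += rain
    # Actual evapotranspiration scaled linearly with relative wetness
    # (the YWB model uses texture-dependent curves instead).
    aet = pet * min(1.0, storage / capacity)
    storage = max(0.0, storage - aet)
    excess = max(0.0, storage - capacity)   # storage excess -> total runoff
    storage -= excess
    baseflow = baseflow_index * excess      # split by a given baseflow index
    surface = excess - baseflow
    return storage, surface, baseflow

storage = 40.0
for rain, pet in [(0.0, 4.0), (55.0, 2.0), (90.0, 1.5)]:
    storage, qs, qb = tank_step(storage, rain, pet)
    print(f"S={storage:6.1f} mm  surface={qs:5.1f}  base={qb:5.1f}")
```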

  3. A Modified Microfinance Model Proposed for the United States

    Directory of Open Access Journals (Sweden)

    Eldon H Bernstein

    2014-07-01

    While the goal of the traditional model in developing markets is the elimination of poverty, we show how those critical conditions help to explain the lack of success in the United States. We propose a modified model whose goal is the creation of an entrepreneurial venture or the improvement of the performance of an existing small enterprise.

  4. Syntax highlighting in business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Freytag, T.; Mendling, J.; Eckleder, A.

    2011-01-01

    Sense-making of process models is an important task in various phases of business process management initiatives. Despite this, there is currently hardly any support in business process modeling tools to adequately support model comprehension. In this paper we adapt the concept of syntax

  5. Genetic Process Mining: Alignment-based Process Model Mutation

    NARCIS (Netherlands)

    Eck, van M.L.; Buijs, J.C.A.M.; Dongen, van B.F.; Fournier, F.; Mendling, J.

    2015-01-01

    The Evolutionary Tree Miner (ETM) is a genetic process discovery algorithm that enables the user to guide the discovery process based on preferences with respect to four process model quality dimensions: replay fitness, precision, generalization and simplicity. Traditionally, the ETM algorithm uses

  6. Modeling of an improved chemical vapor infiltration process for ceramic composites fabrication

    International Nuclear Information System (INIS)

    Tai, N.H.; Chou, T.W.

    1990-01-01

    A quasi-steady-state approach is applied to model the pressure-driven, temperature-gradient chemical vapor infiltration (improved CVI process) for ceramic matrix composites fabrication. The deposited matrix in this study is SiC which is converted from the thermal decomposition of methyltrichlorosilane gas under excess hydrogen. A three-dimensional unit cell is adopted to simulate the spatial arrangements of reinforcements in discontinuous fiber mats and three-dimensionally woven fabrics. The objectives of this paper are to predict the temperature and density distributions in a fibrous preform during processing, the advancement of the solidified front, the total fabrication period, and the vapor inlet pressure variation for maintaining a constant flow rate

  7. High-Performance Pseudo-Random Number Generation on Graphics Processing Units

    OpenAIRE

    Nandapalan, Nimalan; Brent, Richard P.; Murray, Lawrence M.; Rendell, Alistair

    2011-01-01

    This work considers the deployment of pseudo-random number generators (PRNGs) on graphics processing units (GPUs), developing an approach based on the xorgens generator to rapidly produce pseudo-random numbers of high statistical quality. The chosen algorithm has configurable state size and period, making it ideal for tuning to the GPU architecture. We present a comparison of both speed and statistical quality with other common parallel, GPU-based PRNGs, demonstrating favourable performance o...
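
    For readers unfamiliar with this generator family, the recurrence below is the plain 64-bit xorshift of Marsaglia (2003); xorgens generalizes it with configurable state size and period. The shift triple and seed are illustrative, and this CPU-side Python sketch stands in for neither the xorgens algorithm itself nor its GPU deployment.

```python
# A plain 64-bit xorshift generator (Marsaglia 2003). Brent's xorgens family,
# used in the paper above, generalizes this recurrence; the shift triple and
# seeding here are illustrative, not the xorgens or GPU implementation itself.

MASK64 = (1 << 64) - 1

def xorshift64(state):
    """Advance one step; returns (new_state, output). State must be nonzero."""
    x = state
    x ^= (x << 13) & MASK64
    x ^= x >> 7
    x ^= (x << 17) & MASK64
    return x, x

state = 0x9E3779B97F4A7C15          # arbitrary nonzero seed
for _ in range(3):
    state, out = xorshift64(state)
    print(f"{out:016x}")
```

    On a GPU, each thread would typically own an independent, well-separated stream (for example, distinct seeds or jumped states), which is precisely the tuning issue the paper addresses.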

  8. Analysis of the overall energy intensity of alumina refinery process using unit process energy intensity and product ratio method

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Liru; Aye, Lu [International Technologies Center (IDTC), Department of Civil and Environmental Engineering,The University of Melbourne, Vic. 3010 (Australia); Lu, Zhongwu [Institute of Materials and Metallurgy, Northeastern University, Shenyang 110004 (China); Zhang, Peihong [Department of Municipal and Environmental Engineering, Shenyang Architecture University, Shenyang 110168 (China)

    2006-07-15

    Alumina refining is an energy intensive industry. Traditional energy saving methods employed have been single-equipment-orientated. Based on the two concepts of 'energy carrier' and 'system', this paper presents a method that analyzes the effects of unit process energy intensity (e) and product ratio (p) on the overall energy intensity of alumina. The important conclusion drawn from this method is that it is necessary to decrease both the unit process energy intensities and the product ratios in order to decrease the overall energy intensity of alumina, which may be taken as a future policy for energy saving. As a case study, the overall energy intensity of the Chinese Zhengzhou alumina refinery plant, which uses the Bayer-sinter combined method, was analyzed for the period between 1995 and 2000. The result shows that the overall energy intensity of alumina in this plant decreased by 7.36 GJ/t-Al2O3 over this period; 49% of the total energy saving is due to direct energy saving, and 51% is due to indirect energy saving. The emphasis in this paper is on decreasing the product ratios of high-energy-consumption unit processes, such as evaporation, slurry sintering, aluminium trihydrate calcining and desilication. Energy savings can be made (1) by increasing the proportion of Bayer and indirect digestion, (2) by increasing the grade of ore by ore dressing or importing some rich gibbsite and (3) by promoting the advancement of technology. (author)
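
    The unit-process-energy-intensity/product-ratio method can be made concrete with a small sketch. Writing the overall intensity as E = sum_i p_i * e_i, a midpoint decomposition splits a change in E exactly into a direct part (changes in e_i) and an indirect part (changes in p_i); the unit processes and numbers below are invented for illustration, not the plant data of the paper.

```python
# Sketch of the unit-process-energy-intensity / product-ratio method: the
# overall intensity is E = sum_i p_i * e_i, with e_i the energy intensity of
# unit process i and p_i its product ratio. The midpoint split below assigns
# the change in E exactly to a "direct" part (changes in e_i) and an
# "indirect" part (changes in p_i). Numbers are illustrative, not plant data.

def decompose(p0, e0, p1, e1):
    direct = sum((eb - ea) * (pa + pb) / 2
                 for pa, pb, ea, eb in zip(p0, p1, e0, e1))
    indirect = sum((pb - pa) * (ea + eb) / 2
                   for pa, pb, ea, eb in zip(p0, p1, e0, e1))
    return direct, indirect

# product ratios (t per t alumina) and intensities (GJ per t) for two years
p0, e0 = [3.0, 2.2, 1.0], [4.0, 6.5, 3.0]   # base year
p1, e1 = [2.6, 1.9, 1.0], [3.6, 6.0, 2.8]   # later year

d, i = decompose(p0, e0, p1, e1)
E0 = sum(p * e for p, e in zip(p0, e0))
E1 = sum(p * e for p, e in zip(p1, e1))
print(f"dE = {E1 - E0:+.2f} GJ/t (direct {d:+.2f}, indirect {i:+.2f})")
```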

  9. A low-cost system for graphical process monitoring with colour video symbol display units

    International Nuclear Information System (INIS)

    Grauer, H.; Jarsch, V.; Mueller, W.

    1977-01-01

    A system for computer-controlled graphic process supervision using colour symbol video displays is described. It has the following characteristics: a compact unit with no external memory for image storage; a problem-oriented, simple descriptive cut to the process program; no restriction on the graphical representation of process variables; and computer and display independence, achieved by the implementation of colours and parameterized code creation for the display. (WB) [de]

  10. Integrating post-Newtonian equations on graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Herrmann, Frank; Tiglio, Manuel [Department of Physics, Center for Fundamental Physics, and Center for Scientific Computation and Mathematical Modeling, University of Maryland, College Park, MD 20742 (United States); Silberholz, John [Center for Scientific Computation and Mathematical Modeling, University of Maryland, College Park, MD 20742 (United States); Bellone, Matias [Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Cordoba 5000 (Argentina); Guerberoff, Gustavo, E-mail: tiglio@umd.ed [Facultad de Ingenieria, Instituto de Matematica y Estadistica ' Prof. Ing. Rafael Laguardia' , Universidad de la Republica, Montevideo (Uruguay)

    2010-02-07

    We report on early results of a numerical and statistical study of binary black hole inspirals. The two black holes are evolved using post-Newtonian approximations starting with initially randomly distributed spin vectors. We characterize certain aspects of the distribution shortly before merger. In particular we note the uniform distribution of black hole spin vector dot products shortly before merger and a high correlation between the initial and final black hole spin vector dot products in the equal-mass, maximally spinning case. More than 300 million simulations were performed on graphics processing units, and we demonstrate a speed-up of a factor 50 over a more conventional CPU implementation. (fast track communication)
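
    One statistic mentioned above, the near-uniform distribution of spin-vector dot products, is easy to check in a toy Monte Carlo: for independent isotropic unit vectors in 3D, the pairwise dot product is exactly uniform on [-1, 1]. The CPU-side numpy sketch below only illustrates that statistical fact; it does not evolve post-Newtonian equations or run on a GPU.

```python
# Monte Carlo check of one statistic mentioned above: for independent,
# isotropically distributed unit spin vectors in 3D, the pairwise dot product
# is uniform on [-1, 1]. This toy check uses numpy on the CPU; it stands in
# for neither the post-Newtonian evolution nor the GPU implementation.

import numpy as np

rng = np.random.default_rng(0)

def random_unit_vectors(n):
    v = rng.normal(size=(n, 3))            # isotropic via normal deviates
    return v / np.linalg.norm(v, axis=1, keepdims=True)

a, b = random_unit_vectors(200_000), random_unit_vectors(200_000)
dots = np.einsum('ij,ij->i', a, b)

hist, _ = np.histogram(dots, bins=10, range=(-1, 1))
print(hist / len(dots))   # each of the 10 bins should hold about 0.1
```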

  11. The triconnected abstraction of process models

    OpenAIRE

    Polyvyanyy, Artem; Smirnov, Sergey; Weske, Mathias

    2009-01-01

    Contents (Artem Polyvyanyy, Sergey Smirnov, and Mathias Weske, The Triconnected Abstraction of Process Models): 1 Introduction; 2 Business Process Model Abstraction; 3 Preliminaries; 4 Triconnected Decomposition (4.1 Basic Approach for Process Component Discovery; 4.2 SPQR-Tree Decomposition; 4.3 SPQR-Tree Fragments in the Context of Process Models); 5 Triconnected Abstraction (5.1 Abstraction Rules; 5.2 Abstraction Algorithm); 6 Related Work and Conclusions

  12. Continuous soil maps - a fuzzy set approach to bridge the gap between aggregation levels of process and distribution models

    NARCIS (Netherlands)

    Gruijter, de J.J.; Walvoort, D.J.J.; Gaans, van P.F.M.

    1997-01-01

    Soil maps as multi-purpose models of spatial soil distribution have a much higher level of aggregation (map units) than the models of soil processes and land-use effects that need input from soil maps. This mismatch between aggregation levels is particularly detrimental in the context of precision

  13. Modeling Suspension and Continuation of a Process

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2012-04-01

    Full Text Available This work focuses on the difficulties an analyst encounters when modeling suspension and continuation of a process in contemporary process modeling languages. As a basis, a general lifecycle of an activity is introduced and compared to the activity lifecycles supported by individual process modeling languages. The comparison shows that contemporary process modeling languages cover the defined general lifecycle of an activity only partially. Two popular process modeling languages are then selected, and a real example is modeled to review how these languages cope with their lack of native support for suspension and continuation of an activity. Given the unsatisfying results of the contemporary process modeling languages in the modeled example, a new process modeling language is presented which, as demonstrated, is capable of capturing suspension and continuation of an activity in a much simpler and more precise way.
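
    The gap being discussed can be illustrated with a toy activity lifecycle that makes suspension and continuation explicit transitions. The states and event names below are a generic illustration, not the lifecycle or the language proposed by the author.

```python
# Toy activity lifecycle with explicit suspension/continuation, the gap in
# contemporary notations that the paper discusses. States and transitions are
# a generic illustration, not the language proposed by the author.

from enum import Enum, auto

class State(Enum):
    READY = auto(); RUNNING = auto(); SUSPENDED = auto()
    COMPLETED = auto(); ABORTED = auto()

TRANSITIONS = {
    (State.READY, "start"): State.RUNNING,
    (State.RUNNING, "suspend"): State.SUSPENDED,
    (State.SUSPENDED, "continue"): State.RUNNING,
    (State.RUNNING, "complete"): State.COMPLETED,
    (State.RUNNING, "abort"): State.ABORTED,
    (State.SUSPENDED, "abort"): State.ABORTED,
}

def fire(state, event):
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"'{event}' is not allowed in state {state.name}")

s = State.READY
for ev in ["start", "suspend", "continue", "complete"]:
    s = fire(s, ev)
    print(ev, "->", s.name)
```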

  14. Study on Quantification for Multi-unit Seismic PSA Model using Monte Carlo Sampling

    International Nuclear Information System (INIS)

    Oh, Kyemin; Han, Sang Hoon; Jang, Seung-cheol; Park, Jin Hee; Lim, Ho-Gon; Yang, Joon Eon; Heo, Gyunyoung

    2015-01-01

    In existing PSA, frequencies are estimated for accident sequences occurring in a single unit, whereas a multi-unit PSA has to consider many combinations because the accident sequence in each unit can differ. It is difficult to quantify all inter-unit combinations using a traditional method such as the Minimal Cut Upper Bound (MCUB). For this reason, Monte Carlo sampling was used in this paper to quantify the multi-unit PSA model. The advantage of this method is that it considers all combinations as the number of units increases and calculates a nearly exact value compared to other methods. However, it is difficult to obtain detailed information such as minimal cut sets and accident sequences; to partially solve this problem, FTeMC was modified. In multi-unit PSA, quantification of both internal and external multi-unit accidents is a significant issue. Although the result reported here is only a case study checking the applicability of the suggested method, it is expected that the method can be used in practical assessments of multi-unit risk.
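
    The contrast between Monte Carlo quantification and the Minimal Cut Upper Bound can be shown on a toy two-unit model with a shared seismic initiator. The event names, probabilities and fault-tree logic below are invented for illustration; the modified FTeMC tool is not reproduced here.

```python
# Minimal sketch of Monte Carlo quantification for a two-unit model versus the
# Minimal Cut Upper Bound (MCUB). The fault "trees" are toy expressions with a
# shared (inter-unit) basic event; probabilities are illustrative only.

import random

P = {"seismic": 0.05, "dg_a1": 0.2, "dg_a2": 0.2, "dg_b1": 0.2, "dg_b2": 0.2}

def sample(rng):
    return {e: rng.random() < p for e, p in P.items()}

def unit1(x):  # unit 1 core damage: quake AND both of its diesel generators fail
    return x["seismic"] and x["dg_a1"] and x["dg_a2"]

def unit2(x):
    return x["seismic"] and x["dg_b1"] and x["dg_b2"]

rng, n, hits = random.Random(1), 1_000_000, 0
for _ in range(n):
    x = sample(rng)
    if unit1(x) and unit2(x):   # simultaneous multi-unit accident
        hits += 1

mc = hits / n
mcub = 0.05 * 0.2 ** 4   # single cut set {seismic, four DGs}: MCUB is exact here
print(f"Monte Carlo {mc:.2e} vs MCUB {mcub:.2e}")
```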

  15. Declarative modeling for process supervision

    International Nuclear Information System (INIS)

    Leyval, L.

    1989-01-01

    Our work is a contribution to computer aided supervision of continuous processes. It is inspired by an area of Artificial Intelligence: qualitative physics. Here, supervision is based on a model which continuously provides operators with a synthetic view of the process; but this model is founded on general principles of control theory rather than on physics. It involves concepts such as high gain or small time response. It helps in linking temporally the evolution of various variables. Moreover, the model provides predictions of the future behaviour of the process, which allows action advice and alarm filtering. This should greatly reduce the famous cognitive overload associated to any complex and dangerous evolution of the process

  16. Comprehensive stroke units: a review of comparative evidence and experience.

    Science.gov (United States)

    Chan, Daniel K Y; Cordato, Dennis; O'Rourke, Fintan; Chan, Daniel L; Pollack, Michael; Middleton, Sandy; Levi, Chris

    2013-06-01

    Stroke unit care offers significant benefits in survival and dependency when compared to care on a general medical ward. Most stroke units are either acute or rehabilitation units; the comprehensive (combined acute and rehabilitation) model is less common. To examine the different levels of evidence for comprehensive stroke units compared to other organized inpatient stroke care, and to share local experience with comprehensive stroke units. Cochrane Library and Medline (1980 to December 2010) review of English language articles comparing stroke units to alternative forms of stroke care delivery, different types of stroke unit models, and differences in processes of care within different stroke unit models. Comparative evidence at different levels for comprehensive stroke units versus other models of stroke units was collected. There are no randomized controlled trials directly comparing comprehensive stroke units to other stroke unit models (either acute or rehabilitation). Comprehensive stroke units are associated with reduced length of stay and the greatest reduction in combined death and dependency in a meta-analysis when compared to other stroke unit models. Comprehensive stroke units also show better length of stay and functional outcome when compared to acute or rehabilitation stroke unit models in a cross-sectional study, and better length of stay in a 'before-and-after' comparative study. The components of stroke unit care that improve outcome are multifactorial and most probably include early mobilization. A comprehensive stroke unit model has been successfully implemented in metropolitan and rural hospital settings. Comprehensive stroke units are associated with reductions in length of stay and combined death and dependency, and with improved functional outcomes, compared to other stroke unit models. A comprehensive stroke unit model is worth considering as the preferred model of stroke unit care in the planning and delivery of metropolitan and rural stroke services.

  17. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

    Full Text Available To study economic phenomena and processes using mathematical modeling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package. Any economic process or phenomenon admits a mathematical description of its behavior, from which an economic-mathematical model is drawn up in the following stages: formulation of the problem, analysis of the process being modeled, production of the model, design verification, and validation and implementation of the model. This article presents an economic model built with mathematical equations and the MatLab software package, which helps us approximate an effective solution. The input data considered are the net cost, the direct and total cost, and the links between them; the basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with a graphic representation and an interpretation of the results achieved in terms of our specific problem.

  18. Gravity driven and in situ fractional crystallization processes in the Centre Hill complex, Abitibi Subprovince, Canada: Evidence from bilaterally-paired cyclic units

    Science.gov (United States)

    Thériault, R. D.; Fowler, A. D.

    1996-12-01

    The formation of layers in mafic intrusions has been explained by various processes, making it the subject of much controversy. The concept that layering originates from gravitational settling of crystals has been superseded in recent years by models involving in situ fractional crystallization. Here we present evidence from the Centre Hill complex that both processes may be operative simultaneously within the same intrusion. The Centre Hill complex is part of the Munro Lake sill, an Archean layered mafic intrusion emplaced in volcanic rocks of the Abitibi Subprovince. The Centre Hill complex comprises the following lithostratigraphic units: six lower cyclic units of peridotite and clinopyroxenite; a middle unit of leucogabbro; six upper cyclic units of branching-textured gabbro (BTG) and clotted-textured gabbro (CTG), the uppermost of these units being overlain by a marginal zone of fine-grained gabbro. The cyclic units of peridotite/clinopyroxenite and BTG/CTG are interpreted to have formed concurrently through fractional crystallization, associated with periodic replenishment of magma to the chamber. The units of peridotite and clinopyroxenite formed by gravitational accumulation of crystals that grew under the roof. The cyclic units of BTG and CTG formed along the upper margin of the sill by two different mechanisms: (1) layers of BTG crystallized in situ along an inward-growing roof and (2) layers of CTG formed by accumulation of buoyant plagioclase crystals. The layers of BTG are characterized by branching pseudomorphs after fayalite up to 50 cm in length that extend away from the upper margin. The original branching crystals are interpreted to have grown from stagnant intercumulus melt in a high thermal gradient resulting from the injection of new magma to the chamber.

  19. The impact of a lean rounding process in a pediatric intensive care unit.

    Science.gov (United States)

    Vats, Atul; Goin, Kristin H; Villarreal, Monica C; Yilmaz, Tuba; Fortenberry, James D; Keskinocak, Pinar

    2012-02-01

    Poor workflow associated with physician rounding can produce inefficiencies that decrease time for essential activities, delay clinical decisions, and reduce staff and patient satisfaction. Workflow and provider resources were not optimized when a pediatric intensive care unit increased by 22,000 square feet (to 33,000) and by nine beds (to 30). Lean methods (focusing on essential processes) and scenario analysis were used to develop and implement a patient-centric standardized rounding process, which we hypothesize would lead to improved rounding efficiency, decrease required physician resources, improve satisfaction, and enhance throughput. Human factors techniques and statistical tools were used to collect and analyze observational data for 11 rounding events before and 12 rounding events after process redesign. Actions included: 1) recording rounding events, times, and patient interactions and classifying them as essential, nonessential, or nonvalue added; 2) comparing rounding duration and time per patient to determine the impact on efficiency; 3) analyzing discharge orders for timeliness; 4) conducting staff surveys to assess improvements in communication and care coordination; and 5) analyzing customer satisfaction data to evaluate impact on patient experience. Thirty-bed pediatric intensive care unit in a children's hospital with academic affiliation. Eight attending pediatric intensivists and their physician rounding teams. Eight attending physician-led teams were observed for 11 rounding events before and 12 rounding events after implementation of a standardized lean rounding process focusing on essential processes. Total rounding time decreased significantly (157 ± 35 mins before vs. 121 ± 20 mins after), through a reduction in time spent on nonessential (53 ± 30 vs. 9 ± 6 mins) activities. The previous process required three attending physicians for an average of 157 mins (7.55 attending physician man-hours), while the new process required two

  20. Styles in business process modeling: an exploration and a model

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Fahland, D.; Weidlich, M.; Zugal, S.; Weber, B.; Reijers, H.A.; Mendling, J.

    2015-01-01

    Business process models are an important means to design, analyze, implement, and control business processes. As with every type of conceptual model, a business process model has to meet certain syntactic, semantic, and pragmatic quality requirements to be of value. For many years, such quality

  1. Composing Models of Geographic Physical Processes

    Science.gov (United States)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed by basic models of geographic physical processes, which is shown by means of an example.
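
    As a concrete instance of the partial difference equations underlying such a process description language, the sketch below steps a 1D diffusion process (for example, heat or a dissolved substance spreading along a transect) with the explicit scheme u_i <- u_i + r(u_{i-1} - 2u_i + u_{i+1}); the grid size, the coefficient r and the initial condition are arbitrary illustrations.

```python
# A geographic physical process written as a partial difference equation:
# 1D diffusion with the explicit update u[i] += r * (u[i-1] - 2*u[i] + u[i+1]).
# Grid size, r and the initial condition are arbitrary illustrations.

def diffuse(u, r=0.2, steps=50):
    """Explicit scheme; stable for r = D*dt/dx**2 <= 0.5, fixed boundaries."""
    u = list(u)
    for _ in range(steps):
        u = [u[0]] + [
            u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
            for i in range(1, len(u) - 1)
        ] + [u[-1]]
    return u

initial = [0.0] * 10 + [100.0] + [0.0] * 10   # a point anomaly mid-transect
print([round(v, 1) for v in diffuse(initial)])
```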

  2. PGAS in-memory data processing for the Processing Unit of the Upgraded Electronics of the Tile Calorimeter of the ATLAS Detector

    International Nuclear Information System (INIS)

    Ohene-Kwofie, Daniel; Otoo, Ekow

    2015-01-01

    The ATLAS detector, operated at the Large Hadron Collider (LHC), records proton-proton collisions at CERN every 50 ns, resulting in a sustained data flow of up to PB/s. The upgraded Tile Calorimeter of the ATLAS experiment will sustain about 5 PB/s of digital throughput. These massive data rates require extremely fast data capture and processing. Although there has been a steady increase in the processing speed of CPU/GPGPU assemblies for high performance computing, the rate of data input and output, even under parallel I/O, has not kept up with the general increase in computing speeds. The problem then is whether one can implement an I/O subsystem infrastructure capable of meeting the computational speeds of the advanced computing systems at the petascale and exascale level. We propose a system architecture that leverages the Partitioned Global Address Space (PGAS) model of computing to maintain an in-memory data store for the Processing Unit (PU) of the upgraded electronics of the Tile Calorimeter, which is proposed to be used as a high-throughput general-purpose co-processor to the sROD of the upgraded Tile Calorimeter. The physical memory of the PUs is aggregated into a large global logical address space using RDMA-capable interconnects such as PCI-Express to enhance data processing throughput. (paper)

  3. Quantification of terrestrial ecosystem carbon dynamics in the conterminous United States combining a process-based biogeochemical model and MODIS and AmeriFlux data

    Directory of Open Access Journals (Sweden)

    M. Chen

    2011-09-01

    Full Text Available Satellite remote sensing provides continuous temporal and spatial information on terrestrial ecosystems. Using these remote sensing data together with eddy flux measurements and biogeochemical models, such as the Terrestrial Ecosystem Model (TEM), should provide a more adequate quantification of the carbon dynamics of terrestrial ecosystems. Here we use Moderate Resolution Imaging Spectroradiometer (MODIS) Enhanced Vegetation Index (EVI), Land Surface Water Index (LSWI) and carbon flux data from AmeriFlux to conduct such a study. We first modify the gross primary production (GPP) modeling in TEM by incorporating EVI and LSWI to account for the effects of changes in canopy photosynthetic capacity, phenology and water stress. Second, we parameterize and verify the new version of TEM with eddy flux data. We then apply the model to the conterminous United States over the period 2000–2005 at a 0.05° × 0.05° spatial resolution. We find that the new version of TEM improves on the previous version and generally captures the expected temporal and spatial patterns of regional carbon dynamics. We estimate that regional GPP is between 7.02 and 7.78 Pg C yr−1, that net primary production (NPP) ranges from 3.81 to 4.38 Pg C yr−1 and that net ecosystem production (NEP) varies within 0.08–0.73 Pg C yr−1 over the period 2000–2005 for the conterminous United States. The uncertainty due to parameterization is 0.34, 0.65 and 0.18 Pg C yr−1 for the regional estimates of GPP, NPP and NEP, respectively. The effects of extreme climate and disturbances, such as the severe drought in 2002 and the destructive Hurricane Katrina in 2005, were captured by the model. Our study provides a new, independent and more adequate measure of carbon fluxes for the conterminous United States, which will benefit studies of carbon-climate feedback and facilitate policy-making for carbon management and climate.
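
    How EVI and LSWI can enter GPP modelling is illustrated below with one common light-use-efficiency formulation, in the spirit of the modification described above; the actual TEM equations, parameter values and scalars are not reproduced, and every number here is an assumption.

```python
# One common light-use-efficiency formulation showing how EVI and LSWI can
# enter GPP modelling (in the spirit of the TEM modification described above;
# the actual TEM equations and parameter values are not reproduced here).

def gpp(par, evi, lswi, tair, eps0=0.05, lswi_max=0.6,
        tmin=0.0, topt=20.0, tmax=40.0):
    """GPP in mol C m-2 d-1 for PAR in mol photons m-2 d-1 (all illustrative).

    EVI scales the fraction of PAR absorbed by photosynthetically active
    vegetation; LSWI downregulates production under canopy water stress.
    """
    t_scalar = max(0.0, ((tair - tmin) * (tair - tmax)) /
                   ((tair - tmin) * (tair - tmax) - (tair - topt) ** 2))
    w_scalar = (1.0 + lswi) / (1.0 + lswi_max)   # water-stress scalar
    return eps0 * t_scalar * w_scalar * evi * par

print(f"{gpp(par=40.0, evi=0.45, lswi=0.2, tair=22.0):.3f} mol C m-2 d-1")
```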

  4. The AMchip04 and the Processing Unit Prototype for the FastTracker

    CERN Document Server

    Andreani, A; The ATLAS collaboration; Beretta, M; Bogdan, M; Citterio, M; Alberti, F; Giannetti, P; Lanza, A; Magalotti, D; Piendibene, M; Shochet, M; Stabile, A; Tang, J; Tompkins, L; Volpi, G

    2012-01-01

    Modern experiments search for extremely rare processes hidden in much larger background levels. As experiment complexity and accelerator backgrounds and luminosity increase, we need increasingly complex and exclusive selections. We present the first prototype of a new Processing Unit, the core of the FastTracker processor for ATLAS, whose computing power is such that a couple of hundred of them will be able to reconstruct all the tracks with transverse momentum above 1 GeV in ATLAS events up to Phase II instantaneous luminosities (5×10^34 cm^-2 s^-1) with an event input rate of 100 kHz and a latency below hundreds of microseconds. We plan extremely powerful, very compact and low-consumption units for the far future, essential to increase the efficiency and purity of the Level 2 selected samples through the intensive use of tracking. This strategy requires massive computing power to minimize the online execution time of complex tracking algorithms. The time consuming pattern recognition problem, generall...

  5. A Convex Model of Risk-Based Unit Commitment for Day-Ahead Market Clearing Considering Wind Power Uncertainty

    DEFF Research Database (Denmark)

    Zhang, Ning; Kang, Chongqing; Xia, Qing

    2015-01-01

    The integration of wind power requires the power system to be sufficiently flexible to accommodate its forecast errors. In the market clearing process, the scheduling of flexibility relies on the manner in which the wind power uncertainty is addressed in the unit commitment (UC) model. This paper proposes a risk-based UC (RUC) model in which the risks brought by wind power uncertainty are quantified and are considered in both the objective functions and the constraints. The RUC model is shown to be convex and is transformed into a mixed integer linear programming (MILP) problem using relaxation and piecewise linearization. The proposed RUC model is tested using a three-bus system and an IEEE RTS79 system. The results show that the risk modeling facilitates a strategic market clearing procedure with a reasonable computational expense.
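
    The flavour of risk-based commitment can be conveyed by a deliberately tiny example: enumerate the on/off commitments of two thermal units, dispatch against net-load scenarios (load minus uncertain wind) and penalize expected unserved energy as the risk term. The real model is a convex MILP over many periods and buses; the unit data, scenarios and penalty below are all invented.

```python
# Toy illustration of risk-aware unit commitment: enumerate commitments of two
# thermal units, dispatch against net load scenarios (load minus uncertain
# wind), and add an expected-shortfall "risk" penalty. The real model is a
# MILP over many periods; all numbers here are made up for illustration.

from itertools import product

UNITS = [  # (fixed no-load cost $, marginal cost $/MWh, capacity MW)
    (500.0, 20.0, 120.0),
    (200.0, 45.0, 80.0),
]
LOAD = 150.0
WIND_SCENARIOS = [(0.2, 0.0), (0.5, 30.0), (0.3, 60.0)]  # (probability, MW)
VOLL = 1000.0   # value of lost load, the risk penalty in $/MWh

def expected_cost(commit):
    total = sum(u[0] for u, on in zip(UNITS, commit) if on)
    merit = sorted((u for u, on in zip(UNITS, commit) if on),
                   key=lambda u: u[1])          # merit-order dispatch
    for prob, wind in WIND_SCENARIOS:
        need, cost = max(0.0, LOAD - wind), 0.0
        for _, mc, cap in merit:
            g = min(need, cap)
            cost, need = cost + mc * g, need - g
        total += prob * (cost + VOLL * need)    # unserved energy = risk term
    return total

best = min(product([0, 1], repeat=2), key=expected_cost)
print("best commitment:", best, f"E[cost] = ${expected_cost(best):,.0f}")
```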

  6. Consumer Decision Process in Restaurant Selection: An Application of the Stylized EKB Model

    Directory of Open Access Journals (Sweden)

    Eugenia Wickens

    2016-12-01

    Full Text Available Purpose – The aim of this paper is to propose a framework based on empirical work for understanding the consumer decision processes involved in the selection of a restaurant for leisure meals. Design/Methodology/Approach – An interpretive approach is taken in order to understand the intricacies of the process and the various stages in the process. Six focus group interviews with consumers of various ages and occupations in the South East of the United Kingdom were conducted. Findings and implications – The stylized EKB model of the consumer decision process (Tuan-Pham & Higgins, 2005) was used as a framework for developing the different stages of the process. Two distinct parts of the process were identified. Occasion was found to be critical to the stage of problem recognition. In terms of evaluation of alternatives and, in particular, sensitivity to evaluative content, the research indicates that the regulatory focus theory of Tuan-Pham and Higgins (2005) applies to the decision of selecting a restaurant. Limitations – It is acknowledged that this exploratory study is based on a small sample in a single geographical area. Originality – The paper is the first application of the stylized EKB model, which takes into account the motivational dimensions of consumer decision making, missing in other models. It concludes that it may have broader applications to other research contexts.

  7. APPLICATION FEATURES OF SPATIAL CONDUCTOMETRY SENSORS IN MODELLING OF COOLANT FLOW MIXING IN NUCLEAR POWER UNIT EQUIPMENT

    Directory of Open Access Journals (Sweden)

    A. A. Barinov

    2016-01-01

    Full Text Available The mixing of coolant flows with different temperatures and concentrations of dissolved additives is well known in the operation of nuclear power units. In some cases these processes have an essential impact on the service life and behaviour of the unit during transient and emergency situations. The aim of the study was to create a measurement system and test facility for basic tests and to embed the spatial conductometry method in the investigation practice of turbulent coolant flows. In the course of the investigation, a measurement system with sensors and an experimental facility were designed, and the first tests were carried out. Special attention was paid to calibration and to clarifying the methodology for applying conductometry sensors in studies of turbulent flow characteristics. The investigations used an electrically contrasting tracer jet with concurrent flow in a closed channel of round cross-section. The measurements included both averaged and unsteady realizations of the measurement signal. Experimental data processing showed good agreement with tests acquired from other measurement systems based on different physical principles. Calibration functions were acquired, and a methodical basis for applying the spatial conductometry measurement system was created. The experience gathered with the spatial sensors made it possible to formulate principles for further investigations involving large-scale models of nuclear unit equipment. Spatial wire-mesh sensors proved to be a promising type of eddy-resolving measurement device.

  8. Pengembangan Model Persediaan Continuous Review dengan All-Unit Discount dan Faktor Kadaluwarsa

    Directory of Open Access Journals (Sweden)

    Cherish Rikardo

    2017-06-01

    Full Text Available This paper discusses a mathematical model for an inventory system that takes into account an all-unit discount and an expiry (deterioration) factor. Demand is deterministic, is a function of time, and depends on the inventory level (inventory-dependent demand); the expiry rate of the goods also depends on time, and there is no lead time. The model extends that of Nagare and Dutta [9] by adding an all-unit discount factor and a time-dependent expiry rate. From the developed model, the economic order quantity and the optimal time between orders that minimize the total annual inventory cost are determined. An algorithm for finding the optimal solution of the model and a numerical example illustrating this inventory problem are given. A sensitivity analysis of the model, examining the influence of the expiry rate and the demand rate on the optimal order quantity and time between orders, is also provided. The sensitivity analysis shows that the larger the expiry rate, the shorter the time between orders and the smaller the order quantity; the same happens when the demand rate increases. Keywords: inventory, expiry, all-unit discount.
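
    The all-unit-discount ingredient of the model can be sketched with the classic discounted lot-sizing computation below; the paper's time-dependent expiry rate and inventory-dependent demand are deliberately omitted, and all data are illustrative.

```python
# Classic all-unit-discount lot sizing, a sketch of the discount ingredient of
# the model above (the expiry factor and inventory-dependent demand of the
# paper are omitted for brevity; all data are illustrative).

import math

D = 1200.0          # annual demand, units/year
K = 50.0            # ordering cost per order
i = 0.25            # holding cost as a fraction of unit price per year
BREAKS = [(0, 10.0), (200, 9.5), (500, 9.0)]   # (minimum qty, unit price)

def total_cost(q, price):
    return D * price + K * D / q + i * price * q / 2

best = None
for idx, (qmin, price) in enumerate(BREAKS):
    qmax = BREAKS[idx + 1][0] if idx + 1 < len(BREAKS) else math.inf
    q = math.sqrt(2 * K * D / (i * price))      # unconstrained EOQ at this price
    q = min(max(q, qmin if qmin > 0 else 1), qmax - 1e-9)  # clamp into bracket
    cand = (total_cost(q, price), q, price)
    best = min(best, cand) if best else cand

cost, q, price = best
print(f"order {q:.0f} units at ${price}/unit, annual cost ${cost:,.2f}")
```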

  9. Factors associated with student learning processes in primary health care units: a questionnaire study.

    Science.gov (United States)

    Bos, Elisabeth; Alinaghizadeh, Hassan; Saarikoski, Mikko; Kaila, Päivi

    2015-01-01

    Clinical placement plays a key role in education intended to develop nursing and caregiving skills. Studies of nursing students' clinical learning experiences show that the following dimensions affect learning processes: (i) supervisory relationship, (ii) pedagogical atmosphere, (iii) management leadership style, (iv) premises of nursing care on the ward, and (v) nursing teachers' roles. Few empirical studies address the probability of an association between these dimensions and factors such as student (a) motivation, (b) satisfaction with clinical placement, and (c) experiences with professional role models. The study aimed to investigate factors associated with the five dimensions in clinical learning environments within primary health care units. The Swedish version of the Clinical Learning Environment, Supervision and Teacher scale, a validated evaluation instrument, was administered to 356 graduating nursing students after four or five weeks of clinical placement in primary health care units. The response rate was 84%. Multivariate analysis of variance was used to determine whether the five dimensions are associated with factors (a), (b), and (c) above. The analysis revealed a statistically significant association between the five dimensions and two factors: students' motivation and experiences with professional role models. The satisfaction factor had a statistically significant association (with a high effect size) with all dimensions; this clearly indicates that students experienced satisfaction. These questionnaire results show that a good clinical learning experience constitutes a complex whole that involves several interacting factors. The supervisory relationship and the pedagogical atmosphere particularly influenced students' satisfaction and motivation. These results provide valuable decision-support material for clinical education planning, implementation, and management. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. The Structured Process Modeling Theory (SPMT) : a cognitive view on why and how modelers benefit from structuring the process of process modeling

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Gailly, F.; Grefen, P.W.P.J.; Poels, G.

    2015-01-01

    After observing various inexperienced modelers constructing a business process model based on the same textual case description, it was noted that great differences existed in the quality of the produced models. The impression arose that certain quality issues originated from cognitive failures

  11. Security central processing unit applications in the protection of nuclear facilities

    International Nuclear Information System (INIS)

    Goetzke, R.E.

    1987-01-01

    New or upgraded electronic security systems protecting nuclear facilities or complexes will be heavily computer dependent. Proper planning for new systems and the employment of new state-of-the-art 32 bit processors in the processing of subsystem reports are key elements in effective security systems. The processing of subsystem reports represents only a small segment of system overhead. In selecting a security system to meet the current and future needs for nuclear security applications the central processing unit (CPU) applied in the system architecture is the critical element in system performance. New 32 bit technology eliminates the need for program overlays while providing system programmers with well documented program tools to develop effective systems to operate in all phases of nuclear security applications

  12. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps, are areas that increasingly support the optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis.

  13. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling. PMID:27157858

  14. Test results of the signal processing and amplifier unit for the emittance measurement system

    International Nuclear Information System (INIS)

    Stawiszynski, L.; Schneider, S.

    1984-01-01

    The signal processing and amplifier unit for the emittance measurement system is the unit with which the beam current on the harp-wires and the slit is measured and converted to a digital output. Temperature effects are very critical at low currents, and the purpose of the test measurements described in this report was mainly to establish the accuracy and repeatability of the measurements under the influence of temperature variations.

  15. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2005-01-01

    In a group contribution method for pure component property prediction, a molecule is described as a set of groups linked together to form a molecular structure. In the same way, for flowsheet "property" prediction, a flowsheet can be described as a set of process-groups linked together to represent the flowsheet structure. Just as a functional group is a collection of atoms, a process-group is a collection of operations forming a "unit" operation or a set of "unit" operations. The links between the process-groups are the streams, similar to the bonds that attach atoms/groups. Each process-group provides a contribution to the "property" of the flowsheet, which can be performance in terms of energy consumption, thereby allowing a flowsheet "property" to be calculated once it is described by the groups. Another feature of this approach is that the process-group attachments automatically provide...

  16. Detecting Difference between Process Models Based on the Refined Process Structure Tree

    Directory of Open Access Journals (Sweden)

    Jing Fan

    2017-01-01

    Full Text Available The development of mobile workflow management systems (mWfMS) leads to a large number of business process models. In the meantime, the location restrictions embedded in mWfMS may result in different process models for a single business process. In order to help users quickly locate the differences and rebuild the process model, detecting the difference between different process models is needed. Existing detection methods either provide a dissimilarity value to represent the difference or use predefined difference templates to generate the result, which cannot reflect the entire composition of the difference. Hence, in this paper, we present a new approach to solve this problem. Firstly, we parse the process models into their corresponding refined process structure trees (PSTs), that is, we decompose a process model into a hierarchy of subprocess models. Then we design a method to convert the PST to its corresponding task-based process structure tree (TPST). As a consequence, the problem of detecting differences between two process models is transformed into detecting differences between their corresponding TPSTs. Finally, we obtain the difference between two TPSTs based on the divide and conquer strategy, where the difference is described by an edit script whose cost we make close to the minimum. The extensive experimental evaluation shows that our method can meet real requirements in terms of precision and efficiency.

  17. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches and, at the same time, describe their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested on both business-like event logs, as recorded in a higher education enterprise resource planning system, and a real case scenario involving a set of Dutch municipalities.
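
    Of the three strategies, the greedy heuristic is the simplest to sketch: configure one node at a time, keeping the option whose resulting model best replays the event log. The toy model, log and fitness measure below are invented; this is not the paper's implementation.

```python
# Sketch of a greedy configuration strategy: fix one configurable node at a
# time, keeping the option whose resulting model replays the event log best.
# The model, the "fitness" measure and the log are deliberately toy-sized.

CONFIG_OPTIONS = {            # configurable node -> allowed activity sets
    "payment": [{"card"}, {"card", "invoice"}],
    "delivery": [{"pickup"}, {"courier"}, {"pickup", "courier"}],
}
LOG = [["card", "courier"], ["invoice", "courier"], ["card", "pickup"]]

def fitness(choice):
    """Fraction of traces whose activities are all allowed by the choice."""
    allowed = set().union(*choice.values())
    return sum(all(a in allowed for a in t) for t in LOG) / len(LOG)

choice = {node: opts[0] for node, opts in CONFIG_OPTIONS.items()}
for node, opts in CONFIG_OPTIONS.items():        # greedy, one node at a time
    choice[node] = max(opts, key=lambda o: fitness({**choice, node: o}))

print(choice, "fitness:", fitness(choice))
```

    Because it fixes one node at a time, a single greedy pass can miss interactions between choices; that is exactly the gap the exhaustive and genetic strategies trade extra computation to close.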

  18. Repairing process models to reflect reality

    NARCIS (Netherlands)

    Fahland, D.; Aalst, van der W.M.P.; Barros, A.; Gal, A.; Kindler, E.

    2012-01-01

    Process mining techniques relate observed behavior (i.e., event logs) to modeled behavior (e.g., a BPMN model or a Petri net). Processes models can be discovered from event logs and conformance checking techniques can be used to detect and diagnose differences between observed and modeled behavior.

  19. Discrete-Event Execution Alternatives on General Purpose Graphical Processing Units

    International Nuclear Information System (INIS)

    Perumalla, Kalyan S.

    2006-01-01

    Graphics cards, traditionally designed as accelerators for computer graphics, have evolved to support more general-purpose computation. General Purpose Graphical Processing Units (GPGPUs) are now being used as highly efficient, cost-effective platforms for executing certain simulation applications. While most of these applications belong to the category of time-stepped simulations, little is known about the applicability of GPGPUs to discrete event simulation (DES). Here, we identify some of the issues and challenges that the GPGPU stream-based interface raises for DES, and present some possible approaches to moving DES to GPGPUs. Initial performance results on simulation of a diffusion process show that DES-style execution on GPGPU runs faster than DES on CPU and also significantly faster than time-stepped simulations on either CPU or GPGPU.
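
    The DES execution style contrasted here with time-stepped simulation is, at its core, a priority queue of timestamped events. The minimal CPU-side sketch below fixes that pattern, with a toy random walk standing in for the diffusion example; it makes no claim about the GPGPU stream-based mapping studied in the paper.

```python
# Minimal discrete-event loop (priority queue of timestamped events) of the
# kind contrasted with time-stepped execution above; a toy random walk of
# "particles" stands in for the diffusion example. CPU-only Python, purely to
# fix the DES pattern being discussed.

import heapq, random

rng = random.Random(0)
events = [(rng.expovariate(1.0), pid, 0) for pid in range(3)]  # (time, id, pos)
heapq.heapify(events)

t_end = 5.0
while events:
    t, pid, pos = heapq.heappop(events)       # always advance the earliest event
    if t > t_end:
        break
    pos += rng.choice((-1, 1))                # one hop of the diffusing particle
    print(f"t={t:5.2f}  particle {pid} -> {pos:+d}")
    heapq.heappush(events, (t + rng.expovariate(1.0), pid, pos))
```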

  20. Calculation of the real states of Ignalina NPP Unit 1 and Unit 2 RBMK-1500 reactors in the verification process of QUABOX/CUBBOX code

    International Nuclear Information System (INIS)

    Bubelis, E.; Pabarcius, R.; Demcenko, M.

    2001-01-01

    Calculations of the main neutron-physical characteristics of the RBMK-1500 reactors of Ignalina NPP Unit 1 and Unit 2 were performed, taking real reactor core states as the basis for these calculations. Comparison of the calculation results obtained using the QUABOX/CUBBOX code with experimental data and with the calculation results obtained using the STEPAN code showed that all the main neutron-physical characteristics of the reactors of Unit 1 and Unit 2 of Ignalina NPP are within the safe deviation range of the analyzed parameters, and that the reactors of Ignalina NPP are operated in a safe and stable manner during changes of the reactor core composition. (author)

  1. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    Full Text Available The article outlines the application of the process approach to the functional description of a designed IT system supporting the operations of a secret office which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented (“as is”) in a manual manner and of the target processes (“to be”) that use RFID technology for the purpose of their automation. Additionally, examples of applying methods of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency are presented. An extension of the process analysis method is the possibility of applying a warehouse of processes and process mining methods.

  2. Orthographic units in the absence of visual processing: Evidence from sublexical structure in braille.

    Science.gov (United States)

    Fischer-Baum, Simon; Englebretson, Robert

    2016-08-01

    Reading relies on the recognition of units larger than single letters and smaller than whole words. Previous research has linked sublexical structures in reading to properties of the visual system, specifically on the parallel processing of letters that the visual system enables. But whether the visual system is essential for this to happen, or whether the recognition of sublexical structures may emerge by other means, is an open question. To address this question, we investigate braille, a writing system that relies exclusively on the tactile rather than the visual modality. We provide experimental evidence demonstrating that adult readers of (English) braille are sensitive to sublexical units. Contrary to prior assumptions in the braille research literature, we find strong evidence that braille readers do indeed access sublexical structure, namely the processing of multi-cell contractions as single orthographic units and the recognition of morphemes within morphologically-complex words. Therefore, we conclude that the recognition of sublexical structure is not exclusively tied to the visual system. However, our findings also suggest that there are aspects of morphological processing on which braille and print readers differ, and that these differences may, crucially, be related to reading using the tactile rather than the visual sensory modality. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Multidisciplinary Simulation Acceleration using Multiple Shared-Memory Graphical Processing Units

    Science.gov (United States)

    Kemal, Jonathan Yashar

    For purposes of optimizing and analyzing turbomachinery and other designs, the unsteady Favre-averaged flow-field differential equations for an ideal compressible gas can be solved in conjunction with the heat conduction equation. We solve all equations using the finite-volume multiple-grid numerical technique, with the dual time-step scheme used for unsteady simulations. Our numerical solver code targets CUDA-capable Graphical Processing Units (GPUs) produced by NVIDIA. Making use of MPI, our solver can run across networked compute nodes, where each MPI process can use either a GPU or a Central Processing Unit (CPU) core for primary solver calculations. We use NVIDIA Tesla C2050/C2070 GPUs based on the Fermi architecture, and compare our resulting performance against Intel Xeon X5690 CPUs. Solver routines converted to CUDA typically run about 10 times faster on a GPU for sufficiently dense computational grids. We used a conjugate cylinder computational grid and ran a turbulent steady flow simulation using 4 increasingly dense computational grids. Our densest computational grid is divided into 13 blocks, each containing 1033x1033 grid points, for a total of 13.87 million grid points or 1.07 million grid points per domain block. To obtain overall speedups, we compare the execution time of the solver's iteration loop, including all resource-intensive GPU-related memory copies. Comparing the performance of 8 GPUs to that of 8 CPUs, we obtain an overall speedup of about 6.0 when using our densest computational grid. This amounts to an 8-GPU simulation running about 39.5 times faster than a single-CPU simulation.
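
    As a sketch of the execution model described in this record, where each MPI process drives either a GPU or a CPU core, the following hedged Python fragment uses mpi4py (an assumption; the authors' solver is CUDA/C++), with the device count, the domain block, and the step function as illustrative stand-ins:

        # Hypothetical rank-to-device assignment; not the authors' code.
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()

        n_gpus = 8                    # GPUs available per node (assumption)
        use_gpu = rank < n_gpus       # first n_gpus ranks each drive a GPU

        def step_block(block, on_gpu):
            """Advance one domain block by one dual time-step iteration."""
            # device-specific kernel dispatch would go here
            return block

        block = {"id": rank}          # each rank owns one domain block
        for iteration in range(100):  # the timed loop, incl. memory copies
            block = step_block(block, use_gpu)
            comm.Barrier()            # keep ranks in lockstep between iterations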

  4. Simulation and optimization of an industrial PSA unit

    Directory of Open Access Journals (Sweden)

    Barg C.

    2000-01-01

    Full Text Available Pressure Swing Adsorption (PSA) units have been used as a low-cost alternative to the usual gas separation processes. Their largest commercial application is in hydrogen purification systems. Several studies have been made of the simulation of pressure swing adsorption units, but there are only few reports on the optimization of such processes. The objective of this study is to simulate and optimize an industrial PSA unit for hydrogen purification. This unit consists of six beds, each of which has three layers of different kinds of adsorbents. The main impurities are methane, carbon monoxide and hydrogen sulfide. The product stream has 99.99% hydrogen purity, and the recovery is around 90%. A mathematical model for a commercial PSA unit is developed. The cycle time and the pressure swing steps are optimized. All the features of complex commercial processes are considered.
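
    The record does not give the bed equations, but the mathematical core of such PSA models is often a linear-driving-force (LDF) uptake law coupled to an adsorption isotherm. A minimal single-bed, single-component Python sketch (all parameter values are illustrative assumptions, not the industrial unit's data):

        import numpy as np
        from scipy.integrate import solve_ivp

        k_ldf = 0.5            # LDF mass-transfer coefficient, 1/s (assumed)
        q_sat, b = 4.0, 0.3    # Langmuir isotherm parameters (assumed)

        def rhs(t, y, p_partial):
            q = y[0]                          # adsorbed-phase loading
            q_star = q_sat * b * p_partial / (1.0 + b * p_partial)
            return [k_ldf * (q_star - q)]     # LDF: dq/dt = k*(q* - q)

        # high-pressure adsorption step, then low-pressure desorption step
        ads = solve_ivp(rhs, (0, 30), [0.0], args=(5.0,), max_step=0.1)
        des = solve_ivp(rhs, (0, 30), [ads.y[0, -1]], args=(0.2,), max_step=0.1)
        print(f"loading after adsorption: {ads.y[0, -1]:.2f}, "
              f"after desorption: {des.y[0, -1]:.2f}")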

  5. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as

  6. Lamb wave propagation modelling and simulation using parallel processing architecture and graphical cards

    International Nuclear Information System (INIS)

    Paćko, P; Bielak, T; Staszewski, W J; Uhl, T; Spencer, A B; Worden, K

    2012-01-01

    This paper demonstrates new parallel computation technology and an implementation for Lamb wave propagation modelling in complex structures. A graphics processing unit (GPU) and the compute unified device architecture (CUDA), available in low-cost graphics cards in standard PCs, are used for Lamb wave propagation numerical simulations. The local interaction simulation approach (LISA) wave propagation algorithm has been implemented as an example. Other algorithms suitable for parallel discretization can also be used in practice. The method is illustrated using examples related to damage detection. The results demonstrate good accuracy and effective computational performance for very large models. The wave propagation modelling presented in the paper can be used in many practical applications of science and engineering. (paper)

  7. Investigation of Mediational Processes Using Parallel Process Latent Growth Curve Modeling

    Science.gov (United States)

    Cheong, JeeWon; MacKinnon, David P.; Khoo, Siek Toon

    2010-01-01

    This study investigated a method to evaluate mediational processes using latent growth curve modeling. The mediator and the outcome measured across multiple time points were viewed as 2 separate parallel processes. The mediational process was defined as the independent variable influencing the growth of the mediator, which, in turn, affected the growth of the outcome. To illustrate modeling procedures, empirical data from a longitudinal drug prevention program, Adolescents Training and Learning to Avoid Steroids, were used. The program effects on the growth of the mediator and the growth of the outcome were examined first in a 2-group structural equation model. The mediational process was then modeled and tested in a parallel process latent growth curve model by relating the prevention program condition, the growth rate factor of the mediator, and the growth rate factor of the outcome. PMID:20157639

  8. Parallel Execution of Functional Mock-up Units in Buildings Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ozmen, Ozgur [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nutaro, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); New, Joshua Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-06-30

    The Functional Mock-up Interface (FMI) defines a standardized interface to be used in computer simulations to develop complex cyber-physical systems. FMI implementation by a software modeling tool enables the creation of a simulation model that can be interconnected, or the creation of a software library called a Functional Mock-up Unit (FMU). This report describes an FMU wrapper implementation that imports FMUs into a C++ environment and uses an Euler solver that executes FMUs in parallel using Open Multi-Processing (OpenMP). The purpose of this report is to elucidate the runtime performance of the solver when a multi-component system is imported as a single FMU (for the whole system) or as multiple FMUs (for different groups of components as sub-systems). This performance comparison is conducted using two test cases: (1) a simple, multi-tank problem; and (2) a more realistic use case based on the Modelica Buildings Library. In both test cases, the performance gains are promising when a large number of states and state events are wrapped in a single FMU. Load balancing is demonstrated to be a critical factor in speeding up the parallel execution of multiple FMUs.
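
    The structure of the solver, one explicit Euler loop stepping several FMU-like components concurrently, can be sketched in Python as follows; threads stand in for the report's OpenMP parallelism (real gains require work outside the interpreter lock), and MockFMU is a stand-in, not a real FMI import:

        from concurrent.futures import ThreadPoolExecutor

        class MockFMU:
            """Stand-in for an imported Functional Mock-up Unit."""
            def __init__(self, state):
                self.state = state
            def derivative(self):            # stand-in for fmi2GetDerivatives
                return -0.1 * self.state
            def euler_step(self, dt):        # advance this unit's states by dt
                self.state += dt * self.derivative()

        fmus = [MockFMU(float(i)) for i in range(8)]
        dt, t_end, t = 0.01, 1.0, 0.0

        with ThreadPoolExecutor(max_workers=4) as pool:
            while t < t_end:
                # all units step concurrently; how states are grouped into
                # FMUs determines the load balance the report highlights
                list(pool.map(lambda f: f.euler_step(dt), fmus))
                t += dt
        print([round(f.state, 3) for f in fmus])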

  9. The quality process as a management tool for public transport operators. The example of the EFQM Model through the franchise bidding process in the United Kingdom

    OpenAIRE

    Jérémy Piraux

    2008-01-01

    The quality process is a fashionable concept in public transport. Operators try to improve service quality and customer satisfaction, while public authorities impose the implementation of new quality processes in franchise contracts. EFQM differs from other quality models because of its global and integrated approach. In the UK, it has become the reference in the railway franchising process. Keolis, established in the UK for 10 years, developed its own EFQM approach. This study brings methodo...

  10. PC based diagnostic system for nitrogen production unit of HWP

    International Nuclear Information System (INIS)

    Lamba, D.S.; Rao, V.C.; Krishnan, S.; Kamaraj, T.; Krishnaswamy, C.

    1992-01-01

    The plant diagnostic system monitors the input data from the local processing unit and tries to diagnose the cause of a failure. The system is a rule-based application program that performs its tasks using a fault tree model, which displays the logical relationships between critical events and their possible ways of occurrence, i.e. hardware failures, process faults, human errors, etc. The Unit 37 Nitrogen Plant is taken as a prototype for trialling the plant diagnostic system. (author). 3 refs., 2 figs
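
    To illustrate the fault-tree idea behind such a rule base, here is a tiny hedged Python sketch; the gate structure and the event names (compressor_failure, valve_misaligned, alarm_missed) are hypothetical, not taken from the Unit 37 plant:

        # AND/OR gates over basic events, evaluated against observations.
        def AND(*children): return lambda ev: all(c(ev) for c in children)
        def OR(*children):  return lambda ev: any(c(ev) for c in children)
        def basic(name):    return lambda ev: ev.get(name, False)

        # top event: loss of nitrogen product (hypothetical logic)
        top = OR(basic("compressor_failure"),
                 AND(basic("valve_misaligned"), basic("alarm_missed")))

        observed = {"valve_misaligned": True, "alarm_missed": True}
        print("top event explained:", top(observed))   # -> True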

  11. Applying Quality Function Deployment Model in Burn Unit Service Improvement.

    Science.gov (United States)

    Keshtkaran, Ali; Hashemi, Neda; Kharazmi, Erfan; Abbasi, Mehdi

    2016-01-01

    Quality function deployment (QFD) is one of the most effective quality design tools. This study applies QFD technique to improve the quality of the burn unit services in Ghotbedin Hospital in Shiraz, Iran. First, the patients' expectations of burn unit services and their priorities were determined through Delphi method. Thereafter, burn unit service specifications were determined through Delphi method. Further, the relationships between the patients' expectations and service specifications and also the relationships between service specifications were determined through an expert group's opinion. Last, the final importance scores of service specifications were calculated through simple additive weighting method. The findings show that burn unit patients have 40 expectations in six different areas. These expectations are in 16 priority levels. Burn units also have 45 service specifications in six different areas. There are four-level relationships between the patients' expectations and service specifications and four-level relationships between service specifications. The most important burn unit service specifications have been identified in this study. The QFD model developed in the study can be a general guideline for QFD planners and executives.
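
    The final scoring step named in the abstract, simple additive weighting, reduces to a weighted sum over the relationship matrix; a small Python sketch with made-up placeholder numbers rather than the study's Delphi data:

        import numpy as np

        # rows: service specifications; columns: patient expectations
        relationship = np.array([[9.0, 3.0, 1.0],
                                 [3.0, 9.0, 3.0],
                                 [1.0, 1.0, 9.0]])
        expectation_weight = np.array([0.5, 0.3, 0.2])  # from Delphi priorities

        raw_score = relationship @ expectation_weight   # weighted sum per spec
        importance = raw_score / raw_score.sum()        # normalized importance
        for i, s in enumerate(importance):
            print(f"specification {i + 1}: final importance {s:.2f}")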

  12. An Analysis of OpenACC Programming Model: Image Processing Algorithms as a Case Study

    Directory of Open Access Journals (Sweden)

    M. J. Mišić

    2014-06-01

    Full Text Available Graphics processing units and similar accelerators have been intensively used in general-purpose computations for several years. In the last decade, GPU architecture and organization changed dramatically to support an ever-increasing demand for computing power. Along with changes in hardware, novel programming models have been proposed, such as NVIDIA’s Compute Unified Device Architecture (CUDA) and the Open Computing Language (OpenCL) by the Khronos group. Although numerous commercial and scientific applications have been developed using these two models, they still pose a significant challenge for less experienced users. There are users from various scientific and engineering communities who would like to speed up their applications without the need to deeply understand a low-level programming model and the underlying hardware. In 2011, the OpenACC programming model was launched. Much like OpenMP for multicore processors, OpenACC is a high-level, directive-based programming model for manycore processors like GPUs. This paper presents an analysis of the OpenACC programming model and its applicability in typical domains like image processing. Three simple image processing algorithms have been implemented for execution on the GPU with OpenACC. The results were compared with those of their sequential counterparts and are briefly discussed.

  13. Behavioral conformance of artifact-centric process models

    NARCIS (Netherlands)

    Fahland, D.; Leoni, de M.; Dongen, van B.F.; Aalst, van der W.M.P.; Abramowicz, W.

    2011-01-01

    The use of process models in business information systems for analysis, execution, and improvement of processes assumes that the models describe reality. Conformance checking is a technique to validate how good a given process model describes recorded executions of the actual process. Recently,

  14. Future evolution of the Fast TracKer (FTK) processing unit

    CERN Document Server

    Gentsos, C; The ATLAS collaboration; Giannetti, P; Magalotti, D; Nikolaidis, S

    2014-01-01

    The Fast Tracker (FTK) processor [1] for the ATLAS experiment has a computing core made of 128 Processing Units that reconstruct tracks in the silicon detector in a ~100 μs deep pipeline. The track parameter resolution provided by FTK enables the HLT trigger to efficiently identify and reconstruct significant samples of fermionic Higgs decays. Data processing speed is achieved with custom VLSI pattern recognition, linearized track fitting executed inside modern FPGAs, pipelining, and parallel processing. One large FPGA executes full-resolution track fitting inside low-resolution candidate tracks found by a set of 16 custom ASIC devices, called Associative Memories (AM chips) [2]. The FTK dual structure, based on the cooperation of dedicated VLSI AMs and programmable FPGAs, is maintained to achieve further technology performance, miniaturization and integration of the current state-of-the-art prototypes. This makes it possible to fully exploit new applications within and outside the High Energy Physics field. We plan t...

  15. Cost Models for MMC Manufacturing Processes

    Science.gov (United States)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory, and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  16. Developing a Steady-state Kinetic Model for Industrial Scale Semi-Regenerative Catalytic Naphtha Reforming Process

    Directory of Open Access Journals (Sweden)

    Seif Mohaddecy, R.

    2014-05-01

    Full Text Available Due to the demand for high-octane gasoline as a transportation fuel, the catalytic naphtha reformer has become one of the most important processes in petroleum refineries. In this research, steady-state modelling of a catalytic fixed-bed naphtha reforming process to predict the key output variables was studied. These variables were the octane number, yield, hydrogen purity, and the temperatures of all reforming reactors. To this end, an industrial-scale semi-regenerative catalytic naphtha reforming unit was studied and modelled. In addition, to evaluate the developed model, the predicted variables, i.e. the outlet temperatures of the reactors, research octane number, gasoline yield and hydrogen purity, were compared against actual data. The results showed a close mapping between the actual and predicted variables; the mean relative absolute deviations of the mentioned process variables were 0.38 %, 0.52 %, 0.54 %, 0.32 %, 4.8 % and 3.2 %, respectively.
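
    For reference, the comparison metric quoted above, the mean relative absolute deviation, can be written in a few lines (the arrays below are dummies, not plant data):

        import numpy as np

        def mrad(actual, predicted):
            """Mean relative absolute deviation, in percent."""
            actual, predicted = np.asarray(actual), np.asarray(predicted)
            return 100.0 * np.mean(np.abs(predicted - actual) / np.abs(actual))

        print(f"{mrad([495.0, 500.0], [497.0, 498.5]):.2f} %")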

  17. Simulation of a tubular solid oxide fuel cell stack using AspenPlusTM unit operation models

    International Nuclear Information System (INIS)

    Zhang, W.; Croiset, E.; Douglas, P.L.; Fowler, M.W.; Entchev, E.

    2005-01-01

    The design of a fuel cell system involves both optimization of the fuel cell stack and the balance of plant with respect to efficiency and economics. Many commercially available process simulators, such as AspenPlus™, can facilitate the analysis of a solid oxide fuel cell (SOFC) system. A SOFC system may include fuel pre-processors, heat exchangers, turbines, bottoming cycles, etc., all of which can be very effectively modelled in process simulation software. The current challenge is that AspenPlus™ and other commercial process simulators do not have a model of a basic SOFC stack. Therefore, to enable SOFC system simulation using one of these simulators, one must construct an SOFC stack model that can be implemented in them. The most common approach is to first develop a complete SOFC model in a programming language, such as Fortran, Visual Basic or C++, and then link it to a commercial process simulator as a user-defined model or subroutine. This paper introduces a different approach to the development of a SOFC model by utilizing existing AspenPlus™ functions and existing unit operation modules. The developed "AspenPlus™ SOFC" model is able to provide detailed thermodynamic and parametric analyses of the SOFC operation and can easily be extended to study the entire power plant consisting of the SOFC and the balance of plant, without the requirement for linking with other software. Validation of this model is performed by comparison to a Siemens-Westinghouse 100 kW class tubular SOFC stack. Sensitivity analyses of major operating parameters, such as the utilization factor (Uf), current density (Ic) and steam-to-carbon ratio (S/C), were performed using the developed model, and the results are discussed in this paper.

  18. LED Lighting System Reliability Modeling and Inference via Random Effects Gamma Process and Copula Function

    Directory of Open Access Journals (Sweden)

    Huibing Hao

    2015-01-01

    Full Text Available Light emitting diode (LED) lamps have attracted increasing interest in the field of lighting systems due to their low energy consumption and long lifetime. For its different functions (i.e., illumination and color), a lamp may have two or more performance characteristics. When the multiple performance characteristics are dependent, accurately analyzing the system reliability becomes a challenging problem. In this paper, we assume that the system has two performance characteristics, and each performance characteristic is governed by a random effects Gamma process, where the random effects capture the unit-to-unit differences. The dependency of the performance characteristics is described by a Frank copula function. Via the copula function, a reliability assessment model is proposed. Since the model is complicated and analytically intractable, the Markov chain Monte Carlo (MCMC) method is used to estimate the unknown parameters. A numerical example based on actual LED lamp data is given to demonstrate the usefulness and validity of the proposed model and method.
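
    A hedged numerical sketch of the model's two ingredients (unit-specific random-effects Gamma-process increments tied together by a Frank copula), using conditional inversion to sample the copula; all parameter values are illustrative, and the paper's actual inference is via MCMC:

        import numpy as np
        from scipy.stats import gamma

        rng = np.random.default_rng(0)
        theta = 4.0                  # Frank copula dependence (assumed)

        def frank_pair(u, p, theta):
            """Given u ~ U(0,1) and p ~ U(0,1), return v with (u, v) ~ Frank."""
            a = p * (np.exp(-theta) - 1.0) / (np.exp(-theta * u) * (1.0 - p) + p)
            return -np.log1p(a) / theta

        dt, n_steps = 1.0, 50
        alpha = rng.gamma(2.0, 0.5, size=2)  # random effects: unit-specific rates
        u = rng.uniform(size=n_steps)
        v = frank_pair(u, rng.uniform(size=n_steps), theta)

        # dependent Gamma-process increments for the two characteristics
        inc1 = gamma.ppf(u, a=alpha[0] * dt, scale=0.1)
        inc2 = gamma.ppf(v, a=alpha[1] * dt, scale=0.1)
        path1, path2 = inc1.cumsum(), inc2.cumsum()  # monotone degradation paths
        print(f"final degradation: {path1[-1]:.2f}, {path2[-1]:.2f}")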

  19. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia

    2017-01-01

    Full Text Available Taking into account the highly diverse tasks which employees have to fulfil due to the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products which meet market expectations. Therefore, organizations encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of the aforementioned process, which allows enterprises to develop a specific methodology. In order to design the conceptual model, Business Process Modelling instruments were employed - the Adonis Community Edition Business Process Management Toolkit using the ADONIS BPMS notation. The conceptual model was developed based on in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended according to the enterprise's requirements when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling as they enable process analysis and evaluation (predefined/specific queries) and also model optimization (simulations).

  20. Water Use in the United States Energy System: A National Assessment and Unit Process Inventory of Water Consumption and Withdrawals.

    Science.gov (United States)

    Grubert, Emily; Sanders, Kelly T

    2018-06-05

    The United States (US) energy system is a large water user, but the nature of that use is poorly understood. To support resource comanagement and fill this noted gap in the literature, this work presents detailed estimates for US-based water consumption and withdrawals for the US energy system as of 2014, including both intensity values and the first known estimate of total water consumption and withdrawal by the US energy system. We address 126 unit processes, many of which are new additions to the literature, differentiated among 17 fuel cycles, five life cycle stages, three water source categories, and four levels of water quality. Overall coverage is about 99% of commercially traded US primary energy consumption with detailed energy flows by unit process. Energy-related water consumption, or water removed from its source and not directly returned, accounts for about 10% of both total and freshwater US water consumption. Major consumers include biofuels (via irrigation), oil (via deep well injection, usually of nonfreshwater), and hydropower (via evaporation and seepage). The US energy system also accounts for about 40% of both total and freshwater US water withdrawals, i.e., water removed from its source regardless of fate. About 70% of withdrawals are associated with the once-through cooling systems of approximately 300 steam cycle power plants that produce about 25% of US electricity.

  1. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models.
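
    To give a flavor of the practice described, ordinary unit tests plus "model validation tests" that assert scientific plausibility, here is a hypothetical pytest-style example (not OpenWorm code; the simulator stub and the bounds are assumptions):

        def simulate_membrane_potential():
            """Stand-in for a neuron model run; returns potentials in mV."""
            return [-65.0, -64.2, -30.5, 10.0, -70.1]

        def test_resting_potential_plausible():
            trace = simulate_membrane_potential()
            assert -90.0 <= trace[0] <= -40.0       # plausible resting range

        def test_potential_bounded():
            trace = simulate_membrane_potential()
            assert all(-100.0 < v < 60.0 for v in trace)  # physiological bounds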

  2. Modeling Dynamic Systems with Efficient Ensembles of Process-Based Models.

    Directory of Open Access Journals (Sweden)

    Nikola Simidjievski

    Full Text Available Ensembles are a well-established machine learning paradigm, leading to accurate and robust models, predominantly applied to predictive modeling tasks. Ensemble models comprise a finite set of diverse predictive models whose combined output is expected to yield an improved predictive performance as compared to an individual model. In this paper, we propose a new method for learning ensembles of process-based models of dynamic systems. The process-based modeling paradigm employs domain-specific knowledge to automatically learn models of dynamic systems from time-series observational data. Previous work has shown that ensembles based on sampling observational data (i.e., bagging and boosting) significantly improve the predictive performance of process-based models. However, this improvement comes at the cost of a substantial increase in the computational time needed for learning. To address this problem, the paper proposes a method that aims at efficiently learning ensembles of process-based models, while maintaining their accurate long-term predictive performance. This is achieved by constructing ensembles by sampling domain-specific knowledge instead of sampling data. We apply the proposed method to, and evaluate its performance on, a set of problems of automated predictive modeling in three lake ecosystems, using a library of process-based knowledge for modeling population dynamics. The experimental results identify the optimal design decisions regarding the learning algorithm. The results also show that the proposed ensembles yield significantly more accurate predictions of population dynamics as compared to individual process-based models. Finally, while their predictive performance is comparable to that of ensembles obtained with the state-of-the-art methods of bagging and boosting, they are substantially more efficient.

  3. Application of Prognostic Mesoscale Modeling in the Southeast United States

    International Nuclear Information System (INIS)

    Buckley, R.L.

    1999-01-01

    A prognostic model is being used to provide regional forecasts for a variety of applications at the Savannah River Site (SRS). Emergency response dispersion models available at SRS use the space and time-dependent meteorological data provided by this model to supplement local and regional observations. Output from the model is also used locally to aid in forecasting at SRS, and regionally in providing forecasts of the potential time and location of hurricane landfall within the southeast United States

  4. Extended state observer based fuzzy model predictive control for ultra-supercritical boiler-turbine unit

    International Nuclear Information System (INIS)

    Zhang, Fan; Wu, Xiao; Shen, Jiong

    2017-01-01

    Highlights: • A novel ESOFMPC is proposed based on the combination of an ESO and stable MPC. • The improved ESO can overcome unknown disturbances on any channel of a MIMO system. • Nonlinearity and disturbances of the boiler-turbine unit can be handled simultaneously. - Abstract: The regulation of ultra-supercritical (USC) boiler-turbine units in large-scale power plants is vulnerable to various unknown disturbances; meanwhile, the internal nonlinearity makes wide-range load tracking a challenging task. To overcome these two issues simultaneously, an extended state observer based fuzzy model predictive control is proposed for the USC boiler-turbine unit. Firstly, a fuzzy model of a 1000-MW coal-fired USC boiler-turbine unit is established through nonlinearity analysis. Then a fuzzy stable model predictive controller is devised on the fuzzy model using an output cost function for the purpose of wide-range load tracking. An improved linear extended state observer, which can estimate plant behavior variations and unknown disturbances regardless of the direct feedthrough characteristic of the system, is synthesized with the predictive controller to enhance its disturbance rejection property. Closed-loop stability of the overall control system is guaranteed. Simulation results on a 1000-MW USC boiler-turbine unit model demonstrate the effectiveness of the proposed approach.
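
    The building block being extended here, a linear extended state observer, can be sketched for a first-order plant in a few lines of Python; the plant, the gains, and the disturbance are illustrative assumptions, not the boiler-turbine model:

        import numpy as np

        a, b, dt = -0.5, 1.0, 0.01     # plant: dx/dt = a*x + b*u + d
        l1, l2 = 40.0, 400.0           # observer gains (assumed)

        x, d = 0.0, 0.8                # true state and unknown disturbance
        z = np.zeros(2)                # z[0] estimates x, z[1] estimates d

        for k in range(2000):
            u = 1.0                    # constant input for the demo
            x += dt * (a * x + b * u + d)          # plant step
            e = x - z[0]                           # innovation (y = x)
            z[0] += dt * (a * z[0] + b * u + z[1] + l1 * e)
            z[1] += dt * (l2 * e)                  # disturbance estimate
        print(f"disturbance estimate {z[1]:.3f} vs true {d}")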

  5. Information-Processing Models and Curriculum Design

    Science.gov (United States)

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  6. Use of a tangential filtration unit for processing liquid waste from nuclear laundries

    International Nuclear Information System (INIS)

    Augustin, X.; Buzonniere, A. de; Barnier, H.

    1993-01-01

    Nuclear laundries produce large quantities of weakly contaminated effluents charged with insoluble and soluble products. In collaboration with the CEA, TECHNICATOME has developed an ultrafiltration process for liquid waste from nuclear laundries, associated with prior insolubilization of the radiochemical activity. This process, 'seeded ultrafiltration', is based on the use of decloggable mineral filter media and combines very high separation efficiency with long membrane life. The efficiency of the tangential filtration unit, which has been processing effluents from the nuclear laundry of the Cadarache Nuclear Research Center (CEA, France) since mid-1988, has been confirmed on several sites.

  7. Optimization model of a system of crude oil distillation units with heat integration and metamodeling

    International Nuclear Information System (INIS)

    Lopez, Diana C; Mahecha, Cesar A; Hoyos, Luis J; Acevedo, Leonardo; Villamizar Jaime F

    2010-01-01

    The process of crude distillation impacts the economy of any refinery in a considerable manner. It is therefore necessary to improve it, taking good advantage of the available infrastructure and generating products that conform to specifications without violating the equipment operating constraints or plant restrictions at the industrial units. The objective of this paper is to present the development of an optimization model for a Crude Distillation Unit (CDU) system at an ECOPETROL S.A. refinery in Barrancabermeja, involving the typical restrictions (flows according to pipeline capacity, pumps, distillation columns, etc.) and a restriction that has not been included in bibliographic reports for this type of model: the heat integration of streams from the Atmospheric Distillation Towers (ADTs) and Vacuum Distillation Tower (VDT) with the heat exchanger networks for crude pre-heating. The ADTs were modeled with metamodels as functions of column temperatures and pressures, pumparound flows and return temperatures, stripping steam flows, Jet EBP ASTM D-86 and Diesel EBP ASTM D-86. The pre-heating trains were modeled with mass and energy balances and the design equation of each heat exchanger. The optimization model is an NLP that maximizes the system profit. This model was implemented in the GAMS IDE 22.2 using the CONOPT solver, and it found new operating points with better economic results than those obtained with normal operation of the real plants. It predicted optimum operating conditions for 3 ADTs for a crude of constant composition and calculated the yields and properties of the atmospheric products, in addition to the temperatures and duties of 27 crude oil exchangers.

  8. A photoactivated artificial muscle model unit: reversible, photoinduced sliding of nanosheets.

    Science.gov (United States)

    Nabetani, Yu; Takamura, Hazuki; Hayasaka, Yuika; Shimada, Tetsuya; Takagi, Shinsuke; Tachibana, Hiroshi; Masui, Dai; Tong, Zhiwei; Inoue, Haruo

    2011-11-02

    A novel photoactivated artificial muscle model unit is reported. Here we show that organic/inorganic hybrid nanosheets reversibly slide horizontally on a giant scale and the interlayer spaces in the layered hybrid structure shrink and expand vertically by photoirradiation. The sliding movement of the system on a giant scale is the first example of an artificial muscle model unit having much similarity with that in natural muscle fibrils. In particular, our layered hybrid molecular system exhibits a macroscopic morphological change on a giant scale (~1500 nm) relative to the molecular size of ~1 nm by means of a reversible sliding mechanism.

  9. Point process-based modeling of multiple debris flow landslides using INLA: an application to the 2009 Messina disaster

    KAUST Repository

    Lombardo, Luigi

    2018-02-13

    We develop a stochastic modeling approach based on spatial point processes of log-Gaussian Cox type for a collection of around 5000 landslide events provoked by a precipitation trigger in Sicily, Italy. Through the embedding into a hierarchical Bayesian estimation framework, we can use the integrated nested Laplace approximation methodology to make inference and obtain the posterior estimates of spatially distributed covariate and random effects. Several mapping units are useful to partition a given study area in landslide prediction studies. These units hierarchically subdivide the geographic space from the highest grid-based resolution to the stronger morphodynamic-oriented slope units. Here we integrate both mapping units into a single hierarchical model, by treating the landslide triggering locations as a random point pattern. This approach diverges fundamentally from the unanimously used presence–absence structure for areal units since we focus on modeling the expected landslide count jointly within the two mapping units. Predicting this landslide intensity provides more detailed and complete information as compared to the classically used susceptibility mapping approach based on relative probabilities. To illustrate the model’s versatility, we compute absolute probability maps of landslide occurrences and check their predictive power over space. While the landslide community typically produces spatial predictive models for landslides only in the sense that covariates are spatially distributed, no actual spatial dependence has been explicitly integrated so far. Our novel approach features a spatial latent effect defined at the slope unit level, allowing us to assess the spatial influence that remains unexplained by the covariates in the model. For rainfall-induced landslides in regions where the raingauge network is not sufficient to capture the spatial distribution of the triggering precipitation event, this latent effect provides valuable imaging support

  10. Measuring similarity between business process models

    NARCIS (Netherlands)

    Dongen, van B.F.; Dijkman, R.M.; Mendling, J.

    2007-01-01

    Quality aspects become increasingly important when business process modeling is used in a large-scale enterprise setting. In order to facilitate a storage without redundancy and an efficient retrieval of relevant process models in model databases it is required to develop a theoretical understanding

  11. The ‘hit’ phenomenon: a mathematical model of human dynamics interactions as a stochastic process

    Science.gov (United States)

    Ishii, Akira; Arakaki, Hisashi; Matsuda, Naoya; Umemura, Sanae; Urushidani, Tamiko; Yamagata, Naoya; Yoshida, Narihiko

    2012-06-01

    A mathematical model for the ‘hit’ phenomenon in entertainment within a society is presented as a stochastic process of human dynamics interactions. The model uses only the advertisement budget time distribution as an input, and word-of-mouth (WOM), represented by posts on social network systems, is used as data to make a comparison with the calculated results. The unit of time is days. The WOM distribution in time is found to be very close to the revenue distribution in time. Calculations for the Japanese motion picture market based on the mathematical model agree well with the actual revenue distribution in time.

  12. Model medication management process in Australian nursing homes using business process modeling.

    Science.gov (United States)

    Qian, Siyu; Yu, Ping

    2013-01-01

    One of the reasons for end-user avoidance or rejection of health information systems is poor alignment of the system with the healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step in the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high risk" medicines by older people in nursing homes has the potential to increase medication error rates. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviews with two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, the major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  13. A multi-level simulation platform of natural gas internal reforming solid oxide fuel cell-gas turbine hybrid generation system - Part II. Balancing units model library and system simulation

    Science.gov (United States)

    Bao, Cheng; Cai, Ningsheng; Croiset, Eric

    2011-10-01

    Following our integrated hierarchical modeling framework for the natural gas internal reforming solid oxide fuel cell (IRSOFC), this paper first introduces the model libraries of the main balancing units, including some state-of-the-art achievements and our specific work. Based on gPROMS programming code, flexible configuration and modular design are fully realized by specifying graphically all unit models in each level. Via comparison with the steady-state experimental data of the Siemens-Westinghouse demonstration system, the in-house multi-level SOFC-gas turbine (GT) simulation platform is validated to be more accurate than the advanced power system analysis tool (APSAT). Moreover, some units of the demonstration system are designed in reverse for the analysis of a typical part-load transient process. The framework of distributed and dynamic modeling of most of the units is significant for the development of control strategies in the future.

  14. Miniaturized Power Processing Unit Study: A Cubesat Electric Propulsion Technology Enabler Project

    Science.gov (United States)

    Ghassemieh, Shakib M.

    2014-01-01

    This study evaluates High Voltage Power Processing Unit (PPU) technology and driving requirements necessary to enable the Microfluidic Electric Propulsion technology research and development by NASA and university partners. This study provides an overview of the state of the art PPU technology with recommendations for technology demonstration projects and missions for NASA to pursue.

  15. Graphics Processing Unit Accelerated Hirsch-Fye Quantum Monte Carlo

    Science.gov (United States)

    Moore, Conrad; Abu Asal, Sameer; Rajagoplan, Kaushik; Poliakoff, David; Caprino, Joseph; Tomko, Karen; Thakur, Bhupender; Yang, Shuxiang; Moreno, Juana; Jarrell, Mark

    2012-02-01

    In Dynamical Mean Field Theory and its cluster extensions, such as the Dynamic Cluster Algorithm, the bottleneck of the algorithm is solving the self-consistency equations with an impurity solver. Hirsch-Fye Quantum Monte Carlo is one of the most commonly used impurity and cluster solvers. This work implements optimizations of the algorithm, such as enabling large data re-use, suitable for the Graphics Processing Unit (GPU) architecture. The GPU's sheer number of concurrent parallel computations and large bandwidth to many shared memories takes advantage of the inherent parallelism in the Green function update and measurement routines, and can substantially improve the efficiency of the Hirsch-Fye impurity solver.
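
    The Green function update that dominates this algorithm has a rank-one structure; as a hedged illustration of that structure, here is a generic Sherman-Morrison update in numpy (not the Hirsch-Fye-specific formula), which is exactly the kind of dense linear algebra that maps well onto GPUs:

        import numpy as np

        rng = np.random.default_rng(2)
        n = 64
        A = rng.random((n, n)) + n * np.eye(n)   # well-conditioned test matrix
        Ainv = np.linalg.inv(A)
        u, v = rng.random(n), rng.random(n)

        # (A + u v^T)^{-1} = A^{-1} - (A^{-1} u)(v^T A^{-1}) / (1 + v^T A^{-1} u)
        Au = Ainv @ u
        vA = v @ Ainv
        Ainv_new = Ainv - np.outer(Au, vA) / (1.0 + v @ Au)

        print(np.allclose(Ainv_new, np.linalg.inv(A + np.outer(u, v))))  # True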

  16. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in organization parameters and user requirements. The approach uses a timed Petri net and an object-oriented, top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  17. Product unit neural network models for predicting the growth limits of Listeria monocytogenes.

    Science.gov (United States)

    Valero, A; Hervás, C; García-Gimeno, R M; Zurera, G

    2007-08-01

    A new approach to predicting the growth/no growth interface of Listeria monocytogenes as a function of storage temperature, pH, citric acid (CA) and ascorbic acid (AA) is presented. A linear logistic regression procedure was performed, and a non-linear model was obtained by adding new variables by means of a neural network model based on product units (PUNN). The classification efficiency on the training data set and on the generalization data of the new logistic regression PUNN model (LRPU) was compared with that of linear logistic regression (LLR) and polynomial logistic regression (PLR) models. 92% of the total cases from the LRPU model were correctly classified, an improvement on the percentage obtained using the PLR model (90%) and significantly higher than the results obtained with the LLR model (80%). On the other hand, the predictions of the LRPU model were closer to the observed data, which makes it possible to design proper formulations in minimally processed foods. This novel methodology can be applied in predictive microbiology to describe the growth/no growth interface of food-borne microorganisms such as L. monocytogenes. The optimal balance lies in finding models with an acceptable interpretation capacity and a good ability to fit the data on the boundaries of the variable range. The results obtained lead to the conclusion that these kinds of models might well be a very valuable tool for mathematical modeling.
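
    The structural idea of the product-unit terms added to the logistic model can be sketched as follows; the weights and exponents below are arbitrary placeholders, not the fitted LRPU model:

        import numpy as np

        def product_unit_logit(x, w0, w, P):
            """sigmoid(w0 + sum_k w[k] * prod_j x[j] ** P[k, j])."""
            units = np.prod(x[None, :] ** P, axis=1)  # one product unit per row
            z = w0 + w @ units
            return 1.0 / (1.0 + np.exp(-z))

        x = np.array([0.8, 1.2, 0.5])        # scaled T, pH, acid levels (> 0)
        P = np.array([[1.0, 2.0, 0.0],
                      [0.5, 0.0, 1.5]])      # learned exponents (dummy values)
        w0, w = -0.2, np.array([1.1, -0.7])
        print(f"P(growth) = {product_unit_logit(x, w0, w, P):.3f}")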

  18. Social software for business process modeling

    NARCIS (Netherlands)

    Koschmider, A.; Song, M.S.; Reijers, H.A.

    2010-01-01

    Formal models of business processes are used for a variety of purposes. But where the elicitation of the characteristics of a business process usually takes place in a collaborative fashion, the building of the final, formal process model is done mostly by a single person. This article presents the

  19. Object-Oriented Approach to Modeling Units of Pneumatic Systems

    Directory of Open Access Journals (Sweden)

    Yu. V. Kyurdzhiev

    2014-01-01

    Full Text Available The article shows the relevance of object-oriented programming approaches when modeling pneumatic units (PUs). Based on the analysis of the calculation schemes of pneumatic system aggregates, two basic objects, namely a flow cavity and a material point, were identified. The basic interactions of the objects are defined. Cavity-cavity interaction: exchange of matter and energy through mass flows. Cavity-point interaction: force interaction and exchange of energy in the form of work. Point-point interaction: force interaction, elastic interaction, inelastic interaction, and intervals of displacement. The authors have developed mathematical models of the basic objects and interactions, and the models and interactions of the elements are implemented with object-oriented programming. The mathematical models of the elements of the PU calculation scheme are implemented in classes derived from the base classes. These classes implement models of a flow cavity, piston, diaphragm, short channel, diaphragm opened by a given law, spring, bellows, elastic collision, inelastic collision, friction, PU stages with limited movement, etc. Numerical integration of the differential equations of the mathematical models is based on the fourth-order Runge-Kutta method. On request, each class performs one tact of integration, i.e. the calculation of one coefficient of the method. The paper presents an integration algorithm for the system of differential equations: all objects of the PU calculation scheme are placed in a unidirectional list, an iterator loop initiates the integration tact of all the objects in the list, and every fourth iteration advances to the next integration step. The calculation stops when any object raises a shutdown flag. The proposed approach was tested in the calculation of a number of PU designs. Compared with traditional approaches to modeling, the proposed method offers easy extension, code reuse, and high reliability
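
    The integration scheme just described, where every object performs one "tact" (one Runge-Kutta coefficient) per pass over the list and every fourth pass completes an RK4 step, can be sketched in Python, simplified to one scalar state per object and placeholder dynamics:

        class Element:
            def __init__(self, y0):
                self.y, self.y0, self.k = y0, y0, []
            def deriv(self, y):
                return -y                    # placeholder dynamics
            def tact(self, h, stage):        # one RK4 coefficient per call
                offs = [0.0, 0.5, 0.5, 1.0][stage]
                base = self.y0 + offs * h * (self.k[-1] if self.k else 0.0)
                self.k.append(self.deriv(base))
                if stage == 3:               # fourth tact: combine the stages
                    k1, k2, k3, k4 = self.k
                    self.y0 += h / 6.0 * (k1 + 2*k2 + 2*k3 + k4)
                    self.y, self.k = self.y0, []

        elements = [Element(1.0), Element(2.0)]   # the object list
        h = 0.1
        for step in range(10):
            for stage in range(4):                # iterator loop over tacts
                for e in elements:
                    e.tact(h, stage)
        print([round(e.y, 4) for e in elements])  # ~ y0 * exp(-1.0)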

  20. MASSIVELY PARALLEL LATENT SEMANTIC ANALYSES USING A GRAPHICS PROCESSING UNIT

    Energy Technology Data Exchange (ETDEWEB)

    Cavanagh, J.; Cui, S.

    2009-01-01

    Latent Semantic Analysis (LSA) aims to reduce the dimensions of large term-document datasets using Singular Value Decomposition. However, with the ever-expanding size of datasets, current implementations are not fast enough to quickly and easily compute the results on a standard PC. A graphics processing unit (GPU) can solve some highly parallel problems much faster than a traditional sequential processor or central processing unit (CPU). Thus, a deployable system using a GPU to speed up large-scale LSA processes would be a much more effective choice (in terms of cost/performance ratio) than using a PC cluster. Due to the GPU's application-specific architecture, harnessing the GPU's computational prowess for LSA is a great challenge. We presented a parallel LSA implementation on the GPU, using NVIDIA® Compute Unified Device Architecture and Compute Unified Basic Linear Algebra Subprograms software. The performance of this implementation is compared to a traditional LSA implementation on a CPU using an optimized Basic Linear Algebra Subprograms library. After implementation, we discovered that the GPU version of the algorithm was twice as fast for large matrices (1000x1000 and above) that had dimensions not divisible by 16. For large matrices that did have dimensions divisible by 16, the GPU algorithm ran five to six times faster than the CPU version. The large variation is due to architectural benefits of the GPU for matrices divisible by 16. It should be noted that the speed of the CPU version did not vary appreciably when the matrix dimensions were divisible by 16. Further research is needed in order to produce a fully implementable version of LSA. With that in mind, the research we presented shows that the GPU is a viable option for increasing the speed of LSA, in terms of cost/performance ratio.
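
    The core computation being accelerated is a truncated SVD of the term-document matrix; a CPU-only numpy sketch of that step (dummy data; a GPU version would swap in CUDA-backed linear algebra):

        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.random((1000, 200))           # term-document matrix (dummy)

        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        k = 50                                # latent dimensions to keep
        A_k = U[:, :k] * s[:k] @ Vt[:k, :]    # rank-k reconstruction

        docs_k = (s[:k, None] * Vt[:k, :]).T  # documents in latent space
        print(A_k.shape, docs_k.shape)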

  1. Adsorption thermal energy storage for cogeneration in industrial batch processes: Experiment, dynamic modeling and system analysis

    International Nuclear Information System (INIS)

    Schreiber, Heike; Graf, Stefan; Lanzerath, Franz; Bardow, André

    2015-01-01

    Adsorption thermal energy storage is investigated for heat supply with cogeneration in industrial batch processes. The feasibility of adsorption thermal energy storage is demonstrated with a lab-scale prototype. Based on these experiments, a dynamic model is developed and successfully calibrated to measurement data. Thereby, a reliable description of the dynamic behavior of the adsorption thermal energy storage unit is achieved. The model is used to study and benchmark the performance of adsorption thermal energy storage combined with cogeneration for batch process energy supply. As benchmark, we consider both a peak boiler and latent thermal energy storage based on a phase change material. Beer brewing is considered as an example of an industrial batch process. The study shows that adsorption thermal energy storage has the potential to increase energy efficiency significantly; primary energy consumption can be reduced by up to 25%. However, successful integration of adsorption thermal storage requires appropriate integration of low grade heat: Preferentially, low grade heat is available at times of discharging and in demand when charging the storage unit. Thus, adsorption thermal energy storage is most beneficial if applied to a batch process with heat demands on several temperature levels. - Highlights: • A highly efficient energy supply for industrial batch processes is presented. • Adsorption thermal energy storage (TES) is analyzed in experiment and simulation. • Adsorption TES can outperform both peak boilers and latent TES. • Performance of adsorption TES strongly depends on low grade heat temperature.

  2. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubininа

    2015-06-01

    Full Text Available The essence of process-oriented enterprise management is examined in the article. The content and types of information technologies are analyzed in view of the complexity and differentiation of existing methods, as well as the specific language and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and modern modeling techniques that have found practical application in visualizing retailers' activity are studied. The theoretical analysis of the modeling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for the structural and object analysis of retailers' business processes thanks to its integrated systemological capabilities. A visual simulation model of the retailers' business process "sales" ("as is") was designed using a combination of UFO elements, with the aim of further practical formalization and optimization of the given business process.

  3. Accelerating image reconstruction in three-dimensional optoacoustic tomography on graphics processing units.

    Science.gov (United States)

    Wang, Kun; Huang, Chao; Kao, Yu-Jiun; Chou, Cheng-Ying; Oraevsky, Alexander A; Anastasio, Mark A

    2013-02-01

    Optoacoustic tomography (OAT) is inherently a three-dimensional (3D) inverse problem. However, most studies of OAT image reconstruction still employ two-dimensional imaging models. One important reason is because 3D image reconstruction is computationally burdensome. The aim of this work is to accelerate existing image reconstruction algorithms for 3D OAT by use of parallel programming techniques. Parallelization strategies are proposed to accelerate a filtered backprojection (FBP) algorithm and two different pairs of projection/backprojection operations that correspond to two different numerical imaging models. The algorithms are designed to fully exploit the parallel computing power of graphics processing units (GPUs). In order to evaluate the parallelization strategies for the projection/backprojection pairs, an iterative image reconstruction algorithm is implemented. Computer simulation and experimental studies are conducted to investigate the computational efficiency and numerical accuracy of the developed algorithms. The GPU implementations improve the computational efficiency by factors of 1000, 125, and 250 for the FBP algorithm and the two pairs of projection/backprojection operators, respectively. Accurate images are reconstructed by use of the FBP and iterative image reconstruction algorithms from both computer-simulated and experimental data. Parallelization strategies for 3D OAT image reconstruction are proposed for the first time. These GPU-based implementations significantly reduce the computational time for 3D image reconstruction, complementing our earlier work on 3D OAT iterative image reconstruction.

  4. An unit commitment model for hydrothermal systems; Um modelo de unit commitment para sistemas hidrotermicos

    Energy Technology Data Exchange (ETDEWEB)

    Franca, Thiago de Paula; Luciano, Edson Jose Rezende; Nepomuceno, Leonardo [Universidade Estadual Paulista (UNESP), Bauru, SP (Brazil). Dept. de Engenharia Eletrica], Emails: ra611191@feb.unesp.br, edson.joserl@uol.com.br, leo@feb.unesp.br

    2009-07-01

    A unit commitment model for hydrothermal systems that includes generator start-up/shut-down costs is proposed. These costs have been neglected in a good part of the scheduling models for the operation of hydrothermal systems (pre-dispatch). The impact of representing these costs on total production costs is evaluated. The proposed model is solved by a hybrid methodology, which involves the use of genetic algorithms (to solve the integer part of the problem) and sequential quadratic programming methods. This methodology is applied to the solution of an IEEE test system. The results emphasize the importance of representing start-up/shut-down costs in the generation schedule.
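
    Why start/stop costs change the schedule can be seen in a toy brute-force commitment of two units over four periods (all numbers made up; the paper itself uses a genetic algorithm plus sequential quadratic programming on a hydrothermal test system):

        from itertools import product

        demand = [100, 180, 180, 100]        # MW per period (made-up)
        cap = [120, 120]
        run_cost = [20.0, 28.0]              # $/MWh while committed
        start_cost = [500.0, 50.0]           # $ per start-up

        def schedule_cost(s):
            cost = 0.0
            for t, d in enumerate(demand):
                remaining = d
                for g in sorted(range(2), key=run_cost.__getitem__):
                    if s[g][t]:              # dispatch committed units in merit order
                        gen = min(cap[g], remaining)
                        cost += run_cost[g] * gen
                        remaining -= gen
                if remaining > 0:
                    return None              # infeasible: demand not covered
            for g in range(2):               # add start-up costs
                prev = 0
                for t in range(len(demand)):
                    if s[g][t] and not prev:
                        cost += start_cost[g]
                    prev = s[g][t]
            return cost

        best = min((c, s) for s in product([0, 1], repeat=8)
                   for c in [schedule_cost((s[:4], s[4:]))] if c is not None)
        print(best)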

  5. General purpose graphic processing unit implementation of adaptive pulse compression algorithms

    Science.gov (United States)

    Cai, Jingxiao; Zhang, Yan

    2017-07-01

    This study introduces a practical approach to implement real-time signal processing algorithms for general surveillance radar based on NVIDIA graphical processing units (GPUs). The pulse compression algorithms are implemented using compute unified device architecture (CUDA) libraries such as CUDA basic linear algebra subroutines and CUDA fast Fourier transform library, which are adopted from open source libraries and optimized for the NVIDIA GPUs. For more advanced, adaptive processing algorithms such as adaptive pulse compression, customized kernel optimization is needed and investigated. A statistical optimization approach is developed for this purpose without needing much knowledge of the physical configurations of the kernels. It was found that the kernel optimization approach can significantly improve the performance. Benchmark performance is compared with the CPU performance in terms of processing accelerations. The proposed implementation framework can be used in various radar systems including ground-based phased array radar, airborne sense and avoid radar, and aerospace surveillance radar.
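
    The baseline operation, pulse compression as a frequency-domain matched filter (FFT, conjugate multiply, inverse FFT), can be sketched with numpy as a CPU stand-in for the paper's CUDA FFT/BLAS pipeline; the waveform and noise parameters are illustrative:

        import numpy as np

        n, m = 1024, 128
        t = np.arange(m)
        chirp = np.exp(1j * np.pi * 0.1 * t**2 / m)   # transmitted waveform

        rng = np.random.default_rng(3)
        echo = np.zeros(n, dtype=complex)
        echo[300:300 + m] = 0.5 * chirp               # target at range bin 300
        echo += 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

        H = np.conj(np.fft.fft(chirp, n))             # matched filter spectrum
        compressed = np.fft.ifft(np.fft.fft(echo) * H)
        print(int(np.argmax(np.abs(compressed))))     # ~ 300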

  6. Coal conversion process by the United Power Plants of Westphalia

    Energy Technology Data Exchange (ETDEWEB)

    1974-08-01

    The coal conversion process used by the United Power Plants of Westphalia and its possible applications are described. In this process, the crushed and predried coal is degassed and partly gasified in a gas generator, during which time the sulfur present in the coal is converted into hydrogen sulfide, which together with the carbon dioxide is subsequently washed out and possibly utilized or marketed. The residual coke together with the ashes and tar is then sent to the melting chamber of the steam generator where the ashes are removed. After desulfurization, the purified gas is fed into an external circuit and/or to a gas turbine for electricity generation. The raw gas from the gas generator can be directly used as fuel in a conventional power plant. The calorific value of the purified gas varies from 3200 to 3500 kcal/cu m. The purified gas can be used as reducing agent, heating gas, as raw material for various chemical processes, or be conveyed via pipelines to remote areas for electricity generation. The conversion process has the advantages of increased economy of electricity generation with desulfurization, of additional gas generation, and, in long-term prospects, of the use of the waste heat from high-temperature nuclear reactors for this process.

  7. Technology Evaluation of Process Configurations for Second Generation Bioethanol Production using Dynamic Model-based Simulations

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    An assessment of a number of different process flowsheets for bioethanol production was performed using dynamic model-based simulations. The evaluation employed diverse operational scenarios, such as fed-batch, continuous and continuous-with-recycle configurations. Each configuration was evaluated against the following benchmark criteria: yield (kg ethanol/kg dry biomass), final product concentration and the number of unit operations required in the different process configurations. The results show that the process configuration for simultaneous saccharification and co-fermentation (SSCF) operating in continuous mode with a recycle of the SSCF reactor effluent gives the best bioethanol productivity among the proposed process configurations, with a yield of 0.18 kg ethanol/kg dry biomass.

  8. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Represented in some appropriate representation language, the purpose of these models is to model process plants and plant automatics in a unified way to allow verification and computer-aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report aims at investigating how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long-term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated by the potential of such a combination in safety analysis based on models comprising software, hardware, and operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories. (author)

  9. Nuclear safety inspection in treatment process for SG heat exchange tubes deficiency of unit 1, TNPS

    International Nuclear Information System (INIS)

    Zhang Chunming; Song Chenxiu; Zhao Pengyu; Hou Wei

    2006-01-01

    This paper describes the treatment process for steam generator (SG) heat exchange tube deficiencies at Unit 1, TNPS, the nuclear safety inspection performed by the Northern Regional Office during the treatment process, and the further inspection carried out after the deficiencies had been treated. (authors)

  10. Modeling closed nuclear fuel cycles processes

    Energy Technology Data Exchange (ETDEWEB)

    Shmidt, O.V. [A.A. Bochvar All-Russian Scientific Research Institute for Inorganic Materials, Rogova, 5a street, Moscow, 123098 (Russian Federation); Makeeva, I.R. [Zababakhin All-Russian Scientific Research Institute of Technical Physics, Vasiliev street 13, Snezhinsk, Chelyabinsk region, 456770 (Russian Federation); Liventsov, S.N. [Tomsk Polytechnic University, Tomsk, Lenin Avenue, 30, 634050 (Russian Federation)

    2016-07-01

    Computer models of processes are necessary for determining optimal operating conditions for closed nuclear fuel cycle (NFC) processes. Computer models can be quickly changed in accordance with new and fresh data from experimental research. Three kinds of process simulation are necessary. First, the VIZART software package supports balance model development and is used for calculating the material flow in technological processes, taking into account equipment capacity, transport lines and storage volumes. Second, it is necessary to simulate the physico-chemical processes that are involved in the closure of the NFC. The third kind of simulation is the development of software that allows optimization, diagnostics and control of the processes, which implies real-time simulation of product flows for the whole plant or for separate lines of the plant. (A.C.)

  11. Phenobarbital in intensive care unit pediatric population: predictive performances of population pharmacokinetic model.

    Science.gov (United States)

    Marsot, Amélie; Michel, Fabrice; Chasseloup, Estelle; Paut, Olivier; Guilhaumou, Romain; Blin, Olivier

    2017-10-01

    An external evaluation of the phenobarbital population pharmacokinetic model described by Marsot et al. was performed in a pediatric intensive care unit. Model evaluation is an important issue for dose adjustment. This external evaluation should allow the proposed dosage adaptation to be confirmed and these recommendations to be extended to the entire intensive care pediatric population. The external evaluation of the published population pharmacokinetic model of Marsot et al. was carried out on a new retrospective dataset of 35 patients hospitalized in a pediatric intensive care unit. The published population pharmacokinetic model was implemented in NONMEM 7.3. Predictive performance was assessed by quantifying the bias and inaccuracy of model predictions. Normalized prediction distribution errors (NPDE) and a visual predictive check (VPC) were also evaluated. A total of 35 infants were studied, with a mean age of 33.5 weeks (range: 12 days-16 years) and a mean weight of 12.6 kg (range: 2.7-70.0 kg). The model predicted the observed phenobarbital concentrations with reasonable bias and inaccuracy. The median prediction error was 3.03% (95% CI: -8.52 to 58.12%), and the median absolute prediction error was 26.20% (95% CI: 13.07-75.59%). No trends in the NPDE and VPC were observed. The model previously proposed by Marsot et al. in neonates hospitalized in intensive care was thus externally validated for IV infusion administration. The model-based dosing regimen was extended to the whole pediatric intensive care unit to optimize treatment. Due to inter- and intra-individual variability in the pharmacokinetic model, this dosing regimen should be combined with therapeutic drug monitoring. © 2017 Société Française de Pharmacologie et de Thérapeutique.
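
    The bias and inaccuracy statistics quoted above (median prediction error and median absolute prediction error) are straightforward to compute. A minimal sketch, using invented observed and model-predicted concentrations rather than the study's data:

```python
import numpy as np

def prediction_errors(observed, predicted):
    """Bias and inaccuracy as used in external PK model evaluations:
    prediction error PE% = 100 * (pred - obs) / obs."""
    obs, pred = np.asarray(observed, float), np.asarray(predicted, float)
    pe = 100.0 * (pred - obs) / obs
    return {
        "median PE% (bias)": np.median(pe),
        "95% CI of PE%": np.percentile(pe, [2.5, 97.5]),
        "median |PE|% (inaccuracy)": np.median(np.abs(pe)),
    }

# Hypothetical phenobarbital concentrations (mg/L): observed vs model-predicted
obs = [18.2, 25.4, 31.0, 14.7, 22.9, 40.1]
pred = [17.5, 27.9, 29.2, 16.3, 21.0, 43.8]
for name, value in prediction_errors(obs, pred).items():
    print(name, value)
```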

  12. Eye Tracking Meets the Process of Process Modeling: a Visual Analytic Approach

    DEFF Research Database (Denmark)

    Burattin, Andrea; Kaiser, M.; Neurauter, Manuel

    2017-01-01

    Research on the process of process modeling (PPM) studies how process models are created. It typically uses the logs of the interactions with the modeling tool to assess the modeler's behavior. In this paper we suggest introducing an additional stream of data (i.e., eye tracking) to improve the ...

  13. Modeling biochemical transformation processes and information processing with Narrator.

    Science.gov (United States)

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with, and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation and biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool.
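
    Of the target formalisms mentioned, Gillespie's direct method is compact enough to sketch. The following is a generic implementation applied to an invented two-reaction gene-expression toy model, not code from Narrator:

```python
import math
import random

def gillespie_direct(x, propensities, stoichiometry, t_end):
    """Gillespie's direct method for a well-mixed stochastic reaction system."""
    t, trajectory = 0.0, [(0.0, dict(x))]
    while t < t_end:
        a = [f(x) for f in propensities]
        a0 = sum(a)
        if a0 == 0.0:                          # no reaction can fire any more
            break
        t += -math.log(random.random()) / a0   # exponential waiting time
        u, j, acc = random.random() * a0, 0, a[0]
        while acc < u:                         # pick reaction j with prob a_j / a0
            j += 1
            acc += a[j]
        for species, change in stoichiometry[j].items():
            x[species] += change
        trajectory.append((t, dict(x)))
    return trajectory

# Invented toy model: DNA -> DNA + mRNA at rate k1; mRNA -> 0 at rate k2 * #mRNA
k1, k2 = 2.0, 0.1
traj = gillespie_direct(
    {"mRNA": 0},
    propensities=[lambda s: k1, lambda s: k2 * s["mRNA"]],
    stoichiometry=[{"mRNA": +1}, {"mRNA": -1}],
    t_end=50.0,
)
print("final mRNA count:", traj[-1][1]["mRNA"])   # fluctuates around k1/k2 = 20
```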

  14. Mapping and modeling the biogeochemical cycling of turf grasses in the United States.

    Science.gov (United States)

    Milesi, Cristina; Running, Steven W; Elvidge, Christopher D; Dietz, John B; Tuttle, Benjamin T; Nemani, Ramakrishna R

    2005-09-01

    Turf grasses are ubiquitous in the urban landscape of the United States and are often associated with various types of environmental impacts, especially on water resources, yet there have been limited efforts to quantify their total surface and ecosystem functioning, such as their total impact on the continental water budget and potential net ecosystem exchange (NEE). In this study, relating turf grass area to an estimate of fractional impervious surface area, it was calculated that potentially 163,800 km2 (+/- 35,850 km2) of land are cultivated with turf grasses in the continental United States, an area three times larger than that of any irrigated crop. Using the Biome-BGC ecosystem process model, the growth of warm-season and cool-season turf grasses was modeled at a number of sites across the 48 conterminous states under different management scenarios, simulating potential carbon and water fluxes as if the entire turf surface was to be managed like a well-maintained lawn. The results indicate that well-watered and fertilized turf grasses act as a carbon sink. The potential NEE that could derive from the total surface potentially under turf (up to 17 Tg C/yr with the simulated scenarios) would require up to 695 to 900 liters of water per person per day, depending on the modeled water irrigation practices, suggesting that outdoor water conservation practices such as xeriscaping and irrigation with recycled waste-water may need to be extended as many municipalities continue to face increasing pressures on freshwater.

  15. 40 CFR Appendix XIII to Part 266 - Mercury Bearing Wastes That May Be Processed in Exempt Mercury Recovery Units

    Science.gov (United States)

    2010-07-01

    40 CFR Part 266, Appendix XIII (Protection of Environment, revised as of 2010-07-01) lists the mercury-bearing wastes that may be processed in exempt mercury recovery units.

  16. Advanced spent fuel processing technologies for the United States GNEP programme

    International Nuclear Information System (INIS)

    Laidler, J.J.

    2007-01-01

    Spent fuel processing technologies for future advanced nuclear fuel cycles are being developed under the scope of the Global Nuclear Energy Partnership (GNEP). This effort seeks to make available for future deployment a fissile material recycling system that does not involve the separation of pure plutonium from spent fuel. In the nuclear system proposed by the United States under the GNEP initiative, light water reactor spent fuel is treated by means of a solvent extraction process that involves a group extraction of transuranic elements. The recovered transuranics are recycled as fuel material for advanced burner reactors, which can lead in the long term to fast reactors with conversion ratios greater than unity, helping to assure the sustainability of nuclear power systems. Both aqueous and pyrochemical methods are being considered for fast reactor spent fuel processing in the current US development programme. (author)

  17. The Formalization of the Business Process Modeling Goals

    Directory of Open Access Journals (Sweden)

    Ligita Bušinska

    2016-10-01

    In business process modeling, the de facto standard BPMN has emerged. However, applications of this notation use many subsets of elements and various extensions. BPMN also still coexists with many other modeling languages, forming a large set of available options for business process modeling languages and dialects. While, in general, the goal of the modeler is a central notion in the choice of modeling languages and notations, most research that proposes guidelines, techniques, and methods for business process modeling language evaluation and/or selection does not formalize the business process modeling goal or take it into account transparently. To overcome this gap, and to explicate and help handle business process modeling complexity, an approach to formalizing the business process modeling goal, and a supporting three-dimensional business process modeling framework, are proposed.

  18. Congestion estimation technique in the optical network unit registration process.

    Science.gov (United States)

    Kim, Geunyong; Yoo, Hark; Lee, Dongsoo; Kim, Youngsun; Lim, Hyuk

    2016-07-01

    We present a congestion estimation technique (CET) to estimate the optical network unit (ONU) registration success ratio for the ONU registration process in passive optical networks. An optical line terminal (OLT) estimates the number of collided ONUs via the proposed scheme during the serial number state. The OLT can obtain congestion level among ONUs to be registered such that this information may be exploited to change the size of a quiet window to decrease the collision probability. We verified the efficiency of the proposed method through simulation and experimental results.
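
    The abstract gives no algorithmic detail, but the trade-off it exploits, a larger quiet window lowering collision probability, is easy to see by simulation. A hypothetical Monte Carlo sketch in which each unregistered ONU responds in a random slot and a registration succeeds only if its slot is not shared:

```python
import random

def registration_success_ratio(n_onus, n_slots, trials=10000):
    """Monte Carlo estimate of the ONU registration success ratio when each
    unregistered ONU responds in a randomly chosen slot of the quiet window."""
    successes = 0
    for _ in range(trials):
        slots = [random.randrange(n_slots) for _ in range(n_onus)]
        successes += sum(1 for s in slots if slots.count(s) == 1)  # no collision
    return successes / (trials * n_onus)

# Enlarging the quiet window reduces collisions among 16 contending ONUs
for window in (8, 16, 32, 64):
    print(f"window = {window:2d} slots -> success ratio "
          f"{registration_success_ratio(16, window):.3f}")
```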

  19. Using Systems Theory to Examine Patient and Nurse Structures, Processes, and Outcomes in Centralized and Decentralized Units.

    Science.gov (United States)

    Real, Kevin; Fay, Lindsey; Isaacs, Kathy; Carll-White, Allison; Schadler, Aric

    2018-01-01

    This study utilizes systems theory to understand how changes to physical design structures impact communication processes and patient and staff design-related outcomes. Many scholars and researchers have noted the importance of communication and teamwork for patient care quality. Few studies have examined changes to nursing station design within a systems theory framework. This study employed a multimethod, before-and-after, quasi-experimental research design. Nurses completed surveys in centralized units and later in decentralized units (N = 26 pre, N = 51 post). Patients completed surveys in centralized units (N = 62 pre) and later in decentralized units (N = 49 post). Surveys included quantitative measures and qualitative open-ended responses. Patients preferred the decentralized units because of larger single-occupancy rooms, greater privacy/confidentiality, and overall satisfaction with design. Nurses had a more complex response. Nurses approved of the patient rooms, unit environment, and noise levels in decentralized units. However, they reported reduced access to support spaces, lower levels of team/mentoring communication, and less satisfaction with design than in centralized units. Qualitative findings supported these results. Nurses were more positive about centralized units and patients were more positive toward decentralized units. The results of this study suggest a need to understand how system components operate in concert. A major contribution of this study is the inclusion of patient satisfaction with design, an important yet overlooked factor in patient satisfaction. Healthcare design researchers and practitioners may consider how changing system interdependencies can lead to unexpected changes to communication processes and system outcomes in complex systems.

  20. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Modeling tools and operators help the user/developer to identify the processing field at the top of the sequence and to send into the computing module only the data related to the requested result; the remaining data is not relevant and would only slow down the processing. The biggest challenge nowadays is to get high-quality processing results with reduced computing time and costs. To do so, we must review the processing sequence by adding several modeling tools. The existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  1. Kopernik : modeling business processes for digital customers

    OpenAIRE

    Estañol Lamarca, Montserrat; Castro, Manuel; Díaz-Montenegro, Sylvia; Teniente López, Ernest

    2016-01-01

    This paper presents the Kopernik methodology for modeling business processes for digital customers. These processes require a high degree of flexibility in the execution of their tasks or actions. We achieve this by using the artifact-centric approach to process modeling and the use of condition-action rules. The processes modeled following Kopernik can then be implemented in an existing commercial tool, Balandra.

  2. All-optical quantum computing with a hybrid solid-state processing unit

    International Nuclear Information System (INIS)

    Pei Pei; Zhang Fengyang; Li Chong; Song Heshan

    2011-01-01

    We develop an architecture for a hybrid quantum solid-state processing unit for universal quantum computing. The architecture allows distant and nonidentical solid-state qubits in distinct physical systems to interact and work collaboratively. All the quantum computing procedures are controlled by optical methods using classical fields and cavity QED. Our methods have the prominent advantage of insensitivity to dissipation processes, benefiting from the virtual excitation of subsystems. Moreover, quantum nondemolition measurements and state transfer for the solid-state qubits are proposed. The architecture opens promising perspectives for implementing scalable quantum computation in the broader sense that different solid-state systems can merge and be integrated into one quantum processor afterward.

  3. Specification of e-business process model for PayPal online payment process using Reo

    OpenAIRE

    Xie, M.

    2005-01-01

    E-business process modeling allows business analysts to better understand and analyze business processes, and eventually to use software systems to automate (parts of) these business processes to achieve higher profit. To support e-business process modeling, many business process modeling languages have been used as tools. However, many existing business process modeling languages lack (a) formal semantics, (b) a formal computational model, and (c) an integrated view of the busi...

  4. Simulation based assembly and alignment process ability analysis for line replaceable units of the high power solid state laser facility

    International Nuclear Information System (INIS)

    Wang, Junfeng; Lu, Cong; Li, Shiqi

    2016-01-01

    Highlights: • Discrete event simulation is applied to analyze the assembly and alignment process ability of LRUs in the SG-III facility. • The overall assembly and alignment process of LRUs with specific characteristics is described. • An extended directed graph is proposed to express the assembly and alignment process of LRUs. • Different scenarios have been simulated to evaluate the assembly process ability of LRUs, and decision making is supported to ensure the construction milestone. - Abstract: Line replaceable units (LRUs) are important components of very large high power solid state laser facilities. The assembly and alignment process ability of LRUs will impact the construction milestone of such facilities. This paper describes the use of the discrete event simulation method for assembly and alignment process analysis of LRUs in such facilities. The overall assembly and alignment process for LRUs is presented based on the layout of the optics assembly laboratory, and the process characteristics are analyzed. An extended directed graph is proposed to express the assembly and alignment process of LRUs. Taking the LRUs of the disk amplifier system in the Shen Guang-III (SG-III) facility as an example, process simulation models are built on the Quest simulation platform. Constraints such as duration, equipment, technicians and part supply are considered in the simulation models. Different simulation scenarios have been carried out to evaluate the assembly process ability of the LRUs. The simulation method can provide a valuable decision-making and process optimization tool for the optics assembly laboratory layout and for working out the assembly and alignment process in such facilities.

  5. Simulation based assembly and alignment process ability analysis for line replaceable units of the high power solid state laser facility

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Junfeng; Lu, Cong; Li, Shiqi, E-mail: sqli@hust.edu.cn

    2016-11-15

    Highlights: • Discrete event simulation is applied to analyze the assembly and alignment process ability of LRUs in the SG-III facility. • The overall assembly and alignment process of LRUs with specific characteristics is described. • An extended directed graph is proposed to express the assembly and alignment process of LRUs. • Different scenarios have been simulated to evaluate the assembly process ability of LRUs, and decision making is supported to ensure the construction milestone. - Abstract: Line replaceable units (LRUs) are important components of very large high power solid state laser facilities. The assembly and alignment process ability of LRUs will impact the construction milestone of such facilities. This paper describes the use of the discrete event simulation method for assembly and alignment process analysis of LRUs in such facilities. The overall assembly and alignment process for LRUs is presented based on the layout of the optics assembly laboratory, and the process characteristics are analyzed. An extended directed graph is proposed to express the assembly and alignment process of LRUs. Taking the LRUs of the disk amplifier system in the Shen Guang-III (SG-III) facility as an example, process simulation models are built on the Quest simulation platform. Constraints such as duration, equipment, technicians and part supply are considered in the simulation models. Different simulation scenarios have been carried out to evaluate the assembly process ability of the LRUs. The simulation method can provide a valuable decision-making and process optimization tool for the optics assembly laboratory layout and for working out the assembly and alignment process in such facilities.

  6. Segmenting Continuous Motions with Hidden Semi-markov Models and Gaussian Processes

    Directory of Open Access Journals (Sweden)

    Tomoaki Nakamura

    2017-12-01

    Humans divide perceived continuous information into segments to facilitate recognition. For example, humans can segment speech waves into recognizable morphemes. Analogously, continuous motions are segmented into recognizable unit actions. People can divide continuous information into segments without using explicit segment points. This capacity for unsupervised segmentation is also useful for robots, because it enables them to flexibly learn languages, gestures, and actions. In this paper, we propose a Gaussian process-hidden semi-Markov model (GP-HSMM) that can divide continuous time series data into segments in an unsupervised manner. Our proposed method consists of a generative model based on the hidden semi-Markov model (HSMM), the emission distributions of which are Gaussian processes (GPs). Continuous time series data is generated by connecting segments generated by the GP. Segmentation can be achieved by using forward filtering-backward sampling to estimate the model's parameters, including the lengths and classes of the segments. In an experiment using the CMU motion capture dataset, we tested GP-HSMM with motion capture data containing simple exercise motions; the results of this experiment showed that the proposed GP-HSMM was comparable with other methods. We also conducted an experiment using karate motion capture data, which is more complex than exercise motion capture data; in this experiment, the segmentation accuracy of GP-HSMM was 0.92, which outperformed other methods.
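
    The generative side of the GP-HSMM, unit-action segments drawn from Gaussian processes and concatenated into a continuous stream, can be sketched as below. This toy version replaces the learned segment classes and the semi-Markov duration model with random choices:

```python
import numpy as np

def rbf_kernel(t1, t2, length=0.5, var=1.0):
    """Squared-exponential covariance between two sets of time points."""
    d = t1[:, None] - t2[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def sample_gp_segment(n_points, length=0.5):
    """Draw one motion segment from a zero-mean Gaussian process prior."""
    t = np.linspace(0.0, 1.0, n_points)
    K = rbf_kernel(t, t, length) + 1e-8 * np.eye(n_points)  # jitter for stability
    return np.random.multivariate_normal(np.zeros(n_points), K)

# Generate a "continuous motion" by concatenating GP-sampled unit actions;
# segment lengths stand in for draws from the semi-Markov duration model.
segments = [sample_gp_segment(np.random.randint(20, 60)) for _ in range(5)]
motion = np.concatenate(segments)
print(len(motion), "samples generated from", len(segments), "latent segments")
```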

  7. Process mining using BPMN: relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; van der Aalst, W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2017-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  8. Process mining using BPMN : relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; Aalst, van der W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2015-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  9. A Comparative of business process modelling techniques

    Science.gov (United States)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modelling techniques. This article presents research on the differences between business process modelling techniques. For each technique, we explain the definition and the structure. This paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: the notation and how it works when implemented for Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The final conclusion recommends business process modelling techniques that are easy to use and can serve as the basis for evaluating further modelling techniques.

  10. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...

  11. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  12. Lagrangian particle modeling of air pollution transport in southwestern United States

    Energy Technology Data Exchange (ETDEWEB)

    Uliasz, M. [Warsaw Univ. of Technology (Poland); Stocker, R.A.; Pielke, R.A. [Colorado State Univ., Fort Collins, CO (United States)

    1994-12-31

    Several modeling techniques of varying complexity and accuracy are applied in a numerical modeling study of regional air pollution transport being performed within the Measurement Of Haze And Visual Effect (MOHAVE) project. The goal of this study is to assess the impact of the Mohave Power Project (MPP) and other potential sources of air pollution on specific Class I areas located in the desert southwest United States, including the Grand Canyon National Park. The Colorado State University team is performing daily meteorological and dispersion simulations for a year-long study using a nonhydrostatic mesoscale meteorological model, the Regional Atmospheric Modeling System (RAMS), coupled with a Lagrangian particle dispersion (LPD) model. The modeling domain covers the southwestern United States with its extremely complex terrain. Two complementary dispersion modeling techniques, a traditional source-oriented approach and a receptor-oriented approach, are used to calculate concentration and influence function fields, respectively. All computations are performed on two IBM RISC-6000 workstations dedicated to the project. The goal of this paper is to present our design for daily dispersion simulations, with an emphasis on influence function calculations, using examples from the winter and summer intensive periods of the MOHAVE project.
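
    The core update of a Lagrangian particle dispersion model is mean-wind advection plus a stochastic term representing turbulent diffusion. A minimal sketch with invented wind and diffusivity values, far simpler than the RAMS-coupled LPD model used in the study:

```python
import numpy as np

def lpd_step(pos, wind, K, dt):
    """Advance Lagrangian particles one step: mean-wind advection plus a
    random-walk term for turbulent diffusion (eddy diffusivity K, m^2/s)."""
    drift = wind * dt
    noise = np.sqrt(2.0 * K * dt) * np.random.randn(*pos.shape)
    return pos + drift + noise

# 1,000 particles released at the origin (a point source such as a power plant)
pos = np.zeros((1000, 2))            # x, y positions in metres
wind = np.array([5.0, 1.0])          # assumed mean transport wind (m/s)
for _ in range(3600):                # one hour of transport at dt = 1 s
    pos = lpd_step(pos, wind, K=50.0, dt=1.0)
print("plume centroid (m):", pos.mean(axis=0))
print("plume spread   (m):", pos.std(axis=0))
```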

  13. GRAPHICAL MODELS OF THE AIRCRAFT MAINTENANCE PROCESS

    Directory of Open Access Journals (Sweden)

    Stanislav Vladimirovich Daletskiy

    2017-01-01

    Full Text Available The aircraft maintenance is realized by a rapid sequence of maintenance organizational and technical states, its re- search and analysis are carried out by statistical methods. The maintenance process concludes aircraft technical states con- nected with the objective patterns of technical qualities changes of the aircraft as a maintenance object and organizational states which determine the subjective organization and planning process of aircraft using. The objective maintenance pro- cess is realized in Maintenance and Repair System which does not include maintenance organization and planning and is a set of related elements: aircraft, Maintenance and Repair measures, executors and documentation that sets rules of their interaction for maintaining of the aircraft reliability and readiness for flight. The aircraft organizational and technical states are considered, their characteristics and heuristic estimates of connection in knots and arcs of graphs and of aircraft organi- zational states during regular maintenance and at technical state failure are given. It is shown that in real conditions of air- craft maintenance, planned aircraft technical state control and maintenance control through it, is only defined by Mainte- nance and Repair conditions at a given Maintenance and Repair type and form structures, and correspondingly by setting principles of Maintenance and Repair work types to the execution, due to maintenance, by aircraft and all its units mainte- nance and reconstruction strategies. The realization of planned Maintenance and Repair process determines the one of the constant maintenance component. The proposed graphical models allow to reveal quantitative correlations between graph knots to improve maintenance processes by statistical research methods, what reduces manning, timetable and expenses for providing safe civil aviation aircraft maintenance.

  14. The importance of topographically corrected null models for analyzing ecological point processes.

    Science.gov (United States)

    McDowall, Philip; Lynch, Heather J

    2017-07-01

    Analyses of point process patterns and related techniques (e.g., MaxEnt) make use of the expected number of occurrences per unit area and second-order statistics based on the distance between occurrences. Ecologists working with point process data often assume that points exist on a two-dimensional x-y plane or within a three-dimensional volume, when in fact many observed point patterns are generated on a two-dimensional surface existing within three-dimensional space. For many surfaces, however, such as the topography of landscapes, the projection from the surface to the x-y plane preserves neither area nor distance. As such, when these point patterns are implicitly projected to and analyzed in the x-y plane, our expectations of the point pattern's statistical properties may not be met. When used in hypothesis testing, we find that the failure to account for the topography of the generating surface may bias statistical tests that incorrectly identify clustering and, furthermore, may bias coefficients in inhomogeneous point process models that incorporate slope as a covariate. We demonstrate the circumstances under which this bias is significant, and present simple methods that allow point processes to be simulated with corrections for topography. These point patterns can then be used to generate "topographically corrected" null models against which observed point processes can be compared. © 2017 by the Ecological Society of America.
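
    The correction follows from the fact that a planimetric cell of fixed x-y size contains 1/cos(slope) times more true surface area, so a process that is homogeneous on the surface looks inhomogeneous in the x-y plane. A sketch of a "topographically corrected" null model built on that weighting, using an invented slope map:

```python
import numpy as np

def topographic_csr(slope_deg, n_points):
    """Simulate complete spatial randomness *on the terrain surface* over a
    gridded slope map: each planimetric cell receives points in proportion
    to its true surface area, i.e. with weight 1 / cos(slope)."""
    weights = 1.0 / np.cos(np.radians(slope_deg)).ravel()
    probs = weights / weights.sum()
    cells = np.random.choice(weights.size, size=n_points, p=probs)
    return np.bincount(cells, minlength=weights.size).reshape(slope_deg.shape)

# Hypothetical 50 x 50 slope map (degrees): a flat plain next to a steep ridge
slope = np.zeros((50, 50))
slope[:, 25:] = 45.0
counts = topographic_csr(slope, n_points=20000)
print("mean points/cell, flat :", counts[:, :25].mean())
print("mean points/cell, steep:", counts[:, 25:].mean())  # ~1.41x higher
```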

  15. Neuroscientific Model of Motivational Process

    OpenAIRE

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Rewa...

  16. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

    Business process modelling and analysis is undoubtedly one of the most important parts of Applied (Business) Informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst's work is to create generally understandable, explicit and error-free models. If a process is properly described, the created models can be used as input for deep analysis and optimization. It can be assumed that properly designed business process models (similarly to correctly written algorithms) contain characteristics that can be mathematically described, and that it will therefore be possible to create a tool that helps process analysts design proper models. As part of this review, a systematic literature review was conducted in order to find and analyse measures of business process model design and quality. It was found that this area has already been the subject of research investigation in the past. Thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed scientific publications and existing quality measures do not reflect all important attributes of business process model clarity, simplicity and completeness. Therefore it would be appropriate to add new measures of quality.

  17. Nonequilibrium phase transitions in a model with social influence of inflexible units

    International Nuclear Information System (INIS)

    Jiang, Luo-luo; Hua, Da-yin; Chen, Ting

    2007-01-01

    In many social, economic and biological systems, the evolution of the states of interacting units cannot be simply treated with a physical law in the realm of traditional statistical mechanics. We propose a simple binary-state model to discuss the effect of inflexible units on the dynamical behavior of a social system, in which a unit may have a chance to keep its state with a probability 1 - q even though its state differs from those of the majority of its interacting neighbors. It is found that the effect of these inflexible units can lead to a nontrivial phase diagram.

  18. Nonequilibrium phase transitions in a model with social influence of inflexible units

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Luo-luo; Hua, Da-yin; Chen, Ting [Physics Department, Ningbo University, Ningbo 315211 (China)

    2007-09-14

    In many social, economic and biological systems, the evolution of the states of interacting units cannot be simply treated with a physical law in the realm of traditional statistical mechanics. We propose a simple binary-state model to discuss the effect of inflexible units on the dynamical behavior of a social system, in which a unit may have a chance to keep its state with a probability 1 - q even though its state differs from those of the majority of its interacting neighbors. It is found that the effect of these inflexible units can lead to a nontrivial phase diagram.

  19. [Variations in the diagnostic confirmation process between breast cancer mass screening units].

    Science.gov (United States)

    Natal, Carmen; Fernández-Somoano, Ana; Torá-Rocamora, Isabel; Tardón, Adonina; Castells, Xavier

    2016-01-01

    To analyse variations in the diagnostic confirmation process between screening units, variations in the outcome of each episode, and the relationship between the use of the different diagnostic confirmation tests and the lesion detection rate. Observational study of variability in the standardised use of diagnostic and lesion detection tests in 34 breast cancer mass screening units participating in early-detection programmes in three Spanish regions from 2002-2011. The diagnostic test variation ratio between percentiles 25-75 ranged from 1.68 (further appointments) to 3.39 (fine-needle aspiration). The variation ratios in detection rates of benign lesions, ductal carcinoma in situ and invasive cancer were 2.79, 1.99 and 1.36, respectively. A positive relationship between rates of testing and detection rates was found for fine-needle aspiration-benign lesions (R(2): 0.53), fine-needle aspiration-invasive carcinoma (R(2): 0.28), core biopsy-benign lesions (R(2): 0.64), core biopsy-ductal carcinoma in situ (R(2): 0.61) and core biopsy-invasive carcinoma (R(2): 0.48). Variation in the use of invasive tests between the breast cancer screening units participating in early-detection programmes was found to be significantly higher than variation in lesion detection. Units which conducted more fine-needle aspiration tests had higher benign lesion detection rates, while units that conducted more core biopsies detected more benign lesions and cancer. Copyright © 2016 SESPAS. Published by Elsevier España. All rights reserved.

  20. Silicon Carbide (SiC) Power Processing Unit (PPU) for Hall Effect Thrusters, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR project, APEI, Inc. is proposing to develop a high efficiency, rad-hard 3.8 kW silicon carbide (SiC) Power Processing Unit (PPU) for Hall Effect...

  1. Transfer of manufacturing units

    DEFF Research Database (Denmark)

    Madsen, Erik Skov; Riis, Jens Ove; Sørensen, Brian Vejrum

    2008-01-01

    The ongoing and unfolding relocation of activities is one of the major trends that call for attention in the domain of operations management. In particular, prescriptive models outlining stages of the process, where to locate, and how to establish the new facilities have been studied, while ... and dilemmas to be addressed when transferring manufacturing units.

  2. Co-occurrence of Photochemical and Microbiological Transformation Processes in Open-Water Unit Process Wetlands.

    Science.gov (United States)

    Prasse, Carsten; Wenk, Jannis; Jasper, Justin T; Ternes, Thomas A; Sedlak, David L

    2015-12-15

    The fate of anthropogenic trace organic contaminants in surface waters can be complex due to the occurrence of multiple parallel and consecutive transformation processes. In this study, the removal of five antiviral drugs (abacavir, acyclovir, emtricitabine, lamivudine and zidovudine) via both bio- and phototransformation processes, was investigated in laboratory microcosm experiments simulating an open-water unit process wetland receiving municipal wastewater effluent. Phototransformation was the main removal mechanism for abacavir, zidovudine, and emtricitabine, with half-lives (t1/2,photo) in wetland water of 1.6, 7.6, and 25 h, respectively. In contrast, removal of acyclovir and lamivudine was mainly attributable to slower microbial processes (t1/2,bio = 74 and 120 h, respectively). Identification of transformation products revealed that bio- and phototransformation reactions took place at different moieties. For abacavir and zidovudine, rapid transformation was attributable to high reactivity of the cyclopropylamine and azido moieties, respectively. Despite substantial differences in kinetics of different antiviral drugs, biotransformation reactions mainly involved oxidation of hydroxyl groups to the corresponding carboxylic acids. Phototransformation rates of parent antiviral drugs and their biotransformation products were similar, indicating that prior exposure to microorganisms (e.g., in a wastewater treatment plant or a vegetated wetland) would not affect the rate of transformation of the part of the molecule susceptible to phototransformation. However, phototransformation strongly affected the rates of biotransformation of the hydroxyl groups, which in some cases resulted in greater persistence of phototransformation products.
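
    The half-lives reported above translate into first-order rate constants via k = ln 2 / t1/2, and parallel photo- and biotransformation pathways combine additively in rate. A small sketch; the combined-pathway example uses invented half-lives, since both pathways were not quantified for any single drug here:

```python
import numpy as np

def fraction_remaining(t_half_h, t_h):
    """First-order decay: C/C0 = exp(-k t), with k = ln(2) / t_half."""
    return np.exp(-np.log(2.0) / t_half_h * t_h)

def combined_half_life(t_photo_h, t_bio_h):
    """Parallel first-order pathways add in rate: k_tot = k_photo + k_bio."""
    k_tot = np.log(2.0) * (1.0 / t_photo_h + 1.0 / t_bio_h)
    return np.log(2.0) / k_tot

# Half-lives (h) quoted in the abstract, dominant pathway only
half_lives = {"abacavir": 1.6, "zidovudine": 7.6, "emtricitabine": 25.0,  # photo
              "acyclovir": 74.0, "lamivudine": 120.0}                     # bio
for drug, t_half in half_lives.items():
    left = 100.0 * fraction_remaining(t_half, 24.0)
    print(f"{drug:13s} {left:5.1f}% remaining after 24 h")

# Invented example: t_photo = 10 h and t_bio = 100 h combine harmonically
print("combined half-life:", combined_half_life(10.0, 100.0), "h")  # ~9.1 h
```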

  3. Application of a model of social information processing to nursing theory: how nurses respond to patients.

    Science.gov (United States)

    Sheldon, Lisa Kennedy; Ellington, Lee

    2008-11-01

    This paper is a report of a study to assess the applicability of a theoretical model of social information processing in expanding a nursing theory addressing how nurses respond to patients. Nursing communication affects patient outcomes such as anxiety, adherence to treatments and satisfaction with care. Orlando's theory of nursing process describes nurses' reactions to patients' behaviour as generating a perception, thought and feeling in the nurse and then action by the nurse. A model of social information processing describes the sequential steps in the cognitive processes used to respond to social cues and may be useful in describing the nursing process. Cognitive interviews were conducted in 2006 with a convenience sample of 5 nurses in the United States of America. The data were interpreted using the Crick and Dodge model of social information processing. Themes arising from cognitive interviews validated concepts of the nursing theory and the constructs of the model of social information processing. The interviews revealed that the support of peers was an additional construct involved in the development of communication skills, creation of a database and enhancement of self-efficacy. Models of social information processing enhance understanding of the process of how nurses respond to patients and help develop nursing theories further. In combination, the theories are useful in developing research into nurse-patient communication. Future research based on the expansion of nursing theory may identify effective and culturally appropriate nurse response patterns to specific patient interactions, with implications for nursing care and patient outcomes.

  4. Velocity Model for CO2 Sequestration in the Southeastern United States Atlantic Continental Margin

    Science.gov (United States)

    Ollmann, J.; Knapp, C. C.; Almutairi, K.; Almayahi, D.; Knapp, J. H.

    2017-12-01

    The sequestration of carbon dioxide (CO2) is emerging as a major player in offsetting anthropogenic greenhouse gas emissions. With 40% of the United States' anthropogenic CO2 emissions originating in the southeast, characterizing potential CO2 sequestration sites is vital to reducing the United States' emissions. The goal of this research project, funded by the Department of Energy (DOE), is to estimate the CO2 storage potential for the Southeastern United States Atlantic Continental Margin. Previous studies find storage potential in the Atlantic continental margin. Up to 16 Gt and 175 Gt of storage potential are estimated for the Upper Cretaceous and Lower Cretaceous formations, respectively. Considering 2.12 Mt of CO2 are emitted per year by the United States, substantial storage potential is present in the Southeastern United States Atlantic Continental Margin. In order to produce a time-depth relationship, a velocity model must be constructed. This velocity model is created using previously collected seismic reflection, refraction, and well data in the study area. Seismic reflection horizons were extrapolated using well log data from the COST GE-1 well. An interpolated seismic section was created using these seismic horizons. A velocity model will be made using P-wave velocities from seismic reflection data. Once the time-depth conversion is complete, the depths of stratigraphic units in the seismic refraction data will be compared to the newly assigned depths of the seismic horizons. With a lack of well control in the study area, the addition of stratigraphic unit depths from 171 seismic refraction recording stations provides adequate data to tie to the depths of picked seismic horizons. Using this velocity model, the seismic reflection data can be presented in depth in order to estimate the thickness and storage potential of CO2 reservoirs in the Southeastern United States Atlantic Continental Margin.
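
    Once interval velocities are assigned, the time-depth conversion itself reduces to accumulating v·Δt/2 down through the layered model (the factor of two converts two-way travel time to one-way distance). A sketch with invented layer velocities, not the COST GE-1 calibration:

```python
def twt_to_depth(twt_s, layer_twt_s, layer_vel_ms):
    """Convert two-way travel time (TWT) to depth through a layered velocity
    model: each layer contributes v_i * dt_i / 2 of depth."""
    depth, remaining = 0.0, twt_s
    for dt, v in zip(layer_twt_s, layer_vel_ms):
        step = min(dt, remaining)
        depth += v * step / 2.0
        remaining -= step
        if remaining <= 0.0:
            break
    return depth

# Invented interval velocities for a layered passive-margin section
layer_twt = [0.8, 0.6, 1.0]            # TWT thickness of each layer (s)
layer_vel = [1800.0, 2400.0, 3200.0]   # interval velocity of each layer (m/s)
for horizon in (0.5, 1.2, 2.0):        # horizons picked in TWT (s)
    print(f"{horizon:.1f} s TWT -> {twt_to_depth(horizon, layer_twt, layer_vel):.0f} m")
```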

  5. Self-organization comprehensive real-time state evaluation model for oil pump unit on the basis of operating condition classification and recognition

    Science.gov (United States)

    Liang, Wei; Yu, Xuchao; Zhang, Laibin; Lu, Wenqing

    2018-05-01

    In an oil transmission station, the operating condition (OC) of an oil pump unit sometimes switches, which leads to changes in the operating parameters. If the switching of OCs is not taken into consideration when performing a state evaluation of the pump unit, the accuracy of the evaluation is largely affected. Hence, in this paper, a self-organization Comprehensive Real-Time State Evaluation Model (self-organization CRTSEM) is proposed based on OC classification and recognition. The underlying CRTSEM is built by incorporating the advantages of the Gaussian Mixture Model (GMM) and the Fuzzy Comprehensive Evaluation Model (FCEM): independent state models are established for every state characteristic parameter according to its distribution type (i.e., Gaussian or logistic regression distribution), while the Analytic Hierarchy Process (AHP) is utilized to calculate the weights of the state characteristic parameters. The OC classification is determined by the types of oil delivery tasks, and CRTSEMs of different standard OCs are built to constitute the CRTSEM matrix. On the other side, OC recognition is realized by a self-organization model that is established on the basis of a Back Propagation (BP) model. After the self-organization CRTSEM is derived through integration, real-time monitoring data can be input for OC recognition, and the current state of the pump unit can then be evaluated using the appropriate CRTSEM. The case study demonstrates that the proposed self-organization CRTSEM provides reasonable and accurate state evaluation results for the pump unit. Moreover, the assumption that the switching of OCs influences the results of state evaluation is also verified.
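
    Of the components named above, the AHP weighting step is the most self-contained: criterion weights are taken as the normalised principal eigenvector of a pairwise-comparison matrix. A standard sketch with a hypothetical comparison matrix for three pump-state parameters:

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive AHP criterion weights as the principal eigenvector of the
    pairwise-comparison matrix, normalised to sum to one."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    principal = np.real(vecs[:, np.argmax(np.real(vals))])
    w = np.abs(principal)
    return w / w.sum()

# Hypothetical comparisons for three pump-state parameters:
# vibration vs. bearing temperature vs. outlet pressure
A = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 1/2.0, 1.0]]
print(ahp_weights(A))   # roughly [0.65, 0.23, 0.12]
```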

  6. Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.

    Science.gov (United States)

    Tute, Erik; Steiner, Jochen

    2018-01-01

    The literature describes a big potential for the reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means to that end. The objective was to support the management and maintenance of the processes extracting, transforming and loading (ETL) data into CDWHs, and to ease the reuse of metadata between regular IT management, the CDWH and secondary data users, by providing a modeling approach. An expert survey and a literature review were carried out to find requirements and existing modeling techniques. An ETL modeling technique was developed by extending existing modeling techniques, and was evaluated by exemplarily modeling an existing ETL process and by a second expert survey. Nine experts participated in the first survey. The literature review yielded 15 included publications, and six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated. Seven experts participated in the evaluation. The developed approach can help in the management and maintenance of ETL processes and could serve as an interface between regular IT management, the CDWH and secondary data users.

  7. Material Balance And Reaction Kinetics Modeling For Penex Isomerization Process In Daura Refinery

    Directory of Open Access Journals (Sweden)

    Hamadi Adel Sharif

    2017-01-01

    Penex Deisohexanizer isomerization of light straight-run naphtha is a significant process in petroleum refining and has proved to be an effective technology for producing gasoline components with a high octane number. Modeling of the chemical reaction kinetics is an important tool because it condenses experimental data into parameters that can be used for industrial reactors. The present study deals with the isomerization process in the Daura refinery. Material balance calculations were performed mathematically on the unit for the purpose of kinetics prediction. A kinetic mathematical model was derived for predicting the rate constants K1 and K2 and the activation energy Ea over the operating temperature range 120-180°C. According to the model, increasing temperature leads directly to increased K1, whereas the K2 values are inversely proportional. The activation energy results show that Ea1(nC6
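
    The reported temperature dependence of the rate constants is the classic Arrhenius form k = A·exp(-Ea/RT), which is usually fitted by regressing ln k against 1/T. A sketch with invented rate-constant values spanning the stated 120-180°C window:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def arrhenius_fit(T_K, k):
    """Fit ln k = ln A - Ea/(R T); returns pre-exponential A and Ea (J/mol)."""
    slope, intercept = np.polyfit(1.0 / np.asarray(T_K), np.log(k), 1)
    return np.exp(intercept), -slope * R

# Hypothetical isomerization rate constants over the 120-180 C window
T = np.array([120.0, 140.0, 160.0, 180.0]) + 273.15   # kelvin
k1 = np.array([0.012, 0.025, 0.048, 0.088])           # increases with temperature
A, Ea = arrhenius_fit(T, k1)
print(f"A = {A:.3g} 1/h, Ea = {Ea / 1000:.1f} kJ/mol")
```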

  8. Simulation Models of Human Decision-Making Processes

    Directory of Open Access Journals (Sweden)

    Nina RIZUN

    2014-10-01

    The main purpose of the paper is to present a new concept for modeling the human decision-making process using an analogy with Automatic Control Theory. From the author's point of view, this concept allows the theory of decision-making to be developed and improved in terms of the study and classification of the specificity of human intellectual processes under different conditions. It was proved that the main distinguishing feature between the Heuristic/Intuitive and the Rational Decision-Making Models is the presence of the so-called phenomenon of "enrichment" of the input information with human propensities, hobbies, tendencies, expectations, axioms and judgments, presumptions or biases and their justification. In order to obtain additional knowledge about the basic intellectual processes, as well as the possibility of modeling the decision results for various parameters characterizing the decision-maker, a set of simulation models was developed. These models are based on the assumptions that: basic intellectual processes of the Rational Decision-Making Model can be adequately simulated and identified by the transient processes of a proportional-integral-derivative controller; and basic intellectual processes of the Bounded Rationality and Intuitive Models can be adequately simulated and identified by the transient processes of nonlinear elements. A taxonomy of the most typical automatic control theory elements and their correspondence to certain decision-making models, from the point of view of decision-making process specificity and decision-maker behavior during a certain time of professional activity, was obtained.
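
    The central analogy, rational decision-making mimicked by the transient response of a PID controller, can be illustrated in a few lines. The plant, gains and time step below are invented for the example:

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.05, steps=200):
    """Discrete PID controller driving a first-order plant, a stand-in for
    the 'transient process' used to mimic rational decision-making."""
    y, integral, prev_err, out = 0.0, 0.0, setpoint, []
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt
        derivative = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * derivative
        y += dt * (-y + u)          # first-order plant: dy/dt = -y + u
        prev_err = err
        out.append(y)
    return out

response = simulate_pid(kp=2.0, ki=1.0, kd=0.1)
print("output after 10 s:", round(response[-1], 3))   # settles near the setpoint
```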

  9. Ferromanganese Furnace Modelling Using Object-Oriented Principles

    Energy Technology Data Exchange (ETDEWEB)

    Wasboe, S.O.

    1996-12-31

    This doctoral thesis defines an object-oriented framework for aiding unit process modelling and applies it to model high-carbon ferromanganese furnaces. A framework is proposed for aiding modelling of the internal topology and the phenomena taking place inside unit processes. Complex unit processes may consist of a number of zones where different phenomena take place. A topology is therefore defined for the unit process itself, which shows the relations between the zones. Inside each zone there is a set of chemical species and phenomena, such as reactions, phase transitions, heat transfer etc. A formalized graphical methodology is developed as a tool for modelling these zones and their interactions. The symbols defined in the graphical framework are associated with objects and classes. The rules for linking the objects are described using OMT (Object Modeling Technique) diagrams and formal language formulations. The basic classes that are defined are implemented using the C++ programming language. The ferromanganese process is a complex unit process. A general description of the process equipment is given, together with a detailed discussion of the process itself and a system-theoretical overview of it. The object-oriented framework is then used to develop a dynamic model based on mass and energy balances. The model is validated by measurements from an industrial furnace. 101 refs., 119 figs., 20 tabs.

  10. Sodium content of popular commercially processed and restaurant foods in the United States

    Science.gov (United States)

    The Nutrient Data Laboratory (NDL) of the U.S. Department of Agriculture (USDA), in close collaboration with the U.S. Centers for Disease Control and Prevention, is monitoring the sodium content of commercially processed and restaurant foods in the United States. The main purpose of this manuscript is to prov...

  11. The 2014 United States National Seismic Hazard Model

    Science.gov (United States)

    Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter; Mueller, Charles; Haller, Kathleen; Frankel, Arthur; Zeng, Yuehua; Rezaeian, Sanaz; Harmsen, Stephen; Boyd, Oliver; Field, Edward; Chen, Rui; Rukstales, Kenneth S.; Luco, Nicolas; Wheeler, Russell; Williams, Robert; Olsen, Anna H.

    2015-01-01

    New seismic hazard maps have been developed for the conterminous United States using the latest data, models, and methods available for assessing earthquake hazard. The hazard models incorporate new information on earthquake rupture behavior observed in recent earthquakes; fault studies that use both geologic and geodetic strain rate data; earthquake catalogs through 2012 that include new assessments of locations and magnitudes; earthquake adaptive smoothing models that more fully account for the spatial clustering of earthquakes; and 22 ground motion models, some of which consider more than double the shaking data applied previously. Alternative input models account for larger earthquakes, more complicated ruptures, and more varied ground shaking estimates than assumed in earlier models. The ground motions, for levels applied in building codes, differ from the previous version by less than ±10% over 60% of the country, but can differ by ±50% in localized areas. The models are incorporated in insurance rates, risk assessments, and as input into the U.S. building code provisions for earthquake ground shaking.

  12. A Hydrostratigraphic Model and Alternatives for the Groundwater Flow and Contaminant Transport Model of Corrective Action Unit 97: Yucca Flat-Climax Mine, Lincoln and Nye Counties, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Geotechnical Sciences Group Bechtel Nevada

    2006-01-01

    A new three-dimensional hydrostratigraphic framework model for the Yucca Flat-Climax Mine Corrective Action Unit was completed in 2005. The model area includes Yucca Flat and Climax Mine, former nuclear testing areas at the Nevada Test Site, and proximal areas. The model area is approximately 1,250 square kilometers in size and is geologically complex. Yucca Flat is a topographically closed basin typical of many valleys in the Basin and Range province. Faulted and tilted blocks of Tertiary-age volcanic rocks and underlying Proterozoic and Paleozoic sedimentary rocks form low ranges around the structural basin. During the Cretaceous Period a granitic intrusive was emplaced at the north end of Yucca Flat. A diverse set of geological and geophysical data collected over the past 50 years was used to develop a structural model and hydrostratigraphic system for the basin. These were integrated using EarthVision software to develop the 3-dimensional hydrostratigraphic framework model. Fifty-six stratigraphic units in the model area were grouped into 25 hydrostratigraphic units based on each unit's propensity toward aquifer or aquitard characteristics. The authors organized the alluvial section into 3 hydrostratigraphic units, including 2 aquifers and 1 confining unit. The volcanic units in the model area are organized into 13 hydrostratigraphic units that include 8 aquifers and 5 confining units. The underlying pre-Tertiary rocks are divided into 7 hydrostratigraphic units, including 3 aquifers and 4 confining units. Other units include 1 Tertiary-age sedimentary confining unit and 1 Mesozoic-age granitic confining unit. The model depicts the thickness, extent, and geometric relationships of these hydrostratigraphic units ("layers" in the model) along with the major structural features (i.e., faults). The model incorporates 178 high-angle normal faults of Tertiary age and 2 low-angle thrust faults of Mesozoic age. The complexity of the model

  13. Independent effects of temperature and precipitation on modeled runoff in the conterminous United States

    Science.gov (United States)

    McCabe, G.J.; Wolock, D.M.

    2011-01-01

A water-balance model is used to simulate time series of water-year runoff for 4 km × 4 km grid cells for the conterminous United States during the 1900-2008 period. Model outputs are used to examine the separate effects of precipitation and temperature on runoff variability. Overall, water-year runoff has increased in the conterminous United States and precipitation has accounted for almost all of the variability in water-year runoff during the past century. In contrast, temperature effects on runoff have been small for most locations in the United States even during periods when temperatures for most of the United States increased significantly. Copyright 2011 by the American Geophysical Union.
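
As a reading aid, the following minimal Python sketch shows the bucket-type monthly water balance underlying models of this kind: potential evapotranspiration (PET) is derived from temperature, a single soil-moisture store fills with precipitation, and any surplus above capacity becomes runoff. The PET proxy, capacity, and initial storage are assumptions for illustration, not the authors' model.

```python
# Minimal single-bucket water balance (illustrative; not the McCabe-Wolock code).
def monthly_runoff(precip_mm, temp_c, capacity_mm=150.0, storage_mm=75.0):
    runoff = []
    for p, t in zip(precip_mm, temp_c):
        pet = max(0.0, 14.0 * t / 10.0)  # crude temperature-only PET proxy (assumed)
        storage_mm = storage_mm + p - min(pet, storage_mm + p)  # evaporation first
        surplus = max(0.0, storage_mm - capacity_mm)
        storage_mm -= surplus            # bucket overflow becomes runoff
        runoff.append(surplus)
    return runoff

# To isolate the temperature effect, rerun with detrended temperatures while
# keeping observed precipitation, and compare the two simulated runoff series.
print(sum(monthly_runoff([90, 120, 60], [2.0, 5.0, 11.0])))
```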

  14. User-guided discovery of declarative process models

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, van der W.M.P.; Chawla, N.; King, I.; Sperduti, A.

    2011-01-01

    Process mining techniques can be used to effectively discover process models from logs with example behaviour. Cross-correlating a discovered model with information in the log can be used to improve the underlying process. However, existing process discovery techniques have two important drawbacks.

  15. Mi-STAR Unit Challenges serve as a model for integrating earth science and systems thinking in a Next Generation Science Standards (NGSS) aligned curriculum.

    Science.gov (United States)

    Gochis, E. E.; Tubman, S.; Matthys, T.; Bluth, G.; Oppliger, D.; Danhoff, B.; Huntoon, J. E.

    2017-12-01

Michigan Science Teaching and Assessment Reform (Mi-STAR) is developing an NGSS-aligned middle school curriculum and associated teacher professional learning program in which science is taught and learned as an integrated body of knowledge that can be applied to address societal issues. With the generous support of the Herbert H. and Grace A. Dow Foundation, Mi-STAR has released several pilot-tested units through the Mi-STAR curriculum portal at mi-star.mtu.edu. Each of these units focuses on an ongoing 'Unit Challenge' investigation that integrates STEM content across disciplinary boundaries, stimulates interest, and engages students in using scientific practices to address 21st century challenges. Each Mi-STAR unit is connected to a Unifying NGSS Crosscutting Concept (CCC) that allows students to recognize the concepts that are related to the phenomena or problems under investigation. In the 6th grade, students begin with an exploration of the CCC Systems and System Models. Through repeated applications across units, students refine their understanding of what a system is and how to model a complex Earth system. An example 6th grade unit entitled "Water on the Move: The Water Cycle" provides an example of how Mi-STAR approaches the use of Unifying CCCs and Unit Challenges to enhance middle school students' understanding of the interconnections of Earth system processes and human activities. Throughout the unit, students use a series of hands-on explorations and simulations to explore the hydrologic cycle and how human activity can alter Earth systems. Students develop new knowledge through repeated interactions with the Unit Challenge, which requires development of system models and construction of evidence-based arguments related to flooding problems in a local community. Students have the opportunity to make predictions about how proposed land-use management practices (e.g. development of a skate-park, rain garden, soccer field, etc.) can alter the earth

  16. Deferred Action: Theoretical model of process architecture design for emergent business processes

    Directory of Open Access Journals (Sweden)

    Patel, N.V.

    2007-01-01

Full Text Available E-business modelling and e-business systems development assume fixed company resources, structures, and business processes. Empirical and theoretical evidence suggests that company resources and structures are emergent rather than fixed. Planning business activity in emergent contexts requires flexible e-business models based on better management theories and models. This paper builds and proposes a theoretical model of e-business systems capable of catering for emergent factors that affect business processes. Drawing on the development of theories of the 'action and design' class, the Theory of Deferred Action is invoked as the base theory for the theoretical model. A theoretical model of flexible process architecture is presented by identifying its core components and their relationships, and then illustrated with exemplar flexible process architectures capable of responding to emergent factors. Managerial implications of the model are considered and the model's generic applicability is discussed.

  17. System Model of Heat and Mass Transfer Process for Mobile Solvent Vapor Phase Drying Equipment

    Directory of Open Access Journals (Sweden)

    Shiwei Zhang

    2014-01-01

Full Text Available The solvent vapor phase drying process is one of the most important processes in the production and maintenance of large oil-immersed power transformers. In this paper, the working principle, system composition, and technological process of mobile solvent vapor phase drying (MVPD) equipment for transformers are introduced in detail. On the basis of necessary simplifications and assumptions for the MVPD equipment and process, a heat and mass transfer mathematical model comprising 40 equations is established. It fully represents the thermodynamic laws of phase change and the transport of solvent, water, and air in the MVPD technological process, and describes in detail the quantitative relationships among important physical quantities such as temperature, pressure, and flux in key equipment units and process stages. Taking a practical field drying process of a 500 kV/750 MVA power transformer as an example, the simulation of a complete technological process is carried out by programming in MATLAB, and curves of key process parameters over time, such as body temperature, tank pressure, and water yield, are obtained. The trend of the simulation results is very consistent with the actual production records, which verifies the correctness of the established mathematical model.

  18. Model of the heat load under dynamic abrasive processing of food material

    Directory of Open Access Journals (Sweden)

    G. V. Аlеksееv

    2016-01-01

Full Text Available The current stage in the improvement of food production is shaped by intense competition over cost-effectiveness, which is determined to a significant degree by the efficient use of agricultural raw materials. At the same time, the unfavourable ecological conditions that accompany the life of our society require that food products exert varied restorative effects on the organism. To solve this problem, researchers from many countries are uniting their efforts on the questions involved. The improvement and development of technology must rest on the study of what already exists. Such studies can be based on mathematical models of food products and of the corresponding processes, created in different research organizations. The development of high-quality, in-demand, competitive products is the goal of every modern producer, who chooses the simplest, most effective and economically justified way of solving the given problems. Modern advances in the theory and practice of quality control and analysis allow fundamentally new methods to be used for detecting possible negative changes in food products, in particular under heat processing. These methods, in addition to the traditional sensory component, also take into account a set of analytical models for identifying undesirable heating modes in processing a product for target consumer groups (for instance, for therapeutic and preventive nutrition).

  19. Silicon Carbide (SiC) Power Processing Unit (PPU) for Hall Effect Thrusters, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR project, APEI, Inc. is proposing to develop a high efficiency, rad-hard 3.8 kW silicon carbide (SiC) power supply for the Power Processing Unit (PPU) of...

  20. HTS current lead units prepared by the TFA-MOD processed YBCO coated conductors

    International Nuclear Information System (INIS)

    Shiohara, K.; Sakai, S.; Ishii, Y.; Yamada, Y.; Tachikawa, K.; Koizumi, T.; Aoki, Y.; Hasegawa, T.; Tamura, H.; Mito, T.

    2010-01-01

Two superconducting current lead units have been prepared using ten coated conductors of the Tri-Fluoro-Acetate Metal Organic Deposition (TFA-MOD) processed Y1Ba2Cu3O7−δ (YBCO) type, with a critical current (Ic) of about 170 A at 77 K in self-field. The coated conductors are 5 mm in width, 190 mm in length and about 120 μm in overall thickness. The 1.5 μm thick superconducting YBCO layer was synthesized through the TFA-MOD process on Hastelloy™ C-276 substrate tape with two buffer oxide layers of Gd2Zr2O7 and CeO2. Five YBCO coated conductors are attached to a 1 mm thick Glass Fiber Reinforced Plastics (GFRP) board and soldered to Cu caps at both ends. We prepared two 500 A-class current lead units. A DC transport current of 800 A was stably applied at 77 K without any voltage generation in the coated conductors. The voltage between the Cu caps increased linearly with the applied current and was about 350 μV at 500 A in both current lead units. The estimated heat leakage from 77 K to 4.2 K was 46.5 mW per current lead unit. This reduction in heat leakage was attained through the improved transport current performance (Ic), a thinner Ag layer on the YBCO coated conductor, and the use of a GFRP board for reinforcement instead of the stainless steel board used in the previous study. A DC transport current of 1400 A was stably applied when the two current lead units were joined in parallel, and the sum of their heat leakages from 77 K to 4.2 K was 93 mW. In comparison with conventional gas-cooled Cu current leads, the heat leakage of this current lead is about one order of magnitude smaller.

  1. Calibration by Hydrological Response Unit of a National Hydrologic Model to Improve Spatial Representation and Distribution of Parameters

    Science.gov (United States)

    Norton, P. A., II

    2015-12-01

The U.S. Geological Survey is developing a National Hydrologic Model (NHM) to support consistent hydrologic modeling across the conterminous United States (CONUS). The Precipitation-Runoff Modeling System (PRMS) simulates daily hydrologic and energy processes in watersheds, and is used for the NHM application. For PRMS, each watershed is divided into hydrologic response units (HRUs); by default, each HRU is assumed to have a uniform hydrologic response. The Geospatial Fabric (GF) is a database containing initial parameter values for input to PRMS and was created for the NHM. The parameter values in the GF were derived from datasets that characterize the physical features of the entire CONUS. The NHM application is composed of more than 100,000 HRUs from the GF. Selected parameter values commonly are adjusted by basin in PRMS using an automated calibration process based on calibration targets, such as streamflow. Providing each HRU with distinct values that capture variability within the CONUS may improve simulation performance of the NHM. During calibration of the NHM by HRU, selected parameter values are adjusted for PRMS based on calibration targets such as streamflow, snow water equivalent (SWE) and actual evapotranspiration (AET). Simulated SWE, AET, and runoff were compared to value ranges derived from multiple sources (e.g., the Snow Data Assimilation System, the Moderate Resolution Imaging Spectroradiometer (MODIS) Global Evapotranspiration Project, the Simplified Surface Energy Balance model, and the Monthly Water Balance Model). This provides each HRU with a distinct set of parameter values that captures the variability within the CONUS, leading to improved model performance. We present simulation results from the NHM after preliminary calibration, including the results of basin-level calibration for the NHM using: 1) default initial GF parameter values, and 2) parameter values calibrated by HRU.
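
One way to read "compared to value ranges derived from multiple sources" is as an objective function that is flat inside the target range and penalizes departures outside it. The sketch below is a hypothetical illustration of that idea, not the USGS calibration code; the function names and target dictionary are invented.

```python
def range_penalty(simulated, lower, upper):
    """Zero inside the target range, quadratic outside it."""
    if simulated < lower:
        return (lower - simulated) ** 2
    if simulated > upper:
        return (simulated - upper) ** 2
    return 0.0

def hru_objective(sim, targets):
    """Sum penalties over calibration targets for one HRU."""
    return sum(range_penalty(sim[k], lo, hi) for k, (lo, hi) in targets.items())

# e.g. per-HRU targets (units and bounds invented for illustration):
targets = {"swe_mm": (120.0, 180.0), "aet_mm": (400.0, 480.0), "runoff_mm": (90.0, 140.0)}
print(hru_objective({"swe_mm": 200.0, "aet_mm": 450.0, "runoff_mm": 85.0}, targets))
```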

  2. Techno-economic assessment of FT unit for synthetic diesel production in existing stand-alone biomass gasification plant using process simulation tool

    DEFF Research Database (Denmark)

    Hunpinyo, Piyapong; Narataruksa, Phavanee; Tungkamani, Sabaithip

    2014-01-01

For the alternative thermo-chemical conversion route via gasification, biomass can be gasified to produce syngas (mainly CO and H2). As a further application, syngas can be used to synthesize fuels through catalytic process options for producing synthetic liquid fuels...... such as Fischer-Tropsch (FT) diesel. Embedding the FT plant into stand-alone power-mode plants for the production of a synthetic fuel is a promising practice, which requires an extensive adaptation of conventional techniques to the special chemical needs found in gasified biomass. Because...... there are currently no plans to engage the FT process in Thailand, the authors have focused this work on improving FT configurations in existing biomass gasification facilities (10 MWth). A process simulation model for calculating extended unit operations in a demonstrative context is designed...

  3. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

The paper first presents an integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to handle information from patient requirements to usage, and from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined process perspectives. Second, the paper translates this information into a business process model and mathematical formalization. The study provides a useful guide to the various relevant technology-related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  4. Regime-switching models to study psychological process

    NARCIS (Netherlands)

    Hamaker, E.L.; Grasman, R.P.P.P.; Kamphuis, J.H.

    2010-01-01

Many psychological processes are characterized by recurrent shifts between different states. To model these processes at the level of the individual, regime-switching models may prove useful. In this chapter we discuss two of these models: the threshold autoregressive model and the Markov regime-switching model.

  5. Modeling biochemical transformation processes and information processing with Narrator

    Directory of Open Access Journals (Sweden)

    Palfreyman Niall M

    2007-03-01

Full Text Available Abstract Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a

  6. A multilayer electro-thermal model of pouch battery during normal discharge and internal short circuit process

    International Nuclear Information System (INIS)

    Chen, Mingbiao; Bai, Fanfei; Song, Wenji; Lv, Jie; Lin, Shili

    2017-01-01

Highlights: • A 2D network equivalent circuit considers the interplay of cell units. • The temperature non-uniformity Φ of the multilayer model is bigger than that of the lumped model. • The temperature non-uniformity is quantified and the reason for the non-uniformity is analyzed. • Increasing the thermal conductivity of the separator can effectively relieve the hot-spot effect of ISC. - Abstract: As the electrical and thermal characteristics affect a battery's safety, performance, calendar life and capacity fading, an electro-thermal coupled model of a LiFePO4/C pouch battery is developed for normal discharge and internal short circuit (ISC) processes. The battery is discretized into many cell elements which are united as a 2D network equivalent circuit. The electro-thermal model is solved with the finite difference method. The non-uniformity of the current and temperature distributions is simulated and the result is validated with experimental data at various discharge rates. Comparison of the lumped model and the multilayer structure model shows that the temperature non-uniformity Φ of the multilayer model is bigger than that of the lumped model, and the multilayer model is more precise. The temperature non-uniformity is quantified and the reason for the non-uniformity is analyzed. The electro-thermal model can also be used to guide the safety design of the battery. The temperature of the ISC element near the tabs is the highest because the equivalent resistance of the external circuit (not including the ISC element) is the smallest when the resistance of the cell units is small. It is found that increasing the thermal conductivity of the integrated layer can effectively relieve the hot-spot effect of ISC.
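
The toy sketch below shows the explicit finite-difference temperature update such a discretized model performs at each time step, with a per-element heat source standing in for Joule and reaction heat (and a locally large value for an ISC element). Material constants, grid spacing, and the periodic boundary treatment via np.roll are simplifying assumptions, and the real model couples this update to the 2D equivalent-circuit current solution.

```python
import numpy as np

def step_temperature(T, q_gen, dx=1e-3, dt=1e-2, rho=2000.0, cp=1000.0, k=1.0):
    """One explicit step of dT/dt = (k * laplacian(T) + q_gen) / (rho * cp).
    np.roll gives periodic boundaries, acceptable for a quick illustration."""
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T) / dx**2
    return T + dt * (k * lap + q_gen) / (rho * cp)

T = np.full((50, 30), 298.15)   # initial uniform temperature, K
q = np.full_like(T, 2e4)        # background heat generation, W/m^3 (assumed)
q[10, 15] = 5e7                 # local ISC hot spot (assumed magnitude)
for _ in range(1000):
    T = step_temperature(T, q)
print(f"hottest element: {T.max():.1f} K")
```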

  7. Estimating soil hydrological response by combining precipitation-runoff modeling and hydro-functional soil homogeneous units

    Science.gov (United States)

    Aroca-Jimenez, Estefania; Bodoque, Jose Maria; Diez-Herrero, Andres

    2015-04-01

Flash floods constitute one of the natural hazards best able to generate risk, particularly with regard to society. The complexity of this process and its dependence on various factors related to the characteristics of the basin and the rainfall make flash floods difficult to characterize in terms of their hydrological response. To do this, a proper analysis of the so-called 'initial abstractions' is essential. Among these processes, infiltration plays a crucial role in explaining the occurrence of floods in mountainous basins. For its characterization, the Green-Ampt model, which depends on the characteristics of rainfall and the physical properties of the soil, has been used in this work. This is a method that enables flood simulation in mountainous basins where the hydrological response is sub-daily. However, it has the disadvantage that it is based on physical properties of the soil which have a high spatial variability. To address this difficulty, soil mapping units were delineated according to geomorphological landforms and elements. They represent hydro-functional mapping units that are theoretically homogeneous from the perspective of the pedostructure parameters of the pedon. The soil texture of each homogeneous group of landform units was studied by granulometric analyses using standardized sieves and Sedigraph devices. In addition, the uncertainty associated with the parameterization of the Green-Ampt method was estimated by implementing a Monte Carlo approach, which required assigning a proper distribution function to each parameter. The suitability of this method was contrasted by calibrating and validating a hydrological model, in which the runoff hydrograph was simulated using the SCS unit hydrograph (HEC-GeoHMS software), while flood wave routing was characterized using the Muskingum-Cunge method. Calibration and validation of the model used an automatic routine based on a search algorithm
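
For concreteness, the Green-Ampt infiltration rate is f = K(1 + ψΔθ/F), where K is the saturated hydraulic conductivity, ψ the wetting-front suction head, Δθ the soil moisture deficit, and F the cumulative infiltration. The sketch below time-steps that rate and wraps it in a Monte Carlo loop over uncertain soil parameters; the distributions are placeholders, not those fitted by the authors.

```python
import random

def green_ampt_rate(K, psi, dtheta, F):
    # f = K * (1 + psi * dtheta / F), lengths in cm, K in cm/h
    return K * (1.0 + psi * dtheta / max(F, 1e-6))

def cumulative_infiltration(K, psi, dtheta, t_end_h=2.0, dt_h=0.01):
    F, t = 1e-3, 0.0
    while t < t_end_h:
        F += green_ampt_rate(K, psi, dtheta, F) * dt_h
        t += dt_h
    return F

# Monte Carlo over uncertain parameters (distributions assumed for illustration)
rng = random.Random(42)
samples = sorted(cumulative_infiltration(K=rng.lognormvariate(0.0, 0.5),
                                         psi=rng.uniform(5.0, 20.0),
                                         dtheta=rng.uniform(0.1, 0.3))
                 for _ in range(1000))
print("median F:", samples[500], "cm;  5-95% range:", samples[50], samples[950])
```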

  8. Modelling local government unit credit risk in the Republic of Croatia

    Directory of Open Access Journals (Sweden)

    Petra Posedel

    2012-12-01

Full Text Available The objective of this paper is to determine possible indicators that affect local unit credit risk and to investigate their effect on the default (credit) risk of local government units in Croatia. No system for the estimation of local unit credit risk has been established in Croatia so far, causing many practical problems in local unit borrowing. Because of the specific nature of the operations of local government units and legislation that does not allow local government units to go into bankruptcy, conventional methods for estimating credit risk are not applicable, and the set of standard potential determinants of credit risk has to be expanded with new indicators. Thus in the paper, in addition to the usual determinants of credit risk, the hypothesis of the influence of political factors on local unit credit risk in Croatia is also tested, with the use of a Tobit model. Results of the econometric analysis show that the credit risk of local government units in Croatia is affected by the political structure of local government, the proportion of income tax and surtax in operating revenue, the ratio of net operating balance, net financial liabilities and direct debt to operating revenue, as well as the ratio of debt repayment to cash, and of direct debt to operating revenue.
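
Because default data of this kind are censored (most units never default), a Tobit model mixes a normal density for uncensored observations with a normal CDF term for censored ones. The sketch below shows a generic left-censored Tobit negative log-likelihood one would minimize; it illustrates the estimator, not the authors' exact specification.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def tobit_negloglik(params, X, y, censor=0.0):
    """Left-censored Tobit: y = max(censor, X @ beta + eps), eps ~ N(0, sigma^2)."""
    beta, sigma = params[:-1], np.exp(params[-1])   # log-sigma keeps sigma > 0
    xb = X @ beta
    censored = y <= censor
    ll = np.where(censored,
                  norm.logcdf((censor - xb) / sigma),            # P(y* <= censor)
                  norm.logpdf((y - xb) / sigma) - np.log(sigma))
    return -ll.sum()

# Usage sketch: X holds fiscal and political indicators, y the default measure.
# x0 = np.zeros(X.shape[1] + 1)
# fit = minimize(tobit_negloglik, x0, args=(X, y), method="BFGS")
```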

  9. Probabilistic modeling of discourse-aware sentence processing.

    Science.gov (United States)

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  10. The (Mathematical) Modeling Process in Biosciences.

    Science.gov (United States)

    Torres, Nestor V; Santos, Guido

    2015-01-01

In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this work, the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system is presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling poses to current biology.

  11. Crop Yield Simulations Using Multiple Regional Climate Models in the Southwestern United States

    Science.gov (United States)

    Stack, D.; Kafatos, M.; Kim, S.; Kim, J.; Walko, R. L.

    2013-12-01

    Agricultural productivity (described by crop yield) is strongly dependent on climate conditions determined by meteorological parameters (e.g., temperature, rainfall, and solar radiation). California is the largest producer of agricultural products in the United States, but crops in associated arid and semi-arid regions live near their physiological limits (e.g., in hot summer conditions with little precipitation). Thus, accurate climate data are essential in assessing the impact of climate variability on agricultural productivity in the Southwestern United States and other arid regions. To address this issue, we produced simulated climate datasets and used them as input for the crop production model. For climate data, we employed two different regional climate models (WRF and OLAM) using a fine-resolution (8km) grid. Performances of the two different models are evaluated in a fine-resolution regional climate hindcast experiment for 10 years from 2001 to 2010 by comparing them to the North American Regional Reanalysis (NARR) dataset. Based on this comparison, multi-model ensembles with variable weighting are used to alleviate model bias and improve the accuracy of crop model productivity over large geographic regions (county and state). Finally, by using a specific crop-yield simulation model (APSIM) in conjunction with meteorological forcings from the multi-regional climate model ensemble, we demonstrate the degree to which maize yields are sensitive to the regional climate in the Southwestern United States.
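
A common recipe for "multi-model ensembles with variable weighting" is to weight each regional model by its skill against the reference dataset, here the NARR reanalysis. The inverse-RMSE weighting below is one plausible choice, assumed for illustration rather than taken from the study.

```python
import numpy as np

def skill_weights(model_fields, reference):
    """Weight each model field by inverse RMSE against the reference field."""
    rmse = np.array([np.sqrt(np.mean((m - reference) ** 2)) for m in model_fields])
    w = 1.0 / rmse
    return w / w.sum()

def weighted_ensemble(model_fields, weights):
    return sum(w * m for w, m in zip(weights, model_fields))

# e.g. two simulated temperature fields (stand-ins for WRF and OLAM output):
ref = np.random.rand(10, 10)
fields = [ref + np.random.normal(0, 0.1, ref.shape),
          ref + np.random.normal(0, 0.3, ref.shape)]
w = skill_weights(fields, ref)
print("weights:", w)   # the less biased model receives the larger weight
```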

  12. Grey water treatment by a continuous process of an electrocoagulation unit and a submerged membrane bioreactor system

    KAUST Repository

    Bani-Melhem, Khalid; Smith, Edward

    2012-01-01

    This paper presents the performance of an integrated process consisting of an electro-coagulation (EC) unit and a submerged membrane bioreactor (SMBR) technology for grey water treatment. For comparison purposes, another SMBR process without

  13. Styrene-spaced copolymers including anthraquinone and β-O-4 lignin model units: synthesis, characterization and reactivity under alkaline pulping conditions.

    Science.gov (United States)

    Megiatto, Jackson D; Cazeils, Emmanuel; Ham-Pichavant, Frédérique; Grelier, Stéphane; Gardrat, Christian; Castellan, Alain

    2012-05-14

A series of random copoly(styrene)s has been synthesized via radical polymerization of functionalized anthraquinone (AQ) and β-O-4 lignin model monomers. The copolymers were designed to have a different number of styrene spacer groups between the AQ and β-O-4 lignin side chains, with the aim of investigating distance effects on AQ/β-O-4 electron transfer mechanisms. A detailed molecular characterization, including techniques such as size exclusion chromatography, MALDI-TOF mass spectrometry, and 1H, 13C and 31P NMR and UV-vis spectroscopies, afforded quantitative information about the composition of the copolymers as well as the average distribution of the AQ and β-O-4 groups in the macromolecular structures. TGA and DSC thermal analyses indicated that the copolymers were thermally stable under regular pulping conditions, revealing the inertness of the styrene polymer backbone in the investigation of electron transfer mechanisms. Alkaline pulping experiments showed that close contact between the redox-active side chains in the copolymers was fundamental for an efficient degradation of the β-O-4 lignin model units, highlighting the importance of electron transfer reactions in the lignin degradation mechanisms catalyzed by AQ. In the absence of glucose, AQ units oxidized phenolic β-O-4 lignin model parts, mainly by electron transfer, leading to vanillin as the major product. By contrast, in the presence of glucose, anthrahydroquinone units (formed by reduction of AQ) reduced the quinone-methide units (formed by dehydration of the phenolic β-O-4 lignin model part), mainly by electron transfer, leading to guaiacol as the major product. Both processes were distance dependent.

  14. Exergy analysis of an industrial unit of catalyst regeneration based on the results of modeling and simulation

    International Nuclear Information System (INIS)

    Toghyani, Mahboubeh; Rahimi, Amir

    2015-01-01

An industrial process is synthesized and developed for decoking of the dehydrogenation catalyst used in LAB (Linear Alkyl Benzene) production. A multi-tube fixed bed reactor with short tubes is designed for decoking of the catalyst as the main equipment of the process. This study provides a microscopic exergy analysis for the decoking reactor and a macroscopic exergy analysis for the synthesized regeneration process. Dynamic mathematical modeling and simulation of the process with commercial software are applied simultaneously. The model was previously developed for performance analysis of the decoking reactor. An appropriate exergy model is developed and adopted to estimate the enthalpy, exergetic efficiency and irreversibility. The model is validated against operating data measured in a commercial regeneration unit for variations in gas and particle characteristics along the reactor. In the coke-combustion period, in spite of the high reaction rate, the reactor has low exergetic efficiency due to entropy production during heat and mass transfer processes. The effects of inlet gas flow rate, temperature and oxygen concentration on the exergetic efficiency and irreversibilities are investigated. Macroscopic results indicate that the fan has the highest irreversibilities among the equipment. Applying proper operating variables reduces the cycle irreversibilities by at least 20%. - Highlights: • A microscopic exergy analysis for a multi-tube fixed bed reactor is conducted. • Controlling the O2 concentration upgrades the reactor's exergetic performance. • A macroscopic exergy analysis for the synthesized regeneration process is conducted. • The fan is one of the main sources of the regeneration cycle irreversibility. • The proposed strategies can reduce the cycle irreversibilities by at least 20%.
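
For reference, the specific flow exergy behind such an analysis is ex = (h − h0) − T0(s − s0) relative to a dead state (T0, h0, s0), with unit irreversibility given by the exergy destroyed. A minimal sketch, with symbol names and numbers chosen here purely for illustration:

```python
def flow_exergy(h, s, h0, s0, T0=298.15):
    """Specific physical flow exergy, kJ/kg, relative to the dead state."""
    return (h - h0) - T0 * (s - s0)

def unit_balance(ex_in, ex_out):
    """Exergetic efficiency and irreversibility (exergy destroyed) of a unit."""
    return ex_out / ex_in, ex_in - ex_out

eff, irrev = unit_balance(ex_in=850.0, ex_out=620.0)   # kW, illustrative values
print(f"efficiency {eff:.2f}, irreversibility {irrev:.0f} kW")
```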

  15. Development of diagnostic process for abnormal conditions of Ulchin units 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Hyun Soo; Kwak, Jeong Keun; Yun, Jung Hyun; Kim, Jong Hyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2012-10-15

Diagnosis of abnormal conditions during operation is one of the difficult tasks faced by nuclear power plant operators. Operators may have trouble handling abnormal conditions for various reasons, such as 1) the large number of alarms (around 2,000 in each of Ulchin units 1 and 2) and multiple simultaneous alarm occurrences, 2) the same alarms occurring in different abnormal conditions, and 3) the large number of Abnormal Operating Procedures (AOPs). For these reasons, the first diagnosis of abnormal conditions relies largely on the operator's experience and pattern recognition, a difficulty that is amplified for inexperienced operators. This paper suggests an approach to developing an optimal diagnostic process for the appropriate selection of AOPs using the Elimination by Aspects (EBA) method. The EBA method uses a heuristic followed by decision makers during sequential choice, which strikes a good balance between the cost of a decision and its quality. At each stage of the decision, all options lacking a given expected attribute are eliminated, until only one option remains. The approach is applied to the steam generator level control system abnormal procedure for Ulchin units 1 and 2. The result indicates that the EBA method is applicable to the development of an optimal process for diagnosing abnormal conditions.
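
A minimal sketch of the EBA idea applied to procedure selection: candidate AOPs are filtered by one observed symptom (aspect) at a time until a single procedure remains. The procedure names and symptom sets below are hypothetical, not taken from the Ulchin procedures.

```python
def eliminate_by_aspects(candidates, aspects):
    """Sequentially discard candidate AOPs lacking each observed aspect.

    candidates: dict mapping AOP name -> set of attributes (alarms, symptoms).
    aspects:    ordered list of attributes checked by the operator.
    """
    remaining = dict(candidates)
    for aspect in aspects:
        filtered = {k: v for k, v in remaining.items() if aspect in v}
        if filtered:                 # never eliminate down to zero options
            remaining = filtered
        if len(remaining) == 1:
            break
    return list(remaining)

# Hypothetical example: two SG-level-related procedures and one unrelated one.
aops = {"AOP-SG-Level": {"SG level low", "FW flow mismatch"},
        "AOP-FW-Pump":  {"FW flow mismatch", "pump trip"},
        "AOP-Cond-Vac": {"condenser vacuum low"}}
print(eliminate_by_aspects(aops, ["FW flow mismatch", "SG level low"]))
```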

  16. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
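
The toy loop below conveys the structure being simulated: each spiral iteration repeats the same phases, and the risk retired in one cycle shrinks the effort uncertainty of the next. Effort figures and the risk-retirement rate are invented for illustration and are not PATT model parameters.

```python
import random

PHASES = ["risk assessment", "requirements", "design", "code", "test", "evaluate"]

def spiral_effort(iterations=4, base_effort=10.0, risk=0.5, seed=1):
    """Total effort over a spiral project; remaining risk inflates phase effort."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(iterations):
        for _phase in PHASES:
            total += base_effort * (1.0 + risk * rng.random())
        risk *= 0.6   # assume each cycle retires ~40% of remaining risk
    return total

print(f"total effort: {spiral_effort():.1f} person-days (illustrative units)")
```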

  17. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

The Software Engineering Process Simulation (SEPS) model is described which was developed at JPL. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform post-mortem assessments.

  18. Graphics Processing Unit Enhanced Parallel Document Flocking Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL; ST Charles, Jesse Lee [ORNL

    2010-01-01

Analyzing and clustering documents is a complex problem. One explored method of solving this problem borrows from nature, imitating the flocking behavior of birds. One limitation of this method of document clustering is its O(n²) complexity. As the number of documents grows, it becomes increasingly difficult to generate results in a reasonable amount of time. In the last few years, the graphics processing unit (GPU) has received attention for its ability to solve highly-parallel and semi-parallel problems much faster than the traditional sequential processor. In this paper, we have conducted research to exploit this architecture and apply its strengths to the flocking based document clustering problem. Using the CUDA platform from NVIDIA, we developed a document flocking implementation to be run on the NVIDIA GeForce GPU. Performance gains ranged from thirty-six to nearly sixty times improvement of the GPU over the CPU implementation.
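
At the heart of the method is the O(n²) neighbour search between document "boids". The NumPy sketch below is a CPU-side analogue of that computation; in the CUDA implementation, each GPU thread would evaluate roughly one row of this distance matrix in parallel.

```python
import numpy as np

def neighbour_mask(positions, radius):
    """All-pairs neighbour test, the O(n^2) core of flocking clustering."""
    diff = positions[:, None, :] - positions[None, :, :]   # shape (n, n, 2)
    dist2 = np.einsum("ijk,ijk->ij", diff, diff)           # squared distances
    return dist2 < radius ** 2

pos = np.random.rand(1000, 2) * 100.0   # 1000 document "boids" on a 100x100 plane
mask = neighbour_mask(pos, radius=5.0)
print("average neighbours per document:", mask.sum(axis=1).mean() - 1)
```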

  19. Metrics for Business Process Models

    Science.gov (United States)

    Mendling, Jan

Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores the extent to which certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.
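
As a concrete instance of such complexity metrics, the sketch below computes two simple candidates, model size and arc density, from a process graph via networkx; both the metric selection and the toy model are illustrative, not those of the cited studies.

```python
import networkx as nx

def model_metrics(arcs):
    """Size (node count) and density (arcs relative to the possible maximum)."""
    g = nx.DiGraph(arcs)
    return {"size": g.number_of_nodes(), "density": nx.density(g)}

toy_model = [("start", "check order"), ("check order", "xor-split"),
             ("xor-split", "ship goods"), ("xor-split", "reject order")]
print(model_metrics(toy_model))
# Larger, denser models would then be tested for higher error probability.
```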

  20. Fiducial-based monocular 3D displacement measurement of breakwater armour unit models.

    CSIR Research Space (South Africa)

    Vieira, R

    2008-11-01

Full Text Available This paper presents a fiducial-based approach to monitoring the movement of breakwater armour units in a model hall environment. Target symbols with known dimensions are attached to the physical models, allowing the recovery of three-dimensional displacement information.

  1. Modeling Hydrologic Processes after Vegetation Restoration in an Urban Watershed with HEC-HMS

    Science.gov (United States)

    Stevenson, K.; Kinoshita, A. M.

    2017-12-01

The San Diego River Watershed in California (USA) is highly urbanized, and stream channel geomorphology is directly affected by anthropogenic disturbances. Flooding and water quality concerns have led to an increased interest in improving the condition of urban waterways. Alvarado Creek, a 1200-meter section of a tributary to the San Diego River, will be used as a case study to understand the degree to which restoration efforts reduce the impacts of climate change and anthropogenic activities on hydrologic processes and water quality in urban stream ecosystems. In 2016, non-native vegetation (i.e., Washingtonia spp. (fan palm) and Phoenix canariensis (Canary Island palm)) and approximately 7257 kilograms of refuse were removed from the study reach. This research develops a model in the United States Army Corps of Engineers Hydrologic Engineering Center's Hydrologic Modeling System (USACE HEC-HMS) using field-based data to model and predict the short- and long-term impacts of restoration on geomorphic and hydrologic processes. Observations include cross-sectional area, grain-size distributions, water quality, and continuous measurements of streamflow, temperature, and precipitation. Baseline and design storms are simulated before and after restoration. The model will be calibrated and validated using field observations. The design storms represent statistical likelihoods of storm occurrences, and the pre- and post-restoration hydrologic responses will be compared to evaluate the impact of vegetation and waste removal on runoff processes. Ultimately, model parameters will be transferred to other urban creeks in San Diego that may potentially undergo restoration. Modeling will be used to learn about the response trajectory of rainfall-runoff processes following restoration efforts in urban streams and to guide future management and restoration activities.
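
For the loss side of such a design-storm simulation, HEC-HMS's SCS curve-number option computes direct runoff depth as Q = (P − 0.2S)² / (P + 0.8S) for P > 0.2S, with retention S = 1000/CN − 10 (inches). A minimal sketch, with a curve number assumed here for an urbanized basin:

```python
def scs_runoff_depth(p_in, cn):
    """SCS curve-number direct runoff depth in inches."""
    s = 1000.0 / cn - 10.0        # potential maximum retention
    ia = 0.2 * s                  # initial abstraction (standard 0.2S assumption)
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in + 0.8 * s)

# e.g. a 2.5 in design storm on an urbanized reach with CN = 85 (assumed):
print(f"runoff depth: {scs_runoff_depth(2.5, 85):.2f} in")   # about 1.18 in
```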

  2. Revising process models through inductive learning

    NARCIS (Netherlands)

    Maggi, F.M.; Corapi, D.; Russo, A.; Lupu, E.; Visaggio, G.; Muehlen, zur M.; Su, J.

    2011-01-01

    Discovering the Business Process (BP) model underpinning existing practices through analysis of event logs, allows users to understand, analyse and modify the process. But, to be useful, the BP model must be kept in line with practice throughout its lifetime, as changes occur to the business

  3. Diagnosing differences between business process models

    NARCIS (Netherlands)

    Dijkman, R.M.; Dumas, M.; Reichert, M.; Shan, M.-C.

    2008-01-01

    This paper presents a technique to diagnose differences between business process models in the EPC notation. The diagnosis returns the exact position of a difference in the business process models and diagnoses the type of a difference, using a typology of differences developed in previous work.

  4. COGNITIVE STRUCTURING OF TEXT COMPREHENSION PROCESS IN THE ASPECT OF MICROLINGUISTICS

    Directory of Open Access Journals (Sweden)

    Kolodina Nina Ivanovna

    2014-09-01

Full Text Available The article discusses the theory of mnemo-units of knowledge from the perspective of microlinguistics. A mnemo-unit of knowledge is considered to be a unit of knowledge in operative memory that cannot be verbalized but can be explained. Singling out such units makes it possible, on the one hand, to construct a structural scheme of the comprehension process and, on the other hand, to justify the theory of comprehension as a process of operating with tiny recognized and unrecognized units that have a schematic or contour fixation in human memory. The process of text comprehension is analyzed and compared with the process of making saccades. Examples of eyesight fixation on words picked out on a line allow us to speak of the fixation of attention only on those words. Summing up these theoretical and practical data grounds the theory of mnemo-units of knowledge in the aspect of microlinguistics. The comprehension process demands the support of steady connections between mnemo-units of knowledge. In their turn, the steady connections between mnemo-units, which are necessary for the production of thinking forms, are ensured by the constant activation of the same units in the same sequence. Constant and sequential activation of the same units of knowledge leads to the stereotyping of the human thinking process. A cognitive model of the structural thinking process is built in the article. The analysis of the data on stereotyped comprehension reveals that the activation of one group of mnemo-units demands the activation of another group. The activated groups of mnemo-units determine the psychological structure of the personality. In this aspect, motivation and behavior are necessary steps in the cognitive model of the structural comprehension process within which the psychological structure is considered.

  5. AMFIBIA: A Meta-Model for the Integration of Business Process Modelling Aspects

    DEFF Research Database (Denmark)

    Axenath, Björn; Kindler, Ekkart; Rubin, Vladimir

    2007-01-01

    AMFIBIA is a meta-model that formalises the essential aspects and concepts of business processes. Though AMFIBIA is not the first approach to formalising the aspects and concepts of business processes, it is more ambitious in the following respects: Firstly, it is independent from particular...... modelling formalisms of business processes and it is designed in such a way that any formalism for modelling some aspect of a business process can be plugged into AMFIBIA. Therefore, AMFIBIA is formalism-independent. Secondly, it is not biased toward any aspect of business processes; the different aspects...... can be considered and modelled independently of each other. Moreover, AMFIBIA is not restricted to a fixed set of aspects; new aspects of business processes can be easily integrated. Thirdly, AMFIBIA does not only name and relate the concepts of business process modelling, as it is typically done...

  6. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid...... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types......, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals-based formulated products is also given....

  7. Informing Leadership Models: Nursing and Organizational Characteristics of Neonatal Intensive Care Units in Freestanding Children's Hospitals.

    Science.gov (United States)

    Toole, Cheryl A; DeGrazia, Michele; Connor, Jean Anne; Gauvreau, Kimberlee; Kuzdeba, Hillary Bishop; Hickey, Patricia A

Neonatal intensive care units (NICUs) located in freestanding children's hospitals may exhibit significant variation in nursing and organizational characteristics, which can serve as opportunities for collaboration to understand optimal staffing models and linkages to patient outcomes. Adopting methods used by Hickey et al. in pediatric cardiovascular critical care, the purpose of this study was to provide a foundational description of the nursing and organizational characteristics of NICUs located in freestanding children's hospitals in the United States. Clinical nurse leaders in NICUs located in freestanding children's hospitals were invited to participate in an electronic cross-sectional survey. Descriptive analyses were used to summarize nursing and organizational characteristics. The response rate was 30% (13/43), with 69.2% of NICUs classified as level III/IV and 30.8% classified as level II/III. Licensed bed capacity varied significantly (range, 24-167), as did the number of full-time equivalent nurses (range, 71.78-252.3). Approximately three-quarters of staff nurses held baccalaureate degrees or higher. About a quarter (26.3%) of nurses had 16 or more years of experience, and 36.9% had 11 or more years of nursing experience. Nearly one-third (29.2%) had 5 or fewer years of total nursing experience. Few nurses (10.6%) held neonatal specialty certification. All units had nurse educators, national and unit-based quality metrics, and procedural checklists. This study identified (1) variation in staffing models signaling an opportunity for collaboration, (2) the need to establish ongoing processes for sites to participate in future collaborative efforts, and (3) survey modifications necessary to ensure a more comprehensive understanding of nursing and organizational characteristics in freestanding children's hospital NICUs.

  8. Understanding Quality in Process Modelling: Towards a Holistic Perspective

    Directory of Open Access Journals (Sweden)

    Jan Recker

    2007-09-01

Full Text Available Quality is one of the main topics in current conceptual modelling research, as is the field of business process modelling. Yet, widely acknowledged academic contributions towards an understanding or measurement of business process model quality are limited at best. In this paper I argue that the development of methodical theories concerning the measurement or establishment of process model quality must be preceded by methodological elaborations on business process modelling. I further argue that existing epistemological foundations of process modelling are insufficient for describing all extrinsic and intrinsic traits of model quality. This in turn has led to a lack of holistic understanding of process modelling. Taking into account the inherent social and purpose-oriented character of process modelling in contemporary organizations, I present a socio-pragmatic constructionist methodology of business process modelling and sketch out implications of this perspective towards an understanding of process model quality. I anticipate that, based on this research, theories can be developed that facilitate the evaluation of the 'goodness' of a business process model.

  9. Animated-simulation modeling facilitates clinical-process costing.

    Science.gov (United States)

    Zelman, W N; Glick, N D; Blackmore, C C

    2001-09-01

    Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making.

  10. Comparison of ultrafiltration and dissolved air flotation efficiencies in industrial units during the papermaking process

    OpenAIRE

    Monte Lara, Concepción; Ordóñez Sanz, Ruth; Hermosilla Redondo, Daphne; Sánchez González, Mónica; Blanco Suárez, Ángeles

    2011-01-01

The efficiency of an ultrafiltration unit has been studied and compared with a dissolved air flotation system to obtain water of suitable quality for reuse in the process. The study was done at a paper mill producing lightweight coated paper and newsprint from 100% recovered paper. Efficiency was analysed by the removal of turbidity, cationic demand, total and dissolved chemical oxygen demand, hardness, sulphates and microstickies. Moreover, the performance of the ultrafiltration unit an...

  11. Exploring the decision-making process in the delivery of physiotherapy in a stroke unit.

    Science.gov (United States)

    McGlinchey, Mark P; Davenport, Sally

    2015-01-01

The aim of this study was to explore the decision-making process in the delivery of physiotherapy in a stroke unit. A focused ethnographic approach involving semi-structured interviews and observations of clinical practice was used. A purposive sample of seven neurophysiotherapists and four patients participated in semi-structured interviews. From this group, three neurophysiotherapists and four patients were involved in observation of practice. Data from interviews and observations were analysed to generate themes. Three themes were identified: planning the ideal physiotherapy delivery, the reality of physiotherapy delivery, and involvement in the decision-making process. Physiotherapists used a variety of clinical reasoning strategies and considered many factors influencing their decision-making in the planning and delivery of physiotherapy post-stroke. These factors included the therapist's clinical experience, the patient's presentation and response to therapy, prioritisation, organisational constraints and compliance with organisational practice. All physiotherapists highlighted the importance of involving patients in planning and delivering their physiotherapy. However, varying levels of patient involvement were observed in this process. The study has generated insight into the reality of decision-making in the planning and delivery of physiotherapy post-stroke. Further research involving other stroke units is required to gain a greater understanding of this aspect of physiotherapy. Implications for Rehabilitation: Physiotherapists need to consider multiple patient, therapist and organisational factors when planning and delivering physiotherapy in a stroke unit. Physiotherapists should continually reflect upon how they provide physiotherapy, with respect to the duration, frequency and time of day at which sessions are delivered, in order to guide current and future physiotherapy delivery. As patients may demonstrate varying levels of participation in deciding and

  12. How visual cognition influences process model comprehension

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2017-01-01

    Process analysts and other professionals extensively use process models to analyze business processes and identify performance improvement opportunities. Therefore, it is important that such models can be easily and properly understood. Previous research has mainly focused on two types of factors

  13. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil

    2009-01-01

    behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures are considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation are presented....... The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding...

  14. On Problem of Mathematical Modelling of Thermo-Physical Processes in Regenerative Water-Evaporating Coolers

    Science.gov (United States)

    Gulevsky, V. A.; Shatsky, V. P.; Osipov, E. I.; Menzhulova, A. S.

    2018-03-01

Water-evaporating air conditioners are increasingly applied for cooling the air of industrial premises. The simplicity of their construction, ecological safety and low power consumption distinguish them from coolers of other types. Cooling of the processed air is due to the loss of energy to the evaporation of moisture from the surface of water-wetted plates that form the air channels. As a result of this process, the cooled air is often saturated with moisture, which limits the operation of coolers of this type. In such cases, more complex indirect-cycle coolers, which do not have this drawback, should be applied. The most effective modification of indirect cooling is the installation of recuperative units. The paper presents a mathematical model of heat-mass transfer in such water-evaporating coolers. A scheme for implementing this model is suggested, based on an iterative algorithm for solving the system of finite-difference linear equations that takes into account the longitudinal and transverse thermal conductivity of the heat transfer plates. The possibility of obtaining optimal values for the redistribution of the main and auxiliary air flows through the substantiation of the aerodynamic resistance of the output grid is proved. This makes it possible to dispense with an additional fan unit for discharging the auxiliary air stream.

  15. APROMORE : an advanced process model repository

    NARCIS (Netherlands)

    La Rosa, M.; Reijers, H.A.; Aalst, van der W.M.P.; Dijkman, R.M.; Mendling, J.; Dumas, M.; García-Bañuelos, L.

    2011-01-01

    Business process models are becoming available in large numbers due to their widespread use in many industrial applications such as enterprise and quality engineering projects. On the one hand, this raises a challenge as to their proper management: how can it be ensured that the proper process model

  16. Modeling and forecasting the volatility of Islamic unit trust in Malaysia using GARCH model

    Science.gov (United States)

    Ismail, Nuraini; Ismail, Mohd Tahir; Karim, Samsul Ariffin Abdul; Hamzah, Firdaus Mohamad

    2015-10-01

    Since the Islamic unit trust was first introduced in Malaysia on 12 January 1993 through the Tabung Ittikal fund managed by Arab-Malaysian Securities, the sector has grown tremendously, and numerous studies have evaluated the performance of the Islamic unit trusts offered in Malaysia's capital market. Most of these studies found that one of the factors affecting fund performance is the level of volatility: higher volatility produces better fund performance. We therefore believe fund managers must set up a strategy for their funds to perform better. Using net asset value (NAV) series for three funds, namely CIMB-IDEGF, CIMB-IBGF and CIMB-ISF, from the fund management company CIMB Principal Asset Management Berhad over the six-year period from 1 January 2008 to 31 December 2013, we model and forecast the volatility of these Islamic unit trusts. The study found that the best-fitting models for CIMB-IDEGF, CIMB-IBGF and CIMB-ISF are ARCH(4), GARCH(3,3) and GARCH(3,1) respectively. The fund expected to be the least volatile is CIMB-IDEGF, and the fund expected to be the most volatile is CIMB-IBGF.
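
    As a hedged illustration of the study's approach, the sketch below fits the GARCH(3,1) specification selected for CIMB-ISF using the Python arch package; the package choice, file name and column names are assumptions, not details from the record.

        # Fit and forecast a GARCH model on NAV returns; a minimal sketch,
        # assuming the "arch" package and a hypothetical CSV of NAV data.
        import pandas as pd
        from arch import arch_model

        nav = pd.read_csv("cimb_isf_nav.csv", index_col="date",
                          parse_dates=True)["nav"]
        returns = 100 * nav.pct_change().dropna()  # percentage returns from NAV

        # GARCH(3,1), the specification the study selects for CIMB-ISF;
        # varying p and q and comparing AIC/BIC reproduces model selection
        # (p=4, q=0 gives the ARCH(4) chosen for CIMB-IDEGF).
        model = arch_model(returns, vol="GARCH", p=3, q=1, mean="Constant")
        result = model.fit(disp="off")
        print(result.summary())

        # Out-of-sample conditional volatility, e.g. ten steps ahead.
        forecast = result.forecast(horizon=10)
        print(forecast.variance.iloc[-1] ** 0.5)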

  17. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash-Sutcliffe model performance statistic was unable to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for testing the model against a dataset independent of that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent
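
    The Nash-Sutcliffe statistic mentioned above is simple to state; the sketch below computes it for a pair of hypothetical observed and simulated series (the numbers are invented, not data from the study).

        # Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations
        # about their mean. A value of 1 is a perfect fit, 0 matches the
        # mean predictor, and negative values are worse than the mean.
        import numpy as np

        def nash_sutcliffe(observed, simulated) -> float:
            observed = np.asarray(observed, dtype=float)
            simulated = np.asarray(simulated, dtype=float)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
                (observed - observed.mean()) ** 2)

        obs = [0.12, 0.30, 0.25, 0.40, 0.22]  # hypothetical daily P conc. (mg/l)
        sim = [0.15, 0.28, 0.20, 0.35, 0.30]
        print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")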

  18. The evolution of process-based hydrologic models

    NARCIS (Netherlands)

    Clark, Martyn P.; Bierkens, Marc F.P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R.N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-01-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this

  19. Software Process Validation: Quantitatively Measuring the Correspondence of a Process to a Model

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1997-01-01

    .... When process models and process executions diverge, something significant is happening. The authors have developed techniques for uncovering and measuring the discrepancies between models and executions, which they call process validation...
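
    One simple way to quantify the discrepancy between a modelled event sequence and an observed execution is an edit distance over event names. The sketch below is in that spirit only; the event names and traces are hypothetical, and this is not Cook and Wolf's actual metric.

        # Discrepancy between a modelled and an observed event trace via
        # classic dynamic-programming Levenshtein edit distance.

        def edit_distance(a: list[str], b: list[str]) -> int:
            m, n = len(a), len(b)
            d = [[0] * (n + 1) for _ in range(m + 1)]
            for i in range(m + 1):
                d[i][0] = i
            for j in range(n + 1):
                d[0][j] = j
            for i in range(1, m + 1):
                for j in range(1, n + 1):
                    cost = 0 if a[i - 1] == b[j - 1] else 1
                    d[i][j] = min(d[i - 1][j] + 1,         # delete event
                                  d[i][j - 1] + 1,         # insert event
                                  d[i - 1][j - 1] + cost)  # substitute event
            return d[m][n]

        modeled = ["checkout", "review", "test", "commit"]
        observed = ["checkout", "test", "review", "test", "commit"]
        dist = edit_distance(modeled, observed)
        norm = dist / max(len(modeled), len(observed))
        print(f"discrepancy = {dist}, normalised = {norm:.2f}")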

  20. Distillation modeling for a uranium refining process

    Energy Technology Data Exchange (ETDEWEB)

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled both by an equilibrium expression and on a molecular basis, since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to deviations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.
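
    The two modelling bases named in the abstract can be sketched side by side: a Raoult-type equilibrium estimate of vapour composition, and a Hertz-Knudsen-Langmuir molecular evaporation flux suited to moderate vacuum. All property values below are placeholders, not data from the Argonne process.

        # Hedged sketch of the two modelling bases: equilibrium (Raoult-type)
        # vapour composition vs. molecular (Hertz-Knudsen-Langmuir) flux.
        # All numbers are placeholders, not Argonne process data.
        import math

        R = 8.314  # gas constant, J/(mol K)

        def equilibrium_vapor_fraction(x_salt: float, p_sat_salt: float,
                                       p_sat_uranium: float) -> float:
            """Raoult's-law mole fraction of salt in the vapour over the melt."""
            p_salt = x_salt * p_sat_salt
            p_u = (1.0 - x_salt) * p_sat_uranium
            return p_salt / (p_salt + p_u)

        def molecular_flux(p_sat: float, molar_mass: float, temp_k: float,
                           alpha: float = 1.0) -> float:
            """Hertz-Knudsen-Langmuir evaporation flux into vacuum,
            in mol/(m^2 s), with p_sat in Pa and molar_mass in kg/mol."""
            return alpha * p_sat / math.sqrt(
                2.0 * math.pi * molar_mass * R * temp_k)

        # Placeholders: salt vapour pressure ~100 Pa at the processing
        # temperature; uranium vapour pressure effectively negligible.
        print(equilibrium_vapor_fraction(x_salt=0.9, p_sat_salt=100.0,
                                         p_sat_uranium=1e-6))
        print(molecular_flux(p_sat=100.0, molar_mass=0.0745, temp_k=1100.0))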