WorldWideScience

Sample records for environment process model

  1. Near Field Environment Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Wagner

    2000-11-14

Waste emplacement and activities associated with construction of a repository system potentially will change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect distribution of water, increase kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers as well as potentially changing the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.

  2. Energy and environment efficiency analysis based on an improved environment DEA cross-model: Case study of complex chemical processes

    International Nuclear Information System (INIS)

    Geng, ZhiQiang; Dong, JunGen; Han, YongMing; Zhu, QunXiong

    2017-01-01

Highlights: •An improved environment DEA cross-model method is proposed. •An energy and environment efficiency analysis framework for complex chemical processes is obtained. •The proposed method is effective for energy saving and emission reduction in complex chemical processes. -- Abstract: The complex chemical process is a high-pollution and high-energy-consumption industrial process. Therefore, it is very important to analyze and evaluate the energy and environment efficiency of the complex chemical process. Data Envelopment Analysis (DEA) is used to evaluate the relative effectiveness of decision-making units (DMUs). However, the traditional DEA method usually cannot genuinely distinguish effective from inefficient DMUs because of its extreme or unreasonable weight distribution of input and output variables. Therefore, this paper proposes an energy and environment efficiency analysis method based on an improved environment DEA cross-model (DEACM). The inputs of the complex chemical process are divided into energy and non-energy inputs, and the outputs are divided into desirable and undesirable outputs. The energy and environment performance index (EEPI), based on cross evaluation, is then used to represent the overall performance of each DMU. Moreover, the improvement direction for energy saving and carbon emission reduction of each inefficient DMU is obtained quantitatively from the self-evaluation model of the improved environment DEACM. The results show that the improved environment DEACM discriminates effective DMUs better than the original DEA method when analyzing the energy and environment efficiency of the ethylene production process in complex chemical processes, and that it can identify the energy-saving and carbon-emission-reduction potential of ethylene plants, especially the improvement direction of inefficient DMUs to improve energy efficiency and reduce carbon emissions.
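
    The cross-evaluation step behind the EEPI can be illustrated with a plain CCR cross-efficiency calculation. The sketch below is a generic Python version, not the paper's improved environment DEACM (it does not separate energy from non-energy inputs or handle undesirable outputs); the three-DMU data set and the use of scipy's linprog are illustrative assumptions.

```python
# Hedged sketch: plain CCR cross-efficiency DEA (not the improved environment
# DEACM of the paper). Input/output data are invented for illustration.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])   # inputs, one row per DMU
Y = np.array([[1.0], [1.0], [1.2]])                   # outputs, one row per DMU
n, m = X.shape
s = Y.shape[1]

def ccr_weights(k):
    """Solve the input-oriented CCR multiplier model for DMU k."""
    # maximize u.y_k  <=>  minimize -u.y_k, variables = [v (inputs), u (outputs)]
    c = np.concatenate([np.zeros(m), -Y[k]])
    A_eq = [np.concatenate([X[k], np.zeros(s)])]      # v.x_k = 1 (normalization)
    b_eq = [1.0]
    A_ub = np.hstack([-X, Y])                          # u.y_j - v.x_j <= 0 for all j
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (m + s), method="highs")
    return res.x[:m], res.x[m:]

# Cross-efficiency matrix: DMU j scored with DMU k's optimal weights
E = np.zeros((n, n))
for k in range(n):
    v, u = ccr_weights(k)
    E[k] = (Y @ u) / (X @ v)

cross_eff = E.mean(axis=0)   # column averages: an EEPI-like overall score per DMU
print(cross_eff)
```

    Each DMU is first scored with its own best weights and then re-scored with every other DMU's weights; the column averages play the role of an overall cross-efficiency index analogous to the EEPI.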

  3. Space - A unique environment for process modeling R&D

    Science.gov (United States)

    Overfelt, Tony

    1991-01-01

Process modeling, the application of advanced computational techniques to simulate real processes as they occur in regular use, e.g., welding, casting and semiconductor crystal growth, is discussed. Using the low-gravity environment of space will accelerate the technical validation of the procedures and enable extremely accurate determinations of the many necessary thermophysical properties. Attention is given to NASA's centers for the commercial development of space; joint ventures of universities, industries, and government agencies to study the unique attributes of space that offer potential for applied R&D and eventual commercial exploitation.

  4. Modeling snow accumulation and ablation processes in forested environments

    Science.gov (United States)

    Andreadis, Konstantinos M.; Storck, Pascal; Lettenmaier, Dennis P.

    2009-05-01

    The effects of forest canopies on snow accumulation and ablation processes can be very important for the hydrology of midlatitude and high-latitude areas. A mass and energy balance model for snow accumulation and ablation processes in forested environments was developed utilizing extensive measurements of snow interception and release in a maritime mountainous site in Oregon. The model was evaluated using 2 years of weighing lysimeter data and was able to reproduce the snow water equivalent (SWE) evolution throughout winters both beneath the canopy and in the nearby clearing, with correlations to observations ranging from 0.81 to 0.99. Additionally, the model was evaluated using measurements from a Boreal Ecosystem-Atmosphere Study (BOREAS) field site in Canada to test the robustness of the canopy snow interception algorithm in a much different climate. Simulated SWE was relatively close to the observations for the forested sites, with discrepancies evident in some cases. Although the model formulation appeared robust for both types of climates, sensitivity to parameters such as snow roughness length and maximum interception capacity suggested the magnitude of improvements of SWE simulations that might be achieved by calibration.
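
    As a rough illustration of the bookkeeping such a model performs (interception up to a canopy capacity, throughfall to the ground pack, release from the canopy, and ablation), here is a deliberately simplified temperature-index sketch in Python. It is not the energy-balance formulation of the paper, and every parameter value is invented.

```python
# Toy canopy-snow mass balance in mm water equivalent per day.
# Temperature-index stand-in with invented parameters; shows bookkeeping only.

def step(swe_canopy, swe_ground, snowfall, temp_c,
         capacity=20.0, melt_factor=2.5, unload_frac=0.3):
    # canopy interception up to a fixed capacity; the rest is throughfall
    intercepted = min(snowfall, max(capacity - swe_canopy, 0.0))
    swe_canopy += intercepted
    swe_ground += snowfall - intercepted

    # lumped mass release from the canopy (meltwater drip + snow sliding)
    release = unload_frac * swe_canopy if temp_c > 0 else 0.0
    swe_canopy -= release
    swe_ground += release

    # ground snowpack ablation with a degree-day rule
    melt = melt_factor * max(temp_c, 0.0)
    swe_ground = max(swe_ground - melt, 0.0)
    return swe_canopy, swe_ground

canopy, ground = 0.0, 0.0
for snow, t in [(15, -3), (10, -1), (0, 2), (0, 5)]:
    canopy, ground = step(canopy, ground, snow, t)
    print(round(canopy, 1), round(ground, 1))
```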

  5. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment

    Science.gov (United States)

    Hidiroglu, Çaglar Naci; Bukova Güzel, Esra

    2013-01-01

    The aim of the present study is to conceptualize the approaches displayed for validation of model and thought processes provided in mathematical modeling process performed in technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

  6. Conceptual Model of Dynamic Geographic Environment

    Directory of Open Access Journals (Sweden)

    Martínez-Rosales Miguel Alejandro

    2014-04-01

In geographic environments, there are many different types of geographic entities such as automobiles, trees, persons, buildings, storms, hurricanes, etc. These entities can be classified into two groups: geographic objects and geographic phenomena. By its nature, a geographic environment is dynamic, so static modeling of it is not sufficient. Considering the dynamics of the geographic environment, a new type of geographic entity called an event is introduced. The primary target is the modeling of the geographic environment as an event sequence, because in this case the semantic relations are much richer than in the case of static modeling. In this work, the conceptualization of this model is proposed. It is based on the idea of processing each entity separately instead of processing the environment as a whole. After that, the so-called history of each entity and its spatial relations to other entities are defined to describe the whole environment. The main goal is to model, at a conceptual level, systems that make use of spatial and temporal information, so that the model can later serve as the semantic engine for such systems.
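
    One way to read the proposed conceptualization (treat each entity separately and describe the environment through each entity's history of events and its spatial relations) is as a simple data structure. The sketch below is an illustrative assumption, not the authors' formal model; all class and field names are made up.

```python
# Minimal "entity + event history" bookkeeping for a dynamic geographic
# environment. Class and field names are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Event:
    time: float
    kind: str                        # e.g. "moved", "appeared", "collided-with"
    position: Tuple[float, float]    # (x, y)
    related_to: Optional[str] = None # id of another entity, if any

@dataclass
class GeographicEntity:
    entity_id: str
    group: str                       # "object" (car, tree) or "phenomenon" (storm)
    history: List[Event] = field(default_factory=list)

    def record(self, event: Event) -> None:
        self.history.append(event)

    def trajectory(self):
        return [(e.time, e.position) for e in sorted(self.history, key=lambda e: e.time)]

car = GeographicEntity("car-17", "object")
car.record(Event(0.0, "appeared", (0.0, 0.0)))
car.record(Event(5.0, "moved", (120.0, 40.0)))
print(car.trajectory())
```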

  7. On the influence of the environment on modeling the fatigue crack growth process

    International Nuclear Information System (INIS)

    Mc Evily, A.J.

    1987-01-01

The effects of the environment at room and elevated temperatures were considered with respect to the influence exerted on the basic mechanical aspects of the fatigue crack growth process. An experimental assessment of this influence was obtained by conducting fatigue crack growth tests both in air and in vacuum, and the results of these experiments are given. Topics considered include crack closure, short crack growth in notched and unnotched specimens, Mode II crack growth, and the effects of oxidation at elevated temperatures. It is shown that the basic mechanisms of fatigue crack growth can be greatly altered by the presence of oxide films at the fatigue crack tip. Modeling the mechanical aspects of the crack growth process is by itself a challenging task; environmental considerations add further to the complexity of the modeling process. (Author)

  8. Comparison of debris environment models (MASTER-2005, 2001, ORDEM2000): For international standardization of process based implementation of meteoroid and debris environmental models

    OpenAIRE

    Fukushige, Shinya; Akahoshi, Yasuhiro; Kitazawa, Yukihito; Goka, Tateo; 福重 進也; 赤星 保浩; 北澤 幸人; 五家 建夫

    2007-01-01

Space agencies in several countries maintain space debris environment models for the design of spacecraft. These models can estimate debris flux as a function of size, relative impact velocity, and impact angle in a spacecraft orbit. However, it is known that the calculation results of these models are not always consistent with each other. Therefore, an internationally common implementation process for debris environment models is required. In this paper, as the first step of international standardization of implementat...

  9. Modeling critical zone processes in intensively managed environments

    Science.gov (United States)

    Kumar, Praveen; Le, Phong; Woo, Dong; Yan, Qina

    2017-04-01

    Processes in the Critical Zone (CZ), which sustain terrestrial life, are tightly coupled across hydrological, physical, biochemical, and many other domains over both short and long timescales. In addition, vegetation acclimation resulting from elevated atmospheric CO2 concentration, along with response to increased temperature and altered rainfall pattern, is expected to result in emergent behaviors in ecologic and hydrologic functions, subsequently controlling CZ processes. We hypothesize that the interplay between micro-topographic variability and these emergent behaviors will shape complex responses of a range of ecosystem dynamics within the CZ. Here, we develop a modeling framework ('Dhara') that explicitly incorporates micro-topographic variability based on lidar topographic data with coupling of multi-layer modeling of the soil-vegetation continuum and 3-D surface-subsurface transport processes to study ecological and biogeochemical dynamics. We further couple a C-N model with a physically based hydro-geomorphologic model to quantify (i) how topographic variability controls the spatial distribution of soil moisture, temperature, and biogeochemical processes, and (ii) how farming activities modify the interaction between soil erosion and soil organic carbon (SOC) dynamics. To address the intensive computational demand from high-resolution modeling at lidar data scale, we use a hybrid CPU-GPU parallel computing architecture run over large supercomputing systems for simulations. Our findings indicate that rising CO2 concentration and air temperature have opposing effects on soil moisture, surface water and ponding in topographic depressions. Further, the relatively higher soil moisture and lower soil temperature contribute to decreased soil microbial activities in the low-lying areas due to anaerobic conditions and reduced temperatures. The decreased microbial relevant processes cause the reduction of nitrification rates, resulting in relatively lower nitrate

  10. Modeling Impulse and Non-Impulse Store Choice Processes in a Multi-Agent Simulation of Pedestrian Activity in Shopping Environments

    NARCIS (Netherlands)

    Dijkstra, J.; Timmermans, H.J.P.; Vries, de B.; Timmermans, H.J.P.

    2009-01-01

    This chapter presents a multi-agent approach for modeling impulse and non-impulse store choice processes of pedestrian activity in shopping environments. The pedestrian simulation context will be discussed as well as the behavioral principles underlying the store choice processes. For these

  11. Mathematical Modelling of Thermal Process to Aquatic Environment with Different Hydrometeorological Conditions

    Directory of Open Access Journals (Sweden)

    Alibek Issakhov

    2014-01-01

This paper presents a mathematical model of the thermal process from a thermal power plant to the aquatic environment of the reservoir-cooler, which is located in the Pavlodar region, 17 km to the north-east of Ekibastuz town. The thermal process in the reservoir-cooler under different hydrometeorological conditions is considered; it is solved using the three-dimensional Navier-Stokes equations and a temperature equation for an incompressible flow in a stratified medium. A numerical method based on the projection method divides the problem into three stages. At the first stage, it is assumed that the transfer of momentum occurs only by convection and diffusion; the intermediate velocity field is solved by the fractional steps method. At the second stage, the three-dimensional Poisson equation is solved by the Fourier method in combination with the tridiagonal matrix method (Thomas algorithm). Finally, at the third stage, the transfer is assumed to be due only to the pressure gradient. The numerical method captures the basic laws of the hydrothermal processes, which are approximated qualitatively and quantitatively depending on the different hydrometeorological conditions.
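
    The tridiagonal matrix method mentioned above (the Thomas algorithm) is compact enough to show directly. The following is a generic textbook implementation in Python, not code from the paper; the right-hand side mimics a 1-D Poisson-type system of the kind that arises after the Fourier transform step.

```python
# Thomas algorithm: solve a_i x_{i-1} + b_i x_i + c_i x_{i+1} = d_i in O(n).
# Generic textbook implementation, not taken from the paper.
import numpy as np

def thomas(a, b, c, d):
    """a: sub-diagonal (a[0] unused), b: diagonal, c: super-diagonal (c[-1] unused)."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                     # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):            # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Example: 1-D Poisson-like system (-x_{i-1} + 2 x_i - x_{i+1} = h^2 f_i)
n, h = 5, 1.0
a = np.full(n, -1.0); b = np.full(n, 2.0); c = np.full(n, -1.0)
d = np.full(n, h**2)
print(thomas(a, b, c, d))
```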

  12. Engineered Barrier System: Physical and Chemical Environment Model

    International Nuclear Information System (INIS)

    Jolley, D. M.; Jarek, R.; Mariner, P.

    2004-01-01

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports

  13. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  14. Engineered Barrier System: Physical and Chemical Environment Model

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley; R. Jarek; P. Mariner

    2004-02-09

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  15. r-process nucleosynthesis in dynamic helium-burning environments

    International Nuclear Information System (INIS)

    Cowan, J.J.; Cameron, A.G.W.; Truran, J.W.

    1985-01-01

The results of an extended examination of r-process nucleosynthesis in helium-burning environments are presented. Using newly calculated nuclear rates, dynamical r-process calculations have been made of thermal runaways in helium cores typical of low-mass stars and in the helium zones of stars undergoing supernova explosions. These calculations show that, for a sufficient flux of neutrons produced by the ^13C neutron source, r-process nuclei in solar proportions can be produced. The conditions required for r-process production are found to be: 10^20-10^21 neutrons cm^-3 for times of 0.01-0.1 s and neutron number densities in excess of 10^19 cm^-3 for times of approximately 1 s. The amount of ^13C required is found to be exceedingly high: larger than is found to occur in any current stellar evolutionary model. It is thus unlikely that these helium-burning environments are responsible for producing the bulk of the r-process elements seen in the solar system

  16. Use of an uncertainty analysis for genome-scale models as a prediction tool for microbial growth processes in subsurface environments.

    Science.gov (United States)

    Klier, Christine

    2012-03-06

    The integration of genome-scale, constraint-based models of microbial cell function into simulations of contaminant transport and fate in complex groundwater systems is a promising approach to help characterize the metabolic activities of microorganisms in natural environments. In constraint-based modeling, the specific uptake flux rates of external metabolites are usually determined by Michaelis-Menten kinetic theory. However, extensive data sets based on experimentally measured values are not always available. In this study, a genome-scale model of Pseudomonas putida was used to study the key issue of uncertainty arising from the parametrization of the influx of two growth-limiting substrates: oxygen and toluene. The results showed that simulated growth rates are highly sensitive to substrate affinity constants and that uncertainties in specific substrate uptake rates have a significant influence on the variability of simulated microbial growth. Michaelis-Menten kinetic theory does not, therefore, seem to be appropriate for descriptions of substrate uptake processes in the genome-scale model of P. putida. Microbial growth rates of P. putida in subsurface environments can only be accurately predicted if the processes of complex substrate transport and microbial uptake regulation are sufficiently understood in natural environments and if data-driven uptake flux constraints can be applied.
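
    For reference, the Michaelis-Menten parametrization examined in the study converts an external substrate concentration into a specific uptake-flux bound, v = Vmax*S/(Km + S). The sketch below uses invented kinetic constants (not values from the paper); in a constraint-based model the resulting bound would typically be imposed on the corresponding exchange reaction.

```python
# Michaelis-Menten bound on a specific substrate uptake flux:
#   v_uptake = Vmax * S / (Km + S)
# Kinetic constants below are invented for illustration, not from the study.

def uptake_bound(substrate_mM, vmax, km):
    """Maximum specific uptake rate (e.g. mmol gDW^-1 h^-1) at concentration S."""
    return vmax * substrate_mM / (km + substrate_mM)

# e.g. illustrative bounds for toluene and oxygen that could constrain a
# genome-scale model (applied to the exchange reactions in a COBRA-style tool)
v_tol = uptake_bound(substrate_mM=0.05, vmax=7.0, km=0.01)
v_o2 = uptake_bound(substrate_mM=0.20, vmax=15.0, km=0.02)
print(round(v_tol, 2), round(v_o2, 2))
```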

  17. Marketing research model of competitive environment

    Directory of Open Access Journals (Sweden)

    Krasilya Dmitriy

    2015-11-01

To support its competitive advantages in current market conditions, each company needs to choose better ways of guaranteeing its favorable competitive position. In this regard, considerable interest lies in the structuring and algorithmization of marketing research processes that provide the information background of such choice. The article is devoted to modeling the process of marketing research of competitive environment.

  18. Dynamical nexus of water supply, hydropower and environment based on the modeling of multiple socio-natural processes: from socio-hydrological perspective

    Science.gov (United States)

    Liu, D.; Wei, X.; Li, H. Y.; Lin, M.; Tian, F.; Huang, Q.

    2017-12-01

In the socio-hydrological system, the ecological functions and environmental services that are maintained are determined by the preferences of society, which makes the trade-off among the values of riparian vegetation, fish, river landscape, water supply, hydropower, navigation and so on. As society develops, these preferences change, and the ecological functions and environmental services chosen to be maintained change with them. The aim of the study is to reveal the feedback relationships among water supply, hydropower and environment and the dynamical feedback mechanisms at the macro-scale, and to establish a socio-hydrological evolution model of the watershed based on the modeling of multiple socio-natural processes. The study focuses on the Han River in China, analyzes the impact of water supply and hydropower on ecology, hydrology and other environmental elements, and studies the effect on water supply and hydropower of ensuring different levels of ecological and environmental water. Water supply and ecology are usually competitive; hydropower and ecology are synergistic in some reservoirs and competitive in others. The study will analyze the multiple mechanisms through which the environment feeds back on hydropower, set up quantitative descriptions of these feedback mechanisms, recognize the dominant processes in the feedback relationships between hydropower and environment, and then analyze the positive and negative feedbacks in the feedback networks. The socio-hydrological evolution model at the watershed scale will be built and applied to simulate the long-term evolution of the watershed under current conditions. The dynamical nexus of water supply, hydropower and environment will be investigated.

  19. Designing user models in a virtual cave environment

    Energy Technology Data Exchange (ETDEWEB)

Brown-VanHoozer, S. [Argonne National Lab., Idaho Falls, ID (United States); Hudson, R. [Argonne National Lab., IL (United States); Gokhale, N. [Madge Networks, San Jose, CA (United States)]

    1995-12-31

In this paper, the results of a first study into the use of virtual reality for human factor studies and design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE™ virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  20. Business Ecosystem Definition in Built Environment Using a Stakeholder Assessment Process

    Directory of Open Access Journals (Sweden)

    Tuomas Lappi

    2015-06-01

Actors and their relationships are core elements of the business ecosystem concept, a trending model of business collaboration emphasizing organizational diversity, relationship dependency and joint evolution. This study approaches a built environment business ecosystem to structure the acknowledged complexity of ecosystem definition by applying a three-step stakeholder assessment process. The process is based on a stakeholder network diagram, Mitchell, Agle, and Wood's (1997) well-recognized stakeholder salience model and a two-dimensional stakeholder matrix. The assessment process is applied to a school campus case study to define a built environment business ecosystem and the salience of the ecosystem actors. Results, including salience score calculation, validate the applicability of the proposed process. The findings provide novel insights for ecosystem researchers into how stakeholder theory concepts can be applied to broaden the understanding of business ecosystem dynamics.
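
    The Mitchell-Agle-Wood logic used in the second step can be sketched as a small classification routine: each stakeholder is tagged with the power, legitimacy and urgency attributes, a crude salience score counts them, and the attribute combination gives the salience class. The stakeholder names and attribute assignments below are invented, not the case-study data.

```python
# Hedged sketch of Mitchell-Agle-Wood salience classification.
# Attribute assignments and stakeholder names are invented for illustration.

CLASSES = {
    frozenset(): "non-stakeholder",
    frozenset({"power"}): "dormant",
    frozenset({"legitimacy"}): "discretionary",
    frozenset({"urgency"}): "demanding",
    frozenset({"power", "legitimacy"}): "dominant",
    frozenset({"power", "urgency"}): "dangerous",
    frozenset({"legitimacy", "urgency"}): "dependent",
    frozenset({"power", "legitimacy", "urgency"}): "definitive",
}

stakeholders = {
    "municipality": {"power", "legitimacy", "urgency"},
    "main contractor": {"power", "legitimacy"},
    "pupils' parents": {"legitimacy", "urgency"},
    "local media": {"urgency"},
}

for name, attrs in stakeholders.items():
    salience = len(attrs) / 3   # crude 0-1 salience score: count of attributes
    print(f"{name:16s} salience={salience:.2f} class={CLASSES[frozenset(attrs)]}")
```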

  1. FEMME, a flexible environment for mathematically modelling the environment

    NARCIS (Netherlands)

    Soetaert, K.E.R.; DeClippele, V.; Herman, P.M.J.

    2002-01-01

    A new, FORTRAN-based, simulation environment called FEMME (Flexible Environment for Mathematically Modelling the Environment), designed for implementing, solving and analysing mathematical models in ecology is presented. Three separate phases in ecological modelling are distinguished: (1) the model

  2. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    Science.gov (United States)

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

The application of the developed property models for the estimation of environment-related properties and uncertainties of the estimated property values is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes and allow one to evaluate the effect of uncertainties of estimated property values on the calculated performance of processes, giving useful insights into the quality and reliability of the design of sustainable processes.
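
    The group-contribution idea underlying the GC+ models (a property is estimated from counts of structural groups multiplied by fitted contributions, with an uncertainty propagated from the covariance of those contributions) can be sketched as follows. The group names, contribution values and covariance matrix are placeholders, not the published GC+ parameters.

```python
# Toy group-contribution estimate with first-order linear uncertainty propagation:
#   P = sum_i N_i * C_i ,  var(P) ~= n^T Sigma n
# Groups, contributions, and covariance are placeholders, not GC+ parameters.
import numpy as np

contrib = {"CH3": 1.20, "CH2": 0.85, "OH": 2.10}   # invented contributions
sigma = np.diag([0.05, 0.03, 0.10]) ** 2           # invented covariance matrix

def estimate(counts):
    names = list(contrib)
    n = np.array([counts.get(g, 0) for g in names], dtype=float)
    c = np.array([contrib[g] for g in names])
    value = float(n @ c)
    std = float(n @ sigma @ n) ** 0.5
    return value, std

# e.g. a 1-propanol-like molecule: CH3-CH2-CH2-OH
value, std = estimate({"CH3": 1, "CH2": 2, "OH": 1})
print(f"estimated property = {value:.2f} +/- {1.96 * std:.2f} (95% CI)")
```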

  3. Security Management Model in Cloud Computing Environment

    OpenAIRE

    Ahmadpanah, Seyed Hossein

    2016-01-01

In the cloud computing environment, the growing number of cloud virtual machines (VMs) presents an ever larger challenge for virtual machine security and management. In order to address security issues in the cloud computing virtualization environment, this paper presents an efficient and dynamic VM deployment security management model based on state migration and scheduling, and studies the corresponding virtual machine security architecture, based on AHP (Analytic Hierarchy Process) virtual machine de...

  4. Technical know-how for modeling of geological environment. (1) Overview and groundwater flow modeling

    International Nuclear Information System (INIS)

    Saegusa, Hiromitsu; Takeuchi, Shinji; Maekawa, Keisuke; Osawa, Hideaki; Semba, Takeshi

    2011-01-01

It is important for site characterization projects to manage the decision-making process with transparency and traceability and to transfer the technical know-how accumulated during research and development to the implementing phase and to future generations. Modeling of the geological environment is used to synthesize investigation results. Evaluation of the impact of uncertainties in the model is important to identify and prioritize key issues for further investigations; therefore, a plan for site characterization should be made based on the results of the modeling. The aim of this study is to support the planning of initial surface-based site characterization based on the technical know-how accumulated from the Mizunami Underground Research Laboratory Project and the Horonobe Underground Research Laboratory Project. These projects are broad scientific studies of the deep geological environment that form a basis for research and development on the geological disposal of high-level radioactive wastes. In this study, the workflow of groundwater flow modeling, which is one of the geological environment models and is used for setting the area for geological environment modeling and for groundwater flow characterization, and the related decision-making process using literature data, have been summarized. (author)

  5. OPERATIONAL SAR DATA PROCESSING IN GIS ENVIRONMENTS FOR RAPID DISASTER MAPPING

    Directory of Open Access Journals (Sweden)

    A. Meroni

    2013-05-01

Having access to SAR data can be highly important and critical, especially for disaster mapping. Updating a GIS with contemporary information from SAR data allows delivery of a reliable set of geospatial information to advance civilian operations, e.g. search and rescue missions. We therefore present in this paper the operational processing of SAR data within a GIS environment for rapid disaster mapping, exemplified by the November 2010 flash flood in the Veneto region, Italy. A series of COSMO-SkyMed acquisitions was processed in ArcGIS® using a single-sensor, multi-mode, multi-temporal approach. The relevant processing steps were combined using the ArcGIS ModelBuilder to create a new model for rapid disaster mapping in ArcGIS, which can be accessed both via a desktop and a server environment.

  6. Interplanetary Radiation and Internal Charging Environment Models for Solar Sails

    Science.gov (United States)

    Minow, Joseph I.; Altstatt, Richard L.; NeegaardParker, Linda

    2005-01-01

    A Solar Sail Radiation Environment (SSRE) model has been developed for defining charged particle environments over an energy range from 0.01 keV to 1 MeV for hydrogen ions, helium ions, and electrons. The SSRE model provides the free field charged particle environment required for characterizing energy deposition per unit mass, charge deposition, and dose rate dependent conductivity processes required to evaluate radiation dose and internal (bulk) charging processes in the solar sail membrane in interplanetary space. Solar wind and energetic particle measurements from instruments aboard the Ulysses spacecraft in a solar, near-polar orbit provide the particle data over a range of heliospheric latitudes used to derive the environment that can be used for radiation and charging environments for both high inclination 0.5 AU Solar Polar Imager mission and the 1.0 AU L1 solar missions. This paper describes the techniques used to model comprehensive electron, proton, and helium spectra over the range of particle energies of significance to energy and charge deposition in thin (less than 25 micrometers) solar sail materials.

  7. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

This work presents a modelling framework based on a phenomena description of the process. The approach is intended to make it easy to understand and construct process models in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, using the Modelica modelling language to check the proposed concept. The partial results are promising, and the research effort will be extended towards a computer-aided modelling environment based on phenomena.

  8. Scalable Networked Information Processing Environment (SNIPE)

    Energy Technology Data Exchange (ETDEWEB)

    Fagg, G.E.; Moore, K. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science; Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Computer Science]|[Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.; Geist, A. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.

    1997-11-01

    SNIPE is a metacomputing system that aims to provide a reliable, secure, fault tolerant environment for long term distributed computing applications and data stores across the global Internet. This system combines global naming and replication of both processing and data to support large scale information processing applications leading to better availability and reliability than currently available with typical cluster computing and/or distributed computer environments.

  9. Sociotechnical design processes and working environment: The case of a continuous process wok

    DEFF Research Database (Denmark)

    Broberg, Ole

    2000-01-01

A five-year design process of a continuous process wok has been studied with the aim of elucidating the conditions for integrating working environment aspects. The design process is seen as a network building activity and as a social shaping process of the artefact. A working environment log ... is suggested as a tool designers can use to integrate considerations of future operators' working environment ...

  10. Understanding Fundamental Material Degradation Processes in High Temperature Aggressive Chemomechanical Environments

    International Nuclear Information System (INIS)

    2014-01-01

    The objective of this project is to develop a fundamental understanding of the mechanisms that limit materials durability for very high-temperature applications. Current design limitations are based on material strength and corrosion resistance. This project will characterize the interactions of high-temperature creep, fatigue, and environmental attack in structural metallic alloys of interest for the very high-temperature gas-cooled reactor (VHTR) or Next Generation Nuclear Plant (NGNP) and for the associated thermo-chemical processing systems for hydrogen generation. Each of these degradation processes presents a major materials design challenge on its own, but in combination, they can act synergistically to rapidly degrade materials and limit component lives. This research and development effort will provide experimental results to characterize creep-fatigue-environment interactions and develop predictive models to define operation limits for high-temperature structural material applications. Researchers will study individually and in combination creep-fatigue-environmental attack processes in Alloys 617, 230, and 800H, as well as in an advanced Ni-Cr oxide dispersion strengthened steel (ODS) system. For comparison, the study will also examine basic degradation processes in nichrome (Ni-20Cr), which is a basis for most high-temperature structural materials, as well as many of the superalloys. These materials are selected to represent primary candidate alloys, one advanced developmental alloy that may have superior high-temperature durability, and one model system on which basic performance and modeling efforts can be based. The research program is presented in four parts, which all complement each other. The first three are primarily experimental in nature, and the last will tie the work together in a coordinated modeling effort. The sections are (1) dynamic creep-fatigue-environment process, (2) subcritical crack processes, (3) dynamic corrosion crack

  11. Understanding Fundamental Material Degradation Processes in High Temperature Aggressive Chemomechanical Environments

    Energy Technology Data Exchange (ETDEWEB)

    Stubbins, James; Gewirth, Andrew; Sehitoglu, Huseyin; Sofronis, Petros; Robertson, Ian

    2014-01-16

    The objective of this project is to develop a fundamental understanding of the mechanisms that limit materials durability for very high-temperature applications. Current design limitations are based on material strength and corrosion resistance. This project will characterize the interactions of high-temperature creep, fatigue, and environmental attack in structural metallic alloys of interest for the very high-temperature gas-cooled reactor (VHTR) or Next–Generation Nuclear Plant (NGNP) and for the associated thermo-chemical processing systems for hydrogen generation. Each of these degradation processes presents a major materials design challenge on its own, but in combination, they can act synergistically to rapidly degrade materials and limit component lives. This research and development effort will provide experimental results to characterize creep-fatigue-environment interactions and develop predictive models to define operation limits for high-temperature structural material applications. Researchers will study individually and in combination creep-fatigue-environmental attack processes in Alloys 617, 230, and 800H, as well as in an advanced Ni-Cr oxide dispersion strengthened steel (ODS) system. For comparison, the study will also examine basic degradation processes in nichrome (Ni-20Cr), which is a basis for most high-temperature structural materials, as well as many of the superalloys. These materials are selected to represent primary candidate alloys, one advanced developmental alloy that may have superior high-temperature durability, and one model system on which basic performance and modeling efforts can be based. The research program is presented in four parts, which all complement each other. The first three are primarily experimental in nature, and the last will tie the work together in a coordinated modeling effort. The sections are (1) dynamic creep-fatigue-environment process, (2) subcritical crack processes, (3) dynamic corrosion – crack

  12. MODELING OF PATTERN FORMING PROCESS OF AUTOMATIC RADIO DIRECTION FINDER OF PHASE VHF IN THE DEVELOPMENT ENVIRONMENT OF LabVIEW APPLIED PROGRAMS

    Directory of Open Access Journals (Sweden)

    G. K. Aslanov

    2015-01-01

The article develops a model demonstrating the pattern-forming process of the antenna system of an aerodrome quasi-Doppler automatic radio direction-finder station, implemented in the LabVIEW application development environment from National Instruments.

  13. A Multiagent Modeling Environment for Simulating Work Practice in Organizations

    Science.gov (United States)

    Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron

    2004-01-01

In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual located activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost-savings of the newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems, uncovered in work practices studied by the design team, played a significant role in how work actually got done: actual lived work. Multitasking, informal assistance and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm. In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but be distinctively able to

  14. Gene-Environment Interplay in Twin Models

    Science.gov (United States)

    Hatemi, Peter K.

    2013-01-01

    In this article, we respond to Shultziner’s critique that argues that identical twins are more alike not because of genetic similarity, but because they select into more similar environments and respond to stimuli in comparable ways, and that these effects bias twin model estimates to such an extent that they are invalid. The essay further argues that the theory and methods that undergird twin models, as well as the empirical studies which rely upon them, are unaware of these potential biases. We correct this and other misunderstandings in the essay and find that gene-environment (GE) interplay is a well-articulated concept in behavior genetics and political science, operationalized as gene-environment correlation and gene-environment interaction. Both are incorporated into interpretations of the classical twin design (CTD) and estimated in numerous empirical studies through extensions of the CTD. We then conduct simulations to quantify the influence of GE interplay on estimates from the CTD. Due to the criticism’s mischaracterization of the CTD and GE interplay, combined with the absence of any empirical evidence to counter what is presented in the extant literature and this article, we conclude that the critique does not enhance our understanding of the processes that drive political traits, genetic or otherwise. PMID:24808718
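
    For readers unfamiliar with the classical twin design, its core arithmetic is Falconer's decomposition: from MZ and DZ twin correlations r_MZ and r_DZ one estimates a2 = 2(r_MZ - r_DZ) for additive genetics, c2 = 2*r_DZ - r_MZ for shared environment, and e2 = 1 - r_MZ for unique environment. The snippet below applies these textbook formulas to made-up correlations; it is not the article's simulation code.

```python
# Falconer-style ACE decomposition from twin correlations (illustrative values).
def ace(r_mz, r_dz):
    a2 = 2 * (r_mz - r_dz)   # additive genetic variance share
    c2 = 2 * r_dz - r_mz     # shared (common) environment share
    e2 = 1 - r_mz            # unique environment + measurement error
    return a2, c2, e2

print(ace(r_mz=0.60, r_dz=0.35))   # -> (0.50, 0.10, 0.40)
```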

  15. Design of plant safety model in plant enterprise engineering environment

    International Nuclear Information System (INIS)

    Gabbar, Hossam A.; Suzuki, Kazuhiko; Shimada, Yukiyasu

    2001-01-01

Plant enterprise engineering environment (PEEE) is an approach aiming to manage the plant through its lifecycle. In such an environment, safety is considered the common objective for all activities throughout the plant lifecycle. One approach to achieving plant safety is to embed safety aspects within each function and activity within the environment. One ideal way to enable safety aspects within each automated function is through modeling. This paper proposes a theoretical approach to designing a plant safety model integrated with the plant lifecycle model within such an environment. An object-oriented modeling approach is used to construct the plant safety model using an OO CASE tool on the basis of the unified modeling language (UML). Multiple views are defined for plant objects to express the static, dynamic, and functional semantics of these objects. Process safety aspects are mapped to each model element and inherited from the design to the operation stage, as they are naturally embedded within the plant's objects. By developing and realizing the plant safety model, safer plant operation can be achieved and plant safety can be assured

  16. Modelling Technology for Building Fire Scene with Virtual Geographic Environment

    Science.gov (United States)

    Song, Y.; Zhao, L.; Wei, M.; Zhang, H.; Liu, W.

    2017-09-01

Building fire is a risky event that can lead to disaster and massive destruction. The management and disposal of building fire has always attracted much interest from researchers. An integrated Virtual Geographic Environment (VGE) is a good choice for building fire safety management and emergency decisions, in which a more realistic and richer fire process can be computed and obtained dynamically, and the results of fire simulations and analyses can be much more accurate as well. To model the building fire scene with VGE, the application requirements and modelling objectives of the building fire scene were analysed in this paper. Then, the four core elements of modelling the building fire scene (the building space environment, the fire event, the indoor Fire Extinguishing System (FES) and the indoor crowd) were implemented, and the relationships between the elements were discussed. Finally, with the theory and framework of VGE, the technology of a building fire scene system with VGE was designed, covering the data environment, the model environment, the expression environment, and the collaborative environment. The functions and key techniques in each environment are also analysed, which may provide a reference for further development and other research on VGE.

  17. [Watershed water environment pollution models and their applications: a review].

    Science.gov (United States)

    Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang

    2013-10-01

Watershed water environment pollution models are important tools for studying watershed environmental problems. Through the quantitative description of the complicated pollution processes of the whole watershed system and its parts, such models can identify the main sources and migration pathways of pollutants, estimate pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied at home and abroad, with a focus on models of pollutant loading (GWLF and PLOAD), water quality of receiving water bodies (QUAL2E and WASP), and watershed models integrating pollutant loading and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced the structures, principles, and main characteristics of these models as well as their limitations in practical applications. Other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analyses of the application of single and integrated models, the development trend and application prospects of watershed water environment pollution models were discussed.

  18. A production model and maintenance planning model for the process industry

    NARCIS (Netherlands)

    Ashayeri, J.; Teelen, A.; Selen, W.J.

    1995-01-01

    In this paper a model is developed to simultaneously plan preventive maintenance and production in a process industry environment, where maintenance planning is extremely important. The model schedules production jobs and preventive maintenance jobs, while minimizing costs associated with

  19. Process competencies in a problem and project based learning environment

    DEFF Research Database (Denmark)

    Du, Xiangyun; Kolmos, Anette

    2006-01-01

Future engineers are not only required to master technological competencies concerning solving problems, producing and innovating technology, they are also expected to have capabilities of cooperation, communication, and project management in diverse social contexts, which are referred to as process competencies ... with the expected professional competencies. Based on the educational practice of the PBL Aalborg Model, which is characterized by problem-orientation, project-organization and team work, this paper examines the process of developing process competencies through studying engineering in a PBL environment, asking: 1) How do students develop process competencies through doing problem and project based work in teams? 2) How do students perceive their achievement of these process competencies? ...

  20. Glacimarine environments: processes and sediments

    National Research Council Canada - National Science Library

    Dowdeswell, J. A; Scourse, James D

    1990-01-01

    This volume examines the processes responsible for sedimentation in modern glaciomarine environments, and how such modern studies can be used as analogues in the interpretation of ancient glaciomarine sequences...

  1. MODEL OF THE IMPLEMENTATION PROCESS OF DESIGNING A CLOUD-BASED LEARNING ENVIRONMENT FOR THE PREPARATION OF BACHELOR OF COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Vakaliuk T.

    2017-12-01

The article presents a model of the process of implementing the design of a cloud-based learning environment (CBLE) for the preparation of bachelors of computer science, which consists of seven stages: analysis; setting goals and objectives; formulating requirements for the cloud-based learning environment; modeling the CBLE; developing the CBLE; using the CBLE in the educational process of bachelors of computer science; and performance testing. Each stage contains sub-steps. The analysis stage is considered in three aspects: psychological, pedagogical and technological. The formulation of the requirements for the CBLE was carried out taking into account the content and objectives of the training, the experience of using CBLEs, and the personal qualities, knowledge, skills and abilities of students. The modeling phase was divided into sub-stages: development of a structural and functional model of the CBLE for the preparation of bachelors of computer science; development of a model of the cloud-oriented learning support system (COLSS); and development of a model of interaction processes in the CBLE. The fifth stage was also divided into the following sub-steps: domain registration and customization of the appearance of the COLSS; definition of the disciplines provided by the curriculum for the preparation of bachelors of computer science; creation of teachers' and students' own cabinets; uploading of educational, methodological and accompanying materials; and the choice of traditional and cloud-oriented forms, methods and means of training. Verification of the functioning of the CBLE will be carried out in the following areas: the functioning of the CBLE; the results of students' educational activity; and the formation of students' information and communication competence.

  2. Open source integrated modeling environment Delta Shell

    Science.gov (United States)

    Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.

    2012-04-01

In the last decade, integrated modelling has become a very popular topic in environmental modelling, since it helps solve problems that are difficult to address with a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remains a challenging task. The integrated modelling environment Delta Shell simplifies this task. The software components of Delta Shell are easy to reuse separately from each other as well as as part of an integrated environment that can run in a command-line or a graphical user interface mode. Most components of Delta Shell are developed using the C# programming language and include libraries used to define, save and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models from the end-user and developer perspectives. The first example shows coupling of a rainfall-runoff model, a river flow model and a run-time control model. The second example shows how a coastal morphological database integrates with the coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is also available as open-source software released under the LGPL license and accessible via http://oss.deltares.nl.
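
    The first example (a rainfall-runoff model feeding a river-flow model under run-time control) can be mimicked in a few lines to show the coupling pattern. The sketch below is plain Python with invented components; it is not Delta Shell's actual C# component API.

```python
# Minimal illustration of coupling a rainfall-runoff component to a routing
# component with a simple run-time control rule. Not the Delta Shell API.

class LinearReservoir:                      # rainfall-runoff component
    def __init__(self, k=0.2, storage=0.0):
        self.k, self.storage = k, storage
    def step(self, rainfall):
        self.storage += rainfall
        outflow = self.k * self.storage
        self.storage -= outflow
        return outflow

class RiverReach:                           # river flow (lag-and-route)
    def __init__(self, lag=0.5, flow=0.0):
        self.lag, self.flow = lag, flow
    def step(self, inflow):
        self.flow += self.lag * (inflow - self.flow)
        return self.flow

def control(flow, gate_threshold=4.0):      # run-time control rule
    return "open spillway" if flow > gate_threshold else "normal"

runoff, reach = LinearReservoir(), RiverReach()
for rain in [0, 10, 25, 5, 0, 0]:
    q = reach.step(runoff.step(rain))
    print(f"rain={rain:2d} flow={q:5.2f} action={control(q)}")
```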

  3. The role of physical processes controlling the behaviour of radionuclide contaminants in the aquatic environment: a review of state-of-the-art modelling approaches

    International Nuclear Information System (INIS)

    Monte, Luigi; Perianez, Raul; Boyer, Patrick; Smith, Jim T.; Brittain, John E.

    2009-01-01

    This paper is aimed at presenting and discussing the methodologies implemented in state-of-the-art models for predicting the physical processes of radionuclide migration through the aquatic environment, including transport due to water currents, diffusion, settling and re-suspension. Models are briefly described, model parameter values reviewed and values recommended. The different modelling approaches are briefly classified and the advantages and disadvantages of the various model approaches and methodologies are assessed.
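
    The physical processes covered by the review are usually written as an advection-dispersion-decay equation, dC/dt = -u dC/dx + D d2C/dx2 - lambda*C, with settling and resuspension terms exchanging activity with the bed. A minimal explicit 1-D finite-difference step with invented parameter values (and without the bed-exchange terms) is sketched below.

```python
# Explicit 1-D advection-diffusion-decay step for a dissolved radionuclide.
# Parameter values are illustrative; settling/resuspension terms are omitted.
import numpy as np

nx, dx, dt = 100, 100.0, 50.0          # grid cells, cell size (m), time step (s)
u, D, lam = 0.3, 5.0, 8.0e-9           # current (m/s), dispersion (m2/s), decay (1/s)

C = np.zeros(nx)
C[10] = 1.0                            # initial pulse (arbitrary units)

for _ in range(500):
    adv = -u * (C - np.roll(C, 1)) / dx                          # upwind advection
    dif = D * (np.roll(C, 1) - 2 * C + np.roll(C, -1)) / dx**2   # diffusion
    C = C + dt * (adv + dif - lam * C)
    C[0] = C[-1] = 0.0                                           # crude open boundaries

print(float(C.max()), int(C.argmax()))  # peak concentration and its downstream cell
```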

  4. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  5. TAME - the terrestrial-aquatic model of the environment: model definition

    International Nuclear Information System (INIS)

    Klos, R.A.; Mueller-Lemans, H.; Dorp, F. van; Gribi, P.

    1996-10-01

    TAME - the Terrestrial-Aquatic Model of the Environment is a new computer model for use in assessments of the radiological impact of the release of radionuclides to the biosphere, following their disposal in underground waste repositories. Based on regulatory requirements, the end-point of the calculations is the maximum annual individual dose to members of a hypothetical population group inhabiting the biosphere region. Additional mid- and end-points in the TAME calculations are dose as function of time from eleven exposure pathways, foodstuff concentrations and the distribution of radionuclides in the modelled biosphere. A complete description of the mathematical representations of the biosphere in TAME is given in this document, based on a detailed review of the underlying conceptual framework for the model. Example results are used to illustrate features of the conceptual and mathematical models. The end-point of dose is shown to be robust for the simplifying model assumptions used to define the biosphere for the example calculations. TAME comprises two distinct sub-models - one representing the transport of radionuclides in the near-surface environment and one for the calculation of dose to individual inhabitants of that biosphere. The former is the result of a detailed review of the modelling requirements for such applications and is based on a comprehensive consideration of all features, events and processes (FEPs) relevant to Swiss biospheres, both in the present-day biosphere and in potential future biosphere states. Representations of the transport processes are derived from first principles. Mass balance for water and solid material fluxes is used to determine the rates of contaminant transfer between components of the biosphere system. The calculation of doses is based on existing representations of exposure pathways and draws on experience both from Switzerland and elsewhere. (author) figs., tabs., refs
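
    The mass-balance approach described here reduces, in its simplest form, to a linear donor-controlled compartment system dA/dt = K*A + S. The sketch below shows such a system with three invented compartments and made-up transfer coefficients; it illustrates the bookkeeping only and is not TAME's actual parametrization.

```python
# Donor-controlled compartment model:
#   dA_i/dt = sum_j k_ji A_j - (sum_j k_ij + lambda) A_i + S_i
# Compartments and rate constants are invented, not TAME data.
import numpy as np

comps = ["soil", "surface water", "sediment"]
k = {("soil", "surface water"): 2e-3,      # 1/yr, leaching/erosion
     ("surface water", "sediment"): 5e-2,  # 1/yr, settling
     ("sediment", "surface water"): 1e-3}  # 1/yr, resuspension
lam = 2.3e-2                               # radioactive decay constant, 1/yr
source = np.array([1.0, 0.0, 0.0])         # Bq/yr entering the soil compartment

n = len(comps)
K = np.zeros((n, n))
for (donor, receiver), rate in k.items():
    i, j = comps.index(donor), comps.index(receiver)
    K[j, i] += rate                        # gain of receiver from donor
    K[i, i] -= rate                        # loss of donor
K -= lam * np.eye(n)

A, dt = np.zeros(n), 0.1                   # inventories (Bq), time step (yr)
for _ in range(int(100 / dt)):             # 100-year forward Euler integration
    A = A + dt * (K @ A + source)
print(dict(zip(comps, A.round(2))))
```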

  6. TAME - the terrestrial-aquatic model of the environment: model definition

    Energy Technology Data Exchange (ETDEWEB)

    Klos, R.A. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Mueller-Lemans, H. [Tergoso AG fuer Umweltfragen, Sargans (Switzerland); Dorp, F. van [Nationale Genossenschaft fuer die Lagerung Radioaktiver Abfaelle (NAGRA), Baden (Switzerland); Gribi, P. [Colenco AG, Baden (Switzerland)

    1996-10-01

    TAME - the Terrestrial-Aquatic Model of the Environment is a new computer model for use in assessments of the radiological impact of the release of radionuclides to the biosphere, following their disposal in underground waste repositories. Based on regulatory requirements, the end-point of the calculations is the maximum annual individual dose to members of a hypothetical population group inhabiting the biosphere region. Additional mid- and end-points in the TAME calculations are dose as function of time from eleven exposure pathways, foodstuff concentrations and the distribution of radionuclides in the modelled biosphere. A complete description of the mathematical representations of the biosphere in TAME is given in this document, based on a detailed review of the underlying conceptual framework for the model. Example results are used to illustrate features of the conceptual and mathematical models. The end-point of dose is shown to be robust for the simplifying model assumptions used to define the biosphere for the example calculations. TAME comprises two distinct sub-models - one representing the transport of radionuclides in the near-surface environment and one for the calculation of dose to individual inhabitants of that biosphere. The former is the result of a detailed review of the modelling requirements for such applications and is based on a comprehensive consideration of all features, events and processes (FEPs) relevant to Swiss biospheres, both in the present-day biosphere and in potential future biosphere states. Representations of the transport processes are derived from first principles. Mass balance for water and solid material fluxes is used to determine the rates of contaminant transfer between components of the biosphere system. The calculation of doses is based on existing representations of exposure pathways and draws on experience both from Switzerland and elsewhere. (author) figs., tabs., refs.

  7. Integrated approaches to the application of advanced modeling technology in process development and optimization

    Energy Technology Data Exchange (ETDEWEB)

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E. [Massachusetts Institute of Technology, Cambridge, MA (United States)] [and others]

    1995-12-31

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.
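
    The MIT tools themselves are not shown in the record; the sketch below merely illustrates the general hybrid numeric-symbolic idea on a made-up two-equation residual system, with SymPy generating an exact Jacobian that drives a Newton iteration.

```python
# Illustrative sketch of a hybrid symbolic/numeric strategy (not the authors'
# algorithms): SymPy builds an exact Jacobian of a small residual system,
# which is then used in a plain Newton iteration.
import sympy as sp
import numpy as np

x1, x2 = sp.symbols('x1 x2')
residuals = sp.Matrix([x1**2 + x2**2 - 4, sp.exp(x1) + x2 - 1])
jacobian = residuals.jacobian([x1, x2])

f = sp.lambdify((x1, x2), residuals, 'numpy')
J = sp.lambdify((x1, x2), jacobian, 'numpy')

x = np.array([1.0, -1.0])                      # made-up starting guess
for _ in range(20):
    Fx = np.asarray(f(*x), dtype=float).ravel()
    Jx = np.asarray(J(*x), dtype=float)
    step = np.linalg.solve(Jx, -Fx)            # Newton step from the symbolic Jacobian
    x = x + step
    if np.linalg.norm(step) < 1e-12:
        break
print(x)                                       # a root of the nonlinear system
```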

  8. Comprehensive environment-suitability evaluation model about Carya cathayensis

    International Nuclear Information System (INIS)

    Da-Sheng, W.; Li-Juan, L.; Qin-Fen, Y.

    2013-01-01

    Regarding the relation between the suitable environment and the distribution areas of Carya cathayensis Sarg., current studies are mainly limited to qualitative descriptions and do not consider quantitative models. The objective of this study was to establish an environment-suitability evaluation model for predicting potential suitable areas of C. cathayensis. Firstly, data for the three factors of soil type, soil parent material and soil thickness were obtained from the second-class forest resource survey, and the other factor data, which included elevation, slope, aspect, surface curvature, humidity index, and solar radiation index, were extracted from a DEM (Digital Elevation Model). Additionally, the key affecting factors were identified by PCA (Principal Component Analysis), the weights of the evaluation factors were determined by AHP (Analytic Hierarchy Process), and the quantitative classification of each single factor was determined by membership functions from fuzzy mathematics. Finally, a comprehensive environment-suitability evaluation model was established and used to predict the potential suitable areas of C. cathayensis in Daoshi Town in the study region. The results showed that 85.6% of the actual distribution areas were in the most suitable and more suitable regions and 11.5% in the generally suitable regions.
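
    The study's factor weights and membership functions are not given in the abstract, so the sketch below only illustrates the general construction: fuzzy membership functions score each factor and AHP-style weights combine them into a suitability index. All break-points and weights are invented for illustration.

```python
# Hedged sketch of a weighted fuzzy-membership overlay of the kind the study
# describes; the weights and membership break-points below are illustrative,
# not those derived for C. cathayensis.
import numpy as np

def trapezoidal(x, a, b, c, d):
    """Fuzzy membership rising from a to b, flat to c, falling to d."""
    x = np.asarray(x, dtype=float)
    up = np.clip((x - a) / (b - a), 0.0, 1.0)
    down = np.clip((d - x) / (d - c), 0.0, 1.0)
    return np.minimum(up, down)

# AHP-style weights for three of the factors (must sum to 1); illustrative only
weights = {'elevation': 0.5, 'slope': 0.3, 'soil_thickness': 0.2}

# Example grid cells (one value per cell) with raw factor values
elevation = np.array([250.0, 600.0, 1100.0])       # m
slope = np.array([5.0, 18.0, 35.0])                # degrees
soil_thickness = np.array([90.0, 60.0, 20.0])      # cm

suitability = (
    weights['elevation'] * trapezoidal(elevation, 100, 300, 800, 1200)
    + weights['slope'] * trapezoidal(slope, 0, 1, 15, 30)
    + weights['soil_thickness'] * trapezoidal(soil_thickness, 20, 50, 150, 250)
)
print(suitability)   # higher values = more suitable cells
```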

  9. The Conceptualization of the Mathematical Modelling Process in Technology-Aided Environment

    Science.gov (United States)

    Hidiroglu, Çaglar Naci; Güzel, Esra Bukova

    2017-01-01

    The aim of the study is to conceptualize the technology-aided mathematical modelling process in the frame of cognitive modelling perspective. The grounded theory approach was adopted in the study. The research was conducted with seven groups consisting of nineteen prospective mathematics teachers. The data were collected from the video records of…

  10. Modeling and simulation of heterogeneous catalytic processes

    CERN Document Server

    Dixon, Anthony

    2014-01-01

    Heterogeneous catalysis and mathematical modeling are essential components of the continuing search for better utilization of raw materials and energy, with reduced impact on the environment. Numerical modeling of chemical systems has progressed rapidly due to increases in computer power, and is used extensively for analysis, design and development of catalytic reactors and processes. This book presents reviews of the state-of-the-art in modeling of heterogeneous catalytic reactors and processes. It offers reviews by leading authorities in the respective areas, up-to-date reviews of the latest techniques in modeling of catalytic processes, a mix of US and European authors as well as academic, industrial and research-institute perspectives, and connections between computation and experimental methods in some of the chapters.

  11. Modeling tritium transport in the environment

    International Nuclear Information System (INIS)

    Murphy, C.E. Jr.

    1986-01-01

    A model of tritium transport in the environment near an atmospheric source of tritium is presented in the general context of modeling material cycling in ecosystems. The model was developed to test hypotheses about the process involved in tritium cycling. The temporal and spatial scales of the model were picked to allow comparison to environmental monitoring data collected in the vicinity of the Savannah River Plant. Initial simulations with the model showed good agreement with monitoring data, including atmospheric and vegetation tritium concentrations. The model can also simulate values of tritium in vegetation organic matter if the key parameter distributing the source of organic hydrogen is varied to fit the data. However, because of the lack of independent conformation of the distribution parameter, there is still uncertainty about the role of organic movement of tritium in the food chain, and its effect on the dose to man

  12. ANTHEM2000TM: Integration of the ANTHEM Thermal Hydraulic Model in the ROSETM Environment

    International Nuclear Information System (INIS)

    Boire, R.; Nguyen, M.; Salim, G.

    1999-01-01

    ROSE TM is an object-oriented, visual programming environment used for many applications, including the development of power plant simulators. ROSE provides an integrated suite of tools for the creation, calibration, test, integration, configuration management and documentation of process, electrical and I and C models. CAE recently undertook an ambitious project to integrate its two-phase thermal hydraulic model ANTHEM TM into the ROSE environment. ANTHEM is a non-equilibrium, non-homogeneous model based on the drift flux formalism. CAE has used the model in numerous two-phase applications for nuclear and fossil power plant simulators. The integration of ANTHEM into ROSE brings the full power of visual-based programming to two-phase modeling applications. Features include graphical model building, calibration tools, a superior test environment and process visualisation. In addition, the integration of ANTHEM into ROSE makes it possible to easily apply the fidelity of ANTHEM to BOP applications. This paper describes the implementation of the ANTHEM model within the ROSE environment and gives examples of its use. (author)
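
    ANTHEM itself is proprietary CAE software; the snippet below only spells out the standard Zuber-Findlay drift-flux relation that the phrase "drift flux formalism" refers to, with illustrative values for the distribution parameter and drift velocity.

```python
# Not CAE's ANTHEM code; just the standard Zuber-Findlay drift-flux relation:
# the void fraction follows from the gas superficial velocity, the distribution
# parameter C0 and the drift velocity Vgj (values below are illustrative).
def void_fraction(j_gas, j_liquid, c0=1.13, v_gj=0.25):
    """alpha = j_g / (C0 * (j_g + j_l) + Vgj); all velocities in m/s."""
    j_total = j_gas + j_liquid
    return j_gas / (c0 * j_total + v_gj)

print(void_fraction(j_gas=0.8, j_liquid=1.5))   # ~0.28 for these illustrative values
```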

  13. Assessing safety risk in electricity distribution processes using ET & BA improved technique and its ranking by VIKOR and TOPSIS models in fuzzy environment

    Directory of Open Access Journals (Sweden)

    S. Rahmani

    2016-04-01

    Conclusion: Height and electricity are among the main causes of accidents in the electricity transmission and distribution industry, which is why the overhead power networks were ranked as high risk. The application of decision-making models in a fuzzy environment minimizes the influence of assessors' subjective judgment in the risk assessment process.
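
    The paper applies fuzzy VIKOR and TOPSIS to ET & BA results; as a simplified illustration of the ranking step, the sketch below runs a crisp (non-fuzzy) TOPSIS on a made-up decision matrix of hazards and criteria.

```python
# Hedged sketch of crisp TOPSIS ranking; the decision matrix, hazard names and
# weights are invented, and the paper itself uses fuzzy variants.
import numpy as np

decision = np.array([          # rows: hazards, columns: criteria (e.g. severity, exposure, probability)
    [7.0, 5.0, 8.0],           # overhead power network work
    [4.0, 6.0, 3.0],           # substation maintenance
    [5.0, 2.0, 4.0],           # cable laying
])
weights = np.array([0.5, 0.2, 0.3])

norm = decision / np.linalg.norm(decision, axis=0)              # vector normalisation
weighted = norm * weights
ideal = weighted.max(axis=0)                                    # all criteria treated as "larger = riskier"
anti_ideal = weighted.min(axis=0)

d_plus = np.linalg.norm(weighted - ideal, axis=1)
d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
closeness = d_minus / (d_plus + d_minus)
print(np.argsort(-closeness))   # hazards ranked from highest to lowest risk
```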

  14. A cluster expansion model for predicting activation barrier of atomic processes

    International Nuclear Information System (INIS)

    Rehman, Tafizur; Jaipal, M.; Chatterjee, Abhijit

    2013-01-01

    We introduce a procedure based on cluster expansion models for predicting the activation barrier of atomic processes encountered while studying the dynamics of a material system using the kinetic Monte Carlo (KMC) method. Starting with an interatomic potential description, a mathematical derivation is presented to show that the local environment dependence of the activation barrier can be captured using cluster interaction models. Next, we develop a systematic procedure for training the cluster interaction model on-the-fly, which involves: (i) obtaining activation barriers for a handful of local environments using nudged elastic band (NEB) calculations, (ii) identifying the local environment by analyzing the NEB results, and (iii) estimating the cluster interaction model parameters from the activation barrier data. Once a cluster expansion model has been trained, it is used to predict activation barriers without requiring any additional NEB calculations. Numerical studies are performed to validate the cluster expansion model by studying hop processes in Ag/Ag(100). We show that the use of the cluster expansion model with KMC enables efficient generation of an accurate process rate catalog.
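
    The essence of the approach can be sketched as a linear model in which the barrier is a sum of cluster interaction parameters weighted by occupation counts of the local environment, fitted to a few NEB barriers; the counts and barrier values below are invented for illustration and do not come from the Ag/Ag(100) study.

```python
# Sketch of the core idea (not the authors' code): the activation barrier of a
# hop is written as a linear combination of occupation counts of neighbour
# "clusters", and the interaction parameters are fitted to a few NEB barriers.
import numpy as np

# Each row: counts of occupied 1st/2nd/3rd-neighbour sites around the hopping atom
env_counts = np.array([
    [1, 0, 2],
    [2, 1, 0],
    [0, 2, 1],
    [3, 1, 1],
    [1, 1, 1],
], dtype=float)
neb_barriers = np.array([0.52, 0.61, 0.47, 0.74, 0.55])     # eV, invented NEB results

X = np.hstack([np.ones((len(env_counts), 1)), env_counts])  # constant term + cluster terms
theta, *_ = np.linalg.lstsq(X, neb_barriers, rcond=None)    # fit interaction parameters

new_env = np.array([1.0, 2, 0, 1])                          # constant term + a new environment
predicted_barrier = float(new_env @ theta)                  # barrier without a new NEB run
print(round(predicted_barrier, 3), "eV")
```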

  15. r-process nucleosynthesis in dynamic helium-burning environments

    Science.gov (United States)

    Cowan, J. J.; Cameron, A. G. W.; Truran, J. W.

    1985-01-01

    The results of an extended examination of r-process nucleosynthesis in helium-burning environments are presented. Using newly calculated nuclear rates, dynamical r-process calculations have been made of thermal runaways in helium cores typical of low-mass stars and in the helium zones of stars undergoing supernova explosions. These calculations show that, for a sufficient flux of neutrons produced by the C-13 neutron source, r-process nuclei in solar proportions can be produced. The conditions required for r-process production are found to be 10^20-10^21 neutrons per cubic centimeter for times of 0.01-0.1 s and neutron number densities in excess of 10^19 per cubic centimeter for times of about 1 s. The amount of C-13 required is found to be exceedingly high - larger than is found to occur in any current stellar evolutionary model. It is thus unlikely that these helium-burning environments are responsible for producing the bulk of the r-process elements seen in the solar system.

  16. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    Energy Technology Data Exchange (ETDEWEB)

    E. Sonnenthal; N. Spycher

    2001-02-05

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data

  17. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    International Nuclear Information System (INIS)

    Sonnenthal, E.

    2001-01-01

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [1534471]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: Continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies described in this AMR are

  18. Lagrangian modelling of dispersion, sedimentation and resuspension processes in marine environments

    International Nuclear Information System (INIS)

    Gidhagen, L.; Rahm, L.; Nyberg, L.

    1989-01-01

    The model is based on a modified Langevin equation which simulates the turbulent crossflow velocity fluctuations in shear flows. The velocity and turbulence fields used are generated by a 2-dimensional hydrodynamical model including a k-ε turbulence scheme. Since the dispersion model is formulated only for low particle concentrations, it is decoupled from the hydrodynamical model calculations. A great drawback of conventional dispersion modelling is the more or less unavoidable numerical diffusion. The use of a Lagrangian particle model avoids this effect and the resulting underestimation of concentrations for a given release. One consequence is a more realistic distribution of deposited particles. With regard to the overall deposition rates, however, the simulated sedimentation process agrees well with well-established advection/diffusion model formulations. With a modified hydrodynamic model, the dispersion model can be applied directly to stratified 3D simulations. (orig./HP)
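
    The authors' modified Langevin scheme is not reproduced in the record; the sketch below shows a generic Langevin-type particle step (an Ornstein-Uhlenbeck update of the vertical velocity fluctuation plus a constant settling velocity), with all parameter values invented for illustration.

```python
# Minimal sketch of a Langevin-type particle-tracking step; not the authors'
# scheme, and all parameters (time scale, turbulence intensity, settling
# velocity, release depth) are invented.
import numpy as np

rng = np.random.default_rng(0)
n, dt = 1000, 1.0                 # particles, time step (s)
t_l, sigma_w = 50.0, 0.02         # Lagrangian time scale (s), turbulence intensity (m/s)
w_s = 1e-3                        # settling velocity (m/s)

z = np.full(n, 5.0)               # release height above the bed (m)
w = rng.normal(0.0, sigma_w, n)   # initial vertical velocity fluctuations

deposited = np.zeros(n, dtype=bool)
for _ in range(20000):
    active = ~deposited
    # Ornstein-Uhlenbeck update of the vertical velocity fluctuation
    w[active] += (-w[active] / t_l) * dt + sigma_w * np.sqrt(2 * dt / t_l) * rng.normal(size=active.sum())
    z[active] += (w[active] - w_s) * dt
    deposited |= z <= 0.0          # particles reaching the bed are deposited

print(f"{deposited.mean():.0%} of particles deposited")
```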

  19. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  20. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  1. A Model for Urban Environment and Resource Planning Based on Green GDP Accounting System

    Directory of Open Access Journals (Sweden)

    Linyu Xu

    2013-01-01

    The urban environment and resources are currently on a course that is unsustainable in the long run due to excessive human pursuit of economic goals. Thus, it is very important to develop a model to analyse the relationship between urban economic development and environmental resource protection during the process of rapid urbanisation. This paper proposed a model to identify the key factors in urban environment and resource regulation based on a green GDP accounting system, which consisted of four parts: economy, society, resource, and environment. In this model, the analytic hierarchy process (AHP) method and a modified Pearl curve model were combined to allow for dynamic evaluation, with higher green GDP value as the planning target. The model was applied to the environmental and resource planning problem of Wuyishan City, and the results showed that energy use was a key factor that influenced the urban environment and resource development. Biodiversity and air quality were the most sensitive factors that influenced the value of green GDP in the city. According to the analysis, the urban environment and resource planning could be improved for promoting sustainable development in Wuyishan City.
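
    The abstract does not list the pairwise comparisons used for Wuyishan City, so the sketch below only illustrates the AHP step itself: weights are taken as the normalised principal eigenvector of a made-up comparison matrix over the model's four parts, with a consistency index reported alongside.

```python
# Sketch of the AHP weighting step the paper relies on; the pairwise comparison
# values are illustrative, not those of the Wuyishan case study.
import numpy as np

factors = ['economy', 'society', 'resource', 'environment']
A = np.array([                       # A[i, j] = importance of factor i relative to factor j
    [1.0, 3.0, 2.0, 1.0],
    [1/3, 1.0, 1/2, 1/3],
    [1/2, 2.0, 1.0, 1/2],
    [1.0, 3.0, 2.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()

consistency_index = (np.max(np.real(eigvals)) - len(A)) / (len(A) - 1)
print(dict(zip(factors, weights.round(3))), "CI =", round(consistency_index, 3))
```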

  2. An integrative model linking feedback environment and organizational citizenship behavior.

    Science.gov (United States)

    Peng, Jei-Chen; Chiu, Su-Fen

    2010-01-01

    Past empirical evidence has suggested that a positive supervisor feedback environment may enhance employees' organizational citizenship behavior (OCB). In this study, we aim to extend previous research by proposing and testing an integrative model that examines the mediating processes underlying the relationship between supervisor feedback environment and employee OCB. Data were collected from 259 subordinate-supervisor dyads across a variety of organizations in Taiwan. We used structural equation modeling to test our hypotheses. The results demonstrated that supervisor feedback environment influenced employees' OCB indirectly through (1) both positive affective-cognition and positive attitude (i.e., person-organization fit and organizational commitment), and (2) both negative affective-cognition and negative attitude (i.e., role stressors and job burnout). Theoretical and practical implications are discussed.

  3. Metal Catalyzed Fusion: Nuclear Active Environment vs. Process

    Science.gov (United States)

    Chubb, Talbot

    2009-03-01

    To achieve radiationless dd fusion and/or other LENR reactions via chemistry: some focus on environment of interior or altered near-surface volume of bulk metal; some on environment inside metal nanocrystals or on their surface; some on the interface between nanometal crystals and ionic crystals; some on a momentum shock-stimulation reaction process. Experiment says there is also a spontaneous reaction process.

  4. Preface. Forest ecohydrological processes in a changing environment.

    Science.gov (United States)

    Xiaohua Wei; Ge Sun; James Vose; Kyoichi Otsuki; Zhiqiang Zhang; Keith Smettem

    2011-01-01

    The papers in this issue are a selection of the presentations made at the second International Conference on Forests and Water in a Changing Environment. This special issue ‘Forest Ecohydrological Processes in a Changing Environment’ covers the topics regarding the effects of forest, land use and climate changes on ecohydrological processes across forest stand,...

  5. Multispectral simulation environment for modeling low-light-level sensor systems

    Science.gov (United States)

    Ientilucci, Emmett J.; Brown, Scott D.; Schott, John R.; Raqueno, Rolando V.

    1998-11-01

    Image intensifying cameras have been found to be extremely useful in low-light-level (LLL) scenarios including military night vision and civilian rescue operations. These sensors utilize the available visible region photons and an amplification process to produce high contrast imagery. It has been demonstrated that processing techniques can further enhance the quality of this imagery. For example, fusion with matching thermal IR imagery can improve image content when very little visible region contrast is available. To aid in the improvement of current algorithms and the development of new ones, a high fidelity simulation environment capable of producing radiometrically correct multi-band imagery for low-light-level conditions is desired. This paper describes a modeling environment attempting to meet these criteria by addressing the task as two individual components: (1) prediction of a low-light-level radiance field from an arbitrary scene, and (2) simulation of the output from a low-light-level sensor for a given radiance field. The radiance prediction engine utilized in this environment is the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model which is a first principles based multi-spectral synthetic image generation model capable of producing an arbitrary number of bands in the 0.28 to 20 micrometer region. The DIRSIG model is utilized to produce high spatial and spectral resolution radiance field images. These images are then processed by a user configurable multi-stage low-light-level sensor model that applies the appropriate noise and modulation transfer function (MTF) at each stage in the image processing chain. This includes the ability to reproduce common intensifying sensor artifacts such as saturation and 'blooming.' Additionally, co-registered imagery in other spectral bands may be simultaneously generated for testing fusion and exploitation algorithms. This paper discusses specific aspects of the DIRSIG radiance prediction for low
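
    The DIRSIG/RIT sensor chain is far more detailed than can be shown here; the toy sketch below only illustrates the two-stage idea of applying an MTF in the frequency domain and then signal-dependent noise with saturation to a radiance image. All numbers are invented.

```python
# Not the DIRSIG sensor model itself; a toy two-stage chain showing how an MTF
# and signal-dependent noise can be applied to a radiance image.
import numpy as np

rng = np.random.default_rng(1)
radiance = rng.uniform(0.0, 1.0, (64, 64))           # stand-in for a rendered radiance field

# Stage 1: optics/intensifier blur expressed as a Gaussian MTF in frequency space
fy = np.fft.fftfreq(64)[:, None]
fx = np.fft.fftfreq(64)[None, :]
mtf = np.exp(-((fx**2 + fy**2) / (2 * 0.15**2)))
blurred = np.real(np.fft.ifft2(np.fft.fft2(radiance) * mtf))

# Stage 2: photon (shot) noise and saturation of the intensified signal
gain, full_well = 200.0, 400.0                        # illustrative electrons per radiance unit
signal = rng.poisson(np.clip(blurred, 0, None) * gain)
image = np.clip(signal, 0, full_well) / full_well     # saturated, normalised output
print(image.mean(), image.max())
```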

  6. Space Environment Modeling

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Collection includes presentation materials and outputs from operational space environment models produced by the NOAA Space Weather Prediction Center (SWPC) and...

  7. Modeling the cometary environment using a fluid approach

    Science.gov (United States)

    Shou, Yinsi

    Comets are believed to have preserved the building material of the early solar system and to hold clues to the origin of life on Earth. Abundant remote observations of comets by telescopes and the in-situ measurements by a handful of space missions reveal that the cometary environments are complicated by various physical and chemical processes among the neutral gases and dust grains released from comets, cometary ions, and the solar wind in the interplanetary space. Therefore, physics-based numerical models are in demand to interpret the observational data and to deepen our understanding of the cometary environment. In this thesis, three models using a fluid approach, which include important physical and chemical processes underlying the cometary environment, have been developed to study the plasma, neutral gas, and the dust grains, respectively. Although models based on the fluid approach have limitations in capturing all of the correct physics for certain applications, especially for very low gas density environment, they are computationally much more efficient than alternatives. In the simulations of comet 67P/Churyumov-Gerasimenko at various heliocentric distances with a wide range of production rates, our multi-fluid cometary neutral gas model and multi-fluid cometary dust model have achieved comparable results to the Direct Simulation Monte Carlo (DSMC) model, which is based on a kinetic approach that is valid in all collisional regimes. Therefore, our model is a powerful alternative to the particle-based model, especially for some computationally intensive simulations. Capable of accounting for the varying heating efficiency under various physical conditions in a self-consistent way, the multi-fluid cometary neutral gas model is a good tool to study the dynamics of the cometary coma with different production rates and heliocentric distances. The modeled H2O expansion speeds reproduce the general trend and the speed's nonlinear dependencies of production rate

  8. Modelling radioactivity in the environment

    International Nuclear Information System (INIS)

    Scott, E.M.

    2003-01-01

    Just as an environmental model typically will be composed of a number of linked sub-models, representing physical, chemical or biological processes understood to varying degrees, this volume includes a series of linked chapters exemplifying the fundamental nature of environmental radioactivity models in all compartments of the environment. Modelling is an often misunderstood and maligned activity and this book can provide, to a broad audience, a greater understanding of modelling power but also some of the limitations. Modellers and experimentalists often do not understand and mistrust each other's work yet they are mutually dependent, in the sense that good experimental science can direct good modelling work and vice-versa. There is an increasing reliance on model results in environmental management, yet there is also often misuse and misrepresentation of these results. This book can help to bridge the gap between unrealistic expectations of model power and the realisation of what is possible, practicable and feasible in modelling of environmental radioactivity; and finally, modelling tools, capacity and power have increased many-fold in a relatively short period of time. Much of this is due to the much-heralded computer revolution, but much is also due to better science. It is useful to consider what gap, if any, still remains between what is possible and what is necessary.

  9. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. In...by ARL modelers. 2. Development Environment The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5.* Python was selected...because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool. In
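
    The report describes automating runs of the Point-Stat tool with Python; the wrapper below is only a sketch of that style of scripting. The file names, configuration name and output directory are hypothetical, and the exact point_stat command line should be taken from the MET documentation rather than from this example.

```python
# Hedged sketch of scripting a verification-tool run from Python; all paths are
# hypothetical and the command line below is illustrative, not authoritative.
import subprocess
from pathlib import Path

def run_point_stat(fcst_file: str, obs_file: str, config_file: str, outdir: str = "ps_out"):
    """Run one Point-Stat case and return the completed process object."""
    Path(outdir).mkdir(exist_ok=True)
    cmd = ["point_stat", fcst_file, obs_file, config_file, "-outdir", outdir]
    return subprocess.run(cmd, check=True, capture_output=True, text=True)

if __name__ == "__main__":
    result = run_point_stat("wrf_d01_2016090100.grib2",          # hypothetical forecast file
                            "obs_2016090100.nc",                 # hypothetical observation file
                            "PointStatConfig_default")           # hypothetical config file
    print(result.stdout[:200])
```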

  10. Patient Data Synchronization Process in a Continuity of Care Environment

    Science.gov (United States)

    Haras, Consuela; Sauquet, Dominique; Ameline, Philippe; Jaulent, Marie-Christine; Degoulet, Patrice

    2005-01-01

    In a distributed patient record environment, we analyze the processes needed to ensure exchange and access to EHR data. We propose an adapted method and the tools for data synchronization. Our study takes into account the issues of user rights management for data access and of decreasing the amount of data exchanged over the network. We describe a XML-based synchronization model that is portable and independent of specific medical data models. The implemented platform consists of several servers, of local network clients, of workstations running user’s interfaces and of data exchange and synchronization tools. PMID:16779049

  11. Mechanistic Fermentation Models for Process Design, Monitoring, and Control

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads Orla

    2017-01-01

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding.
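
    The industrial models discussed in the article are not disclosed in the abstract; as a generic example of what "mechanistic fermentation model" means in practice, the sketch below integrates simple Monod-type batch kinetics with invented parameter values.

```python
# Generic mechanistic fermentation sketch (Monod-type batch kinetics); the
# structure and parameters are illustrative, not the authors' industrial model.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, K_s, Y_xs = 0.25, 0.5, 0.45    # 1/h, g/L, g biomass per g substrate (invented)

def batch(t, y):
    X, S = y                            # biomass and substrate concentrations (g/L)
    mu = mu_max * S / (K_s + S)         # specific growth rate
    return [mu * X, -mu * X / Y_xs]

sol = solve_ivp(batch, (0.0, 30.0), y0=[0.1, 20.0], max_step=0.1)
print(f"final biomass {sol.y[0, -1]:.1f} g/L, residual substrate {sol.y[1, -1]:.2f} g/L")
```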

  12. MODELING OF INNOVATION EDUCATIONAL ENVIRONMENT OF GENERAL EDUCATIONAL INSTITUTION: THE SCIENTIFIC APPROACHES

    OpenAIRE

    Anzhelika D. Tsymbalaru

    2010-01-01

    The paper considers scientific approaches to modeling the innovation educational environment of a general educational institution: the system approach (analysis of the object, process and result of modeling as system objects), the activity approach (organizational and psychological structure) and the synergetic approach (aspects and principles).

  13. Management of Ecological-Economic Processes of Pollution Accumulation and Assimilation in the Coastal Zone Marine Environment

    Directory of Open Access Journals (Sweden)

    I.E. Timchenko

    2017-02-01

    A model for managing the balance between the assimilation and accumulation of pollution getting into the sea with the coastal runoff, based on the negative feedback between the coastal economic system efficiency and penalties for pollution of the sea coastal zone, is proposed. The model is constructed by the Adaptive Balance of Causes method and is intended for finding a rational balance between the profit from the use of the assimilative resources of the marine environment and the costs of maintaining its quality. The increase of pollution in the coastal zone is taken as proportional to the volume of product realization. The decrease of pollution concentration is related to the environmental protection activities paid for by the production. The model contains agents for managing the volume of the economic system's generalized production release. The agents control the pollution accumulation rate under the different bio-chemical processes that result in the natural purification of the marine environment. Scenario analysis of ecological-economic processes in the “Land–Sea” system is carried out, and the dependencies of the economic subsystem's production profitability on the penalty sanctions limiting the pollutant flux getting into the sea are constructed. The effect of sea temperature and water mass dynamics on these processes is considered, and scenarios of their intra-annual variability are constructed. It is shown that taking the sea temperature and near-water wind into account in the model has a significant effect on the marine environment pollution level and on production profitability. The conclusion is that the proposed adaptive simulation model “Sea–Land” can be used for forecasting scenarios of coastal subsystem production processes (the volume of generalized product manufacturing, production cost, profitability) in parallel with the forecast of pollution concentration in the sea.

  14. APROS - A multifunctional modelling environment

    International Nuclear Information System (INIS)

    Juslin, K.; Paljakka, M.

    1999-01-01

    The Advanced Process Simulation (APROS) environment has, after more than a decade of dedicated product development and intense commercial use, reached a level of maturity that is difficult to find with regard to similar products. One of the basic ideas behind this software tool is its multifunctional concept. The concept requires that the tool is suitable for modelling and simulation of the dynamics of a process plant during all phases of its life-span from pre-design to training and model supported operation and control. The implementation of this concept had a significant impact on the software structure. Several, sometimes contradictory requirements had to be encompassed. It should be suitable both for small simple models and full scope simulators. It should facilitate time-steps from milliseconds to minutes, for the same models, just depending on the scope of study. It should combine several modelling paradigms such as continuous, discrete, mechanistic and empirical. The intrinsic model building blocks should be comprehensively verified, but users' model equations should be accepted, as well. It should be easy to connect to external models or hardware, and to use both in master or slave mode. It should be easy to study and modify the internals of the models, their structures and parameters, but it should also be possible to protect all delicate model information from unwanted access. The calculation should be optimised for current computer hardware, but the model specifications should be easily transportable to new platforms. And finally, it should be suited both for researchers, engineers and plant operators. How did we succeed? We had 20 years of comprehensive thermal hydraulic modelling tradition before starting the project. We had the key experts with the key knowledge. We dedicated more than 100 man years of efforts for the new software developments. Presently, we have a superb team maintaining and improving the software, complemented with new enthusiastic

  15. On the Performance Potential of Connection Fault-Tolerant Commit Processing in Mobile Environment

    OpenAIRE

    Tome Dimovski; Pece Mitrevski

    2012-01-01

    Mobile inventory, mobile commerce, banking and/or commercial applications are some distinctive examples that increasingly use distributed transactions. It is inevitably harder to design efficient commit protocols, due to some intrinsic mobile environment limitations. A handful of protocols for transaction processing have been offered, but the majority considers only a limited number of communication models. We introduce an improved Connection Fault-Tolerant model and evaluate its performance ...

  16. Understanding Creative Design Processes by Integrating Sketching and CAD Modelling Design Environments: A Preliminary Protocol Result from Architectural Designers

    Directory of Open Access Journals (Sweden)

    Yi Teng Shih

    2015-11-01

    This paper presents the results of a preliminary protocol study of the cognitive behaviour of architectural designers during the design process. The aim is to better understand the similarities and differences in cognitive behaviour using Sequential Mixed Media (SMM) and Alternative Mixed Media (AMM) approaches, and how switching between media may impact on design processes. Two participants with at least one year's professional design experience and a Bachelor of Design degree, and competence in both sketching and computer-aided design (CAD) modelling, participated in the study. Video recordings of participants working on different projects were coded using the Function-Behaviour-Structure (FBS) coding scheme. Participants were also interviewed and their explanations about their switching behaviours were categorised into three types: S→C, S/C↹R and C→S. Preliminary results indicate that switching between media may influence how designers identify problems and develop solutions. In particular, two design issues were identified. These relate to the FBS coding scheme, where structure (S) and behaviour derived from structure (Bs) change to documentation (D) after switching from sketching to CAD modelling (S→C). These switches make it possible for designers to integrate both approaches into one design medium and facilitate their design processes in AMM design environments.

  17. Model alloy oxidation in oxyfuel characteristic environment

    International Nuclear Information System (INIS)

    Coelho, D.; Rizzo, F.; Kranzmann, A.; Monteiro, M.; Caminha, I.

    2014-01-01

    In the oxyfuel process, pure oxygen is burned in boilers with recycled gas, producing a gas rich in CO_2 and making it easier to capture the CO_2 at the end of the process. The present work investigates the high-temperature corrosion characteristics of a model Fe-Cr-Co alloy in a typical oxyfuel process environment. Samples were oxidized at 600°C for 1000 hours in a single-atmosphere condition, where the sample is exposed to the same gas on all faces, and in a dual-atmosphere condition, where the sample is exposed to water vapor on one side and to oxyfuel gas on the other. Samples were characterized by SEM and EDX. Results showed that corrosion is higher in the dual-atmosphere condition than in the single-atmosphere condition. (author)

  18. Development of climate data storage and processing model

    Science.gov (United States)

    Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.

    2016-11-01

    We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on the socio-economic processes on local and regional scales. The model is based on a «shared nothing» distributed computing architecture and assumes using a computing network where each computing node is independent and self-sufficient. Each node holds dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data is represented by collections of netCDF files stored in a hierarchy of directories in the framework of a file system. To speed up data reading and processing, three approaches are proposed: a precalculation of intermediate products, a distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of the previously obtained products. For a fast search and retrieval of the required data, according to the data storage and processing model, a metadata database is developed. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. The model and the metadata database together will provide a reliable technological basis for development of a high-performance virtual research environment for climatic and environmental monitoring.
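
    As a minimal illustration of the storage-node idea described above, the sketch below pairs a small metadata table with a netCDF read; the file layout, variable name and metadata schema are invented, and the file must actually exist on the node for the read to succeed.

```python
# Minimal sketch of metadata-driven access to a node's netCDF collection; the
# schema, dataset name, variable name and path are all hypothetical.
import sqlite3
import netCDF4

# Metadata database mapping a dataset/variable to files held by this node
db = sqlite3.connect('metadata.sqlite')
db.execute("""CREATE TABLE IF NOT EXISTS datasets
              (name TEXT, variable TEXT, path TEXT, year_start INT, year_end INT)""")
db.execute("INSERT INTO datasets VALUES ('era_like', 'tas', 'data/tas_1990_1999.nc', 1990, 1999)")

# Locate the file covering the requested year, then read it with netCDF4
row = db.execute("SELECT path, variable FROM datasets WHERE name=? AND year_start<=? AND year_end>=?",
                 ('era_like', 1995, 1995)).fetchone()
if row:
    path, var = row
    with netCDF4.Dataset(path) as nc:          # requires the file to exist on this node
        field = nc.variables[var][:]           # read the whole variable (a masked array)
        print(field.shape)
```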

  19. Modelling of processes occurring in deep geological repository - development of new modules in the GoldSim environment

    International Nuclear Information System (INIS)

    Vopalka, D.; Lukin, D.; Vokal, A.

    2006-01-01

    Three new modules modelling the processes that occur in a deep geological repository have been prepared in the GoldSim computer code environment (using its Transport Module). These modules help to understand the role of selected parameters in the near-field region of the final repository and to prepare one's own complex model of the repository behaviour. The source term module includes radioactive decay and ingrowth in the canister, first-order degradation of the fuel matrix, solubility limitation of the concentration of the studied nuclides, and diffusive migration through the surrounding bentonite layer controlled by the output boundary condition formulated with respect to the rate of water flow in the rock. The corrosion module describes corrosion of canisters made of carbon steel and transport of corrosion products in the near-field region. This module computes balance equations between dissolving species and species transported by diffusion and/or advection from the surface of a solid material. The diffusion module, which also includes a non-linear form of the interaction isotherm, can be used for an evaluation of small-scale diffusion experiments. (author)
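
    The GoldSim modules themselves are not reproduced in the record; the sketch below only illustrates, with invented rate constants and a fictitious solubility cap, the kind of bookkeeping a source-term module performs: first-order matrix degradation releasing a nuclide that decays and whose dissolved inventory is limited by solubility.

```python
# Rough sketch of a source-term balance (not the GoldSim modules): first-order
# fuel-matrix degradation, radioactive decay and a solubility cap on the
# dissolved inventory; every number below is illustrative.
import numpy as np

dt, n_steps = 1.0, 50000              # time step and number of steps (yr)
k_degr = 1e-4                         # fuel matrix degradation rate (1/yr)
lam = np.log(2) / 2.1e5               # decay constant of a long-lived nuclide (1/yr)
solubility_cap = 5.0                  # maximum dissolved inventory in the canister water (mol)

matrix = 100.0                        # nuclide inventory still bound in the matrix (mol)
dissolved = 0.0
for _ in range(n_steps):
    released = k_degr * matrix * dt
    matrix -= released + lam * matrix * dt
    dissolved = min(dissolved + released - lam * dissolved * dt, solubility_cap)

print(f"matrix {matrix:.2f} mol, dissolved (capped) {dissolved:.2f} mol")
```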

  20. Modelling of processes occurring in deep geological repository - Development of new modules in the GoldSim environment

    Science.gov (United States)

    Vopálka, D.; Lukin, D.; Vokál, A.

    2006-01-01

    Three new modules modelling the processes that occur in a deep geological repository have been prepared in the GoldSim computer code environment (using its Transport Module). These modules help to understand the role of selected parameters in the near-field region of the final repository and to prepare one's own complex model of the repository behaviour. The source term module includes radioactive decay and ingrowth in the canister, first-order degradation of the fuel matrix, solubility limitation of the concentration of the studied nuclides, and diffusive migration through the surrounding bentonite layer controlled by the output boundary condition formulated with respect to the rate of water flow in the rock. The corrosion module describes corrosion of canisters made of carbon steel and transport of corrosion products in the near-field region. This module computes balance equations between dissolving species and species transported by diffusion and/or advection from the surface of a solid material. The diffusion module, which also includes a non-linear form of the interaction isotherm, can be used for an evaluation of small-scale diffusion experiments.

  1. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge to execute such analysis is not trivial and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided basic requirements for the proposed environment. We implemented the SPEAKER environment integrating supporting tools for the execution of activities and tasks of performance analysis and the knowledge necessary to execute them, in order to meet the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER’s modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify if it attains its goal of assisting in the execution of process performance analysis by non-specialist people.

  2. Transformation and distribution processes governing the fate and behaviour of nanomaterials in the environment: an overview

    DEFF Research Database (Denmark)

    Hansen, Steffen Foss; Hartmann, Nanna B.; Baun, Anders

    2015-01-01

    assessment. Chemical fate modelling is one approach to fill this gap within a short time frame. To ensure the reliability of predicted environmental concentrations, informed choices are needed during model formulation and development. A major knowledge gap, hampering the further development of such models, ... present in the environment. Specific nanomaterials are used as case studies to illustrate these processes. Key environmental processes are identified and ranked and key knowledge gaps are identified, feeding into the longer-term goal of improving the existing models for predicted environmental...

  3. Modeling of space environment impact on nanostructured materials. General principles

    Science.gov (United States)

    Voronina, Ekaterina; Novikov, Lev

    2016-07-01

    In accordance with the resolution of the ISO TC20/SC14 WG4/WG6 joint meeting, the Technical Specification (TS) 'Modeling of space environment impact on nanostructured materials. General principles', which describes computer simulation methods of space environment impact on nanostructured materials, is being prepared. Nanomaterials surpass traditional materials for space applications in many aspects due to their unique properties associated with the nanoscale size of their constituents. This superiority in mechanical, thermal, electrical and optical properties will evidently inspire a wide range of applications in next-generation spacecraft intended for long-term (~15-20 years) operation in near-Earth orbits and for automatic and manned interplanetary missions. Currently, ISO activity on developing standards concerning different issues of nanomaterials manufacturing and applications is quite intensive. Most such standards are related to the production and characterization of nanostructures; however, there are no ISO documents concerning nanomaterials behavior under different environmental conditions, including the space environment. The given TS deals with the peculiarities of the space environment impact on nanostructured materials (i.e. materials with structural objects whose size in at least one dimension lies within 1-100 nm). The basic purpose of the document is the general description of the methodology of applying computer simulation methods, which relate to different space and time scales, to modeling processes occurring in nanostructured materials under the space environment impact. This document will emphasize the necessity of applying a multiscale simulation approach and present the recommendations for the choice of the most appropriate methods (or a group of methods) for computer modeling of various processes that can occur in nanostructured materials under the influence of different space environment components. In addition, the TS includes the description of possible

  4. Measures of quality of process models created in BPMN

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-12-01

    Description, documentation, evaluation and redesign of key processes during their execution should be an essential part of the strategic management of any organization. All organizations live in a dynamically changing environment and must therefore adapt their internal processes to market changes. These processes must be described, and a suitable way of describing them is the BPMN notation. Once processes are described in BPMN, they should be checked to ensure their expected quality. A system (which could be automated) based on mathematical expressions of the qualitative characteristics of process models (i.e. measures of quality of process models) can support such process checks. The research team is trying to design such a tool and bring it into practical use. The aim of this publication is to describe the mentioned system, based on measures of the quality of process models, and to answer the associated scientific questions.
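
    As an illustration of what such mathematically expressed measures can look like, the sketch below computes a few generic metrics (element counts and the gateway share) over a tiny BPMN 2.0 XML fragment; these example metrics are not necessarily the specific measures proposed by the authors.

```python
# Illustrative sketch of simple quality measures over a BPMN 2.0 model; the
# model fragment and the chosen metrics are generic examples only.
import xml.etree.ElementTree as ET

BPMN_NS = '{http://www.omg.org/spec/BPMN/20100524/MODEL}'
bpmn_xml = f"""<definitions xmlns="{BPMN_NS[1:-1]}">
  <process id="claim_handling">
    <startEvent id="s"/><task id="t1"/><exclusiveGateway id="g1"/>
    <task id="t2"/><task id="t3"/><endEvent id="e"/>
    <sequenceFlow id="f1" sourceRef="s" targetRef="t1"/>
    <sequenceFlow id="f2" sourceRef="t1" targetRef="g1"/>
    <sequenceFlow id="f3" sourceRef="g1" targetRef="t2"/>
    <sequenceFlow id="f4" sourceRef="g1" targetRef="t3"/>
  </process>
</definitions>"""

root = ET.fromstring(bpmn_xml)
nodes = [el.tag[len(BPMN_NS):] for el in root.iter() if el.tag.startswith(BPMN_NS)]
size = sum(nodes.count(t) for t in ('task', 'startEvent', 'endEvent', 'exclusiveGateway'))
gateway_share = nodes.count('exclusiveGateway') / size
print({'model size': size, 'gateway share': round(gateway_share, 2),
       'sequence flows': nodes.count('sequenceFlow')})
```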

  5. Technological learning in energy-environment-economy modelling: A survey

    International Nuclear Information System (INIS)

    Kahouli-Brahmi, Sondes

    2008-01-01

    This paper aims at providing an overview and a critical analysis of the technological learning concept and its incorporation in energy-environment-economy models. A special emphasis is put on surveying and discussing, through the so-called learning curve, both studies estimating learning rates in the energy field and studies incorporating endogenous technological learning in bottom-up and top-down models. The survey of learning rate estimations gives special attention to interpreting and explaining the sources of variability of estimated rates, which is shown to be mainly inherent in R and D expenditures, the problem of omitted variable bias, the endogeneity relationship and the role of spillovers. The survey of large-scale models shows that, despite some methodological and computational complexity related to the non-linearity and the non-convexity associated with the learning curve incorporation, results of the numerous modelling experiments give several new insights with regard to the analysis of the prospects of specific technological options and their cost decrease potential (bottom-up models), and with regard to the analysis of strategic considerations, especially inherent in the innovation and energy diffusion process, in particular the energy sector's endogenous responses to environment policy instruments (top-down models).
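
    The single-factor learning curve that underlies most of the surveyed estimates can be stated in a few lines: unit cost falls by a fixed fraction (the learning rate) for each doubling of cumulative capacity. The numbers below are purely illustrative.

```python
# One-factor learning curve: c(x) = c0 * x**(-b) with b = -log2(1 - learning_rate),
# so each doubling of cumulative capacity x cuts the unit cost by the learning rate.
from math import log2

def unit_cost(cumulative_capacity, c0, learning_rate):
    b = -log2(1.0 - learning_rate)
    return c0 * cumulative_capacity ** (-b)

c_first = unit_cost(1, c0=1000.0, learning_rate=0.20)            # cost of the first unit
c_after_4_doublings = unit_cost(16, c0=1000.0, learning_rate=0.20)
print(c_first, round(c_after_4_doublings, 1))                     # 1000.0 -> 409.6
```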

  6. Collaborative Engineering Environments. Two Examples of Process Improvement

    NARCIS (Netherlands)

    Spee, J.B.R.M.; Bijwaard, D.; Laan, D.J.

    Companies are recognising that innovative processes are determining factors in competitiveness. Two examples from projects in aircraft development describe the introduction of collaborative engineering environments as a way to improve engineering processes. A multi-disciplinary simulation

  7. Event Modeling in UML. Unified Modeling Language and Unified Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2002-01-01

    We show how events can be modeled in terms of UML. We view events as change agents that have consequences and as information objects that represent information. We show how to create object-oriented structures that represent events in terms of attributes, associations, operations, state charts, and messages. We outline a run-time environment for the processing of events with multiple participants.

  8. An integrated model of social environment and social context for pediatric rehabilitation.

    Science.gov (United States)

    Batorowicz, Beata; King, Gillian; Mishra, Lipi; Missiuna, Cheryl

    2016-01-01

    This article considers the conceptualization and operationalization of "social environment" and "social context" with implications for research and practice with children and youth with impairments. We first discuss social environment and social context as constructs important for understanding interaction between external environmental qualities and the individual's experience. The article considers existing conceptualizations within psychological and sociological bodies of literature, research using these concepts, current developmental theories and issues in the understanding of environment and participation within rehabilitation science. We then describe a model that integrates a person-focused perspective with an environment-focused perspective and that outlines the mechanisms through which children/youth and social environment interact and transact. Finally, we consider the implications of the proposed model for research and clinical practice. This conceptual model directs researchers and practitioners toward interventions that will address the mechanisms of child-environment interaction and that will build capacity within both children and their social environments, including families, peer groups and communities. Health is created and lived by people within the settings of their everyday life; where they learn, work, play, and love [p.2]. Understanding how social environment and personal factors interact over time to affect the development of children/youth can influence the design of services for children and youth with impairments. The model described integrates the individual-focused and environment-focused perspectives and outlines the mechanisms of the ongoing reciprocal interaction between children/youth and their social environments: provision of opportunities, resources and supports and contextual processes of choice, active engagement and collaboration. Addressing these mechanisms could contribute to creating healthier environments in which all

  9. A model for assessing information technology effectiveness in the business environment

    Directory of Open Access Journals (Sweden)

    Sandra Cristina Riascos Erazo

    2008-05-01

    The impact of technology on administrative processes has improved business strategies (especially regarding the effect of information technology, IT), often leading to organisational success. Its effectiveness in this environment was thus modelled due to such importance; this paper describes studying a series of models aimed at assessing IT, its advantages and disadvantages. A model is proposed involving different aspects for an integral assessment of IT effectiveness and considering administrative activities' particular characteristics. This analytical study provides guidelines for identifying IT effectiveness in a business environment and current key strategies in technological innovation. This study was based on the ISO 9126, ISO 9001, ISO 15939 and ISO 25000 standards as well as the COBIT and CMM standards.

  10. Performance assessment modeling of pyrometallurgical process wasteforms

    International Nuclear Information System (INIS)

    Nutt, W.M.; Hill, R.N.; Bullen, D.B.

    1995-01-01

    Performance assessment analyses have been completed to estimate the behavior of high-level nuclear wasteforms generated from the pyrometallurgical processing of liquid metal reactor (LMR) and light water reactor (LWR) spent nuclear fuel. Waste emplaced in the proposed repository at Yucca Mountain is investigated as the basis for the study. The resulting cumulative actinide and fission product releases to the accessible environment within a 100,000 year period from the various pyrometallurgical process wasteforms are compared to those of directly disposed LWR spent fuel using the same total repository system model. The impact of differing radionuclide transport models on the overall release characteristics is investigated

  11. DEVELOPMENT OF A HETEROGENIC DISTRIBUTED ENVIRONMENT FOR SPATIAL DATA PROCESSING USING CLOUD TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    A. S. Garov

    2016-06-01

    Full Text Available We are developing a unified distributed communication environment for processing of spatial data which integrates web-, desktop- and mobile platforms and combines volunteer computing model and public cloud possibilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to required data volume and computing power, while keeping infrastructure costs at minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. Using an innovative software environment the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D-visualization and will be tested based on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize the research and representation of results on a new technology level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatial distributed information for which we suggest implementation of a user interface as an advanced front-end, e.g., for virtual globe system.

  12. Development of a Heterogenic Distributed Environment for Spatial Data Processing Using Cloud Technologies

    Science.gov (United States)

    Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.

    2016-06-01

    We are developing a unified distributed communication environment for processing of spatial data which integrates web-, desktop- and mobile platforms and combines volunteer computing model and public cloud possibilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to required data volume and computing power, while keeping infrastructure costs at minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. Using an innovative software environment the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D-visualization and will be tested based on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize the research and representation of results on a new technology level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatial distributed information for which we suggest implementation of a user interface as an advanced front-end, e.g., for virtual globe system.

  13. Mathematical modelling of the laser processing of composite materials

    International Nuclear Information System (INIS)

    Gromyko, G.F.; Matsuka, N.P.

    2009-01-01

    Expansion of the scope of protective coatings has led to the need to develop lower-cost methods of treating machine elements. Building an adequate mathematical model, consistent with the features of the process, and developing effective methods for solving it are promising directions in this field. In this paper, a mathematical model of high-temperature laser treatment by a moving source of a padding pre-sprayed with composite powder is developed. The presented model describes accurately enough the heat processes taking place during laser processing of machine elements. By varying the input parameters of the model (laser power, temperature and composition of the environment, characteristics and quantitative composition of the materials used, etc.) one can obtain a cheap tool for preliminary estimates for a wide range of similar problems. A difference method, based on the physical features of the process and taking into account the main process-dependent parameters, has been developed for solving the resulting system of nonlinear equations. (authors)

  14. Dissolving decision making? : Models and their roles in decision-making processes and policy at large

    NARCIS (Netherlands)

    Zeiss, Ragna; van Egmond, S.

    2014-01-01

    This article studies the roles three science-based models play in Dutch policy and decision making processes. Key is the interaction between model construction and environment. Their political and scientific environments form contexts that shape the roles of models in policy decision making.

  15. Alternative biosphere modeling for safety assessment of HLW disposal taking account of geosphere-biosphere interface of marine environment

    International Nuclear Information System (INIS)

    Kato, Tomoko; Ishiguro, Katsuhiko; Naito, Morimasa; Ikeda, Takao; Little, Richard

    2001-03-01

    In the safety assessment of a high-level radioactive waste (HLW) disposal system, it is required to estimate the radiological impacts on future human beings arising from potential radionuclide releases from a deep repository into the surface environment. In order to estimate the impacts, a biosphere model is developed by reasonably assuming radionuclide migration processes in the surface environment and relevant human lifestyles. It is important to modify the present biosphere models or to develop alternative biosphere models according to the quality and quantity of the information acquired through the siting process for constructing the repository. In this study, alternative biosphere models were developed taking the geosphere-biosphere interface of the marine environment into account. Moreover, the flux-to-dose conversion factors calculated by these alternative biosphere models were compared with those from the present basic biosphere models. (author)

  16. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    Science.gov (United States)

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counter factual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.
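
    The interactive start/stop/pause/roll-back and dynamic-intervention workflow described above can be illustrated with a minimal simulation-controller sketch. The class, method names and the simple SIR update below are illustrative assumptions, not the system's actual API.

```python
# Minimal sketch of an interactive epidemic-simulation controller with
# checkpointing and roll-back. All names and the deterministic SIR update
# are illustrative assumptions, not the environment's actual interface.
import copy


class EpidemicSimulation:
    def __init__(self, population, initially_infected, beta=0.3, gamma=0.1):
        self.state = {"S": population - initially_infected,
                      "I": initially_infected, "R": 0, "t": 0}
        self.beta, self.gamma = beta, gamma
        self.checkpoints = []          # saved states for roll-back

    def step(self):
        """Advance one day with a deterministic SIR update."""
        s, i, r = self.state["S"], self.state["I"], self.state["R"]
        n = s + i + r
        new_inf = min(s, int(self.beta * s * i / n + 0.5))
        new_rec = min(i, int(self.gamma * i + 0.5))
        self.state.update(S=s - new_inf, I=i + new_inf - new_rec,
                          R=r + new_rec, t=self.state["t"] + 1)

    def pause_and_checkpoint(self):
        """Pause: snapshot the full state so the analyst can inspect it."""
        self.checkpoints.append(copy.deepcopy(self.state))

    def rollback(self):
        """Return to the most recent checkpoint, e.g. to try another policy."""
        if self.checkpoints:
            self.state = self.checkpoints.pop()

    def intervene(self, contact_reduction):
        """Dynamic intervention formulated from the assessed state."""
        self.beta *= (1.0 - contact_reduction)


sim = EpidemicSimulation(population=10_000, initially_infected=10)
for _ in range(60):
    sim.step()
    if sim.state["I"] > 500 and not sim.checkpoints:
        sim.pause_and_checkpoint()     # analyst assesses the state here
        sim.intervene(contact_reduction=0.4)
print(sim.state)
```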

  17. EvoBuild: A Quickstart Toolkit for Programming Agent-Based Models of Evolutionary Processes

    Science.gov (United States)

    Wagh, Aditi; Wilensky, Uri

    2018-04-01

    Extensive research has shown that one of the benefits of programming to learn about scientific phenomena is that it facilitates learning about mechanisms underlying the phenomenon. However, using programming activities in classrooms is associated with costs such as requiring additional time to learn to program or students needing prior experience with programming. This paper presents a class of programming environments that we call quickstart: Environments with a negligible threshold for entry into programming and a modest ceiling. We posit that such environments can provide benefits of programming for learning without incurring associated costs for novice programmers. To make this claim, we present a design-based research study conducted to compare programming models of evolutionary processes with a quickstart toolkit with exploring pre-built models of the same processes. The study was conducted in six seventh grade science classes in two schools. Students in the programming condition used EvoBuild, a quickstart toolkit for programming agent-based models of evolutionary processes, to build their NetLogo models. Students in the exploration condition used pre-built NetLogo models. We demonstrate that although students came from a range of academic backgrounds without prior programming experience, and all students spent the same number of class periods on the activities including the time students took to learn programming in this environment, EvoBuild students showed greater learning about evolutionary mechanisms. We discuss the implications of this work for design research on programming environments in K-12 science education.

  18. Modeling and Control of Multivariable Process Using Intelligent Techniques

    Directory of Open Access Journals (Sweden)

    Subathra Balasubramanian

    2010-10-01

    Full Text Available For nonlinear dynamic systems, first-principles-based modeling and control is difficult to implement. In this study, a fuzzy controller and a recurrent fuzzy controller are developed for a MIMO process. A fuzzy logic controller is a model-free controller designed based on knowledge about the process. Two types of rule-based fuzzy models are available: the linguistic (Mamdani) model and the Takagi-Sugeno (TS) model. Of these two, the Takagi-Sugeno model has attracted the most attention. The application of fuzzy controllers is limited to static processes due to their feedforward structure. However, most real-time processes are dynamic and require the history of input/output data. In order to store the past values a memory unit is needed, which is introduced by the recurrent structure. The proposed recurrent fuzzy structure is used to develop a controller for a two-tank heating process. Both controllers are designed and implemented in a real-time environment and their performance is compared.
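
    As a concrete illustration of the Takagi-Sugeno scheme mentioned above, the sketch below closes a single temperature loop with three zero-order TS rules. The membership functions, rule consequents and the toy process model are illustrative assumptions, not the controller developed in this study.

```python
# Minimal sketch of a zero-order Takagi-Sugeno fuzzy controller driving a toy
# first-order temperature process. Membership functions, rule consequents and
# the process model are illustrative assumptions, not the study's controller.

def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)


def ts_control(error):
    """Map temperature error (deg C) to heater power (%) as a weighted average."""
    rules = [
        (tri(error, -80, -40, 0), 0.0),    # error negative  -> low power
        (tri(error, -40, 0, 40), 50.0),    # error near zero -> medium power
        (tri(error, 0, 40, 80), 100.0),    # error positive  -> high power
    ]
    num = sum(w * u for w, u in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0.0 else 0.0


# Toy closed loop: tank temperature responds to heater power, ambient 20 deg C.
setpoint, temp = 60.0, 20.0
for _ in range(100):
    power = ts_control(setpoint - temp)
    temp += 0.05 * (0.8 * power - (temp - 20.0))
print(round(temp, 1))   # settles near the 60 deg C setpoint
```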

  19. Business Process Elicitation, Modeling, and Reengineering: Teaching and Learning with Simulated Environments

    Science.gov (United States)

    Jeyaraj, Anand

    2010-01-01

    The design of enterprise information systems requires students to master technical skills for elicitation, modeling, and reengineering of business processes as well as soft skills for information gathering and communication. These tacit skills and behaviors cannot be effectively taught to students but rather must be experienced and learned by them. This…

  20. The national operational environment model (NOEM)

    Science.gov (United States)

    Salerno, John J.; Romano, Brian; Geiler, Warren

    2011-06-01

    The National Operational Environment Model (NOEM) is a strategic analysis/assessment tool that provides insight into the complex state space (as a system) that is today's modern operational environment. The NOEM supports baseline forecasts by generating plausible futures based on the current state. It supports what-if analysis by forecasting ramifications of potential "Blue" actions on the environment. The NOEM also supports sensitivity analysis by identifying possible pressure (leverage) points in support of the Commander that resolves forecasted instabilities, and by ranking sensitivities in a list for each leverage point and response. The NOEM can be used to assist Decision Makers, Analysts and Researchers with understanding the inter-workings of a region or nation state, the consequences of implementing specific policies, and the ability to plug in new operational environment theories/models as they mature. The NOEM is built upon an open-source, license-free set of capabilities, and aims to provide support for pluggable modules that make up a given model. The NOEM currently has an extensive number of modules (e.g. economic, security & social well-being pieces such as critical infrastructure) completed along with a number of tools to exercise them. The focus this year is on modeling the social and behavioral aspects of a populace within their environment, primarily the formation of various interest groups, their beliefs, their requirements, their grievances, their affinities, and the likelihood of a wide range of their actions, depending on their perceived level of security and happiness. As such, several research efforts are currently underway to model human behavior from a group perspective, in the pursuit of eventual integration and balance of populace needs/demands within their respective operational environment and the capacity to meet those demands. In this paper we will provide an overview of the NOEM, the need for and a description of its main components

  1. Development of Computer Aided Modelling Templates for Model Re-use in Chemical and Biochemical Process and Product Design: Import and export of models

    DEFF Research Database (Denmark)

    Fedorova, Marina; Tolksdorf, Gregor; Fillinger, Sandra

    2015-01-01

    been established, in order to provide a wider range of modelling capabilities. Through this link, developed models can be exported/imported to/from other modelling-simulation software environments to allow model reusability in chemical and biochemical product and process design. The use of this link...

  2. Spacecraft Internal Acoustic Environment Modeling

    Science.gov (United States)

    Chu, Shao-Sheng R.; Allen Christopher S.

    2010-01-01

    Acoustic modeling can be used to identify key noise sources, determine/analyze sub-allocated requirements, keep track of the accumulation of minor noise sources, and to predict vehicle noise levels at various stages in vehicle development, first with estimates of noise sources, later with experimental data. This paper describes the implementation of acoustic modeling for design purposes by incrementally increasing model fidelity and validating the accuracy of the model while predicting the noise of sources under various conditions. During FY 07, a simple-geometry Statistical Energy Analysis (SEA) model was developed and validated using a physical mockup and acoustic measurements. A process for modeling the effects of absorptive wall treatments and the resulting reverberation environment were developed. During FY 08, a model with more complex and representative geometry of the Orion Crew Module (CM) interior was built, and noise predictions based on input noise sources were made. A corresponding physical mockup was also built. Measurements were made inside this mockup, and comparisons were made with the model and showed excellent agreement. During FY 09, the fidelity of the mockup and corresponding model were increased incrementally by including a simple ventilation system. The airborne noise contribution of the fans was measured using a sound intensity technique, since the sound power levels were not known beforehand. This is opposed to earlier studies where Reference Sound Sources (RSS) with known sound power level were used. Comparisons of the modeling result with the measurements in the mockup showed excellent results. During FY 10, the fidelity of the mockup and the model were further increased by including an ECLSS (Environmental Control and Life Support System) wall, associated closeout panels, and the gap between ECLSS wall and mockup wall. The effect of sealing the gap and adding sound absorptive treatment to ECLSS wall were also modeled and validated.

  3. An Intelligent Complex Event Processing with D Numbers under Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2016-01-01

    Full Text Available Efficient matching of incoming mass events to persistent queries is fundamental to complex event processing systems. Event matching based on pattern rules is an important feature of a complex event processing engine. However, the intrinsic uncertainty in pattern rules, which are predecided by experts, increases the difficulty of effective complex event processing. It inevitably involves various types of intrinsic uncertainty, such as imprecision, fuzziness, and incompleteness, due to the limitations of human subjective judgment. D numbers are a new mathematical tool for modeling such uncertainty, since they drop the condition that elements on the frame of discernment must be mutually exclusive. To address the above issues, an intelligent complex event processing method with D numbers under a fuzzy environment is proposed based on the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) method. The novel method can fully support decision making in complex event processing systems. Finally, a numerical example is provided to evaluate the efficiency of the proposed method.
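
    For readers unfamiliar with the underlying ranking scheme, the sketch below shows a generic TOPSIS computation on an illustrative decision matrix. The D-numbers extension and the actual event-matching criteria of the proposed method are not reproduced here; all numbers are made up for illustration.

```python
# Generic TOPSIS ranking sketch on an illustrative decision matrix.
import numpy as np

# Rows = candidate pattern rules, columns = benefit criteria.
decision = np.array([[0.7, 0.9, 0.4],
                     [0.5, 0.6, 0.8],
                     [0.9, 0.3, 0.6]])
weights = np.array([0.5, 0.3, 0.2])

# 1. Vector-normalise each criterion and apply the weights.
weighted = decision / np.linalg.norm(decision, axis=0) * weights

# 2. Ideal and anti-ideal solutions (all criteria treated as benefits).
ideal, anti = weighted.max(axis=0), weighted.min(axis=0)

# 3. Euclidean distances and relative closeness to the ideal solution.
d_pos = np.linalg.norm(weighted - ideal, axis=1)
d_neg = np.linalg.norm(weighted - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)

print("ranking (best first):", np.argsort(-closeness))
```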

  4. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase-interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints are considered, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows the implementation of an efficient solving strategy and a concise error diagnosis. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it

  5. STEPP: A Grounded Model to Assure the Quality of Instructional Activities in e-Learning Environments

    Directory of Open Access Journals (Sweden)

    Hamdy AHMED ABDELAZIZ

    2013-07-01

    Full Text Available The present theoretical paper aims to develop a grounded model for designing instructional activities appropriate to e-learning and online learning environments. The suggested model is guided by cognitivist, constructivist, and connectivist learning principles to help online learners construct meaningful experiences and move from the knowledge acquisition to the knowledge creation process. The proposed model consists of five dynamic and grounded domains that assure the quality of designing and using e-learning activities: the social domain, the technological domain, the epistemological domain, the psychological domain, and the pedagogical domain. Each of these domains needs four types of presence to reflect the design and application process of e-learning activities: cognitive presence, human presence, psychological presence and mental presence. Applying the proposed model (STEPP) throughout all online and adaptive e-learning environments may improve the process of designing and developing e-learning activities to be used as mindtools for current and future learners.

  6. Neuroscientific Model of Motivational Process

    Science.gov (United States)

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  7. Neuroscientific model of motivational process.

    Science.gov (United States)

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  8. 3rd International Conference on Modelling and Management of Engineering Processes

    CERN Document Server

    Gericke, Kilian; Szélig, Nikoletta; Vajna, Sándor

    2015-01-01

    Innovative processes for the development of products and services are more and more considered as an organisational capability, which is recognised to be increasingly important for business success in today’s competitive environment. However, management and academia need a more profound understanding of these processes in order to develop improved management approaches to exploit business potentials. This book contains the proceedings of the 3rd International Conference on Modelling and Management of Engineering Processes (MMEP2013) held in Magdeburg, Germany, in November 2013. It includes contributions from international leading researchers in the fields of process modelling and process management. The conference topics were recent trends in modelling and management of engineering processes, potential synergies between different modelling approaches, future challenges for the management of engineering processes as well as future research in these areas.

  9. Study on intelligent processing system of man-machine interactive garment frame model

    Science.gov (United States)

    Chen, Shuwang; Yin, Xiaowei; Chang, Ruijiang; Pan, Peiyun; Wang, Xuedi; Shi, Shuze; Wei, Zhongqian

    2018-05-01

    A man-machine interactive garment frame model intelligent processing system is studied in this paper. The system consists of several sensor devices, a voice processing module, mechanical parts and a centralized data acquisition device. The sensor devices collect information on the environment changes brought about by a body approaching the clothes frame model; the data acquisition device collects the environment-change information sensed by the sensor devices; the voice processing module performs speaker-independent speech recognition to achieve human-machine interaction; and the mechanical moving parts produce the corresponding mechanical responses to the information processed by the data acquisition device. The sensor devices are connected to the data acquisition device by a one-way connection, the data acquisition device and the voice processing module are connected bidirectionally, and the data acquisition device is connected to the mechanical moving parts by a one-way connection. The intelligent processing system can judge whether it needs to interact with the customer, realizing human-machine interaction instead of the current rigid frame model.

  10. Evolution of quantum-like modeling in decision making processes

    Energy Technology Data Exchange (ETDEWEB)

    Khrennikova, Polina [School of Management, University of Leicester, University Road Leicester LE1 7RH (United Kingdom)

    2012-12-18

    The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so called 'quantum like' models is to model the decision making processes in a macroscopic setting, capturing the particular 'context' in which the decisions are taken. Several subsequent empirical findings proved that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe more accurately the decision making processes. A next step in the development of QL-modeling in decision making was the application of Schroedinger equation to describe the evolution of people's mental states. A shortcoming of Schroedinger equation is its inability to capture dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling the human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered as a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.
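
    For reference, the master equation invoked here has, in its standard open-quantum-systems (GKSL/Lindblad) textbook form, the structure below; the specific Hamiltonian H and coupling operators L_k used for mental-state dynamics are model choices of the cited approach and are not reproduced.

```latex
\frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho]
  + \sum_k \gamma_k \Big( L_k \rho L_k^{\dagger}
  - \tfrac{1}{2}\big\{ L_k^{\dagger} L_k,\ \rho \big\} \Big)
```

    Here ρ is the density operator representing the decision maker's mental state, the commutator term describes its internal (Schroedinger-like) evolution, and the γ_k set the strength of the coupling to the environmental 'bath' responsible for decoherence.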

  11. Evolution of quantum-like modeling in decision making processes

    Science.gov (United States)

    Khrennikova, Polina

    2012-12-01

    The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so called 'quantum like' models is to model the decision making processes in a macroscopic setting, capturing the particular 'context' in which the decisions are taken. Several subsequent empirical findings proved that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe more accurately the decision making processes. A next step in the development of QL-modeling in decision making was the application of Schrödinger equation to describe the evolution of people's mental states. A shortcoming of Schrödinger equation is its inability to capture dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling the human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered as a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.

  12. Evolution of quantum-like modeling in decision making processes

    International Nuclear Information System (INIS)

    Khrennikova, Polina

    2012-01-01

    The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so called 'quantum like' models is to model the decision making processes in a macroscopic setting, capturing the particular 'context' in which the decisions are taken. Several subsequent empirical findings proved that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe more accurately the decision making processes. A next step in the development of QL-modeling in decision making was the application of Schrödinger equation to describe the evolution of people's mental states. A shortcoming of Schrödinger equation is its inability to capture dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling the human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered as a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.

  13. Spontaneous cortical activity reveals hallmarks of an optimal internal model of the environment.

    Science.gov (United States)

    Berkes, Pietro; Orbán, Gergo; Lengyel, Máté; Fiser, József

    2011-01-07

    The brain maintains internal models of its environment to interpret sensory inputs and to prepare actions. Although behavioral studies have demonstrated that these internal models are optimally adapted to the statistics of the environment, the neural underpinning of this adaptation is unknown. Using a Bayesian model of sensory cortical processing, we related stimulus-evoked and spontaneous neural activities to inferences and prior expectations in an internal model and predicted that they should match if the model is statistically optimal. To test this prediction, we analyzed visual cortical activity of awake ferrets during development. Similarity between spontaneous and evoked activities increased with age and was specific to responses evoked by natural scenes. This demonstrates the progressive adaptation of internal models to the statistics of natural stimuli at the neural level.
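
    The statistical-optimality prediction tested here can be stated compactly (our notation, not the authors'): if evoked activity represents the posterior over latent features y given a stimulus x and spontaneous activity represents the prior, then for an internal model whose assumed stimulus statistics match the natural ones, averaging the posterior over natural stimuli recovers the prior, so spontaneous activity should match stimulus-averaged evoked activity.

```latex
P_{\mathrm{spont}}(y) \;\approx\; \big\langle P(y \mid x) \big\rangle_{x \sim P_{\mathrm{natural}}}
\;=\; \int P(y \mid x)\, P_{\mathrm{natural}}(x)\, dx \;=\; P_{\mathrm{prior}}(y)
```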

  14. Combining On-Line Characterization Tools with Modern Software Environments for Optimal Operation of Polymerization Processes

    Directory of Open Access Journals (Sweden)

    Navid Ghadipasha

    2016-02-01

    Full Text Available This paper discusses the initial steps towards the formulation and implementation of a generic and flexible model centric framework for integrated simulation, estimation, optimization and feedback control of polymerization processes. For the first time it combines the powerful capabilities of the automatic continuous on-line monitoring of polymerization system (ACOMP), with a modern simulation, estimation and optimization software environment towards an integrated scheme for the optimal operation of polymeric processes. An initial validation of the framework was performed for modelling and optimization using literature data, illustrating the flexibility of the method to apply under different systems and conditions. Subsequently, off-line capabilities of the system were fully tested experimentally for model validations, parameter estimation and process optimization using ACOMP data. Experimental results are provided for free radical solution polymerization of methyl methacrylate.

  15. Assimilation of the Observational Data in the Marine Ecosystem Adaptive Model at the Known Mean Values of the Processes in the Marine Environment

    Directory of Open Access Journals (Sweden)

    I.E. Timchenko

    2017-10-01

    Full Text Available Assimilation of observational data in marine ecosystem adaptive models constructed by the adaptive balance of causes method is considered. It is shown that the feedback balance between the ecosystem variables and the rates of their change, used in the method's equations, permits the introduction of a stationary state of the ecosystem characterized by the observed mean values of the variables. A method for assessing the normalized coefficients of influence is proposed, based on applying Euler's theorem on homogeneous functions to the functions representing the material balances of the biochemical reactions of substance transformation. It is shown that the normalized ratios of the mean values of the modeled processes can be used as estimates of the derivatives of the reaction products with respect to the resources included in the equations of material balance. As an example, a one-dimensional adaptive model of the upper-layer sea ecosystem is constructed; it is based on the scheme of cause-effect relations of the Fasham, Ducklow and McKelvie model of plankton dynamics and the nitrogen cycle. It is shown that in such a model observational data are assimilated by automatic adaptation of the model variables to the assimilated information, provided that the substance material balances are preserved in the transformation reactions. Data simulating both observations of chlorophyll a concentrations and the marine environment dynamics are assimilated in the model. Time scenarios of the biochemical processes are constructed; they confirm the applicability of the proposed method for assessing the influence coefficients based on the ratios of the mean values of the simulated processes.

  16. Silicon-Carbide Power MOSFET Performance in High Efficiency Boost Power Processing Unit for Extreme Environments

    Science.gov (United States)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Del Castillo, Linda Y.; Fitzpatrick, Fred; Chen, Yuan

    2016-01-01

    Silicon-Carbide device technology has generated much interest in recent years. With superior thermal performance, power ratings and potential switching frequencies over its Silicon counterpart, Silicon-Carbide offers a greater possibility for high-powered switching applications in extreme environments. In particular, the maturing process technology of Silicon-Carbide Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFETs) has produced a plethora of commercially available power-dense, low on-state-resistance devices capable of switching at high frequencies. A novel hard-switched power processing unit (PPU) is implemented utilizing Silicon-Carbide power devices. Accelerated life data are captured and assessed in conjunction with a damage accumulation model of gate oxide and drain-source junction lifetime to evaluate potential system performance in high-temperature environments.

  17. An Extended Petri-Net Based Approach for Supply Chain Process Enactment in Resource-Centric Web Service Environment

    Science.gov (United States)

    Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi

    Enacting a supply-chain process involves various partners and different IT systems. REST receives increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in a resource-centric Web service environment. In this paper, a Petri-net based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Then, based on the resource meta-model, XML schemas and documents are derived, which represent resources and their states in the Petri net. Thereafter, XML-net, a high-level Petri net, is employed for modeling the control and data flow of the process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. Therefore, unified resource representation and RESTful services description are proposed for cross-system integration in a more effective way. A case study is given to illustrate the approach and the desirable features of the approach are discussed.

  18. Business Process Modeling for Domain Inbound Logistics System : Analytical Perspective with BPMN 2.0

    OpenAIRE

    Khabbazi, Mahmood Reza; Hasan, M. K; Sulaiman, R; Shapi’i, A

    2013-01-01

    Among different Business Process Management strategies and methodologies, one common feature is to capture existing processes and represent the new processes adequately. Business Process Modelling (BPM) plays a crucial role in such an effort. This paper proposes a "to-be" inbound logistics business process model using the BPMN 2.0 standard, specifying the structure and behaviour of the system within the SME environment. The generic framework of the inbound logistics model consists of one main high-le...

  19. Modeling the effect of urban infrastructure on hydrologic processes within i-Tree Hydro, a statistically and spatially distributed model

    Science.gov (United States)

    Taggart, T. P.; Endreny, T. A.; Nowak, D.

    2014-12-01

    Gray and green infrastructure in urban environments alters many natural hydrologic processes, creating an urban water balance unique to the developed environment. A common way to assess the consequences of impervious cover and gray infrastructure is by measuring runoff hydrographs. This focus on the watershed outlet masks the spatial variation of hydrologic process alterations across the urban environment in response to localized landscape characteristics. We attempt to represent this spatial variation in the urban environment using the statistically and spatially distributed i-Tree Hydro model, a scoping-level urban forest effects water balance model. i-Tree Hydro has undergone expansion and modification to include the effect of green infrastructure processes, road network attributes, and urban pipe system leakages. These additions to the model are intended to increase the understanding of the altered urban hydrologic cycle by examining the effects of the location of these structures on the water balance, specifically their effect on the spatially varying properties of interception, soil moisture and runoff generation. Differences in predicted properties and optimized parameter sets between the two models are examined and related to the recent landscape modifications. Datasets used in this study consist of watersheds and sewersheds within the Syracuse, NY metropolitan area, an urban area that has integrated green and gray infrastructure practices to alleviate stormwater problems.

  20. Integrated model of port oil piping transportation system safety including operating environment threats

    Directory of Open Access Journals (Sweden)

    Kołowrocki Krzysztof

    2017-06-01

    Full Text Available The paper presents an integrated general model of a complex technical system, linking its multistate safety model with the model of its operation process including operating environment threats, and considering its safety structures and its components' safety parameters, which vary at different operation states. Under the assumption that the system has an exponential safety function, the safety characteristics of the port oil piping transportation system are determined.
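
    A minimal sketch of the form such an integrated model typically takes (notation assumed here, not copied from the paper): at operation state z_b the system safety coordinate for safety state u is exponential with intensity λ_b(u), and the unconditional safety function mixes the conditional ones using the limit transient probabilities p_b of the operation process at its ν states:

```latex
[S(t,u)]^{(b)} = \exp\!\big[-\lambda_b(u)\,t\big], \qquad
S(t,u) \;\cong\; \sum_{b=1}^{\nu} p_b\,[S(t,u)]^{(b)}, \qquad t \ge 0
```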

  1. Integrated model of port oil piping transportation system safety including operating environment threats

    OpenAIRE

    Kołowrocki, Krzysztof; Kuligowska, Ewa; Soszyńska-Budny, Joanna

    2017-01-01

    The paper presents an integrated general model of a complex technical system, linking its multistate safety model with the model of its operation process including operating environment threats, and considering its safety structures and its components' safety parameters, which vary at different operation states. Under the assumption that the system has an exponential safety function, the safety characteristics of the port oil piping transportation system are determined.

  2. Process control for sheet-metal stamping: process modeling, controller design and shop-floor implementation

    CERN Document Server

    Lim, Yongseob; Ulsoy, A Galip

    2014-01-01

    Process Control for Sheet-Metal Stamping presents a comprehensive and structured approach to the design and implementation of controllers for the sheet metal stamping process. The use of process control for sheet-metal stamping greatly reduces defects in deep-drawn parts and can also yield large material savings from reduced scrap. Sheet-metal forming is a complex process and most often characterized by partial differential equations that are numerically solved using finite-element techniques. In this book, twenty years of academic research are reviewed and the resulting technology transitioned to the industrial environment. The sheet-metal stamping process is modeled in a manner suitable for multiple-input multiple-output control system design, with commercially available sensors and actuators. These models are then used to design adaptive controllers and real-time controller implementation is discussed. Finally, experimental results from actual shopfloor deployment are presented along with ideas for further...

  3. Neuroscientific Model of Motivational Process

    Directory of Open Access Journals (Sweden)

    Sung-Il Kim

    2013-03-01

    Full Text Available Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  4. SPARX, a new environment for Cryo-EM image processing.

    Science.gov (United States)

    Hohn, Michael; Tang, Grant; Goodyear, Grant; Baldwin, P R; Huang, Zhong; Penczek, Pawel A; Yang, Chao; Glaeser, Robert M; Adams, Paul D; Ludtke, Steven J

    2007-01-01

    SPARX (single particle analysis for resolution extension) is a new image processing environment with a particular emphasis on transmission electron microscopy (TEM) structure determination. It includes a graphical user interface that provides a complete graphical programming environment with a novel data/process-flow infrastructure, an extensive library of Python scripts that perform specific TEM-related computational tasks, and a core library of fundamental C++ image processing functions. In addition, SPARX relies on the EMAN2 library and cctbx, the open-source computational crystallography library from PHENIX. The design of the system is such that future inclusion of other image processing libraries is a straightforward task. The SPARX infrastructure intelligently handles retention of intermediate values, even those inside programming structures such as loops and function calls. SPARX and all dependencies are free for academic use and available with complete source.

  5. Social, economic, and political processes that create built environment inequities: perspectives from urban African Americans in Atlanta.

    Science.gov (United States)

    Redwood, Yanique; Schulz, Amy J; Israel, Barbara A; Yoshihama, Mieko; Wang, Caroline C; Kreuter, Marshall

    2010-01-01

    Growing evidence suggests that the built environment features found in many high-poverty urban areas contribute to negative health outcomes. Both built environment hazards and negative health outcomes disproportionately affect poor people of color. We used community-based participatory research and Photovoice in inner-city Atlanta to elicit African Americans' perspectives on their health priorities. The built environment emerged as a critical factor, impacting physical and mental health outcomes. We offer a conceptual model, informed by residents' perspectives, linking social, economic, and political processes to built environment and health inequities. Research, practice, and policy implications are discussed within an environmental justice framework.

  6. Mathematical modeling of biomass fuels formation process

    International Nuclear Information System (INIS)

    Gaska, Krzysztof; Wandrasz, Andrzej J.

    2008-01-01

    The increasing demand for thermal and electric energy in many branches of industry and municipal management accounts for a drastic diminishing of natural resources (fossil fuels). Meanwhile, in numerous technical processes, a huge mass of wastes is produced. A segregated and converted combustible fraction of the wastes, with relatively high calorific value, may be used as a component of formed fuels. The utilization of the formed fuel components from segregated groups of waste in associated processes of co-combustion with conventional fuels causes significant savings resulting from partial replacement of fossil fuels, and reduction of environmental pollution resulting directly from the limitation of waste migration to the environment (soil, atmospheric air, surface and underground water). The realization of technological processes with the utilization of formed fuel in associated thermal systems should be qualified by technical criteria, which means that elementary processes as well as factors of sustainable development, from a global viewpoint, must not be disturbed. The utilization of post-process waste should be preceded by detailed technical, ecological and economic analyses. In order to optimize the mixing process of fuel components, a mathematical model of the forming process was created. The model is defined as a group of data structures which uniquely identify a real process and conversion of this data in algorithms based on a problem of linear programming. The paper also presents the optimization of parameters in the process of forming fuels using a modified simplex algorithm with a polynomial worktime. This model is a datum-point in the numerical modeling of real processes, allowing a precise determination of the optimal elementary composition of formed fuels components, with assumed constraints and decision variables of the task
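
    The optimization step described above, finding an optimal composition of formed-fuel components under constraints, is a classic blending linear programme. The sketch below uses SciPy's HiGHS solver in place of the authors' modified simplex algorithm, and all coefficients and constraints are illustrative assumptions.

```python
# Generic fuel-blending linear programme (illustrative numbers; SciPy's HiGHS
# solver stands in for the modified simplex algorithm mentioned in the record).
import numpy as np
from scipy.optimize import linprog

# Decision variables: mass fractions of three waste-derived components.
cost = np.array([12.0, 8.0, 15.0])        # cost per kg of each component

# Component properties used in the constraints.
cal_value = np.array([22.0, 15.0, 25.0])  # calorific value, MJ/kg
chlorine  = np.array([0.5, 1.2, 0.3])     # chlorine content, % by mass

# Constraints: calorific value >= 18 MJ/kg, chlorine <= 0.8 %, fractions sum to 1.
A_ub = np.vstack([-cal_value, chlorine])  # -CV.x <= -18  and  Cl.x <= 0.8
b_ub = np.array([-18.0, 0.8])
A_eq = np.ones((1, 3))
b_eq = np.array([1.0])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * 3, method="highs")
print("fractions:", res.x, "cost per kg:", res.fun)
```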

  7. Bottom-up learning of hierarchical models in a class of deterministic POMDP environments

    Directory of Open Access Journals (Sweden)

    Itoh Hideaki

    2015-09-01

    Full Text Available The theory of partially observable Markov decision processes (POMDPs) is a useful tool for developing various intelligent agents, and learning hierarchical POMDP models is one of the key approaches for building such agents when the environments of the agents are unknown and large. To learn hierarchical models, bottom-up learning methods in which learning takes place in a layer-by-layer manner from the lowest to the highest layer are already extensively used in some research fields such as hidden Markov models and neural networks. However, little attention has been paid to bottom-up approaches for learning POMDP models. In this paper, we present a novel bottom-up learning algorithm for hierarchical POMDP models and prove that, by using this algorithm, a perfect model (i.e., a model that can perfectly predict future observations) can be learned at least in a class of deterministic POMDP environments.
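
    To make the setting concrete, the sketch below defines a tiny deterministic POMDP of the kind the result applies to: both the transition and the observation are deterministic functions, yet a single observation does not identify the state. The three-room layout is an illustrative assumption, not the paper's benchmark.

```python
# Minimal deterministic POMDP environment: transitions and observations are
# deterministic functions of (state, action) and state, respectively. The
# layout below is illustrative, not taken from the cited paper.

class DeterministicPOMDP:
    def __init__(self, transitions, observations, start):
        self.T = transitions      # (state, action) -> next state
        self.O = observations     # state -> observation
        self.state = start

    def step(self, action):
        self.state = self.T[(self.state, action)]
        return self.O[self.state]   # the agent only sees the observation


# Corridor of three rooms; rooms A and C look identical ("grey"), B has a lamp.
T = {("A", "right"): "B", ("B", "right"): "C", ("C", "right"): "C",
     ("A", "left"):  "A", ("B", "left"):  "A", ("C", "left"):  "B"}
O = {"A": "grey", "B": "light", "C": "grey"}   # A and C are indistinguishable

env = DeterministicPOMDP(T, O, start="A")
print([env.step(a) for a in ["right", "right", "left", "left"]])
# -> ['light', 'grey', 'light', 'grey']; history is needed to tell A from C
```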

  8. Mathematical modeling in economics, ecology and the environment

    CERN Document Server

    Hritonenko, Natali

    2013-01-01

    Updated to textbook form by popular demand, this second edition discusses diverse mathematical models used in economics, ecology, and the environmental sciences with emphasis on control and optimization. It is intended for graduate and upper-undergraduate course use; however, applied mathematicians, industry practitioners, and a vast number of interdisciplinary academics will find the presentation highly useful. Core topics of this text are: economic growth and technological development; population dynamics and human impact on the environment; resource extraction and scarcity; air and water contamination; rational management of the economy and environment; and climate change and global dynamics. The step-by-step approach taken is problem-based and easy to follow. The authors aptly demonstrate that the same models may be used to describe different economic and environmental processes and that similar invest...

  9. Applications integration in a hybrid cloud computing environment: modelling and platform

    Science.gov (United States)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application services providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra ISs. A run-time platform is developed and a cross-computing environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  10. Teaching Process Writing in an Online Environment

    Science.gov (United States)

    Carolan, Fergal; Kyppö, Anna

    2015-01-01

    This reflective practice paper offers some insights into teaching an interdisciplinary academic writing course aimed at promoting process writing. The study reflects on students' acquisition of writing skills and the teacher's support practices in a digital writing environment. It presents writers' experiences related to various stages of process…

  11. Cabin Environment Physics Risk Model

    Science.gov (United States)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of the crew critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.

  12. A web-based, collaborative modeling, simulation, and parallel computing environment for electromechanical systems

    Directory of Open Access Journals (Sweden)

    Xiaoliang Yin

    2015-03-01

    Full Text Available A complex electromechanical system is usually composed of multiple components from different domains, including mechanical, electronic, hydraulic, control, and so on. Modeling and simulation of electromechanical systems on a unified platform is one of the research hotspots in systems engineering at present, and it is also the development trend in the design of complex electromechanical systems. The unified modeling techniques and tools based on the Modelica language provide a satisfactory solution. To meet the requirements of collaborative modeling, simulation, and parallel computing for complex electromechanical systems based on Modelica, a general web-based modeling and simulation prototype environment, namely WebMWorks, is designed and implemented. Based on rich Internet application technologies, an interactive graphical user interface for modeling and post-processing in a web browser was implemented; with the collaborative design module, the environment supports top-down, concurrent modeling and team cooperation; additionally, a service-oriented architecture was applied to supply compiling and solving services that run on cloud-like servers, so the environment can manage and dispatch large-scale simulation tasks in parallel on multiple computing servers simultaneously. An engineering application concerning a pure electric vehicle was tested on WebMWorks. The results of the simulation and parametric experiment demonstrate that the tested web-based environment can effectively shorten the design cycle of a complex electromechanical system.

  13. Multiengine Speech Processing Using SNR Estimator in Variable Noisy Environments

    Directory of Open Access Journals (Sweden)

    Ahmad R. Abu-El-Quran

    2012-01-01

    Full Text Available We introduce a multiengine speech processing system that can detect the location and the type of audio signal in variable noisy environments. This system detects the location of the audio source using a microphone array; the system examines the audio first, determines if it is speech/nonspeech, then estimates the signal-to-noise ratio (SNR) value using a Discrete-Valued SNR Estimator. Using this SNR value, instead of trying to adapt the speech signal to the speech processing system, we adapt the speech processing system to the surrounding environment of the captured speech signal. In this paper, we introduce the Discrete-Valued SNR Estimator and a multiengine classifier, using Multiengine Selection or Multiengine Weighted Fusion. We also use SI as an example of the speech processing task. The Discrete-Valued SNR Estimator achieves an accuracy of 98.4% in characterizing the environment's SNR. Compared to a conventional single-engine SI system, the improvement in accuracy was as high as 9.0% and 10.0% for Multiengine Selection and Multiengine Weighted Fusion, respectively.
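    As a rough sketch of how a discrete-valued SNR estimate can drive engine selection or weighted fusion (the paper's actual estimator, engines and weights are not specified here), the snippet below quantizes an SNR estimate onto a few levels and fuses per-engine scores with SNR-dependent weights; the bin edges, weights and scores are illustrative assumptions.

```python
import numpy as np

def estimate_discrete_snr(signal, noise_floor, levels=(0, 5, 10, 20, 30)):
    """Quantize a crude SNR estimate (dB) onto a discrete grid, mimicking a
    discrete-valued SNR estimator (bin edges here are illustrative)."""
    snr_db = 10 * np.log10(np.mean(signal**2) / max(np.mean(noise_floor**2), 1e-12))
    return min(levels, key=lambda lv: abs(lv - snr_db))

def multiengine_output(scores_per_engine, snr_level, weights_by_snr):
    """Weighted fusion: combine per-engine class scores with SNR-dependent weights.
    Selection is the special case where one weight is 1 and the rest are 0."""
    w = np.asarray(weights_by_snr[snr_level], dtype=float)
    fused = w @ np.asarray(scores_per_engine)   # (n_engines,) x (n_engines, n_classes)
    return int(np.argmax(fused))

rng = np.random.default_rng(0)
sig, noise = rng.normal(0, 1.0, 16000), rng.normal(0, 0.3, 16000)
level = estimate_discrete_snr(sig, noise)
weights = {0: [1.0, 0.0], 5: [0.8, 0.2], 10: [0.5, 0.5], 20: [0.2, 0.8], 30: [0.0, 1.0]}
scores = [[0.2, 0.8], [0.6, 0.4]]               # two engines, two speaker classes
print("SNR level:", level, "-> decision:", multiengine_output(scores, level, weights))
```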

  14. Differential Susceptibility to the Environment: Are Developmental Models Compatible with the Evidence from Twin Studies?

    Science.gov (United States)

    Del Giudice, Marco

    2016-01-01

    According to models of differential susceptibility, the same neurobiological and temperamental traits that determine increased sensitivity to stress and adversity also confer enhanced responsivity to the positive aspects of the environment. Differential susceptibility models have expanded to include complex developmental processes in which genetic…

  15. Hypercompetitive Environments: An Agent-based model approach

    Science.gov (United States)

    Dias, Manuel; Araújo, Tanya

    Information technology (IT) environments are characterized by complex changes and rapid evolution. Globalization and the spread of technological innovation have increased the need for new strategic information resources, both from individual firms and management environments. Improvements in multidisciplinary methods and, particularly, the availability of powerful computational tools, are giving researchers an increasing opportunity to investigate management environments in their true complex nature. The adoption of a complex systems approach allows for modeling business strategies from a bottom-up perspective — understood as resulting from repeated and local interaction of economic agents — without disregarding the consequences of the business strategies themselves to individual behavior of enterprises, emergence of interaction patterns between firms and management environments. Agent-based models are at the leading approach of this attempt.

  16. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models

    Science.gov (United States)

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A.; Burgueño, Juan; Pérez-Rodríguez, Paulino; de los Campos, Gustavo

    2016-01-01

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance–covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have higher prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. PMID:27793970
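    As a rough illustration of the kernel construction described in this abstract, the sketch below builds a linear (GBLUP-style) and a Gaussian genomic kernel from a marker matrix and combines one of them with an assumed between-environment genetic correlation matrix through a Kronecker product; the marker data, correlation values, bandwidth and kernel normalization are illustrative choices, not those of the CIMMYT data sets.

```python
import numpy as np

def gblup_kernel(X):
    """Linear (GBLUP-style) genomic kernel: G = Xc Xc' / p for centered markers Xc."""
    Xc = X - X.mean(axis=0)
    return Xc @ Xc.T / X.shape[1]

def gaussian_kernel(X, h=1.0):
    """Gaussian kernel GK(i, j) = exp(-h * d_ij^2 / median(d^2))."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-h * d2 / np.median(d2[d2 > 0]))

n_geno, n_mark = 50, 200
rng = np.random.default_rng(1)
X = rng.integers(0, 3, size=(n_geno, n_mark)).astype(float)  # marker scores 0/1/2
G = gblup_kernel(X)                                           # or gaussian_kernel(X)
Sigma_E = np.array([[1.0, 0.6, 0.4],                          # assumed genetic correlations
                    [0.6, 1.0, 0.5],                          # between three environments
                    [0.4, 0.5, 1.0]])
K_u = np.kron(Sigma_E, G)   # covariance of the genetic effects u across environments
print(K_u.shape)            # (n_env * n_geno, n_env * n_geno)
```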

  17. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models

    Directory of Open Access Journals (Sweden)

    Jaime Cuevas

    2017-01-01

    Full Text Available The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance–covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have higher prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u.

  18. An approach for investigation of secure access processes at a combined e-learning environment

    Science.gov (United States)

    Romansky, Radi; Noninska, Irina

    2017-12-01

    The article discusses an approach to investigate processes for regulating security and privacy control in a heterogeneous e-learning environment realized as a combination of traditional and cloud means and tools. The authors' proposal for a combined architecture of an e-learning system is presented, and the main subsystems and procedures are discussed. A formalization of the processes for using different types of resources (public, private internal and private external) is proposed. The apparatus of Markov chains (MC) is used for modelling and analytical investigation of secure access to the resources, and some assessments are presented.
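    To illustrate the kind of Markov-chain analysis referred to here, the minimal sketch below computes the stationary distribution of an assumed transition matrix over access states (authentication plus the three resource types); the state set and probabilities are hypothetical and not taken from the authors' model.

```python
import numpy as np

# States (assumed for illustration): 0 = authentication, 1 = public resources,
# 2 = private internal resources, 3 = private external (cloud) resources.
P = np.array([[0.10, 0.50, 0.25, 0.15],
              [0.05, 0.60, 0.20, 0.15],
              [0.10, 0.30, 0.50, 0.10],
              [0.20, 0.30, 0.10, 0.40]])   # row-stochastic transition matrix

# Stationary distribution pi solves pi P = pi with sum(pi) = 1: take the left
# eigenvector of P associated with eigenvalue 1 and normalize it.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi /= pi.sum()
print({s: round(p, 3) for s, p in zip(
    ["auth", "public", "priv-internal", "priv-external"], pi)})
```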

  19. Big Data X-Learning Resources Integration and Processing in Cloud Environments

    Directory of Open Access Journals (Sweden)

    Kong Xiangsheng

    2014-09-01

    Full Text Available The cloud computing platform has good flexibility characteristics, so more and more learning systems are being migrated to the cloud platform. Firstly, this paper describes different types of educational environments and the data they provide. Then, it proposes an architecture for mining, integrating and processing heterogeneous learning resources. In order to integrate and process the different types of learning resources found in different educational environments, the paper proposes a massive-storage integration algorithm and a conversion algorithm for storing and managing heterogeneous learning resources in cloud environments.

  20. Modeling and optimization of wet sizing process

    International Nuclear Information System (INIS)

    Thai Ba Cau; Vu Thanh Quang and Nguyen Ba Tien

    2004-01-01

    Mathematical simulation on the basis of Stokes' law has been done for the wet sizing process in cylindrical equipment at laboratory and semi-industrial scale. The model consists of mathematical equations describing relations between variables, such as: - the residence time distribution function of emulsion particles in the separating zone of the equipment, depending on flow rate, height, diameter and structure of the equipment; - the size-distribution function in the fine and coarse parts, depending on the residence time distribution function of emulsion particles, on characteristics of the material being processed, such as specific density and shape, and on characteristics of the classification environment, such as specific density and viscosity. - An experimental model was developed on data collected from an experimental cylindrical apparatus with diameter x height of the sedimentation chamber equal to 50 x 40 cm for an emulsion of zirconium silicate in water. - Using this experimental model allows determination of the optimal flow rate in order to obtain a product with the desired grain size in terms of average size or size distribution function. (author)
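    For orientation, the following minimal sketch applies Stokes' law, the physical basis named in the abstract, to estimate a settling velocity and a cut size for zirconium silicate particles in water; the particle size, upflow velocity and fluid properties are illustrative values, not the paper's experimental parameters.

```python
def stokes_velocity(d_m, rho_p, rho_f, mu, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere under Stokes' law:
    v = (rho_p - rho_f) * g * d^2 / (18 * mu)."""
    return (rho_p - rho_f) * g * d_m**2 / (18.0 * mu)

def cut_diameter(v_cut, rho_p, rho_f, mu, g=9.81):
    """Diameter whose settling velocity equals the upward flow velocity v_cut;
    smaller particles report to the fine (overflow) fraction."""
    return (18.0 * mu * v_cut / ((rho_p - rho_f) * g)) ** 0.5

# Zirconium silicate (~4600 kg/m3) settling in water, 10 micron particle:
v = stokes_velocity(10e-6, 4600.0, 1000.0, 1.0e-3)
print(f"settling velocity ~ {v*1000:.3f} mm/s")
print(f"cut size at 0.5 mm/s upflow ~ "
      f"{cut_diameter(5e-4, 4600.0, 1000.0, 1.0e-3)*1e6:.1f} microns")
```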

  1. Modeled near-field environment porosity modifications due to coupled thermohydrologic and geochemical processes

    International Nuclear Information System (INIS)

    Glassley, W. E.; Nitao, J. J.

    1998-01-01

    Heat deposited by waste packages in nuclear waste repositories can modify rock properties by instigating mineral dissolution and precipitation along hydrothermal flow pathways. Modeling this reactive transport requires coupling fluid flow to permeability changes resulting from dissolution and precipitation. Modification of the NUFT thermohydrologic (TH) code package to account for this coupling in a simplified geochemical system has been used to model the time-dependent change in porosity, permeability, matrix and fracture saturation, and temperature in the vicinity of waste-emplacement drifts, using conditions anticipated for the potential Yucca Mountain repository. The results show, within a few hundred years, dramatic porosity reduction approximately 10 m above emplacement drifts. Most of this reduction is attributed to deposition of solute load at the boiling front, although some of it also results from decreasing temperature along the flow path. The actual distribution of the nearly sealed region is sensitive to the time-dependent characteristics of the thermal load imposed on the environment and suggests that the geometry of the sealed region can be engineered by managing the waste-emplacement strategy.

  2. Automatic, Global and Dynamic Student Modeling in a Ubiquitous Learning Environment

    Directory of Open Access Journals (Sweden)

    Sabine Graf

    2009-03-01

    Full Text Available Ubiquitous learning allows students to learn at any time and any place. Adaptivity plays an important role in ubiquitous learning, aiming at providing students with adaptive and personalized learning material, activities, and information at the right place and the right time. However, for providing rich adaptivity, the student model needs to be able to gather a variety of information about the students. In this paper, an automatic, global, and dynamic student modeling approach is introduced, which aims at identifying and frequently updating information about students’ progress, learning styles, interests and knowledge level, problem solving abilities, preferences for using the system, social connectivity, and current location. This information is gathered in an automatic way, using students’ behavior and actions in different learning situations provided by different components/services of the ubiquitous learning environment. By providing a comprehensive student model, students can be supported by rich adaptivity in every component/service of the learning environment. Furthermore, the information in the student model can help in giving teachers a better understanding about the students’ learning process.

  3. A model for hypermedia learning environments based on electronic books

    Directory of Open Access Journals (Sweden)

    Ignacio Aedo

    1997-12-01

    Full Text Available Current hypermedia learning environments do not have a common development basis. Their designers have often used ad-hoc solutions to solve the learning problems they have encountered. However, hypermedia technology can take advantage of employing a theoretical scheme - a model - which takes into account various kinds of learning activities, and solves some of the problems associated with its use in the learning process. The model can provide designers with the tools for creating a hypermedia learning system, by allowing the elements and functions involved in the definition of a specific application to be formally represented.

  4. Modeling Unidirectional Pedestrian Movement: An Investigation of Diffusion Behavior in the Built Environment

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2015-01-01

    Full Text Available Unidirectional pedestrian movement is a special phenomenon in the evacuation process of large public buildings and of urban environments at the pedestrian scale. Several macroscopic models for collective behaviors have been built to predict pedestrian flow. However, current models do not explain the diffusion behavior in pedestrian crowd movement, which can be important in representing spatial-temporal crowd density differentiation in the movement process. This study builds a macroscopic model for describing crowd diffusion behavior and evaluating unidirectional pedestrian flow. The proposed model discretizes time and walking speed, with walking speed following a geometric distribution, to calculate downstream pedestrian crowd flow and analyze the movement process based on the upstream number of pedestrians and the average walking speed. The simulated results are calibrated with video observation data from a baseball stadium to verify the model's precision. Statistical results verify that the proposed pedestrian diffusion model can accurately describe pedestrian macro-movement behavior within the margin of error.
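    As a hedged sketch of the general idea rather than the authors' calibrated model, the snippet below discretizes time and assigns walking speeds from a truncated geometric distribution to estimate how many upstream pedestrians have passed a downstream cross-section after each time step; all parameters and the discretization are hypothetical.

```python
import numpy as np

def geometric_speed_pmf(p, n_steps):
    """PMF of a truncated geometric distribution over discrete speed classes
    k = 1..n_steps cells per time step."""
    k = np.arange(1, n_steps + 1)
    pmf = (1 - p) ** (k - 1) * p
    return pmf / pmf.sum()

def downstream_counts(n_upstream, distance_cells, p, horizon):
    """Expected number of pedestrians that have reached a cross-section
    `distance_cells` away after each time step, assuming each pedestrian keeps
    a speed drawn once from the geometric PMF."""
    pmf = geometric_speed_pmf(p, n_steps=5)
    speeds = np.arange(1, 6)
    arrived = []
    for t in range(1, horizon + 1):
        reached = pmf[speeds * t >= distance_cells].sum()   # fraction fast enough
        arrived.append(n_upstream * reached)
    return arrived

print([round(x, 1) for x in downstream_counts(200, 10, p=0.4, horizon=6)])
```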

  5. Stochastic Signal Processing for Sound Environment System with Decibel Evaluation and Energy Observation

    Directory of Open Access Journals (Sweden)

    Akira Ikuta

    2014-01-01

    Full Text Available In a real sound environment system, a specific signal shows various types of probability distribution, and the observation data are usually contaminated by external noise (e.g., background noise of a non-Gaussian distribution type). Furthermore, there potentially exist various nonlinear correlations in addition to the linear correlation between input and output time series. Consequently, the system input and output relationship in the real phenomenon often cannot be represented by a simple model using only the linear correlation and lower order statistics. In this study, complex sound environment systems that are difficult to analyze by the usual structural methods are considered. By introducing an estimation method for the system parameters reflecting correlation information of the conditional probability distribution under existence of the external noise, a prediction method for the output response probability of sound environment systems is theoretically proposed, in a form suitable for the additive property of the energy variable and evaluation on the decibel scale. The effectiveness of the proposed stochastic signal processing method is experimentally confirmed by applying it to observed data in sound environment systems.
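    The evaluation rests on the textbook relation between the additive energy variable and the decibel scale, which can be written as follows (a standard relation, not the paper's specific estimator):

```latex
% Additive energy variable and its decibel evaluation (reference energy E_0):
E = \sum_i E_i, \qquad
L = 10\,\log_{10}\!\frac{E}{E_0}\ \text{dB}, \qquad
E = E_0\,10^{L/10}.
```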

  6. Modeling styles in business process modeling

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Zugal, S.; Weber, B.; Weidlich, M.; Fahland, D.; Reijers, H.A.; Mendling, J.; Bider, I.; Halpin, T.; Krogstie, J.; Nurcan, S.; Proper, E.; Schmidt, R.; Soffer, P.; Wrycza, S.

    2012-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models. As a consequence, the question arises whether different ways of creating process models exist. In this vein, we observed 115 students engaged in the act of modeling, recording

  7. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models.

    Science.gov (United States)

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A; Burgueño, Juan; Pérez-Rodríguez, Paulino; de Los Campos, Gustavo

    2017-01-05

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance-covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have higher prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. Copyright © 2017 Cuevas et al.

  8. Modeling and Analyzing Operational Decision-Making Synchronization of C2 Organization in Complex Environment

    Directory of Open Access Journals (Sweden)

    Zou Zhigang

    2013-01-01

    Full Text Available In order to improve the capability of operational decision-making synchronization (ODMS) in a command and control (C2) organization, the paper proposes that ODMS is a negotiation process of situation cognition with three phases, "situation cognition, situation interaction and decision-making synchronization", in a complex environment; the model and strategies of ODMS are then given quantitatively. Firstly, measures for the three steps above are defined based on the time consumed in negotiation, and three patterns are proposed for negotiating in a timely manner and with high quality during situation interaction. Secondly, an ODMS model with two stages under continuously changing situations is put forward, and ODMS strategies are analyzed under environmental influence and time restrictions. Thirdly, simulation cases are given to validate the ODMS process under different continuously changing situations; the results of this model satisfy the actual restrictions better than previous models, and the ODMS process can be adjusted to improve ODMS capability. We then discuss the case and summarize the influence factors of ODMS in the C2 organization as organization structure, shared information resources, negotiation patterns, and allocation of decision rights.

  9. An Instructional Method for the AutoCAD Modeling Environment.

    Science.gov (United States)

    Mohler, James L.

    1997-01-01

    Presents a command organizer for AutoCAD to aid new users in operating within the 3-D modeling environment. Addresses analyzing the problem, visualization skills, nonlinear tools, a static view of a dynamic model, the AutoCAD organizer, environment attributes, and control of the environment. Contains 11 references. (JRH)

  10. A process algebra software engineering environment

    NARCIS (Netherlands)

    Diertens, B.

    2008-01-01

    In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. In this article we summarize that work and describe the software development process

  11. Conceptual Diagnosis Model Based on Distinct Knowledge Dyads for Interdisciplinary Environments

    Directory of Open Access Journals (Sweden)

    Cristian VIZITIU

    2014-06-01

    Full Text Available The present paper has the synergic dual purpose of bringing a psychological and neuroscience-related perspective oriented towards decision making and knowledge creation diagnosis within the frame of Knowledge Management. The conceptual model is built by means of Cognitive-Emotional and Explicit-Tacit knowledge dyads and structured on the Analytic Hierarchy Process (AHP), according to the hypothesis which designates the first dyad as an accessing mechanism for knowledge stored in the second dyad. Due to the well-acknowledged needs concerning new advanced decision-making instruments and enhanced knowledge creation processes in the field of technical space projects, characterized by a high level of complexity, the study also tries to prove the relevance of the proposed conceptual diagnosis model in the Systems Engineering (SE) methodology, which in turn foresees concurrent engineering within interdisciplinary working environments. The theoretical model, entitled DiagnoSE, has the potential to provide practical implications to the space and space-related business sector, but not only, and, on the other hand, to trigger and inspire other knowledge-management-related research for refining and testing the proposed instrument in SE or other similar decision-making-based working environments.

  12. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Directory of Open Access Journals (Sweden)

    Anna Espasa

    2013-06-01

    Full Text Available The aim of this article is to present a methodological model for analyzing students' group interaction to improve their essays in online learning environments based on asynchronous and written communication. In these environments, teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students utilize feedback to improve learning. This methodological model fills this gap, contributing to the analysis of how feedback processes are implemented while students discuss collaboratively in a specific case of writing assignments. A review of different methodological models was carried out to define a framework adjusted to the analysis of the relationship between written and asynchronous group interaction, students' activity, and the changes incorporated into the final text. The proposed model includes the following dimensions: (1) student participation, (2) nature of student learning, and (3) quality of student learning. The main contribution of this article is to present the methodological model and also to ascertain the model's operativity regarding how students incorporate such feedback into their essays.

  13. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  14. Microbial consortia in meat processing environments

    Science.gov (United States)

    Alessandria, V.; Rantsiou, K.; Cavallero, M. C.; Riva, S.; Cocolin, L.

    2017-09-01

    Microbial contamination in food processing plants can play a fundamental role in food quality and safety. The description of the microbial consortia in the meat processing environment is important since it is a first step in understanding possible routes of product contamination. Furthermore, it may contribute to the development of sanitation programs for effective pathogen removal. The purpose of this study was to characterize the type of microbiota in the environment of meat processing plants: the microbiota of three different meat plants was studied by both traditional and molecular methods (PCR-DGGE) in two different periods. Different levels of contamination emerged between the three plants as well as between the two sampling periods. Conventional methods of killing free-living bacteria through antimicrobial agents and disinfection are often ineffective against bacteria within a biofilm. The use of gas-discharge plasmas can potentially offer a good alternative to conventional sterilization methods. A further purpose of this study was to measure the effectiveness of Atmospheric Pressure Plasma (APP) surface treatments against bacteria in biofilms. Biofilms produced by three different L. monocytogenes strains on stainless steel surfaces were subjected to three different conditions (power, exposure time) of APP. Our results showed that most of the culturable cells are inactivated after plasma exposure, but RNA analysis by qPCR highlighted the entrance of the cells into the viable but non-culturable (VBNC) state, confirming the hypothesis that cells are damaged after plasma treatment but, in a first step, still remain alive. Understanding the effects of APP on L. monocytogenes biofilms can improve the development of sanitation programs that use APP for effective pathogen removal.

  15. Modelling and estimating degradation processes with application in structural reliability

    International Nuclear Information System (INIS)

    Chiquet, J.

    2007-06-01

    The characteristic level of degradation of a given structure is modeled through a stochastic process called the degradation process. The random evolution of the degradation process is governed by a differential system with a Markovian environment. We set up the associated reliability framework by considering the failure of the structure once the degradation process reaches a critical threshold. A closed-form solution of the reliability function is obtained thanks to Markov renewal theory. Then, we build an estimation methodology for the parameters of the stochastic processes involved. The estimation methods and the theoretical results, as well as the associated numerical algorithms, are validated on simulated data sets. Our method is applied to the modelling of a real degradation mechanism, known as crack growth, for which an experimental data set is considered. (authors)
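    The reliability framework sketched here can be written with the usual first-passage definitions; the following is a standard formulation consistent with the description, not the authors' closed-form solution:

```latex
% Degradation process X_t, critical threshold c, failure time T, reliability R(t):
T = \inf\{\, t \ge 0 : X_t \ge c \,\}, \qquad
R(t) = \Pr(T > t) = \Pr\!\Big(\sup_{0 \le s \le t} X_s < c\Big).
```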

  16. Modeling the role of environment in addiction.

    Science.gov (United States)

    Caprioli, Daniele; Celentano, Michele; Paolone, Giovanna; Badiani, Aldo

    2007-11-15

    The aim of this review is to provide an overview of the main types of animal models used to investigate the modulatory role of environment on drug addiction. The environment can alter the responsiveness to addictive drugs in at least three major ways. First, adverse life experiences can make an individual more vulnerable to develop drug addiction or to relapse into drug seeking. Second, neutral environmental cues can acquire, through Pavlovian conditioning, the ability to trigger drug seeking even after long periods of abstinence. Third, the environment immediately surrounding drug taking can alter the behavioral, subjective, and rewarding effects of a given drug, thus influencing the propensity to use the same drug again. We have focused in particular on the results obtained using an animal model we have developed to study the latter type of drug-environment interaction.

  17. A Conceptual Model of the Cognitive Processing of Environmental Distance Information

    Science.gov (United States)

    Montello, Daniel R.

    I review theories and research on the cognitive processing of environmental distance information by humans, particularly that acquired via direct experience in the environment. The cognitive processes I consider for acquiring and thinking about environmental distance information include working-memory, nonmediated, hybrid, and simple-retrieval processes. Based on my review of the research literature, and additional considerations about the sources of distance information and the situations in which it is used, I propose an integrative conceptual model to explain the cognitive processing of distance information that takes account of the plurality of possible processes and information sources, and describes conditions under which particular processes and sources are likely to operate. The mechanism of summing vista distances is identified as widely important in situations with good visual access to the environment. Heuristics based on time, effort, or other information are likely to play their most important role when sensory access is restricted.

  18. Fusion environment sensitive flow and fracture processes

    International Nuclear Information System (INIS)

    1980-01-01

    As a planning activity, the objectives of the workshop were to list, prioritize and milestone the activities necessary to understand, interpret and control the mechanical behavior of candidate fusion reactor alloys. Emphasis was placed on flow and fracture processes which are unique to the fusion environment since the national fusion materials program must evaluate these effects without assistance from other reactor programs

  19. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
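    Purely as an illustration of the iterative structure being simulated, and not of the PATT-based IEEE 12207 model itself, the toy loop below accumulates effort over spiral iterations whose scope can grow when evaluation uncovers changed requirements; the phase names, effort figures, change probability and rework factor are all assumptions.

```python
import random

PHASES = ["risk assessment", "requirements", "design", "coding", "testing", "evaluation"]

def spiral_duration(n_iterations, base_effort_days=20.0, change_prob=0.3, seed=0):
    """Toy discrete-event loop: each spiral iteration runs every phase; with some
    probability the evaluation uncovers changed requirements, which enlarges the
    next iteration (numbers are illustrative, not calibrated)."""
    rng = random.Random(seed)
    total, scale = 0.0, 1.0
    for _ in range(n_iterations):
        for phase in PHASES:
            total += scale * base_effort_days / len(PHASES) * rng.uniform(0.8, 1.2)
        if rng.random() < change_prob:
            scale *= 1.15        # rework carried into the next iteration
    return total

print(f"simulated spiral duration: {spiral_duration(4):.1f} days")
```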

  20. TEACH-ME: IMPLEMENTATION OF MOBILE ENVIRONMENTS TO THE TEACHING-LEARNING PROCESS

    Directory of Open Access Journals (Sweden)

    Luis Eduardo Pérez Peregrino

    2011-05-01

    Full Text Available The research project TEACH-ME (Technology, Engineering Calculus and Hewlett-Packard Mobile Environment) presents an educational proposal that seeks to innovate the teaching and learning processes of mathematics, Logic Basic Programming and Management of Information through the introduction of collaborative working environments, in order to provide integrated development of learning methodologies and enhance students' cognitive abilities. As a case study, it presents the results obtained when applying this project to students in their first semester at the Faculty of Engineering of the Corporación Universitaria Minuto de Dios, which introduces the use of tablet PCs from Hewlett-Packard to support the teaching process. This article presents the process of implementing the TEACH-ME project, developed as an academic environment that has allowed research on the impact of applying information and communication technologies to higher-education teaching. We present the project background, what the implementation process has accomplished so far, the impact observed on the learning and teaching processes, the integration of technologies in the academic setting that has helped carry out the project and, finally, the contributions of the Tablet PC to the teaching-learning process at the University.

  1. Operational SAR Data Processing in GIS Environments for Rapid Disaster Mapping

    Science.gov (United States)

    Bahr, Thomas

    2014-05-01

    DEM without the need of ground control points. This step includes radiometric calibration. (3) A subsequent change detection analysis generates the final map showing the extent of the flash flood on Nov. 5th 2010. The underlying algorithms are provided by three different sources: Geocoding & radiometric calibration (2) is a standard functionality from the commercial SARscape Toolbox for ArcGIS. This toolbox is extended by the filter tool (1), which is called from the SARscape modules in ENVI. The change detection analysis (3) is based on ENVI processing routines and scripted with IDL. (2) and (3) are integrated with ArcGIS using a predefined Python interface. These 3 processing steps are combined using the ArcGIS ModelBuilder to create a new model for rapid disaster mapping in ArcGIS, based on SAR data. Moreover, this model can be dissolved from its desktop environment and published to users across the ArcGIS Server enterprise. Thus disaster zones, e.g. after severe flooding, can be automatically identified and mapped to support local task forces - using an operational workflow for SAR image analysis, which can be executed by the responsible operators without SAR expert knowledge.

  2. Report of the 2014 Programming Models and Environments Summit

    Energy Technology Data Exchange (ETDEWEB)

    Heroux, Michael [US Dept. of Energy, Washington, DC (United States); Lethin, Richard [US Dept. of Energy, Washington, DC (United States)

    2016-09-19

    Programming models and environments play an essential role in high performance computing: they enable the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.

  3. Modeling human behaviors and reactions under dangerous environment.

    Science.gov (United States)

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

    This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers in real-time in virtual environments. The development of the system includes: classification on the conscious/subconscious behaviors and reactions of different people; capturing different motion postures by the Eagle Digital System; establishing 3D character animation models; establishing 3D models for the scene; planning the scenario and the contents; and programming within Virtools Dev. Programming within Virtools Dev is subdivided into modeling dangerous events, modeling character's perceptions, modeling character's decision making, modeling character's movements, modeling character's interaction with environment and setting up the virtual cameras. The real-time simulation of human reactions in hazardous environments is invaluable in military defense, fire escape, rescue operation planning, traffic safety studies, and safety planning in chemical factories, the design of buildings, airplanes, ships and trains. Currently, human motion modeling can be realized through established technology, whereas to integrate perception and intelligence into virtual human's motion is still a huge undertaking. The challenges here are the synchronization of motion and intelligence, the accurate modeling of human's vision, smell, touch and hearing, the diversity and effects of emotion and personality in decision making. There are three types of software platforms which could be employed to realize the motion and intelligence within one system, and their advantages and disadvantages are discussed.

  4. Biosphere modeling in waste disposal safety assessments -- An example using the terrestrial-aquatic model of the environment

    International Nuclear Information System (INIS)

    Klos, R.A.

    1998-01-01

    Geological disposal of radioactive wastes is intended to provide long-term isolation of potentially harmful radionuclides from the human environment and the biosphere. The long timescales involved pose unique problems for biosphere modeling because there are considerable uncertainties regarding the state of the biosphere into which releases might ultimately occur. The key to representing the biosphere in long-timescale assessments is the flexibility with which those aspects of the biosphere that are of relevance to dose calculations are represented, and this comes from the way in which key biosphere features, events, and processes are represented in model codes. How this is done in contemporary assessments is illustrated by the Terrestrial-Aquatic Model of the Environment (TAME), an advanced biosphere model for waste disposal assessments recently developed in Switzerland. A numerical example of the release of radionuclides from a subterranean source to an inland valley biosphere is used to illustrate how biosphere modeling is carried out and the practical ways in which meaningful quantitative results can be achieved. The results emphasize the potential for accumulation of radionuclides in the biosphere over long timescales and also illustrate the role of parameter values in such modeling

  5. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    Science.gov (United States)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

  6. Modelling of Gas Flow in the Underground Coal Gasification Process and its Interactions with the Rock Environment

    Directory of Open Access Journals (Sweden)

    Tomasz Janoszek

    2013-01-01

    Full Text Available The main goal of this study was the analysis of gas flow in the underground coal gasification process and interactions with the surrounding rock mass. The article is a discussion of the assumptions for the geometric model and for the numerical method for its solution as well as assumptions for modelling the geochemical model of the interaction between gas-rock-water, in terms of equilibrium calculations, chemical and gas flow modelling in porous mediums. Ansys-Fluent software was used to describe the underground coal gasification process (UCG. The numerical solution was compared with experimental data. The PHREEQC program was used to describe the chemical reaction between the gaseous products of the UCG process and the rock strata in the presence of reservoir waters.

  7. Progress in integrated energy-economy-environment model system development

    International Nuclear Information System (INIS)

    Yasukawa, Shigeru; Mankin, Shuichi; Sato, Osamu; Tadokoro, Yoshihiro; Nakano, Yasuyuki; Nagano, Takao

    1987-11-01

    The Integrated Energy-Economy-Environment Model System has been developed to provide analytical tools for system analysis and technology assessment in the field of nuclear research and development. This model system consists of the following four model groups. The first model block installs 5 models and serves to analyze and generate long-term scenarios of economy-energy-environment evolution. The second model block installs 2 models and serves to analyze structural transition phenomena in energy-economy-environment interactions. The third model block installs 2 models and handles the power reactor installation strategy problem and long-term fuel cycle analysis. The fourth model block installs 5 models and codes and treats cost-benefit-risk analysis and assessments. This report describes mainly the progress and the outlines of application of the model system in the years since the first report on the research and development of the model system (JAERI-M 84 - 139). (author)

  8. Evaluation and comparison of models and modelling tools simulating nitrogen processes in treatment wetlands

    DEFF Research Database (Denmark)

    Edelfeldt, Stina; Fritzson, Peter

    2008-01-01

    In this paper, two ecological models of nitrogen processes in treatment wetlands have been evaluated and compared. These models were implemented, simulated, and visualized using the Modelica modelling and simulation language [P. Fritzson, Principles of Object-Oriented Modelling and Simulation with Modelica 2.1 (Wiley-IEEE Press, USA, 2004)] and an associated tool. The differences and similarities between the MathModelica Model Editor and three other ecological modelling tools have also been evaluated. The results show that the models can well be modelled and simulated in the MathModelica Model Editor, and that nitrogen decrease in a constructed treatment wetland should be described and simulated using the Nitrification/Denitrification model, as this model has the highest overall quality score and provides a more variable environment.

  9. Process material management in the Space Station environment

    Science.gov (United States)

    Perry, J. L.; Humphries, W. R.

    1988-01-01

    The Space Station will provide a unique facility for conducting material-processing and life-science experiments under microgravity conditions. These conditions place special requirements on the U.S. Laboratory for storing and transporting chemicals and process fluids, reclaiming water from selected experiments, treating and storing experiment wastes, and providing vacuum utilities. To meet these needs and provide a safe laboratory environment, the Process Material Management System (PMMS) is being developed. Preliminary design requirements and concepts related to the PMMS are addressed, and the MSFC PMMS breadboard test facility and a preliminary plan for validating the overall system design are discussed.

  10. The dynamic radiation environment assimilation model (DREAM)

    International Nuclear Information System (INIS)

    Reeves, Geoffrey D.; Koller, Josef; Tokar, Robert L.; Chen, Yue; Henderson, Michael G.; Friedel, Reiner H.

    2010-01-01

    The Dynamic Radiation Environment Assimilation Model (DREAM) is a 3-year effort sponsored by the US Department of Energy to provide global, retrospective, or real-time specification of the natural and potential nuclear radiation environments. The DREAM model uses Kalman filtering techniques that combine the strengths of new physical models of the radiation belts with electron observations from long-term satellite systems such as GPS and geosynchronous systems. DREAM includes a physics model for the production and long-term evolution of artificial radiation belts from high altitude nuclear explosions. DREAM has been validated against satellites in arbitrary orbits and consistently produces more accurate results than existing models. Tools for user-specific applications and graphical displays are in beta testing and a real-time version of DREAM has been in continuous operation since November 2009.
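    As a schematic of the data-assimilation idea only (DREAM's actual filter formulation, state vector and flux model are not described in this record), the sketch below runs a scalar Kalman filter that blends a simple decay forecast with noisy observations; the decay factor and noise levels are assumed values.

```python
import numpy as np

def kalman_assimilate(obs, x0, P0, decay=0.9, Q=0.05, R=0.2):
    """Scalar Kalman filter: model x_{k+1} = decay * x_k + w (process noise Q),
    observation y_k = x_k + v (measurement noise R). Returns filtered states."""
    x, P, out = x0, P0, []
    for y in obs:
        x, P = decay * x, decay**2 * P + Q          # forecast step
        K = P / (P + R)                              # Kalman gain
        x, P = x + K * (y - x), (1 - K) * P          # analysis (update) step
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(2)
truth = 5.0 * 0.9 ** np.arange(20)
obs = truth + rng.normal(0, 0.4, size=20)            # noisy flux-like observations
print(np.round(kalman_assimilate(obs, x0=4.0, P0=1.0), 2))
```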

  11. WORK SYSTEM IMPROVEMENT OF THE EVACUATION PROCESS CONDUCTED BY AMBULANCE PARAMEDICS USING VIRTUAL ENVIRONMENT MODELING

    Directory of Open Access Journals (Sweden)

    Herian Atma

    2010-12-01

    Full Text Available Work System Improvement of Evacuation Process Conducted by Emergency Medical Technicians Using Virtual Environment Modeling. The work of emergency medical technicians (EMTs) during patient evacuation involves lifting tasks in an emergency situation, which increases the risk of musculoskeletal disorders such as low back pain. The purpose of this research was to investigate the workplace and ergonomic aspects that influence the work posture of the EMTs using a simulation approach in a virtual environment. A biomechanical model (mannequin) of the EMT was simulated and analyzed using the LBA and OWAS methods. The mannequin was given an improvement based on ergonomic principles of manual lifting tasks and was then reanalyzed. Considering the nature of the situation, the improvement applicable to the work system of the evacuation process conducted by EMTs concerns the work posture of personnel while lifting the patient onto the stretcher; proper lifting techniques can be applied here. The results of this research can be used as a recommendation for the work system of the EMTs.

  12. The Optimization of the Local Public Policies’ Development Process Through Modeling And Simulation

    Directory of Open Access Journals (Sweden)

    Minodora URSĂCESCU

    2012-06-01

    Full Text Available The development of local public policies in Romania is carried out empirically, with strategic management practices in this domain not being based on a scientific instrument capable of anticipating and evaluating the results of implementing a local public policy following a needs-policies-effects logic. Starting from this motivation, the purpose of the paper is to reconceptualize the public policy process on the functioning principles of dynamic systems with feedback, by means of mathematical modeling and simulation techniques. The research is therefore oriented towards developing an optimization method for the local public policy development process, using mathematical modeling and simulation techniques as instruments. The research's main results are, on the one hand, the generation of a new concept of the local public policy process and, on the other hand, the proposal of a conceptual model for a complex software product that will permit parameterized modeling of the policy development process in a virtual environment. The purpose of this software product is to model and simulate each type of local public policy, taking into account the characteristics of the respective policy as well as the values of the parameters of its application environment at a given moment.

  13. Towards Spherical Mesh Gravity and Magnetic Modelling in an HPC Environment

    Science.gov (United States)

    Lane, R. J.; Brodie, R. C.; de Hoog, M.; Navin, J.; Chen, C.; Du, J.; Liang, Q.; Wang, H.; Li, Y.

    2013-12-01

    Staff at Geoscience Australia (GA), Australia's Commonwealth Government geoscientific agency, have routinely performed 3D gravity and magnetic modelling as part of geoscience investigations. For this work, we have used software programs that have been based on a Cartesian mesh spatial framework. These programs have come as executable files that were compiled to operate in a Windows environment on single core personal computers (PCs). To cope with models with higher resolution and larger extents, we developed an approach whereby a large problem could be broken down into a number of overlapping smaller models ('tiles') that could be modelled separately, with the results combined back into a single output model. To speed up the processing, we established a Condor distributed network from existing desktop PCs. A number of factors have caused us to consider a new approach to this modelling work. The drivers for change include: 1) models with very large lateral extents where the effects of Earth curvature are a consideration, 2) a desire to ensure that the modelling of separate regions is carried out in a consistent and managed fashion, 3) migration of scientific computing to off-site High Performance Computing (HPC) facilities, and 4) development of virtual globe environments for integration and visualization of 3D spatial objects. Some of the more surprising realizations to emerge have been that: 1) there aren't any readily available commercial software packages for modelling gravity and magnetic data in a spherical mesh spatial framework, 2) there are many different types of HPC environments, 3) no two HPC environments are the same, and 4) the most common virtual globe environment (i.e., Google Earth) doesn't allow spatial objects to be displayed below the topographic/bathymetric surface. Our response has been to do the following: 1) form a collaborative partnership with researchers at the Colorado School of Mines (CSM) and the China University of Geosciences (CUG

  14. Spectroscopic analyses of chemical adaptation processes within microalgal biomass in response to changing environments

    Energy Technology Data Exchange (ETDEWEB)

    Vogt, Frank, E-mail: fvogt@utk.edu; White, Lauren

    2015-03-31

    Highlights: • Microalgae transform large quantities of inorganics into biomass. • Microalgae interact with their growing environment and adapt their chemical composition. • Sequestration capabilities are dependent on cells’ chemical environments. • We develop a chemometric hard-modeling to describe these chemical adaptation dynamics. • This methodology will enable studies of microalgal compound sequestration. - Abstract: Via photosynthesis, marine phytoplankton transforms large quantities of inorganic compounds into biomass. This has considerable environmental impacts as microalgae contribute for instance to counter-balancing anthropogenic releases of the greenhouse gas CO₂. On the other hand, high concentrations of nitrogen compounds in an ecosystem can lead to harmful algae blooms. In previous investigations it was found that the chemical composition of microalgal biomass is strongly dependent on the nutrient availability. Therefore, it is expected that algae’s sequestration capabilities and productivity are also determined by the cells’ chemical environments. For investigating this hypothesis, novel analytical methodologies are required which are capable of monitoring live cells exposed to chemically shifting environments followed by chemometric modeling of their chemical adaptation dynamics. FTIR-ATR experiments have been developed for acquiring spectroscopic time series of live Dunaliella parva cultures adapting to different nutrient situations. Comparing experimental data from acclimated cultures to those exposed to a chemically shifted nutrient situation reveals insights into which analyte groups participate in modifications of microalgal biomass and on what time scales. For a chemometric description of these processes, a data model has been deduced which explains the chemical adaptation dynamics explicitly rather than empirically. First results show that this approach is feasible and derives information about the chemical biomass adaptations.

  15. Spectroscopic analyses of chemical adaptation processes within microalgal biomass in response to changing environments

    International Nuclear Information System (INIS)

    Vogt, Frank; White, Lauren

    2015-01-01

    Highlights: • Microalgae transform large quantities of inorganics into biomass. • Microalgae interact with their growing environment and adapt their chemical composition. • Sequestration capabilities are dependent on cells’ chemical environments. • We develop a chemometric hard-modeling approach to describe these chemical adaptation dynamics. • This methodology will enable studies of microalgal compound sequestration. - Abstract: Via photosynthesis, marine phytoplankton transforms large quantities of inorganic compounds into biomass. This has considerable environmental impacts as microalgae contribute for instance to counter-balancing anthropogenic releases of the greenhouse gas CO₂. On the other hand, high concentrations of nitrogen compounds in an ecosystem can lead to harmful algae blooms. In previous investigations it was found that the chemical composition of microalgal biomass is strongly dependent on the nutrient availability. Therefore, it is expected that algae’s sequestration capabilities and productivity are also determined by the cells’ chemical environments. For investigating this hypothesis, novel analytical methodologies are required which are capable of monitoring live cells exposed to chemically shifting environments followed by chemometric modeling of their chemical adaptation dynamics. FTIR-ATR experiments have been developed for acquiring spectroscopic time series of live Dunaliella parva cultures adapting to different nutrient situations. Comparing experimental data from acclimated cultures to those exposed to a chemically shifted nutrient situation reveals insights into which analyte groups participate in modifications of microalgal biomass and on what time scales. For a chemometric description of these processes, a data model has been deduced which explains the chemical adaptation dynamics explicitly rather than empirically. First results show that this approach is feasible and derives information about the chemical biomass adaptations

  16. The RHIC/AGS Online Model Environment: Design and Overview

    International Nuclear Information System (INIS)

    Satogata, T.; Brown, K.; Pilat, F.; Tafti Alai, A.; Tepikian, S.; Vanzeijtz

    1999-01-01

    An integrated online modeling environment is currently under development for use by AGS and RHIC physicists and commissioners. This environment combines the modeling efforts of both groups in a CDEV[1] client-server design, providing access to expected machine optics and physics parameters based on live and design machine settings. An abstract modeling interface has been designed as a set of adapters[2] around core computational modeling engines such as MAD and UAL/Teapot++[3]. This approach allows us to leverage existing survey, lattice, and magnet infrastructure, as well as easily incorporate new model engine developments. This paper describes the architecture of the RHIC/AGS modeling environment, including the application interface through CDEV and general tools for graphical interaction with the model using Tcl/Tk. Separate papers at this conference address the specifics of implementation and modeling experience for AGS and RHIC

  17. Influence of fractal substructures of the percolating cluster on transferring processes in macroscopically disordered environments

    Science.gov (United States)

    Kolesnikov, B. P.

    2017-11-01

    The presented work belongs to the problem of finding the effective kinetic properties of macroscopically disordered environments (MDE). These properties characterize the MDE as a whole on scales that significantly exceed the size of the macro-inhomogeneities. The structure of an MDE is considered as a complex of interpenetrating percolating and finite clusters formed from like components, whose topological characteristics influence the properties of the whole environment. The influence of the percolating cluster's fractal substructures (backbone, skeleton of the backbone, red bonds) on transfer processes during crossover (a structural transition from a fractal to a homogeneous state) is investigated, based on the proposed mathematical approach for finding the effective conductivity of MDEs and on the percolating cluster model. The nature of the change of the critical conductivity index t during crossover, from the value characteristic of the region close to the percolation threshold to the value corresponding to the homogeneous state, is demonstrated. The proposed model describes transfer processes in MDEs with a finite conductivity ratio between the "conductive" and "low-conductive" phases above and below the percolation threshold and in the smearing region (an analogue of the blur region of a second-order phase transition).
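
    For context only (these relations are not given in the record), the critical conductivity index t mentioned here is conventionally defined through the standard two-phase percolation scaling laws, with p the fraction of the conductive phase and σ₁ ≫ σ₂ the phase conductivities:

```latex
\sigma_{\mathrm{eff}} \propto \sigma_1\,(p - p_c)^{t} \quad (p > p_c), \qquad
\sigma_{\mathrm{eff}} \propto \sigma_2\,(p_c - p)^{-q} \quad (p < p_c), \qquad
\sigma_{\mathrm{eff}}(p_c) \propto \sigma_1\left(\sigma_2/\sigma_1\right)^{t/(t+q)} .
```

    The third relation describes the smearing region around the threshold referred to in the abstract; the crossover discussed in the record concerns how t changes as the cluster structure passes from fractal to homogeneous.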

  18. Mathematical Modeling of the Process for Microbial Production of Branched Chained Amino Acids

    Directory of Open Access Journals (Sweden)

    Todorov K.

    2009-12-01

    Full Text Available This article deals with the modelling of branched-chain amino acid production. One important branched-chain amino acid is L-valine. The aim of the article is the synthesis of a dynamic unstructured model of a fed-batch fermentation process with intensive droppings for L-valine production. The presented approach includes the following main procedures: description of the process by generalized stoichiometric equations; preliminary data processing and calculation of the specific rates for the main kinetic variables; identification of the specific rates taking into account the dissolved oxygen tension; establishment and optimisation of a dynamic model of the process; and simulation studies. MATLAB is used as the research environment.
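
    As an illustration of what a dynamic unstructured fed-batch model looks like, the sketch below integrates generic mass balances with Monod-type kinetics. All symbols, parameter values and the feed policy are assumptions for demonstration; the paper's actual L-valine kinetics, the dissolved-oxygen dependence and the "intensive droppings" terms are not reproduced here.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic unstructured fed-batch model: biomass X, substrate S, product P,
# volume V. All kinetic constants below are hypothetical placeholders.
mu_max, Ks, Yxs, Yps = 0.4, 2.0, 0.5, 0.3   # 1/h, g/L, g/g, g/g
F, S_in = 0.05, 200.0                       # feed rate [L/h], feed substrate [g/L]

def fed_batch(t, y):
    X, S, P, V = y
    S = max(S, 0.0)                 # keep the Monod term well defined
    mu = mu_max * S / (Ks + S)      # specific growth rate
    D = F / V                       # dilution caused by feeding
    dX = mu * X - D * X
    dS = -mu * X / Yxs + D * (S_in - S)
    dP = Yps * mu * X - D * P
    dV = F
    return [dX, dS, dP, dV]

sol = solve_ivp(fed_batch, (0.0, 40.0), [0.1, 10.0, 0.0, 1.0], max_step=0.1)
print("final X, S, P, V:", sol.y[:, -1])
```

    In the paper's workflow the specific rates would be identified from measured data (including dissolved oxygen tension) rather than fixed a priori, and the resulting model would then be optimised and validated by simulation in MATLAB.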

  19. Meteoroid Environment Modeling: the Meteoroid Engineering Model and Shower Forecasting

    Science.gov (United States)

    Moorhead, Althea V.

    2017-01-01

    The meteoroid environment is often divided conceptually into meteor showers plus a sporadic background component. The sporadic complex poses the bulk of the risk to spacecraft, but showers can produce significant short-term enhancements of the meteoroid flux. The Meteoroid Environment Office (MEO) has produced two environment models to handle these cases: the Meteoroid Engineering Model (MEM) and an annual meteor shower forecast. Both MEM and the forecast are used by multiple manned spaceflight projects in their meteoroid risk evaluation, and both tools are being revised to incorporate recent meteor velocity, density, and timing measurements. MEM describes the sporadic meteoroid complex and calculates the flux, speed, and directionality of the meteoroid environment relative to a user-supplied spacecraft trajectory, taking the spacecraft's motion into account. MEM is valid in the inner solar system and offers near-Earth and cis-lunar environments. While the current version of MEM offers a nominal meteoroid environment corresponding to a single meteoroid bulk density, the next version of MEM, MEM R3, will offer both flux uncertainties and a density distribution in addition to a revised near-Earth environment. We have updated the near-Earth meteor speed distribution and have made the first determination of uncertainty in this distribution. We have also derived a meteor density distribution from the work of Kikwaya et al. (2011). The annual meteor shower forecast takes the form of a report and data tables that can be used in conjunction with an existing MEM assessment. Fluxes are typically quoted to a constant limiting kinetic energy in order to comport with commonly used ballistic limit equations. For the 2017 annual forecast, the MEO substantially revised the list of showers and their characteristics using 14 years of meteor flux measurements from the Canadian Meteor Orbit Radar (CMOR). Defunct or insignificant showers were removed and the temporal profiles of many showers

  20. Modelling Virtual Environments for Geovisualization

    DEFF Research Database (Denmark)

    Bodum, Lars

    2005-01-01

    The use of virtual environments in geovisualization has become a major topic within the last few years. The main reason for this interest is the growing use of 3D models and visual realizations in a wide range of applications concerned with the geographic element of information. The implementation...... within the geographic domain. A categorization of the virtual environments is offered through which the differences between them are highlighted. It is possible to achieve this categorization in many ways from many perspectives since this is not and will not be research of a purely positive nature...

  1. On the scaling limits of Galton Watson processes in varying environment

    NARCIS (Netherlands)

    Bansaye, V.; Simatos, F.

    2011-01-01

    Renormalized sequences of Galton-Watson processes converge to Continuous State Branching Processes (CSBP), characterized by a Lévy triplet of two numbers and a measure. This paper investigates the case of Galton-Watson processes in varying environment and provides an explicit sufficient condition
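
    For background (standard notation, not taken from the record), the Lévy triplet (b, c, ν) of two numbers and a measure determines a CSBP through its branching mechanism, usually written as

```latex
\psi(\lambda) = b\,\lambda + c\,\lambda^{2}
  + \int_{0}^{\infty} \left( e^{-\lambda z} - 1 + \lambda z\, \mathbf{1}_{\{z \le 1\}} \right) \nu(\mathrm{d}z),
  \qquad \lambda \ge 0,
```

    with b a real number, c ≥ 0 and ν a measure on (0, ∞) satisfying ∫ (1 ∧ z²) ν(dz) < ∞.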

  2. A Novel Petri Nets-Based Modeling Method for the Interaction between the Sensor and the Geographic Environment in Emerging Sensor Networks

    Science.gov (United States)

    Zhang, Feng; Xu, Yuetong; Chou, Jarong

    2016-01-01

    The service of a sensor device in Emerging Sensor Networks (ESNs) is an extension of traditional Web services. Through the sensor network, the service of a sensor device can communicate directly with an entity in the geographic environment, and can even impact the geographic entity directly. The interaction between the sensor device in ESNs and the geographic environment is very complex, and modeling this interaction is a challenging problem. This paper proposed a novel Petri Nets-based modeling method for the interaction between the sensor device and the geographic environment. The sensor device service in ESNs is more easily affected by the geographic environment than a traditional Web service. Therefore, the response time, the fault-tolerant ability and the resource consumption become important factors in the performance of the whole sensor application system. Thus, this paper classified IoT services as Sensing services and Controlling services according to the interaction between the IoT service and the geographic entity, and classified GIS services as data services and processing services. Then, this paper designed and analyzed a service algebra and a Colored Petri Nets model to model the geo-feature, IoT service, GIS service and the interaction process between the sensor and the geographic environment. Finally, the modeling process is discussed through examples. PMID:27681730

  3. A Novel Petri Nets-Based Modeling Method for the Interaction between the Sensor and the Geographic Environment in Emerging Sensor Networks

    Directory of Open Access Journals (Sweden)

    Feng Zhang

    2016-09-01

    Full Text Available The service of a sensor device in Emerging Sensor Networks (ESNs) is an extension of traditional Web services. Through the sensor network, the service of a sensor device can communicate directly with an entity in the geographic environment, and can even impact the geographic entity directly. The interaction between the sensor device in ESNs and the geographic environment is very complex, and modeling this interaction is a challenging problem. This paper proposed a novel Petri Nets-based modeling method for the interaction between the sensor device and the geographic environment. The sensor device service in ESNs is more easily affected by the geographic environment than a traditional Web service. Therefore, the response time, the fault-tolerant ability and the resource consumption become important factors in the performance of the whole sensor application system. Thus, this paper classified IoT services as Sensing services and Controlling services according to the interaction between the IoT service and the geographic entity, and classified GIS services as data services and processing services. Then, this paper designed and analyzed a service algebra and a Colored Petri Nets model to model the geo-feature, IoT service, GIS service and the interaction process between the sensor and the geographic environment. Finally, the modeling process is discussed through examples.

  4. Educational Process Reengineering and Diffusion of Innovation in Formal Learning Environment

    DEFF Research Database (Denmark)

    Khalid, Md. Saifuddin; Hossain, Mohammad Shahadat; Rongbutsri, Nikorn

    2011-01-01

    administration, evaluation and assessment. Educational environments are flexible and not governed by standard operating procedures, making technology use lithe. The theory of 'diffusion of innovations' is recommended to be integrated in order to reason about and measure the acceptance or rejection of the technology selected for EPR......In technology-mediated learning, while the relative advantages of technologies are proven, a lack of contextualization, of process-centric change and of user-driven change has kept the intervention and adoption of educational technologies among individuals and organizations a challenge. Reviewing...... the formal, informal and non-formal learning environments, this study focuses on the formal part. This paper coins the term 'Educational Process Reengineering' (EPR) based on the established concept of 'Business Process Reengineering' (BPR) for process improvement of teaching and learning activities, academic...

  5. A consensus model for group decision making under interval type-2 fuzzy environment

    Institute of Scientific and Technical Information of China (English)

    Xiao-xiong ZHANG; Bing-feng GE; Yue-jin TAN

    2016-01-01

    We propose a new consensus model for group decision making (GDM) problems, using an interval type-2 fuzzy environment. In our model, experts are asked to express their preferences using linguistic terms characterized by interval type-2 fuzzy sets (IT2 FSs), because these can provide decision makers with greater freedom to express the vagueness in real-life situations. Consensus and proximity measures based on the arithmetic operations of IT2 FSs are used simultaneously to guide the decision-making process. The majority of previous studies have taken into account only the importance of the experts in the aggregation process, which may give unreasonable results. Thus, we propose a new feedback mechanism that generates different advice strategies for experts according to their levels of importance. In general, experts with a lower level of importance require a larger number of suggestions to change their initial preferences. Finally, we investigate a numerical example and run comparable models alongside ours to demonstrate the performance of the proposed model. The results indicate that the proposed model provides greater insight into the GDM process.

  6. Building, Using, Sharing and Reusing Environment Concept Models

    National Research Council Canada - National Science Library

    Chadbourne, Christopher; Clark, Douglas

    2006-01-01

    .... The Environment Concept Model (ECM) is an object-oriented documentation technique. The technique is tailored for system engineers who must deliver a consistent synthetic environment representation, on time and within budget...

  7. An extended car-following model considering the acceleration derivative in some typical traffic environments

    Science.gov (United States)

    Zhou, Tong; Chen, Dong; Liu, Weining

    2018-03-01

    Based on the full velocity difference and acceleration car-following model, an extended car-following model is proposed that takes the derivative of the vehicle's acceleration into account. The stability condition is given by applying control theory. Considering some typical traffic environments, the results of theoretical analysis and numerical simulation show that the extended model reproduces the acceleration of a string of vehicles more realistically than previous models during the starting process, the stopping process and sudden braking. Meanwhile, traffic jams occur more easily when the coefficient of the vehicle's acceleration derivative increases, as presented by the space-time evolution. The results confirm that the vehicle's acceleration derivative plays an important role in the traffic jamming transition and the evolution of traffic congestion.
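
    A minimal numerical sketch of an FVD-type car-following model extended with an acceleration-derivative (jerk) term is given below. The optimal-velocity function and every coefficient are assumptions chosen only to make the example run; the paper's exact formulation is not given in the record.

```python
import numpy as np

kappa, lam, gamma, dt, ring = 0.41, 0.5, 0.02, 0.1, 400.0  # assumed values

def V_opt(dx):
    # Bando-type optimal velocity function (assumed calibration)
    return 16.8 * (np.tanh(0.086 * (dx - 25.0)) + 0.913)

def step(x, v, a_prev):
    dx = (np.roll(x, -1) - x) % ring        # headways on a ring road
    dv = np.roll(v, -1) - v                 # velocity differences to leaders
    a = kappa * (V_opt(dx) - v) + lam * dv  # classic FVD acceleration
    a = a + gamma * (a - a_prev) / dt       # extra acceleration-derivative term
    x = (x + v * dt + 0.5 * a * dt**2) % ring
    v = v + a * dt
    return x, v, a

n = 10
x = np.arange(n) * (ring / n)               # equally spaced vehicles, at rest
v = np.zeros(n)
a = np.zeros(n)
for _ in range(2000):                       # simulate the starting process
    x, v, a = step(x, v, a)
print("mean speed after 200 s:", round(float(v.mean()), 2))
```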

  8. Creating an inclusive mall environment with the PRECEDE-PROCEED model: a living lab case study.

    Science.gov (United States)

    Ahmed, Sara; Swaine, Bonnie; Milot, Marc; Gaudet, Caroline; Poldma, Tiiu; Bartlett, Gillian; Mazer, Barbara; Le Dorze, Guylaine; Barbic, Skye; Rodriguez, Ana Maria; Lefebvre, Hélène; Archambault, Philippe; Kairy, Dahlia; Fung, Joyce; Labbé, Delphine; Lamontagne, Anouk; Kehayia, Eva

    2017-10-01

    Although public environments provide opportunities for participation and social inclusion, they are not always inclusive spaces and may not accommodate the wide diversity of people. The Rehabilitation Living Lab in the Mall is a unique, interdisciplinary, and multi-sectoral research project with an aim to transform a shopping complex in Montreal, Canada, into an inclusive environment optimizing the participation and social inclusion of all people. The PRECEDE-PROCEED Model (PPM), a community-oriented and participatory planning model, was applied as a framework. The PPM comprises nine steps divided between planning, implementation, and evaluation. The PPM is well suited as a framework for the development of an inclusive mall. Its ecological approach considers the environment, as well as the social and individual factors relating to mall users' needs and expectations. Transforming a mall to be more inclusive is a complex process involving many stakeholders. The PPM allows the synthesis of several sources of information, as well as the identification and prioritization of key issues to address. The PPM also helps to frame and drive the implementation and evaluate the components of the project. This knowledge can help others interested in using the PPM to create similar enabling and inclusive environments worldwide. Implications for rehabilitation: While public environments provide opportunities for participation and social inclusion, they are not always inclusive spaces and may not accommodate the wide diversity of people. The PRECEDE-PROCEED Model (PPM) is well suited as a framework for the development, implementation, and evaluation of an inclusive mall. Environmental barriers can negatively impact the rehabilitation process by impeding the restoration and augmentation of function. Removing barriers to social participation and independent living by improving inclusivity in the mall and other environments positively impacts the lives of people with disabilities.

  9. Fermentation process tracking through enhanced spectral calibration modeling.

    Science.gov (United States)

    Triadaphillou, Sophia; Martin, Elaine; Montague, Gary; Norden, Alison; Jeffkins, Paul; Stimpson, Sarah

    2007-06-15

    The FDA process analytical technology (PAT) initiative will materialize in a significant increase in the number of installations of spectroscopic instrumentation. However, to attain the greatest benefit from the data generated, there is a need for calibration procedures that extract the maximum information content. For example, in fermentation processes, the interpretation of the resulting spectra is challenging as a consequence of the large number of wavelengths recorded, the underlying correlation structure that is evident between the wavelengths and the impact of the measurement environment. Approaches to the development of calibration models have been based on the application of partial least squares (PLS) either to the full spectral signature or to a subset of wavelengths. This paper presents a new approach to calibration modeling that incorporates a wavelength selection procedure, spectral window selection (SWS), in which windows of wavelengths are automatically selected and subsequently used as the basis of the calibration model. However, due to the non-uniqueness of the windows selected when the algorithm is executed repeatedly, multiple models are constructed and these are then combined using stacking, thereby increasing the robustness of the final calibration model. The methodology is applied to data generated during the monitoring of broth concentrations in an industrial fermentation process from on-line near-infrared (NIR) and mid-infrared (MIR) spectrometers. It is shown that the proposed calibration modeling procedure outperforms traditional calibration procedures, as well as enabling the identification of the critical regions of the spectra with regard to the fermentation process.
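
    The idea of window selection plus stacked PLS models can be sketched as follows. This is a simplified stand-in, not the authors' SWS algorithm: candidate windows are scored by cross-validated error, the best window is refit, and several such models are combined by averaging their predictions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

def fit_window_models(X, y, width=50, n_models=5, n_candidates=30, seed=0):
    """Build several PLS models, each on its own automatically chosen window."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_models):
        best = None
        for _ in range(n_candidates):
            start = int(rng.integers(0, X.shape[1] - width))
            cols = slice(start, start + width)
            score = cross_val_score(PLSRegression(n_components=5), X[:, cols], y,
                                    cv=5, scoring="neg_mean_squared_error").mean()
            if best is None or score > best[0]:
                best = (score, cols)
        models.append((best[1], PLSRegression(n_components=5).fit(X[:, best[1]], y)))
    return models

def predict_stacked(models, X):
    # "Stacking" reduced here to simple averaging of the member predictions
    return np.mean([m.predict(X[:, cols]) for cols, m in models], axis=0)

# Synthetic demonstration: 60 spectra of 400 wavelengths, analyte in one window
X = np.random.default_rng(1).normal(size=(60, 400))
y = X[:, 120:140].sum(axis=1)
models = fit_window_models(X, y)
print(predict_stacked(models, X[:3]).ravel())
```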

  10. Strategic management in urban environment using SWOT and QSPM model

    Directory of Open Access Journals (Sweden)

    M. Pazouki

    2017-04-01

    Full Text Available Sustainable urban development is a new concept of fundamental environmental metropolitan management that not only creates the demand for changing the concepts of economic development, but also affects social development. The current study provides a conceptual model of a sustainable environment pattern in District 22 of Tehran that depends on the relationship between environment and economy and on a network of urban functions, which includes transport infrastructure, community centres and the economic and regional level, in support of the ecological services in Tehran. This landscape has often been in conflict with the development of the city, with discrepancies between the layers and the creation of ecologically fragile areas. The main objective of the study was to determine sustainability indicators and create a future development model for District 22 of Tehran. The data were collected through a review of similar studies and field research on the subject, and the effective factors were thereby identified. A questionnaire was then prepared, and the results were used to grade the SWOT charts after analysis of the internal and external factor matrices. Ultimately, a quantitative strategic planning matrix (QSPM) was applied based on the results and analysis. This process provided a comprehensive model for sustainable urban development as a sustainable urban landscape pattern.

  11. Relevance of a Healthy Change Process and Psychosocial Work Environment Factors in Predicting Stress, Health Complaints, and Commitment Among Employees in a Ghanaian Bank

    OpenAIRE

    Quaye, Emmanuel

    2010-01-01

    This thesis was intended to examine the effect of the healthiness of the change process and of psychosocial work environment factors in predicting job stress, health complaints and commitment among employees in a Ghanaian bank (N=132) undergoing organizational change. The change process was measured in terms of dimensions from the Healthy Change Process Index (HCPI), and the psychosocial work environment was measured by the Demands-Control-Support (DCS) model. Hierarchical regression analyses reveal...

  12. Process Model for Friction Stir Welding

    Science.gov (United States)

    Adams, Glynn

    1996-01-01

    forging effect of the shoulder. The energy balance at the boundary of the plastic region with the environment required that energy flow away from the boundary in both radial directions. One resolution to this problem may be to introduce a time dependency into the process model, allowing the energy flow to oscillate across this boundary. Finally, experimental measurements are needed to verify the concepts used here and to aid in improving the model.

  13. A Data Stream Model For Runoff Simulation In A Changing Environment

    Science.gov (United States)

    Yang, Q.; Shao, J.; Zhang, H.; Wang, G.

    2017-12-01

    Runoff simulation is of great significance for water engineering design, water disaster control, and water resources planning and management in a catchment or region. A large number of methods, including concept-based process-driven models and statistic-based data-driven models, have been proposed and widely used worldwide during the past decades. Most existing models assume that the relationship between runoff and its impacting factors is stationary. However, in a changing environment (e.g., climate change, human disturbance), their relationship usually evolves over time. In this study, we propose a data stream model for runoff simulation in a changing environment. Specifically, the proposed model works in three steps: learning a rule set, expanding a rule, and simulation. The first step is to initialize a rule set. When a new observation arrives, the model checks which rule covers it and then uses that rule for simulation. Meanwhile, the Page-Hinckley (PH) change detection test is used to monitor the online simulation error of each rule. If a change is detected, the corresponding rule is removed from the rule set. In the second step, each rule that covers more than a given number of instances is expanded. In the third step, a simulation model for each leaf node is learnt with a perceptron without an activation function and is updated as each new observation arrives. Taking the Fuxi River catchment as a case study, we applied the model to simulate the monthly runoff in the catchment. Results show that an abrupt change is detected in the year 1997 by the Page-Hinckley change detection test, which is consistent with the historical record of flooding. In addition, the model achieves good simulation results with an RMSE of 13.326 and outperforms many established methods. The findings demonstrate that the proposed data stream model provides a promising way to simulate runoff in a changing environment.
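
    The Page-Hinckley test used to decide when a rule has become outdated can be implemented in a few lines; the delta and threshold values below are illustrative, not those of the study.

```python
class PageHinckley:
    """Minimal Page-Hinckley change detector over a stream of simulation errors."""
    def __init__(self, delta=0.005, threshold=50.0):
        self.delta, self.threshold = delta, threshold
        self.n, self.mean, self.cum, self.cum_min = 0, 0.0, 0.0, 0.0

    def update(self, error):
        # Accumulate the deviation of each error from the running mean;
        # an alarm is raised when the accumulation rises far above its minimum.
        self.n += 1
        self.mean += (error - self.mean) / self.n
        self.cum += error - self.mean - self.delta
        self.cum_min = min(self.cum_min, self.cum)
        return (self.cum - self.cum_min) > self.threshold

ph = PageHinckley()
errors = [1.0] * 100 + [4.0] * 50            # synthetic error stream with a shift
for t, err in enumerate(errors):
    if ph.update(err):
        print("change detected at step", t)  # the covering rule would be dropped
        break
```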

  14. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  15. Canadian coastal environments, shoreline processes, and oil spill cleanup

    International Nuclear Information System (INIS)

    Owens, E.H.

    1994-03-01

    The coastal zone is a dynamic environment, so that in developing practical and effective oil spill response strategies it is necessary to understand the forces that contribute to shore-zone processes. The coasts of Canada encompass a wide range of environments and are characterized by a variety of shoreline types that include the exposed, resistant cliffs of eastern Newfoundland and the sheltered marshes of the Beaufort Sea. A report is presented to provide an understanding of the dynamics and physical processes as they vary on the different coasts of Canada, including the Great Lakes. An outline of the general character and processes on a regional basis describes the coastal environments and introduces the literature that can be consulted for more specific information. The likely fate and persistence of oil that reaches the shoreline is discussed to provide the framework for development of spill response strategies and for the selection of appropriate shoreline cleanup or treatment countermeasures. Lessons learned from recent experience with major oil spills and field experiments are integrated into the discussion. Separate abstracts have been prepared for each of the four sections of this report. 502 refs., 5 figs

  16. Family Environment and Cognitive Development: Twelve Analytic Models

    Science.gov (United States)

    Walberg, Herbert J.; Marjoribanks, Kevin

    1976-01-01

    The review indicates that refined measures of the family environment and the use of complex statistical models increase the understanding of the relationships between socioeconomic status, sibling variables, family environment, and cognitive development. (RC)

  17. Visualizing the process of process modeling with PPMCharts

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the quest for knowledge about how to make good process models, recent research focus is shifting from studying the quality of process models to studying the process of process modeling (often abbreviated as PPM) itself. This paper reports on our efforts to visualize this specific process in such

  18. Modelling human behaviours and reactions under dangerous environment

    OpenAIRE

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

    This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers in real-time in virtual environments. The development of the system includes: classification of the conscious/subconscious behaviors and reactions...

  19. A validated agent-based model to study the spatial and temporal heterogeneities of malaria incidence in the rainforest environment.

    Science.gov (United States)

    Pizzitutti, Francesco; Pan, William; Barbieri, Alisson; Miranda, J Jaime; Feingold, Beth; Guedes, Gilvan R; Alarcon-Valenzuela, Javiera; Mena, Carlos F

    2015-12-22

    The Amazon environment has been exposed in the last decades to radical changes that have been accompanied by a remarkable rise of both Plasmodium falciparum and Plasmodium vivax malaria. The malaria transmission process is highly influenced by factors such as spatial and temporal heterogeneities of the environment and individual-based characteristics of mosquito and human populations. All these determinant factors can be simulated effectively through agent-based models. This paper presents a validated agent-based model of local-scale malaria transmission. The model reproduces the environment of a typical riverine village in the northern Peruvian Amazon, where malaria transmission is highly seasonal and apparently associated with flooding of large areas caused by the neighbouring river. Agents representing humans, mosquitoes and the two species of Plasmodium (P. falciparum and P. vivax) are simulated in a spatially explicit representation of the environment around the village. The model environment includes climate, the positions of people's houses, and elevation. A representation of changes in the extent of mosquito breeding areas caused by the river flooding is also included in the simulation environment. A calibration process was carried out to reproduce the variations of the monthly malaria incidence over a period of 3 years. The calibrated model is also able to reproduce the spatial heterogeneities of local-scale malaria transmission. A "what if" eradication strategy scenario is proposed: if the mosquito breeding sites are eliminated through mosquito larva habitat management in a buffer area extending at least 200 m around the village, malaria transmission is eradicated from the village. Agent-based models can effectively reproduce the spatiotemporal variations of malaria transmission in a low-endemicity environment dominated by river flooding, as in the Amazon.

  20. Development of radiation processes for better environment

    International Nuclear Information System (INIS)

    Majali, A.B.; Sabharwal, S.; Deshpande, R.S.; Sarma, K.S.S.; Bhardwaj, Y.K.; Dhanawade, B.R.

    1998-01-01

    The increasing population and industrialization worldwide are placing escalating demands on the development of newer technologies that are environmentally friendly and minimize the pollution associated with development. Radiation technology can be of benefit in reducing the pollution levels associated with many processes. The sulphur vulcanization method for natural rubber latex results in the formation of considerable amounts of nitrosamines, both in the product as well as in the factory environment. Radiation vulcanization of natural rubber latex has emerged as a commercially viable alternative to produce sulphur- and nitrosamine-free rubber. A Co-60 γ-radiation based pilot plant has been functioning since April 1993 to produce radiation-vulcanized natural rubber latex (RVNRL) using acrylate monomers as sensitizer. The role of the sensitizer, viz. n-butyl acrylate, in the vulcanization process has been elucidated using the pulse radiolysis technique. Emissions of toxic sulphur-containing gases form an inevitable part of the viscose-rayon process, and this industry is in search of ways to reduce the associated pollution levels. The irradiation of cellulose results in cellulose activation and a reduction in the degree of polymerization (DP). These effects can reduce the solvents required to dissolve the paper pulp. There is keen interest in utilizing radiation technology in viscose rayon production. We have utilized an electron beam (EB) accelerator for reducing the DP of paper pulp. Laboratory-scale tests have been carried out to standardize the conditions for production of pulp having the desired DP by EB irradiation. The use of irradiated paper pulp can result in a ∼40% reduction in the consumption of CS₂ in the process, which can be beneficial in reducing the associated pollution. PTFE waste can be recycled into a low-molecular-weight microfine powder by irradiation. An EB-based process has been standardized to produce

  1. Treatment of dynamical processes in two-dimensional models of the troposphere and stratosphere

    International Nuclear Information System (INIS)

    Wuebbles, D.J.

    1980-07-01

    The physical structure of the troposphere and stratosphere is the result of an intricate interplay among a large number of radiative, chemical, and dynamical processes. Because it is not possible to model the global environment in the laboratory, theoretical models must be relied on, subject to observational verification, to simulate atmospheric processes. Of particular concern in recent years has been the modeling of those processes affecting the structure of ozone and other trace species in the stratosphere and troposphere. Zonally averaged two-dimensional models with spatial resolution in the vertical and meridional directions can provide a much more realistic representation of tracer transport than one-dimensional models, yet are capable of the detailed representation of chemical and radiative processes contained in the one-dimensional models. The purpose of this study is to describe and analyze existing approaches to representing global atmospheric transport processes in two-dimensional models and to discuss possible alternatives to these approaches. A general description of the processes controlling the transport of trace constituents in the troposphere and stratosphere is given

  2. Environment Modeling Using Runtime Values for JPF-Android

    Science.gov (United States)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries to add functionality to the application and drivers to fire the application execution. For testing and verification, the environment of an application is abstracted and simplified using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry-points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.

  3. Multi-model-based interactive authoring environment for creating shareable medical knowledge.

    Science.gov (United States)

    Ali, Taqdir; Hussain, Maqbool; Ali Khan, Wajahat; Afzal, Muhammad; Hussain, Jamil; Ali, Rahman; Hassan, Waseem; Jamshed, Arif; Kang, Byeong Ho; Lee, Sungyoung

    2017-10-01

    criteria, which we assessed through the system- and user-centric evaluation processes. For system-centric evaluation, we compared the implementation of clinical information modelling system requirements in our proposed system and in existing systems. The results suggested that 82.05% of the requirements were fully supported, 7.69% were partially supported, and 10.25% were not supported by our system. In the existing systems, 35.89% of requirements were fully supported, 28.20% were partially supported, and 35.89% were not supported. For user-centric evaluation, the assessment criterion was 'ease of use'. Our proposed system showed 15 times better results with respect to MLM creation time than the existing systems. Moreover, on average, the participants made only one error in MLM creation using our proposed system, but 13 errors per MLM using the existing systems. We provide a user-friendly authoring environment for creation of shareable and interoperable knowledge for CDSS to overcome knowledge acquisition complexity. The authoring environment uses state-of-the-art decision support-related clinical standards with increased ease of use. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Random regression models for detection of gene by environment interaction

    Directory of Open Access Journals (Sweden)

    Meuwissen Theo HE

    2007-02-01

    Full Text Available Two random regression models, in which the effect of a putative QTL was regressed on an environmental gradient, are described. The first model estimates the correlation between the intercept and slope of the random regression, while the other model restricts this correlation to 1 or -1, which is expected under a bi-allelic QTL model. The random regression models were compared to a model assuming no gene-by-environment interactions. The comparison was done with regard to the models' ability to detect QTL, to position them accurately, and to detect possible QTL-by-environment interactions. A simulation study based on a granddaughter design was conducted, and QTL effects were assigned either independently of the environment or as a linear function of a simulated environmental gradient. It was concluded that the random regression models were suitable for detection of QTL effects, in the presence and absence of interactions with environmental gradients. Fixing the correlation between intercept and slope of the random regression had a positive effect on power when the QTL effects re-ranked between environments.
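
    Read schematically (the exact parameterization is not given in the record), the QTL effect in such a random regression model can be written as

```latex
y_{ij} = \mu + \left( a_i + b_i\, x_j \right) + e_{ij},
```

    where x_j is the environmental gradient, a_i and b_i are the intercept and slope of the QTL effect for family i, and e_ij is a residual; the first model estimates the correlation between a and b freely, while the second restricts it to ±1, as expected for a bi-allelic QTL.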

  5. COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS

    International Nuclear Information System (INIS)

    Foster, C.

    2001-01-01

    The development of facilities to deal with the disposition of nuclear materials at an acceptable level of Occupational Radiation Exposure (ORE) is a significant issue facing the nuclear community. One solution is to minimize the worker's exposure through the use of automated systems. However, the adoption of automated systems for these tasks is hampered by the challenging requirements that these systems must meet in order to be cost-effective solutions in the hazardous nuclear materials processing environment. Retrofitting current glove box technologies with automation systems represents potential near-term technology that can be applied to reduce worker ORE associated with work in nuclear materials processing facilities. Successful deployment of automation systems for these applications requires the development of testing and deployment strategies to ensure the highest level of safety and effectiveness. Historically, safety tests are conducted with glove box mock-ups around the finished design. This late detection of problems leads to expensive redesigns and costly deployment delays. With the widespread availability of computers and cost-effective simulation software it is possible to discover and fix problems early in the design stages. Computer simulators can easily create a complete model of the system allowing a safe medium for testing potential failures and design shortcomings. The majority of design specification is now done on computer and moving that information to a model is relatively straightforward. With a complete model and results from a Failure Mode Effect Analysis (FMEA), redesigns can be worked early. Additional issues such as user accessibility, component replacement, and alignment problems can be tackled early in the virtual environment provided by computer simulation. In this case, a commercial simulation package is used to simulate a lathe process operation at the Los Alamos National Laboratory (LANL). The lathe process operation is indicative of

  6. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    Science.gov (United States)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  7. CONSTRUCTIVE MODEL OF ADAPTATION OF DATA STRUCTURES IN RAM. PART II. CONSTRUCTORS OF SCENARIOS AND ADAPTATION PROCESSES

    Directory of Open Access Journals (Sweden)

    V. I. Shynkarenko

    2016-04-01

    Full Text Available Purpose. The second part of the paper completes the presentation of the constructive-productive structures (CPS), which model the adaptation of data structures in memory (RAM). The purpose of this second part of the research is to develop a model of the process of adapting data in RAM when operating under different hardware and software environments and data processing scenarios. Methodology. The methodology of mathematical and algorithmic constructionism was applied. In this part of the paper, the constructors of scenarios and of adaptation processes were developed on the basis of a generalized CPS through its transformational conversions. Constructors are interpreted, specialized CPS. The terminal alphabets were highlighted for the scenario constructor, in the form of data processing algorithms, and for the adaptation constructor, in the form of algorithmic components of the adaptation process. The methodology involves the development of substitution rules that determine the derivation process of the relevant structures. Findings. In the second part of the paper, the system is represented by CPS modeling the adaptation of data placement in RAM, namely the constructors of scenarios and of adaptation processes. The result of executing the scenario constructor is a set of data processing operations in the form of C# source text; the result of the adaptation-process constructor is an adaptation process; and the result of the adaptation process is the adapted binary code for processing the data structures. Originality. For the first time, a constructive model of data processing is proposed: a scenario that takes into account the order and number of accesses to the various elements of data structures, together with the adaptation of data structures to different hardware and software environments. At the same time, both the placement of data in RAM and the processing algorithms are adapted. The application of constructionism in modeling makes it possible to link data models and algorithms for

  8. A Big Data-driven Model for the Optimization of Healthcare Processes.

    Science.gov (United States)

    Koufi, Vassiliki; Malamateniou, Flora; Vassilacopoulos, George

    2015-01-01

    Healthcare organizations increasingly navigate a highly volatile, complex environment in which technological advancements and new healthcare delivery business models are the only constants. In their effort to outperform in this environment, healthcare organizations need to be agile enough to respond to these constantly changing conditions. To act with agility, healthcare organizations need to discover new ways to optimize their operations. To this end, they focus on healthcare processes that guide healthcare delivery and on the technologies that support them. Business process management (BPM) and Service-Oriented Architecture (SOA) can provide a flexible, dynamic, cloud-ready infrastructure where business process analytics can be utilized to extract useful insights from mountains of raw data and put them to work in ways beyond the abilities of human brains or of IT systems from just a year ago. This paper presents a framework which helps healthcare professionals gain better insight within and across their business processes. In particular, it performs real-time analysis on process-related data in order to reveal areas of potential process improvement.

  9. Representing environment-induced helix-coil transitions in a coarse grained peptide model

    Science.gov (United States)

    Dalgicdir, Cahit; Globisch, Christoph; Sayar, Mehmet; Peter, Christine

    2016-10-01

    Coarse grained (CG) models are widely used in studying peptide self-assembly and nanostructure formation. One of the recurrent challenges in CG modeling is the problem of limited transferability, for example to different thermodynamic state points and system compositions. Understanding transferability is generally a prerequisite to knowing for which problems a model can be reliably used and predictive. For peptides, one crucial transferability question is whether a model reproduces the molecule's conformational response to a change in its molecular environment. This is of particular importance since CG peptide models often have to resort to auxiliary interactions that aid secondary structure formation. Such interactions take care of properties of the real system that are per se lost in the coarse graining process such as dihedral-angle correlations along the backbone or backbone hydrogen bonding. These auxiliary interactions may then easily overstabilize certain conformational propensities and therefore destroy the ability of the model to respond to stimuli and environment changes, i.e. they impede transferability. In the present paper we have investigated a short peptide with amphiphilic EALA repeats which undergoes conformational transitions between a disordered and a helical state upon a change in pH value or due to the presence of a soft apolar/polar interface. We designed a base CG peptide model that does not carry a specific (backbone) bias towards a secondary structure. This base model was combined with two typical approaches of ensuring secondary structure formation, namely a Cα-Cα-Cα-Cα pseudodihedral angle potential or a virtual site interaction that mimics hydrogen bonding. We have investigated the ability of the two resulting CG models to represent the environment-induced conformational changes in the helix-coil equilibrium of EALA. We show that with both approaches a CG peptide model can be obtained that is environment-transferable and that

  10. Extending MBI Model using ITIL and COBIT Processes

    Directory of Open Access Journals (Sweden)

    Sona Karkoskova

    2015-10-01

    Full Text Available Most organizations today operate in a highly complex and competitive business environment and need to be able to react to rapidly changing market conditions. IT management frameworks are widely used to provide effective support for business objectives through aligning IT with business and optimizing the use of IT resources. In this paper we analyze three IT management frameworks (ITIL, COBIT and MBI) with the objective of identifying the relationships between these frameworks and mapping ITIL and COBIT processes to MBI tasks. As a result of this analysis we propose extensions to the MBI model to incorporate IT Performance Management and a Capability Maturity Model.

  11. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM) MODELS

    International Nuclear Information System (INIS)

    Y.S. Wu

    2005-01-01

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on water and gas

  12. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM)MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Y.S. Wu

    2005-08-24

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on

  13. Continuous state branching processes in random environment: The Brownian case

    OpenAIRE

    Palau, Sandra; Pardo, Juan Carlos

    2015-01-01

    We consider continuous state branching processes that are perturbed by a Brownian motion. These processes are constructed as the unique strong solution of a stochastic differential equation. The long-term extinction and explosion behaviours are studied. In the stable case, the extinction and explosion probabilities are given explicitly. We find three regimes for the asymptotic behaviour of the explosion probability and, as in the case of branching processes in random environment, we find five...

  14. Business process model repositories : efficient process retrieval

    NARCIS (Netherlands)

    Yan, Z.

    2012-01-01

    As organizations increasingly work in a process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a

  15. MODELING OF WATER DISTRIBUTION SYSTEM PARAMETERS AND THEIR PARTICULAR IMPORTANCE IN ENVIRONMENT ENGINEERING PROCESSES

    Directory of Open Access Journals (Sweden)

    Agnieszka Trębicka

    2016-05-01

    Full Text Available The object of this study is to present a mathematical model of a water-supply network and an analysis of the basic parameters of the water distribution system with a digital model. The reference area is the village of Kleosin, in the municipality of Juchnowiec Kościelny in Podlaskie Province, located at the border with Białystok. The study focused on the significance of every change related to the quality and quantity of water delivered to the WDS, through modeling the basic parameters of the water distribution system under different operating variants in order to specify new, more rational ways of operation (a decrease in pressure value) and to define conditions for the development and modernization of the water-supply network, with special analysis of the network scheme in order to identify the most vulnerable places in the network. The analyzed processes are based on reproducing and developing the existing state of the water distribution sub-system (the WDS) with the use of mathematical modeling that includes the newest available computer techniques.

  16. Modelling of Processes of Logistics in Cyberspace Security

    Directory of Open Access Journals (Sweden)

    Konečný Jiří

    2017-01-01

    Full Text Available The goal of this contribution is to familiarize experts in various fields with the need for a new approach to system-defined models and to the modelling of processes in engineering practice, and to show how selected state variables can be used to model real-world systems given the highly dynamic development of logistics structures and system behaviour. The contribution therefore argues for making full use of cybernetics as a field for the management and communication of information, and of its environment, cyberspace, as the realm in which the steady state between cyber-attacks and cyber-defence is determined, a modern knowledge-based potential in general and of logistics in cyber security in particular. Connected with this is the important area of lifelong training of experts in the dynamic world of science and technology (and thus also in the social system), which is addressed here briefly, as is cyber and information security, all of which falls under the cyberspace of new-perspective electronic learning (e-learning) using modern laboratories, with implications for future process modelling of artificial intelligence (AI) and the prospective mass use of UAVs in logistics.

  17. A Neural Network Model to Learn Multiple Tasks under Dynamic Environments

    Science.gov (United States)

    Tsumori, Kenji; Ozawa, Seiichi

    When environments change dynamically, the knowledge an agent has acquired in one environment might be useless in the future. In such dynamic environments, agents should be able not only to acquire new knowledge but also to modify old knowledge during learning. However, modifying all previously acquired knowledge is not efficient, because knowledge once acquired may be useful again when a similar environment reappears, and some knowledge can be shared among different environments. To learn efficiently in such environments, we propose a neural network model that consists of the following modules: a resource allocating network, long-term and short-term memory, and an environment change detector. We evaluate the model under a class of dynamic environments where multiple function approximation tasks are given sequentially. The experimental results demonstrate that the proposed model possesses stable incremental learning, accurate environmental change detection, proper association and recall of old knowledge, and efficient knowledge transfer.

  18. Modeling Aspects of Activated Sludge Processes Part l l: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    Energy Technology Data Exchange (ETDEWEB)

    AbdElHaleem, H S [Cairo Univ.-CivlI Eng. Dept., Giza (Egypt); EI-Ahwany, A H [CairoUlmrsity- Faculty ofEngincering - Chemical Engineering Department, Giza (Egypt); Ibrahim, H I [Helwan University- Faculty of Engineering - Biomedical Engineering Department, Helwan (Egypt); Ibrahim, G [Menofia University- Faculty of Engineering Sbebin EI Kom- Basic Eng. Sc. Dept., Menofia (Egypt)

    2004-07-01

    Mathematical process modeling and biokinetics of the activated sludge process were reviewed, considering different types of models. The task group models ASM1, ASM2, and ASM3, developed by Henze et al., were evaluated considering the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase; in this type of model, internal mass transfer inside the flocs is neglected, and hence the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model, which considers mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.

  19. Modeling Aspects of Activated Sludge Processes Part l l: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    International Nuclear Information System (INIS)

    AbdElHaleem, H.S.; EI-Ahwany, A. H.; Ibrahim, H.I.; Ibrahim, G.

    2004-01-01

    Mathematical process modeling and biokinetics of the activated sludge process were reviewed, considering different types of models. The task group models ASM1, ASM2, and ASM3, developed by Henze et al., were evaluated considering the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase; in this type of model, internal mass transfer inside the flocs is neglected, and hence the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model, which considers mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.

  20. Upgrading Preschool Environment in a Swedish Municipality: Evaluation of an Implementation Process.

    Science.gov (United States)

    Altin, Carolina; Kvist Lindholm, Sofia; Wejdmark, Mats; Lättman-Masch, Robert; Boldemann, Cecilia

    2015-07-01

    Redesigning outdoor preschool environments may favorably affect multiple factors relevant to health and reach many children. Cross-sectional studies in various landscapes at different latitudes have explored the characteristics of preschool outdoor environments, considering the play potential that triggers combined physical activity and sun-protective behavior due to space, vegetation, and topography. Criteria were pinpointed for upgrading preschool outdoor environments for multiple health outcomes, to be applied by local governments in charge of public preschools. Purposeful land use policies and administrative management of outdoor land use may serve to monitor the quality of preschool outdoor environments (upgrading and planning). This study evaluates the process of implementing routines for upgrading outdoor preschool environments in a medium-sized municipality, Sweden, 2008-2011, using qualitative and quantitative analysis. Recorded written material (logs and protocols) related to the project was processed using thematic analysis. Quantitative data (m² of flat/multilevel surface, overgrown/bare surface, and fraction of free visible sky) were analyzed to assess the impact of implementation (surface, topography, greenery integrated in play). The preschool outdoor environments were upgraded accordingly. The quality of implementation was assessed using the policy streams approach. Though the long-term impact remains to be confirmed, the process seems to have changed work routines in the internal management toward purposeful upgrading of preschool outdoor environments. The aptitude and applicability of inexpensive methods for assessing, selecting, and upgrading preschool land at various latitudes, climates, and outdoor play policies (including gender aspects and staff policies) should be further discussed, as well as the compilation of data for monitoring and evaluation. © 2015 Society for Public Health Education.

  1. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of the necessary expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  2. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    ...the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. The ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  3. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety of application areas, extending to biotechnology, food, polymer and human health applications. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  4. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    Science.gov (United States)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and entire factories and enterprises in one seamless simulation environment.

  5. Regulatory Models and the Environment: Practice, Pitfalls, and Prospects

    Energy Technology Data Exchange (ETDEWEB)

    Holmes, K. John; Graham, Judith A.; McKone, Thomas; Whipple, Chris

    2008-06-01

    Computational models support environmental regulatory activities by providing the regulator an ability to evaluate available knowledge, assess alternative regulations, and provide a framework to assess compliance. But all models face inherent uncertainties, because human and natural systems are always more complex and heterogeneous than can be captured in a model. Here we provide a summary discussion of the activities, findings, and recommendations of the National Research Council's Committee on Regulatory Environmental Models, a committee funded by the US Environmental Protection Agency to provide guidance on the use of computational models in the regulatory process. Modeling is a difficult enterprise even outside of the potentially adversarial regulatory environment. The demands grow when the regulatory requirements for accountability, transparency, public accessibility, and technical rigor are added to the challenges. Moreover, models cannot be validated (declared true) but instead should be evaluated with regard to their suitability as tools to address a specific question. The committee concluded that these characteristics make evaluation of a regulatory model more complex than simply comparing measurement data with model results. Evaluation also must balance the need for a model to be accurate with the need for a model to be reproducible, transparent, and useful for the regulatory decision at hand. Meeting these needs requires model evaluation to be applied over the "life cycle" of a regulatory model with an approach that includes different forms of peer review, uncertainty analysis, and extrapolation methods than for non-regulatory models.

  6. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: ''Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model''. One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  7. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    International Nuclear Information System (INIS)

    E.L. Hardin

    2000-01-01

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: ''Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model''. One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II)

  8. Programmatic access to logical models in the Cell Collective modeling environment via a REST API.

    Science.gov (United States)

    Kowal, Bryan M; Schreier, Travis R; Dauer, Joseph T; Helikar, Tomáš

    2016-01-01

    Cell Collective (www.cellcollective.org) is a web-based interactive environment for constructing, simulating and analyzing logical models of biological systems. Herein, we present a Web service to access models, annotations, and simulation data in the Cell Collective platform through the Representational State Transfer (REST) Application Programming Interface (API). The REST API provides a convenient method for obtaining Cell Collective data through almost any programming language. To ensure easy processing of the retrieved data, the request output from the API is available in a standard JSON format. The Cell Collective REST API is freely available at http://thecellcollective.org/tccapi. All public models in Cell Collective are available through the REST API. Users interested in creating and accessing their own models through the REST API first need to create an account in Cell Collective (http://thecellcollective.org). Contact: thelikar2@unl.edu. Technical user documentation: https://goo.gl/U52GWo. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
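
    The record above describes a REST interface whose responses arrive as standard JSON, which makes retrieval straightforward from nearly any language. The sketch below shows the generic retrieval pattern in Python; the `models` endpoint path and the `name` field are illustrative assumptions, not the documented Cell Collective schema.

```python
import requests

# Base URL of the Cell Collective REST API (given in the record above);
# the endpoint path and response fields used below are illustrative
# assumptions, not the documented schema.
API_BASE = "http://thecellcollective.org/tccapi"

def fetch_json(path, params=None):
    """GET a REST resource and decode the standard JSON response."""
    response = requests.get(f"{API_BASE}/{path}", params=params, timeout=30)
    response.raise_for_status()          # fail loudly on HTTP errors
    return response.json()               # API output is standard JSON per the record

if __name__ == "__main__":
    # Hypothetical call: list public models, then print their names.
    models = fetch_json("models")        # endpoint name is an assumption
    for model in models:
        print(model.get("name", "<unnamed>"))
```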

  9. Kinetic modeling of microbially-driven redox chemistry of radionuclides in subsurface environments: Coupling transport, microbial metabolism and geochemistry

    International Nuclear Information System (INIS)

    Wang, Yifeng; Papenguth, Hans W.

    2000-01-01

    Microbial degradation of organic matter is a driving force in many subsurface geochemical systems, and therefore may have significant impacts on the fate of radionuclides released into subsurface environments. In this paper, the authors present a general reaction-transport model for microbial metabolism, redox chemistry, and radionuclide migration in subsurface systems. The model explicitly accounts for biomass accumulation and the coupling of radionuclide redox reactions with major biogeochemical processes. Based on the consideration that the biomass accumulation in subsurface environments is likely to achieve a quasi-steady state, they have accordingly modified the traditional microbial growth kinetic equation. They justified the use of the biogeochemical models without the explicit representation of biomass accumulation, if the interest of modeling is in the net impact of microbial reactions on geochemical processes. They then applied their model to a scenario in which an oxic water flow containing both uranium and complexing organic ligands is recharged into an oxic aquifer in a carbonate formation. The model simulation shows that uranium can be reduced and therefore immobilized in the anoxic zone created by microbial degradation
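
    To make the kind of kinetics described above concrete, the following minimal sketch integrates a Monod-type substrate/oxidant consumption rate with the biomass held at an assumed quasi-steady value. The rate law, stoichiometry, and parameter values are illustrative placeholders, not the authors' actual reaction-transport formulation (which also couples these kinetics to transport).

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative Monod-type degradation of an organic substrate S coupled to
# consumption of an oxidant O, with biomass held at a quasi-steady value
# B_qss as the record suggests. All parameter values are placeholders.
mu_max, K_s, K_o, Y, B_qss = 1.0, 0.1, 0.05, 0.4, 1e-3   # assumed units

def rates(t, y):
    S, O = y                                   # substrate and oxidant concentrations
    r = (mu_max * B_qss / Y) * S / (K_s + S) * O / (K_o + O)
    return [-r, -r]                            # 1:1 stoichiometry assumed for brevity

sol = solve_ivp(rates, (0.0, 50.0), [1.0, 0.25], dense_output=True)
print(sol.y[:, -1])                            # remaining substrate and oxidant
```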

  10. Kinetic modeling of microbially-driven redox chemistry of radionuclides in subsurface environments: Coupling transport, microbial metabolism and geochemistry

    Energy Technology Data Exchange (ETDEWEB)

    WANG,YIFENG; PAPENGUTH,HANS W.

    2000-05-04

    Microbial degradation of organic matter is a driving force in many subsurface geochemical systems, and therefore may have significant impacts on the fate of radionuclides released into subsurface environments. In this paper, the authors present a general reaction-transport model for microbial metabolism, redox chemistry, and radionuclide migration in subsurface systems. The model explicitly accounts for biomass accumulation and the coupling of radionuclide redox reactions with major biogeochemical processes. Based on the consideration that the biomass accumulation in subsurface environments is likely to achieve a quasi-steady state, they have accordingly modified the traditional microbial growth kinetic equation. They justified the use of the biogeochemical models without the explicit representation of biomass accumulation, if the interest of modeling is in the net impact of microbial reactions on geochemical processes. They then applied their model to a scenario in which an oxic water flow containing both uranium and complexing organic ligands is recharged into an oxic aquifer in a carbonate formation. The model simulation shows that uranium can be reduced and therefore immobilized in the anoxic zone created by microbial degradation.

  11. Extending BPM Environments of Your Choice with Performance Related Decision Support

    Science.gov (United States)

    Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter

    What-if Simulations have been identified as one solution for business performance related decision support. Such support is especially useful in cases where it can be automatically generated out of Business Process Management (BPM) Environments from the existing business process models and performance parameters monitored from the executed business process instances. Currently, some of the available BPM Environments offer basic-level performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations or a combination of such solutions into already existing BPM environments. The approach abstracts from process modelling techniques which enable automatic decision support spanning processes across numerous BPM Environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.

  12. A marketing mix model for a complex and turbulent environment

    Directory of Open Access Journals (Sweden)

    R. B. Mason

    2007-12-01

    Full Text Available Purpose: This paper is based on the proposition that the choice of marketing tactics is determined, or at least significantly influenced, by the nature of the company’s external environment. It aims to illustrate the type of marketing mix tactics that are suggested for a complex and turbulent environment when marketing and the environment are viewed through a chaos and complexity theory lens. Design/Methodology/Approach: Since chaos and complexity theories are proposed as a good means of understanding the dynamics of complex and turbulent markets, a comprehensive review and analysis of literature on the marketing mix and marketing tactics from a chaos and complexity viewpoint was conducted. From this literature review, a marketing mix model was conceptualised. Findings: A marketing mix model considered appropriate for success in complex and turbulent environments was developed. In such environments, the literature suggests destabilising marketing activities are more effective, whereas stabilising type activities are more effective in simple, stable environments. Therefore the model proposes predominantly destabilising type tactics as appropriate for a complex and turbulent environment such as is currently being experienced in South Africa. Implications: This paper is of benefit to marketers by emphasising a new way to consider the future marketing activities of their companies. How this model can assist marketers and suggestions for research to develop and apply this model are provided. It is hoped that the model suggested will form the basis of empirical research to test its applicability in the turbulent South African environment. Originality/Value: Since businesses and markets are complex adaptive systems, using complexity theory to understand how to cope in complex, turbulent environments is necessary, but has not been widely researched. In fact, most chaos and complexity theory work in marketing has concentrated on marketing strategy, with

  13. The importance of modelling in European decisions concerning energy and environment

    International Nuclear Information System (INIS)

    Rossetti di Valdalbero, D.

    2012-01-01

    Under pressure as much from the economic crisis as from a desire to improve governance, political decision makers seek to evaluate economic and social impacts of their choices. Quantification of trend projections and alternative scenarios - developed with the aid of economic and energy models - provide a base on which to make these choices. Attempts to put figures on and to clarify contrasting options provoke animated debates, both about the quantitative tools used (a scientific question) and on their use (a political question). This article aims to show the importance of the modellers in providing answers to these questions, by analysing the process of European policy decisions on energy and the environment over the period 1995-2005. (authors)

  14. Animal models of gene-environment interactions in schizophrenia.

    Science.gov (United States)

    Ayhan, Yavuz; Sawa, Akira; Ross, Christopher A; Pletnikov, Mikhail V

    2009-12-07

    The pathogenesis of schizophrenia and related mental illnesses likely involves multiple interactions between susceptibility genes of small effects and environmental factors. Gene-environment interactions occur across different stages of neurodevelopment to produce heterogeneous clinical and pathological manifestations of the disease. The main obstacle for mechanistic studies of gene-environment interplay has been the paucity of appropriate experimental systems for elucidating the molecular pathways that mediate gene-environment interactions relevant to schizophrenia. Recent advances in psychiatric genetics and a plethora of experimental data from animal studies allow us to suggest a new approach to gene-environment interactions in schizophrenia. We propose that animal models based on identified genetic mutations and measurable environment factors will help advance studies of the molecular mechanisms of gene-environment interplay.

  15. Methods for Process Evaluation of Work Environment Interventions

    DEFF Research Database (Denmark)

    Fredslund, Hanne; Strandgaard Pedersen, Jesper

    2004-01-01

    In recent years, intervention studies have become increasingly popular within occupational health psychology. The vast majority of such studies have focused on interventions themselves and their effects on the working environment and employee health and well-being. Few studies have focused on how the context and processes surrounding the intervention may have influenced the outcomes (Hurrell and Murphy, 1996). Thus, there is still relatively little published research that provides us with information on how to evaluate such strategies and processes (Saksvik, Nytrø, Dahl-Jørgensen, and Mikkelsen, 2002)... '...or management perceptions and actions in implementing any intervention and their influence on the overall result of the intervention' (Nytrø, Saksvik, Mikkelsen, Bohle, and Quinlan, 2000). Process evaluation can be used to a) provide feedback for improving interventions, b) interpret the outcomes of effect...

  16. Using BPMN to model Internet of Things behavior within business process

    Directory of Open Access Journals (Sweden)

    Dulce Domingos

    2017-01-01

    Full Text Available Whereas, traditionally, business processes use the Internet of Things (IoT) as a distributed source of information, the increase in the computational capabilities of IoT devices provides them with the means to also execute parts of the business logic, reducing the amount of exchanged data and central processing. Current approaches based on Business Process Model and Notation (BPMN) already support modelers in defining both business processes and IoT device behavior at the same level of abstraction. However, they are not restricted to standard BPMN elements and they generate IoT-device-specific low-level code. The work we present in this paper exclusively uses standard BPMN to define the central as well as the IoT behavior of business processes. In addition, the BPMN that defines the IoT behavior is translated to platform-neutral programming code. The deployment and execution environments use Web services to support the communication between the process execution engine and IoT devices.

  17. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occur within the porous adsorbent...

  18. Modelling the cohesive sediment transport in the marine environment: the case of Thermaikos Gulf

    Directory of Open Access Journals (Sweden)

    Y. N. Krestenitis

    2007-01-01

    Full Text Available The transport of fine-grained sediments in the marine environment entails risks of pollutant intrusions from substances adsorbed onto the surfaces of the cohesive flocs, gradually released to the aquatic field. These substances include nutrients such as nitrate, phosphate and silicate compounds from drainage from fertilization of adjacent cultivated areas that enter coastal waters through rivers and streams, or trace metals as remainders from urban and industrial activities. As a consequence, knowledge of the motion and distribution of sediment particles coming from a given pollutant source is expected to provide the 'bulk' information on pollutant distribution necessary for determining the region of influence of the source and for estimating probable trophic levels of the seawater and potential environmental risks. With that aim, a numerical model has been developed to predict the fate of sediments introduced to the marine environment from different pollution sources, such as river outflows, erosion of the seabed, aeolian-transported material and drainage systems. The proposed three-dimensional mathematical model is based on the particle tracking method, according to which matter concentration is expressed by particles, each representing a particular amount of sedimentary mass, passively advected and dispersed by the currents. The processes affecting the characteristics and propagation of sedimentary material in the marine environment, incorporated in the parameterization, apart from advection and dispersion, include cohesive sediment and near-bed processes. The movement of the particles, along with variations in sedimentary characteristics and state carried by each particle as personal information, is traced with time. Specifically, concerning transport processes, the local seawater velocity and the particle's settling control advection, whereas random Brownian motion due to turbulence simulates turbulent diffusion. The
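
    The particle tracking scheme sketched above, advection by the local current, gravitational settling, and a random-walk term standing in for turbulent diffusion, can be illustrated in a few lines of Python. Velocities, the settling rate, and the diffusion coefficient below are placeholder values, not those of the Thermaikos Gulf model.

```python
import numpy as np

rng = np.random.default_rng(0)

def step_particles(pos, vel, w_settling, D, dt):
    """Advance particle positions (N x 3) by one time step.

    pos        : particle coordinates (x, y, z)
    vel        : local seawater velocity at each particle (N x 3)
    w_settling : settling velocity (positive downward)
    D          : turbulent diffusion coefficient (isotropic here for brevity)
    dt         : time step
    """
    advection = vel * dt
    settling = np.zeros_like(pos)
    settling[:, 2] = -w_settling * dt
    # Random-walk displacement equivalent to Fickian diffusion
    random_walk = rng.normal(0.0, np.sqrt(2.0 * D * dt), size=pos.shape)
    return pos + advection + settling + random_walk

# Example: 1000 particles released at the origin in a uniform 0.1 m/s current
pos = np.zeros((1000, 3))
vel = np.tile([0.1, 0.0, 0.0], (1000, 1))
for _ in range(100):
    pos = step_particles(pos, vel, w_settling=1e-4, D=1e-2, dt=60.0)
print(pos.mean(axis=0))   # mean displacement after 100 steps
```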

  19. Towards a Business Process Modeling Technique for Agile Development of Case Management Systems

    Directory of Open Access Journals (Sweden)

    Ilia Bider

    2017-12-01

    Full Text Available A modern organization needs to adapt its behavior to changes in the business environment by changing its Business Processes (BP) and the corresponding Business Process Support (BPS) systems. One way of achieving such adaptability is via separation of the system code from the process description/model by applying the concept of executable process models. Furthermore, to ease the introduction of changes, such a process model should separate different perspectives, for example, the control-flow, human resources, and data perspectives, from each other. In addition, when developing a completely new process, it should be possible to start with a reduced process model to get a BPS system running quickly, and then continue to develop it in an agile manner. This article consists of two parts: the first sets requirements on modeling techniques that could be used in tools that support agile development of BPs and BPS systems. The second part suggests a business process modeling technique that allows modeling to start with the data/information perspective, which would be appropriate for processes supported by Case or Adaptive Case Management (CM/ACM) systems. In a model produced by this technique, called a data-centric business process model, a process instance/case is defined as a sequence of states in a specially designed instance database, while the process model is defined as a set of rules that set restrictions on allowed states and transitions between them. The article details the background for the project of developing the data-centric process modeling technique, presents the outline of the structure of the model, and gives formal definitions for a substantial part of the model.
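
    A minimal sketch of the data-centric idea described above, where a case is a sequence of states in an instance database and the process model is a set of rules restricting allowed states and transitions, might look as follows in Python. The fields and the single rule are invented for illustration and are not the article's actual meta-model.

```python
from dataclasses import dataclass, field

@dataclass
class Case:
    """A process instance: current state plus the history of previous states."""
    data: dict = field(default_factory=lambda: {"received": False, "approved": False})
    history: list = field(default_factory=list)

def rule_cannot_approve_before_receive(old_state, new_state):
    # Restriction on allowed states: an order may not be approved before receipt.
    return not (new_state["approved"] and not new_state["received"])

RULES = [rule_cannot_approve_before_receive]

def apply_change(case: Case, **changes) -> bool:
    """Attempt a state transition; commit it only if every rule allows it."""
    new_state = {**case.data, **changes}
    if all(rule(case.data, new_state) for rule in RULES):
        case.history.append(case.data)
        case.data = new_state
        return True
    return False

case = Case()
print(apply_change(case, approved=True))    # False: blocked by the rule
print(apply_change(case, received=True))    # True
print(apply_change(case, approved=True))    # True
```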

  20. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  1. Event-Entity-Relationship Modeling in Data Warehouse Environments

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    We use the event-entity-relationship model (EVER) to illustrate the use of entity-based modeling languages for conceptual schema design in data warehouse environments. EVER is a general-purpose information modeling language that supports the specification of both general schema structures and multi...

  2. Applying the Business Process and Practice Alignment Meta-model: Daily Practices and Process Modelling

    Directory of Open Access Journals (Sweden)

    Ventura Martins Paula

    2017-03-01

    Full Text Available Background: Business Process Modelling (BPM) is one of the most important phases of information system design. Business Process (BP) meta-models allow capturing informational and behavioural aspects of business processes. Unfortunately, standard BP meta-modelling approaches focus only on process description, providing different BP models. It is not possible to compare and identify related daily practices in order to improve BP models. This lack of information implies that further research on BP meta-models is needed to reflect the evolution/change in BPs. Considering this limitation, this paper introduces a new BP meta-model, the Business Process and Practice Alignment Meta-model (BPPAM). Our intention is to present a meta-model that addresses features related to the alignment between daily work practices and BP descriptions. Objectives: This paper intends to present a meta-model that integrates daily work information into coherent and sound process definitions. Methods/Approach: The methodology employed in the research follows a design-science approach. Results: The results of the case study are related to the application of the proposed meta-model to align the specification of a BP model with work practice models. Conclusions: This meta-model can be used within the BPPAM methodology to specify or improve business process models based on work practice descriptions.

  3. Modeling of non-additive mixture properties using the Online CHEmical database and Modeling environment (OCHEM

    Directory of Open Access Journals (Sweden)

    Oprisiu Ioana

    2013-01-01

    Full Text Available Abstract The Online Chemical Modeling Environment (OCHEM, http://ochem.eu) is a web-based platform that provides tools for automation of typical steps necessary to create a predictive QSAR/QSPR model. The platform consists of two major subsystems: a database of experimental measurements and a modeling framework. So far, OCHEM has been limited to the processing of individual compounds. In this work, we extended OCHEM with a new ability to store and model properties of binary non-additive mixtures. The developed system is publicly accessible, meaning that any user on the Web can store new data for binary mixtures and develop models to predict their non-additive properties. The database already contains almost 10,000 data points for the density, bubble point, and azeotropic behavior of binary mixtures. For these data, we developed models for both qualitative (azeotrope/zeotrope) and quantitative (density and bubble point) endpoints using different learning methods and specially developed descriptors for mixtures. The prediction performance of the models was similar to or more accurate than results reported in previous studies. Thus, we have developed and made publicly available a powerful system for modeling mixtures of chemical compounds on the Web.

  4. Introducing ORACLE: Library Processing in a Multi-User Environment.

    Science.gov (United States)

    Queensland Library Board, Brisbane (Australia).

    Currently being developed by the State Library of Queensland, Australia, ORACLE (On-Line Retrieval of Acquisitions, Cataloguing, and Circulation Details for Library Enquiries) is a computerized library system designed to provide rapid processing of library materials in a multi-user environment. It is based on the Australian MARC format and fully…

  5. Analysis of the Growth Process of Neural Cells in Culture Environment Using Image Processing Techniques

    Science.gov (United States)

    Mirsafianf, Atefeh S.; Isfahani, Shirin N.; Kasaei, Shohreh; Mobasheri, Hamid

    Here we present an approach for processing neural cell images to analyze their growth process in a culture environment. We have applied several image processing techniques for: (1) environmental noise reduction, (2) neural cell segmentation, (3) neural cell classification based on the growth conditions of their dendrites, and (4) neuron feature extraction and measurement (e.g., cell body area, number of dendrites, axon length, and so on). Due to the large amount of noise in the images, we have used feed-forward artificial neural networks to detect edges more precisely.
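
    A minimal pipeline in the spirit of steps (1), (2) and (4) above could be assembled from standard scikit-image building blocks, as sketched below. Thresholds, size limits, and the measured features are placeholders, and the record's neural-network edge detection is not reproduced here.

```python
import numpy as np
from skimage import filters, measure, morphology

def analyze_neurons(image):
    """Toy pipeline: denoise, segment, and measure cell-body regions.

    `image` is a 2-D grayscale array. Parameter choices are placeholders;
    real phase-contrast or fluorescence data would need tuning.
    """
    denoised = filters.gaussian(image, sigma=2)               # noise reduction
    mask = denoised > filters.threshold_otsu(denoised)        # segmentation
    mask = morphology.remove_small_objects(mask, min_size=50)
    labels = measure.label(mask)
    # Feature extraction: area and a rough shape measure per labeled region
    return [(r.label, r.area, r.eccentricity) for r in measure.regionprops(labels)]

# Example on synthetic data with one bright "cell"
rng = np.random.default_rng(1)
synthetic = rng.normal(0.1, 0.02, (256, 256))
synthetic[100:130, 100:130] += 0.5
print(analyze_neurons(synthetic))
```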

  6. Model and Analytic Processes for Export License Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.; Wood, Thomas W.; Daly, Don S.; Brothers, Alan J.; Sanfilippo, Antonio P.; Cook, Diane; Holder, Larry

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision-framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An

  7. Examining Student Research Choices and Processes in a Disintermediated Searching Environment

    Science.gov (United States)

    Rempel, Hannah Gascho; Buck, Stefanie; Deitering, Anne-Marie

    2013-01-01

    Students today perform research in a disintermediated environment, which often allows them to struggle directly with the process of selecting research tools and choosing scholarly sources. The authors conducted a qualitative study with twenty students, using structured observations to ascertain the processes students use to select databases and…

  8. Requests for Help in a Multilingual Professional Environment Testimonies and Actantial Models

    Directory of Open Access Journals (Sweden)

    Lejot Eve

    2017-12-01

    Full Text Available Professional multilingual environments using English as a lingua franca are prone to imbalances in communication, linguistic insecurity and rising tension. Non-native English speakers develop avoidance strategies in order to lessen their apprehension. To overcome these imbalances, this research aims to understand the relationships formed around languages, focusing on the dynamics of integration and the requests for help. Guided by the actantial models of Greimas (1966), this qualitative study employs semiolinguistics and discourse analysis, including 19 narrative interviews with employees of Airbus and UNESCO in Hamburg, Germany in 2013. This methodology draws on actors connected through relationships of power and/or collaboration. The actantial models applied seek linguistic input through designational paradigms, shifters and modal occurrences. The actantial models illustrate how a good language competence provides a better understanding of one's direct as well as passive environment. The learning process is shown to be a conduit to integration. The actantial model and discourse analysis shed light on the complex situation of multilingual communication settings by highlighting the influence of individuals' linguistic skills. As a matter of fact, depending on the role of each individual in a given situation, lending a helping hand sometimes equates to upsetting the balance.

  9. CASES ON COLLABORATION IN VIRTUAL LEARNING ENVIRONMENTS: Processes and Interactions

    Directory of Open Access Journals (Sweden)

    Reviewed by Yasin OZARSLAN

    2010-01-01

    Full Text Available Collaboration in virtual learning environments brings meaningful learning interactions between learners in virtual environments. This book collects case studies of collaborative virtual learning environments, focusing on the nature of human interactions in virtual spaces and defining the types and qualities of learning processes in these spaces from the perspectives of learners, teachers, designers, and professional and academic developers in various disciplines, learning communities and universities from around the world. The book addresses research cases on experiences, implementations, and applications of virtual learning environments. Its broader audience is anyone who is interested in areas such as collaborative virtual learning environments, interactive technologies and virtual communities, social interaction and social competence, distance education and collaborative learning. The book is edited by Donna Russell, who is an Assistant Professor at the University of Missouri-Kansas City and co-owner of Arete' Consulting, LLC. It consists of 358 pages covering 19 articles and provides information about the context for the characteristics and implications of the varied virtual learning environments. Topics covered in this book are argumentative interactions and learning, collaborative learning and work in digital libraries, collaborative virtual learning environments, digital communities to enhance retention, distance education, interactive technologies and virtual communities, massively multi-user virtual environments, online graduate community, online training programs, social interaction and social competence, and virtual story-worlds.

  10. INSTITUTIONAL ENVIRONMENT OF THE AGRICULTURAL MARKET FORMATION PROCESS

    Directory of Open Access Journals (Sweden)

    S. Revenko

    2013-11-01

    Full Text Available This article considers institutional aspects of the process of forming an organized agricultural market. A theoretical basis for distinguishing between an institute and institutions is given. In order to identify the institutions that most influence the "organization" phenomenon, the author analyses the Ukrainian institutional environment, which is still under construction. The main processes that run during organized market formation are considered, and theoretical approaches to the institutional framework are examined; several schemes are proposed to structure the most common approaches and theoretical knowledge of this problem. The author's points of view on many questions of the organized market formation process are presented, and the effectiveness of institutions and of governmental regulation of the agricultural market is analyzed. Readers can find a strategically new approach to agricultural market formation policy from the governmental point of view. The essence of the socioeconomic formation of the agricultural market is considered, the main factors of agricultural market formation are outlined, and a systematic approach to considering the structural parts of the agricultural market is proposed. The ineffectiveness of agricultural market relations without a regulation process is demonstrated, and the most unfavorable factors in agricultural market formation are determined.

  11. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  12. Genomic-Enabled Prediction Kernel Models with Random Intercepts for Multi-environment Trials

    Science.gov (United States)

    Cuevas, Jaime; Granato, Italo; Fritsche-Neto, Roberto; Montesinos-Lopez, Osval A.; Burgueño, Juan; Bandeira e Sousa, Massaine; Crossa, José

    2018-01-01

    In this study, we compared the prediction accuracy of the main genotypic effect model (MM) without G×E interactions, the multi-environment single variance G×E deviation model (MDs), and the multi-environment environment-specific variance G×E deviation model (MDe), where the random genetic effects of the lines are modeled with the markers (or pedigree). With the objective of further modeling the genetic residual of the lines, we incorporated the random intercepts of the lines (l) and generated another three models. Each of these 6 models was fitted with a linear kernel method (Genomic Best Linear Unbiased Predictor, GB) and a Gaussian kernel (GK) method. We compared these 12 model-method combinations with another two multi-environment G×E interaction models with unstructured variance-covariances (MUC) using GB and GK kernels (4 model-method combinations). Thus, we compared the genomic-enabled prediction accuracy of a total of 16 model-method combinations on two maize data sets with positive phenotypic correlations among environments, and on two wheat data sets with complex G×E that includes some negative and close-to-zero phenotypic correlations among environments. The two models (MDs and MDe) with the random intercept of the lines and the GK method were computationally efficient and gave high prediction accuracy in the two maize data sets. Regarding the more complex G×E wheat data sets, the model-method combinations with G×E (MDs and MDe) including the random intercepts of the lines with the GK method had important savings in computing time as compared with the G×E interaction multi-environment models with unstructured variance-covariances, but with lower genomic prediction accuracy. PMID:29476023
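
    For readers unfamiliar with the two kernel methods compared above, the sketch below builds a GB-type (linear) relationship matrix and a Gaussian kernel from a marker matrix. The median-distance scaling of the Gaussian kernel is a common heuristic assumed here for illustration, not necessarily the scaling used in the study.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def linear_kernel(X):
    """GB-type relationship matrix from centered marker scores X (lines x markers)."""
    Xc = X - X.mean(axis=0)
    return Xc @ Xc.T / X.shape[1]

def gaussian_kernel(X, h=1.0):
    """GK kernel; squared distances scaled by their median (a common heuristic)."""
    d2 = squareform(pdist(X, metric="sqeuclidean"))
    return np.exp(-h * d2 / np.median(d2[d2 > 0]))

# Example with random marker data for 5 lines and 100 markers (coded 0/1/2)
X = np.random.default_rng(2).integers(0, 3, size=(5, 100)).astype(float)
print(linear_kernel(X).shape, gaussian_kernel(X).shape)
```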

  13. Modeling and control for closed environment plant production systems

    Science.gov (United States)

    Fleisher, David H.; Ting, K. C.; Janes, H. W. (Principal Investigator)

    2002-01-01

    A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal.
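
    The controller logic described above, adjusting light, temperature, and CO2 set points by minimizing a cost built from weighted control effort plus squared tracking error, can be illustrated with a one-step optimization. The linear crop-response surrogate and all numbers below are assumptions standing in for the study's polynomial (MPR) crop models.

```python
import numpy as np
from scipy.optimize import minimize

# Nominal set points: light (PPF), air temperature (deg C), CO2 (ppm); assumed units.
u_nominal = np.array([300.0, 22.0, 1200.0])
weights = np.array([1e-4, 1e-1, 1e-5])      # relative cost of control effort
target_growth = 12.0                        # desired growth rate (placeholder)

def crop_response(u, disturbance=0.0):
    # Placeholder linear surrogate for the polynomial crop model.
    return 0.02 * u[0] + 0.2 * u[1] + 0.004 * u[2] + disturbance

def cost(u, disturbance):
    tracking = (crop_response(u, disturbance) - target_growth) ** 2
    effort = np.sum(weights * (u - u_nominal) ** 2)
    return tracking + effort

# An environmental disturbance (e.g., a temporary growth setback) shifts the
# optimal set points away from their nominal values.
result = minimize(cost, u_nominal, args=(-1.5,))
print(result.x)
```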

  14. Trust Model to Enhance Security and Interoperability of Cloud Environment

    Science.gov (United States)

    Li, Wenjuan; Ping, Lingdi

    Trust is one of the most important means to improve security and enable interoperability of current heterogeneous, independent cloud platforms. This paper first analyzes several trust models used in large and distributed environments and then introduces a novel cloud trust model to solve security issues in cross-cloud environments, in which cloud customers can choose different providers' services and in which resources in heterogeneous domains can cooperate. The model is domain-based: it groups one cloud provider's resource nodes into the same domain and sets a trust agent for it. It distinguishes two different roles, cloud customer and cloud server, and designs different strategies for them. In our model, trust recommendation is treated as one type of cloud service, just like computation or storage. The model achieves both identity authentication and behavior authentication. The results of emulation experiments show that the proposed model can efficiently and safely construct trust relationships in cross-cloud environments.

  15. Models for genotype by environment interaction estimation on halomorphic soil

    Directory of Open Access Journals (Sweden)

    Dimitrijević Miodrag

    2006-01-01

    Full Text Available In genotype by environment interaction estimation, as well as in total trial variability analysis, several models are in use. The most often used are analysis of variance, the Eberhart and Russell model, and the AMMI model. Each of the models has its own specificities in the way sources of variation are understood and treated. It is known that agriculturally less productive environments increase errors, diminish reaction differences between genotypes, and decrease the repeatability of conditions across years. A sample consisting of six bread wheat varieties was studied over three vegetation periods on halomorphic soil, solonetz type, in Banat (village of Kumane). Genotype by environment interaction was quantified using ANOVA, the Eberhart and Russell model, and the AMMI model. The results were compared not only on pure solonetz soil (control), but also at two levels of amelioration (25 and 50 t/ha phosphor-gypsum).

  16. A virtual auditory environment for investigating the auditory signal processing of realistic sounds

    DEFF Research Database (Denmark)

    Favrot, Sylvain Emmanuel; Buchholz, Jörg

    2008-01-01

    In the present study, a novel multichannel loudspeaker-based virtual auditory environment (VAE) is introduced. The VAE aims at providing a versatile research environment for investigating auditory signal processing in real environments, i.e., considering multiple sound sources and room reverberation. The environment is based on the ODEON room acoustic simulation software to render the acoustical scene. ODEON outputs are processed using a combination of different-order Ambisonic techniques to calculate multichannel room impulse responses (mRIR). Auralization is then obtained by convolution... During the VAE development, special care was taken in order to achieve a realistic auditory percept and to avoid "artifacts" such as unnatural coloration. The performance of the VAE has been evaluated and optimized on a 29-loudspeaker setup using both objective and subjective measurement techniques.
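
    The auralization step mentioned above, convolving a dry source signal with the multichannel room impulse responses, is straightforward to sketch. The placeholder impulse responses and the 29-channel layout below are only illustrative and are not derived from the ODEON/Ambisonic processing chain described in the record.

```python
import numpy as np
from scipy.signal import fftconvolve

def auralize(dry_signal, mrir):
    """Convolve a dry (anechoic) source with a multichannel room impulse response.

    dry_signal : 1-D array, the source signal
    mrir       : 2-D array (n_loudspeakers x n_taps), one RIR per channel
    Returns an (n_loudspeakers x n_samples) array of loudspeaker feeds.
    """
    return np.stack([fftconvolve(dry_signal, rir) for rir in mrir])

# Example: white-noise source and 29 decaying random impulse responses (placeholders)
rng = np.random.default_rng(3)
source = rng.standard_normal(48000)
mrir = rng.standard_normal((29, 24000)) * np.exp(-np.linspace(0, 8, 24000))
feeds = auralize(source, mrir)
print(feeds.shape)
```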

  17. Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training

    Science.gov (United States)

    Macris, A.; Malamateniou, F.; Vassilacopoulos, G.

    Successful business process design requires the active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile and adaptable training material in order to enable them to instil their knowledge and expertise in business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling, since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach for the design and use of training material that provides significant advantages to both the designer (knowledge and content reusability, semantic web enablement) and the user (semantic search, knowledge navigation and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e. training scenarios serving specific training needs) so that it is made reusable.
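
    As a rough sketch of how a training scenario could be externalized as an ontology-based knowledge network, the following uses the rdflib library with an invented namespace and invented class/property names (the paper's actual ontology is not specified in this record):

        from rdflib import Graph, Namespace, Literal, RDF, RDFS

        TRAIN = Namespace("http://example.org/training#")   # illustrative namespace

        g = Graph()
        g.bind("train", TRAIN)

        # A reusable training scenario linked to the process-modelling concepts it teaches.
        g.add((TRAIN.OrderHandlingScenario, RDF.type, TRAIN.TrainingScenario))
        g.add((TRAIN.OrderHandlingScenario, RDFS.label, Literal("Order handling process redesign")))
        g.add((TRAIN.OrderHandlingScenario, TRAIN.teaches, TRAIN.ExclusiveGateway))
        g.add((TRAIN.OrderHandlingScenario, TRAIN.reuses, TRAIN.ApprovalSubprocessPattern))

        # Semantic search: find every scenario that teaches a given concept.
        for scenario in g.subjects(TRAIN.teaches, TRAIN.ExclusiveGateway):
            print(scenario)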

  18. Global Impact of Energy Use in Middle East Oil Economies: A Modeling Framework for Analyzing Technology-Energy-Environment-Economy Chain

    OpenAIRE

    Hodjat Ghadimi

    2007-01-01

    To explore choices for improving energy efficiency in the energy-rich countries of the Middle East, this study lays out an integrated modeling framework for analyzing the technology-energy-environment-economy chain for the case of an energy-exporting country. This framework consists of an input-output process-flow model (IOPM) and a computable general equilibrium (CGE) model. The former investigates micro-level production processes and sectoral interdependencies to show how alternative technol...
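
    The input-output half of such a framework typically rests on the Leontief relation x = (I - A)^{-1} d between final demand and gross output, from which sectoral energy use follows. A minimal numerical sketch with invented two-sector coefficients (not taken from the study):

        import numpy as np

        # Illustrative technical coefficient matrix A (energy sector, rest of economy)
        A = np.array([[0.10, 0.20],
                      [0.30, 0.15]])
        final_demand = np.array([50.0, 200.0])

        # Leontief relation: x = A x + d  =>  x = (I - A)^-1 d
        gross_output = np.linalg.solve(np.eye(2) - A, final_demand)

        energy_intensity = np.array([1.8, 0.4])      # assumed energy use per unit output
        print("gross output:", gross_output)
        print("total energy use:", energy_intensity @ gross_output)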

  19. Quantum field inspired model of decision making: Asymptotic stabilization of belief state via interaction with surrounding mental environment

    OpenAIRE

    Bagarello, Fabio; Basieva, Irina; Khrennikov, Andrei

    2017-01-01

    This paper is devoted to justification of quantum-like models of the process of decision making based on the theory of open quantum systems, i.e. decision making is considered as decoherence. This process is modeled as interaction of a decision maker, Alice, with a mental (information) environment ${\cal R}$ surrounding her. Such an interaction generates "dissipation of uncertainty" from Alice's belief-state $\rho(t)$ into ${\cal R}$ and asymptotic stabilization of $\rho(t)$ to a steady belie...
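
    The open-quantum-systems treatment alluded to here is usually formalized with a master equation of Gorini-Kossakowski-Sudarshan-Lindblad (GKSL) type for the belief state (a standard textbook form, not quoted from this preprint):

        \frac{d\rho(t)}{dt} = -\frac{i}{\hbar}[H,\rho(t)] + \sum_k \Big( L_k \rho(t) L_k^{\dagger} - \tfrac{1}{2}\{ L_k^{\dagger} L_k, \rho(t)\} \Big)

    where $H$ generates Alice's internal dynamics, the operators $L_k$ encode the coupling to the mental environment ${\cal R}$, and the dissipative sum drives $\rho(t)$ toward the steady belief state described in the abstract.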

  20. A neuroconstructivist model of past tense development and processing.

    Science.gov (United States)

    Westermann, Gert; Ruh, Nicolas

    2012-07-01

    We present a neural network model of learning and processing the English past tense that is based on the notion that experience-dependent cortical development is a core aspect of cognitive development. During learning the model adds and removes units and connections to develop a task-specific final architecture. The model provides an integrated account of characteristic errors during learning the past tense, adult generalization to pseudoverbs, and dissociations between verbs observed after brain damage in aphasic patients. We put forward a theory of verb inflection in which a functional processing architecture develops through interactions between experience-dependent brain development and the structure of the environment, in this case, the statistical properties of verbs in the language. The outcome of this process is a structured processing system giving rise to graded dissociations between verbs that are easy and verbs that are hard to learn and process. In contrast to dual-mechanism accounts of inflection, we argue that describing dissociations as a dichotomy between regular and irregular verbs is a post hoc abstraction and is not linked to underlying processing mechanisms. We extend current single-mechanism accounts of inflection by highlighting the role of structural adaptation in development and in the formation of the adult processing system. In contrast to some single-mechanism accounts, we argue that the link between irregular inflection and verb semantics is not causal and that existing data can be explained on the basis of phonological representations alone. This work highlights the benefit of taking brain development seriously in theories of cognitive development. Copyright 2012 APA, all rights reserved.

  1. Semantic Business Process Modeling

    OpenAIRE

    Markovic, Ivan

    2010-01-01

    This book presents a process-oriented business modeling framework based on semantic technologies. The framework consists of modeling languages, methods, and tools that allow for semantic modeling of business motivation, business policies and rules, and business processes. Quality of the proposed modeling framework is evaluated based on the modeling content of SAP Solution Composer and several real-world business scenarios.

  2. An Innovative Economic Incentive Model for Improvement of the Working Environment in Europe

    DEFF Research Database (Denmark)

    Koch, Christian

    1997-01-01

    The design of economic incentive schemes to promote working environment betterment has proven complicated. The contribution lists some central preconditions and tools of an economic incentive model for improving the working environment. The economic incentive is proposed to be built into the compulsory work injury compensation system that a number of European countries operate. Building on an understanding of enterprise decision processes and resources, it is suggested to implement a range of tools targeting different groups of enterprises. The central tools are bonus schemes, a special scheme for small enterprises, a marketing label and low-interest investment aid. It is finally discussed what constraints the implementation of such a scheme may be confronted with.

  3. Enabling knowledge processes in innovative environments

    NARCIS (Netherlands)

    Pavesi, S.

    2003-01-01

    The concept of organisational knowledge as a valuable strategic asset has become quite popular recently. Increased competition, globalisation and the emergence of new organisational models built on process-based organisational structures require organisations to create, capture, share and apply knowledge.

  4. Accident response -- X-ray to virtual environment

    International Nuclear Information System (INIS)

    Hefele, J.; Stupin, D.; Kelley, T.; Sheats, M.; Tsai, C.

    1999-01-01

    The Engineering Sciences and Applications (ESA) Division of Los Alamos National Laboratory (LANL) has been working to develop a process to extract topographical information from digital x-ray data for modeling in a Computer Aided Design (CAD) environment and translation into a virtual environment. The application for this process is the evolution of a field-deployable tool for use by the Accident Response Group (ARG) at the Laboratory. The authors have used both CT scan and radiography data in their process development. The data are translated into a format recognizable by Pro/ENGINEER and then into a virtual environment that can be operated on by dVISE. They have successfully taken both CT scan and radiograph data of single components and created solid and virtual environment models for interrogation

  5. Modeling the C. elegans nematode and its environment using a particle system.

    Science.gov (United States)

    Rönkkö, Mauno; Wong, Garry

    2008-07-21

    A particle system, as understood in computer science, is a novel technique for modeling living organisms in their environment. Such particle systems have traditionally been used for modeling the complex dynamics of fluids and gases. In the present study, a particle system was devised to model the movement and feeding behavior of the nematode Caenorhabditis elegans in three different virtual environments: gel, liquid, and soil. The results demonstrate that distinct movements of the nematode can be attributed to its mechanical interactions with the virtual environment. These results also revealed emergent properties associated with modeling organisms within environment-based systems.
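
    A bare-bones illustration of the particle-system idea (body segments as particles coupled by springs, with medium-dependent drag) is sketched below; every constant, the drag values and the sinusoidal driving force are invented for illustration and are not the parameters of the published model:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 20                                    # particles representing body segments
        pos = np.cumsum(rng.normal(0, 0.05, (n, 2)), axis=0)
        vel = np.zeros((n, 2))

        drag = {"gel": 5.0, "liquid": 0.5, "soil": 12.0}   # illustrative medium resistance
        dt, medium = 0.01, "liquid"

        for step in range(1000):
            # Spring-like coupling keeps neighbouring particles at roughly fixed spacing.
            spring = np.zeros_like(pos)
            spring[:-1] += pos[1:] - pos[:-1]
            spring[1:] += pos[:-1] - pos[1:]
            # Undulatory drive plus drag from the virtual medium.
            phase = 0.1 * step + np.arange(n)
            drive = 0.2 * np.column_stack([np.sin(phase), np.cos(phase)])
            accel = 50.0 * spring + drive - drag[medium] * vel
            vel += accel * dt
            pos += vel * dt

    Switching the medium entry from "liquid" to "gel" or "soil" changes the resulting trajectories, which is the qualitative point of the study: distinct movements arise from mechanical interaction with the environment.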

  6. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  7. PREDICTIVE MODELS FOR SUPPORT OF INCIDENT MANAGEMENT PROCESS IN IT SERVICE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Martin SARNOVSKY

    2018-03-01

    The work presented in this paper focuses on creating predictive models that help in the process of incident resolution and the implementation of IT infrastructure changes, to increase the overall support of IT management. Our main objective was to build the predictive models using machine learning algorithms and the CRISP-DM methodology. We used the incident and related-changes database obtained from the IT environment of the Rabobank Group, which contained information about the processing of incidents during the incident management process. We decided to investigate the dependencies between the incident observation on a particular infrastructure component and the actual source of the incident, as well as the dependency between the incidents and related changes in the infrastructure. We used Random Forest and Gradient Boosting Machine classifiers in the process of identifying the incident source as well as in predicting the possible impact of the observed incident. Both types of models were tested on a test set and evaluated using defined metrics.
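
    A minimal sketch of the two classifier families named above, trained on synthetic stand-in data (the feature set, labels and metrics of the Rabobank incident database are not reproduced here):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score

        # Synthetic stand-in for incident records (e.g. component, priority, reassignments).
        rng = np.random.default_rng(42)
        X = rng.normal(size=(500, 4))
        y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

        for model in (RandomForestClassifier(n_estimators=200, random_state=0),
                      GradientBoostingClassifier(random_state=0)):
            model.fit(X_train, y_train)
            print(type(model).__name__, "accuracy:", accuracy_score(y_test, model.predict(X_test)))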

  8. Distinguishing Environment and System in Coloured Petri Net Models of Reactive Systems

    DEFF Research Database (Denmark)

    Tjell, Simon

    2007-01-01

    This paper introduces and formally defines the environment-and-system-partitioned property for behavioral models of reactive systems expressed in the formal modeling language Coloured Petri Net. The purpose of the formalization is to make it possible to automatically validate any CPN model...... with respect to this property based on structural analysis. A model has the environment-and-system-partitioned property if it is based on a clear division between environment and system. This division is important in many model-driven approaches to software development such as model-based testing and automated...

  9. Modelling of L-valine Repeated Fed-batch Fermentation Process Taking into Account the Dissolved Oxygen Tension

    Directory of Open Access Journals (Sweden)

    Tzanko Georgiev

    2009-03-01

    This article deals with the synthesis of a dynamic unstructured model of a variable-volume fed-batch fermentation process with intensive droppings for L-valine production. The presented approach includes the following main procedures: description of the process by generalized stoichiometric equations; preliminary data processing and calculation of specific rates for the main kinetic variables; identification of the specific rates taking into account the dissolved oxygen tension; establishment and optimisation of a dynamic model of the process; and simulation studies. MATLAB is used as the research environment.
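
    A generic variable-volume fed-batch skeleton of the kind described (mass balances driven by specific rates) can be sketched as below; the Monod kinetics, yields and numbers are illustrative assumptions, not the identified L-valine model:

        from scipy.integrate import solve_ivp

        def fed_batch(t, state, feed_rate, s_in):
            """Biomass X, substrate S, product P and volume V balances."""
            X, S, P, V = state
            mu = 0.25 * S / (0.5 + S)     # specific growth rate (assumed Monod form)
            q_s = mu / 0.5                # specific substrate uptake (assumed yield)
            q_p = 0.1 * mu                # specific production rate (assumed)
            dilution = feed_rate / V
            return [mu * X - dilution * X,
                    -q_s * X + dilution * (s_in - S),
                    q_p * X - dilution * P,
                    feed_rate]

        sol = solve_ivp(fed_batch, (0.0, 24.0), [0.5, 10.0, 0.0, 1.0], args=(0.05, 100.0))
        print("final X, S, P, V:", sol.y[:, -1])

    The identification step in the article corresponds to fitting the specific-rate expressions (including the dissolved oxygen dependence) to the pre-processed batch data rather than assuming them as above.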

  10. Gene-Environment Interplay in Twin Models

    OpenAIRE

    Verhulst, Brad; Hatemi, Peter K.

    2013-01-01

    In this article, we respond to Shultziner’s critique that argues that identical twins are more alike not because of genetic similarity, but because they select into more similar environments and respond to stimuli in comparable ways, and that these effects bias twin model estimates to such an extent that they are invalid. The essay further argues that the theory and methods that undergird twin models, as well as the empirical studies which rely upon them, are unaware of these potential biases...

  11. Towards Automation 2.0: A Neurocognitive Model for Environment Recognition, Decision-Making, and Action Execution

    Directory of Open Access Journals (Sweden)

    Zucker Gerhard

    2011-01-01

    The ongoing penetration of building automation by information technology is far from saturated. Today's systems must not only be reliable and fault tolerant, they also have to address energy efficiency and flexibility in overall consumption. Meeting the quality and comfort goals in building automation while at the same time optimizing for energy, carbon footprint and cost-efficiency requires systems that are able to handle large amounts of information and negotiate system behaviour that resolves conflicting demands, i.e. a decision-making process. In recent years, research has started to focus on bionic principles for designing new concepts in this area. The information processing principles of the human mind have turned out to be of particular interest, as the mind is capable of processing huge amounts of sensory data and taking adequate decisions for (re)actions based on these analysed data. In this paper, we discuss how a bionic approach can solve the upcoming problems of energy-optimal systems. A recently developed model for environment recognition and decision-making processes, based on research findings from different disciplines of brain research, is introduced. This model is the foundation for applications in intelligent building automation that have to deal with information from home and office environments. All of these applications have in common that they consist of a combination of communicating nodes and have many, partly contradicting, goals.

  12. A model for radionuclide Migration in Urban Environment and Drainage Systems

    International Nuclear Information System (INIS)

    Garcia, E.; Gallego, E.; Jimenez, F.

    1998-01-01

    The Model for Radionuclide Migration in Urban Environment and Drainage Systems aims to estimate the discharge of radioactivity removed by natural or forced decontamination into the receiving waters from the drainage system, as well as the radioactivity associated with the sludge produced in treatment plants, whose various applications can represent a potential hazard. This model, built in Powersim, is included in the MOIRA system, a project whose main aim is the evaluation of the situation after radioactive contamination of aquatic ecosystems and the estimation of optimal remedial strategies to restore the contaminated waters. Powersim is an easy-to-use software package which simulates dynamic processes. Two sub-models compose the global model: one simulating the evolution of Cs-137 in urban areas, and the other the behaviour of this radionuclide, once it has entered the drainage systems, with the various existing alternatives of waste water treatment in Europe. (Author) 8 refs

  13. Teaching and Learning of Computational Modelling in Creative Shaping Processes

    Directory of Open Access Journals (Sweden)

    Daniela REIMANN

    2017-10-01

    Today, it is not only the diverse design-related disciplines that are required to deal actively with the digitization of information and its potentials and side effects for educational processes. In Germany, technology didactics developed in vocational education and computer science education in general education, both separated from media pedagogy as an after-school programme. Media education is not yet a school subject in Germany. However, in this paper we argue for an interdisciplinary approach to learning about computational modelling in creative processes and aesthetic contexts, one that crosses the borders of programming technology and arts and design processes in meaningful contexts. Educational scenarios using smart textile environments are introduced and reflected upon for project-based learning.

  14. Review of current activities to model and measure the orbital debris environment in low-earth orbit

    Science.gov (United States)

    Reynolds, R. C.

    A very active orbital debris program is currently being pursued at the NASA Johnson Space Center (JSC), with projects designed to better define the current environment, to project future environments, to model the processes contributing to or constraining the growth of debris in the environment, and to gather supporting data needed to improve the understanding of the orbital debris problem and the hazard it presents to spacecraft. This paper reviews the activity being conducted at JSC by NASA, Lockheed Engineering and Sciences Company, and other support contractors, presents results of current research, and discusses directions for future development.

  15. Empirical probability model of cold plasma environment in the Jovian magnetosphere

    Science.gov (United States)

    Futaana, Yoshifumi; Wang, Xiao-Dong; Barabash, Stas; Roussos, Elias; Truscott, Pete

    2015-04-01

    We analyzed the Galileo PLS dataset to produce a new cold plasma environment model for the Jovian magnetosphere. Although there exist many sophisticated radiation models treating energetic plasma (e.g. JOSE, GIRE, or Salammbo), only a limited number of simple models have been utilized for the cold plasma environment. By extending the existing cold plasma models toward the probability domain, we can predict the extreme periods of the Jovian environment by specifying the percentile of the environmental parameters. The new model was produced by the following procedure. We first referred to the existing cold plasma models of Divine and Garrett, 1983 (DG83) or Bagenal and Delamere, 2011 (BD11). These models are scaled to fit the statistical median of the parameters obtained from Galileo PLS data. The scaled model (also called the "mean model") indicates the median environment of the Jovian magnetosphere. Then, assuming that the deviations in the Galileo PLS parameters are purely due to variations in the environment, we extended the mean model toward the percentile domain. The input parameters of the model are simply the position of the spacecraft (distance, magnetic longitude and latitude) and the specified percentile (e.g. 0.5 for the mean model). All the parameters in the model are described in mathematical forms; therefore the needed computational resources are quite low. The new model can be used for assessing the JUICE mission profile. The spatial extent of the model covers the main phase of the JUICE mission, namely from the Europa orbit to 40 Rj (where Rj is the radius of Jupiter). In addition, theoretical extensions toward the latitudinal direction are also included in the model to support the high-latitude orbit of the JUICE spacecraft.
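
    The percentile extension can be caricatured in a few lines: observed densities at a location are normalised by the reference model value, and the empirical quantiles of those ratios rescale the model. The numbers below are invented, and neither the actual DG83/BD11 scaling nor the PLS dataset is reproduced:

        import numpy as np

        observed = np.array([12.0, 8.5, 20.1, 15.3, 9.9, 30.2, 11.4])   # invented PLS-like values
        reference_model_value = 10.0                                     # invented DG83/BD11 value

        ratios = observed / reference_model_value
        mean_model = reference_model_value * np.median(ratios)          # scaled "mean model"

        def percentile_model(p):
            """Environment value not exceeded with probability p at this location."""
            return reference_model_value * np.quantile(ratios, p)

        print(mean_model, percentile_model(0.5), percentile_model(0.95))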

  16. Business process modeling of industrial maintenance at TRANSPETRO: integrating oil pipeline and marine terminals activities

    Energy Technology Data Exchange (ETDEWEB)

    Arruda, Daniela Mendonca; Oliveira, Italo Luiz [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil). Diretoria de Terminais e Oleodutos; Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio), Rio de Janeiro, RJ (Brazil). Programa de Pos-Graduacao em Metrologia para Qualidade e Inovacao

    2009-07-01

    This paper describes the experience of TRANSPETRO in remodeling industrial maintenance activities, focusing on: preparing for business process modeling (BPM); mapping and analyzing the 'As-Is' process; designing the 'To-Be' process; implementing the remodeled process; and improving the process continuously. The conceptual model and results achieved will contribute to several areas within the company, such as: reliability engineering; human resources, including employee selection processes, training and development, and certifications; the standardization process, encompassing the adoption of standard and operational procedures according to updated external normative references and legal requirements; and health, safety and environment (HSE) performance improvement. These are some of the potential benefits from BPM focused on TRANSPETRO's industrial maintenance area in the search for operational excellence. (author)

  17. Experimental simulation: using generative modelling and palaeoecological data to understand human-environment interactions

    Directory of Open Access Journals (Sweden)

    George Perry

    2016-10-01

    The amount of palaeoecological information available continues to grow rapidly, providing improved descriptions of the dynamics of past ecosystems and enabling them to be seen from new perspectives. At the same time, there has been concern over whether palaeoecological enquiry needs to move beyond descriptive inference to a more hypothesis-focussed or experimental approach; however, the extent to which conventional hypothesis-driven scientific frameworks can be applied to historical contexts (i.e., the past) is the subject of ongoing debate. In other disciplines concerned with human-environment interactions, including physical geography and archaeology, there has been growing use of generative simulation models, typified by agent-based approaches. Generative modelling encourages counter-factual questioning (what if…?), a mode of argument that is particularly important in systems and time periods, such as the Holocene and now the Anthropocene, where the effects of humans and other biophysical processes are deeply intertwined. However, palaeoecologically focused simulation of the dynamics of past ecosystems either seems to be conducted to assess the applicability of some model to the future, or treats humans simplistically as external forcing factors. In this review we consider how generative simulation-modelling approaches could contribute to our understanding of past human-environment interactions. We consider two key issues: the need for null models for understanding past dynamics and the need to be able to learn more from pattern-based analysis. In this light, we argue that there is considerable scope for palaeoecology to benefit from developments in generative models and their evaluation. We discuss the view that simulation is a form of experiment and, by using case studies, consider how the many patterns available to palaeoecologists can support model evaluation in a way that moves beyond simplistic pattern-matching and how such models

  18. Improving science and mathematics education with computational modelling in interactive engagement environments

    Science.gov (United States)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  19. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    International Nuclear Information System (INIS)

    Scheidt, Rafael de Faria; Vilain, Patrícia; Dantas, M A R

    2014-01-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational facilities to achieve successful results. Usually, software environments for this field are developed without taking into account possible interactions and extensibilities required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a real distributed reservoir engineering environment based on a software product line model. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers

  20. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    Science.gov (United States)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational facilities to achieve successful results. Usually, software environments for this field are developed without taking into account possible interactions and extensibilities required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a real distributed reservoir engineering environment based on a software product line model. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers.

  1. User Utility Oriented Queuing Model for Resource Allocation in Cloud Environment

    Directory of Open Access Journals (Sweden)

    Zhe Zhang

    2015-01-01

    Resource allocation is one of the most important research topics in server systems. In the cloud environment, there are massive hardware resources of different kinds, and many kinds of services are usually run on virtual machines of the cloud server. In addition, the cloud environment is commercialized, so economic factors should also be considered. In order to deal with the commercialization and virtualization of the cloud environment, we proposed a user-utility-oriented queuing model for task scheduling. Firstly, we modeled task scheduling in the cloud environment as an M/M/1 queuing system. Secondly, we classified utility into time utility and cost utility and built a linear programming model to maximize the total utility for both of them. Finally, we proposed a utility-oriented algorithm to maximize the total utility. Extensive experiments validate the effectiveness of our proposed model.
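
    For reference, the M/M/1 results underlying such a utility model are the standard ones: with task arrival rate $\lambda$ and service rate $\mu$ ($\lambda < \mu$),

        \rho = \frac{\lambda}{\mu}, \qquad \bar{N} = \frac{\rho}{1-\rho}, \qquad \bar{T} = \frac{1}{\mu - \lambda}

    where $\rho$ is the utilization, $\bar{N}$ the mean number of tasks in the system and $\bar{T}$ the mean response time; the time utility of a task can then be written as a decreasing function of $\bar{T}$ and traded off against cost utility in the linear program.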

  2. Engineered Barrier System: Physical and Chemical Environment

    International Nuclear Information System (INIS)

    Dixon, P.

    2004-01-01

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan "Technical Work Plan for: In-Drift Geochemistry Modeling" (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  3. THE PROPOSED MODEL OF COLLABORATIVE VIRTUAL LEARNING ENVIRONMENT FOR INTRODUCTORY PROGRAMMING COURSE

    Directory of Open Access Journals (Sweden)

    Mahfudzah OTHMAN

    2012-01-01

    Full Text Available This paper discusses the proposed model of the collaborative virtual learning system for the introductory computer programming course which uses one of the collaborative learning techniques known as the “Think-Pair-Share”. The main objective of this study is to design a model for an online learning system that facilitates the collaborative learning activities in a virtual environment such as online communications and pair or small group discussions. In order to model the virtual learning environment, the RUP methodology has been used where it involves the data collection phase and the analysis and design phase. Fifty respondents have been randomly selected to participate in the data collection phase to investigate the students’ interest and learning styles as well as their learning preferences. The results have shown the needs for the development of online small group discussions that can be used as an alternative learning style for programming courses. The proposed design of the virtual learning system named as the Online Collaborative Learning System or OCLS is being depicted using the object-oriented models which are the use-case model and class diagram in order to show the concise processes of virtual “Think-Pair-Share” collaborative activities. The “Think-Pair-Share” collaborative learning technique that is being used in this model has been chosen because of its simplicity and relatively low-risk. This paper also presents the proposed model of the system’s architecture that will become the guidelines for the physical development of OCLS using the web-based applications.

  4. Prevalence and concentration of Salmonella and Campylobacter in the processing environment of small-scale pastured broiler farms.

    Science.gov (United States)

    Trimble, Lisa M; Alali, Walid Q; Gibson, Kristen E; Ricke, Steven C; Crandall, Philip; Jaroni, Divya; Berrang, Mark; Habteselassie, Mussie Y

    2013-11-01

    A growing niche in the locally grown food movement is the small-scale production of broiler chickens using the pasture-raised poultry production model. Limited research exists that focuses on Salmonella and Campylobacter contamination in the environment associated with on-farm processing of pasture-raised broilers. The objective of this study was to establish data relative to Salmonella and Campylobacter prevalence and concentration in soil and mortality compost resulting from prior processing waste disposal in the small-scale, on-farm broiler processing environment. Salmonella and Campylobacter concentrations were determined in soil (n = 42), compost (n = 39), and processing wastewater (PWW; n = 46) samples from 4 small broiler farms using a 3-tube most probable number (MPN) method for Salmonella and a direct plating method for Campylobacter. Salmonella prevalence and concentration (mean log10 MPN per sample weight or volume) in soil [60%, 0.97 (95% CI: 0.66 to 1.27)], compost [64%, 0.95 (95% CI: 0.66 to 1.24)], and wastewater [48%, 1.29 (95% CI: 0.87 to 1.71)] were not significantly different (P > 0.05). Although Campylobacter prevalence was not significantly different by sample type (64.3, 64.3, and 45.7% in soil, compost, and PWW, respectively), the concentration (mean log10 cfu) of this pathogen was significantly lower (P < 0.05) in PWW. This study characterized small-scale poultry production waste disposal practices and provides a record of data that may serve as a guide for future improvement of these practices. Further research is needed regarding the small-scale broiler production environment in relation to improving disposal of processing waste for optimum control of human pathogens.

  5. Diffusion Dominant Solute Transport Modelling in Fractured Media Under Deep Geological Environment - 12211

    Energy Technology Data Exchange (ETDEWEB)

    Kwong, S. [National Nuclear Laboratory (United Kingdom); Jivkov, A.P. [Research Centre for Radwaste and Decommissioning and Modelling and Simulation Centre, University of Manchester (United Kingdom)

    2012-07-01

    Deep geologic disposal of high-activity and long-lived radioactive waste is gaining increasing support in many countries, where suitable low-permeability geological formations in combination with engineered barriers are used to provide long-term waste containment and minimise the impacts to the environment and risk to the biosphere. This modelling study examines solute transport in fractured media under the low flow velocities that are relevant to a deep geological environment. In particular, reactive solute transport through fractured media is studied using a 2-D model that considers advection and diffusion, to explore the coupled effects of kinetic and equilibrium chemical processes. The effects of water velocity in the fracture, matrix porosity and diffusion on solute transport are investigated and discussed. Some illustrative modelled results are presented to demonstrate the use of the model to examine the effects of media degradation on solute transport, under the influences of hydrogeological (diffusion dominant) and microbially mediated chemical processes. The challenges facing the prediction of long-term degradation, such as crack evolution, interaction and coalescence, are highlighted. The potential of a novel microstructure-informed modelling approach to account for these effects is discussed, particularly with respect to investigating the impact of multiple phenomena on material performance. The GRM code is used to examine the effects of media degradation for a geological waste disposal package, under the combined hydrogeological (diffusion dominant) and chemical effects in low groundwater flow conditions that are typical of deep geological disposal systems. An illustrative reactive transport modelling application demonstrates the use of the code to examine the interplay of kinetically controlled biogeochemical reactive processes with advective and diffusive transport, under the influence of media degradation. The initial model results are encouraging which show the
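
    The transport description sketched here corresponds, in its simplest single-continuum form, to the usual advection-diffusion-decay equation (standard literature form, not quoted from the paper):

        \frac{\partial C}{\partial t} = \nabla \cdot (D \nabla C) - \mathbf{v} \cdot \nabla C - \lambda C + R

    where $C$ is the solute concentration, $D$ the effective diffusion coefficient, $\mathbf{v}$ the fracture water velocity, $\lambda$ the radioactive decay constant and $R$ a source/sink term from the kinetic and equilibrium chemical processes; diffusion-dominant conditions correspond to a small Peclet number $vL/D$ at the relevant length scale $L$.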

  6. Process monitoring in high volume semiconductor production environment with in-fab TXRF

    International Nuclear Information System (INIS)

    Ghatak-Roy, A.R.; Hossain, T.Z.

    2000-01-01

    After its introduction in the 1980s, TXRF has become an important tool for surface contamination analysis. This is particularly true for the semiconductor industry, where monitoring trace-level contamination in an ultra-clean environment is absolutely necessary for successful device production with reasonable yield. In Fab 25 of Advanced Micro Devices in Austin, we have installed two TXRF tools, model TXRF3750 manufactured by Rigaku. They contain rotating tungsten anodes with three-beam capability for a wide selection of elements. One of the beams (WM) is used for monitoring low-Z elements such as Na, Mg and Al. The standard output is 9 kW, with 300 mA at 30 kV. The tools run 24 hours a day, 7 days a week, except for maintenance and breakdowns. We have been using TXRF for in-fab monitoring of various tools and processes for trace contamination and some quantification of materials. This in-fab operation is important because it gives real-time monitoring without the necessity of bringing the wafers out of the fab. Secondly, being in the ultra-clean fab environment, the risk of background contamination is minimized. Since TXRF measurement is fast and does not need any sample preparation, it works very well as a production support tool. Several wafer fab technicians have been trained to use the tool for round-the-clock operation. We have successfully monitored tools and processes in our fab. One example is the monitoring of numerous sinks used in the cleaning of production wafers after various processes. Monitor wafers are run after sink cleaning and solvent changes and are then analyzed for any contamination. Another example is the monitoring of tools that use ferrofluidic seals, so as to prevent any contamination from Fe and Cr. Other tools monitored using TXRF include diffusion furnaces, etchers and plasma cleaning tools. We have also been monitoring processes such as ion implantation, metal deposition and rapid thermal annealing. In this presentation, we will

  7. Real-time modeling of primitive environments through wavelet sensors and Hebbian learning

    Science.gov (United States)

    Vaccaro, James M.; Yaworsky, Paul S.

    1999-06-01

    Modeling the world through sensory input necessarily provides a unique perspective for the observer. Given a limited perspective, objects and events cannot always be encoded precisely but must involve crude, quick approximations to deal with sensory information in a real- time manner. As an example, when avoiding an oncoming car, a pedestrian needs to identify the fact that a car is approaching before ascertaining the model or color of the vehicle. In our methodology, we use wavelet-based sensors with self-organized learning to encode basic sensory information in real-time. The wavelet-based sensors provide necessary transformations while a rank-based Hebbian learning scheme encodes a self-organized environment through translation, scale and orientation invariant sensors. Such a self-organized environment is made possible by combining wavelet sets which are orthonormal, log-scale with linear orientation and have automatically generated membership functions. In earlier work we used Gabor wavelet filters, rank-based Hebbian learning and an exponential modulation function to encode textural information from images. Many different types of modulation are possible, but based on biological findings the exponential modulation function provided a good approximation of first spike coding of `integrate and fire' neurons. These types of Hebbian encoding schemes (e.g., exponential modulation, etc.) are useful for quick response and learning, provide several advantages over contemporary neural network learning approaches, and have been found to quantize data nonlinearly. By combining wavelets with Hebbian learning we can provide a real-time front-end for modeling an intelligent process, such as the autonomous control of agents in a simulated environment.
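
    The rank-order Hebbian scheme with exponential modulation can be caricatured as follows; the modulation constant, the winner-take-all update and the random stand-in for the wavelet filter responses are assumptions made for illustration, not the authors' exact formulation:

        import numpy as np

        rng = np.random.default_rng(1)
        n_inputs, n_units = 64, 8
        weights = rng.normal(0, 0.1, (n_units, n_inputs))
        eta, tau = 0.05, 5.0            # learning rate and modulation constant (assumed)

        for _ in range(200):
            x = np.abs(rng.normal(size=n_inputs))     # stand-in for wavelet filter responses
            # Rank inputs by response strength and weight them exponentially by rank,
            # approximating first-spike ordering of integrate-and-fire neurons.
            ranks = np.argsort(np.argsort(-x))
            modulated = np.exp(-ranks / tau)
            y = weights @ modulated
            winner = np.argmax(y)
            # Hebbian update for the most active unit, with simple weight normalisation.
            weights[winner] += eta * y[winner] * modulated
            weights[winner] /= np.linalg.norm(weights[winner])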

  8. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    The present thesis considers numerical modeling of activated sludge tanks on municipal wastewater treatment plants. Focus is aimed at integrated modeling, where the detailed microbiological model, the Activated Sludge Model 3 (ASM3), is combined with a detailed hydrodynamic model based on a numerical...... solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that are often occurring in activated sludge tanks are initially...... hydrofoil-shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements

  9. Affordances perspective and grammaticalization: Incorporation of language, environment and users in the model of semantic paths

    Directory of Open Access Journals (Sweden)

    Alexander Andrason

    2015-12-01

    The present paper demonstrates that insights from the affordances perspective can contribute to developing a more comprehensive model of grammaticalization. The authors argue that the grammaticalization process is afforded differently depending on the values of three contributing parameters: the factor (schematized as a qualitative-quantitative map or wave of a gram), the environment (understood as the structure of the stream along which the gram travels), and the actor (narrowed to certain cognitive-epistemological capacities of the users, in particular whether they are native speakers). By relating grammaticalization to these three parameters and by connecting it to the theory of optimization, the proposed model offers a better approximation to realistic cases of grammaticalization: the actor and environment are overtly incorporated into the model, and divergences from canonical grammaticalization paths are both tolerated and explicable.

  10. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    Science.gov (United States)

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. Previous experiences of the use of this notation for process modelling within Pathology, in Spain or in other countries, are not known. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Department of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology is presented using BPMN. The presented subprocesses are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals.

  11. Modeling grinding processes as micro processes

    African Journals Online (AJOL)

    eobe

    Industrial precision grinding processes are cylindrical, centerless and ... Several models have been proposed and used to study grinding ..... grinding force for the two cases were 9.07237 N/mm ..... International Journal of Machine Tools &.

  12. Parametric Modelling of As-Built Beam Framed Structure in BIM Environment

    Science.gov (United States)

    Yang, X.; Koehl, M.; Grussenmeyer, P.

    2017-02-01

    Complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, management of attribute and dynamic information, and results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform that makes it possible to integrate traditional geometry modelling, parametric element management and structural analysis. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements form a leaning and crossing beam frame. Since Autodesk Revit, as typical BIM software, provides the platform for parametric modelling and information management, an API plugin able to automatically create the parametric beam elements and link them together with strict relationships was developed. The plugin under development is introduced in the paper; it can obtain the parametric beam model from total station points and terrestrial laser scanning data via the Autodesk Revit API. The results show the potential of automating parametric modelling through interactive API development in a BIM environment. It also integrates the separate data processing steps and different platforms into the uniform Revit software.

  13. Secure Software Configuration Management Processes for nuclear safety software development environment

    International Nuclear Information System (INIS)

    Chou, I.-Hsin

    2011-01-01

    Highlights: The proposed method emphasizes platform-independent security processes. A hybrid process based on nuclear SCM and security regulations is proposed. Detailed descriptions and a Process Flow Diagram are useful for software developers. Abstract: The main difference between nuclear and generic software is that the risk factor is infinitely greater in nuclear software: if there is a malfunction in the safety system, it can result in significant economic loss, physical damage or threat to human life. However, secure software development environments have often been ignored in the nuclear industry. In response to the terrorist attacks on September 11, 2001, the US Nuclear Regulatory Commission (USNRC) revised the Regulatory Guide (RG 1.152-2006), 'Criteria for use of computers in safety systems of nuclear power plants', to provide specific security guidance throughout the software development life cycle. Software Configuration Management (SCM) is an essential discipline in the software development environment. SCM involves identifying configuration items, controlling changes to those items, and maintaining their integrity and traceability. For securing nuclear safety software, this paper proposes Secure SCM Processes (S2CMP), which infuse regulatory security requirements into the proposed SCM processes. Furthermore, a Process Flow Diagram (PFD) is adopted to describe S2CMP, which is intended to enhance the communication between regulators and developers.

  14. Advances in environmental radiation protection: re-thinking animal-environment interaction modelling for wildlife dose assessment

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Michael D. [School of Environment and Life Sciences, University of Salford, Manchester, M4 4WT (United Kingdom); Beresford, Nicholas A. [School of Environment and Life Sciences, University of Salford, Manchester, M4 4WT (United Kingdom); Centre for Ecology and Hydrology, Bailrigg, Lancaster, LA1 4AP (United Kingdom); Bradshaw, Clare [Department of Ecology, Environment and Plant Sciences, Stockholm University, 10691 Stockholm (Sweden); Gashchak, Sergey [Chornobyl Centre for Nuclear Safety, Radioactive Waste and Radioecology, 07100 Slavutych (Ukraine); Hinton, Thomas G. [Institut de Radioprotection et de Surete Nucleaire (IRSN), Centre de Cadarache, 13115 Saint Paul-lez-Durance (France)

    2014-07-01

    Current wildlife dose assessment models adopt simplistic approaches to the representation of animal-environment interaction. The simplest approaches are either to assume that environmental media (e.g. soil, sediment or water) are uniformly contaminated or to relate organism exposure to activity concentrations in media collected at the point of sampling of the animal. The external exposure of a reference organism is then estimated by defining the geometric relationship between the organism and the medium. For example, a reference organism within the soil would have a 4π exposure geometry and a reference organism on the soil would have a 2π exposure geometry. At best, the current modelling approaches recognise differences in media activity concentrations by calculating exposure for different areas of contamination and then estimating the fraction of time that an organism spends in each area. In other fields of pollution ecology, for example wildlife risk assessment for chemical pollution, more advanced approaches are being implemented to model animal-environment interaction and estimate exposure. These approaches include individual-based movement modelling and random walk modelling, and a variety of software tools have been developed to facilitate the implementation of these models. Although more advanced animal-environment interaction modelling approaches are available, it is questionable whether these should be adopted for use in environmental radiation protection. Would their adoption significantly reduce uncertainty within the assessment process and, if so, by how much? These questions are being addressed within the new TREE (TRansfer - Exposure - Effects) research programme funded by the United Kingdom Natural Environment Research Council (NERC) and within Working Group (WG) 8 of the International Atomic Energy Agency's MODARIA programme. MODARIA WG8 is reviewing some of the alternative approaches that have been developed for animal-environment
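
    As a caricature of why explicit movement modelling matters for exposure estimates, the sketch below accumulates external dose along a lattice random walk over a heterogeneous activity map and compares it with the uniform-contamination assumption; the map, step model and dose coefficient are all invented for illustration:

        import numpy as np

        rng = np.random.default_rng(3)
        grid = 50
        activity = rng.lognormal(mean=1.0, sigma=1.0, size=(grid, grid))  # invented activity map
        dose_coeff = 1e-4            # illustrative dose increment per unit activity per step

        pos = np.array([grid // 2, grid // 2])
        walked_dose = 0.0
        for _ in range(10000):
            step = rng.integers(-1, 2, size=2)          # unit random-walk step in x and y
            pos = np.clip(pos + step, 0, grid - 1)
            walked_dose += dose_coeff * activity[tuple(pos)]

        uniform_dose = dose_coeff * activity.mean() * 10000
        print(walked_dose, uniform_dose)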

  15. A Test of Two Alternative Cognitive Processing Models: Learning Styles and Dual Coding

    Science.gov (United States)

    Cuevas, Joshua; Dawson, Bryan L.

    2018-01-01

    This study tested two cognitive models, learning styles and dual coding, which make contradictory predictions about how learners process and retain visual and auditory information. Learning styles-based instructional practices are common in educational environments despite a questionable research base, while the use of dual coding is less…

  16. Large urban fire environment: trends and model city predictions

    International Nuclear Information System (INIS)

    Larson, D.A.; Small, R.D.

    1983-01-01

    The urban fire environment that would result from a megaton-yield nuclear weapon burst is considered. The dependence of temperatures and velocities on fire size, burning intensity, turbulence, and radiation is explored, and specific calculations for three model urban areas are presented. In all cases, high velocity fire winds are predicted. The model-city results show the influence of building density and urban sprawl on the fire environment. Additional calculations consider large-area fires with the burning intensity reduced in a blast-damaged urban center

  17. Chaotic home environment is associated with reduced infant processing speed under high task demands.

    Science.gov (United States)

    Tomalski, Przemysław; Marczuk, Karolina; Pisula, Ewa; Malinowska, Anna; Kawa, Rafał; Niedźwiecka, Alicja

    2017-08-01

    Early adversity has profound long-term consequences for child development across domains. The effects of early adversity on structural and functional brain development were shown for infants under 12 months of life. However, the causal mechanisms of these effects remain relatively unexplored. Using a visual habituation task we investigated whether chaotic home environment may affect processing speed in 5.5 month-old infants (n=71). We found detrimental effects of chaos on processing speed for complex but not for simple visual stimuli. No effects of socio-economic status on infant processing speed were found although the sample was predominantly middle class. Our results indicate that chaotic early environment may adversely affect processing speed in early infancy, but only when greater cognitive resources need to be deployed. The study highlights an attractive avenue for research on the mechanisms linking home environment with the development of attention control. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Applying nonlinear MODM model to supply chain management with quantity discount policy under complex fuzzy environment

    Directory of Open Access Journals (Sweden)

    Zhe Zhang

    2014-06-01

    Purpose: The aim of this paper is to deal with supply chain management (SCM) with a quantity discount policy under a complex fuzzy environment, characterized by bi-fuzzy variables. By taking into account the strategy and the process of decision making, a bi-fuzzy nonlinear multiple objective decision making (MODM) model is presented to solve the proposed problem. Design/methodology/approach: The bi-fuzzy variables in the MODM model are transformed into trapezoidal fuzzy variables by the decision makers' degrees of optimism ?1 and ?2, which are subsequently de-fuzzified by the expected value index. For solving the complex nonlinear model, a multi-objective adaptive particle swarm optimization algorithm (MO-APSO) is designed as the solution method. Findings: The proposed model and algorithm are applied to a typical example of an SCM problem to illustrate their effectiveness. Based on the sensitivity analysis of the results, the bi-fuzzy nonlinear MODM SCM model is shown to be sensitive to the possibility level ?1. Practical implications: The study focuses on SCM under a complex fuzzy environment, which has great practical significance. Therefore, the bi-fuzzy MODM model and MO-APSO can be further applied to SCM problems with quantity discount policies. Originality/value: The bi-fuzzy variable is employed in the nonlinear MODM model of SCM to characterize the hybrid uncertain environment, and this work is original. In addition, a hybrid crisp approach is proposed to transform the model into an equivalent crisp one by the decision makers' degrees of optimism and the expected value index. Since the MODM model considers the bi-fuzzy environment and the quantity discount policy, this paper has great practical significance.

  19. Modelization of cognition, activity and motivation as indicators for Interactive Learning Environment

    Directory of Open Access Journals (Sweden)

    Asmaa Darouich

    2017-06-01

    In Interactive Learning Environments (ILE), the cognitive activity and behavior of learners are at the center of researchers' concerns. Improving learning by combining these axes into a structure of indicators for a well-designed learning environment requires measuring the educational activity as part of the learning process. In this paper, we propose a mathematical modeling approach based on learners' actions to estimate cognitive activity, learning behavior and motivation, in accordance with a proposed course content structure. The cognitive indicator includes the study of knowledge, memory and reasoning, while the activity indicator aims to study effort, resistance and intensity. The results, obtained on a sample of students with different levels of education, suggest that the proposed approach yields a relation among all these indicators that is relatively reliable in terms of the cognitive system.

  20. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.
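    As a rough illustration of the convection-diffusion type of model mentioned above (a simplified sketch, not taken from the book), the following solves a one-dimensional convection-diffusion equation with a first-order reaction term along a column, using an explicit upwind/central finite-difference scheme; all parameter values are illustrative.

```python
import numpy as np

# dc/dt = -u dc/dz + D d2c/dz2 - k c   (convection, diffusion, reaction)
L, n = 1.0, 101                     # column height [m], grid points
dz = L / (n - 1)
u, D, k = 0.01, 1e-4, 0.05          # velocity [m/s], diffusivity [m2/s], rate [1/s]
dt = 0.4 * min(dz / u, dz**2 / (2 * D))   # stable explicit time step

c = np.zeros(n)
c_in = 1.0                          # inlet concentration

for _ in range(5000):
    c_new = c.copy()
    c_new[1:-1] = (c[1:-1]
                   - u * dt / dz * (c[1:-1] - c[:-2])                  # upwind convection
                   + D * dt / dz**2 * (c[2:] - 2 * c[1:-1] + c[:-2])   # central diffusion
                   - k * dt * c[1:-1])                                 # first-order reaction
    c_new[0] = c_in                 # Dirichlet inlet
    c_new[-1] = c_new[-2]           # zero-gradient outlet
    c = c_new

print("outlet concentration:", round(float(c[-1]), 4))
```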

  1. Modeling persistence of motion in a crowded environment: The diffusive limit of excluding velocity-jump processes

    Science.gov (United States)

    Gavagnin, Enrico; Yates, Christian A.

    2018-03-01

    Persistence of motion is the tendency of an object to maintain motion in a direction for short time scales without necessarily being biased in any direction in the long term. One of the most appropriate mathematical tools to study this behavior is an agent-based velocity-jump process. In the absence of agent-agent interaction, the mean-field continuum limit of the agent-based model (ABM) gives rise to the well-known hyperbolic telegraph equation. When agent-agent interaction is included in the ABM, a strictly advective system of partial differential equations (PDEs) can be derived at the population level. However, no diffusive limit of the ABM has been obtained from such a model. Connecting the microscopic behavior of the ABM to a diffusive macroscopic description is desirable, since it allows the exploration of a wider range of scenarios and establishes a direct connection with commonly used statistical tools of movement analysis. In order to connect the ABM to a diffusive PDE at the population level, we consider a generalization of the agent-based velocity-jump process on a two-dimensional lattice with three forms of agent interaction. This generalization allows us to take a diffusive limit and obtain a faithful population-level description. We investigate the properties of the model at both the individual and population levels, and we elucidate some of the models' key characteristic features. In particular, we show an intrinsic anisotropy inherent to the models and we find evidence of a spontaneous form of aggregation at both the micro- and macroscales.
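    The following is a minimal, illustrative sketch of an excluding velocity-jump process on a two-dimensional lattice (assumptions: periodic boundaries, at most one agent per site, random sequential updating); it is not the authors' exact model or its diffusive limit, and all parameter values are arbitrary.

```python
import numpy as np

# Each agent keeps its direction with probability p_persist, otherwise reorients
# at random (a velocity jump); a move is aborted if the target site is occupied.
rng = np.random.default_rng(0)
L, n_agents, p_persist, n_steps = 50, 500, 0.8, 200
moves = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

occupied = np.zeros((L, L), dtype=bool)
pos = []
while len(pos) < n_agents:                       # place agents without overlap
    x, y = rng.integers(0, L, size=2)
    if not occupied[x, y]:
        occupied[x, y] = True
        pos.append((x, y))
pos = np.array(pos)
direction = rng.integers(0, 4, size=n_agents)
disp = np.zeros((n_agents, 2))                   # unwrapped displacement per agent

for _ in range(n_steps):
    for i in rng.permutation(n_agents):          # random sequential update
        if rng.random() > p_persist:
            direction[i] = rng.integers(0, 4)    # reorientation (velocity jump)
        step = moves[direction[i]]
        target = (pos[i] + step) % L
        if not occupied[target[0], target[1]]:   # exclusion: move only if vacant
            occupied[pos[i][0], pos[i][1]] = False
            occupied[target[0], target[1]] = True
            pos[i] = target
            disp[i] += step

msd = np.mean(np.sum(disp ** 2, axis=1))
print("mean squared displacement after", n_steps, "steps:", round(float(msd), 1))
```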

  2. The Parental Environment Cluster Model of Child Neglect: An Integrative Conceptual Model.

    Science.gov (United States)

    Burke, Judith; Chandy, Joseph; Dannerbeck, Anne; Watt, J. Wilson

    1998-01-01

    Presents Parental Environment Cluster model of child neglect which identifies three clusters of factors involved in parents' neglectful behavior: (1) parenting skills and functions; (2) development and use of positive social support; and (3) resource availability and management skills. Model offers a focal theory for research, structure for…

  3. Method for Signal Processing of Electric Field Modulation Sensor in a Conductive Environment

    Directory of Open Access Journals (Sweden)

    O. I. Miseyk

    2015-01-01

    Full Text Available In investigating large waters and deep oceans, the most promising instruments are modulation sensors for measuring the electric field in a conducting environment in the very-low-frequency range, used in devices for autonomous or non-autonomous vertical sounding. When using sensors of this type it is necessary to solve the problem of extracting and measuring the modulated signal against the baseband noise. The work analyses hydrodynamic and electromagnetic noise at the input of a transducer with a "rotating" sensitive axis. From matching the measuring electrodes with the signal processing circuit, it is concluded that the proposed basic model of a transducer with a "rotating" sensitive axis is the most efficient in terms of extracting and measuring the modulated signal against the baseband noise. It is also shown that rotation of the electrodes is undesirable, since it gives rise to noise that changes synchronously with the transducer rotation (modulation) frequency, which complicates the subsequent separation of signal from noise during processing. The paper justifies the choice of demodulation of the output signal, namely synchronous demodulation using a low-pass filter with a cutoff frequency much lower than the carrier frequency, to provide an output signal in the range of very-low-frequency and DC electric fields. The paper offers an original circuit to process the signals taken from the modulation sensor with a "rotating" measurement base. In terms of sensitivity and measuring accuracy, this circuit has advantages over earlier known circuits for measuring electric fields in a conducting (marine) environment in the ultra-low-frequency range.
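    A minimal sketch of the synchronous demodulation idea described above: the sensor output is multiplied by a reference at the modulation frequency and low-pass filtered with a cutoff far below the carrier. The signal model and all numerical values are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs, f_m, duration = 2000.0, 50.0, 10.0        # sample rate [Hz], carrier [Hz], seconds
t = np.arange(0, duration, 1.0 / fs)

# Slowly varying (ULF) field to be recovered, carried at the modulation frequency.
field = 1e-6 * (1.0 + 0.3 * np.sin(2 * np.pi * 0.2 * t))
carrier = np.cos(2 * np.pi * f_m * t)
noise = 2e-7 * np.random.default_rng(1).standard_normal(t.size)
measured = field * carrier + noise            # idealised sensor output

reference = np.cos(2 * np.pi * f_m * t)       # synchronous reference at f_m
mixed = measured * reference                  # shifts the signal back to baseband

b, a = butter(4, 2.0 / (fs / 2))              # 2 Hz cutoff, far below the 50 Hz carrier
demodulated = 2.0 * filtfilt(b, a, mixed)     # factor 2 restores the amplitude

print("recovered mean field:", demodulated.mean())
```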

  4. Driver's Behavior and Decision-Making Optimization Model in Mixed Traffic Environment

    Directory of Open Access Journals (Sweden)

    Xiaoyuan Wang

    2015-02-01

    Full Text Available Driving is an information-processing procedure that goes on unceasingly. For traffic flow theory, it is very important to study drivers' information-processing patterns in a mixed traffic environment. In this paper, the bicycle is regarded as a kind of information source for vehicle drivers, and the "conflict point method" is put forward to analyze the influence of bicycles on driving behavior. The "conflict" is translated into a special kind of car-following or lane-changing process. Furthermore, the computer clocked scan step length is reduced to 0.1 s in order to scan and analyze the dynamic (and static) information that influences driving behavior more precisely. The driver's decision-making process is described through information fusion based on duality contrast and fuzzy optimization theory. Model testing and verification show that the simulation results obtained with the "conflict point method" are basically consistent with the field data. It is feasible to imitate driving behavior and the driver's information fusion process with the proposed methods, and the decision-making optimization process can be described more accurately through the precision clocked scan strategy. The study provides a foundation for further research on the multi-resource information fusion process of driving behavior.

  5. A cooperative model for IS security risk management in distributed environment.

    Science.gov (United States)

    Feng, Nan; Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among interconnected IS in a distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. An actual case study illustrates the cooperative model presented in this paper and how it can be exploited to manage distributed IS security risk effectively.
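    As a minimal sketch of the Bayesian-network idea (the structure and all probabilities below are hypothetical, not the paper's model), a small discrete network can propagate security information shared by an allied organization into a predicted risk level:

```python
# Hypothetical three-node chain: partner_compromised -> attack_likelihood -> risk_level.
p_partner = {"yes": 0.1, "no": 0.9}                      # P(partner compromised)

p_attack = {"yes": {"high": 0.6, "low": 0.4},            # P(attack | partner)
            "no":  {"high": 0.2, "low": 0.8}}

p_risk = {"high": {"elevated": 0.7, "normal": 0.3},      # P(risk | attack)
          "low":  {"elevated": 0.1, "normal": 0.9}}

def risk_distribution(partner_state=None):
    """Marginal P(risk level), optionally with hard evidence on the partner node."""
    states = {partner_state: 1.0} if partner_state else p_partner
    dist = {"elevated": 0.0, "normal": 0.0}
    for c, pc in states.items():
        for a in ("high", "low"):
            for r in dist:
                dist[r] += pc * p_attack[c][a] * p_risk[a][r]
    return dist

# Security information from the allied organization acts as evidence.
print("prior risk:               ", risk_distribution())
print("partner known compromised:", risk_distribution("yes"))
```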

  6. What makes process models understandable?

    NARCIS (Netherlands)

    Mendling, J.; Reijers, H.A.; Cardoso, J.; Alonso, G.; Dadam, P.; Rosemann, M.

    2007-01-01

    Although formal and informal quality aspects are of significant importance to business process modeling, there is only little empirical work reported on process model quality and its impact factors. In this paper we investigate understandability as a proxy for the quality of process models and focus

  7. eTOXlab, an open source modeling framework for implementing predictive models in production environments.

    Science.gov (United States)

    Carrió, Pau; López, Oriol; Sanz, Ferran; Pastor, Manuel

    2015-01-01

    Computational models based on Quantitative Structure-Activity Relationship (QSAR) methodologies are widely used tools for predicting the biological properties of new compounds. In many instances, such models are used routinely in industry (e.g. the food, cosmetic or pharmaceutical industries) for the early assessment of the biological properties of new compounds. However, most of the tools currently available for developing QSAR models are not well suited to supporting the whole QSAR model life cycle in production environments. We have developed eTOXlab, an open source modeling framework designed to be used at the core of a self-contained virtual machine that can be easily deployed in production environments, providing predictions as web services. eTOXlab consists of a collection of object-oriented Python modules with methods mapping common tasks of standard modeling workflows. This framework allows building and validating QSAR models as well as predicting the properties of new compounds using either a command line interface or a graphical user interface (GUI). Simple models can be easily generated by setting a few parameters, while more complex models can be implemented by overriding pieces of the original source code. eTOXlab benefits from the object-oriented capabilities of Python to provide high flexibility: any model implemented using eTOXlab inherits the features implemented in the parent model, like common tools and services or the automatic exposure of the models as prediction web services. The particular eTOXlab architecture as a self-contained, portable prediction engine allows building models with confidential information within corporate facilities, which can be safely exported and used for prediction without disclosing the structures of the training series. The software presented here provides full support to the specific needs of users that want to develop, use and maintain predictive models in corporate environments. The technologies used by e
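    The object-oriented inheritance pattern described above can be sketched as follows; the class and method names are hypothetical illustrations, not the actual eTOXlab API, and the "descriptors" are deliberately trivial placeholders.

```python
import numpy as np

class BaseQSARModel:
    """Parent model: descriptor calculation, fitting and prediction workflow."""

    def compute_descriptors(self, smiles_list):
        # Placeholder descriptor: just the SMILES string length per compound.
        return np.array([[len(s)] for s in smiles_list], dtype=float)

    def fit(self, smiles_list, y):
        X = self.compute_descriptors(smiles_list)
        X1 = np.hstack([X, np.ones((X.shape[0], 1))])     # add intercept column
        self.coef_, *_ = np.linalg.lstsq(X1, np.asarray(y, float), rcond=None)
        return self

    def predict(self, smiles_list):
        X = self.compute_descriptors(smiles_list)
        X1 = np.hstack([X, np.ones((X.shape[0], 1))])
        return X1 @ self.coef_

class RingAwareModel(BaseQSARModel):
    """Child model: overrides only the descriptor block, inherits fit/predict."""

    def compute_descriptors(self, smiles_list):
        base = super().compute_descriptors(smiles_list)
        # Crude ring-closure proxy: count of the digit '1' in each SMILES string.
        rings = np.array([[s.count("1")] for s in smiles_list], dtype=float)
        return np.hstack([base, rings])

train = ["CCO", "c1ccccc1", "CC(=O)O", "c1ccccc1O"]
activity = [0.2, 1.1, 0.4, 1.3]
model = RingAwareModel().fit(train, activity)
print(model.predict(["CCN", "c1ccccc1N"]))
```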

  8. Modelling the near-Earth space environment using LDEF data

    Science.gov (United States)

    Atkinson, Dale R.; Coombs, Cassandra R.; Crowell, Lawrence B.; Watts, Alan J.

    1992-01-01

    Near-Earth space is a dynamic environment that is currently not well understood. In an effort to better characterize the near-Earth space environment, this study compares the results of actual impact crater measurement data and the Space Environment (SPENV) Program developed in-house at POD, to theoretical models established by Kessler (NASA TM-100471, 1987) and Cour-Palais (NASA SP-8013, 1969). With the continuing escalation of debris there will exist a definite hazard to unmanned satellites as well as manned operations. Since the smaller non-trackable debris has the highest impact rate, it is clearly necessary to establish the true debris environment for all particle sizes. Proper comprehension of the near-Earth space environment and its origin will permit improvement in spacecraft design and mission planning, thereby reducing potential disasters and extreme costs. Results of this study directly relate to the survivability of future spacecraft and satellites that are to travel through and/or reside in low Earth orbit (LEO). More specifically, these data are being used to: (1) characterize the effects of the LEO micrometeoroid and debris environment on satellite designs and components; (2) update the current theoretical micrometeoroid and debris models for LEO; (3) help assess the survivability of spacecraft and satellites that must travel through or reside in LEO, and the probability of their collision with already resident debris; and (4) help define and evaluate future debris mitigation and disposal methods. Combined model predictions match relatively well with the LDEF data for impact craters larger than approximately 0.05 cm diameter; however, for smaller impact craters, the combined predictions diverge and do not reflect the sporadic clouds identified by the Interplanetary Dust Experiment (IDE) aboard LDEF. The divergences cannot currently be explained by the authors or model developers. The mean flux of small craters (approximately 0.05 cm diameter) is

  9. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  10. Implications of Building Information Modeling on Interior Design Education: The Impact on Teaching Design Processes

    Directory of Open Access Journals (Sweden)

    Amy Roehl, MFA

    2013-06-01

    Full Text Available Currently, major shifts are occurring in design processes, affecting business practices for industries involved with designing and delivering the built environment. These changing conditions are a direct result of industry adoption of relatively new technologies called BIM, or Building Information Modeling. This review of literature examines the implications of these changing processes for interior design education.

  11. Modelling the morphodynamics and co-evolution of coast and estuarine environments

    Science.gov (United States)

    Morris, Chloe; Coulthard, Tom; Parsons, Daniel R.; Manson, Susan; Barkwith, Andrew

    2017-04-01

    The morphodynamics of coast and estuarine environments are known to be sensitive to environmental change and sea-level rise. However, whilst these systems have received considerable individual research attention, how they interact and co-evolve is relatively understudied. These systems are intrinsically linked, and it is therefore advantageous to study them holistically in order to build a more comprehensive understanding of their behaviour and to inform sustainable management over the long term. Complex environments such as these are often studied using numerical modelling techniques. Owing to the limited research in this area, existing models are currently not capable of simulating dynamic coast-estuarine interactions. A new model is being developed by coupling the one-line Coastline Evolution Model (CEM) with CAESAR-Lisflood (C-L), a hydrodynamic Landscape Evolution Model. It is intended that the eventual model be used to advance the understanding of these systems and how they may evolve over the mid to long term in response to climate change. In the UK, the Holderness Coast, Humber Estuary and Spurn Point system offers a diverse and complex case study for this research. Holderness is one of the fastest eroding coastlines in Europe, and research suggests that the large volumes of material removed from its cliffs are responsible for the formation of the Spurn Point feature and for the Holocene infilling of the Humber Estuary. Marine, fluvial and coastal processes are continually reshaping this system, and over the next century it is predicted that climate change could lead to increased erosion along the coast and increased supply of material to the Humber Estuary and Spurn Point. How this manifests will be hugely influential to the future morphology of these systems and the existence of Spurn Point. Progress to date includes a new version of the CEM that has been prepared for integration into C-L and includes an improved graphical user interface and more complex

  12. Engineered Barrier System: Physical and Chemical Environment

    Energy Technology Data Exchange (ETDEWEB)

    P. Dixon

    2004-04-26

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan "Technical Work Plan for: In-Drift Geochemistry Modeling" (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  13. An u-Service Model Based on a Smart Phone for Urban Computing Environments

    Science.gov (United States)

    Cho, Yongyun; Yoe, Hyun

    In urban computing environments, all services should be based on the interaction between humans and the environments around them, which takes place frequently and ordinarily at home and in the office. This paper proposes a u-service model based on a smart phone for urban computing environments. The suggested service model includes a context-aware and personalized service scenario development environment that can instantly describe a user's u-service demand or situation information with smart devices. To this end, the architecture of the suggested service model consists of a graphical service editing environment for smart devices, a u-service platform, and an infrastructure with sensors and WSN/USN. The graphic editor expresses contexts as execution conditions of a new service through an ontology-based context model. The service platform handles the service scenario according to contexts. With the suggested service model, a user in an urban computing environment can quickly and easily create a u-service or a new service using smart devices.

  14. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    OpenAIRE

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    1999-01-01

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  15. PARAMETER ESTIMATION AND MODEL SELECTION FOR INDOOR ENVIRONMENTS BASED ON SPARSE OBSERVATIONS

    Directory of Open Access Journals (Sweden)

    Y. Dehbi

    2017-09-01

    Full Text Available This paper presents a novel method for the parameter estimation and model selection for the reconstruction of indoor environments based on sparse observations. While most approaches for the reconstruction of indoor models rely on dense observations, we predict scenes of the interior with high accuracy in the absence of indoor measurements. We use a model-based top-down approach and incorporate strong but profound prior knowledge. The latter includes probability density functions for model parameters and sparse observations such as room areas and the building footprint. The floorplan model is characterized by linear and bi-linear relations with discrete and continuous parameters. We focus on the stochastic estimation of model parameters based on a topological model derived by combinatorial reasoning in a first step. A Gauss-Markov model is applied for estimation and simulation of the model parameters. Symmetries are represented and exploited during the estimation process. Background knowledge as well as observations are incorporated in a maximum likelihood estimation and model selection is performed with AIC/BIC. The likelihood is also used for the detection and correction of potential errors in the topological model. Estimation results are presented and discussed.

  16. Parameter Estimation and Model Selection for Indoor Environments Based on Sparse Observations

    Science.gov (United States)

    Dehbi, Y.; Loch-Dehbi, S.; Plümer, L.

    2017-09-01

    This paper presents a novel method for the parameter estimation and model selection for the reconstruction of indoor environments based on sparse observations. While most approaches for the reconstruction of indoor models rely on dense observations, we predict scenes of the interior with high accuracy in the absence of indoor measurements. We use a model-based top-down approach and incorporate strong but profound prior knowledge. The latter includes probability density functions for model parameters and sparse observations such as room areas and the building footprint. The floorplan model is characterized by linear and bi-linear relations with discrete and continuous parameters. We focus on the stochastic estimation of model parameters based on a topological model derived by combinatorial reasoning in a first step. A Gauss-Markov model is applied for estimation and simulation of the model parameters. Symmetries are represented and exploited during the estimation process. Background knowledge as well as observations are incorporated in a maximum likelihood estimation and model selection is performed with AIC/BIC. The likelihood is also used for the detection and correction of potential errors in the topological model. Estimation results are presented and discussed.
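    A minimal sketch of the estimation and selection idea (illustrative only, not the authors' implementation): sparse observations such as room areas and the footprint length are related linearly to unknown room widths, the widths are estimated by least squares in a Gauss-Markov model, and the fit can be scored with AIC/BIC to compare competing floorplan hypotheses.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothesis: three rooms of unknown widths w1..w3 along a corridor of known
# depth d; observations are the total footprint length and each room's area.
d = 4.0
true_w = np.array([3.0, 5.0, 4.0])
obs = np.concatenate([[true_w.sum()], d * true_w]) + rng.normal(0, 0.05, 4)

# Linear observation model: obs = A w + noise
A = np.vstack([np.ones((1, 3)), d * np.eye(3)])
w_hat, *_ = np.linalg.lstsq(A, obs, rcond=None)

n, k = len(obs), len(w_hat)
rss = float(np.sum((obs - A @ w_hat) ** 2))
aic = n * np.log(rss / n) + 2 * k          # Gaussian-likelihood AIC
bic = n * np.log(rss / n) + k * np.log(n)  # Gaussian-likelihood BIC

print("estimated widths:", np.round(w_hat, 2))
print("AIC:", round(aic, 2), "BIC:", round(bic, 2))
```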

  17. Exploring the Potential of Aerial Photogrammetry for 3d Modelling of High-Alpine Environments

    Science.gov (United States)

    Legat, K.; Moe, K.; Poli, D.; Bollmann, E.

    2016-03-01

    High-alpine areas are subject to rapid topographic changes, mainly caused by natural processes like glacial retreat and other geomorphological processes, and also due to anthropogenic interventions like construction of slopes and infrastructure in skiing resorts. Consequently, the demand for highly accurate digital terrain models (DTMs) in alpine environments has arisen. Public administrations often have dedicated resources for the regular monitoring of glaciers and natural hazard processes. In case of glaciers, traditional monitoring encompasses in-situ measurements of area and length and the estimation of volume and mass changes. Next to field measurements, data for such monitoring programs can be derived from DTMs and digital ortho photos (DOPs). Skiing resorts, on the other hand, require DTMs as input for planning and - more recently - for RTK-GNSS supported ski-slope grooming. Although different in scope, the demand of both user groups is similar: high-quality and up-to-date terrain data for extended areas often characterised by difficult accessibility and large elevation ranges. Over the last two decades, airborne laser scanning (ALS) has replaced photogrammetric approaches as state-of-the-art technology for the acquisition of high-resolution DTMs also in alpine environments. Reasons include the higher productivity compared to (manual) stereo-photogrammetric measurements, canopy-penetration capability, and limitations of photo measurements on sparsely textured surfaces like snow or ice. Nevertheless, the last few years have shown strong technological advances in the field of aerial camera technology, image processing and photogrammetric software which led to new possibilities for image-based DTM generation even in alpine terrain. At Vermessung AVT, an Austrian-based surveying company, and its subsidiary Terra Messflug, very promising results have been achieved for various projects in high-alpine environments, using images acquired by large-format digital

  18. On the upscaling of process-based models in deltaic applications

    Science.gov (United States)

    Li, L.; Storms, J. E. A.; Walstra, D. J. R.

    2018-03-01

    Process-based numerical models are increasingly used to study the evolution of marine and terrestrial depositional environments. Whilst a detailed description of small-scale processes provides an accurate representation of reality, application on geological timescales is restrained by the associated increase in computational time. In order to reduce the computational time, a number of acceleration methods are combined and evaluated for a schematic supply-driven delta (static base level) and an accommodation-driven delta (variable base level). The performance of the combined acceleration methods is evaluated by comparing the morphological indicators such as distributary channel networking and delta volumes derived from the model predictions for various levels of acceleration. The results of the accelerated models are compared to the outcomes from a series of simulations to capture autogenic variability. Autogenic variability is quantified by re-running identical models on an initial bathymetry with 1 cm added noise. The overall results show that the variability of the accelerated models fall within the autogenic variability range, suggesting that the application of acceleration methods does not significantly affect the simulated delta evolution. The Time-scale compression method (the acceleration method introduced in this paper) results in an increased computational efficiency of 75% without adversely affecting the simulated delta evolution compared to a base case. The combination of the Time-scale compression method with the existing acceleration methods has the potential to extend the application range of process-based models towards geologic timescales.

  19. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    are affected (in a positive or negative way) by the presence of the other enzymes and compounds in the media. In this thesis the concept of the multi-enzyme in-pot term is adopted for processes that are carried out by the combination of enzymes in a single reactor and implemented at pilot or industrial scale... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases the efficiency of the model development process with respect to time and resources needed (fast and effective... In this way the model parameters that drive the main dynamic behavior can be identified, and thus a better understanding of this type of processes obtained. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid

  20. Modelling and prediction of radionuclide migration from shallow, subgrade nuclear waste facilities in arid environments

    International Nuclear Information System (INIS)

    Smith, A.; Ward, A.; Geldenhuis, S.

    1986-01-01

    Over the past fifteen years, prodigious efforts and significant advances have been made in methods for predicting the migration rate of dissolved species in aqueous systems. Despite such work, there remain formidable obstacles to the prediction of solute transport in the unsaturated zone over the long time periods necessarily associated with radionuclide-bearing wastes. The objective of this paper is to consider the methods, issues and problems in the use of predictive solute transport models for radionuclide migration from nuclear waste disposal in arid environments, if and when engineered containment of the waste fails. Having considered the ability for long-term solute prediction in a number of geological environments, the advantages of a disposal environment in which the solute transport process is diffusion controlled will be described.

  1. Process model repositories and PNML

    NARCIS (Netherlands)

    Hee, van K.M.; Post, R.D.J.; Somers, L.J.A.M.; Werf, van der J.M.E.M.; Kindler, E.

    2004-01-01

    Bringing system and process models together in repositories facilitates the interchange of model information between modelling tools, and allows the combination and interlinking of complementary models. Petriweb is a web application for managing such repositories. It supports hierarchical process

  2. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  3. A Comprehensive Model of the Meteoroids Environment Around Mercury

    Science.gov (United States)

    Pokorny, P.; Sarantos, M.; Janches, D.

    2018-05-01

    We present a comprehensive dynamical model for the meteoroid environment around Mercury, comprising meteoroids originating from asteroids and from short- and long-period comets. Our model is fully calibrated and provides predictions for different values of Mercury's true anomaly angle (TAA).

  4. Atomistic Modeling of Corrosion Events at the Interface between a Metal and Its Environment

    Directory of Open Access Journals (Sweden)

    Christopher D. Taylor

    2012-01-01

    Full Text Available Atomistic simulation is a powerful tool for probing the structure and properties of materials and the nature of chemical reactions. Corrosion is a complex process that involves chemical reactions occurring at the interface between a material and its environment and is, therefore, highly suited to study by atomistic modeling techniques. In this paper, the complex nature of corrosion processes and mechanisms is briefly reviewed. Various atomistic methods for exploring corrosion mechanisms are then described, and recent applications in the literature surveyed. Several instances of the application of atomistic modeling to corrosion science are then reviewed in detail, including studies of the metal-water interface, the reaction of water on electrified metallic interfaces, the dissolution of metal atoms from metallic surfaces, and the role of competitive adsorption in controlling the chemical nature and structure of a metallic surface. Some perspectives are then given concerning the future of atomistic modeling in the field of corrosion science.

  5. Design Process for Online Websites Created for Teaching Turkish as a Foreign Language in Web Based Environments

    Science.gov (United States)

    Türker, Fatih Mehmet

    2016-01-01

    In today's world, where online learning environments have increased their efficiency in education and training, the design of the websites prepared for education and training purposes has become an important process. This study is about the teaching process of the online learning environments created to teach Turkish in web based environments, and…

  6. The use of LCA for modelling sustainability and environmental impact of manufacturing processes

    Energy Technology Data Exchange (ETDEWEB)

    Culaba, A.; Purvis, M. [Portsmouth Univ. (United Kingdom). Dept. of Mechanical and Manufacturing Engineering

    1995-12-31

    Most industries rely significantly on natural resources for raw materials and energy requirements. As a consequence of manufacturing activities, various pollutants are generated in the process. While effects on the environment can be detrimental, wastes and emissions account for a high percentage loss in the overall material balance. Unless these unnecessary losses are minimized and recovered, the environment would continue to be disadvantaged and long-term supply of raw materials and energy would likewise be affected. The key to the analysis of such problems concerns generalised procedures for the modelling of the sustainable use of resources in manufacturing processes and the development of associated sustainability criteria. This requires identifying the various aspects of manufacturing from the time the raw materials are extracted until they have been processed into products and then used or consumed and finally disposed of. The use of life cycle assessment (LCA) methodology encompasses these analyses and that of the identification of environmental effects associated with every stage of the manufacturing process. The presentation concludes that LCA is a very useful and effective tool in providing planners, legislators and decision-makers with the necessary information on the probable impacts of manufacture on the environment as well as on underlying legislation, ecological and health standards, and emission limits. (author)

  7. The use of LCA for modelling sustainability and environmental impact of manufacturing processes

    Energy Technology Data Exchange (ETDEWEB)

    Culaba, A; Purvis, M [Portsmouth Univ. (United Kingdom). Dept. of Mechanical and Manufacturing Engineering

    1996-12-31

    Most industries rely significantly on natural resources for raw materials and energy requirements. As a consequence of manufacturing activities, various pollutants are generated in the process. While effects on the environment can be detrimental, wastes and emissions account for a high percentage loss in the overall material balance. Unless these unnecessary losses are minimized and recovered, the environment would continue to be disadvantaged and long-term supply of raw materials and energy would likewise be affected. The key to the analysis of such problems concerns generalised procedures for the modelling of the sustainable use of resources in manufacturing processes and the development of associated sustainability criteria. This requires identifying the various aspects of manufacturing from the time the raw materials are extracted until they have been processed into products and then used or consumed and finally disposed of. The use of life cycle assessment (LCA) methodology encompasses these analyses and that of the identification of environmental effects associated with every stage of the manufacturing process. The presentation concludes that LCA is a very useful and effective tool in providing planners, legislators and decision-makers with the necessary information on the probable impacts of manufacture on the environment as well as on underlying legislation, ecological and health standards, and emission limits. (author)

  8. Modeling Dynamic Food Choice Processes to Understand Dietary Intervention Effects.

    Science.gov (United States)

    Marcum, Christopher Steven; Goldring, Megan R; McBride, Colleen M; Persky, Susan

    2018-02-17

    Meal construction is largely governed by nonconscious and habit-based processes that can be represented as a collection of individual, micro-level food choices that eventually give rise to a final plate. Despite this, dietary behavior intervention research rarely captures these micro-level food choice processes, instead measuring outcomes at aggregated levels. This is due in part to a dearth of analytic techniques to model these dynamic time-series events. The current article addresses this limitation by applying a generalization of the relational event framework to model micro-level food choice behavior following an educational intervention. Relational event modeling was used to model the food choices that 221 mothers made for their child following receipt of an information-based intervention. Participants were randomized to receive either (a) control information; (b) childhood obesity risk information; or (c) childhood obesity risk information plus a personalized family history-based risk estimate for their child. Participants then made food choices for their child in a virtual reality-based food buffet simulation. Micro-level aspects of the built environment, such as the ordering of each food in the buffet, were influential. Other dynamic processes such as choice inertia also influenced food selection. Among participants receiving the strongest intervention condition, choice inertia decreased and the overall rate of food selection increased. Modeling food selection processes can elucidate the points at which interventions exert their influence. Researchers can leverage these findings to gain insight into nonconscious and uncontrollable aspects of food selection that influence dietary outcomes, which can ultimately improve the design of dietary interventions.

  9. Optimization of multi-environment trials for genomic selection based on crop models.

    Science.gov (United States)

    Rincent, R; Kuhn, E; Monod, H; Oury, F-X; Rousset, M; Allard, V; Le Gouis, J

    2017-08-01

    We propose a statistical criterion to optimize multi-environment trials in order to predict genotype × environment interactions more efficiently, by combining crop growth models and genomic selection models. Genotype × environment interactions (GEI) are common in plant multi-environment trials (METs). In this context, models developed for genomic selection (GS), which refers to the use of genome-wide information for predicting breeding values of selection candidates, need to be adapted. One promising way to increase prediction accuracy in various environments is to combine ecophysiological and genetic modelling through crop growth models (CGM) incorporating genetic parameters. The efficiency of this approach relies on the quality of the parameter estimates, which depends on the environments composing the MET used for calibration. The objective of this study was to determine a method to optimize the set of environments composing the MET for estimating genetic parameters in this context. A criterion called OptiMET was defined for this purpose and was evaluated on simulated and real data, with the example of wheat phenology. The MET defined with OptiMET allowed the genetic parameters to be estimated with lower error, leading to higher QTL detection power and higher prediction accuracies. METs defined with OptiMET were on average more efficient, in terms of the quality of the parameter estimates, than random METs composed of twice as many environments. OptiMET is thus a valuable tool for determining optimal experimental conditions to best exploit METs and the phenotyping tools that are currently being developed.

  10. Study of thermal environment in Jingjintang urban agglomeration based on WRF model and Landsat data

    International Nuclear Information System (INIS)

    Huang, Q N; Cao, Z Q; Guo, H D; Xi, X H; Li, X W

    2014-01-01

    In recent decades, unprecedented urban expansion has taken place in developing countries, resulting in the emergence of megacities and urban agglomerations. Many countries are highly concerned about the urban environmental issues associated with urbanization, such as greenhouse gas emissions and the urban heat island (UHI) phenomenon. Generally, the thermal environment is monitored with remote sensing satellite data, an approach limited by weather and by the satellite revisit cycle. Another approach relies on numerical simulation based on models. In this study, the two means are combined to study the thermal environment of the Jingjintang urban agglomeration. High-temperature episodes in the study area in 2009 and in the 1990s are simulated using WRF (the Weather Research and Forecasting model) coupled with UCM (Urban Canopy Model), together with the urban impervious surface estimated from Landsat-5 TM data using a support vector machine. Results show that the trend of the simulated 2-m air temperature agrees with the observed air temperature, and that the simulations effectively capture the differences in air temperature and land surface temperature (LST) caused by urbanization. The UHI effect at night is stronger than that in the day, and the maximum LST difference reaches 8–10°C for newly built-up areas at night. The method provided in this research can be used to analyze the impacts of urbanization on the urban thermal environment, and it also provides a means of thermal environment monitoring and prediction that will benefit the capacity to cope with extreme events.
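    The support-vector-machine step for mapping impervious surface can be sketched as follows with synthetic band reflectances; the class spectra, band count and classifier parameters are assumptions, not the study's Landsat-5 TM data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2000
# Fake 6-band reflectance vectors for two land-cover classes.
impervious = rng.normal(loc=[0.12, 0.15, 0.18, 0.22, 0.30, 0.28], scale=0.03, size=(n, 6))
pervious   = rng.normal(loc=[0.05, 0.08, 0.07, 0.35, 0.22, 0.12], scale=0.03, size=(n, 6))

X = np.vstack([impervious, pervious])
y = np.concatenate([np.ones(n), np.zeros(n)])            # 1 = impervious pixel

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)

print("hold-out accuracy:", round(clf.score(X_te, y_te), 3))
print("impervious fraction in test set:", round(clf.predict(X_te).mean(), 3))
```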

  11. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    Science.gov (United States)

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding of, systematically appraise, and identify areas of improvement in a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase - parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  12. [Effect of solution environments on ceramic membrane microfiltration of model system of Chinese medicines].

    Science.gov (United States)

    Zhang, Lianjun; Lu, Jin; Le, Kang; Fu, Tingming; Guo, Liwei

    2010-07-01

    To investigate the effect of different solution environments on the ceramic membrane microfiltration of a model system of Chinese medicines. Taking the binary soybean protein-berberine system as the research object, and using flux, transmittance of berberine and the trapping rate of protein as indexes, the effect of different solution environments on the membrane process was investigated. When the concentration of soybean protein was below 1 g·L(-1), the membrane flux was at its minimum, and the trapping of berberine decreased slightly as the concentration increased. When the pH was 4, the flux was at its maximum, the trapping rate of protein was 99%, and the transmittance of berberine reached above 60%. The efficiency of membrane separation can be improved by optimizing the solution environment of the water extraction of Chinese medicines; separation is most efficient when the pH is adjusted to the isoelectric point of the proteins, since proteins are the main pollutant in the aqueous solution.

  13. EIT forward problem parallel simulation environment with anisotropic tissue and realistic electrode models.

    Science.gov (United States)

    De Marco, Tommaso; Ries, Florian; Guermandi, Marco; Guerrieri, Roberto

    2012-05-01

    Electrical impedance tomography (EIT) is an imaging technology based on impedance measurements. To retrieve meaningful insights from these measurements, EIT relies on detailed knowledge of the underlying electrical properties of the body. This is obtained from numerical models of current flows therein. The nonhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a tradeoff between physical accuracy and technical feasibility, which at present severely limits the capabilities of EIT. This work presents a complete algorithmic flow for an accurate EIT modeling environment featuring high anatomical fidelity with a spatial resolution equal to that provided by an MRI and a novel realistic complete electrode model implementation. At the same time, we demonstrate that current graphics processing unit (GPU)-based platforms provide enough computational power that a domain discretized with five million voxels can be numerically modeled in about 30 s.

  14. Activity Recognition Using Hybrid Generative/Discriminative Models on Home Environments Using Binary Sensors

    Directory of Open Access Journals (Sweden)

    Araceli Sanchis

    2013-04-01

    Full Text Available Activities of daily living are good indicators of elderly health status, and activity recognition in smart environments is a well-known problem that has been previously addressed by several studies. In this paper, we describe the use of two powerful machine learning schemes, ANN (Artificial Neural Network) and SVM (Support Vector Machines), within the framework of HMM (Hidden Markov Model), in order to tackle the task of activity recognition in a home setting. The output scores of the discriminative models, after processing, are used as observation probabilities of the hybrid approach. We evaluate our approach by comparing these hybrid models with other classical activity recognition methods using five real datasets. We show how the hybrid models achieve significantly better recognition performance, with significance level p < 0.05, proving that the hybrid approach is better suited for the addressed domain.
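    A minimal sketch of the hybrid scheme described above: per-frame scores from a discriminative classifier are used as observation probabilities of an HMM whose transition matrix encodes activity persistence, and Viterbi decoding recovers the activity sequence. The scores, transition values and number of activities below are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
n_states, T = 3, 8                               # activities, time frames

# Synthetic per-frame posteriors P(activity | sensor frame) from the classifier.
scores = rng.dirichlet(alpha=np.ones(n_states), size=T)

pi = np.full(n_states, 1.0 / n_states)           # initial state distribution
A = np.full((n_states, n_states), 0.1)
np.fill_diagonal(A, 0.8)                         # activities tend to persist

def viterbi(pi, A, emissions):
    """Most likely state sequence given emission probabilities per frame."""
    T, S = emissions.shape
    delta = np.zeros((T, S))                     # best log-probability so far
    psi = np.zeros((T, S), dtype=int)            # back-pointers
    delta[0] = np.log(pi) + np.log(emissions[0])
    for t in range(1, T):
        trans = delta[t - 1][:, None] + np.log(A)
        psi[t] = trans.argmax(axis=0)
        delta[t] = trans.max(axis=0) + np.log(emissions[t])
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

print("decoded activity sequence:", viterbi(pi, A, scores))
```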

  15. Statistical Multipath Model Based on Experimental GNSS Data in Static Urban Canyon Environment

    Directory of Open Access Journals (Sweden)

    Yuze Wang

    2018-04-01

    Full Text Available A deep understanding of multipath characteristics is essential to design signal simulators and receivers in global navigation satellite system applications. As new constellations are deployed and more applications occur in the urban environment, the statistical multipath models of navigation signals need further study. In this paper, we present statistical distribution models of multipath time delay, multipath power attenuation, and multipath fading frequency based on experimental data in the urban canyon environment. The raw multipath characteristics are obtained by processing real navigation signals to study their statistical distributions. Fitting the statistical data shows that the probability distribution of time delay follows a gamma distribution, which is related to the waiting time of Poisson-distributed events. The fading frequency follows an exponential distribution, and the mean multipath power attenuation decreases linearly with increasing time delay. In addition, the detailed statistical characteristics for satellites at different elevations and in different orbits are studied, and the parameters of each distribution are quite different. The research results give useful guidance for navigation simulator and receiver designers.
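    The reported distribution families can be used to draw synthetic multipath parameters, for example in a signal simulator; the shape, scale and attenuation coefficients below are illustrative assumptions, not the fitted values from the paper.

```python
import numpy as np

rng = np.random.default_rng(11)
n_paths = 1000

delay_ns = rng.gamma(shape=2.0, scale=60.0, size=n_paths)   # excess delay [ns], gamma
fading_hz = rng.exponential(scale=0.5, size=n_paths)        # fading frequency [Hz], exponential

# Mean power attenuation modelled as a linear function of the excess delay
# (intercept/slope and scatter are illustrative assumptions).
atten_db = 20.0 - 0.02 * delay_ns + rng.normal(0.0, 1.0, n_paths)

print("mean delay [ns]:", round(delay_ns.mean(), 1))
print("mean fading frequency [Hz]:", round(fading_hz.mean(), 2))
print("mean attenuation [dB]:", round(atten_db.mean(), 1))
```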

  16. The Lived Environment Life Quality Model for institutionalized people with dementia.

    Science.gov (United States)

    Wood, Wendy; Lampe, Jenna L; Logan, Christina A; Metcalfe, Amy R; Hoesly, Beth E

    2017-02-01

    There is a need for a conceptual practice model that explicates ecological complexities involved in using occupation to optimize the quality of life of institutionalized people with dementia. This study aimed to prepare the Lived Environment Life Quality Model, a dementia-specific conceptual practice model of occupational therapy in institutional facilities, for publication and application to practice. Interviews and focus groups with six expert occupational therapists were subjected to qualitative content analysis to confirm, disconfirm, and further develop the model. The model's lived-environment domain as the focus of assessment and intervention was extensively confirmed, and its quality-of-life domain as the focus of intervention goals and outcomes was both confirmed and further developed. As confirmed in this study, the Lived Environment Life Quality Model is a client-centred, ecologically valid, and occupation-focused guide to optimizing quality of life of institutionalized adults with dementia in present moments and progressively over time.

  17. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer-specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration with a major international engineering company.

  18. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on proper understanding of the functionality of information systems that shall support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, it is assessed whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the research methodology is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  19. Event-Entity-Relationship Modeling in Data Warehouse Environments

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    We use the event-entity-relationship model (EVER) to illustrate the use of entity-based modeling languages for conceptual schema design in data warehouse environments. EVER is a general-purpose information modeling language that supports the specification of both general schema structures and multi-dimensional schemes that are customized to serve specific information needs. EVER is based on an event concept that is very well suited for multi-dimensional modeling because measurement data often represent events in multi-dimensional databases.

  20. Genomic-Enabled Prediction in Maize Using Kernel Models with Genotype × Environment Interaction.

    Science.gov (United States)

    Bandeira E Sousa, Massaine; Cuevas, Jaime; de Oliveira Couto, Evellyn Giselly; Pérez-Rodríguez, Paulino; Jarquín, Diego; Fritsche-Neto, Roberto; Burgueño, Juan; Crossa, Jose

    2017-06-07

    Multi-environment trials are routinely conducted in plant breeding to select candidates for the next selection cycle. In this study, we compare the prediction accuracy of four developed genomic-enabled prediction models: (1) single-environment, main genotypic effect model (SM); (2) multi-environment, main genotypic effects model (MM); (3) multi-environment, single variance G×E deviation model (MDs); and (4) multi-environment, environment-specific variance G×E deviation model (MDe). Each of these four models were fitted using two kernel methods: a linear kernel Genomic Best Linear Unbiased Predictor, GBLUP (GB), and a nonlinear kernel Gaussian kernel (GK). The eight model-method combinations were applied to two extensive Brazilian maize data sets (HEL and USP data sets), having different numbers of maize hybrids evaluated in different environments for grain yield (GY), plant height (PH), and ear height (EH). Results show that the MDe and the MDs models fitted with the Gaussian kernel (MDe-GK, and MDs-GK) had the highest prediction accuracy. For GY in the HEL data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 9 to 32%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 9 to 49%. For GY in the USP data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 0 to 7%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 34 to 70%. For traits PH and EH, gains in prediction accuracy of models with GK compared to models with GB were smaller than those achieved in GY. Also, these gains in prediction accuracy decreased when a more difficult prediction problem was studied. Copyright © 2017 Bandeira e Sousa et al.
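    A minimal sketch of the two kernels being compared (illustrative marker data and bandwidth, not the study's data sets or fitted models): the linear GBLUP kernel G = XX'/p and a Gaussian kernel built from squared Euclidean distances between genotypes, each used in a simple kernel ridge predictor.

```python
import numpy as np

rng = np.random.default_rng(42)
n, p = 100, 500
X = rng.integers(0, 3, size=(n, p)).astype(float)       # 0/1/2 marker dosages
X -= X.mean(axis=0)                                      # centre each marker

G = X @ X.T / p                                          # linear (GBLUP) kernel

sq_dist = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
K = np.exp(-sq_dist / np.median(sq_dist[sq_dist > 0]))   # Gaussian kernel, bandwidth h = 1

y = X[:, :10].sum(axis=1) + rng.normal(0.0, 1.0, n)      # toy phenotype

def kernel_ridge_predict(Kmat, y, train, test, lam=1.0):
    """Predict test phenotypes from a kernel matrix by ridge regression."""
    alpha = np.linalg.solve(Kmat[np.ix_(train, train)] + lam * np.eye(len(train)), y[train])
    return Kmat[np.ix_(test, train)] @ alpha

idx = rng.permutation(n)
train, test = idx[:80], idx[80:]
for name, Kmat in (("GBLUP (linear)", G), ("Gaussian kernel", K)):
    pred = kernel_ridge_predict(Kmat, y, train, test)
    print(name, "predictive correlation:", round(float(np.corrcoef(pred, y[test])[0, 1]), 2))
```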

  1. Genomic-Enabled Prediction in Maize Using Kernel Models with Genotype × Environment Interaction

    Directory of Open Access Journals (Sweden)

    Massaine Bandeira e Sousa

    2017-06-01

    Full Text Available Multi-environment trials are routinely conducted in plant breeding to select candidates for the next selection cycle. In this study, we compare the prediction accuracy of four developed genomic-enabled prediction models: (1) single-environment, main genotypic effect model (SM); (2) multi-environment, main genotypic effects model (MM); (3) multi-environment, single variance G×E deviation model (MDs); and (4) multi-environment, environment-specific variance G×E deviation model (MDe). Each of these four models were fitted using two kernel methods: a linear kernel Genomic Best Linear Unbiased Predictor, GBLUP (GB), and a nonlinear kernel Gaussian kernel (GK). The eight model-method combinations were applied to two extensive Brazilian maize data sets (HEL and USP data sets), having different numbers of maize hybrids evaluated in different environments for grain yield (GY), plant height (PH), and ear height (EH). Results show that the MDe and the MDs models fitted with the Gaussian kernel (MDe-GK, and MDs-GK) had the highest prediction accuracy. For GY in the HEL data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 9 to 32%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 9 to 49%. For GY in the USP data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 0 to 7%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 34 to 70%. For traits PH and EH, gains in prediction accuracy of models with GK compared to models with GB were smaller than those achieved in GY. Also, these gains in prediction accuracy decreased when a more difficult prediction problem was studied.

  2. Automatic extraction of process categories from process model collections

    NARCIS (Netherlands)

    Malinova, M.; Dijkman, R.M.; Mendling, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Many organizations build up their business process management activities in an incremental way. As a result, there is no overarching structure defined at the beginning. However, as business process modeling initiatives often yield hundreds to thousands of process models, there is a growing need for

  3. Modeling of the Martian environment for radiation analysis

    International Nuclear Information System (INIS)

    De Angelis, G.; Wilson, J.W.; Clowdsley, M.S.; Qualls, G.D.; Singleterry, R.C.

    2006-01-01

    A model for the radiation environment to be found on the planet Mars due to Galactic Cosmic Rays (GCR) has been developed. Solar modulated primary particles rescaled for conditions at Mars are transported through the Martian atmosphere down to the surface, with altitude and backscattering patterns taken into account. The altitude to compute the atmospheric thickness profile has been determined by using a model for the topography based on the data provided by the Mars Orbiter Laser Altimeter (MOLA) instrument on board the Mars Global Surveyor (MGS) spacecraft. The Mars surface composition has been modeled based on averages over the measurements obtained from orbiting spacecraft and at various landing sites, taking into account the possible volatile inventory (e.g. CO2 and H2O ices) along with its time variations throughout the Martian year. The Mars Radiation Environment Model has been made available worldwide through the Space Ionizing Radiation Effects and Shielding Tools (SIREST) website, a project of NASA Langley Research Center. This site has been developed to provide the scientific and engineering communities with an interactive site containing a variety of environmental models, shield evaluation codes, and radiation response models to allow a thorough assessment of ionizing radiation risk for current and future space missions.

  4. Conceptual models of information processing

    Science.gov (United States)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  5. USING PCU-CAMEL, A WEB-BASED LEARNING ENVIRONMENT, IN EVALUATING TEACHING-LEARNING PROCESS

    Directory of Open Access Journals (Sweden)

    Arlinah Imam Rahardjo

    2008-01-01

    Full Text Available PCU-CAMEL (Petra Christian University-Computer Aided Mechanical Engineering Department Learning Environment) has been developed to integrate the use of this web-based learning environment into the traditional, face-to-face setting of class activities. This integrated learning method is designed as an effort to enrich and improve the teaching-learning process at Petra Christian University. A study was conducted to introduce the use of PCU-CAMEL as a tool in evaluating the teaching-learning process. The study on this method of evaluation was conducted by using a case analysis on the integration of PCU-CAMEL into the traditional face-to-face meetings of the LIS (Library Information System) class at the Informatics Engineering Department of Petra Christian University. Students’ responses documented in some features of PCU-CAMEL were measured and analyzed to evaluate the effectiveness of this integrated system in developing the intrinsic motivation of the LIS students of the first and second semester of 2004/2005 to learn. It is believed that intrinsic motivation can drive students to learn more. From the study conducted, it is concluded that besides its capability in developing intrinsic motivation, PCU-CAMEL, as a web-based learning environment, can also serve as an effective tool for both students and instructors to evaluate the teaching-learning process. However, some weaknesses did exist in using this method of evaluating the teaching-learning process. The free style and unstructured form of the documentation features of this web-based learning environment can lead to ineffective evaluation results.

  6. Hyporheic flow and transport processes: mechanisms, models, and biogeochemical implications

    Science.gov (United States)

    Boano, Fulvio; Harvey, Judson W.; Marion, Andrea; Packman, Aaron I.; Revelli, Roberto; Ridolfi, Luca; Anders, Wörman

    2014-01-01

    Fifty years of hyporheic zone research have shown the important role played by the hyporheic zone as an interface between groundwater and surface waters. However, it is only in the last two decades that what began as an empirical science has become a mechanistic science devoted to modeling studies of the complex fluid dynamical and biogeochemical mechanisms occurring in the hyporheic zone. These efforts have led to the picture of surface-subsurface water interactions as regulators of the form and function of fluvial ecosystems. Rather than being isolated systems, surface water bodies continuously interact with the subsurface. Exploration of hyporheic zone processes has led to a new appreciation of their wide reaching consequences for water quality and stream ecology. Modern research aims toward a unified approach, in which processes occurring in the hyporheic zone are key elements for the appreciation, management, and restoration of the whole river environment. In this unifying context, this review summarizes results from modeling studies and field observations about flow and transport processes in the hyporheic zone and describes the theories proposed in hydrology and fluid dynamics developed to quantitatively model and predict the hyporheic transport of water, heat, and dissolved and suspended compounds from sediment grain scale up to the watershed scale. The implications of these processes for stream biogeochemistry and ecology are also discussed.

  7. Study on synthesis of geological environment at Horonobe area. A technical review

    International Nuclear Information System (INIS)

    Toida, Masaru; Suyama, Yasuhiro; Shiogama, Yukihiro; Atsumi, Hiroyuki; Abe, Yasunori; Furuichi, Mitsuaki

    2003-03-01

    The objective of the Horonobe Underground Research Project includes enhancing reliability of disposal techniques and safety assessment methods which are based on data on the deep underground geological environment obtained by surface explorations and models for the geological environment developed using those data. In this study, through development of conceptual models of the geological environment based on those data, the flows from data collection to modeling, which have been conducted independently for each geological environment of geology/geological structure, hydrogeology, geochemistry of groundwater and rock mechanics, were synthesized, and a systematic approach including processes from investigation of the geological environment to its modeling was established, which is expected to ensure objectivity and traceability of the design and safety assessment of a disposal system. This study is also a part of a program that includes an iterative process in which geological models would be developed and revised repeatedly through the Horonobe Underground Research Project and development of geological environment investigation techniques. The results of the study are summarized as follows: (1) Models based on current knowledge were developed; conceptual geology/geological structural model, conceptual hydrogeological model, conceptual geochemical model of groundwater, and conceptual rock mechanical model, (2) Information on data flow and interpretation in the modeling process was synthesized into a data flow which includes knowledge on historical geology and palaeogeology in addition to the four models shown above in terms of safety assessment, and (3) Based on the modeling processes and syntheses of data flow shown above, tasks that should be considered were organized and suggestions for the investigation program were provided for the next phase. (author)

  8. Toward a Model of Human Information Processing for Decision-Making and Skill Acquisition in Laparoscopic Colorectal Surgery.

    Science.gov (United States)

    White, Eoin J; McMahon, Muireann; Walsh, Michael T; Coffey, J Calvin; O Sullivan, Leonard

    To create a human information-processing model for laparoscopic surgery based on already established literature and primary research to enhance laparoscopic surgical education in this context. We reviewed the literature for information-processing models most relevant to laparoscopic surgery. Our review highlighted the necessity for a model that accounts for dynamic environments, perception, allocation of attention resources between the actions of both hands of an operator, and skill acquisition and retention. The results of the literature review were augmented through intraoperative observations of 7 colorectal surgical procedures, supported by laparoscopic video analysis of 12 colorectal procedures. The Wickens human information-processing model was selected as the most relevant theoretical model to which we make adaptions for this specific application. We expanded the perception subsystem of the model to involve all aspects of perception during laparoscopic surgery. We extended the decision-making system to include dynamic decision-making to account for case/patient-specific and surgeon-specific deviations. The response subsystem now includes dual-task performance and nontechnical skills, such as intraoperative communication. The memory subsystem is expanded to include skill acquisition and retention. Surgical decision-making during laparoscopic surgery is the result of a highly complex series of processes influenced not only by the operator's knowledge, but also patient anatomy and interaction with the surgical team. Newer developments in simulation-based education must focus on the theoretically supported elements and events that underpin skill acquisition and affect the cognitive abilities of novice surgeons. The proposed human information-processing model builds on established literature regarding information processing, accounting for a dynamic environment of laparoscopic surgery. This revised model may be used as a foundation for a model describing robotic

  9. Practical utilization of modeling and simulation in laboratory process waste assessments

    International Nuclear Information System (INIS)

    Lyttle, T.W.; Smith, D.M.; Weinrach, J.B.; Burns, M.L.

    1993-01-01

    At Los Alamos National Laboratory (LANL), facility waste streams tend to be small but highly diverse. Initial characterization of such waste streams is difficult in part due to a lack of tools to assist the waste generators in completing such assessments. A methodology has been developed at LANL to allow process knowledgeable field personnel to develop baseline waste generation assessments and to evaluate potential waste minimization technology. This process waste assessment (PWA) system is an application constructed within the process modeling system. The Process Modeling System (PMS) is an object-oriented, mass balance-based, discrete-event simulation using the common LISP object system (CLOS). Analytical capabilities supported within the PWA system include: complete mass balance specifications, historical characterization of selected waste streams and generation of facility profiles for materials consumption, resource utilization and worker exposure. Anticipated development activities include provisions for a best available technologies (BAT) database and integration with the LANL facilities management Geographic Information System (GIS). The environments used to develop these assessment tools will be discussed in addition to a review of initial implementation results

  10. Modelling of processes occurring in deep geological repository - development of new modules in the GoldSim environment

    International Nuclear Information System (INIS)

    Vopalka, D.; Lukin, D.; Vokal, A.

    2006-01-01

    Three new modules were prepared in the GoldSim environment (using its Transport Module). The source term module includes radioactive decay and ingrowth in the canister, first-order degradation of the fuel matrix, solubility limitation of the concentration of the studied nuclides, and diffusive migration through the surrounding bentonite layer controlled by the output boundary condition formulated with respect to the rate of water flow in the rock. The module was successfully compared with results of similar codes (MIVCYL and Pagoda) and possibilities of the module were extended by a more realistic model of matrix degradation. A better quantification of the role of radionuclide sorption on the bentonite surface was enabled by a module that includes a non-linear form of the interaction isotherm. Using this module, both the influence of the shape of the sorption isotherm on the values of diffusion coefficients and the limits of the Kd approach that dominates in most codes used in performance assessment studies were discussed. The third of the GoldSim modules presented has been worked out for the description of corrosion of canisters made of carbon steel and for the transport of corrosion products in the near-field region. This module evaluates balance equations between the dissolving species and species transported by diffusion and/or advection from the surface of a solid material. The model also includes transport of iron directly to a fracture in the surrounding rock or into a layer of granite host rock without fractures, and takes into account the reduction of the actual corrosion rate of the canister by growth of the corrosion layer thickness.
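
    To make the source-term bookkeeping described in this record more tangible, the minimal sketch below combines radioactive decay, first-order fuel-matrix degradation and a solubility cap. All rate constants, the inventory and the water volume are hypothetical placeholders, and the sketch deliberately ignores ingrowth and the bentonite diffusion step handled by the actual GoldSim module.

        import numpy as np

        # Hypothetical parameters: decay constant (1/yr), matrix degradation rate (1/yr),
        # initial inventory (mol), solubility limit (mol/m^3), canister water volume (m^3).
        lam, k_deg, N0, c_sol, V_water = 1e-5, 1e-6, 10.0, 1e-4, 1.0

        t = np.linspace(0.0, 1e6, 1000)                       # years
        inventory = N0 * np.exp(-lam * t)                     # radioactive decay of total inventory
        released = inventory * (1.0 - np.exp(-k_deg * t))     # fraction freed by matrix degradation
        conc = np.minimum(released / V_water, c_sol)          # solubility-limited concentration

        print(f"Concentration after 1e5 years: {np.interp(1e5, t, conc):.3e} mol/m^3")

    The solubility cap is the interesting part: once the released inventory exceeds what the water can hold, the concentration that drives diffusion into the bentonite stops growing.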

  11. ADOxx Modelling Method Conceptualization Environment

    Directory of Open Access Journals (Sweden)

    Nesat Efendioglu

    2017-04-01

    Full Text Available The importance of Modelling Methods Engineering is rising together with the importance of domain specific languages (DSLs) and individual modelling approaches. In order to capture the relevant semantic primitives for a particular domain, it is necessary to involve both (a) domain experts, who identify relevant concepts, and (b) method engineers, who compose a valid and applicable modelling approach. This process consists of a conceptual design of a formal or semi-formal modelling method as well as a reliable, migratable, maintainable and user friendly software development of the resulting modelling tool. The Modelling Method Engineering cycle is often underestimated, as the conceptual architecture requires formal verification and the tool implementation requires practical usability; hence we propose a guideline and corresponding tools to support actors with different backgrounds along this complex engineering process. Based on practical experience in business, more than twenty research projects within the EU frame programmes and a number of bilateral research initiatives, this paper introduces the phases, a corresponding toolbox and lessons learned with the aim of supporting the engineering of a modelling method. The proposed approach is illustrated and validated within use cases from three different EU-funded research projects in the fields of (1) Industry 4.0, (2) e-learning and (3) cloud computing. The paper discusses the approach, the evaluation results and derived outlooks.

  12. An effort allocation model considering different budgetary constraint on fault detection process and fault correction process

    Directory of Open Access Journals (Sweden)

    Vijay Kumar

    2016-01-01

    Full Text Available Fault detection process (FDP) and fault correction process (FCP) are important phases of the software development life cycle (SDLC). It is essential for software to undergo a testing phase, during which faults are detected and corrected. The main goal of this article is to allocate the testing resources in an optimal manner to minimize the cost during the testing phase using FDP and FCP under a dynamic environment. In this paper, we first assume there is a time lag between fault detection and fault correction. Thus, removal of a fault is performed after a fault is detected. In addition, the detection process and correction process are taken to be independent simultaneous activities with different budgetary constraints. A structured optimal policy based on optimal control theory is proposed for software managers to optimize the allocation of the limited resources with the reliability criteria. Furthermore, the release policy for the proposed model is also discussed. A numerical example is given in support of the theoretical results.
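
    A minimal numerical sketch of the detection/correction lag idea is given below, assuming an exponential (Goel-Okumoto type) detection curve and a fixed time lag before correction. The constants are hypothetical, and the paper's optimal-control resource allocation itself is not reproduced here.

        import numpy as np

        # Hypothetical constants: a = expected total faults, b = detection rate per unit time,
        # lag = mean delay between detecting and correcting a fault, c_d / c_c = unit costs.
        a, b, lag, c_d, c_c = 100.0, 0.05, 2.0, 10.0, 25.0

        def detected(t):
            return a * (1.0 - np.exp(-b * t))            # exponential detection curve

        def corrected(t):
            return detected(np.maximum(t - lag, 0.0))    # correction trails detection by a fixed lag

        t = 50.0
        print(f"detected: {detected(t):.1f}, corrected: {corrected(t):.1f}, "
              f"testing cost so far: {c_d * detected(t) + c_c * corrected(t):.0f}")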

  13. The Role of the Built Environment: How Decentralized Nurse Stations Shape Communication, Patient Care Processes, and Patient Outcomes.

    Science.gov (United States)

    Real, Kevin; Bardach, Shoshana H; Bardach, David R

    2017-12-01

    Increasingly, health communication scholars are attending to how hospital built environments shape communication, patient care processes, and patient outcomes. This multimethod study was conducted on two floors of a newly designed urban hospital. Nine focus group interviews were conducted with 35 health care professionals from 10 provider groups. Seven of the groups were homogeneous by profession or level: nursing (three groups), nurse managers (two groups), and one group each of nurse care technicians ("techs") and physicians. Two mixed groups comprised staff from pharmacy, occupational therapy, patient care facilitators, physical therapy, social work, and pastoral care. Systematic qualitative analysis was conducted using a conceptual framework based on systems theory and prior health care design and communication research. Additionally, quantitative modeling was employed to assess walking distances in two different hospital designs. Results indicate nurses walked significantly more in the new hospital environment. Qualitative analysis revealed three insights developed in relationship to system structures, processes, and outcomes. First, decentralized nurse stations changed system interdependencies by reducing nurse-to-nurse interactions and teamwork while heightening nurse interdependencies and teamwork with other health care occupations. Second, many nursing-related processes remained centralized while nurse stations were decentralized, creating systems-based problems for nursing care. Third, nursing communities of practice were adversely affected by the new design. Implications of this study suggest that nurse station design shapes communication, patient care processes, and patient outcomes. Further, it is important to understand how the built environment, often treated as invisible in communication research, is crucial to understanding communication within complex health care systems.

  14. Comprehensive Numerical Modeling of the Blast Furnace Ironmaking Process

    Science.gov (United States)

    Zhou, Chenn; Tang, Guangwu; Wang, Jichao; Fu, Dong; Okosun, Tyamo; Silaen, Armin; Wu, Bin

    2016-05-01

    Blast furnaces are counter-current chemical reactors, widely utilized in the ironmaking industry. Hot reduction gases injected from lower regions of the furnace ascend, reacting with the descending burden. Through this reaction process, iron ore is reduced into liquid iron that is tapped from the furnace hearth. Due to the extremely harsh environment inside the blast furnace, it is difficult to measure or observe internal phenomena during operation. Through the collaboration between steel companies and the Center for Innovation through Visualization and Simulation, multiple computational fluid dynamics (CFD) models have been developed to simulate the complex multiphase reacting flow in the three regions of the furnace, the shaft, the raceway, and the hearth. The models have been used effectively to troubleshoot and optimize blast furnace operations. In addition, the CFD models have been integrated with virtual reality. An interactive virtual blast furnace has been developed for training purpose. This paper summarizes the developments and applications of blast furnace CFD models and the virtual blast furnace.

  15. Model-Based Learning Environment Based on The Concept IPS School-Based Management

    Directory of Open Access Journals (Sweden)

    Hamid Darmadi

    2017-03-01

    Full Text Available The results showed: (1) the environment-based IPS learning model can foster love for the cultural values of the area as a basis for the development of national culture; (2) community participation and the role of government in implementing the environment-based IPS learning model have a positive impact on the improvement of the management of school resources; (3) the environment-based IPS learning model is effective in creating a way of living together peacefully and increases the intensity of togetherness and mutual respect; (4) the environment-based IPS learning model can improve student learning outcomes; (5) there are differences in the expression of attitudes and learning results between students located in the conflict area and students outside the conflict area; (6) the analysis of the attitude scale among school and senior high school students shows that high school students appreciate the values of unity and nationhood, respect for diversity, and peaceful coexistence. It is recommended that the Department of Education, as the institution responsible for fostering and developing social and cultural values in the province, apply the environment-based IPS learning model.

  16. CONSTRUCTIVE EDUCATIONAL ENVIRONMENT SCHOOL-UNIVERSITY

    Directory of Open Access Journals (Sweden)

    Natalya Petrovna Shatalova

    2016-02-01

    Full Text Available The article presents the results of a study of the key components of the development of constructive thinking of students on the basis of building a model of a constructive school-university educational environment. It was conducted from the position of a constructive approach in education, as a process of systemic-structural methodology of the cognitive and creative activity of the student that promotes the development and formation of various constructive qualities of the individual. The functions of the constructive school-university educational environment aimed at developing constructive thinking of students, defined by its structural components and connections, show the consistency of the self-development of constructive thinking and satisfaction with the development of constructive skills. The findings reveal innovative possibilities of cooperation of schools and universities in the design and functioning model of a constructive educational space that contributes to the development of constructive thinking of all its stakeholders. Purpose: measuring the effectiveness of the model of the constructive school-university educational environment aimed at the development of students. Methodology: the programme of research included (1) diagnosis of the development level of constructive thinking with a questionnaire developed in the context of the constructive theory of education, and (2) diagnosis of the satisfaction with and importance of the model of the constructive school-university educational environment, augmented and revised by the author from the method of G.A. Gagarin, as well as theoretical modeling, the method of involved observation, and the formal teaching method. Results. The article introduces the concept of «constructive learning environments», which are considered in relation to the organization and conduct of joint activities of teachers, and of teachers and students. The authors give a theoretical comparative analysis of the scientific works of colleagues in the context of the problem. Offer a brief

  17. Improved model management with aggregated business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Mans, R.S.; Toorn, van der R.A.

    2009-01-01

    Contemporary organizations invest much efforts in creating models of their business processes. This raises the issue of how to deal with large sets of process models that become available over time. This paper proposes an extension of Event-driven Process Chains, called the aggregate EPC (aEPC),

  18. Visualizing the process of interaction in a 3D environment

    Science.gov (United States)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh

    2007-03-01

    As the imaging modalities used in medicine transition to increasingly three-dimensional data the question of how best to interact with and analyze this data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we will attempt to show some methods in which user interaction in a virtual reality environment can be visualized and how this can allow us to gain greater insight into the process of interaction/learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.

  19. Reaction time for processing visual stimulus in a computer-assisted rehabilitation environment.

    Science.gov (United States)

    Sanchez, Yerly; Pinzon, David; Zheng, Bin

    2017-10-01

    To examine the reaction time when human subjects process information presented in the visual channel under both a direct vision and a virtual rehabilitation environment while walking. Visual stimuli included eight math problems displayed in the peripheral vision of seven healthy human subjects in a virtual rehabilitation training environment (computer-assisted rehabilitation environment (CAREN)) and a direct vision environment. Subjects were required to verbally report the results of these math calculations in a short period of time. Reaction time, measured by a Tobii eye tracker, and calculation accuracy were recorded and compared between the direct vision and virtual rehabilitation environment. Performance outcomes measured for both groups included reaction time, reading time, answering time and the verbal answer score. A significant difference between the groups was only found for the reaction time (p = .004). Participants had more difficulty recognizing the first equation in the virtual environment. Participants' reaction time was faster in the direct vision environment. This reaction time delay should be kept in mind when designing skill training scenarios in virtual environments. This was a pilot project for a series of studies assessing the cognitive ability of stroke patients undertaking a rehabilitation program with a virtual training environment. Implications for rehabilitation: Eye tracking is a reliable tool that can be employed in rehabilitation virtual environments. Reaction time changes between direct vision and virtual environments.

  20. 2. National scientific conference on process engineering in environment protection. Conference materials

    International Nuclear Information System (INIS)

    1994-01-01

    The national conference on 'Process engineering in environment protection', Jachranka 1994, was divided into three sessions. Section 1 was devoted to flue gas purification and collected 13 papers. Section 2, on liquid purification, gathered 8 presentations. Section 3 was the poster session, with 12 posters on related topics. During the conference, 2 lectures and 3 posters were devoted to the application of nuclear techniques to the solution of different problems connected with environment protection.

  1. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in health care processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  2. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
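
    The information sharing protocol itself is not described here in enough detail to reproduce, but the distribution idea behind the hybrid model can be caricatured with a toy publish/subscribe hub. The class, parameter names and values below are invented, not NASA's actual interfaces.

        from collections import defaultdict

        class TelemetryHub:
            """Toy publish/subscribe hub: the server-side role in a hybrid model where
            clients may also republish derived (ground-synthesized) parameters as peers."""
            def __init__(self):
                self.subscribers = defaultdict(list)      # parameter name -> callbacks

            def subscribe(self, parameter, callback):
                self.subscribers[parameter].append(callback)

            def publish(self, parameter, value):
                for callback in self.subscribers[parameter]:
                    callback(parameter, value)

        hub = TelemetryHub()
        hub.subscribe("cabin_pressure", lambda p, v: print(f"display update: {p} = {v}"))
        hub.publish("cabin_pressure", 14.7)               # made-up sample value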

  3. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  4. Conservation Process Model (cpm): a Twofold Scientific Research Scope in the Information Modelling for Cultural Heritage

    Science.gov (United States)

    Fiorani, D.; Acierno, M.

    2017-05-01

    The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both a BIM environment and ontology formalisation. Although BIM has been successfully experimented with within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, the applications developed so far have been shown to adapt BIM poorly to conservation design, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine the achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible knowledge representation, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalized by the model. A special focus is directed at decay analysis and the surface conservation project.

  5. A wind turbine evaluation model under a multi-criteria decision making environment

    International Nuclear Information System (INIS)

    Lee, Amy H.I.; Hung, Meng-Chan; Kang, He-Yau; Pearn, W.L.

    2012-01-01

    Highlights: ► This paper proposes an evaluation model to select suitable turbines in a wind farm. ► Interpretive structural modeling is used to know the relationship among factors. ► Fuzzy analytic network process is used to calculate the priorities of turbines. ► The results can be references for selecting the most appropriate wind turbines. - Abstract: Due to the impacts of fossil and nuclear energy on the security, economics, and environment in the world, the demand of alternative energy resources is expanding consistently and tremendously in recent years. Wind energy production, with its safe and environmental characteristics, has become the fastest growing renewable energy source in the world. The construction of new wind farms and the installation of new wind turbines are important processes in order to provide a long-term energy production. In this research, a comprehensive evaluation model, which incorporates interpretive structural modeling (ISM) and fuzzy analytic network process (FANP), is constructed to select suitable turbines when developing a wind farm. A case study is carried out in Taiwan in evaluating the expected performance of several potential types of wind turbines, and experts in a wind farm are invited to contribute their expertise in determining the importance of the factors of the wind turbine evaluation and in rating the performance of the turbines with respect to each factor. The most suitable turbines for installation can finally be generated after the calculations. The results can be references for decision makers in selecting the most appropriate wind turbines.
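
    The fuzzy ANP computation used in the paper is considerably richer, but its crisp building block, deriving priorities from a pairwise-comparison matrix via the principal eigenvector, can be sketched as follows. The criteria and the judgment values are hypothetical.

        import numpy as np

        # Hypothetical pairwise-comparison matrix for three wind-turbine criteria
        # (e.g. cost, availability, environmental fit); A[i, j] = importance of i over j.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        principal = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, principal].real)
        weights /= weights.sum()                          # normalized criterion priorities
        print("criterion weights:", np.round(weights, 3))

    In a full ANP the comparisons are organized in a network (with interdependencies captured via ISM here) and the judgments are fuzzy numbers, but the priority-extraction step remains the same in spirit.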

  6. Topobathymetric LiDAR point cloud processing and landform classification in a tidal environment

    Science.gov (United States)

    Skovgaard Andersen, Mikkel; Al-Hamdani, Zyad; Steinbacher, Frank; Rolighed Larsen, Laurids; Brandbyge Ernstsen, Verner

    2017-04-01

    Historically it has been difficult to create high resolution Digital Elevation Models (DEMs) in land-water transition zones due to shallow water depth and often challenging environmental conditions. This gap of information has been reflected as a "white ribbon" with no data in the land-water transition zone. In recent years, the technology of airborne topobathymetric Light Detection and Ranging (LiDAR) has proven capable of filling out the gap by simultaneously capturing topographic and bathymetric elevation information, using only a single green laser. We collected green LiDAR point cloud data in the Knudedyb tidal inlet system in the Danish Wadden Sea in spring 2014. Creating a DEM from a point cloud requires the general processing steps of data filtering, water surface detection and refraction correction. However, there is no transparent and reproducible method for processing green LiDAR data into a DEM, specifically regarding the procedure of water surface detection and modelling. We developed a step-by-step procedure for creating a DEM from raw green LiDAR point cloud data, including a procedure for making a Digital Water Surface Model (DWSM) (see Andersen et al., 2017). Two different classification analyses were applied to the high resolution DEM: A geomorphometric and a morphological classification, respectively. The classification methods were originally developed for a small test area; but in this work, we have used the classification methods to classify the complete Knudedyb tidal inlet system. References Andersen MS, Gergely Á, Al-Hamdani Z, Steinbacher F, Larsen LR, Ernstsen VB (2017). Processing and performance of topobathymetric lidar data for geomorphometric and morphological classification in a high-energy tidal environment. Hydrol. Earth Syst. Sci., 21: 43-63, doi:10.5194/hess-21-43-2017. Acknowledgements This work was funded by the Danish Council for Independent Research | Natural Sciences through the project "Process-based understanding and
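
    One of the generic processing steps mentioned above, refraction correction, can be illustrated with a flat-surface, single-interface sketch. This is not the authors' workflow: the refractive index, the scan geometry and the depth value are assumptions used only to show the order of magnitude of the correction.

        import numpy as np

        N_WATER = 1.333    # approximate refractive index of water at green wavelengths

        def refraction_corrected_depth(apparent_range_in_water, incidence_deg=0.0):
            """Correct a green-laser in-water range for the slower light speed and the
            bending at the air/water interface (flat-surface, single-refraction sketch)."""
            theta_air = np.radians(incidence_deg)
            theta_water = np.arcsin(np.sin(theta_air) / N_WATER)   # Snell's law
            true_range = apparent_range_in_water / N_WATER          # range overestimated by factor n
            return true_range * np.cos(theta_water)                 # vertical depth below the surface

        print(refraction_corrected_depth(2.0, incidence_deg=15.0))  # ~1.47 m for a 2.0 m apparent range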

  7. The mathematics of models for climatology and environment. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Ildefonso Diaz, J. [ed.] [Universidad Complutense de Madrid (Spain). Facultad de Ciencas Matematicas

    1997-12-31

    This book presents a coherent survey of modelling in climatology and the environment and the mathematical treatment of those problems. It is divided into 4 parts containing a total of 16 chapters. Parts I, II and III are devoted to general models and part IV to models related to some local problems. Most of the mathematical models considered here involve systems of nonlinear partial differential equations.

  8. Educational complex of light-colored modeling of urban environment

    Directory of Open Access Journals (Sweden)

    Karpenko Vladimir E.

    2018-01-01

    Full Text Available Mechanisms, methodological tools and the structure of a training complex for light-color modeling of the urban environment are developed in this paper. The following results of the practical work of students are presented: light composition and installation, media facades, lighting of building facades, city streets and embankment. As a result of modeling, the structure of the light form is determined. Light-transmitting materials causing characteristic optical illusions, light-visual and light-dynamic effects (video-dynamics and photostatics), and basic compositional techniques of light form are revealed. The main elements of the light installation are studied, including a light projection, an electronic device, interactivity and relationality of the installation, and the mechanical device which becomes a part of the installation composition. The meaning of modern media facade technology is the transformation of external building structures and their facades into a changing information cover, into a media content translator using LED technology. Light tectonics and the light rhythm of the plastics of the architectural object are built up through point and local illumination; modeling of the urban ensemble assumes the structural interaction of several light building models with special light-composition techniques. When modeling the social and pedestrian environment, the lighting parameters depend on the scale of the chosen space and are adapted taking into account the visual perception of the pedestrian, and the atmospheric effects of comfort and safety of the environment are achieved with the help of special light compositional techniques. With the aim of realizing the tasks of light modeling, a methodology has been created, including the mechanisms of models, variability and complementarity. The perspectives of light modeling in the context of structural elements of the city, neuropsychology, wireless and bioluminescence technologies are proposed.

  9. Adapting Evaluations of Alternative Payment Models to a Changing Environment.

    Science.gov (United States)

    Grannemann, Thomas W; Brown, Randall S

    2018-04-01

    To identify the most robust methods for evaluating alternative payment models (APMs) in the emerging health care delivery system environment. We assess the impact of widespread testing of alternative payment models on the ability to find credible comparison groups. We consider the applicability of factorial research designs for assessing the effects of these models. The widespread adoption of alternative payment models could effectively eliminate the possibility of comparing APM results with a "pure" control or comparison group unaffected by other interventions. In this new environment, factorial experiments have distinct advantages over the single-model experimental or quasi-experimental designs that have been the mainstay of recent tests of Medicare payment and delivery models. The best prospects for producing definitive evidence of the effects of payment incentives for APMs include fractional factorial experiments that systematically vary requirements and payment provisions within a payment model. © Health Research and Educational Trust.
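
    As a hedged illustration of the fractional factorial idea only, the snippet below lays out a 2^3 design over three invented payment provisions and keeps the half-fraction defined by the relation I = ABC. It is not a proposal for an actual evaluation design.

        from itertools import product

        factors = ["shared_savings", "quality_bonus", "downside_risk"]   # hypothetical provisions
        full = list(product([-1, +1], repeat=len(factors)))              # 2^3 full factorial

        # Half-fraction: keep runs whose level product is +1 (defining relation I = ABC)
        half = [run for run in full if run[0] * run[1] * run[2] == +1]

        for run in half:
            settings = {f: ("on" if level > 0 else "off") for f, level in zip(factors, run)}
            print(settings)

    The appeal, as the abstract notes, is that provisions vary systematically within the model, so main effects can be estimated without relying on an untouched external comparison group.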

  10. Integrating textual and model-based process descriptions for comprehensive process search

    NARCIS (Netherlands)

    Leopold, Henrik; van der Aa, Han; Pittke, Fabian; Raffel, Manuel; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Documenting business processes using process models is common practice in many organizations. However, not all process information is best captured in process models. Hence, many organizations complement these models with textual descriptions that specify additional details. The problem with this

  11. Workflows for microarray data processing in the Kepler environment

    Science.gov (United States)

    2012-01-01

    Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or

  12. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Full Text Available Abstract Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to

  13. Workflows for microarray data processing in the Kepler environment.

    Science.gov (United States)

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R

  14. Investigating Pre-service Mathematics Teachers’ Geometric Problem Solving Process in Dynamic Geometry Environment

    Directory of Open Access Journals (Sweden)

    Deniz Özen

    2013-03-01

    Full Text Available The aim of this study is to investigate pre-service elementary mathematics teachers’ open geometric problem solving process in a Dynamic Geometry Environment. With a qualitative inquiry-based research design employed, the participants of the study were three pre-service teachers in the 4th year of the Department of Elementary Mathematics Teaching. In this study, clinical interviews, screen captures of the problem solving process in the Cabri Geometry Environment, and worksheets including 2 open geometry problems were used to collect the data. It was found that all the participants passed through similar recursive phases of construction, exploration, conjecture, validation, and justification in the problem solving process. It is thought that this study provides a new point of view to curriculum developers, teachers and researchers.

  15. Automated cyber threat analysis and specified process using vector relational data modeling

    OpenAIRE

    Kelly, Ryan Forrest

    2014-01-01

    Approved for public release; distribution is unlimited Computer network defense systems should be sufficiently integrated to pull data from any information source, model an expert cyber analyst’s decision process, continuously adapt to an evolving cyber threat environment, and amalgamate with industry standard network hardware. Unfortunately, cyber defense systems are generally stovepipe solutions that do not natively integrate disparate network systems. Correlation engines are generally l...

  16. An Innovative Real-time Environment for Unified Deterministic and Stochastic Groundwater Modeling

    Science.gov (United States)

    Li, S.; Liu, Q.

    2003-12-01

    Despite an exponential growth of computational capability over the last two decades-one that has allowed computational science and engineering to become a unique, powerful tool for scientific discovery-the extreme cost of groundwater modeling continues to limit its use. This occurs primarily because the modeling paradigm that has been employed for decades limits our ability to take full advantage of recent developments in computer, communication, graphic, and visualization technologies. In this presentation we introduce an innovative and sophisticated computational environment for groundwater modeling that promises to eliminate the current bottleneck and greatly expand the utility of computational tools for scientific discovery related to groundwater. Based on a set of efficient and robust computational algorithms, the new software system, called Interactive Groundwater (IGW), allows simulating complex flow and transport in aquifers subject to both systematic and "randomly" varying stresses and geological and chemical heterogeneity. Adopting a new paradigm, IGW eliminates a major bottleneck inherent in the traditional fragmented modeling technologies and enables real-time modeling, real-time visualization, real-time analysis, and real-time presentation. IGW functions as a "numerical laboratory" in which a researcher can freely explore in real-time: creating visually an aquifer of desired configurations, interactively imposing desired stresses, and then immediately investigating and visualizing the geology and the processes of flow and contaminant transport and transformation. A modeler can pause to edit at any time and interact on-line with any aspects (e.g., conceptual and numerical representation, boundary conditions, model solvers, and ways of visualization and analysis) of the integrated modeling process; he/she can initiate or stop, whenever needed, particle tracking, plume modeling, subscale modeling, cross-sectional modeling, stochastic modeling, monitoring
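
    IGW itself couples far more capable solvers and visualization, but the kind of flow-and-transport arithmetic such environments execute can be hinted at with a one-dimensional advection-dispersion sketch. The aquifer parameters and the explicit upwind scheme below are illustrative assumptions, not the software's numerics.

        import numpy as np

        # Hypothetical 1D aquifer column: seepage velocity v (m/d), dispersion D (m^2/d)
        L, nx, v, D = 100.0, 101, 0.5, 0.1
        dx = L / (nx - 1)
        dt = 0.4 * min(dx / v, dx * dx / (2 * D))     # keep the explicit scheme stable

        c = np.zeros(nx)
        c[0] = 1.0                                    # constant-concentration inlet boundary

        for _ in range(150):                          # ~120 days of transport
            adv = -v * (c[1:-1] - c[:-2]) / dx        # upwind advection
            dsp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
            c[1:-1] += dt * (adv + dsp)
            c[0], c[-1] = 1.0, c[-2]                  # fixed inlet, open outlet

        print(f"plume front (c drops below 0.5) near x = {np.argmax(c < 0.5) * dx:.1f} m")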

  17. Effect of river flow fluctuations on riparian vegetation dynamics: Processes and models

    Science.gov (United States)

    Vesipa, Riccardo; Camporeale, Carlo; Ridolfi, Luca

    2017-12-01

    Several decades of field observations, laboratory experiments and mathematical modelings have demonstrated that the riparian environment is a disturbance-driven ecosystem, and that the main source of disturbance is river flow fluctuations. The focus of the present work has been on the key role that flow fluctuations play in determining the abundance, zonation and species composition of patches of riparian vegetation. To this aim, the scientific literature on the subject, over the last 20 years, has been reviewed. First, the most relevant ecological, morphological and chemical mechanisms induced by river flow fluctuations are described from a process-based perspective. The role of flow variability is discussed for the processes that affect the recruitment of vegetation, the vegetation during its adult life, and the morphological and nutrient dynamics occurring in the riparian habitat. Particular emphasis has been given to studies that were aimed at quantifying the effect of these processes on vegetation, and at linking them to the statistical characteristics of the river hydrology. Second, the advances made, from a modeling point of view, have been considered and discussed. The main models that have been developed to describe the dynamics of riparian vegetation have been presented. Different modeling approaches have been compared, and the corresponding advantages and drawbacks have been pointed out. Finally, attention has been paid to identifying the processes considered by the models, and these processes have been compared with those that have actually been observed or measured in field/laboratory studies.

  18. Business model transformation process in the context of business ecosystem

    OpenAIRE

    Heikkinen, A.-M. (Anne-Mari)

    2014-01-01

    Abstract It is a current phenomenon that the business environment has changed and has set new requirements for companies. Companies must adapt to changes that come from outside their normal business environment and take into consideration the wider business environment in which they operate. These changes have also set new demands for the company business model. Companies' business models need to be changed to match the state of the art business environ...

  19. NRPB models for calculating the transfer of radionuclides through the environment. Verification and validation

    International Nuclear Information System (INIS)

    Attwood, C.; Barraclough, I.; Brown, J.

    1998-06-01

    There is a wide range of models available at NRPB to predict the transfer of radionuclides through the environment. Such models form an essential part of assessments of the radiological impact of releases of radionuclides into the environment. These models cover: the atmosphere; the aquatic environment; the geosphere; the terrestrial environment including foodchains. It is important that the models used for radiological impact assessments are robust, reliable and suitable for the assessment being undertaken. During model development it is, therefore, important that the model is both verified and validated. Verification of a model involves ensuring that it has been implemented correctly, while validation consists of demonstrating that the model is an adequate representation of the real environment. The extent to which a model can be verified depends on its complexity and whether similar models exist. For relatively simple models verification is straightforward, but for more complex models verification has to form part of the development, coding and testing of the model within quality assurance procedures. Validation of models should ideally consist of comparisons between the results of the models and experimental or environmental measurement data that were not used to develop the model. This is more straightforward for some models than for others depending on the quantity and type of data available. Validation becomes increasingly difficult for models which are intended to predict environmental transfer at long times or at great distances. It is, therefore, necessary to adopt qualitative validation techniques to ensure that the model is an adequate representation of the real environment. This report summarises the models used at NRPB to predict the transfer of radionuclides through the environment as part of a radiological impact assessment. It outlines the work carried out to verify and validate the models. The majority of these models are not currently available

  20. Stress-reducing preventive maintenance model for a unit under stressful environment

    International Nuclear Information System (INIS)

    Park, J.H.; Chang, Woojin; Lie, C.H.

    2012-01-01

    We develop a preventive maintenance (PM) model for a unit operated under stressful environment. The PM model in this paper consists of a failure rate model and two cost models to determine the optimal PM scheduling which minimizes a cost rate. The assumption for the proposed model is that stressful environment accelerates the failure of the unit and periodic maintenances reduce stress from outside. The failure rate model handles the maintenance effect of PM using improvement and stress factors. The cost models are categorized into two failure recognition cases: immediate failure recognition and periodic failure detection. The optimal PM scheduling is obtained by considering the trade-off between the related cost and the lifetime of a unit in our model setting. The practical usage of our proposed model is tested through a numerical example.

  1. Lithium-ion Battery Electrothermal Model, Parameter Estimation, and Simulation Environment

    Directory of Open Access Journals (Sweden)

    Simone Orcioni

    2017-03-01

    Full Text Available The market for lithium-ion batteries is growing exponentially. The performance of battery cells is growing due to improving production technology, but market request is growing even more rapidly. Modeling and characterization of single cells and an efficient simulation environment is fundamental for the development of an efficient battery management system. The present work is devoted to defining a novel lumped electrothermal circuit of a single battery cell, the extraction procedure of the parameters of the single cell from experiments, and a simulation environment in SystemC-WMS for the simulation of a battery pack. The electrothermal model of the cell was validated against experimental measurements obtained in a climatic chamber. The model is then used to simulate a 48-cell battery, allowing statistical variations among parameters. The different behaviors of the cells in terms of state of charge, current, voltage, or heat flow rate can be observed in the results of the simulation environment.

  2. Exploring the Processes of Generating LOD (0-2) Citygml Models in Greater Municipality of Istanbul

    Science.gov (United States)

    Buyuksalih, I.; Isikdag, U.; Zlatanova, S.

    2013-08-01

    3D models of cities, visualised and exploded in 3D virtual environments have been available for several years. Currently a large number of impressive realistic 3D models have been regularly presented at scientific, professional and commercial events. One of the most promising developments is OGC standard CityGML. CityGML is object-oriented model that support 3D geometry and thematic semantics, attributes and relationships, and offers advanced options for realistic visualization. One of the very attractive characteristics of the model is the support of 5 levels of detail (LOD), starting from 2.5D less accurate model (LOD0) and ending with very detail indoor model (LOD4). Different local government offices and municipalities have different needs when utilizing the CityGML models, and the process of model generation depends on local and domain specific needs. Although the processes (i.e. the tasks and activities) for generating the models differs depending on its utilization purpose, there are also some common tasks (i.e. common denominator processes) in the model generation of City GML models. This paper focuses on defining the common tasks in generation of LOD (0-2) City GML models and representing them in a formal way with process modeling diagrams.

  3. Agricultural watershed modeling: a review for hydrology and soil erosion processes

    Directory of Open Access Journals (Sweden)

    Carlos Rogério de Mello

    2016-02-01

    Full Text Available ABSTRACT Models have been used by man for thousands of years to control his environment in a favorable way to better human living conditions. The use of hydrologic models has been a widely effective tool in order to support decision makers dealing with watersheds related to several economic and social activities, like public water supply, energy generation, and water availability for agriculture, among others. The purpose of this review is to briefly discuss some models on soil and water movement on landscapes (RUSLE, WEPP, GeoWEPP, LASH, DHSVM and AnnAGNPS to provide information about them to help and serve in a proper manner in order to discuss particular problems related to hydrology and soil erosion processes. Models have been changed and evaluated significantly in recent years, highlighting the use of remote sense, GIS and automatic calibration process, allowing them capable of simulating watersheds under a given land-use and climate change effects. However, hydrology models have almost the same physical structure, which is not enough for simulating problems related to the long-term effects of different land-uses. That has been our challenge for next future: to understand entirely the hydrology cycle, having as reference the critical zone, in which the hydrological processes act together from canopy to the bottom of aquifers.

  4. Pharmaceutical process chemistry: evolution of a contemporary data-rich laboratory environment.

    Science.gov (United States)

    Caron, Stéphane; Thomson, Nicholas M

    2015-03-20

    Over the past 20 years, the industrial laboratory environment has gone through a major transformation in the industrial process chemistry setting. In order to discover and develop robust and efficient syntheses and processes for a pharmaceutical portfolio with growing synthetic complexity and increased regulatory expectations, the round-bottom flask and other conventional equipment familiar to a traditional organic chemistry laboratory are being replaced. The new process chemistry laboratory fosters multidisciplinary collaborations by providing a suite of tools capable of delivering deeper process understanding through mechanistic insights and detailed kinetics translating to greater predictability at scale. This transformation is essential to the field of organic synthesis in order to promote excellence in quality, safety, speed, and cost efficiency in synthesis.

  5. Organization of educational process as a part of the information environment of the university

    Directory of Open Access Journals (Sweden)

    Оksana S. Savelyeva

    2015-06-01

    Full Text Available The questions concerning the insurance of openness and transparency of the educational process, monitoring the provision of educational services and the quality of learning within a unified information environment of Odessa National Polytechnic University are considered. It is proposed to consider the organization of the educational process as a major component of the educational process, that is a system of activities covering the distribution of the academic load between departments, recruitment of teachers, the formation of class schedules, consultation, final control and state certification. The analysis and the forming of set of parameters are carried out, the main components of the functional subsystem "The organization of educational process" as one of the components of the information environment of university are identified. Building a system hierarchically ensures the effective management of subsystems of organization of educational process and interaction between participants of the educational process and allows the system to change quickly if it is necessary.

  6. Modelling of channel transmission loss processes in semi-arid catchments of southern Africa using the Pitman Model

    Directory of Open Access Journals (Sweden)

    V. Mvandaba

    2018-05-01

    Full Text Available Water availability is one of the major societal issues facing the world. The ability to understand and quantify the impact of key hydrological processes, on the availability of water resources, is therefore integral to ensuring equitable and sustainable resource management. Channel transmission losses are an under-researched hydrological process that affects resource availability in many semi-arid regions such as the Limpopo River Basin in southern Africa, where the loss processes amount to approximately 30 % of the water balance. To improve the understanding of these loss processes and test the capability of modelling routines, three approaches using the Pitman model are applied to selected alluvial aquifer environments. The three approaches are an explicit transmission loss function, the use of a wetland function to represent channel-floodplain storage exchanges and the use of a dummy reservoir to represent floodplain storage and evapotranspiration losses. Results indicate that all three approaches are able to simulate channel transmission losses with differing impacts on the regional flows. A determination of which method best represents the channel transmission losses process requires further testing in a study area that has reliable observed historical records.

  7. Process modeling for Humanities: tracing and analyzing scientific processes

    OpenAIRE

    Hug , Charlotte; Salinesi , Camille; Deneckere , Rebecca; Lamasse , Stéphane

    2011-01-01

    International audience; This paper concerns epistemology and the understanding of research processes in Humanities, such as Archaeology. We believe that to properly understand research processes, it is essential to trace them. The collected traces depend on the process model established, which has to be as accurate as possible to exhaustively record the traces. In this paper, we briefly explain why the existing process models for Humanities are not sufficient to represent traces. We then pres...

  8. Improvement of sweating model in 2-Node Model and its application to thermal safety for hot environments

    Energy Technology Data Exchange (ETDEWEB)

    Ooka, Ryozo [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba Meguro-ku, Tokyo 153 8505 (Japan); Minami, Yuriko [Tokyo Electric Power Company, Tokyo (Japan); Sakoi, Tomonori [International Young Researchers Empowerment Center, Shinshu University, Nagano (Japan); Tsuzuki, Kazuyo [National Institute of Advanced Industrial Science and Technology, Tsukuba (Japan); Rijal, H.B. [Integrated Research System for Sustainability Science, The University of Tokyo, Tokyo (Japan)

    2010-07-15

    Recently, due to global warming and the heat-island effect, more and more people are exposed to the dangers of heat disorders. A hot thermal environment can be evaluated using various indices, such as new Standard Effective Temperature (SET{sup *}) using the 2-Node Model (2 NM), Wet Bulb Globe Temperature (WBGT), Predicted Heat Strain (PHS) model, and so on. The authors aim to develop a safety evaluation approach for hot environments. Subject experiments are performed in a laboratory to comprehend the physiological response of the human body. The results are compared with the computed values from the 2 NM and PHS models, and improved the sweating model in 2 NM in order to take into account the relationship with metabolic rate. A demonstration is provided of using the new sweating model for evaluating thermal safety in a hot environment. (author)

  9. Model of a programmable quantum processing unit based on a quantum transistor effect

    Science.gov (United States)

    Ablayev, Farid; Andrianov, Sergey; Fetisov, Danila; Moiseev, Sergey; Terentyev, Alexandr; Urmanchev, Andrey; Vasiliev, Alexander

    2018-02-01

    In this paper we propose a model of a programmable quantum processing device realizable with existing nano-photonic technologies. It can be viewed as a basis for new high performance hardware architectures. Protocols for physical implementation of device on the controlled photon transfer and atomic transitions are presented. These protocols are designed for executing basic single-qubit and multi-qubit gates forming a universal set. We analyze the possible operation of this quantum computer scheme. Then we formalize the physical architecture by a mathematical model of a Quantum Processing Unit (QPU), which we use as a basis for the Quantum Programming Framework. This framework makes it possible to perform universal quantum computations in a multitasking environment.

  10. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    Science.gov (United States)

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...

  11. Modeling Gene-Environment Interactions With Quasi-Natural Experiments.

    Science.gov (United States)

    Schmitz, Lauren; Conley, Dalton

    2017-02-01

    This overview develops new empirical models that can effectively document Gene × Environment (G×E) interactions in observational data. Current G×E studies are often unable to support causal inference because they use endogenous measures of the environment or fail to adequately address the nonrandom distribution of genes across environments, confounding estimates. Comprehensive measures of genetic variation are incorporated into quasi-natural experimental designs to exploit exogenous environmental shocks or isolate variation in environmental exposure to avoid potential confounders. In addition, we offer insights from population genetics that improve upon extant approaches to address problems from population stratification. Together, these tools offer a powerful way forward for G×E research on the origin and development of social inequality across the life course. © 2015 Wiley Periodicals, Inc.

  12. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  13. Modeling nuclear processes by Simulink

    International Nuclear Information System (INIS)

    Rashid, Nahrul Khair Alang Md

    2015-01-01

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples

  14. Pattern recognition model to estimate intergranular stress corrosion cracking (IGSCC) at crevices and pit sites of 304 SS in BWR environments

    International Nuclear Information System (INIS)

    Urquidi-Macdonald, Mirna

    2004-01-01

    Many publications have shown that crack growth rates (CGR) due to intergranular stress corrosion cracking (IGSCC) of metals is dependent on many parameters related to the manufacturing process of the steel and the environment to which the steel is exposed. Those parameters include, but are not restricted to, the concentration of chloride, fluoride, nitrates, and sulfates, pH, fluid velocity, electrochemical potential (ECP), electrolyte conductivity, stress and sensitization applied to the steel during its production and use. It is not well established how combinations of each of these parameters impact the CGR. Many different models and beliefs have been published, resulting in predictions that sometimes disagree with experimental observations. To some extent, the models are the closest to the nature of IGSCC, however, there is not a model that fully describes the entire range of observations, due to the difficulty of the problem. Among the models, the Fracture Environment Model, developed by Macdonald et al., is the most physico-chemical model, accounting for experimental observations in a wide range of environments or ECPs. In this work, we collected experimental data on BWR environments and designed a data mining pattern recognition model to learn from that data. The model was used to generate CGR estimations as a function of ECP on a BWR environment. The results of the predictive model were compared to the Fracture Environment Model predictions. The results from those two models are very close to the experimental observations of the area corresponding to creep and IGSCC controlled by diffusion. At more negative ECPs than the potential corresponding to creep, the pattern recognition predicts an increase of CGR with decreasing ECP, while the Fracture Environment Model predicts the opposite. The results of this comparison confirm that the pattern recognition model covers 3 phenomena: hydrogen embrittlement at very negative ECP, creep at intermediate ECP, and IGSCC

  15. Pattern recognition model to estimate intergranular stress corrosion cracking (IGSCC) at crevices and pit sites of 304 SS in BWR environments

    Energy Technology Data Exchange (ETDEWEB)

    Urquidi-Macdonald, Mirna [Penn State University, 212 Earth-Engineering Science Building, University Park, PA 16801 (United States)

    2004-07-01

    Many publications have shown that crack growth rates (CGR) due to intergranular stress corrosion cracking (IGSCC) of metals is dependent on many parameters related to the manufacturing process of the steel and the environment to which the steel is exposed. Those parameters include, but are not restricted to, the concentration of chloride, fluoride, nitrates, and sulfates, pH, fluid velocity, electrochemical potential (ECP), electrolyte conductivity, stress and sensitization applied to the steel during its production and use. It is not well established how combinations of each of these parameters impact the CGR. Many different models and beliefs have been published, resulting in predictions that sometimes disagree with experimental observations. To some extent, the models are the closest to the nature of IGSCC, however, there is not a model that fully describes the entire range of observations, due to the difficulty of the problem. Among the models, the Fracture Environment Model, developed by Macdonald et al., is the most physico-chemical model, accounting for experimental observations in a wide range of environments or ECPs. In this work, we collected experimental data on BWR environments and designed a data mining pattern recognition model to learn from that data. The model was used to generate CGR estimations as a function of ECP on a BWR environment. The results of the predictive model were compared to the Fracture Environment Model predictions. The results from those two models are very close to the experimental observations of the area corresponding to creep and IGSCC controlled by diffusion. At more negative ECPs than the potential corresponding to creep, the pattern recognition predicts an increase of CGR with decreasing ECP, while the Fracture Environment Model predicts the opposite. The results of this comparison confirm that the pattern recognition model covers 3 phenomena: hydrogen embrittlement at very negative ECP, creep at intermediate ECP, and IGSCC

  16. Generic models for use in assessing the impact of discharges of radioactive substances to the environment

    International Nuclear Information System (INIS)

    2001-01-01

    The concern of society in general for the quality of the environment and the realization that all human activities have some environmental effect has led to the development of a procedure for environmental impact analysis. This procedure is a predictive one, which forecasts probable environmental effects before some action, such as the construction and operation of a nuclear power station, is decided upon. The method of prediction is by the application of models that describe the environmental processes in mathematical terms in order to produce a quantitative result which can be used in the decision making process. This report describes such a procedure for application to radioactive discharges and is addressed to the national regulatory bodies and technical and administrative personnel responsible for performing environmental impact analyses. The report is also intended to support the recently published IAEA Safety Guide on Regulatory Control of Radioactive Discharges to the Environment. It expands on and supersedes previous advice published in IAEA Safety Series No. 57 on Generic Models and Parameters for Assessing the Environmental Transfer of Radionuclides from Routine Releases. This Safety Report was developed through a series of consultants meetings and three Advisory Group Meetings

  17. Design study of pyrochemical process operation by using virtual engineering models

    International Nuclear Information System (INIS)

    Kakehi, I.; Tozawa, K.; Matsumoto, T.; Tanaka, K.

    2000-04-01

    This report describes accomplishment of simulations of Pyrochemical Process Operation by using virtual engineering models. The pyrochemical process using molten salt electrorefining would introduce new technologies for new fuels of particle oxide, particle nitride and metallic fuels. This system is a batch treatment system of reprocessing and re-fabrication, which transports products of solid form from a process to next process. As a results, this system needs automated transport system for process operations by robotics. In this study, a simulation code system has been prepared, which provides virtual engineering environment to evaluate the pyrochemical process operation of a batch treatment system using handling robots. And the simulation study has been conducted to evaluate the required system functions, which are the function of handling robots, the interactions between robot and process equipment, and the time schedule of process, in the automated transport system by robotics. As a result of simulation of the process operation, which we have designed, the automated transport system by robotics of the pyrochemical process is realistic. And the issues for the system development have been pointed out. (author)

  18. Using 222Rn as a tracer of geophysical processes in underground environments

    International Nuclear Information System (INIS)

    Lacerda, T.; Anjos, R. M.; Valladares, D. L.; Rizzotto, M.; Velasco, H.; Rosas, J. P. de; Ayub, J. Juri; Silva, A. A. R. da; Yoshimura, E. M.

    2014-01-01

    Radon levels in two old mines in San Luis, Argentina, are reported and analyzed. These mines are today used for touristic visitation. Our goal was to assess the potential use of such radioactive noble gas as tracer of geological processes in underground environments. CR-39 nuclear track detectors were used during the winter and summer seasons. The findings show that the significant radon concentrations reported in this environment are subject to large seasonal modulations, due to the strong dependence of natural ventilation on the variations of outside temperature. The results also indicate that radon pattern distribution appear as a good method to localize unknown ducts, fissures or secondary tunnels in subterranean environments

  19. Mathematical modeling of heat treatment processes conserving biological activity of plant bioresources

    Science.gov (United States)

    Rodionova, N. S.; Popov, E. S.; Pozhidaeva, E. A.; Pynzar, S. S.; Ryaskina, L. O.

    2018-05-01

    The aim of this study is to develop a mathematical model of the heat exchange process of LT-processing to estimate the dynamics of temperature field changes and optimize the regime parameters, due to the non-stationarity process, the physicochemical and thermophysical properties of food systems. The application of LT-processing, based on the use of low-temperature modes in thermal culinary processing of raw materials with preliminary vacuum packaging in a polymer heat- resistant film is a promising trend in the development of technics and technology in the catering field. LT-processing application of food raw materials guarantees the preservation of biologically active substances in food environments, which are characterized by a certain thermolability, as well as extend the shelf life and high consumer characteristics of food systems that are capillary-porous bodies. When performing the mathematical modeling of the LT-processing process, the packet of symbolic mathematics “Maple” was used, as well as the mathematical packet flexPDE that uses the finite element method for modeling objects with distributed parameters. The processing of experimental results was evaluated with the help of the developed software in the programming language Python 3.4. To calculate and optimize the parameters of the LT processing process of polycomponent food systems, the differential equation of non-stationary thermal conductivity was used, the solution of which makes it possible to identify the temperature change at any point of the solid at different moments. The present study specifies data on the thermophysical characteristics of the polycomponent food system based on plant raw materials, with the help of which the physico-mathematical model of the LT- processing process has been developed. The obtained mathematical model allows defining of the dynamics of the temperature field in different sections of the LT-processed polycomponent food systems on the basis of calculating the

  20. Isotherm, kinetic and thermodynamics study of humic acid removal process from aquatic environment by chitosan nano particle

    Directory of Open Access Journals (Sweden)

    Maryam Ghafoori

    2016-09-01

    Full Text Available Background and Aim: Humic substances include natural organic polyelectrolyte materials that formed most of the dissolved organic carbon in aquatic environments. Reaction between humic substances and chlorine leading to formation of disinfection byproducts (DBPs those are toxic, carcinogenic and mutagenic. The aim of this study was investigation of isotherms, kinetics and thermodynamics of humic acid removal process by nano chitosan from aquatic environment. Materials and Methods: This practical research was an experimental study that performed in a batch system. The effect of various parameters such as pH, humic acid concentration, contact time, adsorbent dosage, isotherms, thermodynamics and Kinetics of humic acid adsorption process were investigated. Humic acid concentration measured using spectrophotometer at wave length of 254 nm. Results: The results of this research showed that maximum adsorption capacity of nanochitosan that fall out in concentration of 50 mg/l and contact time of 90 minutes was 52.34 mg/g. Also, the maximum adsorption was observed in pH = 4 and adsorbent dosage 0.02 g. Laboratory data show that adsorption of humic acid by nanochitosan follow the Langmuir isotherm model. According to result of thermodynamic study, entropy changes (ΔS was equal to 2.24 J/mol°k, enthalpy changes (ΔH was equal to 870 kJ/mol and Gibbs free energy (ΔG was negative that represent the adsorption process is spontaneous and endothermic. The kinetics of adsorption has a good compliant with pseudo second order model. Conclusion: Regarding to results of this study, nano chitosan can be suggested as a good adsorbent for the removal of humic acids from aqueous solutions.

  1. IoT-based user-driven service modeling environment for a smart space management system.

    Science.gov (United States)

    Choi, Hoan-Suk; Rhee, Woo-Seop

    2014-11-20

    The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service.

  2. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    -change of a homogeneous Levy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared......In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...... specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore we also analyze specifications obtained via a simple deterministic time...

  3. Work Environment Dialogue in a Swedish Municipality — Strengths and Limits of the Nordic Work Environment Model

    Directory of Open Access Journals (Sweden)

    Kaj Frick

    2013-01-01

    Full Text Available In the Nordic work environment model, health risks at work are mainly to be managed in cooperation with the employees and their representatives. The model is based on strong trade unions and is supported by the state through participatory rights and funding to produce and disseminate knowledge on risks and solutions. The model is evident in the large Swedish municipal sector with its strong unions and extensive social dialogue. However, municipal employees also face widespread risks, mainly from mental and physical overload. They led the costly wave of rising sickness absence from the late 1990s. Municipal (and other employers therefore attempt to reduce the absence. The rural municipality of Leksand started a project Hälsosam with the broad objectives to half the absence, implement a national agreement on better dialogue, make Leksand an attractive employer, and improve employee influence and work environment. The article’s objective is to use Hälsosam’s intervention project to explore the limits of what the Nordic work environment model can achieve against risks rooted in the employers’ prerogative of organizing, resourcing, and managing the operations that create the conditions at work. Hälsosam’s practice focused on sickness absence and the forms of the new national agreement. The absence was halved by reducing cases of long-term sickness. There was also workplace health promotion and the safety reps were supported through regular meetings. However, little was done to the extensive mental and physical overload revealed in a survey. Nor was the mandatory work environment management improved, as was ordered by the municipal council. This remained delegated to first-line managers who had a limited ability to handle work risks. This limited practice implemented Leksand’s political priority to reduce the absenteeism, while other objectives had less political support. The difficulties to improve the work environment and its management

  4. Object oriented business process modelling in RFID applied computing environment

    NARCIS (Netherlands)

    Zhao, X.; Liu, Chengfei; Lin, T.; Ranasinghe, D.C.; Sheng, Q.Z.

    2010-01-01

    As a tracking technology, Radio Frequency Identification (RFID) is now widely applied to enhance the context awareness of enterprise information systems. Such awareness provides great opportunities to facilitate business process automation and thereby improve operation efficiency and accuracy. With

  5. Towards a Development Environment for Model Based Test Design

    OpenAIRE

    Jing, Han

    2008-01-01

    Within the UP IP I&V organization there is high focus on increasing the ability to predict product quality in a cost efficient way. Test automation has therefore been an important enabler for us. The IP test design environment is continuously evolving and the investigations will show which improvements that is most important to implement in short and long term. In Ericsson UP IP I&V, the test automation framework environments are severed to complete some process by automated method, f...

  6. The porous surface model, a novel experimental system for online quantitative observation of microbial processes under unsaturated conditions

    DEFF Research Database (Denmark)

    Dechesne, Arnaud; Or, D.; Gulez, Gamze

    2008-01-01

    Water is arguably the most important constituent of microbial microhabitats due to its control of physical and physiological processes critical to microbial activity. In natural environments, bacteria often live on unsaturated surfaces, in thin (micrometric) liquid films. Nevertheless, no experim....... The PSM constitutes a tool uniquely adapted to study the influence of liquid film geometry on microbial processes. It should therefore contribute to uncovering mechanisms of microbial adaptation to unsaturated environments.......Water is arguably the most important constituent of microbial microhabitats due to its control of physical and physiological processes critical to microbial activity. In natural environments, bacteria often live on unsaturated surfaces, in thin (micrometric) liquid films. Nevertheless......, no experimental systems are available that allow real-time observation of bacterial processes in liquid films of controlled thickness. We propose a novel, inexpensive, easily operated experimental platform, termed the porous surface model (PSM) that enables quantitative real-time microscopic observations...

  7. The Interaction Model in iLearning Environments and its Use in the Smart Lab Concept

    Directory of Open Access Journals (Sweden)

    Yuliya Lyalina

    2011-11-01

    Full Text Available This paper identifies and discusses current trends and challenges, offers an overview of state-of-the-art technologies in the development of remote and smart laboratories, and introduces the iLearning interaction model. The use of the model allows reconstructing already- existing iLearning environments. The smart lab model is described for face-to-face, Mobile and Blended Learning. As a result, this allows offering new information technology that organizes the educational process according to learning type (face-to-face, hands-on learning, Life Long Learning, E-Learning, M-Learning, Blended learning, Game-based learning, etc.. The remote access Architecture and Interface for the multifunctional Smart Lab will be developed.

  8. CLEW: A Cooperative Learning Environment for the Web.

    Science.gov (United States)

    Ribeiro, Marcelo Blois; Noya, Ricardo Choren; Fuks, Hugo

    This paper outlines CLEW (collaborative learning environment for the Web). The project combines MUD (Multi-User Dimension), workflow, VRML (Virtual Reality Modeling Language) and educational concepts like constructivism in a learning environment where students actively participate in the learning process. The MUD shapes the environment structure.…

  9. Mapping care processes within a hospital: from theory to a web-based proposal merging enterprise modelling and ISO normative principles.

    Science.gov (United States)

    Staccini, Pascal; Joubert, Michel; Quaranta, Jean-François; Fieschi, Marius

    2005-03-01

    Today, the economic and regulatory environment, involving activity-based and prospective payment systems, healthcare quality and risk analysis, traceability of the acts performed and evaluation of care practices, accounts for the current interest in clinical and hospital information systems. The structured gathering of information relative to users' needs and system requirements is fundamental when installing such systems. This stage takes time and is generally misconstrued by caregivers and is of limited efficacy to analysts. We used a modelling technique designed for manufacturing processes (IDEF0/SADT). We enhanced the basic model of an activity with descriptors extracted from the Ishikawa cause-and-effect diagram (methods, men, materials, machines, and environment). We proposed an object data model of a process and its components, and programmed a web-based tool in an object-oriented environment. This tool makes it possible to extract the data dictionary of a given process from the description of its elements and to locate documents (procedures, recommendations, instructions) according to each activity or role. Aimed at structuring needs and storing information provided by directly involved teams regarding the workings of an institution (or at least part of it), the process-mapping approach has an important contribution to make in the analysis of clinical information systems.

  10. Multi objective optimization model for minimizing production cost and environmental impact in CNC turning process

    Science.gov (United States)

    Widhiarso, Wahyu; Rosyidi, Cucuk Nur

    2018-02-01

    Minimizing production cost in a manufacturing company will increase the profit of the company. The cutting parameters will affect total processing time which then will affect the production cost of machining process. Besides affecting the production cost and processing time, the cutting parameters will also affect the environment. An optimization model is needed to determine the optimum cutting parameters. In this paper, we develop an optimization model to minimize the production cost and the environmental impact in CNC turning process. The model is used a multi objective optimization. Cutting speed and feed rate are served as the decision variables. Constraints considered are cutting speed, feed rate, cutting force, output power, and surface roughness. The environmental impact is converted from the environmental burden by using eco-indicator 99. Numerical example is given to show the implementation of the model and solved using OptQuest of Oracle Crystal Ball software. The results of optimization indicate that the model can be used to optimize the cutting parameters to minimize the production cost and the environmental impact.

  11. Multiphysics modelling of manufacturing processes: A review

    DEFF Research Database (Denmark)

    Jabbari, Masoud; Baran, Ismet; Mohanty, Sankhya

    2018-01-01

    Numerical modelling is increasingly supporting the analysis and optimization of manufacturing processes in the production industry. Even if being mostly applied to multistep processes, single process steps may be so complex by nature that the needed models to describe them must include multiphysics...... the diversity in the field of modelling of manufacturing processes as regards process, materials, generic disciplines as well as length scales: (1) modelling of tape casting for thin ceramic layers, (2) modelling the flow of polymers in extrusion, (3) modelling the deformation process of flexible stamps...... for nanoimprint lithography, (4) modelling manufacturing of composite parts and (5) modelling the selective laser melting process. For all five examples, the emphasis is on modelling results as well as describing the models in brief mathematical details. Alongside with relevant references to the original work...

  12. Using ecosystem services to represent the environment in hydro-economic models

    Science.gov (United States)

    Momblanch, Andrea; Connor, Jeffery D.; Crossman, Neville D.; Paredes-Arquiola, Javier; Andreu, Joaquín

    2016-07-01

    Demand for water is expected to grow in line with global human population growth, but opportunities to augment supply are limited in many places due to resource limits and expected impacts of climate change. Hydro-economic models are often used to evaluate water resources management options, commonly with a goal of understanding how to maximise water use value and reduce conflicts among competing uses. The environment is now an important factor in decision making, which has resulted in its inclusion in hydro-economic models. We reviewed 95 studies applying hydro-economic models, and documented how the environment is represented in them and the methods they use to value environmental costs and benefits. We also sought out key gaps and inconsistencies in the treatment of the environment in hydro-economic models. We found that representation of environmental values of water is patchy in most applications, and there should be systematic consideration of the scope of environmental values to include and how they should be valued. We argue that the ecosystem services framework offers a systematic approach to identify the full range of environmental costs and benefits. The main challenges to more holistic representation of the environment in hydro-economic models are the current limits to understanding of ecological functions which relate physical, ecological and economic values and critical environmental thresholds; and the treatment of uncertainty.

  13. Configurable multi-perspective business process models

    NARCIS (Netherlands)

    La Rosa, M.; Dumas, M.; Hofstede, ter A.H.M.; Mendling, J.

    2011-01-01

    A configurable process model provides a consolidated view of a family of business processes. It promotes the reuse of proven practices by providing analysts with a generic modeling artifact from which to derive individual process models. Unfortunately, the scope of existing notations for

  14. Probabilistic evaluation of process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2016-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  15. Construction material processed using lunar simulant in various environments

    Science.gov (United States)

    Chase, Stan; Ocallaghan-Hay, Bridget; Housman, Ralph; Kindig, Michael; King, John; Montegrande, Kevin; Norris, Raymond; Vanscotter, Ryan; Willenborg, Jonathan; Staubs, Harry

    1995-01-01

    The manufacture of construction materials from locally available resources in space is an important first step in the establishment of lunar and planetary bases. The objective of the CoMPULSIVE (Construction Material Processed Using Lunar Simulant In Various Environments) experiment is to develop a procedure to produce construction materials by sintering or melting Johnson Space Center Simulant 1 (JSC-1) lunar soil simulant in both earth-based (1-g) and microgravity (approximately 0-g) environments. The characteristics of the resultant materials will be tested to determine its physical and mechanical properties. The physical characteristics include: crystalline, thermal, and electrical properties. The mechanical properties include: compressive tensile, and flexural strengths. The simulant, placed in a sealed graphite crucible, will be heated using a high temperature furnace. The crucible will then be cooled by radiative and forced convective means. The core furnace element consists of space qualified quartz-halogen incandescent lamps with focusing mirrors. Sample temperatures of up to 2200 C are attainable using this heating method.

  16. Environment modeling using runtime values for JPF-Android

    CSIR Research Space (South Africa)

    Van der Merwe, H

    2015-11-01

    Full Text Available , the environment of an application is simplified/abstracted using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution...

  17. Business Process Innovation using the Process Innovation Laboratory

    DEFF Research Database (Denmark)

    Møller, Charles

    for practical applications has not been identified. The aim of this paper is to establish a conceptual framework for business process innovation in the supply chain based on advanced enterprise systems. The main approach to business process innovation in this context is to create a new methodology for exploring...... process models and patterns of applications. The paper thus presents a new concept for business process innovation called the process innovation laboratory a.k.a. the ?-Lab. The ?-Lab is a comprehensive framework for BPI using advanced enterprise systems. The ?-Lab is a collaborative workspace...... for experimenting with process models and an explorative approach to study integrated modeling in a controlled environment. The ?-Lab facilitates innovation by using an integrated action learning approach to process modeling including contemporary technological, organizational and business perspectives....

  18. Incorporating pushing in exclusion-process models of cell migration.

    Science.gov (United States)

    Yates, Christian A; Parker, Andrew; Baker, Ruth E

    2015-05-01

    The macroscale movement behavior of a wide range of isolated migrating cells has been well characterized experimentally. Recently, attention has turned to understanding the behavior of cells in crowded environments. In such scenarios it is possible for cells to interact, inducing neighboring cells to move in order to make room for their own movements or progeny. Although the behavior of interacting cells has been modeled extensively through volume-exclusion processes, few models, thus far, have explicitly accounted for the ability of cells to actively displace each other in order to create space for themselves. In this work we consider both on- and off-lattice volume-exclusion position-jump processes in which cells are explicitly allowed to induce movements in their near neighbors in order to create space for themselves to move or proliferate into. We refer to this behavior as pushing. From these simple individual-level representations we derive continuum partial differential equations for the average occupancy of the domain. We find that, for limited amounts of pushing, comparison between the averaged individual-level simulations and the population-level model is nearly as good as in the scenario without pushing. Interestingly, we find that, in the on-lattice case, the diffusion coefficient of the population-level model is increased by pushing, whereas, for the particular off-lattice model that we investigate, the diffusion coefficient is reduced. We conclude, therefore, that it is important to consider carefully the appropriate individual-level model to use when representing complex cell-cell interactions such as pushing.

  19. Syntax highlighting in business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Freytag, T.; Mendling, J.; Eckleder, A.

    2011-01-01

    Sense-making of process models is an important task in various phases of business process management initiatives. Despite this, there is currently hardly any support in business process modeling tools to adequately support model comprehension. In this paper we adapt the concept of syntax

  20. Genetic Process Mining: Alignment-based Process Model Mutation

    NARCIS (Netherlands)

    Eck, van M.L.; Buijs, J.C.A.M.; Dongen, van B.F.; Fournier, F.; Mendling, J.

    2015-01-01

    The Evolutionary Tree Miner (ETM) is a genetic process discovery algorithm that enables the user to guide the discovery process based on preferences with respect to four process model quality dimensions: replay fitness, precision, generalization and simplicity. Traditionally, the ETM algorithm uses

  1. Integrated catchment modelling within a strategic planning and decision making process: Werra case study

    Science.gov (United States)

    Dietrich, Jörg; Funke, Markus

    Integrated water resources management (IWRM) redefines conventional water management approaches through a closer cross-linkage between environment and society. The role of public participation and socio-economic considerations becomes more important within the planning and decision making process. In this paper we address aspects of the integration of catchment models into such a process taking the implementation of the European Water Framework Directive (WFD) as an example. Within a case study situated in the Werra river basin (Central Germany), a systems analytic decision process model was developed. This model uses the semantics of the Unified Modeling Language (UML) activity model. As an example application, the catchment model SWAT and the water quality model RWQM1 were applied to simulate the effect of phosphorus emissions from non-point and point sources on water quality. The decision process model was able to guide the participants of the case study through the interdisciplinary planning and negotiation of actions. Further improvements of the integration framework include tools for quantitative uncertainty analyses, which are crucial for real life application of models within an IWRM decision making toolbox. For the case study, the multi-criteria assessment of actions indicates that the polluter pays principle can be met at larger scales (sub-catchment or river basin) without significantly compromising cost efficiency for the local situation.

  2. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Vigil,Benny Manuel [Los Alamos National Laboratory; Ballance, Robert [SNL; Haskell, Karen [SNL

    2012-08-09

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  3. Centrifuge modelling of contaminant transport processes

    OpenAIRE

    Culligan, P. J.; Savvidou, C.; Barry, D. A.

    1996-01-01

    Over the past decade, research workers have started to investigate problems of subsurface contaminant transport through physical modelling on a geotechnical centrifuge. A major advantage of this apparatus is its ability to model complex natural systems in a controlled laboratory environment In this paper, we discusses the principles and scaling laws related to the centrifugal modelling of contaminant transport, and presents four examples of recent work that has bee...

  4. A modelling framework for the transport, transformation and biouptake of manufactured nanoparticles in the aquatic environment

    Science.gov (United States)

    Lofts, Stephen; Keller, Virginie; Dumont, Egon; Williams, Richard; Praetorius, Antonia; von der Kammer, Frank

    2016-04-01

    The development of innovative new chemical products is a key aspect of the modern economy, yet society demands that such development is environmentally sustainable. Developing knowledge of how new classes of chemicals behave following release to the environment is key to understanding the hazards that will potentially result. Nanoparticles are a key example of a class of chemicals that have undergone a significant expansion in production and use in recent years and so there is a need to develop tools to predict their potential hazard following their deliberate or incidental release to the environment. Generalising the understanding of the environmental behaviour of manufactured nanoparticles in general is challenging, as they are chemically and physically diverse (e.g. metals, metal oxides, carbon nanotubes, cellulose, quantum dots). Furthermore, nanoparticles may be manufactured with capping agents to modify their desired behaviour in industrial applications; such agents may also influence their environmental behaviour. Also, nanoparticles may become significantly modified from their as-manufactured forms both prior to and after the point of environmental release. Tools for predicting nanoparticle behaviour and hazard need to be able to consider a wide range of release scenarios and aspects of nanoparticle behaviour in the environment (e.g. dissolution, transformation of capping agents, agglomeration and aggregation behaviour), where such behaviours are not shared by all types of nanoparticle. This implies the need for flexible, futureproofed tools capable of being updated to take new understanding of behavioural processes into account as such knowledge emerges. This presentation will introduce the NanoFASE model system, a multimedia modelling framework for the transport, transformation and biouptake of manufactured nanoparticles. The complete system will comprise atmospheric, terrestrial and aquatic compartments to allow holistic simulation of nanoparticles; this

  5. Analysis of the PPBE Process in the Current Dynamic Political Environment

    Science.gov (United States)

    2008-06-01

    provides a comparative analysis, using the Political, Economic, Socio-Cultural, Technological, Ecological and Legal (PESTEL) analysis model, of the 1960/1970 era and of the post-9/11 environment.

  6. A Phenomena-Oriented Environment for Teaching Process Modeling: Novel Modeling Software and Its Use in Problem Solving.

    Science.gov (United States)

    Foss, Alan S.; Geurts, Kevin R.; Goodeve, Peter J.; Dahm, Kevin D.; Stephanopoulos, George; Bieszczad, Jerry; Koulouris, Alexandros

    1999-01-01

    Discusses a program that offers students a phenomenon-oriented environment expressed in the fundamental concepts and language of chemical engineering such as mass and energy balancing, phase equilibria, reaction stoichiometry and rate, modes of heat, and species transport. (CCM)

  7. Multiscale Computing with the Multiscale Modeling Library and Runtime Environment

    NARCIS (Netherlands)

    Borgdorff, J.; Mamonski, M.; Bosak, B.; Groen, D.; Ben Belgacem, M.; Kurowski, K.; Hoekstra, A.G.

    2013-01-01

    We introduce a software tool to simulate multiscale models: the Multiscale Coupling Library and Environment 2 (MUSCLE 2). MUSCLE 2 is a component-based modeling tool inspired by the multiscale modeling and simulation framework, with an easy-to-use API which supports Java, C++, C, and Fortran. We

  8. Cognitive Virtualization: Combining Cognitive Models and Virtual Environments

    International Nuclear Information System (INIS)

    Tuan Q. Tran; David I. Gertman; Donald D. Dudenhoeffer; Ronald L. Boring; Alan R. Mecham

    2007-01-01

    3D manikins are often used in visualizations to model human activity in complex settings. Manikins assist in developing understanding of human actions, movements and routines in a variety of different environments representing new conceptual designs. One such environment is a nuclear power plant control room, where they have the potential to be used to simulate more precise ergonomic assessments of human work stations. Next generation control rooms will pose numerous challenges for system designers. The manikin modeling approach by itself, however, may be insufficient for dealing with the desired technical advancements and challenges of next generation automated systems. Uncertainty regarding effective staffing levels and the potential for negative human performance consequences in the presence of advanced automated systems (e.g., reduced vigilance, poor situation awareness, mistrust or blind faith in automation, higher information load and increased complexity) call for further research. Baseline assessment of novel control room equipment and configurations needs to be conducted. These design uncertainties can be reduced through complementary analysis that merges ergonomic manikin models with models of higher cognitive functions, such as attention, memory, decision-making, and problem-solving. This paper will discuss recent advancements in merging a theory-driven cognitive modeling framework with a 3D visualization modeling tool to evaluate next generation control room human factors and ergonomics. Though this discussion primarily focuses on control room design, the application of such a merger between 3D visualization and cognitive modeling can be extended to various areas of focus such as training and scenario planning.

  9. Enhancing Network Data Obliviousness in Trusted Execution Environment-based Stream Processing Systems

    KAUST Repository

    Alsibyani, Hassan M.

    2018-05-15

    Cloud computing usage is increasing and a common concern is the privacy and security of the data and computation. Third party cloud environments are not considered fit for processing private information because the data will be revealed to the cloud provider. However, Trusted Execution Environments (TEEs), such as Intel SGX, provide a way for applications to run privately and securely on untrusted platforms. Nonetheless, using a TEE by itself for stream processing systems is not sufficient since network communication patterns may leak properties of the data under processing. This work addresses leaky topology structures and suggests mitigation techniques for each of these. We create specific metrics to evaluate leaks occurring from the network patterns; the metrics measure information leaked when the stream processing system is running. We consider routing techniques for inter-stage communication in a streaming application to mitigate this data leakage. We consider a dynamic policy to change the mitigation technique depending on how much information is currently leaking. Additionally, we consider techniques to hide irregularities resulting from a filtering stage in a topology. We also consider leakages resulting from applications containing cycles. For each of the techniques, we explore their effectiveness in terms of the advantage they provide in overcoming the network leakage. The techniques are tested partly using simulations and some were implemented in a prototype SGX-based stream processing system.

  10. CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling

    Science.gov (United States)

    Rose, B. E. J.

    2015-12-01

    Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) are invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields. However CLIMLAB is well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
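
    The kind of component-based, interactive workflow described above can be illustrated with a short Python sketch. This is a hedged example, not taken from the abstract: it assumes the open-source climlab package is installed and that its EBM class, integrate_years method and Ts state variable behave as in the project documentation, which may differ between versions.

      # Minimal sketch of process-oriented climate modeling with CLIMLAB
      # (assumes `pip install climlab`; names follow the public documentation).
      import climlab

      # A simple zonal-mean energy balance model, itself a composite of
      # radiation, albedo and heat-transport process components.
      ebm = climlab.EBM(num_lat=90)
      print(ebm)                  # show the tree of component processes

      ebm.integrate_years(5.0)    # step the coupled processes forward in time
      print(ebm.Ts.mean())        # global-mean surface temperature (deg C)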

  11. The triconnected abstraction of process models

    OpenAIRE

    Polyvyanyy, Artem; Smirnov, Sergey; Weske, Mathias

    2009-01-01

    Contents: Artem Polyvyanyy, Sergey Smirnov, and Mathias Weske, The Triconnected Abstraction of Process Models. 1 Introduction; 2 Business Process Model Abstraction; 3 Preliminaries; 4 Triconnected Decomposition (4.1 Basic Approach for Process Component Discovery; 4.2 SPQR-Tree Decomposition; 4.3 SPQR-Tree Fragments in the Context of Process Models); 5 Triconnected Abstraction (5.1 Abstraction Rules; 5.2 Abstraction Algorithm); 6 Related Work and Conclusions

  12. Modeling Suspension and Continuation of a Process

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2012-04-01

    Full Text Available This work focuses on the difficulties an analyst encounters when modeling suspension and continuation of a process in contemporary process modeling languages. As a basis, a general lifecycle of an activity is introduced and then compared to the activity lifecycles supported by individual process modeling languages. The comparison shows that the contemporary process modeling languages cover the defined general lifecycle of an activity only partially. Two popular process modeling languages are then selected and a real example is modeled, reviewing how each language copes with its lack of native support for suspension and continuation of an activity. Given the unsatisfactory results of the contemporary process modeling languages on the modeled example, a new process modeling language is presented which, as demonstrated, is capable of capturing suspension and continuation of an activity in a much simpler and more precise way.

  13. High-school students' reasoning while constructing plant growth models in a computer-supported educational environment

    Science.gov (United States)

    Ergazaki, Marida; Komis, Vassilis; Zogza, Vassiliki

    2005-08-01

    This paper highlights specific aspects of high-school students’ reasoning while coping with a modeling task of plant growth in a computer-supported educational environment. It is particularly concerned with the modeling levels (‘macro-phenomenological’ and ‘micro-conceptual’ level) activated by peers while exploring plant growth and with their ability to shift between or within these levels. The focus is on the types of reasoning developed in the modeling process, as well as on the reasoning coherence around the central concept of plant growth. The findings of the study show that a significant proportion of the 18 participating dyads perform modeling on both levels, while their ability to shift between them as well as between the various elements of the ‘micro-conceptual’ level is rather constrained. Furthermore, the reasoning types identified in peers’ modeling process are ‘convergent’, ‘serial’, ‘linked’ and ‘convergent attached’, with the first type being the most frequent. Finally, a significant part of the participating dyads display a satisfactory degree of reasoning ‘coherence’, performing their task committed to the main objective of exploring plant growth. Teaching implications of the findings are also discussed.

  14. [Analytic methods for seed models with genotype x environment interactions].

    Science.gov (United States)

    Zhu, J

    1996-01-01

    Genetic models with genotype effect (G) and genotype x environment interaction effect (GE) are proposed for analyzing generation means of seed quantitative traits in crops. The total genetic effect (G) is partitioned into seed direct genetic effect (G0), cytoplasm genetic effect (C), and maternal plant genetic effect (Gm). Seed direct genetic effect (G0) can be further partitioned into direct additive (A) and direct dominance (D) genetic components. Maternal genetic effect (Gm) can also be partitioned into maternal additive (Am) and maternal dominance (Dm) genetic components. The total genotype x environment interaction effect (GE) can also be partitioned into direct genetic by environment interaction effect (G0E), cytoplasm genetic by environment interaction effect (CE), and maternal genetic by environment interaction effect (GmE). G0E can be partitioned into direct additive by environment interaction (AE) and direct dominance by environment interaction (DE) genetic components. GmE can also be partitioned into maternal additive by environment interaction (AmE) and maternal dominance by environment interaction (DmE) genetic components. Partitions of genetic components are listed for parent, F1, F2 and backcrosses. A set of parents, their reciprocal F1 and F2 seeds is applicable for efficient analysis of seed quantitative traits. The MINQUE(0/1) method can be used for estimating variance and covariance components. Unbiased estimation for covariance components between two traits can also be obtained by the MINQUE(0/1) method. Random genetic effects in seed models are predictable by the Adjusted Unbiased Prediction (AUP) approach with the MINQUE(0/1) method. The jackknife procedure is suggested for estimating sampling variances of the estimated variance and covariance components and of the predicted genetic effects, which can be further used in t-tests for parameters. Unbiasedness and efficiency for estimating variance components and predicting genetic effects are tested by

  15. CONSERVATION PROCESS MODEL (CPM: A TWOFOLD SCIENTIFIC RESEARCH SCOPE IN THE INFORMATION MODELLING FOR CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    D. Fiorani

    2017-05-01

    Full Text Available The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both BIM environment and ontology formalisation. Although BIM has been successfully experimented within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, applications developed so far have been shown to adapt BIM poorly to conservation design, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine the achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible knowledge representation, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalized by the model. A special focus is directed on decay analysis and the surfaces conservation project.

  16. Developing and testing a model of psychosocial work environment and performance

    DEFF Research Database (Denmark)

    Edwards, Kasper; Pejtersen, Jan Hyld; Møller, Niels

    2011-01-01

    Good psychosocial work environment has been assumed to result in good work performance. However, little documentation exists which supports the claim, and the same goes for the opposite claim. This paper reports findings from a combined quantitative and qualitative study of the relationship between...... psychosocial work environment and performance in a large Danish firm. The objects of the study were more than 45 customer centers with 9-20 employees each. A substantial database covering the 45 customer centers over a period of 5 years has been gathered. In this period the Copenhagen psychosocial...... questionnaire (COPSOQ) has been used two times with two years in between. This allows us to build a model of the relationship between psychosocial work environment, selected context variables and performance data. The model proposes that good psychosocial work environment is a function of leadership which...

  18. An Investigation of Anaerobic Processes in Fuel/Natural Seawater Environments

    Science.gov (United States)

    2012-02-08

    separated esters and glycerin. Biodiesel contains no sulfur. In the United States the term "biodiesel" is standardized as fatty acid methyl ester (FAME) ... crude oil remaining. Biodiesel is produced from vegetable oils by converting the triglyceride oils to methyl (or ethyl) esters with a process known ... water from the environment. Microbial growth in seawater can be limited by nutrients, including carbon. Biodiesel methyl esters are quite sparingly

  19. Atrazine degradation using chemical-free process of USUV: Analysis of the micro-heterogeneous environments and the degradation mechanisms

    International Nuclear Information System (INIS)

    Xu, L.J.; Chu, W.; Graham, Nigel

    2014-01-01

    Graphical abstract: - Highlights: • Two chemical-free AOP processes are combined to enhance atrazine degradation. • ATZ degradation in the sonophotolytic process was analyzed using a previously proposed model. • The micro-bubble/liquid heterogeneous environments in sonolytic processes were investigated. • The salt effects on different sonolytic processes were examined. • ATZ degradation mechanisms were investigated and pathways were proposed. - Abstract: The effectiveness of sonolysis (US), photolysis (UV), and sonophotolysis (USUV) for the degradation of atrazine (ATZ) was investigated. An untypical kinetics analysis was found useful to describe the combined process, which is compatible with pseudo-first-order kinetics. The heterogeneous environments of two different ultrasounds (20 and 400 kHz) were evaluated. The heterogeneous distribution of ATZ in the ultrasonic solution was found critical in determining the reaction rates at different frequencies. The presence of NaCl would promote or inhibit the rates through the growth and decline of the “salting out” effect and surface tension. The benefits of combining the two processes were for the first time investigated from the aspect of promoting the degradation of intermediates that were resistant in the individual processes. UV caused a rapid transformation of ATZ to 2-hydroxyatrazine (OIET), which was insensitive to UV irradiation; however, US and USUV were able to degrade OIET and other intermediates through •OH attack. On the other hand, UV irradiation also could promote radical generation via H2O2 decomposition, thereby resulting in less accumulation of the more hydrophilic intermediates, which are difficult to degrade in the US process. Reaction pathways for ATZ degradation by all three processes are proposed. USUV achieved the greatest degree of ATZ mineralization with more than 60% TOC removed, contributed solely by the oxidation of side chains. Ammeline was found to be the only end-product in both US and USUV

  20. Atrazine degradation using chemical-free process of USUV: Analysis of the micro-heterogeneous environments and the degradation mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Xu, L.J., E-mail: xulijie827@gmail.com [Department of Civil and Environmental Engineering, The Hong Kong Polytechnic University, Hung Hom, Kowloon (Hong Kong); Chu, W., E-mail: cewchu@polyu.edu.hk [Department of Civil and Environmental Engineering, The Hong Kong Polytechnic University, Hung Hom, Kowloon (Hong Kong); Graham, Nigel, E-mail: n.graham@imperial.ac.uk [Department of Civil and Environmental Engineering, Imperial College London, South Kensington Campus, London SW7 2AZ (United Kingdom)

    2014-06-30

    Graphical abstract: - Highlights: • Two chemical-free AOP processes are combined to enhance atrazine degradation. • ATZ degradation in the sonophotolytic process was analyzed using a previously proposed model. • The micro-bubble/liquid heterogeneous environments in sonolytic processes were investigated. • The salt effects on different sonolytic processes were examined. • ATZ degradation mechanisms were investigated and pathways were proposed. - Abstract: The effectiveness of sonolysis (US), photolysis (UV), and sonophotolysis (USUV) for the degradation of atrazine (ATZ) was investigated. An untypical kinetics analysis was found useful to describe the combined process, which is compatible with pseudo-first-order kinetics. The heterogeneous environments of two different ultrasounds (20 and 400 kHz) were evaluated. The heterogeneous distribution of ATZ in the ultrasonic solution was found critical in determining the reaction rates at different frequencies. The presence of NaCl would promote or inhibit the rates through the growth and decline of the “salting out” effect and surface tension. The benefits of combining the two processes were for the first time investigated from the aspect of promoting the degradation of intermediates that were resistant in the individual processes. UV caused a rapid transformation of ATZ to 2-hydroxyatrazine (OIET), which was insensitive to UV irradiation; however, US and USUV were able to degrade OIET and other intermediates through •OH attack. On the other hand, UV irradiation also could promote radical generation via H2O2 decomposition, thereby resulting in less accumulation of the more hydrophilic intermediates, which are difficult to degrade in the US process. Reaction pathways for ATZ degradation by all three processes are proposed. USUV achieved the greatest degree of ATZ mineralization with more than 60% TOC removed, contributed solely by the oxidation of side chains. Ammeline was found to be the only end-product in both US

  1. New model of enterprises resource planning implementation planning process in manufacturing enterprises

    Directory of Open Access Journals (Sweden)

    Mirjana Misita

    2016-05-01

    Full Text Available This article presents a new model of the enterprise resource planning implementation planning process in manufacturing enterprises, based on assessment of risk sources. This assessment was performed by applying the analytic hierarchy process. The analytic hierarchy process method allows the relative importance of specific risk sources to vary depending on the section from which the risk source originates (organizational environment, technical issues, people issues, adoption process management, and external support). A survey was conducted on 85 manufacturing enterprises involved with an enterprise resource planning solution. Ranking of the risk source assessments returns the most frequent risks to enterprise resource planning implementation success in manufacturing enterprises, and representative factors were isolated through factor analysis by risk source origin. Finally, results indicate that there are hidden causes of failed implementation, for example, the risk source “top management training and education” from the risk origin “adoption process management.”

  2. Modeling physiological processes that relate toxicant exposure and bacterial population dynamics.

    Directory of Open Access Journals (Sweden)

    Tin Klanjscek

    Full Text Available Quantifying effects of toxicant exposure on metabolic processes is crucial to predicting microbial growth patterns in different environments. Mechanistic models, such as those based on Dynamic Energy Budget (DEB) theory, can link physiological processes to microbial growth. Here we expand the DEB framework to include explicit consideration of the role of reactive oxygen species (ROS). Extensions considered are: (i) additional terms in the equation for the "hazard rate" that quantifies mortality risk; (ii) a variable representing environmental degradation; (iii) a mechanistic description of toxic effects linked to increase in ROS production and aging acceleration, and to non-competitive inhibition of transport channels; (iv) a new representation of the "lag time" based on energy required for acclimation. We estimate model parameters using calibrated Pseudomonas aeruginosa optical density growth data for seven levels of cadmium exposure. The model reproduces growth patterns for all treatments with a single common parameter set, and bacterial growth for treatments of up to 150 mg(Cd)/L can be predicted reasonably well using parameters estimated from cadmium treatments of 20 mg(Cd)/L and lower. Our approach is an important step towards connecting levels of biological organization in ecotoxicology. The presented model reveals possible connections between processes that are not obvious from purely empirical considerations, enables validation and hypothesis testing by creating testable predictions, and identifies research required to further develop the theory.
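
    As a purely illustrative companion to the abstract above, the sketch below integrates a toy growth equation in which biomass grows logistically and a toxicant-dependent hazard term removes it; it is not the published DEB formulation, and all parameter names and values are hypothetical.

      # Toy toxicant-modified growth model (hypothetical parameters, not the DEB model).
      from scipy.integrate import solve_ivp

      def growth(t, y, r, K, cd, k_tox):
          """Biomass grows logistically; mortality hazard rises linearly with cadmium cd."""
          biomass = y[0]
          hazard = k_tox * cd                      # assumed linear toxicant hazard
          return [r * biomass * (1.0 - biomass / K) - hazard * biomass]

      sol = solve_ivp(growth, (0.0, 48.0), [0.01],
                      args=(0.5, 1.0, 20.0, 0.005))
      print(sol.y[0][-1])   # biomass (optical-density proxy) after 48 h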

  3. Estimation of environment-related properties of chemicals for design of sustainable processes: Development of group-contribution+ (GC+) models and uncertainty analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Kalakul, Sawitree; Sarup, Bent

    2012-01-01

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI)) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncertainties of estimated...... property values. For this purpose, a systematic methodology for property modeling and uncertainty analysis is used. The methodology includes a parameter estimation step to determine parameters of property models and an uncertainty analysis step to establish statistical information about the quality......, poly functional chemicals, etc.) taken from the database of the US Environmental Protection Agency (EPA) and from the database of USEtox is used. For property modeling and uncertainty analysis, the Marrero and Gani GC method and atom connectivity index method have been considered. In total, 22

  4. Modeling Electrostatic Fields Generated by Internal Charging of Materials in Space Radiation Environments

    Science.gov (United States)

    Minow, Joseph I.

    2011-01-01

    Internal charging is a risk to spacecraft in energetic electron environments. The DICTAT and NUMIT computational codes are the most widely used engineering tools for evaluating internal charging of insulator materials exposed to these environments. Engineering tools are designed for rapid evaluation of ESD threats, but there is a need for more physics-based models for investigating the science of materials interactions with energetic electron environments. Current tools are limited by the physics included in the models and by ease of user implementation; additional development work is needed to improve the models.

  5. Declarative modeling for process supervision

    International Nuclear Information System (INIS)

    Leyval, L.

    1989-01-01

    Our work is a contribution to computer aided supervision of continuous processes. It is inspired by an area of Artificial Intelligence: qualitative physics. Here, supervision is based on a model which continuously provides operators with a synthetic view of the process; but this model is founded on general principles of control theory rather than on physics. It involves concepts such as high gain or small time response. It helps in temporally linking the evolution of various variables. Moreover, the model provides predictions of the future behaviour of the process, which allows action advice and alarm filtering. This should greatly reduce the well-known cognitive overload associated with any complex and dangerous evolution of the process.

  6. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

    Full Text Available To study economic phenomena and processes using mathematical modeling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package. Any economic process or phenomenon has a mathematical description of its behavior, from which an economic and mathematical model is drawn up with the following stages: formulation of the problem, analysis of the process being modeled, production of the model, design verification, validation and implementation of the model. This article presents an economic model whose modeling uses mathematical equations and the MatLab software package, which helps us approximate an effective solution. The input data considered are the net cost, the direct cost, the total cost and the link between them. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with a graphic representation and interpretation of the results achieved in terms of our specific problem.

  7. In situ observation of plutonium transfer processes in the marine environment

    International Nuclear Information System (INIS)

    Guary, J.-C.; Fraizier, Andre

    1975-09-01

    A preliminary observation of plutonium transfer processes in the marine environment was carried out and showed that the concentration of the radionuclide was lower when marine organisms stood at a higher trophic level. This observation, supplemented by an investigation of contamination pathways, showed that plutonium was not concentrated along the food chain and that its uptake occurred preferentially by direct contact of species with seawater, a process chiefly affecting producers and primary consumers. It appeared that the marine sediment was not a significant vector of plutonium transfer in burrowing species [fr

  8. Impact Modelling for Circular Economy: Geodesign Discussion Support Environment

    NARCIS (Netherlands)

    Šileryte, R.; Wandl, A.; van Timmeren, A.; Bregt, Arnold; Sarjakoski, Tapani; van Lammeren, Ron; Rip, Frans

    2017-01-01

    Transitioning towards circular economy requires changes in the current system which yield a number of impacts on such fundamental values as human health, natural environment, exhaustible resources, social well-being and prosperity. Moreover, this process involves multiple actors and requires careful

  9. Styles in business process modeling: an exploration and a model

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Fahland, D.; Weidlich, M.; Zugal, S.; Weber, B.; Reijers, H.A.; Mendling, J.

    2015-01-01

    Business process models are an important means to design, analyze, implement, and control business processes. As with every type of conceptual model, a business process model has to meet certain syntactic, semantic, and pragmatic quality requirements to be of value. For many years, such quality

  10. Composing Models of Geographic Physical Processes

    Science.gov (United States)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed by basic models of geographic physical processes, which is shown by means of an example.
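
    As a toy illustration of expressing a geographic physical process as a partial difference equation (the discrete counterpart of the partial differential equations mentioned above), the sketch below advances one-dimensional diffusion on a grid; it is a generic Python example, not the paper's process description language.

      # Generic partial difference equation: explicit 1-D diffusion on a grid.
      import numpy as np

      def diffuse(u, d, steps):
          """Repeatedly apply u_i += d * (u_{i-1} - 2*u_i + u_{i+1})."""
          u = u.copy()
          for _ in range(steps):
              u[1:-1] += d * (u[:-2] - 2.0 * u[1:-1] + u[2:])
          return u

      u0 = np.zeros(50)
      u0[25] = 1.0                                    # point source mid-domain
      print(diffuse(u0, d=0.2, steps=100).round(3))   # smoothed distribution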

  11. The future of the global environment. A model-based analysis supporting UNEP's first global environment outlook

    International Nuclear Information System (INIS)

    Bakkes, J.; Van Woerden, J.; Alcamo, J.; Berk, M.; Bol, P.; Van den Born, G.J.; Ten Brink, B.; Hettelingh, J.P.; Niessen, L.; Langeweg, F.; Swart, R.

    1997-01-01

    Integrated assessments in support of environmental policy have been applied to a number of countries and regions, and to international negotiations. UNEP's first Global Environment Outlook (GEO-1) can be seen as a step towards making the tool of integrated assessment more widely available as a means for focusing action. This technical report documents RIVM's contribution to the GEO-1 report, focusing on the subject 'looking ahead'. It is illustrated that a 'what if' analysis helps to look beyond the delays in environmental and resource processes. This report illustrates that integrated assessment and modelling techniques can be excellent tools for environment and development policy-setting. The methodology, however, will need to be further developed and adapted to the realities and expectations of diverse regions, incorporating alternative policy strategies and development scenarios. This report focuses primarily on the period 1970-2015, because reliable historical data are often only generally available from 1970 onwards and the year 2015 is believed to match the time perspective of decision-makers. The findings of the analysis are reported in terms of six regions, corresponding with the division of the UNEP regional offices. Questions asked are: how will socioeconomic driving forces affect freshwater and land resources, and how will these changes mutually interact, and why are these changes important for society? Chapter 2 deals with the development of the social and economic driving forces. In the Chapters 3 and 4 it is discussed how this pressure influences selected aspects of the environment. Chapter 3 alone addresses the importance of selected elements of the interacting global element cycles for environmental quality, while Chapter 4 addresses land resources, their potential for food production and associated dependence on freshwater resources. The impacts on selected components of natural areas (Chapter 5) and society (Chapter 6) are subsequently addressed

  12. A Process-Based Transport-Distance Model of Aeolian Transport

    Science.gov (United States)

    Naylor, A. K.; Okin, G.; Wainwright, J.; Parsons, A. J.

    2017-12-01

    We present a new approach to modeling aeolian transport based on transport distance. Particle fluxes are based on statistical probabilities of particle detachment and distributions of transport lengths, which are functions of particle size classes. A computational saltation model is used to simulate transport distances over a variety of sizes. These are fit to an exponential distribution, which has the advantages of computational economy, concordance with current field measurements, and a meaningful relationship to theoretical assumptions about mean and median particle transport distance. This novel approach includes particle-particle interactions, which are important for sustaining aeolian transport and dust emission. Results from this model are compared with results from both bulk- and particle-sized-specific transport equations as well as empirical wind tunnel studies. The transport-distance approach has been successfully used for hydraulic processes, and extending this methodology from hydraulic to aeolian transport opens up the possibility of modeling joint transport by wind and water using consistent physics. Particularly in nutrient-limited environments, modeling the joint action of aeolian and hydraulic transport is essential for understanding the spatial distribution of biomass across landscapes and how it responds to climatic variability and change.
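
    A minimal Monte Carlo sketch of the transport-distance idea follows: particles in each size class are detached with some probability and carried a distance drawn from an exponential distribution, and the fraction moving past a reference point is counted. The detachment probabilities and mean hop lengths are hypothetical; this illustrates the general approach rather than the authors' model.

      # Illustrative transport-distance sampling (hypothetical parameter values).
      import numpy as np

      rng = np.random.default_rng(0)
      size_classes = {"fine": (0.4, 0.30), "medium": (0.2, 0.15), "coarse": (0.05, 0.05)}
      #               name: (detachment probability, mean transport distance in m)

      n_particles, x_target = 10000, 0.5   # particles per class, reference distance (m)
      for name, (p_detach, mean_hop) in size_classes.items():
          detached = rng.random(n_particles) < p_detach
          hops = rng.exponential(mean_hop, size=n_particles)
          frac = np.mean(detached & (hops > x_target))
          print(f"{name:6s} fraction transported beyond {x_target} m: {frac:.3f}")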

  13. High temperature corrosion in the service environments of a nuclear process heat plant

    International Nuclear Information System (INIS)

    Quadakkers, W.J.

    1987-01-01

    In a nuclear process heat plant the heat-exchanging components fabricated from nickel- and Fe-Ni-based alloys are subjected to corrosive service environments at temperatures up to 950 °C for service lives of up to 140 000 h. In this paper the corrosion behaviour of the high temperature alloys in the different service environments will be described. It is shown that the degree of protection provided by Cr2O3-based surface oxide scales against carburization and decarburization of the alloys is primarily determined not by the oxidation potential of the atmospheres but by a dynamic process involving, on the one hand, the oxidizing gas species and the metal and, on the other hand, the carbon in the alloy and the oxide scale. (orig.)

  14. Longitudinal monitoring of Listeria monocytogenes and Listeria phages in seafood processing environments in Thailand.

    Science.gov (United States)

    Vongkamjan, Kitiya; Benjakul, Soottawat; Kim Vu, Hue Thi; Vuddhakul, Varaporn

    2017-09-01

    Listeria monocytogenes is a foodborne pathogen commonly found in environments of seafood processing, thus presenting a challenge for eradication from seafood processing facilities. Monitoring the prevalence and subtype diversity of L. monocytogenes together with phages that are specific to Listeria spp. ("Listeria phages") will provide knowledge on the bacteria-phage ecology in food processing plants. In this work, a total of 595 samples were collected from raw material, finished seafood products and environmental samples from different sites of a seafood processing plant during 17 sampling visits in 1.5 years of study. L. monocytogenes and Listeria spp. (non-monocytogenes) were found in 22 (3.7%) and 43 (7.2%) samples, respectively, whereas 29 Listeria phages were isolated from 9 (1.5%) phage-positive samples. DNA fingerprint analysis of L. monocytogenes isolates revealed 11 Random Amplified Polymorphic DNA (RAPD) profiles, with two subtypes frequently observed over time. Our data reveal the presence of Listeria phages within the same seafood processing environments where a diverse set of L. monocytogenes subtypes was also found. Although serotype 4b was observed at lower frequency, the data indicate that isolates from this seafood processing plant belonged to both epidemiologically important serotypes 1/2a and 4b, which may suggest a potential public health risk. Phages (all showed a unique genome size of 65 ± 2 kb) were classified into 9 host range groups, representing both broad- and narrow-host range. While most L. monocytogenes isolates from this facility were susceptible to phages, five isolates showed resistance to 12-20 phages. Variations in phage host range among Listeria phages isolated from the food processing plant may affect the presence of a diverse set of L. monocytogenes isolates derived from the same processing environment in Thailand. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Molecular recognition of the environment and mechanisms of the origin of species in quantum-like modeling of evolution.

    Science.gov (United States)

    Melkikh, Alexey V; Khrennikov, Andrei

    2017-11-01

    A review of the mechanisms of speciation is performed. The mechanisms of the evolution of species, taking into account feedback from the state of the environment and mechanisms of the emergence of complexity, are considered. It is shown that these mechanisms, at the molecular level, cannot work steadily in terms of classical mechanics. Quantum mechanisms of changes in the genome, based on the long-range interaction potential between biologically important molecules, are proposed as one possible explanation. Different variants of interactions between the organism and the environment, based on molecular recognition and leading to the origin of new species, are considered. Experiments to verify the model are proposed. This bio-physical study is completed by a general operational model based on quantum information theory. The latter is applied to a model of epigenetic evolution. We briefly present the basics of the quantum-like approach to modeling of bio-informational processes. This approach is illustrated by the quantum-like model of epigenetic evolution. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Nutrition Care Process Implementation: Experiences in Various Dietetics Environments in Sweden.

    Science.gov (United States)

    Lövestam, Elin; Boström, Anne-Marie; Orrevall, Ylva

    2017-11-01

    The Nutrition Care Process (NCP) and Nutrition Care Process Terminology (NCPT) are currently being implemented by nutrition and dietetics practitioners all over the world. Several advantages have been related to this implementation, such as consistency and clarity of dietetics-related health care records and the possibility to collect and research patient outcomes. However, little is known about dietitians' experiences of the implementation process. The aim of this qualitative study was to explore Swedish dietitians' experiences of the NCP implementation process in different dietetics environments. Thirty-seven Swedish dietitians from 13 different dietetics workplaces participated in seven focus group discussions that were audiotaped and carefully transcribed. A thematic secondary analysis was performed, after which all the discussions were re-read, following the implementation narrative from each workplace. In the analysis, The Promoting Action on Research Implementation in Health Services implementation model was used as a framework. Main categories identified in the thematic analysis were leadership and implementation strategy, the group and colleagues, the electronic health record, and evaluation. Three typical cases are described to illustrate the diversity of these aspects in dietetics settings: Case A represents a small hospital with an inclusive leadership style and discussion-friendly culture where dietitians had embraced the NCP/NCPT implementation. Case B represents a larger hospital with a more hierarchical structure where dietitians were more ambivalent toward NCP/NCPT implementation. Case C represents the only dietitian working at a small multiprofessional primary care center who received no dietetics-related support from management or colleagues. She had not started NCP/NCPT implementation. The diversity of dietetics settings and their different prerequisites should be considered in the development of NCP/NCPT implementation strategies. Tailored

  17. The Structured Process Modeling Theory (SPMT) : a cognitive view on why and how modelers benefit from structuring the process of process modeling

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Gailly, F.; Grefen, P.W.P.J.; Poels, G.

    2015-01-01

    After observing various inexperienced modelers constructing a business process model based on the same textual case description, it was noted that great differences existed in the quality of the produced models. The impression arose that certain quality issues originated from cognitive failures

  18. Multi-environment QTL mixed models for drought stress adaptation in wheat

    NARCIS (Netherlands)

    Mathews, K.L.; Malosetti, M.; Chapman, S.; McIntyre, L.; Reynolds, M.; Shorter, R.; Eeuwijk, van F.A.

    2008-01-01

    Many quantitative trait loci (QTL) detection methods ignore QTL-by-environment interaction (QEI) and are limited in accommodation of error and environment-specific variance. This paper outlines a mixed model approach using a recombinant inbred spring wheat population grown in six drought stress

  19. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing...... processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  20. Modelling Global Pattern Formations for Collaborative Learning Environments

    DEFF Research Database (Denmark)

    Grappiolo, Corrado; Cheong, Yun-Gyung; Khaled, Rilla

    2012-01-01

    We present our research towards the design of a computational framework capable of modelling the formation and evolution of global patterns (i.e. group structures) in a population of social individuals. The framework is intended to be used in collaborative environments, e.g. social serious games...

  1. A Stochastic Model of Plausibility in Live Virtual Constructive Environments

    Science.gov (United States)

    2017-09-14

    From a dissertation by Jeremy R. Millar (AFIT): addresses uncertainty arising from model parameters that are inputs to the computer (mathematical) model but whose exact values are unknown to experimentalists, and the computation of plausibility exceedance probabilities.

  2. Physics-based statistical model and simulation method of RF propagation in urban environments

    Science.gov (United States)

    Pao, Hsueh-Yuan; Dvorak, Steven L.

    2010-09-14

    A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient close-formed parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.

  3. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from ordering a book online until the parcel is delivered to the customer. Knowledge about the factors relevant for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling. PMID:27157858

  4. INTUITEL and the Hypercube Model - Developing Adaptive Learning Environments

    Directory of Open Access Journals (Sweden)

    Kevin Fuchs

    2016-06-01

    Full Text Available In this paper we introduce an approach for the creation of adaptive learning environments that give human-like recommendations to a learner in the form of a virtual tutor. We use ontologies defining pedagogical, didactic and learner-specific data describing a learner's progress, learning history, capabilities and the learner's current state within the learning environment. Learning recommendations are based on a reasoning process on these ontologies and can be provided in real-time. The ontologies may describe learning content from any domain of knowledge. Furthermore, we describe an approach to store learning histories as spatio-temporal trajectories and to correlate them with influencing didactic factors. We show how such analysis of spatiotemporal data can be used for learning analytics to improve future adaptive learning environments.

  5. Detecting Difference between Process Models Based on the Refined Process Structure Tree

    Directory of Open Access Journals (Sweden)

    Jing Fan

    2017-01-01

    Full Text Available The development of mobile workflow management systems (mWfMS) leads to a large number of business process models. In the meantime, the location restriction embedded in mWfMS may result in different process models for a single business process. In order to help users quickly locate the difference and rebuild the process model, detecting the difference between different process models is needed. Existing detection methods either provide a dissimilarity value to represent the difference or use predefined difference templates to generate the result, which cannot reflect the entire composition of the difference. Hence, in this paper, we present a new approach to solve this problem. Firstly, we parse the process models to their corresponding refined process structure trees (PSTs), that is, decomposing a process model into a hierarchy of subprocess models. Then we design a method to convert the PST to its corresponding task based process structure tree (TPST). As a consequence, the problem of detecting difference between two process models is transformed to detecting the difference between their corresponding TPSTs. Finally, we obtain the difference between two TPSTs based on the divide and conquer strategy, where the difference is described by an edit script and we make the cost of the edit script close to minimum. The extensive experimental evaluation shows that our method can meet the real requirements in terms of precision and efficiency.
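
    The idea of reporting a difference between two tree-structured process models as an edit script can be illustrated with a very small sketch; this is a generic recursive comparison over a hypothetical (label, children) node structure, not the RPST/TPST algorithm evaluated in the paper.

      # Tiny illustration of diffing two trees into an edit script (generic, hypothetical).
      def diff(a, b, path="root"):
          """Nodes are (label, [children]); returns a list of edit operations."""
          if a is None:
              return [("insert", path, b[0])]
          if b is None:
              return [("delete", path, a[0])]
          ops = [("relabel", path, a[0], b[0])] if a[0] != b[0] else []
          for i in range(max(len(a[1]), len(b[1]))):
              ca = a[1][i] if i < len(a[1]) else None
              cb = b[1][i] if i < len(b[1]) else None
              ops += diff(ca, cb, f"{path}/{i}")
          return ops

      old = ("seq", [("check order", []), ("ship", [])])
      new = ("seq", [("check order", []), ("pack", []), ("ship", [])])
      print(diff(old, new))   # relabel at root/1, insert at root/2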

  6. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches and describe their differences, at the same time. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested in both business-like event logs as recorded in a higher educational enterprise resource planning system and a real case scenario involving a set of Dutch municipalities.

  7. An analytical model for particulate deposition on vertical heat transfer surfaces in a boiling environment

    International Nuclear Information System (INIS)

    Keefer, R.H.; Rider, J.L.; Waldman, L.A.

    1993-01-01

    A frequent problem in heat exchange equipment is the deposition of particulates entrained in the working fluid onto heat transfer surfaces. These deposits increase the overall heat transfer resistance and can significantly degrade the performance of the heat exchanger. Accurate prediction of the deposition rate is necessary to ensure that the design and specified operating conditions of the heat exchanger adequately address the effects of this deposit layer. Although the deposition process has been studied in considerable detail, much of the work has focused on investigating individual aspects of the deposition process. This paper consolidates this previous research into a mechanistically based analytical prediction model for particulate deposition from a boiling liquid onto vertical heat transfer surfaces. Consistent with the well-known Kern-Seaton approach, the model postulates net particulate accumulation to depend on the relative contributions of deposition and reentrainment processes. Mechanisms for deposition include boiling, momentum, and diffusion effects. Reentrainment is presumed to occur via an intermittent erosion process, with the energy for particle removal being supplied by turbulent flow instabilities. The contributions of these individual mechanisms are integrated to obtain a single equation for the deposit thickness versus time. The validity of the resulting model is demonstrated by comparison with data published in the open literature. Model estimates show good agreement with data obtained over a range of thermal-hydraulic conditions in both flow and pool boiling environments. The utility of the model in performing parametric studies (e.g. to determine the effect of flow velocity on net deposition) is also demonstrated. The initial success of the model suggests that it could prove useful in establishing a range of heat exchanger operating conditions to minimize deposition.
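
    The Kern-Seaton balance mentioned above, in which net accumulation is a deposition flux minus a removal term proportional to the existing deposit, has the closed-form solution x(t) = phi_d * tau * (1 - exp(-t/tau)). The sketch below evaluates that expression with hypothetical parameter values purely for illustration; it is not the paper's full mechanistic model.

      # Kern-Seaton-type fouling curve with hypothetical deposition flux and time constant.
      import numpy as np

      def deposit_thickness(t, phi_d, tau):
          """x(t) = phi_d * tau * (1 - exp(-t / tau)); asymptotically approaches phi_d * tau."""
          return phi_d * tau * (1.0 - np.exp(-t / tau))

      t_hours = np.linspace(0.0, 5000.0, 6)
      print(deposit_thickness(t_hours, phi_d=1e-4, tau=800.0))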

  8. Repairing process models to reflect reality

    NARCIS (Netherlands)

    Fahland, D.; Aalst, van der W.M.P.; Barros, A.; Gal, A.; Kindler, E.

    2012-01-01

    Process mining techniques relate observed behavior (i.e., event logs) to modeled behavior (e.g., a BPMN model or a Petri net). Process models can be discovered from event logs and conformance checking techniques can be used to detect and diagnose differences between observed and modeled behavior.

  9. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    Full Text Available The article outlines the application of the processing approach to the functional description of the designed IT system supporting the operations of the secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented (“as is”) in a manual manner and target processes (“to be”), using the RFID technology for the purpose of their automation. Additionally, the examples of applying the method of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency were presented. The extension of the process analysis method is a possibility of applying the warehouse of processes and process mining methods.

  10. Improvement of the Model of Enterprise Management Process on the Basis of General Management Functions

    Directory of Open Access Journals (Sweden)

    Ruslan Skrynkovskyy

    2017-12-01

    Full Text Available The purpose of the article is to improve the model of the enterprise (institution, organization) management process on the basis of general management functions. The graphic model of the process of management according to the process-structured management is presented. It has been established that in today's business environment, the model of the management process should include such general management functions as: 1) controlling the achievement of results; 2) planning based on the main goal; 3) coordination and corrective actions (in the system of organization of work and production); 4) action as a form of act (conscious, volitional, directed); 5) accounting system (accounting, statistical, operational-technical and managerial); 6) diagnosis (economic, legal) with such subfunctions as: identification of the state and capabilities; analysis (economic, legal, systemic) with argumentation; assessment of the state, trends and prospects of development. The prospect of further research in this direction is: 1) the formation of a system of interrelation of functions and management methods, taking into account the presented research results; 2) development of the model of effective and efficient communication business process of the enterprise.

  11. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Sanggoo Kang

    2016-08-01

    Full Text Available Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-based image processing algorithms by comparing the performance of a single virtual server and multiple auto-scaled virtual servers under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments. The results of all experimental performance tests demonstrate the applicability of auto-scaling with respect to cloud utilization and response time. Auto-scaling contributes to the development of web-based satellite image application services using cloud-based technologies.

  12. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as

  13. Increased prediction accuracy in wheat breeding trials using a marker × environment interaction genomic selection model.

    Science.gov (United States)

    Lopez-Cruz, Marco; Crossa, Jose; Bonnett, David; Dreisigacker, Susanne; Poland, Jesse; Jannink, Jean-Luc; Singh, Ravi P; Autrique, Enrique; de los Campos, Gustavo

    2015-02-06

    Genomic selection (GS) models use genome-wide genetic information to predict genetic values of candidates of selection. Originally, these models were developed without considering genotype × environment interaction (G×E). Several authors have proposed extensions of the single-environment GS model that accommodate G×E using either covariance functions or environmental covariates. In this study, we model G×E using a marker × environment interaction (M×E) GS model; the approach is conceptually simple and can be implemented with existing GS software. We discuss how the model can be implemented by using an explicit regression of phenotypes on markers or using covariance structures (a genomic best linear unbiased prediction-type model). We used the M×E model to analyze three CIMMYT wheat data sets (W1, W2, and W3), where more than 1000 lines were genotyped using genotyping-by-sequencing and evaluated at CIMMYT's research station in Ciudad Obregon, Mexico, under simulated environmental conditions that covered different irrigation levels, sowing dates and planting systems. We compared the M×E model with a stratified (i.e., within-environment) analysis and with a standard (across-environment) GS model that assumes that effects are constant across environments (i.e., ignoring G×E). The prediction accuracy of the M×E model was substantially greater than that of an across-environment analysis that ignores G×E. Depending on the prediction problem, the M×E model had either similar or greater levels of prediction accuracy than the stratified analyses. The M×E model decomposes marker effects and genomic values into components that are stable across environments (main effects) and others that are environment-specific (interactions). Therefore, in principle, the interaction model could shed light on which variants have effects that are stable across environments and which ones are responsible for G×E. The data set and the scripts required to reproduce the analysis are
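
    The decomposition described above (a marker effect shared across environments plus an environment-specific deviation) can be sketched with a toy simulation; this is not the authors' data or scripts, and the ridge shrinkage shown here only recovers the main effects from environment-averaged phenotypes, whereas the full M×E model fits both terms jointly.

        import numpy as np

        rng = np.random.default_rng(0)
        n_lines, n_markers, n_envs = 200, 50, 3

        X = rng.choice([0.0, 1.0, 2.0], size=(n_lines, n_markers))  # marker genotypes (0/1/2)
        b_main = rng.normal(0.0, 0.2, n_markers)            # effects stable across environments
        b_env = rng.normal(0.0, 0.1, (n_envs, n_markers))   # environment-specific deviations (M x E)

        # Phenotype of line i in environment j: x_i' (b_main + b_env_j) + noise
        y = np.stack([X @ (b_main + b_env[j]) + rng.normal(0.0, 0.5, n_lines)
                      for j in range(n_envs)], axis=1)

        # Naive estimate of the across-environment (main) marker effects by ridge regression
        # on environment-averaged phenotypes.
        lam = 1.0
        y_bar = y.mean(axis=1)
        b_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_markers), X.T @ y_bar)
        print("correlation with true main effects:", round(np.corrcoef(b_hat, b_main)[0, 1], 2))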

  14. Investigation of Mediational Processes Using Parallel Process Latent Growth Curve Modeling

    Science.gov (United States)

    Cheong, JeeWon; MacKinnon, David P.; Khoo, Siek Toon

    2010-01-01

    This study investigated a method to evaluate mediational processes using latent growth curve modeling. The mediator and the outcome measured across multiple time points were viewed as 2 separate parallel processes. The mediational process was defined as the independent variable influencing the growth of the mediator, which, in turn, affected the growth of the outcome. To illustrate modeling procedures, empirical data from a longitudinal drug prevention program, Adolescents Training and Learning to Avoid Steroids, were used. The program effects on the growth of the mediator and the growth of the outcome were examined first in a 2-group structural equation model. The mediational process was then modeled and tested in a parallel process latent growth curve model by relating the prevention program condition, the growth rate factor of the mediator, and the growth rate factor of the outcome. PMID:20157639
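
    The mediational structure described above is commonly written in terms of the growth-rate (slope) factors of the two parallel processes. The following is a generic sketch in standard mediation notation, not the estimates obtained from the ATLAS data:

        \eta^{M}_{s,i} = \alpha_M + a\,X_i + \zeta^{M}_i, \qquad
        \eta^{Y}_{s,i} = \alpha_Y + b\,\eta^{M}_{s,i} + c'\,X_i + \zeta^{Y}_i, \qquad
        \text{mediated effect} = a \times b,

    where X_i is the program condition, the eta terms are the growth-rate factors of the mediator and outcome processes, c' is the direct effect, and the zeta terms are disturbances.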

  15. Semantic modeling of portfolio assessment in e-learning environment

    Directory of Open Access Journals (Sweden)

    Lucila Romero

    2017-01-01

    Full Text Available In a learning environment, a portfolio is used as a tool to keep track of a learner's progress. Particularly when it comes to e-learning, continuous assessment allows greater customization and efficiency in the learning process and prevents students from losing interest in their studies. Also, each student has his or her own characteristics and learning skills that must be taken into account in order to keep the learner's interest. So, personalized monitoring is the key to guaranteeing the success of technology-based education. In this context, portfolio assessment emerges as a solution because it is an easy way to allow teachers to organize and personalize assessment according to students' characteristics and needs. A portfolio assessment can contain various types of assessment, such as formative assessment, summative assessment, and hetero- or self-assessment, and can use different instruments such as multiple choice questions, conceptual maps, and essays, among others. Thus, a portfolio assessment represents a compilation of all the assessments that must be solved by a student in a course; it documents progress and sets targets. In previous work, a conceptual framework was proposed that consists of an ontology network named AOnet, a semantic tool conceptualizing different types of assessments. Continuing that work, this paper presents a proposal to implement portfolio assessment in e-learning environments. The proposal consists of a semantic model that describes the key components and relations of this domain to set the basis for developing a tool to generate, manage and perform portfolio assessments.

  16. Exascale Co-design for Modeling Materials in Extreme Environments

    Energy Technology Data Exchange (ETDEWEB)

    Germann, Timothy C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-07-08

    Computational materials science has provided great insight into the response of materials under extreme conditions that are difficult to probe experimentally. For example, shock-induced plasticity and phase transformation processes in single-crystal and nanocrystalline metals have been widely studied via large-scale molecular dynamics simulations, and many of these predictions are beginning to be tested at advanced 4th generation light sources such as the Advanced Photon Source (APS) and Linac Coherent Light Source (LCLS). I will describe our simulation predictions and their recent verification at LCLS, outstanding challenges in modeling the response of materials to extreme mechanical and radiation environments, and our efforts to tackle these as part of the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx). ExMatEx has initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. We anticipate that we will be able to exploit hierarchical, heterogeneous architectures to achieve more realistic large-scale simulations with adaptive physics refinement, and are using tractable application scale-bridging proxy application testbeds to assess new approaches and requirements. Such current scale-bridging strategies accumulate (or recompute) a distributed response database from fine-scale calculations, in a top-down rather than bottom-up multiscale approach.

  17. STRUCTURAL AND FUNCTIONAL MODEL OF CLOUD ORIENTED LEARNING ENVIRONMENT FOR BACHELORS OF INFORMATICS TRAINING

    Directory of Open Access Journals (Sweden)

    Tetiana A. Vakaliuk

    2017-06-01

    Full Text Available The article summarizes the essence of the category "model". The main types of models used in educational research are presented: structural, functional, and structural-functional models, as well as the basic requirements for building these types of models. The national experience in building models and designing cloud-based learning environments of educational institutions (both higher and secondary) is analyzed. A structural and functional model of a cloud-based learning environment for bachelors of informatics training is presented, and each of its components is described: target, managerial, organizational, content and methodical, communication, technological and productive. It is concluded that the COLE should solve all major tasks that relate to higher education institutions.

  18. Techniques for Modeling Human Performance in Synthetic Environments: A Supplementary Review

    National Research Council Canada - National Science Library

    Ritter, Frank E; Shadbolt, Nigel R; Elliman, David; Young, Richard M; Gobet, Fernand; Baxter, Gordon D

    2003-01-01

    Selected recent developments and promising directions for improving the quality of models of human performance in synthetic environments are summarized, beginning with the potential uses and goals for behavioral models...

  19. A Virtual Environment for Resilient Infrastructure Modeling and Design

    Science.gov (United States)

    2015-09-01

    ...responses to disruptive events (e.g., cascading failure behavior) in a context-rich, controlled environment for exercises, education, and training... The general attacker-defender (AD) and defender-attacker-defender (DAD) models for CI are defined in Brown et al. (2006). These models help

  20. Behavioral conformance of artifact-centric process models

    NARCIS (Netherlands)

    Fahland, D.; Leoni, de M.; Dongen, van B.F.; Aalst, van der W.M.P.; Abramowicz, W.

    2011-01-01

    The use of process models in business information systems for analysis, execution, and improvement of processes assumes that the models describe reality. Conformance checking is a technique to validate how well a given process model describes recorded executions of the actual process. Recently,

  1. Modeling and optimization of CO2 capture processes by chemical absorption

    International Nuclear Information System (INIS)

    Neveux, Thibaut

    2013-01-01

    CO2 capture processes by chemical absorption lead to a large energy penalty on the efficiency of coal-fired power plants, which constitutes one of the main bottlenecks to their industrial deployment. The objective of this thesis is the development and validation of a global methodology allowing the precise evaluation of the potential of a given amine capture process. Characteristic phenomena of chemical absorption have been thoroughly studied and represented with state-of-the-art models. The e-UNIQUAC model has been used to describe vapor-liquid and chemical equilibria of electrolyte solutions, and the model parameters have been identified for four solvents. A rate-based formulation has been adopted for the representation of chemically enhanced heat and mass transfer in columns. The absorption and stripping models have been successfully validated against experimental data from an industrial and a laboratory pilot plant. The influence of the numerous phenomena has been investigated in order to highlight the most limiting ones. A methodology has been proposed to evaluate the total energy penalty resulting from the implementation of a capture process on an advanced supercritical coal-fired power plant, including thermal and electric consumptions. Then, the simulation and process evaluation environments have been coupled with a non-linear optimization algorithm in order to find optimal operating and design parameters with respect to energetic and economic performances. This methodology has been applied to optimize five process flow schemes operating with a monoethanolamine aqueous solution at 30% by weight: the conventional flow scheme and four process modifications. The performance comparison showed that process modifications using a heat pump effect give the best gains. The use of technical-economic analysis as an evaluation criterion of process performance, coupled with an optimization algorithm, has proved its capability to find values for the numerous operating and design

  2. The role of virtual reality and 3D modelling in built environment education

    OpenAIRE

    Horne, Margaret; Thompson, Emine Mine

    2007-01-01

    This study builds upon previous research on the integration of Virtual Reality (VR) within the built environment curriculum and aims to investigate the role of Virtual Reality and three-dimensional (3D) computer modelling on learning and teaching in a school of the built environment. In order to achieve this aim a number of academic experiences were analysed to explore the applicability and viability of 3D computer modelling and Virtual Reality (VR) into built environment subject areas. Altho...

  3. Multi-scale dynamic modeling of atmospheric pollution in urban environment

    International Nuclear Information System (INIS)

    Thouron, Laetitia

    2017-01-01

    transport chemistry (SinG) and a computational fluid dynamics model (Code-Saturne) and (3) a microscale process which is the traffic-related resuspension of the particles present on the road surface with three different formulations (deterministic, semi-empirical and empirical). The interest of this thesis is to compare and evaluate the operability and performance of several air quality models at different scales (region, neighborhood and street) in order to better understand the characterization of air quality in an urban environment. (author) [fr

  4. Cost Models for MMC Manufacturing Processes

    Science.gov (United States)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  5. A mathematical model of collective cell migration in a three-dimensional, heterogeneous environment.

    Science.gov (United States)

    Stonko, David P; Manning, Lathiena; Starz-Gaiano, Michelle; Peercy, Bradford E

    2015-01-01

    Cell migration is essential in animal development, homeostasis, and disease progression, but many questions remain unanswered about how this process is controlled. While many kinds of individual cell movements have been characterized, less effort has been directed towards understanding how clusters of cells migrate collectively through heterogeneous, cellular environments. To explore this, we have focused on the migration of the border cells during Drosophila egg development. In this case, a cluster of different cell types coalesce and traverse as a group between large cells, called nurse cells, in the center of the egg chamber. We have developed a new model for this collective cell migration based on the forces of adhesion, repulsion, migration and stochastic fluctuation to generate the movement of discrete cells. We implement the model using Identical Math Cells, or IMCs. IMCs can each represent one biological cell of the system, or can be aggregated using increased adhesion forces to model the dynamics of larger biological cells. The domain of interest is filled with IMCs, each assigned specific biophysical properties to mimic a diversity of cell types. Using this system, we have successfully simulated the migration of the border cell cluster through an environment filled with larger cells, which represent nurse cells. Interestingly, our simulations suggest that the forces utilized in this model are sufficient to produce behaviors of the cluster that are observed in vivo, such as rotation. Our framework was developed to capture a heterogeneous cell population, and our implementation strategy allows for diverse, but precise, initial position specification over a three-dimensional domain. Therefore, we believe that this model will be useful for not only examining aspects of Drosophila oogenesis, but also for modeling other two or three-dimensional systems that have multiple cell types and where investigating the forces between cells is of interest.
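
    A minimal sketch of the force-balance update described above (pairwise adhesion/repulsion between discrete cells, a migratory bias, and stochastic fluctuation) is given below. The spring-like pairwise force, the overdamped update, and all parameter values are illustrative assumptions rather than the authors' calibrated IMC model.

        import numpy as np

        rng = np.random.default_rng(1)
        n_cells, dim, steps, dt = 20, 3, 500, 0.01

        pos = rng.uniform(0.0, 5.0, (n_cells, dim))   # initial cell positions
        target = np.array([10.0, 2.5, 2.5])           # assumed direction of the migratory cue

        k_adh, k_rep, k_mig, noise = 0.5, 2.0, 1.0, 0.1
        r_eq = 1.0                                     # assumed equilibrium cell spacing

        for _ in range(steps):
            diff = pos[:, None, :] - pos[None, :, :]                 # pairwise displacements
            dist = np.linalg.norm(diff, axis=-1) + np.eye(n_cells)   # avoid divide-by-zero on diagonal
            # Spring-like force: repulsive when cells are closer than r_eq, adhesive when farther.
            mag = np.where(dist < r_eq, k_rep, k_adh) * (r_eq - dist)
            np.fill_diagonal(mag, 0.0)
            f_pair = (mag[..., None] * diff / dist[..., None]).sum(axis=1)
            f_mig = k_mig * (target - pos) / np.linalg.norm(target - pos, axis=1, keepdims=True)
            f_rand = noise * rng.normal(size=pos.shape)
            pos += dt * (f_pair + f_mig + f_rand)                    # overdamped position update

        print("cluster centroid after migration:", pos.mean(axis=0).round(2))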

  6. A mathematical model of collective cell migration in a three-dimensional, heterogeneous environment.

    Directory of Open Access Journals (Sweden)

    David P Stonko

    Full Text Available Cell migration is essential in animal development, homeostasis, and disease progression, but many questions remain unanswered about how this process is controlled. While many kinds of individual cell movements have been characterized, less effort has been directed towards understanding how clusters of cells migrate collectively through heterogeneous, cellular environments. To explore this, we have focused on the migration of the border cells during Drosophila egg development. In this case, a cluster of different cell types coalesce and traverse as a group between large cells, called nurse cells, in the center of the egg chamber. We have developed a new model for this collective cell migration based on the forces of adhesion, repulsion, migration and stochastic fluctuation to generate the movement of discrete cells. We implement the model using Identical Math Cells, or IMCs. IMCs can each represent one biological cell of the system, or can be aggregated using increased adhesion forces to model the dynamics of larger biological cells. The domain of interest is filled with IMCs, each assigned specific biophysical properties to mimic a diversity of cell types. Using this system, we have successfully simulated the migration of the border cell cluster through an environment filled with larger cells, which represent nurse cells. Interestingly, our simulations suggest that the forces utilized in this model are sufficient to produce behaviors of the cluster that are observed in vivo, such as rotation. Our framework was developed to capture a heterogeneous cell population, and our implementation strategy allows for diverse, but precise, initial position specification over a three-dimensional domain. Therefore, we believe that this model will be useful for not only examining aspects of Drosophila oogenesis, but also for modeling other two or three-dimensional systems that have multiple cell types and where investigating the forces between cells is of interest.

  7. FAME, the Flux Analysis and Modeling Environment

    Directory of Open Access Journals (Sweden)

    Boele Joost

    2012-01-01

    Full Text Available Background: The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routine and, in our experience, renders these applications suboptimal for routine use by (systems) biologists. Results: The Flux Analysis and Modeling Environment (FAME) is the first web-based modeling tool that combines the tasks of creating, editing, running, and analyzing/visualizing stoichiometric models into a single program. Analysis results can be automatically superimposed on familiar KEGG-like maps. FAME is written in PHP and uses the Python-based PySCeS-CBM for its linear solving capabilities. It comes with a comprehensive manual and a quick-start tutorial, and can be accessed online at http://f-a-m-e.org/. Conclusions: With FAME, we present the community with an open source, user-friendly, web-based "one stop shop" for stoichiometric modeling. We expect the application will be of substantial use to investigators and educators alike.
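
    FAME's linear solving capability is used for constraint-based (stoichiometric) analysis; the kind of computation involved can be illustrated with a toy flux balance analysis. The three-reaction network, bounds and objective below are assumptions made purely for illustration, not part of FAME or PySCeS-CBM.

        import numpy as np
        from scipy.optimize import linprog

        # Toy network: R1 uptake -> A, R2: A -> B, R3: B -> (exported biomass).
        # Flux balance analysis: maximize biomass flux v3 subject to S v = 0 and flux bounds.
        S = np.array([
            [1.0, -1.0,  0.0],   # metabolite A balance
            [0.0,  1.0, -1.0],   # metabolite B balance
        ])
        bounds = [(0.0, 10.0), (0.0, 100.0), (0.0, 100.0)]  # uptake limited to 10 units
        c = np.array([0.0, 0.0, -1.0])                      # linprog minimizes, so negate the objective

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print("optimal fluxes:", res.x, "max biomass flux:", -res.fun)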

  8. Grey relation projection model for evaluating permafrost environment in the Muli coal mining area, China

    Energy Technology Data Exchange (ETDEWEB)

    Wei Cao; Yu Sheng; Yinghong Qin; Jing Li; Jichun Wu [Chinese Academy of Sciences, Lanzhou (China). State Key Laboratory of Frozen Soil Engineering

    2010-12-15

    This study attempts to estimate the current state of the permafrost environment in the Muli coal mining area, an opencast mining site on the Qinghai-Tibet plateau, China. The estimation is done by regarding the site's permafrost environment as a system divided into three subsystems: permafrost freeze-thaw erosion sensibility, permafrost thermal stability, and permafrost ecological fragility. The subsystems were characterized by their influencing indicators, each of which was assigned a weight according to the analytic hierarchy process. The relationship between these indicators is established using an environmental evaluation model based on grey system theory. The evaluated results show that currently the normalised grey relation projection values (GRPV) of permafrost freeze-thaw erosion sensibility, permafrost thermal stability, permafrost ecological fragility and the permafrost environment are 0.58 (general situation), 0.47 (bad situation), 0.63 (general situation) and 0.56 (general situation), respectively. These values imply that the permafrost environment has been deteriorated to a certain degree by human activities and could potentially be further degraded. However, at this degree, a new equilibrium could be achieved if the current environmental degradation rate is held and if effective treatments are constructed against further damage.
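
    One common way to compute a grey relation projection value is to project an alternative's weighted indicator vector onto the weighted vector of an ideal reference alternative. The sketch below uses that formulation with made-up indicator scores and AHP-style weights; it illustrates the calculation only and does not reproduce the paper's indicator system or results.

        import numpy as np

        weights = np.array([0.40, 0.35, 0.25])   # assumed AHP weights of three indicators
        site    = np.array([0.58, 0.47, 0.63])   # assumed normalized indicator scores of the site
        ideal   = np.ones_like(site)             # ideal reference alternative

        w_site, w_ideal = weights * site, weights * ideal
        # Projection of the weighted site vector onto the weighted ideal vector,
        # normalized so that the ideal alternative itself scores 1.
        grpv = float(w_site @ w_ideal) / float(w_ideal @ w_ideal)
        print(f"grey relation projection value: {grpv:.2f}")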

  9. LINDOZ model for Finland environment: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Galeriu, D.; Apostoaie, A.I.; Mocanu, N.; Paunescu, N.

    1996-01-01

    LINDOZ model was developed as a realistic assessment tool for radioactive contamination of the environment. It was designed to produce estimates for the concentration of the pollutant in different compartments of the terrestrial ecosystem (soil, vegetation, animal tissue, and animal products), and to evaluate human exposure to the contaminant (concentration in whole human body, and dose to humans) from inhalation, ingestion and external irradiation. The user can apply LINDOZ for both routine and accidental type of releases. 2 figs, 2 tabs

  10. Health, Supportive Environments, and the Reasonable Person Model

    Science.gov (United States)

    Stephen Kaplan; Rachel Kaplan

    2003-01-01

    The Reasonable Person Model is a conceptual framework that links environmental factors with human behavior. People are more reasonable, cooperative, helpful, and satisfied when the environment supports their basic informational needs. The same environmental supports are important factors in enhancing human health. We use this framework to identify the informational...

  11. High Fidelity Simulation of Littoral Environments: Applications and Coupling of Participating Models

    National Research Council Canada - National Science Library

    Allard, Richard

    2003-01-01

    The High Fidelity Simulation of Littoral Environments (HFSoLE) Challenge Project (C75) encompasses a suite of seven oceanographic models capable of exchanging information in a physically meaningful sense across the littoral environment...

  12. A model of adaptive decision-making from representation of information environment by quantum fields

    Science.gov (United States)

    Bagarello, F.; Haven, E.; Khrennikov, A.

    2017-10-01

    We present a mathematical model of decision-making (DM) by agents acting in a complex and uncertain environment (combining a huge variety of economic, financial, behavioural and geopolitical factors). To describe the interaction of agents with it, we apply the formalism of quantum field theory (QFT). Quantum fields are of a purely informational nature. The QFT model can be treated as a far relative of expected utility theory, where the role of utility is played by adaptivity to an environment (bath). However, this sort of utility-adaptivity cannot be represented simply as a numerical function. The operator representation in Hilbert space is used and adaptivity is described as in quantum dynamics. We are especially interested in the stabilization of solutions for sufficiently large time. The outputs of this stabilization process, probabilities for possible choices, are treated in the framework of classical DM. To connect classical and quantum DM, we appeal to Quantum Bayesianism. We demonstrate the quantum-like interference effect in DM, which is exhibited as a violation of the formula of total probability, and hence of the classical Bayesian inference scheme. This article is part of the themed issue 'Second quantum revolution: foundational questions'.
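
    The violation of the formula of total probability mentioned above is usually quantified, for two mutually exclusive conditions A_1 and A_2, by an additive interference term. A generic sketch of that standard form from the quantum-like decision literature (not necessarily the exact expression derived in this paper) is:

        P(B) = P(A_1)\,P(B \mid A_1) + P(A_2)\,P(B \mid A_2)
               + 2\lambda\,\sqrt{P(A_1)\,P(B \mid A_1)\,P(A_2)\,P(B \mid A_2)}, \qquad \lambda = \cos\theta,

    where \lambda = 0 recovers the classical Bayesian (total probability) scheme and a nonzero \lambda estimated from observed choice frequencies signals quantum-like interference.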

  13. A model of adaptive decision-making from representation of information environment by quantum fields.

    Science.gov (United States)

    Bagarello, F; Haven, E; Khrennikov, A

    2017-11-13

    We present a mathematical model of decision-making (DM) by agents acting in a complex and uncertain environment (combining a huge variety of economic, financial, behavioural and geopolitical factors). To describe the interaction of agents with it, we apply the formalism of quantum field theory (QFT). Quantum fields are of a purely informational nature. The QFT model can be treated as a far relative of expected utility theory, where the role of utility is played by adaptivity to an environment (bath). However, this sort of utility-adaptivity cannot be represented simply as a numerical function. The operator representation in Hilbert space is used and adaptivity is described as in quantum dynamics. We are especially interested in the stabilization of solutions for sufficiently large time. The outputs of this stabilization process, probabilities for possible choices, are treated in the framework of classical DM. To connect classical and quantum DM, we appeal to Quantum Bayesianism. We demonstrate the quantum-like interference effect in DM, which is exhibited as a violation of the formula of total probability, and hence of the classical Bayesian inference scheme. This article is part of the themed issue 'Second quantum revolution: foundational questions'. © 2017 The Author(s).

  14. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia

    2017-01-01

    Full Text Available Taking into account the highly diverse tasks which employees have to fulfil due to the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products which meet market expectations. Therefore, organizations encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of the aforementioned process, which allows enterprises to develop a specific methodology. In order to design the conceptual model, Business Process Modelling instruments were employed - the Adonis Community Edition Business Process Management Toolkit using the ADONIS BPMS Notation. The conceptual model was developed based on in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended according to the enterprise's requirements when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling as they enable process analysis and evaluation (predefined/specific queries) and also model optimization (simulations).

  15. Using {sup 222}Rn as a tracer of geophysical processes in underground environments

    Energy Technology Data Exchange (ETDEWEB)

    Lacerda, T.; Anjos, R. M. [LARA - Laboratório de Radioecologia e Alterações Ambientais, Instituto de Física, Universidade Federal Fluminense, Av. Gal Milton Tavares de Souza, s/no, Gragoatá, 24210-346, Niterói, RJ (Brazil); Valladares, D. L.; Rizzotto, M.; Velasco, H.; Rosas, J. P. de; Ayub, J. Juri [GEA - Instituto de Matemática Aplicada San Luis (IMASL), Universidad Nacional de San Luis, CCT-San Luis CONICET, Ej. de los Andes 950, D5700HHW San Luis (Argentina); Silva, A. A. R. da; Yoshimura, E. M. [Instituto de Física, Universidade de São Paulo, P.O. Box 66318, 05314-970, São Paulo, SP (Brazil)

    2014-11-11

    Radon levels in two old mines in San Luis, Argentina, are reported and analyzed. These mines are today used for touristic visitation. Our goal was to assess the potential use of this radioactive noble gas as a tracer of geological processes in underground environments. CR-39 nuclear track detectors were used during the winter and summer seasons. The findings show that the significant radon concentrations reported in this environment are subject to large seasonal modulations, due to the strong dependence of natural ventilation on the variations of outside temperature. The results also indicate that mapping the radon distribution pattern is a good method to localize unknown ducts, fissures or secondary tunnels in subterranean environments.

  16. Mapping modern software process engineering techniques onto an HEP development environment

    CERN Document Server

    Wellisch, J P

    2003-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement and continual improvement in the efficiency of the working environment, with the goal of facilitating doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has been progressing for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within th...

  17. Modeling Dynamic Systems with Efficient Ensembles of Process-Based Models.

    Directory of Open Access Journals (Sweden)

    Nikola Simidjievski

    Full Text Available Ensembles are a well-established machine learning paradigm, leading to accurate and robust models, predominantly applied to predictive modeling tasks. Ensemble models comprise a finite set of diverse predictive models whose combined output is expected to yield an improved predictive performance as compared to an individual model. In this paper, we propose a new method for learning ensembles of process-based models of dynamic systems. The process-based modeling paradigm employs domain-specific knowledge to automatically learn models of dynamic systems from time-series observational data. Previous work has shown that ensembles based on sampling observational data (i.e., bagging and boosting) significantly improve predictive performance of process-based models. However, this improvement comes at the cost of a substantial increase of the computational time needed for learning. To address this problem, the paper proposes a method that aims at efficiently learning ensembles of process-based models, while maintaining their accurate long-term predictive performance. This is achieved by constructing ensembles with sampling domain-specific knowledge instead of sampling data. We apply the proposed method to and evaluate its performance on a set of problems of automated predictive modeling in three lake ecosystems using a library of process-based knowledge for modeling population dynamics. The experimental results identify the optimal design decisions regarding the learning algorithm. The results also show that the proposed ensembles yield significantly more accurate predictions of population dynamics as compared to individual process-based models. Finally, while their predictive performance is comparable to the one of ensembles obtained with the state-of-the-art methods of bagging and boosting, they are substantially more efficient.
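
    The ensemble idea described above can be sketched with a toy process-based model: each ensemble member is the same population-growth process under a different parameterisation (standing in for sampled domain knowledge), and the ensemble prediction averages the member trajectories. The logistic form and parameter ranges are illustrative assumptions, not the library of lake-ecosystem knowledge used in the paper.

        import numpy as np

        def simulate(r, K, x0=1.0, dt=0.1, steps=300):
            # Discretised logistic growth: dx/dt = r * x * (1 - x / K)
            x = np.empty(steps)
            x[0] = x0
            for t in range(1, steps):
                x[t] = x[t - 1] + dt * r * x[t - 1] * (1.0 - x[t - 1] / K)
            return x

        rng = np.random.default_rng(7)
        members = [simulate(r=rng.uniform(0.3, 0.7), K=rng.uniform(80.0, 120.0)) for _ in range(10)]
        ensemble_prediction = np.mean(members, axis=0)
        print("final ensemble estimate of population size:", round(float(ensemble_prediction[-1]), 1))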

  18. NCWin — A Component Object Model (COM) for processing and visualizing NetCDF data

    Science.gov (United States)

    Liu, Jinxun; Chen, J.M.; Price, D.T.; Liu, S.

    2005-01-01

    NetCDF (Network Common Data Form) is a data sharing protocol and library that is commonly used in large-scale atmospheric and environmental data archiving and modeling. The NetCDF tool described here, named NCWin and coded with Borland C++ Builder, was built as a standard executable as well as a COM (component object model) for the Microsoft Windows environment. COM is a powerful technology that enhances the reuse of applications (as components). Environmental model developers from different modeling environments, such as Python, JAVA, VISUAL FORTRAN, VISUAL BASIC, VISUAL C++, and DELPHI, can reuse NCWin in their models to read, write and visualize NetCDF data. Some Windows applications, such as ArcGIS and Microsoft PowerPoint, can also call NCWin within the application. NCWin has three major components: 1) The data conversion part is designed to convert binary raw data to and from NetCDF data. It can process six data types (unsigned char, signed char, short, int, float, double) and three spatial data formats (BIP, BIL, BSQ); 2) The visualization part is designed for displaying grid map series (playing forward or backward) with simple map legend, and displaying temporal trend curves for data on individual map pixels; and 3) The modeling interface is designed for environmental model development by which a set of integrated NetCDF functions is provided for processing NetCDF data. To demonstrate that the NCWin can easily extend the functions of some current GIS software and the Office applications, examples of calling NCWin within ArcGIS and MS PowerPoint for showing NetCDF map animations are given.
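
    NCWin itself is a Windows COM component, but the read/inspect workflow it wraps can be illustrated with the Python netCDF4 package. The file name "example.nc" and the variable name "ndvi" below are placeholders, not files or variables shipped with NCWin.

        from netCDF4 import Dataset   # assumes the netCDF4 package is installed
        import numpy as np

        # Open a NetCDF file, list its variables, read a gridded variable and
        # extract the temporal trend for a single map pixel.
        with Dataset("example.nc") as nc:
            print("variables:", list(nc.variables))
            grid = nc.variables["ndvi"][:]      # assumed to be a (time, y, x) array
            pixel_series = grid[:, 10, 20]      # temporal trend for one pixel
            print("mean of first time slice:", float(np.mean(grid[0])))
            print("length of pixel trend:", pixel_series.shape[0])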

  19. AGENT-BASED NEGOTIATION PLATFORM IN COLLABORATIVE NETWORKED ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Adina-Georgeta CREȚAN

    2014-05-01

    Full Text Available This paper proposes an agent-based platform to model and support parallel and concurrent negotiations among organizations acting in the same industrial market. The underlying complexity is to model the dynamic environment where multi-attribute and multi-participant negotiations are racing over a set of heterogeneous resources. The metaphor Interaction Abstract Machines (IAMs) is used to model the parallelism and the non-deterministic aspects of the negotiation processes that occur in Collaborative Networked Environments.

  20. Statistical Process Control in a Modern Production Environment

    DEFF Research Database (Denmark)

    Windfeldt, Gitte Bjørg

    Paper 1 is aimed at practitioners to help them test the assumption that the observations in a sample are independent and identically distributed, an assumption that is essential when using classical Shewhart charts. The test can easily be performed in the control chart setup using the samples gathered there and standard statistical software. In Paper 2 a new method for process monitoring is introduced. The method uses a statistical model of the quality characteristic and a sliding window of observations to estimate the probability that the next item will not respect the specifications. If the estimated probability exceeds a pre-determined threshold, the process will be stopped. The method is flexible, allowing a complexity in modeling that remains invisible to the end user. Furthermore, the method allows building diagnostic plots based on the parameter estimates that can provide valuable insight
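
    A minimal sketch of the sliding-window monitoring idea described above: fit a simple model (here a normal distribution) to the last w observations of the quality characteristic, estimate the probability that the next item falls outside the specification limits, and signal a stop when that probability exceeds a threshold. The window length, specification limits, threshold and simulated data are illustrative assumptions, not the thesis's model.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(42)
        lsl, usl = 9.0, 11.0           # assumed lower/upper specification limits
        window, threshold = 25, 0.05   # sliding-window length and stop threshold

        # Simulated quality characteristic: in control at first, then a mean shift.
        observations = list(rng.normal(10.0, 0.3, 100)) + list(rng.normal(10.6, 0.3, 30))

        for i in range(window, len(observations)):
            w = np.array(observations[i - window:i])
            mu, sigma = w.mean(), w.std(ddof=1)
            p_nonconforming = norm.cdf(lsl, mu, sigma) + norm.sf(usl, mu, sigma)
            if p_nonconforming > threshold:
                print(f"stop signal at item {i}: estimated nonconforming probability {p_nonconforming:.3f}")
                break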