WorldWideScience

Sample records for environment process model

  1. Near Field Environment Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Wagner

    2000-11-14

    Waste emplacement and activities associated with construction of a repository system will potentially change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect the distribution of water, increase kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation relative to the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers and potentially changes the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.

  2. Modeling critical zone processes in intensively managed environments

    Science.gov (United States)

    Kumar, Praveen; Le, Phong; Woo, Dong; Yan, Qina

    2017-04-01

    Processes in the Critical Zone (CZ), which sustain terrestrial life, are tightly coupled across hydrological, physical, biochemical, and many other domains over both short and long timescales. In addition, vegetation acclimation resulting from elevated atmospheric CO2 concentration, along with response to increased temperature and altered rainfall pattern, is expected to result in emergent behaviors in ecologic and hydrologic functions, subsequently controlling CZ processes. We hypothesize that the interplay between micro-topographic variability and these emergent behaviors will shape complex responses of a range of ecosystem dynamics within the CZ. Here, we develop a modeling framework ('Dhara') that explicitly incorporates micro-topographic variability based on lidar topographic data with coupling of multi-layer modeling of the soil-vegetation continuum and 3-D surface-subsurface transport processes to study ecological and biogeochemical dynamics. We further couple a C-N model with a physically based hydro-geomorphologic model to quantify (i) how topographic variability controls the spatial distribution of soil moisture, temperature, and biogeochemical processes, and (ii) how farming activities modify the interaction between soil erosion and soil organic carbon (SOC) dynamics. To address the intensive computational demand of high-resolution modeling at the lidar data scale, we use a hybrid CPU-GPU parallel computing architecture run on large supercomputing systems for simulations. Our findings indicate that rising CO2 concentration and air temperature have opposing effects on soil moisture, surface water and ponding in topographic depressions. Further, the relatively higher soil moisture and lower soil temperature contribute to decreased soil microbial activities in the low-lying areas due to anaerobic conditions and reduced temperatures. These decreased microbial processes reduce nitrification rates, resulting in relatively lower nitrate
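
    To make the stated tendency concrete, the toy calculation below combines a moisture factor (suppressing nitrification as soils approach saturation) with a Q10-style temperature factor. The functional forms, constants and the two example sites are illustrative assumptions for this sketch, not the C-N kinetics actually used in Dhara.

```python
# Minimal sketch of the qualitative coupling described above: a nitrification rate
# that is damped when soil is near saturation (oxygen limitation) and when soil is
# cold. All forms and constants are generic illustrations, chosen only to reproduce
# the stated tendency that wetter, cooler depressions nitrify less.

def nitrification_rate(theta, temp_c, k_max=2.0):
    """Potential rate (arbitrary units) scaled by moisture and temperature factors."""
    # Moisture factor: peaks at intermediate saturation, drops toward anaerobic conditions.
    f_moist = 4.0 * theta * (1.0 - theta)          # theta = relative saturation, 0..1
    # Temperature factor: simple Q10-style response relative to 20 degC.
    f_temp = 2.0 ** ((temp_c - 20.0) / 10.0)
    return k_max * f_moist * f_temp

upland = nitrification_rate(theta=0.45, temp_c=22.0)       # drier, warmer upland cell
depression = nitrification_rate(theta=0.90, temp_c=18.0)   # wet, cool low-lying cell
print(f"upland: {upland:.2f}, low-lying depression: {depression:.2f}")
```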

  3. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment

    Science.gov (United States)

    Hidiroglu, Çaglar Naci; Bukova Güzel, Esra

    2013-01-01

    The aim of the present study is to conceptualize the approaches displayed for the validation of the model and the thought processes involved in the mathematical modeling process performed in a technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

  4. Integrating Procedural Modelling Process and Immersive VR Environment for Architectural Design Education

    OpenAIRE

    Lin Chun-Heng; Hsu Pei-Hsien

    2017-01-01

    In the field of architecture, research is being conducted on the issue of combining VR environments and procedural modelling. Instead of simply using VR as a visualization tool for procedural modelling, as in other studies, we argue that manipulations of 3D elements and adjustments of design parameters should be implemented in VR to provide further support for a design process. An integrated system is thus proposed based on this argument, with a special focus on architectural design education...

  5. Integrating Procedural Modelling Process and Immersive VR Environment for Architectural Design Education

    Directory of Open Access Journals (Sweden)

    Lin Chun-Heng

    2017-01-01

    Full Text Available In the field of architecture, research is being conducted on the issue of combining VR environments and procedural modelling. Instead of simply using VR as a visualization tool for procedural modelling, as in other studies, we argue that manipulations of 3D elements and adjustments of design parameters should be implemented in VR to provide further support for a design process. An integrated system is thus proposed based on this argument, with a special focus on architectural design education. This paper presents the design of such a multi-user system, in which 3D modelling, procedural modelling and a VR platform are integrated, aiming to support architectural design education.

  6. Process-based modeling of the aeolian environment at the dune scale

    Energy Technology Data Exchange (ETDEWEB)

    Stam, J.M.T. (IGG-TNO, Delft (Netherlands))

    1993-09-01

    Process-based models are quantitative models that simulate the physical process of sedimentation with the objective of reconstructing the spatial distribution, stratification, and properties of the subsurface. In this study, a two-dimensional, process-based model of the aeolian environment, at the dune-interdune scale, has been developed. Sedimentation is governed by the variation of wind velocity over the topography, which is calculated analytically. Velocity calculations are coupled to a sediment transport equation to determine where erosion and deposition occur. The resulting change in topography determines a new velocity field, which is then calculated. Features that the model simulates include ripple formation and dune migration, as well as the resulting internal sedimentary structures. Process-based models can be used as a tool to help interpret structures in ancient formations. This model has been applied specifically to reconstruct dune-interdune sequences observed in cores from the Rotliegendes, located in the southern Permian Basin (North Sea). The interdune strata are characterized by a low permeability. A flow simulation has been done on the aeolian section generated by the model, showing the effect of these heterogeneities on fluid flow.
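
    The abstract describes an iterative loop of wind-field calculation, sediment transport and topographic update; the one-dimensional sketch below illustrates that loop in a highly simplified form. The slope-based speed-up rule, the cubic transport law and all constants are assumptions made for this illustration and stand in for the analytical velocity solution used in the paper.

```python
import numpy as np

# Minimal 1D sketch of the process-based loop: wind speed is perturbed by the
# topography, a transport law turns wind speed into a sediment flux, and the
# divergence of that flux (Exner equation) updates the bed.

nx, dx, dt = 200, 1.0, 0.1
x = np.arange(nx) * dx
h = 0.1 * np.exp(-((x - 60.0) / 10.0) ** 2)      # initial bump on a flat bed [m]
u0, u_t = 8.0, 5.0                               # reference and threshold wind speeds [m/s] (assumed)
alpha, c_q, porosity = 8.0, 1.0e-4, 0.35         # speed-up factor, transport constant, bed porosity (assumed)

for step in range(2000):
    u = u0 * (1.0 + alpha * np.gradient(h, dx))  # crude speed-up on windward slopes
    q = c_q * np.maximum(u - u_t, 0.0) ** 3      # transport law above threshold
    dqdx = np.gradient(q, dx)
    h -= dt * dqdx / (1.0 - porosity)            # Exner equation: erosion/deposition
    h = np.maximum(h, 0.0)                       # no erosion below the flat substrate

print(f"crest position after {2000 * dt:.0f} time units: x = {x[np.argmax(h)]:.1f} m")
```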

  7. Spatio-Temporal Risk Assessment Process Modeling for Urban Hazard Events in Sensor Web Environment

    Directory of Open Access Journals (Sweden)

    Wei Wang

    2016-11-01

    Full Text Available Immediate risk assessment and analysis are crucial in managing urban hazard events (UHEs). However, it is a challenge to develop an immediate risk assessment process (RAP) that can integrate distributed sensors and data to determine the uncertain model parameters of facilities, environments, and populations. To solve this problem, this paper proposes a RAP modeling method within a unified spatio-temporal framework and forms a 10-tuple process information description structure based on a Meta-Object Facility (MOF). A RAP is designed as an abstract RAP chain that collects urban information resources and performs immediate risk assessments. In addition, we propose a prototype system known as Risk Assessment Process Management (RAPM) to achieve the functions of RAP modeling, management, execution and visualization. An urban gas leakage event is simulated as an example in which individual risk and social risk are used to illustrate the applicability of the RAP modeling method based on the 10-tuple metadata framework. The experimental results show that the proposed RAP immediately assesses risk by the aggregation of urban sensors, data, and model resources. Moreover, an extension mechanism is introduced in the spatio-temporal RAP modeling method to assess risk and to provide decision-making support for different UHEs.

  8. Speech processing in mobile environments

    CERN Document Server

    Rao, K Sreenivasa

    2014-01-01

    This book focuses on speech processing in the presence of low-bit-rate coding and varying background environments. The methods presented in the book exploit speech events that are robust in noisy environments. Accurate estimation of these crucial events will be useful for carrying out various speech tasks such as speech recognition, speaker recognition and speech rate modification in mobile environments. The authors provide insights into designing and developing robust methods to process speech in mobile environments, covering temporal and spectral enhancement methods to minimize the effect of noise and examining methods and models for speech and speaker recognition applications in mobile environments.

  9. Mathematical Modelling of Thermal Process to Aquatic Environment with Different Hydrometeorological Conditions

    Directory of Open Access Journals (Sweden)

    Alibek Issakhov

    2014-01-01

    Full Text Available This paper presents the mathematical model of the thermal process from a thermal power plant to the aquatic environment of the reservoir-cooler located in the Pavlodar region, 17 km to the north-east of Ekibastuz town. The thermal process in the reservoir-cooler under different hydrometeorological conditions is considered, and is solved by the three-dimensional Navier-Stokes equations and the temperature equation for an incompressible flow in a stratified medium. A numerical method based on the projection method divides the problem into three stages. At the first stage, it is assumed that the transfer of momentum occurs only by convection and diffusion. The intermediate velocity field is solved by the fractional steps method. At the second stage, a three-dimensional Poisson equation is solved by the Fourier method in combination with the tridiagonal matrix method (Thomas algorithm). Finally, at the third stage, it is assumed that the transfer is due only to the pressure gradient. The numerical method determines the basic laws of the hydrothermal processes, which are approximated qualitatively and quantitatively depending on the different hydrometeorological conditions.
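
    The three-stage split summarized above follows the classical projection (fractional-step) idea; a generic schematic of the stages is given below for orientation. The paper's actual discretization, stratification terms and boundary treatment are not stated in the abstract, so this is a textbook-style outline rather than the author's own scheme.

```latex
% Stage 1: intermediate velocity from convection and diffusion only
\frac{\mathbf{u}^{*} - \mathbf{u}^{n}}{\Delta t}
  = -(\mathbf{u}^{n}\cdot\nabla)\mathbf{u}^{n} + \nu \nabla^{2}\mathbf{u}^{n}

% Stage 2: pressure Poisson equation enforcing incompressibility of u^{n+1}
\nabla^{2} p^{\,n+1} = \frac{\rho}{\Delta t}\, \nabla\cdot\mathbf{u}^{*}

% Stage 3: velocity correction by the pressure gradient alone
\mathbf{u}^{n+1} = \mathbf{u}^{*} - \frac{\Delta t}{\rho}\, \nabla p^{\,n+1}
```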

  10. Mathematical modelling of thermal process to aquatic environment with different hydrometeorological conditions.

    Science.gov (United States)

    Issakhov, Alibek

    2014-01-01

    This paper presents the mathematical model of the thermal process from a thermal power plant to the aquatic environment of the reservoir-cooler located in the Pavlodar region, 17 km to the north-east of Ekibastuz town. The thermal process in the reservoir-cooler under different hydrometeorological conditions is considered, and is solved by the three-dimensional Navier-Stokes equations and the temperature equation for an incompressible flow in a stratified medium. A numerical method based on the projection method divides the problem into three stages. At the first stage, it is assumed that the transfer of momentum occurs only by convection and diffusion. The intermediate velocity field is solved by the fractional steps method. At the second stage, a three-dimensional Poisson equation is solved by the Fourier method in combination with the tridiagonal matrix method (Thomas algorithm). Finally, at the third stage, it is assumed that the transfer is due only to the pressure gradient. The numerical method determines the basic laws of the hydrothermal processes, which are approximated qualitatively and quantitatively depending on the different hydrometeorological conditions.

  11. The Conceptualization of the Mathematical Modelling Process in Technology-Aided Environment

    Science.gov (United States)

    Hidiroglu, Çaglar Naci; Güzel, Esra Bukova

    2017-01-01

    The aim of the study is to conceptualize the technology-aided mathematical modelling process in the frame of cognitive modelling perspective. The grounded theory approach was adopted in the study. The research was conducted with seven groups consisting of nineteen prospective mathematics teachers. The data were collected from the video records of…

  12. Erosion and sedimentation models in New Zealand: spanning scales, processes and environments

    Science.gov (United States)

    Elliott, Sandy; Oehler, Francois; Derose, Ron

    2010-05-01

    Erosion and sedimentation are of keen interest in New Zealand due to pasture loss in hill areas, damage to infrastructure, loss of stream conveyance, and ecological impacts in estuarine and coastal areas. Management of these impacts requires prediction of the rates, locations, and timing of erosion and transport across a range of scales, and prediction of the response to intervention measures. A range of models has been applied in New Zealand to address these requirements, including: empirical models for the location and probability of occurrence of shallow landslides; empirical national-scale sediment load models with spatial and temporal downscaling; dynamic field-scale sheet erosion models upscaled and linked to estuarine deposition models, including assessment of climate change and effects of urbanisation; detailed (20 m) physically-based distributed dynamic catchment models applied to catchment scale; and provision of GIS-based decision support tools. Despite these advances, considerable work is required to provide the right information at the right scale. Remaining issues are linking between control measures described at the scale of implementation (part of hillslopes, reaches) to catchment-scale outcomes, which entails fine spatial resolution and large computational demands; ability to predict some key processes such as bank and head gully erosion; representation of sediment remobilisation of stores associated with response to land clearance; ability to represent episodic or catastrophic erosion processes along with relatively continuous processes such as sheet flow in a single model; and prediction of sediment concentrations and clarity under normal flow conditions. In this presentation we describe a variety of models and their application in New Zealand, summarise the models in terms of scales, complexity and uses, and outline approaches to resolving the remaining difficulties.

  13. Business Process Elicitation, Modeling, and Reengineering: Teaching and Learning with Simulated Environments

    Science.gov (United States)

    Jeyaraj, Anand

    2010-01-01

    The design of enterprise information systems requires students to master technical skills for elicitation, modeling, and reengineering of business processes as well as soft skills for information gathering and communication. These tacit skills and behaviors cannot be effectively taught to students but rather must be experienced and learned by them. This…

  14. Numerical modeling of electrical upsetting manufacturing processes based on Forge® environment

    Science.gov (United States)

    Alves, J.; Acevedo, S.; Marie, S.; Adams, B.; Mocellin, K.; Bay, F.

    2017-10-01

    The present work reviews the latest developments within Forge®, finite element numerical simulation software for all bulk metal forming processes, to deal with electric processing of materials. We present a complete parallel finite-element coupled electrical-thermal-mechanical model for two-dimensional and three-dimensional electro-forming applications. The electro-thermal modeling is handled by sequential coupling, in which the Joule heating term computed from the electric resolution is used as a source term for the thermal problem. For the experimental comparison we use an electric upsetting forming case developed at the Osnabrück University of Applied Sciences. The forming process consists of a closed-die hot forging case in which an electric current is passed through the billet to heat it up. At the same time, the billet is deformed by a pressure applied on its end surface. We compare the experimental set-up with 2D and 3D numerical simulations.
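
    The sequential (staggered) electro-thermal coupling described above can be illustrated on a one-dimensional bar: solve the electric problem, form the Joule heating term, then advance the heat equation with that source. The geometry, material values and boundary conditions below are illustrative assumptions, not the Forge® model or the Osnabrück test case.

```python
import numpy as np

# Minimal sketch of sequential electro-thermal coupling on a 1D bar: electric
# resolution -> Joule heating source -> explicit thermal step. All values assumed.

n, L = 51, 0.1                 # nodes, bar length [m]
dx = L / (n - 1)
sigma = 1.0e6                  # electrical conductivity [S/m] (assumed)
rho_cp = 3.6e6                 # volumetric heat capacity [J/(m^3 K)] (assumed)
k = 30.0                       # thermal conductivity [W/(m K)] (assumed)
V_applied = 5.0                # potential difference across the bar [V] (assumed)
T = np.full(n, 293.0)          # initial temperature [K]
dt = 1.0e-3

for step in range(1000):
    # Electric resolution: with constant sigma the potential is linear in x,
    # so the field and Joule source are uniform along the bar.
    E = V_applied / L                     # electric field [V/m]
    q_joule = sigma * E**2                # volumetric heat source [W/m^3]

    # Thermal resolution: explicit finite-difference step with the Joule source.
    lap = np.zeros_like(T)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * (k * lap[1:-1] + q_joule) / rho_cp
    T[0] = T[-1] = 293.0                  # clamped (cooled) ends, assumed

print(f"mid-bar temperature after 1 s: {T[n // 2]:.1f} K")
```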

  15. Parallel processing optimization strategy based on MapReduce model in cloud storage environment

    Science.gov (United States)

    Cui, Jianming; Liu, Jiayi; Li, Qiuyan

    2017-05-01

    Currently, a large number of documents in the cloud storage process are packaged only after all packets have been received. When such a stored file is transferred from the local sender to the server, packing and unpacking consume a lot of time, and the transmission efficiency is low as well. A new parallel processing algorithm is proposed to optimize the transmission mode. Following the MapReduce model of operation, MPI technology is used to execute the Mapper and Reducer mechanisms in parallel. Simulation experiments on the Hadoop cloud computing platform show that this algorithm can not only accelerate the file transfer rate, but also shorten the waiting time of the Reducer mechanism. It breaks through the traditional sequential transmission constraints and reduces the storage coupling to improve the transmission efficiency.
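
    As an illustration of executing Mapper and Reducer stages in parallel with MPI, the sketch below uses mpi4py and a word-count stand-in for the workload; the paper's actual file-transfer pipeline and Hadoop setup are not reproduced here.

```python
from mpi4py import MPI

# Minimal sketch of running Mapper and Reducer stages in parallel with MPI.
# The "documents" and the word-count task are illustrative stand-ins.

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    documents = ["cloud storage parallel", "parallel transfer", "storage model"]
    # Split the work into one chunk per rank (crude round-robin partition).
    chunks = [documents[i::size] for i in range(size)]
else:
    chunks = None

local_docs = comm.scatter(chunks, root=0)   # distribute chunks to all ranks

# Mapper: each rank counts words in its own chunk without waiting for the others.
local_counts = {}
for doc in local_docs:
    for word in doc.split():
        local_counts[word] = local_counts.get(word, 0) + 1

# Reducer: merge the partial counts on the root rank.
gathered = comm.gather(local_counts, root=0)
if rank == 0:
    totals = {}
    for part in gathered:
        for word, n in part.items():
            totals[word] = totals.get(word, 0) + n
    print(totals)
```

    Run with, for example, "mpiexec -n 4 python mapreduce_sketch.py"; the script name and process count are arbitrary.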

  16. Developing a multi-systemic fall prevention model, incorporating the physical environment, the care process and technology: a systematic review.

    Science.gov (United States)

    Choi, Young-Seon; Lawler, Erin; Boenecke, Clayton A; Ponatoski, Edward R; Zimring, Craig M

    2011-12-01

    This paper reports a review that assessed the effectiveness and characteristics of fall prevention interventions implemented in hospitals. A multi-systemic fall prevention model that establishes a practical framework was developed from the evidence. Falls occur through complex interactions between patient-related and environmental risk factors, suggesting a need for multifaceted fall prevention approaches that address both factors. We searched Medline, CINAHL, PsycInfo and the Web of Science databases for references published between January 1990 and June 2009 and scrutinized secondary references from acquired papers. Due to the heterogeneity of interventions and populations, we conducted a quantitative systematic review without a meta-analysis and used a narrative summary to report findings. From the review, three distinct characteristics of fall prevention interventions emerged: (1) the physical environment, (2) the care process and culture and (3) technology. While clinically significant evidence shows the efficacy of environment-related interventions in reducing falls and fall-related injuries, the literature identified few hospitals that had introduced environment-related interventions in their multifaceted fall intervention strategies. Using the multi-systemic fall prevention model, hospitals should promote a practical strategy that benefits from the collective effects of the physical environment, the care process and culture and technology to prevent falls and fall-related injuries. By doing so, they can more effectively address the various risk factors for falling and therefore, prevent falls. Studies that test the proposed model need to be conducted to establish the efficacy of the model in practice. © 2011 The Authors. Journal of Advanced Nursing © 2011 Blackwell Publishing Ltd.

  17. A Process Algebra Software Engineering Environment

    OpenAIRE

    Diertens, B.

    2008-01-01

    In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. In this article we summarize that work and describe the software development process more formally by presenting the tools we use in this process in a CASE setting, leading to the PSF-ToolBus software engineering environment. We generalize the refine step in this environment towards ...

  18. Application of response surface methodology in the modeling of cadmium removal from aqueous environment by electrocoagulation process

    Directory of Open Access Journals (Sweden)

    M. Teymoori

    2017-06-01

    Full Text Available Background: Discharging effluents containing heavy metals is very harmful due to their accumulative properties and non-biodegradability. Electrocoagulation, due to its easy operation and low chemical consumption, is being considered as a process for the removal of heavy metals from aqueous environments. Objective: The aim of the study was to evaluate the effects of the variables influencing the removal of cadmium from an aqueous environment during electrocoagulation and to develop a model for predicting the results. Methods: This experimental study was performed in a batch electrocoagulation reactor using aluminum electrodes. Direct current was supplied using a D.C. power supply. Cadmium concentrations were measured using standard methods for the examination of water and wastewater. Response surface methodology and a central composite design were used to determine the effects of the variables (pH, initial concentration of cadmium, electric current density, reaction time and distance between electrodes) on the removal of cadmium, to design the experiments, to prepare the prediction model for cadmium removal and to optimize the variables. Findings: The optimal removal efficiency, under the conditions of pH 8.32, current density 2.965 mA/cm2, initial concentration 65.5 mg/L, reaction time 54 min and distance between electrodes 0.723 cm, was 97.5%. Also, R2, Adj. R2 and Pred. R2 were 0.98, 0.97 and 0.96, respectively, which indicates a good fit of the data to the quadratic equations. The interactions of all variables were significant in the removal of cadmium by electrocoagulation. The quadratic model is a suitable model for predicting the removal of cadmium using electrocoagulation. Conclusion: Electrocoagulation with aluminum electrodes has high performance in cadmium removal from aqueous environments. This process can be used in the primary or supplementary treatment of industrial wastewater containing this heavy metal.
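
    For readers unfamiliar with response surface methodology, the sketch below fits a full quadratic model to a small two-factor central composite design. The coded design points and removal values are made-up numbers kept deliberately small; the study itself used five factors and its own experimental data.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Minimal sketch of the response-surface idea: fit a quadratic model (linear,
# interaction and squared terms) to responses from a central composite design.
# Two coded factors only (think pH and current density); all numbers are invented.

a = np.sqrt(2.0)  # axial distance for a rotatable two-factor CCD
X_coded = np.array([
    [-1, -1], [1, -1], [-1, 1], [1, 1],      # factorial points
    [-a, 0], [a, 0], [0, -a], [0, a],        # axial points
    [0, 0], [0, 0], [0, 0],                  # center replicates
])
y = np.array([70, 78, 74, 84, 66, 82, 72, 80, 88, 87, 89], dtype=float)  # removal, %

quad = PolynomialFeatures(degree=2, include_bias=False)  # x1, x2, x1^2, x1*x2, x2^2
Xq = quad.fit_transform(X_coded)
model = LinearRegression().fit(Xq, y)

print("quadratic coefficients:", np.round(model.coef_, 2))
print("R^2 of the fit:", round(model.score(Xq, y), 3))
```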

  19. A Methodology and Software Environment for Testing Process Model’s Sequential Predictions with Protocols

    Science.gov (United States)

    1992-12-21

    ...further. Additionally, the verbal sequentiality assumption of Ericsson & Simon's (1984) verbal protocol theory was tested, and found to hold.

  20. Integrating Procedural Modelling Process and Immersive VR Environment for Architectural Design Education

    National Research Council Canada - National Science Library

    Chun-Heng Lin; Pei-Hsien Hsu

    2017-01-01

    ... should be implemented in VR to provide further support for a design process. An integrated system is thus proposed based on this argument, with a special focus on architectural design education...

  1. Performance Improvement: Applying a Human Performance Model to Organizational Processes in a Military Training Environment

    Science.gov (United States)

    Aaberg, Wayne; Thompson, Carla J.; West, Haywood V.; Swiergosz, Matthew J.

    2009-01-01

    This article provides a description and the results of a study that utilized the human performance (HP) model and methods to explore and analyze a training organization. The systemic and systematic practices of the HP model are applicable to military training organizations as well as civilian organizations. Implications of the study for future…

  2. An efficient simulation environment for modeling large-scale cortical processing.

    Science.gov (United States)

    Richert, Micah; Nageswaran, Jayram Moorkanikara; Dutt, Nikil; Krichmar, Jeffrey L

    2011-01-01

    We have developed a spiking neural network simulator, which is both easy to use and computationally efficient, for the generation of large-scale computational neuroscience models. The simulator implements current or conductance based Izhikevich neuron networks, having spike-timing dependent plasticity and short-term plasticity. It uses a standard network construction interface. The simulator allows for execution on either GPUs or CPUs. The simulator, which is written in C/C++, allows for both fine grain and coarse grain specificity of a host of parameters. We demonstrate the ease of use and computational efficiency of this model by implementing a large-scale model of cortical areas V1, V4, and area MT. The complete model, which has 138,240 neurons and approximately 30 million synapses, runs in real-time on an off-the-shelf GPU. The simulator source code, as well as the source code for the cortical model examples is publicly available.
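
    The neuron model named in the abstract has a compact two-variable formulation; a minimal single-neuron sketch is given below, using the standard regular-spiking parameter set from Izhikevich's 2003 paper and an assumed constant input current. It shows only the update rule, not the simulator's C/C++ or GPU implementation, plasticity rules or network construction interface.

```python
# Minimal sketch of the Izhikevich neuron update rule (regular-spiking parameters).

a, b, c, d = 0.02, 0.2, -65.0, 8.0   # regular-spiking cortical neuron parameters
dt = 0.5                             # integration step [ms]
v, u = -65.0, b * -65.0              # membrane potential [mV] and recovery variable
I = 10.0                             # constant input current (arbitrary units, assumed)

spike_times = []
for step in range(int(1000 / dt)):   # simulate 1 s
    v += dt * (0.04 * v**2 + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                    # spike: record it and reset the state
        spike_times.append(step * dt)
        v, u = c, u + d

print(f"{len(spike_times)} spikes in 1 s of simulated time")
```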

  3. Modelling and Organising Customer-Driven Business Processes in a Mass Customisation Environment

    DEFF Research Database (Denmark)

    Hvolby, Hans-Henrik; Martin, Chris; Dreyer, Heidi

    2014-01-01

    ...approach to gain insights into the new demands on planners and schedulers. Two case studies in Denmark are highlighted. The researchers found that the tasks of planning business processes in the order flow are likely to change in the future as increased adaptation to customer ordering takes place...

  4. An efficient simulation environment for modeling large-scale cortical processing

    Directory of Open Access Journals (Sweden)

    Micah Richert

    2011-09-01

    Full Text Available We have developed a spiking neural network simulator, which is both easy to use and computationally efficient, for the generation of large-scale computational neuroscience models. The simulator implements current or conductance based Izhikevich neuron networks, having Spike-Timing Dependent Plasticity (STDP) and Short-Term Plasticity (STP). It uses a standard network construction interface. The simulator allows for execution on either GPUs or CPUs. The simulator, which is written in C/C++, allows for both fine grain and coarse grain specificity of a host of parameters. We demonstrate the ease of use and computational efficiency of this model by implementing a large-scale model of cortical areas V1, V4 and area MT. The complete model, which has 138,240 neurons and approximately 30 million synapses, runs in real-time on an off-the-shelf GPU. The simulator source code, as well as the source code for the cortical model examples, is publicly available.

  5. Space Environment Modeling

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Collection includes presentation materials and outputs from operational space environment models produced by the NOAA Space Weather Prediction Center (SWPC) and...

  6. Understanding Creative Design Processes by Integrating Sketching and CAD Modelling Design Environments: A Preliminary Protocol Result from Architectural Designers

    Directory of Open Access Journals (Sweden)

    Yi Teng Shih

    2015-11-01

    Full Text Available This paper presents the results of a preliminary protocol study of the cognitive behaviour of architectural designers during the design process. The aim is to better understand the similarities and differences in cognitive behaviour using Sequential Mixed Media (SMM) and Alternative Mixed Media (AMM) approaches, and how switching between media may impact on design processes. Two participants with at least one year's professional design experience and a Bachelor of Design degree, and competence in both sketching and computer-aided design (CAD) modelling, participated in the study. Video recordings of participants working on different projects were coded using the Function-Behaviour-Structure (FBS) coding scheme. Participants were also interviewed and their explanations about their switching behaviours were categorised into three types: S→C, S/C↹R and C→S. Preliminary results indicate that switching between media may influence how designers identify problems and develop solutions. In particular, two design issues were identified. These relate to the FBS coding scheme, where structure (S) and behaviour derived from structure (Bs) change to documentation (D) after switching from sketching to CAD modelling (S→C). These switches make it possible for designers to integrate both approaches into one design medium and facilitate their design processes in AMM design environments.

  7. Geothermal Systems in Yellowstone National Park are Excellent Model Environments for Linking Microbial Processes and Geochemical Cycling

    Science.gov (United States)

    Inskeep, W. P.; Jay, Z.

    2008-12-01

    Geothermal systems in Yellowstone National Park (YNP) are geochemically diverse, span pH values from approximately 2 to 10, and generally contain a plethora of reduced constituents that may serve as electron donors for chemotrophic microorganisms. One of our long-term goals has been to determine linkages between geochemical processes and the distribution of microbial populations in high-temperature environments, where geochemical conditions often constrain microbial community diversity. Although geochemical characteristics vary greatly across the world's largest geothermal basin, there exist key geochemical attributes that are likely most important for defining patterns in microbial distribution. For example, excellent model systems exist in YNP where the predominant geochemical and microbial processes are focused on either S-species or Fe oxidation-reduction. In such cases, we hypothesize that genetic diversity and functional gene content will link directly with habitat parameters. Several case studies will be presented where pilot metagenomic data (random shotgun sequencing of environmental DNA) were used to identify key functional attributes and confirm that specific patterns of microbial distribution are indeed reflected in other gene loci besides the 16S rRNA gene. These model systems are excellent candidates for elucidating definitive linkages between S, As and/or Fe cycling, genomics and microbial regulation.

  8. Modelling of processes occurring in deep geological repository - Development of new modules in the GoldSim environment

    Science.gov (United States)

    Vopálka, D.; Lukin, D.; Vokál, A.

    2006-01-01

    Three new modules modelling the processes that occur in a deep geological repository have been prepared in the GoldSim computer code environment (using its Transport Module). These modules help in understanding the role of selected parameters in the near-field region of the final repository and in preparing one's own complex model of the repository behaviour. The source term module includes radioactive decay and ingrowth in the canister, first-order degradation of the fuel matrix, solubility limitation of the concentration of the studied nuclides, and diffusive migration through the surrounding bentonite layer controlled by the output boundary condition formulated with respect to the rate of water flow in the rock. The corrosion module describes corrosion of canisters made of carbon steel and transport of corrosion products in the near-field region. This module computes balance equations between dissolving species and species transported by diffusion and/or advection from the surface of a solid material. The diffusion module, which also includes a non-linear form of the interaction isotherm, can be used for the evaluation of small-scale diffusion experiments.
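
    The source term behaviour described above (first-order matrix degradation, decay, and a solubility cap on the dissolved concentration) can be sketched with a few lines of explicit time stepping. The rate constant, solubility limit, water volume and inventory below are illustrative assumptions, not parameters of the GoldSim modules, and precipitate book-keeping and bentonite diffusion are omitted.

```python
import numpy as np

# Minimal source-term sketch: first-order fuel-matrix degradation releases a
# radionuclide into canister water, the dissolved inventory decays, and its
# concentration is capped by a solubility limit. All values are illustrative.

dt = 1.0                      # time step [yr]
t_end = 10_000.0
k_matrix = 1.0e-4             # fuel-matrix degradation rate [1/yr] (assumed)
lam = np.log(2) / 2.1e5       # decay constant, e.g. a ~2.1e5 yr half-life
c_sol = 1.0e-5                # solubility limit [mol/m^3] (assumed)
v_water = 1.0                 # water volume in the canister [m^3] (assumed)

n_matrix = 10.0               # inventory still bound in the matrix [mol] (assumed)
n_dissolved = 0.0             # dissolved inventory in canister water [mol]

for step in range(int(t_end / dt)):
    released = k_matrix * n_matrix * dt          # congruent release from the matrix
    n_matrix -= released + lam * n_matrix * dt   # matrix loses release plus decay
    n_dissolved += released - lam * n_dissolved * dt
    # Solubility control: excess over the limit is treated as precipitate (ignored here).
    n_dissolved = min(n_dissolved, c_sol * v_water)

print(f"dissolved concentration after {t_end:.0f} yr: {n_dissolved / v_water:.2e} mol/m^3")
```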

  9. Modeling Multiphase Coastal and Hydraulic Processes in an Interactive Python Environment with the Open Source Proteus Toolkit

    Science.gov (United States)

    Kees, C. E.; Farthing, M. W.; Ahmadia, A. J.; Bakhtyar, R.; Miller, C. T.

    2014-12-01

    Hydrology is dominated by multiphase flow processes, due to the importance of capturing water's interaction with soil and air phases. Unfortunately, many different mathematical model formulations are required to model particular processes and scales of interest, and each formulation often requires specialized numerical methods. The Proteus toolkit is a software package for research on models for coastal and hydraulic processes and improvements in numerics, particularly 3D multiphase processes and parallel numerics. The models considered include multiphase flow, shallow water flow, turbulent free surface flow, and various flow-driven processes. We will discuss the objectives of Proteus and the recent evolution of the toolkit's design, as well as present examples of how it has been used to construct computational models of multiphase flows for the US Army Corps of Engineers. Proteus is also an open source toolkit authored primarily within the US Army Corps of Engineers, and used, developed, and maintained by a small community of researchers in both theoretical modeling and computational methods research. We will discuss how open source and community development practices have played a role in the creation of Proteus.

  10. Modelling of Gas Flow in the Underground Coal Gasification Process and its Interactions with the Rock Environment

    Directory of Open Access Journals (Sweden)

    Tomasz Janoszek

    2013-01-01

    Full Text Available The main goal of this study was the analysis of gas flow in the underground coal gasification process and its interactions with the surrounding rock mass. The article discusses the assumptions for the geometric model and for the numerical method for its solution, as well as the assumptions for the geochemical model of the interaction between gas, rock and water, in terms of equilibrium calculations, chemical modelling and gas flow modelling in porous media. Ansys-Fluent software was used to describe the underground coal gasification (UCG) process. The numerical solution was compared with experimental data. The PHREEQC program was used to describe the chemical reactions between the gaseous products of the UCG process and the rock strata in the presence of reservoir waters.

  11. MODELING OF PATTERN FORMING PROCESS OF AUTOMATIC RADIO DIRECTION FINDER OF PHASE VHF IN THE DEVELOPMENT ENVIRONMENT OF LabVIEW APPLIED PROGRAMS

    Directory of Open Access Journals (Sweden)

    G. K. Aslanov

    2015-01-01

    Full Text Available The article develops a model demonstrating the pattern-forming process of the antenna system of an aerodrome quasi-Doppler automatic radio direction-finder station, in the LabVIEW application development environment from National Instruments.

  12. Assessing safety risk in electricity distribution processes using ET & BA improved technique and its ranking by VIKOR and TOPSIS models in fuzzy environment

    Directory of Open Access Journals (Sweden)

    S. Rahmani

    2016-04-01

    Conclusion: Height and electricity are among the main causes of accidents in the electricity transmission and distribution industry, which caused the overhead power networks to be ranked as high risk. Application of decision-making models in a fuzzy environment minimizes the judgment of assessors in the risk assessment process.

  13. FACTORS AFFECTING TEACHING THE CONCEPT of RENEWABLE ENERGY in TECHNOLOGY ASSISTED ENVIRONMENTS AND DESIGNING PROCESSES in THE DISTANCE EDUCATION MODEL

    Directory of Open Access Journals (Sweden)

    A. Seda YUCEL

    2007-01-01

    Full Text Available The energy policies of today focus mainly on sustainable energy systems and renewable energy resources. Chemistry is closely related to energy recycling, energy types, renewable energy, and nature-energy interaction; therefore, it is now an obligation to enrich chemistry classes with renewable energy concepts and related awareness. Before creating renewable energy awareness, the factors thought to affect such awareness should be determined. Knowing these factors would facilitate finding out what to take into account in creating renewable energy awareness. In this study, certain factors thought to affect the development of renewable energy awareness were investigated. The awareness was created through a technology-assisted renewable energy module and assessed using a renewable energy assessment tool. The effects of the students' self-directed learning readiness (Guglielmino, 1977), inner-individual orientation, and anxiety orientation on the awareness were examined. These three factors were found to have significant effects on renewable energy awareness, which was developed through technology utilization. In addition, based on the finding that delivering the subject of renewable energy in technology-assisted environments is more effective, the criteria that should be taken into consideration in transforming this subject into a design model more suitable for distance education were identified.

  14. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor
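
    As context for how such factors are used, a dose conversion factor turns a radionuclide concentration in groundwater into an annual dose to the receptor; a schematic form is sketched below. The actual TSPA dose calculation, its pathways and its units are documented in the TSPA reports rather than in this PMR, so this is only an illustrative outline.

```latex
% Schematic use of biosphere dose conversion factors (illustrative form only):
% C_i     -- activity concentration of radionuclide i in groundwater at the receptor
% BDCF_i  -- all-pathways annual dose per unit concentration for radionuclide i
D_{\mathrm{annual}} \;=\; \sum_{i} C_{i} \times \mathrm{BDCF}_{i}
```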

  15. Modelling Virtual Environments for Geovisualization

    DEFF Research Database (Denmark)

    Bodum, Lars

    2005-01-01

    The use of virtual environments in geovisualization has become a major topic within the last few years. The main reason for this interest is the growing use of 3D models and visual realizations in a wide range of applications concerned with the geographic element of information. The implementation...... and use of virtual environments has developed from being a rather sophisticated type of visualization that demanded extreme computer resources to become an integral part of the software on every desktop computer. This chapter addresses both philosophical and technical issues regarding the modelling...... of virtual environments. It specifically focuses on different representational aspects to be taken into consideration when a virtual environment is created. These aspects are data modelling, 3D modelling and level-of-detail. A range of different approaches can be taken to visualize a virtual environment...

  16. A Learning Model for Enhancing the Student's Control in Educational Process Using Web 2.0 Personal Learning Environments

    Science.gov (United States)

    Rahimi, Ebrahim; van den Berg, Jan; Veen, Wim

    2015-01-01

    In recent educational literature, it has been observed that improving student's control has the potential of increasing his or her feeling of ownership, personal agency and activeness as means to maximize his or her educational achievement. While the main conceived goal for personal learning environments (PLEs) is to increase student's control by…

  17. Slot Region Radiation Environment Models

    Science.gov (United States)

    Sandberg, Ingmar; Daglis, Ioannis; Heynderickx, Daniel; Evans, Hugh; Nieminen, Petteri

    2013-04-01

    Herein we present the main characteristics and first results of the Slot Region Radiation Environment Models (SRREMs) project. The statistical models developed in SRREMs aim to address the variability of trapped electron and proton fluxes in the region between the inner and the outer electron radiation belt. The energetic charged particle fluxes in the slot region are highly dynamic and are known to vary by several orders of magnitude on both short and long timescales. During quiet times, the particle fluxes are much lower than those found at the peak of the inner and outer belts and the region is considered benign. During geospace magnetic storms, though, this region can fill with energetic particles as the peak of the outer belt is pushed Earthwards and the fluxes can increase drastically. There has been a renewed interest in the potential operation of commercial satellites in orbits that are at least partially contained within the Slot Region. Hence, there is a need to improve the current radiation belt models, most of which do not model the extreme variability of the slot region and instead provide long-term averages between the better-known low and medium Earth orbits (LEO and MEO). The statistical models developed in the SRREMs project are based on the analysis of a large volume of available data and on the construction of a virtual database of slot region particle fluxes. The analysis that we have followed retains the long-term temporal, spatial and spectral variations in electron and proton fluxes as well as the short-term enhancement events at altitudes and inclinations relevant for satellites in the slot region. A large number of datasets have been used for the construction, evaluation and inter-calibration of the SRREMs virtual dataset. Special emphasis has been given to the use and analysis of ESA Standard Radiation Environment Monitor (SREM) data from the units on-board PROBA-1, INTEGRAL, and GIOVE-B due to the sufficient spatial and long temporal

  18. Marketing research model of competitive environment

    Directory of Open Access Journals (Sweden)

    Krasilya Dmitriy

    2015-11-01

    Full Text Available To support its competitive advantages in current market conditions, each company needs to choose better ways of guaranteeing its favorable competitive position. In this regard, considerable interest lies in the structuring and algorithmization of marketing research processes that provide the information background of such choice. The article is devoted to modeling the process of marketing research of competitive environment.

  19. Modeling safety in a distributed technology management environment for more cost-effective conceptual design of chemical process plants

    NARCIS (Netherlands)

    Schupp, B.A.; Lemkowitz, S.M.; Goossens, L.H.J.; Hale, A.R.; Pasman, H.J.

    2002-01-01

    Profitability of the CPI can improve by better integrating safety into the design process. At present, conceptual designers lack the means to design for safety. This paper discusses a methodology, Design for Safety (DFS), that strives to provide these means. It consists of two major concepts. A technology

  20. A Process and Environment Aware Sierra/SolidMechanics Cohesive Zone Modeling Capability for Polymer/Solid Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Reedy, E. D. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Chambers, Robert S. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Hughes, Lindsey Gloe [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Kropka, Jamie Michael [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Stavig, Mark E. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Stevens, Mark J. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The performance and reliability of many mechanical and electrical components depend on the integrity of polymer-to-solid interfaces. Such interfaces are found in adhesively bonded joints, encapsulated or underfilled electronic modules, protective coatings, and laminates. The work described herein was aimed at improving Sandia's finite element-based capability to predict interfacial crack growth by 1) using a high-fidelity nonlinear viscoelastic material model for the adhesive in fracture simulations, and 2) developing and implementing a novel cohesive zone fracture model that generates a mode-mixity dependent toughness as a natural consequence of its formulation (i.e., generates the observed increase in interfacial toughness with increasing crack-tip interfacial shear). Furthermore, molecular dynamics simulations were used to study fundamental material/interfacial physics so as to develop a fuller understanding of the connection between molecular structure and failure. Also reported are test results that quantify how joint strength and interfacial toughness vary with temperature.
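
    As background, cohesive zone models of the kind described above are built around a traction-separation law whose area equals the interfacial toughness. The bilinear law sketched below is a generic example with assumed peak traction and separations; it does not reproduce the mode-mixity dependence or the viscoelastic adhesive model of the Sierra/SolidMechanics capability.

```python
import numpy as np

# Minimal sketch of a bilinear (triangular) cohesive traction-separation law.
# Peak traction and critical separations are illustrative assumptions.

t_peak = 50.0e6      # peak cohesive traction [Pa] (assumed)
d_peak = 1.0e-6      # separation at peak traction [m] (assumed)
d_fail = 10.0e-6     # separation at complete failure [m] (assumed)

def traction(d):
    """Normal traction as a function of opening separation d [m]."""
    d = np.asarray(d, dtype=float)
    rising = np.where(d <= d_peak, t_peak * d / d_peak, 0.0)
    softening = np.where((d > d_peak) & (d < d_fail),
                         t_peak * (d_fail - d) / (d_fail - d_peak), 0.0)
    return rising + softening

# The work of separation (area under the curve) is the interfacial toughness G_c.
d = np.linspace(0.0, d_fail, 2001)
G_c = np.trapz(traction(d), d)
print(f"toughness of this law: {G_c:.1f} J/m^2 (analytic: {0.5 * t_peak * d_fail:.1f})")
```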

  1. Conceptual Model of Dynamic Geographic Environment

    Directory of Open Access Journals (Sweden)

    Martínez-Rosales Miguel Alejandro

    2014-04-01

    Full Text Available In geographic environments, there are many different types of geographic entities, such as automobiles, trees, persons, buildings, storms, hurricanes, etc. These entities can be classified into two groups: geographic objects and geographic phenomena. By its nature, a geographic environment is dynamic; thus, its static modeling is not sufficient. Considering the dynamics of the geographic environment, a new type of geographic entity called an event is introduced. The primary target is the modeling of the geographic environment as an event sequence, because in this case the semantic relations are much richer than in the case of static modeling. In this work, the conceptualization of this model is proposed. It is based on the idea of processing each entity separately instead of processing the environment as a whole. After that, the so-called history of each entity and its spatial relations to other entities are defined to describe the whole environment. The main goal is to model, at a conceptual level, systems that make use of spatial and temporal information, so that later it can serve as the semantic engine for such systems.
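
    The per-entity treatment described above (each entity carrying its own event history and spatial relations, rather than the environment being modeled as a single snapshot) can be captured with a small data structure; the sketch below uses class and field names invented for this illustration, not the paper's own terminology.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Minimal sketch: each geographic entity keeps its own history of events plus
# spatial relations to other entities; the environment is the set of these histories.

@dataclass
class Event:
    timestamp: float
    description: str            # e.g. "hurricane makes landfall", "car enters street"

@dataclass
class GeoEntity:
    name: str
    kind: str                   # "object" (car, tree, building) or "phenomenon" (storm)
    history: List[Event] = field(default_factory=list)
    relations: List[Tuple[str, str]] = field(default_factory=list)  # (relation, other entity)

    def record(self, t: float, what: str) -> None:
        self.history.append(Event(t, what))

storm = GeoEntity("Hurricane H1", "phenomenon")
bridge = GeoEntity("Bay Bridge", "object")
storm.record(0.0, "forms offshore")
storm.record(6.0, "makes landfall")
storm.relations.append(("approaches", bridge.name))
print([e.description for e in storm.history])
```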

  2. Modeling hyporheic zone processes

    Science.gov (United States)

    Runkel, Robert L.; McKnight, Diane M.; Rajaram, Harihar

    2003-01-01

    Stream biogeochemistry is influenced by the physical and chemical processes that occur in the surrounding watershed. These processes include the mass loading of solutes from terrestrial and atmospheric sources, the physical transport of solutes within the watershed, and the transformation of solutes due to biogeochemical reactions. Research over the last two decades has identified the hyporheic zone as an important part of the stream system in which these processes occur. The hyporheic zone may be loosely defined as the porous areas of the stream bed and stream bank in which stream water mixes with shallow groundwater. Exchange of water and solutes between the stream proper and the hyporheic zone has many biogeochemical implications, due to differences in the chemical composition of surface and groundwater. For example, surface waters are typically oxidized environments with relatively high dissolved oxygen concentrations. In contrast, reducing conditions are often present in groundwater systems leading to low dissolved oxygen concentrations. Further, microbial oxidation of organic materials in groundwater leads to supersaturated concentrations of dissolved carbon dioxide relative to the atmosphere. Differences in surface and groundwater pH and temperature are also common. The hyporheic zone is therefore a mixing zone in which there are gradients in the concentrations of dissolved gasses, the concentrations of oxidized and reduced species, pH, and temperature. These gradients lead to biogeochemical reactions that ultimately affect stream water quality. Due to the complexity of these natural systems, modeling techniques are frequently employed to quantify process dynamics.

  3. Managing environment models in multi robot teams

    Science.gov (United States)

    2016-10-09

    ...constraints and optimizing higher level criteria. Local trajectory planning requires a precise geometric model of the environment, possibly complemented... localization following processes: [Fig. 3: the three robots Minnie, Mana and Momo] 1) A Digital Terrain Model (DTM) is a raster representation that... from the embedded layers: number of PCDs, precision of the raster model and last update timestamp. In order to handle arbitrarily large maps, the tile

  4. Teaching process writing in an online environment

    OpenAIRE

    Carolan, Fergal; Kyppö, Anna

    2015-01-01

    This reflective practice paper offers some insights into teaching an interdisciplinary academic writing course aimed at promoting process writing. The study reflects on students’ acquisition of writing skills and the teacher’s support practices in a digital writing environment. It presents writers’ experiences related to various stages of process writing, their growing awareness of becoming good writers but also the constant struggle with common writing problems. Preconceive...

  5. Teaching Process Writing in an Online Environment

    Science.gov (United States)

    Carolan, Fergal; Kyppö, Anna

    2015-01-01

    This reflective practice paper offers some insights into teaching an interdisciplinary academic writing course aimed at promoting process writing. The study reflects on students' acquisition of writing skills and the teacher's support practices in a digital writing environment. It presents writers' experiences related to various stages of process…

  6. A process algebra software engineering environment

    NARCIS (Netherlands)

    Diertens, B.

    2008-01-01

    In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. In this article we summarize that work and describe the software development process

  7. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  8. Mining local process models

    OpenAIRE

    Tax, Niek; Sidorova, Natalia; Haakma, Reinder; van der Aalst, Wil M.P.

    2016-01-01

    In this paper we describe a method to discover frequent behavioral patterns in event logs. We express these patterns as local process models. Local process model mining can be positioned in-between process discovery and episode / sequential pattern mining. The technique presented in this paper is able to learn behavioral patterns involving sequential composition, concurrency, choice and loop, like in process mining. However, we do not look at start-to-end models, which distinguishes ou...

  9. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  10. Cabin Environment Physics Risk Model

    Science.gov (United States)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of the crew critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.
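
    To illustrate why propagation time matters, the back-of-the-envelope sketch below estimates how long a hypothetical loss of CO2 removal would take to drive the cabin CO2 partial pressure to a hazardous level. Every number used (crew size, free volume, metabolic CO2 rate, hazard threshold) is an assumed placeholder, not a value from the CEPR model.

```python
# Minimal sketch of the propagation-time idea: after an assumed scrubber failure,
# the cabin atmosphere degrades over a finite time rather than instantly.

R = 8.314            # gas constant [J/(mol K)]
T = 294.0            # cabin temperature [K] (assumed)
V = 10.0             # free cabin volume [m^3] (assumed)
crew = 4             # crew size (assumed)
co2_rate = 1.0e3 / 44.0 / 86400.0   # mol/s per crew member (~1 kg CO2 per day, assumed)
pp_limit = 2000.0    # hazardous CO2 partial pressure [Pa] (~15 mmHg, assumed limit)
pp_start = 400.0     # partial pressure at the time of the failure [Pa] (assumed)

# With removal failed, partial pressure rises roughly linearly: dp/dt = n_dot * R * T / V.
dp_dt = crew * co2_rate * R * T / V
time_to_hazard_h = (pp_limit - pp_start) / dp_dt / 3600.0
print(f"time from scrubber failure to hazardous ppCO2: {time_to_hazard_h:.1f} h")
```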

  11. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models....... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety...... of process industries. The book builds on the extensive modelling experience of the authors, who have developed models for both research and industrial purposes. It complements existing books by the authors in the modelling area. Those areas include the traditional petroleum and petrochemical industries...

  12. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.......This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models....... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety

  13. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models...... of process industries. The book builds on the extensive modelling experience of the authors, who have developed models for both research and industrial purposes. It complements existing books by the authors in the modelling area. Those areas include the traditional petroleum and petrochemical industries...... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners....

  14. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar Saavedra, J.A.; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  15. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    Purpose – The paper aims: 1) To develop systematically a structural list of various business model process configuration and to group (deductively) these selected configurations in a structured typological categorization list. 2) To facilitate companies in the process of BM innovation, by develop......Purpose – The paper aims: 1) To develop systematically a structural list of various business model process configuration and to group (deductively) these selected configurations in a structured typological categorization list. 2) To facilitate companies in the process of BM innovation......, by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction...... method of data analysis. Findings - A comprehensive literature review and analysis resulted in a list of business model process configurations systematically organized under five classification groups, namely, revenue model; value proposition; value configuration; target customers, and strategic...

  16. Microbial processes in fractured rock environments

    Science.gov (United States)

    Kinner, Nancy E.; Eighmy, T. Taylor; Mills, M.; Coulburn, J.; Tisa, L.

    Little is known about the types and activities of microbes in fractured rock environments, but recent studies in a variety of bedrock formations have documented the presence of a diverse array of prokaryotes (Eubacteria and Archaea) and some protists. The prokaryotes appear to live in both diffusion-dominated microfractures and larger, more conductive open fractures. Some of the prokaryotes are associated with the surfaces of the host rock and mineral precipitates, while other planktonic forms are floating/moving in the groundwater filling the fractures. Studies indicate that the surface-associated and planktonic communities are distinct, and their importance in microbially mediated processes occurring in the bedrock environment may vary, depending on the availability of electron donors/acceptors and nutrients needed by the cells. In general, abundances of microbes are low compared with other environments, because of the paucity of these substances that are transported into the deeper subsurface where most bedrock occurs, unless there is significant pollution with an electron donor. To obtain a complete picture of the microbes present and their metabolic activity, it is usually necessary to sample formation water from specific fractures (versus open boreholes), and fracture surfaces (i.e., cores). Transport of the microbes through the major fracture pathways can be rapid, but may be quite limited in the microfractures. Very low abundances of small ( 2-3 μm) flagellated protists, which appear to prey upon planktonic bacteria, have been found in a bedrock aquifer. Much more research is needed to expand the understanding of all microbial processes in fractured rock environments.

  17. ADOxx Modelling Method Conceptualization Environment

    Directory of Open Access Journals (Sweden)

    Nesat Efendioglu

    2017-04-01

    Full Text Available The importance of Modelling Methods Engineering is rising together with the importance of domain-specific languages (DSL) and individual modelling approaches. In order to capture the relevant semantic primitives for a particular domain, it is necessary to involve both (a) domain experts, who identify relevant concepts, and (b) method engineers, who compose a valid and applicable modelling approach. This process consists of the conceptual design of a formal or semi-formal modelling method as well as the reliable, migratable, maintainable and user-friendly software development of the resulting modelling tool. The Modelling Method Engineering cycle is often under-estimated, as the conceptual architecture requires formal verification and the tool implementation requires practical usability; hence we propose a guideline and corresponding tools to support actors with different backgrounds along this complex engineering process. Based on practical experience in business, more than twenty research projects within the EU framework programmes and a number of bilateral research initiatives, this paper introduces the phases, a corresponding toolbox and lessons learned, with the aim of supporting the engineering of a modelling method. The proposed approach is illustrated and validated within use cases from three different EU-funded research projects in the fields of (1) Industry 4.0, (2) e-learning and (3) cloud computing. The paper discusses the approach, the evaluation results and derived outlooks.

  18. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    of process industries. The book builds on the extensive modelling experience of the authors, who have developed models for both research and industrial purposes. It complements existing books by the authors in the modelling area. Those areas include the traditional petroleum and petrochemical industries...... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners....

  19. Modeling excessive nutrient loading in the environment.

    Science.gov (United States)

    Reckhow, K H; Chapra, S C

    1999-01-01

    Models addressing excessive nutrient loading in the environment originated over 50 years ago with the simple nutrient concentration thresholds proposed by Sawyer (1947. Fertilization of lakes by agricultural and urban drainage. New Engl. Water Works Assoc. 61, 109-127). Since then, models have improved due to progress in modeling techniques and technology as well as enhancements in scientific knowledge. Several of these advances are examined here. Among the recent approaches in modeling techniques we review are error propagation, model confirmation, generalized sensitivity analysis, and Bayesian analysis. In the scientific arena and process characterization, we focus on advances in surface water modeling, discussing enhanced modeling of organic carbon, improved hydrodynamics, and refined characterization of sediment diagenesis. We conclude with some observations on future needs and anticipated developments.
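
    One of the techniques named here, error propagation, can be illustrated with a hedged Monte Carlo sketch of a simple export-coefficient loading estimate; the land-use classes, coefficients and their spreads are invented for illustration and are not taken from the cited works.

    import random

    # Hypothetical export coefficients (kg P / ha / yr) and standard deviations.
    COEFF = {"cropland": (1.5, 0.4), "urban": (1.0, 0.3), "forest": (0.1, 0.05)}
    AREA_HA = {"cropland": 800.0, "urban": 200.0, "forest": 1500.0}

    def sampled_load():
        """One Monte Carlo realisation of total watershed phosphorus load."""
        total = 0.0
        for land_use, (mean, sd) in COEFF.items():
            total += max(random.gauss(mean, sd), 0.0) * AREA_HA[land_use]
        return total

    loads = sorted(sampled_load() for _ in range(10_000))
    print("median load :", round(loads[len(loads) // 2], 1), "kg P/yr")
    print("90% interval:", round(loads[500], 1), "-", round(loads[9500], 1), "kg P/yr")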

  20. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    hydrofoil shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements...... zones in mind, its special construction poses strict demands to the hydrodynamic model. In case study three the model is extended to a three-phase model where also the injection of air bubbles during the aeration process is modeled. The aeration of sludge is controlled through a simple expression...... for the reoxygenation of the wastewater phase as a function of the local volume fraction of air and the concentration of soluble oxygen. A simple model for the bulk consumption of oxygen is linked to the reoxygenation expression in order to model measured oxygen concentrations in the suspension. In the final case study...

  1. Open source integrated modeling environment Delta Shell

    Science.gov (United States)

    Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.

    2012-04-01

    In the last decade, integrated modelling has become a very popular topic in environmental modelling, since it helps solve problems that are difficult to address with a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remain challenging tasks. The integrated modelling environment Delta Shell simplifies this task. The software components of Delta Shell are easy to reuse separately from each other as well as part of an integrated environment that can run in a command-line or a graphical user interface mode. Most components of Delta Shell are developed using the C# programming language and include libraries used to define, save and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models from the end-user and developer perspectives. The first example shows the coupling of a rainfall-runoff model, a river flow model and a run-time control model. The second example shows how a coastal morphological database integrates with the coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is also available as open-source software released under the LGPL license and accessible via http://oss.deltares.nl.
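
    The kind of coupling described in the first example can be pictured with a minimal, hypothetical sketch in which a rainfall-runoff component feeds a river-flow component inside one shared time loop. The class and method names are illustrative only and do not correspond to the Delta Shell API (which is C#-based); this is a generic Python sketch of the coupling pattern.

    class RainfallRunoff:
        """Toy linear-reservoir runoff model (illustrative only)."""
        def __init__(self, storage=0.0, recession=0.1):
            self.storage, self.recession = storage, recession

        def step(self, rainfall_mm):
            self.storage += rainfall_mm
            discharge = self.recession * self.storage
            self.storage -= discharge
            return discharge

    class RiverFlow:
        """Toy routing model: a single linear-storage reach (illustrative only)."""
        def __init__(self, volume=0.0, travel_coeff=0.3):
            self.volume, self.travel_coeff = volume, travel_coeff

        def step(self, inflow):
            self.volume += inflow
            outflow = self.travel_coeff * self.volume
            self.volume -= outflow
            return outflow

    runoff, river = RainfallRunoff(), RiverFlow()
    rain_series = [0.0, 5.0, 12.0, 3.0, 0.0, 0.0]
    for t, rain in enumerate(rain_series):
        q_land = runoff.step(rain)        # rainfall-runoff component
        q_river = river.step(q_land)      # river-flow component fed by the first
        print(f"t={t}  runoff={q_land:.2f}  river outflow={q_river:.2f}")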

  2. Microbial consortia in meat processing environments

    Science.gov (United States)

    Alessandria, V.; Rantsiou, K.; Cavallero, M. C.; Riva, S.; Cocolin, L.

    2017-09-01

    Microbial contamination in food processing plants can play a fundamental role in food quality and safety. The description of the microbial consortia in the meat processing environment is important since it is a first step in understanding possible routes of product contamination. Furthermore, it may contribute in the development of sanitation programs for effective pathogen removal. The purpose of this study was to characterize the type of microbiota in the environment of meat processing plants: the microbiota of three different meat plants was studied by both traditional and molecular methods (PCR-DGGE) in two different periods. Different levels of contamination emerged between the three plants as well as between the two sampling periods. Conventional methods of killing free-living bacteria through antimicrobial agents and disinfection are often ineffective against bacteria within a biofilm. The use of gas-discharge plasmas potentially can offer a good alternative to conventional sterilization methods. The purpose of this study was to measure the effectiveness of Atmospheric Pressure Plasma (APP) surface treatments against bacteria in biofilms. Biofilms produced by three different L. monocytogenes strains on stainless steel surface were subjected to three different conditions (power, exposure time) of APP. Our results showed how most of the culturable cells are inactivated after the Plasma exposure but the RNA analysis by qPCR highlighted the entrance of the cells in the viable-but non culturable (VBNC) state, confirming the hypothesis that cells are damaged after plasma treatment, but in a first step, still remain alive. The understanding of the effects of APP on the L. monocytogenes biofilm can improve the development of sanitation programs with the use of APP for effective pathogen removal.

  3. Spacecraft Internal Acoustic Environment Modeling

    Science.gov (United States)

    Chu, SShao-sheng R.; Allen, Christopher S.

    2009-01-01

    carried out by acquiring octave band microphone data simultaneously at ten fixed locations throughout the mockup. SPLs (Sound Pressure Levels) predicted by our SEA model match well with measurements for our CM mockup, with a more complicated shape. Additionally in FY09, background NC noise (Noise Criterion) simulation and MRT (Modified Rhyme Test) were developed and performed in the mockup to determine the maximum noise level in CM habitable volume for fair crew voice communications. Numerous demonstrations of simulated noise environment in the mockup and associated SIL (Speech Interference Level) via MRT were performed for various communities, including members from NASA and Orion prime-/sub-contractors. Also, a new HSIR (Human-Systems Integration Requirement) for limiting pre- and post-landing SIL was proposed.

  4. Estimation of environment-related properties of chemicals for design of sustainable processes: Development of group-contribution+ (GC+) models and uncertainty analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Kalakul, Sawitree; Sarup, Bent

    2012-01-01

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI)) based 15 property models to provide reliable estimations of environment-related properties of organic chemicals together with uncertainties of estimated...... property values. For this purpose, a systematic methodology for property modeling and uncertainty analysis is used. The methodology includes a parameter estimation step to determine parameters of property models and an uncertainty analysis step to establish statistical information about the quality......, poly functional chemicals, etc.) taken from the database of the US Environmental Protection Agency (EPA) and from the database of USEtox is used. For property modeling and uncertainty analysis, the Marrero and Gani GC method and atom connectivity index method have been considered. In total, 22...
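
    A hedged sketch of the group-contribution idea combined with first-order uncertainty propagation is shown below; the groups, contribution values and standard errors are invented placeholders and are not parameters of the Marrero and Gani method.

    import math

    # Hypothetical group contributions to some log-scale environmental property,
    # each with an (invented) standard error from parameter estimation.
    CONTRIBUTIONS = {          # group: (value, std_error)
        "CH3": (0.52, 0.05),
        "CH2": (0.31, 0.03),
        "OH":  (-1.10, 0.12),
    }

    def estimate_property(group_counts, intercept=1.0):
        """Sum-of-groups estimate plus a linear-propagation standard error."""
        value, variance = intercept, 0.0
        for group, n in group_counts.items():
            c, se = CONTRIBUTIONS[group]
            value += n * c
            variance += (n * se) ** 2       # contributions assumed independent
        return value, math.sqrt(variance)

    # Example: a hypothetical molecule with 2 CH3, 3 CH2 and 1 OH group.
    est, se = estimate_property({"CH3": 2, "CH2": 3, "OH": 1})
    print(f"estimate = {est:.2f} +/- {1.96 * se:.2f} (95% CI)")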

  5. Foam process models.

    Energy Technology Data Exchange (ETDEWEB)

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses an homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
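
    The role of a time- and temperature-dependent density model can be illustrated with a simple, hypothetical functional form; the exponential relaxation and Arrhenius-like rate below are placeholders, not the empirical model calibrated in the report.

    import math

    def foam_density(t, temperature_K, rho_initial=1100.0, rho_final=60.0,
                     k_ref=0.02, t_ref_K=300.0, activation_K=4000.0):
        """Bulk foam density (kg/m^3) relaxing from liquid to fully foamed value.

        rho(t, T) = rho_final + (rho_initial - rho_final) * exp(-k(T) * t),
        with an Arrhenius-like temperature dependence of the rate k(T).
        All parameter values are illustrative placeholders.
        """
        k = k_ref * math.exp(-activation_K * (1.0 / temperature_K - 1.0 / t_ref_K))
        return rho_final + (rho_initial - rho_final) * math.exp(-k * t)

    for t in (0.0, 30.0, 60.0, 120.0):
        print(f"t={t:5.1f} s  rho={foam_density(t, 320.0):7.1f} kg/m^3")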

  6. Near-field environment/processes working group summary

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, W.M. [Center for Nuclear Waste Regulatory Analyses, San Antonio, TX (United States)

    1995-09-01

    This article is a summary of the proceedings of a group discussion which took place at the Workshop on the Role of Natural Analogs in Geologic Disposal of High-Level Nuclear Waste in San Antonio, Texas on July 22-25, 1991. The working group concentrated on the subject of the near-field environment to geologic repositories for high-level nuclear waste. The near-field environment may be affected by thermal perturbations from the waste, and by disturbances caused by the introduction of exotic materials during construction of the repository. This group also discussed the application of modelling of performance-related processes.

  7. Radiolysis Process Model

    Energy Technology Data Exchange (ETDEWEB)

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time-scale up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2-, eaq, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment; the most sensitive parameters have been identified with respect to predictions of a radiolysis model under typical conditions. As compared with the full model with about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^-5 and to preserve most of the predictions for major species. This allows a systematic approach for model simplification and offers guidance in designing experiments for validation.
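
    A toy version of the reduced-kinetics idea, tracking only a radiolytic H2O2 source and a lumped first-order sink instead of the full ~100-reaction set, might look like the sketch below; both rate constants are invented for illustration and are not values from the report.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative 2-term model: constant radiolytic production of H2O2 plus
    # a lumped first-order consumption term (e.g. reaction at the fuel surface).
    G_PRODUCTION = 1.0e-9      # mol L^-1 s^-1, hypothetical dose-rate-scaled source
    K_CONSUMPTION = 1.0e-4     # s^-1, hypothetical lumped first-order sink

    def rhs(t, y):
        h2o2 = y[0]
        return [G_PRODUCTION - K_CONSUMPTION * h2o2]

    sol = solve_ivp(rhs, (0.0, 1.0e5), [0.0], dense_output=True)
    times = np.linspace(0.0, 1.0e5, 5)
    for ti, ci in zip(times, sol.sol(times)[0]):
        print(f"t = {ti:8.0f} s   [H2O2] = {ci:.3e} mol/L")
    # Steady state approaches G_PRODUCTION / K_CONSUMPTION = 1e-5 mol/L,
    # the kind of quantity a simplified reaction set must reproduce accurately.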

  8. Process competencies in a problem and project based learning environment

    DEFF Research Database (Denmark)

    Du, Xiangyun; Kolmos, Anette

    2006-01-01

    competencies. Consequently, engineering education is facing challenges regarding how to facilitate students with scientific-technological competencies as well as process competencies. Problem based learning (PBL) as an educational model is regarded as an effective example regarding preparing students......Future engineers are not only required to master technological competencies concerning solving problems, producing and innovating technology, they are also expected to have capabilities of cooperation, communication, and project management in diverse social context, which are referred to as process...... with the expected professional competencies. Based on the educational practice of PBL Aalborg Model, which is characterized by problem-orientation, project-organization and team work, this paper examines the process of developing process competencies through studying engineering in a PBL environment from...

  9. Modelling of Indoor Environments Using Lindenmayer Systems

    Science.gov (United States)

    Peter, M.

    2017-09-01

    Documentation of the "as-built" state of building interiors has gained a lot of interest in recent years. Various data acquisition methods exist, e.g. extraction from photographed evacuation plans using image processing or, most prominently, indoor mobile laser scanning. Due to clutter or data gaps as well as errors during data acquisition and processing, automatic reconstruction of CAD/BIM-like models from these data sources is not a trivial task. Thus, reconstruction is often supported by general rules for the perpendicularity and parallelism that are predominant in man-made structures. Indoor environments of large, public buildings, however, often also follow higher-level rules like symmetry and repetition of, e.g., room sizes and corridor widths. In the context of reconstruction of city elements (e.g. street networks) or building elements (e.g. façade layouts), formal grammars have been put to use. In this paper, we describe the use of Lindenmayer systems - which were originally developed for the computer-based modelling of plant growth - to model and reproduce the layout of indoor environments in 2D.
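
    As a concrete illustration of the rewriting mechanism behind Lindenmayer systems, a minimal string-rewriting implementation is shown below; the axiom and production rule are arbitrary examples and are not the rules the paper derives for indoor layouts.

    def lsystem(axiom, rules, iterations):
        """Apply parallel string-rewriting rules to an axiom n times."""
        s = axiom
        for _ in range(iterations):
            s = "".join(rules.get(ch, ch) for ch in s)
        return s

    # Example rule loosely in the spirit of a corridor that spawns side rooms:
    # C = corridor segment, [ ] = push/pop a branch, R = room.
    rules = {"C": "C[R]C"}
    for i in range(3):
        print(i, lsystem("C", rules, i))
    # 0 C
    # 1 C[R]C
    # 2 C[R]C[R]C[R]C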

  10. Process algebra modelling styles for biomolecular processes

    OpenAIRE

    Calder, M.; Hillston, J.

    2009-01-01

    We investigate how biomolecular processes are modelled in process algebras, focussing on chemical reactions. We consider various modelling styles and how design decisions made in the definition of the process algebra have an impact on how a modelling style can be applied. Our goal is to highlight the often implicit choices that modellers make in choosing a formalism, and illustrate, through the use of examples, how this can affect expressability as well as the type and complexity of the analy...

  11. Neuroscientific Model of Motivational Process

    Science.gov (United States)

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  12. Neuroscientific model of motivational process.

    Science.gov (United States)

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  13. Automated Environment Generation for Software Model Checking

    Science.gov (United States)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.

  14. Resolving the impact of short-term variations in physical processes impacting on the spawning environment of eastern Baltic cod : application of a 3-D hydrodynamic model

    DEFF Research Database (Denmark)

    Hinrichsen, H.H.; St. John, Michael; Lehmann, A.

    2002-01-01

    water into the Baltic, modifying wind stress, freshwater runoff and thermal inputs. The model is started from three-dimensional fields of temperature, salinity and oxygen obtained from a previous model run and forced by realistic atmospheric conditions. Results of this realistic reference run were...... cod. Recent research has identified the importance of inflows of saline and oxygenated North Sea water into the Baltic Sea for the recruitment of Baltic cod. However, other processes have been suggested to modify this reproduction volume including variations in timing and volume of terrestrial runoff...... compared to runs with modified meteorological forcing conditions and river runoff. From these simulations, it is apparent that processes other than major Baltic inflows have the potential to alter the reproduction volume of Baltic cod. Low near-surface air temperatures in the North Sea, the Skagerrak...

  15. Dynamic process management for engineering environments

    NARCIS (Netherlands)

    Mentink, R.J.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    2003-01-01

    The research presented in this paper proposes a concept for dynamic process management as part of an integrated approach to engineering process support. The theory of information management is the starting point for the development of a process management system based on evolution of information

  16. ERP processes automation in corporate environments

    Directory of Open Access Journals (Sweden)

    Antonoaie Victor

    2017-01-01

    Full Text Available Automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP projects where this technology was implemented and a meaningful impact was obtained.

  17. Data model for Process Documentation

    OpenAIRE

    Munroe, S; Groth, P; Jiang, Sheng; Miles, S; Tan, V; Moreau, L

    2006-01-01

    This document describes the data model for process documentation, i.e. information describing a process. It starts by describing the logical organisation of process documentation, before drilling down into the models of the different forms of process documentation. It then describes how individual pieces of process documentation and data items can be identified. Finally, a model of context is provided.

  18. Modeling the Environment of a Mobile Security Robot

    Science.gov (United States)

    1990-06-01

    AD-A233 074, Technical Document 1835, June 1990. Modeling the Environment of a Mobile Security Robot. H. R. Everett (Code 5303), G. A. Gilbreath, T. Tran. The recoverable fragments of the abstract describe a two-dimensional probability distribution plot showing the perceived location of nearby objects in the environment, presented as a plan view of the environment, together with head position control, acoustical and image processing, drive and steering, and speech synthesis.

  19. Sanitation in the Shell Egg Processing Environment

    Science.gov (United States)

    In the past, most of the regulations regarding egg processing were concerned with quality rather than safety. Hazard Analysis and Critical Control Point (HACCP) will be required by retailers or by the federal government. GMPs (Good Manufacturing Practices) and SSOPs (Sanitation Standard Operating P...

  20. Virtual Research Environments for Natural Hazard Modelling

    Science.gov (United States)

    Napier, Hazel; Aldridge, Tim

    2017-04-01

    The Natural Hazards Partnership (NHP) is a group of 17 collaborating public sector organisations providing a mechanism for co-ordinated advice to government and agencies responsible for civil contingency and emergency response during natural hazard events. The NHP has set up a Hazard Impact Model (HIM) group tasked with modelling the impact of a range of UK hazards with the aim of delivery of consistent hazard and impact information. The HIM group consists of 7 partners initially concentrating on modelling the socio-economic impact of 3 key hazards - surface water flooding, land instability and high winds. HIM group partners share scientific expertise and data within their specific areas of interest including hydrological modelling, meteorology, engineering geology, GIS, data delivery, and modelling of socio-economic impacts. Activity within the NHP relies on effective collaboration between partners distributed across the UK. The NHP are acting as a use case study for a new Virtual Research Environment (VRE) being developed by the EVER-EST project (European Virtual Environment for Research - Earth Science Themes: a solution). The VRE is allowing the NHP to explore novel ways of cooperation including improved capabilities for e-collaboration, e-research, automation of processes and e-learning. Collaboration tools are complemented by the adoption of Research Objects, semantically rich aggregations of resources enabling the creation of uniquely identified digital artefacts resulting in reusable science and research. Application of the Research Object concept to HIM development facilitates collaboration, by encapsulating scientific knowledge in a shareable format that can be easily shared and used by partners working on the same model but within their areas of expertise. This paper describes the application of the VRE to the NHP use case study. It outlines the challenges associated with distributed partnership working and how they are being addressed in the VRE. A case

  1. Service Oriented Spacecraft Modeling Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The I-Logix team proposes development of the Service Oriented Spacecraft Modeling Environment (SOSME) to allow faster and more effective spacecraft system design...

  2. Preface. Forest ecohydrological processes in a changing environment.

    Science.gov (United States)

    Xiaohua Wei; Ge Sun; James Vose; Kyoichi Otsuki; Zhiqiang Zhang; Keith Smetterm

    2011-01-01

    The papers in this issue are a selection of the presentations made at the second International Conference on Forests and Water in a Changing Environment. This special issue ‘Forest Ecohydrological Processes in a Changing Environment’ covers the topics regarding the effects of forest, land use and climate changes on ecohydrological processes across forest stand,...

  3. Space Environments and Effects: Trapped Proton Model

    Science.gov (United States)

    Huston, S. L.; Kauffman, W. (Technical Monitor)

    2002-01-01

    An improved model of the Earth's trapped proton environment has been developed. This model, designated Trapped Proton Model version 1 (TPM-1), determines the omnidirectional flux of protons with energy between 1 and 100 MeV throughout near-Earth space. The model also incorporates a true solar cycle dependence. The model consists of several data files and computer software to read them. There are three versions of the model: a FORTRAN-callable library, a stand-alone model, and a Web-based model.

  4. MIXED SUBSTRATES IN ENVIRONMENT AND BIOTECHNOLOGICAL PROCESSES

    Directory of Open Access Journals (Sweden)

    T. P. Pirog

    2013-12-01

    Full Text Available The modern literature and our own experimental data on the use of substrate mixtures for the intensification of microbial synthesis technologies for practically valuable fermentation products (ethanol, lactic acid, butanediol), primary metabolites (amino acids, n-hydroxybenzoate, triglycerides) and secondary metabolites (lovastatin, surfactants), as well as for the intensification of the biodegradation of aromatic xenobiotics (benzene, cresols, phenols, toluene) and pesticides (dimethoate), are presented. Special attention is paid to the molecular mechanisms, established in recent years, that underlie the phenomenon of catabolite repression in Gram-positive (Bacillus subtilis) and Gram-negative (Pseudomonas, Escherichia coli) bacteria and the yeast Saccharomyces cerevisiae, and to the use of these data to develop technologies for the utilization of plant biomass to produce industrially important metabolites. The survival strategies of heterotrophic microorganisms in natural oligotrophic environments are considered, including the simultaneous use of multiple substrates, which improves kinetic characteristics, gives them a competitive advantage and provides significant metabolic/physiological flexibility. Our own experimental data on the use of mixtures of growth substrates for the intensification of surfactant synthesis by Rhodococcus erythropolis IMV Ac-5017 and Acinetobacter calcoaceticus IMV B-7241 are summarized. The dependence of surfactant synthesis in a mixture of an energy-excess (hexadecane) and an energy-deficient (glycerol, ethanol) substrate on the way of inoculum preparation, the concentration of mono-substrates in the mixture, and their molar ratio was determined.

  5. Auditory processing models

    DEFF Research Database (Denmark)

    Dau, Torsten

    2008-01-01

    The Handbook of Signal Processing in Acoustics will compile the techniques and applications of signal processing as they are used in the many varied areas of Acoustics. The Handbook will emphasize the interdisciplinary nature of signal processing in acoustics. Each Section of the Handbook...... will present topics on signal processing which are important in a specific area of acoustics. These will be of interest to specialists in these areas because they will be presented from their technical perspective, rather than a generic engineering approach to signal processing. Non-specialists, or specialists...

  6. Sociotechnical design processes and working environment: The case of a continuous process wok

    DEFF Research Database (Denmark)

    Broberg, Ole

    2000-01-01

    A five-year design process of a continuous process wok has been studied with the aim of elucidating the conditions for integrating working environment aspects. The design process is seen as a network building activity and as a social shaping process of the artefact. A working environment log is ...

  7. Sociotechnical design processes and working environment: The case of a continuous process wok

    DEFF Research Database (Denmark)

    Broberg, Ole

    2000-01-01

    A five-year design process of a continuous process wok has been studied with the aim of elucidating the conditions for integrating working environment aspects. The design process is seen as a network building activity and as a social shaping process of the artefact. A working environment log is s...

  8. Business process management governance model

    OpenAIRE

    Stenšak, Jožica

    2016-01-01

    The theoretical part of this Master’s thesis is based on business process management governance. The term business process governance refers to a high-level model defining structures, metrics, roles and responsibilities for the measurement, improvement and management of a company's key processes. Governance is one of the key activity areas for increased business process maturity provided by maturity models. Establishing a Center of Excellence for business process management is a key governance concept, which ena...

  9. Modeling aggregation and sedimentation of nanoparticles in the aquatic environment.

    Science.gov (United States)

    Markus, A A; Parsons, J R; Roex, E W M; de Voogt, P; Laane, R W P M

    2015-02-15

    With nanoparticles being used more and more in consumer and industrial products it is almost inevitable that they will be released into the aquatic environment. In order to understand the possible environmental risks it is important to understand their behavior in the aquatic environment. From laboratory studies it is known that nanoparticles in the aquatic environment are subjected to a variety of processes: homoaggregation, heteroaggregation to suspended particulate matter and subsequent sedimentation, dissolution and chemical transformation. This article presents a mathematical model that describes these processes and their relative contribution to the behavior of nanoparticles in the aquatic environment. After calibrating the model with existing data, it is able to adequately describe the published experimental data with a single set of parameters, covering a wide range of initial concentrations. The model shows that at the concentrations used in the laboratory, homoaggregation and sedimentation of the aggregates are the most important processes. As for the natural environment much lower concentrations are expected, heteroaggregation will play the most important role instead. More experimental datasets are required to determine if the process parameters that were found here are generally applicable. Nonetheless it is a promising tool for modeling the transport and fate of nanoparticles in watersheds and other natural water bodies. Copyright © 2014 Elsevier B.V. All rights reserved.
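
    A hedged sketch of the process structure described here (homoaggregation, heteroaggregation to suspended particulate matter, sedimentation and dissolution) as two coupled ODEs is given below; the rate constants, units and initial concentrations are invented placeholders, not the calibrated parameters of the published model.

    from scipy.integrate import solve_ivp

    # Hypothetical rate constants (concentrations in ug/L, time in hours).
    K_HOMO = 1.0e-3    # homoaggregation, second order in free particles
    K_HET  = 5.0e-4    # heteroaggregation with suspended particulate matter (SPM)
    K_SED  = 2.0e-2    # first-order settling of aggregates
    K_DISS = 1.0e-3    # first-order dissolution of free particles
    SPM    = 10.0      # mg/L, assumed constant

    def rhs(t, y):
        free, agg = y
        homo = K_HOMO * free * free
        het  = K_HET * free * SPM
        return [-homo - het - K_DISS * free,          # free nanoparticles
                homo + het - K_SED * agg]             # aggregated / SPM-bound fraction

    sol = solve_ivp(rhs, (0.0, 200.0), [100.0, 0.0], t_eval=[0, 50, 100, 200])
    for t, free, agg in zip(sol.t, sol.y[0], sol.y[1]):
        print(f"t={t:6.1f} h  free={free:7.2f}  aggregated={agg:7.2f} ug/L")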

  10. Use and perception of the environment: cultural and developmental processes

    Science.gov (United States)

    Martin M. Chemers; Irwin Altman

    1977-01-01

    This paper presents a "social systems" orientation for integrating the diverse aspects of environment, culture, and individual behavior. It suggests that a wide range of variables, including the physical environment, cultural and social processes, environmental perceptions and cognitions, behavior, and products of behavior, are connected in a complex,...

  11. Model for integrated management of quality, labor risks prevention, environment and ethical aspects, applied to R&D&I and production processes in an organization

    Science.gov (United States)

    González, M. R.; Torres, F.; Yoldi, V.; Arcega, F.; Plaza, I.

    2012-04-01

    An integrated management model for an organization is proposed. This model is based on the continuous improvement Plan-Do-Check-Act cycle, and it intends to integrate environmental, risk prevention and ethical aspects, as well as the management of research, development and innovation projects, into the general quality management structure proposed by ISO 9001:2008. It aims to fulfill the standards ISO 9001, ISO 14001, OHSAS 18001, SGE 21 and 166002.

  12. Sociotechnical design processes and working environment: The case of a continuous process wok

    DEFF Research Database (Denmark)

    Broberg, Ole

    2000-01-01

    A five-year design process of a continuous process wok has been studied with the aim of elucidating the conditions for integrating working environment aspects. The design process is seen as a network building activity and as a social shaping process of the artefact. A working environment log...... is suggested as a tool designers can use to integrate considerations of future operators' working environment....

  13. Designing user models in a virtual cave environment

    Energy Technology Data Exchange (ETDEWEB)

    Brown-VanHoozer, S. [Argonne National Lab., Idaho Falls, ID (United States); Hudson, R. [Argonne National Lab., IL (United States); Gokhale, N. [Madge Networks, San Jose, CA (United States)

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human factor studies and design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE (trademark) virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  14. [Watershed water environment pollution models and their applications: a review].

    Science.gov (United States)

    Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang

    2013-10-01

    Watershed water environment pollution models are important tools for studying watershed environmental problems. Through the quantitative description of the complicated pollution processes of the whole watershed system and its parts, such a model can identify the main sources and migration pathways of pollutants, estimate the pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviews the watershed water environment models widely applied at home and abroad, focusing on models of pollutant loading (GWLF and PLOAD), water quality of receiving water bodies (QUAL2E and WASP), and watershed models integrating pollutant loadings and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduces their structures, principles, and main characteristics, as well as the limitations in their practical applications. Other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) are also briefly introduced. Through case analysis of the applications of single models and integrated models, the development trend and application prospects of watershed water environment pollution models are discussed.

  15. Modelling decision making in an uncertain environment

    OpenAIRE

    Bullen, Guy; Ouafae, Bennis; Kratz, Frédéric

    2010-01-01

    International audience; This article proposes a generative approach to decision making in a complex and uncertain environment, as an alternative to normative or descriptive approaches. A simple and intuitive graphical model provides management teams with a non-restrictive framework for thinking through their decisions. The second half of the article proposes a mathematical model to estimate the multiple influences between decisions in a complex project, whether they are direct or indirect. Thi...

  16. Engineered Barrier System: Physical and Chemical Environment Model

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley; R. Jarek; P. Mariner

    2004-02-09

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  17. Modelling Technology for Building Fire Scene with Virtual Geographic Environment

    Science.gov (United States)

    Song, Y.; Zhao, L.; Wei, M.; Zhang, H.; Liu, W.

    2017-09-01

    A building fire is a hazardous event that can lead to disaster and massive destruction. The management and handling of building fires has always attracted much interest from researchers. An integrated Virtual Geographic Environment (VGE) is a good choice for building fire safety management and emergency decisions, in which a more realistic and richer fire process can be computed and obtained dynamically, and the results of fire simulations and analyses can be much more accurate as well. To model a building fire scene with VGE, this paper analyses the application requirements and modelling objectives of a building fire scene. Then, the four core elements of modelling a building fire scene (the building space environment, the fire event, the indoor Fire Extinguishing System (FES) and the indoor crowd) are implemented, and the relationships between the elements are also discussed. Finally, with the theory and framework of VGE, a building fire scene system is designed across the data environment, the model environment, the expression environment, and the collaborative environment. The functions and key techniques in each environment are also analysed, which may provide a reference for further development and other research on VGE.

  18. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.
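
    For orientation, a generic steady-state convection-diffusion balance of the kind such column models are built on can be written as follows (a standard textbook form in LaTeX notation, not necessarily the exact formulation used in the book):

    u \frac{\partial c}{\partial z}
      = D \frac{\partial^2 c}{\partial z^2}
      + \frac{D_r}{r} \frac{\partial}{\partial r}\left(r \frac{\partial c}{\partial r}\right)
      + Q(c),

    where u is the axial velocity, c(r, z) the concentration, D and D_r the axial and radial diffusivities, and Q(c) a volumetric source term (chemical reaction or interphase mass transfer). The average-concentration models mentioned above replace c(r, z) by its cross-sectional average \bar{c}(z), which removes the radial coordinate at the price of additional model parameters.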

  19. Organizational Knowledge Conversion and Creation Processes in a Chaotic Environment

    Directory of Open Access Journals (Sweden)

    Andrei Ștefan NESTIAN

    2013-05-01

    Full Text Available This is an explorative and conceptual paper, based on the analysis and comparison of relevant literature. The purpose of the article is to clarify the differences between knowledge creating processes and knowledge conversion processes, by analysing them when confronted with a chaotic environment. The way the knowledge conversion and creation processes are presented by Ikujiro Nonaka and his co-workers suggests the necessary existence of a Ba in order to generate the spiral of knowledge creation. This implies the acceptance of a relationship between the environment and the knowledge conversion process, in which the environment influences the knowledge creation. The article is based on the hypothesis that a chaotic environment, characterized by unpredictability, non-linearity and crisis, will lead to specific ways of functioning of the knowledge creation and conversion processes that highlight the relations between the two different types of processes. Starting from the general concept of resilience, the concept of resilience of the knowledge conversion system is proposed and explained. The role of the attractors from the chaotic environment in the creation of new knowledge is identified and explained.

  20. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    or potential process configurations operated under different conditions. In these cases, process engineering, enzyme immobilization and protein engineering are presented as fields that can offer feasible solutions for better process configurations or biocatalyst modification to enhance actual process...... proven to be useful for a fast model formulation of multi-enzyme processes. Additionally, programming codes were developed using MATLAB (The Mathworks, Natick, MA) which were also used as computational tools to support the implementation, solution and analysis of all the mathematical problems faced...

  1. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available The selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities, as well as on a proper understanding of the functionality of the information systems that shall support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, we assess whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of the research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  2. Internet User Behaviour Model Discovery Process

    OpenAIRE

    Dragos Marcel VESPAN

    2007-01-01

    The Academy of Economic Studies has more than 45000 students and about 5000 computers with Internet access which are connected to AES network. Students can access internet on these computers through a proxy server which stores information about the way the Internet is accessed. In this paper, we describe the process of discovering internet user behavior models by analyzing proxy server raw data and we emphasize the importance of such models for the e-learning environment.

  3. Internet User Behaviour Model Discovery Process

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The Academy of Economic Studies has more than 45000 students and about 5000 computers with Internet access which are connected to AES network. Students can access internet on these computers through a proxy server which stores information about the way the Internet is accessed. In this paper, we describe the process of discovering internet user behavior models by analyzing proxy server raw data and we emphasize the importance of such models for the e-learning environment.

  4. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    Full Text Available This work presents a modelling framework based on a phenomena description of the process. The approach is intended to make it easy to understand and construct process models in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, and the Modelica modelling language is used to check the proposed concept. The partial results are promising, and the research effort will be extended into a computer-aided modelling environment based on phenomena.
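
    The heat-exchanger case can be hinted at with a minimal lumped, phenomena-style energy balance; the sketch below is a generic two-stream model written in Python rather than Modelica, and all parameter values are illustrative assumptions.

    def heat_exchanger_step(t_hot, t_cold, dt=5.0, ua=500.0,
                            m_hot=2.0, cp_hot=4180.0, m_cold=3.0, cp_cold=4180.0):
        """One explicit Euler step of a lumped two-stream heat exchanger.

        Phenomena: heat transfer across the wall, Q = UA * (T_hot - T_cold),
        and accumulation in each stream's hold-up (m * cp * dT/dt = -Q or +Q).
        """
        q = ua * (t_hot - t_cold)                    # W, transfer phenomenon
        t_hot  -= q * dt / (m_hot * cp_hot)          # accumulation, hot side
        t_cold += q * dt / (m_cold * cp_cold)        # accumulation, cold side
        return t_hot, t_cold

    t_hot, t_cold = 90.0, 20.0
    for step in range(5):
        t_hot, t_cold = heat_exchanger_step(t_hot, t_cold)
        print(f"step {step}: T_hot={t_hot:.1f} C  T_cold={t_cold:.1f} C")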

  5. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  6. Model Theory for Process Algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2004-01-01

    We present a first-order extension of the algebraic theory about processes known as ACP and its main models. Useful predicates on processes, such as deadlock freedom and determinism, can be added to this theory through first-order definitional extensions. Model theory is used to analyse the

  7. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    Science.gov (United States)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and the total factory and enterprise in one seamless simulation environment.

  8. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in health care processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  9. FAME - A Flexible Appearance Modelling Environment

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Ersbøll, Bjarne Kjær; Larsen, Rasmus

    2003-01-01

    Combined modelling of pixel intensities and shape has proven to be a very robust and widely applicable approach to interpreting images. As such, the Active Appearance Model (AAM) framework has been applied to a wide variety of problems within medical image analysis. This paper summarises AAM applications within medicine and describes a public domain implementation, namely the Flexible Appearance Modelling Environment (FAME). We give guidelines for the use of this research platform, and show that the optimisation techniques used render it applicable to interactive medical applications. To increase performance and make models generalise better, we apply parallel analysis to obtain automatic and objective model truncation. Further, two different AAM training methods are compared, along with a reference case study carried out on cross-sectional short-axis cardiac magnetic resonance images and face images.

  10. A Multiagent Modeling Environment for Simulating Work Practice in Organizations

    Science.gov (United States)

    Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron

    2004-01-01

    In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual located activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost-savings of the newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems uncovered in work practices studied by the design team played a significant role in how work actually got done - actual lived work. Multi-tasking, informal assistance and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm. In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but be distinctively able to

  11. Building Information Modelling for Smart Built Environments

    OpenAIRE

    Jianchao Zhang; Boon-Chong Seet; Tek Tjing Lie

    2015-01-01

    Building information modelling (BIM) provides architectural 3D visualization and a standardized way to share and exchange building information. Recently, there has been an increasing interest in using BIM, not only for design and construction, but also the post-construction management of the built facility. With the emergence of smart built environment (SBE) technology, which embeds most spaces with smart objects to enhance the building’s efficiency, security and comfort of its occupants, th...

  12. CASES ON COLLABORATION IN VIRTUAL LEARNING ENVIRONMENTS: Processes and Interactions

    Directory of Open Access Journals (Sweden)

    Reviewed by Yasin OZARSLAN

    2010-01-01

    Full Text Available Collaboration in a virtual learning environment brings meaningful learning interactions between learners in virtual environments. This book collects case studies of collaborative virtual learning environments, focusing on the nature of human interactions in virtual spaces and defining the types and qualities of learning processes in these spaces from the perspectives of learners, teachers, designers, and professional and academic developers in various disciplines, learning communities and universities from around the world. The book addresses research cases on experiences, implementations, and applications of virtual learning environments. Its broader audience is anyone who is interested in areas such as collaborative virtual learning environments, interactive technologies and virtual communities, social interaction and social competence, and distance education and collaborative learning. The book is edited by Donna Russell, who is an Assistant Professor at the University of Missouri-Kansas City and co-owner of Arete Consulting, LLC. It consists of 358 pages covering 19 articles and provides information about the context for the characteristics and implications of the varied virtual learning environments. Topics covered in this book are argumentative interactions and learning, collaborative learning and work in digital libraries, collaborative virtual learning environments, digital communities to enhance retention, distance education, interactive technologies and virtual communities, massively multi-user virtual environments, online graduate communities, online training programs, social interaction and social competence, and virtual story-worlds.

  13. Guidance in Business Process Modelling

    Science.gov (United States)

    Bartho, Andreas; Gröner, Gerd; Rahmani, Tirdad; Zhao, Yuting; Zivkovic, Srdjan

    This chapter shows how process modellers can be supported by guidance. If a telecommunication provider introduces a value-added service, this might involve the establishment of new business processes, whose specification is not trivial. A guidance engine can help a process engineer develop a new business process by stepwise refining, i.e. creating a more concrete version of the process from an abstract version. The guidance engine identifies inconsistencies and proposes possible refinement steps. The topics covered in this chapter range from theoretical foundations of business process refinement over the formalisation of refinement problems in ontologies to implementation issues. The presented solutions were developed in the MOST project.

  14. Architecture of the Product State Model Environment

    DEFF Research Database (Denmark)

    Holm Larsen, Michael; Lynggaard, Hans Jørgen B.

    2003-01-01

    This paper addresses the issue of using product models to support product lifecycle activities, with particular focus on the production phase. The motivation of the research is that products are produced more costly and with longer lead-time than necessary. The paper provides a review of product modelling technologies and approaches, and the overall architecture for the Product State Model (PSM) Environment as a basis for quality monitoring. Especially, the paper focuses on the circumstances prevailing in a one-of-a-kind manufacturing environment like the shipbuilding industry, where product modelling ... on the development activities of the PSM architecture. An example discusses how to handle product-related information on the shop floor in a manufacturing company and focuses on how dynamically updated product data can improve control of production activities. This prototype example of welding a joint between two steel...

  15. Kinetic Modeling of Microbiological Processes

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Chongxuan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fang, Yilin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2012-08-26

    Kinetic description of microbiological processes is vital for the design and control of microbe-based biotechnologies such as waste water treatment, petroleum oil recovery, and contaminant attenuation and remediation. Various models have been proposed to describe microbiological processes. This editorial article discusses the advantages and limitations of these modeling approaches, including traditional Monod-type models and their derivatives, and recently developed constraint-based approaches. The article also offers future directions for modeling research that best suit petroleum and environmental biotechnologies.
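
    For illustration, a minimal sketch of a classic Monod-type batch growth model is given below; the parameter values are arbitrary and the sketch is not tied to any specific formulation discussed in the editorial.

```python
# Illustrative sketch of classic Monod kinetics for batch growth:
#   dX/dt = mu_max * S / (Ks + S) * X,   dS/dt = -(1/Y) * mu_max * S / (Ks + S) * X,
# integrated with a simple Euler step. All parameters are invented for the example.
def monod_batch(mu_max=0.5, Ks=0.2, Y=0.4, X0=0.05, S0=5.0, dt=0.01, t_end=48.0):
    X, S, t = X0, S0, 0.0
    history = []
    while t <= t_end:
        mu = mu_max * S / (Ks + S)      # specific growth rate, 1/h
        dX = mu * X * dt                # biomass increase over the step
        dS = -(mu / Y) * X * dt         # substrate consumption over the step
        X, S, t = X + dX, max(S + dS, 0.0), t + dt
        history.append((t, X, S))
    return history

final_t, final_X, final_S = monod_batch()[-1]
print(f"t={final_t:.1f} h, biomass={final_X:.2f}, substrate={final_S:.3f}")
```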

  16. Markov Decision Process Measurement Model.

    Science.gov (United States)

    LaMar, Michelle M

    2017-04-26

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than a partial-credit IRT model based on outcome data alone.
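
    As background, the sketch below shows plain value iteration on a toy finite MDP; the transition and reward matrices are invented for illustration, and the sketch does not reproduce the paper's estimation procedure for latent traits.

```python
# Minimal value-iteration sketch for a generic finite MDP; the toy transition
# and reward structure is invented and unrelated to the game analysed in the paper.
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """P[a][s, s'] are transition probabilities, R[a][s] expected rewards."""
    n_states = P[0].shape[0]
    V = np.zeros(n_states)
    while True:
        Q = np.array([R[a] + gamma * P[a] @ V for a in range(len(P))])
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)   # optimal values and greedy policy
        V = V_new

# Two states, two actions (toy example).
P = [np.array([[0.9, 0.1], [0.2, 0.8]]), np.array([[0.5, 0.5], [0.0, 1.0]])]
R = [np.array([1.0, 0.0]), np.array([0.0, 2.0])]
print(value_iteration(P, R))
```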

  17. Introducing ORACLE: Library Processing in a Multi-User Environment.

    Science.gov (United States)

    Queensland Library Board, Brisbane (Australia).

    Currently being developed by the State Library of Queensland, Australia, ORACLE (On-Line Retrieval of Acquisitions, Cataloguing, and Circulation Details for Library Enquiries) is a computerized library system designed to provide rapid processing of library materials in a multi-user environment. It is based on the Australian MARC format and fully…

  18. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model is developed for the raw-material supply of processing enterprises that belong to a vertically integrated structure for the production and processing of dairy raw materials. The model is distinguished by its orientation towards achieving a cumulative effect for the integrated structure, which acts as the criterion function; this function is maximized by optimizing capacities, the volumes and quality characteristics of raw-material deliveries, the costs of industrial processing of the raw materials, and the demand for dairy products.

  19. Competence model in education and training process

    OpenAIRE

    Kovac, Darko

    2008-01-01

    Traditional learning processes in contemporary management practice, especially in the global business environment, are being challenged by new approaches. They cannot be avoided in the hospitality and tourism industry. Kolb's experiential learning model is a solid base to build on. Experiential learning theory uses personal and group experiences while taking participants through various stages of learning associated with the theory. However, when talking about concrete experience, reflective obs...

  20. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    ... by ARL modelers. 2. Development Environment: The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5. Python was selected ... In addition, the Weather Running Estimate-Nowcast Real Time (WREN_RT) system currently in development is implemented in Python. WREN_RT will be a system ...

  1. An integrative model linking feedback environment and organizational citizenship behavior.

    Science.gov (United States)

    Peng, Jei-Chen; Chiu, Su-Fen

    2010-01-01

    Past empirical evidence has suggested that a positive supervisor feedback environment may enhance employees' organizational citizenship behavior (OCB). In this study, we aim to extend previous research by proposing and testing an integrative model that examines the mediating processes underlying the relationship between supervisor feedback environment and employee OCB. Data were collected from 259 subordinate-supervisor dyads across a variety of organizations in Taiwan. We used structural equation modeling to test our hypotheses. The results demonstrated that supervisor feedback environment influenced employees' OCB indirectly through (1) both positive affective-cognition and positive attitude (i.e., person-organization fit and organizational commitment), and (2) both negative affective-cognition and negative attitude (i.e., role stressors and job burnout). Theoretical and practical implications are discussed.

  2. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  3. Process material management in the Space Station environment

    Science.gov (United States)

    Perry, J. L.; Humphries, W. R.

    1988-01-01

    The Space Station will provide a unique facility for conducting material-processing and life-science experiments under microgravity conditions. These conditions place special requirements on the U.S. Laboratory for storing and transporting chemicals and process fluids, reclaiming water from selected experiments, treating and storing experiment wastes, and providing vacuum utilities. To meet these needs and provide a safe laboratory environment, the Process Material Management System (PMMS) is being developed. Preliminary design requirements and concepts related to the PMMS are addressed, and the MSFC PMMS breadboard test facility and a preliminary plan for validating the overall system design are discussed.

  4. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    Many production processes are carried out in stages. At the end of each stage, the production engineer can analyze the intermediate results and correct process parameters (variables) of the next stage. Both analysis of the process and corrections to process parameters at the next stage should ... and having three or more stages. The methods are applied to process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables ... of the next stage with the purpose of obtaining optimal or almost optimal quality of the output variable. An important aspect of the methods presented is the possibility of extensive graphic analysis of data that can provide the engineer with a detailed view of the multivariate variation in the data.

  5. Environment

    DEFF Research Database (Denmark)

    Valentini, Chiara

    2017-01-01

    The term environment refers to the internal and external context in which organizations operate. For some scholars, environment is defined as an arrangement of political, economic, social and cultural factors existing in a given context that have an impact on organizational processes and structures. For others, environment is a generic term describing a large variety of stakeholders and how these interact and act upon organizations. Organizations and their environment are mutually interdependent, and organizational communications are highly affected by the environment. This entry examines the origin and development of organization-environment interdependence, the nature of the concept of environment and its relevance for communication scholarship and activities.

  6. Task Flow Modeling in Electronic Business Environments

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available In recent years, internet-based commerce has developed as a new paradigm. Many factors, such as "at home delivery", easy ordering, and usually lower prices, contributed to the success of e-commerce. However, more recently, companies realized that one of the major factors in having a successful internet-based business is the design of a user interface that is in concordance with users' expectations, which includes both functionality and user-friendly features. The functionality of an e-business interface is one of the most important elements when discussing a specific internet-based business. In our paper, we present methods to model task flows for e-business interfaces. We strengthen our study with the design modeling of a practical scenario that may appear in an online commercial environment.

  7. Modeling Social Dynamics in a Collaborative Environment

    CERN Document Server

    Iñiguez, Gerardo; Yasseri, Taha; Kaski, Kimmo; Kertész, János

    2014-01-01

    Wikipedia is a prime example of today's value production in a collaborative environment. Using this example, we model the emergence, persistence and resolution of severe conflicts during collaboration by coupling opinion formation with article editing in a bounded-confidence dynamics. The complex social behaviour involved in article editing is implemented as a minimal model with two basic elements: (i) individuals interact directly to share information and convince each other, and (ii) they edit a common medium to establish their own opinions. The opinions of the editors and that represented by the article are characterised by a scalar variable. When the editorial pool is fixed, three regimes can be distinguished: (a) a stable mainstream article opinion is continuously contested by editors with extremist views and there is slow convergence towards consensus, (b) the article oscillates between editors with extremist views, reaching consensus relatively fast at one of the extremes, and (c) the extremist editors are...
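
    A rough sketch of a bounded-confidence dynamics with the two elements described above (direct peer interaction and edits to a shared article) is given below; the update rules and parameter values are simplified assumptions, not the exact model of the paper.

```python
# Rough sketch of a bounded-confidence dynamics coupling peer interaction with
# edits to a shared "article" opinion; rules and parameters are simplified
# assumptions for illustration only.
import random

def simulate(n_editors=50, eps=0.3, mu=0.3, steps=20000, seed=1):
    random.seed(seed)
    opinions = [random.random() for _ in range(n_editors)]
    article = 0.5
    for _ in range(steps):
        if random.random() < 0.5:
            # (i) direct interaction: two editors converge if close enough
            i, j = random.sample(range(n_editors), 2)
            if abs(opinions[i] - opinions[j]) < eps:
                d = opinions[j] - opinions[i]
                opinions[i] += mu * d
                opinions[j] -= mu * d
        else:
            # (ii) editing the common medium: a dissatisfied editor pulls the
            # article toward their view; otherwise the article influences them
            i = random.randrange(n_editors)
            if abs(opinions[i] - article) >= eps:
                article += mu * (opinions[i] - article)
            else:
                opinions[i] += mu * (article - opinions[i])
    return article, opinions

art, ops = simulate()
print(f"article opinion: {art:.2f}, editor mean: {sum(ops) / len(ops):.2f}")
```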

  8. Multiengine Speech Processing Using SNR Estimator in Variable Noisy Environments

    Directory of Open Access Journals (Sweden)

    Ahmad R. Abu-El-Quran

    2012-01-01

    Full Text Available We introduce a multiengine speech processing system that can detect the location and the type of audio signal in variable noisy environments. The system detects the location of the audio source using a microphone array; it examines the audio first, determines if it is speech or non-speech, and then estimates the signal-to-noise ratio (SNR) using a Discrete-Valued SNR Estimator. Using this SNR value, instead of trying to adapt the speech signal to the speech processing system, we adapt the speech processing system to the surrounding environment of the captured speech signal. In this paper, we introduce the Discrete-Valued SNR Estimator and a multiengine classifier, using Multiengine Selection or Multiengine Weighted Fusion, and we use SI as an example of the speech processing task. The Discrete-Valued SNR Estimator achieves an accuracy of 98.4% in characterizing the environment's SNR. Compared to a conventional single-engine SI system, the improvement in accuracy was as high as 9.0% and 10.0% for Multiengine Selection and Multiengine Weighted Fusion, respectively.
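
    As a hedged illustration only, the sketch below quantises a crude frame-energy SNR estimate to a small set of discrete levels; the thresholds, frame size and percentile heuristics are assumptions and do not reproduce the paper's Discrete-Valued SNR Estimator.

```python
# Crude frame-energy SNR estimate snapped to discrete dB levels; a stand-in for
# the idea of a discrete-valued SNR estimate, with invented parameters.
import numpy as np

def discrete_snr_db(signal, frame=256, levels=(0, 5, 10, 20, 30)):
    """Estimate SNR from frame energies and snap it to the nearest discrete level."""
    frames = signal[: len(signal) // frame * frame].reshape(-1, frame)
    energy = (frames ** 2).mean(axis=1)
    noise = np.percentile(energy, 10) + 1e-12    # quietest frames ~ noise floor
    speech = np.percentile(energy, 90)           # loudest frames ~ speech + noise
    snr_db = 10 * np.log10(max(speech - noise, 1e-12) / noise)
    return min(levels, key=lambda lvl: abs(lvl - snr_db))

rng = np.random.default_rng(0)
noise_only = rng.normal(0.0, 0.05, 4000)
speech_like = rng.normal(0.0, 0.5, 4000)         # louder segment standing in for speech
print("estimated SNR class (dB):", discrete_snr_db(np.concatenate([noise_only, speech_like])))
```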

  9. Mapping between two models of etching process

    Directory of Open Access Journals (Sweden)

    T.Patsahan

    2007-12-01

    Full Text Available We consider two models of etching processes using numerical simulations based on a cellular-automata discrete-lattice approach. In the first model we use a uniform etching probability for each surface site. In the second model the etching probability at a given site depends on the local environment of that site. In contrast to the first model, we now have a non-local description of the surface evolution. It is natural to consider the following question: is this non-locality sufficient to induce new physics? Answering this question is the main goal of the paper. We show that there exists an equivalence between the two models: the non-local model gives results similar to the local one, provided we use an effective value of the etching probability.
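
    The two etching rules can be illustrated with a toy cellular-automaton sketch such as the one below; the grid size, probabilities and neighbourhood rule are arbitrary assumptions rather than the exact models compared in the paper.

```python
# Toy cellular-automaton sketch of the two etching rules: a uniform removal
# probability vs. a probability that grows with the number of already-etched
# lateral neighbours. All parameters are arbitrary illustration values.
import random

def etch(width=60, depth=40, steps=200, p0=0.3, local=True, seed=0):
    random.seed(seed)
    solid = [[True] * width for _ in range(depth)]   # True = un-etched site
    for _ in range(steps):
        for x in range(width):
            # surface site = topmost solid cell in this column
            y = next((r for r in range(depth) if solid[r][x]), None)
            if y is None:
                continue
            if local:
                exposed = sum(not solid[y][(x + dx) % width] for dx in (-1, 1))
                p = min(1.0, p0 * (1 + exposed))     # local-environment rule
            else:
                p = p0                               # uniform rule
            if random.random() < p:
                solid[y][x] = False                  # remove the surface site
    return [next((r for r in range(depth) if solid[r][x]), depth) for x in range(width)]

profile = etch(local=True)
print("mean etch depth:", sum(profile) / len(profile))
```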

  10. Modeling Grinding Processes as Micro-Machining Operation

    African Journals Online (AJOL)

    eobe

    MODELING GRINDING PROCESSES AS MICRO-MACHINING OPERATION. A. S. Olayinka & A. C. Igboanugo. Nigerian Journal of Technology, Vol. 34, No. 3, July 2015, 515. ... to element material and grinding wheel material. Dynamic specific chip formation energy ud is determined by ...

  11. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

    Full Text Available To study economic phenomena and processes using mathematical modeling, and to determine an approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package of programs. Any economic process or phenomenon has a mathematical description of its behavior, and thus an economic-mathematical model can be drawn up, with the following stages: formulation of the problem, analysis of the process being modeled, production of the model, design verification, and validation and implementation of the model. This article presents an economic model whose modeling uses mathematical equations and the MatLab software package, which helps us approximate an effective solution. As input data we consider the net cost, the direct and total cost, and the link between them. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with graphic representation and interpretation of the results achieved in terms of our specific problem.

  12. Analysis of Interpersonal Communication Processes in Digital Factory Environments

    Science.gov (United States)

    Schütze, Jens; Baum, Heiko; Laue, Martin; Müller, Egon

    The paper outlines the scope of influence of the digital factory on interpersonal communication processes and gives an exemplary description of them. Starting from a brief description of the theoretical concepts of the digital factory, its communicative features are illustrated. Practical interrelations of interpersonal communication were analyzed from a human-oriented view at Volkswagen AG in Wolfsburg in a pilot project. A modeling method was developed within the process analysis; it makes it possible to visualize interpersonal communication and its human-oriented attributes in a technically focused workflow. Based on the results of a survey on communication analysis and on the process models of the modeling methods, it was possible to build the processes in a way suitable for humans and to obtain a positive effect on the communication processes.

  13. Methods for Process Evaluation of Work Environment Interventions

    DEFF Research Database (Denmark)

    Fredslund, Hanne; Strandgaard Pedersen, Jesper

    2004-01-01

    In recent years, intervention studies have become increasingly popular within occupational health psychology. The vast majority of such studies have focused on the interventions themselves and their effects on the working environment and employee health and well-being. Few studies have focused on how ... This paper describes how organisation theory can be used to develop a method for identifying and analysing processes in relation to the implementation of work environment interventions. The reason for using organisation theory is twofold: 1) interventions are never implemented in a vacuum but in a specific organisational context (workplace) with certain characteristics, which organisation theory can capture; 2) within the organisational sociological field there is a long tradition of studying organisational changes such as workplace interventions. In this paper, process is defined as 'individual, collective...

  14. Modeling developmental processes in psychology

    OpenAIRE

    Nurmi, Jari-Erik

    2013-01-01

    In the present article I suggest first that modeling in psychology can be described as an interactive process between a phenomenon under study (reality) and different levels of theoretical conceptualizations that vary in respect to how directly they can be related to empirical observations and at what level of generalization they operate. Then, I give three examples of my own work concerning building theories and testing models. Next, I discuss some caveats scientists face when building theor...

  15. Regulatory Models and the Environment: Practice, Pitfalls, and Prospects

    Energy Technology Data Exchange (ETDEWEB)

    Holmes, K. John; Graham, Judith A.; McKone, Thomas; Whipple, Chris

    2008-06-01

    Computational models support environmental regulatory activities by providing the regulator an ability to evaluate available knowledge, assess alternative regulations, and provide a framework to assess compliance. But all models face inherent uncertainties, because human and natural systems are always more complex and heterogeneous than can be captured in a model. Here we provide a summary discussion of the activities, findings, and recommendations of the National Research Council's Committee on Regulatory Environmental Models, a committee funded by the US Environmental Protection Agency to provide guidance on the use of computational models in the regulatory process. Modeling is a difficult enterprise even outside of the potentially adversarial regulatory environment. The demands grow when the regulatory requirements for accountability, transparency, public accessibility, and technical rigor are added to the challenges. Moreover, models cannot be validated (declared true) but instead should be evaluated with regard to their suitability as tools to address a specific question. The committee concluded that these characteristics make evaluation of a regulatory model more complex than simply comparing measurement data with model results. Evaluation also must balance the need for a model to be accurate with the need for a model to be reproducible, transparent, and useful for the regulatory decision at hand. Meeting these needs requires model evaluation to be applied over the"life cycle" of a regulatory model with an approach that includes different forms of peer review, uncertainty analysis, and extrapolation methods than for non-regulatory models.

  16. Mathematical modelling in economic processes.

    Directory of Open Access Journals (Sweden)

    L.V. Kravtsova

    2008-06-01

    Full Text Available The article considers a number of methods for the mathematical modelling of economic processes, and the possibilities of using Excel spreadsheets to obtain optimal solutions to problems or to calculate financial operations with the help of the built-in functions.

  17. A network-oriented business modeling environment

    Science.gov (United States)

    Bisconti, Cristian; Storelli, Davide; Totaro, Salvatore; Arigliano, Francesco; Savarino, Vincenzo; Vicari, Claudia

    The development of formal models related to the organizational aspects of an enterprise is fundamental when these aspects must be re-engineered and digitalized, especially when the enterprise is involved in the dynamics and value flows of a business network. Business modeling provides an opportunity to synthesize and make business processes, business rules and the structural aspects of an organization explicit, allowing business managers to control their complexity and guide an enterprise through effective decisional and strategic activities. This chapter discusses the main results of the TEKNE project in terms of software components that enable enterprises to configure, store, search and share models of any aspects of their business while leveraging standard and business-oriented technologies and languages to bridge the gap between the world of business people and IT experts and to foster effective business-to-business collaborations.

  18. A model for hypermedia learning environments based on electronic books

    Directory of Open Access Journals (Sweden)

    Ignacio Aedo

    1997-12-01

    Full Text Available Current hypermedia learning environments do not have a common development basis. Their designers have often used ad-hoc solutions to solve the learning problems they have encountered. However, hypermedia technology can take advantage of employing a theoretical scheme - a model - which takes into account various kinds of learning activities, and solves some of the problems associated with its use in the learning process. The model can provide designers with the tools for creating a hypermedia learning system, by allowing the elements and functions involved in the definition of a specific application to be formally represented.

  19. Welding process modelling and control

    Science.gov (United States)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for the use of data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not the intention for this system to be used for welding process control.

  20. Meteoroid Environment Modeling: the Meteoroid Engineering Model and Shower Forecasting

    Science.gov (United States)

    Moorhead, Althea V.

    2017-01-01

    The meteoroid environment is often divided conceptually into meteor showers plus a sporadic background component. The sporadic complex poses the bulk of the risk to spacecraft, but showers can produce significant short-term enhancements of the meteoroid flux. The Meteoroid Environment Office (MEO) has produced two environment models to handle these cases: the Meteoroid Engineering Model (MEM) and an annual meteor shower forecast. Both MEM and the forecast are used by multiple manned spaceflight projects in their meteoroid risk evaluation, and both tools are being revised to incorporate recent meteor velocity, density, and timing measurements. MEM describes the sporadic meteoroid complex and calculates the flux, speed, and directionality of the meteoroid environment relative to a user-supplied spacecraft trajectory, taking the spacecraft's motion into account. MEM is valid in the inner solar system and offers near-Earth and cis-lunar environments. While the current version of MEM offers a nominal meteoroid environment corresponding to a single meteoroid bulk density, the next version of MEMR3 will offer both flux uncertainties and a density distribution in addition to a revised near-Earth environment. We have updated the near-Earth meteor speed distribution and have made the first determination of uncertainty in this distribution. We have also derived a meteor density distribution from the work of Kikwaya et al. (2011). The annual meteor shower forecast takes the form of a report and data tables that can be used in conjunction with an existing MEM assessment. Fluxes are typically quoted to a constant limiting kinetic energy in order to comport with commonly used ballistic limit equations. For the 2017 annual forecast, the MEO substantially revised the list of showers and their characteristics using 14 years of meteor flux measurements from the Canadian Meteor Orbit Radar (CMOR). Defunct or insignificant showers were removed and the temporal profiles of many showers

  1. HYBRID MODELS FOR TRAJECTORY ERROR MODELLING IN URBAN ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    E. Angelatsa

    2016-06-01

    Full Text Available This paper tackles the first step of any strategy aiming to improve the trajectory of terrestrial mobile mapping systems in urban environments. We present an approach to model the error of terrestrial mobile mapping trajectories, combining deterministic and stochastic models. Due to the specifics of the urban environment, the deterministic component will be modelled with non-continuous functions composed of linear shifts, drifts or polynomial functions. In addition, we introduce a stochastic error component for modelling the residual noise of the trajectory error function. The first step of error modelling requires knowing the actual trajectory error values for several representative environments. In order to determine the trajectory errors as accurately as possible, (almost) error-less reference trajectories should be estimated using non-semantic features extracted from a sequence of images collected with the terrestrial mobile mapping system and from a full set of ground control points. Once the references are estimated, they will be used to determine the actual errors in the terrestrial mobile mapping trajectory. The rigorous analysis of these data sets will allow us to characterize the errors of a terrestrial mobile mapping system for a wide range of environments. This information will be of great use in future campaigns to improve the results of 3D point cloud generation. The proposed approach has been evaluated using real data. The data originate from a mobile mapping campaign over an urban and controlled area of Dortmund (Germany), with harmful GNSS conditions. The mobile mapping system, which includes two laser scanners and two cameras, was mounted on a van and was driven over a controlled area for around three hours. The results show the suitability of decomposing the trajectory error into non-continuous deterministic and stochastic components.
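
    A minimal sketch of the hybrid idea (a non-continuous deterministic part built from piecewise shifts and drifts, plus a stochastic residual) is given below; segment boundaries, drift rates and the noise level are invented for the example.

```python
# Illustrative hybrid trajectory-error model: piecewise (non-continuous) shifts
# and drifts per segment plus a Gaussian stochastic residual. All numbers are
# invented for illustration and are not the values estimated in the paper.
import numpy as np

def trajectory_error(t, segments, noise_std=0.02, seed=0):
    """segments: list of (t_start, shift, drift) defining piecewise error terms."""
    rng = np.random.default_rng(seed)
    err = np.zeros_like(t)
    for t_start, shift, drift in segments:          # later segments override earlier ones
        mask = t >= t_start
        err[mask] = shift + drift * (t[mask] - t_start)   # error restarts at each break
    return err + rng.normal(0.0, noise_std, size=t.shape)  # stochastic residual

t = np.linspace(0, 600, 1201)                        # ten minutes of trajectory time
segs = [(0, 0.05, 0.0002), (200, -0.10, 0.0010), (450, 0.02, -0.0005)]
e = trajectory_error(t, segs)
print(f"error range: {e.min():.3f} .. {e.max():.3f} m")
```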

  2. An equilibrium profile model for tidal environments

    Directory of Open Access Journals (Sweden)

    A. M. Bernabeu

    2002-12-01

    Full Text Available During a full tidal cycle, the beach profile is exposed to continuously changing hydrodynamical conditions. Consequently, the profile evolves constantly to adapt to these changes. The equilibrium condition on tidal beaches is defined in terms of the relative occurrence of swash, surf zone and shoaling processes. We have assumed that the tidal beach profile is in equilibrium when the net sediment transport along a tidal cycle is zero. In this model the contribution of swash is considered negligible. A simple and easy-to-apply equilibrium profile formulation is proposed. This model is based on the assumption that surf zone processes dominate the profile morphology wherever wave breaking occurs during the tidal cycle. The obtained equilibrium profile is valid from the high tide level to the breaker point at low tide level. The tidal influence on the profile morphology is the lengthening of the surf profile. The higher the tidal range, the longer the surf profile. The model was tested against field and laboratory data, showing reasonable predictions of measured beach profiles.
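
    As a very rough stand-in (not the paper's formulation), the sketch below uses a Dean-type shape h = A x^(2/3) for the surf profile and extends it from the high-tide level down to the breaker depth at low tide, so that a larger tidal range yields a longer surf profile; A and the breaker depth are arbitrary assumptions.

```python
# Stand-in illustration of the tidal lengthening of the surf profile using a
# Dean-type equilibrium shape h = A * x**(2/3); parameters are assumptions.
def surf_profile_length(A=0.1, h_breaker=2.0, tidal_range=0.0):
    """Cross-shore length of a surf profile spanning the vertical drop from the
    high-tide level down to the breaker depth at low tide."""
    total_drop = h_breaker + tidal_range          # metres of depth spanned
    return (total_drop / A) ** 1.5                # invert h = A * x**(2/3)

for tr in (0.0, 1.0, 2.0, 4.0):
    print(f"tidal range {tr:.1f} m -> surf profile length ~ {surf_profile_length(tidal_range=tr):.0f} m")
```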

  3. Modeling Low-temperature Geochemical Processes

    Science.gov (United States)

    Nordstrom, D. K.

    2003-12-01

    Geochemical modeling has become a popular and useful tool for a wide number of applications from research on the fundamental processes of water-rock interactions to regulatory requirements and decisions regarding permits for industrial and hazardous wastes. In low-temperature environments, generally thought of as those in the temperature range of 0-100 °C and close to atmospheric pressure (1 atm=1.01325 bar=101,325 Pa), complex hydrobiogeochemical reactions participate in an array of interconnected processes that affect us, and that, in turn, we affect. Understanding these complex processes often requires tools that are sufficiently sophisticated to portray multicomponent, multiphase chemical reactions yet transparent enough to reveal the main driving forces. Geochemical models are such tools. The major processes that they are required to model include mineral dissolution and precipitation; aqueous inorganic speciation and complexation; solute adsorption and desorption; ion exchange; oxidation-reduction, or redox, transformations; gas uptake or production; organic matter speciation and complexation; evaporation; dilution; water mixing; reaction during fluid flow; reaction involving biotic interactions; and photoreaction. These processes occur in rain, snow, fog, dry atmosphere, soils, bedrock weathering, streams, rivers, lakes, groundwaters, estuaries, brines, and diagenetic environments. Geochemical modeling attempts to understand the redistribution of elements and compounds, through anthropogenic and natural means, for a large range of scale from nanometer to global. "Aqueous geochemistry" and "environmental geochemistry" are often used interchangeably with "low-temperature geochemistry" to emphasize hydrologic or environmental objectives. Recognition of the strategy or philosophy behind the use of geochemical modeling is not often discussed or explicitly described. Plummer (1984, 1992) and Parkhurst and Plummer (1993) compare and contrast two approaches for

  4. Animal models and conserved processes

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-09-01

    Full Text Available Abstract Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is

  5. A production model and maintenance planning model for the process industry

    NARCIS (Netherlands)

    Ashayeri, J.; Teelen, A.; Selen, W.J.

    1995-01-01

    In this paper a model is developed to simultaneously plan preventive maintenance and production in a process industry environment, where maintenance planning is extremely important. The model schedules production jobs and preventive maintenance jobs, while minimizing costs associated with

  6. Steady-State Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    This chapter covers the basic principles of steady-state modelling and simulation using a number of case studies. Two principal approaches are illustrated that develop the unit operation models from first principles as well as through application of standard flowsheet simulators. The approaches illustrate the “equation oriented” approach as well as the “sequential modular” approach to solving complex flowsheets for steady-state applications. The applications include the Williams-Otto plant, the hydrodealkylation (HDA) of toluene, conversion of ethylene to ethanol and a bio-ethanol process.

  7. Switching Processes in Queueing Models

    CERN Document Server

    Anisimov, Vladimir V

    2008-01-01

    Switching processes, invented by the author in 1977, is the main tool used in the investigation of traffic problems from automotive to telecommunications. The title provides a new approach to low traffic problems based on the analysis of flows of rare events and queuing models. In the case of fast switching, averaging principle and diffusion approximation results are proved and applied to the investigation of transient phenomena for wide classes of overloading queuing networks.  The book is devoted to developing the asymptotic theory for the class of switching queuing models which covers  mode

  8. External Knowledge Sourcing and Innovation Processes in Modern Economic Environment

    Directory of Open Access Journals (Sweden)

    Roszkowska Dorota

    2017-06-01

    Full Text Available In an open and digital economy where ICTs, global networks and innovation systems play a key economic role, knowledge used by companies is increasingly gathered using different external sources. Rapidly changing technology enables companies to use new ways to innovate. New innovation processes permit companies to reduce risk and the costs of innovation. New paradigms, called open innovation and co-innovation, allow organizations to remain innovative in a rapidly changing environment. The objectives of this paper are: to provide a better understanding of open innovation and co-innovation paradigms and to suggest instruments for organizations to benefit from co-innovation ecosystem.

  9. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments.

    Science.gov (United States)

    Wagner, Martin; Stessl, Beatrix

    2014-01-01

    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health.

  10. INSTITUTIONAL ENVIRONMENT OF THE AGRICULTURAL MARKET FORMATION PROCESS

    Directory of Open Access Journals (Sweden)

    S. Revenko

    2013-11-01

    Full Text Available This article considers institutional aspects of the formation of an organized agricultural market. A theoretical basis for distinguishing between an institute and institutions is given. In order to identify the main institutes influencing the "organization" phenomenon, the author analyses the Ukrainian institutional environment, which is still under construction, and considers the main processes running during the formation of the organized market, as well as theoretical approaches to its institutional make-up. To structure the most common approaches and the theoretical knowledge of this problem, several schemes are proposed, along with the author's points of view on many questions of the organized market formation process. The author analyses the effectiveness of the institutes and of governmental regulation of the agricultural market, and proposes a strategically new approach to agricultural market formation policy from the governmental point of view. The essence of the socioeconomic formation of the agricultural market is considered, the main factors of agricultural market formation are outlined, and a systematic approach to considering its structural parts is proposed. The ineffectiveness of agricultural market relations without a regulation process is proved, and the most unfavorable factors in agricultural market formation are determined.

  11. Long-Lived Plasma Process, Created by Impulse Discharge in Micro-Disperse Droplet Environment

    Directory of Open Access Journals (Sweden)

    Serge Olszewski

    2013-01-01

    Full Text Available The destruction of organic compounds (phenol and cation-active surfactants) in water solutions subjected to plasma treatment has been investigated in different dynamic plasma-liquid systems (PLS) with discharges in droplet micro-disperse environments (DMDE). A long-lived plasma process with distinct spectral properties has been observed for the pulsed discharge in DMDE. An approximate computer model is proposed to describe this effect. According to the introduced model, this long-lived process is an aggregation of correlated discharges between charged droplets.

  12. Building Information Modelling for Smart Built Environments

    Directory of Open Access Journals (Sweden)

    Jianchao Zhang

    2015-01-01

    Full Text Available Building information modelling (BIM) provides architectural 3D visualization and a standardized way to share and exchange building information. Recently, there has been an increasing interest in using BIM, not only for design and construction, but also for the post-construction management of the built facility. With the emergence of smart built environment (SBE) technology, which embeds most spaces with smart objects to enhance the building's efficiency, security and comfort of its occupants, there is a need to understand and address the challenges BIM faces in the design, construction and management of future smart buildings. In this paper, we investigate how BIM can contribute to the development of SBE. Since BIM is designed to host information of the building throughout its life cycle, our investigation has covered phases from architecture design to facility management. Firstly, we extend BIM for the design phase to provide material/device profiling and the information exchange interface for various smart objects. Next, we propose a three-layer verification framework to assist BIM users in identifying possible defects in their SBE design. For the post-construction phase, we have designed a facility management tool to provide advanced energy management of smart grid-connected SBEs, where smart objects, as well as distributed energy resources (DERs), are deployed.

  13. Mesoscopic Modeling of Reactive Transport Processes

    Science.gov (United States)

    Kang, Q.; Chen, L.; Deng, H.

    2012-12-01

    Reactive transport processes involving precipitation and/or dissolution are pervasive in geochemical, biological and engineered systems. Typical examples include self-assembled patterns such as Liesegang rings or bands, cones of stalactites in limestones caves, biofilm growth in aqueous environment, formation of mineral deposits in boilers and heat exchangers, uptake of toxic metal ions from polluted water by calcium carbonate, and mineral trapping of CO2. Compared to experimental studies, a numerical approach enables a systematic study of the reaction kinetics, mass transport, and mechanisms of nucleation and crystal growth, and hence provides a detailed description of reactive transport processes. In this study, we enhance a previously developed lattice Boltzmann pore-scale model by taking into account the nucleation process, and develop a mesoscopic approach to simulate reactive transport processes involving precipitation and/or dissolution of solid phases. The model is then used to simulate the formation of Liesegang precipitation patterns and investigate the effects of gel on the morphology of the precipitates. It is shown that this model can capture the porous structures of the precipitates and can account for the effects of the gel concentration and material. A wide range of precipitation patterns is predicted under different gel concentrations, including regular bands, treelike patterns, and for the first time with numerical models, transition patterns from regular bands to treelike patterns. The model is also applied to study the effect of secondary precipitate on the dissolution of primary mineral. Several types of dissolution and precipitation processes are identified based on the morphology and structures of the precipitates and on the extent to which the precipitates affect the dissolution of the primary mineral. Finally the model is applied to study the formation of pseudomorph. It is demonstrated for the first time by numerical simulation that a
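
    A simplified finite-difference stand-in (not the lattice Boltzmann scheme used in the study) can illustrate the basic supersaturation-precipitation mechanism behind Liesegang-type patterns; all parameters below are illustrative, and realistic banding additionally requires nucleation and growth kinetics as modelled in the study.

```python
# Simplified 1-D reaction-diffusion stand-in: two solutes diffuse, and where
# their concentration product exceeds a solubility threshold, precipitate forms.
# Parameters are illustrative only; this is not the lattice Boltzmann model.
import numpy as np

def react_diffuse(n=200, steps=20000, D=0.1, Ksp=0.002, dt=0.1):
    a = np.zeros(n)
    a[0] = 1.0                         # solute A fed from the left boundary
    b = np.full(n, 0.2)                # solute B initially uniform (in the "gel")
    solid = np.zeros(n)                # accumulated precipitate
    for _ in range(steps):
        for c in (a, b):               # explicit diffusion step for each solute
            lap = np.roll(c, 1) + np.roll(c, -1) - 2 * c
            lap[0] = lap[-1] = 0.0
            c += D * dt * lap
        a[0] = 1.0                     # keep the boundary reservoir constant
        supersaturated = a * b > Ksp   # sites where precipitation occurs
        amount = np.where(supersaturated, 0.5 * np.minimum(a, b), 0.0)
        a -= amount
        b -= amount
        solid += amount
    return solid

s = react_diffuse()
print("number of precipitate cells:", int((s > 1e-3).sum()))
```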

  14. Instability in the Legal Environment and the Decision-Making Process

    Directory of Open Access Journals (Sweden)

    Mostafa Jafari

    2017-03-01

    Full Text Available The aim of this study was to determine the impact of the dimensions of change in the legal environment on the quality of each stage of the decision-making process of senior managers of public institutions. The population of interest included all general managers, directors, administrative, financial and support assistants, financial controllers and managers, and other executive directors and deputies of Zanjan province, who were studied by the census method. The data collection tool was a researcher-made questionnaire whose reliability and validity were confirmed (Cronbach's alpha coefficient: 0.87). Data analysis was performed using descriptive and inferential statistics techniques (chi-square test and Friedman test) with SPSS software. The results show that the dimensions of change in legal environment factors affect the first and third stages of the managers' decision-making process (the stages of data collection and of decision-making and its implementation), but have no effect on the data and information analysis stage.

  15. The Engagement Model of Person-Environment Interaction

    Science.gov (United States)

    Neufeld, Jason E.; Rasmussen, Heather N.; Lopez, Shane J.; Ryder, Jamie A.; Magyar-Moe, Jeana L.; Ford, Alicia Ito; Edwards, Lisa M.; Bouwkamp, Jennifer C.

    2006-01-01

    This article focuses on growth-promoting aspects in the environment, and the authors propose a strength-based, dynamic model of person-environment interaction. The authors begin by briefly discussing the typical recognition of contextual variables in models that rely on the concept of person-environment fit. This is followed by a review of recent…

  16. Prediction of Ready Queue Processing Time in Multiprocessor Environment Using Lottery Scheduling (ULS

    Directory of Open Access Journals (Sweden)

    Amita CHOUDHARY

    2011-01-01

    Full Text Available In a multi-user environment, the CPU has to manage many requests generated over the same period of time. The waiting queue of processes creates a scheduling problem for processors. Designers and hardware architects have suggested multiprocessor systems to cope with the queue length. Lottery scheduling is one such method, in which processes in the waiting queue are selected by chance. This opens a way to use probability models to obtain estimates of system parameters. This paper is an application in which the processing time of jobs in the ready queue is predicted using a sampling method in a k-processor environment (k>1). The random selection of one process by each of the k processors, by the without-replacement method, forms a sample data set that helps in predicting the possible ready queue processing time. Some theorems are established and proved to obtain the desired results in terms of confidence intervals.
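
    A hedged sketch of the sampling idea is given below: each of k processors draws one job from the ready queue by lottery without replacement, and the sample mean of the drawn burst times is expanded to an estimate of the total ready-queue processing time; ticket weights and burst times are invented for the example.

```python
# Lottery draw of k jobs without replacement, followed by a simple expansion
# estimate of total ready-queue processing time. Tickets and burst times are
# invented; the paper's estimators and confidence intervals are not reproduced.
import random

random.seed(7)

def lottery_draw(jobs, k, seed=0):
    """jobs: dict name -> (tickets, burst_time); draw k winners without replacement."""
    rng = random.Random(seed)
    pool = dict(jobs)
    winners = []
    for _ in range(min(k, len(pool))):
        names = list(pool)
        weights = [pool[n][0] for n in names]
        pick = rng.choices(names, weights=weights, k=1)[0]
        winners.append((pick, pool.pop(pick)[1]))
    return winners

jobs = {f"P{i}": (random.randint(1, 10), random.randint(2, 20)) for i in range(30)}
sample = lottery_draw(jobs, k=4, seed=42)
sample_mean = sum(burst for _, burst in sample) / len(sample)
estimate = sample_mean * len(jobs)              # simple expansion estimator
actual = sum(burst for _, burst in jobs.values())
print(f"sampled {len(sample)} jobs, estimated total {estimate:.0f}, actual total {actual}")
```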

  17. Unimodal models to relate species to environment

    NARCIS (Netherlands)

    Braak, ter C.J.F.

    1987-01-01

    To assess the impact of environmental change on biological communities knowledge about species-environment relationships is indispensable. Ecologists attempt to uncover the relationships between species and environment from data obtained from field surveys. In the survey, species are scored on their

  18. Enhanced living environments from models to technologies

    CERN Document Server

    Dobre, Ciprian; Ganchev, Ivan; Garcia, Nuno; Goleva, Rossitza Ivanova

    2017-01-01

    Enhanced living environments employ information and communications technologies to support true ambient assisted living for people with disabilities. This book provides an overview of today's architectures, techniques, protocols, components, and cloud-based solutions related to ambient assisted living and enhanced living environments.

  19. Models of memory: information processing.

    Science.gov (United States)

    Eysenck, M W

    1988-01-01

    A complete understanding of human memory will necessarily involve consideration of the active processes involved at the time of learning and of the organization and nature of representation of information in long-term memory. In addition to process and structure, it is important for theory to indicate the ways in which stimulus-driven and conceptually driven processes interact with each other in the learning situation. Not surprisingly, no existent theory provides a detailed specification of all of these factors. However, there are a number of more specific theories which are successful in illuminating some of the component structures and processes. The working memory model proposed by Baddeley and Hitch (1974) and modified subsequently has shown how the earlier theoretical construct of the short-term store should be replaced with the notion of working memory. In essence, working memory is a system which is used both to process information and to permit the transient storage of information. It comprises a number of conceptually distinct, but functionally interdependent components. So far as long-term memory is concerned, there is evidence of a number of different kinds of representation. Of particular importance is the distinction between declarative knowledge and procedural knowledge, a distinction which has received support from the study of amnesic patients. Kosslyn has argued for a distinction between literal representation and propositional representation, whereas Tulving has distinguished between episodic and semantic memories. While Tulving's distinction is perhaps the best known, there is increasing evidence that episodic and semantic memory differ primarily in content rather than in process, and so the distinction may be of less theoretical value than was originally believed.(ABSTRACT TRUNCATED AT 250 WORDS)

  20. Exploring Undergraduate Students' Mental Models of the Environment: Are They Related to Environmental Affect and Behavior?

    Science.gov (United States)

    Liu, Shu-Chiu; Lin, Huann-shyang

    2015-01-01

    A draw-and-explain task and questionnaire were used to explore Taiwanese undergraduate students' mental models of the environment and whether and how they relate to their environmental affect and behavioral commitment. We found that students generally held incomplete mental models of the environment, focusing on objects rather than on processes or…

  1. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes with a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axons, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and the biological question that requires the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary differential equations and partial differential equations; background in biology is not required. Students will gain knowledge on how to program with MATLAB without previous programming experience and how to use codes in order to test biological hypotheses.
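    Several of the listed topics are systems of ordinary differential equations. Purely as a hedged illustration of the kind of model the book covers (the book itself works in MATLAB; this is not code from the text, and the parameter values are illustrative), here is a minimal SIR "spread of disease" model solved with SciPy.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Classic SIR epidemic model; beta (transmission) and gamma (recovery) are illustrative.
def sir(t, y, beta, gamma):
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

sol = solve_ivp(sir, t_span=(0, 160), y0=[0.99, 0.01, 0.0],
                args=(0.3, 0.1), dense_output=True)
print(sol.sol(np.linspace(0, 160, 5))[1])  # infected fraction at a few time points
```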

  2. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occur within the porous adsorbent. ... The theoretical understanding of chromatographic behavior can augment available experimental data and aid in the design of specific experiments to develop a more complete understanding of the behavior of a unit operation...

  3. Software Engineering with Process Algebra: Modelling Client / Server Architectures

    OpenAIRE

    Diertens, B.

    2009-01-01

    In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. We also described this software development process more formally by presenting the tools we use in this process in a CASE setting, leading to the PSF-ToolBus software engineering environment. In this article we summarize that work and describe a similar software development process ...

  4. Modeling the cometary environment using a fluid approach

    Science.gov (United States)

    Shou, Yinsi

    Comets are believed to have preserved the building material of the early solar system and to hold clues to the origin of life on Earth. Abundant remote observations of comets by telescopes and the in-situ measurements by a handful of space missions reveal that the cometary environments are complicated by various physical and chemical processes among the neutral gases and dust grains released from comets, cometary ions, and the solar wind in the interplanetary space. Therefore, physics-based numerical models are in demand to interpret the observational data and to deepen our understanding of the cometary environment. In this thesis, three models using a fluid approach, which include important physical and chemical processes underlying the cometary environment, have been developed to study the plasma, neutral gas, and the dust grains, respectively. Although models based on the fluid approach have limitations in capturing all of the correct physics for certain applications, especially for very low gas density environment, they are computationally much more efficient than alternatives. In the simulations of comet 67P/Churyumov-Gerasimenko at various heliocentric distances with a wide range of production rates, our multi-fluid cometary neutral gas model and multi-fluid cometary dust model have achieved comparable results to the Direct Simulation Monte Carlo (DSMC) model, which is based on a kinetic approach that is valid in all collisional regimes. Therefore, our model is a powerful alternative to the particle-based model, especially for some computationally intensive simulations. Capable of accounting for the varying heating efficiency under various physical conditions in a self-consistent way, the multi-fluid cometary neutral gas model is a good tool to study the dynamics of the cometary coma with different production rates and heliocentric distances. The modeled H2O expansion speeds reproduce the general trend and the speed's nonlinear dependencies of production rate

  5. Modeling pellet impact drilling process

    Science.gov (United States)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling of the pellet impact drilling process, which creates the scientific and methodological basis for engineering design of drilling operations under different geo-technical conditions.

  6. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Full Text Available Abstract Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to

  7. Workflows for microarray data processing in the Kepler environment

    Science.gov (United States)

    2012-01-01

    Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or

  8. A Process for Technology Prioritization in a Competitive Environment

    Science.gov (United States)

    Stephens, Karen; Herman, Melody; Griffin, Brand

    2006-01-01

    This slide presentation reviews NASA's process for prioritizing technology requirements in a competitive environment. The In-Space Propulsion Technology (ISPT) project is used to exemplify the process. The ISPT project focuses on the mid-level Technology Readiness Levels (TRLs) for development. These are TRLs 4 through 6 (i.e., Technology Development and Technology Demonstration). The objective of the planning activity is to identify the current most likely date each technology is needed and create ISPT technology development schedules based on these dates. There is a minimum of 4 years between flight and pacing mission. The ISPT Project needed to identify the "pacing mission" for each technology in order to provide funding for each area. Graphic representations show the development of the process. A matrix shows which missions are currently receiving pull from both the Solar System Exploration and the Sun-Solar System Connection Roadmaps. The timeframes of the pacing missions' technologies are shown for various types of propulsion. A pacing mission in the near future serves to increase the priority for funding. Adaptations were made when budget reductions precluded the total implementation of the plan.

  9. Construction material processed using lunar simulant in various environments

    Science.gov (United States)

    Chase, Stan; Ocallaghan-Hay, Bridget; Housman, Ralph; Kindig, Michael; King, John; Montegrande, Kevin; Norris, Raymond; Vanscotter, Ryan; Willenborg, Jonathan; Staubs, Harry

    1995-01-01

    The manufacture of construction materials from locally available resources in space is an important first step in the establishment of lunar and planetary bases. The objective of the CoMPULSIVE (Construction Material Processed Using Lunar Simulant In Various Environments) experiment is to develop a procedure to produce construction materials by sintering or melting Johnson Space Center Simulant 1 (JSC-1) lunar soil simulant in both earth-based (1-g) and microgravity (approximately 0-g) environments. The resultant materials will be tested to determine their physical and mechanical properties. The physical characteristics include: crystalline, thermal, and electrical properties. The mechanical properties include: compressive, tensile, and flexural strengths. The simulant, placed in a sealed graphite crucible, will be heated using a high temperature furnace. The crucible will then be cooled by radiative and forced convective means. The core furnace element consists of space qualified quartz-halogen incandescent lamps with focusing mirrors. Sample temperatures of up to 2200 C are attainable using this heating method.

  10. Statistical Process Control in a Modern Production Environment

    DEFF Research Database (Denmark)

    Windfeldt, Gitte Bjørg

    Paper 1 is aimed at practitioners to help them test the assumption that the observations in a sample are independent and identically distributed, an assumption that is essential when using classical Shewhart charts. The test can easily be performed in the control chart setup using the samples ... gathered here and standard statistical software. In Paper 2 a new method for process monitoring is introduced. The method uses a statistical model of the quality characteristic and a sliding window of observations to estimate the probability that the next item will not respect the specifications. ... If the estimated probability exceeds a pre-determined threshold, the process will be stopped. The method is flexible, allowing a complexity in modeling that remains invisible to the end user. Furthermore, the method allows diagnostic plots to be built from the parameter estimates that can provide valuable insight...
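    The abstract gives only an outline of Paper 2's monitoring scheme. The sketch below is a hedged illustration of the general idea, assuming a normal model of the quality characteristic and illustrative specification limits and threshold; it is not the thesis' actual model.

```python
import numpy as np
from scipy import stats

def monitor(observations, window=25, lsl=-1.0, usl=1.0, threshold=0.01):
    """Sliding-window check: estimate P(next item outside [lsl, usl]) from a
    normal model fitted to the last `window` observations; flag when it
    exceeds the threshold. Illustrative only."""
    x = np.asarray(observations, dtype=float)
    for i in range(window, len(x)):
        mu = x[i - window:i].mean()
        sigma = x[i - window:i].std(ddof=1)
        p_out = stats.norm.cdf(lsl, mu, sigma) + stats.norm.sf(usl, mu, sigma)
        if p_out > threshold:
            return i, p_out  # stop the process at observation i
    return None

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 0.2, 60), rng.normal(0.6, 0.3, 40)])
print(monitor(data))
```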

  11. Modeling Based Decision Support Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Phoenix Integration's vision is the creation of an intuitive human-in-the-loop engineering environment called Decision Navigator that leverages recent advances in...

  12. Wigner Ville Distribution in Signal Processing, using Scilab Environment

    Directory of Open Access Journals (Sweden)

    Petru Chioncel

    2011-01-01

    Full Text Available The Wigner-Ville distribution offers a visual display of quantitative information about the way a signal's energy is distributed in both time and frequency. In this way, the distribution embodies the fundamental concepts of Fourier and time-domain analysis. The energy of the signal is distributed so that specific frequencies are localized in time by the group delay time, and at specific instants in time the frequency is given by the instantaneous frequency. The net positive volume of the Wigner distribution is numerically equal to the signal's total energy. The paper shows the application of the Wigner-Ville distribution in the field of signal processing, using the Scilab environment.
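    The paper works in Scilab; purely as a hedged illustration of the distribution itself (a generic textbook formulation, not the authors' code), the following Python sketch computes a discrete pseudo Wigner-Ville distribution of a short analytic signal.

```python
import numpy as np

def wigner_ville(x):
    """Discrete pseudo Wigner-Ville distribution of a 1-D (analytic) signal.
    Row k of the result corresponds to normalized frequency k / (2N)."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        taumax = min(n, N - 1 - n)
        tau = np.arange(-taumax, taumax + 1)
        r = np.zeros(N, dtype=complex)
        r[tau % N] = x[n + tau] * np.conj(x[n - tau])  # instantaneous autocorrelation
        W[:, n] = np.fft.fft(r).real                   # FFT over the lag variable
    return W

# Example: linear chirp; the energy concentrates along a rising frequency line
t = np.arange(128)
chirp = np.exp(1j * 2 * np.pi * (0.05 + 0.001 * t) * t)
print(wigner_ville(chirp).shape)
```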

  13. A Content Specification for Business Process Models

    Directory of Open Access Journals (Sweden)

    Akhilesh Bajaj

    1996-11-01

    Full Text Available Business process modeling is an essential prerequisite to business process reengineering (BPR) and workflow management (WFM). Process models have been traditionally used to model software processes, and many business process models are adaptations of these process models. Using these process models to represent business processes results in two problems. First, since these process models usually represent different perspectives of systems (or in this case, businesses), the user needs to integrate multiple existing models to completely represent the business processes. This reduces the ease of use, and leads to a lower acceptance by users. Second, business processes contain concepts not found in software processes (e.g., physical objects, roles, etc.). Traditional process models cannot represent these new concepts, and hence traditional process models model business processes inadequately. These two problems can be easily solved if a comprehensive business process model exists that models all perspectives of a business process and that allows representation of these new concepts. As a first step towards this goal, we propose a content specification that would need to be satisfied by such a business process model. The primary contribution of this work is a comprehensive content specification for a business process model that will solve the two problems listed above. This content specification also serves as a framework to analyze process models in detail, and to compare them based on their content (i.e., what concepts they model) and the degree to which they model each aspect of a business process (i.e., how much of a business process they model).

  14. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as

  15. Collapse models and perceptual processes

    Science.gov (United States)

    Carlo Ghirardi, Gian; Romano, Raffaele

    2014-04-01

    Theories including a collapse mechanism were presented several years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merits derive from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems and for the reduction associated with measurement processes and the classical behavior of macroscopic objects. Since such theories qualify themselves not as new interpretations but as modifications of the standard theory, they can be, in principle, tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, it seems that recent technological developments allow, at least, precise limits to be put on the parameters characterizing the modifications of the evolution equation. Here we will simply mention some of the recent investigations in this direction, while we will mainly concentrate our attention on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures by means of standard macroscopic devices will be discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. We make plausible, by discussing in detail a toy model, that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  16. Thoughts About Created Environment: A Neuman Systems Model Concept.

    Science.gov (United States)

    Verberk, Frans; Fawcett, Jacqueline

    2017-04-01

    This essay is about the Neuman systems model concept of the created environment. The essay, based on work by Frans Verberk, a Neuman systems model scholar from the Netherlands, extends understanding of the created environment by explaining how this distinctive perspective of environment represents an elaboration of the physiological, psychological, sociocultural, developmental, and spiritual variables, which are other central concepts of the Neuman Systems Model.

  17. Influence of global climatic processes on environment The Arctic seas

    Science.gov (United States)

    Kholmyansky, Mikhael; Anokhin, Vladimir; Kartashov, Alexandr

    2016-04-01

    One of the most pressing problems of the present is the change in the Arctic environment under the influence of global climatic processes. As a result of work carried out in different areas of the Russian Arctic, the authors have obtained materials characterising the intensity of these processes. Complex investigations have been carried out in the water areas and coastal zones of the White, Barents, Kara and East Siberian Seas, and on lake water areas of the subarctic region, from 1972 to the present. The investigations include hydrophysical and cryological observations, direct temperature measurements, analysis of drilling data, electrometric determination of the parameters of the frozen zone, lithodynamic and geochemical determinations, geophysical logging of boreholes, and the study of glaciers based on visual observations and the analysis of photographs. The data obtained allow changes in the temperature of the water layer, the deposits and the atmosphere to be estimated for the last 25 years; on average, these amount to 0.38°C for sea water, 0.23°C for unconsolidated deposits and 0.72°C for the atmosphere. Under the influence of temperature changes in the hydrosphere and lithosphere of the shelf, the cryolithic zone changes its characteristics; an increase in the depth of the roof of the cryolithic zone can be noted over most of the studied water area. The current rapid rise in the temperature of the ice-rich rocks composing the coast has led to avalanche-like thermo-denudation and to the delivery to the sea of an amount of material three times the 1978 level. The rise in temperature also involves an appreciable shift of the boundaries of the Arctic glacial covers. Our monitoring measurements show an increase in the oxygen content in the benthic area, which is connected with a reduction in the overall salinity of the waters due to fresh water arriving from melting ice. This, in turn, leads to a change in the biogenic part of the ecosystem. The executed

  18. GREENSCOPE: A Method for Modeling Chemical Process ...

    Science.gov (United States)

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Efficiency, and Energy, can evaluate processes with over a hundred different indicators. These indicators provide a means for realizing the principles of green chemistry and green engineering in the context of sustainability. Development of the methodology has centered around three focal points. One is a taxonomy of impacts that describe the indicators and provide absolute scales for their evaluation. The setting of best and worst limits for the indicators allows the user to know the status of the process under study in relation to understood values. Thus, existing or imagined processes can be evaluated according to their relative indicator scores, and process modifications can strive towards realizable targets. A second area of focus is in advancing definitions of data needs for the many indicators of the taxonomy. Each of the indicators has specific data that is necessary for their calculation. Values needed and data sources have been identified. These needs can be mapped according to the information source (e.g., input stream, output stream, external data, etc.) for each of the bases. The user can visualize data-indicator relationships on the way to choosing selected ones for evalua

  19. Mars Environment and Magnetic Orbiter model payload

    DEFF Research Database (Denmark)

    Langlais, B.; Leblanc, F.; Fouchet, T.

    2009-01-01

    Mars Environment and Magnetic Orbiter was proposed as an answer to the Cosmic Vision Call of Opportunity as a M-class mission. The MEMO mission is designed to study the strong interconnections between the planetary interior, atmosphere and solar conditions essential to understand planetary...

  20. A Collaborative Model for Ubiquitous Learning Environments

    Science.gov (United States)

    Barbosa, Jorge; Barbosa, Debora; Rabello, Solon

    2016-01-01

    Use of mobile devices and widespread adoption of wireless networks have enabled the emergence of Ubiquitous Computing. Application of this technology to improving education strategies gave rise to Ubiquitous e-Learning, also known as Ubiquitous Learning. There are several approaches to organizing ubiquitous learning environments, but most of them…

  1. The Triconnected Abstraction of Process Models

    Science.gov (United States)

    Polyvyanyy, Artem; Smirnov, Sergey; Weske, Mathias

    Companies use business process models to represent their working procedures in order to deploy services to markets, to analyze them, and to improve upon them. Competitive markets necessitate complex procedures, which lead to large process specifications with sophisticated structures. Real world process models can often incorporate hundreds of modeling constructs. While a large degree of detail complicates the comprehension of the processes, it is essential to many analysis tasks. This paper presents a technique to abstract, i.e., to simplify process models. Given a detailed model, we introduce abstraction rules which generalize process fragments in order to bring the model to a higher abstraction level. The approach is suited for the abstraction of large process specifications in order to aid model comprehension as well as decomposing problems of process model analysis. The work is based on process structure trees that have recently been introduced to the field of business process management.

  2. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering ... activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration ... with a major international engineering company...

  3. Mining Process Model Variants: Challenges, Techniques, Examples

    NARCIS (Netherlands)

    Li, C.

    2010-01-01

    During the last years a new generation of process-aware information systems has emerged, which enables process model configurations at buildtime as well as process instance changes during runtime. Respective model adaptations result in large collections of process model variants that are derived

  4. Communicating model insights using interactive learning environments

    NARCIS (Netherlands)

    Slinger, J.H.; Yucel, G.; Pruyt, E.

    2009-01-01

    Much attention is focused on the rational and advisory style of developing and applying System Dynamics models. Even group model building focuses primarily on the formulation and understanding of the model by the group members themselves. There is a dearth of attention for communication of the

  5. Modeling and Formal Verification of Smart Environments

    OpenAIRE

    Corno, Fulvio; Muhammad Sanaullah

    2014-01-01

    Smart Environments (SmE) are a growing combination of various computing frameworks (ubiquitous, pervasive etc), devices, control algorithms and a complex web of interactions. It is at the core of user facilitation in a number of industrial, domestic and public areas. Based on their application areas, SmE may be critical in terms of correctness, reliability, safety, security etc. To achieve error-free and requirement-compliant implementation, these systems are designed resorting to various mod...

  6. Software process assessment using multiple process assessment models

    OpenAIRE

    Peldžius, Stasys

    2014-01-01

    Many software companies face such problems as projects being behind schedule, exceeding the budget, customer dissatisfaction with product quality. Most of the problems arise due to immature software process of the company. The most popular process assessment models worldwide are ISO/IEC 15504 and CMMI. Companies seeking wider official recognition choose between these two models. Companies face the problem that different customers require process assessment according to different models....

  7. Shuttle measured contaminant environment and modeling for payloads. Preliminary assessment of the space telescope environment in the shuttle bay

    Science.gov (United States)

    Scialdone, J. J.

    1983-01-01

    A baseline gaseous and particulate environment of the Shuttle bay was developed based on the various measurements which were made during the first four flights of the Shuttle. The environment is described by the time dependent pressure, density, scattered molecular fluxes, and the column densities, including the transient effects of water dumps, engine firings and opening and closing of the bay doors. The particulate conditions in the ambient and on surfaces were predicted as a function of the mission time based on the available data. This basic Shuttle environment, when combined with the outgassing and the particulate contributions of the payloads, can provide a description of the environment of a payload in the Shuttle bay. As an example of this application, the environment of the Space Telescope in the bay, which may be representative of the environment of several payloads, was derived. Among the many findings obtained in the process of modeling the environment, one is that the payloads' environment in the bay is not substantially different or more objectionable than the self-generated environment of a large payload or spacecraft. It is, however, more severe during ground facilities operations, during the first 15 to 20 hours of the flight, during and for a short period after water is dumped overboard, and while the reaction control engines are being fired.

  8. Information-educational environment with adaptive control of learning process

    Science.gov (United States)

    Modjaev, A. D.; Leonova, N. M.

    2017-01-01

    In recent years, a new scientific branch connected with activities in social sphere management has been developing intensively; it is called "Social Cybernetics". Within this branch, the theory and methods of management of the social sphere are formed, and considerable attention is paid to management directly in real time. However, the solution of such management tasks is largely constrained by the absence, or insufficiently deep study, of the relevant sections of the theory and methods of management. The article discusses the use of cybernetic principles in solving control problems in social systems. Applied to educational activities, a model of composite interrelated objects representing the behaviour of students at various stages of the educational process is introduced. Statistical processing of experimental data obtained during the actual learning process is carried out. When the number of features used is increased, additionally taking into account the degree and nature of variability of the levels of current progress of students during various types of studies, new properties of students' grouping are discovered. L-clusters were identified, reflecting the behaviour of learners with similar characteristics during lectures. It was established that the characteristics of the clusters contain information about the dynamics of learners' behaviour, allowing them to be used in additional lessons. Ways of solving the problem of adaptive control based on the identified dynamic characteristics of the learners are outlined.

  9. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  10. Measures of quality of process models created in BPMN

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-12-01

    Full Text Available Description, documentation, evaluation and redesign of key processes during their execution should be an essential part of the strategic management of any organization. All organizations live in a dynamically changing environment; therefore they must adapt their internal processes to market changes. These processes must be described, and a suitable way of describing them is the BPMN notation. Once processes are described in BPMN, they should be checked to ensure their expected quality. A system (which could be automated), based on mathematical expressions of the qualitative characteristics of process models (i.e., measures of the quality of process models), can support such process checks. The research team is trying to design such a tool and bring it into practical use. The aim of this publication is to describe the mentioned system, based on measures of the quality of process models, and to answer the associated scientific questions.
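    The abstract does not specify which measures the team uses. Purely as a hedged illustration of what a "mathematical expression of qualitative characteristics of process models" can look like, the sketch below computes two widely cited process-model metrics, size and Cardoso's control-flow complexity (CFC), from a toy model description; all names and values are hypothetical and not taken from the publication.

```python
from typing import Dict, List

def size(nodes: List[str]) -> int:
    """Size metric: number of nodes in the process model."""
    return len(nodes)

def control_flow_complexity(gateways: Dict[str, Dict]) -> int:
    """Cardoso's CFC: an XOR split adds one per outgoing flow,
    an OR split adds 2**n - 1, an AND split adds 1."""
    cfc = 0
    for gw in gateways.values():
        n = gw["outgoing"]
        if gw["type"] == "XOR":
            cfc += n
        elif gw["type"] == "OR":
            cfc += 2 ** n - 1
        elif gw["type"] == "AND":
            cfc += 1
    return cfc

# Toy BPMN-like model: one XOR split with two outgoing flows
model_nodes = ["start", "check order", "xor1", "ship", "cancel", "end"]
model_gateways = {"xor1": {"type": "XOR", "outgoing": 2}}
print(size(model_nodes), control_flow_complexity(model_gateways))
```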

  11. Space in multi-agent systems modelling spatial processes

    Directory of Open Access Journals (Sweden)

    Petr Rapant

    2007-06-01

    Full Text Available The need for modelling of spatial processes has recently arisen in the sphere of geoinformation systems. Some processes (especially natural ones) can be modelled using external tools, e.g. for modelling contaminant transport in the environment. But in the case of socio-economic processes, suitable tools interconnected with GIS are still the subject of research and development. One of the candidate technologies is so-called multi-agent systems. Their theory is quite well developed, but they lack suitable means for dealing with space. This article deals with this problem and proposes a solution for the field of road transport modelling.

  12. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  13. Interoperation Modeling for Intelligent Domotic Environments

    Science.gov (United States)

    Bonino, Dario; Corno, Fulvio

    This paper introduces an ontology-based model for domotic device inter-operation. Starting from a previously published ontology (DogOnt), a refactoring and extension is described that allows device capabilities, states and commands to be represented explicitly and supports abstract modeling of device inter-operation.

  14. Understanding Fundamental Material Degradation Processes in High Temperature Aggressive Chemomechanical Environments

    Energy Technology Data Exchange (ETDEWEB)

    Stubbins, James; Gewirth, Andrew; Sehitoglu, Huseyin; Sofronis, Petros; Robertson, Ian

    2014-01-16

    The objective of this project is to develop a fundamental understanding of the mechanisms that limit materials durability for very high-temperature applications. Current design limitations are based on material strength and corrosion resistance. This project will characterize the interactions of high-temperature creep, fatigue, and environmental attack in structural metallic alloys of interest for the very high-temperature gas-cooled reactor (VHTR) or Next–Generation Nuclear Plant (NGNP) and for the associated thermo-chemical processing systems for hydrogen generation. Each of these degradation processes presents a major materials design challenge on its own, but in combination, they can act synergistically to rapidly degrade materials and limit component lives. This research and development effort will provide experimental results to characterize creep-fatigue-environment interactions and develop predictive models to define operation limits for high-temperature structural material applications. Researchers will study individually and in combination creep-fatigue-environmental attack processes in Alloys 617, 230, and 800H, as well as in an advanced Ni-Cr oxide dispersion strengthened steel (ODS) system. For comparison, the study will also examine basic degradation processes in nichrome (Ni-20Cr), which is a basis for most high-temperature structural materials, as well as many of the superalloys. These materials are selected to represent primary candidate alloys, one advanced developmental alloy that may have superior high-temperature durability, and one model system on which basic performance and modeling efforts can be based. The research program is presented in four parts, which all complement each other. The first three are primarily experimental in nature, and the last will tie the work together in a coordinated modeling effort. The sections are (1) dynamic creep-fatigue-environment process, (2) subcritical crack processes, (3) dynamic corrosion – crack

  15. Analog modelling of obduction processes

    Science.gov (United States)

    Agard, P.; Zuo, X.; Funiciello, F.; Bellahsen, N.; Faccenna, C.; Savva, D.

    2012-04-01

    Obduction corresponds to one of plate tectonics oddities, whereby dense, oceanic rocks (ophiolites) are presumably 'thrust' on top of light, continental ones, as for the short-lived, almost synchronous Peri-Arabic obduction (which took place along thousands of km from Turkey to Oman in c. 5-10 Ma). Analog modelling experiments were performed to study the mechanisms of obduction initiation and test various triggering hypotheses (i.e., plate acceleration, slab hitting the 660 km discontinuity, ridge subduction; Agard et al., 2007). The experimental setup comprises (1) an upper mantle, modelled as a low-viscosity transparent Newtonian glucose syrup filling a rigid Plexiglas tank and (2) high-viscosity silicone plates (Rhodrosil Gomme with PDMS iron fillers to reproduce densities of continental or oceanic plates), located at the centre of the tank above the syrup to simulate the subducting and the overriding plates - and avoid friction on the sides of the tank. Convergence is simulated by pushing on a piston at one end of the model with velocities comparable to those of plate tectonics (i.e., in the range 1-10 cm/yr). The reference set-up includes, from one end to the other (~60 cm): (i) the piston, (ii) a continental margin containing a transition zone to the adjacent oceanic plate, (iii) a weakness zone with variable resistance and dip (W), (iv) an oceanic plate - with or without a spreading ridge, (v) a subduction zone (S) dipping away from the piston and (vi) an upper, active continental margin, below which the oceanic plate is being subducted at the start of the experiment (as is known to have been the case in Oman). Several configurations were tested and over thirty different parametric tests were performed. Special emphasis was placed on comparing different types of weakness zone (W) and the extent of mechanical coupling across them, particularly when plates were accelerated. Displacements, together with along-strike and across-strike internal deformation in all

  16. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    Full Text Available The article outlines the application of the processing approach to the functional description of the designed IT system supporting the operations of the secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented manually (“as is”) and the target processes (“to be”) that use RFID technology for the purpose of their automation. Additionally, examples of applying the method of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency are presented. An extension of the process analysis method is the possibility of applying a process warehouse and process mining methods.

  17. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  18. Mathematical modeling in economics, ecology and the environment

    CERN Document Server

    Hritonenko, Natali

    2013-01-01

    Updated to textbook form by popular demand, this second edition discusses diverse mathematical models used in economics, ecology, and the environmental sciences with emphasis on control and optimization. It is intended for graduate and upper-undergraduate course use; however, applied mathematicians, industry practitioners, and a vast number of interdisciplinary academics will find the presentation highly useful. Core topics of this text are: economic growth and technological development; population dynamics and human impact on the environment; resource extraction and scarcity; air and water contamination; rational management of the economy and environment; and climate change and global dynamics. The step-by-step approach taken is problem-based and easy to follow. The authors aptly demonstrate that the same models may be used to describe different economic and environmental processes and that similar invest...

  19. Modeling of Blending Processes | Graichen | Zede Journal

    African Journals Online (AJOL)

    Blending with respect to the essential properties of raw materials has recently grown in importance in the chemical process industries. That is why the modeling of such processes is an urgent need. It can be shown - on the basis of the general model for mechanical macroprocesses - that the modeling of blending processes is ...

  20. DEVELOPMENT OF A HETEROGENIC DISTRIBUTED ENVIRONMENT FOR SPATIAL DATA PROCESSING USING CLOUD TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    A. S. Garov

    2016-06-01

    Full Text Available We are developing a unified distributed communication environment for processing of spatial data which integrates web, desktop and mobile platforms and combines the volunteer computing model with public cloud possibilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to required data volume and computing power, while keeping infrastructure costs at a minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. Using an innovative software environment, the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D-visualization and will be tested based on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize the research and representation of results on a new technology level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatial distributed information for which we suggest implementation of a user interface as an advanced front-end, e.g., for a virtual globe system.

  1. Development of a Heterogenic Distributed Environment for Spatial Data Processing Using Cloud Technologies

    Science.gov (United States)

    Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.

    2016-06-01

    We are developing a unified distributed communication environment for processing of spatial data which integrates web, desktop and mobile platforms and combines the volunteer computing model with public cloud possibilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to required data volume and computing power, while keeping infrastructure costs at a minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. Using an innovative software environment, the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D-visualization and will be tested based on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize the research and representation of results on a new technology level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatial distributed information for which we suggest implementation of a user interface as an advanced front-end, e.g., for a virtual globe system.

  2. Integrated approaches to the application of advanced modeling technology in process development and optimization

    Energy Technology Data Exchange (ETDEWEB)

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E. [Massachusetts Institute of Technology, Cambridge, MA (United States)] [and others

    1995-12-31

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  3. A new security model for collaborative environments

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lorch, Markus [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Thompson, Mary [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Perry, Marcia [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2003-06-06

    Prevalent authentication and authorization models for distributed systems provide for the protection of computer systems and resources from unauthorized use. The rules and policies that drive the access decisions in such systems are typically configured up front and require trust establishment before the systems can be used. This approach does not work well for computer software that moderates human-to-human interaction. This work proposes a new model for trust establishment and management in computer systems supporting collaborative work. The model supports the dynamic addition of new users to a collaboration with very little initial trust placed in their identity and supports the incremental building of trust relationships through endorsements from established collaborators. It also recognizes the strength of a user's authentication when making trust decisions. By mimicking the way humans build trust naturally, the model can support a wide variety of usage scenarios. Its particular strength lies in the support for ad-hoc and dynamic collaborations and the ubiquitous access to a Computer Supported Collaboration Workspace (CSCW) system from locations with varying levels of trust and security.

  4. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Heng [Pacific Northwest National Laboratory, Richland Washington USA; Ye, Ming [Department of Scientific Computing, Florida State University, Tallahassee Florida USA; Walker, Anthony P. [Environmental Sciences Division and Climate Change Science Institute, Oak Ridge National Laboratory, Oak Ridge Tennessee USA; Chen, Xingyuan [Pacific Northwest National Laboratory, Richland Washington USA

    2017-04-01

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
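    The index itself is not reproduced in the abstract. The following is a minimal Monte-Carlo sketch, under the assumption of a toy groundwater response and hypothetical recharge and geology models, of how a variance-based process sensitivity index that pools model choice and parameters for one process could be estimated; it illustrates the spirit of the approach, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical recharge models (precipitation -> recharge), equal prior weight.
recharge_models = [lambda p, a: a * p, lambda p, a: a * p ** 1.1]

def simulate(model_choice, a, p, k):
    """Toy groundwater response: recharge divided by hydraulic conductivity."""
    return recharge_models[model_choice](p, a) / k

def process_sensitivity_recharge(n_outer=2000, n_inner=200):
    """First-order index of the recharge process (model choice + parameters)."""
    cond_means = np.empty(n_outer)
    all_y = []
    for i in range(n_outer):
        # Fix the recharge process: pick a model and its random parameters ...
        choice = rng.integers(0, 2)
        a, p = rng.normal(0.7, 0.05), rng.normal(100.0, 10.0)
        # ... then average the output over the geology process (conductivity).
        k = rng.lognormal(0.0, 0.3, n_inner)
        y = simulate(choice, a, p, k)
        cond_means[i] = y.mean()
        all_y.append(y)
    total_var = np.var(np.concatenate(all_y))
    return np.var(cond_means) / total_var

print(process_sensitivity_recharge())
```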

  5. Modelling light and photosynthesis in the marine environment

    Directory of Open Access Journals (Sweden)

    Bogdan Woźniak

    2003-06-01

    Full Text Available The overriding and far-reaching aim of our work has been to achieve a good understanding of the processes of light interaction with phytoplankton in the sea and to develop an innovative physical model of photosynthesis in the marine environment, suitable for the remote sensing of marine primary production. Unlike previous models, the present one takes greater account of the complexity of the physiological processes in phytoplankton. We have focused in particular on photophysiological processes, which are governed directly or indirectly by light energy, or in which light, besides the nutrient content and the temperature of seawater, is one of the principal limiting factors. To achieve this aim we have carried out comprehensive statistical analyses of the natural variability of the main photophysiological properties of phytoplankton and their links with the principal abiotic factors in the sea. These analyses have made use of extensive empirical data gathered in a wide diversity of seas and oceans by Polish and Russian teams as well as by joint Polish-Russian expeditions. Data sets available on the Internet have also been applied. As a result, a set of more or less complex, semi-empirical models of light-stimulated processes occurring in marine phytoplankton cells has been developed. The trophic type of sea, photo-acclimation and the production of photoprotecting carotenoids, chromatic acclimation and the production of various forms of chlorophyll-antennas and photosynthetic carotenoids, cell adaptation by the package effect, light absorption, photosynthesis, photoinhibition, the fluorescence effect, and the activation of PS2 centres are all considered in the models. These take into account not only the influence of light, but also, indirectly, that of the vertical mixing of water; in the case of photosynthesis, the quantum yield has also been formulated as being dependent on the nutrient concentrations and the temperature of seawater

  6. Silicon-Carbide Power MOSFET Performance in High Efficiency Boost Power Processing Unit for Extreme Environments

    Science.gov (United States)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Del Castillo, Linda Y.; Fitzpatrick, Fred; Chen, Yuan

    2016-01-01

    Silicon-Carbide device technology has generated much interest in recent years. With superior thermal performance, power ratings and potential switching frequencies over its Silicon counterpart, Silicon-Carbide offers a greater possibility for high-power switching applications in extreme environments. In particular, the maturing process technology of Silicon-Carbide Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFETs) has produced a plethora of commercially available power-dense, low on-state resistance devices capable of switching at high frequencies. A novel hard-switched power processing unit (PPU) is implemented utilizing Silicon-Carbide power devices. Accelerated life data is captured and assessed in conjunction with a damage accumulation model of gate oxide and drain-source junction lifetime to evaluate potential system performance in high-temperature environments.

  7. Network Modeling and Simulation Environment (NEMSE)

    Science.gov (United States)

    2012-07-01

    transmission (frame rate and resolution) and encoding (compression) characteristics of a video stream to adapt to changing bandwidth limitations. 3.3 ... transitions the NEMSE Demos to 6.2 research. The NEMSE Demos were System-in-the-Loop (STIL) using OPNET Modeler, COPE, FPGA Physical Layer Emulator ... sensor payloads were flown by SUSEX: Gimbaled IR Video – LWIR, MWIR, & SWIR; WAMI – 1-5 frames/sec, large footprint; Gimbaled EO Video – 26x zoom, AR

  8. The Quality of Home Environment in Brazil: An Ecological Model

    Science.gov (United States)

    de Oliveira, Ebenezer A.; Barros, Fernando C.; Anselmi, Luciana D. da Silva; Piccinini, Cesar A.

    2006-01-01

    Based on Bronfenbrenner's (1999) ecological perspective, a longitudinal, prospective model of individual differences in the quality of home environment (Home Observation for Measurement of the Environment--HOME) was tested in a sample of 179 Brazilian children and their families. Perinatal measures of family socioeconomic status (SES) and child…

  9. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    Science.gov (United States)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
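
    The transformation rules of the proposed MDA method are not reproduced in this record. As a simplified illustration of the underlying idea (projecting a global collaborative process onto the interface process of a single enterprise), the following Python sketch filters the interactions a given role participates in; the data structures and role names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    sender: str    # role sending a business document
    receiver: str  # role receiving it
    message: str

def interface_process(collaborative_process, role):
    """Project the global collaborative process onto one enterprise's interface
    process: keep only the interactions the given role participates in, labelled
    as send or receive from that role's point of view."""
    view = []
    for step in collaborative_process:
        if step.sender == role:
            view.append(("send", step.message, step.receiver))
        elif step.receiver == role:
            view.append(("receive", step.message, step.sender))
    return view

# Hypothetical collaborative process between a buyer and a supplier
process = [
    Interaction("Buyer", "Supplier", "PurchaseOrder"),
    Interaction("Supplier", "Buyer", "OrderConfirmation"),
    Interaction("Supplier", "Buyer", "Invoice"),
]
print(interface_process(process, "Supplier"))
```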

  10. Processing spoken lectures in resource-scarce environments

    CSIR Research Space (South Africa)

    Van Heerden, CJ

    2011-11-01

    Full Text Available an acoustic model from another language, in this case American English. The authors show that while target-language acoustic models are preferable, similar performance can be achieved by repeatedly bootstrapping with the American English model, segmenting...

  11. Modelling of Batch Process Operations

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli; Cameron, Ian; Gani, Rafiqul

    2011-01-01

    Here a batch cooling crystalliser is modelled and simulated as is a batch distillation system. In the batch crystalliser four operational modes of the crystalliser are considered, namely: initial cooling, nucleation, crystal growth and product removal. A model generation procedure is shown that s...

  12. Mathematical Modeling: A Structured Process

    Science.gov (United States)

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  13. Quantitative Modeling of Human-Environment Interactions in Preindustrial Time

    Science.gov (United States)

    Sommer, Philipp S.; Kaplan, Jed O.

    2017-04-01

    Quantifying human-environment interactions and anthropogenic influences on the environment prior to the Industrial Revolution is essential for understanding the current state of the earth system. This is particularly true for the terrestrial biosphere, but marine ecosystems and even climate were likely modified by human activities centuries to millennia ago. Direct observations are, however, very sparse in space and time, especially as one considers prehistory. Numerical models are therefore essential to produce a continuous picture of human-environment interactions in the past. Agent-based approaches, while widely applied to quantifying human influence on the environment in localized studies, are unsuitable for global spatial domains and Holocene timescales because of computational demands and large parameter uncertainty. Here we outline a new paradigm for the quantitative modeling of human-environment interactions in preindustrial time that is adapted to the global Holocene. Rather than attempting to simulate agency directly, the model is informed by a suite of characteristics describing those things about society that cannot be predicted on the basis of environment, e.g., diet, presence of agriculture, or range of animals exploited. These categorical data are combined with the properties of the physical environment in a coupled human-environment model. The model is, at its core, a dynamic global vegetation model with a module for simulating crop growth that is adapted for preindustrial agriculture. This allows us to simulate yield and calories for feeding both humans and their domesticated animals. We couple this basic caloric availability with a simple demographic model to calculate potential population, and, constrained by labor requirements and land limitations, we create scenarios of land use and land cover on a moderate-resolution grid. We further implement a feedback loop where anthropogenic activities lead to changes in the properties of the physical

  14. Modeling grinding processes as micro processes

    African Journals Online (AJOL)

    eobe

    composites. Modeling of dynamic micro-milling cutting forces was presented [14] with the effect of plowing, elastic recovery, run-out and dynamics on micro-milling forces examined. The continuous demand for hard and tough materials that can withstand varying stress conditions to ensure prolonged service life of ...

  15. Agent–host–environment model of blunt abdominal trauma in ...

    African Journals Online (AJOL)

    % of mortality is related to trauma. Abdominal injuries account for approximately 10% of trauma deaths in childhood. Child injury has great effects on communities and countries. The agent–host– environment model has been used to describe ...

  16. Exascale Co-design for Modeling Materials in Extreme Environments

    Energy Technology Data Exchange (ETDEWEB)

    Germann, Timothy C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-07-08

    Computational materials science has provided great insight into the response of materials under extreme conditions that are difficult to probe experimentally. For example, shock-induced plasticity and phase transformation processes in single-crystal and nanocrystalline metals have been widely studied via large-scale molecular dynamics simulations, and many of these predictions are beginning to be tested at advanced 4th generation light sources such as the Advanced Photon Source (APS) and Linac Coherent Light Source (LCLS). I will describe our simulation predictions and their recent verification at LCLS, outstanding challenges in modeling the response of materials to extreme mechanical and radiation environments, and our efforts to tackle these as part of the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx). ExMatEx has initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. We anticipate that we will be able to exploit hierarchical, heterogeneous architectures to achieve more realistic large-scale simulations with adaptive physics refinement, and are using tractable application scale-bridging proxy application testbeds to assess new approaches and requirements. Such current scale-bridging strategies accumulate (or recompute) a distributed response database from fine-scale calculations, in a top-down rather than bottom-up multiscale approach.

  17. Monitoring of multi-patterning processes in production environment

    Science.gov (United States)

    Han, Sangjun; Lee, Honggoo; Woo, Jaesun; Kim, Seungyoung; Kim, Wan-Soo; Buhl, Stefan; Habets, Boris; Kim, Seop

    2017-03-01

    Multi-patterning processes have become common in the leading-edge semiconductor industry. These processes require good patterning uniformity over the wafer, even though several different process steps contribute to the variation. The initial lithography steps can be nearly perfect, yet CD variation introduced by a trim process can propagate through the spacer deposition and lead to non-uniformity of the final CD. Monitoring and controlling the individual CD parameters is not sufficient to ensure a stable process. We define a set of new KPIs that take all contributions into account and use macro measurement data. We show that reliable monitoring is achieved to meet the process specifications.

  18. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubinina

    2015-06-01

    Full Text Available The article examines the essence of process-oriented enterprise management. The content and types of information technology are analyzed, given the complexity and differentiation of existing methods as well as the specific language and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and modern traditional modeling techniques that have found practical application in visualizing retailers' activity are studied. The theoretical analysis of the modeling methods showed that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes owing to its integrated systemological capabilities. A visualized simulation model of the retailers' "sales" business process "as is" was designed using a combination of UFO elements, with the aim of further practical formalization and optimization of the given business process.

  19. Steering of Educational Processes in a Digital Medium Environment

    DEFF Research Database (Denmark)

    Tække, Jesper; Paulsen, Michael

    2015-01-01

    This paper is about challenges to steering and leadership of educational interaction in classrooms provided by the new medium environment that comes with digital media. In the new medium environment, the old way of steering what is going on in the classroom appears not to work since...... it was developed in the image of the industrial society and based on a closed classroom. Now with the digital media and wireless networks the classroom is opened and the old way of organizing teaching has become inadequate: The students are disturbed by the new media, instead of learning through them. Inspired...... by systems theory we outline a more adequate way of teaching in the new medium environment – a teaching that can manage the new situation and use the new possibilities provided by the digital media. The argumentation builds on empirical findings from the action research project Socio Media Education (SME...

  20. Model development for wireless propagation in forested environments

    OpenAIRE

    Zegarra, Jesus

    2015-01-01

    Approved for public release; distribution is unlimited Wireless propagation modeling is a necessary task in the design of countless applications. Wireless signals attenuate at different rates according to the propagation environment. Given that vegetation is an unavoidable feature for most outdoor wireless channels, propagation models in forested environments are in high demand. The characterization of radio waves propagating through foliage is particularly complex due to the random charac...

  1. Business Process Management Integration with Application Development Environment

    OpenAIRE

    Bizjak, Matic

    2017-01-01

    Bachelor’s thesis describes business processes, business process management, business process management systems and ways to integrate them into existing applications. Resource-oriented architecture is presented and used to develop the solution. The main purpose of this work is to design and develop a RESTful web service which exposes and adds new functionalities to an application programming interface that is used to integrate a business process management system with a software development framework...

  2. Focal and Ambient Processing of Built Environments: Intellectual and Atmospheric Experiences of Architecture.

    Science.gov (United States)

    Rooney, Kevin K; Condia, Robert J; Loschky, Lester C

    2017-01-01

    Neuroscience has well established that human vision divides into the central and peripheral fields of view. Central vision extends from the point of gaze (where we are looking) out to about 5° of visual angle (the width of one's fist at arm's length), while peripheral vision is the vast remainder of the visual field. These visual fields project to the parvo and magno ganglion cells, which process distinctly different types of information from the world around us and project that information to the ventral and dorsal visual streams, respectively. Building on the dorsal/ventral stream dichotomy, we can further distinguish between focal processing of central vision, and ambient processing of peripheral vision. Thus, our visual processing of and attention to objects and scenes depends on how and where these stimuli fall on the retina. The built environment is no exception to these dependencies, specifically in terms of how focal object perception and ambient spatial perception create different types of experiences we have with built environments. We argue that these foundational mechanisms of the eye and the visual stream are limiting parameters of architectural experience. We hypothesize that people experience architecture in two basic ways based on these visual limitations; by intellectually assessing architecture consciously through focal object processing and assessing architecture in terms of atmosphere through pre-conscious ambient spatial processing. Furthermore, these separate ways of processing architectural stimuli operate in parallel throughout the visual perceptual system. Thus, a more comprehensive understanding of architecture must take into account that built environments are stimuli that are treated differently by focal and ambient vision, which enable intellectual analysis of architectural experience versus the experience of architectural atmosphere, respectively. We offer this theoretical model to help advance a more precise understanding of the

  3. Comparing Designers’ Problem-Solving Behavior in a Parametric Design Environment and a Geometric Modeling Environment

    Directory of Open Access Journals (Sweden)

    Ning Gu

    2013-09-01

    Full Text Available This paper presents the results of a protocol study which compares designers’ behavior in a parametric design environment (PDE) and a geometric modeling environment (GME). An experiment was conducted in which seven designers were required to complete two architectural conceptual design tasks of similar complexity, respectively in a PDE and a GME. Protocol analysis is employed to compare the cognitive behavior of designers in these two environments. By analyzing the designers’ actions, including shifting between “problem” and “solution” spaces, it was possible to compare their cognitive activities in PDEs and GMEs. Results of this research suggest that designers put similar effort into the design problem space and the solution space in PDE and GME and that interaction between these two spaces also appears similar in the two design environments. However, different Problem-Solution index values and discontinuity ratios are found across design stages of the two design environments.

  4. ENHANCEMENT EDUCATIONAL PROCESS IN THE CONDITIONS OF INFORMATIVE-COMMUNICATIVE PEDAGOGICAL ENVIRONMENT.

    Directory of Open Access Journals (Sweden)

    L. Petukhova

    2010-06-01

    Full Text Available The article considers the possibilities and conditions for implementing an informative-communicative pedagogical environment at high school; the ways of expanding the educational process in the context of such an environment are also described.

  5. Steering of Educational Processes in a Digital Medium Environment

    DEFF Research Database (Denmark)

    Tække, Jesper; Paulsen, Michael Eric

    2016-01-01

    are dealing with in this article is how one should respond educationally to the new media situation. Or more precisely: What should or could Bildung (edification) be within the current environment of new media. It draws on Luhmann (2006), Biesta (2006), Klafki (2014) and Kant (1784), describing what Bildung...

  6. Multi-dimensional Point Process Models in R

    Directory of Open Access Journals (Sweden)

    Roger Peng

    2003-09-01

    Full Text Available A software package for fitting and assessing multidimensional point process models using the R statistical computing environment is described. Methods of residual analysis based on random thinning are discussed and implemented. Features of the software are demonstrated using data on wildfire occurrences in Los Angeles County, California and earthquake occurrences in Northern California.
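
    The R package itself is not included in this record. To illustrate the random-thinning idea behind the residual analysis it implements, the following Python sketch simulates a one-dimensional inhomogeneous Poisson process and thins it against an assumed fitted intensity; all intensities and rates are hypothetical, and the sketch is not a substitute for the package's methods.

```python
import numpy as np

rng = np.random.default_rng(0)

def thin_points(points, intensity, lam_min):
    """Random thinning for residual analysis: keep each observed point with
    probability lam_min / intensity(point). If the fitted intensity is correct,
    the retained points approximate a homogeneous Poisson process of rate lam_min."""
    keep = rng.random(len(points)) < lam_min / intensity(points)
    return points[keep]

# Hypothetical 1-D example on [0, 10]: simulate points from an assumed intensity,
# then thin them against that same intensity (residuals should look homogeneous).
intensity = lambda t: 2.0 + 1.5 * np.sin(t)
lam_max, lam_min = 3.5, 0.5
n = rng.poisson(lam_max * 10)
candidates = rng.uniform(0.0, 10.0, n)
points = candidates[rng.random(n) < intensity(candidates) / lam_max]
residuals = thin_points(points, intensity, lam_min)
print(len(points), "observed points,", len(residuals), "thinned residual points")
```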

  7. Styles in business process modeling: an exploration and a model

    NARCIS (Netherlands)

    Pinggera, Jakob; Soffer, Pnina; Fahland, Dirk; Weidlich, Matthias; Zugal, Stefan; Weber, Barbara; Reijers, Hajo A.; Mendling, Jan

    2015-01-01

    Business process models are an important means to design, analyze, implement, and control business processes. As with every type of conceptual model, a business process model has to meet certain syntactic, semantic, and pragmatic quality requirements to be of value. For many years, such quality

  8. Plasma Processing of Model Residential Solid Waste

    Science.gov (United States)

    Messerle, V. E.; Mossé, A. L.; Nikonchuk, A. N.; Ustimenko, A. B.; Baimuldin, R. V.

    2017-09-01

    The authors have tested the technology of processing of model residential solid waste. They have developed and created a pilot plasma unit based on a plasma chamber incinerator. The waste processing technology has been tested and prepared for commercialization.

  9. Branching process models of cancer

    CERN Document Server

    Durrett, Richard

    2015-01-01

    This volume develops results on continuous time branching processes and applies them to study rate of tumor growth, extending classic work on the Luria-Delbruck distribution. As a consequence, the authors calculate the probability that mutations that confer resistance to treatment are present at detection and quantify the extent of tumor heterogeneity. As applications, the authors evaluate ovarian cancer screening strategies and give rigorous proofs for results of Heano and Michor concerning tumor metastasis. These notes should be accessible to students who are familiar with Poisson processes and continuous time. Richard Durrett is mathematics professor at Duke University, USA. He is the author of 8 books, over 200 journal articles, and has supervised more than 40 Ph.D. students. Most of his current research concerns the applications of probability to biology: ecology, genetics, and most recently cancer.

  10. Mechanistic Fermentation Models for Process Design, Monitoring, and Control

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads Orla

    2017-01-01

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted...... for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool...... for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding....

  11. Spectroscopic analyses of chemical adaptation processes within microalgal biomass in response to changing environments

    Energy Technology Data Exchange (ETDEWEB)

    Vogt, Frank, E-mail: fvogt@utk.edu; White, Lauren

    2015-03-31

    Highlights: • Microalgae transform large quantities of inorganics into biomass. • Microalgae interact with their growing environment and adapt their chemical composition. • Sequestration capabilities are dependent on cells’ chemical environments. • We develop a chemometric hard-modeling to describe these chemical adaptation dynamics. • This methodology will enable studies of microalgal compound sequestration. - Abstract: Via photosynthesis, marine phytoplankton transforms large quantities of inorganic compounds into biomass. This has considerable environmental impacts as microalgae contribute for instance to counter-balancing anthropogenic releases of the greenhouse gas CO2. On the other hand, high concentrations of nitrogen compounds in an ecosystem can lead to harmful algae blooms. In previous investigations it was found that the chemical composition of microalgal biomass is strongly dependent on the nutrient availability. Therefore, it is expected that algae’s sequestration capabilities and productivity are also determined by the cells’ chemical environments. For investigating this hypothesis, novel analytical methodologies are required which are capable of monitoring live cells exposed to chemically shifting environments followed by chemometric modeling of their chemical adaptation dynamics. FTIR-ATR experiments have been developed for acquiring spectroscopic time series of live Dunaliella parva cultures adapting to different nutrient situations. Comparing experimental data from acclimated cultures to those exposed to a chemically shifted nutrient situation reveals insights in which analyte groups participate in modifications of microalgal biomass and on what time scales. For a chemometric description of these processes, a data model has been deduced which explains the chemical adaptation dynamics explicitly rather than empirically. First results show that this approach is feasible and derives information about the chemical biomass

  12. Systematic approach for the identification of process reference models

    CSIR Research Space (South Africa)

    Van Der Merwe, A

    2009-02-01

    Full Text Available Process models are used in different application domains to capture knowledge on the process flow. Process reference models (PRM) are used to capture reusable process models, which should simplify the identification process of process models...

  13. A process model of affect misattribution.

    Science.gov (United States)

    Payne, B Keith; Hall, Deborah L; Cameron, C Daryl; Bishara, Anthony J

    2010-10-01

    People often misattribute the causes of their thoughts and feelings. The authors propose a multinomial process model of affect misattributions, which separates three component processes. The first is an affective response to the true cause of affect. The second is an affective response to the apparent cause. The third process is when the apparent source is confused for the real source. The model is validated using the affect misattribution procedure (AMP), which uses misattributions as a means to implicitly measure attitudes. The model illuminates not only the AMP but also other phenomena in which researchers wish to model the processes underlying misattributions using subjective judgments.

  14. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...... of premature rupture of a stellite weld on a P91 valve used in a power plant. For all four examples, the focus is put on modelling results rather than describing the models in detail. Proper comparison with experimental work is given in all examples for model validation as well as relevant references...

  15. Modeling and simulation of membrane process

    Science.gov (United States)

    Staszak, Maciej

    2017-06-01

    The article presents the different approaches to polymer membrane mathematical modeling. Traditional models based on experimental physicochemical correlations and balance models are presented in the first part. Quantum and molecular mechanics models are presented next, as they are becoming more popular for polymer membranes in fuel cells. The first part concludes with neural network models, which have found use for different types of processes in polymer membranes. The second part is devoted to models of fluid dynamics. The computational fluid dynamics technique can be divided into the solving of Navier-Stokes equations and into lattice Boltzmann models. Both approaches are presented with a focus on membrane processes.

  16. Towards Model Checking Stochastic Process Algebra

    NARCIS (Netherlands)

    Hermanns, H.; Grieskamp, W.; Santen, T.; Katoen, Joost P.; Stoddart, B.; Meyer-Kayser, J.; Siegle, M.

    2000-01-01

    Stochastic process algebras have been proven useful because they allow behaviour-oriented performance and reliability modelling. As opposed to traditional performance modelling techniques, the behaviour-oriented style supports composition and abstraction in a natural way. However, analysis of

  17. Inquiry, play, and problem solving in a process learning environment

    Science.gov (United States)

    Thwaits, Anne Y.

    What is the nature of art/science collaborations in museums? How do art objects and activities contribute to the successes of science centers? Based on the premise that art exhibitions and art-based activities engage museum visitors in different ways than do strictly factual, information-based displays, I address these questions in a case study that examines the roles of visual art and artists in the Exploratorium, a museum that has influenced exhibit design and professional practice in many of the hands-on science centers in the United States and around the world. The marriage of art and science in education is not a new idea---Leonardo da Vinci and other early polymaths surely understood how their various endeavors informed one another, and some 20th century educators understood the value of the arts and creativity in the learning and practice of other disciplines. When, in 2010, the National Science Teachers Association added an A to the federal government's ubiquitous STEM initiative and turned it into STEAM, art educators nationwide took notice. With a heightened interest in the integration of and collaboration between disciplines comes an increased need for models of best practice for educators and educational institutions. With the intention to understand the nature of such collaborations and the potential they hold, I undertook this study. I made three site visits to the Exploratorium, where I took photos, recorded notes in a journal, interacted with exhibits, and observed museum visitors. I collected other data by examining the institution's website, press releases, annual reports, and fact sheets; and by reading popular and scholarly articles written by museum staff members and by independent journalists. I quickly realized that the Exploratorium was not created in the way that most museums are, and the history of its founding and the ideals of its founder illuminate what was then and continues now to be different about this museum from most others in the

  18. Modeling pellet impact drilling process

    OpenAIRE

    Kovalev, Artem Vladimirovich; Ryabchikov, Sergey Yakovlevich; Isaev, Evgeniy Dmitrievich; Ulyanova, Oksana Sergeevna

    2016-01-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling t...

  19. Process competencies in a problem and project based learning environment

    OpenAIRE

    Du, Xiangyun; Kolmos, Anette

    2006-01-01

    Future engineers are not only required to master technological competencies concerning solving problems and producing and innovating technology; they are also expected to have capabilities of cooperation, communication, and project management in diverse social contexts, which are referred to as process competencies. Consequently, engineering education is facing challenges regarding how to equip students with scientific-technological competencies as well as process competencies. Problem based...

  20. Environment Perception Process in Maritime Command and Control

    NARCIS (Netherlands)

    Paradis, S.; Treurniet, W.; Roy, J.

    1999-01-01

    Various operational trends in naval warfare, such as technological advances in threat technology and an ongoing shift to littoral warfare, put the shipboard decision making process under pressure. Data must be processed under time-critical conditions and, as a consequence, the risk of saturation in

  1. Artificial neural networks modeling gene-environment interaction

    Directory of Open Access Journals (Sweden)

    Günther Frauke

    2012-05-01

    Full Text Available Abstract Background Gene-environment interactions play an important role in the etiological pathway of complex diseases. An appropriate statistical method for handling a wide variety of complex situations involving interactions between variables is still lacking, especially when continuous variables are involved. The aim of this paper is to explore the ability of neural networks to model different structures of gene-environment interactions. A simulation study is set up to compare neural networks with standard logistic regression models. Eight different structures of gene-environment interactions are investigated. These structures are characterized by penetrance functions that are based on sigmoid functions or on combinations of linear and non-linear effects of a continuous environmental factor and a genetic factor with main effect or with a masking effect only. Results In our simulation study, neural networks are more successful in modeling gene-environment interactions than logistic regression models. This outperformance is especially pronounced when modeling sigmoid penetrance functions, when distinguishing between linear and nonlinear components, and when modeling masking effects of the genetic factor. Conclusion Our study shows that neural networks are a promising approach for analyzing gene-environment interactions. In particular, if no prior knowledge of the correct nature of the relationship between co-variables and response variable is present, neural networks provide a valuable alternative to regression methods that are limited to the analysis of linearly separable data.
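
    The simulation design of the study is only summarized above. A minimal sketch of the comparison it describes, assuming scikit-learn is available and using a hypothetical penetrance function with a non-linear gene-environment interaction, might look like this (it is not the authors' simulation setup):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n = 2000
gene = rng.integers(0, 2, n)      # binary genetic factor
env = rng.normal(0.0, 1.0, n)     # continuous environmental factor
# Hypothetical penetrance with a non-linear gene-environment interaction
logit = -1.0 + 2.0 * gene * np.sin(env)
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([gene, env])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

logreg = LogisticRegression().fit(X_tr, y_tr)
mlp = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

print("logistic regression accuracy:", logreg.score(X_te, y_te))
print("neural network accuracy:     ", mlp.score(X_te, y_te))
```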

  2. Health care management modelling: a process perspective.

    Science.gov (United States)

    Vissers, J M

    1998-10-01

    Modelling-based health care management ought to become just as popular as evidence based medicine. Making managerial decisions based on evidence by modelling efforts is certainly a step forward. Examples can be given of many successful applications in different areas of decision making: disease process modelling, screening and prevention policy development, resource allocation, waiting lists and waiting times, patient scheduling. Also examples can be given which would have benefited by prior modelling, for example adverse effects of health care system reform decisions. This contribution aims at giving an overview of health care management modelling areas, and observations from a European perspective on developing successful health care management models. The overview is created by presenting different reference frameworks for mapping health care management modelling applications. We report a development from an almost arbitrary list of applications used for bibliographic purposes (scheduling, simulation, queueing, etc.) towards frameworks that focus on the process of delivery. The advantage of mapping modelling applications in this way is that we are able to position contributions within a reference framework with a focus on processes, with the patient process at the top. The acceptance of process-orientation as a basis for modelling has consequences for the way models are developed. Close cooperation between modeller and manager and a profound insight into the dynamics of the modelling area concerned are important requirements for developing successful models. This is illustrated for waiting lists as an area of modelling.

  3. Unified regression model of binding equilibria in crowded environments

    Science.gov (United States)

    Lee, Byoungkoo; LeDuc, Philip R.; Schwartz, Russell

    2011-01-01

    Molecular crowding is a critical feature distinguishing intracellular environments from idealized solution-based environments and is essential to understanding numerous biochemical reactions, from protein folding to signal transduction. Many biochemical reactions are dramatically altered by crowding, yet it is extremely difficult to predict how crowding will quantitatively affect any particular reaction systems. We previously developed a novel stochastic off-lattice model to efficiently simulate binding reactions across wide parameter ranges in various crowded conditions. We now show that a polynomial regression model can incorporate several interrelated parameters influencing chemistry under crowded conditions. The unified model of binding equilibria accurately reproduces the results of particle simulations over a broad range of variation of six physical parameters that collectively yield a complicated, non-linear crowding effect. The work represents an important step toward the long-term goal of computationally tractable predictive models of reaction chemistry in the cellular environment. PMID:22355615
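
    Neither the particle simulations nor the six physical parameters are reproduced in this record. The sketch below only illustrates the general approach of fitting a polynomial regression surrogate to simulated outputs over several crowding-related inputs; the input names, ranges and the toy response function are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)
# Hypothetical inputs: crowder volume fraction, crowder size, binding energy
X = rng.uniform([0.0, 1.0, 1.0], [0.4, 10.0, 5.0], size=(500, 3))
# Toy stand-in for a simulated binding observable (not the paper's data)
y = 1.0 + 3.0 * X[:, 0] - 0.2 * X[:, 1] + 0.5 * X[:, 0] * X[:, 2] + rng.normal(0.0, 0.05, 500)

# Polynomial regression surrogate capturing non-linear and interaction terms
surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surrogate.fit(X, y)
print("R^2 on training data:", surrogate.score(X, y))
print("prediction at a new condition:", surrogate.predict([[0.2, 5.0, 3.0]]))
```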

  4. Patterns for Semantic Business Process Modeling

    OpenAIRE

    Seitz, Christian

    2008-01-01

    Business Process Management has been one of the main topics in commercial information technology for many years and is becoming even more important now. The graphical modeling of business processes and their processing into software products requires so much human labor that the production cycles cannot comply with the fast-changing demands of today's global markets. To improve the degree of automatic processing in Business Process Management, techniques from the Semantic Web like ontologie...

  5. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we want to present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
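
    The integrated process model from the characterization study is not available here. A minimal Monte Carlo sketch of the general idea (propagating assumed process-parameter distributions through stacked unit operations and estimating an out-of-specification probability) is shown below; all distributions and the specification limit are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
n_runs = 100_000

# Hypothetical process-parameter distributions (not taken from the cited study)
titer = rng.normal(5.0, 0.4, n_runs)          # upstream titer
step1_yield = rng.normal(0.90, 0.03, n_runs)  # capture-step yield
step2_yield = rng.normal(0.95, 0.02, n_runs)  # polishing-step yield

# Integrated model: stacked unit operations, so parameter variation propagates
final_amount = titer * step1_yield * step2_yield

spec_lower = 3.8  # hypothetical lower specification limit
oos_probability = np.mean(final_amount < spec_lower)
print(f"estimated OOS probability: {oos_probability:.4f}")
```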

  6. Health care process modelling: which method when?

    Science.gov (United States)

    Jun, Gyuchan Thomas; Ward, James; Morris, Zoe; Clarkson, John

    2009-06-01

    The role of process modelling has been widely recognized for effective quality improvement. However, application in health care is somewhat limited since the health care community lacks knowledge about a broad range of methods and their applicability to health care. Therefore, the objectives of this paper are to present a summary description of a limited number of distinct modelling methods and evaluate how health care workers perceive them. Various process modelling methods from several different disciplines were reviewed and characterized. Case studies in three different health care scenarios were carried out to model those processes and evaluate how health care workers perceive the usability and utility of the process models. Eight distinct modelling methods were identified and characterized by what the modelling elements in each explicitly represents. Flowcharts, which had been most extensively used by the participants, were most favoured in terms of their usability and utility. However, some alternative methods, although having been used by a much smaller number of participants, were considered to be helpful, specifically in understanding certain aspects of complex processes, e.g. communication diagrams for understanding interactions, swim lane activity diagrams for roles and responsibilities and state transition diagrams for a patient-centred perspective. We believe that it is important to make the various process modelling methods more easily accessible to health care by providing clear guidelines or computer-based tool support for health care-specific process modelling. These supports can assist health care workers to apply initially unfamiliar, but eventually more effective modelling methods.

  7. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation-oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase-interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints are considered, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows the implementation of an efficient solving strategy and a concise error diagnosis. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it

  8. Management of health, safety and environment in process industry

    DEFF Research Database (Denmark)

    Duijm, Nijs Jan; Fiévez, C.; Gerbec, M.

    2008-01-01

    The present status of industrial HSE management in a number of EU member states is reviewed, with a focus on the integration of health, safety and environment in single management systems. The review provides insight into the standards and paradigms adopted by industry, and it identifies trends...... and needs for improvement. It appears that most industries consider goal-based HSE management programs to be a success and believe them to contribute to the profitability of the industry. We conclude that HSE management would benefit greatly from guidance on how to use existing management systems...

  9. TAME - the terrestrial-aquatic model of the environment: model definition

    Energy Technology Data Exchange (ETDEWEB)

    Klos, R.A. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Mueller-Lemans, H. [Tergoso AG fuer Umweltfragen, Sargans (Switzerland); Dorp, F. van [Nationale Genossenschaft fuer die Lagerung Radioaktiver Abfaelle (NAGRA), Baden (Switzerland); Gribi, P. [Colenco AG, Baden (Switzerland)

    1996-10-01

    TAME - the Terrestrial-Aquatic Model of the Environment is a new computer model for use in assessments of the radiological impact of the release of radionuclides to the biosphere, following their disposal in underground waste repositories. Based on regulatory requirements, the end-point of the calculations is the maximum annual individual dose to members of a hypothetical population group inhabiting the biosphere region. Additional mid- and end-points in the TAME calculations are dose as function of time from eleven exposure pathways, foodstuff concentrations and the distribution of radionuclides in the modelled biosphere. A complete description of the mathematical representations of the biosphere in TAME is given in this document, based on a detailed review of the underlying conceptual framework for the model. Example results are used to illustrate features of the conceptual and mathematical models. The end-point of dose is shown to be robust for the simplifying model assumptions used to define the biosphere for the example calculations. TAME comprises two distinct sub-models - one representing the transport of radionuclides in the near-surface environment and one for the calculation of dose to individual inhabitants of that biosphere. The former is the result of a detailed review of the modelling requirements for such applications and is based on a comprehensive consideration of all features, events and processes (FEPs) relevant to Swiss biospheres, both in the present-day biosphere and in potential future biosphere states. Representations of the transport processes are derived from first principles. Mass balance for water and solid material fluxes is used to determine the rates of contaminant transfer between components of the biosphere system. The calculation of doses is based on existing representations of exposure pathways and draws on experience both from Switzerland and elsewhere. (author) figs., tabs., refs.

  10. Hypercompetitive Environments: An Agent-based model approach

    Science.gov (United States)

    Dias, Manuel; Araújo, Tanya

    Information technology (IT) environments are characterized by complex changes and rapid evolution. Globalization and the spread of technological innovation have increased the need for new strategic information resources, both from individual firms and management environments. Improvements in multidisciplinary methods and, particularly, the availability of powerful computational tools are giving researchers an increasing opportunity to investigate management environments in their true complex nature. The adoption of a complex systems approach allows business strategies to be modeled from a bottom-up perspective — understood as resulting from repeated and local interaction of economic agents — without disregarding the consequences of the business strategies themselves for the individual behavior of enterprises and the emergence of interaction patterns between firms and management environments. Agent-based models are the leading approach in this attempt.
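
    The authors' model is not specified in this record. A minimal agent-based sketch of the bottom-up perspective described above, in which firms repeatedly imitate better-performing peers in a shifting environment, could look as follows; the payoff function, strategy space and update rule are illustrative assumptions only.

```python
import random

random.seed(4)

class Firm:
    def __init__(self):
        self.strategy = random.random()  # position in a 1-D strategy space
        self.payoff = 0.0

def payoff_of(strategy, optimum):
    """Hypothetical payoff: the closer a firm's strategy is to the current
    environmental optimum, the higher its payoff."""
    return 1.0 - abs(strategy - optimum)

firms = [Firm() for _ in range(50)]
for step in range(100):
    optimum = 0.5 + 0.4 * random.random()  # rapidly shifting environment
    for firm in firms:
        firm.payoff = payoff_of(firm.strategy, optimum)
    for firm in firms:                     # local interaction: imitate a random peer
        peer = random.choice(firms)
        if peer.payoff > firm.payoff:
            firm.strategy += 0.5 * (peer.strategy - firm.strategy)

print("mean strategy after adaptation:", sum(f.strategy for f in firms) / len(firms))
```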

  11. Report of the 2014 Programming Models and Environments Summit

    Energy Technology Data Exchange (ETDEWEB)

    Heroux, Michael [US Dept. of Energy, Washington, DC (United States); Lethin, Richard [US Dept. of Energy, Washington, DC (United States)

    2016-09-19

    Programming models and environments play essential roles in high performance computing, enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.

  12. Stochastic Signal Processing for Sound Environment System with Decibel Evaluation and Energy Observation

    Directory of Open Access Journals (Sweden)

    Akira Ikuta

    2014-01-01

    Full Text Available In a real sound environment system, a specific signal shows various types of probability distribution, and the observation data are usually contaminated by external noise (e.g., background noise of a non-Gaussian distribution type). Furthermore, various nonlinear correlations potentially exist in addition to the linear correlation between input and output time series. Consequently, the input-output relationship of the system in the real phenomenon often cannot be represented by a simple model using only the linear correlation and lower order statistics. In this study, complex sound environment systems that are difficult to analyze by the usual structural methods are considered. By introducing an estimation method for the system parameters reflecting the correlation information of the conditional probability distribution under the existence of external noise, a prediction method for the output response probability of sound environment systems is theoretically proposed in a form suited to the additive property of the energy variable and evaluation on the decibel scale. The effectiveness of the proposed stochastic signal processing method is experimentally confirmed by applying it to data observed in sound environment systems.
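
    The proposed stochastic signal-processing method is not reproduced here. The short sketch below only illustrates the additive-energy/decibel relationship that the formulation is stated to respect: sound energies add, while decibel levels must be converted to energies before being combined.

```python
import math

def combine_levels_db(levels_db):
    """Combine sound levels by adding energies, then converting back to decibels:
    L_total = 10 * log10(sum(10 ** (L_i / 10))).
    Energy (intensity) is additive; decibel levels are not."""
    total_energy = sum(10 ** (level / 10) for level in levels_db)
    return 10 * math.log10(total_energy)

# Two equal 60 dB sources combine to about 63 dB, not 120 dB
print(round(combine_levels_db([60.0, 60.0]), 1))
```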

  13. A knowledge generic environment for petrochemistry and refining processes supervision

    Energy Technology Data Exchange (ETDEWEB)

    Cauvin, S.

    1995-11-01

    This thesis proposes a generic architecture for a real-time process supervision knowledge-based system, easily adaptable to different refining and petrochemical processes and, by extension, to other dynamic systems. The aim is to ensure the monitoring of industrial units and to provide operators with advice in their diagnosis, command and plant operation tasks. The proposed architecture mainly includes an alarm filtering module using causal networks of problems, which are obtained automatically from the block diagram of the unit; a diagnosis module for the critical sections, which uses causal networks of mid- and long-term interactions between variables (an algorithm determines the coexistence of dominant or masked events on a plant); and a command module which is in charge of the real-time selection of action plans to be undertaken on the unit, taking into account the currently diagnosed situation (operating procedures are linked to situations which are managed in real time thanks to a mechanism derived from Petri nets). The knowledge that is manipulated and its use in real time are formalized to reach a level of genericness which effectively allows adaptation to several processes. For a new process, one mostly has to describe the new industrial device, the events that can occur on the process, their consequences, the control variables and the priority of actions, following the defined formalism. (author). 144 refs., 97 figs., 3 tabs.

  14. Hybrid modelling of anaerobic wastewater treatment processes.

    Science.gov (United States)

    Karama, A; Bernard, O; Genovesi, A; Dochain, D; Benhammou, A; Steyer, J P

    2001-01-01

    This paper presents a hybrid approach for the modelling of an anaerobic digestion process. The hybrid model combines a feed-forward network, describing the bacterial kinetics, and the a priori knowledge based on the mass balances of the process components. We have considered an architecture which incorporates the neural network as a static model of unmeasured process parameters (kinetic growth rate) and an integrator for the dynamic representation of the process using a set of dynamic differential equations. The paper contains a description of the neural network component training procedure. The performance of this approach is illustrated with experimental data.
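
    The experimental data and the trained network of this study are not given in the record. As a generic illustration of the hybrid (grey-box) structure described above, the following sketch couples batch-reactor mass balances with a data-driven kinetic sub-model; here a small scikit-learn network trained on synthetic Monod-like data stands in for the unmeasured growth-rate kinetics, and all parameters are hypothetical.

```python
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.neural_network import MLPRegressor

# Black-box part: a small network standing in for the unmeasured growth-rate kinetics.
# It is trained here on synthetic Monod-like data purely for illustration.
rng = np.random.default_rng(5)
s_train = rng.uniform(0.0, 10.0, 300).reshape(-1, 1)        # substrate concentration
mu_train = 0.4 * s_train.ravel() / (1.0 + s_train.ravel())  # synthetic specific growth rate
kinetics = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
kinetics.fit(s_train, mu_train)

# White-box part: mass balances of biomass X and substrate S in a batch reactor
def hybrid_model(t, y, yield_coeff=0.5):
    X, S = y
    mu = float(kinetics.predict([[max(S, 0.0)]])[0])  # growth rate from the network
    dX = mu * X
    dS = -mu * X / yield_coeff
    return [dX, dS]

sol = solve_ivp(hybrid_model, (0.0, 24.0), [0.1, 8.0])
print("final biomass and substrate:", sol.y[:, -1])
```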

  15. Threat processing: models and mechanisms.

    Science.gov (United States)

    Bentz, Dorothée; Schiller, Daniela

    2015-01-01

    The experience of fear is closely linked to the survival of species. Fear can be conceptualized as a brain state that orchestrates defense reactions to threats. To avoid harm, an organism must be equipped with neural circuits that allow learning, detecting, and rapidly responding to threats. Past experience with threat can transform neutral stimuli present at the time of experience into learned threat-related stimuli via associative learning. Pavlovian threat conditioning is the central experimental paradigm to study associative learning. Once learned, these stimulus-response associations are not always expressed depending on context or new experiences with the conditioned stimuli. Neural circuits mediating threat learning have the inherent plasticity to adapt to changing environmental threats. Encounters devoid of danger pave the way for extinction or reconsolidation to occur. Extinction and reconsolidation can both lead to changes in the expression of threat-induced defense responses, but differ in stability and have a different neural basis. This review presents the behavioral models and the system-level neural mechanisms in animals and humans of threat learning and modulation. © 2015 Wiley Periodicals, Inc.

  16. Environment. Biological processing of wastes; Environnement. Traitement biologique des dechets

    Energy Technology Data Exchange (ETDEWEB)

    Gourdon, R. [Institut National des Sciences Appliquees, INSA, Lab. d' Analyse Environnementale des Procedes et des Systemes Industriels, 69 - Villeurbanne (France)

    2001-01-01

    The main principle of biological processing is the use of microbial activities, stimulated in a controlled way, in order to decrease the harmful effects of wastes or to valorize them energetically. This paper deals with solid wastes and sludges. After a short presentation of the wastes concerned, their metabolism and their consequences, the author details two treatments: composting (aerobic) and methanization (anaerobic). The last part is devoted to alcoholic fermentation and the processing of industrial (non-agricultural) wastes. (A.L.B.)

  17. Spectroscopic analyses of chemical adaptation processes within microalgal biomass in response to changing environments.

    Science.gov (United States)

    Vogt, Frank; White, Lauren

    2015-03-31

    Via photosynthesis, marine phytoplankton transforms large quantities of inorganic compounds into biomass. This has considerable environmental impacts as microalgae contribute for instance to counter-balancing anthropogenic releases of the greenhouse gas CO2. On the other hand, high concentrations of nitrogen compounds in an ecosystem can lead to harmful algae blooms. In previous investigations it was found that the chemical composition of microalgal biomass is strongly dependent on the nutrient availability. Therefore, it is expected that algae's sequestration capabilities and productivity are also determined by the cells' chemical environments. For investigating this hypothesis, novel analytical methodologies are required which are capable of monitoring live cells exposed to chemically shifting environments followed by chemometric modeling of their chemical adaptation dynamics. FTIR-ATR experiments have been developed for acquiring spectroscopic time series of live Dunaliella parva cultures adapting to different nutrient situations. Comparing experimental data from acclimated cultures to those exposed to a chemically shifted nutrient situation reveals insights in which analyte groups participate in modifications of microalgal biomass and on what time scales. For a chemometric description of these processes, a data model has been deduced which explains the chemical adaptation dynamics explicitly rather than empirically. First results show that this approach is feasible and derives information about the chemical biomass adaptations. Future investigations will utilize these instrumental and chemometric methodologies for quantitative investigations of the relation between chemical environments and microalgal sequestration capabilities. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. A production model and maintenance planning model for the process industry

    OpenAIRE

    Ashayeri, J.; Teelen, A.; Selen, W.J.

    1995-01-01

    In this paper a model is developed to simultaneously plan preventive maintenance and production in a process industry environment, where maintenance planning is extremely important. The model schedules production jobs and preventive maintenance jobs, while minimizing costs associated with production, backorders, corrective maintenance and preventive maintenance. The formulation of the model is flexible, so that it can be adapted to several production situations. The performance of the model i...

  19. Human Information Processing in the Dynamic Environment (HIPDE)

    Science.gov (United States)

    2008-01-01

    largest increases occurred after the Gz exposure was over. This was most likely caused by subjects prematurely arresting their performance on the task...

  20. The Voltan application programming environment for fail-silent processes

    Science.gov (United States)

    Black, D.; Low, C.; Shrivastava, S. K.

    1998-06-01

    The Voltan software library for building distributed applications provides the support for (i) a process pair to act as a single Voltan self-checking `fail-silent' process; and (ii) connection management for Voltan process communication. A Voltan fail-silent process is written by the application developer as a single threaded program. The Voltan system replicates this program transparently. The active replication of applications engenders problems when dealing with non-deterministic calculations. This paper outlines the mechanisms deployed by Voltan to deal with non-determinism. The current implementation can achieve a level of performance that is suitable for many real-time applications. The work described in this paper provides a way of solving the challenging problem of constructing fault-tolerant distributed computing systems capable of tolerating Byzantine failures, using general-purpose, low-cost components. The present practice is to employ hardware-based approaches to construct a `fail-silent' node using a self-checking processor pair working in lock-step. However, this approach is very costly in terms of the engineering effort required, and further, as processor speeds increase, keeping a pair in lock-step execution may prove difficult.

  1. Effect of processing methods and storage environment on moisture ...

    African Journals Online (AJOL)

    The objective of this study was to determine the effect of processing methods and storage parameters on moisture adsorption characteristics of dry matured yellow ginger (Zingiber officianale) to provide information for the prediction of shelf life and selection of packaging materials. Moisture adsorption was determined ...

  2. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia

    2017-01-01

    Full Text Available Taking into account the highly diverse tasks which employees have to fulfil due to the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products which meet market expectations. Organizations nevertheless encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of the aforementioned process, which allows enterprises to develop a specific methodology. In order to design the conceptual model, Business Process Modelling instruments were employed - the Adonis Community Edition Business Process Management Toolkit using the ADONIS BPMS Notation. The conceptual model was developed based on in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended according to the enterprise's needs and requirements when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling, as they enable process analysis and evaluation (predefined/specific queries) and also model optimization (simulations).

  3. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

    Full Text Available Business process modelling and analysis is undoubtedly one of the most important parts of Applied (Business) Informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst's work is to create generally understandable, explicit and error-free models. If a process is properly described, the created models can be used as an input into deep analysis and optimization. It can be assumed that properly designed business process models (similarly to correctly written algorithms) contain characteristics that can be mathematically described, and that it should therefore be possible to create a tool that helps process analysts design proper models. As part of this work, a systematic literature review was conducted in order to find and analyse measures of business process model design and quality. The area has already been the subject of research in the past: thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed publications and existing quality measures do not reflect all important attributes of business process model clarity, simplicity and completeness. It would therefore be appropriate to add new measures of quality.

  4. Modeling human behaviors and reactions under dangerous environment.

    Science.gov (United States)

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

    This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers in real time in virtual environments. The development of the system includes: classification of the conscious/subconscious behaviors and reactions of different people; capturing different motion postures with the Eagle Digital System; establishing 3D character animation models; establishing 3D models for the scene; planning the scenario and the contents; and programming within Virtools Dev. Programming within Virtools Dev is subdivided into modeling dangerous events, modeling the character's perceptions, decision making, movements and interaction with the environment, and setting up the virtual cameras. The real-time simulation of human reactions in hazardous environments is invaluable in military defense, fire escape, rescue operation planning, traffic safety studies, and safety planning in chemical factories and in the design of buildings, airplanes, ships and trains. Currently, human motion modeling can be realized through established technology, whereas integrating perception and intelligence into a virtual human's motion is still a huge undertaking. The challenges here are the synchronization of motion and intelligence; the accurate modeling of human vision, smell, touch and hearing; and the diversity and effects of emotion and personality in decision making. Three types of software platforms could be employed to realize motion and intelligence within one system, and their advantages and disadvantages are discussed.

  5. Computer Forensics Field Triage Process Model

    Directory of Open Access Journals (Sweden)

    Marcus K. Rogers

    2006-06-01

    Full Text Available With the proliferation of digital based evidence, the need for the timely identification, analysis and interpretation of digital evidence is becoming more crucial. In many investigations critical information is required while at the scene or within a short period of time - measured in hours as opposed to days. The traditional cyber forensics approach of seizing a system(s)/media, transporting it to the lab, making a forensic image(s), and then searching the entire system for potential evidence, is no longer appropriate in some circumstances. In cases such as child abductions, pedophiles, missing or exploited persons, time is of the essence. In these types of cases, investigators dealing with the suspect or crime scene need investigative leads quickly; in some cases it is the difference between life and death for the victim(s). The Cyber Forensic Field Triage Process Model (CFFTPM) proposes an onsite or field approach for providing the identification, analysis and interpretation of digital evidence in a short time frame, without the requirement of having to take the system(s)/media back to the lab for an in-depth examination or acquiring a complete forensic image(s). The proposed model adheres to commonly held forensic principles, and does not negate the ability that once the initial field triage is concluded, the system(s)/storage media be transported back to a lab environment for a more thorough examination and analysis. The CFFTPM has been successfully used in various real world cases, and its investigative importance and pragmatic approach has been amply demonstrated. Furthermore, the derived evidence from these cases has not been challenged in the court proceedings where it has been introduced. The current article describes the CFFTPM in detail, discusses the model's forensic soundness, investigative support capabilities and practical considerations.

  6. Counting Processes for Retail Default Modeling

    DEFF Research Database (Denmark)

    Kiefer, Nicholas Maximilian; Larson, C. Erik

    in a discrete state space. In a simple case, the states could be default/non-default; in other models relevant for credit modeling the states could be credit scores or payment status (30 dpd, 60 dpd, etc.). Here we focus on the use of stochastic counting processes for mortgage default modeling, using data...
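
    As a hedged illustration of the counting-process idea (not the authors' estimator or data), the sketch below simulates monthly default counts in a mortgage pool from a hypothetical age-dependent intensity; the binomial draws are the increments of the counting process.

      # Illustrative only: time-inhomogeneous counting process for pool defaults.
      import numpy as np

      rng = np.random.default_rng(0)

      def intensity(t_months):
          # hypothetical seasoning curve: hazard peaks around month 36
          return 0.002 + 0.004 * np.exp(-((t_months - 36.0) / 24.0) ** 2)

      pool_size = 10_000
      survivors = pool_size
      defaults_per_month = []
      for t in range(1, 121):                      # ten years of monthly steps
          p = 1.0 - np.exp(-intensity(t))          # per-month default probability
          d = rng.binomial(survivors, p)           # increment of the counting process
          survivors -= d
          defaults_per_month.append(d)

      print("cumulative defaults after 10 years:", sum(defaults_per_month))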

  7. Influence of fractal substructures of the percolating cluster on transferring processes in macroscopically disordered environments

    Science.gov (United States)

    Kolesnikov, B. P.

    2017-11-01

    This work addresses the problem of finding the effective kinetic properties of macroscopically disordered environments (MDE). These properties characterize the MDE as a whole on length scales which significantly exceed the size of the macro-inhomogeneities. The structure of an MDE is considered as a complex of interpenetrating percolating and finite clusters consolidated from same-name components, whose topological characteristics influence the properties of the whole environment. The influence of the percolating cluster's fractal substructures (backbone, skeleton of the backbone, red bonds) on transfer processes during crossover (the structural transition from a fractal to a homogeneous state) is investigated, based on the proposed mathematical approach for finding the effective conductivity of MDEs and on the percolating cluster model. The nature of the change of the critical conductivity index t during crossover, from the value characteristic of the region close to the percolation threshold to the value corresponding to the homogeneous state, is demonstrated. The proposed model describes transfer processes in MDEs with a finite conductivity ratio between the "conductive" and "low-conductive" phases above and below the percolation threshold and in the smearing region (the analogue of the blur region of a second-order phase transition).
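
    For background, the standard two-phase percolation scaling relations for the effective conductivity (general results from percolation theory, not formulas quoted from this record) read, for phase conductivities $\sigma_1 \gg \sigma_2$ and occupation probability $p$:

      \sigma_{\mathrm{eff}} \propto \sigma_1\,(p - p_c)^{t},            \qquad p > p_c,
      \sigma_{\mathrm{eff}} \propto \sigma_2\,(p_c - p)^{-q},           \qquad p < p_c,
      \sigma_{\mathrm{eff}} \propto \sigma_1\,(\sigma_2/\sigma_1)^{t/(t+q)}, \qquad |p - p_c| \text{ within the smearing region,}

    so the apparent exponent t measured in a finite sample interpolates between its critical value near the threshold and the behaviour of the effectively homogeneous mixture far from it.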

  8. Modelling urban rainfall-runoff responses using an experimental, two-tiered physical modelling environment

    Science.gov (United States)

    Green, Daniel; Pattison, Ian; Yu, Dapeng

    2016-04-01

    Surface water (pluvial) flooding occurs when rainwater from intense precipitation events is unable to infiltrate into the subsurface or drain via natural or artificial drainage channels. Surface water flooding poses a serious hazard to urban areas across the world, with the UK's perceived risk appearing to have increased in recent years due to surface water flood events seeming more severe and frequent. Surface water flood risk currently accounts for 1/3 of all UK flood risk, with approximately two million people living in urban areas at risk of a 1 in 200-year flood event. Research often focuses upon using numerical modelling techniques to understand the extent, depth and severity of actual or hypothetical flood scenarios. Although much research has been conducted using numerical modelling, field data available for model calibration and validation is limited due to the complexities associated with data collection in surface water flood conditions. Ultimately, the data which numerical models are based upon is often erroneous and inconclusive. Physical models offer a novel, alternative and innovative environment to collect data within, creating a controlled, closed system where independent variables can be altered independently to investigate cause and effect relationships. A physical modelling environment provides a suitable platform to investigate rainfall-runoff processes occurring within an urban catchment. Despite this, physical modelling approaches are seldom used in surface water flooding research. Scaled laboratory experiments using a 9m2, two-tiered 1:100 physical model consisting of: (i) a low-cost rainfall simulator component able to simulate consistent, uniformly distributed (>75% CUC) rainfall events of varying intensity, and; (ii) a fully interchangeable, modular plot surface have been conducted to investigate and quantify the influence of a number of terrestrial and meteorological factors on overland flow and rainfall-runoff patterns within a modelled

  9. Impact Modelling for Circular Economy: Geodesign Discussion Support Environment

    NARCIS (Netherlands)

    Šileryte, R.; Wandl, A.; van Timmeren, A.; Bregt, Arnold; Sarjakoski, Tapani; van Lammeren, Ron; Rip, Frans

    2017-01-01

    Transitioning towards circular economy requires changes in the current system which yield a number of impacts on such fundamental values as human health, natural environment, exhaustible resources, social well-being and prosperity. Moreover, this process involves multiple actors and requires careful

  10. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...

  11. Modeling closed nuclear fuel cycles processes

    Energy Technology Data Exchange (ETDEWEB)

    Shmidt, O.V. [A.A. Bochvar All-Russian Scientific Research Institute for Inorganic Materials, Rogova, 5a street, Moscow, 123098 (Russian Federation); Makeeva, I.R. [Zababakhin All-Russian Scientific Research Institute of Technical Physics, Vasiliev street 13, Snezhinsk, Chelyabinsk region, 456770 (Russian Federation); Liventsov, S.N. [Tomsk Polytechnic University, Tomsk, Lenin Avenue, 30, 634050 (Russian Federation)

    2016-07-01

    Computer models of processes are necessary for determining optimal operating conditions for closed nuclear fuel cycle (NFC) processes, and they can be quickly updated in accordance with fresh data from experimental research. Three kinds of process simulation are necessary. First, a balance model: the VIZART software package is used for calculating material flows in technological processes, taking into account equipment capacities, transport lines and storage volumes. Secondly, it is necessary to simulate the physico-chemical processes that are involved in the closure of the NFC. The third kind of simulation is the development of software that allows optimization, diagnostics and control of the processes, which implies real-time simulation of product flows for the whole plant or for separate lines of the plant. (A.C.)
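
    A toy balance-model step in the spirit of such a flow-sheet package is sketched below: a daily feed is pushed through sequential reprocessing stages subject to equipment capacities, with the excess accumulating in inter-stage storage. Stage names, capacities and the feed rate are invented for illustration and are not taken from VIZART.

      # stage name -> daily processing capacity (kg/day); all values invented
      stages = [("dissolution", 120.0), ("extraction", 100.0), ("denitration", 90.0)]
      storage = {name: 0.0 for name, _ in stages}    # material waiting before each stage
      feed = 130.0                                   # kg/day entering the line

      flow = feed
      for name, capacity in stages:
          available = flow + storage[name]
          processed = min(available, capacity)
          storage[name] = available - processed      # excess accumulates in storage
          flow = processed

      print("product flow:", flow, "kg/day; backlogs:", storage)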

  12. Processing of soot in an urban environment: case study from the Mexico City Metropolitan Area

    Science.gov (United States)

    Johnson, K. S.; Zuberi, B.; Molina, L. T.; Molina, M. J.; Iedema, M. J.; Cowin, J. P.; Gaspar, D. J.; Wang, C.; Laskin, A.

    2005-11-01

    Chemical composition, size, and mixing state of atmospheric particles are critical in determining their effects on the environment. There is growing evidence that soot aerosols play a particularly important role in both climate and human health, but still relatively little is known of their physical and chemical nature. In addition, the atmospheric residence times and removal mechanisms for soot are neither well understood nor adequately represented in regional and global climate models. To investigate the effect of locality and residence time on soot properties and mixing state in a polluted urban environment, particles of diameter 0.2-2.0 μm were collected in the Mexico City Metropolitan Area (MCMA) during the MCMA-2003 Field Campaign from various sites within the city. Individual particle analysis by different electron microscopy methods, coupled with energy-dispersive X-ray spectroscopy and secondary ion mass spectrometry, shows that freshly emitted soot particles become rapidly processed in the MCMA. Whereas fresh particulate emissions from mixed traffic are almost entirely carbonaceous, consisting of soot aggregates with liquid coatings suggestive of unburned lubricating oil and water, ambient soot particles which have been processed for less than a few hours are heavily internally mixed, primarily with ammonium sulfate. Single particle analysis suggests that this mixing occurs through several mechanisms that require further investigation. In light of previously published results, the internally-mixed nature of processed soot particles is expected to affect heterogeneous chemistry on the soot surface, including interaction with water during wet removal.

  13. Multi-environment model estimation for motility analysis of Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Raphael Sznitman

    Full Text Available The nematode Caenorhabditis elegans is a well-known model organism used to investigate fundamental questions in biology. Motility assays of this small roundworm are designed to study the relationships between genes and behavior. Commonly, motility analysis is used to classify nematode movements and characterize them quantitatively. Over the past years, C. elegans' motility has been studied across a wide range of environments, including crawling on substrates, swimming in fluids, and locomoting through microfluidic substrates. However, each environment often requires customized image processing tools relying on heuristic parameter tuning. In the present study, we propose a novel Multi-Environment Model Estimation (MEME framework for automated image segmentation that is versatile across various environments. The MEME platform is constructed around the concept of Mixture of Gaussian (MOG models, where statistical models for both the background environment and the nematode appearance are explicitly learned and used to accurately segment a target nematode. Our method is designed to simplify the burden often imposed on users; here, only a single image which includes a nematode in its environment must be provided for model learning. In addition, our platform enables the extraction of nematode 'skeletons' for straightforward motility quantification. We test our algorithm on various locomotive environments and compare performances with an intensity-based thresholding method. Overall, MEME outperforms the threshold-based approach for the overwhelming majority of cases examined. Ultimately, MEME provides researchers with an attractive platform for C. elegans' segmentation and 'skeletonizing' across a wide range of motility assays.
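
    The sketch below illustrates the Mixture-of-Gaussians idea behind such a segmentation on synthetic data: one Gaussian mixture is learned for the background and one for the nematode appearance from a single roughly labelled training frame, and a new frame is segmented by comparing the two likelihoods. It is a minimal illustration, not the published MEME implementation; the frame sizes, intensities and mask are invented.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      def train_models(frame, worm_mask, n_components=3):
          """Learn one intensity mixture for the background and one for the worm."""
          bg = GaussianMixture(n_components).fit(frame[~worm_mask].reshape(-1, 1))
          worm = GaussianMixture(n_components).fit(frame[worm_mask].reshape(-1, 1))
          return bg, worm

      def segment(frame, bg, worm):
          """Label a pixel as worm where the worm model is more likely."""
          x = frame.reshape(-1, 1).astype(float)
          return (worm.score_samples(x) > bg.score_samples(x)).reshape(frame.shape)

      # synthetic training frame: bright background with a dark worm-like stripe
      rng = np.random.default_rng(1)
      frame = rng.normal(200.0, 5.0, (64, 64))
      frame[30:34, 10:50] = rng.normal(80.0, 5.0, (4, 40))
      mask = np.zeros(frame.shape, dtype=bool)
      mask[30:34, 10:50] = True

      bg, worm = train_models(frame, mask)
      print("segmented worm pixels:", int(segment(frame, bg, worm).sum()))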

  14. Modeling of non-additive mixture properties using the Online CHEmical database and Modeling environment (OCHEM

    Directory of Open Access Journals (Sweden)

    Oprisiu Ioana

    2013-01-01

    Full Text Available Abstract The Online Chemical Modeling Environment (OCHEM, http://ochem.eu) is a web-based platform that provides tools for automation of typical steps necessary to create a predictive QSAR/QSPR model. The platform consists of two major subsystems: a database of experimental measurements and a modeling framework. So far, OCHEM has been limited to the processing of individual compounds. In this work, we extended OCHEM with a new ability to store and model properties of binary non-additive mixtures. The developed system is publicly accessible, meaning that any user on the Web can store new data for binary mixtures and develop models to predict their non-additive properties. The database already contains almost 10,000 data points for the density, bubble point, and azeotropic behavior of binary mixtures. For these data, we developed models for both qualitative (azeotrope/zeotrope) and quantitative endpoints (density and bubble points) using different learning methods and specially developed descriptors for mixtures. The prediction performance of the models was similar to or more accurate than results reported in previous studies. Thus, we have developed and made publicly available a powerful system for modeling mixtures of chemical compounds on the Web.

  15. Modeling of non-additive mixture properties using the Online CHEmical database and Modeling environment (OCHEM).

    Science.gov (United States)

    Oprisiu, Ioana; Novotarskyi, Sergii; Tetko, Igor V

    2013-01-15

    The Online Chemical Modeling Environment (OCHEM, http://ochem.eu) is a web-based platform that provides tools for automation of typical steps necessary to create a predictive QSAR/QSPR model. The platform consists of two major subsystems: a database of experimental measurements and a modeling framework. So far, OCHEM has been limited to the processing of individual compounds. In this work, we extended OCHEM with a new ability to store and model properties of binary non-additive mixtures. The developed system is publicly accessible, meaning that any user on the Web can store new data for binary mixtures and develop models to predict their non-additive properties. The database already contains almost 10,000 data points for the density, bubble point, and azeotropic behavior of binary mixtures. For these data, we developed models for both qualitative (azeotrope/zeotrope) and quantitative endpoints (density and bubble points) using different learning methods and specially developed descriptors for mixtures. The prediction performance of the models was similar to or more accurate than results reported in previous studies. Thus, we have developed and made publicly available a powerful system for modeling mixtures of chemical compounds on the Web.
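
    One common way to build descriptors for a binary mixture from the descriptors of its pure components is sketched below; the paper uses its own specially developed mixture descriptors, so the combination rule and numbers here are only an assumed illustration.

      import numpy as np

      def mixture_descriptors(d1, d2, x1):
          """Combine pure-component descriptor vectors d1, d2 at mole fraction x1."""
          x2 = 1.0 - x1
          additive = x1 * d1 + x2 * d2                 # linear (mole-fraction-weighted) part
          interaction = np.abs(d1 - d2) * x1 * x2      # symmetric term, zero for pure compounds
          return np.concatenate([additive, interaction])

      # usage: two hypothetical compounds described by three descriptors, mixed 30/70
      d_a = np.array([1.2, 0.4, 3.3])
      d_b = np.array([0.9, 1.1, 2.0])
      print(mixture_descriptors(d_a, d_b, x1=0.3))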

  16. Dissolving decision making? : Models and their roles in decision-making processes and policy at large

    NARCIS (Netherlands)

    Zeiss, Ragna; van Egmond, S.

    2014-01-01

    This article studies the roles three science-based models play in Dutch policy and decision making processes. Key is the interaction between model construction and environment. Their political and scientific environments form contexts that shape the roles of models in policy decision making.

  17. A COMPUTER-BASED ENVIRONMENT FOR PROCESSING AND SELECTION OF SEISMIC GROUND MOTION RECORDS: OPENSIGNAL

    Directory of Open Access Journals (Sweden)

    Gian Paolo eCimellaro

    2015-09-01

    Full Text Available A new computer-based platform is proposed whose novelty consists in modeling the local site effects on ground motion propagation using a hybrid approach based on an equivalent linear model. The soil behavior is modeled assuming that both the shear modulus and the damping ratio vary with the shear strain amplitude, so the hysteretic behavior of the soil is described using shear modulus degradation and damping ratio curves. Another original feature of the proposed system architecture is the automatic evaluation of the Conditional Mean Spectrum (CMS) over the entire Italian territory, given the geographical coordinates. The signal-processing platform has been developed using a modular programming approach, to enable the selection and the processing of earthquake ground motion records. It combines in a unified environment different features such as: (i) selection of ground motion records using both spectral and waveform matching, (ii) signal processing, (iii) response spectra analysis, and (iv) soil response analysis. The computer-based platform OPENSIGNAL is freely available to the general public at http://areeweb.polito.it/ricerca/ICRED/Software/OpenSignal.php.
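
    The equivalent-linear idea can be illustrated with the minimal single-layer sketch below: the strain-compatible shear modulus and damping are found by iterating over strain-dependent curves until the effective shear strain converges. The curve shapes and input values are hypothetical and are not those implemented in OPENSIGNAL.

      def g_over_gmax(gamma):
          """Hyperbolic modulus-reduction curve (hypothetical reference strain)."""
          return 1.0 / (1.0 + gamma / 0.001)

      def damping(gamma):
          """Damping ratio grows as the modulus degrades (hypothetical curve)."""
          return 0.02 + 0.23 * (1.0 - g_over_gmax(gamma))

      gmax = 80e6            # Pa, small-strain shear modulus
      tau = 40e3             # Pa, imposed shear stress amplitude
      gamma = tau / gmax     # initial strain estimate

      for _ in range(50):
          g = gmax * g_over_gmax(gamma)              # strain-compatible modulus
          gamma_new = tau / g
          if abs(gamma_new - gamma) < 1e-12:
              break
          gamma = 0.5 * (gamma + gamma_new)          # relaxed update for stability

      print(f"effective strain {gamma:.2e}, G/Gmax {g_over_gmax(gamma):.2f}, "
            f"damping {damping(gamma):.3f}")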

  18. A Total Quality Leadership Process Improvement Model

    Science.gov (United States)

    1993-12-01

    A Total Quality Leadership Process Improvement Model, by Archester Houston, Ph.D., and Steven L. Dockstader, Ph.D. Department of the Navy TQL Office, TQLO No. 93-02, final report, 12/93. Appendix A: Deming's 14 management principles.

  19. Hybrid modelling of a sugar boiling process

    CERN Document Server

    Lauret, Alfred Jean Philippe; Gatina, Jean Claude

    2012-01-01

    The first and maybe the most important step in designing a model-based predictive controller is to develop a model that is as accurate as possible and that is valid under a wide range of operating conditions. The sugar boiling process is a strongly nonlinear and nonstationary process. The main process nonlinearities are represented by the crystal growth rate. This paper addresses the development of the crystal growth rate model according to two approaches. The first approach is classical and consists of determining the parameters of the empirical expressions of the growth rate through the use of a nonlinear programming optimization technique. The second is a novel modeling strategy that combines an artificial neural network (ANN) as an approximator of the growth rate with prior knowledge represented by the mass balance of sucrose crystals. The first results show that the first type of model performs local fitting while the second offers a greater flexibility. The two models were developed with industrial data...

  20. Probabilistic models of language processing and acquisition.

    Science.gov (United States)

    Chater, Nick; Manning, Christopher D

    2006-07-01

    Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.
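
    As a toy illustration of the probabilistic approach (not a model from the review), the sketch below builds a maximum-likelihood bigram model from a tiny invented corpus and shows that a grammatical word order receives higher probability than a scrambled one.

      from collections import Counter

      corpus = "the dog chased the cat . the cat slept .".split()   # toy corpus
      bigrams = Counter(zip(corpus, corpus[1:]))
      unigrams = Counter(corpus)

      def prob(sentence):
          """Maximum-likelihood bigram probability of a word sequence."""
          words = sentence.split()
          p = 1.0
          for w1, w2 in zip(words, words[1:]):
              p *= bigrams[(w1, w2)] / unigrams[w1]
          return p

      print(prob("the dog chased the cat"))   # grammatical order: non-zero probability
      print(prob("dog the the chased cat"))   # scrambled order: probability 0 here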

  1. Interdependencies of Arctic land surface processes: A uniquely sensitive environment

    Science.gov (United States)

    Bowling, L. C.

    2007-12-01

    The circumpolar arctic drainage basin is composed of several distinct ecoregions including steppe grassland and cropland, boreal forest and tundra. Land surface hydrology throughout this diverse region shares several unique features such as dramatic seasonal runoff differences controlled by snowmelt and ice break-up; the storage of significant portions of annual precipitation as snow and in lakes and wetlands; and the effects of ephemeral and permanently frozen soils. These arctic land processes are delicately balanced with the climate and are therefore important indicators of change. The litany of recently-detected changes in the Arctic includes changes in snow precipitation, trends and seasonal shifts in river discharge, increases and decreases in the extent of surface water, and warming soil temperatures. Although not unique to the arctic, increasing anthropogenic pressures represent an additional element of change in the form of resource extraction, fire threat and reservoir construction. The interdependence of the physical, biological and social systems means that changes in primary indicators have large implications for land cover, animal populations and the regional carbon balance, all of which have the potential to feed back and induce further change. In fact, the complex relationships between the hydrological processes that make the Arctic unique also render observed historical change difficult to interpret and predict, leading to conflicting explanations. For example, a decrease in snow accumulation may provide less insulation to the underlying soil resulting in greater frost development and increased spring runoff. Similarly, melting permafrost and ground ice may lead to ground subsidence and increased surface saturation and methane production, while more complete thaw may enhance drainage and result in drier soil conditions. The threshold nature of phase change around the freezing point makes the system especially sensitive to change. In addition, spatial

  2. Event-Entity-Relationship Modeling in Data Warehouse Environments

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    We use the event-entity-relationship model (EVER) to illustrate the use of entity-based modeling languages for conceptual schema design in data warehouse environments. EVER is a general-purpose information modeling language that supports the specification of both general schema structures and multi-dimensional schemes that are customized to serve specific information needs. EVER is based on an event concept that is very well suited for multi-dimensional modeling because measurement data often represent events in multi-dimensional databases...

  3. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-07-01

    Full Text Available Background: Competitive intelligence (CI) provides actionable intelligence, which provides a competitive edge in enterprises. However, without proper process, it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI, to look at the aims of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases. The output of one phase is the input of the next phase. Conclusion: The CI process is a cycle of interrelated phases. The output of one phase is the input of the next phase. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  4. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Full Text Available Background: Competitive intelligence (CI) provides actionable intelligence, which provides a competitive edge in enterprises. However, without proper process, it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI, to look at the aims of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases. The output of one phase is the input of the next phase. Conclusion: The CI process is a cycle of interrelated phases. The output of one phase is the input of the next phase. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  5. Students' reasoning during modeling in an inquiry learning environment.

    NARCIS (Netherlands)

    Lohner, Simone; van Joolingen, Wouter; Savelsbergh, Elwin R.; van Hout-Wolters, Bernadette

    2005-01-01

    In an inquiry-learning task, computer modeling can be a powerful tool to enhance students' reasoning and help them improve their understanding. Optimal learning effects in such an environment require a high quality of students' reasoning activities. It is not straightforward, however, which type of

  6. HexSim: a modeling environment for ecology and conservation.

    Science.gov (United States)

    HexSim is a powerful and flexible new spatially-explicit, individual-based modeling environment intended for use in ecology, conservation, genetics, epidemiology, toxicology, and other disciplines. We describe HexSim, illustrate past applications that contributed to our >10 ye...

  7. Computer modeling of dosimetric pattern in aquatic environment of ...

    African Journals Online (AJOL)

    The dose distribution functions for the three sources of radiation in the environment have been reviewed. The model representing the geometry of aquatic organisms has been employed in computationally solving for the dose rates to aquatic organisms, with emphasis on the coastal areas of Nigeria where oil exploration ...

  8. Environment modeling using runtime values for JPF-Android

    CSIR Research Space (South Africa)

    Van der Merwe, H

    2015-11-01

    Full Text Available , the environment of an application is simplified/abstracted using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution...

  9. Environment Modeling Using Runtime Values for JPF-Android

    Science.gov (United States)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries to add functionality to the application and drivers to fire the application execution. For testing and verification, the environment of an application is simplified abstracted using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry-points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on- the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
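
    The record-and-replay idea can be sketched in a language-agnostic way as below (the actual tool targets Java/Android and JPF; the decorator, method name and recorded value here are hypothetical): environment methods are instrumented to log their return values during a real run, and a generated stub replays those values during verification.

      import functools

      LOG = {}                                     # method name -> recorded return values

      def record(name):
          """Instrument an environment method so its return values are logged."""
          def deco(fn):
              @functools.wraps(fn)
              def wrapper(*args, **kwargs):
                  value = fn(*args, **kwargs)
                  LOG.setdefault(name, []).append(value)
                  return value
              return wrapper
          return deco

      @record("Location.getLatitude")              # hypothetical environment method
      def get_latitude():
          return 52.01                             # value observed during the real run

      get_latitude()                               # instrumented run populates the log

      def make_stub(name, log):
          """Generate a stub that replays recorded values, then repeats the last one."""
          values = iter(log[name])
          last = log[name][-1]
          return lambda: next(values, last)

      stub = make_stub("Location.getLatitude", LOG)
      print(stub(), stub())                        # stub used by the verification driver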

  10. A Virtual Environment for Resilient Infrastructure Modeling and Design

    Science.gov (United States)

    2015-09-01

    Naval Postgraduate School, Monterey, California; Master's thesis, September 2015: A Virtual Environment for Resilient Infrastructure Modeling and Design. Approved for public release; distribution is unlimited. ...postprocessing, including convenient and efficient methods for manipulating CI network data. Moreover, the object-oriented nature of Pyomo creates a natural

  11. Modelling Global Pattern Formations for Collaborative Learning Environments

    DEFF Research Database (Denmark)

    Grappiolo, Corrado; Cheong, Yun-Gyung; Khaled, Rilla

    2012-01-01

    We present our research towards the design of a computational framework capable of modelling the formation and evolution of global patterns (i.e. group structures) in a population of social individuals. The framework is intended to be used in collaborative environments, e.g. social serious games...

  12. Modeling nanomaterial fate and uptake in the environment

    NARCIS (Netherlands)

    Baalousha, M.; Cornelis, G.; Kuhlbusch, T.A.J.; Lynch, I.; Nickel, C.; Peijnenburg, W.; Brink, Van Den N.W.

    2016-01-01

    Modeling the environmental fate of nanomaterials (NMs) and their uptake by cells and organisms in the environment is essential to underpin experimental research, develop overarching theories, improve our fundamental understanding of NM exposure and hazard, and thus enable risk assessment of NMs.

  13. Evaluation and Comparison of Ecological Models Simulating Nitrogen Processes in Treatment Wetlands,Implemented in Modelica

    OpenAIRE

    Edelfeldt, Stina

    2005-01-01

    Two ecological models of nitrogen processes in treatment wetlands have been evaluated and compared. These models have been implemented, simulated, and visualized in the Modelica language. The differences and similarities between the Modelica modeling environment used in this thesis and other environments or tools for ecological modeling have been evaluated. The modeling tools evaluated are PowerSim, Simile, Stella, the MathModelica Model Editor, and WEST. The evaluation and the analysis have...

  14. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it, need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments......-based framework is that in the design, development and/or manufacturing of a chemical product-process, the knowledge of the applied phenomena together with the product-process design details can be provided with diverse degrees of abstractions and details. This would allow the experimental resources......, are the needed models for such a framework available? Or, are modelling tools that can help to develop the needed models available? Can such a model-based framework provide the needed model-based work-flows matching the requirements of the specific chemical product-process design problems? What types of models...

  15. MODELLING PURCHASING PROCESSES FROM QUALITY ASPECTS

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2008-12-01

    Full Text Available Management has the fundamental task of identifying and directing the primary and specific processes within the purchasing function, applying an up-to-date information infrastructure. ISO 9001:2000 defines a process as a number of interrelated or interacting activities which transform inputs into outputs, and the "process approach" as the systematic identification and management of the processes employed within the organization and, particularly, of the relationships among these processes. To direct a quality management system using the process approach, the organization has to determine the map of its general (basic) processes. Primary processes are determined on the grounds of their interrelationships and their impact on satisfying customers' needs. To make a proper choice of general business processes, it is necessary to determine the entire business flow, beginning with the customer demand and ending with the delivery of the product or service provided. In the next step the process model is converted into a data model, which is essential for implementation of the information system enabling automation, monitoring, measuring, inspection, analysis and improvement of key purchasing processes. This paper presents the methodology and some results of an investigation into the development of an IS for the purchasing process from the aspect of quality.

  16. Process control for sheet-metal stamping process modeling, controller design and shop-floor implementation

    CERN Document Server

    Lim, Yongseob; Ulsoy, A Galip

    2014-01-01

    Process Control for Sheet-Metal Stamping presents a comprehensive and structured approach to the design and implementation of controllers for the sheet metal stamping process. The use of process control for sheet-metal stamping greatly reduces defects in deep-drawn parts and can also yield large material savings from reduced scrap. Sheet-metal forming is a complex process and most often characterized by partial differential equations that are numerically solved using finite-element techniques. In this book, twenty years of academic research are reviewed and the resulting technology transitioned to the industrial environment. The sheet-metal stamping process is modeled in a manner suitable for multiple-input multiple-output control system design, with commercially available sensors and actuators. These models are then used to design adaptive controllers and real-time controller implementation is discussed. Finally, experimental results from actual shopfloor deployment are presented along with ideas for further...

  17. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  18. Value-Oriented Coordination Process Modeling

    NARCIS (Netherlands)

    Fatemi, Hassan; van Sinderen, Marten J.; Wieringa, Roelf J.; Hull, Richard; Mendling, Jan; Tai, Stefan

    Business webs are collections of enterprises designed to jointly satisfy a consumer need. Designing business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business value and coordination process perspectives, and for mutually aligning these

  19. Direct numerical simulation of microcavitation processes in different bio environments

    Science.gov (United States)

    Ly, Kevin; Wen, Sy-Bor; Schmidt, Morgan S.; Thomas, Robert J.

    2017-02-01

    Laser-induced microcavitation refers to the rapid formation and expansion of a vapor bubble inside bio-tissue when it is exposed to intense, pulsed laser energy. With the associated microscale dissection occurring within the tissue, laser-induced microcavitation is a common approach for high-precision bio-surgeries. For example, laser-induced microcavitation is used for laser in-situ keratomileusis (LASIK) to precisely reshape the midstromal corneal tissue through an excimer laser beam. Multiple efforts over the last several years have observed unique characteristics of microcavitation in bio-tissues. For example, it was found that the threshold energy for microcavitation can be significantly reduced when the size of the biostructure is increased. Also, it was found that the dynamics of microcavitation are significantly affected by the elastic modulus of the bio-tissue. However, these efforts have not focused on the early events during microcavitation development. In this study, a direct numerical simulation of the microcavitation process based on the equation of state of the bio-tissue was established. With the direct numerical simulation, we were able to reproduce the dynamics of microcavitation in water-rich bio-tissues. Additionally, an experimental setup in deionized water and 10% polyacrylamide (PAA) gel was built to verify the simulation results for early microcavitation formation.

  20. Modeling of Dielectric Heating within Lyophilization Process

    Directory of Open Access Journals (Sweden)

    Jan Kyncl

    2014-01-01

    Full Text Available A process of lyophilization of paper books is modeled. The process of drying is controlled by a dielectric heating system. From the physical viewpoint, the task represents a 2D coupled problem described by two partial differential equations for the electric and temperature fields. The material parameters are supposed to be temperature-dependent functions. The continuous mathematical model is solved numerically. The methodology is illustrated with some examples whose results are discussed.
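
    A typical weak-coupling formulation of such a problem (the paper's exact equations, symbols and boundary conditions may differ) consists of a quasi-static equation for the electric potential and a heat-conduction equation whose source term is the time-averaged power dissipated by dielectric losses:

      \nabla \cdot \big[\, \underline{\varepsilon}(T)\, \nabla \underline{\varphi} \,\big] = 0, \qquad \mathbf{E} = -\nabla \underline{\varphi},
      \rho(T)\, c_p(T)\, \frac{\partial T}{\partial t} = \nabla \cdot \big[\, \lambda(T)\, \nabla T \,\big] + 2\pi f\, \varepsilon_0\, \varepsilon''(T)\, E_{\mathrm{rms}}^{2},

    where f is the operating frequency, ε''(T) the temperature-dependent loss factor and λ(T) the thermal conductivity; the temperature dependence of the material parameters is what couples the two equations.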

  1. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension...

  2. Sedimentary Environments and Processes in the Vicinity of Quicks Hole, Elizabeth Islands, Massachusetts

    Science.gov (United States)

    Poppe, L. J.; Ackerman, S. D.; Moser, M. S.; Stewart, H. F.; Foster, D. S.; Blackwood, D. S.; Butman, B.

    2007-05-01

    Continuous-coverage multibeam bathymetric models and sidescan sonar imagery, verified with bottom sampling and photography, provide 1) detailed basemaps that yield topographic and geological perspectives of the sea floor, 2) a fundamental framework for research and management activities, and 3) information on sedimentary environments and processes. Interpretations presented here are based on NOAA hydrographic survey H11076 that covers approximately 23 sq. km around Quicks Hole, a major passage through the Elizabeth Islands chain, offshore southeastern Massachusetts. Bouldery gravels overgrown with seaweed and sessile fauna dominate the sea floor in Quicks Hole, along shorelines, and on isolated bathymetric highs. These deposits, which reflect environments of erosion and nondeposition, are winnowed lags of till from the exposed Buzzards Bay moraine. The sea floor south of the Hole in Vineyard Sound is characterized by environments associated with coarse bedload transport and covered with transverse and barchanoid sand waves. Transverse waves exceed 7 m in amplitude, have slip faces predominantly oriented to the west and southwest, and have straight, slightly sinuous, or curved crests. Megaripples, which mimic asymmetry of the transverse waves but not necessarily their orientation, are commonly present on stoss slopes; current ripples are ubiquitous. These smaller bedforms suggest that transport is active and that sand waves are propagating under the present hydraulic regime. Net sediment transport is primarily to the west and southwest as evidenced by comparisons with data from an earlier hydrographic survey, orientation of barchanoid waves, and asymmetry of transverse waves and of scour marks around boulders and shipwrecks. The sea floor across the northern part of the study area in Buzzards Bay and away from the opening to Quicks Hole is more protected from wind- and tidally-driven currents. Environments here are primarily characterized by processes associated

  3. Flux Analysis in Process Models via Causality

    Directory of Open Access Journals (Sweden)

    Ozan Kahramanoğulları

    2010-02-01

    Full Text Available We present an approach for flux analysis in process algebra models of biological systems. We perceive flux as the flow of resources in stochastic simulations. We resort to an established correspondence between event structures, a broadly recognised model of concurrency, and state transitions of process models, seen as Petri nets. We show that in this way we can extract the causal resource dependencies between individual state transitions in simulations as partial orders of events. We propose transformations on the partial orders that provide means for further analysis, and introduce a software tool which implements these ideas. By means of an example of a published model of the Rho GTP-binding proteins, we argue that this approach can provide a substitute for flux analysis techniques on ordinary differential equation models within the stochastic setting of process algebras.

  4. Fundamentals of Numerical Modelling of Casting Processes

    DEFF Research Database (Denmark)

    Pryds, Nini; Thorborg, Jesper; Lipinski, Marek

    Fundamentals of Numerical Modelling of Casting Processes comprises a thorough presentation of the basic phenomena that need to be addressed in numerical simulation of casting processes. The main philosophy of the book is to present the topics in view of their physical meaning, whenever possible...

  5. Business Process Modeling Notation - An Overview

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

    Full Text Available BPMN is an industrial standard created to offer a common and user-friendly notation to all participants in a business process. The present paper briefly presents the main features of this notation, as well as an interpretation of some of the main patterns characterizing a business process modelled by means of workflows.

  6. Perceptions of Instructional Design Process Models.

    Science.gov (United States)

    Branch, Robert Maribe

    Instructional design is a process that is creative, active, iterative and complex; however, many diagrams of instructional design are interpreted as stifling, passive, lock-step and simple because of the visual elements used to model the process. The purpose of this study was to determine the expressed perceptions of the types of flow diagrams…

  7. Physical and mathematical modelling of extrusion processes

    DEFF Research Database (Denmark)

    Arentoft, Mogens; Gronostajski, Z.; Niechajowics, A.

    2000-01-01

    The main objective of the work is to study the extrusion process using physical modelling and to compare the findings of the study with finite element predictions. The possibilities and advantages of the simultaneous application of both of these methods for the analysis of metal forming processes...

  8. Pedagogic process modeling: Humanistic-integrative approach

    OpenAIRE

    Boritko Nikolaj M.

    2007-01-01

    The paper deals with some current problems of modeling the dynamics of the development of the individual's subjective features. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logic and regularity of the development of the process; discreteness (stageability) in ...

  9. A Model for Urban Environment and Resource Planning Based on Green GDP Accounting System

    Directory of Open Access Journals (Sweden)

    Linyu Xu

    2013-01-01

    Full Text Available The urban environment and resources are currently on a course that is unsustainable in the long run due to the excessive human pursuit of economic goals. It is therefore very important to develop a model to analyse the relationship between urban economic development and environmental resource protection during the process of rapid urbanisation. This paper proposes a model to identify the key factors in urban environment and resource regulation based on a green GDP accounting system, which consists of four parts: economy, society, resource, and environment. In this model, the analytic hierarchy process (AHP) method and a modified Pearl curve model are combined to allow for dynamic evaluation, with a higher green GDP value as the planning target. The model was applied to the environmental and resource planning problem of Wuyishan City, and the results showed that energy use was a key factor influencing urban environment and resource development. Biodiversity and air quality were the most sensitive factors influencing the value of green GDP in the city. According to the analysis, urban environment and resource planning could be improved to promote sustainable development in Wuyishan City.
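
    The AHP step of such a model can be illustrated with the short sketch below, which derives priority weights for the four subsystems (economy, society, resource, environment) from a pairwise comparison matrix via its principal eigenvector and checks consistency. The comparison values are invented and are not taken from the Wuyishan City study.

      import numpy as np

      # hypothetical pairwise comparisons of economy, society, resource, environment
      A = np.array([[1.0, 2.0, 1/2, 1/3],
                    [1/2, 1.0, 1/3, 1/4],
                    [2.0, 3.0, 1.0, 1/2],
                    [3.0, 4.0, 2.0, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                                    # priority weights

      ci = (eigvals.real[k] - len(A)) / (len(A) - 1)  # consistency index
      cr = ci / 0.90                                  # random index 0.90 for n = 4
      print("weights:", np.round(w, 3), "consistency ratio:", round(float(cr), 3))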

  10. The Cognitive Complexity in Modelling the Group Decision Process

    Directory of Open Access Journals (Sweden)

    Barna Iantovics

    2010-06-01

    Full Text Available The paper investigates, for some basic contextual factors (such as the problem complexity, the users' creativity and the problem space complexity), the cognitive complexity associated with modelling the group decision processes (GDP) in e-meetings. The analysis is done by conducting a socio-simulation experiment for an envisioned collaborative software tool that acts as a stigmergic environment for modelling the GDP. The simulation results reveal some interesting design guidelines for engineering contextual functionalities that minimize the cognitive complexity associated with modelling the GDP.

  11. How Is the Learning Environment in Physics Lesson with Using 7E Model Teaching Activities

    Science.gov (United States)

    Turgut, Umit; Colak, Alp; Salar, Riza

    2017-01-01

    The aim of this research is to present the results of the planning, implementation and evaluation of learning environments designed in compliance with the 7E learning cycle model in physics lessons. "Action research", a qualitative research design, is employed in this research in accordance with the aim of the…

  12. Modeling biochemical transformation processes and information processing with Narrator.

    Science.gov (United States)

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is

  13. Modeling biochemical transformation processes and information processing with Narrator

    Directory of Open Access Journals (Sweden)

    Palfreyman Niall M

    2007-03-01

    Full Text Available Abstract Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a

  14. Modeling and control for closed environment plant production systems

    Science.gov (United States)

    Fleisher, David H.; Ting, K. C.; Janes, H. W. (Principal Investigator)

    2002-01-01

    A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal.
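    As a rough illustration of the kind of objective such a model-based predictive controller minimizes (the exact weights and horizon are not given in the abstract and are assumptions here), a quadratic cost over a prediction horizon $N$ might be written as

    $$ J \;=\; \sum_{k=1}^{N} \Big[ (y_k - r_k)^{\top} Q \,(y_k - r_k) \;+\; \Delta u_k^{\top} R \,\Delta u_k \Big], $$

    where $y_k$ is the predicted system response, $r_k$ the desired reference signal, $\Delta u_k$ the adjustments to the light-intensity, air-temperature and CO2 set points, and $Q$, $R$ are weighting matrices trading off tracking error against control effort.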

  15. The (Mathematical) Modeling Process in Biosciences.

    Science.gov (United States)

    Torres, Nestor V; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this work, the main features and shortcomings of the process are analyzed, and a set of rules that could help in the task of modeling any biological system is presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling poses to current biology.

  16. INTUITEL and the Hypercube Model - Developing Adaptive Learning Environments

    Directory of Open Access Journals (Sweden)

    Kevin Fuchs

    2016-06-01

    Full Text Available In this paper we introduce an approach for the creation of adaptive learning environments that give human-like recommendations to a learner in the form of a virtual tutor. We use ontologies defining pedagogical, didactic and learner-specific data describing a learner's progress, learning history, capabilities and the learner's current state within the learning environment. Learning recommendations are based on a reasoning process on these ontologies and can be provided in real-time. The ontologies may describe learning content from any domain of knowledge. Furthermore, we describe an approach to store learning histories as spatio-temporal trajectories and to correlate them with influencing didactic factors. We show how such analysis of spatiotemporal data can be used for learning analytics to improve future adaptive learning environments.

  17. Evolution of quantum-like modeling in decision making processes

    Science.gov (United States)

    Khrennikova, Polina

    2012-12-01

    The application of the mathematical formalism of quantum mechanics to modeling behavioral patterns in social science and economics is a novel and rapidly emerging field. The aim of the so-called 'quantum-like' models is to describe decision-making processes in a macroscopic setting, capturing the particular 'context' in which decisions are taken. Numerous empirical findings have shown that, when making a decision, people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thereby violating the law of total probability. A quantum probability formula was devised to describe the decision-making process more accurately. A next step in the development of QL-modeling in decision making was the application of the Schrödinger equation to describe the evolution of people's mental states. A shortcoming of the Schrödinger equation is its inability to capture the dynamics of an open system; the brain of the decision maker can be regarded as such a system, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered as a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.
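    For orientation, the open-system master equation referred to here is usually written in its standard Lindblad (GKSL) form; the specific Hamiltonian and coupling operators used in quantum-like decision models vary by author and are not specified in this abstract:

    $$ \frac{d\rho}{dt} \;=\; -\frac{i}{\hbar}\,[H,\rho] \;+\; \sum_{k} \gamma_k \left( L_k \rho L_k^{\dagger} - \tfrac{1}{2}\{ L_k^{\dagger} L_k,\ \rho \} \right), $$

    where $\rho$ is the density matrix describing the (mental) state, $H$ is the Hamiltonian driving its unitary evolution, and the operators $L_k$ with rates $\gamma_k$ encode the interaction with the environmental 'bath'; decision probabilities are then typically read off from the diagonal of the asymptotic $\rho$.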

  18. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code).
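    A minimal sketch of the idea of simulating a spiral process as a repeating modified waterfall is shown below; the phase list follows the abstract, but the effort figures, iteration count and requirements-growth factor are purely illustrative and are not taken from the PATT model.

```python
import random

# Phases of one spiral iteration, per the abstract (nominal efforts in person-days are illustrative).
PHASES = [("risk assessment", 5), ("requirements analysis", 10), ("design", 15),
          ("coding", 20), ("testing", 15), ("delivery", 3), ("evaluation", 4)]

def simulate_spiral(iterations=4, requirements_growth=0.15, seed=1):
    """Accumulate effort over spiral iterations, letting requirement changes
    inflate later iterations (a toy stand-in for a discrete-event model)."""
    random.seed(seed)
    total, scale = 0.0, 1.0
    for _ in range(iterations):
        for _phase, nominal in PHASES:
            total += nominal * scale * random.uniform(0.8, 1.2)  # stochastic phase duration
        scale *= 1.0 + requirements_growth  # requirement changes accepted at each iteration
    return total

print(f"Simulated total effort: {simulate_spiral():.1f} person-days")
```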

  19. Analysis and evaluation of collaborative modeling processes

    NARCIS (Netherlands)

    Ssebuggwawo, D.

    2012-01-01

    Analysis and evaluation of collaborative modeling processes is confronted with many challenges. On the one hand, many systems design and re-engineering projects require collaborative modeling approaches that can enhance their productivity. But, such collaborative efforts, which often consist of the

  20. Chemical vapor infiltration process modeling and optimization

    Energy Technology Data Exchange (ETDEWEB)

    Besmann, T.M.; Stinton, D.P. [Oak Ridge National Lab., TN (United States); Matlin, W.M. [Tennessee Univ., Knoxville, TN (United States). Dept. of Materials Science and Engineering

    1995-12-31

    Chemical vapor infiltration (CVI) is a unique method for preparing continuous fiber ceramic composites that spares the strong but relatively fragile fibers from damaging thermal, mechanical, and chemical degradation. The process is relatively complex, and modeling requires detailed phenomenological knowledge of the chemical kinetics and of mass and heat transport. An overview of some of the current understanding and modeling of CVI is given, along with examples of efforts to optimize the process. Finally, recent efforts to scale up the process to produce tubular forms are described.

  1. CAD ACTIVE MODELS: AN INNOVATIVE METHOD IN ASSEMBLY ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    NADDEO Alessandro

    2010-07-01

    Full Text Available The aim of this work is to show the use and versatility of active models in different applications. An active model of a cylindrical spring has been created and applied in two mechanisms that differ in typology and in the applied loads. The first example is a dynamometer in which the cylindrical spring is loaded by traction forces, while the second example is a pressure valve in which the cylindrical-conical spring works under compression. Imposing the loads in both cases has allowed us to evaluate the model of the mechanism in different working conditions, also in the assembly environment.

  2. Saint: a lightweight integration environment for model annotation.

    Science.gov (United States)

    Lister, Allyson L; Pocock, Matthew; Taschuk, Morgan; Wipat, Anil

    2009-11-15

    Saint is a web application which provides a lightweight annotation integration environment for quantitative biological models. The system enables modellers to rapidly mark up models with biological information derived from a range of data sources. Saint is freely available for use on the web at http://www.cisban.ac.uk/saint. The web application is implemented in Google Web Toolkit and Tomcat, with all major browsers supported. The Java source code is freely available for download at http://saint-annotate.sourceforge.net. The Saint web server requires an installation of libSBML and has been tested on Linux (32-bit Ubuntu 8.10 and 9.04).

  3. ARTEMIS: Ares Real Time Environments for Modeling, Integration, and Simulation

    Science.gov (United States)

    Hughes, Ryan; Walker, David

    2009-01-01

    This slide presentation reviews the use of ARTEMIS in the development and testing of the Ares launch vehicles. The Ares Real Time Environment for Modeling, Simulation and Integration (ARTEMIS) is the real-time simulation supporting Ares I hardware-in-the-loop (HWIL) testing. ARTEMIS accurately models all Ares/Orion/Ground subsystems which interact with Ares avionics components from pre-launch through orbit insertion. The ARTEMIS System Integration Lab and the STIF architecture are reviewed. The functional components of ARTEMIS are outlined. An overview of the models and a block diagram are presented.

  4. Distributed data processing and analysis environment for neutron scattering experiments at CSNS

    Science.gov (United States)

    Tian, H. L.; Zhang, J. R.; Yan, L. L.; Tang, M.; Hu, L.; Zhao, D. X.; Qiu, Y. X.; Zhang, H. Y.; Zhuang, J.; Du, R.

    2016-10-01

    China Spallation Neutron Source (CSNS) is the first high-performance pulsed neutron source in China, which will meet increasing demands for fundamental research and technological applications both domestically and overseas. A new distributed data processing and analysis environment has been developed, which provides generic functionality for neutron scattering experiments. The environment consists of three parts: an object-oriented data processing framework adopting a data-centered architecture; a communication and data caching system based on the client/server (C/S) paradigm; and data analysis and visualization software providing 2D/3D experimental data display. This environment will be widely applied at CSNS for live data processing.

  5. The NPS Virtual Thermal Image processing model

    OpenAIRE

    Kenter, Yucel.

    2001-01-01

    A new virtual thermal image-processing model that has been developed at the Naval Postgraduate School is introduced in this thesis. This visualization program is based on an earlier work, the Visibility MRTD model, which is focused on predicting the minimum resolvable temperature difference (MRTD). The MRTD is a standard performance measure for forward-looking infrared (FLIR) imaging systems. It takes into account thermal imaging system modeling concerns, such as modulation transfer functions...

  6. Using Perspective to Model Complex Processes

    Energy Technology Data Exchange (ETDEWEB)

    Kelsey, R.L.; Bisset, K.R.

    1999-04-04

    The notion of perspective, when supported in an object-based knowledge representation, can facilitate better abstractions of reality for modeling and simulation. The object modeling of complex physical and chemical processes is made more difficult in part due to the poor abstractions of state and phase changes available in these models. The notion of perspective can be used to create different views to represent the different states of matter in a process. These techniques can lead to a more understandable model. Additionally, the ability to record the progress of a process from start to finish is problematic. It is desirable to have a historical record of the entire process, not just the end result of the process. A historical record should facilitate backtracking and restart of a process at different points in time. The same representation structures and techniques can be used to create a sequence of process markers to represent a historical record. By using perspective, the sequence of markers can have multiple and varying views tailored for a particular user's context of interest.
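    A very small sketch of the two ideas described above (per-state "perspectives" on one object, and a historical sequence of process markers) might look like the following; the class and attribute names are hypothetical and are not taken from the cited work.

```python
from dataclasses import dataclass, field

@dataclass
class Perspective:
    """One view of a substance, tailored to a particular state of matter."""
    state: str        # e.g. "solid", "liquid", "gas"
    properties: dict  # properties meaningful only in that state

@dataclass
class Substance:
    name: str
    perspectives: dict = field(default_factory=dict)  # state -> Perspective
    history: list = field(default_factory=list)       # sequence of process markers

    def add_perspective(self, view: Perspective):
        self.perspectives[view.state] = view

    def mark(self, time, state, note):
        """Record a process marker so the run can be inspected or restarted later."""
        self.history.append({"time": time, "state": state, "note": note})

water = Substance("water")
water.add_perspective(Perspective("liquid", {"density_kg_m3": 998}))
water.add_perspective(Perspective("gas", {"pressure_kPa": 101.3}))
water.mark(0.0, "liquid", "start of heating")
water.mark(5.0, "gas", "boiling complete")
print([m["state"] for m in water.history])
```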

  7. Numerical modeling process of embolization arteriovenous malformation

    Science.gov (United States)

    Cherevko, A. A.; Gologush, T. S.; Petrenko, I. A.; Ostapenko, V. V.

    2017-10-01

    Cerebral arteriovenous malformation is a complex, dangerous, and frequently encountered developmental vascular malformation. It consists of vessels of very small diameter that shunt blood directly from the arteries to the veins; in this regard it can be adequately modeled as a porous medium. Endovascular embolization of arteriovenous malformation is an effective treatment for such pathologies. However, the danger of intraoperative rupture during embolization still exists. The purpose is to model this process and build an optimization algorithm for arteriovenous malformation embolization. To study the different embolization variants, the initial-boundary value problems describing the process of embolization were solved numerically using a new modification of the CABARET scheme. The essential stages of the embolization process were modeled in our numerical experiments. This approach reproduces well the essential features of the discontinuous two-phase flows arising in embolization problems. It can be used for further study of the embolization process.

  8. On the modeling of planetary plasma environments by a fully kinetic electromagnetic global model HYB-em

    Directory of Open Access Journals (Sweden)

    V. Pohjola

    2010-03-01

    Full Text Available We have developed a fully kinetic electromagnetic model to study instabilities and waves in planetary plasma environments. In the particle-in-cell (PIC) model both ions and electrons are modeled as particles. An important feature of the developed global kinetic model, called HYB-em, compared to other electromagnetic codes is that it is built up on an earlier quasi-neutral hybrid simulation platform called HYB and that it can be used in conjunction with earlier hybrid models. The HYB models have been used during the past ten years to study globally the flowing plasma interaction with various Solar System objects: Mercury, Venus, the Moon, Mars, the Saturnian moon Titan and asteroids. The new stand-alone fully kinetic model enables us to (1) study the stability of various planetary plasma regions in three-dimensional space, and (2) analyze the propagation of waves in a plasma environment derived from the other global HYB models. All particle processes in a multi-ion plasma which are implemented on the HYB platform (e.g. ion-neutral collisions, chemical processes, particle loss and production processes) are also automatically included in the HYB-em model. In this brief report we study the developed approach by analyzing the propagation of high frequency electromagnetic waves in non-magnetized plasma in two cases: we study (1) expansion of a spherical wave generated from a point source and (2) propagation of a plane wave in plasma. The analysis shows that the HYB-em model is capable of describing these space plasma situations successfully. The analysis also suggests the potential of the developed model to study both high density-high magnetic field plasma environments, such as Mercury, and low density-low magnetic field plasma environments, such as Venus and Mars.

  9. On the modeling of planetary plasma environments by a fully kinetic electromagnetic global model HYB-em

    Directory of Open Access Journals (Sweden)

    V. Pohjola

    2010-03-01

    Full Text Available We have developed a fully kinetic electromagnetic model to study instabilities and waves in planetary plasma environments. In the particle-in-cell (PIC) model both ions and electrons are modeled as particles. An important feature of the developed global kinetic model, called HYB-em, compared to other electromagnetic codes is that it is built up on an earlier quasi-neutral hybrid simulation platform called HYB and that it can be used in conjunction with earlier hybrid models. The HYB models have been used during the past ten years to study globally the flowing plasma interaction with various Solar System objects: Mercury, Venus, the Moon, Mars, the Saturnian moon Titan and asteroids. The new stand-alone fully kinetic model enables us to (1) study the stability of various planetary plasma regions in three-dimensional space, and (2) analyze the propagation of waves in a plasma environment derived from the other global HYB models. All particle processes in a multi-ion plasma which are implemented on the HYB platform (e.g. ion-neutral collisions, chemical processes, particle loss and production processes) are also automatically included in the HYB-em model.

    In this brief report we study the developed approach by analyzing the propagation of high frequency electromagnetic waves in non-magnetized plasma in two cases: we study (1) expansion of a spherical wave generated from a point source and (2) propagation of a plane wave in plasma. The analysis shows that the HYB-em model is capable of describing these space plasma situations successfully. The analysis also suggests the potential of the developed model to study both high density-high magnetic field plasma environments, such as Mercury, and low density-low magnetic field plasma environments, such as Venus and Mars.

  10. Modeling Multivariate Volatility Processes: Theory and Evidence

    Directory of Open Access Journals (Sweden)

    Jelena Z. Minovic

    2009-05-01

    Full Text Available This article presents theoretical and empirical methodology for the estimation and modeling of multivariate volatility processes. It surveys the model specifications and the estimation methods. The multivariate GARCH models covered are VEC (initially due to Bollerslev, Engle and Wooldridge, 1988), diagonal VEC (DVEC), BEKK (named after Baba, Engle, Kraft and Kroner, 1995), the Constant Conditional Correlation model (CCC; Bollerslev, 1990), and the Dynamic Conditional Correlation (DCC) models of Tse and Tsui (2002) and Engle (2002). I illustrate the approach by applying it to daily data from the Belgrade stock exchange: I examine two pairs of daily log returns for stocks and an index, report the results obtained, and compare them with the restricted versions of the BEKK, DVEC and CCC representations. The parameter estimation methods used are maximum log-likelihood (in the BEKK and DVEC models) and the two-step approach (in the CCC model).
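    As a hedged illustration of the simplest of these specifications, the constant conditional correlation (CCC) idea can be sketched in a few lines: fit a univariate GARCH(1,1) to each return series, then estimate a constant correlation from the standardized residuals. The sketch assumes the Python `arch` package and two daily log-return series `r1` and `r2`; it is not the estimation code used in the article.

```python
import numpy as np
from arch import arch_model

def ccc_garch(r1, r2):
    """Two-step CCC estimate: univariate GARCH(1,1) fits, then a constant correlation."""
    std_resid = []
    for r in (r1, r2):
        res = arch_model(r, vol="Garch", p=1, q=1, mean="Constant").fit(disp="off")
        std_resid.append(res.resid / res.conditional_volatility)  # standardized residuals
    return np.corrcoef(std_resid[0], std_resid[1])[0, 1]          # constant conditional correlation

# Example with simulated returns (replace with Belgrade stock exchange log returns):
rng = np.random.default_rng(0)
r1, r2 = rng.normal(0, 1, 1000), rng.normal(0, 1, 1000)
print(f"Estimated constant conditional correlation: {ccc_garch(r1, r2):.3f}")
```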

  11. A One-Dimensional Numerical Model to Study the Effects of Cumulus Clouds on the Environment,

    Science.gov (United States)

    ... the environment. The model combines a one-dimensional Lagrangian cumulus cloud model with the basic physical processes of cyclonic-scale lifting, surface eddy mixing, cloud-induced environmental subsidence, sub-cloud hydrometeor water evaporation and horizontal diffusion of the dissipating cloud. Included in this documentation are a detailed model description, derivation of the model equations, a basic flow diagram, a list of program mnemonics, a description of the input data format, and a model listing and output from the National Center for Atmospheric Research's

  12. On Choosing Between Two Probabilistic Choice Sub-models in a Dynamic Multitask Environment

    Science.gov (United States)

    Soulsby, E. P.

    1984-01-01

    An independent random utility model based on Thurstone's Theory of Comparative Judgment and a constant utility model based on Luce's Choice Axiom are reviewed in detail. Predictions from the two models are shown to be equivalent under certain restrictions on the distribution of the underlying random process. Each model is applied as a stochastic choice submodel in a dynamic, multitask environment. Resulting choice probabilities are nearly identical, indicating that, despite their conceptual differences, neither model may be preferred over the other based solely on its predictive capability.
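    The "certain restrictions" alluded to are the classical ones: Luce's choice axiom gives choice probabilities $P(i) = v_i / \sum_j v_j$, and a Thurstone-style random utility model $U_i = \mu_i + \varepsilon_i$ reproduces exactly this form when the $\varepsilon_i$ are independent and identically Gumbel-distributed, in which case

    $$ P(i) \;=\; \Pr\!\left(U_i > U_j \ \ \forall j \neq i\right) \;=\; \frac{e^{\mu_i}}{\sum_j e^{\mu_j}}, $$

    so the two sub-models coincide with $v_i = e^{\mu_i}$. With normally distributed errors, as in Thurstone's original Case V, the predictions are only approximately equal, which is consistent with the "nearly identical" choice probabilities reported here.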

  13. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the development of the individual's subject features. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Two steps for singling out a particular stage are offered, together with the algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretical research, analyses of educational practices and realistic prediction of pedagogical phenomena.

  14. From Business Value Model to Coordination Process Model

    Science.gov (United States)

    Fatemi, Hassan; van Sinderen, Marten; Wieringa, Roel

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary for a good e-business design, but these activities have different goals and use different concepts. Nevertheless, the resulting models should be consistent with each other because they refer to the same system from different perspectives. Hence, checking the consistency between these models or producing one based on the other would be of high value. In this paper we discuss the issue of achieving consistency in multi-level e-business design and give guidelines to produce consistent coordination process models from business value models in a stepwise manner.

  15. Various Models for Reading Comprehension Process

    Directory of Open Access Journals (Sweden)

    Parastoo Babashamsi

    2013-11-01

    Full Text Available In recent years, reading has come to be viewed as a process, a form of thinking, a true experience, and a tool subject. As a process, reading includes visual discrimination, independent recognition of words, rhythmic progression along a line of print, precision in the return sweep of the eyes, and adjustment of rate. Along the same lines, the present paper aims to consider the various models of the reading process. Moreover, the paper takes a look at various factors, such as schema and vocabulary knowledge, which affect the reading comprehension process.

  16. Ancestral process and diffusion model with selection

    CERN Document Server

    Mano, Shuhei

    2008-01-01

    The ancestral selection graph in population genetics introduced by Krone and Neuhauser (1997) is an analogue to the coalescent genealogy. The number of ancestral particles, backward in time, of a sample of genes is an ancestral process, which is a birth and death process with quadratic death and linear birth rates. In this paper an explicit form of the number of ancestral particles is obtained, by using the density of the allele frequency in the corresponding diffusion model obtained by Kimura (1955). It is shown that fixation corresponds to convergence of the ancestral process to the stationary measure. The time to fixation of an allele is studied in terms of the ancestral process.

  17. Biomedical Simulation Models of Human Auditory Processes

    Science.gov (United States)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models explore the noise propagation mechanisms associated with noise attenuation and the transmission paths created when using hearing protectors such as earplugs and headsets in high-noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data, which provide explicit external ear, ear canal, middle ear ossicular bone and cochlea geometry. Results from these studies have enabled a greater understanding of hearing-protector-to-flesh dynamics as well as prioritizing noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast-related impulse noise on human auditory mechanisms and brain tissue.

  18. Natural headland sand bypassing; towards identifying and modelling the mechanisms and processes

    NARCIS (Netherlands)

    Bin Ab Razak, M.S.

    2015-01-01

    Natural headland sand bypassing: Towards identifying and modelling the mechanisms and processes contributes to the understanding of the mechanisms and processes of sand bypassing in artificial and non-artificial coastal environments through a numerical modelling study. Sand bypassing processes in

  19. Road environment perception algorithm based on object semantic probabilistic model

    Science.gov (United States)

    Liu, Wei; Wang, XinMei; Tian, Jinwen; Wang, Yong

    2015-12-01

    This article develops an object semantic probabilistic model (OSPM) for object categories, based on statistical analysis. We applied this model to a road forward-environment perception algorithm, including on-road object recognition and detection. First, the image was represented by a set composed of words (local feature regions). Then, the probability distribution among the image, local regions and object semantic categories was derived based on the new model. In training, the parameters of the object model are estimated; this is done by using expectation-maximization in a maximum likelihood setting. In recognition, this model is used to classify images in a Bayesian manner. In detection, the posterior is calculated to detect typical on-road objects. Experiments show good performance on object recognition and detection against an urban street background.
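    A drastically simplified stand-in for the Bayesian classification step (not the OSPM itself, whose latent structure is estimated with EM in the paper) is naive-Bayes classification over the visual-word counts of an image; all numbers below are illustrative.

```python
import numpy as np

def classify(word_counts, log_prior, log_word_given_class):
    """Pick the object category maximizing log P(c) + sum_w n_w * log P(w | c)."""
    scores = log_prior + word_counts @ log_word_given_class.T
    return int(np.argmax(scores))

# Toy example: 3 categories, 5-word visual vocabulary.
log_prior = np.log(np.array([0.5, 0.3, 0.2]))
p_w_given_c = np.array([[0.4, 0.3, 0.1, 0.1, 0.1],
                        [0.1, 0.1, 0.4, 0.3, 0.1],
                        [0.2, 0.2, 0.2, 0.2, 0.2]])
image_word_counts = np.array([3, 1, 0, 0, 2])
print("Predicted category:", classify(image_word_counts, log_prior, np.log(p_w_given_c)))
```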

  20. Gene-Environment Processes Linking Aggression, Peer Victimization, and the Teacher-Child Relationship

    Science.gov (United States)

    Brendgen, Mara; Boivin, Michel; Dionne, Ginette; Barker, Edward D.; Vitaro, Frank; Girard, Alain; Tremblay, Richard; Perusse, Daniel

    2011-01-01

    Aggressive behavior in middle childhood is at least partly explained by genetic factors. Nevertheless, estimations of simple effects ignore possible gene-environment interactions (G x E) or gene-environment correlations (rGE) in the etiology of aggression. The present study aimed to simultaneously test for G x E and rGE processes between…

  1. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining four describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: "Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model". One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  2. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  3. Stochastic differential equation model to Prendiville processes

    Science.gov (United States)

    Granita, Bahar, Arifah

    2015-10-01

    The Prendiville process is another variation of the logistic model, which assumes a linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in a finite interval. The continuous time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work started with the forward Kolmogorov equation of the continuous time Markov chain for the Prendiville process. This was then formulated as a central-difference approximation. The approximation was then used in the Fokker-Planck equation to obtain the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance functions of the Prendiville process could be easily found from the explicit solution.
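    For context, the generic route from a birth-death CTMC to an SDE (of which the Prendiville case treated in the paper is a particular instance) replaces a chain with birth rate $b(x)$ and death rate $d(x)$ by the diffusion

    $$ dX_t \;=\; \big[\,b(X_t) - d(X_t)\,\big]\,dt \;+\; \sqrt{\,b(X_t) + d(X_t)\,}\;dW_t, $$

    obtained by truncating the Kramers-Moyal expansion of the forward Kolmogorov equation at second order; the drift is the net growth rate and the squared diffusion coefficient is the total event rate. The specific linear forms of $b$ and $d$ for the Prendiville process are given in the paper.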

  4. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    Science.gov (United States)

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. Previous experiences of using this notation for process modelling within Pathology, in Spain or elsewhere, are not known. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Department of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology is presented using BPMN. The presented subprocesses are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, in which management and improvements are more easily implemented by health professionals.

  5. Modelling income processes with lots of heterogeneity

    DEFF Research Database (Denmark)

    Browning, Martin; Ejrnæs, Mette; Alvarez, Javier

    2010-01-01

    We model earnings processes allowing for lots of heterogeneity across agents. We also introduce an extension to the linear ARMA model which allows the initial convergence in the long run to be different from that implied by the conventional ARMA model. This is particularly important for unit root...... this observable homogeneity, we find more latent heterogeneity than previous investigators. We show that allowance for heterogeneity makes substantial differences to estimates of model parameters and to outcomes of interest. Additionally, we find strong evidence against the hypothesis that any worker has a unit...

  6. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    Full Text Available This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before purchase and the perceived performance after purchase. An experimental method is designed to measure expectation disconfirmation effects, and the collected data are used to estimate the overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. This model has applications in business marketing for managing relationship satisfaction.

  7. Home-Network Security Model in Ubiquitous Environment

    OpenAIRE

    Dong-Young Yoo; Jong-Whoi Shin; Jin-Young Choi

    2007-01-01

    Social interest in and demand for the Home-Network have been increasing greatly. Although various services are being introduced to respond to such demands, they can cause serious security problems when linked to an open network such as the Internet. This paper reviews the security requirements to protect service users, with the assumption that the Home-Network environment is connected to the Internet, and then proposes a security model based on these requirements. The proposed security mode...

  8. Wideband Channel Modeling in Real Atmospheric Environments with Experimental Evaluation

    Science.gov (United States)

    2013-04-01

    received signal will experience ISI and the channel is considered wideband. If either the transmitter or receiver is mobile or the environment is not...are commonly used in spread spectrum communication systems such as Code Division Multiple Access (CDMA) systems. Narrowband interference mitigation...Model (APM) for Mobile Radio Applications," IEEE Trans. Antennas and Propagation, vol. 54, no. 10 (October), pp. 2869–2877. [5] A. Barrios. 1995

  9. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes. More details...

  10. A neurolinguistic model of grammatical construction processing.

    Science.gov (United States)

    Dominey, Peter Ford; Hoen, Michel; Inui, Toshio

    2006-12-01

    One of the functions of everyday human language is to communicate meaning. Thus, when one hears or reads the sentence, "John gave a book to Mary," some aspect of an event concerning the transfer of possession of a book from John to Mary is (hopefully) transmitted. One theoretical approach to language referred to as construction grammar emphasizes this link between sentence structure and meaning in the form of grammatical constructions. The objective of the current research is to (1) outline a functional description of grammatical construction processing based on principles of psycholinguistics, (2) develop a model of how these functions can be implemented in human neurophysiology, and then (3) demonstrate the feasibility of the resulting model in processing languages of typologically diverse natures, that is, English, French, and Japanese. In this context, particular interest will be directed toward the processing of novel compositional structure of relative phrases. The simulation results are discussed in the context of recent neurophysiological studies of language processing.

  11. Method for Signal Processing of Electric Field Modulation Sensor in a Conductive Environment

    Directory of Open Access Journals (Sweden)

    O. I. Miseyk

    2015-01-01

    Full Text Available For investigating large bodies of water and deep oceans, the most promising instruments are modulation sensors for measuring the electric field in a conducting environment in the very-low-frequency range, used in devices for autonomous or non-autonomous vertical sounding. When using sensors of this type, it is necessary to solve the problem of extracting and measuring the modulated signal against the baseband noise. The work analyses hydrodynamic and electromagnetic noise at the input of a transducer with a "rotating" sensitive axis. From matching the measuring electrodes with the signal processing circuit, it is concluded that the proposed basic model of a transducer with a "rotating" sensitive axis is the most efficient in terms of extraction and measurement of the modulated signal against the baseband noise. It is shown that electrode rotation is undesirable when the resulting noise varies synchronously with the transducer rotation frequency (the modulation frequency), since this complicates subsequent signal-to-noise enhancement during processing. The paper justifies the choice of synchronous demodulation of the output signal, using a low-pass filter with a cutoff frequency much lower than the carrier frequency, to provide an output signal covering very-low-frequency and DC electric fields. The paper offers an original circuit to process the signals taken from the modulation sensor with a "rotating" measurement base. This circuit has advantages in sensitivity and measuring accuracy over earlier known circuits for measuring electric fields in a conducting (marine) environment in the ultralow-frequency range.
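    A minimal numerical sketch of the synchronous demodulation step described above (multiply the sensor output by a reference at the modulation, i.e. rotation, frequency, then low-pass filter with a cutoff far below the carrier) is given below; the frequencies, filter order and signal amplitudes are illustrative, not taken from the article.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs, f_mod = 2000.0, 50.0                 # sample rate and modulation (rotation) frequency, Hz
t = np.arange(0, 5.0, 1.0 / fs)
field = 1e-3 * np.ones_like(t)           # slowly varying / DC electric-field signal to recover
raw = field * np.cos(2 * np.pi * f_mod * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)

# Synchronous demodulation: mix with the reference, then low-pass well below the carrier.
mixed = raw * np.cos(2 * np.pi * f_mod * t)
b, a = butter(4, 1.0, btype="low", fs=fs)   # 1 Hz cutoff, far below the 50 Hz carrier
demod = 2.0 * filtfilt(b, a, mixed)         # factor 2 restores the original amplitude

print(f"Recovered field amplitude: {demod[t > 1.0].mean():.4f} (true value 0.0010)")
```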

  12. Aqueous Electrolytes: Model Parameters and Process Simulation

    DEFF Research Database (Denmark)

    Thomsen, Kaj

    This thesis deals with aqueous electrolyte mixtures. The Extended UNIQUAC model is used to describe the excess Gibbs energy of such solutions. Extended UNIQUAC parameters for the twelve ions Na+, K+, NH4+, H+, Cl-, NO3-, SO42-, HSO4-, OH-, CO32-, HCO3-, and S2O82- are estimated. A computer program including a steady-state process simulator for the design, simulation, and optimization of fractional crystallization processes is presented.

  13. HYDROLOGICAL PROCESSES MODELLING USING ADVANCED HYDROINFORMATIC TOOLS

    Directory of Open Access Journals (Sweden)

    BEILICCI ERIKA

    2014-03-01

    Full Text Available Water has an essential role in the functioning of ecosystems by integrating the complex physical, chemical, and biological processes that sustain life. Water is a key factor in determining the productivity of ecosystems, biodiversity and species composition. Water is also essential for humanity: water supply systems for the population, agriculture, fisheries, industries, and hydroelectric power depend on water supplies. The modelling of hydrological processes is an important activity for water resources management, especially now, when climate change is one of the major challenges of our century, with a strong influence on the dynamics of hydrological processes. Climate change and the need for more knowledge about water resources require the use of advanced hydroinformatic tools in hydrological process modelling. The rationale and purpose of advanced hydroinformatic tools is to develop a new relationship between the stakeholders and the users and suppliers of the systems: to offer the basis (systems) which supply useable results, the validity of which cannot be put in reasonable doubt by any of the stakeholders involved. Successful modelling of hydrological processes also needs well-trained specialists able to use advanced hydroinformatic tools. The results of modelling can be a useful tool for decision makers, helping them take efficient measures in the social, economic and ecological domains regarding water resources, for integrated water resources management.

  14. Refining Distributed Snowmelt Models in a Mountain Environment

    Science.gov (United States)

    Deems, J. S.; Lott, F. C.; Lundquist, J. D.

    2008-12-01

    In the radiation-dominated snowmelt environment of the Tuolumne River basin in Yosemite National Park, California, we have observed that snowmelt models using explicit representations of the snow surface energy balance are superior to bulk temperature-index models for capturing spatial and temporal variations in snowmelt. In comparing a temperature-index model (Snow-17) with two energy-balance models (the Utah Energy Balance model - UEB and the Distributed Hydrology Soil Vegetation Model - DHSVM), we found that the details in melt timing necessary for ecosystem studies are best simulated using the energy balance models. Additionally, two model-derived variables, namely shortwave albedo and snowpack surface and internal temperatures, are critical to accurate estimation of snowmelt onset, spring melt volumes, and the timing of hydrograph recessions, including when ephemeral streams go dry. We examine model representations of these variables as implemented in UEB and DHSVM using comparisons with observed streamflow and MODIS snow-covered area (SCA) and albedo derived from the MODSCAG algorithms.
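    The contrast between the two model classes can be sketched in two lines of arithmetic: a temperature-index (degree-day) scheme melts snow in proportion to air temperature above a threshold, while an energy-balance scheme converts the net surface energy flux into melt using the latent heat of fusion. The parameter values below are illustrative defaults, not those of Snow-17, UEB or DHSVM.

```python
RHO_W = 1000.0  # density of water, kg m^-3
L_F = 3.34e5    # latent heat of fusion, J kg^-1

def melt_degree_day(t_air_c, ddf_mm_per_c_day=4.0, t_base_c=0.0):
    """Temperature-index melt (mm w.e. per day)."""
    return ddf_mm_per_c_day * max(t_air_c - t_base_c, 0.0)

def melt_energy_balance(net_energy_w_m2, dt_s=86400.0):
    """Energy-balance melt (mm w.e. per time step) from the net surface energy flux."""
    melt_m = max(net_energy_w_m2, 0.0) * dt_s / (RHO_W * L_F)
    return melt_m * 1000.0  # m -> mm

print(melt_degree_day(3.0))        # ~12 mm/day at +3 degC
print(melt_energy_balance(50.0))   # ~12.9 mm/day for 50 W m^-2 of net energy
```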

  15. Fundamentals of Numerical Modelling of Casting Processes

    DEFF Research Database (Denmark)

    Pryds, Nini; Thorborg, Jesper; Lipinski, Marek

    Fundamentals of Numerical Modelling of Casting Processes comprises a thorough presentation of the basic phenomena that need to be addressed in numerical simulation of casting processes. The main philosophy of the book is to present the topics in view of their physical meaning, whenever possible. ... presents the most important aspects of solidification theory related to modelling. Part III (Chapter 5) describes the fluid flow phenomena and in Part IV (Chapter 6) the stress-strain analysis is addressed. For all parts, both numerical formulations as well as some important analytical solutions...

  16. Analytical Model of IPsec Process Throughput

    Directory of Open Access Journals (Sweden)

    Adam Tisovsky

    2012-01-01

    Full Text Available The paper is concerned with the throughput of a securing process, which cannot be described either by a constant value of bits per second or by a constant value of packets per second over the range of packet sizes. We propose a general throughput model of the IPsec process based on characteristic parameters that are independent of the packet size. These parameters might be used for a comprehensive definition of throughput for any security system. Further, a method for obtaining the characteristic parameters is proposed. Use of the method can significantly decrease the number of throughput measurements required for modelling the system.
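    One common way to express such a size-independent characterization (stated here as an assumption about the general shape, not as the article's exact formulation) is to assign every packet a fixed per-packet cost $a$ (header processing, context lookup) and a per-byte cost $b$ (encryption and authentication), so that for packet size $L$

    $$ T_{\text{bits/s}}(L) \;=\; \frac{8L}{a + bL}, \qquad T_{\text{pkt/s}}(L) \;=\; \frac{1}{a + bL}, $$

    which reproduces the observed behaviour that neither bits per second nor packets per second is constant across packet sizes, while $a$ and $b$ themselves are constants of the security system.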

  17. Kinetics and modeling of anaerobic digestion process

    DEFF Research Database (Denmark)

    Gavala, Hariklia N.; Angelidaki, Irini; Ahring, Birgitte Kiær

    2003-01-01

    Anaerobic digestion modeling started in the early 1970s when the need for design and efficient operation of anaerobic systems became evident. At that time not only was the knowledge about the complex process of anaerobic digestion inadequate but also there were computational limitations. Thus, the first models were very simple and consisted of a limited number of equations. During the past thirty years much research has been conducted on the peculiarities of the process and on the factors that influence it on the one hand while an enormous progress took place in computer science on the other. The combination of both parameters resulted in the development of more and more concise and complex models. In this chapter the most important models found in the literature are described starting from the simplest and oldest to the more recent and complex ones.

  18. Kinetics and modeling of anaerobic digestion process.

    Science.gov (United States)

    Gavala, Hariklia N; Angelidaki, Irini; Ahring, Birgitte K

    2003-01-01

    Anaerobic digestion modeling started in the early 1970s when the need for design and efficient operation of anaerobic systems became evident. At that time not only was the knowledge about the complex process of anaerobic digestion inadequate but also there were computational limitations. Thus, the first models were very simple and consisted of a limited number of equations. During the past thirty years much research has been conducted on the peculiarities of the process and on the factors that influence it on the one hand while an enormous progress took place in computer science on the other. The combination of both parameters resulted in the development of more and more concise and complex models. In this chapter the most important models found in the literature are described starting from the simplest and oldest to the more recent and complex ones.

  19. Retort process modelling for Indian traditional foods.

    Science.gov (United States)

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack food is typically a heterogeneous recipe that incorporates a variety of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate the cold-point temperature. Initial process conditions, retort temperature and % solid content were the independent variables with significant effects. A model was developed using combinations of vegetable solids and water, and was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot-based sweet product) and Upama (wheat-based snack product). The predicted and experimental temperature profiles matched within ±10 % error, which is good agreement considering that the food is a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize the retorting of various Indian traditional vegetarian foods.
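    A minimal lumped-parameter sketch of the cold-point temperature is a first-order response to the retort temperature with an assumed time constant; the published model additionally accounts for initial process conditions and % solid content, and the numbers below are illustrative only.

```python
import math

def cold_point_temperature(t_min, t_retort_c=121.0, t_initial_c=30.0, tau_min=25.0):
    """First-order lumped-parameter response: dT/dt = (T_retort - T) / tau."""
    return t_retort_c - (t_retort_c - t_initial_c) * math.exp(-t_min / tau_min)

for t in (0, 15, 30, 60):
    print(f"t = {t:3d} min  ->  cold-point T = {cold_point_temperature(t):.1f} degC")
```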

  20. A Mathematical Model of Cigarette Smoldering Process

    Directory of Open Access Journals (Sweden)

    Chen P

    2014-12-01

    Full Text Available A mathematical model for a smoldering cigarette has been proposed. In the analysis of the cigarette combustion and pyrolysis processes, a receding burning front is defined, which has a constant temperature (~450 °C) and divides the cigarette into two zones, the burning zone and the pyrolysis zone. The char combustion processes in the burning zone and the pyrolysis of virgin tobacco and evaporation of water in the pyrolysis zone are included in the model. The hot gases flowing from the burning zone are assumed to go out as sidestream smoke during smoldering. The internal heat transport is characterized by effective thermal conductivities in each zone. Thermal conduction of the cigarette paper and convective and radiative heat transfer at the outer surface were also considered. The governing partial differential equations were solved using an integral method. Model predictions of smoldering speed as well as temperature and density profiles in the pyrolysis zone for different kinds of cigarettes were found to agree with the experimental data. The model also predicts the coal length and the maximum coal temperatures during smoldering conditions. The model provides a relatively fast and efficient way to simulate the cigarette burning processes. It offers a practical tool for exploring important parameters for cigarette smoldering processes, such as tobacco components, properties of cigarette paper, and heat generation in the burning zone and its dependence on the mass burn rate.

  1. Radiation Belt Environment Model: Application to Space Weather and Beyond

    Science.gov (United States)

    Fok, Mei-Ching H.

    2011-01-01

    Understanding the dynamics and variability of the radiation belts is of great scientific and space weather significance. A physics-based Radiation Belt Environment (RBE) model has been developed to simulate and predict the radiation particle intensities. The RBE model considers the influences from the solar wind, ring current and plasmasphere. It takes into account the particle drift in realistic, time-varying magnetic and electric fields, and includes diffusive effects of wave-particle interactions with various wave modes in the magnetosphere. The RBE model has been used to perform event studies and real-time prediction of energetic electron fluxes. In this talk, we will describe the RBE model equations, inputs and capabilities. Recent advancements in space weather applications and artificial radiation belt studies will be discussed as well.

  2. Big Data X-Learning Resources Integration and Processing in Cloud Environments

    Directory of Open Access Journals (Sweden)

    Kong Xiangsheng

    2014-09-01

    Full Text Available The cloud computing platform has good flexibility characteristics, and more and more learning systems are being migrated to cloud platforms. Firstly, this paper describes different types of educational environments and the data they provide. Then, it proposes an architecture for mining, integrating and processing heterogeneous learning resources. In order to integrate and process the different types of learning resources in different educational environments, this paper proposes a novel solution, together with a massive-storage integration algorithm and a conversion algorithm, for the storage and management of heterogeneous learning resources in cloud environments.

  3. A virtual auditory environment for investigating the auditory signal processing of realistic sounds

    DEFF Research Database (Denmark)

    Favrot, Sylvain Emmanuel

    A loudspeaker-based virtual auditory environment (VAE) has been developed to provide a realistic and versatile research environment for investigating auditory signal processing in real environments, i.e., considering multiple sound sources and room reverberation. The VAE allows full control of the acoustic scenario in order to systematically study the auditory processing of reverberant sounds. It is based on the ODEON software, which is state-of-the-art software for room acoustic simulations developed at Acoustic Technology, DTU. First, a MATLAB interface to the ODEON software has been developed...

  4. Modelling hydrological processes at different scales across Russian permafrost domain

    Science.gov (United States)

    Makarieva, Olga; Lebedeva, Lyudmila; Nesterova, Natalia; Vinogradova, Tatyana

    2017-04-01

    The project aims to study the interactions between permafrost and runoff generation processes across the Russian Arctic domain based on hydrological modelling. The uniqueness of the approach is a unified modelling framework which allows for coupled simulations of upper permafrost dynamics and streamflow generation at different scales (from the soil column to large watersheds). The basis of the project is the hydrological model Hydrograph (Vinogradov et al. 2011, Semenova et al. 2013, 2015; Lebedeva et al., 2015). The model algorithms combine physically based and conceptual approaches for the description of land hydrological cycle processes, which allows for maintaining a balance between the complexity of model design and the use of limited input information. A method for modeling heat dynamics in soil is integrated into the model. The main parameters of the model are the physical properties of landscapes that may be measured (observed) in nature and are classified according to the types of soil, vegetation and other characteristics. A set of parameters specified in the studied catchments (basin analogues) can be transferred to ungauged basins with similar types of underlying surface without calibration. The results of modelling, from small research watersheds to large, poorly gauged river basins in different climate and landscape settings of the Russian Arctic (within the Yenisey, Lena, Yana, Indigirka and Kolyma river basins), will be presented. Based on the experience gained, methodological aspects of hydrological modelling approaches in permafrost environments will be discussed. The study is partially supported by the Russian Foundation for Basic Research, projects 16-35-50151 and 17-05-01138.

  5. The Development of the Proving Process Within a Dynamic Geometry Environment

    Directory of Open Access Journals (Sweden)

    Danh Nam Nguyen

    2012-10-01

    Full Text Available In this paper we classify students' proving levels and design an interactive help system (IHS) corresponding to these levels in order to investigate the development of the proving process within a dynamic geometry environment. This help system was also used to provide tertiary students with a strategy for proving and to improve their proving levels. The open-ended questions and explorative tasks in the IHS contribute to supporting students' learning of proving, especially during the processes of realizing invariants, formulating conjectures, producing arguments, and writing proofs. This research responds to students' well-known difficulties in writing a formal proof. The hypothesis of this work is that these difficulties stem from students' lack of understanding of the relationship between argumentation and proof. Therefore, we used the Toulmin model to analyze students' argumentation structure and examine the role of abduction in writing a deductive proof. Furthermore, this paper also provides mathematics teachers with three basic conditions for understanding the development of the proving process and with teaching strategies for assisting their students in constructing formal proofs.

  6. Processing and Modeling of Porous Copper Using Sintering Dissolution Process

    Science.gov (United States)

    Salih, Mustafa Abualgasim Abdalhakam

    The growth of porous metal has produced materials with improved properties as compared to non-metals and solid metals. Porous metal can be classified as either open cell or closed cell. Open cell allows a fluid media to pass through it. Closed cell is made up of adjacent sealed pores with shared cell walls. Metal foams offer higher strength to weight ratios, increased impact energy absorption, and a greater tolerance to high temperatures and adverse environmental conditions when compared to bulk materials. Copper and its alloys are examples of these, well known for high strength and good mechanical, thermal and electrical properties. In the present study, the porous Cu was made by a powder metallurgy process, using three different space holders, sodium chloride, sodium carbonate and potassium carbonate. Several different samples have been produced, using different ratios of volume fraction. The densities of the porous metals have been measured and compared to the theoretical density calculated using an equation developed for these foams. The porous structure was determined with the removal of spacer materials through sintering process. The sintering process of each spacer material depends on the melting point of the spacer material. Processing, characterization, and mechanical properties were completed. These tests include density measurements, compression tests, computed tomography (CT) and scanning electron microscopy (SEM). The captured morphological images are utilized to generate the object-oriented finite element (OOF) analysis for the porous copper. Porous copper was formed with porosities in the range of 40-66% with density ranges from 3 to 5.2 g/cm3. A study of two different methods to measure porosity was completed. OOF (Object Oriented Finite Elements) is a desktop software application for studying the relationship between the microstructure of a material and its overall mechanical, dielectric, or thermal properties using finite element models based on

  7. Extending MBI Model using ITIL and COBIT Processes

    Directory of Open Access Journals (Sweden)

    Sona Karkoskova

    2015-10-01

    Full Text Available Most organizations today operate in a highly complex and competitive business environment and need to be able to react to rapidly changing market conditions. IT management frameworks are widely used to provide effective support for business objectives by aligning IT with business and optimizing the use of IT resources. In this paper we analyze three IT management frameworks (ITIL, COBIT and MBI) with the objective of identifying the relationships between these frameworks and mapping ITIL and COBIT processes to MBI tasks. As a result of this analysis we propose extensions to the MBI model to incorporate IT Performance Management and a Capability Maturity Model.

  8. Theoretical Modelling of Intercultural Communication Process

    OpenAIRE

    Mariia Soter

    2016-01-01

    The definitions of the concepts of “communication”, “intercultural communication” and “model of communication” are analyzed in the article. The basic components of the communication process are singled out. A model of intercultural communication is developed. Communicative, behavioral and complex skills for the optimal organization of intercultural communication, the establishment of productive contact with a foreign partner to achieve mutual understanding, and the search for acceptable ways of organizing i...

  9. Process model development for optimization of forged disk manufacturing processes

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, C.E.; Gunasekera, J.S. [Ohio Univ., Athens, OH (United States). Center for Advanced Materials Processing; Malas, J.C. [Wright Labs., Wright Patterson AFB, OH (United States). Materials Directorate

    1997-12-31

    This paper addresses the development of a system which will enable the optimization of an entire processing sequence for a forged part. Typically such a sequence may involve several stages and alternative routes for manufacturing a given part. It is important that such a system be optimized globally (rather than locally, as is the current practice) in order to achieve improvements in affordability, producibility, and performance. This paper demonstrates the development of a simplified forging model, discusses techniques for searching and reducing a very large design space, and presents an objective function to evaluate the cost of a design sequence.

  10. Various Models for Reading Comprehension Process

    OpenAIRE

    Parastoo Babashamsi; Saeideh Bolandifar; Nahid Shakib

    2013-01-01

    In recent years reading has been viewed as a process, as a form of thinking, as a true experience, and as a tool subject. As a process, reading includes visual discrimination, independent recognition of words, rhythmic progression along a line of print, precision in the return sweep of the eyes, and adjustment of rate. Along the same lines, the present paper aims at considering the various models of the reading process. Moreover, the paper takes a look at various factors such as schema and vocabular...

  11. Modelling novel coal based direct reduction process

    Energy Technology Data Exchange (ETDEWEB)

    Shi, J.Y.; Donskoi, E.; McElwain, D.L.S.; Wibberley, L.J. [Queensland University of Technology, Brisbane, Qld. (Australia). School of Mathematical Science

    2008-01-15

    The present paper develops a one-dimensional model of a novel coal based iron ore direct reduction process. In this process, a mixture of iron ore, coal fines and small amount of binder is made into pellets and these are placed in a bed. Air is forced upward through the pellet bed and provides oxygen for the volatiles and part of the coal in the pellets to be burnt. Initially the pellet bed is heated from the top. As the temperature of the top level of pellets increases, they start to evolve pyrolytic matter which is ignited and, as a consequence, the pellets at lower levels in the bed are heated. In this way, a flame propagates downward through the bed. The iron ore reacts with the gases evolved from the coal (including volatiles) and carbon in the coal and undergoes reduction. The model presented in the article simulates the processes occurring in the solid and gaseous phases. In the solid phase, it uses a novel porous medium model consisting of porous pellets in a porous bed with two associated porosities. The model includes equations for energy balance, reactions of iron oxide with carbon monoxide and hydrogen, coal pyrolysis and reactions between the gas components in the voids. The model shows that a rapidly increasing temperature front can travel downward through the bed if the air is supplied for long enough. The predictions of the modelling are discussed and compared with observations obtained from an experimental rig.

  12. Modeling stroke rehabilitation processes using the Unified Modeling Language (UML).

    Science.gov (United States)

    Ferrante, Simona; Bonacina, Stefano; Pinciroli, Francesco

    2013-10-01

    In organising and providing rehabilitation procedures for stroke patients, the usual need for many refinements makes it inappropriate to attempt rigid standardisation, but greater detail is required concerning workflow. The aim of this study was to build a model of the post-stroke rehabilitation process. The model, implemented in the Unified Modeling Language, was grounded on international guidelines and refined following the clinical pathway adopted at local level by a specialized rehabilitation centre. The model describes the organisation of rehabilitation delivery and facilitates the monitoring of recovery during the process. Indeed, a software system was developed and tested to support clinicians in the digital administration of clinical scales. The model's flexibility assures easy updating as the process evolves. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Derivative processes for modelling metabolic fluxes

    Science.gov (United States)

    Žurauskienė, Justina; Kirk, Paul; Thorne, Thomas; Pinney, John; Stumpf, Michael

    2014-01-01

    Motivation: One of the challenging questions in modelling biological systems is to characterize the functional forms of the processes that control and orchestrate molecular and cellular phenotypes. Recently proposed methods for the analysis of metabolic pathways, for example, dynamic flux estimation, can only provide estimates of the underlying fluxes at discrete time points but fail to capture the complete temporal behaviour. To describe the dynamic variation of the fluxes, we additionally require the assumption of specific functional forms that can capture the temporal behaviour. However, it also remains unclear how to address the noise which might be present in experimentally measured metabolite concentrations. Results: Here we propose a novel approach to modelling metabolic fluxes: derivative processes that are based on multiple-output Gaussian processes (MGPs), which are a flexible non-parametric Bayesian modelling technique. The main advantages that follow from the MGP approach include the natural non-parametric representation of the fluxes and the ability to impute missing data in between the measurements. Our derivative process approach allows us to model changes in metabolite derivative concentrations and to characterize the temporal behaviour of metabolic fluxes from time course data. Because the derivative of a Gaussian process is itself a Gaussian process, we can readily link metabolite concentrations to metabolic fluxes and vice versa. Here we discuss how this can be implemented in an MGP framework and illustrate its application to simple models, including nitrogen metabolism in Escherichia coli. Availability and implementation: R code is available from the authors upon request. Contact: j.norkunaite@imperial.ac.uk; m.stumpf@imperial.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24578401
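
    The key property the abstract relies on, namely that the derivative of a Gaussian process is again a Gaussian process, can be illustrated with a short sketch. The Python snippet below (assuming only NumPy) builds the joint covariance of a squared-exponential GP and its derivative and draws a joint sample; it illustrates the underlying identity and is not the authors' multiple-output GP implementation (their R code is stated to be available on request).

        import numpy as np

        def sq_exp(d, sigma=1.0, ell=1.0):
            # Squared-exponential kernel evaluated on a matrix of differences d = x - x'.
            return sigma**2 * np.exp(-d**2 / (2 * ell**2))

        def joint_cov(x, sigma=1.0, ell=1.0):
            # Joint covariance of (f(x), f'(x)) for a GP f with a squared-exponential kernel.
            d = x[:, None] - x[None, :]
            k = sq_exp(d, sigma, ell)
            k_f_df = d / ell**2 * k                        # Cov(f(x_i), f'(x_j))
            k_df_df = (1.0 / ell**2 - d**2 / ell**4) * k   # Cov(f'(x_i), f'(x_j))
            return np.block([[k, k_f_df], [k_f_df.T, k_df_df]])

        x = np.linspace(0.0, 10.0, 50)
        cov = joint_cov(x, sigma=1.0, ell=1.5)
        rng = np.random.default_rng(0)
        sample = rng.multivariate_normal(np.zeros(2 * x.size), cov + 1e-6 * np.eye(2 * x.size))
        f, df = sample[:x.size], sample[x.size:]   # one joint draw of the process and its derivative

    In the metabolic setting, f would play the role of a metabolite concentration and df its time derivative, which is what allows concentrations and fluxes to be linked in both directions.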

  14. Estimation, modeling, and simulation of patterned growth in extreme environments.

    Science.gov (United States)

    Strader, B; Schubert, K E; Quintana, M; Gomez, E; Curnutt, J; Boston, P

    2011-01-01

    In the search for life on Mars and other extraterrestrial bodies or in our attempts to identify biological traces in the most ancient rock record of Earth, one of the biggest problems facing us is how to recognize life or the remains of ancient life in a context very different from our planet's modern biological examples. Specific chemistries or biological properties may well be inapplicable to extraterrestrial conditions or ancient Earth environments. Thus, we need to develop an arsenal of techniques that are of broader applicability. The notion of patterning created in some fashion by biological processes and properties may provide such a generalized property of biological systems no matter what the incidentals of chemistry or environmental conditions. One approach to recognizing these kinds of patterns is to look at apparently organized arrangements created and left by life in extreme environments here on Earth, especially at various spatial scales, different geologies, and biogeochemical circumstances.

  15. Modeling higher education attractiveness to stand global environment

    Directory of Open Access Journals (Sweden)

    Leonel Cezar Rodrigues

    2016-04-01

    Full Text Available An inability to deal with the changing environment may lead Higher Education Institutions (HEIs) to lose institutional attractiveness. Digital transformation requires global insertion as an essential feature of institutional attractiveness. Processes for international education seem to lack the links between real environmental trends and the internal capabilities for global education. HEI managers may approach endeavors to internationalize education by combining an ambidextrous strategy supported by consolidated resilience capabilities. The latter refers to building internal value attributes to increase institutional attractiveness, assuring a solid standing in the global environment. In this article, a theoretical essay, we approach the problem of creating resilience as a way of backing up ambidexterity to generate institutional attractiveness. The set of value attributes, on the other hand, may originate strategic routes to strengthen internal competences and to make the institution more attractive, as a dynamic capability.

  16. Ant-mediated ecosystem processes are driven by trophic community structure but mainly by the environment.

    Science.gov (United States)

    Salas-Lopez, Alex; Mickal, Houadria; Menzel, Florian; Orivel, Jérôme

    2017-01-01

    The diversity and functional identity of organisms are known to be relevant to the maintenance of ecosystem processes but can be variable in different environments. Particularly, it is uncertain whether ecosystem processes are driven by complementary effects or by dominant groups of species. We investigated how community structure (i.e., the diversity and relative abundance of biological entities) explains the community-level contribution of Neotropical ant communities to different ecosystem processes in different environments. Ants were attracted with food resources representing six ant-mediated ecosystem processes in four environments: ground and vegetation strata in cropland and forest habitats. The exploitation frequencies of the baits were used to calculate the taxonomic and trophic structures of ant communities and their contribution to ecosystem processes considered individually or in combination (i.e., multifunctionality). We then investigated whether community structure variables could predict ecosystem processes and whether such relationships were affected by the environment. We found that forests presented a greater biodiversity and trophic complementarity and lower dominance than croplands, but this did not affect ecosystem processes. In contrast, trophic complementarity was greater on the ground than on vegetation and was followed by greater resource exploitation levels. Although ant participation in ecosystem processes can be predicted by means of trophic-based indices, we found that variations in community structure and performance in ecosystem processes were best explained by environment. We conclude that determining the extent to which the dominance and complementarity of communities affect ecosystem processes in different environments requires a better understanding of resource availability to different species.

  17. Model of an excitatory synapse based on stochastic processes.

    Science.gov (United States)

    L'Espérance, Pierre-Yves; Labib, Richard

    2013-09-01

    We present a mathematical model of a biological synapse based on stochastic processes to establish the temporal behavior of the postsynaptic potential following a quantal synaptic transmission. This potential form is the basis of the neural code. We suppose that the release of neurotransmitters in the synaptic cleft follows a Poisson process, and that they diffuse according to integrated Ornstein-Uhlenbeck processes in 3-D with random initial positions and velocities. The diffusion occurs in an isotropic environment between two infinite parallel planes representing the pre- and postsynaptic membranes. We assume that the presynaptic membrane is perfectly reflecting and that the other is perfectly absorbing. The activation of the receptors polarizes the postsynaptic membrane according to a parallel RC circuit scheme. We present results obtained by simulations based on a Gillespie algorithm and show that our model exhibits realistic postsynaptic behaviors from a simple quantal occurrence.
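
    A minimal simulation sketch of two ingredients named in the abstract, Poisson-timed quantal release and integrated Ornstein-Uhlenbeck motion across the cleft with a reflecting presynaptic and an absorbing postsynaptic boundary, is given below in Python. All parameter values are dimensionless placeholders rather than the paper's values, and the receptor activation and RC-circuit stages are omitted.

        import numpy as np

        rng = np.random.default_rng(1)

        # Dimensionless, purely illustrative parameters (not those of the paper).
        rate = 5.0        # Poisson release rate of neurotransmitter quanta
        t_total = 2.0     # observation window
        L = 1.0           # cleft width: presynaptic membrane at z=0, postsynaptic at z=L
        tau = 1.0         # Ornstein-Uhlenbeck velocity relaxation time
        sigma_v = 1.0     # velocity noise intensity
        dt = 1e-3         # integration step

        # Release times follow a Poisson process.
        release_times = np.cumsum(rng.exponential(1.0 / rate, size=50))
        release_times = release_times[release_times < t_total]

        def first_passage(max_steps=100_000):
            # Integrated OU motion across the cleft: reflecting at z=0, absorbing at z=L.
            z, v = 0.0, rng.standard_normal()
            for step in range(max_steps):
                v += -v / tau * dt + sigma_v * np.sqrt(dt) * rng.standard_normal()
                z += v * dt
                if z < 0.0:      # presynaptic membrane is perfectly reflecting
                    z, v = -z, -v
                if z >= L:       # postsynaptic membrane is perfectly absorbing
                    return step * dt
            return np.inf

        # Delay between release and arrival at the postsynaptic membrane for each quantum.
        arrival_delays = np.array([first_passage() for _ in release_times])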

  18. Programmatic access to logical models in the Cell Collective modeling environment via a REST API.

    Science.gov (United States)

    Kowal, Bryan M; Schreier, Travis R; Dauer, Joseph T; Helikar, Tomáš

    2016-01-01

    Cell Collective (www.cellcollective.org) is a web-based interactive environment for constructing, simulating and analyzing logical models of biological systems. Herein, we present a Web service to access models, annotations, and simulation data in the Cell Collective platform through a Representational State Transfer (REST) Application Programming Interface (API). The REST API provides a convenient method for obtaining Cell Collective data from almost any programming language. To ensure easy processing of the retrieved data, the request output from the API is available in a standard JSON format. The Cell Collective REST API is freely available at http://thecellcollective.org/tccapi. All public models in Cell Collective are available through the REST API. Users interested in creating and accessing their own models through the REST API first need to create an account in Cell Collective (http://thecellcollective.org). Contact: thelikar2@unl.edu. Technical user documentation: https://goo.gl/U52GWo. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
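
    Since the output is standard JSON, the API can be queried from essentially any HTTP client. The Python sketch below uses the API root given above; the resource path /models and the id/name fields are assumptions made for illustration, so the actual routes and response schema should be taken from the technical documentation linked in the record.

        import requests

        BASE = "http://thecellcollective.org/tccapi"   # REST API root given in the abstract

        # Hypothetical resource path and fields; consult the documentation for the real routes.
        resp = requests.get(f"{BASE}/models", timeout=30)
        resp.raise_for_status()

        for model in resp.json():                      # responses are standard JSON
            print(model.get("id"), model.get("name"))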

  19. Nutrition Care Process Implementation: Experiences in Various Dietetics Environments in Sweden.

    Science.gov (United States)

    Lövestam, Elin; Boström, Anne-Marie; Orrevall, Ylva

    2017-11-01

    The Nutrition Care Process (NCP) and Nutrition Care Process Terminology (NCPT) are currently being implemented by nutrition and dietetics practitioners all over the world. Several advantages have been related to this implementation, such as consistency and clarity of dietetics-related health care records and the possibility to collect and research patient outcomes. However, little is known about dietitians' experiences of the implementation process. The aim of this qualitative study was to explore Swedish dietitians' experiences of the NCP implementation process in different dietetics environments. Thirty-seven Swedish dietitians from 13 different dietetics workplaces participated in seven focus group discussions that were audiotaped and carefully transcribed. A thematic secondary analysis was performed, after which all the discussions were re-read, following the implementation narrative from each workplace. In the analysis, The Promoting Action on Research Implementation in Health Services implementation model was used as a framework. Main categories identified in the thematic analysis were leadership and implementation strategy, the group and colleagues, the electronic health record, and evaluation. Three typical cases are described to illustrate the diversity of these aspects in dietetics settings: Case A represents a small hospital with an inclusive leadership style and discussion-friendly culture where dietitians had embraced the NCP/NCPT implementation. Case B represents a larger hospital with a more hierarchical structure where dietitians were more ambivalent toward NCP/NCPT implementation. Case C represents the only dietitian working at a small multiprofessional primary care center who received no dietetics-related support from management or colleagues. She had not started NCP/NCPT implementation. The diversity of dietetics settings and their different prerequisites should be considered in the development of NCP/NCPT implementation strategies. Tailored

  20. Integrated Process Model on Intercultural Competence

    Directory of Open Access Journals (Sweden)

    Diana Bebenova - Nikolova

    2016-08-01

    Full Text Available The paper proposes an integrated model of intercultural competence, which attempts to present intercultural communication and competence from the point of view of the dialectical approach described by Martin and Nakayama (2010). The suggested concept builds on previously developed and accepted models, both structure-oriented and process-oriented. At the same time it follows the principles of the “Theory of Models” as outlined by Balboni and Caon (2014). In the near future, the model will be applied to assess the intercultural competence of cross-border project teams working under the CBC programme between Romania and Bulgaria 2007-2014.

  1. A 3D visualization approach for process training in office environments

    NARCIS (Netherlands)

    Aysolmaz, Banu; Brown, Ross; Bruza, Peter; Reijers, Hajo A.

    2016-01-01

    Process participants need to learn how to perform in the context of their business processes. Process training is challenging due to cognitive difficulties in relating process model elements to real world concepts. In this paper we present a 3D VirtualWorld (VW) process training approach for office

  2. Cognitive Virtualization: Combining Cognitive Models and Virtual Environments

    Energy Technology Data Exchange (ETDEWEB)

    Tuan Q. Tran; David I. Gertman; Donald D. Dudenhoeffer; Ronald L. Boring; Alan R. Mecham

    2007-08-01

    3D manikins are often used in visualizations to model human activity in complex settings. Manikins assist in developing an understanding of human actions, movements and routines in a variety of different environments representing new conceptual designs. One such environment is a nuclear power plant control room, where manikins have the potential to be used to simulate more precise ergonomic assessments of human work stations. Next generation control rooms will pose numerous challenges for system designers. The manikin modeling approach by itself, however, may be insufficient for dealing with the desired technical advancements and challenges of next generation automated systems. Uncertainty regarding effective staffing levels, and the potential for negative human performance consequences in the presence of advanced automated systems (e.g., reduced vigilance, poor situation awareness, mistrust or blind faith in automation, higher information load and increased complexity), call for further research. Baseline assessments of novel control room equipment and configurations need to be conducted. These design uncertainties can be reduced through complementary analysis that merges ergonomic manikin models with models of higher cognitive functions, such as attention, memory, decision-making, and problem-solving. This paper discusses recent advancements in merging a theory-driven cognitive modeling framework with a 3D visualization modeling tool for the evaluation of next generation control room human factors and ergonomics. Though this discussion primarily focuses on control room design, the application of such a merger between 3D visualization and cognitive modeling can be extended to various areas of focus such as training and scenario planning.

  3. Management of Ecological-Economic Processes of Pollution Accumulation and Assimilation in the Coastal Zone Marine Environment

    Directory of Open Access Journals (Sweden)

    I.E. Timchenko

    2017-02-01

    Full Text Available A model for managing the balance between the pollution entering the sea with coastal runoff and its assimilation and accumulation, based on negative feedback between the efficiency of the coastal economic system and penalties for polluting the sea coastal zone, is proposed. The model is constructed by the Adaptive Balance of Causes method and is intended for finding a rational balance between the profit from using the assimilative resources of the marine environment and the costs of maintaining its quality. The increase of pollution in the coastal zone is taken as proportional to the volume of product sales. The decrease of pollution concentration is related to the environment protection activities paid for by the producers. The model contains agents for managing the volume of generalized production released by the economic system. The agents control the pollution accumulation rate at the different stages of the bio-chemical processes responsible for the natural purification of the marine environment. A scenario analysis of ecological-economic processes in the “Land-Sea” system is carried out, and the dependencies of the economic subsystem's production profitability on penalty sanctions limiting the pollutant flux entering the sea are constructed. The effect of sea temperature and water mass dynamics on these processes is considered, and scenarios of their intra-annual variability are constructed. It is shown that taking the sea temperature and near-water wind into account in the model has a significant effect on the marine environment pollution level and on production profitability. The conclusion is that the proposed adaptive simulation model “Sea-Land” can be used for forecasting scenarios of the coastal subsystem's production processes (the volume of generalized product manufacturing, production cost, profitability) in parallel with forecast scenarios of pollution concentration in the sea.
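
    The negative feedback described above (production adds pollution, pollution above a threshold triggers penalties, penalties reduce profitability and damp further growth) can be caricatured with a few difference equations. The Python toy below only illustrates that loop; the coefficients are arbitrary and it is not the Adaptive Balance of Causes formulation used in the paper.

        # Toy discrete-time illustration of the feedback loop described in the abstract;
        # all coefficients are arbitrary and are not taken from the paper.
        steps = 200
        alpha = 0.05      # pollution added to the sea per unit of production
        beta = 0.10       # natural assimilation (self-purification) rate
        gamma = 3.0       # penalty per unit of pollution above the allowed threshold
        threshold = 1.0

        production, pollution = 1.0, 0.0
        history = []
        for t in range(steps):
            pollution += alpha * production - beta * pollution
            penalty = gamma * max(pollution - threshold, 0.0)
            profit = production - penalty                  # penalties reduce profitability
            production = max(production + 0.1 * profit - 0.05 * penalty, 0.0)
            history.append((t, pollution, production, profit))

    With these coefficients the run settles at a production level where the gain from extra output is offset by the penalty it triggers, which is the kind of rational balance such a model is meant to locate.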

  4. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.

  5. Querying Business Process Models with VMQL

    DEFF Research Database (Denmark)

    Störrle, Harald; Acretoaie, Vlad

    2013-01-01

    In this paper, we apply VMQL to the Business Process Modeling Notation (BPMN) to evaluate the second claim. We explore the adaptations required, and re-evaluate the usability of VMQL in this context. We find similar results to earlier work, thus both supporting our claims and establishing the usability of VMQL...

  6. Modeling of Reaction Processes Controlled by Diffusion

    CERN Document Server

    Revelli, J

    2003-01-01

    Stochastic modeling is quite powerful in science and technology. The techniques derived from this approach have been used with great success in laser theory, biological systems and chemical reactions. In addition, they provide a theoretical framework for the analysis of experimental results in the field of particle diffusion in ordered and disordered materials. In this work we analyze transport processes in one-dimensional fluctuating media, that is, media that change their state in time. This induces changes in the movements of the particles, giving rise to different phenomena and dynamics that are described and analyzed in this work. We present some random walk models to describe these fluctuating media. These models include state transitions governed by different dynamical processes. We also analyze the trapping problem on a lattice by means of a simple model which predicts a resonance-like phenomenon. We further study effective diffusion processes over surfaces due to random walks in the bulk. We consider differe...
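
    As a toy illustration of a random walk in a fluctuating medium (not one of the specific models analyzed in the work), the Python sketch below lets a one-dimensional lattice walker hop with a bias that depends on the current state of the medium, while the medium itself switches randomly between two states.

        import numpy as np

        rng = np.random.default_rng(2)

        steps = 10_000
        switch_prob = 0.01            # probability per step that the medium changes state
        p_right = {0: 0.5, 1: 0.7}    # hopping bias in each state of the medium (illustrative)

        state, x = 0, 0
        trajectory = []
        for _ in range(steps):
            if rng.random() < switch_prob:
                state = 1 - state     # the medium fluctuates in time
            x += 1 if rng.random() < p_right[state] else -1
            trajectory.append(x)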

  7. Mathematical Modelling of Continuous Biotechnological Processes

    Science.gov (United States)

    Pencheva, T.; Hristozov, I.; Shannon, A. G.

    2003-01-01

    Biotechnological processes (BTP) are characterized by a complicated structure of organization and interdependent characteristics. Partial differential equations or systems of partial differential equations are used for their behavioural description as objects with distributed parameters. Modelling of substrate without regard to dispersion…

  8. Modeling as a Decision-Making Process

    Science.gov (United States)

    Bleiler-Baxter, Sarah K.; Stephens, D. Christopher; Baxter, Wesley A.; Barlow, Angela T.

    2017-01-01

    The goal in this article is to support teachers in better understanding what it means to model with mathematics by focusing on three key decision-making processes: Simplification, Relationship Mapping, and Situation Analysis. The authors use the Theme Park task to help teachers develop a vision of how students engage in these three decision-making…

  9. The Student Selection Process: A Model.

    Science.gov (United States)

    Saunders, J. A.; Lancaster, G. A.

    1980-01-01

    Student admission criteria and a college's advising and recruitment efforts are viewed from the perspective of a growing higher education establishment trying to attract students. A student selection model is proposed, derived from an "innovation-decision process" (Rogers and Shoemaker), which focuses on applicant behavior and decision-making…

  10. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  11. Content, Process, and Product: Modeling Differentiated Instruction

    Science.gov (United States)

    Taylor, Barbara Kline

    2015-01-01

    Modeling differentiated instruction is one way to demonstrate how educators can incorporate instructional strategies to address students' needs, interests, and learning styles. This article discusses how secondary teacher candidates learn to focus on content--the "what" of instruction; process--the "how" of instruction;…

  12. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until layer-by-layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost - many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  13. Reaction time for processing visual stimulus in a computer-assisted rehabilitation environment.

    Science.gov (United States)

    Sanchez, Yerly; Pinzon, David; Zheng, Bin

    2017-10-01

    To examine the reaction time when human subjects process information presented in the visual channel under both direct vision and a virtual rehabilitation environment while walking. The visual stimulus consisted of eight math problems displayed in the peripheral vision of seven healthy human subjects in a virtual rehabilitation training environment (computer-assisted rehabilitation environment, CAREN) and in a direct vision environment. Subjects were required to verbally report the results of these math calculations within a short period of time. Reaction time, measured by a Tobii eye tracker, and calculation accuracy were recorded and compared between the direct vision and virtual rehabilitation environments. Performance outcomes measured for both groups included reaction time, reading time, answering time and the verbal answer score. A significant difference between the groups was only found for reaction time (p = .004). Participants had more difficulty recognizing the first equation in the virtual environment. Participants' reaction time was faster in the direct vision environment. This reaction time delay should be kept in mind when designing skill training scenarios in virtual environments. This was a pilot project for a series of studies assessing the cognitive ability of stroke patients undertaking a rehabilitation program in a virtual training environment. Implications for rehabilitation: eye tracking is a reliable tool that can be employed in rehabilitation virtual environments, and reaction time changes between direct vision and virtual environments.

  14. Development of climate data storage and processing model

    Science.gov (United States)

    Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.

    2016-11-01

    We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on socio-economic processes at local and regional scales. The model is based on a «shared nothing» distributed computing architecture and assumes the use of a computing network where each computing node is independent and self-sufficient. Each node holds dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data is represented by collections of netCDF files stored in a hierarchy of directories within a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed according to the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for the development of a high-performance virtual research environment for climatic and environmental monitoring.
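
    The metadata database idea can be sketched as a small catalogue that records, for each netCDF collection, what it contains and on which node it lives, so that a search can locate data before any processing is dispatched. The SQLite schema and field names in the Python sketch below are invented for illustration and are not the project's actual layout.

        import sqlite3

        # Illustrative metadata catalogue for distributed netCDF collections (schema is assumed).
        conn = sqlite3.connect("climate_metadata.db")
        conn.execute("""
            CREATE TABLE IF NOT EXISTS datasets (
                dataset_id TEXT PRIMARY KEY,
                variable   TEXT,     -- e.g. near-surface air temperature
                time_start TEXT,
                time_end   TEXT,
                node_url   TEXT,     -- computing node holding the netCDF files
                path       TEXT      -- directory of the collection on that node
            )""")
        conn.execute(
            "INSERT OR REPLACE INTO datasets VALUES (?, ?, ?, ?, ?, ?)",
            ("reanalysis_t2m", "t2m", "1979-01-01", "2015-12-31",
             "node03.example.org", "/data/netcdf/t2m"),
        )
        conn.commit()

        # A search over the catalogue returns node and path before processing is dispatched.
        rows = conn.execute(
            "SELECT node_url, path FROM datasets WHERE variable = ? AND time_end >= ?",
            ("t2m", "2000-01-01"),
        ).fetchall()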

  15. Operational SAR Data Processing in GIS Environments for Rapid Disaster Mapping

    Science.gov (United States)

    Bahr, Thomas

    2014-05-01

    DEM without the need of ground control points. This step includes radiometric calibration. (3) A subsequent change detection analysis generates the final map showing the extent of the flash flood on Nov. 5th 2010. The underlying algorithms are provided by three different sources: Geocoding & radiometric calibration (2) is a standard functionality from the commercial SARscape Toolbox for ArcGIS. This toolbox is extended by the filter tool (1), which is called from the SARscape modules in ENVI. The change detection analysis (3) is based on ENVI processing routines and scripted with IDL. (2) and (3) are integrated with ArcGIS using a predefined Python interface. These 3 processing steps are combined using the ArcGIS ModelBuilder to create a new model for rapid disaster mapping in ArcGIS, based on SAR data. Moreover, this model can be dissolved from its desktop environment and published to users across the ArcGIS Server enterprise. Thus disaster zones, e.g. after severe flooding, can be automatically identified and mapped to support local task forces - using an operational workflow for SAR image analysis, which can be executed by the responsible operators without SAR expert knowledge.

  16. The Draw-an-Environment Test Rubric (DAET-R): Exploring Pre-Service Teachers' Mental Models of the Environment

    Science.gov (United States)

    Moseley, Christine; Desjean-Perrotta, Blanche; Utley, Julianna

    2010-01-01

    The use of drawings as representations of personal mental models or images is one method of analyzing personal beliefs. This article discusses the development of the Draw-An-Environment Test and Rubric (DAET-R) for assessing the mental models or images of the environment held by pre-service teachers. It also provides results of preliminary…

  17. Learning Markov Decision Processes for Model Checking

    DEFF Research Database (Denmark)

    Mao, Hua; Chen, Yingke; Jaeger, Manfred

    2012-01-01

    Constructing an accurate system model for formal model verification can be both resource demanding and time-consuming. To alleviate this shortcoming, algorithms have been proposed for automatically learning system models based on observed system behaviors. In this paper we extend the algorithm on learning probabilistic automata to reactive systems, where the observed system behavior is in the form of alternating sequences of inputs and outputs. We propose an algorithm for automatically learning a deterministic labeled Markov decision process model from the observed behavior of a reactive system. The proposed learning algorithm is adapted from algorithms for learning deterministic probabilistic finite automata, and extended to include both probabilistic and nondeterministic transitions. The algorithm is empirically analyzed and evaluated by learning system models of slot machines. The evaluation...
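
    The counting core of learning from alternating input/output sequences can be sketched in a few lines of Python: observed history prefixes are treated as states and the conditional output frequencies per input are estimated from the traces. The published algorithm additionally merges compatible states, in the spirit of learning deterministic probabilistic finite automata, so the snippet below (with made-up slot-machine traces) shows only the first ingredient, not the algorithm itself.

        from collections import defaultdict

        # Observed behavior: alternating input/output sequences from a reactive system (made up).
        traces = [
            [("coin", "ready"), ("spin", "lose"), ("spin", "win")],
            [("coin", "ready"), ("spin", "lose"), ("spin", "lose")],
        ]

        # Frequency estimates of P(output | state, input), with a state identified by the
        # observed history prefix that leads to it.
        counts = defaultdict(lambda: defaultdict(int))
        for trace in traces:
            state = ()
            for inp, out in trace:
                counts[(state, inp)][out] += 1
                state = state + ((inp, out),)

        def output_distribution(state, inp):
            c = counts[(state, inp)]
            total = sum(c.values())
            return {out: n / total for out, n in c.items()} if total else {}

        print(output_distribution((("coin", "ready"),), "spin"))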

  18. Processing and performance of topobathymetric lidar data for geomorphometric and morphological classification in a high-energy tidal environment

    DEFF Research Database (Denmark)

    Andersen, Mikkel Skovgaard; Gergely, Áron; Al-Hamdani, Zyad K.

    2017-01-01

    processing steps involve data filtering, water surface detection and refraction correction. Specifically, the procedure of water surface detection and modelling, solely using green laser lidar data, has not previously been described in detail for tidal environments. The aim of this study was to fill this gap... of detecting features with a size of less than 1 m². The derived high-resolution DEM was applied for detection and classification of geomorphometric and morphological features within the natural environment of the study area. Initially, the bathymetric position index (BPI) and the slope of the DEM were used...
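
    Two of the DEM derivatives mentioned above, the bathymetric position index (BPI) and slope, are straightforward to compute on a gridded DEM. The Python sketch below uses a square moving-window mean for the BPI and finite-difference gradients for slope; the window size, cell size and random placeholder grid are illustrative choices, not the study's processing parameters.

        import numpy as np
        from scipy import ndimage

        def bathymetric_position_index(dem, window=25):
            # BPI: cell depth minus the mean depth of its neighbourhood
            # (a square window here; annulus footprints are a common variant).
            neighbourhood_mean = ndimage.uniform_filter(dem, size=window, mode="nearest")
            return dem - neighbourhood_mean

        def slope_degrees(dem, cell_size=1.0):
            # Slope of the DEM surface in degrees from finite-difference gradients.
            dz_dy, dz_dx = np.gradient(dem, cell_size)
            return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

        dem = np.random.default_rng(3).normal(-5.0, 0.5, size=(200, 200))   # placeholder grid
        bpi = bathymetric_position_index(dem)
        slope = slope_degrees(dem, cell_size=0.5)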

  19. Educational Process Reengineering and Diffusion of Innovation in Formal Learning Environment

    DEFF Research Database (Denmark)

    Khalid, Md. Saifuddin; Hossain, Mohammad Shahadat; Rongbutsri, Nikorn

    2011-01-01

    In technology-mediated learning, while the relative advantages of technologies have been proven, a lack of contextualization and process-centric change, and a lack of user-driven change, have kept the intervention and adoption of educational technologies among individuals and organizations as challenges. Reviewing the formal, informal and non-formal learning environments, this study focuses on the formal part. This paper coins the term 'Educational Process Reengineering' (EPR), based on the established concept of 'Business Process Reengineering' (BPR), for process improvement of teaching-learning activities, academic... and addresses root causes. Future work is to demonstrate in detail the use of the proposed conceptual process design for integrated education process reengineering and diffusion reasoning...

  20. Modeling veterans healthcare administration disclosure processes :

    Energy Technology Data Exchange (ETDEWEB)

    Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.

    2013-09-01

    As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.

  1. Temperature Modelling of the Biomass Pretreatment Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Blanke, Mogens; Jensen, Jakob M.

    2012-01-01

    In a second generation biorefinery, the biomass pretreatment stage has an important contribution to the efficiency of the downstream processing units involved in biofuel production. Most of the pretreatment process occurs in a large pressurized thermal reactor that presents an irregular temperature distribution. Therefore, an accurate temperature model is critical for observing the biomass pretreatment. More than that, the biomass is also pushed with a constant horizontal speed along the reactor in order to ensure a continuous throughput. The goal of this paper is to derive a temperature model that captures the environmental temperature differences inside the reactor using distributed parameters. A Kalman filter is then added to account for any missing dynamics and the overall model is embedded into a temperature soft sensor. The operator of the plant will be able to observe the temperature in any...
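
    The role of the Kalman filter in the soft sensor, correcting a model prediction with noisy temperature measurements so that unmodelled dynamics are absorbed, can be illustrated with a scalar example. The first-order plant, noise variances and steam temperature in the Python sketch below are assumptions made for illustration; the paper's model is a distributed-parameter description of the whole reactor.

        import numpy as np

        rng = np.random.default_rng(4)

        # Scalar Kalman filter around a crude first-order temperature model (illustrative only).
        a, b = 0.95, 0.05       # assumed model: T[k+1] = a*T[k] + b*T_steam
        q, r = 0.05, 1.0        # assumed process and measurement noise variances
        T_steam = 190.0

        T_true, T_est, P = 150.0, 140.0, 10.0
        for k in range(100):
            # Plant (unknown to the filter) with an unmodelled disturbance.
            T_true = a * T_true + b * T_steam + rng.normal(0.0, 0.3)
            z = T_true + rng.normal(0.0, 1.0)     # noisy temperature measurement

            # Predict with the model, then correct with the measurement.
            T_est = a * T_est + b * T_steam
            P = a * P * a + q
            K = P / (P + r)
            T_est = T_est + K * (z - T_est)
            P = (1.0 - K) * P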

  2. A Model of Smart Environment for E-learning Based on Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Konstantin Simić

    2015-03-01

    Full Text Available This paper deals with the concepts of the Internet of Things and their application in creating smart environments. The specific goal is to design a smart environment for enhancing the teaching and learning processes at universities. The environment should integrate adequate concepts of smart buildings and smart classrooms with e-learning systems, in order to provide students with advanced e-learning services and services that improve the overall quality of the student experience. In addition, the model is based on the concept of crowdsourcing, where students actively participate in gathering information and designing and implementing the e-services. Finally, the paper describes a prototype of the designed smart environment implemented at the Department for e-Business at the University of Belgrade.

  3. Modeling Electric Arcs in a Force-Free Environment

    Science.gov (United States)

    Tolfree, I. M.; Brooks, N. H.; Jensen, T. H.; Moeller, C. P.

    2001-10-01

    The behavior of the inter-electrodal region of an electric arc in a force-free environment is modeled. The assumption of weightlessness allows for the creation of horizontal arcs which are not subject to bowing by buoyancy effects. Consequently, axial symmetry obtains and transport by convection and hydrodynamic turbulence are absent. The arc is assumed long, so that electrode effects may be ignored. Additionally, the pressure is taken to be large enough that the mean free paths of the electrons, ions and neutrals are small compared to the other dimensions of the problem and, as a result, all the particles are in local thermodynamic equilibrium. The model yields radial temperature profiles for a range in operating conditions in which energy loss is dominated by thermal conduction at one extreme, and by radiative emission at the other. Model predictions are compared with experimental results.

  4. Recent Developments in the Radiation Belt Environment Model

    Science.gov (United States)

    Fok, M.-C.; Glocer, A.; Zheng, Q.; Horne, R. B.; Meredith, N. P.; Albert, J. M.; Nagai, T.

    2010-01-01

    The fluxes of energetic particles in the radiation belts are found to be strongly controlled by the solar wind conditions. In order to understand and predict the radiation particle intensities, we have developed a physics-based Radiation Belt Environment (RBE) model that considers the influences from the solar wind, ring current and plasmasphere. Recently, an improved calculation of wave-particle interactions has been incorporated. In particular, the model now includes cross diffusion in energy and pitch angle. We find that the exclusion of cross diffusion could cause significant overestimation of electron flux enhancement during storm recovery. The RBE model is also connected to MHD fields so that the response of the radiation belts to fast variations in the global magnetosphere can be studied. We are able to reproduce the rapid flux increase during a substorm dipolarization on 4 September 2008. The timing is much shorter than the time scale of wave-associated acceleration.

  5. Integrodifference models for persistence in temporally varying river environments.

    Science.gov (United States)

    Jacobsen, Jon; Jin, Yu; Lewis, Mark A

    2015-02-01

    To fully understand population persistence in river ecosystems, it is necessary to consider the effect of the water flow, which varies tremendously with seasonal fluctuations of water runoff and snow melt. In this paper, we study integrodifference models for growth and dispersal in the presence of advective flow with both periodic (alternating) and random kernel parameters. For the alternating kernel model, we obtain the principal eigenvalue of the linearization operator to determine population persistence and derive a boundary value problem to calculate it. For the random model, we establish two persistence metrics: a generalized spectral radius and the asymptotic growth rate, which are mathematically equivalent but can be understood differently, to determine population persistence or extinction. The theoretical framework and methods for calculations are provided, and the framework is applied to calculating persistence in highly variable river environments.
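
    For the constant or periodic (alternating) kernel case, the persistence criterion reduces to whether the principal eigenvalue of the linearized integrodifference operator exceeds one, and this eigenvalue is easy to approximate numerically. The Python sketch below discretizes the operator on a bounded reach with a Gaussian dispersal kernel shifted downstream by advection; the kernel form and all parameter values are placeholders, not those of the paper.

        import numpy as np

        # Linearized operator n_{t+1}(x) = R * integral of k(x, y) * n_t(y) dy over [0, L],
        # with individuals dispersing outside the reach being lost (hostile exterior).
        L, m = 10.0, 400
        x = np.linspace(0.0, L, m)
        dx = x[1] - x[0]

        R = 1.8        # low-density growth rate (placeholder)
        sigma = 0.7    # dispersal scale (placeholder)
        shift = 0.5    # mean downstream displacement due to advection (placeholder)

        def kernel(x_to, x_from):
            return np.exp(-((x_to - x_from - shift) ** 2) / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

        K = R * kernel(x[:, None], x[None, :]) * dx     # Nystrom discretization of the operator
        lam = np.max(np.real(np.linalg.eigvals(K)))     # principal eigenvalue

        print("population persists" if lam > 1 else "population washes out", lam)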

  6. Process model for data-driven business model generation

    OpenAIRE

    Benta, Christian; Wilberg, Julian; Hollauer, Christoph; Omer, Mayada

    2017-01-01

    Digitalization is advancing fast and at the same time the volume of data is increasing. Examples from industry show that business models using big data can lead to competitive advantages. Currently the number of smart products is rising, which means more data will be available to engineering companies. The challenge is to extract additional profit and value from it. The literature review revealed that existing process models and methods for business model generation do not consider data in a ...

  7. METHODS OF TEACHING STUDENTS FOR SOLVING PROBLEMS ON ARRAYS PROCESSING IN THE DELPHI VISUAL PROGRAMMING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Vitalii M. Bazurin

    2015-05-01

    Full Text Available The Delphi visual programming environment provides ample opportunities for visualizing arrays. A number of Delphi screen form components help to visualize an array on a form. Array-processing programs in the Delphi environment differ from the same programs in Pascal, and the article describes these differences. The article also highlights the features of methods for teaching students to solve array-processing problems using Delphi visual components, and presents the sequence and logic of the teaching material on array processing using the TStringGrid and TMemo components.

  8. A multi-agent system model to integrate Virtual Learning Environments and Intelligent Tutoring Systems

    Directory of Open Access Journals (Sweden)

    Giuffra P.

    2013-03-01

    Full Text Available Virtual learning environments (VLEs are used in distance learning and classroom teaching as teachers and students support tools in the teaching–learning process, where teachers can provide material, activities and assessments for students. However, this process is done in the same way for all the students, regardless of their differences in performance and behavior in the environment. The purpose of this work is to develop an agent-based intelligent learning environment model inspired by intelligent tutoring to provide adaptability to distributed VLEs, using Moodle as a case study and taking into account students’ performance on tasks and activities proposed by the teacher, as well as monitoring his/her study material access.

  9. Chaotic home environment is associated with reduced infant processing speed under high task demands.

    Science.gov (United States)

    Tomalski, Przemysław; Marczuk, Karolina; Pisula, Ewa; Malinowska, Anna; Kawa, Rafał; Niedźwiecka, Alicja

    2017-08-01

    Early adversity has profound long-term consequences for child development across domains. The effects of early adversity on structural and functional brain development were shown for infants under 12 months of life. However, the causal mechanisms of these effects remain relatively unexplored. Using a visual habituation task we investigated whether chaotic home environment may affect processing speed in 5.5 month-old infants (n=71). We found detrimental effects of chaos on processing speed for complex but not for simple visual stimuli. No effects of socio-economic status on infant processing speed were found although the sample was predominantly middle class. Our results indicate that chaotic early environment may adversely affect processing speed in early infancy, but only when greater cognitive resources need to be deployed. The study highlights an attractive avenue for research on the mechanisms linking home environment with the development of attention control. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. A Model for Design of Tailored Working Environment Intervention Programmes for Small Enterprises

    Science.gov (United States)

    Kvorning, Laura V; Rasmussen, Charlotte DN; Smith, Louise H; Flyvholm, Mari-Ann

    2012-01-01

    Objectives: Small enterprises have higher exposure to occupational hazards compared to larger enterprises and, further, they have fewer resources to control the risks. In order to improve the working environment, the development of efficient measures is therefore a major challenge for regulators and other stakeholders. The aim of this paper is to develop a systematic model for the design of tailored intervention programmes meeting the needs of small enterprises. Methods: An important challenge for the design process is the transfer of knowledge from one context to another. The concept of realist analysis can provide insight into the mechanisms by which intervention knowledge can be transferred from one context to another. We use this theoretical approach to develop a design model. Results: The model consists of five steps: 1) defining the occupational health and safety challenges of the target group, 2) selecting methods to improve the working environment, 3) developing theories about mechanisms which motivate the target group, 4) analysing the specific context of the target group for small enterprise programmes, including the owner-management role, social relations, and the perception of the working environment, and 5) designing the intervention based on the preceding steps. We demonstrate how the design model can be applied in practice through the development of an intervention programme for small enterprises in the construction industry. Conclusion: The model provides a useful tool for a systematic design process. The model makes it transparent for both researchers and practitioners how existing knowledge can be used in the design of new intervention programmes. PMID:23019530

  11. Robust analysis of semiparametric renewal process models.

    Science.gov (United States)

    Lin, Feng-Chang; Truong, Young K; Fine, Jason P

    2013-09-01

    A rate model is proposed for a modulated renewal process comprising a single long sequence, where the covariate process may not capture the dependencies in the sequence as in standard intensity models. We consider partial likelihood-based inferences under a semiparametric multiplicative rate model, which has been widely studied in the context of independent and identical data. Under an intensity model, gap times in a single long sequence may be used naively in the partial likelihood with variance estimation utilizing the observed information matrix. Under a rate model, the gap times cannot be treated as independent and studying the partial likelihood is much more challenging. We employ a mixing condition in the application of limit theory for stationary sequences to obtain consistency and asymptotic normality. The estimator's variance is quite complicated owing to the unknown gap times dependence structure. We adapt block bootstrapping and cluster variance estimators to the partial likelihood. Simulation studies and an analysis of a semiparametric extension of a popular model for neural spike train data demonstrate the practical utility of the rate approach in comparison with the intensity approach.

  12. Mathematical modeling of the flash converting process

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, H.Y.; Perez-Tello, M.; Riihilahti, K.M. [Utah Univ., Salt Lake City, UT (United States)

    1996-12-31

    An axisymmetric mathematical model for the Kennecott-Outokumpu flash converting process for converting solid copper matte to copper is presented. The model is an adaptation of the comprehensive mathematical model formerly developed at the University of Utah for the flash smelting of copper concentrates. The model incorporates the transport of momentum, heat, mass, and reaction kinetics between gas and particles in a particle-laden turbulent gas jet. The standard k-ε model is used to describe gas-phase turbulence in an Eulerian framework. The particle phase is treated from a Lagrangian viewpoint which is coupled to the gas phase via the source terms in the Eulerian gas-phase governing equations. Matte particles were represented as Cu2S·yFeS, and assumed to undergo homogeneous oxidation to Cu2O, Fe3O4, and SO2. A reaction kinetics mechanism involving both external mass transfer of oxygen gas to the particle surface and diffusion of oxygen through the porous oxide layer is proposed to estimate the particle oxidation rate. Predictions of the mathematical model were compared with the experimental data collected in a bench-scale flash converting facility. Good agreement between the model predictions and the measurements was obtained. The model was used to study the effect of different gas-injection configurations on the overall fluid dynamics in a commercial-size flash converting shaft. (author)

  13. Modelling and control of crystallization process

    Directory of Open Access Journals (Sweden)

    S.K. Jha

    2017-03-01

    Full Text Available Batch crystallizers are predominantly used in chemical industries like pharmaceuticals, food industries and specialty chemicals. The nonlinear nature of the batch process leads to difficulties when the objective is to obtain a uniform Crystal Size Distribution (CSD). In this study, a linear PI controller is designed using classical controller tuning methods for controlling the crystallizer outlet temperature by manipulating the inlet jacket temperature; however, the response is not satisfactory. A simple PID controller cannot guarantee a satisfactory response, which is why an optimal controller is designed to keep the concentration and temperature in a range that suits our needs. Any typical process operation has constraints on states, inputs and outputs, so a nonlinear process needs to be operated while satisfying these constraints. Hence, a nonlinear controller such as the Generic Model Controller (GMC), which is similar in structure to the PI controller, is implemented. It minimizes the derivative of the squared error, thus improving the output response of the process. Minimization of crystal size variation is considered as the objective function in this study. A model predictive controller (MPC) is also designed, which uses an advanced optimization algorithm to minimize the error while linearizing the process. Constraints are fed into the MPC toolbox in MATLAB, and the prediction and control horizons and performance weights are tuned using the Sridhar and Cooper method. The performances of all three controllers (PID, GMC and MPC) are compared, and it is found that MPC is superior in terms of settling time and percentage overshoot.
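
    The Generic Model Control idea mentioned above can be sketched with a hypothetical first-order energy balance for the crystallizer temperature (not the paper's actual model): a desired closed-loop trajectory dT/dt = K1*e + K2*integral(e) is imposed and the process model is solved for the manipulated jacket temperature. All parameter values are assumptions for illustration.

        import numpy as np

        # Hypothetical first-order energy balance: dT/dt = a * (Tj - T)
        a = 0.05                 # inverse time constant, 1/s (assumed)
        K1, K2 = 0.02, 1.0e-4    # GMC tuning constants (assumed)

        def gmc_jacket_temp(T, T_sp, int_err):
            """GMC law: impose dT/dt = K1*e + K2*int(e) and solve the model for Tj."""
            e = T_sp - T
            dT_desired = K1 * e + K2 * int_err
            return T + dT_desired / a          # from a * (Tj - T) = dT_desired

        # Simulate a set-point change with explicit Euler integration.
        dt, t_end = 1.0, 600.0
        T, T_sp, int_err = 320.0, 310.0, 0.0
        for _ in range(int(t_end / dt)):
            Tj = gmc_jacket_temp(T, T_sp, int_err)
            T += dt * a * (Tj - T)             # plant assumed identical to the model
            int_err += dt * (T_sp - T)
        print(f"temperature after {t_end:.0f} s: {T:.2f} K (set point {T_sp} K)")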

  14. Physical processes and real-time chemical measurement of the insect olfactory environment.

    Science.gov (United States)

    Riffell, Jeffrey A; Abrell, Leif; Hildebrand, John G

    2008-07-01

    Odor-mediated insect navigation in airborne chemical plumes is vital to many ecological interactions, including mate finding, flower nectaring, and host locating (where disease transmission or herbivory may begin). After emission, volatile chemicals become rapidly mixed and diluted through physical processes that create a dynamic olfactory environment. This review examines those physical processes and some of the analytical technologies available to characterize those behavior-inducing chemical signals at temporal scales equivalent to the olfactory processing in insects. In particular, we focus on two areas of research that together may further our understanding of olfactory signal dynamics and its processing and perception by insects. First, measurement of physical atmospheric processes in the field can provide insight into the spatiotemporal dynamics of the odor signal available to insects. Field measurements in turn permit aspects of the physical environment to be simulated in the laboratory, thereby allowing careful investigation into the links between odor signal dynamics and insect behavior. Second, emerging analytical technologies with high recording frequencies and field-friendly inlet systems may offer new opportunities to characterize natural odors at spatiotemporal scales relevant to insect perception and behavior. Characterization of the chemical signal environment allows the determination of when and where olfactory-mediated behaviors may control ecological interactions. Finally, we argue that coupling of these two research areas will foster increased understanding of the physicochemical environment and enable researchers to determine how olfactory environments shape insect behaviors and sensory systems.

  15. Modeling Computer Communication Networks in a Realistic 3D Environment

    Science.gov (United States)

    2010-03-01

    …is a proprietary toolkit used for modeling, animation, and rendering of 3D graphics. It is used by numerous video game developers, television and… (Figures listed in the source: rendition of the connected battlefield; increasing graph readability; sample wired…) Visualizations such as pie charts or illustrated diagrams allow people to easily process information visually in a quicker and more intuitive fashion than by…

  16. Predicting plants -modeling traits as a function of environment

    Science.gov (United States)

    Franklin, Oskar

    2016-04-01

    A central problem in understanding and modeling vegetation dynamics is how to represent the variation in plant properties and function across different environments. Addressing this problem, there is a strong trend towards trait-based approaches, where vegetation properties are functions of the distributions of functional traits rather than of species. Recently there has been enormous progress in quantifying trait variability and its drivers and effects (Van Bodegom et al. 2012; Adler et al. 2014; Kunstler et al. 2015) based on wide-ranging datasets on a small number of easily measured traits, such as specific leaf area (SLA), wood density and maximum plant height. However, plant function depends on many other traits, and while the commonly measured trait data are valuable, they are not sufficient for driving predictive and mechanistic models of vegetation dynamics, especially under novel climate or management conditions. For this purpose we need a model to predict functional traits, also those not easily measured, and how they depend on the plants' environment. Here I present such a mechanistic model based on fitness concepts and focused on traits related to water and light limitation of trees, including: wood density, drought response, allocation to defense, and leaf traits. The model is able to predict observed patterns of variability in these traits in relation to growth and mortality, and their responses to a gradient of water limitation. The results demonstrate that it is possible to mechanistically predict plant traits as a function of the environment based on an eco-physiological model of plant fitness. References Adler, P.B., Salguero-Gómez, R., Compagnoni, A., Hsu, J.S., Ray-Mukherjee, J., Mbeau-Ache, C. et al. (2014). Functional traits explain variation in plant life-history strategies. Proc. Natl. Acad. Sci. U. S. A., 111, 740-745. Kunstler, G., Falster, D., Coomes, D.A., Hui, F., Kooyman, R.M., Laughlin, D.C. et al. (2015). Plant functional traits

  17. Process models and model-data fusion in dendroecology

    Directory of Open Access Journals (Sweden)

    Joel eGuiot

    2014-08-01

    Full Text Available Dendrochronology (i.e. the study of annually dated tree-ring time series) has proved to be a powerful technique to understand tree-growth. This paper aims to show the value of using ecophysiological modeling not only to understand and predict tree-growth (dendroecology) but also to reconstruct past climates (dendroclimatology). Process models have been used for several decades in dendroclimatology, but it is only recently that methods of model-data fusion have led to significant progress in modeling tree-growth as a function of climate and in reconstructing past climates. These model-data fusion (MDF) methods, mainly based on the Bayesian paradigm, have been shown to be powerful for both model calibration and model inversion. After a brief survey of tree-growth modeling, we illustrate MDF with examples based on series of Southern France Aleppo pines and Central France oaks. These examples show that if plants experienced CO2 fertilization, this would have a significant effect on tree-growth, which in turn would bias the climate reconstructions. This bias could be extended to other environmental non-climatic factors directly or indirectly affecting annual ring formation and not taken into account in classical empirical models, which supports the use of more complex process-based models. Finally, we conclude by showing the value of the data assimilation methods applied in climatology to produce climate re-analyses.
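
    The Bayesian model-data fusion mentioned above can be illustrated with a minimal random-walk Metropolis sketch. The growth model is a deliberately simple linear growth-climate relation calibrated against synthetic ring widths; the data, prior and proposal scale are illustrative assumptions, not the Aleppo pine or oak models.

        import numpy as np

        rng = np.random.default_rng(42)

        # Synthetic "tree-ring" data: width = b_true * climate + noise (illustrative).
        climate = rng.normal(size=80)
        b_true, sigma = 0.8, 0.3
        rings = b_true * climate + rng.normal(scale=sigma, size=80)

        def log_posterior(b):
            """Gaussian likelihood for the toy growth model plus a N(0, 1) prior on b."""
            resid = rings - b * climate
            return -0.5 * np.sum(resid ** 2) / sigma ** 2 - 0.5 * b ** 2

        # Random-walk Metropolis calibration of the climate-sensitivity parameter b.
        b, chain = 0.0, []
        for _ in range(20000):
            proposal = b + rng.normal(scale=0.1)
            if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(b):
                b = proposal
            chain.append(b)
        posterior = np.array(chain[5000:])        # discard burn-in
        print(f"posterior mean of b: {posterior.mean():.3f} (true value {b_true})")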

  18. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM)MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Y.S. Wu

    2005-08-24

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on

  19. Rocks in the River: The Challenge of Piloting the Inquiry Process in Today's Learning Environment

    Science.gov (United States)

    Lambusta, Patrice; Graham, Sandy; Letteri-Walker, Barbara

    2014-01-01

    School librarians in Newport News, Virginia, are meeting the challenges of integrating an Inquiry Process Model into instruction. In the original model the process began by asking students to develop questions to start their inquiry journey. As this model was taught, it was discovered that students often did not have enough background knowledge to…

  20. The comprehensive process model of engagement.

    Science.gov (United States)

    Cohen-Mansfield, Jiska; Marx, Marcia S; Freedman, Laurence S; Murad, Havi; Regier, Natalie G; Thein, Khin; Dakheel-Ali, Maha

    2011-10-01

    Engagement refers to the act of being occupied or involved with an external stimulus. In dementia, engagement is the antithesis of apathy. The Comprehensive Process Model of Engagement was examined, in which environmental, personal, and stimulus characteristics impact the level of engagement. Participants were 193 residents of 7 Maryland nursing homes with a diagnosis of dementia. Stimulus engagement was assessed via the Observational Measure of Engagement, measuring duration, attention, and attitude to the stimulus. Twenty-five stimuli were presented, which were categorized as live human social stimuli, simulated social stimuli, inanimate social stimuli, a reading stimulus, manipulative stimuli, a music stimulus, task and work-related stimuli, and two different self-identity stimuli. All stimuli elicited significantly greater engagement in comparison to the control stimulus. In the multivariate model, music significantly increased engagement duration, whereas all other stimuli significantly increased duration, attention, and attitude. Significant environmental variables in the multivariate model that increased engagement were: use of the long introduction with modeling (relative to minimal introduction), any level of sound (especially moderate sound), and the presence of between 2 and 24 people in the room. Significant personal attributes included Mini-Mental State Examination scores, activities of daily living performance and clarity of speech, which were positively associated with higher engagement scores. Results are consistent with the Comprehensive Process Model of Engagement. Personal attributes, environmental factors, and stimulus characteristics all contribute to the level and nature of engagement, with a secondary finding being that exposure to any stimulus elicits engagement in persons with dementia.

  1. Theoretical Modelling of Intercultural Communication Process

    Directory of Open Access Journals (Sweden)

    Mariia Soter

    2016-08-01

    Full Text Available The concepts of “communication”, “intercultural communication”, and “model of communication” are analyzed in the article. The basic components of the communication process are singled out. The model of intercultural communication is developed. Communicative, behavioral and complex skills for optimal organization of intercultural communication, establishment of productive contact with a foreign partner to achieve mutual understanding, and searching for acceptable ways of organizing interaction and cooperation for both communicants are highlighted in the article. It is noted that intercultural communication, through interaction between people, affects the development of different aspects of cultures.

  2. Using {sup 222}Rn as a tracer of geophysical processes in underground environments

    Energy Technology Data Exchange (ETDEWEB)

    Lacerda, T.; Anjos, R. M. [LARA - Laboratório de Radioecologia e Alterações Ambientais, Instituto de Física, Universidade Federal Fluminense, Av. Gal Milton Tavares de Souza, s/no, Gragoatá, 24210-346, Niterói, RJ (Brazil); Valladares, D. L.; Rizzotto, M.; Velasco, H.; Rosas, J. P. de; Ayub, J. Juri [GEA - Instituto de Matemática Aplicada San Luis (IMASL), Universidad Nacional de San Luis, CCT-San Luis CONICET, Ej. de los Andes 950, D5700HHW San Luis (Argentina); Silva, A. A. R. da; Yoshimura, E. M. [Instituto de Física, Universidade de São Paulo, P.O. Box 66318, 05314-970, São Paulo, SP (Brazil)

    2014-11-11

    Radon levels in two old mines in San Luis, Argentina, are reported and analyzed. These mines are used today for tourist visits. Our goal was to assess the potential use of this radioactive noble gas as a tracer of geological processes in underground environments. CR-39 nuclear track detectors were used during the winter and summer seasons. The findings show that the significant radon concentrations reported in this environment are subject to large seasonal modulations, due to the strong dependence of natural ventilation on the variations of outside temperature. The results also indicate that the radon distribution pattern appears to be a good method for localizing unknown ducts, fissures or secondary tunnels in subterranean environments.

  3. Modeling of processes in the tourism sector

    Directory of Open Access Journals (Sweden)

    Salamatina Victoriya, S.

    2015-06-01

    Full Text Available In modern conditions, tourism is becoming a budget-forming sector for a number of Russian regions. In this regard, it is of interest to model the processes occurring in the tourism business, because they are affected by many random parameters arising from various economic, political, geographic and other factors. To improve and develop systems for managing the tourism business, economic-mathematical methods are being systematically introduced into this area, because increased competitiveness requires continuous and constructive change. The results of applying this economic-mathematical apparatus allow processes in tourism to be analysed and evaluated more systematically and with greater internal consistency. A feature of some economic processes typical of tourist activities is that the effect of a factor on the indicators of a process appears not immediately but gradually, after some certain time, with a certain lag. This delay has to be accounted for when developing mathematical models of tourism business processes. In this case, for the simulation of such processes it is advisable to apply the economic-mathematical formalism of optimal control, namely game theory.

  4. Validation of a Solid Rocket Motor Internal Environment Model

    Science.gov (United States)

    Martin, Heath T.

    2017-01-01

    In a prior effort, a thermal/fluid model of the interior of Penn State University's laboratory-scale Insulation Test Motor (ITM) was constructed to predict both the convective and radiative heat transfer to the interior walls of the ITM with a minimum of empiricism. These predictions were then compared to values of total and radiative heat flux measured in a previous series of ITM test firings to assess the capabilities and shortcomings of the chosen modeling approach. Though the calculated fluxes reasonably agreed with those measured during testing, this exercise revealed means of improving the fidelity of the model so that, in the case of the thermal radiation, the measured and calculated fluxes could be compared directly and, for the total heat flux, a value indicative of the average measured condition could be computed. By replacing the P1-Approximation with the discrete ordinates (DO) model for the solution of the gray radiative transfer equation, the radiation intensity field in the optically thin region near the radiometer is accurately estimated, allowing the thermal radiation flux to be calculated on the heat-flux sensor itself, which was then compared directly to the measured values. Though fully coupling the wall thermal response with the flow model was not attempted due to the excessive computational time required, a separate wall thermal response model was used to better estimate the average temperature of the graphite surfaces upstream of the heat flux gauges and to improve the accuracy of both the total and radiative heat flux computations. The success of this modeling approach increases confidence in the ability of state-of-the-art thermal and fluid modeling to accurately predict SRM internal environments, offers corrections to older methods, and supplies a tool for further studies of the dynamics of SRM interiors.

  5. Distributed data processing and analysis environment for neutron scattering experiments at CSNS

    CERN Document Server

    Tian, H L; Yan, L L; Tang, M; Hu, L; Zhao, D X; Qiu, Y X; Zhang, H Y; Zhuang, J; Du, R

    2016-01-01

    China Spallation Neutron Source (CSNS) is the first high-performance pulsed neutron source in China, and it will meet the increasing demands of fundamental research and technical applications both domestically and overseas. A new distributed data processing and analysis environment has been developed, which has generic functionalities for neutron scattering experiments. The environment consists of three parts: an object-oriented data processing framework adopting a data-centered architecture, a communication and data caching system based on the C/S paradigm, and a data analysis and visualization software providing 2D/3D display of experimental data. This environment will be widely applied in CSNS for live data processing and virtual neutron scattering experiments based on Monte Carlo methods.

  6. A model of disruptive surgeon behavior in the perioperative environment.

    Science.gov (United States)

    Cochran, Amalia; Elder, William B

    2014-09-01

    Surgeons are the physicians with the highest rates of documented disruptive behavior. We hypothesized that a unified conceptual model of disruptive surgeon behavior could be developed based on specific individual and system factors in the perioperative environment. Semi-structured interviews were conducted with 19 operating room staff of diverse occupations at a single institution. Interviews were analyzed using grounded theory methods. Participants described episodes of disruptive surgeon behavior, personality traits of perpetrators, environmental conditions of power, and situations when disruptive behavior was demonstrated. Verbal hostility and throwing or hitting objects were the most commonly described disruptive behaviors. Participants indicated that surgical training attracts and creates individuals with particular personality traits, including a sense of shame. Interviewees stated this behavior is tolerated because surgeons have unchecked power, have strong money-making capabilities for the institution, and tend to direct disruptive behavior toward the least powerful employees. The most frequent situational stressors were when something went wrong during an operation and working with unfamiliar team members. Each factor group (ie, situational stressors, cultural conditions, and personality factors) was viewed as being necessary, but none of them alone were sufficient to catalyze disruptive behavior events. Disruptive physician behavior has strong implications for the work environment and patient safety. This model can be used by hospitals to better conceptualize conditions that facilitate disruptive surgeon behavior and to establish programs to mitigate conduct that threatens patient safety and employee satisfaction. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  7. A New Fractal Model of Chromosome and DNA Processes

    Science.gov (United States)

    Bouallegue, K.

    Dynamic chromosome structure remains unknown. Can fractals and chaos be used as new tools to model, identify and generate a structure of chromosomes? Fractals and chaos offer a rich environment for exploring and modeling the complexity of nature. In a sense, fractal geometry is used to describe, model, and analyze the complex forms found in nature. Fractals have also been widely used not only in biology but also in medicine. To this effect, a fractal is considered an object that displays self-similarity under magnification and can be constructed using a simple motif (an image repeated on ever-reduced scales). It is worth noting that identifying which of the proposed models a chromosome belongs to has become a challenge. Several different models (hierarchical coiling, folded fiber, and radial loop) have been proposed for the mitotic chromosome, but a dynamic model has not yet been reached. This paper is an attempt to solve topological problems involved in the modeling of chromosome and DNA processes. By combining the fractal Julia process and a numerical dynamical system, we have established four main points. First, we have developed not only a model of the chromosome but also a model of mitosis and one of meiosis. Equally important, we have identified the centromere position through the numerical model captured below. More importantly, in this paper, we have discovered the processes of the cell divisions of both mitosis and meiosis. All in all, the results show that this work could have a strong impact on the welfare of humanity and can lead to a cure of genetic diseases.
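
    The fractal Julia process referred to above is based on iterating a complex quadratic map; a minimal sketch of the standard escape-time computation for a Julia set follows. The constant c, grid and iteration limit are illustrative choices, not the values used in the paper.

        import numpy as np

        # Standard quadratic Julia iteration z -> z**2 + c on a grid of start points.
        c = complex(-0.8, 0.156)          # illustrative parameter
        n, max_iter, radius = 400, 200, 2.0

        x = np.linspace(-1.6, 1.6, n)
        y = np.linspace(-1.2, 1.2, n)
        z = x[None, :] + 1j * y[:, None]
        escape = np.full(z.shape, max_iter, dtype=int)

        for i in range(max_iter):
            active = np.abs(z) <= radius               # points that have not escaped yet
            z[active] = z[active] ** 2 + c
            escape[active & (np.abs(z) > radius)] = i  # record the escape time

        # Points still at max_iter approximate the filled Julia set.
        print("fraction of grid points in the filled Julia set:", np.mean(escape == max_iter))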

  8. Incorporating evolutionary processes into population viability models.

    Science.gov (United States)

    Pierson, Jennifer C; Beissinger, Steven R; Bragg, Jason G; Coates, David J; Oostermeijer, J Gerard B; Sunnucks, Paul; Schumaker, Nathan H; Trotter, Meredith V; Young, Andrew G

    2015-06-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand the influence of evolutionary processes on population persistence. We developed the mechanistic basis of an eco-evo PVA using individual-based models with individual-level genotype tracking and dynamic genotype-phenotype mapping to model emergent population-level effects, such as local adaptation and genetic rescue. We then outline how genomics can allow or improve parameter estimation for PVA models by providing genotypic information at large numbers of loci for neutral and functional genome regions. As climate change and other threatening processes increase in rate and scale, eco-evo PVAs will become essential research tools to evaluate the effects of adaptive potential, evolutionary rescue, and locally adapted traits on persistence. © 2014 Society for Conservation Biology.
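
    A minimal individual-based sketch of the eco-evo approach described above, with individual-level tracking of a single biallelic locus, simple viability selection and demographic stochasticity; all rates, the carrying capacity and the selection coefficient are hypothetical. Repeated runs give a crude extinction probability of the kind a PVA reports.

        import numpy as np

        rng = np.random.default_rng(7)

        def simulate(n0=50, years=100, K=100, s=0.1):
            """One run: each individual carries 0, 1 or 2 copies of an adaptive allele."""
            geno = rng.integers(0, 3, size=n0)             # initial genotypes
            for _ in range(years):
                # Viability selection: survival increases with adaptive-allele copies.
                survive = rng.uniform(size=geno.size) < 0.5 + s * geno
                geno = geno[survive]
                if geno.size < 2:
                    return False                           # quasi-extinction
                # Reproduction: random mating, Poisson offspring, ceiling at K.
                n_off = min(rng.poisson(1.6 * geno.size), K)
                allele_freq = geno.mean() / 2.0
                geno = rng.binomial(2, allele_freq, size=n_off)
            return geno.size > 0

        runs = np.array([simulate() for _ in range(200)])
        print("estimated extinction probability over 100 years:", 1.0 - runs.mean())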

  9. Reinventing The Design Process: Teams and Models

    Science.gov (United States)

    Wall, Stephen D.

    1999-01-01

    The future of space mission design will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return, with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  10. Kinetic models of tissue fusion processes

    Science.gov (United States)

    Pearce, John A.; Thomsen, Sharon L.

    1992-06-01

    Recent studies of tissue fusion (welding) processes have reported temperature ranges but have not carefully analyzed critical exposure time data. Electron microscopic (EM) studies suggest that the fusion process in blood vessels may be dominated by random re-entwinement of thermally dissociated adventitial collagen fibrils (Type I) during the end stage heating and early cooling phases. At the light microscopic level, this bonding process is reflected by the formation of an amorphous coagulum of thermally coagulated adventitial collagen at the anastomotic site. We have constructed a numerical model of the vessel welding process, assuming CO2 laser impingement, and used it to simulate quantitative histologic data obtained from successful welds of rat femoral and canine brachial arteries (unpublished data). The model estimates smooth muscle and collagen damage based on kinetic thermal damage analysis and water loss boundaries as a function of irradiation beam parameters and heating time. Both heating and cooling phases are simulated. The results illustrate the importance of the damage kinetics and local heat transfer phenomena to the weld characteristics realized.
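
    Kinetic thermal damage analysis of the kind described above is commonly expressed as an Arrhenius damage integral, Omega(t) = integral of A*exp(-Ea/(R*T(tau))) dtau. The sketch below evaluates it for a hypothetical heating-and-cooling temperature history; the rate coefficients are illustrative collagen-like values, not the parameters fitted in the study.

        import numpy as np

        A = 1.0e75        # frequency factor, 1/s (assumed)
        Ea = 5.0e5        # activation energy, J/mol (assumed)
        R = 8.314         # gas constant, J/(mol K)

        # Hypothetical temperature history: heating pulse followed by cooling.
        t = np.linspace(0.0, 10.0, 2001)                      # seconds
        T = 310.0 + 40.0 * np.exp(-((t - 4.0) / 2.0) ** 2)    # kelvin

        # Arrhenius damage integral; Omega ~ 1 is conventionally read as complete damage.
        omega = np.trapz(A * np.exp(-Ea / (R * T)), t)
        print(f"peak temperature: {T.max():.1f} K, damage integral Omega = {omega:.2f}")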

  11. Applications integration in a hybrid cloud computing environment: modelling and platform

    Science.gov (United States)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds together with their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  12. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  13. Workforce Transition Modeling Environment user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Stahlman, E.J.; Oens, M.A.; Lewis, R.E.

    1993-10-01

    The Pacific Northwest Laboratory (PNL) was tasked by the US Department of Energy Albuquerque Field Office (DOE-AL) to develop a workforce assessment and transition planning tool to support integrated decision making at a single DOE installation. The planning tool permits coordinated, integrated workforce planning to manage growth, decline, or transition within a DOE installation. The tool enhances the links and provides commonality between strategic, programmatic, and operations planners and human resources. Successful development and subsequent complex-wide implementation of the model also will facilitate planning at the national level by enforcing a consistent format on data that are now collected by installations in corporate-specific formats that are not amenable to national-level analyses. The workforce assessment and transition planning tool, the Workforce Transition Modeling Environment (WFTME), consists of two components: the Workforce Transition Model and the Workforce Budget Constraint Model. The Workforce Transition Model, the preponderant of the two, assists decision makers in identifying and evaluating alternatives for transitioning the current workforce to meet the skills required to support projected workforce requirements. The Workforce Budget Constraint Model helps estimate the number of personnel that will be affected by a given workforce budget increase or decrease and assists in identifying how the corresponding hirings or layoffs should be distributed across the Common Occupation Classification System (COCS) occupations. This user's guide describes the use and operation of the WFTME. This includes the functions of modifying data and running models, interpreting output reports, and an approach for using the WFTME to evaluate various workforce transition scenarios.

  14. Spherical Process Models for Global Spatial Statistics

    KAUST Repository

    Jeong, Jaehong

    2017-11-28

    Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture the spatial and temporal behavior of these global data sets. Though the geodesic distance is the most natural metric for measuring distance on the surface of a sphere, mathematical limitations have compelled statisticians to use the chordal distance to compute the covariance matrix in many applications instead, which may cause physically unrealistic distortions. Therefore, covariance functions directly defined on a sphere using the geodesic distance are needed. We discuss the issues that arise when dealing with spherical data sets on a global scale and provide references to recent literature. We review the current approaches to building process models on spheres, including the differential operator, the stochastic partial differential equation, the kernel convolution, and the deformation approaches. We illustrate realizations obtained from Gaussian processes with different covariance structures and the use of isotropic and nonstationary covariance models through deformations and geographical indicators for global surface temperature data. To assess the suitability of each method, we compare their log-likelihood values and prediction scores, and we end with a discussion of related research problems.
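
    The geodesic-versus-chordal distinction discussed above is easy to make concrete: for two points on the unit sphere the chordal distance equals 2*sin(geodesic/2), so the two metrics diverge as points move apart. The coordinates below are arbitrary examples used only to illustrate the computation.

        import numpy as np

        def to_unit_vector(lat_deg, lon_deg):
            """Latitude/longitude in degrees -> unit vector on the sphere."""
            lat, lon = np.radians(lat_deg), np.radians(lon_deg)
            return np.array([np.cos(lat) * np.cos(lon),
                             np.cos(lat) * np.sin(lon),
                             np.sin(lat)])

        def geodesic(p, q):
            """Great-circle (geodesic) distance on the unit sphere, in radians."""
            return np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))

        def chordal(p, q):
            """Straight-line (chordal) distance through the sphere."""
            return np.linalg.norm(p - q)

        p = to_unit_vector(52.5, 13.4)     # arbitrary example points
        q = to_unit_vector(-33.9, 151.2)
        d_g, d_c = geodesic(p, q), chordal(p, q)
        print(f"geodesic = {d_g:.4f} rad, chordal = {d_c:.4f}, 2*sin(geodesic/2) = {2 * np.sin(d_g / 2):.4f}")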

  15. Machine platform and software environment for rapid optics assembly process development

    Science.gov (United States)

    Sauer, Sebastian; Müller, Tobias; Haag, Sebastian; Zontar, Daniel

    2016-03-01

    The assembly of optical components for laser systems is proprietary knowledge and is typically done by well-trained personnel in a clean-room environment, as it has a major impact on the overall laser performance. Rising numbers of laser systems drive laser production toward industrial-level automation solutions that allow for high volumes while simultaneously ensuring stable quality, a large number of variants and low cost. Therefore, an easily programmable, expandable and reconfigurable machine with an intuitive and flexible software environment for process configuration is required. With Fraunhofer IPT's expertise in optical assembly processes, the next step towards industrializing the production of optical systems is made.

  16. Teaching and Learning of Computational Modelling in Creative Shaping Processes

    Directory of Open Access Journals (Sweden)

    Daniela REIMANN

    2017-10-01

    Full Text Available Today, it is not only the diverse design-related disciplines that are required to deal actively with the digitization of information and its potentials and side effects for educational processes. In Germany, technology didactics developed within vocational education and computer science education within general education, both separated from media pedagogy, which remains an after-school programme. Media education is not yet a subject in German schools. However, in this paper we argue for an interdisciplinary approach to learning about computational modelling in creative processes and aesthetic contexts. It crosses the borders of programming technology, arts and design processes in meaningful contexts. Educational scenarios using smart textile environments are introduced and reflected upon for project-based learning.

  17. Spartan random processes in time series modeling

    Science.gov (United States)

    Žukovič, M.; Hristopulos, D. T.

    2008-06-01

    A Spartan random process (SRP) is used to estimate the correlation structure of time series and to predict (interpolate and extrapolate) the data values. SRPs are motivated from statistical physics, and they can be viewed as Ginzburg-Landau models. The temporal correlations of the SRP are modeled in terms of ‘interactions’ between the field values. Model parameter inference employs the computationally fast modified method of moments, which is based on matching sample energy moments with the respective stochastic constraints. The parameters thus inferred are then compared with those obtained by means of the maximum likelihood method. The performance of the Spartan predictor (SP) is investigated using real time series of the quarterly S&P 500 index. SP prediction errors are compared with those of the Kolmogorov-Wiener predictor. Two predictors, one of which is explicit, are derived and used for extrapolation. The performance of the predictors is similarly evaluated.

  18. Managing information security in a process industrial environment; Gestao de seguranca da informacao em processos industriais

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Raphael Gomes; Aguiar, Leandro Pfleger de [Siemens Company (Brazil)]

    2008-07-01

    With the recent expansion of globalization, the exploration of energy resources is crossing country boundaries, resulting in worldwide companies exploring oil and gas fields available in any part of the world. For government bodies, information about those fields should be treated as a matter of national security, requiring adequate management and protection of all important and critical information and assets while preserving, at the same time, freedom and transparency in bidding processes. This creates a complex security context to be managed, in which an information breach might, for instance, compromise the integrity of public auction processes through the use of privileged information. Furthermore, with the problem of terrorism, the process itself becomes an attractive target for different kinds of attacks, motivated by the opportunity to exploit the known inability of large industries to manage their large and complex environments well. Transformations occurring in production processes, such as the growing use of the TCP/IP protocol, the adoption of Windows operating systems in SCADA systems and the integration of industrial and business networks, are factors that contribute to an imminent landscape of problems. This landscape demonstrates the need for organizations and countries operating in energy resource exploration to renew their risk management areas, establishing a unified and integrated process to protect the information security infrastructure. This work presents a study of the challenges to be faced by organizations while rebuilding their internal processes to integrate the risk management and information security areas, as well as a set of essential steps to establish an effective corporate governance of risk management and compliance aspects. Moreover, the work presents the necessary points of government involvement to improve all the regulatory aspects

  19. A Formal Framework for Integrated Environment Modeling Systems

    Directory of Open Access Journals (Sweden)

    Gaofeng Zhang

    2017-02-01

    Full Text Available Integrated Environment Modeling (IEM) has become more and more important for environmental studies and applications. IEM systems have also been extended from scientific studies to much wider practical application situations. The quality and efficiency of IEM systems have therefore become increasingly critical. Although many advanced and creative technologies have been adopted to improve the quality of IEM systems, there is scarcely any formal method for evaluating and improving them. This paper is devoted to proposing a formal method to improve the quality and the development efficiency of IEM systems. Two primary contributions are made. Firstly, a formal framework for IEM is proposed. The framework not only reflects the static and dynamic features of IEM but also covers different views from the various roles throughout the IEM lifecycle. Secondly, the formal operational semantics corresponding to the foregoing IEM model is derived in detail; it can be used as the basis for aiding automated integrated modeling and verifying the integrated model.

  20. Discovering Reference Process Models by Mining Process Variants

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    Recently, a new generation of adaptive Process-Aware Information Systems (PAIS) has emerged, which allows for dynamic process and service changes (e.g., to insert, delete, and move activities and service executions in a running process). This, in turn, has led to a large number of process variants

  1. Hidden Markov Models as a Process Monitor in Robotic Assembly

    Directory of Open Access Journals (Sweden)

    Geir E. Hovland

    1999-10-01

    Full Text Available A process monitor for robotic assembly based on hidden Markov models (HMMs) is presented. The HMM process monitor is based on the dynamic force/torque signals arising from interaction between the workpiece and the environment. The HMMs represent a stochastic, knowledge-based system in which the models are trained off-line with the Baum-Welch re-estimation algorithm. The assembly task is modeled as a discrete event dynamic system in which a discrete event is defined as a change in contact state between the workpiece and the environment. Our method (1) allows for dynamic motions of the workpiece, (2) accounts for sensor noise and friction, and (3) exploits the fact that the amount of force information is large when there is a sudden change of discrete state in robotic assembly. After the HMMs have been trained, the authors use them on-line in a 2D experimental setup to recognize discrete events as they occur. Successful event recognition with an accuracy as high as 97% was achieved with a training set size of only 20 examples for each discrete event.
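
    A minimal sketch of the general approach described above: a Gaussian HMM is trained off-line (Baum-Welch/EM) on force/torque-like signals and then used to decode discrete contact states. It relies on the third-party hmmlearn package and synthetic two-dimensional data, so the signals, the number of states and the library choice are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from hmmlearn import hmm          # third-party package, used here for illustration

        rng = np.random.default_rng(3)

        # Synthetic 2-D "force/torque" signal with three contact states (assumed data).
        means = np.array([[0.0, 0.0], [4.0, 1.0], [1.0, 5.0]])
        states = np.repeat([0, 1, 2, 1, 0], 200)              # a scripted contact sequence
        X = means[states] + rng.normal(scale=0.7, size=(states.size, 2))

        # Off-line training (Baum-Welch / EM) followed by decoding of the state sequence.
        model = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=50, random_state=0)
        model.fit(X)
        decoded = model.predict(X)

        # HMM state labels are arbitrary, so report how consistently each true state
        # maps onto a single decoded state.
        for s in range(3):
            counts = np.bincount(decoded[states == s], minlength=3)
            print(f"true state {s}: dominant decoded-state fraction {counts.max() / counts.sum():.2f}")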

  2. Applying nonlinear MODM model to supply chain management with quantity discount policy under complex fuzzy environment

    Directory of Open Access Journals (Sweden)

    Zhe Zhang

    2014-06-01

    Full Text Available Purpose: The aim of this paper is to deal with supply chain management (SCM) with a quantity discount policy under a complex fuzzy environment, which is characterized by bi-fuzzy variables. By taking into account the strategy and the process of decision making, a bi-fuzzy nonlinear multiple objective decision making (MODM) model is presented to solve the proposed problem. Design/methodology/approach: The bi-fuzzy variables in the MODM model are transformed into trapezoidal fuzzy variables by the DMs' degrees of optimism ?1 and ?2, which are subsequently de-fuzzified by the expected value index. For solving the complex nonlinear model, a multi-objective adaptive particle swarm optimization algorithm (MO-APSO) is designed as the solution method. Findings: The proposed model and algorithm are applied to a typical example of an SCM problem to illustrate their effectiveness. Based on the sensitivity analysis of the results, the bi-fuzzy nonlinear MODM SCM model is proved to be sensitive to the possibility level ?1. Practical implications: The study focuses on SCM under a complex fuzzy environment, which has great practical significance. Therefore, the bi-fuzzy MODM model and MO-APSO can be further applied to SCM problems with quantity discount policies. Originality/value: The bi-fuzzy variable is employed in the nonlinear MODM model of SCM to characterize the hybrid uncertain environment, and this work is original. In addition, a hybrid crisp approach is proposed to transform the model into an equivalent crisp one by means of the DMs' degree of optimism and the expected value index. Since the MODM model considers both the bi-fuzzy environment and the quantity discount policy, the paper has great practical significance.
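
    The expected-value defuzzification step mentioned above can be sketched as follows, using the standard credibilistic expected value of a trapezoidal fuzzy number, E[(a, b, c, d)] = (a + b + c + d)/4; the cost coefficients are illustrative numbers, not data from the paper.

        from dataclasses import dataclass

        @dataclass
        class TrapezoidalFuzzy:
            """Trapezoidal fuzzy number with a <= b <= c <= d."""
            a: float
            b: float
            c: float
            d: float

            def expected_value(self) -> float:
                # Credibilistic expected value of a trapezoidal fuzzy variable.
                return (self.a + self.b + self.c + self.d) / 4.0

        # Illustrative fuzzy unit costs for two quantity-discount intervals (assumed numbers).
        cost_low_qty = TrapezoidalFuzzy(9.0, 9.5, 10.5, 11.0)    # no discount
        cost_high_qty = TrapezoidalFuzzy(8.0, 8.4, 9.2, 9.6)     # quantity discount

        for name, cost in (("low quantity", cost_low_qty), ("high quantity", cost_high_qty)):
            print(f"{name}: crisp (expected) unit cost = {cost.expected_value():.2f}")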

  3. Affordances perspective and grammaticalization: Incorporation of language, environment and users in the model of semantic paths

    Directory of Open Access Journals (Sweden)

    Alexander Andrason

    2015-12-01

    Full Text Available The present paper demonstrates that insights from the affordances perspective can contribute to developing a more comprehensive model of grammaticalization. The authors argue that the grammaticalization process is afforded differently depending on the values of three contributing parameters: the factor (schematized as a qualitative-quantitative map or a wave of a gram), the environment (understood as the structure of the stream along which the gram travels), and the actor (narrowed to certain cognitive-epistemological capacities of the users, in particular to the fact of being a native speaker). By relating grammaticalization to these three parameters and by connecting it to the theory of optimization, the proposed model offers a better approximation to realistic cases of grammaticalization: the actor and environment are overtly incorporated into the model, and divergences from canonical grammaticalization paths are both tolerated and explicable.

  4. A model for assessing information technology effectiveness in the business environment

    Directory of Open Access Journals (Sweden)

    Sandra Cristina Riascos Erazo

    2010-04-01

    Full Text Available The impact of technology on administrative processes has improved business strategies (especially regarding the effect of information technology - IT), often leading to organisational success. Given such importance, its effectiveness in this environment was modelled; this paper describes studying a series of models aimed at assessing IT, its advantages and disadvantages. A model is proposed involving different aspects for an integral assessment of IT effectiveness and considering administrative activities’ particular characteristics. This analytical study provides guidelines for identifying IT effectiveness in a business environment and current key strategies in technological innovation. This study was based on ISO 9126, ISO 9001, ISO 15939 and ISO 25000 standards as well as COBIT and CMM standards.

  5. A model for assessing information technology effectiveness in the business environment

    Directory of Open Access Journals (Sweden)

    Sandra Cristina Riascos Erazo

    2008-05-01

    Full Text Available The impact of technology on administrative processes has improved business strategies (especially regarding the effect of information technology - IT), often leading to organisational success. Given such importance, its effectiveness in this environment was modelled; this paper describes studying a series of models aimed at assessing IT, its advantages and disadvantages. A model is proposed involving different aspects for an integral assessment of IT effectiveness and considering administrative activities’ particular characteristics. This analytical study provides guidelines for identifying IT effectiveness in a business environment and current key strategies in technological innovation. This study was based on ISO 9126, ISO 9001, ISO 15939 and ISO 25000 standards as well as COBIT and CMM standards.

  6. Multifunctional multiscale composites: Processing, modeling and characterization

    Science.gov (United States)

    Qiu, Jingjing

    Carbon nanotubes (CNTs) demonstrate extraordinary properties and show great promise in enhancing out-of-plane properties of traditional polymer/fiber composites and enabling functionality. However, current manufacturing challenges hinder the realization of their potential. In the dissertation research, both experimental and computational efforts have been conducted to investigate effective manufacturing techniques of CNT-integrated multiscale composites. The fabricated composites demonstrated significant improvements in physical properties, such as tensile strength, tensile modulus, inter-laminar shear strength, thermal dimension stability and electrical conductivity. Such multiscale composites were truly multifunctional with the addition of CNTs. Furthermore, a novel hierarchical multiscale modeling method was developed in this research. Molecular dynamics (MD) simulation offered a reasonable explanation of CNT dispersion and motion in polymer solution. Bi-mode finite-extensible-nonlinear-elastic (FENE) dumbbell simulation was used to analyze the influence of the CNT length distribution on the stress tensor and shear-rate-dependent viscosity. Based on the simulated viscosity profile and empirical equations from experiments, a macroscale flow simulation model based on the finite element method (FEM) was developed and validated to predict resin flow behavior in the processing of CNT-enhanced multiscale composites. The proposed multiscale modeling method provided a comprehensive understanding of micro/nano flow in both atomistic detail and at the mesoscale. The simulation model can be used to optimize process design and control of the mold-filling process in multiscale composite manufacturing. This research provided systematic investigations into CNT-based multiscale composites. The results from this study may be used to leverage the benefits of CNTs and open up new application opportunities for high-performance multifunctional multiscale composites. Keywords. Carbon

  7. Development of an interdisciplinary model cluster for tidal water environments

    Science.gov (United States)

    Dietrich, Stephan; Winterscheid, Axel; Jens, Wyrwa; Hartmut, Hein; Birte, Hein; Stefan, Vollmer; Andreas, Schöl

    2013-04-01

    Global climate change has a high potential to influence both the persistence and the transport pathways of water masses and their constituents in tidal waters and estuaries. These processes are linked through dispersion, thus directly influencing the sediment and suspended solid matter budgets and hence the river morphology. Furthermore, the hydrologic regime has an impact on the transport of nutrients, phytoplankton, suspended matter, and temperature that determine the oxygen content within water masses, which is a major parameter describing water quality. This project aims at the implementation of a so-called (numerical) model cluster in tidal waters, which includes the model compartments hydrodynamics, morphology and ecology. For the implementation of this cluster it is required to continue with the integration of different models that work across a wide range of spatial and temporal scales. The model cluster is thus expected to lead to more precise knowledge of the feedback processes between the single interdisciplinary model compartments. In addition to field measurements, this model cluster will provide a complementary scientific basis required to address a spectrum of research questions concerning the integral management of estuaries within the Federal Institute of Hydrology (BfG, Germany). This will in particular include aspects like sediment and water quality management as well as adaptation strategies to climate change. The core of the model cluster will consist of the 3D hydrodynamic model Delft3D (Roelvink and van Banning, 1994); long-term hydrodynamics in the estuaries are simulated with the Hamburg Shelf Ocean Model HAMSOM (Backhaus, 1983; Hein et al., 2012). The simulation results will be compared with the unstructured-grid-based SELFE model (Zhang and Baptista, 2008). The additional coupling of the BfG-developed 1D water quality model QSim (Kirchesch and Schöl, 1999; Hein et al., 2011) with the morphological/hydrodynamic models is an

  8. Geomagnetic Environment Modeling at the Community Coordinated Modeling Center: Successes, Challenges and Perspectives.

    Science.gov (United States)

    Kuznetsova, Maria; Toth, Gabor; Hesse, Michael; Rastaetter, Lutz; Glocer, Alex

    The Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) hosts an expanding collection of modern space science and space weather models developed by the international space science community. The goals of the CCMC are to support the research and developmental work necessary to substantially increase the present-day space environment modeling capability and to maximize the scientific return on investments into model development. CCMC is servicing models through interactive web-based systems, supporting community-wide research projects and designing displays and tools customized for specific applications. The presentation will review the current state of geomagnetic environment modeling, highlight recent progress, and showcase the role of state-of-the-art magnetosphere models in advancing our understanding of fundamental phenomena in magnetospheric plasma physics.

  9. TEACH-ME: IMPLEMENTATION OF MOBILE ENVIRONMENTS TO THE TEACH - LEARNING PROCESS

    Directory of Open Access Journals (Sweden)

    Luis Eduardo Pérez Peregrino

    2011-05-01

    Full Text Available The research project TEACH-ME (Technology, Engineering Calculus and Hewlett-Packard Mobile Environment) presents an educational proposal that seeks to innovate the teaching and learning processes of Mathematics, Logic Basic Programming and Management of Information through the introduction of collaborative working environments, in order to support the integrated development of learning methodologies and enhance cognitive abilities in students. As a case study, it presents the results obtained when applying this project to students in their first semester at the Faculty of Engineering at the “Corporación Universitaria Minuto de Dios” University, which introduced the use of tablet PCs from Hewlett Packard to support the teaching process. This article presents the process of implementing the TEACH-ME Project, developed as an academic environment that has allowed research on the impact of applying information and communication technologies to higher-education teaching. We present the project background, what the implementation process has achieved so far, the impact on the learning and teaching processes, the integration of technologies in the academic setting that has helped carry out the project and, finally, the contributions of the Tablet PC to the teaching-learning process at the University.

  10. Explicitly representing soil microbial processes in Earth system models: Soil microbes in earth system models

    Energy Technology Data Exchange (ETDEWEB)

    Wieder, William R. [Climate and Global Dynamics Division, National Center for Atmospheric Research, Boulder Colorado USA; Allison, Steven D. [Department of Ecology and Evolutionary Biology, University of California, Irvine California USA; Department of Earth System Science, University of California, Irvine California USA; Davidson, Eric A. [Appalachian Laboratory, University of Maryland Center for Environmental Science, Frostburg Maryland USA; Georgiou, Katerina [Department of Chemical and Biomolecular Engineering, University of California, Berkeley California USA; Earth Sciences Division, Lawrence Berkeley National Laboratory, Berkeley California USA; Hararuk, Oleksandra [Natural Resources Canada, Canadian Forest Service, Pacific Forestry Centre, Victoria British Columbia Canada; He, Yujie [Department of Earth System Science, University of California, Irvine California USA; Department of Earth, Atmospheric and Planetary Sciences, Purdue University, West Lafayette Indiana USA; Hopkins, Francesca [Department of Earth System Science, University of California, Irvine California USA; Jet Propulsion Laboratory, California Institute of Technology, Pasadena California USA; Luo, Yiqi [Department of Microbiology & Plant Biology, University of Oklahoma, Norman Oklahoma USA; Smith, Matthew J. [Computational Science Laboratory, Microsoft Research, Cambridge UK; Sulman, Benjamin [Department of Biology, Indiana University, Bloomington Indiana USA; Todd-Brown, Katherine [Department of Microbiology & Plant Biology, University of Oklahoma, Norman Oklahoma USA; Pacific Northwest National Laboratory, Richland Washington USA; Wang, Ying-Ping [CSIRO Ocean and Atmosphere Flagship, Aspendale Victoria Australia; Xia, Jianyang [Department of Microbiology & Plant Biology, University of Oklahoma, Norman Oklahoma USA; Tiantong National Forest Ecosystem Observation and Research Station, School of Ecological and Environmental Sciences, East China Normal University, Shanghai China; Xu, Xiaofeng [Department of Biological Sciences, University of Texas at El Paso, Texas USA

    2015-10-01

    Microbes influence soil organic matter (SOM) decomposition and the long-term stabilization of carbon (C) in soils. We contend that by revising the representation of microbial processes and their interactions with the physicochemical soil environment, Earth system models (ESMs) may make more realistic global C cycle projections. Explicit representation of microbial processes presents considerable challenges due to the scale at which these processes occur. Thus, applying microbial theory in ESMs requires a framework to link micro-scale process-level understanding and measurements to macro-scale models used to make decadal- to century-long projections. Here, we review the diversity, advantages, and pitfalls of simulating soil biogeochemical cycles using microbial-explicit modeling approaches. We present a roadmap for how to begin building, applying, and evaluating reliable microbial-explicit model formulations that can be applied in ESMs. Drawing from experience with traditional decomposition models we suggest: (1) guidelines for common model parameters and output that can facilitate future model intercomparisons; (2) development of benchmarking and model-data integration frameworks that can be used to effectively guide, inform, and evaluate model parameterizations with data from well-curated repositories; and (3) the application of scaling methods to integrate microbial-explicit soil biogeochemistry modules within ESMs. With contributions across scientific disciplines, we feel this roadmap can advance our fundamental understanding of soil biogeochemical dynamics and more realistically project likely soil C response to environmental change at global scales.
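
    As a concrete illustration of the microbial-explicit formulations discussed above, the sketch below integrates a minimal two-pool model in which decomposition follows Michaelis-Menten kinetics in microbial biomass. The structure and all parameter values are generic illustrative assumptions, not a formulation proposed by the authors.

        import numpy as np

        # Minimal microbial-explicit soil carbon model (substrate S, microbial biomass B):
        #   dS/dt = I - Vmax * B * S / (K + S) + m * B     (litter inputs + microbial turnover)
        #   dB/dt = eps * Vmax * B * S / (K + S) - m * B   (growth - mortality)
        I, Vmax, K, eps, m = 1.0, 0.02, 300.0, 0.4, 0.002  # gC m-2 d-1, d-1, gC m-2, -, d-1 (assumed)

        def step(S, B, dt=1.0):
            uptake = Vmax * B * S / (K + S)
            dS = I - uptake + m * B
            dB = eps * uptake - m * B
            return S + dt * dS, B + dt * dB

        S, B = 1000.0, 20.0                    # initial stocks, gC m-2 (assumed)
        for _ in range(int(50 * 365)):         # integrate 50 years with daily steps
            S, B = step(S, B)
        print(f"after 50 years: substrate = {S:.0f} gC m-2, microbial biomass = {B:.1f} gC m-2")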

  11. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    A short introduction to software process modeling is presented, with a focus on object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is examined.

  12. The Parental Environment Cluster Model of Child Neglect: An Integrative Conceptual Model.

    Science.gov (United States)

    Burke, Judith; Chandy, Joseph; Dannerbeck, Anne; Watt, J. Wilson

    1998-01-01

    Presents Parental Environment Cluster model of child neglect which identifies three clusters of factors involved in parents' neglectful behavior: (1) parenting skills and functions; (2) development and use of positive social support; and (3) resource availability and management skills. Model offers a focal theory for research, structure for…

  13. Mathematical Modeling of Thermofrictional Milling Process Using ANSYS WB Software

    National Research Council Canada - National Science Library

    K.T. Sherov; M.R. Sikhimbayev; A.K. Sherov; B.S. Donenbayev; A.K. Rakishev; A.B. Mazdubai; M.M. Musayev; A.M. Abeuova

    2017-01-01

    This article presents ANSYS WB-based mathematical modelling of the thermofrictional milling process, which allowed studying the dynamics of thermal and physical processes occurring during the processing...

  14. Application of simplified model to sensitivity analysis of solidification process

    Directory of Open Access Journals (Sweden)

    R. Szopa

    2007-12-01

    Full Text Available The sensitivity models of thermal processes proceeding in the casting-mould-environment system give essential information concerning the influence of physical and technological parameters on the course of solidification. Knowledge of the time-dependent sensitivity field is also very useful in the numerical solution of inverse problems. The sensitivity models can be constructed using the direct approach, that is, by differentiation of the basic energy equations and boundary-initial conditions with respect to the parameter considered. Unfortunately, the analytical form of the equations and conditions obtained can be very complex from both the mathematical and numerical points of view. An alternative approach, based on the differential quotient, can then be applied. In the paper the exact and approximate approaches to the modelling of sensitivity fields are discussed, and examples of computations are also shown.
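
    A minimal sketch of the differential-quotient alternative described above: the sensitivity of a temperature field to a parameter is approximated by re-running the basic model with the parameter perturbed up and down and forming a central difference. The one-dimensional explicit cooling solver and all parameter values are hypothetical placeholders for the casting-mould-environment model, not the authors' formulation.

        import numpy as np

        def solidification_temps(alpha, n=50, steps=2000, dt=1e-3, dx=0.01,
                                 T_init=1500.0, T_env=20.0):
            """Explicit 1-D cooling model; alpha is a thermal diffusivity (hypothetical)."""
            T = np.full(n, T_init)
            for _ in range(steps):
                T_new = T.copy()
                T_new[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2*T[1:-1] + T[:-2])
                T_new[0] = T_new[-1] = T_env          # fixed boundary (mould/environment)
                T = T_new
            return T

        alpha, d_alpha = 1e-5, 1e-7                   # parameter and its small perturbation
        # Sensitivity field dT/d(alpha) from a central differential quotient:
        sens = (solidification_temps(alpha + d_alpha) -
                solidification_temps(alpha - d_alpha)) / (2 * d_alpha)
        print("max |dT/d(alpha)|:", np.abs(sens).max())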

  15. Improving science and mathematics education with computational modelling in interactive engagement environments

    Science.gov (United States)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  16. Runaway sexual selection without genetic correlations: social environments and flexible mate choice initiate and enhance the Fisher process.

    Science.gov (United States)

    Bailey, Nathan W; Moore, Allen J

    2012-09-01

    Female mating preferences are often flexible, reflecting the social environment in which they are expressed. Associated indirect genetic effects (IGEs) can affect the rate and direction of evolutionary change, but sexual selection models do not capture these dynamics. We incorporate IGEs into quantitative genetic models to explore how variation in social environments and mate choice flexibility influence Fisherian sexual selection. The importance of IGEs is that runaway sexual selection can occur in the absence of a genetic correlation between male traits and female preferences. Social influences can facilitate the initiation of the runaway process and increase the rate of trait elaboration. Incorporating costs of choice does not alter the main findings. Our model provides testable predictions: (1) genetic covariances between male traits and female preferences may not exist, (2) social flexibility in female choice will be common in populations experiencing strong sexual selection, (3) variation in social environments should be associated with rapid sexual trait divergence, and (4) secondary sexual traits will be more elaborate than previously predicted. Allowing feedback from the social environment resolves discrepancies between theoretical predictions and empirical data, such as why indirect selection on female preferences, theoretically weak, might be sufficient for preferences to become elaborated. © 2012 The Author(s). Evolution © 2012 The Society for the Study of Evolution.

  17. The challenges to the safety process when using Agile development models

    OpenAIRE

    Jamissen, Hanne-Gro

    2012-01-01

    Safety-related systems have traditionally been developed using models like the V-model. Agile development models are now increasingly used, and the experience with these models makes it tempting to also use Agile models when developing safety-related systems. To do this, Agile development models need to include a safety process that is itself as agile as possible. However, introducing safety activities into an agile environment reintroduces limitations from traditional development. The...

  18. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify the commonalities shared by all branches while describing their differences at the same time. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested both on business-like event logs recorded in a higher-education enterprise resource planning system and on a real case scenario involving a set of Dutch municipalities.
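
    A sketch of the greedy strategy mentioned above, under simplifying assumptions: each configurable node offers a set of variants, a replay_fitness function (a stand-in for process-mining conformance checking, supplied by the caller) scores a fully configured choice against an event log, and the heuristic fixes one node at a time, keeping the variant that currently scores best.

        def greedy_configure(config_nodes, event_log, replay_fitness):
            """config_nodes: dict {node: [variant, ...]}; returns one chosen variant per node.
            replay_fitness(choices, event_log) -> float in [0, 1] (assumed to be supplied)."""
            choices = {node: variants[0] for node, variants in config_nodes.items()}
            for node, variants in config_nodes.items():           # fix nodes one by one
                choices[node] = max(
                    variants,
                    key=lambda v: replay_fitness({**choices, node: v}, event_log))
            return choices

        # Toy usage with a fabricated fitness: reward variants that occur in the log traces.
        log = [["register", "check_auto", "pay"], ["register", "check_auto", "reject"]]
        nodes = {"check": ["check_auto", "check_manual"], "close": ["pay", "archive"]}
        fitness = lambda ch, lg: sum(v in trace for trace in lg for v in ch.values()) / (len(lg) * len(ch))
        print(greedy_configure(nodes, log, fitness))              # -> {'check': 'check_auto', 'close': 'pay'}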

  19. TUNS/TCIS information model/process model

    Science.gov (United States)

    Wilson, James

    1992-01-01

    An Information Model is comprised of graphical and textual notation suitable for describing and defining the problem domain - in our case, TUNS or TCIS. The model focuses on the real world under study. It identifies what is in the problem and organizes the data into a formal structure for documentation and communication purposes. The Information Model is composed of an Entity Relationship Diagram (ERD) and a Data Dictionary component. The combination of these components provide an easy to understand methodology for expressing the entities in the problem space, the relationships between entities and the characteristics (attributes) of the entities. This approach is the first step in information system development. The Information Model identifies the complete set of data elements processed by TUNS. This representation provides a conceptual view of TUNS from the perspective of entities, data, and relationships. The Information Model reflects the business practices and real-world entities that users must deal with.

  20. Three-dimensional model for fusion processes

    Energy Technology Data Exchange (ETDEWEB)

    Olson, A.P.

    1984-01-01

    Active galactic nuclei (AGN) emit unusual spectra of radiation which are interpreted to signify extreme distance, extreme power, or both. The status of AGNs was recently reviewed by Balick and Heckman. It seems that the greatest conceptual difficulty with understanding AGNs is how to form a coherent phenomenological model of their properties. What drives the galactic engine? What and where are the mass-flows of fuel to this engine? Is there more than one engine? Do the engines have any symmetry properties? Is the observed radiation isotropically emitted from the source? If it is polarized, what causes the polarization? Why is there a roughly spherical cloud of ionized gas about the center of our own galaxy, the Milky Way? The purpose of this paper is to discuss a new model, based on fusion processes which are not axisymmetric, uniform, isotropic, or even time-invariant. Then the relationship to these questions will be developed. A unified model of fusion processes applicable to many astronomical phenomena will be proposed and discussed.

  1. Process-Based Modeling of Constructed Wetlands

    Science.gov (United States)

    Baechler, S.; Brovelli, A.; Rossi, L.; Barry, D. A.

    2007-12-01

    Constructed wetlands (CWs) are widespread facilities for wastewater treatment. In subsurface flow wetlands, contaminated wastewater flows through a porous matrix, where oxidation and detoxification phenomena occur. Despite the large number of working CWs, system design and optimization are still mainly based upon empirical equations or simplified first-order kinetics. This results from an incomplete understanding of the system functioning, and may in turn hinder the performance and effectiveness of the treatment process. As a result, CWs are often considered not suitable to meet high water quality-standards, or to treat water contaminated with recalcitrant anthropogenic contaminants. To date, only a limited number of detailed numerical models have been developed and successfully applied to simulate constructed wetland behavior. Among these, one of the most complete and powerful is CW2D, which is based on Hydrus2D. The aim of this work is to develop a comprehensive simulator tailored to model the functioning of horizontal flow constructed wetlands and in turn provide a reliable design and optimization tool. The model is based upon PHWAT, a general reactive transport code for saturated flow. PHWAT couples MODFLOW, MT3DMS and PHREEQC-2 using an operator-splitting approach. The use of PHREEQC to simulate reactions allows great flexibility in simulating biogeochemical processes. The biogeochemical reaction network is similar to that of CW2D, and is based on the Activated Sludge Model (ASM). Kinetic oxidation of carbon sources and nutrient transformations (nitrogen and phosphorous primarily) are modeled via Monod-type kinetic equations. Oxygen dissolution is accounted for via a first-order mass-transfer equation. While the ASM model only includes a limited number of kinetic equations, the new simulator permits incorporation of an unlimited number of both kinetic and equilibrium reactions. Changes in pH, redox potential and surface reactions can be easily incorporated
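
    A minimal sketch of the kinetic core described above: Monod-type oxidation of an organic substrate limited by dissolved oxygen, with re-aeration treated as a first-order mass-transfer term. The pools and parameter values are illustrative assumptions only and are not taken from CW2D, PHWAT, or the ASM.

        # State: S = organic substrate (mg/L), X = biomass (mg/L), O = dissolved oxygen (mg/L)
        mu_max, K_s, K_o = 3.0, 20.0, 0.5       # d-1, mg/L, mg/L (assumed)
        Y, k_la, O_sat = 0.5, 2.0, 9.0          # yield, d-1 mass-transfer coefficient, mg/L

        def derivs(S, X, O):
            growth = mu_max * X * (S / (K_s + S)) * (O / (K_o + O))   # double-Monod kinetics
            dS = -growth / Y
            dX = growth
            dO = k_la * (O_sat - O) - 1.2 * growth    # first-order re-aeration minus uptake
            return dS, dX, dO

        S, X, O, dt = 150.0, 10.0, 6.0, 0.001
        for _ in range(int(5 / dt)):                  # five days, explicit Euler steps
            dS, dX, dO = derivs(S, X, O)
            S, X, O = S + dS * dt, X + dX * dt, O + dO * dt
        print(f"after 5 d: substrate {S:.1f}, biomass {X:.1f}, O2 {O:.2f} mg/L")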

  2. Modelling and control of dynamic systems using gaussian process models

    CERN Document Server

    Kocijan, Juš

    2016-01-01

    This monograph opens up new horizons for engineers and researchers in academia and in industry dealing with or interested in new developments in the field of system identification and control. It emphasizes guidelines for working solutions and practical advice for their implementation rather than the theoretical background of Gaussian process (GP) models. The book demonstrates the potential of this recent development in probabilistic machine-learning methods and gives the reader an intuitive understanding of the topic. The current state of the art is treated along with possible future directions for research. Systems control design relies on mathematical models and these may be developed from measurement data. This process of system identification, when based on GP models, can play an integral part of control design in data-based control and its description as such is an essential aspect of the text. The background of GP regression is introduced first with system identification and incorporation of prior know...
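
    As a small, self-contained illustration of identifying a dynamic system with a GP model (in the spirit of the monograph, but not its code), the sketch below learns a one-step-ahead mapping y(k+1) = f(y(k), u(k)) from simulated data with scikit-learn's Gaussian process regressor; the toy nonlinear plant, the kernel choice, and all constants are assumptions.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)

        # Simulate a toy first-order nonlinear plant y(k+1) = f(y(k), u(k)) + noise.
        def plant(y, u):
            return 0.8 * y + 0.3 * np.tanh(u) - 0.05 * y ** 2

        u = rng.uniform(-2, 2, 300)
        y = np.zeros(301)
        for k in range(300):
            y[k + 1] = plant(y[k], u[k]) + 0.01 * rng.standard_normal()

        X = np.column_stack([y[:-1], u])      # regressors: current output and input
        t = y[1:]                             # target: next output

        gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
        gp.fit(X[:250], t[:250])

        mean, std = gp.predict(X[250:], return_std=True)     # predictive mean and uncertainty
        print("one-step RMSE on held-out data:", np.sqrt(np.mean((mean - t[250:]) ** 2)))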

  3. AN INCLUSIVE APPROACH TO ONLINE LEARNING ENVIRONMENTS: Models and Resources

    Directory of Open Access Journals (Sweden)

    Aline Germain-RUTHERFORD

    2008-04-01

    Full Text Available The impact of ever-increasing numbers of online courses on the demographic composition of classes has meant that the notions of diversity, multiculturality and globalization are now key aspects of curriculum planning. With the internationalization and globalization of education, and faced with rising needs for an increasingly educated and more adequately trained workforce, universities are offering more flexible programs, assisted by new educational and communications technologies. Faced with this diversity of populations and needs, many instructors are becoming aware of the importance of addressing the notions of multiculturality and interculturality in the design of online courses; however, this raises many questions. For example, how do we integrate and address this multicultural dimension in a distance education course aimed at students who live in diverse cultural environments? How do the challenges of intercultural communication in an online environment affect online teaching and learning? What are the characteristics of an online course that is inclusive of all types of diversity, and what are the guiding principles for designing such courses? We will attempt to answer some of these questions by first exploring the concepts of culture and learning cultures. This will help us to characterize the impact on online learning of particular cultural dimensions. We will then present and discuss different online instructional design models that are culturally inclusive, and conclude with the description of a mediated instructional training module on the management of the cultural dimension of online teaching and learning. This module is mainly addressed to teachers and designers of online courses.

  4. Collision avoidance in unstructured environments for autonomous robots: a behavioural modelling approach

    CSIR Research Space (South Africa)

    Yinka-Banjo, CO

    2011-03-01

    Full Text Available for autonomous robots in an unstructured environment with static obstacles. In our approach, an unstructured environment was simulated and the information of the obstacles generated was used to build the Behavioural Bayesian Network Model (BBNM). This model...

  5. A framework for process-solution analysis in collaborative learning environments

    OpenAIRE

    Bravo Santos, Crescencio; Redondo Duque, Miguel Angel; Verdejo Maillo, María Felisa; Ortega Cantero, Manuel

    2008-01-01

    One of the most challenging aspects of computer-supported collaborative learning (CSCL) research is automation of collaboration and interaction analysis in order to understand and improve the learning processes. It is particularly necessary to look in more depth at the joint analysis of the collaborative process and its resulting product. In this article, we present a framework for comprehensive analysis in CSCL synchronous environments supporting a problem-solving approach to learning. This ...

  6. Modelling of Processes of Logistics in Cyberspace Security

    Directory of Open Access Journals (Sweden)

    Konečný Jiří

    2017-01-01

    Full Text Available The goal of this contribution is to familiarize experts in various fields with the need for a new approach to system models and to the modelling of processes in engineering practice, and with the possibilities some state variables offer for modelling real-world systems, given the highly dynamic development of structures and of the behaviour of logistics systems. The contribution therefore argues for making full use of cybernetics as a field for the management and communication of information, and of its environment as a much-needed cybernetic realm (cyberspace) in which a steady state between cyber-attacks and cyber-defence must be maintained as a modern, knowledge-based potential in general and for logistics in cyber security in particular. Connected with this process is the very important area of lifelong training of experts in the dynamic world of science and technology (that is, also in a social system), which is also discussed briefly, together with cyber and information security; all of this falls within the cyberspace of new, prospective electronic learning (e-learning) with the use of modern laboratories, with new effects also for future possibilities of process modelling of artificial intelligence (AI) and the prospect of mass use of UAVs in logistics.

  7. Marketing the use of the space environment for the processing of biological and pharmaceutical materials

    Science.gov (United States)

    1984-01-01

    The perceptions of U.S. biotechnology and pharmaceutical companies concerning the potential use of the space environment for the processing of biological substances was examined. Physical phenomena that may be important in space-base processing of biological materials are identified and discussed in the context of past and current experiment programs. The capabilities of NASA to support future research and development, and to engage in cooperative risk sharing programs with industry are discussed. Meetings were held with several biotechnology and pharmaceutical companies to provide data for an analysis of the attitudes and perceptions of these industries toward the use of the space environment. Recommendations are made for actions that might be taken by NASA to facilitate the marketing of the use of the space environment, and in particular the Space Shuttle, to the biotechnology and pharmaceutical industries.

  8. Upgrading Preschool Environment in a Swedish Municipality: Evaluation of an Implementation Process.

    Science.gov (United States)

    Altin, Carolina; Kvist Lindholm, Sofia; Wejdmark, Mats; Lättman-Masch, Robert; Boldemann, Cecilia

    2015-07-01

    Redesigning the outdoor preschool environment may favorably affect multiple factors relevant to health and reach many children. Cross-sectional studies in various landscapes at different latitudes have explored the characteristics of preschool outdoor environments, considering the play potential that triggers combined physical activity and sun-protective behavior due to space, vegetation, and topography. Criteria were pinpointed to upgrade preschool outdoor environments for multiple health outcomes, to be applied by local government in charge of public preschools. Purposeful land use policies and administrative management of outdoor land use may serve to monitor the quality of preschool outdoor environments (upgrading and planning). This study evaluates the process of implementing routines for upgrading outdoor preschool environments in a medium-sized Swedish municipality, 2008-2011, using qualitative and quantitative analysis. Recorded written material (logs and protocols) related to the project was processed using thematic analysis. Quantitative data (m² of flat/multileveled surface, overgrown/bare surface, and fraction of free visible sky) were analyzed to assess the impact of implementation (surface, topography, greenery integrated in play). The preschool outdoor environments were upgraded accordingly. The quality of implementation was assessed using the policy streams approach. Though long-term impact remains to be confirmed, the process seems to have changed work routines in the internal management for purposeful upgrading of preschool outdoor environments. The aptitude and applicability of inexpensive methods for assessing, selecting, and upgrading preschool land at various latitudes, climates, and outdoor play policies (including gender aspects and staff policies) should be further discussed, as well as the compilation of data for monitoring and evaluation. © 2015 Society for Public Health Education.

  9. Model systems for life processes on Mars

    Science.gov (United States)

    Mitz, M. A.

    1974-01-01

    In the evolution of life forms nonphotosynthetic mechanisms are developed. The question remains whether a total life system could evolve which is not dependent upon photosynthesis. In trying to visualize life on other planets, the photosynthetic process has problems. On Mars, the high intensity of light at the surface is a concern and alternative mechanisms need to be defined and analyzed. In the UV search for alternate mechanisms, several different areas may be identified. These involve activated inorganic compounds in the atmosphere, such as the products of photodissociation of carbon dioxide and the organic material which may be created by natural phenomena. In addition, a life system based on the pressure of the atmospheric constituents, such as carbon dioxide, is a possibility. These considerations may be important for the understanding of evolutionary processes of life on another planet. Model systems which depend on these alternative mechanisms are defined and related to presently planned and future planetary missions.

  10. Representing the environment 3.0. Maps, models, networks.

    Directory of Open Access Journals (Sweden)

    Letizia Bollini

    2014-05-01

    Full Text Available Web 3.0 is changing the way we live in and perceive the anthropized environment, creating a stratification of levels of experience mediated by devices. If the urban landscape is designed, shaped and planned space, there is also a social landscape that overwrites the territory with values, representations, shared images, and narratives of personal and collective history. Mobile technology introduces an additional parameter, a kind of non-place, which allows the coexistence of the here and the elsewhere in a sort of digital landscape. Maps, mental models and the system of social networks then become the way to present, be represented and represent oneself in a kind of ideal coring of the co-present levels of physical, cognitive and collective space.

  11. Virtual building environments (VBE) - Applying information modeling to buildings

    Energy Technology Data Exchange (ETDEWEB)

    Bazjanac, Vladimir

    2004-06-21

    A Virtual Building Environment (VBE) is a ''place'' where building industry project staffs can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software that is operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up and characteristics of operation. It informs about the VBE Initiative and the benefits from a couple of early VBE projects.

  12. A coupled vegetation/sediment transport model for dryland environments

    Science.gov (United States)

    Mayaud, Jerome R.; Bailey, Richard M.; Wiggs, Giles F. S.

    2017-04-01

    Dryland regions are characterized by patchy vegetation, erodible surfaces, and erosive aeolian processes. Understanding how these constituent factors interact and shape landscape evolution is critical for managing potential environmental and anthropogenic impacts in drylands. However, modeling wind erosion on partially vegetated surfaces is a complex problem that has remained challenging for researchers. We present the new, coupled cellular automaton Vegetation and Sediment TrAnsport (ViSTA) model, which is designed to address fundamental questions about the development of arid and semiarid landscapes in a spatially explicit way. The technical aspects of the ViSTA model are described, including a new method for directly imposing oblique wind and transport directions onto a cell-based domain. Verification tests for the model are reported, including stable state solutions, the impact of drought and fire stress, wake flow dynamics, temporal scaling issues, and the impact of feedbacks between sediment movement and vegetation growth on landscape morphology. The model is then used to simulate an equilibrium nebkha dune field, and the resultant bed forms are shown to have very similar size and spacing characteristics to nebkhas observed in the Skeleton Coast, Namibia. The ViSTA model is a versatile geomorphological tool that could be used to predict threshold-related transitions in a range of dryland ecogeomorphic systems.
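
    The toy cellular automaton below is meant only to illustrate the kind of vegetation-sediment feedback that ViSTA formalizes, not its actual rules: on a periodic one-dimensional grid, cells lose sediment with a probability reduced by local vegetation cover, eroded slabs are deposited one cell downwind, and vegetation grows on stable cells while dying back where erosion occurs. All rules and parameters are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100
        height = np.zeros(n)                    # sediment surface height (slabs)
        veg = rng.uniform(0.0, 0.3, n)          # fractional vegetation cover

        for step in range(2000):
            # Erosion: each cell may lose one slab; vegetation shelters the surface.
            p_erode = 0.2 * (1.0 - veg)
            eroded = (rng.random(n) < p_erode) & (height > -5)
            height[eroded] -= 1
            # Deposition one cell downwind (periodic boundary).
            downwind = (np.where(eroded)[0] + 1) % n
            height[downwind] += 1
            # Vegetation grows on stable cells and is damaged where erosion occurred.
            veg = np.clip(veg + 0.01 * (~eroded) - 0.05 * eroded, 0.0, 1.0)

        print("relief (max - min height):", height.max() - height.min())
        print("mean vegetation cover:", round(float(veg.mean()), 2))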

  13. Global food chains and environment: agro-food production and processing in Thailand

    NARCIS (Netherlands)

    Sriwichailamphan, T.H.

    2007-01-01

    In this study on the global food chain and the environment, the objective is to understand the dynamics of food safety and environmental improvements among the large and medium-sized agro-food processing industries and farmers in Thailand that operate in the global

  14. Students' Expectations of the Learning Process in Virtual Reality and Simulation-Based Learning Environments

    Science.gov (United States)

    Keskitalo, Tuulikki

    2012-01-01

    Expectations for simulations in healthcare education are high; however, little is known about healthcare students' expectations of the learning process in virtual reality (VR) and simulation-based learning environments (SBLEs). This research aims to describe first-year healthcare students' (N=97) expectations regarding teaching, studying, and…

  15. Hydrochemical and physical processes influencing salinization and freshening in Mediterranean low-lying coastal environments

    NARCIS (Netherlands)

    Mollema, P.N.; Antonelli, M.; Dinelli, E.; Gabbianelli, G.; Greggio, N.; Stuijfzand, P.J.

    2013-01-01

    Ground- and surface water chemistry and stable isotope data from the coastal zone near Ravenna (Italy) have been examined to determine the geochemical conditions and processes that occur and their implications for fresh water availability in the various brackish/saline coastal environments. Fresh

  16. Journey into the Problem-Solving Process: Cognitive Functions in a PBL Environment

    Science.gov (United States)

    Chua, B. L.; Tan, O. S.; Liu, W. C.

    2016-01-01

    In a PBL environment, learning results from learners engaging in cognitive processes pivotal in the understanding or resolution of the problem. Using Tan's cognitive function disc, this study examines the learner's perceived cognitive functions at each stage of PBL, as facilitated by the PBL schema. The results suggest that these learners…

  17. Collaborative Learning Processes in an Asynchronous Environment: An Analysis through Discourse and Social Networks

    Science.gov (United States)

    Tirado, Ramon; Aguaded, Ignacio; Hernando, Angel

    2011-01-01

    This article analyses an experience in collaborative learning in an asynchronous writing environment through discussion forums on a WebCt platform of the University of Huelva's virtual campus, and was part of an innovative teaching project in 2007-08. The main objectives are to describe the processes of collaborative knowledge construction and the…

  18. Designing discovery learning environments: process analysis and implications for designing an information system

    NARCIS (Netherlands)

    Pieters, Julius Marie; Limbach, R.; de Jong, Anthonius J.M.

    2004-01-01

    A systematic analysis of the design process of authors of (simulation based) discovery learning environments was carried out. The analysis aimed at identifying the design activities of authors and categorising knowledge gaps that they experience. First, five existing studies were systematically

  19. A Logic-Based Prototyping Environment For Process Oriented Second Generation Expert Systems

    Science.gov (United States)

    Freeman, Edward H.

    1988-03-01

    Logic programming methods have been used to implement an expert system shell which represents and reasons about the implications of causal event streams, in arbitrarily complex causal networks. The system features the capability of "forward" and "backward" inference, "fuzzy" causal inference, default vs conditional reasoning, the ability to combine 1st generation "if-then" heuristics with 2nd generation structural inference during fault diagnosis exercises and the ability to generate inferences from empirical data as well as from conceptually based models of causal relationships. A description of the direct causal connections between pairs of focal events (the causal topology), along with the enumeration of policy variables (manipulable inputs), system observables (events which can be observed) and goal variables (desired outputs) enable declarative representations of meta-level causal structures used during the causal inference process. The simplicity of these input requirements, the expressive power of an open ended Logic Programming environment and the availability of a rich set of analytic tools all combine to provide the knowledge engineer with the ability to quickly build systems which can reason symbolically about complex causal structures.

  20. Empirical modeling of eucalyptus wood processing

    Energy Technology Data Exchange (ETDEWEB)

    Parajo, J.C.; Alonso, J.L.; Lage, M.A.; Vazquez, D. (Dept. of Analytical Chemistry, Nutrition and Bromatology, Univ. of Santiago de Compostela, La Coruna (Spain))

    1992-11-01

    Eucalyptus globulus wood samples were treated with NaOH solutions in order to obtain substrates highly susceptible to enzymatic hydrolysis. The experiments performed in the extraction and hydrolysis stages followed an incomplete factorial design. Temperature, NaOH concentration and extraction time were considered independent variables. Their influence on five dependent variables (defined to measure the extraction yield, the chemical composition of processed samples and the enzymatic conversion) was assessed using second-order empirical models. In addition to the experimental results, other aspects related to the extraction selectivity are discussed. (orig.).
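
    A sketch of fitting a second-order empirical model of the kind referred to above: a quadratic response surface in coded temperature, NaOH concentration, and extraction time is fitted by ordinary least squares. The design points and "measured" yields are fabricated just to make the sketch runnable; only the model form (linear, interaction, and squared terms) follows the abstract.

        import numpy as np
        from itertools import combinations, product

        rng = np.random.default_rng(2)

        # Face-centred composite design in coded units (-1, 0, +1) for temperature,
        # NaOH concentration, and extraction time (a fabricated stand-in design).
        corners = np.array(list(product([-1, 1], repeat=3)), dtype=float)
        axial = np.array([[s if k == j else 0 for k in range(3)]
                          for j in range(3) for s in (-1, 1)], dtype=float)
        X = np.vstack([corners, axial, np.zeros((1, 3))])

        # Fabricated "measured" extraction yields (%), generated from an arbitrary surface.
        surface = lambda x: 55 + 6*x[0] + 3*x[1] + 2*x[2] + 1.5*x[0]*x[1] - 2*x[0]**2
        y = np.array([surface(x) for x in X]) + rng.normal(0, 0.5, len(X))

        def design_matrix(X):
            cols = [np.ones(len(X))]
            cols += [X[:, j] for j in range(3)]                                  # linear terms
            cols += [X[:, i] * X[:, j] for i, j in combinations(range(3), 2)]    # interactions
            cols += [X[:, j] ** 2 for j in range(3)]                             # quadratic terms
            return np.column_stack(cols)

        coef, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
        print("second-order model coefficients:", np.round(coef, 2))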

  1. A neuroconstructivist model of past tense development and processing.

    Science.gov (United States)

    Westermann, Gert; Ruh, Nicolas

    2012-07-01

    We present a neural network model of learning and processing the English past tense that is based on the notion that experience-dependent cortical development is a core aspect of cognitive development. During learning the model adds and removes units and connections to develop a task-specific final architecture. The model provides an integrated account of characteristic errors during learning the past tense, adult generalization to pseudoverbs, and dissociations between verbs observed after brain damage in aphasic patients. We put forward a theory of verb inflection in which a functional processing architecture develops through interactions between experience-dependent brain development and the structure of the environment, in this case, the statistical properties of verbs in the language. The outcome of this process is a structured processing system giving rise to graded dissociations between verbs that are easy and verbs that are hard to learn and process. In contrast to dual-mechanism accounts of inflection, we argue that describing dissociations as a dichotomy between regular and irregular verbs is a post hoc abstraction and is not linked to underlying processing mechanisms. We extend current single-mechanism accounts of inflection by highlighting the role of structural adaptation in development and in the formation of the adult processing system. In contrast to some single-mechanism accounts, we argue that the link between irregular inflection and verb semantics is not causal and that existing data can be explained on the basis of phonological representations alone. This work highlights the benefit of taking brain development seriously in theories of cognitive development. Copyright 2012 APA, all rights reserved.

  2. Comments on: Spatiotemporal models for skewed processes

    KAUST Repository

    Genton, Marc G.

    2017-09-04

    We would first like to thank the authors for this paper that highlights the important problem of building models for non-Gaussian space-time processes. We will hereafter refer to the paper as SGV, and we also would like to acknowledge and thank them for providing us with the temporally detrended temperatures, plotted in their Figure 1, along with the coordinates of the twenty-one locations and the posterior means of the parameters for the MA1 model. We find much of interest to discuss in this paper, and as we progress through points of interest, we pose some questions to the authors that we hope they will be able to address.

  3. Time models and cognitive processes: a review

    Directory of Open Access Journals (Sweden)

    Michail eManiadakis

    2014-02-01

    Full Text Available The sense of time is an essential capacity of humans, with a major role in many of the cognitive processes expressed in our daily lives. So far, in cognitive science and robotics research, mental capacities have been investigated in a theoretical and modelling framework that largely neglects the flow of time. Only recently has there been a small but constantly increasing interest in the temporal aspects of cognition, integrating time into a range of different models of perceptuo-motor capacities. The current paper aims to review existing works in the field and suggest directions for fruitful future work. This is particularly important for the newly developed field of artificial temporal cognition, which is expected to contribute significantly to the development of sophisticated artificial agents seamlessly integrated into human societies.

  4. Models of Fixation and Tissue Processing

    Science.gov (United States)

    Grizzle, William E.

    2009-01-01

    in 10% NBF, i.e., immunorecognition is almost completely lost for such antibody-antigen combinations as Ki67/MIB, ERα and PR, and partially lost for Bcl-2. Several models have been developed to study the interactions of tissue fixation and immunorecognition, but most have viewed the problem in immunorecognition as being caused entirely by fixation. Also, some of the models discussed in this special issue do not predict observations of the effects of fixation on frozen tissues fixed in 10% NBF but not processed to paraffin blocks. This article is a brief review of issues with using 10% NBF combined with tissue processing to study biomarkers as identified by immunohistochemistry. PMID:19886755

  5. Privatization processes in banking: Motives and models

    Directory of Open Access Journals (Sweden)

    Ristić Života

    2006-01-01

    Full Text Available The paper consists of three methodologically and causally connected thematic parts. The first part deals with the crucial motives and models of the privatization processes in the USA and EU, with a particular analytical focus on the Herfindahl-Hirschman doctrine of the collective-dominance index, as well as on the essence of merger-acquisition and take-over models. The second thematic part, as a logical continuation of the first, offers a brief comparative analysis of the motives and models implemented in bank privatization in the south-eastern European countries, with particular focus on identifying the interests of foreign investors, the optimal volume and price of the investment, and an assessment of finalized privatizations in those countries. The final part, which stems theoretically and practically from the first and second parts and thereby forms an interdependent and compatible thematic whole with them, presents qualitative and quantitative aspects of the analysis of finalized privatizations and/or the sale-purchase of Serbian banks, with particular focus on IPO and IPOPLUS as the prevailing models of future sale-purchase in privatizing Serbian banks.
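
    Since the abstract leans on the Herfindahl-Hirschman doctrine, a small sketch of the underlying computation may help: the HHI is the sum of squared market shares expressed in percent, and the result is compared against commonly used concentration bands. The market shares and the 1500/2500 thresholds below are illustrative, not taken from the paper.

        def hhi(market_shares_percent):
            """Herfindahl-Hirschman index: sum of squared market shares in percent."""
            return sum(s ** 2 for s in market_shares_percent)

        # Hypothetical post-privatization banking market (shares in % of total assets).
        shares = [28, 22, 15, 10, 8, 7, 5, 5]
        index = hhi(shares)
        band = ("unconcentrated" if index < 1500 else
                "moderately concentrated" if index <= 2500 else
                "highly concentrated")
        print(f"HHI = {index} -> {band}")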

  6. A Landscape Disturbance Process in the Marine Environment: Revising Expectations of Climate Change Impacts.

    Science.gov (United States)

    Robles, C.; Halpin, P. M.; Schrecengost, R.; Orr, D.; Aleman-Zometa, J.

    2016-02-01

    Episodic disturbance is a ubiquitous feature of natural communities. An archetype for theory in the marine environment is wave-torn gaps in the cover of intertidal mussel beds. We mapped gap formation in eight mussel beds over eight successive years in Barkley Sound, British Columbia. We constructed a GIS database integrating small-scale measurements of topography, wave force, the 3-D structure of the mussel aggregations, and photo-mosaics of the mussel covers. Photographic analysis showed that gaps recurred predominantly in the center of the beds. The more stable peripheral regions of the beds are continually thinned by physical and biotic stresses, while the center region thickens and differentiates into layers. The mussels comprising the superficial layer attach to each other and have no direct attachment to the rock. Furthermore, the superficial layer suppresses mussels in the interior, weakening their attachment. Spatial analyses showed that it is these structural differences, rather than the spatial distribution of wave force, that account for landscape patterns of gap formation. Thus, sub-regions of disturbance arise from processes intrinsic to the community, including self-organization of the mussel aggregation, which interact with the external forcing of waves in a stochastic but coherent landscape process. This view differs from published models, which assume wave force alone generates gaps randomly across the mussel beds. Long-term records of seismographic activity indicate that the wave beat on the shore is increasing with global warming. We may fail to accurately anticipate the consequences of the elevated wave forcing, unless we also take into consideration changes in ocean production and other factors affecting mussel bed structure.

  7. STEPP: A Grounded Model to Assure the Quality of Instructional Activities in e-Learning Environments

    Directory of Open Access Journals (Sweden)

    Hamdy AHMED ABDELAZIZ

    2013-07-01

    Full Text Available The present theoretical paper aims to develop a grounded model for designing instructional activities appropriate to e-learning and online learning environments. The suggested model is guided by cognitivist, constructivist, and connectivist learning principles to help online learners construct meaningful experiences and move from knowledge acquisition to knowledge creation. The proposed model consists of five dynamic and grounded domains that assure the quality of designing and using e-learning activities: the Social, Technological, Epistemological, Psychological, and Pedagogical domains. Each of these domains needs four types of presence to reflect the design and application of e-learning activities: cognitive presence, human presence, psychological presence and mental presence. Applying the proposed model (STEPP) throughout all online and adaptive e-learning environments may improve the process of designing and developing e-learning activities to be used as mindtools for current and future learners.

  8. How Plant Hydraulics can Improve the Modeling of Plant and Ecosystem Responses to Environment

    Science.gov (United States)

    Sperry, J.; Anderegg, W.; Mackay, D. S.; Venturas, M.

    2016-12-01

    Stomatal regulation is an important, yet problematic component in modeling plant-environment interactions. The problem is that stomata respond to so many environmental cues via complex and uncertain mechanisms. But the assumed end result of regulation is conceptually simple: an optimization of CO2 for H2O exchange in response to changing conditions. Stomata open when photosynthetic opportunity is high and water is cheap. They close if photosynthetic opportunity is low or water is very expensive. Photosynthetic opportunity is relatively easy to model. The cost of water loss is also easy to model if it is assumed to rise with greater proximity to hydraulic failure and desiccation. Unsaturated hydraulic conductivity curves of soil- and plant are used to estimate proximity to failure. At any given instant, a model can calculate opportunity and cost curves associated with greater stomatal opening. If stomata regulate to maximize the instantaneous difference between photosynthetic gain and hydraulic cost, then a model can predict the trajectory of stomatal responses to changes in environment across time. Results of this optimization routine extend the utility of hydraulic predecessor models, and are consistent with widely used empirical models across a wide range of vapor pressure deficit and ambient CO2 concentrations for wet soil. The advantage of the optimization approach is the absence of empirical coefficients, applicability to dry as well as wet soil, and prediction of plant hydraulic status along with gas exchange. The optimization algorithm is a trait- and process-based approach that could improve next generation land surface models.
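
    A toy numerical sketch of the optimization idea described above: over a grid of stomatal conductances, a saturating photosynthetic gain curve is compared with a hydraulic cost that accelerates as transpiration approaches the point of hydraulic failure, and the conductance maximizing gain minus cost is selected. The functional forms and parameter values are illustrative assumptions, not the authors' formulation.

        import numpy as np

        g = np.linspace(0.0, 0.5, 501)       # stomatal conductance (mol m-2 s-1)
        vpd = 1.5                            # vapour pressure deficit, kPa (assumed)
        E = g * vpd                          # transpiration, toy proportionality
        E_crit = 0.6                         # transpiration at hydraulic failure (assumed)

        gain = 1.0 - np.exp(-8.0 * g)        # normalized photosynthetic gain, saturating
        cost = (E / E_crit) ** 2             # normalized hydraulic cost, rising toward failure

        profit = gain - cost
        best = np.argmax(profit)
        print(f"optimal conductance ~ {g[best]:.3f} mol m-2 s-1, "
              f"gain {gain[best]:.2f}, cost {cost[best]:.2f}")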

  9. A Workflow Environment for Reactive Transport Modeling with Application to a Mixing- Controlled Precipitation Experiment

    Science.gov (United States)

    Schuchardt, K. L.; Sun, L.; Chase, J. M.; Elsethagen, T. O.; Freedman, V. L.; Redden, G. D.; Scheibe, T. D.

    2007-12-01

    Advances in subsurface modeling techniques such as multi-scale methods, hybrid models, and inverse modeling, combined with petascale computing capabilities, will result in simulations that run over longer time scales, cover larger geographic regions, and model increasingly detailed physical processes. This will lead to significantly more data of increased complexity, creating challenges to already strained processes for parameterizing and running models, organizing and tracking data, and visualizing outputs. To support effective development and utilization of next-generation simulators, we are developing a process integration framework that combines and extends leading edge technologies for process automation, data and metadata management, and large-scale data visualization. Our process integration framework applies workflow techniques to integrate components for accessing and preparing inputs, running simulations, and analyzing results. Data management and provenance middleware enables sharing and community development of data sources and stores full information about data and processes. In the hands of modelers, experimentalists, and developers, the process integration framework will improve efficiency, accuracy and confidence in results, and broaden the array of theories available. In this poster (which will include a live computer demo of the workflow environment) we will present a prototype of the process integration framework, developed to address a selected benchmark problem. The prototype is being used to perform simulations of an intermediate-scale experiment in which a solid mineral is precipitated from the reaction of two mixing solutes. A range of possible experimental configurations are being explored to support design of a planned set of experiments incorporating heterogeneous media. The prototype provides a user interface to specify parameter ranges, runs the required simulations on a user specified machine, automatically manages the input and output

  10. Dynamic stepping information process method in mobile bio-sensing computing environments.

    Science.gov (United States)

    Lee, Tae-Gyu; Lee, Seong-Hoon

    2014-01-01

    Recently, interest in human longevity free from disease has been converging into a single system framework, along with the development of mobile computing environments, the diversification of remote medical systems, and the aging of society. Such a converged system enables the implementation of a bioinformatics system that provides various supplementary information services by sensing and gathering the health conditions and bio-information of mobile users to build up medical information. The existing bio-information system performs a static, unchanging process: once the bio-information process defined at initial system configuration has been executed, it does not change. Such a static process, however, is ineffective for a mobile bio-information system performing mobile computing. In particular, the process configuration of the bio-information system and any change of method carry the inconvenient duty of defining and executing a new initialization. This study proposes a dynamic process design and execution method to overcome this inefficiency.

  11. Business model transformation process in the context of business ecosystem

    OpenAIRE

    Heikkinen, A.-M. (Anne-Mari)

    2014-01-01

    Abstract It is a current phenomenon that the business environment has changed and has set new requirements for companies. Companies must adapt to changes that come from outside their normal business environment and take into consideration the wider business environment in which they operate. These changes have also set new demands for the company business model. Companies Busin...

  12. Investigating Pre-service Mathematics Teachers’ Geometric Problem Solving Process in Dynamic Geometry Environment

    Directory of Open Access Journals (Sweden)

    Deniz Özen

    2013-03-01

    Full Text Available The aim of this study is to investigate pre-service elementary mathematics teachers’ open geometric problem-solving processes in a Dynamic Geometry Environment. With its qualitative inquiry-based research design, the participants of the study are three pre-service teachers in the 4th year of the Department of Elementary Mathematics Teaching. Clinical interviews, screen captures of the problem-solving process in the Cabri Geometry Environment, and worksheets containing two open geometry problems were used to collect the data. All the participants were found to pass through similar recursive phases of construction, exploration, conjecture, validation, and justification in the problem-solving process. This study is thought to provide a new point of view to curriculum developers, teachers and researchers.

  13. LARGE SCALE IMAGE PROCESSING IN REAL-TIME ENVIRONMENTS WITH KAFKA

    OpenAIRE

    Yoon-Ki Kim; Chang-Sung Jeong

    2017-01-01

    Recently, the real-time image data being generated is increasing not only in resolution but also in amount. These large-scale images originate from a large number of camera channels. GPUs can be used for high-speed image processing, but a single GPU cannot handle large-scale image processing efficiently. In this paper, we provide a new method for constructing a distributed environment for real-time processing of large-scale images using the open-source system Apache Kafka. This m...
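
    A minimal sketch of the pattern described above, written with the third-party kafka-python client (the paper does not specify a client library): camera frames are JPEG-encoded and published to a Kafka topic by a producer, and a consumer pulls them for per-frame processing. The broker address, topic name, and processing step are placeholders.

        # pip install kafka-python opencv-python numpy   (assumed dependencies)
        import cv2
        import numpy as np
        from kafka import KafkaProducer, KafkaConsumer

        BROKER, TOPIC = "localhost:9092", "camera-frames"    # hypothetical deployment values

        def produce(camera_index=0, n_frames=100):
            producer = KafkaProducer(bootstrap_servers=BROKER)
            cap = cv2.VideoCapture(camera_index)
            for _ in range(n_frames):
                ok, frame = cap.read()
                if not ok:
                    break
                ok, jpg = cv2.imencode(".jpg", frame)        # compress before publishing
                if ok:
                    producer.send(TOPIC, jpg.tobytes())
            producer.flush()

        def consume():
            consumer = KafkaConsumer(TOPIC, bootstrap_servers=BROKER)
            for msg in consumer:                             # each message is one encoded frame
                frame = cv2.imdecode(np.frombuffer(msg.value, dtype=np.uint8), cv2.IMREAD_COLOR)
                # ... per-frame processing (e.g., detection or filtering) would go here ...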

  14. DYNAMIC INTELLECTUAL SYSTEM OF PROCESS MANAGEMENT IN INFORMATION AND EDUCATION ENVIRONMENT OF HIGHER EDUCATIONAL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Yuriy F. Telnov

    2013-01-01

    Full Text Available The paper presents a technology for applying a dynamic intelligent process management system to the integrated information-educational environment of a university, providing community access in order to develop flexible education programs and teaching manuals based on a multi-agent and service-oriented architecture. The article describes a prototype of the dynamic intelligent process management system used for assembling educational-methodical materials. The efficiency of creating and using the dynamic intelligent process management system is evaluated.

  15. Process Modeling Applied to Metal Forming and Thermomechanical Processing

    Science.gov (United States)

    1984-09-01

    An advance which greatly enhances the efficiency of the FEM for metalforming calculations is the development of the matrix-method program ALPID (Analysis ...). The report's contents include chapters on numerical methods in metalforming (N. Rebelo), finite element simulation of forging processes (N. Rebelo), and computer-aided design of extrusion ... Summary: flow-based analysis of metal deformation processing provides a new design perspective to assess optimum process variables and workpiece material

  16. Atrazine degradation using chemical-free process of USUV: Analysis of the micro-heterogeneous environments and the degradation mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Xu, L.J., E-mail: xulijie827@gmail.com [Department of Civil and Environmental Engineering, The Hong Kong Polytechnic University, Hung Hom, Kowloon (Hong Kong); Chu, W., E-mail: cewchu@polyu.edu.hk [Department of Civil and Environmental Engineering, The Hong Kong Polytechnic University, Hung Hom, Kowloon (Hong Kong); Graham, Nigel, E-mail: n.graham@imperial.ac.uk [Department of Civil and Environmental Engineering, Imperial College London, South Kensington Campus, London SW7 2AZ (United Kingdom)

    2014-06-30

    Graphical abstract: - Highlights: • Two chemical-free AOP processes are combined to enhance atrazine degradation. • ATZ degradation in the sonophotolytic process was analyzed using a previously proposed model. • The micro-bubble/liquid heterogeneous environments in sonolytic processes were investigated. • The salt effects on different sonolytic processes were examined. • ATZ degradation mechanisms were investigated and pathways were proposed. - Abstract: The effectiveness of sonolysis (US), photolysis (UV), and sonophotolysis (USUV) for the degradation of atrazine (ATZ) was investigated. An atypical kinetics analysis was found useful for describing the combined process, and is compatible with pseudo first-order kinetics. The heterogeneous environments produced by two different ultrasound frequencies (20 and 400 kHz) were evaluated. The heterogeneous distribution of ATZ in the ultrasonic solution was found to be critical in determining the reaction rates at different frequencies. The presence of NaCl promoted or inhibited the rates through the growth and decline of the “salting out” effect and the surface tension. The benefits of combining the two processes were investigated for the first time from the perspective of promoting the degradation of intermediates that were resistant in the individual processes. UV caused a rapid transformation of ATZ to 2-hydroxyatrazine (OIET), which was insensitive to UV irradiation; however, US and USUV were able to degrade OIET and other intermediates through •OH attack. On the other hand, UV irradiation could also promote radical generation via H2O2 decomposition, thereby resulting in less accumulation of the more hydrophilic intermediates, which are difficult to degrade in the US process. Reaction pathways for ATZ degradation by all three processes are proposed. USUV achieved the greatest degree of ATZ mineralization, with more than 60% of TOC removed, contributed solely by the oxidation of the side chains. Ammeline was found to be the only end-product in both US
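
    Because the abstract analyses degradation with pseudo first-order kinetics, a short sketch of how such rate constants are typically extracted: ln(C/C0) is regressed against time for each process and the slopes compared. The concentration-time data below are fabricated placeholders, not the paper's measurements.

        import numpy as np

        t = np.array([0, 5, 10, 20, 30, 45, 60], dtype=float)        # time, min
        # Fabricated ATZ concentrations (uM) for US-, UV- and USUV-type runs.
        runs = {
            "US":   np.array([10.0, 9.4, 8.9, 7.9, 7.1, 6.0, 5.1]),
            "UV":   np.array([10.0, 8.3, 6.9, 4.8, 3.3, 1.9, 1.1]),
            "USUV": np.array([10.0, 7.6, 5.8, 3.3, 1.9, 0.8, 0.35]),
        }

        for name, c in runs.items():
            # Pseudo first-order: ln(C/C0) = -k t, so k is minus the slope of a line fit.
            k = -np.polyfit(t, np.log(c / c[0]), 1)[0]
            print(f"{name:5s} k = {k:.3f} min-1, half-life = {np.log(2) / k:.1f} min")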

  17. The MASTER-99 space debris and meteoroid environment model

    Science.gov (United States)

    Klinkrad, H.; Bendisch, J.; Bunte, K. D.; Krag, H.; Sdunnus, H.; Wegener, P.

    2001-01-01

    MASTER-99 is a space debris and meteoroid environment model produced by TU Braunschweig (D), eta_max space (D), and DERA (UK) under an ESA contract. The model allows the computation of particulate impact fluxes on any terrestrial target orbit up to geostationary altitudes. Flux contributions can be discriminated with respect to debris source types (catalog objects, explosion and collision fragments, NaK droplets, solid rocket motor dust and slag, impact ejecta, and surface degradation products), meteoroid source types (Divine-Staubach populations, and annual stream events), and with respect to the origin and impact direction of each flux-contributing particulate. Impact fluxes of meteoroids and debris down to 1 μm sizes can be determined for spherical targets, for tumbling plates, or for oriented, planar surfaces which are controlled according to standard attitude steering laws. MASTER-99 is distributed by ESA/ESOC on a CD ROM which includes user documentation, and the necessary data files, executables, and GUI-driven installation scripts for the most common operating systems and computer platforms. MASTER-99 is delivered together with PROOF-99, a program for radar and optical observation forecasting. Based on the MASTER-99 population larger than 1 mm, it predicts debris detections from ground-based or space-based sensors (radars or telescopes) with user-defined system performance.
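
    Flux outputs of the kind MASTER-99 produces are commonly turned into impact risk under a Poisson assumption: the expected number of impacts is N = F·A·T for flux F, exposed area A, and mission time T, and the probability of at least one impact is 1 - exp(-N). The flux value below is a made-up placeholder, not a MASTER-99 result.

        import math

        flux = 2.0e-5      # impacts m-2 yr-1 above the chosen size threshold (placeholder value)
        area = 15.0        # exposed cross-section, m2
        years = 7.0        # mission duration

        expected_hits = flux * area * years
        p_at_least_one = 1.0 - math.exp(-expected_hits)
        print(f"expected impacts: {expected_hits:.4f}, P(>=1 impact) = {p_at_least_one:.4%}")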

  18. Atomistic Modeling of Corrosion Events at the Interface between a Metal and Its Environment

    Directory of Open Access Journals (Sweden)

    Christopher D. Taylor

    2012-01-01

    Full Text Available Atomistic simulation is a powerful tool for probing the structure and properties of materials and the nature of chemical reactions. Corrosion is a complex process that involves chemical reactions occurring at the interface between a material and its environment and is, therefore, highly suited to study by atomistic modeling techniques. In this paper, the complex nature of corrosion processes and mechanisms is briefly reviewed. Various atomistic methods for exploring corrosion mechanisms are then described, and recent applications in the literature surveyed. Several instances of the application of atomistic modeling to corrosion science are then reviewed in detail, including studies of the metal-water interface, the reaction of water on electrified metallic interfaces, the dissolution of metal atoms from metallic surfaces, and the role of competitive adsorption in controlling the chemical nature and structure of a metallic surface. Some perspectives are then given concerning the future of atomistic modeling in the field of corrosion science.

  19. Measuring the precision of multi-perspective process models

    NARCIS (Netherlands)

    Mannhardt, Felix; De Leoni, Massimiliano; Reijers, Hajo A.; Van Der Aalst, Wil M P

    2016-01-01

    Process models need to reflect the real behavior of an organization’s processes to be beneficial for several use cases, such as process analysis, process documentation and process improvement. One quality criterion for a process model is that it should be precise and not express more behavior than

  20. Statistical models for genotype by environment data: from conventional ANOVA models to eco-physiological QTL models

    NARCIS (Netherlands)

    Eeuwijk, van F.A.; Malosetti, M.; Yin, X.; Struik, P.C.; Stam, P.

    2005-01-01

    To study the performance of genotypes under different growing conditions, plant breeders evaluate their germplasm in multi-environment trials. These trials produce genotype × environment data. We present statistical models for the analysis of such data that differ in the extent to which additional
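
    A small sketch of the simplest member of the model family discussed above: a conventional two-way decomposition of genotype × environment data into a grand mean, genotype and environment main effects, and a G × E interaction term. The trial data are fabricated only to make the example runnable.

        import numpy as np

        rng = np.random.default_rng(3)
        genotypes, environments = 5, 4

        # Fabricated yields for a multi-environment trial (one value per genotype x environment cell).
        mu = 6.0
        g_eff = rng.normal(0, 0.5, genotypes)
        e_eff = rng.normal(0, 0.8, environments)
        yields = mu + g_eff[:, None] + e_eff[None, :] + rng.normal(0, 0.3, (genotypes, environments))

        # Conventional two-way decomposition: grand mean, main effects, G x E interaction.
        grand = yields.mean()
        g_hat = yields.mean(axis=1) - grand
        e_hat = yields.mean(axis=0) - grand
        interaction = yields - grand - g_hat[:, None] - e_hat[None, :]

        print("estimated genotype effects:", np.round(g_hat, 2))
        print("estimated environment effects:", np.round(e_hat, 2))
        print("G x E interaction sum of squares:", round(float((interaction ** 2).sum()), 2))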