WorldWideScience

Sample records for model selection process

  1. Multicriteria framework for selecting a process modelling language

    Science.gov (United States)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate language for process modelling has become a difficult task because of the availability of a large number of modelling languages and the lack of guidelines on evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. The framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach for selecting the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but rather demonstrates how two existing approaches can be combined to solve the problem of selecting a modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.

  2. Numerical Model based Reliability Estimation of Selective Laser Melting Process

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2014-01-01

Selective laser melting is developing into a standard manufacturing technology with applications in various sectors. However, the process is still far from being on par with conventional processes such as welding and casting, primarily because of its unreliability. While...... of the selective laser melting process. A validated 3D finite-volume alternating-direction-implicit numerical technique is used to model the selective laser melting process, and is calibrated against results from single track formation experiments. Correlation coefficients are determined for process input...... parameters such as laser power, speed, beam profile, etc. Subsequently, uncertainties in the processing parameters are utilized to predict a range for the various outputs, using a Monte Carlo method based uncertainty analysis methodology, and the reliability of the process is established....
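The Monte Carlo uncertainty analysis described in this record can be sketched in a few lines. The surrogate model below is a hypothetical stand-in for the calibrated finite-volume model, and the input distributions (laser power, scan speed and their spreads) are illustrative assumptions, not values from the paper:

```python
import random

def melt_pool_depth(power_w, speed_mm_s):
    # Hypothetical surrogate for the calibrated numerical model:
    # depth assumed proportional to linear energy density (power / speed).
    return 0.12 * power_w / speed_mm_s

def monte_carlo(n=10_000, seed=0):
    rng = random.Random(seed)
    depths = []
    for _ in range(n):
        # Sample the process inputs from assumed uncertainty distributions.
        p = rng.gauss(200.0, 10.0)   # laser power, W
        v = rng.gauss(800.0, 40.0)   # scan speed, mm/s
        depths.append(melt_pool_depth(p, v))
    depths.sort()
    lo, hi = depths[int(0.025 * n)], depths[int(0.975 * n)]
    return lo, hi   # empirical 95% interval for the output

low, high = monte_carlo()
print(f"melt-pool depth 95% range: [{low:.4f}, {high:.4f}] mm")
```

The same propagation scheme applies to any output of the process model; reliability is then judged from how much of the predicted output range falls inside specification limits.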

  3. Process chain modeling and selection in an additive manufacturing context

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn; Stolfi, Alessandro; Mischkot, Michael

    2016-01-01

    This paper introduces a new two-dimensional approach to modeling manufacturing process chains. This approach is used to consider the role of additive manufacturing technologies in process chains for a part with micro scale features and no internal geometry. It is shown that additive manufacturing...... evolving fields like additive manufacturing....

  4. Unraveling the sub-processes of selective attention: insights from dynamic modeling and continuous behavior.

    Science.gov (United States)

    Frisch, Simon; Dshemuchadse, Maja; Görner, Max; Goschke, Thomas; Scherbaum, Stefan

    2015-11-01

    Selective attention biases information processing toward stimuli that are relevant for achieving our goals. However, the nature of this bias is under debate: Does it solely rely on the amplification of goal-relevant information or is there a need for additional inhibitory processes that selectively suppress currently distracting information? Here, we explored the processes underlying selective attention with a dynamic, modeling-based approach that focuses on the continuous evolution of behavior over time. We present two dynamic neural field models incorporating the diverging theoretical assumptions. Simulations with both models showed that they make similar predictions with regard to response times but differ markedly with regard to their continuous behavior. Human data observed via mouse tracking as a continuous measure of performance revealed evidence for the model solely based on amplification but no indication of persisting selective distracter inhibition.

  5. Probabilistic wind power forecasting with online model selection and warped gaussian process

    International Nuclear Information System (INIS)

    Kou, Peng; Liang, Deliang; Gao, Feng; Gao, Lin

    2014-01-01

Highlights: • A new online ensemble model for probabilistic wind power forecasting. • Quantifying the non-Gaussian uncertainties in wind power. • Online model selection that tracks the time-varying characteristic of wind generation. • Dynamically altering the input features. • Recursive update of base models. - Abstract: Based on the online model selection and the warped Gaussian process (WGP), this paper presents an ensemble model for probabilistic wind power forecasting. This model provides the non-Gaussian predictive distributions, which quantify the non-Gaussian uncertainties associated with wind power. In order to follow the time-varying characteristics of wind generation, multiple time dependent base forecasting models and an online model selection strategy are established, thus adaptively selecting the most probable base model for each prediction. WGP is employed as the base model, which handles the non-Gaussian uncertainties in wind power series. Furthermore, a regime switch strategy is designed to modify the input feature set dynamically, thereby enhancing the adaptiveness of the model. In an online learning framework, the base models should also be time adaptive. To achieve this, a recursive algorithm is introduced, thus permitting the online updating of WGP base models. The proposed model has been tested on the actual data collected from both single and aggregated wind farms
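The warping idea behind a WGP can be illustrated without fitting anything: a monotone warp maps bounded wind power to an unbounded latent space where a Gaussian predictive distribution is reasonable, and mapping Gaussian quantiles back through the inverse warp yields a skewed, non-Gaussian distribution in power space. The logit warp and the latent mean/variance below are illustrative assumptions, not the paper's learned warp:

```python
import math
from statistics import NormalDist

def warp(p, eps=1e-6):
    # Monotone warp: logit of power normalized to (0, 1).
    p = min(max(p, eps), 1 - eps)
    return math.log(p / (1 - p))

def inv_warp(z):
    return 1.0 / (1.0 + math.exp(-z))

# Suppose the GP predicts N(mu, sigma^2) in latent space at some horizon.
mu, sigma = warp(0.8), 0.6

# Map Gaussian latent quantiles back through the inverse warp: the
# resulting predictive distribution in power space is skewed (non-Gaussian).
q = {p: inv_warp(NormalDist(mu, sigma).inv_cdf(p)) for p in (0.05, 0.5, 0.95)}
print(q)
```

Note the asymmetry: near the upper bound of normalized power, the lower quantile sits much farther from the median than the upper one, which a symmetric Gaussian in power space could not capture.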

  6. The application of feature selection to the development of Gaussian process models for percutaneous absorption.

    Science.gov (United States)

    Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P

    2010-06-01

    The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset and to employ these methods, particularly feature selection, to determine the key physicochemical descriptors which exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance detection (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures to determine model quality were used. The inherently nonlinear nature of the skin data set was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (where the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity, compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information. 
However, it was also shown that significant synergy existed between certain parameters, and as such it
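The automatic relevance determination (ARD) mechanism used in this record's GPRARD models can be sketched with the kernel alone: each input descriptor gets its own length-scale, and a descriptor whose fitted length-scale grows very large contributes nothing to the covariance, effectively deselecting it. The descriptor values and length-scales below are illustrative, not from the skin-permeability dataset:

```python
import math

def ard_rbf(x, y, length_scales):
    # ARD squared-exponential kernel: one length-scale per input dimension.
    # A very large length-scale makes that dimension irrelevant.
    s = sum(((a - b) / l) ** 2 for a, b, l in zip(x, y, length_scales))
    return math.exp(-0.5 * s)

x1 = [1.0, 5.0]   # e.g. (log P, melting point) -- hypothetical descriptors
x2 = [1.2, 9.0]

# With dimension 2 deemed irrelevant (length-scale 1e6), the kernel
# depends on dimension 1 only.
k_relevant   = ard_rbf(x1, x2, [0.5, 1.0])
k_irrelevant = ard_rbf(x1, x2, [0.5, 1e6])
print(k_relevant, k_irrelevant)
```

In practice the length-scales are estimated by maximizing the marginal likelihood; ranking their inverses is what yields the relevance ordering of the physicochemical descriptors.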

  7. A structured approach for selecting carbon capture process models : A case study on monoethanolamine

    NARCIS (Netherlands)

    van der Spek, Mijndert; Ramirez, Andrea

    2014-01-01

    Carbon capture and storage is considered a promising option to mitigate CO2 emissions. This has resulted in many R&D efforts focusing at developing viable carbon capture technologies. During carbon capture technology development, process modeling plays an important role. Selecting an appropriate

  8. Consumer Decision Process in Restaurant Selection: An Application of the Stylized EKB Model

    Directory of Open Access Journals (Sweden)

    Eugenia Wickens

    2016-12-01

Purpose – The aim of this paper is to propose a framework based on empirical work for understanding the consumer decision processes involved in the selection of a restaurant for leisure meals. Design/Methodology/Approach – An interpretive approach is taken in order to understand the intricacies of the process and the various stages in the process. Six focus group interviews with consumers of various ages and occupations in the South East of the United Kingdom were conducted. Findings and implications – The stylized EKB model of the consumer decision process (Tuan-Pham & Higgins, 2005) was used as a framework for developing different stages of the process. Two distinct parts of the process were identified. Occasion was found to be critical to the stage of problem recognition. In terms of evaluation of alternatives and, in particular, sensitivity to evaluative content, the research indicates that the regulatory focus theory of Tuan-Pham and Higgins (2005) applies to the decision of selecting a restaurant. Limitations – It is acknowledged that this exploratory study is based on a small sample in a single geographical area. Originality – The paper is the first application of the stylized EKB model, which takes into account the motivational dimensions of consumer decision making, missing in other models. It concludes that it may have broader applications to other research contexts.

  9. Modeling and Experimental Validation of the Electron Beam Selective Melting Process

    Directory of Open Access Journals (Sweden)

    Wentao Yan

    2017-10-01

Electron beam selective melting (EBSM) is a promising additive manufacturing (AM) technology. The EBSM process consists of three major procedures: ① spreading a powder layer, ② preheating to slightly sinter the powder, and ③ selectively melting the powder bed. The highly transient multi-physics phenomena involved in these procedures pose a significant challenge for in situ experimental observation and measurement. To advance the understanding of the physical mechanisms in each procedure, we leverage high-fidelity modeling and post-process experiments. The models resemble the actual fabrication procedures, including ① a powder-spreading model using the discrete element method (DEM), ② a phase field (PF) model of powder sintering (solid-state sintering), and ③ a powder-melting (liquid-state sintering) model using the finite volume method (FVM). Comprehensive insights into all the major procedures are provided, which have rarely been reported. Preliminary simulation results (including powder particle packing within the powder bed, sintering neck formation between particles, and single-track defects) agree qualitatively with experiments, demonstrating the ability to understand the mechanisms and to guide the design and optimization of the experimental setup and manufacturing process.

  10. Model of the best-of-N nest-site selection process in honeybees

    Science.gov (United States)

    Reina, Andreagiovanni; Marshall, James A. R.; Trianni, Vito; Bose, Thomas

    2017-05-01

The ability of a honeybee swarm to select the best nest site plays a fundamental role in determining the future colony's fitness. To date, the nest-site selection process has mostly been modeled and theoretically analyzed for the case of binary decisions. However, when the number of alternative nests is larger than two, the decision-process dynamics qualitatively change. In this work, we extend previous analyses of a value-sensitive decision-making mechanism to a decision process among N nests. First, we present the decision-making dynamics in the symmetric case of N equal-quality nests. Then, we generalize our findings to a best-of-N decision scenario with one superior nest and N-1 inferior nests, previously studied empirically in bees and ants. Whereas previous binary models highlighted the crucial role of inhibitory stop-signaling, the key parameter in our new analysis is the relative time invested by swarm members in individual discovery and in signaling behaviors. Our new analysis reveals conflicting pressures on this ratio in symmetric and best-of-N decisions, which could be solved through a time-dependent signaling strategy. Additionally, our analysis suggests how ecological factors determining the density of suitable nest sites may have led to selective pressures for an optimal stable signaling ratio.
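Population models of this kind can be sketched with a few ordinary differential equations integrated by Euler steps. The equations and parameter values below are a simplified illustration of value-sensitive best-of-N dynamics (quality-weighted discovery and recruitment, quality-dependent abandonment, cross-inhibition between options), not the paper's exact model:

```python
def simulate_best_of_n(q, gamma=1.0, rho=1.0, alpha=0.2, sigma=1.0,
                       dt=0.01, steps=20_000):
    # x[i]: fraction of the swarm committed to nest i; u: uncommitted fraction.
    # Discovery and recruitment scale with nest quality q[i]; abandonment
    # scales with 1/q[i]; cross-inhibition acts between different nests.
    n = len(q)
    x = [0.0] * n
    for _ in range(steps):
        u = 1.0 - sum(x)
        dx = [u * (gamma * q[i] + rho * q[i] * x[i])
              - x[i] * (alpha / q[i] + sigma * (sum(x) - x[i]))
              for i in range(n)]
        x = [xi + dt * dxi for xi, dxi in zip(x, dx)]
    return x

# One superior nest and two equal inferior ones: the best nest ends up
# with the largest committed fraction.
x = simulate_best_of_n([1.0, 0.5, 0.5])
print([round(v, 3) for v in x])
```

Varying the discovery rate gamma against the signaling rates rho and sigma is the kind of experiment the record describes: the ratio of time spent discovering versus signaling shifts the equilibrium between the symmetric and best-of-N regimes.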

  11. Decision Support Model for Selection Technologies in Processing of Palm Oil Industrial Liquid Waste

    Science.gov (United States)

    Ishak, Aulia; Ali, Amir Yazid bin

    2017-12-01

The palm oil industry continues to grow from year to year, processing palm fruit into crude palm oil (CPO) and palm kernel oil (PKO). Together these two products amount to only 30% of the raw material, which means that 70% becomes palm oil waste. The amount of palm oil waste will increase in line with the development of the palm oil industry, and if it is not handled properly and effectively it will contribute significantly to environmental damage; industrial activities, from raw materials through to finished products, will disrupt the lives of people around the factory. Many alternative technologies are available for processing this waste, but a recurring problem is the difficulty of implementing the most appropriate one. The purpose of this research is to develop a database of waste processing technologies, to identify qualitative and quantitative criteria for selecting a technology, and to develop a Decision Support System (DSS) that can help make the decision. To achieve these objectives, questionnaires were developed to identify waste processing technologies and to populate the underlying technology database. Data analysis is performed using the Analytic Hierarchy Process (AHP), and the model is built using MySQL software, so that the system can be used as a tool in the evaluation and selection of palm oil mill processing technologies.
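The AHP step at the core of such a DSS reduces to computing the principal eigenvector of a pairwise comparison matrix. A minimal sketch follows; the pairwise judgements for three hypothetical waste-processing technologies are illustrative, not data from the study:

```python
def ahp_weights(m, iters=100):
    # Principal eigenvector of a positive pairwise comparison matrix,
    # computed by power iteration and normalized to sum to 1.
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical judgements on Saaty's 1-9 scale: technology A is moderately
# preferred over B (3) and strongly preferred over C (5).
m = [[1.0,   3.0,   5.0],
     [1 / 3, 1.0,   3.0],
     [1 / 5, 1 / 3, 1.0]]
w = ahp_weights(m)
print([round(x, 3) for x in w])
```

A full AHP model repeats this per criterion and then aggregates the alternative weights by the criterion weights; a consistency-ratio check on each matrix guards against contradictory judgements.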

  12. Variable Selection for Nonparametric Gaussian Process Priors: Models and Computational Strategies.

    Science.gov (United States)

    Savitsky, Terrance; Vannucci, Marina; Sha, Naijun

    2011-02-01

    This paper presents a unified treatment of Gaussian process models that extends to data from the exponential dispersion family and to survival data. Our specific interest is in the analysis of data sets with predictors that have an a priori unknown form of possibly nonlinear associations to the response. The modeling approach we describe incorporates Gaussian processes in a generalized linear model framework to obtain a class of nonparametric regression models where the covariance matrix depends on the predictors. We consider, in particular, continuous, categorical and count responses. We also look into models that account for survival outcomes. We explore alternative covariance formulations for the Gaussian process prior and demonstrate the flexibility of the construction. Next, we focus on the important problem of selecting variables from the set of possible predictors and describe a general framework that employs mixture priors. We compare alternative MCMC strategies for posterior inference and achieve a computationally efficient and practical approach. We demonstrate performances on simulated and benchmark data sets.

  13. Development of Physics-Based Numerical Models for Uncertainty Quantification of Selective Laser Melting Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the proposed research is to characterize the influence of process parameter variability inherent to Selective Laser Melting (SLM) and performance effect...

  14. Improving staff selection processes.

    Science.gov (United States)

    Cerinus, Marie; Shannon, Marina

    2014-11-11

    This article, the second in a series of articles on Leading Better Care, describes the actions undertaken in recent years in NHS Lanarkshire to improve selection processes for nursing, midwifery and allied health professional (NMAHP) posts. This is an area of significant interest to these professions, management colleagues and patients given the pivotal importance of NMAHPs to patient care and experience. In recent times the importance of selecting staff not only with the right qualifications but also with the right attributes has been highlighted to ensure patients are well cared for in a safe, effective and compassionate manner. The article focuses on NMAHP selection processes, tracking local, collaborative development work undertaken to date. It presents an overview of some of the work being implemented, highlights a range of important factors, outlines how evaluation is progressing and concludes by recommending further empirical research.

  15. HOW DO STUDENTS SELECT SOCIAL NETWORKING SITES? AN ANALYTIC HIERARCHY PROCESS (AHP) MODEL

    Directory of Open Access Journals (Sweden)

    Chun Meng Tang

    2015-12-01

Social networking sites are popular among university students, and students today are indeed spoiled for choice. New emerging social networking sites sprout up amid popular sites, while some existing ones die out. Given the choice of so many social networking sites, how do students decide which one they will sign up for and stay on as an active user? The answer to this question is of interest to social networking site designers and marketers. The market of social networking sites is highly competitive. To maintain the current user base and continue to attract new users, how should social networking sites design their sites? Marketers spend a fairly large percentage of their marketing budget on social media marketing. To formulate an effective social media strategy, how much do marketers understand the users of social networking sites? Learning from website evaluation studies, this study intends to provide some answers to these questions by examining how university students decide between two popular social networking sites, Facebook and Twitter. We first developed an analytic hierarchy process (AHP) model of four main selection criteria and 12 sub-criteria, and then administered a questionnaire to a group of university students attending a course at a Malaysian university. AHP analyses of the responses from 12 respondents provided an insight into the decision-making process involved in students’ selection of social networking sites. It seemed that of the four main criteria, privacy was the top concern, followed by functionality, usability, and content. The sub-criteria that were of key concern to the students were apps, revenue-generating opportunities, ease of use, and information security. Between Facebook and Twitter, the students thought that Facebook was the better choice. This information is useful for social networking site designers to design sites that are more relevant to their users’ needs, and for marketers to craft more effective

  16. Decision making model design for antivirus software selection using Factor Analysis and Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Nurhayati Ai

    2018-01-01

Virus spread increased significantly through the internet in 2017. One method of protection is antivirus software. The wide variety of antivirus software on the market tends to create confusion among consumers, and selecting the right antivirus for their needs has become difficult. This is the reason we conducted our research. We formulate a decision-making model for antivirus software consumers. The model is constructed using factor analysis and the AHP method. First we distributed questionnaires to consumers; from those questionnaires we identified 16 variables that need to be considered when selecting antivirus software. These 16 variables were then grouped into five factors using the factor analysis method in SPSS software. These five factors are security, performance, internal, time and capacity. To rank these factors we distributed questionnaires to six IT experts, and the data were analyzed using the AHP method. The result is that the performance factor gained the highest rank among all the factors. Thus, consumers can select antivirus software by judging the variables in the performance factor. Those variables are software loading speed, user friendliness, no excessive memory use, thorough scanning, and scanning viruses quickly and accurately.

  17. Gaussian Process Model for Antarctic Surface Mass Balance and Ice Core Site Selection

    Science.gov (United States)

    White, P. A.; Reese, S.; Christensen, W. F.; Rupper, S.

    2017-12-01

Surface mass balance (SMB) is an important factor in the estimation of sea level change, and data are collected to estimate models for prediction of SMB on the Antarctic ice sheet. Using Favier et al.'s (2013) quality-controlled aggregate data set of SMB field measurements, a fully Bayesian spatial model is posed to estimate Antarctic SMB and propose new field measurement locations. Utilizing Nearest-Neighbor Gaussian process (NNGP) models, SMB is estimated over the Antarctic ice sheet. An Antarctic SMB map is rendered using this model and is compared with previous estimates. A prediction uncertainty map is created to identify regions of high SMB uncertainty. The model estimates net SMB to be 2173 Gton yr-1 with 95% credible interval (2021,2331) Gton yr-1. On average, these results suggest lower Antarctic SMB and higher uncertainty than previously purported [Vaughan et al. (1999); Van de Berg et al. (2006); Arthern, Winebrenner and Vaughan (2006); Bromwich et al. (2004); Lenaerts et al. (2012)], even though this model utilizes significantly more observations than previous models. Using the Gaussian process' uncertainty and model parameters, we propose 15 new measurement locations for field study utilizing a maximin space-filling, error-minimizing design; these potential measurements are identified to minimize future estimation uncertainty. Using currently accepted Antarctic mass balance estimates and our SMB estimate, we estimate net mass loss [Shepherd et al. (2012); Jacob et al. (2012)]. Furthermore, we discuss modeling details for both space-time data and combining field measurement data with output from mathematical models using the NNGP framework.
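The maximin space-filling design mentioned in this record can be approximated by a simple greedy rule: repeatedly add the candidate site farthest from everything already chosen. This sketch ignores the error-minimizing weighting the paper adds on top, and the candidate grid is purely illustrative:

```python
import math
import random

def greedy_maximin(candidates, k, seed=0):
    # Greedily pick k sites maximizing the minimum distance to the sites
    # chosen so far -- a classic 2-approximation to the maximin design.
    rng = random.Random(seed)
    chosen = [rng.choice(candidates)]
    while len(chosen) < k:
        best = max(
            (c for c in candidates if c not in chosen),
            key=lambda c: min(math.dist(c, s) for s in chosen),
        )
        chosen.append(best)
    return chosen

# Toy candidate grid standing in for feasible Antarctic measurement sites.
grid = [(x, y) for x in range(10) for y in range(10)]
sites = greedy_maximin(grid, 5)
print(sites)
```

In the error-minimizing variant, the distance criterion is replaced (or weighted) by the NNGP predictive variance at each candidate, so new sites land in the regions of highest SMB uncertainty.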

  18. Process-based models of feeding and prey selection in larval fish

    DEFF Research Database (Denmark)

    Fiksen, O.; MacKenzie, Brian

    2002-01-01

believed to be important to prey selectivity and environmental regulation of feeding in fish. We include the sensitivity of prey to the hydrodynamic signal generated by approaching larval fish and a simple model of the potential loss of prey due to turbulence whereby prey is lost if it leaves...... µg dry wt l(-1). The spatio-temporal fluctuation of turbulence (tidal cycle) and light (sun height) over the bank generates complex structure in the patterns of food intake of larval fish, with different patterns emerging for small and large larvae....

  19. Modeling intermediate product selection under production and storage capacity limitations in food processing

    DEFF Research Database (Denmark)

    Kilic, Onur Alper; Akkerman, Renzo; Grunow, Martin

    2009-01-01

    In the food industry products are usually characterized by their recipes, which are specified by various quality attributes. For end products, this is given by customer requirements, but for intermediate products, the recipes can be chosen in such a way that raw material procurement costs...... with production and inventory planning, thereby considering the production and storage capacity limitations. The resulting model can be used to solve an important practical problem typical for many food processing industries....

  20. On the selection of significant variables in a model for the deteriorating process of facades

    Science.gov (United States)

    Serrat, C.; Gibert, V.; Casas, J. R.; Rapinski, J.

    2017-10-01

In previous works the authors of this paper have introduced a predictive system that uses survival analysis techniques for the study of time-to-failure in the facades of a building stock. The approach is population based, in order to obtain information on the evolution of the stock across time, and to help the manager in the decision making process on global maintenance strategies. For the decision making it is crucial to determine those covariates -like materials, morphology and characteristics of the facade, orientation or environmental conditions- that play a significant role in the progression of different failures. The proposed platform also incorporates an open source GIS plugin that includes survival and test moduli that allow the investigator to model the time until a lesion appears, taking into account the variables collected during the inspection process. The aim of this paper is twofold: a) to briefly introduce the predictive system, as well as the inspection and the analysis methodologies, and b) to introduce and illustrate the modeling strategy for the deteriorating process of an urban front. The illustration will be focused on the city of L’Hospitalet de Llobregat (Barcelona, Spain) in which more than 14,000 facades have been inspected and analyzed.

  1. On selecting a prior for the precision parameter of Dirichlet process mixture models

    Science.gov (United States)

    Dorazio, R.M.

    2009-01-01

In hierarchical mixture models the Dirichlet process is used to specify latent patterns of heterogeneity, particularly when the distribution of latent parameters is thought to be clustered (multimodal). The parameters of a Dirichlet process include a precision parameter α and a base probability measure G0. In problems where α is unknown and must be estimated, inferences about the level of clustering can be sensitive to the choice of prior assumed for α. In this paper an approach is developed for computing a prior for the precision parameter α that can be used in the presence or absence of prior information about the level of clustering. This approach is illustrated in an analysis of counts of stream fishes. The results of this fully Bayesian analysis are compared with an empirical Bayes analysis of the same data and with a Bayesian analysis based on an alternative commonly used prior.
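The sensitivity to the prior on α comes from a standard identity: for n observations, the expected number of clusters under a Dirichlet process is E[K] = Σ_{i=0}^{n-1} α/(α+i), roughly α·log(n) for large n. A short computation makes the dependence concrete:

```python
def expected_clusters(alpha, n):
    # E[K] for a Dirichlet process with precision alpha and n observations:
    # E[K] = sum_{i=0}^{n-1} alpha / (alpha + i)   (~ alpha * log n for large n)
    return sum(alpha / (alpha + i) for i in range(n))

for alpha in (0.1, 1.0, 10.0):
    print(alpha, round(expected_clusters(alpha, 100), 2))
```

Because a two-order-of-magnitude change in α shifts the expected cluster count from near 1 to over 20 for the same data size, a prior that implicitly pins α to one of these regimes drives the inferred level of clustering, which is exactly the sensitivity the record's proposed prior is designed to control.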

  2. Preparatory selection of sterilization regime for canned Natural Atlantic Mackerel with oil based on developed mathematical models of the process

    Directory of Open Access Journals (Sweden)

    Maslov A. A.

    2016-12-01

    Full Text Available Definition of preparatory parameters for sterilization regime of canned "Natural Atlantic Mackerel with Oil" is the aim of current study. PRSC software developed at the department of automation and computer engineering is used for preparatory selection. To determine the parameters of process model, in laboratory autoclave AVK-30M the pre-trial process of sterilization and cooling in water with backpressure of canned "Natural Atlantic Mackerel with Oil" in can N 3 has been performed. Gathering information about the temperature in the autoclave sterilization chamber and the can with product has been carried out using Ellab TrackSense PRO loggers. Due to the obtained information three transfer functions for the product model have been identified: in the least heated area of autoclave, the average heated and the most heated. In PRSC programme temporary temperature dependences in the sterilization chamber have been built using this information. The model of sterilization process of canned "Natural Atlantic Mackerel with Oil" has been received after the pre-trial process. Then in the automatic mode the sterilization regime of canned "Natural Atlantic Mackerel with Oil" has been selected using the value of actual effect close to normative sterilizing effect (5.9 conditional minutes. Furthermore, in this study step-mode sterilization of canned "Natural Atlantic Mackerel with Oil" has been selected. Utilization of step-mode sterilization with the maximum temperature equal to 125 °C in the sterilization chamber allows reduce process duration by 10 %. However, the application of this regime in practice requires additional research. Using the described approach based on the developed mathematical models of the process allows receive optimal step and variable canned food sterilization regimes with high energy efficiency and product quality.

  3. Selection of Prediction Methods for Thermophysical Properties for Process Modeling and Product Design of Biodiesel Manufacturing

    DEFF Research Database (Denmark)

    Su, Yung-Chieh; Liu, Y. A.; Díaz Tovar, Carlos Axel

    2011-01-01

    To optimize biodiesel manufacturing, many reported studies have built simulation models to quantify the relationship between operating conditions and process performance. For mass and energy balance simulations, it is essential to know the four fundamental thermophysical properties of the feed oil......: liquid density (ρL), vapor pressure (Pvap), liquid heat capacity (CPL), and heat of vaporization (ΔHvap). Additionally, to characterize the fuel qualities, it is critical to develop quantitative correlations to predict three biodiesel properties, namely, viscosity, cetane number, and flash point. Also......, to ensure the operability of biodiesel in cold weather, one needs to quantitatively predict three low-temperature flow properties: cloud point (CP), pour point (PP), and cold filter plugging point (CFPP). This article presents the results from a comprehensive evaluation of the methods for predicting...

  4. Supplier Selection Process Using ELECTRE I Decision Model and an Application in the Retail Sector

    Directory of Open Access Journals (Sweden)

    Oğuzhan Yavuz

    2013-12-01

The supplier selection problem is one of the main topics for today’s businesses. Within supply chain management activities it is very important, particularly for businesses operating in the retail sector. Thus, in this study, the supplier selection problem was examined for the energy drink suppliers of a food business in the retail sector. Cost, delivery, quality and flexibility variables were used to select suppliers, and the ELECTRE I method, one of the multi-criteria decision methods, was used to rank suppliers according to these variables. Which suppliers are more important for the food company was determined by ranking suppliers according to their computed net superior and net inferior values. The results obtained were presented in tables and certain steps
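The core of ELECTRE I is a pair of indices computed for every ordered pair of alternatives: a concordance index (the total weight of criteria on which a is at least as good as b) and a discordance index (the worst amount by which b beats a). Pairs passing both thresholds form the outranking relation. The supplier scores, weights and thresholds below are hypothetical, not the study's data:

```python
def electre_i(scores, weights, c_threshold=0.7, d_threshold=0.3):
    # scores[a][j]: performance of supplier a on criterion j, normalized to
    # [0, 1] with higher = better. Returns outranking pairs (a, b).
    n = len(scores)
    out = []
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            # Concordance: weight of criteria where a does at least as well.
            conc = sum(w for j, w in enumerate(weights)
                       if scores[a][j] >= scores[b][j])
            # Discordance: largest margin by which b beats a on any criterion.
            disc = max(max(scores[b][j] - scores[a][j], 0.0)
                       for j in range(len(weights)))
            if conc >= c_threshold and disc <= d_threshold:
                out.append((a, b))
    return out

# Hypothetical suppliers rated on cost, delivery, quality, flexibility.
weights = [0.4, 0.2, 0.3, 0.1]        # criterion weights, summing to 1
scores = [[0.9, 0.8, 0.7, 0.6],       # supplier 0
          [0.5, 0.9, 0.6, 0.7],       # supplier 1
          [0.4, 0.3, 0.5, 0.5]]       # supplier 2
print(electre_i(scores, weights))     # → [(0, 1), (0, 2), (1, 2)]
```

The net superior and net inferior values mentioned in the record are then obtained by summing, for each supplier, the concordance it exerts over others minus the concordance exerted over it (and analogously for discordance), which yields the final ranking.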

  5. The split between availability and selection. Business models for scientific information, and the scientific process?

    NARCIS (Netherlands)

    Zalewska-Kurek, Katarzyna; Geurts, Petrus A.T.M.; Roosendaal, Hans E.

    2006-01-01

    The Berlin declaration on Open Access to Knowledge in the Sciences and Humanities has resulted in a strong impetus in the discussion on business models, and in particular the model of open access. A business model is defined as just the organisation of property. Consequently, business models for

  6. ARM Mentor Selection Process

    Energy Technology Data Exchange (ETDEWEB)

    Sisterson, D. L. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-10-01

    The Atmospheric Radiation Measurement (ARM) Program was created in 1989 with funding from the U.S. Department of Energy (DOE) to develop several highly instrumented ground stations to study cloud formation processes and their influence on radiative transfer. In 2003, the ARM Program became a national scientific user facility, known as the ARM Climate Research Facility. This scientific infrastructure provides for fixed sites, mobile facilities, an aerial facility, and a data archive available for use by scientists worldwide. The ARM Climate Research Facility currently operates more than 300 instrument systems that provide ground-based observations of the atmospheric column. To keep ARM at the forefront of climate observations, the ARM infrastructure depends heavily on instrument scientists and engineers, also known as lead mentors. Lead mentors must have an excellent understanding of in situ and remote-sensing instrumentation theory and operation and have comprehensive knowledge of critical scale-dependent atmospheric processes. They must also possess the technical and analytical skills to develop new data retrievals that provide innovative approaches for creating research-quality data sets. The ARM Climate Research Facility is seeking the best overall qualified candidate who can fulfill lead mentor requirements in a timely manner.

  7. Learning and Selection Processes

    Directory of Open Access Journals (Sweden)

    Marc Artiga

    2010-06-01

    Full Text Available In this paper I defend a teleological explanation of normativity, i.e., I argue that what an organism (or device) is supposed to do is determined by its etiological function. In particular, I present a teleological account of the normativity that arises in learning processes, and I defend it from some objections.

  8. Social Influence Interpretation of Interpersonal Processes and Team Performance Over Time Using Bayesian Model Selection

    NARCIS (Netherlands)

    Johnson, Alan R.; van de Schoot, Rens; Delmar, Frédéric; Crano, William D.

    The team behavior literature is ambiguous about the relations between members’ interpersonal processes—task debate and task conflict—and team performance. From a social influence perspective, we show why members’ interpersonal processes determine team performance over time in small groups. Together,

  9. Modeling the Supply Process Using the Application of Selected Methods of Operational Analysis

    Science.gov (United States)

    Chovancová, Mária; Klapita, Vladimír

    2017-03-01

    The supply process is one of the most important enterprise activities. All raw materials, intermediate products and products moved within an enterprise are subject to inventory management, and their effective management can significantly improve the enterprise's position on the market. For that reason, inventory needs to be managed, monitored, evaluated and influenced. The paper deals with utilizing methods of operational analysis in the field of inventory management, in terms of achieving economic efficiency while ensuring a particular customer service level.
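    The abstract does not name the specific operational-analysis methods applied, but the economic order quantity (EOQ) model is the standard example of operational analysis in inventory management: it balances ordering cost against holding cost. A minimal sketch with illustrative figures (not taken from the paper):

```python
from math import sqrt

# Illustrative figures: annual demand D (units), ordering cost S
# per order, holding cost H per unit per year.
D, S, H = 12000, 50.0, 2.4

# The EOQ minimizes total annual ordering + holding cost.
eoq = sqrt(2 * D * S / H)                     # units per order
orders_per_year = D / eoq
total_cost = (D / eoq) * S + (eoq / 2) * H    # ordering + holding
```

At the optimum the annual ordering cost equals the annual holding cost, which is a quick sanity check on any EOQ computation.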

  10. Sexual selection: Another Darwinian process.

    Science.gov (United States)

    Gayon, Jean

    2010-02-01

    the Darwin-Wallace controversy was that most Darwinian biologists avoided the subject of sexual selection until at least the 1950s, Ronald Fisher being a major exception. This controversy still deserves attention from modern evolutionary biologists, because the modern approach inherits from both Darwin and Wallace. The modern approach tends to present sexual selection as a special aspect of the theory of natural selection, although it also recognizes the major difficulties resulting from the inevitable interaction between these two natural processes of selection. And contra Wallace, it considers mate choice as a major process that deserves a proper evolutionary treatment. The paper's conclusion explains why sexual selection can be taken as a test case for a proper assessment of "Darwinism" as a scientific tradition. Darwin's and Wallace's attitudes towards sexual selection reveal two different interpretations of the principle of natural selection: Wallace had an environmentalist conception of natural selection, whereas Darwin was primarily sensitive to the element of competition involved in the intimate mechanism of any natural process of selection. Sexual selection, which can lack adaptive significance, reveals this exemplarily. 2010 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  11. Vibration and acoustic frequency spectra for industrial process modeling using selective fusion multi-condition samples and multi-source features

    Science.gov (United States)

    Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen

    2018-01-01

    Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by fusing valued information selectively from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct mechanical vibration and acoustic frequency spectra of a data-driven industrial process parameter model based on selective fusion multi-condition samples and multi-source features. Multi-layer SEN (MLSEN) strategy is used to simulate the domain expert cognitive process. Genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model based on each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing the selective information fusion process based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.

  12. Modeling Natural Selection

    Science.gov (United States)

    Bogiages, Christopher A.; Lotter, Christine

    2011-01-01

    In their research, scientists generate, test, and modify scientific models. These models can be shared with others and demonstrate a scientist's understanding of how the natural world works. Similarly, students can generate and modify models to gain a better understanding of the content, process, and nature of science (Kenyon, Schwarz, and Hug…

  13. Fusion strategies for selecting multiple tuning parameters for multivariate calibration and other penalty based processes: A model updating application for pharmaceutical analysis

    International Nuclear Information System (INIS)

    Tencate, Alister J.; Kalivas, John H.; White, Alexander J.

    2016-01-01

    New multivariate calibration methods and other processes are being developed that require selection of multiple tuning parameter (penalty) values to form the final model. With one or more tuning parameters, using only one measure of model quality to select final tuning parameter values is not sufficient. Optimization of several model quality measures is challenging. Thus, three fusion ranking methods are investigated for simultaneous assessment of multiple measures of model quality for selecting tuning parameter values. One is a supervised learning fusion rule named sum of ranking differences (SRD). The other two are non-supervised learning processes based on the sum and median operations. The effect of the number of models evaluated on the three fusion rules is also assessed using three procedures. One procedure uses all models from all possible combinations of the tuning parameters. To reduce the number of models evaluated, an iterative process (only applicable to SRD) is applied, and thresholding a model quality measure before applying the fusion rules is also used. A near infrared pharmaceutical data set requiring model updating is used to evaluate the three fusion rules. In this case, calibration of the primary conditions is for the active pharmaceutical ingredient (API) of tablets produced in a laboratory. The secondary conditions for calibration updating are for tablets produced in the full batch setting. Two model updating processes requiring selection of two unique tuning parameter values are studied. One is based on Tikhonov regularization (TR) and the other is a variation of partial least squares (PLS). The three fusion methods are shown to provide equivalent and acceptable results allowing automatic selection of the tuning parameter values. Best tuning parameter values are selected when model quality measures used with the fusion rules are for the small secondary sample set used to form the updated models. In this model updating situation, evaluation of
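    The two non-supervised fusion rules described above (sum and median of per-measure ranks) can be sketched in a few lines. The grid of candidate models and the three quality measures below are invented for illustration; the supervised SRD rule, which additionally requires a reference ranking, is omitted here.

```python
import numpy as np

# Hypothetical grid of candidate models (rows: tuning parameter
# combinations) scored on three quality measures (columns), all
# "smaller is better", e.g. RMSEC, RMSEP, and a complexity norm.
quality = np.array([
    [0.12, 0.30, 1.8],
    [0.15, 0.22, 1.2],
    [0.10, 0.35, 2.5],
    [0.18, 0.20, 1.0],
    [0.14, 0.25, 1.5],
])

# Rank the models separately under each measure (rank 0 = best).
ranks = quality.argsort(axis=0).argsort(axis=0)

# Non-supervised fusion: sum and median of the per-measure ranks;
# the model with the smallest fused rank is selected.
sum_fused = ranks.sum(axis=1)
median_fused = np.median(ranks, axis=1)
best_by_sum = int(sum_fused.argmin())
best_by_median = int(median_fused.argmin())
```

With these invented numbers both rules agree on the same model, but in general the two fusions can disagree, which is why the paper compares them.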

  14. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    Purpose – The paper aims: 1) to develop systematically a structural list of various business model process configurations and to group (deductively) these selected configurations in a structured typological categorization list; 2) to facilitate companies in the process of BM innovation, by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in business model studies (e.g. definitions, configurations, classifications), we adopted the analytical induction... Practical implications – This paper aimed at strengthening researchers' and, particularly, practitioners' perspectives on the field of business model process configurations, by ensuring an [abstracted] alignment between... strategic preference, as part of their business model innovation activity planned.

  15. It Takes Three: Selection, Influence, and De-Selection Processes of Depression in Adolescent Friendship Networks

    Science.gov (United States)

    Van Zalk, Maarten Herman Walter; Kerr, Margaret; Branje, Susan J. T.; Stattin, Hakan; Meeus, Wim H. J.

    2010-01-01

    The authors of this study tested a selection-influence-de-selection model of depression. This model explains friendship influence processes (i.e., friends' depressive symptoms increase adolescents' depressive symptoms) while controlling for two processes: friendship selection (i.e., selection of friends with similar levels of depressive symptoms)…

  16. Selected Topics on Systems Modeling and Natural Language Processing: Editorial Introduction to the Issue 7 of CSIMQ

    Directory of Open Access Journals (Sweden)

    Witold Andrzejewski

    2016-07-01

    Full Text Available The seventh issue of Complex Systems Informatics and Modeling Quarterly presents five papers devoted to two distinct research topics: systems modeling and natural language processing (NLP. Both of these subjects are very important in computer science. Through modeling we can simplify the studied problem by concentrating on only one aspect at a time. Moreover, a properly constructed model allows the modeler to work on higher levels of abstraction without having to concentrate on details. Since the size and complexity of information systems grow rapidly, creating good models of such systems is crucial. The analysis of natural language is slowly becoming a widely used tool in commerce and day-to-day life. Opinion mining allows recommender systems to provide accurate recommendations based on user-generated reviews. Speech recognition and NLP are the basis for such widely used personal assistants as Apple’s Siri, Microsoft’s Cortana, and Google Now. While a lot of work has already been done on natural language processing, the research usually concerns widely used languages, such as English. Consequently, natural language processing in languages other than English is a very relevant subject and is addressed in this issue.

  17. Physics-based simulation modeling and optimization of microstructural changes induced by machining and selective laser melting processes in titanium and nickel based alloys

    Science.gov (United States)

    Arisoy, Yigit Muzaffer

    Manufacturing processes may significantly affect the quality of resultant surfaces and structural integrity of the metal end products. Controlling manufacturing process induced changes to the product's surface integrity may improve the fatigue life and overall reliability of the end product. The goal of this study is to model the phenomena that result in microstructural alterations and improve the surface integrity of the manufactured parts by utilizing physics-based process simulations and other computational methods. Two different (both conventional and advanced) manufacturing processes, i.e., machining of Titanium and Nickel-based alloys and selective laser melting of Nickel-based powder alloys, are studied. 3D Finite Element (FE) process simulations are developed and experimental data that validates these process simulation models are generated to compare against predictions. Computational process modeling and optimization have been performed for machining induced microstructure that includes: i) predicting recrystallization and grain size using FE simulations and the Johnson-Mehl-Avrami-Kolmogorov (JMAK) model, ii) predicting microhardness using non-linear regression models and the Random Forests method, and iii) multi-objective machining optimization for minimizing microstructural changes. Experimental analysis and computational process modeling of selective laser melting have also been conducted, including: i) microstructural analysis of grain sizes and growth directions using SEM imaging and machine learning algorithms, ii) analysis of thermal imaging for spattering, heating/cooling rates and meltpool size, iii) predicting thermal field, meltpool size, and growth directions via thermal gradients using 3D FE simulations, iv) predicting localized solidification using the Phase Field method. These computational process models and predictive models, once utilized by industry to optimize process parameters, have the ultimate potential to improve performance of
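    The JMAK model referenced above for recrystallization prediction expresses the recrystallized volume fraction as X(t) = 1 − exp(−k tⁿ), where k and n are material- and temperature-dependent constants. A minimal sketch with illustrative (not fitted) parameters:

```python
import numpy as np

# JMAK (Johnson-Mehl-Avrami-Kolmogorov) recrystallized volume
# fraction as a function of time; k and n here are illustrative
# placeholder values, not constants fitted in the study.
def jmak_fraction(t, k=0.05, n=2.0):
    return 1.0 - np.exp(-k * t**n)

t = np.linspace(0.0, 20.0, 201)
X = jmak_fraction(t)   # sigmoidal curve from 0 toward 1
```

The Avrami exponent n encodes the nucleation/growth mechanism, which is why fitting it to FE-predicted thermal histories is central to the microstructure predictions described.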

  18. A DECISION MAKING MODEL FOR SELECTION OF WIND ENERGY PRODUCTION FARMS BASED ON FUZZY ANALYTIC HIERARCHY PROCESS

    OpenAIRE

    SAGBAS, Aysun; MAZMANOGLU, Adnan; ALP, Reyhan

    2013-01-01

    The purpose of this paper is to present an evaluation model for the prioritization of wind energy production sites, namely, Mersin, Silifke and Anamur, located in the Mediterranean Sea region of Turkey. For this purpose, a fuzzy analytic hierarchy decision-making approach based on a multi-criteria decision-making framework including economic, technical, and environmental criteria was performed. It is found that in the results obtained from the fuzzy analytic hierarchy process (FAHP) approach, Anamur d...
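    Fuzzy AHP derives criteria weights from triangular fuzzy pairwise comparisons. The sketch below uses Buckley's geometric-mean variant with invented comparison values for the paper's three criteria (economic, technical, environmental); the authors' exact FAHP procedure (e.g. Chang's extent analysis) may differ.

```python
import numpy as np

# Triangular fuzzy pairwise comparisons (l, m, u) among three
# criteria; the numbers are illustrative, not from the paper.
M = np.array([
    [[1, 1, 1],       [1, 2, 3],       [2, 3, 4]],
    [[1/3, 1/2, 1],   [1, 1, 1],       [1, 2, 3]],
    [[1/4, 1/3, 1/2], [1/3, 1/2, 1],   [1, 1, 1]],
])

# Buckley's method: fuzzy weight of criterion i is the geometric
# mean of its row, normalized by the totals (cross-dividing lower
# by upper-total and upper by lower-total keeps the fuzzy bounds).
g = np.prod(M, axis=1) ** (1.0 / M.shape[0])   # fuzzy geometric means
total = g.sum(axis=0)                           # (l, m, u) totals
w_fuzzy = np.empty_like(g)
w_fuzzy[:, 0] = g[:, 0] / total[2]
w_fuzzy[:, 1] = g[:, 1] / total[1]
w_fuzzy[:, 2] = g[:, 2] / total[0]

# Centroid defuzzification, then renormalize to crisp weights.
weights = w_fuzzy.mean(axis=1)
weights /= weights.sum()
```

The resulting crisp weights then feed the site-scoring step of the AHP hierarchy.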

  19. 15 CFR 2301.18 - Selection process.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Selection process. 2301.18 Section... PROGRAM Evaluation and Selection Process § 2301.18 Selection process. (a) The PTFP Director will consider the summary evaluations prepared by program staff, rank the applications, and present recommendations...

  20. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret...

  1. Innovation During the Supplier Selection Process

    DEFF Research Database (Denmark)

    Pilkington, Alan; Pedraza, Isabel

    2014-01-01

    Established ideas on supplier selection have not moved much from the original premise of how to choose between bidders. While we have added many different tools and refinements for choosing between alternative suppliers, the nature of selection has not evolved. The approach observed through an ethnographic embedded-researcher study has refined the selection process into two stages: one for first supply, covering tool/process development, and a later one for resupply of mature parts. We report the details of the process, those involved, the criteria employed, and identify benefits and weaknesses of this enhanced selection process.

  2. On Data Space Selection and Data Processing for Parameter Identification in a Reaction-Diffusion Model Based on FRAP Experiments

    Directory of Open Access Journals (Sweden)

    Stefan Kindermann

    2015-01-01

    Full Text Available Fluorescence recovery after photobleaching (FRAP) is a widely used measurement technique to determine the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, data (pre)processing represents an important issue. The aim of this paper is twofold. First, we formulate and solve the problem of relevant FRAP data selection. The theoretical findings are illustrated by the comparison of the results of parameter identification when the full data set was used and the case when the irrelevant data set (data with negligible impact on the confidence interval of the estimated parameters) was removed from the data space. Second, we analyze and compare two approaches of FRAP data processing. Our proposition, surprisingly for the FRAP community, claims that the data set represented by the FRAP recovery curves in form of a time series (the integrated data approach commonly used by the FRAP community) leads to a larger confidence interval compared to the full (spatiotemporal) data approach.

  3. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as proper understanding of functionality of information systems that shall support activity of the organization. A number of business process modeling notations were popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, we assess whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  5. Graphical tools for model selection in generalized linear models.

    Science.gov (United States)

    Murray, K; Heritier, S; Müller, S

    2013-11-10

    Model selection techniques have existed for many years; however, to date, simple, clear and effective methods of visualising the model building process are sparse. This article describes graphical methods that assist in the selection of models and comparison of many different selection criteria. Specifically, we describe for logistic regression, how to visualize measures of description loss and of model complexity to facilitate the model selection dilemma. We advocate the use of the bootstrap to assess the stability of selected models and to enhance our graphical tools. We demonstrate which variables are important using variable inclusion plots and show that these can be invaluable plots for the model building process. We show with two case studies how these proposed tools are useful to learn more about important variables in the data and how these tools can assist the understanding of the model building process. Copyright © 2013 John Wiley & Sons, Ltd.
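    A variable inclusion plot of the kind advocated here is built from the frequency with which each variable survives model selection across bootstrap resamples. The sketch below illustrates the idea with best-subset AIC selection on a synthetic linear model; for brevity it uses ordinary least squares rather than the paper's logistic regression, but the inclusion-frequency logic is the same.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Synthetic data: y depends on x0 and x1 only; x2, x3 are noise.
n, p = 200, 4
X = rng.normal(size=(n, p))
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(size=n)

def aic_ols(Xs, y):
    # AIC of an OLS fit with intercept (Gaussian likelihood).
    A = np.column_stack([np.ones(len(y)), Xs])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = ((y - A @ beta) ** 2).sum()
    k = A.shape[1] + 1          # coefficients + error variance
    return len(y) * np.log(rss / len(y)) + 2 * k

def best_subset(X, y):
    # Exhaustive best-subset search by AIC (fine for small p).
    best, best_aic = (), np.inf
    for r in range(1, X.shape[1] + 1):
        for subset in combinations(range(X.shape[1]), r):
            a = aic_ols(X[:, subset], y)
            if a < best_aic:
                best, best_aic = subset, a
    return best

# Inclusion frequency over bootstrap resamples: these are the raw
# numbers a variable inclusion plot displays.
B = 50
counts = np.zeros(p)
for _ in range(B):
    idx = rng.integers(0, n, size=n)
    for j in best_subset(X[idx], y[idx]):
        counts[j] += 1
inclusion_freq = counts / B
```

Variables with strong signal are selected in (nearly) every resample, while noise variables appear only sporadically, which is exactly the stability information the plots convey.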

  6. Models selection and fitting

    International Nuclear Information System (INIS)

    Martin Llorente, F.

    1990-01-01

    Models of atmospheric pollutant dispersion are based on mathematical algorithms that describe the transport, diffusion, elimination and chemical reactions of atmospheric contaminants. These models operate with contaminant emission data and estimate air quality in the area. They can be applied to several aspects of atmospheric contamination

  7. ARM Lead Mentor Selection Process

    Energy Technology Data Exchange (ETDEWEB)

    Sisterson, DL

    2013-03-13

    The ARM Climate Research Facility currently operates more than 300 instrument systems that provide ground-based observations of the atmospheric column. To keep ARM at the forefront of climate observations, the ARM infrastructure depends heavily on instrument scientists and engineers, also known as Instrument Mentors. Instrument Mentors must have an excellent understanding of in situ and remote-sensing instrumentation theory and operation and have comprehensive knowledge of critical scale-dependent atmospheric processes. They also possess the technical and analytical skills to develop new data retrievals that provide innovative approaches for creating research-quality data sets.

  8. Selected System Models

    Science.gov (United States)

    Schmidt-Eisenlohr, F.; Puñal, O.; Klagges, K.; Kirsche, M.

    Apart from the general issue of modeling the channel, the PHY and the MAC of wireless networks, there are specific modeling assumptions that are considered for different systems. In this chapter we consider three specific wireless standards and highlight modeling options for them. These are IEEE 802.11 (as example for wireless local area networks), IEEE 802.16 (as example for wireless metropolitan networks) and IEEE 802.15 (as example for body area networks). Each section on these three systems also discusses, at the end, a set of model implementations that are available today.

  9. 7 CFR 3570.68 - Selection process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Selection process. 3570.68 Section 3570.68 Agriculture Regulations of the Department of Agriculture (Continued) RURAL HOUSING SERVICE, DEPARTMENT OF AGRICULTURE COMMUNITY PROGRAMS Community Facilities Grant Program § 3570.68 Selection process. Each request...

  10. 44 CFR 150.7 - Selection process.

    Science.gov (United States)

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Selection process. 150.7 Section 150.7 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF... Selection process. (a) President's Award. Nominations for the President's Award shall be reviewed, and...

  11. 45 CFR 1634.8 - Selection process.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Selection process. 1634.8 Section 1634.8 Public... FOR GRANTS AND CONTRACTS § 1634.8 Selection process. (a) After receipt of all applications for a particular service area, Corporation staff shall: (1) Review each application and any additional information...

  12. Bubble point pressures of the selected model system for CatLiq® bio-oil process

    DEFF Research Database (Denmark)

    Toor, Saqib Sohail; Rosendahl, Lasse; Baig, Muhammad Noman

    2010-01-01

    were presented for temperatures between 40 °C and 75 °C. The results were correlated by the PSRK (Predictive Soave-Redlich-Kwong) model using the Huron-Vidal first-order mixing rule of Michelsen coupled with the modified UNIFAC model. The average absolute deviation between the experimental
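    The PSRK/UNIFAC treatment in the abstract is too involved to reproduce here, but the basic bubble-point calculation it refines can be sketched with Raoult's law: at a fixed temperature, the ideal bubble pressure is the mole-fraction-weighted sum of the pure-component vapor pressures. The water-ethanol system and Antoine constants below are standard textbook values used only for illustration; they are not the CatLiq model system itself.

```python
# Ideal (Raoult's law) bubble-point pressure for a binary
# water-ethanol mixture. Antoine constants give Psat in mmHg
# with T in deg C (commonly tabulated values).
antoine = {
    "water":   (8.07131, 1730.63, 233.426),
    "ethanol": (8.20417, 1642.89, 230.300),
}

def psat_mmHg(comp, T_C):
    A, B, C = antoine[comp]
    return 10 ** (A - B / (C + T_C))

def bubble_pressure_mmHg(x_water, T_C):
    # For an ideal solution the bubble pressure is the
    # mole-fraction-weighted sum of pure vapor pressures.
    x_eth = 1.0 - x_water
    return (x_water * psat_mmHg("water", T_C)
            + x_eth * psat_mmHg("ethanol", T_C))

P60 = bubble_pressure_mmHg(0.7, 60.0)   # mmHg at 60 deg C
```

Activity-coefficient models such as the modified UNIFAC used in the paper multiply each term by a composition-dependent coefficient, correcting the non-ideality that Raoult's law ignores.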

  13. Predictive modeling, simulation, and optimization of laser processing techniques: UV nanosecond-pulsed laser micromachining of polymers and selective laser melting of powder metals

    Science.gov (United States)

    Criales Escobar, Luis Ernesto

    One of the most rapidly evolving areas of research is the utilization of lasers for micro-manufacturing and additive manufacturing purposes. The use of the laser beam as a tool for manufacturing arises from the need for flexible and rapid manufacturing at a low-to-mid cost. Laser micro-machining provides an advantage over mechanical micro-machining due to faster production times for large batch sizes and the high tooling costs associated with mechanical methods. Laser based additive manufacturing enables processing of powder metals for direct and rapid fabrication of products. Therefore, laser processing can be viewed as a fast, flexible, and cost-effective approach compared to traditional manufacturing processes. Two types of laser processing techniques are studied: laser ablation of polymers for micro-channel fabrication and selective laser melting of metal powders. Initially, a feasibility study for laser-based micro-channel fabrication of poly(dimethylsiloxane) (PDMS) via experimentation is presented. In particular, the effectiveness of utilizing a nanosecond-pulsed laser as the energy source for laser ablation is studied. The results are analyzed statistically and a relationship between process parameters and micro-channel dimensions is established. Additionally, a process model is introduced for predicting channel depth. Model outputs are compared and analyzed to experimental results. The second part of this research focuses on a physics-based FEM approach for predicting the temperature profile and melt pool geometry in selective laser melting (SLM) of metal powders. Temperature profiles are calculated for a moving laser heat source to understand the temperature rise due to heating during SLM. Based on the predicted temperature distributions, melt pool geometry, i.e. the locations at which melting of the powder material occurs, is determined. Simulation results are compared against data obtained from experimental Inconel 625 test coupons fabricated at the National
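    A common starting point for the SLM temperature-field predictions described above is Rosenthal's analytical solution for a moving point heat source on a semi-infinite solid, against which FE results are often sanity-checked. The material and process values below are rough Inconel 625-like figures chosen only for illustration; the point-source idealization strongly overpredicts temperatures very close to the beam.

```python
import numpy as np

# Rosenthal moving point-source solution: T(x, y, z) relative to a
# source travelling at speed v along +x (x measured from the source).
k = 9.8                          # thermal conductivity, W/(m K)
rho, cp = 8440.0, 410.0          # density, specific heat
alpha = k / (rho * cp)           # thermal diffusivity, m^2/s
Q, v = 195.0 * 0.35, 0.8         # absorbed laser power (W), scan speed (m/s)
T0 = 353.0                       # preheat temperature, K

def rosenthal_T(x, y, z):
    R = np.sqrt(x**2 + y**2 + z**2)
    return T0 + Q / (2.0 * np.pi * k * R) * np.exp(-v * (R + x) / (2.0 * alpha))

# Temperature 30 um behind the source centerline (near-singular
# values close to the source are an artifact of the point source).
T_behind = rosenthal_T(-30e-6, 0.0, 0.0)
```

The exponential factor makes the field strongly asymmetric: temperatures decay slowly behind the source and very rapidly ahead of it, producing the characteristic comet-shaped melt pool.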

  14. A Heckman Selection- t Model

    KAUST Repository

    Marchenko, Yulia V.

    2012-03-01

    Sample selection arises often in practice as a result of the partial observability of the outcome of interest in a study. In the presence of sample selection, the observed data do not represent a random sample from the population, even after controlling for explanatory variables. That is, data are missing not at random. Thus, standard analysis using only complete cases will lead to biased results. Heckman introduced a sample selection model to analyze such data and proposed a full maximum likelihood estimation method under the assumption of normality. The method was criticized in the literature because of its sensitivity to the normality assumption. In practice, data, such as income or expenditure data, often violate the normality assumption because of heavier tails. We first establish a new link between sample selection models and recently studied families of extended skew-elliptical distributions. Then, this allows us to introduce a selection-t (SLt) model, which models the error distribution using a Student's t distribution. We study its properties and investigate the finite-sample performance of the maximum likelihood estimators for this model. We compare the performance of the SLt model to the conventional Heckman selection-normal (SLN) model and apply it to analyze ambulatory expenditures. Unlike the SLN model, our analysis using the SLt model provides statistical evidence for the existence of sample selection bias in these data. We also investigate the performance of the test for sample selection bias based on the SLt model and compare it with the performances of several tests used with the SLN model. Our findings indicate that the latter tests can be misleading in the presence of heavy-tailed data. © 2012 American Statistical Association.
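    The selection bias that the Heckman model corrects can be demonstrated in a short simulation. The sketch below applies the classic two-step correction with a known selection index and normal errors, i.e. the SLN setting; in practice the index comes from a first-stage probit fit, and the selection-t extension discussed in the abstract replaces the normal error with a Student's t.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

# Simulated setup: outcome y = 1 + 2*x + e is observed only when the
# selection index z = 0.5 + x + u is positive, with corr(e, u) > 0,
# so OLS on the observed sample alone is biased.
n = 20000
x = rng.normal(size=n)
u = rng.normal(size=n)
e = 0.8 * u + 0.6 * rng.normal(size=n)   # corr(e, u) = 0.8
y = 1.0 + 2.0 * x + e
selected = (0.5 + x + u) > 0

# Naive OLS on the selected sample (biased slope and intercept).
A = np.column_stack([np.ones(selected.sum()), x[selected]])
b_naive, *_ = np.linalg.lstsq(A, y[selected], rcond=None)

# Heckman's second step: augment the regression with the inverse
# Mills ratio of the selection index (known here by construction).
z = 0.5 + x[selected]
mills = norm.pdf(z) / norm.cdf(z)
A2 = np.column_stack([A, mills])
b_heck, *_ = np.linalg.lstsq(A2, y[selected], rcond=None)
```

The coefficient on the inverse Mills ratio estimates the error covariance (0.8 in this simulation), and a test of whether it is zero is the standard check for sample selection bias that the abstract's SLt-based tests generalize.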

  15. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experience author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  16. The partner selection process : Steps, effectiveness, governance

    NARCIS (Netherlands)

    Duisters, D.; Duijsters, G.M.; de Man, A.P.

    2011-01-01

    Selecting the right partner is important for creating value in alliances. Even though prior research suggests that a structured partner selection process increases alliance success, empirical research remains scarce. This paper presents an explorative empirical study that shows that some steps in

  17. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models....... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety...... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners....

  18. Socio-cultural models as an important element of the site selection process in rural waste management

    Directory of Open Access Journals (Sweden)

    Nenković-Riznić Marina

    2011-01-01

    Full Text Available The problem of waste management in rural areas has not been the subject of detailed specific researches since most of the research has been directed towards the study of means, mechanisms and procedures of waste elimination in urban settlements. The reason for the reduced scope of research in this field lies in the fact that rural settlements cannot be considered as "grateful" subjects due to the usual deficiency of specific data (population number, fluctuations, amount of waste, waste composition, methods of waste elimination, etc.). In addition, for several decades the villages have primarily eliminated waste spontaneously. This has proven difficult to research because of the variations of methods applied to each specific locale, as well as different environmental variables. These criteria are based on patterns of behavior, customs and habits of the local population, but they also insist on absolute participation of local stakeholders in waste management. On the other hand, although Serbia has a legislative frame which is fully harmonized with European laws, there is a problem with the unclearly defined waste management system as it applies to rural areas. The reason for this is the fact that waste management in rural areas is part of regional waste management, and does not operate independently from the system in "urban" areas. However, since rural areas require the construction of recycling yards, this paper will present a new methodology, which gives equal weight to techno-economic criteria and social criteria in determining waste elimination locations. This paper will also point out the variety of actors in the process of waste elimination in rural areas, as well as the possibility of their participation.

  19. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  20. PRIME – PRocess modelling in ImpleMEntation research: selecting a theoretical basis for interventions to change clinical practice

    Directory of Open Access Journals (Sweden)

    Pitts Nigel

    2003-12-01

    modelling. In the final phase of the project, the findings from all surveys will be analysed simultaneously adopting a random effects approach to investigate whether the relationships between predictor variables and outcome measures are modified by behaviour, professional group or geographical location.

  1. Exploring Several Methods of Groundwater Model Selection

    Science.gov (United States)

    Samani, Saeideh; Ye, Ming; Asghari Moghaddam, Asghar

    2017-04-01

    Selecting reliable models for simulating groundwater flow and solute transport is essential to groundwater resources management and protection. This work explores several model selection methods for avoiding over-complex and/or over-parameterized groundwater models. We consider six groundwater flow models with different numbers (6, 10, 10, 13, 13 and 15) of model parameters. These models represent alternative geological interpretations, recharge estimates, and boundary conditions at a study site in Iran. The models were developed with Model Muse, and calibrated against observations of hydraulic head using UCODE. Model selection was conducted using the following four approaches: (1) rank the models using their root mean square error (RMSE) obtained after UCODE-based model calibration, (2) calculate model probability using the GLUE method, (3) evaluate model probability using model selection criteria (AIC, AICc, BIC, and KIC), and (4) evaluate model weights using the Fuzzy Multi-Criteria-Decision-Making (MCDM) approach. MCDM is based on the fuzzy analytical hierarchy process (AHP) and the fuzzy technique for order performance, which identifies the ideal solution by a gradual expansion from the local to the global scale of model parameters. The KIC and MCDM methods are superior to the other methods, as they consider not only the fit between observed and simulated data and the number of parameters, but also uncertainty in model parameters. Considering these factors can prevent over-complexity and over-parameterization when selecting the appropriate groundwater flow models. These methods selected as the best model the one with average complexity (10 parameters) and the best parameter estimates (model 3).
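
    For least-squares calibration with Gaussian errors, the likelihood-based criteria in approach (3) reduce to simple functions of the fit and the parameter count. The sketch below uses the six parameter counts from the abstract but invented RMSE values and an assumed observation count; KIC is omitted because it additionally requires the Fisher information of the calibrated parameters.

```python
import math

n = 50  # hypothetical number of hydraulic-head observations

# (name, number of parameters k, calibrated RMSE); the RMSE values are
# illustrative stand-ins, not the ones from the study.
models = [("M1", 6, 2.1), ("M2", 10, 1.6), ("M3", 10, 1.2),
          ("M4", 13, 1.1), ("M5", 13, 1.3), ("M6", 15, 1.0)]

def criteria(k, rmse, n):
    """AIC, AICc and BIC for a least-squares fit with Gaussian errors."""
    sse = n * rmse ** 2
    aic = n * math.log(sse / n) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    bic = n * math.log(sse / n) + k * math.log(n)
    return aic, aicc, bic

for name, k, rmse in models:
    aic, aicc, bic = criteria(k, rmse, n)
    print(f"{name} (k={k:2d}): AIC={aic:6.1f} AICc={aicc:6.1f} BIC={bic:6.1f}")
```

    The lowest value wins. With these invented numbers, AIC favors the most complex 15-parameter model, while the heavier complexity penalties of AICc and BIC select the 10-parameter M3, which mirrors the abstract's conclusion that fit alone rewards over-parameterization.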

  2. Material and process selection using product examples

    DEFF Research Database (Denmark)

    Lenau, Torben Anker

    2002-01-01

    The objective of the paper is to suggest a different procedure for selecting materials and processes within the product development work. The procedure includes using product examples in order to increase the number of alternative materials and processes that are considered. Product examples can...... communicate information about materials and processes in a very concentrated and effective way. The product examples represent desired material properties but also include information that cannot be associated directly to the material, e.g. functional or perceived attributes. Previous studies suggest...... that designers often limit their selection of materials and processes to a few well-known ones. Designers need to expand the solution space by considering more materials and processes. But they have to be convinced that the materials and processes are likely candidates that are worth investing time in exploring...

  3. Material and process selection using product examples

    DEFF Research Database (Denmark)

    Lenau, Torben Anker

    2001-01-01

    The objective of the paper is to suggest a different procedure for selecting materials and processes within the product development work. The procedure includes using product examples in order to increase the number of alternative materials and processes that are considered. Product examples can...... communicate information about materials and processes in a very concentrated and effective way. The product examples represent desired material properties but also include information that cannot be associated directly to the material, e.g. functional or perceived attributes. Previous studies suggest...... that designers often limit their selection of materials and processes to a few well-known ones. Designers need to expand the solution space by considering more materials and processes. But they have to be convinced that the materials and processes are likely candidates that are worth investing time in exploring...

  4. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that are often occurring in activated sludge tanks are initially......-process models, the last part of the thesis, where the integrated process tank model is tested on three examples of activated sludge systems, is initiated. The three case studies are introduced with an increasing degree of model complexity. All three cases are based on Danish municipal wastewater treatment...... plants. The first case study involves the modeling of an activated sludge tank undergoing a special controlling strategy with the intention of minimizing the sludge loading on the subsequent secondary settlers during storm events. The applied model is a two-phase model, where the sedimentation of sludge...

  5. On spatial mutation-selection models

    Energy Technology Data Exchange (ETDEWEB)

    Kondratiev, Yuri, E-mail: kondrat@math.uni-bielefeld.de [Fakultät für Mathematik, Universität Bielefeld, Postfach 100131, 33501 Bielefeld (Germany); Kutoviy, Oleksandr, E-mail: kutoviy@math.uni-bielefeld.de, E-mail: kutovyi@mit.edu [Fakultät für Mathematik, Universität Bielefeld, Postfach 100131, 33501 Bielefeld (Germany); Department of Mathematics, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139 (United States); Minlos, Robert, E-mail: minl@iitp.ru; Pirogov, Sergey, E-mail: pirogov@proc.ru [IITP, RAS, Bolshoi Karetnyi 19, Moscow (Russian Federation)

    2013-11-15

    We discuss the selection procedure in the framework of mutation models. We study the regulation of stochastically developing systems based on a transformation of the initial Markov process which includes a cost functional. The transformation of the initial Markov process by a cost functional has an analytic realization in terms of a Kimura-Maruyama type equation for the time evolution of states or in terms of the corresponding Feynman-Kac formula on the path space. The state evolution of the system, including the limiting behavior, is studied for two types of mutation-selection models.
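
    The path-space (Feynman-Kac) form of the transformation has a direct Monte Carlo reading: each trajectory of the original Markov process is weighted by the exponential of minus the accumulated cost along the path. The toy sketch below uses Brownian paths and an illustrative quadratic cost; none of the specifics come from the paper.

```python
import numpy as np

# Toy Feynman-Kac reweighting: paths of the original process (Brownian
# motion here) are weighted by exp(-integral of a cost c(x) along the
# path). The quadratic cost c(x) = x^2 is an illustrative choice; it
# penalizes excursions, so the transformed process concentrates near 0.
rng = np.random.default_rng(2)
n_paths, n_steps, dt = 20_000, 100, 0.01

x = np.zeros(n_paths)
log_w = np.zeros(n_paths)
for _ in range(n_steps):
    x += np.sqrt(dt) * rng.normal(size=n_paths)
    log_w -= x ** 2 * dt      # accumulate -integral of c(X_s) ds

w = np.exp(log_w)
var_plain = float(np.var(x))  # close to t = 1 for Brownian motion
mean_sel = float(np.average(x, weights=w))
var_sel = float(np.average((x - mean_sel) ** 2, weights=w))
print(round(var_plain, 2), round(var_sel, 2))  # cost penalty shrinks the spread
```

    The weighted (transformed) law has visibly smaller variance than the free process, which is the selection effect of the cost functional in miniature.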

  6. Selection of power market structure using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Subhes Bhattacharyya; Prasanta Kumar Dey

    2003-01-01

    Selection of a power market structure from the available alternatives is an important activity within an overall power sector reform program. The evaluation criteria for selection are both subjective and objective in nature and the selection of alternatives is characterised by their conflicting nature. This study demonstrates a methodology for power market structure selection using the analytic hierarchy process, a multiple attribute decision-making technique, to model the selection methodology with the active participation of relevant stakeholders in a workshop environment. The methodology is applied to a hypothetical case of a State Electricity Board reform in India. (author)
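
    The numerical core of the analytic hierarchy process is small: stakeholders fill in a pairwise-comparison matrix, the priority weights are its normalized principal eigenvector, and a consistency ratio checks that the judgments are not self-contradictory. The comparison matrix below is hypothetical, not from the study.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three candidate market
# structures on Saaty's 1-9 scale; A[i, j] = importance of i relative to j,
# with A[j, i] = 1 / A[i, j].
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights = normalized principal eigenvector of A.
vals, vecs = np.linalg.eig(A)
i = np.argmax(vals.real)
w = np.abs(vecs[:, i].real)
w /= w.sum()

# Consistency ratio; RI = 0.58 is Saaty's random index for a 3x3 matrix.
lam = vals.real[i]
ci = (lam - len(A)) / (len(A) - 1)
cr = ci / 0.58
print(np.round(w, 3), round(cr, 3))  # weights sum to 1; CR < 0.1 is acceptable
```

    In a workshop setting each stakeholder group can supply its own matrix; the per-group weights are then aggregated (for example by the geometric mean) before ranking the alternatives.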

  7. Selecting model complexity in learning problems

    Energy Technology Data Exchange (ETDEWEB)

    Buescher, K.L. [Los Alamos National Lab., NM (United States); Kumar, P.R. [Illinois Univ., Urbana, IL (United States). Coordinated Science Lab.

    1993-10-01

    To learn (or generalize) from noisy data, one must resist the temptation to pick a model for the underlying process that overfits the data. Many existing techniques solve this problem at the expense of requiring the evaluation of an absolute, a priori measure of each model's complexity. We present a method that does not. Instead, it uses a natural, relative measure of each model's complexity. This method first creates a pool of "simple" candidate models using part of the data and then selects from among these by using the rest of the data.
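
    The pool-and-select idea can be sketched with polynomial fitting: build candidates of increasing complexity from one part of the data, then compare them on the rest. This is a generic data-splitting illustration under assumed data (a quadratic truth with Gaussian noise), not the authors' specific construction.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy samples of an underlying quadratic process (illustrative).
x = np.linspace(-1, 1, 60)
y = 1.0 - 2.0 * x + 3.0 * x ** 2 + rng.normal(scale=0.3, size=x.size)

# Step 1: create a pool of candidate models of increasing complexity
# from one part of the data.
fit_x, fit_y = x[::2], y[::2]
candidates = {d: np.polyfit(fit_x, fit_y, d) for d in range(10)}

# Step 2: select among them using the held-out part; only a relative
# comparison is needed, no absolute a priori complexity measure.
val_x, val_y = x[1::2], y[1::2]
errors = {d: float(np.mean((np.polyval(c, val_x) - val_y) ** 2))
          for d, c in candidates.items()}
best = min(errors, key=errors.get)
print(best)  # typically a low degree near the true order, not the overfit 9
```

    The held-out error penalizes the high-degree fits automatically, because their wiggles do not generalize to points they were not trained on.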

  8. Material and process selection using product examples

    DEFF Research Database (Denmark)

    Lenau, Torben Anker

    2002-01-01

    The objective of the paper is to suggest a different procedure for selecting materials and processes within the product development work. The procedure includes using product examples in order to increase the number of alternative materials and processes that is considered. Product examples can...... communicate information about materials and processes in a very concentrated and effective way. The product examples represent desired material properties but also includes information that can not be associated directly to the material, e.g. functional or perceived attributes. Previous studies suggest....... A database that support the selection procedure has been compiled. It contains uniform descriptions of a wide range of materials and processes. For each of those, good product examples have been identified, described and associated with keywords. Product examples matching the requirements can be found using...

  9. Model Process Control Language

    Data.gov (United States)

    National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

  10. The Process of Marketing Segmentation Strategy Selection

    OpenAIRE

    Ionel Dumitru

    2007-01-01

    The process of marketing segmentation strategy selection represents the essence of strategic marketing. We present hereinafter the main forms of marketing segmentation strategy: undifferentiated marketing, differentiated marketing, concentrated marketing and personalized marketing. In practice, companies use a mix of these marketing segmentation methods in order to maximize profit and to satisfy the consumers’ needs.

  11. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  12. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  13. Application of numerical modeling of selective NOx reduction by hydrocarbon under diesel transient conditions in consideration of hydrocarbon adsorption and desorption process

    International Nuclear Information System (INIS)

    Watanabe, Y.; Asano, A.; Banno, K.; Yokota, K.; Sugiura, M.

    2001-01-01

    A model of NOx selective reduction by hydrocarbon (HC) was developed, which takes into account the adsorption and desorption of HC. The model was applied for predicting the performance of a De-NOx catalytic reactor working under transient conditions such as a legislative driving cycle. Diesel fuel was used as a supplemental reductant. The behavior of HC and NOx reactions and HC adsorption and desorption has been simulated successfully by our numerical approach under the transient conditions of the simulated Japanese 10-15 driving cycle. Our model is expected to optimize the design of selective diesel NOx reduction systems using diesel fuel as a supplemental reductant

  14. Selective epitaxy using the gild process

    Science.gov (United States)

    Weiner, Kurt H.

    1992-01-01

    The present invention comprises a method of selective epitaxy on a semiconductor substrate. The present invention provides a method of selectively forming high quality, thin GeSi layers in a silicon circuit, and a method for fabricating smaller semiconductor chips with a greater yield (more error free chips) at a lower cost. The method comprises forming an upper layer over a substrate, and depositing a reflectivity mask which is then removed over selected sections. Using a laser to melt the unmasked sections of the upper layer, the semiconductor material in the upper layer is heated and diffused into the substrate semiconductor material. By varying the amount of laser radiation, the epitaxial layer is formed to a controlled depth which may be very thin. When cooled, a single crystal epitaxial layer is formed over the patterned substrate. The present invention provides the ability to selectively grow layers of mixed semiconductors over patterned substrates such as a layer of Ge(x)Si(1-x) grown over silicon. Such a process may be used to manufacture small transistors that have a narrow base, heavy doping, and high gain. The narrowness allows a faster transistor, and the heavy doping reduces the resistance of the narrow layer. The process does not require high temperature annealing; therefore materials such as aluminum can be used. Furthermore, the process may be used to fabricate diodes that have a high reverse breakdown voltage and a low reverse leakage current.

  15. The Brookhaven Process Optimization Models

    Energy Technology Data Exchange (ETDEWEB)

    Pilati, D. A.; Sparrow, F. T.

    1979-01-01

    The Brookhaven National Laboratory Industry Model Program (IMP) has undertaken the development of a set of industry-specific process-optimization models. These models are to be used for energy-use projections, energy-policy analyses, and process technology assessments. Applications of the models currently under development show that system-wide energy impacts may be very different from engineering estimates, that selected investment tax credits for cogeneration (or other conservation strategies) may have the perverse effect of increasing industrial energy use, and that a proper combination of energy taxes and investment tax credits is more socially desirable than either policy alone. A section is included describing possible extensions of these models to answer questions or address other systems (e.g., a single plant instead of an entire industry).

  16. Foam process models.

    Energy Technology Data Exchange (ETDEWEB)

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production-level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion, as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  17. Voter models with heterozygosity selection

    Czech Academy of Sciences Publication Activity Database

    Sturm, A.; Swart, Jan M.

    2008-01-01

    Roč. 18, č. 1 (2008), s. 59-99 ISSN 1050-5164 R&D Projects: GA ČR GA201/06/1323; GA ČR GA201/07/0237 Institutional research plan: CEZ:AV0Z10750506 Keywords : Heterozygosity selection * rebellious voter model * branching * annihilation * survival * coexistence Subject RIV: BA - General Mathematics Impact factor: 1.285, year: 2008

  18. Dynamic treatment selection and modification for personalised blood pressure therapy using a Markov decision process model: a cost-effectiveness analysis.

    Science.gov (United States)

    Choi, Sung Eun; Brandeau, Margaret L; Basu, Sanjay

    2017-11-15

    Personalised medicine seeks to select and modify treatments based on individual patient characteristics and preferences. We sought to develop an automated strategy to select and modify blood pressure treatments, incorporating the likelihood that patients with different characteristics would benefit from different types of medications and dosages and the potential severity and impact of different side effects among patients with different characteristics. We developed a Markov decision process (MDP) model to incorporate meta-analytic data and estimate the optimal treatment for maximising discounted lifetime quality-adjusted life-years (QALYs) based on individual patient characteristics, incorporating medication adjustment choices when a patient incurs side effects. We compared the MDP to current US blood pressure treatment guidelines (the Eighth Joint National Committee, JNC8) and a variant of current guidelines that incorporates results of a major recent trial of intensive treatment (Intensive JNC8). We used a microsimulation model of patient demographics, cardiovascular disease risk factors and side effect probabilities, sampling from the National Health and Nutrition Examination Survey (2003-2014), to compare the expected population outcomes from adopting the MDP versus guideline-based strategies. Costs and QALYs for the MDP-based treatment (MDPT), JNC8 and Intensive JNC8 strategies. Compared with the JNC8 guideline, the MDPT strategy would be cost-saving from a societal perspective with discounted savings of US$1187 per capita (95% CI 1178 to 1209) and an estimated discounted gain of 0.06 QALYs per capita (95% CI 0.04 to 0.08) among the US adult population. QALY gains would largely accrue from reductions in severe side effects associated with higher treatment doses later in life. The Intensive JNC8 strategy was dominated by the MDPT strategy. An MDP-based approach can aid decision-making by incorporating meta-analytic evidence to personalise blood pressure
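
    The machinery behind the MDP strategy is standard value iteration on a state space of patient conditions and an action space of treatment choices. The sketch below is a deliberately tiny stand-in: the three states, three dose levels, transition probabilities, and QALY payoffs are all invented for illustration, whereas the study's model works with continuous risk factors, side-effect severities, and medication adjustments.

```python
import numpy as np

# Toy blood-pressure treatment MDP; every number below is illustrative.
# States: 0 = controlled, 1 = elevated, 2 = severe.
# Actions: 0 = low dose, 1 = medium dose, 2 = high dose.
# P[a, s, t] = P(next state t | state s, action a); each row sums to 1.
P = np.array([
    [[0.80, 0.20, 0.00], [0.30, 0.50, 0.20], [0.10, 0.30, 0.60]],
    [[0.90, 0.10, 0.00], [0.50, 0.40, 0.10], [0.20, 0.40, 0.40]],
    [[0.95, 0.05, 0.00], [0.70, 0.25, 0.05], [0.40, 0.40, 0.20]],
])
# R[s, a]: per-period QALY payoff; higher doses control blood pressure
# better but carry a small side-effect disutility.
R = np.array([[1.00, 0.97, 0.93],
              [0.90, 0.88, 0.85],
              [0.75, 0.74, 0.73]])
gamma = 0.97  # discount factor

# Value iteration to the Bellman fixed point.
V = np.zeros(3)
for _ in range(2000):
    Q = R + gamma * np.einsum("ast,t->sa", P, V)  # Q[s, a]
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-10:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=1)  # best dose for each state
print(np.round(V, 2), policy)
```

    Side-effect-driven medication changes fit the same template by enlarging the state to include the current regimen and any incurred side effect, so that "switch drug class" becomes just another action.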

  19. Filament winding cylinders. III - Selection of the process variables

    Science.gov (United States)

    Lee, Soo-Yong; Springer, George S.

    1990-01-01

    By using the Lee-Springer filament winding model, temperatures, degrees of cure, viscosities, stresses, strains, fiber tensions, fiber motions, and void diameters were calculated in graphite-epoxy composite cylinders during the winding and subsequent curing. The results demonstrate the type of information which can be generated by the model. It is shown, in reference to these results, how the model, and the corresponding WINDTHICK code, can be used to select the appropriate process variables.

  20. Spacecraft Electrical Connector Selection and Application Processes

    Science.gov (United States)

    Iannello, Chris; Davis, Mitchell I; Kichak, Robert A.; Slenski, George

    2009-01-01

    This assessment was initiated by the NASA Engineering & Safety Center (NESC) after a number of recent "high profile" connector problems, the most visible and publicized of these being the problem with the Space Shuttle's Engine Cut-Off System cryogenic feed-thru connector. The NESC commissioned a review of NASA's connector selection and application processes for space flight applications, including how lessons learned and past problem records are fed back into the processes to avoid recurring issues. Team members were primarily from the various NASA Centers and included connector and electrical parts specialists. The commissioned study was conducted on spacecraft connector selection and application processes at NASA Centers. The team also compared the NASA spacecraft connector selection and application process to the military process, identified recent high profile connector failures, and analyzed problem report data looking for trends and common occurrences. The team characterized NASA's connector problem experience into a list of top connector issues based on anecdotal evidence of a system's impact and commonality between Centers. These top issues are as follows, in no particular rank order: electrically shorted, bent and/or recessed contact pins; contact pin/socket contamination leading to electrical opens or intermittencies; connector plating corrosion or corrosion of connector components; low or inadequate contact pin retention forces; contact crimp failures; unmated connectors and mis-wiring due to workmanship errors during installation or maintenance; loose connectors due to manufacturing defects such as wavy washers and worn bayonet retention; damaged connector elastomeric seals; and cryogenic connector failure. A survey was also conducted of SAE Connector AE-8C1 committee members regarding their experience relative to the NASA concerns on connectors.
The most common responses in order of occurrence were contact retention, plating issues, worn-out or damaged

  1. Selected applications and processing techniques for LTCC.

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, Kenneth Allen; Krueger, Daniel S. (NNSA, Kansas City, MO); Sandoval, Charles E.

    2010-11-01

    Low Temperature Cofired Ceramic has proven itself in microelectronics, microsystems (including microfluidic systems), sensors, RF features, and various non-electronic applications. We will discuss selected applications and the processing associated with those applications. We will then focus on our recent work in the area of EMI shielding using full tape thickness features (FTTF) and sidewall metallization. The FTTF is very effective in applications with -150 dB isolation requirements, but presents obvious processing difficulties in full-scale fabrication. The FTTF forms a single continuous solid wall around the volume to be shielded by using sequential punching and feature-filling. We discuss the material incompatibilities and manufacturing considerations that need to be addressed for such structures and show preliminary implementations.

  2. Neuroscientific model of motivational process.

    Science.gov (United States)

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  3. Multiattribute Supplier Selection Using Fuzzy Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Serhat Aydin

    2010-11-01

Full Text Available Supplier selection is a multiattribute decision making (MADM) problem which contains both qualitative and quantitative factors, and it has vital importance for most companies. The aim of this paper is to provide an AHP-based analytical tool for decision support, enabling an effective multicriteria supplier selection process in an air conditioner seller firm under fuzziness. The Analytic Hierarchy Process (AHP) under fuzziness is employed because it permits an evaluation scale that includes linguistic expressions, crisp numerical values, fuzzy numbers and range numerical values. This scale provides a more flexible evaluation compared with other fuzzy AHP methods. In this study, the modified AHP was used for supplier selection in an air conditioner firm. Three experts evaluated the suppliers according to the proposed model and the most appropriate supplier was selected. The proposed model enables decision makers to select the best supplier among supplier firms effectively. We confirm that the modified fuzzy AHP is appropriate for group decision making in supplier selection problems.
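The record does not reproduce the modified evaluation scale, but the weighting step common to most fuzzy AHP variants can be sketched. The snippet below uses Buckley's geometric-mean method on a small, hypothetical 3x3 judgement matrix of triangular fuzzy numbers; the criteria names and all values are invented for illustration and are not data from the paper.

```python
from math import prod

def fuzzy_ahp_weights(matrix):
    n = len(matrix)
    # Fuzzy geometric mean of each row, component-wise over (l, m, u).
    gmeans = [tuple(prod(row[i][k] for i in range(n)) ** (1 / n) for k in range(3))
              for row in matrix]
    # Defuzzify by the centroid (l + m + u) / 3, then normalize.
    crisp = [(l + m + u) / 3 for l, m, u in gmeans]
    total = sum(crisp)
    return [w / total for w in crisp]

# Hypothetical criteria: price, quality, delivery; each entry is a triangular
# fuzzy number (l, m, u), with reciprocal judgements in the lower triangle.
M = [
    [(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
]
weights = fuzzy_ahp_weights(M)
print([round(w, 3) for w in weights])
```

Defuzzifying after the geometric mean is one of several common choices; extent analysis and alpha-cut methods are alternatives.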

  4. Review and selection of unsaturated flow models

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-09-10

Under the US Department of Energy (DOE), the Civilian Radioactive Waste Management System Management and Operating Contractor (CRWMS M&O) has the responsibility to review, evaluate, and document existing computer ground-water flow models; to conduct performance assessments; and to develop performance assessment models, where necessary. In the area of scientific modeling, the CRWMS M&O has the following responsibilities: to provide overall management and integration of modeling activities; to provide a framework for focusing modeling and model development; to identify areas that require increased or decreased emphasis; and to ensure that the tools necessary to conduct performance assessment are available. These responsibilities are being initiated through a three-step process: a thorough review of existing models, testing of the models which best fit the established requirements, and recommendations for the future development that should be conducted. Future model enhancement will then focus on the models selected during this activity. Furthermore, in order to manage future model development, particularly in those areas requiring substantial enhancement, the three-step process will be updated and reported periodically.

  5. Solvent selection methodology for pharmaceutical processes: Solvent swap

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil; Kumar Tula, Anjan; Gani, Rafiqul

    2016-01-01

A method for the selection of appropriate solvents for the solvent swap task in pharmaceutical processes has been developed. This solvent swap method is based on the solvent selection method of Gani et al. (2006) and considers additional selection criteria such as boiling point difference, volatility difference, VLE phase diagram analysis, and azeotropic information that are particularly important for the solvent swap task. The method employs a solvent-swap database together with calculation tools for properties–functions of solvents. The database contains solvents that are commonly used in pharmaceutical processes as well as new solvent swap alternatives. The method takes into account process considerations such as batch distillation and crystallization to achieve the swap task. Rigorous model based simulations of the swap operation are performed to evaluate and compare the performance ...

  6. Processing plant persistent strains of Listeria monocytogenes appear to have a lower virulence potential than clinical strains in selected virulence models

    DEFF Research Database (Denmark)

    Jensen, Anne; Thomsen, L.E.; Jørgensen, R.L.

    2008-01-01

Listeria monocytogenes is an important foodborne bacterial pathogen that can colonize food processing equipment. One group of genetically similar L. monocytogenes strains (RAPD type 9) was recently shown to reside in several independent fish processing plants. Persistent strains are likely ... cell line, Caco-2; time to death in a nematode model, Caenorhabditis elegans, and in a fruit fly model, Drosophila melanogaster; and fecal shedding in a guinea pig model. All strains adhered to and grew in Caco-2 cells at similar levels. When exposed to 10⁶ CFU/ml, two strains representing the persistent RAPD type 9 invaded Caco-2 cells in lower numbers (10²–10³ CFU/ml) as compared to the four other strains (10⁴–10⁶ CFU/ml), including food and human clinical strains. In the D. melanogaster model, the two RAPD type 9 strains were among the slowest to kill. Similarly, the time to reach 50 ...

  7. Expatriates Selection: An Essay of Model Analysis

    Directory of Open Access Journals (Sweden)

    Rui Bártolo-Ribeiro

    2015-03-01

Full Text Available The business expansion to other geographical areas, with cultures different from those in which organizations were created and developed, leads to the expatriation of employees to these destinations. Recruitment and selection procedures for expatriates do not always have the intended success, leading to an early return of these professionals with consequent organizational disorders. In this study, several articles published in the last five years were analyzed in order to identify the dimensions most frequently mentioned in the selection of expatriates in terms of success and failure. The characteristics in the selection process that may improve prediction of expatriates' adaptation to the new cultural contexts of the same organization were studied according to the KSAOs model. Few references to the Knowledge, Skills and Abilities dimensions were found in the analyzed papers. There was a strong predominance of the evaluation of Other Characteristics, and more importance was given to dispositional factors than to situational factors in promoting the integration of the expatriates.

  8. Behavioral optimization models for multicriteria portfolio selection

    Directory of Open Access Journals (Sweden)

    Mehlawat Mukesh Kumar

    2013-01-01

Full Text Available In this paper, the behavioral construct of suitability is used to develop a multicriteria decision making framework for portfolio selection. To achieve this purpose, we rely on multiple methodologies. The analytic hierarchy process technique is used to model the suitability considerations with a view to obtaining a suitability performance score in respect of each asset. A fuzzy multiple criteria decision making method is used to obtain the financial quality score of each asset based upon the investor's ratings on the financial criteria. Two optimization models are developed for optimal asset allocation, considering financial and suitability criteria simultaneously. An empirical study is conducted on randomly selected assets from the National Stock Exchange, Mumbai, India to demonstrate the effectiveness of the proposed methodology.

  9. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  10. Model selection for univariable fractional polynomials.

    Science.gov (United States)

    Royston, Patrick

    2017-07-01

    Since Royston and Altman's 1994 publication ( Journal of the Royal Statistical Society, Series C 43: 429-467), fractional polynomials have steadily gained popularity as a tool for flexible parametric modeling of regression relationships. In this article, I present fp_select, a postestimation tool for fp that allows the user to select a parsimonious fractional polynomial model according to a closed test procedure called the fractional polynomial selection procedure or function selection procedure. I also give a brief introduction to fractional polynomial models and provide examples of using fp and fp_select to select such models with real data.
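To make the flavor of the procedure concrete, here is a minimal, hypothetical sketch of the FP1 stage only: each candidate power from the standard fractional-polynomial set is tried in turn (with x^0 read as ln x) and the best-fitting transform is kept by residual sum of squares. The real fp/fp_select closed test additionally compares FP2, FP1, linear and null models by deviance; none of the data below come from the article.

```python
import math

# Standard FP1 power set; p = 0 denotes the log transform.
POWERS = [-2, -1, -0.5, 0, 0.5, 1, 2, 3]

def fp_transform(x, p):
    return math.log(x) if p == 0 else x ** p

def fit_fp1(xs, ys):
    # For each power, fit y = b0 + b1 * x^p by ordinary least squares
    # and keep the power with the smallest residual sum of squares.
    best = None
    for p in POWERS:
        ts = [fp_transform(x, p) for x in xs]
        n = len(ts)
        tbar, ybar = sum(ts) / n, sum(ys) / n
        sxx = sum((t - tbar) ** 2 for t in ts)
        sxy = sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
        b1 = sxy / sxx
        b0 = ybar - b1 * tbar
        rss = sum((y - (b0 + b1 * t)) ** 2 for t, y in zip(ts, ys))
        if best is None or rss < best[0]:
            best = (rss, p, b0, b1)
    return best  # (rss, power, intercept, slope)

# Data generated from a reciprocal relationship, y = 2/x + 1,
# so the search should recover the power p = -1.
xs = [0.5, 1, 2, 3, 4, 5, 8, 10]
ys = [2 / x + 1 for x in xs]
rss, p, b0, b1 = fit_fp1(xs, ys)
print(p)  # -1
```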

  11. SUPPLIER SELECTION PROCESS IN SUPPLY CHAIN MANAGEMENT

    OpenAIRE

    Chandraveer Singh Rathore*, Sachin Agarwal

    2016-01-01

Supplier selection is one of the most essential activities of supply chain management. It is a complex activity involving qualitative and quantitative multi-criteria, and a trade-off between these tangible and intangible factors is essential in choosing the best supplier. This paper explains the various methods for supplier selection and the use of AHP in selecting the most effective suppliers. The complete procedure of AHP is explained in this paper with ...
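As a concrete illustration of the AHP procedure the abstract refers to, the sketch below derives criterion weights as the principal eigenvector of a pairwise-comparison matrix (via power iteration) and computes Saaty's consistency ratio. The matrix and criteria are invented for illustration, not taken from the paper.

```python
# Saaty's random consistency indices for matrix sizes 1..5.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}

def ahp_weights(A, iters=200):
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):  # power iteration -> principal eigenvector
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n  # estimate of lambda_max
    ci = (lam - n) / (n - 1)                       # consistency index
    cr = ci / RI[n] if RI[n] else 0.0              # consistency ratio
    return w, cr

# Hypothetical pairwise judgements for three criteria (cost, quality, service).
A = [
    [1,   3,   5],
    [1/3, 1,   3],
    [1/5, 1/3, 1],
]
w, cr = ahp_weights(A)
print([round(x, 3) for x in w], round(cr, 3))  # CR < 0.1 is conventionally acceptable
```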

  12. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes with a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axon, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and biological questions that require the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary differential equations and partial differential equations; background in biology is not required. Students will gain knowledge of how to program with MATLAB without previous programming experience and how to use codes in order to test biological hypotheses.

  13. NON-TRADITIONAL MACHINING PROCESS SELECTION - AN INTEGRATED APPROACH

    Directory of Open Access Journals (Sweden)

    Manish Kumar Roy

    2017-03-01

Full Text Available The large demand for harder and difficult-to-machine materials such as titanium, Inconel and high-strength temperature resistant (HSTR) alloys, coupled with the need for high accuracy and the desired surface finish, has led to a large pool of non-traditional machining (NTM) processes. Selecting a particular NTM process for a specific task is therefore a complicated job. Meticulous selection of an NTM process involves many criteria, and hence multi-criteria decision making (MCDM) methods are used to solve such problems. To aid the decision maker and simplify the selection process, an integrated method of fuzzy analytic hierarchy process (FAHP) with quality function deployment (QFD) has been implemented for finding the relative significance of the different technical requirements. Subsequently, grey relational analysis (GRA) has been implemented for ranking the alternatives, and it was found that electrochemical machining (ECM) outperforms the other NTM processes. A problem already existing in the literature has been picked up for the numerical illustration. The results obtained in the present research study are comparable with the existing literature, and sensitivity analysis indicates the robustness of the proposed model.
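The GRA ranking step can be sketched as follows. The decision matrix, the alternative names and the distinguishing coefficient ζ = 0.5 are illustrative assumptions, not data from the study; all criteria are treated as benefit-type after normalization.

```python
def gra_rank(matrix, zeta=0.5):
    # Grey relational grades for alternatives (rows) over criteria (columns).
    n_alt, n_crit = len(matrix), len(matrix[0])
    cols = list(zip(*matrix))
    # Min-max normalize each criterion to [0, 1], larger-is-better form.
    norm = [[(matrix[i][j] - min(cols[j])) / (max(cols[j]) - min(cols[j]))
             for j in range(n_crit)] for i in range(n_alt)]
    # Deviations from the reference (ideal) series of all ones.
    deltas = [[abs(1.0 - norm[i][j]) for j in range(n_crit)] for i in range(n_alt)]
    dmin = min(min(row) for row in deltas)
    dmax = max(max(row) for row in deltas)
    # Grey relational coefficients, then the grade as their mean per row.
    coeff = [[(dmin + zeta * dmax) / (d + zeta * dmax) for d in row]
             for row in deltas]
    return [sum(row) / n_crit for row in coeff]

# Hypothetical scores for three NTM processes on three criteria.
scores = [
    [0.9, 0.8, 0.7],   # ECM
    [0.6, 0.7, 0.5],   # EDM
    [0.4, 0.5, 0.6],   # USM
]
grades = gra_rank(scores)
print([round(g, 3) for g in grades])  # highest grade ranks first
```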

  14. Fundamental Aspects of Selective Melting Additive Manufacturing Processes

    Energy Technology Data Exchange (ETDEWEB)

    van Swol, Frank B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miller, James E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

Certain details of the additive manufacturing process known as selective laser melting (SLM) affect the performance of the final metal part. To unleash the full potential of SLM it is crucial that the process engineer in the field receives guidance about how to select values for a multitude of process variables employed in the building process. These include, for example, the type of powder (e.g., size distribution, shape, type of alloy), orientation of the build axis, the beam scan rate, the beam power density, the scan pattern and scan rate. The science-based selection of these settings constitutes an intrinsically challenging multi-physics problem involving heating and melting a metal alloy, reactive, dynamic wetting followed by re-solidification. In addition, inherent to the process is its considerable variability that stems from the powder packing. Each time a limited number of powder particles are placed, the stacking is intrinsically different from the previous, possessing a different geometry, and having a different set of contact areas with the surrounding particles. As a result, even if all other process parameters (scan rate, etc.) are exactly the same, the shape and contact geometry and area of the final melt pool will be unique to that particular configuration. This report identifies the most important issues facing SLM, discusses the fundamental physics associated with it and points out how modeling can support the additive manufacturing efforts.

  15. Animal models and conserved processes

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-09-01

Full Text Available Background: The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods: We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results: Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion: We conclude that even the presence of conserved processes is

  16. Animal models and conserved processes.

    Science.gov (United States)

    Greek, Ray; Rice, Mark J

    2012-09-10

The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. We conclude that even the presence of conserved processes is insufficient for inter-species extrapolation when the trait or response

  17. GREENSCOPE: Sustainable Process Modeling

    Science.gov (United States)

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  18. Quality Quandaries- Time Series Model Selection and Parsimony

    DEFF Research Database (Denmark)

    Bisgaard, Søren; Kulahci, Murat

    2009-01-01

Some of the issues involved in selecting adequate models for time series data are discussed using an example concerning the number of users of an Internet server. The process of selecting an appropriate model is subjective and requires experience and judgment. The authors believe an important consideration in model selection should be parameter parsimony. They favor the use of parsimonious mixed ARMA models, noting that research has shown that a model building strategy that considers only autoregressive representations will lead to non-parsimonious models and to loss of forecasting accuracy.
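The parsimony argument can be made concrete with a toy experiment: fit autoregressive models of increasing order to one simulated series and let an information criterion trade the automatic drop in residual sum of squares against the growing parameter count. This is a simplified AR-only sketch (mixed ARMA fitting needs a more elaborate estimator); the series, orders and seed are arbitrary.

```python
import math, random

def fit_ar(series, p, max_p):
    # Least-squares AR(p) fit on a common sample (rows start at max_p),
    # so models of different order are nested and directly comparable.
    y = series[max_p:]
    X = [[series[t - k] for k in range(1, p + 1)]
         for t in range(max_p, len(series))]
    n, m = len(X), p
    # Solve the normal equations (X'X) b = X'y by Gauss-Jordan elimination.
    G = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(m)] +
         [sum(X[r][i] * y[r] for r in range(n))] for i in range(m)]
    for i in range(m):
        G[i] = [v / G[i][i] for v in G[i]]
        for r in range(m):
            if r != i:
                G[r] = [a - G[r][i] * b for a, b in zip(G[r], G[i])]
    b = [G[i][m] for i in range(m)]
    rss = sum((yt - sum(bk * xk for bk, xk in zip(b, xr))) ** 2
              for yt, xr in zip(y, X))
    aic = n * math.log(rss / n) + 2 * p   # fit vs. number of parameters
    return rss, aic

random.seed(1)
x = [0.0]
for _ in range(300):                      # simulate an AR(1) process
    x.append(0.7 * x[-1] + random.gauss(0, 1))

results = {p: fit_ar(x, p, max_p=4) for p in (1, 2, 3, 4)}
# RSS can only shrink as the order grows; AIC's penalty guards against
# choosing the biggest model by default.
```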

  19. Auditory processing models

    DEFF Research Database (Denmark)

    Dau, Torsten

    2008-01-01

The Handbook of Signal Processing in Acoustics will compile the techniques and applications of signal processing as they are used in the many varied areas of Acoustics. The Handbook will emphasize the interdisciplinary nature of signal processing in acoustics. Each Section of the Handbook will present topics on signal processing which are important in a specific area of acoustics. These will be of interest to specialists in these areas because they will be presented from their technical perspective, rather than a generic engineering approach to signal processing. Non-specialists, or specialists ...

  20. A Hybrid Multiple Criteria Decision Making Model for Supplier Selection

    OpenAIRE

    Wu, Chung-Min; Hsieh, Ching-Lin; Chang, Kuei-Lun

    2013-01-01

The sustainable supplier selection would be the vital part in the management of a sustainable supply chain. In this study, a hybrid multiple criteria decision making (MCDM) model is applied to select the optimal supplier. The fuzzy Delphi method, which can lead to better criteria selection, is used to modify the criteria. Considering the interdependence among the selection criteria, the analytic network process (ANP) is then used to obtain their weights. To avoid calculation and additional pairwise comparisons ...

  1. Model selection and comparison for independents sinusoids

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2014-01-01

In the signal processing literature, many methods have been proposed for estimating the number of sinusoidal basis functions from a noisy data set. The most popular method is the asymptotic MAP criterion, which is sometimes also referred to as the BIC. In this paper, we extend and improve this method by considering the problem in a full Bayesian framework instead of the approximate formulation on which the asymptotic MAP criterion is based. This leads to a new model selection and comparison method, the lp-BIC, whose computational complexity is of the same order as the asymptotic MAP criterion. Through simulations, we demonstrate that the lp-BIC outperforms the asymptotic MAP criterion and other state of the art methods in terms of model selection, de-noising and prediction performance. The simulation code is available online.
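A much-simplified version of BIC-style order selection for sinusoids in noise can be sketched as follows: for each candidate number of sinusoids k, take the k strongest periodogram bins, fit the amplitudes by least squares, and score with N·ln(RSS/N) plus a penalty per sinusoid. The 5·ln N penalty (chosen here to mimic the heavier frequency penalty of asymptotic-MAP-type rules), the on-grid test signal and the seed are all illustrative assumptions, not the authors' lp-BIC.

```python
import cmath, math, random

def dft_peaks(x, k):
    # Indices of the k strongest periodogram bins (excluding DC and Nyquist).
    N = len(x)
    mags = []
    for b in range(1, N // 2):
        X = sum(x[n] * cmath.exp(-2j * math.pi * b * n / N) for n in range(N))
        mags.append((abs(X), b))
    return [b for _, b in sorted(mags, reverse=True)[:k]]

def bic_for_order(x, k):
    # Least-squares fit of k sinusoids at the strongest bins, then the score.
    N = len(x)
    cols = []
    for b in dft_peaks(x, k):
        cols.append([math.cos(2 * math.pi * b * n / N) for n in range(N)])
        cols.append([math.sin(2 * math.pi * b * n / N) for n in range(N)])
    m = len(cols)
    # Gauss-Jordan on the normal equations of the cos/sin design matrix.
    G = [[sum(cols[i][n] * cols[j][n] for n in range(N)) for j in range(m)] +
         [sum(cols[i][n] * x[n] for n in range(N))] for i in range(m)]
    for i in range(m):
        G[i] = [v / G[i][i] for v in G[i]]
        for r in range(m):
            if r != i:
                G[r] = [a - G[r][i] * c for a, c in zip(G[r], G[i])]
    beta = [G[i][m] for i in range(m)]
    rss = sum((x[n] - sum(beta[j] * cols[j][n] for j in range(m))) ** 2
              for n in range(N))
    return N * math.log(rss / N) + 5 * k * math.log(N)

random.seed(7)
N = 128
x = [math.sin(2 * math.pi * 5 * n / N) + 0.8 * math.cos(2 * math.pi * 12 * n / N)
     + random.gauss(0, 0.05) for n in range(N)]
bics = {k: bic_for_order(x, k) for k in (1, 2, 3)}
best = min(bics, key=bics.get)
print(best)
```

With two sinusoids in the signal, the penalty should stop the score from rewarding a third, noise-fitting component.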

  2. Selected sports talent development models

    OpenAIRE

    Michal Vičar

    2017-01-01

Background: Sports talent in the Czech Republic is generally viewed as a static, stable phenomenon. This stands in contrast with the widespread practice in Anglo-Saxon countries, which emphasises its fluctuating nature, as reflected in the current models describing its development. Objectives: The aim is to introduce current models of talent development in sport. Methods: Comparison and analysis of the following models: Balyi - Long term athlete development model, Côté - Developmen...

  3. Method for Business Process Management System Selection

    NARCIS (Netherlands)

    Thijs van de Westelaken; Bas Terwee; Pascal Ravesteijn

    2013-01-01

In recent years business process management (BPM) and specifically information systems that support the analysis, design and execution of processes (also called business process management systems (BPMS)) are getting more attention. This has led to an increase in research on BPM and BPMS. However ...

  4. Selection of Sustainable Processes using Sustainability ...

    Science.gov (United States)

Chemical products can be obtained by process pathways involving varying amounts and types of resources, utilities, and byproduct formation. When competing process options are considered, such as the six processes for making methanol in this study, it is necessary to identify the most sustainable option. Sustainability of a chemical process is generally evaluated with indicators that require process and chemical property data. These indicators individually reflect the impacts of the process on areas of sustainability, such as the environment or society. In order to choose among several alternative processes, an overall comparative analysis is essential. Generally, net profit will show the most economic process; a mixed integer optimization problem can also be solved to identify the most economic among competing processes. This method uses economic optimization and leaves aside the environmental and societal impacts. To make a decision on the most sustainable process, the method presented here rationally aggregates the sustainability indicators into a single index called the sustainability footprint (De). Process flow and economic data were used to compute the indicator values. Results from the sustainability footprint (De) are compared with those from solving a mixed integer optimization problem. In order to identify the rank order of importance of the indicators, a multivariate analysis is performed using partial least squares variable importance in projection (PLS-VIP).
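One simple way to aggregate normalized indicators into a single comparative index, in the spirit of the sustainability footprint described above, is a weighted distance from the ideal profile. The processes, indicator values and weights below are hypothetical illustrations (the study itself derives indicator importance via PLS-VIP and a more elaborate aggregation):

```python
import math

def footprint(indicators, weights, benefit):
    # indicators: {process: [v1, v2, ...]}; benefit[j] is True when a larger
    # value of indicator j is better (e.g. profit), False for cost-type ones.
    cols = list(zip(*indicators.values()))
    norm = {}
    for name, vals in indicators.items():
        row = []
        for j, v in enumerate(vals):
            lo, hi = min(cols[j]), max(cols[j])
            s = (v - lo) / (hi - lo)          # min-max normalize to [0, 1]
            row.append(s if benefit[j] else 1 - s)
        norm[name] = row
    # Weighted distance from the ideal (all-ones) profile; smaller = better.
    return {name: math.sqrt(sum(w * (1 - s) ** 2 for w, s in zip(weights, row)))
            for name, row in norm.items()}

procs = {  # [net profit, energy intensity, CO2 emitted] -- invented values
    "reforming":    [9.0, 3.0, 2.5],
    "gasification": [6.0, 5.0, 4.0],
    "bio-route":    [4.0, 2.0, 1.0],
}
de = footprint(procs, weights=[0.5, 0.25, 0.25], benefit=[True, False, False])
best = min(de, key=de.get)
print(best, {k: round(v, 3) for k, v in de.items()})
```

With these made-up numbers the profit weight dominates, so the ranking shifts if the weights shift; that sensitivity is exactly why a principled weighting such as PLS-VIP matters.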

  5. Creativity: Intuitive processing outperforms deliberative processing in creative idea selection

    NARCIS (Netherlands)

    Zhu, Y.; Ritter, S.M.; Müller, B.C.N.; Dijksterhuis, A.J.

    2017-01-01

    Creative ideas are highly valued, and various techniques have been designed to maximize the generation of creative ideas. However, for actual implementation of creative ideas, the most creative ideas must be recognized and selected from a pool of ideas. Although idea generation and idea selection

  6. MODEL SELECTION FOR SPECTROPOLARIMETRIC INVERSIONS

    International Nuclear Information System (INIS)

    Asensio Ramos, A.; Manso Sainz, R.; Martínez González, M. J.; Socas-Navarro, H.; Viticchié, B.; Orozco Suárez, D.

    2012-01-01

    Inferring magnetic and thermodynamic information from spectropolarimetric observations relies on the assumption of a parameterized model atmosphere whose parameters are tuned by comparison with observations. Often, the choice of the underlying atmospheric model is based on subjective reasons. In other cases, complex models are chosen based on objective reasons (for instance, the necessity to explain asymmetries in the Stokes profiles) but it is not clear what degree of complexity is needed. The lack of an objective way of comparing models has, sometimes, led to opposing views of the solar magnetism because the inferred physical scenarios are essentially different. We present the first quantitative model comparison based on the computation of the Bayesian evidence ratios for spectropolarimetric observations. Our results show that there is not a single model appropriate for all profiles simultaneously. Data with moderate signal-to-noise ratios (S/Ns) favor models without gradients along the line of sight. If the observations show clear circular and linear polarization signals above the noise level, models with gradients along the line are preferred. As a general rule, observations with large S/Ns favor more complex models. We demonstrate that the evidence ratios correlate well with simple proxies. Therefore, we propose to calculate these proxies when carrying out standard least-squares inversions to allow for model comparison in the future.

  7. INNOVATION PROCESS MODELLING

    Directory of Open Access Journals (Sweden)

    JANUSZ K. GRABARA

    2011-01-01

Full Text Available Modelling phenomena in accordance with the structural approach enables one to simplify the observed relations and to present the classification grounds. An example may be a model of organisational structure identifying the logical relations between particular units and presenting the division of authority and work.

  8. 7 CFR 1469.6 - Enrollment criteria and selection process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Enrollment criteria and selection process. 1469.6... General Provisions § 1469.6 Enrollment criteria and selection process. (a) Selection and funding of... existing natural resource, environmental quality, and agricultural activity data along with other...

  9. VEMAP 1: Selected Model Results

    Data.gov (United States)

    National Aeronautics and Space Administration — The Vegetation/Ecosystem Modeling and Analysis Project (VEMAP) was a multi-institutional, international effort addressing the response of biogeography and...

  10. The linear utility model for optimal selection

    NARCIS (Netherlands)

    Mellenbergh, Gideon J.; van der Linden, Willem J.

    A linear utility model is introduced for optimal selection when several subpopulations of applicants are to be distinguished. Using this model, procedures are described for obtaining optimal cutting scores in subpopulations in quota-free as well as quota-restricted selection situations. The cutting

  11. VEMAP 1: Selected Model Results

    Data.gov (United States)

National Aeronautics and Space Administration — The Vegetation/Ecosystem Modeling and Analysis Project (VEMAP) was a multi-institutional, international effort addressing the response of biogeography and...

  12. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

The subject of this thesis is to develop a methodological framework that can systematically guide mathematical model building for better understanding of multi-enzyme processes. In this way, opportunities for process improvements can be identified by analyzing simulations of either existing ... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases efficiency of the model development process with respect to time and resources needed (fast and effective ... in the scientific literature. Reliable mathematical models of such multi-catalytic schemes can exploit the potential benefit of these processes. In this way, the best outcome of the process can be obtained understanding the types of modification that are required for process optimization. An effective evaluation ...

  13. Selection of classification models from repository of model for water ...

    African Journals Online (AJOL)

    This paper proposes a new technique, Model Selection Technique (MST) for selection and ranking of models from the repository of models by combining three performance measures (Acc, TPR and TNR). This technique provides weightage to each performance measure to find the most suitable model from the repository of ...

  14. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.

  15. Ground-water transport model selection and evaluation guidelines

    International Nuclear Information System (INIS)

    Simmons, C.S.; Cole, C.R.

    1983-01-01

    Guidelines are being developed to assist potential users with selecting appropriate computer codes for ground-water contaminant transport modeling. The guidelines are meant to assist managers with selecting appropriate predictive models for evaluating either arid or humid low-level radioactive waste burial sites. Evaluation test cases in the form of analytical solutions to fundamental equations and experimental data sets have been identified and recommended to ensure adequate code selection, based on accurate simulation of relevant physical processes. The recommended evaluation procedures will consider certain technical issues related to the present limitations in transport modeling capabilities. A code-selection plan will depend on identifying problem objectives, determining the extent of collectible site-specific data, and developing a site-specific conceptual model for the involved hydrology. Code selection will be predicated on steps for developing an appropriate systems model. This paper will review the progress in developing those guidelines. 12 references

  16. Processes in arithmetic strategy selection: a fMRI study.

    Science.gov (United States)

    Taillan, Julien; Ardiale, Eléonore; Anton, Jean-Luc; Nazarian, Bruno; Félician, Olivier; Lemaire, Patrick

    2015-01-01

    This neuroimaging (functional magnetic resonance imaging) study investigated neural correlates of strategy selection. Young adults performed an arithmetic task in two different conditions. In both conditions, participants had to provide estimates of two-digit multiplication problems like 54 × 78. In the choice condition, participants had to select the better of two available rounding strategies, rounding-up (RU) strategy (i.e., doing 60 × 80 = 4,800) or rounding-down (RD) strategy (i.e., doing 50 × 70 = 3,500 to estimate product of 54 × 78). In the no-choice condition, participants did not have to select strategy on each problem but were told which strategy to use; they executed RU and RD strategies each on a series of problems. Participants also had a control task (i.e., providing correct products of multiplication problems like 40 × 50). Brain activations and performance were analyzed as a function of these conditions. Participants were able to frequently choose the better strategy in the choice condition; they were also slower when they executed the difficult RU than the easier RD. Neuroimaging data showed greater brain activations in right anterior cingulate cortex (ACC), dorso-lateral prefrontal cortex (DLPFC), and angular gyrus (ANG), when selecting (relative to executing) the better strategy on each problem. Moreover, RU was associated with more parietal cortex activation than RD. These results suggest an important role of fronto-parietal network in strategy selection and have important implications for our further understanding and modeling cognitive processes underlying strategy selection.

  17. Processes in arithmetic strategy selection: A fMRI study.

    Directory of Open Access Journals (Sweden)

    Julien eTaillan

    2015-02-01

    This neuroimaging (fMRI) study investigated neural correlates of strategy selection. Young adults performed an arithmetic task in two different conditions. In both conditions, participants had to provide estimates of two-digit multiplication problems like 54 x 78. In the choice condition, participants had to select the better of two available rounding strategies, the rounding-up (RU) strategy (i.e., doing 60 x 80 = 4,800) or the rounding-down (RD) strategy (i.e., doing 50 x 70 = 3,500) to estimate the product of 54 x 78. In the no-choice condition, participants did not have to select a strategy on each problem but were told which strategy to use; they executed the RU and RD strategies each on a series of problems. Participants also had a control task (i.e., providing correct products of multiplication problems like 40 x 50). Brain activations and performance were analyzed as a function of these conditions. Participants were able to frequently choose the better strategy in the choice condition; they were also slower when they executed the difficult RU than the easier RD. Neuroimaging data showed greater brain activations in right anterior cingulate cortex (ACC), dorso-lateral prefrontal cortex (DLPFC), and angular gyrus (ANG) when selecting (relative to executing) the better strategy on each problem. Moreover, RU was associated with more parietal cortex activation than RD. These results suggest an important role of the fronto-parietal network in strategy selection and have important implications for our further understanding and modelling of cognitive processes underlying strategy selection.

  18. Economic assessment model architecture for AGC/AVLIS selection

    International Nuclear Information System (INIS)

    Hoglund, R.L.

    1984-01-01

    The economic assessment model architecture described provides the flexibility and completeness in economic analysis that the selection between AGC and AVLIS demands. Process models which are technology-specific will provide the first-order responses of process performance and cost to variations in process parameters. The economics models can be used to test the impacts of alternative deployment scenarios for a technology. Enterprise models provide global figures of merit for evaluating the DOE perspective on the uranium enrichment enterprise, and business analysis models compute the financial parameters from the private investor's viewpoint

  19. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  20. Chemical Process Modeling and Control.

    Science.gov (United States)

    Bartusiak, R. Donald; Price, Randel M.

    1987-01-01

    Describes some of the features of Lehigh University's (Pennsylvania) process modeling and control program. Highlights the creation and operation of the Chemical Process Modeling and Control Center (PMC). Outlines the program's philosophy, faculty, technical program, current research projects, and facilities. (TW)

  1. Chapter 1: Standard Model processes

    OpenAIRE

    Becher, Thomas

    2017-01-01

    This chapter documents the production rates and typical distributions for a number of benchmark Standard Model processes, and discusses new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  2. Intermediate product selection and blending in the food processing industry

    DEFF Research Database (Denmark)

    Kilic, Onur A.; Akkerman, Renzo; van Donk, Dirk Pieter

    2013-01-01

    This study addresses a capacitated intermediate product selection and blending problem typical for two-stage production systems in the food processing industry. The problem involves the selection of a set of intermediates and end-product recipes characterising how those selected intermediates...

  3. Using Card Games to Simulate the Process of Natural Selection

    Science.gov (United States)

    Grilliot, Matthew E.; Harden, Siegfried

    2014-01-01

    In 1858, Darwin published "On the Origin of Species by Means of Natural Selection." His explanation of evolution by natural selection has become the unifying theme of biology. We have found that many students do not fully comprehend the process of evolution by natural selection. We discuss a few simple games that incorporate hands-on…

  4. Goal, strategy and project selection for the radiation processing

    International Nuclear Information System (INIS)

    Ma, Z.; Zhou, L.

    1984-01-01

    In the past two decades, a number of results in the field of R/D of radiation processing have been attained in China. Many of them are of great value in application and have been put into pilot plant or trial production. From the viewpoint of system analysis, radiation processing as a technological system follows its own law of development, and is now entering a stage of rapid development in the areas of R/D and application. Two significant policies favour this development: first, that the development of science and technology should contribute to the development of the national economy; secondly, that in the development of atomic science and technology, emphasis should be shifted to civil purposes. These two policies are bound to greatly promote the R/D of radiation processing, especially its transfer and diffusion. The linkage between radiation processing and the national economy is now getting closer and closer, and many proposals for investment in this field have been put forward recently. It is therefore vital to investigate the development strategy, economic effectiveness, and project selection criteria of radiation processing. In this article, we study the present state and the characteristics of the technology transfer of radiation processing in China from the viewpoints of system analysis and of effectiveness. We also analyse this issue in the light of the concept of the life cycle of technology so as to establish, we hope, a realistic goal and an appropriate development strategy. Based on this, a model of project selection and suitable forms of diffusion and transfer are suggested.

  5. Model Identification of Integrated ARMA Processes

    Science.gov (United States)

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
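
    SCAN and ESACF are specialized pattern-identification tools; as a simpler stand-in for automated order identification, the sketch below fits AR(p) models by least squares to a synthetic AR(2) series and selects the order minimizing BIC. All data and parameter values here are synthetic:

```python
# Information-criterion-based AR order identification (a simple stand-in
# for SCAN/ESACF, which are more elaborate). Synthetic AR(2) data.
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) process: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t
n = 2000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()

def ar_bic(x, p):
    """BIC of a least-squares AR(p) fit (Gaussian likelihood, up to constants)."""
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    sigma2 = resid @ resid / len(y)
    return len(y) * np.log(sigma2) + np.log(len(y)) * p

bics = {p: ar_bic(x, p) for p in range(1, 6)}
best_order = min(bics, key=bics.get)  # with enough data, typically 2
```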

  6. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter, the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  7. Modeling nuclear processes by Simulink

    Science.gov (United States)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
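
    As a minimal stand-in for the Simulink decay-process block diagram, the same describing equation dN/dt = -λN can be integrated with a forward-Euler step in a few lines (the isotope and step size here are illustrative):

```python
# Forward-Euler integration of the radionuclide decay law dN/dt = -lam*N,
# compared against the exact solution N0*exp(-lam*t). Values illustrative.
import math

lam = math.log(2) / 8.02  # decay constant for a half-life of ~8.02 days (I-131)
N0 = 1.0e6                # initial number of atoms
dt = 0.01                 # time step, days
steps = 802               # integrate out to ~one half-life (8.02 days)

N = N0
for _ in range(steps):
    N -= lam * N * dt     # Euler update: N_{k+1} = N_k * (1 - lam*dt)

exact = N0 * math.exp(-lam * dt * steps)
rel_error = abs(N - exact) / exact  # shrinks as dt is refined
```

    After one half-life, N/N0 comes out close to 0.5; refining dt shrinks the Euler error, which is the bookkeeping a Simulink integrator block automates.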

  8. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  9. A Dynamic Model for Limb Selection

    NARCIS (Netherlands)

    Cox, R.F.A; Smitsman, A.W.

    2008-01-01

    Two experiments and a model on limb selection are reported. In Experiment 1 left-handed and right-handed participants (N = 36) repeatedly used one hand for grasping a small cube. After a clear switch in the cube’s location, perseverative limb selection was revealed in both handedness groups. In

  10. Review and selection of unsaturated flow models

    Energy Technology Data Exchange (ETDEWEB)

    Reeves, M.; Baker, N.A.; Duguid, J.O. [INTERA, Inc., Las Vegas, NV (United States)

    1994-04-04

    Since the 1960's, ground-water flow models have been used for analysis of water resources problems. In the 1970's, emphasis began to shift to analysis of waste management problems. This shift in emphasis was largely brought about by site selection activities for geologic repositories for disposal of high-level radioactive wastes. Model development during the 1970's and well into the 1980's focused primarily on saturated ground-water flow because geologic repositories in salt, basalt, granite, shale, and tuff were envisioned to be below the water table. Selection of the unsaturated zone at Yucca Mountain, Nevada, for potential disposal of waste began to shift model development toward unsaturated flow models. Under the US Department of Energy (DOE), the Civilian Radioactive Waste Management System Management and Operating Contractor (CRWMS M&O) has the responsibility to review, evaluate, and document existing computer models; to conduct performance assessments; and to develop performance assessment models, where necessary. This document describes the CRWMS M&O approach to model review and evaluation (Chapter 2), and the requirements for unsaturated flow models which are the bases for selection from among the current models (Chapter 3). Chapter 4 identifies existing models, and their characteristics. Through a detailed examination of characteristics, Chapter 5 presents the selection of models for testing. Chapter 6 discusses the testing and verification of selected models. Chapters 7 and 8 give conclusions and make recommendations, respectively. Chapter 9 records the major references for each of the models reviewed. Appendix A, a collection of technical reviews for each model, contains a more complete list of references. Finally, Appendix B characterizes the problems used for model testing.

  11. Review and selection of unsaturated flow models

    International Nuclear Information System (INIS)

    Reeves, M.; Baker, N.A.; Duguid, J.O.

    1994-01-01

    Since the 1960's, ground-water flow models have been used for analysis of water resources problems. In the 1970's, emphasis began to shift to analysis of waste management problems. This shift in emphasis was largely brought about by site selection activities for geologic repositories for disposal of high-level radioactive wastes. Model development during the 1970's and well into the 1980's focused primarily on saturated ground-water flow because geologic repositories in salt, basalt, granite, shale, and tuff were envisioned to be below the water table. Selection of the unsaturated zone at Yucca Mountain, Nevada, for potential disposal of waste began to shift model development toward unsaturated flow models. Under the US Department of Energy (DOE), the Civilian Radioactive Waste Management System Management and Operating Contractor (CRWMS M&O) has the responsibility to review, evaluate, and document existing computer models; to conduct performance assessments; and to develop performance assessment models, where necessary. This document describes the CRWMS M&O approach to model review and evaluation (Chapter 2), and the requirements for unsaturated flow models which are the bases for selection from among the current models (Chapter 3). Chapter 4 identifies existing models, and their characteristics. Through a detailed examination of characteristics, Chapter 5 presents the selection of models for testing. Chapter 6 discusses the testing and verification of selected models. Chapters 7 and 8 give conclusions and make recommendations, respectively. Chapter 9 records the major references for each of the models reviewed. Appendix A, a collection of technical reviews for each model, contains a more complete list of references. Finally, Appendix B characterizes the problems used for model testing.

  12. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    Many production processes are carried out in stages. At the end of each stage, the production engineer can analyze the intermediate results and correct process parameters (variables) of the next stage. Both analysis of the process and correction to the process parameters at the next stage should be performed regarding the foreseeable output property y, and with respect to an admissible range of correcting actions for the parameters of the next stage. In this paper the basic principles of path modeling are presented. The mathematics is presented for processes having only one stage, having two stages, and having three or more stages. The methods are applied to a process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables......

  13. Selection of basic data for numerical modeling of rock mass stress state at Mirny Mining and Processing Works, Alrosa Group of Companies

    Science.gov (United States)

    Bokiy, IB; Zoteev, OV; Pul, VV; Pul, EK

    2018-03-01

    The influence of structural features on the strength and elasticity modulus is studied in rock mass in the area of Mirny Mining and Processing Works. The authors make recommendations on the values of physical properties of rocks.

  14. Selection of water treatment processes special study

    International Nuclear Information System (INIS)

    1991-11-01

    Characterization of the level and extent of groundwater contamination in the vicinity of Title I mill sites began during the surface remedial action stage (Phase 1) of the Uranium Mill Tailings Remedial Action (UMTRA) Project. Some of the contamination in the aquifer(s) at the abandoned sites is attributable to milling activities during the years the mills were in operation. The restoration of contaminated aquifers is to be undertaken in Phase II of the UMTRA Project. To begin implementation of Phase II, DOE requested that groundwater restoration methods and technologies be investigated by the Technical Assistance Contractor (TAC), and that the results of the TAC investigations be documented in special study reports. Many active and passive methods are available to clean up contaminated groundwater. Passive groundwater treatment includes natural flushing, geochemical barriers, and gradient manipulation by stream diversion or slurry walls. Active groundwater cleanup techniques include gradient manipulation by well extraction or injection, in-situ biological or chemical reclamation, and extraction and treatment. Although some or all of the methods listed above may play a role in the groundwater cleanup phase of the UMTRA Project, the extraction and treatment (pump and treat) option is the only restoration alternative discussed in this report. Hence, all sections of this report relate either directly or indirectly to the technical discipline of process engineering.

  15. The Added Value of the Project Selection Process

    Directory of Open Access Journals (Sweden)

    Adel Oueslati

    2016-06-01

    The project selection process comes in the first stage of the overall project management life cycle, and it has a very important impact on organization success. The present paper provides definitions of the basic concepts and tools related to the project selection process. It aims to stress the added value of this process for the success of the entire organization. Mastery of the project selection process is the right way for any organization to ensure that it will do the right project with the right resources at the right time and within the right priorities.

  16. Genetic search feature selection for affective modeling

    DEFF Research Database (Denmark)

    Martínez, Héctor P.; Yannakakis, Georgios N.

    2010-01-01

    Automatic feature selection is a critical step towards the generation of successful computational models of affect. This paper presents a genetic search-based feature selection method which is developed as a global-search algorithm for improving the accuracy of the affective models built....... The method is tested and compared against sequential forward feature selection and random search in a dataset derived from a game survey experiment which contains bimodal input features (physiological and gameplay) and expressed pairwise preferences of affect. Results suggest that the proposed method...

  17. A SUPPLIER SELECTION MODEL FOR SOFTWARE DEVELOPMENT OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Hancu Lucian-Viorel

    2010-12-01

    This paper presents a multi-criteria decision-making model used for supplier selection for software development outsourcing on e-marketplaces. This model can be used in auctions. The supplier selection process has become complex and difficult over the last twenty years, since the Internet plays an important role in business management. Companies have to concentrate their efforts on their core activities, and the other activities should be realized by outsourcing. They can achieve significant cost reductions by using e-marketplaces in their purchase process and by using decision support systems for supplier selection. Many approaches to the supplier evaluation and selection process have been proposed in the literature. The performance of potential suppliers is evaluated using multi-criteria decision-making methods rather than by considering a single factor, cost.

  18. Selecting public relations personnel of hospitals by analytic network process.

    Science.gov (United States)

    Liao, Sen-Kuei; Chang, Kuei-Lun

    2009-01-01

    This study describes the use of the analytic network process (ANP) in the Taiwanese hospital public relations personnel selection process. Starting with interviews of 48 practitioners and executives in north Taiwan, we collected selection criteria. We then retained the 12 critical criteria that were mentioned more than 40 times by these respondents: interpersonal skill, experience, negotiation, language, ability to follow orders, cognitive ability, adaptation to environment, adaptation to company, emotion, loyalty, attitude, and response. Finally, we discussed with the 20 executives how to group these important criteria into three perspectives so as to structure the hierarchy for hospital public relations personnel selection. After discussing with practitioners and executives, we found that the selection criteria are interrelated. The ANP, which incorporates interdependence relationships, is a new approach for multi-criteria decision-making. Thus, we apply ANP to select the most suitable public relations personnel for hospitals. An empirical study of public relations personnel selection problems in Taiwan hospitals is conducted to illustrate how the selection procedure works.
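
    Numerically, the ANP synthesis step raises a column-stochastic "supermatrix" of pairwise influence priorities to a high power until its columns converge to the limit priorities. The 3x3 matrix below is a made-up toy example, not data from this study:

```python
# Limit-supermatrix computation at the heart of ANP. The priorities in
# this supermatrix are invented for illustration only.
import numpy as np

# Column j holds the priorities of all elements from element j's
# perspective; each column sums to 1 (column-stochastic).
supermatrix = np.array([
    [0.0, 0.5, 0.3],
    [0.6, 0.0, 0.7],
    [0.4, 0.5, 0.0],
])

W = supermatrix.copy()
for _ in range(100):        # power iteration; repeated squaring also works
    W = W @ supermatrix

limit_priorities = W[:, 0]  # all columns are (near) identical at convergence
best = int(np.argmax(limit_priorities))
```

    The element with the largest limit priority is the one the network as a whole ranks highest once all interdependencies have propagated.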

  19. Markov Decision Process Measurement Model.

    Science.gov (United States)

    LaMar, Michelle M

    2018-03-01

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than a partial-credit IRT model based on outcome data alone.
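
    The underlying formalism is the standard (S, A, P, R, γ) Markov decision process. The toy two-state sketch below (not the paper's game) runs value iteration to recover optimal state values and a greedy policy:

```python
# Value iteration on a toy 2-state, 2-action MDP; all numbers invented.
import numpy as np

gamma = 0.9
# P[a][s, s']: transition probabilities; R[a][s]: expected immediate reward.
P = {
    "stay": np.array([[0.9, 0.1], [0.1, 0.9]]),
    "move": np.array([[0.2, 0.8], [0.8, 0.2]]),
}
R = {
    "stay": np.array([0.0, 1.0]),
    "move": np.array([0.5, 0.5]),
}

V = np.zeros(2)
for _ in range(500):                        # iterate the Bellman backup
    Q = {a: R[a] + gamma * P[a] @ V for a in P}
    V = np.maximum(Q["stay"], Q["move"])

# Greedy policy with respect to the converged action values.
policy = ["stay" if Q["stay"][s] >= Q["move"][s] else "move" for s in range(2)]
```

    In a measurement setting such as the paper's, latent traits would presumably enter through the reward or decision parameters, and observed action sequences would be scored against the resulting policy.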

  20. The genealogy of samples in models with selection.

    Science.gov (United States)

    Neuhauser, C; Krone, S M

    1997-02-01

    We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, then the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as the K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case.
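
    A forward-in-time counterpart to the two-allele model with symmetric mutation and selection can be simulated in a few lines of Wright-Fisher sampling (this is not the ancestral selection graph itself, and the parameter values are illustrative):

```python
# Wright-Fisher simulation of a two-allele haploid model with symmetric
# mutation and a selective advantage s for allele A. Toy parameters.
import random

random.seed(1)

N = 1000        # haploid population size
s = 0.05        # selective advantage of allele A
mu = 0.001      # symmetric mutation rate A <-> a
generations = 500

p = 0.5         # frequency of allele A
for _ in range(generations):
    # Selection: weight A-individuals by 1 + s.
    p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))
    # Symmetric mutation in both directions.
    p_mut = p_sel * (1 - mu) + (1 - p_sel) * mu
    # Random reproduction: binomial sampling of the next generation.
    p = sum(random.random() < p_mut for _ in range(N)) / N

# With s much larger than mu, allele A settles near fixation
# (mutation-selection balance); once frequencies are in equilibrium,
# sampled genealogies resemble the neutral case, as noted above.
```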

  1. Simple Models for Process Control

    Czech Academy of Sciences Publication Activity Database

    Gorez, R.; Klán, Petr

    2011-01-01

    Roč. 22, č. 2 (2011), s. 58-62 ISSN 0929-2268 Institutional research plan: CEZ:AV0Z10300504 Keywords : process models * PID control * second order dynamics Subject RIV: JB - Sensors, Measurment, Regulation

  2. Model selection for Gaussian kernel PCA denoising

    DEFF Research Database (Denmark)

    Jørgensen, Kasper Winther; Hansen, Lars Kai

    2012-01-01

    We propose kernel Parallel Analysis (kPA) for automatic kernel scale and model order selection in Gaussian kernel PCA. Parallel Analysis [1] is based on a permutation test for covariance and has previously been applied for model order selection in linear PCA, we here augment the procedure to also...... tune the Gaussian kernel scale of radial basis function based kernel PCA.We evaluate kPA for denoising of simulated data and the US Postal data set of handwritten digits. We find that kPA outperforms other heuristics to choose the model order and kernel scale in terms of signal-to-noise ratio (SNR...
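
    The classic (linear) Parallel Analysis step that kPA builds on can be sketched as follows: permute each column of the data independently, recompute the PCA eigenvalues, and retain components whose observed eigenvalues exceed a high percentile of the permutation distribution. The data here are synthetic, with two planted components:

```python
# Linear Parallel Analysis by column permutation (the procedure kPA
# generalizes to kernel PCA). Synthetic data with 2 planted components.
import numpy as np

rng = np.random.default_rng(42)

n, d = 300, 10
latent = rng.standard_normal((n, 2))
# Two fixed, orthogonal signal directions of equal strength.
mixing = np.vstack([np.ones(d), np.tile([1.0, -1.0], d // 2)])
X = latent @ mixing + 0.5 * rng.standard_normal((n, d))
X -= X.mean(axis=0)

def pca_eigenvalues(A):
    return np.sort(np.linalg.eigvalsh(np.cov(A.T)))[::-1]

obs = pca_eigenvalues(X)

n_perm = 50
null = np.empty((n_perm, d))
for b in range(n_perm):
    # Permuting each column independently destroys correlations but
    # preserves the marginal distributions.
    Xp = np.column_stack([rng.permutation(X[:, j]) for j in range(d)])
    null[b] = pca_eigenvalues(Xp)

threshold = np.percentile(null, 95, axis=0)
n_components = int(np.sum(obs > threshold))
```

    kPA applies the same permutation test to the kernel matrix, which lets it tune the kernel scale as well as the model order.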

  3. Melody Track Selection Using Discriminative Language Model

    Science.gov (United States)

    Wu, Xiao; Li, Ming; Suo, Hongbin; Yan, Yonghong

    In this letter we focus on the task of selecting the melody track from a polyphonic MIDI file. Based on the intuition that music and language are similar in many aspects, we solve the selection problem by introducing an n-gram language model to learn the melody co-occurrence patterns in a statistical manner and determine the melodic degree of a given MIDI track. Furthermore, we propose the idea of using background model and posterior probability criteria to make modeling more discriminative. In the evaluation, the achieved 81.6% correct rate indicates the feasibility of our approach.
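
    The core scoring idea can be sketched with a smoothed bigram model over toy interval sequences (the real system works on MIDI tracks and richer features; everything below is invented for illustration):

```python
# Add-alpha smoothed bigram scoring of candidate tracks; the track with
# the highest average log-probability is selected as the melody. Toy data.
import math
from collections import Counter

def bigrams(seq):
    return list(zip(seq, seq[1:]))

def train_bigram_model(sequences, alpha=1.0):
    """Return a smoothed log P(b | a) estimated from training sequences."""
    counts, context, vocab = Counter(), Counter(), set()
    for seq in sequences:
        vocab.update(seq)
        for a, b in bigrams(seq):
            counts[(a, b)] += 1
            context[a] += 1
    def logprob(a, b):
        return math.log((counts[(a, b)] + alpha) /
                        (context[a] + alpha * len(vocab)))
    return logprob

def melodic_degree(track, logprob):
    """Average bigram log-probability of a track."""
    bs = bigrams(track)
    return sum(logprob(a, b) for a, b in bs) / len(bs)

# Toy training melodies as pitch-interval sequences (stepwise motion).
melodies = [[1, 1, -1, 1, 1, -1, 1], [-1, 1, 1, -1, 1, -1, -1]]
logprob = train_bigram_model(melodies)

candidates = {
    "melody-like":   [1, -1, 1, 1, -1, 1],
    "accompaniment": [7, 0, 7, 0, 7, 0],  # repeated leaps, unlike melodies
}
scores = {name: melodic_degree(t, logprob) for name, t in candidates.items()}
selected = max(scores, key=scores.get)
```

    The letter's background-model and posterior-probability refinement would replace this raw score with a ratio against a model trained on non-melody tracks, making the decision more discriminative.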

  4. Bayesian site selection for fast Gaussian process regression

    KAUST Repository

    Pourhabib, Arash

    2014-02-05

    Gaussian Process (GP) regression is a popular method in the field of machine learning and computer experiment designs; however, its ability to handle large data sets is hindered by the computational difficulty in inverting a large covariance matrix. Likelihood approximation methods were developed as a fast GP approximation, thereby reducing the computation cost of GP regression by utilizing a much smaller set of unobserved latent variables called pseudo points. This article reports a further improvement to the likelihood approximation methods by simultaneously deciding both the number and locations of the pseudo points. The proposed approach is a Bayesian site selection method where both the number and locations of the pseudo inputs are parameters in the model, and the Bayesian model is solved using a reversible jump Markov chain Monte Carlo technique. Through a number of simulated and real data sets, it is demonstrated that with appropriate priors chosen, the Bayesian site selection method can produce a good balance between computation time and prediction accuracy: it is fast enough to handle large data sets that a full GP is unable to handle, and it improves, quite often remarkably, the prediction accuracy, compared with the existing likelihood approximations. © 2014 Taylor and Francis Group, LLC.

  5. Selection Criteria in Regime Switching Conditional Volatility Models

    Directory of Open Access Journals (Sweden)

    Thomas Chuffart

    2015-05-01

    Full Text Available A large number of nonlinear conditional heteroskedastic models have been proposed in the literature. Model selection is crucial to any statistical data analysis. In this article, we investigate whether the most commonly used selection criteria lead to the choice of the right specification in a regime switching framework. We focus on two types of models: the Logistic Smooth Transition GARCH and the Markov-Switching GARCH models. Simulation experiments reveal that information criteria and loss functions can lead to misspecification; BIC sometimes indicates the wrong regime switching framework. Depending on the Data Generating Process used in the experiments, great care is needed when choosing a criterion.

  6. Methods for model selection in applied science and engineering.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2004-10-01

    Mathematical models are developed and used to study the properties of complex systems and/or modify these systems to satisfy some performance requirements in just about every area of applied science and engineering. A particular reason for developing a model, e.g., performance assessment or design, is referred to as the model use. Our objective is the development of a methodology for selecting a model that is sufficiently accurate for an intended use. Information on the system being modeled is, in general, incomplete, so that there may be two or more models consistent with the available information. The collection of these models is called the class of candidate models. Methods are developed for selecting the optimal member from a class of candidate models for the system. The optimal model depends on the available information, the selected class of candidate models, and the model use. Classical methods for model selection, including the method of maximum likelihood and Bayesian methods, as well as a method employing a decision-theoretic approach, are formulated to select the optimal model for numerous applications. There is no requirement that the candidate models be random. Classical methods for model selection ignore model use and require data to be available. Examples are used to show that these methods can be unreliable when data is limited. The decision-theoretic approach to model selection does not have these limitations, and model use is included through an appropriate utility function. This is especially important when modeling high risk systems, where the consequences of using an inappropriate model for the system can be disastrous. The decision-theoretic method for model selection is developed and applied for a series of complex and diverse applications. These include the selection of the: (1) optimal order of the polynomial chaos approximation for non-Gaussian random variables and stationary stochastic processes, (2) optimal pressure load model to be

  7. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model is developed for supplying raw materials to processing enterprises that belong to a vertically integrated structure for the production and processing of raw milk. The model is distinguished by its orientation toward the cumulative effect achieved by the integrated structure, which serves as the criterion function; this function is maximized by optimizing capacities, the volumes and quality characteristics of raw material deliveries, the costs of industrial processing of raw materials, and the demand for dairy products.

  8. Modeling HIV-1 drug resistance as episodic directional selection.

    Directory of Open Access Journals (Sweden)

    Ben Murrell

    Full Text Available The evolution of substitutions conferring drug resistance to HIV-1 is both episodic, occurring when patients are on antiretroviral therapy, and strongly directional, with site-specific resistant residues increasing in frequency over time. While methods exist to detect episodic diversifying selection and continuous directional selection, no evolutionary model combining these two properties has been proposed. We present two models of episodic directional selection (MEDS and EDEPS), which allow the a priori specification of lineages expected to have undergone directional selection. The models infer the sites and target residues that were likely subject to directional selection, using either codon or protein sequences. Compared to its null model of episodic diversifying selection, MEDS provides a superior fit to most sites known to be involved in drug resistance, and neither a test for episodic diversifying selection nor one for constant directional selection is able to detect as many true positives as MEDS and EDEPS while maintaining acceptable levels of false positives. This suggests that episodic directional selection is a better description of the process driving the evolution of drug resistance.

  9. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...

  10. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative...

  11. Model structure selection in convolutive mixtures

    DEFF Research Database (Denmark)

    Dyrholm, Mads; Makeig, Scott; Hansen, Lars Kai

    2006-01-01

    The CICAAR algorithm (convolutive independent component analysis with an auto-regressive inverse model) allows separation of white (i.i.d.) source signals from convolutive mixtures. We introduce a source color model as a simple extension to the CICAAR which allows for a more parsimonious...... representation in many practical mixtures. The new filter-CICAAR allows Bayesian model selection and can help answer questions like: 'Are we actually dealing with a convolutive mixture?'. We try to answer this question for EEG data....

  12. EIS and adjunct electrical modeling for material selection by evaluating two mild steels for use in super-alkaline mineral processing

    DEFF Research Database (Denmark)

    Bakhtiyari, Leila; Moghimi, Fereshteh; Mansouri, Seyed Soheil

    2012-01-01

    The production of metal concentrates during mineral processing of ferrous and non-ferrous metals involves a variety of highly corrosive chemicals which deteriorate common mild steel as the material of choice in the construction of such lines, through rapid propagation of localized pitting...... in susceptible parts, often in sensitive areas. This requires unscheduled maintenance and plant shut down. In order to test the corrosion resistance of different available materials as replacement materials, polarization and electrochemical impedance spectroscopy (EIS) tests were carried out. The EIS numerical...... software-enhanced polarization resistance, and reduced capacitance added to much diminished current densities, verified the acceptable performance of CK45 compared with high priced stainless steel substitutes with comparable operational life. Therefore, CK45 can be a suitable alternative in steel...

  13. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the development of the subject-features of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates the qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Two steps for singling out a particular stage are offered, together with the algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretical research, analyses of educational practices and for realistic prediction of pedagogical phenomena.

  14. Risk calculations in the manufacturing technology selection process

    DEFF Research Database (Denmark)

    Farooq, S.; O'Brien, C.

    2010-01-01

    Purpose - The purpose of this paper is to present result obtained from a developed technology selection framework and provide a detailed insight into the risk calculations and their implications in manufacturing technology selection process. Design/methodology/approach - The results illustrated...... in the shape of opportunities and threats in different decision-making environments. Practical implications - The research quantifies the risk associated with different available manufacturing technology alternatives. This quantification of risk crystallises the process of technology selection decision making...... and supports an industrial manager in achieving objective and comprehensive decisions regarding selection of a manufacturing technology. Originality/value - The paper explains the process of risk calculation in manufacturing technology selection by dividing the decision-making environment into manufacturing......

  15. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  16. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

    integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined processes perspectives. Second, the paper translates this information into a business process model and mathematical formalization. The study provides a useful guide to the various relevant technology-related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  17. Natural Selection Is a Sorting Process: What Does that Mean?

    Science.gov (United States)

    Price, Rebecca M.

    2013-01-01

    To learn why natural selection acts only on existing variation, students categorize processes as either creative or sorting. This activity helps students confront the misconception that adaptations evolve because species need them.

  18. Natural Selection as an Emergent Process: Instructional Implications

    Science.gov (United States)

    Cooper, Robert A.

    2017-01-01

    Student reasoning about cases of natural selection is often plagued by errors that stem from miscategorising selection as a direct, causal process, misunderstanding the role of randomness, and from the intuitive ideas of intentionality, teleology and essentialism. The common thread throughout many of these reasoning errors is a failure to apply…

  19. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    Full Text Available When developing an information system, it is important to create clear models and choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule-specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages. It presents a theoretical comparison of business rules and business process modeling languages, compares the different business process modeling languages and business rule representation languages according to selected modeling aspects, and selects the set of languages that best fits a three-layer framework for business-rule-based software modeling.

  20. A Hybrid Multiple Criteria Decision Making Model for Supplier Selection

    Directory of Open Access Journals (Sweden)

    Chung-Min Wu

    2013-01-01

    Full Text Available Sustainable supplier selection is a vital part of the management of a sustainable supply chain. In this study, a hybrid multiple criteria decision making (MCDM) model is applied to select the optimal supplier. The fuzzy Delphi method, which can lead to better criteria selection, is used to modify the criteria. Considering the interdependence among the selection criteria, the analytic network process (ANP) is then used to obtain their weights. To avoid the calculations and additional pairwise comparisons of ANP, a technique for order preference by similarity to ideal solution (TOPSIS) is used to rank the alternatives. The use of a combination of the fuzzy Delphi method, ANP, and TOPSIS, proposing an MCDM model for supplier selection, and applying these to a real case are the unique features of this study.
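    The TOPSIS ranking step can be sketched as follows. The criteria, weights, and supplier scores here are hypothetical; the fuzzy Delphi and ANP stages that would supply the weights are omitted:

    ```python
    import numpy as np

    def topsis(matrix, weights, benefit):
        """Rank alternatives by relative closeness to the ideal solution."""
        M = np.asarray(matrix, dtype=float)
        norm = M / np.linalg.norm(M, axis=0)      # vector-normalize each criterion column
        V = norm * weights                        # apply criterion weights
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_pos = np.linalg.norm(V - ideal, axis=1)
        d_neg = np.linalg.norm(V - anti, axis=1)
        return d_neg / (d_pos + d_neg)            # higher = closer to ideal

    # Three suppliers scored on quality (benefit criterion) and cost (cost criterion)
    scores = topsis([[8, 200], [7, 150], [9, 300]],
                    weights=[0.6, 0.4], benefit=[True, False])
    best = int(np.argmax(scores))
    ```

    With these toy numbers the mid-quality, low-cost supplier wins the ranking, illustrating how TOPSIS trades off benefit and cost criteria.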

  1. Evidence accumulation as a model for lexical selection.

    Science.gov (United States)

    Anders, R; Riès, S; van Maanen, L; Alario, F X

    2015-11-01

    We propose and demonstrate evidence accumulation as a plausible theoretical and/or empirical model for the lexical selection process of lexical retrieval. A number of current psycholinguistic theories consider lexical selection as a process related to selecting a lexical target from a number of alternatives, which each have varying activations (or signal supports), that are largely resultant of an initial stimulus recognition. We thoroughly present a case for how such a process may be theoretically explained by the evidence accumulation paradigm, and we demonstrate how this paradigm can be directly related or combined with conventional psycholinguistic theory and their simulatory instantiations (generally, neural network models). Then with a demonstrative application on a large new real data set, we establish how the empirical evidence accumulation approach is able to provide parameter results that are informative to leading psycholinguistic theory, and that motivate future theoretical development. Copyright © 2015 Elsevier Inc. All rights reserved.
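    A minimal sketch of the evidence-accumulation idea — independent noisy accumulators racing to a threshold, with the winner taken as the selected lexical item — under assumed drift rates and noise, not the authors' fitted model:

    ```python
    import random

    def race(activations, threshold=1.0, noise=0.05, dt=0.01, seed=0):
        """Simulate a race of noisy evidence accumulators.

        Each accumulator drifts at its activation rate plus Gaussian noise;
        the first to reach the threshold determines the response and its time.
        Returns (winning index, response time).
        """
        rng = random.Random(seed)
        evidence = [0.0] * len(activations)
        t = 0.0
        while True:
            t += dt
            for i, drift in enumerate(activations):
                evidence[i] += drift * dt + rng.gauss(0.0, noise) * dt ** 0.5
                if evidence[i] >= threshold:
                    return i, t

    # The target word has higher activation than its two competitors
    winner, rt = race([1.5, 0.8, 0.7])
    ```

    In the empirical approach described in the abstract, the drift rates and threshold would be estimated from response-time data rather than fixed as here.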

  2. Sparse model selection via integral terms

    Science.gov (United States)

    Schaeffer, Hayden; McCalla, Scott G.

    2017-08-01

    Model selection and parameter estimation are important for the effective integration of experimental data, scientific theory, and precise simulations. In this work, we develop a learning approach for the selection and identification of a dynamical system directly from noisy data. The learning is performed by extracting a small subset of important features from an overdetermined set of possible features using a nonconvex sparse regression model. The sparse regression model is constructed to fit the noisy data to the trajectory of the dynamical system while using the smallest number of active terms. Computational experiments detail the model's stability, robustness to noise, and recovery accuracy. Examples include nonlinear equations, population dynamics, chaotic systems, and fast-slow systems.
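    A closely related (though not identical) approach, sequentially thresholded least squares, illustrates how sparse regression can recover the active terms of a dynamical system from an overdetermined feature library; the library and dynamics below are toy assumptions:

    ```python
    import numpy as np

    def sparse_fit(Theta, dxdt, threshold=0.1, iters=10):
        """Sequentially thresholded least squares: fit, zero small coefficients, refit."""
        xi = np.linalg.lstsq(Theta, dxdt, rcond=None)[0]
        for _ in range(iters):
            small = np.abs(xi) < threshold
            xi[small] = 0.0
            big = ~small
            if big.any():
                xi[big] = np.linalg.lstsq(Theta[:, big], dxdt, rcond=None)[0]
        return xi

    # Recover dx/dt = -2x + 0.5x^3 from noisy data and the library [1, x, x^2, x^3]
    rng = np.random.default_rng(0)
    x = rng.uniform(-2, 2, 200)
    dxdt = -2 * x + 0.5 * x ** 3 + rng.normal(0, 0.01, x.size)
    Theta = np.column_stack([np.ones_like(x), x, x ** 2, x ** 3])
    xi = sparse_fit(Theta, dxdt)
    ```

    The inactive features (the constant and quadratic terms) are driven exactly to zero, leaving only the two true terms of the dynamics.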

  3. Adverse selection model regarding tobacco consumption

    Directory of Open Access Journals (Sweden)

    Dumitru MARIN

    2006-01-01

    Full Text Available The impact of introducing a tax on tobacco consumption can be studied through an adverse selection model. The objective of the model presented in the following is to characterize the optimal contractual relationship between the governmental authorities and the two types of employees, smokers and non-smokers, taking into account that the consumers' decision to smoke or not represents an element of risk and uncertainty. Two scenarios are run using the General Algebraic Modeling Systems software: one without taxes on tobacco consumption and another with taxes on tobacco consumption, based on the adverse selection model described previously. The results of the two scenarios are compared at the end of the paper: the wage earnings levels and the social welfare in the case of a smoking agent and in the case of a non-smoking agent.

  4. Modeling and Selection of Software Service Variants

    OpenAIRE

    Wittern, John Erik

    2015-01-01

    Providers and consumers have to deal with variants, meaning alternative instances of a service's design, implementation, deployment, or operation, when developing or delivering software services. This work presents service feature modeling to deal with associated challenges, comprising a language to represent software service variants and a set of methods for modeling and subsequent variant selection. This work's evaluation includes a POC implementation and two real-life use cases.

  5. Testing exclusion restrictions and additive separability in sample selection models

    DEFF Research Database (Denmark)

    Huber, Martin; Mellace, Giovanni

    2014-01-01

    Standard sample selection models with non-randomly censored outcomes assume (i) an exclusion restriction (i.e., a variable affecting selection, but not the outcome) and (ii) additive separability of the errors in the selection process. This paper proposes tests for the joint satisfaction of these assumptions by applying the approach of Huber and Mellace (Testing instrument validity for LATE identification based on inequality moment constraints, 2011) (for testing instrument validity under treatment endogeneity) to the sample selection framework. We show that the exclusion restriction and additive separability imply two testable inequality constraints that come from both point identifying and bounding the outcome distribution of the subpopulation that is always selected/observed. We apply the tests to two variables for which the exclusion restriction is frequently invoked in female wage regressions: non...

  6. ASYMMETRIC PRICE TRANSMISSION MODELING: THE IMPORTANCE OF MODEL COMPLEXITY AND THE PERFORMANCE OF THE SELECTION CRITERIA

    Directory of Open Access Journals (Sweden)

    Henry de-Graft Acquah

    2013-01-01

    Full Text Available Information criteria provide an attractive basis for selecting the best model from a set of competing asymmetric price transmission models or theories. However, little is understood about the sensitivity of the model selection methods to model complexity. This study therefore fits competing asymmetric price transmission models that differ in complexity to simulated data and evaluates the ability of the model selection methods to recover the true model. The results of Monte Carlo experimentation suggest that, in general, BIC, CAIC and DIC were superior to AIC when the true data generating process was the standard error correction model, whereas AIC was more successful when the true model was the complex error correction model. It is also shown that the model selection methods performed better in large samples for a complex asymmetric data generating process than with a standard asymmetric data generating process. Except for complex models, AIC's performance did not make substantial gains in recovery rates as sample size increased. The research findings demonstrate the influence of model complexity in asymmetric price transmission model comparison and selection.
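    The criteria themselves are straightforward to compute from a fitted model's log-likelihood; the log-likelihoods below are hypothetical numbers chosen to show how AIC and BIC can disagree when one candidate is more complex:

    ```python
    import math

    def aic(loglik, k):
        """Akaike information criterion: 2k - 2 log L."""
        return 2 * k - 2 * loglik

    def bic(loglik, k, n):
        """Bayesian information criterion: k log n - 2 log L."""
        return k * math.log(n) - 2 * loglik

    # Hypothetical fits to n = 500 observations: (log-likelihood, number of parameters)
    n = 500
    candidates = {"standard ECM": (-612.4, 4), "complex ECM": (-605.0, 9)}
    best_aic = min(candidates, key=lambda m: aic(*candidates[m]))
    best_bic = min(candidates, key=lambda m: bic(candidates[m][0], candidates[m][1], n))
    ```

    With these numbers AIC prefers the complex model while BIC's heavier penalty (log n per parameter instead of 2) selects the standard one, mirroring the sensitivity to complexity studied in the article.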

  7. Model Selection in Data Analysis Competitions

    DEFF Research Database (Denmark)

    Wind, David Kofoed; Winther, Ole

    2014-01-01

    The use of data analysis competitions for selecting the most appropriate model for a problem is a recent innovation in the field of predictive machine learning. Two of the most well-known examples of this trend were the Netflix Competition and, more recently, the competitions hosted on the online platform...

  8. Efficiently adapting graphical models for selectivity estimation

    DEFF Research Database (Denmark)

    Tzoumas, Kostas; Deshpande, Amol; Jensen, Christian S.

    2013-01-01

    in estimation accuracy. We show how to efficiently construct such a graphical model from the database using only two-way join queries, and we show how to perform selectivity estimation in a highly efficient manner. We integrate our algorithms into the PostgreSQL DBMS. Experimental results indicate...

  9. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

    Full Text Available This paper presents a model for an integrated security system, which can be implemented in any organization. It is based on security-specific standards and taxonomies as ISO 7498-2 and Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality and also we focus on the specific components.

  10. morphology of the anterior clinoid process in a select kenyan ...

    African Journals Online (AJOL)

    2018-02-28

    Feb 28, 2018 ... populations, the anterior clinoid process in our setting shows some differences involving its type and the caroticoclinoid ... Cheruiyot I, Munguti J, Kigera J, Gikenye G. Morphology of the anterior clinoid process in a select Kenyan population. Anatomy Journal of ... Means, standard deviations and range.

  11. 15 CFR 296.20 - The selection process.

    Science.gov (United States)

    2010-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS TECHNOLOGY INNOVATION PROGRAM The Competition Process § 296.20 The selection process. (a) To begin a competition, the Program... regarding that competition, including the areas of critical national need that proposals must address. An...

  12. Computationally efficient thermal-mechanical modelling of selective laser melting

    NARCIS (Netherlands)

    Yang, Y.; Ayas, C.; Brabazon, Dermot; Naher, Sumsun; Ul Ahad, Inam

    2017-01-01

    Selective laser melting (SLM) is a powder-based additive manufacturing (AM) method to produce high-density metal parts with complex topology. However, part distortions and the accompanying residual stresses deteriorate the mechanical reliability of SLM products. Modelling of the SLM process is

  13. PROPOSAL OF AN EMPIRICAL MODEL FOR SUPPLIERS SELECTION

    Directory of Open Access Journals (Sweden)

    Paulo Ávila

    2015-03-01

    Full Text Available The problem of selecting suppliers/partners is a crucial and important part of the decision-making process for companies that intend to perform competitively in their area of activity. The selection of a supplier/partner is a time- and resource-consuming task that involves data collection and a careful analysis of the factors that can positively or negatively influence the choice. Nevertheless, it is a critical process that significantly affects the operational performance of each company. In this work, through the literature review, five broad supplier selection criteria were identified: Quality, Financial, Synergies, Cost, and Production System. Within these criteria, five sub-criteria were also included. Thereafter, a survey was elaborated and companies were contacted in order to answer which factors have more relevance in their decisions to choose suppliers. After interpreting the results and processing the data, a linear weighting model was adopted to reflect the importance of each factor. The model has a hierarchical structure and can be applied with the Analytic Hierarchy Process (AHP) method or the Simple Multi-Attribute Rating Technique (SMART). The result of the research undertaken by the authors is a reference model that represents a decision-making support for the supplier/partner selection process.
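    The linear weighting idea, in its SMART form, reduces to a weighted sum of criterion ratings. The weights and supplier ratings below are hypothetical, chosen only to illustrate the calculation:

    ```python
    def smart_score(weights, ratings):
        """Simple Multi-Attribute Rating Technique: weighted sum of ratings."""
        total = sum(weights.values())
        return sum(weights[c] / total * ratings[c] for c in weights)

    # Hypothetical criterion weights (e.g., from a survey) and 0-10 supplier ratings
    weights = {"Quality": 30, "Financial": 15, "Synergies": 10,
               "Cost": 25, "Production System": 20}
    supplier_a = {"Quality": 8, "Financial": 6, "Synergies": 7,
                  "Cost": 5, "Production System": 9}
    supplier_b = {"Quality": 6, "Financial": 8, "Synergies": 5,
                  "Cost": 9, "Production System": 6}
    ratings = {"A": supplier_a, "B": supplier_b}
    best = max(ratings, key=lambda s: smart_score(weights, ratings[s]))
    ```

    The hierarchical AHP variant mentioned in the abstract would instead derive the weights from pairwise comparisons at each level of the criteria tree.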

  14. Mathematical modelling in economic processes.

    Directory of Open Access Journals (Sweden)

    L.V. Kravtsova

    2008-06-01

    Full Text Available This article considers a number of methods for the mathematical modelling of economic processes, and the possibilities of using Excel spreadsheets to obtain optimal solutions to problems or to calculate financial operations with the help of built-in functions.

  15. Introduction to gas lasers with emphasis on selective excitation processes

    CERN Document Server

    Willett, Colin S

    1974-01-01

    Introduction to Gas Lasers: Population Inversion Mechanisms focuses on important processes in gas discharge lasers and basic atomic collision processes that operate in a gas laser. Organized into six chapters, this book first discusses the historical development and basic principles of gas lasers. Subsequent chapters describe the selective excitation processes in gas discharges and the specific neutral, ionized and molecular laser systems. This book will be a valuable reference on the behavior of gas-discharge lasers to anyone already in the field.

  16. Model selection criterion in survival analysis

    Science.gov (United States)

    Karabey, Uǧur; Tutkun, Nihal Ata

    2017-07-01

    Survival analysis deals with the time until the occurrence of an event of interest such as death, recurrence of an illness, the failure of equipment, or divorce. There are various survival models with semi-parametric or parametric approaches used in the medical, natural or social sciences. The decision on the most appropriate model for the data is an important point of the analysis. In the literature, the Akaike information criterion or the Bayesian information criterion is used to select among nested models. In this study, the behavior of these information criteria is discussed for a real data set.

  17. Variable selection in Logistic regression model with genetic algorithm.

    Science.gov (United States)

    Zhang, Zhongheng; Trevino, Victor; Hoseini, Sayed Shahabuddin; Belciug, Smaranda; Boopathi, Arumugam Manivanna; Zhang, Ping; Gorunescu, Florin; Subha, Velappan; Dai, Songshi

    2018-02-01

    Variable or feature selection is one of the most important steps in model specification. Especially in the case of medical decision making, the direct use of a medical database, without a previous analysis and preprocessing step, is often counterproductive. In this way, variable selection represents the method of choosing the most relevant attributes from the database in order to build robust learning models and, thus, to improve the performance of the models used in the decision process. In biomedical research, the purpose of variable selection is to select clinically important and statistically significant variables, while excluding unrelated or noise variables. A variety of methods exist for variable selection, but none of them is without limitations. For example, the stepwise approach, which is highly used, adds the best variable in each cycle, generally producing an acceptable set of variables. Nevertheless, it is limited by the fact that it commonly becomes trapped in local optima. The best subset approach can systematically search the entire covariate pattern space, but the solution pool can be extremely large with tens to hundreds of variables, which is the case in today's clinical data. Genetic algorithms (GA) are heuristic optimization approaches and can be used for variable selection in multivariable regression models. This tutorial paper aims to provide a step-by-step approach to the use of GA in variable selection. The R code provided in the text can be extended and adapted to other data analysis needs.
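    A minimal GA for subset selection can be sketched as follows. As a stand-in for the logistic likelihood, a BIC-style score on a least-squares fit is used, and all data are synthetic; the paper's own R code would differ in these details:

    ```python
    import numpy as np

    def fitness(mask, X, y):
        """BIC-style score (lower is better); a linear fit stands in for the
        logistic likelihood to keep the sketch self-contained."""
        if not mask.any():
            return np.inf
        Xs = X[:, mask]
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss = np.sum((y - Xs @ beta) ** 2)
        n, k = X.shape[0], mask.sum()
        return n * np.log(rss / n) + k * np.log(n)

    def ga_select(X, y, pop=20, gens=30, seed=1):
        """Evolve binary inclusion masks by elitism, crossover and mutation."""
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        P = rng.integers(0, 2, (pop, d)).astype(bool)
        for _ in range(gens):
            scores = np.array([fitness(m, X, y) for m in P])
            P = P[np.argsort(scores)]                    # elite half survives
            children = []
            for _ in range(pop // 2):
                a, b = P[rng.integers(0, pop // 2, 2)]   # parents from elite half
                cut = rng.integers(1, d)
                child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
                children.append(child ^ (rng.random(d) < 0.1))  # bit-flip mutation
            P[pop // 2:] = children
        scores = np.array([fitness(m, X, y) for m in P])
        return P[np.argmin(scores)]

    # Synthetic data: only features 0 and 3 actually drive the outcome
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8))
    y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(0, 0.5, 200)
    mask = ga_select(X, y)
    ```

    The BIC-style penalty discourages the GA from including noise variables, addressing the local-optima weakness of stepwise selection noted above.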

  18. On Using Selection Procedures with Binomial Models.

    Science.gov (United States)

    1983-10-01

    OCR-garbled scan of the report documentation page; little of the abstract is legible. Recoverable fragments include the report title, "On Using Selection Procedures with Binomial Models" (technical report), and a reference: Gupta, S. S. and Sobel, M. (1960). Selecting a subset containing the best of several... (eds.), Shinko Tsusho Co. Ltd., Tokyo, Japan, pp. 501-533.

  19. Aerosol model selection and uncertainty modelling by adaptive MCMC technique

    Directory of Open Access Journals (Sweden)

    M. Laine

    2008-12-01

    Full Text Available We present a new technique for the model selection problem in atmospheric remote sensing. The technique is based on Monte Carlo sampling and it allows model selection, calculation of model posterior probabilities and model averaging in a Bayesian way.

    The algorithm developed here is called the Adaptive Automatic Reversible Jump Markov chain Monte Carlo method (AARJ). It uses the Markov chain Monte Carlo (MCMC) technique and its extension called Reversible Jump MCMC. Both of these techniques have been used extensively in statistical parameter estimation problems in a wide area of applications since the late 1990s. The novel feature in our algorithm is the fact that it is fully automatic and easy to use.

    We show how the AARJ algorithm can be implemented and used for model selection and averaging, and to directly incorporate the model uncertainty. We demonstrate the technique by applying it to the statistical inversion problem of gas profile retrieval of GOMOS instrument on board the ENVISAT satellite. Four simple models are used simultaneously to describe the dependence of the aerosol cross-sections on wavelength. During the AARJ estimation all the models are used and we obtain a probability distribution characterizing how probable each model is. By using model averaging, the uncertainty related to selecting the aerosol model can be taken into account in assessing the uncertainty of the estimates.
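    The model-averaging step can be approximated outside MCMC with BIC-based weights, a rough stand-in for the posterior model probabilities that the AARJ sampler produces; all numbers below are hypothetical:

    ```python
    import math

    def model_weights(bics):
        """Approximate posterior model probabilities from BIC values:
        w_i proportional to exp(-0.5 * (BIC_i - BIC_min))."""
        best = min(bics)
        w = [math.exp(-0.5 * (b - best)) for b in bics]
        s = sum(w)
        return [x / s for x in w]

    # Hypothetical BIC values for four aerosol cross-section models
    bics = [1012.3, 1010.1, 1015.8, 1011.0]
    weights = model_weights(bics)

    # Model-averaged estimate of some retrieved quantity (per-model values assumed)
    predictions = [2.10, 2.25, 1.90, 2.18]
    averaged = sum(w * p for w, p in zip(weights, predictions))
    ```

    As in the abstract, the averaged estimate folds the uncertainty about which aerosol model is correct into the final retrieval, instead of committing to a single model.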

  20. Additive Manufacturing Processes: Selective Laser Melting, Electron Beam Melting and Binder Jetting-Selection Guidelines.

    Science.gov (United States)

    Gokuldoss, Prashanth Konda; Kolla, Sri; Eckert, Jürgen

    2017-06-19

    Additive manufacturing (AM), also known as 3D printing or rapid prototyping, is gaining increasing attention due to its ability to produce parts with added functionality and increased complexities in geometrical design, on top of the fact that it is theoretically possible to produce any shape without limitations. However, most of the research on additive manufacturing techniques is focused on the development of materials/process parameters/products design with different additive manufacturing processes such as selective laser melting, electron beam melting, or binder jetting. Yet we do not have any guidelines that discuss the selection of the most suitable additive manufacturing process, depending on the material to be processed, the complexity of the parts to be produced, or the design considerations. Considering the very fact that no reports deal with this process selection, the present manuscript aims to discuss the different selection criteria that are to be considered in order to select the best AM process (binder jetting/selective laser melting/electron beam melting) for fabricating a specific component with a defined set of material properties.

  2. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  3. Post-model selection inference and model averaging

    Directory of Open Access Journals (Sweden)

    Georges Nguefack-Tsague

    2011-07-01

    Full Text Available Although model selection is routinely used in practice nowadays, little is known about its precise effects on any subsequent inference that is carried out. The same goes for the effects induced by the closely related technique of model averaging. This paper is concerned with the use of the same data first to select a model and then to carry out inference, in particular point estimation and point prediction. The properties of the resulting estimator, called a post-model-selection estimator (PMSE), are hard to derive. Using selection criteria such as hypothesis testing, AIC, BIC, HQ and Cp, we illustrate that, in terms of risk function, no single PMSE dominates the others. The same conclusion holds more generally for any penalised likelihood information criterion. We also compare various model averaging schemes and show that no single one dominates the others in terms of risk function. Since PMSEs can be regarded as a special case of model averaging, with 0-1 random weights, we propose a connection between the two theories, in the frequentist approach, by taking account of the selection procedure when performing model averaging. We illustrate the point by simulating a simple linear regression model.
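
    The behaviour of a post-model-selection estimator is easy to reproduce in a small simulation. The sketch below is my own illustration, not the paper's code: AIC chooses between an intercept-only and a full simple linear regression model, and the risk of the resulting slope PMSE is estimated by Monte Carlo. The 0-1 random-weight character of the estimator shows up as the share of replications in which the slope is set to zero.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fit_ols(X, y):
        """Return OLS coefficients and residual sum of squares."""
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return beta, float(resid @ resid)

    def aic(rss, n, k):
        """Gaussian AIC up to an additive constant."""
        return n * np.log(rss / n) + 2 * k

    def pmse_slope(x, y):
        """Post-model-selection estimate of the slope: AIC picks between the
        intercept-only and the full model, so the returned slope is either 0
        or the OLS slope (a 0-1 random weight on the full-model estimate)."""
        n = len(y)
        X0 = np.ones((n, 1))                    # intercept-only model
        X1 = np.column_stack([np.ones(n), x])   # intercept + slope model
        _, rss0 = fit_ols(X0, y)
        b1, rss1 = fit_ols(X1, y)
        return 0.0 if aic(rss0, n, 1) <= aic(rss1, n, 2) else float(b1[1])

    # Monte Carlo risk for a small true slope, where selection effects bite
    true_beta = 0.15
    est = []
    for _ in range(2000):
        x = rng.normal(size=50)
        y = 1.0 + true_beta * x + rng.normal(size=50)
        est.append(pmse_slope(x, y))
    est = np.array(est)
    print("MC risk (MSE) of the PMSE:", np.mean((est - true_beta) ** 2))
    print("share of replications with slope set to 0:", np.mean(est == 0.0))
    ```

    Varying `true_beta` shows the point of the abstract: for slopes near the selection threshold the PMSE switches erratically between 0 and the OLS estimate, and no single criterion's PMSE dominates in risk.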

  4. Hydrological scenarios for two selected Alpine catchments for the 21st century using a stochastic weather generator and enhanced process understanding for modelling of seasonal snow and glacier melt for improved water resources management

    Science.gov (United States)

    Strasser, Ulrich; Schneeberger, Klaus; Dabhi, Hetal; Dubrovsky, Martin; Hanzer, Florian; Marke, Thomas; Oberguggenberger, Michael; Rössler, Ole; Schmieder, Jan; Rotach, Mathias; Stötter, Johann; Weingartner, Rolf

    2016-04-01

    The overall objective of HydroGeM³ is to quantify and assess both water demand and water supply in two coupled human-environment mountain systems, i.e. the Lütschine in Switzerland and the Ötztaler Ache in Austria. Special emphasis is laid on the analysis of possible future seasonal water scarcity. The hydrological response of high Alpine catchments is characterised by a strong seasonal variability, with low runoff in winter and high runoff in spring and summer. Climate change is expected to cause a seasonal shift of the runoff regime and thus has a significant impact on both the amount and the timing of the release of the available water resources and, thereby, on possible future water conflicts. In order to identify and quantify the contribution of snow and ice melt as well as rain to runoff, streamflow composition will be analysed with natural tracers. The results of the field investigations will help to improve the snow and ice melt and runoff modules of two selected hydrological models (i.e. AMUNDSEN and WaSiM) which are used to investigate the seasonal water availability under current and future climate conditions. Together, they comprise improved descriptions of boundary layer and surface melt processes (AMUNDSEN) and of streamflow runoff generation (WaSiM). Future meteorological forcing for the modelling until the end of the century will be provided by both a stochastic multi-site weather generator and downscaled climate model output. Both approaches will use EURO-CORDEX data as input. The water demand in the selected study areas is quantified for the relevant societal sectors, e.g. agriculture, hydropower generation and (winter) tourism. The comparison of water availability and water demand under current and future climate conditions will allow the identification of possible seasonal bottlenecks of future water supply and resulting conflicts. Thus these investigations can provide a quantitative basis for the development of strategies for sustainable water management in

  5. A process for selection and training of super-users for ERP implementation projects

    DEFF Research Database (Denmark)

    Danielsen, Peter; Sandfeld Hansen, Kenneth; Helt, Mads

    2017-01-01

    The concept of super-users as a means to facilitate ERP implementation projects has recently taken a foothold in practice, but is still largely overlooked in research. In particular, little is known about the selection and training processes required to successfully develop skilled super-users in practice. To address this research gap, we analyze the case of an ERP implementation program at a large manufacturing company. We combine Katz’s widely accepted skill measurement model with the process observed in practice to describe and test a model of super-user selection and training. The resulting model contains a systematic process of super-user development and highlights the specific skillsets required in different phases of the selection and training process. Our results from a comparative assessment of management expectations and super-user skills in the ERP program show that the model can...

  6. Impact of Redox on Glass Durability: The Glass Selection Process

    International Nuclear Information System (INIS)

    PEELER, DAVID

    2004-01-01

    Recent glass formulation activities have focused on developing alternative frit compositions for use with specific sludge batches to maximize melt rate and/or waste throughput. The general trend has been to increase the total alkali content in the glass through the use of a high alkali based frit, a less washed sludge, or a combination of the two. As a result, predictions of durability have become a limiting factor in defining the projected operating windows for the Defense Waste Processing Facility (DWPF) for certain systems. An additional issue for these high alkali glasses has been the effect of REDuction/OXidation (REDOX) on the durability of the glass. Recent analyses have indicated that the application of the durability model's value without consideration of the overall glass composition may lead to a more significant shift (larger magnitude) than needed. Therefore, activation of the REDOX term in the Product Composition Control System (PCCS) may have a significant impact on the predicted operational windows based on model predictions, but may not represent the realistic impact on the measured durability. In this report, two specific issues are addressed. First, a review of the data used to develop PCCS (in particular the durability model) showed the potential for a REDOX interaction that is not accounted for. More specifically, three terms were added to the current model and were found to be statistically significant at a confidence level of 95 per cent. These results suggest a possible interaction between REDOX and glass composition that is not accurately captured leading to potentially conservative decisions regarding the durability of reduced glasses. The second issue addressed in this report is the development of a 45 glass test matrix to assess the effect of REDOX on durability as well as to provide insight into specific interactive compositional effects on durability. The glasses were selected to support the assessment of the following specific

  7. Model structure selection in convolutive mixtures

    DEFF Research Database (Denmark)

    Dyrholm, Mads; Makeig, S.; Hansen, Lars Kai

    2006-01-01

    The CICAAR algorithm (convolutive independent component analysis with an auto-regressive inverse model) allows separation of white (i.i.d.) source signals from convolutive mixtures. We introduce a source color model as a simple extension to the CICAAR which allows for a more parsimonious representation in many practical mixtures. The new filter-CICAAR allows Bayesian model selection and can help answer questions like: ’Are we actually dealing with a convolutive mixture?’. We try to answer this question for EEG data.

  8. Skewed factor models using selection mechanisms

    KAUST Repository

    Kim, Hyoung-Moon

    2015-12-21

    Traditional factor models explicitly or implicitly assume that the factors follow a multivariate normal distribution; that is, only moments up to order two are involved. However, it may happen in real data problems that the first two moments cannot explain the factors. Based on this motivation, here we devise three new skewed factor models, the skew-normal, the skew-t, and the generalized skew-normal factor models, depending on a selection mechanism on the factors. The ECME algorithms are adopted to estimate related parameters for statistical inference. Monte Carlo simulations validate our new models and we demonstrate the need for skewed factor models using the classic open/closed book exam scores dataset.

  9. TOURIST HOTEL LOCATION SELECTION WITH ANALYTIC HIERARCHY PROCESS

    OpenAIRE

    Kundakçı, Nilsen; Aytaç Adalı, Esra; Tuş Işık, Ayşegül

    2015-01-01

    Selection of tourist hotel location is a multi-criteria decision making (MCDM) problem and has strategic importance for the hotel management. In the tourism sector, location selection decisions are critical as they are costly and difficult to reverse, and entail a long term commitment. For this reason, in this study the AHP (Analytic Hierarchy Process) method is proposed to help the hotel management to select the most proper location for their new tourist hotel investment. In the application part, sele...

  10. Chemical identification using Bayesian model selection

    Energy Technology Data Exchange (ETDEWEB)

    Burr, Tom; Fry, H. A. (Herbert A.); McVey, B. D. (Brian D.); Sander, E. (Eric)

    2002-01-01

    Remote detection and identification of chemicals in a scene is a challenging problem. We introduce an approach that uses some of the image's pixels to establish the background characteristics while other pixels represent the target for which we seek to identify all chemical species present. This leads to a generalized least squares problem in which we focus on 'subset selection' to identify the chemicals thought to be present. Bayesian model selection allows us to approximate the posterior probability that each chemical in the library is present by adding the posterior probabilities of all the subsets which include the chemical. We present results using realistic simulated data for the case with 1 to 5 chemicals present in each target and compare performance to a hybrid of forward and backward stepwise selection procedure using the F statistic.
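
    The subset-posterior idea can be illustrated on a toy linear model. The snippet below is a hedged sketch, not the authors' generalized least squares formulation: it scores every subset of a small candidate "library" with a BIC approximation to the marginal likelihood, normalizes the weights, and sums the weights of all subsets containing each candidate to approximate its posterior probability of presence.

    ```python
    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(1)

    # Toy "library" of 4 candidate signatures; only candidates 0 and 2 are
    # truly present in the observed spectrum y.
    n, p = 200, 4
    library = rng.normal(size=(n, p))
    y = 1.5 * library[:, 0] + 0.8 * library[:, 2] + 0.3 * rng.normal(size=n)

    def bic(cols):
        """BIC of the least-squares fit using only the columns in `cols`."""
        if cols:
            X = library[:, cols]
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
        else:
            resid = y
        rss = float(resid @ resid)
        return n * np.log(rss / n) + len(cols) * np.log(n)

    # Enumerate all 2^p subsets; exp(-BIC/2) approximates each subset's
    # marginal likelihood (shifted by the minimum for numerical stability).
    subsets = [list(c) for k in range(p + 1) for c in combinations(range(p), k)]
    bics = np.array([bic(s) for s in subsets])
    w = np.exp(-0.5 * (bics - bics.min()))
    w /= w.sum()

    # Posterior probability that candidate j is present = sum of the
    # posterior probabilities of all subsets that include j.
    presence = np.array([sum(wi for wi, s in zip(w, subsets) if j in s)
                         for j in range(p)])
    print(np.round(presence, 3))
    ```

    With strong signal-to-noise, the presence probabilities for candidates 0 and 2 approach one while the spurious candidates stay small; exhaustive enumeration is only feasible for small libraries, which is why the paper compares against stepwise selection.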

  11. Attribute based selection of thermoplastic resin for vacuum infusion process

    DEFF Research Database (Denmark)

    Prabhakaran, R.T. Durai; Lystrup, Aage; Løgstrup Andersen, Tom

    2011-01-01

    The composite industry looks toward a new material system (resins) based on thermoplastic polymers for the vacuum infusion process, similar to the infusion process using thermosetting polymers. A large number of thermoplastics are available in the market with a variety of properties suitable for different engineering applications, and few of those are available in a not yet polymerised form suitable for resin infusion. The proper selection of a new resin system among these thermoplastic polymers is a concern for manufacturers in the current scenario and a special mathematical tool would be beneficial. In this paper, the authors introduce a new decision making tool for resin selection based on significant attributes. This article provides a broad overview of suitable thermoplastic material systems for the vacuum infusion process available in today’s market. An illustrative example—resin selection...

  12. Reserve selection using nonlinear species distribution models.

    Science.gov (United States)

    Moilanen, Atte

    2005-06-01

    Reserve design is concerned with optimal selection of sites for new conservation areas. Spatial reserve design explicitly considers the spatial pattern of the proposed reserve network and the effects of that pattern on reserve cost and/or ability to maintain species there. The vast majority of reserve selection formulations have assumed a linear problem structure, which effectively means that the biological value of a potential reserve site does not depend on the pattern of selected cells. However, spatial population dynamics and autocorrelation cause the biological values of neighboring sites to be interdependent. Habitat degradation may have indirect negative effects on biodiversity in areas neighboring the degraded site as a result of, for example, negative edge effects or lower permeability for animal movement. In this study, I present a formulation and a spatial optimization algorithm for nonlinear reserve selection problems in grid-based landscapes that accounts for interdependent site values. The method is demonstrated using habitat maps and nonlinear habitat models for threatened birds in the Netherlands, and it is shown that near-optimal solutions are found for regions consisting of up to hundreds of thousands of grid cells, a landscape size much larger than those commonly attempted even with linear reserve selection formulations.
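
    The interdependence of neighbouring site values can be made concrete with a toy objective and a greedy heuristic. The sketch below is an illustration only, not the paper's spatial optimization algorithm: each selected grid cell earns its base biological value plus a bonus per selected 4-neighbour, so the objective is nonlinear in the selection pattern and greedy selection tends to form clusters.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Toy grid landscape: H x W cells with random base values; a fixed
    # budget of cells may be selected, and adjacency earns a bonus.
    H, W, BUDGET, BONUS = 10, 10, 20, 0.3
    base = rng.random((H, W))

    def neighbours(r, c):
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= r + dr < H and 0 <= c + dc < W:
                yield r + dr, c + dc

    def network_value(selected):
        """Nonlinear objective: base values plus neighbour-adjacency bonuses.
        Each adjacency is seen from both ends, hence the division by 2."""
        total = sum(base[r, c] for r, c in selected)
        total += BONUS * sum(1 for r, c in selected
                             for nb in neighbours(r, c) if nb in selected) / 2
        return total

    # Greedy heuristic: repeatedly add the cell with the largest marginal gain.
    selected = set()
    for _ in range(BUDGET):
        best, gain = None, -1.0
        for r in range(H):
            for c in range(W):
                if (r, c) in selected:
                    continue
                g = network_value(selected | {(r, c)}) - network_value(selected)
                if g > gain:
                    best, gain = (r, c), g
        selected.add(best)

    print(len(selected), round(network_value(selected), 3))
    ```

    Because the marginal gain of a cell rises once its neighbours are selected, a purely linear formulation (ignoring `BONUS`) would generally pick a different, more scattered set of cells.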

  13. Multi-dimensional model order selection

    Directory of Open Access Journals (Sweden)

    Roemer Florian

    2011-01-01

    Full Text Available Multi-dimensional model order selection (MOS) techniques achieve an improved accuracy, reliability, and robustness, since they consider all dimensions jointly during the estimation of parameters. Additionally, from fundamental identifiability results of multi-dimensional decompositions, it is known that the number of main components can be larger when compared to matrix-based decompositions. In this article, we show how to use tensor calculus to extend matrix-based MOS schemes and we also present our proposed multi-dimensional model order selection scheme based on the closed-form PARAFAC algorithm, which is only applicable to multi-dimensional data. In general, as shown by means of simulations, the Probability of correct Detection (PoD) of our proposed multi-dimensional MOS schemes is much better than the PoD of matrix-based schemes.
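
    The tensor-based scheme of the article does not fit in a short snippet, but the matrix-based MOS baseline it extends can be sketched. The following is a standard eigenvalue-based rule, the Wax-Kailath MDL criterion, shown here only as a point of reference for what "matrix-based MOS" means; it is not the closed-form PARAFAC scheme proposed in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def mdl_order(X):
        """Wax-Kailath MDL estimate of the number of sources from
        snapshot matrix X of shape (sensors M, snapshots N)."""
        M, N = X.shape
        R = X @ X.conj().T / N                        # sample covariance
        lam = np.sort(np.linalg.eigvalsh(R))[::-1]    # descending eigenvalues
        scores = []
        for k in range(M):
            tail = lam[k:]                            # "noise" eigenvalues
            geo = np.exp(np.mean(np.log(tail)))       # geometric mean
            ari = np.mean(tail)                       # arithmetic mean
            logL = -N * (M - k) * np.log(geo / ari)
            penalty = 0.5 * k * (2 * M - k) * np.log(N)
            scores.append(logL + penalty)
        return int(np.argmin(scores))

    # 3 sources mixed onto 8 sensors plus weak noise
    M, N, d = 8, 1000, 3
    A = rng.normal(size=(M, d))
    S = rng.normal(size=(d, N))
    X = A @ S + 0.1 * rng.normal(size=(M, N))
    print(mdl_order(X))
    ```

    A multi-dimensional MOS scheme applies this kind of reasoning jointly across all tensor dimensions instead of to a single matrix unfolding, which is where the improved Probability of correct Detection comes from.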

  14. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes amorphous experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  15. A simple parametric model selection test

    OpenAIRE

    Susanne M. Schennach; Daniel Wilhelm

    2014-01-01

    We propose a simple model selection test for choosing among two parametric likelihoods which can be applied in the most general setting without any assumptions on the relation between the candidate models and the true distribution. That is, both, one, or neither is allowed to be correctly specified or misspecified; they may be nested, non-nested, strictly non-nested or overlapping. Unlike in previous testing approaches, no pre-testing is needed, since in each case, the same test statistic to...

  16. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail

    2015-11-20

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.

  17. Novel metrics for growth model selection.

    Science.gov (United States)

    Grigsby, Matthew R; Di, Junrui; Leroux, Andrew; Zipunnikov, Vadim; Xiao, Luo; Crainiceanu, Ciprian; Checkley, William

    2018-01-01

    Literature surrounding the statistical modeling of childhood growth data involves a diverse set of potential models from which investigators can choose. However, the lack of a comprehensive framework for comparing non-nested models leads to difficulty in assessing model performance. This paper proposes a framework for comparing non-nested growth models using novel metrics of predictive accuracy based on modifications of the mean squared error criteria. Three metrics were created: normalized, age-adjusted, and weighted mean squared error (MSE). Predictive performance metrics were used to compare linear mixed effects models and functional regression models. Prediction accuracy was assessed by partitioning the observed data into training and test datasets. This partitioning was constructed to assess prediction accuracy for backward (i.e., early growth), forward (i.e., late growth), in-range, and on new-individuals. Analyses were done with height measurements from 215 Peruvian children with data spanning from near birth to 2 years of age. Functional models outperformed linear mixed effects models in all scenarios tested. In particular, prediction errors for functional concurrent regression (FCR) and functional principal component analysis models were approximately 6% lower when compared to linear mixed effects models. When we weighted subject-specific MSEs according to subject-specific growth rates during infancy, we found that FCR was the best performer in all scenarios. With this novel approach, we can quantitatively compare non-nested models and weight subgroups of interest to select the best performing growth model for a particular application or problem at hand.
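
    The abstract does not reproduce the exact definitions of the three metrics, so the following is only a generic sketch of a subject-weighted MSE in the same spirit: per-subject squared errors are averaged first, then combined with normalized subject-level weights (for instance, growth rates during infancy). The function name and weighting scheme are my own illustration, not the paper's.

    ```python
    import numpy as np

    def weighted_mse(y_true, y_pred, subject_ids, subject_weights):
        """Average per-subject MSEs, weighting each subject by a
        subject-level quantity. `subject_weights` maps subject id to a
        nonnegative weight; weights are renormalized to sum to one."""
        y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
        ids = np.asarray(subject_ids)
        errs, wts = [], []
        for sid in np.unique(ids):
            mask = ids == sid
            errs.append(np.mean((y_true[mask] - y_pred[mask]) ** 2))
            wts.append(subject_weights[sid])
        wts = np.asarray(wts, dtype=float)
        return float(np.dot(wts / wts.sum(), errs))

    # Two children: raising child "b"'s weight shifts the metric toward
    # child "b"'s prediction errors.
    y_true = [60.0, 62.0, 58.0, 59.5]
    y_pred = [60.5, 61.0, 58.0, 60.5]
    ids    = ["a",  "a",  "b",  "b"]
    print(weighted_mse(y_true, y_pred, ids, {"a": 1.0, "b": 1.0}))
    print(weighted_mse(y_true, y_pred, ids, {"a": 1.0, "b": 2.0}))
    ```

    Because the metric is computed per subject before averaging, subjects with many measurements do not automatically dominate, which is one reason such modifications help when comparing non-nested growth models.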

  18. Using AHP for Selecting the Best Wastewater Treatment Process

    Directory of Open Access Journals (Sweden)

    AbdolReza Karimi

    2011-01-01

    Full Text Available In this paper, the Analytical Hierarchy Process (AHP) method, which is based on expert knowledge, is used for the selection of the optimal anaerobic wastewater treatment process in industrial estates. This method can be applied to complicated multi-criteria decision making to obtain reasonable results. The different anaerobic processes employed in Iranian industrial estates consist of UASB, UAFB, ABR, the contact process, and anaerobic lagoons. Based on the general conditions in wastewater treatment plants in industrial estates and on expert judgments, and using technical, economic, environmental, and administrative criteria, the processes are weighted and the results obtained are assessed using the Expert Choice software. Finally, the five processes investigated are ranked 1 to 5 in the descending order UAFB, ABR, UASB, anaerobic lagoon, and contact process. Sensitivity analysis showing the effects of input parameters on changes in the results was applied for the technical, economic, environmental, and administrative criteria.
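
    The core AHP computation referred to above is standard and can be sketched briefly. The pairwise comparison matrix below is illustrative, not taken from the paper: criterion priorities are the normalized principal eigenvector of the comparison matrix, and the consistency ratio checks that the expert judgments are acceptably coherent.

    ```python
    import numpy as np

    # Pairwise comparison matrix for 3 hypothetical criteria on Saaty's
    # 1-9 scale (entries are illustrative, not from the paper).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 3.0],
                  [1/5, 1/3, 1.0]])

    # Priority weights = principal right eigenvector, normalized to sum to 1.
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()

    # Consistency index CI = (lambda_max - n)/(n - 1); dividing by Saaty's
    # random index RI (0.58 for n = 3) gives the consistency ratio, where
    # CR < 0.1 is conventionally considered acceptable.
    n = A.shape[0]
    lambda_max = vals.real[k]
    CI = (lambda_max - n) / (n - 1)
    CR = CI / 0.58
    print(np.round(w, 3), round(CR, 3))
    ```

    In a full AHP application such as the wastewater study, the same computation is repeated for the alternatives under each criterion, and the criterion weights combine those local scores into the overall ranking.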

  19. The Formalization of the Business Process Modeling Goals

    Directory of Open Access Journals (Sweden)

    Ligita Bušinska

    2016-10-01

    Full Text Available In business process modeling the de facto standard BPMN has emerged. However, applications of this notation use many subsets of its elements and various extensions. BPMN also coexists with many other modeling languages, forming a large set of available options for business process modeling languages and dialects. While, in general, the modeler's goal is central to the choice of modeling languages and notations, most studies that propose guidelines, techniques, and methods for business process modeling language evaluation and/or selection do not formalize the business process modeling goal or take it into account transparently. To overcome this gap, and to explicate and help to handle business process modeling complexity, an approach to formalizing the business process modeling goal, and a supporting three-dimensional business process modeling framework, are proposed.

  20. Concepts of radiation processes selection for industrial realization. Chapter 6

    International Nuclear Information System (INIS)

    1997-01-01

    For the selection of radiation processes for industrial use, processes are usually analyzed in terms of their technological and social effects, power intensity, and overall efficiency. The technological effect generally stems from the uniqueness of radiation technologies, which make it possible to obtain new materials, or existing ones with new properties. The social effect concerns, first of all, the influence of radiation technologies on consumer psychology. Implementing equipment for a radiation technological process, whether for producing new materials or for the radiation treatment of natural materials, involves solving three tasks: 1) choice of the radiation source; 2) creation of special equipment for the radiation and other non-traditional stages of the process; 3) selection of radiation and other conditions that ensure optimal technological and economic performance.

  1. Selecting food process designs from a supply chain perspective

    NARCIS (Netherlands)

    Jonkman, Jochem; Bloemhof-Ruwaard, Jacqueline; Vorst, van der Jack G.A.J.; Padt, van der Albert

    2017-01-01

    The food industry can convert agro-materials into products using many alternative process designs. To remain competitive, companies have to select the design leading to the best supply chain performance. These designs differ in the technologies used and the product portfolio produced.

  2. Effect of Thermo-extrusion Process Parameters on Selected Quality ...

    African Journals Online (AJOL)

    Effect of Thermo-extrusion Process Parameters on Selected Quality Attributes of Meat Analogue from Mucuna Bean Seed Flour. ... Nigerian Food Journal ... The product functional responses with coefficients of determination (R2) ranging between 0.658 and 0.894 were most affected by changes in barrel temperature and ...

  3. The use of Analytical Hierarchy Process (AHP) in selecting a ...

    African Journals Online (AJOL)

    Appropriate maintenance policy selection has been a challenge for many industries. AHP (analytical hierarchy process), an important decision making technique has the ability to resolve these challenges to some extent. AHP helps to provide overall ranking of maintenance alternatives. However the overall rankings can be ...

  4. Friendship and Delinquency : Selection and Influence Processes in Early Adolescence

    NARCIS (Netherlands)

    Knecht, Andrea; Snijders, Tom A. B.; Baerveldt, Chris; Steglich, Christian E. G.; Raub, Werner

    Positive association of relevant characteristics is a widespread pattern among adolescent friends. A positive association may be caused by the selection of similar others as friends and by the deselection of dissimilar ones, but also by influence processes where friends adjust their behavior to each

  5. Employee Selection Process: Integrating Employee Needs and Employer Motivators.

    Science.gov (United States)

    Carroll, Brian J.

    1989-01-01

    Offers suggestions for managers relative to the employee selection process, focusing on the identification of a potential employee's needs and the employer's motivators that affect employee productivity. Discusses the use of a preemployment survey and offers a questionnaire that allows matching of the employee's needs with employment…

  6. Predictive Active Set Selection Methods for Gaussian Processes

    DEFF Research Database (Denmark)

    Henao, Ricardo; Winther, Ole

    2012-01-01

    We propose an active set selection framework for Gaussian process classification for cases when the dataset is large enough to render its inference prohibitive. Our scheme consists of a two-step alternating procedure of active set update rules and hyperparameter optimization based upon marginal l...

  7. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high-cost, or long-schedule-lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  8. Models of memory: information processing.

    Science.gov (United States)

    Eysenck, M W

    1988-01-01

    A complete understanding of human memory will necessarily involve consideration of the active processes involved at the time of learning and of the organization and nature of representation of information in long-term memory. In addition to process and structure, it is important for theory to indicate the ways in which stimulus-driven and conceptually driven processes interact with each other in the learning situation. Not surprisingly, no existent theory provides a detailed specification of all of these factors. However, there are a number of more specific theories which are successful in illuminating some of the component structures and processes. The working memory model proposed by Baddeley and Hitch (1974) and modified subsequently has shown how the earlier theoretical construct of the short-term store should be replaced with the notion of working memory. In essence, working memory is a system which is used both to process information and to permit the transient storage of information. It comprises a number of conceptually distinct, but functionally interdependent components. So far as long-term memory is concerned, there is evidence of a number of different kinds of representation. Of particular importance is the distinction between declarative knowledge and procedural knowledge, a distinction which has received support from the study of amnesic patients. Kosslyn has argued for a distinction between literal representation and propositional representation, whereas Tulving has distinguished between episodic and semantic memories. While Tulving's distinction is perhaps the best known, there is increasing evidence that episodic and semantic memory differ primarily in content rather than in process, and so the distinction may be of less theoretical value than was originally believed.(ABSTRACT TRUNCATED AT 250 WORDS)

  9. Analysis of Using Resources in Business Process Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Vasilecas Olegas

    2014-12-01

    Full Text Available One of the key purposes of Business Process Model and Notation (BPMN) is to support graphical representation of the process model. However, such models lack support for the graphical representation of the resources that process instances use during simulation or execution. The paper analyzes different methods and their extensions for resource modeling. Further, this article presents a selected set of resource properties that are relevant for resource modeling. The paper proposes an approach that explains how to use the selected set of resource properties to extend process modeling using BPMN and simulation tools. The extensions are based on BPMN, where business process instances use resources concurrently.

  10. Modeling pellet impact drilling process

    Science.gov (United States)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high-velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling the process of pellet impact drilling, which creates the scientific and methodological basis for engineering design of drilling operations under different geo-technical conditions.

  11. Communication activities for NUMO's site selection process

    International Nuclear Information System (INIS)

    Takeuchi, Mitsuo; Okuyama, Shigeru; Kitayama, Kazumi; Kuba, Michiyoshi

    2004-01-01

    A siting program for geological disposal of high-level radioactive waste (HLW) in Japan has just started and is moving into a new stage of communication with the public. A final repository site will be selected via a stepwise process, as stipulated in the Specified Radioactive Waste Final Disposal Act promulgated in June 2000. Based on the Act, the site selection process of the Nuclear Waste Management Organization of Japan (NUMO, established in October 2000) will be carried out in three steps: selection of Preliminary Investigation Areas (PIAs), selection of Detailed Investigation Areas (DIAs) and selection of the Repository Site. The Act also defines NUMO's responsibilities in terms of implementing the HLW disposal program in an open and transparent manner. NUMO fully understands the importance of public participation in its activities and is aiming to promote public involvement in the process of site selection based on a fundamental policy, which consists of 'adopting a stepwise approach', 'respecting the initiative of municipalities' and 'ensuring transparency in information disclosure'. This policy is clearly reflected in the adoption of an open solicitation approach for volunteer municipalities for Preliminary Investigation Areas (PIAs). NUMO made the official announcement of the start of its open solicitation program on 19 December 2002. This paper outlines how NUMO's activities are currently carried out with a view to encouraging municipalities to volunteer as PIAs and how public awareness of the safety of HLW disposal is evaluated at this stage.

  12. Selecting an optimal mixed products using grey relationship model

    Directory of Open Access Journals (Sweden)

    Farshad Faezy Razi

    2013-06-01

    Full Text Available This paper presents an integrated supplier selection and inventory management using grey relationship model (GRM) as well as multi-objective decision making process. The proposed model of this paper first ranks different suppliers based on the GRM technique and then determines the optimum level of inventory by considering different objectives. To show the implementation of the proposed model, we use some benchmark data presented by Talluri and Baker [Talluri, S., & Baker, R. C. (2002). A multi-phase mathematical programming approach for effective supply chain design. European Journal of Operational Research, 141(3), 544-558.]. The preliminary results indicate that the proposed model of this paper is capable of handling different criteria for supplier selection.
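The grey relational ranking step described in this record can be sketched as follows; the supplier scores and criterion weights below are invented illustrations, not the Talluri and Baker benchmark data:

```python
def grey_relational_rank(scores, weights, rho=0.5):
    """Rank alternatives (e.g. suppliers) by weighted grey relational grade.
    scores[i][j] is the benefit-type score of alternative i on criterion j;
    rho is the usual distinguishing coefficient."""
    n, m = len(scores), len(scores[0])
    cols = list(zip(*scores))
    # normalize each criterion to [0, 1], larger-is-better
    norm = [[(scores[i][j] - min(cols[j])) / (max(cols[j]) - min(cols[j]))
             for j in range(m)] for i in range(n)]
    # deviation from the ideal reference series (all ones)
    deltas = [[abs(1.0 - norm[i][j]) for j in range(m)] for i in range(n)]
    dmin = min(min(r) for r in deltas)
    dmax = max(max(r) for r in deltas)
    # grey relational coefficients, then weighted grades
    coeff = [[(dmin + rho * dmax) / (deltas[i][j] + rho * dmax)
              for j in range(m)] for i in range(n)]
    grades = [sum(w * c for w, c in zip(weights, row)) for row in coeff]
    order = sorted(range(n), key=lambda i: grades[i], reverse=True)
    return order, grades

order, grades = grey_relational_rank(
    scores=[[0.8, 0.7, 0.9], [0.6, 0.9, 0.5], [0.9, 0.6, 0.7]],
    weights=[0.5, 0.3, 0.2])
print(order, [round(g, 3) for g in grades])
```

The alternative closest to the ideal series on the heavily weighted criteria ranks first; the inventory-optimization stage of the paper would then take this ranking as input.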

  13. Spatial Fleming-Viot models with selection and mutation

    CERN Document Server

    Dawson, Donald A

    2014-01-01

    This book constructs a rigorous framework for analysing selected phenomena in evolutionary theory of populations arising due to the combined effects of migration, selection and mutation in a spatial stochastic population model, namely the evolution towards fitter and fitter types through punctuated equilibria. The discussion is based on a number of new methods, in particular multiple scale analysis, nonlinear Markov processes and their entrance laws, atomic measure-valued evolutions and new forms of duality (for state-dependent mutation and multitype selection) which are used to prove ergodic theorems in this context and are applicable for many other questions and renormalization analysis for a variety of phenomena (stasis, punctuated equilibrium, failure of naive branching approximations, biodiversity) which occur due to the combination of rare mutation, mutation, resampling, migration and selection and make it necessary to mathematically bridge the gap (in the limit) between time and space scales.

  14. Substitution of Organic Solvents in Selected Industrial Cleaning Processes

    DEFF Research Database (Denmark)

    Jacobsen, Thomas; Rasmussen, Pia Brunn

    1997-01-01

    Volatile organic solvents (VOC) are becoming increasingly unwanted in industrial processes. Substitution of VOC with non-volatile, low-toxic compounds is a possibility to reduce VOC use. It has been successfully demonstrated that organic solvents used in cleaning processes in sheet offset printing can be replaced with monoesters of fatty acids from vegetable oils (VOFA). The paper describes the selection of other industrial cleaning or degreasing processes where VOC could be replaced by VOFA. Manual degreasing and cleaning processes in the metal industry, maintenance and repair of vehicles, and industrial coating processes are likely candidates for substitution of VOC with VOFA. Requirements to the resulting surfaces may, however, hinder the replacement. This is especially important when the surface has to be coated in a subsequent step.

  15. PASS-GP: Predictive active set selection for Gaussian processes

    DEFF Research Database (Denmark)

    Henao, Ricardo; Winther, Ole

    2010-01-01

    We propose a new approximation method for Gaussian process (GP) learning for large data sets that combines inline active set selection with hyperparameter optimization. The predictive probability of the label is used for ranking the data points. We use the leave-one-out predictive probability...... to the active set selection strategy and marginal likelihood optimization on the active set. We make extensive tests on the USPS and MNIST digit classification databases with and without incorporating invariances, demonstrating that we can get state-of-the-art results (e.g. 0.86% error on MNIST) with reasonable...

  16. Selections from 2017: Image Processing with AstroImageJ

    Science.gov (United States)

    Kohler, Susanna

    2017-12-01

    Editor's note: In these last two weeks of 2017, we'll be looking at a few selections that we haven't yet discussed on AAS Nova from among the most-downloaded papers published in AAS journals this year. The usual posting schedule will resume in January. AstroImageJ: Image Processing and Photometric Extraction for Ultra-Precise Astronomical Light Curves. Published January 2017. The AIJ image display: a wide range of astronomy-specific image display options and image analysis tools are available from the menus, quick-access icons, and interactive histogram. [Collins et al. 2017] Main takeaway: AstroImageJ is a new integrated software package presented in a publication led by Karen Collins (Vanderbilt University, Fisk University, and University of Louisville). It enables new users, even at the level of an undergraduate student, high school student, or amateur astronomer, to quickly start processing, modeling, and plotting astronomical image data. Why it's interesting: Science doesn't just happen the moment a telescope captures a picture of a distant object. Instead, astronomical images must first be carefully processed to clean up the data, and this data must then be systematically analyzed to learn about the objects within it. AstroImageJ, as a GUI-driven, easily installed, public-domain tool, is uniquely accessible for this processing and analysis, allowing even non-specialist users to explore and visualize astronomical data. Some features of AstroImageJ (as reported by Astrobites): image calibration: generate master flat, dark, and bias frames; image arithmetic: combine images via subtraction, addition, division, multiplication, etc.; stack editing: easily perform operations on a series of images; image stabilization and image alignment features; precise coordinate converters: calculate Heliocentric and Barycentric Julian Dates; WCS coordinates: determine precisely where a telescope was pointed for an image by plate solving using Astrometry.net; macro and plugin support: write your own macros; multi-aperture photometry.

  17. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin-walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  18. A visual analysis of the process of process modeling

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.

    2015-01-01

    The construction of business process models has become an important requisite in the analysis and optimization of processes. The success of the analysis and optimization efforts heavily depends on the quality of the models. Therefore, a research domain emerged that studies the process of process modeling.

  19. Control processes during selective long-term memory retrieval.

    Science.gov (United States)

    Kızılırmak, J M; Rösler, F; Khader, P H

    2012-01-16

    In our daily life, we often need to selectively remember information related to the same retrieval cue in a consecutive manner (e.g., ingredients from a recipe). To investigate such selection processes during cued long-term memory (LTM) retrieval, we used a paradigm in which the retrieval demands were systematically varied from trial to trial and analyzed, by means of behavior and slow cortical EEG potentials (SCPs), the retrieval processes in the current trial depending on those of the previous trial. We varied whether the retrieval cue, the type of to-be-retrieved association (feature), or retrieval load was repeated or changed from trial to trial. The behavioral data revealed a benefit of feature repetition, probably due to trial-by-trial feature priming. SCPs further showed an effect of cue change with a mid-frontal maximum, suggesting increased control demands when the cue was repeated, as well as a parietal effect of retrieval-load change, indicating increased activation of posterior neural resources when focusing on a single association after all learned associations had been activated previously, compared to staying with single associations across trials. These effects suggest the existence of two distinct types of dynamic (trial-by-trial) control processes during LTM retrieval: (1) medial frontal processes that monitor or regulate interference within a set of activated associations, and (2) posterior processes regulating attention to LTM representations. The present study demonstrates that processes mediating selective LTM retrieval can be successfully studied by manipulating the history of processing demands in trial sequences.

  20. Description of processes for the immobilization of selected transuranic wastes

    International Nuclear Information System (INIS)

    Timmerman, C.L.

    1980-12-01

    Processed sludge and incinerator-ash wastes contaminated with transuranic (TRU) elements may require immobilization to prevent the release of these elements to the environment. As part of the TRU Waste Immobilization Program sponsored by the Department of Energy (DOE), the Pacific Northwest Laboratory is developing applicable waste-form and processing technology that may meet this need. This report defines and describes processes that are capable of immobilizing a selected TRU waste-stream consisting of a blend of three parts process sludge and one part incinerator ash. These selected waste streams are based on the compositions and generation rates of the waste processing and incineration facility at the Rocky Flats Plant. The specific waste forms that could be produced by the described processes include: in-can melted borosilicate-glass monolith; joule-heated melter borosilicate-glass monolith or marble; joule-heated melter aluminosilicate-glass monolith or marble; joule-heated melter basaltic-glass monolith or marble; joule-heated melter glass-ceramic monolith; cast-cement monolith; pressed-cement pellet; and cold-pressed sintered-ceramic pellet

  1. Collapse models and perceptual processes

    International Nuclear Information System (INIS)

    Ghirardi, Gian Carlo; Romano, Raffaele

    2014-01-01

    Theories including a collapse mechanism have been presented several years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merits derive from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems as well as for the reduction associated to measurement processes and for the classical behavior of macroscopic objects. Since such theories qualify themselves not as new interpretations but as modifications of the standard theory they can be, in principle, tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, it seems that recent technological developments allow at least precise limits to be put on the parameters characterizing the modifications of the evolution equation. Here we will simply mention some of the recent investigations in this direction, while we will mainly concentrate our attention on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures by means of standard macroscopic devices will be discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. We make plausible, by discussing in detail a toy model, that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  2. Evaluation and selection of in-situ leaching mining method using analytic hierarchy process

    International Nuclear Information System (INIS)

    Zhao Heyong; Tan Kaixuan; Liu Huizhen

    2007-01-01

    According to the complicated conditions and main influencing factors of in-situ leaching mining, a model and procedure for the evaluation and selection of in-situ leaching mining methods are established based on the analytic hierarchy process (AHP). Taking a uranium mine in Xinjiang, China, as an example, the application of this model is presented. The results of the analyses and calculations indicate that acid leaching is the optimum option. (authors)
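The AHP priority computation such an evaluation relies on can be sketched as follows; the pairwise comparison matrix is a hypothetical example, not the Xinjiang mine data:

```python
def ahp_weights(M, iters=100):
    """Priority weights of a pairwise comparison matrix M via power
    iteration (principal eigenvector), plus Saaty's consistency ratio."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]          # renormalize each iteration
    # lambda_max from M w = lambda_max * w, averaged over components
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    cr = (lam - n) / (n - 1) / ri        # consistency ratio; < 0.1 is acceptable
    return w, cr

# hypothetical pairwise comparisons of three candidate leaching schemes
M = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 3.0],
     [1 / 5, 1 / 3, 1.0]]
w, cr = ahp_weights(M)
print([round(x, 3) for x in w], round(cr, 3))
```

The scheme with the largest weight would be selected, provided the consistency ratio stays below Saaty's 0.1 threshold.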

  3. Models of speciation by sexual selection on polygenic traits

    OpenAIRE

    Lande, Russell

    1981-01-01

    The joint evolution of female mating preferences and secondary sexual characters of males is modeled for polygamous species in which males provide only genetic material to the next generation and females have many potential mates to choose among. Despite stabilizing natural selection on males, various types of mating preferences may create a runaway process in which the outcome of phenotypic evolution depends critically on the genetic variation parameters and initial conditions of a population.

  4. Laser Process for Selective Emitter Silicon Solar Cells

    Directory of Open Access Journals (Sweden)

    G. Poulain

    2012-01-01

    Full Text Available Selective emitter solar cells can provide a significant increase in conversion efficiency. However current approaches need many technological steps and alignment procedures. This paper reports on a preliminary attempt to reduce the number of processing steps and therefore the cost of selective emitter cells. In the developed procedure, a phosphorous glass covered with silicon nitride acts as the doping source. A laser is used to open locally the antireflection coating and at the same time achieve local phosphorus diffusion. In this process the standard chemical etching of the phosphorous glass is avoided. Sheet resistance variation from 100 Ω/sq to 40 Ω/sq is demonstrated with a nanosecond UV laser. Numerical simulation of the laser-matter interaction is discussed to understand the dopant diffusion efficiency. Preliminary solar cells results show a 0.5% improvement compared with a homogeneous emitter structure.

  5. SELECTION AND PROMOTION PROCESS TO SUPERVISORY POSITIONS IN MEXICO, 2015

    Directory of Open Access Journals (Sweden)

    José Guadalupe Hernández López

    2015-12-01

    Full Text Available In Mexico, a process of selecting and promoting teachers to supervisory positions through what has been called competitive examinations is just starting. This competition, derived from the 2013 Education Reform, is justified by the claim that it identifies the best teachers to fill these positions. As a "new" process in the Mexican education system, it has led to a series of disputes, since the examination is confined to the application and resolution of a standardized test consisting of multiple-choice questions, applied in a single eight-hour session, which determines whether a teacher is qualified or not qualified for the job.

  6. Development of microforming process combined with selective chemical vapor deposition

    Directory of Open Access Journals (Sweden)

    Koshimizu Kazushi

    2015-01-01

    Full Text Available Microforming has received much attention in recent decades due to the wide use of microparts in electronics and medicine. For the further functionalization of these micro devices, highly functional surfaces with noble metals and nanomaterials are strongly required in the bio- and medical fields, for example in biosensors. To realize an efficient manufacturing process that can deform the submillimeter-scale bulk structure and construct micro- to nanometer-scale structures in one process, the present study proposes a process that combines microforming of metal foils with selective chemical vapor deposition (SCVD) on the active surface of the work materials. To demonstrate the viability of this proposed process, the feasibility of SCVD of functional materials onto the active surface of titanium (Ti) was investigated. CVD of iron (Fe) and carbon nanotubes (CNTs) was conducted to construct CNTs on a surface patterned with active Ti and a non-active oxidation layer. Ti thin films on a silicon substrate and Fe were used as the work material and functional material, respectively. CNTs were grown only on the Ti surface. Consequently, the selectivity of the active Ti surface for the synthesis of Fe particles in the CVD process was confirmed.

  7. Ecohydrological model parameter selection for stream health evaluation.

    Science.gov (United States)

    Woznicki, Sean A; Nejadhashemi, A Pouyan; Ross, Dennis M; Zhang, Zhen; Wang, Lizhu; Esfahanian, Abdol-Hossein

    2015-04-01

    Variable selection is a critical step in development of empirical stream health prediction models. This study develops a framework for selecting important in-stream variables to predict four measures of biological integrity: total number of Ephemeroptera, Plecoptera, and Trichoptera (EPT) taxa, family index of biotic integrity (FIBI), Hilsenhoff biotic integrity (HBI), and fish index of biotic integrity (IBI). Over 200 flow regime and water quality variables were calculated using the Hydrologic Index Tool (HIT) and Soil and Water Assessment Tool (SWAT). Streams of the River Raisin watershed in Michigan were grouped using the Strahler stream classification system (orders 1-3 and orders 4-6), k-means clustering technique (two clusters: C1 and C2), and all streams (one grouping). For each grouping, variable selection was performed using Bayesian variable selection, principal component analysis, and Spearman's rank correlation. Following selection of best variable sets, models were developed to predict the measures of biological integrity using adaptive-neuro fuzzy inference systems (ANFIS), a technique well-suited to complex, nonlinear ecological problems. Multiple unique variable sets were identified, all of which differed by selection method and stream grouping. Final best models were mostly built using the Bayesian variable selection method. The most effective stream grouping method varied by health measure, although k-means clustering and grouping by stream order were always superior to models built without grouping. Commonly selected variables were related to streamflow magnitude, rate of change, and seasonal nitrate concentration. Each best model was effective in simulating stream health observations, with EPT taxa validation R2 ranging from 0.67 to 0.92, FIBI ranging from 0.49 to 0.85, HBI from 0.56 to 0.75, and fish IBI at 0.99 for all best models. The comprehensive variable selection and modeling process proposed here is a robust method that extends our
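One of the three selection methods named in this record, Spearman's rank correlation, can be sketched as a stand-alone filter; the predictor names and data below are invented for illustration, not the River Raisin variables:

```python
def ranks(xs):
    """Average ranks, with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

def select_top(predictors, response, k):
    """Keep the k predictors most monotonically related to the response."""
    return sorted(predictors,
                  key=lambda p: abs(spearman(predictors[p], response)),
                  reverse=True)[:k]

# invented flow / water-quality predictors and a biotic-integrity response
predictors = {
    "flow_magnitude": [1, 2, 3, 4, 5],
    "noise": [5, 1, 4, 2, 3],
    "nitrate": [10, 8, 6, 4, 2],
}
response = [2, 4, 6, 8, 10]
top = select_top(predictors, response, k=2)
print(top)
```

The filtered variable set would then feed a downstream model such as the ANFIS predictor the study uses.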

  8. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash-Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against an independent dataset to that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent

  9. Bayesian Model Selection in Geophysics: The evidence

    Science.gov (United States)

    Vrugt, J. A.

    2016-12-01

    Bayesian inference has found widespread application and use in science and engineering to reconcile Earth system models with data, including prediction in space (interpolation), prediction in time (forecasting), assimilation of observations and deterministic/stochastic model output, and inference of the model parameters. Per Bayes' theorem, the posterior probability, P(H|D), of a hypothesis, H, given the data D, is equivalent to the product of its prior probability, P(H), and likelihood, L(H|D), divided by a normalization constant, P(D). In geophysics, the hypothesis, H, often constitutes a description (parameterization) of the subsurface for some entity of interest (e.g. porosity, moisture content). The normalization constant, P(D), is not required for inference of the subsurface structure, yet of great value for model selection. Unfortunately, it is not particularly easy to estimate P(D) in practice. Here, I will introduce the various building blocks of a general purpose method which provides robust and unbiased estimates of the evidence, P(D). This method uses multi-dimensional numerical integration of the posterior (parameter) distribution. I will then illustrate this new estimator by application to three competing subsurface models (hypotheses) using GPR travel time data from the South Oyster Bacterial Transport Site, in Virginia, USA. The three subsurface models differ in their treatment of the porosity distribution and use (a) horizontal layering with fixed layer thicknesses, (b) vertical layering with fixed layer thicknesses and (c) a multi-Gaussian field. The results of the new estimator are compared against the brute force Monte Carlo method, and the Laplace-Metropolis method.
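The brute-force Monte Carlo baseline the record compares against can be sketched for a toy one-parameter model; the Gaussian likelihood and uniform prior below are assumptions for illustration, not the GPR travel-time setup:

```python
import math
import random

def log_likelihood(theta, data, sigma=1.0):
    """Toy Gaussian likelihood: i.i.d. data with unknown mean theta."""
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (d - theta) ** 2 / (2 * sigma ** 2) for d in data)

def evidence_mc(data, lo=-5.0, hi=5.0, n=100_000, seed=1):
    """Brute-force Monte Carlo estimate of the evidence
    P(D) = integral of L(theta) p(theta) dtheta.  Drawing theta from its
    own (uniform) prior makes the simple average of L(theta) an unbiased
    estimator of P(D)."""
    random.seed(seed)
    acc = 0.0
    for _ in range(n):
        acc += math.exp(log_likelihood(random.uniform(lo, hi), data))
    return acc / n

ev = evidence_mc([0.9, 1.1, 1.0])
print(ev)
```

Competing models would each get such an estimate, and their ratio (the Bayes factor) drives the model selection; in high dimensions this brute-force average converges far too slowly, which is the gap the record's estimator targets.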

  10. On selection of optimal stochastic model for accelerated life testing

    International Nuclear Information System (INIS)

    Volf, P.; Timková, J.

    2014-01-01

    This paper deals with the problem of proper lifetime model selection in the context of statistical reliability analysis. Namely, we consider regression models describing the dependence of failure intensities on a covariate, for instance, a stressor. Testing the model fit is standardly based on the so-called martingale residuals. Their analysis has already been studied by many authors. Nevertheless, the Bayes approach to the problem, in spite of its advantages, is just developing. We shall present the Bayes procedure of estimation in several semi-parametric regression models of failure intensity. Then, our main concern is the Bayes construction of residual processes and goodness-of-fit tests based on them. The method is illustrated with both artificial and real-data examples. - Highlights: • Statistical survival and reliability analysis and Bayes approach. • Bayes semi-parametric regression modeling in Cox's and AFT models. • Bayes version of martingale residuals and goodness-of-fit test

  11. Python Program to Select HII Region Models

    Science.gov (United States)

    Miller, Clare; Lamarche, Cody; Vishwas, Amit; Stacey, Gordon J.

    2016-01-01

    HII regions are areas of singly ionized Hydrogen formed by the ionizing radiation of upper main sequence stars. The infrared fine-structure line emissions, particularly Oxygen, Nitrogen, and Neon, can give important information about HII regions including gas temperature and density, elemental abundances, and the effective temperature of the stars that form them. The processes involved in calculating this information from observational data are complex. Models, such as those provided in Rubin 1984 and those produced by Cloudy (Ferland et al., 2013), enable one to extract physical parameters from observational data. However, the multitude of search parameters can make sifting through models tedious. I digitized Rubin's models and wrote a Python program that is able to take observed line ratios and their uncertainties and find the Rubin or Cloudy model that best matches the observational data. By creating a Python script that is user friendly and able to quickly sort through models with a high level of accuracy, this work increases efficiency and reduces human error in matching HII region models to observational data.
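The grid-matching idea (find the model whose predicted line ratios best fit the observations) can be sketched as a minimum chi-square search; the model names and ratio values below are hypothetical, not actual Rubin or Cloudy outputs:

```python
def best_model(observed, errors, models):
    """Pick the model grid entry whose predicted line ratios best match
    the observations, by minimum chi-square."""
    def chi2(pred):
        return sum(((o - p) / e) ** 2
                   for o, p, e in zip(observed, pred, errors))
    name = min(models, key=lambda m: chi2(models[m]))
    return name, chi2(models[name])

# hypothetical grid: model name -> predicted fine-structure line ratios
models = {
    "Te=8000K_ne=100": [2.1, 0.4],
    "Te=10000K_ne=100": [3.0, 0.5],
    "Te=10000K_ne=1000": [3.2, 0.9],
}
name, score = best_model(observed=[2.9, 0.55],
                         errors=[0.2, 0.1],
                         models=models)
print(name, round(score, 2))
```

Weighting each residual by its measurement uncertainty is what lets the script use the observed error bars directly when selecting the best-matching model.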

  12. Computer modeling of lung cancer diagnosis-to-treatment process.

    Science.gov (United States)

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U; Yu, Xinhua; Faris, Nick; Li, Jingshan

    2015-08-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care-delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined and the necessary data and procedures to develop a DES model for the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their applications in healthcare are introduced, and the approach to derive a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed.
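The Markov-chain side of such process models can be sketched with an absorbing chain whose expected time to absorption approximates the time from diagnosis to start of treatment; the stages and transition probabilities below are invented for illustration, not the paper's calibrated values:

```python
def expected_steps(Q):
    """Expected steps to absorption from each transient state of an
    absorbing Markov chain: t = (I - Q)^(-1) * 1, solved here for the
    2x2 case by direct matrix inversion."""
    a, b = 1 - Q[0][0], -Q[0][1]
    c, d = -Q[1][0], 1 - Q[1][1]
    det = a * d - b * c
    # apply the inverse of (I - Q) to the all-ones vector
    t0 = (d - b) / det
    t1 = (a - c) / det
    return [t0, t1]

# hypothetical weekly transitions between transient care stages:
# state 0 = diagnostic work-up, state 1 = staging;
# absorption = treatment (e.g. surgery) begins
Q = [[0.6, 0.3],
     [0.1, 0.5]]
t = expected_steps(Q)
print([round(x, 2) for x in t])
```

Larger chains would invert (I - Q) numerically, but the structure is the same: the fundamental matrix yields expected waiting times per stage, which is exactly the kind of performance measure the closed formulas in the record evaluate.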

  13. Numerical simulation of the selection process of the ovarian follicles

    Directory of Open Access Journals (Sweden)

    Aymard Benjamin

    2013-01-01

    Full Text Available This paper presents the design and implementation of a numerical method to simulate a multiscale model describing the selection process in ovarian follicles. The PDE model consists in a quasi-linear hyperbolic system of large size, namely Nf × Nf, ruling the time evolution of the cell density functions of Nf follicles (in practice Nf is of the order of a few to twenty). These equations are weakly coupled through the sum of the first order moments of the density functions. The time-dependent equations make use of two structuring variables, age and maturity, which play the roles of space variables. The problem is naturally set over a compact domain of R2. The formulation of the time-dependent controlled transport coefficients accounts for available biological knowledge on follicular cell kinetics. We introduce a dedicated numerical scheme that is amenable to parallelization, by taking advantage of the weak coupling. Numerical illustrations assess the relevance of the proposed method both in terms of accuracy and HPC achievements.

  14. High-dimensional model estimation and model selection

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I will review concepts and algorithms from high-dimensional statistics for linear model estimation and model selection. I will particularly focus on the so-called p>>n setting where the number of variables p is much larger than the number of samples n. I will focus mostly on regularized statistical estimators that produce sparse models. Important examples include the LASSO and its matrix extension, the Graphical LASSO, and more recent non-convex methods such as the TREX. I will show the applicability of these estimators in a diverse range of scientific applications, such as sparse interaction graph recovery and high-dimensional classification and regression problems in genomics.
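The LASSO mentioned in this record can be sketched with cyclic coordinate descent and soft-thresholding; the tiny data set below is invented for illustration, and real p >> n problems would use an optimized solver:

```python
def lasso_cd(X, y, lam, iters=200):
    """LASSO by cyclic coordinate descent with soft-thresholding.
    X: list of sample rows (n samples, p features); intercept omitted."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # partial residual excluding feature j
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            # soft-threshold update: small correlations are set exactly to 0,
            # which is what produces sparse models
            if rho > lam:
                beta[j] = (rho - lam) / z
            elif rho < -lam:
                beta[j] = (rho + lam) / z
            else:
                beta[j] = 0.0
    return beta

# invented data: y depends only on the first feature; the other two are noise
X = [[1, 0.1, 0.0],
     [2, -0.1, 0.2],
     [3, 0.2, -0.1],
     [4, 0.0, 0.1]]
y = [2.1, 3.9, 6.2, 7.9]
beta = lasso_cd(X, y, lam=1.0)
print(beta)
```

The penalty zeroes out the two noise coefficients exactly, illustrating the sparsity that makes such estimators useful when p is much larger than n.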

  15. Models of cultural niche construction with selection and assortative mating.

    Science.gov (United States)

    Creanza, Nicole; Fogarty, Laurel; Feldman, Marcus W

    2012-01-01

    Niche construction is a process through which organisms modify their environment and, as a result, alter the selection pressures on themselves and other species. In cultural niche construction, one or more cultural traits can influence the evolution of other cultural or biological traits by affecting the social environment in which the latter traits may evolve. Cultural niche construction may include either gene-culture or culture-culture interactions. Here we develop a model of this process and suggest some applications of this model. We examine the interactions between cultural transmission, selection, and assorting, paying particular attention to the complexities that arise when selection and assorting are both present, in which case stable polymorphisms of all cultural phenotypes are possible. We compare our model to a recent model for the joint evolution of religion and fertility and discuss other potential applications of cultural niche construction theory, including the evolution and maintenance of large-scale human conflict and the relationship between sex ratio bias and marriage customs. The evolutionary framework we introduce begins to address complexities that arise in the quantitative analysis of multiple interacting cultural traits.

  16. Models of cultural niche construction with selection and assortative mating.

    Directory of Open Access Journals (Sweden)

    Nicole Creanza

    Full Text Available Niche construction is a process through which organisms modify their environment and, as a result, alter the selection pressures on themselves and other species. In cultural niche construction, one or more cultural traits can influence the evolution of other cultural or biological traits by affecting the social environment in which the latter traits may evolve. Cultural niche construction may include either gene-culture or culture-culture interactions. Here we develop a model of this process and suggest some applications of this model. We examine the interactions between cultural transmission, selection, and assorting, paying particular attention to the complexities that arise when selection and assorting are both present, in which case stable polymorphisms of all cultural phenotypes are possible. We compare our model to a recent model for the joint evolution of religion and fertility and discuss other potential applications of cultural niche construction theory, including the evolution and maintenance of large-scale human conflict and the relationship between sex ratio bias and marriage customs. The evolutionary framework we introduce begins to address complexities that arise in the quantitative analysis of multiple interacting cultural traits.

  17. MODELLING OF POSTSEISMIC PROCESSES IN SUBDUCTION ZONES

    Directory of Open Access Journals (Sweden)

    Irina S. Vladimirova

    2012-01-01

    Full Text Available Large intraplate subduction earthquakes are generally accompanied by prolonged and intense postseismic anomalies. In the present work, viscoelastic relaxation in the upper mantle and the asthenosphere is considered as the main mechanism responsible for the occurrence of such postseismic effects. The study of transient processes is performed on the basis of data on postseismic processes accompanying the first Simushir earthquake on 15 November 2006 and the Maule earthquake on 27 February 2010. The methodology of modelling a viscoelastic relaxation process after a large intraplate subduction earthquake is presented. A priori parameters of the selected model describing observed postseismic effects are adjusted by minimizing deviations between modeled surface displacements and actual surface displacements recorded by geodetic methods, through solving corresponding inverse problems. The presented methodology yielded estimations of Maxwell’s viscosity of the asthenosphere of the central Kuril Arc and also of central Chile. Besides, postseismic slip distribution patterns were obtained for the focus of the Simushir earthquake of 15 November 2006 (Mw=8.3) (Figure 3), and distribution patterns of seismic and postseismic slip were determined for the focus of the Maule earthquake of 27 February 2010 (Mw=8.8) (Figure 6). These estimations and patterns can provide for prediction of the intensity of viscoelastic stress attenuation in the asthenosphere; anomalous values should be taken into account as adjustment factors when analyzing inter-seismic deformation in order to ensure correct estimation of the accumulated elastic seismogenic potential.

  18. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    Probabilistic properties of Cox processes of relevance for statistical modelling and inference are studied. Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties...... and point process operations such as thinning, displacements, and superpositioning. We also discuss how to simulate specific Cox processes....
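A Cox process is a Poisson process whose intensity is itself random, so its counts are overdispersed relative to a plain Poisson process. The minimal sketch below (not from the paper; the Gamma mixing distribution and its parameters are invented for illustration) simulates counts of a doubly stochastic Poisson process and exhibits that overdispersion:

```python
import math
import random

def sample_cox_counts(n, shape=2.0, scale=5.0):
    """Counts of a simple Cox (doubly stochastic Poisson) process on a
    unit window: a random intensity Lambda ~ Gamma(shape, scale) is
    drawn first, then the point count is Poisson(Lambda)."""
    counts = []
    for _ in range(n):
        lam = random.gammavariate(shape, scale)
        # Poisson sampling via Knuth's product-of-uniforms method
        threshold, k, prod = math.exp(-lam), 0, 1.0
        while True:
            prod *= random.random()
            if prod <= threshold:
                break
            k += 1
        counts.append(k)
    return counts

random.seed(42)
counts = sample_cox_counts(2000)
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
# Random mixing inflates the variance well above the mean,
# unlike a homogeneous Poisson process where var == mean.
```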

  19. Selecting a model of supersymmetry breaking mediation

    International Nuclear Information System (INIS)

    AbdusSalam, S. S.; Allanach, B. C.; Dolan, M. J.; Feroz, F.; Hobson, M. P.

    2009-01-01

    We study the problem of selecting between different mechanisms of supersymmetry breaking in the minimal supersymmetric standard model using current data. We evaluate the Bayesian evidence of four supersymmetry breaking scenarios: mSUGRA, mGMSB, mAMSB, and moduli mediation. The results show a strong dependence on the dark matter assumption. Using the inferred cosmological relic density as an upper bound, minimal anomaly mediation is at least moderately favored over the CMSSM. Our fits also indicate that evidence for a positive sign of the μ parameter is moderate at best. We present constraints on the anomaly and gauge mediated parameter spaces and some previously unexplored aspects of the dark matter phenomenology of the moduli mediation scenario. We use sparticle searches, indirect observables and dark matter observables in the global fit and quantify robustness with respect to prior choice. We quantify how much information is contained within each constraint.

  20. Continuous-Time Mean-Variance Portfolio Selection under the CEV Process

    OpenAIRE

    Ma, Hui-qiang

    2014-01-01

    We consider a continuous-time mean-variance portfolio selection model when stock price follows the constant elasticity of variance (CEV) process. The aim of this paper is to derive an optimal portfolio strategy and the efficient frontier. The mean-variance portfolio selection problem is formulated as a linearly constrained convex program problem. By employing the Lagrange multiplier method and stochastic optimal control theory, we obtain the optimal portfolio strategy and mean-variance effici...
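The Lagrange-multiplier solution mentioned in the abstract has a well-known closed form in the simpler constant-coefficient (non-CEV) static case. The sketch below, with invented expected returns and covariances, computes the minimum-variance weights for a target return; it illustrates the classical Markowitz frontier, not the paper's CEV derivation:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def efficient_weights(mu, cov, target):
    """Closed-form Markowitz weights: minimize variance subject to full
    investment and a target expected return (shorting allowed)."""
    n = len(mu)
    inv_one = solve(cov, [1.0] * n)                # Sigma^-1 1
    inv_mu = solve(cov, mu)                        # Sigma^-1 mu
    A = sum(inv_one)                               # 1' Sigma^-1 1
    B = sum(m * v for m, v in zip(mu, inv_one))    # mu' Sigma^-1 1
    C = sum(m * v for m, v in zip(mu, inv_mu))     # mu' Sigma^-1 mu
    D = A * C - B * B
    return [((C - B * target) * inv_one[i]
             + (A * target - B) * inv_mu[i]) / D for i in range(n)]

# Hypothetical three-asset universe (expected returns and covariance)
mu = [0.05, 0.10, 0.15]
cov = [[0.04, 0.0, 0.0], [0.0, 0.09, 0.0], [0.0, 0.0, 0.16]]
w = efficient_weights(mu, cov, target=0.10)
```

By construction the weights sum to one and achieve exactly the target expected return; sweeping `target` traces out the efficient frontier.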

  1. GREENSCOPE: A Method for Modeling Chemical Process ...

    Science.gov (United States)

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Efficiency, and Energy, can evaluate processes with over a hundred different indicators. These indicators provide a means for realizing the principles of green chemistry and green engineering in the context of sustainability. Development of the methodology has centered around three focal points. One is a taxonomy of impacts that describe the indicators and provide absolute scales for their evaluation. The setting of best and worst limits for the indicators allows the user to know the status of the process under study in relation to understood values. Thus, existing or imagined processes can be evaluated according to their relative indicator scores, and process modifications can strive towards realizable targets. A second area of focus is in advancing definitions of data needs for the many indicators of the taxonomy. Each indicator has specific data necessary for its calculation. Values needed and data sources have been identified. These needs can be mapped according to the information source (e.g., input stream, output stream, external data, etc.) for each of the bases. The user can visualize data-indicator relationships on the way to choosing selected ones for evalua

  2. Modelling Technical and Economic Parameters in Selection of Manufacturing Devices

    Directory of Open Access Journals (Sweden)

    Naqib Daneshjo

    2017-11-01

    Full Text Available Sustainable science and technology development is also conditioned by continuous development of the means of production, which have a key role in the structure of each production system. The mechanical nature of the means of production is complemented by controlling and electronic devices in the context of intelligent industry. The selection of production machines for a technological process or technological project has so far been resolved in practice often only intuitively. With regard to increasing intelligence, the number of variable parameters that have to be considered when choosing a production device is also increasing. It is necessary to use computing techniques and decision-making methods, ranging from heuristic methods to more precise methodological procedures, during the selection. The authors present an innovative model for optimization of technical and economic parameters in the selection of manufacturing devices for Industry 4.0.
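As a concrete example of the multi-parameter scoring such a selection model requires, the sketch below applies TOPSIS, a standard multicriteria technique chosen here purely for illustration (the paper does not specify its method, and the machine data, weights, and criteria are invented):

```python
import math

def topsis(matrix, weights, benefit):
    """TOPSIS ranking over alternatives x criteria; `benefit` marks
    criteria where larger values are better."""
    n_alt, n_crit = len(matrix), len(matrix[0])
    # vector normalization per criterion, then weighting
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    V = [[matrix[i][j] / norms[j] * weights[j] for j in range(n_crit)]
         for i in range(n_alt)]
    ideal = [max(V[i][j] for i in range(n_alt)) if benefit[j]
             else min(V[i][j] for i in range(n_alt)) for j in range(n_crit)]
    worst = [min(V[i][j] for i in range(n_alt)) if benefit[j]
             else max(V[i][j] for i in range(n_alt)) for j in range(n_crit)]
    scores = []
    for i in range(n_alt):
        d_pos = math.sqrt(sum((V[i][j] - ideal[j]) ** 2
                              for j in range(n_crit)))
        d_neg = math.sqrt(sum((V[i][j] - worst[j]) ** 2
                              for j in range(n_crit)))
        # closeness to the ideal solution, in [0, 1]
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Hypothetical machines scored on throughput (benefit), precision
# (benefit), and unit cost (cost criterion)
machines = [[120, 0.9, 55], [100, 0.95, 40], [80, 0.7, 30]]
scores = topsis(machines, weights=[0.4, 0.3, 0.3],
                benefit=[True, True, False])
best = scores.index(max(scores))
```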

  3. Cupola Furnace Computer Process Model

    Energy Technology Data Exchange (ETDEWEB)

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing furnaces (electric) and the ability to melt less expensive metallic scrap than the competing furnaces. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near-optimum operating conditions with just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an "Expert System" to permit optimization in real time. The program has been combined with "neural network" programs to enable very easy scanning of a wide range of furnace operation. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems will be found in the "Cupola Handbook", Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  4. Special concrete shield selection using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Abulfaraj, W.H.

    1994-01-01

    Special types of concrete radiation shields that depend on locally available materials and have improved properties for both neutron and gamma-ray attenuation were developed by using plastic materials and heavy ores. The analytic hierarchy process (AHP) is implemented to evaluate these types for selecting the best biological radiation shield for nuclear reactors. Factors affecting the selection decision are degree of protection against neutrons, degree of protection against gamma rays, suitability of the concrete as building material, and economic considerations. The seven concrete alternatives are barite-polyethylene concrete, barite-polyvinyl chloride (PVC) concrete, barite-portland cement concrete, pyrite-polyethylene concrete, pyrite-PVC concrete, pyrite-portland cement concrete, and ordinary concrete. The AHP analysis shows the superiority of pyrite-polyethylene concrete over the others
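The AHP used above ranks alternatives by deriving priority weights from pairwise comparison matrices and checking their consistency. A minimal sketch using the geometric-mean approximation follows; the comparison values are illustrative placeholders, not the paper's actual judgments:

```python
import math

def ahp_weights(M):
    """Approximate AHP priority vector via the geometric-mean
    (logarithmic least squares) method."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(M, w):
    """Saaty consistency ratio; CR < 0.1 is conventionally acceptable."""
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}  # random indices
    n = len(M)
    # lambda_max estimated from (M w)_i / w_i averaged over rows
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / ri[n]

# Pairwise comparisons of four shielding criteria (illustrative values):
# neutron attenuation, gamma attenuation, structural suitability, cost
M = [
    [1,   2,   4,   5],
    [1/2, 1,   3,   4],
    [1/4, 1/3, 1,   2],
    [1/5, 1/4, 1/2, 1],
]
w = ahp_weights(M)
cr = consistency_ratio(M, w)
```

The weights sum to one and preserve the dominance order of the comparison matrix, and the consistency ratio flags whether the judgments are coherent enough to trust.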

  5. Hidden Markov Model for Stock Selection

    Directory of Open Access Journals (Sweden)

    Nguyet Nguyen

    2015-10-01

    Full Text Available The hidden Markov model (HMM) is typically used to predict the hidden regimes of observation data. Therefore, this model finds applications in many different areas, such as speech recognition systems, computational molecular biology and financial market predictions. In this paper, we use HMM for stock selection. We first use HMM to make monthly regime predictions for four macroeconomic variables: inflation (consumer price index, CPI), the industrial production index (INDPRO), a stock market index (S&P 500) and market volatility (VIX). At the end of each month, we calibrate the HMM’s parameters for each of these economic variables and predict its regimes for the next month. We then look back into historical data to find the time periods for which the four variables had regimes similar to the forecasted regimes. Within those similar periods, we analyze all of the S&P 500 stocks to identify which stock characteristics were well rewarded during those time periods, and assign scores and corresponding weights for each of the stock characteristics. A composite score for each stock is calculated based on the scores and weights of its features. Based on this algorithm, we choose the 50 top-ranking stocks to buy. We compare the performance of the portfolio with the benchmark index, S&P 500. With an initial investment of $100 in December 1999, over 15 years, in December 2014, our portfolio had an average gain per annum of 14.9% versus 2.3% for the S&P 500.
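The regime predictions described above rest on standard HMM inference. Below is a self-contained sketch of the Viterbi algorithm on an invented two-regime volatility model; the states, probabilities, and observations are illustrative assumptions, not the paper's calibrated values:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden state sequence (Viterbi algorithm)."""
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    path = {s: [s] for s in states}
    for t in range(1, len(obs)):
        V.append({})
        new_path = {}
        for s in states:
            # best predecessor for state s at time t
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states)
            V[t][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

# Hypothetical 2-regime model for monthly volatility observations
states = ("calm", "turbulent")
start_p = {"calm": 0.6, "turbulent": 0.4}
trans_p = {"calm": {"calm": 0.8, "turbulent": 0.2},
           "turbulent": {"calm": 0.3, "turbulent": 0.7}}
emit_p = {"calm": {"low": 0.7, "high": 0.3},
          "turbulent": {"low": 0.2, "high": 0.8}}
regimes = viterbi(["low", "low", "high", "high"],
                  states, start_p, trans_p, emit_p)
# → ["calm", "calm", "turbulent", "turbulent"]
```

In the paper's setting, a regime sequence like this would be recomputed monthly per macroeconomic variable after recalibrating the HMM parameters.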

  6. Psyche Mission: Scientific Models and Instrument Selection

    Science.gov (United States)

    Polanskey, C. A.; Elkins-Tanton, L. T.; Bell, J. F., III; Lawrence, D. J.; Marchi, S.; Park, R. S.; Russell, C. T.; Weiss, B. P.

    2017-12-01

    NASA has chosen to explore (16) Psyche with their 14th Discovery-class mission. Psyche is a 226-km diameter metallic asteroid hypothesized to be the exposed core of a planetesimal that was stripped of its rocky mantle by multiple hit and run collisions in the early solar system. The spacecraft launch is planned for 2022 with arrival at the asteroid in 2026 for 21 months of operations. The Psyche investigation has five primary scientific objectives: A. Determine whether Psyche is a core, or if it is unmelted material. B. Determine the relative ages of regions of Psyche's surface. C. Determine whether small metal bodies incorporate the same light elements as are expected in the Earth's high-pressure core. D. Determine whether Psyche was formed under conditions more oxidizing or more reducing than Earth's core. E. Characterize Psyche's topography. The mission's task was to select the appropriate instruments to meet these objectives. However, exploring a metal world, rather than one made of ice, rock, or gas, requires development of new scientific models for Psyche to support the selection of the appropriate instruments for the payload. If Psyche is indeed a planetary core, we expect that it should have a detectable magnetic field. However, the strength of the magnetic field can vary by orders of magnitude depending on the formational history of Psyche. The implications of both the extreme low-end and the high-end predictions impact the magnetometer and mission design. For the imaging experiment, what can the team expect for the morphology of a heavily impacted metal body? Efforts are underway to further investigate the differences in crater morphology between high velocity impacts into metal and rock to be prepared to interpret the images of Psyche when they are returned. Finally, elemental composition measurements at Psyche using nuclear spectroscopy encompass a new and unexplored phase space of gamma-ray and neutron measurements. We will present some end

  7. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  8. Proposition of a multicriteria model to select logistics services providers

    Directory of Open Access Journals (Sweden)

    Miriam Catarina Soares Aharonovitz

    2014-06-01

    Full Text Available This study aims to propose a multicriteria model for selecting logistics service providers through the development of a decision tree. The methodology consists of a survey, which resulted in a sample of 181 responses. The sample was analyzed using statistical methods, among them descriptive statistics, multivariate analysis, variance analysis, and parametric tests to compare means. Based on these results, it was possible to obtain the decision tree and information to support the multicriteria analysis. The AHP (Analytic Hierarchy Process) was applied to determine the influence of the data and thus ensure better consistency in the analysis. The decision tree categorizes the criteria according to the decision levels (strategic, tactical and operational). Furthermore, it allows a generic evaluation of the importance of each criterion in the supplier selection process from the point of view of logistics services contractors.

  9. Process selection methodology for service management in SME

    Directory of Open Access Journals (Sweden)

    Juan Luis Rubio Sánchez

    2017-09-01

    Full Text Available It is a fact that more and more companies' operations rely on information and communication technologies (ICT). Traditional management models need to be adapted to this new reality. That is why some initiatives are emerging (COBIT [control objectives for information and related technology], CMMI [capability maturity model integration], ITIL [information technology infrastructure library], etc.) which aim to provide guidance on the most suitable processes, metrics and technology management indicators. This document focuses on ITIL, which is the best representation of what has been called IT governance. ITIL is a reference in technology services companies and in the ICT departments of any company, due to the high level of utility provided by the organization and coverage of the processes it proposes. Implementation of a management model based on ITIL processes forces companies to make a relevant decision: which processes should be implemented? Which one should be the first? The answer to these and other questions is not easy, because the adoption of these processes implies an economic investment. This article shows an approach to the implementation order so that we can optimize the position of the company relative to the competition in its sector, to similar-sized companies, or to any other parameter we could define.

  10. SELECTION AND PRELIMINARY EVALUATION OF ALTERNATIVE REDUCTANTS FOR SRAT PROCESSING

    Energy Technology Data Exchange (ETDEWEB)

    Stone, M.; Pickenheim, B.; Peeler, D.

    2009-06-30

    Defense Waste Processing Facility - Engineering (DWPF-E) has requested the Savannah River National Laboratory (SRNL) to perform scoping evaluations of alternative flowsheets with the primary focus on alternatives to formic acid during Chemical Process Cell (CPC) processing. The reductants shown below were selected for testing during the evaluation of alternative reductants for Sludge Receipt and Adjustment Tank (SRAT) processing. The reductants fall into two general categories: reducing acids and non-acidic reducing agents. Reducing acids were selected as direct replacements for formic acid to reduce mercury in the SRAT, to acidify the sludge, and to balance the melter REDuction/OXidation potential (REDOX). Non-acidic reductants were selected as melter reductants and would not be able to reduce mercury in the SRAT. Sugar was not tested during this scoping evaluation, as previous work has already been conducted on the use of sugar with DWPF feeds. Based on the testing performed, the only viable short-term path to mitigating hydrogen generation in the CPC is replacement of formic acid with a mixture of glycolic and formic acids. An experiment using glycolic acid blended with formic acid on an 80:20 molar basis was able to reduce mercury, while also targeting a predicted REDuction/OXidation (REDOX) of 0.2 expressed as Fe2+/ΣFe. Based on this result, SRNL recommends performing a complete CPC demonstration of the glycolic/formic acid flowsheet followed by a design basis development and documentation. Of the options tested recently and in the past, the nitric/glycolic/formic blended-acids flowsheet has the potential for near-term implementation in the existing CPC equipment, providing rapid throughput improvement. Use of a non-acidic reductant is recommended only if the processing constraints to remove mercury and acidify the sludge are eliminated. The non-acidic reductants (e.g. sugar) will not reduce mercury during CPC processing and sludge acidification would

  11. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  12. Modeling and generating input processes

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, M.E.

    1987-01-01

    This tutorial paper provides information relevant to the selection and generation of stochastic inputs to simulation studies. The primary area considered is multivariate but much of the philosophy at least is relevant to univariate inputs as well. 14 refs.

  13. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    Energy Technology Data Exchange (ETDEWEB)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles.

  14. Temporally selective processing of communication signals by auditory midbrain neurons

    DEFF Research Database (Denmark)

    Elliott, Taffeta M; Christensen-Dalsgaard, Jakob; Kelley, Darcy B

    2011-01-01

    Perception of the temporal structure of acoustic signals contributes critically to vocal signaling. In the aquatic clawed frog Xenopus laevis, calls differ primarily in the temporal parameter of click rate, which conveys sexual identity and reproductive state. We show here that an ensemble...... click rates ranged from 4 to 50 Hz, the rate at which the clicks begin to overlap. Frequency selectivity and temporal processing were characterized using response-intensity curves, temporal-discharge patterns, and autocorrelations of reduplicated responses to click trains. Characteristic frequencies...

  15. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    International Nuclear Information System (INIS)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles

  16. Analog modelling of obduction processes

    Science.gov (United States)

    Agard, P.; Zuo, X.; Funiciello, F.; Bellahsen, N.; Faccenna, C.; Savva, D.

    2012-04-01

    Obduction corresponds to one of plate tectonics oddities, whereby dense, oceanic rocks (ophiolites) are presumably 'thrust' on top of light, continental ones, as for the short-lived, almost synchronous Peri-Arabic obduction (which took place along thousands of km from Turkey to Oman in c. 5-10 Ma). Analog modelling experiments were performed to study the mechanisms of obduction initiation and test various triggering hypotheses (i.e., plate acceleration, slab hitting the 660 km discontinuity, ridge subduction; Agard et al., 2007). The experimental setup comprises (1) an upper mantle, modelled as a low-viscosity transparent Newtonian glucose syrup filling a rigid Plexiglas tank and (2) high-viscosity silicone plates (Rhodrosil Gomme with PDMS iron fillers to reproduce densities of continental or oceanic plates), located at the centre of the tank above the syrup to simulate the subducting and the overriding plates, and to avoid friction on the sides of the tank. Convergence is simulated by pushing on a piston at one end of the model with velocities comparable to those of plate tectonics (i.e., in the range 1-10 cm/yr). The reference set-up includes, from one end to the other (~60 cm): (i) the piston, (ii) a continental margin containing a transition zone to the adjacent oceanic plate, (iii) a weakness zone with variable resistance and dip (W), (iv) an oceanic plate, with or without a spreading ridge, (v) a subduction zone (S) dipping away from the piston and (vi) an upper, active continental margin, below which the oceanic plate is being subducted at the start of the experiment (as is known to have been the case in Oman). Several configurations were tested and over thirty different parametric tests were performed. Special emphasis was placed on comparing different types of weakness zone (W) and the extent of mechanical coupling across them, particularly when plates were accelerated. Displacements, together with along-strike and across-strike internal deformation in all

  17. Selection, Deselection, and Socialization Processes of Happiness in Adolescent Friendship Networks

    NARCIS (Netherlands)

    Workum, N. van; Scholte, R.H.J.; Cillessen, A.H.N.; Lodder, G.M.A.; Giletta, M.

    2013-01-01

    This study investigated selection, deselection, and influence processes of happiness in adolescent friendship networks. Longitudinal data on friendship networks and happiness of 426 adolescents (M = 15.78, SD = 0.65) were analyzed, using stochastic actor-based models. Although happiness similarity

  18. Parameter estimation and model selection in computational biology.

    Directory of Open Access Journals (Sweden)

    Gabriele Lillacci

    2010-03-01

    Full Text Available A central challenge in computational modeling of biological systems is the determination of the model parameters. Typically, only a fraction of the parameters (such as kinetic rate constants) are experimentally measured, while the rest are often fitted. The fitting process is usually based on experimental time course measurements of observables, which are used to assign parameter values that minimize some measure of the error between these measurements and the corresponding model prediction. The measurements, which can come from immunoblotting assays, fluorescent markers, etc., tend to be very noisy and taken at a limited number of time points. In this work we present a new approach to the problem of parameter selection of biological models. We show how one can use a dynamic recursive estimator, known as the extended Kalman filter, to arrive at estimates of the model parameters. The proposed method proceeds as follows. First, we use a variation of the Kalman filter that is particularly well suited to biological applications to obtain a first guess for the unknown parameters. Secondly, we employ an a posteriori identifiability test to check the reliability of the estimates. Finally, we solve an optimization problem to refine the first guess in case it should not be accurate enough. The final estimates are guaranteed to be statistically consistent with the measurements. Furthermore, we show how the same tools can be used to discriminate among alternate models of the same biological process. We demonstrate these ideas by applying our methods to two examples, namely a model of the heat shock response in E. coli, and a model of a synthetic gene regulation system. The methods presented are quite general and may be applied to a wide class of biological systems where noisy measurements are used for parameter estimation or model selection.
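For a constant scalar parameter observed through additive noise, the extended Kalman filter reduces to the plain scalar Kalman filter. The sketch below (synthetic data and invented noise levels; not the authors' implementation) shows this simplest special case converging to the true parameter value:

```python
import random

def kalman_constant(zs, q=1e-6, r=0.0025, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a (nearly) constant hidden parameter.
    q: process noise variance, r: measurement noise variance."""
    x, p = x0, p0
    for z in zs:
        p += q                  # predict step: parameter assumed constant
        k = p / (p + r)         # Kalman gain
        x += k * (z - x)        # correct with the measurement residual
        p *= (1.0 - k)          # posterior variance shrinks with data
    return x

# Synthetic noisy measurements of a true rate constant of 0.8
random.seed(1)
true_k = 0.8
zs = [true_k + random.gauss(0, 0.05) for _ in range(200)]
est = kalman_constant(zs, r=0.05 ** 2)
```

In the paper's setting the state additionally includes the nonlinear model dynamics, which is where the "extended" linearization step comes in; this sketch only illustrates the recursive predict/correct structure.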

  19. From business value model to coordination process model

    NARCIS (Netherlands)

    Fatemi, Hassan; Wieringa, Roelf J.; Poler, R.; van Sinderen, Marten J.; Sanchis, R.

    2009-01-01

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary

  20. Understanding Managers Decision Making Process for Tools Selection in the Core Front End of Innovation

    DEFF Research Database (Denmark)

    Appio, Francesco P.; Achiche, Sofiane; McAloone, Tim C.

    2011-01-01

    and optimise the activities. To select these tools, managers of the product development team have to use several premises to decide upon which tool is more appropriate to which activity. This paper proposes an approach to model the decision making process of the managers. The results underline the dimensions...... hypotheses are tested. A preliminary version of a theoretical model depicting the decision process of managers during tools selection in the FFE is proposed. The theoretical model is built from the constructed hypotheses....... influencing the decision process before a certain tool is chosen, and how those tools impact the performance of cost, time and efficiency. In order to achieve this, five companies participated for the data collection. Interesting trends and differences emerge from the analysis of the data in hand, and several...

  1. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    Full Text Available The article outlines the application of the processing approach to the functional description of the designed IT system supporting the operations of the secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented (“as is”) in a manual manner and target processes (“to be”), using the RFID technology for the purpose of their automation. Additionally, examples of applying the method of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency were presented. As an extension of the process analysis method, a process warehouse and process mining methods can also be applied.

  2. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the basis. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributable to the significant effort required and the lack of expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
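
    The general idea of identifying correlations from assessment results can be sketched as follows. This is an illustration, not the paper's model: the process-element names and assessment scores are made-up assumptions, and plain Pearson correlation stands in for the paper's CMMI-based analysis.

```python
import math
from itertools import combinations

# Hedged sketch (not the paper's model): estimate correlations between
# process-element assessment scores gathered across several projects,
# and flag strongly correlated pairs. Element names and scores are
# illustrative assumptions.

scores = {                     # element -> assessment score per project
    "REQM.SP1.1": [2, 3, 3, 4, 4],
    "PP.SP2.1":   [2, 3, 4, 4, 5],
    "CM.SP1.2":   [5, 4, 2, 3, 1],
}

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# pairs whose scores move together would be planned for improvement jointly
correlated = [(a, b, round(pearson(scores[a], scores[b]), 2))
              for a, b in combinations(scores, 2)
              if abs(pearson(scores[a], scores[b])) >= 0.8]
for a, b, r in correlated:
    print(a, b, r)
```

    A strongly correlated pair suggests that an improvement plan addressing one element without the other is likely to be inefficient, which is the motivation the abstract gives.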

  3. Social Influence and Selection Processes as Predictors of Normative Perceptions and Alcohol Use across the Transition to College

    Science.gov (United States)

    Abar, Caitlin C.; Maggs, Jennifer L.

    2010-01-01

    Research indicates that social influences impact college students' alcohol consumption; however, how selection processes may serve as an influential factor predicting alcohol use in this population has not been widely addressed. A model of influence and selection processes contributing to alcohol use across the transition to college was examined…

  4. Towards the Automated Annotation of Process Models

    NARCIS (Netherlands)

    Leopold, H.; Meilicke, C.; Fellmann, M.; Pittke, F.; Stuckenschmidt, H.; Mendling, J.

    2016-01-01

    Many techniques for the advanced analysis of process models build on the annotation of process models with elements from predefined vocabularies such as taxonomies. However, the manual annotation of process models is cumbersome and sometimes even hardly manageable taking the size of taxonomies into

  5. A CONCEPTUAL MODEL FOR IMPROVED PROJECT SELECTION AND PRIORITISATION

    Directory of Open Access Journals (Sweden)

    P. J. Viljoen

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Project portfolio management processes are often designed and operated as a series of stages (or project phases) and gates. However, the flow of such a process is often slow, characterised by queues waiting for a gate decision and by repeated work from previous stages waiting for additional information or for re-processing. In this paper the authors propose a conceptual model that applies supply chain and constraint management principles to the project portfolio management process. An advantage of the proposed model is that it provides the ability to select and prioritise projects without undue changes to project schedules. This should result in faster flow through the system.

    AFRIKAANSE OPSOMMING (translated): Processes for managing portfolios of projects are normally designed and operated as a series of phases and gates. The flow through such a process is often slow and is characterised by queues waiting for decisions at the gates, and by rework from earlier phases waiting for further information or for reprocessing. In this article a conceptual model is proposed. The model rests on the principles of supply chains as well as constraint management, and offers the advantage that projects can be selected and prioritised without unnecessary changes to project schedules. This should lead to accelerated flow through the system.

  6. Nutritional and toxicological composition analysis of selected cassava processed products

    Directory of Open Access Journals (Sweden)

    Kuda Dewage Supun Charuni Nilangeka Rajapaksha

    2017-01-01

    Full Text Available Cassava (Manihot esculenta Crantz) is an important food source in tropical countries, where it can withstand environmentally stressed conditions. Cassava and its processed products have a high demand in both the local and export markets of Sri Lanka. MU51 is one of the more common cassava varieties, and boiling is the main consumption pattern of cassava among Sri Lankans. The low utilization of cassava is due to the presence of cyanide, which is a toxic substance. This research was designed to analyse the nutritional composition and toxicological (cyanide) content of the cassava MU51 variety and selected processed products of cassava MU51 (boiled, starch, flour, chips, and two chips varieties purchased from the market) to identify the effect of processing on the MU51 variety. Nutritional composition was analysed by AOAC (2012) methods with modifications, and cyanide content was determined following the picric acid method of spectrophotometric determination. The flesh of the MU51 variety and the different processed products of cassava had average ranges of moisture content (3.18-61.94%), total fat (0.31-23.30%), crude fiber (0.94-2.15%), protein (1.67-3.71%) and carbohydrates (32.68-84.20%), which varied significantly between the products and the variety MU51, whereas no significant difference (p > 0.05) was observed between the ash content of MU51 flesh and the processed products, which ranged from 1.02 to 1.91%. However, the boiled product and MU51 flesh had more similar results in their nutritional composition, showing no significant difference in any of the nutrients analysed. Thus, there could be no significant effect on the nutrient composition of raw cassava once it is boiled. Cyanide content of the MU51 flesh and selected products (boiled, starch, flour and chips) prepared using the MU51 variety showed wide variation, ranging from 4.68 mg.kg-1 to 33.92 mg.kg-1 on a dry basis. But except for boiled cassava, all processed products had cyanide content <10 mg.kg-1, which

  7. Algorithms of control parameters selection for automation of FDM 3D printing process

    Directory of Open Access Journals (Sweden)

    Kogut Paweł

    2017-01-01

    Full Text Available The paper presents algorithms for the selection of control parameters of the Fused Deposition Modelling (FDM) technology in the case of an open printing solutions environment and the 3DGence ONE printer. The following parameters were distinguished: model mesh density, material flow speed, cooling performance, retraction and printing speeds. In principle these parameters are independent of the printing system, but in practice they depend to a certain degree on the features of the selected printing equipment. This is the first step towards automation of the 3D printing process in FDM technology.

  8. Cellular scanning strategy for selective laser melting: Generating reliable, optimized scanning paths and processing parameters

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2015-01-01

    Selective laser melting is yet to become a standardized industrial manufacturing technique. The process continues to suffer from defects such as distortions, residual stresses, localized deformations and warpage, caused primarily by the localized heating, rapid cooling and high temperature gradients that occur during the process. While process monitoring and control of selective laser melting is an active area of research, establishing the reliability and robustness of the process still remains a challenge. In this paper, a methodology for generating reliable, optimized scanning paths … method based uncertainty and reliability analysis. The reliability of the scanning paths is established using cumulative probability distribution functions for process output criteria such as sample density, thermal homogeneity, etc. A customized genetic algorithm is used along with the simulation model …
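
    The genetic-algorithm component mentioned in this record can be sketched generically. This is a hedged stand-in, not the authors' customized algorithm: the coupling to a melting simulation is replaced by a toy fitness surface, and the (power, speed) bounds and optimum are illustrative assumptions, not real SLM process data.

```python
import random

# Hedged sketch of a genetic algorithm tuning two process parameters.
# The fitness surface is a stand-in for the thermal simulation; its peak
# at power=250 W, speed=800 mm/s is an arbitrary assumption.

random.seed(7)
BOUNDS = {"power": (100.0, 400.0), "speed": (200.0, 1500.0)}  # W, mm/s

def fitness(ind):
    p, s = ind                           # proxy for achieved sample density
    return -((p - 250.0) / 150.0) ** 2 - ((s - 800.0) / 700.0) ** 2

def random_ind():
    return tuple(random.uniform(*BOUNDS[k]) for k in ("power", "speed"))

def mutate(ind):
    return tuple(min(max(v + random.gauss(0, 20), lo), hi)
                 for v, (lo, hi) in zip(ind, (BOUNDS["power"],
                                              BOUNDS["speed"])))

def crossover(a, b):
    return tuple(random.choice(pair) for pair in zip(a, b))

pop = [random_ind() for _ in range(30)]
for _ in range(60):                      # generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                   # truncation selection with elitism
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(20)]

best = max(pop, key=fitness)
print(round(best[0]), round(best[1]))    # near the assumed optimum
```

    In the paper's setting the fitness call would instead run the thermal simulation and score the resulting density and thermal homogeneity distributions.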

  9. Nanoemulsion: process selection and application in cosmetics--a review.

    Science.gov (United States)

    Yukuyama, M N; Ghisleni, D D M; Pinto, T J A; Bou-Chacra, N A

    2016-02-01

    In recent decades, considerable and continuous growth in consumer demand in the cosmetics field has spurred the development of sophisticated formulations, aiming at high performance, attractive appearance, sensorial benefit and safety. Yet despite increasing demand from consumers, the formulator faces certain restrictions regarding the optimum equilibrium between the active compound concentration and the formulation base, taking into account the nature of the skin structure, mainly concerning the ideal penetration of the active compound, due to the natural skin barrier. Emulsion is a mixture of two immiscible phases, and interest in nanoscale emulsions has grown considerably in recent decades due to their specific attributes such as high stability, attractive appearance and drug delivery properties; performance is therefore expected to improve using a lipid-based nanocarrier. Nanoemulsions are generated by different approaches: the so-called high-energy and low-energy methods. A global overview of these mechanisms and different alternatives for each method are presented in this paper, along with their benefits and drawbacks. As a cosmetics formulation is reflected in product delivery to consumers, nanoemulsion development with prospects for large-scale production is one of the key attributes in the method selection process. Thus, the aim of this review was to highlight the main high- and low-energy methods applicable in cosmetics and dermatological product development, their specificities, recent research on these methods in cosmetics, and considerations for process selection optimization. The specific processes for inorganic nanoparticles, polymer nanoparticles and nanocapsule formulation are not considered in this paper. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  10. Business Process Modelling based on Petri nets

    Directory of Open Access Journals (Sweden)

    Qin Jianglong

    2017-01-01

    Full Text Available Business process modelling is the way business processes are expressed. It is the foundation of business process analysis, reengineering, reorganization and optimization. It can not only help enterprises achieve internal information-system integration and reuse, but also help them collaborate with external partners. Based on the prototype Petri net, this paper adds time and cost factors to form an extended generalized stochastic Petri net, which is a formal description of the business process. A semi-formalized business process modelling algorithm based on Petri nets is proposed. Finally, a case from a logistics company proves that the modelling algorithm is correct and effective.
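
    The extension described in this record, transitions annotated with time and cost, can be sketched with a minimal token game. This is an illustrative assumption, not the paper's logistics case study: the net structure, durations and costs are made up.

```python
import random

# Hedged sketch of a stochastic Petri net whose transitions carry time
# and cost annotations, replayed over a simple two-step order process.
# Net structure and numbers are illustrative assumptions.

marking = {"received": 1, "checked": 0, "shipped": 0}

transitions = [
    # name, input places, output places, mean duration (h), cost
    ("check_order", ["received"], ["checked"], 1.5, 20.0),
    ("ship_order",  ["checked"],  ["shipped"], 4.0, 75.0),
]

def enabled(t):
    return all(marking[p] >= 1 for p in t[1])

random.seed(3)
total_time = total_cost = 0.0
while True:
    ready = [t for t in transitions if enabled(t)]
    if not ready:
        break                            # dead marking: the run is finished
    name, ins, outs, mean_dur, cost = random.choice(ready)
    for p in ins:
        marking[p] -= 1                  # consume input tokens
    for p in outs:
        marking[p] += 1                  # produce output tokens
    total_time += random.expovariate(1.0 / mean_dur)  # stochastic delay
    total_cost += cost

print(marking["shipped"], round(total_cost, 2))
```

    Repeating such runs yields distributions of total time and cost, which is the kind of quantitative analysis the extended generalized stochastic Petri net enables beyond the plain prototype net.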

  11. Modeling process flow using diagrams

    NARCIS (Netherlands)

    Kemper, B.; de Mast, J.; Mandjes, M.

    2010-01-01

    In the practice of process improvement, tools such as the flowchart, the value-stream map (VSM), and a variety of ad hoc variants of such diagrams are commonly used. The purpose of this paper is to present a clear, precise, and consistent framework for the use of such flow diagrams in process

  12. Evaluating experimental design for soil-plant model selection with Bayesian model averaging

    Science.gov (United States)

    Wöhling, Thomas; Geiges, Andreas; Nowak, Wolfgang; Gayler, Sebastian

    2013-04-01

    The objective selection of appropriate models for realistic simulations of coupled soil-plant processes is a challenging task since the processes are complex, not fully understood at larger scales, and highly non-linear. Also, comprehensive data sets are scarce, and measurements are uncertain. In the past decades, a variety of different models have been developed that exhibit a wide range of complexity regarding their approximation of processes in the coupled model compartments. We present a method for evaluating experimental design for maximum confidence in the model selection task. The method considers uncertainty in parameters, measurements and model structures. Advancing the ideas behind Bayesian Model Averaging (BMA), the model weights in BMA are perceived as uncertain quantities with assigned probability distributions that narrow down as more data are made available. This allows assessing the power of different data types, data densities and data locations in identifying the best model structure from among a suite of plausible models. The models considered in this study are the crop models CERES, SUCROS, GECROS and SPASS, which are coupled to identical routines for simulating soil processes within the modelling framework Expert-N. The four models considerably differ in the degree of detail at which crop growth and root water uptake are represented. Monte-Carlo simulations were conducted for each of these models considering their uncertainty in soil hydraulic properties and selected crop model parameters. The models were then conditioned on field measurements of soil moisture, leaf-area index (LAI), and evapotranspiration rates (from eddy-covariance measurements) during a vegetation period of winter wheat at the Nellingen site in Southwestern Germany. Following our new method, we derived the BMA model weights (and their distributions) when using all data or different subsets thereof. We discuss to which degree the posterior BMA mean outperformed the prior BMA …
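
    The BMA weight computation at the core of this record can be sketched with a common approximation. This is a hedged illustration, not the paper's actual evidence calculation: the BIC approximation assumes Gaussian errors, and the residual sums, parameter counts and data size below are made up.

```python
import math

# Hedged sketch: posterior BMA model weights via a BIC approximation to
# each model's evidence (Gaussian errors assumed). Illustrates how the
# weights narrow as data accumulate; not the paper's computation.

def bma_weights(rss, n_params, n_obs, priors=None):
    m = len(rss)
    priors = priors or [1.0 / m] * m     # uniform prior over models
    bic = [n_obs * math.log(r / n_obs) + k * math.log(n_obs)
           for r, k in zip(rss, n_params)]
    best = min(bic)                      # subtract for numerical stability
    raw = [p * math.exp(-0.5 * (b - best)) for p, b in zip(priors, bic)]
    z = sum(raw)
    return [w / z for w in raw]

# four hypothetical crop models fitted to the same observations
weights = bma_weights(rss=[4.1, 3.2, 3.0, 6.5],
                      n_params=[6, 9, 12, 4], n_obs=120)
print([round(w, 3) for w in weights])
```

    Re-evaluating these weights for different data subsets, as the abstract describes, shows which data types and densities are most informative for discriminating among the candidate model structures.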

  13. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Heng [Pacific Northwest National Laboratory, Richland Washington USA; Ye, Ming [Department of Scientific Computing, Florida State University, Tallahassee Florida USA; Walker, Anthony P. [Environmental Sciences Division and Climate Change Science Institute, Oak Ridge National Laboratory, Oak Ridge Tennessee USA; Chen, Xingyuan [Pacific Northwest National Laboratory, Richland Washington USA

    2017-04-01

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
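
    The structure of such a process sensitivity index can be sketched by Monte Carlo. This is a hedged toy illustration in the spirit of the abstract, not the paper's derivation: each process has two alternative models with their own random parameters, the output function is a stand-in for the reactive transport model, and all functions and numbers are assumptions.

```python
import random
import statistics

# Hedged sketch of a process sensitivity index: the variance of the
# conditional mean over one process (model choice + parameters) divided
# by the total output variance. Toy functions only.

random.seed(5)

def sample_recharge():           # process model + parameter uncertainty
    if random.random() < 0.5:
        return 0.2 * random.uniform(300, 500)       # linear precip model
    return 0.001 * random.uniform(300, 500) ** 1.5  # power-law model

def sample_geology():
    if random.random() < 0.5:
        return random.gauss(10.0, 1.0)              # homogeneous conductivity
    return random.gauss(12.0, 3.0)                  # layered parameterization

def output(recharge, geology):   # stand-in for the transport model output
    return recharge * geology

N, M = 400, 200
cond_means, all_y = [], []
for _ in range(N):               # outer loop fixes the recharge realization
    r = sample_recharge()
    ys = [output(r, sample_geology()) for _ in range(M)]
    cond_means.append(statistics.fmean(ys))
    all_y.extend(ys)

ps_recharge = statistics.pvariance(cond_means) / statistics.pvariance(all_y)
print(round(ps_recharge, 2))
```

    Because the model choice is sampled inside each process, the index captures both model and parametric uncertainty, which is the key point of the abstract.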

  14. Modeling grinding processes as micro processes

    African Journals Online (AJOL)

    eobe

    into two parts: static specific chip formation energy and dynamic specific chip formation ... the ratio of static normal chip formation force to static tangential chip formation force and the ratio ... grinding processing parameters to the friction coefficient between workpiece and grinding wheel. From equation. (20), the calculation ...

  15. Early environmental planning: A process for power line corridor selection

    International Nuclear Information System (INIS)

    Haagenstad, T.; Bare, C.M.

    1998-01-01

    Los Alamos National Laboratory (LANL) conducted an environmental planning study in the fall of 1997 to help determine the best alternative for upgrading the Laboratory's electrical power system. Alternatives considered included an on-site power generation facility and two corridors for a 10-mile-long 115-kV power line. This planning process was conducted prior to the formal National Environmental Policy Act (NEPA) review. The goals were to help select the best proposed action, to recommend modifications and mitigation measures for each alternative for a more environmentally sound project, and to avoid potential delays once the formal Department of Energy review process began. Significant constraints existed from a planning perspective, including operational issues such as existing outdoor high explosives testing areas, as well as environmental issues including threatened and endangered species habitats, multiple archeological sites, contaminated areas, and aesthetics. The study had to be completed within 45 days to meet project schedule needs. The process resulted in a number of important recommendations. While the construction and operation of the on-site power generation facility could have minimal environmental impacts, the need for a new air quality permit would create severe cost and schedule constraints for the project. From an environmental perspective, construction and operation of a power line within either corridor was concluded to be a viable alternative. However, impacts with either corridor would have to be reduced through specific recommended alignment modifications and mitigation measures.

  16. Selecting a Control Strategy for Plug and Process Loads

    Energy Technology Data Exchange (ETDEWEB)

    Lobato, C.; Sheppy, M.; Brackney, L.; Pless, S.; Torcellini, P.

    2012-09-01

    Plug and Process Loads (PPLs) are building loads that are not related to general lighting, heating, ventilation, cooling, and water heating, and typically do not provide comfort to the building occupants. PPLs in commercial buildings account for almost 5% of U.S. primary energy consumption. On an individual building level, they account for approximately 25% of the total electrical load in a minimally code-compliant commercial building, and can exceed 50% in an ultra-high efficiency building such as the National Renewable Energy Laboratory's (NREL) Research Support Facility (RSF) (Lobato et al. 2010). Minimizing these loads is a primary challenge in the design and operation of an energy-efficient building. A complex array of technologies that measure and manage PPLs has emerged in the marketplace. Some fall short of manufacturer performance claims, however. NREL has been actively engaged in developing an evaluation and selection process for PPL control, and is using this process to evaluate a range of technologies for active PPL management that will cap RSF plug loads. Using a control strategy to match plug load use to users' required job functions offers huge untapped potential for energy savings.

  17. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five-stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  18. A new Russell model for selecting suppliers

    NARCIS (Netherlands)

    Azadi, Majid; Shabani, Amir; Farzipoor Saen, Reza

    2014-01-01

    Recently, supply chain management (SCM) has been considered by many researchers. Supplier evaluation and selection plays a significant role in establishing an effective SCM. One of the techniques that can be used for selecting suppliers is data envelopment analysis (DEA). In some situations, to

  19. Selecting, weeding, and weighting biased climate model ensembles

    Science.gov (United States)

    Jackson, C. S.; Picton, J.; Huerta, G.; Nosedal Sanchez, A.

    2012-12-01

    In the Bayesian formulation, the "log-likelihood" is a test statistic for selecting, weeding, or weighting climate model ensembles with observational data. This statistic has the potential to synthesize the physical and data constraints on quantities of interest. One of the thorny issues in formulating the log-likelihood is how one should account for biases. While in the past we have included a generic discrepancy term, not all biases affect predictions of quantities of interest. We make use of a 165-member ensemble of CAM3.1/slab ocean climate models with different parameter settings to think through the issues that are involved with predicting each model's sensitivity to greenhouse gas forcing given what can be observed from the base state. In particular we use multivariate empirical orthogonal functions to decompose the differences that exist among this ensemble to discover what fields and regions matter to the model's sensitivity. We find that the differences that matter are a small fraction of the total discrepancy. Moreover, weighting members of the ensemble using this knowledge does a relatively poor job of adjusting the ensemble mean toward the known answer. This points out the shortcomings of using weights to correct for biases in climate model ensembles created by a selection process that does not emphasize the priorities of your log-likelihood.
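
    The weighting mechanism this record discusses can be sketched in a few lines. This is a hedged illustration of the standard construction, not the authors' analysis: the log-likelihood values and the member "sensitivities" below are made up for illustration.

```python
import math

# Hedged sketch of log-likelihood-based ensemble weighting: members with
# smaller observation-space misfit receive larger weight. The misfit and
# sensitivity values are illustrative assumptions.

def loglik_weights(log_liks):
    best = max(log_liks)                 # subtract max for numerical safety
    raw = [math.exp(ll - best) for ll in log_liks]
    z = sum(raw)
    return [r / z for r in raw]

log_liks = [-12.4, -9.1, -9.3, -15.0, -10.2]  # one value per ensemble member
w = loglik_weights(log_liks)

sensitivities = [2.1, 3.4, 3.1, 1.8, 2.9]     # hypothetical member outputs
weighted_mean = sum(wi * si for wi, si in zip(w, sensitivities))
print([round(x, 3) for x in w], round(weighted_mean, 2))
```

    The record's caution applies directly here: if the misfits reflect biases that are irrelevant to the quantity of interest, the weighted mean can be a poor correction even though the weights are formally Bayesian.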

  20. Development of Solar Drying Model for Selected Cambodian Fish Species

    Directory of Open Access Journals (Sweden)

    Anna Hubackova

    2014-01-01

    Full Text Available Solar drying was investigated as one of the prospective techniques for fish processing in Cambodia. The solar drying was compared to conventional drying in an electric oven. Five typical Cambodian fish species were selected for this study. Mean solar drying temperature and drying air relative humidity were 55.6°C and 19.9%, respectively. The overall solar dryer efficiency was 12.37%, which is typical for natural convection solar dryers. The average evaporative capacity of the solar dryer was 0.049 kg·h−1. Based on the coefficient of determination (R2), chi-square (χ2) test, and root-mean-square error (RMSE), the most suitable models describing natural convection solar drying kinetics were the Logarithmic model, the Diffusion approximate model, and the Two-term model for climbing perch and Nile tilapia, swamp eel and walking catfish, and Channa fish, respectively. In the case of electric oven drying, the Modified Page 1 model shows the best results for all investigated fish species except Channa fish, where the Two-term model is the best one. Sensory evaluation shows that the most preferred fish is climbing perch, followed by Nile tilapia and walking catfish. This study brings new knowledge about the drying kinetics of fresh water fish species in Cambodia and confirms solar drying as an acceptable technology for fish processing.
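
    The model-fitting step described in this record can be sketched for one of the named models. This is a hedged illustration, not the study's data or procedure: the Logarithmic drying model MR = a·exp(-k·t) + c is fitted by least squares (grid over k, closed-form a and c), and the drying-curve data are synthetic.

```python
import math

# Hedged sketch: fit the Logarithmic drying model MR = a*exp(-k*t) + c
# and report R^2 and RMSE, the goodness-of-fit metrics the abstract uses.
# The moisture-ratio data below are synthetic assumptions.

t = [0, 1, 2, 3, 4, 5, 6, 8, 10]                            # drying time, h
mr = [1.00, 0.74, 0.56, 0.43, 0.34, 0.27, 0.22, 0.17, 0.14]  # moisture ratio

def fit_for_k(k):
    # for fixed k the model is linear in (a, c): solve 2x2 normal equations
    e = [math.exp(-k * ti) for ti in t]
    n = len(t)
    se, see = sum(e), sum(x * x for x in e)
    sy, sey = sum(mr), sum(x * y for x, y in zip(e, mr))
    det = n * see - se * se
    a = (n * sey - se * sy) / det
    c = (sy - a * se) / n
    resid = [y - (a * ei + c) for y, ei in zip(mr, e)]
    return a, c, sum(r * r for r in resid)

best = min(((k / 1000.0,) + fit_for_k(k / 1000.0) for k in range(50, 1000)),
           key=lambda rec: rec[3])
k, a, c, sse = best
mean_mr = sum(mr) / len(mr)
r2 = 1 - sse / sum((y - mean_mr) ** 2 for y in mr)
rmse = math.sqrt(sse / len(mr))
print(round(k, 3), round(a, 3), round(c, 3), round(r2, 4), round(rmse, 4))
```

    Repeating the fit for each candidate model (Diffusion approximate, Two-term, Modified Page 1, etc.) and comparing R2, χ2 and RMSE is the selection procedure the abstract describes.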

  1. Selective experimental review of the Standard Model

    International Nuclear Information System (INIS)

    Bloom, E.D.

    1985-02-01

    Before discussing experimental comparisons with the Standard Model (S-M), it is probably wise to define more completely what is commonly meant by this popular term. This model is a gauge theory of SU(3)_f x SU(2)_L x U(1) with 18 parameters. The parameters are alpha_s, alpha_qed, theta_W, M_W (M_Z = M_W/cos theta_W, and thus is not an independent parameter), M_Higgs; the lepton masses M_e, M_mu, M_tau; the quark masses M_d, M_s, M_b, and M_u, M_c, M_t; and finally, the quark mixing angles theta_1, theta_2, theta_3, and the CP-violating phase delta. The latter four parameters appear in the quark mixing matrix for the Kobayashi-Maskawa and Maiani forms. Clearly, the present S-M covers an enormous range of physics topics, and the author can only lightly cover a few such topics in this report. The measurement of R_hadron is fundamental as a test of the running coupling constant alpha_s in QCD. The author will discuss a selection of recent precision measurements of R_hadron, as well as some other techniques for measuring alpha_s. QCD also requires the self-interaction of gluons. The search for the three-gluon vertex may be practically realized in the clear identification of gluonic mesons. The author will present a limited review of recent progress in the attempt to untangle such mesons from the plethora of q anti-q states of the same quantum numbers which exist in the same mass range. The electroweak interactions provide some of the strongest evidence supporting the S-M that exists. Given the recent progress in this subfield, and particularly with the discovery of the W and Z bosons at CERN, many recent reviews obviate the need for further discussion in this report. In attempting to validate a theory, one frequently searches for new phenomena which would clearly invalidate it. 49 references, 28 figures

  2. Manufacturing Process Selection of Composite Bicycle’s Crank Arm using Analytical Hierarchy Process (AHP)

    Science.gov (United States)

    Luqman, M.; Rosli, M. U.; Khor, C. Y.; Zambree, Shayfull; Jahidi, H.

    2018-03-01

    Crank arm is one of the important parts of a bicycle and is an expensive product due to the high cost of material and the production process. This research aims to investigate the potential types of manufacturing process to fabricate a composite bicycle crank arm, and to describe an approach based on the analytical hierarchy process (AHP) that assists decision makers or manufacturing engineers in determining the most suitable process to be employed in manufacturing of the composite bicycle crank arm at the early stage of the product development process, so as to reduce the production cost. Four types of processes were considered, namely resin transfer molding (RTM), compression molding (CM), vacuum bag molding and filament winding (FW). The analysis ranks these four types of process for their suitability in the manufacturing of the bicycle crank arm based on five main selection factors and 10 sub-factors. Determining the right manufacturing process was performed based on the AHP process steps. A consistency test was performed to make sure the judgements were consistent during the comparison. The results indicated that compression molding was the most appropriate manufacturing process because it has the highest value (33.6%) among the other manufacturing processes.
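
    The AHP steps named in this record, pairwise comparison, priority derivation and the consistency test, can be sketched as follows. The 4x4 comparison matrix over RTM, CM, vacuum bag molding and FW is an illustrative assumption, not the paper's elicited judgements, and the geometric-mean method is one common way to derive priorities.

```python
import math

# Hedged sketch of AHP: priorities from a pairwise comparison matrix by
# the geometric-mean method, then a consistency ratio check. The matrix
# values are illustrative assumptions on Saaty's 1-9 scale.

A = [  # rows/cols: RTM, CM, vacuum bag molding, FW
    [1.0, 1/3, 2.0, 3.0],
    [3.0, 1.0, 4.0, 5.0],
    [1/2, 1/4, 1.0, 2.0],
    [1/3, 1/5, 1/2, 1.0],
]
n = len(A)

gm = [math.prod(row) ** (1.0 / n) for row in A]   # geometric mean per row
w = [g / sum(gm) for g in gm]                     # normalized priorities

# lambda_max from A*w, then consistency index and consistency ratio
Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lam = sum(Aw[i] / w[i] for i in range(n)) / n
ci = (lam - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12}                  # Saaty's random indices
cr = ci / RI[n]

print([round(x, 3) for x in w], round(cr, 3))     # CR < 0.1 means consistent
```

    In the full method this computation is repeated per selection factor and sub-factor, and the factor-level priorities are aggregated up the hierarchy to yield the overall ranking (33.6% for CM in the paper's result).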

  3. Modelling of Batch Process Operations

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli; Cameron, Ian; Gani, Rafiqul

    2011-01-01

    Here a batch cooling crystalliser is modelled and simulated as is a batch distillation system. In the batch crystalliser four operational modes of the crystalliser are considered, namely: initial cooling, nucleation, crystal growth and product removal. A model generation procedure is shown that s...

  4. Mathematical Modeling: A Structured Process

    Science.gov (United States)

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  5. Selection of models to calculate the LLW source term

    International Nuclear Information System (INIS)

    Sullivan, T.M.

    1991-10-01

    Performance assessment of a LLW disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). In turn, many of these physical processes are influenced by the design of the disposal facility (e.g., infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This document provides a brief overview of disposal practices and reviews existing source term models as background for selecting appropriate models for estimating the source term. The selection rationale and the mathematical details of the models are presented. Finally, guidance is presented for combining the inventory data with appropriate mechanisms describing release from the disposal facility. 44 refs., 6 figs., 1 tab
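
    A minimal instance of the source-term structure this overview describes, inventory combined with container degradation and release mechanisms, can be sketched as follows. This is a hedged toy model, not one of the reviewed source term models: the half-life, leach rate and container breach time are illustrative assumptions.

```python
import math

# Hedged sketch of a minimal source-term calculation: a radionuclide
# inventory depleted by radioactive decay and released by first-order
# leaching once the container is breached. All numbers are assumptions.

HALF_LIFE_Y = 30.0          # e.g. a Cs-137-like nuclide
LAMBDA = math.log(2) / HALF_LIFE_Y
LEACH_RATE = 0.01           # fraction of remaining inventory released per year
BREACH_Y = 50.0             # container assumed intact before this time

def release_rate(t_years, inv0=1.0):
    """Release rate as a fraction of the initial inventory per year."""
    inv = inv0 * math.exp(-LAMBDA * t_years)          # radioactive decay
    if t_years < BREACH_Y:
        return 0.0                                    # container still intact
    inv *= math.exp(-LEACH_RATE * (t_years - BREACH_Y))  # prior leach losses
    return LEACH_RATE * inv

for t in (25, 50, 100, 300):
    print(t, f"{release_rate(t):.2e}")
```

    Summing such terms over the inventory, with mechanism parameters chosen per wasteform, is the kind of simplified (non-mechanistic) representation the document argues is necessary given the data limitations.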

  6. Sleep-dependent memory triage: Evolving generalization through selective processing

    Science.gov (United States)

    Stickgold, Robert; Walker, Matthew P.

    2018-01-01

    The brain does not retain all the information it encodes in a day. Much is forgotten, and of those memories retained, their subsequent “evolution” can follow any of a number of pathways. Emerging data makes clear that sleep is a compelling candidate for performing many of these operations. But how does the sleeping brain know which information to preserve and which to forget? What should sleep do with that information it chooses to keep? For information that is retained, sleep can integrate it into existing memory networks, look for common patterns and distill overarching rules, or simply stabilize and strengthen the memory exactly as it was learned. We suggest such “memory triage” lies at the heart of a sleep-dependent memory processing system that selects new information, in a discriminatory manner, and assimilates it into the brain’s vast armamentarium of evolving knowledge, helping guide each organism through its own, unique life. PMID:23354387

  7. Selection of Vendor Based on Intuitionistic Fuzzy Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Prabjot Kaur

    2014-01-01

    Full Text Available The business environment is characterized by ever greater domestic and international competition in the global market. Vendors play a key role in achieving corporate competitiveness, but it is not easy to identify good vendors because the evaluation is based on multiple criteria. In practice, for the vendor selection problem (VSP), most of the input information about the criteria is not known precisely. The intuitionistic fuzzy set is an extension of classical fuzzy set theory (FST) and a suitable way to deal with this imprecision: applying intuitionistic fuzzy sets instead of fuzzy sets introduces another degree of freedom, the nonmembership function, into the set description. In this paper, we propose a triangular intuitionistic fuzzy number based approach for the vendor selection problem using the analytical hierarchy process. The crisp data of the vendors are represented in the form of triangular intuitionistic fuzzy numbers. By applying AHP, which involves decomposition, pairwise comparison, and deriving priorities for the various levels of the hierarchy, an overall crisp priority is obtained for ranking the best vendor. A numerical example illustrates our method. Lastly, a sensitivity analysis is performed to find the most critical criterion on the basis of which the vendor is selected.
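The crisp core of AHP, deriving a priority vector from a pairwise comparison matrix, can be sketched with the row geometric-mean approximation; the intuitionistic-fuzzy extension wraps each judgment in membership and nonmembership degrees. The comparison matrix below is invented for illustration:

```python
import math

def ahp_priorities(pairwise):
    """Approximate AHP priority vector via the row geometric-mean method."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical reciprocal comparison matrix for three vendor criteria
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
weights = ahp_priorities(M)
```

For a full AHP one would also check the consistency ratio of M before trusting the weights.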

  8. Supercritical boiler material selection using fuzzy analytic network process

    Directory of Open Access Journals (Sweden)

    Saikat Ranjan Maity

    2012-08-01

    Full Text Available The recent development of the world is being adversely affected by the scarcity of power and energy. To survive in the next generation, it is thus necessary to explore non-conventional energy sources and to consume the available sources efficiently. For efficient exploitation of existing energy sources, great scope lies in the use of Rankine cycle-based thermal power plants. Today, the gross efficiency of Rankine cycle-based thermal power plants is less than 28%, which has been increased up to 40% with reheating and regenerative cycles. It can be further improved, up to 47%, by using supercritical power plant technology. Supercritical power plants use supercritical boilers which are able to withstand very high temperatures (650-720˚C) and pressures (22.1 MPa) while producing superheated steam. The thermal efficiency of a supercritical boiler greatly depends on the materials of its different components. A supercritical boiler material should possess high creep rupture strength, high thermal conductivity, low thermal expansion, high specific heat, and the ability to withstand very high temperatures. This paper considers a list of seven supercritical boiler materials whose performance is evaluated on seven pivotal criteria. Given the intricacy of this material selection problem, with its interactions and interdependencies between different criteria, this paper applies the fuzzy analytic network process to select the most appropriate material for a supercritical boiler. Rene 41 emerges as the best supercritical boiler material, whereas Haynes 230 is the least preferred choice.

  9. Optimal processing pathway selection for microalgae-based biorefinery under uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Zaman, Muhammad; Lee, Jay H.

    2015-01-01

    We propose a systematic framework for the selection of optimal processing pathways for a microalgae-based biorefinery under techno-economic uncertainty. The proposed framework promotes robust decision making by taking into account the uncertainties that arise due to inconsistencies among, and shortage in, the available technical information. A stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery configurations based on a superstructure model where parameter uncertainties are modeled and included as sampled scenarios. The solution to the sMINLP problem determines the processing technologies, material flows, and product portfolio that are optimal with respect to all the sampled scenarios. The developed framework is implemented and tested on a specific case study. The optimal processing pathways selected with and without…
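The scenario-sampling idea can be illustrated with a toy selection loop. This is a stand-in for, not a reproduction of, the sMINLP formulation; the pathway names, profits, and sensitivity parameters are invented:

```python
import random

def select_pathway(pathways, n_scenarios=1000, seed=0):
    """Choose the pathway with the best average objective across sampled
    uncertainty scenarios (here a single Gaussian yield multiplier)."""
    rng = random.Random(seed)
    scenarios = [rng.gauss(1.0, 0.2) for _ in range(n_scenarios)]
    best_name, best_val = None, float("-inf")
    for name, (nominal_profit, sensitivity) in pathways.items():
        avg = sum(nominal_profit * (1.0 + sensitivity * (s - 1.0))
                  for s in scenarios) / n_scenarios
        if avg > best_val:
            best_name, best_val = name, avg
    return best_name, best_val

# Hypothetical pathways: (nominal profit, sensitivity to the uncertain yield)
choice, value = select_pathway({"A": (100.0, 2.0), "B": (95.0, 0.5)})
```

A real formulation would also optimize discrete technology choices and could replace the mean with a risk measure such as CVaR.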

  10. Characteristics of products generated by selective sintering and stereolithography rapid prototyping processes

    Science.gov (United States)

    Cariapa, Vikram

    1993-01-01

    The trend in the modern global economy towards free market policies has motivated companies to use rapid prototyping technologies not only to reduce product development cycle time but also to maintain their competitive edge. A rapid prototyping technology is one which combines computer-aided design with computer-controlled tracking of a focussed high-energy source (e.g., lasers, heat) over modern ceramic powders, metallic powders, plastics, or photosensitive liquid resins in order to produce prototypes or models. At present, except for the process of shape melting, most rapid prototyping processes generate products that are only dimensionally similar to those of the desired end product. There is an urgent need, therefore, to enhance the understanding of the characteristics of these processes in order to realize their potential for production. Currently, the commercial market is dominated by four rapid prototyping processes, namely selective laser sintering, stereolithography, fused deposition modelling, and laminated object manufacturing. This phase of the research has focussed on the selective laser sintering and stereolithography rapid prototyping processes. A theoretical model for these processes is under development. Different rapid prototyping sites supplied test specimens (based on ASTM 638-84, Type I) that have been measured and tested to provide a database on surface finish, dimensional variation, and ultimate tensile strength. Further plans call for developing and verifying the theoretical models by carefully designed experiments. This will be a joint effort between NASA and other prototyping centers to generate a larger database, thus encouraging more widespread usage by product designers.

  11. Parental selection: a third selection process in the evolution of human hairlessness and skin color.

    Science.gov (United States)

    Harris, Judith Rich

    2006-01-01

    It is proposed that human hairlessness, and the pale skin seen in modern Europeans and Asians, are not the results of Darwinian selection; these attributes provide no survival benefits. They are instead the results of sexual selection combined with a third, previously unrecognized, process: parental selection. The use of infanticide as a method of birth control in premodern societies gave parents - in particular, mothers - the power to exert an influence on the course of human evolution by deciding whether to keep or abandon a newborn infant. If such a decision was made before the infant was born, it could be overturned in the positive direction if the infant was particularly beautiful - that is, if the infant conformed to the standards of beauty prescribed by the mother's culture. It could be overturned in the negative direction if the infant failed to meet those standards. Thus, human hairlessness and pale skin could have resulted in part from cultural preferences expressed as decisions made by women immediately after childbirth.

  12. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubinina

    2015-06-01

    Full Text Available The essence of process-oriented enterprise management is examined in the article. Given the complexity and differentiation of existing methods, as well as the specific language and terminology of enterprise business process modeling, the content and types of the relevant information technologies are analyzed. The theoretical aspects of business process modeling are reviewed, and modern modeling techniques that have found practical application are applied to visualize the activity of retailers. The theoretical analysis of modeling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes owing to its integrated systemological capabilities. A visualized simulation model of the retailers' business process "sales, as is" was designed using a combination of UFO elements, with the aim of further formalizing and optimizing the given business process.

  13. Applying a Hybrid MCDM Model for Six Sigma Project Selection

    Directory of Open Access Journals (Sweden)

    Fu-Kwun Wang

    2014-01-01

    Full Text Available Six Sigma is a project-driven methodology; the projects that provide the maximum financial benefits and other impacts to the organization must be prioritized. Project selection (PS) is a type of multiple criteria decision making (MCDM) problem. In this study, we present a hybrid MCDM model combining the decision-making trial and evaluation laboratory (DEMATEL) technique, the analytic network process (ANP), and the VIKOR method to evaluate and improve Six Sigma projects, with the aim of reducing performance gaps in each criterion and dimension. We consider the film printing industry of Taiwan as an empirical case. The results show that our model not only supports selection of the best project, but can also be used to analyze the gaps between existing performance values and aspiration levels, via the influential network relation map, so as to improve each dimension and criterion.
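Of the three combined methods, VIKOR is the most compact to sketch. Given a decision matrix and criteria weights (both invented below), the ranking step looks roughly like:

```python
def vikor_rank(matrix, weights, v=0.5):
    """Minimal VIKOR sketch. Rows = alternatives, columns = benefit criteria
    (larger is better). Returns alternative indices, best first."""
    m, n = len(matrix), len(weights)
    f_best = [max(row[j] for row in matrix) for j in range(n)]
    f_worst = [min(row[j] for row in matrix) for j in range(n)]
    S, R = [], []
    for row in matrix:
        # weighted normalized distance from the ideal, per criterion
        terms = [weights[j] * (f_best[j] - row[j]) / ((f_best[j] - f_worst[j]) or 1.0)
                 for j in range(n)]
        S.append(sum(terms))
        R.append(max(terms))
    qs = []
    for i in range(m):
        # "or 1.0" guards against division by zero when all values coincide
        qs.append(v * (S[i] - min(S)) / ((max(S) - min(S)) or 1.0)
                  + (1 - v) * (R[i] - min(R)) / ((max(R) - min(R)) or 1.0))
    return sorted(range(m), key=lambda i: qs[i])

# Hypothetical: three Six Sigma projects scored on two benefit criteria
ranking = vikor_rank([[0.9, 0.8], [0.5, 0.6], [0.7, 0.9]], [0.5, 0.5])
```

In the hybrid model, DEMATEL/ANP would first supply the criteria weights that are hard-coded here.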

  14. Analytical network process based optimum cluster head selection in wireless sensor network.

    Science.gov (United States)

    Farman, Haleem; Javed, Huma; Jan, Bilal; Ahmad, Jamil; Ali, Shaukat; Khalil, Falak Naz; Khan, Murad

    2017-01-01

    Wireless Sensor Networks (WSNs) are becoming ubiquitous in everyday life due to their applications in weather forecasting, surveillance, implantable sensors for health monitoring, and a plethora of other areas. A WSN is equipped with hundreds or thousands of small sensor nodes. As the size of a sensor node decreases, critical issues such as limited energy, computation time, and limited memory become even more pronounced. In such a case, network lifetime mainly depends on efficient use of the available resources. Organizing nearby nodes into clusters makes it convenient to manage each cluster, as well as the overall network, efficiently. In this paper, we extend our previous work on a grid-based hybrid network deployment approach, in which a merge-and-split technique was proposed to construct the network topology. Having constructed the topology with this technique, we use an analytical network process (ANP) model for cluster head (CH) selection in the WSN. Five distinct parameters are considered for CH selection: distance from nodes (DistNode), residual energy level (REL), distance from centroid (DistCent), number of times the node has been selected as cluster head (TCH), and merged node (MN). The problem of CH selection based on these parameters is tackled as a multi-criteria decision system, for which the ANP method is used for optimum cluster head selection. The main contribution of this work is to check the applicability of the ANP model for cluster head selection in WSNs. In addition, a sensitivity analysis is carried out to check the stability of the alternatives (available candidate nodes) and their ranking for different scenarios. The simulation results show that the proposed method outperforms existing energy-efficient clustering protocols in terms of optimum CH selection and in minimizing the CH reselection process, which results in extending overall network lifetime. This paper analyzes the ANP method used for CH selection with better understanding of the dependencies of…
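Stripped of the ANP supermatrix machinery (which is what actually models interdependencies among criteria), the core ranking step reduces to a normalized weighted score over the parameters. The node values and weights below are invented; only three of the paper's five parameters are shown:

```python
def rank_cluster_heads(nodes, criteria):
    """Rank candidate CHs by a weighted sum of min-max normalized criteria.
    kind == "benefit": larger is better (e.g. residual energy);
    kind == "cost": smaller is better (e.g. distance, prior CH terms)."""
    names = list(nodes)
    scores = {name: 0.0 for name in names}
    for crit, (weight, kind) in criteria.items():
        vals = [nodes[name][crit] for name in names]
        lo, hi = min(vals), max(vals)
        span = (hi - lo) or 1.0  # guard against identical values
        for name in names:
            x = (nodes[name][crit] - lo) / span
            scores[name] += weight * (x if kind == "benefit" else 1.0 - x)
    return sorted(names, key=lambda name: scores[name], reverse=True)

# Invented candidate nodes, scored on three of the paper's five parameters
nodes = {
    "n1": {"REL": 0.9, "DistCent": 12.0, "TCH": 1},
    "n2": {"REL": 0.6, "DistCent": 5.0, "TCH": 0},
    "n3": {"REL": 0.3, "DistCent": 20.0, "TCH": 3},
}
criteria = {"REL": (0.5, "benefit"), "DistCent": (0.3, "cost"), "TCH": (0.2, "cost")}
order = rank_cluster_heads(nodes, criteria)
```

The ANP itself would derive the weights from pairwise comparisons and a limit supermatrix rather than fixing them by hand.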

  15. Modelling heat processing of dairy products

    NARCIS (Netherlands)

    Hotrum, N.; Fox, M.B.; Lieverloo, H.; Smit, E.; Jong, de P.; Schutyser, M.A.I.

    2010-01-01

    This chapter discusses the application of computer modelling to optimise the heat processing of milk. The chapter first reviews types of heat processing equipment used in the dairy industry. Then, the types of objectives that can be achieved using model-based process optimisation are discussed.

  16. How visual cognition influences process model comprehension

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2017-01-01

    Process analysts and other professionals extensively use process models to analyze business processes and identify performance improvement opportunities. Therefore, it is important that such models can be easily and properly understood. Previous research has mainly focused on two types of factors

  17. Modeling process flow using diagrams

    OpenAIRE

    Kemper, B.; de Mast, J.; Mandjes, M.

    2010-01-01

    In the practice of process improvement, tools such as the flowchart, the value-stream map (VSM), and a variety of ad hoc variants of such diagrams are commonly used. The purpose of this paper is to present a clear, precise, and consistent framework for the use of such flow diagrams in process improvement projects. The paper finds that traditional diagrams, such as the flowchart, the VSM, and OR-type of diagrams, have severe limitations, miss certain elements, or are based on implicit but cons...

  18. Diesel oil volatilization processes affected by selected porous media.

    Science.gov (United States)

    Ma, Yanfei; Zheng, Xilai; Anderson, S H; Lu, Jie; Feng, Xuedong

    2014-03-01

    Volatilization plays an important role in attenuating petroleum products in contaminated soils. The objective of this study was to evaluate the influence of wind speed, vessel diameter and mean grain size of porous media on diesel oil volatilization. Experiments were conducted to investigate the volatilization behavior of diesel oil from porous media by weighing contaminated samples pre- and post-volatilization. Three selected field porous media materials were evaluated: Silty Clay Loam, Fine Sand, and Coarse Sand, along with six individual sand fractions of the Coarse Sand. Results indicate that increasing wind speed accelerates the diesel oil volatilization process, especially for wind speeds below 2.10 m/s. The low-carbon components of diesel oil volatilize more rapidly, with the effects of wind speed more pronounced on C10 to C15 volatilization than on C16 and higher. The volatilization rate coefficient of diesel oil increases with decreasing mean grain size of porous media, and with increasing vessel diameter. A power function expressed the relationship with mean grain size. All processes (wind speed, vessel diameter, and mean grain size) were included in an equation which explained over 92% of the measured diesel oil volatilization rate coefficient variations for the experiments. Diesel oil volatilization appears to be boundary-layer regulated to some extent. Copyright © 2013 Elsevier Ltd. All rights reserved.
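A power-function relationship like the one reported between rate coefficient and grain size is typically recovered by least squares in log-log space. A sketch on synthetic data (the exponent and prefactor are invented, not the paper's fit):

```python
import math

def fit_power_law(x, y):
    """Fit y = a * x**b by ordinary least squares in log-log space."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b

# Synthetic data generated from k = 2 * d**-0.5 (illustrative, not measured)
d = [0.1, 0.2, 0.5, 1.0]          # mean grain size, arbitrary units
k = [2.0 * v ** -0.5 for v in d]  # volatilization rate coefficients
a, b = fit_power_law(d, k)
```

On real measurements the log-log residuals should be inspected before trusting the power-law form.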

  19. A Bayesian random effects discrete-choice model for resource selection: Population-level selection inference

    Science.gov (United States)

    Thomas, D.L.; Johnson, D.; Griffith, B.

    2006-01-01

    To model the probability of use of land units characterized by discrete and continuous measures, we present a Bayesian random-effects model to assess resource selection. This model provides simultaneous estimation of both individual- and population-level selection. Deviance information criterion (DIC), a Bayesian alternative to AIC that is sample-size specific, is used for model selection. Aerial radiolocation data from 76 adult female caribou (Rangifer tarandus) and calf pairs during 1 year on an Arctic coastal plain calving ground were used to illustrate models and assess population-level selection of landscape attributes, as well as individual heterogeneity of selection. Landscape attributes included elevation, NDVI (a measure of forage greenness), and land cover-type classification. Results from the first of a 2-stage model-selection procedure indicated that there is substantial heterogeneity among cow-calf pairs with respect to selection of the landscape attributes. In the second stage, selection of models with heterogeneity included indicated that, at the population level, NDVI and land cover class were significant attributes for selection of different landscapes by pairs on the calving ground. Population-level selection coefficients indicate that the pairs generally select landscapes with higher levels of NDVI, but the relationship is quadratic: the highest rate of selection occurs at values of NDVI less than the maximum observed. Results for land cover-class selection coefficients indicate that wet sedge, moist sedge, herbaceous tussock tundra, and shrub tussock tundra are selected at approximately the same rate, while alpine and sparsely vegetated landscapes are selected at a lower rate. Furthermore, the variability in selection by individual caribou for moist sedge and sparsely vegetated landscapes is large relative to the variability in selection of other land cover types. The example analysis illustrates that, while sometimes computationally intense, a…
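DIC, the model-selection criterion used above, penalizes the posterior-mean deviance by the effective number of parameters (pD). A minimal sketch for a normal mean with known variance on synthetic data (nothing here reproduces the caribou analysis):

```python
import math
import random

def dic(y, mu_samples, sigma=1.0):
    """DIC = Dbar + pD for a Normal(mu, sigma^2) likelihood with known sigma,
    where pD = Dbar - D(posterior mean of mu)."""
    def deviance(mu):
        n = len(y)
        loglik = (-0.5 * n * math.log(2.0 * math.pi * sigma ** 2)
                  - sum((v - mu) ** 2 for v in y) / (2.0 * sigma ** 2))
        return -2.0 * loglik
    dbar = sum(deviance(m) for m in mu_samples) / len(mu_samples)
    dhat = deviance(sum(mu_samples) / len(mu_samples))
    p_d = dbar - dhat
    return dbar + p_d, p_d

rng = random.Random(1)
y = [rng.gauss(0.0, 1.0) for _ in range(50)]
ybar = sum(y) / len(y)
# Approximate posterior for mu under a flat prior: Normal(ybar, sigma^2 / n)
mu_post = [rng.gauss(ybar, 1.0 / len(y) ** 0.5) for _ in range(2000)]
dic_value, p_d = dic(y, mu_post)
```

For this one-parameter model pD should land near 1; in the paper's hierarchical setting, pD instead reflects how strongly the random effects are shrunk.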

  20. Selection of hydrologic modeling approaches for climate change assessment: A comparison of model scale and structures

    Science.gov (United States)

    Surfleet, Christopher G.; Tullos, Desirèe; Chang, Heejun; Jung, Il-Won

    2012-09-01

    A wide variety of approaches to hydrologic (rainfall-runoff) modeling of river basins confounds our ability to select, develop, and interpret models, particularly in the evaluation of prediction uncertainty associated with climate change assessment. To inform the model selection process, we characterized and compared three structurally-distinct approaches and spatial scales of parameterization to modeling catchment hydrology: a large-scale approach (using the VIC model; 671,000 km2 area), a basin-scale approach (using the PRMS model; 29,700 km2 area), and a site-specific approach (the GSFLOW model; 4700 km2 area) forced by the same future climate estimates. For each approach, we present measures of fit to historic observations and predictions of future response, as well as estimates of model parameter uncertainty, when available. While the site-specific approach generally had the best fit to historic measurements, the performance of the model approaches varied. The site-specific approach generated the best fit at unregulated sites, the large scale approach performed best just downstream of flood control projects, and model performance varied at the farthest downstream sites where streamflow regulation is mitigated to some extent by unregulated tributaries and water diversions. These results illustrate how selection of a modeling approach and interpretation of climate change projections require (a) appropriate parameterization of the models for climate and hydrologic processes governing runoff generation in the area under study, (b) understanding and justifying the assumptions and limitations of the model, and (c) estimates of uncertainty associated with the modeling approach.

  1. Steady-State Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    illustrate the “equation oriented” approach as well as the “sequential modular” approach to solving complex flowsheets for steady state applications. The applications include the Williams-Otto plant, the hydrodealkylation (HDA) of toluene, conversion of ethylene to ethanol and a bio-ethanol process....
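The "sequential modular" approach mentioned in the record can be illustrated with a single tear-stream iteration on a toy reactor-separator flowsheet (the feed, conversion, and split fractions are invented):

```python
def solve_recycle(feed, conversion=0.8, recycle_split=0.3, tol=1e-10, max_iter=1000):
    """Successive substitution on a torn recycle stream:
    mixer -> reactor (fixed conversion) -> separator (fixed split) -> recycle."""
    recycle = 0.0
    for _ in range(max_iter):
        reactor_in = feed + recycle
        unreacted = reactor_in * (1.0 - conversion)
        new_recycle = unreacted * recycle_split
        if abs(new_recycle - recycle) < tol:
            recycle = new_recycle
            break
        recycle = new_recycle
    return recycle

recycle_flow = solve_recycle(100.0)  # hypothetical feed of 100 kmol/h
```

An "equation oriented" solver would instead pose all the mass balances simultaneously and hand them to a Newton-type method, which is why the two approaches scale so differently on complex flowsheets.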

  2. Selection of Representative Models for Decision Analysis Under Uncertainty

    Science.gov (United States)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that must be analyzed so that an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. First, a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.
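A drastically simplified version of the representativeness idea is to pick the scenarios whose outcomes sit closest to target quantiles of the full risk curve; the paper optimizes a much richer function over cross-plots and risk curves. The scenario values below are synthetic:

```python
def pick_representatives(values, quantiles=(0.1, 0.5, 0.9)):
    """For each target quantile, return the member of the full scenario set
    whose value is closest to that empirical quantile (e.g. P10/P50/P90),
    which avoids purely optimistic or pessimistic picks."""
    s = sorted(values)
    n = len(s)
    reps = []
    for q in quantiles:
        idx = min(int(q * (n - 1) + 0.5), n - 1)  # nearest-rank index
        reps.append(s[idx])
    return reps

vals = list(range(101))  # hypothetical NPVs of 101 simulated scenarios
reps = pick_representatives(vals)
```

Because each representative is an actual scenario, it can be carried forward into the strategy-optimization step unchanged.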

  3. Relations between frequency selectivity, temporal fine-structure processing, and speech reception in impaired hearing

    DEFF Research Database (Denmark)

    Strelcyk, Olaf; Dau, Torsten

    2009-01-01

    Frequency selectivity, temporal fine-structure (TFS) processing, and speech reception were assessed for six normal-hearing (NH) listeners, ten sensorineurally hearing-impaired (HI) listeners with similar high-frequency losses, and two listeners with an obscure dysfunction (OD). TFS processing...... was investigated at low frequencies in regions of normal hearing, through measurements of binaural masked detection, tone lateralization, and monaural frequency modulation (FM) detection. Lateralization and FM detection thresholds were measured in quiet and in background noise. Speech reception thresholds were...... in a two-talker background and lateralized noise, but not in amplitude-modulated noise. The results provide constraints for future models of impaired auditory signal processing....

  4. Numerical modelling of reflood processes

    International Nuclear Information System (INIS)

    Glynn, D.R.; Rhodes, N.; Tatchell, D.G.

    1983-01-01

    The use of a detailed computer model to investigate the effects of grid size and the choice of wall-to-fluid heat-transfer correlations on the predictions obtained for reflooding of a vertical heated channel is described. The model employs equations for the momentum and enthalpy of vapour and liquid and hence accounts for both thermal non-equilibrium and slip between the phases. Empirical correlations are used to calculate interphase and wall-to-fluid friction and heat-transfer as functions of flow regime and local conditions. The empirical formulae have remained fixed with the exception of the wall-to-fluid heat-transfer correlations. These have been varied according to the practices adopted in other computer codes used to model reflood, namely REFLUX, RELAP and TRAC. Calculations have been performed to predict the CSNI standard problem number 7, and the results are compared with experiment. It is shown that the results are substantially grid-independent, and that the choice of correlation has a significant influence on the general flow behaviour, the rate of quenching and on the maximum cladding temperature predicted by the model. It is concluded that good predictions of reflooding rates can be obtained with particular correlation sets. (author)

  5. Branching process models of cancer

    CERN Document Server

    Durrett, Richard

    2015-01-01

    This volume develops results on continuous time branching processes and applies them to study rate of tumor growth, extending classic work on the Luria-Delbruck distribution. As a consequence, the authors calculate the probability that mutations that confer resistance to treatment are present at detection and quantify the extent of tumor heterogeneity. As applications, the authors evaluate ovarian cancer screening strategies and give rigorous proofs for results of Heano and Michor concerning tumor metastasis. These notes should be accessible to students who are familiar with Poisson processes and continuous time. Richard Durrett is mathematics professor at Duke University, USA. He is the author of 8 books, over 200 journal articles, and has supervised more than 40 Ph.D. students. Most of his current research concerns the applications of probability to biology: ecology, genetics, and most recently cancer.
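Ignoring the growth of the resistant clone itself (the book treats the full branching process), the probability that resistance is present at detection can be approximated with a quick Monte Carlo; the detection size and mutation rate below are invented:

```python
import random

def prob_resistance_at_detection(n_detect, mut_rate, trials=2000, seed=7):
    """Fraction of simulated tumors containing at least one resistant mutant
    by the time they reach n_detect cells. Each of the n_detect - 1 divisions
    produces a resistant cell with probability mut_rate; the dynamics of the
    resistant clone are ignored in this sketch."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        if any(rng.random() < mut_rate for _ in range(n_detect - 1)):
            hits += 1
    return hits / trials

p_sim = prob_resistance_at_detection(1000, 1e-3)
p_exact = 1.0 - (1.0 - 1e-3) ** 999  # complement of "no mutation ever"
```

The simulated fraction should track the closed form; the Luria-Delbruck analysis goes further by tracking the size distribution of the mutant clones, not just their presence.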

  6. The cost of ethanol production from lignocellulosic biomass -- A comparison of selected alternative processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Grethlein, H.E.; Dill, T.

    1993-04-30

    The purpose of this report is to compare the cost of selected alternative processes for the conversion of lignocellulosic biomass to ethanol. In turn, this information will be used by the ARS/USDA to guide the management of research and development programs in biomass conversion. The report will identify where the cost leverages are for the selected alternatives and what performance parameters need to be achieved to improve the economics. The process alternatives considered here are not exhaustive, but are selected on the basis of having a reasonable potential for improving the economics of producing ethanol from biomass. When other alternatives come under consideration, they should be evaluated by the same methodology used in this report to give fair comparisons of opportunities. A generic plant design is developed for an annual production of 25 million gallons of anhydrous ethanol using corn stover as the model substrate at $30/dry ton. Standard chemical engineering techniques are used to give first-order estimates of the capital and operating costs. Following the format of the corn-to-ethanol plant, there are nine sections to the plant: feed preparation, pretreatment, hydrolysis, fermentation, distillation and dehydration, stillage evaporation, storage and denaturation, utilities, and enzyme production. There are three pretreatment alternatives considered: the AFEX process, the modified AFEX process (abbreviated as MAFEX), and the STAKETECH process. These all use enzymatic hydrolysis, and so an enzyme production section is included in the plant. STAKETECH is the only commercially available process among the alternatives.

  7. Evaluation of EOR Processes Using Network Models

    DEFF Research Database (Denmark)

    Winter, Anatol; Larsen, Jens Kjell; Krogsbøll, Anette

    1998-01-01

    The report consists of the following parts: 1) Studies of wetting properties of model fluids and fluid mixtures aimed at an optimal selection of candidates for micromodel experiments. 2) Experimental studies of multiphase transport properties using physical models of porous networks (micromodels......) including estimation of their "petrophysical" properties (e.g. absolute permeability). 3) Mathematical modelling and computer studies of multiphase transport through pore space using mathematical network models. 4) Investigation of link between pore-scale and macroscopic recovery mechanisms....

  8. Discovering Process Reference Models from Process Variants Using Clustering Techniques

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    2008-01-01

    In today's dynamic business world, success of an enterprise increasingly depends on its ability to react to changes in a quick and flexible way. In response to this need, process-aware information systems (PAIS) emerged, which support the modeling, orchestration and monitoring of business processes

  9. Systematic approach for the identification of process reference models

    CSIR Research Space (South Africa)

    Van Der Merwe, A

    2009-02-01

    Full Text Available Process models are used in different application domains to capture knowledge on the process flow. Process reference models (PRM) are used to capture reusable process models, which should simplify the identification process of process models...

  10. Uncertainty associated with selected environmental transport models

    International Nuclear Information System (INIS)

    Little, C.A.; Miller, C.W.

    1979-11-01

    A description is given of the capabilities of several models to predict accurately either pollutant concentrations in environmental media or radiological dose to human organs. The models are discussed in three sections: aquatic or surface water transport models, atmospheric transport models, and terrestrial and aquatic food chain models. Using data published primarily by model users, model predictions are compared to observations. This procedure is infeasible for food chain models and, therefore, the uncertainty embodied in the models' input parameters, rather than the model output, is estimated. Aquatic transport models are divided into one-dimensional, longitudinal-vertical, and longitudinal-horizontal models. Several conclusions were made about the ability of the Gaussian plume atmospheric dispersion model to predict accurately downwind air concentrations from releases under several sets of conditions. It is concluded that no validation study has been conducted to test the predictions of either aquatic or terrestrial food chain models. Using the aquatic pathway from water to fish to an adult for 137Cs as an example, a 95% one-tailed confidence limit interval for the predicted exposure is calculated by examining the distributions of the input parameters. Such an interval is found to be 16 times the value of the median exposure. A similar one-tailed limit for the air-grass-cow-milk-thyroid pathway for 131I and infants was 5.6 times the median dose. Of the three model types discussed in this report, the aquatic transport models appear to do the best job of predicting observed concentrations. However, this conclusion is based on many fewer aquatic validation data than were available for atmospheric model validation.
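The percentile-to-median ratio described for the 137Cs pathway can be sketched as Monte Carlo propagation of multiplicative lognormal transfer factors. The geometric standard deviations below are invented, so the resulting ratio is illustrative only:

```python
import math
import random

def p95_over_median(gsds, n=20000, seed=3):
    """Sample a product of independent lognormal transfer factors and return
    the ratio of the one-tailed 95th percentile to the median exposure."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        x = 1.0
        for gsd in gsds:
            x *= math.exp(rng.gauss(0.0, math.log(gsd)))
        samples.append(x)
    samples.sort()
    return samples[int(0.95 * n)] / samples[n // 2]

# Hypothetical geometric standard deviations for three transfer factors
ratio = p95_over_median((2.0, 1.5, 3.0))
```

For a pure product of lognormals the ratio is analytic, exp(1.645 * sqrt(sum of ln(gsd)^2)), roughly 9.4 for these invented inputs, so the Monte Carlo mainly pays off once correlations or non-lognormal parameters enter.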

  11. Selecting an Appropriate Upscaled Reservoir Model Based on Connectivity Analysis

    Directory of Open Access Journals (Sweden)

    Preux Christophe

    2016-09-01

    Full Text Available Reservoir engineers aim to build reservoir models to investigate fluid flows within hydrocarbon reservoirs. These models consist of three-dimensional grids populated by petrophysical properties. In this paper, we focus on permeability that is known to significantly influence fluid flow. Reservoir models usually encompass a very large number of fine grid blocks to better represent heterogeneities. However, performing fluid flow simulations for such fine models is extensively CPU-time consuming. A common practice consists in converting the fine models into coarse models with less grid blocks: this is the upscaling process. Many upscaling methods have been proposed in the literature that all lead to distinct coarse models. The problem is how to choose the appropriate upscaling method. Various criteria have been established to evaluate the information loss due to upscaling, but none of them investigate connectivity. In this paper, we propose to first perform a connectivity analysis for the fine and candidate coarse models. This makes it possible to identify shortest paths connecting wells. Then, we introduce two indicators to quantify the length and trajectory mismatch between the paths for the fine and the coarse models. The upscaling technique to be recommended is the one that provides the coarse model for which the shortest paths are the closest to the shortest paths determined for the fine model, both in terms of length and trajectory. Last, the potential of this methodology is investigated from two test cases. We show that the two indicators help select suitable upscaling techniques as long as gravity is not a prominent factor that drives fluid flows.
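The length indicator can be sketched with textbook Dijkstra over well-connectivity graphs extracted from the two models. The graphs below are invented, with edge weights standing in for flow resistance along grid-block connections:

```python
import heapq

def shortest_path_length(graph, src, dst):
    """Dijkstra over a dict-of-dicts weighted graph; returns the cheapest
    src -> dst cost, or infinity if the wells are not connected."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    seen = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in seen:
            continue
        if u == dst:
            return d
        seen.add(u)
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

# Hypothetical injector-to-producer graphs for a fine and a candidate coarse model
fine = {"inj": {"a": 1.0, "b": 2.5}, "a": {"prod": 1.2}, "b": {"prod": 0.5}, "prod": {}}
coarse = {"inj": {"ab": 1.1}, "ab": {"prod": 1.3}, "prod": {}}
L_fine = shortest_path_length(fine, "inj", "prod")
L_coarse = shortest_path_length(coarse, "inj", "prod")
length_mismatch = abs(L_coarse - L_fine) / L_fine
```

The paper's second indicator, trajectory mismatch, would additionally compare the node sequences of the two paths, not just their costs.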

  12. Multiphysics modeling of selective laser sintering/melting

    Science.gov (United States)

    Ganeriwala, Rishi Kumar

    A significant percentage of total global employment is due to the manufacturing industry. However, manufacturing also accounts for nearly 20% of total energy usage in the United States according to the EIA. In fact, manufacturing accounted for 90% of industrial energy consumption and 84% of industry carbon dioxide emissions in 2002. Clearly, advances in manufacturing technology and efficiency are necessary to curb emissions and help society as a whole. Additive manufacturing (AM) refers to a relatively recent group of manufacturing technologies whereby one can 3D print parts; it has the potential to significantly reduce waste, reconfigure the supply chain, and generally disrupt the whole manufacturing industry. Selective laser sintering/melting (SLS/SLM) is one type of AM technology with the distinct advantage of being able to 3D print metals and rapidly produce net-shape parts with complicated geometries. In SLS/SLM, parts are built up layer by layer out of powder particles, which are selectively sintered/melted via a laser. However, in order to produce defect-free parts of sufficient strength, the process parameters (laser power, scan speed, layer thickness, powder size, etc.) must be carefully optimized. Obviously, these process parameters will vary depending on material, part geometry, and desired final part characteristics. Running experiments to optimize these parameters is costly, energy intensive, and extremely material specific. Thus a computational model of this process would be highly valuable. In this work a three-dimensional, reduced-order, coupled discrete element-finite difference model is presented for simulating the deposition and subsequent laser heating of a layer of powder particles sitting on top of a substrate. Validation is provided and parameter studies are conducted showing the ability of this model to help determine appropriate process parameters and an optimal powder size distribution for a given material. Next, thermal stresses upon

  13. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always tied to an experimental case, and it is residual in nature: the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures in the simulation model and can also serve as a guide for the design of subsequent experiments. Three steps are clearly differentiated. (1) Sensitivity analysis, which can be performed with DSA (differential sensitivity analysis) and with MCSA (Monte Carlo sensitivity analysis). (2) Finding the optimal domains of the input parameters; a procedure based on Monte Carlo methods and cluster techniques has been developed to find these domains. (3) Residual analysis, carried out in both the time domain and the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation of a thermal simulation model of buildings is presented, studying the behaviour of building components in a Test Cell of LECE of CIEMAT (Spain). (Author) 17 refs
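The Monte Carlo sensitivity analysis step can be sketched as follows. The toy building model, parameter names, domains, and coefficients below are invented for illustration; in practice the "simulator" would be the detailed thermal model under validation:

```python
import random

random.seed(0)

def simulated_output(u_wall, g_window, ach):
    """Toy building heat-demand model (illustrative stand-in for the real simulator)."""
    return 120.0 * u_wall - 40.0 * g_window + 8.0 * ach + random.gauss(0.0, 1.0)

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Monte Carlo sensitivity analysis: sample inputs over their plausible domains,
# then correlate each input with the output to rank its influence.
N = 2000
samples = [(random.uniform(0.2, 1.5),   # wall U-value [W/m2K]
            random.uniform(0.3, 0.8),   # window g-value [-]
            random.uniform(0.1, 2.0))   # air changes per hour [1/h]
           for _ in range(N)]
outputs = [simulated_output(*s) for s in samples]

sens = {name: pearson([s[i] for s in samples], outputs)
        for i, name in enumerate(["u_wall", "g_window", "ach"])}
```

Parameters with large |correlation| are the ones whose domains are worth refining in the subsequent optimal-domain search; weakly correlated parameters can often be frozen at nominal values.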

  14. Computationally efficient thermal-mechanical modelling of selective laser melting

    Science.gov (United States)

    Yang, Yabin; Ayas, Can

    2017-10-01

    Selective laser melting (SLM) is a powder-based additive manufacturing (AM) method used to produce high-density metal parts with complex topology. However, part distortions and the accompanying residual stresses deteriorate the mechanical reliability of SLM products. Modelling of the SLM process is anticipated to be instrumental for understanding and predicting the development of the residual stress field during the build process. However, SLM process modelling requires determination of the heat transients within the part being built, which is coupled to a mechanical boundary value problem to calculate displacement and residual stress fields. Thermal models associated with SLM are typically complex and computationally demanding. In this paper, we present a simple semi-analytical thermal-mechanical model, developed for SLM, that represents the effect of laser scanning vectors with line heat sources. The temperature field within the part being built is attained by superposition of the temperature field associated with line heat sources in a semi-infinite medium and a complementary temperature field which accounts for the actual boundary conditions. An analytical solution for a line heat source in a semi-infinite medium is first described, followed by the numerical procedure used for finding the complementary temperature field. The analytical description of the line heat sources is able to capture the steep temperature gradients in the vicinity of the laser spot, which is typically tens of micrometres across. In turn, the semi-analytical thermal model allows a relatively coarse discretisation of the complementary temperature field. The temperature history determined is used to calculate the thermal strain induced in the SLM part. Finally, a mechanical model governed by an elastic-plastic constitutive rule with isotropic hardening is used to predict the residual stresses.
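The superposition idea can be illustrated with a generic 2D instantaneous line-source kernel plus a mirror image that makes the surface adiabatic. This is a hedged sketch, not the paper's exact formulation: the material values, pulse energies, and scan schedule below are invented, and a real model would use the paper's continuous line sources and complementary field instead of discrete pulses:

```python
import math

# Material parameters (illustrative values, roughly stainless-steel-like).
k = 20.0           # conductivity [W/mK]
rho_c = 4.0e6      # volumetric heat capacity [J/m3K]
alpha = k / rho_c  # diffusivity [m2/s]

def dT_line(Q, x0, y0, x, y, t):
    """Temperature rise at (x, y), a time t after an instantaneous line source of
    strength Q [J/m] is released at (x0, y0) in an infinite 2D medium."""
    r2 = (x - x0) ** 2 + (y - y0) ** 2
    return Q / (rho_c * 4.0 * math.pi * alpha * t) * math.exp(-r2 / (4.0 * alpha * t))

def dT_semi_infinite(Q, x0, y0, x, y, t):
    """Mirror the source across the surface y = 0 so the surface stays adiabatic."""
    return dT_line(Q, x0, y0, x, y, t) + dT_line(Q, x0, -y0, x, y, t)

# Superpose pulses deposited along a scan vector just below the surface.
Q = 50.0       # energy per unit length per pulse [J/m] (assumed)
depth = 30e-6  # source depth [m] (assumed)
pulses = [(i * 20e-6, depth, i * 1e-4) for i in range(5)]  # (x0, y0, release time)

def temperature(x, y, t):
    """Temperature rise by superposition over all pulses released before time t."""
    return sum(dT_semi_infinite(Q, x0, y0, x, y, t - t0)
               for (x0, y0, t0) in pulses if t > t0)
```

Because the kernel is analytical, the steep near-spot gradients come out exactly; only the correction for the true finite-domain boundary conditions would need a (coarse) numerical discretisation.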

  15. Modeling Knowledge Resource Selection in Expert Librarian Search

    Science.gov (United States)

    KAUFMAN, David R.; MEHRYAR, Maryam; CHASE, Herbert; HUNG, Peter; CHILOV, Marina; JOHNSON, Stephen B.; MENDONCA, Eneida

    2011-01-01

    Providing knowledge at the point of care offers the possibility of reducing error and improving patient outcomes. However, the vast majority of physicians’ information needs are not met in a timely fashion. The research presented in this paper models an expert librarian’s search strategies as they pertain to the selection and use of various electronic information resources. The 10 searches conducted by the librarian to address physicians’ information needs varied in terms of complexity and question type. The librarian employed a total of 10 resources and used as many as 7 in a single search. The longer-term objective is to model the sequential process in sufficient detail to be able to contribute to the development of intelligent automated search agents. PMID:19380912

  16. River water quality model no. 1 (RWQM1): III. Biochemical submodel selection

    DEFF Research Database (Denmark)

    Vanrolleghem, P.; Borchardt, D.; Henze, Mogens

    2001-01-01

    The new River Water Quality Model no. 1 introduced in the two accompanying papers by Shanahan et al. and Reichert et al. is comprehensive. Shanahan et al. introduced a six-step decision procedure to select the necessary model features for a certain application. This paper specifically addresses one of these steps, i.e. the selection of submodels of the comprehensive biochemical conversion model introduced in Reichert et al. Specific conditions for inclusion of one or the other conversion process or model component are introduced, as are some general rules that can support the selection. Examples of simplified models are presented.

  17. Using Ionic Liquids in Selective Hydrocarbon Conversion Processes

    Energy Technology Data Exchange (ETDEWEB)

    Tang, Yongchun; Periana, Roy; Chen, Weiqun; van Duin, Adri; Nielsen, Robert; Shuler, Patrick; Ma, Qisheng; Blanco, Mario; Li, Zaiwei; Oxgaard, Jonas; Cheng, Jihong; Cheung, Sam; Pudar, Sanja

    2009-09-28

    This is the Final Report of the five-year project Using Ionic Liquids in Selective Hydrocarbon Conversion Processes (DE-FC36-04GO14276, July 1, 2004 - June 30, 2009), in which we present our major accomplishments with detailed descriptions of our experimental and theoretical efforts. Over the course of this project, we followed our proposed work breakdown structure and completed most of the technical tasks. We have developed and demonstrated several optimized homogeneous catalytic methane conversion systems involving novel ionic liquids. These systems substantially outperform the Catalytica system (the best system to date), with three times higher reaction rates, longer catalyst lifetimes, and much stronger resistance to water deactivation. We have developed in-depth mechanistic understanding of the complicated chemistry involved in homogeneous catalytic methane oxidation, as well as unique yet effective experimental protocols (reactors, analytical tools, and screening methodologies) for achieving a highly efficient, economically feasible, and environmentally friendly catalytic methane conversion system. The most important findings have been published and patented, as well as reported to DOE in this Final Report and our 20 Quarterly Reports.

  18. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications are shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore makes it possible to anticipate out-of-specification (OOS) events, identify critical process parameters, and take risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
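The core mechanism, propagating process-parameter variation through stacked unit operations by Monte Carlo simulation to estimate an OOS probability, can be sketched as follows. The two transfer functions, parameter distributions, and specification limit are invented for illustration; a real IPM would fit them to development and qualification data:

```python
import random

random.seed(42)

# Two stacked unit operations: a fermentation titer feeds a capture step whose
# yield depends on load. Coefficients below are assumed, not fitted values.
def fermentation(ph, temp):
    return 5.0 + 1.2 * (ph - 7.0) - 0.8 * abs(temp - 36.5) + random.gauss(0.0, 0.1)

def capture(titer, load_ratio):
    yield_ = 0.92 - 0.05 * load_ratio + random.gauss(0.0, 0.01)
    return titer * max(yield_, 0.0)

SPEC_MIN = 3.8  # lower specification limit for the final CQA (g/L, assumed)

def one_batch():
    # Sample process-parameter variation for a single simulated batch.
    ph = random.gauss(7.0, 0.1)
    temp = random.gauss(36.5, 0.5)
    load = random.uniform(0.5, 1.5)
    return capture(fermentation(ph, temp), load)

batches = [one_batch() for _ in range(10_000)]
p_oos = sum(b < SPEC_MIN for b in batches) / len(batches)  # estimated OOS probability
```

Because the unit operations are chained, an interaction such as "low titer plus high column load" shows up automatically in `p_oos`, which is exactly what single-parameter acceptance ranges miss.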

  19. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technology such as selective laser melting, SLM, there are gaps in process modeling and material...

  20. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technology such as selective laser melting, SLM, there are gaps in process modeling and material...

  1. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation-oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes for calculating thermodynamic equilibrium with respect to phase interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints is considered directly, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations depend in general on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows an efficient solving strategy and a concise error diagnosis to be implemented. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it

  2. Hierarchical models in ecology: confidence intervals, hypothesis testing, and model selection using data cloning.

    Science.gov (United States)

    Ponciano, José Miguel; Taper, Mark L; Dennis, Brian; Lele, Subhash R

    2009-02-01

    Hierarchical statistical models are increasingly being used to describe complex ecological processes. The data cloning (DC) method is a new general technique that uses Markov chain Monte Carlo (MCMC) algorithms to compute maximum likelihood (ML) estimates along with their asymptotic variance estimates for hierarchical models. Despite its generality, the method has two inferential limitations. First, it only provides Wald-type confidence intervals, known to be inaccurate in small samples. Second, it only yields ML parameter estimates, but not the maximized likelihood values used for profile likelihood intervals, likelihood ratio hypothesis tests, and information-theoretic model selection. Here we describe how to overcome these inferential limitations with a computationally efficient method for calculating likelihood ratios via data cloning. The ability to calculate likelihood ratios allows one to do hypothesis tests, construct accurate confidence intervals and undertake information-based model selection with hierarchical models in a frequentist context. To demonstrate the use of these tools with complex ecological models, we reanalyze part of Gause's classic Paramecium data with state-space population models containing both environmental noise and sampling error. The analysis results include improved confidence intervals for parameters, a hypothesis test of laboratory replication, and a comparison of the Beverton-Holt and the Ricker growth forms based on a model selection index.
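The data-cloning mechanism can be sketched on the simplest possible case, a normal mean with known variance, where the answer is checkable analytically. The model, the sampler settings, and the flat prior are illustrative assumptions; the paper's point is that the same recipe (replicate the data K times, run MCMC, read off posterior mean and K times the posterior variance) works for hierarchical models where direct ML is intractable:

```python
import math
import random
import statistics

random.seed(1)

# Observed data: toy sample from a normal model with unknown mean, known sd = 1.
data = [random.gauss(2.0, 1.0) for _ in range(50)]
n = len(data)
S = sum(data)
Q = sum(x * x for x in data)
K = 20  # number of clones

def log_post(mu):
    # Flat prior; log-likelihood of the K-fold cloned data under N(mu, 1),
    # written with sufficient statistics for speed (constants dropped).
    return -0.5 * K * (n * mu * mu - 2.0 * mu * S + Q)

# Random-walk Metropolis on the cloned posterior.
mu, chain = 0.0, []
lp = log_post(mu)
for _ in range(20_000):
    prop = mu + random.gauss(0.0, 0.1)
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:
        mu, lp = prop, lp_prop
    chain.append(mu)

burned = chain[5_000:]
mle_hat = statistics.mean(burned)          # posterior mean -> MLE as K grows
var_hat = K * statistics.variance(burned)  # K x posterior variance -> asymptotic var
```

For this model the exact MLE is the sample mean and its asymptotic variance is 1/n, so the data-cloning output can be verified directly; for a state-space population model the same two lines at the end would be the only practical route to ML quantities.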

  3. Peers and the Emergence of Alcohol Use: Influence and Selection Processes in Adolescent Friendship Networks

    OpenAIRE

    Osgood, D. Wayne; Ragan, Daniel T.; Wallace, Lacey; Gest, Scott D.; Feinberg, Mark E.; Moody, James

    2013-01-01

    This study addresses not only influence and selection of friends as sources of similarity in alcohol use, but also peer processes leading drinkers to be chosen as friends more often than non-drinkers, which increases the number of adolescents subject to their influence. Analyses apply a stochastic actor-based model to friendship networks assessed five times from 6th through 9th grades for 50 grade cohort networks in Iowa and Pennsylvania, which include 13,214 individuals. Results show definit...

  4. Application of Bayesian Model Selection for Metal Yield Models using ALEGRA and Dakota.

    Energy Technology Data Exchange (ETDEWEB)

    Portone, Teresa; Niederhaus, John Henry; Sanchez, Jason James; Swiler, Laura Painton

    2018-02-01

    This report introduces the concepts of Bayesian model selection, which provides a systematic means of calibrating and selecting an optimal model to represent a phenomenon. This has many potential applications, including comparing constitutive models. The ideas described herein are applied to a model selection problem between different yield models for hardened steel under extreme loading conditions.
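The core quantity in Bayesian model selection is the marginal likelihood (evidence) of each candidate model, compared via a Bayes factor. As a hedged toy illustration (not the ALEGRA/Dakota workflow), the sketch below compares two "models" that are two priors on the mean of normal data, a setting where the evidence has a closed form:

```python
import math
import random

random.seed(3)

# Data assumed ~ N(theta, sigma^2) with known sigma; the two candidate "models"
# are two priors on theta (illustrative stand-ins for competing yield models).
sigma = 1.0
data = [random.gauss(0.8, sigma) for _ in range(30)]

def log_evidence(prior_mean, prior_sd):
    """Closed-form log marginal likelihood for normal data with a normal prior."""
    n = len(data)
    xbar = sum(data) / n
    s2 = sigma ** 2
    post_var = 1.0 / (n / s2 + 1.0 / prior_sd ** 2)
    return (-0.5 * n * math.log(2.0 * math.pi * s2)
            - 0.5 * sum((x - xbar) ** 2 for x in data) / s2
            + 0.5 * math.log(post_var / prior_sd ** 2)
            - 0.5 * (xbar - prior_mean) ** 2 / (prior_sd ** 2 + s2 / n))

# Model A: prior centred near the truth; Model B: prior centred far away.
log_bf = log_evidence(0.5, 1.0) - log_evidence(5.0, 1.0)
```

A positive `log_bf` favours model A; the evidence automatically penalises the model whose prior wastes mass far from where the data lie, which is the built-in Occam factor that makes Bayes factors usable for comparing constitutive models of differing flexibility.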

  5. Process and analytical studies of enhanced low severity co-processing using selective coal pretreatment

    Energy Technology Data Exchange (ETDEWEB)

    Baldwin, R.M.; Miller, R.L.

    1990-01-01

    The objectives of the project are to investigate various coal pretreatment techniques and to determine the effect of these pretreatment procedures on the reactivity of the coal. Reactivity enhancement will be evaluated under both direct hydroliquefaction and co-processing conditions. Coal conversion utilizing low-rank coals and low-severity conditions (reaction temperatures generally less than 350°C) is the primary focus of the liquefaction experiments, as it is expected that the effect of pretreatment conditions and the attendant reactivity enhancement will be greatest for these coals and at these conditions. This document presents a comprehensive report summarizing the findings on the effect of mild alkylation pretreatment on coal reactivity under both direct hydroliquefaction and liquefaction co-processing conditions. Results of experiments using a dispersed catalyst system (chlorine) are also presented for purposes of comparison. In general, mild alkylation has been found to be an effective pretreatment method for altering the reactivity of coal. Selective (oxygen) methylation was found to be more effective for high-oxygen (subbituminous) coals than for coals of higher rank. This reactivity enhancement was evidenced under both low- and high-severity liquefaction conditions, and for both direct hydroliquefaction and liquefaction co-processing reaction environments. Non-selective alkylation (methylation) was also effective, although the enhancement was less pronounced than that found for coal activated by O-alkylation. The degree of reactivity enhancement was found to vary with both liquefaction and/or co-processing conditions and coal type, with the greatest positive effect found for subbituminous coal which had been selectively O-methylated and subsequently liquefied at low-severity reaction conditions. 5 refs., 18 figs., 9 tabs.

  6. Astrophysical Model Selection in Gravitational Wave Astronomy

    Science.gov (United States)

    Adams, Matthew R.; Cornish, Neil J.; Littenberg, Tyson B.

    2012-01-01

    Theoretical studies in gravitational wave astronomy have mostly focused on the information that can be extracted from individual detections, such as the mass of a binary system and its location in space. Here we consider how the information from multiple detections can be used to constrain astrophysical population models. This seemingly simple problem is made challenging by the high dimensionality and high degree of correlation in the parameter spaces that describe the signals, and by the complexity of the astrophysical models, which can also depend on a large number of parameters, some of which might not be directly constrained by the observations. We present a method for constraining population models using a hierarchical Bayesian modeling approach which simultaneously infers the source parameters and population model and provides the joint probability distributions for both. We illustrate this approach by considering the constraints that can be placed on population models for galactic white dwarf binaries using a future space-based gravitational wave detector. We find that a mission that is able to resolve approximately 5000 of the shortest period binaries will be able to constrain the population model parameters, including the chirp mass distribution and a characteristic galaxy disk radius to within a few percent. This compares favorably to existing bounds, where electromagnetic observations of stars in the galaxy constrain disk radii to within 20%.

  7. Integrated chemical/physical and biological processes modeling Part 2

    African Journals Online (AJOL)

    The approach of characterising sewage sludge into carbohydrates, lipids and proteins, as is done in the International Water Association (IWA) AD model No 1 ... found to be 64 to 68% biodegradable (depending on the kinetic formulation selected for the hydrolysis process) and to have a C3.5H7O2N0.196 composition.

  8. Modeling and Analysis of Supplier Selection Method Using ...

    African Journals Online (AJOL)

    However, in these parts of the world the application of tools and models for the supplier selection problem is yet to surface, and the banking and finance industry here in Ethiopia is no exception. Thus, the purpose of this research was to address the supplier selection problem through modeling and application of analytical hierarchy ...

  9. Dealing with selection bias in educational transition models

    DEFF Research Database (Denmark)

    Holm, Anders; Jæger, Mads Meier

    2011-01-01

    This paper proposes the bivariate probit selection model (BPSM) as an alternative to the traditional Mare model for analyzing educational transitions. The BPSM accounts for selection on unobserved variables by allowing for unobserved variables which affect the probability of making educational tr...

  10. Adaptive Gaussian Predictive Process Models for Large Spatial Datasets

    Science.gov (United States)

    Guhaniyogi, Rajarshi; Finley, Andrew O.; Banerjee, Sudipto; Gelfand, Alan E.

    2011-01-01

    Large point referenced datasets occur frequently in the environmental and natural sciences. Use of Bayesian hierarchical spatial models for analyzing these datasets is undermined by onerous computational burdens associated with parameter estimation. Low-rank spatial process models attempt to resolve this problem by projecting spatial effects to a lower-dimensional subspace. This subspace is determined by a judicious choice of “knots” or locations that are fixed a priori. One such representation yields a class of predictive process models (e.g., Banerjee et al., 2008) for spatial and spatial-temporal data. Our contribution here expands upon predictive process models with fixed knots to models that accommodate stochastic modeling of the knots. We view the knots as emerging from a point pattern and investigate how such adaptive specifications can yield more flexible hierarchical frameworks that lead to automated knot selection and substantial computational benefits. PMID:22298952

  11. Performance Measurement Model for the Supplier Selection Based on AHP

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2015-10-01

    Full Text Available The performance of the supplier is a crucial factor for the success or failure of any company. Rational and effective decision making in terms of the supplier selection process can help the organization to optimize cost and quality functions. The nature of supplier selection processes is generally complex, especially when the company has a large variety of products and vendors. Over the years, several solutions and methods have emerged for addressing the supplier selection problem (SSP). Experience and studies have shown that there is no single best way of evaluating and selecting suppliers; the process varies from one organization to another. The aim of this research is to demonstrate how a multiple attribute decision making approach can be effectively applied for the supplier selection process.
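The AHP mechanics behind such a model can be sketched in a few lines: a pairwise comparison matrix on the Saaty 1-9 scale, a priority vector from its principal eigenvector, and a consistency check. The criteria and judgement values below are invented for illustration:

```python
# 3x3 pairwise comparison matrix for criteria (cost, quality, delivery).
# Entry A[i][j] says how strongly criterion i is preferred over j (Saaty scale).
A = [[1.0, 3.0, 5.0],
     [1.0 / 3.0, 1.0, 3.0],
     [1.0 / 5.0, 1.0 / 3.0, 1.0]]

def priority_vector(M, iters=100):
    """Principal eigenvector by power iteration, normalised to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

def consistency_ratio(M, w):
    """CR = CI / RI; judgements are usually deemed acceptable when CR < 0.1."""
    n = len(M)
    # lambda_max estimated from M w = lambda w, averaged over components.
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random consistency indices
    return ci / ri

w = priority_vector(A)
cr = consistency_ratio(A, w)
```

Supplier scores on each criterion would then be combined with the weights `w` by a weighted sum; a CR above 0.1 signals that the decision maker's pairwise judgements should be revisited before ranking suppliers.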

  12. On Optimal Input Design and Model Selection for Communication Channels

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yanyan [ORNL; Djouadi, Seddik M [ORNL; Olama, Mohammed M [ORNL

    2013-01-01

    In this paper, the optimal model (structure) selection and input design which minimize the worst-case identification error for communication systems are provided. The problem is formulated using metric complexity theory in a Hilbert space setting. It is pointed out that model selection and input design can be handled independently. The Kolmogorov n-width is used to characterize the representation error introduced by model selection, while the Gel'fand and time n-widths are used to represent the inherent error introduced by input design. After the model is selected, an optimal input which minimizes the worst-case identification error is shown to exist. In particular, it is proven that the optimal model for reducing the representation error is a Finite Impulse Response (FIR) model, and the optimal input is an impulse at the start of the observation interval. FIR models are widely popular in communication systems, for example in Orthogonal Frequency Division Multiplexing (OFDM) systems.
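What "identify an FIR channel model from input-output data" means can be sketched as a least-squares problem. This is a generic illustration, not the paper's worst-case-optimal design: for simplicity it uses a random binary probe and ordinary least squares rather than the impulse input the paper proves optimal, and the channel taps are assumed:

```python
import random

random.seed(7)

# True FIR channel taps (assumed for the demo) and a known probe input.
h_true = [0.9, 0.4, -0.2]
N = 200
u = [random.choice((-1.0, 1.0)) for _ in range(N)]           # known input sequence
y = [sum(h_true[k] * u[n - k] for k in range(len(h_true)) if n - k >= 0)
     + random.gauss(0.0, 0.05) for n in range(N)]            # noisy channel output

# Least-squares FIR identification: solve (X^T X) h = X^T y.
m = len(h_true)
X = [[u[n - k] if n - k >= 0 else 0.0 for k in range(m)] for n in range(N)]
XtX = [[sum(X[n][i] * X[n][j] for n in range(N)) for j in range(m)] for i in range(m)]
Xty = [sum(X[n][i] * y[n] for n in range(N)) for i in range(m)]

def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for j in range(c, n + 1):
                M[r][j] -= f * M[c][j]
    x = [0.0] * n
    for r in reversed(range(n)):
        x[r] = (M[r][n] - sum(M[r][j] * x[j] for j in range(r + 1, n))) / M[r][r]
    return x

h_hat = solve(XtX, Xty)
```

The model-selection question in the abstract is then the choice of `m` (FIR order): too small leaves representation error, too large inflates the variance of `h_hat` for a fixed observation budget.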

  13. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

    Full Text Available Business process modelling and analysis is undoubtedly one of the most important parts of applied (business) informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst’s work is to create generally understandable, explicit and error-free models. If a process is properly described, the resulting models can be used as input into deep analysis and optimization. It can be assumed that properly designed business process models (similarly to correctly written algorithms) exhibit characteristics that can be described mathematically, and that it is therefore possible to create a tool that helps process analysts design proper models. As part of this review, a systematic literature review was conducted in order to find and analyse measures of business process model design and quality. It was found that this area had already been the subject of research investigation in the past. Thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed publications and existing quality measures do not reflect all important attributes of a business process model’s clarity, simplicity and completeness. It would therefore be appropriate to add new measures of quality.

  14. Model and Variable Selection Procedures for Semiparametric Time Series Regression

    Directory of Open Access Journals (Sweden)

    Risa Kato

    2009-01-01

    Full Text Available Semiparametric regression models are very useful for time series analysis. They facilitate the detection of features resulting from external interventions. The complexity of semiparametric models poses new challenges for issues of nonparametric and parametric inference and model selection that frequently arise from time series data analysis. In this paper, we propose penalized least squares estimators which can simultaneously select significant variables and estimate unknown parameters. An innovative class of variable selection procedure is proposed to select significant variables and basis functions in a semiparametric model. The asymptotic normality of the resulting estimators is established. Information criteria for model selection are also proposed. We illustrate the effectiveness of the proposed procedures with numerical simulations.
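The key idea of the abstract, penalized least squares that simultaneously estimates coefficients and selects variables, can be sketched with an L1 (lasso-type) penalty solved by coordinate descent. The data-generating model and penalty level below are invented for illustration; the paper's estimators additionally handle basis functions for the nonparametric part:

```python
import random

random.seed(5)

# Toy linear model: only the first two of five predictors matter.
beta_true = [2.0, -1.5, 0.0, 0.0, 0.0]
n, p = 200, 5
X = [[random.gauss(0.0, 1.0) for _ in range(p)] for _ in range(n)]
y = [sum(b * xi for b, xi in zip(beta_true, row)) + random.gauss(0.0, 0.5)
     for row in X]

def soft_threshold(z, g):
    return (z - g) if z > g else (z + g) if z < -g else 0.0

def lasso_cd(X, y, lam, sweeps=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(sweeps):
        for j in range(p):
            # Partial residual excluding coordinate j.
            r = [y[i] - sum(b[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            denom = sum(X[i][j] ** 2 for i in range(n)) / n
            b[j] = soft_threshold(rho, lam) / denom
    return b

b_hat = lasso_cd(X, y, lam=0.2)
```

The soft-thresholding step is what performs selection: coefficients whose signal does not exceed the penalty level are set exactly to zero, so estimation and variable selection happen in one fit rather than in a separate stepwise search.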

  15. 45 CFR 2522.415 - How does the grant selection process work?

    Science.gov (United States)

    2010-10-01

    45 Public Welfare, Regulations Relating to Public Welfare (Continued), CORPORATION FOR... Programs, § 2522.415: How does the grant selection process work? The selection process includes: (a...

  16. A decision model for the risk management of hazardous processes

    International Nuclear Information System (INIS)

    Holmberg, J.E.

    1997-03-01

    A decision model for the risk management of hazardous processes is formulated in this study as an optimisation problem of a point process. In the approach, the decisions made by the management are divided into three categories: (1) the planned process lifetime, (2) the selection of the design, and (3) operational decisions. These three controlling methods play quite different roles in practical risk management, which is also reflected in our approach. The optimisation of the process lifetime is related to the licensing problem of the process. It provides a boundary condition for a feasible utility function that is used as the actual objective function, i.e., maximising the process lifetime utility. By design modifications, the management can affect the inherent accident hazard rate of the process; this is usually a discrete optimisation task. The study particularly concentrates upon the optimisation of the operational strategies given a certain design and licensing time. This is done with a dynamic risk model (a marked point process model) representing the stochastic process of events observable or unobservable to the decision maker. An optimal long-term control variable guiding the selection of operational alternatives in short-term problems is studied. The optimisation problem is solved by the stochastic quasi-gradient procedure. The approach is illustrated by a case study. (23 refs.)
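The flavour of comparing operational strategies under a marked point process can be conveyed by simulation. Everything below is an invented toy, not the study's model or its quasi-gradient solver: events arrive as a Poisson process, each event is marked as an accident or a near-miss, and two operational rules are compared by expected lifetime utility:

```python
import random

random.seed(11)

T = 20.0      # licensed process lifetime (assumed)
RATE = 0.3    # event rate of the hazard point process (design-dependent, assumed)
P_ACC = 0.2   # probability that an event is an accident rather than a near-miss
COST = 50.0   # accident cost; revenue accrues at rate 1 while operating

def run(stop_after_near_miss):
    """Simulate one process lifetime under a given operational rule."""
    t, revenue = 0.0, 0.0
    while True:
        dt = random.expovariate(RATE)     # time to the next event of the process
        if t + dt >= T:
            return revenue + (T - t)      # survived to the end of the licence
        t += dt
        revenue += dt
        if random.random() < P_ACC:
            return revenue - COST         # accident mark terminates the process
        if stop_after_near_miss:
            return revenue                # cautious rule: stop at first warning

def expected_utility(rule, n=20_000):
    return sum(run(rule) for _ in range(n)) / n
```

With these (assumed) numbers the cautious rule beats running until failure because the accident cost dominates the forgone revenue; changing `RATE` plays the role of a design modification, and changing `T` the role of the licensing decision.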

  17. Selective processes in development: implications for the costs and benefits of phenotypic plasticity.

    Science.gov (United States)

    Snell-Rood, Emilie C

    2012-07-01

    Adaptive phenotypic plasticity, the ability of a genotype to develop a phenotype appropriate to the local environment, allows organisms to cope with environmental variation and has implications for predicting how organisms will respond to rapid, human-induced environmental change. This review focuses on the importance of developmental selection, broadly defined as a developmental process that involves the sampling of a range of phenotypes and feedback from the environment reinforcing high-performing phenotypes. I hypothesize that understanding the degree to which developmental selection underlies plasticity is key to predicting the costs, benefits, and consequences of plasticity. First, I review examples that illustrate that elements of developmental selection are common across the development of many different traits, from physiology and immunity to circulation and behavior. Second, I argue that developmental selection, relative to a fixed strategy or determinate (switch) mechanisms of plasticity, increases the probability that an individual will develop a phenotype best matched to the local environment. However, the exploration and environmental feedback associated with developmental selection is costly in terms of time, energy, and predation risk, resulting in major changes in life history such as increased duration of development and greater investment in individual offspring. Third, I discuss implications of developmental selection as a mechanism of plasticity, from predicting adaptive responses to novel environments to understanding conditions under which genetic assimilation may fuel diversification. Finally, I outline exciting areas of future research, in particular exploring costs of selective processes in the development of traits outside of behavior and modeling developmental selection and evolution in novel environments.

  18. A Network Analysis Model for Selecting Sustainable Technology

    Directory of Open Access Journals (Sweden)

    Sangsung Park

    2015-09-01

    Most companies develop technologies to improve their competitiveness in the marketplace. Typically, they then patent these technologies around the world in order to protect their intellectual property. Other companies may use patented technologies to develop new products, but must pay royalties to the patent holders or owners. Should they fail to do so, this can result in legal disputes in the form of patent infringement actions between companies. To avoid such situations, companies attempt to research and develop necessary technologies before their competitors do so. An important part of this process is analyzing existing patent documents in order to identify emerging technologies. In such analyses, extracting sustainable technology from patent data is important, because sustainable technology drives technological competition among companies and, thus, the development of new technologies. In addition, selecting sustainable technologies makes it possible to plan R&D (research and development) efficiently. In this study, we propose a network model that can be used to select sustainable technologies from patent documents, based on the centrality and degree measures of social network analysis. To verify the performance of the proposed model, we carry out a case study using actual patent data from patent databases.
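
    The centrality computation behind such a model can be sketched in a few lines. A minimal sketch on a toy undirected patent-keyword network; the node names and edges are invented for illustration, not taken from the study's patent data:

```python
# Toy undirected patent-keyword network; node names and edges are
# invented for illustration, not taken from the study's patent data.
edges = [
    ("laser_sintering", "powder_bed"),
    ("laser_sintering", "heat_treatment"),
    ("powder_bed", "heat_treatment"),
    ("laser_sintering", "support_design"),
]

# Build an adjacency list.
adjacency = {}
for a, b in edges:
    adjacency.setdefault(a, set()).add(b)
    adjacency.setdefault(b, set()).add(a)

# Degree centrality: degree / (n - 1), the standard SNA normalization.
n = len(adjacency)
centrality = {node: len(neigh) / (n - 1) for node, neigh in adjacency.items()}

# The highest-centrality node is the candidate core ("sustainable") technology.
core = max(centrality, key=centrality.get)
```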

  19. Modelling income processes with lots of heterogeneity

    DEFF Research Database (Denmark)

    Browning, Martin; Ejrnæs, Mette; Alvarez, Javier

    2010-01-01

    We model earnings processes allowing for lots of heterogeneity across agents. We also introduce an extension to the linear ARMA model which allows the initial convergence to the long run to be different from that implied by the conventional ARMA model. This is particularly important for unit root...

  20. Counting Processes for Retail Default Modeling

    DEFF Research Database (Denmark)

    Kiefer, Nicholas Maximilian; Larson, C. Erik

    in a discrete state space. In a simple case, the states could be default/non-default; in other models relevant for credit modeling the states could be credit scores or payment status (30 dpd, 60 dpd, etc.). Here we focus on the use of stochastic counting processes for mortgage default modeling, using data...
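
    The discrete-state payment-status view described above can be illustrated with a small Markov transition model; the states follow the abstract, but the monthly transition probabilities below are invented for illustration:

```python
# Payment-status states for a retail loan; the monthly transition
# probabilities below are invented for illustration.
states = ["current", "30dpd", "60dpd", "default"]
P = [
    [0.95, 0.05, 0.00, 0.00],  # from current
    [0.60, 0.25, 0.15, 0.00],  # from 30dpd
    [0.30, 0.00, 0.40, 0.30],  # from 60dpd
    [0.00, 0.00, 0.00, 1.00],  # default is absorbing
]

def step(dist, P):
    """Advance the state distribution one month: dist' = dist @ P."""
    return [sum(dist[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]

# Start with every loan current and evolve twelve months.
dist = [1.0, 0.0, 0.0, 0.0]
for _ in range(12):
    dist = step(dist, P)

default_prob_12m = dist[states.index("default")]
```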

  1. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification means quantifying and reducing uncertainties: the objective is to quantify the uncertainties in parameters, models, and measurements, and to propagate them through the model so that one can make a predictive estimate with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, meaning that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are identified. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model. We employ this simple heat model to illustrate verification
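
    Parameter selection of the kind described, removing parameters with minimal impact on the model response, is often approached via scaled local sensitivities. A minimal sketch on a toy exponential-decay model (standing in for the HIV or heat models; the model form and all values are hypothetical):

```python
import math

# Toy response model standing in for the HIV/heat models: y(t) = A * exp(-k t).
# The model form and nominal values are hypothetical.
def model(t, params):
    return params["A"] * math.exp(-params["k"] * t)

def scaled_sensitivity(name, params, times, rel_step=1e-6):
    """RMS of the parameter-scaled finite-difference sensitivity over a time
    grid, so that parameters with different units become comparable."""
    pert = dict(params)
    h = rel_step * abs(params[name])
    pert[name] += h
    scaled = [(model(t, pert) - model(t, params)) / h * params[name] for t in times]
    return math.sqrt(sum(s * s for s in scaled) / len(scaled))

params = {"A": 2.0, "k": 0.1}
times = [0.5 * i for i in range(1, 21)]

# Rank parameters by influence; low-ranked ones are candidates for removal.
ranking = sorted(params, key=lambda p: scaled_sensitivity(p, params, times),
                 reverse=True)
```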

  2. Distillation modeling for a uranium refining process

    International Nuclear Information System (INIS)

    Westphal, B.R.

    1996-01-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled both by an equilibrium expression and on a molecular basis, since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process
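
    An equilibrium (Rayleigh) batch-distillation model of the kind mentioned can be sketched as follows; the relative volatility and compositions are generic illustrations, not data for the LiCl-KCl/uranium system:

```python
import math

# Rayleigh (simple batch) distillation with constant relative volatility.
# alpha and the compositions are generic illustrations, not data for the
# LiCl-KCl/uranium system.
def equilibrium_y(x, alpha):
    """Vapor mole fraction in equilibrium with liquid mole fraction x."""
    return alpha * x / (1.0 + (alpha - 1.0) * x)

def batch_distill(x0, alpha, boil_fraction, steps=10000):
    """Euler-integrate the Rayleigh equation dx/d(ln L) = y - x as the
    liquid charge L boils down to (1 - boil_fraction) of its start."""
    x = x0
    d = math.log(1.0 - boil_fraction) / steps  # negative increment in ln L
    for _ in range(steps):
        x += (equilibrium_y(x, alpha) - x) * d
    return x

# The volatile component is enriched in the vapor, so the liquid left
# behind (the "heel") is depleted in it as distillation proceeds.
x_final = batch_distill(x0=0.5, alpha=5.0, boil_fraction=0.9)
```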

  3. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...

  4. Modeling closed nuclear fuel cycles processes

    Energy Technology Data Exchange (ETDEWEB)

    Shmidt, O.V. [A.A. Bochvar All-Russian Scientific Research Institute for Inorganic Materials, Rogova, 5a street, Moscow, 123098 (Russian Federation); Makeeva, I.R. [Zababakhin All-Russian Scientific Research Institute of Technical Physics, Vasiliev street 13, Snezhinsk, Chelyabinsk region, 456770 (Russian Federation); Liventsov, S.N. [Tomsk Polytechnic University, Tomsk, Lenin Avenue, 30, 634050 (Russian Federation)

    2016-07-01

    Computer models of processes are necessary for determining optimal operating conditions for closed nuclear fuel cycle (NFC) processes. Computer models can be quickly updated in accordance with fresh data from experimental research. Three kinds of process simulation are necessary. First, the VIZART software package is a balance-model development environment used for calculating material flows in technological processes; VIZART takes into account equipment capacity, transport lines, and storage volumes. Second, it is necessary to simulate the physico-chemical processes involved in the closure of the NFC. The third kind of simulation is the development of software that allows optimization, diagnostics, and control of the processes, which implies real-time simulation of product flows for the whole plant or for separate lines of the plant. (A.C.)

  5. Random effect selection in generalised linear models

    DEFF Research Database (Denmark)

    Denwood, Matt; Houe, Hans; Forkman, Björn

    We analysed abattoir recordings of meat inspection codes with possible relevance to onfarm animal welfare in cattle. Random effects logistic regression models were used to describe individual-level data obtained from 461,406 cattle slaughtered in Denmark. Our results demonstrate that the largest...

  6. Hierarchical Model of Assessing and Selecting Experts

    Science.gov (United States)

    Chernysheva, T. Y.; Korchuganova, M. A.; Borisov, V. V.; Min'kov, S. L.

    2016-04-01

    Revealing experts’ competences is a multi-objective issue. Authors of the paper deal with competence assessing methods of experts seen as objects, and criteria of qualities. An analytic hierarchy process of assessing and ranking experts is offered, which is based on paired comparison matrices and scores, quality parameters are taken into account as well. Calculation and assessment of experts is given as an example.
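
    The paired-comparison step of the analytic hierarchy process can be sketched as follows; the comparison matrix is an invented example on Saaty's 1-9 scale, not taken from the paper:

```python
# Pairwise comparison matrix on Saaty's 1-9 scale (invented example):
# A[i][j] is how strongly expert/criterion i is preferred over j.
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]
n = len(A)

def priority_weights(A, iterations=100):
    """Principal eigenvector of A by power iteration, normalized to sum to 1."""
    w = [1.0 / len(A)] * len(A)
    for _ in range(iterations):
        w_new = [sum(A[i][j] * w[j] for j in range(len(A))) for i in range(len(A))]
        s = sum(w_new)
        w = [x / s for x in w_new]
    return w

w = priority_weights(A)

# Consistency index CI = (lambda_max - n) / (n - 1); CI well below 0.1
# indicates acceptably consistent judgements.
Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Aw[i] / w[i] for i in range(n)) / n
CI = (lambda_max - n) / (n - 1)
```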

  7. Hierarchical Model of Assessing and Selecting Experts

    OpenAIRE

    Chernysheva, Tatiana Yurievna; Korchuganova, Mariya Anatolievna; Borisov, V. V.; Minkov, S. L.

    2016-01-01

    Revealing experts' competences is a multi-objective issue. Authors of the paper deal with competence assessing methods of experts seen as objects, and criteria of qualities. An analytic hierarchy process of assessing and ranking experts is offered, which is based on paired comparison matrices and scores, quality parameters are taken into account as well. Calculation and assessment of experts is given as an example.

  8. Model Selection in Kernel Ridge Regression

    DEFF Research Database (Denmark)

    Exterkate, Peter

    , including polynomial kernels, the Gaussian kernel, and the Sinc kernel. We interpret the latter two kernels in terms of their smoothing properties, and we relate the tuning parameters associated to all these kernels to smoothness measures of the prediction function and to the signal-to-noise ratio. Based...... applicable, and we recommend their use instead of the popular polynomial kernels in general settings, in which no information on the data-generating process is available....

  9. Adolescent girls' friendship networks, body dissatisfaction, and disordered eating: examining selection and socialization processes.

    Science.gov (United States)

    Rayner, Kathryn E; Schniering, Carolyn A; Rapee, Ronald M; Taylor, Alan; Hutchinson, Delyse M

    2013-02-01

    Previous research has shown that adolescent girls tend to resemble their friends in their level of body dissatisfaction and disordered eating. However, no studies to date have attempted to disentangle the underlying peer selection and socialization processes that may explain this homophily. The current study used longitudinal stochastic actor-based modeling to simultaneously examine these two processes in a large community sample of adolescent girls (N = 1,197) from nine Australian girls' high schools. Friendship nominations and measures of body dissatisfaction, dieting and bulimic behaviors were collected across three annual waves. Results indicated that selection rather than socialization effects contributed to similarity within friendship groups when both processes were examined simultaneously. Specifically, girls tended to select friends who were similar to themselves in terms of body dissatisfaction and bulimic behaviors, but dissimilar in terms of dieting. Network and individual attribute variables also emerged as significant in explaining changes in adolescents' friendships and behaviors. As well as having important clinical implications, the findings point to the importance of controlling for friendship selection when examining the role of peers in adolescent body image and eating problems.

  10. The site selection process for a spent fuel repository in Finland. Summary report

    International Nuclear Information System (INIS)

    McEwen, T.; Aeikaes, T.

    2000-12-01

    This Summary Report describes the Finnish programme for the selection and characterisation of potential sites for the deep disposal of spent nuclear fuel and explains the process by which Olkiluoto has been selected as the single site proposed for the development of a spent fuel disposal facility. Its aim is to provide an overview of this process, initiated almost twenty years ago, which has entered its final phase. It provides information in three areas: a review of the early site selection criteria, a description of the site selection process, including all the associated site characterisation work, up to the point at which a single site was selected and an outline of the proposed work, in particular that proposed underground, to characterise further the Olkiluoto site. In 1983 the Finnish Government made a policy decision on the management of nuclear waste in which the main goals and milestones for the site selection programme for the deep disposal of spent fuel were presented. According to this decision several site candidates, whose selection was to be based on careful studies of the whole country, should be characterised and the site for the repository selected by the end of the year 2000. This report describes the process by which this policy decision has been achieved. The report begins with a discussion of the definition of the geological and environmental site selection criteria and how they were applied in order to select a small number of sites, five in all, that were to be the subject of the preliminary investigations. The methods used to investigate these sites and the results of these investigations are described, as is the evaluation of the results of these investigations and the process used to discard two of the sites and continue more detailed investigations at the remaining three. The detailed site investigations that commenced in 1993 are described with respect to the overall strategy followed and the investigation techniques applied. The

  11. Modeling shape selection of buckled dielectric elastomers

    Science.gov (United States)

    Langham, Jacob; Bense, Hadrien; Barkley, Dwight

    2018-02-01

    A dielectric elastomer whose edges are held fixed will buckle, given a sufficiently large applied voltage, resulting in a nontrivial out-of-plane deformation. We study this situation numerically using a nonlinear elastic model which decouples two of the principal electrostatic stresses acting on an elastomer: normal pressure due to the mutual attraction of oppositely charged electrodes and tangential shear ("fringing") due to repulsion of like charges at the electrode edges. These enter via physically simplified boundary conditions that are applied in a fixed reference domain using a nondimensional approach. The method is valid for small to moderate strains and is straightforward to implement in a generic nonlinear elasticity code. We validate the model by directly comparing the simulated equilibrium shapes with experiment. For circular electrodes, which buckle axisymmetrically, the shape of the deflection profile is captured. Annular electrodes of different widths produce azimuthal ripples with wavelengths that match our simulations. In this case, it is essential to compute multiple equilibria because the first solution obtained by the nonlinear solver (Newton's method) is often not the energetically favored state. We address this using a numerical technique known as "deflation." Finally, we observe the large number of different solutions that may be obtained for the case of a long rectangular strip.

  12. An experimental study of factors affecting the selective inhibition of sintering process

    Science.gov (United States)

    Asiabanpour, Bahram

    Selective Inhibition of Sintering (SIS) is a new rapid prototyping method that builds parts in a layer-by-layer fabrication basis. SIS works by joining powder particles through sintering in the part's body, and by sintering inhibition of some selected powder areas. The objective of this research has been to improve the new SIS process, which has been invented at USC. The process improvement is based on statistical design of experiments. To conduct the needed experiments a working machine and related path generator software were needed. The machine and its control software were made available prior to this research. The path generator algorithms and software had to be created. This program should obtain model geometry data from a CAD file and generate an appropriate path file for the printer nozzle. Also, the program should generate a simulation file for path file inspection using virtual prototyping. The activities related to path generator constitute the first part of this research, which has resulted in an efficient path generator. In addition, to reach an acceptable level of accuracy, strength, and surface quality in the fabricated parts, all effective factors in the SIS process should be identified and controlled. Simultaneous analytical and experimental studies were conducted to recognize effective factors and to control the SIS process. Also, it was known that polystyrene was the most appropriate polymer powder and saturated potassium iodide was the most effective inhibitor among the available candidate materials. In addition, statistical tools were applied to improve the desirable properties of the parts fabricated by the SIS process. An investigation of part strength was conducted using the Response Surface Methodology (RSM) and a region of acceptable operating conditions for the part strength was found. Then, through analysis of the experimental results, the impact of the factors on the final part surface quality and dimensional accuracy was modeled. 
After

  13. Models as instruments for optimizing hospital processes: a systematic review.

    Science.gov (United States)

    van Sambeek, J R C; Cornelissen, F A; Bakker, P J M; Krabbendam, J J

    2010-01-01

    The purpose of this article is to find decision-making models for the design and control of processes regarding patient flows, considering various problem types, and to find out how usable these models are for managerial decision making. A systematic review of the literature was carried out. Relevant literature from three databases was selected based on inclusion and exclusion criteria and the results were analyzed. A total of 68 articles were selected. Of these, 31 contained computer simulation models, ten contained descriptive models, and 27 contained analytical models. The review showed that descriptive models are only applied to process design problems, and that analytical and computer simulation models are applied to all types of problems to approximately the same extent. Only a few models have been validated in practice, and it seems that most models are not used for their intended purpose: to support management in decision making. The comparability of the relevant databases appears to be limited and there is an insufficient number of suitable keywords and MeSH headings, which makes searching systematically within the broad field of health care management relatively hard to accomplish. The findings give managers insight into the characteristics of various types of decision-support models and into the kinds of situations in which they are used. This is the first time literature on various kinds of models for supporting managerial decision making in hospitals has been systematically collected and assessed.

  14. Multi-Criteria Decision Making For Determining A Simple Model of Supplier Selection

    Science.gov (United States)

    Harwati

    2017-06-01

    Supplier selection is a decision involving many criteria. Supplier selection models usually involve more than five main criteria and more than ten sub-criteria; in fact, many models include more than twenty criteria. Having so many criteria in a supplier selection model can make it difficult to apply in many companies. This research focuses on designing a supplier selection model that is easy and simple to apply in a company. The Analytic Hierarchy Process (AHP) is used to weight the criteria. The analysis shows that four criteria are enough for an easy and simple supplier selection model: price (weight 0.4), shipment (weight 0.3), quality (weight 0.2), and service (weight 0.1). A real-case simulation shows that the simple model yields the same decision as a more complex model.
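
    With the four criteria weighted as in the abstract, supplier ranking reduces to a weighted sum. A minimal sketch, with hypothetical supplier scores on a 0-10 scale:

```python
# Criteria weights as reported in the abstract; the supplier scores
# (0-10 scale) are hypothetical.
weights = {"price": 0.4, "shipment": 0.3, "quality": 0.2, "service": 0.1}

suppliers = {
    "Supplier A": {"price": 8, "shipment": 6, "quality": 7, "service": 5},
    "Supplier B": {"price": 6, "shipment": 9, "quality": 8, "service": 7},
}

def total_score(scores):
    """Weighted-sum score of one supplier."""
    return sum(weights[c] * scores[c] for c in weights)

best = max(suppliers, key=lambda s: total_score(suppliers[s]))
```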

  15. Statistical Model Selection for TID Hardness Assurance

    Science.gov (United States)

    Ladbury, R.; Gorelick, J. L.; McClure, S.

    2010-01-01

    Radiation Hardness Assurance (RHA) methodologies against Total Ionizing Dose (TID) degradation impose rigorous statistical treatments for data from a part's Radiation Lot Acceptance Test (RLAT) and/or its historical performance. However, no similar methods exist for using "similarity" data - that is, data for similar parts fabricated in the same process as the part under qualification. This is despite the greater difficulty and potential risk in interpreting similarity data. In this work, we develop methods to disentangle part-to-part, lot-to-lot and part-type-to-part-type variation. The methods we develop apply not just to qualification decisions, but also to quality control and detection of process changes and other "out-of-family" behavior. We begin by discussing the data used in the study and the challenges of developing a statistic that provides a meaningful measure of degradation across multiple part types, each with its own performance specifications. We then develop analysis techniques and apply them to the different data sets.

  16. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: "Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model". One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  17. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    International Nuclear Information System (INIS)

    E.L. Hardin

    2000-01-01

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: "Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model". One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  18. Chronic Treatment with a Promnesiant GABA-A α5-Selective Inverse Agonist Increases Immediate Early Genes Expression during Memory Processing in Mice and Rectifies Their Expression Levels in a Down Syndrome Mouse Model

    Science.gov (United States)

    Braudeau, J.; Dauphinot, L.; Duchon, A.; Loistron, A.; Dodd, R. H.; Hérault, Y.; Delatour, B.; Potier, M. C.

    2011-01-01

    Decrease of GABAergic transmission has been proposed to improve memory functions. Indeed, inverse agonists selective for α5 GABA-A-benzodiazepine receptors (α5IA) have promnesiant activity. Interestingly, we have recently shown that α5IA can rescue cognitive deficits in Ts65Dn mice, a Down syndrome mouse model with altered GABAergic transmission. Here, we studied the impact of chronic treatment with α5IA on gene expression in the hippocampus of Ts65Dn and control euploid mice after being trained in the Morris water maze task. In euploid mice, chronic treatment with α5IA increased IEGs expression, particularly of c-Fos and Arc genes. In Ts65Dn mice, deficits of IEGs activation were completely rescued after treatment with α5IA. In addition, normalization of Sod1 overexpression in Ts65Dn mice after α5IA treatment was observed. IEG expression regulation after α5IA treatment following behavioral stimulation could be a contributing factor for both the general promnesiant activity of α5IA and its rescuing effect in Ts65Dn mice alongside signaling cascades that are critical for memory consolidation and cognition. PMID:22028705

  19. Numerical simulation of complex part manufactured by selective laser melting process

    Science.gov (United States)

    Van Belle, Laurent

    2017-10-01

    The Selective Laser Melting (SLM) process, belonging to the family of Additive Manufacturing (AM) technologies, enables parts to be built layer by layer from metallic powder and a CAD model. The physical phenomena that occur in the process raise the same issues as conventional welding: thermal gradients generate significant residual stresses and distortions in the parts. Moreover, large and complex parts accentuate these undesirable effects. It is therefore essential for manufacturers to gain a better understanding of the process and to ensure reliable production of parts with high added value. This paper focuses on the simulation of the manufacture of a turbine by the SLM process in order to calculate residual stresses and distortions. Numerical results will be presented.

  20. The application of the FMEA method in the selected production process of a company

    Directory of Open Access Journals (Sweden)

    Piotr Barosz

    2018-04-01

    The aim of this article is to show the use of failure mode and effects analysis as a prevention tool for controlling the quality of a given production process in a company. The scope of the work covers an analysis of a selected process, definition of the nonconformities present in this process, and then the FMEA analysis itself. A production company should implement thinking and actions based on the so-called 'quality loop': an interdependence model of the actions undertaken that shape quality, running from the identification of a customer's requirements through design and the production process up to the assessment of the actual capability to meet the defined requirements. Applying such an approach makes it possible to improve the operation of quality management in a systemic way.
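
    The core FMEA calculation, ranking failure modes by risk priority number (RPN = severity × occurrence × detection), can be sketched as follows; the failure modes, ratings, and the action threshold of 100 are illustrative assumptions, not taken from the article:

```python
# FMEA risk priority numbers: RPN = severity * occurrence * detection,
# each rated 1-10. Failure modes, ratings, and the action threshold of
# 100 are invented for illustration.
failure_modes = [
    {"mode": "wrong material delivered", "S": 7, "O": 3, "D": 4},
    {"mode": "machining tool wear",      "S": 5, "O": 6, "D": 2},
    {"mode": "assembly misalignment",    "S": 8, "O": 2, "D": 7},
]

for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

# Address the highest-RPN modes first.
ranked = sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True)
needs_action = [fm["mode"] for fm in failure_modes if fm["RPN"] > 100]
```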

  1. MODELLING PURCHASING PROCESSES FROM QUALITY ASPECTS

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2008-12-01

    Management has the fundamental task of identifying and directing the primary and specific processes within the purchasing function, applying an up-to-date information infrastructure. ISO 9001:2000 defines a process as a set of interrelated or interacting activities transforming inputs into outputs, and the "process approach" as the systematic identification and management of the processes employed within an organization, and particularly of the relationships among those processes. To direct a quality management system using the process approach, the organization has to determine the map of its general (basic) processes. Primary processes are determined on the grounds of their interrelationships and their impact on satisfying customers' needs. To make a proper choice of general business processes, it is necessary to trace the entire business flow, beginning with customer demand and ending with the delivery of the product or service provided. In the next step the process model is converted into a data model, which is essential for implementing an information system enabling automation, monitoring, measurement, inspection, analysis, and improvement of key purchasing processes. This paper presents the methodology and some results of an investigation into the development of an IS for the purchasing process from the quality aspect.

  2. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  3. Value-Oriented Coordination Process Modeling

    NARCIS (Netherlands)

    Fatemi, Hassan; van Sinderen, Marten J.; Wieringa, Roelf J.; Hull, Richard; Mendling, Jan; Tai, Stefan

    Business webs are collections of enterprises designed to jointly satisfy a consumer need. Designing business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business value and coordination process perspectives, and for mutually aligning these

  4. Fast Bayesian Inference in Dirichlet Process Mixture Models.

    Science.gov (United States)

    Wang, Lianming; Dunson, David B

    2011-01-01

    There has been increasing interest in applying Bayesian nonparametric methods in large samples and high dimensions. As Markov chain Monte Carlo (MCMC) algorithms are often infeasible, there is a pressing need for much faster algorithms. This article proposes a fast approach for inference in Dirichlet process mixture (DPM) models. Viewing the partitioning of subjects into clusters as a model selection problem, we propose a sequential greedy search algorithm for selecting the partition. Then, when conjugate priors are chosen, the resulting posterior conditional on the selected partition is available in closed form. This approach allows testing of parametric models versus nonparametric alternatives based on Bayes factors. We evaluate the approach using simulation studies and compare it with four other fast nonparametric methods in the literature. We apply the proposed approach to three datasets including one from a large epidemiologic study. Matlab codes for the simulation and data analyses using the proposed approach are available online in the supplemental materials.
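The sequential greedy search over partitions can be sketched in a much-reduced form for one-dimensional data under a conjugate normal-normal model; the CRP concentration `alpha` and the variance values below are illustrative assumptions, not settings from the paper:

```python
import numpy as np

def greedy_partition(x, alpha=1.0, sigma2=1.0, tau2=10.0):
    """Sequentially assign points to clusters, greedily maximizing a
    CRP-prior-weighted Gaussian predictive score (known observation
    variance sigma2, zero-mean prior with variance tau2)."""
    clusters, labels = [], []
    for xi in x:
        scores = []
        for c in clusters:  # score of joining each existing cluster
            n = len(c)
            post_var = 1.0 / (1.0 / tau2 + n / sigma2)
            post_mean = post_var * np.sum(c) / sigma2
            pred_var = post_var + sigma2
            scores.append(np.log(n)
                          - 0.5 * np.log(2 * np.pi * pred_var)
                          - 0.5 * (xi - post_mean) ** 2 / pred_var)
        pred_var = tau2 + sigma2  # score of opening a new cluster
        scores.append(np.log(alpha)
                      - 0.5 * np.log(2 * np.pi * pred_var)
                      - 0.5 * xi ** 2 / pred_var)
        k = int(np.argmax(scores))
        if k == len(clusters):
            clusters.append([xi])
        else:
            clusters[k].append(xi)
        labels.append(k)
    return np.array(labels)
```

With conjugate priors each candidate score is available in closed form, which is what makes a greedy pass of this kind cheap relative to MCMC.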

  5. Variable selection for mixture and promotion time cure rate models.

    Science.gov (United States)

    Masud, Abdullah; Tu, Wanzhu; Yu, Zhangsheng

    2016-11-16

    Failure-time data with cured patients are common in clinical studies. Data from these studies are typically analyzed with cure rate models. Variable selection methods have not been well developed for cure rate models. In this research, we propose two least absolute shrinkage and selection operator (LASSO)-based methods for variable selection in mixture and promotion time cure models with parametric or nonparametric baseline hazards. We conduct an extensive simulation study to assess the operating characteristics of the proposed methods. We illustrate the use of the methods using data from a study of childhood wheezing. © The Author(s) 2016.

  6. Process and Context in Choice Models

    DEFF Research Database (Denmark)

    Ben-Akiva, Moshe; Palma, André de; McFadden, Daniel

    2012-01-01

    We develop a general framework that extends choice models by including an explicit representation of the process and context of decision making. Process refers to the steps involved in decision making. Context refers to factors affecting the process, focusing in this paper on social networks. ... The extended choice framework includes more behavioral richness through the explicit representation of the planning process preceding an action and its dynamics and the effects of context (family, friends, and market) on the process leading to a choice, as well as the inclusion of new types of subjective data ...

  7. Modeling of Dielectric Heating within Lyophilization Process

    Directory of Open Access Journals (Sweden)

    Jan Kyncl

    2014-01-01

    Full Text Available A process of lyophilization of paper books is modeled. The process of drying is controlled by a dielectric heating system. From the physical viewpoint, the task represents a 2D coupled problem described by two partial differential equations for the electric and temperature fields. The material parameters are supposed to be temperature-dependent functions. The continuous mathematical model is solved numerically. The methodology is illustrated with some examples whose results are discussed.
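The record above describes a 2-D coupled electro-thermal problem; as a drastically reduced illustration of the thermal half of such a model, the following sketch advances one explicit finite-difference step of 1-D heat conduction with a given volumetric dielectric heating source. The material constants are rough order-of-magnitude values for dry paper, not the paper's calibrated, temperature-dependent parameters:

```python
import numpy as np

def heat_step(T, q, dx, dt, k=0.15, rho=700.0, cp=1300.0):
    """One explicit finite-difference step of 1-D heat conduction with a
    volumetric heating source q [W/m^3]; k, rho, cp are rough
    order-of-magnitude values for dry paper (illustrative only)."""
    lap = (np.roll(T, -1) - 2.0 * T + np.roll(T, 1)) / dx ** 2
    lap[0] = lap[-1] = 0.0  # insulated boundary nodes (sketch)
    return T + dt * (k * lap + q) / (rho * cp)
```

In the full coupled problem, q would come from the solution of the electric field equation at each step rather than being prescribed.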

  8. The MCDM Model for Personnel Selection Based on SWARA and ARAS Methods

    Directory of Open Access Journals (Sweden)

    Darjan Karabasevic

    2015-05-01

    Full Text Available Competent employees are the key resource in an organization for achieving success and, therefore, competitiveness on the market. The aim of the recruitment and selection process is to acquire personnel with certain competencies required for a particular position, i.e., a position within the company. Bearing in mind that decision-makers often underuse formal decision-making methods, this paper aims to establish an MCDM model for the evaluation and selection of candidates in the process of the recruitment and selection of personnel, based on the SWARA and the ARAS methods. Apart from providing an MCDM model, the paper additionally provides a set of evaluation criteria for the position of a sales manager (the middle management in the telecommunications industry), which is also used in the numerical example. On the basis of a numerical example, the proposed MCDM model can be successfully used in selecting candidates in the process of employment.
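The ARAS aggregation step of such a model can be sketched as follows; the decision matrix and the criterion weights (which SWARA would supply in the paper's model) are hypothetical:

```python
import numpy as np

def aras_rank(X, weights, benefit):
    """Utility degrees of alternatives under the ARAS method.
    X: alternatives x criteria; benefit[j] is False for cost criteria,
    which are inverted before normalization."""
    X = np.asarray(X, dtype=float).copy()
    for j, b in enumerate(benefit):
        if not b:
            X[:, j] = 1.0 / X[:, j]
    M = np.vstack([X.max(axis=0), X])  # prepend the ideal alternative
    M = M / M.sum(axis=0)              # column-wise normalization
    S = (M * weights).sum(axis=1)      # weighted optimality scores
    return S[1:] / S[0]                # utility degree relative to the ideal
```

Candidates are then ranked by their utility degree, each value expressing how close a candidate comes to the ideal alternative.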

  9. Cost Models for MMC Manufacturing Processes

    Science.gov (United States)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  10. Physical and mathematical modelling of extrusion processes

    DEFF Research Database (Denmark)

    Arentoft, Mogens; Gronostajski, Z.; Niechajowics, A.

    2000-01-01

    The main objective of the work is to study the extrusion process using physical modelling and to compare the findings of the study with finite element predictions. The possibilities and advantages of the simultaneous application of both of these methods for the analysis of metal forming processes...

  11. Business Process Modeling Notation - An Overview

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

    Full Text Available BPMN is an industry standard created to offer a common and user-friendly notation to all participants in a business process. The present paper aims to briefly present the main features of this notation, as well as an interpretation of some of the main patterns characterizing a business process modelled by means of workflows.

  12. Partner Selection Optimization Model of Agricultural Enterprises in Supply Chain

    OpenAIRE

    Feipeng Guo; Qibei Lu

    2013-01-01

    With the correct selection of partners in the supply chain of agricultural enterprises becoming more and more important, a large number of partner evaluation techniques are widely used in the field of agricultural science research. This study established a partner selection model to optimize the issue of agricultural supply chain partner selection. Firstly, it constructed a comprehensive evaluation index system after analyzing the real characteristics of the agricultural supply chain. Secondly, a heuristic met...

  13. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Being represented in some appropriate representation language, the purpose of these models is to model process plants and plant automatics in a unified way to allow verification and computer aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report aims at investigating how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated from the potential of such a combination in safety analysis based on models comprising both software, hardware, and operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories (author) (ml)

  14. Effect of Model Selection on Computed Water Balance Components

    NARCIS (Netherlands)

    Jhorar, R.K.; Smit, A.A.M.F.R.; Roest, C.W.J.

    2009-01-01

    Soil water flow modelling approaches as used in four selected on-farm water management models, namely CROPWAT, FAIDS, CERES and SWAP, are compared through numerical experiments. The soil water simulation approaches used in the first three models are reformulated to incorporate all evapotranspiration

  15. Quantitative analytical hierarchy process to marketing store location selection

    Directory of Open Access Journals (Sweden)

    Harwati

    2018-01-01

    Full Text Available The selection of a store to market a product is a multi-criteria decision-making problem: the criteria conflict with one another in producing an optimal location. This research uses four important criteria, drawn from the literature, to select the new location of a marketing store: distance to the location, level of competition with competitors, number of potential customers, and location rent cost. Quantitative data are used to determine the optimum location with the AHP method; quantitative data are preferred in order to avoid the inconsistency that can arise when using expert opinion. The AHP yields the optimum location among three alternative places.
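A minimal sketch of the AHP priority computation, using the geometric-mean row method rather than the full principal-eigenvector approach; the pairwise comparison matrix below is hypothetical:

```python
import numpy as np

def ahp_weights(P):
    """Priority weights from a pairwise comparison matrix via the
    geometric-mean row method, plus Saaty's consistency ratio."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    gm = P.prod(axis=1) ** (1.0 / n)
    w = gm / gm.sum()
    lam = (P @ w / w).mean()  # estimate of the principal eigenvalue
    ci = (lam - n) / (n - 1)  # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty's random index
    return w, ci / ri
```

A consistency ratio above roughly 0.1 conventionally signals that the pairwise judgments should be revised, which is one reason quantitative data can be preferable to expert scores.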

  16. Modelling uncertainty due to imperfect forward model and aerosol microphysical model selection in the satellite aerosol retrieval

    Science.gov (United States)

    Määttä, Anu; Laine, Marko; Tamminen, Johanna

    2015-04-01

    This study aims to characterize the uncertainty related to aerosol microphysical model selection and the modelling error due to approximations in the forward modelling. Many satellite aerosol retrieval algorithms rely on pre-calculated look-up tables of model parameters representing various atmospheric conditions. In the retrieval we need to choose the most appropriate aerosol microphysical models from the pre-defined set of models by fitting them to the observations. The aerosol properties, e.g. AOD, are then determined from the best models. This choice of an appropriate aerosol model constitutes a notable part of the AOD retrieval uncertainty. The motivation in our study was to account for these two sources in the total uncertainty budget: the uncertainty in selecting the most appropriate model, and the uncertainty resulting from the approximations in the pre-calculated aerosol microphysical model. The systematic model error was analysed by studying the behaviour of the model residuals, i.e. the differences between modelled and observed reflectances, by statistical methods. We utilised Gaussian processes to characterize the uncertainty related to approximations in aerosol microphysics modelling due to the use of look-up tables and other non-modelled systematic features in the Level 1 data. The modelling error is described by a non-diagonal covariance matrix parameterised by a correlation length, which is estimated from the residuals using computational tools from spatial statistics. In addition, we utilised Bayesian model selection and model averaging methods to account for the uncertainty due to aerosol model selection. By acknowledging the modelling error as a source of uncertainty in the retrieval of AOD from observed spectral reflectance, we allow the observed values to deviate from the modelled values within limits determined by both the measurement and modelling errors. This results in a more realistic uncertainty level of the retrieved AOD. The method is illustrated by both
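A non-diagonal modelling-error covariance parameterised by a correlation length can be sketched with a simple exponential kernel; the kernel form and the values of `sigma` and `ell` are illustrative assumptions rather than the estimator used in the study:

```python
import numpy as np

def modelling_error_cov(wavelengths, sigma, ell):
    """Non-diagonal modelling-error covariance from an exponential
    kernel with marginal standard deviation sigma and correlation
    length ell (a sketch of the idea, not the paper's estimator)."""
    wl = np.asarray(wavelengths, dtype=float)
    d = np.abs(wl[:, None] - wl[None, :])  # pairwise wavelength distances
    return sigma ** 2 * np.exp(-d / ell)
```

The total error covariance used in the fit would then be the diagonal measurement covariance plus a term of this kind, letting residuals at nearby wavelengths co-vary.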

  17. The (Mathematical) Modeling Process in Biosciences.

    Science.gov (United States)

    Torres, Nestor V; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model and optimization and system management derived from the analysis of the mathematical model. All along this work the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system are presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology.

  18. Ensembling Variable Selectors by Stability Selection for the Cox Model

    Directory of Open Access Journals (Sweden)

    Qing-Yan Yin

    2017-01-01

    Full Text Available As a pivotal tool to build interpretive models, variable selection plays an increasingly important role in high-dimensional data analysis. In recent years, variable selection ensembles (VSEs) have gained much interest due to their many advantages. Stability selection (Meinshausen and Bühlmann, 2010), a VSE technique based on subsampling in combination with a base algorithm like lasso, is an effective method to control the false discovery rate (FDR) and to improve selection accuracy in linear regression models. By adopting lasso as a base learner, we attempt to extend stability selection to handle variable selection problems in a Cox model. According to our experience, it is crucial to set the regularization region Λ in lasso and the parameter λmin properly so that stability selection can work well. To the best of our knowledge, however, there is no literature addressing this problem in an explicit way. Therefore, we first provide a detailed procedure to specify Λ and λmin. Then, some simulated and real-world data with various censoring rates are used to examine how well stability selection performs. It is also compared with several other variable selection approaches. Experimental results demonstrate that it achieves better or competitive performance in comparison with several other popular techniques.
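The subsampling scheme at the heart of stability selection can be sketched with an ordinary linear lasso as the base learner (a stand-in for the Cox lasso used in the paper); the penalty level and subsample fraction below are illustrative:

```python
import numpy as np
from sklearn.linear_model import Lasso

def selection_frequencies(X, y, alpha=0.1, n_sub=50, frac=0.5, seed=0):
    """Fraction of random subsamples in which each variable is selected
    by a lasso base learner (linear regression standing in for the
    Cox model of the paper)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_sub):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        counts += Lasso(alpha=alpha).fit(X[idx], y[idx]).coef_ != 0
    return counts / n_sub
```

Variables whose selection frequency exceeds a chosen cutoff (e.g. 0.6) form the stable set; spurious variables rarely survive across many subsamples, which is how the procedure controls false discoveries.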

  19. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments, which can be expensive and time consuming. An alternative approach is the use of a systematic model-based framework according to an established work-flow in product-process design, replacing some of the time consuming and/or repetitive experimental steps. The advantages of the use of a model ... -process design. Illustrative examples highlighting the need for efficient model-based systems will be presented, where the need for predictive models for innovative chemical product-process design will be highlighted. The examples will cover aspects of chemical product-process design where the idea of the grand ...

  20. Elementary Teachers' Selection and Use of Visual Models

    Science.gov (United States)

    Lee, Tammy D.; Gail Jones, M.

    2018-02-01

    As science grows in complexity, science teachers face an increasing challenge of helping students interpret models that represent complex science systems. Little is known about how teachers select and use models when planning lessons. This mixed methods study investigated the pedagogical approaches and visual models used by elementary in-service and preservice teachers in the development of a science lesson about a complex system (e.g., water cycle). Sixty-seven elementary in-service and 69 elementary preservice teachers completed a card sort task designed to document the types of visual models (e.g., images) that teachers choose when planning science instruction. Quantitative and qualitative analyses were conducted to analyze the card sort task. Semistructured interviews were conducted with a subsample of teachers to elicit the rationale for image selection. Results from this study showed that both experienced in-service teachers and novice preservice teachers tended to select similar models and use similar rationales for images to be used in lessons. Teachers tended to select models that were aesthetically pleasing and simple in design and illustrated specific elements of the water cycle. The results also showed that teachers were not likely to select images that represented the less obvious dimensions of the water cycle. Furthermore, teachers selected visual models more as a pedagogical tool to illustrate specific elements of the water cycle and less often as a tool to promote student learning related to complex systems.

  1. Validation of elk resource selection models with spatially independent data

    Science.gov (United States)

    Priscilla K. Coe; Bruce K. Johnson; Michael J. Wisdom; John G. Cook; Marty Vavra; Ryan M. Nielson

    2011-01-01

    Knowledge of how landscape features affect wildlife resource use is essential for informed management. Resource selection functions often are used to make and validate predictions about landscape use; however, resource selection functions are rarely validated with data from landscapes independent of those from which the models were built. This problem has severely...

  2. A Working Model of Natural Selection Illustrated by Table Tennis

    Science.gov (United States)

    Dinc, Muhittin; Kilic, Selda; Aladag, Caner

    2013-01-01

    Natural selection is one of the most important topics in biology and it helps to clarify the variety and complexity of organisms. However, students in almost every stage of education find it difficult to understand the mechanism of natural selection and they can develop misconceptions about it. This article provides an active model of natural…

  3. Augmented Self-Modeling as an Intervention for Selective Mutism

    Science.gov (United States)

    Kehle, Thomas J.; Bray, Melissa A.; Byer-Alcorace, Gabriel F.; Theodore, Lea A.; Kovac, Lisa M.

    2012-01-01

    Selective mutism is a rare disorder that is difficult to treat. It is often associated with oppositional defiant behavior, particularly in the home setting, social phobia, and, at times, autism spectrum disorder characteristics. The augmented self-modeling treatment has been relatively successful in promoting rapid diminishment of selective mutism…

  4. Extending Model Checking To Object Process Validation

    NARCIS (Netherlands)

    van Rein, H.

    2002-01-01

    Object-oriented techniques allow the gathering and modelling of system requirements in terms of an application area. The expression of data and process models at that level is a great asset in communication with non-technical people in that area, but it does not necessarily lead to consistent

  5. Hierarchical Structured Model for Nonlinear Dynamical Processes ...

    African Journals Online (AJOL)

    The mathematical representation of the process, in this context, is by a set of linear stochastic differential equations (SDE) with unique solutions. The problem of realization is that of constructing the dynamical system by looking at the problem of scientific model building. In model building, one must be able to calculate the ...

  6. Nanotube/Polymer Composites: Materials Selection and Process Design

    National Research Council Canada - National Science Library

    Winey, Karen

    2004-01-01

    ...) define processing methods most appropriate for the materials identified. Our study of SWNT-polymer composites focuses on thermoplastics, because these materials can be readily drawn into fibers...

  7. Robust Decision-making Applied to Model Selection

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Laboratory

    2012-08-06

    The scientific and engineering communities are relying more and more on numerical models to simulate ever-increasingly complex phenomena. Selecting a model, from among a family of models that meets the simulation requirements, presents a challenge to modern-day analysts. To address this concern, a framework is adopted anchored in info-gap decision theory. The framework proposes to select models by examining the trade-offs between prediction accuracy and sensitivity to epistemic uncertainty. The framework is demonstrated on two structural engineering applications by asking the following question: Which model, of several numerical models, approximates the behavior of a structure when parameters that define each of those models are unknown? One observation is that models that are nominally more accurate are not necessarily more robust, and their accuracy can deteriorate greatly depending upon the assumptions made. It is posited that, as reliance on numerical models increases, establishing robustness will become as important as demonstrating accuracy.

  8. Online Learning of Hierarchical Pitman-Yor Process Mixture of Generalized Dirichlet Distributions With Feature Selection.

    Science.gov (United States)

    Fan, Wentao; Sallay, Hassen; Bouguila, Nizar

    2017-09-01

    In this paper, a novel statistical generative model based on hierarchical Pitman-Yor process and generalized Dirichlet distributions (GDs) is presented. The proposed model allows us to perform joint clustering and feature selection thanks to the interesting properties of the GD distribution. We develop an online variational inference algorithm, formulated in terms of the minimization of a Kullback-Leibler divergence, of our resulting model that tackles the problem of learning from high-dimensional examples. This variational Bayes formulation allows simultaneously estimating the parameters, determining the model's complexity, and selecting the appropriate relevant features for the clustering structure. Moreover, the proposed online learning algorithm allows data instances to be processed in a sequential manner, which is critical for large-scale and real-time applications. Experiments conducted using challenging applications, namely, scene recognition and video segmentation, where our approach is viewed as an unsupervised technique for visual learning in high-dimensional spaces, showed that the proposed approach is suitable and promising.

  9. The Selection of Bridge Materials Utilizing the Analytical Hierarchy Process

    Science.gov (United States)

    Robert L. Smith; Robert J. Bush; Daniel L. Schmoldt

    1997-01-01

    Effective decisions on the use of natural resources often require the input of many individuals. Determining how specific criteria affect the selection of materials can lead to better utilization of raw materials. Concrete, steel, and timber represent over 98% of the materials used for bridge construction in the United States. Highway officials must often consider...

  10. MODEL OF QUALITY MANAGEMENT OF TECHNOLOGICAL PROCESSES OF THE GRAIN PROCESSING AND MILL ENTERPRISES

    Directory of Open Access Journals (Sweden)

    M. M. Blagoveshchenskaia

    2014-01-01

    Full Text Available Summary. In this work a model of quality management of the technological processes of grain processing and mill enterprises is presented. It is shown that flour-grinding production is an important part of the agro-industrial complex because it provides production of the main food product of people, flour. Analytical indicators of the quality of the technological process are presented. A matrix of expert estimates of the i-th level of quality for the set combinations of parameter values according to the scheme of a complete factorial experiment is made. A model for the calculation of raw-material preparation for milling, which characterizes the main qualities of the processed raw materials, is considered. For the purpose of managing the quality of the technological processes of a flour mill, a mathematical model is developed which includes the calculation of two groups of assessment indicators: the quality of preparation of raw materials for grinding, and the quality of conducting the technological process. An algorithm for the analytical assessment of the quality indicators of the technological process of flour-grinding enterprises, including the selection of waste, the selection of bran, the compliance rate of the output of flour-grinding products and the compliance rate of product moisture, is offered. An assessment of the quality management of the technological process of a high-quality grinding is carried out on the example of several leading flour-grinding enterprises of the Central Federal District. A two-dimensional model of quality management of the technological process is constructed, based on analytical assessment indicators of quality, an assessment of the quality of preparation of the raw materials for grinding, and an optimum effective condition of the technological process. It is shown that quality management at the enterprise provides for collecting, processing and analyzing information on the condition of material streams and production at all of their stages.

  11. Target Selection Models with Preference Variation Between Offenders

    NARCIS (Netherlands)

    Townsley, Michael; Birks, Daniel; Ruiter, Stijn; Bernasco, Wim; White, Gentry

    2016-01-01

    Objectives: This study explores preference variation in location choice strategies of residential burglars. Applying a model of offender target selection that is grounded in assertions of the routine activity approach, rational choice perspective, crime pattern and social disorganization theories,

  12. Akaike information criterion to select well-fit resist models

    Science.gov (United States)

    Burbine, Andrew; Fryer, David; Sturtevant, John

    2015-03-01

    In the field of model design and selection, there is always a risk that a model is over-fit to the data used to train it. A model is well suited when it describes the physical system and not the stochastic behavior of the particular data collected. K-fold cross validation is a method to check this potential over-fitting by calibrating with k folds of the data, typically between 4 and 10. Model training is a computationally expensive operation, however, and given a wide choice of candidate models, calibrating each one repeatedly becomes prohibitively time consuming. The Akaike information criterion (AIC) is an information-theoretic approach to model selection based on the maximized log-likelihood for a given model that needs only a single calibration per model. It is used in this study to demonstrate model ranking and selection among compact resist modelforms that have various numbers and types of terms to describe photoresist behavior. It is shown that there is a good correspondence of AIC to K-fold cross validation in selecting the best modelform, and it is further shown that over-fitting is, in most cases, not indicated. In modelforms with more than 40 fitting parameters, the size of the calibration data set benefits from additional parameters, statistically validating the model complexity.
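The single-calibration ranking that AIC enables can be sketched directly from its definition; the candidate modelforms and their maximized log-likelihoods below are hypothetical numbers, not data from the study:

```python
def aic(log_likelihood, k):
    """Akaike information criterion: AIC = 2k - 2 ln L."""
    return 2 * k - 2 * log_likelihood

# Hypothetical maximized log-likelihoods and parameter counts for three
# candidate modelforms (illustrative values only).
candidates = {"3-term": (-120.4, 3), "7-term": (-112.1, 7), "15-term": (-111.8, 15)}
ranked = sorted(candidates, key=lambda m: aic(*candidates[m]))
```

In this toy ranking the 7-term form wins: the 15-term form fits slightly better but pays a larger complexity penalty, which is how AIC flags likely over-fitting from a single calibration per model.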

  13. Filament winding cylinders. I - Process model

    Science.gov (United States)

    Lee, Soo-Yong; Springer, George S.

    1990-01-01

    A model was developed which describes the filament winding process of composite cylinders. The model relates the significant process variables such as winding speed, fiber tension, and applied temperature to the thermal, chemical and mechanical behavior of the composite cylinder and the mandrel. Based on the model, a user friendly code was written which can be used to calculate (1) the temperature in the cylinder and the mandrel, (2) the degree of cure and viscosity in the cylinder, (3) the fiber tensions and fiber positions, (4) the stresses and strains in the cylinder and in the mandrel, and (5) the void diameters in the cylinder.

  14. A risk assessment model for selecting cloud service providers

    OpenAIRE

    Cayirci, Erdal; Garaga, Alexandr; Santana de Oliveira, Anderson; Roudier, Yves

    2016-01-01

    The Cloud Adoption Risk Assessment Model is designed to help cloud customers in assessing the risks that they face by selecting a specific cloud service provider. It evaluates background information obtained from cloud customers and cloud service providers to analyze various risk scenarios. This facilitates decision making when selecting the cloud service provider with the most preferable risk profile, based on aggregated risks to security, privacy, and service delivery. Based on this model we ...

  15. SELECTION MOMENTS AND GENERALIZED METHOD OF MOMENTS FOR HETEROSKEDASTIC MODELS

    Directory of Open Access Journals (Sweden)

    Constantin ANGHELACHE

    2016-06-01

    Full Text Available In this paper, the authors describe moment selection methods and the application of the generalized method of moments (GMM) to heteroskedastic models. The utility of GMM estimators lies in the study of financial market models. The moment selection criteria are applied for the efficient GMM estimation of univariate time series with martingale difference errors, similar to those studied so far by Kuersteiner.
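A toy GMM estimator with two exactly identifying moment conditions and an identity weighting matrix can be sketched as follows; it is a generic illustration of the estimator class, not the moment-selection procedure of the paper:

```python
import numpy as np
from scipy.optimize import minimize

def gmm_estimate(x):
    """GMM with two exactly identifying moment conditions and an
    identity weighting matrix: E[x - mu] = 0, E[(x - mu)^2 - s2] = 0."""
    def g_bar(theta):
        mu, s2 = theta
        g = np.column_stack([x - mu, (x - mu) ** 2 - s2])
        return g.mean(axis=0)  # sample moment conditions

    obj = lambda th: g_bar(th) @ g_bar(th)  # quadratic form, W = identity
    return minimize(obj, x0=[0.0, 1.0], method="Nelder-Mead").x
```

In an over-identified setting, the choice of which moments to include (the subject of the selection criteria above) and of the weighting matrix drives the efficiency of the estimator.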

  16. Resource use efficiency in garri processing in some selected Local ...

    African Journals Online (AJOL)

    The results showed that labour (x1), operating capital (x2) and land (x4) were the major factors influencing the output of processed garri in Ekiti State, Nigeria. ... that a conducive environment should be provided to both the producers and processors for its sustainability at the production, processing and marketing levels.

  17. Selection of parameters for advanced machining processes using firefly algorithm

    Directory of Open Access Journals (Sweden)

    Rajkamal Shukla

    2017-02-01

    Full Text Available Advanced machining processes (AMPs) are widely utilized in industries for machining complex geometries and intricate profiles. In this paper, two significant processes, electric discharge machining (EDM) and abrasive water jet machining (AWJM), are considered in order to obtain the optimum values of the responses for the given range of process parameters. The firefly algorithm (FA) is applied to the considered processes to obtain the optimized parameters, and the results obtained are compared with the results given by previous researchers. The variation of the process parameters with respect to the responses is plotted to confirm the optimum results obtained using FA. In the EDM process, the performance parameter "MRR" is increased from 159.70 gm/min to 181.6723 gm/min, while "Ra" and "REWR" are decreased from 6.21 μm to 3.6767 μm and from 6.21% to 6.324 × 10−5% respectively. In the AWJM process, the values of "kerf" and "Ra" are decreased from 0.858 mm to 0.3704 mm and from 5.41 mm to 4.443 mm respectively. In both processes, the obtained results show a significant improvement in the responses.
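A minimal firefly algorithm for continuous parameter optimization might look like the following sketch; the population size, attraction parameters, and test function are illustrative choices, not the settings used in the study:

```python
import numpy as np

def firefly_minimize(f, bounds, n=15, iters=60,
                     beta0=1.0, gamma=0.01, alpha=0.2, seed=1):
    """Minimal firefly algorithm for continuous minimization (sketch):
    dimmer fireflies move toward brighter ones with distance-dependent
    attraction plus a shrinking random walk."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(n, len(lo)))
    F = np.array([f(x) for x in X])
    for t in range(iters):
        step = alpha * 0.97 ** t  # shrinking random-walk scale
        for i in range(n):
            for j in range(n):
                if F[j] < F[i]:  # firefly i moves toward brighter j
                    beta = beta0 * np.exp(-gamma * np.sum((X[i] - X[j]) ** 2))
                    X[i] += beta * (X[j] - X[i]) + step * rng.uniform(-0.5, 0.5, len(lo))
                    X[i] = np.clip(X[i], lo, hi)
                    F[i] = f(X[i])
    best = int(np.argmin(F))
    return X[best], F[best]
```

For the machining applications, `f` would be a response model (e.g. a regression for MRR or Ra) evaluated over the feasible range of process parameters.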

  18. Effect of Processing on the Elemental Composition of Selected Leafy ...

    African Journals Online (AJOL)

    The elemental composition of leaves of Vernonia amygdalina, Gnetum africana, Gongronema latifolium and Ocimum gratissimum subjected to different processing methods were investigated. Processing methods employed include oven drying, sun drying, fresh milling, steaming and a combination of these while the mineral ...

  19. Modeling Aspects of Activated Sludge Processes Part II: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    International Nuclear Information System (INIS)

    AbdElHaleem, H.S.; EI-Ahwany, A. H.; Ibrahim, H.I.; Ibrahim, G.

    2004-01-01

    Mathematical process modeling and the biokinetics of the activated sludge process were reviewed, considering different types of models. The task group models ASM1, ASM2 and ASM3, developed by Henze et al., were evaluated with respect to the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase. In this type of model, the internal mass transfer inside the flocs is neglected; hence, the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model, which considers the mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type

  20. Multivariate fault isolation of batch processes via variable selection in partial least squares discriminant analysis.

    Science.gov (United States)

    Yan, Zhengbing; Kuang, Te-Hui; Yao, Yuan

    2017-09-01

    In recent years, multivariate statistical monitoring of batch processes has become a popular research topic, wherein multivariate fault isolation is an important step aiming at the identification of the faulty variables contributing most to the detected process abnormality. Although contribution plots have been commonly used in statistical fault isolation, such methods suffer from the smearing effect between correlated variables. In particular, in batch process monitoring, the high autocorrelations and cross-correlations that exist in variable trajectories make the smearing effect unavoidable. To address this problem, a variable selection-based fault isolation method is proposed in this research, which transforms the fault isolation problem into a variable selection problem in partial least squares discriminant analysis and solves it by calculating a sparse partial least squares model. Different from traditional methods, the proposed method emphasizes the relative importance of each process variable. Such information may help process engineers in conducting root-cause diagnosis. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
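    Sparse PLS-DA itself is not shown in the record. As a stand-in, the sketch below isolates faulty variables by soft-thresholding the standardized mean-difference between faulty and normal batches, which mimics the sparsity idea (only variables with clearly non-zero loadings survive, so the smearing onto unrelated variables is suppressed). The data, the fault shift, and the threshold are all hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n, p = 100, 10
    normal = rng.normal(size=(n, p))
    fault = rng.normal(size=(n, p))
    fault[:, [2, 5]] += 2.0          # the simulated fault shifts variables 2 and 5 only

    # standardised mean difference between faulty and normal batches
    diff = (fault.mean(axis=0) - normal.mean(axis=0)) / normal.std(axis=0)

    # soft-thresholding keeps only variables with a clearly non-zero loading,
    # mimicking the sparsity constraint of a sparse PLS model
    delta = 0.8
    sparse_loading = np.sign(diff) * np.maximum(np.abs(diff) - delta, 0.0)
    faulty_vars = np.flatnonzero(sparse_loading)
    print(faulty_vars)   # variables isolated as responsible for the fault
    ```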

  1. Guide for selection of dosimetry system for electron processing

    International Nuclear Information System (INIS)

    Mehta, K.

    1988-01-01

    Correct applications of radiation processing depend on accurate measurements of absorbed radiation dose. Radiation dosimetry plays several important roles in radiation processing. In particular, there are three stages for any radiation process during which dosimetry is a key to success: basic laboratory research, commissioning of the process and quality control. Radiation dosimeters may be divided into various classes depending upon their areas of applications and their relative quality: primary standard dosimeter, reference standard dosimeter, transfer standard dosimeter and routine in-house dosimeter. Several commercially available dosimeters are described under each class, and their advantages and limitations are discussed. Finally, recommendations are made as to which dosimeter is most suitable for each of the three stages of electron-beam processing. 124 refs

  2. Process modeling study of the CIF incinerator

    International Nuclear Information System (INIS)

    Hang, T.

    1995-01-01

    The Savannah River Site (SRS) plans to begin operating the Consolidated Incineration Facility (CIF) in 1996. The CIF will treat liquid and solid low-level radioactive, mixed and RCRA hazardous wastes generated at SRS. In addition to experimental test programs, process modeling was applied to provide guidance in areas of safety, environmental regulation compliance, process improvement and optimization. A steady-state flowsheet model was used to calculate material/energy balances and to track key chemical constituents throughout the process units. Dynamic models were developed to predict the CIF transient characteristics in normal and abnormal operation scenarios. Predictions include the rotary kiln heat transfer, dynamic responses of the CIF to fluctuations in the solid waste feed or upsets in the system equipment, performance of the control system, air inleakage in the kiln, etc. This paper reviews the modeling study performed to assist in the deflagration risk assessment

  3. Model Selection in Continuous Test Norming With GAMLSS.

    Science.gov (United States)

    Voncken, Lieke; Albers, Casper J; Timmerman, Marieke E

    2017-06-01

    To compute norms from reference group test scores, continuous norming is preferred over traditional norming. A suitable continuous norming approach for continuous data is the Box-Cox Power Exponential model, which belongs to the family of generalized additive models for location, scale, and shape (GAMLSS). Applying the Box-Cox Power Exponential model to test norming requires model selection, but it is unknown how well this can be done with an automatic selection procedure. In a simulation study, we compared the performance of two stepwise model selection procedures combined with four model-fit criteria (Akaike information criterion, Bayesian information criterion, generalized Akaike information criterion (3), cross-validation), varying data complexity, sampling design, and sample size in a fully crossed design. The new procedure combined with the generalized Akaike information criterion was the most efficient model selection procedure (i.e., required the smallest sample size). The advocated model selection procedure is illustrated with norming data of an intelligence test.
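    The Box-Cox Power Exponential model is not reimplemented here; the sketch below only illustrates the underlying mechanism of automatic selection by an information criterion, on a much simpler candidate family (polynomial degree chosen by AIC/BIC) with simulated data. The data-generating model and all constants are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    x = rng.uniform(-3, 3, n)
    y = 0.5 * x + 1.0 * x**3 + rng.normal(0.0, 3.0, n)   # true curve is cubic

    def fit_rss(deg):
        """Least-squares polynomial fit; returns the residual sum of squares."""
        X = np.vander(x, deg + 1)                         # includes the intercept column
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        return float(r @ r)

    degrees = np.arange(1, 7)
    rss = np.array([fit_rss(d) for d in degrees])
    k = degrees + 2                    # deg+1 coefficients (incl. intercept) + error variance
    aic = n * np.log(rss / n) + 2 * k
    bic = n * np.log(rss / n) + k * np.log(n)
    best_aic = degrees[np.argmin(aic)]
    best_bic = degrees[np.argmin(bic)]
    print(best_aic, best_bic)
    ```

    A GAMLSS procedure does the same thing on a richer model space: each candidate's fit penalty is traded against its parameter count, and the criterion with the heavier penalty (here BIC) guards more strongly against overfitting.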

  4. A finite volume alternate direction implicit approach to modeling selective laser melting

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri; Mohanty, Sankhya

    2013-01-01

    Over the last decade, several studies have attempted to develop thermal models for analyzing the selective laser melting process with a vision to predict thermal stresses, microstructures and resulting mechanical properties of manufactured products. While a holistic model addressing all involved... is proposed for modeling single-layer and few-layers selective laser melting processes. The ADI technique is implemented and applied for two cases involving constant material properties and non-linear material behavior. The ADI FV method consumes less time while having comparable accuracy with respect to 3D...
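    The abstract is truncated, but the named technique, an alternating-direction-implicit (ADI) scheme with a tridiagonal solve per grid line, can be sketched for plain 2D heat conduction. This is the classic Peaceman-Rachford splitting, not the authors' coupled laser-melting model; the grid size, hot-spot initial condition (standing in for laser energy input) and the ratio r are illustrative.

    ```python
    import numpy as np

    def thomas(a, b, c, d):
        """Solve a tridiagonal system; a sub-, b main-, c super-diagonal."""
        n = len(d)
        cp = np.zeros(n); dp = np.zeros(n)
        cp[0] = c[0] / b[0]; dp[0] = d[0] / b[0]
        for i in range(1, n):
            m = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / m
            dp[i] = (d[i] - a[i] * dp[i - 1]) / m
        x = np.zeros(n); x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    def adi_step(T, r):
        """One Peaceman-Rachford ADI step for dT/dt = alpha * laplacian(T),
        with r = alpha*dt/(2*dx**2) and fixed (Dirichlet) boundary values."""
        n, m = T.shape
        Th = T.copy()
        for j in range(1, m - 1):       # half-step 1: implicit in x, explicit in y
            d = T[1:-1, j] + r * (T[1:-1, j + 1] - 2 * T[1:-1, j] + T[1:-1, j - 1])
            d[0] += r * T[0, j]; d[-1] += r * T[-1, j]
            Th[1:-1, j] = thomas(np.full(n - 2, -r), np.full(n - 2, 1 + 2 * r),
                                 np.full(n - 2, -r), d)
        Tn = Th.copy()
        for i in range(1, n - 1):       # half-step 2: implicit in y, explicit in x
            d = Th[i, 1:-1] + r * (Th[i + 1, 1:-1] - 2 * Th[i, 1:-1] + Th[i - 1, 1:-1])
            d[0] += r * Th[i, 0]; d[-1] += r * Th[i, -1]
            Tn[i, 1:-1] = thomas(np.full(m - 2, -r), np.full(m - 2, 1 + 2 * r),
                                 np.full(m - 2, -r), d)
        return Tn

    T = np.zeros((33, 33))
    T[14:19, 14:19] = 1000.0   # hot spot standing in for deposited laser energy
    peak0 = T.max()
    for _ in range(5):
        T = adi_step(T, r=0.5)
    print(T.max())
    ```

    Each step costs only a sequence of 1D tridiagonal solves, which is the reason ADI "consumes less time" than a fully 3D implicit solve of comparable accuracy.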

  5. Seeking inclusion in an exclusive process: discourses of medical school student selection.

    Science.gov (United States)

    Razack, Saleem; Hodges, Brian; Steinert, Yvonne; Maguire, Mary

    2015-01-01

    Calls to increase medical class representativeness to better reflect the diversity of society represent a growing international trend. There is an inherent tension between these calls and competitive student selection processes driven by academic achievement. How is this tension manifested? Our three-phase interdisciplinary research programme focused on the discourses of excellence, equity and diversity in the medical school selection process, as conveyed by key stakeholders: (i) institutions and regulatory bodies (the websites of 17 medical schools and 15 policy documents from national regulatory bodies); (ii) admissions committee members (ACMs) (according to semi-structured interviews [n = 9]), and (iii) successful applicants (according to semi-structured interviews [n = 14]). The work is theoretically situated within the works of Foucault, Bourdieu and Bakhtin. The conceptual framework is supplemented by critical hermeneutics and the performance theories of Goffman. Academic excellence discourses consistently predominate over discourses calling for greater representativeness in medical classes. Policy addressing demographic representativeness in medicine may unwittingly contribute to the reproduction of historical patterns of exclusion of under-represented groups. In ACM selection practices, another discursive tension is exposed as the inherent privilege in the process is marked, challenging the ideal of medicine as a meritocracy. Applicants' representations of self in the 'performance' of interviewing demonstrate implicit recognition of the power inherent in the act of selection and are manifested in the use of explicit strategies to 'fit in'. How can this critical discourse analysis inform improved inclusiveness in student selection? Policymakers addressing diversity and equity issues in medical school admissions should explicitly recognise the power dynamics at play between the profession and marginalised groups. For greater inclusion and to avoid one

  6. From Business Value Model to Coordination Process Model

    Science.gov (United States)

    Fatemi, Hassan; van Sinderen, Marten; Wieringa, Roel

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary for a good e-business design, but these activities have different goals and use different concepts. Nevertheless, the resulting models should be consistent with each other because they refer to the same system from different perspectives. Hence, checking the consistency between these models or producing one based on the other would be of high value. In this paper we discuss the issue of achieving consistency in multi-level e-business design and give guidelines to produce consistent coordination process models from business value models in a stepwise manner.

  7. Model Selection and Hypothesis Testing for Large-Scale Network Models with Overlapping Groups

    Directory of Open Access Journals (Sweden)

    Tiago P. Peixoto

    2015-03-01

    Full Text Available The effort to understand network systems in increasing detail has resulted in a diversity of methods designed to extract their large-scale structure from data. Unfortunately, many of these methods yield diverging descriptions of the same network, making both the comparison and understanding of their results a difficult challenge. A possible solution to this outstanding issue is to shift the focus away from ad hoc methods and move towards more principled approaches based on statistical inference of generative models. As a result, we face instead the more well-defined task of selecting between competing generative processes, which can be done under a unified probabilistic framework. Here, we consider the comparison between a variety of generative models including features such as degree correction, where nodes with arbitrary degrees can belong to the same group, and community overlap, where nodes are allowed to belong to more than one group. Because such model variants possess an increasing number of parameters, they become prone to overfitting. In this work, we present a method of model selection based on the minimum description length criterion and posterior odds ratios that is capable of fully accounting for the increased degrees of freedom of the larger models and selects the best one according to the statistical evidence available in the data. In applying this method to many empirical unweighted networks from different fields, we observe that community overlap is very often not supported by statistical evidence and is selected as a better model only for a minority of them. On the other hand, we find that degree correction tends to be almost universally favored by the available data, implying that intrinsic node properties (as opposed to group properties) are often an essential ingredient of network formation.
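    A minimal numeric illustration of the description-length idea, with the simplifying assumption that the planted two-group partition is known (the paper's method also encodes and infers the partition itself): the two-rate block model is selected over a single-rate Erdős-Rényi model only when its likelihood gain outweighs the extra parameter cost. All sizes and probabilities below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 60
    labels = np.repeat([0, 1], n // 2)
    # planted two-group network: dense within groups, sparse between
    prob = np.where(labels[:, None] == labels[None, :], 0.30, 0.05)
    upper = np.triu(rng.random((n, n)) < prob, 1)
    A = upper | upper.T                     # undirected adjacency matrix

    def bernoulli_dl(m, M):
        """Description length (nats): negative max log-likelihood of m edges
        among M possible pairs, plus ~0.5*log(M) to encode the rate parameter."""
        p = m / M
        ll = m * np.log(p) + (M - m) * np.log(1 - p) if 0 < p < 1 else 0.0
        return -ll + 0.5 * np.log(M)

    iu = np.triu_indices(n, 1)
    edges = A[iu].astype(int)
    same = labels[iu[0]] == labels[iu[1]]

    dl_er = bernoulli_dl(edges.sum(), edges.size)                 # one global rate
    dl_sbm = (bernoulli_dl(edges[same].sum(), same.sum())         # within-group rate
              + bernoulli_dl(edges[~same].sum(), int((~same).sum())))  # between-group rate
    print(dl_er, dl_sbm)                   # the model with the smaller DL is selected
    ```

    On homogeneous data the two block rates would buy almost no likelihood, and the extra parameter terms would make the simpler model win, which is exactly the overfitting guard described in the abstract.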

  8. Numerical modeling of atmospheric washout processes

    International Nuclear Information System (INIS)

    Bayer, D.; Beheng, K.D.; Herbert, F.

    1987-01-01

    For the washout of particles from the atmosphere by clouds and rain, one has to distinguish between processes at work in the first phase of cloud development, when condensation nuclei build up in saturated air (Nucleation Aerosol Scavenging, NAS), and processes at work during the subsequent cloud development. In the second case, particles are taken up by cloud droplets or by falling raindrops via collisions (Collision Aerosol Scavenging, CAS). The physics of both processes is described. For the CAS process a numerical model is presented. The report contains a documentation of the mathematical equations and the computer programs (FORTRAN). (KW) [de

  9. How Different Medical School Selection Processes Call upon Different Personality Characteristics

    NARCIS (Netherlands)

    Schripsema, Nienke R; van Trigt, Anke M; van der Wal, Martha A; Cohen-Schotanus, Janke

    2016-01-01

    BACKGROUND: Research indicates that certain personality traits relate to performance in the medical profession. Yet, personality testing during selection seems ineffective. In this study, we examine the extent to which different medical school selection processes call upon desirable personality

  10. A guide to Bayesian model selection for ecologists

    Science.gov (United States)

    Hooten, Mevin B.; Hobbs, N.T.

    2015-01-01

    The steady upward trend in the use of model selection and Bayesian methods in ecological research has made it clear that both approaches to inference are important for modern analysis of models and data. However, in teaching Bayesian methods and in working with our research colleagues, we have noticed a general dissatisfaction with the available literature on Bayesian model selection and multimodel inference. Students and researchers new to Bayesian methods quickly find that the published advice on model selection is often preferential in its treatment of options for analysis, frequently advocating one particular method above others. The recent appearance of many articles and textbooks on Bayesian modeling has provided welcome background on relevant approaches to model selection in the Bayesian framework, but most of these are either very narrowly focused in scope or inaccessible to ecologists. Moreover, the methodological details of Bayesian model selection approaches are spread thinly throughout the literature, appearing in journals from many different fields. Our aim with this guide is to condense the large body of literature on Bayesian approaches to model selection and multimodel inference and present it specifically for quantitative ecologists as neutrally as possible. We also bring to light a few important and fundamental concepts relating directly to model selection that seem to have gone unnoticed in the ecological literature. Throughout, we provide only a minimal discussion of philosophy, preferring instead to examine the breadth of approaches as well as their practical advantages and disadvantages. This guide serves as a reference for ecologists using Bayesian methods, so that they can better understand their options and can make an informed choice that is best aligned with their goals for inference.

  11. NERSC-6 Workload Analysis and Benchmark Selection Process

    Energy Technology Data Exchange (ETDEWEB)

    Antypas, Katie; Shalf, John; Wasserman, Harvey

    2008-08-29

    This report describes efforts carried out during early 2008 to determine some of the science drivers for the"NERSC-6" next-generation high-performance computing system acquisition. Although the starting point was existing Greenbooks from DOE and the NERSC User Group, the main contribution of this work is an analysis of the current NERSC computational workload combined with requirements information elicited from key users and other scientists about expected needs in the 2009-2011 timeframe. The NERSC workload is described in terms of science areas, computer codes supporting research within those areas, and description of key algorithms that comprise the codes. This work was carried out in large part to help select a small set of benchmark programs that accurately capture the science and algorithmic characteristics of the workload. The report concludes with a description of the codes selected and some preliminary performance data for them on several important systems.

  12. Selection of radioactive waste disposal site considering natural processes

    International Nuclear Information System (INIS)

    Nakamura, H.

    1991-01-01

    To dispose of radioactive waste, it is necessary to consider the transfer of material in the natural environment. The points of consideration are: 1) a long residence time of water; 2) independence of the biosphere from the compartment containing the disposal site in the natural hydrologic cycle; 3) dilution with naturally occurring inactive isotopes or elements of the same group. Isotope dilution for 129I and 14C can be expected by proper selection of the site. 241Am and 239Pu will be homogenized into soil or sediment together with insoluble elements such as iron and aluminium. For 237Np and 99Tc, the anionic condition is important for the selection. From the point of view of the hydrologic cycle, an anoxic dead-water zone, avoiding areas beneath mountains, is preferable for the disposal site. (author)

  13. Integration of Fast Predictive Model and SLM Process Development Chamber, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This STTR project seeks to develop a fast predictive model for selective laser melting (SLM) processes and then integrate that model with an SLM chamber that allows...

  14. Gender Inequality and Emigration: Push factor or Selection process?

    OpenAIRE

    Baudassé, Thierry; Bazillier, Rémi

    2012-01-01

    Our objective in this research is to provide empirical evidence relating to the linkages between gender equality and international emigration. Two theoretical hypotheses can be made for the purpose of analyzing such linkages. The first is that gender inequality in origin countries could be a push factor for women. The second is that gender inequality may create a "gender bias" in the selection of migrants within a household or a community. An improvement of gender equality would then inc...

  15. Improving the Air Force Squadron Command Selection Process

    Science.gov (United States)

    2017-04-19

    units by utilizing predictive analytics and the Person-Environment Fit Theory to aid in selection. Introduction Organizational... most suitable commander candidates with possible units by utilizing predictive analytics and the Person-Environment Fit Theory to better inform the... positive work atmosphere. The commander must keep the long view in mind, not just the short-term, first-order effects of a decision. Leading people

  16. Pavement maintenance optimization model using Markov Decision Processes

    Science.gov (United States)

    Mandiartha, P.; Duffield, C. F.; Razelan, I. S. b. M.; Ismail, A. b. H.

    2017-09-01

    This paper presents an optimization model for the selection of pavement maintenance interventions using the theory of Markov Decision Processes (MDP). Some particular characteristics of the MDP developed in this paper distinguish it from other similar studies or optimization models intended for pavement maintenance policy development. These unique characteristics include the direct inclusion of constraints in the formulation of the MDP, the use of an average-cost MDP method, and a policy development process based on the dual linear programming solution. The limited information and discussion available on these matters, in terms of stochastic optimization models for road network management, motivates this study. This paper uses a data set acquired from the road authorities of the state of Victoria, Australia, to test the model and recommends steps in the computation of the MDP-based stochastic optimization model, leading to the development of an optimum pavement maintenance policy.
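    The paper solves an average-cost MDP via the dual linear program; as a simpler stand-in, the sketch below runs discounted value iteration on a toy three-state pavement chain. All states, costs, transition probabilities, and the discount factor are invented for illustration.

    ```python
    import numpy as np

    # three pavement conditions and three interventions (all numbers illustrative)
    states = ["good", "fair", "poor"]
    actions = ["do-nothing", "maintain", "rebuild"]

    # cost[s, a]: combined agency + road-user cost per period
    cost = np.array([[ 0.0, 10.0, 50.0],
                     [ 5.0, 12.0, 50.0],
                     [20.0, 15.0, 50.0]])

    # P[a, s, s']: transition probabilities under each action
    P = np.array([
        [[0.8, 0.2, 0.0], [0.0, 0.7, 0.3], [0.0, 0.0, 1.0]],  # do-nothing: deterioration
        [[1.0, 0.0, 0.0], [0.6, 0.4, 0.0], [0.0, 0.5, 0.5]],  # maintain: partial recovery
        [[1.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 0.0, 0.0]],  # rebuild: back to good
    ])

    gamma = 0.9
    V = np.zeros(3)
    for _ in range(500):                    # value iteration on discounted cost
        Q = cost + gamma * np.einsum('ast,t->sa', P, V)
        V_new = Q.min(axis=1)
        if np.max(np.abs(V_new - V)) < 1e-10:
            V = V_new
            break
        V = V_new
    policy = Q.argmin(axis=1)
    print(V, [actions[a] for a in policy])
    ```

    The average-cost formulation with budget constraints used in the paper replaces this fixed-point iteration with a linear program over state-action frequencies, but the policy it extracts has the same shape: one recommended intervention per pavement condition.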

  17. Improving permafrost distribution modelling using feature selection algorithms

    Science.gov (United States)

    Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail

    2016-04-01

    overall operation. It operates by constructing a large collection of decorrelated classification trees, and then predicts the permafrost occurrence through a majority vote. With the so-called out-of-bag (OOB) error estimate, the classification of permafrost data can be validated and the contribution of each predictor assessed. The performance of the compared permafrost distribution models (computed on independent testing sets) increased with the application of FS algorithms to the original dataset, as irrelevant or redundant variables were removed. As a consequence, the process provided faster and more cost-effective predictors and a better understanding of the underlying structures residing in the permafrost data. Our work demonstrates the usefulness of a feature selection step prior to applying a machine learning algorithm. In fact, permafrost predictors could be ranked not only on their heuristic and subjective importance (expert knowledge), but also on their statistical relevance in relation to the permafrost distribution.
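    The record describes a random forest with out-of-bag validation and predictor ranking; a minimal scikit-learn version on synthetic data looks like the sketch below. The two informative "terrain-like" predictors, the three noise columns, and all constants are hypothetical, standing in for the real topo-climatic variables.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(42)
    n = 600
    # hypothetical predictors: two informative terrain variables, three noise ones
    x_informative = rng.normal(size=(n, 2))
    x_noise = rng.normal(size=(n, 3))
    X = np.hstack([x_informative, x_noise])
    # synthetic "permafrost present" label driven by the first two columns only
    y = (x_informative[:, 0] + 0.8 * x_informative[:, 1]
         + 0.3 * rng.normal(size=n) > 0).astype(int)

    rf = RandomForestClassifier(n_estimators=300, oob_score=True, random_state=0)
    rf.fit(X, y)
    print("OOB accuracy:", rf.oob_score_)           # validation without a held-out set
    print("importances:", rf.feature_importances_)  # statistical predictor ranking
    ```

    Dropping the columns with near-zero importance before refitting is the feature selection step the abstract argues for: the noise predictors are removed on statistical rather than purely heuristic grounds.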

  18. Modeling Dynamic Food Choice Processes to Understand Dietary Intervention Effects.

    Science.gov (United States)

    Marcum, Christopher Steven; Goldring, Megan R; McBride, Colleen M; Persky, Susan

    2018-02-17

    Meal construction is largely governed by nonconscious and habit-based processes that can be represented as a collection of individual, micro-level food choices that eventually give rise to a final plate. Despite this, dietary behavior intervention research rarely captures these micro-level food choice processes, instead measuring outcomes at aggregated levels. This is due in part to a dearth of analytic techniques to model these dynamic time-series events. The current article addresses this limitation by applying a generalization of the relational event framework to model micro-level food choice behavior following an educational intervention. Relational event modeling was used to model the food choices that 221 mothers made for their child following receipt of an information-based intervention. Participants were randomized to receive either (a) control information; (b) childhood obesity risk information; or (c) childhood obesity risk information plus a personalized family history-based risk estimate for their child. Participants then made food choices for their child in a virtual reality-based food buffet simulation. Micro-level aspects of the built environment, such as the ordering of each food in the buffet, were influential. Other dynamic processes such as choice inertia also influenced food selection. Among participants receiving the strongest intervention condition, choice inertia decreased and the overall rate of food selection increased. Modeling food selection processes can elucidate the points at which interventions exert their influence. Researchers can leverage these findings to gain insight into nonconscious and uncontrollable aspects of food selection that influence dietary outcomes, which can ultimately improve the design of dietary interventions.

  19. The Use of Evolution in a Central Action Selection Model

    Directory of Open Access Journals (Sweden)

    F. Montes-Gonzalez

    2007-01-01

    Full Text Available The use of effective central selection provides flexibility in design by offering modularity and extensibility. In earlier papers we have focused on the development of a simple centralized selection mechanism. Our current goal is to integrate evolutionary methods in the design of non-sequential behaviours and the tuning of specific parameters of the selection model. The foraging behaviour of an animal robot (animat) has been modelled in order to integrate the sensory information from the robot to perform selection that is nearly optimized by the use of genetic algorithms. In this paper we present how selection through optimization finally arranges the pattern of presented behaviours for the foraging task. Hence, the execution of specific parts in a behavioural pattern may be ruled out by the tuning of these parameters. Furthermore, the intensive use of colour segmentation from a colour camera for locating a cylinder places a burden on the calculations carried out by the genetic algorithm.

  20. Understanding Managers Decision Making Process for Tools Selection in the Core Front End of Innovation

    DEFF Research Database (Denmark)

    Appio, Francesco P.; Achiche, Sofiane; McAloone, Tim C.

    2011-01-01

    New product development (NPD) describes the process of bringing a new product or service to the market. The Fuzzy Front End (FFE) of Innovation is the term describing the activities happening before the product development phase of NPD. In the FFE of innovation, several tools are used to facilitate and optimise the activities. To select these tools, managers of the product development team have to use several premises to decide upon which tool is more appropriate for which activity. This paper proposes an approach to model the decision making process of the managers. The results underline the dimensions influencing the decision process before a certain tool is chosen, and how those tools impact the performance of cost, time and efficiency. In order to achieve this, five companies participated in the data collection. Interesting trends and differences emerge from the analysis of the data in hand, and several

  1. Selection of a green manufacturing process based on CAD features

    OpenAIRE

    Gaha, Raoudha; Yannou, Bernard; Benamara, Abdelmajid

    2016-01-01

    International audience; Environmentally conscious manufacturing process (ECMP) has become an obligation to the environment and to society itself, enforced primarily by governmental regulations and customer perspectives on environmental issues. ECMP involves integrating environmental thinking into new product development. This is especially true in the computer-aided design (CAD) phase, which is the last phase in the design process. At this stage, more than 80 % of choices are made. Feature ...

  2. Modeling nonhomogeneous Markov processes via time transformation.

    Science.gov (United States)

    Hubbard, R A; Inoue, L Y T; Fann, J R

    2008-09-01

    Longitudinal studies are a powerful tool for characterizing the course of chronic disease. These studies are usually carried out with subjects observed at periodic visits giving rise to panel data. Under this observation scheme the exact times of disease state transitions and sequence of disease states visited are unknown and Markov process models are often used to describe disease progression. Most applications of Markov process models rely on the assumption of time homogeneity, that is, that the transition rates are constant over time. This assumption is not satisfied when transition rates depend on time from the process origin. However, limited statistical tools are available for dealing with nonhomogeneity. We propose models in which the time scale of a nonhomogeneous Markov process is transformed to an operational time scale on which the process is homogeneous. We develop a method for jointly estimating the time transformation and the transition intensity matrix for the time transformed homogeneous process. We assess maximum likelihood estimation using the Fisher scoring algorithm via simulation studies and compare performance of our method to homogeneous and piecewise homogeneous models. We apply our methodology to a study of delirium progression in a cohort of stem cell transplantation recipients and show that our method identifies temporal trends in delirium incidence and recovery.
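    The time-transformation idea can be illustrated with simulated event times: if the transition intensity grows as a power of calendar time, mapping arrival times through the operational scale u = t^a makes the process homogeneous again. The power transform and all constants below are illustrative only; the paper's method estimates the transformation jointly with the transition intensity matrix by Fisher scoring, which is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    a = 2.0   # intensity lambda(t) = a * t**(a - 1): transition rate grows with time

    # unit-rate arrival times on the operational time scale u
    u = np.cumsum(rng.exponential(1.0, size=20000))
    # inverting the transformation u = t**a gives the calendar-time arrivals
    # of the nonhomogeneous process
    t = u ** (1.0 / a)

    calendar_gaps = np.diff(t)          # shrink over time as the rate increases
    operational_gaps = np.diff(t ** a)  # i.i.d. Exp(1): homogeneous again
    print(operational_gaps.mean(), operational_gaps.var())
    ```

    On the calendar scale the gaps are clearly non-stationary, while on the operational scale they behave like a constant-rate process, which is exactly the property the fitted transformation is designed to achieve.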

  3. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  4. Heat transfer modelling and stability analysis of selective laser melting

    International Nuclear Information System (INIS)

    Gusarov, A.V.; Yadroitsev, I.; Bertrand, Ph.; Smurov, I.

    2007-01-01

    The process of direct manufacturing by selective laser melting basically consists of laser beam scanning over a thin powder layer deposited on a dense substrate. Complete remelting of the powder in the scanned zone and its good adhesion to the substrate ensure obtaining functional parts with improved mechanical properties. Experiments with single-line scanning indicate that an interval of scanning velocities exists where the remelted tracks are uniform. The tracks become broken if the scanning velocity is outside this interval. This is extremely undesirable and referred to as the 'balling' effect. A numerical model of coupled radiation and heat transfer is proposed to analyse the observed instability. The 'balling' effect at high scanning velocities (above ∼20 cm/s for the present conditions) can be explained by the Plateau-Rayleigh capillary instability of the melt pool. Two factors stabilize the process with decreasing scanning velocity: reducing the length-to-width ratio of the melt pool and increasing the width of its contact with the substrate
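    The stability argument can be made concrete with the textbook Plateau-Rayleigh criterion: a liquid cylinder is unstable to axial perturbations whose wavelength exceeds its circumference. The check below treats the melt pool as a cylinder of diameter comparable to the track width; all dimensions are invented for illustration, and the simplification ignores the stabilizing contact with the substrate that the abstract identifies as the second factor.

    ```python
    import numpy as np

    def is_unstable(pool_length, pool_width):
        """Plateau-Rayleigh criterion for a liquid cylinder: perturbations grow
        when their wavelength exceeds the circumference (lambda > pi * d).
        Treating the melt pool as a cylinder of diameter ~ pool_width, a track
        can break up ('ball') once its molten length exceeds pi * width."""
        return pool_length > np.pi * pool_width

    # illustrative melt-pool dimensions (m): faster scans stretch and narrow the pool
    for v, width, length in [(0.05, 120e-6, 300e-6),
                             (0.20,  80e-6, 500e-6),
                             (0.40,  60e-6, 700e-6)]:
        verdict = "balling expected" if is_unstable(length, width) else "stable"
        print(f"v = {v:.2f} m/s: {verdict}")
    ```

    Slower scanning shortens the pool relative to its width, dropping the length-to-width ratio below the instability threshold, which is the first stabilizing factor named in the abstract.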

  5. Determination of Properties of Selected Fresh and Processed Medicinal Plants

    Directory of Open Access Journals (Sweden)

    Shirley G. Cabrera

    2015-11-01

    Full Text Available The study aimed to determine the chemical properties, bioactive compounds, antioxidant activity and toxicity level of fresh and processed medicinal plants such as corn (Zea mays) silk, pancit-pancitan (Peperomia pellucida) leaves, pandan (Pandanus amaryllifolius) leaves, and commercially available tea. The toxicity level of the samples was measured using the Brine Shrimp Lethality Assay (BSLA). Statistical analysis was done using the Statistical Package for the Social Sciences (SPSS). Results showed a significant difference in chemical properties between fresh and processed corn silk, except in crude fiber content. Significant differences in the proximate composition of fresh and processed medicinal plants, specifically in % moisture, % crude protein and % total carbohydrates, were also observed. In addition, there is a significant difference in bioactive compound contents, such as total flavonoids and total phenolics, between fresh and processed corn silk, except in total vitamin E (TVE) content. Pandan and pancit-pancitan showed significant differences in all bioactive compounds except total antioxidant content (TAC). Fresh pancit-pancitan has the highest total phenolics content (TPC) and TAC, while fresh and processed corn silk have the lowest TAC and TVE content, respectively. Furthermore, the results of the BSLA for the three medicinal plants and commercially available tea extract showed a significant difference in toxicity level after 24 hours of exposure. The percentage mortality increased with increasing exposure time for the three medicinal plants and the tea extract. The results of the study can serve as baseline data for further processing and commercialization of these medicinal plants.

  6. Training Self-Regulated Learning Skills with Video Modeling Examples: Do Task-Selection Skills Transfer?

    Science.gov (United States)

    Raaijmakers, Steven F.; Baars, Martine; Schaap, Lydia; Paas, Fred; van Merriënboer, Jeroen; van Gog, Tamara

    2018-01-01

    Self-assessment and task-selection skills are crucial in self-regulated learning situations in which students can choose their own tasks. Prior research suggested that training with video modeling examples, in which another person (the model) demonstrates and explains the cyclical process of problem-solving task performance, self-assessment, and…

  7. Causally nonseparable processes admitting a causal model

    International Nuclear Information System (INIS)

    Feix, Adrien; Araújo, Mateus; Brukner, Caslav

    2016-01-01

    A recent framework of quantum theory with no global causal order predicts the existence of ‘causally nonseparable’ processes. Some of these processes produce correlations incompatible with any causal order (they violate so-called ‘causal inequalities’, analogous to Bell inequalities) while others do not (they admit a ‘causal model’, analogous to a local model). Here we show for the first time that bipartite causally nonseparable processes with a causal model exist, and give evidence that they have no clear physical interpretation. We also provide an algorithm to generate processes of this kind and show that they have nonzero measure in the set of all processes. We demonstrate the existence of processes which stop violating causal inequalities but remain causally nonseparable when mixed with a certain amount of ‘white noise’. This is reminiscent of the behavior of Werner states in the context of entanglement and nonlocality. Finally, we provide numerical evidence for the existence of causally nonseparable processes which have a causal model even when extended with an entangled state shared among the parties. (paper)

  8. Statistical model selection with “Big Data”

    Directory of Open Access Journals (Sweden)

    Jurgen A. Doornik

    2015-12-01

    Big Data offer potential benefits for statistical modelling, but confront problems including an excess of false positives, mistaking correlations for causes, ignoring sampling biases, and selecting by inappropriate methods. We consider the many important requirements when searching for a data-based relationship using Big Data, and the possible role of Autometrics in that context. Paramount considerations include embedding relationships in general initial models, possibly restricting the number of variables to be selected over by non-statistical criteria (the formulation problem); using good quality data on all variables, analyzed with tight significance levels by a powerful selection procedure, while retaining available theory insights (the selection problem); and testing for relationships being well specified and invariant to shifts in explanatory variables (the evaluation problem), using a viable approach that resolves the computational problem of immense numbers of possible models.
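    The idea of selecting with tight significance levels from a general initial model can be illustrated with a toy general-to-specific search. The data-generating process, threshold, and one-variable-at-a-time elimination rule below are illustrative assumptions, not the Autometrics algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 500, 20
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(size=n)  # only x0 and x3 are relevant

def ols_t(X, y):
    """OLS coefficients and t-statistics."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(s2 * np.diag(XtX_inv))
    return beta, beta / se

# General-to-specific search: start from the general model with all candidate
# regressors and drop the least significant one until every survivor passes a
# tight threshold (|t| > 3, roughly a 0.3% significance level).
keep = list(range(p))
while keep:
    _, t = ols_t(X[:, keep], y)
    worst = int(np.argmin(np.abs(t)))
    if abs(t[worst]) > 3.0:
        break
    keep.pop(worst)

print(sorted(keep))  # the relevant regressors 0 and 3 should survive
```

    With a tight significance level, the expected number of falsely retained irrelevant regressors stays small even when many candidates are searched over, which is the point the abstract makes about controlling false positives.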

  9. Stochastic differential equation model to Prendiville processes

    International Nuclear Information System (INIS)

    Granita; Bahar, Arifah

    2015-01-01

    The Prendiville process is a variation of the logistic model which assumes a linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in a finite interval. The continuous time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work started with the forward Kolmogorov equation of the continuous time Markov chain of the Prendiville process, which was then formulated as a central-difference approximation. The approximation was then used in the Fokker-Planck equation to obtain the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation, from which the mean and variance functions of the Prendiville process follow easily.
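    The diffusion approximation described here can be sketched numerically. Assuming the standard logistic-CTMC rates (birth rate λ(N − x), death rate μx, giving the linearly decreasing growth rate), the approximating SDE is dX = [λ(N − X) − μX] dt + √(λ(N − X) + μX) dW; the parameter values and Euler-Maruyama scheme below are illustrative, not the paper's explicit solution.

```python
import numpy as np

# Illustrative rates for a Prendiville-type process on [0, N]:
# birth lambda*(N - x), death mu*x, so growth rate decreases linearly in x.
lam, mu, N = 1.0, 1.0, 100.0
drift = lambda x: lam * (N - x) - mu * x
diff = lambda x: np.sqrt(lam * (N - x) + mu * x)

def simulate(x0=10.0, T=10.0, dt=0.01, n_paths=2000, seed=0):
    """Euler-Maruyama simulation of the approximating SDE."""
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, x0)
    for _ in range(int(T / dt)):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        x = x + drift(x) * dt + diff(x) * dW
        x = np.clip(x, 0.0, N)  # keep paths inside the finite state interval
    return x

final = simulate()
# The long-run mean approaches lam*N/(lam + mu) = 50 for these parameters.
print(final.mean(), final.var())
```

    The sample mean and variance of the simulated paths can then be checked against the explicit mean and variance functions the paper derives.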

  10. Stochastic differential equation model to Prendiville processes

    Energy Technology Data Exchange (ETDEWEB)

    Granita, E-mail: granitafc@gmail.com [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); Bahar, Arifah [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); UTM Center for Industrial & Applied Mathematics (UTM-CIAM) (Malaysia)

    2015-10-22

    The Prendiville process is a variation of the logistic model which assumes a linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in a finite interval. The continuous time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work started with the forward Kolmogorov equation of the continuous time Markov chain of the Prendiville process, which was then formulated as a central-difference approximation. The approximation was then used in the Fokker-Planck equation to obtain the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation, from which the mean and variance functions of the Prendiville process follow easily.

  11. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before the buying behavior and the perceived performance after purchase. An experiment is designed to measure expectation disconfirmation effects, and the collected data are used to estimate overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. The model has applications in business marketing for managing relationship satisfaction.

  12. Chain binomial models and binomial autoregressive processes.

    Science.gov (United States)

    Weiss, Christian H; Pollett, Philip K

    2012-09-01

    We establish a connection between a class of chain-binomial models of use in ecology and epidemiology and binomial autoregressive (AR) processes. New results are obtained for the latter, including expressions for the lag-conditional distribution and related quantities. We focus on two types of chain-binomial model, extinction-colonization and colonization-extinction models, and present two approaches to parameter estimation. The asymptotic distributions of the resulting estimators are studied, as well as their finite-sample performance, and we give an application to real data. A connection is made with standard AR models, which also has implications for parameter estimation. © 2011, The International Biometric Society.
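    The extinction-colonization chain-binomial model maps onto a binomial AR(1) process that is easy to simulate. The parametrization below (survival probability α for occupied patches, colonization probability β for empty ones) is the standard textbook form of the binomial AR(1); the parameter values are illustrative, not taken from the paper.

```python
import numpy as np

# Binomial AR(1): X_t = alpha∘X_{t-1} + beta∘(N - X_{t-1}), where "∘" denotes
# binomial thinning. The stationary marginal is Binomial(N, pi) with
# pi = beta / (1 - alpha + beta).
N, alpha, beta = 20, 0.6, 0.2
rng = np.random.default_rng(1)

def step(x):
    survivors = rng.binomial(x, alpha)       # occupied patches staying occupied
    colonisers = rng.binomial(N - x, beta)   # empty patches becoming occupied
    return survivors + colonisers

T = 100_000
x, total = N // 2, 0
for _ in range(T):
    x = step(x)
    total += x

pi = beta / (1 - alpha + beta)
print(total / T, N * pi)  # empirical mean vs. theoretical stationary mean (20/3)
```

    A long simulated trajectory of this kind is also a natural test bed for the parameter-estimation approaches the paper compares.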

  13. Mathematical Modelling of Coal Gasification Processes

    Science.gov (United States)

    Sundararajan, T.; Raghavan, V.; Ajilkumar, A.; Vijay Kumar, K.

    2017-07-01

    Coal is by far the most commonly employed fuel for electrical power generation around the world. While combustion could be the route for coal utilization for high grade coals, gasification becomes the preferred process for low grade coals having a higher composition of volatiles or ash. Indian coals suffer from high ash content (nearly 50% by weight in some cases). Instead of transporting such high ash coals, it is more energy efficient to gasify the coal and transport the product syngas. Integrated Gasification Combined Cycle (IGCC) plants and underground gasification of coal have become attractive technologies for the best utilization of high ash coals. Gasification could be achieved in fixed beds, fluidized beds and entrained beds; faster rates of gasification are possible in fluidized beds and entrained flow systems, because of the small particle sizes and higher gas velocities. The media employed for gasification could involve air/oxygen and steam. Use of oxygen will yield relatively higher calorific value syngas because of the absence of nitrogen. Sequestration of the carbon dioxide after the combustion of the syngas is also easier if oxygen is used for gasification. Addition of steam can increase hydrogen yield in the syngas and thereby increase the calorific value also. Gasification in the presence of suitable catalysts can increase the composition of methane in the product gas. Several competing heterogeneous and homogeneous reactions occur during coal gasification: reactions of the solid fuel with oxygen, steam and carbon dioxide constitute the major heterogeneous reaction pathways, while interactions between carbon monoxide, oxygen, hydrogen, water vapour, methane and carbon dioxide result in several simultaneous gas-phase (homogeneous) reactions. The overall product composition of the coal gasification process depends on the input reactant composition, particle size and type of gasifier, and pressure and temperature of the gasifier. The use of catalysts can also selectively change the product composition. At IIT Madras, over the last one decade, both

  14. The Choice Is Yours: The Role of Cognitive Processes for IT-Supported Idea Selection

    DEFF Research Database (Denmark)

    Seeber, Isabella; Weber, Barbara; Maier, Ronald

    2017-01-01

    The selection of good ideas out of hundreds or even thousands has proven to be the next big challenge for organizations that conduct open idea contests for innovation. Cognitive load and attention loss hinder crowds to effectively run their idea selection process. Facilitation techniques...... of selection direction and selection type. A laboratory experiment using eye-tracking will investigate variations in selection type and selection direction. Moreover, the experiment will test the effects on the decision-making process and the number and quality of ideas in a filtered set. Findings will provide...

  15. Modeling and Advanced Control for Sustainable Process ...

    Science.gov (United States)

    This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-inspired, multi-agent-based method. The sustainability and performance assessment of process operating points is carried out using the U.S. E.P.A.’s GREENSCOPE assessment tool that provides scores for the selected economic, material management, environmental and energy indicators. The indicator results supply information on whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous bioethanol fermentation process whose dynamics are characterized by steady-state multiplicity and oscillatory behavior. This book chapter contribution demonstrates the application of novel process control strategies for sustainability by increasing material management, energy efficiency, and pollution prevention, as needed for SHC Sustainable Uses of Wastes and Materials Management.

  16. Sequentially solution-processed, nanostructured polymer photovoltaics using selective solvents

    KAUST Repository

    Kim, Do Hwan

    2014-01-01

    We demonstrate high-performance sequentially solution-processed organic photovoltaics (OPVs) with a power conversion efficiency (PCE) of 5% for blend films using a donor polymer based on the isoindigo-bithiophene repeat unit (PII2T-C10C8) and a fullerene derivative [6,6]-phenyl-C[71]-butyric acid methyl ester (PC71BM). This has been accomplished by systematically controlling the swelling and intermixing processes of the layer with various processing solvents during deposition of the fullerene. We find that among the solvents used for fullerene deposition that primarily swell but do not re-dissolve the polymer underlayer, there were significant microstructural differences between chlorobenzene and o-dichlorobenzene solvents (CB and ODCB, respectively). Specifically, we show that the polymer crystallite orientation distribution in films where ODCB was used to cast the fullerene is broad. This indicates that out-of-plane charge transport through a tortuous transport network is relatively efficient due to a large density of inter-grain connections. In contrast, using CB results in primarily edge-on oriented polymer crystallites, which leads to diminished out-of-plane charge transport. We correlate these microstructural differences with photocurrent measurements, which clearly show that casting the fullerene out of ODCB leads to significantly enhanced power conversion efficiencies. Thus, we believe that tuning the processing solvents used to cast the electron acceptor in sequentially-processed devices is a viable way to controllably tune the blend film microstructure. © 2014 The Royal Society of Chemistry.

  17. Evaluation and comparison of alternative fleet-level selective maintenance models

    International Nuclear Information System (INIS)

    Schneider, Kellie; Richard Cassady, C.

    2015-01-01

    Fleet-level selective maintenance refers to the process of identifying the subset of maintenance actions to perform on a fleet of repairable systems when the maintenance resources allocated to the fleet are insufficient for performing all desirable maintenance actions. The original fleet-level selective maintenance model is designed to maximize the probability that all missions in a future set are completed successfully. We extend this model in several ways. First, we consider a cost-based optimization model and show that a special case of this model maximizes the expected value of the number of successful missions in the future set. We also consider the situation in which one or more of the future missions may be canceled. These models and the original fleet-level selective maintenance optimization models are nonlinear. Therefore, we also consider an alternative model in which the objective function can be linearized. We show that the alternative model is a good approximation to the other models. - Highlights: • Investigate nonlinear fleet-level selective maintenance optimization models. • A cost based model is used to maximize the expected number of successful missions. • Another model is allowed to cancel missions if reliability is sufficiently low. • An alternative model has an objective function that can be linearized. • We show that the alternative model is a good approximation to the other models

  18. Probing the Accretion Processes in Soft X-Ray Selected Polars

    Directory of Open Access Journals (Sweden)

    I. Traulsen

    2015-02-01

    High-energy data of accreting white dwarfs give access to the regime of the primary accretion-induced energy release and the different proposed accretion scenarios. We perform XMM-Newton observations of polars selected due to their ROSAT hardness ratios close to -1.0 and model the emission processes in accretion column and accretion region. Our models consider the multi-temperature structure of the emission regions and are mainly determined by mass-flow density, magnetic field strength, and white-dwarf mass. To describe the full spectral energy distribution from infrared to X-rays in a physically consistent way, we include the stellar contributions and establish composite models, which will also be of relevance for future X-ray missions. We confirm the X-ray soft nature of three polars.

  19. A neurolinguistic model of grammatical construction processing.

    Science.gov (United States)

    Dominey, Peter Ford; Hoen, Michel; Inui, Toshio

    2006-12-01

    One of the functions of everyday human language is to communicate meaning. Thus, when one hears or reads the sentence, "John gave a book to Mary," some aspect of an event concerning the transfer of possession of a book from John to Mary is (hopefully) transmitted. One theoretical approach to language referred to as construction grammar emphasizes this link between sentence structure and meaning in the form of grammatical constructions. The objective of the current research is to (1) outline a functional description of grammatical construction processing based on principles of psycholinguistics, (2) develop a model of how these functions can be implemented in human neurophysiology, and then (3) demonstrate the feasibility of the resulting model in processing languages of typologically diverse natures, that is, English, French, and Japanese. In this context, particular interest will be directed toward the processing of novel compositional structure of relative phrases. The simulation results are discussed in the context of recent neurophysiological studies of language processing.

  20. Quality of margarine: fats selection and processing parameters.

    Science.gov (United States)

    Miskandar, Mat Sahri; Man, Yaakob Che; Yusoff, Mohd Suria Affandi; Rahman, Russly Abd

    2005-01-01

    Optimum processing conditions on palm oil-based formulations are required to produce the desired quality margarine. As oils and fats contribute to the overall property of the margarine, this paper will review the importance of beta'-tending oils and fats in margarine formulation, the effects of the processing parameters -- emulsion temperature, flow-rate, product temperature and pin-worker speed -- on the palm oil margarines produced, and their subsequent behaviour in storage. Palm oil, which contributes the beta' crystal polymorph and is the best alternative to hydrogenated liquid fats, and the processing conditions can affect the margarine consistency by influencing the solid fat content (SFC) and the types of crystal polymorph formed during production as well as in storage. Palm oil, or hydrogenated palm oil and olein, in mixture with beta-tending oils, can veer the product toward the beta' crystal form. However, merely having beta'-tending oils is not sufficient, as the processing conditions are also important. The emulsion temperature had no significant effect on the consistency and polymorphic changes of the product during storage, even though differences were observed during processing. The consistency of margarine during storage was high at low emulsion flow-rates and low at high flow-rates. The temperature of the scraped-surface tube-cooler is the most important parameter in margarine processing. High temperature will produce a hardened product with formation of beta-crystals during storage. The speed of the pin-worker is responsible for inducing crystallization but, at the same time, destroys the crystal agglomerates, resulting in melting.

  1. Optimal experiment design for model selection in biochemical networks.

    Science.gov (United States)

    Vanlier, Joep; Tiemann, Christian A; Hilbers, Peter A J; van Riel, Natal A W

    2014-02-20

    Mathematical modeling is often used to formalize hypotheses on how a biochemical network operates by discriminating between competing models. Bayesian model selection offers a way to determine the amount of evidence that data provides to support one model over the other while favoring simple models. In practice, the amount of experimental data is often insufficient to make a clear distinction between competing models. Often one would like to perform a new experiment which would discriminate between competing hypotheses. We developed a novel method to perform Optimal Experiment Design to predict which experiments would most effectively allow model selection. A Bayesian approach is applied to infer model parameter distributions. These distributions are sampled and used to simulate from multivariate predictive densities. The method is based on a k-Nearest Neighbor estimate of the Jensen-Shannon divergence between the multivariate predictive densities of competing models. We show that the method successfully uses predictive differences to enable model selection by applying it to several test cases. Because the design criterion is based on predictive distributions, which can be computed for a wide range of model quantities, the approach is very flexible. The method reveals specific combinations of experiments which improve discriminability even in cases where data is scarce. The proposed approach can be used in conjunction with existing Bayesian methodologies where (approximate) posteriors have been determined, making use of relations that exist within the inferred posteriors.
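    A k-nearest-neighbor estimate of the Jensen-Shannon divergence between two sets of predictive samples can be sketched as below. The estimator form (a kNN Kullback-Leibler estimate of each component against samples from the pooled mixture) and all parameter choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kth_nn(x, y, k):
    """Distance from each row of x to its k-th nearest neighbour among rows of y."""
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.sqrt(np.sort(d2, axis=1)[:, k - 1])

def knn_kl(p, q, k=5, self_in_q=False):
    # kNN Kullback-Leibler estimate of D(P||Q) from samples (Wang-Kulkarni-Verdu form).
    n, d = p.shape
    rho = kth_nn(p, p, k + 1)  # skip the zero self-distance within P
    nu = kth_nn(p, q, k + 1 if self_in_q else k)
    m = q.shape[0] - 1 if self_in_q else q.shape[0]
    return d * np.log(nu / rho).mean() + np.log(m / (n - 1))

def knn_jsd(p, q, k=5):
    """JSD(P, Q) = 0.5*D(P||M) + 0.5*D(Q||M), with M the pooled mixture sample."""
    mix = np.vstack([p, q])
    return 0.5 * (knn_kl(p, mix, k, self_in_q=True)
                  + knn_kl(q, mix, k, self_in_q=True))

rng = np.random.default_rng(2)
a = rng.normal(0, 1, (1000, 1))
b = rng.normal(4, 1, (1000, 1))
c = rng.normal(0, 1, (1000, 1))
print(knn_jsd(a, c))  # near 0: samples from the same predictive density
print(knn_jsd(a, b))  # near log 2 for well-separated predictive densities
```

    Experiments whose predictive densities yield a large divergence between competing models are the most informative for model selection, which is the design criterion the abstract describes.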

  2. Improvement of Selected Logistics Processes Using Quality Engineering Tools

    Science.gov (United States)

    Zasadzień, Michał; Žarnovský, Jozef

    2018-03-01

    Increase in the number of orders, the increasing quality requirements and the speed of order preparation require implementation of new solutions and improvement of logistics processes. Any disruption that occurs during execution of an order often leads to customer dissatisfaction, as well as loss of his/her confidence. The article presents a case study of the use of quality engineering methods and tools to improve the e-commerce logistic process. This made it possible to identify and prioritize key issues, identify their causes, and formulate improvement and prevention measures.

  3. Similarity metrics for surgical process models.

    Science.gov (United States)

    Neumuth, Thomas; Loebe, Frank; Jannin, Pierre

    2012-01-01

    The objective of this work is to introduce a set of similarity metrics for comparing surgical process models (SPMs). SPMs are progression models of surgical interventions that support quantitative analyses of surgical activities, supporting systems engineering or process optimization. Five different similarity metrics are presented and proven. These metrics deal with several dimensions of process compliance in surgery, including granularity, content, time, order, and frequency of surgical activities. The metrics were experimentally validated using 20 clinical data sets each for cataract interventions, craniotomy interventions, and supratentorial tumor resections. The clinical data sets were controllably modified in simulations, which were iterated ten times, resulting in a total of 600 simulated data sets. The simulated data sets were subsequently compared to the original data sets to empirically assess the predictive validity of the metrics. We show that the results of the metrics for the surgical process models correlate significantly with the simulated degree of modification; thus, the metrics meet predictive validity. The clinical use of the metrics was exemplified by assessment of the learning curves of observers during surgical process model acquisition. Measuring similarity between surgical processes is a complex task. However, metrics for computing the similarity between surgical process models are needed in many uses in the field of medical engineering. These metrics are essential whenever two SPMs need to be compared, such as during the evaluation of technical systems, the education of observers, or the determination of surgical strategies. These metrics are key figures that provide a solid base for medical decisions, such as during validation of sensor systems for use in operating rooms in the future. Copyright © 2011 Elsevier B.V. All rights reserved.
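    As one concrete example of a similarity metric over the order dimension of two activity sequences, a normalized edit distance can be used. This is a hypothetical stand-in for illustration, not one of the five metrics proven in the paper, and the activity names are made up.

```python
# Order-dimension similarity between two surgical activity sequences via a
# normalized Levenshtein (edit) distance: 1.0 means identical order.
def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def order_similarity(a, b):
    if not a and not b:
        return 1.0
    return 1.0 - levenshtein(a, b) / max(len(a), len(b))

s1 = ["incise", "dissect", "coagulate", "suture"]
s2 = ["incise", "coagulate", "dissect", "suture"]
print(order_similarity(s1, s2))  # 0.5: two of the four activities are out of order
```

    Analogous metrics can be defined for the content, time, and frequency dimensions, and combined when comparing two SPMs.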

  4. A process algebra model of QED

    International Nuclear Information System (INIS)

    Sulis, William

    2016-01-01

    The process algebra approach to quantum mechanics posits a finite, discrete, determinate ontology of primitive events which are generated by processes (in the sense of Whitehead). In this ontology, primitive events serve as elements of an emergent space-time and of emergent fundamental particles and fields. Each process generates a set of primitive elements, using only local information, causally propagated as a discrete wave, forming a causal space termed a causal tapestry. Each causal tapestry forms a discrete and finite sampling of an emergent causal manifold (space-time) M and emergent wave function. Interactions between processes are described by a process algebra which possesses 8 commutative operations (sums and products) together with a non-commutative concatenation operator (transitions). The process algebra possesses a representation via nondeterministic combinatorial games. The process algebra connects to quantum mechanics through the set valued process and configuration space covering maps, which associate each causal tapestry with sets of wave functions over M. Probabilities emerge from interactions between processes. The process algebra model has been shown to reproduce many features of the theory of non-relativistic scalar particles to a high degree of accuracy, without paradox or divergences. This paper extends the approach to a semi-classical form of quantum electrodynamics. (paper)

  5. Development and application of microbial selective plugging processes

    Energy Technology Data Exchange (ETDEWEB)

    Jenneman, G.E. [Phillips Petroleum Co., Bartlesville, OK (United States); Gevertz, D.; Davey, M.E. [Agouron Institute, La Jolla, CA (United States)] [and others]

    1995-12-31

    Phillips Petroleum Company recently completed a microbial selective plugging (MSP) pilot at the North Burbank Unit (NBU), Shidler, Oklahoma. Nutrients were selected for the pilot that could stimulate indigenous microflora in the reservoir brine to grow and produce exopolymer. It was found that soluble corn starch polymers (e.g., maltodextrins) stimulated the indigenous bacteria to produce exopolymer, whereas simple sugars (e.g., glucose and sucrose), as well as complex media (e.g., molasses and Nutrient Broth), did not. Injection of maltodextrin into rock cores in the presence of indigenous NBU bacteria resulted in stable permeability reductions (> 90%) across the entire length, while injection of glucose resulted only in face plugging. In addition, it was found that organic phosphate esters (OPE) served as a preferable source of phosphorus for the indigenous bacteria, since orthophosphates and condensed phosphates precipitated in NBU brine at reservoir temperature (45°C). Injection of maltodextrin and ethyl acid phosphate into a producing well stimulated an increase in maltodextrin utilizing bacteria (MUB) in the back-flowed, produced fluid. Additional screens of indigenous and nonindigenous bacteria yielded several nonindigenous isolates that could synthesize polymer when growing in brine containing 6% NaCl at 45°C.

  6. Quantile hydrologic model selection and model structure deficiency assessment : 1. Theory

    NARCIS (Netherlands)

    Pande, S.

    2013-01-01

    A theory for quantile based hydrologic model selection and model structure deficiency assessment is presented. The paper demonstrates that the degree to which a model selection problem is constrained by the model structure (measured by the Lagrange multipliers of the constraints) quantifies

  7. Continuous-Time Mean-Variance Portfolio Selection under the CEV Process

    Directory of Open Access Journals (Sweden)

    Hui-qiang Ma

    2014-01-01

    We consider a continuous-time mean-variance portfolio selection model in which the stock price follows the constant elasticity of variance (CEV) process. The aim of this paper is to derive an optimal portfolio strategy and the efficient frontier. The mean-variance portfolio selection problem is formulated as a linearly constrained convex program. By employing the Lagrange multiplier method and stochastic optimal control theory, we obtain the optimal portfolio strategy and the mean-variance efficient frontier analytically. The results show that the mean-variance efficient frontier is still a parabola in the mean-variance plane, and that the optimal strategies depend not only on the total wealth but also on the stock price. Moreover, some numerical examples are given to analyze the sensitivity of the efficient frontier with respect to the elasticity parameter and to illustrate the results presented in this paper. The numerical results show that the price of risk decreases as the elasticity coefficient increases.
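    The CEV dynamics dS = μS dt + σS^β dW can be simulated to see how the elasticity parameter β shapes the terminal price distribution (β = 1 recovers geometric Brownian motion). The parameter values and Euler scheme below are illustrative, not the paper's analytical solution.

```python
import numpy as np

# CEV price process dS = mu*S dt + sigma*S**beta dW, simulated by Euler stepping.
def cev_paths(s0=1.0, mu=0.05, sigma=0.2, beta=0.8, T=1.0, n_steps=252,
              n_paths=20_000, seed=3):
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s = np.full(n_paths, s0)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        s = np.maximum(s + mu * s * dt + sigma * s**beta * dW, 1e-12)  # keep s > 0
    return s

for beta in (0.5, 1.0):
    s = cev_paths(beta=beta)
    print(f"beta={beta}: mean={s.mean():.4f}, std={s.std():.4f}")
```

    Because the diffusion term is a martingale, the terminal mean stays near s0·exp(μT) for either elasticity, while the spread of terminal prices changes with β, which is the sensitivity the paper's numerical examples examine.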

  8. Peers and the Emergence of Alcohol Use: Influence and Selection Processes in Adolescent Friendship Networks.

    Science.gov (United States)

    Osgood, D Wayne; Ragan, Daniel T; Wallace, Lacey; Gest, Scott D; Feinberg, Mark E; Moody, James

    2013-09-01

    This study addresses not only influence and selection of friends as sources of similarity in alcohol use, but also peer processes leading drinkers to be chosen as friends more often than non-drinkers, which increases the number of adolescents subject to their influence. Analyses apply a stochastic actor-based model to friendship networks assessed five times from 6th through 9th grades for 50 grade cohort networks in Iowa and Pennsylvania, which include 13,214 individuals. Results show definite influence and selection for similarity in alcohol use, as well as reciprocal influences between drinking and frequently being chosen as a friend. These findings suggest that adolescents view alcohol use as an attractive, high status activity and that friendships expose adolescents to opportunities for drinking.

  9. Fuzzy Investment Portfolio Selection Models Based on Interval Analysis Approach

    Directory of Open Access Journals (Sweden)

    Haifeng Guo

    2012-01-01

    This paper employs fuzzy set theory to solve the unintuitive problem of the Markowitz mean-variance (MV) portfolio model and to extend it to a fuzzy investment portfolio selection model. Our model establishes intervals for expected returns and risk preference, which can take into account investors' different investment appetites and thus find the optimal resolution for each interval. In the empirical part, we test this model on Chinese stocks and find that it can fulfill the objectives of different kinds of investors. Finally, investment risk can be decreased by adding an investment limit for each stock in the portfolio, which indicates that our model is useful in practice.

  10. Development of an Environment for Software Reliability Model Selection

    Science.gov (United States)

    1992-09-01

    now is directed to other related problems such as tools for model selection, multiversion programming, and software fault tolerance modeling... Hardware can be repaired by spare modules, which is not the case for software. Preventive maintenance is very important

  11. The Ideal Criteria of Supplier Selection for SMEs Food Processing Industry

    Directory of Open Access Journals (Sweden)

    Ramlan Rohaizan

    2016-01-01

    Selection of a good supplier is important in determining the performance and profitability of the SME food processing industry. The lack of managerial capability in supplier selection in the SME food processing industry affects its competitiveness. This research aims to determine the ideal supplier selection criteria for the food processing industry using the Analytical Hierarchy Process (AHP). The research was carried out quantitatively by distributing questionnaires to 50 SME food processing companies. The collected data were analysed using Expert Choice software to rank the supplier selection criteria. The results show that the criteria for supplier selection are ranked as cost, quality, service, delivery, and management and organisation, while purchase cost, audit result, defect analysis, transportation cost and fast responsiveness are the top five sub-criteria. The results of this research are intended to improve the managerial capabilities of the SME food processing industry in supplier selection.
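    The AHP ranking step can be sketched with a pairwise-comparison matrix over the five criteria: the priority vector is the normalized principal eigenvector, and a consistency ratio below 0.1 indicates acceptable judgments. The comparison values below are made-up illustrations, not the survey data.

```python
import numpy as np

criteria = ["cost", "quality", "service", "delivery", "management"]
# Hypothetical Saaty-scale pairwise comparisons: A[i, j] = importance of i over j.
A = np.array([
    [1.0, 2.0, 3.0, 4.0, 5.0],
    [1/2, 1.0, 2.0, 3.0, 4.0],
    [1/3, 1/2, 1.0, 2.0, 3.0],
    [1/4, 1/3, 1/2, 1.0, 2.0],
    [1/5, 1/4, 1/3, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # priority (weight) vector

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)  # consistency index from lambda_max
ri = 1.12                             # Saaty's random index for n = 5
cr = ci / ri                          # consistency ratio; < 0.1 is acceptable
print(dict(zip(criteria, w.round(3))), "CR =", round(cr, 3))
```

    With these illustrative judgments, cost receives the largest weight and the ranking matches the order reported in the abstract; sub-criteria are scored the same way one level down the hierarchy.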

  12. Determinantal point process models on the sphere

    DEFF Research Database (Denmark)

    Møller, Jesper; Nielsen, Morten; Porcu, Emilio

    We consider determinantal point processes on the d-dimensional unit sphere Sd . These are finite point processes exhibiting repulsiveness and with moment properties determined by a certain determinant whose entries are specified by a so-called kernel which we assume is a complex covariance function...... and eigenfunctions in a spectral representation for the kernel, and we figure out how repulsive isotropic DPPs can be. Moreover, we discuss the shortcomings of adapting existing models for isotropic covariance functions and consider strategies for developing new models, including a useful spectral approach....

  13. Performance assessment modeling of pyrometallurgical process wasteforms

    International Nuclear Information System (INIS)

    Nutt, W.M.; Hill, R.N.; Bullen, D.B.

    1995-01-01

    Performance assessment analyses have been completed to estimate the behavior of high-level nuclear wasteforms generated from the pyrometallurgical processing of liquid metal reactor (LMR) and light water reactor (LWR) spent nuclear fuel. Waste emplaced in the proposed repository at Yucca Mountain is investigated as the basis for the study. The resulting cumulative actinide and fission product releases to the accessible environment within a 100,000 year period from the various pyrometallurgical process wasteforms are compared to those of directly disposed LWR spent fuel using the same total repository system model. The impact of differing radionuclide transport models on the overall release characteristics is investigated

  14. Selection Bias in Educational Transition Models: Theory and Empirical Evidence

    DEFF Research Database (Denmark)

    Holm, Anders; Jæger, Mads

    Most studies using Mare’s (1980, 1981) seminal model of educational transitions find that the effect of family background decreases across transitions. Recently, Cameron and Heckman (1998, 2001) have argued that the “waning coefficients” in the Mare model are driven by selection on unobserved...... the United States, United Kingdom, Denmark, and the Netherlands shows that when we take selection into account the effect of family background variables on educational transitions is largely constant across transitions. We also discuss several difficulties in estimating educational transition models which...

  15. The development of a novel, selective desulfurization process

    NARCIS (Netherlands)

    ter Maat, H.; ter Maat, Hendrik

    2006-01-01

    The removal of hydrogen sulfide from natural, industrial or bio gas is an operation that is frequently encountered in the process industry. Driven by tight sulfur specifications and the everlasting need for cost reduction, a considerable research effort is made in this field, sprouting numerous new

  16. A Method of Measuring Inequality within a Selection Process

    Science.gov (United States)

    Bulle, Nathalie

    2016-01-01

    To explain the inequalities in access to a discrete good G across two populations, or across time in a single national context, it is necessary to distinguish, for each population or period of time, the effect of the diffusion of G from that of unequal outcomes of underlying micro-social processes. The inequality of outcomes of these micro-social…

  17. Food selectivity and processing by the cold-water coral

    NARCIS (Netherlands)

    Van Oevelen, D.; Mueller, C.E.; Lundälv, T.; Middelburg, J.J.

    2016-01-01

    Cold-water corals form prominent reef ecosystems along ocean margins that depend on suspended resources produced in surface waters. In this study, we investigated food processing of 13C and 15N labelled bacteria and algae by the cold-water coral Lophelia pertusa. Coral respiration, tissue incorporation

  18. Retort process modelling for Indian traditional foods.

    Science.gov (United States)

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack food is typically a heterogeneous recipe that incorporates varieties of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate cold point temperature. Initial process conditions, retort temperature and % solid content were the significant independent variables. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot based sweet product) and Upama (wheat based snack product). The predicted and experimental values of the temperature profile matched within ±10 % error, which is a good match considering the food was a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize retorting of various Indian traditional vegetarian foods.
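
The lumped-parameter idea can be illustrated with the classic first-order heating response for the cold point of a retorted pouch. The rate constant, its assumed dependence on solids content, and all numeric values below are illustrative stand-ins, not the fitted coefficients from this study.

```python
import math

def cold_point_temperature(t_min, T_retort=121.0, T_initial=30.0, tau_min=25.0):
    """First-order lumped response: T(t) = Tr - (Tr - T0) * exp(-t / tau)."""
    return T_retort - (T_retort - T_initial) * math.exp(-t_min / tau_min)

def tau_from_solids(percent_solids, tau_water=20.0, k=0.15):
    """Assumed (hypothetical) linear slowing of heating with % solid content."""
    return tau_water * (1.0 + k * percent_solids / 10.0)

# Cold-point heating curve at a few process times (minutes)
for t in (0, 15, 30, 60):
    print(t, round(cold_point_temperature(t), 1))
```

A model of this shape, once calibrated on a solids-water mixture, needs only a handful of parameters per product, which is what makes it useful for cutting down validation trials.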

  19. Models of transport processes in concrete

    International Nuclear Information System (INIS)

    Pommersheim, J.M.; Clifton, J.R.

    1991-01-01

    An approach being considered by the US Nuclear Regulatory Commission for disposal of low-level radioactive waste is to place the waste forms in concrete vaults buried underground. The vaults would need a service life of 500 years. Approaches for predicting the service life of concrete of such vaults include the use of mathematical models. Mathematical models are presented in this report for the major degradation processes anticipated for the concrete vaults, which are corrosion of steel reinforcement, sulfate attack, acid attack, and leaching. The models mathematically represent rate controlling processes including diffusion, convection, and reaction and sorption of chemical species. These models can form the basis for predicting the life of concrete under in-service conditions. 33 refs., 6 figs., 7 tabs

  20. Deterministic geologic processes and stochastic modeling

    International Nuclear Information System (INIS)

    Rautman, C.A.; Flint, A.L.

    1991-01-01

    Recent outcrop sampling at Yucca Mountain, Nevada, has produced significant new information regarding the distribution of physical properties at the site of a potential high-level nuclear waste repository. Consideration of the spatial distribution of measured values and geostatistical measures of spatial variability indicates that there are a number of widespread deterministic geologic features at the site that have important implications for numerical modeling of such performance aspects as ground water flow and radionuclide transport. These deterministic features have their origin in the complex, yet logical, interplay of a number of deterministic geologic processes, including magmatic evolution; volcanic eruption, transport, and emplacement; post-emplacement cooling and alteration; and late-stage (diagenetic) alteration. Because the geologic processes responsible for the formation of Yucca Mountain are relatively well understood and operate on a more-or-less regional scale, understanding of these processes can be used in modeling the physical properties and performance of the site. Information reflecting these deterministic geologic processes may be incorporated into the modeling program explicitly, using geostatistical concepts such as soft information, or implicitly, through the adoption of a particular approach to modeling. It is unlikely that any single representation of physical properties at the site will be suitable for all modeling purposes. Instead, the same underlying physical reality will need to be described many times, each in a manner conducive to assessing specific performance issues

  1. A Mathematical Model of Cigarette Smoldering Process

    Directory of Open Access Journals (Sweden)

    Chen P

    2014-12-01

    Full Text Available A mathematical model for a smoldering cigarette has been proposed. In the analysis of the cigarette combustion and pyrolysis processes, a receding burning front is defined, which has a constant temperature (~450 °C) and divides the cigarette into two zones, the burning zone and the pyrolysis zone. The char combustion processes in the burning zone and the pyrolysis of virgin tobacco and evaporation of water in the pyrolysis zone are included in the model. The hot gases flowing from the burning zone are assumed to go out as sidestream smoke during smoldering. The internal heat transport is characterized by effective thermal conductivities in each zone. Thermal conduction of cigarette paper and convective and radiative heat transfer at the outer surface were also considered. The governing partial differential equations were solved using an integral method. Model predictions of smoldering speed as well as temperature and density profiles in the pyrolysis zone for different kinds of cigarettes were found to agree with the experimental data. The model also predicts the coal length and the maximum coal temperatures during smoldering conditions. The model provides a relatively fast and efficient way to simulate the cigarette burning processes. It offers a practical tool for exploring important parameters for cigarette smoldering processes, such as tobacco components, properties of cigarette paper, and heat generation in the burning zone and its dependence on the mass burn rate.

  2. Modeling of Reaction Processes Controlled by Diffusion

    International Nuclear Information System (INIS)

    Revelli, Jorge

    2003-01-01

    Stochastic modeling is quite powerful in science and technology. The techniques derived from this process have been used with great success in laser theory, biological systems and chemical reactions. Besides, they provide a theoretical framework for the analysis of experimental results in the field of particle diffusion in ordered and disordered materials. In this work we analyze transport processes in one-dimensional fluctuating media, which are media that change their state in time. This fact induces changes in the movements of the particles, giving rise to different phenomena and dynamics that will be described and analyzed in this work. We present some random walk models to describe these fluctuating media. These models include state transitions governed by different dynamical processes. We also analyze the trapping problem in a lattice by means of a simple model which predicts a resonance-like phenomenon. Also we study effective diffusion processes over surfaces due to random walks in the bulk. We consider different boundary conditions and transition movements. We derive expressions that describe diffusion behaviors constrained by bulk restrictions and the dynamics of the particles. Finally, it is important to mention that the theoretical results obtained from the models proposed in this work are compared with Monte Carlo simulations. We find, in general, excellent agreement between theory and simulations

  3. Novel web service selection model based on discrete group search.

    Science.gov (United States)

    Zhai, Jie; Shao, Zhiqing; Guo, Yi; Zhang, Haiteng

    2014-01-01

    In our earlier work, we present a novel formal method for the semiautomatic verification of specifications and for describing web service composition components by using abstract concepts. After verification, the instantiations of components were selected to satisfy the complex service performance constraints. However, selecting an optimal instantiation, which comprises different candidate services for each generic service, from a large number of instantiations is difficult. Therefore, we present a new evolutionary approach on the basis of the discrete group search service (D-GSS) model. With regard to obtaining the optimal multiconstraint instantiation of the complex component, the D-GSS model has competitive performance compared with other service selection models in terms of accuracy, efficiency, and ability to solve high-dimensional service composition component problems. We propose the cost function and the discrete group search optimizer (D-GSO) algorithm and study the convergence of the D-GSS model through verification and test cases.

  4. Internet User Behaviour Model Discovery Process

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The Academy of Economic Studies has more than 45,000 students and about 5,000 computers with Internet access connected to the AES network. Students access the Internet on these computers through a proxy server, which stores information about how the Internet is accessed. In this paper, we describe the process of discovering Internet user behaviour models by analyzing the proxy server's raw data, and we emphasize the importance of such models for the e-learning environment.

  5. Process model development for optimization of forged disk manufacturing processes

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, C.E.; Gunasekera, J.S. [Ohio Univ., Athens, OH (United States). Center for Advanced Materials Processing; Malas, J.C. [Wright Labs., Wright Patterson AFB, OH (United States). Materials Directorate

    1997-12-31

    This paper addresses the development of a system which will enable the optimization of an entire processing sequence for a forged part. Typically such a sequence may involve several stages and alternative routes of manufacturing a given part. It is important that such a system be optimized globally (rather than locally, as is the current practice) in order to achieve improvements in affordability, producibility, and performance. This paper demonstrates the development of a simplified forging model, discusses techniques for searching and reducing a very large design space, and presents an objective function to evaluate the cost of a design sequence.

  6. Manufacturing plant location selection in logistics network using Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Ping-Yu Chang

    2015-11-01

    Full Text Available Purpose: In recent years, numerous companies have moved their manufacturing plants to China to capitalize on lower costs and taxes. Plant location has a large impact on cost, stocks, and the logistics network, but location selection in a company is usually based on the subjective preferences of high-ranking managers. Such a decision-making process might result in selecting a location with a lower fixed cost but a higher operational cost. Therefore, this research adapts real data from an electronics company to develop a framework that incorporates both quantitative and qualitative factors for selecting new plant locations. Design/methodology/approach: In-depth interviews were conducted with 12 high-ranking managers (7 department managers, 2 vice-presidents, 1 senior engineer, and 2 plant managers) in the departments of construction, finance, planning, production, and warehousing to determine the important factors. A questionnaire survey was then conducted to compare the factors, which were analyzed using the Analytic Hierarchy Process (AHP). Findings: Results show that the best location chosen by the developed framework coincides well with the company's primary production base. The results were presented to the company's high-ranking managers to confirm the accuracy of the framework. The managers' positive responses indicate the usefulness of implementing the proposed model in practice, which adds to the value of this research. Practical implications: The proposed framework can save numerous time-consuming meetings called to reconcile opinions and conflicts between different departments in location selection. Originality/value: This paper adapts the Analytic Hierarchy Process (AHP) to incorporate quantitative and qualitative factors, obtained through in-depth interviews with high-ranking managers in a company, into the location decision.

  7. Selection of climate change scenario data for impact modelling

    DEFF Research Database (Denmark)

    Sloth Madsen, M; Fox Maule, C; MacKellar, N

    2012-01-01

    Impact models investigating climate change effects on food safety often need detailed climate data. The aim of this study was to select climate change projection data for selected crop phenology and mycotoxin impact models. Using the ENSEMBLES database of climate model output, this study...... illustrates how the projected climate change signal of important variables such as temperature, precipitation and relative humidity depends on the choice of the climate model. Using climate change projections from at least two different climate models is recommended to account for model uncertainty. To make...... the climate projections suitable for impact analysis at the local scale a weather generator approach was adopted. As the weather generator did not treat all the necessary variables, an ad-hoc statistical method was developed to synthesise realistic values of missing variables. The method is presented...

  8. Derivative processes for modelling metabolic fluxes

    Science.gov (United States)

    Žurauskienė, Justina; Kirk, Paul; Thorne, Thomas; Pinney, John; Stumpf, Michael

    2014-01-01

    Motivation: One of the challenging questions in modelling biological systems is to characterize the functional forms of the processes that control and orchestrate molecular and cellular phenotypes. Recently proposed methods for the analysis of metabolic pathways, for example, dynamic flux estimation, can only provide estimates of the underlying fluxes at discrete time points but fail to capture the complete temporal behaviour. To describe the dynamic variation of the fluxes, we additionally require the assumption of specific functional forms that can capture the temporal behaviour. However, it also remains unclear how to address the noise which might be present in experimentally measured metabolite concentrations. Results: Here we propose a novel approach to modelling metabolic fluxes: derivative processes that are based on multiple-output Gaussian processes (MGPs), which are a flexible non-parametric Bayesian modelling technique. The main advantages that follow from MGPs approach include the natural non-parametric representation of the fluxes and ability to impute the missing data in between the measurements. Our derivative process approach allows us to model changes in metabolite derivative concentrations and to characterize the temporal behaviour of metabolic fluxes from time course data. Because the derivative of a Gaussian process is itself a Gaussian process, we can readily link metabolite concentrations to metabolic fluxes and vice versa. Here we discuss how this can be implemented in an MGP framework and illustrate its application to simple models, including nitrogen metabolism in Escherichia coli. Availability and implementation: R code is available from the authors upon request. Contact: j.norkunaite@imperial.ac.uk; m.stumpf@imperial.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24578401
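
The key property the abstract above relies on, that the derivative of a Gaussian process is itself a Gaussian process, can be sketched directly: with an RBF kernel, the cross-covariances between f and f' are kernel derivatives, so concentrations and fluxes can be sampled jointly. The kernel, hyperparameters and grid below are illustrative choices, not the paper's MGP setup.

```python
import numpy as np

def k(x1, x2, ell=1.0):                 # RBF covariance k(x, x')
    return np.exp(-0.5 * (x1 - x2) ** 2 / ell ** 2)

def dk(x1, x2, ell=1.0):                # Cov(f'(x), f(x')) = d/dx k(x, x')
    return -(x1 - x2) / ell ** 2 * k(x1, x2, ell)

def ddk(x1, x2, ell=1.0):               # Cov(f'(x), f'(x')) = d^2/dx dx' k(x, x')
    return (1.0 / ell ** 2 - (x1 - x2) ** 2 / ell ** 4) * k(x1, x2, ell)

x = np.linspace(0, 5, 40)
X1, X2 = np.meshgrid(x, x, indexing="ij")

# Joint covariance of (f, f') on the grid: f ~ concentration, f' ~ flux.
K = np.block([[k(X1, X2),  dk(X2, X1)],
              [dk(X1, X2), ddk(X1, X2)]])

rng = np.random.default_rng(0)
sample = rng.multivariate_normal(np.zeros(2 * x.size), K + 1e-8 * np.eye(2 * x.size))
f, df = sample[:x.size], sample[x.size:]

# The sampled derivative should track the finite-difference slope of f.
fd = np.gradient(f, x)
print("max |df - finite difference|:", np.max(np.abs(df - fd)))
```

This joint construction is what lets an MGP-style model move between metabolite concentrations and metabolic fluxes in either direction.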

  9. Selection Processes and Appropriability in Art, Science and Technology

    NARCIS (Netherlands)

    Wijnberg, N.M.

    1995-01-01

    Recently, there has been a mutually beneficial interchange of models and ideas between the sociology of science and the economics of technological innovation. Concepts such as the "paradigm" and the "network" seem to lend themselves to useful application in both fields. To these is added the concept

  10. New processes for the selective production of 1-octene

    Energy Technology Data Exchange (ETDEWEB)

    van Leeuwen, P.W.N.M.; Clement, N.D.; Tschan, M.J.L. [Institute of Chemical Research Catalonia ICIQ, Tarragona (Spain)

    2011-07-15

    Linear alpha-olefins, especially 1-hexene and 1-octene, are key components for the production of LLDPE and the demand for 1-hexene and 1-octene increased enormously in recent years. Here we review the new processes for 1-octene production based on homogeneous catalysts. Sasol's coal-based high temperature Fischer-Tropsch technology produces an Anderson-Schulz-Flory distribution of hydrocarbons with high alpha-olefin content and the desired alkenes, including 1-heptene and 1-octene, are separated by distillation. In this case, as in the SHOP process, 1-octene constitutes only a minor part of the total yield. Nowadays other technologies are being applied or considered for on-purpose 1-octene production: hydroformylation of 1-heptene, the telomerization of 1,3-butadiene, and ethene tetramerization. 1-Heptene can be converted in three steps to 1-octene: (1) hydroformylation of 1-heptene to octanal, (2) hydrogenation of octanal to 1-octanol, and (3) dehydration of 1-octanol to 1-octene. This process was commercialized by Sasol. Dow commercialized a process based on butadiene. Telomerization of butadiene with methanol in the presence of a palladium catalyst yields 1-methoxy-2,7-octadiene, which is fully hydrogenated to 1-methoxyoctane in the next step. Subsequent cracking of 1-methoxyoctane gives 1-octene and methanol for recycle. Recently highly active and stable phosphine based systems were reported that show particularly good performance for the industrially attractive feedstock, the C4 cut of the paraffin cracker. 1-Hexene can be obtained by ethene trimerization by a family of catalysts based mainly on Cr.

  11. Occurrence of Aflatoxins in Selected Processed Foods from Pakistan

    Science.gov (United States)

    Mushtaq, Muhammad; Sultana, Bushra; Anwar, Farooq; Khan, Muhammad Zargham; Ashrafuzzaman, Muhammad

    2012-01-01

    A total of 125 (ready to eat) processed food samples (70 intended for infant and 55 for adult intake) belonging to 20 different food categories were analyzed for aflatoxins contamination using Reverse Phase High Performance Liquid Chromatography (RP-HPLC) with fluorescent detection. A solvent mixture of acetonitrile-water was used for the extraction followed by immunoaffinity clean-up to enhance sensitivity of the method. The limit of detection (LOD) (0.01–0.02 ng·g−1) and limit of quantification (LOQ) (0.02 ng·g−1) was established for aflatoxins based on signal to noise ratio of 3:1 and 10:1, respectively. Of the processed food samples tested, 38% were contaminated with four types of aflatoxins, i.e., AFB1 (0.02–1.24 μg·kg−1), AFB2 (0.02–0.37 μg·kg−1), AFG1 (0.25–2.7 μg·kg−1) and AFG2 (0.21–1.3 μg·kg−1). In addition, the results showed that 21% of the processed foods intended for infants contained AFB1 levels higher than the European Union permissible limits (0.1 μg·kg−1), while all of those intended for adult consumption had aflatoxin contamination levels within the permitted limits. PMID:22942705

  12. Adverse Selection Models with Three States of Nature

    Directory of Open Access Journals (Sweden)

    Daniela MARINESCU

    2011-02-01

    Full Text Available In the paper we analyze an adverse selection model with three states of nature, where both the Principal and the Agent are risk neutral. When solving the model, we use the informational rents and the efforts as variables. We derive the optimal contract in the situation of asymmetric information. The paper ends with the characteristics of the optimal contract and the main conclusions of the model.

  13. Modeling quality attributes and metrics for web service selection

    Science.gov (United States)

    Oskooei, Meysam Ahmadi; Daud, Salwani binti Mohd; Chua, Fang-Fang

    2014-06-01

    Since service-oriented architecture (SOA) is designed to develop systems as distributed applications, service selection has become a vital aspect of service-oriented computing (SOC). Selecting the appropriate web service with respect to quality of service (QoS) through mathematical optimization has turned service selection into a common concern for service users. Nowadays, the number of web services that provide the same functionality has increased, and selecting a service from a set of alternatives which differ in quality parameters can be difficult for service consumers. In this paper, a new model for QoS attributes and metrics is proposed to provide a suitable solution for optimizing web service selection and composition with low complexity.
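
A minimal sketch of QoS-based selection of the kind described above: normalise each metric (distinguishing benefit metrics, where higher is better, from cost metrics, where lower is better), then score candidates by a weighted sum. The services, metric values and weights are made-up placeholders, not the paper's model.

```python
# Hypothetical candidate services with three QoS metrics each.
services = {
    "svc_a": {"response_ms": 120, "availability": 0.999, "cost": 0.05},
    "svc_b": {"response_ms": 300, "availability": 0.995, "cost": 0.01},
    "svc_c": {"response_ms": 180, "availability": 0.990, "cost": 0.03},
}
weights = {"response_ms": 0.5, "availability": 0.3, "cost": 0.2}
benefit = {"response_ms": False, "availability": True, "cost": False}

def normalise(metric):
    """Min-max normalise one metric to [0, 1], flipping cost-type metrics."""
    vals = [s[metric] for s in services.values()]
    lo, hi = min(vals), max(vals)
    span = hi - lo or 1.0
    return {name: ((s[metric] - lo) / span if benefit[metric]
                   else (hi - s[metric]) / span)
            for name, s in services.items()}

norm = {m: normalise(m) for m in weights}
scores = {name: sum(weights[m] * norm[m][name] for m in weights)
          for name in services}
best = max(scores, key=scores.get)
print(best, scores)
```

Real QoS selection models add constraints (e.g. budget, latency ceilings) on top of a scoring scheme like this, but the normalise-and-weight step is the common core.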

  14. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.

  15. Modeling as a Decision-Making Process

    Science.gov (United States)

    Bleiler-Baxter, Sarah K.; Stephens, D. Christopher; Baxter, Wesley A.; Barlow, Angela T.

    2017-01-01

    The goal in this article is to support teachers in better understanding what it means to model with mathematics by focusing on three key decision-making processes: Simplification, Relationship Mapping, and Situation Analysis. The authors use the Theme Park task to help teachers develop a vision of how students engage in these three decision-making…

  16. Kinetics and modeling of anaerobic digestion process

    DEFF Research Database (Denmark)

    Gavala, Hariklia N.; Angelidaki, Irini; Ahring, Birgitte Kiær

    2003-01-01

    Anaerobic digestion modeling started in the early 1970s when the need for design and efficient operation of anaerobic systems became evident. At that time not only was the knowledge about the complex process of anaerobic digestion inadequate but also there were computational limitations. Thus...

  17. Querying Business Process Models with VMQL

    DEFF Research Database (Denmark)

    Störrle, Harald; Acretoaie, Vlad

    2013-01-01

    . In this paper, we apply VMQL to the Business Process Modeling Notation (BPMN) to evaluate the second claim. We explore the adaptations required, and re-evaluate the usability of VMQL in this context. We find similar results to earlier work, thus both supporting our claims and establishing the usability of VMQL...

  18. Numerical modeling and simulation in various processes

    Directory of Open Access Journals (Sweden)

    Eliza Consuela ISBĂŞOIU

    2011-12-01

    Economic modeling offers the manager rigour in his actions and multiple ways of connecting existing resources with the objectives pursued over a certain period of time, providing the possibility of better and faster thinking and decision-making without distorting reality.

  19. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  20. [On selection criteria in spatially distributed models of competition].

    Science.gov (United States)

    Il'ichev, V G; Il'icheva, O A

    2014-01-01

    Discrete models of competitors (an initial population and mutants) are considered in which reproduction is given by an increasing, concave function and migration in a space consisting of a set of areas is described by a Markov matrix. This allows the use of the theory of monotone operators to study problems of selection, coexistence and stability. It is shown that the larger the number of areas, the more severe the constraints on selective advantage required by the initial population.
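
A toy instance of this class of models can be sketched as follows: reproduction in each area is an increasing, concave (Beverton-Holt type) map, migration between areas is a column-stochastic (Markov) matrix, and a resident competes with a mutant through shared density dependence. The growth rates and migration matrix are illustrative assumptions, not taken from the paper; the type with the higher growth rate is expected to be selected.

```python
import numpy as np

P = np.array([[0.8, 0.3],        # column-stochastic migration between 2 areas
              [0.2, 0.7]])

def step(x, y, r_x=2.0, r_y=2.2):
    """One generation: concave reproduction in each area, then migration."""
    total = x + y                            # shared density dependence per area
    x_new = P @ (r_x * x / (1.0 + total))    # increasing, concave in x
    y_new = P @ (r_y * y / (1.0 + total))
    return x_new, y_new

x = np.array([1.0, 1.0])    # resident near its equilibrium
y = np.array([0.01, 0.01])  # rare mutant with higher growth rate
for _ in range(500):
    x, y = step(x, y)

print("resident:", np.round(x, 4), "mutant:", np.round(y, 4))
```

In this run the mutant, whose per-capita growth exceeds 1 at the resident's equilibrium density, displaces the resident, illustrating the selection question the paper studies with monotone-operator theory.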

  1. Comparing the staffing models of outsourcing in selected companies

    OpenAIRE

    Chaloupková, Věra

    2010-01-01

    This thesis deals with the takeover of employees in outsourcing. Its principal purpose is to compare the staffing models of outsourcing in selected companies, for which multi-criteria analysis was chosen. The thesis is divided into six chapters. The first chapter is devoted to the theoretical part and describes basic concepts such as outsourcing, personnel aspects, phases of outsourcing projects, communication and culture. The rest of the thesis is devote...

  2. Relative age effect in the selection process of handball players of the regional selection teams

    Directory of Open Access Journals (Sweden)

    Manuel Gómez López

    2017-06-01

    Full Text Available This study aimed to analyze the relative age effect among adolescent handball players in regional selection teams. To do this, data on the sex and date of birth of 84 youth players from different regional selection teams in the 2015-2016 season were analyzed; comparisons and differences were studied using χ2 and Z tests and the Bonferroni method. The analysis of results by quarter and half-year of birth revealed no statistically significant differences by gender or category. This seems to confirm that there is no relative age effect in the teams analyzed and that, at the grassroots level of handball, all young people participate regardless of their degree of maturity.

  3. Decision support model for selecting and evaluating suppliers in the construction industry

    Directory of Open Access Journals (Sweden)

    Fernando Schramm

    2012-12-01

    Full Text Available A structured evaluation of the construction industry's suppliers, considering aspects which make their quality and credibility evident, can be a strategic tool to manage this specific supply chain. This study proposes a multi-criteria decision model for supplier selection in the construction industry, as well as an efficient evaluation procedure for the selected suppliers. The model is based on the SMARTER (Simple Multi-Attribute Rating Technique Exploiting Ranks) method, and its main contribution is a new approach to structuring the process of supplier selection, establishing explicit strategic policies on which the company's management system relies to select suppliers. The model was applied to a civil construction company in Brazil, and the main results demonstrate the efficiency of the proposed model. This study allowed the development of an approach for the construction industry which was able to provide a better relationship among its managers, suppliers and partners.
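
A distinctive step in SMARTER is that criterion weights are not elicited numerically but derived from the rank order of the criteria via rank-order centroid (ROC) weights. The sketch below shows that computation; the criteria names are hypothetical examples for a supplier model, not the study's actual criteria.

```python
def roc_weights(n):
    """Rank-order centroid weights: w_k = (1/n) * sum_{i=k}^{n} 1/i for rank k = 1..n."""
    return [sum(1.0 / i for i in range(k, n + 1)) / n for k in range(1, n + 1)]

# Hypothetical supplier-selection criteria, already ranked by importance.
criteria = ["quality", "delivery reliability", "price", "financial stability"]
for name, w in zip(criteria, roc_weights(len(criteria))):
    print(f"{name:22s} {w:.4f}")
```

Each supplier's overall score is then the ROC-weighted sum of its (normalised) ratings on the criteria, so decision makers only ever have to supply a ranking.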

  4. Leukocyte Motility Models Assessed through Simulation and Multi-objective Optimization-Based Model Selection.

    Directory of Open Access Journals (Sweden)

    Mark N Read

    2016-09-01

    Full Text Available The advent of two-photon microscopy now reveals unprecedented, detailed spatio-temporal data on cellular motility and interactions in vivo. Understanding cellular motility patterns is key to gaining insight into the development and possible manipulation of the immune response. Computational simulation has become an established technique for understanding immune processes and evaluating hypotheses in the context of experimental data, and there is clear scope to integrate microscopy-informed motility dynamics. However, determining which motility model best reflects in vivo motility is non-trivial: 3D motility is an intricate process requiring several metrics to characterize. This complicates model selection and parameterization, which must be performed against several metrics simultaneously. Here we evaluate Brownian motion, Lévy walk and several correlated random walks (CRWs) against the motility dynamics of neutrophils and lymph node T cells under inflammatory conditions by simultaneously considering cellular translational and turn speeds, and meandering indices. Heterogeneous cells exhibiting a continuum of inherent translational speeds and directionalities comprise both datasets, a feature significantly improving capture of in vivo motility when simulated as a CRW. Furthermore, translational and turn speeds are inversely correlated, and the corresponding CRW simulation again improves capture of our in vivo data, albeit to a lesser extent. In contrast, Brownian motion poorly reflects our data. Lévy walk is competitive in capturing some aspects of neutrophil motility, but T cell directional persistence only, therein highlighting the importance of evaluating models against several motility metrics simultaneously. This we achieve through novel application of multi-objective optimization, wherein each model is independently implemented and then parameterized to identify optimal trade-offs in performance against each metric. The resultant Pareto
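
    The Pareto-based model selection described above rests on identifying non-dominated parameterizations. A minimal sketch of that filtering step, with hypothetical per-metric error tuples (lower is better on every metric):

```python
def pareto_front(points):
    """Return the non-dominated points, where each point is a tuple of
    per-metric errors and a point q dominates p if q is at least as
    good on every metric and differs from p."""
    front = []
    for p in points:
        dominated = any(q != p and all(q[i] <= p[i] for i in range(len(p)))
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical candidates scored on (speed error, turn error, meander error).
candidates = [(0.2, 0.5, 0.3), (0.4, 0.2, 0.6),
              (0.5, 0.6, 0.7), (0.1, 0.9, 0.4)]
front = pareto_front(candidates)  # (0.5, 0.6, 0.7) is dominated and dropped
```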

  5. Occurrence of Aflatoxins in Selected Processed Foods from Pakistan

    Directory of Open Access Journals (Sweden)

    Muhammad Ashrafuzzaman

    2012-07-01

    Full Text Available A total of 125 ready-to-eat processed food samples (70 intended for infant and 55 for adult intake) belonging to 20 different food categories were analyzed for aflatoxin contamination using Reverse Phase High Performance Liquid Chromatography (RP-HPLC) with fluorescence detection. A solvent mixture of acetonitrile-water was used for the extraction, followed by immunoaffinity clean-up to enhance the sensitivity of the method. The limit of detection (LOD, 0.01–0.02 ng·g−1) and limit of quantification (LOQ, 0.02 ng·g−1) were established for aflatoxins based on signal-to-noise ratios of 3:1 and 10:1, respectively. Of the processed food samples tested, 38% were contaminated with four types of aflatoxins, i.e., AFB1 (0.02–1.24 μg·kg−1), AFB2 (0.02–0.37 μg·kg−1), AFG1 (0.25–2.7 μg·kg−1) and AFG2 (0.21–1.3 μg·kg−1). In addition, the results showed that 21% of the processed foods intended for infants contained AFB1 levels higher than the European Union permissible limit (0.1 μg·kg−1), while all of those intended for adult consumption had aflatoxin contamination levels within the permitted limits.
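
    The S/N-based limits quoted above follow the usual convention of taking LOD at S/N = 3 and LOQ at S/N = 10 relative to the baseline noise and calibration sensitivity. A sketch with hypothetical noise and slope values (not the paper's figures):

```python
def detection_limits(noise_sd, slope):
    """Signal-to-noise based limits: concentration giving S/N = 3 (LOD)
    and S/N = 10 (LOQ), for baseline noise noise_sd and calibration
    slope in signal units per concentration unit."""
    lod = 3 * noise_sd / slope
    loq = 10 * noise_sd / slope
    return lod, loq

# Hypothetical values: noise of 0.005 signal units, slope of 1.0
# signal units per ng/g.
lod, loq = detection_limits(noise_sd=0.005, slope=1.0)
```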

  6. Modeling of the mechanical alloying process

    Science.gov (United States)

    Maurice, D.; Courtney, T. H.

    1992-01-01

    Two programs have been developed to compute the dimensional and property changes that occur with repetitive impacts during the mechanical alloying process. The more sophisticated of the programs also maintains a running count of the fractions of particles present and from this calculates a population distribution. The programs predict powder particle size and shape changes in accord with the accepted stages of powder development during mechanical alloying of ductile species. They also predict hardness and lamellar thickness changes with processing, again with reasonable agreement with experimental results. These predictions offer support for the model (and thereby give insight into what may actually be happening during mechanical alloying) and hence allow refinement and calibration of the myriad aspects of the model. They also provide a vehicle for establishing control over the dimensions and properties of the output powders used for consolidation, thereby facilitating optimization of the consolidation process.
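
    As a rough illustration of the kind of per-impact bookkeeping such programs perform (not the authors' actual model), one can track lamellar thickness thinning geometrically while hardness work-hardens toward a saturation value; all parameters below are invented:

```python
# Toy per-impact evolution: lamellae thin by a fixed factor each
# impact, hardness rises toward a saturation value. Illustrative only.

def simulate_impacts(n_impacts, t0=10.0, thinning=0.95,
                     h0=100.0, h_sat=400.0, rate=0.01):
    thickness, hardness = t0, h0
    for _ in range(n_impacts):
        thickness *= thinning                  # lamellae thin per impact
        hardness += rate * (h_sat - hardness)  # saturating work hardening
    return thickness, hardness

# After many impacts the lamellar spacing is strongly refined and the
# hardness approaches saturation.
t, h = simulate_impacts(200)
```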

  7. Managing risks in business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2010-01-01

    Companies today, in some industries more than others, invest more capital and resources just to stay competitive, develop more diverse solutions, and increasingly start thinking more radically when considering their business models. However, despite the understanding that business model (BM) innovation is a risky enterprise, many companies are still choosing not to apply any risk management in the BM innovation process. The objective of this paper is to develop a better understanding of how risks are handled in the practice of BM innovation. An analysis of the BM innovation experiences of two industrial companies shows that both companies are experiencing high levels of uncertainty and complexity during their innovation processes and are, consequently, struggling to find new processes for handling the risks involved. Based on the two companies’ experiences, various testable propositions are put forward, which link success and failure to the way companies appreciate and handle the risks involved in BM innovation.

  8. Lightweight Graphical Models for Selectivity Estimation Without Independence Assumptions

    DEFF Research Database (Denmark)

    Tzoumas, Kostas; Deshpande, Amol; Jensen, Christian S.

    2011-01-01

    Selectivity estimation errors, propagated exponentially, can lead to severely sub-optimal plans. Modern optimizers typically maintain one-dimensional statistical summaries and make the attribute value independence and join uniformity assumptions for efficiently estimating selectivities. Therefore, selectivity estimation errors in today’s optimizers are frequently caused by missed correlations between attributes. We present a selectivity estimation approach that does not make the independence assumptions. By carefully using concepts from the field of graphical models, we are able to factor the joint probability distribution of all the attributes into small, low-dimensional distributions.
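
    The cost of the attribute-value-independence (AVI) assumption is easy to see on a toy relation where two attributes are strongly correlated; the data below are hypothetical:

```python
# Compare the true selectivity of a conjunctive predicate with the
# estimate produced under the attribute-value-independence assumption.

rows = [("sedan", "gas"), ("sedan", "gas"), ("sedan", "gas"),
        ("truck", "diesel"), ("truck", "diesel"), ("truck", "gas")]

def sel_true(rows, pred):
    """Exact selectivity of a predicate over the relation."""
    return sum(1 for r in rows if pred(r)) / len(rows)

def sel_avi(rows, a_val, b_val):
    """Estimate P(A=a, B=b) as P(A=a) * P(B=b): the AVI assumption."""
    pa = sum(1 for a, _ in rows if a == a_val) / len(rows)
    pb = sum(1 for _, b in rows if b == b_val) / len(rows)
    return pa * pb

true_sel = sel_true(rows, lambda r: r == ("truck", "diesel"))  # 2/6
avi_sel = sel_avi(rows, "truck", "diesel")                     # (3/6)*(2/6)
```

    Because vehicle type and fuel type are correlated, the independence estimate understates the true selectivity by a factor of two, the kind of error that compounds across join predicates.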

  9. Statistical model for high energy inclusive processes

    International Nuclear Information System (INIS)

    Pomorisac, B.

    1980-01-01

    We propose a statistical model of inclusive processes. The model is an extension of the model proposed by Scalapino and Sugar for the inclusive distributions in rapidity. The model is defined in terms of a random variable on the full phase space of the produced particles and in terms of a Lorentz-invariant probability distribution. We suggest that the Lorentz invariance is broken spontaneously; this may describe the observed anisotropy of the inclusive distributions. Based on this model we calculate the distribution in transverse momentum. An explicit calculation is given of the one-particle inclusive cross sections and the two-particle correlation. The results give a fair representation of the shape of one-particle inclusive cross sections, and a positive correlation for the particles emitted. The relevance of our results to experiments is discussed.

  10. Genetic signatures of natural selection in a model invasive ascidian

    Science.gov (United States)

    Lin, Yaping; Chen, Yiyong; Yi, Changho; Fong, Jonathan J.; Kim, Won; Rius, Marc; Zhan, Aibin

    2017-03-01

    Invasive species represent promising models to study species’ responses to rapidly changing environments. Although local adaptation frequently occurs during contemporary range expansion, the associated genetic signatures at both population and genomic levels remain largely unknown. Here, we use genome-wide gene-associated microsatellites to investigate genetic signatures of natural selection in a model invasive ascidian, Ciona robusta. Population genetic analyses of 150 individuals sampled in Korea, New Zealand, South Africa and Spain showed significant genetic differentiation among populations. Based on outlier tests, we found a high incidence of signatures of directional selection at 19 loci. Hitchhiking mapping analyses identified 12 directional selective sweep regions, and all selective sweep windows on chromosomes were narrow (~8.9 kb). Further analyses identified 132 candidate genes under selection. When we compared our genetic data with six crucial environmental variables, 16 putatively selected loci showed significant correlation with these environmental variables. This suggests that the local environmental conditions have left significant signatures of selection at both population and genomic levels. Finally, we identified “plastic” genomic regions and genes that are promising regions to investigate evolutionary responses to rapid environmental change in C. robusta.

  11. Reevaluating the selection process for the Israeli Defense Force's paramedic training program.

    Science.gov (United States)

    Lending, Gadi; Nadler, Roy; Abramovich, Amir; Gendler, Sami; Peretz, Asaf; Dagan, David; Gilad, David; Glassberg, Elon

    2015-03-01

    Selecting candidates for medical training programs is a complicated process aimed at identifying specific personal competencies, in an attempt to minimize attrition and produce better medical providers. The objective of this study was to evaluate the accuracy of the selection process for the Israeli Defense Force's paramedic training program and its ability to predict success measured at different end points. Selection process test scores were crossed and measured against three different end points: attrition, national certification test scores, and training program graduation scores. Data were available for 146 candidates. A positive association was detected between lower formulated selection scores and attrition rates (p<0.01). Out of the 11 tests conducted that comprise the final selection score, two had shown significant association with attrition. The calculated score of these specific two tests was found to have similar association with attrition as the formulated selection score. The current Israeli Defense Force's paramedic-formulated selection score has shown association with attrition; candidates performing poorly throughout the selection process were less likely to complete training. Similar results may be achieved by implementing a more efficient selection process based on fewer tests. Further studies are required to identify the optimal composition for selection processes. Ongoing learning and research form the ground for improvement, not only of trauma medicine but of all aspects of medicine. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
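
    The association between selection scores and attrition reported above can be sketched by comparing attrition rates between low- and high-scoring candidates; the cohort below is invented, not the study's data:

```python
def attrition_by_score(candidates):
    """Compare attrition between candidates below and above the median
    selection score. `candidates` is a list of (score, completed)
    pairs, with completed = 1 for graduates and 0 for dropouts."""
    scores = sorted(s for s, _ in candidates)
    median = scores[len(scores) // 2]
    low = [c for s, c in candidates if s < median]
    high = [c for s, c in candidates if s >= median]
    rate = lambda group: 1 - sum(group) / len(group)
    return rate(low), rate(high)

# Hypothetical cohort in which low scorers drop out more often.
cohort = [(55, 0), (60, 0), (65, 1), (70, 1), (80, 1), (85, 1)]
low_attrition, high_attrition = attrition_by_score(cohort)
```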

  12. Fundamentals of Numerical Modelling of Casting Processes

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri; Pryds, Nini; Thorborg, Jesper

    Fundamentals of Numerical Modelling of Casting Processes comprises a thorough presentation of the basic phenomena that need to be addressed in numerical simulation of casting processes. The main philosophy of the book is to present the topics in view of their physical meaning, whenever possible, rather than relying strictly on mathematical formalism. The book, aimed both at the researcher and the practicing engineer, as well as the student, is naturally divided into four parts. Part I (Chapters 1-3) introduces the fundamentals of modelling in a 1-dimensional framework. Part II (Chapter 4) presents the most important aspects of solidification theory related to modelling. Part III (Chapter 5) describes the fluid flow phenomena, and in Part IV (Chapter 6) the stress-strain analysis is addressed. For all parts, both numerical formulations as well as some important analytical solutions are presented.
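
    The 1-dimensional framework of Part I can be illustrated with the simplest casting-relevant calculation, an explicit finite-difference step of the 1D heat equation; the slab geometry and material values below are illustrative assumptions, not taken from the book:

```python
def step_heat_1d(T, alpha, dx, dt):
    """One explicit finite-difference (FTCS) step of the 1D heat
    equation dT/dt = alpha * d2T/dx2 with fixed-temperature ends."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "stability limit of the explicit scheme"
    return ([T[0]]
            + [T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
               for i in range(1, len(T) - 1)]
            + [T[-1]])

# Hypothetical casting slab: hot interior at 700 C, mould walls held
# at 20 C; alpha, dx, dt chosen to satisfy the stability limit.
T = [20.0] + [700.0] * 8 + [20.0]
for _ in range(50):
    T = step_heat_1d(T, alpha=1e-5, dx=0.01, dt=4.0)
```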

  13. Temperature Modelling of the Biomass Pretreatment Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Blanke, Mogens; Jensen, Jakob M.

    2012-01-01

    In a second generation biorefinery, the biomass pretreatment stage has an important contribution to the efficiency of the downstream processing units involved in biofuel production. Most of the pretreatment process occurs in a large pressurized thermal reactor that presents an irregular temperature distribution. Therefore, an accurate temperature model is critical for observing the biomass pretreatment. More than that, the biomass is also pushed with a constant horizontal speed along the reactor in order to ensure a continuous throughput. The goal of this paper is to derive a temperature model that captures the environmental temperature differences inside the reactor using distributed parameters. A Kalman filter is then added to account for any missing dynamics and the overall model is embedded into a temperature soft sensor. The operator of the plant will be able to observe the temperature at any location inside the reactor.
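
    The Kalman filter mentioned above can be sketched in its simplest scalar form, treating temperature as a random-walk state corrected by noisy measurements; the model and noise values are illustrative only, not the paper's:

```python
# Minimal scalar Kalman filter of the kind a temperature soft sensor
# could build on.

def kalman_step(x, p, z, q=0.01, r=1.0):
    """One predict/update cycle for a random-walk temperature state.
    x: current estimate, p: its variance, z: new noisy measurement,
    q: process noise variance, r: measurement noise variance."""
    p = p + q            # predict: uncertainty grows
    k = p / (p + r)      # Kalman gain
    x = x + k * (z - x)  # update with the measurement
    p = (1 - k) * p
    return x, p

x, p = 180.0, 10.0  # initial guess (deg C) and its variance
for z in [185.2, 184.7, 185.9, 185.1]:
    x, p = kalman_step(x, p, z)
```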

  14. Modeling Veterans Health Administration disclosure processes

    Energy Technology Data Exchange (ETDEWEB)

    Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.

    2013-09-01

    As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.

  15. Grid Size Selection for Nonlinear Least-Squares Optimization in Spectral Estimation and Array Processing

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm; Jensen, Jesper Rindom

    2016-01-01

    In many spectral estimation and array processing problems, the process of finding estimates of model parameters often involves the optimisation of a cost function containing multiple peaks and dips. Such non-convex problems are hard to solve using traditional optimisation algorithms developed for convex problems, and computationally intensive grid searches are therefore often used instead. In this paper, we establish an analytical connection between the grid size and the parametrisation of the cost function so that the grid size can be selected as coarsely as possible to lower the computation time. Additionally, we show via three common examples how the grid size depends on parameters such as the number of data points or the number of sensors in DOA estimation. We also demonstrate that the computation time can potentially be lowered by several orders of magnitude by combining a coarse grid search with local refinement.
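
    The coarse-grid-plus-refinement strategy can be sketched as follows: search a grid chosen coarse enough that the global dip is not missed, then refine locally within the winning grid cell. The cost function below is an invented stand-in for a spectral estimation objective, not one of the paper's examples:

```python
import math

def coarse_to_fine_min(cost, lo, hi, grid_size, iters=60):
    """Coarse grid search over [lo, hi] followed by ternary-search
    refinement around the best grid point; assumes the cost is
    unimodal within one grid cell, which is what a properly chosen
    grid size is meant to guarantee."""
    pts = [lo + i * grid_size
           for i in range(int((hi - lo) / grid_size) + 1)]
    best = min(pts, key=cost)
    a, b = best - grid_size, best + grid_size
    for _ in range(iters):
        m1, m2 = a + (b - a) / 3, b - (b - a) / 3
        if cost(m1) < cost(m2):
            b = m2
        else:
            a = m1
    return (a + b) / 2

# Multi-dip cost with its global minimum at x = 2.0 (illustrative).
cost = lambda x: (-math.exp(-(x - 2.0) ** 2)
                  - 0.5 * math.exp(-(x + 1.0) ** 2))
x_hat = coarse_to_fine_min(cost, -4.0, 4.0, grid_size=0.5)
```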

  16. On the selection of optimized carbon nano tube synthesis method using analytic hierarchy process

    International Nuclear Information System (INIS)

    Besharati, M. K.; Afaghi Khatibi, A.; Akbari, M.

    2008-01-01

    Evidence from the early and late industrializers shows that technology, as the commercial application of scientific knowledge, has been a major driver of industrial and economic development. International technology transfer is now being recognized as having played an important role in the development of the most successful late industrializers of the second half of the twentieth century. Our society stands to be significantly influenced by carbon nano tubes, shaped by nano tube applications in every aspect, just as silicon-based technology still shapes society today. Nano tubes can be formed in various structures using several different processing methods. In this paper, the synthesis methods used to produce nano tubes at industrial or laboratory scales are discussed and a comparison is made. A technical feasibility study is conducted by using a multi-criteria decision-making model, namely the Analytic Hierarchy Process. The article ends with a discussion of selecting the best method for transferring carbon nano tube technology to Iran.
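
    The Analytic Hierarchy Process derives criteria weights as the principal eigenvector of a pairwise comparison matrix. A minimal sketch via power iteration, with an invented 3x3 comparison matrix (not the paper's synthesis-method criteria):

```python
# AHP priority calculation: the principal eigenvector of a pairwise
# comparison matrix, normalised to sum to one, gives the weights.

def ahp_weights(A, iters=100):
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):  # power iteration toward the eigenvector
        w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical reciprocal comparison matrix: criterion 1 is 3x as
# important as criterion 2 and 5x as important as criterion 3.
A = [[1.0,     3.0, 5.0],
     [1 / 3.0, 1.0, 2.0],
     [1 / 5.0, 1 / 2.0, 1.0]]
w = ahp_weights(A)  # weights sum to 1; the first criterion dominates
```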

  17. Toward an understanding of methane selectivity in the Fischer-Tropsch process

    Science.gov (United States)

    Psarras, Peter C.

    The purpose of this research is to elucidate a better understanding of the conditions relevant to methane selectivity in the Fischer-Tropsch (FT) process. The development of more efficient FT catalysts can result in great commercial profit. The industrially relevant FT process has long been hampered by the production of methane. Nearly 60 percent of FT capital is devoted to the removal of methane and purification of feed-stock gases through steam-reforming. Naturally, a more efficient FT catalyst would need to have a reasonable balance between catalytic activity and suppression of methane formation (low methane selectivity). Though a significant amount of work has been devoted to understanding the mechanisms involved in methane selectivity, the exact mechanism is still not well understood. Density functional theory (DFT) methods provide an opportunity to explore the FT catalytic process at the molecular level. This work represents a combination of various DFT approaches in an attempt to gather new insight on the conditions relevant to methane selectivity. A thorough understanding of the electronic environment involved in the surface-adsorbate interaction is necessary to the advancement of more efficient Fischer-Tropsch catalysts. This study investigates the promotive effect of four late transition metals (Cu, Ag, Au and Pd) on three FT catalytic surfaces (Fe, Co and Ni). The purpose of this research is to examine the surface-adsorbate interaction from two perspectives: 1) interactions occurring between FT precursors and small, bimetallic surface analogs (clusters), and 2) plane-wave calculations of the interactions between FT precursors and simulated bulk surfaces. Our results suggest that promising candidates for the reduction of FT methane selectivity include Au and Pd on Ni, Au and Ag on Co, and Cu, Ag, and Pd on Fe. Additionally, cluster models were susceptible to effects not encountered in the plane-wave approach. Thermodynamic trends can be made more

  18. Grout pump selection process for the Transportable Grout Facility

    Energy Technology Data Exchange (ETDEWEB)

    McCarthy, D.; Treat, R.L.

    1985-01-01

    Selected low-level radioactive liquid wastes at Hanford will be disposed of by grouting. Grout is formed by mixing the liquid wastes with solid materials, including Portland cement, fly ash, and clay. The mixed grouts will be pumped to disposal sites (e.g., trenches and buried structures) where the grout will be allowed to harden and, thereby, immobilize the wastes. A Transportable Grout Facility (TGF) will be constructed and operated by Rockwell Hanford Operations to perform the grouting function. A critical component of the TGF is the grout pump. A preliminary review of pumping requirements identified reciprocating pumps and progressive cavity pumps as the two classes of pumps best suited for the application. The advantages and disadvantages of specific types of pumps within these two classes were subsequently investigated. As a result of this study, the single-screw, rotary positive displacement pump was identified as the best choice for the TGF application. This pump has a simple design, is easy to operate, is rugged, and is suitable for a radioactive environment. It produces a steady, uniform flow that simplifies suction and discharge piping requirements. This pump will likely require less maintenance than reciprocating pumps and can be disassembled rapidly and decontaminated easily. If the TGF should eventually require discharge pressures in excess of 500 psi, a double-acting duplex piston pump is recommended because it can operate at low speed, with only moderate flow rate fluctuations. However, the check valves, stuffing box, piston, suction, and discharge piping must be designed carefully to allow trouble-free operations.

  19. Financial applications of a Tabu search variable selection model

    Directory of Open Access Journals (Sweden)

    Zvi Drezner

    2001-01-01

    Full Text Available We illustrate how a comparatively new technique, a Tabu search variable selection model [Drezner, Marcoulides and Salhi (1999)], can be applied efficiently within finance when the researcher must select a subset of variables from among the whole set of explanatory variables under consideration. Several types of problems in finance, including corporate and personal bankruptcy prediction, mortgage and credit scoring, and the selection of variables for the Arbitrage Pricing Model, require the researcher to select a subset of variables from a larger set. In order to demonstrate the usefulness of the Tabu search variable selection model, we: (1) illustrate its efficiency in comparison to the main alternative search procedures, such as stepwise regression and the Maximum R2 procedure, and (2) show how a version of the Tabu search procedure may be implemented when attempting to predict corporate bankruptcy. We accomplish (2) by indicating that a Tabu search procedure increases the predictability of corporate bankruptcy by up to 10 percentage points in comparison to Altman's (1968) Z-Score model.
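
    A Tabu search for variable selection can be sketched as a local search that flips one variable in or out per move while forbidding recently used moves and remembering the best subset seen. The scoring function below is an invented stand-in for a model-fit criterion such as R2, not a real regression:

```python
def tabu_select(score, n_vars, n_iters=50, tenure=3):
    """Greedy tabu search over variable subsets: each move flips one
    variable, recently flipped variables are tabu for `tenure` moves,
    and the best subset ever visited is returned."""
    current = [False] * n_vars
    best, best_val = current[:], score(current)
    tabu = []
    for _ in range(n_iters):
        moves = [i for i in range(n_vars) if i not in tabu]
        def flipped(i):
            s = current[:]
            s[i] = not s[i]
            return s
        i = max(moves, key=lambda j: score(flipped(j)))
        current = flipped(i)
        tabu = (tabu + [i])[-tenure:]
        if score(current) > best_val:
            best, best_val = current[:], score(current)
    return best

# Illustrative additive score: variables 0 and 2 help the model,
# variables 1 and 3 hurt it.
value = [3.0, -1.0, 2.0, -0.5]
score = lambda subset: sum(v for v, keep in zip(value, subset) if keep)
best = tabu_select(score, 4)
```

    Even though the tabu list occasionally forces the search through worse subsets, the best-seen bookkeeping keeps the optimal subset {0, 2}.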

  20. Selecting an appropriate genetic evaluation model for selection in a developing dairy sector

    NARCIS (Netherlands)

    McGill, D.M.; Mulder, H.A.; Thomson, P.C.; Lievaart, J.J.

    2014-01-01

    This study aimed to identify genetic evaluation models (GEM) to accurately select cattle for milk production when only limited data are available. It is based on a data set from the Pakistani Sahiwal progeny testing programme which includes records from five government herds, each consisting of 100