WorldWideScience

Sample records for model selection process

  1. An integrated model for supplier selection process

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In today's highly competitive manufacturing environment, the supplier selection process has become one of the crucial activities in supply chain management. To select the best supplier(s), it is necessary not only to continuously track and benchmark supplier performance but also to make trade-offs between tangible and intangible factors, some of which may conflict. In this paper an integration of case-based reasoning (CBR), analytical network process (ANP) and linear programming (LP) is proposed to solve the supplier selection problem.
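
    The LP stage of such a pipeline can be sketched in a few lines. The suppliers, unit costs, and capacities below are invented for illustration; for this single-item structure, filling demand greedily in order of unit cost yields an optimal LP solution.

```python
# Toy linear-programming step of a supplier-selection pipeline: allocate a
# fixed demand among pre-screened suppliers at minimum cost, subject to each
# supplier's capacity. Names and numbers are illustrative, not the paper's.
DEMAND = 1000
suppliers = [  # (name, unit_cost, capacity)
    ("A", 4.2, 400),
    ("B", 3.8, 500),
    ("C", 5.0, 600),
]

def allocate(demand, suppliers):
    plan, remaining = {}, demand
    # greedy fill by unit cost is optimal for this single-item structure
    for name, cost, cap in sorted(suppliers, key=lambda s: s[1]):
        take = min(cap, remaining)
        if take:
            plan[name] = take
        remaining -= take
        if remaining == 0:
            break
    return plan

plan = allocate(DEMAND, suppliers)
print(plan)  # cheapest suppliers are filled first
```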

  2. Numerical Model based Reliability Estimation of Selective Laser Melting Process

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2014-01-01

    Selective laser melting is developing into a standard manufacturing technology with applications in various sectors. However, the process is still far from being at par with conventional processes such as welding and casting, the primary reason of which is the unreliability of the process. While...... of the selective laser melting process. A validated 3D finite-volume alternating-direction-implicit numerical technique is used to model the selective laser melting process, and is calibrated against results from single track formation experiments. Correlation coefficients are determined for process input...... parameters such as laser power, speed, beam profile, etc. Subsequently, uncertainties in the processing parameters are utilized to predict a range for the various outputs, using a Monte Carlo method based uncertainty analysis methodology, and the reliability of the process is established....
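
    The Monte Carlo uncertainty analysis described above can be sketched as follows: sample the process inputs from their assumed uncertainty distributions, push each sample through a surrogate process model, and report a range for the output. The surrogate function and all coefficients here are invented stand-ins, not the paper's calibrated finite-volume model.

```python
import random, statistics

# Monte Carlo propagation of input uncertainty (laser power, scan speed)
# through a toy surrogate for melt-pool depth. Distributions and the
# response surface are illustrative only.
random.seed(0)

def surrogate_melt_depth(power_w, speed_mm_s):
    # made-up response surface standing in for a calibrated process model
    return 0.8 * power_w / speed_mm_s

samples = []
for _ in range(10_000):
    p = random.gauss(200.0, 10.0)    # laser power (W) with assumed spread
    v = random.gauss(1000.0, 50.0)   # scan speed (mm/s) with assumed spread
    samples.append(surrogate_melt_depth(p, v))

samples.sort()
lo, hi = samples[len(samples) // 40], samples[-len(samples) // 40]  # ~2.5/97.5%
print(f"mean={statistics.mean(samples):.3f}, 95% range=({lo:.3f}, {hi:.3f})")
```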

  3. IT vendor selection model by using structural equation model & analytical hierarchy process

    Science.gov (United States)

    Maitra, Sarit; Dominic, P. D. D.

    2012-11-01

    Selecting and evaluating the right vendors is imperative for an organization's competitiveness in the global marketplace. Improper selection and evaluation of potential vendors can degrade an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research intends to develop a new hybrid model for the vendor selection process that supports better decision making. The proposed model provides a suitable tool for assisting decision makers and managers in making the right decisions and selecting the most suitable vendor. This paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five-step framework of the model was designed after a thorough literature study. The proposed hybrid model will be applied to a real-life case study to assess its effectiveness. In addition, the what-if analysis technique will be used for model validation purposes.
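
    The AHP step of such a hybrid model can be sketched by deriving criterion weights from a pairwise-comparison matrix with the row geometric-mean approximation of the principal eigenvector. The matrix below is a hypothetical judgment over three vendor criteria, not the paper's data.

```python
import math

# AHP priority derivation: row geometric means of the pairwise-comparison
# matrix, normalised to sum to 1. Judgments are hypothetical.
A = [
    [1,   3,   5],    # cost vs (cost, quality, delivery)
    [1/3, 1,   3],    # quality
    [1/5, 1/3, 1],    # delivery
]

geo = [math.prod(row) ** (1 / len(row)) for row in A]
weights = [g / sum(geo) for g in geo]
print([round(w, 3) for w in weights])  # cost receives the largest weight
```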

  4. ERP Software Selection Model using Analytic Network Process

    OpenAIRE

    Lesmana , Andre Surya; Astanti, Ririn Diar; Ai, The Jin

    2014-01-01

    During the implementation of Enterprise Resource Planning (ERP) in any company, one of the most important issues is the selection of ERP software that can satisfy the needs and objectives of the company. This issue is crucial since it may affect the duration of ERP implementation and the costs incurred for it. This research tries to construct a model of the selection of ERP software that is beneficial to the company in order to carry out the selection of the right ERP sof...

  5. Process chain modeling and selection in an additive manufacturing context

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn; Stolfi, Alessandro; Mischkot, Michael

    2016-01-01

    This paper introduces a new two-dimensional approach to modeling manufacturing process chains. This approach is used to consider the role of additive manufacturing technologies in process chains for a part with micro scale features and no internal geometry. It is shown that additive manufacturing...... evolving fields like additive manufacturing....

  6. Selected bibliography on the modeling and control of plant processes

    Science.gov (United States)

    Viswanathan, M. M.; Julich, P. M.

    1972-01-01

    A bibliography of information pertinent to the problem of simulating plants is presented. Detailed simulations of constituent pieces are necessary to justify simple models which may be used for analysis. Thus, this area of study is necessary to support the Earth Resources Program. The report sums up the present state of the problem of simulating vegetation. This area holds the hope of major benefits to mankind through understanding the ecology of a region and in improving agricultural yield.

  7. On a Robust MaxEnt Process Regression Model with Sample-Selection

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2018-04-01

    In a regression analysis, a sample-selection bias arises when a dependent variable is partially observed as a result of the sample selection. This study introduces a Maximum Entropy (MaxEnt) process regression model that assumes a MaxEnt prior distribution for its nonparametric regression function and finds that the MaxEnt process regression model includes the well-known Gaussian process regression (GPR) model as a special case. This special MaxEnt process regression model, i.e., the GPR model, is then generalized to obtain a robust sample-selection Gaussian process regression (RSGPR) model that deals with non-normal data in the sample selection. Various properties of the RSGPR model are established, including the stochastic representation, distributional hierarchy, and magnitude of the sample-selection bias. These properties are used in the paper to develop a hierarchical Bayesian methodology to estimate the model. This involves a simple and computationally feasible Markov chain Monte Carlo algorithm that avoids analytical or numerical derivatives of the log-likelihood function of the model. The performance of the RSGPR model in terms of sample-selection bias correction, robustness to non-normality, and prediction is demonstrated through simulation results that attest to its good finite-sample performance.
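
    The derivative-free MCMC idea mentioned above can be illustrated with a minimal random-walk Metropolis-Hastings sampler. The target here is a standard normal stand-in, not the RSGPR posterior; the proposal scale and chain length are illustrative.

```python
import math, random

# Random-walk Metropolis-Hastings on a toy log-density: no derivatives of
# the log-likelihood are needed, only pointwise evaluations.
random.seed(1)

def log_target(x):
    return -0.5 * x * x  # log of an unnormalised N(0, 1) density

x, chain = 0.0, []
for _ in range(20_000):
    prop = x + random.gauss(0.0, 1.0)            # symmetric proposal
    if math.log(random.random()) < log_target(prop) - log_target(x):
        x = prop                                  # accept
    chain.append(x)

burned = chain[5_000:]                            # discard burn-in
mean = sum(burned) / len(burned)
print(round(mean, 2))  # posterior mean estimate, near 0
```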

  8. Mental health courts and their selection processes: modeling variation for consistency.

    Science.gov (United States)

    Wolff, Nancy; Fabrikant, Nicole; Belenko, Steven

    2011-10-01

    Admission into mental health courts is based on a complicated and often variable decision-making process that involves multiple parties representing different expertise and interests. To the extent that eligibility criteria of mental health courts are more suggestive than deterministic, selection bias can be expected. Very little research has focused on the selection processes underpinning problem-solving courts even though such processes may dominate the performance of these interventions. This article describes a qualitative study designed to deconstruct the selection and admission processes of mental health courts. In this article, we describe a multi-stage, complex process for screening and admitting clients into mental health courts. The selection filtering model that is described has three eligibility screening stages: initial, assessment, and evaluation. The results of this study suggest that clients selected by mental health courts are shaped by the formal and informal selection criteria, as well as by the local treatment system.

  9. Bayesian model selection validates a biokinetic model for zirconium processing in humans

    Science.gov (United States)

    2012-01-01

    Background In radiation protection, biokinetic models for zirconium processing are of crucial importance in dose estimation and further risk analysis for humans exposed to this radioactive substance. They provide limiting values of detrimental effects and build the basis for applications in internal dosimetry, the prediction for radioactive zirconium retention in various organs as well as retrospective dosimetry. Multi-compartmental models are the tool of choice for simulating the processing of zirconium. Although easily interpretable, determining the exact compartment structure and interaction mechanisms is generally daunting. In the context of observing the dynamics of multiple compartments, Bayesian methods provide efficient tools for model inference and selection. Results We are the first to apply a Markov chain Monte Carlo approach to compute Bayes factors for the evaluation of two competing models for zirconium processing in the human body after ingestion. Based on in vivo measurements of human plasma and urine levels we were able to show that a recently published model is superior to the standard model of the International Commission on Radiological Protection. The Bayes factors were estimated by means of the numerically stable thermodynamic integration in combination with a recently developed copula-based Metropolis-Hastings sampler. Conclusions In contrast to the standard model the novel model predicts lower accretion of zirconium in bones. This results in lower levels of noxious doses for exposed individuals. Moreover, the Bayesian approach allows for retrospective dose assessment, including credible intervals for the initially ingested zirconium, in a significantly more reliable fashion than previously possible. All methods presented here are readily applicable to many modeling tasks in systems biology. PMID:22863152
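
    A hedged sketch of Bayes-factor model comparison: estimate each model's marginal likelihood by averaging its likelihood over draws from its prior, then take the ratio. The data and the two candidate models (normal likelihoods with different prior means) are toy stand-ins, not the zirconium biokinetic models compared in the paper, and prior sampling is far cruder than the thermodynamic integration used there.

```python
import math, random

# Crude Monte Carlo marginal likelihoods and their ratio (Bayes factor).
random.seed(2)
data = [1.1, 0.9, 1.3, 1.0, 0.8]

def loglik(mu):
    # log-likelihood of the data under N(mu, 1)
    return sum(-0.5 * (x - mu) ** 2 - 0.5 * math.log(2 * math.pi) for x in data)

def marginal(prior_mu, prior_sd, n=50_000):
    # estimate p(data | model) = E_prior[likelihood] by prior sampling
    return sum(math.exp(loglik(random.gauss(prior_mu, prior_sd)))
               for _ in range(n)) / n

bf = marginal(1.0, 0.5) / marginal(0.0, 0.5)   # model "mean near 1" vs "near 0"
print(bf > 1)  # data centred near 1 should favour the first model
```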

  10. Unraveling the sub-processes of selective attention: insights from dynamic modeling and continuous behavior.

    Science.gov (United States)

    Frisch, Simon; Dshemuchadse, Maja; Görner, Max; Goschke, Thomas; Scherbaum, Stefan

    2015-11-01

    Selective attention biases information processing toward stimuli that are relevant for achieving our goals. However, the nature of this bias is under debate: Does it solely rely on the amplification of goal-relevant information or is there a need for additional inhibitory processes that selectively suppress currently distracting information? Here, we explored the processes underlying selective attention with a dynamic, modeling-based approach that focuses on the continuous evolution of behavior over time. We present two dynamic neural field models incorporating the diverging theoretical assumptions. Simulations with both models showed that they make similar predictions with regard to response times but differ markedly with regard to their continuous behavior. Human data observed via mouse tracking as a continuous measure of performance revealed evidence for the model solely based on amplification but no indication of persisting selective distracter inhibition.
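
    The "amplification-only" account can be caricatured with two leaky accumulators: the goal signal boosts the target's input, while the distracter is never actively inhibited; it simply loses the competition. Parameters are illustrative and far simpler than the paper's neural field models.

```python
# Two leaky accumulators with identical input; only the target receives a
# goal-driven amplification term. No suppressive term acts on the distracter.
dt, leak, boost = 0.01, 1.0, 0.6
target = distract = 0.0
for _ in range(1000):
    target += dt * (-leak * target + 1.0 + boost)   # input plus amplification
    distract += dt * (-leak * distract + 1.0)       # input only
print(round(target, 2), round(distract, 2))  # target settles at a higher level
```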

  11. Probabilistic wind power forecasting with online model selection and warped gaussian process

    International Nuclear Information System (INIS)

    Kou, Peng; Liang, Deliang; Gao, Feng; Gao, Lin

    2014-01-01

    Highlights: • A new online ensemble model for the probabilistic wind power forecasting. • Quantifying the non-Gaussian uncertainties in wind power. • Online model selection that tracks the time-varying characteristic of wind generation. • Dynamically altering the input features. • Recursive update of base models. - Abstract: Based on the online model selection and the warped Gaussian process (WGP), this paper presents an ensemble model for the probabilistic wind power forecasting. This model provides the non-Gaussian predictive distributions, which quantify the non-Gaussian uncertainties associated with wind power. In order to follow the time-varying characteristics of wind generation, multiple time dependent base forecasting models and an online model selection strategy are established, thus adaptively selecting the most probable base model for each prediction. WGP is employed as the base model, which handles the non-Gaussian uncertainties in wind power series. Furthermore, a regime switch strategy is designed to modify the input feature set dynamically, thereby enhancing the adaptiveness of the model. In an online learning framework, the base models should also be time adaptive. To achieve this, a recursive algorithm is introduced, thus permitting the online updating of WGP base models. The proposed model has been tested on the actual data collected from both single and aggregated wind farms
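
    The online model-selection idea can be sketched as follows: keep several base forecasters, track each one's recent error with exponential forgetting, and let the currently best one produce the next forecast. The base models here are trivial stand-ins (persistence and a moving average), not warped Gaussian processes, and the series is invented.

```python
# Online selection among base forecasters by recursively updated error.
series = [5.0, 5.2, 5.1, 7.0, 7.2, 7.1, 7.3]

def persistence(hist):
    return hist[-1]

def moving_avg(hist, k=3):
    return sum(hist[-k:]) / min(k, len(hist))

models = {"persistence": persistence, "moving_avg": moving_avg}
errors = {name: 0.0 for name in models}
picks = []
for t in range(1, len(series)):
    hist, actual = series[:t], series[t]
    best = min(errors, key=errors.get)      # most probable base model so far
    picks.append(best)
    for name, f in models.items():          # recursive error update (forgetting)
        errors[name] = 0.9 * errors[name] + abs(f(hist) - actual)
print(picks)
```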

  12. The application of feature selection to the development of Gaussian process models for percutaneous absorption.

    Science.gov (United States)

    Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P

    2010-06-01

    The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset and to employ these methods, particularly feature selection, to determine the key physicochemical descriptors which exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance detection (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures to determine model quality were used. The inherently nonlinear nature of the skin data set was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (where the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity, compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information. 
However, it was also shown that significant synergy existed between certain parameters, and as such it
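
    A much-simplified stand-in for the feature-selection step is to rank physicochemical descriptors by their absolute Pearson correlation with the permeability target. Automatic relevance detection in GPR does this job far more subtly; the descriptor values below are invented for illustration only.

```python
import math

# Rank descriptors by |Pearson correlation| with log permeability.
features = {
    "logP":    [1.2, 2.5, 0.3, 3.1, 1.8],
    "melting": [120, 95, 150, 80, 110],
    "mol_wt":  [180, 250, 160, 300, 210],
}
log_kp = [-2.1, -1.4, -2.8, -1.0, -1.9]  # target values (invented)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

ranking = sorted(features, key=lambda f: -abs(pearson(features[f], log_kp)))
print(ranking)  # most correlated descriptors first
```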

  13. Consumer Decision Process in Restaurant Selection: An Application of the Stylized EKB Model

    Directory of Open Access Journals (Sweden)

    Eugenia Wickens

    2016-12-01

    Purpose – The aim of this paper is to propose a framework, based on empirical work, for understanding the consumer decision processes involved in the selection of a restaurant for leisure meals. Design/Methodology/Approach – An interpretive approach is taken in order to understand the intricacies of the process and its various stages. Six focus group interviews were conducted with consumers of various ages and occupations in the South East of the United Kingdom. Findings and implications – The stylized EKB model of the consumer decision process (Tuan-Pham & Higgins, 2005) was used as a framework for developing the different stages of the process. Two distinct parts of the process were identified. Occasion was found to be critical to the stage of problem recognition. In terms of evaluation of alternatives and, in particular, sensitivity to evaluative content, the research indicates that the regulatory focus theory of Tuan-Pham and Higgins (2005) applies to the decision of selecting a restaurant. Limitations – It is acknowledged that this exploratory study is based on a small sample in a single geographical area. Originality – The paper is the first application of the stylized EKB model, which takes into account the motivational dimensions of consumer decision making that are missing in other models. It concludes that the model may have broader applications to other research contexts.

  14. Modeling and Experimental Validation of the Electron Beam Selective Melting Process

    Directory of Open Access Journals (Sweden)

    Wentao Yan

    2017-10-01

    Electron beam selective melting (EBSM) is a promising additive manufacturing (AM) technology. The EBSM process consists of three major procedures: ① spreading a powder layer, ② preheating to slightly sinter the powder, and ③ selectively melting the powder bed. The highly transient multi-physics phenomena involved in these procedures pose a significant challenge for in situ experimental observation and measurement. To advance the understanding of the physical mechanisms in each procedure, we leverage high-fidelity modeling and post-process experiments. The models resemble the actual fabrication procedures, including ① a powder-spreading model using the discrete element method (DEM), ② a phase field (PF) model of powder sintering (solid-state sintering), and ③ a powder-melting (liquid-state sintering) model using the finite volume method (FVM). Comprehensive insights into all the major procedures are provided, which have rarely been reported. Preliminary simulation results (including powder particle packing within the powder bed, sintering neck formation between particles, and single-track defects) agree qualitatively with experiments, demonstrating the ability to understand the mechanisms and to guide the design and optimization of the experimental setup and manufacturing process.
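
    The melting procedure can be caricatured with a one-dimensional explicit finite-volume sketch: a moving heat source deposits energy into a row of cells while heat conducts between neighbours. All material numbers are illustrative and not taken from the paper's calibrated 3D models.

```python
# 1D explicit finite-volume heat conduction with a scanning heat source.
nx, dx, dt = 50, 1e-4, 1e-5   # number of cells, cell size (m), time step (s)
alpha = 5e-6                  # thermal diffusivity (m^2/s), illustrative
source = 2e7                  # beam heating rate in the current cell (K/s)
T = [300.0] * nx              # initial temperature field (K)

for step in range(200):
    beam = 1 + step // 5                      # beam scans across the cells
    Tn = T[:]                                 # boundaries held at 300 K
    for i in range(1, nx - 1):
        # explicit diffusion update (stable: alpha*dt/dx**2 = 0.005 < 0.5)
        Tn[i] = T[i] + dt * alpha * (T[i-1] - 2*T[i] + T[i+1]) / dx**2
    Tn[beam] += dt * source                   # energy input under the beam
    T = Tn
print(round(max(T)))  # peak temperature far above the 300 K baseline
```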

  15. Model of the best-of-N nest-site selection process in honeybees

    Science.gov (United States)

    Reina, Andreagiovanni; Marshall, James A. R.; Trianni, Vito; Bose, Thomas

    2017-05-01

    The ability of a honeybee swarm to select the best nest site plays a fundamental role in determining the future colony's fitness. To date, the nest-site selection process has mostly been modeled and theoretically analyzed for the case of binary decisions. However, when the number of alternative nests is larger than two, the decision-process dynamics qualitatively change. In this work, we extend previous analyses of a value-sensitive decision-making mechanism to a decision process among N nests. First, we present the decision-making dynamics in the symmetric case of N equal-quality nests. Then, we generalize our findings to a best-of-N decision scenario with one superior nest and N -1 inferior nests, previously studied empirically in bees and ants. Whereas previous binary models highlighted the crucial role of inhibitory stop-signaling, the key parameter in our new analysis is the relative time invested by swarm members in individual discovery and in signaling behaviors. Our new analysis reveals conflicting pressures on this ratio in symmetric and best-of-N decisions, which could be solved through a time-dependent signaling strategy. Additionally, our analysis suggests how ecological factors determining the density of suitable nest sites may have led to selective pressures for an optimal stable signaling ratio.
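
    The best-of-N dynamics can be sketched with a forward-Euler integration: the fraction of the swarm committed to each nest grows by quality-weighted discovery and recruitment and shrinks by abandonment. The rate constants and qualities below are illustrative, not the paper's fitted parameters, and stop-signaling is omitted.

```python
# Euler-integrated value-sensitive best-of-N nest choice.
dt, steps = 0.01, 20_000
quality = [0.9, 0.5, 0.5, 0.5]   # one superior nest and N-1 inferior ones
pop = [0.0] * len(quality)       # fraction of swarm committed to each nest

for _ in range(steps):
    u = 1.0 - sum(pop)           # uncommitted fraction
    pop = [
        # discovery + recruitment grow with quality; abandonment shrinks
        x + dt * (q * u - (1 - q) * x + q * x * u)
        for q, x in zip(quality, pop)
    ]
print([round(x, 2) for x in pop])  # the superior nest wins the largest share
```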

  16. Decision Support Model for Selection Technologies in Processing of Palm Oil Industrial Liquid Waste

    Science.gov (United States)

    Ishak, Aulia; Ali, Amir Yazid bin

    2017-12-01

    The palm oil industry continues to grow from year to year. The industry processes oil palm fruit into crude palm oil (CPO) and palm kernel oil (PKO). Together, these two products account for only about 30% of the raw material; the remaining 70% becomes palm oil waste. The amount of palm oil waste will increase in line with the development of the palm oil industry, and if it is not handled properly and effectively it will contribute significantly to environmental damage. Industrial activities, from raw materials through to finished products, can disrupt the lives of people around the factory. Many alternative technologies are available for processing this waste, but a recurring problem is that it is difficult to select and implement the most appropriate technology. The purpose of this research is to develop a database of waste processing technologies, to identify qualitative and quantitative criteria for selecting a technology, and to develop a Decision Support System (DSS) that can help make the decision. To achieve this objective, questionnaires were developed to identify waste processing technologies and to build an appropriate technology database. Data analysis is performed in the system using the Analytic Hierarchy Process (AHP), and the model is built using MySQL software, which can be used as a tool in the evaluation and selection of palm oil mill processing technology.

  17. Variable Selection for Nonparametric Gaussian Process Priors: Models and Computational Strategies.

    Science.gov (United States)

    Savitsky, Terrance; Vannucci, Marina; Sha, Naijun

    2011-02-01

    This paper presents a unified treatment of Gaussian process models that extends to data from the exponential dispersion family and to survival data. Our specific interest is in the analysis of data sets with predictors that have an a priori unknown form of possibly nonlinear associations to the response. The modeling approach we describe incorporates Gaussian processes in a generalized linear model framework to obtain a class of nonparametric regression models where the covariance matrix depends on the predictors. We consider, in particular, continuous, categorical and count responses. We also look into models that account for survival outcomes. We explore alternative covariance formulations for the Gaussian process prior and demonstrate the flexibility of the construction. Next, we focus on the important problem of selecting variables from the set of possible predictors and describe a general framework that employs mixture priors. We compare alternative MCMC strategies for posterior inference and achieve a computationally efficient and practical approach. We demonstrate performances on simulated and benchmark data sets.

  18. Modeling intermediate product selection under production and storage capacity limitations in food processing

    DEFF Research Database (Denmark)

    Kilic, Onur Alper; Akkerman, Renzo; Grunow, Martin

    2009-01-01

    In the food industry products are usually characterized by their recipes, which are specified by various quality attributes. For end products, this is given by customer requirements, but for intermediate products, the recipes can be chosen in such a way that raw material procurement costs and processing costs are minimized. However, this product selection process is bound by production and storage capacity limitations, such as the number and size of storage tanks or silos. In this paper, we present a mathematical programming approach that combines decision making on product selection with production and inventory planning, thereby considering the production and storage capacity limitations. The resulting model can be used to solve an important practical problem typical for many food processing industries.
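
    At toy scale, the product-selection subproblem can be brute-forced: pick one recipe per intermediate so that total cost is minimal while the number of distinct recipes fits the available storage tanks. Costs and limits are invented for illustration; the paper solves the combined problem with mathematical programming at realistic scale.

```python
from itertools import product as cartesian

# Exhaustive search over recipe choices under a storage-capacity limit.
recipes = {  # intermediate -> {recipe: cost}
    "I1": {"lean": 10, "rich": 14},
    "I2": {"lean": 11, "rich": 12},
    "I3": {"lean": 9,  "rich": 15},
}
MAX_TANKS = 1  # storage allows only one distinct recipe in stock

best = None
for choice in cartesian(*[recipes[i].items() for i in recipes]):
    names = {recipe for recipe, _ in choice}
    if len(names) > MAX_TANKS:
        continue  # violates the storage capacity limitation
    cost = sum(c for _, c in choice)
    if best is None or cost < best[1]:
        best = (tuple(r for r, _ in choice), cost)
print(best)  # cheapest feasible assignment of recipes
```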

  19. Development of Physics-Based Numerical Models for Uncertainty Quantification of Selective Laser Melting Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the proposed research is to characterize the influence of process parameter variability inherent to Selective Laser Melting (SLM) and performance effect...

  20. HOW DO STUDENTS SELECT SOCIAL NETWORKING SITES? AN ANALYTIC HIERARCHY PROCESS (AHP) MODEL

    Directory of Open Access Journals (Sweden)

    Chun Meng Tang

    2015-12-01

    Social networking sites are popular among university students, and students today are indeed spoiled for choice. New social networking sites sprout up amid popular existing ones, while some existing ones die out. Given the choice of so many social networking sites, how do students decide which one they will sign up for and stay on as an active user? The answer to this question is of interest to social networking site designers and marketers. The market for social networking sites is highly competitive. To maintain the current user base and continue to attract new users, how should social networking sites design their sites? Marketers spend a fairly large percentage of their marketing budget on social media marketing. To formulate an effective social media strategy, how well do marketers understand the users of social networking sites? Learning from website evaluation studies, this study intends to provide some answers to these questions by examining how university students decide between two popular social networking sites, Facebook and Twitter. We first developed an analytic hierarchy process (AHP) model of four main selection criteria and 12 sub-criteria, and then administered a questionnaire to a group of university students attending a course at a Malaysian university. AHP analyses of the responses from 12 respondents provided insight into the decision-making process involved in students’ selection of social networking sites. Of the four main criteria, privacy was the top concern, followed by functionality, usability, and content. The sub-criteria of key concern to the students were apps, revenue-generating opportunities, ease of use, and information security. Between Facebook and Twitter, the students thought that Facebook was the better choice. This information is useful for social networking site designers to design sites that are more relevant to their users’ needs, and for marketers to craft more effective
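
    An AHP study of this kind normally also reports a consistency check on the pairwise judgments. The sketch below approximates the priority vector, estimates lambda_max, and computes the consistency index (CI) and consistency ratio (CR); the judgment matrix over the four criteria is hypothetical, not the survey data.

```python
import math

# AHP consistency ratio from a pairwise-comparison matrix.
RI = {3: 0.58, 4: 0.90, 5: 1.12}   # Saaty's random index values

A = [  # privacy, functionality, usability, content (hypothetical judgments)
    [1,   2,   3,   4],
    [1/2, 1,   2,   3],
    [1/3, 1/2, 1,   2],
    [1/4, 1/3, 1/2, 1],
]
n = len(A)

geo = [math.prod(row) ** (1 / n) for row in A]   # row geometric means
w = [g / sum(geo) for g in geo]                  # normalised priorities

Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lam = sum(Aw[i] / w[i] for i in range(n)) / n    # lambda_max estimate
CI = (lam - n) / (n - 1)
CR = CI / RI[n]
print(f"CR = {CR:.3f}")  # CR below 0.1 is conventionally acceptable
```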

  1. The site selection process

    International Nuclear Information System (INIS)

    Kittel, J.H.

    1989-01-01

    One of the most arduous tasks associated with the management of radioactive wastes is the siting of new disposal facilities. Experience has shown that the performance of the disposal facility during and after disposal operations is critically dependent on the characteristics of the site itself. The site selection process consists of defining needs and objectives, identifying geographic regions of interest, screening and selecting candidate sites, collecting data on the candidate sites, and finally selecting the preferred site. Before the site selection procedures can be implemented, however, a formal legal system must be in place that defines broad objectives and, most importantly, clearly establishes responsibilities and accompanying authorities for the decision-making steps in the procedure. Site selection authorities should make every effort to develop trust and credibility with the public, local officials, and the news media. The responsibilities of supporting agencies must also be spelled out. Finally, a stable funding arrangement must be established so that activities such as data collection can proceed without interruption. Several examples, both international and within the US, are given

  2. Decision making model design for antivirus software selection using Factor Analysis and Analytical Hierarchy Process

    OpenAIRE

    Nurhayati Ai; Gautama Aditya; Naseer Muchammad

    2018-01-01

    Virus spread increased significantly through the internet in 2017. One method of protection is antivirus software. The wide variety of antivirus software on the market tends to create confusion among consumers, and selecting the right antivirus according to their needs has become difficult. This is the reason we conducted our research. We formulate a decision-making model for antivirus software consumers. The model is constructed using factor analysis and the AHP method. First we spread que...

  3. Decision making model design for antivirus software selection using Factor Analysis and Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Nurhayati Ai

    2018-01-01

    Virus spread increased significantly through the internet in 2017. One method of protection is antivirus software. The wide variety of antivirus software on the market tends to create confusion among consumers, and selecting the right antivirus according to their needs has become difficult. This is the reason we conducted our research. We formulate a decision-making model for antivirus software consumers. The model is constructed using factor analysis and the AHP method. First we spread questionnaires to consumers; from those questionnaires we identified 16 variables that need to be considered when selecting antivirus software. These 16 variables were then divided into 5 factors using the factor analysis method in SPSS software. These five factors are security, performance, internal, time and capacity. To rank these factors we spread questionnaires to 6 IT experts, and the data were analyzed using the AHP method. The result is that the performance factor gained the highest rank of all the factors. Thus, consumers can select antivirus software by judging the variables in the performance factor. Those variables are software loading speed, user-friendliness, no excessive memory use, thorough scanning, and fast and accurate virus scanning.

  4. A multidimensional analysis and modelling of flotation process for selected Polish lithological copper ore types

    Directory of Open Access Journals (Sweden)

    Niedoba Tomasz

    2017-01-01

    The flotation of copper ore is a complex technological process that depends on many parameters. It is therefore necessary to take the complexity of this phenomenon into account by choosing a multidimensional data analysis. The paper presents the results of modelling and analysis of the beneficiation process of sandstone copper ore. For the implementation of multidimensional statistical methods it was necessary to carry out a multi-level experiment, which included 4 parameters (size fraction, collector type and dosage, and flotation time). The main aim of the paper was the preparation of flotation process models for the recovery and the content of the metal in products. A MANOVA was implemented to explore the relationship between the dependent (β, ϑ, ε, η) and independent (d, t, cd, ct) variables. The design of the models was based on linear and nonlinear regression. The results of the variation analysis indicated the high significance of all parameters for the process. The average degree of matching of the linear models to the experimental data was 49% and 33% for copper content in the concentrate and tailings, respectively, and 47% for the recovery of copper minerals in both. The results confirm the complexity and stochasticity of the Polish copper ore flotation.
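
    The "degree of matching" figures quoted above are coefficients of determination. As a minimal illustration, the sketch below fits y = a + b·x by least squares and computes R²; the data points are invented, not the flotation measurements.

```python
# Least-squares line fit and coefficient of determination (R^2).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx
ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))  # residual SS
ss_tot = sum((y - my) ** 2 for y in ys)                       # total SS
r2 = 1 - ss_res / ss_tot
print(round(r2, 3))  # close to 1 for this nearly linear data
```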

  5. Selection of Prediction Methods for Thermophysical Properties for Process Modeling and Product Design of Biodiesel Manufacturing

    DEFF Research Database (Denmark)

    Su, Yung-Chieh; Liu, Y. A.; Díaz Tovar, Carlos Axel

    2011-01-01

    To optimize biodiesel manufacturing, many reported studies have built simulation models to quantify the relationship between operating conditions and process performance. For mass and energy balance simulations, it is essential to know the four fundamental thermophysical properties of the feed oil...... prediction methods on our group Web site (www.design.che.vt.edu) for the reader to download without charge....

  6. Process-based models of feeding and prey selection in larval fish

    DEFF Research Database (Denmark)

    Fiksen, O.; MacKenzie, Brian

    2002-01-01

    believed to be important to prey selectivity and environmental regulation of feeding in fish. We include the sensitivity of prey to the hydrodynamic signal generated by approaching larval fish and a simple model of the potential loss of prey due to turbulence whereby prey is lost if it leaves...... jig dry wt l(-1). The spatio-temporal fluctuation of turbulence (tidal cycle) and light (sun height) over the bank generates complex structure in the patterns of food intake of larval fish, with different patterns emerging for small and large larvae....

  7. Towards the Significance of Decision Aid in Building Information Modeling (BIM) Software Selection Process

    Directory of Open Access Journals (Sweden)

    Omar Mohd Faizal

    2014-01-01

Full Text Available Building Information Modeling (BIM) has been considered a solution to numerous problems in the construction industry, such as delays, increased lead times and increased costs. This is due to the concept and characteristics of BIM, which will reshape the way construction project teams work together to increase productivity and improve the final project outcomes (cost, time, quality, safety, functionality, maintainability, etc.). As a result, the construction industry has witnessed numerous BIM software packages available on the market. Each of these packages offers different functions and features. Furthermore, the adoption of BIM requires high investment in software, hardware and also training expenses. Thus, there is a need for a decision aid for appropriate BIM software selection that fulfills the project needs. However, research indicates that few studies attempt to guide decisions in the BIM software selection problem. Thus, this paper highlights the importance of decision making and support for BIM software selection, as it is vital to increasing productivity in construction projects throughout the building lifecycle.

  8. A Selection Approach for Optimized Problem-Solving Process by Grey Relational Utility Model and Multicriteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Chih-Kun Ke

    2012-01-01

Full Text Available In business enterprises, especially the manufacturing industry, various problem situations may occur during the production process. A situation denotes an evaluation point to determine the status of a production process. A problem may occur if there is a discrepancy between the actual situation and the desired one. Thus, a problem-solving process is often initiated to achieve the desired situation. In the process, how to determine the action that needs to be taken to resolve the situation becomes an important issue. Therefore, this work uses a selection approach for an optimized problem-solving process to assist workers in taking a reasonable action. A grey relational utility model and a multicriteria decision analysis are used to determine the optimal selection order of candidate actions. The selection order is presented to the worker as an adaptive recommended solution. The worker chooses a reasonable problem-solving action based on the selection order. This work uses a high-tech company’s knowledge base log as the analysis data. Experimental results demonstrate that the proposed selection approach is effective.
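A minimal sketch of the grey relational idea named in this record: rank candidate actions by their grey relational grade, i.e. their closeness to an ideal reference series. The criteria, scores and distinguishing coefficient below are hypothetical, not data from the paper.

```python
import numpy as np

def grey_relational_grade(scores, rho=0.5):
    """Grey relational grades for a (n_alternatives, n_criteria) matrix of
    larger-is-better scores in [0, 1]; rho is the distinguishing coefficient."""
    ref = scores.max(axis=0)                  # ideal reference series
    delta = np.abs(scores - ref)              # deviation sequences
    dmin, dmax = delta.min(), delta.max()     # global extrema
    coeff = (dmin + rho * dmax) / (delta + rho * dmax)
    return coeff.mean(axis=1)                 # grade = mean coefficient per row

actions = np.array([
    [0.9, 0.7, 0.8],   # candidate action A: cost, speed, quality scores
    [0.6, 0.9, 0.7],   # candidate action B
    [0.5, 0.5, 0.6],   # candidate action C
])
grades = grey_relational_grade(actions)
ranking = np.argsort(-grades)                 # best action first
```

The ranking (here A, then B, then C) is the "selection order" presented to the worker.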

  9. The effect of addition of selected carrageenans on viscoelastic properties of model processed cheese spreads

    Directory of Open Access Journals (Sweden)

    Michaela Černíková

    2007-01-01

Full Text Available The effect of 0.25% w/w κ-carrageenan and ι-carrageenan on the viscoelastic properties of processed cheese was studied using model samples containing 40% w/w dry matter and 45 and 50% w/w fat in dry matter. Experimental samples of processed cheese were evaluated after 14 days of storage at the temperature of 6 ± 2 °C. Basic parameters of the processed cheese samples under study (i.e. their dry matter content and pH) were not different (P ≥ 0.05). There were no statistically significant differences in the values of storage modulus G´ [Pa], loss modulus G'' [Pa] and tangent of phase shift angle tan δ [-] at the reference frequency of 1 Hz between processed cheese with κ-carrageenan applied in the form of powder and in the form of aqueous dispersion (P ≥ 0.05). The addition of 0.25% w/w κ-carrageenan and ι-carrageenan (in the powder form) resulted in an increase in storage (G´) and loss (G'') moduli and a decrease in the values of tan δ (P < 0.05). As compared with the control (i.e. without added carrageenans), samples of processed cheese became firmer. Iota-carrageenan added in the powder form at a concentration of 0.25% w/w had a more pronounced effect on the firmness of the processed cheese under study than κ-carrageenan (P < 0.05).

  10. X-33 Telemetry Best Source Selection, Processing, Display, and Simulation Model Comparison

    Science.gov (United States)

    Burkes, Darryl A.

    1998-01-01

The X-33 program requires the use of multiple telemetry ground stations to cover the launch, ascent, transition, descent, and approach phases for the flights from Edwards AFB to landings at Dugway Proving Grounds, UT and Malmstrom AFB, MT. This paper will discuss the X-33 telemetry requirements and design, including information on fixed and mobile telemetry systems, best source selection, and support for Range Safety Officers. A best source selection system will be utilized to automatically determine the best source based on the frame synchronization status of the incoming telemetry streams. These systems will be used to select the best source at the landing sites and at NASA Dryden Flight Research Center to determine the overall best source between the launch site, intermediate sites, and landing site sources. The best source at the landing sites will be decommutated to display critical flight safety parameters for the Range Safety Officers. The overall best source will be sent to the Lockheed Martin's Operational Control Center at Edwards AFB for performance monitoring by X-33 program personnel and for monitoring of critical flight safety parameters by the primary Range Safety Officer. The real-time telemetry data (received signal strength, etc.) from each of the primary ground stations will also be compared during each mission with simulation data generated using the Dynamic Ground Station Analysis software program. An overall assessment of the accuracy of the model will occur after each mission. Acknowledgment: The work described in this paper was NASA supported through cooperative agreement NCC8-115 with Lockheed Martin Skunk Works.

  11. On the selection of significant variables in a model for the deteriorating process of facades

    Science.gov (United States)

    Serrat, C.; Gibert, V.; Casas, J. R.; Rapinski, J.

    2017-10-01

In previous works the authors of this paper have introduced a predictive system that uses survival analysis techniques for the study of time-to-failure in the facades of a building stock. The approach is population based, in order to obtain information on the evolution of the stock across time, and to help the manager in the decision making process on global maintenance strategies. For the decision making it is crucial to determine those covariates -like materials, morphology and characteristics of the facade, orientation or environmental conditions- that play a significant role in the progression of different failures. The proposed platform also incorporates an open source GIS plugin that includes survival and test modules that allow the investigator to model the time until a lesion appears, taking into account the variables collected during the inspection process. The aim of this paper is twofold: a) to briefly introduce the predictive system, as well as the inspection and the analysis methodologies and b) to introduce and illustrate the modeling strategy for the deteriorating process of an urban front. The illustration will be focused on the city of L’Hospitalet de Llobregat (Barcelona, Spain) in which more than 14,000 facades have been inspected and analyzed.

  12. Bubble point pressures of the selected model system for CatLiq® bio-oil process

    DEFF Research Database (Denmark)

    Toor, Saqib Sohail; Rosendahl, Lasse; Baig, Muhammad Noman

    2010-01-01

The CatLiq® process is a second generation catalytic liquefaction process for the production of bio-oil from WDGS (Wet Distillers Grains with Solubles) at subcritical conditions (280-350 °C and 225-250 bar) in the presence of a homogeneous alkaline and a heterogeneous Zirconia catalyst...... In this work, the bubble point pressures of a selected model mixture (CO2 + H2O + Ethanol + Acetic acid + Octanoic acid) were measured to investigate the phase boundaries of the CatLiq® process. The bubble points were measured in the JEFRI-DBR high pressure PVT phase behavior system. The experimental results......

  13. On selecting a prior for the precision parameter of Dirichlet process mixture models

    Science.gov (United States)

    Dorazio, R.M.

    2009-01-01

In hierarchical mixture models the Dirichlet process is used to specify latent patterns of heterogeneity, particularly when the distribution of latent parameters is thought to be clustered (multimodal). The parameters of a Dirichlet process include a precision parameter α and a base probability measure G0. In problems where α is unknown and must be estimated, inferences about the level of clustering can be sensitive to the choice of prior assumed for α. In this paper an approach is developed for computing a prior for the precision parameter α that can be used in the presence or absence of prior information about the level of clustering. This approach is illustrated in an analysis of counts of stream fishes. The results of this fully Bayesian analysis are compared with an empirical Bayes analysis of the same data and with a Bayesian analysis based on an alternative commonly used prior.
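The role of the precision parameter can be illustrated with a short Chinese restaurant process simulation: the expected number of clusters among n observations grows roughly as α·log(1 + n/α), which is why inferences about clustering are sensitive to the prior placed on α. This is a generic illustration, not the prior-construction method of the paper.

```python
import math
import random

def crp_num_clusters(n, alpha, rng):
    """Seat n customers in a CRP(alpha) and return the number of clusters."""
    counts = []                                  # current cluster sizes
    for i in range(n):                           # i customers already seated
        if rng.random() < alpha / (alpha + i):
            counts.append(1)                     # open a new cluster
        else:
            r = rng.uniform(0, i)                # join an existing cluster
            acc = 0.0                            # with prob proportional to size
            for k in range(len(counts)):
                acc += counts[k]
                if r < acc or k == len(counts) - 1:
                    counts[k] += 1
                    break
    return len(counts)

rng = random.Random(1)
n, alpha = 500, 2.0
avg = sum(crp_num_clusters(n, alpha, rng) for _ in range(200)) / 200
approx = alpha * math.log(1 + n / alpha)         # ~ E[number of clusters]
```

Doubling α roughly doubles the expected cluster count for large n, so a vague prior on α translates into a vague prior on the level of clustering.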

  14. Preparatory selection of sterilization regime for canned Natural Atlantic Mackerel with oil based on developed mathematical models of the process

    Directory of Open Access Journals (Sweden)

    Maslov A. A.

    2016-12-01

Full Text Available Definition of preparatory parameters for the sterilization regime of canned "Natural Atlantic Mackerel with Oil" is the aim of the current study. The PRSC software developed at the department of automation and computer engineering is used for the preparatory selection. To determine the parameters of the process model, the pre-trial process of sterilization and cooling in water with backpressure of canned "Natural Atlantic Mackerel with Oil" in can N 3 has been performed in the laboratory autoclave AVK-30M. Information about the temperature in the autoclave sterilization chamber and in the can with product has been gathered using Ellab TrackSense PRO loggers. From the obtained information, three transfer functions for the product model have been identified: in the least heated area of the autoclave, the average heated and the most heated. Using this information, time dependences of the temperature in the sterilization chamber have been built in the PRSC programme. The model of the sterilization process of canned "Natural Atlantic Mackerel with Oil" has been obtained after the pre-trial process. Then, in automatic mode, the sterilization regime of canned "Natural Atlantic Mackerel with Oil" has been selected using the value of the actual effect close to the normative sterilizing effect (5.9 conditional minutes). Furthermore, in this study a step-mode sterilization of canned "Natural Atlantic Mackerel with Oil" has been selected. Utilization of step-mode sterilization with the maximum temperature equal to 125 °C in the sterilization chamber allows reducing the process duration by 10 %. However, the application of this regime in practice requires additional research. The described approach based on the developed mathematical models of the process makes it possible to obtain optimal step and variable canned food sterilization regimes with high energy efficiency and product quality.
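For context, the sterilizing effect of a regime is conventionally computed as an F-value, integrating the lethality 10^((T − Tref)/z) over the product temperature profile. The sketch below uses the conventional 121.1 °C reference and z = 10 °C with a toy square profile; these are generic canning conventions, not parameters taken from the study.

```python
def f_value(temps_c, dt_min=1.0, t_ref=121.1, z=10.0):
    """F-value (conditional minutes): sum of 10**((T - Tref)/z) * dt over
    the product temperature profile sampled every dt_min minutes."""
    return sum(10.0 ** ((temp - t_ref) / z) for temp in temps_c) * dt_min

# Toy square profile: 40 min holding at 120 degC (heating/cooling ignored).
profile = [120.0] * 40
F = f_value(profile)
```

Because 120 °C is below the 121.1 °C reference, each minute contributes less than one conditional minute, so F comes out below the 40 min of holding time; a regime is accepted when F meets the normative effect (5.9 conditional minutes in this record).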

  15. THM-coupled modeling of selected processes in argillaceous rock relevant to rock mechanics

    International Nuclear Information System (INIS)

    Czaikowski, Oliver

    2012-01-01

    Scientific investigations in European countries other than Germany concentrate not only on granite formations (Switzerland, Sweden) but also on argillaceous rock formations (France, Switzerland, Belgium) to assess their suitability as host and barrier rock for the final storage of radioactive waste. In Germany, rock salt has been under thorough study as a host rock over the past few decades. According to a study by the German Federal Institute for Geosciences and Natural Resources, however, not only salt deposits but also argillaceous rock deposits are available at relevant depths and of extensions in space which make final storage of high-level radioactive waste basically possible in Germany. Equally qualified findings about the suitability/unsuitability of non-saline rock formations require fundamental studies to be conducted nationally because of the comparatively low level of knowledge. The article presents basic analyses of coupled mechanical and hydraulic properties of argillaceous rock formations as host rock for a repository. The interaction of various processes is explained on the basis of knowledge derived from laboratory studies, and open problems are deduced. For modeling coupled processes, a simplified analytical computation method is proposed and compared with the results of numerical simulations, and the limits to its application are outlined. (orig.)

  16. Effects of binge drinking and hangover on response selection sub-processes-a study using EEG and drift diffusion modeling.

    Science.gov (United States)

    Stock, Ann-Kathrin; Hoffmann, Sven; Beste, Christian

    2017-09-01

Effects of binge drinking on cognitive control and response selection are increasingly recognized in research on alcohol (ethanol) effects. Yet, little is known about how those processes are modulated by hangover effects. Given that acute intoxication and hangover seem to be characterized by partly divergent effects and mechanisms, further research on this topic is needed. In the current study, we hence investigated this with a special focus on potentially differential effects of alcohol intoxication and subsequent hangover on sub-processes involved in the decision to select a response. We do so by combining drift diffusion modeling of behavioral data with neurophysiological (EEG) data. Contrary to what might be expected, the results do not show an impairment of all assessed measures. Instead, they show specific effects of high-dose alcohol intoxication and hangover on selective drift diffusion model and EEG parameters (as compared to a sober state). While the acute intoxication induced by binge drinking decreased the drift rate, it was increased by the subsequent hangover, indicating more efficient information accumulation during hangover. Further, the non-decisional processes of information encoding decreased with intoxication, but not during hangover. These effects were reflected in modulations of the N2, P1 and N1 event-related potentials, which reflect conflict monitoring, perceptual gating and attentional selection processes, respectively. As regards the functional neuroanatomical architecture, the anterior cingulate cortex (ACC) as well as occipital networks seem to be modulated. Even though alcohol is known to have broad neurobiological effects, its effects on cognitive processes are rather specific. © 2016 Society for the Study of Addiction.
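A minimal drift diffusion simulation clarifies the two parameters the record contrasts: the drift rate v (rate of evidence accumulation, higher means more accurate decisions) and the non-decision time Ter (encoding and motor components added on top of the first-passage time). All parameter values below are illustrative, not estimates from the study.

```python
import random

def ddm_trial(v, a=1.0, z=0.5, ter=0.3, dt=0.001, sigma=1.0, rng=random):
    """One drift diffusion trial: evidence starts at z and accumulates at
    drift v with Gaussian noise until it hits a (correct) or 0 (error).
    Returns (correct, reaction_time); ter is the non-decision time."""
    x, t = z, 0.0
    while 0.0 < x < a:
        x += v * dt + sigma * dt ** 0.5 * rng.gauss(0.0, 1.0)
        t += dt
    return x >= a, ter + t

rng = random.Random(7)
# A lower drift rate (as under acute intoxication in the record) yields
# lower accuracy than a higher one (as during hangover).
acc_low = sum(ddm_trial(0.5, rng=rng)[0] for _ in range(500)) / 500
acc_high = sum(ddm_trial(2.5, rng=rng)[0] for _ in range(500)) / 500
```

The same machinery separates intoxication effects on ter (encoding) from effects on v (accumulation), which is the decomposition the study exploits.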

  17. Modeling of the thermal physical process and study on the reliability of linear energy density for selective laser melting

    Directory of Open Access Journals (Sweden)

    Zhaowei Xiang

    2018-06-01

Full Text Available A finite element model considering volume shrinkage with the powder-to-dense process of the powder layer in selective laser melting (SLM) is established. A comparison between models that do and do not consider volume shrinkage or the powder-to-dense process is carried out. Further, a parametric analysis of laser power and scan speed is conducted and the reliability of linear energy density as a design parameter is investigated. The results show that the established model is an effective method and has better accuracy with respect to the temperature distribution and the length and depth of the molten pool. The maximum temperature is more sensitive to laser power than to scan speed. The maximum heating rate and cooling rate increase with increasing scan speed at constant laser power and increase with increasing laser power at constant scan speed as well. The simulation and experimental results reveal that linear energy density is not always reliable when used as a design parameter in SLM. Keywords: Selective laser melting, Volume shrinkage, Powder-to-dense process, Numerical modeling, Thermal analysis, Linear energy density

  18. Modeling of the thermal physical process and study on the reliability of linear energy density for selective laser melting

    Science.gov (United States)

    Xiang, Zhaowei; Yin, Ming; Dong, Guanhua; Mei, Xiaoqin; Yin, Guofu

    2018-06-01

A finite element model considering volume shrinkage with the powder-to-dense process of the powder layer in selective laser melting (SLM) is established. A comparison between models that do and do not consider volume shrinkage or the powder-to-dense process is carried out. Further, a parametric analysis of laser power and scan speed is conducted and the reliability of linear energy density as a design parameter is investigated. The results show that the established model is an effective method and has better accuracy with respect to the temperature distribution and the length and depth of the molten pool. The maximum temperature is more sensitive to laser power than to scan speed. The maximum heating rate and cooling rate increase with increasing scan speed at constant laser power and increase with increasing laser power at constant scan speed as well. The simulation and experimental results reveal that linear energy density is not always reliable when used as a design parameter in SLM.

  19. Supplier Selection Process Using ELECTRE I Decision Model and an Application in the Retail Sector

    Directory of Open Access Journals (Sweden)

    Oğuzhan Yavuz

    2013-12-01

Full Text Available The supplier selection problem is one of the main topics for today's businesses. The supplier selection problem within supply chain management activities is very important for businesses, particularly those operating in the retail sector. Thus, in this study, the supplier selection problem was addressed by ranking, in order of importance, the energy drink suppliers of a food business in the retail sector. Cost, delivery, quality and flexibility variables were used to select suppliers, and the ELECTRE I method, one of the multicriteria decision methods, was used to rank suppliers according to these variables. Which suppliers are more important for the food company was determined by ranking suppliers according to their computed net superior and net inferior values. The results obtained were presented in tables, together with the steps of the procedure.
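The core ELECTRE I steps mentioned here, concordance and discordance analysis against thresholds, can be sketched as follows; the supplier scores, weights and thresholds are hypothetical, not data from the study.

```python
import numpy as np

# Hypothetical decision matrix: rows are suppliers, columns are the four
# criteria from the record (cost, delivery, quality, flexibility), already
# rescaled so that larger is better.
scores = np.array([
    [0.8, 0.7, 0.9, 0.6],   # supplier 1
    [0.6, 0.9, 0.7, 0.8],   # supplier 2
    [0.5, 0.6, 0.6, 0.5],   # supplier 3 (dominated)
])
w = np.array([0.4, 0.2, 0.3, 0.1])   # criterion weights, sum to 1
n = len(scores)

C = np.zeros((n, n))   # concordance: total weight of criteria where i >= j
D = np.zeros((n, n))   # discordance: worst disadvantage of i vs j, normalized
for i in range(n):
    for j in range(n):
        if i == j:
            continue
        diff = scores[j] - scores[i]
        C[i, j] = w[scores[i] >= scores[j]].sum()
        D[i, j] = max(0.0, diff.max()) / np.abs(diff).max() if np.abs(diff).max() > 0 else 0.0

# i outranks j when concordance is high enough and discordance low enough.
c_hat, d_hat = 0.6, 0.4
outranks = (C >= c_hat) & (D <= d_hat)
np.fill_diagonal(outranks, False)
```

Here suppliers 1 and 2 both outrank supplier 3, while neither outranks the other; ELECTRE I deliberately leaves such pairs incomparable rather than forcing a total order.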

  20. ARM Mentor Selection Process

    Energy Technology Data Exchange (ETDEWEB)

    Sisterson, D. L. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-10-01

    The Atmospheric Radiation Measurement (ARM) Program was created in 1989 with funding from the U.S. Department of Energy (DOE) to develop several highly instrumented ground stations to study cloud formation processes and their influence on radiative transfer. In 2003, the ARM Program became a national scientific user facility, known as the ARM Climate Research Facility. This scientific infrastructure provides for fixed sites, mobile facilities, an aerial facility, and a data archive available for use by scientists worldwide through the ARM Climate Research Facility—a scientific user facility. The ARM Climate Research Facility currently operates more than 300 instrument systems that provide ground-based observations of the atmospheric column. To keep ARM at the forefront of climate observations, the ARM infrastructure depends heavily on instrument scientists and engineers, also known as lead mentors. Lead mentors must have an excellent understanding of in situ and remote-sensing instrumentation theory and operation and have comprehensive knowledge of critical scale-dependent atmospheric processes. They must also possess the technical and analytical skills to develop new data retrievals that provide innovative approaches for creating research-quality data sets. The ARM Climate Research Facility is seeking the best overall qualified candidate who can fulfill lead mentor requirements in a timely manner.

  1. Social Influence Interpretation of Interpersonal Processes and Team Performance Over Time Using Bayesian Model Selection

    NARCIS (Netherlands)

    Johnson, Alan R.; van de Schoot, Rens; Delmar, Frédéric; Crano, William D.

    The team behavior literature is ambiguous about the relations between members’ interpersonal processes—task debate and task conflict—and team performance. From a social influence perspective, we show why members’ interpersonal processes determine team performance over time in small groups. Together,

  2. Waste package materials selection process

    International Nuclear Information System (INIS)

    Roy, A.K.; Fish, R.L.; McCright, R.D.

    1994-01-01

The Office of Civilian Radioactive Waste Management (OCRWM) of the United States Department of Energy (USDOE) is evaluating a site at Yucca Mountain in Southern Nevada to determine its suitability as a mined geologic disposal system (MGDS) for the disposal of high-level nuclear waste (HLW). The B&W Fuel Company (BWFC), as a part of the Management and Operating (M&O) team in support of the Yucca Mountain Site Characterization Project (YMP), is responsible for designing and developing the waste package for this potential repository. As part of this effort, Lawrence Livermore National Laboratory (LLNL) is responsible for testing materials and developing models for the materials to be used in the waste package. This paper is aimed at presenting the selection process for materials needed in fabricating the different components of the waste package.

  3. A Generalized Process Model of Human Action Selection and Error and its Application to Error Prediction

    Science.gov (United States)

    2014-07-01

Macmillan & Creelman, 2005). This is a quite high degree of discriminability and it means that when the decision model predicts a probability of...ROC analysis. Pattern Recognition Letters, 27(8), 861-874. Macmillan, N. A., & Creelman, C. D. (2005). Detection

  4. The model selection in the process of teambuilding for the management of the organization

    OpenAIRE

    Sergey Petrov

    2010-01-01

Improving the competitiveness of organizations, which is necessary for their success in a market economy, is no longer possible through material resources alone. This implies the need for a qualitatively new approach to human capital. The author reviews approaches to team building and suggests a team management model based on situation cases in which a team, organized in one way or another, reaches its goal.

  5. Mathematical Modeling, Simulation and Optimization for Selected Robotic Processes related to Manufacturing of Unique Concrete Elements

    DEFF Research Database (Denmark)

    Cortsen, Jens

This thesis presents our work in the Danish project Unique Concrete Structures (Unikabeton) and the EU project TailorMade Concrete Structures (TailorCrete) on automating selected processes for the construction of unique concrete buildings. We focus primarily on robotic milling of complex...... doubly curved reinforcement meshes with two cooperating robots, where the sub-processes are bending, transporting and binding of reinforcement bars. The robot installation is based on an off-line simulation program with dynamic simulation support for bar deflection and simultaneous robot control in order to reduce...... the production time. The two minor processes, hot-wire cutting of EPS blocks before milling and spraying of release agent onto the finished formwork blocks, are also presented after the two main processes. Finally, we present a number of real-life concrete structures based on our work in this thesis...

  6. Modeling the Supply Process Using the Application of Selected Methods of Operational Analysis

    Science.gov (United States)

    Chovancová, Mária; Klapita, Vladimír

    2017-03-01

The supply process is one of the most important enterprise activities. All raw materials, intermediate products and finished products moved within an enterprise are the subject of inventory management, and their effective management can significantly improve the enterprise's position on the market. For that reason, inventory needs to be managed, monitored, evaluated and influenced. The paper deals with utilizing methods of operational analysis in the field of inventory management in terms of achieving economic efficiency while ensuring the required customer service level.
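The record does not name the specific operational-analysis methods used, but a classic tool of this kind for inventory management is the economic order quantity (EOQ); the following is a generic illustration with hypothetical figures, not the paper's model.

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Wilson EOQ formula: order quantity minimizing the sum of annual
    ordering cost and annual holding cost."""
    return math.sqrt(2.0 * annual_demand * order_cost / holding_cost)

# Hypothetical figures: 12,000 units/year, 50 per order, 2.4 per unit-year.
q = eoq(annual_demand=12000, order_cost=50.0, holding_cost=2.4)
ordering = 12000 / q * 50.0   # annual ordering cost at the optimum
holding = q / 2.0 * 2.4       # annual holding cost at the optimum
```

At the optimum the two cost components are equal, which is the standard sanity check on an EOQ calculation.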

  7. Sexual selection: Another Darwinian process.

    Science.gov (United States)

    Gayon, Jean

    2010-02-01

    the Darwin-Wallace controversy was that most Darwinian biologists avoided the subject of sexual selection until at least the 1950s, Ronald Fisher being a major exception. This controversy still deserves attention from modern evolutionary biologists, because the modern approach inherits from both Darwin and Wallace. The modern approach tends to present sexual selection as a special aspect of the theory of natural selection, although it also recognizes the big difficulties resulting from the inevitable interaction between these two natural processes of selection. And contra Wallace, it considers mate choice as a major process that deserves a proper evolutionary treatment. The paper's conclusion explains why sexual selection can be taken as a test case for a proper assessment of "Darwinism" as a scientific tradition. Darwin's and Wallace's attitudes towards sexual selection reveal two different interpretations of the principle of natural selection: Wallace's had an environmentalist conception of natural selection, whereas Darwin was primarily sensitive to the element of competition involved in the intimate mechanism of any natural process of selection. Sexual selection, which can lack adaptive significance, reveals this exemplarily. 2010 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  8. Vibration and acoustic frequency spectra for industrial process modeling using selective fusion multi-condition samples and multi-source features

    Science.gov (United States)

    Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen

    2018-01-01

    Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by fusing valued information selectively from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct mechanical vibration and acoustic frequency spectra of a data-driven industrial process parameter model based on selective fusion multi-condition samples and multi-source features. Multi-layer SEN (MLSEN) strategy is used to simulate the domain expert cognitive process. Genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model based on each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing the selective information fusion process based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.
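The adaptive weighted fusion step described above can be sketched as combining sub-model predictions with weights inversely proportional to each sub-model's validation error; the sub-models and data below are synthetic stand-ins, not the paper's SEN sub-models.

```python
import numpy as np

def fuse(test_preds, y_val, val_preds):
    """test_preds, val_preds: (n_models, n_samples) prediction matrices.
    Weights are proportional to inverse validation RMSE and sum to one."""
    rmse = np.sqrt(((val_preds - y_val) ** 2).mean(axis=1))
    w = (1.0 / rmse) / (1.0 / rmse).sum()
    return w, w @ test_preds          # fused prediction per sample

y_val = np.array([1.0, 2.0, 3.0, 4.0])
val_preds = np.vstack([
    y_val + 0.1,    # accurate sub-model (small validation error)
    y_val + 1.0,    # biased sub-model (large validation error)
])
w, fused = fuse(val_preds, y_val, val_preds)
```

The accurate sub-model dominates the fusion, so the fused error stays close to the best sub-model's; branch-and-bound, as in the paper, would additionally prune which sub-models enter this combination at all.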

  9. Modeling Natural Selection

    Science.gov (United States)

    Bogiages, Christopher A.; Lotter, Christine

    2011-01-01

    In their research, scientists generate, test, and modify scientific models. These models can be shared with others and demonstrate a scientist's understanding of how the natural world works. Similarly, students can generate and modify models to gain a better understanding of the content, process, and nature of science (Kenyon, Schwarz, and Hug…

  10. Selection Process of ERP Systems

    OpenAIRE

    Molnár, Bálint; Szabó, Gyula; Benczúr, András

    2013-01-01

Background: The application and introduction of ERP systems have become a central issue for the management and operation of enterprises. Competition in the market forces enterprises to improve and optimize their business processes in order to increase their efficiency and effectiveness and to better manage resources outside the company. The primary task of ERP systems is to achieve the before-mentioned objectives. Objective: The selection of a particular ERP system has a decisive effect on th...

  11. Fusion strategies for selecting multiple tuning parameters for multivariate calibration and other penalty based processes: A model updating application for pharmaceutical analysis

    International Nuclear Information System (INIS)

    Tencate, Alister J.; Kalivas, John H.; White, Alexander J.

    2016-01-01

New multivariate calibration methods and other processes are being developed that require selection of multiple tuning parameter (penalty) values to form the final model. With one or more tuning parameters, using only one measure of model quality to select final tuning parameter values is not sufficient. Optimization of several model quality measures is challenging. Thus, three fusion ranking methods are investigated for simultaneous assessment of multiple measures of model quality for selecting tuning parameter values. One is a supervised learning fusion rule named sum of ranking differences (SRD). The other two are non-supervised learning processes based on the sum and median operations. The effect of the number of models evaluated on the three fusion rules is also evaluated using three procedures. One procedure uses all models from all possible combinations of the tuning parameters. To reduce the number of models evaluated, an iterative process (only applicable to SRD) is applied, and thresholding a model quality measure before applying the fusion rules is also used. A near infrared pharmaceutical data set requiring model updating is used to evaluate the three fusion rules. In this case, calibration of the primary conditions is for the active pharmaceutical ingredient (API) of tablets produced in a laboratory. The secondary conditions for calibration updating are for tablets produced in the full batch setting. Two model updating processes requiring selection of two unique tuning parameter values are studied. One is based on Tikhonov regularization (TR) and the other is a variation of partial least squares (PLS). The three fusion methods are shown to provide equivalent and acceptable results, allowing automatic selection of the tuning parameter values. Best tuning parameter values are selected when model quality measures used with the fusion rules are for the small secondary sample set used to form the updated models. In this model updating situation, evaluation of …
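The sum- and median-rule fusion described in the abstract can be sketched in a few lines: rank every candidate tuning-parameter combination under each model-quality measure, then fuse the ranks. The quality values below are hypothetical illustrations (lower is better, as for RMSE-type measures), not data from the paper.

```python
import numpy as np

# Hypothetical: each row is a candidate tuning-parameter combination,
# each column a model-quality measure (lower is better).
quality = np.array([
    [0.10, 0.30, 5.0],
    [0.12, 0.20, 3.0],
    [0.30, 0.25, 1.0],
    [0.11, 0.22, 2.5],
])

# Rank candidates within each measure (0 = best for that measure).
ranks = np.argsort(np.argsort(quality, axis=0), axis=0)

sum_fused = ranks.sum(axis=1)             # sum rule
median_fused = np.median(ranks, axis=1)   # median rule

best_sum = int(np.argmin(sum_fused))
best_median = int(np.argmin(median_fused))
print(best_sum, best_median)
```

Here both rules pick the last candidate: it is never best on any single measure, but it is consistently near-best on all three, which is exactly the trade-off the fusion rules are meant to capture.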

  12. Fusion strategies for selecting multiple tuning parameters for multivariate calibration and other penalty based processes: A model updating application for pharmaceutical analysis

    Energy Technology Data Exchange (ETDEWEB)

Tencate, Alister J. [Department of Chemistry, Idaho State University, Pocatello, ID 83209 (United States); Kalivas, John H., E-mail: kalijohn@isu.edu [Department of Chemistry, Idaho State University, Pocatello, ID 83209 (United States); White, Alexander J. [Department of Physics and Optical Engineering, Rose-Hulman Institute of Technology, Terre Haute, IN 47803 (United States)

    2016-05-19

New multivariate calibration methods and other processes are being developed that require selection of multiple tuning parameter (penalty) values to form the final model. With one or more tuning parameters, using only one measure of model quality to select final tuning parameter values is not sufficient. Optimization of several model quality measures is challenging. Thus, three fusion ranking methods are investigated for simultaneous assessment of multiple measures of model quality for selecting tuning parameter values. One is a supervised learning fusion rule named sum of ranking differences (SRD). The other two are non-supervised learning processes based on the sum and median operations. The effect of the number of models evaluated on the three fusion rules is also evaluated using three procedures. One procedure uses all models from all possible combinations of the tuning parameters. To reduce the number of models evaluated, an iterative process (only applicable to SRD) is applied, and thresholding a model quality measure before applying the fusion rules is also used. A near infrared pharmaceutical data set requiring model updating is used to evaluate the three fusion rules. In this case, calibration of the primary conditions is for the active pharmaceutical ingredient (API) of tablets produced in a laboratory. The secondary conditions for calibration updating are for tablets produced in the full batch setting. Two model updating processes requiring selection of two unique tuning parameter values are studied. One is based on Tikhonov regularization (TR) and the other is a variation of partial least squares (PLS). The three fusion methods are shown to provide equivalent and acceptable results, allowing automatic selection of the tuning parameter values. Best tuning parameter values are selected when model quality measures used with the fusion rules are for the small secondary sample set used to form the updated models. In this model updating situation, evaluation of …

  13. It Takes Three: Selection, Influence, and De-Selection Processes of Depression in Adolescent Friendship Networks

    Science.gov (United States)

    Van Zalk, Maarten Herman Walter; Kerr, Margaret; Branje, Susan J. T.; Stattin, Hakan; Meeus, Wim H. J.

    2010-01-01

    The authors of this study tested a selection-influence-de-selection model of depression. This model explains friendship influence processes (i.e., friends' depressive symptoms increase adolescents' depressive symptoms) while controlling for two processes: friendship selection (i.e., selection of friends with similar levels of depressive symptoms)…

  14. 45 CFR 1305.6 - Selection process.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Selection process. 1305.6 Section 1305.6 Public... PROGRAM ELIGIBILITY, RECRUITMENT, SELECTION, ENROLLMENT AND ATTENDANCE IN HEAD START § 1305.6 Selection process. (a) Each Head Start program must have a formal process for establishing selection criteria and...

  15. Selected Topics on Systems Modeling and Natural Language Processing: Editorial Introduction to the Issue 7 of CSIMQ

    Directory of Open Access Journals (Sweden)

    Witold Andrzejewski

    2016-07-01

Full Text Available The seventh issue of Complex Systems Informatics and Modeling Quarterly presents five papers devoted to two distinct research topics: systems modeling and natural language processing (NLP). Both of these subjects are very important in computer science. Through modeling we can simplify the studied problem by concentrating on only one aspect at a time. Moreover, a properly constructed model allows the modeler to work at higher levels of abstraction without having to concentrate on details. Since the size and complexity of information systems grow rapidly, creating good models of such systems is crucial. The analysis of natural language is slowly becoming a widely used tool in commerce and day-to-day life. Opinion mining allows recommender systems to provide accurate recommendations based on user-generated reviews. Speech recognition and NLP are the basis for such widely used personal assistants as Apple's Siri, Microsoft's Cortana, and Google Now. While a lot of work has already been done on natural language processing, the research usually concerns widely used languages, such as English. Consequently, natural language processing in languages other than English is a very relevant subject and is addressed in this issue.

  16. Physics-based simulation modeling and optimization of microstructural changes induced by machining and selective laser melting processes in titanium and nickel based alloys

    Science.gov (United States)

    Arisoy, Yigit Muzaffer

Manufacturing processes may significantly affect the quality of resultant surfaces and the structural integrity of metal end products. Controlling manufacturing process induced changes to the product's surface integrity may improve the fatigue life and overall reliability of the end product. The goal of this study is to model the phenomena that result in microstructural alterations and to improve the surface integrity of manufactured parts by utilizing physics-based process simulations and other computational methods. Two different manufacturing processes, one conventional and one advanced, are studied: machining of titanium and nickel-based alloys, and selective laser melting of nickel-based powder alloys. 3D finite element (FE) process simulations are developed, and experimental data that validate these process simulation models are generated to compare against predictions. Computational process modeling and optimization have been performed for machining-induced microstructure, including: i) predicting recrystallization and grain size using FE simulations and the Johnson-Mehl-Avrami-Kolmogorov (JMAK) model, ii) predicting microhardness using non-linear regression models and the Random Forests method, and iii) multi-objective machining optimization for minimizing microstructural changes. Experimental analysis and computational process modeling of selective laser melting have also been conducted, including: i) microstructural analysis of grain sizes and growth directions using SEM imaging and machine learning algorithms, ii) analysis of thermal imaging for spattering, heating/cooling rates and meltpool size, iii) predicting thermal field, meltpool size, and growth directions via thermal gradients using 3D FE simulations, and iv) predicting localized solidification using the Phase Field method. These computational process models and predictive models, once utilized by industry to optimize process parameters, have the ultimate potential to improve performance of …

  17. Optimization of the selection process of the co-substrates for chicken manure fermentation using neural modeling

    Directory of Open Access Journals (Sweden)

    Lewicki Andrzej

    2016-01-01

Full Text Available Intense development of research equipment leads directly to increasing cognitive abilities. However, along with the rising amount of data generated, the development of techniques allowing its analysis is also essential. Currently, one of the most dynamically developing branches of computer science and mathematics is Artificial Neural Networks (ANN). Their main advantage is a very high ability to solve regression and approximation problems. This paper presents the possibility of applying artificial intelligence methods to optimize the selection of co-substrates intended for methane fermentation of chicken manure. A 4-layer MLP network has proven to be the optimal structure for modeling the obtained empirical data.
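As a rough illustration of the kind of regression task an MLP handles, here is a minimal one-hidden-layer network trained by plain gradient descent. The co-substrate fractions and the yield formula are invented for the sketch; the paper's 4-layer MLP and real fermentation data are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two co-substrate fractions -> methane yield proxy.
X = rng.uniform(0, 1, size=(200, 2))
y = (0.7 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 0] * X[:, 1]).reshape(-1, 1)

# One hidden layer with tanh activation, trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

_, pred0 = forward(X)
loss0 = float(np.mean((pred0 - y) ** 2))

lr = 0.1
for _ in range(2000):
    H, pred = forward(X)
    err = (pred - y) / len(X)          # gradient of (half) MSE w.r.t. predictions
    gW2 = H.T @ err; gb2 = err.sum(0)
    dH = err @ W2.T * (1 - H ** 2)     # backpropagate through tanh
    gW1 = X.T @ dH; gb1 = dH.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
loss = float(np.mean((pred - y) ** 2))
print(loss0, loss)
```

With a smooth target like this, even a small network driven by plain gradient descent reduces the fit error by a wide margin, which is the approximation ability the abstract refers to.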

  18. Nursing documentation: experience of the use of the nursing process model in selected hospitals in Ibadan, Oyo State, Nigeria.

    Science.gov (United States)

    Ofi, Bola; Sowunmi, Olanrewaju

    2012-08-01

The descriptive study was conducted to determine the extent of utilization of the nursing process for documentation of nursing care in three selected hospitals in Ibadan, Nigeria. One hundred fifty nurses and 115 discharged clients' records were selected from the hospitals. Questionnaires and checklists were used to collect data. Utilization of the nursing process for care was 100%, 73.6% and 34.8% in the three hospitals. Nurses encountered difficulties in history taking, formulation of nursing diagnoses, objectives, nursing orders and evaluation. Most nurses disagreed with or were undecided about the use of authorized abbreviations and symbols (34.3%, 40.3% and 69.5%), the recording of errors that occurred during care (37.1%, 56.1% and 52.2%) and the inclusion of changes in clients' condition (54.3%, 56.1% and 73.8%). Most nurses appreciated the significance of documentation. Lack of time, lack of knowledge and the need for extensive writing are the major barriers to documentation. Seventy-seven point four per cent of the 115 clients' records from one hospital showed evidence of documentation; there was no evidence from the other two. Study findings have implications for continuing professional education, practice and supervision. © 2012 Blackwell Publishing Asia Pty Ltd.

  19. Selection of refractory materials for pyrochemical processing

    International Nuclear Information System (INIS)

    Axler, K.M.; DePoorter, G.L.; Bagaasen, L.M.

    1991-01-01

Several pyrochemical processing operations require containment materials that exhibit minimal chemical interactions with the system, good thermal shock resistance, and reusability. One example is Direct Oxide Reduction (DOR). DOR involves the conversion of PuO2 to metal by an oxidation/reduction reaction with Ca metal. The reaction proceeds within a molten salt flux at temperatures above 800 °C. A combination of thermodynamics, system thermodynamic modeling, and experimental investigations is being used to select and evaluate potential containment materials.

  20. Processing Technology Selection for Municipal Sewage Treatment Based on a Multi-Objective Decision Model under Uncertainty

    Directory of Open Access Journals (Sweden)

    Xudong Chen

    2018-03-01

    Full Text Available This study considers the two factors of environmental protection and economic benefits to address municipal sewage treatment. Based on considerations regarding the sewage treatment plant construction site, processing technology, capital investment, operation costs, water pollutant emissions, water quality and other indicators, we establish a general multi-objective decision model for optimizing municipal sewage treatment plant construction. Using the construction of a sewage treatment plant in a suburb of Chengdu as an example, this paper tests the general model of multi-objective decision-making for the sewage treatment plant construction by implementing a genetic algorithm. The results show the applicability and effectiveness of the multi-objective decision model for the sewage treatment plant. This paper provides decision and technical support for the optimization of municipal sewage treatment.
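A genetic algorithm over a weighted-sum objective, of the kind the abstract applies to the multi-objective decision model, can be sketched as follows. The cost and emission surrogates, the weights, and all GA settings are hypothetical stand-ins, not the paper's actual model.

```python
import random

random.seed(0)

# Hypothetical surrogates: x[0] ~ normalized capital-cost driver,
# x[1] ~ normalized treatment level, each in [0, 1].
def cost(x):     return (x[0] - 0.2) ** 2   # minimized at 0.2
def emission(x): return (x[1] - 0.8) ** 2   # minimized at 0.8

def fitness(x, w=(0.5, 0.5)):
    # Weighted sum collapses the two objectives into one scalar to minimize.
    return w[0] * cost(x) + w[1] * emission(x)

def ga(pop_size=40, gens=60, mut=0.05):
    pop = [[random.random(), random.random()] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 4]            # elitism: keep the best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            child = [random.choice(g) for g in zip(a, b)]   # uniform crossover
            child = [min(1.0, max(0.0, g + random.gauss(0, mut)))
                     for g in child]                        # Gaussian mutation
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = ga()
print(best, fitness(best))
```

With elitism the best-so-far solution is never lost, so the search reliably closes in on the known optimum (0.2, 0.8) of this toy weighted objective.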

  1. Processing Technology Selection for Municipal Sewage Treatment Based on a Multi-Objective Decision Model under Uncertainty.

    Science.gov (United States)

    Chen, Xudong; Xu, Zhongwen; Yao, Liming; Ma, Ning

    2018-03-05

    This study considers the two factors of environmental protection and economic benefits to address municipal sewage treatment. Based on considerations regarding the sewage treatment plant construction site, processing technology, capital investment, operation costs, water pollutant emissions, water quality and other indicators, we establish a general multi-objective decision model for optimizing municipal sewage treatment plant construction. Using the construction of a sewage treatment plant in a suburb of Chengdu as an example, this paper tests the general model of multi-objective decision-making for the sewage treatment plant construction by implementing a genetic algorithm. The results show the applicability and effectiveness of the multi-objective decision model for the sewage treatment plant. This paper provides decision and technical support for the optimization of municipal sewage treatment.

  2. A Comparative Investigation of the Combined Effects of Pre-Processing, Wavelength Selection, and Regression Methods on Near-Infrared Calibration Model Performance.

    Science.gov (United States)

    Wan, Jian; Chen, Yi-Chieh; Morris, A Julian; Thennadil, Suresh N

    2017-07-01

Near-infrared (NIR) spectroscopy is being widely used in various fields ranging from pharmaceutics to the food industry for analyzing chemical and physical properties of the substances concerned. Its advantages over other analytical techniques include available physical interpretation of spectral data, nondestructive nature and high speed of measurements, and little or no need for sample preparation. The successful application of NIR spectroscopy relies on three main aspects: pre-processing of spectral data to eliminate nonlinear variations due to temperature, light scattering effects and many others; selection of those wavelengths that contribute useful information; and identification of suitable calibration models using linear/nonlinear regression. Several methods have been developed for each of these three aspects, and many comparative studies of different methods exist for an individual aspect or some combinations. However, there is still a lack of comparative studies of the interactions among these three aspects, which can shed light on what role each aspect plays in the calibration and how to combine various methods of each aspect together to obtain the best calibration model. This paper aims to provide such a comparative study based on four benchmark data sets using three typical pre-processing methods, namely, orthogonal signal correction (OSC), extended multiplicative signal correction (EMSC) and optical path-length estimation and correction (OPLEC); two existing wavelength selection methods, namely, stepwise forward selection (SFS) and genetic algorithm optimization combined with partial least squares regression for spectral data (GAPLSSP); and four popular regression methods, namely, partial least squares (PLS), least absolute shrinkage and selection operator (LASSO), least squares support vector machine (LS-SVM), and Gaussian process regression (GPR). The comparative study indicates that, in general, pre-processing of spectral data can play a significant …
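The interaction of pre-processing and regression choices can be illustrated with a toy grid search over their combinations. This sketch uses SNV-style scatter correction and ridge regression as simple stand-ins (the paper's OSC/EMSC/OPLEC and PLS/LASSO/LS-SVM/GPR methods are not reproduced), and the "spectra" are synthetic with an injected multiplicative scatter effect.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "spectra": 60 samples x 50 wavelengths with a per-sample
# multiplicative scatter effect; the response depends on two wavelengths.
n, p = 60, 50
base = rng.normal(0, 1, size=(n, p))
scale = rng.uniform(0.5, 1.5, size=(n, 1))       # path-length-like scatter
X_raw = scale * base
y = base[:, 5] + 0.5 * base[:, 20] + rng.normal(0, 0.05, n)

def snv(X):
    # Standard normal variate: row-wise scatter correction.
    return (X - X.mean(1, keepdims=True)) / X.std(1, keepdims=True)

def ridge_fit(X, y, lam):
    A = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y)

def cv_rmse(X, y, lam, k=5):
    # Simple k-fold cross-validated RMSE with strided folds.
    idx = np.arange(len(y))
    errs = []
    for f in range(k):
        te = idx[f::k]
        tr = np.setdiff1d(idx, te)
        b = ridge_fit(X[tr], y[tr], lam)
        errs.append((X[te] @ b - y[te]) ** 2)
    return float(np.sqrt(np.mean(np.concatenate(errs))))

results = {}
for prep_name, prep in [("raw", lambda X: X), ("snv", snv)]:
    for lam in (0.1, 10.0):
        results[(prep_name, lam)] = cv_rmse(prep(X_raw), y, lam)

best = min(results, key=results.get)
print(results, best)
```

Because the response here depends on the scatter-free signal, the SNV-corrected combinations should beat the raw ones, illustrating why pre-processing and regression must be assessed jointly rather than in isolation.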

  3. Innovation During the Supplier Selection Process

    DEFF Research Database (Denmark)

    Pilkington, Alan; Pedraza, Isabel

    2014-01-01

Established ideas on supplier selection have not moved much from the original premise of how to choose between bidders. Whilst we have added many different tools and refinements to choose between alternative suppliers, its nature has not evolved. We move the original selection process approach...... observed through an ethnographic embedded-researcher study has refined the selection process and has two selection stages: one for first supply, covering tool/process development, and another, later, for resupply of mature parts. We report the details of the process, those involved, the criteria employed...... and identify benefits and weaknesses of this enhanced selection process....

  4. Recruiter Selection Model

    National Research Council Canada - National Science Library

    Halstead, John B

    2006-01-01

    .... The research uses a combination of statistical learning, feature selection methods, and multivariate statistics to determine the better prediction function approximation with features obtained...

  5. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occur within the porous adsorbent. The theoret…

  6. 45 CFR 2400.31 - Selection process.

    Science.gov (United States)

    2010-10-01

    ... FELLOWSHIP PROGRAM REQUIREMENTS Selection of Fellows § 2400.31 Selection process. (a) An independent Fellow... outstanding applicants from each state for James Madison Fellowships. (b) From among candidates recommended...

  7. Model selection in periodic autoregressions

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    1994-01-01

This paper focuses on the issue of periodic autoregressive (PAR) time series model selection in practice. One aspect of model selection is the choice of the appropriate PAR order. This can be of interest for the evaluation of economic models. Further, the appropriate PAR order is important …

  8. Models selection and fitting

    International Nuclear Information System (INIS)

    Martin Llorente, F.

    1990-01-01

Models of atmospheric pollutant dispersion are based on mathematical algorithms that describe the transport, diffusion, elimination and chemical reactions of atmospheric contaminants. These models operate on contaminant emission data and produce an estimate of air quality in the area. Such models can be applied to several aspects of atmospheric contamination.
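A minimal example of the class of algorithm such dispersion models implement is the standard Gaussian plume formula with a ground-reflection term. The parameter values below are arbitrary illustrations, not taken from any particular regulatory model.

```python
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Gaussian plume concentration at crosswind offset y and height z.

    Q: emission rate, u: wind speed, H: effective stack height,
    sigma_y/sigma_z: dispersion coefficients at the downwind distance of
    interest. The second vertical term reflects the plume off the ground.
    """
    spread = math.exp(-y**2 / (2 * sigma_y**2))
    vert = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
            + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * spread * vert

# Illustrative ground-level concentrations on and off the plume centerline.
c_center = gaussian_plume(Q=1.0, u=5.0, y=0.0, z=0.0, H=50.0,
                          sigma_y=30.0, sigma_z=20.0)
c_off = gaussian_plume(Q=1.0, u=5.0, y=60.0, z=0.0, H=50.0,
                       sigma_y=30.0, sigma_z=20.0)
print(c_center, c_off)
```

As expected, concentration falls off moving crosswind away from the plume centerline.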

  9. A Gambler's Model of Natural Selection.

    Science.gov (United States)

    Nolan, Michael J.; Ostrovsky, David S.

    1996-01-01

    Presents an activity that highlights the mechanism and power of natural selection. Allows students to think in terms of modeling a biological process and instills an appreciation for a mathematical approach to biological problems. (JRH)
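The "gambler's" view of natural selection described in the abstract can be simulated directly: each generation is a biased random draw. Below is a minimal Wright-Fisher-style sketch; population size, selection coefficient, and generation count are arbitrary choices for illustration.

```python
import random

random.seed(42)

def wright_fisher(p0, s, pop=500, gens=100):
    """Track the frequency of allele A under selection coefficient s."""
    p = p0
    for _ in range(gens):
        # Selection tilts the expected frequency in favor of allele A...
        w = p * (1 + s) / (p * (1 + s) + (1 - p))
        # ...but the next generation is still a gamble: binomial sampling.
        p = sum(1 for _ in range(pop) if random.random() < w) / pop
        if p in (0.0, 1.0):   # fixation or loss ends the game
            break
    return p

neutral = wright_fisher(0.5, 0.0)     # pure drift
selected = wright_fisher(0.5, 0.10)   # 10% fitness advantage
print(neutral, selected)
```

Even a modest fitness advantage reliably overwhelms drift in a population of this size, which is the point such classroom models are built to make.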

  10. Selecting a plutonium vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Jouan, A. [Centre d`Etudes de la Vallee du Rhone, Bagnols sur Ceze (France)

    1996-05-01

Vitrification of plutonium is one means of mitigating its potential danger. This option is technically feasible, even if it is not the solution advocated in France. Two situations are possible, depending on whether or not the glass matrix also contains fission products; concentrations of up to 15% should be achievable for plutonium alone, whereas the upper limit is 3% in the presence of fission products. The French continuous vitrification process appears to be particularly suitable for plutonium vitrification: its capacity is compatible with the required throughput, and the compact dimensions of the process equipment prevent a criticality hazard. Preprocessing of plutonium metal, to convert it to PuO2 or to a nitric acid solution, may prove advantageous or even necessary depending on whether a dry or wet process is adopted. The process may involve a single step (vitrification of Pu or PuO2 mixed with glass frit) or may include a prior calcination step - notably if the plutonium is to be incorporated into a fission product glass. It is important to weigh the advantages and drawbacks of all the possible options in terms of feasibility, safety and cost-effectiveness.

  11. Selected System Models

    Science.gov (United States)

    Schmidt-Eisenlohr, F.; Puñal, O.; Klagges, K.; Kirsche, M.

    Apart from the general issue of modeling the channel, the PHY and the MAC of wireless networks, there are specific modeling assumptions that are considered for different systems. In this chapter we consider three specific wireless standards and highlight modeling options for them. These are IEEE 802.11 (as example for wireless local area networks), IEEE 802.16 (as example for wireless metropolitan networks) and IEEE 802.15 (as example for body area networks). Each section on these three systems discusses also at the end a set of model implementations that are available today.

  12. 7 CFR 3570.68 - Selection process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Selection process. 3570.68 Section 3570.68 Agriculture Regulations of the Department of Agriculture (Continued) RURAL HOUSING SERVICE, DEPARTMENT OF AGRICULTURE COMMUNITY PROGRAMS Community Facilities Grant Program § 3570.68 Selection process. Each request...

  13. 44 CFR 150.7 - Selection process.

    Science.gov (United States)

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Selection process. 150.7 Section 150.7 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF... Selection process. (a) President's Award. Nominations for the President's Award shall be reviewed, and...

  14. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on proper understanding of the functionality of information systems that shall support the activity of the organization. A number of business process modeling notations were popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, it is assessed whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  15. Launch vehicle selection model

    Science.gov (United States)

    Montoya, Alex J.

    1990-01-01

Over the next 50 years, humans will be heading for the Moon and Mars to build scientific bases to gain further knowledge about the universe and to develop rewarding space activities. These large scale projects will last many years and will require large amounts of mass to be delivered to Low Earth Orbit (LEO). It will take a great deal of planning to complete these missions in an efficient manner. The planning of a future Heavy Lift Launch Vehicle (HLLV) will significantly impact the overall multi-year launching cost for the vehicle fleet, depending upon when the HLLV will be ready for use. It is desirable to develop a model in which many trade studies can be performed. In one sample multi-year space program analysis, the total launch vehicle cost of implementing the program was reduced from 50 percent to 25 percent. This indicates how critical it is to reduce space logistics costs. A linear programming model has been developed to answer such questions. The model is now in its second phase of development, and this paper will address the capabilities of the model and its intended uses. The main emphasis over the past year was to make the model user friendly and to incorporate additional realistic constraints that are difficult to represent mathematically. We have developed a methodology in which the user has to be knowledgeable about the mission model and the requirements of the payloads. We have found a representation that will cut down the solution space of the problem by inserting some preliminary tests to eliminate some infeasible vehicle solutions. The paper will address the handling of these additional constraints and the methodology for incorporating new costing information utilizing learning curve theory. The paper will review several test cases that will explore the preferred vehicle characteristics and the preferred period of construction, i.e., within the next decade, or in the first decade of the next century. Finally, the paper will explore the interaction …
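The flavor of such a launch-vehicle selection model can be shown with a toy version: choose launch counts per vehicle that meet a total mass-to-LEO demand at minimum fleet cost. A bounded brute-force search stands in for the paper's linear program, and the fleet data are invented for the sketch.

```python
from itertools import product

# Hypothetical fleet data: (cost per launch, payload to LEO in tonnes).
vehicles = {"small": (60, 10), "medium": (110, 25), "heavy": (300, 90)}
demand = 500  # tonnes to LEO over the whole program

best = None
# Enumerate launch counts per vehicle; a real model would solve this as an LP.
for counts in product(range(0, 51), repeat=3):
    mass = sum(c * m for c, (_, m) in zip(counts, vehicles.values()))
    if mass < demand:
        continue  # infeasible: demand not met
    cost = sum(c * p for c, (p, _) in zip(counts, vehicles.values()))
    if best is None or cost < best[0]:
        best = (cost, dict(zip(vehicles, counts)))

print(best)
```

On these made-up numbers the heavy lifter dominates on cost per tonne, so the optimum fills most of the demand with heavy launches and tops up with medium ones, the same kind of fleet-mix trade the full model explores.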

  16. A Heckman Selection- t Model

    KAUST Repository

    Marchenko, Yulia V.

    2012-03-01

Sample selection arises often in practice as a result of the partial observability of the outcome of interest in a study. In the presence of sample selection, the observed data do not represent a random sample from the population, even after controlling for explanatory variables. That is, data are missing not at random. Thus, standard analysis using only complete cases will lead to biased results. Heckman introduced a sample selection model to analyze such data and proposed a full maximum likelihood estimation method under the assumption of normality. The method was criticized in the literature because of its sensitivity to the normality assumption. In practice, data, such as income or expenditure data, often violate the normality assumption because of heavier tails. We first establish a new link between sample selection models and recently studied families of extended skew-elliptical distributions. This allows us to introduce a selection-t (SLt) model, which models the error distribution using a Student's t distribution. We study its properties and investigate the finite-sample performance of the maximum likelihood estimators for this model. We compare the performance of the SLt model to the conventional Heckman selection-normal (SLN) model and apply it to analyze ambulatory expenditures. Unlike the SLN model, our analysis using the SLt model provides statistical evidence for the existence of sample selection bias in these data. We also investigate the performance of the test for sample selection bias based on the SLt model and compare it with the performances of several tests used with the SLN model. Our findings indicate that the latter tests can be misleading in the presence of heavy-tailed data. © 2012 American Statistical Association.
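The core problem Heckman-type models address, bias from data missing not at random, is easy to demonstrate by simulation. This sketch is not the SLt estimator itself; it only shows OLS on a selected subsample recovering a biased slope when selection is correlated with the outcome error. All model parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# Outcome model: y = 1 + 2*x + u.
x = rng.normal(size=n)
u = rng.normal(size=n)
y = 1 + 2 * x + u

# Selection equation: observe y only when a latent index is positive.
# v is correlated with u, so the missingness is not at random.
v = 0.8 * u + 0.6 * rng.normal(size=n)
observed = (0.5 * x + v) > 0

def ols_slope(x, y):
    return float(np.cov(x, y, bias=True)[0, 1] / np.var(x))

slope_full = ols_slope(x, y)                          # ~ true slope of 2
slope_selected = ols_slope(x[observed], y[observed])  # biased downward
print(slope_full, slope_selected)
```

Because high-x observations survive selection even with unfavorable errors while low-x observations need large positive u to be seen, the complete-case slope is pulled away from the true value of 2, which is exactly the bias the selection models correct for.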

  17. Semantic Business Process Modeling

    OpenAIRE

    Markovic, Ivan

    2010-01-01

    This book presents a process-oriented business modeling framework based on semantic technologies. The framework consists of modeling languages, methods, and tools that allow for semantic modeling of business motivation, business policies and rules, and business processes. Quality of the proposed modeling framework is evaluated based on the modeling content of SAP Solution Composer and several real-world business scenarios.

  18. Predictive modeling, simulation, and optimization of laser processing techniques: UV nanosecond-pulsed laser micromachining of polymers and selective laser melting of powder metals

    Science.gov (United States)

    Criales Escobar, Luis Ernesto

One of the most frequently evolving areas of research is the utilization of lasers for micro-manufacturing and additive manufacturing purposes. The use of laser beam as a tool for manufacturing arises from the need for flexible and rapid manufacturing at a low-to-mid cost. Laser micro-machining provides an advantage over mechanical micro-machining due to the faster production times of large batch sizes and the high costs associated with specific tools. Laser based additive manufacturing enables processing of powder metals for direct and rapid fabrication of products. Therefore, laser processing can be viewed as a fast, flexible, and cost-effective approach compared to traditional manufacturing processes. Two types of laser processing techniques are studied: laser ablation of polymers for micro-channel fabrication and selective laser melting of metal powders. Initially, a feasibility study for laser-based micro-channel fabrication of poly(dimethylsiloxane) (PDMS) via experimentation is presented. In particular, the effectiveness of utilizing a nanosecond-pulsed laser as the energy source for laser ablation is studied. The results are analyzed statistically and a relationship between process parameters and micro-channel dimensions is established. Additionally, a process model is introduced for predicting channel depth. Model outputs are compared and analyzed to experimental results. The second part of this research focuses on a physics-based FEM approach for predicting the temperature profile and melt pool geometry in selective laser melting (SLM) of metal powders. Temperature profiles are calculated for a moving laser heat source to understand the temperature rise due to heating during SLM. Based on the predicted temperature distributions, melt pool geometry, i.e. the locations at which melting of the powder material occurs, is determined. Simulation results are compared against data obtained from experimental Inconel 625 test coupons fabricated at the National …

  19. Expert System Model for Educational Personnel Selection

    Directory of Open Access Journals (Sweden)

    Héctor A. Tabares-Ospina

    2013-06-01

Full Text Available Staff selection is a difficult task due to the subjectivity involved in the evaluation. This process can be complemented using a decision-support system. This paper presents the implementation of an expert system to systematize the selection process for professors. The management of the software development is divided into 4 parts: requirements, design, implementation and commissioning. The proposed system models specific knowledge through relationships between evidence variables and objective variables.

  20. The partner selection process : Steps, effectiveness, governance

    NARCIS (Netherlands)

    Duisters, D.; Duijsters, G.M.; de Man, A.P.

    2011-01-01

Selecting the right partner is important for creating value in alliances. Even though prior research suggests that a structured partner selection process increases alliance success, empirical research remains scarce. This paper presents an explorative empirical study that shows that some steps in …

  1. The partner selection process : steps, effectiveness, governance

    NARCIS (Netherlands)

    Duisters, D.; Duysters, G.M.; Man, de A.P.

    2011-01-01

Selecting the right partner is important for creating value in alliances. Even though prior research suggests that a structured partner selection process increases alliance success, empirical research remains scarce. This paper presents an explorative empirical study that shows that some steps in …

  2. Selective detachment process in column flotation froth

    Energy Technology Data Exchange (ETDEWEB)

    Honaker, R.Q.; Ozsever, A.V.; Parekh, B.K. [University of Kentucky, Lexington, KY (United States). Dept. of Mining Engineering

    2006-05-15

    The selectivity in flotation columns involving the separation of particles of varying degrees of floatability is based on differential flotation rates in the collection zone, reflux action between the froth and collection zones, and differential detachment rates in the froth zone. Using well-known theoretical models describing the separation process and experimental data, froth zone and overall flotation recovery values were quantified for particles in an anthracite coal that have a wide range of floatability potential. For highly floatable particles, froth recovery had a very minimal impact on overall recovery while the recovery of weakly floatable material was decreased substantially by reductions in froth recovery values. In addition, under carrying-capacity limiting conditions, selectivity was enhanced by the preferential detachment of the weakly floatable material. Based on this concept, highly floatable material was added directly into the froth zone when treating the anthracite coal. The enriched froth phase reduced the product ash content of the anthracite product by five absolute percentage points while maintaining a constant recovery value.
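The interplay of collection-zone and froth-zone recovery described above is commonly summarized by the two-zone recovery formula R = Rc*Rf / (Rc*Rf + 1 - Rc), in which material dropping back from the froth is recycled to the collection zone (the Finch-Dobby form). A quick numerical check of the selectivity argument, with illustrative recovery values rather than the paper's data:

```python
def overall_recovery(r_collection, r_froth):
    """Overall flotation recovery for a collection zone with recovery Rc
    feeding a froth zone with recovery Rf, froth dropback recycled:
    R = Rc*Rf / (Rc*Rf + 1 - Rc)."""
    num = r_collection * r_froth
    return num / (num + 1 - r_collection)

# Strongly floatable particles: halving froth recovery barely matters.
strong = [overall_recovery(0.95, rf) for rf in (1.0, 0.5)]
# Weakly floatable particles: overall recovery collapses with froth recovery.
weak = [overall_recovery(0.30, rf) for rf in (1.0, 0.5)]
print(strong, weak)
```

Dropping froth recovery from 100% to 50% costs the strongly floatable material only a few points of overall recovery, but cuts the weakly floatable material's recovery by roughly 40% in relative terms, which is the selective-detachment effect the abstract exploits.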

  3. A Heckman Selection- t Model

    KAUST Repository

    Marchenko, Yulia V.; Genton, Marc G.

    2012-01-01

    for sample selection bias based on the SLt model and compare it with the performances of several tests used with the SLN model. Our findings indicate that the latter tests can be misleading in the presence of heavy-tailed data. © 2012 American Statistical
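
    The abstract above is truncated, but the setting it builds on is the classical Heckman sample-selection model (the SLN case, with normal errors), which the SLt model generalizes to Student-t tails. As background, a minimal sketch of the classical two-step estimator on synthetic data; every number and variable below is an illustrative assumption, not taken from the paper:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(42)

# Synthetic selection problem: the outcome y = 1 + 2*x + eps is observed
# only when 0.5 + z + u > 0, with corr(u, eps) = 0.8, which biases naive
# OLS fitted on the observed subsample.
n = 5000
x = rng.normal(size=n)
z = rng.normal(size=n)
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
u, eps = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
select = (0.5 + z + u) > 0
y = 1.0 + 2.0 * x + eps

# Step 1: probit of the selection indicator on z (maximum likelihood)
def probit_nll(g):
    p = np.clip(norm.cdf(g[0] + g[1] * z), 1e-10, 1 - 1e-10)
    return -np.sum(select * np.log(p) + (~select) * np.log(1 - p))

g_hat = minimize(probit_nll, x0=[0.0, 0.0]).x

# Step 2: OLS on the selected sample, augmented with the inverse Mills
# ratio lambda = phi/Phi evaluated at the fitted probit index
idx = g_hat[0] + g_hat[1] * z[select]
mills = norm.pdf(idx) / norm.cdf(idx)
X_heck = np.column_stack([np.ones(select.sum()), x[select], mills])
beta_heck, *_ = np.linalg.lstsq(X_heck, y[select], rcond=None)

# Naive OLS on the selected sample, for comparison
X_naive = np.column_stack([np.ones(select.sum()), x[select]])
beta_naive, *_ = np.linalg.lstsq(X_naive, y[select], rcond=None)

print("naive intercept:", round(beta_naive[0], 3))
print("heckman intercept:", round(beta_heck[0], 3))
```

    Since selection is driven by u, which is positively correlated with eps, the naive intercept is pulled upward; the Mills-ratio term absorbs that conditional mean and moves the intercept back toward its true value of 1.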

  4. PRIME – PRocess modelling in ImpleMEntation research: selecting a theoretical basis for interventions to change clinical practice

    Directory of Open Access Journals (Sweden)

    Pitts Nigel

    2003-12-01

    modelling. In the final phase of the project, the findings from all surveys will be analysed simultaneously adopting a random effects approach to investigate whether the relationships between predictor variables and outcome measures are modified by behaviour, professional group or geographical location.

  5. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  6. Bayesian Model Selection under Time Constraints

    Science.gov (United States)

    Hoege, M.; Nowak, W.; Illman, W. A.

    2017-12-01

    Bayesian model selection (BMS) provides a consistent framework for rating and comparing models in multi-model inference. In cases where models of vastly different complexity compete with each other, we also face vastly different computational runtimes of such models. For instance, time series of a quantity of interest can be simulated by an autoregressive process model that takes less than a second for one run, or by a partial differential equations-based model with runtimes up to several hours or even days. The classical BMS is based on a quantity called Bayesian model evidence (BME). It determines the model weights in the selection process and resembles a trade-off between the bias of a model and its complexity. However, in practice, the runtime of models is another relevant weighting factor for model selection. Hence, we believe that it should be included, leading to an overall trade-off between bias, variance and computing effort. We approach this triple trade-off from the viewpoint of our ability to generate realizations of the models under a given computational budget. One way to obtain BME values is through sampling-based integration techniques. We start from the fact that, under time constraints, more expensive models can be sampled much less often than faster models (in inverse proportion to their runtime). The evidence computed in favor of a more expensive model is therefore statistically less significant than the evidence computed in favor of a faster model, since sampling-based strategies are always subject to statistical sampling error. We present a straightforward way to include this imbalance in the model weights that form the basis for model selection. Our approach follows directly from the idea of insufficient significance. It is based on a computationally cheap bootstrap error estimate of the model evidence and is easy to implement. The approach is illustrated in a small synthetic modeling study.
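
    The sampling-based BME estimate and its bootstrap error bar described in the abstract can be sketched on a toy problem. Everything below is an illustrative assumption (a one-parameter Gaussian model with a standard-normal prior), not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: unknown mean theta with prior N(0, 1), Gaussian likelihood
# with known sigma = 0.5, and 20 observations.
y_obs = rng.normal(1.0, 0.5, size=20)

def log_lik(theta):
    return (-0.5 * np.sum((y_obs - theta) ** 2) / 0.25
            - y_obs.size * np.log(np.sqrt(2 * np.pi) * 0.5))

def log_bme(logL):
    # Monte Carlo estimate of log Bayesian model evidence: the mean of
    # the likelihood over prior samples, computed via log-sum-exp
    m = logL.max()
    return m + np.log(np.mean(np.exp(logL - m)))

def bootstrap_err(logL, n_boot=500):
    # Cheap bootstrap estimate of the statistical error of log-BME,
    # in the spirit of the abstract's "insufficient significance" idea
    ests = [log_bme(rng.choice(logL, size=logL.size, replace=True))
            for _ in range(n_boot)]
    return float(np.std(ests))

errors = {}
for budget in (100, 10000):  # slow-model vs fast-model sample budgets
    thetas = rng.normal(0.0, 1.0, size=budget)  # prior samples
    logL = np.array([log_lik(t) for t in thetas])
    errors[budget] = bootstrap_err(logL)

print(errors)
```

    Under a fixed time budget, the expensive model affords far fewer prior samples, so its log-BME estimate carries a wider bootstrap error bar; the abstract's proposal is to fold that error bar into the model weights.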

  7. Selective hydrogenation processes in steam cracking

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.; Schroeter, M.K.; Hinrichs, M.; Makarczyk, P. [BASF SE, Ludwigshafen (Germany)

    2010-12-30

    Hydrogen is the key elixir used to trim the quality of olefinic and aromatic product slates from steam crackers. Since it is co-produced in excess in the thermal cracking process, a small part of the hydrogen is consumed in the ''cold part'' of a steam cracker to selectively hydrogenate unwanted, unsaturated hydrocarbons. The compositions of the various steam cracker product streams are adjusted by these processes to the outlet specifications. This presentation gives an overview of the state-of-the-art selective hydrogenation technologies available from BASF for these processes. (Published in summary form only) (orig.)

  8. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  9. An evolutionary algorithm for model selection

    Energy Technology Data Exchange (ETDEWEB)

    Bicker, Karl [CERN, Geneva (Switzerland); Chung, Suh-Urk; Friedrich, Jan; Grube, Boris; Haas, Florian; Ketzer, Bernhard; Neubert, Sebastian; Paul, Stephan; Ryabchikov, Dimitry [Technische Univ. Muenchen (Germany)

    2013-07-01

    When performing partial-wave analyses of multi-body final states, the choice of the fit model, i.e. the set of waves to be used in the fit, can significantly alter the results of the partial-wave fit. Traditionally, the models were chosen based on physical arguments and by observing the changes in log-likelihood of the fits. To reduce possible bias in the model selection process, an evolutionary algorithm was developed based on a Bayesian goodness-of-fit criterion which takes into account the model complexity. Starting from systematically constructed pools of waves which contain significantly more waves than the typical fit model, the algorithm yields a model with an optimal log-likelihood and with a number of partial waves which is appropriate for the number of events in the data. Partial waves with small contributions to the total intensity are penalized and likely to be dropped during the selection process, as are models where excessive correlations between single waves occur. Due to the automated nature of the model selection, a much larger part of the model space can be explored than would be possible in a manual selection. In addition, the method makes it possible to assess the dependence of the fit result on the fit model, which is an important contribution to the systematic uncertainty.
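
    The idea of an evolutionary search over model components scored by a complexity-penalized goodness-of-fit criterion can be illustrated outside partial-wave analysis. The sketch below substitutes a toy problem: selecting regressors for a linear model by minimizing BIC with a simple mutate-and-select loop (all data and parameters are assumptions for illustration, not the authors' algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 10 candidate regressors, of which only the first three
# actually contribute to the response
n, p = 200, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.0, 0.5]
y = X @ beta_true + rng.normal(scale=1.0, size=n)

def bic(mask):
    # Complexity-penalized fit: n*log(RSS/n) + k*log(n)
    k = int(mask.sum())
    if k == 0:
        rss = float(np.sum(y ** 2))
    else:
        Xs = X[:, mask.astype(bool)]
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss = float(np.sum((y - Xs @ beta) ** 2))
    return n * np.log(rss / n) + k * np.log(n)

def mutate(mask, rate=0.1):
    # Flip each inclusion bit with a small probability
    flip = rng.random(p) < rate
    return np.where(flip, 1 - mask, mask)

# Evolutionary loop: keep the best half, refill with mutated copies
pop = rng.integers(0, 2, size=(20, p))
for _ in range(50):
    scores = np.array([bic(m) for m in pop])
    survivors = pop[np.argsort(scores)[:10]]
    children = np.array([mutate(m) for m in survivors])
    pop = np.vstack([survivors, children])

best = pop[np.argmin([bic(m) for m in pop])]
print(best)
```

    The penalty term plays the role of the Bayesian goodness-of-fit criterion in the abstract: components with small contributions do not pay for their complexity cost and tend to be dropped.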

  10. Method for Business Process Management System Selection

    OpenAIRE

    Westelaken, van de, Thijs; Terwee, Bas; Ravesteijn, Pascal

    2013-01-01

    In recent years business process management (BPM) and specifically information systems that support the analysis, design and execution of processes (also called business process management systems (BPMS)) are getting more attention. This has led to an increase in research on BPM and BPMS. However, the research on BPMS is mostly focused on the architecture of the system and how to implement such systems. How to select a BPM system that fits the strategy and goals of a specific organization is ...

  11. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety of application areas, including biotechnology, food, polymer and human health. The book highlights the important nature of modern product and process modelling in the decision-making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  12. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  13. Selection of power market structure using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Subhes Bhattacharyya; Prasanta Kumar Dey

    2003-01-01

    Selection of a power market structure from the available alternatives is an important activity within an overall power sector reform program. The evaluation criteria for selection are both subjective as well as objective in nature and the selection of alternatives is characterised by their conflicting nature. This study demonstrates a methodology for power market structure selection using the analytic hierarchy process, a multiple attribute decision-making technique, to model the selection methodology with the active participation of relevant stakeholders in a workshop environment. The methodology is applied to a hypothetical case of a State Electricity Board reform in India. (author)
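
    A minimal sketch of the core AHP computation referred to above: priority weights from the principal eigenvector of a pairwise comparison matrix, plus Saaty's consistency ratio. The criteria and judgments below are illustrative assumptions, not taken from the study:

```python
import numpy as np

# Pairwise comparison matrix on Saaty's 1-9 scale for three hypothetical
# criteria (e.g. cost, reliability, regulatory fit). Reciprocal by design.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# Priority weights = normalized principal eigenvector
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), compared against
# Saaty's random index RI (0.58 for n = 3); CR < 0.1 is conventionally
# considered acceptable
n = A.shape[0]
lam = eigvals.real[k]
CI = (lam - n) / (n - 1)
CR = CI / 0.58

print(np.round(w, 3), round(CR, 3))
```

    In a workshop setting of the kind the study describes, each stakeholder group would supply its own judgment matrix, and the resulting weight vectors are then aggregated or compared.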

  14. Solvent selection methodology for pharmaceutical processes: Solvent swap

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil; Kumar Tula, Anjan; Gani, Rafiqul

    2016-01-01

    A method for the selection of appropriate solvents for the solvent swap task in pharmaceutical processes has been developed. This solvent swap method is based on the solvent selection method of Gani et al. (2006) and considers additional selection criteria such as boiling point difference...... in pharmaceutical processes as well as new solvent swap alternatives. The method takes into account process considerations such as batch distillation and crystallization to achieve the swap task. Rigorous model based simulations of the swap operation are performed to evaluate and compare the performance...

  15. Material and process selection using product examples

    DEFF Research Database (Denmark)

    Lenau, Torben Anker

    2001-01-01

    The objective of the paper is to suggest a different procedure for selecting materials and processes within the product development work. The procedure includes using product examples in order to increase the number of alternative materials and processes that are considered. Product examples can c...... a search engine, and through hyperlinks relevant materials and processes can be explored. Realising that designers are very sensitive to user interfaces, all descriptions of materials, processes and products include graphical descriptions, i.e. pictures or computer graphics.

  16. Material and process selection using product examples

    DEFF Research Database (Denmark)

    Lenau, Torben Anker

    2002-01-01

    The objective of the paper is to suggest a different procedure for selecting materials and processes within the product development work. The procedure includes using product examples in order to increase the number of alternative materials and processes that are considered. Product examples can c...... a search engine, and through hyperlinks relevant materials and processes can be explored. Realising that designers are very sensitive to user interfaces, all descriptions of materials, processes and products include graphical descriptions, i.e. pictures or computer graphics.

  17. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    are affected (in a positive or negative way) by the presence of the other enzymes and compounds in the media. In this thesis the term multi-enzyme in-pot is adopted for processes that are carried out by the combination of enzymes in a single reactor and implemented at pilot or industrial scale...... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases the efficiency of the model development process with respect to the time and resources needed (fast and effective.... In this way the model parameters that drive the main dynamic behavior can be identified, and thus a better understanding of this type of process gained. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid...

  18. The Process of Marketing Segmentation Strategy Selection

    OpenAIRE

    Ionel Dumitru

    2007-01-01

    The process of marketing segmentation strategy selection represents the essence of strategic marketing. We present hereinafter the main forms of marketing segmentation strategy: undifferentiated marketing, differentiated marketing, concentrated marketing and personalized marketing. In practice, companies use a mix of these marketing segmentation methods in order to maximize profit and to satisfy the consumers’ needs.

  19. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    The present thesis considers numerical modeling of activated sludge tanks on municipal wastewater treatment plants. Focus is aimed at integrated modeling, where the detailed microbiological model, the Activated Sludge Model 3 (ASM3), is combined with a detailed hydrodynamic model based on a numerical solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that often occur in activated sludge tanks are initially...... hydrofoil shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements

  20. Application of numerical modeling of selective NOx reduction by hydrocarbon under diesel transient conditions in consideration of hydrocarbon adsorption and desorption process

    International Nuclear Information System (INIS)

    Watanabe, Y.; Asano, A.; Banno, K.; Yokota, K.; Sugiura, M.

    2001-01-01

    A model of NOx selective reduction by hydrocarbon (HC) was developed, which takes into account the adsorption and desorption of HC. The model was applied for predicting the performance of a De-NOx catalytic reactor working under transient conditions such as a legislative driving cycle. Diesel fuel was used as a supplemental reductant. The behavior of HC and NOx reactions and HC adsorption and desorption has been simulated successfully by our numerical approach under the transient conditions of the simulated Japanese 10-15 driving cycle. Our model is expected to optimize the design of selective diesel NOx reduction systems using diesel fuel as a supplemental reductant.
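
    Models of this kind couple reaction kinetics with an HC storage term on the catalyst surface. A generic Langmuir-type adsorption/desorption balance, driven by a crude stand-in for a transient cycle, can be sketched as follows (the rate constants and pulse pattern are hypothetical, not the paper's model):

```python
import numpy as np

# Surface coverage theta follows
#   d(theta)/dt = k_a * C(t) * (1 - theta) - k_d * theta,
# where C(t) is the transient inlet HC concentration (normalized).
k_a, k_d = 5.0, 0.5          # adsorption / desorption rate constants (1/s)
dt, t_end = 1e-3, 20.0
times = np.arange(0.0, t_end, dt)

def inlet_hc(t):
    # Square-wave stand-in for a driving cycle: HC fed for 5 s,
    # then cut for 5 s so desorption dominates
    return 1.0 if (t % 10.0) < 5.0 else 0.0

theta = 0.0
trace = []
for t in times:
    C = inlet_hc(t)
    theta += dt * (k_a * C * (1.0 - theta) - k_d * theta)  # explicit Euler
    trace.append(theta)
trace = np.array(trace)

# During a pulse, coverage approaches k_a*C/(k_a*C + k_d); once the feed
# is cut, it decays exponentially at rate k_d
print(round(trace.max(), 3), round(trace[-1], 4))
```

    In a full reactor model this storage term would feed the stored HC into the NOx reduction kinetics; here it only illustrates the adsorption/desorption dynamics highlighted in the abstract.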

  1. Selected Tether Applications Cost Model

    Science.gov (United States)

    Keeley, Michael G.

    1988-01-01

    Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.

  2. Selection Process for New Windows | Efficient Windows Collaborative

    Science.gov (United States)


  3. Selection Process for Replacement Windows | Efficient Windows Collaborative

    Science.gov (United States)


  4. Model Process Control Language

    Data.gov (United States)

    National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

  5. Compositional Changes in Selected Minimally Processed Vegetables

    OpenAIRE

    O'Reilly, Emer, (Thesis)

    2000-01-01

    Compositional, physiological and microbiological changes in selected minimally processed vegetables packaged under a modified atmosphere of 2% oxygen and 5% carbon dioxide were monitored over a ten-day storage period at 4 °C and 8 °C. The analysis targeted specific changes in the nutritional, chemical and physiological make-up of the vegetables as well as the changes in the microbial levels. In addition, the changes in the gas atmospheres within the packs were monitored. It has been widely acc...

  6. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  7. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  8. Aerospace Materials Process Modelling

    Science.gov (United States)

    1988-08-01

    Cooling Transformation diagram (CCT diagram): when an IT diagram is used in the heat process modelling, we suppose that a sudden cooling (instantaneous...processes. CE chooses instead to study thermo-mechanical properties referring to a CCT diagram. This is thought to be more reliable to give a true... This determination is however based on the following approximations: i) A CCT diagram is valid only for the

  9. Selected papers on noise and stochastic processes

    CERN Document Server

    1954-01-01

    Six classic papers on stochastic processes, selected to meet the needs of physicists, applied mathematicians, and engineers. Contents: 1. Chandrasekhar, S.: Stochastic Problems in Physics and Astronomy. 2. Uhlenbeck, G. E. and Ornstein, L. S.: On the Theory of the Brownian Motion. 3. Ming Chen Wang and Uhlenbeck, G. E.: On the Theory of the Brownian Motion II. 4. Rice, S. O.: Mathematical Analysis of Random Noise. 5. Kac, Mark: Random Walk and the Theory of Brownian Motion. 6. Doob, J. L.: The Brownian Movement and Stochastic Equations. Unabridged republication of the Dover reprint (1954). Pre

  10. Otolaryngology residency selection process. Medical student perspective.

    Science.gov (United States)

    Stringer, S P; Cassisi, N J; Slattery, W H

    1992-04-01

    In an effort to improve the otolaryngology matching process at the University of Florida, Gainesville, we sought to obtain the medical student's perspective of the current system. All students who interviewed here over a 3-year period were surveyed regarding the application, interview, and ranking process. In addition, suggestions for improving the system were sought from the students. The application and interviewing patterns of the students surveyed were found to be similar to those of the entire otolaryngology residency applicant pool. We were unable to identify any factors that influence a student's rank list that could be prospectively used to help select applicants for interview. A variety of suggestions for improvements in the match were received, several of which could easily be instituted. A uniform interview invitation date as requested by the students could be rapidly implemented and would provide benefits for both the students and the residency programs.

  11. Review and selection of unsaturated flow models

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-09-10

    Under the US Department of Energy (DOE), the Civilian Radioactive Waste Management System Management and Operating Contractor (CRWMS M&O) has the responsibility to review, evaluate, and document existing computer ground-water flow models; to conduct performance assessments; and to develop performance assessment models, where necessary. In the area of scientific modeling, the M&O CRWMS has the following responsibilities: To provide overall management and integration of modeling activities. To provide a framework for focusing modeling and model development. To identify areas that require increased or decreased emphasis. To ensure that the tools necessary to conduct performance assessment are available. These responsibilities are being initiated through a three-step process. It consists of a thorough review of existing models, testing of models which best fit the established requirements, and making recommendations for future development that should be conducted. Future model enhancement will then focus on the models selected during this activity. Furthermore, in order to manage future model development, particularly in those areas requiring substantial enhancement, the three-step process will be updated and reported periodically in the future.

  12. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    , by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction...

  13. Information-Processing Models and Curriculum Design

    Science.gov (United States)

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  14. Multiattribute Supplier Selection Using Fuzzy Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Serhat Aydin

    2010-11-01

    Supplier selection is a multiattribute decision making (MADM) problem which contains both qualitative and quantitative factors. Supplier selection has vital importance for most companies. The aim of this paper is to provide an AHP-based analytical tool for decision support, enabling an effective multicriteria supplier selection process in an air conditioner seller firm under fuzziness. In this article, the Analytic Hierarchy Process (AHP) under fuzziness is employed because it permits an evaluation scale including linguistic expressions, crisp numerical values, fuzzy numbers and range numerical values. This scale provides a more flexible evaluation compared with the other fuzzy AHP methods. In this study, the modified AHP was used in supplier selection in an air conditioner firm. Three experts evaluated the suppliers according to the proposed model and the most appropriate supplier was selected. The proposed model enables decision makers to select the best supplier among supplier firms effectively. We confirm that the modified fuzzy AHP is appropriate for group decision making in supplier selection problems.
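
    One common way to compute fuzzy AHP weights is Buckley's geometric-mean method over triangular fuzzy numbers; note this is a standard variant chosen for illustration, not the paper's modified AHP with its mixed evaluation scale, and the supplier judgments below are assumptions:

```python
import numpy as np

# Pairwise comparisons of three suppliers as triangular fuzzy numbers
# (l, m, u) around Saaty-scale values; reciprocals in the lower triangle.
A = np.array([
    [[1, 1, 1],       [2, 3, 4],       [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],       [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1],   [1, 1, 1]],
])  # shape (3, 3, 3): row, column, (l, m, u)

n = A.shape[0]

# Fuzzy geometric mean of each row, component-wise
r = np.prod(A, axis=1) ** (1.0 / n)          # shape (n, 3)

# Fuzzy weights: r_i divided by sum_j r_j, using the interval rule
# (l_i / sum_u, m_i / sum_m, u_i / sum_l)
s = r.sum(axis=0)                             # (sum_l, sum_m, sum_u)
w_fuzzy = np.column_stack([r[:, 0] / s[2], r[:, 1] / s[1], r[:, 2] / s[0]])

# Defuzzify by the centroid (l + m + u) / 3 and renormalize
w = w_fuzzy.mean(axis=1)
w = w / w.sum()
print(np.round(w, 3))
```

    The fuzzy spread (l, u) carries the linguistic vagueness of the experts' judgments through the computation; defuzzification at the end yields a crisp ranking of the suppliers.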

  15. Expatriates Selection: An Essay of Model Analysis

    Directory of Open Access Journals (Sweden)

    Rui Bártolo-Ribeiro

    2015-03-01

    The business expansion to other geographical areas, with cultures different from those in which organizations were created and developed, leads to the expatriation of employees to these destinations. Recruitment and selection procedures for expatriates do not always have the intended success, leading to an early return of these professionals with consequent organizational disorder. In this study, several articles published in the last five years were analyzed in order to identify the most frequently mentioned dimensions in the selection of expatriates in terms of success and failure. The characteristics in the selection process that may improve prediction of expatriates' adaptation to the new cultural contexts of the same organization were studied according to the KSAOs model. Few references were found concerning the Knowledge, Skills and Abilities dimensions in the analyzed papers. There was a strong predominance of the evaluation of Other Characteristics, and more importance was given to dispositional factors than to situational factors in promoting the integration of the expatriates.

  16. Business process model repositories : efficient process retrieval

    NARCIS (Netherlands)

    Yan, Z.

    2012-01-01

    As organizations increasingly work in a process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a

  17. Multiphysics modelling of manufacturing processes: A review

    DEFF Research Database (Denmark)

    Jabbari, Masoud; Baran, Ismet; Mohanty, Sankhya

    2018-01-01

    Numerical modelling is increasingly supporting the analysis and optimization of manufacturing processes in the production industry. Even though it is mostly applied to multistep processes, single process steps may be so complex by nature that the models needed to describe them must include multiphysics...... the diversity in the field of modelling of manufacturing processes as regards process, materials, generic disciplines as well as length scales: (1) modelling of tape casting for thin ceramic layers, (2) modelling the flow of polymers in extrusion, (3) modelling the deformation process of flexible stamps for nanoimprint lithography, (4) modelling manufacturing of composite parts and (5) modelling the selective laser melting process. For all five examples, the emphasis is on modelling results as well as describing the models in brief mathematical detail. Along with relevant references to the original work

  18. Neuroscientific Model of Motivational Process

    Science.gov (United States)

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  20. Simultaneous data pre-processing and SVM classification model selection based on a parallel genetic algorithm applied to spectroscopic data of olive oils.

    Science.gov (United States)

    Devos, Olivier; Downey, Gerard; Duponchel, Ludovic

    2014-04-01

    Classification is an important task in chemometrics. For several years now, support vector machines (SVMs) have proven to be powerful for infrared spectral data classification. However such methods require optimisation of parameters in order to control the risk of overfitting and the complexity of the boundary. Furthermore, it is established that the prediction ability of classification models can be improved using pre-processing in order to remove unwanted variance in the spectra. In this paper we propose a new methodology based on a genetic algorithm (GA) for the simultaneous optimisation of SVM parameters and pre-processing (GENOPT-SVM). The method has been tested for the discrimination of the geographical origin of Italian olive oil (Ligurian and non-Ligurian) on the basis of near infrared (NIR) or mid infrared (FTIR) spectra. Different classification models (PLS-DA, SVM with mean-centred data, GENOPT-SVM) have been tested and statistically compared using McNemar's statistical test. For the two datasets, SVM with optimised pre-processing gives models with higher accuracy than those obtained with PLS-DA on pre-processed data. In the case of the NIR dataset, most of this accuracy improvement (86.3% compared with 82.8% for PLS-DA) occurred using only a single pre-processing step. For the FTIR dataset, three optimised pre-processing steps are required to obtain an SVM model with a significant accuracy improvement (82.2%) compared to the one obtained with PLS-DA (78.6%). Furthermore, this study demonstrates that even SVM models have to be developed on the basis of well-corrected spectral data in order to obtain higher classification rates. Copyright © 2013 Elsevier Ltd. All rights reserved.
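    The abstract's central idea, optimising pre-processing and classifier settings jointly rather than sequentially, can be sketched with a toy genetic search. The simulated "spectra", the k-NN classifier standing in for the SVM, and all parameter values below are illustrative assumptions, not the GENOPT-SVM implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "spectra": the class signal is masked by a per-sample baseline offset,
# so row mean-centring (a pre-processing step) should be selected.
n, p = 60, 20
X = rng.normal(0, 5.0, size=(n, 1)) + rng.normal(0, 0.3, size=(n, p))
y = np.repeat([0, 1], n // 2)
X[y == 1, 10:] += 1.0                      # class-1 signal in later channels

PREPROC = {0: lambda Z: Z,                                   # no correction
           1: lambda Z: Z - Z.mean(axis=1, keepdims=True)}   # mean-centre rows
K_VALUES = (1, 3, 5)

def fitness(genes):
    """Leave-one-out accuracy of a k-NN classifier (stand-in for the SVM)
    under the pre-processing and k encoded in the chromosome."""
    Z = PREPROC[genes[0]](X)
    k = K_VALUES[genes[1]]
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)            # leave-one-out: exclude self
    nearest = np.argsort(D, axis=1)[:, :k]
    pred = y[nearest].mean(axis=1) > 0.5
    return float(np.mean(pred == (y == 1)))

# Initial population: one chromosome per (pre-processing, k) combination.
pop = [np.array([pr, ki]) for pr in (0, 1) for ki in range(len(K_VALUES))]
for _ in range(4):                          # a few GA generations
    parents = sorted(pop, key=fitness, reverse=True)[:3]   # selection (elitist)
    children = [c.copy() for c in parents]
    for c in children:                      # mutation
        if rng.random() < 0.3:
            c[0] = 1 - c[0]
        if rng.random() < 0.3:
            c[1] = rng.integers(len(K_VALUES))
    pop = parents + children

best = max(pop, key=fitness)
best_acc = fitness(best)
print(best, round(best_acc, 2))
```

    On this toy data the search settles on the mean-centring gene, mirroring the paper's point that the classifier and the spectral correction must be chosen together.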

  1. Predictive Active Set Selection Methods for Gaussian Processes

    DEFF Research Database (Denmark)

    Henao, Ricardo; Winther, Ole

    2012-01-01

    We propose an active set selection framework for Gaussian process classification for cases when the dataset is large enough to render its inference prohibitive. Our scheme consists of a two step alternating procedure of active set update rules and hyperparameter optimization based upon marginal...... high impact to the classifier decision process while removing those that are less relevant. We introduce two active set rules based on different criteria, the first one prefers a model with interpretable active set parameters whereas the second puts computational complexity first, thus a model...... with active set parameters that directly control its complexity. We also provide both theoretical and empirical support for our active set selection strategy being a good approximation of a full Gaussian process classifier. Our extensive experiments show that our approach can compete with state...

  2. Behavioral optimization models for multicriteria portfolio selection

    Directory of Open Access Journals (Sweden)

    Mehlawat Mukesh Kumar

    2013-01-01

    Full Text Available In this paper, behavioral construct of suitability is used to develop a multicriteria decision making framework for portfolio selection. To achieve this purpose, we rely on multiple methodologies. Analytical hierarchy process technique is used to model the suitability considerations with a view to obtaining the suitability performance score in respect of each asset. A fuzzy multiple criteria decision making method is used to obtain the financial quality score of each asset based upon investor's rating on the financial criteria. Two optimization models are developed for optimal asset allocation considering simultaneously financial and suitability criteria. An empirical study is conducted on randomly selected assets from National Stock Exchange, Mumbai, India to demonstrate the effectiveness of the proposed methodology.
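    The AHP step the authors use to score suitability can be illustrated with the standard principal-eigenvector calculation. The 3x3 pairwise comparison matrix below is an invented example, not data from the study.

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical suitability criteria
# (Saaty's 1-9 scale); A[i, j] states how strongly criterion i is
# preferred over criterion j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP priority vector: principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency ratio: CR = ((lambda_max - n)/(n - 1)) / RI, with RI = 0.58
# for n = 3; judgments are conventionally acceptable when CR < 0.1.
n = A.shape[0]
CR = (eigvals.real[k] - n) / (n - 1) / 0.58
print(np.round(w, 3), round(CR, 3))
```

    The resulting weights rank the criteria and would then be combined with the fuzzy financial-quality scores in the optimization models the abstract describes.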

  3. Selection of Activities in Dynamic Business Process Simulation

    Directory of Open Access Journals (Sweden)

    Toma Rusinaitė

    2016-06-01

    Full Text Available Maintaining the dynamicity of business processes is one of the core issues of today's business, as it enables businesses to adapt to a constantly changing environment. Upon changing the processes, it is vital to assess the possible impact, which is achieved by using simulation of dynamic processes. In order to implement dynamicity in business processes, it is necessary to be able to change the components of the process (a set of activities, the content of an activity, a set of activity sequences, a set of rules, performers and resources) or to select them dynamically during execution. This problem has attracted the attention of researchers over the past few years; however, no solution has been proposed that ensures business process (BP) dynamicity. This paper proposes and specifies a dynamic business process (DBP) simulation model which satisfies all of the formulated DBP requirements.

  4. Parameter identification in multinomial processing tree models

    NARCIS (Netherlands)

    Schmittmann, V.D.; Dolan, C.V.; Raijmakers, M.E.J.; Batchelder, W.H.

    2010-01-01

    Multinomial processing tree models form a popular class of statistical models for categorical data that have applications in various areas of psychological research. As in all statistical models, establishing which parameters are identified is necessary for model inference and selection on the basis

  5. Process model repositories and PNML

    NARCIS (Netherlands)

    Hee, van K.M.; Post, R.D.J.; Somers, L.J.A.M.; Werf, van der J.M.E.M.; Kindler, E.

    2004-01-01

    Bringing system and process models together in repositories facilitates the interchange of model information between modelling tools, and allows the combination and interlinking of complementary models. Petriweb is a web application for managing such repositories. It supports hierarchical process

  6. Processing plant persistent strains of Listeria monocytogenes appear to have a lower virulence potential than clinical strains in selected virulence models

    DEFF Research Database (Denmark)

    Jensen, Anne; Thomsen, L.E.; Jørgensen, R.L.

    2008-01-01

    cell line, Caco-2; time to death in a nematode model, Caenorhabditis elegans and in a fruit fly model, Drosophila melanogaster and fecal shedding in a guinea pig model. All strains adhered to and grew in Caco-2 cells at similar levels. When exposed to 10⁶ CFU/ml, two strains representing......% killed C. elegans worms was longer (110 h) for the RAPD type 9 strains than for the other four strains (80 h). The Scott A strain and one RAPD type 9 strain were suspended in whipping cream before being fed to guinea pigs and the persistent RAPD type 9 strain was isolated from feces in a lower level...... to contaminate food products, and it is important to determine their virulence potential to evaluate risk to consumers. We compared the behaviour of food processing persistent and clinical L. monocytogenes strains in four virulence models: Adhesion, invasion and intracellular growth were studied in an epithelial...

  7. Modeling styles in business process modeling

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Zugal, S.; Weber, B.; Weidlich, M.; Fahland, D.; Reijers, H.A.; Mendling, J.; Bider, I.; Halpin, T.; Krogstie, J.; Nurcan, S.; Proper, E.; Schmidt, R.; Soffer, P.; Wrycza, S.

    2012-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models. As a consequence, the question arises whether different ways of creating process models exist. In this vein, we observed 115 students engaged in the act of modeling, recording

  8. Neuroscientific Model of Motivational Process

    Directory of Open Access Journals (Sweden)

    Sung-Il Kim

    2013-03-01

    Full Text Available Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  9. Fundamental Aspects of Selective Melting Additive Manufacturing Processes

    Energy Technology Data Exchange (ETDEWEB)

    van Swol, Frank B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miller, James E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

    Certain details of the additive manufacturing process known as selective laser melting (SLM) affect the performance of the final metal part. To unleash the full potential of SLM it is crucial that the process engineer in the field receives guidance about how to select values for a multitude of process variables employed in the building process. These include, for example, the type of powder (e.g., size distribution, shape, type of alloy), orientation of the build axis, the beam scan rate, the beam power density, the scan pattern and scan rate. The science-based selection of these settings constitutes an intrinsically challenging multi-physics problem involving heating and melting a metal alloy, reactive, dynamic wetting followed by re-solidification. In addition, inherent to the process is its considerable variability that stems from the powder packing. Each time a limited number of powder particles are placed, the stacking is intrinsically different from the previous, possessing a different geometry, and having a different set of contact areas with the surrounding particles. As a result, even if all other process parameters (scan rate, etc.) are exactly the same, the shape and contact geometry and area of the final melt pool will be unique to that particular configuration. This report identifies the most important issues facing SLM, discusses the fundamental physics associated with it and points out how modeling can support the additive manufacturing efforts.

  10. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  11. Quality Quandaries- Time Series Model Selection and Parsimony

    DEFF Research Database (Denmark)

    Bisgaard, Søren; Kulahci, Murat

    2009-01-01

    Some of the issues involved in selecting adequate models for time series data are discussed using an example concerning the number of users of an Internet server. The process of selecting an appropriate model is subjective and requires experience and judgment. The authors believe an important...... consideration in model selection should be parameter parsimony. They favor the use of parsimonious mixed ARMA models, noting that research has shown that a model building strategy that considers only autoregressive representations will lead to non-parsimonious models and to loss of forecasting accuracy....
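    The parsimony argument can be made concrete with a small simulation: nested AR fits always reduce the in-sample residual sum of squares, so an unpenalised criterion would always favour the largest model, while AIC adds a per-parameter penalty. The simulated series and the least-squares fitting shortcut below are illustrative assumptions, not the article's Internet-server data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(1) series: x_t = 0.7 * x_{t-1} + e_t.
T = 400
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.7 * x[t - 1] + rng.normal()

def ar_fit(x, p):
    """Least-squares AR(p) fit; returns (coefficients, RSS, AIC)."""
    Y = x[p:]
    X = np.column_stack([x[p - j:T - j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    rss = float(np.sum((Y - X @ beta) ** 2))
    n = len(Y)
    aic = n * np.log(rss / n) + 2 * (p + 1)   # penalty: p coefficients + variance
    return beta, rss, aic

fits = {p: ar_fit(x, p) for p in (1, 2, 3)}
for p, (beta, rss, aic) in fits.items():
    print(p, round(rss, 1), round(aic, 1))
# RSS shrinks monotonically as lags are added; AIC's penalty typically tips
# the choice back toward the parsimonious AR(1) that generated the data.
```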

  12. Selected sports talent development models

    Directory of Open Access Journals (Sweden)

    Michal Vičar

    2017-06-01

    Full Text Available Background: Sports talent in the Czech Republic is generally viewed as a static, stable phenomenon. It stands in contrast with widespread praxis carried out in Anglo-Saxon countries that emphasise its fluctuant nature. This is reflected in the current models describing its development. Objectives: The aim is to introduce current models of talent development in sport. Methods: Comparison and analysing of the following models: Balyi - Long term athlete development model, Côté - Developmental model of sport participation, Csikszentmihalyi - The flow model of optimal expertise, Bailey and Morley - Model of talent development. Conclusion: Current models of sport talent development approach talent as a dynamic phenomenon, varying in time. They are based in particular on the work of Simonton and his Emergenic and epigenic model and of Gagné and his Differentiated model of giftedness and talent. Balyi's model is characterised by its applicability and implications for practice. Côté's model highlights the role of family and deliberate play. Both models describe periodization of talent development. Csikszentmihalyi's flow model explains how the athlete acquires experience and develops during puberty based on the structure of attention and flow experience. Bailey and Morley's model accents the situational approach to talent and development of skills facilitating its growth.

  13. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes with a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axon, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and biological question that requires the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary differential equations and partial differential equations; background in biology is not required. Students will gain knowledge of how to program with MATLAB without previous programming experience and how to use codes in order to test biological hypotheses.

  14. Selected sports talent development models

    OpenAIRE

    Michal Vičar

    2017-01-01

    Background: Sports talent in the Czech Republic is generally viewed as a static, stable phenomenon. It stands in contrast with widespread praxis carried out in Anglo-Saxon countries that emphasise its fluctuant nature. This is reflected in the current models describing its development. Objectives: The aim is to introduce current models of talent development in sport. Methods: Comparison and analysing of the following models: Balyi - Long term athlete development model, Côté - Developmen...

  15. MODEL SELECTION FOR SPECTROPOLARIMETRIC INVERSIONS

    International Nuclear Information System (INIS)

    Asensio Ramos, A.; Manso Sainz, R.; Martínez González, M. J.; Socas-Navarro, H.; Viticchié, B.; Orozco Suárez, D.

    2012-01-01

    Inferring magnetic and thermodynamic information from spectropolarimetric observations relies on the assumption of a parameterized model atmosphere whose parameters are tuned by comparison with observations. Often, the choice of the underlying atmospheric model is based on subjective reasons. In other cases, complex models are chosen based on objective reasons (for instance, the necessity to explain asymmetries in the Stokes profiles) but it is not clear what degree of complexity is needed. The lack of an objective way of comparing models has, sometimes, led to opposing views of the solar magnetism because the inferred physical scenarios are essentially different. We present the first quantitative model comparison based on the computation of the Bayesian evidence ratios for spectropolarimetric observations. Our results show that there is not a single model appropriate for all profiles simultaneously. Data with moderate signal-to-noise ratios (S/Ns) favor models without gradients along the line of sight. If the observations show clear circular and linear polarization signals above the noise level, models with gradients along the line are preferred. As a general rule, observations with large S/Ns favor more complex models. We demonstrate that the evidence ratios correlate well with simple proxies. Therefore, we propose to calculate these proxies when carrying out standard least-squares inversions to allow for model comparison in the future.
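    The evidence-ratio idea can be sketched on a one-dimensional toy problem: comparing a constant model against a linear one by brute-force marginalisation of the likelihood over uniform priors. The data, noise level, priors and grids below are invented for illustration and have nothing to do with Stokes profiles.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "observations": a linear trend plus noise, analogous to choosing
# between a model atmosphere with and without a gradient along the line
# of sight.
x = np.linspace(0, 1, 30)
sigma = 0.2
y = 0.8 * x + rng.normal(0, sigma, size=x.size)

def loglike(pred):
    """Gaussian log-likelihood of the data given a model prediction."""
    return (-0.5 * np.sum((y - pred) ** 2) / sigma**2
            - y.size * np.log(sigma * np.sqrt(2 * np.pi)))

# Evidence of M0 (constant c) and M1 (line a*x + c): integrate the
# likelihood over uniform priors on [-2, 2] by simple grid summation.
c_grid = np.linspace(-2, 2, 201)
a_grid = np.linspace(-2, 2, 201)
dc = c_grid[1] - c_grid[0]
da = a_grid[1] - a_grid[0]
prior = 1 / 4.0                       # uniform density on [-2, 2]

ev0 = sum(np.exp(loglike(np.full_like(x, c))) * prior * dc for c in c_grid)
ev1 = sum(np.exp(loglike(a * x + c)) * prior**2 * dc * da
          for a in a_grid for c in c_grid)

print("log evidence ratio ln(ev1/ev0) =", round(float(np.log(ev1 / ev0)), 2))
```

    Because the trend is well above the noise, the evidence decisively favours the model with a gradient, echoing the abstract's finding that clear signals above the noise level prefer more complex models, while the built-in Occam penalty protects against over-complex ones when the data are poor.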

  16. Method for Business Process Management System Selection

    NARCIS (Netherlands)

    Thijs van de Westelaken; Bas Terwee; Pascal Ravesteijn

    2013-01-01

    In recent years business process management (BPM) and specifically information systems that support the analysis, design and execution of processes (also called business process management systems (BPMS)) are getting more attention. This has led to an increase in research on BPM and BPMS. However

  17. Refining processes of selected copper alloys

    Directory of Open Access Journals (Sweden)

    S. Rzadkosz

    2009-04-01

    Full Text Available The effectiveness of refining liquid copper and selected copper alloys with various microadditions and special refining substances was analysed. The influence of purifying, modifying and deoxidising operations performed in the metal bath on the properties of selected copper-matrix alloys was examined. The examinations employed refining substances, protecting-purifying slags, and deoxidising and modifying agents containing microadditions of elements such as zirconium, boron, phosphorus, sodium and lithium, or their compounds, introduced in order to change the microstructure and properties of the alloys. Special attention was paid to the macro- and microstructures of the alloys, their tensile strength and elongation, and their hot-cracking sensitivity. Refining effects were assessed by comparing the effectiveness of microstructure changes with property changes in copper and its selected alloys from the group of tin bronzes.

  18. A Computational Model of Selection by Consequences

    Science.gov (United States)

    McDowell, J. J.

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…

  19. On Organizational Adaptation via Dynamic Process Selection

    National Research Council Canada - National Science Library

    Handley, Holly A; Levis, Alexander H

    2000-01-01

    .... An executable organizational model composed of individual models of a five stage interacting decision maker is used to evaluate the effectiveness of the different adaptation strategies on organizational performance...

  20. 7 CFR 1469.6 - Enrollment criteria and selection process.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Enrollment criteria and selection process. 1469.6... General Provisions § 1469.6 Enrollment criteria and selection process. (a) Selection and funding of... existing natural resource, environmental quality, and agricultural activity data along with other...

  1. Generating process model collections

    NARCIS (Netherlands)

    Yan, Z.; Dijkman, R.M.; Grefen, P.W.P.J.

    2017-01-01

    Business process management plays an important role in the management of organizations. More and more organizations describe their operations as business processes. It is common for organizations to have collections of thousands of business processes, but for reasons of confidentiality these

  2. THM-coupled modeling of selected processes in argillaceous rock relevant to rock mechanics; THM-Gekoppelte Modellierung ausgewaehlter gesteinsmechanisch relevanter Prozesse im Tongestein

    Energy Technology Data Exchange (ETDEWEB)

    Czaikowski, Oliver [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Braunschweig (Germany). Repository Safety Research Div.

    2012-08-15

    Scientific investigations in European countries other than Germany concentrate not only on granite formations (Switzerland, Sweden) but also on argillaceous rock formations (France, Switzerland, Belgium) to assess their suitability as host and barrier rock for the final storage of radioactive waste. In Germany, rock salt has been under thorough study as a host rock over the past few decades. According to a study by the German Federal Institute for Geosciences and Natural Resources, however, not only salt deposits but also argillaceous rock deposits are available at relevant depths and of extensions in space which make final storage of high-level radioactive waste basically possible in Germany. Equally qualified findings about the suitability/unsuitability of non-saline rock formations require fundamental studies to be conducted nationally because of the comparatively low level of knowledge. The article presents basic analyses of coupled mechanical and hydraulic properties of argillaceous rock formations as host rock for a repository. The interaction of various processes is explained on the basis of knowledge derived from laboratory studies, and open problems are deduced. For modeling coupled processes, a simplified analytical computation method is proposed and compared with the results of numerical simulations, and the limits to its application are outlined. (orig.)

  3. What makes process models understandable?

    NARCIS (Netherlands)

    Mendling, J.; Reijers, H.A.; Cardoso, J.; Alonso, G.; Dadam, P.; Rosemann, M.

    2007-01-01

    Although formal and informal quality aspects are of significant importance to business process modeling, little empirical work has been reported on process model quality and its impact factors. In this paper we investigate understandability as a proxy for the quality of process models and focus

  4. A computational model of selection by consequences.

    OpenAIRE

    McDowell, J J

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied o...
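    In the spirit of the model described here, a miniature version can be sketched: a repertoire of integer-coded behaviours, selection of parents near a reinforced emission, and bit-flip mutation. The class boundaries, population size, tournament selection and continuous (rather than random-interval) reinforcement below are simplifying assumptions, not McDowell's actual parameter values.

```python
import random

random.seed(3)

# A digital organism: a repertoire of 100 behaviours, each an integer in
# [0, 1023]. Emissions in [0, 40] form the "target class" (the operant).
# After a reinforced emission the repertoire is rebuilt from parents chosen
# near the emitted behaviour (selection); after an unreinforced emission,
# behaviours reproduce at random. Mutation flips one of the 10 bits.
POP, TARGET = 100, 40

def reproduce(rep, emitted, reinforced):
    new = []
    for _ in range(POP):
        if reinforced:
            # fitness-weighted parent choice: a small tournament won by the
            # behaviour closest to the reinforced one
            parent = min(random.sample(rep, 5), key=lambda b: abs(b - emitted))
        else:
            parent = random.choice(rep)
        if random.random() < 0.1:                      # mutation
            parent ^= 1 << random.randrange(10)
        new.append(parent)
    return new

rep = [random.randint(0, 1023) for _ in range(POP)]
hits = 0
for _ in range(3000):
    b = random.choice(rep)          # the organism emits a behaviour
    reinforced = b <= TARGET        # continuous reinforcement for simplicity
    hits += reinforced
    rep = reproduce(rep, b, reinforced)

rate_late = sum(b <= TARGET for b in rep) / POP
print(hits, round(rate_late, 2))
```

    Starting from a uniform repertoire (about a 4% chance of emitting the operant), selection concentrates behaviour in the target class, so the emission rate climbs over the run, the qualitative signature the computational experiments study under reinforcement schedules.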

  5. Economic assessment model architecture for AGC/AVLIS selection

    International Nuclear Information System (INIS)

    Hoglund, R.L.

    1984-01-01

    The economic assessment model architecture described provides the flexibility and completeness in economic analysis that the selection between AGC and AVLIS demands. Process models which are technology-specific will provide the first-order responses of process performance and cost to variations in process parameters. The economics models can be used to test the impacts of alternative deployment scenarios for a technology. Enterprise models provide global figures of merit for evaluating the DOE perspective on the uranium enrichment enterprise, and business analysis models compute the financial parameters from the private investor's viewpoint

  6. A criterion for selecting renewable energy processes

    International Nuclear Information System (INIS)

    Searcy, Erin; Flynn, Peter C.

    2010-01-01

    We propose that minimum incremental cost per unit of greenhouse gas (GHG) reduction, in essence the carbon credit required to economically sustain a renewable energy plant, is the most appropriate social criterion for choosing from a myriad of alternatives. The application of this criterion is illustrated for four processing alternatives for straw/corn stover: production of power by direct combustion and biomass integrated gasification and combined cycle (BIGCC), and production of transportation fuel via lignocellulosic ethanol and Fischer-Tropsch (FT) syndiesel. Ethanol requires a lower carbon credit than FT, and direct combustion a lower credit than BIGCC. For comparing processes that make a different form of end-use energy, in this study ethanol vs. electrical power via direct combustion, the lowest carbon credit depends on the relative values of the two energy forms. When power is worth $70 MWh⁻¹, ethanol production has a lower required carbon credit at oil prices greater than $600 t⁻¹ ($80 bbl⁻¹). (author)
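    The criterion reduces to simple arithmetic: divide the extra cost of the renewable route by the greenhouse-gas reduction it buys. The function below sketches this; the plant costs and emission factors are illustrative placeholders, not figures from the study.

```python
def required_carbon_credit(cost_renewable, cost_fossil,
                           ghg_fossil, ghg_renewable):
    """Carbon credit ($/t CO2e) at which the renewable process breaks even.

    Costs are in $ per unit of delivered energy; GHG intensities are in
    t CO2e per the same unit.
    """
    extra_cost = cost_renewable - cost_fossil
    ghg_reduction = ghg_fossil - ghg_renewable
    return extra_cost / ghg_reduction

# e.g. biomass power at $95/MWh vs coal at $55/MWh, with emission
# intensities of 1.0 vs 0.05 t CO2e/MWh (made-up numbers):
credit = required_carbon_credit(95.0, 55.0, 1.0, 0.05)
print(round(credit, 2))   # $/t CO2e
```

    Ranking alternatives by this number, rather than by cost or GHG reduction alone, is exactly the selection rule the abstract argues for.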

  7. A Dynamic Model for Limb Selection

    NARCIS (Netherlands)

    Cox, R.F.A; Smitsman, A.W.

    2008-01-01

    Two experiments and a model on limb selection are reported. In Experiment 1 left-handed and right-handed participants (N = 36) repeatedly used one hand for grasping a small cube. After a clear switch in the cube’s location, perseverative limb selection was revealed in both handedness groups. In

  8. Processes in arithmetic strategy selection: A fMRI study.

    Directory of Open Access Journals (Sweden)

    Julien eTaillan

    2015-02-01

This neuroimaging (fMRI) study investigated neural correlates of strategy selection. Young adults performed an arithmetic task in two different conditions. In both conditions, participants had to provide estimates of two-digit multiplication problems like 54 x 78. In the choice condition, participants had to select the better of two available rounding strategies, the rounding-up strategy (RU) (i.e., doing 60 x 80 = 4,800) or the rounding-down strategy (RD) (i.e., doing 50 x 70 = 3,500) to estimate the product of 54 x 78. In the no-choice condition, participants did not have to select a strategy on each problem but were told which strategy to use; they executed RU and RD strategies each on a series of problems. Participants also had a control task (i.e., providing correct products of multiplication problems like 40 x 50). Brain activations and performance were analyzed as a function of these conditions. Participants were able to frequently choose the better strategy in the choice condition; they were also slower when they executed the difficult RU than the easier RD. Neuroimaging data showed greater brain activations in right anterior cingulate cortex (ACC), dorso-lateral prefrontal cortex (DLPFC), and angular gyrus (ANG) when selecting (relative to executing) the better strategy on each problem. Moreover, RU was associated with more parietal cortex activation than RD. These results suggest an important role of the fronto-parietal network in strategy selection and have important implications for our further understanding and modelling of cognitive processes underlying strategy selection.

  9. Processes in arithmetic strategy selection: a fMRI study.

    Science.gov (United States)

    Taillan, Julien; Ardiale, Eléonore; Anton, Jean-Luc; Nazarian, Bruno; Félician, Olivier; Lemaire, Patrick

    2015-01-01

    This neuroimaging (functional magnetic resonance imaging) study investigated neural correlates of strategy selection. Young adults performed an arithmetic task in two different conditions. In both conditions, participants had to provide estimates of two-digit multiplication problems like 54 × 78. In the choice condition, participants had to select the better of two available rounding strategies, rounding-up (RU) strategy (i.e., doing 60 × 80 = 4,800) or rounding-down (RD) strategy (i.e., doing 50 × 70 = 3,500 to estimate product of 54 × 78). In the no-choice condition, participants did not have to select strategy on each problem but were told which strategy to use; they executed RU and RD strategies each on a series of problems. Participants also had a control task (i.e., providing correct products of multiplication problems like 40 × 50). Brain activations and performance were analyzed as a function of these conditions. Participants were able to frequently choose the better strategy in the choice condition; they were also slower when they executed the difficult RU than the easier RD. Neuroimaging data showed greater brain activations in right anterior cingulate cortex (ACC), dorso-lateral prefrontal cortex (DLPFC), and angular gyrus (ANG), when selecting (relative to executing) the better strategy on each problem. Moreover, RU was associated with more parietal cortex activation than RD. These results suggest an important role of fronto-parietal network in strategy selection and have important implications for our further understanding and modeling cognitive processes underlying strategy selection.

  10. Review and selection of unsaturated flow models

    Energy Technology Data Exchange (ETDEWEB)

Reeves, M.; Baker, N.A.; Duguid, J.O. [INTERA, Inc., Las Vegas, NV (United States)]

    1994-04-04

Since the 1960s, ground-water flow models have been used for analysis of water resources problems. In the 1970s, emphasis began to shift to analysis of waste management problems. This shift in emphasis was largely brought about by site selection activities for geologic repositories for disposal of high-level radioactive wastes. Model development during the 1970s and well into the 1980s focused primarily on saturated ground-water flow because geologic repositories in salt, basalt, granite, shale, and tuff were envisioned to be below the water table. Selection of the unsaturated zone at Yucca Mountain, Nevada, for potential disposal of waste began to shift model development toward unsaturated flow models. Under the US Department of Energy (DOE), the Civilian Radioactive Waste Management System Management and Operating Contractor (CRWMS M&O) has the responsibility to review, evaluate, and document existing computer models; to conduct performance assessments; and to develop performance assessment models, where necessary. This document describes the CRWMS M&O approach to model review and evaluation (Chapter 2), and the requirements for unsaturated flow models which are the bases for selection from among the current models (Chapter 3). Chapter 4 identifies existing models and their characteristics. Through a detailed examination of characteristics, Chapter 5 presents the selection of models for testing. Chapter 6 discusses the testing and verification of selected models. Chapters 7 and 8 give conclusions and make recommendations, respectively. Chapter 9 records the major references for each of the models reviewed. Appendix A, a collection of technical reviews for each model, contains a more complete list of references. Finally, Appendix B characterizes the problems used for model testing.

  11. Review and selection of unsaturated flow models

    International Nuclear Information System (INIS)

    Reeves, M.; Baker, N.A.; Duguid, J.O.

    1994-01-01

Since the 1960s, ground-water flow models have been used for analysis of water resources problems. In the 1970s, emphasis began to shift to analysis of waste management problems. This shift in emphasis was largely brought about by site selection activities for geologic repositories for disposal of high-level radioactive wastes. Model development during the 1970s and well into the 1980s focused primarily on saturated ground-water flow because geologic repositories in salt, basalt, granite, shale, and tuff were envisioned to be below the water table. Selection of the unsaturated zone at Yucca Mountain, Nevada, for potential disposal of waste began to shift model development toward unsaturated flow models. Under the US Department of Energy (DOE), the Civilian Radioactive Waste Management System Management and Operating Contractor (CRWMS M&O) has the responsibility to review, evaluate, and document existing computer models; to conduct performance assessments; and to develop performance assessment models, where necessary. This document describes the CRWMS M&O approach to model review and evaluation (Chapter 2), and the requirements for unsaturated flow models which are the bases for selection from among the current models (Chapter 3). Chapter 4 identifies existing models and their characteristics. Through a detailed examination of characteristics, Chapter 5 presents the selection of models for testing. Chapter 6 discusses the testing and verification of selected models. Chapters 7 and 8 give conclusions and make recommendations, respectively. Chapter 9 records the major references for each of the models reviewed. Appendix A, a collection of technical reviews for each model, contains a more complete list of references. Finally, Appendix B characterizes the problems used for model testing.

  12. Neuroscientific Model of Motivational Process

    OpenAIRE

    Kim, Sung-il

    2013-01-01

Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub-processes: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub-processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Rewa...

  13. Using Card Games to Simulate the Process of Natural Selection

    Science.gov (United States)

    Grilliot, Matthew E.; Harden, Siegfried

    2014-01-01

    In 1858, Darwin published "On the Origin of Species by Means of Natural Selection." His explanation of evolution by natural selection has become the unifying theme of biology. We have found that many students do not fully comprehend the process of evolution by natural selection. We discuss a few simple games that incorporate hands-on…

  14. Intermediate product selection and blending in the food processing industry

    NARCIS (Netherlands)

    Kilic, Onur A.; Akkerman, Renzo; van Donk, Dirk Pieter; Grunow, Martin

    2013-01-01

    This study addresses a capacitated intermediate product selection and blending problem typical for two-stage production systems in the food processing industry. The problem involves the selection of a set of intermediates and end-product recipes characterising how those selected intermediates are

  15. Intermediate product selection and blending in the food processing industry

    DEFF Research Database (Denmark)

    Kilic, Onur A.; Akkerman, Renzo; van Donk, Dirk Pieter

    2013-01-01

    This study addresses a capacitated intermediate product selection and blending problem typical for two-stage production systems in the food processing industry. The problem involves the selection of a set of intermediates and end-product recipes characterising how those selected intermediates...

  16. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. In...by ARL modelers. 2. Development Environment The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5.* Python was selected...because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool. In

  17. Model Selection with the Linear Mixed Model for Longitudinal Data

    Science.gov (United States)

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…

  18. Genetic search feature selection for affective modeling

    DEFF Research Database (Denmark)

    Martínez, Héctor P.; Yannakakis, Georgios N.

    2010-01-01

Automatic feature selection is a critical step towards the generation of successful computational models of affect. This paper presents a genetic search-based feature selection method which is developed as a global-search algorithm for improving the accuracy of the affective models built. The method is tested and compared against sequential forward feature selection and random search in a dataset derived from a game survey experiment which contains bimodal input features (physiological and gameplay) and expressed pairwise preferences of affect. Results suggest that the proposed method...

  19. Selection of water treatment processes special study

    International Nuclear Information System (INIS)

    1991-11-01

Characterization of the level and extent of groundwater contamination in the vicinity of Title I mill sites began during the surface remedial action stage (Phase I) of the Uranium Mill Tailings Remedial Action (UMTRA) Project. Some of the contamination in the aquifer(s) at the abandoned sites is attributable to milling activities during the years the mills were in operation. The restoration of contaminated aquifers is to be undertaken in Phase II of the UMTRA Project. To begin implementation of Phase II, DOE requested that groundwater restoration methods and technologies be investigated by the Technical Assistance Contractor (TAC), and that the results of the TAC investigations be documented in special study reports. Many active and passive methods are available to clean up contaminated groundwater. Passive groundwater treatment includes natural flushing, geochemical barriers, and gradient manipulation by stream diversion or slurry walls. Active groundwater cleanup techniques include gradient manipulation by well extraction or injection, in-situ biological or chemical reclamation, and extraction and treatment. Although some or all of the methods listed above may play a role in the groundwater cleanup phase of the UMTRA Project, the extraction and treatment (pump and treat) option is the only restoration alternative discussed in this report. Hence, all sections of this report relate either directly or indirectly to the technical discipline of process engineering.

  20. Selection of basic data for numerical modeling of rock mass stress state at Mirny Mining and Processing Works, Alrosa Group of Companies

    Science.gov (United States)

    Bokiy, IB; Zoteev, OV; Pul, VV; Pul, EK

    2018-03-01

    The influence of structural features on the strength and elasticity modulus is studied in rock mass in the area of Mirny Mining and Processing Works. The authors make recommendations on the values of physical properties of rocks.

  1. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.

  2. A SUPPLIER SELECTION MODEL FOR SOFTWARE DEVELOPMENT OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Hancu Lucian-Viorel

    2010-12-01

This paper presents a multi-criteria decision-making model used for supplier selection for software development outsourcing on e-marketplaces. This model can be used in auctions. The supplier selection process has become complex and difficult over the last twenty years, since the Internet plays an important role in business management. Companies have to concentrate their efforts on their core activities, and the other activities should be realized by outsourcing. They can achieve significant cost reductions by using e-marketplaces in their purchase process and by using decision support systems for supplier selection. In the literature, many approaches have been proposed for the supplier evaluation and selection process. The performance of potential suppliers is evaluated using multi-criteria decision-making methods rather than considering a single factor such as cost.
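As a minimal stand-in for the multi-criteria evaluation this abstract describes, a weighted-sum scoring of hypothetical suppliers can be sketched as follows (the criteria, weights, and ratings are invented; the paper's actual model is richer):

```python
# Minimal multi-criteria scoring sketch: a simple weighted sum standing in
# for the fuller MCDM methods the paper discusses. All values are invented.

weights = {"cost": 0.4, "quality": 0.35, "delivery": 0.25}  # sum to 1
suppliers = {
    "A": {"cost": 0.7, "quality": 0.9, "delivery": 0.6},
    "B": {"cost": 0.9, "quality": 0.6, "delivery": 0.8},
}

def score(ratings):
    """Weighted sum of normalized criterion ratings."""
    return sum(weights[c] * ratings[c] for c in weights)

ranked = sorted(suppliers, key=lambda s: score(suppliers[s]), reverse=True)
print(ranked[0], round(score(suppliers[ranked[0]]), 3))
```

A decision support system would elicit the weights from the buyer rather than hard-coding them, but the ranking step stays the same.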

  3. Laser dimpling process parameters selection and optimization using surrogate-driven process capability space

    Science.gov (United States)

    Ozkat, Erkan Caner; Franciosa, Pasquale; Ceglarek, Dariusz

    2017-08-01

Remote laser welding technology offers opportunities for high production throughput at a competitive cost. However, the remote laser welding process of zinc-coated sheet metal parts in lap joint configuration poses a challenge due to the difference between the melting temperature of the steel (∼1500 °C) and the vapourizing temperature of the zinc (∼907 °C). In fact, the zinc layer at the faying surface is vapourized, and the vapour might be trapped within the melting pool, leading to weld defects. Various solutions have been proposed to overcome this problem over the years. Among them, laser dimpling has been adopted by manufacturers because of its flexibility and effectiveness, along with its cost advantages. In essence, the dimple works as a spacer between the two sheets in the lap joint and allows the zinc vapour to escape during the welding process, thereby preventing weld defects. However, there is a lack of comprehensive characterization of the dimpling process for effective implementation in a real manufacturing system taking into consideration inherent changes in the variability of process parameters. This paper introduces a methodology to develop (i) a surrogate model for dimpling process characterization considering a multiple-input (i.e. key control characteristics) and multiple-output (i.e. key performance indicators) system by conducting physical experimentation and using multivariate adaptive regression splines; (ii) a process capability space (Cp-Space) based on the developed surrogate model that allows the estimation of a desired process fallout rate in the case of violation of process requirements in the presence of stochastic variation; and (iii) selection and optimization of the process parameters based on the process capability space. The proposed methodology provides a unique capability to: (i) simulate the effect of process variation as generated by the manufacturing process; (ii) model quality requirements with multiple and coupled quality requirements; and (iii

  4. The Added Value of the Project Selection Process

    Directory of Open Access Journals (Sweden)

    Adel Oueslati

    2016-06-01

The project selection process comes at the first stage of the overall project management life cycle, and it has a very important impact on organization success. The present paper provides definitions of the basic concepts and tools related to the project selection process. It aims to stress the added value of this process for the success of the entire organization. Mastery of the project selection process is the right way for any organization to ensure that it will do the right projects with the right resources at the right time and within the right priorities.

  5. Fermentation process diagnosis using a mathematical model

    Energy Technology Data Exchange (ETDEWEB)

    Yerushalmi, L; Volesky, B; Votruba, J

    1988-09-01

    Intriguing physiology of a solvent-producing strain of Clostridium acetobutylicum led to the synthesis of a mathematical model of the acetone-butanol fermentation process. The model presented is capable of describing the process dynamics and the culture behavior during a standard and a substandard acetone-butanol fermentation. In addition to the process kinetic parameters, the model includes the culture physiological parameters, such as the cellular membrane permeability and the number of membrane sites for active transport of sugar. Computer process simulation studies for different culture conditions used the model, and quantitatively pointed out the importance of selected culture parameters that characterize the cell membrane behaviour and play an important role in the control of solvent synthesis by the cell. The theoretical predictions by the new model were confirmed by experimental determination of the cellular membrane permeability.
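A hedged sketch of the kind of kinetic model such process diagnosis builds on: logistic biomass growth with growth-associated product formation, integrated with explicit Euler steps. The parameters are arbitrary placeholders, not the paper's acetone-butanol model, which additionally tracks membrane permeability and active-transport sites:

```python
# A minimal stand-in for a fermentation kinetic model: logistic biomass
# growth plus growth-associated product formation, integrated by explicit
# Euler steps. Parameter values are arbitrary, not from the paper.

mu_max, x_max, y_px = 0.5, 10.0, 0.3   # 1/h, g/L, g product per g biomass
dt, x, p = 0.01, 0.1, 0.0              # step (h), biomass, product (g/L)

for _ in range(int(24 / dt)):          # simulate 24 h
    growth = mu_max * x * (1 - x / x_max)  # logistic growth rate dX/dt
    x += dt * growth
    p += dt * y_px * growth                # dP/dt proportional to growth

print(round(x, 2), round(p, 2))
```

Diagnosis in the paper's sense then amounts to fitting such state equations to measured trajectories and inspecting the estimated physiological parameters for substandard runs.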

  6. Applying Four Different Risk Models in Local Ore Selection

    International Nuclear Information System (INIS)

    Richmond, Andrew

    2002-01-01

Given the uncertainty in grade at a mine location, a financially risk-averse decision-maker may prefer to incorporate this uncertainty into the ore selection process. A FORTRAN program, risksel, is presented to calculate local risk-adjusted optimal ore selections using a negative exponential utility function and three dominance models: mean-variance, mean-downside risk, and stochastic dominance. All four methods are demonstrated in a grade control environment. In the case study, optimal selections vary with the magnitude of financial risk that a decision-maker is prepared to accept. Except for the stochastic dominance method, the risk models reassign material from higher-cost to lower-cost processing options as the aversion to financial risk increases. The stochastic dominance model was usually unable to determine the optimal local selection.
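For a negative exponential utility and normally distributed profit, expected-utility ranking reduces to comparing certainty equivalents, CE = mean - (lambda/2) * variance. A small sketch with invented block statistics (not output from risksel) shows the reassignment effect the case study reports:

```python
# Risk-adjusted ore selection sketch with a negative exponential utility.
# For Gaussian profit, options rank by CE = mean - 0.5 * lambda * variance.
# The (mean, variance) figures per destination are hypothetical.

def certainty_equivalent(mean, variance, risk_aversion):
    """CE of Gaussian profit under U(w) = -exp(-risk_aversion * w)."""
    return mean - 0.5 * risk_aversion * variance

options = {"mill": (5.0, 9.0), "waste dump": (1.0, 0.25)}

results = {}
for lam in (0.0, 1.2):  # risk-neutral vs. risk-averse decision-maker
    results[lam] = max(
        options, key=lambda o: certainty_equivalent(*options[o], lam))
print(results)
```

With these numbers the risk-neutral decision-maker sends the block to the mill, while sufficient risk aversion flips the choice to the low-variance destination, mirroring the reassignment described above.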

  7. The genealogy of samples in models with selection.

    Science.gov (United States)

    Neuhauser, C; Krone, S M

    1997-02-01

We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, then the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as the K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case.
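The neutral baseline here, Kingman's coalescent, is straightforward to simulate: while k lineages remain, the time to the next coalescence is exponential with rate k(k-1)/2 in coalescent time units, which gives the closed form E[TMRCA] = 2(1 - 1/n). A quick sanity check:

```python
# Simulating the neutral (Kingman) coalescent: successive coalescence
# waiting times are exponential with rate k(k-1)/2 for k current lineages.
import random

def sample_tmrca(n, rng):
    """One draw of the time to the most recent common ancestor of n genes."""
    t = 0.0
    for k in range(n, 1, -1):
        t += rng.expovariate(k * (k - 1) / 2)
    return t

expected = 2 * (1 - 1 / 10)            # closed form: E[TMRCA] = 2(1 - 1/n)
rng = random.Random(0)
sim = sum(sample_tmrca(10, rng) for _ in range(20000)) / 20000
print(expected, round(sim, 2))
```

The ancestral selection graph replaces this pure-coalescing process with one that also branches backwards in time, which is why its simulation is more involved.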

  8. Melody Track Selection Using Discriminative Language Model

    Science.gov (United States)

    Wu, Xiao; Li, Ming; Suo, Hongbin; Yan, Yonghong

    In this letter we focus on the task of selecting the melody track from a polyphonic MIDI file. Based on the intuition that music and language are similar in many aspects, we solve the selection problem by introducing an n-gram language model to learn the melody co-occurrence patterns in a statistical manner and determine the melodic degree of a given MIDI track. Furthermore, we propose the idea of using background model and posterior probability criteria to make modeling more discriminative. In the evaluation, the achieved 81.6% correct rate indicates the feasibility of our approach.
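A toy illustration of the letter's core idea (the training melodies, pitches, and smoothing choice below are invented): train an n-gram model, here a bigram over MIDI pitches, on known melody lines, then pick the candidate track with the highest average log-probability.

```python
# Toy melody-track selection: score each candidate MIDI track under a
# bigram model trained on example melodies; the highest-scoring track is
# taken as the melody. Training data and add-one smoothing are illustrative.
import math
from collections import Counter

def train_bigram(sequences):
    counts, context = Counter(), Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
            context[a] += 1
    return counts, context

def avg_logprob(seq, counts, context, vocab=128):
    # add-one smoothing over the 128 MIDI pitch values
    lp = [math.log((counts[(a, b)] + 1) / (context[a] + vocab))
          for a, b in zip(seq, seq[1:])]
    return sum(lp) / len(lp)

melodies = [[60, 62, 64, 65, 64, 62, 60], [67, 65, 64, 62, 60]]
counts, context = train_bigram(melodies)

tracks = {"melody?": [60, 62, 64, 62, 60], "bass?": [36, 36, 43, 36, 36]}
best = max(tracks, key=lambda t: avg_logprob(tracks[t], counts, context))
print(best)
```

The letter's discriminative refinement would additionally normalize each track's score against a background model before comparing, rather than using the raw language-model score alone.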

  9. Model selection for Gaussian kernel PCA denoising

    DEFF Research Database (Denmark)

    Jørgensen, Kasper Winther; Hansen, Lars Kai

    2012-01-01

    We propose kernel Parallel Analysis (kPA) for automatic kernel scale and model order selection in Gaussian kernel PCA. Parallel Analysis [1] is based on a permutation test for covariance and has previously been applied for model order selection in linear PCA, we here augment the procedure to also...... tune the Gaussian kernel scale of radial basis function based kernel PCA.We evaluate kPA for denoising of simulated data and the US Postal data set of handwritten digits. We find that kPA outperforms other heuristics to choose the model order and kernel scale in terms of signal-to-noise ratio (SNR...

  10. Selected missense mutations impair frataxin processing in Friedreich ataxia.

    Science.gov (United States)

    Clark, Elisia; Butler, Jill S; Isaacs, Charles J; Napierala, Marek; Lynch, David R

    2017-08-01

Frataxin (FXN) is a highly conserved mitochondrial protein. Reduced FXN levels cause Friedreich ataxia, a recessive neurodegenerative disease. Typical patients carry GAA repeat expansions on both alleles, while a subgroup of patients carry a missense mutation on one allele and a GAA repeat expansion on the other. Here, we report that selected disease-related FXN missense mutations impair FXN localization, interaction with mitochondrial processing peptidase, and processing. Immunocytochemical studies and subcellular fractionation were performed to study FXN import into the mitochondria and examine the mechanism by which mutations impair FXN processing. Coimmunoprecipitation was performed to study the interaction between FXN and mitochondrial processing peptidase. A proteasome inhibitor was used to model traditional therapeutic strategies. In addition, clinical profiles of subjects with and without point mutations were compared in a large natural history study. FXN I154F and FXN G130V missense mutations decrease FXN 81-210 levels compared with FXN WT, FXN R165C, and FXN W155R, but do not block its association with mitochondria. FXN I154F and FXN G130V also impair FXN maturation and enhance the binding between FXN 42-210 and mitochondrial processing peptidase. Furthermore, blocking proteasomal degradation does not increase FXN 81-210 levels. Additionally, impaired FXN processing also occurs in fibroblasts from patients with FXN G130V. Finally, clinical data from patients with FXN G130V and FXN I154F mutations demonstrate a lower severity compared with other individuals with Friedreich ataxia. These data suggest that the effects on processing associated with FXN G130V and FXN I154F mutations lead to higher levels of partially processed FXN, which may contribute to the milder clinical phenotypes in these patients.

  11. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  12. Model Identification of Integrated ARMA Processes

    Science.gov (United States)

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…

  13. Selecting public relations personnel of hospitals by analytic network process.

    Science.gov (United States)

    Liao, Sen-Kuei; Chang, Kuei-Lun

    2009-01-01

This study describes the use of the analytic network process (ANP) in the Taiwanese hospital public relations personnel selection process. Starting with interviews of 48 practitioners and executives in north Taiwan, we collected selection criteria. We then retained the 12 critical criteria that were mentioned more than 40 times by these respondents: interpersonal skill, experience, negotiation, language, ability to follow orders, cognitive ability, adaptation to environment, adaptation to company, emotion, loyalty, attitude, and response. Finally, we discussed with the 20 executives how to group these important criteria into three perspectives to structure the hierarchy for hospital public relations personnel selection. After discussing with practitioners and executives, we find that the selection criteria are interrelated. The ANP, which incorporates interdependence relationships, is a new approach for multi-criteria decision-making. Thus, we apply ANP to select the optimal public relations personnel for hospitals. An empirical study of public relations personnel selection problems in Taiwan hospitals is conducted to illustrate how the selection procedure works.
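The eigenvector step that underlies both AHP and ANP can be sketched with a single pairwise-comparison matrix; full ANP additionally assembles such priority vectors into a supermatrix to capture the interdependence mentioned above. The judgments below are invented for three of the study's criteria:

```python
# Priority weights from a pairwise-comparison matrix via the power method,
# i.e. the principal-eigenvector step shared by AHP and ANP. The pairwise
# judgments for the three criteria are hypothetical.

def priority_vector(m, iters=100):
    """Normalized principal eigenvector of a positive square matrix."""
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# interpersonal skill vs. experience vs. language (Saaty 1-9 scale):
m = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]
w = priority_vector(m)
print([round(x, 3) for x in w])
```

In ANP, one such vector is computed for each cluster-to-cluster influence, and the limit of the resulting supermatrix yields the overall candidate ranking.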

  14. Selection Criteria in Regime Switching Conditional Volatility Models

    Directory of Open Access Journals (Sweden)

    Thomas Chuffart

    2015-05-01

A large number of nonlinear conditional heteroskedastic models have been proposed in the literature, and model selection is crucial to any statistical data analysis. In this article, we investigate whether the most commonly used selection criteria lead to the choice of the right specification in a regime-switching framework. We focus on two types of models: the Logistic Smooth Transition GARCH and the Markov-Switching GARCH models. Simulation experiments reveal that information criteria and loss functions can lead to misspecification; BIC sometimes indicates the wrong regime-switching framework. Depending on the data generating process used in the experiments, great care is needed when choosing a criterion.
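The information criteria examined here are computed directly from a fitted model's maximized log-likelihood. With the hypothetical fits below (not the article's simulation results), AIC and BIC even disagree, which is exactly the kind of divergence such experiments probe:

```python
# AIC and BIC from maximized log-likelihoods. The two "fits" of competing
# regime-switching specifications are hypothetical numbers for illustration.
import math

def aic(loglik, k):
    """Akaike information criterion: 2k - 2*loglik (lower is better)."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Bayesian information criterion: k*ln(n) - 2*loglik (lower is better)."""
    return k * math.log(n) - 2 * loglik

# (log-likelihood, number of parameters) on n = 1000 observations:
fits = {"LST-GARCH": (-1402.0, 7), "MS-GARCH": (-1398.5, 9)}
n = 1000
for name, (ll, k) in fits.items():
    print(name, round(aic(ll, k), 1), round(bic(ll, k, n), 1))
```

Here AIC favors the better-fitting but larger MS-GARCH, while BIC's heavier penalty (ln 1000 ≈ 6.9 per parameter) favors the smaller LST-GARCH, illustrating why the choice of criterion itself matters.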

  15. Methods for model selection in applied science and engineering.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2004-10-01

    Mathematical models are developed and used to study the properties of complex systems and/or modify these systems to satisfy some performance requirements in just about every area of applied science and engineering. A particular reason for developing a model, e.g., performance assessment or design, is referred to as the model use. Our objective is the development of a methodology for selecting a model that is sufficiently accurate for an intended use. Information on the system being modeled is, in general, incomplete, so that there may be two or more models consistent with the available information. The collection of these models is called the class of candidate models. Methods are developed for selecting the optimal member from a class of candidate models for the system. The optimal model depends on the available information, the selected class of candidate models, and the model use. Classical methods for model selection, including the method of maximum likelihood and Bayesian methods, as well as a method employing a decision-theoretic approach, are formulated to select the optimal model for numerous applications. There is no requirement that the candidate models be random. Classical methods for model selection ignore model use and require data to be available. Examples are used to show that these methods can be unreliable when data is limited. The decision-theoretic approach to model selection does not have these limitations, and model use is included through an appropriate utility function. This is especially important when modeling high risk systems, where the consequences of using an inappropriate model for the system can be disastrous. The decision-theoretic method for model selection is developed and applied for a series of complex and diverse applications. These include the selection of the: (1) optimal order of the polynomial chaos approximation for non-Gaussian random variables and stationary stochastic processes, (2) optimal pressure load model to be

  16. Conceptual models of information processing

    Science.gov (United States)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  17. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. Main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  18. Homology modeling, docking studies and molecular dynamic simulations using graphical processing unit architecture to probe the type-11 phosphodiesterase catalytic site: a computational approach for the rational design of selective inhibitors.

    Science.gov (United States)

    Cichero, Elena; D'Ursi, Pasqualina; Moscatelli, Marco; Bruno, Olga; Orro, Alessandro; Rotolo, Chiara; Milanesi, Luciano; Fossa, Paola

    2013-12-01

    Phosphodiesterase 11 (PDE11) is the latest isoform of the PDEs family to be identified, acting on both cyclic adenosine monophosphate and cyclic guanosine monophosphate. The initial reports of PDE11 found evidence for PDE11 expression in skeletal muscle, prostate, testis, and salivary glands; however, the tissue distribution of PDE11 still remains a topic of active study and some controversy. Given the sequence similarity between PDE11 and PDE5, several PDE5 inhibitors have been shown to cross-react with PDE11. Accordingly, many non-selective inhibitors, such as IBMX, zaprinast, sildenafil, and dipyridamole, have been documented to inhibit PDE11. Only recently, a series of dihydrothieno[3,2-d]pyrimidin-4(3H)-one derivatives proved to be selective toward the PDE11 isoform. In the absence of experimental data about PDE11 X-ray structures, we found it interesting to gain a better understanding of the enzyme-inhibitor interactions using in silico simulations. In this work, we describe a computational approach based on homology modeling, docking, and molecular dynamics simulation to derive a predictive 3D model of PDE11. Using a Graphical Processing Unit architecture, it is possible to perform long simulations, find stable interactions involved in the complex, and finally to suggest guidelines for the identification and synthesis of potent and selective inhibitors. © 2013 John Wiley & Sons A/S.

  19. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  20. Modeling nuclear processes by Simulink

    International Nuclear Information System (INIS)

    Rashid, Nahrul Khair Alang Md

    2015-01-01

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
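
    As a text-only counterpart to such a Simulink block diagram, the radionuclide decay example reduces to a single gain feeding an integrator. The sketch below (the decay constant and initial population are illustrative values, not from the paper) integrates dN/dt = -λN by forward Euler and checks the result against the analytic solution.

```python
import math

# Radionuclide decay dN/dt = -lambda * N, solved by forward Euler:
# in block-diagram terms, an integrator fed by a gain of -lambda.
lam = 0.1      # decay constant (1/s), illustrative
N0 = 1000.0    # initial number of nuclei, illustrative
dt = 0.001     # integration step
t_end = 10.0

N = N0
for _ in range(round(t_end / dt)):
    N += dt * (-lam * N)

exact = N0 * math.exp(-lam * t_end)   # analytic solution for comparison
print(round(N, 2), round(exact, 2))
```

    The same pattern (state variable, derivative block, feedback gain) extends directly to the point kinetics equations with delayed neutron groups, just with more coupled states.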

  1. Modeling HIV-1 drug resistance as episodic directional selection.

    Science.gov (United States)

    Murrell, Ben; de Oliveira, Tulio; Seebregts, Chris; Kosakovsky Pond, Sergei L; Scheffler, Konrad

    2012-01-01

    The evolution of substitutions conferring drug resistance to HIV-1 is both episodic, occurring when patients are on antiretroviral therapy, and strongly directional, with site-specific resistant residues increasing in frequency over time. While methods exist to detect episodic diversifying selection and continuous directional selection, no evolutionary model combining these two properties has been proposed. We present two models of episodic directional selection (MEDS and EDEPS) which allow the a priori specification of lineages expected to have undergone directional selection. The models infer the sites and target residues that were likely subject to directional selection, using either codon or protein sequences. Compared to its null model of episodic diversifying selection, MEDS provides a superior fit to most sites known to be involved in drug resistance, and neither a test for episodic diversifying selection nor one for constant directional selection is able to detect as many true positives as MEDS and EDEPS while maintaining acceptable levels of false positives. This suggests that episodic directional selection is a better description of the process driving the evolution of drug resistance.

  2. Modeling HIV-1 drug resistance as episodic directional selection.

    Directory of Open Access Journals (Sweden)

    Ben Murrell

    Full Text Available The evolution of substitutions conferring drug resistance to HIV-1 is both episodic, occurring when patients are on antiretroviral therapy, and strongly directional, with site-specific resistant residues increasing in frequency over time. While methods exist to detect episodic diversifying selection and continuous directional selection, no evolutionary model combining these two properties has been proposed. We present two models of episodic directional selection (MEDS and EDEPS) which allow the a priori specification of lineages expected to have undergone directional selection. The models infer the sites and target residues that were likely subject to directional selection, using either codon or protein sequences. Compared to its null model of episodic diversifying selection, MEDS provides a superior fit to most sites known to be involved in drug resistance, and neither a test for episodic diversifying selection nor one for constant directional selection is able to detect as many true positives as MEDS and EDEPS while maintaining acceptable levels of false positives. This suggests that episodic directional selection is a better description of the process driving the evolution of drug resistance.

  3. Manufacturing process functions--I. An alternative model and its comparison with existing functions (and) II. Selection of trainees and control of their progress.

    Science.gov (United States)

    GLOVER, J.H.

    The chief objective of this study of speed-skill acquisition was to find a mathematical model capable of simple graphic interpretation for industrial training and production scheduling at the shop floor level. Studies of middle skill development in machine and vehicle assembly, aircraft production, spoolmaking and the machining of parts confirmed…

  4. Bayesian site selection for fast Gaussian process regression

    KAUST Repository

    Pourhabib, Arash; Liang, Faming; Ding, Yu

    2014-01-01

    Gaussian Process (GP) regression is a popular method in the field of machine learning and computer experiment designs; however, its ability to handle large data sets is hindered by the computational difficulty in inverting a large covariance matrix. Likelihood approximation methods were developed as a fast GP approximation, thereby reducing the computation cost of GP regression by utilizing a much smaller set of unobserved latent variables called pseudo points. This article reports a further improvement to the likelihood approximation methods by simultaneously deciding both the number and locations of the pseudo points. The proposed approach is a Bayesian site selection method where both the number and locations of the pseudo inputs are parameters in the model, and the Bayesian model is solved using a reversible jump Markov chain Monte Carlo technique. Through a number of simulated and real data sets, it is demonstrated that with appropriate priors chosen, the Bayesian site selection method can produce a good balance between computation time and prediction accuracy: it is fast enough to handle large data sets that a full GP is unable to handle, and it improves, quite often remarkably, the prediction accuracy, compared with the existing likelihood approximations. © 2014 Taylor and Francis Group, LLC.

  5. Bayesian site selection for fast Gaussian process regression

    KAUST Repository

    Pourhabib, Arash

    2014-02-05

    Gaussian Process (GP) regression is a popular method in the field of machine learning and computer experiment designs; however, its ability to handle large data sets is hindered by the computational difficulty in inverting a large covariance matrix. Likelihood approximation methods were developed as a fast GP approximation, thereby reducing the computation cost of GP regression by utilizing a much smaller set of unobserved latent variables called pseudo points. This article reports a further improvement to the likelihood approximation methods by simultaneously deciding both the number and locations of the pseudo points. The proposed approach is a Bayesian site selection method where both the number and locations of the pseudo inputs are parameters in the model, and the Bayesian model is solved using a reversible jump Markov chain Monte Carlo technique. Through a number of simulated and real data sets, it is demonstrated that with appropriate priors chosen, the Bayesian site selection method can produce a good balance between computation time and prediction accuracy: it is fast enough to handle large data sets that a full GP is unable to handle, and it improves, quite often remarkably, the prediction accuracy, compared with the existing likelihood approximations. © 2014 Taylor and Francis Group, LLC.
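
    To make the computational bottleneck concrete, the sketch below implements plain (non-sparse) GP regression with a squared-exponential kernel on toy data; the `np.linalg.solve` call is the O(n³) step that pseudo-point methods approximate. The data, kernel, and hyperparameters are illustrative assumptions, not from the paper.

```python
import numpy as np

def k(a, b, ell=1.0, sf=1.0):
    # squared-exponential (RBF) covariance between two 1-D input vectors
    d = a[:, None] - b[None, :]
    return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 30)
y = np.sin(x) + 0.05 * rng.standard_normal(30)   # noisy observations of sin(x)

noise = 0.05 ** 2
K = k(x, x) + noise * np.eye(30)
alpha = np.linalg.solve(K, y)      # the O(n^3) step sparse GP methods replace

x_star = np.array([np.pi / 2])
mean = k(x_star, x) @ alpha        # GP posterior mean at the test input
print(float(mean[0]))
```

    A pseudo-point (sparse) method would replace K with a low-rank approximation built from m << n inducing inputs, cutting the cost to roughly O(n m²); the paper's contribution is choosing both m and the inducing locations by reversible jump MCMC.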

  6. Automated sample plan selection for OPC modeling

    Science.gov (United States)

    Casati, Nathalie; Gabrani, Maria; Viswanathan, Ramya; Bayraktar, Zikri; Jaiswal, Om; DeMaris, David; Abdo, Amr Y.; Oberschmidt, James; Krause, Andreas

    2014-03-01

    It is desired to reduce the time required to produce metrology data for calibration of Optical Proximity Correction (OPC) models and also maintain or improve the quality of the data collected with regard to how well that data represents the types of patterns that occur in real circuit designs. Previous work based on clustering in geometry and/or image parameter space has shown some benefit over strictly manual or intuitive selection, but leads to arbitrary pattern exclusion or selection which may not be the best representation of the product. Forming the pattern selection as an optimization problem, which co-optimizes a number of objective functions reflecting modelers' insight and expertise, has been shown to produce models with quality equivalent to the traditional plan of record (POR) set, but in less time.
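
    As a toy stand-in for that optimization formulation (the actual objective functions and pattern descriptors belong to the flow described above), a greedy coverage heuristic over hypothetical pattern clusters illustrates the idea of selecting a small calibration set that still represents the design.

```python
# Greedy pattern selection for model calibration: repeatedly pick the test
# pattern covering the most still-uncovered feature clusters. Pattern names
# and feature labels are hypothetical, for illustration only.
patterns = {
    "p1": {"dense_lines", "line_end"},
    "p2": {"iso_line"},
    "p3": {"dense_lines", "corner"},
    "p4": {"corner", "iso_line", "line_end"},
}
needed = {"dense_lines", "iso_line", "corner", "line_end"}

chosen = []
covered = set()
while covered != needed:
    # the pattern that adds the most new coverage
    best = max(patterns, key=lambda p: len(patterns[p] - covered))
    chosen.append(best)
    covered |= patterns[best]
print(chosen)
```

    Real formulations co-optimize several such objectives at once; greedy coverage is only the simplest single-objective version.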

  7. Modeling grinding processes as micro processes

    African Journals Online (AJOL)

    eobe

    Industrial precision grinding processes are cylindrical, centerless and … Several models have been proposed and used to study grinding … grinding force for the two cases were 9.07237 N/mm … International Journal of Machine Tools & …

  8. Variable selection and model choice in geoadditive regression models.

    Science.gov (United States)

    Kneib, Thomas; Hothorn, Torsten; Tutz, Gerhard

    2009-06-01

    Model choice and variable selection are issues of major concern in practical regression analyses, arising in many biometric applications such as habitat suitability analyses, where the aim is to identify the influence of potentially many environmental conditions on certain species. We describe regression models for breeding bird communities that facilitate both model choice and variable selection, by a boosting algorithm that works within a class of geoadditive regression models comprising spatial effects, nonparametric effects of continuous covariates, interaction surfaces, and varying coefficients. The major modeling components are penalized splines and their bivariate tensor product extensions. All smooth model terms are represented as the sum of a parametric component and a smooth component with one degree of freedom to obtain a fair comparison between the model terms. A generic representation of the geoadditive model allows us to devise a general boosting algorithm that automatically performs model choice and variable selection.
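
    The variable selection behaviour of such a boosting algorithm can be illustrated with componentwise L2-boosting: at each iteration, every candidate covariate is fitted alone to the current residuals and only the best one is updated, so covariates never picked are effectively deselected. The sketch below uses simple linear base learners instead of the penalized splines in the paper; the data, step-length, and selection threshold are assumptions.

```python
import numpy as np

# Simulated data: only x0 and x1 influence the response.
rng = np.random.default_rng(1)
n, p = 200, 5
X = rng.standard_normal((n, p))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(n)

coef = np.zeros(p)
nu = 0.1                      # step-length (shrinkage)
resid = y.copy()
for _ in range(300):
    # least-squares fit of each single covariate to the residuals
    fits = X.T @ resid / (X ** 2).sum(axis=0)
    losses = [np.sum((resid - X[:, j] * fits[j]) ** 2) for j in range(p)]
    j = int(np.argmin(losses))        # best-reducing covariate only
    coef[j] += nu * fits[j]
    resid -= nu * X[:, j] * fits[j]

selected = np.flatnonzero(np.abs(coef) > 0.5)
print(selected, np.round(coef, 2))
```

    The shrinkage factor nu and early stopping are what give boosting its implicit regularization; the paper's fair comparison between smooth and parametric terms corresponds to standardizing the base learners' degrees of freedom.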

  9. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore we also analyze specifications obtained via a simple deterministic time-change of a homogeneous Levy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared.

  10. EIS and adjunct electrical modeling for material selection by evaluating two mild steels for use in super-alkaline mineral processing

    DEFF Research Database (Denmark)

    Bakhtiyari, Leila; Moghimi, Fereshteh; Mansouri, Seyed Soheil

    2012-01-01

    The production of metal concentrates during mineral processing of ferrous and non-ferrous metals involves a variety of highly corrosive chemicals which deteriorate common mild steel as the material of choice in the construction of such lines, through rapid propagation of localized pitting in susceptible parts, often in sensitive areas. This requires unscheduled maintenance and plant shut down. In order to test the corrosion resistance of different available materials as replacement materials, polarization and electrochemical impedance spectroscopy (EIS) tests were carried out. The EIS numerical software-enhanced polarization resistance, and reduced capacitance added to much diminished current densities, verified the acceptable performance of CK45 compared with high priced stainless steel substitutes with comparable operational life. Therefore, CK45 can be a suitable alternative in steel…

  11. Natural Selection as an Emergent Process: Instructional Implications

    Science.gov (United States)

    Cooper, Robert A.

    2017-01-01

    Student reasoning about cases of natural selection is often plagued by errors that stem from miscategorising selection as a direct, causal process, misunderstanding the role of randomness, and from the intuitive ideas of intentionality, teleology and essentialism. The common thread throughout many of these reasoning errors is a failure to apply…

  12. Continuous process for selective metal extraction with an ionic liquid

    NARCIS (Netherlands)

    Parmentier, D.; Paradis, S.; Metz, S.J.; Wiedmer, S.K.; Kroon, M.C.

    2016-01-01

    This work describes for the first time a continuous process for selective metal extraction with an ionic liquid (IL) at room temperature. The hydrophobic fatty acid based IL tetraoctylphosphonium oleate ([P8888][oleate]) was specifically chosen for its low viscosity and high selectivity towards

  13. Natural Selection Is a Sorting Process: What Does that Mean?

    Science.gov (United States)

    Price, Rebecca M.

    2013-01-01

    To learn why natural selection acts only on existing variation, students categorize processes as either creative or sorting. This activity helps students confront the misconception that adaptations evolve because species need them.

  14. Process for selected gas oxide removal by radiofrequency catalysts

    Science.gov (United States)

    Cha, Chang Y.

    1993-01-01

    This process to remove gas oxides from flue gas utilizes adsorption on a char bed subsequently followed by radiofrequency catalysis enhancing such removal through selected reactions. Common gas oxides include SO2 and NOx.

  15. A Hybrid Multiple Criteria Decision Making Model for Supplier Selection

    Directory of Open Access Journals (Sweden)

    Chung-Min Wu

    2013-01-01

    Full Text Available The sustainable supplier selection would be the vital part in the management of a sustainable supply chain. In this study, a hybrid multiple criteria decision making (MCDM model is applied to select optimal supplier. The fuzzy Delphi method, which can lead to better criteria selection, is used to modify criteria. Considering the interdependence among the selection criteria, analytic network process (ANP is then used to obtain their weights. To avoid calculation and additional pairwise comparisons of ANP, a technique for order preference by similarity to ideal solution (TOPSIS is used to rank the alternatives. The use of a combination of the fuzzy Delphi method, ANP, and TOPSIS, proposing an MCDM model for supplier selection, and applying these to a real case are the unique features of this study.
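
    The TOPSIS ranking step of such a hybrid model is compact enough to sketch. The decision matrix and criteria weights below are illustrative stand-ins for the ANP-derived values in the study; all three criteria are treated as benefit-type.

```python
import numpy as np

# Decision matrix: rows = candidate suppliers, cols = criteria scores.
D = np.array([[7.0, 9.0, 9.0],
              [8.0, 7.0, 8.0],
              [9.0, 6.0, 8.0]])
w = np.array([0.6, 0.2, 0.2])             # weights, e.g. from ANP (illustrative)

V = w * D / np.sqrt((D ** 2).sum(axis=0))   # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)  # positive / negative ideal solutions
d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
closeness = d_minus / (d_plus + d_minus)    # relative closeness to the ideal
best = int(np.argmax(closeness))
print(best, np.round(closeness, 3))
```

    Cost-type criteria would flip the ideal/anti-ideal roles for their columns; the fuzzy Delphi and ANP stages only change how D and w are obtained, not this ranking step.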

  16. Model Selection in Data Analysis Competitions

    DEFF Research Database (Denmark)

    Wind, David Kofoed; Winther, Ole

    2014-01-01

    The use of data analysis competitions for selecting the most appropriate model for a problem is a recent innovation in the field of predictive machine learning. Two of the most well-known examples of this trend were the Netflix Competition and recently the competitions hosted on the online platform Kaggle. In this paper, we will state and try to verify a set of qualitative hypotheses about predictive modelling, both in general and in the scope of data analysis competitions. To verify our hypotheses we will look at previous competitions and their outcomes, use qualitative interviews with top performers from Kaggle and use previous personal experiences from competing in Kaggle competitions. The stated hypotheses about feature engineering, ensembling, overfitting, model complexity and evaluation metrics give indications and guidelines on how to select a proper model for performing well…

  17. Adverse selection model regarding tobacco consumption

    Directory of Open Access Journals (Sweden)

    Dumitru MARIN

    2006-01-01

    Full Text Available The impact of introducing a tax on tobacco consumption can be studied through an adverse selection model. The objective of the model presented in the following is to characterize the optimal contractual relationship between the governmental authorities and the two types of employees, smokers and non-smokers, taking into account that the consumers' decision to smoke or not represents an element of risk and uncertainty. Two scenarios are run using the General Algebraic Modeling System (GAMS) software: one without taxes set on tobacco consumption and another one with taxes set on tobacco consumption, based on the adverse selection model described previously. The results of the two scenarios are compared at the end of the paper: the wage earnings levels and the social welfare in the case of a smoking agent and in the case of a non-smoking agent.

  18. Markov Decision Process Measurement Model.

    Science.gov (United States)

    LaMar, Michelle M

    2018-03-01

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than a partial-credit IRT model based on outcome data alone.
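
    A Markov decision process of the kind underlying this measurement model can be solved by value iteration. The toy task below is entirely illustrative (states, rewards, transition probabilities, and discount are assumptions): a three-state chain where "advance" succeeds with probability 0.8 and reaching the terminal state pays a unit reward.

```python
# Value iteration for a tiny MDP: V(s) = max_a sum_s' P(s'|s,a)[r + gamma*V(s')].
gamma = 0.9
states = [0, 1, 2]                 # state 2 is terminal
actions = ["advance", "stay"]

def transitions(s, a):
    # returns a list of (probability, next_state, reward) triples
    if s == 2:
        return [(1.0, 2, 0.0)]
    if a == "advance":
        return [(0.8, s + 1, 1.0 if s + 1 == 2 else 0.0), (0.2, s, 0.0)]
    return [(1.0, s, 0.0)]

V = {s: 0.0 for s in states}
for _ in range(100):               # Bellman backups until convergence
    V = {s: max(sum(p * (r + gamma * V[s2]) for p, s2, r in transitions(s, a))
                for a in actions)
         for s in states}

policy = {s: max(actions, key=lambda a: sum(p * (r + gamma * V[s2])
                                            for p, s2, r in transitions(s, a)))
          for s in states}
print({s: round(V[s], 3) for s in states}, policy)
```

    In the measurement model, this optimal policy (or a noisy version of it) is what links a student's latent competence parameters to the within-task actions that are actually observed.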

  19. Evidence accumulation as a model for lexical selection.

    Science.gov (United States)

    Anders, R; Riès, S; van Maanen, L; Alario, F X

    2015-11-01

    We propose and demonstrate evidence accumulation as a plausible theoretical and/or empirical model for the lexical selection process of lexical retrieval. A number of current psycholinguistic theories consider lexical selection as a process related to selecting a lexical target from a number of alternatives, which each have varying activations (or signal supports), that are largely resultant of an initial stimulus recognition. We thoroughly present a case for how such a process may be theoretically explained by the evidence accumulation paradigm, and we demonstrate how this paradigm can be directly related or combined with conventional psycholinguistic theory and their simulatory instantiations (generally, neural network models). Then with a demonstrative application on a large new real data set, we establish how the empirical evidence accumulation approach is able to provide parameter results that are informative to leading psycholinguistic theory, and that motivate future theoretical development. Copyright © 2015 Elsevier Inc. All rights reserved.
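
    A minimal race of independent evidence accumulators, one per lexical candidate, illustrates the paradigm: each accumulator gains its drift plus noise at every step, and the first to reach the threshold determines both the selected word and the response time. The drift rates, noise level, and threshold below are illustrative assumptions, not the paper's fitted parameters.

```python
import random

random.seed(42)
drifts = {"cat": 0.30, "dog": 0.05, "cap": 0.10}   # "cat" best supported by the input
threshold = 10.0

def race(drifts, threshold, noise_sd=0.1):
    # independent linear accumulators with Gaussian increment noise
    evidence = {w: 0.0 for w in drifts}
    t = 0
    while True:
        t += 1
        for w, v in drifts.items():
            evidence[w] += random.gauss(v, noise_sd)
        winners = [w for w, e in evidence.items() if e >= threshold]
        if winners:
            return max(winners, key=evidence.get), t   # selected word, response time

word, rt = race(drifts, threshold)
print(word, rt)
```

    Fitting such a model to response-time distributions is what yields the psychologically interpretable parameters (drift, threshold, non-decision time) the abstract refers to.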

  20. Social Models: Blueprints or Processes?

    Science.gov (United States)

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  1. Simple Models for Process Control

    Czech Academy of Sciences Publication Activity Database

    Gorez, R.; Klán, Petr

    2011-01-01

    Roč. 22, č. 2 (2011), s. 58-62 ISSN 0929-2268 Institutional research plan: CEZ:AV0Z10300504 Keywords: process models * PID control * second order dynamics Subject RIV: JB - Sensors, Measurement, Regulation

  2. Conflict between public perceptions and technical processes in site selection

    International Nuclear Information System (INIS)

    Avant, R.V. Jr.; Jacobi, L.R.

    1985-01-01

    U.S. Nuclear Regulatory Commission regulations and guidance on site selection are based on sound technical reasoning. Geology, hydrology, flora and fauna, transportation, demographics, and sociopolitical concerns, to name a few, have been factored into the process. Regardless of the technical objectivity of a site selection process, local opposition groups will challenge technical decisions using technical, nontechnical, and emotional arguments. This paper explores the many conflicts between public perceptions, technical requirements designed to protect the general public, and common arguments against site selection. Ways to deal with opposition are also discussed with emphasis placed on developing effective community relations

  3. Temporally selective processing of communication signals by auditory midbrain neurons

    DEFF Research Database (Denmark)

    Elliott, Taffeta M; Christensen-Dalsgaard, Jakob; Kelley, Darcy B

    2011-01-01

    Click rates ranged from 4 to 50 Hz, the rate at which the clicks begin to overlap. Frequency selectivity and temporal processing were characterized using response-intensity curves, temporal-discharge patterns, and autocorrelations of reduplicated responses to click trains. Characteristic frequencies… of the rate of clicks in calls. The majority of neurons (85%) were selective for click rates, and this selectivity remained unchanged over sound levels 10 to 20 dB above threshold. Selective neurons give phasic, tonic, or adapting responses to tone bursts and click trains. Some algorithms that could compute…

  4. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model is developed for the raw-material supply of processing enterprises that belong to a vertically integrated structure for the production and processing of dairy raw materials. The model is distinguished by its orientation towards a cumulative effect for the integrated structure, which acts as the criterion function; this function is maximized by optimizing capacities, the volumes of raw-material deliveries and their quality characteristics, the costs of industrial processing of raw materials, and the demand for dairy production.

  5. A review of channel selection algorithms for EEG signal processing

    Science.gov (United States)

    Alotaiby, Turky; El-Samie, Fathi E. Abd; Alshebeili, Saleh A.; Ahmad, Ishtiaq

    2015-12-01

    Digital processing of electroencephalography (EEG) signals has now been popularly used in a wide variety of applications such as seizure detection/prediction, motor imagery classification, mental task classification, emotion classification, sleep state classification, and drug effects diagnosis. With the large number of EEG channels acquired, it has become apparent that efficient channel selection algorithms are needed with varying importance from one application to another. The main purpose of the channel selection process is threefold: (i) to reduce the computational complexity of any processing task performed on EEG signals by selecting the relevant channels and hence extracting the features of major importance, (ii) to reduce the amount of overfitting that may arise due to the utilization of unnecessary channels, for the purpose of improving the performance, and (iii) to reduce the setup time in some applications. Signal processing tools such as time-domain analysis, power spectral estimation, and wavelet transform have been used for feature extraction and hence for channel selection in most of channel selection algorithms. In addition, different evaluation approaches such as filtering, wrapper, embedded, hybrid, and human-based techniques have been widely used for the evaluation of the selected subset of channels. In this paper, we survey the recent developments in the field of EEG channel selection methods along with their applications and classify these methods according to the evaluation approach.
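
    A filter-type channel selection of the kind surveyed can be sketched as ranking channels by a simple relevance score. The toy example below scores synthetic channels by signal variance and keeps the top k; real pipelines use task-dependent criteria (e.g. mutual information or classification accuracy), and all numbers here are illustrative.

```python
import numpy as np

# Simulated multichannel EEG: mostly unit-variance noise, with a strong
# 10 Hz (alpha-band-like) component injected into channels 2 and 5.
rng = np.random.default_rng(7)
n_channels, n_samples = 8, 1000
t = np.arange(n_samples) / 250.0            # 250 Hz sampling rate, illustrative

eeg = rng.normal(0.0, 1.0, (n_channels, n_samples))
eeg[2] += 5.0 * np.sin(2 * np.pi * 10 * t)
eeg[5] += 4.0 * np.sin(2 * np.pi * 10 * t)

# Filter approach: score each channel independently, keep the top k.
scores = eeg.var(axis=1)
k = 2
selected = np.argsort(scores)[::-1][:k]
print(sorted(selected.tolist()))
```

    Wrapper and embedded methods differ only in the scoring loop: instead of a per-channel statistic, they evaluate candidate channel subsets through the downstream classifier itself.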

  6. ASYMMETRIC PRICE TRANSMISSION MODELING: THE IMPORTANCE OF MODEL COMPLEXITY AND THE PERFORMANCE OF THE SELECTION CRITERIA

    Directory of Open Access Journals (Sweden)

    Henry de-Graft Acquah

    2013-01-01

    Full Text Available Information Criteria provides an attractive basis for selecting the best model from a set of competing asymmetric price transmission models or theories. However, little is understood about the sensitivity of the model selection methods to model complexity. This study therefore fits competing asymmetric price transmission models that differ in complexity to simulated data and evaluates the ability of the model selection methods to recover the true model. The results of Monte Carlo experimentation suggest that in general BIC, CAIC and DIC were superior to AIC when the true data generating process was the standard error correction model, whereas AIC was more successful when the true model was the complex error correction model. It is also shown that the model selection methods performed better in large samples for a complex asymmetric data generating process than with a standard asymmetric data generating process. Except for complex models, AIC's performance did not make substantial gains in recovery rates as sample size increased. The research findings demonstrate the influence of model complexity in asymmetric price transmission model comparison and selection.
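
    The Monte Carlo design above reduces, in miniature, to fitting nested models and comparing information criteria. The sketch below generates data from a simple (linear) process and lets AIC and BIC choose among polynomial orders; the sample size, noise level, and candidate orders are illustrative assumptions, not the study's price-transmission models.

```python
import math
import numpy as np

rng = np.random.default_rng(3)
n = 100
x = np.linspace(0.0, 1.0, n)
y = 2.0 + 1.5 * x + rng.normal(0.0, 0.2, n)   # true data generating process: linear

def ic(degree, penalty_per_param):
    """Information criterion for a least-squares polynomial fit of given degree."""
    resid = y - np.polyval(np.polyfit(x, y, degree), x)
    rss = float(resid @ resid)
    k = degree + 2                 # polynomial coefficients + error variance
    return n * math.log(rss / n) + penalty_per_param * k

aic = {d: ic(d, 2.0) for d in (1, 2, 3)}          # AIC penalty: 2 per parameter
bic = {d: ic(d, math.log(n)) for d in (1, 2, 3)}  # BIC penalty: log(n) per parameter
best_bic = min(bic, key=bic.get)
print(best_bic, {d: round(v, 1) for d, v in bic.items()})
```

    BIC's heavier log(n) penalty is why it tends to recover the simpler true model here, mirroring the paper's finding that BIC-type criteria beat AIC when the standard (less complex) model generated the data.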

  7. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the subject-features development of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates the qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Offered are two steps for singling out a particular stage and the algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretical research, analysis of educational practices and for realistic prediction of pedagogical phenomena.

  8. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative haz...

  9. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...

  10. Risk calculations in the manufacturing technology selection process

    DEFF Research Database (Denmark)

    Farooq, S.; O'Brien, C.

    2010-01-01

    Purpose - The purpose of this paper is to present results obtained from a developed technology selection framework and provide a detailed insight into the risk calculations and their implications in the manufacturing technology selection process. Design/methodology/approach - The results illustrated...... in the paper are the outcome of an action research study that was conducted in an aerospace company. Findings - The paper highlights the role of risk calculations in the manufacturing technology selection process by elaborating the contribution of risk associated with manufacturing technology alternatives...... in the shape of opportunities and threats in different decision-making environments. Practical implications - The research quantifies the risk associated with different available manufacturing technology alternatives. This quantification of risk crystallises the process of technology selection decision making...

  11. Metrics for Business Process Models

    Science.gov (United States)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores to what extent certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.
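
A hypothetical miniature of such complexity metrics (this is not Mendling's actual metric suite, and the process graph below is invented for the example) counts nodes, arcs, density and split connectors on an adjacency-list representation of a process model:

```python
# A small invented process model: nodes with outgoing control-flow arcs
model = {
    "start":   ["check"],
    "check":   ["approve", "reject"],   # XOR-split: two outgoing arcs
    "approve": ["archive"],
    "reject":  ["archive"],
    "archive": ["end"],
    "end":     [],
}

def size(g):
    # Number of nodes: the most basic proxy for model complexity
    return len(g)

def arcs(g):
    # Total number of control-flow arcs
    return sum(len(out) for out in g.values())

def density(g):
    # Arcs relative to the maximum possible in a simple directed graph
    n = size(g)
    return arcs(g) / (n * (n - 1))

def split_connectors(g):
    # Nodes with more than one outgoing arc: candidate error hot spots
    return [v for v, out in g.items() if len(out) > 1]
```

Metrics like these are the raw material for error-probability hypotheses: larger size, higher density, and more split connectors are candidate error determinants.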

  12. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    Science.gov (United States)

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. No previous experiences of the use of this notation for process modelling within Pathology, in Spain or elsewhere, are known. We present our experience in the elaboration of the conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dep. of Technologies and Information Systems from the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology is presented using BPMN. The presented subprocesses are those corresponding to the surgical pathology examination of the samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals.

  13. Selection, calibration, and validation of models of tumor growth.

    Science.gov (United States)

    Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C

    2016-11-01

    This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth: first, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes; and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors is presented, which leads to many potentially predictive models. Representative classes are then identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine if the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory

  14. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Full Text Available Results of numerical modeling of dynamic problems are summed up in the article. These problems are characteristic of various areas of human activity, in particular for problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of performances of gas streams in gas cleaning equipment, and modeling of biogas formation processes.

  15. Computationally efficient thermal-mechanical modelling of selective laser melting

    NARCIS (Netherlands)

    Yang, Y.; Ayas, C.; Brabazon, Dermot; Naher, Sumsun; Ul Ahad, Inam

    2017-01-01

    Selective laser melting (SLM) is a powder-based additive manufacturing (AM) method to produce high density metal parts with complex topology. However, part distortions and accompanying residual stresses deteriorate the mechanical reliability of SLM products. Modelling of the SLM process is

  16. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

    integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined processes perspectives. Second, the paper translates this information into a business process model and mathematical formalization. The study provides a useful guide to the various relevant technology-related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  17. Declarative modeling for process supervision

    International Nuclear Information System (INIS)

    Leyval, L.

    1989-01-01

    Our work is a contribution to computer aided supervision of continuous processes. It is inspired by an area of Artificial Intelligence: qualitative physics. Here, supervision is based on a model which continuously provides operators with a synthetic view of the process; but this model is founded on general principles of control theory rather than on physics. It involves concepts such as high gain or small time response. It helps in linking temporally the evolution of various variables. Moreover, the model provides predictions of the future behaviour of the process, which allows action advice and alarm filtering. This should greatly reduce the famous cognitive overload associated with any complex and dangerous evolution of the process.

  18. Variable selection in Logistic regression model with genetic algorithm.

    Science.gov (United States)

    Zhang, Zhongheng; Trevino, Victor; Hoseini, Sayed Shahabuddin; Belciug, Smaranda; Boopathi, Arumugam Manivanna; Zhang, Ping; Gorunescu, Florin; Subha, Velappan; Dai, Songshi

    2018-02-01

    Variable or feature selection is one of the most important steps in model specification. Especially in the case of medical decision making, the direct use of a medical database, without a previous analysis and preprocessing step, is often counterproductive. In this way, variable selection represents the method of choosing the most relevant attributes from the database in order to build robust learning models and, thus, to improve the performance of the models used in the decision process. In biomedical research, the purpose of variable selection is to select clinically important and statistically significant variables, while excluding unrelated or noise variables. A variety of methods exist for variable selection, but none of them is without limitations. For example, the stepwise approach, which is highly used, adds the best variable in each cycle, generally producing an acceptable set of variables. Nevertheless, it is limited by the fact that it is commonly trapped in local optima. The best subset approach can systematically search the entire covariate pattern space, but the solution pool can be extremely large with tens to hundreds of variables, which is the case in today's clinical data. Genetic algorithms (GA) are heuristic optimization approaches and can be used for variable selection in multivariable regression models. This tutorial paper aims to provide a step-by-step approach to the use of GA in variable selection. The R code provided in the text can be extended and adapted to other data analysis needs.
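
As a hedged illustration of the idea (in Python rather than the tutorial's R, and with a simple correlation-based fitness standing in for a full regression criterion), a minimal GA for feature selection can look like this; the data, fitness function and GA parameters are all invented for the example:

```python
import random

random.seed(42)

# Synthetic data: only features 0 and 1 actually influence y
n, p = 300, 6
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [3 * row[0] - 2 * row[1] + random.gauss(0, 1) for row in X]

def corr2(j):
    # Squared Pearson correlation between feature j and y
    xj = [row[j] for row in X]
    mx, my = sum(xj) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xj, y))
    sxx = sum((a - mx) ** 2 for a in xj)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

UTILITY = [corr2(j) for j in range(p)]
PENALTY = 0.05  # cost per included variable, to discourage noise features

def fitness(mask):
    return sum(u for u, bit in zip(UTILITY, mask) if bit) - PENALTY * sum(mask)

def mutate(mask, rate=0.1):
    # Flip each bit independently with a small probability
    return tuple(1 - b if random.random() < rate else b for b in mask)

def crossover(a, b):
    # One-point crossover of two parent masks
    cut = random.randrange(1, p)
    return a[:cut] + b[cut:]

pop = [tuple(random.randint(0, 1) for _ in range(p)) for _ in range(20)]
for _ in range(30):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:4]                      # elitism: keep the best individuals
    children = []
    while len(children) < 16:
        pa, pb = random.sample(elite, 2)
        children.append(mutate(crossover(pa, pb)))
    pop = elite + children

best = max(pop, key=fitness)             # best feature subset found
```

With an additive fitness like this the GA converges quickly to the two informative features; in the tutorial's setting the fitness would instead refit a logistic regression for each candidate subset.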

  19. Post-model selection inference and model averaging

    Directory of Open Access Journals (Sweden)

    Georges Nguefack-Tsague

    2011-07-01

    Full Text Available Although model selection is routinely used in practice nowadays, little is known about its precise effects on any subsequent inference that is carried out. The same goes for the effects induced by the closely related technique of model averaging. This paper is concerned with the use of the same data first to select a model and then to carry out inference, in particular point estimation and point prediction. The properties of the resulting estimator, called a post-model-selection estimator (PMSE), are hard to derive. Using selection criteria such as hypothesis testing, AIC, BIC, HQ and Cp, we illustrate that, in terms of risk function, no single PMSE dominates the others. The same conclusion holds more generally for any penalised likelihood information criterion. We also compare various model averaging schemes and show that no single one dominates the others in terms of risk function. Since PMSEs can be regarded as a special case of model averaging, with 0-1 random weights, we propose a connection between the two theories, in the frequentist approach, by taking account of the selection procedure when performing model averaging. We illustrate the point by simulating a simple linear regression model.

  20. Skewed factor models using selection mechanisms

    KAUST Repository

    Kim, Hyoung-Moon

    2015-12-21

    Traditional factor models explicitly or implicitly assume that the factors follow a multivariate normal distribution; that is, only moments up to order two are involved. However, it may happen in real data problems that the first two moments cannot explain the factors. Based on this motivation, here we devise three new skewed factor models, the skew-normal, the skew-t, and the generalized skew-normal factor models depending on a selection mechanism on the factors. The ECME algorithms are adopted to estimate related parameters for statistical inference. Monte Carlo simulations validate our new models and we demonstrate the need for skewed factor models using the classic open/closed book exam scores dataset.

  1. Skewed factor models using selection mechanisms

    KAUST Repository

    Kim, Hyoung-Moon; Maadooliat, Mehdi; Arellano-Valle, Reinaldo B.; Genton, Marc G.

    2015-01-01

    Traditional factor models explicitly or implicitly assume that the factors follow a multivariate normal distribution; that is, only moments up to order two are involved. However, it may happen in real data problems that the first two moments cannot explain the factors. Based on this motivation, here we devise three new skewed factor models, the skew-normal, the skew-t, and the generalized skew-normal factor models depending on a selection mechanism on the factors. The ECME algorithms are adopted to estimate related parameters for statistical inference. Monte Carlo simulations validate our new models and we demonstrate the need for skewed factor models using the classic open/closed book exam scores dataset.

  2. Model structure selection in convolutive mixtures

    DEFF Research Database (Denmark)

    Dyrholm, Mads; Makeig, S.; Hansen, Lars Kai

    2006-01-01

    The CICAAR algorithm (convolutive independent component analysis with an auto-regressive inverse model) allows separation of white (i.i.d) source signals from convolutive mixtures. We introduce a source color model as a simple extension to the CICAAR which allows for a more parsimonious representation in many practical mixtures. The new filter-CICAAR allows Bayesian model selection and can help answer questions like: ’Are we actually dealing with a convolutive mixture?’. We try to answer this question for EEG data.

  3. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    Full Text Available When developing an information system, it is important to create clear models and choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages, and presents a theoretical comparison of business rule and business process modeling languages. According to selected modeling aspects, the different business process modeling languages and business rule representation languages are compared. Finally, the best-fitting set of languages for a three-layer framework for business-rule-based software modeling is selected.

  4. Hencky's model for elastomer forming process

    Science.gov (United States)

    Oleinikov, A. A.; Oleinikov, A. I.

    2016-08-01

    In the numerical simulation of elastomer forming processes, Hencky's isotropic hyperelastic material model can guarantee relatively accurate prediction of the strain range in terms of large deformations. It is shown that this material model prolongs Hooke's law from the area of infinitesimal strains to the area of moderate ones. A new representation of the fourth-order elasticity tensor for Hencky's hyperelastic isotropic material is obtained; it possesses both minor symmetries and the major symmetry. The constitutive relations of the considered model are implemented into the MSC.Marc code. By calculating and fitting curves, the polyurethane elastomer material constants are selected. Simulation of equipment for elastomer sheet forming is considered.
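
For reference, the sense in which Hencky's model prolongs Hooke's law can be made explicit: the Kirchhoff stress is linear in the logarithmic (Hencky) strain. The notation below is the standard textbook form of the model, not taken from this abstract:

```latex
h = \ln \mathbf{V}, \qquad
\boldsymbol{\tau} = 2\mu\,\operatorname{dev} h + \kappa\,(\operatorname{tr} h)\,\mathbf{I}, \qquad
W(h) = \mu\,\lVert \operatorname{dev} h \rVert^{2} + \tfrac{\kappa}{2}\,(\operatorname{tr} h)^{2}
```

where \mathbf{V} is the left stretch tensor and \mu, \kappa are the shear and bulk moduli; for infinitesimal strains h reduces to the small-strain tensor and Hooke's law is recovered.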

  5. Additive Manufacturing Processes: Selective Laser Melting, Electron Beam Melting and Binder Jetting—Selection Guidelines

    Science.gov (United States)

    Konda Gokuldoss, Prashanth; Kolla, Sri; Eckert, Jürgen

    2017-01-01

    Additive manufacturing (AM), also known as 3D printing or rapid prototyping, is gaining increasing attention due to its ability to produce parts with added functionality and increased complexities in geometrical design, on top of the fact that it is theoretically possible to produce any shape without limitations. However, most of the research on additive manufacturing techniques is focused on the development of materials/process parameters/products design with different additive manufacturing processes such as selective laser melting, electron beam melting, or binder jetting. Yet there are no guidelines that discuss the selection of the most suitable additive manufacturing process, depending on the material to be processed, the complexity of the parts to be produced, or the design considerations. Considering the very fact that no reports deal with this process selection, the present manuscript aims to discuss the different selection criteria that are to be considered, in order to select the best AM process (binder jetting/selective laser melting/electron beam melting) for fabricating a specific component with a defined set of material properties. PMID:28773031

  6. Additive Manufacturing Processes: Selective Laser Melting, Electron Beam Melting and Binder Jetting-Selection Guidelines.

    Science.gov (United States)

    Gokuldoss, Prashanth Konda; Kolla, Sri; Eckert, Jürgen

    2017-06-19

    Additive manufacturing (AM), also known as 3D printing or rapid prototyping, is gaining increasing attention due to its ability to produce parts with added functionality and increased complexities in geometrical design, on top of the fact that it is theoretically possible to produce any shape without limitations. However, most of the research on additive manufacturing techniques is focused on the development of materials/process parameters/products design with different additive manufacturing processes such as selective laser melting, electron beam melting, or binder jetting. Yet there are no guidelines that discuss the selection of the most suitable additive manufacturing process, depending on the material to be processed, the complexity of the parts to be produced, or the design considerations. Considering the very fact that no reports deal with this process selection, the present manuscript aims to discuss the different selection criteria that are to be considered, in order to select the best AM process (binder jetting/selective laser melting/electron beam melting) for fabricating a specific component with a defined set of material properties.

  7. Introduction to gas lasers with emphasis on selective excitation processes

    CERN Document Server

    Willett, Colin S

    1974-01-01

    Introduction to Gas Lasers: Population Inversion Mechanisms focuses on important processes in gas discharge lasers and basic atomic collision processes that operate in a gas laser. Organized into six chapters, this book first discusses the historical development and basic principles of gas lasers. Subsequent chapters describe the selective excitation processes in gas discharges and the specific neutral, ionized and molecular laser systems. This book will be a valuable reference on the behavior of gas-discharge lasers to anyone already in the field.

  8. An International Perspective on Pharmacy Student Selection Policies and Processes.

    Science.gov (United States)

    Shaw, John; Kennedy, Julia; Jensen, Maree; Sheridan, Janie

    2015-10-25

    Objective. To reflect on selection policies and procedures for programs at pharmacy schools that are members of an international alliance of universities (Universitas 21). Methods. A questionnaire on selection policies and procedures was distributed to admissions directors at participating schools. Results. Completed questionnaires were received from 7 schools in 6 countries. Although marked differences were noted in the programs in different countries, there were commonalities in the selection processes. There was an emphasis on previous academic performance, especially in science subjects. With one exception, all schools had some form of interview, with several having moved to multiple mini-interviews in recent years. Conclusion. The majority of pharmacy schools in this survey relied on traditional selection processes. While there was increasing use of multiple mini-interviews, the authors suggest that additional new approaches may be required in light of the changing nature of the profession.

  9. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

    Full Text Available To study economic phenomena and processes using mathematical modeling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package. Any economic process or phenomenon has a mathematical description of its behavior, from which an economic-mathematical model is drawn up with the following stages: formulation of the problem, analysis of the modeled process, production of the model, design verification, and validation and implementation of the model. This article presents an economic model whose modeling uses mathematical equations and the MatLab software package, which helps us approximate an effective solution. The input data considered are the net cost, the direct and total cost, and the link between them. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with a graphical representation and interpretation of the results achieved in terms of our specific problem.

  10. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    and having three or more stages. The methods are applied to a process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables...... be performed regarding the foreseeable output property y, and with respect to an admissible range of correcting actions for the parameters of the next stage. In this paper the basic principles of path modeling is presented. The mathematics is presented for processes having only one stage, having two stages...... of the next stage with the purpose of obtaining optimal or almost optimal quality of the output variable. An important aspect of the methods presented is the possibility of extensive graphic analysis of data that can provide the engineer with a detailed view of the multi-variate variation in data....

  11. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail; Genton, Marc G.; Ronchetti, Elvezio

    2015-01-01

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.

  12. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail

    2015-11-20

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.

  13. Efficiently adapting graphical models for selectivity estimation

    DEFF Research Database (Denmark)

    Tzoumas, Kostas; Deshpande, Amol; Jensen, Christian S.

    2013-01-01

    cardinality estimation without making the independence assumption. By carefully using concepts from the field of graphical models, we are able to factor the joint probability distribution over all the attributes in the database into small, usually two-dimensional distributions, without a significant loss...... in estimation accuracy. We show how to efficiently construct such a graphical model from the database using only two-way join queries, and we show how to perform selectivity estimation in a highly efficient manner. We integrate our algorithms into the PostgreSQL DBMS. Experimental results indicate...
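
The two-attribute core of the idea can be sketched as follows: keep a small two-dimensional distribution for a correlated attribute pair and estimate selectivity as P(A=a)·P(B=b|A=a), instead of multiplying one-dimensional histograms. The synthetic table below is invented for illustration and is unrelated to the paper's PostgreSQL integration:

```python
import random
from collections import Counter

random.seed(1)

# Synthetic table: attribute B copies A 80% of the time, so A and B correlate
N = 10_000
rows = []
for _ in range(N):
    a = random.randrange(10)
    b = a if random.random() < 0.8 else random.randrange(10)
    rows.append((a, b))

ha = Counter(a for a, _ in rows)   # 1-D histogram of A
hb = Counter(b for _, b in rows)   # 1-D histogram of B
hab = Counter(rows)                # 2-D histogram: the pairwise factor

def sel_independence(a, b):
    # Attribute-value-independence estimate: P(A=a) * P(B=b)
    return (ha[a] / N) * (hb[b] / N)

def sel_factored(a, b):
    # Factored estimate from the graphical model: P(A=a) * P(B=b | A=a)
    return (ha[a] / N) * (hab[(a, b)] / ha[a]) if ha[a] else 0.0

# True selectivity of the conjunctive predicate A = 3 AND B = 3
true_sel = sum(1 for r in rows if r == (3, 3)) / N
```

For this correlated pair the independence estimate is off by roughly an order of magnitude, while the factored estimate tracks the true selectivity; the paper's contribution is doing this for all attributes at once with small pairwise factors.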

  14. Hydrological scenarios for two selected Alpine catchments for the 21st century using a stochastic weather generator and enhanced process understanding for modelling of seasonal snow and glacier melt for improved water resources management

    Science.gov (United States)

    Strasser, Ulrich; Schneeberger, Klaus; Dabhi, Hetal; Dubrovsky, Martin; Hanzer, Florian; Marke, Thomas; Oberguggenberger, Michael; Rössler, Ole; Schmieder, Jan; Rotach, Mathias; Stötter, Johann; Weingartner, Rolf

    2016-04-01

    The overall objective of HydroGeM³ is to quantify and assess both water demand and water supply in two coupled human-environment mountain systems, i.e. Lütschine in Switzerland and Ötztaler Ache in Austria. Special emphasis is laid on the analysis of possible future seasonal water scarcity. The hydrological response of high Alpine catchments is characterised by a strong seasonal variability with low runoff in winter and high runoff in spring and summer. Climate change is expected to cause a seasonal shift of the runoff regime, and thus has a significant impact on both the amount and the timing of the release of the available water resources, and hence on possible future water conflicts. In order to identify and quantify the contribution of snow and ice melt as well as rain to runoff, streamflow composition will be analysed with natural tracers. The results of the field investigations will help to improve the snow and ice melt and runoff modules of two selected hydrological models (i.e. AMUNDSEN and WaSiM) which are used to investigate the seasonal water availability under current and future climate conditions. Together, they comprise improved descriptions of boundary layer and surface melt processes (AMUNDSEN), and of streamflow runoff generation (WaSiM). Future meteorological forcing for the modelling until the end of the century will be provided by both a stochastic multi-site weather generator and downscaled climate model output. Both approaches will use EUROCORDEX data as input. The water demand in the selected study areas is quantified for the relevant societal sectors, e.g. agriculture, hydropower generation and (winter) tourism. The comparison of water availability and water demand under current and future climate conditions will allow the identification of possible seasonal bottlenecks of future water supply and resulting conflicts. Thus these investigations can provide a quantitative basis for the development of strategies for sustainable water management in

  15. Simulation of the selective oxidation process of semiconductors

    International Nuclear Information System (INIS)

    Chahoud, M.

    2012-01-01

    A new approach to simulate the selective oxidation of semiconductors is presented. This approach is based on the so-called "black box" simulation method. This method is usually used to simulate complex processes. The chemical and physical details within the process are not considered. Only the input and output data of the process are relevant for the simulation. A virtual function linking the input and output data has to be found. In the case of selective oxidation, the input data are the mask geometry and the oxidation duration, whereas the output data are the oxidation thickness distribution. The virtual function is determined as four virtual diffusion processes between the masked and non-masked areas. Each process delivers one part of the oxidation profile. The method is applied successfully to the oxidation system silicon-silicon nitride (Si-Si3N4). The fitting parameters are determined through comparison of experimental and simulation results two-dimensionally. (author)
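
A one-dimensional caricature of this black-box idea (the grid, diffusion coefficient and step count are invented here; the author's calibrated model uses four such processes in two dimensions) treats the oxide-thickness profile as the mask's open/closed indicator smoothed by a virtual diffusion process:

```python
# 1 = masked (oxidation suppressed), 0 = open window
mask = [1] * 50 + [0] * 50

# Target profile before smoothing: full oxide thickness where open
target = [0.0 if m else 1.0 for m in mask]

def diffuse(profile, r=0.2, steps=200):
    # Explicit finite-difference heat equation with clamped ends;
    # r <= 0.5 keeps the scheme stable
    p = list(profile)
    n = len(p)
    for _ in range(steps):
        p = [p[i] + r * (p[max(i - 1, 0)] - 2 * p[i] + p[min(i + 1, n - 1)])
             for i in range(n)]
    return p

# Normalized oxide-thickness profile across the mask edge
profile = diffuse(target)
```

The smoothed step reproduces the qualitative oxidation profile across a mask edge: essentially full thickness far from the mask, essentially zero under it, and a diffusion-controlled transition in between.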

  16. Attribute based selection of thermoplastic resin for vacuum infusion process

    DEFF Research Database (Denmark)

    Prabhakaran, R.T. Durai; Lystrup, Aage; Løgstrup Andersen, Tom

    2011-01-01

    The composite industry looks toward a new material system (resins) based on thermoplastic polymers for the vacuum infusion process, similar to the infusion process using thermosetting polymers. A large number of thermoplastics are available in the market with a variety of properties suitable...... for different engineering applications, and few of those are available in a not yet polymerised form suitable for resin infusion. The proper selection of a new resin system among these thermoplastic polymers is a concern for manufacturers in the current scenario, and a special mathematical tool would...... be beneficial. In this paper, the authors introduce a new decision-making tool for resin selection based on significant attributes. This article provides a broad overview of suitable thermoplastic material systems for the vacuum infusion process available in today's market. An illustrative example—resin selection...

  17. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

    Full Text Available This paper presents a model for an integrated security system, which can be implemented in any organization. It is based on security-specific standards and taxonomies such as ISO 7498-2 and the Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality, and we also focus on the specific components.

  18. Modeling selective attention using a neuromorphic analog VLSI device.

    Science.gov (United States)

    Indiveri, G

    2000-12-01

    Attentional mechanisms are required to overcome the problem of flooding a limited processing capacity system with information. They are present in biological sensory systems and can be a useful engineering tool for artificial visual systems. In this article we present a hardware model of a selective attention mechanism implemented on a very large-scale integration (VLSI) chip, using analog neuromorphic circuits. The chip exploits a spike-based representation to receive, process, and transmit signals. It can be used as a transceiver module for building multichip neuromorphic vision systems. We describe the circuits that carry out the main processing stages of the selective attention mechanism and provide experimental data for each circuit. We demonstrate the expected behavior of the model at the system level by stimulating the chip with both artificially generated control signals and signals obtained from a saliency map, computed from an image containing several salient features.

  19. Mathematical modelling in economic processes.

    Directory of Open Access Journals (Sweden)

    L.V. Kravtsova

    2008-06-01

    Full Text Available This article considers a number of methods for the mathematical modelling of economic processes, and the possibilities of using Excel spreadsheets and their built-in functions to obtain optimal solutions to problems or to calculate financial operations.

  20. Using AHP for Selecting the Best Wastewater Treatment Process

    Directory of Open Access Journals (Sweden)

    AbdolReza Karimi

    2011-01-01

    Full Text Available In this paper, the Analytical Hierarchy Process (AHP) method, which is based on expert knowledge, is used for the selection of the optimal anaerobic wastewater treatment process in industrial estates. This method can be applied to complicated multi-criteria decision making to obtain reasonable results. The different anaerobic processes employed in Iranian industrial estates consist of UASB, UAFB, ABR, the Contact process, and Anaerobic Lagoons. Based on the general conditions in wastewater treatment plants in industrial estates and on expert judgments, and using technical, economic, environmental, and administrative criteria, the processes are weighted and the results obtained are assessed using the Expert Choice software. Finally, the five processes investigated are ranked 1 to 5 in descending order as UAFB, ABR, UASB, Anaerobic Lagoon, and Contact process. Sensitivity analysis, showing the effects of input parameters on changes in the results, was applied for the technical, economic, environmental, and administrative criteria.
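As a sketch of the AHP machinery such studies rely on (a pairwise-comparison matrix, principal-eigenvector weights, and Saaty's consistency ratio), the following minimal implementation may help; the 3x3 comparison matrix is a hypothetical example, not data from the study:

```python
# Saaty's random consistency index for n = 3..5
RI = {3: 0.58, 4: 0.90, 5: 1.12}

def ahp_weights(M, iters=100):
    """Approximate the principal eigenvector of pairwise-comparison
    matrix M by power iteration, normalised so the weights sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

def consistency_ratio(M, w):
    """CR = CI / RI; judgments are conventionally acceptable if CR < 0.1."""
    n = len(M)
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    return (lam - n) / (n - 1) / RI[n]

# hypothetical comparison of three treatment options on a single criterion
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = ahp_weights(M)
```

In a full AHP ranking, criterion weights and per-criterion alternative weights obtained this way are combined by weighted summation.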

  1. Visualizing the process of process modeling with PPMCharts

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the quest for knowledge about how to make good process models, recent research focus is shifting from studying the quality of process models to studying the process of process modeling (often abbreviated as PPM) itself. This paper reports on our efforts to visualize this specific process in such

  2. Concepts of radiation processes selection for industrial realization. Chapter 6

    International Nuclear Information System (INIS)

    1997-01-01

    For the selection of radiation processes in industry, the processes are usually analysed in terms of technological and social effects, power intensity, and overall efficiency. The technological effect is generally conditioned by the uniqueness of radiation technologies, which allow one to obtain a new material, or an existing one but with new properties. The social effect primarily concerns the influence of radiation technologies on consumer psychology. Implementation of equipment for a radiation technological process, for both new material production and the radiation treatment of natural materials, involves three tasks: 1) choice of the radiation source; 2) creation of special equipment for the radiation and non-traditional stages of the process; 3) selection of radiation and other conditions ensuring the achievement of optimal technological and economic indexes

  3. Models of microbiome evolution incorporating host and microbial selection.

    Science.gov (United States)

    Zeng, Qinglong; Wu, Steven; Sukumaran, Jeet; Rodrigo, Allen

    2017-09-25

    Numerous empirical studies suggest that hosts and microbes exert reciprocal selective effects on their ecological partners. Nonetheless, we still lack an explicit framework to model the dynamics of both hosts and microbes under selection. In a previous study, we developed an agent-based forward-time computational framework to simulate the neutral evolution of host-associated microbial communities in a constant-sized, unstructured population of hosts. These neutral models allowed offspring to sample microbes randomly from parents and/or from the environment. Additionally, the environmental pool of available microbes was constituted by fixed and persistent microbial OTUs and by contributions from host individuals in the preceding generation. In this paper, we extend our neutral models to allow selection to operate on both hosts and microbes. We do this by constructing a phenome for each microbial OTU consisting of a sample of traits that influence host and microbial fitnesses independently. Microbial traits can influence the fitness of hosts ("host selection") and the fitness of microbes ("trait-mediated microbial selection"). Additionally, the fitness effects of traits on microbes can be modified by their hosts ("host-mediated microbial selection"). We simulate the effects of these three types of selection, individually or in combination, on microbiome diversities and the fitnesses of hosts and microbes over several thousand generations of hosts. We show that microbiome diversity is strongly influenced by selection acting on microbes. Selection acting on hosts only influences microbiome diversity when there is near-complete direct or indirect parental contribution to the microbiomes of offspring. Unsurprisingly, microbial fitness increases under microbial selection. 
Interestingly, when host selection operates, host fitness only increases under two conditions: (1) when there is a strong parental contribution to microbial communities or (2) in the absence of a strong

  4. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  5. Welding process modelling and control

    Science.gov (United States)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, software developed, and hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for use in data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not the intention for this system to be used for welding process control.

  6. Employee Selection Process: Integrating Employee Needs and Employer Motivators.

    Science.gov (United States)

    Carroll, Brian J.

    1989-01-01

    Offers suggestions for managers relative to the employee selection process, focusing on the identification of a potential employee's needs and the employer's motivators that affect employee productivity. Discusses the use of a preemployment survey and offers a questionnaire that allows matching of the employee's needs with employment…

  7. The process of selecting technology development projects: a practical framework

    NARCIS (Netherlands)

    Herps, Joost M.J.; van Mal, Herman H.; Halman, Johannes I.M.; Martens, Jack H.M.; Borsboom, Ron H.M.

    2003-01-01

    In this article a practical framework is proposed, that can be used to organise the activities related to the selection-process of technology development projects. The framework is based upon recent literature and application at DAF Trucks Company. A technology development project has a long way to

  9. Effect of Thermo-extrusion Process Parameters on Selected Quality ...

    African Journals Online (AJOL)

    Effect of Thermo-extrusion Process Parameters on Selected Quality Attributes of Meat Analogue from Mucuna Bean Seed Flour. ... Nigerian Food Journal ... The product functional responses with coefficients of determination (R2) ranging between 0.658 and 0.894 were most affected by changes in barrel temperature and ...

  10. Understanding the selection processes of public research projects

    NARCIS (Netherlands)

    Materia, V.C.; Pascucci, S.; Kolympiris, C.

    2015-01-01

    This paper analyses factors that affect the funding of agricultural research projects by regional governments and other regional public authorities. We study the selection process of agricultural research projects funded by the Emilia-Romagna regional government in Italy, which follows funding

  11. Item selection via Bayesian IRT models.

    Science.gov (United States)

    Arima, Serena

    2015-02-10

    With reference to a questionnaire that aimed to assess the quality of life for dysarthric speakers, we investigate the usefulness of a model-based procedure for reducing the number of items. We propose a mixed cumulative logit model, which is known in the psychometrics literature as the graded response model: responses to different items are modelled as a function of individual latent traits and as a function of item characteristics, such as their difficulty and their discrimination power. We jointly model the discrimination and the difficulty parameters by using a k-component mixture of normal distributions. Mixture components correspond to disjoint groups of items. Items that belong to the same groups can be considered equivalent in terms of both difficulty and discrimination power. According to decision criteria, we select a subset of items such that the reduced questionnaire is able to provide the same information that the complete questionnaire provides. The model is estimated by using a Bayesian approach, and the choice of the number of mixture components is justified according to information criteria. We illustrate the proposed approach on the basis of data that are collected for 104 dysarthric patients by local health authorities in Lecce and in Milan. Copyright © 2014 John Wiley & Sons, Ltd.
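The graded response model named in the abstract has a standard form (Samejima's): cumulative category probabilities follow two-parameter logistic curves in the latent trait, and adjacent cumulative probabilities are differenced to obtain category probabilities. A minimal sketch with illustrative parameter values, not estimates from the study:

```python
import math

def grm_category_probs(theta, a, b):
    """Graded response model: probabilities of the K ordered response
    categories for latent trait theta, discrimination a, and K-1 ordered
    difficulty thresholds b."""
    def p_star(bk):
        # probability of responding at or above the category cut bk
        return 1.0 / (1.0 + math.exp(-a * (theta - bk)))
    cum = [1.0] + [p_star(bk) for bk in b] + [0.0]
    return [cum[k] - cum[k + 1] for k in range(len(b) + 1)]

# one 4-category item: discrimination 1.2, thresholds -1, 0, 1
probs = grm_category_probs(theta=0.5, a=1.2, b=[-1.0, 0.0, 1.0])
```

In the paper's mixture extension, the (discrimination, difficulty) pairs of different items are modelled with a k-component mixture, so items in the same component can be treated as interchangeable when shortening the questionnaire.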

  12. Spatial Fleming-Viot models with selection and mutation

    CERN Document Server

    Dawson, Donald A

    2014-01-01

    This book constructs a rigorous framework for analysing selected phenomena in evolutionary theory of populations arising due to the combined effects of migration, selection and mutation in a spatial stochastic population model, namely the evolution towards fitter and fitter types through punctuated equilibria. The discussion is based on a number of new methods, in particular multiple scale analysis, nonlinear Markov processes and their entrance laws, atomic measure-valued evolutions and new forms of duality (for state-dependent mutation and multitype selection) which are used to prove ergodic theorems in this context and are applicable for many other questions and renormalization analysis for a variety of phenomena (stasis, punctuated equilibrium, failure of naive branching approximations, biodiversity) which occur due to the combination of rare mutation, mutation, resampling, migration and selection and make it necessary to mathematically bridge the gap (in the limit) between time and space scales.
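The Fleming-Viot process arises as a large-population limit of finite-population resampling models. As a concrete (and much simpler) finite analogue combining resampling, selection and mutation, one can simulate a two-type Wright-Fisher chain; everything below (population size, rates, seed) is an illustrative assumption, not material from the book:

```python
import random

def wright_fisher(n, p0, s, mu, generations, seed=0):
    """Two-type Wright-Fisher chain with selection and mutation:
    n individuals, initial type-A frequency p0, selective advantage s
    for type A, and symmetric per-offspring mutation probability mu."""
    rng = random.Random(seed)
    p = p0
    for _ in range(generations):
        # selection: type A has relative fitness 1 + s
        ps = p * (1 + s) / (p * (1 + s) + (1 - p))
        # symmetric mutation A <-> B
        pa = ps * (1 - mu) + (1 - ps) * mu
        # resampling: n independent offspring draws
        p = sum(rng.random() < pa for _ in range(n)) / n
    return p

freq = wright_fisher(n=100, p0=0.5, s=0.1, mu=0.01, generations=200, seed=1)
```

With selection favouring type A and weak mutation, the frequency typically settles near a mutation-selection balance below 1; rescaling time and letting the population grow leads, in distribution, to Fleming-Viot-type limits.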

  13. Selecting an optimal mixed products using grey relationship model

    Directory of Open Access Journals (Sweden)

    Farshad Faezy Razi

    2013-06-01

    Full Text Available This paper presents an integrated supplier selection and inventory management approach using a grey relationship model (GRM) as well as a multi-objective decision-making process. The proposed model first ranks different suppliers based on the GRM technique and then determines the optimum level of inventory by considering different objectives. To show the implementation of the proposed model, we use some benchmark data presented by Talluri and Baker [Talluri, S., & Baker, R. C. (2002). A multi-phase mathematical programming approach for effective supply chain design. European Journal of Operational Research, 141(3), 544-558.]. The preliminary results indicate that the proposed model is capable of handling different criteria for supplier selection.
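A minimal sketch of the grey relational ranking step used in such models: normalise benefit-type criteria to [0, 1], measure each alternative's deviation from the ideal reference series, and average the grey relational coefficients into a grade. The supplier scores and the distinguishing coefficient zeta = 0.5 are illustrative assumptions, not the Talluri-Baker benchmark data:

```python
def grey_relational_grades(alternatives, zeta=0.5):
    """Grade each alternative (a row of benefit-type criterion scores,
    higher is better, no constant columns) by grey relational analysis."""
    cols = list(zip(*alternatives))
    # min-max normalisation of each criterion to [0, 1]
    norm_cols = [[(x - min(c)) / (max(c) - min(c)) for x in c] for c in cols]
    rows = list(zip(*norm_cols))
    # deviation from the ideal reference series (all ones)
    deltas = [[abs(1.0 - x) for x in row] for row in rows]
    dmin = min(min(r) for r in deltas)
    dmax = max(max(r) for r in deltas)
    coeff = [[(dmin + zeta * dmax) / (d + zeta * dmax) for d in row]
             for row in deltas]
    return [sum(row) / len(row) for row in coeff]

# hypothetical scores of three suppliers on three benefit criteria
grades = grey_relational_grades([[80, 70, 90], [60, 85, 75], [95, 60, 80]])
```

The grades induce the supplier ranking; in the integrated model, these ranks would then feed the multi-objective inventory optimisation.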

  14. Advanced oxidation processes: overall models

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, M. [Univ. de los Andes, Escuela Basica de Ingenieria, La Hechicera, Merida (Venezuela); Curco, D.; Addardak, A.; Gimenez, J.; Esplugas, S. [Dept. de Ingenieria Quimica. Univ. de Barcelona, Barcelona (Spain)

    2003-07-01

    Modelling AOPs implies considering all the steps included in the process, that is, the mass transfer, kinetic (reaction) and light-absorption steps. Accordingly, recent works develop models which relate the global reaction rate to catalyst concentration and radiation absorption. However, the application of such models requires knowing which step controls the overall process. In this paper, a simple method is explained which allows the controlling step to be determined. It is assumed that the reactor is divided into two hypothetical zones (dark and illuminated); according to the experimental results, obtained by varying only the reaction volume, it can be decided whether reaction occurs only in the illuminated zone or in the whole reactor, including the dark zone. The photocatalytic degradation of phenol, using Degussa P-25 titania as catalyst, is studied as the model reaction. The preliminary results obtained are presented here, suggesting that, in this case, reaction occurs only in the illuminated zone of the photoreactor. A model is developed to explain this behaviour. (orig.)
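The volume-variation diagnostic described above can be expressed with a simple first-order model: if degradation occurs only in the illuminated zone of a well-mixed batch reactor, the apparent rate constant scales with the illuminated volume fraction, so enlarging the dark volume slows the observed decay. A sketch under these stated assumptions (function names and values are illustrative, not from the paper):

```python
import math

def apparent_rate(k, v_illuminated, v_total):
    """Apparent first-order rate constant when reaction proceeds only in
    the illuminated zone of a well-mixed batch reactor."""
    return k * v_illuminated / v_total

def concentration(c0, k, v_illuminated, v_total, t):
    """Pollutant concentration at time t under the same assumptions."""
    return c0 * math.exp(-apparent_rate(k, v_illuminated, v_total) * t)
```

If, instead, reaction proceeded throughout the whole reactor, the observed rate would not drop as dark volume is added; comparing the two predictions against runs with different reaction volumes identifies the controlling regime.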

  15. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes amorphous experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  16. Process for Selecting System Level Assessments for Human System Technologies

    Science.gov (United States)

    Watts, James; Park, John

    2006-01-01

    The integration of the many life support systems necessary to construct a stable habitat is difficult. The correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessments and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues on topics at the system or component levels. Assessments are used to identify risks, encourage future actions to mitigate risks, or spur further studies.

  17. Factors influencing creep model equation selection

    International Nuclear Information System (INIS)

    Holdsworth, S.R.; Askins, M.; Baker, A.; Gariboldi, E.; Holmstroem, S.; Klenk, A.; Ringel, M.; Merckling, G.; Sandstrom, R.; Schwienheer, M.; Spigarelli, S.

    2008-01-01

    During the course of the EU-funded Advanced-Creep Thematic Network, ECCC-WG1 reviewed the applicability and effectiveness of a range of model equations to represent the accumulation of creep strain in various engineering alloys. In addition to considering the experience of network members, the ability of several models to describe the deformation characteristics of large single and multi-cast collations of ε(t,T,σ) creep curves has been evaluated in an intensive assessment inter-comparison activity involving three steels, 2 1/4 CrMo (P22), 9CrMoVNb (Steel-91) and 18Cr13NiMo (Type-316). The choice of the most appropriate creep model equation for a given application depends not only on the high-temperature deformation characteristics of the material under consideration, but also on the characteristics of the dataset, the number of casts for which creep curves are available, and the strain regime for which an analytical representation is required. The paper focuses on the factors which can influence creep model selection and the model-fitting approach for multi-source, multi-cast datasets

  18. Large deviations for the Fleming-Viot process with neutral mutation and selection

    OpenAIRE

    Dawson, Donald; Feng, Shui

    1998-01-01

    Large deviation principles are established for the Fleming-Viot processes with neutral mutation and selection, and the corresponding equilibrium measures as the sampling rate goes to 0. All results are first proved for the finite allele model, and then generalized, through the projective limit technique, to the infinite allele model. Explicit expressions are obtained for the rate functions.

  19. Communication activities for NUMO's site selection process

    International Nuclear Information System (INIS)

    Takeuchi, Mitsuo; Okuyama, Shigeru; Kitayama, Kazumi; Kuba, Michiyoshi

    2004-01-01

    A siting program for geological disposal of high-level radioactive waste (HLW) in Japan has just started and is moving into a new stage of communication with the public. A final repository site will be selected via a stepwise process, as stipulated in the Specified Radioactive Waste Final Disposal Act promulgated in June 2000. Based on the Act, the site selection process of the Nuclear Waste Management Organization of Japan (NUMO, established in October 2000) will be carried out in three steps: selection of Preliminary Investigation Areas (PIAs), selection of Detailed Investigation Areas (DIAs) and selection of the Repository Site. The Act also defines NUMO's responsibilities in terms of implementing the HLW disposal program in an open and transparent manner. NUMO fully understands the importance of public participation in its activities and aims to promote public involvement in the site selection process based on a fundamental policy, which consists of 'adopting a stepwise approach', 'respecting the initiative of municipalities' and 'ensuring transparency in information disclosure'. This policy is clearly reflected in the adoption of an open solicitation approach for volunteer municipalities for Preliminary Investigation Areas (PIAs). NUMO made the official announcement of the start of its open solicitation program on 19 December 2002. This paper outlines how NUMO's activities are currently carried out with a view to encouraging municipalities to volunteer as PIAs, and how public awareness of the safety of HLW disposal is evaluated at this stage

  20. The Formalization of the Business Process Modeling Goals

    Directory of Open Access Journals (Sweden)

    Ligita Bušinska

    2016-10-01

    Full Text Available In business process modeling, BPMN has emerged as the de facto standard. However, applications of this notation involve many subsets of elements and various extensions. BPMN also coexists with many other modeling languages, forming a large set of available options for business process modeling languages and dialects. While, in general, the modeler's goal is a central notion in the choice of modeling languages and notations, most research that proposes guidelines, techniques, and methods for business process modeling language evaluation and/or selection does not formalize the business process modeling goal or take it transparently into account. To overcome this gap, and to explicate and help handle business process modeling complexity, an approach to formalizing the business process modeling goal, and a supporting three-dimensional business process modeling framework, are proposed.

  1. Divided versus selective attention: evidence for common processing mechanisms.

    Science.gov (United States)

    Hahn, Britta; Wolkenberg, Frank A; Ross, Thomas J; Myers, Carol S; Heishman, Stephen J; Stein, Dan J; Kurup, Pradeep K; Stein, Elliot A

    2008-06-18

    The current study revisited the question of whether there are brain mechanisms specific to divided attention that differ from those used in selective attention. Increased neuronal activity required to simultaneously process two stimulus dimensions as compared with each separate dimension has often been observed, but rarely has activity induced by a divided attention condition exceeded the sum of activity induced by the component tasks. Healthy participants performed a selective-divided attention paradigm while undergoing functional Magnetic Resonance Imaging (fMRI). The task required participants to make a same-different judgment about either one of two simultaneously presented stimulus dimensions, or about both dimensions. Performance accuracy was equated between tasks by dynamically adjusting the stimulus display time. Blood Oxygenation Level Dependent (BOLD) signal differences between tasks were identified by whole-brain voxel-wise comparisons and by region-specific analyses of all areas modulated by the divided attention task (DIV). No region displayed greater activation or deactivation by DIV than the sum of signal change by the two selective attention tasks. Instead, regional activity followed the tasks' processing demands as reflected by reaction time. Only a left cerebellar region displayed a correlation between participants' BOLD signal intensity and reaction time that was selective for DIV. The correlation was positive, reflecting slower responding with greater activation. Overall, the findings do not support the existence of functional brain activity specific to DIV. Increased activity appears to reflect additional processing demands by introducing a secondary task, but those demands do not appear to qualitatively differ from processes of selective attention.

  2. Process-driven selection of information systems for healthcare

    Science.gov (United States)

    Mills, Stephen F.; Yeh, Raymond T.; Giroir, Brett P.; Tanik, Murat M.

    1995-05-01

    Integration of networking and data management technologies such as PACS, RIS and HIS into a healthcare enterprise in a clinically acceptable manner is a difficult problem. Data within such a facility are generally managed via a combination of manual hardcopy systems and proprietary, special-purpose data processing systems. Process modeling techniques have been successfully applied to engineering and manufacturing enterprises, but have not generally been applied to service-based enterprises such as healthcare facilities. The use of process modeling techniques can provide guidance for the placement, configuration and usage of PACS and other informatics technologies within the healthcare enterprise, and thus improve the quality of healthcare. Initial process modeling activities conducted within the Pediatric ICU at Children's Medical Center in Dallas, Texas are described. The ongoing development of a full enterprise- level model for the Pediatric ICU is also described.

  3. Selections from 2017: Image Processing with AstroImageJ

    Science.gov (United States)

    Kohler, Susanna

    2017-12-01

    Editor's note: In these last two weeks of 2017, we'll be looking at a few selections that we haven't yet discussed on AAS Nova from among the most-downloaded papers published in AAS journals this year. The usual posting schedule will resume in January. AstroImageJ: Image Processing and Photometric Extraction for Ultra-Precise Astronomical Light Curves. Published January 2017. The AIJ image display: a wide range of astronomy-specific image display options and image analysis tools are available from the menus, quick access icons, and interactive histogram. [Collins et al. 2017] Main takeaway: AstroImageJ is a new integrated software package presented in a publication led by Karen Collins (Vanderbilt University, Fisk University, and University of Louisville). It enables new users, even at the level of undergraduate student, high school student, or amateur astronomer, to quickly start processing, modeling, and plotting astronomical image data. Why it's interesting: Science doesn't just happen the moment a telescope captures a picture of a distant object. Instead, astronomical images must first be carefully processed to clean up the data, and this data must then be systematically analyzed to learn about the objects within it. AstroImageJ, as a GUI-driven, easily installed, public-domain tool, is a uniquely accessible tool for this processing and analysis, allowing even non-specialist users to explore and visualize astronomical data. Some features of AstroImageJ (as reported by Astrobites): Image calibration: generate master flat, dark, and bias frames. Image arithmetic: combine images via subtraction, addition, division, multiplication, etc. Stack editing: easily perform operations on a series of images. Image stabilization and image alignment features. Precise coordinate converters: calculate Heliocentric and Barycentric Julian Dates. WCS coordinates: determine precisely where a telescope was pointed for an image by plate solving using Astrometry.net. Macro and plugin support: write your own macros. Multi-aperture photometry

  4. Parameters in selective laser melting for processing metallic powders

    Science.gov (United States)

    Kurzynowski, Tomasz; Chlebus, Edward; Kuźnicka, Bogumiła; Reiner, Jacek

    2012-03-01

    The paper presents results of studies on Selective Laser Melting. SLM is an additive manufacturing technology which may be used to process almost all metallic materials in the form of powder. Energy emission sources, mainly fiber lasers and/or Nd:YAG lasers with similar characteristics and a wavelength of 1.06-1.08 microns, are provided primarily for processing metallic powder materials with high absorption of laser radiation. The paper presents results for selected variable parameters (laser power, scanning time, scanning strategy) and fixed parameters such as the protective atmosphere (argon, nitrogen, helium), temperature, and the type and shape of the powder material. The thematic scope is very broad, so the work focused on optimizing the process of selective laser micrometallurgy for producing fully dense parts. Density is closely linked with two other conditions: discontinuity of the microstructure (microcracks) and stability (repeatability) of the process. The materials used for the research were stainless steel 316L (AISI), tool steel H13 (AISI), and titanium alloy Ti6Al7Nb (ISO 5832-11). Studies were performed with a scanning electron microscope, light microscopes, a confocal microscope and a μCT scanner.

  5. Probabilistic evaluation of process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2016-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  6. Description of processes for the immobilization of selected transuranic wastes

    International Nuclear Information System (INIS)

    Timmerman, C.L.

    1980-12-01

    Processed sludge and incinerator-ash wastes contaminated with transuranic (TRU) elements may require immobilization to prevent the release of these elements to the environment. As part of the TRU Waste Immobilization Program sponsored by the Department of Energy (DOE), the Pacific Northwest Laboratory is developing applicable waste-form and processing technology that may meet this need. This report defines and describes processes that are capable of immobilizing a selected TRU waste-stream consisting of a blend of three parts process sludge and one part incinerator ash. These selected waste streams are based on the compositions and generation rates of the waste processing and incineration facility at the Rocky Flats Plant. The specific waste forms that could be produced by the described processes include: in-can melted borosilicate-glass monolith; joule-heated melter borosilicate-glass monolith or marble; joule-heated melter aluminosilicate-glass monolith or marble; joule-heated melter basaltic-glass monolith or marble; joule-heated melter glass-ceramic monolith; cast-cement monolith; pressed-cement pellet; and cold-pressed sintered-ceramic pellet

  7. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high-cost, or long-lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating the very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  8. On selection of optimal stochastic model for accelerated life testing

    International Nuclear Information System (INIS)

    Volf, P.; Timková, J.

    2014-01-01

    This paper deals with the problem of proper lifetime model selection in the context of statistical reliability analysis. Namely, we consider regression models describing the dependence of failure intensities on a covariate, for instance, a stressor. Testing the model fit is typically based on so-called martingale residuals. Their analysis has already been studied by many authors. Nevertheless, the Bayes approach to the problem, in spite of its advantages, is still developing. We present the Bayes procedure of estimation in several semi-parametric regression models of failure intensity. Our main concern is then the Bayes construction of residual processes and goodness-of-fit tests based on them. The method is illustrated with both artificial and real-data examples. - Highlights: • Statistical survival and reliability analysis and Bayes approach. • Bayes semi-parametric regression modeling in Cox's and AFT models. • Bayes version of martingale residuals and goodness-of-fit test

  9. Evaluation and selection of in-situ leaching mining method using analytic hierarchy process

    International Nuclear Information System (INIS)

    Zhao Heyong; Tan Kaixuan; Liu Huizhen

    2007-01-01

    Considering the complicated conditions and main influencing factors of in-situ leaching mining, a model and procedure for evaluating and selecting in-situ leaching mining methods are established based on the analytic hierarchy process. Taking a uranium mine in Xinjiang, China as an example, the application of this model is presented. The results of the analyses and calculations indicate that acid leaching is the optimum option. (authors)

  10. Understanding Managers Decision Making Process for Tools Selection in the Core Front End of Innovation

    DEFF Research Database (Denmark)

    Appio, Francesco P.; Achiche, Sofiane; McAloone, Tim C.

    2011-01-01

    New product development (NPD) describes the process of bringing a new product or service to the market. The Fuzzy Front End (FFE) of Innovation is the term describing the activities happening before the product development phase of NPD. In the FFE of innovation, several tools are used to facilita...... hypotheses are tested. A preliminary version of a theoretical model depicting the decision process of managers during tools selection in the FFE is proposed. The theoretical model is built from the constructed hypotheses....

  11. Some fuzzy techniques for staff selection process: A survey

    Science.gov (United States)

    Md Saad, R.; Ahmad, M. Z.; Abu, M. S.; Jusoh, M. S.

    2013-04-01

    With a high level of business competition, it is vital to have flexible staff who are able to adapt themselves to work circumstances. However, the staff selection process is not an easy task to solve, even when it is tackled in a simplified version containing only a single criterion and a homogeneous skill. When multiple criteria and various skills are involved, the problem becomes much more complicated. In addition, some information cannot be measured precisely. This is patently obvious when dealing with opinions, thoughts, feelings, beliefs, etc. One possible tool to handle this issue is fuzzy set theory. Therefore, the objective of this paper is to review the existing fuzzy techniques for solving the staff selection problem. It classifies several existing research methods and identifies areas where there is a gap and further research is needed. Finally, this paper concludes by suggesting new ideas for future research based on the gaps identified.
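
As an illustration of the fuzzy machinery surveyed in this record, the sketch below scores two hypothetical candidates using triangular fuzzy numbers for linguistic ratings, with centroid defuzzification and weighted aggregation. The rating scale, criteria, weights and candidate data are all invented for illustration; no specific technique from the surveyed papers is implied.

```python
# Minimal sketch of fuzzy candidate scoring with triangular fuzzy numbers.
# Scale, criteria, weights and candidates are illustrative assumptions.

def centroid(tfn):
    """Defuzzify a triangular fuzzy number (a, b, c) by its centroid."""
    a, b, c = tfn
    return (a + b + c) / 3.0

# Linguistic rating scale mapped to triangular fuzzy numbers (assumed).
SCALE = {
    "poor":      (0.0, 0.0, 0.25),
    "fair":      (0.0, 0.25, 0.5),
    "good":      (0.25, 0.5, 0.75),
    "very_good": (0.5, 0.75, 1.0),
    "excellent": (0.75, 1.0, 1.0),
}

def score(ratings, weights):
    """Weighted defuzzified score over criteria (ratings/weights are dicts)."""
    total_w = sum(weights.values())
    return sum(weights[c] * centroid(SCALE[r]) for c, r in ratings.items()) / total_w

candidate_a = {"skills": "excellent", "adaptability": "good", "experience": "fair"}
candidate_b = {"skills": "good", "adaptability": "very_good", "experience": "good"}
weights = {"skills": 0.5, "adaptability": 0.3, "experience": 0.2}

print(score(candidate_a, weights), score(candidate_b, weights))
```

A real fuzzy selection method would also handle fuzzy weights and ranking of fuzzy numbers directly; centroid defuzzification is only the simplest choice.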

  12. Selection processes in a citrus hybrid population using RAPD markers

    Directory of Open Access Journals (Sweden)

    Oliveira Roberto Pedroso de

    2003-01-01

    Full Text Available The objective of this work was to evaluate the processes of selection in a citrus hybrid population using segregation analysis of RAPD markers. The segregation of 123 RAPD markers between 'Cravo' mandarin (Citrus reticulata Blanco) and 'Pêra' sweet orange (C. sinensis (L.) Osbeck) was analysed in an F1 progeny of 94 hybrids. Genetic composition, diversity, heterozygosity, differences in chromosomal structure and the presence of deleterious recessive genes are discussed based on the segregation ratios obtained. A high percentage of markers showed skewness from the expected 1:1 segregation ratio in the F1 population. Many markers showed a 3:1 segregation ratio in both varieties and 1:3 in 'Pêra' sweet orange, probably due to directional selection processes. The distribution analysis of the frequencies of segregant markers in a hybrid population is a simple method that allows a better understanding of the genetics of the citrus group.

  13. Laser Process for Selective Emitter Silicon Solar Cells

    Directory of Open Access Journals (Sweden)

    G. Poulain

    2012-01-01

    Full Text Available Selective emitter solar cells can provide a significant increase in conversion efficiency. However, current approaches need many technological steps and alignment procedures. This paper reports on a preliminary attempt to reduce the number of processing steps and therefore the cost of selective emitter cells. In the developed procedure, a phosphorous glass covered with silicon nitride acts as the doping source. A laser is used to locally open the antireflection coating and at the same time achieve local phosphorus diffusion. In this process the standard chemical etching of the phosphorous glass is avoided. Sheet resistance variation from 100 Ω/sq to 40 Ω/sq is demonstrated with a nanosecond UV laser. Numerical simulation of the laser-matter interaction is discussed to understand the dopant diffusion efficiency. Preliminary solar cell results show a 0.5% improvement compared with a homogeneous emitter structure.

  14. Managing the Public Sector Research and Development Portfolio Selection Process: A Case Study of Quantitative Selection and Optimization

    Science.gov (United States)

    2016-09-01

    By Jason A. Schwartz, describing how public sector organizations can implement a research and development (R&D) portfolio optimization strategy to maximize the cost

  15. High-dimensional model estimation and model selection

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I will review concepts and algorithms from high-dimensional statistics for linear model estimation and model selection. I will particularly focus on the so-called p>>n setting where the number of variables p is much larger than the number of samples n. I will focus mostly on regularized statistical estimators that produce sparse models. Important examples include the LASSO and its matrix extension, the Graphical LASSO, and more recent non-convex methods such as the TREX. I will show the applicability of these estimators in a diverse range of scientific applications, such as sparse interaction graph recovery and high-dimensional classification and regression problems in genomics.
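
The LASSO mentioned in this record can be sketched compactly: the pure-NumPy cyclic coordinate descent below (soft-thresholding updates) recovers a sparse coefficient vector in a p >> n problem. The simulated data, penalty value and iteration count are illustrative assumptions, not a production solver.

```python
import numpy as np

# LASSO via cyclic coordinate descent, minimising
#   (1/2n) * ||y - Xb||^2 + lam * ||b||_1
# Data and penalty are assumed for illustration.

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]   # partial residual excluding feature j
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

rng = np.random.default_rng(0)
n, p = 40, 100                               # the p >> n setting
X = rng.standard_normal((n, p))
true_b = np.zeros(p)
true_b[:3] = [2.0, -1.5, 1.0]                # only three active features
y = X @ true_b + 0.05 * rng.standard_normal(n)

b_hat = lasso_cd(X, y, lam=0.1)
print("nonzero coefficients:", np.flatnonzero(np.abs(b_hat) > 1e-6))
```

The soft-thresholding step is what produces exact zeros, i.e. the sparsity that makes these estimators usable for model selection.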

  16. Analysis of Using Resources in Business Process Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Vasilecas Olegas

    2014-12-01

    Full Text Available One of the key purposes of Business Process Model and Notation (BPMN) is to support graphical representation of the process model. However, such models lack support for the graphical representation of the resources that processes use during simulation or execution of a process instance. The paper analyzes different methods and their extensions for resource modeling. Further, this article presents a selected set of resource properties that are relevant for resource modeling. The paper proposes an approach that explains how to use the selected set of resource properties to extend process modeling using BPMN and simulation tools. The approach is based on BPMN, with business process instances using resources concurrently.

  17. Yakima tribal perspectives on high level selection process

    International Nuclear Information System (INIS)

    Jim, R.; Wittman, J.; Tousley, D.R.; Hovis, J.B.

    1987-01-01

    When Congress went through the arduous process of fashioning a comprehensive plan for resolving the nation's long-standing nuclear waste problem, it explicitly recognized that past federal efforts in this area had been inadequate. Congress also recognized that the primary reasons for the failure of earlier federal efforts were the federal government's failure to seriously address very real technical questions about the geologic adequacy of prospective repository sites, and its failure to address the concerns of state, tribal, and local governments in the repository selection and development process

  18. Halo models of HI selected galaxies

    Science.gov (United States)

    Paul, Niladri; Choudhury, Tirthankar Roy; Paranjape, Aseem

    2018-06-01

    Modelling the distribution of neutral hydrogen (HI) in dark matter halos is important for studying galaxy evolution in the cosmological context. We use a novel approach to infer the HI-dark matter connection at the massive end (m_HI > 10^9.8 M_⊙) from radio HI emission surveys, using optical properties of low-redshift galaxies as an intermediary. In particular, we use a previously calibrated optical halo occupation distribution (HOD) describing the luminosity- and colour-dependent clustering of SDSS galaxies and describe the HI content using a statistical scaling relation between the optical properties and HI mass. This allows us to compute the abundance and clustering properties of HI-selected galaxies and compare with data from the ALFALFA survey. We apply an MCMC-based statistical analysis to constrain the free parameters related to the scaling relation. The resulting best-fit scaling relation identifies massive HI galaxies primarily with optically faint blue centrals, consistent with expectations from galaxy formation models. We compare the HI-stellar mass relation predicted by our model with independent observations from matched HI-optical galaxy samples, finding reasonable agreement. As a further application, we make some preliminary forecasts for future observations of HI and optical galaxies in the expected overlap volume of SKA and Euclid/LSST.
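
The MCMC step used to constrain scaling-relation parameters can be illustrated with a toy random-walk Metropolis sampler fitting a linear scaling relation y = a*x + b to synthetic data. Everything here (the synthetic data, noise level, flat priors and proposal scales) is an assumption for illustration and is unrelated to the actual ALFALFA analysis.

```python
import numpy as np

# Random-walk Metropolis sampling of a linear scaling relation.
# All data and tuning constants are invented for illustration.

def log_post(theta, x, y, sigma=0.3):
    a, b = theta
    resid = y - (a * x + b)
    return -0.5 * np.sum((resid / sigma) ** 2)   # Gaussian likelihood, flat priors

rng = np.random.default_rng(3)
x = rng.uniform(9.0, 11.0, 50)                   # e.g. a log optical mass proxy
y = 0.6 * x + 3.0 + rng.normal(0, 0.3, 50)       # synthetic "log HI mass"

theta = np.array([0.0, 0.0])
lp = log_post(theta, x, y)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.02, 0.2])    # random-walk proposal
    lp_prop = log_post(prop, x, y)
    if np.log(rng.uniform()) < lp_prop - lp:     # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain)[5000:]                   # drop burn-in
print(chain.mean(axis=0))                        # posterior means near (0.6, 3.0)
```

Production analyses would use a tuned or ensemble sampler and convergence diagnostics; the accept/reject core is the same.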

  19. Selecting a model of supersymmetry breaking mediation

    International Nuclear Information System (INIS)

    AbdusSalam, S. S.; Allanach, B. C.; Dolan, M. J.; Feroz, F.; Hobson, M. P.

    2009-01-01

    We study the problem of selecting between different mechanisms of supersymmetry breaking in the minimal supersymmetric standard model using current data. We evaluate the Bayesian evidence of four supersymmetry breaking scenarios: mSUGRA, mGMSB, mAMSB, and moduli mediation. The results show a strong dependence on the dark matter assumption. Using the inferred cosmological relic density as an upper bound, minimal anomaly mediation is at least moderately favored over the CMSSM. Our fits also indicate that evidence for a positive sign of the μ parameter is moderate at best. We present constraints on the anomaly and gauge mediated parameter spaces and some previously unexplored aspects of the dark matter phenomenology of the moduli mediation scenario. We use sparticle searches, indirect observables and dark matter observables in the global fit and quantify robustness with respect to prior choice. We quantify how much information is contained within each constraint.

  20. Selective Oxidation of Lignin Model Compounds.

    Science.gov (United States)

    Gao, Ruili; Li, Yanding; Kim, Hoon; Mobley, Justin K; Ralph, John

    2018-05-02

    Lignin, the planet's most abundant renewable source of aromatic compounds, is difficult to degrade efficiently to well-defined aromatics. We developed a microwave-assisted catalytic Swern oxidation system using an easily prepared catalyst, MoO2Cl2(DMSO)2, with DMSO as the solvent and oxidant. It demonstrated high efficiency in transforming lignin model compounds containing the units and functional groups found in native lignins. The aromatic ring substituents strongly influenced the selectivity of β-ether phenolic dimer cleavage to generate sinapaldehyde and coniferaldehyde, monomers not usually produced by oxidative methods. Time-course studies on two key intermediates provided insight into the reaction pathway. Owing to the broad scope of this oxidation system and the insight gleaned with regard to its mechanism, this strategy could be adapted and applied in a general sense to the production of useful aromatic chemicals from phenolics and lignin. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Modeling pellet impact drilling process

    Science.gov (United States)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high-velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling the process of pellet impact drilling, which creates the scientific and methodological basis for engineering design of drilling operations under different geo-technical conditions.

  2. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis

  3. Estimation of a multivariate mean under model selection uncertainty

    Directory of Open Access Journals (Sweden)

    Georges Nguefack-Tsague

    2014-05-01

    Full Text Available Model selection uncertainty would occur if we selected a model based on one data set and subsequently applied it for statistical inferences, because the "correct" model would not be selected with certainty. When the selection and inference are based on the same dataset, additional problems arise due to the correlation of the two stages (selection and inference). In this paper model selection uncertainty is considered and model averaging is proposed. The proposal is related to the theory of James and Stein for estimating more than three parameters from independent normal observations. We suggest that a model averaging scheme taking the selection procedure into account could be more appropriate than model selection alone. Some properties of this model averaging estimator are investigated; in particular, we show using Stein's results that it is a minimax estimator and can outperform Stein-type estimators.
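
The James-Stein result invoked in this record can be demonstrated numerically: the positive-part shrinkage estimator below attains lower total squared error than the raw observations when estimating ten normal means simultaneously. The simulation setup (true means, noise variance, replication count) is an illustrative assumption.

```python
import numpy as np

# Positive-part James-Stein estimator for a vector of normal means,
# compared against the raw observations (the MLE) by Monte Carlo.

def james_stein(x, sigma2=1.0):
    p = x.size
    shrink = max(0.0, 1.0 - (p - 2) * sigma2 / np.dot(x, x))  # positive part
    return shrink * x

rng = np.random.default_rng(1)
theta = rng.normal(0.0, 1.0, size=10)            # true means (p = 10 > 2)
mse_mle, mse_js = 0.0, 0.0
for _ in range(2000):
    x = theta + rng.standard_normal(theta.size)  # one noisy observation per mean
    mse_mle += np.sum((x - theta) ** 2)
    mse_js += np.sum((james_stein(x) - theta) ** 2)

print(mse_js / mse_mle)   # ratio below 1: shrinkage beats the MLE in total risk
```

This domination of the MLE for three or more means is exactly the phenomenon the record's model-averaging argument builds on.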

  4. Selective hydrogenolysis of α–O–4, β–O–4, and 4–O–5 C–O bonds of lignin-model compounds and lignin-containing stillage derived from cellulosic bioethanol processing

    NARCIS (Netherlands)

    Gómez-Monedero, B.; Ruiz, M. P.; Bimbela, F.; Faria, J.

    2017-01-01

    Benzyl phenyl ether (BPE), phenethyl phenyl ether (PPE) and diphenyl ether (DPE) have been selected as model compounds of the most abundant and significant ether linkages found within the complex structure of lignin (e.g. α–O–4, β–O–4, and 4–O–5, respectively). The catalytic hydrogenolysis of these

  5. Modelling Technical and Economic Parameters in Selection of Manufacturing Devices

    Directory of Open Access Journals (Sweden)

    Naqib Daneshjo

    2017-11-01

    Full Text Available Sustainable development of science and technology is also conditioned by the continuous development of the means of production, which play a key role in the structure of each production system. In the context of intelligent industry, the mechanical nature of the means of production is complemented by control and electronic devices. The selection of production machines for a technological process or project has so far been resolved in practice often only intuitively. With increasing intelligence, the number of variable parameters that have to be considered when choosing a production device is also increasing. It is therefore necessary during selection to use computing techniques and decision-making methods, ranging from heuristic methods to more precise methodological procedures. The authors present an innovative model for optimization of technical and economic parameters in the selection of manufacturing devices for Industry 4.0.

  6. Hidden Markov Model for Stock Selection

    Directory of Open Access Journals (Sweden)

    Nguyet Nguyen

    2015-10-01

    Full Text Available The hidden Markov model (HMM) is typically used to predict the hidden regimes of observation data. Therefore, this model finds applications in many different areas, such as speech recognition systems, computational molecular biology and financial market predictions. In this paper, we use the HMM for stock selection. We first use the HMM to make monthly regime predictions for four macroeconomic variables: inflation (consumer price index, CPI), the industrial production index (INDPRO), a stock market index (S&P 500) and market volatility (VIX). At the end of each month, we calibrate the HMM's parameters for each of these economic variables and predict its regimes for the next month. We then look back into historical data to find the time periods for which the four variables had regimes similar to the forecasted regimes. Within those similar periods, we analyze all of the S&P 500 stocks to identify which stock characteristics were well rewarded during those time periods and assign scores and corresponding weights for each of the stock characteristics. A composite score for each stock is calculated based on the scores and weights of its features. Based on this algorithm, we choose the 50 top-ranking stocks to buy. We compare the performance of the portfolio with the benchmark index, S&P 500. With an initial investment of $100 in December 1999, over 15 years, in December 2014, our portfolio had an average gain per annum of 14.9% versus 2.3% for the S&P 500.
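
The regime-prediction step described in this record can be sketched with the forward algorithm for a two-state Gaussian-emission HMM, which filters the probability of a high-volatility regime from simulated observations. The transition matrix and emission parameters below are invented assumptions, not the calibrated values from the paper.

```python
import numpy as np

# Forward-algorithm regime filtering for a 2-state Gaussian-emission HMM.
# Model parameters and simulated data are illustrative assumptions.

def forward_filter(obs, pi, A, means, stds):
    """Return P(state_t | obs_1..t) for each t (normalised forward variables)."""
    def emis(y):
        return np.exp(-0.5 * ((y - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))
    alpha = pi * emis(obs[0])
    alpha /= alpha.sum()
    out = [alpha]
    for y in obs[1:]:
        alpha = (A.T @ alpha) * emis(y)
        alpha /= alpha.sum()                 # normalise to avoid underflow
        out.append(alpha)
    return np.array(out)

A = np.array([[0.95, 0.05],                  # persistent "calm" regime
              [0.10, 0.90]])                 # "volatile" regime
pi = np.array([0.5, 0.5])
means = np.array([0.0, 0.0])
stds = np.array([1.0, 3.0])                  # regimes differ only in volatility

rng = np.random.default_rng(2)
obs = np.concatenate([rng.normal(0, 1, 100), rng.normal(0, 3, 100)])
probs = forward_filter(obs, pi, A, means, stds)
print(probs[:100, 1].mean(), probs[100:, 1].mean())
```

In a real calibration the parameters would themselves be re-estimated monthly (e.g. by Baum-Welch); the filtering recursion stays the same.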

  7. Psyche Mission: Scientific Models and Instrument Selection

    Science.gov (United States)

    Polanskey, C. A.; Elkins-Tanton, L. T.; Bell, J. F., III; Lawrence, D. J.; Marchi, S.; Park, R. S.; Russell, C. T.; Weiss, B. P.

    2017-12-01

    NASA has chosen to explore (16) Psyche with its 14th Discovery-class mission. Psyche is a 226-km diameter metallic asteroid hypothesized to be the exposed core of a planetesimal that was stripped of its rocky mantle by multiple hit-and-run collisions in the early solar system. The spacecraft launch is planned for 2022, with arrival at the asteroid in 2026 for 21 months of operations. The Psyche investigation has five primary scientific objectives: A. Determine whether Psyche is a core, or if it is unmelted material. B. Determine the relative ages of regions of Psyche's surface. C. Determine whether small metal bodies incorporate the same light elements as are expected in the Earth's high-pressure core. D. Determine whether Psyche was formed under conditions more oxidizing or more reducing than Earth's core. E. Characterize Psyche's topography. The mission's task was to select the appropriate instruments to meet these objectives. However, exploring a metal world, rather than one made of ice, rock, or gas, requires the development of new scientific models of Psyche to support the selection of appropriate instruments for the payload. If Psyche is indeed a planetary core, we expect that it should have a detectable magnetic field. However, the strength of the magnetic field can vary by orders of magnitude depending on the formational history of Psyche. The implications of both the extreme low-end and high-end predictions affect the magnetometer and mission design. For the imaging experiment, what can the team expect for the morphology of a heavily impacted metal body? Efforts are underway to further investigate the differences in crater morphology between high-velocity impacts into metal and into rock, so as to be prepared to interpret the images of Psyche when they are returned. Finally, elemental composition measurements at Psyche using nuclear spectroscopy encompass a new and unexplored phase space of gamma-ray and neutron measurements. We will present some end

  8. Collapse models and perceptual processes

    International Nuclear Information System (INIS)

    Ghirardi, Gian Carlo; Romano, Raffaele

    2014-01-01

    Theories including a collapse mechanism were presented several years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merits derive from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems as well as for the reduction associated with measurement processes and for the classical behavior of macroscopic objects. Since such theories qualify themselves not as new interpretations but as modifications of the standard theory, they can, in principle, be tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, it seems that recent technological developments allow at least precise limits to be put on the parameters characterizing the modifications of the evolution equation. Here we simply mention some of the recent investigations in this direction, while we mainly concentrate our attention on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures by means of standard macroscopic devices are discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. By discussing a toy model in detail, we make it plausible that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  9. Hillslope runoff processes and models

    Science.gov (United States)

    Kirkby, Mike

    1988-07-01

    Hillslope hydrology is concerned with the partition of precipitation as it passes through the vegetation and soil between overland flow and subsurface flow. Flow follows routes which attenuate and delay the flow to different extents, so that a knowledge of the relevant mechanisms is important. In the 1960s and 1970s, hillslope hydrology developed as a distinct topic through the application of new field observations to develop a generation of physically based forecasting models. In its short history, theory has continually been overturned by field observation. Thus the current tendency, particularly among temperate zone hydrologists, to dismiss all Hortonian overland flow as a myth, is now being corrected by a number of significant field studies which reveal the great range in both climatic and hillslope conditions. Some recent models have generally attempted to simplify the processes acting, for example including only vertical unsaturated flow and lateral saturated flows. Others explicitly forecast partial or contributing areas. With hindsight, the most complete and distributed models have generally shown little forecasting advantage over simpler approaches, perhaps trending towards reliable models which can run on desk top microcomputers. The variety now being recognised in hillslope hydrological responses should also lead to models which take account of more complex interactions, even if initially with a less secure physical and mathematical basis than the Richards equation. In particular, there is a need to respond to the variety of climatic responses, and to spatial variability on and beneath the surface, including the role of seepage macropores and pipes which call into question whether the hillside can be treated as a Darcian flow system.

  10. Continuous-Time Mean-Variance Portfolio Selection under the CEV Process

    OpenAIRE

    Ma, Hui-qiang

    2014-01-01

    We consider a continuous-time mean-variance portfolio selection model when stock price follows the constant elasticity of variance (CEV) process. The aim of this paper is to derive an optimal portfolio strategy and the efficient frontier. The mean-variance portfolio selection problem is formulated as a linearly constrained convex program problem. By employing the Lagrange multiplier method and stochastic optimal control theory, we obtain the optimal portfolio strategy and mean-variance effici...
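
This record solves mean-variance selection in continuous time under CEV dynamics; as a far simpler illustration of the same Lagrange-multiplier machinery, the sketch below solves the classical single-period minimum-variance problem with a target mean, whose closed form follows directly from the first-order conditions. The asset means and covariance matrix are invented assumptions.

```python
import numpy as np

# Single-period Markowitz problem:  min w'Σw  s.t.  w'μ = m,  w'1 = 1.
# The closed-form solution w = Σ^{-1}(λ·1 + γ·μ) comes from the Lagrangian;
# asset statistics below are illustrative assumptions.

def mean_variance_weights(mu, Sigma, target):
    ones = np.ones_like(mu)
    Si = np.linalg.inv(Sigma)
    a = ones @ Si @ ones
    b = ones @ Si @ mu
    c = mu @ Si @ mu
    d = a * c - b * b
    lam = (c - b * target) / d          # Lagrange multiplier for the budget
    gam = (a * target - b) / d          # Lagrange multiplier for the mean target
    return Si @ (lam * ones + gam * mu)

mu = np.array([0.05, 0.08, 0.12])
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])
w = mean_variance_weights(mu, Sigma, target=0.09)
print(w, w.sum(), w @ mu)
```

Sweeping the target over a range of values and plotting the resulting portfolio variance traces out the efficient frontier discussed in the record.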

  11. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as

  12. Process observation in fiber laser-based selective laser melting

    Science.gov (United States)

    Thombansen, Ulrich; Gatej, Alexander; Pereira, Milton

    2015-01-01

    The process observation in selective laser melting (SLM) focuses on observing the interaction point where the powder is processed. To provide process-relevant information, signals have to be acquired that are resolved in both time and space. Especially in high-power SLM, where more than 1 kW of laser power is used, processing speeds of several meters per second are required for high-quality processing results. Therefore, an implementation of a suitable process observation system has to acquire a large amount of spatially resolved data at low sampling speeds, or it has to restrict the acquisition to a predefined area at a high sampling speed. In any case, it is vitally important to synchronously record the laser beam position and the acquired signal. This is a prerequisite that allows the recorded data to become information. Today, most SLM systems employ f-theta lenses to focus the processing laser beam onto the powder bed. This report describes the drawbacks that result for process observation and suggests a variable retro-focus system which solves these issues. The beam quality of fiber lasers delivers the processing laser beam to the powder bed at relevant focus diameters, which is a key prerequisite for this solution to be viable. The optical train we present here couples the processing laser beam and the process observation coaxially, ensuring consistent alignment of the interaction zone and the observed area. With respect to signal processing, we have developed a solution that synchronously acquires signals from a pyrometer and the position of the laser beam by sampling the data with a field programmable gate array. The relevance of the acquired signals has been validated by the scanning of a sample filament. Experiments with grooved samples show a correlation between different powder thicknesses and the acquired signals at relevant processing parameters. This basic work takes a first step toward self-optimization of the manufacturing process in SLM. It enables the

  13. Distribution of Selected Trace Elements in the Bayer Process

    Directory of Open Access Journals (Sweden)

    Johannes Vind

    2018-05-01

    Full Text Available The aim of this work was to achieve an understanding of the distribution of selected bauxite trace elements (gallium (Ga), vanadium (V), arsenic (As), chromium (Cr), rare earth elements (REEs), scandium (Sc)) in the Bayer process. The assessment was designed as a case study in an alumina plant in operation to provide an overview of the trace elements' behaviour in an actual industrial setup. A combination of analytical techniques was used, mainly inductively coupled plasma mass spectrometry and optical emission spectroscopy as well as instrumental neutron activation analysis. It was found that Ga, V and As, as well as, to a minor extent, Cr, are principally accumulated in Bayer process liquors. In addition, Ga is also fractionated to alumina at the end of the Bayer processing cycle. The rest of these elements pass to bauxite residue. REEs and Sc tend to remain practically unaffected in the solid phases of the Bayer process and, therefore, at least 98% of their mass is transferred to bauxite residue. The interest in such a study originates from the fact that many of these trace constituents of bauxite ore could potentially become valuable by-products of the Bayer process; therefore, the understanding of their behaviour needs to be expanded. In fact, Ga and V are already by-products of the Bayer process, but their distribution patterns have not been provided in the existing open literature.

  14. Process cost and facility considerations in the selection of primary cell culture clarification technology.

    Science.gov (United States)

    Felo, Michael; Christensen, Brandon; Higgins, John

    2013-01-01

    The bioreactor volume delineating the selection of primary clarification technology is not always easily defined. Development of a commercial-scale process for the manufacture of therapeutic proteins requires scale-up from a few liters to thousands of liters. While the separation techniques used for protein purification are largely conserved across scales, the separation techniques for primary cell culture clarification vary with scale. Process models were developed to compare monoclonal antibody production costs using two cell culture clarification technologies. One process model was created for cell culture clarification by disc stack centrifugation with depth filtration. A second process model was created for clarification by multi-stage depth filtration. Analyses were performed to examine the influence of bioreactor volume, product titer, depth filter capacity, and facility utilization on overall operating costs. At bioreactor volumes of 5,000 L and above, clarification using centrifugation followed by depth filtration offers significant cost savings. For bioreactor volumes of ∼2,000 L, clarification costs are similar between depth filtration and centrifugation. At this scale, factors including facility utilization, available capital, ease of process development, implementation timelines, and process performance characterization play an important role in clarification technology selection. In the case study presented, a multi-product facility selected multi-stage depth filtration for cell culture clarification at the 500 and 2,000 L scales of operation. Facility implementation timelines, process development activities, equipment commissioning and validation, scale-up effects, and process robustness are examined. © 2013 American Institute of Chemical Engineers.
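
The crossover logic in this record (consumable-dominated depth filtration versus capital-dominated centrifugation) can be sketched as two toy cost functions of bioreactor volume. Every number below is an invented assumption chosen only to reproduce the qualitative pattern of the record, not the paper's actual figures.

```python
# Toy per-batch clarification cost model. Depth filtration cost scales with
# filter area (roughly linear in volume); centrifugation carries a large
# amortised fixed charge with weak volume dependence. All values are
# hypothetical assumptions for illustration.

def depth_filtration_cost(volume_l, usd_per_l=2.0):
    return usd_per_l * volume_l             # consumables dominate

def centrifuge_cost(volume_l, fixed=3000.0, usd_per_l=0.5):
    return fixed + usd_per_l * volume_l     # amortised capital dominates

for v in (500, 2000, 5000):
    print(v, depth_filtration_cost(v), centrifuge_cost(v))
```

With these assumed parameters the two curves cross at 2,000 L, mirroring the record's qualitative conclusion that depth filtration wins at small scale, costs are comparable around 2,000 L, and centrifugation wins at large scale.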

  15. Special concrete shield selection using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Abulfaraj, W.H.

    1994-01-01

    Special types of concrete radiation shields that depend on locally available materials and have improved properties for both neutron and gamma-ray attenuation were developed by using plastic materials and heavy ores. The analytic hierarchy process (AHP) is implemented to evaluate these types for selecting the best biological radiation shield for nuclear reactors. Factors affecting the selection decision are degree of protection against neutrons, degree of protection against gamma rays, suitability of the concrete as building material, and economic considerations. The seven concrete alternatives are barite-polyethylene concrete, barite-polyvinyl chloride (PVC) concrete, barite-portland cement concrete, pyrite-polyethylene concrete, pyrite-PVC concrete, pyrite-portland cement concrete, and ordinary concrete. The AHP analysis shows the superiority of pyrite-polyethylene concrete over the others
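
    The AHP prioritisation step used in the study can be sketched as follows. The 4×4 pairwise comparison matrix over the four stated criteria (neutron protection, gamma-ray protection, suitability as building material, economics) is invented for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for the four selection criteria:
# 0 neutron protection, 1 gamma protection, 2 building-material suitability,
# 3 economics. Entry A[i, j] is the judged importance of i relative to j.
A = np.array([
    [1,   2,   4,   3],
    [1/2, 1,   3,   2],
    [1/4, 1/3, 1,   1/2],
    [1/3, 1/2, 2,   1],
])

def ahp_priorities(M):
    """Priority vector via the geometric-mean (row) method."""
    gm = np.prod(M, axis=1) ** (1.0 / M.shape[0])
    return gm / gm.sum()

def consistency_ratio(M, w):
    """CR = CI / RI; CR < 0.1 is conventionally considered acceptable."""
    n = M.shape[0]
    lam = (M @ w / w).mean()                # principal eigenvalue estimate
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # random-index table
    return ci / ri

w = ahp_priorities(A)
print(w, consistency_ratio(A, w))
```

    In the full method, each concrete alternative is scored the same way against every criterion, and the criterion weights `w` aggregate those scores into a final ranking.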

  16. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash-Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against an independent dataset from that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent
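
    As a concrete instance of point (1), the Nash-Sutcliffe efficiency mentioned above is simple to compute, and its weakness is easy to see: a simulation that merely reproduces the observed mean scores exactly 0, while a simulation that misses the peak can still score well. The data below are made up for illustration.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 = perfect; 0 = no better than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([1.0, 2.0, 6.0, 2.0, 1.0])     # made-up daily observations
flat = np.full_like(obs, obs.mean())          # "simulation" with no dynamics
peaky = np.array([1.0, 2.0, 5.0, 3.0, 1.0])   # imperfect but realistic shape

print(nse(obs, flat))    # 0.0: the statistic's floor for a mean-only model
print(nse(obs, peaky))
```

    Because a broad band of quite different simulations all land in the "acceptable" NSE range, complementary statistics (bias, peak error, low-flow measures) are usually needed to discriminate between them.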

  17. Computer modeling of lung cancer diagnosis-to-treatment process.

    Science.gov (United States)

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U; Yu, Xinhua; Faris, Nick; Li, Jingshan

    2015-08-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model of the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their application in healthcare are introduced, and the approach to deriving a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed.
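
    A minimal sketch of the Markov chain approach mentioned above: with treatment as an absorbing state, the fundamental matrix gives closed-form expected times through the diagnosis process. The states and transition probabilities here are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical 4-state diagnosis-to-treatment chain (one step = one week):
# 0 referral, 1 diagnostic workup, 2 staging, 3 treatment (absorbing).
P = np.array([
    [0.2, 0.8, 0.0, 0.0],
    [0.0, 0.3, 0.6, 0.1],
    [0.0, 0.0, 0.4, 0.6],
    [0.0, 0.0, 0.0, 1.0],
])

Q = P[:3, :3]                        # transient-to-transient block
N = np.linalg.inv(np.eye(3) - Q)     # fundamental matrix
t = N @ np.ones(3)                   # expected weeks to treatment, per state
print(t)                             # decreasing: referral takes longest
```

    This is the kind of closed-formula performance evaluation the abstract contrasts with discrete event simulation: no sampling is needed, at the price of the memoryless assumption.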

  18. Variable Selection for Regression Models of Percentile Flows

    Science.gov (United States)

    Fouad, G.

    2017-12-01

    Percentile flows describe the flow magnitude equaled or exceeded for a given percent of time, and are widely used in water resource management. However, these statistics are normally unavailable since most basins are ungauged. Percentile flows of ungauged basins are often predicted using regression models based on readily observable basin characteristics, such as mean elevation. The number of these independent variables is too large to evaluate all possible models. A subset of models is typically evaluated using automatic procedures, like stepwise regression. This ignores a large variety of methods from the field of feature (variable) selection and physical understanding of percentile flows. A study of 918 basins in the United States was conducted to compare an automatic regression procedure to the following variable selection methods: (1) principal component analysis, (2) correlation analysis, (3) random forests, (4) genetic programming, (5) Bayesian networks, and (6) physical understanding. The automatic regression procedure only performed better than principal component analysis. Poor performance of the regression procedure was due to a commonly used filter for multicollinearity, which rejected the strongest models because they had cross-correlated independent variables. Multicollinearity did not decrease model performance in validation because of a representative set of calibration basins. Variable selection methods based strictly on predictive power (numbers 2-5 from above) performed similarly, likely indicating a limit to the predictive power of the variables. Similar performance was also reached using variables selected based on physical understanding, a finding that substantiates recent calls to emphasize physical understanding in modeling for predictions in ungauged basins. The strongest variables highlighted the importance of geology and land cover, whereas widely used topographic variables were the weakest predictors. Variables suffered from a high
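
    The interplay between predictive-power ranking and a multicollinearity filter discussed above can be sketched on synthetic data (variable roles and the 0.8 threshold are illustrative assumptions): the filter keeps only one member of a strongly cross-correlated pair, which is exactly the behaviour that rejected the strongest models in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Hypothetical basin attributes: two strong, cross-correlated predictors
# (e.g. two related geology indices) and one weak, independent one.
x1 = rng.normal(size=n)
x2 = x1 + 0.2 * rng.normal(size=n)        # highly collinear with x1
x3 = rng.normal(size=n)
y = x1 + x2 + 0.3 * x3 + rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

def select(X, y, r_max=0.8):
    """Greedy filter: rank variables by |corr with y|, then drop any
    candidate whose correlation with an already-selected variable
    exceeds r_max (a common multicollinearity screen)."""
    strength = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    chosen = []
    for j in np.argsort(-strength):
        if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < r_max for k in chosen):
            chosen.append(j)
    return chosen

print(select(X, y))   # the collinear x1/x2 pair is pruned to one member
```

    The study's point is that this pruning, while standard practice, can discard genuinely strong predictor combinations when the calibration basins are representative enough that multicollinearity does not harm validation.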

  19. SELECTION AND PRELIMINARY EVALUATION OF ALTERNATIVE REDUCTANTS FOR SRAT PROCESSING

    Energy Technology Data Exchange (ETDEWEB)

    Stone, M.; Pickenheim, B.; Peeler, D.

    2009-06-30

    Defense Waste Processing Facility - Engineering (DWPF-E) has requested the Savannah River National Laboratory (SRNL) to perform scoping evaluations of alternative flowsheets with the primary focus on alternatives to formic acid during Chemical Process Cell (CPC) processing. The reductants shown below were selected for testing during the evaluation of alternative reductants for Sludge Receipt and Adjustment Tank (SRAT) processing. The reductants fall into two general categories: reducing acids and non-acidic reducing agents. Reducing acids were selected as direct replacements for formic acid to reduce mercury in the SRAT, to acidify the sludge, and to balance the melter REDuction/OXidation potential (REDOX). Non-acidic reductants were selected as melter reductants and would not be able to reduce mercury in the SRAT. Sugar was not tested during this scoping evaluation as previous work has already been conducted on the use of sugar with DWPF feeds. Based on the testing performed, the only viable short-term path to mitigating hydrogen generation in the CPC is replacement of formic acid with a mixture of glycolic and formic acids. An experiment using glycolic acid blended with formic acid on an 80:20 molar basis was able to reduce mercury, while also targeting a predicted REDuction/OXidation (REDOX) ratio of 0.2 expressed as Fe²⁺/ΣFe. Based on this result, SRNL recommends performing a complete CPC demonstration of the glycolic/formic acid flowsheet, followed by design basis development and documentation. Of the options tested recently and in the past, the nitric/glycolic/formic blended-acid flowsheet has the potential for near-term implementation in the existing CPC equipment, providing rapid throughput improvement. Use of a non-acidic reductant is recommended only if the processing constraints to remove mercury and acidify the sludge are eliminated. The non-acidic reductants (e.g. sugar) will not reduce mercury during CPC processing and sludge acidification would

  20. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    International Nuclear Information System (INIS)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles

  1. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    Energy Technology Data Exchange (ETDEWEB)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles.

  2. Process selection methodology for service management in SME

    Directory of Open Access Journals (Sweden)

    Juan Luis Rubio Sánchez

    2017-09-01

    Full Text Available It is a fact that more and more companies' operations rely on information and communication technologies (ICT). Traditional management models need to be adapted to this new reality. That is why initiatives are emerging (COBIT [control objectives for information and related technology], CMMI [capability maturity model integration], ITIL [information technology infrastructure library], etc.) that aim to provide guidance on the most suitable processes, metrics and technology management indicators. This document focuses on ITIL, the best representation of what has been called IT Governance. ITIL is a reference in technology services companies and in the ICT departments of any company, owing to the high utility and broad process coverage it provides. Implementation of a management model based on ITIL processes forces companies to make a relevant decision: which processes should be implemented, and which one should come first? The answer to these and other questions is not easy, because the adoption of these processes implies an economic investment. This article presents an approach to the implementation order, so that the company's position can be optimized relative to the competition in its sector, to similar-sized companies, or to any other parameter one could define.
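
    One simple way to frame the "which process first?" decision is a weighted scoring across criteria. Everything below (the process subset, criteria, weights, and scores) is an invented illustration of the idea, not ITIL guidance or the article's actual method.

```python
# Hypothetical weighted-scoring sketch for ordering ITIL process adoption.
# Criterion weights sum the trade-off: impact and maturity gap count for,
# implementation cost counts against (negative weight). Scores are 1-5.
criteria = {"business_impact": 0.5, "implementation_cost": -0.2, "maturity_gap": 0.3}
processes = {
    "Incident Management": {"business_impact": 5, "implementation_cost": 2, "maturity_gap": 4},
    "Change Management":   {"business_impact": 4, "implementation_cost": 3, "maturity_gap": 5},
    "Capacity Management": {"business_impact": 3, "implementation_cost": 4, "maturity_gap": 2},
}

def score(name):
    return sum(w * processes[name][c] for c, w in criteria.items())

order = sorted(processes, key=score, reverse=True)
print(order)   # implementation order, highest-value process first
```

    In practice the weights themselves would be derived from the economic and competitive parameters the article discusses, rather than set by hand.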

  3. Modeling selective pressures on phytoplankton in the global ocean.

    Directory of Open Access Journals (Sweden)

    Jason G Bragg

    Full Text Available Our view of marine microbes is transforming, as culture-independent methods facilitate rapid characterization of microbial diversity. It is difficult to assimilate this information into our understanding of marine microbe ecology and evolution, because their distributions, traits, and genomes are shaped by forces that are complex and dynamic. Here we incorporate diverse forces--physical, biogeochemical, ecological, and mutational--into a global ocean model to study selective pressures on a simple trait in a widely distributed lineage of picophytoplankton: the nitrogen use abilities of Synechococcus and Prochlorococcus cyanobacteria. Some Prochlorococcus ecotypes have lost the ability to use nitrate, whereas their close relatives, marine Synechococcus, typically retain it. We impose mutations for the loss of nitrogen use abilities in modeled picophytoplankton, and ask: in which parts of the ocean are mutants most disadvantaged by losing the ability to use nitrate, and in which parts are they least disadvantaged? Our model predicts that this selective disadvantage is smallest for picophytoplankton that live in tropical regions where Prochlorococcus are abundant in the real ocean. Conversely, the selective disadvantage of losing the ability to use nitrate is larger for modeled picophytoplankton that live at higher latitudes, where Synechococcus are abundant. In regions where we expect Prochlorococcus and Synechococcus populations to cycle seasonally in the real ocean, we find that model ecotypes with seasonal population dynamics similar to Prochlorococcus are less disadvantaged by losing the ability to use nitrate than model ecotypes with seasonal population dynamics similar to Synechococcus. The model predictions for the selective advantage associated with nitrate use are broadly consistent with the distribution of this ability among marine picocyanobacteria, and at finer scales, can provide insights into interactions between temporally varying

  4. Modeling selective pressures on phytoplankton in the global ocean.

    Science.gov (United States)

    Bragg, Jason G; Dutkiewicz, Stephanie; Jahn, Oliver; Follows, Michael J; Chisholm, Sallie W

    2010-03-10

    Our view of marine microbes is transforming, as culture-independent methods facilitate rapid characterization of microbial diversity. It is difficult to assimilate this information into our understanding of marine microbe ecology and evolution, because their distributions, traits, and genomes are shaped by forces that are complex and dynamic. Here we incorporate diverse forces--physical, biogeochemical, ecological, and mutational--into a global ocean model to study selective pressures on a simple trait in a widely distributed lineage of picophytoplankton: the nitrogen use abilities of Synechococcus and Prochlorococcus cyanobacteria. Some Prochlorococcus ecotypes have lost the ability to use nitrate, whereas their close relatives, marine Synechococcus, typically retain it. We impose mutations for the loss of nitrogen use abilities in modeled picophytoplankton, and ask: in which parts of the ocean are mutants most disadvantaged by losing the ability to use nitrate, and in which parts are they least disadvantaged? Our model predicts that this selective disadvantage is smallest for picophytoplankton that live in tropical regions where Prochlorococcus are abundant in the real ocean. Conversely, the selective disadvantage of losing the ability to use nitrate is larger for modeled picophytoplankton that live at higher latitudes, where Synechococcus are abundant. In regions where we expect Prochlorococcus and Synechococcus populations to cycle seasonally in the real ocean, we find that model ecotypes with seasonal population dynamics similar to Prochlorococcus are less disadvantaged by losing the ability to use nitrate than model ecotypes with seasonal population dynamics similar to Synechococcus. The model predictions for the selective advantage associated with nitrate use are broadly consistent with the distribution of this ability among marine picocyanobacteria, and at finer scales, can provide insights into interactions between temporally varying ocean processes and

  5. Selective blockade of microRNA processing by Lin-28

    Science.gov (United States)

    Viswanathan, Srinivas R.; Daley, George Q.; Gregory, Richard I.

    2012-01-01

    MicroRNAs (miRNAs) play critical roles in development, and dysregulation of miRNA expression has been observed in human malignancies. Recent evidence suggests that the processing of several primary miRNA transcripts (pri-miRNAs) is blocked post-transcriptionally in embryonic stem (ES) cells, embryonal carcinoma (EC) cells, and primary tumors. Here we show that Lin-28, a developmentally regulated RNA-binding protein, selectively blocks the processing of pri-let-7 miRNAs in embryonic cells. Using in vitro and in vivo studies, we demonstrate that Lin-28 is necessary and sufficient for blocking Microprocessor-mediated cleavage of pri-let-7 miRNAs. Our results identify Lin-28 as a negative regulator of miRNA biogenesis and suggest that Lin-28 may play a central role in blocking miRNA-mediated differentiation in stem cells and certain cancers. PMID:18292307

  6. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    Probabilistic properties of Cox processes of relevance for statistical modelling and inference are studied. Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties and point process operations such as thinning, displacements, and superpositioning. We also discuss how to simulate specific Cox processes.

  7. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration with a major international engineering company.

  8. Automatic extraction of process categories from process model collections

    NARCIS (Netherlands)

    Malinova, M.; Dijkman, R.M.; Mendling, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Many organizations build up their business process management activities in an incremental way. As a result, there is no overarching structure defined at the beginning. However, as business process modeling initiatives often yield hundreds to thousands of process models, there is a growing need for

  9. A CONCEPTUAL MODEL FOR IMPROVED PROJECT SELECTION AND PRIORITISATION

    Directory of Open Access Journals (Sweden)

    P. J. Viljoen

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Project portfolio management processes are often designed and operated as a series of stages (or project phases) and gates. However, the flow of such a process is often slow, characterised by queues waiting for a gate decision and by repeated work from previous stages waiting for additional information or for re-processing. In this paper the authors propose a conceptual model that applies supply chain and constraint management principles to the project portfolio management process. An advantage of the proposed model is that it provides the ability to select and prioritise projects without undue changes to project schedules. This should result in faster flow through the system.

    AFRIKAANSE OPSOMMING: Processes for managing portfolios of projects are normally designed and operated as a series of phases and gates. The flow through such a process is often slow and is characterised by queues waiting for decisions at the gates, and also by rework from earlier phases waiting for further information or for reprocessing. In this article a conceptual model is proposed. The model rests on the principles of supply chains as well as constraint management, and offers the advantage that projects can be selected and prioritised without unnecessary changes to project schedules. This should lead to accelerated flow through the system.

  10. A new Russell model for selecting suppliers

    NARCIS (Netherlands)

    Azadi, Majid; Shabani, Amir; Farzipoor Saen, Reza

    2014-01-01

    Recently, supply chain management (SCM) has been considered by many researchers. Supplier evaluation and selection plays a significant role in establishing an effective SCM. One of the techniques that can be used for selecting suppliers is data envelopment analysis (DEA). In some situations, to

  11. Process Design Aspects for Scandium-Selective Leaching of Bauxite Residue with Sulfuric Acid

    OpenAIRE

    Konstantinos Hatzilyberis; Theopisti Lymperopoulou; Lamprini-Areti Tsakanika; Klaus-Michael Ochsenkühn; Paraskevas Georgiou; Nikolaos Defteraios; Fotios Tsopelas; Maria Ochsenkühn-Petropoulou

    2018-01-01

    Aiming at the industrial scale development of a Scandium (Sc)-selective leaching process of Bauxite Residue (BR), a set of process design aspects has been investigated. The interpretation of experimental data for Sc leaching yield, with sulfuric acid as the leaching solvent, has shown significant impact from acid feed concentration, mixing time, liquid to solids ratio (L/S), and number of cycles of leachate re-usage onto fresh BR. The thin film diffusion model, as the fundamental theory for l...

  12. Nutritional and toxicological composition analysis of selected cassava processed products

    Directory of Open Access Journals (Sweden)

    Kuda Dewage Supun Charuni Nilangeka Rajapaksha

    2017-01-01

    Full Text Available Cassava (Manihot esculenta Crantz) is an important food source in tropical countries, where it can withstand environmentally stressed conditions. Cassava and its processed products have a high demand in both the local and export markets of Sri Lanka. MU51 is one of the more common cassava varieties, and boiling is the main consumption pattern of cassava among Sri Lankans. The limited utilization of cassava is due to the presence of cyanide, which is a toxic substance. This research was designed to analyse the nutritional composition and toxicological (cyanide) content of the cassava MU51 variety and selected processed products of cassava MU51 (boiled, starch, flour, chips, and two chips varieties purchased from the market) to identify the effect of processing on the MU51 variety. Nutritional composition was analysed by AOAC (2012) methods with modifications, and cyanide content was determined following the picric acid method of spectrophotometric determination. The flesh of the MU51 variety and the different processed products of cassava had an average range of moisture content (3.18 - 61.94%), total fat (0.31 - 23.30%), crude fiber (0.94 - 2.15%), protein (1.67 - 3.71%) and carbohydrates (32.68 - 84.20%), which varied significantly between the products and the variety MU51, whereas no significant difference (p > 0.05) was observed in ash content between MU51 flesh and the processed products, where it ranged from 1.02 to 1.91%. However, the boiled product and MU51 flesh had more similar results in their nutritional composition, showing no significant difference in any of the nutrients analysed. Thus, boiling may have no significant effect on the nutrient composition of raw cassava. Cyanide content of the MU51 flesh and selected products (boiled, starch, flour and chips) prepared using the MU51 variety showed wide variation, ranging from 4.68 mg.kg-1 to 33.92 mg.kg-1 on a dry basis. However, except for boiled cassava, all processed products had cyanide content <10 mg.kg-1, which

  13. Process mining using BPMN: relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; van der Aalst, W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2017-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  14. Process mining using BPMN : relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; Aalst, van der W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2015-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  15. Modeling Suspension and Continuation of a Process

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2012-04-01

    Full Text Available This work focuses on the difficulties an analyst encounters when modeling suspension and continuation of a process in contemporary process modeling languages. As a basis, a general lifecycle of an activity is introduced and then compared to the activity lifecycles supported by individual process modeling languages. The comparison shows that contemporary process modeling languages cover the defined general lifecycle of an activity only partially. Two popular process modeling languages are then selected and a real example is modeled, reviewing how each language copes with its lack of native support for suspension and continuation of an activity. Given the unsatisfying results of the contemporary process modeling languages on the modeled example, a new process modeling language is presented which, as demonstrated, is capable of capturing suspension and continuation of an activity in a much simpler and more precise way.

  16. Equifinality and process-based modelling

    Science.gov (United States)

    Khatami, S.; Peel, M. C.; Peterson, T. J.; Western, A. W.

    2017-12-01

    Equifinality is understood as one of the fundamental difficulties in the study of open complex systems, including catchment hydrology. A review of the hydrologic literature reveals that the term equifinality has been widely used, but in many cases inconsistently and without coherent recognition of the various facets of equifinality, which can lead to ambiguity but also methodological fallacies. Therefore, in this study we first characterise the term equifinality within the context of hydrological modelling by reviewing the genesis of the concept of equifinality and then presenting a theoretical framework. During past decades, equifinality has mainly been studied as a subset of aleatory (arising due to randomness) uncertainty and for the assessment of model parameter uncertainty. Although the connection between parameter uncertainty and equifinality is undeniable, we argue there is more to equifinality than just aleatory parameter uncertainty. That is, the importance of equifinality and epistemic uncertainty (arising due to lack of knowledge) and their implications is overlooked in our current practice of model evaluation. Equifinality and epistemic uncertainty in studying, modelling, and evaluating hydrologic processes are treated as if they can be simply discussed in (or often reduced to) probabilistic terms (as for aleatory uncertainty). The deficiencies of this approach to conceptual rainfall-runoff modelling are demonstrated for selected Australian catchments by examination of parameter and internal flux distributions and interactions within SIMHYD. On this basis, we present a new approach that expands the equifinality concept beyond model parameters to inform epistemic uncertainty. The new approach potentially facilitates the identification and development of more physically plausible models and model evaluation schemes, particularly within the multiple working hypotheses framework, and is generalisable to other fields of environmental modelling as well.
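
    The parameter-equifinality facet is easy to demonstrate with a toy model (illustrative only, not SIMHYD): whenever two parameters enter the model only as a product, infinitely many parameter pairs reproduce the calibration data exactly, so no amount of calibration can identify them individually.

```python
import numpy as np

# Toy two-parameter runoff model: Q = a * b * P. Because a and b only ever
# appear as a product, any (a, b) with the same product is equifinal.
P = np.array([1.0, 3.0, 2.0, 5.0])          # rainfall inputs

def model(P, a, b):
    return a * b * P

Q_obs = model(P, a=0.5, b=0.6)              # "truth": product 0.30
Q_alt = model(P, a=0.1, b=3.0)              # different parameters, same product

print(np.allclose(Q_obs, Q_alt))            # True: equifinal parameter sets
```

    Real conceptual models exhibit softer versions of the same structure (compensating stores and rate constants), which is why parameter distributions and internal fluxes, not just output fit, are examined in the study.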

  17. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  18. Algorithms of control parameters selection for automation of FDM 3D printing process

    Directory of Open Access Journals (Sweden)

    Kogut Paweł

    2017-01-01

    Full Text Available The paper presents algorithms of control parameters selection of the Fused Deposition Modelling (FDM technology in case of an open printing solutions environment and 3DGence ONE printer. The following parameters were distinguished: model mesh density, material flow speed, cooling performance, retraction and printing speeds. These parameters are independent in principle printing system, but in fact to a certain degree that results from the selected printing equipment features. This is the first step for automation of the 3D printing process in FDM technology.

  19. Modeling and generating input processes

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, M.E.

    1987-01-01

    This tutorial paper provides information relevant to the selection and generation of stochastic inputs to simulation studies. The primary area considered is multivariate but much of the philosophy at least is relevant to univariate inputs as well. 14 refs.
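
    A standard recipe for the multivariate case the tutorial covers is to impose a target covariance on independent normal draws via a Cholesky factor. The covariance matrix below is an arbitrary example, not from the paper.

```python
import numpy as np

# Generating correlated (multivariate normal) simulation inputs by
# transforming independent standard normals with a Cholesky factor.
rng = np.random.default_rng(42)
cov = np.array([[4.0, 2.4],
                [2.4, 9.0]])         # target covariance (correlation 0.4)

L = np.linalg.cholesky(cov)          # cov = L @ L.T
z = rng.standard_normal((100_000, 2))
x = z @ L.T                          # rows are now ~ N(0, cov)

print(np.cov(x, rowvar=False))       # close to cov for large samples
```

    For non-normal marginals, the same idea is typically applied to correlated normals that are then mapped through the target inverse CDFs, which is one of the modelling choices such a tutorial weighs.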

  20. Process modeling for Humanities: tracing and analyzing scientific processes

    OpenAIRE

    Hug, Charlotte; Salinesi, Camille; Deneckere, Rebecca; Lamasse, Stéphane

    2011-01-01

    This paper concerns epistemology and the understanding of research processes in Humanities, such as Archaeology. We believe that to properly understand research processes, it is essential to trace them. The collected traces depend on the process model established, which has to be as accurate as possible to exhaustively record the traces. In this paper, we briefly explain why the existing process models for Humanities are not sufficient to represent traces. We then pres...

  1. Development of Solar Drying Model for Selected Cambodian Fish Species

    Science.gov (United States)

    Hubackova, Anna; Kucerova, Iva; Chrun, Rithy; Chaloupkova, Petra; Banout, Jan

    2014-01-01

    Solar drying was investigated as one of the prospective techniques for fish processing in Cambodia. The solar drying was compared to conventional drying in an electric oven. Five typical Cambodian fish species were selected for this study. Mean solar drying temperature and drying air relative humidity were 55.6°C and 19.9%, respectively. The overall solar dryer efficiency was 12.37%, which is typical for natural convection solar dryers. The average evaporative capacity of the solar dryer was 0.049 kg·h⁻¹. Based on the coefficient of determination (R²), chi-square (χ²) test, and root-mean-square error (RMSE), the most suitable models describing natural convection solar drying kinetics were the Logarithmic model, the Diffusion approximate model, and the Two-term model for climbing perch and Nile tilapia, swamp eel and walking catfish, and Channa fish, respectively. In the case of electric oven drying, the Modified Page 1 model shows the best results for all investigated fish species except Channa fish, where the Two-term model is the best one. Sensory evaluation shows that the most preferred fish is climbing perch, followed by Nile tilapia and walking catfish. This study brings new knowledge about the drying kinetics of freshwater fish species in Cambodia and confirms solar drying as an acceptable technology for fish processing. PMID:25250381
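
    Thin-layer drying models like those compared above are usually fitted by nonlinear least squares; the Page family in particular can also be fitted by linearisation, sketched here on synthetic moisture-ratio data (the rate constant and exponent are invented, not the study's values).

```python
import numpy as np

# Page thin-layer drying model: MR = exp(-k * t**n).
# Linearise: ln(-ln MR) = ln k + n ln t, then ordinary least squares.
t = np.array([0.5, 1, 2, 3, 4, 6, 8], dtype=float)   # drying time, h
mr = np.exp(-0.25 * t ** 1.3)                        # synthetic "observations"

X = np.column_stack([np.ones_like(t), np.log(t)])
y = np.log(-np.log(mr))
(b0, n), *_ = np.linalg.lstsq(X, y, rcond=None)
k = np.exp(b0)

mr_hat = np.exp(-k * t ** n)
rmse = np.sqrt(np.mean((mr - mr_hat) ** 2))
print(k, n, rmse)    # recovers k = 0.25, n = 1.3 on noise-free data
```

    With real (noisy) data, candidate models are compared exactly as in the abstract: by R², the chi-square statistic, and RMSE of the fitted moisture ratio.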

  2. Development of Solar Drying Model for Selected Cambodian Fish Species

    Directory of Open Access Journals (Sweden)

    Anna Hubackova

    2014-01-01

    Full Text Available Solar drying was investigated as one of the prospective techniques for fish processing in Cambodia. Solar drying was compared to conventional drying in an electric oven. Five typical Cambodian fish species were selected for this study. Mean solar drying temperature and drying air relative humidity were 55.6°C and 19.9%, respectively. The overall solar dryer efficiency was 12.37%, which is typical for natural convection solar dryers. The average evaporative capacity of the solar dryer was 0.049 kg·h−1. Based on the coefficient of determination (R²), chi-square (χ²) test, and root-mean-square error (RMSE), the most suitable models describing natural convection solar drying kinetics were the Logarithmic model, the Diffusion approximate model, and the Two-term model for climbing perch and Nile tilapia, swamp eel and walking catfish, and Channa fish, respectively. In the case of electric oven drying, the Modified Page 1 model shows the best results for all investigated fish species except Channa fish, for which the Two-term model is the best. Sensory evaluation shows that the most preferred fish is climbing perch, followed by Nile tilapia and walking catfish. This study brings new knowledge about the drying kinetics of fresh water fish species in Cambodia and confirms solar drying as an acceptable technology for fish processing.

  3. Selection of models to calculate the LLW source term

    International Nuclear Information System (INIS)

    Sullivan, T.M.

    1991-10-01

    Performance assessment of a LLW disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). In turn, many of these physical processes are influenced by the design of the disposal facility (e.g., infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This document provides a brief overview of disposal practices and reviews existing source term models as background for selecting appropriate models for estimating the source term. The selection rationale and the mathematical details of the models are presented. Finally, guidance is presented for combining the inventory data with appropriate mechanisms describing release from the disposal facility. 44 refs., 6 figs., 1 tab

  4. Self-Repair and Language Selection in Bilingual Speech Processing

    Directory of Open Access Journals (Sweden)

    Inga Hennecke

    2013-07-01

    Full Text Available In psycholinguistic research the exact level of language selection in bilingual lexical access is still controversial, and current models of bilingual speech production offer conflicting statements about the mechanisms and location of language selection. This paper aims to provide a corpus analysis of self-repair mechanisms in code-switching contexts of highly fluent bilingual speakers in order to gain further insights into bilingual speech production. The present paper follows the assumptions of the Selection by Proficiency model, which claims that language proficiency and lexical robustness determine the mechanism and level of language selection. In accordance with this hypothesis, highly fluent bilinguals select languages at a prelexical level, which should influence the occurrence of self-repairs in bilingual speech. A corpus of natural speech data of highly fluent and balanced bilingual French-English speakers of the Canadian French variety Franco-Manitoban serves as the basis for a detailed analysis of different self-repair mechanisms in code-switching environments. Although the speech data contain a large amount of code-switching, results reveal that only a few speech errors and self-repairs occur in direct code-switching environments. A detailed analysis of the respective starting point of code-switching and the different repair mechanisms supports the hypothesis that highly proficient bilinguals do not select languages at the lexical level.

  5. Impact of selected troposphere models on Precise Point Positioning convergence

    Science.gov (United States)

    Kalita, Jakub; Rzepecka, Zofia

    2016-04-01

    The Precise Point Positioning (PPP) absolute method is currently being intensively investigated in order to reach fast convergence times. Among the various sources that influence the convergence of PPP, the tropospheric delay is one of the most important. Numerous models of tropospheric delay have been developed and applied to PPP processing. However, with rare exceptions, the quality of those models does not allow fixing the zenith path delay tropospheric parameter, leaving the difference between nominal and final value to the estimation process. Here we present a comparison of several PPP result sets, each based on a different troposphere model. The respective nominal values are adopted from the models VMF1, GPT2w, MOPS and ZERO-WET. The PPP solution admitted as reference is based on the final troposphere product from the International GNSS Service (IGS). The VMF1 mapping function was used for all processing variants in order to make it possible to compare the impact of the applied nominal values. The worst case initializes the zenith wet delay with a zero value (ZERO-WET). The impact of all possible models for tropospheric nominal values should fall between the IGS and ZERO-WET border variants. The analysis is based on data from seven IGS stations located in the mid-latitude European region from the year 2014. For the purpose of this study, several days with the most active troposphere were selected for each of the stations. All the PPP solutions were determined using the gLAB open-source software, with the Kalman filter implemented independently by the authors of this work. The processing was performed on 1-hour slices of observation data. In addition to the analysis of the output processing files, the presented study contains a detailed analysis of the tropospheric conditions for the selected data. The overall results show that for the height component the VMF1 model outperforms GPT2w and MOPS by 35-40% and the ZERO-WET variant by 150%. In most of the cases all solutions converge to the same values during first

  6. Dealing with selection bias in educational transition models

    DEFF Research Database (Denmark)

    Holm, Anders; Jæger, Mads Meier

    2011-01-01

    This paper proposes the bivariate probit selection model (BPSM) as an alternative to the traditional Mare model for analyzing educational transitions. The BPSM accounts for selection on unobserved variables by allowing unobserved variables which affect the probability of making educational transitions to be correlated across transitions. We use simulated and real data to illustrate how the BPSM improves on the traditional Mare model in terms of correcting for selection bias and providing credible estimates of the effect of family background on educational success. We conclude that models which account for selection on unobserved variables and high-quality data are both required in order to estimate credible educational transition models.
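
The selection problem the BPSM addresses can be illustrated with a small Monte Carlo sketch (parameters invented for illustration, not taken from the paper): if an unobserved ability term helps determine who makes the first transition, the students who survive it are no longer a random sample, which biases naive estimates at later transitions:

```python
import random

random.seed(42)

def simulate(n=200_000, beta_x=1.0):
    """Each student has observed background x (0/1) and unobserved ability u.
    Transition 1 is made when beta_x*x + u + noise > 0. Among those who
    continue, mean unobserved ability differs by x: survivors from the
    disadvantaged group (x=0) are positively selected on u."""
    passed_u = {0: [], 1: []}
    for _ in range(n):
        x = random.randint(0, 1)
        u = random.gauss(0, 1)
        if beta_x * x + u + random.gauss(0, 1) > 0:
            passed_u[x].append(u)
    return {x: sum(v) / len(v) for x, v in passed_u.items()}

m = simulate()
print(f"mean ability of survivors: x=0 -> {m[0]:.3f}, x=1 -> {m[1]:.3f}")
# Survivors with x=0 needed higher u to pass, so m[0] > m[1]; a naive model
# of the next transition that ignores u understates the effect of x.
```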

  7. Early environmental planning: A process for power line corridor selection

    International Nuclear Information System (INIS)

    Haagenstad, T.; Bare, C.M.

    1998-01-01

    Los Alamos National Laboratory (LANL) conducted an environmental planning study in the fall of 1997 to help determine the best alternative for upgrading the Laboratory's electrical power system. Alternatives considered included an on-site power generation facility and two corridors for a 10-mile-long 115-kV power line. This planning process was conducted prior to the formal National Environmental Policy Act (NEPA) review. The goals were to help select the best proposed action, to recommend modifications and mitigation measures for each alternative for a more environmentally sound project, and to avoid potential delays once the formal Department of Energy review process began. Significant constraints existed from a planning perspective, including operational issues such as existing outdoor high explosives testing areas, as well as environmental issues including threatened and endangered species habitats, multiple archeological sites, contaminated areas, and aesthetics. The study had to be completed within 45 days to meet project schedule needs. The process resulted in a number of important recommendations. While the construction and operation of the on-site power generation facility could have minimal environmental impacts, the need for a new air quality permit would create severe cost and schedule constraints for the project. From an environmental perspective, construction and operation of a power line within either corridor was concluded to be a viable alternative. However, impacts with either corridor would have to be reduced through specific recommended alignment modifications and mitigation measures

  8. Sex differences in sensorimotor mu rhythms during selective attentional processing.

    Science.gov (United States)

    Popovich, C; Dockstader, C; Cheyne, D; Tannock, R

    2010-12-01

    We used magnetoencephalography to investigate the effect of directed attention on sensorimotor mu (8-12 Hz) response (mu reactivity) to non-painful electrical stimulation of the median nerve in healthy adults. Mu desynchronization in the 10-12 Hz bandwidth is typically observed during higher-order cognitive functions including selective attentional processing of sensorimotor information (Pfurtscheller, Neuper, & Krauz, 2000). We found attention-related sex differences in mu reactivity, with females showing (i) prolonged mu desynchrony when attending to somatosensory stimuli, (ii) attentional modulation of the mu response based on whether attention was directed towards or away from somatosensory stimuli, which was absent in males, and (iii) a trend for greater neuronal excitability of the primary somatosensory region suggesting greater physiological responsiveness to sensory stimulation overall. Our findings suggest sex differences in attentional control strategies when processing somatosensory stimuli, whose salience may be greater for females. These sex differences in attention to somatosensory stimuli may help elucidate the well-documented sex biases in pain processing wherein females typically report greater sensitivity to experimental and clinical pain. Copyright © 2010 Elsevier Ltd. All rights reserved.

  9. Selecting a Control Strategy for Plug and Process Loads

    Energy Technology Data Exchange (ETDEWEB)

    Lobato, C.; Sheppy, M.; Brackney, L.; Pless, S.; Torcellini, P.

    2012-09-01

    Plug and Process Loads (PPLs) are building loads that are not related to general lighting, heating, ventilation, cooling, and water heating, and typically do not provide comfort to the building occupants. PPLs in commercial buildings account for almost 5% of U.S. primary energy consumption. On an individual building level, they account for approximately 25% of the total electrical load in a minimally code-compliant commercial building, and can exceed 50% in an ultra-high efficiency building such as the National Renewable Energy Laboratory's (NREL) Research Support Facility (RSF) (Lobato et al. 2010). Minimizing these loads is a primary challenge in the design and operation of an energy-efficient building. A complex array of technologies that measure and manage PPLs has emerged in the marketplace. Some fall short of manufacturer performance claims, however. NREL has been actively engaged in developing an evaluation and selection process for PPLs control, and is using this process to evaluate a range of technologies for active PPLs management that will cap RSF plug loads. Using a control strategy to match plug load use to users' required job functions is a huge untapped potential for energy savings.

  10. Applying a Hybrid MCDM Model for Six Sigma Project Selection

    Directory of Open Access Journals (Sweden)

    Fu-Kwun Wang

    2014-01-01

    Full Text Available Six Sigma is a project-driven methodology; the projects that provide the maximum financial benefits and other impacts to the organization must be prioritized. Project selection (PS) is a type of multiple criteria decision making (MCDM) problem. In this study, we present a hybrid MCDM model combining the decision-making trial and evaluation laboratory (DEMATEL) technique, the analytic network process (ANP), and the VIKOR method to evaluate and improve Six Sigma projects, reducing performance gaps in each criterion and dimension. We consider the film printing industry of Taiwan as an empirical case. The results show that our model can not only be used to select the best project, but also to analyze the gaps between existing performance values and aspiration levels, improving the gaps in each dimension and criterion based on the influential network relation map.
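
As a minimal sketch of the compromise-ranking step, the following implements the standard VIKOR procedure on a hypothetical decision matrix (the project scores, weights and criteria here are invented for illustration; the DEMATEL/ANP stages of the paper, which would derive the weights, are omitted):

```python
def vikor(matrix, weights, benefit, v=0.5):
    """VIKOR compromise ranking. matrix[i][j] is the score of alternative i
    on criterion j; benefit[j] is True if larger is better. Lower Q is
    a better compromise; v weighs group utility against individual regret."""
    m, n = len(matrix), len(matrix[0])
    f_best = [max(r[j] for r in matrix) if benefit[j] else min(r[j] for r in matrix)
              for j in range(n)]
    f_worst = [min(r[j] for r in matrix) if benefit[j] else max(r[j] for r in matrix)
               for j in range(n)]
    S, R = [], []
    for row in matrix:
        # normalized, weighted distance to the ideal on each criterion
        terms = []
        for j in range(n):
            span = f_best[j] - f_worst[j]
            terms.append(weights[j] * (f_best[j] - row[j]) / span if span else 0.0)
        S.append(sum(terms))   # group utility
        R.append(max(terms))   # individual regret
    s_min, s_max, r_min, r_max = min(S), max(S), min(R), max(R)
    Q = [v * (S[i] - s_min) / (s_max - s_min)
         + (1 - v) * (R[i] - r_min) / (r_max - r_min) for i in range(m)]
    return sorted(range(m), key=lambda i: Q[i]), Q

# Hypothetical scores for three candidate projects on three criteria
# (cost saving, defect reduction, implementation effort: lower effort is better).
matrix = [[70, 0.8, 5], [90, 0.6, 8], [60, 0.9, 3]]
weights = [0.5, 0.3, 0.2]
benefit = [True, True, False]
ranking, Q = vikor(matrix, weights, benefit)
print("ranking (best first):", ranking)
```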

  11. A Bayesian random effects discrete-choice model for resource selection: Population-level selection inference

    Science.gov (United States)

    Thomas, D.L.; Johnson, D.; Griffith, B.

    2006-01-01

    Modeling the probability of use of land units characterized by discrete and continuous measures, we present a Bayesian random-effects model to assess resource selection. This model provides simultaneous estimation of both individual- and population-level selection. Deviance information criterion (DIC), a Bayesian alternative to AIC that is sample-size specific, is used for model selection. Aerial radiolocation data from 76 adult female caribou (Rangifer tarandus) and calf pairs during 1 year on an Arctic coastal plain calving ground were used to illustrate models and assess population-level selection of landscape attributes, as well as individual heterogeneity of selection. Landscape attributes included elevation, NDVI (a measure of forage greenness), and land cover-type classification. Results from the first of a 2-stage model-selection procedure indicated that there is substantial heterogeneity among cow-calf pairs with respect to selection of the landscape attributes. In the second stage, selection of models with heterogeneity included indicated that at the population level, NDVI and land cover class were significant attributes for selection of different landscapes by pairs on the calving ground. Population-level selection coefficients indicate that the pairs generally select landscapes with higher levels of NDVI, but the relationship is quadratic. The highest rate of selection occurs at values of NDVI less than the maximum observed. Results for land cover-class selection coefficients indicate that wet sedge, moist sedge, herbaceous tussock tundra, and shrub tussock tundra are selected at approximately the same rate, while alpine and sparsely vegetated landscapes are selected at a lower rate. Furthermore, the variability in selection by individual caribou for moist sedge and sparsely vegetated landscapes is large relative to the variability in selection of other land cover types.
The example analysis illustrates that, while sometimes computationally intense, a
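
The DIC used for model selection above can be computed directly from posterior draws: DIC = D̄ + pD, where D̄ is the mean posterior deviance and pD = D̄ − D(θ̄) is the effective number of parameters. Below is a toy sketch for a normal-mean model; the data and the stand-in "posterior" draws are illustrative only, not MCMC output related to the paper:

```python
import math

def deviance(theta, data, sigma=1.0):
    """-2 * log-likelihood of a Normal(theta, sigma) model for the data."""
    n = len(data)
    ll = (-0.5 * n * math.log(2 * math.pi * sigma ** 2)
          - sum((y - theta) ** 2 for y in data) / (2 * sigma ** 2))
    return -2.0 * ll

def dic(posterior_draws, data):
    """DIC = mean deviance + effective number of parameters pD,
    where pD = mean deviance - deviance at the posterior mean."""
    dbar = sum(deviance(th, data) for th in posterior_draws) / len(posterior_draws)
    theta_bar = sum(posterior_draws) / len(posterior_draws)
    pd = dbar - deviance(theta_bar, data)
    return dbar + pd, pd

# Toy observations and deterministic stand-in draws for a posterior sample.
data = [1.2, 0.8, 1.1, 0.9, 1.3, 0.7]
draws = [1.0 + 0.1 * math.sin(i) for i in range(200)]
dic_value, pd = dic(draws, data)
print(f"DIC = {dic_value:.3f}, pD = {pd:.3f}")
```

Among competing models fitted to the same data, the one with the smallest DIC is preferred.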

  12. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    Full Text Available The article outlines the application of the process approach to the functional description of a designed IT system supporting the operations of a secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented manually (“as is”) and the target processes (“to be”), using RFID technology for the purpose of their automation. Additionally, examples of applying the method of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency are presented. An extension of the process analysis method is the possibility of applying a warehouse of processes and process mining methods.

  13. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  14. Selection of Vendor Based on Intuitionistic Fuzzy Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Prabjot Kaur

    2014-01-01

    Full Text Available The business environment is characterized by intense domestic and international competition in the global market. Vendors play a key role in achieving so-called corporate competitiveness. It is not easy, however, to identify good vendors because evaluation is based on multiple criteria. In practice, most of the input information about the criteria in the vendor selection problem (VSP) is not known precisely. The intuitionistic fuzzy set is an extension of classical fuzzy set theory (FST), which is a suitable way to deal with imprecision. In other words, the application of intuitionistic fuzzy sets instead of fuzzy sets means the introduction of another degree of freedom, called the nonmembership function, into the set description. In this paper, we propose a triangular intuitionistic fuzzy number based approach for the vendor selection problem using the analytical hierarchy process. The crisp data of the vendors are represented in the form of triangular intuitionistic fuzzy numbers. By applying AHP, which involves decomposition, pairwise comparison, and deriving priorities for the various levels of the hierarchy, an overall crisp priority is obtained for ranking the best vendor. A numerical example illustrates our method. Lastly, a sensitivity analysis is performed to find the most critical criterion on the basis of which the vendor is selected.

  15. Supercritical boiler material selection using fuzzy analytic network process

    Directory of Open Access Journals (Sweden)

    Saikat Ranjan Maity

    2012-08-01

    Full Text Available The recent development of the world is being adversely affected by the scarcity of power and energy. To survive in the next generation, it is thus necessary to explore non-conventional energy sources and to efficiently consume the available sources. For efficient exploitation of the existing energy sources, great scope lies in the use of Rankine cycle-based thermal power plants. Today, the gross efficiency of Rankine cycle-based thermal power plants is less than 28%, which has been increased up to 40% with reheating and regenerative cycles. However, it can be further improved up to 47% by using supercritical power plant technology. Supercritical power plants use supercritical boilers which are able to withstand very high temperature (650-720°C) and pressure (22.1 MPa) while producing superheated steam. The thermal efficiency of a supercritical boiler greatly depends on the material of its different components. The supercritical boiler material should possess high creep rupture strength, high thermal conductivity, low thermal expansion, high specific heat and the ability to withstand very high temperatures. This paper considers a list of seven supercritical boiler materials whose performance is evaluated based on seven pivotal criteria. Given the intricacy and difficulty of this supercritical boiler material selection problem, having interactions and interdependencies between different criteria, this paper applies the fuzzy analytic network process to select the most appropriate material for a supercritical boiler. Rene 41 is the best supercritical boiler material, whereas Haynes 230 is the least preferred choice.

  16. Board Directors' Selection Process Following a Gender Quota

    DEFF Research Database (Denmark)

    Sigurjonsson, Olaf; Arnardottir, Audur Arna

    … companies with 50 or more employees, thereby going legislatively further than any other country out of the fifteen that have amended and adopted gender quota legislation. This article utilizes resource dependency and status expectations theory lenses to explore how the new legislation affected the post-quota selection of new board directors, as well as the attitudes of board members towards the quota and perceptions of the effect of the quota on board processes. We incorporate a dual qualitative and quantitative methodology, with in-depth interviews with 20 board directors and chairs and a survey of 260 directors who … Furthermore, there are different avenues to the board. Although initial attitudes towards quotas are more negative among men than women, these attitudes decrease over time. Finally, consistent with status expectations theory, male directors are more negative than their female counterparts about …

  17. Syntax highlighting in business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Freytag, T.; Mendling, J.; Eckleder, A.

    2011-01-01

    Sense-making of process models is an important task in various phases of business process management initiatives. Despite this, there is currently hardly any support in business process modeling tools to adequately support model comprehension. In this paper we adapt the concept of syntax

  18. Configurable multi-perspective business process models

    NARCIS (Netherlands)

    La Rosa, M.; Dumas, M.; Hofstede, ter A.H.M.; Mendling, J.

    2011-01-01

    A configurable process model provides a consolidated view of a family of business processes. It promotes the reuse of proven practices by providing analysts with a generic modeling artifact from which to derive individual process models. Unfortunately, the scope of existing notations for

  19. Manufacturing Process Selection of Composite Bicycle’s Crank Arm using Analytical Hierarchy Process (AHP)

    Science.gov (United States)

    Luqman, M.; Rosli, M. U.; Khor, C. Y.; Zambree, Shayfull; Jahidi, H.

    2018-03-01

    The crank arm is one of the important parts of a bicycle, and it is an expensive product due to the high cost of material and of the production process. This research aims to investigate the potential types of manufacturing process for fabricating a composite bicycle crank arm, and to describe an approach based on the analytical hierarchy process (AHP) that assists decision makers or manufacturing engineers in determining the most suitable process to be employed in manufacturing a composite bicycle crank arm at the early stage of the product development process, in order to reduce the production cost. Four types of processes were considered, namely resin transfer molding (RTM), compression molding (CM), vacuum bag molding and filament winding (FW). The analysis ranks these four processes for their suitability in the manufacturing of the bicycle crank arm based on five main selection factors and 10 sub-factors. Determining the right manufacturing process was performed based on the AHP process steps. A consistency test was performed to make sure the judgements were consistent during the comparisons. The results indicated that compression molding was the most appropriate manufacturing process because it has the highest priority value (33.6%) among the manufacturing processes considered.
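
The AHP steps named above (pairwise comparison, priority derivation, consistency test) can be sketched as follows. The pairwise judgements below are hypothetical, and the row geometric-mean method is used as a common approximation to the principal-eigenvector priorities:

```python
import math

# Saaty's random consistency index by matrix size.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_priorities(A):
    """Approximate AHP priority vector via the row geometric-mean method,
    plus the consistency ratio CR = CI / RI."""
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]
    total = sum(gm)
    w = [g / total for g in gm]
    # Estimate lambda_max by comparing A*w with w, then CI = (lambda - n)/(n - 1).
    aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return w, ci / RI[n]

# Hypothetical pairwise judgements for four processes (RTM, CM, vacuum bag, FW)
# on a single criterion, using Saaty's 1-9 scale (reciprocal matrix).
A = [[1,     3,     5, 2],
     [1 / 3, 1,     3, 1 / 2],
     [1 / 5, 1 / 3, 1, 1 / 4],
     [1 / 2, 2,     4, 1]]
w, cr = ahp_priorities(A)
print("weights:", [round(x, 3) for x in w], "CR:", round(cr, 3))
```

A consistency ratio below 0.1 is conventionally taken to mean the judgements are acceptably consistent; otherwise the pairwise matrix should be revised.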

  20. Selection of Representative Models for Decision Analysis Under Uncertainty

    Science.gov (United States)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that are supposed to be analyzed so an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.
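
A drastically simplified version of the idea, assuming representativeness is judged only by how well a subset preserves the ensemble's P10/P50/P90 of a single output variable (the paper's representativeness function also matches cross-plots and risk curves), can be sketched as a greedy selection:

```python
def quantiles(values, probs=(0.1, 0.5, 0.9)):
    """Crude empirical quantiles by index into the sorted list."""
    s = sorted(values)
    return [s[min(int(p * len(s)), len(s) - 1)] for p in probs]

def pick_representative(scenarios, k):
    """Greedily select k scenarios whose P10/P50/P90 stay close to the full
    ensemble's quantiles. A simplified stand-in for the paper's
    representativeness function."""
    target = quantiles(scenarios)
    chosen, remaining = [], list(scenarios)
    while len(chosen) < k:
        best = min(
            remaining,
            key=lambda s: sum((a - b) ** 2
                              for a, b in zip(quantiles(chosen + [s]), target)))
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Illustrative "NPV" outcomes for 20 simulated scenarios (arbitrary units).
scenarios = [5, 7, 8, 9, 10, 11, 12, 12, 13, 14,
             15, 15, 16, 17, 18, 19, 21, 23, 26, 30]
subset = pick_representative(scenarios, 5)
print("representative subset:", sorted(subset))
```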

  1. Analytical network process based optimum cluster head selection in wireless sensor network.

    Science.gov (United States)

    Farman, Haleem; Javed, Huma; Jan, Bilal; Ahmad, Jamil; Ali, Shaukat; Khalil, Falak Naz; Khan, Murad

    2017-01-01

    Wireless Sensor Networks (WSNs) are becoming ubiquitous in everyday life due to their applications in weather forecasting, surveillance, implantable sensors for health monitoring and a plethora of other applications. A WSN is equipped with hundreds to thousands of small sensor nodes. As the size of a sensor node decreases, critical issues such as limited energy, computation time and limited memory become even more highlighted. In such a case, network lifetime mainly depends on efficient use of the available resources. Organizing nearby nodes into clusters makes it convenient to efficiently manage each cluster as well as the overall network. In this paper, we extend our previous work on a grid-based hybrid network deployment approach, in which a merge-and-split technique was proposed to construct the network topology. Constructing the topology through our proposed technique, in this paper we use the analytical network process (ANP) model for cluster head selection in WSN. Five distinct parameters are considered for CH selection: distance from nodes (DistNode), residual energy level (REL), distance from centroid (DistCent), number of times the node has been selected as cluster head (TCH) and merged node (MN). The problem of CH selection based on these parameters is tackled as a multi-criteria decision system, for which the ANP method is used for optimum cluster head selection. The main contribution of this work is to check the applicability of the ANP model for cluster head selection in WSN. In addition, a sensitivity analysis is carried out to check the stability of the alternatives (available candidate nodes) and their ranking for different scenarios. The simulation results show that the proposed method outperforms existing energy-efficient clustering protocols in terms of optimum CH selection and in minimizing the CH reselection process, which results in extending overall network lifetime. This paper analyzes that ANP method used for CH selection with better understanding of the dependencies of
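
A full ANP solution requires building and limiting a supermatrix; as a simplified stand-in, the sketch below scores candidate nodes by a weighted sum over the five criteria named in the abstract. The node data and criterion weights are hypothetical:

```python
def normalize(values, benefit=True):
    """Min-max normalize to [0, 1]; invert for cost criteria."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    return [(v - lo) / (hi - lo) if benefit else (hi - v) / (hi - lo)
            for v in values]

def select_cluster_head(nodes, weights):
    """Weighted-sum scoring over the five criteria from the paper
    (a simplified stand-in for the full ANP supermatrix computation).
    Benefit criteria: REL, MN. Cost criteria: DistNode, DistCent, TCH."""
    crits = {
        "REL":      normalize([n["REL"] for n in nodes], benefit=True),
        "MN":       normalize([n["MN"] for n in nodes], benefit=True),
        "DistNode": normalize([n["DistNode"] for n in nodes], benefit=False),
        "DistCent": normalize([n["DistCent"] for n in nodes], benefit=False),
        "TCH":      normalize([n["TCH"] for n in nodes], benefit=False),
    }
    scores = [sum(weights[c] * crits[c][i] for c in weights)
              for i in range(len(nodes))]
    return max(range(len(nodes)), key=lambda i: scores[i]), scores

# Hypothetical candidate nodes and criterion weights (weights sum to 1).
nodes = [
    {"REL": 0.9, "MN": 1, "DistNode": 12.0, "DistCent": 4.0, "TCH": 2},
    {"REL": 0.6, "MN": 0, "DistNode": 8.0,  "DistCent": 2.0, "TCH": 5},
    {"REL": 0.8, "MN": 1, "DistNode": 10.0, "DistCent": 3.0, "TCH": 1},
]
weights = {"REL": 0.35, "DistNode": 0.2, "DistCent": 0.2, "TCH": 0.15, "MN": 0.1}
best, scores = select_cluster_head(nodes, weights)
print("selected cluster head:", best)
```

Unlike ANP, a weighted sum assumes the criteria are independent; ANP's supermatrix is precisely what captures the dependencies among them.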

  2. Characteristics of products generated by selective sintering and stereolithography rapid prototyping processes

    Science.gov (United States)

    Cariapa, Vikram

    1993-01-01

    The trend in the modern global economy towards free market policies has motivated companies to use rapid prototyping technologies not only to reduce product development cycle time but also to maintain their competitive edge. A rapid prototyping technology is one which combines computer aided design with computer-controlled tracking of a focused high-energy source (e.g., lasers, heat) on modern ceramic powders, metallic powders, plastics or photosensitive liquid resins in order to produce prototypes or models. At present, except for the process of shape melting, most rapid prototyping processes generate products that are only dimensionally similar to those of the desired end product. There is an urgent need, therefore, to enhance the understanding of the characteristics of these processes in order to realize their potential for production. Currently, the commercial market is dominated by four rapid prototyping processes, namely selective laser sintering, stereolithography, fused deposition modelling and laminated object manufacturing. This phase of the research has focused on the selective laser sintering and stereolithography rapid prototyping processes. A theoretical model for these processes is under development. Different rapid prototyping sites supplied test specimens (based on ASTM 638-84, Type I) that have been measured and tested to provide a data base on surface finish, dimensional variation and ultimate tensile strength. Further plans call for developing and verifying the theoretical models by carefully designed experiments. This will be a joint effort between NASA and other prototyping centers to generate a larger database, thus encouraging more widespread usage by product designers.

  3. User's manual for a process model code

    International Nuclear Information System (INIS)

    Kern, E.A.; Martinez, D.P.

    1981-03-01

    The MODEL code has been developed for computer modeling of materials processing facilities associated with the nuclear fuel cycle. However, it can also be used in other modeling applications. This report provides sufficient information for a potential user to apply the code to specific process modeling problems. Several examples that demonstrate most of the capabilities of the code are provided

  4. Repairing process models to reflect reality

    NARCIS (Netherlands)

    Fahland, D.; Aalst, van der W.M.P.; Barros, A.; Gal, A.; Kindler, E.

    2012-01-01

    Process mining techniques relate observed behavior (i.e., event logs) to modeled behavior (e.g., a BPMN model or a Petri net). Process models can be discovered from event logs, and conformance checking techniques can be used to detect and diagnose differences between observed and modeled behavior.

  5. Uncertainty associated with selected environmental transport models

    International Nuclear Information System (INIS)

    Little, C.A.; Miller, C.W.

    1979-11-01

    A description is given of the capabilities of several models to predict accurately either pollutant concentrations in environmental media or radiological dose to human organs. The models are discussed in three sections: aquatic or surface water transport models, atmospheric transport models, and terrestrial and aquatic food chain models. Using data published primarily by model users, model predictions are compared to observations. This procedure is infeasible for food chain models and, therefore, the uncertainty embodied in the models' input parameters, rather than the model output, is estimated. Aquatic transport models are divided into one-dimensional, longitudinal-vertical, and longitudinal-horizontal models. Several conclusions were made about the ability of the Gaussian plume atmospheric dispersion model to predict accurately downwind air concentrations from releases under several sets of conditions. It is concluded that no validation study has been conducted to test the predictions of either aquatic or terrestrial food chain models. Using the aquatic pathway from water to fish to an adult for 137Cs as an example, a 95% one-tailed confidence limit interval for the predicted exposure is calculated by examining the distributions of the input parameters. Such an interval is found to be 16 times the value of the median exposure. A similar one-tailed limit for the air-grass-cow-milk-thyroid pathway for 131I and infants was 5.6 times the median dose. Of the three model types discussed in this report, the aquatic transport models appear to do the best job of predicting observed concentrations. However, this conclusion is based on many fewer aquatic validation data than were available for atmospheric model validation
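    The report's confidence-limit calculation can be illustrated with a small Monte Carlo sketch: propagate assumed lognormal input-parameter distributions through a multiplicative pathway model and compare the 95th percentile of the result to its median. The distributions and the three-factor pathway below are hypothetical stand-ins for illustration, not the report's actual 137Cs data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical multiplicative dose pathway: exposure = intake * transfer * dose_factor.
# Lognormal distributions are a common assumption for environmental transfer parameters.
intake = rng.lognormal(mean=0.0, sigma=0.8, size=n)
transfer = rng.lognormal(mean=0.0, sigma=1.0, size=n)
dose_factor = rng.lognormal(mean=0.0, sigma=0.6, size=n)

exposure = intake * transfer * dose_factor

median = np.median(exposure)
p95 = np.percentile(exposure, 95)   # one-tailed 95% confidence limit
ratio = p95 / median                # "X times the median", as quoted in the report
print(f"95th percentile / median = {ratio:.1f}")
```

    Because a product of lognormals is itself lognormal, the ratio is exp(1.645 σ_tot), where σ_tot is the root-sum-square of the individual sigmas (about 10 for the values above); heavier-tailed inputs widen the interval accordingly.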

  6. Genetic Process Mining: Alignment-based Process Model Mutation

    NARCIS (Netherlands)

    Eck, van M.L.; Buijs, J.C.A.M.; Dongen, van B.F.; Fournier, F.; Mendling, J.

    2015-01-01

    The Evolutionary Tree Miner (ETM) is a genetic process discovery algorithm that enables the user to guide the discovery process based on preferences with respect to four process model quality dimensions: replay fitness, precision, generalization and simplicity. Traditionally, the ETM algorithm uses

  7. Selective modulation of nociceptive processing due to noise distraction.

    Science.gov (United States)

    Boyle, Yvonne; El-Deredy, Wael; Martínez Montes, Eduardo; Bentley, Deborah E; Jones, Anthony K P

    2008-09-15

    This study investigates the effects of noise distraction on the different components and sources of laser-evoked potentials (LEPs) whilst attending to either the spatial component (localisation performance task) or the affective component (unpleasantness rating task) of pain. LEPs elicited by CO2 laser stimulation of the right forearm were recorded from 64 electrodes in 18 consenting healthy volunteers. Subjects reported either pain location or unpleasantness, in the presence and absence of distraction by continuous 85 dB(A) white noise. Distributed sources of the LEP peaks were identified using Low Resolution Electromagnetic Tomography (LORETA). Pain unpleasantness ratings and P2 (430 ms) peak amplitude were significantly reduced by distraction during the unpleasantness task, whereas localisation ability and the corresponding N1/N2 (310 ms) peak amplitude remained unchanged. Noise distraction (at 310 ms) reduced activation in the anterior cingulate cortex (ACC) and precuneus during attention to localisation and unpleasantness, respectively. This suggests a complementary role for these two areas in the control of attention to pain. In contrast, activation of the occipital pole and SII was enhanced by noise during the localisation and unpleasantness tasks, respectively, suggesting that the presence of noise was associated with increased spatial attentional load. This study has shown selective modulation of affective pain processing by noise distraction, indicated by a reduction in the unpleasantness ratings and P2 peak amplitude and associated activity within the medial pain system. These results show that processing of the affective component of pain can be differentially modulated by top-down processes, providing a potential mechanism for therapeutic intervention.

  8. Progressive sample processing of band selection for hyperspectral imagery

    Science.gov (United States)

    Liu, Keng-Hao; Chien, Hung-Chang; Chen, Shih-Yu

    2017-10-01

    Band selection (BS) is one of the most important topics in hyperspectral image (HSI) processing. The objective of BS is to find a set of representative bands that can represent the whole image with low inter-band redundancy. Many types of BS algorithms have been proposed in the past. However, most of them can only be carried out in an off-line manner, meaning they can only be applied to pre-collected data. Such off-line methods are sometimes unsuitable for time-critical applications, particularly disaster prevention and target detection. To tackle this issue, a new concept, called progressive sample processing (PSP), was proposed recently. PSP is an "on-line" framework in which a given type of algorithm can process the currently collected data during transmission under the band-interleaved-by-sample/pixel (BIS/BIP) protocol. This paper proposes an online BS method that integrates a sparse-based BS into the PSP framework, called PSP-BS. In PSP-BS, BS is carried out by updating the BS result recursively pixel by pixel, in the same way that a Kalman filter updates data information recursively. The sparse regression is solved by the orthogonal matching pursuit (OMP) algorithm, and the recursive equations of PSP-BS are derived using matrix decomposition. Experiments conducted on a real hyperspectral image show that PSP-BS can progressively output the BS status with very low computing time. Convergence of the BS results during transmission can be achieved quickly by using a rearranged pixel transmission sequence. This significant advantage allows BS to be implemented in real time as the HSI data is transmitted pixel by pixel.
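    The sparse-regression step that such a method solves with orthogonal matching pursuit can be sketched as a plain batch OMP loop. This is a generic textbook OMP on synthetic data; the paper's contribution is the recursive, per-pixel PSP update, which is not reproduced here.

```python
import numpy as np

def omp(A, y, k):
    """Greedy orthogonal matching pursuit: pick k columns of A that best explain y."""
    residual = y.copy()
    support = []
    for _ in range(k):
        # Column most correlated with the current residual.
        scores = np.abs(A.T @ residual)
        scores[support] = -np.inf          # never re-select a chosen column
        support.append(int(np.argmax(scores)))
        # Re-fit on the selected columns and update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    return sorted(support)

rng = np.random.default_rng(0)
A = rng.normal(size=(60, 20))
A /= np.linalg.norm(A, axis=0)           # unit-norm columns ("bands")
y = 2.0 * A[:, 3] + 1.5 * A[:, 11]       # signal built from two known bands
print(omp(A, y, k=2))                    # recovers the two true bands, [3, 11]
```

    In the sparse-BS setting, the columns of A play the role of candidate spectral bands and the recovered support is the selected band subset.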

  9. Multiphysics modeling of selective laser sintering/melting

    Science.gov (United States)

    Ganeriwala, Rishi Kumar

    A significant percentage of total global employment is due to the manufacturing industry. However, manufacturing also accounts for nearly 20% of total energy usage in the United States according to the EIA. In fact, manufacturing accounted for 90% of industrial energy consumption and 84% of industry carbon dioxide emissions in 2002. Clearly, advances in manufacturing technology and efficiency are necessary to curb emissions and help society as a whole. Additive manufacturing (AM) refers to a relatively recent group of manufacturing technologies whereby one can 3D print parts, which has the potential to significantly reduce waste, reconfigure the supply chain, and generally disrupt the whole manufacturing industry. Selective laser sintering/melting (SLS/SLM) is one type of AM technology with the distinct advantage of being able to 3D print metals and rapidly produce net shape parts with complicated geometries. In SLS/SLM parts are built up layer-by-layer out of powder particles, which are selectively sintered/melted via a laser. However, in order to produce defect-free parts of sufficient strength, the process parameters (laser power, scan speed, layer thickness, powder size, etc.) must be carefully optimized. Obviously, these process parameters will vary depending on material, part geometry, and desired final part characteristics. Running experiments to optimize these parameters is costly, energy intensive, and extremely material specific. Thus a computational model of this process would be highly valuable. In this work a three dimensional, reduced order, coupled discrete element - finite difference model is presented for simulating the deposition and subsequent laser heating of a layer of powder particles sitting on top of a substrate. Validation is provided and parameter studies are conducted showing the ability of this model to help determine appropriate process parameters and an optimal powder size distribution for a given material. 
Next, thermal stresses upon

  10. Composing Models of Geographic Physical Processes

    Science.gov (United States)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central to geographic information science; yet geographic information systems (GIS) lack capabilities to represent process-related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. Basic models of geographic physical processes can then be composed from these process components, as shown by means of an example.

  11. Selective extraction of cesium: from compound to process

    International Nuclear Information System (INIS)

    Simon, N.; Eymard, S.; Tournois, B.; Dozol, J.F.

    2000-01-01

    Under the French law of 30 December 1991 on nuclear waste management, research is conducted to recover long-lived fission products from high-level radioactive effluents generated by spent fuel reprocessing, in order to destroy them by transmutation or encapsulate them in specific matrices. Cesium extraction with mono and bis-crown calix(4)arenes (Frame 1) is a candidate for process development. These extractants remove cesium from highly acidic or basic pH media even with high salinity. A real raffinate was treated in 1994 in a hot cell to extract cesium with a calix-crown extractant. The success of this one batch experiment confirmed the feasibility of cesium decontamination from high-level liquid waste. It was then decided to develop a process flowchart to extract cesium selectively from high-level raffinate, to be included in the general scheme of long-lived radionuclide partitioning. It was accordingly decided to develop a process based on liquid-liquid extraction and hence optimize a calixarene/diluent solvent according to: - hydraulic properties: density, viscosity, interfacial tension, - chemical criteria: sufficient cesium extraction (depending on the diluent), kinetics, third phase elimination... New mono-crown-calixarenes branched with long aliphatic groups (Frame 2) were designed to be soluble in aliphatic diluents. To prevent third phase formation associated with nitric acid extraction, the addition of modifiers (alcohol, phosphate and amide) in the organic phase was tested (Frame 3). Table 1 shows examples of calixarene/diluent systems suitable for a process flowchart, and Figure 2 provides data on cesium extraction with these new systems. Alongside these improvements, a system based on a modified 1,3-di(n-octyl-oxy)2,4-calix[4]arene crown and a modified diluent was also developed, considering a mixed TPH/NPHE system as the diluent, where TPH (hydrogenated tetra propylene) is a common aliphatic industrial solvent and NPHE is nitrophenyl

  12. MORTALITY MODELING WITH LEVY PROCESSES

    Directory of Open Access Journals (Sweden)

    M. Serhat Yucel, FRM

    2012-07-01

    Full Text Available Mortality and longevity risk is usually one of the main risk components in economic capital models of insurance companies. Above all, future mortality expectations are an important input in the modeling and pricing of long term products. Deviations from the expectation can lead an insurance company even to default if sufficient reserves and capital are not held. Thus, modeling mortality time series accurately is a vital concern for the insurance industry. The aim of this study is to perform distributional and spectral testing on the mortality data and to practice discrete and continuous time modeling. We believe the results and the techniques used in this study will provide a basis for a Value at Risk formula in the case of mortality.

  13. Computationally efficient thermal-mechanical modelling of selective laser melting

    Science.gov (United States)

    Yang, Yabin; Ayas, Can

    2017-10-01

    Selective laser melting (SLM) is a powder-based additive manufacturing (AM) method to produce high density metal parts with complex topology. However, part distortions and accompanying residual stresses deteriorate the mechanical reliability of SLM products. Modelling of the SLM process is anticipated to be instrumental for understanding and predicting the development of the residual stress field during the build process. However, SLM process modelling requires determination of the heat transients within the part being built, which is coupled to a mechanical boundary value problem to calculate displacement and residual stress fields. Thermal models associated with SLM are typically complex and computationally demanding. In this paper, we present a simple semi-analytical thermal-mechanical model, developed for SLM, that represents the effect of laser scanning vectors with line heat sources. The temperature field within the part being built is attained by superposition of the temperature field associated with line heat sources in a semi-infinite medium and a complementary temperature field which accounts for the actual boundary conditions. An analytical solution of a line heat source in a semi-infinite medium is first described, followed by the numerical procedure used for finding the complementary temperature field. This analytical description of the line heat sources is able to capture the steep temperature gradients in the vicinity of the laser spot, which is typically tens of micrometers in size. In turn, the semi-analytical thermal model allows for a relatively coarse discretisation of the complementary temperature field. The temperature history determined is used to calculate the thermal strain induced in the SLM part. Finally, a mechanical model governed by an elastic-plastic constitutive rule with isotropic hardening is used to predict the residual stresses.
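    The building block of such a superposition is the classical instantaneous line-source solution, ΔT = Q/(4πkt) · exp(-r²/4αt), mirrored across the free surface with an image source to enforce an adiabatic boundary. The sketch below uses made-up material constants and a single stationary source; the paper's model additionally computes a complementary field for the true boundary conditions, which is not reproduced here.

```python
import numpy as np

# Hypothetical material constants (roughly steel-like, for illustration only).
k = 30.0          # thermal conductivity, W/(m K)
alpha = 8e-6      # thermal diffusivity, m^2/s
Q = 50.0          # energy deposited per unit length of the scan vector, J/m

def dT_line(x, z, t, x0=0.0, z0=20e-6):
    """Instantaneous line source at (x0, z0) in an infinite medium (Carslaw & Jaeger).

    dT = Q / (4 pi k t) * exp(-r^2 / (4 alpha t)), in kelvin.
    """
    r2 = (x - x0) ** 2 + (z - z0) ** 2
    return Q / (4.0 * np.pi * k * t) * np.exp(-r2 / (4.0 * alpha * t))

def dT_semi_infinite(x, z, t, x0=0.0, z0=20e-6):
    """Adiabatic free surface at z = 0 handled by an image source at depth -z0."""
    return dT_line(x, z, t, x0, z0) + dT_line(x, z, t, x0, -z0)

t = 1e-4  # s, shortly after the laser pulse
# Steep near-source gradient: temperature falls off rapidly with lateral distance.
near = dT_semi_infinite(0.0, 20e-6, t)
far = dT_semi_infinite(200e-6, 20e-6, t)
print(near, far)  # near >> far
```

    The exponential factor is what captures the steep near-spot gradients: at t = 0.1 ms the thermal diffusion length sqrt(4αt) is about 57 µm, so the field decays sharply over tens of micrometers, matching the scale quoted in the abstract.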

  14. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  15. Kopernik : modeling business processes for digital customers

    OpenAIRE

    Estañol Lamarca, Montserrat; Castro, Manuel; Díaz-Montenegro, Sylvia; Teniente López, Ernest

    2016-01-01

    This paper presents the Kopernik methodology for modeling business processes for digital customers. These processes require a high degree of flexibility in the execution of their tasks or actions. We achieve this by using the artifact-centric approach to process modeling and the use of condition-action rules. The processes modeled following Kopernik can then be implemented in an existing commercial tool, Balandra.

  16. The triconnected abstraction of process models

    OpenAIRE

    Polyvyanyy, Artem; Smirnov, Sergey; Weske, Mathias

    2009-01-01

    Contents: Artem Polyvyanyy, Sergey Smirnov, and Mathias Weske, The Triconnected Abstraction of Process Models: 1 Introduction; 2 Business Process Model Abstraction; 3 Preliminaries; 4 Triconnected Decomposition (4.1 Basic Approach for Process Component Discovery; 4.2 SPQR-Tree Decomposition; 4.3 SPQR-Tree Fragments in the Context of Process Models); 5 Triconnected Abstraction (5.1 Abstraction Rules; 5.2 Abstraction Algorithm); 6 Related Work and Conclusions

  17. Process and analytical studies of enhanced low severity co-processing using selective coal pretreatment

    Energy Technology Data Exchange (ETDEWEB)

    Baldwin, R.M.; Miller, R.L.

    1991-12-01

    The findings in the first phase were as follows: 1. Both reductive (non-selective) alkylation and selective oxygen alkylation brought about an increase in liquefaction reactivity for both coals. 2. Selective oxygen alkylation is more effective in enhancing the reactivity of low rank coals. In the second phase of studies, the major findings were as follows: 1. Liquefaction reactivity increases with increasing level of alkylation for both hydroliquefaction and co-processing reaction conditions. 2. The increase in reactivity found for O-alkylated Wyodak subbituminous coal is caused by chemical changes at phenolic and carboxylic functional sites. 3. O-methylation of Wyodak subbituminous coal reduced the apparent activation energy for liquefaction of this coal.

  18. Modelling of Batch Process Operations

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli; Cameron, Ian; Gani, Rafiqul

    2011-01-01

    Here a batch cooling crystalliser is modelled and simulated as is a batch distillation system. In the batch crystalliser four operational modes of the crystalliser are considered, namely: initial cooling, nucleation, crystal growth and product removal. A model generation procedure is shown that s...

  19. Mathematical Modeling: A Structured Process

    Science.gov (United States)

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  20. Birth/death process model

    Science.gov (United States)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
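    A first-order birth/death Markov chain of this kind reduces to a tridiagonal transition matrix. The following is a minimal sketch with illustrative rates and boundaries; the original code's population characteristics, defaults, and error checking are user-supplied and not reproduced here.

```python
import numpy as np

def birth_death_matrix(n_max, birth=0.3, death=0.2):
    """Discrete-time birth/death chain on population sizes 0..n_max.

    From state n: go to n+1 with probability `birth`, to n-1 with
    probability `death` (when n > 0), otherwise stay put. Moves blocked
    by the boundaries at 0 and n_max become self-loops.
    """
    P = np.zeros((n_max + 1, n_max + 1))
    for n in range(n_max + 1):
        up = birth if n < n_max else 0.0
        down = death if n > 0 else 0.0
        P[n, min(n + 1, n_max)] += up
        P[n, max(n - 1, 0)] += down
        P[n, n] += 1.0 - up - down          # stay probability
    return P

P = birth_death_matrix(10)
print(P.sum(axis=1))        # every row is a probability distribution (all ones)

# Evolve an initial population of 5 individuals for 100 steps.
dist = np.zeros(11)
dist[5] = 1.0
for _ in range(100):
    dist = dist @ P
print(dist.sum())           # probability is conserved
```

    With birth rate above death rate, as here, the distribution drifts toward the upper boundary over time; swapping the rates reverses the drift.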

  1. The cost of ethanol production from lignocellulosic biomass -- A comparison of selected alternative processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Grethlein, H.E.; Dill, T.

    1993-04-30

    The purpose of this report is to compare the cost of selected alternative processes for the conversion of lignocellulosic biomass to ethanol. In turn, this information will be used by the ARS/USDA to guide the management of research and development programs in biomass conversion. The report will identify where the cost leverages are for the selected alternatives and what performance parameters need to be achieved to improve the economics. The process alternatives considered here are not exhaustive, but are selected on the basis of having a reasonable potential in improving the economics of producing ethanol from biomass. When other alternatives come under consideration, they should be evaluated by the same methodology used in this report to give fair comparisons of opportunities. A generic plant design is developed for an annual production of 25 million gallons of anhydrous ethanol using corn stover as the model substrate at $30/dry ton. Standard chemical engineering techniques are used to give first order estimates of the capital and operating costs. Following the format of the corn to ethanol plant, there are nine sections to the plant: feed preparation, pretreatment, hydrolysis, fermentation, distillation and dehydration, stillage evaporation, storage and denaturation, utilities, and enzyme production. There are three pretreatment alternatives considered: the AFEX process, the modified AFEX process (abbreviated as MAFEX), and the STAKETECH process. These all use enzymatic hydrolysis, and so an enzyme production section is included in the plant. STAKETECH is the only commercially available process among the alternatives.

  2. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    Science.gov (United States)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

  3. Radial Domany-Kinzel models with mutation and selection

    Science.gov (United States)

    Lavrentovich, Maxim O.; Korolev, Kirill S.; Nelson, David R.

    2013-01-01

    We study the effect of spatial structure, genetic drift, mutation, and selective pressure on the evolutionary dynamics in a simplified model of asexual organisms colonizing a new territory. Under an appropriate coarse-graining, the evolutionary dynamics is related to the directed percolation processes that arise in voter models, the Domany-Kinzel (DK) model, contact process, and so on. We explore the differences between linear (flat front) expansions and the much less familiar radial (curved front) range expansions. For the radial expansion, we develop a generalized, off-lattice DK model that minimizes otherwise persistent lattice artifacts. With both simulations and analytical techniques, we study the survival probability of advantageous mutants, the spatial correlations between domains of neutral strains, and the dynamics of populations with deleterious mutations. “Inflation” at the frontier leads to striking differences between radial and linear expansions. For a colony with initial radius R0 expanding at velocity v, significant genetic demixing, caused by local genetic drift, occurs only up to a finite time t*=R0/v, after which portions of the colony become causally disconnected due to the inflating perimeter of the expanding front. As a result, the effect of a selective advantage is amplified relative to genetic drift, increasing the survival probability of advantageous mutants. Inflation also modifies the underlying directed percolation transition, introducing novel scaling functions and modifications similar to a finite-size effect. Finally, we consider radial range expansions with deflating perimeters, as might arise from colonization initiated along the shores of an island.

  4. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    Science.gov (United States)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC) models require manual selection of endmember (i.e. hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time efficient way of identifying endmember pixels for use in these models. The fully automated models were applied on over 100 cloud-free Landsat images with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92, Nash-Sutcliffe efficiency, NSE, ≥ 0.92, and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce time demands for applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.
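    A rule-based core of automated endmember selection can be sketched as percentile filtering on land surface temperature (LST) and NDVI arrays: the cold pixel is a cool, well-vegetated pixel and the hot pixel a warm, bare one. The thresholds and the synthetic scene below are illustrative assumptions; the paper's approach uses trained machine-learning tools and search algorithms rather than fixed percentiles.

```python
import numpy as np

def pick_endmembers(lst, ndvi):
    """Return (cold_index, hot_index) into the flattened scene.

    Cold: coolest pixel among the best-vegetated 5%.
    Hot: warmest pixel among the least-vegetated 5%.
    The 5%/95% thresholds are illustrative, not from the paper.
    """
    veg = ndvi >= np.percentile(ndvi, 95)
    bare = ndvi <= np.percentile(ndvi, 5)
    cold = np.flatnonzero(veg)[np.argmin(lst[veg])]
    hot = np.flatnonzero(bare)[np.argmax(lst[bare])]
    return int(cold), int(hot)

rng = np.random.default_rng(7)
ndvi = rng.uniform(0.05, 0.85, size=10_000)
# Synthetic inverse LST-NDVI relation typical of clear-sky scenes, plus noise.
lst = 320.0 - 25.0 * ndvi + rng.normal(0.0, 1.0, size=10_000)

cold, hot = pick_endmembers(lst, ndvi)
print(lst[cold] < lst[hot], ndvi[cold] > ndvi[hot])  # True True
```

    The two selected pixels anchor the calibration of sensible heat flux in SEBAL/METRIC; an automated rule like this removes the operator judgment (and potential bias) that manual selection introduces.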

  5. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubinina

    2015-06-01

    Full Text Available The essence of process-oriented enterprise management is examined in the article. The content and types of information technology are analyzed, given the complexity and differentiation of existing methods and the specific languages and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and modern modeling techniques that have found practical application in visualizing retailers' activity are studied. Theoretical analysis of the modeling methods found that the UFO-toolkit method developed by Ukrainian scientists is, owing to its integrated systemological capabilities, the most suitable for structural and object analysis of retailers' business processes. A visualized simulation model of the retailers' business process "sales" as-is was designed using a combination of UFO elements, with the aim of further formalization and optimization of this business process.

  6. Selection of Hydrological Model for Waterborne Release

    International Nuclear Information System (INIS)

    Blanchard, A.

    1999-01-01

    This evaluation will aid in determining the potential impacts of liquid releases to downstream populations on the Savannah River. The purpose of this report is to evaluate the two available models and determine the appropriate model for use in following waterborne release analyses. Additionally, this report will document the Design Basis and Beyond Design Basis accidents to be used in the future study

  7. Application of Bayesian Model Selection for Metal Yield Models using ALEGRA and Dakota.

    Energy Technology Data Exchange (ETDEWEB)

    Portone, Teresa; Niederhaus, John Henry; Sanchez, Jason James; Swiler, Laura Painton

    2018-02-01

    This report introduces the concepts of Bayesian model selection, which provides a systematic means of calibrating and selecting an optimal model to represent a phenomenon. This has many potential applications, including for comparing constitutive models. The ideas described herein are applied to a model selection problem between different yield models for hardened steel under extreme loading conditions.
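    The underlying idea of penalizing goodness of fit by model complexity can be illustrated with a BIC comparison between polynomial fits. BIC is only a large-sample approximation to the Bayesian evidence; the report itself performs full Bayesian calibration of yield models with ALEGRA and Dakota, not this toy procedure, and the data below are synthetic.

```python
import numpy as np

def bic(y, y_fit, n_params):
    """Bayesian information criterion: lower is better.

    BIC = n*log(RSS/n) + k*log(n), an approximation to -2 log(evidence)
    for Gaussian residuals with unknown variance.
    """
    n = len(y)
    rss = np.sum((y - y_fit) ** 2)
    return n * np.log(rss / n) + n_params * np.log(n)

rng = np.random.default_rng(3)
x = np.linspace(-1.0, 1.0, 200)
y = 1.0 + 0.5 * x - 2.0 * x**3 + rng.normal(0.0, 0.05, size=x.size)  # truth: cubic

# Candidate "constitutive models": underfit, correct, and overfit polynomials.
scores = {}
for degree in (1, 3, 9):
    coeffs = np.polyfit(x, y, degree)
    scores[degree] = bic(y, np.polyval(coeffs, x), degree + 1)

best = min(scores, key=scores.get)
print(best)  # the cubic wins: good fit without the degree-9 parameter penalty
```

    The same logic carries over to comparing yield models: the model with the highest (approximate) evidence balances agreement with the calibration data against the number of free parameters.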

  8. A Comparative of business process modelling techniques

    Science.gov (United States)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modeling techniques. This article reports research on the differences between such techniques: for each technique, the definition and the structure are explained. This paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation and how the technique works when implemented for Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The final conclusion recommends business process modeling techniques that are easy to use and serves as a basis for evaluating further modelling techniques.

  9. How visual cognition influences process model comprehension

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2017-01-01

    Process analysts and other professionals extensively use process models to analyze business processes and identify performance improvement opportunities. Therefore, it is important that such models can be easily and properly understood. Previous research has mainly focused on two types of factors

  10. Social software for business process modeling

    NARCIS (Netherlands)

    Koschmider, A.; Song, M.S.; Reijers, H.A.

    2010-01-01

    Formal models of business processes are used for a variety of purposes. But where the elicitation of the characteristics of a business process usually takes place in a collaborative fashion, the building of the final, formal process model is done mostly by a single person. This article presents the

  11. Fermentation process tracking through enhanced spectral calibration modeling.

    Science.gov (United States)

    Triadaphillou, Sophia; Martin, Elaine; Montague, Gary; Norden, Alison; Jeffkins, Paul; Stimpson, Sarah

    2007-06-15

    The FDA process analytical technology (PAT) initiative will materialize in a significant increase in the number of installations of spectroscopic instrumentation. However, to attain the greatest benefit from the data generated, there is a need for calibration procedures that extract the maximum information content. For example, in fermentation processes, the interpretation of the resulting spectra is challenging as a consequence of the large number of wavelengths recorded, the underlying correlation structure that is evident between the wavelengths and the impact of the measurement environment. Approaches to the development of calibration models have been based on the application of partial least squares (PLS) either to the full spectral signature or to a subset of wavelengths. This paper presents a new approach to calibration modeling that combines a wavelength selection procedure, spectral window selection (SWS), where windows of wavelengths are automatically selected which are subsequently used as the basis of the calibration model. However, due to the non-uniqueness of the windows selected when the algorithm is executed repeatedly, multiple models are constructed and these are then combined using stacking thereby increasing the robustness of the final calibration model. The methodology is applied to data generated during the monitoring of broth concentrations in an industrial fermentation process from on-line near-infrared (NIR) and mid-infrared (MIR) spectrometers. It is shown that the proposed calibration modeling procedure outperforms traditional calibration procedures, as well as enabling the identification of the critical regions of the spectra with regard to the fermentation process.
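    A much-simplified version of the window-selection-plus-stacking idea: score contiguous wavelength windows by their correlation with the response, fit one small least-squares model per top-ranked window, and average the predictions. Everything below (window size, the scoring rule, plain averaging in place of stacked generalization, ordinary least squares in place of PLS, and the synthetic spectra) is an illustrative assumption, not the paper's SWS algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)
n_samples, n_waves, win = 80, 200, 10

# Synthetic spectra: analyte concentration drives two narrow spectral regions.
conc = rng.uniform(0.0, 1.0, size=n_samples)
spectra = rng.normal(0.0, 0.1, size=(n_samples, n_waves))
spectra[:, 40:50] += conc[:, None] * 1.0
spectra[:, 120:130] += conc[:, None] * 0.7

def window_score(start):
    """Mean |correlation| of the window's wavelengths with the response."""
    block = spectra[:, start:start + win]
    return np.mean([abs(np.corrcoef(block[:, j], conc)[0, 1]) for j in range(win)])

starts = range(0, n_waves - win + 1, win)
top = sorted(starts, key=window_score, reverse=True)[:2]

# "Stack": average the predictions of one least-squares model per selected window.
preds = []
for s in top:
    X = np.c_[spectra[:, s:s + win], np.ones(n_samples)]   # window + intercept
    beta, *_ = np.linalg.lstsq(X, conc, rcond=None)
    preds.append(X @ beta)
stacked = np.mean(preds, axis=0)

rmse = np.sqrt(np.mean((stacked - conc) ** 2))
print(sorted(top), round(rmse, 3))   # both informative windows found; small error
```

    Combining several window models, rather than trusting a single selected window, is what gives the stacked calibration its robustness to the non-uniqueness of the selected windows.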

  12. Selection, processing and clinical application of muscle-skeletal tissue

    International Nuclear Information System (INIS)

    Luna Z, D.; Reyes F, M.L.; Lavalley E, C.; Castaneda J, G.

    2007-01-01

As average life expectancy worldwide increases, people die at ever greater ages. The supporting tissues of the human body, such as the muscle-skeletal tissues, weaken as the individual ages, which leads to an increase in illnesses such as osteoporosis and arthritis and, consequently, to more injuries of the muscle-skeletal tissues. Combined with the growing number of traffic accidents in which these tissues are particularly affected, the demand for muscle-skeletal tissue for transplant will only grow. Producing these tissues in the Bank of Radio-sterilized Tissues not only helps people improve their quality of life but also saves foreign currency, because most of the muscle-skeletal tissues transplanted in Mexico are imported. Irradiation has proven to be one of the best techniques for sterilizing tissues for transplant, which is why the International Atomic Energy Agency created a technical cooperation program to establish tissue banks using nuclear energy, aimed mainly at developing countries. This work describes the stages followed by the bank of radio-sterilized tissues of the National Institute of Nuclear Research for the selection of cadaveric donors of muscle-skeletal tissue, as well as the processing and the clinical application of these tissues. (Author)

  13. Using Ionic Liquids in Selective Hydrocarbon Conversion Processes

    Energy Technology Data Exchange (ETDEWEB)

    Tang, Yongchun; Periana, Roy; Chen, Weiqun; van Duin, Adri; Nielsen, Robert; Shuler, Patrick; Ma, Qisheng; Blanco, Mario; Li, Zaiwei; Oxgaard, Jonas; Cheng, Jihong; Cheung, Sam; Pudar, Sanja

    2009-09-28

This is the Final Report of the five-year project Using Ionic Liquids in Selective Hydrocarbon Conversion Processes (DE-FC36-04GO14276, July 1, 2004 - June 30, 2009), in which we present our major accomplishments with detailed descriptions of our experimental and theoretical efforts. Over the course of the project we followed our proposed work-breakdown structure and completed most of the technical tasks. We developed and demonstrated several optimized homogeneous catalytic methane conversion systems based on novel ionic liquids, which substantially outperform the Catalytica system (the best system to date): they achieve three times higher reaction rates, longer catalyst lifetimes, and much stronger resistance to water deactivation. We also developed an in-depth mechanistic understanding of the complex chemistry involved in homogeneous catalytic methane oxidation, along with unique and effective experimental protocols (reactors, analytical tools, and screening methodologies) for achieving a highly efficient, economically feasible, and environmentally friendly catalytic methane conversion system. The most important findings have been published, patented, and reported to DOE in this Final Report and our 20 Quarterly Reports.

  14. Astrophysical Model Selection in Gravitational Wave Astronomy

    Science.gov (United States)

    Adams, Matthew R.; Cornish, Neil J.; Littenberg, Tyson B.

    2012-01-01

    Theoretical studies in gravitational wave astronomy have mostly focused on the information that can be extracted from individual detections, such as the mass of a binary system and its location in space. Here we consider how the information from multiple detections can be used to constrain astrophysical population models. This seemingly simple problem is made challenging by the high dimensionality and high degree of correlation in the parameter spaces that describe the signals, and by the complexity of the astrophysical models, which can also depend on a large number of parameters, some of which might not be directly constrained by the observations. We present a method for constraining population models using a hierarchical Bayesian modeling approach which simultaneously infers the source parameters and population model and provides the joint probability distributions for both. We illustrate this approach by considering the constraints that can be placed on population models for galactic white dwarf binaries using a future space-based gravitational wave detector. We find that a mission that is able to resolve approximately 5000 of the shortest period binaries will be able to constrain the population model parameters, including the chirp mass distribution and a characteristic galaxy disk radius to within a few percent. This compares favorably to existing bounds, where electromagnetic observations of stars in the galaxy constrain disk radii to within 20%.
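The hierarchical idea — inferring a population-level parameter from many noisy per-source measurements — can be sketched in a toy one-dimensional setting. The Gaussian population, the grid posterior, and all numbers below are illustrative assumptions, not the trans-dimensional machinery needed for real gravitational-wave catalogs.

```python
import math
import random

random.seed(1)

MU_TRUE = 4.0          # true population mean (e.g. of a chirp-mass-like quantity)
POP_SD, OBS_SD = 1.0, 0.5

# Each "detection" measures one source parameter with observational noise.
truths = [random.gauss(MU_TRUE, POP_SD) for _ in range(500)]
obs = [random.gauss(t, OBS_SD) for t in truths]

# Marginal likelihood of one observation given the population mean:
# integrating out the source parameter gives N(mu, POP_SD^2 + OBS_SD^2).
var = POP_SD ** 2 + OBS_SD ** 2

def log_lik(mu):
    return sum(-0.5 * (x - mu) ** 2 / var for x in obs)

grid = [3.0 + 0.01 * i for i in range(201)]   # candidate population means
logs = [log_lik(mu) for mu in grid]
m = max(logs)
post = [math.exp(l - m) for l in logs]        # flat prior on the grid
z = sum(post)
post = [p / z for p in post]
mu_hat = sum(mu * p for mu, p in zip(grid, post))
```

With 500 "detections" the posterior concentrates to within a few percent of the true population mean, mirroring the abstract's point that a few thousand resolved sources pin down population parameters tightly.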

  15. Measuring similarity between business process models

    NARCIS (Netherlands)

    Dongen, van B.F.; Dijkman, R.M.; Mendling, J.

    2007-01-01

    Quality aspects become increasingly important when business process modeling is used in a large-scale enterprise setting. In order to facilitate a storage without redundancy and an efficient retrieval of relevant process models in model databases it is required to develop a theoretical understanding

  16. A model for selecting leadership styles.

    Science.gov (United States)

    Perkins, V J

    1992-01-01

    Occupational therapists lead a variety of groups during their professional activities. Such groups include therapy groups, treatment teams and management meetings. Therefore it is important for each therapist to understand theories of leadership and be able to select the most effective style for him or herself in specific situations. This paper presents a review of leadership theory and research as well as therapeutic groups. It then integrates these areas to assist students and new therapists in identifying a style that is effective for a particular group.

  17. Steady-State Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    illustrate the “equation oriented” approach as well as the “sequential modular” approach to solving complex flowsheets for steady state applications. The applications include the Williams-Otto plant, the hydrodealkylation (HDA) of toluene, conversion of ethylene to ethanol and a bio-ethanol process....

  18. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling....... the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension...

  19. Performance Measurement Model for the Supplier Selection Based on AHP

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2015-10-01

The performance of the supplier is a crucial factor for the success or failure of any company. Rational and effective decision making in terms of the supplier selection process can help the organization to optimize cost and quality functions. The nature of supplier selection processes is generally complex, especially when the company has a large variety of products and vendors. Over the years, several solutions and methods have emerged for addressing the supplier selection problem (SSP). Experience and studies have shown that there is no single best way of evaluating and selecting suppliers; the process varies from one organization to another. The aim of this research is to demonstrate how a multiple attribute decision making approach can be effectively applied for the supplier selection process.
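A minimal sketch of the core AHP computation used in such supplier-selection approaches: a Saaty-style pairwise comparison matrix, its principal eigenvector as the priority weights, and a consistency check. The matrix entries and the single-criterion setup are invented for illustration; the paper's full method would aggregate several criteria.

```python
# Pairwise comparison matrix (Saaty 1-9 scale): A[i][j] says how strongly
# supplier i is preferred over supplier j on a single criterion.
A = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   3.0],
     [1/5.0, 1/3.0, 1.0]]

def priority_vector(A, iters=100):
    # power iteration toward the principal eigenvector, normalized to sum 1
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

w = priority_vector(A)
n = len(A)

# Consistency check: estimate lambda_max from A w ~ lambda * w,
# then CI = (lambda_max - n)/(n - 1) and CR = CI / RI.
Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lam = sum(Aw[i] / w[i] for i in range(n)) / n
ci = (lam - n) / (n - 1)
cr = ci / 0.58           # Saaty's random index for n = 3
best = max(range(n), key=lambda i: w[i])
```

A judgement matrix is conventionally accepted when CR < 0.1; here the matrix is nearly consistent and the first supplier receives the largest weight.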

  20. Computer-aided tool for solvent selection in pharmaceutical processes: Solvent swap

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil; K. Tula, Anjan; Gernaey, Krist V.

    -liquid equilibria). The application of the developed model-based framework is highlighted through several cases studies published in the literature. In the current state, the framework is suitable for problems where the original solvent is exchanged by distillation. A solvent selection guide for fast of suitable......-aided framework with the objective to assist the pharmaceutical industry in gaining better process understanding. A software interface to improve the usability of the tool has been created also....

  1. On Optimal Input Design and Model Selection for Communication Channels

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yanyan [ORNL; Djouadi, Seddik M [ORNL; Olama, Mohammed M [ORNL

    2013-01-01

In this paper, the optimal model (structure) selection and input design which minimize the worst case identification error for communication systems are provided. The problem is formulated using metric complexity theory in a Hilbert space setting. It is pointed out that model selection and input design can be handled independently. Kolmogorov n-width is used to characterize the representation error introduced by model selection, while Gel'fand and time n-widths are used to represent the inherent error introduced by input design. After the model is selected, an optimal input which minimizes the worst case identification error is shown to exist. In particular, it is proven that the optimal model for reducing the representation error is a Finite Impulse Response (FIR) model, and the optimal input is an impulse at the start of the observation interval. FIR models are widely popular in communication systems, such as, in Orthogonal Frequency Division Multiplexing (OFDM) systems.
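The claim that an impulse input identifies an FIR model directly can be checked with a tiny noise-free sketch. The channel coefficients below are invented, and real identification would contend with noise and the worst-case error analysis of the paper; this only shows why the impulse response of an FIR channel is the model itself.

```python
# True FIR channel of length 4 (hypothetical coefficients).
h = [0.9, 0.5, -0.2, 0.1]

def channel(u):
    # y[k] = sum_j h[j] * u[k-j]  (causal convolution)
    return [sum(h[j] * u[k - j] for j in range(len(h)) if k - j >= 0)
            for k in range(len(u))]

# An impulse at the start of the observation interval...
impulse = [1.0] + [0.0] * 7
y = channel(impulse)

# ...makes the output equal the FIR coefficients: identification is trivial.
h_hat = y[:len(h)]
```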

  2. Styles in business process modeling: an exploration and a model

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Fahland, D.; Weidlich, M.; Zugal, S.; Weber, B.; Reijers, H.A.; Mendling, J.

    2015-01-01

    Business process models are an important means to design, analyze, implement, and control business processes. As with every type of conceptual model, a business process model has to meet certain syntactic, semantic, and pragmatic quality requirements to be of value. For many years, such quality

  3. Evaluation of EOR Processes Using Network Models

    DEFF Research Database (Denmark)

    Winter, Anatol; Larsen, Jens Kjell; Krogsbøll, Anette

    1998-01-01

    The report consists of the following parts: 1) Studies of wetting properties of model fluids and fluid mixtures aimed at an optimal selection of candidates for micromodel experiments. 2) Experimental studies of multiphase transport properties using physical models of porous networks (micromodels......) including estimation of their "petrophysical" properties (e.g. absolute permeability). 3) Mathematical modelling and computer studies of multiphase transport through pore space using mathematical network models. 4) Investigation of link between pore-scale and macroscopic recovery mechanisms....

  4. Process generalization in conceptual models

    NARCIS (Netherlands)

    Wieringa, Roelf J.

In conceptual modeling, the universe of discourse (UoD) is divided into classes which have a taxonomic structure. The classes are usually defined in terms of attributes (all objects in a class share attribute names) and possibly of events. For example, the class of employees is the set of objects to

  5. Numerical modelling of reflood processes

    International Nuclear Information System (INIS)

    Glynn, D.R.; Rhodes, N.; Tatchell, D.G.

    1983-01-01

    The use of a detailed computer model to investigate the effects of grid size and the choice of wall-to-fluid heat-transfer correlations on the predictions obtained for reflooding of a vertical heated channel is described. The model employs equations for the momentum and enthalpy of vapour and liquid and hence accounts for both thermal non-equilibrium and slip between the phases. Empirical correlations are used to calculate interphase and wall-to-fluid friction and heat-transfer as functions of flow regime and local conditions. The empirical formulae have remained fixed with the exception of the wall-to-fluid heat-transfer correlations. These have been varied according to the practices adopted in other computer codes used to model reflood, namely REFLUX, RELAP and TRAC. Calculations have been performed to predict the CSNI standard problem number 7, and the results are compared with experiment. It is shown that the results are substantially grid-independent, and that the choice of correlation has a significant influence on the general flow behaviour, the rate of quenching and on the maximum cladding temperature predicted by the model. It is concluded that good predictions of reflooding rates can be obtained with particular correlation sets. (author)
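The grid-independence check mentioned in the abstract can be illustrated on a much simpler problem than reflood: single-phase 1D transient conduction solved with an explicit finite-difference scheme on two grids. The geometry, boundary conditions, and tolerance below are invented for illustration; the actual reflood model is a two-phase, non-equilibrium code with empirical correlations.

```python
def solve_rod(nx, t_end=0.01, alpha=1.0):
    # Explicit FTCS scheme on a unit rod: T(0)=1, T(1)=0, T(x,0)=0.
    dx = 1.0 / (nx - 1)
    nsteps = max(1, round(4 * alpha * t_end / dx ** 2))  # keeps r = dt/dx^2 <= 0.25
    dt = t_end / nsteps                                  # dt divides t_end exactly
    T = [0.0] * nx
    T[0] = 1.0                                           # hot boundary
    for _ in range(nsteps):
        Tn = T[:]
        for i in range(1, nx - 1):
            Tn[i] = T[i] + alpha * dt / dx ** 2 * (T[i + 1] - 2 * T[i] + T[i - 1])
        T = Tn
    return T

coarse = solve_rod(21)                 # dx = 0.05
fine = solve_rod(41)                   # dx = 0.025
grid_diff = abs(coarse[2] - fine[4])   # temperature at x = 0.1 on both grids
```

When halving the mesh spacing changes the solution by far less than the quantity of interest, the result can be called substantially grid-independent, which is the kind of evidence the abstract reports for the reflood predictions.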

  6. Branching process models of cancer

    CERN Document Server

    Durrett, Richard

    2015-01-01

    This volume develops results on continuous time branching processes and applies them to study rate of tumor growth, extending classic work on the Luria-Delbruck distribution. As a consequence, the authors calculate the probability that mutations that confer resistance to treatment are present at detection and quantify the extent of tumor heterogeneity. As applications, the authors evaluate ovarian cancer screening strategies and give rigorous proofs for results of Heano and Michor concerning tumor metastasis. These notes should be accessible to students who are familiar with Poisson processes and continuous time. Richard Durrett is mathematics professor at Duke University, USA. He is the author of 8 books, over 200 journal articles, and has supervised more than 40 Ph.D. students. Most of his current research concerns the applications of probability to biology: ecology, genetics, and most recently cancer.
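A crude discrete-time stand-in for such a branching-process calculation: grow a tumor one division at a time, let each sensitive-cell division produce a resistant cell with small probability, and estimate by Monte Carlo the probability that resistance is already present at detection size. The detection size, mutation rate, and no-death assumption are illustrative simplifications of the continuous-time models in the book.

```python
import random

random.seed(2)

def resistant_at_detection(n_detect=1000, mu=1e-3):
    # Discrete stand-in for a branching process: each division adds one cell,
    # which is newly resistant with probability mu, or inherits resistance.
    sensitive, resistant = 1, 0
    while sensitive + resistant < n_detect:
        # pick the dividing subpopulation proportionally to its size
        if random.random() < sensitive / (sensitive + resistant):
            if random.random() < mu:
                resistant += 1
            else:
                sensitive += 1
        else:
            resistant += 1
    return resistant > 0

trials = 2000
p_hat = sum(resistant_at_detection() for _ in range(trials)) / trials
```

In this simplified model the probability that no mutant ever arises is (1 - mu)^(N-1), so with mu = 1e-3 and N = 1000 resistance should be present at detection in roughly 63% of runs, which the Monte Carlo estimate reproduces.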

  7. Model selection and comparison for independents sinusoids

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2014-01-01

    In the signal processing literature, many methods have been proposed for estimating the number of sinusoidal basis functions from a noisy data set. The most popular method is the asymptotic MAP criterion, which is sometimes also referred to as the BIC. In this paper, we extend and improve this me...
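The BIC-style criterion for choosing the number of sinusoids can be sketched as follows: fit k sinusoids greedily on an integer frequency grid and penalize each extra sinusoid by 3 ln N (amplitude, phase, frequency). The greedy grid fit and the exact penalty form are simplifying assumptions of this sketch, not the asymptotic MAP rule analyzed in the paper.

```python
import math
import random

random.seed(3)

N = 200
ts = [i / N for i in range(N)]
# One true sinusoid at 7 cycles per record, plus white noise.
y = [math.sin(2 * math.pi * 7 * t) + random.gauss(0, 0.3) for t in ts]

def rss_with_k_sinusoids(k):
    resid = y[:]
    for _ in range(k):
        # greedily fit the best remaining integer frequency
        best = None
        for f in range(1, 21):
            c = [math.cos(2 * math.pi * f * t) for t in ts]
            s = [math.sin(2 * math.pi * f * t) for t in ts]
            # on this grid the basis functions are orthogonal, so the
            # least-squares amplitudes are simple projections
            a = 2 * sum(r * ci for r, ci in zip(resid, c)) / N
            b = 2 * sum(r * si for r, si in zip(resid, s)) / N
            new = [r - a * ci - b * si for r, ci, si in zip(resid, c, s)]
            rss = sum(v * v for v in new)
            if best is None or rss < best[0]:
                best = (rss, new)
        resid = best[1]
    return sum(v * v for v in resid)

def bic(k):
    rss = rss_with_k_sinusoids(k)
    n_params = 3 * k          # amplitude, phase, frequency per sinusoid
    return N * math.log(rss / N) + n_params * math.log(N)

k_hat = min(range(0, 4), key=bic)
```

The first sinusoid cuts the residual sum of squares by an order of magnitude, while a second only soaks up noise and cannot pay its 3 ln N penalty, so the criterion selects one component.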

  8. Plasma Processing of Model Residential Solid Waste

    Science.gov (United States)

    Messerle, V. E.; Mossé, A. L.; Nikonchuk, A. N.; Ustimenko, A. B.; Baimuldin, R. V.

    2017-09-01

    The authors have tested the technology of processing of model residential solid waste. They have developed and created a pilot plasma unit based on a plasma chamber incinerator. The waste processing technology has been tested and prepared for commercialization.

  9. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  10. Exploring selection and recruitment processes for newly qualified nurses: a sequential-explanatory mixed-method study.

    Science.gov (United States)

    Newton, Paul; Chandler, Val; Morris-Thomson, Trish; Sayer, Jane; Burke, Linda

    2015-01-01

To map current selection and recruitment processes for newly qualified nurses and to explore the advantages and limitations of current selection and recruitment processes. The need to improve current selection and recruitment practices for newly qualified nurses is highlighted in health policy internationally. A cross-sectional, sequential-explanatory mixed-method design with four components was used: (1) a literature review of selection and recruitment of newly qualified nurses; (2) a literature review of a public sector profession's selection and recruitment processes; (3) a survey mapping existing selection and recruitment processes for newly qualified nurses; and (4) a qualitative study of recruiters' selection and recruitment processes. Literature searches on the selection and recruitment of newly qualified candidates in teaching and nursing (2005-2013) were conducted. Cross-sectional, mixed-method data were collected using a survey instrument from thirty-one (n = 31) individuals in health providers in London who had responsibility for the selection and recruitment of newly qualified nurses. Of the providers who took part, six (n = 6) were purposively selected for qualitative interviews. Issues of supply and demand in the workforce, rather than selection and recruitment tools, predominated in the literature reviews. Examples of tools to measure values, attitudes and skills were found in the nursing literature. The mapping exercise found that providers used many selection and recruitment tools; some providers combined tools to streamline the process and assure the quality of candidates. Most providers had processes which addressed the issue of quality in the selection and recruitment of newly qualified nurses. The 'assessment centre model', which providers were adopting, allowed for multiple levels of assessment and streamlined recruitment. There is a need to validate the efficacy of the selection tools. © 2014 John Wiley & Sons Ltd.

  11. Model selection in kernel ridge regression

    DEFF Research Database (Denmark)

    Exterkate, Peter

    2013-01-01

    Kernel ridge regression is a technique to perform ridge regression with a potentially infinite number of nonlinear transformations of the independent variables as regressors. This method is gaining popularity as a data-rich nonlinear forecasting tool, which is applicable in many different contexts....... The influence of the choice of kernel and the setting of tuning parameters on forecast accuracy is investigated. Several popular kernels are reviewed, including polynomial kernels, the Gaussian kernel, and the Sinc kernel. The latter two kernels are interpreted in terms of their smoothing properties......, and the tuning parameters associated to all these kernels are related to smoothness measures of the prediction function and to the signal-to-noise ratio. Based on these interpretations, guidelines are provided for selecting the tuning parameters from small grids using cross-validation. A Monte Carlo study...

  12. Model Selection in Kernel Ridge Regression

    DEFF Research Database (Denmark)

    Exterkate, Peter

    Kernel ridge regression is gaining popularity as a data-rich nonlinear forecasting tool, which is applicable in many different contexts. This paper investigates the influence of the choice of kernel and the setting of tuning parameters on forecast accuracy. We review several popular kernels......, including polynomial kernels, the Gaussian kernel, and the Sinc kernel. We interpret the latter two kernels in terms of their smoothing properties, and we relate the tuning parameters associated to all these kernels to smoothness measures of the prediction function and to the signal-to-noise ratio. Based...... on these interpretations, we provide guidelines for selecting the tuning parameters from small grids using cross-validation. A Monte Carlo study confirms the practical usefulness of these rules of thumb. Finally, the flexible and smooth functional forms provided by the Gaussian and Sinc kernels makes them widely...
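The workflow described in these two abstracts — pick a kernel, then choose the tuning parameters from a small grid — can be sketched for the Gaussian kernel. A plain hold-out split stands in for the cross-validation the paper recommends, the linear solver is a minimal Gaussian elimination for small systems, and the data, grids, and tolerances are invented.

```python
import math
import random

random.seed(4)

def gauss_kernel(x, z, gamma):
    return math.exp(-gamma * (x - z) ** 2)

def solve(Amat, b):
    # Gaussian elimination with partial pivoting (small dense systems only).
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(Amat)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_krr(xs, ys, gamma, lam):
    # kernel ridge regression: alpha = (K + lam*I)^{-1} y
    n = len(xs)
    K = [[gauss_kernel(xs[i], xs[j], gamma) + (lam if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, ys)
    return lambda x: sum(a * gauss_kernel(x, xi, gamma) for a, xi in zip(alpha, xs))

pts = [random.uniform(0, 1) for _ in range(40)]
vals = [math.sin(2 * math.pi * x) + random.gauss(0, 0.1) for x in pts]
tr_x, tr_y = pts[:30], vals[:30]
va_x, va_y = pts[30:], vals[30:]

# Small grid over kernel width (gamma) and ridge penalty (lam),
# chosen by hold-out validation error.
best = None
for gamma in (1.0, 10.0, 50.0):
    for lam in (1e-3, 1e-1, 1.0):
        f = fit_krr(tr_x, tr_y, gamma, lam)
        err = sum((f(x) - yv) ** 2 for x, yv in zip(va_x, va_y)) / len(va_x)
        if best is None or err < best[0]:
            best = (err, gamma, lam, f)

val_mse, gamma_star, lam_star, f_star = best
```

The selected (gamma, lambda) pair trades off the smoothness of the prediction function against the noise level, which is exactly the interpretation the abstracts give to these tuning parameters.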

  13. Selection of Hydrological Model for Waterborne Release

    International Nuclear Information System (INIS)

    Blanchard, A.

    1999-01-01

    Following a request from the States of South Carolina and Georgia, downstream radiological consequences from postulated accidental aqueous releases at the three Savannah River Site nonreactor nuclear facilities will be examined. This evaluation will aid in determining the potential impacts of liquid releases to downstream populations on the Savannah River. The purpose of this report is to evaluate the two available models and determine the appropriate model for use in following waterborne release analyses. Additionally, this report will document the accidents to be used in the future study

  14. A Network Analysis Model for Selecting Sustainable Technology

    Directory of Open Access Journals (Sweden)

    Sangsung Park

    2015-09-01

Most companies develop technologies to improve their competitiveness in the marketplace. Typically, they then patent these technologies around the world in order to protect their intellectual property. Other companies may use patented technologies to develop new products, but must pay royalties to the patent holders or owners. Should they fail to do so, this can result in legal disputes in the form of patent infringement actions between companies. To avoid such situations, companies attempt to research and develop necessary technologies before their competitors do so. An important part of this process is analyzing existing patent documents in order to identify emerging technologies. In such analyses, extracting sustainable technology from patent data is important, because sustainable technology drives technological competition among companies and, thus, the development of new technologies. In addition, selecting sustainable technologies makes it possible to plan their R&D (research and development) efficiently. In this study, we propose a network model that can be used to select the sustainable technology from patent documents, based on the centrality and degree measures of social network analysis. To verify the performance of the proposed model, we carry out a case study using actual patent data from patent databases.
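Degree centrality, the simplest of the network measures mentioned, can be computed for a toy patent-citation graph. The graph itself is invented; the paper combines several centrality notions over real patent data.

```python
from collections import defaultdict

# Toy citation graph between patented technologies: an undirected edge
# means one patent cites the other's technology.
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("E", "A")]

degree = defaultdict(int)
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

n = len(degree)
# Normalized degree centrality: fraction of the other nodes a technology touches.
centrality = {node: d / (n - 1) for node, d in degree.items()}

# The highest-centrality technology is the candidate "sustainable" one.
sustainable = max(centrality, key=centrality.get)
```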

  15. Discovering Process Reference Models from Process Variants Using Clustering Techniques

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    2008-01-01

    In today's dynamic business world, success of an enterprise increasingly depends on its ability to react to changes in a quick and flexible way. In response to this need, process-aware information systems (PAIS) emerged, which support the modeling, orchestration and monitoring of business processes

  16. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always tied to an experimental case, and it has a residual character, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures of the simulation model, and it can also serve as a guide for the design of subsequent experiments. Three steps can be clearly distinguished: (1) sensitivity analysis, which can be performed with DSA (differential sensitivity analysis) or with MCSA (Monte Carlo sensitivity analysis); (2) finding the optimal domains of the input parameters, for which a procedure based on Monte Carlo methods and cluster techniques has been developed; (3) residual analysis, carried out in the time domain and in the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings is presented, studying the behaviour of building components in a Test Cell of LECE of CIEMAT (Spain). (Author) 17 refs
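The MCSA step — Monte Carlo sensitivity analysis — can be sketched as: sample the uncertain inputs, run the model, and rank the inputs by the magnitude of their correlation with the output. The three-input linearized "thermal" model and its coefficients below are hypothetical, used only to show the mechanics.

```python
import random

random.seed(6)

def model(u_wall, solar, ach):
    # Hypothetical linearized indoor-temperature response to three
    # uncertain inputs; coefficients are invented for illustration.
    return 20.0 + 5.0 * solar - 3.0 * u_wall - 1.0 * ach + random.gauss(0, 0.1)

rows = []
for _ in range(500):
    u = random.uniform(0.2, 1.0)   # wall U-value
    s = random.uniform(0.0, 1.0)   # solar gain factor
    a = random.uniform(0.1, 0.9)   # air changes per hour
    rows.append((u, s, a, model(u, s, a)))

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

out = [r[3] for r in rows]
influence = {name: abs(pearson([r[i] for r in rows], out))
             for i, name in enumerate(["u_wall", "solar", "ach"])}
dominant = max(influence, key=influence.get)
```

Inputs with small |correlation| are candidates to freeze at nominal values, narrowing the search for the optimal input domains in the next step of the methodology.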

  17. Random effect selection in generalised linear models

    DEFF Research Database (Denmark)

    Denwood, Matt; Houe, Hans; Forkman, Björn

    We analysed abattoir recordings of meat inspection codes with possible relevance to onfarm animal welfare in cattle. Random effects logistic regression models were used to describe individual-level data obtained from 461,406 cattle slaughtered in Denmark. Our results demonstrate that the largest...

  18. Adapting AIC to conditional model selection

    NARCIS (Netherlands)

    T. van Ommen (Thijs)

    2012-01-01

In statistical settings such as regression and time series, we can condition on observed information when predicting the data of interest. For example, a regression model explains the dependent variables $y_1, \ldots, y_n$ in terms of the independent variables $x_1, \ldots, x_n$.

  19. Towards Model Checking Stochastic Process Algebra

    NARCIS (Netherlands)

    Hermanns, H.; Grieskamp, W.; Santen, T.; Katoen, Joost P.; Stoddart, B.; Meyer-Kayser, J.; Siegle, M.

    2000-01-01

    Stochastic process algebras have been proven useful because they allow behaviour-oriented performance and reliability modelling. As opposed to traditional performance modelling techniques, the behaviour- oriented style supports composition and abstraction in a natural way. However, analysis of

  20. Hierarchical Model of Assessing and Selecting Experts

    Science.gov (United States)

    Chernysheva, T. Y.; Korchuganova, M. A.; Borisov, V. V.; Min'kov, S. L.

    2016-04-01

Revealing experts' competences is a multi-objective issue. The authors deal with methods for assessing the competences of experts, treated as the objects of assessment, together with the relevant quality criteria. An analytic hierarchy process for assessing and ranking experts is offered, based on paired-comparison matrices and scores; quality parameters are taken into account as well. A worked example of calculating and assessing experts is given.

  1. Hierarchical Model of Assessing and Selecting Experts

    OpenAIRE

    Chernysheva, Tatiana Yurievna; Korchuganova, Mariya Anatolievna; Borisov, V. V.; Minkov, S. L.

    2016-01-01

Revealing experts' competences is a multi-objective issue. The authors deal with methods for assessing the competences of experts, treated as the objects of assessment, together with the relevant quality criteria. An analytic hierarchy process for assessing and ranking experts is offered, based on paired-comparison matrices and scores; quality parameters are taken into account as well. A worked example of calculating and assessing experts is given.

  2. A Permutation Approach for Selecting the Penalty Parameter in Penalized Model Selection

    Science.gov (United States)

    Sabourin, Jeremy A; Valdar, William; Nobel, Andrew B

    2015-01-01

We describe a simple, computationally efficient, permutation-based procedure for selecting the penalty parameter in LASSO penalized regression. The procedure, permutation selection, is intended for applications where variable selection is the primary focus, and can be applied in a variety of structural settings, including that of generalized linear models. We briefly discuss connections between permutation selection and existing theory for the LASSO. In addition, we present a simulation study and an analysis of real biomedical data sets in which permutation selection is compared with selection based on the following: cross-validation (CV), the Bayesian information criterion (BIC), Scaled Sparse Linear Regression, and a selection method based on recently developed testing procedures for the LASSO. PMID:26243050
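The idea behind permutation selection can be sketched in the near-orthogonal special case, where the LASSO reduces to thresholding the scores |X^T y|/n: permuting y destroys all real associations, so a quantile of the permutation maxima estimates the noise level and serves as the penalty. The 90th-percentile choice and the synthetic data below are illustrative assumptions, not the paper's exact procedure.

```python
import random

random.seed(5)

n, p = 200, 10
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
# Only the first variable carries signal.
y = [2.0 * X[i][0] + random.gauss(0, 1) for i in range(n)]

def scores(yvec):
    # |X^T y| / n for each variable (valid thresholding statistic when
    # the design is near-orthogonal, as with independent Gaussian columns)
    return [abs(sum(X[i][j] * yvec[i] for i in range(n))) / n for j in range(p)]

# Permutation selection sketch: the largest score under permuted responses
# estimates what pure noise can achieve; use a high quantile as the penalty.
perm_maxes = []
yp = y[:]
for _ in range(50):
    random.shuffle(yp)
    perm_maxes.append(max(scores(yp)))
perm_maxes.sort()
lam = perm_maxes[int(0.9 * len(perm_maxes))]   # 90th-percentile permutation maximum

selected = [j for j, s in enumerate(scores(y)) if s > lam]
```

The signal variable's score sits far above the permutation-based penalty, while the noise variables rarely clear it, so the selected set is essentially the true support.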

  3. Alcoholics' selective attention to alcohol stimuli: automated processing?

    Science.gov (United States)

    Stormark, K M; Laberg, J C; Nordby, H; Hugdahl, K

    2000-01-01

    This study investigated alcoholics' selective attention to alcohol words in a version of the Stroop color-naming task. Alcoholic subjects (n = 23) and nonalcoholic control subjects (n = 23) identified the color of Stroop versions of alcohol, emotional, neutral and color words. Manual reaction times (RTs), skin conductance responses (SCRs) and heart rate (HR) were recorded. Alcoholics showed overall longer RTs than controls while both groups were slower in responding to the incongruent color words than to the other words. Alcoholics showed longer RTs to both alcohol (1522.7 milliseconds [ms]) and emotional words (1523.7 ms) than to neutral words (1450.8 ms) which suggests that the content of these words interfered with the ability to attend to the color of the words. There was also a negative correlation (r = -.41) between RT and response accuracy to alcohol words for the alcoholics, reflecting that the longer time the alcoholics used to respond to the color of the alcohol words, the more incorrect their responses were. The alcoholics also showed significantly greater SCRs to alcohol words (0.16 microSiemens) than to any of the other words (ranging from 0.04-0.08 microSiemens), probably reflecting the emotional significance of the alcohol words. Finally, the alcoholics evidenced smaller HR acceleration to alcohol (1.9 delta bpm) compared to neutral (2.8 delta bpm), which could be related to difficulties alcoholics experience in terminating their attention to the alcohol words. These findings indicate that it is difficult for alcoholics to regulate their attention to alcohol stimuli, suggesting that alcoholics' processing of alcohol information is automated.

  4. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models and measurements, and to propagate those uncertainties through the model, so that one can make predictive estimates with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model. We employ this simple heat model to illustrate verification
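    The parameter-selection step described above can be illustrated with a toy sketch: rank the parameters of a simple response model by their scaled local sensitivities, and flag the lowest-ranked ones as candidates to fix at nominal values before calibration. The decay model and its parameter values below are invented for illustration; they are not the dissertation's HIV or heat models.

```python
import numpy as np

def model(t, p):
    # Toy response: exponential decay with offset, a stand-in for a
    # real (e.g. HIV or heat) model.  p = [A, k, c]
    A, k, c = p
    return A * np.exp(-k * t) + c

def sensitivity_ranking(model, t, p, h=1e-6):
    """Rank parameters by the scaled local sensitivity |p_j * dy/dp_j|."""
    y0 = model(t, p)
    scores = []
    for j in range(len(p)):
        dp = np.array(p, float)
        step = h * max(abs(p[j]), 1.0)
        dp[j] += step
        dy = (model(t, dp) - y0) / step                   # forward difference
        scores.append(np.sqrt(np.mean((p[j] * dy) ** 2)))  # scaled RMS
    return np.argsort(scores)[::-1], np.array(scores)

t = np.linspace(0.0, 10.0, 50)
p = [5.0, 0.8, 0.1]                                       # A, k, c (nominal)
order, scores = sensitivity_ranking(model, t, p)
# Parameters at the end of `order` influence the response least and are
# candidates for removal (fixing at nominal values) before calibration.
```

    A full analysis would use global methods or identifiability analysis; local finite-difference sensitivities are only a first screen.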

  5. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technologies such as selective laser melting (SLM), there are gaps in process modeling and material...

  6. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technologies such as selective laser melting (SLM), there are gaps in process modeling and material...

  7. A proposed selection process in Over-The-Top project portfolio management

    Directory of Open Access Journals (Sweden)

    Jemy Vestius Confido

    2018-05-01

    Full Text Available Purpose: The purpose of this paper is to propose an Over-The-Top (OTT) initiative selection process for communication service providers (CSPs) entering an OTT business. Design/methodology/approach: To achieve this objective, a literature review was conducted to comprehend the past and current practices of the project (or initiative) selection process as mainly suggested in project portfolio management (PPM). This literature was compared with the specific situation and needs of CSPs when constructing an OTT portfolio. Based on the contrast between the conventional project selection process and specific OTT characteristics, a different selection process was developed and tested using group model-building (GMB), which involved an in-depth interview, a questionnaire and a focus group discussion (FGD). Findings: The paper recommends five distinct steps for CSPs to construct an OTT initiative portfolio: a candidate list of OTT initiatives, an interdependency diagram, evaluation of all interdependent OTT initiatives, evaluation of all non-interdependent OTT initiatives, and an optimal portfolio of OTT initiatives. Research limitations/implications: The research is empirical and various OTT services are implemented, but the conclusion is derived from only one CSP, which operates as a group. Generalization of this approach will require further empirical tests on different CSPs, OTT players or any firms performing portfolio selection with a degree of interdependency among the projects. Practical implications: Having considered interdependency, the proposed OTT initiative selection steps can be implemented by portfolio managers for more effective OTT initiative portfolio construction. Originality/value: While the previous literature and common practices suggest ensuring the (mainly financial) benefits of individual projects, this research accords higher priority to the success of the overall OTT initiative portfolio and recommends that an evaluation of the overall

  8. An Evaluation Model To Select an Integrated Learning System in a Large, Suburban School District.

    Science.gov (United States)

    Curlette, William L.; And Others

    The systematic evaluation process used in Georgia's DeKalb County School System to purchase comprehensive instructional software--an integrated learning system (ILS)--is described, and the decision-making model for selection is presented. Selection and implementation of an ILS were part of an instructional technology plan for the DeKalb schools…

  9. Modeling shape selection of buckled dielectric elastomers

    Science.gov (United States)

    Langham, Jacob; Bense, Hadrien; Barkley, Dwight

    2018-02-01

    A dielectric elastomer whose edges are held fixed will buckle, given a sufficiently large applied voltage, resulting in a nontrivial out-of-plane deformation. We study this situation numerically using a nonlinear elastic model which decouples two of the principal electrostatic stresses acting on an elastomer: normal pressure due to the mutual attraction of oppositely charged electrodes and tangential shear ("fringing") due to repulsion of like charges at the electrode edges. These enter via physically simplified boundary conditions that are applied in a fixed reference domain using a nondimensional approach. The method is valid for small to moderate strains and is straightforward to implement in a generic nonlinear elasticity code. We validate the model by directly comparing the simulated equilibrium shapes with experiment. For circular electrodes, which buckle axisymmetrically, the shape of the deflection profile is captured. Annular electrodes of different widths produce azimuthal ripples with wavelengths that match our simulations. In this case, it is essential to compute multiple equilibria because the first model solution obtained by the nonlinear solver (Newton's method) is often not the energetically favored state. We address this using a numerical technique known as "deflation." Finally, we observe the large number of different solutions that may be obtained for the case of a long rectangular strip.
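    The "deflation" technique mentioned above can be illustrated in one dimension: once Newton's method has converged to one solution, dividing the residual by a factor that vanishes there repels the iteration from that solution, so restarting from the same initial guess finds a different equilibrium. This scalar sketch (with an invented polynomial) only mirrors the idea; the paper applies deflation to a discretized elasticity problem.

```python
import numpy as np

def newton(g, x0, tol=1e-10, max_iter=100):
    """Newton's method with a numerical (central-difference) derivative."""
    x = x0
    for _ in range(max_iter):
        gx = g(x)
        if abs(gx) < tol:
            return x
        h = 1e-7 * max(abs(x), 1.0)
        dg = (g(x + h) - g(x - h)) / (2 * h)
        x -= gx / dg
    return x

f = lambda x: x**2 - 1.0          # toy residual with two equilibria, x = +/-1

r1 = newton(f, 2.0)               # plain Newton finds x = 1
# Deflate: divide out the known solution so Newton, started from the SAME
# initial guess, is repelled from r1 and converges to the other one.
deflated = lambda x: f(x) / (x - r1)
r2 = newton(deflated, 2.0)        # now finds x = -1
```

    In the PDE setting the deflation factor uses a norm of the difference between states rather than a scalar distance, but the mechanism is the same.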

  10. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation-oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints is considered directly, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows an efficient solving strategy and a concise error diagnosis to be implemented. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it

  11. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications are shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipating out-of-specification (OOS) events, identifying critical process parameters, and taking risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
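    A minimal sketch of the Monte Carlo idea: sample uncertain parameters of stacked unit operations, propagate them through a (here trivially multiplicative, invented) process model, and estimate the probability of an out-of-specification result. All distributions and the specification limit below are illustrative, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000                                        # Monte Carlo runs

# Illustrative two-step process: each unit operation multiplies the
# intermediate quality attribute by an uncertain step yield.
load          = 1.0                                # normalized input
yield_capture = rng.normal(0.90, 0.02, N)          # capture step (assumed)
yield_polish  = rng.normal(0.95, 0.01, N)          # polishing step (assumed)
cqa = load * yield_capture * yield_polish          # final CQA per run

spec_lower = 0.80                                  # specification limit
p_oos = np.mean(cqa < spec_lower)                  # P(out of specification)
```

    With correlated parameters or nonlinear unit-operation models the same propagation applies; only the sampling and the per-step model change.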

  12. Multi-Criteria Decision Making For Determining A Simple Model of Supplier Selection

    Science.gov (United States)

    Harwati

    2017-06-01

    Supplier selection is a decision involving many criteria. Supplier selection models usually involve more than five main criteria and more than ten sub-criteria; in fact, many models include more than twenty criteria. The large number of criteria involved in supplier selection models sometimes makes them difficult to apply in many companies. This research focuses on designing a supplier selection model that is easy and simple to apply in a company. The Analytical Hierarchy Process (AHP) is used to weight the criteria. The analysis shows that four criteria are sufficient, easy and simple to use for selecting suppliers: price (weight 0.4), shipment (weight 0.3), quality (weight 0.2) and service (weight 0.1). A real-case simulation shows that the simple model provides the same decision as a more complex model.
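    With the weights reported above, supplier scoring reduces to a weighted sum over normalized criterion values. The supplier data below are hypothetical; only the four criteria and their weights come from the abstract.

```python
import numpy as np

# AHP-derived weights from the study: price, shipment, quality, service.
weights = np.array([0.4, 0.3, 0.2, 0.1])

# Hypothetical raw criterion data for three candidate suppliers.
# price: cost per unit (lower is better); others scored 1-10 (higher better).
price    = np.array([100.0, 120.0, 90.0])
shipment = np.array([7.0, 9.0, 6.0])
quality  = np.array([8.0, 9.0, 7.0])
service  = np.array([6.0, 8.0, 7.0])

# Normalize each criterion to [0, 1]; invert price so higher = better.
norm = lambda v: v / v.max()
scores_matrix = np.column_stack([
    price.min() / price,           # cost criterion: min / actual
    norm(shipment), norm(quality), norm(service),
])
total = scores_matrix @ weights    # weighted-sum score per supplier
best = int(np.argmax(total))
```

    The normalization scheme (max for benefit criteria, min/actual for cost criteria) is one common convention; the paper does not prescribe it.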

  13. Optimal processing pathway selection for microalgae-based biorefinery under uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Zaman, Muhammad; Lee, Jay H.

    2015-01-01

    We propose a systematic framework for the selection of optimal processing pathways for a microalgae-based biorefinery under techno-economic uncertainty. The proposed framework promotes robust decision making by taking into account the uncertainties that arise due to inconsistencies among...... and shortage in the available technical information. A stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery configurations based on a superstructure model where parameter uncertainties are modeled and included as sampled scenarios. The solution...... the accounting of uncertainty are compared with respect to different objectives. (C) 2015 Elsevier Ltd. All rights reserved....

  14. The site selection process for a spent fuel repository in Finland. Summary report

    Energy Technology Data Exchange (ETDEWEB)

    McEwen, T. [EnvirosQuantiSci (United Kingdom)]; Aeikaes, T. [Posiva Oy, Helsinki (Finland)]

    2000-12-01

    This Summary Report describes the Finnish programme for the selection and characterisation of potential sites for the deep disposal of spent nuclear fuel and explains the process by which Olkiluoto has been selected as the single site proposed for the development of a spent fuel disposal facility. Its aim is to provide an overview of this process, initiated almost twenty years ago, which has entered its final phase. It provides information in three areas: a review of the early site selection criteria, a description of the site selection process, including all the associated site characterisation work, up to the point at which a single site was selected and an outline of the proposed work, in particular that proposed underground, to characterise further the Olkiluoto site. In 1983 the Finnish Government made a policy decision on the management of nuclear waste in which the main goals and milestones for the site selection programme for the deep disposal of spent fuel were presented. According to this decision several site candidates, whose selection was to be based on careful studies of the whole country, should be characterised and the site for the repository selected by the end of the year 2000. This report describes the process by which this policy decision has been achieved. The report begins with a discussion of the definition of the geological and environmental site selection criteria and how they were applied in order to select a small number of sites, five in all, that were to be the subject of the preliminary investigations. The methods used to investigate these sites and the results of these investigations are described, as is the evaluation of the results of these investigations and the process used to discard two of the sites and continue more detailed investigations at the remaining three. The detailed site investigations that commenced in 1993 are described with respect to the overall strategy followed and the investigation techniques applied. The

  15. Variable selection for mixture and promotion time cure rate models.

    Science.gov (United States)

    Masud, Abdullah; Tu, Wanzhu; Yu, Zhangsheng

    2016-11-16

    Failure-time data with cured patients are common in clinical studies. Data from these studies are typically analyzed with cure rate models. Variable selection methods have not been well developed for cure rate models. In this research, we propose two least absolute shrinkage and selection operator (LASSO)-based methods for variable selection in mixture and promotion time cure models with parametric or nonparametric baseline hazards. We conduct an extensive simulation study to assess the operating characteristics of the proposed methods. We illustrate the use of the methods using data from a study of childhood wheezing. © The Author(s) 2016.

  16. Revising process models through inductive learning

    NARCIS (Netherlands)

    Maggi, F.M.; Corapi, D.; Russo, A.; Lupu, E.; Visaggio, G.; Muehlen, zur M.; Su, J.

    2011-01-01

    Discovering the Business Process (BP) model underpinning existing practices through analysis of event logs allows users to understand, analyse and modify the process. But to be useful, the BP model must be kept in line with practice throughout its lifetime, as changes occur to the business

  17. Diagnosing differences between business process models

    NARCIS (Netherlands)

    Dijkman, R.M.; Dumas, M.; Reichert, M.; Shan, M.-C.

    2008-01-01

    This paper presents a technique to diagnose differences between business process models in the EPC notation. The diagnosis returns the exact position of a difference in the business process models and diagnoses the type of a difference, using a typology of differences developed in previous work.

  18. APROMORE : an advanced process model repository

    NARCIS (Netherlands)

    La Rosa, M.; Reijers, H.A.; Aalst, van der W.M.P.; Dijkman, R.M.; Mendling, J.; Dumas, M.; García-Bañuelos, L.

    2011-01-01

    Business process models are becoming available in large numbers due to their widespread use in many industrial applications such as enterprise and quality engineering projects. On the one hand, this raises a challenge as to their proper management: how can it be ensured that the proper process model

  19. Efficient querying of large process model repositories

    NARCIS (Netherlands)

    Jin, Tao; Wang, Jianmin; La Rosa, M.; Hofstede, ter A.H.M.; Wen, Lijie

    2013-01-01

    Recent years have seen an increased uptake of business process management technology in industries. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business

  20. Two-step variable selection in quantile regression models

    Directory of Open Access Journals (Sweden)

    FAN Yali

    2015-06-01

    Full Text Available We propose a two-step variable selection procedure for high-dimensional quantile regressions, in which the dimension of the covariates, pn, is much larger than the sample size n. In the first step, we apply an ℓ1 penalty, and we demonstrate that the first-step penalized estimator with the LASSO penalty can reduce the model from ultra-high dimensional to one whose size has the same order as that of the true model, and that the selected model can cover the true model. The second step excludes the remaining irrelevant covariates by applying the adaptive LASSO penalty to the reduced model obtained from the first step. Under some regularity conditions, we show that our procedure enjoys model selection consistency. We conduct a simulation study and a real data analysis to evaluate the finite-sample performance of the proposed approach.
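    The two-step idea can be sketched with a small simulation. For brevity the sketch uses a squared-error LASSO (coordinate descent) rather than the quantile check loss of the paper, but the mechanics are the same: a first ℓ1 pass screens the covariates, then an adaptive pass with weights 1/|b1_j| removes the remaining irrelevant ones. All data are synthetic.

```python
import numpy as np

def lasso_cd(X, y, lam, weights=None, n_sweeps=200):
    """Coordinate-descent LASSO: min 0.5||y - Xb||^2 + lam * sum(w_j |b_j|)."""
    n, p = X.shape
    w = np.ones(p) if weights is None else weights
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_sweeps):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]          # partial residual
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam * w[j], 0.0) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
true_beta = np.zeros(p)
true_beta[:3] = [3.0, 2.0, 1.5]
y = X @ true_beta + 0.1 * rng.standard_normal(n)

# Step 1: plain LASSO screens out most irrelevant covariates.
b1 = lasso_cd(X, y, lam=20.0)
# Step 2: adaptive LASSO reweights by 1/|b1_j|, so covariates the first
# step shrank toward zero receive a huge penalty and are excluded.
b2 = lasso_cd(X, y, lam=2.0, weights=1.0 / (np.abs(b1) + 1e-8))
selected = set(np.flatnonzero(np.abs(b2) > 1e-6))
```

    The penalty levels here are hand-picked for this synthetic design; in practice they would be chosen by cross-validation, and a quantile version would replace the squared-error loss with the check loss.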

  1. Quantitative analytical hierarchy process to marketing store location selection

    OpenAIRE

    Harwati; Utami Intan

    2018-01-01

    Selecting a store to market a product is a multi-criteria decision-making problem. The criteria involved conflict with one another in producing an optimal location. This research uses four important criteria, consistent with the literature, to select the new location of a marketing store: distance to the location, level of competition, number of potential customers, and rental cost. Quantitative data are used to determine the optimum location with the AHP method. Q...
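    A core step in any AHP study like this one is deriving criterion weights from a pairwise-comparison matrix via its principal eigenvector and checking Saaty's consistency ratio. The comparison matrix below is hypothetical, not the paper's data.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for the four criteria
# (distance, competition, potential customers, rent) on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 2.0, 5.0],
    [1/3, 1.0, 1/2, 2.0],
    [1/2, 2.0, 1.0, 3.0],
    [1/5, 1/2, 1/3, 1.0],
])

# Priority vector = principal eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio: CR = ((lambda_max - n) / (n - 1)) / RI, with RI(4) = 0.90.
n = A.shape[0]
lambda_max = eigvals.real[k]
CR = ((lambda_max - n) / (n - 1)) / 0.90
# CR < 0.1 indicates acceptably consistent judgments.
```

    The same eigenvector computation is then repeated for the alternatives under each criterion, and the final scores are the weighted sums of the local priorities.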

  2. Distillation modeling for a uranium refining process

    Energy Technology Data Exchange (ETDEWEB)

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.
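    As a hedged illustration of an equilibrium-type batch distillation model: for a binary mixture with constant relative volatility, the Rayleigh equation tracks how the liquid composition evolves as distillate is removed. The numbers are generic textbook values, not the LiCl-KCl/uranium system of the report.

```python
import numpy as np

alpha = 4.0                    # constant relative volatility (illustrative)
y_eq = lambda x: alpha * x / (1.0 + (alpha - 1.0) * x)   # VLE relation

# Rayleigh model: L dx = (x - y) dL for the liquid mole fraction x of the
# more-volatile component as liquid L is boiled off.
L, x = 1.0, 0.5                # normalized initial charge and composition
dL = 1e-4
history = [(L, x)]
while L > 0.2:                 # distill until 80% of the charge is removed
    x += (x - y_eq(x)) * dL / L    # Euler step of the Rayleigh equation
    L -= dL
    history.append((L, x))
# x now gives the residual-liquid composition; the volatile component is
# progressively depleted from the still pot, as expected for alpha > 1.
```

    A molecular (e.g. Langmuir-type) rate model would replace the equilibrium VLE relation with a vacuum evaporation-rate expression, which is the comparison the report describes.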

  3. Neural Underpinnings of Decision Strategy Selection: A Review and a Theoretical Model.

    Science.gov (United States)

    Wichary, Szymon; Smolen, Tomasz

    2016-01-01

    In multi-attribute choice, decision makers use decision strategies to arrive at the final choice. What are the neural mechanisms underlying decision strategy selection? The first goal of this paper is to provide a literature review on the neural underpinnings and cognitive models of decision strategy selection and thus set the stage for a neurocognitive model of this process. The second goal is to outline such a unifying, mechanistic model that can explain the impact of noncognitive factors (e.g., affect, stress) on strategy selection. To this end, we review the evidence for the factors influencing strategy selection, the neural basis of strategy use and the cognitive models of this process. We also present the Bottom-Up Model of Strategy Selection (BUMSS). The model assumes that the use of the rational Weighted Additive strategy and the boundedly rational heuristic Take The Best can be explained by one unifying, neurophysiologically plausible mechanism, based on the interaction of the frontoparietal network, orbitofrontal cortex, anterior cingulate cortex and the brainstem nucleus locus coeruleus. According to BUMSS, there are three processes that form the bottom-up mechanism of decision strategy selection and lead to the final choice: (1) cue weight computation, (2) gain modulation, and (3) weighted additive evaluation of alternatives. We discuss how these processes might be implemented in the brain, and how this knowledge allows us to formulate novel predictions linking strategy use and neural signals.
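    The three BUMSS processes listed above can be caricatured in a few lines: compute cue weights from validities, modulate them with a gain parameter, and evaluate alternatives additively. With low gain all cues contribute (WADD-like behavior); with high gain the weight mass collapses onto the most valid cue, reproducing Take The Best. The softmax gain mechanism and the numbers are assumptions for illustration, not the authors' exact equations.

```python
import numpy as np

def choose(cue_values, validities, gain):
    """Weighted additive choice with gain-modulated cue weights."""
    # (1) cue weight computation + (2) gain modulation via softmax
    w = np.exp(gain * validities)
    w /= w.sum()
    # (3) weighted additive evaluation of the alternatives
    scores = cue_values @ w
    return int(np.argmax(scores)), scores

validities = np.array([0.90, 0.60, 0.55, 0.52])
cues = np.array([[1, 0, 0, 0],     # alternative A: wins only the best cue
                 [0, 1, 1, 1]])    # alternative B: wins the three others

low_gain_choice, _ = choose(cues, validities, gain=1.0)    # WADD-like -> B
high_gain_choice, _ = choose(cues, validities, gain=50.0)  # TTB-like  -> A
```

    In BUMSS terms, the gain parameter would be driven by noncognitive factors such as arousal or stress, shifting behavior between compensatory and lexicographic strategies.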

  4. Neural Underpinnings of Decision Strategy Selection: A Review and a Theoretical Model

    Science.gov (United States)

    Wichary, Szymon; Smolen, Tomasz

    2016-01-01

    In multi-attribute choice, decision makers use decision strategies to arrive at the final choice. What are the neural mechanisms underlying decision strategy selection? The first goal of this paper is to provide a literature review on the neural underpinnings and cognitive models of decision strategy selection and thus set the stage for a neurocognitive model of this process. The second goal is to outline such a unifying, mechanistic model that can explain the impact of noncognitive factors (e.g., affect, stress) on strategy selection. To this end, we review the evidence for the factors influencing strategy selection, the neural basis of strategy use and the cognitive models of this process. We also present the Bottom-Up Model of Strategy Selection (BUMSS). The model assumes that the use of the rational Weighted Additive strategy and the boundedly rational heuristic Take The Best can be explained by one unifying, neurophysiologically plausible mechanism, based on the interaction of the frontoparietal network, orbitofrontal cortex, anterior cingulate cortex and the brainstem nucleus locus coeruleus. According to BUMSS, there are three processes that form the bottom-up mechanism of decision strategy selection and lead to the final choice: (1) cue weight computation, (2) gain modulation, and (3) weighted additive evaluation of alternatives. We discuss how these processes might be implemented in the brain, and how this knowledge allows us to formulate novel predictions linking strategy use and neural signals. PMID:27877103

  5. Neural underpinnings of decision strategy selection: a review and a theoretical model

    Directory of Open Access Journals (Sweden)

    Szymon Wichary

    2016-11-01

    Full Text Available In multi-attribute choice, decision makers use various decision strategies to arrive at the final choice. What are the neural mechanisms underlying decision strategy selection? The first goal of this paper is to provide a literature review on the neural underpinnings and cognitive models of decision strategy selection and thus set the stage for a unifying neurocognitive model of this process. The second goal is to outline such a unifying, mechanistic model that can explain the impact of noncognitive factors (e.g., affect, stress) on strategy selection. To this end, we review the evidence for the factors influencing strategy selection, the neural basis of strategy use and the cognitive models explaining this process. We also present the neurocognitive Bottom-Up Model of Strategy Selection (BUMSS). The model assumes that the use of the rational, normative Weighted Additive strategy and the boundedly rational heuristic Take The Best can be explained by one unifying, neurophysiologically plausible mechanism, based on the interaction of the frontoparietal network, orbitofrontal cortex, anterior cingulate cortex and the brainstem nucleus locus coeruleus. According to BUMSS, there are three processes that form the bottom-up mechanism of decision strategy selection and lead to the final choice: (1) cue weight computation, (2) gain modulation, and (3) weighted additive evaluation of alternatives. We discuss how these processes might be implemented in the brain, and how this knowledge allows us to formulate novel predictions linking strategy use and neurophysiological indices.

  6. Partner Selection Optimization Model of Agricultural Enterprises in Supply Chain

    OpenAIRE

    Feipeng Guo; Qibei Lu

    2013-01-01

    As correctly selecting partners in the supply chain becomes more and more important for agricultural enterprises, a large number of partner evaluation techniques are widely used in the field of agricultural science research. This study established a partner selection model to optimize agricultural supply chain partner selection. Firstly, it constructed a comprehensive evaluation index system after analyzing the real characteristics of the agricultural supply chain. Secondly, a heuristic met...

  7. Improved model management with aggregated business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Mans, R.S.; Toorn, van der R.A.

    2009-01-01

    Contemporary organizations invest much effort in creating models of their business processes. This raises the issue of how to deal with large sets of process models that become available over time. This paper proposes an extension of Event-driven Process Chains, called the aggregate EPC (aEPC),

  8. The MCDM Model for Personnel Selection Based on SWARA and ARAS Methods

    Directory of Open Access Journals (Sweden)

    Darjan Karabasevic

    2015-05-01

    Full Text Available Competent employees are the key resource in an organization for achieving success and, therefore, competitiveness on the market. The aim of the recruitment and selection process is to acquire personnel with the competencies required for a particular position within the company. Bearing in mind the fact that decision-makers have underused formal decision-making methods, this paper aims to establish an MCDM model for the evaluation and selection of candidates in the process of the recruitment and selection of personnel, based on the SWARA and ARAS methods. Apart from providing an MCDM model, the paper additionally provides a set of evaluation criteria for the position of a sales manager (middle management) in the telecommunication industry, which is also used in the numerical example. On the basis of the numerical example, the proposed MCDM model can be successfully used in selecting candidates in the process of employment.

  9. Effect of Model Selection on Computed Water Balance Components

    NARCIS (Netherlands)

    Jhorar, R.K.; Smit, A.A.M.F.R.; Roest, C.W.J.

    2009-01-01

    Soil water flow modelling approaches as used in four selected on-farm water management models, namely CROPWAT, FAIDS, CERES and SWAP, are compared through numerical experiments. The soil water simulation approaches used in the first three models are reformulated to incorporate all evapotranspiration

  10. Ensembling Variable Selectors by Stability Selection for the Cox Model

    Directory of Open Access Journals (Sweden)

    Qing-Yan Yin

    2017-01-01

    Full Text Available As a pivotal tool to build interpretive models, variable selection plays an increasingly important role in high-dimensional data analysis. In recent years, variable selection ensembles (VSEs) have gained much interest due to their many advantages. Stability selection (Meinshausen and Bühlmann, 2010), a VSE technique based on subsampling in combination with a base algorithm like the lasso, is an effective method to control the false discovery rate (FDR) and to improve selection accuracy in linear regression models. By adopting the lasso as a base learner, we attempt to extend stability selection to handle variable selection problems in a Cox model. According to our experience, it is crucial to set the regularization region Λ in the lasso and the parameter λmin properly so that stability selection can work well. To the best of our knowledge, however, there is no literature addressing this problem in an explicit way. Therefore, we first provide a detailed procedure to specify Λ and λmin. Then, some simulated and real-world data with various censoring rates are used to examine how well stability selection performs. It is also compared with several other variable selection approaches. Experimental results demonstrate that it achieves better or competitive performance in comparison with several other popular techniques.
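    The subsampling scheme described above is easy to sketch: run a base LASSO selector on many random half-samples and keep the variables whose selection frequency exceeds a threshold. The sketch below uses a squared-error LASSO on synthetic data rather than the Cox partial likelihood of the paper, and the regularization value is an illustrative assumption.

```python
import numpy as np

def lasso_cd(X, y, lam, n_sweeps=100):
    """Coordinate-descent LASSO (squared error + l1 penalty)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_sweeps):
        for j in range(p):
            rho = X[:, j] @ (y - X @ beta + X[:, j] * beta[j])
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

rng = np.random.default_rng(1)
n, p = 100, 20
X = rng.standard_normal((n, p))
true_beta = np.zeros(p)
true_beta[:3] = [3.0, 2.0, 1.5]
y = X @ true_beta + 0.1 * rng.standard_normal(n)

# Stability selection: run the base selector on B random half-samples
# and keep variables selected more often than the threshold pi.
B, pi = 50, 0.6
freq = np.zeros(p)
for _ in range(B):
    idx = rng.choice(n, size=n // 2, replace=False)
    b = lasso_cd(X[idx], y[idx], lam=10.0)
    freq += np.abs(b) > 1e-6
freq /= B
stable = set(np.flatnonzero(freq >= pi))
```

    For a Cox model, the inner call would be replaced by an l1-penalized partial-likelihood fit over a grid Λ, which is exactly the tuning question the paper addresses.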

  11. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia

    2017-01-01

    Full Text Available Taking into account the highly diverse tasks which employees have to fulfil due to the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products which meet market expectations. Therefore, organizations encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of the aforementioned process, which allows enterprises to develop a specific methodology. In order to design the conceptual model, Business Process Modelling instruments were employed - the Adonis Community Edition Business Process Management Toolkit using the ADONIS BPMS Notation. The conceptual model was developed based on in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended to meet an enterprise's specific requirements when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling, as they enable process analysis and evaluation (predefined/specific queries) and also model optimization (simulations).

  12. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

    Full Text Available Business process modelling and analysis is undoubtedly one of the most important parts of Applied (Business) Informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst's work is to create generally understandable, explicit and error-free models. If a process is properly described, the created models can be used as an input into deep analysis and optimization. It can be assumed that properly designed business process models (similarly to correctly written algorithms) contain characteristics that can be mathematically described, and that it will be possible to create a tool that helps process analysts design proper models. As part of this review, a systematic literature review was conducted in order to find and analyse business process model design and quality measures. It was found that the mentioned area has already been the subject of research investigation in the past. Thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed scientific publications and existing quality measures do not reflect all important attributes of a business process model's clarity, simplicity and completeness. Therefore, it would be appropriate to add new measures of quality.

  13. Validation of elk resource selection models with spatially independent data

    Science.gov (United States)

    Priscilla K. Coe; Bruce K. Johnson; Michael J. Wisdom; John G. Cook; Marty Vavra; Ryan M. Nielson

    2011-01-01

    Knowledge of how landscape features affect wildlife resource use is essential for informed management. Resource selection functions often are used to make and validate predictions about landscape use; however, resource selection functions are rarely validated with data from landscapes independent of those from which the models were built. This problem has severely...

  14. A Working Model of Natural Selection Illustrated by Table Tennis

    Science.gov (United States)

    Dinc, Muhittin; Kilic, Selda; Aladag, Caner

    2013-01-01

    Natural selection is one of the most important topics in biology and it helps to clarify the variety and complexity of organisms. However, students in almost every stage of education find it difficult to understand the mechanism of natural selection and they can develop misconceptions about it. This article provides an active model of natural…

  15. Augmented Self-Modeling as an Intervention for Selective Mutism

    Science.gov (United States)

    Kehle, Thomas J.; Bray, Melissa A.; Byer-Alcorace, Gabriel F.; Theodore, Lea A.; Kovac, Lisa M.

    2012-01-01

    Selective mutism is a rare disorder that is difficult to treat. It is often associated with oppositional defiant behavior, particularly in the home setting, social phobia, and, at times, autism spectrum disorder characteristics. The augmented self-modeling treatment has been relatively successful in promoting rapid diminishment of selective mutism…

  16. Response to selection in finite locus models with nonadditive effects

    NARCIS (Netherlands)

    Esfandyari, Hadi; Henryon, Mark; Berg, Peer; Thomasen, Jørn Rind; Bijma, Piter; Sørensen, Anders Christian

    2017-01-01

    Under the finite-locus model in the absence of mutation, the additive genetic variation is expected to decrease when directional selection is acting on a population, according to quantitative-genetic theory. However, some theoretical studies of selection suggest that the level of additive

  17. Online Rule Generation Software Process Model

    OpenAIRE

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    For production systems like expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required for the development of a web-based software model for decision rule generation. Royce’s final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of various steps of modified wat...

  18. Elementary Teachers' Selection and Use of Visual Models

    Science.gov (United States)

    Lee, Tammy D.; Gail Jones, M.

    2018-02-01

    As science grows in complexity, science teachers face an increasing challenge of helping students interpret models that represent complex science systems. Little is known about how teachers select and use models when planning lessons. This mixed methods study investigated the pedagogical approaches and visual models used by elementary in-service and preservice teachers in the development of a science lesson about a complex system (e.g., water cycle). Sixty-seven elementary in-service and 69 elementary preservice teachers completed a card sort task designed to document the types of visual models (e.g., images) that teachers choose when planning science instruction. Quantitative and qualitative analyses were conducted to analyze the card sort task. Semistructured interviews were conducted with a subsample of teachers to elicit the rationale for image selection. Results from this study showed that both experienced in-service teachers and novice preservice teachers tended to select similar models and use similar rationales for images to be used in lessons. Teachers tended to select models that were aesthetically pleasing and simple in design and illustrated specific elements of the water cycle. The results also showed that teachers were not likely to select images that represented the less obvious dimensions of the water cycle. Furthermore, teachers selected visual models more as a pedagogical tool to illustrate specific elements of the water cycle and less often as a tool to promote student learning related to complex systems.

  19. A decision model for the risk management of hazardous processes

    International Nuclear Information System (INIS)

    Holmberg, J.E.

    1997-03-01

    A decision model for risk management of hazardous processes as an optimisation problem of a point process is formulated in the study. In the approach, the decisions made by the management are divided into three categories: (1) planned process lifetime, (2) selection of the design and, (3) operational decisions. These three controlling methods play quite different roles in the practical risk management, which is also reflected in our approach. The optimisation of the process lifetime is related to the licensing problem of the process. It provides a boundary condition for a feasible utility function that is used as the actual objective function, i.e., maximizing the process lifetime utility. By design modifications, the management can affect the inherent accident hazard rate of the process. This is usually a discrete optimisation task. The study particularly concentrates upon the optimisation of the operational strategies given a certain design and licensing time. This is done by a dynamic risk model (marked point process model) representing the stochastic process of events observable or unobservable to the decision maker. An optimal long term control variable guiding the selection of operational alternatives in short term problems is studied. The optimisation problem is solved by the stochastic quasi-gradient procedure. The approach is illustrated by a case study. (23 refs.)

  20. Numerical simulation of complex part manufactured by selective laser melting process

    Science.gov (United States)

    Van Belle, Laurent

    2017-10-01

    The Selective Laser Melting (SLM) process, belonging to the family of Additive Manufacturing (AM) technologies, enables parts to be built layer by layer from metallic powder and a CAD model. The physical phenomena that occur in the process raise the same issues as conventional welding: thermal gradients generate significant residual stresses and distortions in the parts. Moreover, manufacturing large and complex parts accentuates these undesirable effects. It is therefore essential for manufacturers to gain a better understanding of the process and to ensure reliable production of parts with high added value. This paper focuses on the simulation of manufacturing a turbine by the SLM process in order to calculate residual stresses and distortions. Numerical results will be presented.

  1. The application of the FMEA method in the selected production process of a company

    Directory of Open Access Journals (Sweden)

    Piotr Barosz

    2018-04-01

    Full Text Available The aim of this article is to show the use of failure cause-and-effect analysis as a prevention tool in controlling the quality of a given production process in a company. The scope of the work covers an analysis of a selected process, definition of the inconsistencies present in this process, and then the FMEA analysis. A production company should implement thinking and actions based on the so-called ‘quality loop’: an interdependence model of the actions undertaken that shape quality, running from the identification of a customer’s requirements through design and the production process up to the assessment of how effectively the defined requirements are met. Applying such an approach makes it possible to take actions that improve the operation of quality management in a systemic way.

  2. Target Selection Models with Preference Variation Between Offenders

    NARCIS (Netherlands)

    Townsley, Michael; Birks, Daniel; Ruiter, Stijn; Bernasco, Wim; White, Gentry

    2016-01-01

    Objectives: This study explores preference variation in location choice strategies of residential burglars. Applying a model of offender target selection that is grounded in assertions of the routine activity approach, rational choice perspective, crime pattern and social disorganization theories,

  3. COPS model estimates of LLEA availability near selected reactor sites

    International Nuclear Information System (INIS)

    Berkbigler, K.P.

    1979-11-01

    The COPS computer model has been used to estimate local law enforcement agency (LLEA) officer availability in the neighborhood of selected nuclear reactor sites. The results of these analyses are presented in both graphic and tabular form in this report.

  4. Molecular modelling of a chemodosimeter for the selective detection ...

    Indian Academy of Sciences (India)

    Wintec

    Molecular modelling of a chemodosimeter for the selective detection of. As(III) ion in water. † ... high levels of arsenic cause severe skin diseases including skin cancer ..... Special Attention to Groundwater in SE Asia (eds) D. Chakraborti, A ...

  5. Evaluation of Models of the Reading Process.

    Science.gov (United States)

    Balajthy, Ernest

    A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…

  6. Edgar Schein's Process versus Content Consultation Models.

    Science.gov (United States)

    Rockwood, Gary F.

    1993-01-01

    Describes Schein's three models of consultation based on assumptions inherent in different helping styles: purchase of expertise and doctor-patient models, which focus on content of organization problems; and process consultation model, which focuses on how organizational problems are solved. Notes that Schein has suggested that consultants begin…

  7. Analysis of the resolution processes of three modeling tasks

    Directory of Open Access Journals (Sweden)

    Cèsar Gallart Palau

    2017-08-01

    Full Text Available In this paper we present a comparative analysis of the resolution processes of three modelling tasks performed by secondary education students (13-14 years), designed from three different points of view: Model-Eliciting Activities, the LEMA project, and Realistic Mathematical Problems. The purpose of this analysis is to obtain a methodological characterization of the tasks in order to provide secondary education teachers with a proper selection and sequencing of tasks for implementation in the classroom.

  8. Quantitative analytical hierarchy process to marketing store location selection

    Directory of Open Access Journals (Sweden)

    Harwati

    2018-01-01

    Full Text Available The selection of a store location to market a product is a Multi-Criteria Decision Making problem: the criteria involved conflict with each other, and a trade-off must be made to produce an optimal location. This research uses four important criteria, drawn from the literature, to select the new location of a marketing store: distance to the location, level of competition with competitors, number of potential customers, and rental cost. Quantitative data are used with the AHP method to determine the optimum location; quantitative data are preferred in order to avoid the inconsistency that can arise when using expert opinion. The AHP yields the optimum location among the three alternative sites.
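The AHP weighting step described above can be sketched in plain Python: a pairwise comparison matrix over the four criteria from the abstract (distance, competition, potential customers, rent cost) is reduced to a priority vector via power iteration, and a consistency ratio is computed. The comparison values and the random index below are illustrative assumptions, not figures from the study.

```python
# Sketch of the AHP prioritization step for store-location criteria.
# Pairwise comparison values are illustrative, not taken from the study.

def principal_eigen(matrix, iters=100):
    """Power iteration: returns (principal eigenvalue, normalized weight vector)."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # Estimate lambda_max from A.w / w averaged over components
    v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(v[i] / w[i] for i in range(n)) / n
    return lam, w

# Criteria order: distance, competition, potential customers, rent cost
A = [
    [1,     3,   1 / 2, 2],
    [1 / 3, 1,   1 / 4, 1],
    [2,     4,   1,     3],
    [1 / 2, 1,   1 / 3, 1],
]

lam, weights = principal_eigen(A)
CI = (lam - 4) / (4 - 1)   # consistency index for n = 4
CR = CI / 0.90             # random index RI = 0.90 for n = 4 (Saaty's table)
```

With this illustrative matrix, "potential customers" receives the largest weight and the consistency ratio stays below the conventional 0.1 threshold.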

  9. Distillation modeling for a uranium refining process

    International Nuclear Information System (INIS)

    Westphal, B.R.

    1996-01-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process

  10. Model Selection in Continuous Test Norming With GAMLSS.

    Science.gov (United States)

    Voncken, Lieke; Albers, Casper J; Timmerman, Marieke E

    2017-06-01

    To compute norms from reference group test scores, continuous norming is preferred over traditional norming. A suitable continuous norming approach for continuous data is the use of the Box-Cox Power Exponential model, which is found in the generalized additive models for location, scale, and shape (GAMLSS). Applying the Box-Cox Power Exponential model for test norming requires model selection, but it is unknown how well this can be done with an automatic selection procedure. In a simulation study, we compared the performance of two stepwise model selection procedures combined with four model-fit criteria (Akaike information criterion, Bayesian information criterion, generalized Akaike information criterion (3), cross-validation), varying data complexity, sampling design, and sample size in a fully crossed design. The new procedure combined with one of the generalized Akaike information criteria was the most efficient model selection procedure (i.e., required the smallest sample size). The advocated model selection procedure is illustrated with norming data from an intelligence test.
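The model-fit criteria named above can be made concrete with a minimal sketch: two candidate Gaussian models (constant mean versus linear mean) are fitted by maximum likelihood to synthetic data and compared by AIC, with the lower value selected. This is a deliberate simplification of the BCPE/GAMLSS setting; the data and the candidate models are assumptions for illustration only.

```python
import math, random

def gaussian_loglik(resid):
    """Maximized Gaussian log-likelihood given residuals (variance at its MLE)."""
    n = len(resid)
    s2 = sum(r * r for r in resid) / n
    return -0.5 * n * (math.log(2 * math.pi * s2) + 1)

def aic(loglik, k):
    """Akaike information criterion: 2k - 2*loglik (smaller is better)."""
    return 2 * k - 2 * loglik

random.seed(0)
x = [i / 10 for i in range(100)]
y = [2.0 * xi + random.gauss(0, 1) for xi in x]   # true relation is linear

# Candidate 1: constant mean (k = 2 parameters: mean, variance)
mu = sum(y) / len(y)
aic1 = aic(gaussian_loglik([yi - mu for yi in y]), 2)

# Candidate 2: linear mean (k = 3: intercept, slope, variance), least squares fit
mx = sum(x) / len(x)
b = sum((xi - mx) * (yi - mu) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
a = mu - b * mx
aic2 = aic(gaussian_loglik([yi - (a + b * xi) for xi, yi in zip(x, y)]), 3)

best = "linear" if aic2 < aic1 else "constant"
```

Because the data were generated with a genuine linear trend, the linear candidate attains the lower AIC despite its extra parameter.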

  11. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...

  12. Modeling closed nuclear fuel cycles processes

    Energy Technology Data Exchange (ETDEWEB)

    Shmidt, O.V. [A.A. Bochvar All-Russian Scientific Research Institute for Inorganic Materials, Rogova, 5a street, Moscow, 123098 (Russian Federation); Makeeva, I.R. [Zababakhin All-Russian Scientific Research Institute of Technical Physics, Vasiliev street 13, Snezhinsk, Chelyabinsk region, 456770 (Russian Federation); Liventsov, S.N. [Tomsk Polytechnic University, Tomsk, Lenin Avenue, 30, 634050 (Russian Federation)

    2016-07-01

    Computer models of processes are necessary for determining optimal operating conditions for closed nuclear fuel cycle (NFC) processes, and they can be quickly updated in accordance with fresh data from experimental research. Three kinds of process simulation are necessary. First, the VIZART software package provides balance-model development for calculating the material flow in technological processes, taking into account equipment capacity, transport lines and storage volumes. Secondly, it is necessary to simulate the physico-chemical processes that are involved in the closure of the NFC. The third kind of simulation is the development of software that allows the optimization, diagnostics and control of the processes, which implies real-time simulation of product flows for the whole plant or for separate lines of the plant. (A.C.)

  13. Improving the Air Force Squadron Command Selection Process

    Science.gov (United States)

    2017-04-19

    maturity in the unit, selectively-manned versus non-volunteer members, and individual motivations. These characteristics work in conjunction... maturity among members allows leaders to devote attention to larger problems and issues. Similarly, a solid staff of strong Senior NCOs, NCOs, and... personal interviews combined with psychological testing. Personality tests such as the Myers-Briggs Type Indicator, Judgement Index, Emotional

  14. The Selection of Bridge Materials Utilizing the Analytical Hierarchy Process

    Science.gov (United States)

    Robert L. Smith; Robert J. Bush; Daniel L. Schmoldt

    1997-01-01

    Effective decisions on the use of natural resources often require the input of many individuals. Determining how specific criteria affect the selection of materials can lead to better utilization of raw materials. Concrete, steel, and timber represent over 98% of the materials used for bridge construction in the United States. Highway officials must often consider...

  15. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    Science.gov (United States)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention to networked control, system decomposition and distributed models show significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned by an affinity propagation clustering algorithm into several clusters, each of which can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and the subsystem's controlled variables. Process decomposition is thus realised after the screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.

  16. The Use of Evolution in a Central Action Selection Model

    Directory of Open Access Journals (Sweden)

    F. Montes-Gonzalez

    2007-01-01

    Full Text Available The use of effective central selection provides flexibility in design by offering modularity and extensibility. In earlier papers we focused on the development of a simple centralized selection mechanism. Our current goal is to integrate evolutionary methods into the design of non-sequential behaviours and the tuning of specific parameters of the selection model. The foraging behaviour of an animal robot (animat) has been modelled in order to integrate the sensory information from the robot to perform selection that is nearly optimized by the use of genetic algorithms. In this paper we present how selection through optimization finally arranges the pattern of presented behaviours for the foraging task. Hence, the execution of specific parts in a behavioural pattern may be ruled out by the tuning of these parameters. Furthermore, the intensive use of colour segmentation from a colour camera for locating a cylinder places a burden on the calculations carried out by the genetic algorithm.
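The kind of evolutionary parameter tuning described above can be sketched with a minimal real-coded genetic algorithm: elitist selection, blend crossover and Gaussian mutation search for a parameter vector in the unit cube. The fitness function here is a hypothetical stand-in (distance to an assumed target vector), not the animat's actual foraging fitness.

```python
import random

def evolve(fitness, n_params, pop_size=30, gens=60, sigma=0.1, seed=1):
    """Minimal real-coded GA: elitist selection, blend crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0, 1) for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)   # maximize fitness
        elite = scored[: pop_size // 2]                   # keep the top half
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = rng.sample(elite, 2)
            child = [(a + b) / 2 + rng.gauss(0, sigma) for a, b in zip(p1, p2)]
            children.append([min(1.0, max(0.0, g)) for g in child])  # clamp to [0, 1]
        pop = elite + children
    return max(pop, key=fitness)

# Stand-in fitness: the "ideal" behaviour-weight vector is assumed to be (0.2, 0.8, 0.5)
target = (0.2, 0.8, 0.5)
fit = lambda p: -sum((a - b) ** 2 for a, b in zip(p, target))
best = evolve(fit, 3)
```

Because the elite is carried over unchanged each generation, the best fitness found is monotonically non-decreasing, which keeps this tiny GA stable even with aggressive mutation.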

  17. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  18. Improving permafrost distribution modelling using feature selection algorithms

    Science.gov (United States)

    Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail

    2016-04-01

    overall operation. It operates by constructing a large collection of decorrelated classification trees, and then predicts permafrost occurrence through a majority vote. With the so-called out-of-bag (OOB) error estimate, the classification of permafrost data can be validated and the contribution of each predictor assessed. The performance of the compared permafrost distribution models (computed on independent testing sets) increased with the application of FS algorithms to the original dataset, and irrelevant or redundant variables were removed. As a consequence, the process provided faster and more cost-effective predictors and a better understanding of the underlying structures residing in the permafrost data. Our work demonstrates the usefulness of a feature-selection step prior to applying a machine learning algorithm. In fact, permafrost predictors could be ranked not only based on their heuristic and subjective importance (expert knowledge), but also based on their statistical relevance in relation to the permafrost distribution.

  19. Model of diffusers / permeators for hydrogen processing

    International Nuclear Information System (INIS)

    Jacobs, W. D.; Hang, T.

    2008-01-01

    Palladium-silver (Pd-Ag) diffusers are mainstays of hydrogen processing. Diffusers separate hydrogen from inert species such as nitrogen, argon or helium. The tubing becomes permeable to hydrogen when heated to more than 250 C and a differential pressure is created across the membrane. The hydrogen diffuses better at higher temperatures. Experimental or experiential results have been the basis for determining or predicting a diffuser's performance. However, the process can be mathematically modeled, and comparison to experimental or other operating data can be utilized to improve the fit of the model. A reliable model-based diffuser system design is the goal which will have impacts on tritium and hydrogen processing. A computer model has been developed to solve the differential equations for diffusion given the operating boundary conditions. The model was compared to operating data for a low pressure diffuser system. The modeling approach and the results are presented in this paper. (authors)

  20. Modeling Business Processes in Public Administration

    Science.gov (United States)

    Repa, Vaclav

    During more than 10 years of its existence, business process modeling became a regular part of organization management practice. It is mostly regarded as a part of information system development or even as a way to implement some supporting technology (for instance a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself, because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is methodology, which postulates that information systems development provide business process management with exact methods and tools for modeling business processes. The methodology underlying the approach presented in this paper also has its roots in information systems development methodology.

  1. Mathematical model of seed germination process

    International Nuclear Information System (INIS)

    Gładyszewska, B.; Koper, R.; Kornarzyński, K.

    1999-01-01

    An analytical model of the seed germination process is described. The model, based on the proposed working hypothesis, leads by analogy to a law corresponding to the Verhulst-Pearl law known from the theory of population kinetics. The model was applied to describe the germination kinetics of tomato seeds of the Promyk field cultivar, biostimulated by laser treatment. Close agreement between experimental and model data was obtained.
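The Verhulst-Pearl (logistic) law invoked above can be made concrete with a short sketch: the closed-form logistic curve for cumulative germination, checked numerically against the logistic differential equation dN/dt = kN(1 − N/N_max). The parameter values are illustrative, not those fitted to the tomato seed data.

```python
import math

# Verhulst-Pearl (logistic) form for cumulative germination; illustrative parameters:
N_MAX = 100.0   # final germination level (%)
K = 0.8         # rate constant (1/day)
A = 50.0        # fixed by the initial condition N(0) = N_MAX / (1 + A)

def n(t):
    """Closed-form logistic solution N(t) = N_max / (1 + A * exp(-k t))."""
    return N_MAX / (1.0 + A * math.exp(-K * t))

def dn_dt(t, h=1e-6):
    """Central-difference numerical derivative of N(t)."""
    return (n(t + h) - n(t - h)) / (2 * h)

def rhs(t):
    """Right-hand side of the logistic ODE: k N (1 - N / N_max)."""
    return K * n(t) * (1.0 - n(t) / N_MAX)
```

The numerical derivative of the closed form agrees with the ODE's right-hand side at every time point, which is the sense in which the germination curve "obeys" the Verhulst-Pearl law.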

  2. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: "Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model". One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  3. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    International Nuclear Information System (INIS)

    E.L. Hardin

    2000-01-01

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: "Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model". One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  4. Effect of Processing on the Elemental Composition of Selected Leafy ...

    African Journals Online (AJOL)

    The elemental composition of leaves of Vernonia amygdalina, Gnetum africana, Gongronema latifolium and Ocimum gratissimum subjected to different processing methods were investigated. Processing methods employed include oven drying, sun drying, fresh milling, steaming and a combination of these while the mineral ...

  5. Selection of parameters for advanced machining processes using firefly algorithm

    Directory of Open Access Journals (Sweden)

    Rajkamal Shukla

    2017-02-01

    Full Text Available Advanced machining processes (AMPs) are widely utilized in industries for machining complex geometries and intricate profiles. In this paper, two significant processes, electric discharge machining (EDM) and abrasive water jet machining (AWJM), are considered in order to obtain optimum values of the responses for the given range of process parameters. The firefly algorithm (FA) is applied to the considered processes to obtain optimized parameters, and the results obtained are compared with the results given by previous researchers. The variation of process parameters with respect to the responses is plotted to confirm the optimum results obtained using FA. In the EDM process, the performance parameter “MRR” is increased from 159.70 gm/min to 181.6723 gm/min, while “Ra” and “REWR” are decreased from 6.21 μm to 3.6767 μm and 6.21% to 6.324 × 10−5% respectively. In the AWJM process, the values of “kerf” and “Ra” are decreased from 0.858 mm to 0.3704 mm and 5.41 mm to 4.443 mm respectively. In both processes, the obtained results show a significant improvement in the responses.
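The firefly algorithm itself can be sketched compactly. In the version below, dimmer flies move toward brighter (lower-objective) ones with an attractiveness that decays with distance, plus a damped random walk. It is demonstrated on a sphere test function standing in for the fitted EDM/AWJM response models, which the abstract does not give; the parameter values are conventional assumptions.

```python
import math, random

def firefly_minimize(f, bounds, n=20, iters=100, beta0=1.0, gamma=0.1, alpha=0.2, seed=3):
    """Basic firefly algorithm: dimmer flies move toward brighter (lower-f) ones."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    for _ in range(iters):
        I = [f(x) for x in X]                    # light intensity = objective value
        for i in range(n):
            for j in range(n):
                if I[j] < I[i]:                  # fly j is brighter: move i toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
                    beta = beta0 * math.exp(-gamma * r2)   # attractiveness decays with distance
                    X[i] = [
                        min(hi, max(lo, a + beta * (b - a) + alpha * (rng.random() - 0.5)))
                        for (a, b), (lo, hi) in zip(zip(X[i], X[j]), bounds)
                    ]
                    I[i] = f(X[i])
        alpha *= 0.97                            # damp the random walk over time
    best = min(X, key=f)
    return best, f(best)

# Illustrative objective (sphere function) standing in for a fitted response model
best, val = firefly_minimize(lambda x: sum(xi ** 2 for xi in x), [(-2, 2), (-2, 2)])
```

Damping `alpha` lets the swarm explore early and settle late, which is the usual trick for getting the basic firefly algorithm to converge tightly on smooth objectives.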

  6. Substitution of Organic Solvents in Selected Industrial Cleaning Processes

    DEFF Research Database (Denmark)

    Jacobsen, Thomas; Rasmussen, Pia Brunn

    1997-01-01

    Volatile organic solvents (VOC) are becoming increasingly unwanted in industrial processes. Substitution of VOC with non-volatile, low-toxicity compounds is one possibility for reducing VOC use. It has been successfully demonstrated that organic solvents used in cleaning processes in sheet offset printing

  7. Fast Bayesian Inference in Dirichlet Process Mixture Models.

    Science.gov (United States)

    Wang, Lianming; Dunson, David B

    2011-01-01

    There has been increasing interest in applying Bayesian nonparametric methods in large samples and high dimensions. As Markov chain Monte Carlo (MCMC) algorithms are often infeasible, there is a pressing need for much faster algorithms. This article proposes a fast approach for inference in Dirichlet process mixture (DPM) models. Viewing the partitioning of subjects into clusters as a model selection problem, we propose a sequential greedy search algorithm for selecting the partition. Then, when conjugate priors are chosen, the resulting posterior conditional on the selected partition is available in closed form. This approach allows testing of parametric models versus nonparametric alternatives based on Bayes factors. We evaluate the approach using simulation studies and compare it with four other fast nonparametric methods in the literature. We apply the proposed approach to three datasets, including one from a large epidemiologic study. Matlab code for the simulation and data analyses using the proposed approach is available online in the supplemental materials.
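The sequential greedy search idea can be sketched for the simplest conjugate case, a Normal-Normal model with known observation variance: each point joins the existing cluster (or opens a new one) that maximizes a CRP-weighted posterior predictive score. The hyperparameters (SIGMA2, TAU2, the concentration ALPHA) and the scoring weights are assumptions for illustration; the paper's actual models and search details may differ.

```python
import math

SIGMA2 = 1.0    # known observation variance (assumption)
TAU2 = 100.0    # prior variance of a cluster mean (assumption)
ALPHA = 1.0     # concentration parameter for opening a new cluster (assumption)

def log_predictive(y, cluster):
    """Log posterior-predictive density of y under a Normal-Normal cluster model."""
    n = len(cluster)
    if n == 0:
        mean, var = 0.0, TAU2 + SIGMA2            # prior predictive for an empty cluster
    else:
        prec = 1.0 / TAU2 + n / SIGMA2            # posterior precision of the cluster mean
        mean = (sum(cluster) / SIGMA2) / prec     # posterior mean (prior mean 0)
        var = 1.0 / prec + SIGMA2                 # predictive variance
    return -0.5 * (math.log(2 * math.pi * var) + (y - mean) ** 2 / var)

def greedy_partition(data):
    """Sequentially assign each point to the cluster maximizing the CRP-weighted
    posterior predictive; open a new cluster when that scores higher."""
    clusters = []
    for y in data:
        scores = [math.log(len(c)) + log_predictive(y, c) for c in clusters]
        scores.append(math.log(ALPHA) + log_predictive(y, []))   # new-cluster option
        k = max(range(len(scores)), key=scores.__getitem__)
        if k == len(clusters):
            clusters.append([])
        clusters[k].append(y)
    return clusters

parts = greedy_partition([0.1, -0.2, 0.05, 10.2, 9.9, 10.05])
```

On this toy input the greedy pass recovers the two well-separated groups in a single sweep, with the posterior within each cluster available in closed form as the abstract describes.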

  8. Heat transfer modelling and stability analysis of selective laser melting

    International Nuclear Information System (INIS)

    Gusarov, A.V.; Yadroitsev, I.; Bertrand, Ph.; Smurov, I.

    2007-01-01

    The process of direct manufacturing by selective laser melting basically consists of laser beam scanning over a thin powder layer deposited on a dense substrate. Complete remelting of the powder in the scanned zone and its good adhesion to the substrate ensure obtaining functional parts with improved mechanical properties. Experiments with single-line scanning indicate that an interval of scanning velocities exists where the remelted tracks are uniform. The tracks become broken if the scanning velocity is outside this interval. This is extremely undesirable and referred to as the 'balling' effect. A numerical model of coupled radiation and heat transfer is proposed to analyse the observed instability. The 'balling' effect at high scanning velocities (above ∼20 cm/s for the present conditions) can be explained by the Plateau-Rayleigh capillary instability of the melt pool. Two factors stabilize the process as the scanning velocity decreases: reducing the length-to-width ratio of the melt pool and increasing the width of its contact with the substrate.

  9. Guide for selection of dosimetry system for electron processing

    International Nuclear Information System (INIS)

    Mehta, K.

    1988-01-01

    Correct applications of radiation processing depend on accurate measurements of absorbed radiation dose. Radiation dosimetry plays several important roles in radiation processing. In particular, there are three stages for any radiation process during which dosimetry is a key to success: basic laboratory research, commissioning of the process and quality control. Radiation dosimeters may be divided into various classes depending upon their areas of applications and their relative quality: primary standard dosimeter, reference standard dosimeter, transfer standard dosimeter and routine in-house dosimeter. Several commercially available dosimeters are described under each class, and their advantages and limitations are discussed. Finally, recommendations are made as to which dosimeter is most suitable for each of the three stages of electron-beam processing. 124 refs

  10. Statistical model selection with “Big Data”

    Directory of Open Access Journals (Sweden)

    Jurgen A. Doornik

    2015-12-01

    Full Text Available Big Data offer potential benefits for statistical modelling, but confront problems including an excess of false positives, mistaking correlations for causes, ignoring sampling biases and selecting by inappropriate methods. We consider the many important requirements when searching for a data-based relationship using Big Data, and the possible role of Autometrics in that context. Paramount considerations include embedding relationships in general initial models, possibly restricting the number of variables to be selected over by non-statistical criteria (the formulation problem); using good quality data on all variables, analyzed with tight significance levels by a powerful selection procedure, retaining available theory insights (the selection problem); and testing for relationships being well specified and invariant to shifts in explanatory variables (the evaluation problem), using a viable approach that resolves the computational problem of immense numbers of possible models.
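
    The need for tight significance levels follows from simple arithmetic: under the null, each irrelevant candidate variable is retained with probability α, so the expected number of false positives scales with the number of candidates. A sketch (illustrative only, not Autometrics itself):

    ```python
    def expected_false_positives(n_candidates, alpha):
        # Expected count of irrelevant variables retained by chance when
        # each candidate is tested independently at significance level alpha.
        return n_candidates * alpha

    # At a conventional 5% level, screening 1000 irrelevant candidates
    # retains about 50 of them by chance alone; tightening alpha to
    # 1/N keeps the expected count at one.
    ```

    For 1000 candidates, `expected_false_positives(1000, 0.05)` is 50, while `expected_false_positives(1000, 0.001)` is about 1, which is why automated searches over large candidate sets tighten the significance level as the set grows.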

  11. Multivariate fault isolation of batch processes via variable selection in partial least squares discriminant analysis.

    Science.gov (United States)

    Yan, Zhengbing; Kuang, Te-Hui; Yao, Yuan

    2017-09-01

    In recent years, multivariate statistical monitoring of batch processes has become a popular research topic, wherein multivariate fault isolation is an important step aiming at the identification of the faulty variables contributing most to the detected process abnormality. Although contribution plots have been commonly used in statistical fault isolation, such methods suffer from the smearing effect between correlated variables. In particular, in batch process monitoring, the high autocorrelations and cross-correlations that exist in variable trajectories make the smearing effect unavoidable. To address such a problem, a variable selection-based fault isolation method is proposed in this research, which transforms the fault isolation problem into a variable selection problem in partial least squares discriminant analysis and solves it by calculating a sparse partial least squares model. Unlike traditional methods, the proposed method emphasizes the relative importance of each process variable. Such information may help process engineers in conducting root-cause diagnosis. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  12. Ontological Model of Business Process Management Systems

    Science.gov (United States)

    Manoilov, G.; Deliiska, B.

    2008-10-01

    The activities which constitute business process management (BPM) can be grouped into five categories: design, modeling, execution, monitoring and optimization. Dedicated software packages for business process management systems (BPMS) are available on the market, but the efficiency of their exploitation depends on the ontological model used at development time and run time of the system. In this article an ontological model of a BPMS in the software industry is investigated. The model building is preceded by a conceptualization of the domain and a taxonomy of BPMS development. On the basis of the taxonomy, a simple online thesaurus is created.

  13. Seeking inclusion in an exclusive process: discourses of medical school student selection.

    Science.gov (United States)

    Razack, Saleem; Hodges, Brian; Steinert, Yvonne; Maguire, Mary

    2015-01-01

    Calls to increase medical class representativeness to better reflect the diversity of society represent a growing international trend. There is an inherent tension between these calls and competitive student selection processes driven by academic achievement. How is this tension manifested? Our three-phase interdisciplinary research programme focused on the discourses of excellence, equity and diversity in the medical school selection process, as conveyed by key stakeholders: (i) institutions and regulatory bodies (the websites of 17 medical schools and 15 policy documents from national regulatory bodies); (ii) admissions committee members (ACMs) (according to semi-structured interviews [n = 9]), and (iii) successful applicants (according to semi-structured interviews [n = 14]). The work is theoretically situated within the works of Foucault, Bourdieu and Bakhtin. The conceptual framework is supplemented by critical hermeneutics and the performance theories of Goffman. Academic excellence discourses consistently predominate over discourses calling for greater representativeness in medical classes. Policy addressing demographic representativeness in medicine may unwittingly contribute to the reproduction of historical patterns of exclusion of under-represented groups. In ACM selection practices, another discursive tension is exposed as the inherent privilege in the process is marked, challenging the ideal of medicine as a meritocracy. Applicants' representations of self in the 'performance' of interviewing demonstrate implicit recognition of the power inherent in the act of selection and are manifested in the use of explicit strategies to 'fit in'. How can this critical discourse analysis inform improved inclusiveness in student selection? Policymakers addressing diversity and equity issues in medical school admissions should explicitly recognise the power dynamics at play between the profession and marginalised groups. For greater inclusion and to avoid one

  14. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Full Text Available Background: Competitive intelligence (CI) provides actionable intelligence, which gives enterprises a competitive edge. However, without a proper process, it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI, with the aims of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, in which the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases, where the output of one phase is the input of the next. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  15. Factors influencing equipment selection in electron beam processing

    Science.gov (United States)

    Barnard, J. W.

    2003-08-01

    During the eighties and nineties, accelerator manufacturers dramatically increased the beam power available from high-energy equipment. This effort was directed primarily at meeting the demands of the sterilization industry. During this era, the perception that bigger (higher power, higher energy) was always better prevailed, since the operating and capital costs of accelerators did not increase with power and energy as fast as the throughput. High power was needed to keep per-unit treatment costs low. This philosophy runs counter to certain present-day realities of the sterilization business, as well as to conditions influencing accelerator selection in other electron beam applications. Recent experience in machine selection is described and factors affecting choice are presented.

  16. Selection of radioactive waste disposal site considering natural processes

    International Nuclear Information System (INIS)

    Nakamura, H.

    1991-01-01

    To dispose of radioactive waste, it is necessary to consider the transfer of material in the natural environment. The points of consideration are: 1) long residence time of water; 2) independence of the biosphere from the compartment containing the disposal site in the natural hydrologic cycle; 3) dilution with natural inactive isotopes or the same group of elements. Isotope dilution for 129I and 14C can be expected by proper selection of the site. 241Am and 239Pu will be homogenized into soil or sediment together with insoluble elements such as iron and aluminium. For 237Np and 99Tc, anionic conditions are important for the selection. From the point of view of the hydrologic cycle, an anoxic dead-water zone, avoiding areas beneath mountains, is preferable for the disposal site. (author)

  17. Gender Inequality and Emigration: Push factor or Selection process?

    OpenAIRE

    Baudassé, Thierry; Bazillier, Rémi

    2012-01-01

    Our objective in this research is to provide empirical evidence relating to the linkages between gender equality and international emigration. Two theoretical hypotheses can be made for the purpose of analyzing such linkages. The first is that gender inequality in origin countries could be a push factor for women. The second is that gender inequality may create a "gender bias" in the selection of migrants within a household or a community. An improvement of gender equality would then inc...

  18. CHAIN-WISE GENERALIZATION OF ROAD NETWORKS USING MODEL SELECTION

    Directory of Open Access Journals (Sweden)

    D. Bulatov

    2017-05-01

    Full Text Available Streets are essential entities of urban terrain, and their automated extraction from airborne sensor data is cumbersome because of a complex interplay of geometric, topological and semantic aspects. Given a binary image representing the road class, centerlines of road segments are extracted by means of skeletonization. The focus of this paper lies in a well-reasoned representation of these segments by means of geometric primitives, such as straight line segments as well as circle and ellipse arcs. We propose the fusion of raw segments based on similarity criteria; the output of this process is a set of so-called chains, which better match the intuitive perception of what a street is. Further, we propose a two-step approach for chain-wise generalization: first, the chain is pre-segmented using circlePeucker, and finally, model selection is used to decide whether two neighboring segments should be fused into a new geometric entity. Thereby, we consider both variance-covariance analysis of residuals and model complexity. The results on a complex dataset with many traffic roundabouts indicate the benefits of the proposed procedure.
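
    The "fuse or keep separate" decision can be sketched as a model-selection comparison: fit one primitive to the union of two neighboring segments versus one primitive each, trading residual error against model complexity. The minimal illustration below uses straight lines and BIC as the complexity criterion (the paper's actual criterion rests on variance-covariance analysis of the residuals; all function names here are illustrative):

    ```python
    import math

    def fit_line(xs, ys):
        # Least-squares line fit y = a + b*x; returns residual sum of squares.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        b = sxy / sxx if sxx else 0.0
        a = my - b * mx
        return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

    def bic(rss, n, k, floor=1e-12):
        # Bayesian information criterion for a Gaussian-error fit;
        # the floor guards against log(0) on perfect fits.
        return n * math.log(max(rss / n, floor)) + k * math.log(n)

    def should_fuse(seg1, seg2):
        xs1, ys1 = seg1
        xs2, ys2 = seg2
        n = len(xs1) + len(xs2)
        rss_joint = fit_line(xs1 + xs2, ys1 + ys2)
        rss_split = fit_line(xs1, ys1) + fit_line(xs2, ys2)
        # One line has 2 parameters, two lines have 4: fuse when the
        # simpler joint model wins on BIC (lower is better).
        return bic(rss_joint, n, 2) <= bic(rss_split, n, 4)
    ```

    Two collinear segments are fused into one line, whereas two segments meeting at a corner are kept separate, because the drop in residual error no longer justifies the extra parameters.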

  19. A computational neural model of goal-directed utterance selection.

    Science.gov (United States)

    Klein, Michael; Kamp, Hans; Palm, Guenther; Doya, Kenji

    2010-06-01

    It is generally agreed that much of human communication is motivated by extra-linguistic goals: we often make utterances in order to get others to do something, or to make them support our cause, or adopt our point of view, etc. However, thus far a computational foundation for this view on language use has been lacking. In this paper we propose such a foundation using Markov Decision Processes. We borrow computational components from the field of action selection and motor control, where a neurobiological basis of these components has been established. In particular, we make use of internal models (i.e., next-state transition functions defined on current state-action pairs). The internal model is coupled with reinforcement learning of a value function that is used to assess the desirability of any state that utterances (as well as certain non-verbal actions) can bring about. This cognitive architecture is tested in a number of multi-agent game simulations. In these computational experiments an agent learns to predict the context-dependent effects of utterances by interacting with other agents that are already competent speakers. We show that the cognitive architecture can account for acquiring the capability of deciding when to speak in order to achieve a certain goal (instead of performing a non-verbal action or simply doing nothing), whom to address and what to say. Copyright 2010 Elsevier Ltd. All rights reserved.
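
    The core machinery, a value function over states used to score what an utterance would bring about, can be sketched with textbook value iteration on a toy two-state "conversation" MDP. The states, rewards and transition probabilities below are invented purely for illustration:

    ```python
    def value_iteration(states, actions, P, R, gamma=0.9, iters=100):
        # P[s][a] maps successor states to probabilities (the internal model);
        # R[s][a] is the immediate reward for taking action a in state s.
        V = {s: 0.0 for s in states}
        for _ in range(iters):
            V = {s: max(R[s][a] + gamma * sum(p * V[s2]
                                              for s2, p in P[s][a].items())
                        for a in actions)
                 for s in states}
        policy = {s: max(actions,
                         key=lambda a: R[s][a] + gamma * sum(
                             p * V[s2] for s2, p in P[s][a].items()))
                  for s in states}
        return V, policy

    # Toy model: in state 'far' the goal is not yet reached; speaking moves
    # the listener to 'goal', staying silent leaves the state unchanged.
    states = ["far", "goal"]
    actions = ["speak", "silent"]
    P = {"far":  {"speak": {"goal": 1.0}, "silent": {"far": 1.0}},
         "goal": {"speak": {"goal": 1.0}, "silent": {"goal": 1.0}}}
    R = {"far":  {"speak": 1.0, "silent": 0.0},
         "goal": {"speak": 0.0, "silent": 0.0}}
    V, policy = value_iteration(states, actions, P, R)
    ```

    The learned policy chooses to speak in the 'far' state because the value function credits the state the utterance brings about, which is the mechanism the paper couples with learned internal models.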

  20. Training Self-Regulated Learning Skills with Video Modeling Examples: Do Task-Selection Skills Transfer?

    Science.gov (United States)

    Raaijmakers, Steven F.; Baars, Martine; Schaap, Lydia; Paas, Fred; van Merriënboer, Jeroen; van Gog, Tamara

    2018-01-01

    Self-assessment and task-selection skills are crucial in self-regulated learning situations in which students can choose their own tasks. Prior research suggested that training with video modeling examples, in which another person (the model) demonstrates and explains the cyclical process of problem-solving task performance, self-assessment, and…

  1. Selecting device for processing method of radioactive gaseous wastes

    International Nuclear Information System (INIS)

    Sasaki, Ryoichi; Komoda, Norihisa.

    1976-01-01

    Object: To extend the replacement interval of a filter for adsorbing radioactive material, by discharging waste gas containing radioactive material produced from atomic power equipment after treating it by a method selected on the basis of wind-direction measurements. Structure: Exhaust gas containing radioactive material produced from atomic power equipment is discharged after treatment by a method selected on the basis of wind-direction measurements. For instance, in the case of a sea wind, the waste gas passes through a route selected for this case and is discharged through the waste gas outlet. When the sea wind disappears (that is, when a land wind or calm sets in), the exhaust gas is switched to the route for the other case, so that it passes through a filter consisting of active carbon, where the radioactive material is removed through adsorption. The waste gas, now free from radioactive material, is discharged through the waste gas outlet. (Moriyama, K.)

  2. Model for analyzing decontamination process systems

    International Nuclear Information System (INIS)

    Boykin, R.F.; Rolland, C.W.

    1979-06-01

    Selection of equipment and the design of a new facility that minimize cost and maximize capacity is a problem managers face many times in the operations of a manufacturing organization. This paper deals with the actual analysis of equipment facility design for a decontamination operation. Discussions of the equipment selection method and the development of the facility design criteria are presented, along with insight into the problems encountered in the equipment analysis for a new decontamination facility. The presentation also includes a review of the transition from the old facility into the new facility and the process used to minimize the cost and conveyance problems of the transition

  3. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  4. Comparison of climate envelope models developed using expert-selected variables versus statistical selection

    Science.gov (United States)

    Brandt, Laura A.; Benscoter, Allison; Harvey, Rebecca G.; Speroterra, Carolina; Bucklin, David N.; Romañach, Stephanie; Watling, James I.; Mazzotti, Frank J.

    2017-01-01

    Climate envelope models are widely used to describe potential future distribution of species under different climate change scenarios. It is broadly recognized that there are both strengths and limitations to using climate envelope models and that outcomes are sensitive to initial assumptions, inputs, and modeling methods. Selection of predictor variables, a central step in modeling, is one of the areas where different techniques can yield varying results. Selection of climate variables to use as predictors is often done using statistical approaches that develop correlations between occurrences and climate data. These approaches have received criticism in that they rely on the statistical properties of the data rather than directly incorporating biological information about species responses to temperature and precipitation. We evaluated and compared models and prediction maps for 15 threatened or endangered species in Florida based on two variable selection techniques: expert opinion and a statistical method. We compared model performance between these two approaches for contemporary predictions, and the spatial correlation, spatial overlap and area predicted for contemporary and future climate predictions. In general, experts identified more variables as being important than the statistical method, and there was low overlap in the variable sets, although model performance was high for both approaches (greater than 0.9 for area under the curve (AUC) and greater than 0.7 for true skill statistic (TSS)). Spatial overlap, which compares the spatial configuration between maps constructed using the different variable selection techniques, was only moderate overall (about 60%), with a great deal of variability across species. The difference in spatial overlap was even greater under future climate projections, indicating additional divergence of model outputs from different variable selection techniques. Our work is in agreement with other studies which have found that for broad-scale species distribution modeling, using statistical methods of variable

  5. Modeling of Dielectric Heating within Lyophilization Process

    Directory of Open Access Journals (Sweden)

    Jan Kyncl

    2014-01-01

    Full Text Available A process of lyophilization of paper books is modeled. The process of drying is controlled by a dielectric heating system. From the physical viewpoint, the task represents a 2D coupled problem described by two partial differential equations for the electric and temperature fields. The material parameters are supposed to be temperature-dependent functions. The continuous mathematical model is solved numerically. The methodology is illustrated with some examples whose results are discussed.

  6. Histogram bin width selection for time-dependent Poisson processes

    International Nuclear Information System (INIS)

    Koyama, Shinsuke; Shinomoto, Shigeru

    2004-01-01

    In constructing a time histogram of the event sequences derived from a nonstationary point process, we wish to determine the bin width such that the mean squared error of the histogram from the underlying rate of occurrence is minimized. We find that the optimal bin widths obtained for a doubly stochastic Poisson process and a sinusoidally regulated Poisson process exhibit different scaling relations with respect to the number of sequences, time scale and amplitude of rate modulation, but both diverge under similar parametric conditions. This implies that under these conditions, no determination of the time-dependent rate can be made. We also apply the kernel method to these point processes, and find that the optimal kernels do not exhibit any critical phenomena, unlike the time histogram method

  7. Histogram bin width selection for time-dependent Poisson processes

    Energy Technology Data Exchange (ETDEWEB)

    Koyama, Shinsuke; Shinomoto, Shigeru [Department of Physics, Graduate School of Science, Kyoto University, Sakyo-ku, Kyoto 606-8502 (Japan)

    2004-07-23

    In constructing a time histogram of the event sequences derived from a nonstationary point process, we wish to determine the bin width such that the mean squared error of the histogram from the underlying rate of occurrence is minimized. We find that the optimal bin widths obtained for a doubly stochastic Poisson process and a sinusoidally regulated Poisson process exhibit different scaling relations with respect to the number of sequences, time scale and amplitude of rate modulation, but both diverge under similar parametric conditions. This implies that under these conditions, no determination of the time-dependent rate can be made. We also apply the kernel method to these point processes, and find that the optimal kernels do not exhibit any critical phenomena, unlike the time histogram method.
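
    The minimum-MSE criterion described in this record has a simple empirical counterpart for Poisson-distributed counts, the cost C(w) = (2·mean − variance)/w² of the bin counts, due to the same authors (Shimazaki and Shinomoto); the width minimizing this cost estimates the width minimizing the mean squared error. A sketch:

    ```python
    def binwidth_cost(events, t_start, t_end, width):
        # Empirical cost C(w) = (2*mean - var) / w^2 over the bin counts:
        # minimizing it over w estimates the minimum-MSE bin width
        # for Poisson-distributed counts (Shimazaki & Shinomoto).
        nbins = max(1, int((t_end - t_start) / width))
        counts = [0] * nbins
        for t in events:
            i = min(int((t - t_start) / width), nbins - 1)
            counts[i] += 1
        mean = sum(counts) / nbins
        var = sum((c - mean) ** 2 for c in counts) / nbins  # biased variance
        return (2 * mean - var) / width ** 2

    def best_binwidth(events, t_start, t_end, candidates):
        # Scan a list of candidate widths and return the one with lowest cost.
        return min(candidates, key=lambda w: binwidth_cost(events, t_start, t_end, w))
    ```

    For a perfectly regular (effectively constant-rate) event sequence the cost keeps falling as the width grows, so the largest candidate is selected, echoing the divergence of the optimal width described in the abstract.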

  8. Selectivity of radiation-induced processes in hydrocarbons, related polymers and organized polymer systems

    International Nuclear Information System (INIS)

    Feldman, V.I.; Sukhov, F.F.; Zezin, A.A.; Orlov, A.Yu.

    1999-01-01

    Fundamental aspects of the selectivity of radiation-induced events in polymers and polymeric systems were considered: (1) The grounds of selectivity of the primary events were analyzed on the basis of the results of studies of model compounds (molecular aspect). Basic results were obtained for hydrocarbon molecules irradiated in low-temperature matrices. The effects of selective localization of the primary events on the radical formation were examined for several polymers irradiated at low and superlow temperatures (77 and 15 K). A remarkable correlation between the properties of prototype ionized molecules (radical cations) and selectivity of the primary bond rupture in the corresponding polymers were found for polyethylene, polystyrene and some other hydrocarbon polymers. The first direct indication of selective localization of primary events at conformational defects was obtained for oriented high-crystalline polyethylene irradiated at 15 K. The significance of dimeric ring association was proved for the radiation chemistry of polystyrene. Specific mechanisms of low-temperature radiation-induced degradation were also analyzed for polycarbonate and poly(alkylene terephthalates). (2) Specific features of the localization of primary radiation-induced events in microheterogeneous polymeric systems were investigated (microstructural aspect). It was found that the interphase processes played an important role in the radiation chemistry of such systems. The interphase electron migration may result in both positive and negative non-additive effects in the formation of radiolysis products. The effects of component diffusion and chemical reactions on the radiation-induced processes in microheterogeneous polymeric systems were studied with the example of polycarbonate - poly(alkylene terephthalate) blends. (3) The effects of restricted molecular motion on the development of the radiation-chemical processes in polymers were investigated (dynamic aspect). In particular, it

  9. Determination of Properties of Selected Fresh and Processed Medicinal Plants

    Directory of Open Access Journals (Sweden)

    Shirley G. Cabrera

    2015-11-01

    Full Text Available The study aimed to determine the chemical properties, bioactive compounds, antioxidant activity and toxicity level of fresh and processed medicinal plants such as corn (Zea mays) silk, pancit-pancitan (Peperomia pellucida) leaves, pandan (Pandanus amaryllifolius) leaves, and commercially available tea. The toxicity level of the samples was measured using the Brine Shrimp Lethality Assay (BSLA). Statistical analysis was done using the Statistical Package for the Social Sciences (SPSS). Results showed a significant difference in chemical properties between fresh and processed corn silk, except in crude fiber content. Significant differences in the proximate analyses of fresh and processed medicinal plants, specifically in % moisture, % crude protein and % total carbohydrates, were also observed. In addition, there is also a significant difference in bioactive compound contents, such as total flavonoids and total phenolics, between fresh and processed corn silk, except in total vitamin E (TVE) content. Pandan and pancit-pancitan showed significant differences in all bioactive compounds except in total antioxidant content (TAC). Fresh pancit-pancitan has the highest total phenolics content (TPC) and TAC, while fresh and processed corn silk have the lowest TAC and TVE content, respectively. Furthermore, the BSLA results for the three medicinal plants and commercially available tea extract showed a significant difference in toxicity level after 24 hours of exposure. The percentage mortality increased with an increase in exposure time for the three medicinal plants and tea extract. The results of the study can serve as baseline data for further processing and commercialization of these medicinal plants.

  10. Development of an Environment for Software Reliability Model Selection

    Science.gov (United States)

    1992-09-01

    now is directed to other related problems such as tools for model selection, multiversion programming, and software fault tolerance modeling... Hardware can be repaired by spare modules, which is not the case for software. Preventive maintenance is very important

  11. Evaluation and comparison of alternative fleet-level selective maintenance models

    International Nuclear Information System (INIS)

    Schneider, Kellie; Richard Cassady, C.

    2015-01-01

    Fleet-level selective maintenance refers to the process of identifying the subset of maintenance actions to perform on a fleet of repairable systems when the maintenance resources allocated to the fleet are insufficient for performing all desirable maintenance actions. The original fleet-level selective maintenance model is designed to maximize the probability that all missions in a future set are completed successfully. We extend this model in several ways. First, we consider a cost-based optimization model and show that a special case of this model maximizes the expected value of the number of successful missions in the future set. We also consider the situation in which one or more of the future missions may be canceled. These models and the original fleet-level selective maintenance optimization models are nonlinear. Therefore, we also consider an alternative model in which the objective function can be linearized. We show that the alternative model is a good approximation to the other models. - Highlights: • Investigate nonlinear fleet-level selective maintenance optimization models. • A cost based model is used to maximize the expected number of successful missions. • Another model is allowed to cancel missions if reliability is sufficiently low. • An alternative model has an objective function that can be linearized. • We show that the alternative model is a good approximation to the other models
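
    The structure of these models can be sketched in miniature: choose a subset of maintenance actions within a budget to maximize the probability that every component works, i.e. a product of reliabilities. Taking logarithms turns the product into a sum, which is the sense in which such an objective can be linearized. The numbers and the brute-force search below are purely illustrative, not the paper's optimization method:

    ```python
    import math
    from itertools import combinations

    def best_repair_subset(repairs, budget):
        """repairs: list of (cost, reliability_if_repaired, reliability_if_not).
        Choose the subset of repairs, within budget, that maximizes the
        probability that every component works (a product of reliabilities).
        Maximizing the log of the product -- a sum of logs -- gives the same
        answer, which is why the objective can be linearized."""
        n = len(repairs)
        best_set, best_logp = None, -math.inf
        for r in range(n + 1):
            for subset in combinations(range(n), r):
                cost = sum(repairs[i][0] for i in subset)
                if cost > budget:
                    continue
                logp = sum(
                    math.log(repairs[i][1] if i in subset else repairs[i][2])
                    for i in range(n)
                )
                if logp > best_logp:
                    best_set, best_logp = set(subset), logp
        return best_set, math.exp(best_logp)
    ```

    With three candidate repairs costing 3, 2 and 4 units and a budget of 5, the search picks the single most valuable repair rather than the two cheap ones, because the product objective rewards fixing the least reliable component first.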

  12. Cost Models for MMC Manufacturing Processes

    Science.gov (United States)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  13. Fuzzy Investment Portfolio Selection Models Based on Interval Analysis Approach

    Directory of Open Access Journals (Sweden)

    Haifeng Guo

    2012-01-01

    Full Text Available This paper employs fuzzy set theory to address the unintuitive aspects of the Markowitz mean-variance (MV) portfolio model and extends it to a fuzzy investment portfolio selection model. Our model establishes intervals for expected returns and risk preference, which can take into account investors' different investment appetites and thus can find the optimal resolution for each interval. In the empirical part, we test this model on Chinese stock investments and find that it can fulfill the objectives of different kinds of investors. Finally, investment risk can be decreased when we add an investment limit to each stock in the portfolio, which indicates our model is useful in practice.
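
    The interval idea can be sketched with plain interval arithmetic: if each asset's expected return is only known to lie in an interval [lo, hi], then for non-negative portfolio weights the portfolio's expected return lies in an interval whose endpoints combine linearly. This is a simplified reading of the model (the paper additionally puts intervals on the risk preference), and the function name is illustrative:

    ```python
    def portfolio_return_interval(weights, return_intervals):
        # Expected portfolio return when each asset's expected return is only
        # known as an interval (lo, hi); with non-negative weights the
        # lower and upper endpoints each combine linearly.
        lo = sum(w * r_lo for w, (r_lo, _) in zip(weights, return_intervals))
        hi = sum(w * r_hi for w, (_, r_hi) in zip(weights, return_intervals))
        return lo, hi
    ```

    For example, weights of 0.6 and 0.4 on assets whose expected returns lie in [5%, 8%] and [10%, 15%] give a portfolio expected return in [7%, 10.8%]; an optimizer can then pick the best allocation separately for, say, the pessimistic and optimistic endpoints.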

  14. Model visualization for evaluation of biocatalytic processes

    DEFF Research Database (Denmark)

    Law, HEM; Lewis, DJ; McRobbie, I

    2008-01-01

    Biocatalysis offers great potential as an additional, and in some cases an alternative, synthetic tool for organic chemists, especially as a route to introduce chirality. However, the implementation of scalable biocatalytic processes nearly always requires the introduction of process and/or bi... (S,S-EDDS), a biodegradable chelant, and is characterised by the use of model visualization using 'windows of operation'.

  15. Business process modeling using Petri nets

    NARCIS (Netherlands)

    Hee, van K.M.; Sidorova, N.; Werf, van der J.M.E.M.; Jensen, K.; Aalst, van der W.M.P.; Balbo, G.; Koutny, M.; Wolf, K.

    2013-01-01

    Business process modeling has become a standard activity in many organizations. We start by going back into the history and explaining why this activity appeared and became so important for organizations seeking to achieve their business targets. We discuss the context in which business process
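
    The Petri-net formalism behind this record can be sketched in a few lines: places hold tokens, and a transition fires only when all of its input places are sufficiently marked. The two-step approval net below is a hypothetical example, not one from the chapter.

```python
# A minimal Petri net interpreter illustrating how business processes map
# onto places (states) and transitions (activities). The net itself is a
# hypothetical two-step approval process.

net = {
    "receive": ({"start": 1}, {"review": 1}),   # transition: (inputs, outputs)
    "approve": ({"review": 1}, {"done": 1}),
}
marking = {"start": 1, "review": 0, "done": 0}

def enabled(t):
    """A transition is enabled when every input place holds enough tokens."""
    inputs, _ = net[t]
    return all(marking[p] >= n for p, n in inputs.items())

def fire(t):
    """Consume tokens from input places and produce them on output places."""
    assert enabled(t), f"transition {t} is not enabled"
    inputs, outputs = net[t]
    for p, n in inputs.items():
        marking[p] -= n
    for p, n in outputs.items():
        marking[p] += n

fire("receive")
fire("approve")
print(marking)   # the case token has moved from 'start' to 'done'
```

    Running a case through the net amounts to firing enabled transitions in sequence, which is exactly what makes Petri nets attractive for analysing business process behaviour.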

  16. Business Process Modeling Notation - An Overview

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

    Full Text Available BPMN represents an industrial standard created to offer a common and user-friendly notation to all participants in a business process. The present paper aims to briefly present the main features of this notation, as well as an interpretation of some of the main patterns characterizing a business process modeled as workflows.

  17. Workflow Patterns for Business Process Modeling

    NARCIS (Netherlands)

    Thom, Lucineia Heloisa; Lochpe, Cirano; Reichert, M.U.

    For their reuse advantages, workflow patterns (e.g., control flow patterns, data patterns, resource patterns) are increasingly attracting the interest of both researchers and vendors. Frequently, business process or workflow models can be assembled out of a set of recurrent process fragments (or

  18. Physical and mathematical modelling of extrusion processes

    DEFF Research Database (Denmark)

    Arentoft, Mogens; Gronostajski, Z.; Niechajowics, A.

    2000-01-01

    The main objective of the work is to study the extrusion process using physical modelling and to compare the findings of the study with finite element predictions. The possibilities and advantages of the simultaneous application of both of these methods for the analysis of metal forming processes...

  19. Diversified models for portfolio selection based on uncertain semivariance

    Science.gov (United States)

    Chen, Lin; Peng, Jin; Zhang, Bo; Rosyida, Isnaini

    2017-02-01

    Since financial markets are complex, future security returns are sometimes represented mainly by experts' estimations due to a lack of historical data. This paper proposes a semivariance method for diversified portfolio selection, in which the security returns are given subject to experts' estimations and depicted as uncertain variables. In the paper, three properties of the semivariance of uncertain variables are verified. Based on the concept of semivariance of uncertain variables, two types of mean-semivariance diversified models for uncertain portfolio selection are proposed. Since the models are complex, a hybrid intelligent algorithm based on the 99-method and a genetic algorithm is designed to solve them. In this hybrid intelligent algorithm, the 99-method is applied to compute the expected value and semivariance of uncertain variables, and the genetic algorithm is employed to seek the best allocation plan for portfolio selection. Finally, several numerical examples are presented to illustrate the modelling idea and the effectiveness of the algorithm.

  20. A Primer for Model Selection: The Decisive Role of Model Complexity

    Science.gov (United States)

    Höge, Marvin; Wöhling, Thomas; Nowak, Wolfgang

    2018-03-01

    Selecting a "best" model among several competing candidate models is an often-encountered problem in water resources modeling (and other disciplines that employ models). For a modeler, the best model fulfills a certain purpose best (e.g., flood prediction), which is typically assessed by comparing model simulations to data (e.g., stream flow). Model selection methods find the "best" trade-off between good fit with data and model complexity. In this context, the interpretations of model complexity implied by different model selection methods are crucial, because they represent different underlying goals of modeling. Over the last decades, numerous model selection criteria have been proposed, but modelers who primarily want to apply a model selection criterion often face a lack of guidance for choosing the right criterion that matches their goal. We propose a classification scheme for model selection criteria that helps to find the right criterion for a specific goal, i.e., one which employs the correct complexity interpretation. We identify four model selection classes which seek to achieve high predictive density, low predictive error, high model probability, or shortest compression of data. These goals can be achieved by following either nonconsistent or consistent model selection and by either incorporating a Bayesian parameter prior or not. We allocate commonly used criteria to these four classes, analyze how they represent model complexity, and discuss what this means for the model selection task. Finally, we provide guidance on choosing the right type of criteria for specific model selection tasks. (A quick guide through all key points is given at the end of the introduction.)
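
    Two widely used criteria from these classes, AIC (predictive-error oriented) and BIC (model-probability oriented), can be compared in a short sketch. The log-likelihoods, parameter counts, and sample size below are hypothetical illustration values.

```python
# Sketch comparing two common model selection criteria: AIC and BIC, computed
# from a model's maximised log-likelihood, parameter count k, and sample size n.
# All numbers are hypothetical.
import math

def aic(log_lik, k):
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    return k * math.log(n) - 2 * log_lik

candidates = {
    "simple":  {"log_lik": -120.0, "k": 3},
    "complex": {"log_lik": -110.0, "k": 12},
}
n = 50
for name, m in candidates.items():
    print(f"{name}: AIC={aic(m['log_lik'], m['k']):.1f}  "
          f"BIC={bic(m['log_lik'], m['k'], n):.1f}")
# With these numbers AIC favours the complex model while BIC favours the
# simple one: BIC's log(n) penalty weighs the extra parameters more heavily,
# illustrating how criteria with different complexity interpretations disagree.
```

    Such disagreements are precisely why the abstract argues that a modeler must first pick the complexity interpretation that matches their goal, and only then pick a criterion.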