WorldWideScience

Sample records for optimal decision rules

  1. Optimal decision fusion given sensor rules

    Institute of Scientific and Technical Information of China (English)

    Yunmin ZHU; Xiaorong LI

    2005-01-01

    When all the sensor decision rules are known, the optimal distributed decision fusion, which relies only on the joint conditional probability densities, can be derived for very general decision systems, including systems with interdependent sensor observations and any network structure. The result is also valid for m-ary Bayesian decision problems and for binary problems under the Neyman-Pearson criterion. Local decision rules for a sensor receiving communication from other sensors that are optimal for the sensor itself are also presented; they take the form of a generalized likelihood ratio test. Numerical examples reveal an interesting phenomenon: communication between sensors can improve the performance of a sensor's own decision, but cannot guarantee an improvement of the global fusion performance when the sensor rules are fixed before fusion.
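
    As a toy illustration of the fusion principle described above (not the authors' implementation), the following Python sketch fixes two local threshold rules for independent Gaussian sensors and fuses their binary decisions with a likelihood-ratio test over the joint conditional probabilities of those decisions; the sensor models, local thresholds, and fusion threshold are assumptions made for illustration.

      # A minimal sketch: optimal fusion of two fixed binary sensor rules via a
      # likelihood-ratio test on the joint conditional probabilities of the local
      # decisions. Sensor models and thresholds below are illustrative assumptions.
      from scipy.stats import norm

      # Hypothesis models: x_i ~ N(0, 1) under H0 and x_i ~ N(mu_i, 1) under H1,
      # with independent sensors (the general result also covers dependent ones).
      MU = [1.0, 1.5]        # assumed signal means under H1
      LOCAL_T = [0.5, 0.8]   # fixed local decision thresholds (given sensor rules)

      def local_decision_probs(u, hypothesis):
          """P(u_1, ..., u_n | hypothesis) for threshold rules u_i = 1{x_i > t_i}."""
          p = 1.0
          for ui, ti, mui in zip(u, LOCAL_T, MU):
              mean = mui if hypothesis == 1 else 0.0
              p1 = norm.sf(ti, loc=mean)           # P(u_i = 1 | hypothesis)
              p *= p1 if ui == 1 else (1.0 - p1)
          return p

      def fuse(u, threshold=1.0):
          """Optimal fusion rule: decide H1 iff the likelihood ratio of the
          received local decisions exceeds the fusion threshold."""
          lr = local_decision_probs(u, 1) / local_decision_probs(u, 0)
          return 1 if lr > threshold else 0

      for u in [(0, 0), (0, 1), (1, 0), (1, 1)]:
          print(u, "->", fuse(u))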

  2. Classifiers based on optimal decision rules

    KAUST Repository

    Amin, Talha

    2013-11-25

    Based on the dynamic programming approach, we design algorithms for sequential optimization of exact and approximate decision rules relative to length and coverage [3, 4]. In this paper, we use optimal rules to construct classifiers and study two questions: (i) which rules are better from the point of view of classification, exact or approximate; and (ii) which order of optimization gives better classifier performance: length, length+coverage, coverage, or coverage+length. Experimental results show that, on average, classifiers based on exact rules are better than classifiers based on approximate rules, and that sequential optimization (length+coverage or coverage+length) is better than single-criterion optimization (length or coverage).

  3. Decision and Inhibitory Rule Optimization for Decision Tables with Many-valued Decisions

    KAUST Repository

    Alsolami, Fawaz

    2016-04-25

    ‘If-then’ rule sets are one of the most expressive and human-readable knowledge representations. This thesis deals with optimization and analysis of decision and inhibitory rules for decision tables with many-valued decisions. The most important areas of applications are knowledge extraction and representation. The benefit of considering inhibitory rules is connected with the fact that in some situations they can describe more knowledge than the decision ones. Decision tables with many-valued decisions arise in combinatorial optimization, computational geometry, fault diagnosis, and especially under the processing of data sets. In this thesis, various examples of real-life problems are considered which help to understand the motivation of the investigation. We extend relatively simple results obtained earlier for decision rules over decision tables with many-valued decisions to the case of inhibitory rules. The behavior of Shannon functions (which characterize complexity of rule systems) is studied for finite and infinite information systems, for global and local approaches, and for decision and inhibitory rules. The extensions of dynamic programming for the study of decision rules over decision tables with single-valued decisions are generalized to the case of decision tables with many-valued decisions. These results are also extended to the case of inhibitory rules. As a result, we have algorithms (i) for multi-stage optimization of rules relative to such criteria as length or coverage, (ii) for counting the number of optimal rules, (iii) for construction of Pareto optimal points for bi-criteria optimization problems, (iv) for construction of graphs describing relationships between two cost functions, and (v) for construction of graphs describing relationships between cost and accuracy of rules. The applications of created tools include comparison (based on information about Pareto optimal points) of greedy heuristics for bi-criteria optimization of rules

  4. Optimization of approximate decision rules relative to number of misclassifications

    KAUST Repository

    Amin, Talha

    2012-12-01

    In the paper, we study an extension of the dynamic programming approach which allows optimization of approximate decision rules relative to the number of misclassifications. We introduce an uncertainty measure J(T), which is the difference between the number of rows in a decision table T and the number of rows with the most common decision for T. For a nonnegative real number γ, we consider γ-decision rules that localize rows in subtables of T with uncertainty at most γ. The presented algorithm constructs a directed acyclic graph Δγ(T). Based on this graph, we can describe the whole set of so-called irredundant γ-decision rules and optimize rules from this set according to the number of misclassifications. Results of experiments with decision tables from the UCI Machine Learning Repository are presented. © 2012 The authors and IOS Press. All rights reserved.
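
    A minimal Python sketch of the uncertainty measure J(T) defined above and of the γ-rule condition; the toy decision table and the rule premise are illustrative assumptions, not data from the paper.

      # J(T) = (number of rows of T) - (number of rows with the most common decision).
      # A rule premise defines a subtable; it yields a gamma-decision rule if the
      # subtable's uncertainty is at most gamma. The table below is a toy example.
      from collections import Counter

      def J(table):
          """table: list of (attribute_values, decision) pairs."""
          if not table:
              return 0
          counts = Counter(decision for _, decision in table)
          return len(table) - max(counts.values())

      def subtable(table, conditions):
          """Rows satisfying all 'attribute index = value' conditions of a rule premise."""
          return [row for row in table
                  if all(row[0][i] == v for i, v in conditions)]

      T = [((0, 1), 'a'), ((0, 0), 'a'), ((1, 1), 'b'), ((1, 0), 'a')]
      gamma = 0
      premise = [(0, 0)]                       # attribute_0 = 0
      print(J(subtable(T, premise)) <= gamma)  # True: "attribute_0 = 0 -> a" is a 0-decision rule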

  5. Dynamic programming approach to optimization of approximate decision rules

    KAUST Repository

    Amin, Talha

    2013-02-01

    This paper is devoted to the study of an extension of the dynamic programming approach which allows sequential optimization of approximate decision rules relative to length and coverage. We introduce an uncertainty measure R(T), which is the number of unordered pairs of rows with different decisions in the decision table T. For a nonnegative real number β, we consider β-decision rules that localize rows in subtables of T with uncertainty at most β. Our algorithm constructs a directed acyclic graph Δβ(T) whose nodes are subtables of the decision table T given by systems of equations of the kind "attribute = value". The algorithm stops partitioning a subtable when its uncertainty is at most β. The graph Δβ(T) allows us to describe the whole set of so-called irredundant β-decision rules. We can describe all irredundant β-decision rules with minimum length, and then, among these rules, describe all rules with maximum coverage. We can also change the order of optimization. Considering irredundant rules only does not change the results of optimization. The paper also contains results of experiments with decision tables from the UCI Machine Learning Repository. © 2012 Elsevier Inc. All rights reserved.
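
    A minimal Python sketch of the uncertainty measure R(T) described above, computed from decision counts rather than by enumerating all pairs; the example decision column is an illustrative assumption.

      # R(T) = number of unordered pairs of rows of T with different decisions.
      from collections import Counter
      from itertools import combinations

      def R(decisions):
          counts = Counter(decisions)
          n = len(decisions)
          same = sum(c * (c - 1) // 2 for c in counts.values())
          return n * (n - 1) // 2 - same  # total pairs minus pairs with equal decisions

      decisions = ['a', 'a', 'b', 'c', 'a']
      assert R(decisions) == sum(1 for x, y in combinations(decisions, 2) if x != y)
      print(R(decisions))  # 7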

  6. Optimization of decision rules based on dynamic programming approach

    KAUST Repository

    Zielosko, Beata

    2014-01-14

    This chapter is devoted to the study of an extension of the dynamic programming approach which allows optimization of approximate decision rules relative to length and coverage. We introduce an uncertainty measure equal to the difference between the number of rows in a given decision table and the number of rows labeled with the most common decision for this table, divided by the number of rows in the decision table. We fix a threshold γ, 0 ≤ γ < 1, and study so-called γ-decision rules (approximate decision rules) that localize rows in subtables whose uncertainty is at most γ. The presented algorithm constructs a directed acyclic graph Δγ(T) whose nodes are subtables of the decision table T given by pairs "attribute = value". The algorithm stops partitioning a subtable when its uncertainty is at most γ. The chapter also contains results of experiments with decision tables from the UCI Machine Learning Repository. © 2014 Springer International Publishing Switzerland.

  7. Dynamic Programming Approach for Exact Decision Rule Optimization

    KAUST Repository

    Amin, Talha

    2013-01-01

    This chapter is devoted to the study of an extension of the dynamic programming approach that allows sequential optimization of exact decision rules relative to length and coverage. It also contains results of experiments with decision tables from the UCI Machine Learning Repository. © Springer-Verlag Berlin Heidelberg 2013.

  8. Optimization of decision rule complexity for decision tables with many-valued decisions

    KAUST Repository

    Azad, Mohammad

    2013-10-01

    We describe new heuristics that construct decision rules for decision tables with many-valued decisions which are good enough from the point of view of length and coverage. We use a statistical test to find leaders among the heuristics. After that, we compare our results with the optimal results obtained by dynamic programming algorithms. The average relative difference between the length (coverage) of constructed and optimal rules is at most 6.89% (15.89%, respectively) for the leaders, which seems to be a promising result. © 2013 IEEE.

  9. Dynamic programming approach for partial decision rule optimization

    KAUST Repository

    Amin, Talha

    2012-10-04

    This paper is devoted to the study of an extension of the dynamic programming approach which allows optimization of partial decision rules relative to length or coverage. We introduce an uncertainty measure J(T), which is the difference between the number of rows in a decision table T and the number of rows with the most common decision for T. For a nonnegative real number γ, we consider γ-decision rules (partial decision rules) that localize rows in subtables of T with uncertainty at most γ. The presented algorithm constructs a directed acyclic graph Δγ(T) whose nodes are subtables of the decision table T given by systems of equations of the kind "attribute = value". The algorithm stops partitioning a subtable when its uncertainty is at most γ. The graph Δγ(T) allows us to describe the whole set of so-called irredundant γ-decision rules, and we can optimize this set of rules according to length or coverage. The paper also contains results of experiments with decision tables from the UCI Machine Learning Repository.

  10. Optimization of inhibitory decision rules relative to length and coverage

    KAUST Repository

    Alsolami, Fawaz

    2012-01-01

    The paper is devoted to the study of algorithms for optimization of inhibitory rules relative to length and coverage. In contrast with usual rules, which have a relation "attribute = value" on the right-hand side, inhibitory rules have a relation "attribute ≠ value" on the right-hand side. The considered algorithms are based on extensions of dynamic programming. © 2012 Springer-Verlag.

  11. Optimization of β-decision rules relative to number of misclassifications

    KAUST Repository

    Zielosko, Beata

    2012-01-01

    In the paper, we present an algorithm for optimization of approximate decision rules relative to the number of misclassifications. The considered algorithm is based on extensions of dynamic programming and constructs a directed acyclic graph Δβ(T). Based on this graph, we can describe the whole set of so-called irredundant β-decision rules and optimize rules from this set according to the number of misclassifications. Results of experiments with decision tables from the UCI Machine Learning Repository are presented. © 2012 Springer-Verlag.

  12. An Integrated Model for Optimization Oriented Decision Aiding and Rule Based Decision Making in Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    A. Yousefli

    2014-01-01

    In this paper, a fuzzy decision aid system is developed based on new concepts presented in the field of fuzzy decision making in a fuzzy environment (FDMFE). This framework helps decision makers understand the different circumstances of an uncertain problem that may occur in the future. Also, to shield the decision maker from the complexities of the optimization problem, a decision support system that can replace the optimization problem is presented to make optimum or near-optimum decisions without solving the optimization problem directly. An application of the developed decision aid model and the decision support system is presented in the field of inventory models.

  13. An Integrated Model for Optimization Oriented Decision Aiding and Rule Based Decision Making in Fuzzy Environment

    OpenAIRE

    A. Yousefli; M. Ghazanfari; M. B. Abiri

    2014-01-01

    In this paper, a fuzzy decision aid system is developed based on new concepts presented in the field of fuzzy decision making in a fuzzy environment (FDMFE). This framework helps decision makers understand the different circumstances of an uncertain problem that may occur in the future. Also, to shield the decision maker from the complexities of the optimization problem, a decision support system that can replace the optimization problem is presented to make optimum or near-optimum decisions without ...

  14. Optimization and analysis of decision trees and rules: Dynamic programming approach

    KAUST Repository

    Alkhalid, Abdulaziz

    2013-08-01

    This paper is devoted to the consideration of the software system Dagger created at KAUST. This system is based on extensions of dynamic programming. It allows sequential optimization of decision trees and rules relative to different cost functions, derivation of relationships between two cost functions (in particular, between the number of misclassifications and the depth of decision trees), and between cost and uncertainty of decision trees. We describe the features of Dagger and consider examples of this system's work on decision tables from the UCI Machine Learning Repository. We also use Dagger to compare 16 different greedy algorithms for decision tree construction. © 2013 Taylor and Francis Group, LLC.

  15. Optimization of approximate decision rules relative to number of misclassifications: Comparison of greedy and dynamic programming approaches

    KAUST Repository

    Amin, Talha

    2013-01-01

    In the paper, we present a comparison of dynamic programming and greedy approaches for construction and optimization of approximate decision rules relative to the number of misclassifications. We use an uncertainty measure that is the difference between the number of rows in a decision table T and the number of rows with the most common decision for T. For a nonnegative real number γ, we consider γ-decision rules that localize rows in subtables of T with uncertainty at most γ. Experimental results with decision tables from the UCI Machine Learning Repository are also presented. © 2013 Springer-Verlag.

  16. Optimal offering and operating strategies for wind-storage systems with linear decision rules

    DEFF Research Database (Denmark)

    Ding, Huajie; Pinson, Pierre; Hu, Zechun

    2016-01-01

    The participation of wind farm-energy storage systems (WF-ESS) in electricity markets calls for an integrated view of day-ahead offering strategies and real-time operation policies. Such an integrated strategy is proposed here by co-optimizing offering at the day-ahead stage and operation policy ... to be used at the balancing stage. Linear decision rules are seen as a natural approach to model and optimize the real-time operation policy. These allow enhancing profits from balancing markets based on updated information on prices and wind power generation. Our integrated strategies for WF...
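
    A minimal Python sketch of what a linear decision rule for the balancing stage can look like under the assumptions of this summary: an affine map from the observed wind-power deviation to a bounded storage action. The coefficients and limits are illustrative and not the paper's model.

      # Affine recourse policy: action = alpha + beta * deviation, clipped to the
      # storage power limit. alpha and beta would be co-optimized with the
      # day-ahead offer; the values used here are illustrative assumptions.
      def storage_action(deviation_mw, alpha, beta, p_max_mw):
          """Positive = charge (absorb surplus), negative = discharge (cover shortfall)."""
          action = alpha + beta * deviation_mw
          return max(-p_max_mw, min(p_max_mw, action))

      for dev in (-12.0, -3.0, 0.0, 5.0, 15.0):
          print(dev, "->", storage_action(dev, alpha=0.0, beta=0.8, p_max_mw=10.0))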

  17. Optimal Decision Rules in Repeated Games Where Players Infer an Opponent’s Mind via Simplified Belief Calculation

    Directory of Open Access Journals (Sweden)

    Mitsuhiro Nakamura

    2016-07-01

    In strategic situations, humans infer the state of mind of others, e.g., emotions or intentions, and adapt their behavior appropriately. Nonetheless, evolutionary studies of cooperation typically focus only on reaction norms, e.g., tit for tat, whereby individuals make their next decisions by considering only the observed outcome rather than their opponent's state of mind. In this paper, we analyze repeated two-player games in which players explicitly infer their opponent's unobservable state of mind. Using Markov decision processes, we investigate optimal decision rules and their performance in cooperation. The state-of-mind inference requires Bayesian belief calculations, which are computationally intensive. We therefore study two models in which players simplify these belief calculations. In Model 1, players adopt a heuristic to approximately infer their opponent's state of mind, whereas in Model 2, players use information regarding their opponent's previous state of mind, obtained from external evidence, e.g., emotional signals. We show that players in both models reach almost optimal behavior through commitment-like decision rules, by which players are committed to selecting the same action regardless of their opponent's behavior. These commitment-like decision rules can enhance or reduce cooperation depending on the opponent's strategy.

  18. Decision rules for decision tables with many-valued decisions

    KAUST Repository

    Chikalov, Igor

    2011-01-01

    In the paper, the authors present a greedy algorithm for construction of exact and partial decision rules for decision tables with many-valued decisions. Exact decision rules can be 'over-fitted', so instead of exact decision rules with many attributes, it is more appropriate to work with partial decision rules with a smaller number of attributes. Based on results for the set cover problem, the authors study bounds on the accuracy of the greedy algorithm for exact and partial decision rule construction, and the complexity of the problem of minimization of decision rule length. © 2011 Springer-Verlag.

  19. Robust optimization of uncertain multistage inventory systems with inexact data in decision rules

    NARCIS (Netherlands)

    de Ruiter, Frans; Ben-Tal, A.; Brekelmans, Ruud; den Hertog, Dick

    2017-01-01

    In production-inventory problems customer demand is often subject to uncertainty. Therefore, it is challenging to design production plans that satisfy both demand and a set of constraints on e.g. production capacity and required inventory levels. Adjustable robust optimization (ARO) is a technique t

  20. Conformance Testing: Measurement Decision Rules

    Science.gov (United States)

    Mimbs, Scott M.

    2010-01-01

    The goal of a Quality Management System (QMS) as specified in ISO 9001 and AS9100 is to provide assurance to the customer that end products meet specifications. Measuring devices, often called measuring and test equipment (MTE), are used to provide the evidence of product conformity to specified requirements. Unfortunately, processes that employ MTE can become a weak link to the overall QMS if proper attention is not given to the measurement process design, capability, and implementation. Documented "decision rules" establish the requirements to ensure measurement processes provide the measurement data that supports the needs of the QMS. Measurement data are used to make the decisions that impact all areas of technology. Whether measurements support research, design, production, or maintenance, ensuring the data supports the decision is crucial. Measurement data quality can be critical to the resulting consequences of measurement-based decisions. Historically, most industries required simplistic, one-size-fits-all decision rules for measurements. One-size-fits-all rules in some cases are not rigorous enough to provide adequate measurement results, while in other cases are overly conservative and too costly to implement. Ideally, decision rules should be rigorous enough to match the criticality of the parameter being measured, while being flexible enough to be cost effective. The goal of a decision rule is to ensure that measurement processes provide data with a sufficient level of quality to support the decisions being made - no more, no less. This paper discusses the basic concepts of providing measurement-based evidence that end products meet specifications. Although relevant to all measurement-based conformance tests, the target audience is the MTE end-user, which is anyone using MTE other than calibration service providers. Topics include measurement fundamentals, the associated decision risks, verifying conformance to specifications, and basic measurement

  1. Design and Analysis of Decision Rules via Dynamic Programming

    KAUST Repository

    Amin, Talha M.

    2017-04-24

    The areas of machine learning, data mining, and knowledge representation have many different formats used to represent information. Decision rules, amongst these formats, are the most expressive and easily-understood by humans. In this thesis, we use dynamic programming to design decision rules and analyze them. The use of dynamic programming allows us to work with decision rules in ways that were previously only possible for brute force methods. Our algorithms allow us to describe the set of all rules for a given decision table. Further, we can perform multi-stage optimization by repeatedly reducing this set to only contain rules that are optimal with respect to selected criteria. One way that we apply this study is to generate small systems with short rules by simulating a greedy algorithm for the set cover problem. We also compare maximum path lengths (depth) of deterministic and non-deterministic decision trees (a non-deterministic decision tree is effectively a complete system of decision rules) with regards to Boolean functions. Another area of advancement is the presentation of algorithms for constructing Pareto optimal points for rules and rule systems. This allows us to study the existence of “totally optimal” decision rules (rules that are simultaneously optimal with regards to multiple criteria). We also utilize Pareto optimal points to compare and rate greedy heuristics with regards to two criteria at once. Another application of Pareto optimal points is the study of trade-offs between cost and uncertainty which allows us to find reasonable systems of decision rules that strike a balance between length and accuracy.
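
    A minimal Python sketch of constructing Pareto optimal points for rules evaluated by two criteria, length (to be minimized) and coverage (to be maximized). The rule list is an illustrative assumption; the thesis obtains these points by dynamic programming over the decision table rather than by enumeration as done here.

      def pareto_points(rules):
          """rules: iterable of (length, coverage). Keep points not dominated by another
          point that is at least as short and covers at least as many rows."""
          points = set(rules)
          return sorted(
              p for p in points
              if not any(q != p and q[0] <= p[0] and q[1] >= p[1] for q in points)
          )

      rules = [(2, 10), (3, 14), (2, 12), (4, 14), (1, 5)]
      print(pareto_points(rules))  # [(1, 5), (2, 12), (3, 14)]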

  2. Decision Rules for Enhanced Breakout.

    Science.gov (United States)

    1987-03-20

    AD-A182 753. Decision Rules for Enhanced Breakout (U). Modern Technologies Corp., Dayton, OH. T. M. McCann. 20 Mar 87. MTC-TR-8883-02, BRMC-85-564-1, F33615... information to develop a priori estimates of the cost to break out specific spare parts. In addition, some recommendations were to be developed ... implementing directives. Applicable AF and AFLC Regulations and Pamphlets were also reviewed to obtain information on costs and their estimates.

  3. Generation and Interpretation of Temporal Decision Rules

    CERN Document Server

    Karimi, Kamran

    2010-01-01

    We present a solution to the problem of understanding a system that produces a sequence of temporally ordered observations. Our solution is based on generating and interpreting a set of temporal decision rules. A temporal decision rule is a decision rule that can be used to predict or retrodict the value of a decision attribute, using condition attributes that are observed at times other than the decision attribute's time of observation. A rule set, consisting of a set of temporal decision rules with the same decision attribute, can be interpreted by our Temporal Investigation Method for Enregistered Record Sequences (TIMERS) to signify an instantaneous, an acausal or a possibly causal relationship between the condition attributes and the decision attribute. We show the effectiveness of our method, by describing a number of experiments with both synthetic and real temporal data.

  4. Simultaneous optimization of decisions using a linear utility function

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1988-01-01

    The purpose of this paper is to simultaneously optimize decision rules for combinations of elementary decisions. As a result of this approach, rules are found that make more efficient use of the data than does optimizing those decisions separately. The framework for the approach is derived from empi

  5. Length and coverage of inhibitory decision rules

    KAUST Repository

    Alsolami, Fawaz

    2012-01-01

    The authors present algorithms for optimization of inhibitory rules relative to length and coverage. Inhibitory rules have a relation "attribute ≠ value" on the right-hand side. The considered algorithms are based on extensions of dynamic programming. The paper also contains a comparison of the length and coverage of inhibitory rules constructed by a greedy algorithm and by the dynamic programming algorithm. © 2012 Springer-Verlag.

  6. Decision Analysis of Dynamic Spectrum Access Rules

    Energy Technology Data Exchange (ETDEWEB)

    Juan D. Deaton; Luiz A. DaSilva; Christian Wernz

    2011-12-01

    A current trend in spectrum regulation is to incorporate spectrum sharing through the design of spectrum access rules that support Dynamic Spectrum Access (DSA). This paper develops a decision-theoretic framework for regulators to assess the impacts of different decision rules on both primary and secondary operators. We analyze access rules based on sensing and exclusion areas, which in practice can be enforced through geolocation databases. Our results show that receiver-only sensing provides insufficient protection for primary and co-existing secondary users and overall low social welfare. On the other hand, using sensing information from both the transmitter and receiver of a communication link provides dramatic increases in system performance. The performance of using these link end points is relatively close to that of using many cooperative sensing nodes associated with the same access point and large link exclusion areas. These results are useful to regulators and network developers in understanding and developing rules for future DSA regulation.

  7. Optimization of Approximate Inhibitory Rules Relative to Number of Misclassifications

    KAUST Repository

    Alsolami, Fawaz

    2013-10-04

    In this work, we consider so-called nonredundant inhibitory rules, containing an expression "attribute ≠ value" on the right-hand side, for which the number of misclassifications is at most a threshold γ. We study a dynamic programming approach for the description of the considered set of rules. This approach also allows optimization of nonredundant inhibitory rules relative to length and coverage. The aim of this paper is to investigate the additional possibility of optimization relative to the number of misclassifications. The results of experiments with decision tables from the UCI Machine Learning Repository show that this additional optimization achieves fewer misclassifications. Thus, the proposed optimization procedure is promising.

  8. Decision rule classifiers for multi-label decision tables

    KAUST Repository

    Alsolami, Fawaz

    2014-01-01

    Recently, the multi-label classification problem has received significant attention in the research community. This paper is devoted to studying the effect of the considered rule heuristic parameters on the generalization error. The results of experiments for decision tables from the UCI Machine Learning Repository and the KEEL Repository show that rule heuristics taking into account both coverage and uncertainty perform better than strategies taking into account a single criterion. © 2014 Springer International Publishing.

  9. Decisiveness and Inclusiveness: Intergovernmental Choice of European Decision Rules

    Directory of Open Access Journals (Sweden)

    Thomas König

    1997-12-01

    Studying the member states' constitutional choice of European decision rules, most power index analyses concentrate on the relative decisiveness of member states in the Council of Ministers. However, this emphasis has two shortcomings. First, it ignores the interaction between the Commission, the Council of Ministers and the European Parliament, which provides multi-cameral decision making for European legislation. Second, although relative decisiveness is applied to the measurement of the member states' (expected) distribution of legislative gains, it does not take into account the member states' expectation of the extent of gains, which depends on their absolute inclusiveness. In this article we present a model of member states' constitutional choice of European decision rules with regard to the two notions of power: actors' relative decisiveness and their absolute inclusiveness in decision making. We present an index to measure inclusiveness and we apply our concept to European multi-cameral procedures. Hereby, we account for the member states' recent reforms of legislative procedures.

  10. Optimization of CHR propagation rules: extended report

    OpenAIRE

    Van Weert, Peter

    2008-01-01

    Constraint Handling Rules (CHR) is an elegant, high-level programming language based on multi-headed, forward chaining rules. To ensure CHR propagation rules are applied at most once with the same combination of constraints, CHR implementations maintain a so-called propagation history. The performance impact of this history can be significant. We introduce several optimizations that, for the majority of CHR rules, eliminate this overhead. We formally prove their correctness, and evaluate thei...

  11. Understanding Optimal Decision-Making

    Science.gov (United States)

    2015-06-01

    ... (2014). Assessment of cognitive components of decision-making with military versions of the IGT and WCST. Human Factors and Ergonomics Society 2014 ... optimal decision-making will allow the military to more effectively train its leaders. The Cognitive Alignment with Performance Targeted Training ... (optimal or suboptimal) is aligned or misaligned with cognitive state (categorized as exploration or exploitation): when someone thinks they have

  12. Comparison of Heuristics for Inhibitory Rule Optimization

    KAUST Repository

    Alsolami, Fawaz

    2014-09-13

    Knowledge representation and extraction are very important tasks in data mining. In this work, we propose a variety of rule-based greedy algorithms that are able to represent the knowledge contained in a given dataset as a series of inhibitory rules containing an expression "attribute ≠ value" on the right-hand side. The main goal of this paper is to determine, based on rule characteristics (rule length and coverage), whether the proposed rule heuristics are statistically significantly different or not; if so, we aim to identify the best-performing rule heuristics for minimization of rule length and maximization of rule coverage. The Friedman test with the Nemenyi post-hoc test is used to compare the greedy algorithms statistically against each other for length and coverage. The experiments are carried out on real datasets from the UCI Machine Learning Repository. For the leading heuristics, the constructed rules are compared with optimal ones obtained by the dynamic programming approach. The results seem promising for the best heuristics: the average relative difference between the length (coverage) of constructed and optimal rules is at most 2.27% (7%, respectively). Furthermore, the quality of classifiers based on sets of inhibitory rules constructed by the considered heuristics is compared, and the results show that the three best heuristics from the point of view of classification accuracy coincide with the three best-performing heuristics from the point of view of rule length minimization.
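
    A minimal Python sketch of the statistical comparison step: a Friedman test over per-dataset rule lengths produced by three heuristics. The numbers are illustrative; the Nemenyi post-hoc analysis (available, for example, in the scikit-posthocs package) would follow only if the Friedman test rejects the null hypothesis.

      from scipy.stats import friedmanchisquare

      # Average rule length per dataset (one list of values per hypothetical heuristic).
      heuristic_a = [2.1, 3.0, 1.8, 2.5, 4.0, 2.2]
      heuristic_b = [2.4, 3.1, 2.0, 2.9, 4.2, 2.6]
      heuristic_c = [2.0, 2.8, 1.7, 2.4, 3.9, 2.1]

      stat, p_value = friedmanchisquare(heuristic_a, heuristic_b, heuristic_c)
      print(f"Friedman chi-square = {stat:.3f}, p = {p_value:.4f}")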

  13. Optimizing Early Retirement Decisions.

    Science.gov (United States)

    2007-11-02

    ... the military. The U.S. Army's early retirement program is a temporary one designed to allow some soldiers to leave the service prior to 20 years of ... whether it makes financial sense for an officer to select early retirement. A spreadsheet formulation is developed and used to indicate if and when ... an officer should select early retirement. The program investigates the decision under various civilian salary levels and various assumed discount rates.

  14. Evolving Decision Rules to Predict Investment Opportunities

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper is motivated by the interest in finding significant movements in financial stock prices. However, when the number of profitable opportunities is scarce, the prediction of these cases is difficult. In a previous work, we have introduced evolving decision rules (EDR) to detect financial opportunities. The objective of EDR is to classify the minority class (positive cases) in imbalanced environments. EDR provides a range of classifications to find the best balance between not making mistakes and not missing opportunities. The goals of this paper are: 1) to show that EDR produces a range of solutions to suit the investor's preferences and 2) to analyze the factors that benefit the performance of EDR. A series of experiments was performed. EDR was tested using a data set from the London Financial Market. To analyze the EDR behaviour, another experiment was carried out using three artificial data sets, whose solutions have different levels of complexity. Finally, an illustrative example was provided to show how a bigger collection of rules is able to classify more positive cases in imbalanced data sets. Experimental results show that: 1) EDR offers a range of solutions to fit the risk guidelines of different types of investors, and 2) a bigger collection of rules is able to classify more positive cases in imbalanced environments.

  15. Rule-Based Optimization of Reversible Circuits

    CERN Document Server

    Arabzadeh, Mona; Zamani, Morteza Saheb

    2010-01-01

    Reversible logic has applications in various research areas including low-power design and quantum computation. In this paper, a rule-based optimization approach for reversible circuits is proposed which uses both negative and positive control Toffoli gates during the optimization. To this end, a set of rules for removing NOT gates and optimizing sub-circuits with common-target gates are proposed. To evaluate the proposed approach, the best-reported synthesized circuits and the results of a recent synthesis algorithm which uses both negative and positive controls are used. Our experiments reveal the potential of the proposed approach in optimizing synthesized circuits.

  16. Derived operating rules for a reservoir operation system: Comparison of decision trees, neural decision trees and fuzzy decision trees

    Science.gov (United States)

    Wei, Chih-Chiang; Hsu, Nien-Sheng

    2008-02-01

    This article compares the decision-tree algorithm (C5.0), neural decision-tree algorithm (NDT) and fuzzy decision-tree algorithm (FIDs) for addressing reservoir operations regarding water supply during normal periods. The conventional decision-tree algorithm, such as ID3 and C5.0, executes rapidly and can easily be translated into if-then-else rules. However, the C5.0 algorithm cannot discover dependencies among attributes and cannot treat the non-axis-parallel class boundaries of data. The basic concepts of the two algorithms presented are: (1) NDT algorithm combines the neural network technologies and conventional decision-tree algorithm capabilities, and (2) FIDs algorithm extends to apply fuzzy sets for all attributes with membership function grades and generates a fuzzy decision tree. In order to obtain higher classification rates in FIDs, the flexible trapezoid fuzzy sets are employed to define membership functions. Furthermore, an intelligent genetic algorithm is utilized to optimize the large number of variables in fuzzy decision-tree design. The applicability of the presented algorithms is demonstrated through a case study of the Shihmen Reservoir system. A network flow optimization model for analyzing long-term supply demand is employed to generate the input-output patterns. Findings show superior performance of the FIDs model in contrast with C5.0, NDT and current reservoir operating rules.
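
    A minimal Python sketch of the plain decision-tree step only (a C5.0-style baseline, not the NDT or FIDs variants): fit a shallow tree on reservoir state features and print it as if-then operating rules. The features, data, and class labels are illustrative assumptions.

      from sklearn.tree import DecisionTreeClassifier, export_text

      # Features: [storage_ratio, forecast_inflow]; label: release decision class.
      X = [[0.9, 120], [0.8, 80], [0.4, 60], [0.3, 90], [0.6, 40], [0.2, 30]]
      y = ["high_release", "high_release", "hedging", "hedging", "normal", "hedging"]

      tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
      print(export_text(tree, feature_names=["storage_ratio", "forecast_inflow"]))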

  17. Relationships between length and coverage of decision rules

    KAUST Repository

    Amin, Talha

    2014-02-14

    The paper describes a new tool for studying relationships between the length and coverage of exact decision rules. This tool is based on the dynamic programming approach. We also present results of experiments with decision tables from the UCI Machine Learning Repository.

  18. Optimal short-sighted rules

    Directory of Open Access Journals (Sweden)

    Sacha eBourgeois-Gironde

    2012-09-01

    The aim of this paper is to assess the relevance of methodological transfers from behavioral ecology to experimental economics with respect to the elicitation of intertemporal preferences. More precisely, our discussion stems from the analysis of Stephens and Anderson's (2001) seminal article. In their study with blue jays they document that foraging behavior typically implements short-sighted choice rules which are beneficial in the long run. Such long-term profitability of short-sighted behavior cannot be evidenced when using a self-control paradigm (one which contrasts in a binary way sooner smaller and later larger payoffs), but becomes apparent when ecological patch paradigms (replicating economic situations in which the main trade-off consists in staying on a food patch or leaving for another patch) are implemented. We transfer this methodology in view of contrasting foraging strategies and self-control in human intertemporal choices.

  19. Unrealistic optimism and decision making

    Directory of Open Access Journals (Sweden)

    Božović Bojana

    2009-01-01

    One of the leading descriptive theories of decision-making under risk, Tversky and Kahneman's prospect theory, reveals that a normative explanation of decision-making, based only on the principle of maximizing the expected utility of outcomes, is unsustainable. It also underlines the effect of alternative factors on decision-making. The framing effect relates to the influence that the verbal formulation of outcomes has on choosing between certain and risky outcomes; in a negative frame people tend to be risk seeking, whereas in a positive frame people express risk-averse tendencies. Individual decisions are not based on objective probabilities of outcomes, but on subjective probabilities that depend on outcome desirability. Unrealistically pessimistic subjects assign lower probabilities (than the group average) to the desired outcomes, while unrealistically optimistic subjects assign higher probabilities (than the group average) to the desired outcomes. An experiment was conducted in order to test the presumption that there is a relation between unrealistic optimism and decision-making under risk. We expected optimists to be risk seeking, and pessimists to be risk averse. We also expected such cognitive tendencies, if they should become manifest, to be resistant to the framing effect. An unrealistic optimism scale was applied, followed by a questionnaire composed of tasks of decision-making under risk. Results within the whole sample, and results of the subsequently extracted groups of pessimists and optimists, both revealed a dominant risk-seeking tendency that is resistant to the influence of subjective probabilities as well as to the influence of the frame in which the outcome is presented.

  20. Decision rules and group rationality: cognitive gain or standstill?

    Directory of Open Access Journals (Sweden)

    Petru Lucian Curşeu

    Recent research in group cognition points towards the existence of collective cognitive competencies that transcend individual group members' cognitive competencies. Since rationality is a key cognitive competence for group decision making, and group cognition emerges from the coordination of individual cognition during social interactions, this study tests the extent to which collaborative and consultative decision rules impact the emergence of group rationality. Using a set of decision tasks adapted from the heuristics and biases literature, we evaluate rationality as the extent to which individual choices are aligned with a normative ideal. We further operationalize group rationality as cognitive synergy (the extent to which collective rationality exceeds the average or best individual rationality in the group), and we test the effect of collaborative and consultative decision rules in a sample of 176 groups. Our results show that the collaborative decision rule has superior synergic effects as compared to the consultative decision rule. The ninety-one groups working in a collaborative fashion made more rational choices (above and beyond the average rationality of their members) than the eighty-five groups working in a consultative fashion. Moreover, the groups using a collaborative decision rule were closer to the rationality of their best member than groups using consultative decision rules. Nevertheless, on average, groups did not outperform their best member. Therefore, our results reveal how decision rules prescribing interpersonal interactions impact the emergence of collective cognitive competencies. They also open potential avenues for further research on the emergence of collective rationality in human decision-making groups.

  1. Portfolio theory and the alternative decision rule of cost-effectiveness analysis: theoretical and practical considerations.

    Science.gov (United States)

    Sendi, Pedram; Al, Maiwenn J; Gafni, Amiram; Birch, Stephen

    2004-05-01

    Bridges and Terris (Soc. Sci. Med. (2004)) critique our paper on the alternative decision rule of economic evaluation in the presence of uncertainty and constrained resources within the context of a portfolio of health care programs (Sendi et al. Soc. Sci. Med. 57 (2003) 2207). They argue that by not adopting a formal portfolio theory approach we overlook the optimal solution. We show that these arguments stem from a fundamental misunderstanding of the alternative decision rule of economic evaluation. In particular, the portfolio theory approach advocated by Bridges and Terris is based on the same theoretical assumptions that the alternative decision rule set out to relax. Moreover, Bridges and Terris acknowledge that the proposed portfolio theory approach may not identify the optimal solution to resource allocation problems. Hence, it provides neither theoretical nor practical improvements to the proposed alternative decision rule.

  2. Optimal Rules to Adopt High Technology under Uncertainty

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiao-jun; LI Shi-min

    2006-01-01

    In research on choosing the optimal timing for bringing high technology products, especially IT products, to the market, most studies provide only the scope or infimum of the timing. In this paper, an optimal rule is adopted to guide the timing of bringing a high technology product to the market. The idea is illustrated through the theory of optimal stopping, and an approach is developed towards a theoretical framework for the timing decision. On this basis, a stochastic programming model is established, in which the objective function is the expected profit of adopting the high technology and the constraint is that the success probability exceeds a critical value α; the model is used to find the optimal timing for adopting a high technology product.

  3. ENDER: A Statistical Framework for Boosting Decision Rules

    NARCIS (Netherlands)

    Dembczynski, K.; Kotlowski, W.T.; Slowinski, R.

    2010-01-01

    Induction of decision rules plays an important role in machine learning. The main advantage of decision rules is their simplicity and human-interpretable form. Moreover, they are capable of modeling complex interactions between attributes. In this paper, we thoroughly analyze a learning algorithm, ca

  4. The optimum decision rules for the oddity task

    NARCIS (Netherlands)

    Versfeld, N.J.; Dai, H.; Green, D.M.

    1996-01-01

    This paper presents the optimum decision rule for an m-interval oddity task in which m-1 intervals contain the same signal and one is different or odd. The optimum decision rule depends on the degree of correlation among observations. The present approach unifies the different strategies that occur

  5. Rough set and rule-based multicriteria decision aiding

    Directory of Open Access Journals (Sweden)

    Roman Slowinski

    2012-08-01

    The aim of multicriteria decision aiding is to give the decision maker a recommendation concerning a set of objects evaluated from multiple points of view called criteria. Since a rational decision maker acts with respect to his/her value system, in order to recommend the most-preferred decision, one must identify the decision maker's preferences. In this paper, we focus on preference discovery from data concerning some past decisions of the decision maker. We consider the preference model in the form of a set of "if ..., then ..." decision rules discovered from the data by inductive learning. To structure the data prior to induction of rules, we use the Dominance-based Rough Set Approach (DRSA). DRSA is a methodology for reasoning about data, which handles ordinal evaluations of objects on the considered criteria and monotonic relationships between these evaluations and the decision. We review applications of DRSA to a large variety of multicriteria decision problems.

  6. Rule Extraction in Transient Stability Study Using Linear Decision Trees

    Institute of Scientific and Technical Information of China (English)

    SUN Hongbin; WANG Kang; ZHANG Boming; ZHAO Feng

    2011-01-01

    Traditional operation rules depend on human experience; they are relatively fixed and have difficulty fulfilling the new demands of the modern power grid. In order to formulate suitable and quickly refreshed operation rules, a method of linear decision trees based on support samples is proposed for rule extraction in this paper. The operation rules extracted by this method have the advantages of refinement and intelligence, which helps the dispatching center meet the requirements of smart grid construction.

  7. Optimal decision fusion and its application on 3D face recognition

    NARCIS (Netherlands)

    Tao, Qian; Rootseler, van Robin; Veldhuis, Raymond; Gehlen, Stefan; Weber, Frank; Bromme, A.; Busch, C.; Huhnlein, D.

    2007-01-01

    Fusion is a popular practice to combine multiple classifiers or multiple modalities in biometrics. In this paper, optimal decision fusion (ODF) by AND rule and OR rule is presented. We show that the decision fusion can be done in an optimal way such that it always gives an improvement in terms of er

  8. Brainstorming, Brainstorming Rules and Decision Making

    Science.gov (United States)

    Putman, Vicky L.; Paulus, Paul B.

    2009-01-01

    This study investigated the effect of brainstorming experience on the ability of groups to subsequently select the best ideas for implementation. Participants generated ideas either alone or in interactive groups and with either the regular brainstorming rules or with additional rules designed to further increase the number of ideas generated. All…

  10. Optimal decisions principles of programming

    CERN Document Server

    Lange, Oskar

    1971-01-01

    Optimal Decisions: Principles of Programming deals with all important problems related to programming.This book provides a general interpretation of the theory of programming based on the application of the Lagrange multipliers, followed by a presentation of the marginal and linear programming as special cases of this general theory. The praxeological interpretation of the method of Lagrange multipliers is also discussed.This text covers the Koopmans' model of transportation, geometric interpretation of the programming problem, and nature of activity analysis. The solution of t

  11. Reducing Inconsistent Rules Based on Irregular Decision Table

    Institute of Scientific and Technical Information of China (English)

    兰轶东; 张霖; 刘连臣

    2004-01-01

    In this paper, we study the problem of rule extraction from data sets using the rough set method. For inconsistent rules due to improper selection of split-points during discretization, and/or to lack of information, we propose two methods to remove their inconsistency based on irregular decision tables. By using these methods, inconsistent rules are eliminated as far as possible without affecting the remaining consistent rules. Experimental tests indicate that use of the new method leads to an improvement in the mean accuracy of the extracted rules.

  12. Online learning algorithm for ensemble of decision rules

    KAUST Repository

    Chikalov, Igor

    2011-01-01

    We describe an online learning algorithm that builds a system of decision rules for a classification problem. Rules are constructed according to the minimum description length principle by a greedy algorithm or using the dynamic programming approach. © 2011 Springer-Verlag.

  13. Real Life Decision Optimization Model

    OpenAIRE

    Raju, Naga; Reddy, Diwakar; Reddy, Rajeswara; Krishnaiah, G

    2016-01-01

    In real-life scientific and engineering problems, decision making is common practice. Decision making may involve a single decision maker or a group of decision makers. Decision makers' expressions consist of imprecise, inconsistent and indeterminate information. Also, the decision maker cannot select the best solution in a unidirectional (single-goal) way. Therefore, the proposed model adopts decision makers' opinions as Neutrosophic Values (SVNS/INV), which effectively deal with imprecise, inconsistent and indet...

  14. Measuring acceptability of clinical decision rules: validation of the Ottawa acceptability of decision rules instrument (OADRI) in four countries.

    Science.gov (United States)

    Brehaut, Jamie C; Graham, Ian D; Wood, Timothy J; Taljaard, Monica; Eagles, Debra; Lott, Alison; Clement, Catherine; Kelly, Anne-Maree; Mason, Suzanne; Stiell, Ian G

    2010-01-01

    Clinical decision rules can benefit clinicians, patients, and health systems, but they involve considerable up-front development costs and must be acceptable to the target audience. No existing instrument measures the acceptability of a rule. The current study validated such an instrument. The authors administered the Ottawa Acceptability of Decision Rules Instrument (OADRI) via postal survey to emergency physicians from 4 regions (Australasia, Canada, United Kingdom, and United States), in the context of 2 recently developed rules, the Canadian C-Spine Rule (C-Spine) and the Canadian CT Head Rule (CT-Head). Construct validity of the 12-item instrument was evaluated by hypothesis testing. As predicted by a priori hypotheses, OADRI scores were 1) higher among rule users than nonusers, 2) higher among those using the rule "all of the time" v. "most of the time" v. "some of the time," and 3) higher among rule nonusers who would consider using a rule v. those who would not. We also examined explicit reasons given by respondents who said they would not use these rules. Items in the OADRI accounted for 85.5% (C-Spine) and 90.2% (CT-Head) of the reasons given for not considering a rule acceptable. The OADRI is a simple, 12-item instrument that evaluates rule acceptability among clinicians. Potential uses include comparing multiple "protorules" during development, examining acceptability of a rule to a new audience prior to implementation, indicating barriers to rule use addressable by knowledge translation interventions, and potentially serving as a proxy measure for future rule use.

  15. Totally optimal decision trees for Boolean functions

    KAUST Repository

    Chikalov, Igor

    2016-07-28

    We study decision trees which are totally optimal relative to different sets of complexity parameters for Boolean functions. A totally optimal tree is an optimal tree relative to each parameter from the set simultaneously. We consider the parameters characterizing both time (in the worst- and average-case) and space complexity of decision trees, i.e., depth, total path length (average depth), and number of nodes. We have created tools based on extensions of dynamic programming to study totally optimal trees. These tools are applicable to both exact and approximate decision trees, and allow us to make multi-stage optimization of decision trees relative to different parameters and to count the number of optimal trees. Based on the experimental results we have formulated the following hypotheses (and subsequently proved): for almost all Boolean functions there exist totally optimal decision trees (i) relative to the depth and number of nodes, and (ii) relative to the depth and average depth.

  16. Totally Optimal Decision Trees for Monotone Boolean Functions with at Most Five Variables

    KAUST Repository

    Chikalov, Igor

    2013-01-01

    In this paper, we present the empirical results for relationships between time (depth) and space (number of nodes) complexity of decision trees computing monotone Boolean functions, with at most five variables. We use Dagger (a tool for optimization of decision trees and decision rules) to conduct experiments. We show that, for each monotone Boolean function with at most five variables, there exists a totally optimal decision tree which is optimal with respect to both depth and number of nodes.

  17. Optimizing Mining Association Rules for Artificial Immune System based Classification

    Directory of Open Access Journals (Sweden)

    SAMEER DIXIT

    2011-08-01

    The primary function of a biological immune system is to protect the body from foreign molecules known as antigens. It has great pattern recognition capability that may be used to distinguish between foreign cells entering the body (non-self or antigen) and the body's cells (self). Immune systems have many characteristics such as uniqueness, autonomy, recognition of foreigners, distributed detection, and noise tolerance. Inspired by biological immune systems, Artificial Immune Systems have emerged during the last decade. Many researchers use them to design and build immune-based models for a variety of application domains. Artificial immune systems can be defined as a computational paradigm that is inspired by theoretical immunology, observed immune functions, principles and mechanisms. Association rule mining is one of the most important and well researched techniques of data mining. The goal of association rules is to extract interesting correlations, frequent patterns, associations or causal structures among sets of items in transaction databases or other data repositories. Association rules are widely used in various areas such as inventory control, telecommunication networks, intelligent decision making, market analysis and risk management. Apriori is the most widely used algorithm for mining association rules. Other popular association rule mining algorithms are frequent pattern (FP) growth, Eclat, dynamic itemset counting (DIC), etc. Associative classification uses association rule mining in the rule discovery process to predict the class labels of the data. This technique has shown great promise over many other classification techniques. Associative classification also integrates the process of rule discovery and classification to build a classifier for the purpose of prediction. The main problem with the associative classification approach is the discovery of high-quality association rules in a very large space of
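
    A minimal Python sketch of the support/confidence association-rule step mentioned above, on a toy transaction set (not the associative-classification system itself); the transactions and thresholds are illustrative assumptions.

      from itertools import combinations

      transactions = [
          {"bread", "milk"},
          {"bread", "diapers", "beer"},
          {"milk", "diapers", "beer"},
          {"bread", "milk", "diapers"},
          {"bread", "milk", "beer"},
      ]
      MIN_SUPPORT, MIN_CONFIDENCE = 0.4, 0.6

      def support(itemset):
          return sum(itemset <= t for t in transactions) / len(transactions)

      items = sorted(set().union(*transactions))
      # Frequent itemsets of size 1 and 2 (enough for rules with a single-item head).
      frequent = [frozenset(c) for k in (1, 2) for c in combinations(items, k)
                  if support(frozenset(c)) >= MIN_SUPPORT]

      for itemset in (s for s in frequent if len(s) == 2):
          for lhs in itemset:
              confidence = support(itemset) / support(frozenset([lhs]))
              if confidence >= MIN_CONFIDENCE:
                  rhs = next(iter(itemset - {lhs}))
                  print(f"{lhs} -> {rhs} "
                        f"(support={support(itemset):.2f}, confidence={confidence:.2f})")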

  18. The optimum decision rules for the oddity task.

    Science.gov (United States)

    Versfeld, N J; Dai, H; Green, D M

    1996-01-01

    This paper presents the optimum decision rule for an m-interval oddity task in which m-1 intervals contain the same signal and one is different or odd. The optimum decision rule depends on the degree of correlation among observations. The present approach unifies the different strategies that occur with "roved" or "fixed" experiments (Macmillan & Creelman, 1991, p. 147). It is shown that the commonly used decision rule for an m-interval oddity task corresponds to the special case of highly correlated observations. However, as is also true for the same-different paradigm, there exists a different optimum decision rule when the observations are independent. The relation between the probability of a correct response and d' is derived for the three-interval oddity task. Tables are presented of this relation for the three-, four-, and five-interval oddity task. Finally, an experimental method is proposed that allows one to determine the decision rule used by the observer in an oddity experiment.
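
    A minimal Python Monte Carlo sketch of the three-interval oddity task under the commonly used decision rule (choose the observation farthest from the mean of the three), which the paper identifies as optimal only for highly correlated observations; the d' values are illustrative.

      import random

      def percent_correct(d_prime, trials=20000, rng=random.Random(0)):
          correct = 0
          for _ in range(trials):
              odd = rng.randrange(3)  # position of the odd interval
              obs = [rng.gauss(d_prime if i == odd else 0.0, 1.0) for i in range(3)]
              mean = sum(obs) / 3.0
              choice = max(range(3), key=lambda i: abs(obs[i] - mean))
              correct += (choice == odd)
          return correct / trials

      for d in (0.5, 1.0, 2.0, 3.0):
          print(f"d' = {d}: P(correct) ~ {percent_correct(d):.3f}")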

  19. Greedy Algorithm for the Construction of Approximate Decision Rules for Decision Tables with Many-Valued Decisions

    KAUST Repository

    Azad, Mohammad

    2016-10-20

    The paper is devoted to the study of a greedy algorithm for construction of approximate decision rules. This algorithm is applicable to decision tables with many-valued decisions where each row is labeled with a set of decisions. For a given row, we should find a decision from the set attached to this row. We consider bounds on the precision of this algorithm relative to the length of rules. To illustrate the proposed approach, we study a problem of recognition of labels of points in the plane. This paper also contains results of experiments with modified decision tables from the UCI Machine Learning Repository.
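
    The sketch below illustrates one plausible greedy scheme of this kind for a toy decision table with many-valued decisions: a decision is chosen from the set attached to the given row, and attribute conditions are added greedily until at most a chosen fraction of the conflicting rows remain covered. The table, the parameter names and the exact heuristic are assumptions for illustration, not the algorithm analyzed in the paper.

      def greedy_rule(table, decisions, row_idx, alpha=0.0):
          # table[i] is a tuple of attribute values, decisions[i] the set of decisions of row i.
          row = table[row_idx]
          # Pick the decision from this row's set that conflicts with the fewest rows.
          d = min(decisions[row_idx],
                  key=lambda c: sum(1 for s in decisions if c not in s))
          bad = {i for i, s in enumerate(decisions) if d not in s}   # rows to separate from
          budget = alpha * len(bad)                                  # approximation budget
          conditions = []
          while len(bad) > budget:
              # Greedily pick the attribute whose condition removes the most conflicting rows.
              best = max(range(len(row)),
                         key=lambda a: sum(1 for i in bad if table[i][a] != row[a]))
              conditions.append((best, row[best]))
              bad = {i for i in bad if table[i][best] == row[best]}
          return conditions, d

      table = [(0, 1, 0), (1, 1, 0), (0, 0, 1), (1, 0, 1)]
      decisions = [{1}, {1, 2}, {2}, {2, 3}]
      print(greedy_rule(table, decisions, row_idx=0))   # ([(1, 1)], 1): if attr1 = 1 then decision 1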

  20. Thin Capitalization Rules and Entrepreneurial Capital Structure Decisions

    Directory of Open Access Journals (Sweden)

    Alexandra Maßbaum

    2009-12-01

    Full Text Available Tax planners often choose debt over equity financing. As this has led to increased corporate debt financing, many countries have introduced thin capitalization rules to secure their tax revenues. In a general capital structure model we analyze if thin capitalization rules affect dividend and financing decisions, and whether they can partially explain why corporations receive both debt and equity capital. We model the Belgian, German and Italian rules as examples. We find that the so-called Miller equilibrium and definite financing effects depend significantly on the underlying tax system. Further, our results are useful for the treasury to decide what thin capitalization type to implement.

  1. Associative Regressive Decision Rule Mining for Predicting Customer Satisfactory Patterns

    Directory of Open Access Journals (Sweden)

    P. Suresh

    2016-04-01

    Full Text Available Opinion mining, also known as sentiment analysis, involves customer satisfaction patterns, sentiments and attitudes toward entities, products, services and their attributes. With the rapid development of the Internet, potential customers provide a satisfactory level of product/service reviews. A high volume of customer reviews was developed for products/services through taxonomy-aware processing, but it was difficult to identify the best reviews. In this paper, an Associative Regression Decision Rule Mining (ARDRM) technique is developed to predict the pattern for the service provider and to improve customer satisfaction based on the review comments. Associative Regression based Decision Rule Mining performs two steps for improving the customer satisfaction level. Initially, the Machine Learning Bayes Sentiment Classifier (MLBSC) is used to classify the class labels for each service review. After that, the regressive factor of the opinion words and the class labels were checked for association between the words by using various probabilistic rules. Based on the probabilistic rules, the effect of opinions and sentiments on customer reviews is analyzed to arrive at the specific set of services preferred by the customers with their review comments. The Associative Regressive Decision Rule helps the service provider to take decisions on improving the customer satisfaction level. The experimental results reveal that the Associative Regression Decision Rule Mining (ARDRM) technique improved the performance in terms of true positive rate, associative regression factor, regressive decision rule generation time and review detection accuracy of similar patterns.

  2. The Optimization Strategy of Public Security Decision-making Procedure from the Perspective of the Rule of Law

    Institute of Scientific and Technical Information of China (English)

    王虹

    2015-01-01

    As one of the main links of administrative management, decision-making has a significant impact on the construction and management level of the public security force and on the performance of its duties. With the comprehensive advancement of the basic strategy of ruling the country by law, public security departments have paid greater attention to the legalization of decision-making and have strengthened the legality and scientific quality of decision procedures in the decision-making process. Drawing on the idea of the rule of law, this paper outlines the legal content of public security decision-making procedures and analyzes the problems existing in China's current procedures, such as the lack of field investigation and quantitative analysis before a decision, the lack of procedural evaluation and public participation during decision-making, and the lack of supervision of power and protection of human rights after a decision. In light of the legalization trend of China's public security organs, it proposes suggestions such as changing the leadership's decision-making style, drawing on advanced decision theory, improving the decision supervision system and strengthening awareness of procedural law, so as to raise the degree of legalization of the decision-making procedures of public security organs.

  3. Relationships among various parameters for decision tree optimization

    KAUST Repository

    Hussain, Shahid

    2014-01-14

    In this chapter, we study, in detail, the relationships between various pairs of cost functions and between uncertainty measures and cost functions for decision tree optimization. We provide new tools (algorithms) to compute relationship functions, as well as experimental results on decision tables acquired from the UCI ML Repository. The algorithms presented in this chapter have already been implemented and are now a part of Dagger, which is a software system for construction/optimization of decision trees and decision rules. The main results presented in this chapter deal with two types of algorithms for computing relationships. First, we discuss the case where we construct approximate decision trees and are interested in relationships between a certain cost function, such as the depth or the number of nodes of a decision tree, and an uncertainty measure, such as the misclassification error (accuracy) of the decision tree. Secondly, relationships between two different cost functions are discussed, for example, the number of misclassifications of a decision tree versus the number of nodes in the tree. The results of experiments, presented in the chapter, provide further insight. © 2014 Springer International Publishing Switzerland.

  4. A detailed comparison of optimality and simplicity in perceptual decision making.

    Science.gov (United States)

    Shen, Shan; Ma, Wei Ji

    2016-07-01

    Two prominent ideas in the study of decision making have been that organisms behave near-optimally, and that they use simple heuristic rules. These principles might be operating in different types of tasks, but this possibility cannot be fully investigated without a direct, rigorous comparison within a single task. Such a comparison was lacking in most previous studies, because (a) the optimal decision rule was simple, (b) no simple suboptimal rules were considered, (c) it was unclear what was optimal, or (d) a simple rule could closely approximate the optimal rule. Here, we used a perceptual decision-making task in which the optimal decision rule is well-defined and complex, and makes qualitatively distinct predictions from many simple suboptimal rules. We find that all simple rules tested fail to describe human behavior, that the optimal rule accounts well for the data, and that several complex suboptimal rules are indistinguishable from the optimal one. Moreover, we found evidence that the optimal model is close to the true model: First, the better the trial-to-trial predictions of a suboptimal model agree with those of the optimal model, the better that suboptimal model fits; second, our estimate of the Kullback-Leibler divergence between the optimal model and the true model is not significantly different from zero. When observers receive no feedback, the optimal model still describes behavior best, suggesting that sensory uncertainty is implicitly represented and taken into account. Beyond the task and models studied here, our results have implications for best practices of model comparison.

  5. Portable Rule Extraction Method for Neural Network Decisions Reasoning

    Directory of Open Access Journals (Sweden)

    Darius PLIKYNAS

    2005-08-01

    Full Text Available Neural network (NN) methods are sometimes useless in practical applications, because they are not properly tailored to the particular market's needs. We focus hereinafter specifically on financial market applications. NNs have not gained full acceptance here yet. One of the main reasons is the "Black Box" problem (lack of explanatory power for the NN's decisions). There are some NN decision rule extraction methods, such as decompositional, pedagogical or eclectic ones, but they suffer from low portability of the rule extraction technique across various neural net architectures, a high level of granularity, algorithmic sophistication of the rule extraction technique, etc. The authors propose to eliminate some known drawbacks using an innovative extension of the pedagogical approach. The idea is exposed by the use of a widespread MLP neural net (as a common tool in the financial problems' domain) and SOM (for input data space clusterization). The performance feedback of both nets is related and targeted through the iteration cycle by achieving the best matching between the decision space fragments and input data space clusters. Three sets of rules are generated algorithmically or by fuzzy membership functions. Empirical validation on common financial benchmark problems is conducted with an appropriately prepared software solution.
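
    A compressed sketch of the pedagogical idea the abstract builds on: treat the trained network as a black box and fit an interpretable rule learner (here a shallow decision tree) to its input/output behaviour. The dataset, architecture and hyperparameters are placeholders, not the authors' MLP/SOM setup.

      from sklearn.datasets import make_classification
      from sklearn.neural_network import MLPClassifier
      from sklearn.tree import DecisionTreeClassifier, export_text

      X, y = make_classification(n_samples=2000, n_features=6, random_state=0)
      net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0).fit(X, y)

      # Query the black box and learn rules that mimic its decisions.
      surrogate = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, net.predict(X))

      print("fidelity to the net:", surrogate.score(X, net.predict(X)))
      print(export_text(surrogate, feature_names=[f"x{i}" for i in range(6)]))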

  6. Comparison of some classification algorithms based on deterministic and nondeterministic decision rules

    KAUST Repository

    Delimata, Paweł

    2010-01-01

    We discuss two, in a sense extreme, kinds of nondeterministic rules in decision tables. The first kind of rules, called inhibitory rules, block only one decision value (i.e., they have all but one of the possible decisions on their right-hand sides). Contrary to this, any rule of the second kind, called a bounded nondeterministic rule, can have only a few decisions on its right-hand side. We show that both kinds of rules can be used for improving the quality of classification. In the paper, two lazy classification algorithms of polynomial time complexity are considered. These algorithms are based on deterministic and inhibitory decision rules, but the direct generation of rules is not required. Instead, for any new object the considered algorithms efficiently extract from a given decision table some information about the set of rules. Next, this information is used by a decision-making procedure. The reported results of experiments show that the algorithms based on inhibitory decision rules are often better than those based on deterministic decision rules. We also present an application of bounded nondeterministic rules in the construction of rule based classifiers. We include the results of experiments showing that by combining rule based classifiers based on minimal decision rules with bounded nondeterministic rules having confidence close to 1 and sufficiently large support, it is possible to improve the classification quality. © 2010 Springer-Verlag.

  7. Sensitivity of a Clinical Decision Rule and Early Computed Tomography in Aneurysmal Subarachnoid Hemorrhage

    Directory of Open Access Journals (Sweden)

    Dustin G. Mark

    2015-10-01

    Full Text Available Introduction: Application of a clinical decision rule for subarachnoid hemorrhage, in combination with cranial computed tomography (CT) performed within six hours of ictus (early cranial CT), may be able to reasonably exclude a diagnosis of aneurysmal subarachnoid hemorrhage (aSAH). This study's objective was to examine the sensitivity of both early cranial CT and a previously validated clinical decision rule among emergency department (ED) patients with aSAH and a normal mental status. Methods: Patients were evaluated in the 21 EDs of an integrated health delivery system between January 2007 and June 2013. We identified by chart review a retrospective cohort of patients diagnosed with aSAH in the setting of a normal mental status and performance of early cranial CT. Variables comprising the SAH clinical decision rule (age >40, presence of neck pain or stiffness, headache onset with exertion, loss of consciousness at headache onset) were abstracted from the chart and assessed for inter-rater reliability. Results: One hundred fifty-five patients with aSAH met study inclusion criteria. The sensitivity of early cranial CT was 95.5% (95% CI [90.9-98.2]). The sensitivity of the SAH clinical decision rule was also 95.5% (95% CI [90.9-98.2]). Since all false negative cases for each diagnostic modality were mutually independent, the combined use of both early cranial CT and the clinical decision rule improved sensitivity to 100% (95% CI [97.6-100.0]). Conclusion: Neither early cranial CT nor the SAH clinical decision rule demonstrated ideal sensitivity for aSAH in this retrospective cohort. However, the combination of both strategies might optimize sensitivity for this life-threatening disease.

  8. Decision Rules, Trees and Tests for Tables with Many-valued Decisions–comparative Study

    KAUST Repository

    Azad, Mohammad

    2013-10-04

    In this paper, we present three approaches for construction of decision rules for decision tables with many-valued decisions. We construct decision rules directly for rows of the decision table, based on paths in a decision tree, and based on attributes contained in a test (super-reduct). Experimental results for data sets taken from the UCI Machine Learning Repository contain a comparison of the maximum and the average length of rules for the mentioned approaches.

  9. Acceleration of association‐rule based Markov decision processes

    Directory of Open Access Journals (Sweden)

    Ma. de G. García‐Hernández

    2009-12-01

    Full Text Available In this paper, we present a new approach for the estimation of Markov decision processes based on efficient association rule mining techniques such as Apriori. For the fastest solution of the resulting association‐rule based Markov decision process, several accelerating procedures such as asynchronous updates and prioritization using a static ordering have been applied. A new criterion for state reordering in decreasing order of maximum reward is also compared with a modified topological reordering algorithm. Experimental results obtained on a finite state and action‐space stochastic shortest path problem demonstrate the feasibility of the new approach.
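
    To make the acceleration idea concrete, the sketch below runs asynchronous value iteration with a priority queue on a tiny, made-up stochastic shortest-path MDP: states with the largest Bellman residual are backed up first, and predecessors are re-queued when a value changes. It is a generic illustration of prioritized sweeping, not the authors' association-rule-based estimator.

      import heapq

      # transitions[s][a] = list of (probability, next_state, cost); state 3 is the goal.
      transitions = {
          0: {"a": [(0.8, 1, 1.0), (0.2, 0, 1.0)], "b": [(1.0, 2, 2.0)]},
          1: {"a": [(1.0, 3, 1.0)], "b": [(1.0, 0, 0.5)]},
          2: {"a": [(1.0, 3, 1.0)]},
          3: {},  # absorbing goal state with zero cost-to-go
      }

      def q(V, s, a):
          return sum(p * (c + V[s2]) for p, s2, c in transitions[s][a])

      def prioritized_value_iteration(eps=1e-6):
          V = {s: 0.0 for s in transitions}
          residual = lambda s: abs(min(q(V, s, a) for a in transitions[s]) - V[s])
          heap = [(-residual(s), s) for s, acts in transitions.items() if acts]
          heapq.heapify(heap)
          while heap:
              _, s = heapq.heappop(heap)
              new_v = min(q(V, s, a) for a in transitions[s])
              if abs(new_v - V[s]) < eps:
                  continue
              V[s] = new_v
              # Re-prioritize every state that can transition into s.
              for s2, acts in transitions.items():
                  if acts and any(ns == s for a in acts for _, ns, _ in acts[a]):
                      if residual(s2) > eps:
                          heapq.heappush(heap, (-residual(s2), s2))
          return V

      print(prioritized_value_iteration())   # roughly {0: 2.25, 1: 1.0, 2: 1.0, 3: 0.0}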

  10. Detection of Attacks on MAODV Association Rule Mining Optimization

    Directory of Open Access Journals (Sweden)

    A. Fidalcastro

    2015-02-01

    Full Text Available Current mining algorithms can generate a large number of rules, can be very slow in generating rules, or can generate few results, omitting interesting and valuable information. To address this problem, we propose the Optimized Featured Top Association Rules (OFTAR) algorithm, where every attack has many features and some of the features are more important than others. The features are selected by a genetic algorithm and processed by the OFTAR algorithm to find the optimized rules. The algorithm utilizes a genetic algorithm feature selection approach to find optimized features. OFTAR incorporates association rules with several rule optimization and expansion techniques to improve efficiency. The increasing popularity of mobile ad hoc networks (MANETs) among users of wireless networks leads to threats and attacks on them, due to their features. The main challenge in designing a MANET is protecting it from various attacks in the network. An Intrusion Detection System is required to monitor the network and to detect malicious nodes in a multicasting mobility environment. The node features are processed by association analysis to generate rules, and the generated rules are applied to nodes to detect the attacks. Experimental results show that the algorithm has higher scalability and good performance, which is advantageous compared to several association rule mining algorithms, when rule generation is controlled and optimized to detect the attacks.

  11. Learning Dispatching Rules for Scheduling: A Synergistic View Comprising Decision Trees, Tabu Search and Simulation

    Directory of Open Access Journals (Sweden)

    Atif Shahzad

    2016-02-01

    Full Text Available A promising approach for effective shop scheduling that synergizes the benefits of combinatorial optimization, supervised learning and discrete-event simulation is presented. Though dispatching rules are widely used by shop scheduling practitioners, only rules of ordinary performance are known; hence, dynamic generation of dispatching rules is desired to make them more effective under changing shop conditions. Meta-heuristics are able to perform quite well and carry more knowledge of the problem domain, however at the cost of prohibitive computational effort in real time. The primary purpose of this research lies in an offline extraction of this domain knowledge using decision trees to generate simple if-then rules that subsequently act as dispatching rules for scheduling in an online manner. We used a similarity index to identify parametric and structural similarity in problem instances in order to implicitly support the learning algorithm for effective rule generation, and a quality index for relative ranking of the dispatching decisions. Maximum lateness is used as the scheduling objective in a job shop scheduling environment.

  12. Sequential optimization of approximate inhibitory rules relative to the length, coverage and number of misclassifications

    KAUST Repository

    Alsolami, Fawaz

    2013-01-01

    This paper is devoted to the study of algorithms for sequential optimization of approximate inhibitory rules relative to the length, coverage and number of misclassifications. These algorithms are based on extensions of the dynamic programming approach. The results of experiments for decision tables from the UCI Machine Learning Repository are discussed. © 2013 Springer-Verlag.

  13. The Use of a Modification of the Hurwicz’s Decision Rule in Multicriteria Decision Making under Complete Uncertainty

    Directory of Open Access Journals (Sweden)

    Helena Gaspars-Wieloch

    2014-12-01

    Full Text Available The paper concerns multicriteria decision making under uncertainty with scenario planning. This topic is explored by many researchers because almost all real-world decision problems have multiple conflicting criteria and a deterministic criteria evaluation is often impossible (e.g. mergers and acquisitions, new product development). We propose two procedures for uncertain multi-objective optimization (for dependent and independent criteria matrices), which are based on the SAPO method – a modification of Hurwicz's rule for one-criterion problems, recently presented in another paper. The new approaches take into account the decision maker's preference structure and attitude towards risk. They consider the frequency and the level of extreme evaluations and generate logical rankings for symmetric and asymmetric distributions. The application of the suggested tool is illustrated with an example of marketing strategy selection.
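
    For reference, the classic one-criterion Hurwicz rule that SAPO modifies weights the best and worst payoff of each alternative by an optimism coefficient. A tiny sketch with a made-up payoff matrix:

      # Rows are decision alternatives, columns are scenarios; payoffs and alpha are illustrative.
      payoffs = {
          "strategy A": [40, 10, 70],
          "strategy B": [55, 30, 35],
          "strategy C": [20, 25, 90],
      }
      alpha = 0.6  # degree of optimism in [0, 1]

      def hurwicz(row, alpha):
          return alpha * max(row) + (1 - alpha) * min(row)

      for a, row in payoffs.items():
          print(a, round(hurwicz(row, alpha), 2))
      print("chosen alternative:", max(payoffs, key=lambda a: hurwicz(payoffs[a], alpha)))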

  14. Enterprise resource planning implementation decision & optimization models

    Institute of Scientific and Technical Information of China (English)

    Wang Shaojun; Wang Gang; Lü Min; Gao Guoan

    2008-01-01

    To study the uncertain optimization problems of implementation schedule, time-cost trade-off and quality in enterprise resource planning (ERP) implementation, combined with the program evaluation and review technique (PERT), some optimization models are proposed, which include the implementation schedule model, the time-cost trade-off model, the quality model, and the implementation time-cost-quality synthetic optimization model. A PERT-embedded genetic algorithm (GA) based on a stochastic simulation technique is introduced for solving the optimization models. Finally, an example is presented to show that the models and algorithm are reasonable and effective, and can offer a reliable quantitative decision method for ERP implementation.

  15. Free Energy and the Generalized Optimality Equations for Sequential Decision Making

    CERN Document Server

    Ortega, Pedro A

    2012-01-01

    The free energy functional has recently been proposed as a variational principle for bounded rational decision-making, since it instantiates a natural trade-off between utility gains and information processing costs that can be axiomatically derived. Here we apply the free energy principle to general decision trees that include both adversarial and stochastic environments. We derive generalized sequential optimality equations that not only include the Bellman optimality equations as a limit case, but also lead to well-known decision-rules such as Expectimax, Minimax and Expectiminimax. We show how these decision-rules can be derived from a single free energy principle that assigns a resource parameter to each node in the decision tree. These resource parameters express a concrete computational cost that can be measured as the amount of samples that are needed from the distribution that belongs to each node. The free energy principle therefore provides the normative basis for generalized optimality equations t...
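
    In simplified notation (ours, not necessarily the authors'), the log-sum-exp recursion behind such generalized optimality equations can be written as follows, with a resource parameter β attached to each node of the decision tree:

      V_\beta(x) \;=\; \frac{1}{\beta}\,\log \sum_{x'} p(x' \mid x)\,
                       \exp\!\bigl(\beta\,[\,r(x,x') + V_\beta(x')\,]\bigr)

    Taking β → +∞ at a node recovers maximization over successors (Bellman/Expectimax at decision nodes), β → −∞ recovers minimization (Minimax, adversarial nodes), and β → 0 recovers the plain expectation of chance nodes, so interleaving node-wise parameters yields Expectiminimax-type recursions.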

  16. Threshold-optimized decision-level fusion and its application to biometrics

    NARCIS (Netherlands)

    Tao, Q.; Veldhuis, R.N.J.

    2009-01-01

    Fusion is a popular practice to increase the reliability of biometric verification. In this paper, we propose an optimal fusion scheme at decision level by the AND or OR rule, based on optimizing matching score thresholds. The proposed fusion scheme will always give an improvement in the Neyman–Pear

  17. Optimal Rules of Negligent Misrepresentation in Insurance Law

    DEFF Research Database (Denmark)

    Lando, Henrik

    This article analyzes rules for negligent misrepresentation in insurance contract law. Before contract signature, the applicant can be asked by the insurer to fill in a questionnaire concerning the risk, and may then omit or make untrue statements about facts. Such misrepresentation is considered negligent by the court when it is unclear whether the misrepresentation was due to a mistake or was intentional. Rules of negligent misrepresentation differ significantly across jurisdictions. For example, the rule of common law allows the insurer to rescind the contract, whereas the German rule does not allow ... such as a deductible. On the other hand, a strict rule exposes the insured, who may have committed a mistake, to risk. In this trade-off, the optimal rule depends, among other factors, on the cost for the insurer of auditing types when claims are presented, and on whether the insurer can commit to an auditing strategy ...

  18. Rough set based decision rule generation to find behavioural patterns of customers

    Indian Academy of Sciences (India)

    L SUMALATHA; P UMA SANKAR; B SUJATHA

    2016-09-01

    Rough sets help in finding significant attributes of large data sets and generating decision rules for classifying new instances. Though multiple regression analysis, discriminant analysis, logit analysis and several other techniques can be used for predicting results, they also consider insignificant information for processing, which may lead to false positives and false negatives. In this study, we propose a rough set based decision rule generation framework to find reducts and to generate decision rules for predicting the decision class. We conducted experiments on data from a Portuguese banking institution. With the proposed method, the dimensionality of the data is reduced and decision rules are generated which predict the deposit nature of customers with 90% accuracy.

  19. Effective Network Intrusion Detection using Classifiers Decision Trees and Decision rules

    Directory of Open Access Journals (Sweden)

    G.MeeraGandhi

    2010-11-01

    Full Text Available In the era of the information society, computer networks and their related applications are emerging technologies. Network intrusion detection aims at distinguishing the behavior of the network. As network attacks have increased in huge numbers over the past few years, the Intrusion Detection System (IDS) is increasingly becoming a critical component to secure the network. Owing to large volumes of security audit data in a network, in addition to the intricate and dynamic properties of intrusion behaviors, optimizing the performance of IDS becomes an important open problem which receives more and more attention from the research community. The field of machine learning attempts to characterize how such changes can occur by designing, implementing, running, and analyzing algorithms that can be run on computers. The discipline draws on ideas with the goal of understanding the computational character of learning. Learning always occurs in the context of some performance task, and a learning method should always be coupled with a performance element that uses the knowledge acquired during learning. In this research, machine learning is investigated as a technique for making the selection, using training data and their outcomes. In this paper, we evaluate the performance of a set of classifier algorithms of rules (JRip, DecisionTable, PART, and OneR) and trees (J48, RandomForest, REPTree, NBTree). Based on the evaluation results, the best algorithms for each attack category are chosen and two classifier algorithm selection models are proposed. The empirical simulation results show the comparison of the noticeable performance improvements. The classification models were trained using data collected from the Knowledge Discovery in Databases (KDD) dataset for intrusion detection. The trained models were then used for predicting the risk of attacks in a web server environment or by any network administrator or security expert. The

  20. Algorithms for optimal dyadic decision trees

    Energy Technology Data Exchange (ETDEWEB)

    Hush, Don [Los Alamos National Laboratory; Porter, Reid [Los Alamos National Laboratory

    2009-01-01

    A new algorithm for constructing optimal dyadic decision trees was recently introduced, analyzed, and shown to be very effective for low dimensional data sets. This paper enhances and extends this algorithm by: introducing an adaptive grid search for the regularization parameter that guarantees optimal solutions for all relevant tree sizes, revising the core tree-building algorithm so that its run time is substantially smaller for most regularization parameter values on the grid, and incorporating new data structures and data pre-processing steps that provide significant run time enhancement in practice.

  1. Optimized reaction mechanism rate rules for ignition of normal alkanes

    KAUST Repository

    Cai, Liming

    2016-08-11

    The increasing demand for cleaner combustion and reduced greenhouse gas emissions motivates research on the combustion of hydrocarbon fuels and their surrogates. Accurate detailed chemical kinetic models are an important prerequisite for high fidelity reacting flow simulations capable of improving combustor design and operation. The development of such models for many new fuel components and/or surrogate molecules is greatly facilitated by the application of reaction classes and rate rules. Accurate and versatile rate rules are desirable to improve the predictive accuracy of kinetic models. A major contribution in the literature is the recent work by Bugler et al. (2015), which has significantly improved rate rules and thermochemical parameters used in kinetic modeling of alkanes. In the present study, it is demonstrated that rate rules can be used and consistently optimized for a set of normal alkanes including n-heptane, n-octane, n-nonane, n-decane, and n-undecane, thereby improving the predictive accuracy for all the considered fuels. A Bayesian framework is applied in the calibration of the rate rules. The optimized rate rules are subsequently applied to generate a mechanism for n-dodecane, which was not part of the training set for the optimized rate rules. The developed mechanism shows accurate predictions compared with published well-validated mechanisms for a wide range of conditions.

  2. Optimizing Decisions in Web Services Orchestrations

    OpenAIRE

    Kattepur, Ajay; Benveniste, Albert; Jard, Claude

    2011-01-01

    Web services orchestrations conventionally employ exhaustive comparison of runtime quality of service (QoS) metrics for decision making. The ability to incorporate more complex mathematical packages is needed, especially in the case of workflows for resource allocation and queuing systems. By modeling such optimization routines as service calls within orchestration specifications, techniques such as linear programming can be conveniently invoked by non-specialist workflow...
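
    A minimal sketch of the idea of invoking an optimization routine as if it were just another service call inside an orchestration, here a resource-allocation decision solved by linear programming; the costs, capacities and demand are invented inputs, and SciPy simply stands in for whatever solver service the workflow would call.

      from scipy.optimize import linprog

      # Minimize 3*x1 + 5*x2 (cost per request routed to provider 1 / provider 2)
      # subject to x1 + x2 == 100 (all requests served), x1 <= 60, x2 <= 80.
      res = linprog(c=[3, 5], A_eq=[[1, 1]], b_eq=[100], bounds=[(0, 60), (0, 80)])
      print(res.x, res.fun)   # expected: 60 requests to provider 1, 40 to provider 2, cost 380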

  3. Optimization In Searching Daily Rule Curve At Mosul Regulating Reservoir, North Iraq Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Thair M. Al-Taiee

    2013-05-01

    Full Text Available To obtain optimal operating rules for storage reservoirs, large numbers of simulation and optimization models have been developed over the past several decades, which vary significantly in their mechanisms and applications. Rule curves are guidelines for long term reservoir operation. An efficient technique is required to find the optimal rule curves that can mitigate water shortage in long term operation. The Genetic Algorithm (GA) technique, an optimization approach based on the mechanics of natural selection derived from the theory of natural evolution, was investigated through its application to predicting the daily rule curve of the Mosul regulating reservoir in Iraq. Recorded daily inflows, outflows and water levels in the reservoir for 19 years (1986-1990 and 1994-2007) were used in the developed model for assessing the optimal reservoir operation. The objective function is set to minimize the annual sum of squared deviations from the desired downstream release and the desired storage volume in the reservoir. The decision variables are releases, storage volume, water level and outlet (demand) from the reservoir. The results of the GA model were in good agreement with the actual rule curve and the designed rating curve of the reservoir. The simulated results show that GA-derived policies are promising and competitive and can be effectively used for daily reservoir operation, in addition to rational monthly operation, and for predicting the rating curve of reservoirs.

  4. Time-Saving Approach for Optimal Mining of Association Rules

    Directory of Open Access Journals (Sweden)

    Mouhir Mohammed

    2016-10-01

    Full Text Available Data mining is the process of analyzing data so as to get useful information to be exploited by users. Association rule mining is one of the data mining techniques used to detect different correlations and to reveal relationships among individual data items in huge databases. These rules usually take the form: if X then Y, with X and Y as independent attributes. Association rules have become a popular technique used in several vital fields of activity such as insurance, medicine, banking, supermarkets… Association rules are generated in huge numbers by algorithms known as association rule mining algorithms. The generation of huge quantities of association rules may be time- and effort-consuming; this is the reason behind the urgent need for an efficient and scalable approach to mine only the relevant and significant association rules. This paper proposes an innovative approach which mines the optimal rules from a large set of association rules in a distributed processing way, to improve efficiency and to decrease running time.

  5. Extensions of Dynamic Programming: Decision Trees, Combinatorial Optimization, and Data Mining

    KAUST Repository

    Hussain, Shahid

    2016-07-10

    This thesis is devoted to the development of extensions of dynamic programming to the study of decision trees. The considered extensions allow us to make multi-stage optimization of decision trees relative to a sequence of cost functions, to count the number of optimal trees, and to study relationships: cost vs cost and cost vs uncertainty for decision trees by construction of the set of Pareto-optimal points for the corresponding bi-criteria optimization problem. The applications include study of totally optimal (simultaneously optimal relative to a number of cost functions) decision trees for Boolean functions, improvement of bounds on complexity of decision trees for diagnosis of circuits, study of time and memory trade-off for corner point detection, study of decision rules derived from decision trees, creation of new procedure (multi-pruning) for construction of classifiers, and comparison of heuristics for decision tree construction. Part of these extensions (multi-stage optimization) was generalized to well-known combinatorial optimization problems: matrix chain multiplication, binary search trees, global sequence alignment, and optimal paths in directed graphs.
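
    Matrix chain multiplication, one of the combinatorial problems named above, is a convenient small illustration of the kind of dynamic programming these extensions build on; the dimension list below is the standard textbook example, not data from the thesis.

      def matrix_chain_order(dims):
          # dims[i-1] x dims[i] is the shape of the i-th matrix.
          n = len(dims) - 1
          cost = [[0] * n for _ in range(n)]   # cost[i][j]: cheapest way to multiply A_i..A_j
          for length in range(2, n + 1):
              for i in range(n - length + 1):
                  j = i + length - 1
                  cost[i][j] = min(cost[i][k] + cost[k + 1][j]
                                   + dims[i] * dims[k + 1] * dims[j + 1]
                                   for k in range(i, j))
          return cost[0][n - 1]

      print(matrix_chain_order([30, 35, 15, 5, 10, 20, 25]))   # 15125 scalar multiplications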

  6. An Elite Decision Making Harmony Search Algorithm for Optimization Problem

    Directory of Open Access Journals (Sweden)

    Lipu Zhang

    2012-01-01

    Full Text Available This paper describes a new variant of the harmony search algorithm which is inspired by the well-known notion of "elite decision making." In the new algorithm, the good information captured in the current global best and second best solutions is utilized to generate new solutions, following some probability rule. The generated new solution vector replaces the worst solution in the solution set, only if its fitness is better than that of the worst solution. The generating and updating steps are repeated until a near-optimal solution vector is obtained. Extensive computational comparisons are carried out by employing various standard benchmark optimization problems, including minimization problems with continuous design variables and integer variables from the literature. The computational results show that the proposed new algorithm is competitive in finding solutions with the state-of-the-art harmony search variants.
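
    A compressed sketch of the mechanism the abstract describes, implemented directly from that description: with some probability a component is taken from the global best or second-best harmony, otherwise from the memory at large, with occasional pitch adjustment, and the worst member is replaced only when the new harmony is better. Parameter names and values are illustrative assumptions.

      import random

      def elite_harmony_search(f, bounds, memory_size=10, iters=2000,
                               p_elite=0.4, p_adjust=0.3, bw=0.05):
          memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(memory_size)]
          memory.sort(key=f)
          for _ in range(iters):
              best, second = memory[0], memory[1]
              new = []
              for j, (lo, hi) in enumerate(bounds):
                  if random.random() < p_elite:
                      v = random.choice((best[j], second[j]))   # "elite decision making"
                  else:
                      v = random.choice(memory)[j]              # ordinary memory consideration
                  if random.random() < p_adjust:                # small pitch adjustment
                      v += random.uniform(-bw, bw) * (hi - lo)
                  new.append(min(max(v, lo), hi))
              if f(new) < f(memory[-1]):                        # replace the worst if improved
                  memory[-1] = new
                  memory.sort(key=f)
          return memory[0]

      sphere = lambda x: sum(v * v for v in x)
      print(elite_harmony_search(sphere, bounds=[(-5.0, 5.0)] * 3))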

  7. Financial benefit of using crop protection decision rules over systematic spraying strategies.

    Science.gov (United States)

    Fabre, F; Plantegenest, M; Yuen, J

    2007-11-01

    Decision rule models are considered to be one of the main cornerstones of the implementation of integrated pest management (IPM) programs. Even though the need for such programs to offer cost advantages over conventional strategies is a major incentive for IPM adoption, few studies focus on this financial dimension. In this article, a modeling approach of the response of a pathosystem to a disease control method and of the predictive performance of decision rules is used to explore how some basic factors affect the likelihood of adoption of decision rule strategies (such as using an IPM system) over systematic strategies (such as systematic-spraying and never-spraying strategies). Even though the average cost of using the decision rule strategies is always lower than the average cost of systematic strategies in several different scenarios, the models developed here showed strong effects of different pathosystems and decision rules on financial benefits. The number of production situations where decision rules are of interest is highly correlated with their accuracy. However, because of the inescapable trade-offs between decision rule accuracy and limiting factors such as their user-friendliness, the use of decision rules is unlikely to reduce costs to <70% of the costs of systematic strategies. In more general terms, this study provides quantitative guidelines on the financial advantage that decision rules can offer in plant protection, as well as a better understanding of their potential usefulness.

  8. Emergent Linguistic Rules from Inducing Decision Trees Disambiguating Discourse Clue Words

    CERN Document Server

    Siegel, E V; Siegel, Eric V.; Keown, Kathleen R. Mc

    1994-01-01

    We apply decision tree induction to the problem of discourse clue word sense disambiguation with a genetic algorithm. The automatic partitioning of the training set which is intrinsic to decision tree induction gives rise to linguistically viable rules.

  9. An intelligent temporal pattern classification system using fuzzy temporal rules and particle swarm optimization

    Indian Academy of Sciences (India)

    S Ganapathy; R Sethukkarasi; P Yogesh; P Vijayakumar; A Kannan

    2014-04-01

    In this paper, we propose a new pattern classification system that combines temporal features with a Fuzzy Min–Max (TFMM) neural network based classifier for effective decision support in medical diagnosis. Moreover, a Particle Swarm Optimization (PSO) based rule extractor is also proposed in this work for improving the detection accuracy. Intelligent fuzzy rules are extracted from the temporal features with the Fuzzy Min–Max neural network based classifier, and then the PSO rule extractor is used to minimize the number of features in the extracted rules. We empirically evaluated the effectiveness of the proposed TFMM-PSO system using data sets from the UCI Machine Learning Repository. The results are analysed and compared with other published results. In addition, the detection accuracy is validated by using ten-fold cross validation.
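
    As a reminder of the underlying optimizer, here is a bare-bones PSO loop minimizing a toy objective; the real system would replace the sphere function with the paper's rule-compactness/accuracy criterion, and the swarm settings shown are assumptions, not those of the TFMM-PSO system.

      import random

      def pso(f, bounds, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
          dim = len(bounds)
          pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
          vel = [[0.0] * dim for _ in range(n_particles)]
          pbest = [p[:] for p in pos]
          pbest_val = [f(p) for p in pos]
          g = min(range(n_particles), key=lambda i: pbest_val[i])
          gbest, gbest_val = pbest[g][:], pbest_val[g]
          for _ in range(iters):
              for i in range(n_particles):
                  for d in range(dim):
                      r1, r2 = random.random(), random.random()
                      vel[i][d] = (w * vel[i][d]
                                   + c1 * r1 * (pbest[i][d] - pos[i][d])
                                   + c2 * r2 * (gbest[d] - pos[i][d]))
                      lo, hi = bounds[d]
                      pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
                  val = f(pos[i])
                  if val < pbest_val[i]:
                      pbest[i], pbest_val[i] = pos[i][:], val
                      if val < gbest_val:
                          gbest, gbest_val = pos[i][:], val
          return gbest, gbest_val

      print(pso(lambda x: sum(v * v for v in x), bounds=[(-5.0, 5.0)] * 4))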

  10. Multi-stage optimization of decision and inhibitory trees for decision tables with many-valued decisions

    KAUST Repository

    Azad, Mohammad

    2017-06-16

    We study problems of optimization of decision and inhibitory trees for decision tables with many-valued decisions. As cost functions, we consider depth, average depth, number of nodes, and number of terminal/nonterminal nodes in trees. Decision tables with many-valued decisions (multi-label decision tables) are often more accurate models for real-life data sets than usual decision tables with single-valued decisions. Inhibitory trees can sometimes capture more information from decision tables than decision trees. In this paper, we create dynamic programming algorithms for multi-stage optimization of trees relative to a sequence of cost functions. We apply these algorithms to prove the existence of totally optimal (simultaneously optimal relative to a number of cost functions) decision and inhibitory trees for some modified decision tables from the UCI Machine Learning Repository.

  11. Optimizing Decision Tree Attack on CAS Scheme

    Directory of Open Access Journals (Sweden)

    PERKOVIC, T.

    2016-05-01

    Full Text Available In this paper we show a successful side-channel timing attack on a well-known high-complexity cognitive authentication scheme (the CAS scheme). We exploit a weakness of the CAS scheme that comes from the asymmetry of the virtual interface and graphical layout, which results in nonuniform human behavior during the login procedure, leading to detectable variations in the user's response times. We optimized a well-known probabilistic decision tree attack on the CAS scheme by introducing this timing information into the attack. We show that the developed classifier can be used to significantly reduce the number of login sessions required to break the CAS scheme.

  12. Algorithm Optimally Orders Forward-Chaining Inference Rules

    Science.gov (United States)

    James, Mark

    2008-01-01

    People typically develop knowledge bases in a somewhat ad hoc manner by incrementally adding rules with no specific organization. This often results in very inefficient execution of those rules, since they are so often order sensitive. This is relevant to tasks like the Deep Space Network in that it allows the knowledge base to be incrementally developed and then automatically ordered for efficiency. Although data flow analysis was first developed for use in compilers for producing optimal code sequences, its usefulness is now recognized in many software systems, including knowledge-based systems. However, this approach of exhaustively computing data-flow information cannot be directly applied to inference systems because of the ubiquitous execution of the rules. An algorithm is presented that efficiently performs a complete producer/consumer analysis for each antecedent and consequent clause in a knowledge base in order to optimally order the rules and minimize inference cycles. The algorithm orders a knowledge base composed of forward-chaining inference rules such that independent inference cycle executions are minimized, thus resulting in significantly faster execution. This algorithm was integrated into the JPL tool Spacecraft Health Inference Engine (SHINE) for verification, and it resulted in a significant reduction in inference cycles for what was previously considered an ordered knowledge base. For a knowledge base that is completely unordered, the improvement is much greater.
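
    The core of such an ordering can be illustrated with a producer/consumer dependency graph over the rules and a topological sort; the rule names and facts below are made up, and this is only a simplified stand-in for the SHINE analysis described above.

      from graphlib import TopologicalSorter

      # Each rule is modelled by the facts it consumes (antecedents) and produces (consequents).
      rules = {
          "r1": {"consumes": {"raw_telemetry"}, "produces": {"frames"}},
          "r2": {"consumes": {"frames"}, "produces": {"alarms"}},
          "r3": {"consumes": {"alarms", "frames"}, "produces": {"report"}},
      }

      # A rule should fire after any rule that produces one of the facts it consumes.
      deps = {name: set() for name in rules}
      for consumer, c in rules.items():
          for producer, p in rules.items():
              if producer != consumer and p["produces"] & c["consumes"]:
                  deps[consumer].add(producer)

      print(list(TopologicalSorter(deps).static_order()))   # ['r1', 'r2', 'r3']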

  13. Optimizing breeding decisions for Finnish dairy herds.

    Science.gov (United States)

    Rajala-Schultz, P J; Gröhn, Y T; Allore, H G

    2000-01-01

    The purpose of this study was to determine the effect of reproductive performance on profitability and optimal breeding decisions for Finnish dairy herds. We used a dynamic programming model to optimize dairy cow insemination and replacement decisions. This optimization model maximizes the expected net revenues from a given cow and her replacements over a decision horizon. Input values and prices reflecting the situation in 1998 in Finland were used in the study. Reproductive performance was reflected in the model by overall pregnancy rate, which was a function of heat detection and conception rate. Seasonality was included in conception rate. The base run had a pregnancy rate of 0.49 (both heat detection and conception rate of 0.7). Different scenarios were modeled by changing levels of conception rate, heat detection, and seasonality in fertility. Reproductive performance had a considerable impact on profitability of a herd; good heat detection and conception rates provided an opportunity for management control. When heat detection rate decreased from 0.7 to 0.5, and everything else was held constant, net revenues decreased approximately 2.6%. If the conception rate also decreased to 0.5 (resulting in a pregnancy rate of 0.25), net revenues were approximately 5% lower than with a pregnancy rate of 0.49. With lower fertility, replacement percentage was higher and the financial losses were mainly from higher replacement costs. Under Finnish conditions, it is not optimal to start breeding cows calving in spring and early summer immediately after the voluntary waiting period. Instead, it is preferable to allow the calving interval to lengthen for these cows so that their next calving is in the fall. However, cows calving in the fall should be bred immediately after the voluntary waiting period. Across all scenarios, optimal solutions predicted most calvings should occur in fall and the most profitable time to bring a replacement heifer into a herd was in the fall. It

  14. Learning the optimal buffer-stock consumption rule of Carroll

    NARCIS (Netherlands)

    Yıldızoğlu, M.; Sénégas, M.A.; Salle, I.; Zumpe, M.

    2014-01-01

    This article questions the rather pessimistic conclusions of Allen and Carroll [Macroeconomic Dynamics 5 (2001), 255-271] about the ability of consumers to learn the optimal buffer-stock-based consumption rule. To this end, we develop an agent-based model in which alternative learning schemes can be

  15. Evaluation of a rule base for decision making in general practice.

    Science.gov (United States)

    Essex, B; Healy, M

    1994-01-01

    BACKGROUND. Decision making in general practice relies heavily on judgmental expertise. It should be possible to codify this expertise into rules and principles. AIM. A study was undertaken to evaluate the effectiveness of rules from a rule base designed to improve students' and trainees' management decisions relating to patients seen in general practice. METHOD. The rule base was developed after studying decisions about and management of thousands of patients seen in one general practice over an eight year period. Vignettes were presented to 93 fourth year medical students and 179 general practitioner trainees. They recorded their perception and management of each case before and after being presented with a selection of relevant rules. Participants also commented on their level of agreement with each of the rules provided with the vignettes. A panel of five independent assessors then rated as good, acceptable or poor, the participants' perception and management of each case before and after seeing the rules. RESULTS. Exposure to a few selected rules of thumb improved the problem perception and management decisions of both undergraduates and trainees. The degree of improvement was not related to previous experience or to the stated level of agreement with the proposed rules. The assessors identified difficulties students and trainees experienced in changing their perceptions and management decisions when the rules suggested options they had not considered. CONCLUSION. The rules developed to improve decision making skills in general practice are effective when used with vignettes. The next phase is to transform the rule base into an expert system to train students and doctors to acquire decision making skills. It could also be used to provide decision support when confronted with difficult management decisions in general practice. PMID:8204334

  16. A decision rule to identify adolescent females with cervical infections.

    Science.gov (United States)

    Reed, Jennifer L; Mahabee-Gittens, E Melinda; Huppert, Jill S

    2007-03-01

    To develop a clinical decision rule to direct empiric treatment of adolescent females with Neisseria gonorrhoeae (GC) or Chlamydia trachomatis (CT) (or both) cervical infections in a pediatric emergency department. This was a cross-sectional study of adolescent females with symptoms necessitating sexually transmitted infection (STI) testing. The outcome was defined as cervical specimens positive for either GC on culture or CT on a nucleic acid amplification test. Clinical variables included demographic, historical, physical, and laboratory findings. Bivariate associations were assessed using chi-square for categorical data and Student's t test for continuous variables. Variables significant in the bivariate analysis were entered into logistic regression (LR) and recursive partitioning (RP) analyses; significant predictors included >10 white blood cells (WBCs) on vaginal gram stain (aOR = 2.5, CI 1.3-4.6). Variables comprising the RP analysis included partner penile discharge, >10 WBCs on vaginal gram stain, African American race, absence of yeast forms on vaginal gram stain, and no hormonal birth control use. This algorithm was 75% sensitive and 71% specific, with a negative predictive value of 85%. The LR model confirmed associations seen in other populations. Although STI testing is imperative, the RP model can be used to direct empiric treatment among high-risk adolescent females.

  17. A new generalization of the proportional conflict redistribution rule stable in terms of decision

    CERN Document Server

    Martin, Arnaud

    2008-01-01

    In this chapter, we present and discuss a new generalized proportional conflict redistribution rule. The Dezert-Smarandache extension of the Dempster-Shafer theory has relaunched the studies on combination rules, especially for the management of conflict. Many combination rules have been proposed in the last few years. We study here different combination rules and compare them in terms of decision on a didactic example and on generated data. Indeed, in real applications, we need a reliable decision, and it is the final result that matters. This chapter shows that a fine proportional conflict redistribution rule must be preferred for the combination in the belief function theory.

  18. Evolution of Decision Rules Used for IT Portfolio Management: An Inductive Approach

    Science.gov (United States)

    Karhade, Prasanna P.; Shaw, Michael J.; Subramanyam, Ramanath

    IT portfolio management and the related planning decisions for IT-dependent initiatives are critical to organizational performance. Building on the logic-of-appropriateness theoretical framework, we define an important characteristic of decision rules used during IT portfolio planning: rule appropriateness with regard to the risk-taking criterion. We propose that rule appropriateness will be an important factor explaining the evolution of rules over time. Using an inductive learning methodology, we analyze a unique dataset of actual IT portfolio planning decisions spanning two consecutive years within one organization. We present a systematic comparative analysis of the evolution of rules used in planning over the two years to validate our research proposition. We find that rules that were inappropriate in the first year were redefined into appropriate rules for use in the second year. Our work provides empirical evidence demonstrating organizational learning and improvements in IT portfolio planning capabilities.

  19. A tool for study of optimal decision trees

    KAUST Repository

    Alkhalid, Abdulaziz

    2010-01-01

    The paper describes a tool which, for relatively small decision tables, allows us to make consecutive optimization of decision trees relative to various complexity measures, such as number of nodes, average depth, and depth, and to find parameters and the number of optimal decision trees. © 2010 Springer-Verlag Berlin Heidelberg.

  20. Automatic Mining of Numerical Classification Rules with Parliamentary Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    KIZILOLUK, S.

    2015-11-01

    Full Text Available In recent years, classification rule mining has been one of the most important data mining tasks. In this study, one of the newest social-based metaheuristic methods, the Parliamentary Optimization Algorithm (POA), is used for the first time for automatic mining of comprehensible and accurate classification rules from datasets which have numerical attributes. Four different numerical datasets have been selected from the UCI data warehouse, and classification rules of high quality have been obtained. Furthermore, the results obtained from the designed POA have been compared with the results obtained from four different popular classification rule mining algorithms used in WEKA. Although POA is very new and no applications to complex data mining problems have been performed before, the results seem promising. The objective function used is very flexible, and many different objectives can easily be added to it. The intervals of the numerical attributes in the rules have been found automatically, without any a priori discretization process that modifies the datasets, as is done in other classification rule mining algorithms.

  1. A Decision Making Methodology in Support of the Business Rules Lifecycle

    Science.gov (United States)

    Wild, Christopher; Rosca, Daniela

    1998-01-01

    The business rules that underlie an enterprise emerge as a new category of system requirements that represent decisions about how to run the business, and which are characterized by their business-orientation and their propensity for change. In this report, we introduce a decision making methodology which addresses several aspects of the business rules lifecycle: acquisition, deployment and evolution. We describe a meta-model for representing business rules in terms of an enterprise model, and also a decision support submodel for reasoning about and deriving the rules. The possibility for lifecycle automated assistance is demonstrated in terms of the automatic extraction of business rules from the decision structure. A system based on the metamodel has been implemented, including the extraction algorithm. This is the final report for Daniela Rosca's PhD fellowship. It describes the work we have done over the past year, current research and the list of publications associated with her thesis topic.

  2. Learning the optimal buffer-stock consumption rule of Carroll

    OpenAIRE

    Yildizoglu, Murat; Sénégas, Marc-Alexandre; Salle, Isabelle; Zumpe, Martin

    2011-01-01

    This article questions the rather pessimistic conclusions of Allen and Carroll (2001) about the ability of consumers to learn the optimal buffer-stock based consumption rule. To this end, we develop an agent based model where alternative learning schemes can be compared in terms of the consumption behaviour that they yield. We show that neither purely adaptive learning, nor social learning based on imitation, can ensure satisfactory consumption behaviours. By contrast, if the agents can form ada...

  3. Optimization and risk analyses for rule curves of reservoir operation: application to Tien-Hua-Hu Reservoir in Taiwan.

    Science.gov (United States)

    Kuo, J T; Hsu, N S; Chiu, S K

    2006-01-01

    Tien-Hua-Hu Reservoir is currently under planning by the Water Resources Agency, Taiwan, to meet the increasing water demands of central Taiwan arising from the rapid growth of domestic water supply and high-tech industrial parks. This study develops a simulation model for ten-day-period reservoir operation to calculate the ten-day water shortage index under varying rule curves. A genetic algorithm is coupled to the simulation model to find the optimal rule curves using the minimum ten-day water shortage index as an objective function. This study generates many sets of synthetic streamflows for risk, reliability, resiliency, and vulnerability analyses of reservoir operation. ARMA and disaggregation models are developed and applied to the synthetic streamflow generation. The optimal rule curves obtained from this study perform better on the ten-day shortage index when compared to the originally designed rule curves from a previous study. The optimal rule curves are also superior to the originally designed rule curves in terms of vulnerability. However, in terms of reliability and resiliency, the optimal rule curves are inferior to those originally designed. Results from this study have provided, in general, a set of improved rule curves for operation of the Tien-Hua-Hu Reservoir. Furthermore, results from reliability, resiliency and vulnerability analyses offer much useful information for decision making in reservoir operation.

  4. Probabilistic decision graphs for optimization under uncertainty

    DEFF Research Database (Denmark)

    Jensen, Finn V.; Nielsen, Thomas Dyhre

    2011-01-01

    This paper provides a survey on probabilistic decision graphs for modeling and solving decision problems under uncertainty. We give an introduction to influence diagrams, which is a popular framework for representing and solving sequential decision problems with a single decision maker. As the me...

  5. Fast generation method of fuzzy rules and its application to flux optimization in process of matter converting

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A fast generation method of fuzzy rules for flux optimization decision-making was proposed in order to extract linguistic knowledge from numerical data in the process of matter converting. Fuzzy if-then rules with real-number consequents were extracted from numerical data, and a linguistic representation method for deriving linguistic rules from such fuzzy if-then rules was developed. The linguistic representation consists of two linguistic variables with a degree of certainty, and the storage structure of the rule base is described. The simulation results show that the method involves neither a time-consuming iterative learning procedure nor complicated rule generation mechanisms, and can approximate complex systems. The method was applied to determine the flux amount of a copper converting furnace in the process of matter converting. The result in actual operation shows that the mass fraction of Cu in slag is reduced by 0.5%.

  6. Simplifying rules for optimal allocation of preventive care resources.

    Science.gov (United States)

    Gandjour, Afschin

    2012-04-01

    Given the limited resources for preventive care, policy-makers need to consider the efficiency/cost-effectiveness of preventive measures, such as drugs and vaccines, when allocating preventive care resources. However, in many settings only limited information on lifetime costs and effects of preventive measures exists. Therefore, it seems useful to provide policy-makers with some simplifying rules when allocating preventive care resources. The purpose of this article is to investigate the relevance of risk and severity of the disease to be prevented for the optimal allocation of preventive care resources. The report shows - based on a constrained optimization model - that optimal allocation of preventive care resources does, in fact, depend on both factors. Resources should be allocated to the prevention of diseases with a higher probability of occurrence or larger severity. This article also identifies situations where preventive care resources should be allocated to the prevention of less severe disease.

  7. Decision rules for the use of radiography in acute ankle injuries. Refinement and prospective validation.

    Science.gov (United States)

    Stiell, I G; Greenberg, G H; McKnight, R D; Nair, R C; McDowell, I; Reardon, M; Stewart, J P; Maloney, J

    1993-03-03

    To validate and refine previously derived clinical decision rules that aid the efficient use of radiography in acute ankle injuries. Survey prospectively administered in two stages: validation and refinement of the original rules (first stage) and validation of the refined rules (second stage). Emergency departments of two university hospitals. Convenience sample of adults with acute ankle injuries: 1032 of 1130 eligible patients in the first stage and 453 of 530 eligible patients in the second stage. Attending emergency physicians assessed each patient for standardized clinical variables and classified the need for radiography according to the original (first stage) and the refined (second stage) decision rules. The decision rules were assessed for their ability to correctly identify the criterion standard of fractures on ankle and foot radiographic series. The original decision rules were refined by univariate and recursive partitioning analyses. In the first stage, the original decision rules were found to have sensitivities of 1.0 (95% confidence interval [CI], 0.97 to 1.0) for detecting 121 malleolar zone fractures, and 0.98 (95% CI, 0.88 to 1.0) for detecting 49 midfoot zone fractures. For interpretation of the rules in 116 patients, kappa values were 0.56 for the ankle series rule and 0.69 for the foot series rule. Recursive partitioning of 20 predictor variables yielded refined decision rules for ankle and foot radiographic series. In the second stage, the refined rules proved to have sensitivities of 1.0 (95% CI, 0.93 to 1.0) for 50 malleolar zone fractures, and 1.0 (95% CI, 0.83 to 1.0) for 19 midfoot zone fractures. The potential reduction in radiography is estimated to be 34% for the ankle series and 30% for the foot series. The probability of fracture, if the corresponding decision rule were "negative," is estimated to be 0% (95% CI, 0% to 0.8%) in the ankle series, and 0% (95% CI, 0% to 0.4%) in the foot series. Refinement and validation have shown the

  8. Optimal Rules for Single Machine Scheduling with Stochastic Breakdowns

    Directory of Open Access Journals (Sweden)

    Jinwei Gu

    2014-01-01

    Full Text Available This paper studies the problem of scheduling a set of jobs on a single machine subject to stochastic breakdowns, where jobs have to be restarted if preemptions occur because of breakdowns. The breakdown process of the machine is independent of the jobs processed on the machine. The processing times required to complete the jobs are constants if no breakdown occurs. The machine uptimes are independently and identically distributed (i.i.d.) and are subject to a uniform distribution. It is proved that the Longest Processing Time first (LPT) rule minimizes the expected makespan. For the large-scale problem, it is also shown that the Shortest Processing Time first (SPT) rule is optimal for minimizing the expected total completion time of all jobs.
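
    The two priority rules named in the abstract are straightforward to express in code. The sketch below is illustrative only: it ignores the stochastic breakdown process and simply orders a deterministic job list, printing the LPT and SPT orderings and the total completion time of the SPT schedule.

        def lpt_order(processing_times):
            # Longest Processing Time first: the rule shown in the paper to minimize
            # expected makespan under its breakdown model.
            return sorted(processing_times, reverse=True)

        def spt_order(processing_times):
            # Shortest Processing Time first: minimizes expected total completion time.
            return sorted(processing_times)

        def total_completion_time(schedule):
            finish, total = 0.0, 0.0
            for p in schedule:
                finish += p          # completion time of this job
                total += finish
            return total

        jobs = [4.0, 1.5, 3.0, 2.5]
        print(lpt_order(jobs), spt_order(jobs), total_completion_time(spt_order(jobs)))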

  9. 48 CFR 6103.306 - Decisions [Rule 306].

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System; 6103.306 Decisions... agency, and information provided during conferences. The claimant, the Audit Division, and the agency... Board decisions are posted weekly on the Internet. The Board's Internet address is: www.cbca.gsa.gov.

  10. Decision rules and group rationality: cognitive gain or standstill?

    NARCIS (Netherlands)

    Curseu, P.L.; Jansen, R.J.G.; Chappin, M.M.H.

    2013-01-01

    Recent research in group cognition points towards the existence of collective cognitive competencies that transcend individual group members’ cognitive competencies. Since rationality is a key cognitive competence for group decision making, and group cognition emerges from the coordination of indivi

  11. A simple threshold rule is sufficient to explain sophisticated collective decision-making.

    Science.gov (United States)

    Robinson, Elva J H; Franks, Nigel R; Ellis, Samuel; Okuda, Saki; Marshall, James A R

    2011-01-01

    Decision-making animals can use slow-but-accurate strategies, such as making multiple comparisons, or opt for simpler, faster strategies to find a 'good enough' option. Social animals make collective decisions about many group behaviours including foraging and migration. The key to the collective choice lies with individual behaviour. We present a case study of a collective decision-making process (house-hunting ants, Temnothorax albipennis), in which a previously proposed decision strategy involved both quality-dependent hesitancy and direct comparisons of nests by scouts. An alternative possible decision strategy is that scouting ants use a very simple quality-dependent threshold rule to decide whether to recruit nest-mates to a new site or search for alternatives. We use analytical and simulation modelling to demonstrate that this simple rule is sufficient to explain empirical patterns from three studies of collective decision-making in ants, and can account parsimoniously for apparent comparison by individuals and apparent hesitancy (recruitment latency) effects, when available nests differ strongly in quality. This highlights the need to carefully design experiments to detect individual comparison. We present empirical data strongly suggesting that best-of-n comparison is not used by individual ants, although individual sequential comparisons are not ruled out. However, by using a simple threshold rule, decision-making groups are able to effectively compare options, without relying on any form of direct comparison of alternatives by individuals. This parsimonious mechanism could promote collective rationality in group decision-making.
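
    A toy simulation of the quality-dependent threshold rule can make the colony-level effect concrete; all parameters below (number of scouts, noise level, threshold) are invented for illustration and are not taken from the paper.

        import random

        def simulate_colony(nest_qualities, n_scouts=100, noise=0.2, threshold=0.5):
            recruits = [0] * len(nest_qualities)
            for _ in range(n_scouts):
                nest = random.randrange(len(nest_qualities))      # scout finds one nest at random
                perceived = nest_qualities[nest] + random.gauss(0, noise)
                if perceived >= threshold:                        # simple quality threshold rule
                    recruits[nest] += 1
            return recruits

        random.seed(1)
        print(simulate_colony([0.4, 0.8]))   # the better nest tends to gather more recruiters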

  12. 48 CFR 6302.1 - How to appeal a contracting officer's decision (Rule 1).

    Science.gov (United States)

    2010-10-01

    ... officer's decision (Rule 1). (a) Notice of an appeal shall be in writing and mailed or otherwise furnished... requested a written decision within 60 days from receipt of the request, and the contracting officer has not done so, the contractor may file a notice of appeal as provided in paragraph (a) of this section citing...

  13. Stress influences decisions to break a safety rule in a complex simulation task in females.

    Science.gov (United States)

    Starcke, Katrin; Brand, Matthias; Kluge, Annette

    2016-07-01

    The current study examines the effects of acutely induced laboratory stress on a complex decision-making task, the Waste Water Treatment Simulation. Participants are instructed to follow a certain decision rule according to safety guidelines. Violations of this rule are associated with potential high rewards (working faster and earning more money) but also with the risk of a catastrophe (an explosion). Stress was induced with the Trier Social Stress Test while control participants underwent a non-stress condition. In the simulation task, stressed females broke the safety rule more often than unstressed females: χ²(1, N = 24) = 10.36, p < 0.001, V = 0.66. In males, no difference between stressed and unstressed participants was observed. We conclude that stress increased the decisions to break the safety rule because stressed female participants focused on the potential high gains while they neglected the risk of potential negative consequences.

  14. Ant-based extraction of rules in simple decision systems over ontological graphs

    Directory of Open Access Journals (Sweden)

    Pancerz Krzysztof

    2015-06-01

    Full Text Available In the paper, the problem of extraction of complex decision rules in simple decision systems over ontological graphs is considered. The extracted rules are consistent with the dominance principle, similar to that applied in the dominance-based rough set approach (DRSA). In our study, we propose to use a heuristic algorithm, utilizing the ant-based clustering approach, searching the semantic spaces of concepts presented by means of ontological graphs. Concepts included in the semantic spaces are values of attributes describing objects in simple decision systems.

  15. Multiview coding mode decision with hybrid optimal stopping model.

    Science.gov (United States)

    Zhao, Tiesong; Kwong, Sam; Wang, Hanli; Wang, Zhou; Pan, Zhaoqing; Kuo, C-C Jay

    2013-04-01

    In a generic decision process, optimal stopping theory aims to achieve a good tradeoff between decision performance and the time consumed, with the advantages of theoretically grounded decision-making and predictable decision performance. In this paper, optimal stopping theory is employed to develop an effective hybrid model for the mode decision problem, which aims to theoretically achieve a good tradeoff between the two interrelated measurements in mode decision: computational complexity reduction and rate-distortion degradation. The proposed hybrid model is implemented and examined with a multiview encoder. To support the model and further improve coding performance, the multiview coding mode characteristics, including predicted mode probability and estimated coding time, are jointly investigated with inter-view correlations. Exhaustive experimental results with a wide range of video resolutions reveal the efficiency and robustness of our method, with high decision accuracy, negligible computational overhead, and almost intact rate-distortion performance compared to the original encoder.

  16. On algorithm for building of optimal α-decision trees

    KAUST Repository

    Alkhalid, Abdulaziz

    2010-01-01

    The paper describes an algorithm that constructs approximate decision trees (α-decision trees), which are optimal relative to one of the following complexity measures: depth, total path length, or number of nodes. The algorithm uses dynamic programming and extends the methods described in [4] to the construction of approximate decision trees. An adjustable approximation rate allows controlling the algorithm's complexity. The algorithm is applied to build optimal α-decision trees for two data sets from the UCI Machine Learning Repository [1]. © 2010 Springer-Verlag Berlin Heidelberg.

  17. The decision-feedback equalizer optimization for Gaussian noise

    Directory of Open Access Journals (Sweden)

    Arkadiusz Grzybowski

    2014-04-01

    Full Text Available A new method of optimizing decision-feedback parameters for intersymbol interference equalizers is described in this paper. The error-extension phenomenon in decision-feedback equalizers for data transmission is well known and has been widely investigated. The coefficient used in the decision feedback depends on the assessed risk of the received decision. It is proved that the bit error probability can be decreased by this method for any channel with a single interference sample and small Gaussian noise. Experimental results are presented for selected channel types. The dependence of the optimal feedback parameters on the channel interference sample and on the noise power is also presented.

  18. A common rule for decision making in animal collectives across species

    Science.gov (United States)

    Arganda, Sara; Pérez-Escudero, Alfonso; de Polavieja, Gonzalo G.

    2012-01-01

    A diversity of decision-making systems has been observed in animal collectives. In some species, choices depend on the differences of the numbers of animals that have chosen each of the available options, whereas in other species on the relative differences (a behavior known as Weber’s law), or follow more complex rules. We here show that this diversity of decision systems corresponds to a single rule of decision making in collectives. We first obtained a decision rule based on Bayesian estimation that uses the information provided by the behaviors of the other individuals to improve the estimation of the structure of the world. We then tested this rule in decision experiments using zebrafish (Danio rerio), and in existing rich datasets of argentine ants (Linepithema humile) and sticklebacks (Gasterosteus aculeatus), showing that a unified model across species can quantitatively explain the diversity of decision systems. Further, these results show that the different counting systems used by animals, including humans, can emerge from the common principle of using social information to make good decisions. PMID:23197836

  19. Benefiting from deep-level diversity : How congruence between knowledge and decision rules improves team decision making and team perceptions

    NARCIS (Netherlands)

    Rink, Floor; Ellemers, Naomi

    2010-01-01

    In two experiments we show how teams can benefit from the presence of multiple sources of deep-level task-related diversity. We manipulated differences (vs. similarities) in task information and personal decision rules in dyads (Study 1) and three-person teams (Study 2). The results indicate that wh

  20. Optimal operating rules definition in complex water resource systems combining fuzzy logic, expert criteria and stochastic programming

    Science.gov (United States)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel

    2016-04-01

    This contribution presents a methodology for defining optimal seasonal operating rules in multireservoir systems, coupling expert criteria and stochastic optimization. Both sources of information are combined using fuzzy logic. The structure of the operating rules is defined based on expert criteria, via a joint expert-technician framework consisting of a series of meetings, workshops and surveys carried out between reservoir managers and modelers. As a result, the decision-making process used by managers can be assessed and expressed using fuzzy logic: fuzzy rule-based systems are employed to represent the operating rules, and fuzzy regression procedures are used for forecasting future inflows. Once this is done, a stochastic optimization algorithm can be used to define optimal decisions and transform them into fuzzy rules. Finally, the optimal fuzzy rules and the inflow prediction scheme are combined into a Decision Support System for making seasonal forecasts and simulating the effect of different alternatives in response to the initial system state and the foreseen inflows. The approach presented has been applied to the Jucar River Basin (Spain). Reservoir managers explained how the system is operated, taking into account the reservoirs' states at the beginning of the irrigation season and the inflows foreseen during that season. According to the information given by them, the Jucar River Basin operating policies were expressed via two fuzzy rule-based (FRB) systems that estimate the amount of water to be allocated to the users and how the reservoir storages should be balanced to guarantee those deliveries. A stochastic optimization model using Stochastic Dual Dynamic Programming (SDDP) was developed to define optimal decisions, which are transformed into optimal operating rules by embedding them into the two FRBs previously created. As a benchmark, historical records are used to develop alternative operating rules. A fuzzy linear regression procedure was employed to

  1. 48 CFR 6104.406 - Decisions [Rule 406].

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System; 6104.406 Decisions... information provided during conferences. The claimant and the agency will each be furnished a copy of the... Internet. The Board's Internet address is: www.cbca.gsa.gov.

  2. 48 CFR 6105.505 - Decisions [Rule 505].

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System; 6105.505 Decisions... employee, and information provided during conferences. The agency and the affected employee will each be... are posted weekly on the Internet. The Board's Internet address is: http://www.cbca.gsa.gov.

  3. Collective decision making and social interaction rules in mixed-species flocks of songbirds.

    Science.gov (United States)

    Farine, Damien R; Aplin, Lucy M; Garroway, Colin J; Mann, Richard P; Sheldon, Ben C

    2014-09-01

    Associations in mixed-species foraging groups are common in animals, yet have rarely been explored in the context of collective behaviour. Despite many investigations into the social and ecological conditions under which individuals should form groups, we still know little about the specific behavioural rules that individuals adopt in these contexts, or whether these can be generalized to heterospecifics. Here, we studied collective behaviour in flocks in a community of five species of woodland passerine birds. We adopted an automated data collection protocol, involving visits by RFID-tagged birds to feeding stations equipped with antennae, over two winters, recording 91 576 feeding events by 1904 individuals. We demonstrated highly synchronized feeding behaviour within patches, with birds moving towards areas of the patch with the largest proportion of the flock. Using a model of collective decision making, we then explored the underlying decision rule birds may be using when foraging in mixed-species flocks. The model tested whether birds used a different decision rule for conspecifics and heterospecifics, and whether the rules used by individuals of different species varied. We found that species differed in their response to the distribution of conspecifics and heterospecifics across foraging patches. However, simulating decisions using the different rules, which reproduced our data well, suggested that the outcome of using different decision rules by each species resulted in qualitatively similar overall patterns of movement. It is possible that the decision rules each species uses may be adjusted to variation in mean species abundance in order for individuals to maintain the same overall flock-level response. This is likely to be important for maintaining coordinated behaviour across species, and to result in quick and adaptive flock responses to food resources that are patchily distributed in space and time.

  4. Geriatric Fever Score: a new decision rule for geriatric care.

    Directory of Open Access Journals (Sweden)

    Min-Hsien Chung

    Full Text Available Evaluating geriatric patients with fever is time-consuming and challenging. We investigated independent mortality predictors of geriatric patients with fever and developed a prediction rule for emergency care, critical care, and geriatric care physicians to classify patients into mortality-risk and disposition groups. Consecutive geriatric patients (≥65 years old) visiting the emergency department (ED) of a university-affiliated medical center between June 1 and July 21, 2010, were enrolled when they met the criteria for fever: a tympanic temperature ≥37.2°C or a baseline temperature elevated by ≥1.3°C. Thirty-day mortality was the primary endpoint. Internal validation with bootstrap re-sampling was done. Three hundred thirty geriatric patients were enrolled. We found three independent mortality predictors: Leukocytosis (WBC >12,000 cells/mm³), Severe coma (GCS ≤8), and Thrombocytopenia (platelets <150 × 10³/mm³) (LST). After assigning weights to each predictor, we developed a Geriatric Fever Score that stratifies patients into two mortality-risk and disposition groups: low (4.0%; 95% CI: 2.3-6.9%), managed in a general ward or treated in the ED and discharged, and high (30.3%; 95% CI: 17.4-47.3%), for whom the intensive care unit should be considered. The area under the curve for the rule was 0.73. We found that the Geriatric Fever Score is a simple and rapid rule for predicting 30-day mortality and classifying mortality risk and disposition in geriatric patients with fever, although external validation should be performed to confirm its usefulness in other clinical settings. It might help preserve medical resources for patients in greater need.
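
    The structure of such a scoring rule is easy to sketch; the weights and cutoff below are placeholders, not the values derived in the study, which are not given in the abstract.

        def geriatric_fever_score(wbc, gcs, platelets, weights=(1, 1, 1), high_cutoff=2):
            # Placeholder weights and cutoff; the study derives its own values.
            leukocytosis = wbc > 12_000          # cells/mm3
            severe_coma = gcs <= 8
            thrombocytopenia = platelets < 150   # x 10^3/mm3
            score = (weights[0] * leukocytosis
                     + weights[1] * severe_coma
                     + weights[2] * thrombocytopenia)
            disposition = ("high risk: consider the ICU" if score >= high_cutoff
                           else "low risk: ward or ED treatment then discharge")
            return score, disposition

        print(geriatric_fever_score(wbc=14500, gcs=7, platelets=120))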

  5. Optimal pricing decision model based on activity-based costing

    Institute of Scientific and Technical Information of China (English)

    王福胜; 常庆芳

    2003-01-01

    In order to find out the applicability of the optimal pricing decision model based on the conventional cost behavior model, after activity-based costing has given a strong shock to the conventional cost behavior model and its assumptions, detailed analyses have been made using the activity-based cost behavior and cost-volume-profit analysis models. It is concluded from these analyses that the theory behind the construction of the optimal pricing decision model is still tenable under activity-based costing, but the conventional optimal pricing decision model must be modified as appropriate to the activity-based costing based cost behavior model and cost-volume-profit analysis model, and an optimal pricing decision model is really a product pricing decision model constructed by following the economic principle of maximizing profit.

  6. Decision Support System for Optimized Herbicide Dose in Spring Barley

    DEFF Research Database (Denmark)

    Sønderskov, Mette; Kudsk, Per; Mathiassen, Solvejg K;

    2014-01-01

    Crop Protection Online (CPO) is a decision support system, which integrates decision algorithms quantifying the requirement for weed control and a herbicide dose model. CPO was designed to be used by advisors and farmers to optimize the choice of herbicide and dose. The recommendations from CPO...

  7. A study of diverse clinical decision support rule authoring environments and requirements for integration

    Directory of Open Access Journals (Sweden)

    Zhou Li

    2012-11-01

    Full Text Available Abstract Background Efficient rule authoring tools are critical to allow clinical Knowledge Engineers (KEs), Software Engineers (SEs), and Subject Matter Experts (SMEs) to convert medical knowledge into machine-executable clinical decision support rules. The goal of this analysis was to identify the critical success factors and challenges of a fully functioning Rule Authoring Environment (RAE) in order to define requirements for a scalable, comprehensive tool to manage enterprise-level rules. Methods The authors evaluated RAEs in active use across Partners Healthcare, including enterprise-wide, ambulatory-only, and system-specific tools, with a focus on rule editors for reminder and medication rules. We conducted meetings with users of these RAEs to discuss their general experience and perceived advantages and limitations of these tools. Results While the overall rule authoring process is similar across the 10 separate RAEs, the system capabilities and architecture vary widely. Most current RAEs limit the ability of the clinical decision support (CDS) interventions to be standardized, sharable, interoperable, and extensible. No existing system meets all requirements defined by knowledge management users. Conclusions A successful, scalable, integrated rule authoring environment will need to support a number of key requirements and functions in the areas of knowledge representation, metadata, terminology, authoring collaboration, user interface, integration with electronic health record (EHR) systems, testing, and reporting.

  8. Neural correlates of rules and conflict in medial prefrontal cortex during decision and feedback epochs.

    Science.gov (United States)

    Bissonette, Gregory B; Roesch, Matthew R

    2015-01-01

    The ability to properly adjust behavioral responses to cues in a changing environment is crucial for survival. Activity in the medial prefrontal cortex (mPFC) is thought both to represent rules that guide behavior and to detect and resolve conflicts between rules under changing contingencies. However, while lesion and pharmacological studies have supported a crucial role for mPFC in this type of set-shifting, how mPFC represents current rules or detects and resolves conflict between different rules remains unclear. Here, we directly address the role of rat mPFC in shifting rule-based behavioral strategies using a novel behavioral task designed to tease apart neural signatures of rules, conflict and direction. We demonstrate that the activity of single neurons in rat mPFC represents distinct rules. Further, we show increased firing on high-conflict trials in a separate population of mPFC neurons. Reduced firing in both populations of neurons was associated with poor performance. Moreover, activity in both populations increased and decreased firing during the outcome epoch when reward was and was not delivered on correct and incorrect trials, respectively. In addition, outcome firing was modulated by the current rule and the degree of conflict associated with the previous decision. These results promote a greater understanding of the role that mPFC plays in switching between rules, signaling both rule and conflict to promote improved behavioral performance.

  9. Rule acquisition in formal decision contexts based on formal, object-oriented and property-oriented concept lattices.

    Science.gov (United States)

    Ren, Yue; Li, Jinhai; Aswani Kumar, Cherukuri; Liu, Wenqi

    2014-01-01

    Rule acquisition is one of the main purposes in the analysis of formal decision contexts. Up to now, there have been several types of rules in formal decision contexts such as decision rules, decision implications, and granular rules, which can be viewed as ∧-rules since all of them have the following form: "if conditions 1,2,…, and m hold, then decisions hold." In order to enrich the existing rule acquisition theory in formal decision contexts, this study puts forward two new types of rules which are called ∨-rules and ∨-∧ mixed rules based on formal, object-oriented, and property-oriented concept lattices. Moreover, a comparison of ∨-rules, ∨-∧ mixed rules, and ∧-rules is made from the perspectives of inclusion and inference relationships. Finally, some real examples and numerical experiments are conducted to compare the proposed rule acquisition algorithms with the existing one in terms of the running efficiency.

  10. A Decision Support System for Solving Multiple Criteria Optimization Problems

    Science.gov (United States)

    Filatovas, Ernestas; Kurasova, Olga

    2011-01-01

    In this paper, multiple criteria optimization has been investigated. A new decision support system (DSS) has been developed for interactive solving of multiple criteria optimization problems (MOPs). The weighted-sum (WS) approach is implemented to solve the MOPs. The MOPs are solved by selecting different weight coefficient values for the criteria…
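
    The weighted-sum scalarization used by the DSS reduces a multi-criteria problem to a single objective; the sketch below is generic, with invented candidate solutions and weights.

        def weighted_sum(criteria_values, weights):
            # Scalarize a vector of minimization criteria into one objective value.
            return sum(w * f for w, f in zip(weights, criteria_values))

        # Choosing among a finite set of candidate solutions under two weightings.
        candidates = {"A": (3.0, 9.0), "B": (5.0, 5.0), "C": (9.0, 2.0)}
        for weights in [(0.8, 0.2), (0.2, 0.8)]:
            best = min(candidates, key=lambda k: weighted_sum(candidates[k], weights))
            print(weights, "->", best)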

  11. Determining rules for closing customer service centers: A public utility company's fuzzy decision

    Science.gov (United States)

    Dekorvin, Andre; Shipley, Margaret F.; Lea, Robert N.

    1992-01-01

    In the present work, we consider the general problem of knowledge acquisition under uncertainty. Simply stated, the problem reduces to the following: how can we capture the knowledge of an expert when the expert is unable to clearly formulate how he or she arrives at a decision? A commonly used method is to learn by examples. We observe how the expert solves specific cases and from this infer some rules by which the decision may have been made. Unique to our work is the fuzzy set representation of the conditions or attributes upon which the expert may possibly base his fuzzy decision. From our examples, we infer certain and possible fuzzy rules for closing a customer service center and illustrate the importance of having the decision closely relate to the conditions under consideration.

  12. Fuzzy rule-based models for decision support in ecosystem management.

    Science.gov (United States)

    Adriaenssens, Veronique; De Baets, Bernard; Goethals, Peter L M; De Pauw, Niels

    2004-02-05

    To facilitate decision support in ecosystem management, ecological expertise and site-specific data need to be integrated. Fuzzy logic can deal with highly variable, linguistic, vague and uncertain data or knowledge and, therefore, has the ability to allow for a logical, reliable and transparent information stream from data collection down to data usage in decision-making. Several environmental applications already involve the use of fuzzy logic. Most of these applications have been set up by trial and error and are mainly limited to the domain of environmental assessment. In this article, applications of fuzzy logic for decision support in ecosystem management are reviewed and assessed, with an emphasis on rule-based models. In particular, the identification, optimisation, validation, interpretability and uncertainty aspects of fuzzy rule-based models for decision support in ecosystem management are discussed.

  13. Understanding Optimal Decision-Making in Wargaming

    Science.gov (United States)

    2013-10-01

    previously (Russell & Norvig, 2010). [Figure 2.3: Information processing model of human cognition (C. Wickens & Hollands, 2000).] ... decision-making (Perla, 1990). [Figure 2.4: Agent-environment interaction (Russell & Norvig, 2010).] 2.2. Neurophysiological Measures. The study of ... selection while balancing exploration and exploitation (Sutton & Barto, 1998; Russell & Norvig, 2010). For our purposes, with our version of the IGT

  14. Clinical decision rules for acute bacterial meningitis: current insights

    Directory of Open Access Journals (Sweden)

    Viallon A

    2016-04-01

    Full Text Available Alain Viallon,1 Elisabeth Botelho-Nevers,2 Fabrice Zeni3 (1Emergency Department, 2Department of Infectious Disease, 3Intensive Care Unit, University Hospital, Saint-Etienne, France) Abstract: Acute community-acquired bacterial meningitis (BM) requires rapid diagnosis so that suitable treatment can be instituted within 60 minutes of admitting the patient. The cornerstone of the diagnostic examination is lumbar puncture, which enables microbiological analysis and determination of the cerebrospinal fluid (CSF) cytochemical characteristics. However, microbiological testing is not sufficiently sensitive to rule out this diagnosis. With regard to the analysis of the standard CSF cytochemical characteristics (polymorphonuclear count, CSF glucose and protein concentration, and CSF:serum glucose ratio), this is often misleading. Indeed, the relatively imprecise nature of the cutoff values for these BM diagnosis markers can make their interpretation difficult. However, there are two markers that appear to be more efficient than the standard ones: CSF lactate and serum procalcitonin levels. Scores and predictive models are also available; however, they only define a clinical probability, and in addition, their use calls for prior validation on the population in which they are used. In this article, we review current methods of BM diagnosis. Keywords: meningitis, diagnosis, emergency

  15. On optimal soft-decision demodulation. [in digital communication system

    Science.gov (United States)

    Lee, L.-N.

    1976-01-01

    A necessary condition is derived for optimal J-ary coherent demodulation of M-ary (M greater than 2) signals. Optimality is defined as maximality of the symmetric cutoff rate of the resulting discrete memoryless channel. Using a counterexample, it is shown that the condition derived is generally not sufficient for optimality. This condition is employed as the basis for an iterative optimization method to find the optimal demodulator decision regions from an initial 'good guess'. In general, these regions are found to be bounded by hyperplanes in likelihood space; the corresponding regions in signal space are found to have hyperplane asymptotes for the important case of additive white Gaussian noise. Some examples are presented, showing that the regions in signal space bounded by these asymptotic hyperplanes define demodulator decision regions that are virtually optimal.

  16. OPTIMAL HIERARCHY STRUCTURES FOR MULTI-ATTRIBUTE-CRITERIA DECISIONS

    Institute of Scientific and Technical Information of China (English)

    Stan LIPOVETSKY

    2009-01-01

    A problem of hierarchy structure optimization is considered. Hierarchical structures are widely used in the Analytic Hierarchy Process, conjoint analysis, and various other methods of multiple criteria decision making. The problem consists in finding a structure that requires a minimum number of pairwise comparisons for a given total number of alternatives. For an optimal hierarchy, minimum effort is needed for eliciting data and synthesizing the local preferences across the hierarchy to get the global priorities or utilities. Special estimation techniques are developed and numerical simulations performed. Analytical and numerical results suggest optimal ways of priority evaluation for practical managerial decisions in a complex environment.
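
    The quantity being minimized is the number of pairwise comparisons the hierarchy requires: a group of k elements compared among themselves needs k(k-1)/2 comparisons, summed over all comparison groups. A small helper, with a hypothetical two-level example, makes the trade-off concrete.

        def pairwise_comparisons(group_sizes):
            # Total pairwise comparisons needed, given the size of every group of
            # sibling elements that must be compared against each other.
            return sum(k * (k - 1) // 2 for k in group_sizes)

        # 16 alternatives compared in one flat group, versus split into 4 groups of 4
        # under one group of 4 criteria (a hypothetical two-level hierarchy).
        print(pairwise_comparisons([16]))             # 120 comparisons
        print(pairwise_comparisons([4, 4, 4, 4, 4]))  # 30 comparisons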

  17. Data Mining Framework for Generating Sales Decision Making Information Using Association Rules

    OpenAIRE

    Md. Humayun Kabir

    2016-01-01

    The rapid technological development in the field of information and communication technology (ICT) has enabled the databases of super shops to be organized under a countrywide sales decision making network to develop intelligent business systems by generating enriched business policies. This paper presents a data mining framework for generating sales decision making information from sales data using association rules generated from valid user input item set with respect to the sales data unde...

  18. A patient with a large pulmonary saddle embolus eluding both clinical gestalt and validated decision rules.

    Science.gov (United States)

    Hennessey, Adam; Setyono, Devy A; Lau, Wayne Bond; Fields, Jason Matthew

    2012-06-01

    We report a patient with chest pain who was classified as having low risk for pulmonary embolism with clinical gestalt and accepted clinical decision rules. An inadvertently ordered D-dimer and abnormal result, however, led to the identification of a large saddle embolus. This case illustrates the fallibility of even well-validated decision aids and that an embolism missed by these tools is not necessarily low risk or indicative of a low clot burden. Copyright © 2011. Published by Mosby, Inc.

  19. Discounting in economic evaluations : Stepping forward towards optimal decision rules

    NARCIS (Netherlands)

    Gravelle, Hugh; Brouwer, Werner; Niessen, Louis; Postma, Maarten; Rutten, Frans

    2007-01-01

    The National Institute for Clinical Excellence has recently changed its guidelines on discounting costs and effects in economic evaluations. In common with most other regulatory bodies it now requires that health effects should be discounted at the same rate as costs. We show that the guideline lead

  20. Sparse coding joint decision rule for ear print recognition

    Science.gov (United States)

    Guermoui, Mawloud; Melaab, Djamel; Mekhalfi, Mohamed Lamine

    2016-09-01

    Human ear recognition has been promoted as a profitable biometric over the past few years. With respect to other modalities, such as the face and iris, that have undergone a significant investigation in the literature, the ear pattern is still relatively uncommon. We put forth a sparse coding-induced decision-making scheme for ear recognition. It jointly involves the reconstruction residuals and the respective reconstruction coefficients pertaining to the input features (co-occurrence of adjacent local binary patterns) for a further fusion. We particularly show that combining both components (i.e., the residuals as well as the coefficients) yields better outcomes than the case when either of them is considered singly. The proposed method has been evaluated on two benchmark datasets, namely IITD1 (125 subjects) and IITD2 (221 subjects). The recognition rates of the suggested scheme amount to 99.5% and 98.95% for the two datasets, respectively, which suggests that our method decently stands out against reference state-of-the-art methodologies. Furthermore, experiments conclude that the presented scheme manifests a promising robustness under large-scale occlusion scenarios.
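
    A schematic of a joint decision rule of this kind (not the authors' exact formulation): class scores fuse the per-class reconstruction residual with the share of sparse-coefficient mass assigned to that class, with the sparse code assumed to be computed elsewhere (e.g., by an l1 solver). The example data at the end are synthetic.

        import numpy as np

        def joint_decision(x, dictionary, labels, codes, alpha=0.5):
            # dictionary: columns are training features; labels: class of each column;
            # codes: sparse coefficients of x over the dictionary (computed elsewhere).
            scores = {}
            for c in set(labels):
                mask = np.array([l == c for l in labels])
                residual = np.linalg.norm(x - dictionary[:, mask] @ codes[mask])
                coeff_mass = np.abs(codes[mask]).sum() / (np.abs(codes).sum() + 1e-12)
                # Smaller residual and larger coefficient mass both favour class c.
                scores[c] = alpha * residual - (1 - alpha) * coeff_mass
            return min(scores, key=scores.get)

        rng = np.random.default_rng(0)
        D = rng.normal(size=(20, 6))                       # 6 training samples, 2 per class
        y = ["a", "a", "b", "b", "c", "c"]
        x = D[:, 0] + 0.05 * rng.normal(size=20)           # query close to the first 'a' sample
        codes = np.array([0.9, 0.1, 0.0, 0.0, 0.0, 0.0])   # assumed sparse code for x
        print(joint_decision(x, D, y, codes))              # expected to favour class 'a'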

  1. Primary motor cortex contributes to the implementation of implicit value-based rules during motor decisions.

    Science.gov (United States)

    Derosiere, Gerard; Zénon, Alexandre; Alamia, Andrea; Duque, Julie

    2017-02-01

    In the present study, we investigated the functional contribution of the human primary motor cortex (M1) to motor decisions. Continuous theta burst stimulation (cTBS) was used to alter M1 activity while participants performed a decision-making task in which the reward associated with the subjects' responses (right hand finger movements) depended on explicit and implicit value-based rules. Subjects performed the task over two consecutive days and cTBS occurred in the middle of Day 2, once the subjects were just about to implement implicit rules, in addition to the explicit instructions, to choose their responses, as evident in the control group (cTBS over the right somatosensory cortex). Interestingly, cTBS over the left M1 prevented subjects from implementing the implicit value-based rule while its implementation was enhanced in the group receiving cTBS over the right M1. Hence, cTBS had opposite effects depending on whether it was applied to the contralateral or ipsilateral M1. The use of the explicit value-based rule was unaffected by cTBS in the three groups of subjects. Overall, the present study provides evidence for a functional contribution of M1 to the implementation of freshly acquired implicit rules, possibly through its involvement in a cortico-subcortical network controlling value-based motor decisions.

  2. Decision-Theoretic Methods in Simulation Optimization

    Science.gov (United States)

    2014-09-24

    ... Los Alamos National Lab: Frazier visited LANL, hosted by Frank Alexander, in January 2013, where he discussed the use of simulation optimization methods for... Alexander, Turab Lookman, and others from LANL, at the Materials Informatics Workshop at the Santa Fe Institute in April 2013. In February 2014, Frazier

  3. 4 CFR 22.19 - Findings and Decisions of the Board [Rule 19].

    Science.gov (United States)

    2010-01-01

    ... 4 CFR 22.19, Findings and Decisions of the Board (Section 22.19, Accounts, GOVERNMENT ACCOUNTABILITY OFFICE, GENERAL PROCEDURES, RULES OF PROCEDURE OF THE GOVERNMENT...), and directions to the parties issued by the Board; (vi) Written transcripts and electronic recordings...

  4. Decision rules for GHB (gamma-hydroxybutyric acid) detoxification: a vignette study

    NARCIS (Netherlands)

    Kamal, R.M.; Iwaarden, S. van; Dijkstra, B.A.; Jong, C.A.J. de

    2014-01-01

    BACKGROUND: GHB-dependent patients can suffer from a severe and sometimes life-threatening withdrawal syndrome. Therefore, most of the patients are treated within inpatient settings. However, some prefer an outpatient approach to treatment. The aim of this study was to develop decision rules for ad

  5. Decision rules for GHB (gamma-hydroxybutyric acid) detoxification: A vignette study

    NARCIS (Netherlands)

    Kamal, R.M.; Iwaarden, S. van; Dijkstra, B.A.G.; Jong, C.A.J. de

    2014-01-01

    Background GHB-dependent patients can suffer from a severe and sometimes life-threatening withdrawal syndrome. Therefore, most of the patients are treated within inpatient settings. However, some prefer an outpatient approach to treatment. The aim of this study was to develop decision rules for add

  6. Investigating decision rules with a new experimental design: the EXACT paradigm

    Science.gov (United States)

    Biscione, Valerio; Harris, Christopher M.

    2015-01-01

    In the decision-making field, it is important to distinguish between the perceptual process (how information is collected) and the decision rule (the strategy governing decision-making). We propose a new paradigm, called EXogenous ACcumulation Task (EXACT) to disentangle these two components. The paradigm consists of showing a horizontal gauge that represents the probability of receiving a reward at time t and increases with time. The participant is asked to press a button when they want to request a reward. Thus, the perceptual mechanism is hard-coded and does not need to be inferred from the data. Based on this paradigm, we compared four decision rules (Bayes Risk, Reward Rate, Reward/Accuracy, and Modified Reward Rate) and found that participants appeared to behave according to the Modified Reward Rate. We propose a new way of analysing the data by using the accuracy of responses, which can only be inferred in classic RT tasks. Our analysis suggests that several experimental findings such as RT distribution and its relationship with experimental conditions, usually deemed to be the result of a rise-to-threshold process, may be simply explained by the effect of the decision rule employed. PMID:26578916

  7. Comparison of Greedy Algorithms for Decision Tree Optimization

    KAUST Repository

    Alkhalid, Abdulaziz

    2013-01-01

    This chapter is devoted to the study of 16 types of greedy algorithms for decision tree construction. The dynamic programming approach is used for construction of optimal decision trees. Optimization is performed relative to minimal values of average depth, depth, number of nodes, number of terminal nodes, and number of nonterminal nodes of decision trees. We compare average depth, depth, number of nodes, number of terminal nodes and number of nonterminal nodes of constructed trees with minimum values of the considered parameters obtained based on a dynamic programming approach. We report experiments performed on data sets from UCI ML Repository and randomly generated binary decision tables. As a result, for depth, average depth, and number of nodes we propose a number of good heuristics. © Springer-Verlag Berlin Heidelberg 2013.
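
    One of the simplest greedy heuristics of the kind compared in the chapter picks, at each node, the attribute whose split leaves the fewest misclassified rows; other impurity measures give the other heuristics. The version below is only a compact illustration, not the chapter's implementation, applied to a tiny invented decision table.

        from collections import Counter

        def misclassified(rows):
            # Rows wrongly labelled if this node predicted its majority decision.
            decisions = [r[-1] for r in rows]
            return len(decisions) - max(Counter(decisions).values()) if decisions else 0

        def best_greedy_split(rows, attributes):
            # Pick the attribute whose partition leaves the fewest misclassified rows.
            def cost(a):
                groups = {}
                for r in rows:
                    groups.setdefault(r[a], []).append(r)
                return sum(misclassified(g) for g in groups.values())
            return min(attributes, key=cost)

        table = [(0, 1, "yes"), (0, 0, "no"), (1, 1, "yes"), (1, 0, "yes")]
        print(best_greedy_split(table, attributes=[0, 1]))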

  8. Strategy optimization for mask rule check in wafer fab

    Science.gov (United States)

    Yang, Chuen Huei; Lin, Shaina; Lin, Roger; Wang, Alice; Lee, Rachel; Deng, Erwin

    2015-07-01

    The photolithography process for wafer production is becoming more and more sophisticated, following Moore's law. Therefore, for a wafer fab, consolidated and close cooperation with the mask house is key to silicon wafer success. However, generally speaking, it is not easy to preserve such a partnership because many engineering efforts and frequent communication are indispensable. This loose connection is evident in mask rule check (MRC). Mask houses perform their own MRC at the job deck stage, but that checking only identifies mask process limitations, including writing, etching, inspection, metrology, etc. No further checking for wafer-process-related mask data errors is performed after the data files of the whole mask are composed in the mask house. Many potential data errors remain even after post-OPC verification has been done for the main circuits. The errors of concern here are those that only occur once the main circuits are combined with frame and dummy patterns to form the whole reticle. Therefore, strategy optimization is ongoing in UMC to evaluate MRC especially for wafer-fab-concerned errors. The prerequisite is that the extra checking has no impact on mask delivery cycle time. Full-mask checking based on the job deck in gds or oasis format is necessary in order to secure acceptable run time. The form of the summarized error report generated by this checking is also crucial, because a user-friendly interface will shorten engineers' judgment time to release the mask for writing. This paper surveys the key factors of MRC in the wafer fab.

  9. Orthogonal search-based rule extraction for modelling the decision to transfuse.

    Science.gov (United States)

    Etchells, T A; Harrison, M J

    2006-04-01

    Data from an audit relating to transfusion decisions during intermediate or major surgery were analysed to determine the strengths of certain factors in the decision making process. The analysis, using orthogonal search-based rule extraction (OSRE) from a trained neural network, demonstrated that the risk of tissue hypoxia (ROTH) assessed using a 100-mm visual analogue scale, the haemoglobin value (Hb) and the presence or absence of on-going haemorrhage (OGH) were able to reproduce the transfusion decisions with a joint specificity of 0.96 and sensitivity of 0.93 and a positive predictive value of 0.9. The rules indicating transfusion were: 1. ROTH > 32 mm and Hb 13 mm and Hb 38 mm, Hb < 102 g x l(-1) and OGH; 4. Hb < 78 g x l(-1).

  10. Optimal Sensor Decision Based on Particle Filter

    Institute of Scientific and Technical Information of China (English)

    XU Meng; WANG Hong-wei; HU Shi-qiang

    2006-01-01

    A novel infrared and radar synergistic tracking algorithm, based on the ideas of closed-loop control, target motion model identification and a particle filter approach, was put forward. In order to improve the observability and reduce the filtering divergence of infrared search and track, the unscented Kalman filter algorithm, which has a stronger ability of non-linear approximation, was adopted. A polynomial and least-squares method based on radar and IRST measurements was proposed to identify the parameters of the model, and a "pseudo sensor" was suggested to estimate the target position according to the identified model even when the radar is turned off. Finally, the average Kullback-Leibler discrimination distance based on the particle filter was used to measure tracking performance; based on the tracking performance and fuzzy stochastic decision making, the idea of closed-loop control was used to retrieve the model parameters of the "pseudo sensor". The experimental results indicate that the algorithm can not only limit radar activity effectively but also maintain the tracking accuracy of the active/passive system.

  11. Dispositional optimism, self-framing and medical decision-making.

    Science.gov (United States)

    Zhao, Xu; Huang, Chunlei; Li, Xuesong; Zhao, Xin; Peng, Jiaxi

    2015-03-01

    Self-framing is an important but underinvestigated area in risk communication and behavioural decision-making, especially in medical settings. The present study aimed to investigate the relationship among dispositional optimism, self-frame and decision-making. Participants (N = 500) responded to the Life Orientation Test-Revised and a self-framing test of a medical decision-making problem. The participants whose scores were higher than the middle value were regarded as highly optimistic individuals; the rest were regarded as low-optimism individuals. The results showed that, compared to the high dispositional optimism group, participants from the low dispositional optimism group showed a greater tendency to use negative vocabulary to construct their self-frame, and tended to choose the radiation therapy with a high treatment survival rate but a low 5-year survival rate. Based on the current findings, it can be concluded that the self-framing effect still exists in medical situations and that individual differences in dispositional optimism can influence the processing of information in a framed decision task, as well as risky decision-making.

  12. Are perceptuo-motor decisions really more optimal than cognitive decisions?

    Science.gov (United States)

    Jarvstad, Andreas; Hahn, Ulrike; Warren, Paul A; Rushton, Simon K

    2014-03-01

    Human high-level cognitive decisions appear sub-optimal (Kahneman, Slovic, & Tversky, 1982; Kahneman & Tversky, 1979). Paradoxically, perceptuo-motor decisions appear optimal, or nearly optimal (Trommershäuser, Maloney, & Landy, 2008). Here, we highlight limitations to the comparison of performance between and within domains. These limitations are illustrated by means of two perceptuo-motor decision-making experiments. The results indicate that participants did not optimize fundamental performance-related factors (precision and time usage), even though standard analyses may have classed participants as 'optimal'. Moreover, simulations and comparisons across our studies demonstrate that optimality depends on task difficulty. Thus, it seems that a standard model of perceptuo-motor decision-making fails to provide an absolute standard of performance. Importantly, this appears to be a limitation of optimal models of human behaviour in general. This, in conjunction with non-trivial evaluative- and methodological differences, suggests that verdicts favouring perceptuo-motor, or perceptual, systems over higher-level cognitive systems in terms of level of performance are premature.

  13. A programmable rules engine to provide clinical decision support using HTML forms.

    Science.gov (United States)

    Heusinkveld, J; Geissbuhler, A; Sheshelidze, D; Miller, R

    1999-01-01

    The authors have developed a simple method for specifying rules to be applied to information on HTML forms. This approach allows clinical experts, who lack the programming expertise needed to write CGI scripts, to construct and maintain domain-specific knowledge and ordering capabilities within WizOrder, the order-entry and decision support system used at Vanderbilt Hospital. The clinical knowledge base maintainers use HTML editors to create forms and spreadsheet programs for rule entry. A test environment has been developed which uses Netscape to display forms; the production environment displays forms using an embedded browser.

  14. Automatic generation of optimal business processes from business rules

    NARCIS (Netherlands)

    Steen, B.; Ferreira Pires, Luis; Iacob, Maria Eugenia

    2010-01-01

    In recent years, business process models are increasingly being used as a means for business process improvement. Business rules can be seen as requirements for business processes, in that they describe the constraints that must hold for business processes that implement these business rules.

  15. Hedging Rules for Water Supply Reservoir Based on the Model of Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Yi Ji

    2016-06-01

    Full Text Available This study proposes a hedging rule model which is composed of a two-period reservoir operation model considering the damage depth and a hedging rule parameter optimization model. The former solves the hedging rules based on a given period's water supply weighting factor and carryover storage target, while the latter optimization model is used to optimize the weighting factor and carryover storage target based on the hedging rules. The coupled model gives the optimal period's water supply weighting factor and carryover storage target to guide release. The conclusions from this study are as follows: (1) the water supply weighting factor and carryover storage target have a direct impact on the three elements of the hedging rule; (2) the parameters can guide reservoirs to supply water reasonably after optimization with the simulation and optimization model; and (3) in order to verify the utility of the hedging rule, the Heiquan reservoir is used as a case study and a particle swarm optimization algorithm with a simulation model is adopted for optimizing the parameters. The results show that the proposed hedging rule can improve the operation performance of the water supply reservoir.
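
    A stylized one-period view of the hedging idea (parameterization invented, not the paper's model): below a trigger volume, only a fraction of each unit of deficit is charged to the current supply, the remainder being retained as carryover storage for the next period.

        def hedged_release(available_water, demand, trigger, hedge=0.5):
            # Linear hedging rule (illustrative): below the trigger volume, only a
            # fraction `hedge` of each unit of deficit is charged to current supply,
            # the rest being retained as carryover storage for the next period.
            if available_water >= trigger:
                return demand
            return max(0.0, demand - hedge * (trigger - available_water))

        for water in (300, 180, 90):
            print(water, hedged_release(water, demand=100, trigger=220))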

  16. An Optimized Weighted Association Rule Mining On Dynamic Content

    CERN Document Server

    Velvadivu, P

    2010-01-01

    Association rule mining aims to explore large transaction databases for association rules. The classical Association Rule Mining (ARM) model assumes that all items have the same significance, without taking their weight into account. It also ignores the differences between transactions and the importance of individual itemsets. The Weighted Association Rule Mining (WARM) model, by contrast, does not work on databases with only binary attributes; it makes use of the importance of each itemset and transaction. WARM requires each item to be given a weight to reflect its importance to the user. The weights may correspond to special promotions on some products, or the profitability of different items. This research work first focuses on a weight assignment based on a directed graph where nodes denote items and links represent association rules. A generalized version of HITS is applied to the graph to rank the items, where all nodes and links are allowed to have weights. This research then uses an enhanced HITS algorithm by developing...

  17. Optimal Rules of Negligent Misrepresentation in Insurance Contract Law

    DEFF Research Database (Denmark)

    Lando, Henrik

    2016-01-01

    lenient rules. The article compares the efficiency of the common and the civil law rules in an adverse selection model in which the insurer separates types of risk not only through a deductible but also by requiring the insured to represent their type. A strict rule of misrepresentation increases the incentive for policy-holders to represent truthfully but also exposes them to risk when they may misrepresent by mistake. While the economic literature has tended to defend the strict common law rule, because it makes it easier for the insurer to separate types, the present article demonstrates that the more lenient civil law rules may be more efficient, especially when the cost for the insurer of auditing types is low.

  18. Antagonistic and Bargaining Games in Optimal Marketing Decisions

    Science.gov (United States)

    Lipovetsky, S.

    2007-01-01

    Game theory approaches to find optimal marketing decisions are considered. Antagonistic games with and without complete information, and non-antagonistic games techniques are applied to paired comparison, ranking, or rating data for a firm and its competitors in the market. Mix strategy, equilibrium in bi-matrix games, bargaining models with…

  20. A decision-analytic approach to the optimal allocation of resources for endangered species consultation

    Science.gov (United States)

    Converse, Sarah J.; Shelley, Kevin J.; Morey, Steve; Chan, Jeffrey; LaTier, Andrea; Scafidi, Carolyn; Crouse, Deborah T.; Runge, Michael C.

    2011-01-01

    The resources available to support conservation work, whether time or money, are limited. Decision makers need methods to help them identify the optimal allocation of limited resources to meet conservation goals, and decision analysis is uniquely suited to assist with the development of such methods. In recent years, a number of case studies have been described that examine optimal conservation decisions under fiscal constraints; here we develop methods to look at other types of constraints, including limited staff and regulatory deadlines. In the US, Section Seven consultation, an important component of protection under the federal Endangered Species Act, requires that federal agencies overseeing projects consult with federal biologists to avoid jeopardizing species. A benefit of consultation is negotiation of project modifications that lessen impacts on species, so staff time allocated to consultation supports conservation. However, some offices have experienced declining staff, potentially reducing the efficacy of consultation. This is true of the US Fish and Wildlife Service's Washington Fish and Wildlife Office (WFWO) and its consultation work on federally-threatened bull trout (Salvelinus confluentus). To improve effectiveness, WFWO managers needed a tool to help allocate this work to maximize conservation benefits. We used a decision-analytic approach to score projects based on the value of staff time investment, and then identified an optimal decision rule for how scored projects would be allocated across bins, where projects in different bins received different time investments. We found that, given current staff, the optimal decision rule placed 80% of informal consultations (those where expected effects are beneficial, insignificant, or discountable) in a short bin where they would be completed without negotiating changes. The remaining 20% would be placed in a long bin, warranting an investment of seven days, including time for negotiation. For formal
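
    In spirit, the allocation rule described above is a score threshold that sends a fixed share of informal consultations to a short-turnaround bin. The sketch below is illustrative, with invented project scores; only the 80/20 split is taken from the abstract.

        def allocate_consultations(scored_projects, short_share=0.8):
            # scored_projects: (project_id, score) pairs, where the score reflects the
            # expected value of extra staff time; high scores go to the long bin.
            ranked = sorted(scored_projects, key=lambda p: p[1])
            cut = int(round(short_share * len(ranked)))
            short_bin = [p for p, _ in ranked[:cut]]    # completed without negotiation
            long_bin = [p for p, _ in ranked[cut:]]     # roughly seven days, incl. negotiation
            return short_bin, long_bin

        projects = [("A", 0.2), ("B", 0.9), ("C", 0.4), ("D", 0.7), ("E", 0.1)]
        print(allocate_consultations(projects))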

  1. Where should I send it? Optimizing the submission decision process.

    Science.gov (United States)

    Salinas, Santiago; Munch, Stephan B

    2015-01-01

    How do scientists decide where to submit manuscripts? Many factors influence this decision, including prestige, acceptance probability, turnaround time, target audience, fit, and impact factor. Here, we present a framework for evaluating where to submit a manuscript based on the theory of Markov decision processes. We derive two models, one in which an author is trying to optimally maximize citations and another in which that goal is balanced by either minimizing the number of resubmissions or the total time in review. We parameterize the models with data on acceptance probability, submission-to-decision times, and impact factors for 61 ecology journals. We find that submission sequences beginning with Ecology Letters, Ecological Monographs, or PLOS ONE could be optimal depending on the importance given to time to acceptance or number of resubmissions. This analysis provides some guidance on where to submit a manuscript given the individual-specific values assigned to these disparate objectives.
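
    As a rough illustration of the kind of bookkeeping such a framework involves (not the authors' Markov decision process itself), the sketch below scores a fixed submission sequence from assumed per-journal acceptance probabilities, review times, and impact factors; all journal names and numbers are hypothetical.

        # Toy evaluation of a fixed journal submission sequence.
        # All figures are hypothetical, not the parameter values used in the article.

        def evaluate_sequence(journals):
            """journals: list of (name, p_accept, review_months, impact_factor),
            tried in order until acceptance."""
            exp_if = 0.0      # expected impact factor at acceptance
            exp_subs = 0.0    # expected number of submissions
            exp_time = 0.0    # expected months spent in review
            p_reach = 1.0     # probability the manuscript reaches this journal
            for k, (_name, p, months, impact) in enumerate(journals, start=1):
                p_accept_here = p_reach * p
                exp_if += p_accept_here * impact
                exp_subs += p_accept_here * k
                exp_time += p_reach * months   # review time is spent whether or not accepted
                p_reach *= (1.0 - p)
            return exp_if, exp_subs, exp_time

        sequence = [("Journal A", 0.10, 2.0, 8.0),
                    ("Journal B", 0.25, 3.0, 4.5),
                    ("Journal C", 0.50, 1.5, 3.0)]
        print(evaluate_sequence(sequence))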

  3. Multinational Firm’s Production Decisions under Overlapping Free Trade Agreements: Rule of Origin Requirements and Environmental Regulation

    Directory of Open Access Journals (Sweden)

    Sung Hee Lee

    2016-12-01

    Full Text Available In this paper, we study the impact of the rule of origin (ROO requirements accompanied by free trade agreements (FTAs, which specify the minimum portion of supplies that should satisfy the origin requirement, on a multinational firm’s production decisions. We consider a multinational firm who exports its product to multiple countries and analyze its production decision in the presence of multiple free trading agreements (FTAs. The ROO requirements in FTAs not only refer to the origin of supplies but are also involved with an environmental regulation of a country of the supplies. As such, meeting multiple ROO requirements can be costly since it may be involved with an adjustment of production facility and suppliers according to different environmental standards. We investigate a multinational firm’s choice of the ROO level in its production decision under multiple FTAs. It is well known that, in the presence of overlapping FTAs, the firm may strategically choose not to comply with the minimum ROO requirements in the FTAs. Instead, the firm may choose to comply with an ROO level that is higher than required, or pay the tariff instead without enjoying tariff exemption by the FTA in the new country. Such unintended outcomes in the FTAs are called the Spaghetti Bowl Effect. We characterize and quantify two types of such Spaghetti Bowl Effects with the optimal production decisions of a multinational firm under multiple ROO requirements and derive policy implications.

  4. Multiple Criteria Decision Making: Efficient Outcome Assessments with Evolutionary Optimization

    Science.gov (United States)

    Kaliszewski, Ignacy; Miroforidis, Janusz

    We propose to derive assessments of outcomes to MCDM problems instead of just outcomes, and to carry out decision-making processes with the former. In contrast to earlier works in that direction, which calculate assessments using subsets of the efficient set (shells), here we provide formulas for the calculation of assessments based on the use of upper and lower approximations (upper and lower shells) of the efficient set, derived by evolutionary optimization. Hence, by replacing shells, which in general have to be derived by exact optimization, with pairs of upper and lower shells, exact optimization methods can be eliminated from MCDM.

  5. Decision or norm: Judicial discretion as a threat to the rule of law

    Directory of Open Access Journals (Sweden)

    Avramović Dragutin

    2012-01-01

    The principles of legality and legal certainty, key notions of even the thinnest concept of the rule of law, are today largely endangered by the widening range of judicial discretion. That trend is increasingly evident in European states as well, owing to the convergence of the common law and civil law legal systems. Judicial decisions acquire ever greater factual importance in European legal systems, although they are generally not considered a source of law. After analysing the positions of leading scholars of legal realism, the author concedes that a very high level of tension frequently exists between judicial decision and legal norm. Within that conflict, the decision often succeeds, with relative ease, in tearing itself away from the strict letter of the law. In applying general legal rules to a concrete case, creatively adjusting the law to life because of the necessarily general and abstract character of legal norms, the judge becomes more a creator of law than one who merely applies it. The author points to the danger of subjective and prejudiced attitudes of judges, as their wide discretion leads them to decide more upon their own feeling of justice than upon the law itself. In that way the law is transformed into judicial decision based upon a subjective understanding of justice and fairness.

  6. Decision making under uncertainty, illustrated by the optimal working interest

    Energy Technology Data Exchange (ETDEWEB)

    Dekker, P. [Gaz de France, Paris (France)

    2007-09-13

    One of the most controversial and difficult decisions to make in the hydrocarbon exploration industry is: "Which project to participate in and what share to take in this project?". In the past these decisions were based on the experience of the exploration managers; however, this is often biased by personal experiences and recent exploration results. During the last few years a significant amount of research has been done on portfolio management and decision-making (see references), and more theory has become available to make consistent and auditable decisions. The aim of this presentation is to demonstrate the value of understanding the company's "risk tolerance" level, and how this relates to determining the "Optimal Working Interest" in exploration projects. (orig.)
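
    A minimal sketch of the underlying idea, assuming an exponential utility with a corporate risk tolerance R (the prospect economics and R below are invented, and this is not the presenter's model): scan candidate working interests and keep the one with the highest certainty equivalent.

        import math

        def certainty_equivalent(w, p_success, npv_success, cost_dry, R):
            """Certainty equivalent of taking working interest w in a single prospect,
            under exponential utility u(x) = 1 - exp(-x/R); all money units identical."""
            outcomes = [(p_success, w * npv_success), (1.0 - p_success, -w * cost_dry)]
            eu = sum(p * (1.0 - math.exp(-x / R)) for p, x in outcomes)
            return -R * math.log(1.0 - eu)   # invert the utility function

        # Hypothetical prospect: 25% chance of a 100 MUSD NPV, otherwise a 20 MUSD dry-hole cost.
        R = 50.0   # assumed corporate risk tolerance (illustrative)
        best = max((certainty_equivalent(w / 100, 0.25, 100.0, 20.0, R), w / 100)
                   for w in range(1, 101))
        print("optimal working interest ~", best[1], "with certainty equivalent", round(best[0], 2))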

  7. Classification and Optimization of Decision Trees for Inconsistent Decision Tables Represented as MVD Tables

    KAUST Repository

    Azad, Mohammad

    2015-10-11

    A decision tree is a widely used technique to discover patterns from a consistent data set. But if the data set is inconsistent, where there are groups of examples (objects) with equal values of conditional attributes but different decisions (values of the decision attribute), then discovering the essential patterns or knowledge from the data set is challenging. We consider three approaches (generalized, most common and many-valued decision) to handle such inconsistency. We created different greedy algorithms using various types of impurity and uncertainty measures to construct decision trees. We compared the three approaches based on the decision tree properties of depth, average depth and number of nodes. Based on the result of the comparison, we chose to work with the many-valued decision approach. To determine which greedy algorithms are efficient, we then compared them based on the optimization and classification results. It was found that the greedy algorithms Mult_ws_entSort and Mult_ws_entML are good for both optimization and classification.
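
    A small sketch of the many-valued decision idea mentioned above: rows with equal condition attributes but different decisions are merged into one row whose decision is the set of all observed decisions. The attribute values below are invented.

        from collections import defaultdict

        # Hypothetical inconsistent decision table: (condition attributes) -> decision
        rows = [(("sunny", "high"), "no"),
                (("sunny", "high"), "yes"),    # same conditions, different decision -> inconsistency
                (("rain", "normal"), "yes"),
                (("sunny", "normal"), "yes")]

        mvd_table = defaultdict(set)
        for conditions, decision in rows:
            mvd_table[conditions].add(decision)   # many-valued decision: keep the whole set

        for conditions, decisions in mvd_table.items():
            print(conditions, "->", sorted(decisions))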

  8. Optimization of European call options considering physical delivery network and reservoir operation rules

    Science.gov (United States)

    Cheng, Wei-Chen; Hsu, Nien-Sheng; Cheng, Wen-Ming; Yeh, William W.-G.

    2011-10-01

    This paper develops alternative strategies for European call options for water purchase under hydrological uncertainties that can be used by water resources managers for decision making. Each alternative strategy maximizes its own objective over a selected sequence of future hydrology that is characterized by exceedance probability. Water trade provides flexibility and enhances water distribution system reliability. However, water trade between two parties in a regional water distribution system involves many issues, such as delivery network, reservoir operation rules, storage space, demand, water availability, uncertainty, and any existing contracts. An option is a security giving the right to buy or sell an asset; in our case, the asset is water. We extend a flow path-based water distribution model to include reservoir operation rules. The model simultaneously considers both the physical distribution network as well as the relationships between water sellers and buyers. We first test the model extension. Then we apply the proposed optimization model for European call options to the Tainan water distribution system in southern Taiwan. The formulation lends itself to a mixed integer linear programming model. We use the weighting method to formulate a composite function for a multiobjective problem. The proposed methodology provides water resources managers with an overall picture of water trade strategies and the consequence of each strategy. The results from the case study indicate that the strategy associated with a streamflow exceedance probability of 50% or smaller should be adopted as the reference strategy for the Tainan water distribution system.

  9. Confronting dynamics and uncertainty in optimal decision making for conservation

    Science.gov (United States)

    Williams, Byron K.; Johnson, Fred A.

    2013-01-01

    The effectiveness of conservation efforts ultimately depends on the recognition that decision making, and the systems that it is designed to affect, are inherently dynamic and characterized by multiple sources of uncertainty. To cope with these challenges, conservation planners are increasingly turning to the tools of decision analysis, especially dynamic optimization methods. Here we provide a general framework for optimal, dynamic conservation and then explore its capacity for coping with various sources and degrees of uncertainty. In broadest terms, the dynamic optimization problem in conservation is choosing among a set of decision options at periodic intervals so as to maximize some conservation objective over the planning horizon. Planners must account for immediate objective returns, as well as the effect of current decisions on future resource conditions and, thus, on future decisions. Undermining the effectiveness of such a planning process are uncertainties concerning extant resource conditions (partial observability), the immediate consequences of decision choices (partial controllability), the outcomes of uncontrolled, environmental drivers (environmental variation), and the processes structuring resource dynamics (structural uncertainty). Where outcomes from these sources of uncertainty can be described in terms of probability distributions, a focus on maximizing the expected objective return, while taking state-specific actions, is an effective mechanism for coping with uncertainty. When such probability distributions are unavailable or deemed unreliable, a focus on maximizing robustness is likely to be the preferred approach. Here the idea is to choose an action (or state-dependent policy) that achieves at least some minimum level of performance regardless of the (uncertain) outcomes. We provide some examples of how the dynamic optimization problem can be framed for problems involving management of habitat for an imperiled species, conservation of a
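
    For orientation, the dynamic optimization machinery referred to here can be illustrated with a discounted value iteration on a tiny made-up habitat model; the states, actions, transition probabilities, and returns below are purely illustrative and not taken from the article.

        import numpy as np

        # Hypothetical 3-state habitat condition (poor, fair, good), 2 actions (rest, restore).
        # P[a][s, s'] = transition probability, R[a][s] = immediate conservation return.
        P = [np.array([[0.8, 0.2, 0.0],    # action 0: rest
                       [0.3, 0.6, 0.1],
                       [0.1, 0.3, 0.6]]),
             np.array([[0.4, 0.5, 0.1],    # action 1: restore (costly but improves the state)
                       [0.1, 0.6, 0.3],
                       [0.0, 0.2, 0.8]])]
        R = [np.array([0.0, 1.0, 2.0]),
             np.array([-0.5, 0.5, 1.5])]
        gamma = 0.95

        V = np.zeros(3)
        for _ in range(1000):
            Q = np.array([R[a] + gamma * P[a] @ V for a in range(2)])
            V_new = Q.max(axis=0)
            if np.max(np.abs(V_new - V)) < 1e-8:
                break
            V = V_new
        policy = Q.argmax(axis=0)
        print("state values:", np.round(V, 2), "optimal action per state:", policy)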

  11. Optimizing water supply and hydropower reservoir operation rule curves: An imperialist competitive algorithm approach

    Science.gov (United States)

    Afshar, Abbas; Emami Skardi, Mohammad J.; Masoumi, Fariborz

    2015-09-01

    Efficient reservoir management requires the implementation of generalized optimal operating policies that manage storage volumes and releases while optimizing a single objective or multiple objectives. Reservoir operating rules stipulate the actions that should be taken under the current state of the system. This study develops a set of piecewise linear operating rule curves for water supply and hydropower reservoirs, employing an imperialist competitive algorithm in a parameterization-simulation-optimization approach. The adaptive penalty method is used for constraint handling and proved to work efficiently in the proposed scheme. Its performance is tested deriving an operation rule for the Dez reservoir in Iran. The proposed modelling scheme converged to near-optimal solutions efficiently in the case examples. It was shown that the proposed optimum piecewise linear rule may perform quite well in reservoir operation optimization as the operating period extends from very short to fairly long periods.

  12. A Simple Decision Rule for Recognition of Poly(A) Tail Signal Motifs in Human Genome

    KAUST Repository

    AbouEisha, Hassan M.

    2015-05-12

    Background: Numerous attempts have been made to predict motifs in genomic sequences that correspond to poly(A) tail signals, and a vast portion of this effort has been directed at a plethora of nonlinear classification methods. Even when such approaches yield good discriminant results, identifying the dominant features of the regulatory mechanisms nevertheless remains a challenge. In this work, we look at decision rules that may help identify such features. Findings: We present a simple decision rule for classification of candidate poly(A) tail signal motifs in human genomic sequence, obtained by evaluating features during the construction of gradient boosted trees. We found that the values of a single feature, based on the frequency of adenine in the genomic sequence surrounding the candidate signal and the number of consecutive adenine molecules in a well-defined region immediately following the motif, display good discriminative potential in classification of poly(A) tail motifs for samples covered by the rule. Conclusions: The resulting simple rule can be used as an efficient filter in the construction of more complex poly(A) tail motif classification algorithms.
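
    A loose illustration of such a rule (not the published one): combine the adenine frequency around a candidate signal with the length of the adenine run immediately downstream and threshold the result. The window sizes and cut-offs below are hypothetical placeholders.

        def simple_polya_rule(seq, motif_start, motif_len=6,
                              window=100, freq_cutoff=0.30, run_cutoff=8):
            """Illustrative rule in the spirit of the abstract.
            freq_cutoff and run_cutoff are hypothetical thresholds, not the published ones."""
            seq = seq.upper()
            lo = max(0, motif_start - window)
            hi = min(len(seq), motif_start + motif_len + window)
            context = seq[lo:hi]
            a_freq = context.count("A") / max(1, len(context))   # adenine frequency around the candidate
            run = 0                                               # adenine run just downstream of the motif
            for base in seq[motif_start + motif_len:motif_start + motif_len + 30]:
                if base == "A":
                    run += 1
                else:
                    break
            return a_freq >= freq_cutoff and run < run_cutoff     # hypothetical decision

        print(simple_polya_rule("T" * 90 + "AATAAA" + "AAAT" + "C" * 90, motif_start=90))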

  13. Optimized block-based connected components labeling with decision trees.

    Science.gov (United States)

    Grana, Costantino; Borghesani, Daniele; Cucchiara, Rita

    2010-06-01

    In this paper, we define a new paradigm for eight-connection labeling, which employs a general approach to improve neighborhood exploration and minimizes the number of memory accesses. First, we exploit and extend the decision table formalism, introducing OR-decision tables, in which multiple alternative actions are managed. An automatic procedure to synthesize the optimal decision tree from the decision table is used, providing the most effective order in which to evaluate conditions. Second, we propose a new scanning technique that moves on a 2 x 2 pixel grid over the image, which is optimized by the automatically generated decision tree. An extensive comparison with state-of-the-art approaches is proposed, both on synthetic and real datasets. The synthetic dataset is composed of random images of different sizes and densities, while the real datasets are an artistic image analysis dataset, a document analysis dataset for text detection and recognition, and finally a standard resolution dataset for picture segmentation tasks. The algorithm provides an impressive speedup over the state-of-the-art algorithms.
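
    For orientation, the basic labeling operation that the paper optimizes can be reproduced with an off-the-shelf routine; the sketch below labels a small binary image with eight-connectivity using SciPy and is a baseline, not the block-based algorithm described above.

        import numpy as np
        from scipy import ndimage

        img = np.array([[1, 1, 0, 0],
                        [0, 0, 1, 0],
                        [1, 0, 0, 1]])

        # Eight-connectivity: a full 3x3 structuring element connects diagonal neighbours too.
        labels, n_components = ndimage.label(img, structure=np.ones((3, 3), dtype=int))
        print(n_components)   # 2 components: the diagonally touching pixels merge under 8-connectivity
        print(labels)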

  14. Constructing an optimal decision tree for FAST corner point detection

    KAUST Repository

    Alkhalid, Abdulaziz

    2011-01-01

    In this paper, we consider a problem that is originated in computer vision: determining an optimal testing strategy for the corner point detection problem that is a part of FAST algorithm [11,12]. The problem can be formulated as building a decision tree with the minimum average depth for a decision table with all discrete attributes. We experimentally compare performance of an exact algorithm based on dynamic programming and several greedy algorithms that differ in the attribute selection criterion. © 2011 Springer-Verlag.

  15. Optimal strategies for electric energy contract decision making

    Science.gov (United States)

    Song, Haili

    2000-10-01

    The power industry restructuring in various countries in recent years has created an environment where trading of electric energy is conducted in a market environment. In such an environment, electric power companies compete for the market share through spot and bilateral markets. Being profit driven, electric power companies need to make decisions on spot market bidding, contract evaluation, and risk management. New methods and software tools are required to meet these upcoming needs. In this research, bidding strategy and contract pricing are studied from a market participant's viewpoint; new methods are developed to guide a market participant in spot and bilateral market operation. A supplier's spot market bidding decision is studied. Stochastic optimization is formulated to calculate a supplier's optimal bids in a single time period. This decision making problem is also formulated as a Markov Decision Process. All the competitors are represented by their bidding parameters with corresponding probabilities. A systematic method is developed to calculate transition probabilities and rewards. The optimal strategy is calculated to maximize the expected reward over a planning horizon. Besides the spot market, a power producer can also trade in the bilateral markets. Bidding strategies in a bilateral market are studied with game theory techniques. Necessary and sufficient conditions of Nash Equilibrium (NE) bidding strategy are derived based on the generators' cost and the loads' willingness to pay. The study shows that in any NE, market efficiency is achieved. Furthermore, all Nash equilibria are revenue equivalent for the generators. The pricing of "Flexible" contracts, which allow delivery flexibility over a period of time with a fixed total amount of electricity to be delivered, is analyzed based on the no-arbitrage pricing principle. The proposed algorithm calculates the price based on the optimality condition of the stochastic optimization formulation

  16. Collective learning and optimal consensus decisions in social animal groups.

    Science.gov (United States)

    Kao, Albert B; Miller, Noam; Torney, Colin; Hartnett, Andrew; Couzin, Iain D

    2014-08-01

    Learning has been studied extensively in the context of isolated individuals. However, many organisms are social and consequently make decisions both individually and as part of a collective. Reaching consensus necessarily means that a single option is chosen by the group, even when there are dissenting opinions. This decision-making process decouples the otherwise direct relationship between animals' preferences and their experiences (the outcomes of decisions). Instead, because an individual's learned preferences influence what others experience, and therefore learn about, collective decisions couple the learning processes between social organisms. This introduces a new, and previously unexplored, dynamical relationship between preference, action, experience and learning. Here we model collective learning within animal groups that make consensus decisions. We reveal how learning as part of a collective results in behavior that is fundamentally different from that learned in isolation, allowing grouping organisms to spontaneously (and indirectly) detect correlations between group members' observations of environmental cues, adjust strategy as a function of changing group size (even if that group size is not known to the individual), and achieve a decision accuracy that is very close to that which is provably optimal, regardless of environmental contingencies. Because these properties make minimal cognitive demands on individuals, collective learning, and the capabilities it affords, may be widespread among group-living organisms. Our work emphasizes the importance and need for theoretical and experimental work that considers the mechanism and consequences of learning in a social context.

  17. Pavement maintenance optimization model using Markov Decision Processes

    Science.gov (United States)

    Mandiartha, P.; Duffield, C. F.; Razelan, I. S. b. M.; Ismail, A. b. H.

    2017-09-01

    This paper presents an optimization model for selection of pavement maintenance interventions using the theory of Markov Decision Processes (MDP). There are some particular characteristics of the MDP developed in this paper which distinguish it from other similar studies or optimization models intended for pavement maintenance policy development. These unique characteristics include a direct inclusion of constraints into the formulation of the MDP, the use of an average cost method of MDP, and a policy development process based on the dual linear programming solution. The limited information and discussion available on these matters for stochastic optimization models in road network management motivates this study. This paper uses a data set acquired from the road authorities of the state of Victoria, Australia, to test the model and recommends steps in the computation of an MDP-based stochastic optimization model, leading to the development of an optimum pavement maintenance policy.
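
    A compact sketch of the average-cost, linear-programming view of an MDP mentioned above, written over state-action occupation measures and solved with SciPy; the three pavement states, two actions, transition probabilities, and costs are invented for illustration.

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical pavement model: 3 condition states (good, fair, poor), 2 actions.
        # P[a][s, s'] = transition probability, C[a][s] = per-period cost of action a in state s.
        P = [np.array([[0.7, 0.3, 0.0],
                       [0.0, 0.6, 0.4],
                       [0.0, 0.0, 1.0]]),      # action 0: do nothing (pavement deteriorates)
             np.array([[0.9, 0.1, 0.0],
                       [0.6, 0.4, 0.0],
                       [0.5, 0.4, 0.1]])]      # action 1: maintain (tends to restore condition)
        C = [np.array([0.0, 2.0, 8.0]),
             np.array([1.0, 3.0, 10.0])]
        nS, nA = 3, 2

        # Variables x[a, s] = long-run fraction of periods spent in state s taking action a.
        c = np.concatenate([C[a] for a in range(nA)])          # objective: average cost
        A_eq = np.zeros((nS + 1, nS * nA))
        for sp in range(nS):                                   # balance: outflow of sp equals inflow to sp
            for a in range(nA):
                A_eq[sp, a * nS + sp] += 1.0
                for s in range(nS):
                    A_eq[sp, a * nS + s] -= P[a][s, sp]
        A_eq[nS, :] = 1.0                                      # occupation measures sum to one
        b_eq = np.zeros(nS + 1)
        b_eq[nS] = 1.0

        res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (nS * nA), method="highs")
        x = res.x.reshape(nA, nS)
        print("minimum average cost:", round(res.fun, 3))
        print("action chosen in each visited state:", x.argmax(axis=0))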

  18. Validation of the Oudega diagnostic decision rule for diagnosing deep vein thrombosis in frail older out-of-hospital patients

    NARCIS (Netherlands)

    Schouten, Henrike J.; Koek, Huiberdina L.; Oudega, Ruud; Van Delden, Johannes J M; Moons, Karel G M; Geersing, Geert Jan

    2015-01-01

    Objective. We aimed to validate the Oudega diagnostic decision rule-which was developed and validated among younger aged primary care patients-to rule-out deep vein thrombosis (DVT) in frail older outpatients. Methods. In older patients (>60 years, either community dwelling or residing in nursing ho

  19. Heuristic Algorithm with Simulation Model for Searching Optimal Reservoir Rule Curves

    Directory of Open Access Journals (Sweden)

    Anongrit Kangrang

    2009-01-01

    This study proposes a heuristic algorithm connected with a simulation model for searching for optimal reservoir rule curves. The proposed model was applied to determine the optimal rule curves of the Ubolratana reservoir (the Chi River Basin, Thailand). The results showed that the pattern of the obtained rule curves is similar to that of the existing rule curves. The obtained rule curves were then used to simulate the Ubolratana reservoir system with the synthetic inflows. The results indicated that the frequency of water shortage and the average water shortage are reduced to 44.31% and 43.75%, respectively, while the frequency of excess release and the average excess release are reduced to 24.08% and 22.81%.

  20. When none of us perform better than all of us together: the role of analogical decision rules in groups.

    Directory of Open Access Journals (Sweden)

    Nicoleta Meslec

    During social interactions, groups develop collective competencies that (ideally) should assist groups to outperform average standalone individual members (weak cognitive synergy) or the best performing member in the group (strong cognitive synergy). In two experimental studies we manipulate the type of decision rule used in group decision-making (identify the best vs. collaborative), and the way in which the decision rules are induced (direct vs. analogical), and we test the effect of these two manipulations on the emergence of strong and weak cognitive synergy. Our most important results indicate that an analogically induced decision rule (imitate-the-successful heuristic), in which groups have to identify the best member and build on his/her performance (take-the-best heuristic), is the most conducive for strong cognitive synergy. Our studies bring evidence for the role of analogy-making in groups as well as the role of fast-and-frugal heuristics for group decision-making.

  1. Generalized concavity in fuzzy optimization and decision analysis

    CERN Document Server

    Ramík, Jaroslav

    2002-01-01

    Convexity of sets in linear spaces, and concavity and convexity of functions, lie at the root of beautiful theoretical results that are at the same time extremely useful in the analysis and solution of optimization problems, including problems of either single objective or multiple objectives. Not all of these results rely necessarily on convexity and concavity; some of the results can guarantee that each local optimum is also a global optimum, giving these methods broader application to a wider class of problems. Hence, the focus of the first part of the book is concerned with several types of generalized convex sets and generalized concave functions. In addition to their applicability to nonconvex optimization, these convex sets and generalized concave functions are used in the book's second part, where decision-making and optimization problems under uncertainty are investigated. Uncertainty in the problem data often cannot be avoided when dealing with practical problems. Errors occur in real-world data for...

  2. Optimal Portfolio Rules with Habit Formation and Preference for Wealth

    Institute of Scientific and Technical Information of China (English)

    Xiao Zheng-yan; Xu Xu-song

    2003-01-01

    This paper describes a model in which a representative investor's preference depends on both his consumption history and his wealth. Thus, the investor accumulates wealth not only for the sake of consumption but also for wealth itself. We examine the implications for consumption and portfolio choice. We solve the consumption-portfolio choice problem and provide the optimal policy. The optimal solution to the problem shows that the preference for wealth and habit formation in consumption will affect the investor's optimal portfolio policy. For the purpose of further research, we also calculate the steady-state distribution of the habit-consumption ratio.

  4. Stability analysis of group decision-making under weighted scoring rules

    Science.gov (United States)

    Wu, Fan; Zhao, Yong; Chen, Yang

    2016-12-01

    The result of group decision-making is always unstable, influenced by some uncertain factors. It is necessary to measure and analyse the stability of the result. A measurement based on the inclined angle of two vectors is proposed in this paper, in order to measure the stabilities of the results of weighted scoring rules. The concepts of stability degree and stability angle are given, whose geometric interpretations are displayed in the case of three candidates. Then an extended measurement called the relative stability degree is discussed to analyse the comparability of stability measurements for different numbers of candidates. Furthermore, this measurement and its extension are used to aid the decision-making of new project development in a software company.
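
    One plausible reading of the inclined-angle idea, with made-up numbers: summarize the stability of a weighted-scoring outcome by the angle between two aggregate score vectors (for example, the scores obtained under two different weight sets), a smaller angle indicating a more stable result. This is a sketch of the concept, not the paper's exact definitions.

        import numpy as np

        def stability_angle(u, v):
            """Angle (degrees) between two aggregate score vectors; 0 means identical results
            up to scale. A 'stability degree' could then be defined, e.g., as cos(angle)."""
            u, v = np.asarray(u, float), np.asarray(v, float)
            cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

        # Hypothetical group scores for three candidates under two weight vectors.
        scores_w1 = [7.2, 6.8, 5.5]
        scores_w2 = [7.0, 7.1, 5.3]
        angle = stability_angle(scores_w1, scores_w2)
        print("stability angle:", round(angle, 2), "deg; stability degree ~",
              round(float(np.cos(np.radians(angle))), 3))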

  5. Rules of Normalisation and their Importance for Interpretation of Systems of Optimal Taxation

    DEFF Research Database (Denmark)

    Munk, Knud Jørgen

    representation of the general equilibrium conditions the rules of normalisation in standard optimal tax models. This allows us to provide an intuitive explanation of what determines the optimal tax system. Finally, we review a number of examples where lack of precision with respect to normalisation in otherwise...

  6. Rough set based rule induction in decision making using credible classification and preference from medical application perspective.

    Science.gov (United States)

    Tseng, Tzu-Liang Bill; Huang, Chun-Che; Fraser, Kym; Ting, Hsien-Wei

    2016-04-01

    This paper presents a new heuristic algorithm for reduct selection based on a credible index in rough set theory (RST) applications. The algorithm is efficient and effective in selecting decision rules, particularly when the problem to be solved is large in scale. It is capable of deriving rules with multiple outcomes and identifying the most significant features simultaneously, which is unique and useful in solving predictive medical problems. The end result of the proposed approach is a set of decision rules that illustrates the causes of solitary pulmonary nodules and the results of long-term treatment.

  7. 38 CFR 20.401 - Rule 401. Effect of decision on administrative or merged appeal on claimant's appellate rights.

    Science.gov (United States)

    2010-07-01

    ... decision on administrative or merged appeal on claimant's appellate rights. 20.401 Section 20.401 Pensions... OF PRACTICE Administrative Appeals § 20.401 Rule 401. Effect of decision on administrative or merged appeal on claimant's appellate rights. (a) Merged appeal. If the administrative appeal is merged,...

  8. FWFA Optimization based Decision Support System for Road Traffic Engineering

    Science.gov (United States)

    Utama, D. N.; Zaki, F. A.; Munjeri, I. J.; Putri, N. U.

    2017-01-01

    Several ways and efforts have already been applied to formally solve road traffic congestion; however, the objective type of road traffic engineering strategy could not be truly proven. Trial and error is an inefficient way to reduce the level of congestion in road traffic engineering. The combination of fuzzy logic and the water flow algorithm (called FWFA) was used as the main method to construct a decision support system (DSS) for selecting the objective strategy in road traffic engineering. The proposed DSS can suggest the optimal strategy decision in road traffic engineering. The main traffic road of Juanda in the Ciputat area, Tangerang Selatan, Banten province, Indonesia, was selected as the research object in this study. The constructed DSS for road traffic engineering is presented in this paper.

  9. Silvicultural decisions based on simulation-optimization systems

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Tianjian

    2010-05-15

    Forest management is facing new challenges under climate change. By adjusting thinning regimes, conventional forest management can be adapted to various objectives of utilization of forest resources, such as wood quality, forest bioenergy, and carbon sequestration. This thesis aims to develop and apply a simulation-optimization system as a tool for an interdisciplinary understanding of the interactions between wood science, forest ecology, and forest economics. In this thesis, the OptiFor software was developed for forest resources management. The OptiFor simulation-optimization system integrated the process-based growth model PipeQual, wood quality models, biomass production and carbon emission models, as well as energy wood and commercial logging models into a single optimization model. Osyczka's direct and random search algorithm was employed to identify optimal values for a set of decision variables. The numerical studies in this thesis broadened our current knowledge and understanding of the relationships between wood science, forest ecology, and forest economics. The results for timber production show that optimal thinning regimes depend on site quality and initial stand characteristics. Taking wood properties into account, our results show that increasing the intensity of thinning resulted in lower wood density and shorter fibers. The addition of nutrients accelerated volume growth, but lowered wood quality for Norway spruce. Integrating energy wood harvesting into conventional forest management showed that conventional forest management without energy wood harvesting was still superior in sparse stands of Scots pine. Energy wood from pre-commercial thinning turned out to be optimal for dense stands. When carbon balance is taken into account, our results show that changing carbon assessment methods leads to very different optimal thinning regimes and average carbon stocks. Raising the carbon price resulted in longer rotations and a higher mean annual

  10. Evaluation of Clinical Decision Rules for Bone Mineral Density Testing among White Women

    Directory of Open Access Journals (Sweden)

    Michael E. Anders

    2013-01-01

    Background. Osteoporosis is a devastating, insidious disease that causes skeletal fragility. Half of women will suffer osteoporotic fractures during their lifetimes. Many fractures occur needlessly because of inattentiveness to assessment, diagnosis, prevention, and treatment of osteoporosis. Study Purpose. To evaluate the discriminatory performance of clinical decision rules used to determine the need to undergo bone mineral density testing. Methods. A nationally representative sample from the Third National Health and Nutrition Examination Survey consisted of 14,060 subjects who completed surveys, physical examinations, laboratory tests, and bone mineral density exams. Multivariable linear regression tested the correlation of the covariates that compose the clinical decision rules with bone mineral density. Results. Increased age and decreased weight were variables in the final regression models for each gender and race/ethnicity. Among the indices, the Osteoporosis Self-Assessment Tool, which is composed of age and weight, performed best for White women. Study Implications. These results have implications for the prevention, assessment, diagnosis, and treatment of osteoporosis. The Osteoporosis Self-Assessment Tool performed best and is inexpensive and the least time-consuming to implement.
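
    The Osteoporosis Self-Assessment Tool mentioned above is, in its commonly cited form, a simple linear index of weight and age. The sketch below uses that form with an illustrative referral cut-off; published cut-offs vary by population and guideline, so the value here is a placeholder, not a recommendation.

        def ost_index(weight_kg, age_years):
            """Commonly cited form of the Osteoporosis Self-Assessment Tool index."""
            return 0.2 * (weight_kg - age_years)

        def refer_for_bmd_testing(weight_kg, age_years, cutoff=2.0):
            """Illustrative decision: refer when the index falls below a chosen cut-off.
            The cut-off here is a placeholder; published thresholds differ by population."""
            return ost_index(weight_kg, age_years) < cutoff

        print(ost_index(60, 72), refer_for_bmd_testing(60, 72))   # index -2.4 -> refer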

  11. Derivation of a clinical decision rule for predictive factors for the development of pharyngocutaneous fistula postlaryngectomy.

    Science.gov (United States)

    Cecatto, Suzana Boltes; Monteiro-Soares, Matilde; Henriques, Teresa; Monteiro, Eurico; Moura, Carla Isabel Ferreira Pinto

    2015-01-01

    Pharyngocutaneous fistula after larynx and hypopharynx cancer surgery can cause considerable harm. This study's aim was to derive a clinical decision rule to predict pharyngocutaneous fistula development after pharyngolaryngeal cancer surgery. A retrospective cohort study was conducted, including all patients undergoing total laryngectomy/pharyngolaryngectomy (n=171). Association between pertinent variables and pharyngocutaneous fistula development was assessed and a predictive model proposed. The American Society of Anesthesiologists scale, chemoradiotherapy, and tracheotomy before surgery were associated with fistula in the univariate analysis. In the multivariate analysis, only the American Society of Anesthesiologists scale maintained statistical significance. Using logistic regression, a predictive model including the following was derived: American Society of Anesthesiologists scale, alcohol, chemoradiotherapy, tracheotomy, hemoglobin and albumin pre-surgery, local extension, N-classification, and diabetes mellitus. The model's score area under the curve was 0.76 (95% CI 0.64-0.87). The high-risk group presented specificity of 93%, positive likelihood ratio of 7.10, and positive predictive value of 76%. Including the medium-low, medium-high, and high-risk groups, a sensitivity of 92%, negative likelihood ratio of 0.25, and negative predictive value of 89% were observed. A clinical decision rule was created to identify patients with high risk of pharyngocutaneous fistula development. Prognostic accuracy measures were substantial. Nevertheless, it is essential to conduct larger prospective studies for validation and refinement.

  12. An international study of emergency physicians' practice for acute headache management and the need for a clinical decision rule.

    Science.gov (United States)

    Perry, Jeffrey J; Eagles, Debra; Clement, Catherine M; Brehaut, Jamie; Kelly, Anne-Maree; Mason, Suzanne; Stiell, Ian G

    2009-11-01

    Patients with acute headache often undergo computed tomography (CT) followed by a lumbar puncture to rule out subarachnoid hemorrhage. Our international study examined current practice, the perceived need for a clinical decision rule for acute headache and the required sensitivity for such a rule. We approached 2100 emergency physicians from 4 countries (Australia, Canada, the United Kingdom and the United States) to participate in our survey by sampling the membership of their emergency associations. We used a modified Dillman technique with 3-5 notifications and a prenotification letter employing a combination of electronic mail and postal mail. Physicians were questioned about neurologically intact patients who presented with headache. Analysis included both descriptive statistics for the entire sample and stratification by country. The total response rate was 54.7% (1149/2100). Respondents were primarily male (75.5%), with a mean age of 42.5 years and a mean 12.3 years of emergency department (ED) experience. Of the physicians who responded, 49.5% thought all acute headache patients should be investigated with CT and 57.4% felt CT should always be followed by lumbar puncture. Of the respondents, 95.7% reported they would consider using a clinical decision rule for patients with acute headache to rule out subarachnoid hemorrhage. Respondents deemed the median sensitivity required by such a rule to be 99% (interquartile range 98%-99%). Approximately 1 in 5 physicians suggested that 100% sensitivity was required. Emergency physicians report that they would welcome a clinical decision rule for headache that would determine which patients require costly or invasive tests to rule out subarachnoid hemorrhage. The required sensitivity of such a rule was realistic. These results will inform and inspire the development of clinical decision rules for acute headache in the ED.

  13. Optimization of conventional rule curves coupled with hedging rules for reservoir operation

    DEFF Research Database (Denmark)

    Taghian, Mehrdad; Rosbjerg, Dan; Haghighi, Ali;

    2014-01-01

    to achieve the optimal water allocation and the target storage levels for reservoirs. As a case study, a multipurpose, multireservoir system in southern Iran is selected. The results show that the model has good performance in extracting the optimum policy for reservoir operation under both normal...

  14. Extending the horizons advances in computing, optimization, and decision technologies

    CERN Document Server

    Joseph, Anito; Mehrotra, Anuj; Trick, Michael

    2007-01-01

    Computer Science and Operations Research continue to have a synergistic relationship and this book represents the results of cross-fertilization between OR/MS and CS/AI. It is this interface of OR/CS that makes possible advances that could not have been achieved in isolation. Taken collectively, these articles are indicative of the state-of-the-art in the interface between OR/MS and CS/AI and of the high caliber of research being conducted by members of the INFORMS Computing Society. EXTENDING THE HORIZONS: Advances in Computing, Optimization, and Decision Technologies is a volume that presents the latest, leading research in the design and analysis of algorithms, computational optimization, heuristic search and learning, modeling languages, parallel and distributed computing, simulation, computational logic and visualization. This volume also emphasizes a variety of novel applications in the interface of CS, AI, and OR/MS.

  15. Comparative Analysis and Survey of Ant Colony Optimization based Rule Miners

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2017-01-01

    In this research study, we analyze the performance of bio-inspired classification approaches by selecting Ant-Miners (Ant-Miner, cAnt_Miner, cAnt_Miner2 and cAnt_MinerPB) for the discovery of classification rules, in terms of accuracy, terms per rule, number of rules, running time and model size discovered by the corresponding rule mining algorithm. Classification rule discovery is still a challenging and emerging research problem in the field of data mining and knowledge discovery. Rule-based classification has become a cutting-edge research area due to its importance and popular application areas in banking, market basket analysis, credit card fraud detection, customer behaviour, stock market prediction and protein sequence analysis. There are various approaches proposed for the discovery of classification rules, such as Artificial Neural Networks, Genetic Algorithms, Evolutionary Programming, SVM and Swarm Intelligence. This research study is focused on classification rule discovery by Ant Colony Optimization. For the performance analysis, the Myra tool is used for experiments on 18 public datasets (available in the UCI repository). Data sets are selected with varying numbers of instances, attributes and classes. This research paper also provides a focused survey of Ant-Miners for the discovery of classification rules.

  16. A diagnosis-based clinical decision rule for spinal pain part 2: review of the literature

    Directory of Open Access Journals (Sweden)

    Hurwitz Eric L

    2008-08-01

    Background: Spinal pain is a common and often disabling problem. The research on various treatments for spinal pain has, for the most part, suggested that while several interventions have demonstrated mild to moderate short-term benefit, no single treatment has a major impact on either pain or disability. There is great need for more accurate diagnosis in patients with spinal pain. In a previous paper, the theoretical model of a diagnosis-based clinical decision rule was presented. The approach is designed to provide the clinician with a strategy for arriving at a specific working diagnosis from which treatment decisions can be made. It is based on three questions of diagnosis. In the current paper, the literature on the reliability and validity of the assessment procedures that are included in the diagnosis-based clinical decision rule is presented. Methods: The databases of Medline, Cinahl, Embase and MANTIS were searched for studies that evaluated the reliability and validity of clinic-based diagnostic procedures for patients with spinal pain that have relevance for questions 2 (which investigates characteristics of the pain source) and 3 (which investigates perpetuating factors of the pain experience). In addition, the reference list of identified papers and authors' libraries were searched. Results: A total of 1769 articles were retrieved, of which 138 were deemed relevant. Fifty-one studies related to reliability and 76 related to validity. One study evaluated both reliability and validity. Conclusion: Regarding some aspects of the DBCDR, there are a number of studies that allow the clinician to have a reasonable degree of confidence in his or her findings. This is particularly true for centralization signs, neurodynamic signs and psychological perpetuating factors. There are other aspects of the DBCDR in which a lesser degree of confidence is warranted, and in which further research is needed.

  17. Optimal decision making and matching are tied through diminishing returns.

    Science.gov (United States)

    Kubanek, Jan

    2017-08-08

    How individuals make decisions has been a matter of long-standing debate among economists and researchers in the life sciences. In economics, subjects are viewed as optimal decision makers who maximize their overall reward income. This framework has been widely influential, but requires a complete knowledge of the reward contingencies associated with a given choice situation. Psychologists and ecologists have observed that individuals tend to use a simpler "matching" strategy, distributing their behavior in proportion to relative rewards associated with their options. This article demonstrates that the two dominant frameworks of choice behavior are linked through the law of diminishing returns. The relatively simple matching can in fact provide maximal reward when the rewards associated with decision makers' options saturate with the invested effort. Such saturating relationships between reward and effort are hallmarks of the law of diminishing returns. Given the prevalence of diminishing returns in nature and social settings, this finding can explain why humans and animals so commonly behave according to the matching law. The article underscores the importance of the law of diminishing returns in choice behavior.
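
    A toy numerical sketch of the comparison the article draws, under one particular saturating (hyperbolic) reward form chosen arbitrarily: find the reward-maximizing split of a fixed effort budget by grid search and compare it with the split that satisfies the matching law. How closely the two agree depends on the exact shape of the diminishing returns, which is the article's point.

        import numpy as np

        # Two options with saturating ("diminishing returns") reward functions of invested effort.
        def r1(e): return 10.0 * e / (e + 2.0)    # hypothetical saturating curves
        def r2(e): return 6.0 * e / (e + 1.0)

        T = 10.0                                   # total effort budget
        efforts = np.linspace(0.001, T - 0.001, 100000)
        total = r1(efforts) + r2(T - efforts)
        e_opt = efforts[np.argmax(total)]          # reward-maximizing split (grid search)

        # Matching law: effort ratio equals obtained-reward ratio, e1/e2 = r1(e1)/r2(e2).
        imbalance = np.abs(efforts * r2(T - efforts) - (T - efforts) * r1(efforts))
        e_match = efforts[np.argmin(imbalance)]    # split that satisfies matching

        print(f"optimal split: {e_opt:.2f}, matching split: {e_match:.2f}")
        print(f"reward at optimum: {total.max():.3f}, reward under matching: "
              f"{(r1(e_match) + r2(T - e_match)):.3f}")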

  18. Investigation of Multi-Criteria Decision Consistency: A Triplex Approach to Optimal Oilfield Portfolio Investment Decisions

    Science.gov (United States)

    Qaradaghi, Mohammed

    Complexity of the capital intensive oil and gas portfolio investments is continuously growing. It is manifested in the constant increase in the type, number and degree of risks and uncertainties, which consequently lead to more challenging decision making problems. A typical complex decision making problem in petroleum exploration and production (E&P) is the selection and prioritization of oilfields/projects in a portfolio investment. Prioritizing oilfields maybe required for different purposes, including the achievement of a targeted production and allocation of limited available development resources. These resources cannot be distributed evenly nor can they be allocated based on the oilfield size or production capacity alone since various other factors need to be considered simultaneously. These factors may include subsurface complexity, size of reservoir, plateau production and needed infrastructure in addition to other issues of strategic concern, such as socio-economic, environmental and fiscal policies, particularly when the decision making involves governments or national oil companies. Therefore, it would be imperative to employ decision aiding tools that not only address these factors, but also incorporate the decision makers' preferences clearly and accurately. However, the tools commonly used in project portfolio selection and optimization, including intuitive approaches, vary in their focus and strength in addressing the different criteria involved in such decision problems. They are also disadvantaged by a number of drawbacks, which may include lacking the capacity to address multiple and interrelated criteria, uncertainty and risk, project relationship with regard to value contribution and optimum resource utilization, non-monetary attributes, decision maker's knowledge and expertise, in addition to varying levels of ease of use and other practical and theoretical drawbacks. These drawbacks have motivated researchers to investigate other tools and

  19. How accurate are interpretations of curriculum-based measurement progress monitoring data? Visual analysis versus decision rules.

    Science.gov (United States)

    Van Norman, Ethan R; Christ, Theodore J

    2016-10-01

    Curriculum based measurement of oral reading (CBM-R) is used to monitor the effects of academic interventions for individual students. Decisions to continue, modify, or terminate these interventions are made by interpreting time series CBM-R data. Such interpretation is founded upon visual analysis or the application of decision rules. The purpose of this study was to compare the accuracy of visual analysis and decision rules. Visual analysts interpreted 108 CBM-R progress monitoring graphs one of three ways: (a) without graphic aids, (b) with a goal line, or (c) with a goal line and a trend line. Graphs differed along three dimensions, including trend magnitude, variability of observations, and duration of data collection. Automated trend line and data point decision rules were also applied to each graph. Inferential analyses permitted the estimation of the probability of a correct decision (i.e., the student is improving - continue the intervention, or the student is not improving - discontinue the intervention) for each evaluation method as a function of trend magnitude, variability of observations, and duration of data collection. All evaluation methods performed better when students made adequate progress. Visual analysis and decision rules performed similarly when observations were less variable. Results suggest that educators should collect data for more than six weeks, take steps to control measurement error, and visually analyze graphs when data are variable. Implications for practice and research are discussed.
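
    A minimal sketch of a trend-line decision rule of the kind compared in the study (the goal, schedule, and scores below are invented): fit an ordinary least-squares trend to the weekly scores, project it to the goal date, and continue or modify the intervention accordingly.

        import numpy as np

        def trend_line_decision(weeks, scores, goal_week, goal_score):
            """Fit an OLS trend to CBM-R scores and compare the projected score at the
            goal date with the goal; returns 'continue' if the projection meets the goal."""
            slope, intercept = np.polyfit(weeks, scores, deg=1)
            projected = slope * goal_week + intercept
            return ("continue" if projected >= goal_score else "modify"), slope, projected

        weeks = np.arange(1, 9)                                  # 8 weeks of monitoring (hypothetical)
        scores = np.array([52, 55, 54, 58, 60, 59, 63, 66])      # words read correctly per minute
        decision, slope, projected = trend_line_decision(weeks, scores, goal_week=16, goal_score=85)
        print(decision, f"(slope {slope:.2f} wcpm/week, projected {projected:.1f} at week 16)")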

  20. Performance of a New Enhanced Topological Decision-Rule Map-Matching Algorithm for Transportation Applications

    Directory of Open Access Journals (Sweden)

    C. Blazquez

    2012-11-01

    Full Text Available Map-matching problems arise in numerous transportation-related applications when spatial data is collected usinginaccurate GPS technology and integrated with a flawed digital roadway map in a GIS environment. This paperpresents a new enhanced post-processing topological decision-rule map-matching algorithm in order to addressrelevant special cases that occur in the spatial mismatch resolution. The proposed map-matching algorithm includessimple algorithmic improvements: dynamic buffer that varies its size to snap GPS data points to at least one roadwaycenterline; a comparison between vehicle heading measurements and associated roadway centerline direction; and anew design of the sequence of steps in the algorithm architecture. The original and new versions of the algorithmwere tested on different spatial data qualities collected in Canada and United States. Although both versionssatisfactorily resolve complex spatial ambiguities, the comparative and statistical analysis indicates that the newalgorithm with the simple algorithmic improvements outperformed the original version of the map-matching algorithm.

  1. Optimizing Fuzzy Rule Base for Illumination Compensation in Face Recognition using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Bima Sena Bayu Dewantara

    2014-12-01

    Fuzzy rule optimization is a challenging step in the development of a fuzzy model. A simple two-input fuzzy model may have thousands of combinations of fuzzy rules when it deals with a large number of input variations, and intuitive, trial-and-error determination of fuzzy rules is very difficult. This paper addresses the problem of optimizing a fuzzy rule base using a genetic algorithm to compensate for illumination effects in face recognition. Since uneven illumination contributes negative effects to the performance of face recognition, those effects must be compensated. We have developed a novel algorithm based on a reflectance model to compensate for the effect of illumination in human face recognition. We build a pair of models from a single image and reason over those models using fuzzy logic; the fuzzy rule base is then optimized using a genetic algorithm. This approach spends less computation cost while still keeping high performance. Based on the experimental results, we show that our algorithm is feasible for recognizing a desired person under variable lighting conditions with faster computation time. Keywords: face recognition, harsh illumination, reflectance model, fuzzy, genetic algorithm

  2. Survey of emergency physicians' requirements for a clinical decision rule for acute respiratory illnesses in three countries.

    Science.gov (United States)

    Perry, Jeffrey J; Goindi, Reena; Symington, Cheryl; Brehaut, Jamie; Taljaard, Monica; Schneider, Sandra; Stiell, Ian G

    2012-03-01

    Objective: There are currently no widely used guidelines to determine which older patients with acute respiratory conditions require hospital admission. This study assessed the need for clinical decision rules to help determine whether hospital admission is required for patients over 50 years for three common respiratory conditions: chronic obstructive pulmonary disease (COPD), heart failure (HF), and community-acquired pneumonia (CAP). Postal survey. Emergency physicians (EPs) from the United States, Canada, and Australasia. A random sample of EPs from the United States, Canada, and Australasia. A modified Dillman technique with a prenotification letter and up to three postal surveys. EP opinions regarding the need for and willingness to use clinical decision rules for emergency department (ED) patients over 50 years with COPD, HF, or CAP to predict hospital admission. We assessed the required sensitivity of each rule for return ED visit or death within 14 days. A total of 801 responses from 1,493 surveys were received, with response rates of 55%, 60%, and 46% for Australasia, Canada, and the United States, respectively. Over 90% of EPs reported that they would consider using clinical decision rules for HF, CAP, and COPD. The median required sensitivity for death within 14 days was 97 to 98% for all conditions. EPs are likely to adopt highly sensitive clinical decision rules to predict the need for hospital admission for patients over 50 years with COPD, HF, or CAP.

  3. Optimal quadrature rules for odd-degree spline spaces and their application to tensor-product-based isogeometric analysis

    KAUST Repository

    Barton, Michael

    2016-03-14

    We introduce optimal quadrature rules for spline spaces that are frequently used in Galerkin discretizations to build mass and stiffness matrices. Using the homotopy continuation concept (Bartoň and Calo, 2016) that transforms optimal quadrature rules from source spaces to target spaces, we derive optimal rules for splines defined on finite domains. Starting with the classical Gaussian quadrature for polynomials, which is an optimal rule for a discontinuous odd-degree space, we derive rules for target spaces of higher continuity. We further show how the homotopy methodology handles cases where the source and target rules require different numbers of optimal quadrature points. We demonstrate it by deriving optimal rules for various odd-degree spline spaces, particularly with non-uniform knot sequences and non-uniform multiplicities. We also discuss convergence of our rules to their asymptotic counterparts, that is, the analogues of the midpoint rule of Hughes et al. (2010), that are exact and optimal for infinite domains. For spaces of low continuities, we numerically show that the derived rules quickly converge to their asymptotic counterparts as the weights and nodes of a few boundary elements differ from the asymptotic values.
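
    For readers unfamiliar with the source rule the derivation starts from, the short sketch below checks the defining optimality property of classical Gaussian quadrature: an n-point Gauss-Legendre rule integrates polynomials up to degree 2n-1 exactly. It uses only NumPy and does not reproduce the spline-space rules derived in the paper.

```python
# Illustration of the source rule: 3-point Gauss-Legendre quadrature is exact
# for polynomials up to degree 5 on [-1, 1].
import numpy as np

n = 3
nodes, weights = np.polynomial.legendre.leggauss(n)

def quad(f):
    return float(np.dot(weights, f(nodes)))

f = lambda x: 4*x**5 - 3*x**3 + x**2 + 1   # degree-5 polynomial
exact = 2/3 + 2                            # odd terms vanish; x**2 gives 2/3, 1 gives 2
print(quad(f), exact)                      # both print 2.666..., agreeing to machine precision
```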

  4. Allogeneic cell therapy bioprocess economics and optimization: downstream processing decisions.

    Science.gov (United States)

    Hassan, Sally; Simaria, Ana S; Varadaraju, Hemanthram; Gupta, Siddharth; Warren, Kim; Farid, Suzanne S

    2015-01-01

    To develop a decisional tool to identify the most cost-effective process flowsheets for allogeneic cell therapies across a range of production scales, a bioprocess economics and optimization tool was built to assess competing cell expansion and downstream processing (DSP) technologies. Tangential flow filtration was generally more cost-effective for the lower cells/lot achieved in planar technologies, while fluidized bed centrifugation became the only feasible option for handling large bioreactor outputs. DSP bottlenecks were observed at large commercial lot sizes requiring multiple large bioreactors. The DSP contribution to the cost of goods/dose ranged from 20% to 55% for planar flowsheets and from 50% to 80% for bioreactor flowsheets. This analysis can facilitate early decision-making during process development.

  5. Optimization design of Iptables rule set

    Institute of Scientific and Technical Information of China (English)

    林燕

    2015-01-01

    The working principles of Iptables are analyzed and its rule set is introduced. An effective rule-set optimization algorithm is proposed and implemented under Linux. The optimization algorithm can effectively eliminate duplicate rules from the rule set, directly improving the system's filtering efficiency and reducing the time required to filter packets, thereby improving the network throughput of the firewall system.
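
    A minimal sketch of one of the optimizations the abstract mentions, removing duplicate rules while preserving first-match order; the rule strings below are illustrative, not taken from the paper.

```python
# Remove duplicate iptables rules while keeping the order of first occurrence
# (order matters, since iptables evaluates rules top-down).
def dedupe_rules(rules):
    seen, kept = set(), []
    for rule in rules:
        normalized = " ".join(rule.split())   # collapse whitespace differences
        if normalized not in seen:
            seen.add(normalized)
            kept.append(rule)
    return kept

rules = [
    "-A INPUT -p tcp --dport 22 -j ACCEPT",
    "-A INPUT -p tcp --dport 80 -j ACCEPT",
    "-A INPUT -p tcp  --dport 22 -j ACCEPT",   # duplicate of the first rule
    "-A INPUT -j DROP",
]
print("\n".join(dedupe_rules(rules)))          # three rules remain
```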

  6. Finding optimal step of fuzzy Newton-Cotes integration rules by using the CESTAC method

    Directory of Open Access Journals (Sweden)

    Samad Noeiaghdam

    2017-08-01

    Full Text Available The aim of this work is to evaluate the value of a fuzzy integral by applying the Newton-Cotes integration rules via a reliable scheme. In order to perform the numerical examples, the CADNA (Control of Accuracy and Debugging for Numerical Applications) library and the CESTAC (Controle et Estimation Stochastique des Arrondis de Calculs) method are applied, based on stochastic arithmetic. By using this method, the optimal number of points in the fuzzy numerical integration rules and the optimal approximate solution are obtained. The accuracy of the fuzzy quadrature rules is also discussed. An algorithm is given to illustrate the implementation of the method; the termination criterion is that the Hausdorff distance between two sequential results becomes an informatical zero. Two sample fuzzy integrals are evaluated with the proposed algorithm to show the importance and advantage of using stochastic arithmetic in place of floating-point arithmetic.
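
    The paper's stopping test is built on the CESTAC method and fuzzy integrands; the sketch below only illustrates the generic idea of refining a composite Newton-Cotes rule until two sequential results nearly coincide, using ordinary floating-point arithmetic and a crisp integrand.

```python
# Generic illustration (not the CESTAC/CADNA machinery): refine composite Simpson
# quadrature until two sequential approximations differ by less than a tolerance.
import math

def composite_simpson(f, a, b, n):        # n must be even
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i*h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i*h) for i in range(2, n, 2))
    return s * h / 3

def integrate(f, a, b, tol=1e-10, n=2):
    prev = composite_simpson(f, a, b, n)
    while True:
        n *= 2
        cur = composite_simpson(f, a, b, n)
        if abs(cur - prev) < tol:         # "two sequential results" criterion
            return cur, n
        prev = cur

value, points = integrate(math.sin, 0.0, math.pi)
print(value, points)                      # ~2.0 and the number of subintervals used
```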

  7. Using an improved association rules mining optimization algorithm in web-based mobile-learning system

    Science.gov (United States)

    Huang, Yin; Chen, Jianhua; Xiong, Shaojun

    2009-07-01

    Mobile-Learning (M-learning) gives many learners the advantages of both traditional learning and E-learning. Currently, Web-based Mobile-Learning Systems have created many new ways and defined new relationships between educators and learners. Association rule mining is one of the most important fields in data mining and knowledge discovery in databases. Rule explosion is a serious problem that causes great concern, as conventional mining algorithms often produce too many rules for decision makers to digest. Since a Web-based Mobile-Learning System collects vast amounts of student profile data, data mining and knowledge discovery techniques can be applied to find interesting relationships between attributes of learners, assessments, the solution strategies adopted by learners, and so on. Therefore, this paper focuses on a new data-mining algorithm, called ARGSA (Association Rules based on an improved Genetic Simulated Annealing Algorithm), which combines the advantages of the genetic algorithm and the simulated annealing algorithm to mine association rules. The paper first takes advantage of a Parallel Genetic Algorithm and Simulated Annealing Algorithm designed specifically for discovering association rules. Moreover, analysis and experiments show that the proposed method is superior to the Apriori algorithm in this Mobile-Learning system.

  8. Association Rule Mining for Both Frequent and Infrequent Items Using Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    MIR MD. JAHANGIR KABIR

    2014-07-01

    Full Text Available In data mining research, generating frequent items from large databases is one of the important issues and the key factor for implementing association rule mining tasks. Mining infrequent items, such as relationships among rare but expensive products, is another demanding issue, as shown in some recent studies. Therefore this study considers user-assigned threshold values as a constraint which helps users mine those rules which are more interesting for them. In addition, in the real world users may prefer to know relationships among frequent items along with infrequent ones. The particle swarm optimization algorithm has become an important heuristic technique in recent years, and this study uses it to mine association rules effectively. If this technique considers user-defined threshold values, interesting association rules can be generated more efficiently. Therefore this study proposes a novel approach which uses the particle swarm optimization algorithm to mine association rules from databases. Our implementation of the search strategy includes a bitmap representation of nodes in a lexicographic tree, and from the superset-subset relationship of the nodes it classifies frequent items along with infrequent itemsets. In addition, this approach avoids the extra calculation overhead of generating frequent pattern trees and handling the large memory that stores the support values of candidate itemsets. Our experimental results show that this approach efficiently mines association rules. It accesses the database to calculate support values for a smaller number of nodes to find frequent itemsets, and from these it generates association rules, which dramatically reduces search time. The main aim of the proposed algorithm is to show how a heuristic method works on real databases to find all the interesting association rules in an efficient way.
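
    A small sketch of the bitmap idea mentioned above: each item is encoded as a bitmap over transactions, and the support of an itemset is the population count of the AND of its items' bitmaps. The toy transactions are invented for illustration.

```python
# Bitmap support counting: superset bitmaps are ANDs of subset bitmaps, which is
# what a lexicographic-tree search exploits.
transactions = [
    {"bread", "milk"},
    {"bread", "diaper", "beer"},
    {"milk", "diaper", "beer"},
    {"bread", "milk", "diaper"},
]

def item_bitmap(item):
    bits = 0
    for i, t in enumerate(transactions):
        if item in t:
            bits |= 1 << i
    return bits

def support(itemset):
    bits = ~0
    for item in itemset:
        bits &= item_bitmap(item)
    return bin(bits & ((1 << len(transactions)) - 1)).count("1")

print(support({"bread", "milk"}))      # 2 transactions contain both
print(support({"diaper", "beer"}))     # 2
```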

  9. Optimal decisions of countries with carbon tax and carbon tariff

    Directory of Open Access Journals (Sweden)

    Yumei Hou

    2015-05-01

    Full Text Available Purpose: Reducing carbon emissions has recently been at the core of controlling global warming and climate deterioration. This paper focuses on the optimal carbon taxation policy levied by countries and its impact on firms' optimal production decisions. Design/methodology/approach: This paper uses a two-stage game theory model to analyze the impact of carbon tariffs and taxes. Numerical simulation is used to supplement the theoretical analysis. Findings: Results derived from the paper indicate that demand in an unstable market is significantly affected by the level of environmental damage. A carbon tariff is a policy-oriented tax while a carbon tax is a market-oriented one. A comprehensive carbon taxation policy benefits developed countries, while a basic policy is more suitable for developing countries. Research limitations/implications: In this research we do not consider random demand or asymmetric information, which may limit how well the model reflects reality. Originality/value: This work provides a different perspective in analyzing the impact of carbon taxes and tariffs. It is the first study to consider two consuming markets and the strategic game between two countries. The different international status of the countries considered in the paper is also a unique point.

  10. A clinical decision rule for the use of plain radiography in children after acute wrist injury: development and external validation of the Amsterdam Pediatric Wrist Rules

    Energy Technology Data Exchange (ETDEWEB)

    Slaar, Annelie; Maas, Mario; Rijn, Rick R. van [University of Amsterdam, Department of Radiology, Academic Medical Centre, Meibergdreef 9, 1105, AZ, Amsterdam (Netherlands); Walenkamp, Monique M.J.; Bentohami, Abdelali; Goslings, J.C. [University of Amsterdam, Trauma Unit, Department of Surgery, Academic Medical Centre, Amsterdam (Netherlands); Steyerberg, Ewout W. [Erasmus MC - University Medical Centre, Department of Public Health, Rotterdam (Netherlands); Jager, L.C. [University of Amsterdam, Emergency Department, Academic Medical Centre, Amsterdam (Netherlands); Sosef, Nico L. [Spaarne Hospital, Department of Surgery, Hoofddorp (Netherlands); Velde, Romuald van [Tergooi Hospitals, Department of Surgery, Hilversum (Netherlands); Ultee, Jan M. [Sint Lucas Andreas Hospital, Department of Surgery, Amsterdam (Netherlands); Schep, Niels W.L. [University of Amsterdam, Trauma Unit, Department of Surgery, Academic Medical Centre, Amsterdam (Netherlands); Maasstadziekenhuis Rotterdam, Department of Surgery, Rotterdam (Netherlands)

    2016-01-15

    In most hospitals, children with acute wrist trauma are routinely referred for radiography. The aim was to develop and validate a clinical decision rule to decide whether radiography in children with wrist trauma is required. We prospectively developed and validated a clinical decision rule in two study populations. All children who presented to the emergency department of four hospitals with pain following wrist trauma were included and evaluated for 18 clinical variables. The outcome was a wrist fracture diagnosed by plain radiography. Included in the study were 787 children. The prediction model consisted of six variables: age, swelling of the distal radius, visible deformation, distal radius tender to palpation, anatomical snuffbox tender to palpation, and painful or abnormal supination. The model showed an area under the receiver operating characteristic curve of 0.79 (95% CI: 0.76-0.83). The sensitivity and specificity were 95.9% and 37.3%, respectively. The use of this model would have resulted in a 22% absolute reduction of radiographic examinations. In a validation study, 7/170 fractures (4.1%, 95% CI: 1.7-8.3%) would have been missed using the decision model. The decision model may be a valuable tool to decide whether radiography in children after wrist trauma is required. (orig.)

  11. Comparing two socially optimal work allocation rules when having a profit optimizing subcontractor with ample capacity

    DEFF Research Database (Denmark)

    Larsen, Christian

    2003-01-01

    We study a service system modelled as a single server queueing system where requests for service can be processed either at the service system or by a subcontractor. In the former case the customer incurs waiting costs but the service is free, while in the latter case the customer must pay...... are minimized. The two work allocation rules are characterized by one being centralized and randomized while the other is decentralized and deterministic. From the subcontractor's point of view the latter is preferred, and under this rule he receives fewer requests but can charge a higher price compared...

  12. Comparing two socially optimal work allocation rules when having a profit optimizing subcontractor with ample capacity

    DEFF Research Database (Denmark)

    Larsen, Christian

    We study a service system modelled as a single server queueing system where requests for service can be processed either at the service system or by a subcontractor. In the former case the customer incurs waiting costs but the service is free, while in the latter case the customer must...... of the customers are minimized. The two work allocation rules are characterized by one being centralized and randomized while the other is decentralized and deterministic. From the subcontractor's point of view the latter is preferred, and under this rule he receives fewer requests but can charge a higher price...

  13. Comparing two socially optimal work allocation rules when having a profit optimizing subcontractor with ample capacity

    DEFF Research Database (Denmark)

    Larsen, Christian

    2005-01-01

    We study a service system modelled as a single server queuing system where requests for service can be processed either at the service system or by a subcontractor. In the former case the customer incurs waiting costs but the service is free, while in the latter case the customer must pay....... The two work allocation rules are characterized by one being centralized and randomized while the other is decentralized and deterministic. From the subcontractor's point of view the latter is preferred, and under this rule he receives fewer requests but can charge a higher price compared to the former. We...

  14. Optimization of Mapping Rule of Bit-Interleaved Turbo Coded Modulation with 16QAM

    Institute of Scientific and Technical Information of China (English)

    FEI Ze-song; YANG Yu; LIU Lin-nan; KUANG Jing-ming

    2005-01-01

    Optimization of the mapping rule of bit-interleaved Turbo coded modulation with 16 quadrature amplitude modulation (QAM) is investigated, based on the different impacts of various encoded bit sequences on Turbo decoding performance. Furthermore, a bit-interleaved in-phase and quadrature phase (I-Q) Turbo coded modulation scheme is designed, analogous to I-Q trellis coded modulation (TCM). Through performance evaluation and analysis, it can be seen that the novel mapping rule outperforms the traditional one, while the I-Q Turbo coded modulation cannot achieve the expected performance. Therefore, there is no obvious advantage in using the I-Q method in bit-interleaved Turbo coded modulation.
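
    For context, the sketch below shows a conventional Gray-mapped 16QAM bit-to-symbol mapping, i.e. the kind of mapping rule whose assignment of encoded bits to constellation points is being optimized; it is not the optimized mapping derived in the paper.

```python
# Standard Gray-mapped 16QAM modulator: each group of 4 bits selects one of 16
# complex symbols; Gray mapping makes adjacent constellation points differ in one bit.
GRAY2 = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}   # 2 bits -> PAM level

def qam16_map(bits):
    assert len(bits) % 4 == 0
    symbols = []
    for k in range(0, len(bits), 4):
        b = tuple(bits[k:k+4])
        i = GRAY2[b[0:2]]          # first two bits -> in-phase level
        q = GRAY2[b[2:4]]          # last two bits  -> quadrature level
        symbols.append(complex(i, q))
    return symbols

print(qam16_map([0, 0, 1, 1, 1, 0, 0, 1]))   # [(-3+1j), (3-1j)]
```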

  15. Developing a decision rule to optimise clinical pharmacist resources for medication reconciliation in the emergency department.

    Science.gov (United States)

    De Winter, Sabrina; Vanbrabant, Peter; Laeremans, Pieter; Foulon, Veerle; Willems, Ludo; Verelst, Sandra; Spriet, Isabel

    2017-08-01

    The process of obtaining a complete medication history, without discrepancies, for patients admitted to hospital from the ED is error prone and time consuming. The goal of this study was the development of a clinical decision rule (CDR) with a high positive predictive value for detecting ED patients admitted to hospital who are at risk of at least one discrepancy during regular medication history acquisition, along with favourable feasibility considering time and budget constraints. Data were based on a previous prospective study conducted at an ED in Belgium, describing discrepancies in 3592 medication histories. Data were split into a training and a validation set. A model predicting the number of discrepancies was derived from the training set with negative binomial regression and was validated on the validation set. The performance of the model was assessed. Several CDRs were constructed and evaluated on positive predictive value and alert rate. The following variables were retained in the prediction model: (1) age, (2) gender, (3) medical discipline for which the patient was admitted, (4) degree of physician training, (5) season of admission, (6) type of care before admission, and the number of (7) drugs, (8) high-risk drugs, (9) drugs acting on the alimentary tract and metabolism, (10) antithrombotics, antihaemorrhagics and antianaemic preparations, (11) cardiovascular drugs, (12) drugs acting on the musculoskeletal system and (13) drugs acting on the nervous system; all recorded by the ED physician on admission. The final CDR resulted in an alert rate of 29% with a positive predictive value of 74%. The final CDR allows identification of the majority of patients with a potential discrepancy within a feasible workload for the pharmacy staff. Our CDR is a first step towards a rule that could be incorporated into electronic medical records or a scoring system.

  16. A data mining approach to optimize pellets manufacturing process based on a decision tree algorithm.

    Science.gov (United States)

    Ronowicz, Joanna; Thommes, Markus; Kleinebudde, Peter; Krysiński, Jerzy

    2015-06-20

    The present study is focused on a thorough analysis of cause-effect relationships between pellet formulation characteristics (pellet composition as well as process parameters) and a selected quality attribute of the final product. The quality of the pellets was expressed by their shape, using the aspect ratio value. The data matrix for chemometric analysis consisted of 224 pellet formulations prepared with eight different active pharmaceutical ingredients and several excipients, using different extrusion/spheronization process conditions. The data set contained 14 input variables (both formulation and process variables) and one output variable (pellet aspect ratio). A tree regression algorithm consistent with the Quality by Design concept was applied to obtain deeper understanding and knowledge of the formulation and process parameters affecting final pellet sphericity. A clear, interpretable set of decision rules was generated. The spheronization speed, spheronization time, number of holes and water content of the extrudate were recognized as the key factors influencing pellet aspect ratio. The most spherical pellets were achieved by using a large number of holes during extrusion, a high spheronizer speed and a longer spheronization time. The described data mining approach enhances knowledge about the pelletization process and simultaneously facilitates searching for the optimal process conditions necessary to achieve ideally spherical pellets with good flow characteristics. This data mining approach can be taken into consideration by industrial formulation scientists to support rational decision making in the field of pellet technology. Copyright © 2015 Elsevier B.V. All rights reserved.
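
    A hedged sketch of the modelling step: fitting a regression tree that predicts aspect ratio from a few process variables and printing its decision rules. The data are synthetic (the study's 224 formulations are not reproduced here) and the variable ranges are assumptions.

```python
# Regression tree on synthetic "pellet" data; export_text prints the learned rules.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
n = 224
X = np.column_stack([
    rng.uniform(500, 1500, n),    # spheronization speed (rpm)   -- assumed range
    rng.uniform(1, 10, n),        # spheronization time (min)
    rng.uniform(20, 40, n),       # water content of extrudate (%)
])
# Synthetic relationship: faster/longer spheronization -> aspect ratio closer to 1
y = 1.6 - 0.0003*X[:, 0] - 0.03*X[:, 1] + 0.005*X[:, 2] + rng.normal(0, 0.02, n)

tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["speed", "time", "water"]))
```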

  17. Approximation Algorithms for Optimal Decision Trees and Adaptive TSP Problems

    CERN Document Server

    Gupta, Anupam; Nagarajan, Viswanath; Ravi, R

    2010-01-01

    We consider the problem of constructing optimal decision trees: given a collection of tests which can disambiguate between a set of $m$ possible diseases, each test having a cost, and the a-priori likelihood of the patient having any particular disease, what is a good adaptive strategy to perform these tests to minimize the expected cost to identify the disease? We settle the approximability of this problem by giving a tight $O(\log m)$-approximation algorithm. We also consider a more substantial generalization, the Adaptive TSP problem. Given an underlying metric space, a random subset $S$ of cities is drawn from a known distribution, but $S$ is initially unknown to us--we get information about whether any city is in $S$ only when we visit the city in question. What is a good adaptive way of visiting all the cities in the random subset $S$ while minimizing the expected distance traveled? For this problem, we give the first poly-logarithmic approximation, and show that this algorithm is best possible unless w...

  18. Comparing two socially optimal work allocation rules when having a profit optimizing subcontractor with ample capacity

    DEFF Research Database (Denmark)

    Larsen, Christian

    2005-01-01

    We study a service system modelled as a single server queuing system where requests for service can be processed either at the service system or by a subcontractor. In the former case the customer incurs waiting costs but the service is free, while in the latter case the customer must pay for the service but there is no waiting time, hence no waiting costs. Under the premise that the subcontractor prices his services in order to maximize his profit, we study two work allocation rules which, given the price of the subcontractor, seek to allocate work such that the costs of the customers are minimized...... We also show that, from the customers' point of view, either of the two work allocation rules is preferable to the base case where there is no subcontractor.

  19. The internal and external optimality of decisions based on tests

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1979-01-01

    In applied measurement, test scores are usually transformed to decisions. Analogous to classical test theory, the reliability of decisions has been defined as the consistency of decisions on a test and a retest or on two parallel tests. Coefficient kappa (Cohen, 1960) is used for assessing the

  20. Adaptive Conflict-Free Optimization of Rule Sets for Network Security Packet Filtering Devices

    Directory of Open Access Journals (Sweden)

    Andrea Baiocchi

    2015-01-01

    Full Text Available Packet filtering and processing rules management in firewalls and security gateways has become commonplace in increasingly complex networks. On one side there is a need to maintain the logic of high level policies, which requires administrators to implement and update a large amount of filtering rules while keeping them conflict-free, that is, avoiding security inconsistencies. On the other side, traffic adaptive optimization of large rule lists is useful for general purpose computers used as filtering devices, without specific designed hardware, to face growing link speeds and to harden filtering devices against DoS and DDoS attacks. Our work joins the two issues in an innovative way and defines a traffic adaptive algorithm to find conflict-free optimized rule sets, by relying on information gathered with traffic logs. The proposed approach suits current technology architectures and exploits available features, like traffic log databases, to minimize the impact of ACO development on the packet filtering devices. We demonstrate the benefit entailed by the proposed algorithm through measurements on a test bed made up of real-life, commercial packet filtering devices.
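
    The sketch below illustrates the core idea in miniature: promote frequently hit rules toward the top of the list (fewer comparisons per packet) but never swap two rules whose match conditions overlap, so the filtering semantics stay conflict-free. The overlap test and rule format are simplified assumptions, not the paper's algorithm.

```python
# Traffic-adaptive, conflict-free reordering via a constrained bubble sort on hit counts.
def overlap(r1, r2):
    # Simplified overlap test on destination port ranges only.
    return not (r1["ports"][1] < r2["ports"][0] or r2["ports"][1] < r1["ports"][0])

def reorder(rules):
    rules = list(rules)
    changed = True
    while changed:
        changed = False
        for i in range(len(rules) - 1):
            a, b = rules[i], rules[i + 1]
            if b["hits"] > a["hits"] and not overlap(a, b):
                rules[i], rules[i + 1] = b, a      # safe swap: no conflict
                changed = True
    return rules

rules = [
    {"name": "ssh",  "ports": (22, 22),      "hits": 10,   "action": "ACCEPT"},
    {"name": "web",  "ports": (80, 443),     "hits": 9000, "action": "ACCEPT"},
    {"name": "high", "ports": (1024, 65535), "hits": 50,   "action": "DROP"},
]
print([r["name"] for r in reorder(rules)])   # ['web', 'high', 'ssh']
```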

  1. Optimal two-phase sampling design for comparing accuracies of two binary classification rules.

    Science.gov (United States)

    Xu, Huiping; Hui, Siu L; Grannis, Shaun

    2014-02-10

    In this paper, we consider the design for comparing the performance of two binary classification rules, for example, two record linkage algorithms or two screening tests. Statistical methods are well developed for comparing these accuracy measures when the gold standard is available for every unit in the sample, or in a two-phase study when the gold standard is ascertained only in the second phase in a subsample using a fixed sampling scheme. However, these methods do not attempt to optimize the sampling scheme to minimize the variance of the estimators of interest. In comparing the performance of two classification rules, the parameters of primary interest are the difference in sensitivities, specificities, and positive predictive values. We derived the analytic variance formulas for these parameter estimates and used them to obtain the optimal sampling design. The efficiency of the optimal sampling design is evaluated through an empirical investigation that compares the optimal sampling with simple random sampling and with proportional allocation. Results of the empirical study show that the optimal sampling design is similar for estimating the difference in sensitivities and in specificities, and both achieve a substantial amount of variance reduction with an over-sample of subjects with discordant results and under-sample of subjects with concordant results. A heuristic rule is recommended when there is no prior knowledge of individual sensitivities and specificities, or the prevalence of the true positive findings in the study population. The optimal sampling is applied to a real-world example in record linkage to evaluate the difference in classification accuracy of two matching algorithms. Copyright © 2013 John Wiley & Sons, Ltd.

  2. Optimal unified combination rule in application of Dempster-Shafer theory to lung cancer radiotherapy dose response outcome analysis.

    Science.gov (United States)

    He, Yanyan; Hussaini, M Yousuff; Gong, Yutao U T; Xiao, Ying

    2016-01-08

    Our previous study demonstrated the application of the Dempster-Shafer theory of evidence to dose/volume/outcome data analysis. Specifically, it provided Yager's rule to fuse data from different institutions pertaining to radiotherapy pneumonitis versus mean lung dose. The present work is a follow-on study that employs the optimal unified combination rule, which optimizes data similarity among independent sources. Specifically, we construct belief and plausibility functions on the lung cancer radiotherapy dose outcome datasets, and then apply the optimal unified combination rule to obtain combined belief and plausibility, which bound the probabilities of pneumonitis incidence. To estimate the incidence of pneumonitis at any value of mean lung dose, we use the Lyman-Kutcher-Burman (LKB) model to fit the combined belief and plausibility curves. The results show that the optimal unified combination rule yields a narrower uncertainty range (as represented by the belief-plausibility range) than Yager's rule, which is also theoretically proven.
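
    As a reference point, the following sketch implements Yager's combination rule (the baseline the study compares against) for two mass functions on a two-element frame; the masses are invented for illustration and the optimal unified rule itself is not reproduced here.

```python
# Yager's rule: intersect focal elements of two mass functions; mass on conflicting
# (empty) intersections is transferred to the whole frame instead of being normalized away.
from itertools import product

FRAME = frozenset({"pneumonitis", "no_pneumonitis"})

def yager_combine(m1, m2):
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    combined[FRAME] = combined.get(FRAME, 0.0) + conflict   # conflict -> ignorance
    return combined

# Two illustrative sources (masses are made up, not the paper's data):
m1 = {frozenset({"pneumonitis"}): 0.6, FRAME: 0.4}
m2 = {frozenset({"no_pneumonitis"}): 0.5, FRAME: 0.5}
print(yager_combine(m1, m2))   # masses sum to 1; 0.3 of conflict moves to the frame
```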

  3. A Fuzzy Optimization Technique for the Prediction of Coronary Heart Disease Using Decision Tree

    Directory of Open Access Journals (Sweden)

    Persi Pamela. I

    2013-06-01

    Full Text Available Data mining, along with soft computing techniques, helps to unravel hidden relationships and diagnose diseases efficiently even with uncertainties and inaccuracies. Coronary Heart Disease (CHD) is a killer disease leading to heart attacks and sudden deaths. Since the diagnosis involves vague symptoms and tedious procedures, it is usually time-consuming and false diagnoses may occur. A fuzzy system, one of the soft computing methodologies, is proposed in this paper along with a data mining technique for efficient diagnosis of coronary heart disease. Though the database has 76 attributes, only 14 attributes are found to be efficient for CHD diagnosis as per all the published experiments and doctors' opinion, so only the essential attributes are taken from the heart disease database. From these attributes, crisp rules are obtained by employing the CART decision tree algorithm, which are then applied to the fuzzy system. A Particle Swarm Optimization (PSO) technique is applied for the optimization of the fuzzy membership functions, where the parameters of the membership functions are altered to new positions. The results from the fuzzy system predict the prevalence of coronary heart disease, and the system's accuracy was found to be good.

  4. Problems on Solving Matrix Aggregation in Group Decision-Making by Glowworm Swarm Optimization

    OpenAIRE

    Yaping Li

    2016-01-01

    Judgment matrix aggregation, as an important part of group decision-making, has been widely and deeply studied due to the universality and importance of group decision-making in the management field. For the various forms of judgment matrices in group decision-making, the matrix aggregation result can be obtained using glowworm swarm optimization. First, this paper introduces the basic principle of the glowworm swarm optimization (GSO) algorithm and gives an improved GSO algorithm to ...

  5. A study to derive a clinical decision rule for triage of emergency department patients with chest pain: design and methodology

    Directory of Open Access Journals (Sweden)

    Jaffe Allan

    2008-02-01

    Full Text Available Background: Chest pain is the second most common chief complaint in North American emergency departments. Data from the U.S. suggest that 2.1% of patients with acute myocardial infarction and 2.3% of patients with unstable angina are misdiagnosed, with slightly higher rates reported in a recent Canadian study (4.6% and 6.4%, respectively). Information obtained from the history, 12-lead ECG, and a single set of cardiac enzymes is unable to identify patients who are safe for early discharge with sufficient sensitivity. The 2007 ACC/AHA guidelines for UA/NSTEMI do not identify patients at low risk for adverse cardiac events who can be safely discharged without provocative testing. As a result, large numbers of low-risk patients are triaged to chest pain observation units and undergo provocative testing, at significant cost to the healthcare system. Clinical decision rules use clinical findings (history, physical exam, test results) to suggest a diagnostic or therapeutic course of action. Currently no methodologically robust clinical decision rule identifies patients safe for early discharge. Methods/design: The goal of this study is to derive a clinical decision rule which will allow emergency physicians to accurately identify patients with chest pain who are safe for early discharge. The study will utilize a prospective cohort design. Standardized clinical variables will be collected on all patients at least 25 years of age complaining of chest pain prior to provocative testing. Variables strongly associated with the composite outcome acute myocardial infarction, revascularization, or death will be further analyzed with multivariable analysis to derive the clinical rule. Specific aims are to: (i) apply standardized clinical assessments to patients with chest pain, incorporating results of early cardiac testing; (ii) determine the inter-observer reliability of the clinical information; (iii) determine the statistical association between the clinical

  6. Extensions of dynamic programming as a new tool for decision tree optimization

    KAUST Repository

    Alkhalid, Abdulaziz

    2013-01-01

    The chapter is devoted to the consideration of two types of decision trees for a given decision table: α-decision trees (the parameter α controls the accuracy of tree) and decision trees (which allow arbitrary level of accuracy). We study possibilities of sequential optimization of α-decision trees relative to different cost functions such as depth, average depth, and number of nodes. For decision trees, we analyze relationships between depth and number of misclassifications. We also discuss results of computer experiments with some datasets from UCI ML Repository. ©Springer-Verlag Berlin Heidelberg 2013.

  7. The spatial decision-supporting system combination of RBR & CBR based on artificial neural network and association rules

    Science.gov (United States)

    Tian, Yangge; Bian, Fuling

    2007-06-01

    Artificial intelligence technology should be integrated with geographic information systems to build a spatial decision-supporting system (SDSS). The paper discusses the structure of the SDSS and, after comparing the characteristics of RBR and CBR, proposes the framework of a spatial decision system that combines RBR and CBR, taking advantage of both. The paper also discusses CBR in agricultural spatial decisions, the application of ANN (Artificial Neural Network) in CBR, and enriching the inference rule base based on association rules. Finally, the design of the system is tested and verified with an example of crop adaptability evaluation.

  8. Decision Tool for optimal deployment of radar systems

    NARCIS (Netherlands)

    Vogel, M.H.

    1995-01-01

    A Decision Tool for air defence is presented. This Decision Tool, when provided with information about the radar, the environment, and the expected class of targets, informs the radar operator about detection probabilities. This assists the radar operator to select the optimum radar parameters. n

  9. Decision Tool for optimal deployment of radar systems

    NARCIS (Netherlands)

    Vogel, M.H.

    1995-01-01

    A Decision Tool for air defence is presented. This Decision Tool, when provided with information about the radar, the environment, and the expected class of targets, informs the radar operator about detection probabilities. This assists the radar operator to select the optimum radar parameters. n th

  10. Soft matrices on soft multisets in an optimal decision process

    Science.gov (United States)

    Coskun, Arzu Erdem; Aras, Cigdem Gunduz; Cakalli, Huseyin; Sonmez, Ayse

    2016-08-01

    In this paper, we introduce the concept of a soft matrix on a soft multiset, and investigate how to use soft matrices to solve decision-making problems. An algorithm for a multiple-choice selection problem is also provided. Finally, we demonstrate an illustrative example to show the decision-making steps.

  11. OPTIMIZATION OF MULTIPLE-CHANNEL COOPERATIVE SPECTRUM SENSING WITH DATA FUSION RULE IN COGNITIVE RADIO NETWORKS

    Institute of Scientific and Technical Information of China (English)

    Yu Huogen; Tang Wanbin; Li Shaoqian

    2012-01-01

    This paper focuses on multi-channel Cooperative Spectrum Sensing (CSS) where Secondary Users (SUs) are assigned to cooperatively sense multiple channels simultaneously. A multi-channel CSS optimization problem of joint spectrum sensing and SU assignment based on a data fusion rule is formulated, which maximizes the total throughput of the Cognitive Radio Network (CRN) subject to constraints on the probabilities of detection and false alarm. To address the optimization problem, a Branch and Bound (BnB) algorithm and a greedy algorithm are proposed to obtain the optimal solutions. Simulation results are presented to demonstrate the effectiveness of the proposed algorithms and show that a throughput improvement is achieved through the joint design. It is also shown that the greedy algorithm, with low complexity, achieves performance comparable to the exhaustive algorithm.

  12. On optimal decision-making in brains and social insect colonies.

    Science.gov (United States)

    Marshall, James A R; Bogacz, Rafal; Dornhaus, Anna; Planqué, Robert; Kovacs, Tim; Franks, Nigel R

    2009-11-01

    The problem of how to compromise between speed and accuracy in decision-making faces organisms at many levels of biological complexity. Striking parallels are evident between decision-making in primate brains and collective decision-making in social insect colonies: in both systems, separate populations accumulate evidence for alternative choices; when one population reaches a threshold, a decision is made for the corresponding alternative, and this threshold may be varied to compromise between the speed and the accuracy of decision-making. In primate decision-making, simple models of these processes have been shown, under certain parametrizations, to implement the statistically optimal procedure that minimizes decision time for any given error rate. In this paper, we adapt these same analysis techniques and apply them to new models of collective decision-making in social insect colonies. We show that social insect colonies may also be able to achieve statistically optimal collective decision-making in a very similar way to primate brains, via direct competition between evidence-accumulating populations. This optimality result makes testable predictions for how collective decision-making in social insects should be organized. Our approach also represents the first attempt to identify a common theoretical framework for the study of decision-making in diverse biological systems.
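
    A toy illustration of the shared mechanism described above: two noisy evidence accumulators race to a common threshold, and raising the threshold trades decision speed for accuracy. The drift and noise values are arbitrary, not fitted to any data from the paper.

```python
# Two accumulating populations race to a threshold; the threshold sets the
# speed-accuracy trade-off.
import random

def decide(threshold, drift_correct=0.12, drift_wrong=0.08, noise=0.3):
    a = b = 0.0
    t = 0
    while True:
        t += 1
        a += drift_correct + random.gauss(0, noise)   # evidence for the true option
        b += drift_wrong + random.gauss(0, noise)     # evidence for the other option
        if a >= threshold or b >= threshold:
            return (a >= b), t                        # (correct?, decision time)

for threshold in (2.0, 6.0):
    trials = [decide(threshold) for _ in range(2000)]
    acc = sum(c for c, _ in trials) / len(trials)
    rt = sum(t for _, t in trials) / len(trials)
    print(f"threshold={threshold}: accuracy={acc:.2f}, mean time={rt:.1f} steps")
```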

  13. Dimension reduction of decision variables for multireservoir operation: A spectral optimization model

    Science.gov (United States)

    Chen, Duan; Leon, Arturo S.; Gibson, Nathan L.; Hosseini, Parnian

    2016-01-01

    Optimizing the operation of a multireservoir system is challenging due to the high dimension of the decision variables that lead to a large and complex search space. A spectral optimization model (SOM), which transforms the decision variables from time domain to frequency domain, is proposed to reduce the dimensionality. The SOM couples a spectral dimensionality-reduction method called Karhunen-Loeve (KL) expansion within the routine of Nondominated Sorting Genetic Algorithm (NSGA-II). The KL expansion is used to represent the decision variables as a series of terms that are deterministic orthogonal functions with undetermined coefficients. The KL expansion can be truncated into fewer significant terms, and consequently, fewer coefficients by a predetermined number. During optimization, operators of the NSGA-II (e.g., crossover) are conducted only on the coefficients of the KL expansion rather than the large number of decision variables, significantly reducing the search space. The SOM is applied to the short-term operation of a 10-reservoir system in the Columbia River of the United States. Two scenarios are considered herein, the first with 140 decision variables and the second with 3360 decision variables. The hypervolume index is used to evaluate the optimization performance in terms of convergence and diversity. The evaluation of optimization performance is conducted for both conventional optimization model (i.e., NSGA-II without KL) and the SOM with different number of KL terms. The results show that the number of decision variables can be greatly reduced in the SOM to achieve a similar or better performance compared to the conventional optimization model. For the scenario with 140 decision variables, the optimal performance of the SOM model is found with six KL terms. For the scenario with 3360 decision variables, the optimal performance of the SOM model is obtained with 11 KL terms.
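
    A hedged sketch of the dimensionality-reduction step: a long release schedule is expressed as a mean plus a few Karhunen-Loeve terms (eigenvectors of an assumed temporal covariance), so an optimizer such as NSGA-II only searches over the retained coefficients. The covariance model and mean schedule below are assumptions, not the paper's data.

```python
# Truncated KL expansion of a 140-step release schedule.
import numpy as np

T = 140                                                  # decision variables per reservoir
t = np.arange(T)
cov = np.exp(-np.abs(t[:, None] - t[None, :]) / 20.0)    # assumed smooth temporal correlation
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]                        # largest eigenvalues first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 6                                                    # number of retained KL terms
mean_release = np.full(T, 100.0)                         # illustrative mean schedule

def schedule_from_coeffs(coeffs):
    """Map k KL coefficients back to a full-length release schedule."""
    return mean_release + eigvecs[:, :k] @ (np.sqrt(eigvals[:k]) * coeffs)

coeffs = np.random.default_rng(1).normal(size=k)         # what the optimizer would search over
print(schedule_from_coeffs(coeffs).shape)                # (140,) -- recovered from only 6 numbers
```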

  14. Architecture For The Optimization Of A Machining Process In Real Time Through Rule-Based Expert System

    Science.gov (United States)

    Serrano, Rafael; González, Luis Carlos; Martín, Francisco Jesús

    2009-11-01

    Under the project SENSOR-IA, which received financial funding from the Order of Incentives to the Regional Technology Centers of the Council of Innovation, Science and Enterprise of Andalusia, an architecture for the optimization of a machining process in real time through a rule-based expert system has been developed. The architecture consists of a sensor data acquisition and processing engine (SATD) and a rule-based expert system (SE) which communicates with the SATD. The SE has been designed as an inference engine with an algorithm for effective action, using a modus ponens rule model of goal-oriented rules. The pilot test demonstrated that it is possible to govern the machining process in real time based on rules contained in the SE. The tests have been done with approximate rules. Future work includes an exhaustive collection of data with different tool materials and geometries in a database, to extract more precise rules.
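
    A minimal sketch of a forward-chaining, modus ponens inference engine of the kind the abstract describes; the rules and facts are invented for illustration and do not come from the SENSOR-IA system.

```python
# Forward chaining: fire rules whose antecedents are all satisfied, until no new
# facts can be derived.
RULES = [
    ({"vibration_high", "temperature_high"}, "reduce_feed_rate"),
    ({"tool_wear_high"}, "schedule_tool_change"),
    ({"reduce_feed_rate"}, "notify_operator"),
]

def forward_chain(facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in RULES:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)    # modus ponens: premises hold -> assert conclusion
                changed = True
    return facts

print(forward_chain({"vibration_high", "temperature_high"}))
# {'vibration_high', 'temperature_high', 'reduce_feed_rate', 'notify_operator'}
```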

  15. Optimization for decision making linear and quadratic models

    CERN Document Server

    Murty, Katta G

    2010-01-01

    While maintaining the rigorous linear programming instruction required, Murty's new book is unique in its focus on developing modeling skills to support valid decision-making for complex real world problems, and includes solutions to brand new algorithms.

  16. a New Framework for Geospatial Site Selection Using Artificial Neural Networks as Decision Rules: a Case Study on Landfill Sites

    Science.gov (United States)

    Abujayyab, S. K. M.; Ahamad, M. A. S.; Yahya, A. S.; Saad, A.-M. H. Y.

    2015-10-01

    This paper briefly introduces the theory and framework of geospatial site selection (GSS) and discusses the application and framework of artificial neural networks (ANNs). The literature on the use of ANNs as decision rules in GSS from 2000 to 2015 is scarce. As this study found, ANNs are not only adaptable to dynamic changes but also capable of improving the objectivity of acquisition in GSS, reducing time consumption, and providing high validation. ANNs make for a powerful tool for solving geospatial decision-making problems by enabling geospatial decision makers to implement their constraints and imprecise concepts. This tool offers a way to represent and handle uncertainty. Specifically, ANNs are decision rules implemented to enhance conventional GSS frameworks. The main assumption in implementing ANNs in GSS is that the current characteristics of existing sites are indicative of the degree of suitability of new locations with similar characteristics. GSS requires several input criteria that embody specific requirements and the desired site characteristics. In this study, the proposed framework consists of four stages for implementing ANNs in GSS. A multilayer feed-forward network with a backpropagation algorithm was used to train the networks on prior sites to assess, generalize, and evaluate the outputs on the basis of the inputs for the new sites. Two metrics, namely the confusion matrix and receiver operating characteristic tests, were utilized to achieve high accuracy and validation. Results show that ANNs provide reasonable and efficient results as an accurate and inexpensive quantitative technique for GSS.
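
    A small sketch of the workflow named in the abstract, using synthetic site data: a multilayer feed-forward network trained by backpropagation, evaluated with a confusion matrix and ROC AUC. The features and their relationship to suitability are assumptions, not the study's landfill data.

```python
# Train an MLP on synthetic "site" features and check it with the two metrics
# named in the abstract.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(42)
n = 500
X = rng.normal(size=(n, 4))   # e.g. slope, distance to roads, land cover, groundwater depth
y = (X[:, 0] - 0.8*X[:, 1] + 0.5*X[:, 2] + rng.normal(0, 0.5, n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

print(confusion_matrix(y_te, net.predict(X_te)))
print("ROC AUC:", roc_auc_score(y_te, net.predict_proba(X_te)[:, 1]))
```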

  17. Optimal Decisions with Multiple Agents of Varying Performance

    OpenAIRE

    Silverman, Noah

    2013-01-01

    In this dissertation, I look at four distinct systems that all embody a similar challenge to modeling complex scenarios from noisy multidimensional historical data. In many scenarios, it is important to form an opinion, make a prediction, implement a business decision, or make an investment based upon expected future system behavior. All systems embody an amount of uncertainty, and quantifying that uncertainty using statistical methods allows for better decision making. Three distinct scen...

  18. Optimal Decisions with Multiple Agents of Varying Performance

    OpenAIRE

    Silverman, Noah

    2013-01-01

    In this dissertation, I look at four distinct systems that all embody a similar challenge to modeling complex scenarios from noisy multidimensional historical data. In many scenarios, it is important to form an opinion, make a prediction, implement a business decision, or make an investment based upon expected future system behavior. All systems embody an amount of uncertainty, and quantifying that uncertainty using statistical methods allows for better decision making. Three distinct scen...

  19. Research on group expandable optimization decision-making model for construction programme choice

    Institute of Scientific and Technical Information of China (English)

    Yan Hongyan

    2012-01-01

    Aiming at the decision-making problem of construction programme choice, this paper advances a new group expandable optimization decision-making model. Various factors were comprehensively considered, and a new multi-attribute evaluation index system was established. Based on the assumption that decision-makers are rational, the paper forms a group of rational preferences by extracting the personal preferences, so that the decision-makers' position matrix is determined. The position matrix integrates the values given by the group decision-makers to the construction programmes into a single matrix, and the entropy theory for data mining is adopted to determine the attribute weights. A decision is then made using the extension decision method. Finally, the feasibility and practicability of the model are verified by an example.

  20. Optimal Guaranteed Service Time and Service Level Decision with Time and Service Level Sensitive Demand

    Directory of Open Access Journals (Sweden)

    Sangjun Park

    2014-01-01

    Full Text Available We consider a two-stage supply chain with one supplier and one retailer. The retailer sells a product to customers and the supplier provides the product in a make-to-order mode. In this case, the supplier's decisions on service time and service level and the retailer's decision on retail price have effects on customer demand. We develop optimization models to determine the optimal retail price, the optimal guaranteed service time, the optimal service level, and the optimal capacity to maximize the expected profit of the whole supply chain. The results of numerical experiments show that it is more profitable to determine the optimal price, the optimal guaranteed service time, and the optimal service level simultaneously, and that the proposed model is more profitable in a service-level-sensitive market.

  1. Feature-based decision rules for control charts pattern recognition: A comparison between CART and QUEST algorithm

    Directory of Open Access Journals (Sweden)

    Shankar Chakraborty

    2012-01-01

    Full Text Available Control chart pattern (CCP) recognition can act as a problem identification tool in any manufacturing organization. Feature-based rules in the form of decision trees have become quite popular in recent years for CCP recognition, because practitioners can clearly understand how a particular pattern has been identified by the use of relevant shape features. Moreover, since the extracted features represent the main characteristics of the original data in a condensed form, they can also facilitate efficient pattern recognition. The reported feature-based decision trees can recognize eight types of CCPs using the extracted values of seven shape features. In this paper, a different set of seven useful features is presented that can recognize nine main CCPs, including the mixture pattern. Based on these features, decision trees are developed using the CART (classification and regression tree) and QUEST (quick unbiased efficient statistical tree) algorithms. The relative performance of the CART- and QUEST-based decision trees is extensively studied using simulated pattern data. The results show that the CART-based decision trees give better recognition performance but less consistency, whereas the QUEST-based decision trees give better consistency but lower recognition performance.

  2. A Probability Collectives Approach with a Feasibility-Based Rule for Constrained Optimization

    Directory of Open Access Journals (Sweden)

    Anand J. Kulkarni

    2011-01-01

    Full Text Available This paper demonstrates an attempt to incorporate a simple and generic constraint handling technique into the Probability Collectives (PC) approach for solving constrained optimization problems. The PC approach optimizes any complex system by decomposing it into smaller subsystems and further treats them in a distributed and decentralized way. These subsystems can be viewed as a Multi-Agent System with rational and self-interested agents optimizing their local goals. However, as there is no inherent constraint handling capability in the PC approach, a real challenge is to take constraints into account and at the same time make the agents work collectively, avoiding the tragedy of the commons, to optimize the global/system objective. At the core of the PC optimization methodology are the concepts of Deterministic Annealing in Statistical Physics, Game Theory and Nash Equilibrium. Moreover, a rule-based procedure is incorporated to handle solutions based on the number of constraints violated and drive the convergence towards feasibility. Two specially developed cases of the Circle Packing Problem with known solutions are solved and the true optimum results are obtained at reasonable computational cost. The proposed algorithm is shown to be sufficiently robust, and the strengths and weaknesses of the methodology are also discussed.

  3. Do the right thing: the assumption of optimality in lay decision theory and causal judgment.

    Science.gov (United States)

    Johnson, Samuel G B; Rips, Lance J

    2015-03-01

    Human decision-making is often characterized as irrational and suboptimal. Here we ask whether people nonetheless assume optimal choices from other decision-makers: Are people intuitive classical economists? In seven experiments, we show that an agent's perceived optimality in choice affects attributions of responsibility and causation for the outcomes of their actions. We use this paradigm to examine several issues in lay decision theory, including how responsibility judgments depend on the efficacy of the agent's actual and counterfactual choices (Experiments 1-3), individual differences in responsibility assignment strategies (Experiment 4), and how people conceptualize decisions involving trade-offs among multiple goals (Experiments 5-6). We also find similar results using everyday decision problems (Experiment 7). Taken together, these experiments show that attributions of responsibility depend not only on what decision-makers do, but also on the quality of the options they choose not to take.

  4. Visualising Pareto-optimal trade-offs helps move beyond monetary-only criteria for water management decisions

    Science.gov (United States)

    Hurford, Anthony; Harou, Julien

    2014-05-01

    Water-related ecosystem services are important to the livelihoods of the poorest sectors of society in developing countries. Degradation or loss of these services can increase the vulnerability of people, decreasing their capacity to support themselves. New approaches are needed to help guide water resources management decisions which account for the non-market value of ecosystem goods and services. In case studies from Brazil and Kenya we demonstrate the capability of many-objective Pareto-optimal trade-off analysis to help decision makers balance economic and non-market benefits from the management of existing multi-reservoir systems. A multi-criteria search algorithm is coupled to a water resources management simulator of each basin to generate a set of Pareto-approximate trade-offs representing the best-case management decisions. In both cases, volume-dependent reservoir release rules are the management decisions being optimised. In the Kenyan case we further assess the impacts of proposed irrigation investments, and how the possibility of new investments affects the system's trade-offs. During the multi-criteria search (optimisation), the performance of different sets of management decisions (policies) is assessed against case-specific objective functions representing provision of water supply and irrigation, hydropower generation and maintenance of ecosystem services. Results are visualised as trade-off surfaces to help decision makers understand the impacts of different policies on a broad range of stakeholders and to assist in decision-making. These case studies show how the approach can reveal unexpected opportunities for win-win solutions, and quantify the trade-offs between investing to increase agricultural revenue and negative impacts on protected ecosystems which support rural livelihoods.

  5. Real-time Container Transport Planning with Decision Trees based on Offline Obtained Optimal Solutions

    NARCIS (Netherlands)

    B. van Riessen (Bart); R.R. Negenborn (Rudy); R. Dekker (Rommert)

    2016-01-01

    Hinterland networks for container transportation require planning methods in order to increase efficiency and reliability of the inland road, rail and waterway connections. In this paper we aim to derive real-time decision rules for suitable allocations of containers to inland services b

  6. Optimization-based decision support systems for planning problems in processing industries

    NARCIS (Netherlands)

    Claassen, G.D.H.

    2014-01-01

    Summary Optimization-based decision support systems for planning problems in processing industries Nowadays, efficient planning of material flows within and between supply chains is of vital importance and has become one of the most challenging problems for decision support in practice. The

  7. Optimization-based decision support systems for planning problems in processing industries

    NARCIS (Netherlands)

    Claassen, G.D.H.

    2014-01-01

    Summary Optimization-based decision support systems for planning problems in processing industries Nowadays, efficient planning of material flows within and between supply chains is of vital importance and has become one of the most challenging problems for decision support in practice. The tremendo

  8. Optimization-based decision support systems for planning problems in processing industries

    NARCIS (Netherlands)

    Claassen, G.D.H.

    2014-01-01

    Summary Optimization-based decision support systems for planning problems in processing industries Nowadays, efficient planning of material flows within and between supply chains is of vital importance and has become one of the most challenging problems for decision support in practice. The tremendo

  9. 19 CFR 177.12 - Modification or revocation of interpretive rulings, protest review decisions, and previous...

    Science.gov (United States)

    2010-04-01

    ... rulings under the North American Free Trade Agreement. (e) Effective date and application to transactions... Schedule of the United States; (ii) Promulgation of a treaty or other international agreement under...

  10. Optimal Planning of Sustainable Buildings: Integration of Life Cycle Assessment and Optimization in a Decision Support System (DSS)

    Directory of Open Access Journals (Sweden)

    Fabio Magrassi

    2016-06-01

    Full Text Available Energy efficiency measures in buildings can provide a significant reduction of greenhouse gas (GHG) emissions. A sustainable design and planning of technologies for energy production should be based on economic and environmental criteria. Life Cycle Assessment (LCA) is used to quantify the environmental impacts over the whole life cycle of production plants. Optimization models can support decisions that minimize costs and negative impacts. In this work, a multi-objective decision problem is formalized that takes into account LCA calculations and minimizes costs and GHG emissions for general buildings. A decision support system (DSS) is applied to a real case study in Northern Italy, highlighting the advantage provided by the installation of renewable energy. Moreover, a comparison among different optimal and non-optimal solutions was carried out to demonstrate the effectiveness of the proposed DSS.

  11. Construction of a clinical decision support system for undergoing surgery based on domain ontology and rules reasoning.

    Science.gov (United States)

    Bau, Cho-Tsan; Chen, Rung-Ching; Huang, Chung-Yi

    2014-05-01

    To construct a clinical decision support system (CDSS) for undergoing surgery based on domain ontology and rules reasoning in the setting of hospitalized diabetic patients. The ontology was created with a modified ontology development method, including specification and conceptualization, formalization, implementation, and evaluation and maintenance. The Protégé-Web Ontology Language editor was used to implement the ontology. Embedded clinical knowledge was elicited to complement the domain ontology with formal concept analysis. The decision rules were translated into JENA format, which JENA can use to infer recommendations based on patient clinical situations. The ontology includes 31 classes and 13 properties, plus 38 JENA rules that were built to generate recommendations. The evaluation studies confirmed the correctness of the ontology, acceptance of recommendations, satisfaction with the system, and usefulness of the ontology for glycemic management of diabetic patients undergoing surgery, especially for domain experts. The contribution of this research is to set up an evidence-based hybrid ontology and an evaluation method for CDSS. The system can help clinicians to achieve inpatient glycemic control in diabetic patients undergoing surgery while avoiding hypoglycemia.

  12. A clinical decision rule for triage of children under 5 years of age with hydrocarbon (kerosene) aspiration in developing countries.

    Science.gov (United States)

    Bond, G R; Pièche, S; Sonicki, Z; Gamaluddin, H; El Guindi, M; Sakr, M; El Seddawy, A; Abouzaid, M; Youssef, A

    2008-03-01

    Unintended hydrocarbon ingestion is a common reason for pediatric hospitalization in the developing world. The aim was to derive a clinical decision rule to identify patients likely to require a higher-level facility (resource-requiring cases) that can be used at primary health care facilities with limited diagnostic and therapeutic resources. A prospective study of children 2 to 59 months old presenting to a poison treatment facility within 2 hours of oral hydrocarbon exposure was conducted. History and objective signs were recorded at admission and at 6, 12, 24 and, if present, 48 hours. Inclusion in the resource-requiring outcome group required, among other criteria, a reduced oxygen saturation. The presence of wheezing, altered consciousness, or a rapid respiratory rate (>= 50/min if age < 12 mo; >= 40/min if age >= 12 mo) at presentation identified 167 of 170 of these patients (sensitivity 0.98). Thirty-six of 86 patients classified as non-resource-requiring were correctly identified (specificity 0.42). No combination of clinical symptoms provided better discrimination while preserving sensitivity. This study suggests a triage decision rule based on the presence of wheezing, altered consciousness, or a rapid respiratory rate within 2 hours of hydrocarbon exposure. Such a rule requires validation in other settings.
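
    As an illustration of how such a rule could be applied at the point of triage, the sketch below encodes only the signs and the age-dependent respiratory-rate cut-offs stated in this record; the function and argument names are hypothetical, and the study's oxygen-saturation criterion is deliberately omitted because its threshold is not given here.

        # Minimal sketch of the suggested triage rule (hypothetical function and
        # argument names; only the criteria quoted in the abstract are encoded).
        def needs_higher_level_facility(age_months, wheezing, altered_consciousness,
                                        respiratory_rate):
            """Return True if the child should be referred to a higher-level facility."""
            rapid_rr = (respiratory_rate >= 50 if age_months < 12
                        else respiratory_rate >= 40)
            return wheezing or altered_consciousness or rapid_rr

        # Example: a 10-month-old with a respiratory rate of 52/min would be referred.
        print(needs_higher_level_facility(10, False, False, 52))  # True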

  13. Multi-attribute decision making model based on optimal membership and relative entropy

    Institute of Scientific and Technical Information of China (English)

    Rao Congjun; Zhao Yong

    2009-01-01

    To study the problems of multi-attribute decision making in which the attribute values are given in the form of linguistic fuzzy numbers and the information about attribute weights is incomplete, a new multi-attribute decision making model is presented based on the optimal membership and the relative entropy. Firstly, the definitions of the optimal membership and the relative entropy are given. Secondly, for all alternatives, a set of preference weight vectors is obtained by solving a set of linear programming models whose goals are all to maximize the optimal membership. Thirdly, a relative entropy model is established to aggregate the preference weight vectors, and thus an optimal weight vector is determined. Based on this optimal weight vector, the algorithm of deviation degree minimization is proposed to rank all the alternatives. Finally, a decision making example is given to demonstrate the feasibility and rationality of this new model.
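
    The aggregation step can be illustrated with a small sketch: if the total relative entropy from the aggregated vector to the individual preference weight vectors is minimized over the weight simplex, the minimizer is the normalized geometric mean of those vectors. This is a generic illustration under that assumption, not the authors' full model (the linear programming stage and the deviation-degree ranking are omitted), and the function name and data are hypothetical.

        import numpy as np

        def aggregate_weights(weight_vectors):
            """Aggregate preference weight vectors w_1..w_m by minimizing
            sum_j KL(w || w_j) over the weight simplex; the closed-form
            minimizer is the normalized geometric mean of the inputs."""
            W = np.asarray(weight_vectors, dtype=float)   # shape (m, n), rows sum to 1
            w = np.exp(np.mean(np.log(W), axis=0))        # geometric mean per attribute
            return w / w.sum()

        # Example: three preference weight vectors over four attributes.
        print(aggregate_weights([[0.40, 0.30, 0.20, 0.10],
                                 [0.35, 0.30, 0.25, 0.10],
                                 [0.45, 0.25, 0.20, 0.10]]))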

  14. Water Expert: a conceptualized framework for development of a rule-based decision support system for distribution system decontamination

    Directory of Open Access Journals (Sweden)

    J. L. Gutenson

    2014-05-01

    Full Text Available Significant drinking water contamination events pose a serious threat to public and environmental health. Water utilities often must make timely, critical decisions without evaluating all facets of the incident, as the data needed to enact informed decisions are inevitably dispersed and disparate, originating from policy, science, and heuristic contributors. Water Expert is a functioning hybrid decision support system (DSS) and expert system framework, with an emphasis on meshing parallel data structures to expedite and optimize the decision pathway. Delivered as a thin-client application through the user's web browser, Water Expert's extensive knowledgebase is a product of inter-university collaboration that methodically pieced together system decontamination procedures through consultation with subject matter experts, literature review, and prototyping with stakeholders. This paper discusses the development of Water Expert, analyzing the development process underlying the DSS and the system's existing architecture specifications.

  15. Risk Acceptance Criteria and/or Decision optimization

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1996-01-01

    a decision theoretical point of view. It is argued that the phenomenon of risk aversion rather than being of concern to the authority should be of concern to the owner. Finally it is discussed whether there is an ethical problem when formally capitalising human lives with a positive interest rate. Keywords...

  16. Understanding Optimal Military Decision Making: Year 2 Progress Report

    Science.gov (United States)

    2014-01-01

    learning and cognitive flexibility, while their eye gaze and brain activity were monitored via eye-tracking and electroencephalography (EEG) technology...performance occurred. Preliminary results of eye tracking provided insight into which pieces of information the subjects used in making their decisions

  17. Rules for Rational Decision Making: An Experiment with 15- and 16-Year Old Students

    Science.gov (United States)

    Guerra, Ana Teresa Antequera; Febles, Maria Candelaria Espinel

    2012-01-01

    Multicriteria analysis constitutes a way to model decision processes, allowing the decision maker to assess the possible implications each course of action may entail. A multicriteria problem is chosen from the Programme for International Student Assessment 2003 Report and then extended to include questions involving a choice of preferences and…

  18. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    Science.gov (United States)

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system based on intuitionistic fuzzy sets for a process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm inputs the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers appropriate rules of our intuitionistic fuzzy inference engine and finally calculates the dynamic priority (dp) of all the processes in the ready queue. Once the dp of every process is calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
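
    A toy sketch of this flow is given below: fuzzify the nice value and burst time, fire a small rule base, defuzzify into a dynamic priority, and dispatch the process with the highest priority. The membership functions, rule weights and example data are invented placeholders, not the paper's intuitionistic fuzzy sets or rule base.

        def tri(x, a, b, c):
            """Triangular membership function."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def dynamic_priority(nice, burst):
            """Placeholder inference: a low nice value and a short burst raise the priority."""
            low_nice, high_nice = tri(nice, -21, -20, 10), tri(nice, -10, 19, 20)
            short_burst, long_burst = tri(burst, -1, 0, 50), tri(burst, 0, 100, 101)
            rules = [                                   # (firing strength, rule output)
                (min(low_nice, short_burst), 0.9),
                (min(low_nice, long_burst), 0.6),
                (min(high_nice, short_burst), 0.5),
                (min(high_nice, long_burst), 0.2),
            ]
            den = sum(w for w, _ in rules)
            return sum(w * out for w, out in rules) / den if den else 0.0

        ready_queue = [("p1", 5, 80), ("p2", -10, 20), ("p3", 0, 40)]  # (name, nice, burst ms)
        ready_queue.sort(key=lambda p: dynamic_priority(p[1], p[2]), reverse=True)
        print(ready_queue[0][0])  # process dispatched to the CPU first ("p2" here)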

  19. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model

    Science.gov (United States)

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to the misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing the negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on the past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision using the stochastic decision tree scenario can maximize the revenue among the participants available in the cloud service negotiation framework. PMID:26543899

  20. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model.

    Science.gov (United States)

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to the misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing the negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on the past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision using the stochastic decision tree scenario can maximize the revenue among the participants available in the cloud service negotiation framework.

  1. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model

    Directory of Open Access Journals (Sweden)

    Rajkumar Rajavel

    2015-01-01

    Full Text Available Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to the misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing the negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on the past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision using the stochastic decision tree scenario can maximize the revenue among the participants available in the cloud service negotiation framework.

  2. Collective Decision Dynamics in Group Evacuation: Modeling Tradeoffs and Optimal Behavior

    CERN Document Server

    Schlesinger, Kimberly J; Ali, Imtiaz; Carlson, Jean M

    2016-01-01

    Quantifying uncertainties in collective human behavior and decision making is crucial for ensuring public health and safety, enabling effective disaster response, informing the design of transportation and communication networks, and guiding the development of new technologies. However, modeling and predicting such behavior is notoriously difficult, due to the influence of a variety of complex factors such as the availability and uncertainty of information, the interaction and influence of social groups and networks, the degree of risk or time pressure involved in a situation, and differences in individual personalities and preferences. Here, we develop a stochastic model of human decision making to describe the empirical behavior of subjects in a controlled experiment simulating a natural disaster scenario. We compare the observed behavior to that of statistically optimal Bayesian decision makers, quantifying the extent to which human decisions are optimal and identifying the conditions in which sub-optimal ...

  3. Optimization-based Analysis and Training of Human Decision Making

    OpenAIRE

    Engelhart, Michael

    2015-01-01

    In the research domain Complex Problem Solving (CPS) in psychology, computer-supported tests are used to analyze complex human decision making and problem solving. The approach is to use computer-based microworlds and to evaluate the performance of participants in such test-scenarios and correlate it to certain characteristics. However, these test-scenarios have usually been defined on a trial-and-error basis, until certain characteristics became apparent. The more complex models ...

  4. Understanding Optimal Decision-making in War-gaming II

    Science.gov (United States)

    2015-02-01

    Report excerpts: assumptions (results of experimentation with the available subject pool will be sufficient to provide insight into the study issues); accomplished goals for Study 1 (34 officers completed the Map Task, the Convoy Task, covariate measures, and synchronization of decision and EEG data); large amounts of individual variability were observed in the mean total damage score (all participants); preliminary EEG results.

  5. The Application of Time-Delay Dependent H∞ Control Model in Manufacturing Decision Optimization

    Directory of Open Access Journals (Sweden)

    Haifeng Guo

    2015-01-01

    Full Text Available This paper uses a time-delay dependent H∞ control model to analyze the effect of manufacturing decisions on the process of transmission from resources to capability. We establish a theoretical framework of the manufacturing management process based on three terms: resource, manufacturing decision, and capability. Then we build a time-delay H∞ robust control model to analyze the robustness of manufacturing management. With the state feedback controller between manufacturing resources and decisions, we find that there is an optimal decision to adjust the process of transmission from resources to capability under an uncertain environment. Finally, we provide an example to prove the robustness of this model.

  6. Sequential decision analysis for nonstationary stochastic processes

    Science.gov (United States)

    Schaefer, B.

    1974-01-01

    A formulation of the problem of making decisions concerning the state of nonstationary stochastic processes is given. An optimal decision rule, for the case in which the stochastic process is independent of the decisions made, is derived. It is shown that this rule is a generalization of the Bayesian likelihood ratio test; and an analog to Wald's sequential likelihood ratio test is given, in which the optimal thresholds may vary with time.
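
    For reference, a standard statement of the sequential likelihood ratio test that this rule generalizes is sketched below in LaTeX, with alpha and beta the target error probabilities of the two kinds; the time-varying thresholds A_n and B_n are the feature the record highlights, while the constant-threshold approximations are the classical Wald choices rather than anything derived in this report.

        \Lambda_n = \prod_{k=1}^{n} \frac{p_1(x_k)}{p_0(x_k)}, \qquad
        \text{decide } H_1 \text{ if } \Lambda_n \ge B_n, \quad
        \text{decide } H_0 \text{ if } \Lambda_n \le A_n, \quad
        \text{otherwise keep sampling.}

        \text{Wald's stationary case: } A_n \equiv \frac{\beta}{1-\alpha}, \qquad
        B_n \equiv \frac{1-\beta}{\alpha}.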

  7. Implementation of virtual medical record object model for a standards-based clinical decision support rule engine.

    Science.gov (United States)

    Huang, Christine; Noirot, Laura A; Heard, Kevin M; Reichley, Richard M; Dunagan, Wm Claiborne; Bailey, Thomas C

    2006-01-01

    The Virtual Medical Record (vMR) is a structured data model for representing individual patient information. Our implementation of the vMR is based on the HL7 Reference Information Model (RIM) v2.13, from which a minimum set of objects and attributes is selected to meet the requirements of a clinical decision support (CDS) rule engine. Our success in mapping local patient data to the vMR model and building a vMR adaptor middle layer demonstrates the feasibility and advantages of implementing a vMR in a portable CDS solution.

  8. A multiobjective optimization approach to obtain decision thresholds for distributed detection in wireless sensor networks.

    Science.gov (United States)

    Masazade, Engin; Rajagopalan, Ramesh; Varshney, Pramod K; Mohan, Chilukuri K; Sendur, Gullu Kiziltas; Keskinoz, Mehmet

    2010-04-01

    For distributed detection in a wireless sensor network, sensors arrive at decisions about a specific event that are then sent to a central fusion center that makes a global inference about the event. For such systems, the determination of the decision thresholds for local sensors is an essential task. In this paper, we study the distributed detection problem and evaluate the sensor thresholds by formulating and solving a multiobjective optimization problem, where the objectives are to minimize the probability of error and the total energy consumption of the network. The problem is investigated and solved for two types of fusion schemes: 1) parallel decision fusion and 2) serial decision fusion. The Pareto optimal solutions are obtained using two different multiobjective optimization techniques. The normal boundary intersection (NBI) method converts the multiobjective problem into a number of single-objective constrained subproblems, where each subproblem can be solved with appropriate optimization methods, and the nondominated sorting genetic algorithm-II (NSGA-II) is a multiobjective evolutionary algorithm. In our simulations, NBI yielded better and more evenly distributed Pareto optimal solutions in a shorter time as compared with NSGA-II. The simulation results show that, instead of only minimizing the probability of error, multiobjective optimization provides a number of design alternatives, which achieve significant energy savings at the cost of slightly increasing the best achievable decision error probability. The simulation results also show that the parallel fusion model achieves better error probability, but the serial fusion model is more efficient in terms of energy consumption.
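
    The notion of a set of nondominated design alternatives can be illustrated with a toy sketch: a scalar sensor threshold trades a stand-in error probability against a stand-in energy cost, and the nondominated points are collected by pairwise comparison. The two objective functions are invented placeholders, not the detection or energy models of the paper, and neither NBI nor NSGA-II is implemented here.

        import numpy as np

        # Toy bi-objective trade-off over a scalar sensor threshold t:
        # f1 stands in for the probability of error, f2 for the energy consumption.
        t = np.linspace(0.1, 3.0, 60)
        f1 = np.exp(-t) + 0.05          # placeholder error probability (decreasing in t)
        f2 = 0.2 + 0.5 * t              # placeholder energy cost (increasing in t)

        def pareto_indices(f1, f2):
            """Indices of nondominated points when both objectives are minimized."""
            keep = []
            for i in range(len(f1)):
                dominated = any(f1[j] <= f1[i] and f2[j] <= f2[i] and
                                (f1[j] < f1[i] or f2[j] < f2[i]) for j in range(len(f1)))
                if not dominated:
                    keep.append(i)
            return keep

        front = pareto_indices(f1, f2)
        # All 60 points survive here, because the two placeholder objectives are
        # strictly conflicting over the whole threshold range.
        print(len(front), "nondominated design alternatives")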

  9. Detection of Stator Winding Fault in Induction Motor Using Fuzzy Logic with Optimal Rules

    Directory of Open Access Journals (Sweden)

    Hamid Fekri Azgomi

    2013-04-01

    Full Text Available Induction motors are critical components in many industrial processes. Therefore, swift, precise and reliable monitoring and fault detection systems are required to prevent any further damage. The online monitoring of induction motors has been becoming increasingly important. The main difficulty in this task is the lack of an accurate analytical model to describe a faulty motor. A fuzzy logic approach may help to diagnose traction motor faults. This paper presents a simple method for the detection of stator winding faults (which make up 38% of induction motor failures) based on monitoring the line/terminal current amplitudes. In this method, fuzzy logic is used to make decisions about the stator motor condition. In fact, fuzzy logic is reminiscent of human thinking processes and natural language, enabling decisions to be made based on vague information. The motor condition is described using linguistic variables. Fuzzy subsets and the corresponding membership functions describe stator current amplitudes. A knowledge base, comprising rule and data bases, is built to support the fuzzy inference. Simulation results are presented to verify the accuracy of the motor's fault detection and the feasibility of knowledge extraction. The preliminary results show that the proposed fuzzy approach can be used for accurate stator fault diagnosis.
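
    A minimal sketch of this kind of rule-based decision on current amplitudes is given below; the membership shapes, the imbalance feature and the linguistic labels are placeholders chosen for illustration, not the membership functions or rule base of the paper.

        def tri(x, a, b, c):
            """Triangular membership function."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def stator_condition(i_a, i_b, i_c):
            """Placeholder fuzzy rules on per-unit phase-current amplitudes:
            a healthy stator draws balanced currents; growing imbalance is
            treated as evidence of a winding fault."""
            imbalance = max(i_a, i_b, i_c) - min(i_a, i_b, i_c)
            small = tri(imbalance, -0.01, 0.0, 0.10)
            medium = tri(imbalance, 0.05, 0.15, 0.25)
            large = tri(imbalance, 0.20, 0.40, 10.0)
            conditions = [("healthy", small), ("damaged", medium), ("seriously damaged", large)]
            return max(conditions, key=lambda r: r[1])   # strongest-firing linguistic label

        print(stator_condition(1.00, 0.98, 1.01))  # ('healthy', ~0.7)
        print(stator_condition(1.00, 0.80, 1.05))  # ('seriously damaged', ~0.25)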

  10. Graph-related optimization and decision support systems

    CERN Document Server

    Krichen, Saoussen

    2014-01-01

    Constrained optimization is a challenging branch of operations research that aims to create a model which has a wide range of applications in the supply chain, telecommunications and medical fields. As the problem structure is split into two main components, the objective is to be optimized over the feasible set framed by the system constraints. The aim of this book is to expose optimization problems that can be expressed as graphs, by detailing, for each studied problem, the set of nodes and the set of edges. This graph modeling is an incentive for designing a platform that integrates all optimizatio

  11. Text of High Court's Ruling on Judges' Right to Upset Academic Decisions.

    Science.gov (United States)

    Stevens, John Paul

    1985-01-01

    The Supreme Court's opinion and concurring opinion in a case limiting the right of courts to overturn academic decisions, based on a university's dismissal of a student after his failure of an important examination, are presented. (MSE)

  12. Review of experimental studies in social psychology of small groups when an optimal choice exists and application to operating room management decision-making.

    Science.gov (United States)

    Prahl, Andrew; Dexter, Franklin; Braun, Michael T; Van Swol, Lyn

    2013-11-01

    Because operating room (OR) management decisions with optimal choices are made with ubiquitous biases, decisions are improved with decision-support systems. We reviewed experimental social-psychology studies to explore what an OR leader can do when working with stakeholders lacking interest in learning the OR management science but expressing opinions about decisions, nonetheless. We considered shared information to include the rules-of-thumb (heuristics) that make intuitive sense and often seem "close enough" (e.g., staffing is planned based on the average workload). We considered unshared information to include the relevant mathematics (e.g., staffing calculations). Multiple studies have shown that group discussions focus more on shared than unshared information. Quality decisions are more likely when all group participants share knowledge (e.g., have taken a course in OR management science). Several biases in OR management are caused by humans' limited abilities to estimate tails of probability distributions in their heads. Groups are more susceptible to analogous biases than are educated individuals. Since optimal solutions are not demonstrable without groups sharing common language, only with education of most group members can a knowledgeable individual influence the group. The appropriate model of decision-making is autocratic, with information obtained from stakeholders. Although such decisions are good quality, the leaders often are disliked and the decisions considered unjust. In conclusion, leaders will find the most success if they do not bring OR management operational decisions to groups, but instead act autocratically while obtaining necessary information in 1:1 conversations. The only known route for the leader making such decisions to be considered likable and for the decisions to be considered fair is through colleagues and subordinates learning the management science.

  13. Logical-Rule Models of Classification Response Times: A Synthesis of Mental-Architecture, Random-Walk, and Decision-Bound Approaches

    Science.gov (United States)

    Fific, Mario; Little, Daniel R.; Nosofsky, Robert M.

    2010-01-01

    We formalize and provide tests of a set of logical-rule models for predicting perceptual classification response times (RTs) and choice probabilities. The models are developed by synthesizing mental-architecture, random-walk, and decision-bound approaches. According to the models, people make independent decisions about the locations of stimuli…

  14. Accounting standards and earnings management : The role of rules-based and principles-based accounting standards and incentives on accounting and transaction decisions

    NARCIS (Netherlands)

    Beest, van F.

    2012-01-01

    This book examines the effect that rules-based and principles-based accounting standards have on the level and nature of earnings management decisions. A cherry picking experiment is conducted to test the hypothesis that a substitution effect is expected from accounting decisions to transaction decisions

  15. Accounting standards and earnings management : The role of rules-based and principles-based accounting standards and incentives on accounting and transaction decisions

    NARCIS (Netherlands)

    Beest, van F.

    2012-01-01

    This book examines the effect that rules-based and principles-based accounting standards have on the level and nature of earnings management decisions. A cherry picking experiment is conducted to test the hypothesis that a substitution effect is expected from accounting decisions to transaction decisions

  16. Optimization and decision support systems for supply chains

    CERN Document Server

    Corominas, Albert; Miranda, João

    2017-01-01

    This contributed volume presents a collection of materials on supply chain management, including industry-based case studies addressing petrochemical, pharmaceutical, manufacturing and reverse logistics topics. Moreover, the book covers sustainability issues as well as optimization approaches. The target audience comprises academics, industry managers, and practitioners in the field of supply chain management; the book is also beneficial for graduate students.

  17. Nervous Network Model for Optimized Reservoir Operation Rule

    Institute of Scientific and Technical Information of China (English)

    陈建康; 马光文

    2001-01-01

    By taking advantage of the nonlinear decision-making features of artificial neural networks, an artificial neural network model for optimized reservoir (power plant) operation rules is provided. The optimized reservoir operation is simulated for the Baozhusi hydropower station in Sichuan. The analysis results prove the rationality, feasibility and practicality of the model.

  18. Some new step-size rules for optimization problems

    Institute of Scientific and Technical Information of China (English)

    吴庆军; 韦增欣

    2007-01-01

    The step-size procedure is very important for solving optimization problems. The Armijo step-size rule, the Armijo-Goldstein step-size rule and the Wolfe-Powell step-size rule are three well-known line search methods. On the basis of these three types of line search methods and the idea of proximal point methods, a new class of step-size rules was proposed. Instead of the single objective function f, the merit function f(x) + (1/2)(x - x_k)^T B_k (x - x_k) was used in iteration k, where B_k is a given symmetric positive definite matrix. The existence of the step length for the new rules was proved. Some convergence properties were also discussed.
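
    A sketch of an Armijo-type backtracking rule applied to this proximally modified merit function is given below; it is a generic illustration of the idea rather than the authors' exact rule, and the test problem, parameter values and function names are assumptions.

        import numpy as np

        def armijo_proximal_step(f, grad_f, x_k, d, B_k, sigma=1e-4, beta=0.5, t0=1.0):
            """Backtracking Armijo rule on phi(x) = f(x) + 0.5*(x - x_k)^T B_k (x - x_k)
            along a descent direction d from x_k. Note that grad phi(x_k) = grad f(x_k),
            because the quadratic term vanishes at x_k."""
            phi = lambda x: f(x) + 0.5 * (x - x_k) @ B_k @ (x - x_k)
            slope = grad_f(x_k) @ d
            t = t0
            while phi(x_k + t * d) > phi(x_k) + sigma * t * slope:
                t *= beta
            return t

        # Example on f(x) = ||x||^2 with the steepest descent direction.
        f = lambda x: x @ x
        grad_f = lambda x: 2 * x
        x_k = np.array([1.0, -2.0])
        t = armijo_proximal_step(f, grad_f, x_k, -grad_f(x_k), np.eye(2))
        print(t)  # 0.5 for these data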

  19. Economic irrationality is optimal during noisy decision making.

    Science.gov (United States)

    Tsetsos, Konstantinos; Moran, Rani; Moreland, James; Chater, Nick; Usher, Marius; Summerfield, Christopher

    2016-03-15

    According to normative theories, reward-maximizing agents should have consistent preferences. Thus, when faced with alternatives A, B, and C, an individual preferring A to B and B to C should prefer A to C. However, it has been widely argued that humans can incur losses by violating this axiom of transitivity, despite strong evolutionary pressure for reward-maximizing choices. Here, adopting a biologically plausible computational framework, we show that intransitive (and thus economically irrational) choices paradoxically improve accuracy (and subsequent economic rewards) when decision formation is corrupted by internal neural noise. Over three experiments, we show that humans accumulate evidence over time using a "selective integration" policy that discards information about alternatives with momentarily lower value. This policy predicts violations of the axiom of transitivity when three equally valued alternatives differ circularly in their number of winning samples. We confirm this prediction in a fourth experiment reporting significant violations of weak stochastic transitivity in human observers. Crucially, we show that relying on selective integration protects choices against "late" noise that otherwise corrupts decision formation beyond the sensory stage. Indeed, we report that individuals with higher late noise relied more strongly on selective integration. These findings suggest that violations of rational choice theory reflect adaptive computations that have evolved in response to irreducible noise during neural information processing.

  20. ERDOS 1.0. Emergency response decisions as problems of optimal stopping

    Energy Technology Data Exchange (ETDEWEB)

    Pauwels, N

    1998-11-01

    The ERDOS software is a stochastic dynamic program to support the decision problem of preventively evacuating the workers of an industrial company threatened by a nuclear accident that may take place in the near future with a particular probability. ERDOS treats this problem as one of optimal stopping: the governmental decision maker initially holds a call option enabling him to postpone the evacuation decision and observe the further evolution of the alarm situation. As such, he has to decide on the optimal point in time to exercise this option, i.e. to take the irreversible decision to evacuate the threatened workers. ERDOS makes it possible to calculate the expected costs of an optimal intervention strategy and to compare this outcome with the costs resulting from a myopic evacuation decision that ignores the prospect of more complete information at later stages of the decision process. Furthermore, ERDOS determines the free boundary, giving the critical severity as a function of time that triggers immediate evacuation once it is exceeded. Finally, the software provides useful insights into the financial implications of losing time during the initial stages of the decision process (due to the gathering of information, discussions on the intervention strategy and so on)
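
    The optimal-stopping structure can be sketched with a generic backward recursion: at each stage the decision maker either pays the irreversible evacuation cost now or waits one period, risking that the release occurs before the next decision point. All costs, the severity grid and the transition model below are illustrative placeholders, not the ERDOS model.

        import numpy as np

        severity = np.linspace(0.0, 1.0, 21)    # discretized alarm severity
        T = 10                                  # remaining decision stages
        evac_cost, damage_cost = 1.0, 20.0      # placeholder costs

        def wait_value(V_next):
            """Expected cost of postponing the decision by one period."""
            p_release = 0.1 * severity                       # chance the accident happens now
            up = np.interp(np.clip(severity + 0.05, 0, 1), severity, V_next)
            dn = np.interp(np.clip(severity - 0.05, 0, 1), severity, V_next)
            return p_release * damage_cost + (1 - p_release) * 0.5 * (up + dn)

        V = np.minimum(evac_cost, 0.1 * severity * damage_cost)   # last-stage values
        for _ in range(T):
            Q_wait = wait_value(V)
            V = np.minimum(evac_cost, Q_wait)                     # evacuate now vs. wait

        # Free boundary: the lowest severity at which immediate evacuation is optimal now.
        stop_now = evac_cost <= Q_wait
        print("evacuate immediately once severity >=",
              severity[np.argmax(stop_now)] if stop_now.any() else "never")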

  1. Dispositional Optimism as a Correlate of Decision-Making Styles in Adolescence

    Directory of Open Access Journals (Sweden)

    Paola Magnano

    2015-06-01

    Full Text Available Despite the numerous psychological areas in which optimism has been studied, including career planning, only a small amount of research has investigated the relationship between optimism and decision-making styles. Consequently, we investigated the role of dispositional optimism as a correlate of different decision-making styles, expected to be positive for effective styles and negative for ineffective ones (doubtfulness, procrastination, and delegation). Data were gathered through questionnaires administered to 803 Italian adolescents in their last 2 years of high school, with different fields of study, each at the beginning stages of planning for their professional future. A paper questionnaire containing measures of dispositional optimism and career-related decision styles was completed during a vocational guidance intervention conducted at school. Data were analyzed using stepwise multiple regression. Results supported the proposed model by showing optimism to be a strong correlate of decision-making styles, thereby offering important intervention guidelines aimed at modifying unrealistically negative expectations regarding students' futures and helping them learn adaptive decision-making skills.

  2. A Markov decision model for determining optimal outpatient scheduling.

    Science.gov (United States)

    Patrick, Jonathan

    2012-06-01

    Managing an efficient outpatient clinic can often be complicated by significant no-show rates and escalating appointment lead times. One method that has been proposed for avoiding the wasted capacity due to no-shows is called open or advanced access. The essence of open access is "do today's demand today". We develop a Markov Decision Process (MDP) model that demonstrates that a short booking window does significantly better than open access. We analyze a number of scenarios that explore the trade-off between patient-related measures (lead times) and physician- or system-related measures (revenue, overtime and idle time). Through simulation, we demonstrate that, over a wide variety of potential scenarios and clinics, the MDP policy does as well or better than open access in terms of minimizing costs (or maximizing profits) as well as providing more consistent throughput.

  3. An Optimal Decision Procedure for MPNL over the Integers

    Directory of Open Access Journals (Sweden)

    Davide Bresolin

    2011-06-01

    Full Text Available Interval temporal logics provide a natural framework for qualitative and quantitative temporal reasoning over interval structures, where the truth of formulae is defined over intervals rather than points. In this paper, we study the complexity of the satisfiability problem for Metric Propositional Neighborhood Logic (MPNL). MPNL features two modalities to access intervals "to the left" and "to the right" of the current one, respectively, plus an infinite set of length constraints. MPNL, interpreted over the naturals, has been recently shown to be decidable by a doubly exponential procedure. We improve such a result by proving that MPNL is actually EXPSPACE-complete (even when length constraints are encoded in binary) when interpreted over finite structures, the naturals, and the integers, by developing an EXPSPACE decision procedure for MPNL over the integers, which can be easily tailored to finite linear orders and the naturals (EXPSPACE-hardness was already known).

  4. Image Theory's counting rule in clinical decision making: Does it describe how clinicians make patient-specific forecasts?

    Directory of Open Access Journals (Sweden)

    Melissa Garman

    2012-05-01

    Full Text Available The field of clinical decision making is polarized by two predominant views. One holds that treatment recommendations should conform with guidelines; the other emphasizes clinical expertise in reaching case-specific judgments. Previous work developed a test for a proposed alternative: that clinical judgment should systematically incorporate both general knowledge and patient-specific information. The test was derived from image theory's two-phase account of decision making and its "simple counting rule", which describes how possible courses of action are pre-screened for compatibility with standards and values. The current paper applies this rule to clinical forecasting, where practitioners indicate how likely a specific patient is to respond favorably to a recommended treatment. Psychiatric trainees evaluated eight case vignettes that exhibited from 0 to 3 incompatible attributes. They made two forecasts, one based on a guideline recommendation, the other based on their own alternative. Both forecasts were predicted by equally- and unequally-weighted counting rules. Unequal weighting provided a better fit and exhibited a clearer rejection threshold, or point at which forecasts are not diminished by additional incompatibilities. The hypothesis that missing information is treated as an incompatibility was not confirmed. There was evidence that the rejection threshold was influenced by clinician preference. Results suggest that guidelines may have a de-biasing influence on clinical judgment. Subject to limitations pertaining to the subject sample and population, clinical paradigm, guideline, and study procedure, the data support the use of a compatibility test to describe how clinicians make patient-specific forecasts.

  5. Hybrid Medical Image Classification Using Association Rule Mining with Decision Tree Algorithm

    OpenAIRE

    Rajendran, P.; M.Madheswaran

    2010-01-01

    The main focus of image mining in the proposed method is the classification of brain tumors in CT scan brain images. The major steps involved in the system are: pre-processing, feature extraction, association rule mining and a hybrid classifier. The pre-processing step has been done using a median filtering process, and edge features have been extracted using the Canny edge detection technique. Two image mining approaches combined in a hybrid manner have been proposed in this paper....

  6. End-of-life decisions and the reinvented Rule of Double Effect: a critical analysis.

    Science.gov (United States)

    Lindblad, Anna; Lynöe, Niels; Juth, Niklas

    2014-09-01

    The Rule of Double Effect (RDE) holds that it may be permissible to harm an individual while acting for the sake of a proportionate good, given that the harm is not an intended means to the good but merely a foreseen side-effect. Although frequently used in medical ethical reasoning, the rule has been repeatedly questioned in the past few decades. However, Daniel Sulmasy, a proponent who has done a lot of work lately defending the RDE, has recently presented a reformulated and more detailed version of the rule. Thanks to its greater precision, this reinvented RDE avoids several problems thought to plague the traditional RDE. Although an improvement compared with the traditional version, we argue that Sulmasy's reinvented RDE will not stand closer scrutiny. Not only has the range of proper applicability narrowed significantly, but, more importantly, Sulmasy fails to establish that there is a morally relevant distinction between intended and foreseen effects. In particular, he fails to establish that there is any distinction that can account for the alleged moral difference between sedation therapy and euthanasia.

  7. Decision Bayes Criteria for Optimal Classifier Based on Probabilistic Measures

    Institute of Scientific and Technical Information of China (English)

    Wissal Drira; Faouzi Ghorbel

    2014-01-01

    This paper addresses the high-dimensional sample problem in discriminant analysis under nonparametric and supervised assumptions. Since there is a kind of equivalence between the probabilistic dependence measure and the Bayes classification error probability, we propose to use an iterative algorithm to optimize the dimension reduction for classification with a probabilistic approach to achieve the Bayes classifier. The estimated probabilities of the different errors encountered along the different phases of the system are obtained by kernel estimation, adjusted by means of the smoothing parameter. Experimental results suggest that the proposed approach performs well.

  8. Optimal decision making on the basis of evidence represented in spike trains.

    Science.gov (United States)

    Zhang, Jiaxiang; Bogacz, Rafal

    2010-05-01

    Experimental data indicate that perceptual decision making involves integration of sensory evidence in certain cortical areas. Theoretical studies have proposed that the computation in neural decision circuits approximates statistically optimal decision procedures (e.g., sequential probability ratio test) that maximize the reward rate in sequential choice tasks. However, these previous studies assumed that the sensory evidence was represented by continuous values from gaussian distributions with the same variance across alternatives. In this article, we make a more realistic assumption that sensory evidence is represented in spike trains described by the Poisson processes, which naturally satisfy the mean-variance relationship observed in sensory neurons. We show that for such a representation, the neural circuits involving cortical integrators and basal ganglia can approximate the optimal decision procedures for two and multiple alternative choice tasks.
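
    For intuition, the sketch below shows how evidence carried by Poisson spike counts from two oppositely tuned pools enters a likelihood-ratio computation of the kind such models use; the firing rates, bin size and random seed are placeholders, and the symmetric two-pool setup is an assumption made for brevity.

        import numpy as np

        # For Poisson counts with rates r1 (preferred) and r0 (null) per bin, each
        # bin adds (n1 - n2) * log(r1 / r0) to the log-likelihood ratio of
        # alternative 1 versus alternative 2; the rate (mean) terms cancel by symmetry.
        rng = np.random.default_rng(1)
        r1, r0, dt, n_bins = 40.0, 20.0, 0.01, 100     # Hz, Hz, s, number of bins
        pool1 = rng.poisson(r1 * dt, n_bins)           # pool tuned to the true alternative
        pool2 = rng.poisson(r0 * dt, n_bins)           # pool tuned to the other alternative

        llr = np.log(r1 / r0) * (pool1.sum() - pool2.sum())
        print(round(llr, 2), "-> choose alternative", 1 if llr > 0 else 2)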

  9. Optimization Algorithm for a Handoff Decision in Wireless Heterogeneous Networks

    Directory of Open Access Journals (Sweden)

    E.Arun

    2010-09-01

    Full Text Available Future wireless networks will consist of multiple heterogeneous access technologies such as UMTS, WLAN and Wi-Max. The main challenge in these networks is to provide users with a wide range of services across different radio access technologies through a single mobile terminal. With regard to vertical handoff performance, there is a need for developing algorithms for connection management and optimal resource allocation for seamless mobility. The Media Independent Handover (MIH) architecture is used for the special case of handoff optimization between heterogeneous networks. The signaling messages exchanged by triggers in 802.21 are obtained through Service Access Points (SAPs). These messages must be delivered in a timely and reliable manner. In this paper, a novel solution for communicating MIH messages is analyzed and its limitations are reviewed. An efficient solution using 3SE, a Sender-Side Stream control transport-layer protocol Extension that modifies the standard SCTP protocol, is introduced. 3SE aims at exploiting SCTP's multi-homing and multi-streaming capabilities and is optimized by using MIH services. The performance is analyzed for various packet loss conditions and bandwidth capacities.

  10. Robust Management of Combined Heat and Power Systems via Linear Decision Rules

    DEFF Research Database (Denmark)

    Zugno, Marco; Morales González, Juan Miguel; Madsen, Henrik

    2014-01-01

    The heat and power outputs of Combined Heat and Power (CHP) units are jointly constrained. Hence, the optimal management of systems including CHP units is a multicommodity optimization problem. Problems of this type are stochastic, owing to the uncertainty inherent both in the demand for heat and in the electricity prices that owners of CHP units receive for the power they sell in the market. In this work, we model the management problem for a coupled heat-and-power system comprising CHP plants, units solely producing heat, as well as heat storages. We propose a robust optimization model including unit commitment, day-ahead power and heat dispatch as well as real-time re-dispatch (recourse) variables. This model yields a solution that is feasible under any realization of the heat demand within a given uncertainty set. Optimal recourse functions for the real-time operation of the units are approximated via linear decision rules.
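
    The linear-decision-rule idea itself can be sketched as follows: the real-time re-dispatch is restricted to an affine function of the uncertain heat demand, and robust feasibility is checked on the vertices of a box uncertainty set (which suffices when the constraints are linear in the uncertainty). The two-period toy system, its limits and the chosen recourse gains are placeholders, not the paper's CHP model.

        import numpy as np
        from itertools import product

        d_nom = np.array([10.0, 12.0])   # nominal heat demand per period
        dev = np.array([2.0, 3.0])       # maximum demand deviation per period
        q0 = d_nom.copy()                # day-ahead heat schedule
        Q = np.eye(2)                    # recourse gains: re-dispatch follows the deviation 1:1
        q_max = 16.0                     # heat output limit

        # Affine (linear decision rule) recourse: q(d) = q0 + Q @ (d - d_nom).
        feasible = True
        for signs in product([-1.0, 1.0], repeat=2):          # vertices of the demand box
            d = d_nom + np.array(signs) * dev
            q = q0 + Q @ (d - d_nom)
            ok = np.all(q >= 0) and np.all(q <= q_max) and np.all(q >= d)
            feasible = feasible and bool(ok)
        print("affine recourse policy robustly feasible:", feasible)   # True for these data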

  11. Design rule for optimization of microelectrodes used in electric cell-substrate impedance sensing (ECIS).

    Science.gov (United States)

    Price, Dorielle T; Rahman, Abdur Rub Abdur; Bhansali, Shekhar

    2009-03-15

    This paper presents an experimentally derived design rule for the optimization of microelectrodes used in electric cell-substrate impedance sensing (ECIS) up to 10 MHz. The effect of changes in electrode design (through electrode sensor area, lead trace widths, and passivation coating thickness) on electrode characteristics was experimentally evaluated using electrochemical impedance spectroscopy (EIS) measurements and analyzed using equivalent circuit models. A parasitic passivation coating impedance was successfully minimized by designing electrodes with either a thicker passivation layer or a smaller lead trace area. It was observed that the ratio of passivated lead trace area to passivation coating thickness has a critical value of 5.5, below which the impedance contribution of the coating is minimized. The optimized design of ECIS-based microelectrode devices reported in this work will make it possible to probe the entire beta-dispersion region of adherent biological cell layers by reducing measurement artifacts and improving the quality of data across the beta-dispersion region. The new design will enable the use of the commonly used ECIS technique to measure real-time cellular properties in high frequency ranges (beta dispersion) that have not been accessible thus far.

  12. Delusions of success. How optimism undermines executives' decisions.

    Science.gov (United States)

    Lovallo, Dan; Kahneman, Daniel

    2003-07-01

    The evidence is disturbingly clear: Most major business initiatives--mergers and acquisitions, capital investments, market entries--fail to ever pay off. Economists would argue that the low success rate reflects a rational assessment of risk, with the returns from a few successes outweighing the losses of many failures. But two distinguished scholars of decision making, Dan Lovallo of the University of New South Wales and Nobel laureate Daniel Kahneman of Princeton University, provide a very different explanation. They show that a combination of cognitive biases (including anchoring and competitor neglect) and organizational pressures lead managers to make overly optimistic forecasts in analyzing proposals for major investments. By exaggerating the likely benefits of a project and ignoring the potential pitfalls, they lead their organizations into initiatives that are doomed to fall well short of expectations. The biases and pressures cannot be escaped, the authors argue, but they can be tempered by applying a very different method of forecasting--one that takes a much more objective "outside view" of an initiative's likely outcome. This outside view, also known as reference-class forecasting, completely ignores the details of the project at hand; instead, it encourages managers to examine the experiences of a class of similar projects, to lay out a rough distribution of outcomes for this reference class, and then to position the current project in that distribution. The outside view is more likely than the inside view to produce accurate forecasts--and much less likely to deliver highly unrealistic ones, the authors say.

  13. Integration of reinforcement learning and optimal decision-making theories of the basal ganglia.

    Science.gov (United States)

    Bogacz, Rafal; Larsen, Tobias

    2011-04-01

    This article seeks to integrate two sets of theories describing action selection in the basal ganglia: reinforcement learning theories describing learning which actions to select to maximize reward and decision-making theories proposing that the basal ganglia selects actions on the basis of sensory evidence accumulated in the cortex. In particular, we present a model that integrates the actor-critic model of reinforcement learning and a model assuming that the cortico-basal-ganglia circuit implements a statistically optimal decision-making procedure. The values of cortico-striatal weights required for optimal decision making in our model differ from those provided by standard reinforcement learning models. Nevertheless, we show that an actor-critic model converges to the weights required for optimal decision making when biologically realistic limits on synaptic weights are introduced. We also describe the model's predictions concerning reaction times and neural responses during learning, and we discuss directions required for further integration of reinforcement learning and optimal decision-making theories.

  14. Note---The Mean-Coefficient-of-Variation Rule: The Lognormal Case

    OpenAIRE

    Haim Levy

    1991-01-01

    The mean-variance (M-V) rule may lead to paradoxical results which may be resolved by employing the mean coefficient of variation (M-C) rule. It is shown that the M-C rule constitutes an optimal decision rule for lognormal distributions.
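
    For reference, the two statistics that the M-C rule compares are written out below for the lognormal case in LaTeX; these are the standard moment formulas rather than anything reproduced from the note itself.

        X \sim \mathrm{LN}(\mu, \sigma^2)\ (\text{i.e. } \ln X \sim N(\mu, \sigma^2))
        \quad\Longrightarrow\quad
        E[X] = e^{\mu + \sigma^2/2}, \qquad
        \mathrm{CV}[X] = \frac{\sqrt{\mathrm{Var}[X]}}{E[X]} = \sqrt{e^{\sigma^2} - 1},

        \text{so the coefficient of variation of a lognormal variable depends on } \sigma^2 \text{ alone.}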

  15. Certain and possible rules for decision making using rough set theory extended to fuzzy sets

    Science.gov (United States)

    Dekorvin, Andre; Shipley, Margaret F.

    1993-01-01

    Uncertainty may be caused by the ambiguity in the terms used to describe a specific situation. It may also be caused by skepticism of the rules used to describe a course of action or by missing and/or erroneous data. To deal with uncertainty, techniques other than classical logic need to be developed. Although statistics may be the best tool available for handling likelihood, it is not always adequate for dealing with knowledge acquisition under uncertainty. Inadequacies caused by estimating probabilities in statistical processes can be alleviated through use of the Dempster-Shafer theory of evidence. Fuzzy set theory is another tool used to deal with uncertainty where ambiguous terms are present. Other methods include rough sets, the theory of endorsements and nonmonotonic logic. J. Grzymala-Busse has defined the concept of lower and upper approximation of a (crisp) set and has used that concept to extract rules from a set of examples. We will define the fuzzy analogs of lower and upper approximations and use these to obtain certain and possible rules from a set of examples where the data is fuzzy. Central to these concepts will be the idea of the degree to which a fuzzy set A is contained in another fuzzy set B, and the degree of intersection of a set A with set B. These concepts will also give meaning to the statement: A implies B. The two meanings will be: (1) if x is certainly in A then it is certainly in B, and (2) if x is possibly in A then it is possibly in B. Next, classification will be looked at and it will be shown that if a classification is well externally definable then it is well internally definable, and if it is poorly externally definable then it is poorly internally definable, thus generalizing a result of Grzymala-Busse. Finally, some ideas of how to define consensus and group options to form clusters of rules will be given.
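
    The two degrees that the abstract builds on can be computed on a finite universe with a small sketch; using max(1 - a, b) as the implication for the containment degree and min for the intersection degree is an assumed (and common) choice, not necessarily the operators adopted in the report.

        def containment_degree(A, B):
            """Degree to which fuzzy set A is contained in fuzzy set B
            (infimum over the universe of the pointwise implication)."""
            return min(max(1.0 - A[x], B[x]) for x in A)

        def intersection_degree(A, B):
            """Degree to which fuzzy set A intersects fuzzy set B
            (supremum over the universe of the pointwise minimum)."""
            return max(min(A[x], B[x]) for x in A)

        A = {"x1": 0.9, "x2": 0.4, "x3": 0.0}
        B = {"x1": 0.8, "x2": 0.7, "x3": 0.3}
        print(containment_degree(A, B))   # 0.7: support for "if x is certainly in A, it is in B"
        print(intersection_degree(A, B))  # 0.8: support for "x is possibly in both A and B"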

  16. Reward optimization in the primate brain: a probabilistic model of decision making under uncertainty.

    Directory of Open Access Journals (Sweden)

    Yanping Huang

    Full Text Available A key problem in neuroscience is understanding how the brain makes decisions under uncertainty. Important insights have been gained using tasks such as the random dots motion discrimination task, in which the subject makes decisions based on noisy stimuli. A descriptive model known as the drift diffusion model has previously been used to explain psychometric and reaction time data from such tasks, but to fully explain the data one is forced to make ad hoc assumptions such as a time-dependent collapsing decision boundary. We show that such assumptions are unnecessary when decision making is viewed within the framework of partially observable Markov decision processes (POMDPs). We propose an alternative model for decision making based on POMDPs. We show that the motion discrimination task reduces to the problems of (1) computing beliefs (posterior distributions over the unknown direction and motion strength) from noisy observations in a Bayesian manner, and (2) selecting actions based on these beliefs to maximize the expected sum of future rewards. The resulting optimal policy (belief-to-action mapping) is shown to be equivalent to a collapsing decision threshold that governs the switch from evidence accumulation to a discrimination decision. We show that the model accounts for both accuracy and reaction time as a function of stimulus strength, as well as different speed-accuracy conditions in the random dots task.
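
    Step (1) of this decomposition, the Bayesian belief update over the two motion directions, can be sketched as below for a known motion strength; the Gaussian observation model, its parameters and the random seed are placeholders, and the action-selection step (2) is omitted.

        import numpy as np

        rng = np.random.default_rng(0)
        coherence, sigma = 0.2, 1.0        # drift of the momentary evidence and its noise
        true_direction = +1                # +1 = rightward, -1 = leftward

        # For a Gaussian sample x with mean +c or -c and noise sigma, the
        # log-likelihood ratio of "rightward" versus "leftward" is 2*c*x / sigma^2.
        log_odds = 0.0                     # log P(right)/P(left), starting from a flat prior
        for _ in range(50):
            x = rng.normal(true_direction * coherence, sigma)
            log_odds += 2 * coherence * x / sigma**2
        belief_right = 1.0 / (1.0 + np.exp(-log_odds))
        print(round(belief_right, 3))      # posterior probability of rightward motion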

  17. Conceptual air sparging decision tool in support of the development of an air sparging optimization decision tool

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    The enclosed document describes a conceptual decision tool (hereinafter, Tool) for determining the applicability of and for optimizing air sparging systems. The Tool was developed by a multi-disciplinary team of internationally recognized experts in air sparging technology, led by a group of project and task managers at Parsons Engineering Science, Inc. (Parsons ES). The team included Mr. Douglas Downey and Dr. Robert Hinchee of Parsons ES, Dr. Paul Johnson of Arizona State University, Dr. Richard Johnson of Oregon Graduate Institute, and Mr. Michael Marley of Envirogen, Inc. User Community Panel Review was coordinated by Dr. Robert Siegrist of Colorado School of Mines (also of Oak Ridge National Laboratory) and Dr. Thomas Brouns of Battelle/Pacific Northwest Laboratory. The Tool is intended to provide guidance to field practitioners and environmental managers for evaluating the applicability and optimization of air sparging as a remedial action technique.

  18. Data-driven deselection for monographs: a rules-based approach to weeding, storage, and shared print decisions

    Directory of Open Access Journals (Sweden)

    Rick Lugg

    2012-07-01

    Full Text Available The value of local print book collections is changing. Even as stacks fill and library traffic grows, circulation continues to decline. Across the ‘collective collection’, millions of unused books occupy prime central campus space. Meanwhile, users want more collaborative study space and online resources. Libraries want room for information commons, teaching and learning centers and cafes. Done properly, removing unused books can free space for these and other purposes, with little impact on users. Many low-use titles are securely archived, accessible digitally, and widely held in print. Surplus copies can be removed without endangering the scholarly record. But identifying candidates for deselection is time-consuming. Batch-oriented tools that incorporate both archival and service values are needed. This article describes the characteristics of a decision-support system that assembles deselection metadata and enables library-defined rules to generate lists of titles eligible for withdrawal, storage, or inclusion in shared print programs.

  19. Optimization of mask manufacturing rule check constraint for model based assist feature generation

    Science.gov (United States)

    Shim, Seongbo; Kim, Young-chang; Chun, Yong-jin; Lee, Seong-Woo; Lee, Suk-joo; Choi, Seong-woon; Han, Woo-sung; Chang, Seong-hoon; Yoon, Seok-chan; Kim, Hee-bom; Ki, Won-tai; Woo, Sang-gyun; Cho, Han-gu

    2008-11-01

    SRAF (sub-resolution assist feature) generation has been a popular resolution enhancement technique in photolithography beyond the sub-65 nm node. It helps to increase the process window, and such techniques are sometimes called ILT (inverse lithography technology). Many studies have also been presented on how to determine the best positions of SRAFs and how to optimize their size. According to these reports, the generation of SRAFs can be formulated as a constrained optimization problem. The constraints are side lobe suppression and the allowable minimum feature size, or MRC (mask manufacturing rule check). As we know, a bigger SRAF contributes more to the main feature but is more susceptible to the SRAF side lobe issue. Thus, we finally have no choice but to trade off the advantages of an ideally optimized mask that contains very complicated SRAF patterns against a layout on which MRC has been imposed. The above dilemma can be resolved by simultaneously using a lower dose (higher threshold) and cleaning up with a smaller MRC. This solution widens the room between the threshold (side lobe limitation) and the MRC constraint (minimum feature limitation). In order to use a smaller MRC restriction without raising mask writing and inspection issues, it is also appropriate to identify the exact mask writing limitation and find smart mask constraints that well reflect mask manufacturability and the e-beam lithography characteristics. In this article, we discuss two main topics on mask optimization with SRAFs. The first topic is experimental work to find how mask writing capability behaves in terms of several MRC parameters, and we propose a more effective MRC constraint for aggressive generation of SRAFs. The next topic is finding the optimum MRC condition in a practical case, a 3X nm node DRAM contact layer. In fact, it is not easy to encompass the mask writing capability for very complicated real SRAF patterns by using the current MRC constraint based only on the width and

  20. Leveraging human decision making through the optimal management of centralized resources

    Science.gov (United States)

    Hyden, Paul; McGrath, Richard G.

    2016-05-01

    Combining results from mixed integer optimization, stochastic modeling and queuing theory, we will advance the interdisciplinary problem of efficiently and effectively allocating centrally managed resources. Academia currently fails to address this, as the esoteric demands of each of these large research areas limit work across traditional boundaries. The commercial space does not currently address these challenges due to the absence of a profit metric. By constructing algorithms that explicitly use inputs across boundaries, we are able to incorporate the advantages of using human decision makers. Key improvements in the underlying algorithms are made possible by aligning decision-maker goals with the feedback loops introduced between the core optimization step and the modeling of the overall stochastic process of supply and demand. A key observation is that human decision makers must be explicitly included in the analysis for these approaches to be ultimately successful. Transformative access gives warfighters and mission owners greater understanding of global needs and allows relationships to guide optimal resource allocation decisions. Mastery of demand processes and optimization bottlenecks reveals long-term maximum marginal utility gaps in capabilities.

  1. OPTIMAL BUSINESS DECISION SYSTEM FOR MULTINATIONALS: A MULTIFACTOR ANALYSIS OF SELECTED MANUFACTURING FIRMS

    Directory of Open Access Journals (Sweden)

    Oforegbunam Thaddeus Ebiringa

    2011-03-01

    Full Text Available Traditional MIS has been made more effective through the integration of organization, human and technology factors into a decision matrix. The study is motivated by the need to find an optimal mix of interactive factors that will optimize the result of decisions to apply ICT to manufacturing processes. The study used a factor analysis model based on the sampled opinions of forty (40) operations/production managers and two thousand (2000) production line workers of three leading manufacturing firms, Unilever Plc., PZ Plc, and Nigerian Breweries Plc, operating in the Aba Industrial Estate of Nigeria. The results show that a progressive mixed factor loading matrix, based on the preferred ordered importance of resource factors in the formulation, implementation, monitoring, control and evaluation of ICT projects of the selected firms, led to an average capability improvement of 0.764 in decision efficiency. This is considered strategic for achieving balanced corporate growth and development.

  2. Speedy standing wave design, optimization, and scaling rules of simulated moving bed systems with linear isotherms.

    Science.gov (United States)

    Weeden, George S; Wang, Nien-Hwa Linda

    2017-04-14

    Simulated Moving Bed (SMB) systems with linear adsorption isotherms have been used for many different separations, including large-scale sugar separations. While SMBs are much more efficient than batch operations, they are not widely used for large-scale production because of two key barriers: the methods for design, optimization, and scale-up are complex for non-ideal systems. The Speedy Standing Wave Design (SSWD) is developed here to reduce these barriers. The productivity (PR) and the solvent efficiency (F/D) are explicitly related to seven material properties and 13 design parameters. For diffusion-controlled systems, the maximum PR or F/D is controlled by two key dimensionless material properties, the selectivity (α) and the effective diffusivity ratio (η), and two key dimensionless design parameters, the ratios of step time/diffusion time and pressure-limited convection time/diffusion time. The optimum column configuration for maximum PR or F/D is controlled by the weighted diffusivity ratio (η/α²). In general, high α and low η/α² favor high PR and F/D. The productivity is proportional to the ratio of the feed concentration to the diffusion time. Small particles and high diffusivities favor high productivity, but do not affect solvent efficiency. Simple scaling rules are derived from the two key dimensionless design parameters. The separation of acetic acid from glucose in biomass hydrolysate is used as an example to show how the productivity and the solvent efficiency are affected by the key dimensionless material and design parameters. Ten design parameters are optimized for maximum PR or minimum cost in one minute on a laptop computer. If the material properties are the same for different particle sizes and the dimensionless groups are kept constant, then lab-scale testing consumes less material and can be done four times faster using particles with half the particle size. Copyright © 2017 Elsevier B.V. All rights reserved.
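
    The four-fold claim at the end of the abstract follows from the standard scaling of intraparticle diffusion time with particle radius. A minimal sketch of that arithmetic is given below; it assumes a diffusion time of the form t_D ~ R²/D_eff and uses illustrative numbers, not values from the study.

        # Minimal sketch of the particle-size scaling claim in the abstract:
        # if the intraparticle diffusion time scales as t_D ~ R^2 / D_eff and the
        # dimensionless groups (step time / diffusion time, convection time /
        # diffusion time) are held constant, halving the particle radius
        # shortens the test duration fourfold.  Numbers below are illustrative.

        def diffusion_time(radius_m, d_eff_m2_s):
            """Characteristic intraparticle diffusion time, t_D ~ R^2 / D_eff."""
            return radius_m**2 / d_eff_m2_s

        d_eff = 1.0e-10          # effective diffusivity, m^2/s (illustrative)
        r_full = 250e-6          # production-scale particle radius, m
        r_lab = r_full / 2       # lab-scale particles with half the size

        t_full = diffusion_time(r_full, d_eff)
        t_lab = diffusion_time(r_lab, d_eff)

        print(f"diffusion time ratio (full/lab): {t_full / t_lab:.1f}x")  # -> 4.0x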

  3. A normative inference approach for optimal sample sizes in decisions from experience

    Directory of Open Access Journals (Sweden)

    Dirk Ostwald

    2015-09-01

    Full Text Available Decisions from experience (DFE) refers to a body of work that emerged in research on behavioral decision making over the last decade. One of the major experimental paradigms employed to study experience-based choice is the sampling paradigm, which serves as a model of decision making under limited knowledge about the statistical structure of the world. In this paradigm respondents are presented with two payoff distributions, which, in contrast to standard approaches in behavioral economics, are specified not in terms of explicit outcome-probability information, but by the opportunity to sample outcomes from each distribution without economic consequences. Participants are encouraged to explore the distributions until they feel confident enough to decide from which distribution they would prefer to draw in a final trial involving real monetary payoffs. One commonly employed measure to characterize the behavior of participants in the sampling paradigm is the sample size, that is, the number of outcome draws which participants choose to obtain from each distribution prior to terminating sampling. A natural question that arises in this context concerns the optimal sample size, which could be used as a normative benchmark to evaluate human sampling behavior in DFE. In this theoretical manuscript, we relate the DFE sampling paradigm to the classical statistical decision theoretic literature and, under a probabilistic inference assumption, evaluate optimal sample sizes for decisions from experience. In our treatment we go beyond analytically established results by showing how the classical statistical decision theoretic framework can be used to derive optimal sample sizes under arbitrary, but numerically evaluable, constraints. Finally, we critically evaluate the value of deriving optimal sample sizes under this framework as testable predictions for the experimental study of sampling behavior in DFE.

  4. A New Algorithm of Mining Classification Rules Based on Decision Table

    Institute of Scientific and Technical Information of China (English)

    谢娟英; 冯德民

    2003-01-01

    The mining of classification rules is an important field in Data Mining. The decision table of rough set theory is an efficient tool for mining classification rules. The elementary concepts corresponding to the decision table of rough set theory are introduced in this paper. A new algorithm for mining classification rules based on the decision table is presented, along with a discernibility function for the reduction of attribute values and a new principle for the accuracy of rules. An example of its application to a car classification problem is included, and the accuracy of the discovered rules is analyzed. The potential fields for its application in data mining are also discussed.
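
    As a rough illustration of rule mining from a decision table, the sketch below extracts certain (consistent) rules, i.e. condition-value combinations whose matching objects all share one decision, in the spirit of rough-set lower approximations. It is not the algorithm or the discernibility-function reduction proposed in the paper; the toy car-like attributes and values are hypothetical, and no minimality pruning is applied.

        from itertools import combinations

        # Toy decision table (condition attributes -> decision); values are hypothetical.
        table = [
            {"price": "high", "safety": "high", "doors": "4", "decision": "buy"},
            {"price": "high", "safety": "low",  "doors": "2", "decision": "skip"},
            {"price": "low",  "safety": "high", "doors": "4", "decision": "buy"},
            {"price": "low",  "safety": "low",  "doors": "4", "decision": "skip"},
        ]
        conditions = ["price", "safety", "doors"]

        def consistent_rules(table, conditions, max_len=2):
            """Return certain rules: condition-value combinations whose matching
            objects all share one decision (a rough-set style lower approximation)."""
            rules = set()
            for k in range(1, max_len + 1):
                for attrs in combinations(conditions, k):
                    for row in table:
                        cond = tuple((a, row[a]) for a in attrs)
                        matches = [r for r in table if all(r[a] == v for a, v in cond)]
                        decisions = {r["decision"] for r in matches}
                        if len(decisions) == 1:
                            rules.add((cond, decisions.pop()))
            return rules

        for cond, dec in sorted(consistent_rules(table, conditions)):
            lhs = " AND ".join(f"{a}={v}" for a, v in cond)
            print(f"IF {lhs} THEN decision={dec}")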

  5. Simple, High-Performance Fusion Rule for Censored Decisions in Wireless Sensor Networks

    Institute of Scientific and Technical Information of China (English)

    LIU Xiangyang; PENG Yingning; WANG Xiutan

    2008-01-01

    Data selection-based summation fusion (DSSF) was developed to overcome the shortcomings of previously developed likelihood ratio tests based on channel statistics (LRT-CS) for the problem of fusing censored binary decisions transmitted over Nakagami fading channels in a wireless sensor network (WSN). The LRT-CS relies on detection probabilities of the local sensors, while the detection probabilities are a priori unknown for uncooperative targets. Also, for Nakagami fading channels, the LRT-CS involves an infinite series, which is cumbersome for real-time application. In contrast, the DSSF only involves data comparisons and additions and does not require the detection probabilities of local sensors. Furthermore, the performance of DSSF is only slightly degraded in comparison with the LRT-CS when the detection probabilities of local sensors are a priori unknown. Therefore, the DSSF should be used in a WSN with limited resources.

  6. Optimality and some of its discontents: successes and shortcomings of existing models for binary decisions.

    Science.gov (United States)

    Holmes, Philip; Cohen, Jonathan D

    2014-04-01

    We review how leaky competing accumulators (LCAs) can be used to model decision making in two-alternative, forced-choice tasks, and we show how they reduce to drift diffusion (DD) processes in special cases. As continuum limits of the sequential probability ratio test, DD processes are optimal in producing decisions of specified accuracy in the shortest possible time. Furthermore, the DD model can be used to derive a speed-accuracy trade-off that optimizes reward rate for a restricted class of two alternative forced-choice decision tasks. We review findings that compare human performance with this benchmark, and we reveal both approximations to and deviations from optimality. We then discuss three potential sources of deviations from optimality at the psychological level--avoidance of errors, poor time estimation, and minimization of the cost of control--and review recent theoretical and empirical findings that address these possibilities. We also discuss the role of cognitive control in changing environments and in modulating exploitation and exploration. Finally, we consider physiological factors in which nonlinear dynamics may also contribute to deviations from optimality.
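
    A minimal simulation of the drift diffusion process discussed above illustrates the speed-accuracy trade-off: raising the decision threshold increases accuracy at the cost of longer decision times. The drift, noise, and threshold values below are illustrative and are not taken from the reviewed models.

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_dd(drift=0.3, threshold=1.0, noise=1.0, dt=0.002, n_trials=2000):
            """Simulate a drift-diffusion process with symmetric absorbing bounds.
            Returns (accuracy, mean decision time in seconds)."""
            correct, times = 0, []
            for _ in range(n_trials):
                x, t = 0.0, 0.0
                while abs(x) < threshold:
                    x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
                    t += dt
                correct += x >= threshold
                times.append(t)
            return correct / n_trials, float(np.mean(times))

        # Raising the threshold trades speed for accuracy.
        for a in (0.5, 1.0, 1.5):
            acc, rt = simulate_dd(threshold=a)
            print(f"threshold={a:.1f}  accuracy={acc:.3f}  mean DT={rt:.3f}s")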

  7. A decision-making framework for river water quality management under uncertainty: Application of social choice rules.

    Science.gov (United States)

    Zolfagharipoor, Mohammad Amin; Ahmadi, Azadeh

    2016-12-01

    An important issue in river water quality management is taking into account the role played by wastewater dischargers in the decision-making process and in the implementation of any proposed waste load allocation program in a given region. In this study, a new decision-making methodology, called 'stochastic social choice rules' (SSCR), was developed for modeling the bargaining process among the different wastewater dischargers sharing the same environment. For this purpose, the costs associated with each treatment strategy were initially calculated as the sum of the treatment cost and the fines incurred due to violation of water quality standards. The water quality simulation model QUAL2Kw was then used to determine the penalty function. The uncertainty in the economic costs (i.e., the sum of treatment and penalty costs) associated with implementing the strategies was handled with a Monte Carlo selection method. This method was coupled with different social choice methods to identify the best solution for the waste load allocation problem. Finally, using the extended trading-ratio system (ETRS), the most preferred treatment strategy was exchanged among dischargers as the initial set of discharge permits aimed at reducing the costs and encouraging dischargers to participate in the river water quality protection scheme. The proposed model was finally applied to the Zarjoub River in Gilan Province, northern Iran, as a case study. Results showed the efficiency of the proposed model in developing waste load allocation strategies for rivers. Copyright © 2016 Elsevier Ltd. All rights reserved.
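
    The sketch below illustrates, under strong simplifications, the kind of coupling the abstract describes between Monte Carlo sampling of uncertain economic costs and a social choice rule: each discharger ranks the candidate treatment strategies for each cost sample, and the rankings are aggregated here with a Borda count as one possible social choice rule. The dischargers, strategies, and cost distributions are synthetic, and the ETRS trading step is not shown.

        import numpy as np

        rng = np.random.default_rng(1)

        strategies = ["S1", "S2", "S3", "S4"]     # candidate treatment strategies
        n_dischargers, n_samples = 3, 1000        # synthetic problem size

        # Uncertain economic cost (treatment + penalty) per discharger and strategy:
        # mean costs are illustrative; Monte Carlo adds lognormal noise.
        mean_cost = rng.uniform(1.0, 3.0, size=(n_dischargers, len(strategies)))

        def borda_winner(strategies, mean_cost, n_samples):
            """Sample costs, let each discharger rank strategies (cheapest first),
            and aggregate the rankings with a Borda count."""
            scores = np.zeros(len(strategies))
            for _ in range(n_samples):
                sampled = mean_cost * rng.lognormal(mean=0.0, sigma=0.2,
                                                    size=mean_cost.shape)
                for d in range(sampled.shape[0]):
                    ranking = np.argsort(sampled[d])          # cheapest strategy first
                    for place, s in enumerate(ranking):
                        scores[s] += len(strategies) - 1 - place
            return strategies[int(np.argmax(scores))], scores

        winner, scores = borda_winner(strategies, mean_cost, n_samples)
        print("Borda scores:", dict(zip(strategies, scores.round(0))))
        print("Selected strategy:", winner)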

  8. Testing game theory models: fighting ability and decision rules in chameleon contests.

    Science.gov (United States)

    Stuart-Fox, Devi

    2006-06-22

    Game theory models of animal contests make many non-mutually exclusive predictions, complicating empirical tests. These predictions regard the relationship between contest parameters and fighting ability, for which body size is usually used as a proxy. However, in many systems, body size may be a limited proxy since multiple traits and contextual factors such as experience influence fighting ability. Using contests between male Cape dwarf chameleons, Bradypodion pumilum, I test alternative game theory models of extended contests. I show how the most likely candidate model can be identified through a process of elimination, based on tests of key predictions. In addition, I present a measure of fighting ability based on multiple traits that allows ability to change as experience changes. In dwarf chameleons, persistence is based on loser thresholds rather than assessment of relative ability, ruling out the sequential assessment model. Winners and losers do not match behaviours in early parts of the contest, arguing against all types of war of attrition models. Although the cumulative assessment model remained as the most likely candidate model, not all specific predictions of this model were upheld.

  9. Second decision-making method of the fuzzy optimal model and its application

    Institute of Scientific and Technical Information of China (English)

    刘金禄; 陈守煜

    2004-01-01

    In the concept design of offshore platforms, it is necessary to select the best among feasible alternatives through comparison and filtering. Fuzzy optimal selection in multiple stages is a useful method in engineering decision-making, but in some cases it is difficult to distinguish between the first and second alternatives. Therefore it is necessary to improve the multi-stage fuzzy optimal selection method. In this paper, using a fuzzy cross-iteration optimal model, a weight sensitivity analysis is first carried out. Then a second decision-making method for solving fuzzy optimal problems is proposed; its basic principle is inverse reasoning in mathematics. This method expands the range of application of fuzzy optimal selection. Finally the method is applied to the concept design of an offshore platform. A case study shows that the method is scientific, reasonable and easy to use in practice.

  10. Decision-Aiding and Optimization for Vertical Navigation of Long-Haul Aircraft

    Science.gov (United States)

    Patrick, Nicholas J. M.; Sheridan, Thomas B.

    1996-01-01

    Most decisions made in the cockpit are related to safety, and have therefore been proceduralized in order to reduce risk. There are very few which are made on the basis of a value metric such as economic cost. One which can be shown to be value based, however, is the selection of a flight profile. Fuel consumption and flight time both have a substantial effect on aircraft operating cost, but they cannot be minimized simultaneously. In addition, winds, turbulence, and performance vary widely with altitude and time. These factors make it important and difficult for pilots to (a) evaluate the outcomes associated with a particular trajectory before it is flown and (b) decide among possible trajectories. The two elements of this problem considered here are: (1) determining what constitutes optimality, and (2) finding optimal trajectories. Pilots and dispatchers from major U.S. airlines were surveyed to determine which attributes of the outcome of a flight they considered the most important. Avoiding turbulence, for passenger comfort, topped the list of items which were not safety related. Pilots' decision making about the selection of flight profile on the basis of flight time, fuel burn, and exposure to turbulence was then observed. Of the several behavioral and prescriptive decision models invoked to explain the pilots' choices, utility maximization is shown to best reproduce the pilots' decisions. After considering more traditional methods for optimizing trajectories, a novel method is developed using a genetic algorithm (GA) operating on a discrete representation of the trajectory search space. The representation is a sequence of command altitudes, and was chosen to be compatible with the constraints imposed by Air Traffic Control, and with the training given to pilots. Since trajectory evaluation for the GA is performed holistically, a wide class of objective functions can be optimized easily. Also, using the GA it is possible to compare the costs associated with
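
    A hedged sketch of the genetic-algorithm idea described above: individuals are sequences of command altitudes (indices into a discrete altitude grid), and the holistic objective combines made-up per-segment fuel, time, and turbulence costs plus a penalty on altitude changes. The cost tables, weights, and GA settings are all assumptions for illustration, not the thesis's model.

        import numpy as np

        rng = np.random.default_rng(2)

        LEVELS = np.arange(28000, 41000, 1000)     # candidate command altitudes, ft
        N_SEG, POP, GENS = 10, 60, 200             # segments, population, generations

        # Synthetic per-segment, per-altitude costs standing in for fuel burn,
        # flight time and turbulence exposure (all illustrative).
        fuel = rng.uniform(0.8, 1.2, size=(N_SEG, LEVELS.size))
        time_cost = rng.uniform(0.9, 1.1, size=(N_SEG, LEVELS.size))
        turb = rng.uniform(0.0, 1.0, size=(N_SEG, LEVELS.size))

        def cost(profile, w_fuel=1.0, w_time=0.5, w_turb=2.0, w_climb=0.05):
            seg = np.arange(N_SEG)
            c = (w_fuel * fuel[seg, profile] + w_time * time_cost[seg, profile]
                 + w_turb * turb[seg, profile]).sum()
            c += w_climb * np.abs(np.diff(profile)).sum()   # penalize altitude changes
            return c

        def evolve():
            pop = rng.integers(0, LEVELS.size, size=(POP, N_SEG))
            for _ in range(GENS):
                fit = np.array([cost(ind) for ind in pop])
                new = [pop[fit.argmin()].copy()]             # elitism
                while len(new) < POP:
                    i, j = rng.integers(0, POP, 2), rng.integers(0, POP, 2)
                    p1 = pop[i[fit[i].argmin()]]             # tournament selection
                    p2 = pop[j[fit[j].argmin()]]
                    cut = rng.integers(1, N_SEG)             # one-point crossover
                    child = np.concatenate([p1[:cut], p2[cut:]])
                    mask = rng.random(N_SEG) < 0.1           # mutation
                    child[mask] = rng.integers(0, LEVELS.size, mask.sum())
                    new.append(child)
                pop = np.array(new)
            best = pop[np.array([cost(ind) for ind in pop]).argmin()]
            return LEVELS[best], cost(best)

        profile, c = evolve()
        print("best command-altitude profile (ft):", profile)
        print("cost:", round(float(c), 3))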

  11. Optimization of sensing time and cooperative user allocation for OR-rule cooperative spectrum sensing in cognitive radio network

    Institute of Scientific and Technical Information of China (English)

    刘鑫; 仲伟志; 陈琨奇

    2015-01-01

    In order to improve the throughput of cognitive radio (CR), optimization of sensing time and cooperative user allocation for OR-rule cooperative spectrum sensing was investigated in a CR network that includes multiple users and one fusion center. The frame structure of cooperative spectrum sensing was divided into multiple transmission time slots and one sensing time slot consisting of local energy detection and cooperative overhead. An optimization problem was formulated to maximize the throughput of the CR network, subject to constraints on both the false alarm probability and the detection probability. A joint optimization algorithm over sensing time and number of users was proposed to solve this optimization problem with low time complexity. An allocation algorithm of cooperative users was proposed to preferentially allocate the users to the channels with high utilization probability. The simulation results show that a significant improvement in throughput can be achieved through the proposed joint optimization and allocation algorithms.
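
    A rough sketch of the sensing-time/user-number search described above, assuming an OR fusion rule (Q_f = 1-(1-P_f)^N, Q_d = 1-(1-P_d)^N) and the standard energy-detection relation between per-user detection and false-alarm probabilities for a given sensing time. The frame length, SNR, and throughput constants are illustrative, and the paper's cooperative overhead and channel-allocation steps are omitted.

        import numpy as np
        from scipy.stats import norm

        # Illustrative parameters (not from the paper)
        fs = 6e6             # sampling frequency, Hz
        snr = 10**(-15/10)   # received primary-user SNR at each sensor (-15 dB)
        T = 0.1              # frame duration, s
        C0 = 6.66            # throughput when the channel is truly idle, bit/s/Hz
        P_H0 = 0.8           # probability the channel is idle
        Qd_target = 0.9      # required cooperative detection probability

        def throughput(tau, n_users):
            """Normalized CR throughput under the OR fusion rule, using the standard
            energy-detection relation between P_d and P_f for sensing time tau."""
            pd = 1 - (1 - Qd_target) ** (1.0 / n_users)        # per-user target P_d
            m = tau * fs                                        # number of samples
            pf = norm.sf(np.sqrt(2 * snr + 1) * norm.isf(pd) + np.sqrt(m) * snr)
            qf = 1 - (1 - pf) ** n_users                        # OR-rule false alarm
            return (T - tau) / T * P_H0 * C0 * (1 - qf)

        taus = np.linspace(0.001, 0.05, 200)
        best = max(((throughput(t, n), t, n) for t in taus for n in range(1, 11)))
        print(f"max throughput={best[0]:.3f}  tau={best[1]*1e3:.1f} ms  users={best[2]}")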

  12. A general method to select representative models for decision making and optimization under uncertainty

    Science.gov (United States)

    Shirangi, Mehrdad G.; Durlofsky, Louis J.

    2016-11-01

    The optimization of subsurface flow processes under geological uncertainty technically requires flow simulation to be performed over a large set of geological realizations for each function evaluation at every iteration of the optimizer. Because flow simulation over many permeability realizations (only permeability is considered to be uncertain in this study) may entail excessive computation, simulations are often performed for only a subset of 'representative' realizations. It is however challenging to identify a representative subset that provides flow statistics in close agreement with those from the full set, especially when the decision parameters (e.g., time-varying well pressures, well locations) are unknown a priori, as they are in optimization problems. In this work, we introduce a general framework, based on clustering, for selecting a representative subset of realizations for use in simulations involving 'new' sets of decision parameters. Prior to clustering, each realization is represented by a low-dimensional feature vector that contains a combination of permeability-based and flow-based quantities. Calculation of flow-based features requires the specification of a (base) flow problem and simulation over the full set of realizations. Permeability information is captured concisely through use of principal component analysis. By computing the difference between the flow response for the subset and the full set, we quantify the performance of various realization-selection methods. The impact of different weightings for flow and permeability information in the cluster-based selection procedure is assessed for a range of examples involving different types of decision parameters. These decision parameters are generated either randomly, in a manner that is consistent with the solutions proposed in global stochastic optimization procedures such as GA and PSO, or through perturbation around a base case, consistent with the solutions considered in pattern search
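
    The general recipe in the abstract (low-dimensional feature vectors built from PCA of permeability plus flow-based quantities, followed by clustering) can be sketched as below with synthetic data and scikit-learn. The feature weighting, number of clusters, and the stand-in flow response are assumptions; the realization closest to each cluster centroid is taken as a representative.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)

        n_real, n_cells = 200, 400          # realizations x grid cells (synthetic)
        log_perm = rng.normal(size=(n_real, n_cells))
        # flow_resp stands in for a flow-based quantity (e.g. a response from a
        # base flow problem simulated over all realizations).
        flow_resp = log_perm.mean(axis=1, keepdims=True) + 0.1 * rng.normal(size=(n_real, 1))

        def select_representatives(log_perm, flow_resp, n_subset=10, w_flow=1.0):
            """Cluster realizations on [PCA(permeability), flow features] and return
            the index of the realization closest to each cluster centroid."""
            perm_feat = PCA(n_components=5).fit_transform(log_perm)
            perm_feat = StandardScaler().fit_transform(perm_feat)
            flow_feat = StandardScaler().fit_transform(flow_resp)
            feats = np.hstack([perm_feat, w_flow * flow_feat])
            km = KMeans(n_clusters=n_subset, n_init=10, random_state=0).fit(feats)
            reps = []
            for c in range(n_subset):
                members = np.where(km.labels_ == c)[0]
                d = np.linalg.norm(feats[members] - km.cluster_centers_[c], axis=1)
                reps.append(int(members[d.argmin()]))
            return sorted(reps)

        subset = select_representatives(log_perm, flow_resp)
        print("representative realizations:", subset)
        print("full-set vs subset mean response:",
              flow_resp.mean(), flow_resp[subset].mean())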

  13. Consideration of Time as a Decision Variable in Subsurface Remediation Optimization

    Science.gov (United States)

    Endres, K. L.; Mayer, A. S.; Horn, J.

    2003-12-01

    Remediation time frames are normally fixed by a number of management and regulatory issues without consideration of the interaction between remediation cost and the time constraint. This work looks at the implications of the time constraint by considering time as a decision variable in the optimization process. We utilize a multi-objective optimization of a hypothetical contaminated aquifer that results in a trade-off curve of total remediation time vs. remediation costs. This curve allows decision makers to view the full range of options for time and cost. The cost function includes treatment, pumping and management costs. The multi-objective problem is formulated to minimize the design cost while also minimizing the remediation time. The Niched Pareto Genetic Algorithm (NPGA) has been modified to allow enforcement of water quality constraints. This constraint enforcement is developed by two methods. The first method applies a penalty to the fitness values as the enforcement mechanism. The second uses niching domination to apply the constraint. Each of these methods is innovative in remediation optimization work. Comparisons of the two methods are presented. Three sets of numerical computational experiments are performed to produce trade-off curves of cost and total time. The experiments increase in computational effort as the complexity of the time variables increases. In each experiment the cost objective is a function of pumping rate. The first experiment uses a single management period, where total time is the decision variable. The second uses multiple management periods of fixed length, where the number of management periods is the decision variable. The third has both the number of management periods and the length of the periods as decision variables. This method of investigating the impact of time as an optimization variable incorporates the full range of management possibilities. Comparisons of the three

  14. A normative inference approach for optimal sample sizes in decisions from experience.

    Science.gov (United States)

    Ostwald, Dirk; Starke, Ludger; Hertwig, Ralph

    2015-01-01

    "Decisions from experience" (DFE) refers to a body of work that emerged in research on behavioral decision making over the last decade. One of the major experimental paradigms employed to study experience-based choice is the "sampling paradigm," which serves as a model of decision making under limited knowledge about the statistical structure of the world. In this paradigm respondents are presented with two payoff distributions, which, in contrast to standard approaches in behavioral economics, are specified not in terms of explicit outcome-probability information, but by the opportunity to sample outcomes from each distribution without economic consequences. Participants are encouraged to explore the distributions until they feel confident enough to decide from which they would prefer to draw from in a final trial involving real monetary payoffs. One commonly employed measure to characterize the behavior of participants in the sampling paradigm is the sample size, that is, the number of outcome draws which participants choose to obtain from each distribution prior to terminating sampling. A natural question that arises in this context concerns the "optimal" sample size, which could be used as a normative benchmark to evaluate human sampling behavior in DFE. In this theoretical study, we relate the DFE sampling paradigm to the classical statistical decision theoretic literature and, under a probabilistic inference assumption, evaluate optimal sample sizes for DFE. In our treatment we go beyond analytically established results by showing how the classical statistical decision theoretic framework can be used to derive optimal sample sizes under arbitrary, but numerically evaluable, constraints. Finally, we critically evaluate the value of deriving optimal sample sizes under this framework as testable predictions for the experimental study of sampling behavior in DFE.

  15. Can Subjects be Guided to Optimal Decisions? The Use of a Real-Time Training Intervention Model

    Science.gov (United States)

    2016-06-01

    [Abstract not available. The extracted record contains only report-form fragments and appendix code from this thesis (author: Travis D. Carlson), which studies whether subjects can be guided to optimal decisions through a real-time training intervention, using a multi-arm bandit task framed as military wargaming convoy route selection, with a feedback condition among the experimental groups.]

  16. A modeling framework for optimal long-term care insurance purchase decisions in retirement planning.

    Science.gov (United States)

    Gupta, Aparna; Li, Lepeng

    2004-05-01

    The level of need for, and the costs of obtaining, long-term care (LTC) during retirement require that planning for it be an integral part of retirement planning. In this paper, we divide retirement planning into two phases, pre-retirement and post-retirement. On the basis of four interrelated models for health evolution, wealth evolution, LTC insurance premium and coverage, and LTC cost structure, a framework for optimal LTC insurance purchase decisions in the pre-retirement phase is developed. Optimal decisions are obtained by developing a trade-off between post-retirement LTC costs and LTC insurance premiums and coverage. Two-way branching models are used to model stochastic health events and asset returns. The resulting optimization problem is formulated as a dynamic programming problem. We compare the optimal decision under two insurance purchase scenarios: one assumes that insurance, once purchased, is kept for good, and the other assumes it may be purchased, relinquished and re-purchased. Sensitivity analysis is performed for the retirement age.

  17. Dimensions of design space: a decision-theoretic approach to optimal research design.

    Science.gov (United States)

    Conti, Stefano; Claxton, Karl

    2009-01-01

    Bayesian decision theory can be used not only to establish the optimal sample size and its allocation in a single clinical study but also to identify an optimal portfolio of research combining different types of study design. Within a single study, the highest societal payoff to proposed research is achieved when its sample sizes and allocation between available treatment options are chosen to maximize the expected net benefit of sampling (ENBS). Where a number of different types of study informing different parameters in the decision problem could be conducted, the simultaneous estimation of ENBS across all dimensions of the design space is required to identify the optimal sample sizes and allocations within such a research portfolio. This is illustrated through a simple example of a decision model of zanamivir for the treatment of influenza. The possible study designs include: 1) a single trial of all the parameters, 2) a clinical trial providing evidence only on clinical endpoints, 3) an epidemiological study of natural history of disease, and 4) a survey of quality of life. The possible combinations, samples sizes, and allocation between trial arms are evaluated over a range of cost-effectiveness thresholds. The computational challenges are addressed by implementing optimization algorithms to search the ENBS surface more efficiently over such large dimensions.

  18. Problems on Solving Matrix Aggregation in Group Decision-Making by Glowworm Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Yaping Li

    2016-01-01

    Full Text Available Judgment matrix aggregation, as an important part of group decision-making, has been widely and deeply studied due to the universality and importance of group decision-making in the management field. For the various types of judgment matrices in group decision-making, the matrix aggregation result can be obtained using glowworm swarm optimization. First, this paper introduces the basic principle of the glowworm swarm optimization (GSO) algorithm and presents an improved GSO algorithm to solve matrix aggregation problems. In this approach, the consistency ratio is introduced into the objective function of the glowworm swarm optimization, thus reducing the subjectivity and information loss in the aggregation process. Then, the improved GSO algorithm is applied to the solution of the deterministic matrix and the fuzzy matrix. The optimized method provides an effective and relatively uniform approach to matrix aggregation. Finally, through comparative analysis, it is shown that the method of this paper has certain advantages in terms of adaptability, accuracy, and stability in solving matrix aggregation problems.

  19. Assessing Traffic Accident Occurrence of Road Segments through an Optimized Decision Rule

    Directory of Open Access Journals (Sweden)

    Lu Ma

    2015-01-01

    Full Text Available Statistical models for estimating the safety status of transportation facilities have received great attention in the last two decades. These models also play an important role in transportation safety planning as well as in diagnosing locations with high accident risks. However, current methods largely rely on regression analyses and therefore may ignore the multicollinearity of factors, which could provide additional information for enhancing the performance of forecasting models. This study seeks to develop more precise models for forecasting safety status as well as to address the issue of multicollinearity in the dataset. The proposed mathematical approach is a discriminant analysis aimed at minimizing Bayes risk given the multivariate distributions of the factors. Based on this model, numerical analyses are also performed using a simulated dataset and an empirically observed dataset of traffic accidents on road segments. These examples illustrate the process of Bayes risk minimization in predicting the safety status of road segments toward the objective of the smallest misclassification rate. The paper concludes with a discussion of this methodology, and several important avenues for future studies are also provided.
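
    A minimal sketch of a two-class Bayes decision rule of the kind the abstract describes: class-conditional multivariate normal densities are fit to (synthetic) road-segment features, and each segment is assigned to the class that minimizes the conditional risk under an asymmetric loss matrix. The features, priors, and loss values are illustrative, not the paper's data or exact model.

        import numpy as np
        from scipy.stats import multivariate_normal

        rng = np.random.default_rng(4)

        # Synthetic "road segment" features (e.g. traffic volume, curvature), two classes.
        n = 300
        X0 = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], n)   # low risk
        X1 = rng.multivariate_normal([1.5, 1.0], [[1.0, 0.3], [0.3, 1.2]], n)   # high risk
        prior = np.array([0.7, 0.3])
        # Loss matrix L[i, j]: cost of deciding class j when the truth is class i.
        # Missing a high-risk segment is assumed costlier than a false alarm.
        L = np.array([[0.0, 1.0],
                      [5.0, 0.0]])

        dists = [multivariate_normal(X.mean(axis=0), np.cov(X, rowvar=False))
                 for X in (X0, X1)]

        def decide(x):
            """Pick the class with minimum conditional risk R(j|x) = sum_i L[i,j] P(i|x)."""
            post = np.array([prior[i] * dists[i].pdf(x) for i in range(2)])
            post = post / post.sum()
            risk = L.T @ post
            return int(risk.argmin())

        X_test = np.vstack([X0[:5], X1[:5]])
        print("decisions:", [decide(x) for x in X_test])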

  20. Inside the black box: Starting to uncover the underlying decision rules used in a one-by-one expert assessment of occupational exposure in case-control studies

    NARCIS (Netherlands)

    Wheeler, D.C.; Burstyn, I.; Vermeulen, R.; Yu, K.; Shortreed, S.M.; Pronk, A.; Stewart, P.A.; Colt, J.S.; Baris, D.; Karagas, M.R.; Schwenn, M.; Johnson, A.; Silverman, D.T.; Friesen, M.C.

    2013-01-01

    Objectives Evaluating occupational exposures in population-based case-control studies often requires exposure assessors to review each study participant's reported occupational information job-by-job to derive exposure estimates. Although such assessments likely have underlying decision rules, they

  1. Ionic force field optimization based on single-ion and ion-pair solvation properties: going beyond standard mixing rules.

    Science.gov (United States)

    Fyta, Maria; Netz, Roland R

    2012-03-28

    Using molecular dynamics (MD) simulations in conjunction with the SPC/E water model, we optimize ionic force-field parameters for seven different halide and alkali ions, considering a total of eight ion-pairs. Our strategy is based on simultaneously optimizing single-ion and ion-pair properties, i.e., we first fix ion-water parameters based on single-ion solvation free energies, and in a second step determine the cation-anion interaction parameters (traditionally given by mixing or combination rules) based on the Kirkwood-Buff theory without modification of the ion-water interaction parameters. In doing so, we have introduced scaling factors for the cation-anion Lennard-Jones (LJ) interaction that quantify deviations from the standard mixing rules. For the rather size-symmetric salt solutions involving bromide and chloride ions, the standard mixing rules work fine. On the other hand, for the iodide and fluoride solutions, corresponding to the largest and smallest anion considered in this work, a rescaling of the mixing rules was necessary. For iodide, the experimental activities suggest more tightly bound ion pairing than given by the standard mixing rules, which is achieved in simulations by reducing the scaling factor of the cation-anion LJ energy. For fluoride, the situation is different and the simulations show too large an attraction between fluoride and cations when compared with experimental data. For NaF, the situation can be rectified by increasing the cation-anion LJ energy. For KF, it proves necessary to increase the effective cation-anion Lennard-Jones diameter. The optimization strategy outlined in this work can be easily adapted to different kinds of ions.
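
    The deviation from standard mixing rules described above can be written compactly as scaling factors applied to the Lorentz-Berthelot combination rules. The sketch below shows that bookkeeping only; the sigma/epsilon values and scale factors are placeholders, not the optimized parameters of the study.

        import math

        def mixed_lj(sigma_cation, eps_cation, sigma_anion, eps_anion,
                     scale_eps=1.0, scale_sigma=1.0):
            """Lorentz-Berthelot mixing with deviation factors.

            scale_eps < 1 weakens the cation-anion attraction (as suggested for
            iodide salts); scale_eps > 1 or scale_sigma > 1 strengthens or enlarges
            it (as suggested for NaF and KF, respectively).  Values are placeholders.
            """
            sigma_ij = scale_sigma * 0.5 * (sigma_cation + sigma_anion)   # nm
            eps_ij = scale_eps * math.sqrt(eps_cation * eps_anion)        # kJ/mol
            return sigma_ij, eps_ij

        # Standard rule (scale factors of 1) vs a rescaled iodide-like pair.
        print(mixed_lj(0.24, 0.30, 0.52, 0.05))
        print(mixed_lj(0.24, 0.30, 0.52, 0.05, scale_eps=0.7))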

  2. Experimental optimization of a real time fed-batch fermentation process using Markov decision process.

    Science.gov (United States)

    Saucedo, V M; Karim, M N

    1997-07-20

    This article describes a methodology that implements a Markov decision process (MDP) optimization technique in a real time fed-batch experiment. Biological systems can be better modeled under the stochastic framework and MDP is shown to be a suitable technique for their optimization. A nonlinear input/output model is used to calculate the probability transitions. All elements of the MDP are identified according to physical parameters. Finally, this study compares the results obtained when optimizing ethanol production using the infinite horizon problem, with total expected discount policy, to previous experimental results aimed at optimizing ethanol production using a recombinant Escherichia coli fed-batch cultivation. (c) 1997 John Wiley & Sons, Inc. Biotechnol Bioeng 55: 317-327, 1997.
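
    Although the study's MDP is identified from a nonlinear input/output fermentation model, the underlying infinite-horizon discounted optimization can be illustrated with a generic value-iteration sketch on a small synthetic MDP, as below. The transition probabilities and rewards are random placeholders standing in for the identified process model.

        import numpy as np

        rng = np.random.default_rng(5)

        n_states, n_actions, gamma = 6, 3, 0.95

        # Synthetic MDP: P[a, s, s'] transition probabilities, R[s, a] expected reward
        # (standing in for productivity); both are illustrative.
        P = rng.random((n_actions, n_states, n_states))
        P /= P.sum(axis=2, keepdims=True)
        R = rng.random((n_states, n_actions))

        def value_iteration(P, R, gamma, tol=1e-8):
            """Return the optimal value function and a greedy policy that maximizes
            total expected discounted reward."""
            V = np.zeros(P.shape[1])
            while True:
                Q = R + gamma * np.einsum("asn,n->sa", P, V)
                V_new = Q.max(axis=1)
                if np.max(np.abs(V_new - V)) < tol:
                    return V_new, Q.argmax(axis=1)
                V = V_new

        V, policy = value_iteration(P, R, gamma)
        print("optimal values :", V.round(3))
        print("optimal policy :", policy)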

  3. A Multiswarm Optimizer for Distributed Decision Making in Virtual Enterprise Risk Management

    Directory of Open Access Journals (Sweden)

    Yichuan Shao

    2012-01-01

    Full Text Available We develop an optimization model for risk management in a virtual enterprise environment based on a novel multiswarm particle swarm optimizer called PS2O. The main idea of PS2O is to extend the single population PSO to the interacting multiswarms model by constructing hierarchical interaction topology and enhanced dynamical update equations. With the hierarchical interaction topology, a suitable diversity in the whole population can be maintained. At the same time, the enhanced dynamical update rule significantly speeds up the multiswarm to converge to the global optimum. With five mathematical benchmark functions, PS2O is proved to have considerable potential for solving complex optimization problems. PS2O is then applied to risk management in a virtual enterprise environment. Simulation results demonstrate that the PS2O algorithm is more feasible and efficient than the PSO algorithm in solving this real-world problem.

  4. Discovery of Transition Rules for Cellular Automata Using Artificial Bee Colony and Particle Swarm Optimization Algorithms in Urban Growth Modeling

    Directory of Open Access Journals (Sweden)

    Fereydoun Naghibi

    2016-12-01

    Full Text Available This paper presents an advanced method in urban growth modeling that discovers transition rules of cellular automata (CA) using the artificial bee colony (ABC) optimization algorithm. Comparisons between the simulation results of CA models optimized by the ABC algorithm and by the particle swarm optimization (PSO) algorithm, as intelligent approaches, were also performed to evaluate the potential of the proposed methods. According to previous studies, swarm intelligence algorithms for solving optimization problems, such as discovering transition rules of CA in land use change/urban growth modeling, can produce reasonable results. Modeling urban growth as a dynamic process is not straightforward because of the nonlinearity and heterogeneity among the effective variables involved, which can pose a number of challenges for traditional CA. The ABC algorithm, one of the new powerful swarm-based optimization algorithms, can be used to capture optimized transition rules of CA. This paper proposes a methodology based on remote sensing data for modeling urban growth with CA calibrated by the ABC algorithm. The performance of the ABC-CA, PSO-CA, and CA-logistic models in land use change detection is tested for the city of Urmia, Iran, between 2004 and 2014. Validations of the models based on statistical measures such as overall accuracy, figure of merit, and total operating characteristic were made. We show that the overall accuracy of the ABC-CA model was 89%, which was 1.5% and 6.2% higher than those of the PSO-CA and CA-logistic models, respectively. Moreover, the allocation disagreement (simulation error) of the simulation results for the ABC-CA, PSO-CA, and CA-logistic models is 11%, 12.5%, and 17.2%, respectively. Finally, for all evaluation indices, including running time, convergence capability, flexibility, statistical measurements, and the produced spatial patterns, the ABC-CA model showed relative improvement and therefore its superiority was

  5. Training Sequence Length Optimization for a Turbo-Detector Using Decision-Directed Channel Estimation

    Directory of Open Access Journals (Sweden)

    Imed Hadj Kacem

    2008-01-01

    Full Text Available We consider the problem of optimizing the training sequence length when a turbo-detector composed of a maximum a posteriori (MAP) equalizer and a MAP decoder is used. At each iteration of the receiver, the channel is estimated using the hard decisions on the transmitted symbols at the output of the decoder. The optimal length of the training sequence is found by maximizing an effective signal-to-noise ratio (SNR) that takes into account the data throughput loss due to the use of pilot symbols.

  6. Multiobjective Optimization of Aircraft Maintenance in Thailand Using Goal Programming: A Decision-Support Model

    Directory of Open Access Journals (Sweden)

    Yuttapong Pleumpirom

    2012-01-01

    Full Text Available The purpose of this paper is to develop the multiobjective optimization model in order to evaluate suppliers for aircraft maintenance tasks, using goal programming. The authors have developed a two-step process. The model will firstly be used as a decision-support tool for managing demand, by using aircraft and flight schedules to evaluate and generate aircraft-maintenance requirements, including spare-part lists. Secondly, they develop a multiobjective optimization model by minimizing cost, minimizing lead time, and maximizing the quality under various constraints in the model. Finally, the model is implemented in the actual airline's case.

  7. Optimization of Management Decision by Network Method used for Chipboards Manufacturing

    Directory of Open Access Journals (Sweden)

    Ivan Cismaru

    2008-03-01

    Full Text Available The paper presents a method of economic analysis through which the basis of a management decision may be established in order to optimize chipboard manufacturing activity. The method focuses on the rational and efficient use of raw materials, sorted by assortment across the network, and, depending on the stock situation in the store, on establishing at any moment the optimal recipe to be implemented according to the manufacturing expenses and, implicitly, the profit, this recipe varying over time depending on the supply possibilities.

  8. Supply Chain Network Optimization Based on Fuzzy Multiobjective Centralized Decision-Making Model

    Directory of Open Access Journals (Sweden)

    Xinyi Fu

    2017-01-01

    Full Text Available Supply chain cooperation strategy holds that integration of the operation process can create value for customers, optimize the connections between the vertical nodes of the supply chain, and continually strengthen performance advantages, so as to achieve mutual benefit and win-win results. Under a fuzzy uncertain environment and a centralized decision-making mode, we study multiobjective decision-making optimization, focusing on the equilibrium and compensation of multiobjective problems; that is, a proper adjustment of one goal's satisfaction level may change the satisfaction levels of the other goals considerably. Through coordination among the multiple objectives, the supply chain system achieves a global optimum. Finally, a simulation experiment based on a Shaoxing textile case is used to verify the model's efficiency and effectiveness.

  9. Bounded rationality, abstraction and hierarchical decision-making: an information-theoretic optimality principle

    Directory of Open Access Journals (Sweden)

    Tim Genewein

    2015-11-01

    Full Text Available Abstraction and hierarchical information-processing are hallmarks of human and animal intelligence underlying the unrivaled flexibility of behavior in biological systems. Achieving such a flexibility in artificial systems is challenging, even with more and more computational power. Here we investigate the hypothesis that abstraction and hierarchical information-processing might in fact be the consequence of limitations in information-processing power. In particular, we study an information-theoretic framework of bounded rational decision-making that trades off utility maximization against information-processing costs. We apply the basic principle of this framework to perception-action systems with multiple information-processing nodes and derive bounded optimal solutions. We show how the formation of abstractions and decision-making hierarchies depends on information-processing costs. We illustrate the theoretical ideas with example simulations and conclude by formalizing a mathematically unifying optimization principle that could potentially be extended to more complex systems.
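
    The core trade-off described above admits a compact sketch: the bounded-rational policy that balances expected utility against an information-processing (KL) cost is a softmax, p(a|s) proportional to p(a) exp(beta U(s,a)), with the prior p(a) obtained self-consistently as the action marginal (a Blahut-Arimoto style iteration). The utilities, state distribution, and beta values below are illustrative, not taken from the paper's simulations.

        import numpy as np

        rng = np.random.default_rng(6)

        n_states, n_actions = 4, 5
        U = rng.random((n_states, n_actions))      # utility U(s, a), illustrative
        p_s = np.full(n_states, 1.0 / n_states)    # state distribution

        def bounded_rational_policy(U, p_s, beta, n_iter=200):
            """Blahut-Arimoto style iteration: p(a|s) ~ p(a) * exp(beta * U(s,a)),
            with the prior p(a) updated as the marginal of the policy."""
            p_a = np.full(U.shape[1], 1.0 / U.shape[1])
            for _ in range(n_iter):
                policy = p_a * np.exp(beta * U)
                policy /= policy.sum(axis=1, keepdims=True)
                p_a = p_s @ policy                 # new marginal over actions
            return policy

        # Low beta (tight information budget) gives near-uniform, abstract behavior;
        # high beta approaches deterministic utility maximization.
        for beta in (0.1, 10.0):
            pol = bounded_rational_policy(U, p_s, beta)
            print(f"beta={beta}:", pol.round(2))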

  10. Nonuniqueness versus Uniqueness of Optimal Policies in Convex Discounted Markov Decision Processes

    Directory of Open Access Journals (Sweden)

    Raúl Montes-de-Oca

    2013-01-01

    Full Text Available From the classical point of view, it is important to determine whether, in a Markov decision process (MDP), the uniqueness of the optimal policies is guaranteed besides their existence. It is well known that uniqueness does not always hold in optimization problems (for instance, in linear programming). On the other hand, in such problems it is possible for a slight perturbation of the functional cost to restore uniqueness. In this paper, it is proved that the value functions of an MDP and its cost-perturbed version stay close, under adequate conditions, which in some sense is a priority. We are interested in the stability of Markov decision processes with respect to perturbations of the cost-as-you-go function.

  11. Optimal Financing Decisions of Two Cash-Constrained Supply Chains with Complementary Products

    Directory of Open Access Journals (Sweden)

    Yuting Li

    2016-04-01

    Full Text Available In recent years, financing difficulties have beset small and medium enterprises (SMEs), especially emerging SMEs. Joint financing among members within a supply chain is one solution for SMEs. What about joint financing among members across supply chains? To answer this question, we first employ the Stackelberg game to propose three financing decision models for two cash-constrained supply chains with complementary products. Second, we analyze these models qualitatively and find that the joint financing decision of the two supply chains is the optimal one. Last, we conduct numerical simulations, not only to illustrate the above results but also to show that the larger the cross-price sensitivity coefficients, the higher the motivation for participants to make joint financing decisions, and the more profit they gain.

  12. Parameter uncertainty-based pattern identification and optimization for robust decision making on watershed load reduction

    Science.gov (United States)

    Jiang, Qingsong; Su, Han; Liu, Yong; Zou, Rui; Ye, Rui; Guo, Huaicheng

    2017-04-01

    Nutrient load reduction in a watershed is essential for restoring lakes from eutrophication. Efficient and optimal decision-making on load reduction is generally based on water quality modeling and the quantitative identification of nutrient sources at the watershed scale. The modeling process is inevitably influenced by inherent uncertainties, especially uncertain parameters due to equifinality. The emerging question is therefore: if there is parameter uncertainty, how can the robustness of the optimal decisions be ensured? Based on simulation-optimization models, an integrated approach of pattern identification and robustness analysis was proposed in this study, focusing on the impact of parameter uncertainty in water quality modeling. Here a pattern represents the discernible regularity of load reduction solutions under multiple parameter sets. Pattern identification is achieved using a hybrid clustering analysis (i.e., Ward-hierarchical and K-means), which proved flexible and efficient in analyzing Lake Bali near the Yangtze River in China. The results demonstrated that the urban domestic nutrient load is the source with the greatest potential for reduction, and that there are two patterns for Total Nitrogen (TN) reduction and three patterns for Total Phosphorus (TP) reduction. The patterns indicate different total reductions of nutrient loads, which reflect diverse decision preferences. The robust solution was identified as the one with the highest attainment of the water quality targets, under which water quality at the monitoring stations improved uniformly. We conducted a process analysis of robust decision-making based on pattern identification and uncertainty, which provides effective support for decision-making with preferences under uncertainty.

  13. RSLES: an architectural implementation of a decision support system for optimal recruit station location

    OpenAIRE

    Houck, Dale E.; Shigley, Mark V.

    1999-01-01

    Approved for public release; distribution is unlimited. This thesis describes a component-based methodology for developing a decision support system (DSS) for optimal location of military recruiting stations in regional recruiting markets. The DSS is designed to ensure that stations are selected that minimize cost for a given level of production. The interface allows users to perform "what if" analysis to determine if there are better locations to meet desired objectives. The Recruit Statio...

  14. Randomized Search Methods for Solving Markov Decision Processes and Global Optimization

    Science.gov (United States)

    2006-01-01

    [Abstract not available. The extracted record contains only fragments of the report's reference list, including citations to the successive over-relaxation (SOR) method, Pintér's Global Optimization in Action (Kluwer Academic Publishers, 1996), and Puterman and Shin's modified policy iteration algorithms for discounted Markov decision processes (Management Science, 24, 1127-1137, 1978).]

  15. Optimization

    CERN Document Server

    Pearce, Charles

    2009-01-01

    Focuses on mathematical structure, and on real-world applications. This book includes developments in several optimization-related topics such as decision theory, linear programming, turnpike theory, duality theory, convex analysis, and queuing theory.

  16. A multi-criteria optimization and decision-making approach for improvement of food engineering processes

    Directory of Open Access Journals (Sweden)

    Alik Abakarov

    2013-04-01

    Full Text Available The objective of this study was to propose a multi-criteria optimization and decision-making technique to solve food engineering problems. This technique was demonstrated using experimental data obtained on osmotic dehydration of carrot cubes in a sodium chloride solution. The Aggregating Functions Approach, the Adaptive Random Search Algorithm, and the Penalty Functions Approach were used in this study to compute the initial set of non-dominated or Pareto-optimal solutions. Multiple non-linear regression analysis was performed on a set of experimental data in order to obtain particular multi-objective functions (responses), namely water loss, solute gain, rehydration ratio, three different colour criteria of the rehydrated product, and sensory evaluation (organoleptic quality). Two multi-criteria decision-making approaches, the Analytic Hierarchy Process (AHP) and the Tabular Method (TM), were used simultaneously to choose the best alternative among the set of non-dominated solutions. The multi-criteria optimization and decision-making technique proposed in this study can facilitate the assessment of criteria weights, giving rise to a fairer, more consistent, and adequate final compromise solution or food process. This technique can be useful to food scientists in research and education, as well as to engineers involved in the improvement of a variety of food engineering processes.
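
    The AHP step mentioned above can be sketched as deriving criteria weights from a pairwise comparison matrix via its principal eigenvector, together with Saaty's consistency ratio. The comparison judgments below are illustrative, and the Tabular Method and the Pareto-front generation are not shown.

        import numpy as np

        # Pairwise comparison matrix for three criteria (e.g. water loss, colour,
        # organoleptic quality); the Saaty-scale judgments below are illustrative.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        def ahp_weights(A):
            """Principal-eigenvector weights and Saaty consistency ratio."""
            vals, vecs = np.linalg.eig(A)
            k = int(np.argmax(vals.real))
            w = np.abs(vecs[:, k].real)
            w /= w.sum()
            n = A.shape[0]
            ci = (vals[k].real - n) / (n - 1)              # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # random index (Saaty)
            return w, ci / ri

        w, cr = ahp_weights(A)
        print("criteria weights  :", w.round(3))
        print("consistency ratio :", round(float(cr), 3))  # < 0.10 is conventionally acceptable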

  17. Optimization of matrix tablets controlled drug release using Elman dynamic neural networks and decision trees.

    Science.gov (United States)

    Petrović, Jelena; Ibrić, Svetlana; Betz, Gabriele; Đurić, Zorica

    2012-05-30

    The main objective of the study was to develop artificial intelligence methods for optimization of drug release from matrix tablets regardless of the matrix type. Static and dynamic artificial neural networks of the same topology were developed to model dissolution profiles of different matrix tablet types (hydrophilic/lipid) using formulation composition, compression force used for tableting, and tablet porosity and tensile strength as input data. The potential application of decision trees in discovering knowledge from experimental data was also investigated. Polyethylene oxide polymer and glyceryl palmitostearate were used as matrix-forming materials for the hydrophilic and lipid matrix tablets, respectively, whereas the selected model drugs were diclofenac sodium and caffeine. Matrix tablets were prepared by the direct compression method and tested for in vitro dissolution profiles. Optimization of the static and dynamic neural networks used for modeling drug release was performed using Monte Carlo simulations or a genetic algorithm optimizer. Decision trees were constructed following discretization of the data. Calculated difference (f(1)) and similarity (f(2)) factors for predicted and experimentally obtained dissolution profiles of test matrix tablet formulations indicate that Elman dynamic neural networks as well as decision trees are capable of accurate predictions of both hydrophilic and lipid matrix tablet dissolution profiles. Elman neural networks were compared to the most frequently used static network, the multi-layered perceptron, and the superiority of Elman networks has been demonstrated. The developed methods allow a simple, yet very precise, way of predicting drug release for both hydrophilic and lipid matrix tablets with controlled drug release.

  18. Product Form Design Model Based on Multiobjective Optimization and Multicriteria Decision-Making

    Directory of Open Access Journals (Sweden)

    Meng-Dar Shieh

    2017-01-01

    Full Text Available Affective responses concern customers' affective needs and have received increasing attention in consumer-focused research. To design a product that appeals to consumers, designers should consider multiple affective responses (MARs). Designing products capable of satisfying MARs falls into the category of multiobjective optimization (MOO). However, when exploring optimal product form design, most relevant studies have transformed multiple objectives into a single objective, which limits their usefulness to designers and consumers. To optimize product form design for MARs, this paper proposes an integrated model based on MOO and multicriteria decision-making (MCDM). First, design analysis is applied to identify design variables and MARs; quantification theory type I is then employed to build the relationship models between them; on the basis of these models, an MOO model for optimization of product form design is constructed. Next, we use the nondominated sorting genetic algorithm-II (NSGA-II) as a multiobjective evolutionary algorithm (MOEA) to solve the MOO model and thereby derive Pareto optimal solutions. Finally, we adopt the fuzzy analytic hierarchy process (FAHP) to obtain the optimal design from the Pareto solutions. A case study of car form design is conducted to demonstrate the proposed approach. The results suggest that this approach is feasible and effective in obtaining optimal designs and can provide great insight for product form design.

  19. Prediction of Early Recurrence of Liver Cancer by a Novel Discrete Bayes Decision Rule for Personalized Medicine

    Science.gov (United States)

    Ogihara, Hiroyuki

    2016-01-01

    We discuss a novel diagnostic method for predicting the early recurrence of liver cancer with high accuracy for personalized medicine. The difficulty with cancer treatment is that even if the types of cancer are the same, the cancers vary depending on the patient. Thus, remarkable attention has been paid to personalized medicine. Unfortunately, although the Tokyo Score, the Modified JIS, and the TNM classification have been proposed as liver scoring systems, none of these scoring systems have met the needs of clinical practice. In this paper, we convert continuous and discrete data to categorical data and keep the natively categorical data as is. Then, we propose a discrete Bayes decision rule that can deal with the categorical data. This may lead to its use with various types of laboratory data. Experimental results show that the proposed method produced a sensitivity of 0.86 and a specificity of 0.49 for the test samples. This suggests that our method may be superior to the well-known Tokyo Score, the Modified JIS, and the TNM classification in terms of sensitivity. Additional comparative study shows that if the numbers of test samples in two classes are the same, this method works well in terms of the F1 measure compared to the existing scoring methods. PMID:27800494
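
    A hedged sketch of a discrete Bayes decision rule over categorical inputs (continuous laboratory values binned into categories): class priors and smoothed categorical likelihoods are estimated from training cases, and the class with the highest posterior is chosen. A naive conditional-independence assumption and Laplace smoothing are used here for brevity; the clinical variables and the paper's exact rule are not reproduced.

        from collections import defaultdict
        import math

        # Toy categorical training data: each patient is (features, recurrence label).
        # Feature categories stand in for binned laboratory values; values are made up.
        train = [
            ({"afp": "high", "tumor_size": "large", "stage": "III"}, 1),
            ({"afp": "high", "tumor_size": "small", "stage": "II"},  1),
            ({"afp": "low",  "tumor_size": "small", "stage": "I"},   0),
            ({"afp": "low",  "tumor_size": "large", "stage": "II"},  0),
            ({"afp": "low",  "tumor_size": "small", "stage": "I"},   0),
        ]

        def fit(train, alpha=1.0):
            """Estimate class priors and Laplace-smoothed categorical likelihoods."""
            classes = sorted({y for _, y in train})
            prior = {c: sum(y == c for _, y in train) / len(train) for c in classes}
            counts = defaultdict(lambda: defaultdict(float))
            values = defaultdict(set)
            for x, y in train:
                for f, v in x.items():
                    counts[(y, f)][v] += 1
                    values[f].add(v)
            def loglik(x, c):
                s = math.log(prior[c])
                for f, v in x.items():
                    n_c = sum(counts[(c, f)].values())
                    s += math.log((counts[(c, f)][v] + alpha)
                                  / (n_c + alpha * len(values[f])))
                return s
            return classes, loglik

        classes, loglik = fit(train)
        patient = {"afp": "high", "tumor_size": "small", "stage": "I"}
        print({c: round(loglik(patient, c), 3) for c in classes})
        print("predicted recurrence class:", max(classes, key=lambda c: loglik(patient, c)))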

  20. Validation of New and Existing Decision Rules for the Estimation of Beat-to-Beat Pulse Transit Time

    Directory of Open Access Journals (Sweden)

    Xiaolin Zhou

    2015-01-01

    Full Text Available Pulse transit time (PTT) is a pivotal marker of vascular stiffness. Because the actual PTT duration in vivo is unknown and complicated waveform variations may occur, robust determination of the characteristic point is still a very difficult task in PTT estimation. Our objective is to devise a method for real-time estimation of the PTT duration in the pulse wave that is able to reduce the interference caused by both high- and low-frequency noise. The reproducibility and performance of these methods are assessed on both artificial and clinical pulse data. Artificial data are generated to investigate the reproducibility at various signal-to-noise ratios. For all artificial data, the mean biases obtained from all methods are less than 1 ms; collectively, the newly proposed method has the minimum standard deviation (SD, <1 ms). A set of data from 33 participants, together with synchronously recorded continuous blood pressure data, is used to investigate the correlation coefficient (CC). The statistical analysis shows that our method has the maximum values of mean CC (0.5231), sum of CCs (17.26), and median CC (0.5695), and the minimum SD of CCs (0.1943). Overall, the test results in this study indicate that the newly developed method has advantages over traditional decision rules for PTT measurement.

  1. The role of situation assessment and flight experience in pilots' decisions to continue visual flight rules flight into adverse weather.

    Science.gov (United States)

    Wiegmann, Douglas A; Goh, Juliana; O'Hare, David

    2002-01-01

    Visual flight rules (VFR) flight into instrument meteorological conditions (IMC) is a major safety hazard in general aviation. In this study we examined pilots' decisions to continue or divert from a VFR flight into IMC during a dynamic simulation of a cross-country flight. Pilots encountered IMC either early or later into the flight, and the amount of time and distance pilots flew into the adverse weather prior to diverting was recorded. Results revealed that pilots who encountered the deteriorating weather earlier in the flight flew longer into the weather prior to diverting and had more optimistic estimates of weather conditions than did pilots who encountered the deteriorating weather later in the flight. Both the time and distance traveled into the weather prior to diverting were negatively correlated with pilots' previous flight experience. These findings suggest that VFR flight into IMC may be attributable, at least in part, to poor situation assessment and experience rather than to motivational judgment that induces risk-taking behavior as more time and effort are invested in a flight. Actual or potential applications of this research include the design of interventions that focus on improving weather evaluation skills in addition to addressing risk-taking attitudes.

  2. Reliability-oriented multi-objective optimal decision-making approach for uncertainty-based watershed load reduction

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Feifei [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Liu, Yong, E-mail: yongliu@pku.edu.cn [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Institute of Water Sciences, Peking University, Beijing 100871 (China); Su, Han [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Zou, Rui [Tetra Tech, Inc., 10306 Eaton Place, Ste 340, Fairfax, VA 22030 (United States); Yunnan Key Laboratory of Pollution Process and Management of Plateau Lake-Watershed, Kunming 650034 (China); Guo, Huaicheng [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China)

    2015-05-15

Water quality management and load reduction are subject to inherent uncertainties in watershed systems and competing decision objectives. Therefore, optimal decision-making modeling in watershed load reduction is hampered by the following challenges: (a) it is difficult to obtain absolutely “optimal” solutions, and (b) decision schemes may be vulnerable to failure. The probability that solutions are feasible under uncertainties is defined as reliability. A reliability-oriented multi-objective (ROMO) decision-making approach was proposed in this study for optimal decision making with stochastic parameters and multiple decision reliability objectives. Lake Dianchi, one of the three most eutrophic lakes in China, was examined as a case study for optimal watershed nutrient load reduction to restore lake water quality. This study aimed to maximize reliability levels from considerations of cost and load reductions. The Pareto solutions of the ROMO optimization model were generated with a multi-objective evolutionary algorithm, demonstrating schemes representing different biases towards reliability. The Pareto fronts of six maximum allowable emission (MAE) scenarios were obtained, which indicated that decisions may be unreliable under impractical load reduction requirements. A decision scheme identification process was conducted using the back propagation neural network (BPNN) method to provide a shortcut for identifying schemes at specific reliability levels for decision makers. The model results indicated that the ROMO approach can offer decision makers great insight into reliability tradeoffs and can thus help them to avoid ineffective decisions. - Highlights: • Reliability-oriented multi-objective (ROMO) optimal decision approach was proposed. • The approach can avoid specifying reliability levels prior to optimization modeling. • Multiple reliability objectives can be systematically balanced using Pareto fronts. • Neural network model was used to
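
    The ROMO formulation is not spelled out in the abstract; the sketch below only illustrates the reliability concept it relies on — the probability that a candidate load-reduction scheme stays below a maximum allowable emission (MAE) target when source loads and removal efficiencies are stochastic. All numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def reliability(reduction_plan, mae, n_samples=10_000):
    """Monte Carlo probability that a watershed load-reduction plan keeps the total
    nutrient load below a maximum allowable emission (MAE) target when baseline
    source loads and removal efficiencies are uncertain (hypothetical model)."""
    n_sources = len(reduction_plan)
    base_loads = rng.lognormal(mean=np.log(100.0), sigma=0.3, size=(n_samples, n_sources))
    efficiency = rng.normal(loc=reduction_plan, scale=0.05, size=(n_samples, n_sources)).clip(0, 1)
    residual = (base_loads * (1.0 - efficiency)).sum(axis=1)
    return float((residual <= mae).mean())

plan_a = np.array([0.3, 0.5, 0.4])   # fraction of load removed at each source
plan_b = np.array([0.6, 0.6, 0.6])
for name, plan in [("A", plan_a), ("B", plan_b)]:
    cost = 1000 * plan.sum()         # stylized linear cost of the scheme
    print(name, "cost:", cost, "reliability:", reliability(plan, mae=180.0))
```

    In the full approach, points like these would be traded off against cost on a Pareto front rather than compared one at a time.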

  3. A Generalized Decision Framework Using Multi-objective Optimization for Water Resources Planning

    Science.gov (United States)

    Basdekas, L.; Stewart, N.; Triana, E.

    2013-12-01

    Colorado Springs Utilities (CSU) is currently engaged in an Integrated Water Resource Plan (IWRP) to address the complex planning scenarios, across multiple time scales, currently faced by CSU. The modeling framework developed for the IWRP uses a flexible data-centered Decision Support System (DSS) with a MODSIM-based modeling system to represent the operation of the current CSU raw water system coupled with a state-of-the-art multi-objective optimization algorithm. Three basic components are required for the framework, which can be implemented for planning horizons ranging from seasonal to interdecadal. First, a water resources system model is required that is capable of reasonable system simulation to resolve performance metrics at the appropriate temporal and spatial scales of interest. The system model should be an existing simulation model, or one developed during the planning process with stakeholders, so that 'buy-in' has already been achieved. Second, a hydrologic scenario tool(s) capable of generating a range of plausible inflows for the planning period of interest is required. This may include paleo informed or climate change informed sequences. Third, a multi-objective optimization model that can be wrapped around the system simulation model is required. The new generation of multi-objective optimization models do not require parameterization which greatly reduces problem complexity. Bridging the gap between research and practice will be evident as we use a case study from CSU's planning process to demonstrate this framework with specific competing water management objectives. Careful formulation of objective functions, choice of decision variables, and system constraints will be discussed. Rather than treating results as theoretically Pareto optimal in a planning process, we use the powerful multi-objective optimization models as tools to more efficiently and effectively move out of the inferior decision space. The use of this framework will help CSU

  4. Review of models and actors in energy mix optimization – can leader visions and decisions align with optimum model strategies for our future energy systems?

    NARCIS (Netherlands)

    Weijermars, R.; Taylor, P.; Bahn, O.; Das, S.R.; Wei, Y.M.

    2011-01-01

    Organizational behavior and stakeholder processes continually influence energy strategy choices and decisions. Although theoretical optimizations can provide guidance for energy mix decisions from a pure physical systems engineering point of view, these solutions might not be optimal from a politica

  5. Average Sample-path Optimality for Continuous-time Markov Decision Processes in Polish Spaces

    Institute of Scientific and Technical Information of China (English)

    Quan-xin ZHU

    2011-01-01

In this paper we study the average sample-path cost (ASPC) problem for continuous-time Markov decision processes in Polish spaces. To the best of our knowledge, this paper is a first attempt to study the ASPC criterion on continuous-time MDPs with Polish state and action spaces. The corresponding transition rates are allowed to be unbounded, and the cost rates may have neither upper nor lower bounds. Under some mild hypotheses, we prove the existence of ε (ε ≥ 0)-ASPC optimal stationary policies based on two different approaches: one is the “optimality equation” approach and the other is the “two optimality inequalities” approach.

  6. Preventive maintenance: optimization of time - based discard decisions at the bruce nuclear generating station

    Energy Technology Data Exchange (ETDEWEB)

    Doyle, E.K. [Ontario Power Generation, Tiverton (Canada); Jardine, A.K.S. [Toronto Univ., Dept. of Mechanical and Industrial Engineering, ON (Canada)

    2001-07-01

The use of various maintenance optimization techniques at Bruce has led to cost-effective preventive maintenance applications for complex systems. As previously reported at ICONE 6 in New Orleans, 1996, several innovative practices reduced Reliability Centered Maintenance costs while maintaining the accuracy of the analysis. The optimization strategy has undergone further evolution, and at present an Integrated Maintenance Program (IMP) is in place in which an Expert Panel consisting of all players/experts proceeds through each system in a disciplined fashion and reaches agreement on all items within a rigorous time frame. It is well known that essentially three maintenance-based actions can flow from a maintenance optimization analysis: condition-based maintenance, time-based maintenance and time-based discard. The present effort deals with time-based discard decisions. Maintenance data from the Remote On-Power Fuel Changing System were used. (author)

  7. A complex systems approach to planning, optimization and decision making for energy networks

    Energy Technology Data Exchange (ETDEWEB)

    Beck, Jessica; Kempener, Ruud [School of Chemical and Biomolecular Engineering, Building J01, University of Sydney, NSW 2006 (Australia); Cohen, Brett [Department of Chemical Engineering, University of Cape Town, Rondebosch (South Africa); Petrie, Jim [School of Chemical and Biomolecular Engineering, Building J01, University of Sydney, NSW 2006 (Australia); Department of Chemical Engineering, University of Cape Town, Rondebosch (South Africa)

    2008-08-15

    This paper explores a new approach to planning and optimization of energy networks, using a mix of global optimization and agent-based modeling tools. This approach takes account of techno-economic, environmental and social criteria, and engages explicitly with inherent network complexity in terms of the autonomous decision-making capability of individual agents within the network, who may choose not to act as economic rationalists. This is an important consideration from the standpoint of meeting sustainable development goals. The approach attempts to set targets for energy planning, by determining preferred network development pathways through multi-objective optimization. The viability of such plans is then explored through agent-based models. The combined approach is demonstrated for a case study of regional electricity generation in South Africa, with biomass as feedstock. (author)

  8. A Clinical Decision Rule to Identify Emergency Department Patients at Low Risk for Acute Coronary Syndrome Who Do Not Need Objective Coronary Artery Disease Testing: The No Objective Testing Rule.

    Science.gov (United States)

    Greenslade, Jaimi H; Parsonage, William; Than, Martin; Scott, Adam; Aldous, Sally; Pickering, John W; Hammett, Christopher J; Cullen, Louise

    2016-04-01

    We derive a clinical decision rule for ongoing investigation of patients who present to the emergency department (ED) with chest pain. The rule identifies patients who are at low risk of acute coronary syndrome and could be discharged without further cardiac testing. This was a prospective observational study of 2,396 patients who presented to 2 EDs with chest pain suggestive of acute coronary syndrome and had normal troponin and ECG results 2 hours after presentation. Research nurses collected clinical data on presentation, and the primary endpoint was diagnosis of acute coronary syndrome within 30 days of presentation to the ED. Logistic regression analyses were conducted on 50 bootstrapped samples to identify predictors of acute coronary syndrome. A rule was derived and diagnostic accuracy statistics were computed. Acute coronary syndrome was diagnosed in 126 (5.3%) patients. Regression analyses identified the following predictors of acute coronary syndrome: cardiac risk factors, age, sex, previous myocardial infarction, or coronary artery disease and nitrate use. A rule was derived that identified 753 low-risk patients (31.4%), with sensitivity 97.6% (95% confidence interval [CI] 93.2% to 99.5%), negative predictive value 99.6% (95% CI 98.8% to 99.9%), specificity 33.0% (95% CI 31.1% to 35.0%), and positive predictive value 7.5% (95% CI 6.3% to 8.9%) for acute coronary syndrome. This was referred to as the no objective testing rule. We have derived a clinical decision rule for chest pain patients with negative early cardiac biomarker and ECG testing results that identifies 31% at low risk and who may not require objective testing for coronary artery disease. A prospective trial is required to confirm these findings. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
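
    The diagnostic accuracy statistics quoted above can be recomputed from a 2x2 table; the counts below are reconstructed approximately from the reported cohort size, prevalence and rule performance, and the Wilson score interval is one common (assumed, not necessarily the authors') choice for the confidence limits.

```python
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

def rule_performance(tp, fn, fp, tn):
    """Sensitivity, specificity, PPV and NPV of a binary decision rule,
    each with a Wilson 95% CI. 'Positive' = rule says objective testing is needed."""
    stats = {
        "sensitivity": (tp, tp + fn),
        "specificity": (tn, tn + fp),
        "ppv": (tp, tp + fp),
        "npv": (tn, tn + fn),
    }
    return {name: (k / n, wilson_ci(k, n)) for name, (k, n) in stats.items()}

# Approximate counts implied by 2,396 patients, 126 ACS cases, 753 rule-negative patients.
for name, (est, (lo, hi)) in rule_performance(tp=123, fn=3, fp=1520, tn=750).items():
    print(f"{name}: {est:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```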

  9. The decision optimization of product development by considering the customer demand saturation

    Directory of Open Access Journals (Sweden)

    Qing-song Xing

    2015-05-01

Full Text Available Purpose: The purpose of this paper is to analyze the impact of over-meeting customer demands on the product development process, on the basis of a quantitative model of customer demands, development cost and time, and then to propose a corresponding product development optimization decision. Design/methodology/approach: First, a survey is conducted to obtain customer demand information, and customer demand weights are quantified using the variation coefficient method. Second, the relationship between customer demands and product development time and cost is analyzed based on quality function deployment, and the corresponding mathematical model is established. On this basis, the concept of customer demand saturation and an optimization decision method for product development are put forward and applied to the notebook development process of a company. Finally, when customer demand is saturated, the consistency between the customer demands that are satisfied most strongly and those given the highest attention, and the stability of customer demand saturation under different parameters, are verified. Findings: Development cost and time rise sharply when customer demands are over-met. By considering customer demand saturation, the relationship between customer demand and development time and cost is quantified and balanced. The sequence in which customer demands are met is also basically consistent with the customer demand survey results. Originality/value: The paper proposes a model of customer demand saturation and demonstrates the correctness and effectiveness of the product development decision method.

  10. Application of a rule extraction algorithm family based on the Re-RX algorithm to financial credit risk assessment from a Pareto optimal perspective

    Directory of Open Access Journals (Sweden)

    Yoichi Hayashi

    2016-01-01

Full Text Available Historically, the assessment of credit risk has proved to be both highly important and extremely difficult. Currently, financial institutions rely on the use of computer-generated credit scores for risk assessment. However, automated risk evaluations are currently imperfect, and the loss of vast amounts of capital could be prevented by improving the performance of computerized credit assessments. A number of approaches have been developed for the computation of credit scores over the last several decades, but these methods have been considered too complex and lacking in interpretability, and have therefore not been widely adopted. Therefore, in this study, we provide the first comprehensive comparison of results regarding the assessment of credit risk obtained using 10 runs of 10-fold cross validation of the Re-RX algorithm family, including the Re-RX algorithm, the Re-RX algorithm with both discrete and continuous attributes (Continuous Re-RX), the Re-RX algorithm with J48graft, the Re-RX algorithm with a trained neural network (Sampling Re-RX), NeuroLinear, NeuroLinear+GRG, and three unique rule extraction techniques involving support vector machines and Minerva, on four real-life, two-class mixed credit-risk datasets. We also discuss the roles of various newly extended types of the Re-RX algorithm and high-performance classifiers from a Pareto optimal perspective. Our findings suggest that Continuous Re-RX, Re-RX with J48graft, and Sampling Re-RX comprise a powerful management tool that allows the creation of advanced, accurate, concise and interpretable decision support systems for credit risk evaluation. In addition, from a Pareto optimal perspective, the Re-RX algorithm family has superior features in relation to the comprehensibility of extracted rules and the potential for credit scoring with Big Data.

  11. The 1991 Lead/Copper Drinking Water Rule and the 1995 Decision Not to Revise the Arsenic Drinking Water Rule: Two Case Studies in EPA's Use of Science

    OpenAIRE

    Powell, Mark

    1996-01-01

    This paper discusses EPA's acquisition and use of science in two decisions under the Safe Drinking Water Act: the 1991 revision of the lead drinking water regulations and the 1995 decision to pursue additional research instead of revising the arsenic in drinking water standard. In the first case, a committed band of policy entrepreneurs within EPA mobilized and supplemented scientific information which had accumulated in the agency's air program to force lead in drinking water up the agency's...

  12. Optimal Breast Biopsy Decision-Making Based on Mammographic Features and Demographic Factors.

    Science.gov (United States)

    Chhatwal, Jagpreet; Alagoz, Oguzhan; Burnside, Elizabeth S

    2010-11-01

    Breast cancer is the most common non-skin cancer affecting women in the United States, where every year more than 20 million mammograms are performed. Breast biopsy is commonly performed on the suspicious findings on mammograms to confirm the presence of cancer. Currently, 700,000 biopsies are performed annually in the U.S.; 55%-85% of these biopsies ultimately are found to be benign breast lesions, resulting in unnecessary treatments, patient anxiety, and expenditures. This paper addresses the decision problem faced by radiologists: When should a woman be sent for biopsy based on her mammographic features and demographic factors? This problem is formulated as a finite-horizon discrete-time Markov decision process. The optimal policy of our model shows that the decision to biopsy should take the age of patient into account; particularly, an older patient's risk threshold for biopsy should be higher than that of a younger patient. When applied to the clinical data, our model outperforms radiologists in the biopsy decision-making problem. This study also derives structural properties of the model, including sufficiency conditions that ensure the existence of a control-limit type policy and nondecreasing control-limits with age.
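
    The abstract frames biopsy referral as a finite-horizon discrete-time Markov decision process; the sketch below shows backward induction on a toy risk-state chain with hypothetical transition probabilities and disutilities (not the paper's calibrated model), and typically yields the control-limit structure the authors prove.

```python
import numpy as np

# Toy model: risk states 0 (low) .. 3 (high); actions: 0 = wait, 1 = biopsy.
T = 5                                         # decision epochs (e.g., successive exams)
P_wait = np.array([                           # hypothetical risk-progression matrix
    [0.90, 0.08, 0.02, 0.00],
    [0.05, 0.80, 0.12, 0.03],
    [0.00, 0.05, 0.80, 0.15],
    [0.00, 0.00, 0.10, 0.90],
])
r_wait = np.array([0.0, -0.5, -2.0, -6.0])    # expected disutility of delaying in each state
r_biopsy = np.full(4, -3.0)                   # fixed biopsy disutility (decision process ends)

V = np.zeros(4)                               # terminal value
policy = np.zeros((T, 4), dtype=int)
for t in reversed(range(T)):                  # backward induction
    q_wait = r_wait + P_wait @ V
    q_biopsy = r_biopsy
    policy[t] = (q_biopsy > q_wait).astype(int)
    V = np.maximum(q_wait, q_biopsy)

# Typically a control-limit structure: biopsy once the risk state exceeds an
# epoch-dependent threshold (the threshold analogue of the paper's age-dependent policy).
print(policy)
```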

  13. Surface stability and the selection rules of substrate orientation for optimal growth of epitaxial II-VI semiconductors

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Wan-Jian [National Renewable Energy Laboratory, Golden, Colorado 80401 (United States); Department of Physics & Astronomy, and Wright Center for Photovoltaics Innovation and Commercialization, The University of Toledo, Toledo, Ohio 43606 (United States); Yang, Ji-Hui; Zaunbrecher, Katherine; Gessert, Tim; Barnes, Teresa; Wei, Su-Huai, E-mail: Suhuai.Wei@nrel.gov [National Renewable Energy Laboratory, Golden, Colorado 80401 (United States); Yan, Yanfa [Department of Physics & Astronomy, and Wright Center for Photovoltaics Innovation and Commercialization, The University of Toledo, Toledo, Ohio 43606 (United States)

    2015-10-05

The surface structures of ionic zinc-blende CdTe (001), (110), (111), and (211) surfaces are systematically studied by first-principles density functional calculations. Based on the surface structures and surface energies, we identify the detrimental twinning appearing in molecular beam epitaxy (MBE) growth of II-VI compounds as the (111) lamellar twin boundaries. To avoid the appearance of twinning in MBE growth, we propose the following selection rules for choosing optimal substrate orientations: (1) the surface should be nonpolar so that there are no large surface reconstructions that could act as a nucleation center and promote the formation of twins; (2) the surface structure should have low symmetry so that there are no multiple equivalent directions for growth. These straightforward rules, consistent with experimental observations, provide guidelines for selecting proper substrates for high-quality MBE growth of II-VI compounds.

  14. Optimization of urban water supply portfolios combining infrastructure capacity expansion and water use decisions

    Science.gov (United States)

    Medellin-Azuara, J.; Fraga, C. C. S.; Marques, G.; Mendes, C. A.

    2015-12-01

The expansion and operation of urban water supply systems under rapidly growing demands, hydrologic uncertainty, and scarce water supplies require a strategic combination of supply sources for added reliability, reduced costs and improved operational flexibility. The design and operation of such a portfolio of water supply sources entail decisions on what and when to expand, and how much of each available source to use, accounting for interest rates, economies of scale and hydrologic variability. The present research provides a framework and an integrated methodology that optimizes the expansion of various water supply alternatives using dynamic programming, combining short-term and long-term optimization of water use with simulation of water allocation. A case study in Bahia Do Rio Dos Sinos in Southern Brazil is presented. The framework couples a quadratic programming optimization model in GAMS with WEAP, a rainfall-runoff simulation model that hosts the water supply infrastructure features and hydrologic conditions. Results allow (a) identification of trade-offs between cost and reliability of different expansion paths and water use decisions and (b) evaluation of potential gains from reducing water system losses as a portfolio component. The latter is critical in several developing countries where water supply system losses are high and often neglected in favor of more system expansion. Results also highlight the potential of various water supply alternatives including conservation, groundwater, and infrastructural enhancements over time. The framework proves its usefulness for planning and its transferability to similarly urbanized systems.

  15. An Optimal Hierarchical Decision Model for a Regional Logistics Network with Environmental Impact Consideration

    Directory of Open Access Journals (Sweden)

    Dezhi Zhang

    2014-01-01

    Full Text Available This paper proposes a new model of simultaneous optimization of three-level logistics decisions, for logistics authorities, logistics operators, and logistics users, for regional logistics network with environmental impact consideration. The proposed model addresses the interaction among the three logistics players in a complete competitive logistics service market with CO2 emission charges. We also explicitly incorporate the impacts of the scale economics of the logistics park and the logistics users’ demand elasticity into the model. The logistics authorities aim to maximize the total social welfare of the system, considering the demand of green logistics development by two different methods: optimal location of logistics nodes and charging a CO2 emission tax. Logistics operators are assumed to compete with logistics service fare and frequency, while logistics users minimize their own perceived logistics disutility given logistics operators’ service fare and frequency. A heuristic algorithm based on the multinomial logit model is presented for the three-level decision model, and a numerical example is given to illustrate the above optimal model and its algorithm. The proposed model provides a useful tool for modeling competitive logistics services and evaluating logistics policies at the strategic level.

  16. An optimal hierarchical decision model for a regional logistics network with environmental impact consideration.

    Science.gov (United States)

    Zhang, Dezhi; Li, Shuangyan; Qin, Jin

    2014-01-01

    This paper proposes a new model of simultaneous optimization of three-level logistics decisions, for logistics authorities, logistics operators, and logistics users, for regional logistics network with environmental impact consideration. The proposed model addresses the interaction among the three logistics players in a complete competitive logistics service market with CO2 emission charges. We also explicitly incorporate the impacts of the scale economics of the logistics park and the logistics users' demand elasticity into the model. The logistics authorities aim to maximize the total social welfare of the system, considering the demand of green logistics development by two different methods: optimal location of logistics nodes and charging a CO2 emission tax. Logistics operators are assumed to compete with logistics service fare and frequency, while logistics users minimize their own perceived logistics disutility given logistics operators' service fare and frequency. A heuristic algorithm based on the multinomial logit model is presented for the three-level decision model, and a numerical example is given to illustrate the above optimal model and its algorithm. The proposed model provides a useful tool for modeling competitive logistics services and evaluating logistics policies at the strategic level.

  17. An Optimal Hierarchical Decision Model for a Regional Logistics Network with Environmental Impact Consideration

    Science.gov (United States)

    Zhang, Dezhi; Li, Shuangyan

    2014-01-01

    This paper proposes a new model of simultaneous optimization of three-level logistics decisions, for logistics authorities, logistics operators, and logistics users, for regional logistics network with environmental impact consideration. The proposed model addresses the interaction among the three logistics players in a complete competitive logistics service market with CO2 emission charges. We also explicitly incorporate the impacts of the scale economics of the logistics park and the logistics users' demand elasticity into the model. The logistics authorities aim to maximize the total social welfare of the system, considering the demand of green logistics development by two different methods: optimal location of logistics nodes and charging a CO2 emission tax. Logistics operators are assumed to compete with logistics service fare and frequency, while logistics users minimize their own perceived logistics disutility given logistics operators' service fare and frequency. A heuristic algorithm based on the multinomial logit model is presented for the three-level decision model, and a numerical example is given to illustrate the above optimal model and its algorithm. The proposed model provides a useful tool for modeling competitive logistics services and evaluating logistics policies at the strategic level. PMID:24977209
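
    At the lowest level of this model, logistics users are assigned to competing operators through a multinomial logit model of perceived disutility; a minimal sketch with hypothetical fare and frequency coefficients (not the paper's calibrated values) is shown below.

```python
import numpy as np

def logit_shares(fares, frequencies, beta_fare=0.05, beta_freq=0.8, theta=1.0):
    """Multinomial-logit market shares of competing logistics operators.
    Utility falls with fare and rises with service frequency (hypothetical
    coefficients); theta scales the dispersion of user perceptions."""
    utility = -beta_fare * np.asarray(fares) + beta_freq * np.log(np.asarray(frequencies))
    expu = np.exp(theta * (utility - utility.max()))   # subtract max for numerical stability
    return expu / expu.sum()

# Three operators with different fares and weekly service frequencies.
print(logit_shares(fares=[100, 120, 90], frequencies=[10, 14, 6]))
```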

  18. Knowledge-based systems as decision support tools in an ecosystem approach to fisheries: Comparing a fuzzy-logic and rule-based approach

    DEFF Research Database (Denmark)

    Jarre, Astrid; Paterson, B.; Moloney, C.L.

    2008-01-01

    In an ecosystem approach to fisheries (EAF), management must draw on information of widely different types, and information addressing various scales. Knowledge-based systems assist in the decision-making process by summarising this information in a logical, transparent and reproducible way. Both...... decision support tools in our evaluation of the two approaches. With respect to the model objectives, no method clearly outperformed the other. The advantages of numerically processing continuous variables, and interpreting the final output. as in fuzzy-logic models, can be weighed up against...... the advantages of using a few, qualitative, easy-to-understand categories as in rule-based models. The natural language used in rule-based implementations is easily understood by, and communicated among, users of these systems. Users unfamiliar with fuzzy-set theory must "trust" the logic of the model. Graphical...

  19. Critical appraisal of clinical prediction rules that aim to optimize treatment selection for musculoskeletal conditions

    NARCIS (Netherlands)

    T.R. Stanton (Tasha); M.J. Hancock (Mark J.); C. Maher (Chris); B.W. Koes (Bart)

    2010-01-01

    textabstractBackground. Clinical prediction rules (CPRs) for treatment selection in musculoskeletal conditions have become increasingly popular. Purpose. The purposes of this review are: (1) to critically appraise studies evaluating CPRs and (2) to consider the clinical utility and stage of developm

  20. Task Analysis of the UH-60 Mission and Decision Rules for Developing a UH-60 Workload Prediction Model. Volume 1. Summary Report

    Science.gov (United States)

    1989-02-01

    as a baseline for all proposed model changes or other proposed multistage improvement program ( MSIP ). A computer model of this analysis was used to...support in the coordination of activities with F Company. The authors wish to thank Ms . Cassandra Hocutt, Anacapa Sciences, Inc., for her assistance in...develop smooth-flowing function and segment decision rules. Her assistance is greatly appreciated. The authors also wish to thank Ms . Nadine McCollim

  1. Optimal decisions and comparison of VMI and CPFR under price-sensitive uncertain demand

    Directory of Open Access Journals (Sweden)

    Yasaman Kazemi

    2013-06-01

    Full Text Available Purpose: The purpose of this study is to compare the performance of two advanced supply chain coordination mechanisms, Vendor Managed Inventory (VMI and Collaborative Planning Forecasting and Replenishment (CPFR, under a price-sensitive uncertain demand environment, and to make the optimal decisions on retail price and order quantity for both mechanisms. Design/ methodology/ approach: Analytical models are first applied to formulate a profit maximization problem; furthermore, by applying simulation optimization solution procedures, the optimal decisions and performance comparisons are accomplished. Findings: The results of the case study supported the widely held view that more advanced coordination mechanisms yield greater supply chain profit than less advanced ones. Information sharing does not only increase the supply chain profit, but also is required for the coordination mechanisms to achieve improved performance. Research limitations/implications: This study considers a single vendor and a single retailer in order to simplify the supply chain structure for modeling. Practical implications: Knowledge obtained from this study about the conditions appropriate for each specific coordination mechanism and the exact functions of coordination programs is critical to managerial decisions for industry practitioners who may apply the coordination mechanisms considered. Originality/value: This study includes the production cost in Economic Order Quantity (EOQ equations and combines it with price-sensitive demand under stochastic settings while comparing VMI and CPFR supply chain mechanisms and maximizing the total profit. Although many studies have worked on information sharing within the supply chain, determining the performance measures when the demand is price-sensitive and stochastic was not reported by researchers in the past literature.

  2. Optimal classifier feedback improves cost-benefit but not base-rate decision criterion learning in perceptual categorization.

    Science.gov (United States)

    Maddox, W Todd; Bohil, Corey J

    2005-03-01

Unequal payoffs engender separate reward- and accuracy-maximizing decision criteria; unequal base rates do not. When payoffs are unequal, observers place greater emphasis on accuracy than is optimal. This study compares objective classifier feedback (the objectively correct response) with optimal classifier feedback (the optimal classifier's response) when payoffs or base rates are unequal. It provides a critical test of Maddox and Bohil's (1998) competition between reward and accuracy maximization (COBRA) hypothesis, comparing it with a competition between reward and probability matching (COBRM) and a competition between reward and equal response frequencies (COBRE) hypothesis. The COBRA prediction that optimal classifier feedback leads to better decision criterion learning relative to objective classifier feedback when payoffs are unequal, but not when base rates are unequal, was supported. Model-based analyses suggested that the weight placed on accuracy was reduced for optimal classifier feedback relative to objective classifier feedback. In addition, delayed feedback affected learning of the reward-maximizing decision criterion.
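
    The separation of reward- and accuracy-maximizing criteria under unequal payoffs can be illustrated numerically; the sketch below uses two unit-variance Gaussian categories and toy payoff values, not the experimental parameters.

```python
import numpy as np
from scipy.stats import norm

# Two equally likely categories A and B with unit-variance Gaussian evidence.
mu_a, mu_b, p_a = 0.0, 1.0, 0.5
v_a, v_b = 2.0, 1.0          # points for a correct A / correct B response (unequal payoffs)

criteria = np.linspace(-2.0, 3.0, 5001)          # respond "A" below the criterion, "B" above
p_correct_a = norm.cdf(criteria, mu_a)           # P(respond A | A)
p_correct_b = 1.0 - norm.cdf(criteria, mu_b)     # P(respond B | B)

accuracy = p_a * p_correct_a + (1 - p_a) * p_correct_b
reward = p_a * v_a * p_correct_a + (1 - p_a) * v_b * p_correct_b

print("accuracy-maximizing criterion:", round(criteria[np.argmax(accuracy)], 3))  # ~0.5
print("reward-maximizing criterion:  ", round(criteria[np.argmax(reward)], 3))    # ~1.19, favours the high-payoff response
```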

  3. A Modified Bird-Mating Optimization with Hill-Climbing for Connection Decisions of Transformers

    Directory of Open Access Journals (Sweden)

    Ting-Chia Ou

    2016-08-01

Full Text Available This paper endeavors to apply a hybrid bird-mating optimization approach to the connection decisions of distribution transformers. It is expected that, with the aid of the hybrid bird-mating approach, voltage imbalance and deviation can be mitigated, ensuring a satisfactory power supply more effectively. To evaluate the effectiveness of this method, it has been tested on practical distribution systems and compared with other methods. The test results confirm the feasibility of the approach, serving as a beneficial reference for the improvement of electric power grid operations.

  4. Decision-Making Approach to Selecting Optimal Platform of Service Variants

    Directory of Open Access Journals (Sweden)

    Vladimir Modrak

    2016-01-01

Full Text Available Nowadays, it is anticipated that service-sector companies will be inspired to follow the mass customization trends of the industrial sector. However, services are more abstract than products, and therefore concepts for mass customization in the manufacturing domain cannot be transferred without methodical changes. This paper focuses on the development of a methodological framework to support decisions in the selection of an optimal platform of service variants when compatibility problems between service options occur. The approach is based on the mutual relations between waste and constrained design space entropy. For this purpose, software for the quantification of constrained and waste design space is developed. The practicability of the methodology is presented on a realistic case.

  5. Application of an Optimized Entropy-TOPSIS Multicriteria Decision Making Model to Facilities Management

    Institute of Scientific and Technical Information of China (English)

    WANG Zhao-hong; ZHAN Wei; QIU Wan-hua

    2006-01-01

Performance evaluation of facilities management plays a key role in the facilities management process. This paper proposes an optimized multicriteria decision-making model to evaluate the performance of facilities management in schools in Hong Kong. In this model, entropy weights act as weight coefficients for the evaluated criteria in order to avoid the uncertainty and randomness of subjective judgments. In addition, the TOPSIS method is incorporated into the model. The model was employed to evaluate the performance of facilities management in classrooms, offices and laboratories, and satisfactory results were obtained. Moreover, the findings indicated that one of the schools could be rehabilitated rather than removed.
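
    A compact sketch of the entropy-weight plus TOPSIS combination described above, assuming all criteria are benefit-type (higher is better) and using hypothetical facility scores:

```python
import numpy as np

def entropy_topsis(X):
    """Rank alternatives (rows) on benefit criteria (columns) using
    entropy-derived objective weights and TOPSIS closeness coefficients."""
    X = np.asarray(X, dtype=float)
    m, n = X.shape
    # Entropy weights: criteria with more dispersed values carry more weight.
    P = X / X.sum(axis=0)
    E = -(P * np.log(P, where=P > 0, out=np.zeros_like(P))).sum(axis=0) / np.log(m)
    w = (1 - E) / (1 - E).sum()
    # TOPSIS: weighted vector-normalised matrix, distances to ideal / anti-ideal points.
    V = w * X / np.sqrt((X ** 2).sum(axis=0))
    ideal, anti = V.max(axis=0), V.min(axis=0)
    d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_minus / (d_plus + d_minus)          # closeness coefficient: higher = better

# Hypothetical facility-management scores (rows: classroom, office, laboratory).
scores = entropy_topsis([[0.80, 0.70, 0.90],
                         [0.60, 0.85, 0.75],
                         [0.90, 0.60, 0.65]])
print(scores, scores.argsort()[::-1])            # ranking by closeness coefficient
```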

  6. Optimal Manpower Recruitment and Dismissal Decision for Single-type Job

    Institute of Scientific and Technical Information of China (English)

    XudongLi; FengshengTu; YongjianLi; XiaoqiangCai

    2004-01-01

We consider a manpower planning problem with a single employee type over a long planning horizon and analyze the optimal recruitment and dismissal policies. Dynamic demands for manpower must be fulfilled by allocating a sufficient number of employees. Costs for every employee include salary, recruitment and dismissal costs and, in particular, setup costs when recruitment/dismissal activities occur. We formulate the problem as a multi-period decision model, analyze its properties, and give an improved dynamic programming algorithm to minimize the total cost over the entire planning horizon. Computational results are reported to illustrate the effectiveness of the approach.

  7. Impact of Uncertainty in SWAT Model Simulations on Consequent Decisions on Optimal Crop Management Practices

    Science.gov (United States)

    Krishnan, N.; Sudheer, K. P.; Raj, C.; Chaubey, I.

    2015-12-01

The diminishing quantities of non-renewable forms of energy have caused an increasing interest in renewable sources of energy, such as biofuel, in recent years. However, the demand for biofuel has created a concern for allocating grain between the fuel and food industries. Consequently, appropriate regulations that limit grain-based ethanol production have been developed and put into practice, which has resulted in the cultivation of perennial grasses like switchgrass and Miscanthus to meet the additional cellulose demand. A change in cropping and management practice, therefore, is essential to cater to the conflicting requirements for food and biofuel, which has a long-term impact on downstream water quality. It is therefore essential to implement optimal cropping practices to reduce the pollutant loadings. Simulation models in conjunction with optimization procedures are useful in developing efficient cropping practices in such situations. One such model is the Soil and Water Assessment Tool (SWAT), which can simulate both the water and the nutrient cycle, as well as quantify long-term impacts of changes in management practice in the watershed. It is envisaged that the SWAT model, along with an optimization algorithm, can be used to identify the optimal cropping pattern that achieves the minimum guaranteed grain production with less downstream pollution, while maximizing the biomass production for biofuel generation. However, the SWAT simulations do have a certain level of uncertainty that needs to be accounted for before making decisions. Therefore, the objectives of this study are twofold: (i) to understand how model uncertainties influence decision-making, and (ii) to develop appropriate management scenarios that account for the uncertainty. The simulation uncertainty of the SWAT model is assessed using the Shuffled Complex Evolutionary Metropolis Algorithm (SCEM). With the data collected from St. Joseph basin, IN, USA, the preliminary results indicate that model

  8. Optimal Financing Order Decisions of a Supply Chain under the Retailer's Delayed Payment

    Directory of Open Access Journals (Sweden)

    Honglin Yang

    2014-01-01

Full Text Available In a real supply chain, a capital-constrained retailer has two typical payment choices: up-front payment to receive a discounted price, or delayed payment to reduce capital pressure. We compare the efficiency of the optimal decisions of the different participants, that is, the supplier, the retailer, and the bank, under both types of payment based on a game equilibrium analysis. The analysis shows that, in equilibrium, delayed payment leads to a greater optimal order quantity from the retailer than up-front payment and thus improves the overall benefit of the supply chain. A numerical simulation for random demand following a uniform distribution further verifies our findings. This study provides novel evidence that a dominant supplier who actively offers trade credit helps enhance the overall efficiency of a supply chain.
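
    A stylized numerical sketch (not the paper's game-equilibrium model) of why a lower effective unit cost under delayed payment raises the retailer's optimal order quantity when demand is uniform; the prices, interest rate and demand range are hypothetical, and the retail price is held fixed for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)
demand = rng.uniform(200, 800, size=50_000)      # uniform demand draws

def expected_profit(q, unit_cost, price=10.0):
    """Expected newsvendor-style profit of ordering q units at the given unit cost."""
    sales = np.minimum(demand, q)
    return float((price * sales - unit_cost * q).mean())

def best_order(unit_cost):
    grid = np.arange(200, 801, 5)
    profits = [expected_profit(q, unit_cost) for q in grid]
    i = int(np.argmax(profits))
    return grid[i], profits[i]

# Up-front payment: discounted wholesale price, but the capital-constrained retailer
# borrows from the bank at rate r. Delayed payment: higher wholesale price, no loan.
w_upfront, w_delayed, r = 6.0, 6.5, 0.15
for label, cost in [("up-front", w_upfront * (1 + r)), ("delayed", w_delayed)]:
    q, pi = best_order(cost)
    print(f"{label:8s} effective unit cost {cost:.2f}: Q* = {q}, expected profit = {pi:.0f}")
```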

  9. A linear programming model to optimize the decision-making to managing cogeneration system

    Energy Technology Data Exchange (ETDEWEB)

    Tibi, Naser A. [United Arab Emirates University, P.O. Box 93330, Dubai (United Arab Emirates); Arman, Husam [University of Nottingham, Nottingham (United Kingdom)

    2007-08-15

A mathematical linear programming (LP) model was developed to optimize decision-making for managing a cogeneration facility as a potential clean-development-mechanism project in a hospital in Palestine. The model optimizes the cost of energy and the cost of installation of a small cogeneration plant under constraints on the electricity and heat supply-and-demand balances. In the model, electricity is sourced either from cogeneration or from the public utility, and the least-cost way of supplying electricity and heat to the hospital is calculated. The hospital uses heat in its operations, which makes the cogeneration application attractive and feasible. In this study, we develop the LP model and present the results and the operating schedule for the cogeneration plant. The developed LP model can be applied to other cogeneration applications with little modification. (orig.)
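
    A minimal single-period sketch of such an LP using scipy.optimize.linprog, with hypothetical prices, CHP efficiencies and demands; the paper's model additionally covers installation costs and multiple periods.

```python
from scipy.optimize import linprog

# Decision variables: x = [grid_electricity_kWh, chp_fuel_kWh] for one period.
c_grid, c_fuel = 0.12, 0.04          # hypothetical unit prices ($/kWh)
eta_e, eta_h = 0.35, 0.45            # assumed CHP electrical / thermal efficiencies
e_demand, h_demand = 500.0, 400.0    # hospital electricity and heat demand (kWh)

# Minimise energy cost subject to meeting both demands (excess heat is dumped).
c = [c_grid, c_fuel]
A_ub = [[-1.0, -eta_e],              # grid + eta_e * fuel >= e_demand
        [0.0, -eta_h]]               # eta_h * fuel        >= h_demand
b_ub = [-e_demand, -h_demand]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])

grid_kwh, fuel_kwh = res.x
print(f"grid purchase: {grid_kwh:.0f} kWh, CHP fuel: {fuel_kwh:.0f} kWh, "
      f"CHP electricity: {eta_e * fuel_kwh:.0f} kWh, cost: ${res.fun:.2f}")
```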

  10. Optimal Rule-Based Power Management for Online, Real-Time Applications in HEVs with Multiple Sources and Objectives: A Review

    Directory of Open Access Journals (Sweden)

    Bedatri Moulik

    2015-08-01

Full Text Available The field of hybrid vehicles has undergone intensive research and development, primarily due to increasing concern over depleting resources and increasing pollution. In order to investigate further options for optimizing the performance of hybrid vehicles with regard to different criteria, such as fuel economy and battery aging, a detailed state-of-the-art review is presented in this contribution. Different power management and optimization techniques are discussed, focusing on rule-based power management and multi-objective optimization techniques. The extent to which rule-based power management and optimization can address battery aging issues is investigated, along with implementation in real-time driving scenarios where no pre-defined drive cycle is followed. The goal of this paper is to illustrate the significance and applications of rule-based power management optimization based on previous contributions.
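
    A toy example of the kind of rule-based supervisory strategy the review surveys — simple if-then rules on battery state of charge and demanded power; the thresholds are illustrative and not taken from any cited work.

```python
def rule_based_power_split(p_demand_kw, soc, soc_low=0.4, soc_high=0.8, p_engine_opt=30.0):
    """Toy rule-based supervisory controller for a parallel HEV.
    Returns (engine_kw, motor_kw); negative motor power means regeneration/recharging."""
    if p_demand_kw < 0:                      # braking: recover energy if the battery has room
        return 0.0, (p_demand_kw if soc < soc_high else 0.0)
    if soc > soc_high and p_demand_kw < p_engine_opt:
        return 0.0, p_demand_kw              # electric-only at light load with a full battery
    if soc < soc_low:
        # Charge-sustaining: the engine covers at least its efficient operating point;
        # any surplus (negative motor power) recharges the battery.
        engine = max(p_demand_kw, p_engine_opt)
        return engine, p_demand_kw - engine
    # Blended mode: engine covers the base load, motor assists with the rest.
    engine = min(p_demand_kw, p_engine_opt)
    return engine, p_demand_kw - engine

for demand, soc in [(-15, 0.6), (10, 0.9), (45, 0.3), (50, 0.6)]:
    print(demand, soc, rule_based_power_split(demand, soc))
```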

  11. Judicious management of uncertain risks : II. Simple rules and more intricate models for precautionary decision-making

    NARCIS (Netherlands)

    Vlek, Charles

    2010-01-01

    Rational decision theory could be more fully exploited for the prudent management of uncertain-risk situations. After an integrative circumscription of the precautionary principle (PP), 10 key issues are discussed covering assessment, decision and control. In view of this, a variety of decision-theo

  12. Experience and net worth affects optimality in a motor decision task.

    Science.gov (United States)

    Neyedli, Heather F; Welsh, Timothy N

    2015-01-01

Previous research has shown that in a motor decision task involving a target (yielding a reward) and an overlapping penalty area (yielding a loss), people initially aim closer to the penalty area than is optimal. This risky strategy may be adopted to increase target hits, thereby increasing net worth. The purpose of the current study was to determine whether the starting net worth level (either 5,000 or 0 points) affected the influence of task experience on endpoint selection. It was hypothesized that the 5,000-point group would adopt a less risky strategy and aim further from the penalty area than those with 0 points. Net worth affected participants' initial endpoint: the 5,000-point group aimed further from the penalty circle, and closer to the optimal endpoint, than the 0-point group. The 0-point group adapted their endpoint over the course of the session to aim closer to the optimal endpoint, whereas no such change was seen in the 5,000-point group. The results show that changing participants' reference point through initial net worth can affect the optimality of participants' endpoints and how endpoints change with experience.

  13. A PMBGA to Optimize the Selection of Rules for Job Shop Scheduling Based on the Giffler-Thompson Algorithm

    Directory of Open Access Journals (Sweden)

    Rui Zhang

    2012-01-01

Full Text Available Most existing research on the job shop scheduling problem has been focused on the minimization of makespan (i.e., the completion time of the last job). However, in the fiercely competitive market nowadays, delivery punctuality is more important for maintaining a high service reputation. So in this paper, we aim at solving job shop scheduling problems with the total weighted tardiness objective. Several dispatching rules are adopted in the Giffler-Thompson algorithm for constructing active schedules. It is noticeable that the rule selections for scheduling consecutive operations are not mutually independent but actually interrelated. Under such circumstances, a probabilistic model-building genetic algorithm (PMBGA) is proposed to optimize the sequence of selected rules. First, we use Bayesian networks to model the distribution characteristics of high-quality solutions in the population. Then, the new generation of individuals is produced by sampling the established Bayesian network. Finally, some elitist individuals are further improved by a special local search module based on parameter perturbation. The superiority of the proposed approach is verified by extensive computational experiments and comparisons.

  14. A Class of Optimization Method for Bilevel Multi-objective Decision Making Problem with the Help of Satisfactoriness

    Institute of Scientific and Technical Information of China (English)

    LI Tong; TENG Chun-xian; LI Hao-bai

    2002-01-01

This paper discusses how to transform the multi-person bilevel multi-objective decision-making problem into an equivalent generalized multi-objective decision-making problem by using the Kuhn-Tucker necessary and sufficient conditions. The decision maker's aspirations are then embodied by reducing it to a single-objective decision-making problem with the help of the ε-constraint method, and the global optimal solution is obtained by means of a simulated annealing algorithm.

  15. Simulation of Optimal Decision-Making Under the Impacts of Climate Change.

    Science.gov (United States)

    Møller, Lea Ravnkilde; Drews, Martin; Larsen, Morten Andreas Dahl

    2017-04-03

Climate change transforms the conditions of existing agricultural practices, requiring farmers to continuously evaluate their agricultural strategies, e.g., towards optimising revenue. In this light, this paper presents a framework for applying Bayesian updating to simulate decision-making, reaction patterns and the updating of beliefs among farmers in a developing country when faced with the complexity of adapting agricultural systems to climate change. We apply the approach to a case study from Ghana, where farmers seek to decide on the most profitable of three agricultural systems (dryland crops, irrigated crops and livestock) by continuously updating beliefs relative to realised trajectories of climate (change), represented by projections of temperature and precipitation. The climate data are based on combinations of output from three global/regional climate model combinations and two future scenarios (RCP4.5 and RCP8.5), representing moderate and unsubstantial greenhouse gas reduction policies, respectively. The results indicate that the climate scenario (input) holds a significant influence on the development of beliefs, net revenues and thereby optimal farming practices. Further, despite uncertainties in the underlying net revenue functions, the study shows that when the beliefs of the farmer (decision-maker) oppose the development of the realised climate, the Bayesian methodology allows for simulating an adjustment of such beliefs when improved information becomes available. The framework can therefore help facilitate the optimal choice between agricultural systems considering the influence of climate change.
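
    A minimal sketch of the Bayesian updating loop described above, with a discrete set of warming-trend hypotheses, a normal observation likelihood and hypothetical revenue tables; the paper's climate projections and revenue functions are of course richer.

```python
import numpy as np
from scipy.stats import norm

# Hypotheses about the decadal warming trend (deg C / decade) and prior beliefs.
trends = np.array([0.1, 0.3, 0.5])
belief = np.array([1 / 3, 1 / 3, 1 / 3])

# Hypothetical expected annual net revenue of each system under each trend (same order as `trends`).
revenue = {
    "dryland":   np.array([120.0,  90.0,  50.0]),
    "irrigated": np.array([100.0, 110.0, 105.0]),
    "livestock": np.array([ 80.0,  95.0, 115.0]),
}

def update(belief, observed_anomaly, obs_sd=0.15):
    """Bayes update of the trend beliefs from one observed decadal temperature anomaly."""
    like = norm.pdf(observed_anomaly, loc=trends, scale=obs_sd)
    posterior = belief * like
    return posterior / posterior.sum()

for anomaly in [0.25, 0.42, 0.55]:          # realised anomalies over successive decades
    belief = update(belief, anomaly)
    expected = {k: float(v @ belief) for k, v in revenue.items()}
    print(np.round(belief, 3), "->", max(expected, key=expected.get))
```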

  16. Driver's Behavior and Decision-Making Optimization Model in Mixed Traffic Environment

    Directory of Open Access Journals (Sweden)

    Xiaoyuan Wang

    2015-02-01

Full Text Available Driving is a continuous information-processing procedure, and studying drivers' information-processing patterns in a mixed traffic environment is very important for traffic flow theory. In this paper, the bicycle is regarded as a kind of information source for vehicle drivers, and the “conflict point method” is brought forward to analyze the influence of bicycles on driving behavior. The “conflict” is translated into a special kind of car-following or lane-changing process. Furthermore, the clocked scan step length of the computer simulation is reduced to 0.1 s in order to scan and analyze the dynamic (and static) information that influences driving behavior more exactly. The driver's decision-making process is described through information fusion based on duality contrast and fuzzy optimization theory. Model testing and verification show that the simulation results obtained with the “conflict point method” are basically consistent with the field data. It is therefore feasible to imitate driving behavior and the driver's information fusion process with the proposed methods, and the optimized decision-making process can be described more accurately through the precise clocked-scan strategy. The study in this paper provides a foundation for further research on the multiresource information fusion process of driving behavior.

  17. A decision analysis approach for optimal groundwater monitoring system design under uncertainty

    Directory of Open Access Journals (Sweden)

    N. B. Yenigül

    2006-01-01

Full Text Available Groundwater contamination is the degradation of the natural quality of groundwater as a result of human activity, and landfills are one of the most common human activities threatening groundwater quality. The objective of monitoring systems is to detect contaminant plumes before they reach the regulatory compliance boundary, in order to prevent severe risk to both society and groundwater quality and to enable cost-effective counter-measures in case of a failure. The detection monitoring problem typically has a multi-objective nature. A multi-objective decision model (called MONIDAM), which links a classic decision analysis approach with a stochastic simulation model, is applied to determine the optimal groundwater monitoring system given uncertainties due to the hydrogeological conditions and contaminant source characteristics. A Monte Carlo approach is used to incorporate uncertainties; hydraulic conductivity and the leak location are the random inputs of the simulation model. The design objectives considered in the model are: (1) maximizing the detection probability, (2) minimizing the contaminated area and (3) minimizing the total cost of the monitoring system. The results show that monitoring systems located close to the source are optimal, except for cases with very high unit installation and sampling costs and/or very low unit remediation costs.
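
    A toy Monte Carlo illustration of the detection-probability objective only, with a random leak position and plume spreading standing in for the hydraulic-conductivity uncertainty; the cost and contaminated-area trade-offs that make near-source layouts optimal in the study are not represented here, and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

def detection_probability(wells, n_mc=20_000, source_half_width=50.0, boundary=200.0):
    """Monte Carlo estimate of the probability that at least one monitoring well
    intercepts a contaminant plume before the compliance boundary (toy 2-D model).
    Uncertain inputs: leak position along the landfill edge and plume spreading."""
    y_leak = rng.uniform(-source_half_width, source_half_width, n_mc)
    dispersivity = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n_mc)   # spread per sqrt(distance)
    detected = np.zeros(n_mc, dtype=bool)
    for xw, yw in wells:
        if xw > boundary:
            continue                                   # beyond the compliance boundary
        half_width = dispersivity * np.sqrt(xw)        # plume half-width at the well's distance
        detected |= np.abs(yw - y_leak) <= half_width
    return float(detected.mean())

near_row = [(40.0, y) for y in (-40.0, 0.0, 40.0)]     # wells close to the source
far_row = [(160.0, y) for y in (-40.0, 0.0, 40.0)]     # wells close to the compliance boundary
# A far row sees a wider plume (easier detection) but implies a larger contaminated
# area on detection — the trade-off the full multi-objective model balances.
print("near layout:", detection_probability(near_row))
print("far layout: ", detection_probability(far_row))
```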

  18. Development of a fuzzy optimization model, supporting global warming decision-making

    Energy Technology Data Exchange (ETDEWEB)

    Leimbach, M. [Potsdam Inst. for Climate Impact Research, Potsdam (Germany)

    1996-03-01

An increasing number of models have been developed to support global warming response policies. The model constructors face many uncertainties which limit the evidential value of these models, so the support of climate policy decision-making is only possible in a semi-quantitative way, as offered by a Fuzzy model. The model design is based on an optimization approach, integrated in a bounded-risk decision-making framework. Given some regional emission-related and impact-related restrictions, optimal emission paths can be calculated. The focus is not only on carbon dioxide but on other greenhouse gases too. In the paper, the components of the model are described. Cost coefficients, emission boundaries and impact boundaries are represented as Fuzzy parameters. The Fuzzy model is transformed into a computational one by using an approach of Rommelfanger. In the second part, some problems of applying the model to computations are discussed. This includes discussion of the data situation and the presentation, as well as interpretation, of results of sensitivity analyses. The advantage of the Fuzzy approach is that the requirements regarding data precision are not so strong; hence, the effort for data acquisition can be reduced and computations can be started earlier. 9 figs., 3 tabs., 17 refs., 1 appendix

  19. Integrated reservoir and decision modeling to optimize spacing in unconventional gas reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Turkarslan, G.; McVay, D.A.; Ortiz, R.R. [Texas A and M Univ., College Station, TX (United States); Bickel, J.E.; Montiel, L.V. [Texas Univ., Austin, TX (United States)

    2010-07-01

    Unconventional gas plays are risky and operators must balance the need to conserve capital and protect the environment by avoiding over drilling with the desire to increase profitability. The purpose of this study was to develop technology and tools to help operators determine optimal well spacing in highly uncertain and risky unconventional gas reservoirs as quickly as possible. The paper presented a study that developed an integrated reservoir and decision modeling system that incorporated uncertainty. A Monte Carlo simulation was used to match and predict production performance in unconventional gas reservoirs. Simulation results were integrated with a Bayesian decision model that accounted for the risk facing operators. In order to determine optimal development strategies, these integrated tools were applied to a hypothetical case based on data from Deep Basin tight gas sands in Alberta. The paper provided background information on the Deep Basin Sands and the reservoir model. The Monte Carlo simulation and geostatistical analysis were presented. It was concluded that it is important to incorporate the lessons learned between development stages in unconventional gas reservoirs. 23 refs., 9 tabs., 16 figs.

  20. Value of information methods for planning and analyzing clinical studies optimize decision making and research planning.

    Science.gov (United States)

    Willan, Andrew R; Goeree, Ron; Boutis, Kathy

    2012-08-01

    The results of two randomized clinical trials (RCTs) demonstrate the clinical effectiveness of alternatives to casting for certain ankle and wrist fractures. We illustrate the use of value of information (VOI) methods for evaluating the evidence provided by these studies with respect to decision making. Using cost-effectiveness data from these studies, the expected value of sample information (EVSI) of a future RCT can be determined. If the EVSI exceeds the cost of the future trial for any sample size, then the current evidence is considered insufficient for decision making and a future trial is considered worthwhile. If, on the other hand, there is no sample size for which the EVSI exceeds the cost, then the evidence is considered sufficient, and no future trial is required. We found that the evidence from the ankle study was insufficient to support the adoption of the removable device and determined the optimal sample size for a future trial. Conversely, the evidence from the wrist study was sufficient to support the adoption of the removable device. VOI methods provide a decision-analytic alternative to the standard hypothesis testing approach for assessing the evidence provided by cost-effectiveness studies and for determining sample sizes for RCTs. Copyright © 2012 Elsevier Inc. All rights reserved.
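
    A Monte Carlo sketch of the related expected value of perfect information (EVPI), which upper-bounds the EVSI: if the population-scaled EVPI is already below the trial cost, no sample size can make a future trial worthwhile; otherwise the EVSI would be computed for candidate sample sizes. The incremental-net-benefit distribution, population size and trial cost below are hypothetical, not the values from the ankle or wrist studies.

```python
import numpy as np

rng = np.random.default_rng(3)

# Current evidence about the incremental net benefit (INB, $ per patient) of the
# new treatment versus standard care, summarised as a normal posterior (hypothetical).
inb_mean, inb_sd = 40.0, 60.0
population = 50_000                   # patients affected over the decision horizon
trial_cost = 1_500_000                # fixed cost of a confirmatory RCT

inb_draws = rng.normal(inb_mean, inb_sd, size=200_000)

# Adopt if expected INB > 0; perfect information would avoid the wrong-sided draws.
value_current_decision = max(inb_mean, 0.0)
value_perfect_info = np.maximum(inb_draws, 0.0).mean()
evpi_per_patient = value_perfect_info - value_current_decision
evpi_population = evpi_per_patient * population

print(f"EVPI per patient: ${evpi_per_patient:.2f}")
print(f"Population EVPI:  ${evpi_population:,.0f}")
print("further research potentially worthwhile (compute EVSI per sample size)"
      if evpi_population > trial_cost
      else "current evidence sufficient: no sample size can justify the trial cost")
```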

  1. Decision models for use with criterion-referenced tests

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1980-01-01

The problem of mastery decisions and optimizing cutoff scores on criterion-referenced tests is considered. This problem can be formalized as an (empirical) Bayes problem with decision rules of a monotone shape. Next, the derivation of optimal cutoff scores for threshold, linear, and normal ogive lo

  2. A note on “An alternative multiple attribute decision making methodology for solving optimal facility layout design selection problems”

    Directory of Open Access Journals (Sweden)

    R. Venkata Rao

    2012-04-01

Full Text Available A paper published by Maniya and Bhatt (2011) (“An alternative multiple attribute decision making methodology for solving optimal facility layout design selection problems”, Computers & Industrial Engineering, 61, 542-549) proposed an alternative multiple attribute decision-making method named the “Preference Selection Index (PSI) method” for the selection of an optimal facility layout design. The authors claimed that the method was logical and more appropriate and that it directly gives the optimal solution without assigning relative importance between the facility layout design selection attributes. This note discusses the mathematical validity and the shortcomings of the PSI method.

  3. Evaluation of a Dispatcher's Route Optimization Decision Aid to Avoid Aviation Weather Hazards

    Science.gov (United States)

    Dorneich, Michael C.; Olofinboba, Olu; Pratt, Steve; Osborne, Dannielle; Feyereisen, Thea; Latorella, Kara

    2003-01-01

    This document describes the results and analysis of the formal evaluation plan for the Honeywell software tool developed under the NASA AWIN (Aviation Weather Information) 'Weather Avoidance using Route Optimization as a Decision Aid' project. The software tool aims to provide airline dispatchers with a decision aid for selecting optimal routes that avoid weather and other hazards. This evaluation compares and contrasts route selection performance with the AWIN tool to that of subjects using a more traditional dispatcher environment. The evaluation assesses gains in safety, in fuel efficiency of planned routes, and in time efficiency in the pre-flight dispatch process through the use of the AWIN decision aid. In addition, we are interested in how this AWIN tool affects constructs that can be related to performance. The construct of Situation Awareness (SA), workload, trust in an information system, and operator acceptance are assessed using established scales, where these exist, as well as through the evaluation of questionnaire responses and subject comments. The intention of the experiment is to set up a simulated operations area for the dispatchers to work in. They will be given scenarios in which they are presented with stored company routes for a particular city-pair and aircraft type. A diverse set of external weather information sources is represented by a stand-alone display (MOCK), containing the actual historical weather data typically used by dispatchers. There is also the possibility of presenting selected weather data on the route visualization tool. The company routes have not been modified to avoid the weather except in the case of one additional route generated by the Honeywell prototype flight planning system. The dispatcher will be required to choose the most appropriate and efficient flight plan route in the displayed weather conditions. The route may be modified manually or may be chosen from those automatically displayed.

  4. Optimization of the local government decision-making system from the perspective of New Urbanization

    Institute of Scientific and Technical Information of China (English)

    周仁标

    2015-01-01

    In China's current urbanization process, local governments exhibit, to varying degrees, defects such as biased decision values, an unsound decision-making organizational system, missing mechanisms for public participation, loose decision-making procedures, and non-standardized decision rules and methods. It is therefore necessary to innovate decision-making values, improve the performance evaluation and rotation mechanisms for local officials, strengthen the organizational system for public decision-making, establish systems of decision hearings and channels for reflecting public opinion, standardize decision-making procedures, rules and methods, implement accountability for public decisions, and thereby optimize the public decision-making system of local government.

  5. Are Pediatric Emergency Care Applied Research Network (PECARN) Rules Sufficient for Computed Cranial Tomography Decision in Pediatric Patients with Mild Head Trauma?

    Directory of Open Access Journals (Sweden)

    Hasan Mansur Durgun

    2016-03-01

    Full Text Available Objective: In this study we aimed to investigate the applicability of Pediatric Emergency Care Applied Research Network (PECARN) rules for the decision to perform computed cranial tomography (CCT) in pediatric patients with minor head trauma (MHT). Methods: 317 pediatric patients who underwent CCT for mild head trauma were evaluated retrospectively. The patients were classified into two groups according to PECARN rules (below 2 years old and above 2 years old), and these groups were then classified into two subgroups according to compatibility with PECARN rules. Patients requiring CCT according to PECARN rules were classified as PECARN compatible (PECARN +); patients who underwent CCT without the need for CCT according to PECARN were classified as PECARN incompatible (PECARN -). Results: Approximately 20% of patients in the PECARN (+) group had abnormalities leading to prolonged hospitalization, while only 3.8% of patients in the PECARN (-) group had abnormalities. However, none of the PECARN (-) group patients required follow-up longer than 48 hours in the hospital. The most common symptoms necessitating CCT in the PECARN (+) group were scalp swelling, scalp hematoma and vomiting. In the PECARN (-) group the most common signs were scalp lacerations and dermal abrasions. The incidence of fracture on CCT was significantly higher in the PECARN (+) group. Conclusion: Because CCT entails serious radiation exposure, neurological examination and clinical follow-up should be preferred in the evaluation of children with MHT. In conclusion, PECARN rules were sufficient for the CCT decision in pediatric patients with MHT. J Clin Exp Invest 2016; 7(1): 35-40

  6. A decision aid to rule out pneumonia and reduce unnecessary prescriptions of antibiotics in primary care patients with cough and fever

    Directory of Open Access Journals (Sweden)

    Hunziker Roger

    2011-05-01

    Full Text Available Abstract Background Physicians fear missing cases of pneumonia and treat many patients with signs of respiratory infection unnecessarily with antibiotics. This is an avoidable cause of the increasing worldwide problem of antibiotic resistance. We developed a user-friendly decision aid to rule out pneumonia and thus reduce the rate of needless prescriptions of antibiotics. Methods This was a prospective cohort study in which we enrolled patients older than 18 years with a new or worsened cough and fever without serious co-morbidities. Physicians recorded the results of a standardized medical history and physical examination. C-reactive protein was measured and chest radiographs were obtained. We used Classification and Regression Trees to derive the decision tool. Results A total of 621 consenting eligible patients were studied; 598 were attending a primary care facility, their average age was 48 years, and 50% were male. Radiographic signs of pneumonia were present in 127 (20.5%) of patients. Antibiotics were prescribed to 234 (48.3%) of patients without pneumonia. In patients with C-reactive protein values below 10 μg/ml, or patients presenting with C-reactive protein between 11 and 50 μg/ml but without dyspnoea and daily fever, pneumonia can be ruled out. By applying this rule in clinical practice, antibiotic prescriptions could be reduced by 9.1% (95% confidence interval (CI): 6.4 to 11.8). Conclusions Following validation and confirmation in new patient samples, this tool could help rule out pneumonia and be used to reduce unnecessary antibiotic prescriptions in patients presenting with cough and fever in primary care. The algorithm might be especially useful in those instances where taking a medical history and physical examination alone are inconclusive for ruling out pneumonia.
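
    The derived rule is simple enough to state directly in code. The sketch below is a minimal, hypothetical transcription of the rule as described in the abstract (CRP below 10 μg/ml, or CRP between 11 and 50 μg/ml without dyspnoea and daily fever, rules pneumonia out); the handling of boundary values is an assumption, not taken from the paper.

```python
def pneumonia_ruled_out(crp_ug_ml: float, dyspnoea: bool, daily_fever: bool) -> bool:
    """Return True if pneumonia can be ruled out according to the decision aid
    described in the abstract (a sketch; boundary handling is assumed)."""
    if crp_ug_ml < 10:
        return True
    if 11 <= crp_ug_ml <= 50 and not dyspnoea and not daily_fever:
        return True
    return False

# Example: a patient with CRP 35 ug/ml, no dyspnoea, no daily fever
print(pneumonia_ruled_out(35, dyspnoea=False, daily_fever=False))  # True -> consider withholding antibiotics
```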

  7. A decision support system using analytical hierarchy process (AHP) for the optimal environmental reclamation of an open-pit mine

    Science.gov (United States)

    Bascetin, A.

    2007-04-01

    The selection of an optimal reclamation method is one of the most important factors in open-pit design and production planning. It also affects economic considerations in open-pit design as a function of plan location and depth. Furthermore, the selection is a complex multi-person, multi-criteria decision problem. The group decision-making process can be improved by applying a systematic and logical approach to assess the priorities based on the inputs of several specialists from different functional areas within the mine company. The analytical hierarchy process (AHP) can be very useful in involving several decision makers with different conflicting objectives to arrive at a consensus decision. In this paper, the selection of an optimal reclamation method using an AHP-based model was evaluated for coal production in an open-pit coal mine located in the Seyitomer region in Turkey. The use of the proposed model indicates that it can be applied to improve the group decision making in selecting a reclamation method that satisfies optimal specifications. Also, it is found that the decision process is systematic and using the proposed model can reduce the time taken to select an optimal method.

  8. Optimizing patient treatment decisions in an era of rapid technological advances: the case of hepatitis C treatment.

    Science.gov (United States)

    Liu, Shan; Brandeau, Margaret L; Goldhaber-Fiebert, Jeremy D

    2017-03-01

    How long should a patient with a treatable chronic disease wait for more effective treatments before accepting the best available treatment? We develop a framework to guide optimal treatment decisions for a deteriorating chronic disease when treatment technologies are improving over time. We formulate an optimal stopping problem using a discrete-time, finite-horizon Markov decision process. The goal is to maximize a patient's quality-adjusted life expectancy. We derive structural properties of the model and analytically solve a three-period treatment decision problem. We illustrate the model with the example of treatment for chronic hepatitis C virus (HCV). Chronic HCV affects 3-4 million Americans and has been historically difficult to treat, but increasingly effective treatments have been commercialized in the past few years. We show that the optimal treatment decision is more likely to be to accept currently available treatment-despite expectations for future treatment improvement-for patients who have high-risk history, who are older, or who have more comorbidities. Insights from this study can guide HCV treatment decisions for individual patients. More broadly, our model can guide treatment decisions for curable chronic diseases by finding the optimal treatment policy for individual patients in a heterogeneous population.
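
    A toy version of such a finite-horizon optimal stopping problem can be solved by backward induction. The sketch below is not the authors' model; it uses hypothetical states, probabilities, and QALY values to illustrate the computation: at each period the patient either accepts the currently available treatment or waits, while the disease may worsen and a better treatment may arrive.

```python
import itertools

# Minimal illustrative optimal-stopping MDP (hypothetical numbers, not the paper's model).
T = 3                      # decision periods
health = [0, 1, 2]         # 0 = mild, 2 = severe disease state
eff = [0, 1]               # 0 = only the current drug exists, 1 = improved drug available
P_WORSEN = 0.3             # chance health worsens by one level per period of waiting
P_IMPROVE = 0.4            # chance an improved drug becomes available next period

def qaly_if_treated(h, e):
    """Hypothetical quality-adjusted life expectancy when treating now."""
    return 10.0 - 2.0 * h + 3.0 * e

def qaly_while_waiting(h):
    """Hypothetical one-period QALY accrued while untreated."""
    return 0.8 - 0.2 * h

# Terminal condition: at the horizon the patient takes whatever treatment is available.
V = {(T, h, e): qaly_if_treated(h, e) for h, e in itertools.product(health, eff)}
policy = {}

for t in range(T - 1, -1, -1):
    for h, e in itertools.product(health, eff):
        treat_now = qaly_if_treated(h, e)
        wait = qaly_while_waiting(h)
        # Expected continuation value: health may worsen, a better drug may appear.
        for h2, ph in [(min(h + 1, 2), P_WORSEN), (h, 1 - P_WORSEN)]:
            for e2, pe in [(1, P_IMPROVE if e == 0 else 1.0), (e, (1 - P_IMPROVE) if e == 0 else 0.0)]:
                wait += ph * pe * V[(t + 1, h2, e2)]
        V[(t, h, e)] = max(treat_now, wait)
        policy[(t, h, e)] = "treat" if treat_now >= wait else "wait"

for h in health:  # recommended first-period action when only the current drug exists
    print(f"health state {h}: {policy[(0, h, 0)]}")
```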

  9. Barriers and facilitators to the dissemination of DECISION+, a continuing medical education program for optimizing decisions about antibiotics for acute respiratory infections in primary care: A study protocol

    Directory of Open Access Journals (Sweden)

    Gagnon Marie-Pierre

    2011-01-01

    decision making regarding the use of antibiotics in acute respiratory infections, to facilitate its dissemination in primary care on a large scale. Our results should help continuing medical educators develop a continuing medical education program in shared decision making for other clinically relevant topics. This will help optimize clinical decisions in primary care.

  10. Optimizing Prescription of Chinese Herbal Medicine for Unstable Angina Based on Partially Observable Markov Decision Process

    Directory of Open Access Journals (Sweden)

    Yan Feng

    2013-01-01

    Full Text Available Objective. Initial optimized prescription of Chinese herb medicine for unstable angina (UA). Methods. Based on partially observable Markov decision process model (POMDP), we choose hospitalized patients of 3 syndrome elements, such as qi deficiency, blood stasis, and turbid phlegm for the data mining, analysis, and objective evaluation of the diagnosis and treatment of UA at a deep level in order to optimize the prescription of Chinese herb medicine for UA. Results. The recommended treatment options of UA for qi deficiency, blood stasis, and phlegm syndrome patients were as follows: Milkvetch Root + Tangshen + Indian Bread + Largehead Atractylodes Rhizome (ADR = 0.96630); Danshen Root + Chinese Angelica + Safflower + Red Peony Root + Szechwan Lovage Rhizome Orange Fruit (ADR = 0.76); Snakegourd Fruit + Longstamen Onion Bulb + Pinellia Tuber + Dried Tangerine peel + Largehead Atractylodes Rhizome + Platycodon Root (ADR = 0.658568). Conclusion. This study initially optimized prescriptions for UA based on POMDP, which can be used as a reference for further development of UA prescription in Chinese herb medicine.

  11. Optimizing prescription of Chinese herbal medicine for unstable angina based on partially observable Markov decision process.

    Science.gov (United States)

    Feng, Yan; Qiu, Yu; Zhou, Xuezhong; Wang, Yixin; Xu, Hao; Liu, Baoyan

    2013-01-01

    Objective. Initial optimized prescription of Chinese herb medicine for unstable angina (UA). Methods. Based on partially observable Markov decision process model (POMDP), we choose hospitalized patients of 3 syndrome elements, such as qi deficiency, blood stasis, and turbid phlegm for the data mining, analysis, and objective evaluation of the diagnosis and treatment of UA at a deep level in order to optimize the prescription of Chinese herb medicine for UA. Results. The recommended treatment options of UA for qi deficiency, blood stasis, and phlegm syndrome patients were as follows: Milkvetch Root + Tangshen + Indian Bread + Largehead Atractylodes Rhizome (ADR = 0.96630); Danshen Root + Chinese Angelica + Safflower + Red Peony Root + Szechwan Lovage Rhizome Orange Fruit (ADR = 0.76); Snakegourd Fruit + Longstamen Onion Bulb + Pinellia Tuber + Dried Tangerine peel + Largehead Atractylodes Rhizome + Platycodon Root (ADR = 0.658568). Conclusion. This study initially optimized prescriptions for UA based on POMDP, which can be used as a reference for further development of UA prescription in Chinese herb medicine.
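
    The POMDP machinery underlying this kind of prescription optimization maintains a belief, i.e. a probability distribution over the unobserved syndrome state, that is updated by Bayes' rule after each action and observation. The following generic sketch uses hypothetical transition and observation matrices and is not the authors' model.

```python
import numpy as np

# Hypothetical 3-state POMDP (e.g., qi deficiency, blood stasis, turbid phlegm dominance).
# T[a][s, s'] : transition probabilities under action a; O[a][s', o] : observation likelihoods.
T = {0: np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.1, 0.1, 0.8]])}
O = {0: np.array([[0.7, 0.2, 0.1],
                  [0.2, 0.6, 0.2],
                  [0.1, 0.2, 0.7]])}

def belief_update(belief, action, observation):
    """Standard POMDP Bayes filter: predict with T, weight by O, renormalize."""
    predicted = belief @ T[action]                  # b'(s') = sum_s b(s) T(s, s')
    weighted = predicted * O[action][:, observation]
    return weighted / weighted.sum()

b0 = np.array([1 / 3, 1 / 3, 1 / 3])      # uninformative prior over syndrome elements
b1 = belief_update(b0, action=0, observation=2)
print(np.round(b1, 3))                     # belief shifts toward the state favored by the observation
```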

  12. Intelligent decision support system of operation-optimization in copper smelting converter

    Institute of Scientific and Technical Information of China (English)

    姚俊峰; 梅炽; 彭小奇; 周安梁; 吴冬华

    2002-01-01

    An artificial intelligence technique was applied, in copper smelting converters, to the optimization of flux-adding and air-blasting systems, the display of on-line parameters, and the forecasting of slag mass and composition in the slagging period, as well as to the optimization of cold material-adding and air-blasting systems, the display of on-line parameters, and the forecasting of copper mass in the copper blow period. These were integrated to build the Intelligent Decision Support System of the Operation-Optimization of Copper Smelting Converter (IDSSOOCSC), which is self-learning and self-adapting. The development steps, monoblock structure and basic functions of the IDSSOOCSC are introduced. After the IDSSOOCSC had been running in a copper smelting converter for 4 months, every production quota was clearly improved: blister copper productivity increased by 6%, the processing load of cold input increased by 8%, and the average converter life-span improved from 213 to 235 furnace times.

  13. Parallel Machine Scheduling (PMS) in Manufacturing Systems Using the Ant Colonies Optimization Algorithmic Rule

    Science.gov (United States)

    Senthiil, P. V.; Selladurai, V.; Rajesh, R.

    This study introduces a new approach for decentralized scheduling in a parallel machine environment based on the ant colonies optimization algorithm. The algorithm extends the use of the traveling salesman problem for scheduling in one single machine, to a multiple machine problem. The results are presented using simple and illustrative examples and show that the algorithm is able to optimize the different scheduling problems. Using the same parameters, the completion time of the tasks is minimized and the processing time of the parallel machines is balanced.
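
    A bare-bones version of the idea (not the authors' exact algorithm) can be sketched as follows: each ant assigns jobs to machines with probability proportional to pheromone, the makespan of every assignment is evaluated, and pheromone is reinforced along the best assignment found so far. The processing times and parameters below are hypothetical.

```python
import random

random.seed(0)
jobs = [4, 7, 2, 9, 5, 6, 3, 8]      # hypothetical processing times
n_machines, n_ants, n_iters = 3, 10, 50
evaporation, q = 0.1, 1.0

# pheromone[j][m]: desirability of putting job j on machine m
pheromone = [[1.0] * n_machines for _ in jobs]

def build_assignment():
    """One ant assigns each job to a machine with probability proportional to pheromone."""
    assignment = []
    for j in range(len(jobs)):
        total = sum(pheromone[j])
        r, acc = random.uniform(0, total), 0.0
        for m in range(n_machines):
            acc += pheromone[j][m]
            if r <= acc:
                assignment.append(m)
                break
    return assignment

def makespan(assignment):
    loads = [0] * n_machines
    for j, m in enumerate(assignment):
        loads[m] += jobs[j]
    return max(loads)

best, best_span = None, float("inf")
for _ in range(n_iters):
    ants = [build_assignment() for _ in range(n_ants)]
    iter_best = min(ants, key=makespan)
    if makespan(iter_best) < best_span:
        best, best_span = iter_best, makespan(iter_best)
    # Evaporate, then deposit pheromone along the best assignment found so far.
    for j in range(len(jobs)):
        for m in range(n_machines):
            pheromone[j][m] *= (1 - evaporation)
        pheromone[j][best[j]] += q / best_span

print(best, best_span)   # balanced machine loads minimize the completion time
```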

  14. Prediction of high-grade vesicoureteral reflux after pediatric urinary tract infection: external validation study of procalcitonin-based decision rule.

    Directory of Open Access Journals (Sweden)

    Sandrine Leroy

    Full Text Available BACKGROUND: Predicting vesico-ureteral reflux (VUR) ≥3 at the time of the first urinary tract infection (UTI) would make it possible to restrict cystography to high-risk children. We previously derived the following clinical decision rule for that purpose: cystography should be performed in cases with ureteral dilatation and a serum procalcitonin level ≥0.17 ng/mL, or without ureteral dilatation when the serum procalcitonin level is ≥0.63 ng/mL. The rule yielded an 86% sensitivity with a 46% specificity. We aimed to test its reproducibility. STUDY DESIGN: A secondary analysis of prospective series of children with a first UTI. The rule was applied, and its predictive ability was calculated. RESULTS: The study included 413 patients (157 boys, VUR ≥3 in 11%) from eight centers in five countries. The rule offered a 46% specificity (95% CI, 41-52), not different from the one in the derivation study. However, the sensitivity significantly decreased to 64% (95% CI, 50-76), a difference of 20% (95% CI, 17-36). In all, 16 (34%) patients among the 47 with VUR ≥3 were misdiagnosed by the rule. This lack of reproducibility might result primarily from a difference between the derivation and validation populations regarding inflammatory parameters (CRP, PCT); the validation set samples may have been collected earlier than those of the derivation set. CONCLUSIONS: The rule built to predict VUR ≥3 had a stable specificity (i.e., 46%) but a decreased sensitivity (i.e., 64%) because of the time variability of PCT measurement. Some refinement may be warranted.
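
    External validation of a rule like this reduces to recomputing sensitivity and specificity, with confidence intervals, from the 2x2 table of rule predictions against the reference standard. The sketch below uses hypothetical counts chosen only to be roughly consistent with the percentages quoted above, and a simple normal-approximation interval, which is one of several reasonable choices.

```python
import math

def proportion_ci(k, n, z=1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

def validate_rule(tp, fn, tn, fp):
    sens = proportion_ci(tp, tp + fn)
    spec = proportion_ci(tn, tn + fp)
    return sens, spec

# Hypothetical counts for a validation cohort (not the study's exact data):
# 47 children with VUR >= 3, of whom the rule flags 30; 366 without, of whom it clears 168.
(sens, sens_lo, sens_hi), (spec, spec_lo, spec_hi) = validate_rule(tp=30, fn=17, tn=168, fp=198)
print(f"sensitivity {sens:.2f} (95% CI {sens_lo:.2f}-{sens_hi:.2f})")
print(f"specificity {spec:.2f} (95% CI {spec_lo:.2f}-{spec_hi:.2f})")
```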

  15. Learning about the Term Structure and Optimal Rules for Inflation Targeting

    NARCIS (Netherlands)

    Tesfaselassie, M.F.; Schaling, E.; Eijffinger, S.C.W.

    2006-01-01

    In this paper we incorporate the term structure of interest rates in a standard inflation forecast targeting framework. We find that under flexible inflation targeting and uncertainty in the degree of persistence in the economy, allowing for active learning possibilities has effects on the optimal int

  16. The Constraints of the Rules for Government's Public Decision-making

    Institute of Scientific and Technical Information of China (English)

    尹长海

    2012-01-01

    Government's public decision-making is a process of redistributing interests among different groups. Owing to the "economic man" assumption and the bounded rationality of government, the power of public decision-making is frequently abused, leading to failures in government's public decision-making. Establishing sound rules to regulate this power can effectively prevent such failures, so that public decisions relieve people's hardships, reflect people's needs, and improve people's well-being.

  17. Impact of the New Optional Rules for Arbitration of Disputes Relating to Space Debris Controversies

    Science.gov (United States)

    Force, Melissa K.

    2013-09-01

    The mechanisms and procedures for settlement of disputes arising from space debris collision damage, such as that suffered by the Russian Cosmos and US Iridium satellites in 2009, are highly political, nonbinding and unpredictable - all of which contributes to the uncertainty that increases the costs of financing and insuring those endeavors that take place in near-Earth space, especially in Low Earth Orbit. Dispute settlement mechanisms can be found in the 1967 Outer Space Treaty, which provides for consultations in cases involving potentially harmful interference with activities of States parties, and in the 1972 Liability Convention, which permits but does not require States - not non-governmental entities - to pursue claims in a resolution process that is nonbinding (unless otherwise agreed). There are soft-law mechanisms to control the growth of space debris, such as the voluntary 2008 United Nations Space Debris Mitigation Guidelines, and international law and the principles of equity and justice generally provide reparation to restore a person, State or organization to the condition which would have existed if damage had not occurred, but only if all agree to a specific tribunal or international court; even then, parties may be bound by the result only if agreed, and enforcement of the award internationally remains uncertain. In all, the dispute resolution process for inevitable future damage from space debris collisions is highly unsatisfactory. However, the Administrative Council of the Permanent Court of Arbitration's recently adopted Optional Rules for the Arbitration of Disputes Relating to Outer Space Activities are as yet untested, and this article will provide an overview of the process, explore the ways in which they fill gaps in the previous patchwork of systems, and analyze the benefits and shortcomings of the new Outer Space Optional Rules.

  18. Management of redundancy in flight control systems using optimal decision theory

    Science.gov (United States)

    1981-01-01

    The problem of using redundancy that exists between dissimilar systems in aircraft flight control is addressed, that is, the redundancy between a rate gyro and an accelerometer: devices whose dissimilar outputs are related only through the dynamics of the aircraft motion. Management of this type of redundancy requires advanced logic so that the system can monitor failure status and can reconfigure itself in the event of one or more failures. An optimal decision theory for the management of sensor redundancy is developed in tutorial fashion, and the theory is applied to two aircraft examples. The first example is the space shuttle and the second is a highly maneuvering high-performance aircraft, the F8-C. The examples illustrate the redundancy management design process and the performance of the algorithms presented in failure detection and control law reconfiguration.

  1. Engaging Gatekeepers, Optimizing Decision Making, and Mitigating Bias: Design Specifications for Systemic Diversity Interventions.

    Science.gov (United States)

    Vinkenburg, Claartje J

    2017-06-01

    In this contribution to the Journal of Applied Behavioral Science Special Issue on Understanding Diversity Dynamics in Systems: Social Equality as an Organization Change Issue, I develop and describe design specifications for systemic diversity interventions in upward mobility career systems, aimed at optimizing decision making through mitigating bias by engaging gatekeepers. These interventions address the paradox of meritocracy that underlies the surprising lack of diversity at the top of the career pyramid in these systems. I ground the design specifications in the limited empirical evidence on "what works" in systemic interventions. Specifically, I describe examples from interventions in academic settings, including a bias literacy program, participatory modeling, and participant observation. The design specifications, paired with inspirational examples of successful interventions, should assist diversity officers and consultants in designing and implementing interventions to promote the advancement to and representation of nondominant group members at the top of the organizational hierarchy.

  2. Optimization-based decision support to assist in logistics planning for hospital evacuations.

    Science.gov (United States)

    Glick, Roger; Bish, Douglas R; Agca, Esra

    2013-01-01

    The evacuation of a hospital is a very complex process, and evacuation planning is an important part of a hospital's emergency management plan. Numerous factors affect the evacuation plan, including the nature of the threat, the availability of resources and staff, the characteristics of the evacuee population, and the risk to patients and staff. The safety and health of patients is of fundamental importance, but safely moving patients to alternative care facilities while under threat is a very challenging task. This article describes the logistical issues and complexities involved in the planning and execution of hospital evacuations. Furthermore, it provides examples of how optimization-based decision support tools can help evacuation planners better plan for complex evacuations by providing real-world solutions to various evacuation scenarios.

  3. MIDAS intelligent platform for medical services, support for decision optimization in virtual medical communities.

    Science.gov (United States)

    Arotăriţei, D; Toma, C M; Turnea, M; Toma, Vasilica

    2008-01-01

    The paper describes the implementation of an open multifunctional platform--MIDAS--for heterogeneous medical data management, supporting the optimization of clinical decisions in virtual medical communities. The objectives of this intelligent environment are: easier diagnosis through access to heterogeneous medical data, virtual support for medical personnel in order to reduce medical errors, and fast access to educational resources to improve medical education for physicians and students. The structure of the platform is based on a core module and a number of dedicated modules, which gives it the important advantage of being a reconfigurable platform depending on needs. The core module is kept as general as possible so that it can be used in the future as the core of a platform focused on dentistry cases.

  4. Safety Lead Optimization and Candidate Identification: Integrating New Technologies into Decision-Making.

    Science.gov (United States)

    Dambach, Donna M; Misner, Dinah; Brock, Mathew; Fullerton, Aaron; Proctor, William; Maher, Jonathan; Lee, Dong; Ford, Kevin; Diaz, Dolores

    2016-04-18

    Discovery toxicology focuses on the identification of the most promising drug candidates through the development and implementation of lead optimization strategies and hypothesis-driven investigation of issues that enable rational and informed decision-making. The major goals are to [a] identify and progress the drug candidate with the best overall drug safety profile for a therapeutic area, [b] remove the most toxic drugs from the portfolio prior to entry into humans to reduce clinical attrition due to toxicity, and [c] establish a well-characterized hazard and translational risk profile to enable clinical trial designs. This is accomplished through a framework that balances the multiple considerations to identify a drug candidate with the overall best drug characteristics and provides a cogent understanding of mechanisms of toxicity. The framework components include establishing a target candidate profile for each program that defines the qualities of a successful candidate based on the intended therapeutic area, including the risk tolerance for liabilities; evaluating potential liabilities that may result from engaging the therapeutic target (pharmacology-mediated or on-target) and that are chemical structure-mediated (off-target); and characterizing identified liabilities. Lead optimization and investigation relies upon the integrated use of a variety of technologies and models (in silico, in vitro, and in vivo) that have achieved a sufficient level of qualification or validation to provide confidence in their use. We describe the strategic applications of various nonclinical models (established and new) for a holistic and integrated risk assessment that is used for rational decision-making. While this review focuses on strategies for small molecules, the overall concepts, approaches, and technologies are generally applicable to biotherapeutics.

  5. Spatial decision supporting for winter wheat irrigation and fertilizer optimizing in North China Plain

    Science.gov (United States)

    Yang, Xiaodong; Yang, Hao; Dong, Yansheng; Yu, Haiyang

    2014-11-01

    Production management of winter wheat is more complicated than that of other crops, since its growth period spans all four seasons and its growth environment is very complex, with frost injury, drought, and insect or disease damage, among others. In traditional irrigation and fertilizer management, agricultural technicians or farmers mainly make decisions based on phenology and planting experience. For example, experience says that wheat needs more nitrogen fertilizer in the jointing and booting stages, so when the wheat reaches these two growth periods the farmer fertilizes it whether it needs it or not. We developed a WebGIS-based spatial decision support system for optimizing irrigation and fertilizer measures, which monitors winter wheat growth and soil moisture content by combining a crop model, remote sensing data and wireless sensor data, and then derives professional management schedules from an expert knowledge warehouse. The system is developed with ArcIMS and IDL on the server side and jQuery, the Google Maps API and ASP.NET on the client side. All computing tasks run on the server side, such as computing 11 standard vegetation indices (NDVI, NDWI, NDWI2, NRI, NSI, WI, G_SWIR, G_SWIR2, SPSI, TVDI, VSWI) and custom VIs from remote sensing imagery with IDL, while map configuration files are built and thematic maps generated in real time with ArcIMS.

  6. Optimal site selection for siting a solar park using multi-criteria decision analysis and geographical information systems

    Science.gov (United States)

    Georgiou, Andreas; Skarlatos, Dimitrios

    2016-07-01

    Among the renewable power sources, solar power is rapidly becoming popular because it is inexhaustible, clean, and dependable. It has also become more efficient since the power conversion efficiency of photovoltaic solar cells has increased. Following these trends, solar power will become more affordable in years to come and considerable investments are to be expected. Despite the size of solar plants, the siting procedure is a crucial factor for their efficiency and financial viability. Many aspects influence such a decision: legal, environmental, technical, and financial, to name a few. This paper describes a general integrated framework to evaluate land suitability for the optimal placement of photovoltaic solar power plants, which is based on a combination of a geographic information system (GIS), remote sensing techniques, and multi-criteria decision-making methods. An application of the proposed framework for the Limassol district in Cyprus is further illustrated. The combination of a GIS and multi-criteria methods produces an excellent analysis tool that creates an extensive database of spatial and non-spatial data, which will be used to simplify problems as well as solve and promote the use of multiple criteria. A set of environmental, economic, social, and technical constraints, based on recent Cypriot legislation, European Union policies, and expert advice, identifies the potential sites for solar park installation. The pairwise comparison method in the context of the analytic hierarchy process (AHP) is applied to estimate the criteria weights in order to establish their relative importance in site evaluation. In addition, four different methods to combine information layers and check their sensitivity were used. The first considered all the criteria as being equally important and assigned them equal weight, whereas the others grouped the criteria and graded them according to their objective perceived importance. The overall suitability of the study
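
    The AHP step described here boils down to extracting a weight vector from a pairwise comparison matrix: the principal eigenvector gives the criterion weights and the consistency ratio checks the judgments. The sketch below uses a hypothetical 3-criterion comparison matrix, not the study's actual judgments.

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparisons of three criteria
# (e.g., solar irradiance vs. grid distance vs. land slope).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # principal eigenvector, normalized to sum to 1

# Consistency check (random index RI = 0.58 for a 3x3 matrix, per Saaty's table)
lambda_max = eigvals.real[k]
ci = (lambda_max - len(A)) / (len(A) - 1)
cr = ci / 0.58
print(np.round(weights, 3), f"CR = {cr:.3f}")   # CR < 0.10 is conventionally acceptable
```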

  7. Integration of Rule Based Expert Systems and Case Based Reasoning in an Acute Bacterial Meningitis Clinical Decision Support System

    CERN Document Server

    Cabrera, Mariana Maceiras

    2010-01-01

    This article presents the results of research carried out on the development of a medical diagnostic system applied to Acute Bacterial Meningitis, using the Case Based Reasoning methodology. The research focused on the implementation of the adaptation stage, through the integration of Case Based Reasoning and Rule Based Expert Systems. In this adaptation stage we use a higher-level CBR component that stores and allows the reuse of adaptation experiences, combined with a classic rule-based inference engine. In order to take the most evident clinical situations into account, a pre-diagnosis stage is implemented using a rule engine that, given an evident situation, emits the corresponding diagnosis and avoids the complete process.

  8. OmniGA: Optimized Omnivariate Decision Trees for Generalizable Classification Models

    KAUST Repository

    Magana-Mora, Arturo

    2017-06-14

    Classification problems from different domains vary in complexity, size, and imbalance of the number of samples from different classes. Although several classification models have been proposed, selecting the right model and parameters for a given classification task to achieve good performance is not trivial. Therefore, there is a constant interest in developing novel robust and efficient models suitable for a great variety of data. Here, we propose OmniGA, a framework for the optimization of omnivariate decision trees based on a parallel genetic algorithm, coupled with deep learning structure and ensemble learning methods. The performance of the OmniGA framework is evaluated on 12 different datasets taken mainly from biomedical problems and compared with the results obtained by several robust and commonly used machine-learning models with optimized parameters. The results show that OmniGA systematically outperformed these models for all the considered datasets, reducing the F score error in the range from 100% to 2.25%, compared to the best performing model. This demonstrates that OmniGA produces robust models with improved performance. OmniGA code and datasets are available at www.cbrc.kaust.edu.sa/omniga/.

  9. A PROTOTYPE DECISION SUPPORT SYSTEM FOR OPTIMIZING THE EFFECTIVENESS OF ELEARNING IN EDUCATIONAL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    S. Abu-Naser

    2011-07-01

    Full Text Available In this paper, a prototype of a Decision Support System (DSS) is proposed for providing the knowledge needed to optimize the newly adopted e-learning education strategy in educational institutions. If an educational institution adopts e-learning as a new strategy, it should undertake a preliminary evaluation to determine the percentage of success and the areas of weakness of this strategy. If this evaluation is done manually, it is not an easy task and does not provide knowledge about all pitfall symptoms. The proposed DSS is based on exploration (mining) of knowledge from the large amounts of data yielded by the institution's business operations. This knowledge can be used to guide and optimize any new business strategy implemented by the institution. The proposed DSS involves a Database engine, a Data Mining engine and an Artificial Intelligence engine. All these engines work together in order to extract the knowledge necessary to improve the effectiveness of any strategy, including e-learning.

  10. A markov decision process model for the optimal dispatch of military medical evacuation assets.

    Science.gov (United States)

    Keneally, Sean K; Robbins, Matthew J; Lunday, Brian J

    2016-06-01

    We develop a Markov decision process (MDP) model to examine aerial military medical evacuation (MEDEVAC) dispatch policies in a combat environment. The problem of deciding which aeromedical asset to dispatch to each service request is complicated by the threat conditions at the service locations and the priority class of each casualty event. We assume requests for MEDEVAC support arrive sequentially, with the location and the priority of each casualty known upon initiation of the request. The United States military uses a 9-line MEDEVAC request system to classify casualties as being one of three priority levels: urgent, priority, and routine. Multiple casualties can be present at a single casualty event, with the highest priority casualty determining the priority level for the casualty event. Moreover, an armed escort may be required depending on the threat level indicated by the 9-line MEDEVAC request. The proposed MDP model indicates how to optimally dispatch MEDEVAC helicopters to casualty events in order to maximize steady-state system utility. The utility gained from servicing a specific request depends on the number of casualties, the priority class for each of the casualties, and the locations of both the servicing ambulatory helicopter and casualty event. Instances of the dispatching problem are solved using a relative value iteration dynamic programming algorithm. Computational examples are used to investigate optimal dispatch policies under different threat situations and armed escort delays; the examples are based on combat scenarios in which United States Army MEDEVAC units support ground operations in Afghanistan.
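
    The relative value iteration mentioned here can be sketched generically: iterate the Bellman operator for the average-reward criterion and subtract the value at a reference state to keep the iterates bounded; the subtracted quantity converges to the optimal average reward. The toy two-state, two-action dispatch MDP below uses hypothetical transition probabilities and utilities, not the paper's model.

```python
import numpy as np

# Toy average-reward MDP: states 0/1 (e.g., MEDEVAC unit idle/busy), actions 0/1 (hold/dispatch).
# P[a][s, s'] transition probabilities, R[a][s] expected one-step utility (hypothetical numbers).
P = {0: np.array([[0.9, 0.1], [0.4, 0.6]]),
     1: np.array([[0.5, 0.5], [0.2, 0.8]])}
R = {0: np.array([0.0, 0.2]),
     1: np.array([1.0, 0.5])}

def relative_value_iteration(P, R, ref_state=0, tol=1e-8, max_iter=10_000):
    n = len(R[0])
    h = np.zeros(n)                                    # relative value function
    for _ in range(max_iter):
        q = np.stack([R[a] + P[a] @ h for a in P])     # Q-values for each action
        h_new = q.max(axis=0)
        gain = h_new[ref_state]                        # estimate of the optimal average reward
        h_new = h_new - gain                           # keep iterates bounded
        if np.max(np.abs(h_new - h)) < tol:
            return gain, h_new, q.argmax(axis=0)
        h = h_new
    return gain, h, q.argmax(axis=0)

gain, h, policy = relative_value_iteration(P, R)
print(f"optimal average reward ~ {gain:.3f}, greedy action by state: {policy}")
```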

  11. A web-based Decision Support System for the optimal management of construction and demolition waste.

    Science.gov (United States)

    Banias, G; Achillas, Ch; Vlachokostas, Ch; Moussiopoulos, N; Papaioannou, I

    2011-12-01

    Wastes from construction activities nowadays constitute the largest fraction by quantity of solid wastes in urban areas. In addition, it is widely accepted that this particular waste stream contains hazardous materials, such as insulating materials, plastic frames of doors, windows, etc. Their uncontrolled disposal results in long-term pollution costs, resource overuse and wasted energy. Within the framework of the DEWAM project, a web-based Decision Support System (DSS) application - namely DeconRCM - has been developed, aiming at the identification of the optimal construction and demolition waste (CDW) management strategy that minimises end-of-life costs and maximises the recovery of salvaged building materials. This paper addresses both the technical and the functional structure of the developed web-based application. The web-based DSS provides an accurate estimation of the generated CDW quantities for twenty-one different waste streams (e.g. concrete, bricks, glass, etc.) for four different types of buildings (residential, office, commercial and industrial). With the use of mathematical programming, DeconRCM also provides the user with the optimal end-of-life management alternative, taking into consideration both economic and environmental criteria. The DSS's capabilities are illustrated through a real-world case study of a typical five-floor apartment building in Thessaloniki, Greece.

  12. CHEOPS performance for exomoons: The detectability of exomoons by using optimal decision algorithm

    CERN Document Server

    Simon, A E; Kiss, L L; Fortier, A; Benz, W

    2015-01-01

    Many attempts have already been made to detect exomoons around transiting exoplanets, but the first confirmed discovery is still pending. The experience gathered so far allows us to better optimize future space telescopes for this challenge, already during the development phase. In this paper we focus on the forthcoming CHaracterising ExOPlanet Satellite (CHEOPS), describing an optimized decision algorithm with step-by-step evaluation, and calculating the number of transits required for an exomoon detection for various planet-moon configurations observable by CHEOPS. We explore the most efficient way of making such an observation, minimizing the cost in observing time. Our study is based on PTV observations (photocentric transit timing variation, Szabó et al. 2006) in simulated CHEOPS data, but the recipe does not depend on the actual detection method, and it can be substituted with e.g. the photodynamical method for later applications. Using the current state-of-the-art level simul...

  13. Hydraulic Modeling and Evolutionary Optimization for Enhanced Real-Time Decision Support of Combined Sewer Overflows

    Science.gov (United States)

    Zimmer, A. L.; Minsker, B. S.; Schmidt, A. R.; Ostfeld, A.

    2011-12-01

    Real-time mitigation of combined sewer overflows (CSOs) requires evaluation of multiple operational strategies during rapidly changing rainfall events. Simulation models for hydraulically complex systems can effectively provide decision support for short time intervals when coupled with efficient optimization. This work seeks to reduce CSOs for a test case roughly based on the North Branch of the Chicago Tunnel and Reservoir Plan (TARP), which is operated by the Metropolitan Water Reclamation District of Greater Chicago (MWRDGC). The North Branch tunnel flows to a junction with the main TARP system. The Chicago combined sewer system alleviates potential CSOs by directing high interceptor flows through sluice gates and dropshafts to a deep tunnel. Decision variables to control CSOs consist of sluice gate positions that control water flow to the tunnel as well as a treatment plant pumping rate that lowers interceptor water levels. A physics-based numerical model is used to simulate the hydraulic effects of changes in the decision variables. The numerical model is step-wise steady and conserves water mass and momentum at each time step by iterating through a series of look-up tables. The look-up tables are constructed offline to avoid extensive real-time calculations, and describe conduit storage and water elevations as a function of flow. A genetic algorithm (GA) is used to minimize CSOs at each time interval within a moving horizon framework. Decision variables are coded at 15-minute increments and GA solutions are two hours in duration. At each 15-minute interval, the algorithm identifies a good solution for a two-hour rainfall forecast. Three GA modifications help reduce optimization time. The first adjustment reduces the search alphabet by eliminating sluice gate positions that do not influence overflow volume. The second GA retains knowledge of the best decision at the previous interval by shifting the genes in the best previous sequence to initialize search at

  14. A cloud theory-based particle swarm optimization for multiple decision maker vehicle routing problems with fuzzy random time windows

    Science.gov (United States)

    Ma, Yanfang; Xu, Jiuping

    2015-06-01

    This article puts forward a cloud theory-based particle swarm optimization (CTPSO) algorithm for solving a variant of the vehicle routing problem, namely a multiple decision maker vehicle routing problem with fuzzy random time windows (MDVRPFRTW). A new mathematical model is developed for the proposed problem in which fuzzy random theory is used to describe the time windows and bi-level programming is applied to describe the relationship between the multiple decision makers. To solve the problem, a cloud theory-based particle swarm optimization (CTPSO) is proposed. More specifically, this approach makes improvements in initialization, inertia weight and particle updates to overcome the shortcomings of the basic particle swarm optimization (PSO). Parameter tests and results analysis are presented to highlight the performance of the optimization method, and comparison of the algorithm with the basic PSO and the genetic algorithm demonstrates its efficiency.
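
    Stripped of the cloud-theory initialization and the bi-level structure, the underlying particle swarm update is compact: each particle's velocity is pulled toward its personal best and the swarm's global best, with an inertia term. The minimal sketch below optimizes a generic test function and is not the authors' CTPSO.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):                       # generic test objective (minimize)
    return float(np.sum(x ** 2))

dim, n_particles, n_iters = 5, 20, 200
w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration coefficients

x = rng.uniform(-5, 5, size=(n_particles, dim))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), np.array([sphere(p) for p in x])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    vals = np.array([sphere(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print(f"best objective found: {pbest_val.min():.6f}")   # should approach 0
```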

  15. Tax Smoothing Discretion Versus Balanced Budget Rules in the Presence of Politically Motivated Fiscal Deficits: The Design of Optimal Fiscal Rules for Europe after 1992

    OpenAIRE

    Corsetti, Giancarlo; Roubini, Nouriel

    1992-01-01

    We analyse the arguments in favour of and against binding fiscal rules, such as those recently agreed by European countries as preconditions for participation in the third phase of the European Monetary Union. The evidence in the paper suggests that a number of EC countries are following unsustainable fiscal policies and that this 'deficits bias' may be partly due to political distortions. Binding balanced budget rules would eliminate the deficits bias that appears in the presence of such distort...

  16. The Preliminary Ruling Decision in the Case of Google vs. Louis Vuitton Concerning the AdWord Service and its Impact on the Community Law

    Directory of Open Access Journals (Sweden)

    Tomáš Gongol

    2013-02-01

    Full Text Available After entering keywords, the internet user obtains two kinds of search results – natural and sponsored ones. The following paper deals with the issue of using keywords which correspond to trademarks registered by a third party for advertising purposes through internet search portals such as Google, Yahoo, Bing, Seznam, Centrum, etc. (in principle, web search portals). The objective of this article is to analyze the cases dealing with the AdWords service decided by the Court of Justice of the European Union and to compare them with the attitude taken in similar disputes in the U.S. Based on this, it is necessary to determine the impact of these decisions on further national court decisions of European Union member states. Moreover, the legal impact on copyright law and on the responsibility of internet search engines is deduced. The method of analysis of court decisions is used, and the method of legal comparison is applied to the different attitudes taken in similar cases. Where a third party uses a sign which is identical with the trademark in relation to goods or services identical with those for which the mark is registered, the trademark proprietor is allowed to prohibit such use if it is liable to affect one of the functions of the mark (particularly the function of indicating origin). Regarding the liability of the internet search engine itself, decisions of the courts in matters of internet search engines in the European Union vary from state to state. Whereas the German courts currently tend to assess the responsibility of search engines for their output more leniently, the French courts are often more stringent. The approach of the U.S. courts to this issue is different – considerably more liberal. The preliminary ruling decision in the case of Louis Vuitton Malletier SA vs. Google, Inc. and Community practice in further cases follow the similarly (liberal) decisions of the U.S. courts.

  17. An Optimization Model of the Single-Leg Air Cargo Space Control Based on Markov Decision Process

    Directory of Open Access Journals (Sweden)

    Chun-rong Qin

    2012-01-01

    Full Text Available Based on the single-leg air cargo problem, we establish a dynamic programming model to consider the overbooking and space inventory control problem. We analyze the structure of the optimal booking policy for every kind of booking request and show that the optimal booking decision is of threshold type (known as a booking limit policy). Our research provides theoretical support for air cargo space control.
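
    The threshold (booking limit) structure falls out of a standard capacity-control dynamic program: a request of a given class is accepted only if its revenue exceeds the marginal value of the capacity it consumes. The minimal single-leg sketch below uses hypothetical fares and arrival probabilities, treats at most one request per period, and ignores overbooking, which the paper's model additionally handles.

```python
# Single-leg capacity control by backward dynamic programming (hypothetical data).
FARES = [100.0, 60.0]          # class 0 = high fare, class 1 = low fare
P_ARRIVAL = [0.3, 0.5]         # probability that a request of each class arrives in a period
T, CAPACITY = 30, 10           # booking periods and cargo slots

# V[t][c] = optimal expected revenue from period t on with c units of capacity left
V = [[0.0] * (CAPACITY + 1) for _ in range(T + 1)]

for t in range(T - 1, -1, -1):
    for c in range(CAPACITY + 1):
        value = (1 - sum(P_ARRIVAL)) * V[t + 1][c]      # no request this period
        for fare, p in zip(FARES, P_ARRIVAL):
            accept = fare + V[t + 1][c - 1] if c > 0 else float("-inf")
            reject = V[t + 1][c]
            value += p * max(accept, reject)
        V[t][c] = value

# The accept/reject decision is of threshold type: accept class k at (t, c)
# iff fare_k >= V[t+1][c] - V[t+1][c-1], the marginal value of one unit of capacity.
c = 4
for k, fare in enumerate(FARES):
    marginal = V[1][c] - V[1][c - 1]
    print(f"class {k}: accept at t=0, c={c}?", fare >= marginal)
```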

  18. Multi-objective Numeric Association Rules Mining via Ant Colony Optimization for Continuous Domains without Specifying Minimum Support and Minimum Confidence

    Directory of Open Access Journals (Sweden)

    Parisa Moslehi

    2011-09-01

    Full Text Available Currently, all search algorithms which use discretization of numeric attributes for numeric association rule mining work in such a way that the original distribution of the numeric attributes is lost. This leads to a loss of information, so that the association rules generated through this process are not precise and accurate. Based on this fact, algorithms which can natively handle numeric attributes are of interest. Since association rule mining can be considered a multi-objective problem rather than a single-objective one, a new multi-objective algorithm for numeric association rule mining is presented in this paper, using Ant Colony Optimization for Continuous domains (ACOR). This algorithm mines numeric association rules in one step, without any need to specify minimum support and minimum confidence. In order to do this, we modified ACOR to generate rules. The results show that applying this algorithm yields more precise and accurate rules, and that the number of rules is greater than in previous works.
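
    Even when numeric rules are mined without user-specified thresholds, support and confidence remain the quantities being traded off. The helper below evaluates an interval rule of the form "X in [a,b] => Y in [c,d]" on a small hypothetical dataset; it is a generic illustration, not the ACOR-based miner itself.

```python
# Evaluate support and confidence of a numeric (interval) association rule on raw data,
# without discretizing the attributes beforehand. Data and intervals are hypothetical.
records = [
    {"age": 25, "income": 30}, {"age": 32, "income": 45}, {"age": 41, "income": 52},
    {"age": 47, "income": 61}, {"age": 53, "income": 58}, {"age": 29, "income": 38},
]

def rule_metrics(records, antecedent, consequent):
    """antecedent/consequent: dict mapping attribute -> (low, high) interval."""
    def matches(rec, conditions):
        return all(lo <= rec[attr] <= hi for attr, (lo, hi) in conditions.items())

    n = len(records)
    n_ante = sum(matches(r, antecedent) for r in records)
    n_both = sum(matches(r, antecedent) and matches(r, consequent) for r in records)
    support = n_both / n
    confidence = n_both / n_ante if n_ante else 0.0
    return support, confidence

# Rule: age in [30, 50]  =>  income in [40, 65]
print(rule_metrics(records, {"age": (30, 50)}, {"income": (40, 65)}))
```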

  19. Intrusion Rule Generation Based on Fuzzy Decision Tree

    Institute of Scientific and Technical Information of China (English)

    郭洪荣

    2013-01-01

    The MC-like Agent in the computer immune system model GECISM can effectively use the fuzzy decision tree Fuzzy-Id3 algorithm, treating the system calls of application programs as a data set from which a decision tree is constructed, thereby generating intrusion detection rules for the computer immune system. Analysis and comparison of the experimental results show that the rules generated by the Fuzzy-Id3 algorithm classify collections of unknown data with a low false-positive rate and a low false-negative rate.
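
    The core of an Id3-style tree builder, fuzzy or crisp, is the information-gain computation used to choose the next attribute to split on; the resulting branches are read off as rules. The crisp version is sketched below on a tiny hypothetical system-call feature set; a fuzzy variant would replace the hard counts with membership-weighted ones.

```python
import math
from collections import Counter

def entropy(labels):
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(rows, labels, attribute):
    """Gain of splitting (rows, labels) on a discrete attribute (ID3 criterion)."""
    base = entropy(labels)
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[attribute], []).append(label)
    remainder = sum(len(sub) / len(labels) * entropy(sub) for sub in by_value.values())
    return base - remainder

# Hypothetical records: coarse features of a process's system-call trace.
rows = [
    {"calls_exec": "yes", "opens_passwd": "yes"},
    {"calls_exec": "yes", "opens_passwd": "no"},
    {"calls_exec": "no",  "opens_passwd": "no"},
    {"calls_exec": "no",  "opens_passwd": "no"},
    {"calls_exec": "yes", "opens_passwd": "yes"},
    {"calls_exec": "no",  "opens_passwd": "no"},
]
labels = ["intrusion", "normal", "normal", "normal", "intrusion", "normal"]

for attr in ("calls_exec", "opens_passwd"):
    print(attr, round(information_gain(rows, labels, attr), 3))
# The attribute with the higher gain becomes the next split, yielding rules such as
# "opens_passwd = yes -> intrusion".
```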

  20. Building a hospital referral expert system with a Prediction and Optimization-Based Decision Support System algorithm.

    Science.gov (United States)

    Chi, Chih-Lin; Street, W Nick; Ward, Marcia M

    2008-04-01

    This study presents a new method for constructing an expert system using a hospital referral problem as an example. Many factors, such as institutional characteristics, patient risks, traveling distance, and chances of survival and complications should be included in the hospital-selection decision. Ideally, each patient should be treated individually, with the decision process including not only their condition but also their beliefs about trade-offs among the desired hospital features. An expert system can help with this complex decision, especially when numerous factors are to be considered. We propose a new method, called the Prediction and Optimization-Based Decision Support System (PODSS) algorithm, which constructs an expert system without an explicit knowledge base. The algorithm obtains knowledge on its own by building machine learning classifiers from a collection of labeled cases. In response to a query, the algorithm gives a customized recommendation, using an optimization step to help the patient maximize the probability of achieving a desired outcome. In this case, the recommended hospital is the optimal solution that maximizes the probability of the desired outcome. With proper formulation, this expert system can combine multiple factors to give hospital-selection decision support at the individual level.

  1. Development of GIS-Based Decision Support System for Optimizing Transportation Cost in Underground Limestone Mining

    Science.gov (United States)

    Oh, Sungchan; Park, Jihwan; Suh, Jangwon; Lee, Sangho; Choi, Youngmin

    2014-05-01

    In the mining industry, a large amount of cost has been invested in the early stages of mine development, such as prospecting, exploration, and discovery. Recent changes in mining, however, have also raised the costs of operation, production, and environmental protection, because ore depletion at shallow depth has led to large-scale, deep mining. Therefore, many mining facilities are installed or relocated underground to reduce transportation cost as well as environmental pollution. This study presents a GIS-based decision support system that optimizes transportation cost from the various mining faces to the mine facility in underground mines. The development of this system consists of five steps. As a first step, mining maps containing underground geo-spatial information were collected. In the second step, the mine network and contour data in these maps were converted to GIS format for 3D visualization and spatial analysis: the original tunnel outline data were digitized with ground levels and converted to a simplified network format, and the surface morphology contours were converted to a digital elevation model (DEM). The next step is to define the calculation algorithm for transportation cost. Among the many components of transportation cost, this study focused on fuel cost because it can be easily estimated from the mining maps alone. The cost was calculated by multiplying the number of blasts, the haulage per blast, the distance between mining faces and the facility, the fuel cost per liter, and a factor of two for the downhill and uphill trips, divided by the fuel efficiency of the mining trucks. Finally, the decision support system SNUTunnel was implemented. For the application of SNUTunnel in an actual underground mine, Nammyeong Development Corporation, Korea, was selected as the study site. This mine produces limestone with a high content of calcite for paper, steel manufacture, or desulfurization, and its development is continuously ongoing to reach a deeper calcite ore body, so the mine network is expanding
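
    The fuel-cost component described above is a straightforward product of the stated factors. A sketch with hypothetical values follows; interpreting "haulage per blasting" as the number of truck trips per blast, and the factor of two as covering the downhill and uphill legs of each trip, are assumptions.

```python
def haulage_fuel_cost(n_blasts, trips_per_blast, distance_km, fuel_price_per_l, fuel_km_per_l):
    """Fuel cost of ore haulage between a mining face and the processing facility,
    following the formula described in the abstract (round trip -> factor of 2)."""
    return n_blasts * trips_per_blast * distance_km * 2 * fuel_price_per_l / fuel_km_per_l

# Hypothetical values: 20 blasts, 6 truck trips per blast, 1.8 km one-way,
# fuel at 1.4 per liter, truck efficiency 2.5 km per liter.
print(round(haulage_fuel_cost(20, 6, 1.8, 1.4, 2.5), 2))
```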

  2. Decision support for optimal location of local heat source for small district heating system on the example of biogas plant

    Directory of Open Access Journals (Sweden)

    Ciapała Bartłomiej

    2017-01-01

    Full Text Available Developing a new district heating system requires making decisions that affect the entire range of subsequent activities and the wellness of the system. The article presents a methodology for choosing the optimal location and crucial customers, with three approaches to the optimisation process, and discusses the obtained results and the obstacles encountered. Further improvements and potential applications are named.

  3. Multi-criteria decision-making methods with optimism and pessimism based on Atanassov's intuitionistic fuzzy sets

    Science.gov (United States)

    Chen, Ting-Yu

    2012-05-01

    The theory of Atanassov's intuitionistic fuzzy sets (A-IFSs) developed over the last several decades has found useful application in fields requiring multiple-criteria decision analysis. Since the membership-nonmembership pair in A-IFSs belongs to the bivariate unipolarity type, this article describes an approach that relates optimism and pessimism to multi-criteria decision analysis in an intuitionistic fuzzy-decision environment. First, several optimistic and pessimistic point operators were defined to alter the estimation of decision outcomes. Next, based on the core of the estimations, optimistic and pessimistic score functions were developed to evaluate each alternative with respect to each criterion. The suitability function was then established to determine the degree to which each alternative satisfies the decision maker's requirement. Because the information on multiple criteria corresponding to decision importance is often incomplete, this study included suitability functions in the optimisation models to account for poorly known membership grades. Using a linear equal-weighted summation method, these models were transformed into a single objective optimisation model to generate the optimal weights for criteria. The feasibility and effectiveness of the proposed methods were illustrated through a practical example. Finally, computational experiments with large amounts of simulation data were designed to conduct a comparative analysis on the ranking orders yielded by different optimistic/pessimistic point operators.

  4. An Optimal Decision Assessment Model Based on the Acceptable Maximum LGD of Commercial Banks and Its Application

    Directory of Open Access Journals (Sweden)

    Baofeng Shi

    2016-01-01

    Full Text Available This paper introduces a novel decision assessment method which is suitable for customers' credit risk evaluation and credit decisions. First of all, the paper creates an optimal credit rating model consisting of an objective function and two constraint conditions. The first constraint condition, of strictly increasing LGDs, eliminates the unreasonable phenomenon that the higher the credit rating is, the higher the LGD (loss given default) is. Secondly, on the basis of the credit rating results, a credit decision-making assessment model based on measuring the acceptable maximum LGD of commercial banks is established. Thirdly, empirical results using data on 2817 farmers' microfinance loans from a Chinese commercial bank suggest that the proposed approach can accurately identify the good customers among all the loan applications. Moreover, our approach provides a reference for the decision assessment of customers in other commercial banks around the world.

  5. A multi attribute decision making method for selection of optimal assembly line

    Directory of Open Access Journals (Sweden)

    B. Vijaya Ramnath

    2011-01-01

    Full Text Available With globalization, sweeping technological development, and increasing competition, customers are placing greater demands on manufacturers for higher quality, greater flexibility, on-time delivery of products, and lower cost. Therefore, manufacturers must develop and maintain a high degree of coherence among competitive priorities, order-winning criteria and improvement activities. Thus, production managers are attempting to transform their organizations by adopting familiar and beneficial management philosophies like cellular manufacturing (CM), lean manufacturing (LM), green manufacturing (GM), total quality management (TQM), agile manufacturing (AM), and just-in-time manufacturing (JIT). The main objective of this paper is to propose an optimal assembly method for an engine manufacturer's assembly line in India. Currently, the Indian manufacturer follows a traditional assembly method where the raw materials for assembly are kept along the sides of the conveyor line. This consumes more floor space, more work-in-process inventory, more operator walking time and more operator walking distance per day. In order to reduce the above-mentioned wastes, lean kitting assembly is suggested by some managers. Another group of managers suggests JIT assembly, as it incurs a much lower inventory cost compared to other types of assembly processes. Hence, a multi-attribute decision making model, namely the analytical hierarchy process (AHP), is applied to analyse the alternative assembly methods based on various important factors.

  6. How to construct the optimal Bayesian measurement in quantum statistical decision theory

    Science.gov (United States)

    Tanaka, Fuyuhiko

    Recently, much attention has been paid to applying fundamental properties of quantum theory to information processing and technology. In particular, modern statistical methods have been recognized in quantum state tomography (QST), where we have to estimate a density matrix (a positive semidefinite matrix of trace one) representing a quantum system from finite data collected in a certain experiment. When the dimension of the density matrix gets large (from a few hundred to millions), estimation becomes a nontrivial problem. While a specific measurement is often given and fixed in QST, we are also able to choose the measurement itself according to the purpose of QST by using quantum statistical decision theory. Here we propose a practical method to find the best projective measurement in the Bayesian sense. We assume that a prior distribution (e.g., the uniform distribution) and a convex loss function (e.g., the squared error) are given. In many quantum experiments, these assumptions are not so restrictive. We show that the best projective measurement and the best statistical inference based on the measurement outcome exist and that they can be obtained explicitly by using Monte Carlo optimization. Supported by a Grant-in-Aid for Scientific Research (B) (No. 26280005).

  7. Fleet Planning Decision-Making: Two-Stage Optimization with Slot Purchase

    Directory of Open Access Journals (Sweden)

    Lay Eng Teoh

    2016-01-01

    Full Text Available Essentially, strategic fleet planning is vital for airlines to yield a higher profit margin while providing the desired service frequency to meet stochastic demand. In contrast to most studies, which do not consider slot purchase even though it affects the service frequency determination of airlines, this paper proposes a novel approach to solve the fleet planning problem subject to various operational constraints. A two-stage fleet planning model is formulated in which the first stage selects the individual operating routes that require slot purchase for network expansion, while the second stage, in the form of a probabilistic dynamic programming model, determines the quantity and type of aircraft (with the corresponding service frequency) to meet the demand profitably. By analyzing an illustrative case study (with 38 international routes), the results show that the incorporation of slot purchase in fleet planning is beneficial to airlines in achieving economic and social sustainability. The developed model is practically viable for airlines not only to provide a better service quality (via a higher service frequency) to meet more demand but also to obtain a higher revenue and profit margin, by making optimal slot purchase and fleet planning decisions throughout the long-term planning horizon.

  8. Performance guarantees for individualized treatment rules

    CERN Document Server

    Qian, Min; 10.1214/10-AOS864

    2011-01-01

    Because many illnesses show heterogeneous response to treatment, there is increasing interest in individualizing treatment to patients [Arch. Gen. Psychiatry 66 (2009) 128--133]. An individualized treatment rule is a decision rule that recommends treatment according to patient characteristics. We consider the use of clinical trial data in the construction of an individualized treatment rule leading to highest mean response. This is a difficult computational problem because the objective function is the expectation of a weighted indicator function that is nonconcave in the parameters. Furthermore, there are frequently many pretreatment variables that may or may not be useful in constructing an optimal individualized treatment rule, yet cost and interpretability considerations imply that only a few variables should be used by the individualized treatment rule. To address these challenges, we consider estimation based on $l_1$-penalized least squares. This approach is justified via a finite sample upper bound on...

  9. A Rational Decision Maker with Ordinal Utility under Uncertainty: Optimism and Pessimism

    CERN Document Server

    Han, Ji

    2009-01-01

    In game theory and artificial intelligence, decision making models often involve maximizing expected utility, which does not respect ordinal invariance. In this paper, the author discusses the possibility of preserving ordinal invariance and still making a rational decision under uncertainty.

  10. Knowledge Automation How to Implement Decision Management in Business Processes

    CERN Document Server

    Fish, Alan N

    2012-01-01

    A proven decision management methodology for increased profits and lowered risks Knowledge Automation: How to Implement Decision Management in Business Processes describes a simple but comprehensive methodology for decision management projects, which use business rules and predictive analytics to optimize and automate small, high-volume business decisions. It includes Decision Requirements Analysis (DRA), a new method for taking the crucial first step in any IT project to implement decision management: defining a set of business decisions and identifying all the information-business knowledge

  11. On optimal decision-making in brains and social insect colonies

    OpenAIRE

    Marshall, James A. R.; Bogacz, Rafal; Dornhaus, Anna; Planqué, Robert; Kovacs, Tim; Franks, Nigel R

    2009-01-01

    The problem of how to compromise between speed and accuracy in decision-making faces organisms at many levels of biological complexity. Striking parallels are evident between decision-making in primate brains and collective decision-making in social insect colonies: in both systems, separate populations accumulate evidence for alternative choices; when one population reaches a threshold, a decision is made for the corresponding alternative, and this threshold may be varied to compromise betwe...

  12. The Physics of Optimal Decision Making: A Formal Analysis of Models of Performance in Two-Alternative Forced-Choice Tasks

    Science.gov (United States)

    Bogacz, Rafal; Brown, Eric; Moehlis, Jeff; Holmes, Philip; Cohen, Jonathan D.

    2006-01-01

    In this article, the authors consider optimal decision making in two-alternative forced-choice (TAFC) tasks. They begin by analyzing 6 models of TAFC decision making and show that all but one can be reduced to the drift diffusion model, implementing the statistically optimal algorithm (most accurate for a given speed or fastest for a given…

  13. The Optimal Portfolio Selection Model under g -Expectation

    National Research Council Canada - National Science Library

    Li Li

    2014-01-01

      This paper solves the optimal portfolio selection model under the framework of the prospect theory proposed by Kahneman and Tversky in the 1970s with decision rule replaced by the g -expectation introduced by Peng...

  14. Development & optimization of a rule-based energy management strategy for fuel economy improvement in hybrid electric vehicles

    Science.gov (United States)

    Asfoor, Mostafa

    The gradual decline of oil reserves and the increasing demand for energy over the past decades have led automotive manufacturers to seek alternative solutions to reduce the dependency on fossil-based fuels for transportation. A viable technology that enables significant improvements in the overall energy conversion efficiency is the hybridization of conventional vehicle drive systems. This dissertation builds on prior hybrid powertrain development at the University of Idaho. Advanced vehicle models of a passenger car with a conventional powertrain and three different hybrid powertrain layouts were created using GT-Suite. These powertrain models were validated against a variety of standard driving cycles. The overall fuel economy, energy consumption, and losses were monitored, and a comprehensive energy analysis was performed to compare energy sources and sinks. The GT-Suite model was then used to predict the Formula Hybrid SAE vehicle performance. Inputs to this model were a numerically predicted engine performance map, an electric motor torque curve, vehicle geometry, and road load parameters derived from a roll-down test. In this case study, the vehicle had a supervisory controller that followed a rule-based energy management strategy to ensure a proper power split during hybrid mode operation. The supervisory controller parameters were optimized using a discrete grid optimization method that minimized the total amount of fuel consumed during a specific urban driving cycle with an average speed of approximately 30 [mph]. More than a 15% increase in fuel economy was achieved by adding supervisory control and managing the power split. The vehicle configuration without the supervisory controller displayed a fuel economy of 25 [mpg]. With the supervisory controller this rose to 29 [mpg]. Wider applications of this research include hybrid vehicle controller designs that can extend the range and survivability of military combat platforms. Furthermore, the
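
    To make the idea of grid-searching a rule-based supervisory controller concrete, here is a minimal sketch (Python); the drive cycle, efficiencies, battery size and the single power-split threshold are invented placeholders and bear no relation to the dissertation's GT-Suite model.

        import numpy as np

        # Toy 600-second power demand trace [kW] standing in for an urban drive cycle.
        rng = np.random.default_rng(0)
        demand = np.clip(rng.normal(12.0, 8.0, 600), 0.0, 40.0)
        dt_h = 1.0 / 3600.0                               # 1-second steps, in hours

        def simulate(threshold_kw, soc0=0.7, batt_kwh=5.0, soc_min=0.3):
            """Rule-based split: run electric when demand is below the threshold and
            the battery can take it, otherwise run the engine. Crude constant
            efficiencies; returns (fuel_litres, final_soc)."""
            soc, fuel = soc0, 0.0
            for p in demand:
                d_soc = (p * dt_h) / (0.90 * batt_kwh)    # 90% electric-path efficiency
                if p <= threshold_kw and soc - d_soc > soc_min:
                    soc -= d_soc
                else:
                    fuel += (p * dt_h) / (0.30 * 9.5)     # 30% engine eff., ~9.5 kWh/litre
            return fuel, soc

        # Discrete grid search over the split threshold, with a soft charge-sustaining
        # penalty so the optimum is not simply "always electric".
        def cost(threshold_kw, soc_target=0.5):
            fuel, soc = simulate(threshold_kw)
            return fuel + 10.0 * max(0.0, soc_target - soc)

        grid = np.arange(0.0, 30.5, 0.5)
        best = min(grid, key=cost)
        print(f"best threshold ~ {best:.1f} kW, fuel = {simulate(best)[0]:.3f} L")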

  15. Optimization of Association Rule Apriori Algorithm

    Institute of Scientific and Technical Information of China (English)

    张青

    2015-01-01

    The Apriori algorithm is the classical algorithm for association rule mining, but it is time-consuming and inefficient when processing very large sets of candidate itemsets. This paper proposes an optimized algorithm that partitions the data using a segmentation method. Experiments prove that the new algorithm not only significantly reduces the amount of system resources consumed by data mining, but also solves the problem of generating locally frequent itemsets within each data partition and converting them into globally frequent itemsets.
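
    The sketch below (Python) illustrates the general partition idea the abstract alludes to: each data chunk yields locally frequent itemsets, their union forms the global candidate set, and one full pass confirms global support. It is restricted to itemsets of size one and two for brevity and is not the paper's algorithm.

        from itertools import combinations
        from collections import Counter

        transactions = [
            {"bread", "milk"}, {"bread", "diapers", "beer"}, {"milk", "diapers", "beer"},
            {"bread", "milk", "diapers"}, {"bread", "milk", "beer"}, {"milk", "beer"},
        ]
        min_support = 3          # absolute support over the whole database
        n_partitions = 2

        def frequent_upto_pairs(db, min_sup):
            """Frequent itemsets of size 1 and 2 in db (brute force, for brevity)."""
            counts = Counter()
            for t in db:
                for item in t:
                    counts[frozenset([item])] += 1
                for pair in combinations(sorted(t), 2):
                    counts[frozenset(pair)] += 1
            return {s for s, c in counts.items() if c >= min_sup}

        # Phase 1: locally frequent itemsets in each partition become global candidates.
        size = (len(transactions) + n_partitions - 1) // n_partitions
        candidates = set()
        for i in range(0, len(transactions), size):
            part = transactions[i:i + size]
            local_min = max(1, round(min_support * len(part) / len(transactions)))
            candidates |= frequent_upto_pairs(part, local_min)

        # Phase 2: one full pass counts the candidates against the whole database.
        support = {c: sum(c <= t for t in transactions) for c in candidates}
        frequent = {c: s for c, s in support.items() if s >= min_support}
        print(sorted((sorted(k), v) for k, v in frequent.items()))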

  16. Modeling Ignition of a Heptane Isomer: Improved Thermodynamics, Reaction Pathways, Kinetic, and Rate Rule Optimizations for 2-Methylhexane

    KAUST Repository

    Mohamed, Samah

    2016-03-21

    Accurate chemical kinetic combustion models of lightly branched alkanes (e.g., 2-methylalkanes) are important to investigate the combustion behavior of real fuels. Improving the fidelity of existing kinetic models is a necessity, as new experiments and advanced theories show inaccuracies in certain portions of the models. This study focuses on updating thermodynamic data and the kinetic reaction mechanism for a gasoline surrogate component, 2-methylhexane, based on recently published thermodynamic group values and rate rules derived from quantum calculations and experiments. Alternative pathways for the isomerization of peroxy-alkylhydroperoxide (OOQOOH) radicals are also investigated. The effects of these updates are compared against new high-pressure shock tube and rapid compression machine ignition delay measurements. It is shown that rate constant modifications are required to improve agreement between kinetic modeling simulations and experimental data. We further demonstrate the ability to optimize the kinetic model using both manual and automated techniques for rate parameter tunings to improve agreement with the measured ignition delay time data. Finally, additional low temperature chain branching reaction pathways are shown to improve the model’s performance. The present approach to model development provides better performance across extended operating conditions while also strengthening the fundamental basis of the model.

  17. Optimal Tableaux-Based Decision Procedure for Testing Satisfiability in the Alternating-Time Temporal Logic ATL+

    DEFF Research Database (Denmark)

    Cerrito, Serenella; David, Amelie; Goranko, Valentin

    2014-01-01

    We develop a sound, complete and practically implementable tableaux-based decision method for constructive satisfiability testing and model synthesis in the fragment ATL+ of the full alternating-time temporal logic ATL∗. The method extends in an essential way a previously developed tableaux-based decision method for ATL and works in 2EXPTIME, which is the optimal worst-case complexity of the satisfiability problem for ATL+. We also discuss how suitable parameterizations and syntactic restrictions on the class of input ATL+ formulae can reduce the complexity of the satisfiability problem.

  18. Can decision rules simulate carbon allocation for years with contrasting and extreme weather conditions? A case study for three temperate beech forests

    DEFF Research Database (Denmark)

    Campioli, Matteo; Verbeeck, Hans; Van den Bossche, Joris

    2013-01-01

    The allocation of carbohydrates to different tree processes and organs is crucial to understand the overall carbon (C) cycling rate in forest ecosystems. Decision rules (DR) (e.g. functional balances and source-sink relationships) are widely used to model C allocation in forests. However, standard … and for two contrasting sites not used for parameterisation (the beech forest of Sorø, Denmark, for 1999-2006, and Collelongo, Italy, for 2005-2006). At Hesse, 2003 was characterised by a severe and extreme drought and heat wave. The standard DR allocation scheme captured the average annual dynamics of C …; however, the ability of the standard DR allocation model to simulate year-to-year variability was limited. The amended DR allocation scheme improved the annual simulations and allowed capturing the stand growth dynamics at Hesse during the extreme 2003 summer and its important lag effect on next year's wood production. Modelling …

  19. Which is the optimal fiscal rule in a monetary union? Targeting the structural, the global budgetary deficit, or the public debt?

    OpenAIRE

    2014-01-01

    The aim of our paper is to contribute to the debate on optimal fiscal rules in a monetary union: in terms of global budgetary deficit, of structural budgetary deficit, or of public debt. Indeed, these rules seem to be mixed in the framework of the European Economic and Monetary Union, with the new Fiscal Compact. With the help of a simple macroeconomic model, we show that a goal in terms of public debt is the most appropriate in order to decrease the indebtedness levels, but that it could inc...

  20. Multi-objective thermodynamic optimization of an irreversible regenerative Brayton cycle using evolutionary algorithm and decision making

    Directory of Open Access Journals (Sweden)

    Rajesh Kumar

    2016-06-01

    Full Text Available A Brayton heat engine model is developed in the MATLAB Simulink environment and thermodynamic optimization based on finite-time thermodynamic analysis along with multiple criteria is implemented. The proposed work investigates optimal values of various decision variables that simultaneously optimize power output, thermal efficiency and ecological function using an evolutionary algorithm based on NSGA-II. The Pareto optimal frontier between triple and dual objectives is obtained and the best optimal value is selected using fuzzy, TOPSIS, LINMAP and Shannon’s entropy decision-making methods. The triple-objective evolutionary approach applied to the proposed model gives power output, thermal efficiency and ecological function of (53.89 kW, 0.1611, −142 kW), which are 29.78%, 25.86% and 21.13% lower in comparison with the reversible system. Furthermore, the present study reflects the effect of various heat capacitance rates and component efficiencies on the triple objectives in graphical form. Finally, with the aim of error investigation, average and maximum errors of the obtained results are computed.

  1. Determination of fetal state from cardiotocogram using LS-SVM with particle swarm optimization and binary decision tree.

    Science.gov (United States)

    Yılmaz, Ersen; Kılıkçıer, Cağlar

    2013-01-01

    We use a least squares support vector machine (LS-SVM) utilizing a binary decision tree for classification of cardiotocograms to determine the fetal state. The parameters of the LS-SVM are optimized by particle swarm optimization. The robustness of the method is examined by running 10-fold cross-validation. The performance of the method is evaluated in terms of overall classification accuracy. Additionally, receiver operating characteristic analysis and cobweb representation are presented in order to analyze and visualize the performance of the method. Experimental results demonstrate that the proposed method achieves a remarkable classification accuracy rate of 91.62%.
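
    A compact sketch of the tuning idea is given below (Python). It hand-rolls a small particle swarm over log-scaled (C, gamma) and scores each particle by 10-fold cross-validation; scikit-learn's standard SVC is used as a stand-in for LS-SVM (which scikit-learn does not provide), and a bundled toy dataset stands in for the cardiotocography data.

        import numpy as np
        from sklearn.datasets import load_breast_cancer   # toy stand-in for CTG data
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        X, y = load_breast_cancer(return_X_y=True)

        def fitness(pos):
            """10-fold CV accuracy of an RBF SVM at log10-scaled (C, gamma)."""
            C, gamma = 10.0 ** pos[0], 10.0 ** pos[1]
            return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=10).mean()

        rng = np.random.default_rng(1)
        n_particles, n_iter = 8, 15
        lo, hi = np.array([-2.0, -6.0]), np.array([3.0, 0.0])   # bounds on log10(C), log10(gamma)
        pos = rng.uniform(lo, hi, size=(n_particles, 2))
        vel = np.zeros_like(pos)
        pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
        gbest = pbest[np.argmax(pbest_val)].copy()

        for _ in range(n_iter):
            r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, lo, hi)
            vals = np.array([fitness(p) for p in pos])
            improved = vals > pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
            gbest = pbest[np.argmax(pbest_val)].copy()

        print(f"best CV accuracy {pbest_val.max():.3f} at C=10^{gbest[0]:.2f}, gamma=10^{gbest[1]:.2f}")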

  2. Geometric Decision Tree

    CERN Document Server

    Manwani, Naresh

    2010-01-01

    In this paper we present a new algorithm for learning oblique decision trees. Most of the current decision tree algorithms rely on impurity measures to assess the goodness of hyperplanes at each node while learning a decision tree in a top-down fashion. These impurity measures do not properly capture the geometric structures in the data. Motivated by this, our algorithm uses a strategy to assess the hyperplanes in such a way that the geometric structure in the data is taken into account. At each node of the decision tree, we find the clustering hyperplanes for both the classes and use their angle bisectors as the split rule at that node. We show through empirical studies that this idea leads to small decision trees and better performance. We also present some analysis to show that the angle bisectors of clustering hyperplanes that we use as the split rules at each node, are solutions of an interesting optimization problem and hence argue that this is a principled method of learning a decision tree.

  3. Return on marketing investments in B2B customer relationships: A decision-making and optimization approach

    OpenAIRE

    Streukens, Sandra; Hoesel, Stan; de Ruyter, Ko

    2011-01-01

    The basic notion of relationship marketing entails that firms should strive for mutually beneficial customer relationships. By combining relationship marketing theory and operations research methods, this paper aims to develop and demonstrate a managerial decision-making model that business market managers can use to optimize and evaluate marketing investments in both a customer-oriented and economically feasible manner. The intended contributions of our work are as follows. First, we add to ...

  4. Expert System Shells for Rapid Clinical Decision Support Module Development: An ESTA Demonstration of a Simple Rule-Based System for the Diagnosis of Vaginal Discharge.

    Science.gov (United States)

    Kamel Boulos, Maged N

    2012-12-01

    This study demonstrates the feasibility of using expert system shells for rapid clinical decision support module development. A readily available expert system shell was used to build a simple rule-based system for the crude diagnosis of vaginal discharge. Pictures and 'canned text explanations' are extensively used throughout the program to enhance its intuitiveness and educational dimension. All the steps involved in developing the system are documented. The system runs under Microsoft Windows and is available as a free download at http://healthcybermap.org/vagdisch.zip (the distribution archive includes both the program's executable and the commented knowledge base source as a text document). The limitations of the demonstration system, such as the lack of provisions for assessing uncertainty or various degrees of severity of a sign or symptom, are discussed in detail. Ways of improving the system, such as porting it to the Web and packaging it as an app for smartphones and tablets, are also presented. An easy-to-use expert system shell enables clinicians to rapidly become their own 'knowledge engineers' and develop concise evidence-based decision support modules of simple to moderate complexity, targeting clinical practitioners, medical and nursing students, as well as patients, their lay carers and the general public (where appropriate). In the spirit of the social Web, it is hoped that an online repository can be created to peer review, share and re-use knowledge base modules covering various clinical problems and algorithms, as a service to the clinical community.

  5. Generalized radial basis function networks for classification and novelty detection: self-organization of optimal Bayesian decision.

    Science.gov (United States)

    Albrecht, S; Busch, J; Kloppenburg, M; Metze, F; Tavan, P

    2000-12-01

    By adding reverse connections from the output layer to the central layer it is shown how a generalized radial basis functions (GRBF) network can self-organize to form a Bayesian classifier, which is also capable of novelty detection. For this purpose, three stochastic sequential learning rules are introduced from biological considerations which pertain to the centers, the shapes, and the widths of the receptive fields of the neurons and allow a joint optimization of all network parameters. The rules are shown to generate maximum-likelihood estimates of the class-conditional probability density functions of labeled data in terms of multivariate normal mixtures. Upon combination with a hierarchy of deterministic annealing procedures, which implement a multiple-scale approach, the learning process can avoid the convergence problems hampering conventional expectation-maximization algorithms. Using an example from the field of speech recognition, the stages of the learning process and the capabilities of the self-organizing GRBF classifier are illustrated.

  6. Using multi-criteria decision making for selection of the optimal strategy for municipal solid waste management.

    Science.gov (United States)

    Jovanovic, Sasa; Savic, Slobodan; Jovicic, Nebojsa; Boskovic, Goran; Djordjevic, Zorica

    2016-09-01

    Multi-criteria decision making (MCDM) is a relatively new tool for decision makers who deal with numerous and often contradictory factors during their decision-making process. This paper presents a procedure to choose the optimal municipal solid waste (MSW) management system for the area of the city of Kragujevac (Republic of Serbia) based on MCDM methods. Two multiple-attribute decision-making methods, SAW (simple additive weighting) and TOPSIS (technique for order preference by similarity to ideal solution), were used to compare the proposed waste management strategies (WMS). Each of the created strategies was simulated using the software package IWM2. Total values for eight chosen parameters were calculated for all the strategies, and the contribution of each of the six waste treatment options was quantified. The SAW analysis was used to obtain the aggregate characteristics of all the waste management strategies, which were ranked accordingly. The TOPSIS method was used to calculate the relative closeness to the ideal solution for all the alternatives. The proposed strategies were then ranked in tables and diagrams obtained with both MCDM methods. As shown in this paper, the results were in good agreement, which additionally confirmed and facilitated the choice of the optimal MSW management strategy.
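
    For readers who have not met SAW and TOPSIS, the sketch below (Python/NumPy) ranks three hypothetical strategies on three criteria with both methods; the decision matrix, weights and benefit/cost flags are illustrative, not the IWM2 outputs used in the paper.

        import numpy as np

        # Rows = waste management strategies, columns = criteria (illustrative values).
        X = np.array([
            [0.60, 0.30, 0.80],
            [0.45, 0.55, 0.70],
            [0.75, 0.40, 0.50],
        ])
        w = np.array([0.4, 0.35, 0.25])              # criteria weights (sum to 1)
        benefit = np.array([True, True, False])      # False = cost criterion

        # SAW: linear (max / min) normalisation, then weighted sum.
        N = np.where(benefit, X / X.max(axis=0), X.min(axis=0) / X)
        saw_scores = N @ w

        # TOPSIS: vector normalisation, weighting, distances to ideal / anti-ideal.
        V = w * X / np.linalg.norm(X, axis=0)
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_plus = np.linalg.norm(V - ideal, axis=1)
        d_minus = np.linalg.norm(V - anti, axis=1)
        closeness = d_minus / (d_plus + d_minus)

        print("SAW ranking   :", np.argsort(-saw_scores))
        print("TOPSIS ranking:", np.argsort(-closeness))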

  7. COUNCIL DECISIONS ON THE 5-YEARLY REMUNERATION REVIEW, ADJUSTMENTS FOR 2001 AND CHANGES TO THE STAFF RULES AND REGULATIONS

    CERN Multimedia

    Human Resources Division

    2001-01-01

    As announced by the Director-General in December last year, Council approved the package of measures concerning the 5-yearly remuneration review recommended by the TREF Restricted Group, as well as the adjustments for 2001 related to salaries and pensions. These measures, as summarised below, enter into force on 1 January 2001, subject to later implementation of some items. Related changes to the Staff Rules and Regulations will be published as soon as possible; in the meantime, the changes, which were annexed to the Council Resolution, can be viewed on the HR Division Web site. 1. Scale of basic salaries (Annex R A 1 of the Staff Regulations): increased by 4.32% resulting from the 5-yearly Review, and by 0.6% corresponding to the salary adjustment for 2001. This includes the increases in social insurance contributions indicated below. 2. Scale of stipends of Fellows (Annex R A 2 of the Staff Regulations): increased by 1.52% resulting from the 5-yearly Review, and by 0.6% which corresponds to the adjustment ...

  8. Enhanced Visualization of Optimal Cerebral Perfusion Pressure Over Time to Support Clinical Decision Making

    NARCIS (Netherlands)

    Aries, Marcel J H; Wesselink, Robin; Elting, Jan Willem J; Donnelly, Joseph; Czosnyka, Marek; Ercole, Ari; Maurits, Natasha M; Smielewski, Peter

    2016-01-01

    OBJECTIVE: Cerebrovascular reactivity can provide a continuously updated individualized target for management of cerebral perfusion pressure, termed optimal cerebral perfusion pressure. The objective of this project was to find a way of improving the optimal cerebral perfusion pressure methodology b

  9. Cascading of C4.5 Decision Tree and Support Vector Machine for Rule Based Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Jashan Koshal

    2012-08-01

    Full Text Available The main reason attacks are introduced into systems is the popularity of the internet. Information security has now become a vital subject; hence, there is an immediate need to recognize and detect attacks. Intrusion detection is defined as a method of diagnosing attacks and signs of malicious activity in a computer network by evaluating the system continuously. Software that performs such a task is called an intrusion detection system (IDS). Systems developed with individual algorithms such as classification, neural networks or clustering give a reasonable detection rate and false alarm rate, but recent studies show that cascading multiple algorithms yields much better performance than a system developed with a single algorithm. For intrusion detection systems that use a single algorithm, the accuracy and detection rate were not up to the mark, and a rise in the false alarm rate was also encountered. Cascading of algorithms is performed to solve this problem. This paper presents two hybrid algorithms for developing the intrusion detection system: the C4.5 decision tree and the support vector machine (SVM) are combined to maximize the accuracy, which is the advantage of C4.5, and to diminish the false alarm rate, which is the advantage of SVM. Results show an increase in the accuracy and detection rate and a lower false alarm rate.
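
    As a rough illustration of a tree-SVM cascade, the sketch below (Python/scikit-learn) trains a decision tree with the entropy criterion (a stand-in for C4.5) and an SVM on synthetic two-class data, then lets the SVM handle only the samples on which the tree is not confident. The routing rule, threshold and data are assumptions for illustration; the paper's cascade may be wired differently.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC
        from sklearn.tree import DecisionTreeClassifier

        # Toy binary data standing in for normal-vs-attack network records.
        X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        tree = DecisionTreeClassifier(criterion="entropy", max_depth=6, random_state=0).fit(X_tr, y_tr)
        svm = SVC().fit(X_tr, y_tr)

        # Cascade: accept the tree's answer when it is confident, otherwise defer to the SVM.
        proba = tree.predict_proba(X_te)
        confident = proba.max(axis=1) >= 0.9
        y_hat = np.where(confident, tree.predict(X_te), svm.predict(X_te))

        acc = (y_hat == y_te).mean()
        print(f"cascade accuracy: {acc:.3f}  (tree handled {confident.mean():.0%} of samples)")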

  10. Optimal Decision-Making in Fuzzy Economic Order Quantity (EOQ Model under Restricted Space: A Non-Linear Programming Approach

    Directory of Open Access Journals (Sweden)

    M. Pattnaik

    2013-08-01

    Full Text Available In this paper the concept of a fuzzy non-linear programming technique is applied to solve an economic order quantity (EOQ) model under restricted space. Since various types of uncertainty and imprecision are inherent in real inventory problems, they are classically modeled using approaches from probability theory. However, there are uncertainties that cannot be appropriately treated by the usual probabilistic models, and the questions of how to define inventory optimization tasks in such an environment and how to interpret the optimal solutions then arise. This paper modifies the single-item EOQ model in the presence of a fuzzy decision-making process where demand is related to the unit price and the setup cost varies with the quantity produced/purchased. It considers the modification of the objective function and storage area in the presence of imprecisely estimated parameters. The model is developed for the problem by employing different modeling approaches over an infinite planning horizon. It incorporates the concepts of a fuzzy arithmetic approach to the quantity ordered and the demand per unit, and compares the fuzzy non-linear model with other models. Investigation of the properties of an optimal solution allows developing an algorithm whose validity is illustrated through an example problem; using MATLAB (R2009a) software, two- and three-dimensional diagrams of the application are presented. Sensitivity analysis of the optimal solution with respect to changes in different parameter values is also carried out to draw managerial insights into the decision problem.
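
    As background for the model being modified, the crisp EOQ formula with a crude storage-space cap is sketched below (Python); the fuzzy, price-dependent variant developed in the paper is considerably more involved, and the numbers here are arbitrary.

        import math

        def eoq(demand_per_year, setup_cost, holding_cost_per_unit,
                space_per_unit=1.0, max_space=float("inf")):
            """Classic EOQ Q* = sqrt(2*D*K/h), capped if storage space is restricted."""
            q_star = math.sqrt(2.0 * demand_per_year * setup_cost / holding_cost_per_unit)
            q_feasible = min(q_star, max_space / space_per_unit)
            total_cost = ((demand_per_year / q_feasible) * setup_cost
                          + 0.5 * q_feasible * holding_cost_per_unit)
            return q_feasible, total_cost

        q, tc = eoq(demand_per_year=1200, setup_cost=50.0, holding_cost_per_unit=2.0,
                    space_per_unit=0.5, max_space=100.0)
        print(f"order quantity = {q:.1f} units, annual ordering+holding cost = {tc:.2f}")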

  11. Development of Near Optimal Rule-Based Control for Plug-In Hybrid Electric Vehicles Taking into Account Drivetrain Component Losses

    OpenAIRE

    Hanho Son; Hyunsoo Kim

    2016-01-01

    A near-optimal rule-based mode control (RBC) strategy was proposed for a target plug-in hybrid electric vehicle (PHEV) taking into account the drivetrain losses. Individual loss models were developed for drivetrain components including the gears, planetary gear (PG), bearings, and oil pump, based on experimental data and mathematical governing equations. Also, a loss model for the power electronic system was constructed, including loss from the motor-generator while rotating in the unloaded s...

  12. Electric Vehicle Charging and Discharging Coordination on Distribution Network Using Multi-Objective Particle Swarm Optimization and Fuzzy Decision Making

    Directory of Open Access Journals (Sweden)

    Dongqi Liu

    2016-03-01

    Full Text Available This paper proposes an optimal strategy for the coordinated charging and discharging of electric vehicles (EVs) with a wind-thermal system. By aggregating a large number of EVs, the huge total battery capacity is sufficient to stabilize disturbances of the transmission grid. Hence, a dynamic environmental dispatch model is proposed that coordinates a cluster of charging- and discharging-controllable EV units with wind farms and thermal plants. A multi-objective particle swarm optimization (MOPSO) algorithm and a fuzzy decision maker are put forward for the simultaneous optimization of grid operating cost, CO2 emissions, wind curtailment, and EV users’ cost. Simulations are carried out on a 30-node system containing three traditional thermal plants, two carbon capture and storage (CCS) thermal plants, two wind farms, and six EV aggregations. Strategies under different EV charging/discharging prices are also contrasted. The results demonstrate the effectiveness of the proposed strategy.
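
    A fuzzy decision maker of the kind mentioned here is often implemented as a linear membership over each objective on the Pareto front, with the best compromise being the solution of highest normalised overall membership; the sketch below (Python/NumPy) shows that step on an invented four-objective front, not on results from the paper.

        import numpy as np

        # Pareto-front objective values (rows = non-dominated solutions), all to be
        # minimised: [operating cost, CO2 emissions, wind curtailment, EV user cost].
        F = np.array([
            [120.0, 45.0, 3.0, 18.0],
            [135.0, 38.0, 2.0, 20.0],
            [150.0, 30.0, 1.5, 25.0],
            [128.0, 42.0, 2.5, 17.0],
        ])

        f_min, f_max = F.min(axis=0), F.max(axis=0)
        mu = (f_max - F) / (f_max - f_min)       # linear membership: 1 = best, 0 = worst
        score = mu.sum(axis=1) / mu.sum()        # normalised satisfaction of each solution
        best = int(np.argmax(score))
        print("membership matrix:\n", np.round(mu, 2))
        print("best compromise solution index:", best)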

  13. Building of Reusable Reverse Logistics Model and its Optimization Considering the Decision of Backorder or Next Arrival of Goods

    Science.gov (United States)

    Lee, Jeong-Eun; Gen, Mitsuo; Rhee, Kyong-Gu; Lee, Hee-Hyol

    This paper deals with the building of a reusable reverse logistics model considering the decision of backorder or next arrival of goods. An optimization method is proposed to minimize the transportation cost and the volume of backorders or next arrivals of goods arising from the just-in-time delivery of the final delivery stage between the manufacturer and the processing center. Through optimization algorithms using a priority-based genetic algorithm and a hybrid genetic algorithm, the sub-optimal delivery routes are determined. Based on a case study of a distilling and sales company in Busan, Korea, the new model of the reusable reverse logistics of empty bottles is built and the effectiveness of the proposed method is verified.

  14. Optimization of multi-reservoir operation with a new hedging rule: application of fuzzy set theory and NSGA-II

    Science.gov (United States)

    Ahmadianfar, Iman; Adib, Arash; Taghian, Mehrdad

    2016-06-01

    The reservoir hedging rule curves are used to avoid severe water shortages during drought periods. In this method reservoir storage is divided into several zones, and the rationing factor changes immediately when the water storage level moves from one zone to another. In the present study, a hedging rule with fuzzy rationing factors was applied to create transition zones above and below each rule curve, within which the rationing factor changes gradually. For this purpose, a monthly simulation model was developed and linked to the non-dominated sorting genetic algorithm to calculate the modified shortage index of two objective functions involving the water supply for minimum flow and agricultural demands over a long-term simulation period. The Zohre multi-reservoir system in southern Iran was considered as a case study. The proposed hedging rule improved long-term system performance by 10 to 27 percent in comparison with the simple hedging rule, demonstrating that the fuzzification of hedging factors increases the applicability and efficiency of the new hedging rule, compared with the conventional rule curve, for mitigating the water shortage problem.
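
    To illustrate what a rationing factor with a gradual transition zone looks like, here is a minimal sketch (Python); the boundary, band width and rationing factor are arbitrary numbers, and the linear blend is only a crude stand-in for the fuzzy membership used in the study.

        def hedged_release(storage, demand, zone_boundary, band, low_factor=0.6):
            """Zone-based hedging: full supply above the boundary, rationed supply
            below it, with the rationing factor blended linearly inside a transition
            band of width 2*band around the boundary (a crude fuzzy membership)."""
            if storage >= zone_boundary + band:
                factor = 1.0
            elif storage <= zone_boundary - band:
                factor = low_factor
            else:                                     # inside the transition zone
                t = (storage - (zone_boundary - band)) / (2.0 * band)
                factor = low_factor + t * (1.0 - low_factor)
            return factor * demand

        for s in (40.0, 52.0, 55.0, 58.0, 70.0):      # storage levels around a boundary at 55
            r = hedged_release(s, demand=10.0, zone_boundary=55.0, band=5.0)
            print(f"storage {s:5.1f} -> release {r:.2f}")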

  15. Optimal evidence in difficult settings: improving health interventions and decision making in disasters.

    Directory of Open Access Journals (Sweden)

    Martin Gerdin

    2014-04-01

    Full Text Available Martin Gerdin and colleagues argue that disaster health interventions and decision-making can benefit from an evidence-based approach. Please see later in the article for the Editors' Summary.

  16. The GMOseek matrix: a decision support tool for optimizing the detection of genetically modified plants

    National Research Council Canada - National Science Library

    Block, Annette; Debode, Frédéric; Grohmann, Lutz; Hulin, Julie; Taverniers, Isabel; Kluga, Linda; Barbau-Piednoir, Elodie; Broeders, Sylvia; Huber, Ingrid; Van den Bulcke, Marc; Heinze, Petra; Berben, Gilbert; Busch, Ulrich; Roosens, Nancy; Janssen, Eric; Žel, Jana; Gruden, Kristina; Morisset, Dany

    2013-01-01

    … For this reason, GMO testing is very challenging and requires more complex screening strategies and decision-making schemes, demanding in return the use of efficient bioinformatics tools relying on reliable information…

  17. Prevalence of postmenopausal osteoporosis in Italy and validation of decision rules for referring women for bone densitometry.

    Science.gov (United States)

    D'Amelio, Patrizia; Spertino, Elena; Martino, Francesca; Isaia, Giovanni Carlo

    2013-05-01

    We report the prevalence of osteoporosis, osteopenia, and fractures in a cohort of Italian women randomly recruited from the general population and validate the use of clinical guidelines for referring women for bone density testing. We enrolled 995 healthy women (age range 45-92 years) in the study. A bone density test at the lumbar spine and femur was performed and a questionnaire on osteoporosis risk factors was completed for all patients. The prevalence of osteoporosis was 33.67 %, that of osteopenia was 46.63 %, and 19.7 % of women were normal at bone density testing. Osteoporotic women were generally older and thinner, with a shorter period of estrogen exposure. The prevalence of fractures was 21.9 %, and fractured women had a lower bone density, were older, and had a longer postmenopausal period. Clinical guidelines for referring women for bone density testing performed poorly (the best performance was 68 %). This is the first study providing data on the prevalence of osteoporosis/osteopenia and of fractures in a cohort of healthy postmenopausal women. Known risk factors influence bone density and the risk of fractures. The role of screening in detecting women with postmenopausal osteoporosis is far from optimal.

  18. Optimal decision-making model of spatial sampling for survey of China's land with remotely sensed data

    Institute of Scientific and Technical Information of China (English)

    LI Lianfa; WANG Jinfeng; LIU Jiyuan

    2005-01-01

    In the remote sensing survey of national land, cost and accuracy are in conflict, and spatial sampling is a preferable solution aimed at an optimal balance between economic input and accuracy of results, in other words, higher accuracy at lower cost. To counter the drawbacks of previous application models, e.g. the lack of comprehensive and quantitative comparison, an optimal decision-making model of spatial sampling is proposed. This model first acquires the possible accuracy-cost diagrams of multiple schemes through initial spatial exploration, then regresses and standardizes them into a unified reference frame, and finally produces the relatively optimal sampling scheme by using the discrete decision-making function (built in this paper) and comparing the schemes in combination with the diagrams. According to the test results in the survey of arable land using remotely sensed data, the Sandwich model, when applied to the survey of thin-feature and cultivated land areas with aerial photos, can better realize the goal of the best balance between investment and accuracy. With this and other cases, it is shown that the optimal decision-making model of spatial sampling is a good choice in the survey of farm areas using remote sensing, with its distinguished benefit of higher precision at lower cost, or vice versa. In order to apply the model extensively in surveys of natural resources, including arable farm areas, this paper proposes a development prototype based on component technology, which could considerably improve analysis efficiency by embedding program components within the software environment of GIS and RS.

  19. Online Rule Generation Software Process Model

    National Research Council Canada - National Science Library

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    … The software process model for rule generation using a decision tree classifier refers to the various steps required to be executed for the development of a web-based software model for decision rule generation

  20. Toward Optimal Decision Making among Vulnerable Patients Referred for Cardiac Surgery: A Qualitative Analysis of Patient and Provider Perspectives.

    Science.gov (United States)

    Gainer, Ryan A; Curran, Janet; Buth, Karen J; David, Jennie G; Légaré, Jean-Francois; Hirsch, Gregory M

    2017-07-01

    Comprehension of risks, benefits, and alternative treatment options has been shown to be poor among patients referred for cardiac interventions. Patients' values and preferences are rarely explicitly sought. An increasing proportion of frail and older patients are undergoing complex cardiac surgical procedures with increased risk of both mortality and prolonged institutional care. We sought input from patients and caregivers to determine the optimal approach to decision making in this vulnerable patient population. Focus groups were held with both providers and former patients. Three focus groups were convened for Coronary Artery Bypass Graft (CABG), Valve, or CABG +Valve patients ≥ 70 y old (2-y post-op, ≤ 8-wk post-op, complicated post-op course) (n = 15). Three focus groups were convened for Intermediate Medical Care Unit (IMCU) nurses, Intensive Care Unit (ICU) nurses, surgeons, anesthesiologists and cardiac intensivists (n = 20). We used a semi-structured interview format to ask questions surrounding the informed consent process. Transcribed audio data was analyzed to develop consistent and comprehensive themes. We identified 5 main themes that influence the decision making process: educational barriers, educational facilitators, patient autonomy and perceived autonomy, patient and family expectations of care, and decision making advocates. All themes were influenced by time constraints experienced in the current consent process. Patient groups expressed a desire to receive information earlier in their care to allow time to identify personal values and preferences in developing plans for treatment. Both groups strongly supported a formal approach for shared decision making with a decisional coach to provide information and facilitate communication with the care team. Identifying the barriers and facilitators to patient and caretaker engagement in decision making is a key step in the development of a structured, patient-centered SDM approach. Intervention

  1. Factors limiting performance in a multitone intensity-discrimination task: disentangling non-optimal decision weights and increased internal noise.

    Directory of Open Access Journals (Sweden)

    Daniel Oberfeld

    Full Text Available To identify factors limiting performance in multitone intensity discrimination, we presented sequences of five pure tones alternating in level between loud (85 dB SPL) and soft (30, 55, or 80 dB SPL). In the "overall-intensity task", listeners detected a level increment on all of the five tones. In the "masking task", the level increment was imposed only on the soft tones, rendering the soft tones targets and the loud tones task-irrelevant maskers. Decision weights quantifying the importance of the five tone levels for the decision were estimated using methods of molecular psychophysics. Compatible with previous studies, listeners placed higher weights on the loud tones than on the soft tones in the overall-intensity condition. In the masking task, the decisions were systematically influenced by the to-be-ignored loud tones (maskers). Using a maximum-likelihood technique, we estimated the internal noise variance and tested whether the internal noise was higher in the alternating-level five-tone sequences than in sequences presenting only the soft or only the loud tones. For the overall-intensity task, we found no evidence for increased internal noise, but listeners applied suboptimal decision weights. These results are compatible with the hypothesis that the presence of the loud tones does not impair the precision of the representation of the intensity of the soft tones available at the decision stage, but that this information is not used in an optimal fashion due to a difficulty in attending to the soft tones. For the masking task, in some cases our data indicated an increase in internal noise. Additionally, listeners applied suboptimal decision weights. The maximum-likelihood analyses we developed should also be useful for other tasks or other sensory modalities.
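
    Decision weights of this kind are commonly estimated by regressing trial-by-trial responses on the per-tone level perturbations; the sketch below (Python) simulates such an observer and recovers its weights with a logistic regression. The simulated weights, noise level and the use of plain logistic regression are assumptions for illustration, not the authors' exact maximum-likelihood procedure.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)
        n_trials, n_tones = 2000, 5
        levels = rng.normal(0.0, 2.0, (n_trials, n_tones))      # per-tone level jitter [dB]

        true_w = np.array([0.35, 0.10, 0.25, 0.10, 0.20])        # simulated observer weights
        internal_noise = rng.normal(0.0, 1.0, n_trials)
        responses = (levels @ true_w + internal_noise > 0).astype(int)   # "increment heard?"

        model = LogisticRegression().fit(levels, responses)
        w_hat = model.coef_.ravel()
        w_hat /= w_hat.sum()                                     # normalise for comparison
        print("true weights     :", true_w)
        print("estimated weights:", np.round(w_hat, 2))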

  2. Existing air sparging model and literature review for the development of an air sparging optimization decision tool

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-08-01

    The objectives of this Report are two-fold: (1) to provide overviews of the state-of-the-art and state-of-the-practice with respect to air sparging technology, air sparging models and related or augmentation technologies (e.g., soil vapor extraction); and (2) to provide the basis for the development of the conceptual Decision Tool. The Project Team conducted an exhaustive review of available literature. The complete listing of the documents, numbering several hundred and reviewed as a part of this task, is included in Appendix A. Even with the large amount of material written regarding the development and application of air sparging, there still are significant gaps in the technical community's understanding of the remediation technology. The results of the literature review are provided in Section 2. In Section 3, an overview of seventeen conceptual, theoretical, mathematical and empirical models is presented. Detailed descriptions of each of the models reviewed are provided in Appendix B. Included in Appendix D is a copy of the questionnaire used to compile information about the models. The remaining sections of the document reflect the analysis and synthesis of the information gleaned during the literature and model reviews. The results of these efforts provide the basis for development of the decision tree and conceptual decision tool for determining applicability and optimization of air sparging. The preliminary decision tree and accompanying information provided in Section 6 describe a three-tiered approach for determining air sparging applicability: comparison with established scenarios; calculation of conceptual design parameters; and the conducting of pilot-scale studies to confirm applicability. The final two sections of this document provide listings of the key success factors which will be used for evaluating the utility of the Decision Tool and descriptions of potential applications for Decision Tool use.

  3. An Optimized Distributed Association Rule Mining Algorithm in Parallel and Distributed Data Mining with XML Data for Improved Response Time

    OpenAIRE

    Sujni Paul

    2010-01-01

    Many current data mining tasks can be accomplished successfully only in a distributed setting. The field of distributed data mining has therefore gained increasing importance in the last decade. The Apriori algorithm by Rakesh Agarwal has emerged as one of the best association rule mining algorithms. It also serves as the base algorithm for most parallel algorithms. The enormity and high dimensionality of datasets typically available as input to the problem of association rule discovery makes it...

  4. A cost-effectiveness analysis comparing a clinical decision rule versus usual care to risk stratify children for intraabdominal injury after blunt torso trauma.

    Science.gov (United States)

    Nishijima, Daniel K; Yang, Zhuo; Clark, John A; Kuppermann, Nathan; Holmes, James F; Melnikow, Joy

    2013-11-01

    Recently a clinical decision rule (CDR) to identify children at very low risk for intraabdominal injury needing acute intervention (IAI) following blunt torso trauma was developed. Potential benefits of a CDR include more appropriate use of abdominal computed tomography (CT) and decreased hospital costs. The objective of this study was to compare the cost-effectiveness of implementing the CDR with that of usual care for the evaluation of children with blunt torso trauma. The hypothesis was that, compared to usual care, implementation of the CDR would result in lower CT use and hospital costs. A cost-effectiveness decision analytic model was constructed comparing the costs and outcomes of implementation of the CDR to usual care in the evaluation of children with blunt torso trauma. Probabilities were derived from a multicenter cohort study of children with blunt torso trauma; estimated costs were based on those at the study coordinating site. Outcome measures included missed IAI, number of abdominal CT scans, total costs, and incremental cost-effectiveness ratios. Sensitivity analyses varying imputed probabilities, costs, and scenarios were conducted. Using a hypothetical cohort of 1,000 children with blunt torso trauma, the base case model projected that implementation of the CDR would result in 0.50 additional missed IAIs, a total cost savings of $54,527, and 104 fewer abdominal CT scans compared to usual care. The usual care strategy would cost $108,110 to prevent missing one additional IAI. Findings were robust under multiple sensitivity analyses. Compared to usual care, implementation of the CDR in the evaluation of children with blunt torso trauma would reduce hospital costs and abdominal CT imaging, with a slight increase in the risk of missed IAI. © 2013 by the Society for Academic Emergency Medicine.
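
    The headline cost-effectiveness figure follows directly from the per-1,000-children numbers quoted above, as the short calculation below shows; the small gap to the reported $108,110 is presumably due to rounding of the inputs in the abstract.

        # Per 1,000 children: usual care costs $54,527 more than the CDR strategy
        # and avoids 0.50 missed intraabdominal injuries (IAI) relative to the CDR.
        delta_cost = 54_527.0        # incremental cost of usual care vs. CDR
        delta_missed_avoided = 0.50  # missed IAIs prevented by usual care

        icer = delta_cost / delta_missed_avoided
        print(f"usual care: ${icer:,.0f} per additional missed IAI prevented")
        # prints roughly $109,054; the abstract reports $108,110 from unrounded inputs.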

  5. Study of multi-objective optimization and multi-attribute decision-making for economic and environmental power dispatch

    Energy Technology Data Exchange (ETDEWEB)

    Xuebin, Li [Research and Development Center, Wuhan 2nd Ship Design and Research Institute, Wuhan, Hubei Province 430064 (China)

    2009-05-15

    Environmental awareness and recent environmental policies have forced many electric utilities to restructure their practices to account for their emission impacts. One way to accomplish this is by reformulating the traditional economic dispatch problem such that emission effects are included in the mathematical model. The economic/environmental dispatch problem is a multi-objective non-linear optimization problem with constraints. This study presents a hybrid approach to solve the combined economic-emission dispatch problem (CEED). In the first stage, a non-dominated sorting genetic algorithm II (NSGA-II) is employed to approximate the set of Pareto solutions through an evolutionary optimization process. In the subsequent stage, a multi-attribute decision-making (MADM) approach is adopted to rank these solutions from best to worst and to determine the best solution in a deterministic environment with a single decision maker. This hybrid approach is tested on a six-unit system to illustrate the analysis process. Pareto frontiers are obtained, and the ranking of Pareto solutions is based on entropy weights and the TOPSIS method. The results obtained show that the hybrid approach has great potential in handling multi-objective optimization problems. (author)
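
    The entropy-weighting step mentioned at the end can be summarised in a few lines (Python/NumPy): attributes whose values vary more across the Pareto solutions receive larger weights before TOPSIS ranking. The small matrix of cost/emission values is invented for illustration, not taken from the six-unit test system.

        import numpy as np

        # Rows = Pareto solutions, columns = attributes (e.g., fuel cost, emissions).
        X = np.array([
            [605.0, 0.202],
            [618.0, 0.194],
            [634.0, 0.186],
            [650.0, 0.181],
        ])

        P = X / X.sum(axis=0)                         # column-wise proportions
        k = 1.0 / np.log(X.shape[0])
        E = -k * (P * np.log(P)).sum(axis=0)          # entropy of each attribute
        w = (1.0 - E) / (1.0 - E).sum()               # entropy weights
        print("entropy weights:", np.round(w, 3))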

  6. Life Cycle Assessment and Optimization-Based Decision Analysis of Construction Waste Recycling for a LEED-Certified University Building

    Directory of Open Access Journals (Sweden)

    Murat Kucukvar

    2016-01-01

    Full Text Available The current waste management literature lacks a comprehensive LCA of the recycling of construction materials that considers both process- and supply chain-related impacts as a whole. Furthermore, no previous work has addressed an optimization-based decision support framework that provides a quantifiable understanding of the potential savings and implications associated with recycling construction materials from a life cycle perspective. The aim of this research is to present a multi-criteria optimization model developed to propose economically sound and environmentally benign construction waste management strategies for a LEED-certified university building. First, an economic input-output-based hybrid life cycle assessment model is built to quantify the total environmental impacts of various waste management options: recycling, conventional landfilling and incineration. After quantifying the net environmental pressures associated with these waste treatment alternatives, a compromise programming model is utilized to determine the optimal recycling strategy considering environmental and economic impacts simultaneously. The analysis results show that recycling of ferrous and non-ferrous metals contributed significantly to reductions in the total carbon footprint of waste management. On the other hand, recycling of asphalt and concrete increased the overall carbon footprint due to high fuel consumption and emissions during the crushing process. Based on the multi-criteria optimization results, 100% recycling of ferrous and non-ferrous metals, cardboard, plastic and glass is suggested to maximize the environmental and economic savings simultaneously. We believe that the results of this research will facilitate better decision making in treating construction and demolition waste for LEED-certified green buildings by combining the results of environmental LCA with multi-objective optimization modeling.

  7. Nested algorithms for optimal reservoir operation and their embedding in a decision support platform

    NARCIS (Netherlands)

    Delipetrev, B.

    2016-01-01

    Reservoir operation is a multi-objective optimization problem traditionally solved with dynamic programming (DP) and stochastic dynamic programming (SDP) algorithms. The thesis presents novel algorithms for optimal reservoir operation named nested DP (nDP), nested SDP (nSDP), nested reinforcement le

  8. An analysis of the optimal multiobjective inventory clustering decision with small quantity and great variety inventory by applying a DPSO.

    Science.gov (United States)

    Wang, Shen-Tsu; Li, Meng-Hua

    2014-01-01

    When an enterprise has thousands of item varieties in its inventory, the use of a single management method is not a feasible approach. A better way to manage this problem is to categorise inventory items into several clusters according to inventory decisions and to use different management methods for different clusters. The present study applies DPSO (dynamic particle swarm optimisation) to the problem of clustering inventory items. Without requiring prior inventory knowledge, inventory items are automatically clustered into a near-optimal number of clusters. The obtained clustering results should satisfy the inventory objective equation, which consists of different objectives such as total cost, backorder rate, demand relevance, and inventory turnover rate. This study integrates the above four objectives into a multiobjective equation and inputs the actual inventory items of the enterprise into DPSO. In comparison with other clustering methods, the proposed method can consider different objectives and obtain an overall better solution, with better convergence results and inventory decisions.

  9. Parameter Estimation of Computationally Expensive Watershed Models Through Efficient Multi-objective Optimization and Interactive Decision Analytics

    Science.gov (United States)

    Akhtar, Taimoor; Shoemaker, Christine

    2016-04-01

    Watershed model calibration is inherently a multi-criteria problem. Conflicting trade-offs exist between different quantifiable calibration criteria, indicating the non-existence of a single optimal parameterization. Hence, many experts prefer a manual approach to calibration where the inherent multi-objective nature of the calibration problem is addressed through an interactive, subjective, time-intensive and complex decision-making process. Multi-objective optimization can be used to efficiently identify multiple plausible calibration alternatives and assist calibration experts during the parameter estimation process. However, there are key challenges to the use of multi-objective optimization in the parameter estimation process, which include: 1) multi-objective optimization usually requires many model simulations, which is difficult for complex simulation models that are computationally expensive; and 2) selection of one from the numerous calibration alternatives provided by multi-objective optimization is non-trivial. This study proposes a "Hybrid Automatic Manual Strategy" (HAMS) for watershed model calibration to specifically address the above-mentioned challenges. HAMS employs a 3-stage framework for parameter estimation. Stage 1 incorporates the use of an efficient surrogate multi-objective algorithm, GOMORS, for identification of numerous calibration alternatives within a limited simulation evaluation budget. The novelty of HAMS is embedded in Stages 2 and 3, where an interactive visual and metric-based analytics framework is available as a decision support tool to choose a single calibration from the numerous alternatives identified in Stage 1. Stage 2 of HAMS provides a goodness-of-fit measure/metric based interactive framework for identification of a small subset (typically less than 10) of meaningful and diverse calibration alternatives from the numerous alternatives obtained in Stage 1. Stage 3 incorporates the use of an interactive visual

  10. Enhanced least squares Monte Carlo method for real-time decision optimizations for evolving natural hazards

    DEFF Research Database (Denmark)

    Anders, Annett; Nishijima, Kazuyoshi

    The present paper aims at enhancing a solution approach proposed by Anders & Nishijima (2011) to real-time decision problems in civil engineering. The approach takes basis in the Least Squares Monte Carlo method (LSM) originally proposed by Longstaff & Schwartz (2001) for computing American option...... the improvement of the computational efficiency is to “best utilize” the least squares method; i.e. least squares method is applied for estimating the expected utility for terminal decisions, conditional on realizations of underlying random phenomena at respective times in a parametric way. The implementation...

  11. A Quaternary Decision Diagram Machine and the Optimization of Its Code

    Science.gov (United States)

    2009-01-01

    Matsuura, “Area-time complexities of multi-valued decision diagrams,” IEICE Transactions on Fundamentals of Electronics, Vol. E87-A, No. 5, pp. …; Murray, “Advances in binary decision based programmable controllers,” IEEE Transactions on Industrial Electronics, Aug. 1988, Vol. 35, No. 3, pp. 417-425; H. Nakahara and T. Sasao, “A PC-based logic simulator using a look-up table cascade emulator,” IEICE Transactions on Fundamentals of Electronics …

  12. Approaches to optimal aquifer management and intelligent control in a multiresolutional decision support system

    Science.gov (United States)

    Orr, Shlomo; Meystel, Alexander M.

    2005-03-01

    Despite remarkable new developments in stochastic hydrology and adaptations of advanced methods from operations research, stochastic control, and artificial intelligence, solutions of complex real-world problems in hydrogeology have been quite limited. The main reason is the ultimate reliance on first-principle models that lead to complex, distributed-parameter partial differential equations (PDE) on a given scale. While the addition of uncertainty, and hence stochasticity or randomness, has increased insight and highlighted important relationships between uncertainty, reliability, risk, and their effect on the cost function, it has also (a) introduced additional complexity that demands prohibitive computing power even for just a single uncertain/random parameter; and (b) led to the recognition of our inability to assess the full uncertainty even when including all uncertain parameters. A paradigm shift is introduced: an adaptation of new methods of intelligent control that will relax the dependency on rigid, computer-intensive, stochastic PDE, and will shift the emphasis to a goal-oriented, flexible, adaptive, multiresolutional decision support system (MRDS) with strong unsupervised learning (oriented towards anticipation rather than prediction) and highly efficient optimization capability, which could provide the needed solutions of real-world aquifer management problems. The article highlights the links between past developments and future optimization/planning/control of hydrogeologic systems.

  13. Study on Optimization Model in Logistics Enterprise Credit Decision-making Based on Incomplete Information

    Institute of Scientific and Technical Information of China (English)

    于冰

    2012-01-01

    In this paper, we view the process of credit decision-making by enterprises under asymmetric information as a stochastic decision process that is discrete in time but continuous in state. Taking full account of the probability that the enterprise obtains the expected revenue and of the probability that the market disappears, we propose an optimal stopping model for estimating, predicting and deciding on the benefit of extending credit, give the corresponding decision rules, and finally validate the effectiveness of the model through an empirical study.
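
    An optimal stopping rule of the kind described above is typically computed by backward induction: at each time, compare the payoff from stopping (e.g. collecting the receivable) with the discounted expected value of continuing. The sketch below shows that recursion on a discretized state; the payoff function, AR(1) state dynamics and all numbers are invented and are not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Discretized state: e.g. a standardized signal about the debtor's ability to pay.
T, x_grid = 12, np.linspace(-3.0, 3.0, 201)
discount, rho, sigma = 0.99, 0.8, 0.5

def stop_payoff(x):
    """Payoff from stopping now (e.g. demanding settlement); illustrative only."""
    return np.maximum(x, 0.0)

V = stop_payoff(x_grid)                    # value at the final decision time
for t in range(T - 1, -1, -1):
    shocks = rng.normal(scale=sigma, size=500)
    # Continuation value E[V(x')] with x' = rho*x + shock, by sampling + interpolation.
    cont = np.array([np.interp(rho * x + shocks, x_grid, V).mean() for x in x_grid])
    V = np.maximum(stop_payoff(x_grid), discount * cont)

# Decision rule at the first period: stop wherever the immediate payoff beats waiting.
stop_now = stop_payoff(x_grid) >= discount * cont
print("stop immediately for x >=", round(float(x_grid[np.argmax(stop_now)]), 2))
```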

  14. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    Energy Technology Data Exchange (ETDEWEB)

    Tahvili, Sahar [Mälardalen University (Sweden); Österberg, Jonas; Silvestrov, Sergei [Division of Applied Mathematics, Mälardalen University (Sweden); Biteus, Jonas [Scania CV (Sweden)

    2014-12-10

    One of the most important factors in the operations of many corporations today is to maximize profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance activities are, at the highest level, divided into two major areas, corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities, by a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.

  15. Knowledge Visualizations: A Tool to Achieve Optimized Operational Decision Making and Data Integration

    Science.gov (United States)

    2015-06-01

    synthesis or visualization, while gaining situation awareness. Understanding is the decision-maker’s application of their cognitive world model applied...Information, knowledge, and meaning [Blog post]. Retrieved from http://www.b-eye-network.com/blogs/devlin/archives/business_uninte/4.php DIKUW

  16. Economic optimization of decisions with respect to dairy cow health management.

    NARCIS (Netherlands)

    Houben, E.H.P.

    1995-01-01

    The research described in this thesis was directed towards decision support in dairy cow health management. Attention was focused on clinical mastitis, in many countries considered to be the most important dairy health problem. First a statistical analysis was carried out to obtain biological and ec

  17. Average, sensitive and Blackwell-optimal policies in denumerable Markov decision chains with unbounded rewards

    NARCIS (Netherlands)

    R. Dekker (Rommert); A. Hordijk (Arie)

    1988-01-01

    In this paper we consider a (discrete-time) Markov decision chain with a denumerable state space and compact action sets and we assume that for all states the rewards and transition probabilities depend continuously on the actions. The first objective of this paper is to develop an anal

  18. An extension of the usual model in statistical decision theory with applications to stochastic optimization problems

    NARCIS (Netherlands)

    Balder, E.J.

    1980-01-01

    By employing fundamental results from “geometric” functional analysis and the theory of multifunctions we formulate a general model for (nonsequential) statistical decision theory, which extends Wald's classical model. From central results that hold for the model we derive a general theorem on the e

  19. Living renal donors: optimizing the imaging strategy--decision- and cost-effectiveness analysis

    NARCIS (Netherlands)

    Y.S. Liem (Ylian Serina); M.C.J.M. Kock (Marc); W. Weimar (Willem); K. Visser (Karen); M.G.M. Hunink (Myriam); J.N.M. IJzermans (Jan)

    2003-01-01

    PURPOSE: To determine the most cost-effective strategy for preoperative imaging performed in potential living renal donors. MATERIALS AND METHODS: In a decision-analytic model, the societal cost-effectiveness of digital subtraction angiography (DSA), gadolinium-enhanced

  20. A dynamic decision model for portfolio investment and assets management

    Institute of Scientific and Technical Information of China (English)

    QIAN Edward Y.; FENG Ying; HIGGISION James

    2005-01-01

    This paper addresses a dynamic portfolio investment problem. It discusses how we can dynamically choose candidate assets, achieve the maximum possible revenue and reduce risk to the minimum level. The paper generalizes Markowitz's portfolio selection theory and Sharpe's rule for investment decisions. An analytical solution is presented to show how an institutional or individual investor can combine Markowitz's portfolio selection theory, the generalized Sharpe's rule and Value-at-Risk (VaR) to find candidate assets and the optimal level of position sizes for investment (or disinvestment). The result shows that the generalized Markowitz's portfolio selection theory and generalized Sharpe's rule improve decision making for investment.
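
    To make the combination concrete, the sketch below computes mean-variance (Markowitz-style) weights and then checks a parametric 95% VaR against a risk budget to scale position sizes. It only illustrates how the ingredients can interact; the return estimates, covariance, risk budget and scaling rule are invented and are not the paper's analytical solution.

```python
import numpy as np

# Hypothetical expected returns and covariance matrix for three candidate assets.
mu = np.array([0.08, 0.10, 0.12])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])

# Mean-variance (Markowitz-style) weights, normalized to sum to one.
raw = np.linalg.solve(cov, mu)
w = raw / raw.sum()

# Parametric (normal) 95% Value-at-Risk of the portfolio return.
z95 = 1.645
port_mu = w @ mu
port_sigma = np.sqrt(w @ cov @ w)
var_95 = z95 * port_sigma - port_mu     # loss exceeded with 5% probability

# Illustrative position-sizing rule: scale exposure down if VaR exceeds a risk budget.
risk_budget = 0.15
scale = min(1.0, risk_budget / var_95) if var_95 > 0 else 1.0
print("weights:", np.round(w, 3), "VaR95:", round(var_95, 3), "scale:", round(scale, 3))
```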

  1. PARAMETRIC OPTIMIZATION OF THE MULTIMODAL DECISION-LEVEL FUSION SCHEME IN AUTOMATIC BIOMETRIC PERSON’S IDENTIFICATION

    Directory of Open Access Journals (Sweden)

    A. V. Timofeev

    2014-05-01

    Full Text Available This paper deals with an original method of structural-parametric optimization for a multimodal decision-level fusion scheme, which combines the partial classification results obtained from an assembly of monomodal classifiers. As a result, a multimodal fusion classifier which has the minimum value of the total error rate has been obtained. Properties of the proposed approach are proved rigorously. The suggested method has an urgent practical application in automatic multimodal biometric person identification systems and in systems for remote monitoring of extended objects. The proposed solution is easy to implement in real operating systems. The paper presents a simulation study of the effectiveness of this optimized multimodal fusion classifier carried out on a special bimodal biometric database. Simulation results showed high practical effectiveness of the suggested method.
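
    Decision-level fusion generally combines per-modality scores with weights chosen to reduce the overall error rate. The sketch below uses a common error-weighted voting heuristic to illustrate the idea; the weighting formula, modalities and scores are assumptions for illustration, not the paper's optimized fusion scheme.

```python
import numpy as np

def fuse_decisions(scores, error_rates):
    """Decision-level fusion by error-weighted voting.

    scores: (n_modalities, n_classes) array of per-modality class scores in [0, 1].
    error_rates: estimated error rate of each monomodal classifier.
    Weighting by log((1 - e) / e) is a common heuristic; the paper's own
    optimization of the fusion parameters is more elaborate.
    """
    e = np.clip(np.asarray(error_rates), 1e-6, 1 - 1e-6)
    w = np.log((1 - e) / e)
    fused = w @ scores          # weighted sum of scores over modalities
    return int(np.argmax(fused))

# Hypothetical: face and voice classifiers scoring three enrolled persons.
scores = np.array([[0.6, 0.3, 0.1],    # face modality
                   [0.2, 0.5, 0.3]])   # voice modality
print(fuse_decisions(scores, error_rates=[0.05, 0.20]))   # -> 0 (face dominates)
```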

  2. Experimental design and multicriteria decision making methods for the optimization of ice cream composition

    Directory of Open Access Journals (Sweden)

    Cristian Rojas

    2012-03-01

    Full Text Available The aim of the present work was to optimize the sensory and technological features of ice cream. The experimental work was performed in two stages: (1) optimization of lactose enzymatic hydrolysis, and (2) optimization of the process and product. For the first stage a complete factorial design was developed, optimized using both response surface methodology and the steepest ascent method. In the second stage a mixture design was performed, combining the process variables. The product with the best sensory acceptance, high yield and low cost was selected. Acceptance of the product was evaluated by an untrained panel of tasters. As a main result, the sensory and technological features of the final product were improved, establishing the optimum parameters for its elaboration.

  3. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model

    OpenAIRE

    Rajkumar Rajavel; Mala Thangarathinam

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to misperception, aggressive behavior, and uncertain preferences and goals regarding their opponents. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, co...

  4. The Uncertainty Threshold Principle: Some Fundamental Limitations of Optimal Decision Making Under Dynamic Uncertainty

    Science.gov (United States)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.
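
    For reference, the scalar result is commonly stated as follows; this is a hedged restatement in notation of my own choosing, and the note itself should be consulted for the exact assumptions (i.i.d. parameter pairs, positive cost weights).

```latex
% Scalar system x_{t+1} = a_t x_t + b_t u_t with i.i.d. parameters (a_t, b_t),
% means \bar{a}, \bar{b}, variances \Sigma_{aa}, \Sigma_{bb}, covariance \Sigma_{ab}.
\[
  m \;=\; \bar{a}^{2} + \Sigma_{aa}
        \;-\; \frac{\left(\bar{a}\,\bar{b} + \Sigma_{ab}\right)^{2}}
                   {\bar{b}^{2} + \Sigma_{bb}},
  \qquad
  \text{the infinite-horizon LQ solution exists only when } m < 1 .
\]
```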

  5. Optimal insemination and replacement decisions to minimize the cost of pathogen-specific clinical mastitis in dairy cows.

    Science.gov (United States)

    Cha, E; Kristensen, A R; Hertl, J A; Schukken, Y H; Tauer, L W; Welcome, F L; Gröhn, Y T

    2014-01-01

    Mastitis is a serious production-limiting disease, with effects on milk yield, milk quality, and conception rate, and an increase in the risk of mortality and culling. The objective of this study was 2-fold: (1) to develop an economic optimization model that incorporates all the different types of pathogens that cause clinical mastitis (CM) categorized into 8 classes of culture results, and account for whether the CM was a first, second, or third case in the current lactation and whether the cow had a previous case or cases of CM in the preceding lactation; and (2) to develop this decision model to be versatile enough to add additional pathogens, diseases, or other cow characteristics as more information becomes available without significant alterations to the basic structure of the model. The model provides economically optimal decisions depending on the individual characteristics of the cow and the specific pathogen causing CM. The net returns for the basic herd scenario (with all CM included) were $507/cow per year, where the incidence of CM (cases per 100 cow-years) was 35.6, of which 91.8% of cases were recommended for treatment under an optimal replacement policy. The cost per case of CM was $216.11. The CM cases comprised (incidences, %) Staphylococcus spp. (1.6), Staphylococcus aureus (1.8), Streptococcus spp. (6.9), Escherichia coli (8.1), Klebsiella spp. (2.2), other treated cases (e.g., Pseudomonas; 1.1), other not treated cases (e.g., Trueperella pyogenes; 1.2), and negative culture cases (12.7). The average cost per case, even under optimal decisions, was greatest for Klebsiella spp. ($477), followed by E. coli ($361), other treated cases ($297), and other not treated cases ($280). This was followed by the gram-positive pathogens; among these, the greatest cost per case was due to Staph. aureus ($266), followed by Streptococcus spp. ($174) and Staphylococcus spp. ($135); negative culture had the lowest cost ($115). The model recommended treatment for

  6. Application of Bayesian statistical decision theory to the optimization of generating set maintenance; Application de la theorie de decision statistique bayesienne a l'optimisation de la maintenance des groupes electrogenes

    Energy Technology Data Exchange (ETDEWEB)

    Procaccia, H.; Cordier, R.; Muller, S.

    1994-07-01

    Statistical decision theory could be an alternative for the optimization of preventive maintenance periodicity. In effect, this theory concerns the situation in which a decision maker has to make a choice between a set of reasonable decisions, and where the loss associated with a given decision depends on a probabilistic risk, called the state of nature. In the case of maintenance optimization, the decisions to be analyzed are the different periodicities proposed by the experts, given the observed feedback experience; the states of nature are the associated failure probabilities; and the losses are the expectations of the induced cost of maintenance and of the consequences of failures. As failure probabilities concern rare events at the ultimate state of RCM analysis (failure of a sub-component), and as the expected foreseeable behaviour of equipment has to be evaluated by experts, a Bayesian approach is successfully used to compute the states of nature. In Bayesian decision theory, a prior distribution for failure probabilities is modeled from expert knowledge and is combined with the sparse statistical information provided by feedback experience, giving a posterior distribution of failure probabilities. The optimized decision is the decision that minimizes the expected loss over the posterior distribution. This methodology has been applied to inspection and maintenance optimization of cylinders of diesel generator engines of 900 MW nuclear plants. In these plants, auxiliary electric power is supplied by 2 redundant diesel generators which are tested every 2 weeks for about 1 hour. Until now, during the yearly refueling of each plant, one endoscopic inspection of diesel cylinders is performed, and every 5 operating years all cylinders are replaced. RCM has shown that cylinder failures could be critical. So Bayesian decision theory has been applied, taking into account expert opinions and the possibility of aging when the maintenance periodicity is extended. (authors). 8 refs., 5 figs., 1 tab.
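
    The prior-plus-feedback-to-posterior step and the expected-loss minimization described above can be illustrated with a conjugate Beta-Binomial sketch. Everything below (prior parameters, feedback counts, candidate periodicities, costs, and the crude wear assumption) is invented for illustration and is not the paper's actual loss model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Expert prior on the yearly cylinder failure probability, encoded as Beta(a, b).
a_prior, b_prior = 1.0, 99.0            # prior mean 0.01
# Sparse operating feedback: 200 unit-years of observation, 1 failure.
n_obs, k_obs = 200, 1
a_post, b_post = a_prior + k_obs, b_prior + n_obs - k_obs

# Candidate inspection periodicities (years) and their annual maintenance cost.
candidates = {1: 10.0, 3: 4.0, 5: 2.5}
failure_cost = 500.0

def expected_loss(period, maint_cost, n_samples=100_000):
    p = rng.beta(a_post, b_post, size=n_samples)   # posterior draws of failure probability
    # Crude wear assumption (illustrative only): yearly failure risk grows linearly
    # with the time between inspections.
    return maint_cost + failure_cost * (p * period).mean()

losses = {T: round(expected_loss(T, c), 2) for T, c in candidates.items()}
print(losses, "-> choose a periodicity of", min(losses, key=losses.get), "year(s)")
```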

  7. Optimal DO Setpoint Decision and Electric Cost Saving in Aerobic Reactor Using Respirometer and Air Blower Control

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kwang Su; Yoo, Changkyoo [Kyung Hee University, Yongin (Korea, Republic of); Kim, Minhan [Pangaea21 Ltd., Seongnam (Korea, Republic of); Kim, Jongrack [UnUsoft Ltd., Seoul (Korea, Republic of)

    2014-10-15

    The main objectives of wastewater treatment operation are to maintain effluent water quality and to minimize operating cost. However, optimal operation is difficult because of changes in influent flow rate and concentrations, the nonlinear dynamics of microbial growth rates, and other environmental factors. Therefore, many wastewater treatment plants are operated with far more aeration or chemical dosing than necessary. In this study, an optimal control scheme for dissolved oxygen (DO) is suggested to prevent over-aeration and to reduce the electricity cost of plant operation, while maintaining the DO concentration required for the metabolism of microorganisms in the oxic reactor. The oxygen uptake rate (OUR) is measured in real time to characterize the influent and to identify the oxygen requirement of the microorganisms in the oxic reactor. The optimal DO set-point needed by the microorganisms is then suggested based on the real-time OUR measurement and control of the air blower. The suggested optimal set-point decision system thus satisfies both stable effluent quality and minimal electricity cost by providing only the necessary oxygen supply to the microorganisms while coping with variations in influent loading.
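
    One way to turn a real-time OUR measurement into a DO set-point is a steady-state oxygen mass balance, dDO/dt = kLa·(DO_sat − DO) − OUR = 0. The sketch below applies that balance with a floor and a safety margin; the kLa value, saturation concentration, margin and minimum DO are illustrative assumptions, not the study's calibrated rule.

```python
# Steady-state oxygen balance: kLa * (DO_sat - DO) = OUR  =>  DO = DO_sat - OUR / kLa.

def do_setpoint(our_mg_l_h, kla_per_h, do_sat=9.1, do_min=1.0, margin=0.3):
    """Lowest DO set-point (mg/L) that still covers the measured oxygen uptake rate."""
    do_required = do_sat - our_mg_l_h / kla_per_h   # DO at which supply just equals uptake
    return max(do_min, do_required + margin)

# Example: a real-time OUR of 35 mg O2/(L*h) with an aeration capacity kLa of 6 1/h.
print(round(do_setpoint(35.0, 6.0), 2))   # -> 3.57 mg/L
```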

  8. A decision support system for delivering optimal quality peach and tomato

    Science.gov (United States)

    Thai, C. N.; Pease, J. N.; Shewfelt, R. L.

    1990-01-01

    Several studies have indicated that color and firmness are the two quality attributes most important to consumers in making purchasing decisions of fresh peaches and tomatoes. However, at present, retail produce managers do not have the proper information for handling fresh produce so it has the most appealing color and firmness when it reaches the consumer. This information should help them predict the consumer color and firmness perception and preference for produce from various storage conditions. Since 1987, for 'Redglobe' peach and 'Sunny' tomato, we have been generating information about their physical quality attributes (firmness and color) and their corresponding consumer sensory scores. This article reports on our current progress toward the goal of integrating such information into a model-based decision support system for retail level managers in handling fresh peaches and tomatoes.

  9. Modeling, control and optimization of water systems systems engineering methods for control and decision making tasks

    CERN Document Server

    2016-01-01

    This book provides essential background knowledge on the development of model-based real-world solutions in the field of control and decision making for water systems. It presents system engineering methods for modelling surface water and groundwater resources as well as water transportation systems (rivers, channels and pipelines). The models in turn provide information on both the water quantity (flow rates, water levels) of surface water and groundwater and on water quality. In addition, methods for modelling and predicting water demand are described. Sample applications of the models are presented, such as a water allocation decision support system for semi-arid regions, a multiple-criteria control model for run-of-river hydropower plants, and a supply network simulation for public services.

  10. Matching rules for collective behaviors on complex networks: optimal configurations for vibration frequencies of networked harmonic oscillators.

    Directory of Open Access Journals (Sweden)

    Meng Zhan

    Full Text Available The structure-dynamics-function relationship has become one of the central problems in modern sciences, and it is a great challenge to unveil the organization rules for different dynamical processes on networks. In this work, we study the vibration spectra of the classical mass-spring model with different masses on complex networks, and focus on how the mass spatial configuration influences the second-smallest vibrational frequency (ω2) and the largest one (ωN). For random networks, we find that ω2 becomes maximal and ωN becomes minimal if the node degrees are point-to-point positively correlated with the masses. In these cases, we call it point-to-point matching. Moreover, ω2 becomes minimal under the condition that the heaviest mass is placed on the lowest-degree vertex, and ωN is maximal as long as the lightest mass is placed on the highest-degree vertex; in both cases all other masses can be arbitrarily placed. Correspondingly, we call it single-point matching. These findings indicate that the matchings between the node dynamics (parameters) and the node positions rule the global system dynamics, and sometimes only one node is enough to control the collective behaviors of the whole system. Therefore, the matching rules might be the common organization rules for collective behaviors on networks.
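
    The two spectral quantities ω2 and ωN can be computed for any mass placement by solving the generalized eigenvalue problem K x = ω² M x, where K is the graph Laplacian for unit springs and M is the diagonal mass matrix. The sketch below compares two placements on a toy star network; the network and mass values are invented for illustration and are far smaller than the random networks studied in the paper.

```python
import numpy as np

def vibration_frequencies(adjacency, masses):
    """Frequencies of a unit-spring mass-spring network: solve K x = omega^2 M x."""
    A = np.asarray(adjacency, dtype=float)
    K = np.diag(A.sum(axis=1)) - A                  # graph Laplacian = stiffness matrix
    m = np.asarray(masses, dtype=float)
    S = K / np.sqrt(np.outer(m, m))                 # M^{-1/2} K M^{-1/2}, symmetric
    eigvals = np.linalg.eigvalsh(S)                 # ascending eigenvalues omega^2
    return np.sqrt(np.clip(eigvals, 0.0, None))

# Toy star network: node 0 is the hub (degree 3), nodes 1-3 are leaves (degree 1).
A = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]])

w_corr = vibration_frequencies(A, [4.0, 1.0, 2.0, 3.0])   # heavy mass on the hub
w_anti = vibration_frequencies(A, [1.0, 4.0, 3.0, 2.0])   # heavy mass on a leaf
print("omega_2:", round(w_corr[1], 3), "vs", round(w_anti[1], 3))
print("omega_N:", round(w_corr[-1], 3), "vs", round(w_anti[-1], 3))
```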

  11. Enhancing the Systems Decision Process with Flexibility Analysis for Optimal Unmanned Aircraft System Selection

    Science.gov (United States)

    2008-06-01

    Appendix B: Military Organization Chart; Appendix C: Pugh Matrix ...overall motivation is to improve decision making. 1.8 Thesis Structure. This work is organized into seven chapters. Chapter 2 provides a detailed...

  12. Selection of Optimal Supplier in Supply Chain Using A MultiCriteria Decision Making Method

    Directory of Open Access Journals (Sweden)

    H. Assellaou

    2015-07-01

    Full Text Available The supplier selection problem is one of the strategic decisions that have a significant impact on the performance of the supply chain. In this study, the supplier selection problem of an automotive company is investigated, and a comprehensive methodology is used to select the best supplier providing the most customer satisfaction for the criteria determined. The proposed methodology consists of the Analytic Network Process (ANP); the criteria which are relevant to supplier selection have been used to construct an ANP model
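
    A core step shared by AHP/ANP-type methods is deriving priority weights from a pairwise comparison matrix via its principal eigenvector, together with a consistency check. The sketch below shows only that prioritization step (a full ANP additionally builds a supermatrix of interdependencies); the criteria and judgments are invented, not those of the study.

```python
import numpy as np

# Invented criteria and pairwise judgments (Saaty 1-9 scale), for illustration only.
criteria = ["quality", "price", "delivery"]
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(pairwise)
idx = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, idx].real)
weights = weights / weights.sum()               # principal-eigenvector priorities

# Consistency check: CI = (lambda_max - n) / (n - 1); RI = 0.58 for a 3x3 matrix.
lambda_max = eigvals.real[idx]
ci = (lambda_max - 3) / (3 - 1)
print(dict(zip(criteria, np.round(weights, 3))), "CR:", round(ci / 0.58, 3))
```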

  13. HEURISTIC OPTIMIZATION OF NATURAL PRODUCTION INVENTORY MODELS WITH THE PREFERENCE OF A DECISION MAKER

    OpenAIRE

    CHIH HSUN HSIEH

    2008-01-01

    In this paper, two natural production inventory models based on fuzzy total production inventory cost with the preference of a decision maker are introduced. They combine natural-number parameters whose values are linguistic values in natural language, crisp real-number variables, and fuzzy-number variables: one natural production inventory model for crisp production quantity, and the other for fuzzy production quantity. The natural arith...

  14. Optimizing Clinical Decision Support in the Electronic Health Record. Clinical Characteristics Associated with the Use of a Decision Tool for Disposition of ED Patients with Pulmonary Embolism.

    Science.gov (United States)

    Ballard, Dustin W; Vemula, Ridhima; Chettipally, Uli K; Kene, Mamata V; Mark, Dustin G; Elms, Andrew K; Lin, James S; Reed, Mary E; Huang, Jie; Rauchwerger, Adina S; Vinson, David R

    2016-09-21

    Adoption of clinical decision support (CDS) tools by clinicians is often limited by workflow barriers. We sought to assess characteristics associated with clinician use of an electronic health record-embedded clinical decision support system (CDSS). In a prospective study of emergency department (ED) activation of a CDSS tool across 14 hospitals between 9/1/14 and 4/30/15, the CDSS was deployed at 10 active sites with an on-site champion, education sessions, iterative feedback, and up to 3 gift cards/clinician as an incentive. The tool was also deployed at 4 passive sites that received only an introductory educational session. Activation of the CDSS - which calculated the Pulmonary Embolism Severity Index (PESI) score and provided guidance - and associated clinical data were collected prospectively. We used multivariable logistic regression with random effects at the provider/facility levels to assess the association between activation of the CDSS tool and characteristics at: 1) the patient level (PESI score), 2) the provider level (demographics and clinical load at the time of activation opportunity), and 3) the facility level (active vs. passive site, facility ED volume, and ED acuity at the time of activation opportunity). Out of 662 eligible patient encounters, the CDSS was activated in 55%: active sites 68% (346/512); passive sites 13% (20/150). In bivariate analysis, active sites had an increase in activation rates based on the number of prior gift cards the physician had received (96% if 3 prior cards versus 60% if 0, p ... promotion significantly increased the odds of CDSS activation. Optimizing CDSS adoption requires active education.

  15. Action Rules Mining

    CERN Document Server

    Dardzinska, Agnieszka

    2013-01-01

    We are surrounded by data, numerical, categorical and otherwise, which must be analyzed and processed to convert it into information that instructs, answers or aids understanding and decision making. Data analysts in many disciplines, such as business, education or medicine, are frequently asked to analyze new data sets which are often composed of numerous tables possessing different properties. They try to find completely new correlations between attributes and show new possibilities for users. Action rules mining discusses some data mining and knowledge discovery principles and then describes representative concepts, methods and algorithms connected with action. The author introduces the formal definition of an action rule, the notion of a simple association action rule and a representative action rule, the cost of an association action rule, and gives a strategy for constructing simple association action rules of lowest cost. A new approach for generating action rules from datasets with numerical attributes...

  16. Interactive Genetic Algorithm - An Adaptive and Interactive Decision Support Framework for Design of Optimal Groundwater Monitoring Plans

    Science.gov (United States)

    Babbar-Sebens, M.; Minsker, B. S.

    2006-12-01

    In the water resources management field, decision making encompasses many kinds of engineering, social, and economic constraints and objectives. Representing all of these problem-dependent criteria through models (analytical or numerical) and various formulations (e.g., objectives, constraints, etc.) within an optimization-simulation system can be a very non-trivial issue. Most models and formulations utilized for discerning desirable traits in a solution can only approximate the decision maker's (DM) true preference criteria, and they often fail to consider important qualitative and incomputable phenomena related to the management problem. In our research, we have proposed novel decision support frameworks that allow DMs to actively participate in the optimization process. The DMs explicitly indicate their true preferences based on their subjective criteria and the results of various simulation models and formulations. The feedback from the DMs is then used to guide the search process towards solutions that are "all-rounders" from the perspective of the DM. The two main research questions explored in this work are: a) Does interaction between the optimization algorithm and a DM assist the system in searching for groundwater monitoring designs that are robust from the DM's perspective? and b) How can an interactive search process be made more effective when human factors, such as human fatigue and cognitive learning processes, affect the performance of the algorithm? The application of these frameworks to a real-world groundwater long-term monitoring (LTM) case study in Michigan highlighted the following salient advantages: a) in contrast to the non-interactive optimization methodology, the proposed interactive frameworks were able to identify low-cost monitoring designs whose interpolation maps respected the expected spatial distribution of the contaminants, and b) for many same-cost designs, the interactive methodologies were able to propose multiple alternatives

  17. Improving the Flexibility of Optimization-Based Decision Aiding Frameworks for Integrated Water Resource Management

    Science.gov (United States)

    Guillaume, J. H.; Kasprzyk, J. R.

    2013-12-01

    Deep uncertainty refers to situations in which stakeholders cannot agree on the full suite of risks to their system or their probabilities. Additionally, systems are often managed for multiple, conflicting objectives such as minimizing cost, maximizing environmental quality, and maximizing hydropower revenues. Many-objective analysis (MOA) uses a quantitative model combined with evolutionary optimization to provide a tradeoff set of potential solutions to a planning problem. However, MOA is often performed using a single, fixed problem conceptualization. Focus on the development of a single formulation can introduce an "inertia" into the problem solution, such that issues outside the initial formulation are less likely to ever be addressed. This study uses the Iterative Closed Question Methodology (ICQM) to continuously reframe the optimization problem, providing iterative definition and reflection for stakeholders. By using a series of directed questions to look beyond a problem's existing modeling representation, ICQM seeks to provide a working environment within which it is easy to modify the motivating question, assumptions, and model identification in optimization problems. The new approach helps identify and reduce bottlenecks, introduced by properties of both the simulation model and the optimization approach, that reduce flexibility in the generation and evaluation of alternatives. It can therefore help introduce new perspectives on the resolution of conflicts between objectives. The Lower Rio Grande Valley portfolio planning problem is used as a case study.

  18. A decision support system for optimization of regional drinking water supply

    NARCIS (Netherlands)

    Vink, C.; Schot, P.P.

    2000-01-01

    Finding a strategy that allows economically efficient drinking water production in regional supply systems at minimal environmental cost is often a complex task. In order to determine the optimal spatial production configuration, a systematic trade-off among costs and benefits of possible strategies

  19. Optimal cutoff points in single and multiple tests for psychological and educational decision making

    NARCIS (Netherlands)

    Ben-Yashar, Ruth; Nitzan, Shmuel; Vos, Hendrik J.

    2001-01-01

    This paper compares the determination of optimal cutoff points for single and multiple tests in the field of personnel selection. Decisional skills of predictor tests composing the multiple test are assumed to be endogenous variables that depend on the cutting points to be set. The main result
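
    For the single-test case, an optimal cutoff can be framed as the score that minimizes the expected cost of false accepts and false rejects. The sketch below illustrates that framing under assumed normal score distributions; the base rate, means, standard deviation and costs are invented, and the paper's endogenous-skill, multiple-test treatment goes well beyond this.

```python
import numpy as np
from scipy.stats import norm

# Invented setting: proportion of truly suitable applicants, test-score distributions
# for the suitable/unsuitable groups, and asymmetric misclassification costs.
base_rate = 0.4
mu_s, mu_u, sd = 60.0, 45.0, 10.0
cost_false_accept, cost_false_reject = 3.0, 1.0

cutoffs = np.linspace(30.0, 80.0, 501)
expected_loss = (cost_false_reject * base_rate * norm.cdf(cutoffs, mu_s, sd)          # reject suitable
                 + cost_false_accept * (1 - base_rate) * norm.sf(cutoffs, mu_u, sd))  # accept unsuitable
print("optimal cutoff:", round(float(cutoffs[np.argmin(expected_loss)]), 1))          # about 62.5
```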
